In response to the impending spread of COVID-19, universities worldwide abruptly stopped face-to-face teaching and switched to technology-mediated teaching. As a result, the use of technology in the learning processes of students of different disciplines became essential and the only way to teach, communicate and collaborate for months. In this crisis context, we conducted a longitudinal study in four German universities, in which we collected a total of 875 responses from students of information systems and music and arts at four points in time during the spring–summer 2020 semester. Our study focused on (1) the students’ acceptance of technology-mediated learning, (2) any change in this acceptance during the semester and (3) the differences in acceptance between the two disciplines. We applied the Technology Acceptance Model and were able to validate it for the extreme situation of the COVID-19 pandemic. We extended the model with three new variables (time flexibility, learning flexibility and social isolation) that influenced the construct of perceived usefulness. Furthermore, we detected differences between the disciplines and over time. In this paper, we present and discuss our study’s results and derive short- and long-term implications for science and practice.
In organizations with several hundred or more employees, it is hardly possible to keep track of the competencies and specialist knowledge of the workforce. Searching for experts then often proves difficult, for example when a project team needs to be assembled. Skill management systems promise help, provided their database has been set up sensibly and is continuously updated. The authors describe the design and structure of such systems, ask about their costs, and name criteria for professional solutions.
Skill management catalogues built via KMDL : integrating knowledge and business process modelling
(2004)
The efficient use of human capital is one of the most important factors in today's business competition, which is strongly influenced by qualified staff. To help the human resources department keep up with strategic decisions, various skill management systems have been created that make the development of human resources easier and more precise. Skill management systems are only as good as the information they are based on. The most commonly used basic information is the skill catalogue, which shows the gaps of each employee or division within the company. But there are as yet hardly any applicable methods for creating such a catalogue thoroughly. This paper introduces a reasonable approach to creating such a catalogue with KMDL, a description language for knowledge-intensive processes. The skill catalogue built for skill management systems is one of the most important but still most neglected factors when introducing skill management.
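The gap analysis described above can be made concrete with a small sketch. The data structure below is a hypothetical illustration of a skill catalogue, not taken from the paper: target skill levels for a role are compared against each employee's actual levels to reveal qualification gaps. All names, skills, and level values are illustrative assumptions.

```python
# Hypothetical skill catalogue sketch: target levels per role versus
# each employee's actual levels; the difference is the qualification gap.
TARGET = {"ERP": 3, "process modelling": 2, "Java": 2}  # required levels

employees = {
    "Meier": {"ERP": 3, "process modelling": 1},
    "Schulz": {"ERP": 1, "process modelling": 2, "Java": 2},
}

def skill_gaps(actual, target):
    """Return each skill where the actual level falls short of the target,
    mapped to the size of the shortfall (missing skills count as level 0)."""
    return {
        skill: required - actual.get(skill, 0)
        for skill, required in target.items()
        if actual.get(skill, 0) < required
    }

gaps = {name: skill_gaps(skills, TARGET) for name, skills in employees.items()}
print(gaps)
# Meier lacks modelling depth and Java entirely; Schulz lacks ERP depth.
```

A skill management system would aggregate such per-employee gaps per division to show where training or hiring is needed, which is exactly the information the catalogue is meant to provide.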
Skill Management
(2006)
The usage of gamification in the contexts of commerce, consumption, innovation or eLearning in schools and universities has been extensively researched. However, the potential of serious games to transfer and perpetuate knowledge and action patterns in learning factories has not been leveraged so far. The goal of this paper is to introduce a serious game as an instrument for knowledge transfer and perpetuation. To this end, requirements for serious games in the context of learning factories are pointed out. As a result that builds on these requirements, a serious learning game on the topic of Industry 4.0 is practically designed and evaluated.
The authors propose that while tacit knowledge is a valuable resource for developing new business models, its externalization presents several challenges. One major challenge is that individuals often do not recognize their tacit knowledge resources; another is the reluctance to share one's knowledge with others. Addressing these challenges, the authors present an application-oriented, serious-game-based haptic modeling approach for externalizing tacit knowledge, which can be used to develop the first versions of business models based on tacit knowledge. Both conceptual and practical design fundamentals are presented based on elaborated theoretical approaches, which were developed with the help of a design science approach. The development of the research process is presented step by step, with a focus on the high accessibility of the presented research. Practitioners are presented with guidelines for implementing their serious game projects. Scientists benefit from starting points for their research topics of externalization, internalization, and socialization of tacit knowledge, development of business models, and serious games or gamification. The paper concludes with open research desiderata and questions from the presented research process.
Adaptability of information systems has become a substantial competitive factor. Today's insufficient methodical support for realizing adaptability frequently leaves the potential of deployed information technology in enterprises unused. This contribution presents a procedure that addresses the need to determine the necessary adaptability of an enterprise in relation to its surrounding environment.
Robotic Process Automation (RPA) refers to the software-supported operation of software solutions via their user interface. The primary goal of RPA is the automated execution of routine tasks that previously required human intervention. However, the potential of RPA to improve processes in the long term is severely limited. Automating processes and bridging media discontinuities at the front-end level creates a multitude of dependencies and conditions, which are summarized in this article. The path to a sustainable enterprise architecture (consisting of processes and systems) requires open, adaptive systems with a modern architecture, characterized by a high degree of interoperability at various levels.
Requirements for an integration of methods analyzing social issues in knowledge organizations
(2006)
Knowledge is increasingly a key factor within companies [10]. Nearly 40 percent of all employees are so-called 'knowledge workers'. The distribution and retrieval of knowledge within companies are supported by skill management systems. Although not all aspects and potentials of this instrument are utilized yet, skill management systems have spread widely within business organizations. This paper summarizes the requirements, scopes and problems of skill management systems within the company.
The manufacture of products consumes energy as well as material resources. Consumer and producer awareness, as well as legislative activity, are developing far too slowly to arrive at a sustainable use of the available resources. This article presents a local remanufacturing approach that makes it possible to reduce resource consumption, support local businesses, and offer efficient solutions for the regional reuse and repurposing of goods.
Faced with the triad of time, cost and quality, realizing knowledge-intensive tasks under economic conditions is not trivial. Since the number of knowledge-intensive processes keeps increasing, the efficient design of knowledge transfers in business processes, as well as their target-oriented improvement, is essential so that process outcomes satisfy high quality criteria and economic requirements. This particularly challenges knowledge management, which aims to assign ideal manifestations of the factors influencing knowledge transfers to a certain task. Building on first attempts at knowledge transfer-based process improvement [1], this paper continues research on the quantitative examination of knowledge transfers and presents a ready-to-go experiment design that can examine the quality of knowledge transfers empirically and is suitable for examining knowledge transfers on a quantitative level. Its use is proven by the example of four influence factors, namely stickiness, complexity, competence and time pressure.
Collaboration during the modeling process is uncomfortable and characterized by various limitations. With the successful transfer of first process modeling languages to the augmented world, non-transparent processes can be visualized in a more comprehensible way. With the aim of raising the comfort, speed, accuracy and versatility of real-world process augmentations, a framework for the bidirectional interplay between the common process modeling world and the augmented world has been designed as a morphological box. Its demonstration proves that the drawn AR integrations work. The identified dimensions were derived from (1) a designed knowledge construction axiom, (2) a designed meta-model, (3) designed use cases and (4) designed directional interplay modes. Through a workshop-based survey, the so far best AR modeling configuration is identified, which can serve for benchmarks and implementations.
The implementation of learning scenarios is a diversely challenging, frequently purely manual and effortful undertaking. In this contribution, a process-based view is used in scenario generation to overcome communication, coordination and technical gaps. A framework is provided to identify, define and integrate technological artefacts and learning content as modular, reusable building blocks along a modeled production process. The specific contribution is twofold: (1) the theoretical framework represents a unique basis for modularizing content and technology in order to enhance reusability, and (2) the model-based scenario definition is a starting point for the automated implementation of learning scenarios in industrial learning environments, which has not existed before.
The fulfilment of security-relevant tasks, especially in the field of water supply, always takes place against the background of protecting the critical infrastructure itself and of effective civil protection. Organizing this protection therefore requires a cross-organizational perspective that goes beyond the operational view, in the overall context of the increasing interconnection of and dependencies between organizations. This brochure is therefore aimed in particular at small and medium-sized operators of critical infrastructures, especially in the water supply sector. It is intended to enable them to develop protection concepts that are requirements-oriented, scalable and, above all, resource-efficient.
Optimierung werksübergreifender Geschäftsprozesse am Beispiel der Automobilzuliefererindustrie
(2005)
The importance of suppliers for automotive manufacturers is growing steadily, because suppliers are being integrated ever more closely into the manufacturer's entire value-creation process, so that planning, quality and logistics workflows must be optimized continuously. Business applications such as Enterprise Resource Planning (ERP) or production planning and control (PPS) systems are needed to guarantee smooth, disruption-free business processes and thus the supplier's constant ability to deliver to the automotive manufacturer [1]. Based on a market overview, this article presents innovative approaches, options and coordination mechanisms in current ERP/PPS systems for supporting production at distributed sites.
Openness indicators for the evaluation of digital platforms between the launch and maturity phase
(2024)
In recent years, the evaluation of digital platforms has become an important focus in the field of information systems science. The identification of influential indicators that drive changes in digital platforms, specifically those related to openness, is still an unresolved issue. This paper addresses the challenge of identifying measurable indicators and characterizing the transition from launch to maturity in digital platforms. It proposes a systematic analytical approach to identify relevant openness indicators for evaluation purposes. The main contributions of this study are the following: (1) the development of a comprehensive procedure for analyzing indicators, (2) the categorization of indicators as evaluation metrics within a multidimensional grid-box model, (3) the selection and evaluation of relevant indicators, (4) the identification and assessment of digital platform architectures during the launch-to-maturity transition, and (5) the evaluation of the applicability of the conceptualization and design process for digital platform evaluation.
Carbon footprints are a much-discussed topic with far-reaching implications for individuals and companies alike. Companies can make a proactive contribution to transparency by reporting their company- or product-related carbon footprint. Once the decision has been made to report a carbon footprint and record the resulting greenhouse gases, a multitude of different standards and certificates exist, such as the Publicly Available Specification 2050, the Greenhouse Gas Protocol or ISO 14067. The aim of this article is to compare these three standards for calculating the product-related carbon footprint in order to highlight commonalities and differences as well as advantages and disadvantages in their application. The overview is intended to support companies in deciding whether a carbon footprint is suitable for their business.
Nowadays, production planning and control must cope with mass customization, increased fluctuations in demand, and high competitive pressure. Despite prevailing market risks, planning accuracy and increased adaptability in the event of disruptions or failures must be ensured, while simultaneously optimizing key process indicators. To manage that complex task, neural networks that can process large quantities of high-dimensional data in real time have been widely adopted in recent years. Although these are already extensively deployed in production systems, a systematic review of applications and implemented agent embeddings and architectures has not yet been conducted. The main contribution of this paper is to provide researchers and practitioners with an overview of applications and applied embeddings and to motivate further research in neural agent-based production. Findings indicate that neural agents are not only deployed in diverse applications, but are also increasingly implemented in multi-agent environments or in combination with conventional methods, improving performance compared to benchmarks and reducing dependence on human experience. This implies not only a more sophisticated focus on distributed production resources, but also a broadening of the perspective from a local to a global scale. Nevertheless, future research must further increase scalability and reproducibility to guarantee a simplified transfer of results to reality.
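The idea of a neural agent embedding production state can be illustrated with a deliberately minimal sketch. This is not a method from the survey: the "policy" below is just a fixed linear scoring of per-job features, standing in for the trained policy networks the paper discusses. All job data and weights are invented for illustration; in a deep reinforcement learning setup, the weights would be learned from reward signals.

```python
# Illustrative production-control agent: each waiting job is embedded as
# a feature vector (remaining processing time, slack until due date) and
# scored by a linear "policy"; the highest-scoring job is dispatched next.
jobs = [
    # (job id, remaining processing time, time until due date)
    ("J1", 5.0, 9.0),
    ("J2", 3.0, 4.0),
    ("J3", 8.0, 20.0),
]

WEIGHTS = (-0.2, -0.5)  # assumed fixed weights: prefer short, urgent jobs

def score(job):
    """Linear score of a job's feature embedding (higher = dispatch sooner)."""
    _, proc_time, slack = job
    return WEIGHTS[0] * proc_time + WEIGHTS[1] * slack

def dispatch(waiting):
    """Select the next job to release to the machine."""
    return max(waiting, key=score)[0]

print(dispatch(jobs))  # prints "J2": short and urgent
```

Replacing the linear score with a trained network, and running one such agent per machine, yields the multi-agent setups the survey reports on.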
The development of new and better optimization and approximation methods for Job Shop Scheduling Problems (JSP) relies on simulations to compare their performance. The test data required for this has an uncertain influence on the simulation results, because the feasible search space can change drastically with small variations of the initial problem model. Methods could benefit from this to varying degrees. This speaks in favor of defining standardized and reusable test data for JSP problem classes, which in turn requires that the test data be systematically describable in order to compile problem-adequate data sets. This article reviews the test data used for comparing methods in the literature. It also shows how and why the differences in test data have to be taken into account. From this, corresponding challenges are derived which the management of test data must face in the context of JSP research.
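A JSP instance of the kind such test data describes can be written down compactly. The sketch below uses a made-up two-job, two-machine instance (not from any benchmark set) and computes the makespan of a naive greedy schedule; changing a single processing time changes the result, which is the sensitivity to small instance variations that the paper highlights.

```python
# Minimal JSP instance: each job is an ordered list of
# (machine index, processing time) operations.
instance = {
    "J1": [(0, 3), (1, 2)],
    "J2": [(1, 4), (0, 2)],
}

def makespan(inst):
    """Makespan of the naive schedule that releases operations greedily,
    interleaving jobs operation by operation (no optimization)."""
    machine_free = {}  # earliest time each machine is available
    job_free = {}      # earliest time each job's next operation may start
    for idx in range(max(len(ops) for ops in inst.values())):
        for job, ops in inst.items():
            if idx >= len(ops):
                continue
            machine, duration = ops[idx]
            start = max(machine_free.get(machine, 0), job_free.get(job, 0))
            end = start + duration
            machine_free[machine] = end
            job_free[job] = end
    return max(job_free.values())

print(makespan(instance))  # prints 6 for this instance
```

Standardized test data would pin down exactly such instance descriptions (machine counts, operation orders, duration distributions) so that method comparisons run on reproducible, well-characterized problem classes.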
This article provides an introduction to the analysis and optimization of knowledge-intensive business processes. To this end, the modelling methodology KMDL® is presented and a scenario of its application to knowledge-intensive processes in the financial services sector is outlined. This is demonstrated using the service development process and the customer advisory process as examples. For both processes, potential fields made visible by applying KMDL® are identified and recommendations for action are derived.
Knowledge as a resource within corporate value creation has gained considerably in importance in recent years. Industries and business models whose value creation is largely based on the acquisition, generation and use of knowledge are particularly affected. Established tools for business process modelling usually consider only explicit knowledge, which is represented in static form. Personal knowledge that is not immediately required to produce information thereby falls out of view. This article examines classical business process modelling methods for their suitability for representing knowledge-intensive business processes. Based on the deficits found, the modelling approach KMDL (Knowledge Modeling Description Language) is presented.
This chapter discusses the need for a stronger practical orientation in creating concrete teaching and learning spaces in companies and highlights the advantages of a learning factory as a means of competence development against the background of ongoing digitalization. Technology-driven expansion of further-training goals requires suitable concepts and solutions. To this end, the creation of suitable teaching and learning situations is concretized in a goal-oriented manner. The presentation of how a model factory can be made usable as a learning factory for in-company further training not only shows a solution for the intended provision of flexible teaching and learning situations, but also delivers recommendations for action and best practices for successful competence development. Practitioners in particular benefit from the presentation of the learning factory: both corporate trainers and business leaders can derive implications from it for the didactic transformation of company workplaces into company learning spaces. The detailed description of a one-day training course on the effects of Industry 4.0 on employees' work, together with the illustration of a learning scenario, gives real insights into how in-company further training succeeds beyond didactics based on the teaching-learning short circuit.
Industry 4.0, i.e. the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, which is exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis and design of information processing and performance mapping business processes.
In the past, procedures were developed under the name of "process-oriented knowledge management" with which the exchange and use of knowledge in business processes could be represented, analyzed and improved. However, these approaches were limited to the office environment. A method that makes it possible to document, analyze and jointly optimize the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics in the same way, and in a manner compatible with the approach in the office environment, does not exist so far. The extension of the modeling language KMDL described in this paper will contribute to closing this research gap.
This paper describes first approaches towards an analysis and design method for knowledge management that integrates man and machine in the age of Industry 4.0.
The Knowledge Modeling Description Language KMDL is able to represent the creation, use and necessity of knowledge along common business processes. KMDL can thus be used to formalize knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. For computer-aided modeling and analysis, the tool K-Modeler is introduced.
Business processes are regularly modified either to capture requirements from the organization’s environment or due to internal optimization and restructuring. Implementing the changes into the individual work routines is aided by change management tools. These tools aim at the acceptance of the process by and empowerment of the process executor. They cover a wide range of general factors and seldom accurately address the changes in task execution and sequence. Furthermore, change is only framed as a learning activity, while most obstacles to change arise from the inability to unlearn or forget behavioural patterns one is acquainted with. Therefore, this paper aims to develop and demonstrate a notation to capture changes in business processes and identify elements that are likely to present obstacles during change. It connects existing research from changes in work routines and psychological insights from unlearning and intentional forgetting to the BPM domain. The results contribute to more transparency in business process models regarding knowledge changes. They provide better means to understand the dynamics and barriers of change processes.
The shift to automated production, the advancing digitalization of value-creation processes, and the continuous integration of mobile Industrial Internet of Things (IIoT) technologies into these processes to support employees pose challenges for in-company further training. More complex requirements and changing job profiles demand action competence from employees, in the sense of the ability to remain capable of acting in unfamiliar situations on the basis of one's own skills. This competence, and the comprehensive understanding of digitalized production processes it requires, cannot be achieved through conventional teaching methods, since these cannot do justice to the increased complexity of requirements and the complex feedback within the control loops involved. Taking up these aspects, a scenario-based further-training approach for a learning factory is presented below, which focuses in particular on the potential of mobile IIoT technologies for its design.
While Information Systems (IS) Research on the individual and workgroup level of analysis is omnipresent, research on the enterprise-level IS is less frequent. Even though research on Enterprise Systems and their management is established in academic associations and conference programs, enterprise-level phenomena are underrepresented. This minitrack provides a forum to integrate existing research streams that traditionally needed to be attached to other topics (such as IS management or IS governance). The minitrack received broad attention. The three selected papers address different facets of the future role of enterprise-wide IS including aspects such as carbonization, ecosystem integration, and technology-organization fit.