Nowadays, production planning and control must cope with mass customization, increased fluctuations in demand, and high competitive pressure. Despite prevailing market risks, planning accuracy and increased adaptability in the event of disruptions or failures must be ensured, while key process indicators are optimized simultaneously. To manage this complex task, neural networks that can process large quantities of high-dimensional data in real time have been widely adopted in recent years. Although these are already extensively deployed in production systems, a systematic review of applications and of the implemented agent embeddings and architectures has not yet been conducted. The main contribution of this paper is to provide researchers and practitioners with an overview of applications and applied embeddings and to motivate further research in neural agent-based production. Findings indicate that neural agents are not only deployed in diverse applications, but are also increasingly implemented in multi-agent environments or in combination with conventional methods, improving performance over benchmarks and reducing dependence on human experience. This implies not only a more sophisticated focus on distributed production resources, but also a broadening of the perspective from a local to a global scale. Nevertheless, future research must further increase scalability and reproducibility to ensure a straightforward transfer of results to practice.
The development of new and better optimization and approximation methods for Job Shop Scheduling Problems (JSP) relies on simulations to compare their performance. The test data required for this has an uncertain influence on the simulation results, because the feasible search space can change drastically with small variations of the initial problem model, and methods may benefit from this to varying degrees. This speaks in favor of defining standardized and reusable test data for JSP problem classes, which in turn requires that the test data be systematically describable, so that problem-adequate data sets can be compiled. This article examines, by means of a literature review, the test data used for comparing methods. It also shows how and why the differences in test data have to be taken into account. From this, corresponding challenges are derived that the management of test data must face in the context of JSP research.
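The abstract above notes that small variations of the initial problem model can drastically change the feasible search space and, with it, the results that methods achieve. A minimal sketch of this effect, using a hypothetical toy instance and a simple greedy dispatcher invented here for illustration (not taken from the paper):

```python
def greedy_makespan(jobs):
    """Schedule operations in round-robin order over the jobs, one operation
    per pass, starting each as early as possible; return the makespan.
    `jobs` is a list of jobs, each an ordered list of (machine, duration)."""
    machine_free = {}             # machine id -> time the machine becomes free
    job_ready = [0] * len(jobs)   # job id -> earliest start of its next operation
    remaining = [list(ops) for ops in jobs]
    while any(remaining):
        for j, ops in enumerate(remaining):
            if not ops:
                continue
            machine, dur = ops.pop(0)
            start = max(job_ready[j], machine_free.get(machine, 0))
            job_ready[j] = start + dur
            machine_free[machine] = start + dur
    return max(job_ready)

# Toy 2-job, 2-machine instance and a variant with one duration changed.
base = [[(0, 3), (1, 2)],
        [(1, 2), (0, 2)]]
perturbed = [[(0, 3), (1, 2)],
             [(1, 4), (0, 2)]]   # second job's first operation: 2 -> 4

print(greedy_makespan(base), greedy_makespan(perturbed))  # → 5 6
```

A single changed duration already shifts the makespan this dispatcher reaches; in realistic instances, such variations can also open or close entire regions of the feasible search space, which is why comparable, well-described test data matters.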
This contribution provides an introduction to the analysis and optimization of knowledge-intensive business processes. To this end, the modeling methodology KMDL® is presented and a scenario of its application to knowledge-intensive processes in the financial services sector is outlined. This is demonstrated using the service development process and the customer advisory process as examples. For both processes, the use of KMDL® reveals identifiable fields of potential, and recommendations for action are derived.
Knowledge as a resource and a component of corporate value creation has gained considerably in importance in recent years. Industries and business models whose value creation is largely based on the acquisition, generation, and use of knowledge are particularly affected. Established tools for business process modeling generally consider only explicit knowledge, which is represented in static form. Person-bound knowledge that is not directly required for generating information thereby falls out of view. This contribution examines classical approaches to business process modeling for their suitability to represent knowledge-intensive business processes. Based on the deficits identified, the modeling approach KMDL (Knowledge Modeling Description Language) is presented.
This chapter discusses the need for a stronger practice orientation in creating concrete teaching and learning spaces in companies, and highlights the advantages of a learning factory as a means of competence development against the background of ongoing digitalization. The training objectives expanded by technology require suitable concepts and solutions. To this end, suitable teaching and learning situations are specified in a goal-oriented manner. The presentation of how a model factory can be made usable as a learning factory for in-company training practice not only offers a solution for providing flexible teaching and learning situations, but also delivers recommendations for action and best practices for successful competence development. Practitioners in particular benefit from the description of the learning factory: both corporate trainers and business managers can derive implications from it for the didactic transformation of company workplaces into company learning places. The detailed description of a one-day training course on the effects of Industrie 4.0 on employees' work, together with the illustration of a learning scenario, provides real insights into how in-company training can succeed beyond didactics based on the teaching-learning short circuit.
Industry 4.0, i.e., the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, which is exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis, and design of information-processing and performance-mapping business processes.
In the past, procedures were developed under the name of “process-oriented knowledge management” with which the exchange and use of knowledge in business processes could be represented, analyzed, and improved. However, these approaches were limited to the office environment. A method that makes it possible to document, analyze, and jointly optimize the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics in the same way, and in a manner compatible with the approach in the office environment, does not yet exist. The extension of the modeling language KMDL described in this paper will contribute to closing this research gap.
This paper describes first approaches for an analysis and design method for a knowledge management integrating man and machine in the age of Industry 4.0.
The Knowledge Modeling Description Language (KMDL) is able to represent the creation, use, and necessity of knowledge along common business processes. KMDL can thus be used to formalize knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. For computer-aided modeling and analysis, the tool K-Modeler is introduced.
Business processes are regularly modified, either to capture requirements from the organization’s environment or as a result of internal optimization and restructuring. Implementing the changes in individual work routines is aided by change management tools. These tools aim at acceptance of the process by, and empowerment of, the process executor. They cover a wide range of general factors and seldom address the changes in task execution and sequence accurately. Furthermore, change is framed only as a learning activity, while most obstacles to change arise from the inability to unlearn or forget behavioural patterns one is acquainted with. Therefore, this paper aims to develop and demonstrate a notation to capture changes in business processes and to identify elements that are likely to present obstacles during change. It connects existing research on changes in work routines, together with psychological insights from unlearning and intentional forgetting, to the BPM domain. The results contribute to more transparency in business process models regarding knowledge changes and provide better means to understand the dynamics and barriers of change processes.
The shift to automated production, the advancing digitalization of value-creation processes, and the steady implementation of mobile Industrial Internet of Things (IIoT) technologies to support employees within these processes confront in-company training with challenges. More complex requirements and changed job profiles demand action competence from employees, in the sense of the ability to remain capable of acting in unfamiliar situations on the basis of one's own skills. However, such competence, and the comprehensive understanding of digitalized production processes it requires, cannot be achieved through conventional teaching methods, since these cannot do justice to the increased complexity of requirements and the complex feedback within the control loops. Taking up these aspects, a scenario-based training approach for a learning factory is presented below, focusing in particular on the potential of mobile IIoT technologies for its design.
While Information Systems (IS) Research on the individual and workgroup level of analysis is omnipresent, research on the enterprise-level IS is less frequent. Even though research on Enterprise Systems and their management is established in academic associations and conference programs, enterprise-level phenomena are underrepresented. This minitrack provides a forum to integrate existing research streams that traditionally needed to be attached to other topics (such as IS management or IS governance). The minitrack received broad attention. The three selected papers address different facets of the future role of enterprise-wide IS including aspects such as carbonization, ecosystem integration, and technology-organization fit.