Traditional production systems are enhanced by cyber-physical systems (CPS) and the Internet of Things. As a kind of next-generation system, these cyber-physical production systems (CPPS) are able to raise the level of autonomy of their production components. To find the optimal degree of autonomy in a given context, a research approach is formulated using a simulation concept. Based on requirements and assumptions, a cyber-physical market is modeled and qualitative hypotheses are formulated, which will be verified with the help of the CPPS of a hybrid simulation environment.
Manufacturing companies still have relatively few points of contact with the circular economy. In particular, extending the lifetime of whole products or parts via remanufacturing is a promising approach to reducing waste. However, the necessary cost-efficient assessment of the condition of individual parts is challenging, and assessment procedures are technically complex (e.g., scanning and testing procedures). Furthermore, these assessment procedures are usually only available after the disassembly process has been completed. This is where conceptualization, data acquisition and simulation of remanufacturing processes can help. One major constraint on remanufacturing is the need to reduce logistic efforts, since these also have negative external effects on the environment. Thus, regionalization is an additional but ultimately consequential challenge for remanufacturing. This article aims to fill a gap by providing a regional remanufacturing approach, in particular the design of local remanufacturing chains. A further focus lies on modeling and simulating alternative courses of action, including a feasibility study and an economic assessment.
Industry 4.0, i.e. the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, which is exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis and design of information-processing and performance-mapping business processes.
In the past, procedures were developed under the name of "process-oriented knowledge management" with which the exchange and use of knowledge in business processes could be represented, analyzed and improved. However, these approaches were limited to the office environment. A method that makes it possible to document, analyze and jointly optimize the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics, in the same way and in a manner compatible with the approach in the office environment, does not yet exist. The extension of the modeling language KMDL, which is described in this paper, will contribute to closing this research gap.
This paper describes first approaches for an analysis and design method for a knowledge management integrating man and machine in the age of Industry 4.0.
Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities. Through this inconsistency, many synergistic effects are lost. Theories and models are more understandable and reusable if a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling by means of a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The community's acceptance of the variables and notation forms in use is shown by means of a compliance quotient, demonstrated through the evaluation of 240 scientific publications on planning methods.
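The compliance quotient mentioned above can be illustrated with a minimal sketch. The paper's exact definition is not reproduced here; this sketch assumes it is the share of surveyed publications that use the most frequent notation for a given concept, and all symbols below are hypothetical survey data:

```python
from collections import Counter

def compliance_quotient(observed_notations):
    """Assumed definition: share of publications using the most
    frequent notation for one concept."""
    counts = Counter(observed_notations)
    top_count = counts.most_common(1)[0][1]  # frequency of the dominant notation
    return top_count / len(observed_notations)

# Hypothetical example: symbols used for "processing time of job i on
# machine j" across eight publications.
symbols = ["p_ij", "p_ij", "t_ij", "p_ij", "d_ij", "p_ij", "p_ij", "t_ij"]
print(compliance_quotient(symbols))  # 0.625
```

A quotient near 1 would indicate a notation the community has converged on; values near 1/n signal the arbitrary homonym/synonym use the paper criticizes.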
Since more and more business tasks are enabled by Artificial Intelligence (AI)-based techniques, the number of knowledge-intensive tasks increases, as trivial tasks can be automated and non-trivial tasks demand human-machine interaction. With this, challenges regarding the management of knowledge workers and machines arise [9]. Furthermore, knowledge workers experience time pressure, which can lead to a decrease in output quality. Artificial Intelligence-based systems (AIS) have the potential to assist human workers in knowledge-intensive work. By providing a domain-specific language, contextual and situational awareness as well as process embedding can be specified, which enables the management of humans and AIS to ease knowledge transfer in a way that significantly improves process time, cost and quality. This contribution outlines a framework for designing these systems and accounts for their implementation.
Faced with the triad of time, cost and quality, the realization of knowledge-intensive tasks under economic conditions is not trivial. Since the number of knowledge-intensive processes is constantly increasing, the efficient design of knowledge transfers in business processes, as well as their target-oriented improvement, is essential so that process outcomes satisfy high quality criteria and economic requirements. This particularly challenges knowledge management, which aims to assign ideal manifestations of influence factors on knowledge transfers to a certain task. Building on first attempts at knowledge transfer-based process improvement [1], this paper continues research on the quantitative examination of knowledge transfers and presents a ready-to-use experiment design that can examine the quality of knowledge transfers empirically and on a quantitative level. Its use is proven by the example of four influence factors, namely stickiness, complexity, competence and time pressure.
Process models are the basic ingredient for many attempts to improve business processes. The graphical depiction of otherwise unobservable behavior in an enterprise is one of the most important techniques in the digital society: process models help to enable decision making in the design of processes and workflows. Nevertheless, it is not easy to model business processes correctly. Some approaches try to detect errors through an automated analysis of the process model. This contribution focuses on the creation of the first model from scratch: Which errors occur most frequently, and how can these be avoided?
From employee to expert
(2021)
In the context of the collaborative project Ageing-appropriate, process-oriented and interactive further training in SME (API-KMU), innovative solutions for the challenges of demographic change and digitalisation are being developed for SMEs. To this end, an approach to age-appropriate training will be designed with the help of AR technology. In times of the coronavirus pandemic, a special research design is necessary for the initial survey of the current state in the companies, which is systematically elaborated in this paper. The results of the previous methodological considerations illustrate the necessity of a mix of methods to generate a deeper insight into the work processes. Video-based retrospective interviews seem to be a suitable instrument to adequately capture the employees' interpretative perspectives on their work activities. In conclusion, the paper identifies specific challenges, such as creating acceptance among employees, open questions, e.g., how a transfer or generalization of the results can succeed, and hypotheses that will have to be tested in the further course of the research process.
Competence development must change at all didactic levels to meet the new requirements triggered by digitization. Unlike classic learning theories and the resulting popular approaches (e.g., the sender-receiver model), future-oriented vocational training must include new learning theory impulses in the discussion about competence acquisition. On the one hand, these impulses are often very well elaborated on the theoretical side, but the transfer into innovative learning environments - such as learning factories - is often still missing. On the other hand, current learning factory (design) approaches often concentrate primarily on the technical side. Subject-oriented learning theory enables the design of competence development-oriented vocational training projects in learning factories in which persons can obtain relevant competencies for digitization. At the same time, such learning theory approaches assume a potentially infinite number of learning interests and reasons. Following this, competence development is always located in an institutional or organizational context. The paper conceptually answers how this theory-immanent challenge can be synthesized with the reality of organizational competence development requirements.
The authors propose that while tacit knowledge is a valuable resource for developing new business models, its externalization presents several challenges. One major challenge is that individuals often do not recognize their tacit knowledge resources; another is the reluctance to share one's knowledge with others. Addressing these challenges, the authors present an application-oriented, serious game-based haptic modeling approach for externalizing tacit knowledge, which can be used to develop first versions of business models based on tacit knowledge. Both conceptual and practical design fundamentals are presented based on elaborated theoretical approaches, which were developed with the help of a design science approach. The development of the research process is presented step by step, with a focus on the high accessibility of the presented research. Practitioners are given guidelines for implementing their serious game projects. Scientists benefit from starting points for the research topics of externalization, internalization, and socialization of tacit knowledge, the development of business models, and serious games or gamification. The paper concludes with open research desiderata and questions from the presented research process.
The usage of gamification in the contexts of commerce, consumption, innovation or eLearning in schools and universities has been extensively researched. However, the potential of serious games to transfer and perpetuate knowledge and action patterns in learning factories has not been leveraged so far. The goal of this paper is to introduce a serious game as an instrument for knowledge transfer and perpetuation. Therefore, requirements for serious games in the context of learning factories are pointed out. As a result that builds on these requirements, a serious learning game for the topic of Industry 4.0 is practically designed and evaluated.
A growing number of business processes can be characterized as knowledge-intensive. The ability to speed up the transfer of knowledge between any kind of knowledge carriers in business processes with AR techniques can lead to a huge competitive advantage, for instance in manufacturing. This includes the transfer of person-bound knowledge as well as externalized knowledge of physical and virtual objects. The contribution builds on a time-dependent knowledge transfer model and conceptualizes an adaptable, AR-based application. With the intention of accelerating knowledge transfers between a manufacturer and an information system, empirical results of an experiment show the validity of this approach. For the first time, it will be possible to discover how to improve the transfer among knowledge carriers of an organization with knowledge-driven information systems (KDIS). Within an experiment setting, the paper shows how an adaptable KDIS improves the quantitative effects regarding the quality and amount of time needed for an example manufacturing process realization.
Process mining (PM) has established itself in recent years as a main method for visualizing and analyzing processes. However, the identification of knowledge has not been addressed adequately, because PM aims solely at the data-driven discovery, monitoring, and improvement of real-world processes from event logs available in various information systems. The following paper therefore outlines a novel systematic analysis view on tools for data-driven and machine learning (ML)-based identification of knowledge-intensive target processes. To support the effectiveness of the identification process, the main contributions of this study are (1) to design a procedure for a systematic review and analysis for the selection of relevant dimensions, (2) to identify different categories of dimensions as evaluation metrics to select source systems, algorithms, and tools for PM and ML, and to include them in a multi-dimensional grid box model, (3) to select and assess the most relevant dimensions of the model, (4) to identify and assess source systems, algorithms, and tools in order to find evidence for the selected dimensions, and (5) to assess the relevance and applicability of the conceptualization and design procedure for tool selection in data-driven and ML-based process mining research.
The digital transformation sets new requirements for all classes of enterprise systems in companies. ERP systems in particular, which represent the dominant class of enterprise systems, are struggling to meet the new requirements at all levels of the architecture. Therefore, there is an urgent need to reconsider the overall architecture of these systems and address the root of the related issues. Given that many of the restrictions ERP systems pose on adaptability are related to the standardization of data, the database layer of ERP systems is addressed. Since databases serve as the foundation for data storage and retrieval, they limit the flexibility of enterprise systems and the chance to adapt to new requirements accordingly. So far, relational databases are widely used. Using a systematic literature approach, recent requirements for ERP systems were identified. Prominent database approaches were assessed against the 23 requirements identified. The results reveal the strengths and weaknesses of recent database approaches and highlight the demand to combine multiple database approaches to fulfill recent business requirements. From a conceptual point of view, this paper supports the idea of interoperable federated databases to fulfill future requirements and support business operation. This research forms the basis for a renewal of the current generation of ERP systems and proposes that ERP vendors use different database concepts in the future.
To cope with the already large, and ever increasing, amount of information stored in organizational memory, "forgetting," as an important human memory process, might be transferred to the organizational context. Especially in intentionally planned change processes (e.g., change management), forgetting is an important precondition to impede the recall of obsolete routines and adapt to new strategic objectives accompanied by new organizational routines. We first comprehensively review the literature on the need for organizational forgetting and particularly on accidental vs. intentional forgetting. We discuss the current state of the art of theory and empirical evidence on forgetting from cognitive psychology in order to infer mechanisms applicable to the organizational context. In this respect, we emphasize retrieval theories and the relevance of retrieval cues important for forgetting. Subsequently, we transfer the empirical evidence that the elimination of retrieval cues leads to faster forgetting to the forgetting of organizational routines, as routines are part of organizational memory. We then propose a classification of cues (context, sensory, business process-related cues) that are relevant in the forgetting of routines, and discuss a meta-cue called the "situational strength" cue, which is relevant if cues of an old and a new routine are present simultaneously. Based on the classification as business process-related cues (information, team, task, object cues), we propose mechanisms to accelerate forgetting by eliminating specific cues based on the empirical and theoretical state of the art. We conclude that in intentional organizational change processes, the elimination of cues to accelerate forgetting should be used in change management practices.
Technological advancements are giving rise to the fourth industrial revolution - Industry 4.0 - characterized by the mass employment of smart objects in highly reconfigurable and thoroughly connected industrial product-service systems. The purpose of this paper is to propose a theory-based knowledge dynamics model in the smart grid scenario that would provide a holistic view on the knowledge-based interactions among smart objects, humans, and other actors as an underlying mechanism of value co-creation in Industry 4.0. A multi-loop and three-layer - physical, virtual, and interface - model of knowledge dynamics is developed by building on the concept of ba - an enabling space for interactions and the emergence of knowledge. The model depicts how big data analytics are just one component in unlocking the value of big data, whereas the tacit engagement of humans-in-the-loop - their sense-making and decision-making - is needed for insights to be evoked from analytics reports and customer needs to be met.
The development of new and better optimization and approximation methods for Job Shop Scheduling Problems (JSP) uses simulations to compare their performance. The test data required for this has an uncertain influence on the simulation results, because the feasible search space can be changed drastically by small variations of the initial problem model, and methods could benefit from this to varying degrees. This speaks in favor of defining standardized and reusable test data for JSP problem classes, which in turn requires a systematic describability of the test data in order to be able to compile problem-adequate data sets. This article examines the test data used for comparing methods by means of a literature review. It also shows how and why the differences in test data have to be taken into account. From this, corresponding challenges are derived which the management of test data must face in the context of JSP research.
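The systematic describability of JSP test data argued for above can be sketched as a minimal data model. All class names and key figures below are illustrative assumptions, not taken from the article; the point is that an instance carries a machine-readable descriptor from which problem-adequate data sets could be compiled:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Operation:
    machine: int   # index of the machine this operation requires
    duration: int  # processing time units

@dataclass
class JspInstance:
    """A JSP test instance: each job is an ordered list of operations."""
    name: str
    jobs: list

    def descriptor(self):
        """Key figures (assumed examples) for describing the instance."""
        durations = [op.duration for job in self.jobs for op in job]
        machines = {op.machine for job in self.jobs for op in job}
        return {
            "jobs": len(self.jobs),
            "machines": len(machines),
            "total_processing_time": sum(durations),
            "max_operation_duration": max(durations),
        }

# Hypothetical 3-job, 2-machine toy instance.
instance = JspInstance("toy-3x2", [
    [Operation(0, 3), Operation(1, 2)],
    [Operation(1, 4), Operation(0, 1)],
    [Operation(0, 2), Operation(1, 2)],
])
print(instance.descriptor())
```

Two instances with identical descriptors can still differ in feasible search space, which is exactly why the article argues that such key figures must be chosen and managed systematically.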
Collaboration during the modeling process is uncomfortable and characterized by various limitations. Following the successful transfer of first process modeling languages to the augmented world, non-transparent processes can be visualized in a more comprehensive way. With the aim of raising the comfort, speed, accuracy and manifoldness of real-world process augmentations, a framework for the bidirectional interplay between the common process modeling world and the augmented world has been designed as a morphological box. Its demonstration proves that the designed AR integrations work. The identified dimensions were derived from (1) a designed knowledge construction axiom, (2) a designed meta-model, (3) designed use cases and (4) designed directional interplay modes. Through a workshop-based survey, the best AR modeling configuration so far is identified, which can serve for benchmarks and implementations.