Refine
Year of publication
Document Type
- Article (46)
- Part of a Book (15)
- Conference Proceeding (15)
- Other (6)
- Postprint (3)
- Monograph/Edited Volume (2)
- Review (1)
Language
- English (88)
Is part of the Bibliography
- yes (88)
Keywords
- knowledge management (6)
- Industry 4.0 (5)
- digital learning (4)
- COVID-19 (3)
- CPPS (3)
- CPS (3)
- JSP (3)
- business processes (3)
- evaluation (3)
- learning factories (3)
Institute
- Fachgruppe Betriebswirtschaftslehre (45)
- Wirtschaftswissenschaften (35)
- Hasso-Plattner-Institut für Digital Engineering GmbH (3)
- Wirtschafts- und Sozialwissenschaftliche Fakultät (2)
- Department Psychologie (1)
- Fachgruppe Politik- & Verwaltungswissenschaft (1)
- Forschungsbereich „Politik, Verwaltung und Management“ (1)
- Sozialwissenschaften (1)
Traditional production systems are enhanced by cyber-physical systems (CPS) and the Internet of Things. As a kind of next-generation system, these cyber-physical production systems (CPPS) can raise the level of autonomy of their production components. To find the optimal degree of autonomy in a given context, a research approach is formulated using a simulation concept. Based on requirements and assumptions, a cyber-physical market is modeled and qualitative hypotheses are formulated, which will be verified with the help of the CPPS of a hybrid simulation environment.
Manufacturing companies still have relatively few points of contact with the circular economy. In particular, extending the lifetime of whole products or parts via remanufacturing is a promising approach to reducing waste. However, the necessary cost-efficient assessment of the condition of the individual parts is challenging, and assessment procedures are technically complex (e.g., scanning and testing procedures). Furthermore, these assessment procedures are usually only available after the disassembly process has been completed. This is where conceptualization, data acquisition and simulation of remanufacturing processes can help. One major constraint on remanufacturing is the need to reduce logistic efforts, since these also have negative external effects on the environment. Regionalization is thus an additional, but ultimately consequential, challenge for remanufacturing. This article aims to fill a gap by providing a regional remanufacturing approach, in particular the design of local remanufacturing chains. A further focus lies on modeling and simulating alternative courses of action, including a feasibility study and an economic assessment.
Industry 4.0, i.e., the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, which is exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis and design of information-processing and performance-mapping business processes.
In the past, procedures were developed under the name of “process-oriented knowledge management” with which the exchange and use of knowledge in business processes could be represented, analyzed and improved. However, these approaches were limited to the office environment. A method that makes it possible to document, analyze and jointly optimize the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics in the same way, and in a manner compatible with the approach in the office environment, does not yet exist. The extension of the modeling language KMDL described in this paper will contribute to closing this research gap.
This paper describes first approaches toward an analysis and design method for knowledge management integrating man and machine in the age of Industry 4.0.
Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities. Through this inconsistency, many synergistic effects are lost. Theories and models become more understandable and reusable when a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling by means of a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The community's acceptance of the variables and notation forms in use is measured by means of a compliance quotient, demonstrated through an evaluation of 240 scientific publications on planning methods.
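The abstract does not define the compliance quotient formally; a minimal sketch, assuming it denotes the share of surveyed publications that use the most frequent symbol for one and the same concept (the symbol data below is invented for illustration):

```python
from collections import Counter

def compliance_quotient(symbols):
    """Share of publications using the most frequent symbol for a concept
    (hypothetical definition, not taken from the surveyed paper)."""
    counts = Counter(symbols)
    most_common_count = counts.most_common(1)[0][1]
    return most_common_count / len(symbols)

# Symbols used for "processing time" across six surveyed papers (invented)
usage = ["p", "p", "t", "p", "d", "p"]
print(compliance_quotient(usage))  # 4 of 6 papers agree -> 0.666...
```

Under this reading, a quotient near 1.0 would indicate broad notational consensus, while low values signal the homonym/synonym problem the abstract criticizes.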
Since more and more business tasks are enabled by Artificial Intelligence (AI)-based techniques, the number of knowledge-intensive tasks increases, as trivial tasks can be automated while non-trivial tasks demand human-machine interaction. With this, challenges regarding the management of knowledge workers and machines arise [9]. Furthermore, knowledge workers experience time pressure, which can lead to a decrease in output quality. Artificial Intelligence-based systems (AIS) have the potential to assist human workers in knowledge-intensive work. By providing a domain-specific language, their contextual and situational awareness as well as their process embedding can be specified, which enables the management of humans and AIS to ease knowledge transfer in such a way that process time, cost and quality are improved significantly. This contribution outlines a framework for designing these systems and accounts for their implementation.
Faced with the triad of time, cost and quality, realizing knowledge-intensive tasks under economic conditions is not trivial. Since the number of knowledge-intensive processes is steadily increasing, the efficient design of knowledge transfers in business processes, as well as their target-oriented improvement, is essential so that process outcomes satisfy high quality criteria and economic requirements. This particularly challenges knowledge management, which aims to assign the ideal manifestations of factors influencing knowledge transfers to a certain task. Building on first attempts at knowledge transfer-based process improvement [1], this paper continues research on the quantitative examination of knowledge transfers and presents a ready-to-use experiment design that can examine the quality of knowledge transfers both empirically and on a quantitative level. Its use is demonstrated by the example of four influence factors, namely stickiness, complexity, competence and time pressure.
Process models are the basic ingredient of many attempts to improve business processes. The graphical depiction of otherwise unobservable behavior in an enterprise is one of the most important techniques in the digital society. Process models help enable decision making in the design of processes and workflows. Nevertheless, it is not easy to model business processes correctly. Some approaches try to detect errors through automated analysis of the process model. This contribution focuses on the creation of the first model from scratch: which errors occur most frequently, and how can they be avoided?
From employee to expert
(2021)
In the context of the collaborative project Ageing-appropriate, process-oriented and interactive further training in SME (API-KMU), innovative solutions for the challenges of demographic change and digitalisation are being developed for SMEs. To this end, an approach to age-appropriate training will be designed with the help of AR technology. In times of the coronavirus pandemic, a special research design is necessary for the initial survey of the current state in the companies, which is systematically elaborated in this paper. The results of the previous methodological considerations illustrate the necessity of a mix of methods to generate a deeper insight into the work processes. Video-based retrospective interviews seem to be a suitable instrument to adequately capture the employees' interpretative perspectives on their work activities. In conclusion, the paper identifies specific challenges, such as creating acceptance among employees, open questions, e.g., how a transfer or generalization of the results can succeed, and hypotheses that will have to be tested in the further course of the research process.
Competence development must change at all didactic levels to meet the new requirements triggered by digitization. Unlike classic learning theories and the resulting popular approaches (e.g., the sender-receiver model), future-oriented vocational training must include new learning theory impulses in the discussion about competence acquisition. On the one hand, these impulses are often very well elaborated on the theoretical side, but the transfer into innovative learning environments - such as learning factories - is often still missing. On the other hand, actual learning factory (design) approaches often concentrate primarily on the technical side. Subject-oriented learning theory enables the design of competence development-oriented vocational training projects in learning factories in which persons can obtain relevant competencies for digitization. At the same time, such learning theory approaches assume a potentially infinite number of learning interests and reasons. Following this, competence development is always located in an institutional or organizational context. The paper conceptually answers how this theory-immanent challenge can be synthesized with the reality of organizational competence development requirements.
The authors propose that while tacit knowledge is a valuable resource for developing new business models, its externalization presents several challenges. One major challenge is that individuals often do not recognize their tacit knowledge resources; another is the reluctance to share one's knowledge with others. Addressing these challenges, the authors present an application-oriented, serious game-based haptic modeling approach for externalizing tacit knowledge, which can be used to develop first versions of business models based on tacit knowledge. Both conceptual and practical design fundamentals are presented based on elaborated theoretical approaches, which were developed with the help of a design science approach. The development of the research process is presented step by step, with a focus on the high accessibility of the presented research. Practitioners are presented with guidelines for implementing their serious game projects. Scientists benefit from starting points for their research topics of externalization, internalization, and socialization of tacit knowledge, development of business models, and serious games or gamification. The paper concludes with open research desiderata and questions from the presented research process.
The usage of gamification in the contexts of commerce, consumption, innovation and eLearning in schools and universities has been extensively researched. However, the potential of serious games to transfer and perpetuate knowledge and action patterns in learning factories has not been leveraged so far. The goal of this paper is to introduce a serious game as an instrument for knowledge transfer and perpetuation. Therefore, requirements for serious games in the context of learning factories are pointed out. Building on these requirements, a serious learning game for the topic of Industry 4.0 is practically designed and evaluated.
Digitization and demographic change are enormous challenges for companies. Learning factories as innovative learning places can help prepare older employees for the digital change but must be designed and configured based on their specific learning requirements. To date, however, there are no particular recommendations to ensure effective age-appropriate training of blue-collar workers in learning factories. Therefore, based on a literature review, design characteristics and attributes of learning factories and learning requirements of older employees are presented. Furthermore, didactical recommendations for realizing age-appropriate learning designs in learning factories and a conceptualized scenario are outlined by synthesizing the findings.
A growing number of business processes can be characterized as knowledge-intensive. The ability to speed up the transfer of knowledge between any kind of knowledge carriers in business processes with AR techniques can lead to a huge competitive advantage, for instance in manufacturing. This includes the transfer of person-bound knowledge as well as the externalized knowledge of physical and virtual objects. The contribution builds on a time-dependent knowledge transfer model and conceptualizes an adaptable, AR-based application. With the intention of accelerating knowledge transfers between a manufacturer and an information system, empirical results of an experiment show the validity of this approach. For the first time, it will be possible to discover how to improve the transfer among knowledge carriers of an organization with knowledge-driven information systems (KDIS). Within an experiment setting, the paper shows how an adaptable KDIS improves the quantitative effects regarding the quality of, and the amount of time needed for, an example manufacturing process realization.
Process mining (PM) has established itself in recent years as a main method for visualizing and analyzing processes. However, the identification of knowledge has not been addressed adequately because PM aims solely at data-driven discovering, monitoring, and improving real-world processes from event logs available in various information systems. The following paper, therefore, outlines a novel systematic analysis view on tools for data-driven and machine learning (ML)-based identification of knowledge-intensive target processes. To support the effectiveness of the identification process, the main contributions of this study are (1) to design a procedure for a systematic review and analysis for the selection of relevant dimensions, (2) to identify different categories of dimensions as evaluation metrics to select source systems, algorithms, and tools for PM and ML as well as include them in a multi-dimensional grid box model, (3) to select and assess the most relevant dimensions of the model, (4) to identify and assess source systems, algorithms, and tools in order to find evidence for the selected dimensions, and (5) to assess the relevance and applicability of the conceptualization and design procedure for tool selection in data-driven and ML-based process mining research.
With the latest technological developments and the associated new possibilities in teaching, the personalisation of learning is gaining more and more importance. It assumes that individual learning experiences and results could generally be improved when personal learning preferences are considered. To do justice to the complexity of the personalisation possibilities of teaching and learning processes, we illustrate the components of learning and teaching in the digital environment and their interdependencies in an initial model. Furthermore, in a pre-study, we investigate the relationships between the learner's ability to (digitally) self-organise, the learner's prior knowledge, learning in different variants of mode, and learning outcomes as one part of this model. With this pre-study, we are taking the first step towards a holistic model of teaching and learning in digital environments.
The digital transformation sets new requirements for all classes of enterprise systems in companies. ERP systems in particular, which represent the dominant class of enterprise systems, are struggling to meet the new requirements at all levels of the architecture. Therefore, there is an urgent need to reconsider the overall architecture of these systems and address the root of the related issues. Given that many restrictions ERP systems pose on their adaptability are related to the standardization of data, the database layer of ERP systems is addressed. Since databases serve as the foundation for data storage and retrieval, they limit the flexibility of enterprise systems and the chance to adapt to new requirements accordingly. So far, relational databases are widely used. Using a systematic literature approach, recent requirements for ERP systems were identified. Prominent database approaches were assessed against the 23 requirements identified. The results reveal the strengths and weaknesses of recent database approaches. To this end, the results highlight the demand to combine multiple database approaches to fulfill recent business requirements. From a conceptual point of view, this paper supports the idea of federated databases that are interoperable to fulfill future requirements and support business operation. This research forms the basis for a renewal of the current generation of ERP systems and proposes that ERP vendors use different database concepts in the future.
Yes, we can (?)
(2021)
The COVID-19 crisis has caused an extreme situation for higher education institutions around the world, where exclusively virtual teaching and learning has become obligatory rather than an additional supporting feature. This has created opportunities to explore the potential and limitations of virtual learning formats. This paper presents four theses on virtual classroom teaching and learning that are discussed critically. We use existing theoretical insights extended by empirical evidence from a survey of more than 850 students on acceptance, expectations, and attitudes regarding the positive and negative aspects of virtual teaching. The survey responses were gathered from students at different universities during the first completely digital semester (Spring-Summer 2020) in Germany. We discuss similarities and differences between the subjects being studied and highlight the advantages and disadvantages of virtual teaching and learning. Against the background of existing theory and the gathered data, we emphasize the importance of social interaction, the combination of different learning formats, and thus context-sensitive hybrid learning as the learning form of the future.
Increasingly fast development cycles and individualized products pose major challenges for today's smart production systems in times of Industry 4.0. The systems must be flexible and continuously adapt to changing conditions while still guaranteeing high throughputs and robustness against external disruptions. Deep reinforcement learning (RL) algorithms, which already achieved impressive success with Google DeepMind's AlphaGo, are increasingly transferred to production systems to meet related requirements. Unlike supervised and unsupervised machine learning techniques, deep RL algorithms learn based on recently collected sensor and process data in direct interaction with the environment and are able to make decisions in real time. As such, deep RL algorithms seem promising given their potential to provide decision support in complex environments, such as production systems, and simultaneously adapt to changing circumstances. While different use cases for deep RL have emerged, a structured overview and integration of findings on their application are missing. To address this gap, this contribution provides a systematic literature review of existing deep RL applications in the field of production planning and control as well as production logistics. From a performance perspective, it became evident that deep RL can beat heuristics significantly in their overall performance and provides superior solutions to various industrial use cases. Nevertheless, safety and reliability concerns must be overcome before the widespread use of deep RL is possible, which presumes more intensive testing of deep RL in real-world applications besides the already ongoing intensive simulations.
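The learning-by-interaction principle described above can be illustrated with a deliberately simplified tabular sketch; the surveyed works use deep networks, and the dispatching scenario, reward values and hyperparameters here are invented for illustration only. An agent repeatedly assigns jobs to one of two machines and discovers the better one purely from noisy reward feedback:

```python
import random

# Toy dispatching decision: send each job to machine 0 or machine 1.
# Machine 1 yields a higher expected reward (invented numbers); the
# agent must discover this from interaction alone.
random.seed(0)
REWARD = {0: 0.3, 1: 0.8}   # hidden expected reward per machine
q = [0.0, 0.0]              # Q-value estimate per action
alpha, eps = 0.1, 0.2       # learning rate, exploration rate

for _ in range(2000):
    # epsilon-greedy action selection
    a = random.randrange(2) if random.random() < eps else q.index(max(q))
    r = REWARD[a] + random.uniform(-0.1, 0.1)  # noisy reward signal
    q[a] += alpha * (r - q[a])                 # incremental Q-update

print(q.index(max(q)))  # the agent settles on machine 1
```

Deep RL replaces the tiny Q-table with a neural network so the same update principle scales to the high-dimensional sensor and process data of real production systems.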
To cope with the already large, and ever increasing, amount of information stored in organizational memory, "forgetting," as an important human memory process, might be transferred to the organizational context. Especially in intentionally planned change processes (e.g., change management), forgetting is an important precondition to impede the recall of obsolete routines and adapt to new strategic objectives accompanied by new organizational routines. We first comprehensively review the literature on the need for organizational forgetting and particularly on accidental vs. intentional forgetting. We discuss the current state of the art of theory and empirical evidence on forgetting from cognitive psychology in order to infer mechanisms applicable to the organizational context. In this respect, we emphasize retrieval theories and the relevance of retrieval cues important for forgetting. Subsequently, we transfer the empirical evidence that the elimination of retrieval cues leads to faster forgetting to the forgetting of organizational routines, as routines are part of organizational memory. We then propose a classification of cues (context, sensory, business process-related cues) that are relevant in the forgetting of routines, and discuss a meta-cue called the "situational strength" cue, which is relevant if cues of an old and a new routine are present simultaneously. Based on the classification as business process-related cues (information, team, task, object cues), we propose mechanisms to accelerate forgetting by eliminating specific cues based on the empirical and theoretical state of the art. We conclude that in intentional organizational change processes, the elimination of cues to accelerate forgetting should be used in change management practices.
Technological advancements are giving rise to the fourth industrial revolution - Industry 4.0 - characterized by the mass employment of smart objects in highly reconfigurable and thoroughly connected industrial product-service systems. The purpose of this paper is to propose a theory-based knowledge dynamics model in the smart grid scenario that would provide a holistic view on the knowledge-based interactions among smart objects, humans, and other actors as an underlying mechanism of value co-creation in Industry 4.0. A multi-loop and three-layer - physical, virtual, and interface - model of knowledge dynamics is developed by building on the concept of ba - an enabling space for interactions and the emergence of knowledge. The model depicts how big data analytics are just one component in unlocking the value of big data, whereas the tacit engagement of humans-in-the-loop - their sense-making and decision-making - is needed for insights to be evoked from analytics reports and customer needs to be met.
The development of new and better optimization and approximation methods for Job-Shop Scheduling Problems (JSP) relies on simulations to compare their performance. The test data required for this has an uncertain influence on the simulation results, because the feasible search space can change drastically with small variations of the initial problem model. Methods could benefit from this to varying degrees. This speaks in favor of defining standardized and reusable test data for JSP problem classes, which in turn requires a systematic describability of the test data in order to compile problem-adequate data sets. This article examines the test data used for comparing methods by means of a literature review. It also shows how and why the differences in test data have to be taken into account. From this, corresponding challenges are derived which the management of test data must face in the context of JSP research.
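Seeded instance generation is one possible way to make JSP test data standardized and reusable in the sense argued above; the generator below is a minimal hypothetical sketch (its parameter ranges and structure are assumptions, not taken from the surveyed benchmarks):

```python
import random

def jsp_instance(n_jobs, n_machines, seed):
    """Generate a reproducible job-shop instance: each job is a list of
    (machine, processing_time) operations in a random machine order.
    Hypothetical generator for illustration."""
    rng = random.Random(seed)  # dedicated RNG keeps results reproducible
    jobs = []
    for _ in range(n_jobs):
        machines = list(range(n_machines))
        rng.shuffle(machines)  # random technological machine order
        jobs.append([(m, rng.randint(1, 99)) for m in machines])
    return jobs

# The same seed always yields the same instance, so method comparisons
# run on identical search spaces.
a = jsp_instance(3, 3, seed=42)
b = jsp_instance(3, 3, seed=42)
print(a == b)  # True
```

Publishing such seeds and generator parameters alongside results would let small variations of the problem model be controlled for, rather than silently favoring one method over another.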
Collaboration during the modeling process is uncomfortable and characterized by various limitations. With the successful transfer of first process modeling languages to the augmented world, non-transparent processes can be visualized more comprehensively. With the aim of raising the comfort, speed, accuracy and manifoldness of real-world process augmentations, a framework for the bidirectional interplay between the common process modeling world and the augmented world has been designed as a morphological box. Its demonstration shows that the outlined AR integrations work. The identified dimensions were derived from (1) a designed knowledge construction axiom, (2) a designed meta-model, (3) designed use cases and (4) designed directional interplay modes. Through a workshop-based survey, the best AR modeling configuration so far is identified, which can serve for benchmarks and implementations.
Application of knowledge management methods for the improvement of education and training needs
(2006)
Skill Management
(2006)
Requirements for an integration of methods analyzing social issues in knowledge organizations
(2006)
Public services are not the only ones able to ensure the effective and efficient use of e-democracy tools. This contribution points out how a party must be structured to function as a neutral service provider for the citizen, making the results of electronic decision-making processes generally binding. The party provides only the methodology and the technology of decision making. Contents are defined exclusively by the citizens. These contents and voting results are implemented bindingly in the parliament by the delegates of the party. Electronic democracy thus supplements representative democracy with scalable direct-democratic elements. Every four or five years, in the national elections, citizens can determine to what extent each political decision is to be influenced direct-democratically via e-democracy tools. Such an approach is subject to other requirements than a government-offered service.
Adaptability of information systems has become a substantial competitive factor. Today's insufficient methodical support for the realization of adaptability frequently leads to unused potentials of deployed information technology in enterprises. In this contribution, a procedure is presented which addresses the need to determine the necessary adaptability of an enterprise in relation to its surrounding environment.
The concept of adaptability has been widely recognised as a research field in recent years. Business information systems play a key part in terms of business performance. Adaptability of information systems is therefore a primary goal of vendors and end-users. However, concepts that help to determine the adaptability of information systems have so far been missing. Based on research results of the project CHANGE1, this contribution presents an integrated process model addressing the problem and a possible solution.
Existing approaches in the area of knowledge-intensive processes focus on integrated knowledge and process management systems, the support of processes with KM systems, or the analysis of knowledge-intensive activities. For capturing knowledge-intensive business processes, well-known and established methods do not meet the requirements of a comprehensive and integrated approach to process-oriented knowledge management. These approaches are not able to adequately visualise the decisions, actions and measures which cause the sequence of the processes. Parallel to conventional processes, knowledge-intensive processes exist. These processes are based on conversions of knowledge within them. To fill these gaps in modelling knowledge-intensive business processes, the Knowledge Modelling and Description Language (KMDL) was developed. The KMDL is able to represent the development, use, offer and demand of knowledge along business processes. Furthermore, it is possible to show the existing knowledge conversions which take place in addition to the normal business processes. The KMDL can be used to formalise knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify process improvements in these processes. The KMDL modelling tool K-Modeler is introduced for computer-aided modelling and analysis. The technical framework and the most important functionalities to support the analysis of the captured processes are introduced in the following contribution.
Process-oriented knowledge management focuses on knowledge-intensive business processes. For the modelling and analysis of these processes, the modelling technique KMDL (Knowledge Modeling and Description Language) has been developed. KMDL is a method to describe knowledge flows and conversions along and between business processes. Thereby, KMDL identifies existing and utilized information as well as knowledge of individual participants and of the entire company. This research-in-progress contribution introduces a practical example from the field of software engineering, in which KMDL models are evaluated to identify process improvements, e.g., by adding knowledge management activities. To this end, three individual views focusing on selected aspects of interest are introduced.
Lately, first implementation approaches of Internet of Things (IoT) technologies have begun to penetrate industrial value-adding processes. Within this, the competence requirements for employees are changing. Employees’ organization, process, and interaction competences are of crucial importance in this new IoT environment; however, they are not yet sufficiently considered in student and vocational training. On the other hand, conventional learning factories are evolving and transforming into digital learning factories. Nevertheless, the integration of IoT technology and its usage for training in digital learning factories has been largely neglected thus far. Existing learning factories do not explicitly and properly consider IoT technology, which leads to deficiencies regarding an appropriate development of employees’ Industrial IoT competences. The goal of this contribution is to point out a didactic concept that enables the development and training of these newly demanded competences by using an IoT laboratory. For this purpose, a design science approach is applied. The result of this contribution is a didactic concept for the development of Industrial IoT competences in an IoT laboratory.
Industry 4.0 infrastructures are highly evolutionary environments with volatile, time-dependent workloads for analytical tasks. The optimal dimensioning of IT hardware is therefore a challenge for decision makers, because the digital processing of these tasks can be decoupled from their physical place of origin. Flexible architecture models that allocate tasks efficiently with regard to multi-faceted aspects across a predefined set of local systems and external cloud services have been proven in small example scenarios. This paper provides a benchmark of existing task-realization strategies, composed of (1) task distribution and (2) task prioritization, in a real-world scenario simulation. It identifies heuristics as the superior strategies.
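The abstract does not spell out the benchmarked strategies. As a hypothetical illustration of the kind of heuristic it refers to, the sketch below takes tasks in priority order and greedily assigns each to whichever resource (e.g. a local system or a cloud service) would finish it earliest; all names and the simple speed model are assumptions for illustration, not the paper's actual method.

```python
def greedy_assign(tasks, machines):
    """Greedy task distribution: each task goes to the machine that
    would finish it earliest.

    tasks: list of (task_id, workload) tuples, already in priority order
    machines: dict machine_id -> processing speed (workload units/time)
    Returns (plan, makespan): plan maps machine_id -> assigned task_ids.
    """
    finish = {m: 0.0 for m in machines}   # machine -> time it becomes free
    plan = {m: [] for m in machines}
    for task_id, workload in tasks:
        # pick the machine with the earliest completion time for this task
        best = min(machines, key=lambda m: finish[m] + workload / machines[m])
        finish[best] += workload / machines[best]
        plan[best].append(task_id)
    return plan, max(finish.values())
```

With a local system of speed 1.0 and a cloud service of speed 2.0, the heavy task and one light task end up in the cloud while the other light task runs locally, balancing the two completion times.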
Future ERP Systems
(2021)
This paper presents a research agenda on the current generation of ERP systems, developed on the basis of a literature review of current problems of ERP systems. The problems are presented following the ERP life cycle. In a next step, the identified problems are mapped onto a reference architecture model of ERP systems, an extension of the three-tier architecture model that is widely used in practice. The research agenda is structured according to this reference architecture model and addresses the identified problems on the data, infrastructure, adaptation, process, and user-interface layers.
This paper presents an exploratory study investigating the influence of the factors (1) intermediary participation, (2) decision-making authority, (3) position in the enterprise, and (4) experience in open innovation on the perception and assessment of the benefits and risks expected from participating in open innovation projects. For this purpose, an online survey was conducted in Germany, Austria, and Switzerland. The result is empirical evidence showing whether and how these factors affect the perception of the potential benefits and risks expected in the context of open innovation project participation. Furthermore, the identified effects are discussed against existing theory. Existing theory regarding the benefits and risks of open innovation is expanded by (1) the finding that they are perceived mostly independently of the factors, (2) confirmation of the practical relevance of the benefits and risks, and (3) a finer distinction between their degrees of relevance according to the respective contextual specifics.
Cyber-physical systems (CPS) have shaped the discussion about Industry 4.0 (I4.0) for some time. To ensure the competitiveness of manufacturing enterprises, visions of the future identify cyber-physical production systems (CPPS) as a core component of the modern factory. Adaptability and coping with complexity are, among others, potentials of this new generation of production management. The successful transformation of this theoretical construct into practical implementation can only take place with regard to the conditions characterizing the context of a factory. The subject of this contribution is a concept that takes up this brownfield character and describes a solution for extending existing (legacy) systems with CPS capabilities.
Coring on Digital Platforms
(2017)
Today’s mobile devices are part of powerful business ecosystems, which usually involve digital platforms. To better understand the complex phenomenon of coring and its related dynamics, this paper presents a case study comparing iMessage, as part of Apple’s iOS, with WhatsApp. Specifically, it investigates activities regarding platform coring, i.e., the integration of functionalities provided by third-party applications into the platform core. The paper makes three contributions. First, a systematization of coring activities is developed; coring modes are differentiated by the amount of coring and of application maintenance. Second, the case study reveals that the phenomenon of platform coring is present on digital platforms for mobile devices. Third, the fundamentals of coring are discussed as a first step towards theoretical development. Even though coring constitutes a potential threat to third-party developers’ functional differentiation, the paper also develops an idea of what a beneficial partnership incorporating coring activities could look like.
In times of digitalization, the collection and modeling of business processes is still a challenge for companies. The demand for trustworthy process models that reflect the actual execution steps is therefore increasing. These as-is processes significantly determine both business process analysis and the conception of future target processes, and they are the starting point for any kind of change initiative. Existing approaches to modeling as-is processes, like process mining, are exclusively focused on reconstruction and therefore rely on transaction logs and limited data from a single application system. By contrast, heterogeneous application landscapes and business processes executed across multiple application systems are among the main challenges in process mining research. Using RFID technology is hence one approach to closing the existing gap between different application systems. This paper focuses on methods for collecting data from real-world objects via RFID technology and on possible combinations with application data (process mining) in order to realize a cross-system mining approach.
Time to change
(2020)
Industry 4.0 leads to a radical change that is progressing incrementally. The new information and communication technologies offer many conceivable applications in the context of sustainable corporate management. The combination of new digital technologies with the ecological and social goals of companies offers a multitude of unimagined potentials and challenges. Although companies already see the need for action, there has been, and still is, a lack of concrete measures that leverage the potential of Industry 4.0 for sustainability management. In the course of this position paper, we develop six theses (two for each sustainability perspective) against the background of the current situation in research, practice, and policy.
This meta-analysis synthesizes 332 effect sizes of various methods to enhance creativity. We clustered all studies into 12 methods to identify the most effective creativity enhancement methods. We found that, on average, creativity can be enhanced, Hedges’ g = 0.53, 95% CI [0.44, 0.61], with 70.09% of the participants in the enhancement conditions being more creative than the average person in the control conditions. Complex training courses, meditation, and cultural exposure were the most effective (gs = 0.66), while the use of cognitive manipulation drugs was the least effective and, in fact, not effective at all, g = 0.10. The type of training material was also important: for instance, figural methods were more effective in enhancing creativity, and enhancing convergent thinking was more effective than enhancing divergent thinking. Study effect sizes varied considerably across all studies and for many subgroup analyses, suggesting that researchers can plausibly expect to find reversed effects occasionally. We found no evidence of publication bias. We discuss theoretical implications and suggest future directions for best practices in enhancing creativity.
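The two headline numbers above are related: under normality assumptions, the share of treatment-group participants scoring above the control-group mean (Cohen's U3) follows directly from the effect size via the standard normal CDF. The sketch below shows the standard textbook formulas; it is an illustration only, not the meta-analysis code, and the sample values in the usage note are invented.

```python
from statistics import NormalDist


def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # pooled standard deviation of the two groups
    sp = (((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)) ** 0.5
    d = (m1 - m2) / sp                   # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)      # Hedges' correction factor
    return d * j


def u3(g):
    """Cohen's U3: share of the treatment group above the control mean,
    assuming both groups are normally distributed."""
    return NormalDist().cdf(g)
```

For example, `u3(0.53)` is roughly 0.70, matching the reported figure that about 70% of enhancement-condition participants exceed the average control participant.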
The development of new and better optimization and approximation methods for Job Shop Scheduling Problems (JSP) relies on simulations to compare their performance. The test data required for this has an uncertain influence on the simulation results, because the feasible search space can change drastically with small variations of the initial problem model, and methods may benefit from this to varying degrees. This speaks in favor of defining standardized and reusable test data for JSP problem classes, which in turn requires that the test data be systematically describable so that problem-adequate data sets can be compiled. This article reviews the test data used for comparing methods in the literature. It also shows how and why differences in test data have to be taken into account. From this, corresponding challenges are derived that the management of test data must face in the context of JSP research.
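To make the notion of JSP test data concrete: a common plain representation of an instance is a list of jobs, each an ordered list of (machine, duration) operations, and a candidate schedule can be evaluated by simulating it. The minimal sketch below is a hypothetical illustration of this representation, not material from the article.

```python
def makespan(jobs, sequence):
    """Evaluate a JSP schedule and return its makespan.

    jobs: list of jobs, each an ordered list of (machine, duration)
          operations that must run in the given order.
    sequence: job indices in the order their next operation is scheduled,
              e.g. [0, 1, 0, 1] for two jobs with two operations each.
    """
    machine_free = {}                  # machine -> time it becomes free
    job_ready = [0] * len(jobs)        # job -> time its next op may start
    next_op = [0] * len(jobs)          # job -> index of its next operation
    for j in sequence:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_free.get(machine, 0))
        job_ready[j] = start + duration
        machine_free[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)
```

Even in this tiny format, swapping a single duration can change which schedules are optimal, which illustrates why small variations of the instance data can drastically reshape the feasible search space.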
Collaborative Engineering is a promising concept for increasing the competitiveness of companies. The goal of this paper is to describe the industrial application of this approach, taking shipbuilding as an example. Besides the engineering partners' need to collaborate during the product development phase, many other stakeholders are interested in the product ship along its whole life cycle. Therefore, the concept of Collaborative Engineering is extended by introducing the idea of communities, and requirements on communities in engineering are deduced. Based on this, an architectural framework for Collaborative Engineering Communities is described. In conclusion, research topics that have to be addressed for practical realization are outlined.
Skill management catalogues built via KMDL: integrating knowledge and business process modelling
(2004)
The efficient use of human capital is one of the most important factors in today's business competition, which is strongly influenced by qualified staff. To aid the human resources department in keeping up with strategic decisions, various skill management systems have been created that make the development of human resources easier and more precise. Skill management systems are only as good as the information they are based on. Their most commonly used basis is the skill catalogue, which shows the skill gaps of each employee or division within the company. However, there are hardly any applicable methods yet for creating such a catalogue thoroughly. This paper introduces a reasonable approach to creating such a catalogue with KMDL, the description language for knowledge-intensive processes. The skill catalogue built for skill management systems is one of the most important but still most neglected factors when introducing skill management.
Knowledge processes and business processes are linked together and should therefore be regarded together. Business processes can be modeled and analyzed extensively with well-known and established methods, but simple static representations of knowledge do not fulfill the requirements of a comprehensive and integrated approach to process-oriented knowledge management. The Knowledge Modelling and Description Language (KMDL) is able to represent the creation, use, and necessity of knowledge along common business processes. KMDL can thus be used to formalize knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. For computer-aided modeling and analysis, the tool K-Modeler is introduced.