Adaptability of information systems has become a substantial competitive factor. Today's insufficient methodical support for realizing adaptability frequently leaves the potential of deployed information technology in enterprises unused. This contribution presents a procedure that addresses the need to determine the adaptability an enterprise requires in relation to its surrounding environment.
The concept of adaptability has been widely recognised as a research field in recent years. Business information systems play a key part in business performance, and the adaptability of information systems is therefore a primary goal of vendors and end-users. However, concepts that help to determine the adaptability of information systems have so far been missing. Based on research results of the project CHANGE1, this contribution presents an integrated process model addressing the problem and a possible solution.
Future ERP Systems
(2021)
This paper presents a research agenda on the current generation of ERP systems which was developed based on a literature review on current problems of ERP systems. The problems are presented following the ERP life cycle. In the next step, the identified problems are mapped on a reference architecture model of ERP systems that is an extension of the three-tier architecture model that is widely used in practice. The research agenda is structured according to the reference architecture model and addresses the problems identified regarding data, infrastructure, adaptation, processes, and user interface layer.
The digital transformation imposes new requirements on all classes of enterprise systems in companies. ERP systems in particular, which represent the dominant class of enterprise systems, are struggling to meet the new requirements at all levels of the architecture. Therefore, there is an urgent need to reconsider the overall architecture of these systems and address the root of the related issues. Given that many of the restrictions ERP systems pose on their adaptability are related to the standardization of data, the database layer of ERP systems is addressed. Since databases serve as the foundation for data storage and retrieval, they limit the flexibility of enterprise systems and their chance to adapt to new requirements. So far, relational databases are widely used. Using a systematic literature approach, recent requirements for ERP systems were identified, and prominent database approaches were assessed against the 23 requirements identified. The results reveal the strengths and weaknesses of recent database approaches and highlight the demand to combine multiple database approaches to fulfill recent business requirements. From a conceptual point of view, this paper supports the idea of federated databases which are interoperable to fulfill future requirements and support business operations. This research forms the basis for a renewal of the current generation of ERP systems and proposes that ERP vendors use different database concepts in the future.
Coring on Digital Platforms
(2017)
Today’s mobile devices are part of powerful business ecosystems, which usually involve digital platforms. To better understand the complex phenomenon of coring and related dynamics, this paper presents a case study comparing iMessage as part of Apple’s iOS and WhatsApp. Specifically, it investigates activities regarding platform coring, as the integration of several functionalities provided by third-party applications in the platform core. The paper makes three contributions. First, a systematization of coring activities is developed. Coring modes are differentiated by the amount of coring and application maintenance. Second, the case study revealed that the phenomenon of platform coring is present on digital platforms for mobile devices. Third, the fundamentals of coring are discussed as a first step towards theoretical development. Even though coring constitutes a potential threat for third-party developers regarding their functional differentiation, an idea of what a beneficial partnership incorporating coring activities could look like is developed here.
Enterprise systems have long played an important role in businesses of various sizes. With the increasing complexity of today’s business relationships, specialized application systems are being used more and more. Moreover, emerging technologies such as artificial intelligence are becoming accessible for enterprise systems. This raises the question of the future role of enterprise systems. This minitrack covers novel ideas that contribute to and shape the future role of enterprise systems with five contributions.
While Information Systems (IS) Research on the individual and workgroup level of analysis is omnipresent, research on the enterprise-level IS is less frequent. Even though research on Enterprise Systems and their management is established in academic associations and conference programs, enterprise-level phenomena are underrepresented. This minitrack provides a forum to integrate existing research streams that traditionally needed to be attached to other topics (such as IS management or IS governance). The minitrack received broad attention. The three selected papers address different facets of the future role of enterprise-wide IS including aspects such as carbonization, ecosystem integration, and technology-organization fit.
Modern production infrastructures of globally operating companies usually consist of multiple distributed production sites. While the organization of individual sites consisting of Industry 4.0 components itself is demanding, new questions regarding the organization and allocation of resources emerge considering the total production network. In an attempt to face the challenge of efficient distribution and processing both within and across sites, we aim to provide a hybrid simulation approach as a first step towards optimization. Using hybrid simulation allows us to include real and simulated concepts and thereby benchmark different approaches with reasonable effort. A simulation concept is conceptualized and demonstrated qualitatively using a global multi-site example.
Technological advancements are giving rise to the fourth industrial revolution - Industry 4.0 - characterized by the mass employment of smart objects in highly reconfigurable and thoroughly connected industrial product-service systems. The purpose of this paper is to propose a theory-based knowledge dynamics model in the smart grid scenario that would provide a holistic view on the knowledge-based interactions among smart objects, humans, and other actors as an underlying mechanism of value co-creation in Industry 4.0. A multi-loop and three-layer - physical, virtual, and interface - model of knowledge dynamics is developed by building on the concept of ba - an enabling space for interactions and the emergence of knowledge. The model depicts how big data analytics are just one component in unlocking the value of big data, whereas the tacit engagement of humans-in-the-loop - their sense-making and decision-making - is needed for insights to be evoked from analytics reports and customer needs to be met.
In times of digitalization, the collection and modeling of business processes is still a challenge for companies. The demand for trustworthy process models that reflect the actual execution steps therefore increases. The respective kinds of processes significantly determine both business process analysis and the conception of future target processes, and they are the starting point for any kind of change initiative. Existing approaches to model as-is processes, like process mining, are exclusively focused on reconstruction; therefore, transactional protocols and limited data from a single application system are used. Heterogeneous application landscapes and business processes that are executed across multiple application systems, on the contrary, are one of the main challenges in process mining research. Using RFID technology is hence one approach to close the existing gap between different application systems. This paper focuses on methods for data collection from real-world objects via RFID technology and possible combinations with application data (process mining) in order to realize a cross-system mining approach.
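The cross-system mining idea can be sketched minimally: RFID object events and application-system log entries that share a case identifier are merged into one ordered event log for process mining. The data model, field names, and sample events below are illustrative assumptions, not taken from the paper.

```python
# Sketch (hypothetical data model): correlate RFID object events with
# application-system log entries via a shared case identifier, producing
# one cross-system event log that process mining could consume.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Event:
    case_id: str        # e.g. an order or pallet number shared across systems
    activity: str
    timestamp: datetime
    source: str         # "RFID" or the name of the application system

def merge_logs(rfid_events, app_events):
    """Combine both sources into one log, ordered per case by timestamp."""
    return sorted(rfid_events + app_events,
                  key=lambda e: (e.case_id, e.timestamp))

rfid = [Event("order-7", "pallet scanned at gate", datetime(2024, 1, 1, 9, 0), "RFID")]
app = [Event("order-7", "invoice created", datetime(2024, 1, 1, 9, 30), "ERP")]
for e in merge_logs(rfid, app):
    print(e.case_id, e.activity, e.source)
```

The key design assumption is that a common case identifier exists in both sources; in practice, establishing that correlation is exactly the gap the paper addresses.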
With the latest technological developments and associated new possibilities in teaching, the personalisation of learning is gaining more and more importance. It assumes that individual learning experiences and results can generally be improved when personal learning preferences are considered. To do justice to the complexity of the personalisation possibilities of teaching and learning processes, we illustrate the components of learning and teaching in the digital environment and their interdependencies in an initial model. Furthermore, in a pre-study, we investigate the relationships between the learner's ability to (digitally) self-organise, the learner's prior knowledge, learning in different modes and learning outcomes as one part of this model. With this pre-study, we are taking the first step towards a holistic model of teaching and learning in digital environments.
Industry 4.0, i.e. the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, which is exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis and design of information processing and performance mapping business processes.
In the past, procedures were developed under the name of “process-oriented knowledge management” with which the exchange and use of knowledge in business processes could be represented, analyzed and improved. However, these approaches were limited to the office environment. A method that makes it possible to document, analyze and jointly optimize the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics, in the same way and in a manner compatible with the approach in the office environment, does not exist so far. The extension of the modeling language KMDL described in this paper will contribute to closing this research gap.
This paper describes first approaches for an analysis and design method for a knowledge management integrating man and machine in the age of Industry 4.0.
Process models are the basic ingredient for many attempts to improve business processes. The graphical depiction of otherwise unobservable behavior in an enterprise is one of the most important techniques in the digital society; such models help to enable decision making in the design of processes and workflows. Nevertheless, it is not easy to model business processes correctly. Some approaches try to detect errors by an automated analysis of the process model. This contribution focuses on the creation of the first model from scratch: which errors occur most frequently, and how can these be avoided?
Existing factories face multiple problems due to their hierarchical structure of decision making and control. Cyber-physical systems in principle allow the degree of autonomy to be raised to new heights. But which degree of autonomy is really useful and beneficial? This paper differentiates diverse definitions of autonomy and approaches to determine them. Some experimental findings in a lab environment help to answer the question raised in this paper.
Application of knowledge management methods for the improvement of education and training needs
(2006)
Skill Management
(2006)
This contribution presents an approach for requirement-oriented team building in industrial processes like product development. It is based on the Knowledge Modelling and Description Language (KMDL(R)), which enables the modelling and analysis of knowledge-intensive business processes. First, the basic elements of the modelling technique are described, presenting the concept and the description language. Furthermore, it is shown how KMDL(R) process models can be used as a basis for the team building component. For this purpose, an algorithm was developed that is able to propose a team composition for a specific task by analyzing the knowledge and skills of the employees and contrasting them with the process requirements. This can be used as guidance for team building decisions.
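The matching step of such a team-building algorithm can be sketched as follows. The coverage scoring, data shapes, and example names are assumptions for illustration; the KMDL(R)-based algorithm itself is not reproduced here.

```python
# Illustrative sketch (not the KMDL(R) algorithm): propose a team by ranking
# employees against the skill requirements extracted from a process model.

def team_proposal(requirements, employees, size):
    """requirements: {skill: min_level}; employees: {name: {skill: level}}.
    Rank employees by how many requirements they cover, then pick a team."""
    def coverage(skills):
        return sum(1 for s, lvl in requirements.items() if skills.get(s, 0) >= lvl)
    ranked = sorted(employees, key=lambda n: coverage(employees[n]), reverse=True)
    return ranked[:size]

reqs = {"CAD": 2, "simulation": 1}
staff = {"Ana": {"CAD": 3, "simulation": 2},
         "Ben": {"CAD": 1},
         "Cem": {"CAD": 2}}
print(team_proposal(reqs, staff, 2))  # Ana covers both requirements, Cem covers CAD
```

A real implementation would additionally weight tacit knowledge and availability, which a simple coverage count cannot express.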
Traditional production systems are enhanced by cyber-physical systems (CPS) and the Internet of Things. As a kind of next-generation system, these cyber-physical production systems (CPPS) are able to raise the level of autonomy of their production components. To find the optimal degree of autonomy in a given context, a research approach is formulated using a simulation concept. Based on requirements and assumptions, a cyber-physical market is modeled and qualitative hypotheses are formulated, which will be verified with the help of the CPPS of a hybrid simulation environment.
Collaborative Engineering is a promising concept to increase the competitiveness of companies. The target of this paper is to describe the industrial application of this approach, considering shipbuilding as an example. Besides the engineering partners' need to collaborate during the product development phase, there are many other stakeholders who are interested in the product ship along its whole life cycle. Therefore the concept of Collaborative Engineering is extended by introducing the idea of Communities, and requirements on Communities in Engineering are deduced. Based on this, an architectural framework for Collaborative Engineering Communities is described. Concluding, research topics which have to be discussed for practical realization are outlined.
Existing approaches in the area of knowledge-intensive processes focus on integrated knowledge and process management systems, the support of processes with KM systems, or the analysis of knowledge-intensive activities. For capturing knowledge-intensive business processes, well-known and established methods do not meet the requirements of a comprehensive and integrated approach to process-oriented knowledge management: they are not able to visualise adequately the decisions, actions and measures which cause the sequence of the processes. Parallel to conventional processes, knowledge-intensive processes exist; these are based on conversions of knowledge within them. To fill these gaps in modelling knowledge-intensive business processes, the Knowledge Modelling and Description Language (KMDL) was developed. The KMDL is able to represent the development, use, offer and demand of knowledge along business processes. Further, it is possible to show the knowledge conversions which take place in addition to the normal business processes. The KMDL can be used to formalise knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify process improvements in these processes. The KMDL modelling tool K-Modeler is introduced for computer-aided modelling and analysis. The technical framework and the most important functionalities to support the analysis of the captured processes are introduced in the following contribution.
This paper shows the KMDL Knowledge Management Approach which is based on the SECI and ba model by Nonaka and Takeuchi and the KMDL Knowledge modeling language. The approach illustrates the creation of knowledge with the focus on the knowledge conversions by Nonaka and Takeuchi. Furthermore, it emphasizes the quality of knowledge being embodied in persons and creates a personalization and socialization strategy which integrates business process modeling, skill management and the selection of knowledge management systems. The paper describes the theoretical foundations of the approach and practical effects which have been seen in the use of this approach.
In the copyright industries of the 21st century, metadata is the grease required to make the engine of copyright run smoothly and powerfully for the benefit of creators, copyright industries and users alike. However, metadata is difficult to acquire and even more difficult to keep up to date as the rights in content are mostly multi-layered, fragmented, international and volatile. This article explores the idea of a neutral metadata search and enhancement tool that could constitute a buffer to safeguard the interests of the various proprietary database owners and avoid the shortcomings of centralised databases.
The authors propose that while tacit knowledge is a valuable resource for developing new business models, its externalization presents several challenges. One major challenge is that individuals often do not recognize their tacit knowledge resources; another is the reluctance to share one's knowledge with others. Addressing these challenges, the authors present an application-oriented, serious-game-based haptic modeling approach for externalizing tacit knowledge, which can be used to develop first versions of business models based on tacit knowledge. Both conceptual and practical design fundamentals are presented based on elaborated theoretical approaches, which were developed with the help of a design science approach. The research process is presented step by step, with a focus on the high accessibility of the presented research. Practitioners are presented with guidelines for implementing their serious game projects. Scientists benefit from starting points for their research topics of externalization, internalization, and socialization of tacit knowledge, development of business models, and serious games or gamification. The paper concludes with open research desiderata and questions from the presented research process.
Lately, first implementation approaches of Internet of Things (IoT) technologies have been penetrating industrial value-adding processes, and with them the competence requirements for employees are changing. Employees' organization, process, and interaction competences are of crucial importance in this new IoT environment; however, they are not yet sufficiently considered in academic studies and vocational training. On the other hand, conventional learning factories are evolving and transforming into digital learning factories. Nevertheless, the integration of IoT technology and its usage for training in digital learning factories has been largely neglected thus far. Existing learning factories do not explicitly and properly consider IoT technology, which leads to deficiencies regarding an appropriate development of employees' Industrial IoT competences. The goal of this contribution is to point out a didactic concept that enables the development and training of these newly demanded competences by using an IoT laboratory. For this purpose, a design science approach is applied. The result of this contribution is a didactic concept for the development of Industrial IoT competences in an IoT laboratory.
Skill management catalogues built via KMDL : integrating knowledge and business process modelling
(2004)
The efficient use of human capital is one of the most important factors in today's business competition, and competition is strongly influenced by qualified staff. In order to aid the human resources department in keeping up with strategic decisions, various skill management systems have been created that make the development of human resources easier and more precise. Skill management systems are only as good as the information that they are based on. The most frequently used basic information is the skill catalogue, which shows the gaps of each employee or division within the company. But there are nearly no applicable methods yet to create such a catalogue thoroughly. This paper introduces a reasonable approach to create such a catalogue with KMDL, the description language for knowledge-intensive processes. The skill catalogue built for skill management systems is one of the most important but still most neglected factors when introducing skill management.
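The gap analysis a skill catalogue enables can be sketched as follows. The data shapes, skill names, and target levels are assumptions for illustration; the KMDL-based catalogue construction itself is not shown.

```python
# Small sketch (data model assumed): a skill catalogue exposing the gap
# between required and actual skill levels for a single employee.

def skill_gaps(catalogue, employee_skills):
    """catalogue: {skill: target_level}; employee_skills: {skill: level}.
    Returns the skills where the employee falls short, with the missing delta."""
    return {skill: target - employee_skills.get(skill, 0)
            for skill, target in catalogue.items()
            if employee_skills.get(skill, 0) < target}

catalogue = {"process modelling": 3, "moderation": 2}
print(skill_gaps(catalogue, {"process modelling": 1}))
```

Aggregating these per-employee gaps over a division yields the divisional gap view the abstract mentions.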
Knowledge is more and more a key factor within companies. Nearly 40 percent of all employees are so-called "knowledge workers". The distribution and acquisition of knowledge within companies are supported by skill management systems. Although not all aspects and potentials of this instrument are utilized yet, skill management systems have spread widely within business organizations. This paper summarizes the requirements, scopes and problems of skill management systems within the company.
The efficient use of human capital is one of the most important factors in today's business competition, and competition is strongly influenced by qualified staff. In order to aid the human resources department in keeping up with strategic decisions, various competency management systems have been created that make the development of human resources easier and more precise. Competency management systems are only as good as the information that they are based on. The most frequently used basic information is the skill catalogue. But there are nearly no applicable methods yet to create such a catalogue thoroughly. This paper introduces a reasonable approach to create such a catalogue with KMDL, the description language for knowledge-intensive processes.
Knowledge processes and business processes are linked together and should be regarded together, too. Business processes can be modeled and analyzed extensively with well-known and established methods, but simple signs of static knowledge do not fulfill the requirements of a comprehensive and integrated approach to process-oriented knowledge management. The Knowledge Modeler Description Language KMDL is able to represent the creation, use and necessity of knowledge along common business processes. Thus KMDL can be used to formalize knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. For computer-aided modeling and analysis, the tool K-Modeler is introduced.
Business processes can be modelled and analysed extensively with well-known and established methods. Simple signs of static knowledge do not fulfil the requirements of a comprehensive and integrated approach to process-oriented knowledge management. The Knowledge Modelling Description Language KMDL is able to represent the creation, use and necessity of knowledge along common business processes. Therefore KMDL can be used to formalise knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. The tool K-Modeller is introduced for computer-aided modelling and analysis.
The Knowledge Modeler Description Language KMDL is able to represent the creation, use and necessity of knowledge along common business processes. Thus KMDL can be used to formalize knowledge-intensive processes with a focus on certain knowledge-specific characteristics and to identify weak points in these processes. For computer-aided modeling and analysis, the tool K-Modeler is introduced.
Process analysis usually focuses only on single and selected processes. It is either existent processes that are recorded and analysed or reference processes that are implemented. So far no evident effort has been put into generalising specific process aspects into patterns and comparing those patterns with regard to their efficiency and effectiveness. This article focuses on the combination of dynamic and holistic analytical elements in enterprise architectures. Our goal is to outline an approach to analyse the development of business processes in a cyclical manner and demonstrate this approach based on an existent modelling language. We want to show that organisational learning can derive from the systematic analysis of past and existent processes, from which patterns of successful problem solving can be deduced.
Public services are not the only actors able to ensure the effective and efficient use of e-democracy tools. This contribution points out how a party must be structured to function as a neutral service provider for citizens and to make the results of electronic decision-making processes generally binding. The party provides only the methodology and the technology of decision making; contents are defined exclusively by the citizens. These contents and voting results are then implemented bindingly in the parliament by the delegates of the party. E-democracy thus supplements representative democracy with scalable direct-democratic elements. Every 4 or 5 years, with the national elections, the citizens can determine how strongly each political decision is to be influenced direct-democratically by e-democracy tools. Such an approach is subject to other requirements than a governmentally offered service.
Manufacturing companies still have relatively few points of contact with the circular economy. Especially extending the lifetime of whole products or parts via remanufacturing is a promising approach to reduce waste. However, the necessary cost-efficient assessment of the condition of individual parts is challenging, and assessment procedures are technically complex (e.g., scanning and testing procedures). Furthermore, these assessment procedures are usually only available after the disassembly process has been completed. This is where conceptualization, data acquisition and simulation of remanufacturing processes can help. One major constraining aspect of remanufacturing is reducing logistic efforts, since these also have negative external effects on the environment. Thus regionalization is an additional but, in the end, consequential challenge for remanufacturing. This article aims to fill a gap by providing a regional remanufacturing approach, in particular the design of local remanufacturing chains. A further focus lies on modeling and simulating alternative courses of action, including a feasibility study and economic assessment.
Faced with the increasing needs of companies, the optimal dimensioning of IT hardware is becoming challenging for decision makers. In terms of analytical infrastructures, a highly evolutionary environment causes volatile, time-dependent workloads in its components, and intelligent, flexible task distribution between local systems and cloud services is attractive. With the aim of developing a flexible and efficient design for analytical infrastructures, this paper proposes a flexible architecture model which allocates tasks following a machine-specific decision heuristic. A simulation benchmarks this system against existing strategies and identifies the new decision maxim as superior in a first scenario-based simulation.
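A machine-specific allocation heuristic of this kind can be sketched as follows. The fit-by-free-capacity rule, data shapes, and example workloads are assumptions for illustration; the paper's concrete decision maxim is not reproduced here.

```python
# Hedged sketch of a task allocation heuristic: each task goes to the local
# machine with the most free capacity that can still fit it, otherwise it is
# offloaded to the cloud.

def allocate(tasks, machines, cloud="cloud"):
    """tasks: [(name, load)]; machines: {machine: free_capacity}."""
    plan = {}
    for name, load in tasks:
        fitting = [m for m, cap in machines.items() if cap >= load]
        if fitting:
            target = max(fitting, key=lambda m: machines[m])
            machines[target] -= load   # reserve capacity on the chosen machine
            plan[name] = target
        else:
            plan[name] = cloud          # overflow is offloaded to the cloud
    return plan

print(allocate([("etl", 4), ("train", 8), ("report", 2)], {"m1": 5, "m2": 6}))
```

Even this greedy rule illustrates the trade-off the paper simulates: local capacity is consumed first, and only workload peaks incur cloud costs.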
Since Industry 4.0 infrastructures are seen as highly evolutionary environments with volatile, time-dependent workloads for analytical tasks, the optimal dimensioning of IT hardware in particular is a challenge for decision makers, because the digital processing of these tasks can be decoupled from its physical place of origin. Flexible architecture models that allocate tasks efficiently with regard to multi-facet aspects and a predefined set of local systems and external cloud services have been proven in small example scenarios. This paper provides a benchmark of existing task realization strategies, composed of (1) task distribution and (2) task prioritization, in a real-world scenario simulation. It identifies heuristics as the superior strategies.
Faced with the triad of time, cost and quality, realizing knowledge-intensive tasks under economic conditions is not trivial. Since the number of knowledge-intensive processes keeps increasing, the efficient design of knowledge transfers in business processes, as well as their target-oriented improvement, is essential so that process outcomes satisfy high quality criteria and economic requirements. This particularly challenges knowledge management, which aims to assign ideal manifestations of influence factors on knowledge transfers to a certain task. Building on first attempts at knowledge transfer-based process improvements [1], this paper continues research on the quantitative examination of knowledge transfers and presents a ready-to-go experiment design that can examine the quality of knowledge transfers empirically and on a quantitative level. Its use is proven by the example of four influence factors, namely stickiness, complexity, competence and time pressure.
A growing number of business processes can be characterized as knowledge-intensive. The ability to speed up the transfer of knowledge between any kind of knowledge carriers in business processes with AR techniques can lead to a huge competitive advantage, for instance in manufacturing. This includes the transfer of person-bound knowledge as well as externalized knowledge of physical and virtual objects. The contribution builds on a time-dependent knowledge transfer model and conceptualizes an adaptable, AR-based application. With the intention to accelerate the speed of knowledge transfers between a manufacturer and an information system, empirical results of an experiment show the validity of this approach. For the first time, it will be possible to discover how to improve the transfer among knowledge carriers of an organization with knowledge-driven information systems (KDIS). Within an experiment setting, the paper shows how an adaptable KDIS improves the quantitative effects regarding the quality and amount of time needed for an example manufacturing process realization.
Collaboration during the modeling process is uncomfortable and characterized by various limitations. With the successful transfer of first process modeling languages to the augmented world, non-transparent processes can be visualized in a more comprehensive way. With the aim of raising the comfort, speed, accuracy and versatility of real-world process augmentations, a framework for the bidirectional interplay of the common process modeling world and the augmented world has been designed as a morphological box. Its demonstration proves that the drawn AR integrations work. The identified dimensions were derived from (1) a designed knowledge construction axiom, (2) a designed meta-model, (3) designed use cases and (4) designed directional interplay modes. Through a workshop-based survey, the so far best AR modeling configuration is identified, which can serve for benchmarks and implementations.