Future ERP Systems
(2021)
This paper presents a research agenda for the current generation of ERP systems, developed from a literature review of current problems of ERP systems. The problems are presented following the ERP life cycle. In the next step, the identified problems are mapped onto a reference architecture model of ERP systems, an extension of the three-tier architecture model widely used in practice. The research agenda is structured according to the reference architecture model and addresses the problems identified in the data, infrastructure, adaptation, processes, and user interface layers.
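The layered mapping the abstract describes can be sketched as a simple data structure. The five layer names are taken from the abstract; the attached problem entry is a purely hypothetical placeholder, not a finding from the paper:

```python
# Minimal sketch of the reference architecture described in the abstract:
# five layers extending the classic three-tier model. Layer names come from
# the text; the example problem assigned below is a hypothetical placeholder.
from collections import OrderedDict

reference_model = OrderedDict([
    ("user interface", []),
    ("processes", []),
    ("adaptation", []),
    ("infrastructure", []),
    ("data", []),
])

def map_problem(layer: str, problem: str) -> None:
    """Assign an identified problem to one layer of the reference model."""
    reference_model[layer].append(problem)

# Hypothetical example of a problem surfaced by an ERP life-cycle review
map_problem("adaptation", "costly release upgrades of customized systems")

print(list(reference_model))
```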
Today’s mobile devices are part of powerful business ecosystems, which usually involve digital platforms. To better understand the complex phenomenon of coring and its related dynamics, this paper presents a case study comparing iMessage, as part of Apple’s iOS, and WhatsApp. Specifically, it investigates platform coring, i.e. the integration of functionality provided by third-party applications into the platform core. The paper makes three contributions. First, a systematization of coring activities is developed; coring modes are differentiated by the degree of coring and of application maintenance. Second, the case study reveals that platform coring is present on digital platforms for mobile devices. Third, the fundamentals of coring are discussed as a first step towards theory development. Even though coring constitutes a potential threat to third-party developers’ functional differentiation, the paper develops an idea of what a beneficial partnership incorporating coring activities could look like.
Industry 4.0, i.e. the connection of cyber-physical systems via the Internet in production and logistics, leads to considerable changes in the socio-technical system of the factory. The effects range from a considerable need for further training, exacerbated by the current shortage of skilled workers, to an opening of the previously inaccessible boundaries of the factory to third-party access, an increasing merging of office IT and manufacturing IT, and a new understanding of what machines can do with their data. This results in new requirements for the modeling, analysis and design of information-processing and performance-mapping business processes.
In the past, procedures were developed under the name of “process-oriented knowledge management” with which the exchange and use of knowledge in business processes could be represented, analyzed and improved. However, these approaches were limited to the office environment. So far, no method exists that documents, analyzes and jointly optimizes the new possibilities of knowledge processing through artificial intelligence and machine learning in production and logistics in the same way, and in a manner compatible with the office-environment approach. The extension of the modeling language KMDL described in this paper contributes to closing this research gap.
This paper describes first approaches to an analysis and design method for knowledge management that integrates man and machine in the age of Industry 4.0.
ERP Systems
(2021)
Robotic Process Automation (RPA) refers to the software-supported operation of software solutions via their user interface. The primary goal of RPA is the automated execution of routine tasks that previously required human intervention. However, RPA’s potential to improve processes in the long term is severely limited. Automating processes and bridging media discontinuities at the front-end level creates a multitude of dependencies and conditions, which are summarized in this article. The path to a sustainable enterprise architecture (consisting of processes and systems) requires open, adaptive systems with a modern architecture that offer a high degree of interoperability at various levels.
In the copyright industries of the 21st century, metadata is the grease required to make the engine of copyright run smoothly and powerfully for the benefit of creators, copyright industries and users alike. However, metadata is difficult to acquire and even more difficult to keep up to date as the rights in content are mostly multi-layered, fragmented, international and volatile. This article explores the idea of a neutral metadata search and enhancement tool that could constitute a buffer to safeguard the interests of the various proprietary database owners and avoid the shortcomings of centralised databases.
Faced with the triad of time, cost and quality, realizing knowledge-intensive tasks under economic conditions is not trivial. Since the number of knowledge-intensive processes keeps increasing, the efficient design of knowledge transfers in business processes, as well as their target-oriented improvement, is essential so that process outcomes satisfy high quality criteria and economic requirements. This particularly challenges knowledge management, which aims to assign ideal manifestations of influence factors on knowledge transfers to a given task. Building on first attempts at knowledge transfer-based process improvement [1], this paper continues research on the quantitative examination of knowledge transfers and presents a ready-to-go experiment design that can examine the quality of knowledge transfers empirically and on a quantitative level. Its use is demonstrated with four influence factors: stickiness, complexity, competence and time pressure.
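An experiment over the four influence factors named above can be organized as a full factorial design. The following sketch assumes two levels per factor, which is a simplification for illustration and not stated in the abstract:

```python
# Sketch of a full factorial design over the four influence factors from the
# abstract. The two levels ("low"/"high") per factor are an assumption made
# here for illustration; the actual experiment design may differ.
from itertools import product

factors = {
    "stickiness":    ["low", "high"],
    "complexity":    ["low", "high"],
    "competence":    ["low", "high"],
    "time_pressure": ["low", "high"],
}

# Every combination of factor levels yields one experimental condition.
conditions = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(conditions))  # 2^4 = 16 conditions
```

Each condition dict then names one treatment to which a knowledge-transfer task could be assigned.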
As the complexity of learning task requirements, computer infrastructures and knowledge acquisition for artificial neuronal networks (ANN) is increasing, it is challenging to talk about ANN without creating misunderstandings. An efficient, transparent and failure-free design of learning tasks by models is not supported by any tool so far. For this purpose, in particular the consideration of data, information and knowledge on the basis of an integration with knowledge-intensive business process models and process-oriented knowledge management is attractive. With the aim of making the design of learning tasks expressible by models, this paper proposes a graphical modeling language called Neuronal Training Modeling Language (NTML), which allows the repetitive use of learning designs. An example ANN project of AI-based dynamic GUI adaptation demonstrates its use.
Already successfully used products or designs, past projects, or one’s own experiences can form the basis for the development of new products. As reference products or existing knowledge, they are reused in the development process and across product generations. Since products are, furthermore, developed in cooperation, the development of new product generations is characterized by knowledge-intensive processes in which information and knowledge are exchanged between different kinds of knowledge carriers. Knowledge transfer here describes the identification of knowledge, its transmission from the knowledge carrier to the knowledge receiver, and its application by the knowledge receiver, which includes embodied knowledge of physical products. Initial empirical findings on the quantitative effects regarding the speed of knowledge transfers have already been obtained. However, the factors influencing the quality of knowledge transfer, which would increase the efficiency and effectiveness of knowledge transfer in product development, have not yet been examined empirically. Therefore, this paper prepares an experimental setting for the empirical investigation of the quality of knowledge transfers.