The term Linked Data refers to connected information sources comprising structured data about a wide range of topics and for a multitude of applications. In recent years, the conceptual and technical foundations of Linked Data have been formalized and refined. To this end, well-known technologies have been established, such as the Resource Description Framework (RDF) as a Linked Data model or the SPARQL Protocol and RDF Query Language (SPARQL) for retrieving this information. Whereas most research has been conducted in the area of generating and publishing Linked Data, this thesis presents novel approaches for improved Linked Data management. In particular, we illustrate new methods for analyzing and processing SPARQL queries. Here, we present two algorithms suitable for identifying structural relationships between these queries. Both algorithms are applied to a large number of real-world requests to evaluate the performance of the approaches and the quality of their results. Based on this, we introduce different strategies enabling optimized access to Linked Data sources. We demonstrate how the presented approach facilitates effective utilization of SPARQL endpoints by prefetching results relevant for multiple subsequent requests. Furthermore, we contribute a set of metrics for determining technical characteristics of such knowledge bases. To this end, we devise practical heuristics and validate them through thorough analysis of real-world data sources. We discuss the findings and evaluate their impact on utilizing the endpoints. Moreover, we detail the adoption of a scalable infrastructure for improving Linked Data discovery and consumption. As we outline in an exemplary use case, this platform is well suited both for processing and for provisioning the corresponding information.
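One way to make the idea of "structural relationships between SPARQL queries" concrete is to compare the triple patterns two queries share once their variable names are abstracted away. The following is a minimal illustrative sketch, not the thesis's actual algorithms: the function names, the variable-normalization scheme, and the use of Jaccard similarity are all assumptions introduced here for illustration.

```python
import re

def triple_patterns(query: str) -> set:
    """Extract triple patterns from a SPARQL WHERE clause, replacing
    every variable with the placeholder "?v" so that structurally
    identical queries compare as equal (illustrative, not a full
    SPARQL parser)."""
    body = re.search(r"\{(.*)\}", query, re.S).group(1)
    patterns = set()
    for triple in body.split("."):
        parts = triple.split()
        if len(parts) != 3:
            continue  # skip fragments that are not simple triples
        normalized = tuple("?v" if p.startswith("?") else p for p in parts)
        patterns.add(normalized)
    return patterns

def structural_similarity(q1: str, q2: str) -> float:
    """Jaccard similarity over the normalized triple-pattern sets."""
    a, b = triple_patterns(q1), triple_patterns(q2)
    if not (a or b):
        return 1.0
    return len(a & b) / len(a | b)

# Two hypothetical queries sharing one pattern (?x foaf:name ?y)
q1 = "SELECT ?name WHERE { ?p foaf:name ?name . ?p foaf:age ?age }"
q2 = "SELECT ?n WHERE { ?x foaf:name ?n }"
print(structural_similarity(q1, q2))  # 0.5
```

A similarity score like this could, for instance, cluster a query log so that results for frequent structural patterns are prefetched before the next related request arrives.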
The name Mandela first became inscribed in the annals of African liberation as nothing particularly unusual at the time. The late fifties was an era of trials and detentions in the colonies. The Treason Trial, which took place from 1956 to 1961, was closely followed by those of my generation, largely through Drum Magazine.
Despite its many challenges and limitations, the concept of in situ upgrading of informal settlements has become one of the most favoured approaches to the housing crisis in the ‘Global South’. Due to its inherent principles of incremental in situ development, prevention of relocations, protection of local livelihoods, and democratic participation and cooperation, this approach is often perceived to be more sustainable than other housing approaches that rely on quantitative housing delivery and top-down planning methodologies. While this study does not question the benefits of the in situ upgrading approach, it seeks to identify problems of its practical implementation within a specific national and local context. The study discusses the origin and importance of this approach on the basis of a review of international housing policy development and analyses the broader political and social context of the incorporation of this approach into South African housing policy. It further uses insights from a recent case study in Cape Town to determine complications and conflicts that can arise when applying in situ upgrading of informal settlements in a complex local context. On that basis, the benefits and limitations of the in situ upgrading approach are specified and prerequisites for its successful implementation are formulated.
The Beruriah Incident
(2014)
The story known as the Beruriah Incident, which appears in Rashi’s commentary on bAvodah Zarah 18b (related to ATU types 920A* and 823A*), describes the failure and tragic end of R. Meir and his wife Beruriah, two tannaitic role models. This article examines the authenticity of the story by tracking its mode of distribution in traditional Jewish society before the modern era and by comparing the story’s components with rabbinic literature and international folklore.
Organizations strive to gain competitive advantages and to increase customer satisfaction. To ensure the quality and efficiency of their business processes, they perform business process management. An important part of process management that happens on the daily operational level is process controlling. A prerequisite of controlling is process monitoring, i.e., keeping track of the performed activities in running process instances. Only by process monitoring can business analysts detect delays and react to deviations from the expected or guaranteed performance of a process instance. To enable monitoring, process events need to be collected from the process environment. When a business process is orchestrated by a process execution engine, monitoring is available for all orchestrated process activities. Many business processes, however, do not lend themselves to automatic orchestration, e.g., because of required freedom of action. This situation is often encountered in hospitals, where most business processes are manually enacted. Hence, in practice it is often inefficient or infeasible to document and monitor every process activity. Additionally, manual process execution and documentation is prone to errors, e.g., documentation of activities can be forgotten. Thus, organizations face the challenge of process events that occur but are not observed by the monitoring environment. These unobserved process events can serve as a basis for operational process decisions, even without exact knowledge of when they happened or when they will happen. An exemplary decision is whether to invest more resources to manage timely completion of a case, anticipating that the process end event will occur too late. This thesis offers means to reason about unobserved process events in a probabilistic way, addressing decisive questions of process managers, e.g., "When will the case be finished?" or "When did we perform the activity that we forgot to document?".
As the main contribution, we introduce an advanced probabilistic model to business process management that is based on a stochastic variant of Petri nets. We present a holistic approach to using the model effectively along the business process lifecycle. To this end, we provide techniques to discover such models from historical observations, to predict the termination time of processes, and to ensure quality through missing-data management. We propose mechanisms to optimize the configuration for monitoring and prediction, i.e., to offer guidance in selecting important activities to monitor. An implementation is provided as a proof of concept. For evaluation, we compare the accuracy of the approach with that of state-of-the-art approaches using real process data of a hospital. Additionally, we show its more general applicability in other domains by applying the approach to process data from logistics and finance.
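The core prediction idea can be illustrated with a toy example. The sketch below is a deliberately simplified stand-in for the thesis's stochastic Petri net model: it assumes a purely sequential process whose transitions have exponentially distributed durations, so the expected remaining time is simply the sum of the mean durations of the transitions that have not yet fired. The process steps, rates, and function names are hypothetical.

```python
# Minimal sketch, assuming a sequential process with exponentially
# distributed transition durations; not the thesis's algorithm, which
# handles general stochastic Petri nets and missing data.
from dataclasses import dataclass

@dataclass
class Transition:
    name: str
    rate: float  # exponential firing rate, i.e., 1 / mean duration

    @property
    def mean_duration(self) -> float:
        return 1.0 / self.rate

def expected_remaining_time(path, fired):
    """Expected time until termination, given the set of transition
    names already observed (fired) in the running instance."""
    return sum(t.mean_duration for t in path if t.name not in fired)

# Hypothetical hospital admission process (rates per hour)
process = [
    Transition("register", rate=2.0),    # mean 0.5 h
    Transition("examine", rate=0.5),     # mean 2.0 h
    Transition("treat", rate=0.25),      # mean 4.0 h
    Transition("discharge", rate=1.0),   # mean 1.0 h
]

# After registration and examination, 5.0 hours are expected to remain.
print(expected_remaining_time(process, fired={"register", "examine"}))
```

In the spirit of the abstract, a rate such as `0.25` would be learned from historical event logs (the inverse of the observed mean duration), and a prediction like "5.0 hours remaining" is exactly the kind of estimate that lets an analyst decide whether to invest more resources to finish a case on time.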