The Convention Relating to the Status of Refugees adopted on 28 July 1951 in Geneva provides the most comprehensive codification of the rights of refugees yet attempted. Consolidating previous international instruments relating to refugees, the 1951 Convention with its 1967 Protocol marks a cornerstone in the development of international refugee law. At present, there are 144 States Parties to one or both of these instruments, expressing a worldwide consensus on the definition of the term refugee and the fundamental rights to be granted to refugees. These facts demonstrate and underline the extraordinary significance of these instruments as the indispensable legal basis of international refugee law. This Commentary provides a systematic and comprehensive analysis of the 1951 Convention and the 1967 Protocol on an article-by-article basis, exposing the interrelationship between the different articles and discussing the latest developments in international refugee law. In addition, several thematic contributions analyse questions of international refugee law which are of general significance, such as regional developments and the relationship between refugee law and the law of the sea.
The “HPI Future SOC Lab” is a cooperation of the Hasso Plattner Institute (HPI) and industry partners. Its mission is to enable and promote exchange and interaction between the research community and the industry partners.
The HPI Future SOC Lab provides researchers with free-of-charge access to a complete infrastructure of state-of-the-art hardware and software. This infrastructure includes components that might be too expensive for an ordinary research environment, such as servers with up to 64 cores and 2 TB of main memory. The offerings address researchers particularly from, but not limited to, the areas of computer science and business information systems. Main areas of research include cloud computing, parallelization, and in-memory technologies.
This technical report presents results of research projects executed in 2019. Selected projects have presented their results on April 9th and November 12th 2019 at the Future SOC Lab Day events.
The power of opposition
(2022)
Proposing a novel way to look at the consolidation of democratic regimes, this book presents important theoretical and empirical contributions to the study of democratic consolidation, legislative organization, and public opinion.
Theoretically, Simone Wegmann brings legislatures into focus as the main body representing both winners and losers of democratic elections. Empirically, Wegmann shows that the degree of policy-making power of opposition players varies considerably between countries. Using survey data from the CSES, the ESS, and the LAPOP and systematically analyzing more than 50 legislatures across the world and the specific rights they grant to opposition players during the policy-making process, Wegmann demonstrates that neglecting the crucial role of the legislature in a democratic setting can only lead to an incomplete assessment of the importance of institutions for democratic consolidation.
The Power of Opposition will be of great interest to scholars of comparative politics, especially those working on questions related to legislative organization, democratic consolidation, and/or public opinion.
This book brings together a variety of innovative perspectives on the inclusion of gender in the governance of (counter-)terrorism and violent extremism.
Several global governance initiatives launched in recent years have explicitly sought to integrate concern for gender equality and gendered harms into efforts to counter terrorism and violent extremism (CT/CVE). As a result, commitments to gender-sensitivity and gender equality in international and regional CT/CVE initiatives, in national action plans and at the level of civil society programming, have become a common aspect of the multilevel governance of terrorism and violent extremism. In light of these developments, there is a need for more systematic analysis of how concerns about gender are being incorporated in the governance of (counter-)terrorism and violent extremism and how this has affected (gendered) practices and power relations in counterterrorism policy-making and implementation.
Ranging from the processes of global and regional integration of gender into the governance of terrorism, via the impact of the shift on government responses to the return of foreign fighters, to state and civil society-led CVE programming and academic discussions, the essays engage with the origins and dynamics behind recent shifts which bring gender to the forefront of the governance of terrorism. This book will be of great value to researchers and scholars interested in gender, governance and terrorism.
The chapters in this book were originally published in Critical Studies on Terrorism.
The business problem of inefficient processes, imprecise process analyses and simulations, and non-transparent artificial neuronal network models can be overcome by an easy-to-use modeling concept. With the aim of developing a flexible and efficient approach to modeling, simulating and optimizing processes, this paper proposes a flexible Concept of Neuronal Modeling (CoNM). The modeling concept, which is described by the designed modeling language and its mathematical formulation and is connected to a technical substantiation, is based on a collection of novel sub-artifacts. As these have been implemented as a computational model, the set of CoNM tools carries out novel kinds of Neuronal Process Modeling (NPM), Neuronal Process Simulations (NPS) and Neuronal Process Optimizations (NPO). The efficacy of the designed artifacts was demonstrated rigorously by means of six experiments and a simulator of real industrial production processes.
SNS Democracy Council 2023
(2023)
Transboundary problems such as climate change, military conflicts, trade barriers, and refugee flows require increased collaboration across borders. This is to a large extent possible using existing international organizations. In such a case, however, they need to be considerably strengthened – while current trends take us in the opposite direction, according to the researchers in the SNS Democracy Council 2023.
International law is constantly navigating the tension between preserving the status quo and adapting to new exigencies. But when and how do such adaptation processes give way to a more profound transformation, if not a crisis of international law? To address the question of how attacks on the international legal order are changing the value orientation of international law, this book brings together scholars of international law and international relations. By combining theoretical and methodological analyses with individual case studies, this book offers readers conceptualizations and tools to systematically examine value change and explore the drivers and mechanisms of these processes. These case studies scrutinize value change in the foundational norms of the post-1945 order and in norms representing the rise of the international legal order post-1990. They cover diverse issues: the prohibition of torture, the protection of women’s rights, the prohibition of the use of force, the non-proliferation of nuclear weapons, sustainability norms, and accountability for core international crimes. The challenges to each norm, the reactions by norm defenders, and the fate of each norm are also studied. Combined, the analyses show that while a few norms have remained surprisingly robust, several are changing, either in substance or in legal or social validity. The book concludes by integrating the conceptual and empirical insights from this interdisciplinary exchange to assess and explain the ambiguous nature of value change in international law beyond the extremes of mere progress or decline.
This book compares local self-government in Europe. It examines local institutional structures, autonomy, and capacities in six selected countries - France, Italy, Sweden, Hungary, Poland, and the United Kingdom - each of which represents a typical model of European local government. Within Europe, an overall trend towards more local government capacities and autonomy can be identified, but there are also some counter tendencies to this trend and major differences regarding local politico-administrative settings, functional responsibilities, and resources. The book demonstrates that a certain degree of local financial autonomy and fiscal discretion is necessary for effective service provision. Furthermore, a robust local organization, viable territorial structures, a professional public service, strong local leadership, and well-functioning tools of democratic participation are key aspects for local governments to effectively fulfill their tasks and ensure political accountability. The book will appeal to students and scholars of Public Administration and Public Management, as well as practitioners and policy-makers at different levels of government, in public enterprises, and in NGOs.
This technical report presents the results of student projects which were prepared during the lecture “Operating Systems II” offered by the “Operating Systems and Middleware” group at HPI in the summer term of 2020. The lecture covered advanced aspects of operating system implementation and architecture on topics such as Virtualization, File Systems and Input/Output Systems. In addition to attending the lecture, the participating students were encouraged to gather practical experience by completing a project on a closely related topic over the course of the semester. The results of 10 selected exceptional projects are covered in this report.
The students have completed hands-on projects on the topics of Operating System Design Concepts and Implementation, Hardware/Software Co-Design, Reverse Engineering, Quantum Computing, Static Source-Code Analysis, Operating Systems History, Application Binary Formats and more. It should be recognized that over the course of the semester all of these projects have achieved outstanding results which went far beyond the scope and the expectations of the lecture, and we would like to thank all participating students for their commitment and their effort in completing their respective projects, as well as their work on compiling this report.
Digital technology offers significant political, economic, and societal opportunities. At the same time, the notion of digital sovereignty has become a leitmotif in German discourse: the state’s capacity to assume its responsibilities and safeguard society’s – and individuals’ – ability to shape the digital transformation in a self-determined way. The education sector exemplifies the challenge faced by Germany, and indeed Europe, of harnessing the benefits of digital technology while navigating concerns around sovereignty. It encompasses education as a core public good, a rapidly growing field of business, and growing pools of highly sensitive personal data. The report describes pathways to mitigating the tension between digitalization and sovereignty at three different levels – state, economy, and individual – through the lens of concrete technical projects in the education sector: the HPI Schul-Cloud (state sovereignty), the MERLOT data spaces (economic sovereignty), and the openHPI platform (individual sovereignty).
Law of raw data
(2021)
Law of Raw Data gives an overview of the legal situation across major countries and how such data is contractually handled in practice in the respective countries. In recent years, digital technologies have transformed business and society, impacting all sectors of the economy and a wide variety of areas of life. Digitization is leading to rapidly growing volumes of data with great economic potential. Data, in its raw or unstructured form, has become an important and valuable economic asset, and protection of raw data has become a crucial subject for the intellectual property community. As legislators struggle to develop a settled legal regime in this complex area, this invaluable handbook will offer a careful and dedicated analysis of the legal instruments and remedies, both existing and potential, that provide such protection across a wide variety of national legal systems.
What’s in this book:
Produced under the auspices of the International Association for the Protection of Intellectual Property (AIPPI), more than forty active specialists of the association from twenty-three countries worldwide contribute national chapters on the relevant law in their respective jurisdictions. The contributions thoroughly explain how each country approaches such crucial matters as the following:
- whether there is any intellectual property right available to protect raw data;
- the nature of such intellectual property rights that exist in unstructured data;
- contracts on data and which legal boundaries stand in the way of contract drafting;
- liability for data products or services; and
- questions of international private law and cross-border portability.
Each country’s rules concerning specific forms of data – such as data embedded in household appliances and consumer goods, criminal offence data, data relating to human genetics, tax and bank secrecy, medical records, and clinical trial data – are described, drawing on legislation, regulation, and case law.
How this will help you:
A matchless legal resource on one of the most important raw materials of the twenty-first century, this book provides corporate counsel, practitioners and policymakers working in the field of intellectual property rights, and concerned academics with both a broad-based global overview on emerging legal strategies in the protection of unstructured data and the latest information on existing legislation and regulation in the area.
Decubitus is one of the most relevant diseases in nursing and the most expensive to treat. It is caused by sustained pressure on tissue, so it particularly affects bed-bound patients. This work lays a foundation for pressure mattress-based decubitus prophylaxis by implementing a solution to the single-frame 2D Human Pose Estimation problem.
For this, methods of Deep Learning are employed. Two approaches are examined, a coarse-to-fine Convolutional Neural Network for direct regression of joint coordinates and a U-Net for the derivation of probability distribution heatmaps.
We conclude that training our models on a combined dataset of the publicly available Bodies at Rest and SLP data yields the best results. Furthermore, various preprocessing techniques are investigated, and a hyperparameter optimization is performed to discover an improved model architecture.
Another finding indicates that the heatmap-based approach outperforms direct regression.
This model achieves a mean per-joint position error of 9.11 cm for the Bodies at Rest data and 7.43 cm for the SLP data.
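The error metric reported above can be made concrete with a short sketch. The `mpjpe` helper below is illustrative only and not code from the thesis; it computes the mean Euclidean distance between predicted and ground-truth joint positions:

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance
    between predicted and ground-truth joint coordinates."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    # distance per joint, then mean over all joints (and frames, if batched)
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

# Toy example: two 2D joints, each displaced by a 3-4-5 triangle -> error 5.0
pred = [[3.0, 4.0], [13.0, 14.0]]
gt   = [[0.0, 0.0], [10.0, 10.0]]
print(mpjpe(pred, gt))  # 5.0
```

The same formula applies whether the joints come from direct coordinate regression or from the argmax of predicted heatmaps.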
We find that it generalizes well on data from mattresses other than those seen during training but has difficulties detecting the arms correctly.
Additionally, we give a brief overview of the medical data annotation tool annoto, which we developed in the bachelor project, and conclude that the Scrum framework and agile practices enhanced our development workflow.
RailChain
(2023)
The RailChain project designed, implemented, and experimentally evaluated a juridical recorder that is based on a distributed consensus protocol. That juridical blockchain recorder has been realized as distributed ledger on board the advanced TrainLab (ICE-TD 605 017) of Deutsche Bahn.
For the project, a consortium consisting of DB Systel, Siemens, Siemens Mobility, the Hasso Plattner Institute for Digital Engineering, Technische Universität Braunschweig, TÜV Rheinland InterTraffic, and Spherity has been formed. These partners not only concentrated competencies in railway operation, computer science, regulation, and approval, but also combined experiences from industry, research from academia, and enthusiasm from startups.
Distributed ledger technologies (DLTs) define distributed databases and express a digital protocol for transactions between business partners without the need for a trusted intermediary. Implementing a blockchain with real-time requirements for the local network of a railway system (e.g., an interlocking or a train) makes it possible to log data in the distributed system verifiably and in real time. For this, railway-specific assumptions can be leveraged to make modifications to standard blockchain protocols.
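The tamper-evident logging idea can be illustrated with a minimal hash-chain sketch in Python. This is a toy illustration, not the RailChain protocol (which additionally relies on distributed consensus across nodes): each entry commits to the hash of its predecessor, so any later modification breaks verification.

```python
import hashlib
import json

def append_entry(chain, payload):
    """Append a log entry that commits to the hash of its predecessor."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    entry = dict(body)
    # Canonical JSON serialization so the hash is reproducible
    entry["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every hash and link; returns False on any tampering."""
    prev = "0" * 64
    for e in chain:
        expected = hashlib.sha256(
            json.dumps({"payload": e["payload"], "prev": prev},
                       sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, {"event": "door_closed", "t": 1})
append_entry(log, {"event": "brake_applied", "t": 2})
print(verify(log))            # True
log[0]["payload"]["t"] = 99   # tamper with an early entry
print(verify(log))            # False
```

A distributed ledger replicates such a chain across several nodes and uses a consensus protocol to agree on the next entry, removing the single trusted recorder.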
EULYNX and OCORA (Open CCS On-board Reference Architecture) are parts of a future European reference architecture for control command and signalling (CCS, Reference CCS Architecture – RCA). Both architectural concepts outline heterogeneous IT systems with components from multiple manufacturers. Such systems introduce novel challenges for the approved and safety-relevant CCS of railways, which so far have been considered neither for road-side nor for on-board systems. Logging implementations, such as the common juridical recorder on vehicles, can no longer be realized as a central component of a single manufacturer. All centralized approaches are in question.
The research project RailChain is funded by the mFUND program and gives practical evidence that distributed consensus protocols are a proper means to immutably (for legal purposes) store state information of many system components from multiple manufacturers. The results of RailChain have been published, prototypically implemented, and experimentally evaluated in large-scale field tests on the advanced TrainLab. At the same time, the project showed how RailChain can be integrated into the road-side and on-board architecture given by OCORA and EULYNX.
Logged data can now be analysed sooner, and its trustworthiness is increased. This enables, e.g., auditable predictive maintenance, because it is ensured that data is authentic and unmodified at any point in time.
The “HPI Future SOC Lab” is a cooperation of the Hasso Plattner Institute (HPI) and industry partners. Its mission is to enable and promote exchange and interaction between the research community and the industry partners.
The HPI Future SOC Lab provides researchers with free-of-charge access to a complete infrastructure of state-of-the-art hardware and software. This infrastructure includes components that might be too expensive for an ordinary research environment, such as servers with up to 64 cores and 2 TB of main memory. The offerings address researchers particularly from, but not limited to, the areas of computer science and business information systems. Main areas of research include cloud computing, parallelization, and in-memory technologies.
This technical report presents results of research projects executed in 2018. Selected projects have presented their results on April 17th and November 14th 2018 at the Future SOC Lab Day events.
Pictures are a medium that helps make the past tangible and preserve memories. Without context, they are not able to do so. Pictures are brought to life by their associated stories. However, the older pictures become, the fewer contemporary witnesses can tell these stories.
Especially for large, analog picture archives, knowledge and memories are spread over many people. This creates several challenges: First, the pictures must be digitized to save them from decaying and make them available to the public. Since a simple listing of all the pictures is confusing, the pictures should be structured accessibly. Second, known information that makes the stories vivid needs to be added to the pictures. Users should get the opportunity to contribute their knowledge and memories. To make this usable for all interested parties, even for older, less technophile generations, the interface should be intuitive and error-tolerant.
The resulting requirements are not covered in their entirety by any existing software solution without losing the intuitive interface or the scalability of the system.
Therefore, we have developed our digital picture archive within the scope of a bachelor project in cooperation with the Bad Harzburg-Stiftung. For the implementation of this web application, we use the UI framework React in the frontend, which communicates via a GraphQL interface with the Content Management System Strapi in the backend. The use of this system enables our project partner to create an efficient process from scanning analog pictures to presenting them to visitors in an organized and annotated way. To customize the solution for both picture delivery and information contribution for our target group, we designed prototypes and evaluated them with people from Bad Harzburg. This helped us gain valuable insights into our system’s usability and future challenges as well as requirements.
Our web application is already being used daily by our project partner. During the project, we still came up with numerous ideas for additional features to further support the exchange of knowledge.
Like conventional software projects, projects in model-driven software engineering require adequate management of multiple versions of development artifacts, importantly allowing developers to live with temporary inconsistencies. In the case of model-driven software engineering, the employed versioning approaches also have to handle situations where different artifacts, that is, different models, are linked via automatic model transformations.
In this report, we propose a technique for jointly handling the transformation of multiple versions of a source model into corresponding versions of a target model, which enables the use of a more compact representation that may afford improved execution time of both the transformation and further analysis operations. Our approach is based on the well-known formalism of triple graph grammars and a previously introduced encoding of model version histories called multi-version models. In addition to showing the correctness of our approach with respect to the standard semantics of triple graph grammars, we conduct an empirical evaluation that demonstrates the potential benefit regarding execution time performance.
Modular and incremental global model management with extended generalized discrimination networks
(2023)
Complex projects developed under the model-driven engineering paradigm nowadays often involve several interrelated models, which are automatically processed via a multitude of model operations. Modular and incremental construction and execution of such networks of models and model operations are required to accommodate efficient development with potentially large-scale models. The underlying problem is also called Global Model Management.
In this report, we propose an approach to modular and incremental Global Model Management via an extension to the existing technique of Generalized Discrimination Networks (GDNs). In addition to further generalizing the notion of query operations employed in GDNs, we adapt the previously query-only mechanism to operations with side effects to integrate model transformation and model synchronization. We provide incremental algorithms for the execution of the resulting extended Generalized Discrimination Networks (eGDNs), as well as a prototypical implementation for a number of example eGDN operations.
Based on this prototypical implementation, we experiment with an application scenario from the software development domain to empirically evaluate our approach with respect to scalability and conceptually demonstrate its applicability in a typical scenario. Initial results confirm that the presented approach can indeed be employed to realize efficient Global Model Management in the considered scenario.
Learning from failure
(2022)
Regression testing is a widespread practice in today's software industry to ensure software product quality. Developers derive a set of test cases, and execute them frequently to ensure that their change did not adversely affect existing functionality. As the software product and its test suite grow, the time to feedback during regression test sessions increases, and impedes programmer productivity: developers wait longer for tests to complete, and delays in fault detection render fault removal increasingly difficult.
Test case prioritization addresses the problem of long feedback loops by reordering test cases, such that test cases of high failure probability run first, and test case failures become actionable early in the testing process. We ask: given test execution schedules reconstructed from publicly available data, to what extent can their fault detection efficiency be improved, and which technique yields the most efficient test schedules with respect to APFD?
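APFD (Average Percentage of Faults Detected) rewards schedules that expose each fault early in the run. A minimal sketch of the standard formula, assuming 1-based test positions; the test and fault names are made up for illustration:

```python
def apfd(test_order, failing_tests_per_fault):
    """APFD = 1 - (sum of first-detection positions) / (n * m) + 1 / (2n),
    where n is the number of tests and m the number of faults."""
    n = len(test_order)
    m = len(failing_tests_per_fault)
    position = {t: i + 1 for i, t in enumerate(test_order)}  # 1-based
    # TF_i: position of the first test that detects fault i
    tf = [min(position[t] for t in tests) for tests in failing_tests_per_fault]
    return 1 - sum(tf) / (n * m) + 1 / (2 * n)

# 5 tests, 2 faults: fault A is revealed by t2, fault B by t4
order = ["t1", "t2", "t3", "t4", "t5"]
faults = [{"t2"}, {"t4"}]
print(apfd(order, faults))  # 1 - (2+4)/10 + 0.1 = 0.5
```

Reordering the schedule so that the failing tests run first (e.g., `["t2", "t4", "t1", "t3", "t5"]`) raises the score to 0.8, which is exactly the effect prioritization techniques aim for.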
To this end, we recover 6200 regression test sessions from the build log files of Travis CI, a popular continuous integration service, and gather 62000 accompanying changelists. We evaluate the efficiency of current test schedules, and examine the prioritization results of state-of-the-art lightweight, history-based heuristics. We propose and evaluate a novel set of prioritization algorithms, which connect software changes and test failures in a matrix-like data structure.
Our studies indicate that the optimization potential is substantial, because the existing test plans score only 30% APFD. The predictive power of past test failures proves to be outstanding: simple heuristics, such as repeating tests with failures in recent sessions, result in efficiency scores of 95% APFD. The best-performing matrix-based heuristic achieves a similar score of 92.5% APFD. In contrast to prior approaches, we argue that matrix-based techniques are useful beyond the scope of effective prioritization, and enable a number of use cases involving software maintenance.
We validate our findings from continuous integration processes by extending a continuous testing tool within development environments with means of test prioritization, and pose further research questions. We think that our findings are suited to propel adoption of (continuous) testing practices, and that programmers' toolboxes should contain test prioritization as an existential productivity tool.
Cyber-physical systems often encompass complex concurrent behavior with timing constraints and probabilistic failures on demand. The analysis of whether such systems with probabilistic timed behavior adhere to a given specification is essential. When the states of the system can be represented by graphs, the rule-based formalism of Probabilistic Timed Graph Transformation Systems (PTGTSs) can be used to suitably capture structure dynamics as well as probabilistic and timed behavior of the system. Model checking support for PTGTSs w.r.t. properties specified using Probabilistic Timed Computation Tree Logic (PTCTL) has already been presented. Moreover, for timed graph-based runtime monitoring, Metric Temporal Graph Logic (MTGL) has been developed for stating metric temporal properties on identified subgraphs and their structural changes over time. In this paper, we (a) extend MTGL to the Probabilistic Metric Temporal Graph Logic (PMTGL) by allowing for the specification of probabilistic properties, (b) adapt our MTGL satisfaction checking approach to PTGTSs, and (c) combine the approaches for PTCTL model checking and MTGL satisfaction checking to obtain a Bounded Model Checking (BMC) approach for PMTGL. In our evaluation, we apply an implementation of our BMC approach in AutoGraph to a running example.
Modeling and Formal Analysis of Meta-Ecosystems with Dynamic Structure using Graph Transformation
(2022)
The dynamics of ecosystems is of crucial importance. Various model-based approaches exist to understand and analyze their internal effects. In this paper, we model the space structure dynamics and ecological dynamics of meta-ecosystems using the formal technique of Graph Transformation (short GT). We build GT models to describe how a meta-ecosystem (modeled as a graph) can evolve over time (modeled by GT rules) and to analyze these GT models with respect to qualitative properties such as the existence of structural stabilities. As a case study, we build three GT models describing the space structure dynamics and ecological dynamics of three different savanna meta-ecosystems. The first GT model considers a savanna meta-ecosystem that is limited in space to two ecosystem patches, whereas the other two GT models consider two savanna meta-ecosystems that are unlimited in the number of ecosystem patches and only differ in one GT rule describing how the space structure of the meta-ecosystem grows. In the first two GT models, the space structure dynamics and ecological dynamics of the meta-ecosystem show two main structural stabilities: the first one based on grassland-savanna-woodland transitions and the second one based on grassland-desert transitions. The transition between these two structural stabilities is driven by high-intensity fires affecting the tree components. In the third GT model, the GT rule for savanna regeneration induces desertification and therefore a collapse of the meta-ecosystem. We believe that GT models provide a complementary avenue to that of existing approaches to rigorously study ecological phenomena.
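The rule-based rewriting idea behind such GT models can be illustrated with a deliberately simplified Python sketch. The vegetation states and transitions follow the abstract above, but the dictionary-based "rules" and event names are a toy stand-in for typed graph transformation rules applied by tools such as AutoGraph:

```python
# Patches are nodes of the meta-ecosystem graph, each with a vegetation state.
# A "rule" rewrites the state of a matched patch, mimicking one GT step.
RULES = {
    # (current state, event) -> next state
    ("grassland", "growth"): "savanna",
    ("savanna", "growth"): "woodland",
    ("woodland", "high_intensity_fire"): "grassland",
    ("grassland", "drought"): "desert",
}

def apply_rule(patches, patch_id, event):
    """Apply a rewrite rule to one patch if its left-hand side matches;
    returns True iff a rule was applicable."""
    nxt = RULES.get((patches[patch_id], event))
    if nxt is None:
        return False
    patches[patch_id] = nxt
    return True

eco = {"p1": "grassland", "p2": "grassland"}
apply_rule(eco, "p1", "growth")   # p1: grassland -> savanna
apply_rule(eco, "p1", "growth")   # p1: savanna -> woodland
apply_rule(eco, "p2", "drought")  # p2: grassland -> desert
print(eco)  # {'p1': 'woodland', 'p2': 'desert'}
```

Real GT models additionally rewrite the graph structure itself (adding and deleting patches and their adjacency edges), which is what enables the analysis of structural stabilities described above.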