Institut für Informatik und Computational Science
We present a method employing Answer Set Programming in combination with Approximate Model Counting for the fast and accurate calculation of error propagation probabilities in digital circuits. Through an efficient problem encoding, we achieve an input data format similar to a Verilog netlist, so that extensive preprocessing is avoided. By tightly interconnecting our application with the underlying solver, we avoid iterating over fault sites and reduce the number of solver calls. Several circuits were analyzed with varying numbers of considered cycles and different degrees of approximation. Our experiments show that approximation reduces the runtime by a factor of 91, while the error compared to the exact result remains below 1%.
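The quantity being computed can be illustrated without any ASP machinery: the propagation probability of a fault site is the fraction of input assignments (models) on which the faulty circuit disagrees with the fault-free one. The sketch below uses exhaustive enumeration over a hypothetical two-gate circuit as a stand-in for the paper's ASP encoding and approximate model counter:

```python
from itertools import product

def circuit(a, b, c, fault=False):
    """Toy netlist: g1 = a AND b, out = g1 OR c.
    With `fault` set, g1 is bit-flipped (an SEU-style fault injection)."""
    g1 = a & b
    if fault:
        g1 ^= 1  # fault at gate g1
    return g1 | c

# Probability that a fault at g1 propagates to the output:
# count input vectors on which good and faulty outputs differ.
total = propagated = 0
for a, b, c in product([0, 1], repeat=3):
    total += 1
    if circuit(a, b, c) != circuit(a, b, c, fault=True):
        propagated += 1

print(propagated / total)  # 0.5: whenever c = 1, the OR gate masks the fault
```

Approximate model counting replaces this exhaustive count with a probabilistic estimate, which is where the reported factor-91 speedup comes from.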
Answer Set Programming (ASP) is a paradigm for modeling and solving problems in knowledge representation and reasoning. There are plenty of results dedicated to studying the hardness of (fragments of) ASP. So far, these studies have resulted in characterizations in terms of computational complexity as well as in fine-grained insights presented in the form of dichotomy-style results, lower bounds for translations to other formalisms like propositional satisfiability (SAT), and even detailed parameterized complexity landscapes. A generic parameter in parameterized complexity originating from graph theory is the so-called treewidth, which in a sense captures the structural density of a program. Recently, the number of treewidth-based solvers related to SAT has increased. While there are translations from (normal) ASP to SAT, no reduction is known that preserves treewidth or at least keeps track of the treewidth increase. In this paper we propose a novel reduction from normal ASP to SAT that is aware of the treewidth and guarantees that a slight increase of treewidth is indeed sufficient. Further, we show a new result establishing that, when considering treewidth, the fragment of normal ASP is already slightly harder than SAT (under reasonable assumptions in computational complexity). This also confirms that our reduction probably cannot be significantly improved and that the slight increase of treewidth is unavoidable. Finally, we present an empirical study of our novel reduction from normal ASP to SAT, in which we compare treewidth upper bounds obtained via known decomposition heuristics. Overall, our reduction works better with these heuristics than existing translations. (c) 2021 Elsevier B.V. All rights reserved.
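For readers unfamiliar with the semantics being reduced to SAT, the sketch below brute-forces the answer sets of a tiny normal program via the Gelfond-Lifschitz reduct. It illustrates stable-model semantics only; it is not the paper's treewidth-aware reduction, and the example program is a hypothetical two-atom choice:

```python
from itertools import combinations

# A normal program as (head, positive_body, negative_body) triples.
# Example: {a :- not b.  b :- not a.} has answer sets {a} and {b}.
program = [("a", [], ["b"]), ("b", [], ["a"])]
atoms = {"a", "b"}

def minimal_model(positive_rules):
    """Least model of a negation-free program (naive fixpoint iteration)."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, pos in positive_rules:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def answer_sets(program, atoms):
    result = []
    for r in range(len(atoms) + 1):
        for cand in combinations(sorted(atoms), r):
            cand = set(cand)
            # Gelfond-Lifschitz reduct: drop rules whose negative body
            # intersects the candidate, then delete the remaining negation.
            reduct = [(h, p) for h, p, n in program if not set(n) & cand]
            if minimal_model(reduct) == cand:
                result.append(cand)
    return result

print(answer_sets(program, atoms))  # [{'a'}, {'b'}]
```

A treewidth-aware reduction must encode exactly this stability check in SAT while only slightly increasing the treewidth of the program's underlying graph, which is what makes the construction delicate.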
The highly structured nature of the educational sector demands effective policy mechanisms close to the needs of the field. That is why evidence-based policy making, endorsed by the European Commission under Erasmus+ Key Action 3, aims to align the domains of policy and practice. Against this background, this article addresses two issues: first, that there is a vertical gap in the translation of higher-level policies into local strategies and regulations; second, that there is a horizontal gap between educational domains regarding the policy awareness of individual players. This was analyzed in quantitative and qualitative studies with domain experts from the fields of virtual mobility and teacher training. From our findings, we argue that the combination of both gaps puts the academic bridge from secondary to tertiary education at risk, including the associated knowledge proficiency levels. We discuss the role of digitalization in the academic bridge by asking: what value do the involved stakeholders expect from educational policies? As a theoretical basis, we rely on the model of value co-creation for and by stakeholders. We describe the instruments used along with the obtained results and proposed benefits. Moreover, we reflect on the applied methodology and finally derive recommendations for future academic bridge policies.
Large-scale databases that report the inhibitory capacities of many combinations of candidate drug compounds and cultivated cancer cell lines have driven the development of preclinical drug-sensitivity models based on machine learning. However, cultivated cell lines have diverged from human cancer cells over years or even decades under selective pressure in culture conditions. Moreover, models that have been trained on in vitro data cannot account for interactions with other types of cells. Drug-response data based on patient-derived cell cultures, xenografts, and organoids, on the other hand, are not available in the quantities needed to train high-capacity machine-learning models. We found that pre-training deep neural network models of drug sensitivity on in vitro drug-sensitivity databases before fine-tuning the model parameters on patient-derived data improves the models' accuracy and the biological plausibility of the learned features, compared to training only on patient-derived data. From our experiments, we conclude that pre-trained models outperform models trained only on the target domain in the vast majority of cases.
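The pre-train/fine-tune scheme can be sketched with a deliberately simple stand-in: a one-feature linear regressor pre-trained on plentiful synthetic "in vitro" data and fine-tuned on a handful of synthetic "patient-derived" samples from a shifted distribution. All data, models, and hyperparameters here are hypothetical placeholders, not the paper's deep networks or databases:

```python
import random

random.seed(0)

def train(w, b, data, epochs=200, lr=0.01):
    """Plain stochastic gradient descent for a 1-feature linear regressor."""
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

def mse(w, b, data):
    return sum(((w * x + b) - y) ** 2 for x, y in data) / len(data)

# Synthetic stand-ins: a large "in vitro" set and a tiny
# "patient-derived" set drawn from a shifted distribution.
true_w, shift = 2.0, 0.5
in_vitro = [(x, true_w * x + random.gauss(0, 0.1))
            for x in (random.uniform(-1, 1) for _ in range(200))]
patient = [(x, true_w * x + shift + random.gauss(0, 0.1))
           for x in (random.uniform(-1, 1) for _ in range(5))]

# Baseline: train on the scarce patient data only.
w0, b0 = train(0.0, 0.0, patient)

# Pre-train on in vitro data, then briefly fine-tune on patient data.
w1, b1 = train(0.0, 0.0, in_vitro)
w1, b1 = train(w1, b1, patient, epochs=50)

held_out = [(x, true_w * x + shift) for x in (-0.8, -0.3, 0.4, 0.9)]
print(mse(w0, b0, held_out), mse(w1, b1, held_out))
```

The design point is that pre-training fixes the shared structure (the slope) from abundant source-domain data, so fine-tuning only needs the few target samples to absorb the domain shift (the intercept).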
Continuous verification of network security compliance is an accepted need. In particular, the analysis of stateful packet filters plays a central role for network security in practice. But the few existing tools that support the analysis of stateful packet filters are based on generally applicable formal methods such as Satisfiability Modulo Theories (SMT) or theorem provers, and exhibit runtimes on the order of minutes to hours, making them unsuitable for continuous compliance verification. In this work, we address these challenges and present the concept of state shell interweaving to transform a stateful firewall rule set into a stateless rule set. This allows us to reuse any fast domain-specific engine from the field of data plane verification, leveraging smart, very fast, and domain-specialized data structures and algorithms, including Header Space Analysis (HSA). First, we introduce the formal language FPL, which enables a high-level, human-understandable specification of the desired state of network security. Second, we demonstrate the instantiation of a compliance process using a verification framework that analyzes the configuration of complex networks and devices - including stateful firewalls - for compliance with FPL policies. Our evaluation results show the scalability of the presented approach on the well-known Internet2 and Stanford benchmarks as well as on large firewall rule sets, where it outperforms state-of-the-art tools by a factor of over 41.
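The core transformation can be sketched as follows. `Rule` and `make_stateless` are hypothetical simplifications invented for illustration (the actual state shell interweaving handles real match fields and hands the result to engines such as HSA); the sketch only conveys the idea of rewriting stateful allow rules into explicit forward-plus-reverse stateless rules:

```python
from typing import List, NamedTuple

class Rule(NamedTuple):
    src: str        # source prefix, "0.0.0.0/0" matches anything
    dst: str        # destination prefix
    dport: int      # destination port, 0 = any
    action: str     # "allow" / "deny"
    stateful: bool  # allow replies to established connections

def make_stateless(rules: List[Rule]) -> List[Rule]:
    """Rewrite a stateful rule set into a stateless one: each stateful
    allow rule becomes its forward rule plus an explicit reverse-direction
    rule, so a stateless data-plane engine can analyze the set."""
    out = []
    for r in rules:
        out.append(r._replace(stateful=False))
        if r.stateful and r.action == "allow":
            # Reverse rule: replies from dst back to src are allowed.
            out.append(Rule(r.dst, r.src, 0, "allow", False))
    return out

stateful_set = [
    Rule("10.0.0.0/8", "0.0.0.0/0", 443, "allow", True),  # outbound HTTPS
    Rule("0.0.0.0/0", "10.0.0.0/8", 0, "deny", False),    # default deny inbound
]
for r in make_stateless(stateful_set):
    print(r)
```

Note that in a first-match rule set the generated reverse rule must precede the default deny, which the simple append order above happens to respect; a faithful transformation must preserve such ordering constraints in general.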