Kijko et al. (2016) present various methods to estimate parameters that are relevant for probabilistic seismic-hazard assessment. One of these parameters, although not the most influential, is the maximum possible earthquake magnitude m_max. I show that the proposed estimation of m_max is based on an erroneous equation related to a misuse of the estimator in Cooke (1979) and leads to unstable results. So far, reported finite estimations of m_max arise from data selection, because the estimator in Kijko et al. (2016) diverges with finite probability. This finding is independent of the assumed distribution of earthquake magnitudes. For the specific choice of the doubly truncated Gutenberg-Richter distribution, I illustrate the problems by deriving explicit equations. Finally, I conclude that point estimators are generally not a suitable approach to constrain m_max.
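For context, the kind of point estimator at issue can be sketched in a few lines. A minimal sketch, assuming the doubly truncated Gutenberg-Richter distribution and a Cooke-type correction term m_max = m_obs + integral of F(m)^n; the fixed-point iteration, beta value and toy catalogue are illustrative assumptions, not the authors' code:

```python
import numpy as np
from scipy.integrate import quad

def gr_cdf(m, m_min, m_max, beta):
    """CDF of the doubly truncated Gutenberg-Richter distribution."""
    return (1.0 - np.exp(-beta * (m - m_min))) / (1.0 - np.exp(-beta * (m_max - m_min)))

def mmax_point_estimate(mags, beta, n_iter=100):
    """Cooke-type estimate m_max = m_obs + int_{m_min}^{m_obs} F(m)^n dm,
    iterated because F itself depends on the unknown m_max."""
    m_min, m_obs, n = mags.min(), mags.max(), len(mags)
    m_hat = m_obs
    for _ in range(n_iter):
        integral, _ = quad(lambda m: gr_cdf(m, m_min, m_hat, beta) ** n, m_min, m_obs)
        m_hat = m_obs + integral
    return m_hat

mags = np.random.default_rng(1).uniform(4.0, 6.5, 30)  # toy catalogue
print(mmax_point_estimate(mags, beta=2.0))
```

For small n the correction term, and with it the estimate, is highly sensitive to the largest observed magnitude, which is one way to see the instability the abstract describes.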
Does age influence brain potentials during affective picture processing in middle-aged women?
(2017)
The P300 and the LC-NE system: new insights from transcutaneous vagus nerve stimulation (tVNS)
(2017)
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary to couple chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization models to estimate Young’s moduli for five synthetic microstructures and successfully validate our results for comparable geometries with the analytical Mori-Tanaka approach. Further, we demonstrate that considering specific rock microstructures is of paramount importance, since calculated elastic properties may deviate by up to 230 % for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young’s moduli is significantly improved when detailed spatial information is employed.
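For reference, the analytical benchmark named above is compact enough to sketch. A minimal sketch of the classical Mori-Tanaka scheme for a two-phase medium with spherical inclusions; the quartz-like matrix moduli and 10 % porosity are illustrative assumptions, not values from the paper:

```python
def mori_tanaka(K_m, G_m, K_i, G_i, f):
    """Effective bulk/shear moduli of a matrix (K_m, G_m) holding a volume
    fraction f of spherical inclusions (K_i, G_i)."""
    zeta = G_m * (9 * K_m + 8 * G_m) / (6 * (K_m + 2 * G_m))
    K = K_m + f * (K_i - K_m) / (1 + (1 - f) * (K_i - K_m) / (K_m + 4 * G_m / 3))
    G = G_m + f * (G_i - G_m) / (1 + (1 - f) * (G_i - G_m) / (G_m + zeta))
    return K, G

def young(K, G):
    """Isotropic relation E = 9KG / (3K + G)."""
    return 9 * K * G / (3 * K + G)

# Illustrative: quartz-like matrix (K = 37 GPa, G = 44 GPa) with 10 % dry pores.
K_eff, G_eff = mori_tanaka(37.0, 44.0, 0.0, 0.0, 0.10)
print(f"E_eff = {young(K_eff, G_eff):.1f} GPa")
```

The same porosity distributed as thin cracks rather than spheres would soften the medium far more, which is exactly the microstructure sensitivity the abstract quantifies.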
As an agent potentially toxic to the nervous system and bone, aluminium exposure from adjuvants in vaccines and subcutaneous immune therapy (SCIT) products has to be continuously re-evaluated for safety, especially regarding concomitant administrations. For this purpose, knowledge of the absorption and disposition of aluminium in plasma and tissues is essential. Pharmacokinetic data after vaccination in humans, however, are not available and, for methodological and ethical reasons, difficult to obtain. To overcome these limitations, we discuss the possibility of an in vitro-in silico approach combining a toxicokinetic model for aluminium disposition with biorelevant kinetic absorption parameters from adjuvants. We critically review available kinetic aluminium-26 data for model building and, on the basis of a reparameterized toxicokinetic model (Nolte et al., 2001), we identify the main modelling gaps. The potential of in vitro dissolution experiments for the prediction of intramuscular absorption kinetics of aluminium after vaccination is explored. It becomes apparent that detailed in vitro dissolution and in vivo absorption data are needed to establish an in vitro-in vivo correlation (IVIVC) for aluminium adjuvants. We conclude that a combination of new experimental data and further refinement of the Nolte model has the potential to fill a gap in aluminium risk assessment.
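To make the modelling idea concrete, here is a minimal sketch of a generic first-order absorption/disposition model of the kind such toxicokinetic approaches build on. It is emphatically not the Nolte model (which is more detailed and multi-compartmental), and the rate constants are hypothetical:

```python
from scipy.integrate import solve_ivp

def depot_plasma(t, y, ka, ke):
    """Depot (e.g. intramuscular adjuvant) -> plasma -> cleared,
    both transfers first-order with rates ka and ke (per day)."""
    depot, plasma = y
    return [-ka * depot, ka * depot - ke * plasma]

# Hypothetical rates; real aluminium release from adjuvants is far slower.
sol = solve_ivp(depot_plasma, (0.0, 365.0), [1.0, 0.0], args=(0.02, 0.005),
                t_eval=[1, 30, 90, 180, 365])
print(sol.y[1])  # relative plasma amount over time (days)
```

An IVIVC as discussed in the abstract would, in effect, replace the fixed ka with a release profile measured in vitro.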
The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016, Journal of Hydraulic Engineering, ASCE) by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters over which the entropy is defined ensures consistency between different representations of the same network. The performance of the proposed reduced-parameter method is demonstrated with a one-loop network case study.
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods, which assign the discrete values of a maximum entropy distribution to equal the traffic flow rates. The proposed method, however, uses a distribution to represent each flow rate. It is shown to handle uncertainty in a more elegant way and to give results similar to those of traditional methods. It is able to incorporate more of the observed data through the entropy function, prior distribution and integration limits, potentially allowing better inferences to be made.
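For orientation, a minimal sketch of the classical doubly constrained entropy-maximizing gravity model (Wilson's form, solved by iterative balancing) that this line of work generalizes; it shows the point-valued classic, not the distribution-valued variant the abstract proposes, and all numbers are toy values:

```python
import numpy as np

def gravity_maxent(O, D, cost, beta, n_iter=1000, tol=1e-9):
    """Doubly constrained gravity model T_ij = A_i O_i B_j D_j exp(-beta c_ij),
    balanced so row sums match origins O and column sums match destinations D."""
    f = np.exp(-beta * cost)
    B = np.ones(len(D))
    for _ in range(n_iter):
        A = 1.0 / (f @ (B * D))        # enforce row constraints
        B_new = 1.0 / (f.T @ (A * O))  # enforce column constraints
        if np.max(np.abs(B_new - B)) < tol:
            B = B_new
            break
        B = B_new
    return (A * O)[:, None] * (B * D)[None, :] * f

O = np.array([100.0, 50.0])               # trips produced per origin
D = np.array([80.0, 70.0])                # trips attracted per destination
cost = np.array([[1.0, 2.0], [2.0, 1.0]]) # travel cost matrix
T = gravity_maxent(O, D, cost, beta=0.5)
print(T, T.sum(axis=1), T.sum(axis=0))    # rows ~ O, columns ~ D
```

The row and column sums of T reproduce the origin and destination totals, which are the constraints under which the entropy is maximized.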
Background: Evidence that home telemonitoring (HTM) for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health economic advantage. Therefore, the CardioBBEAT trial was designed to prospectively assess the health economic impact of a dedicated home monitoring system for patients with CHF based on actual costs directly obtained from patients’ health care providers.
Methods: Between January 2010 and June 2013, 621 patients (mean age 63.0 ± 11.5 years, 88 % male) with a confirmed diagnosis of CHF (LVEF ≤ 40 %) were enrolled and randomly assigned to two study groups comprising usual care with and without an interactive bi-directional HTM (Motiva®). The primary endpoint was the incremental cost-effectiveness ratio (ICER) established by the groups’ difference in total cost and in the combined clinical endpoint “days alive and not in hospital nor inpatient care per potential days in study” within the follow-up of 12 months. Secondary outcome measures were total mortality and health-related quality of life (SF-36, WHO-5 and KCCQ).
Results: In the intention-to-treat analysis, total mortality (HR 0.81; 95 % CI 0.45–1.45) and days alive and not in hospital (343.3 ± 55.4 vs. 347.2 ± 43.9; p = 0.909) were not significantly different between HTM and usual care. While the resulting primary endpoint ICER was not positive (−181.9; 95 % CI −1626.2 ± 1628.9), quality of life assessed by SF-36, WHO-5 and KCCQ as a secondary endpoint was significantly higher in the HTM group at 6 and 12 months of follow-up.
Conclusions: This first simultaneous assessment of the clinical and economic outcomes of HTM in patients with CHF did not demonstrate superior incremental cost-effectiveness compared to usual care. On the other hand, quality of life was improved. It remains open whether the tested HTM solution represents a useful innovative approach in the current health care setting.
The keynote article (Mayberry & Kluender, 2017) makes an important contribution to questions concerning the existence and characteristics of sensitive periods in language acquisition. Specifically, by comparing groups of non-native L1 and L2 signers, the authors have been able to ingeniously disentangle the effects of maturation from those of early language exposure. Based on L1 versus L2 contrasts, the paper convincingly argues that L2 learning is a less clear test of sensitive periods. Nevertheless, we believe Mayberry and Kluender underestimate the evidence for maturational factors in L2 learning, especially that coming from recent research.
This paper discusses a new approach for designing and deploying Security-as-a-Service (SecaaS) applications using cloud native design patterns. Current SecaaS approaches do not efficiently handle the increasing threats to computer systems and applications. For example, requests for security assessments increase drastically after a high-risk security vulnerability is disclosed. In such scenarios, SecaaS applications are unable to dynamically scale to serve requests. A root cause of this challenge is the employment of architectures not specifically fitted to cloud environments. Cloud native design patterns resolve this challenge by enabling certain properties, e.g. massive scalability and resiliency, via the combination of microservice patterns and cloud-focused design patterns. However, adopting these patterns is a complex process, during which several security issues are introduced. In this work, we investigate these security issues, and we redesign and deploy a monolithic SecaaS application using cloud native design patterns while considering appropriate, layered security counter-measures, i.e. at the application and cloud networking layers. Our prototype implementation outperforms traditional, monolithic applications with an average Scanner Time of 6 minutes, without compromising security. Our approach can be employed for designing secure, scalable and performant SecaaS applications that effectively handle unexpected increases in security assessment requests.
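As a toy illustration of the elasticity argument (hypothetical, not part of the paper or its API): with a cloud native design, scanner workers can be scaled with the backlog of assessment requests, which a monolith cannot do at this granularity.

```python
import math

def desired_scanner_replicas(pending_scans: int, per_worker: int = 5,
                             min_r: int = 2, max_r: int = 50) -> int:
    """Toy horizontal-scaling rule: one scanner worker per `per_worker`
    queued security assessments, clamped to [min_r, max_r]."""
    return max(min_r, min(max_r, math.ceil(pending_scans / per_worker)))

print(desired_scanner_replicas(3))     # quiet period -> minimum replicas
print(desired_scanner_replicas(180))   # post-disclosure spike -> scale out
```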
The ionospheric delay of global navigation satellite system (GNSS) signals is typically compensated by adding a single correction value to the pseudorange measurement of a GNSS receiver. Yet this neglects the dispersive nature of the ionosphere. In this context, we analyze the ionospheric signal distortion beyond a constant delay. These effects become increasingly significant with the signal bandwidth and hence more important for new broadband navigation signals. Using measurements of the Galileo E5 signal, captured with a high-gain antenna, we verify that the expected influence can indeed be observed and compensated. A new method to estimate the total electron content (TEC) from a single-frequency high-gain antenna measurement of a broadband GNSS signal is proposed and described in detail. The received signal is de facto unaffected by multipath and interference because of the narrow aperture angle of the antenna used, which should generally reduce these error sources. We would like to point out that such measurements are independent of the code correlation used in standard receiver applications; the method is therefore also usable without knowledge of the signal coding. Results of the TEC estimation process are shown and compared to common TEC products such as TEC maps and dual-frequency receiver estimates.
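The dispersion the paper exploits follows from the standard first-order ionospheric term, a group delay of 40.3·TEC/f² metres. A minimal sketch; the E5 band edges and the 20 TECU value are illustrative assumptions:

```python
import numpy as np

TECU = 1e16  # electrons per m^2

def iono_delay_m(f_hz, tec_tecu):
    """First-order ionospheric group delay in metres: 40.3 * TEC / f^2."""
    return 40.3 * tec_tecu * TECU / f_hz ** 2

f = np.linspace(1191.795e6 - 25e6, 1191.795e6 + 25e6, 5)  # ~50 MHz around E5
d = iono_delay_m(f, 20.0)
print(d - d.mean())  # decimetre-level delay variation across the band
```

Because the delay varies measurably across a wide band, the distortion of the received waveform itself carries TEC information, which is what a single-frequency broadband estimate can exploit.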
S-test results for the USGS and RELM forecasts. The differences between the simulated log-likelihoods and the observed log-likelihood are labelled on the horizontal axes, with scaling adjustments for the 40year.retro experiment. The horizontal lines represent the confidence intervals, at the 0.05 significance level, for each forecast and experiment. If this range contains a log-likelihood difference of zero, the forecasted log-likelihoods are consistent with the observed one, and the forecast passes the S-test (denoted by thin lines). If this range does not contain zero, the forecast fails the S-test for that particular experiment (denoted by thick lines). Colours distinguish between experiments (see Table 2 for an explanation of experiment durations). Due to anomalously large likelihood differences, S-test results for Wiemer-Schorlemmer.ALM during the 10year.retro and 40year.retro experiments are not displayed. The range of log-likelihoods for the Holliday-et-al.PI forecast is lower than for the other forecasts due to relatively homogeneous forecasted seismicity rates and the use of a small fraction of the RELM testing region.
Massive Open Online Courses (MOOCs) have left their mark on the face of education in recent years. At the Hasso Plattner Institute (HPI) in Potsdam, Germany, we are actively developing a MOOC platform, which provides our research with a plethora of e-learning topics, such as learning analytics, automated assessment, peer assessment, teamwork, online proctoring, and gamification. We run several instances of this platform. On openHPI, we provide our own courses from within the HPI context. Further instances are openSAP, openWHO, and mooc.HOUSE, which is the smallest of these platforms, targeting customers with a less extensive course portfolio. In 2013, we started to work on the gamification of our platform. By now, we have implemented about two thirds of the features that we initially evaluated as useful for our purposes. About a year ago, we activated the implemented gamification features on mooc.HOUSE. Before activating the features on openHPI as well, we examined and re-evaluated our initial considerations based on the data we have collected so far and the changes in other contexts of our platforms.
Dielectrophoretic functionalization of nanoelectrode arrays for the detection of influenza viruses
(2017)
E-commerce marketplaces are highly dynamic, with constant competition. While this competition is challenging for many merchants, it also provides plenty of opportunities, e.g., by allowing them to automatically adjust prices in order to react to changing market situations. For practitioners, however, testing automated pricing strategies is time-consuming and potentially hazardous when done in production. Researchers, on the other hand, struggle to study how pricing strategies interact under heavy competition. As a consequence, we built Price Wars, an open continuous-time framework to simulate dynamic pricing competition. The microservice-based architecture provides a scalable platform for large competitions with dozens of merchants and a large random stream of consumers. Our platform stores each event in a distributed log. This allows us to provide different performance measures, enabling users to compare the profit and revenue of various repricing strategies in real time. For researchers, price trajectories are shown, which eases evaluating the mutual price reactions of competing strategies. Furthermore, merchants can access historical marketplace data and apply machine learning. By providing a set of customizable, artificial merchants, users can easily simulate both simple rule-based strategies and sophisticated data-driven strategies using demand learning to optimize their pricing.
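To give a flavour of the rule-based strategies such a platform can host (a hypothetical merchant sketch, not the Price Wars API):

```python
from dataclasses import dataclass

@dataclass
class Offer:
    merchant_id: str
    price: float

def undercut(own_cost: float, offers: list[Offer],
             step: float = 0.05, default_price: float = 30.0) -> float:
    """Rule-based repricing: undercut the cheapest competitor by `step`,
    but never sell below own cost."""
    if not offers:
        return default_price
    return max(own_cost, min(o.price for o in offers) - step)

print(undercut(12.0, [Offer("a", 19.99), Offer("b", 17.49)]))  # -> 17.44
```

A data-driven strategy would replace the fixed undercutting rule with a price chosen to maximize expected profit under a learned demand model.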
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to incorporate the geoelectrical method into the measurement program for tracking the CO2 plume. Analyzing the example of the Ketzin site, it can be seen that during all phases of the CO2 storage reservoir development the resistivity measurements and their corresponding tomographic interpretation contribute beneficially to the measurement, monitoring and verification (MMV) protocol. The most important impact of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
Root infinitives on Twitter
(2017)
Web-based E-Learning uses Internet technologies and digital media to deliver educational content to learners. In recent years, many universities have applied their capacity to producing Massive Open Online Courses (MOOCs). They have been offering MOOCs with the expectation of rendering a comprehensive online apprenticeship. Typically, an online content delivery process requires an Internet connection. However, broadband access has never been a readily available resource in many regions. In Africa, poor or absent networks are still predominantly experienced by Internet users, who frequently go offline whenever a digital device disconnects from the network. As a result, learning processes in such regions are often disrupted, delayed or terminated. This paper raises the concern of E-Learning over poor and low bandwidths; in fact, it highlights the need for an Offline-Enabled mode. The paper also explores technical approaches aimed at enhancing the user experience in Web-based E-Learning, particularly in Africa.
DPP4 inhibition prevents AKI
(2017)
Cost models play an important role in the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures dominate today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, all accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
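A minimal sketch of what a two-step assessment could look like (illustrative, not the authors' tooling): a cheap timing pass first, then a hardware-counter drill-down via Linux perf only for runs flagged as slow. Event names vary by kernel and CPU (node-loads/node-load-misses are generic NUMA-traffic events on many systems), and ./workload is a hypothetical binary:

```python
import subprocess, time

def step1_time(cmd):
    """Coarse pass: wall-clock time only, negligible overhead."""
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - t0

def step2_counters(cmd):
    """Drill-down pass: NUMA-related hardware event counters via perf
    (perf stat writes its counter report to stderr)."""
    perf_cmd = ["perf", "stat", "-e", "node-loads,node-load-misses", "--"] + cmd
    return subprocess.run(perf_cmd, capture_output=True, text=True).stderr

cmd = ["./workload"]           # hypothetical benchmark binary
if step1_time(cmd) > 1.0:      # threshold flags candidates for drill-down
    print(step2_counters(cmd))
```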
Gamma-ray bursts (GRBs) are some of the Universe’s most enigmatic and exotic events. However, at energies above 10 GeV their behaviour remains largely unknown. Although space-based telescopes such as the Fermi-LAT have been able to detect GRBs in this energy range, their photon statistics are limited by the small detector size. Such limitations are not present in ground-based gamma-ray telescopes such as the H.E.S.S. experiment, which has now entered its second phase with the addition of a large 600 m² telescope at the centre of the array. Such a large telescope allows H.E.S.S. to access the sub-100-GeV energy range while still maintaining a large effective collection area, helping to potentially probe the short-timescale emission of these events.
We present a description of the H.E.S.S. GRB observation programme, summarising the performance of the rapid GRB repointing system and the conditions under which GRB observations are initiated. Additionally, we report on the GRB follow-ups made during the 2014-15 observation campaigns.
Surface acoustic wave (SAW) devices are well known for gravimetric sensor applications. In biosensing applications, chemically and biochemically evoked adsorption processes at surfaces are detected in liquid environments using delay-line or resonator sensor configurations, preferably in combination with appropriate microfluidic devices. In this paper, a novel SAW-based impedance sensor type is introduced which uses only one interdigital electrode transducer (IDT) simultaneously as SAW generator and sensor element. It is shown that the amplitude of the reflected S11 signal directly depends on the input impedance of the SAW device. The input impedance is strongly influenced by mass adsorption, which causes a characteristic and measurable impedance mismatch.
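The measurement principle reduces to the standard one-port reflection relation Γ = (Z_in − Z₀)/(Z_in + Z₀) with |S11| = 20·log₁₀|Γ| dB. A small sketch; the impedance values are illustrative, not from the paper:

```python
import numpy as np

def s11_db(z_in, z0=50.0):
    """|S11| in dB for a one-port with input impedance z_in (complex allowed)."""
    gamma = (z_in - z0) / (z_in + z0)
    return 20.0 * np.log10(np.abs(gamma))

print(s11_db(51 + 0j))    # near-matched IDT: low reflection (~ -40 dB)
print(s11_db(65 + 10j))   # mass loading shifts Z_in -> measurable S11 change
```

Adsorbed mass detunes the transducer away from the matched state, so the reflected amplitude tracks the loading.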
The design of embedded systems is becoming continuously more complex, such that efficient system-level design methods are becoming crucial. Recently, combined Answer Set Programming (ASP) and Quantifier-Free Integer Difference Logic (QF-IDL) solving has been shown to be a promising approach to system synthesis. However, this approach still has several restrictions limiting its applicability. In the paper at hand, we propose a novel ASP modulo Theories (ASPmT) system synthesis approach, which (i) supports more sophisticated system models, (ii) tightly integrates the QF-IDL solving into the ASP solving, and (iii) makes use of partial assignment checking. As a result, more realistic systems are considered, and an early exclusion of infeasible solutions improves the entire system synthesis.
Nanocarriers
(2017)
Handling manufacturing and aging faults with software-based techniques in tiny embedded systems
(2017)
Non-volatile memory occupies a large portion of the chip area in an embedded system. Such memories are prone to manufacturing faults, retention faults, and aging faults. The paper presents a single software-based technique that allows for handling all of these fault types in tiny embedded systems without the need for hardware support. This is beneficial for low-cost embedded systems with simple memory architectures. A software infrastructure and a flow are presented that demonstrate how the technique is used for fault handling right after manufacturing and in the field. Moreover, a full implementation is presented for an MSP430 microcontroller, along with a discussion of the performance, overhead, and reliability impacts.
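The abstract does not spell out the mechanism, so as a stand-in for the class of software-based tests involved, here is the classical March C- memory test over a bytearray standing in for the memory region (real deployments run in C on the device and must respect non-volatile write semantics):

```python
def march_c_minus(mem):
    """March C-: {up(w0); up(r0,w1); up(r1,w0); down(r0,w1); down(r1,w0); up(r0)}.
    Detects stuck-at and many coupling faults; returns faulty addresses."""
    n, faults = len(mem), set()
    for i in range(n):
        mem[i] = 0x00                               # up(w0)
    for i in range(n):                              # up(r0, w1)
        if mem[i] != 0x00: faults.add(i)
        mem[i] = 0xFF
    for i in range(n):                              # up(r1, w0)
        if mem[i] != 0xFF: faults.add(i)
        mem[i] = 0x00
    for i in reversed(range(n)):                    # down(r0, w1)
        if mem[i] != 0x00: faults.add(i)
        mem[i] = 0xFF
    for i in reversed(range(n)):                    # down(r1, w0)
        if mem[i] != 0xFF: faults.add(i)
        mem[i] = 0x00
    for i in range(n):                              # up(r0)
        if mem[i] != 0x00: faults.add(i)
    return faults

print(march_c_minus(bytearray(1024)))  # healthy memory -> empty set
```

Faulty addresses found this way can then be remapped or retired by the software flow, which is the kind of in-the-field handling the abstract refers to.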
This paper describes architectural extensions for a dynamically scheduled processor so that it can be used in three different operation modes, ranging from high performance to high reliability. With minor hardware extensions of the control path, the resources of the superscalar data path can be used either for high-performance execution, fail-safe operation, or fault-tolerant operation. This makes the processor architecture a very good candidate for applications with dynamically changing reliability requirements, e.g. automotive applications. The paper reports the hardware overhead of the extensions and investigates the performance penalties introduced by the fail-safe and fault-tolerant modes. Furthermore, a comprehensive fault simulation was carried out in order to investigate the fault coverage of the proposed approach.
This special issue is the result of several fruitful conference sessions on disturbance hydrology, which started at the 2013 AGU Fall Meeting in San Francisco and have continued every year since. The stimulating presentations and discussions surrounding those sessions have focused on understanding both the disruption of hydrologic functioning following discrete disturbances and the subsequent recovery or change within the affected watershed system. Whereas some hydrologic disturbances are directly linked to anthropogenic activities, such as resource extraction, the contributions to this special issue focus primarily on those with indirect or less pronounced human involvement, such as bark-beetle infestation, wildfire, and other natural hazards. However, human activities are enhancing the severity and frequency of these seemingly natural disturbances, thereby contributing to acute hydrologic problems and hazards. Major research challenges for our increasingly disturbed planet include the lack of continuous pre- and post-disturbance monitoring, hydrologic impacts that vary spatially and temporally based on environmental and hydroclimatic conditions, and the preponderance of overlapping or compounding disturbance sequences. In addition, a conceptual framework for characterizing commonalities and differences among hydrologic disturbances is still in its infancy. In this introduction to the special issue, we advance the fusion of concepts and terminology from ecology and hydrology to begin filling this gap. We briefly explore some preliminary approaches for comparing different disturbances and their hydrologic impacts, which provides a starting point for further dialogue and research progress.
Gaussianity Fair
(2017)
Embedded smart home
(2017)
The popularity of MOOCs has increased considerably in recent years. A typical MOOC consists of video content, self-tests after each video, and homework, which is normally in multiple-choice format. After solving these homework assignments for every week of a MOOC, the final exam certificate can be issued if the student has reached a sufficient score. There are also some attempts to include practical tasks, such as programming, in MOOCs for grading. Nevertheless, until now there has been no known possibility to teach embedded system programming in a MOOC where the programming can be done in a remote lab and where grading of the tasks is additionally possible. This embedded programming includes communication over GPIO pins to control LEDs and measure sensor values. We started a MOOC called "Embedded Smart Home" as a pilot to prove the concept of teaching real hardware programming in a MOOC environment under real-life MOOC conditions with over 6000 students. Furthermore, students owning real hardware have the possibility to program their own devices and grade their results in the MOOC. Finally, we evaluate our approach and analyze student acceptance of this way of offering a course on embedded programming. We also analyze the hardware usage and working time of students solving tasks to find out whether real hardware programming is an advantage and a motivating achievement that supports students' learning success.
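As an illustration of the kind of task such a course grades (the abstract names neither the board nor the library; this sketch assumes a Raspberry Pi with the RPi.GPIO package, and the pin numbers are hypothetical wiring):

```python
import time
import RPi.GPIO as GPIO

LED_PIN, SENSOR_PIN = 17, 27        # hypothetical wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)
GPIO.setup(SENSOR_PIN, GPIO.IN)

try:
    for _ in range(10):
        GPIO.output(LED_PIN, GPIO.HIGH)            # LED on
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)             # LED off
        time.sleep(0.5)
        print("sensor:", GPIO.input(SENSOR_PIN))   # read a digital sensor
finally:
    GPIO.cleanup()                                 # release the pins
```

In a remote lab, an autograder can observe the same pins from the outside and score whether the submitted program produced the required blink pattern and sensor reads.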
Influenza virus vRNPs: quantitative investigations via fluorescence cross-correlation spectroscopy
(2017)
We compare Visual Berrypicking, an interactive approach allowing users to explore large and highly faceted information spaces using similarity-based two-dimensional maps, with traditional browsing techniques. For large datasets, current projection methods used to generate map-like overviews suffer from increased computational costs and a loss of accuracy, resulting in inconsistent visualizations. We propose to interactively align inexpensive small maps, showing local neighborhoods only, which ideally creates the impression of panning a large map. For evaluation, we designed a web-based prototype for movie exploration and compared it to the web interface of The Movie Database (TMDb) in an online user study. Results suggest that users are able to effectively explore large movie collections by hopping from one neighborhood to the next. Additionally, due to the projection of movie similarities, interesting links between movies can be found more easily, and thus, compared to browsing, serendipitous discoveries are more likely.
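One plausible way to realize the map-alignment step (a sketch under assumptions, not necessarily the authors' procedure): project two overlapping neighborhoods independently with MDS, then remove the arbitrary rotation, reflection and scale between the two small maps via a Procrustes fit on their shared items:

```python
import numpy as np
from scipy.spatial import procrustes
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
items = rng.normal(size=(60, 8))           # stand-in item feature vectors

mds = MDS(n_components=2, random_state=0)  # cheap, local projections only
map_a = mds.fit_transform(items[:40])      # neighborhood A: items 0..39
map_b = mds.fit_transform(items[20:])      # neighborhood B: items 20..59

# Align on the 20 shared items (indices 20..39 of A == 0..19 of B).
_, b_aligned, disparity = procrustes(map_a[20:], map_b[:20])
print(f"disparity on shared items after alignment: {disparity:.3f}")
```

Chaining such pairwise alignments lets adjacent local maps share a common frame, which is what creates the impression of panning one large map.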
Structural health monitoring (SHM) activities are of primary importance for managing transport infrastructure; however, most SHM methodologies are based on point-based sensors that have limitations in terms of their spatial positioning requirements, cost of development and measurement range. This paper describes progress on the SENSKIN EC project, whose objective is to develop a dielectric-elastomer- and micro-electronics-based sensor, formed from a large, highly extensible capacitance-sensing membrane supported by advanced microelectronic circuitry, for monitoring transport infrastructure bridges. Such a sensor could provide spatial measurements of strain in excess of 10 %. The actual sensor, along with the data acquisition module, the communication module and the power electronics, is integrated into a compact unit, the SENSKIN device, which is energy-efficient, requires simple signal processing and is easy to install on various surface types. In terms of communication, SENSKIN devices interact with each other to form the SENSKIN system: a fully distributed and autonomous wireless sensor network that is able to self-monitor. The SENSKIN system utilizes Delay-/Disruption-Tolerant Networking technologies to ensure that the strain measurements are received by the base station even under extreme conditions where normal communications are disrupted. This paper describes the architecture of the SENSKIN system and the development and testing of the first SENSKIN prototype sensor, the data acquisition system, and the communication system.
As virtualization drives the automation of networking, the validation of security properties becomes more and more challenging, eventually ruling out manual inspection. While formal verification in Software Defined Networks is provided by comprehensive tools with high-speed reverification capabilities, such as NetPlumber, the presence of middlebox functionality like firewalls is not considered. Also, these tools lack the ability to handle dynamic protocol elements like IPv6 extension header chains. In this work, we provide suitable modeling abstractions to enable both the inclusion of firewalls and dynamic protocol elements. We exemplarily model the Linux ip6tables/netfilter packet filter and also provide abstractions for an application layer gateway. Finally, we present a prototype of our formal verification system FaVe.
Mixed-projection treemaps
(2017)
This paper presents a novel technique for combining 2D and 2.5D treemaps using multi-perspective views to leverage the advantages of both treemap types. It enables a new form of overview+detail visualization for tree-structured data and contributes new concepts for real-time rendering of and interaction with treemaps. The technique operates by tilting the graphical elements representing inner nodes using affine transformations and animated state transitions. We explain how to mix orthogonal and perspective projections within a single treemap. Finally, we show application examples that benefit from the reduced interaction overhead.
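To isolate the projection-mixing idea (a sketch, not the paper's renderer): one simple way to transition between orthogonal and perspective views of a node is to blend the two 4x4 projection matrices and keep the homogeneous divide. The matrices follow OpenGL conventions and all parameters are assumed values:

```python
import numpy as np

def perspective(fovy, aspect, near, far):
    """Standard OpenGL-style perspective projection matrix."""
    f = 1.0 / np.tan(fovy / 2.0)
    return np.array([[f / aspect, 0, 0, 0],
                     [0, f, 0, 0],
                     [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
                     [0, 0, -1, 0]])

def orthographic(half_w, half_h, near, far):
    """Standard symmetric orthographic projection matrix."""
    return np.array([[1 / half_w, 0, 0, 0],
                     [0, 1 / half_h, 0, 0],
                     [0, 0, -2 / (far - near), -(far + near) / (far - near)],
                     [0, 0, 0, 1]])

def project(P, p3):
    """Apply a 4x4 projection to a 3D point and perform the w-divide."""
    x, y, z, w = P @ np.append(p3, 1.0)
    return np.array([x, y, z]) / w

O = orthographic(1.0, 1.0, 0.1, 100.0)
P = perspective(np.radians(45), 1.0, 0.1, 100.0)
for t in (0.0, 0.5, 1.0):            # t animates from flat 2D to tilted 2.5D
    M = (1 - t) * O + t * P
    print(t, project(M, np.array([0.3, 0.2, -5.0])))
```

At t = 0 the node renders flat and orthographic; as t approaches 1 it picks up perspective foreshortening, giving a continuous animated state transition between the two treemap styles.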