The design of embedded systems is becoming increasingly complex, such that efficient system-level design methods are crucial. Recently, combined Answer Set Programming (ASP) and Quantifier-Free Integer Difference Logic (QF-IDL) solving has been shown to be a promising approach to system synthesis. However, this approach still has several restrictions limiting its applicability. In the paper at hand, we propose a novel ASP modulo Theories (ASPmT) system synthesis approach, which (i) supports more sophisticated system models, (ii) tightly integrates QF-IDL solving into ASP solving, and (iii) makes use of partial assignment checking. As a result, more realistic systems are considered, and an early exclusion of infeasible solutions improves the entire system synthesis.
Massive Open Online Courses (MOOCs) have left their mark on the face of education in recent years. At the Hasso Plattner Institute (HPI) in Potsdam, Germany, we are actively developing a MOOC platform, which provides our research with a plethora of e-learning topics, such as learning analytics, automated assessment, peer assessment, teamwork, online proctoring, and gamification. We run several instances of this platform. On openHPI, we provide our own courses from within the HPI context. Further instances are openSAP, openWHO, and mooc.HOUSE, the smallest of these platforms, targeting customers with a less extensive course portfolio. In 2013, we started to work on the gamification of our platform. By now, we have implemented about two thirds of the features that we had initially evaluated as useful for our purposes. About a year ago, we activated the implemented gamification features on mooc.HOUSE. Before activating the features on openHPI as well, we examined and re-evaluated our initial considerations based on the data we have collected so far and the changes in other contexts of our platforms.
Background and Aims: Ostarek et al. (2019) claimed a conclusive demonstration that language comprehension relies profoundly on visual simulations. They presented participants with visual noise during sentence-picture verification (SPV) and measured lateralized button response speed. The authors selectively eliminated the classical congruency effect (faster yes decisions when pictures match the objects implied by the sentences) with "high-level" noise made from images of other objects. However, that visual noise included tool pictures, known to activate lateralized motor affordances. Moreover, some of their sentences described motor actions. This raises the question of whether motor simulation may have contaminated their results.
Methods: Replicating Ostarek et al. (2019), 33 right-handed participants performed SPV either without visual noise or while viewing (a) only left-handled, (b) only right-handled, or (c) alternatingly left- and right-handled tools. Accuracy and reaction times of manual yes responses were analyzed. Additionally, the hand-relatedness of the sentences was rated.
Results: Replicating Ostarek et al. (2019), the classical SPV congruency effect appeared without noise and vanished when alternatingly handled tools were presented. Crucially, it reappeared when noise objects were consistently either left- or right-handled. Higher hand-relatedness of sentence content reduced SPV performance, and accuracy was lower with right-handled noise.
Conclusion: First, we demonstrated an interaction between motor-related language, visual affordances, and motor responses in SPV. This result supports the embodied view of language processing. Second, we identified a motor process not previously known in SPV. This extends our understanding of mental simulation and calls for methodological controls in future studies.
Eccentric exercises (ECC) induce reversible muscle damage, delayed-onset muscle soreness, and an inflammatory reaction that is often followed by a systemic anti-inflammatory response. Thus, ECC might be beneficial for the treatment of metabolic disorders, which are frequently accompanied by low-grade systemic inflammation. However, the extent and time course of the systemic immune response after repeated ECC bouts are poorly characterized.
PURPOSE: To analyze the (anti-)inflammatory response after repeated ECC loading of the trunk.
METHODS: Ten healthy participants (33 ± 6 y; 173 ± 14 cm; 74 ± 16 kg) performed three isokinetic strength measurements of the trunk (concentric (CON), ECC1, ECC2, each 2 wks apart; flexion/extension, velocity 60°/s, 120 s MVC). Pre- and 4, 24, 48, 72, and 168 h post-exercise, muscle soreness (numeric rating scale, NRS) was assessed, and blood samples were taken and analyzed [creatine kinase (CK), C-reactive protein (CRP), interleukin-6 (IL-6), IL-10, tumor necrosis factor-α (TNF-α)]. Statistics were done by Friedman's test with Dunn's post hoc test (α = .05).
RESULTS: Mean peak torque was higher during ECC1 (319 ± 142 Nm) than during CON (268 ± 108 Nm; p<.05) and not different between ECC1 and ECC2 (297 ± 126 Nm; p>.05). Markers of muscle damage (peaks post-ECC1: NRS 48h, 4.4±2.9; CK 72h, 14407 ± 19991 U/l) were higher after ECC1 than after CON and ECC2 (p<.05). The responses over 72h (stated as Area under the Curve, AUC) were abolished after ECC2 compared to ECC1 (p<.05) indicating the presence of the repeated bout effect. CRP levels were not changed. IL-6 levels increased 2-fold post-ECC1 (pre: 0.5 ± 0.4 vs. 72h: 1.0 ± 0.8 pg/ml). The IL-6 response was enhanced after ECC1 (AUC 61 ± 37 pg/ml*72h) compared to CON (AUC 33 ± 31 pg/ml*72h; p<.05). After ECC2, the IL-6 response (AUC 43 ± 25 pg/ml*72h) remained lower than post-ECC1, but the difference was not statistically significant. Serum levels of TNF-α and of the anti-inflammatory cytokine IL-10 were below detection limits. Overall, markers of muscle damage and immune response showed high inter-individual variability.
CONCLUSION: Despite maximal ECC loading of a large muscle group, no anti-inflammatory and only weak inflammatory responses were detected in healthy adults. Whether ECC elicits a different reaction in inflammatory clinical conditions remains unclear.
The ionospheric delay of global navigation satellite system (GNSS) signals is typically compensated by adding a single correction value to the pseudorange measurement of a GNSS receiver. Yet, this neglects the dispersive nature of the ionosphere. In this context, we analyze the ionospheric signal distortion beyond a constant delay. These effects grow with signal bandwidth and hence become more important for new broadband navigation signals. Using measurements of the Galileo E5 signal, captured with a high-gain antenna, we verify that the expected influence can indeed be observed and compensated. A new method to estimate the total electron content (TEC) from a single-frequency high-gain antenna measurement of a broadband GNSS signal is proposed and described in detail. The received signal is de facto unaffected by multipath and interference because of the narrow aperture angle of the antenna used, which should generally reduce these error sources. Notably, such measurements do not rely on code correlation as in standard receiver applications and are therefore also usable without knowledge of the signal coding. Results of the TEC estimation process are shown and discussed in comparison to common TEC products such as TEC maps and dual-frequency receiver estimates.
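For contrast with the single-frequency approach proposed above, the conventional dual-frequency TEC estimate can be sketched in a few lines. It exploits the fact that the ionospheric group delay scales as 40.3·TEC/f², so differencing two pseudoranges cancels the geometry. This is a minimal sketch, not the paper's method; the Galileo E1/E5a frequencies are standard values.

```python
# Conventional dual-frequency (geometry-free) slant TEC estimate.
# The ionospheric group delay on frequency f is 40.3 * TEC / f^2 (SI units),
# so differencing two pseudoranges on different frequencies isolates TEC.

F_E1 = 1575.42e6   # Galileo E1 carrier frequency [Hz]
F_E5A = 1176.45e6  # Galileo E5a carrier frequency [Hz]

def tec_from_dual_frequency(p1, p2, f1=F_E1, f2=F_E5A):
    """Slant TEC [electrons/m^2] from pseudoranges p1, p2 [m] with f1 > f2."""
    return (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) * (p2 - p1)
```

Dividing the result by 1e16 expresses it in the customary TEC units (TECU).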
Hulleman & Olivers' (H&O's) model introduces variation of the functional visual field (FVF) for explaining visual search behavior. Our research shows how the FVF can be studied using gaze-contingent displays and how FVF variation can be implemented in models of gaze control. Contrary to H&O, we believe that fixation duration is an important factor when modeling visual search behavior.
The Internet can be considered the most important infrastructure for modern society and businesses. A loss of Internet connectivity has strong negative financial impacts on businesses and economies. Therefore, assessing Internet connectivity, in particular beyond one's own premises and area of direct control, is of growing importance in the face of potential failures, accidents, and malicious attacks. This paper presents CORIA, a software framework for the easy analysis of connectivity risks based on large network graphs. It provides researchers, risk analysts, network managers, and security consultants with a tool to assess an organization's connectivity and path options through the Internet backbone, including a user-friendly and insightful visual representation of results. CORIA is flexibly extensible in terms of novel data sets, graph metrics, and risk scores that enable further use cases. The performance of CORIA is evaluated in several experiments on the Internet graph and further randomly generated networks.
Editorial
(2017)
Photonic sensing in highly concentrated biotechnical processes by photon density wave spectroscopy
(2017)
Photon Density Wave (PDW) spectroscopy is introduced as a new approach for photonic sensing in highly concentrated biotechnical processes. It independently quantifies the absorption and reduced scattering coefficients, calibration-free and as a function of time, thus describing the optical properties of the biomaterial in the vis/NIR range during processing. As examples of industrial relevance, enzymatic milk coagulation, beer mashing, and algae cultivation in photobioreactors are discussed.
Conclusion
(2016)
This chapter revisits the role of the new modes of governance in areas of limited statehood. First, it states that there is no linear relationship between degrees of statehood and the overall effectiveness of new modes of sustainability governance. Second, the chapter states that, in most of the cases, national governments are hesitant or even actively hamper the development of new modes of governance. Third, it shows that the absence of the shadow of hierarchy can indeed lead to ineffective new modes of governance. However, the shadow of hierarchy does not necessarily need to be cast by states. Finally, the author reviews the complexities involved in participatory practices, stressing the importance of institutional structures and knowledgeable brokers. The chapter concludes by outlining fields for future research.
This chapter investigates the trajectory of establishing the Forest Stewardship Council (FSC) in the early 1990s as the first private transnational certification organization with an antagonistic stakeholder body. Its main contribution is a micro-analysis of the founding assembly in 1993. By investigating the role of brokers within the negotiation as one institutional scope condition for ‘arguing’ having occurred, the chapter adopts a dramaturgical approach. It contends that the authority of brokers is not necessarily institutionally given, but needs to be gained: brokers have to prove situationally that their knowledge is relevant and that they are speaking impartially in the interest of progress rather than their own. The chapter stresses the importance of procedural knowledge which brokers provide in contrast to policy knowledge.
Introduction
(2016)
The Paris Agreement for Climate Change or the Sustainable Development Goals (SDGs) rely on new modes of governance for implementation. Indeed, new modes of governance such as market-based instruments, public-private partnerships or multi-stakeholder initiatives have been praised for playing a pivotal role in effective and legitimate sustainability governance. Yet, do they also deliver in areas of limited statehood? States such as Malaysia or the Dominican Republic partly lack the ability to implement and enforce rules; their statehood is limited. This introduction provides the analytical framework of this volume and critically examines the performance of new modes of governance in areas of limited statehood, drawing on the book’s in-depth case studies on issues of climate change, biodiversity, and health.
We investigated the possibility of identifying motor units (MUs) with high-density surface electromyography (HDEMG) across experimental sessions on different days. Ten subjects performed submaximal knee extensions in three sessions on separate days, one week apart, while EMG was recorded from the vastus medialis muscle with high-density electrode grids. The shapes of the MU action potentials (MUAPs) over multiple channels, extracted by HDEMG decomposition, were matched across sessions by cross-correlation. Forty and twenty percent of the decomposed MUs could be tracked across two and three sessions, respectively (average cross-correlation 0.85 ± 0.04). The estimated properties of the matched motor units were similar across sessions; for example, mean discharge rate and recruitment threshold were measured with an intra-class correlation coefficient (ICC) > 0.80. These results strongly suggest that the same MUs were indeed identified across sessions, which will allow monitoring changes in MU properties following interventions or during the progression of neuromuscular disorders.
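The template-matching step described above can be illustrated with a minimal single-channel sketch (the study matches waveforms over multiple channels; the function names, the greedy pairing, and the 0.8 threshold are illustrative assumptions, not the authors' exact procedure):

```python
import numpy as np

def xcorr_peak(a, b):
    """Peak of the normalized cross-correlation of two equal-length waveforms;
    1.0 means identical shape up to amplitude scaling and DC offset."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.correlate(a, b, mode="full").max())

def match_templates(session1, session2, threshold=0.8):
    """Greedily pair each MUAP template from session1 with its best
    correlate in session2, keeping pairs above the similarity threshold."""
    pairs = []
    for i, t1 in enumerate(session1):
        scores = [xcorr_peak(t1, t2) for t2 in session2]
        j = int(np.argmax(scores))
        if scores[j] >= threshold:
            pairs.append((i, j, scores[j]))
    return pairs
```

Allowing a lag (via the full cross-correlation) makes the match robust to small alignment differences between decompositions of different sessions.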
Structural health monitoring (SHM) activities are of primary importance for managing transport infrastructure; however, most SHM methodologies are based on point-based sensors that have limitations in terms of their spatial positioning requirements, cost of development, and measurement range. This paper describes progress on the SENSKIN EC project, whose objective is to develop a dielectric-elastomer and microelectronics-based sensor, formed from a large, highly extensible capacitance-sensing membrane supported by advanced microelectronic circuitry, for monitoring bridges of the transport infrastructure. Such a sensor could provide spatial measurements of strain in excess of 10%. The actual sensor, along with the data acquisition module, the communication module, and power electronics, is integrated into a compact unit, the SENSKIN device, which is energy-efficient, requires simple signal processing, and is easy to install on various surface types. In terms of communication, SENSKIN devices interact with each other to form the SENSKIN system: a fully distributed and autonomous wireless sensor network that is able to self-monitor. The SENSKIN system utilizes Delay-/Disruption-Tolerant Networking technologies to ensure that strain measurements are received by the base station even under extreme conditions where normal communications are disrupted. This paper describes the architecture of the SENSKIN system and the development and testing of the first SENSKIN prototype sensor, the data acquisition system, and the communication system.
Surface acoustic wave (SAW) devices are well known for gravimetric sensor applications. In biosensing applications, chemically and biochemically evoked adsorption processes at surfaces are detected in liquid environments using delay-line or resonator sensor configurations, preferably in combination with appropriate microfluidic devices. In this paper, a novel SAW-based impedance sensor type is introduced that uses only one interdigital transducer (IDT) simultaneously as SAW generator and sensor element. It is shown that the amplitude of the reflected S11 signal directly depends on the input impedance of the SAW device. The input impedance is strongly influenced by mass adsorption, which causes a characteristic and measurable impedance mismatch.
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to include the geoelectrical method in the measurement program for tracking the CO2 plume. Using the example of the Ketzin site, it can be seen that during all phases of the CO2 storage reservoir development, the resistivity measurements and their corresponding tomographic interpretation contribute beneficially to the measurement, monitoring, and verification (MMV) protocol. The most important impact of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
Water management tools are necessary to guarantee the preservation of natural resources while ensuring optimal utilization. Linear regression models are a simple and quick solution for creating prognostic capabilities. Multivariate models show higher precision than univariate models; in the case of Waiwera, implementing individual production rates is more accurate than applying just the total production rate. A maximum of approximately 1,075 m3/day can be pumped while ensuring a water level of at least 0.5 m a.s.l. in the monitoring well. The model should be renewed annually to incorporate new data and current water level trends and thus maintain its quality.
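A multivariate linear model of the kind described, relating the monitored water level to the production rates of individual wells, can be sketched with ordinary least squares. This is a generic illustration under assumed variable names, not the Waiwera model itself:

```python
import numpy as np

def fit_level_model(production_rates, water_level):
    """Ordinary least squares: water_level ≈ intercept + production_rates @ coeffs.
    production_rates: array of shape (n_days, n_wells); water_level: (n_days,)."""
    X = np.column_stack([np.ones(len(water_level)), production_rates])
    beta, *_ = np.linalg.lstsq(X, water_level, rcond=None)
    return beta  # [intercept, coef_well_1, ..., coef_well_k]

def predict_level(beta, rates):
    """Predicted water level for one day's per-well production rates."""
    return beta[0] + np.asarray(rates) @ beta[1:]
```

Refitting such a model annually, as the abstract recommends, simply means calling `fit_level_model` again on the extended data set.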
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary to couple chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization models to estimate Young's moduli for five synthetic microstructures and successfully validate our results for comparable geometries against the analytical Mori-Tanaka approach. Further, we demonstrate that considering specific rock microstructures is of paramount importance, since calculated elastic properties may deviate by up to 230% for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young's moduli improves significantly when detailed spatial information is employed.
Integration and development of the energy supply in China and worldwide is a challenge for the years to come. The innovative idea presented here is based on an extension of the “power-to-gas-to-power” technology by establishing a closed carbon cycle. It is an implementation of a low-carbon energy system based on carbon dioxide capture and storage (CCS) to store and reuse wind and solar energy. The Chenjiacun storage project in China compares well with the German case study for the towns Potsdam and Brandenburg/Havel in the Federal State of Brandenburg based on the Ketzin pilot site for CCS.
Dissolved CO2 storage in geological formations with low pressure, low risk and large capacities
(2017)
Geological CO2 storage is a mitigation technology to reduce CO2 emissions from fossil fuel combustion. However, major concerns are the pressure increase and saltwater displacement in the mainly targeted deep groundwater aquifers due to injection of supercritical CO2. The suggested solution is storage of CO2 exclusively in the dissolved state. In our exemplary regional case study of the North East German Basin based on a highly resolved temperature and pressure distribution model and a newly developed reactive transport coupling, we have quantified that 4.7 Gt of CO2 can be stored in solution compared to 1.5 Gt in the supercritical state.
Cost models play an important role in the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures dominate today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, all accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
This paper discusses a new approach for designing and deploying Security-as-a-Service (SecaaS) applications using cloud-native design patterns. Current SecaaS approaches do not efficiently handle the increasing threats to computer systems and applications. For example, requests for security assessments increase drastically after a high-risk security vulnerability is disclosed. In such scenarios, SecaaS applications are unable to scale dynamically to serve requests. A root cause of this challenge is the use of architectures not specifically fitted to cloud environments. Cloud-native design patterns resolve this challenge by enabling certain properties, e.g., massive scalability and resiliency, via the combination of microservice patterns and cloud-focused design patterns. However, adopting these patterns is a complex process, during which several security issues are introduced. In this work, we investigate these security issues, and we redesign and deploy a monolithic SecaaS application using cloud-native design patterns while considering appropriate, layered security countermeasures, i.e., at the application and cloud networking layers. Our prototype implementation outperforms traditional, monolithic applications with an average Scanner Time of 6 minutes, without compromising security. Our approach can be employed for designing secure, scalable, and performant SecaaS applications that effectively handle unexpected increases in security assessment requests.
Mixed-projection treemaps
(2017)
This paper presents a novel technique for combining 2D and 2.5D treemaps using multi-perspective views to leverage the advantages of both treemap types. It enables a new form of overview+detail visualization for tree-structured data and contributes new concepts for real-time rendering of and interaction with treemaps. The technique operates by tilting the graphical elements representing inner nodes using affine transformations and animated state transitions. We explain how to mix orthogonal and perspective projections within a single treemap. Finally, we show application examples that benefit from the reduced interaction overhead.
The maximum entropy method is used to predict flows in water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016, Journal of Hydraulic Engineering, ASCE) by using a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters over which the entropy is defined ensures consistency between different representations of the same network. The performance of the proposed reduced-parameter method is demonstrated with a one-loop network case study.
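The continuous relative-entropy formulation referred to above can be sketched in its generic form; the notation below (reduced flow-rate parameters Q, prior density m, moment constraints g_k) is illustrative and not necessarily the paper's own:

```latex
% Generic continuous relative-entropy (MaxEnt) program over the pdf p of the
% reduced flow-rate parameters \mathbf{Q}, with prior density m:
\max_{p}\; H[p] \;=\; -\int p(\mathbf{Q})\,
    \ln\!\frac{p(\mathbf{Q})}{m(\mathbf{Q})}\,\mathrm{d}\mathbf{Q}
% subject to normalization and moment constraints from the network physics:
\text{s.t.}\quad \int p(\mathbf{Q})\,\mathrm{d}\mathbf{Q} = 1, \qquad
\int g_k(\mathbf{Q})\,p(\mathbf{Q})\,\mathrm{d}\mathbf{Q} = \langle g_k\rangle,
\quad k = 1,\dots,K.
```

Defining the entropy only over the reduced parameter set means the same program is obtained for any equivalent representation of the network, which is the consistency property the abstract emphasizes.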
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods that assign the discrete values of a maximum entropy distribution to equal the traffic flow rates. The proposed method, however, uses a distribution to represent each flow rate. It is shown to handle uncertainty in a more elegant way and to give similar results to traditional methods. It can incorporate more of the observed data through the entropy function, prior distribution, and integration limits, potentially allowing better inferences to be made.
Lately, first implementations of Internet of Things (IoT) technologies have been penetrating industrial value-adding processes. As a result, the competence requirements for employees are changing. Employees' organization, process, and interaction competences are of crucial importance in this new IoT environment; however, they are not yet sufficiently considered in study programs and vocational training. At the same time, conventional learning factories are evolving and transforming into digital learning factories. Nevertheless, the integration of IoT technology and its usage for training in digital learning factories has been largely neglected thus far. Existing learning factories do not explicitly and properly consider IoT technology, which leads to deficiencies regarding an appropriate development of employees' Industrial IoT competences. The goal of this contribution is to present a didactic concept that enables the development and training of these newly demanded competences using an IoT laboratory. For this purpose, a design science approach is applied. The result of this contribution is a didactic concept for the development of Industrial IoT competences in an IoT laboratory.
In this extended abstract, we analyze the current challenges for the envisioned Self-Adaptive CPS. In addition, we outline our results in approaching these challenges with SMARTSOS [10], a generic approach based on extensions of graph transformation systems that employs open and adaptive collaborations and models at runtime for trustworthy self-adaptation, self-organization, and evolution of the individual systems and of the system-of-systems level, taking the independent development, operation, management, and evolution of these systems into account.
E-commerce marketplaces are highly dynamic, with constant competition. While this competition is challenging for many merchants, it also provides plenty of opportunities, e.g., by allowing them to automatically adjust prices in order to react to changing market situations. For practitioners, however, testing automated pricing strategies is time-consuming and potentially hazardous when done in production. Researchers, on the other hand, struggle to study how pricing strategies interact under heavy competition. As a consequence, we built an open continuous-time framework to simulate dynamic pricing competition called Price Wars. The microservice-based architecture provides a scalable platform for large competitions with dozens of merchants and a large random stream of consumers. Our platform stores each event in a distributed log. This allows us to provide different performance measures, enabling users to compare profit and revenue of various repricing strategies in real time. For researchers, price trajectories are shown, which eases evaluating mutual price reactions of competing strategies. Furthermore, merchants can access historical marketplace data and apply machine learning. By providing a set of customizable, artificial merchants, users can easily simulate both simple rule-based strategies and sophisticated data-driven strategies using demand learning to optimize their pricing.
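A simple rule-based repricing strategy of the kind such a simulation platform hosts can be sketched in a few lines. The rule, parameter names, and default values below are hypothetical illustrations, not strategies shipped with Price Wars:

```python
def undercut_price(own_cost, competitor_prices, margin_floor=0.05,
                   step=0.01, default=10.0):
    """Rule-based repricing: undercut the cheapest competitor by `step`,
    but never price below cost plus a minimum relative margin."""
    floor = own_cost * (1.0 + margin_floor)
    if not competitor_prices:
        return default  # no competition: fall back to a default list price
    candidate = min(competitor_prices) - step
    return max(candidate, floor)
```

On the platform, a merchant would re-run such a rule on every marketplace event; data-driven strategies replace the fixed rule with a learned demand model.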
We compare Visual Berrypicking, an interactive approach that allows users to explore large and highly faceted information spaces using similarity-based two-dimensional maps, with traditional browsing techniques. For large datasets, current projection methods used to generate map-like overviews suffer from increased computational costs and a loss of accuracy, resulting in inconsistent visualizations. We propose to interactively align inexpensive small maps, showing local neighborhoods only, which ideally creates the impression of panning a large map. For evaluation, we designed a web-based prototype for movie exploration and compared it to the web interface of The Movie Database (TMDb) in an online user study. Results suggest that users are able to effectively explore large movie collections by hopping from one neighborhood to the next. Additionally, due to the projection of movie similarities, interesting links between movies can be found more easily, making serendipitous discoveries more likely than with browsing.
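One way to align a newly projected local map with the one already on screen is an orthogonal Procrustes fit over the items the two maps share. This is a sketch of that idea under assumed names; the paper does not state that this exact method is used:

```python
import numpy as np

def align_map(src, dst):
    """Orthogonal Procrustes: rotation r and translation t such that
    src @ r + t best matches dst in the least-squares sense, computed
    from corresponding anchor points shared by both 2D maps."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    u, _, vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    r = u @ vt
    t = mu_d - mu_s @ r
    return r, t
```

Applying `new_points @ r + t` then places the new neighborhood in the coordinates of the visible map, so successive maps appear as continuous panning rather than independent layouts.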
We develop a simple two-zone interpretation of the broadband baseline Crab nebula spectrum between 10^-5 eV and ~100 TeV using two distinct log-parabola energetic electron distributions. We determine analytically the very-high-energy photon spectrum as originating from inverse-Compton scattering of the far-infrared soft ambient photons within the nebula off a first population of electrons energized at the nebula termination shock. The broad and flat 200 GeV peak jointly observed by Fermi/LAT and MAGIC is naturally reproduced. The synchrotron radiation from a second energetic electron population explains the spectrum from the radio range up to ~10 keV. We infer from observations the energy dependence of the microscopic probability that the accelerating electrons remain in proximity of the shock.
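For reference, a log-parabola particle distribution of the kind invoked above is commonly written as follows; the symbols are generic (reference energy E_0, slope a, curvature b) and not necessarily the authors' notation:

```latex
% Log-parabola energetic-electron distribution (generic form):
N(E) \;\propto\; \left(\frac{E}{E_0}\right)^{-\left[\,a \,+\, b\,\log_{10}(E/E_0)\,\right]}
```

The curvature term b makes the spectral index steepen smoothly with energy, which is what lets two such populations reproduce both the flat inverse-Compton peak and the broadband synchrotron component.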
Web-based E-Learning uses Internet technologies and digital media to deliver educational content to learners. In recent years, many universities have applied their capacity to producing Massive Open Online Courses (MOOCs). They have been offering MOOCs with the expectation of rendering a comprehensive online apprenticeship. Typically, an online content delivery process requires an Internet connection. However, broadband access has never been a readily available resource in many regions. In Africa, poor or absent networks are still predominantly experienced by Internet users, who frequently go offline whenever a digital device disconnects from a network. As a result, learning processes in such regions are often disrupted, delayed, or terminated. This paper raises the concern of E-Learning over poor and low bandwidths and highlights the need for an Offline-Enabled mode. The paper also explores technical approaches aimed at enhancing the user experience in Web-based E-Learning, particularly in Africa.
Recently, Kocyan & Wiland-Szymańska (2016) published a thorough research article on one of the outstanding members of the family Hypoxidaceae on the Seychelles, which resulted in the establishment of a new genus (Friedmannia Kocyan & Wiland-Szymańska 2016: 60) to accommodate the former Curculigo seychellensis Bojer ex Baker (1877: 368). However, it has turned out that the name Friedmannia Chantanachat & Bold (1962: 45) already exists in the literature for a green alga, which renders the new hypoxid genus illegitimate (Melbourne Code; McNeill et al. 2012). Therefore, we assign a new generic name to Curculigo seychellensis.
Editorial
(2017)
Background: Infliximab (IFX), an anti-TNF monoclonal antibody approved for the treatment of inflammatory bowel disease, is dosed per kg body weight (BW). However, the rationale for body size adjustment has not been unequivocally demonstrated [1], and first attempts to improve IFX therapy have been undertaken [2]. The aim of our study was to assess the impact of different dosing strategies (i.e. body size-adjusted and fixed dosing) on drug exposure and pharmacokinetic (PK) target attainment. For this purpose, a comprehensive simulation study was performed, using patient characteristics (n=116) from an in-house clinical database.
Methods: IFX concentration-time profiles of 1000 virtual, clinically representative patients were generated using a previously published PK model for IFX in patients with Crohn's disease [3]. For each patient, 1000 profiles accounting for PK variability were considered. The IFX exposure during maintenance treatment was compared for the following dosing strategies: (i) fixed dose, and dosing per (ii) BW, (iii) lean BW (LBW), (iv) body surface area (BSA), (v) height (HT), (vi) body mass index (BMI), and (vii) fat-free mass (FFM). For each dosing strategy, the variability in maximum concentration Cmax, minimum concentration Cmin (= C8weeks), and area under the concentration-time curve (AUC), as well as the percentage of patients achieving the PK target (Cmin = 3 μg/mL [4]), were assessed.
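The per-patient dose under a few of the compared strategies can be illustrated as follows. The fixed dose of 400 mg, the 140 mg/m² BSA rate, and the choice of the Du Bois formula for BSA are illustrative assumptions (the abstract does not state which values or formula were used); 5 mg/kg is the approved IFX maintenance dose:

```python
def dose_fixed(dose_mg=400.0):
    """Fixed dosing: every patient receives the same amount (hypothetical 400 mg)."""
    return dose_mg

def dose_per_bw(weight_kg, mg_per_kg=5.0):
    """Body-weight-adjusted dosing (5 mg/kg, the approved IFX maintenance dose)."""
    return mg_per_kg * weight_kg

def dose_per_bsa(weight_kg, height_cm, mg_per_m2=140.0):
    """BSA-adjusted dosing; Du Bois formula assumed for body surface area,
    and 140 mg/m^2 is an illustrative, not a clinical, rate."""
    bsa_m2 = 0.007184 * weight_kg**0.425 * height_cm**0.725
    return mg_per_m2 * bsa_m2
```

Feeding such per-strategy doses into a population PK model is what yields the simulated Cmax, Cmin, and AUC distributions compared in the study.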
Results: For all dosing strategies the variability of Cmin (CV ≈110%) was highest, compared to Cmax and AUC, and was of similar extent regardless of dosing strategy. The proportion of patients reaching the PK target (≈⅓) was approximately equal for all dosing strategies.
Gaussianity Fair
(2017)
Eighteen scientists met at Jurata, Poland, to discuss various aspects of the transition from adolescence to adulthood. This transition is a delicate period shaped by complex interactions between adolescents and the social group they belong to. Social identity, group identification and identity signalling, but also stress affecting basal salivary cortisol rhythms, hypertension, inappropriate nutrition causing latent and manifest obesity, and, in developing and under-developed countries, parasitosis causing anaemia and thereby impairing growth and development, are issues to be dealt with during this period of human development. In addition, some new aspects of the association between weight, height and head circumference in newborns were discussed, as well as intrauterine head growth and head circumference as health risk indicators.
Root infinitives on Twitter
(2017)
Background: Evidence that home telemonitoring (HTM) for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health economic advantage. Therefore, the CardioBBEAT trial was designed to prospectively assess the health economic impact of a dedicated home monitoring system for patients with CHF based on actual costs directly obtained from patients’ health care providers.
Methods: Between January 2010 and June 2013, 621 patients (mean age 63.0 ± 11.5 years, 88% male) with a confirmed diagnosis of CHF (LVEF ≤ 40%) were enrolled and randomly assigned to two study groups comprising usual care with and without an interactive bi-directional HTM (Motiva®). The primary endpoint was the Incremental Cost-Effectiveness Ratio (ICER) established by the groups’ difference in total cost and in the combined clinical endpoint “days alive and not in hospital nor inpatient care per potential days in study” within the follow-up of 12 months. Secondary outcome measures were total mortality and health-related quality of life (SF-36, WHO-5 and KCCQ).
Results: In the intention-to-treat analysis, total mortality (HR 0.81; 95% CI 0.45 – 1.45) and days alive and not in hospital (343.3 ± 55.4 vs. 347.2 ± 43.9; p = 0.909) were not significantly different between HTM and usual care. While the resulting primary endpoint ICER was not positive (-181.9; 95% CI −1626.2 ± 1628.9), quality of life assessed by SF-36, WHO-5 and KCCQ as a secondary endpoint was significantly higher in the HTM group at 6 and 12 months of follow-up.
Conclusions: The first simultaneous assessment of clinical and economic outcome of HTM in patients with CHF did not demonstrate superior incremental cost effectiveness compared to usual care. On the other hand, quality of life was improved. It remains open whether the tested HTM solution represents a useful innovative approach in the recent health care setting.
Preclinical assessment of penetration not only in intact, but also in barrier-disrupted skin is important to explore the surplus value of novel drug delivery systems, which can be specifically designed for diseased skin. Here, we characterized physical and chemical barrier disruption protocols for short-term ex vivo skin cultures with regard to structural integrity, physiological and biological parameters. Further, we compared the penetration of dexamethasone (Dex) in different nanoparticle-based formulations in stratum corneum, epidermis and dermis extracts of intact vs. barrier-disrupted skin as well as by dermal microdialysis at 6, 12 and 24 hours after topical application. Dex was quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simultaneously, we investigated the Dex efficacy by interleukin (IL) analysis. Tape-stripping (TS) and 4-hour exposure to 5% sodium lauryl sulfate (SLS) were identified as highly effective barrier disruption methods, assessed by reproducible transepidermal water loss (TEWL) changes and IL-6/8 increase, which was more pronounced in SLS-treated skin. The barrier state also has a significant impact on the Dex penetration kinetics: for all formulations, TS highly increased dermal Dex concentration, despite the fact that nanocrystals quickly and effectively penetrated both intact and barrier-disrupted skin, reaching significantly higher dermal Dex concentration after 6 hours compared to Dex cream. The surplus value of encapsulation in ethyl cellulose nanocarriers could mostly be observed when applied on intact skin, in general showing a delayed Dex penetration. Estimation of cytokines was limited due to the trauma caused by probe insertion. In summary, ex vivo human skin is a highly interesting short-term preclinical model for the analysis of penetration and efficacy of novel drug delivery systems.
Emergency Care in Germany Being Re-assessed: Hybrid Medical Care Model Seen as Potential Answer
(2017)
Nanocarriers
(2017)
As a potentially toxic agent on nervous system and bone, the safety of aluminium exposure from adjuvants in vaccines and subcutaneous immune therapy (SCIT) products has to be continuously reevaluated, especially regarding concomitant administrations. For this purpose, knowledge on absorption and disposition of aluminium in plasma and tissues is essential. Pharmacokinetic data after vaccination in humans, however, are not available, and for methodological and ethical reasons difficult to obtain. To overcome these limitations, we discuss the possibility of an in vitro-in silico approach combining a toxicokinetic model for aluminium disposition with biorelevant kinetic absorption parameters from adjuvants. We critically review available kinetic aluminium-26 data for model building and, on the basis of a reparameterized toxicokinetic model (Nolte et al., 2001), we identify main modelling gaps. The potential of in vitro dissolution experiments for the prediction of intramuscular absorption kinetics of aluminium after vaccination is explored. It becomes apparent that there is need for detailed in vitro dissolution and in vivo absorption data to establish an in vitro-in vivo correlation (IVIVC) for aluminium adjuvants. We conclude that a combination of new experimental data and further refinement of the Nolte model has the potential to fill a gap in aluminium risk assessment. (C) 2017 Elsevier Inc. All rights reserved.
Recently, a multitude of empirically derived damage models have been applied to project future tropical cyclone (TC) losses for the United States. In the original study (Geiger et al 2016 Environ. Res. Lett. 11 084012), we compared two approaches that differ in the scaling of losses with socio-economic drivers: the commonly-used approach resulting in a sub-linear scaling of historical TC losses with a nation's affected gross domestic product (GDP), and the disentangled approach that shows a sub-linear increase with affected population and a super-linear scaling of relative losses with per capita income. Statistics cannot determine which approach is preferable, but since process understanding demands that there is a dependence of the loss on both GDP per capita and population, an approach that accounts for both separately is preferable to one which assumes a specific relation between the two dependencies. In the accompanying comment, Rybski et al argued that there is no rigorous evidence to reach the conclusion that high income does not protect against hurricane losses. Here we affirm that our conclusion is drawn correctly and reply to further remarks raised in the comment, highlighting the adequateness of our approach but also the potential for future extension of our research.
Over the past few years, studying abroad and other international educational experiences have become increasingly highly regarded. Nevertheless, research shows that only a minority of students actually take part in academic mobility programs. But what is it that distinguishes those students who take up these international opportunities from those who do not? In this study we reviewed recent quantitative studies on why (primarily German) students choose to travel abroad or not. This revealed a pattern of predictive factors. These indicate the key role played by students’ personal and social background, as well as previous international travel and the course of studies they are enrolled in. The study then focuses on teaching students. Both facilitating and debilitating factors are discussed and included in a model illustrating the decision-making process these students use. Finally, we discuss the practical implications for ways in which international, study-related travel might be increased in the future. We suggest that higher education institutions analyze individual student characteristics, offering differentiated programs to better meet the needs of different groups, thus raising the likelihood of disadvantaged students participating in academic international travel.
In the course of patient treatments, psychotherapists aim to meet the challenges of being both a trusted, knowledgeable conversation partner and a diligent documentalist. We are developing the digital whiteboard system Tele-Board MED (TBM), which allows the therapist to take digital notes during the session together with the patient. This study investigates what therapists are experiencing when they document with TBM in patient sessions for the first time and whether this documentation saves them time when writing official clinical documents. As the core of this study, we conducted four anamnesis session dialogues with behavior psychotherapists and volunteers acting in the role of patients. Following a mixed-method approach, the data collection and analysis involved self-reported emotion samples, user experience curves and questionnaires. We found that even in the very first patient session with TBM, therapists come to feel comfortable, develop a positive feeling and can concentrate on the patient. Regarding administrative documentation tasks, we found that with the TBM report generation feature, therapists save 60% of the time they normally spend on writing case reports for health insurance companies.
The globally distributed sperm whale (Physeter macrocephalus) has a partly matrilineal social structure with predominant male dispersal. At the beginning of 2016, a total of 30 male sperm whales stranded in five different countries bordering the southern North Sea. It has been postulated that these individuals were on a migration route from the north to warmer temperate and tropical waters where females live in social groups. By including samples from four countries (n = 27), this event provided a unique chance to genetically investigate the maternal relatedness and the putative origin of these temporally and spatially co-occurring male sperm whales. To utilize existing genetic resources, we sequenced 422 bp of the mitochondrial control region, a molecular marker for which sperm whale data are readily available from the entire distribution range. Based on four single nucleotide polymorphisms (SNPs) within the mitochondrial control region, five matrilines could be distinguished within the stranded specimens, four of which matched published haplotypes previously described in the Atlantic. Among these male sperm whales, multiple matrilineal lineages co-occur. We analyzed the population differentiation and could show that the genetic diversity of these male sperm whales is comparable to the genetic diversity in sperm whales from the entire Atlantic Ocean. We confirm that within this stranding event, males do not comprise maternally related individuals and apparently include assemblages of individuals from different geographic regions. (c) 2017 Deutsche Gesellschaft fur Saugetierkunde. Published by Elsevier GmbH. All rights reserved.
Utilizing quad-trees for efficient design space exploration with partial assignment evaluation
(2018)
Recently, it has been shown that constraint-based symbolic solving techniques offer an efficient way for deciding binding and routing options in order to obtain a feasible system-level implementation. In combination with various background theories, a feasibility analysis of the resulting system may already be performed on partial solutions. That is, infeasible subsets of mapping and routing options can be pruned early in the decision process, which speeds up the solving accordingly. However, allowing a proper design space exploration including multi-objective optimization also requires an efficient structure for storing and managing non-dominated solutions. In this work, we propose and study the usage of the Quad-Tree data structure in the context of partial assignment evaluation during system synthesis. Our experiments show that unnecessary dominance checks can be avoided, which indicates a preference for Quad-Trees over a commonly used list-based implementation for large combinatorial optimization problems.
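The core operation behind both the list-based and the quad-tree archive is the Pareto dominance check. A minimal list-based sketch (illustrative only; the paper's quad-tree indexes points by dominance quadrant so that only a few subtrees need to be inspected per insertion, instead of every archive entry as below):

```python
def dominates(a, b):
    """True if point a Pareto-dominates b (minimization in every objective)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def archive_insert(archive, candidate):
    """Insert a candidate into a non-dominated archive (list-based baseline).

    A quad-tree would organize the archive by dominance quadrants relative
    to a root point, pruning most of these pairwise checks."""
    if any(dominates(p, candidate) for p in archive):
        return archive  # candidate is dominated, discard it
    # keep only entries the candidate does not dominate
    return [p for p in archive if not dominates(candidate, p)] + [candidate]

archive = []
for point in [(3, 4), (2, 5), (1, 6), (2, 3), (5, 5)]:
    archive = archive_insert(archive, point)
# (2, 3) eliminates (3, 4), (2, 5) and (5, 5); (1, 6) stays non-dominated
```

Each insertion against a flat list costs one dominance check per stored solution, which is exactly the overhead the quad-tree structure is meant to avoid.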
The classification of vulnerabilities is a fundamental step to derive formal attributes that allow a deeper analysis. It is therefore required that this classification be performed in a timely and accurate manner. Since the current situation demands manual interaction in the classification process, timely processing becomes a serious issue. Thus, we propose an automated alternative to manual classification, because the number of identified vulnerabilities per day can no longer be processed manually. We implemented two different approaches that are able to automatically classify vulnerabilities based on the vulnerability description. We evaluated our approaches, which use Neural Networks and the Naive Bayes method respectively, on the basis of publicly known vulnerabilities.
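The Naive Bayes variant can be sketched as multinomial text classification over the description's words. The toy corpus, class labels, and function names below are illustrative assumptions, not the paper's actual data or implementation:

```python
import math
from collections import Counter, defaultdict

# Toy training data: (description, class) pairs — illustrative only.
train = [
    ("buffer overflow in parser allows code execution", "overflow"),
    ("stack overflow via long input string", "overflow"),
    ("reflected cross site scripting in search field", "xss"),
    ("stored cross site scripting in comment form", "xss"),
]

def fit(samples):
    """Count class frequencies and per-class word frequencies."""
    class_counts = Counter(label for _, label in samples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in samples:
        for word in text.split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def predict(text, model):
    """Pick the class maximizing log prior + sum of log word likelihoods."""
    class_counts, word_counts, vocab = model
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, count in class_counts.items():
        lp = math.log(count / total)  # class prior
        n = sum(word_counts[label].values())
        for word in text.split():
            # Laplace smoothing so unseen words do not zero the probability
            lp += math.log((word_counts[label][word] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = fit(train)
print(predict("heap overflow in image parser", model))  # → overflow
```

In practice a library classifier with TF-IDF features would replace this hand-rolled version; the sketch only shows why word statistics of the description suffice to separate vulnerability classes.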
Preface
(2018)
This book aims at understanding the diversity of planetary and lunar magnetic fields and their interaction with the solar wind. A synergistic interdisciplinary approach combines newly developed tools for data acquisition and analysis, computer simulations of planetary interiors and dynamos, models of solar wind interaction, measurement of terrestrial rocks and meteorites, and laboratory investigations. The following chapters represent a selection of some of the scientific findings derived by the 22 projects within the DFG Priority Program "Planetary Magnetism" (PlanetMag). This introductory chapter gives an overview of the individual following chapters, highlighting their role in the overall goals of the PlanetMag framework. The diversity of the different contributions reflects the wide range of magnetic phenomena in our solar system. From the program we have excluded magnetism of the sun, which is an independent broad research discipline, but include the interaction of the solar wind with planets and moons. Within the subsequent 13 chapters of this book, the authors review the field centered on their research topic within PlanetMag. Here we briefly introduce the content of all the subsequent chapters and outline the context in which they should be seen.
Foreword
(2018)
In Europe, different countries developed a rich variety of sub-municipal institutions. Out of the plethora of intra- and sub-municipal decentralization forms (reaching from local outposts of city administration to “quasi-federal” structures), this book focuses on territorial sub-municipal units (SMUs) which combine multipurpose territorial responsibility with democratic legitimacy and can be seen as institutions promoting the articulation and realization of collective choices at a sub-municipal level.
Country chapters follow a common pattern that facilitates systematic comparisons, while at the same time leaving enough space for national peculiarities and priorities chosen and highlighted by the authors, who also draw on existing empirical surveys and case studies where available.
Business process simulation is an important means for quantitative analysis of a business process and for comparing different process alternatives. With the Business Process Model and Notation (BPMN) being the state-of-the-art language for the graphical representation of business processes, many existing process simulators already support the simulation of BPMN diagrams. However, they do not provide well-defined interfaces to integrate new concepts into the simulation environment. In this work, we present the design and architecture of a proof-of-concept implementation of an open and extensible BPMN process simulator. It also supports the simulation of multiple BPMN processes at a time and relies on the building blocks of the well-founded discrete event simulation. The extensibility is assured by a plug-in concept. Its feasibility is demonstrated by extensions supporting new BPMN concepts, such as the simulation of business rule activities referencing decision models and batch activities.
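The combination of a discrete-event core with a plug-in concept can be sketched in a few lines: a time-ordered event queue drives the simulation, and registered plug-ins observe every event. All names here are illustrative assumptions, not the simulator's actual API:

```python
import heapq

class Simulator:
    """Minimal discrete-event core with a plug-in hook (illustrative sketch)."""

    def __init__(self):
        self.clock = 0.0
        self.queue = []    # entries: (time, seq, event name); seq breaks ties
        self.plugins = []
        self.seq = 0

    def schedule(self, delay, name):
        """Put a future event on the time-ordered queue."""
        heapq.heappush(self.queue, (self.clock + delay, self.seq, name))
        self.seq += 1

    def register(self, plugin):
        """Plug-ins are callables invoked for every processed event."""
        self.plugins.append(plugin)

    def run(self):
        """Classic DES loop: pop the earliest event, advance the clock."""
        while self.queue:
            self.clock, _, name = heapq.heappop(self.queue)
            for plugin in self.plugins:
                plugin(self.clock, name)

log = []
sim = Simulator()
sim.register(lambda t, name: log.append((t, name)))  # a trivial logging plug-in
sim.schedule(5.0, "end activity A")
sim.schedule(2.0, "start activity A")
sim.run()
print(log)  # events fire in simulation-time order, start before end
```

A BPMN extension such as a batch activity would be a plug-in that reacts to its events by scheduling further ones, without touching the core loop.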
In university teaching today, it is common practice to record regular lectures and special events such as conferences and speeches. With these recordings, a large collection of video teaching material can be created quickly and easily. Typically, lectures have a length of about one and a half hours and usually take place once or twice a week, depending on the credit hours. Depending on the number of lectures and other events recorded, the number of recordings available increases rapidly, which means that an appropriate form of provisioning is essential for the students. This is usually done in the form of lecture video platforms. In this work, we have investigated how lecture video platforms and the knowledge they contain can be improved and accessed more easily by an increasing number of students. We came up with a multistep process that we have applied to our own lecture video web portal and that can be applied to other solutions as well.
Embedded smart home — remote lab MOOC with optional real hardware experience for over 4000 students
(2018)
MOOCs (Massive Open Online Courses) are becoming more and more popular for learners of all ages to study further or to learn new subjects of interest. The purpose of this paper is to introduce a different MOOC course style. Typically, video content is shown teaching the student new information. After watching a video, self-test questions can be answered. Finally, the student completes weekly and final exams, which follow the style of the self-test questions. Based on the points scored in the weekly and final exams, a certificate can be issued. Our approach extends the possibility to receive points for the final score with practical programming exercises on real hardware. It allows the student to do embedded programming by communicating over GPIO pins to control LEDs and measure sensor values. Additionally, they can visualize values on an embedded display using web technologies, which are an essential part of embedded and smart home devices to communicate with common APIs. Students have the opportunity to solve all tasks within the online remote lab and at home on the same kind of hardware. The evaluation of this MOOC indicates that the design helps students learn an engineering technique with new technological approaches in an appropriate, modern, supportive and motivating way.
When students watch learning videos online, they usually need to watch several hours of video content. In the end, not every minute of a video is relevant for the exam. Additionally, students need to add notes to clarify issues of a lecture. There are several possibilities to enhance the metadata of a video; e.g., a typical way to add user-specific information to an online video is a comment functionality, which allows users to share their thoughts and questions with the public. In contrast to common video material which can be found online, lecture videos are used for exam preparation. This difference motivates annotating lecture videos with markers and personal notes for a better understanding of the taught content. In particular, students preparing for an exam use their notes to refresh their memories. To ease this learning method with lecture videos, we introduce the annotation feature in our video lecture archive. This functionality supports the students in keeping track of their thoughts by providing an intuitive interface to easily add, modify or remove their ideas. This annotation function is integrated into the video player. Hence, scrolling to a separate annotation area on the website is not necessary. Furthermore, the annotated notes can be exported together with the slide content to a PDF file, which can then be printed easily. Lecture video annotations support and motivate students to learn and watch videos from an E-Learning video archive.
An efficient Design Space Exploration (DSE) is imperative for the design of modern, highly complex embedded systems in order to steer the development towards optimal design points. The early evaluation of design decisions at system-level abstraction layer helps to find promising regions for subsequent development steps in lower abstraction levels by diminishing the complexity of the search problem. In recent works, symbolic techniques, especially Answer Set Programming (ASP) modulo Theories (ASPmT), have been shown to find feasible solutions of highly complex system-level synthesis problems with non-linear constraints very efficiently. In this paper, we present a novel approach to a holistic system-level DSE based on ASPmT. To this end, we include additional background theories that concurrently guarantee compliance with hard constraints and perform the simultaneous optimization of several design objectives. We implement and compare our approach with a state-of-the-art preference handling framework for ASP. Experimental results indicate that our proposed method produces better solutions with respect to both diversity and convergence to the true Pareto front.
Operational decisions in business processes can be modeled by using the Decision Model and Notation (DMN). The complementary use of DMN for decision modeling and of the Business Process Model and Notation (BPMN) for process design realizes the separation of concerns principle. For supporting separation of concerns during the design phase, it is crucial to understand which aspects of decision-making enclosed in a process model should be captured by a dedicated decision model. Whereas existing work focuses on the extraction of decision models from process control flow, the connection of process-related data and decision models is still unexplored. In this paper, we investigate how process-related data used for making decisions can be represented in process models and we distinguish a set of BPMN patterns capturing such information. Then, we provide a formal mapping of the identified BPMN patterns to corresponding DMN models and apply our approach to a real-world healthcare process.
Modern server systems with large NUMA architectures necessitate (i) data being distributed over the available computing nodes and (ii) NUMA-aware query processing to enable effective parallel processing in database systems. As these architectures incur significant latency and throughput penalties for accessing non-local data, queries should be executed as close as possible to the data. To further increase both performance and efficiency, data that is not relevant for the query result should be skipped as early as possible. One way to achieve this goal is horizontal partitioning to improve static partition pruning. As part of our ongoing work on workload-driven partitioning, we have implemented a recent approach called aggressive data skipping and extended it to handle both analytical as well as transactional access patterns. In this paper, we evaluate this approach with the workload and data of a production enterprise system of a Global 2000 company. The results show that over 80% of all tuples can be skipped on average while the resulting partitioning schemata are surprisingly stable over time.
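The mechanics of static partition pruning can be illustrated with per-partition min/max statistics: a partition whose value range cannot intersect the query predicate is skipped without touching a single tuple. This is a simplified stand-in for the aggressive-data-skipping approach, with illustrative data and names:

```python
# Each partition carries min/max statistics over the partitioning column.
partitions = [
    {"min": 0,   "max": 99,  "rows": [5, 42, 99]},
    {"min": 100, "max": 199, "rows": [100, 150]},
    {"min": 200, "max": 299, "rows": [250, 299]},
]

def scan(partitions, lo, hi):
    """Return rows matching lo <= value <= hi, pruning partitions whose
    [min, max] range cannot intersect the predicate."""
    hits, skipped = [], 0
    for part in partitions:
        if part["max"] < lo or part["min"] > hi:
            skipped += 1  # whole partition pruned, no tuple inspected
            continue
        hits.extend(v for v in part["rows"] if lo <= v <= hi)
    return hits, skipped

hits, skipped = scan(partitions, 120, 180)
print(hits, skipped)  # → [150] 2
```

Workload-driven partitioning then amounts to choosing partition boundaries so that frequent predicates prune as many partitions as possible.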
Preface
(2018)
Previous work has shown that surface modification with orthophosphoric acid can significantly enhance the charge stability on polypropylene (PP) surface by generating deeper traps. In the present study, thermally stimulated potential-decay measurements revealed that the chemical treatment may also significantly increase the number of available trapping sites on the surface. Thus, as a consequence, the so-called "cross-over" phenomenon, which is observed on as-received and thermally treated PP electrets, may be overcome in a certain range of initial charge densities. Furthermore, the discharge behavior of chemically modified samples indicates that charges can be injected from the treated surface into the bulk, and/or charges of opposite polarity can be pulled from the rear electrode into the bulk at elevated temperatures and at the high electric fields that are caused by the deposited charges. In the bulk, a lack of deep traps causes rapid charge decay already in the temperature range around 95 degrees C.
The influence of chemical composition and crystallisation conditions on the ferroelectric and paraelectric phases and the resulting morphology in Poly(vinylidene fluoride-trifluoroethylene-chlorofluoroethylene) (P(VDF-TrFE-CFE)) terpolymer films with 55.4/37.2/7.3 mol% or with 62.2/29.4/8.4 mol% of VDF/TrFE/CFE was studied. Poly(vinylidene fluoride trifluoroethylene) (P(VDF-TrFE)) with 75/25 mol% VDF/TrFE was employed as reference material. Fourier-Transform Infrared Spectroscopy (FTIR) was used to determine the fractions of the relevant terpolymer phases, and X-Ray Diffraction (XRD) was employed to assess the crystalline morphology. The FTIR results show an increase of the fraction of paraelectric phases after annealing. On the other hand, XRD results indicate a more stable paraelectric phase in the terpolymer with higher CFE content.
The electret state stability in nonpolar semicrystalline polymers is largely determined by the traps located at crystalline/amorphous phase interfaces. Thus, the thermal history of such polymers should considerably influence their electret properties. In the present work, we investigate how recrystallization influences charge stability in low-density polyethylene corona electrets. It has been found that electret charge stability in quenched samples is higher than in slowly-crystallized ones. Phenomenologically, this can be explained by the increased number of deeper traps in samples with smaller crystallite size.
Published results on LDPE/MgO nanocomposites (3wt%) show that they promise to be good electrical-insulation materials. In this work, the nanocomposites are examined as a potential (ferro-)electret material as well. Isothermal surface-potential decay measurements show that charged LDPE/MgO films still exhibit significant surface potentials after heating for 4 hours at 80 degrees C, which suggests good capabilities of LDPE/MgO nanocomposites to hold electric charges of both polarities. Open-tubular-channel ferroelectrets prepared from LDPE/MgO nanocomposite films show significant piezoelectricity with d(33) coefficients of about 20 pC/N after charging and are stable up to temperatures of at least 80 degrees C. Thus LDPE/MgO nanocomposites may become available as a new ferroelectret material. To increase their d(33) coefficients, it is desirable to optimize the charging conditions and the ferroelectret structure.
This introductory essay to the HSR Special Issue “Economists, Politics, and Society” argues for a strong field-theoretical programme inspired by Pierre Bourdieu to research economic life as an integral part of different social forms. Its main aim is threefold. First, we spell out the very distinct Durkheimian legacy in Bourdieu’s thinking and the way he applies it in researching economic phenomena. Without this background, much of what is actually part of how Bourdieu analysed economic aspects of social life would be overlooked or reduced to mere economic sociology. Second, we sketch the main theoretical concepts and heuristics used to analyse economic life from a field perspective. Third, we focus on practical methodological issues of field-analytical research into economic phenomena. We conclude with a short summary of the basic characteristics of this approach and discuss the main insights provided by the contributions to this special issue.
This paper presents the concept of a community-accessible stratospheric balloon-based observatory that is currently under preparation by a consortium of European research institutes and industry. We present the technical motivation, science case, instrumentation, and a two-stage image stabilization approach of the 0.5-m UV/visible platform. In addition, we briefly describe the novel mid-sized stabilized balloon gondola under design to carry telescopes in the 0.5 to 0.6 m range as well as the currently considered flight option for this platform. Secondly, we outline the scientific and technical motivation for a large balloon-based FIR telescope and the ESBO DS approach towards such an infrastructure.
The nature restoration project ‘Lenzener Elbtalaue’, realised from 2002 to 2011 at the river Elbe, included the first large-scale dike relocation in Germany (420 ha). Its aim was to initiate the development of endangered natural wetland habitats and processes, accompanied by greater biodiversity in the formerly grassland-dominated area. The monitoring of spatial and temporal variations of soil moisture in this dike relocation area is therefore particularly important for estimating the restoration success. The topsoil moisture monitoring from 1990 to 2017 is based on the Soil Moisture Index (SMI) derived with the triangle method by use of optical remotely sensed data: land surface temperature and the Normalized Difference Vegetation Index are calculated from Landsat 4/5/7/8 data and atmospherically corrected by use of MODIS data. Spatial and temporal soil moisture variations in the restored area of the dike relocation are compared to the agricultural and pasture area behind the new dike. Ground truth data in the dike relocation area was obtained from field measurements in October 2017 with an FDR device. Additionally, data from a TERENO soil moisture sensor network (SoilNet) and mobile cosmic-ray neutron sensing (CRNS) rover measurements are compared to the results of the triangle method for a region in the Harz Mountains (Germany). The SMI time series illustrates that the dike relocation area has become significantly wetter between 1990 and 2017 due to the restoration measures, whereas the SMI of the dike hinterland reflects constant and drier conditions. An influence of climate is unlikely. However, validation of the dimensionless index with ground truth measurements is very difficult, mostly due to large differences in scale.
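In the triangle method, the SMI of a pixel is obtained by normalizing its land surface temperature between the dry and wet edges of the LST-NDVI scatter within its NDVI bin. The sketch below uses this standard formulation; the edge temperatures and binning are illustrative numbers, not the study's calibration:

```python
def soil_moisture_index(lst, lst_min, lst_max):
    """SMI = (LST_max - LST) / (LST_max - LST_min); 0 at the dry edge,
    1 at the wet edge. Edges are taken per NDVI bin from the LST-NDVI
    scatter (the 'triangle')."""
    return (lst_max - lst) / (lst_max - lst_min)

# Illustrative per-NDVI-bin edges: bin -> (wet edge LST_min, dry edge LST_max) in K
edges = {0: (290.0, 320.0), 1: (288.0, 310.0)}
pixels = [(0, 305.0), (1, 288.0), (0, 320.0)]  # (ndvi_bin, observed LST)
smi_values = [soil_moisture_index(lst, *edges[b]) for b, lst in pixels]
print(smi_values)  # → [0.5, 1.0, 0.0]
```

Because the index is a relative position between scene-derived edges, it is dimensionless, which is precisely why validating it against absolute ground-truth soil moisture is difficult.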
Point clouds provide high-resolution topographic data which is often classified into bare-earth, vegetation, and building points and then filtered and aggregated to gridded Digital Elevation Models (DEMs) or Digital Terrain Models (DTMs). Based on these equally spaced grids, flow-accumulation algorithms are applied to describe the hydrologic and geomorphologic mass transport on the surface. In this contribution, we propose a stochastic point-cloud filtering that, together with a spatial bootstrap sampling, allows for a flow accumulation directly on point clouds using Facet-Flow Networks (FFN). Additionally, this provides a framework for the quantification of uncertainties in point-cloud derived metrics such as Specific Catchment Area (SCA) even though the flow accumulation itself is deterministic.
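The bootstrap idea behind the uncertainty quantification can be sketched generically: resample the point set with replacement, recompute the derived metric on each resample, and report the spread of the estimates. The stand-in statistic and values below are illustrative assumptions, not the FFN computation itself:

```python
import random

def bootstrap(points, statistic, n_resamples=1000, seed=42):
    """Bootstrap a (2.5%, 97.5%) interval for a metric computed from points.

    The metric itself (here a plain mean as a stand-in for a point-cloud
    derived quantity such as SCA) stays deterministic; only the sampling
    of the input points is randomized."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_resamples):
        sample = [rng.choice(points) for _ in points]  # resample with replacement
        estimates.append(statistic(sample))
    estimates.sort()
    lo = estimates[int(0.025 * n_resamples)]
    hi = estimates[int(0.975 * n_resamples)]
    return lo, hi

# Illustrative per-point metric values
sca_values = [3.2, 4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap(sca_values, mean)
# lo and hi bound the plausible range of the metric under resampling
```

A spatial bootstrap would additionally resample in blocks or neighborhoods to respect spatial correlation; the plain resampling above only shows the mechanics.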
Why choice matters
(2018)
Measures of democracy are in high demand. Scientific and public audiences use them to describe political realities and to substantiate causal claims about those realities. This introduction to the thematic issue reviews the history of democracy measurement since the 1950s. It identifies four development phases of the field, which are characterized by three recurrent topics of debate: (1) what is democracy, (2) what is a good measure of democracy, and (3) do our measurements of democracy register real-world developments? As the answers to those questions have been changing over time, the field of democracy measurement has adapted and reached higher levels of theoretical and methodological sophistication. In effect, the challenges facing contemporary social scientists are not only limited to the challenge of constructing a sound index of democracy. Today, they also need a profound understanding of the differences between various measures of democracy and their implications for empirical applications. The introduction outlines how the contributions to this thematic issue help scholars cope with the recurrent issues of conceptualization, measurement, and application, and concludes by identifying avenues for future research.
The problem of atmospheric emission from OH molecules is a long-standing problem for near-infrared astronomy. PRAXIS is a unique spectrograph fed by fibres that remove the OH background, and it is optimised specifically to benefit from OH suppression. The OH suppression is achieved with fibre Bragg gratings, which were tested successfully on the GNOSIS instrument. PRAXIS uses the same fibre Bragg gratings as GNOSIS in its first implementation, and will exploit new, cheaper and more efficient multicore fibre Bragg gratings in the second implementation. The OH lines are suppressed by a factor of ~1000, and the expected increase in the signal-to-noise ratio in the interline regions compared to GNOSIS is a factor of ~9 with the GNOSIS gratings and a factor of ~17 with the new gratings. PRAXIS will enable the full exploitation of OH suppression for the first time, which was not achieved by GNOSIS (a retrofit to an existing instrument that was not optimised for OH suppression) owing to high thermal emission, low spectrograph transmission, and detector noise. PRAXIS has extremely low thermal emission, through the cooling of all significantly emitting parts (including the fore-optics, the fibre Bragg gratings, a long length of fibre, and the fibre slit) and an optical design that minimises leaks of thermal emission from outside the spectrograph. It has low detector noise through the use of a Hawaii-2RG detector, and high throughput through an efficient VPH-based spectrograph. PRAXIS will determine the absolute level of the interline continuum and enable observations of individual objects via an IFU. In this paper we give a status update and report on acceptance tests.
We present a project combining lidar, photometer, and particle-counter data with a regularization software tool for a closure study of aerosol microphysical property retrieval. In a first step, only lidar data are used to retrieve the particle size distribution (PSD). Secondly, photometer data are added, which results in good consistency of the retrieved PSDs. Finally, the retrieved PSDs may be compared with the measured PSD from a particle counter. The data here were taken in Ny-Ålesund, Svalbard, as an example.
Logical modeling has been widely used to understand and expand the knowledge about protein interactions among different pathways. Realizing this, the caspo-ts system has been proposed recently to learn logical models from time series data. It uses Answer Set Programming to enumerate Boolean Networks (BNs) given prior knowledge networks and phosphoproteomic time series data. In the resulting sequence of solutions, similar BNs are typically clustered together. This can be problematic for large-scale problems where we cannot explore the whole solution space in reasonable time. Our approach extends the caspo-ts system to cope with the important use case of finding diverse solutions to a problem with a large number of solutions. We first present the algorithm for finding diverse solutions and then demonstrate the results of the proposed approach on two benchmark scenarios in systems biology: (1) an artificial dataset modeling TCR signaling and (2) the HPN-DREAM challenge dataset modeling breast cancer cell lines.
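The clustering problem described above can be made concrete with a small sketch. This is an illustrative greedy filter, not the caspo-ts algorithm: BNs are represented here as 0/1 tuples, and the `min_dist` threshold and all names are assumptions for demonstration only.

```python
# Illustrative sketch: keep only enumerated Boolean networks that differ
# from every already-kept network by at least `min_dist` entries, so the
# retained sample spans the solution space instead of one cluster.

def hamming(bn_a, bn_b):
    """Number of differing entries between two BN encodings (0/1 tuples)."""
    return sum(a != b for a, b in zip(bn_a, bn_b))

def diverse_subset(solutions, min_dist=2):
    """Greedy diversity filter over solutions in enumeration order."""
    kept = []
    for bn in solutions:
        if all(hamming(bn, k) >= min_dist for k in kept):
            kept.append(bn)
    return kept

# Enumeration order with near-duplicates appearing consecutively,
# mimicking how similar BNs cluster together in the solver output.
sols = [(0, 0, 0, 0), (0, 0, 0, 1), (1, 1, 0, 0), (1, 1, 0, 1), (1, 1, 1, 1)]
kept = diverse_subset(sols)  # -> [(0, 0, 0, 0), (1, 1, 0, 0), (1, 1, 1, 1)]
```

A real diverse-enumeration approach would push such constraints into the ASP solver itself rather than post-filter, but the sketch shows why a distance criterion yields one representative per cluster.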
High Mountain Asia provides water for more than a billion downstream users. Many catchments receive the majority of their yearly water budget in the form of snow, the vast majority of which is not monitored by the sparse weather networks. We leverage passive microwave data from the SSMI series of satellites (SSMI, SSMI/S, 1987-2016), reprocessed to 3.125 km resolution, to examine trends in the volume and spatial distribution of snow-water equivalent (SWE) in the Indus Basin. We find that the majority of the Indus has seen an increase in snow-water storage. There exists a strong elevation-trend relationship, where high-elevation zones have more positive SWE trends. Negative trends are confined to the Himalayan foreland and the deeply incised valleys that run into the Upper Indus. This implies a temperature-dependent cutoff below which precipitation increases are not translated into increased SWE; earlier snowmelt or a higher percentage of liquid precipitation could both explain this cutoff. Earlier work found a negative snow-water storage trend for the entire Indus catchment over the period 1987-2009 (-4 x 10^-3 mm/yr). In this study, based on an additional seven years of data, the average trend reverses to 1.4 x 10^-3 mm/yr. This implies that the decade since the mid-2000s was likely wetter and positively impacted long-term SWE trends. This conclusion is supported by an analysis of snowmelt onset and end dates, which found that while long-term trends are negative, more recent (since 2005) trends are positive (moving later in the year).
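The trend figures quoted above are of the kind produced by a per-pixel linear fit of yearly SWE against time. The following is a hedged sketch under assumed synthetic data; the study's actual SSMI reprocessing and trend methodology are not shown, and the `ols_slope` helper and values are illustrative only.

```python
# Sketch: ordinary least-squares slope of yearly SWE against time,
# in mm/yr -- the unit in which the abstract reports its trends.

def ols_slope(years, values):
    """Least-squares slope of values against years (value units per year)."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1987, 2017))                    # 1987-2016 study period
# Synthetic series with a built-in trend of +1.4e-3 mm/yr, the magnitude
# reported for the extended record.
swe = [100.0 + 0.0014 * (y - 1987) for y in years]
slope = ols_slope(years, swe)
```

Extending such a fit by a few wet years can flip a small negative slope to a small positive one, which is exactly the sensitivity the abstract describes when seven additional years reverse the 1987-2009 trend.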
Metamaterial Devices
(2018)
In our hands-on demonstration, we show several objects whose functionality is defined by their internal micro-structure. Such metamaterial machines can (1) be mechanisms based on their microstructures, (2) employ simple mechanical computation, or (3) change their outside to interact with their environment. They are 3D printed from one piece, and we support their creation by providing interactive software tools.
Cloud storage brokerage is an abstraction aimed at providing value-added services. However, Cloud Service Brokers are challenged by several security issues, including enlarged attack surfaces due to the integration of disparate components and API interoperability issues. Therefore, appropriate security risk assessment methods are required to identify and evaluate these security issues and examine the efficiency of countermeasures. A possible approach for satisfying these requirements is the employment of threat modeling concepts, which have been successfully applied in traditional paradigms. In this work, we employ threat models including attack trees, attack graphs, and Data Flow Diagrams against a Cloud Service Broker (CloudRAID) and analyze these security threats and risks. Furthermore, we propose an innovative technique for combining Common Vulnerability Scoring System (CVSS) and Common Configuration Scoring System (CCSS) base scores in probabilistic attack graphs to cater for configuration-based vulnerabilities, which are typically leveraged for attacking cloud storage systems. This approach is necessary since existing schemes do not provide sufficient security metrics, which are imperative for comprehensive risk assessment. We demonstrate the efficiency of our proposal by devising CCSS base scores for two common attacks against cloud storage: the Cloud Storage Enumeration Attack and the Cloud Storage Exploitation Attack. These metrics are then used in attack-graph-metric-based risk assessment. Our experimental evaluation shows that our approach caters for the aforementioned gaps and provides efficient security hardening options. Therefore, our proposals can be employed to improve cloud security.
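The combination of base scores in a probabilistic attack graph can be sketched as follows. This is a generic, hedged illustration of the standard noisy-OR style propagation, not the paper's exact scheme: the normalisation `score / 10`, the example scores, and the two-path graph are all assumptions for demonstration.

```python
# Sketch: normalise CVSS/CCSS base scores (0-10) to exploitation
# probabilities, then propagate through a tiny attack graph --
# multiply along a path (all steps must succeed) and combine
# alternative paths with a noisy-OR (any path suffices).

def p(score):
    """Normalise a CVSS/CCSS base score to a probability in [0, 1]."""
    return score / 10.0

def path_prob(scores):
    """Probability that every step on one attack path succeeds (AND)."""
    prob = 1.0
    for s in scores:
        prob *= p(s)
    return prob

def goal_prob(paths):
    """Probability that at least one path reaches the goal (noisy-OR)."""
    fail = 1.0
    for path in paths:
        fail *= 1.0 - path_prob(path)
    return 1.0 - fail

# Hypothetical broker scenario: an enumeration step (CCSS 5.0) followed
# by an exploitation step (CVSS 7.5), versus a direct API flaw (CVSS 6.0).
paths = [[5.0, 7.5], [6.0]]
risk = goal_prob(paths)  # -> 0.75
```

Including CCSS scores alongside CVSS scores is what lets configuration weaknesses, which CVSS alone does not capture, enter the path probabilities at all.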