Gaussianity Fair
(2017)
Editorial
(2017)
The keynote article (Mayberry & Kluender, 2017) makes an important contribution to questions concerning the existence and characteristics of sensitive periods in language acquisition. Specifically, by comparing groups of non-native L1 and L2 signers, the authors have been able to ingeniously disentangle the effects of maturation from those of early language exposure. Based on L1 versus L2 contrasts, the paper convincingly argues that L2 learning is a less clear test of sensitive periods. Nevertheless, we believe Mayberry and Kluender underestimate the evidence for maturational factors in L2 learning, especially that coming from recent research.
Emergency Care in Germany Being Reassessed: Hybrid Medical Care Model Seen as Potential Answer
(2017)
In recent years, “transnationalism” has become a key concept for historians and other scholars in the humanities and social sciences. However, its overuse threatens to dilute what would otherwise be a distinct approach with promising heuristic potential. This danger seems especially pronounced when the notion of transnationalism is applied to Jewish history, which, paradoxically, most scholars would agree, is at its core transnational. Many studies have analyzed how Jewries in different times and places, from the biblical era to the present, have been shaped by people, ideas, texts, and institutions that migrated across state lines and between cultures. So what is new about transnationalism in Jewish Studies? What new insights does it offer?
American Jewry offers an obvious arena in which to test transnationalism’s significance as an approach to historical research within Jewish studies. As a “nation of nations,” the United States constitutes a distinct society, built on ideas of diversity and pluralism and transcending older European concepts of nation and state. The transformative incorporation into American life of cultural, political, and social traditions brought from abroad is one feature of this distinctiveness. American Jewish history and culture, in particular, are best understood in the context of interaction with Jews in other places, both because of American Jews’ roots in and continued entanglement with Europe, and because of their differences from other Jews.
These considerations guided the participants in a roundtable that formed a prologue to an international conference held July 20–22, 2016, at the School of Jewish Theology at the University of Potsdam and the Center for Jewish Studies Berlin-Brandenburg, Germany. The conference title, “Re-Framing American Jewish History and Thought: New Transnational Perspectives,” indicated the organizers’ conviction that the transnational approach does have the potential to shed fresh light on the American Jewish experience. The participants were asked to bring their experiences to the table, in an effort to clarify what transnationalism might mean for American Jewish Studies, and where it might yield new approaches and insights.
The conference brought together some thirty scholars of various disciplines from Europe, Israel, and the United States. In addition to exploring a relatively new approach (at least, in the field of American Jewish Studies), the conference also served a second purpose: to further the interest in American Jewry as a subject of scholarly attention in countries outside the U.S., where the topic has been curiously neglected. The assumption underlying the conference was that a transnational perspective on American Jewry would bring to bear the particular interests and skills of scholars working outside the American academy, and thereby complement, rather than replicate, the ways American Jewish Studies have been pursued in North America itself.
We compare Visual Berrypicking, an interactive approach that allows users to explore large and highly faceted information spaces using similarity-based two-dimensional maps, with traditional browsing techniques. For large datasets, the projection methods currently used to generate map-like overviews suffer from increased computational costs and a loss of accuracy, resulting in inconsistent visualizations. We propose to interactively align inexpensive small maps, each showing a local neighborhood only, which ideally creates the impression of panning one large map. For evaluation, we designed a web-based prototype for movie exploration and compared it to the web interface of The Movie Database (TMDb) in an online user study. Results suggest that users are able to explore large movie collections effectively by hopping from one neighborhood to the next. Additionally, because movie similarities are projected onto the map, interesting links between movies can be found more easily, and thus serendipitous discoveries are more likely than with browsing.
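The pairwise alignment step can be sketched with an orthogonal Procrustes fit, assuming each pair of neighboring maps shares a set of anchor items that appear in both projections (the function and data layout below are illustrative, not the prototype's actual implementation):

```python
import numpy as np

def align_map(new_map: np.ndarray, ref_map: np.ndarray) -> np.ndarray:
    """Rigidly align `new_map` to `ref_map` (same shape: n shared anchors x 2)
    with an orthogonal Procrustes fit, so that adjacent neighborhood maps
    appear as panes of one continuous map."""
    mu_new, mu_ref = new_map.mean(axis=0), ref_map.mean(axis=0)
    A, B = new_map - mu_new, ref_map - mu_ref
    U, _, Vt = np.linalg.svd(A.T @ B)   # SVD of the cross-covariance
    R = U @ Vt                          # optimal rotation (or reflection)
    return A @ R + mu_ref               # rotate, then shift onto the reference
```

Stitching consecutive neighborhoods then amounts to aligning each newly computed small map to the anchors of the currently displayed one.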
In this extended abstract, we analyze the current challenges for the envisioned self-adaptive CPS. In addition, we outline our results in approaching these challenges with SMARTSOS [10], a generic approach based on extensions of graph transformation systems that employs open and adaptive collaborations and models at runtime for trustworthy self-adaptation, self-organization, and evolution at the level of both the individual systems and the system-of-systems, while taking the independent development, operation, management, and evolution of these systems into account.
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to include the geoelectrical method in the measurement program for tracking the CO2 plume. The example of the Ketzin site shows that, during all phases of the CO2 storage reservoir development, the resistivity measurements and their tomographic interpretation contribute beneficially to the measurement, monitoring and verification (MMV) protocol. The most important asset of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
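The resistivity-to-saturation step mentioned last is commonly illustrated with Archie's law; the petrophysical parameters below (brine resistivity, porosity, cementation and saturation exponents) are generic placeholder values, not Ketzin-calibrated ones:

```python
def archie_saturation(rt, rw=0.1, phi=0.2, a=1.0, m=2.0, n=2.0):
    """Brine saturation S_w from formation resistivity R_t via Archie's law:
    S_w = (a * R_w / (phi**m * R_t))**(1/n).
    The CO2 (gas) saturation is then 1 - S_w. Parameter values are
    illustrative defaults, not site-specific calibrations."""
    sw = (a * rw / (phi ** m * rt)) ** (1.0 / n)
    return min(sw, 1.0)   # clip: fully brine-saturated rock
```

With these defaults, a doubling of the baseline resistivity (2.5 to 5.0 ohm-m) maps to a brine saturation of about 0.71, i.e. roughly 29% CO2 saturation.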
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary to couple chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization models to estimate Young’s moduli for five synthetic microstructures, and successfully validate our results for comparable geometries against the analytical Mori-Tanaka approach. Further, we demonstrate that considering the specific rock microstructure is of paramount importance, since calculated elastic properties may deviate by up to 230% for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young’s moduli improves significantly when detailed spatial information is employed.
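The analytical benchmark can be sketched as follows: a standard Mori-Tanaka estimate for a matrix containing spherical inclusions, converted to a Young's modulus under the isotropy assumption. The moduli in the test are illustrative numbers, not those of the five synthetic microstructures:

```python
def mori_tanaka_youngs(K0, G0, K1, G1, f):
    """Mori-Tanaka estimate of the effective Young's modulus for a two-phase
    medium: matrix (bulk K0, shear G0) with volume fraction f of spherical
    inclusions (K1, G1). All moduli in consistent units (e.g. GPa)."""
    # Auxiliary matrix terms for spherical inclusions
    a0 = K0 + 4.0 * G0 / 3.0
    F0 = G0 * (9.0 * K0 + 8.0 * G0) / (6.0 * (K0 + 2.0 * G0))
    # Effective bulk and shear moduli
    K = K0 + f * (K1 - K0) * a0 / (a0 + (1.0 - f) * (K1 - K0))
    G = G0 + f * (G1 - G0) * (G0 + F0) / (G0 + F0 + (1.0 - f) * (G1 - G0))
    # Isotropic conversion to Young's modulus
    return 9.0 * K * G / (3.0 * K + G)
```

The estimate correctly recovers the matrix moduli at f = 0 and the inclusion moduli at f = 1, which makes it a convenient closed-form check against numerical homogenization.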
Water management tools are necessary to guarantee the preservation of natural resources while ensuring their optimum utilization. Linear regression models are a simple and quick solution for creating prognostic capabilities. Multivariate models show higher precision than univariate models. In the case of Waiwera, using the individual production rates is more accurate than using only the total production rate. A maximum of approximately 1,075 m3/day can be pumped while ensuring a water level of at least 0.5 m a.s.l. in the monitoring well. The model should be renewed annually to incorporate new data and current water level trends and thereby maintain its quality.
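A minimal sketch of such a multivariate prognostic model, assuming water levels and per-well production rates are available as arrays (the variable layout and numbers are hypothetical, not the Waiwera dataset):

```python
import numpy as np

def fit_level_model(production, levels):
    """Fit a multivariate linear model: water level ~ individual well
    production rates (columns of `production`) plus an intercept.
    Returns the coefficient vector. A sketch of the kind of regression
    described above, not the site-specific model itself."""
    X = np.column_stack([np.ones(len(production)), production])
    coef, *_ = np.linalg.lstsq(X, levels, rcond=None)
    return coef

def predict_level(coef, rates):
    """Predicted water level for one vector of per-well production rates."""
    return coef[0] + np.dot(coef[1:], rates)
```

A constraint such as "keep the level above 0.5 m a.s.l." then translates into solving the fitted linear relation for the admissible total pumping rate.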
Integration and development of the energy supply in China and worldwide is a challenge for the years to come. The innovative idea presented here is based on an extension of the “power-to-gas-to-power” technology by establishing a closed carbon cycle. It is an implementation of a low-carbon energy system based on carbon dioxide capture and storage (CCS) to store and reuse wind and solar energy. The Chenjiacun storage project in China compares well with the German case study for the towns Potsdam and Brandenburg/Havel in the Federal State of Brandenburg based on the Ketzin pilot site for CCS.
The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016) Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters that the entropy is defined over ensures consistency between different representations of the same network. The performance of the proposed reduced parameter method is demonstrated with a one-loop network case study.
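As a generic illustration of entropy-based flow inference (not the specific formulation of Waldrip et al.), note that with a Gaussian prior over the pipe flows, minimizing relative entropy subject to linear node-balance constraints has a closed-form solution:

```python
import numpy as np

def maxent_flows(A, d, q0):
    """Illustrative maximum-entropy flow inference: with a Gaussian prior of
    mean q0 (identity covariance), minimizing the relative entropy subject to
    the node-balance constraints A q = d gives the closed-form update below.
    A is the (full-row-rank) node incidence matrix, d the nodal demands."""
    r = d - A @ q0                         # constraint residual of the prior
    lam = np.linalg.solve(A @ A.T, r)      # Lagrange multipliers
    return q0 + A.T @ lam                  # posterior-mean flows
```

If the prior already satisfies the balance constraints, the residual is zero and the prior is returned unchanged, which mirrors the consistency property the reduced-parameter formulation aims for.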
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods, which assign the discrete value of a maximum entropy distribution to equal the traffic flow rate. The proposed method, however, uses a distribution to represent each flow rate. It is shown to handle uncertainty in a more elegant way while giving similar results to traditional methods. It is able to incorporate more of the observed data through the entropy function, prior distribution, and integration limits, potentially allowing better inferences to be made.
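For contrast, the traditional entropy-maximizing gravity model referred to above can be sketched as a doubly constrained trip distribution solved by iterative proportional fitting of the balancing factors a_i, b_j; all numbers in the test are illustrative:

```python
import numpy as np

def gravity_trips(origins, dests, cost, beta=0.1, iters=200):
    """Entropy-maximizing (doubly constrained) gravity model: trip matrix
    T_ij = a_i * O_i * b_j * D_j * exp(-beta * c_ij), with balancing
    factors a, b found by iterative proportional fitting. Requires
    sum(origins) == sum(dests). A textbook sketch of the traditional
    method, not the distributional method proposed in the paper."""
    F = np.exp(-beta * cost)             # deterrence matrix
    a = np.ones(len(origins))
    for _ in range(iters):
        b = 1.0 / (F.T @ (a * origins))  # enforce destination totals
        a = 1.0 / (F @ (b * dests))      # enforce origin totals
    return (a * origins)[:, None] * F * (b * dests)[None, :]
```

The row and column sums of the returned matrix reproduce the observed origin and destination totals, which is exactly the constraint set under which the entropy is maximized.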
The Internet can be considered the most important infrastructure for modern society and businesses. A loss of Internet connectivity has strong negative financial impacts for businesses and economies. Therefore, assessing Internet connectivity, in particular beyond an organization's own premises and area of direct control, is of growing importance in the face of potential failures, accidents, and malicious attacks. This paper presents CORIA, a software framework for the easy analysis of connectivity risks based on large network graphs. It provides researchers, risk analysts, network managers and security consultants with a tool to assess an organization's connectivity and path options through the Internet backbone, including a user-friendly and insightful visual representation of results. CORIA is flexibly extensible with novel data sets, graph metrics, and risk scores that enable further use cases. The performance of CORIA is evaluated by several experiments on the Internet graph and on further randomly generated networks.
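A toy version of such a connectivity-risk score, assuming the network graph is given as a plain adjacency mapping; this illustrates the idea of graph-based risk metrics, not one of CORIA's actual metrics:

```python
from collections import deque

def reachable(graph, start, removed=frozenset()):
    """BFS reachability from `start`, skipping `removed` nodes.
    `graph` maps each node to an iterable of neighbor nodes."""
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in seen and v not in removed:
                seen.add(v)
                queue.append(v)
    return seen - {start}

def cutoff_risk(graph, source, node):
    """Fraction of previously reachable nodes the `source` loses contact
    with (including the failed node itself) if `node` fails."""
    before = reachable(graph, source)
    after = reachable(graph, source, removed={node})
    return 1.0 - len(after) / len(before)
```

Applied to every transit node on a path, such a score highlights single points of failure between an organization and the backbone.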
This paper describes architectural extensions for a dynamically scheduled processor so that it can be used in three different operation modes, ranging from high performance to high reliability. With minor hardware extensions of the control path, the resources of the superscalar data path can be used either for high-performance execution, fail-safe operation, or fault-tolerant operation. This makes the processor architecture a very good candidate for applications with dynamically changing reliability requirements, e.g. automotive applications. The paper reports the hardware overhead of the extensions and investigates the performance penalties introduced by the fail-safe and fault-tolerant modes. Furthermore, a comprehensive fault simulation was carried out to investigate the fault coverage of the proposed approach.
Handling manufacturing and aging faults with software-based techniques in tiny embedded systems
(2017)
Non-volatile memory occupies a large portion of the chip area in an embedded system. Such memories are prone to manufacturing faults, retention faults, and aging faults. This paper presents a single software-based technique that handles all of these fault types in tiny embedded systems without the need for hardware support. This is beneficial for low-cost embedded systems with simple memory architectures. A software infrastructure and a flow are presented that demonstrate how the technique is used for fault handling both right after manufacturing and in the field. Moreover, a full implementation for an MSP430 microcontroller is presented, along with a discussion of the performance, overhead, and reliability impacts.
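The flavor of such a software-based memory test can be illustrated with a simplified March-style algorithm; the Python simulation below stands in for the C routine that would run on the MSP430, and the stuck-at fault model in the accompanying test is hypothetical:

```python
def march_test(mem: bytearray) -> list:
    """Simplified March-style RAM test (write 0s ascending; read-0/write-FF
    ascending; read-FF/write-0 descending; read-0 ascending). Detects
    stuck-at and many coupling faults; returns addresses of failing cells.
    A host-side simulation of the on-target software test, not the paper's
    actual algorithm."""
    bad, n = set(), len(mem)
    for i in range(n):                # M0: ascending, write 0x00
        mem[i] = 0x00
    for i in range(n):                # M1: ascending, read 0x00, write 0xFF
        if mem[i] != 0x00:
            bad.add(i)
        mem[i] = 0xFF
    for i in reversed(range(n)):      # M2: descending, read 0xFF, write 0x00
        if mem[i] != 0xFF:
            bad.add(i)
        mem[i] = 0x00
    for i in range(n):                # M3: ascending, read 0x00
        if mem[i] != 0x00:
            bad.add(i)
    return sorted(bad)
```

On the target, the returned addresses would feed a remapping table so that faulty cells are excluded from further use.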
Preclinical studies in cell culture systems as well as in whole-animal chronic kidney disease (CKD) models showed that parathyroid hormone (PTH), when oxidized at its two methionine residues (positions 8 and 18), loses its function. This has so far not been considered in the development of the PTH assays used in current clinical practice. Patients with advanced CKD are subject to oxidative stress, and plasma proteins (including PTH) are targets for oxidants. In patients with CKD, a considerable but variable fraction (about 70 to 90%) of measured PTH appears to be oxidized. Oxidized PTH (oxPTH) does not interact with the PTH receptor, resulting in a loss of biological activity. Currently used intact PTH (iPTH) assays detect both oxPTH and non-oxidized PTH (n-oxPTH). Clinical studies demonstrated that bioactive n-oxPTH, but neither iPTH nor oxPTH, is associated with mortality in CKD patients.
Preface
(2017)
Cost models play an important role in the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures dominate today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, each accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
Moving Forces
(2017)
Throughout a large part of the twentieth century, the body was interpreted as a field of signs whose meaning pointed to an unconscious dimension. At the height of the popularity of structuralism, Jacques Lacan deemed the unconscious to be “structured like a language.” Starting in the early 1990s, however, a deep shift occurred in the way the body was interpreted. A new movement cast tremendous doubt on the hegemony of language and instead advocated performative, pictorial, and affective approaches, all of which the so-called material turn encompasses. In the words of Karen Barad, this turn asked why meaning, history, and truth are assigned to language only, whereas the movements of materiality are given less prominence: “How did language come to be more trustworthy than matter? Why are language and culture granted their own agency and historicity while matter is figured as passive and immutable?” With this shift toward the material, bodies began to be seen in a different light and their materiality understood as something that follows its own laws and movements, which cannot be understood exclusively in terms of socio-cultural codes. Instead, these laws and movements call into question the very dichotomies of nature/culture and body/spirit.
Hulleman & Olivers' (H&O's) model introduces variation of the functional visual field (FVF) for explaining visual search behavior. Our research shows how the FVF can be studied using gaze-contingent displays and how FVF variation can be implemented in models of gaze control. Contrary to H&O, we believe that fixation duration is an important factor when modeling visual search behavior.
S-test results for the USGS and RELM forecasts. The differences between the simulated log-likelihoods and the observed log-likelihood are labelled on the horizontal axes, with scaling adjustments for the 40year.retro experiment. The horizontal lines represent the confidence intervals, at the 0.05 significance level, for each forecast and experiment. If this range contains a log-likelihood difference of zero, the forecasted log-likelihoods are consistent with the observed ones, and the forecast passes the S-test (denoted by thin lines). If this range does not contain zero, the forecast fails the S-test for that particular experiment (denoted by thick lines). Colours distinguish between experiments (see Table 2 for explanation of experiment durations). Due to anomalously large likelihood differences, S-test results for Wiemer-Schorlemmer.ALM during the 10year.retro and 40year.retro experiments are not displayed. The range of log-likelihoods for the Holliday-et-al.PI forecast is lower than for the other forecasts due to relatively homogeneous forecasted seismicity rates and the use of a small fraction of the RELM testing region.
The P300 and the LC-NE System: New Insights from Transcutaneous Vagus Nerve Stimulation (tVNS)
(2017)
Does Age Influence Brain Potentials during Affective Picture Processing in Middle-Aged Women?
(2017)
As virtualization drives the automation of networking, the validation of security properties becomes more and more challenging, eventually ruling out manual inspection. While formal verification in Software Defined Networks is provided by comprehensive tools with fast reverification capabilities, such as NetPlumber, these tools do not consider middlebox functionality like firewalls, and they lack the ability to handle dynamic protocol elements like IPv6 extension header chains. In this work, we provide suitable modeling abstractions to enable both: the inclusion of firewalls and the handling of dynamic protocol elements. As an example, we model the Linux ip6tables/netfilter packet filter and also provide abstractions for an application layer gateway. Finally, we present a prototype of our formal verification system FaVe.
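A minimal abstraction of the kind of modeling involved might look as follows, with packets carrying an ordered extension-header chain and rules constraining both scalar fields and the chain; this is an illustrative toy model, not FaVe's actual encoding of ip6tables:

```python
def matches(rule, packet):
    """Check one abstract filter rule against an abstract packet. A packet
    carries scalar fields plus an ordered IPv6 extension-header chain
    (`ext_headers`); a rule may pin scalar fields and require or forbid
    header types in the chain. Illustrative model only."""
    for field, want in rule.get("fields", {}).items():
        if packet.get(field) != want:
            return False
    chain = packet.get("ext_headers", [])
    if not set(rule.get("requires", [])) <= set(chain):
        return False
    if set(rule.get("forbids", [])) & set(chain):
        return False
    return True

def first_match(ruleset, packet, default="drop"):
    """First-match semantics, as in an ip6tables chain with a default policy."""
    for rule in ruleset:
        if matches(rule, packet):
            return rule["action"]
    return default
```

A verifier can then enumerate symbolic packet classes over such rules to decide whether a forbidden packet can ever traverse the filter.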
During their evolution, massive stars are characterized by a significant loss of mass, either via spherically symmetric stellar winds or by aspherical mass-loss mechanisms, namely outflowing equatorial disks. However, the scenario that leads to the formation of a disk or rings of gas and dust around these objects is still under debate: is it a viscous disk, an outflowing disk-forming wind, or some other mechanism? It is also unclear how the various physical mechanisms acting on the circumstellar environment of these stars affect its shape, density, kinematics, and thermal structure. We assume that the disk-forming mechanism is viscous transport within an equatorial outflowing disk of a rapidly or even critically rotating star. We study the hydrodynamic and thermal structure of the optically thick, dense parts of outflowing circumstellar disks that may form around, e.g., Be stars, sgB[e] stars, or Pop III stars. We calculate self-consistent, time-dependent models of the inner dense region of the disk, which is strongly affected both by irradiation from the central star and by viscous heating. We also simulate the dynamical effects of collisions between expanding supernova ejecta and the circumstellar disks that may form around sgB[e] stars and, e.g., LBVs or Pop III stars.
Recently, the first implementations of Internet of Things (IoT) technologies have begun to penetrate industrial value-adding processes. As a result, the competence requirements for employees are changing. Employees’ organization, process, and interaction competences are of crucial importance in this new IoT environment; however, they are not yet sufficiently considered in study programs and vocational training. At the same time, conventional learning factories are evolving and transforming into digital learning factories. Nevertheless, the integration of IoT technology and its usage for training in digital learning factories has been largely neglected thus far. Existing learning factories do not explicitly and properly consider IoT technology, which leads to deficiencies regarding an appropriate development of employees’ Industrial IoT competences. The goal of this contribution is to present a didactic concept that enables the development and training of these newly demanded competences by using an IoT laboratory. For this purpose, a design science approach is applied. The result of this contribution is a didactic concept for the development of Industrial IoT competences in an IoT laboratory.
This special issue is the result of several fruitful conference sessions on disturbance hydrology, which started at the 2013 AGU Fall Meeting in San Francisco and have continued every year since. The stimulating presentations and discussions surrounding those sessions have focused on understanding both the disruption of hydrologic functioning following discrete disturbances and the subsequent recovery or change within the affected watershed system. Whereas some hydrologic disturbances are directly linked to anthropogenic activities, such as resource extraction, the contributions to this special issue focus primarily on those with indirect or less pronounced human involvement, such as bark-beetle infestation, wildfire, and other natural hazards. However, human activities are enhancing the severity and frequency of these seemingly natural disturbances, thereby contributing to acute hydrologic problems and hazards. Major research challenges for our increasingly disturbed planet include the lack of continuous pre- and post-disturbance monitoring, hydrologic impacts that vary spatially and temporally with environmental and hydroclimatic conditions, and the preponderance of overlapping or compounding disturbance sequences. In addition, a conceptual framework for characterizing commonalities and differences among hydrologic disturbances is still in its infancy. In this introduction to the special issue, we advance the fusion of concepts and terminology from ecology and hydrology to begin filling this gap. We briefly explore some preliminary approaches for comparing different disturbances and their hydrologic impacts, which provides a starting point for further dialogue and research progress.
Influenza virus vRNPs: quantitative investigations via fluorescence cross-correlation spectroscopy
(2017)
Direct visualization of APLP1 cell-cell adhesion platforms via fluorescence fluctuation spectroscopy
(2017)
Surface acoustic wave (SAW) devices are well known for gravimetric sensor applications. In biosensing applications, chemically and biochemically evoked adsorption processes at surfaces are detected in liquid environments using delay-line or resonator sensor configurations, preferably in combination with appropriate microfluidic devices. In this paper, a novel SAW-based impedance sensor type is introduced which uses only one interdigital electrode transducer (IDT) simultaneously as SAW generator and sensor element. It is shown that the amplitude of the reflected S11 signal directly depends on the input impedance of the SAW device. The input impedance is strongly influenced by mass adsorption, which causes a characteristic and measurable impedance mismatch.
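The relation between the reflected signal and the input impedance follows from standard transmission-line theory: Z_in = Z0 * (1 + S11) / (1 - S11). A small helper, assuming the usual 50-ohm reference impedance:

```python
def input_impedance(s11: complex, z0: float = 50.0) -> complex:
    """Input impedance of the one-port transducer from the measured
    complex reflection coefficient S11 (reference impedance z0).
    Mass loading on the IDT shifts Z_in, which appears as the amplitude
    change of the reflected S11 signal described above."""
    return z0 * (1 + s11) / (1 - s11)
```

A perfectly matched port (S11 = 0) returns exactly z0; any adsorption-induced mismatch moves Z_in away from that value and is therefore directly measurable.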
Eighteen scientists met at Jurata, Poland, to discuss various aspects of the transition from adolescence to adulthood. This transition is a delicate period shaped by complex interactions between adolescents and the social groups they belong to. Issues to be dealt with during this period of human development include social identity, group identification and identity signalling; stress affecting basal salivary cortisol rhythms; hypertension; inappropriate nutrition causing latent and manifest obesity; and, in developing and under-developed countries, parasitoses causing anaemia and thereby impairing growth and development. In addition, some new aspects of the association between weight, height and head circumference in newborns were discussed, as well as intrauterine head growth and head circumference as health risk indicators.