The German Enlightenment
(2017)
The term Enlightenment (or Aufklärung) remains heavily contested. Even when historians delimit the remit of the concept, assigning it to a particular historical period rather than to an intellectual or moral programme, the public resonance of the Enlightenment remains high and problematic—especially when equated in an essentialist manner with modernity or some core values of ‘the West’. This Forum has been convened to discuss recent research on the Enlightenment in Germany, different views of the term and its ideological use in public discourse outside academia (and sometimes within it).
The interview offers a reconstruction of the German reception of Durkheim since the mid-1970s. Hans Joas, who was one of its major protagonists, discusses the backdrop that finally permitted a scholarly examination of Durkheim’s sociology in Germany. Focussing on his personal reception, Joas then gives an account of the Durkheimian themes that inspire his work.
Just after the publication of the Theory of Communicative Action in 1981, a new generation of interpreters started a different reception of Durkheim in Germany. Hans-Peter Müller, sociologist and editor of the German translation of Leçons de sociologie, reconstructs the history of the German reception of Durkheim and illuminates the reasons for his interest in the French sociologist. He offers insights into the background that permitted the post-Habermasian generation to reach a new understanding of Durkheim’s work, elucidating the scientific and political conditions from which this new sensibility emerged.
New data from the LEADER trial show that the glucagon-like peptide 1 receptor agonist liraglutide protects against diabetic nephropathy in patients with type 2 diabetes mellitus. The renoprotective efficacy of liraglutide is not, however, as great as that reported for the sodium-glucose cotransporter 2 inhibitor empagliflozin in the EMPA-REG OUTCOME trial.
Editorial
(2017)
Tailed bacteriophages specific for Gram-negative bacteria encounter lipopolysaccharide (LPS) during the first infection steps. Yet, it is not well understood how the biochemistry of these initial interactions relates to the subsequent events that orchestrate phage adsorption and tail rearrangements to initiate cell entry. For many phages, the long O-antigen chains found on the LPS of smooth bacterial strains serve as an essential receptor recognized by their tailspike proteins (TSP). Many TSP are depolymerases, and O-antigen cleavage has been described as a necessary step for subsequent orientation towards a secondary receptor. However, O-antigen-specific host attachment does not always entail O-antigen degradation. In this issue of Molecular Microbiology, Prokhorov et al. report that coliphage G7C carries a TSP that deacetylates the O-antigen but does not degrade it, whereas rough strains or strains lacking O-antigen acetylation remain unaffected. Bacteriophage G7C specifically functionalizes its tail by attaching the deacetylase TSP directly to a second TSP that is nonfunctional on the host's O-antigen. This challenges the view that bacteriophages use their TSP only to clear their way to a secondary receptor. Rather, O-antigen-specific phages may employ enzymatically active TSP as a tool for irreversible LPS membrane binding to initiate subsequent infection steps.
Kijko et al. (2016) present various methods to estimate parameters that are relevant for probabilistic seismic-hazard assessment. One of these parameters, although not the most influential, is the maximum possible earthquake magnitude m_max. I show that the proposed estimation of m_max is based on an erroneous equation related to a misuse of the estimator in Cooke (1979) and leads to unstable results. So far, reported finite estimates of m_max arise from data selection, because the estimator in Kijko et al. (2016) diverges with finite probability. This finding is independent of the assumed distribution of earthquake magnitudes. For the specific choice of the doubly truncated Gutenberg-Richter distribution, I illustrate the problems by deriving explicit equations. Finally, I conclude that point estimators are generally not a suitable approach to constrain m_max.
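For reference, the doubly truncated Gutenberg-Richter distribution referred to above is commonly parameterized as follows (the standard textbook form, not the specific equations derived in the comment):

```latex
F(m) = \frac{1 - e^{-\beta\,(m - m_{\min})}}{1 - e^{-\beta\,(m_{\max} - m_{\min})}},
\qquad m_{\min} \le m \le m_{\max},\quad \beta = b\,\ln 10
```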
DPP4 inhibition prevents AKI
(2017)
Gamma-ray bursts (GRBs) are some of the Universe’s most enigmatic and exotic events. However, at energies above 10 GeV their behaviour remains largely unknown. Although space-based telescopes such as the Fermi-LAT have been able to detect GRBs in this energy range, their photon statistics are limited by the small detector size. Such limitations are not present in ground-based gamma-ray telescopes such as the H.E.S.S. experiment, which has now entered its second phase with the addition of a large 600 m² telescope at the centre of the array. Such a large telescope allows H.E.S.S. to access the sub-100 GeV energy range while still maintaining a large effective collection area, helping to potentially probe the short-timescale emission of these events.
We present a description of the H.E.S.S. GRB observation programme, summarising the performance of the rapid GRB repointing system and the conditions under which GRB observations are initiated. Additionally, we report on the GRB follow-ups made during the 2014-15 observation campaigns.
HESS J1826-130
(2017)
HESS J1826-130 is an unidentified hard-spectrum source discovered by H.E.S.S. along the Galactic plane, with a spectral index of Gamma = 1.6 and an exponential cut-off at about 12 TeV. While the source does not have a clear counterpart at longer wavelengths, the very hard spectrum at TeV energies implies that electrons or protons accelerated up to several hundreds of TeV are responsible for the emission. In the hadronic case, the VHE emission can be produced by runaway cosmic rays colliding with the dense molecular clouds spatially coincident with the H.E.S.S. source.
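For context, such a spectrum is usually described by a power law with an exponential cut-off; with the values quoted above it would read (a standard parameterization, not necessarily the exact fit function used by H.E.S.S.):

```latex
\frac{\mathrm{d}N}{\mathrm{d}E} = N_0 \left(\frac{E}{E_0}\right)^{-\Gamma} \exp\!\left(-\frac{E}{E_{\mathrm{cut}}}\right),
\qquad \Gamma \approx 1.6,\quad E_{\mathrm{cut}} \approx 12\ \mathrm{TeV}
```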
E-commerce marketplaces are highly dynamic, with constant competition. While this competition is challenging for many merchants, it also provides plenty of opportunities, e.g., by allowing them to automatically adjust prices in order to react to changing market situations. For practitioners, however, testing automated pricing strategies is time-consuming and potentially hazardous when done in production. Researchers, on the other hand, struggle to study how pricing strategies interact under heavy competition. As a consequence, we built Price Wars, an open continuous-time framework for simulating dynamic pricing competition. Its microservice-based architecture provides a scalable platform for large competitions with dozens of merchants and a large random stream of consumers. The platform stores each event in a distributed log, which makes it possible to provide various performance measures, enabling users to compare the profit and revenue of different repricing strategies in real time. For researchers, price trajectories are shown, which eases the evaluation of mutual price reactions of competing strategies. Furthermore, merchants can access historical marketplace data and apply machine learning. By providing a set of customizable, artificial merchants, users can easily simulate both simple rule-based strategies and sophisticated data-driven strategies that use demand learning to optimize their pricing.
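To give a flavour of the kind of rule-based merchant strategy such a platform hosts, here is a minimal Python sketch; the function and parameter names are hypothetical and do not reflect the actual Price Wars API:

```python
def reprice(own_cost, competitor_prices, margin_floor=0.05, undercut=0.01):
    """Rule-based repricing: undercut the cheapest competitor,
    but never drop below cost plus a minimum margin."""
    price_floor = own_cost * (1 + margin_floor)
    if not competitor_prices:                 # no competition: apply a default markup
        return round(own_cost * 1.20, 2)
    target = min(competitor_prices) - undercut
    return round(max(target, price_floor), 2)

# Example: cost 10.00, competitors at 12.49 and 11.99 -> new price 11.98
print(reprice(10.00, [12.49, 11.99]))
```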
The ionospheric delay of global navigation satellite system (GNSS) signals is typically compensated by adding a single correction value to the pseudorange measurement of a GNSS receiver. Yet, this neglects the dispersive nature of the ionosphere. In this context we analyze the ionospheric signal distortion beyond a constant delay. These effects become increasingly significant with the signal bandwidth and are hence more important for new broadband navigation signals. Using measurements of the Galileo E5 signal, captured with a high-gain antenna, we verify that the expected influence can indeed be observed and compensated. A new method to estimate the total electron content (TEC) from a single-frequency high-gain antenna measurement of a broadband GNSS signal is proposed and described in detail. Because of the narrow aperture angle of the antenna, the received signal is practically unaffected by multipath and interference, which generally reduces these error sources in the result. Such measurements are also independent of the code correlation used in standard receiver applications and are therefore usable without knowledge of the signal coding. Results of the TEC estimation process are shown, discussed, and compared with common TEC products such as TEC maps and dual-frequency receiver estimates.
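The dispersion underlying this analysis is, to first order, the standard frequency-dependent ionospheric range delay (a textbook relation, stated here for orientation rather than as the paper's estimator):

```latex
\Delta L_{\mathrm{iono}} \approx \frac{40.3}{f^{2}}\,\mathrm{TEC}\ \ [\mathrm{m}],
\qquad \mathrm{TEC\ in\ electrons/m^{2}},\ f\ \mathrm{in\ Hz}
```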
Mixed-projection treemaps
(2017)
This paper presents a novel technique for combining 2D and 2.5D treemaps using multi-perspective views to leverage the advantages of both treemap types. It enables a new form of overview+detail visualization for tree-structured data and contributes new concepts for real-time rendering of and interaction with treemaps. The technique operates by tilting the graphical elements representing inner nodes using affine transformations and animated state transitions. We explain how to mix orthogonal and perspective projections within a single treemap. Finally, we show application examples that benefit from the reduced interaction overhead.
Nanocarriers
(2017)
Background: Evidence that home telemonitoring (HTM) for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health economic advantage. Therefore, the CardioBBEAT trial was designed to prospectively assess the health economic impact of a dedicated home monitoring system for patients with CHF based on actual costs obtained directly from patients’ health care providers.
Methods: Between January 2010 and June 2013, 621 patients (mean age 63.0 ± 11.5 years, 88% male) with a confirmed diagnosis of CHF (LVEF ≤ 40%) were enrolled and randomly assigned to two study groups comprising usual care with and without an interactive bi-directional HTM (Motiva®). The primary endpoint was the incremental cost-effectiveness ratio (ICER), established from the groups’ difference in total cost and in the combined clinical endpoint “days alive and not in hospital nor inpatient care per potential days in study” within the 12-month follow-up. Secondary outcome measures were total mortality and health-related quality of life (SF-36, WHO-5 and KCCQ).
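For clarity, the ICER named in the primary endpoint is the conventional ratio of the between-group cost difference to the between-group effect difference (the effect here being the combined clinical endpoint defined above):

```latex
\mathrm{ICER} = \frac{C_{\mathrm{HTM}} - C_{\mathrm{usual\ care}}}{E_{\mathrm{HTM}} - E_{\mathrm{usual\ care}}}
```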
Results: In the intention-to-treat analysis, total mortality (HR 0.81; 95% CI 0.45–1.45) and days alive and not in hospital (343.3 ± 55.4 vs. 347.2 ± 43.9; p = 0.909) were not significantly different between HTM and usual care. While the resulting primary endpoint ICER was not positive (−181.9; 95% CI −1626.2 ± 1628.9), quality of life assessed by SF-36, WHO-5 and KCCQ as a secondary endpoint was significantly higher in the HTM group at 6 and 12 months of follow-up.
Conclusions: The first simultaneous assessment of the clinical and economic outcome of HTM in patients with CHF did not demonstrate superior incremental cost-effectiveness compared to usual care. On the other hand, quality of life was improved. It remains open whether the tested HTM solution represents a useful innovative approach in the current health care setting.
Recently, Kocyan & Wiland-Szymańska (2016) published a thorough research article on one of the outstanding members of the family Hypoxidaceae on the Seychelles, which resulted in the establishment of a new genus (Friedmannia Kocyan & Wiland-Szymańska 2016: 60) to accommodate the former Curculigo seychellensis Bojer ex Baker (1877: 368). However, it has turned out that the name Friedmannia Chantanachat & Bold (1962: 45) already exists in the literature for a green alga, which renders the new hypoxid genus illegitimate (Melbourne Code; McNeill et al. 2012). Therefore, we assign a new generic name to Curculigo seychellensis.
Gaussianity Fair
(2017)
Editorial
(2017)
The keynote article (Mayberry & Kluender, 2017) makes an important contribution to questions concerning the existence and characteristics of sensitive periods in language acquisition. Specifically, by comparing groups of non-native L1 and L2 signers, the authors have been able to ingeniously disentangle the effects of maturation from those of early language exposure. Based on L1 versus L2 contrasts, the paper convincingly argues that L2 learning is a less clear test of sensitive periods. Nevertheless, we believe Mayberry and Kluender underestimate the evidence for maturational factors in L2 learning, especially that coming from recent research.
Emergency Care in Germany Being Re-assessed: Hybrid Medical Care Model Seen As Potential Answer
(2017)
In recent years, “transnationalism” has become a key concept for historians and other scholars in the humanities and social sciences. However, its overuse threatens to dilute what would otherwise be a distinct approach with promising heuristic potential. This danger seems especially pronounced when the notion of transnationalism is applied to Jewish history, which, paradoxically, most scholars would agree, is at its core transnational. Many studies have analyzed how Jewries in different times and places, from the biblical era to the present, have been shaped by people, ideas, texts, and institutions that migrated across state lines and between cultures. So what is new about transnationalism in Jewish Studies? What new insights does it offer?
American Jewry offers an obvious arena to test transnationalism’s significance as an approach to historical research within Jewish studies. As a “nation of nations,” the United States is made up of a distinct and unique society, built on ideas of diversity and pluralism, and transcending old European concepts of nation and state. The transformative incorporation in American life of cultural, political, and social traditions brought from abroad is one feature of this distinctiveness. American Jewish history and culture, in particular, are best understood in the context of interaction with Jews in other places, both because of American Jews’ roots in and continued entanglement with Europe, and because of their differences from other Jews.
These considerations guided the participants in a roundtable that formed a prologue to an international conference held July 20–22, 2016, at the School of Jewish Theology at the University of Potsdam and the Center for Jewish Studies Berlin-Brandenburg, Germany. The conference title, “Re-Framing American Jewish History and Thought: New Transnational Perspectives,” indicated the organizers’ conviction that the transnational approach does have the potential to shed fresh light on the American Jewish experience. The participants were asked to bring their experiences to the table, in an effort to clarify what transnationalism might mean for American Jewish Studies, and where it might yield new approaches and insights.
The conference brought together some thirty scholars of various disciplines from Europe, Israel, and the United States. In addition to exploring a relatively new approach (at least, in the field of American Jewish Studies), the conference also served a second purpose: to further the interest in American Jewry as a subject of scholarly attention in countries outside the U.S., where the topic has been curiously neglected. The assumption underlying the conference was that a transnational perspective on American Jewry would bring to bear the particular interests and skills of scholars working outside the American academy, and thereby complement, rather than replicate, the ways American Jewish Studies have been pursued in North America itself.
We compare Visual Berrypicking, an interactive approach allowing users to explore large and highly faceted information spaces using similarity-based two-dimensional maps, with traditional browsing techniques. For large datasets, current projection methods used to generate map-like overviews suffer from increased computational costs and a loss of accuracy, resulting in inconsistent visualizations. We propose to interactively align inexpensive small maps, showing local neighborhoods only, which ideally creates the impression of panning a large map. For evaluation, we designed a web-based prototype for movie exploration and compared it to the web interface of The Movie Database (TMDb) in an online user study. Results suggest that users are able to effectively explore large movie collections by hopping from one neighborhood to the next. Additionally, due to the projection of movie similarities, interesting links between movies can be found more easily, and thus, compared to browsing, serendipitous discoveries are more likely.
In this extended abstract, we analyze the current challenges for the envisioned Self-Adaptive CPS. In addition, we outline our results in approaching these challenges with SMARTSOS [10], a generic approach based on extensions of graph transformation systems that employs open and adaptive collaborations and models at runtime for trustworthy self-adaptation, self-organization, and evolution at the level of the individual systems and of the system-of-systems, taking the independent development, operation, management, and evolution of these systems into account.
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to include the geoelectrical method in the measurement program for tracking the CO2 plume. The example of the Ketzin site shows that, during all phases of CO2 storage reservoir development, the resistivity measurements and their tomographic interpretation contribute in a beneficial manner to the measurement, monitoring and verification (MMV) protocol. The most important asset of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
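The link from measured resistivity to reservoir saturation mentioned in the last sentence is commonly made via Archie's equation (standard form shown here; the Ketzin-specific petrophysical parameters are not given in this text):

```latex
S_w = \left(\frac{a\,\rho_w}{\phi^{m}\,\rho_t}\right)^{1/n},
\qquad S_{\mathrm{CO_2}} = 1 - S_w
```

where ρ_t is the measured bulk resistivity, ρ_w the brine resistivity, φ the porosity, and a, m, n empirical parameters.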
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary for coupling chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization models to estimate Young’s moduli for five synthetic microstructures, and successfully validate our results for comparable geometries against the analytical Mori-Tanaka approach. Further, we demonstrate that considering specific rock microstructures is of paramount importance, since calculated elastic properties may deviate by up to 230% for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young’s moduli is significantly improved when detailed spatial information is employed.
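For orientation, the Mori-Tanaka estimate used for validation can be written in its common two-phase form for spherical inclusions (matrix K_m, μ_m; inclusions K_i, μ_i; inclusion volume fraction f); the simulated microstructures are more general, so this is only the idealized analytical case:

```latex
K_{\mathrm{MT}} = K_m + \frac{f\,(K_i - K_m)}{1 + (1-f)\,\frac{K_i - K_m}{K_m + \frac{4}{3}\mu_m}},
\qquad
\mu_{\mathrm{MT}} = \mu_m + \frac{f\,(\mu_i - \mu_m)}{1 + (1-f)\,\frac{\mu_i - \mu_m}{\mu_m + \zeta_m}},
\qquad
\zeta_m = \frac{\mu_m\,(9K_m + 8\mu_m)}{6\,(K_m + 2\mu_m)}
```

The effective Young's modulus then follows from E = 9Kμ/(3K + μ).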
Water management tools are necessary to guarantee the preservation of natural resources while ensuring optimum utilization. Linear regression models are a simple and quick solution for creating prognostic capabilities. Multivariate models show higher precision than univariate models. In the case of Waiwera, using individual production rates is more accurate than applying just the total production rate. A maximum of approximately 1,075 m³/day can be pumped while ensuring a water level of at least 0.5 m a.s.l. in the monitoring well. The model should be renewed annually to incorporate new data and current water level trends and thus maintain its quality.
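A minimal sketch of the multivariate regression approach described above (hypothetical file and column names, not the study's actual data schema):

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical monitoring data: one production rate per bore plus the observed water level.
data = pd.read_csv("waiwera_monitoring.csv")
X = data[["rate_bore_1", "rate_bore_2", "rate_bore_3"]]   # individual production rates (m3/day)
y = data["water_level_masl"]                              # monitoring-well level (m a.s.l.)

model = LinearRegression().fit(X, y)

# Predict the water level for a candidate abstraction scenario.
scenario = pd.DataFrame([[300.0, 400.0, 375.0]], columns=X.columns)
print(model.predict(scenario))
```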
Integration and development of the energy supply in China and worldwide is a challenge for the years to come. The innovative idea presented here is based on an extension of the “power-to-gas-to-power” technology by establishing a closed carbon cycle. It is an implementation of a low-carbon energy system based on carbon dioxide capture and storage (CCS) to store and reuse wind and solar energy. The Chenjiacun storage project in China compares well with the German case study for the towns Potsdam and Brandenburg/Havel in the Federal State of Brandenburg based on the Ketzin pilot site for CCS.
The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016), Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters over which the entropy is defined ensures consistency between different representations of the same network. The performance of the proposed reduced-parameter method is demonstrated with a one-loop network case study.
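The continuous relative entropy in question has the standard form below, maximized over the reduced parameter vector x subject to the network constraints (e.g. conservation of mass at the nodes), with q a prior density:

```latex
H[p] = -\int p(\mathbf{x})\,\ln\frac{p(\mathbf{x})}{q(\mathbf{x})}\,\mathrm{d}\mathbf{x}\ \longrightarrow\ \max_{p}
```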
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods that assign the discrete value of a maximum entropy distribution to equal the traffic flow rate. The proposed method, however, uses a distribution to represent each flow rate. It is shown to handle uncertainty in a more elegant way and to give results similar to traditional methods. It is able to incorporate more of the observed data through the entropy function, prior distribution and integration limits, potentially allowing better inferences to be made.
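For reference, the classical entropy-maximizing (doubly constrained) gravity model that such work builds on assigns flows T_ij between origin i and destination j as below; the method proposed here replaces each single value with a distribution over the flow rate:

```latex
T_{ij} = A_i\,O_i\,B_j\,D_j\,e^{-\beta c_{ij}}
```

with O_i and D_j the origin and destination totals, c_ij the travel cost, β a cost-sensitivity parameter, and A_i, B_j balancing factors.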
The Internet can be considered the most important infrastructure for modern society and businesses. A loss of Internet connectivity has strong negative financial impacts on businesses and economies. Therefore, assessing Internet connectivity, in particular beyond one's own premises and area of direct control, is of growing importance in the face of potential failures, accidents, and malicious attacks. This paper presents CORIA, a software framework for the easy analysis of connectivity risks based on large network graphs. It provides researchers, risk analysts, network managers and security consultants with a tool to assess an organization's connectivity and path options through the Internet backbone, including a user-friendly and insightful visual representation of results. CORIA is flexibly extensible in terms of novel data sets, graph metrics, and risk scores that enable further use cases. The performance of CORIA is evaluated by several experiments on the Internet graph and on further randomly generated networks.
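To illustrate the kind of graph metric such a framework aggregates into connectivity risk scores, here is a small sketch using networkx; it is purely illustrative and does not reflect CORIA's actual API or data format:

```python
import networkx as nx

# Toy AS-level topology; in practice this would be a large Internet graph.
G = nx.Graph([("AS1", "AS2"), ("AS2", "AS3"), ("AS2", "AS4"), ("AS3", "AS4"), ("AS4", "AS5")])

betweenness = nx.betweenness_centrality(G)     # nodes that many backbone paths depend on
cut_points = set(nx.articulation_points(G))    # single points of failure for connectivity

# Naive per-node risk indicator: high betweenness plus being a cut vertex.
risk = {n: betweenness[n] + (1.0 if n in cut_points else 0.0) for n in G}
print(sorted(risk.items(), key=lambda kv: -kv[1]))
```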
This paper describes architectural extensions for a dynamically scheduled processor, so that it can be used in three different operation modes, ranging from high performance to high reliability. With minor hardware extensions of the control path, the resources of the superscalar data path can be used either for high-performance execution, fail-safe operation, or fault-tolerant operation. This makes the processor architecture a very good candidate for applications with dynamically changing reliability requirements, e.g. automotive applications. The paper reports the hardware overhead for the extensions and investigates the performance penalties introduced by the fail-safe and fault-tolerant modes. Furthermore, a comprehensive fault simulation was carried out in order to investigate the fault coverage of the proposed approach.
Handling manufacturing and aging faults with software-based techniques in tiny embedded systems
(2017)
Non-volatile memory occupies a large portion of the area of a chip in an embedded system. Such memories are prone to manufacturing faults, retention faults, and aging faults. The paper presents a single software-based technique that allows all of these fault types to be handled in tiny embedded systems without the need for hardware support. This is beneficial for low-cost embedded systems with simple memory architectures. A software infrastructure and a flow are presented that demonstrate how the technique is used for fault handling right after manufacturing and in the field. Moreover, a full implementation is presented for an MSP430 microcontroller, along with a discussion of the performance, overhead, and reliability impacts.
Preclinical studies in cell culture systems as well as in whole-animal chronic kidney disease (CKD) models showed that parathyroid hormone (PTH) oxidized at its two methionine residues (positions 8 and 18) loses its function. This has so far not been considered in the development of the PTH assays used in current clinical practice. Patients with advanced CKD are subject to oxidative stress, and plasma proteins (including PTH) are targets for oxidants. In patients with CKD, a considerable but variable fraction (about 70 to 90%) of measured PTH appears to be oxidized. Oxidized PTH (oxPTH) does not interact with the PTH receptor, resulting in a loss of biological activity. Currently used intact PTH (iPTH) assays detect both oxidized and non-oxidized PTH (n-oxPTH). Clinical studies demonstrated that bioactive n-oxPTH, but not iPTH or oxPTH, is associated with mortality in CKD patients.
Preface
(2017)
Cost models play an important role in the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures dominate today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, all accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
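As an illustration of turning low-level hardware counter readings into a simple performance indicator (hypothetical counter names and threshold; not one of the tools described above):

```python
def numa_locality(counters):
    """Fraction of memory loads served from the local NUMA node.
    Values near 1.0 indicate good data placement."""
    local = counters["local_node_loads"]     # assumed counter name
    remote = counters["remote_node_loads"]   # assumed counter name
    total = local + remote
    return local / total if total else 1.0

# Step 1: collect counters per thread or socket; step 2: assess against a threshold.
sample = {"local_node_loads": 950_000, "remote_node_loads": 250_000}
ratio = numa_locality(sample)
print(f"locality={ratio:.2f}", "OK" if ratio >= 0.8 else "investigate placement")
```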
Moving Forces
(2017)
Throughout a large part of the twentieth century, the body was interpreted as a field of signs, the meaning of which pointed to an unconscious dimension. At the height of the popularity of structuralism, Jacques Lacan deemed the unconscious to be “structured like a language.” Starting in the early 1990s, however, a deep shift occurred in the way the body was interpreted. A new movement cast tremendous doubt on the hegemony of language and instead advocated a performative, pictorial, and affective approach — the so-called material turn — which encompassed all of these. In the words of Karen Barad, this turn inquired as to why meaning, history, and truth are assigned to language only, whereas the movements of materiality are given less prominence: “How did language come to be more trustworthy than matter? Why are language and culture granted their own agency and historicity while matter is figured as passive and immutable?” With this shift toward the material, bodies began to be seen in a different light and their materiality understood as something that follows its own laws and movements, which cannot be understood exclusively in terms of social-cultural codes. Instead, these laws and movements call into question the very dichotomies of nature/culture and body/spirit.
Hulleman & Olivers' (H&O's) model introduces variation of the functional visual field (FVF) for explaining visual search behavior. Our research shows how the FVF can be studied using gaze-contingent displays and how FVF variation can be implemented in models of gaze control. Contrary to H&O, we believe that fixation duration is an important factor when modeling visual search behavior.