The German Enlightenment
(2017)
The term Enlightenment (or Aufklärung) remains heavily contested. Even when historians delimit the remit of the concept, assigning it to a particular historical period rather than to an intellectual or moral programme, the public resonance of the Enlightenment remains high and problematic—especially when equated in an essentialist manner with modernity or some core values of ‘the West’. This Forum has been convened to discuss recent research on the Enlightenment in Germany, different views of the term and its ideological use in public discourse outside academia (and sometimes within it).
Selecting initial points, determining the number of clusters, and finding proper cluster centers remain the main challenges in clustering. In this paper, we suggest a genetic algorithm-based method that searches several solution spaces simultaneously. The solution spaces are population groups consisting of elements with a similar structure. Elements within a group have the same size, while elements in different groups have different sizes. The proposed algorithm processes the population in groups of chromosomes with one gene, two genes, and so on up to k genes; these genes hold the corresponding information about the cluster centers. In the proposed method, the crossover and mutation operators can accept parents of different sizes, which leads to versatility in the population and to information transfer among sub-populations. We implemented the proposed method and evaluated its performance on several random datasets as well as the Ruspini dataset. The experimental results show that the proposed method can effectively determine the appropriate number of clusters and recognize their centers. Overall, this research implies that using a heterogeneous population in a genetic algorithm can lead to better results.
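The variable-length chromosome scheme described in this abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the penalty term, operator rates, and chromosome-length cap are illustrative assumptions.

```python
import random

def sse(points, centers):
    """Within-cluster sum of squared distances to the nearest center."""
    total = 0.0
    for x, y in points:
        total += min((x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centers)
    return total

def fitness(points, centers, penalty=5.0):
    """Lower is better: SSE plus a penalty per center to discourage
    solutions that overfit with too many clusters (penalty is an assumption)."""
    return sse(points, centers) + penalty * len(centers)

def crossover(a, b):
    """Parents may have different numbers of genes (centers): the child
    takes a random prefix of one parent and a random suffix of the other."""
    cut_a = random.randint(1, len(a))
    cut_b = random.randint(0, len(b))
    return (a[:cut_a] + b[cut_b:])[:10]  # cap chromosome length

def mutate(centers, spread=1.0):
    """Jitter one randomly chosen center."""
    c = list(centers)
    i = random.randrange(len(c))
    cx, cy = c[i]
    c[i] = (cx + random.gauss(0, spread), cy + random.gauss(0, spread))
    return c

def evolve(points, generations=60, pop_size=40):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    def rand_center():
        return (random.uniform(min(xs), max(xs)), random.uniform(min(ys), max(ys)))
    # Heterogeneous population: chromosomes with 1 to 5 genes (centers).
    pop = [[rand_center() for _ in range(random.randint(1, 5))] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(points, c))
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = crossover(a, b)
            if random.random() < 0.3:
                child = mutate(child)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda c: fitness(points, c))

random.seed(3)
data = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(30)] + \
       [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(30)]
best = evolve(data)
print(len(best))
```

With two well-separated synthetic clusters, the per-center penalty lets the fitness function itself choose the number of clusters, which is the core idea of letting chromosomes of different sizes compete in one run.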
Moving Forces
(2017)
Throughout a large part of the twentieth century, the body was interpreted as a field of signs whose meaning pointed to an unconscious dimension. At the height of the popularity of structuralism, Jacques Lacan deemed the unconscious to be “structured like a language.” Starting in the early 1990s, however, a deep shift occurred in the way the body was interpreted. A new movement cast tremendous doubt on the hegemony of language and instead advocated performative, pictorial, and affective approaches, all of which were encompassed by the so-called material turn. In the words of Karen Barad, this turn asked why meaning, history, and truth are assigned to language only, whereas the movements of materiality are given less prominence: “How did language come to be more trustworthy than matter? Why are language and culture granted their own agency and historicity while matter is figured as passive and immutable?” With this shift toward the material, bodies began to be seen in a different light, and their materiality came to be understood as something that follows its own laws and movements, which cannot be understood exclusively in terms of socio-cultural codes. Instead, these laws and movements call into question the very dichotomies of nature/culture and body/spirit.
HESS J1826-130
(2017)
HESS J1826-130 is an unidentified hard-spectrum source discovered by H.E.S.S. along the Galactic plane, with a spectral index of Gamma = 1.6 and an exponential cut-off at about 12 TeV. While the source does not have a clear counterpart at longer wavelengths, the very hard spectrum at TeV energies implies that electrons or protons accelerated up to several hundreds of TeV are responsible for the emission. In the hadronic case, the VHE emission can be produced by runaway cosmic rays colliding with the dense molecular clouds spatially coincident with the H.E.S.S. source.
Tailed bacteriophages specific for Gram‐negative bacteria encounter lipopolysaccharide (LPS) during the first infection steps. Yet it is not well understood how the biochemistry of these initial interactions relates to the subsequent events that orchestrate phage adsorption and tail rearrangements to initiate cell entry. For many phages, the long O‐antigen chains found on the LPS of smooth bacterial strains serve as an essential receptor recognized by their tailspike proteins (TSP). Many TSP are depolymerases, and O‐antigen cleavage has been described as a necessary step for subsequent orientation towards a secondary receptor. However, O‐antigen-specific host attachment need not always be accompanied by O‐antigen degradation. In this issue of Molecular Microbiology, Prokhorov et al. report that coliphage G7C carries a TSP that deacetylates the O‐antigen but does not degrade it, and that rough strains or strains lacking O‐antigen acetylation remain uninfected. Bacteriophage G7C specifically functionalizes its tail by attaching the deacetylase TSP directly to a second TSP that is nonfunctional on the host's O‐antigen. This challenges the view that bacteriophages use their TSP only to clear their way to a secondary receptor. Rather, O‐antigen-specific phages may employ enzymatically active TSP as a tool for irreversible LPS membrane binding that initiates the subsequent infection steps.
Since the Shallow Structure Hypothesis (SSH) was first put forward in 2006, it has inspired a growing body of research on grammatical processing in nonnative (L2) speakers. More than 10 years later, we think it is time for the SSH to be reconsidered in the light of new empirical findings and current theoretical assumptions about human language processing. The purpose of our critical commentary is twofold: to clarify some issues regarding the SSH and to sketch possible ways in which this hypothesis might be refined and improved to better account for L1 and L2 speakers’ performance patterns.
Direct visualization of APLP1 cell-cell adhesion platforms via fluorescence fluctuation spectroscopy
(2017)
Preclinical assessment of penetration not only in intact but also in barrier‐disrupted skin is important to explore the added value of novel drug delivery systems, which can be specifically designed for diseased skin. Here, we characterized physical and chemical barrier-disruption protocols for short‐term ex vivo skin cultures with regard to structural integrity and physiological and biological parameters. Further, we compared the penetration of dexamethasone (Dex) in different nanoparticle‐based formulations in stratum corneum, epidermis and dermis extracts of intact vs. barrier‐disrupted skin, as well as by dermal microdialysis, at 6, 12 and 24 hours after topical application. Dex was quantified by liquid chromatography-tandem mass spectrometry (LC‐MS/MS). Simultaneously, we investigated Dex efficacy by interleukin (IL) analysis. Tape‐stripping (TS) and a 4-hour exposure to 5% sodium lauryl sulfate (SLS) were identified as highly effective barrier-disruption methods, as assessed by reproducible changes in transepidermal water loss (TEWL) and an increase in IL‐6/8 that was more pronounced in SLS‐treated skin. The barrier state also has a significant impact on the Dex penetration kinetics: for all formulations, TS greatly increased the dermal Dex concentration, even though nanocrystals quickly and effectively penetrated both intact and barrier‐disrupted skin, reaching significantly higher dermal Dex concentrations after 6 hours compared to Dex cream. The added value of encapsulation in ethyl cellulose nanocarriers was mostly observed on intact skin, generally showing a delayed Dex penetration. Estimation of cytokines was limited by the trauma caused by probe insertion. In summary, ex vivo human skin is a highly interesting short‐term preclinical model for analysing the penetration and efficacy of novel drug delivery systems.
The Internet can be considered the most important infrastructure for modern society and business. A loss of Internet connectivity has strong negative financial impacts on businesses and economies. Therefore, assessing Internet connectivity, in particular beyond an organization's own premises and area of direct control, is of growing importance in the face of potential failures, accidents, and malicious attacks. This paper presents CORIA, a software framework for the easy analysis of connectivity risks based on large network graphs. It provides researchers, risk analysts, network managers and security consultants with a tool to assess an organization's connectivity and path options through the Internet backbone, including a user-friendly and insightful visual representation of the results. CORIA is flexibly extensible with novel data sets, graph metrics, and risk scores that enable further use cases. The performance of CORIA is evaluated in several experiments on the Internet graph and on further randomly generated networks.
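The kind of graph metric such a framework might compute can be illustrated with a toy connectivity-risk score: how badly the network fragments when a single node fails. This is only a sketch of the general idea; CORIA's actual metrics, data model, and API are not described here.

```python
def components(nodes, edges):
    """Count connected components via iterative DFS over an adjacency list."""
    adj = {n: set() for n in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, count = set(), 0
    for n in nodes:
        if n in seen:
            continue
        count += 1
        stack = [n]
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            stack.extend(adj[cur] - seen)
    return count

def removal_risk(nodes, edges):
    """Toy per-node risk score: the number of extra components that appear
    when the node (e.g. a router or AS in a backbone graph) fails."""
    base = components(nodes, edges)
    risk = {}
    for n in nodes:
        rest = [m for m in nodes if m != n]
        kept = [(u, v) for u, v in edges if n not in (u, v)]
        risk[n] = components(rest, kept) - base
    return risk

# A small backbone-like topology: 'C' is a cut vertex between two regions.
nodes = ["A", "B", "C", "D", "E"]
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E"), ("C", "E")]
scores = removal_risk(nodes, edges)
print(scores)
```

Here the cut vertex `C` receives a positive risk score while redundant nodes score zero, which is the kind of insight a connectivity-risk analysis aims to surface at Internet scale.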
The interview offers a reconstruction of the German reception of Durkheim since the middle of the 1970s. Hans Joas, who was one of its major protagonists, discusses the backdrop that finally permitted a scholarly examination of Durkheim’s sociology in Germany. Focussing on his personal reception Joas then gives an account of the Durkheimian themes that inspire his work.
Just after the publication of the Theory of Communicative Action in 1981, a new generation of interpreters started a different reception of Durkheim in Germany. Hans-Peter Müller, sociologist and editor of the German translation of Leçons de sociologie, reconstructs the history of the German reception of Durkheim and explains the reasons for his interest in the French sociologist. He offers insights into the background that permitted the post-Habermasian generation to reach a new understanding of Durkheim's work by illuminating the scientific and political conditions from which this new sensibility emerged.
We develop a simple two-zone interpretation of the broadband baseline Crab nebula spectrum between 10^-5 eV and ~100 TeV by using two distinct log-parabola energetic electron distributions. We determine analytically the very-high-energy photon spectrum as originating from inverse-Compton scattering of the far-infrared soft ambient photons within the nebula off a first population of electrons energized at the nebula termination shock. The broad and flat 200 GeV peak jointly observed by Fermi/LAT and MAGIC is naturally reproduced. The synchrotron radiation from a second energetic electron population explains the spectrum from the radio range up to ~10 keV. We infer from observations the energy dependence of the microscopic probability that the accelerating electrons remain in proximity to the shock.
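For reference, a log-parabola particle distribution is commonly parameterized as below; this is the standard textbook form, not necessarily the exact parameterization used by the authors:

```latex
N(E) \;\propto\; \left(\frac{E}{E_0}\right)^{-\alpha \,-\, \beta \log_{10}(E/E_0)}
```

where \(E_0\) is a reference energy, \(\alpha\) is the local spectral index at \(E_0\), and \(\beta\) controls the spectral curvature; for \(\beta = 0\) it reduces to a pure power law.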
In recent years, “transnationalism” has become a key concept for historians and other scholars in the humanities and social sciences. However, its overuse threatens to dilute what would otherwise be a distinct approach with promising heuristic potential. This danger seems especially pronounced when the notion of transnationalism is applied to Jewish history, which, paradoxically, most scholars would agree, is at its core transnational. Many studies have analyzed how Jewries in different times and places, from the biblical era to the present, have been shaped by people, ideas, texts, and institutions that migrated across state lines and between cultures. So what is new about transnationalism in Jewish Studies? What new insights does it offer?
American Jewry offers an obvious arena to test transnationalism’s significance as an approach to historical research within Jewish studies. As a “nation of nations,” the United States constitutes a distinct and unique society, built on ideas of diversity and pluralism and transcending old European concepts of nation and state. The transformative incorporation into American life of cultural, political, and social traditions brought from abroad is one feature of this distinctiveness. American Jewish history and culture, in particular, are best understood in the context of interaction with Jews in other places, both because of American Jews’ roots in and continued entanglement with Europe, and because of their differences from other Jews.
These considerations guided the participants in a roundtable that formed a prologue to an international conference held July 20–22, 2016, at the School of Jewish Theology at the University of Potsdam and the Center for Jewish Studies Berlin-Brandenburg, Germany. The conference title, “Re-Framing American Jewish History and Thought: New Transnational Perspectives,” indicated the organizers’ conviction that the transnational approach does have the potential to shed fresh light on the American Jewish experience. The participants were asked to bring their experiences to the table, in an effort to clarify what transnationalism might mean for American Jewish Studies, and where it might yield new approaches and insights.
The conference brought together some thirty scholars of various disciplines from Europe, Israel, and the United States. In addition to exploring a relatively new approach (at least, in the field of American Jewish Studies), the conference also served a second purpose: to further the interest in American Jewry as a subject of scholarly attention in countries outside the U.S., where the topic has been curiously neglected. The assumption underlying the conference was that a transnational perspective on American Jewry would bring to bear the particular interests and skills of scholars working outside the American academy, and thereby complement, rather than replicate, the ways American Jewish Studies have been pursued in North America itself.
The identification of vulnerabilities relies on detailed information about the target infrastructure. Gathering the necessary information is a crucial step that requires intensive scanning or mature expertise and knowledge about the system, even when the information is already available in a different context. In this paper we propose a new method to detect vulnerabilities that reuses existing information and eliminates the need for a comprehensive scan of the target system. Since our approach identifies vulnerabilities without the additional effort of a scan, it increases the overall performance of the detection. Because it reuses information and removes active testing procedures, our approach can be classified as passive vulnerability detection. We explain the approach and illustrate the additional possibility of increasing the security awareness of users. To this end, we applied the approach to an experimental setup and extracted security-relevant information from web logs.
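The core idea of reusing information already present in logs can be illustrated with a small sketch: extracting software version banners from web log lines and matching them against a knowledge base of vulnerable versions, without sending a single packet to the target. The knowledge base, log format, and matching rule here are hypothetical placeholders, not the paper's actual method or data.

```python
import re

# Hypothetical knowledge base: product -> versions known to be vulnerable.
VULNERABLE = {
    "Apache": {"2.4.49", "2.4.50"},
    "nginx": {"1.16.0"},
}

# Matches banners of the form "Apache/2.4.49" or "nginx/1.20.1".
BANNER = re.compile(r"(Apache|nginx)/(\d+\.\d+\.\d+)")

def passive_scan(log_lines):
    """Passive detection: only reuse information already recorded in the
    logs; no active probe or scan of the target system is performed."""
    findings = set()
    for line in log_lines:
        m = BANNER.search(line)
        if m and m.group(2) in VULNERABLE.get(m.group(1), set()):
            findings.add((m.group(1), m.group(2)))
    return findings

logs = [
    '10.0.0.5 - - "GET / HTTP/1.1" 200 Server: Apache/2.4.49',
    '10.0.0.7 - - "GET / HTTP/1.1" 200 Server: nginx/1.20.1',
]
print(passive_scan(logs))
```

Only the Apache entry is flagged, because its logged version appears in the vulnerability list; the nginx host passes. A real system would of course need a maintained vulnerability feed and far more robust banner extraction.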
Recently, a multitude of empirically derived damage models have been applied to project future tropical cyclone (TC) losses for the United States. In their study, Geiger et al (2016 Environ. Res. Lett. 11 084012) compared two approaches that differ in the scaling of losses with socio-economic drivers: the commonly used approach, which results in a sub-linear scaling of historical TC losses with a nation's affected gross domestic product (GDP), and the disentangled approach, which shows a sub-linear increase with affected population and a super-linear scaling of relative losses with per-capita income. Statistics cannot determine which approach is preferable, but since process understanding demands that the loss depend on both GDP per capita and population, an approach that accounts for both separately is preferable to one that assumes a specific relation between the two dependencies. In the accompanying comment, Rybski et al argued that there is no rigorous evidence for the conclusion that high income does not protect against hurricane losses. Here we affirm that our conclusion is drawn correctly and reply to further remarks raised in the comment, highlighting the adequacy of our approach but also the potential for future extensions of our research.
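The disentangled scaling discussed above can be written schematically as follows; the exponents are illustrative placeholders, not the fitted values from the study:

```latex
L \;\propto\; P_{\mathrm{affected}}^{\,\alpha} \cdot \left(\frac{\mathrm{GDP}}{P}\right)^{\gamma},
\qquad \alpha < 1, \quad \gamma > 1,
```

i.e. losses \(L\) grow sub-linearly with the affected population \(P_{\mathrm{affected}}\) (\(\alpha < 1\)) while depending super-linearly on per-capita income \(\mathrm{GDP}/P\) (\(\gamma > 1\)), rather than being tied to a single combined GDP variable.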
Editorial
(2017)
In this extended abstract, we analyze the current challenges for the envisioned self-adaptive CPS. In addition, we outline our results in approaching these challenges with SMARTSOS [10], a generic approach based on extensions of graph transformation systems that employs open and adaptive collaborations and models at runtime for trustworthy self-adaptation, self-organization, and evolution at both the individual-system and the system-of-systems level, taking the independent development, operation, management, and evolution of these systems into account.
Background: Infliximab (IFX), an anti-TNF monoclonal antibody approved for the treatment of inflammatory bowel disease, is dosed per kg body weight (BW). However, the rationale for body size adjustment has not been unequivocally demonstrated [1], and first attempts to improve IFX therapy have been undertaken [2]. The aim of our study was to assess the impact of different dosing strategies (i.e. body size-adjusted and fixed dosing) on drug exposure and pharmacokinetic (PK) target attainment. For this purpose, a comprehensive simulation study was performed, using patient characteristics (n=116) from an in-house clinical database.
Methods: IFX concentration-time profiles of 1000 virtual, clinically representative patients were generated using a previously published PK model for IFX in patients with Crohn's disease [3]. For each patient, 1000 profiles accounting for PK variability were considered. The IFX exposure during maintenance treatment was compared under the following dosing strategies: i) fixed dose, and dosing per ii) BW, iii) lean BW (LBW), iv) body surface area (BSA), v) height (HT), vi) body mass index (BMI) and vii) fat-free mass (FFM). For each dosing strategy, the variability in maximum concentration Cmax, minimum concentration Cmin (= C8weeks) and area under the concentration-time curve (AUC), as well as the percentage of patients achieving the PK target, Cmin ≥ 3 μg/mL [4], were assessed.
Results: For all dosing strategies, the variability of Cmin (CV ≈ 110%) was higher than that of Cmax and AUC, and was of similar extent regardless of dosing strategy. The proportion of patients reaching the PK target (≈ ⅓) was approximately equal for all dosing strategies.
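The shape of such a simulation study can be sketched in a few lines: draw virtual patients, apply a dosing rule, and summarize trough-level variability and target attainment. The one-compartment model, parameter values, and variability magnitudes below are illustrative assumptions, not the published IFX population PK model.

```python
import math
import random

def simulate_cmin(dose_fn, n_patients=2000, seed=1):
    """Monte Carlo sketch: steady-state trough Cmin = Dose/V * exp(-CL/V * tau)
    for an 8-week dosing interval. Returns (CV of Cmin, fraction with
    Cmin >= 3 ug/mL). All parameter values are illustrative."""
    rng = random.Random(seed)
    tau = 8 * 7 * 24.0  # dosing interval in hours
    cmins = []
    for _ in range(n_patients):
        bw = max(40.0, rng.gauss(75, 15))          # body weight, kg
        cl = 0.015 * math.exp(rng.gauss(0, 0.3))   # clearance L/h, log-normal IIV
        v = 5.0 * bw / 75.0                        # volume of distribution, L
        dose = dose_fn(bw)                         # dose in mg, per dosing rule
        cmins.append(dose / v * math.exp(-cl / v * tau))
    mean = sum(cmins) / len(cmins)
    sd = (sum((c - mean) ** 2 for c in cmins) / len(cmins)) ** 0.5
    target = sum(c >= 3.0 for c in cmins) / len(cmins)
    return sd / mean, target

cv_bw, hit_bw = simulate_cmin(lambda bw: 5.0 * bw)   # 5 mg/kg body-weight dosing
cv_fix, hit_fix = simulate_cmin(lambda bw: 375.0)    # fixed dose
print(round(cv_bw, 2), round(cv_fix, 2))
```

Because the exponential elimination term dominates the trough level, both dosing rules leave Cmin highly variable, mirroring the study's finding that the choice of body-size descriptor barely changes Cmin variability or target attainment.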
Lately, first implementations of Internet of Things (IoT) technologies have begun to penetrate industrial value-adding processes. As a result, the competence requirements for employees are changing. Employees' organization, process, and interaction competences are of crucial importance in this new IoT environment; however, they are not yet sufficiently considered in academic and vocational training. At the same time, conventional learning factories are evolving into digital learning factories. Nevertheless, the integration of IoT technology and its use for training in digital learning factories has been largely neglected thus far. Existing learning factories do not explicitly and properly consider IoT technology, which leads to deficiencies in the development of employees' Industrial IoT competences. The goal of this contribution is to present a didactic concept that enables the development and training of these newly demanded competences using an IoT laboratory. For this purpose, a design science approach is applied. The result of this contribution is a didactic concept for the development of Industrial IoT competences in an IoT laboratory.
Editorial
(2017)
Photonic sensing in highly concentrated biotechnical processes by photon density wave spectroscopy
(2017)
Photon Density Wave (PDW) spectroscopy is introduced as a new approach to photonic sensing in highly concentrated biotechnological processes. It independently quantifies the absorption coefficient and the reduced scattering coefficient, calibration-free and as a function of time, thus describing the optical properties of the biomaterial in the vis/NIR range during processing. As examples of industrial relevance, enzymatic milk coagulation, beer mashing, and algae cultivation in photobioreactors are discussed.
New data from the LEADER trial show that the glucagon-like peptide 1 receptor agonist liraglutide protects against diabetic nephropathy in patients with type 2 diabetes mellitus. The renoprotective efficacy of liraglutide is not, however, as great as that reported for the sodium-glucose cotransporter 2 inhibitor empagliflozin in the EMPA-REG OUTCOME trial.
Preclinical studies in cell culture systems as well as in whole-animal chronic kidney disease (CKD) models showed that parathyroid hormone (PTH) oxidized at its two methionine residues (positions 8 and 18) loses its function. This has so far not been considered in the development of the PTH assays used in current clinical practice. Patients with advanced CKD are subject to oxidative stress, and plasma proteins (including PTH) are targets for oxidants. In patients with CKD, a considerable but variable fraction (about 70 to 90%) of the measured PTH appears to be oxidized. Oxidized PTH (oxPTH) does not interact with the PTH receptor, resulting in a loss of biological activity. Currently used intact PTH (iPTH) assays detect both oxPTH and non-oxidized PTH (n-oxPTH). Clinical studies demonstrated that bioactive n-oxPTH, but neither iPTH nor oxPTH, is associated with mortality in CKD patients.
Emergency Care in Germany Being Re-assessed: Hybrid Medical Care Model Seen as Potential Answer
(2017)
Editorial
(2017)
Preface
(2017)