In this article the author proposes a new reading for the opening words of the Bible, "In the beginning God created the heaven and the earth. Now the earth was unformed and void ... ; and the spirit of God hovered over the water" (Gen. 1:1-2). This new reading is based on the connections drawn by Otto Eissfeldt between the Ugaritic literature and the Bible. God, according to this opening picture, connects intimately, empathetically, with the existing matter (the tehom) in dialogic address. It is from this relationship, which today we call "love," that all comes to be "born" from the material "womb" of the tehom. From this "big bang," all continues to be born.
This study aimed to estimate the optimal body size, limb segment length, and girth or breadth ratios of 100-m breaststroke performance in youth swimmers. In total, 59 swimmers [male: n= 39, age = 11.5 (1.3) y; female: n= 20, age = 12.0 (1.0) y] participated in this study. To identify size/shape characteristics associated with 100-m breaststroke swimming performance, we computed a multiplicative allometric log-linear regression model, which was refined using backward elimination. Results showed that the 100-m breaststroke performance revealed a significant negative association with fat mass and a significant positive association with the segment length ratio (arm ratio = hand length/forearm length) and limb girth ratio (girth ratio = forearm girth/wrist girth). In addition, leg length, biacromial breadth, and biiliocristal breadth revealed significant positive associations with the 100-m breaststroke performance. However, height and body mass did not contribute to the model, suggesting that the advantage of longer levers was limb-specific rather than a general whole-body advantage. In fact, it is only by adopting multiplicative allometric models that the previously mentioned ratios could have been derived. These results highlighted the importance of considering anthropometric characteristics of youth breaststroke swimmers for talent identification and/or athlete monitoring purposes. In addition, these findings may assist orienting swimmers to the appropriate stroke based on their anthropometric characteristics.
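The multiplicative allometric approach used in the study can be illustrated with a minimal sketch: taking logarithms turns the multiplicative model into a linear one, so ordinary least squares recovers the exponents. All predictor names, coefficient values and data below are hypothetical, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# hypothetical anthropometric predictors (names follow the abstract's ratios)
fat = rng.uniform(5.0, 20.0, n)    # fat mass (kg)
arm = rng.uniform(0.7, 1.1, n)     # arm ratio = hand length / forearm length
girth = rng.uniform(1.3, 1.8, n)   # girth ratio = forearm girth / wrist girth

# multiplicative allometric model: time = c * fat^a * arm^b * girth^d * error
a_true, b_true, d_true, c_true = 0.15, -0.40, -0.30, 80.0
time = (c_true * fat**a_true * arm**b_true * girth**d_true
        * np.exp(rng.normal(0.0, 0.01, n)))

# log-transforming linearises the model, so OLS estimates the exponents
X = np.column_stack([np.ones(n), np.log(fat), np.log(arm), np.log(girth)])
coef, *_ = np.linalg.lstsq(X, np.log(time), rcond=None)
# coef is approximately [log c, a, b, d]
```

In the study itself the model was additionally refined by backward elimination, which simply drops predictors whose exponents do not differ significantly from zero.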
The Gongjue basin in the eastern Qiangtang terrane is located in the transition region where the regional structural lineation curves from east-west-oriented in Tibet to north-south-oriented in Yunnan. In this study, we sampled the red beds in the basin from the lower Gongjue to the upper Ranmugou formations, for the first time covering the entire stratigraphic profile. The stratigraphic ages are bracketed within 53-43 Ma by new detrital zircon U-Pb ages constraining the maximum depositional age to 52.5 ± 1.5 Ma. Rock magnetic and petrographic studies indicate that detrital magnetite and hematite are the magnetic carriers. Positive reversal and fold tests demonstrate that the characteristic remanent magnetization has a primary origin. The Gongjue and Ranmugou formations yield mean characteristic remanent magnetization directions of Ds/Is = 31.0°/21.3° and Ds/Is = 15.9°/22.0°, respectively. The magnetic inclination of these characteristic remanent magnetizations is significantly shallowed compared with the expected inclination for the locality. However, the elongation/inclination correction method does not provide a meaningful correction, likely because of syn-depositional rotation. Rotations relative to the Eurasian apparent polar wander path occurred in three stages: Stage I, 33.3 ± 3.4° clockwise rotation during deposition of the Gongjue and lower Ranmugou formations; Stage II, 26.9 ± 3.7° counterclockwise rotation during deposition of the lower and middle Ranmugou formation; and Stage III, 17.7 ± 3.3° clockwise rotation after 43 Ma. The complex rotation history recorded in the basin is possibly linked to sinistral shear along the Qiangtang block during the indentation of India into Asia and the early stage of the extrusion of the northwestern Indochina blocks away from eastern Tibet.
A balance to death
(2018)
Leaf senescence plays a crucial role in nutrient recovery in late-stage plant development and requires vast transcriptional reprogramming by transcription factors such as ORESARA1 (ORE1). A proteolytic mechanism is now found to control ORE1 degradation, and thus senescence, during nitrogen starvation.
Ecological communities are complex adaptive systems that exhibit remarkable feedbacks between their biomass and trait dynamics. Trait-based aggregate models cope with this complexity by focusing on the temporal development of the community’s aggregate properties such as its total biomass, mean trait and trait variance. They are based on particular assumptions about the shape of the underlying trait distribution, which is commonly assumed to be normal. However, ecologically important traits are usually restricted to a finite range, and empirical trait distributions are often skewed or multimodal. As a result, normal distribution-based aggregate models may fail to adequately represent the biomass and trait dynamics of natural communities. We resolve this mismatch by developing a new moment closure approach assuming the trait values to be beta-distributed. We show that the beta distribution captures important shape properties of both observed and simulated trait distributions, which cannot be captured by a Gaussian. We further demonstrate that a beta distribution-based moment closure can strongly enhance the reliability of trait-based aggregate models. We compare the biomass, mean trait and variance dynamics of a full trait distribution (FD) model to the ones of beta (BA) and normal (NA) distribution-based aggregate models, under different selection regimes. This way, we demonstrate under which general conditions (stabilizing, fluctuating or disruptive selection) different aggregate models are reliable tools. All three models predicted very similar biomass and trait dynamics under stabilizing selection yielding unimodal trait distributions with small standing trait variation. We also obtained an almost perfect match between the results of the FD and BA models under fluctuating selection, promoting skewed trait distributions and ongoing oscillations in the biomass and trait dynamics. 
In contrast, the NA model showed unrealistic trait dynamics and exhibited different alternative stable states, and thus a high sensitivity to initial conditions under fluctuating selection. Under disruptive selection, both aggregate models failed to reproduce the results of the FD model with the mean trait values remaining within their ecologically feasible ranges in the BA model but not in the NA model. Overall, a beta distribution-based moment closure strongly improved the realism of trait-based aggregate models.
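The beta-based moment closure rests on recovering the distribution's parameters from the tracked aggregate properties (mean trait and trait variance). A minimal sketch of that method-of-moments step, and of the skewness a beta distribution can represent while a Gaussian with the same mean and variance cannot (illustrative values, not the study's):

```python
import math

def beta_params(mean, var):
    # method-of-moments closure: recover Beta(alpha, beta) on [0, 1]
    # from the tracked mean and variance (requires var < mean * (1 - mean))
    nu = mean * (1.0 - mean) / var - 1.0
    return mean * nu, (1.0 - mean) * nu

def beta_skewness(alpha, beta):
    # closed-form skewness of Beta(alpha, beta); zero only when alpha == beta
    return (2.0 * (beta - alpha) * math.sqrt(alpha + beta + 1.0)
            / ((alpha + beta + 2.0) * math.sqrt(alpha * beta)))

# illustrative aggregate state: mean trait 0.3, trait variance 0.02
alpha, beta = beta_params(0.3, 0.02)
skew = beta_skewness(alpha, beta)
# a normal-distribution closure with the same mean and variance forces skew = 0
```

The closure also respects the finite trait range automatically, since the beta distribution has support [0, 1] by construction.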
On 6 June 1982, Israel invaded Lebanon to fight the Palestine Liberation Organization (PLO). Between August 1982 and February 1984, the US, France, Britain and Italy deployed a Multinational Force (MNF) to Beirut. Its task was to act as an interposition force to bolster the government and to bring peace to the people. The mission is often forgotten or remembered merely in connection with the bombing of the US Marines' barracks. However, an analysis of the Italian contingent shows that the MNF was not doomed to fail and could accomplish its task when operational and diplomatic efforts were coordinated. The Italian commander in Beirut, General Franco Angioni, followed a successful approach built on neutrality, respectful behaviour and minimal force, which resulted in a qualified success of the Italian efforts.
Stochastically triggered photospheric light variations reaching ~40 mmag peak-to-valley amplitudes have been detected in the O8 Iaf supergiant V973 Scorpii as the outcome of 2 months of high-precision time-resolved photometric observations with the BRIght Target Explorer (BRITE) nanosatellites. The amplitude spectrum of the time-series photometry exhibits a pronounced broad bump in the low-frequency regime (≲0.9 d⁻¹), where several prominent frequencies are detected. A time-frequency analysis of the observations reveals typical mode lifetimes of the order of 5-10 d. The overall features of the observed brightness amplitude spectrum of V973 Sco match well with those extrapolated from two-dimensional hydrodynamical simulations of convectively driven internal gravity waves randomly excited deep in the convective cores of massive stars. An alternative or additional possible source of excitation from a sub-surface convection zone needs to be explored in future theoretical investigations.
We analyze the problem of response suggestion in a closed domain along a real-world scenario of a digital library. We present a text-processing pipeline to generate question-answer pairs from chat transcripts. On this limited amount of training data, we compare retrieval-based, conditioned-generation, and dedicated representation learning approaches for response suggestion. Our results show that retrieval-based methods that strive to find similar, known contexts are preferable over parametric approaches from the conditioned-generation family, when the training data is limited. We, however, identify a specific representation learning approach that is competitive to the retrieval-based approaches despite the training data limitation.
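A minimal sketch of the retrieval-based idea, using bag-of-words cosine similarity over hypothetical question-answer pairs (the paper's actual pipeline, similarity model and chat data are not reproduced here):

```python
from collections import Counter
import math

# hypothetical question-answer pairs mined from chat transcripts
qa_pairs = [
    ("how do I renew a borrowed book",
     "Use the 'renew' button in your library account."),
    ("what are the opening hours",
     "The library is open 9am-8pm on weekdays."),
    ("how can I access e-journals from home",
     "Log in via the VPN, then open the e-journal portal."),
]

def vec(text):
    # bag-of-words term counts as a sparse vector
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest(query):
    # retrieval-based suggestion: answer of the most similar known question
    q = vec(query)
    return max(qa_pairs, key=lambda pair: cosine(q, vec(pair[0])))[1]

answer = suggest("opening hours today?")
```

This nearest-known-context lookup is what makes retrieval-based methods robust when training data is scarce: no parameters have to be estimated from the limited transcripts.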
A close call
(2018)
The present study investigated how lexical selection is influenced by the number of semantically related representations (semantic neighbourhood density) and their similarity (semantic distance) to the target in a speeded picture-naming task. Semantic neighbourhood density and similarity were used as continuous variables to assess lexical selection, for which both competitive and noncompetitive mechanisms have been proposed. Previous studies found mixed effects of semantic neighbourhood variables, leaving this issue unresolved. Here, we demonstrate an interference effect of semantic neighbourhood similarity: naming responses were less accurate, and semantic errors and omissions more likely relative to accurate responses, for words with semantically more similar (closer) neighbours. No main effect of semantic neighbourhood density and no interaction between semantic neighbourhood density and similarity were found. We further assessed whether semantic neighbourhood density can affect naming performance once semantic neighbours exceed a certain degree of semantic similarity. Semantic similarity between the target and each neighbour was used to split semantic neighbourhood density into two density variables: the number of semantically close neighbours versus distant neighbours. The results showed a significant effect of close, but not of distant, semantic neighbourhood density: naming pictures of targets with more close semantic neighbours led to longer naming latencies, less accurate responses, and a higher likelihood of producing semantic errors and omissions over accurate responses. The results show that word-inherent semantic attributes such as semantic neighbourhood similarity and the number of coactivated close semantic neighbours modulate lexical selection, supporting theories of competitive lexical processing.
To explore the genetic determinants of obesity and Type 2 diabetes (T2D), the German Center for Diabetes Research (DZD) conducted crossbreedings of the obese and diabetes-prone New Zealand Obese mouse strain with four different lean strains (B6, DBA, C3H, 129P2) that vary in their susceptibility to develop T2D. Genome-wide linkage analyses localized more than 290 quantitative trait loci (QTL) for obesity, 190 QTL for diabetes-related traits and 100 QTL for plasma metabolites in the out-cross populations. A computational framework was developed that made it possible to refine critical regions and to nominate a small number of candidate genes by integrating reciprocal haplotype mapping and transcriptome data. The efficiency of this complex procedure was demonstrated for one obesity QTL: the genomic interval of 35 Mb with 502 annotated candidate genes was narrowed down to six candidates. Accordingly, congenic mice retained the obesity phenotype owing to an interval that contains three of the six candidate genes. Among these, the phospholipase PLA2G4A exhibited elevated expression in adipose tissue of obese human subjects and is therefore a critical regulator of the obesity locus. Together, our broad and complex approach demonstrates that combined- and comparative-cross analysis improves mapping resolution and represents a valid tool for the identification of disease genes.
We present new radio/millimeter measurements of the hot magnetic star HR5907 obtained with the VLA and ALMA interferometers. We find that HR5907 is the most radio-luminous early-type star in the cm-mm band among those presently known. Its multi-wavelength radio light curves are strongly variable, with an amplitude that increases with radio frequency. The radio emission can be explained by populations of non-thermal electrons accelerated in the current sheets at the outer border of the magnetosphere of this fast-rotating magnetic star. We classify HR5907 as another member of the growing class of strongly magnetic, fast-rotating hot stars in whose magnetospheres the gyro-synchrotron emission mechanism operates efficiently. The new radio observations of HR5907 are combined with archival X-ray data to study the physical conditions of its magnetosphere. The X-ray spectra of HR5907 show tentative evidence for the presence of a non-thermal spectral component. We suggest that the non-thermal X-rays originate in a stellar X-ray aurora produced by streams of non-thermal electrons impacting on the stellar surface. Taking advantage of the relation between the spectral index of the X-ray power-law spectrum and the non-thermal electron energy distribution, we perform 3-D modelling of the radio emission of HR5907. The wavelength-dependent radio light curves probe magnetospheric layers at different heights above the stellar surface. A detailed comparison between simulated and observed radio light curves leads us to conclude that the stellar magnetic field of HR5907 is likely non-dipolar, providing further indirect evidence of its complex magnetic field topology.
Risk-based insurance is a commonly proposed flood risk adaptation mechanism in policy debates across the world, such as in the United Kingdom and the United States of America. However, both risk-based premiums and growing risk make it increasingly difficult for insurance to remain affordable. Although the affordability of adaptation strategies is an important concern for policymakers, the concept is rarely examined empirically; what is needed is a robust metric with a commonly acceptable affordability threshold. Such a metric quantifies a previously normative concept in monetary terms, making it more suitable for integration into public policy debates. This paper investigates the degree to which risk-based flood insurance premiums are unaffordable in Europe. In addition, it compares the outcomes generated by three different definitions of unaffordability in order to identify the most robust definition. The residual-income definition was found to be the least sensitive to changes in the threshold. While this paper focuses on Europe, the selected definition can be employed elsewhere in the world and across adaptation measures in order to develop a common metric for indicating potential unaffordability problems.
Home range estimation is routine practice in ecological research. While advances in animal tracking technology have increased our capacity to collect data to support home range analysis, these same advances have also resulted in increasingly autocorrelated data. Consequently, the question of which home range estimator to use on modern, highly autocorrelated tracking data remains open. This question is particularly relevant given that most estimators assume independently sampled data. Here, we provide a comprehensive evaluation of the effects of autocorrelation on home range estimation. We base our study on an extensive data set of GPS locations from 369 individuals representing 27 species distributed across five continents. We first assemble a broad array of home range estimators, including Kernel Density Estimation (KDE) with four bandwidth optimizers (Gaussian reference function, autocorrelated-Gaussian reference function [AKDE], Silverman's rule of thumb, and least squares cross-validation), Minimum Convex Polygon, and Local Convex Hull methods. Notably, all of these estimators except AKDE assume independent and identically distributed (IID) data. We then employ half-sample cross-validation to objectively quantify estimator performance, and the recently introduced effective sample size for home range area estimation (N̂_area) to quantify the information content of each data set. We found that AKDE 95% area estimates were larger than conventional IID-based estimates by a mean factor of 2. The median number of cross-validated locations included in the hold-out sets by AKDE 95% (or 50%) estimates was 95.3% (or 50.1%), confirming that the larger AKDE ranges were appropriately selective at the specified quantile. Conversely, conventional estimates exhibited negative bias that increased with decreasing N̂_area. To contextualize our empirical results, we performed a detailed simulation study to tease apart how sampling frequency, sampling duration, and the focal animal's movement conspire to affect range estimates. Paralleling our empirical results, the simulation study demonstrated that AKDE was generally more accurate than conventional methods, particularly for small N̂_area. While 72% of the 369 empirical data sets had >1,000 total observations, only 4% had an N̂_area >1,000, whereas 30% had an N̂_area <30. In this frequently encountered scenario of small N̂_area, AKDE was the only estimator capable of producing an accurate home range estimate on autocorrelated data.
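The gap between nominal and effective sample size that drives these results can be illustrated with a sketch: an Ornstein-Uhlenbeck process (a standard model of range-resident movement) sampled densely yields many fixes but few statistically independent ones, roughly the sampling duration divided by the position autocorrelation timescale. Parameter values below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 10.0        # position autocorrelation timescale (days)
dt = 0.1          # sampling interval (days)
duration = 100.0  # sampling duration (days)
n = int(duration / dt)

# discretized Ornstein-Uhlenbeck position process (one coordinate)
phi = np.exp(-dt / tau)
x = np.zeros(n)
for i in range(1, n):
    x[i] = phi * x[i - 1] + np.sqrt(1.0 - phi**2) * rng.normal()

# nominal sample size: n fixes; effective sample size for range
# estimation: roughly the number of independent range crossings
n_eff = duration / tau
```

Here 1,000 recorded fixes carry only on the order of 10 independent pieces of information about the range, which is why IID-based estimators underestimate areas on such data.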
Narcissists are assumed to lack the motivation and ability to share and understand the mental states of others. Prior empirical research, however, has yielded inconclusive findings and has differed with respect to the specific aspects of narcissism and socioemotional cognition that have been examined. Here, we propose a differentiated facet approach that can be applied across research traditions and that distinguishes between facets of narcissism (agentic vs. antagonistic) on the one hand, and facets of socioemotional cognition ability (SECA; self-perceived vs. actual) on the other. Using five nonclinical samples in two studies (total N = 602), we investigated the effect of facets of grandiose narcissism on aspects of socioemotional cognition across measures of affective and cognitive empathy, Theory of Mind, and emotional intelligence, while also controlling for general reasoning ability. Across both studies, agentic facets of narcissism were found to be positively related to perceived SECA, whereas antagonistic facets of narcissism were found to be negatively related to perceived SECA. However, both narcissism facets were negatively related to actual SECA. Exploratory condition-based regression analyses further showed that agentic narcissists had a higher directed discrepancy between perceived and actual SECA: They self-enhanced their socio-emotional capacities. Implications of these results for the multifaceted theoretical understanding of the narcissism-SECA link are discussed.
We present a computational evaluation of three hypotheses about sources of deficit in sentence comprehension in aphasia: slowed processing, intermittent deficiency, and resource reduction. The ACT-R-based Lewis and Vasishth (2005) model is used to implement these three proposals. Slowed processing is implemented as slowed execution time of parse steps; intermittent deficiency as increased random noise in activation of elements in memory; and resource reduction as reduced spreading activation. As data, we considered subject vs. object relative sentences, presented in a self-paced listening modality to 56 individuals with aphasia (IWA) and 46 matched controls. The participants heard the sentences and carried out a picture verification task to decide on an interpretation of the sentence. These response accuracies are used to identify the best parameters (for each participant) that correspond to the three hypotheses mentioned above. We show that controls have more tightly clustered (less variable) parameter values than IWA; specifically, compared to controls, among IWA there are more individuals with slow parsing times, high noise, and low spreading activation. We find that (a) individual IWA show differential amounts of deficit along the three dimensions of slowed processing, intermittent deficiency, and resource reduction, (b) overall, there is evidence for all three sources of deficit playing a role, and (c) IWA have a more variable range of parameter values than controls. An important implication is that it may be meaningless to talk about sources of deficit with respect to an abstract "average" IWA; the focus should be on the individual's differential degrees of deficit along different dimensions, and on understanding the causes of variability in deficit between participants.
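The three parameter manipulations can be sketched with ACT-R-style retrieval equations (a simplified illustration with made-up values, not the authors' implementation; the paper slows parse-step execution time, which this sketch approximates by scaling the latency factor F):

```python
import math
import random

def retrieval(base, cues, ga=1.0, ans=0.2, latency_f=0.5, seed=0):
    # ACT-R-style retrieval: activation = base-level activation
    # + spreading activation from cues (scaled by GA)
    # + activation noise (scale ANS, Gaussian stand-in for ACT-R's
    #   logistic noise); retrieval latency = F * exp(-activation)
    noise = random.Random(seed).gauss(0.0, ans)
    activation = base + ga * sum(cues) + noise
    return activation, latency_f * math.exp(-activation)

cues = [0.5, 0.5]
control = retrieval(1.0, cues)
slowed = retrieval(1.0, cues, latency_f=1.5)  # slowed processing
noisy = retrieval(1.0, cues, ans=1.0)         # intermittent deficiency
reduced = retrieval(1.0, cues, ga=0.3)        # resource reduction
```

With the same noise draw, reduced spreading activation lowers activation (and thus accuracy in the full model), a larger latency factor lengthens retrieval times, and a larger noise scale makes activation, and hence behaviour, more erratic from trial to trial.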
BACKGROUND: Work capacity demands are a concept to describe which psychological capacities are required in a job. Assessing psychological work capacity demands is of specific importance when mental health problems at work endanger work ability. Exploring psychological work capacity demands is the basis for mental hazard analysis or rehabilitative action, e.g. in terms of work adjustment. OBJECTIVE: This is the first study investigating psychological work capacity demands in rehabilitation patients with and without mental disorders. METHODS: A structured interview on psychological work capacity demands (Mini-ICF-Work; Muschalla, 2015; Linden et al., 2015) was done with 166 rehabilitation patients of working age. All interviews were done by a state-licensed socio-medically trained psychotherapist. Inter-rater-reliability was assessed by determining agreement in independent co-rating in 65 interviews. For discriminant validity purposes, participants filled in the Short Questionnaire for Work Analysis (KFZA, Prumper et al., 1994). RESULTS: In different professional fields, different psychological work capacity demands were of importance. The Mini-ICF-Work capacity dimensions reflect different aspects than the KFZA. Patients with mental disorders were longer on sick leave and had worse work ability prognosis than patients without mental disorders, although both groups reported similar work capacity demands. CONCLUSIONS: Psychological work demands - which are highly relevant for work ability prognosis and work adjustment processes - can be explored and differentiated in terms of psychological capacity demands.
The literature contains a sizable number of publications where weather types are used to decompose climate shifts or trends into contributions of frequency and mean of those types. They are all based on the product rule, that is, a transformation of a product of sums into a sum of products, the latter providing the decomposition. While there is nothing to argue about the transformation itself, its interpretation as a climate shift or trend decomposition is bound to fail. While the case of a climate shift may be viewed as an incomplete description of a more complex behaviour, trend decomposition indeed produces bogus trends, as demonstrated by a synthetic counterexample with well-defined trends in type frequency and mean. Consequently, decompositions based on that transformation, be it for climate shifts or trends, must not be used.
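For concreteness, the transformation in question can be written out (notation mine, not the paper's): with type frequencies f_i and within-type means x_i, the climate mean is a frequency-weighted sum, and the product rule splits its change into a frequency term, a within-type term and a cross term,

```latex
% climate mean as a frequency-weighted sum over weather types i
\bar{x} = \sum_i f_i \, x_i ,
% the product rule, with f_i and x_i evaluated in the reference period:
\qquad
\Delta\bar{x}
  = \sum_i (\Delta f_i)\, x_i
  + \sum_i f_i \, (\Delta x_i)
  + \sum_i (\Delta f_i)(\Delta x_i) .
```

The identity itself is exact; the paper's point is that interpreting its individual terms as separate frequency and mean contributions to a climate shift or trend is what fails.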
Microservice Architectures (MSA) structure applications as collections of loosely coupled services that implement business capabilities. The key advantages of MSA include inherent support for continuous deployment of large, complex applications, agility and enhanced productivity. However, studies indicate that most MSA are homogeneous and introduce shared vulnerabilities, leaving them open to multi-step attacks and offering economies-of-scale incentives to attackers. In this paper, we address the issue of shared vulnerabilities in microservices with a novel solution based on the concept of Moving Target Defenses (MTD). Our mechanism works by performing risk analysis against microservices to detect and prioritize vulnerabilities. Thereafter, security-risk-oriented software diversification is employed, guided by a defined diversification index. The diversification is performed at runtime, leveraging both model- and template-based automatic code generation techniques to automatically transform the programming languages and container images of the microservices. Consequently, the microservices' attack surfaces are altered, introducing uncertainty for attackers while reducing the attackability of the microservices. Our experiments demonstrate the efficiency of our solution, with an average attack surface randomization success rate of over 70%.
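As a minimal, purely illustrative sketch of risk-oriented diversification (the service names, risk scores and assignment rule below are invented for illustration, not the paper's algorithm or diversification index):

```python
# hypothetical services with CVSS-like risk scores (invented values)
services = {"auth": 9.8, "catalog": 4.3, "billing": 7.5}
# available implementation variants (languages / container images)
variants = ["go", "java", "python"]

# risk-oriented diversification: walk the services in descending risk
# order and assign each a distinct runtime variant, so that shared
# vulnerabilities are broken up starting with the riskiest service
ranked = sorted(services, key=services.get, reverse=True)
assignment = {name: variants[i % len(variants)]
              for i, name in enumerate(ranked)}
```

The effect mirrors the idea in the abstract: once the services no longer share one homogeneous stack, a single exploit can no longer be reused across them in a multi-step attack.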
Faced with the increasing needs of companies, decision makers find the optimal dimensioning of IT hardware increasingly challenging. In analytical infrastructures, a highly evolutionary environment causes volatile, time-dependent workloads in its components, making intelligent, flexible task distribution between local systems and cloud services attractive. With the aim of developing a flexible and efficient design for analytical infrastructures, this paper proposes a flexible architecture model that allocates tasks according to a machine-specific decision heuristic. A simulation benchmarks this system against existing strategies and identifies the new decision maxim as superior in a first scenario-based simulation.