In recent decades, astronomy has seen a boom in large-scale stellar surveys of the Galaxy. The detailed information obtained for millions of individual stars in the Milky Way brings us a step closer to answering one of the most outstanding questions in astrophysics: how do galaxies form and evolve? The Milky Way is the only galaxy in which we can dissect many stars into their high-dimensional chemical composition and complete phase space, which, much like fossil records, can unveil the history of the Galaxy's genesis. The processes that lead to the formation of large structures such as the Milky Way are critical for constraining cosmological models; this line of study is called Galactic archaeology, or near-field cosmology.
At the core of this work, we present a collection of efforts to chemically and dynamically characterise the disks and bulge of our Galaxy. The results presented in this thesis have only been possible thanks to the advent of the Gaia astrometric satellite, which has revolutionised the field of Galactic archaeology by precisely measuring the positions, parallax distances and motions of more than a billion stars. Another, no less important, breakthrough is the APOGEE survey, which has observed spectra in the near-infrared, peering into the dusty regions of the Galaxy and allowing us to determine detailed chemical-abundance patterns for hundreds of thousands of stars. To accurately depict the Milky Way's structure, we use and develop the Bayesian isochrone-fitting code StarHorse; this software predicts stellar distances, extinctions and ages by combining astrometry, photometry and spectroscopy with stellar evolutionary models. The StarHorse code is pivotal for calculating distances where Gaia parallaxes alone do not allow accurate estimates.
We show that by combining Gaia, APOGEE and photometric surveys with StarHorse, we can produce a chemical cartography of the Milky Way disks from their outermost to innermost parts. Such a map is unprecedented for the inner Galaxy. It reveals a continuity of the bimodal chemical pattern previously detected in the solar neighbourhood, indicating two populations with distinct formation histories. Furthermore, the data reveal a chemical gradient within the thin disk, where the content of 𝛼-process elements and metals is higher towards the centre. Focusing on a sample in the inner Milky Way, we confirm that the chemical duality extends to the innermost regions of the Galaxy. We find that stars on bar-shaped orbits show both high- and low-𝛼 abundances, suggesting the bar formed by secular evolution, trapping stars that already existed. By analysing the chemical-orbital space of the inner Galactic regions, we disentangle the multiple populations that inhabit this complex region. We reveal the presence of the thin disk, the thick disk, the bar, and a counter-rotating population that resembles the outcome of a perturbed proto-Galactic disk. Our study also finds that the inner Galaxy hosts a large number of super-metal-rich stars, with metallicities up to three times solar, suggesting it is a possible repository of the old super-metal-rich stars found in the solar neighbourhood.
We also take on the complicated task of deriving individual stellar ages. With StarHorse, we calculate the ages of main-sequence turn-off and sub-giant stars for several public spectroscopic surveys. We validate our results by investigating linear relations between chemical abundances and time, since the 𝛼 and neutron-capture elements are sensitive to age, reflecting the different enrichment timescales of these elements. To study the disks in the solar neighbourhood further, we use an unsupervised machine-learning algorithm to delineate a multidimensional separation of chrono-chemical stellar groups, revealing the chemical thick disk, the thin disk, and young 𝛼-rich stars. The thick disk is shown to have a small age dispersion, indicating its fast formation, in contrast to the thin disk, which spans a wide range of ages.
Built on groundbreaking data, this thesis provides a detailed chemo-dynamical view of the disk and bulge of our Galaxy. Our findings on the Milky Way can be linked to the evolution of high-redshift disk galaxies, helping to solve the conundrum of galaxy formation.
Many widely used observational data sets comprise several overlapping instrument records. While data inter-calibration techniques often yield continuous and reliable data for trend analysis, less attention is generally paid to maintaining higher-order statistics such as variance and autocorrelation. A growing body of work uses these metrics to quantify the stability or resilience of a system under study and potentially to anticipate an approaching critical transition in the system. Exploring the degree to which changes in resilience indicators such as the variance or autocorrelation can be attributed to non-stationary characteristics of the measurement process – rather than actual changes in the dynamical properties of the system – is important in this context. In this work we use both synthetic and empirical data to explore how changes in the noise structure of a data set are propagated into the commonly used resilience metrics lag-one autocorrelation and variance. We focus on examples from remotely sensed vegetation indicators such as vegetation optical depth and the normalized difference vegetation index from different satellite sources. We find that time series resulting from mixing signals from sensors with varied uncertainties and covering overlapping time spans can lead to biases in inferred resilience changes. These biases are typically more pronounced when resilience metrics are aggregated (for example, by land-cover type or region), whereas estimates for individual time series remain reliable at reasonable sensor signal-to-noise ratios. Our work provides guidelines for the treatment and aggregation of multi-instrument data in studies of critical transitions and resilience.
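The measurement-noise effect described above can be illustrated with a minimal synthetic sketch (not the authors' actual analysis): an AR(1) process with fixed dynamics is observed by two hypothetical sensors with different noise levels, and the apparent variance and lag-one autocorrelation shift between the two record halves even though the system itself never changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underlying system: a stationary AR(1) process with fixed dynamics,
# so its true variance and lag-one autocorrelation never change.
phi, n = 0.8, 4000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, 1.0)

# Two hypothetical sensors observe the same system over consecutive
# spans with different measurement-noise levels (illustrative values,
# not taken from any real instrument record).
noise_a = rng.normal(0.0, 0.2, n // 2)   # low-noise sensor
noise_b = rng.normal(0.0, 1.0, n // 2)   # high-noise sensor
record = np.concatenate([x[: n // 2] + noise_a, x[n // 2:] + noise_b])

def ac1(y):
    """Lag-one autocorrelation of a 1-D series."""
    y = y - y.mean()
    return float(np.dot(y[:-1], y[1:]) / np.dot(y, y))

# Resilience metrics per sensor segment: the apparent variance rises
# and the apparent autocorrelation falls in the noisier half, purely
# because of the change in measurement noise.
var_a, var_b = record[: n // 2].var(), record[n // 2:].var()
ac_a, ac_b = ac1(record[: n // 2]), ac1(record[n // 2:])
print(var_a, var_b, ac_a, ac_b)
```

An analysis unaware of the sensor change could misread the jump between the two halves as a genuine change in the system's resilience.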
Background
The association between bivariate variables may not be homogeneous throughout the whole range of the variables. We present a new technique to describe inhomogeneity in the association of bivariate variables.
Methods
We consider the correlation of two normally distributed random variables. The 45° diagonal through the origin of coordinates represents the line on which all points would lie if the two variables completely agreed. If the two variables do not completely agree, the points will scatter on both sides of the diagonal and form a cloud. In case of a high association between the variables, the band width of this cloud will be narrow; in case of a low association, the band width will be wide. The band width directly relates to the magnitude of the correlation coefficient. We then determine the Euclidean distances between the diagonal and each point of the bivariate distribution, and rotate the coordinate system clockwise by 45°. The standard deviation of all Euclidean distances, named the “global standard deviation”, reflects the band width of all points along the former diagonal. Calculating moving averages of the standard deviation along the former diagonal results in “locally structured standard deviations”, which reflect patterns of “locally structured correlations” (LSC). LSC highlight inhomogeneity of bivariate correlations. We exemplify this technique by analyzing the association between body mass index (BMI) and hip circumference (HC) in 6313 healthy East German adults aged 18 to 70 years.
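The steps above can be sketched in a few lines of code on synthetic data (the paper's own example uses BMI and HC; the data-generating choices here are assumptions for illustration): rotate the coordinate system by 45°, take the signed distance of each point from the former diagonal, and compute a moving standard deviation along it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic bivariate data with deliberately inhomogeneous association:
# the scatter around the 45° diagonal grows with x (an assumption for
# illustration, mimicking a correlation that weakens along the range).
x = rng.uniform(0.0, 10.0, 2000)
y = x + rng.normal(0.0, 0.2 + 0.1 * x)

# Rotate the coordinate system clockwise by 45°: u runs along the former
# diagonal, v is the signed Euclidean distance of each point from it.
u = (x + y) / np.sqrt(2.0)
v = (y - x) / np.sqrt(2.0)

# Global standard deviation: band width of the whole cloud.
global_sd = v.std(ddof=1)

# Locally structured standard deviations: moving SD of v along u.
order = np.argsort(u)
window = 200
local_sd = np.array([
    v[order[i:i + window]].std(ddof=1)
    for i in range(len(u) - window + 1)
])

# The local SD rises along the former diagonal, flagging the region of
# weaker association that the single global SD would hide.
print(global_sd, local_sd[0], local_sd[-1])
```

In this sketch the band widens with u, so the moving standard deviation increases along the former diagonal, which is exactly the inhomogeneity signal LSC is designed to expose.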
Results
The correlation between BMI and HC in healthy adults is not homogeneous. LSC identifies regions where the predictive power of the bivariate correlation between BMI and HC increases or decreases and, in our example, highlights that slim people show a stronger association between BMI and HC than obese people.
Conclusion
Locally structured correlations (LSC) identify regions of higher or lower than average correlation between two normally distributed variables.
Physical activity and exercise are effective approaches in the prevention and therapy of multiple diseases. Although the specific characteristics of lengthening contractions have the potential to be beneficial in many clinical conditions, eccentric training is not commonly used in clinical populations with metabolic, orthopaedic, or neurologic conditions. The purpose of this pilot study is to investigate the feasibility, functional benefits, and systemic responses of an eccentric exercise program focused on the trunk and lower extremities in people with low back pain (LBP) and multiple sclerosis (MS). A six-week eccentric training program with three weekly sessions is performed by people with LBP and MS. The program consists of ten exercises addressing strength of the trunk and lower extremities. The study follows a four-group design (N = 12 per group) in two study centers (Israel and Germany). Three groups perform the eccentric training program: A) control group (healthy, asymptomatic); B) people with LBP; C) people with MS; group D (people with MS) receives standard care physiotherapy. Baseline measurements are conducted before the first training session, and post-measurements take place after the last session; both comprise blood sampling, self-reported questionnaires, mobility, balance, and strength testing. The feasibility of the eccentric training program will be evaluated using quantitative and qualitative measures related to the study process, compliance and adherence, safety, and overall program assessment. For a preliminary assessment of potential intervention effects, surrogate parameters related to mobility, postural control, muscle strength and systemic effects are assessed. The presented study will add knowledge regarding safety, feasibility, and initial effects of eccentric training in people with orthopaedic and neurological conditions. The simple exercises, which are easily modifiable in complexity and intensity, are likely beneficial to other populations.
Thus, multiple applications and implementation pathways for the herein presented training program are conceivable.
Recent research suggests that design thinking practices may foster the development of needed capabilities in new digitalised landscapes. However, existing publications represent individual contributions, and we lack a holistic understanding of the value of design thinking in a digital world. No review to date has offered a holistic retrospection of this research. In response, in this bibliometric review, we aim to shed light on the intellectual structure of the multidisciplinary design thinking literature related to capabilities relevant to the digital world in higher education and business settings, highlight current trends and suggest further studies to advance its theoretical and empirical underpinnings. Our study addresses this aim using bibliometric methods (bibliographic coupling and co-word analysis), as they are particularly suitable for identifying current trends and future research priorities at the forefront of the research. Overall, bibliometric analyses of publications dealing with the related topics in the last 10 years (extracted from the Web of Science database) expose six trends and two possible future research developments, highlighting the expanding scope of the design thinking scientific field related to capabilities required for the (more sustainable and human-centric) digital world. Relatedly, design thinking becomes a relevant approach to include in higher education curricula and human resources training to prepare students and workers for changing work demands. This paper is well-suited for education and business practitioners seeking to embed design thinking capabilities in their curricula and for design thinking and other scholars wanting to understand the field and possible directions for future research.
Inverted perovskite solar cells still suffer from significant non-radiative recombination losses at the perovskite surface and across the perovskite/C₆₀ interface, limiting the future development of perovskite-based single- and multi-junction photovoltaics. Therefore, more effective inter- or transport layers are urgently required. To tackle these recombination losses, we introduce ortho-carborane as an interlayer material that has a spherical molecular structure and three-dimensional aromaticity. Based on a variety of experimental techniques, we show that ortho-carborane decorated with phenylamino groups effectively passivates the perovskite surface and essentially eliminates the non-radiative recombination loss across the perovskite/C₆₀ interface with high thermal stability. We further demonstrate the potential of carborane as an electron transport material, facilitating electron extraction while blocking holes from the interface. The resulting inverted perovskite solar cells deliver a power conversion efficiency of over 23% with a low non-radiative voltage loss of 110 mV, and retain >97% of the initial efficiency after 400 h of maximum power point tracking. Overall, the designed carborane-based interlayer simultaneously enables passivation, electron transport and hole blocking, and paves the way toward more efficient and stable perovskite solar cells.
Achilles tendinopathy (AT) is a debilitating injury in athletes, especially for those engaged in repetitive stretch-shortening cycle activities. Clinical risk factors are numerous, but it has been suggested that altered biomechanics might be associated with AT. No systematic review has been conducted investigating these biomechanical alterations in specifically athletic populations. Therefore, the aim of this systematic review was to compare the lower-limb biomechanics of athletes with AT to athletically matched asymptomatic controls. Databases were searched for relevant studies investigating biomechanics during gait activities and other motor tasks such as hopping, isolated strength tasks, and reflex responses. Inclusion criteria for studies were an AT diagnosis in at least one group, cross-sectional or prospective data, at least one outcome comparing biomechanical data between an AT and a healthy group, and athletic populations. Studies were excluded if patients had Achilles tendon rupture/surgery, if participants reported injuries other than AT, or if only within-subject data were available. Effect sizes (Cohen's d) with 95% confidence intervals were calculated for relevant outcomes. The initial search yielded 4,442 studies. After screening, twenty studies (775 total participants) were synthesised, reporting on a wide range of biomechanical outcomes. Females were under-represented, and patients in the AT group were three years older on average. Biomechanical alterations were identified in some studies during running, hopping, jumping, strength tasks and reflex activity. Equally, several biomechanical variables studied were not associated with AT in the included studies, indicating a conflicting picture. Kinematics in AT patients appeared to be altered in the lower limb, potentially indicating a pattern of “medial collapse”.
Muscular activity of the calf and hips was different between groups, whereby AT patients exhibited greater calf electromyographic amplitudes despite lower plantar flexor strength. Overall, dynamic maximal strength of the plantar flexors, and isometric strength of the hips might be reduced in the AT group. This systematic review reports on several biomechanical alterations in athletes with AT. With further research, these factors could potentially form treatment targets for clinicians, although clinical approaches should take other contributing health factors into account. The studies included were of low quality, and currently no solid conclusions can be drawn.
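The review's effect-size computation can be sketched in a few lines. This is a generic textbook formula for Cohen's d with a normal-approximation confidence interval, not necessarily the exact procedure the review used, and the strength values below are purely hypothetical.

```python
import numpy as np

def cohens_d_ci(a, b, z=1.96):
    """Cohen's d for two independent groups with an approximate 95% CI
    (pooled SD and a normal-approximation standard error; a common
    textbook approach, assumed here for illustration)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n1, n2 = len(a), len(b)
    # Pooled standard deviation across the two groups
    s_pooled = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1))
                       / (n1 + n2 - 2))
    d = (a.mean() - b.mean()) / s_pooled
    # Approximate standard error of d
    se = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    return d, d - z * se, d + z * se

# Hypothetical example: plantar-flexor strength in an AT group vs controls
# (illustrative numbers only, not data from the review).
rng = np.random.default_rng(2)
at = rng.normal(90.0, 15.0, 30)
ctrl = rng.normal(100.0, 15.0, 30)
d, lo, hi = cohens_d_ci(at, ctrl)
print(round(d, 2), round(lo, 2), round(hi, 2))
```

A confidence interval that excludes zero would indicate a group difference unlikely to be explained by sampling variability alone, which is how such outcomes are typically interpreted in a synthesis like this.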
Intuitively, strongly constraining contexts should lead to stronger probabilistic representations of sentences in memory. Encountering unexpected words could therefore be expected to trigger costlier shifts in these representations than expected words. However, psycholinguistic measures commonly used to study probabilistic processing, such as the N400 event-related potential (ERP) component, are sensitive to word predictability but not to contextual constraint. Some research suggests that constraint-related processing cost may be measurable via an ERP positivity following the N400, known as the anterior post-N400 positivity (PNP). The PNP is argued to reflect update of a sentence representation and to be distinct from the posterior P600, which reflects conflict detection and reanalysis. However, constraint-related PNP findings are inconsistent. We sought to conceptually replicate Federmeier et al. (2007) and Kuperberg et al. (2020), who observed that the PNP, but not the N400 or the P600, was affected by constraint at unexpected but plausible words. Using a pre-registered design and statistical approach maximising power, we demonstrated a dissociated effect of predictability and constraint: strong evidence for predictability but not constraint in the N400 window, and strong evidence for constraint but not predictability in the later window. However, the constraint effect was consistent with a P600 and not a PNP, suggesting increased conflict between a strong representation and unexpected input rather than greater update of the representation. We conclude that either a simple strong/weak constraint design is not always sufficient to elicit the PNP, or that previous PNP constraint findings could be an artifact of smaller sample size.