Refine
Year of publication
- 2022 (2462)
Document Type
- Article (1605)
- Doctoral Thesis (253)
- Postprint (163)
- Part of a Book (146)
- Monograph/Edited Volume (94)
- Review (47)
- Other (30)
- Working Paper (30)
- Part of Periodical (21)
- Conference Proceeding (19)
- Master's Thesis (18)
- Report (11)
- Contribution to a Periodical (9)
- Bachelor Thesis (8)
- Habilitation Thesis (3)
- Journal/Publication series (3)
- Course Material (2)
Language
- English (1741)
- German (688)
- Hebrew (10)
- French (9)
- Spanish (7)
- Italian (3)
- Multiple languages (3)
- Portuguese (1)
Keywords
- climate change (20)
- machine learning (18)
- COVID-19 (16)
- Germany (12)
- exercise (11)
- obesity (11)
- diffusion (10)
- gender (10)
- adolescents (9)
- depression (9)
Institute
- Institut für Biochemie und Biologie (271)
- Institut für Physik und Astronomie (268)
- Extern (197)
- Institut für Geowissenschaften (187)
- Historisches Institut (115)
- Institut für Chemie (105)
- Bürgerliches Recht (102)
- Fachgruppe Politik- & Verwaltungswissenschaft (96)
- Institut für Umweltwissenschaften und Geographie (92)
- Öffentliches Recht (90)
Art. 21 AEUV [Freizügigkeit]
(2022)
ArcticBeach v1.0
(2022)
In the Arctic, air temperatures are increasing and sea ice is declining, resulting in larger waves and a longer open water season, all of which intensify the thaw and erosion of ice-rich coasts. Climate change has been shown to increase the rate of Arctic coastal erosion, causing problems for Arctic cultural heritage, existing industrial, military, and civil infrastructure, as well as changes in nearshore biogeochemistry. Numerical models that reproduce historical and project future Arctic erosion rates are necessary to understand how further climate change will affect these problems, yet no such model exists to simulate the physics of erosion on a pan-Arctic scale. We have coupled a bathystrophic storm surge model to a simplified physical erosion model of a permafrost coastline. This Arctic erosion model, called ArcticBeach v1.0, is a first step toward a physical parameterization of Arctic shoreline erosion for larger-scale models. It is forced by wind speed and direction, wave period and height, and sea surface temperature, all of which are masked during times of sea ice cover near the coastline. Model tuning requires observed historical retreat rates (at least one value), as well as rough nearshore bathymetry. These parameters are already available on a pan-Arctic scale. The model is validated at three study sites: 1) Drew Point (DP), Alaska, 2) Mamontovy Khayata (MK), Siberia, and 3) the Veslebogen Cliffs, Svalbard. Simulated cumulative retreat for DP and MK (169 and 170 m, respectively) over the time periods studied at each site (2007-2016 and 1995-2018) is found to be of the same order of magnitude as the observed cumulative retreat (172 and 120 m). The rocky Veslebogen Cliffs have a small observed cumulative retreat (0.05 m over 2014-2016), and our model was also able to reproduce this order of magnitude of retreat (0.08 m).
Given the large differences in geomorphology between the study sites, this study provides a proof-of-concept that ArcticBeach v1.0 can be applied on very different permafrost coastlines. ArcticBeach v1.0 provides a promising starting point to project retreat of Arctic shorelines, or to evaluate historical retreat in places that have had few observations.
Increasing Arctic coastal erosion rates imply a greater release of sediments and organic matter into the coastal zone. With 213 sediment samples taken around Herschel Island-Qikiqtaruk, Canadian Beaufort Sea, we aimed to gain new insights into the sediment dynamics and geochemical properties of a shallow Arctic nearshore zone. Spatial characteristics of nearshore sediment texture (moderately to poorly sorted silt) are dictated by hydrodynamic processes, but ice-related processes also play a role. We determined the organic matter (OM) distribution and inferred the origin and quality of organic carbon from C/N ratios and stable carbon isotope ratios (δ13C). The carbon content was higher offshore and in sheltered areas (mean: 1.0 wt.%, S.D.: 0.9), and the C/N ratios showed a similar spatial pattern (mean: 11.1, S.D.: 3.1), while the δ13C distribution (mean: -26.4 ‰ VPDB, S.D.: 0.4) was more complex. We compared the geochemical parameters of our study with terrestrial and marine samples from other studies using a bootstrap approach. Sediments of the current study contained 6.5 times and 1.8 times less total organic carbon than undisturbed and disturbed terrestrial sediments, respectively. Therefore, degradation of OM and separation of carbon pools take place on land and continue in the nearshore zone, where OM is leached, mineralized, or transported beyond the study area.
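The bootstrap comparison of geochemical parameters between sample groups can be sketched as below. This is a minimal illustration of the general technique, not the study's actual procedure, and the sample values are invented placeholders, not the study's data.

```python
import random

def bootstrap_mean_ratio(group_a, group_b, n_boot=2000, seed=42):
    """Bootstrap the ratio of group means, e.g. to compare total organic
    carbon between terrestrial and nearshore sediment samples.
    Returns the mean ratio and a 95% percentile confidence interval."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_boot):
        # Resample each group with replacement and compare the means
        resample_a = [rng.choice(group_a) for _ in group_a]
        resample_b = [rng.choice(group_b) for _ in group_b]
        mean_a = sum(resample_a) / len(resample_a)
        mean_b = sum(resample_b) / len(resample_b)
        ratios.append(mean_a / mean_b)
    ratios.sort()
    lo = ratios[int(0.025 * n_boot)]
    hi = ratios[int(0.975 * n_boot)]
    return sum(ratios) / n_boot, (lo, hi)

# Illustrative values only (wt.% TOC): terrestrial vs. nearshore samples
terrestrial = [6.0, 7.1, 5.4, 6.8, 7.5, 6.2]
nearshore = [0.9, 1.1, 1.0, 0.8, 1.3, 1.2]
mean_ratio, ci = bootstrap_mean_ratio(terrestrial, nearshore)
```

Percentile intervals of the resampled statistic quantify how robust a "X times less carbon" comparison is to the sampling variability of both groups.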
We propose a global geomagnetic field model for the last 14 thousand years, based on thermoremanent records. We call the model ArchKalmag14k. ArchKalmag14k is constructed by modifying recently proposed algorithms based on space-time correlations. Due to the amount of data and the complexity of the model, the full Bayesian posterior is numerically intractable. To tackle this, we sequentialize the inversion by implementing a Kalman filter with a fixed time step. Every step consists of a prediction, based on a degree-dependent temporal covariance, and a correction via Gaussian process regression. Dating errors are treated via a noisy-input formulation. Cross correlations are reintroduced by a smoothing algorithm, and model parameters are inferred from the data. Due to the specific statistical nature of the proposed algorithms, the model comes with space- and time-dependent uncertainty estimates. The new model ArchKalmag14k shows less variation in the large-scale degrees than comparable models. Local predictions represent the underlying data and agree with comparable models where the location is sampled well. Uncertainties are larger for earlier times and in regions of sparse data coverage. We also use ArchKalmag14k to analyze the appearance and evolution of the South Atlantic anomaly together with reverse flux patches at the core-mantle boundary, considering the model uncertainties. While we find good agreement with earlier models for recent times, our model suggests a different evolution of intensity minima prior to 1650 CE. In general, our results suggest that prior to 6000 BCE the data are not sufficient to support global models.
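The sequentialized inversion alternates a prediction with a correction against data. A minimal scalar sketch of that Kalman predict/correct cycle is given below; the decay, process-noise, and observation-noise parameters are invented stand-ins, not the model's actual degree-dependent covariances.

```python
def kalman_step(mean, var, obs, obs_var, process_var, decay=0.95):
    """One predict/correct cycle of a scalar Kalman filter.
    Prediction: the state relaxes toward the prior mean with added
    process noise (a stand-in for the degree-dependent temporal
    covariance). Correction: a Gaussian update against a noisy
    observation, which also shrinks the uncertainty."""
    # Predict
    mean_pred = decay * mean
    var_pred = decay**2 * var + process_var
    # Correct
    gain = var_pred / (var_pred + obs_var)
    mean_new = mean_pred + gain * (obs - mean_pred)
    var_new = (1.0 - gain) * var_pred
    return mean_new, var_new

# A wide prior is narrowed as (hypothetical) observations arrive
mean, var = 0.0, 10.0
for obs in [2.1, 1.9, 2.2, 2.0]:
    mean, var = kalman_step(mean, var, obs, obs_var=0.5, process_var=0.1)
```

The posterior variance that falls out of each correction is what gives a filter-based field model its space- and time-dependent uncertainty estimates.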
The use of neural networks is considered the state of the art in the field of image classification. A large number of different networks are available for this purpose, which, appropriately trained, permit a high level of classification accuracy. Typically, these networks are applied to uncompressed image data, since the corresponding training was also carried out using image data of similarly high quality. However, if the image data contain image errors, the classification accuracy deteriorates drastically. This applies in particular to coding artifacts which occur due to image and video compression. Typical application scenarios for video compression are narrowband transmission channels for which video coding is required but a subsequent classification is to be carried out on the receiver side. In this paper we present a special H.264/Advanced Video Coding (AVC) based video codec that allows certain regions of a picture to be coded with near-constant picture quality in order to allow a reliable classification using neural networks, whereas the remaining image is coded at a constant bit rate. We have combined this feature with the ability to run with very low latency, which is usually also required in remote control application scenarios. The codec has been implemented as a fully hardwired, High-Definition-capable hardware architecture which is suitable for Field Programmable Gate Arrays.
Arbeitswelt 4.0
(2022)
Arbeitsrecht
(2022)
Technological progress allows for producing ever more complex predictive models on the basis of increasingly big datasets. For risk management of natural hazards, a multitude of models is needed as a basis for decision-making, e.g. in the evaluation of observational data, for the prediction of hazard scenarios, or for statistical estimates of expected damage. The question arises how modern modelling approaches like machine learning or data-mining can be meaningfully deployed in this thematic field. In addition, with respect to data availability and accessibility, the trend is towards open data. The topic of this thesis is therefore to investigate the possibilities and limitations of machine learning and open geospatial data in the field of flood risk modelling in the broad sense. As this overarching topic is broad in scope, individual relevant aspects are identified and inspected in detail.
A prominent data source in the flood context is satellite-based mapping of inundated areas, for example made openly available by the Copernicus service of the European Union. Great expectations are directed towards these products in scientific literature, both for acute support of relief forces during emergency response action, and for modelling via hydrodynamic models or for damage estimation. Therefore, a focus of this work was set on evaluating these flood masks. From the observation that the quality of these products is insufficient in forested and built-up areas, a procedure for subsequent improvement via machine learning was developed. This procedure is based on a classification algorithm that only requires training data from a particular class to be predicted, in this specific case data of flooded areas, but not of the negative class (dry areas). The application for hurricane Harvey in Houston shows the high potential of this method, which depends on the quality of the initial flood mask.
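A positive-only classifier of the kind described (trained solely on flooded pixels, with no dry-area training data) can be sketched with a simple distance-to-centroid rule. This is an illustrative stand-in, not the thesis's actual algorithm, and the feature values are invented.

```python
import math

class OneClassCentroid:
    """Toy one-class classifier: fit on positive-class feature vectors
    only, then flag a new sample as positive if it lies within a chosen
    quantile of the training distances to the class centroid."""

    def fit(self, X, quantile=0.95):
        dim = len(X[0])
        self.centroid = [sum(x[i] for x in X) / len(X) for i in range(dim)]
        dists = sorted(self._dist(x) for x in X)
        # Accept anything closer than the given quantile of training distances
        self.threshold = dists[min(len(dists) - 1, int(quantile * len(dists)))]
        return self

    def _dist(self, x):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, self.centroid)))

    def predict(self, x):
        return self._dist(x) <= self.threshold

# Invented 2-D features (e.g. radar backscatter, relative elevation)
# for pixels known to be flooded; no dry-pixel labels are needed
flooded = [[0.1, 1.0], [0.2, 1.1], [0.15, 0.9], [0.12, 1.05], [0.18, 0.95]]
clf = OneClassCentroid().fit(flooded)
```

The point of the design is the same as in the text: only the positive class has to be labelled, which matters when reliable "dry" labels are unavailable under forest canopy or in built-up areas.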
Next, it is investigated how much the predicted statistical risk from a process-based model chain depends on implemented physical process details. Thereby it is demonstrated what a risk study based on established models can deliver. Even for fluvial flooding, however, such model chains are already quite complex, and they are hardly available for compound or cascading events comprising torrential rainfall, flash floods, and other processes. In the fourth chapter of this thesis it is therefore tested whether machine learning based on comprehensive damage data can offer a more direct path towards damage modelling that avoids the explicit conception of such a model chain. For that purpose, a state-collected dataset of damaged buildings from the severe 2017 El Niño event in Peru is used. In this context, the possibilities of data-mining for extracting process knowledge are explored as well. It can be shown that various openly available geodata sources contain useful information for flood hazard and damage modelling for complex events, e.g. satellite-based rainfall measurements, topographic and hydrographic information, mapped settlement areas, as well as indicators from spectral data. Further, insights into damaging processes are discovered, which are mainly in line with prior expectations. The maximum rainfall intensity, for example, acts more strongly in cities and steep canyons, while the rainfall sum was found to be more informative in low-lying river catchments and forested areas. Rural areas of Peru exhibited higher vulnerability in the presented study compared to urban areas. However, the general limitations of the methods and the dependence on specific datasets and algorithms also become obvious.
In the overarching discussion, the different methods – process-based modelling, predictive machine learning, and data-mining – are evaluated with respect to the overall research questions. In the case of hazard observation it seems that a focus on novel algorithms makes sense for future research. In the subtopic of hazard modelling, especially for river floods, the improvement of physical models and the integration of process-based and statistical procedures is suggested. For damage modelling the large and representative datasets necessary for the broad application of machine learning are still lacking. Therefore, the improvement of the data basis in the field of damage is currently regarded as more important than the selection of algorithms.
Forest microclimate can buffer biotic responses to summer heat waves, which are expected to become more extreme under climate warming. Prediction of forest microclimate is limited because meteorological observation standards seldom include situations inside forests.
We use eXtreme Gradient Boosting - a Machine Learning technique - to predict the microclimate of forest sites in Brandenburg, Germany, using seasonal data comprising weather features.
The analysis was amended by applying SHapley Additive exPlanations (SHAP) to show the interaction effects of variables and individualised feature attributions.
We evaluate model performance in comparison to artificial neural networks, random forest, support vector machine, and multi-linear regression.
After implementing a feature selection, an ensemble approach was applied to combine individual models for each forest and improve robustness over a given single prediction model.
The resulting model can be applied to translate climate change scenarios into temperatures inside forests to assess temperature-related ecosystem services provided by forests.
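The ensemble step that combines individual models per forest can be sketched as a simple prediction average. The member models below are invented linear stand-ins for the trained gradient-boosting models, and the feature names are hypothetical.

```python
def ensemble_predict(models, features):
    """Average the predictions of several fitted models to obtain a
    single, more robust estimate (here: temperature inside a forest)."""
    preds = [model(features) for model in models]
    return sum(preds) / len(preds)

# Invented stand-in members: each maps ambient temperature and canopy
# cover to an in-forest temperature estimate
members = [
    lambda f: f["t_ambient"] - 2.0 * f["canopy"],
    lambda f: f["t_ambient"] - 1.5 * f["canopy"] - 0.3,
    lambda f: f["t_ambient"] - 2.5 * f["canopy"] + 0.2,
]
t_forest = ensemble_predict(members, {"t_ambient": 30.0, "canopy": 0.8})
```

Averaging over members reduces the variance of any single model's error, which is the robustness gain the abstract refers to.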
Physical activity and exercise are effective approaches in the prevention and therapy of multiple diseases. Although the specific characteristics of lengthening contractions have the potential to be beneficial in many clinical conditions, eccentric training is not commonly used in clinical populations with metabolic, orthopaedic, or neurologic conditions. The purpose of this pilot study is to investigate the feasibility, functional benefits, and systemic responses of an eccentric exercise program focused on the trunk and lower extremities in people with low back pain (LBP) and multiple sclerosis (MS). A six-week eccentric training program with three weekly sessions is performed by people with LBP and MS. The program consists of ten exercises addressing strength of the trunk and lower extremities. The study follows a four-group design (N = 12 per group) in two study centers (Israel and Germany): three groups perform the eccentric training program: A) control group (healthy, asymptomatic); B) people with LBP; C) people with MS; group D (people with MS) receives standard care physiotherapy. Baseline measurements are conducted before the first training session; post-measurements take place after the last session. Both comprise blood sampling, self-reported questionnaires, and mobility, balance, and strength testing. The feasibility of the eccentric training program will be evaluated using quantitative and qualitative measures related to the study process, compliance and adherence, safety, and overall program assessment. For a preliminary assessment of potential intervention effects, surrogate parameters related to mobility, postural control, muscle strength, and systemic effects are assessed. The presented study will add knowledge regarding the safety, feasibility, and initial effects of eccentric training in people with orthopaedic and neurological conditions. The simple exercises, which are easily modifiable in complexity and intensity, are likely beneficial to other populations.
Thus, multiple applications and implementation pathways for the herein presented training program are conceivable.
Although phytoliths are recognized as an important proxy for paleoenvironmental reconstruction, the quantitative relationship between phytoliths and climate is still debated. In order to provide an improved basis for phytolith-based paleoclimate reconstructions, a representative modern phytolith dataset is essential. Here, we synthesize a modern topsoil phytolith dataset for Northeast China, analyze its climatic significance, and apply it to a fossil phytolith series from the Hani peat core in Northeast China. The dataset comprises 660 topsoil phytolith assemblages from 289 sample sites. We compiled modern meteorological data to assess the quantitative relationship between the phytolith assemblages and climatic variables. Detrended correspondence analysis (DCA) and redundancy analysis (RDA) were used to determine the dominant climatic variable influencing the phytolith distributions. The results showed that mean annual temperature (MAT) is the dominant variable controlling the spatial distribution of phytoliths, accounting for 8.91% of the total variance. A transfer function based on inverse-deshrinking locally-weighted weighted averaging (LWWA_Inv) was developed for MAT (R²_boot = 0.86, RMSEP = 1.02 °C). Applying the LWWA_Inv transfer function to fossil phytolith records from the Hani peat core enables quantitative inferences to be made about Holocene climate changes in Northeast China. Overall, combined with the LWWA_Inv method, the topsoil phytolith dataset of Northeast China can be used for reliable quantitative MAT reconstruction.
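A weighted-averaging transfer function of this general kind estimates each taxon's temperature optimum from the calibration set and then infers the temperature of a fossil sample as the abundance-weighted mean of those optima. The sketch below shows classical weighted averaging only; the inverse-deshrinking regression and locally-weighted steps are omitted, and the assemblage data are invented.

```python
def wa_optima(assemblages, temps):
    """Taxon optima: the abundance-weighted mean of the observed
    temperatures at the calibration sites (classical weighted averaging)."""
    n_taxa = len(assemblages[0])
    optima = []
    for j in range(n_taxa):
        num = sum(site[j] * t for site, t in zip(assemblages, temps))
        den = sum(site[j] for site in assemblages)
        optima.append(num / den)
    return optima

def wa_infer(assemblage, optima):
    """Inferred temperature: the abundance-weighted mean of taxon optima."""
    num = sum(a * o for a, o in zip(assemblage, optima))
    return num / sum(assemblage)

# Invented calibration set: relative abundances of two phytolith taxa
# at three sites, and each site's observed MAT (°C)
assemblages = [[0.8, 0.2], [0.5, 0.5], [0.2, 0.8]]
temps = [2.0, 5.0, 8.0]
optima = wa_optima(assemblages, temps)
mat = wa_infer([0.5, 0.5], optima)
```

The double averaging shrinks the inferred range relative to the observed range, which is why a deshrinking regression is applied on top in practice.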
We analyse mobile-immobile transport of particles that switch between the mobile and immobile phases with finite rates. Despite this seemingly simple assumption of Poissonian switching, we unveil rich transport dynamics, including significant transient anomalous diffusion and non-Gaussian displacement distributions. Our discussion is based on experimental parameters for tau proteins in neuronal cells, but the results obtained here are expected to be of relevance for a broad class of processes in complex systems. Specifically, we find that, when the mean binding time is significantly longer than the mean mobile time, transient anomalous diffusion is observed at short and intermediate time scales, with a strong dependence on the fraction of initially mobile and immobile particles. We unveil a Laplace distribution of particle displacements at relevant intermediate time scales. For any initial fraction of mobile particles, the respective mean squared displacement (MSD) displays a plateau. Moreover, we demonstrate a short-time cubic time dependence of the MSD for immobile tracers when initially all particles are immobile.
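The mobile-immobile dynamics can be sketched as a discrete-time random walk that switches between the two phases with fixed per-step probabilities (the discrete analogue of Poissonian rates). The parameters below are illustrative, not the tau-protein values from the study.

```python
import random

def simulate_msd(n_particles=500, n_steps=200, k_bind=0.2, k_release=0.02,
                 step=1.0, mobile_frac=1.0, seed=1):
    """Two-state (mobile/immobile) 1-D random walk. At each step a mobile
    particle moves +-step and may bind (probability k_bind); an immobile
    particle may be released (probability k_release). With binding much
    faster than release, the MSD growth slows at intermediate times.
    Returns the MSD as a function of time step."""
    rng = random.Random(seed)
    pos = [0.0] * n_particles
    mobile = [rng.random() < mobile_frac for _ in range(n_particles)]
    msd = []
    for _ in range(n_steps):
        for i in range(n_particles):
            if mobile[i]:
                pos[i] += step if rng.random() < 0.5 else -step
                if rng.random() < k_bind:
                    mobile[i] = False
            elif rng.random() < k_release:
                mobile[i] = True
        msd.append(sum(x * x for x in pos) / n_particles)
    return msd

msd = simulate_msd()
```

Varying `mobile_frac` between 0 and 1 probes the dependence on the initial fractions of mobile and immobile particles discussed in the abstract.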
Extracting information about past tectonic or climatic environmental changes from sedimentary records is a key objective of provenance research. Interpreting the imprint of such changes remains challenging as signals might be altered in the sediment-routing system.
We investigate the sedimentary provenance of the Oligocene/Miocene Upper Austrian Northern Alpine Foreland Basin and its response to the tectonically driven exhumation of the Tauern Window metamorphic dome (28 +/- 1 Ma) in the Eastern European Alps by using the unprecedented combination of the Nd isotopic composition of bulk-rock clay-sized samples and partly previously published multi-proxy (Nd isotopic composition, trace-element geochemistry, U-Pb dating) sand-sized apatite single-grain analysis.
The basin offers an excellent opportunity to investigate environmental signal propagation into the sedimentary record because comprehensive stratigraphic and seismic datasets can be combined with present research results. The bulk-rock clay-sized fraction εNd values of well-cutting samples from one well on the northern basin slope remained stable at ~ -9.7 from 27 to 19 Ma but increased after 19 Ma to ~ -9.1. In contrast, apatite single-grain distributions, which were extracted from 22 drill-core samples, changed significantly around 23.3 Ma from apatites dominantly from low-grade (<upper amphibolite-facies) metamorphic sources with Permo-Mesozoic and late Variscan U-Pb ages and εNd values of -4.4 to dominantly high-grade metamorphic apatites with late Variscan U-Pb ages and εNd values of -2.2.
The change in apatite single-grain distributions at 23.3 Ma is interpreted to result from the exposure of a new Upper Austroalpine source nappe with less negative εNd values, triggered by the ongoing Tauern Window exhumation. Combining these data with the clay-sized bulk-rock εNd values reveals that the provenance changed 4-5 Myr later, at 19 Ma, in the clay-sized fraction.
Reasons for the delayed provenance-change recording are rooted in the characteristics of the applied methods.
Whereas single-grain distributions of orogen-wide sediment-routing systems can be dominated by geographically small areas with high erosion and mineral-fertility rates, bulk-rock methods integrate over the entire drainage basin, thus diminishing extreme values. Hence, by combining these two methods, spatial information is uncovered, enabling a previously unattained understanding of the underlying environmental change.
Alexander von Humboldt was among the first specialists to apply, and in doing so advance, both scientific and cameralistic knowledge for the reliable control of technical and economic processes. Through his all-round consideration of manufacturing processes, he succeeded in formulating well-thought-out proposals for his superiors on how to organize raw-material processing. After some facts from his cameralistic and metallurgical studies, this is substantiated with examples, including the construction of graduation works for salt production, the use of high-quality raw materials in earthenware manufacture, the use of complex raw materials for glass melting, the selection of effective water wheels, the dimensioning of the gas channels in porcelain kilns, and the use of fluxes for pig-iron smelting. The cameralist in him also showed in the granting of a royal loan for the production of blue pigment.
Polymeric antimicrobial peptide mimics are a promising alternative for the future management of the daunting problems associated with antimicrobial resistance. However, the development of successful antimicrobial polymers (APs) requires careful control of factors such as amphiphilic balance, molecular weight, dispersity, sequence, and architecture. While most earlier APs were random linear copolymers, APs with advanced architectures are proving more potent. Recently developed multivalent bottlebrush APs show improved antibacterial activity and hemocompatibility profiles, outperforming their linear counterparts. Understanding the rationale behind the outstanding biological activity of these newly developed antimicrobials is vital to further improving their performance. This work investigates the physicochemical properties governing the differences in activity between linear and bottlebrush architectures using various spectroscopic and microscopic techniques. Linear copolymers are more solvated and thermo-responsive, and possess facial amphiphilicity, resulting in random aggregation when interacting with liposomes mimicking Escherichia coli membranes. The bottlebrush copolymers adopt a more stable secondary conformation in aqueous solution than the linear copolymers, conferring a more rapid and specific membrane-binding mechanism. The advantageous physicochemical properties of the bottlebrush topology appear to be a determinant factor in the activity of these promising APs.