Isostasy is one of the oldest and most widely applied concepts in the geosciences, but the geoscientific community lacks a coherent, easy-to-use tool to simulate flexure of a realistic (i.e., laterally heterogeneous) lithosphere under an arbitrary set of surface loads. Such a model is needed for studies of mountain building, sedimentary basin formation, glaciation, sea-level change, and other tectonic, geodynamic, and surface processes. Here I present gFlex (for GNU flexure), an open-source model that can produce analytical and finite difference solutions for lithospheric flexure in one (profile) and two (map view) dimensions. To simulate the flexural isostatic response to an imposed load, gFlex can be run on its own or within GRASS GIS for better integration with field data. gFlex is also a component within the Community Surface Dynamics Modeling System (CSDMS) and Landlab modeling frameworks, allowing coupling with a wide range of Earth-surface-related models, and it can be coupled to additional models within Python scripts. As an example of this in-script coupling, I simulate the effects of spatially variable lithospheric thickness on a modeled Iceland ice cap. Finite difference solutions in gFlex can use any of five types of boundary conditions: 0-displacement, 0-slope (i.e., clamped); 0-slope, 0-shear; 0-moment, 0-shear (i.e., broken plate); mirror symmetry; and periodic. Typical calculations with gFlex require much less than 1 s to roughly 1 min on a personal laptop computer. These characteristics (multiple ways to run the model, multiple solution methods, multiple boundary conditions, and short compute times) make gFlex an effective tool for flexural isostatic modeling across the geosciences.
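The finite difference approach mentioned in the abstract can be illustrated with a minimal standalone 1-D sketch of the thin-plate flexure equation, D w'''' + (rho_m - rho_fill) g w = q. This is not gFlex's own API; it is a toy solver assuming a uniform elastic thickness Te and clamped (0-displacement, 0-slope) ends approximated by zeroed ghost nodes:

```python
import numpy as np

def flexure_1d(q, Te, dx, E=65e9, nu=0.25, rho_m=3300.0, rho_fill=0.0, g=9.8):
    """Deflection w of a thin elastic plate under surface loads q [Pa],
    solving D w'''' + (rho_m - rho_fill) g w = -q by finite differences.
    Uniform Te [m] is assumed; ghost nodes beyond the boundary are set
    to zero as a rough clamped (0-displacement, 0-slope) condition."""
    n = len(q)
    D = E * Te**3 / (12.0 * (1.0 - nu**2))  # flexural rigidity [N m]
    k = (rho_m - rho_fill) * g              # buoyant restoring term
    coef = D / dx**4
    A = np.zeros((n, n))
    for i in range(n):
        # 5-point stencil for the 4th derivative; out-of-range (ghost)
        # entries are simply dropped, i.e., treated as zero deflection
        for j, c in zip(range(i - 2, i + 3), (1.0, -4.0, 6.0, -4.0, 1.0)):
            if 0 <= j < n:
                A[i, j] += coef * c
        A[i, i] += k
    return np.linalg.solve(A, -q)  # downward deflection comes out negative

# point-like load at the center of a 100 km profile, 1 km node spacing
q = np.zeros(101)
q[50] = 1.0e7
w = flexure_1d(q, Te=20e3, dx=1e3)
```

The deflection is largest (most negative) under the load and symmetric about it, as expected for a symmetric load on a uniform plate.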
Now that the United Kingdom has left the European Union, it remains unclear whether the two parties can successfully negotiate and sign a trade agreement within the transition period. Ongoing negotiations, practical obstacles, and the resulting uncertainties make it highly unlikely that economic actors would be fully prepared for a “no-trade-deal” situation. Here we provide an economic shock simulation of the immediate aftermath of such a post-Brexit no-trade-deal scenario by computing the time evolution of more than 1.8 million interactions between more than 6,600 economic actors in the global trade network. We find an abrupt decline in the number of goods produced in the UK and the EU. This sudden output reduction is caused by drops in demand as customers on the respective other side of the Channel incorporate the new trade restrictions into their decision-making. In response, producers reduce prices in order to stimulate demand elsewhere. In the short term consumers benefit from lower prices, but production value decreases, with potentially severe socio-economic consequences in the longer term.
While previous research underscores the role of leaders in stimulating employee voice behaviour, comparatively little is known about what affects leaders' support for such constructive but potentially threatening employee behaviours. We introduce leader-member exchange (LMX) quality as a central predictor of leaders' support for employees' ideas for constructive change. Beyond a general benefit of high LMX for leaders' idea support, we propose that high LMX is particularly critical to leaders' idea support when the idea voiced by an employee constitutes a power threat to the leader. We investigate leaders' attributions of prosocial and egoistic employee intentions as mediators of these effects. Hypotheses were tested in a quasi-experimental vignette study (N = 160), in which leaders evaluated a simulated employee idea, and in a field study (N = 133), in which leaders evaluated an idea that had been voiced to them at work. Results show an indirect effect of LMX on leaders' idea support via attributed prosocial intentions but not via attributed egoistic intentions, and a buffering effect of high LMX on the negative effect of power threat on leaders' idea support. Results differed across the studies with regard to the main effect of LMX on idea support.
Business process models are used within a range of organizational initiatives, where every stakeholder has a unique perspective on a process and demands a corresponding model. As a consequence, multiple process models capturing the very same business process coexist. Keeping such models in sync is a challenge within an ever-changing business environment: once a process is changed, all its models have to be updated. Due to the large number of models and their complex relations, model maintenance becomes error-prone and expensive. Against this background, business process model abstraction emerged as an operation that reduces the number of stored process models and facilitates model management. Business process model abstraction is an operation that preserves essential process properties and leaves out insignificant details in order to retain information relevant for a particular purpose. Process model abstraction has been addressed by several researchers, whose studies have focused on particular use cases and on model transformations supporting these use cases. This thesis systematically approaches the problem of business process model abstraction, shaping the outcome into a framework. We investigate the current industry demand for abstraction, summarizing it in a catalog of business process model abstraction use cases. The thesis focuses on one prominent use case in which the user demands a model with coarse-grained activities and overall process ordering constraints. We develop model transformations that support this use case, starting with transformations based on process model structure analysis. Further, abstraction methods that consider the semantics of process model elements are investigated. First, we suggest how semantically related activities can be discovered in process models, a barely researched challenge. The thesis validates the designed abstraction methods against sets of industrial process models and discusses the implementation aspects of the methods.
Second, we develop a novel model transformation which, combined with the related-activity discovery, allows flexible non-hierarchical abstraction. In this way, the thesis advocates novel model transformations that facilitate business process model management and provides the foundations for innovative tool support.
The Cluster mission has produced a large data set of electron flux measurements in the Earth's magnetosphere since its launch in late 2000. Electron fluxes are measured by the Research with Adaptive Particle Imaging Detectors (RAPID) Imaging Electron Spectrometer (IES) as a function of energy, pitch angle, spacecraft position, and time. However, no adiabatic invariants have been calculated for Cluster so far. In this paper we present a step-by-step guide to calculating the adiabatic invariants and converting the electron flux to phase space density (PSD) in these coordinates. The electron flux is measured in two RAPID/IES energy channels, providing pitch angle distributions at energies of 39.2-50.5 and 68.1-94.5 keV in nominal mode since 2004. A fitting method allows the conversion of the differential fluxes to be extended to the range from 40 to 150 keV. The best data coverage for phase space density in adiabatic invariant coordinates is obtained for values of the second adiabatic invariant K near 10^2 and values of the first adiabatic invariant mu in the range of approximately 5-20 MeV/G. Furthermore, we describe the production of a new data product, "LSTAR," equivalent to the third adiabatic invariant, available through the Cluster Science Archive for the years 2001-2018 at 1-min resolution. The produced data set adds to the availability of observations in the Earth's radiation belt region and can be used for long-term statistical purposes.
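The flux-to-PSD conversion step described above boils down to f = j / p^2, with the momentum taken relativistically. A minimal sketch follows; the unit handling is deliberately simplified and this is not the paper's exact pipeline:

```python
MC2 = 511.0  # electron rest energy m_e c^2 in keV

def flux_to_psd(j, E_keV):
    """Convert differential electron flux j [1/(cm^2 s sr keV)] at kinetic
    energy E_keV to phase space density f = j / p^2, using the
    relativistic relation (pc)^2 = E (E + 2 m c^2).  The result carries
    units of j / keV^2 times c^2; a real pipeline would rescale to the
    conventional (c / MeV / cm)^3."""
    pc2 = E_keV * (E_keV + 2.0 * MC2)  # (pc)^2 in keV^2
    return j / pc2

# the same flux at higher energy corresponds to a lower phase space density
f40 = flux_to_psd(1.0e4, 40.0)
f150 = flux_to_psd(1.0e4, 150.0)
```

Because p^2 grows with energy, equal fluxes at 40 and 150 keV map to decreasing PSD, which is why a fitted flux spectrum is needed before evaluating PSD at fixed mu.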
Determination of historical parameters of a small catchment, using the example of the Pfefferfließ
(2010)
Using a stream (the Pfefferfließ) as an example, the near-natural hydrological situation of the 18th century was reconstructed and characterized using a variety of methods. The basis for determining this near-natural 18th-century state was historical data such as maps, manuscripts, and drainage (melioration) plans. The detection and survey of historical channel cross sections, as well as modeling of 18th-century discharge, also contribute to building up the overall picture of the 18th century. The insights gained from these data were then examined for their further use as a reference state for river restoration measures.
Robust appraisals of climate impacts at different levels of global-mean temperature increase are vital to guide assessments of dangerous anthropogenic interference with the climate system. The 2015 Paris Agreement includes a two-headed temperature goal: "holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 °C". Despite the prominence of these two temperature limits, a comprehensive overview of the differences in climate impacts at these levels is still missing. Here we provide an assessment of key impacts of climate change at warming levels of 1.5 °C and 2 °C, including extreme weather events, water availability, agricultural yields, sea-level rise, and the risk of coral reef loss. Our results reveal substantial differences in impacts between 1.5 °C and 2 °C of warming that are highly relevant for the assessment of dangerous anthropogenic interference with the climate system. For heat-related extremes, the additional 0.5 °C increase in global-mean temperature marks the difference between events at the upper limit of present-day natural variability and a new climate regime, particularly in tropical regions. Similarly, this warming difference is likely to be decisive for the future of tropical coral reefs. In a scenario with an end-of-century warming of 2 °C, virtually all tropical coral reefs are projected to be at risk of severe degradation due to temperature-induced bleaching from 2050 onwards. This fraction is reduced to about 90% in 2050 and projected to decline to 70% by 2100 in a 1.5 °C scenario. Analyses of precipitation-related impacts reveal distinct regional differences, and hot-spots of change emerge.
The regional reduction in median water availability for the Mediterranean is found to nearly double, from 9% to 17%, between 1.5 °C and 2 °C, and the projected lengthening of regional dry spells increases from 7% to 11%. Projections for agricultural yields differ between crop types as well as world regions. While some (in particular high-latitude) regions may benefit, tropical regions like West Africa, South-East Asia, and Central and northern South America are projected to face substantial local yield reductions, particularly for wheat and maize. Best-estimate sea-level rise projections based on two illustrative scenarios indicate a 50 cm rise by 2100 relative to year-2000 levels for a 2 °C scenario, and about 10 cm lower levels for a 1.5 °C scenario. In a 1.5 °C scenario, the rate of sea-level rise in 2100 would be reduced by about 30% compared to a 2 °C scenario. Our findings highlight the importance of regional differentiation in assessing both future climate risks and different vulnerabilities to incremental increases in global-mean temperature. The article provides a consistent and comprehensive assessment of existing projections and a good basis for future work on refining our understanding of the difference between impacts at 1.5 °C and 2 °C of warming.
Winter storms are the most costly natural hazard for European residential property. We compare four distinct storm damage functions with respect to their forecast accuracy and variability, with particular regard to the most severe winter storms. The analysis focuses on daily loss estimates under differing spatial aggregation, ranging from district to country level. We discuss the broad and heavily skewed distribution of insured losses posing difficulties for both the calibration and the evaluation of damage functions. From theoretical considerations, we provide a synthesis between the frequently discussed cubic wind–damage relationship and recent studies that report much steeper damage functions for European winter storms. The performance of the storm loss models is evaluated for two sources of wind gust data, direct observations by the German Weather Service and ERA-Interim reanalysis data. While the choice of gust data has little impact on the evaluation of German storm loss, spatially resolved coefficients of variation reveal dependence between model and data choice. The comparison shows that the probabilistic models by Heneka et al. (2006) and Prahl et al. (2012) both provide accurate loss predictions for moderate to extreme losses, with generally small coefficients of variation. We favour the latter model in terms of model applicability. Application of the versatile deterministic model by Klawa and Ulbrich (2003) should be restricted to extreme loss, for which it shows the least bias and errors comparable to the probabilistic model by Prahl et al. (2012).
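The cubic wind-damage relationship discussed above, in the spirit of Klawa and Ulbrich (2003), can be sketched as a loss index that only counts gusts exceeding the local 98th percentile. This is an illustrative simplification, not the calibrated model from the comparison:

```python
import numpy as np

def loss_index(v, v98):
    """Cubic storm-loss index in the spirit of Klawa and Ulbrich (2003):
    daily damage scales with ((v / v98) - 1)^3 where the gust speed v
    exceeds the local 98th-percentile gust v98, and is zero otherwise."""
    excess = np.maximum(np.asarray(v, dtype=float) / v98 - 1.0, 0.0)
    return excess ** 3

# gusts of 20 and 30 m/s against a local 98th percentile of 25 m/s:
# the first day contributes nothing, the second contributes (0.2)^3
idx = loss_index([20.0, 30.0], 25.0)
```

The hard threshold and cubic exponent are what make such deterministic indices work best for extreme losses, consistent with the restriction recommended in the abstract.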
Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events into a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. Studying the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies between damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitudes and at both the microscale and the macroscale level. The main findings are the dominance of uncertainty arising from the hazard magnitude and the persistent behaviour of intrinsic uncertainties at both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
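The unified approach for granular portfolios can be pictured as summing a microscale damage fraction over individual assets to obtain the macroscale loss. The depth-damage curve below is purely hypothetical and stands in for whatever calibrated function a given hazard requires:

```python
def portfolio_damage(magnitudes, values, damage_fraction):
    """Macroscale damage for a granular portfolio: each asset's loss is
    its value times a microscale damage fraction driven by the local
    hazard magnitude (e.g. inundation depth or gust speed)."""
    return sum(v * damage_fraction(m) for m, v in zip(magnitudes, values))

def depth_damage(depth_m):
    """Hypothetical linear depth-damage curve saturating at 5 m depth."""
    return min(max(depth_m, 0.0) / 5.0, 1.0)

# three equally valued assets exposed to 0 m, 2.5 m, and 6 m of water
total = portfolio_damage([0.0, 2.5, 6.0], [100.0, 100.0, 100.0], depth_damage)
```

Swapping `depth_damage` for a gust-based function is all it takes to reuse the same aggregation for storm damage, which is the sense in which the treatment is hazard-agnostic.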
The economic assessment of the impacts of storm surges and sea-level rise in coastal cities requires high-level information on the damage and protection costs associated with varying flood heights. We provide a systematically and consistently calculated dataset of macroscale damage and protection cost curves for the 600 largest European coastal cities, opening the perspective for a wide range of applications. Offering the first comprehensive dataset to include the costs of dike protection, we provide the underpinning information needed to run comparative assessments of the costs and benefits of coastal adaptation. Aggregate cost curves for coastal flooding at the city level are commonly regarded as by-products of impact assessments and are generally not published as standalone datasets. Hence, our work also aims to initiate a more critical discussion of the availability and derivation of such cost curves.