Navigating the unknown
(2024)
Visionary leadership is considered to be one of the most important elements of effective leadership. Among other things, it is related to followers' perceived meaningfulness of their work. However, little is known about whether workplace uncertainty influences the effects of visionary leadership. Given that uncertainty is rising in many, if not most, workplaces, it is vital to understand whether this development influences the extent to which visionary leadership is associated with followers' perceived meaningfulness. In a two-source, lagged-design field study of 258 leader-follower dyads from different settings, we show that uncertainty moderates the relation between visionary leadership and followers' perceived meaningfulness such that this relation is more strongly positive when uncertainty is high rather than low. Moreover, we show that with increasing uncertainty, visionary leadership is more negatively related to followers' turnover intentions via perceived meaningfulness. This research broadens our understanding of how visionary leadership may be a particularly potent tool in times of increasing uncertainty.
Within the field of species distribution modelling an apparent dichotomy exists between process-based and correlative approaches, where the processes are explicit in the former and implicit in the latter. However, these intuitive distinctions can become blurred when comparing species distribution modelling approaches in more detail. In this review article, we contrast the extremes of the correlative-process spectrum of species distribution models with respect to core assumptions, model building and selection strategies, validation, uncertainties, common errors and the questions they are most suited to answer. The extremes of such approaches differ clearly in many aspects, such as model building approaches, parameter estimation strategies and transferability. However, they also share strengths and weaknesses. We show that claims of one approach being intrinsically superior to the other are misguided and that they ignore the process-correlation continuum as well as the domains of questions that each approach is addressing. Nonetheless, the application of process-based approaches to species distribution modelling lags far behind more correlative (process-implicit) methods and more research is required to explore their potential benefits. Critical issues for the employment of species distribution modelling approaches are given, together with a guideline for appropriate usage. We close with challenges for future development of process-explicit species distribution models and how they may complement current approaches to study species distributions.
The fluxes of water and solutes in the subsurface compartment of the Critical Zone are temporally dynamic, and it is unclear how this affects microbially mediated nutrient cycling in the spatially heterogeneous subsurface. To investigate this, we undertook numerical modeling, simulating transport in a wide range of spatially heterogeneous domains and the biogeochemical transformation of organic carbon and nitrogen compounds by a complex microbial community with four distinct functional groups, in water-saturated subsurface compartments. We performed a comprehensive uncertainty analysis accounting for varying residence times and spatial heterogeneity. While the aggregated removal of chemical species in the domains over the entire simulation period was approximately the same as in steady-state conditions, the sub-scale temporal variation of microbial biomass and chemical discharge from a domain depended strongly on the interplay of spatial heterogeneity and temporal dynamics of the forcing. We showed that the travel time and the Damköhler number (Da) can be used to predict the temporally varying chemical discharge from a spatially heterogeneous domain. In homogeneous domains, chemical discharge in temporally dynamic conditions could be double that in steady-state conditions, while microbial biomass varied by up to 75% of its steady-state value. In heterogeneous domains, the interquartile range of uncertainty in chemical discharge in reaction-dominated systems (log10(Da) > 0) was double that in steady-state conditions. However, highly heterogeneous domains produced outliers in which chemical discharge could be as high as 10-20 times the steady-state value during high-flow periods, and in transport-dominated systems (log10(Da) < 0) chemical discharge could drop to half the steady-state value during unusually low-flow periods.
In conclusion, ignoring spatio-temporal heterogeneities in a numerical modeling approach may lead to inaccurate estimates of nutrient export and microbial biomass. The results are relevant to long-term field monitoring studies, and for homogeneous soil column-scale experiments investigating the role of temporal dynamics on microbial redox dynamics.
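The regime classification used in this abstract can be sketched minimally in Python. This is an illustrative toy, not code from the study: the first-order rate constant, travel times, and function names are all assumptions.

```python
import math

def damkohler(reaction_rate, residence_time):
    """Dimensionless Damköhler number Da = k * tau for a first-order
    reaction with rate constant k [1/day] and travel time tau [day]."""
    return reaction_rate * residence_time

def regime(da):
    """Classify a domain by the log10(Da) threshold used in the abstract."""
    return "reaction-dominated" if math.log10(da) > 0 else "transport-dominated"

# Slow flow (long travel time) pushes toward reaction dominance,
# fast flow toward transport dominance.
print(regime(damkohler(0.5, 10.0)))  # Da = 5.0  -> reaction-dominated
print(regime(damkohler(0.5, 0.5)))   # Da = 0.25 -> transport-dominated
```

The same two inputs (travel time and rate constant) are the predictors the study identifies for temporally varying chemical discharge.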
Bayesian geomorphology
(2020)
The rapidly growing amount and diversity of data are confronting us more than ever with the need to make informed predictions under uncertainty. The adverse impacts of climate change and natural hazards also motivate our search for reliable predictions. The range of statistical techniques that geomorphologists use to tackle this challenge has been growing, but rarely involves Bayesian methods. Instead, many geomorphic models rely on estimated averages that largely miss out on the variability of form and process. Yet seemingly fixed estimates of channel heads, sediment rating curves or glacier equilibrium lines, for example, are all prone to uncertainties. Neighbouring scientific disciplines such as physics, hydrology or ecology have readily embraced Bayesian methods to fully capture and better explain such uncertainties, as the necessary computational tools have advanced greatly. The aim of this article is to introduce the Bayesian toolkit to scientists concerned with Earth surface processes and landforms, and to show how geomorphic models might benefit from probabilistic concepts. I briefly review the use of Bayesian reasoning in geomorphology, and outline the corresponding variants of regression and classification in several worked examples.
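As a minimal illustration of the Bayesian reasoning this article advocates, the sketch below performs a conjugate normal update for a single geomorphic quantity, replacing a fixed point estimate with a posterior mean and spread. The prior, the observation noise, and the channel-head example data are hypothetical, not taken from the article.

```python
import numpy as np

def posterior_normal(prior_mean, prior_sd, data, data_sd):
    """Conjugate Bayesian update for a normal mean with known observation
    noise. Returns the posterior mean and standard deviation, which
    quantify remaining uncertainty instead of giving a single fixed value."""
    n = len(data)
    prior_prec = 1.0 / prior_sd**2          # precision of the prior
    data_prec = n / data_sd**2              # precision contributed by the data
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * np.mean(data))
    return post_mean, np.sqrt(post_var)

# Hypothetical channel-head elevations [m] with a vague prior.
obs = np.array([412.0, 398.0, 405.0, 420.0])
mean, sd = posterior_normal(prior_mean=400.0, prior_sd=50.0, data=obs, data_sd=15.0)
print(f"posterior: {mean:.1f} +/- {sd:.1f} m")
```

The posterior standard deviation shrinks as more observations arrive, which is exactly the behaviour that fixed estimates of channel heads or rating curves cannot express.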
Uncertainty is an essential part of atmospheric processes and thus inherent to weather forecasts. Nevertheless, weather forecasts and warnings are still predominantly issued as deterministic (yes or no) forecasts, although research suggests that providing weather forecast users with additional information about the forecast uncertainty can enhance the preparation of mitigation measures. Communicating forecast uncertainty would allow information on possible future events to be provided at an earlier time. The desired benefit is to enable users to begin preparatory protective action earlier, based on their own risk assessment and decision threshold. But not all users have the same threshold for taking action. In the course of the project WEXICOM (‘Wetterwarnungen: Von der Extremereignis-Information zu Kommunikation und Handlung’), funded by the Deutscher Wetterdienst (DWD), three studies were conducted between 2012 and 2016 to reveal how weather forecasts and warnings are reflected in weather-related decision-making. The studies asked which factors influence the perception of forecasts and the decision to take protective action, and how forecast users make sense of probabilistic information and the additional lead time. In a first exploratory study conducted in 2012, members of emergency services in Germany were asked how weather warnings are communicated to professional end users in the emergency community and how the warnings are converted into mitigation measures. A large number of open questions were selected to identify new topics of interest. The questions covered topics such as users’ confidence in forecasts, their understanding of probabilistic information, and their lead time and decision thresholds for starting preparatory mitigation measures. Results show that emergency service personnel generally have a good sense of the uncertainty inherent in weather forecasts.
Although no single probability threshold could be identified at which organisations start preparatory mitigation measures, it became clear that emergency services tend to avoid forecasts based on low probabilities as a basis for their decisions. Based on these findings, a second study conducted with residents of Berlin in 2014 further investigated the question of decision thresholds. The survey questions related to the perception of and prior experience with severe weather, trustworthiness of forecasters and confidence in weather forecasts, and socio-demographic and socio-economic characteristics. Within the questionnaire, a scenario was created to determine individual decision thresholds and to see whether subgroups of the sample led to different thresholds. The results show that people’s willingness to act tends to be higher, and decision thresholds tend to be lower, if the expected weather event is more severe or the property at risk is of higher value. Several factors influencing risk perception, such as education, housing status and ability to act, have significant effects, whereas socio-demographic determinants alone are often not sufficient to fully grasp risk perception and protection behaviour. Parallel to the quantitative studies, an interview study was conducted with 27 members of German civil protection between 2012 and 2016. The results show that the latest developments in (numerical) weather forecasting do not necessarily fit the current practice of German emergency services. These practices mostly rely reactively on alarms and ground truth rather than on anticipation based on prognoses or forecasts. As the potential consequences, rather than the event characteristics, determine protective action, the findings support the call and need for impact-based warnings. Forecasters will rely on impact data and need to learn how users understand impact.
Therefore, it is recommended to enhance weather communication not only by improving computer models and observation tools, but also by focusing on the aspects of communication and collaboration. Using information about uncertainty demands awareness about and acceptance of the limits of knowledge, hence, the capabilities of the forecaster to anticipate future developments of the atmosphere and the capabilities of the user to make sense of this information.
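The idea that users differ in their decision thresholds can be sketched with the classic cost-loss model from forecast decision theory. This framework is not named in the studies above; it is a standard stand-in for the behaviour they describe: a rational user acts when the forecast probability exceeds their cost/loss ratio.

```python
def should_act(forecast_prob, cost, loss):
    """Classic cost-loss rule: take protective action when the forecast
    probability of the event exceeds the ratio of the cost of acting to
    the loss avoided by acting."""
    return forecast_prob > cost / loss

# Two hypothetical users facing the same 30% warning but different stakes:
print(should_act(0.30, cost=100, loss=1000))  # threshold 0.10 -> True (act)
print(should_act(0.30, cost=100, loss=200))   # threshold 0.50 -> False (wait)
```

The same probabilistic forecast thus rationally triggers action for one user and not for another, which is why no single organisational threshold emerged in the first study.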
Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches such as depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates provided by BT-FLEMO represents the variation range of loss estimates of the other models in the case study well.
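BT-FLEMO itself is not reproduced here. As a hedged stand-in, the sketch below shows the general mechanism it relies on: bootstrap-aggregated regression trees whose ensemble spread yields a loss distribution rather than a point estimate. The hand-rolled depth-1 trees and the depth-loss data are invented for illustration.

```python
import numpy as np

def fit_stump(x, y):
    """Fit a depth-1 regression tree: best single threshold split on x."""
    best_sse, best_split = np.inf, None
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if sse < best_sse:
            best_sse, best_split = sse, (t, left.mean(), right.mean())
    return best_split

def bagged_predict(x, y, x_new, n_trees=200, seed=42):
    """Bootstrap-aggregated stumps: the spread of the ensemble's
    predictions serves as a simple stand-in for prediction uncertainty."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(x), len(x))   # bootstrap resample
        split = fit_stump(x[idx], y[idx])
        if split is None:                        # degenerate bootstrap sample
            continue
        t, lo, hi = split
        preds.append(lo if x_new <= t else hi)
    return np.array(preds)

# Invented data: water depth [m] vs relative building loss [0-1].
depth = np.array([0.2, 0.4, 0.5, 0.9, 1.2, 1.5, 2.0, 2.5])
loss = np.array([0.05, 0.08, 0.10, 0.25, 0.30, 0.40, 0.55, 0.60])
dist = bagged_predict(depth, loss, x_new=1.0)
print(f"median loss {np.median(dist):.2f}, "
      f"IQR {np.percentile(dist, 75) - np.percentile(dist, 25):.2f}")
```

Reporting the quantiles of `dist` instead of its mean is what distinguishes such a probabilistic model from the deterministic depth-damage functions it is compared against.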
Communicating uncertainties in scientific evidence is important to accurately reflect scientific knowledge, increase public understanding of uncertainty, and signal transparency and honesty in reporting. While techniques have been developed to facilitate the communication of uncertainty, many have not been empirically tested, compared across different types of uncertainty, or evaluated for their effects on different cognitive, trust, and behavioral outcomes. The present study examined how a point estimate, an imprecise estimate, conflicting estimates, or a statement about the lack of evidence about treatment effects influenced participants' responses to communications about medical evidence. For each type of uncertainty, we adapted three display formats to communicate the information: tables, bar graphs, and icon arrays. We compared participants' best estimates of treatment effects, as well as effects on recall, subjective evaluations (understandability and usefulness), certainty perceptions, perceptions of the trustworthiness of the information, and behavioral intentions. We did not find any detrimental effects from communicating imprecision or conflicting estimates relative to a point estimate on any outcome. Furthermore, there were more favorable responses to communicating imprecision or conflicting estimates relative to lack of evidence, where participants estimated the treatment would improve outcomes by 30-50% relative to a placebo. There were no differences across display formats, suggesting that, if well designed, the choice of format may not matter. Future research on specific display formats or uncertainty types, and with larger sample sizes, would be needed to detect small effects. Implications for the communication of uncertainty are discussed.
Model uncertainty quantification is an essential component of effective data assimilation. Model errors associated with sub-grid scale processes are often represented through stochastic parameterizations of the unresolved process. Many existing stochastic parameterization schemes are only applicable when knowledge of the true sub-grid scale process or full observations of the coarse scale process are available, which is typically not the case in real applications. We present a methodology for estimating the statistics of sub-grid scale processes for the more realistic case that only partial observations of the coarse scale process are available. Model error realizations are estimated over a training period by minimizing their conditional sum of squared deviations given some informative covariates (e.g., state of the system), constrained by available observations and assuming that the observation errors are smaller than the model errors. From these realizations a conditional probability distribution of additive model errors given these covariates is obtained, allowing for complex non-Gaussian error structures. Random draws from this density are then used in actual ensemble data assimilation experiments. We demonstrate the efficacy of the approach through numerical experiments with the multi-scale Lorenz 96 system using both small and large time scale separations between slow (coarse scale) and fast (fine scale) variables. The resulting error estimates and forecasts obtained with this new method are superior to those from two existing methods.
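The core idea of drawing additive model errors conditioned on a covariate, without a Gaussian assumption, can be sketched by binning training-period errors by the covariate and resampling them empirically. The binning scheme and synthetic data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def conditional_error_sampler(covariate, errors, n_bins=5, rng=None):
    """Build an empirical sampler of additive model errors conditioned on
    a covariate (e.g. the coarse-scale state): training-period errors are
    binned by the covariate and resampled, so no Gaussian form is imposed."""
    if rng is None:
        rng = np.random.default_rng(0)
    edges = np.quantile(covariate, np.linspace(0, 1, n_bins + 1))
    def draw(x, size=1):
        b = int(np.clip(np.searchsorted(edges, x) - 1, 0, n_bins - 1))
        pool = errors[(covariate >= edges[b]) & (covariate <= edges[b + 1])]
        return rng.choice(pool, size=size)
    return draw

# Synthetic training period: the error spread grows with the state value.
rng = np.random.default_rng(1)
state = rng.uniform(0, 10, 1000)
errs = rng.normal(0.0, 0.1 + 0.05 * state)
draw = conditional_error_sampler(state, errs)
print("error spread at state 9:", np.std(draw(9.0, size=500)))
print("error spread at state 1:", np.std(draw(1.0, size=500)))
```

In an ensemble data assimilation experiment, each ensemble member would receive its own draw from this conditional density as a stochastic additive perturbation.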
Extreme weather events are likely to occur more often under climate change and the resulting effects on ecosystems could lead to a further acceleration of climate change. But not all extreme weather events lead to extreme ecosystem response. Here, we focus on hazardous ecosystem behaviour and identify coinciding weather conditions. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and climate conditions. Given the risk assessment terminology, vulnerability and risk for the previously defined hazard are estimated on the basis of observed hazardous ecosystem behaviour.
We apply this approach to extreme responses of terrestrial ecosystems to drought, defining the hazard as a negative net biome productivity over a 12-month period. We show an application for two selected sites using data for 1981-2010 and then apply the method to the pan-European scale for the same period, based on numerical modelling results (LPJmL for ecosystem behaviour; ERA-Interim data for climate).
Our site-specific results demonstrate the applicability of the proposed method, using the SPEI to describe the climate condition. The site in Spain provides an example of vulnerability to drought because the expected value of the SPEI is 0.4 lower for hazardous than for non-hazardous ecosystem behaviour. In northern Germany, on the contrary, the site is not vulnerable to drought because the SPEI expectation values imply wetter conditions in the hazard case than in the non-hazard case.
At the pan-European scale, ecosystem vulnerability to drought is identified in the Mediterranean and temperate regions, whereas Scandinavian ecosystems are vulnerable under conditions without water shortages. These first model-based applications indicate the conceptual advantages of the proposed method by focusing on the identification of critical weather conditions under which hazardous ecosystem behaviour is observed in the analysed data set. Application of the method to empirical time series and to future climate would be important next steps to test the approach.
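The vulnerability measure described for the site-specific results (expected SPEI lower under hazard than under no hazard) can be sketched as a conditional-mean difference. The series below are synthetic and merely mimic the reported behaviour of the Spanish site.

```python
import numpy as np

def drought_vulnerability(spei, nbp, threshold=0.0):
    """Difference of expected SPEI between hazardous and non-hazardous
    periods, with hazard defined as negative net biome productivity.
    A clearly negative value indicates vulnerability to drought."""
    hazard = nbp < threshold
    return spei[hazard].mean() - spei[~hazard].mean()

# Synthetic series: hazard periods are made drier (lower SPEI) on average.
rng = np.random.default_rng(7)
nbp = rng.normal(0.2, 0.5, 1000)                     # net biome productivity
spei = rng.normal(0.0, 1.0, 1000) - 0.4 * (nbp < 0)  # drier when hazardous
print(f"E[SPEI | hazard] - E[SPEI | no hazard] = "
      f"{drought_vulnerability(spei, nbp):.2f}")
```

A value near -0.4 here corresponds to the situation reported for the Spanish site, while a value near zero or positive corresponds to the northern German site, which is not vulnerable to drought.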