TY - JOUR A1 - Khurana, Swamini A1 - Hesse, Falk A1 - Kleidon-Hildebrandt, Anke A1 - Thullner, Martin T1 - Should we worry about surficial dynamics when assessing nutrient cycling in the groundwater? JF - Frontiers in water N2 - The fluxes of water and solutes in the subsurface compartment of the Critical Zone are temporally dynamic, and it is unclear how this impacts microbially mediated nutrient cycling in the spatially heterogeneous subsurface. To investigate this, we undertook numerical modeling, simulating transport in a wide range of spatially heterogeneous domains and the biogeochemical transformation of organic carbon and nitrogen compounds by a complex microbial community with four distinct functional groups in water-saturated subsurface compartments. We performed a comprehensive uncertainty analysis accounting for varying residence times and spatial heterogeneity. While the aggregated removal of chemical species in the domains over the entire simulation period was approximately the same as under steady-state conditions, the sub-scale temporal variation of microbial biomass and chemical discharge from a domain depended strongly on the interplay of spatial heterogeneity and temporal dynamics of the forcing. We showed that the travel time and the Damköhler number (Da) can be used to predict the temporally varying chemical discharge from a spatially heterogeneous domain. In homogeneous domains, chemical discharge under temporally dynamic conditions could be double that under steady-state conditions, while microbial biomass varied by up to 75% of the steady-state value. In heterogeneous domains, the interquartile range of uncertainty in chemical discharge in reaction-dominated systems (log10(Da) > 0) was double that under steady-state conditions. However, highly heterogeneous domains produced outliers in which chemical discharge could be as high as 10-20 times the steady-state value during high-flow periods. In transport-dominated systems (log10(Da) < 0), chemical discharge could be half the steady-state value under unusually low-flow conditions. In conclusion, ignoring spatio-temporal heterogeneities in a numerical modeling approach may lead to substantially inaccurate estimates of nutrient export and microbial biomass. The results are relevant to long-term field monitoring studies and to homogeneous soil-column-scale experiments investigating the role of temporal dynamics in microbial redox dynamics. KW - reactive transport modeling KW - spatio-temporal heterogeneity KW - uncertainty KW - geomicrobial activity KW - nutrient export Y1 - 2022 U6 - https://doi.org/10.3389/frwa.2022.780297 SN - 2624-9375 VL - 4 PB - Frontiers Media CY - Lausanne ER -
TY - JOUR A1 - Sieg, Tobias A1 - Vogel, Kristin A1 - Merz, Bruno A1 - Kreibich, Heidi T1 - Seamless Estimation of Hydrometeorological Risk Across Spatial Scales JF - Earth's Future N2 - Hydrometeorological hazards caused losses of approximately 110 billion U.S. dollars in 2016 worldwide. Current damage estimations do not consider the uncertainties in a comprehensive way, and they are not consistent between spatial scales. Aggregated land use data are used at larger spatial scales, although detailed exposure data at the object level, such as openstreetmap.org, are becoming increasingly available across the globe. We present a probabilistic approach for object-based damage estimation which represents uncertainties and is fully scalable in space.
The approach is applied to and validated with company damage data from the 2013 flood in Germany. Damage estimates are more accurate than those of damage models using land use data, and the estimation works reliably at all spatial scales. Therefore, it can also be used for pre-event analyses and risk assessments. This method takes hydrometeorological damage estimation and risk assessments to the next level, making damage estimates and their uncertainties fully scalable in space, from object to country level, and enabling the exploitation of new exposure data. KW - spatial scales KW - risk assessment KW - hydro-meteorological hazards KW - object-based damage modeling KW - uncertainty KW - probabilistic approaches Y1 - 2019 U6 - https://doi.org/10.1029/2018EF001122 SN - 2328-4277 VL - 7 IS - 5 SP - 574 EP - 581 PB - Wiley-Blackwell CY - Hoboken, NJ ER -
TY - JOUR A1 - Kreibich, Heidi A1 - Botto, Anna A1 - Merz, Bruno A1 - Schröter, Kai T1 - Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO JF - Risk analysis N2 - Flood loss modeling is an important component of risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected by the 2002 flood of the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates provided by BT-FLEMO represents the variation range of the other models' loss estimates in the case study well. KW - Damage modeling KW - multiparameter KW - probabilistic KW - uncertainty KW - validation Y1 - 2016 U6 - https://doi.org/10.1111/risa.12650 SN - 0272-4332 SN - 1539-6924 VL - 37 IS - 4 SP - 774 EP - 787 PB - Wiley CY - Hoboken ER -
TY - JOUR A1 - van der Aa, Han A1 - Leopold, Henrik A1 - Weidlich, Matthias T1 - Partial order resolution of event logs for process conformance checking JF - Decision support systems : DSS N2 - While supporting the execution of business processes, information systems record event logs. Conformance checking relies on these logs to analyze whether the recorded behavior of a process conforms to the behavior of a normative specification. A key assumption of existing conformance checking techniques, however, is that all events are associated with timestamps that allow a total order of events to be inferred per process instance. Unfortunately, this assumption is often violated in practice. Due to synchronization issues, manual event recordings, or data corruption, events are only partially ordered.
In this paper, we put forward the problem of partial order resolution of event logs to close this gap. It refers to the construction of a probability distribution over all possible total orders of the events of an instance. To cope with the order uncertainty in real-world data, we present several estimators for this task, incorporating different notions of behavioral abstraction. Moreover, to reduce the runtime of conformance checking based on partial order resolution, we introduce an approximation method that comes with a bounded error in terms of accuracy. Our experiments with real-world and synthetic data reveal that our approach improves accuracy considerably over the state of the art. KW - process mining KW - conformance checking KW - partial order resolution KW - data KW - uncertainty Y1 - 2020 U6 - https://doi.org/10.1016/j.dss.2020.113347 SN - 0167-9236 SN - 1873-5797 VL - 136 PB - Elsevier CY - Amsterdam [u.a.] ER -
TY - JOUR A1 - Wagener, Thorsten A1 - Reinecke, Robert A1 - Pianosi, Francesca T1 - On the evaluation of climate change impact models JF - Wiley interdisciplinary reviews : Climate change N2 - In-depth understanding of the potential implications of climate change is required to guide decision- and policy-makers when developing adaptation strategies and designing infrastructure suitable for future conditions. Impact models that translate potential future climate conditions into variables of interest are needed to create the causal connection between a changing climate and its impacts on different sectors. Recent surveys suggest that the primary strategy for validating such models (and hence for justifying their use) relies heavily on assessing the accuracy of model simulations by comparing them against historical observations. We argue that such a comparison is necessary and valuable, but not sufficient to achieve a comprehensive evaluation of climate change impact models. We believe that a complementary, largely observation-independent, step of model evaluation is needed to ensure more transparency of model behavior and greater robustness of scenario-based analyses. This step should address the following four questions: (1) Do modeled dominant process controls match our system perception? (2) Is my model's sensitivity to changing forcing as expected? (3) Do modeled decision levers show adequate influence? (4) Can we attribute uncertainty sources throughout the projection horizon? We believe that global sensitivity analysis, with its ability to investigate a model's response to joint variations of multiple inputs in a structured way, offers a coherent approach to address all four questions comprehensively. Such additional model evaluation would strengthen stakeholder confidence in model projections and, therefore, in the adaptation strategies derived with the help of impact models.
This article is categorized under: Climate Models and Modeling > Knowledge Generation with Models; Assessing Impacts of Climate Change > Evaluating Future Impacts of Climate Change KW - adaptation KW - sensitivity analysis KW - uncertainty KW - validation Y1 - 2022 U6 - https://doi.org/10.1002/wcc.772 SN - 1757-7780 SN - 1757-7799 VL - 13 IS - 3 PB - Wiley CY - Hoboken ER -
TY - JOUR A1 - Buss, Martin A1 - Kearney, Eric T1 - Navigating the unknown BT - uncertainty moderates the link between visionary leadership, perceived meaningfulness, and turnover intentions JF - Journal of occupational and organizational psychology N2 - Visionary leadership is considered to be one of the most important elements of effective leadership. Among other things, it is related to followers' perceived meaningfulness of their work. However, little is known about whether uncertainty in the workplace moderates the effects of visionary leadership. Given that uncertainty is rising in many, if not most, workplaces, it is vital to understand whether this development influences the extent to which visionary leadership is associated with followers' perceived meaningfulness. In a two-source, lagged-design field study of 258 leader-follower dyads from different settings, we show that uncertainty moderates the relation between visionary leadership and followers' perceived meaningfulness such that this relation is more strongly positive when uncertainty is high rather than low. Moreover, we show that with increasing uncertainty, visionary leadership is more negatively related to followers' turnover intentions via perceived meaningfulness. This research broadens our understanding of how visionary leadership may be a particularly potent tool in times of increasing uncertainty. KW - follower turnover intentions KW - perceived meaningfulness KW - uncertainty KW - visionary leadership Y1 - 2024 U6 - https://doi.org/10.1111/joop.12500 SN - 0963-1798 SN - 2044-8325 PB - Wiley CY - Hoboken, NJ ER -
TY - JOUR A1 - Pathiraja, Sahani Darschika A1 - Leeuwen, Peter Jan van T1 - Multiplicative Non-Gaussian model error estimation in data assimilation JF - Journal of advances in modeling earth systems : JAMES N2 - Model uncertainty quantification is an essential component of effective data assimilation. Model errors associated with sub-grid scale processes are often represented through stochastic parameterizations of the unresolved process. Many existing stochastic parameterization schemes are only applicable when knowledge of the true sub-grid scale process or full observations of the coarse scale process are available, which is typically not the case in real applications. We present a methodology for estimating the statistics of sub-grid scale processes for the more realistic case in which only partial observations of the coarse scale process are available. Model error realizations are estimated over a training period by minimizing their conditional sum of squared deviations given some informative covariates (e.g., the state of the system), constrained by available observations and assuming that the observation errors are smaller than the model errors. From these realizations, a conditional probability distribution of additive model errors given these covariates is obtained, allowing for complex non-Gaussian error structures. Random draws from this density are then used in actual ensemble data assimilation experiments.
We demonstrate the efficacy of the approach through numerical experiments with the multi-scale Lorenz 96 system, using both small and large time-scale separations between slow (coarse scale) and fast (fine scale) variables. The resulting error estimates and forecasts obtained with this new method are superior to those from two existing methods. KW - model uncertainty KW - non-Gaussian KW - data-driven KW - uncertainty KW - quantification KW - Lorenz 96 KW - sub-grid scale Y1 - 2022 U6 - https://doi.org/10.1029/2021MS002564 SN - 1942-2466 VL - 14 IS - 4 PB - American Geophysical Union CY - Washington ER -
TY - JOUR A1 - Dormann, Carsten F. A1 - Schymanski, Stanislaus J. A1 - Cabral, Juliano Sarmento A1 - Chuine, Isabelle A1 - Graham, Catherine A1 - Hartig, Florian A1 - Kearney, Michael A1 - Morin, Xavier A1 - Römermann, Christine A1 - Schröder-Esselbach, Boris A1 - Singer, Alexander T1 - Correlation and process in species distribution models: bridging a dichotomy JF - Journal of biogeography N2 - Within the field of species distribution modelling, an apparent dichotomy exists between process-based and correlative approaches, where the processes are explicit in the former and implicit in the latter. However, these intuitive distinctions can become blurred when comparing species distribution modelling approaches in more detail. In this review article, we contrast the extremes of the correlative-process spectrum of species distribution models with respect to core assumptions, model building and selection strategies, validation, uncertainties, common errors and the questions they are most suited to answer. The extremes of such approaches differ clearly in many aspects, such as model building approaches, parameter estimation strategies and transferability. However, they also share strengths and weaknesses. We show that claims of one approach being intrinsically superior to the other are misguided and that they ignore the process-correlation continuum as well as the domains of questions that each approach is addressing. Nonetheless, the application of process-based approaches to species distribution modelling lags far behind more correlative (process-implicit) methods, and more research is required to explore their potential benefits. Critical issues for the employment of species distribution modelling approaches are given, together with a guideline for appropriate usage. We close with challenges for the future development of process-explicit species distribution models and how they may complement current approaches to study species distributions. KW - Hypothesis generation KW - mechanistic model KW - parameterization KW - process-based model KW - species distribution model KW - SDM KW - uncertainty KW - validation Y1 - 2012 U6 - https://doi.org/10.1111/j.1365-2699.2011.02659.x SN - 0305-0270 VL - 39 IS - 12 SP - 2119 EP - 2131 PB - Wiley-Blackwell CY - Hoboken ER -
TY - JOUR A1 - McDowell, Michelle A1 - Kause, Astrid T1 - Communicating uncertainties about the effects of medical interventions using different display formats JF - Risk analysis : an international journal N2 - Communicating uncertainties in scientific evidence is important to accurately reflect scientific knowledge, to increase public understanding of uncertainty, and to signal transparency and honesty in reporting.
While techniques have been developed to facilitate the communication of uncertainty, many have not been empirically tested, compared across different types of uncertainty, or evaluated for their effects on different cognitive, trust, and behavioral outcomes. The present study examined how a point estimate, an imprecise estimate, conflicting estimates, or a statement about the lack of evidence about treatment effects influenced participants' responses to communications about medical evidence. For each type of uncertainty, we adapted three display formats to communicate the information: tables, bar graphs, and icon arrays. We compared participants' best estimates of treatment effects, as well as effects on recall, subjective evaluations (understandability and usefulness), certainty perceptions, perceptions of the trustworthiness of the information, and behavioral intentions. We did not find any detrimental effects from communicating imprecision or conflicting estimates relative to a point estimate across any outcome. Furthermore, there were more favorable responses to communicating imprecision or conflicting estimates relative to lack of evidence, where participants estimated the treatment would improve outcomes by 30-50% relative to a placebo. There were no differences across display formats, suggesting that, if displays are well designed, it may not matter which format is used. Future research on specific display formats or uncertainty types and with larger sample sizes would be needed to detect small effects. Implications for the communication of uncertainty are discussed. KW - risk communication KW - uncertainty KW - visual displays Y1 - 2021 U6 - https://doi.org/10.1111/risa.13739 SN - 0272-4332 SN - 1539-6924 VL - 41 IS - 12 SP - 2220 EP - 2239 PB - Wiley CY - Hoboken ER -
TY - JOUR A1 - Korup, Oliver T1 - Bayesian geomorphology JF - Earth surface processes and landforms : the journal of the British Geomorphological Research Group N2 - The rapidly growing amount and diversity of data are confronting us more than ever with the need to make informed predictions under uncertainty. The adverse impacts of climate change and natural hazards also motivate our search for reliable predictions. The range of statistical techniques that geomorphologists use to tackle this challenge has been growing, but rarely involves Bayesian methods. Instead, many geomorphic models rely on estimated averages that largely miss out on the variability of form and process. Yet seemingly fixed estimates of channel heads, sediment rating curves or glacier equilibrium lines, for example, are all prone to uncertainties. Neighbouring scientific disciplines such as physics, hydrology or ecology have readily embraced Bayesian methods to fully capture and better explain such uncertainties, as the necessary computational tools have advanced greatly. The aim of this article is to introduce the Bayesian toolkit to scientists concerned with Earth surface processes and landforms, and to show how geomorphic models might benefit from probabilistic concepts. I briefly review the use of Bayesian reasoning in geomorphology, and outline the corresponding variants of regression and classification in several worked examples. KW - Bayes' rule KW - probability KW - uncertainty KW - prediction Y1 - 2020 U6 - https://doi.org/10.1002/esp.4995 SN - 0197-9337 SN - 1096-9837 VL - 46 IS - 1 SP - 151 EP - 172 PB - Wiley CY - Hoboken ER -