TY - THES
A1 - Kriegler, Elmar
T1 - Imprecise probability analysis for integrated assessment of climate change
T1 - Anwendung der Theorie der unscharfen Wahrscheinlichkeit in der integrierten Analyse des Klimawandels
N2 - We present an application of imprecise probability theory to the quantification of uncertainty in the integrated assessment of climate change. Our work is motivated by the fact that uncertainty about climate change is pervasive and therefore requires a thorough treatment in the integrated assessment process. Classical probability theory faces severe difficulties in this respect, since it cannot capture very poor states of information in a satisfactory manner. A more general framework is provided by imprecise probability theory, which offers a similarly firm evidential and behavioural foundation while allowing more diverse states of information to be captured. An imprecise probability describes the information in terms of lower and upper bounds on probability. For the purpose of our imprecise probability analysis, we construct a diffusion ocean energy balance climate model that parameterises the global mean temperature response to secular trends in the radiative forcing in terms of climate sensitivity and effective vertical ocean heat diffusivity. We compare the model behaviour to the 20th century temperature record in order to derive a likelihood function for these two parameters and the forcing strength of anthropogenic sulphate aerosols. Results show a strong positive correlation between climate sensitivity and ocean heat diffusivity, and between climate sensitivity and the absolute strength of the sulphate forcing. We identify two suitable imprecise probability classes for an efficient representation of the uncertainty about the climate model parameters and provide an algorithm to construct a belief function for the prior parameter uncertainty from a set of probability constraints that can be deduced from the literature or observational data. For the purpose of updating the prior with the likelihood function, we establish a methodological framework that allows us to perform the updating procedure efficiently for two different updating rules: Dempster's rule of conditioning and the Generalised Bayes' rule. Dempster's rule yields a posterior belief function in good qualitative agreement with previous studies that tried to constrain climate sensitivity and sulphate aerosol cooling. In contrast, we are not able to produce meaningful imprecise posterior probability bounds from the application of the Generalised Bayes' rule. We attribute this result mainly to our choice of representing the prior uncertainty by a belief function. We project the Dempster-updated belief function for the climate model parameters onto estimates of future global mean temperature change under several emissions scenarios for the 21st century and several long-term stabilisation policies. Within the limitations of our analysis, we find that a stringent stabilisation level of around 450 ppm carbon dioxide equivalent concentration is required to obtain a non-negligible lower probability of limiting the warming to 2 degrees Celsius. We discuss several frameworks of decision-making under ambiguity and show that they can lead to a variety of, possibly imprecise, climate policy recommendations. We find, however, that poor states of information do not necessarily preclude useful policy advice. We conclude that imprecise probabilities indeed constitute a promising candidate for the adequate treatment of uncertainty in the integrated assessment of climate change. We have constructed prior belief functions that require much weaker assumptions about the prior state of information than a prior probability would, and that can nevertheless be propagated through the entire assessment process. As a caveat, the updating issue needs further investigation. Belief functions are a sensible choice for representing the prior uncertainty only if more restrictive updating rules than the Generalised Bayes' rule are available.
N2 - Diese Arbeit untersucht die Eignung der Theorie der unscharfen Wahrscheinlichkeiten für die Beschreibung der Unsicherheit in der integrierten Analyse des Klimawandels. Die wissenschaftliche Unsicherheit bezüglich vieler Aspekte des Klimawandels ist beträchtlich, so dass ihre angemessene Beschreibung von großer Wichtigkeit ist. Die klassische Wahrscheinlichkeitstheorie weist in diesem Zusammenhang einige Probleme auf, da sie Zustände sehr geringer Information nicht zufriedenstellend beschreiben kann. Die unscharfe Wahrscheinlichkeitstheorie bietet ein gleichermaßen fundiertes Theoriegebäude, welches jedoch eine größere Flexibilität bei der Beschreibung verschiedenartiger Informationszustände erlaubt. Unscharfe Wahrscheinlichkeiten erfassen solche Informationszustände durch die Spezifizierung von unteren und oberen Grenzen an zulässige Werte der Wahrscheinlichkeit. Unsere Analyse des Klimawandels beruht auf einem Energiebilanzmodell mit diffusivem Ozean, welches die globale Temperaturantwort auf eine Änderung der Strahlungsbilanz in Abhängigkeit von zwei Parametern beschreibt: der Klimasensitivität und der effektiven vertikalen Wärmediffusivität im Ozean. Wir vergleichen das Modellverhalten mit den Temperaturmessungen des 20. Jahrhunderts, um eine sogenannte Likelihood-Funktion für die Hypothesen zu diesen beiden Parametern sowie dem kühlenden Einfluss der Sulfataerosole zu ermitteln. Im Ergebnis zeigt sich eine stark positive Korrelation zwischen Klimasensitivität und Wärmediffusivität im Ozean sowie zwischen Klimasensitivität und kühlendem Einfluss der Sulfataerosole. Für die effiziente Beschreibung der Parameterunsicherheit ziehen wir zwei geeignete Modelltypen aus der unscharfen Wahrscheinlichkeitstheorie heran. Wir formulieren einen Algorithmus, der den Informationsgehalt beider Modelle durch eine sogenannte Belief-Funktion beschreibt. Mit Hilfe dieses Algorithmus konstruieren wir Belief-Funktionen für die A-priori-Parameterunsicherheit auf der Grundlage von divergierenden Wahrscheinlichkeitsschätzungen in der Literatur bzw. Beobachtungsdaten. Wir leiten eine Methode her, um die A-priori-Belief-Funktion im Lichte der Likelihood-Funktion zu aktualisieren. Dabei ziehen wir zwei verschiedene Regeln zur Durchführung des Lernprozesses in Betracht: die Dempstersche Regel und die verallgemeinerte Bayessche Regel. Durch Anwendung der Dempsterschen Regel erhalten wir eine A-posteriori-Belief-Funktion, deren Informationsgehalt qualitativ mit den Ergebnissen bisheriger Studien übereinstimmt, die eine Einschränkung der Unsicherheit über die Klimasensitivität und die kühlende Wirkung der Sulfataerosole versucht haben. Im Gegensatz dazu finden wir bei Anwendung der verallgemeinerten Bayesschen Regel keine sinnvollen unteren und oberen Grenzen an die A-posteriori-Wahrscheinlichkeit. Wir stellen fest, dass dieses Resultat maßgeblich durch die Wahl einer Belief-Funktion zur Beschreibung der A-priori-Unsicherheit bedingt ist. Die A-posteriori-Belief-Funktion für die Modellparameter, die wir aus der Anwendung der Dempsterschen Regel erhalten haben, wird zur Abschätzung des zukünftigen Temperaturanstiegs eingesetzt. Wir betrachten verschiedene Emissionsszenarien für das 21. Jahrhundert sowie verschiedene Stabilisierungsziele für den Treibhausgasgehalt in der Atmosphäre. Im Rahmen unserer Analyse finden wir, dass sehr strikte Stabilisierungsziele im Bereich einer Kohlendioxid-Äquivalentkonzentration von ca. 450 ppm in der Atmosphäre notwendig sind, um eine nicht vernachlässigbare untere Wahrscheinlichkeit für die Begrenzung der Erwärmung auf 2 Grad Celsius zu erhalten. Wir diskutieren verschiedene Kriterien für die Entscheidungsfindung unter unscharfer Wahrscheinlichkeit und zeigen, dass sie zu verschiedenen, teilweise unscharfen Politikempfehlungen führen können. Nichtsdestotrotz stellen wir fest, dass eine klare Politikempfehlung auch bei Zuständen schwacher Information möglich sein kann. Wir schließen, dass unscharfe Wahrscheinlichkeiten tatsächlich ein geeignetes Mittel zur Beschreibung der Unsicherheit in der integrierten Analyse des Klimawandels darstellen. Wir haben Algorithmen zur Generierung und Weiterverarbeitung von Belief-Funktionen etabliert, die eine deutlich größere A-priori-Unsicherheit beschreiben können, als durch eine A-priori-Wahrscheinlichkeit möglich wäre. Allerdings erfordert die Frage des Lernprozesses für unscharfe Wahrscheinlichkeiten eine weitergehende Untersuchung. Belief-Funktionen stellen nur dann eine vernünftige Wahl für die Beschreibung der A-priori-Unsicherheit dar, wenn striktere Regeln als die verallgemeinerte Bayessche Regel für den Lernprozess gerechtfertigt werden können.
KW - Anthropogene Klimaänderung
KW - Klima / Umweltschutz
KW - Unschärfe
KW - Intervallwahrscheinlichkeit
KW - Entscheidung bei Unsicherheit
KW - Climate Change
KW - Integrated Assessment
KW - Uncertainty
KW - Imprecise Probability
KW - Decision Making under Ambiguity
Y1 - 2005
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-5611
ER -
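The Dempster-rule updating described in the Kriegler record above can be illustrated with a minimal sketch. The three-element frame, the toy prior mass assignment, and all function names are illustrative assumptions, not the thesis's climate-model implementation; the sketch only shows how conditioning a belief function on evidence redistributes mass and how lower/upper probability bounds arise.

    # Minimal sketch of Dempster's rule of conditioning on a discrete frame.
    # The frame and toy masses are invented for illustration.
    from collections import defaultdict

    def dempster_condition(masses, B):
        """Condition a mass function (dict: frozenset -> mass) on event B."""
        B = frozenset(B)
        conditioned = defaultdict(float)
        conflict = 0.0
        for focal, m in masses.items():
            inter = focal & B
            if inter:
                conditioned[inter] += m
            else:
                conflict += m  # mass incompatible with the evidence
        if conflict >= 1.0:
            raise ValueError("evidence fully contradicts the prior belief function")
        return {A: m / (1.0 - conflict) for A, m in conditioned.items()}

    def belief(masses, A):
        """Lower probability (belief) of A: mass of focal sets contained in A."""
        A = frozenset(A)
        return sum(m for focal, m in masses.items() if focal <= A)

    def plausibility(masses, A):
        """Upper probability (plausibility) of A: mass of focal sets meeting A."""
        A = frozenset(A)
        return sum(m for focal, m in masses.items() if focal & A)

    # Toy frame: climate sensitivity binned into 'low', 'mid', 'high'.
    prior = {
        frozenset({"low", "mid", "high"}): 0.5,  # vacuous mass: very poor information
        frozenset({"mid", "high"}): 0.3,
        frozenset({"low"}): 0.2,
    }
    posterior = dempster_condition(prior, {"mid", "high"})
    print(belief(posterior, {"high"}), plausibility(posterior, {"high"}))  # 0.0 1.0

The final print shows the characteristic imprecision: the posterior lower probability of 'high' is 0 while the upper probability is 1, something a single precise prior could not express.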
TY - JOUR
A1 - Schwanghart, Wolfgang
A1 - Heckmann, Tobias
T1 - Fuzzy delineation of drainage basins through probabilistic interpretation of diverging flow algorithms
JF - Environmental modelling & software with environment data news
N2 - The assessment of uncertainty is a major challenge in geomorphometry. Methods to quantify uncertainty in digital elevation models (DEMs) are needed to assess and report derivatives such as drainage basins. While Monte Carlo (MC) techniques have been developed and employed to assess the variability of second-order derivatives of DEMs, their application requires explicit error modeling and numerous simulations to reliably calculate error bounds. Here, we develop an analytical model to quantify and visualize uncertainty in drainage basin delineation in DEMs. The model is based on the assumption that multiple flow directions (MFD) represent a discrete probability distribution of non-diverging flow networks. The Shannon Index quantifies the uncertainty with which each cell drains into a specific drainage basin outlet. In addition, error bounds for drainage areas can be derived. An application of the model shows that it identifies areas in a DEM where drainage basin delineation is highly uncertain owing to flow dispersion on convex landforms such as alluvial fans. The model allows for a quantitative assessment of the magnitude of expected drainage area variability and delivers constraints on the volatile hydrological behavior observed in a palaeoenvironmental record of lake level change. Since the model cannot account for all uncertainties in drainage basin delineation, we conclude that a joint application with MC techniques is promising for an efficient and comprehensive error assessment in the future.
KW - Digital terrain analysis
KW - Digital elevation model
KW - Uncertainty
KW - Drainage networks
KW - Fuzzy
Y1 - 2012
U6 - https://doi.org/10.1016/j.envsoft.2012.01.016
SN - 1364-8152
VL - 33
SP - 106
EP - 113
PB - Elsevier
CY - Oxford
ER -
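The per-cell uncertainty measure in the Schwanghart & Heckmann record above admits a compact sketch. Assuming p holds a cell's probabilities of draining to each basin outlet (in practice obtained by propagating MFD weights downslope, which is not shown here), the Shannon index H = -Σ p_k ln p_k is zero for unambiguous cells and maximal where flow dispersion makes basin membership uncertain. The numbers are invented.

    # Sketch of the per-cell Shannon index for fuzzy drainage basin membership.
    import numpy as np

    def shannon_index(p, eps=1e-12):
        """Entropy of a cell's outlet-membership probabilities p (any nonneg. weights)."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()                      # normalise to a probability distribution
        return float(-(p * np.log(p + eps)).sum())

    print(shannon_index([1.0, 0.0]))         # ~0: cell clearly in one basin
    print(shannon_index([0.5, 0.5]))         # ln(2): maximal two-basin ambiguity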
TY - JOUR
A1 - Zimmermann, Alexander
A1 - Zimmermann, Beate
T1 - Requirements for throughfall monitoring: The roles of temporal scale and canopy complexity
JF - Agricultural and forest meteorology
N2 - A wide range of basic and applied problems in water resources research requires high-quality estimates of the spatial mean of throughfall. Many throughfall sampling schemes, however, are not optimally adapted to the system under study. The application of inappropriate sampling schemes may partly reflect the lack of generally applicable guidelines on throughfall sampling strategies. In this study we conducted virtual sampling experiments using simulated fields based on empirical throughfall data from three structurally distinct forests (a 12-year-old teak plantation, a 5-year-old young secondary forest, and a 130-year-old secondary forest). In the virtual sampling experiments we assessed the relative error of mean throughfall estimates for 38 different throughfall sampling schemes comprising a variety of funnel- and trough-type collectors and a large range of sample sizes. Moreover, we tested the performance of each scheme for both event-based and accumulated throughfall data. The key findings of our study are threefold. First, as errors of mean throughfall estimates vary as a function of throughfall depth, the decision on which temporal scale (i.e. event-based versus accumulated data) to sample strongly influences the required sampling effort. Second, given a chosen temporal scale, throughfall estimates can vary considerably as a function of canopy complexity. Accordingly, throughfall sampling in simply structured forests requires a comparatively modest effort, whereas heterogeneous forests can be extreme in terms of sampling requirements, particularly if the focus is on reliable data for small events. Third, the efficiency of trough-type collectors depends on the spatial structure of throughfall. Strong, long-range throughfall patterns decrease the efficiency of troughs substantially. Based on the results of our virtual sampling experiments, which we evaluated by applying two contrasting sampling approaches simultaneously, we derive readily applicable guidelines for throughfall monitoring.
KW - Throughfall
KW - Interception
KW - Uncertainty
KW - Spatial structure
KW - Sampling strategy
KW - Forest ecosystem
Y1 - 2014
U6 - https://doi.org/10.1016/j.agrformet.2014.01.014
SN - 0168-1923
SN - 1873-2240
VL - 189
SP - 125
EP - 139
PB - Elsevier
CY - Amsterdam
ER -
TY - JOUR
A1 - Rumpf, Michael
A1 - Tronicke, Jens
T1 - Assessing uncertainty in refraction seismic traveltime inversion using a global inversion strategy
JF - Geophysical prospecting
N2 - To analyse and invert refraction seismic traveltime data, different approaches and techniques have been proposed. One common approach is to invert first-break traveltimes employing local optimization approaches. However, these approaches result in a single velocity model, and it is difficult to assess the quality and to quantify the uncertainty and non-uniqueness of the solution found. To address these problems, we propose an inversion strategy relying on a global optimization approach known as particle swarm optimization. With this approach we generate an ensemble of acceptable velocity models, i.e., models explaining our data equally well. We test and evaluate our approach using synthetic seismic traveltimes and field data collected across a creeping hillslope in the Austrian Alps. Our synthetic study mimics a layered near-surface environment, including a sharp velocity increase with depth and complex refractor topography. Analysing the generated ensemble of acceptable solutions using different statistical measures demonstrates that our inversion strategy is able to reconstruct the input velocity model, including reasonable, quantitative estimates of uncertainty. Our field data set is inverted employing the same strategy, and we further compare our results with the velocity model obtained by a standard local optimization approach and with information from a nearby borehole. This comparison shows that both inversion strategies result in geologically reasonable models (in agreement with the borehole information). Moreover, analysing the model variability of the ensemble generated using our global approach shows that the result of the local optimization approach is part of this model ensemble. Our results demonstrate the benefit of employing a global inversion strategy to generate near-surface velocity models from refraction seismic data sets, especially in cases where no detailed a priori information regarding subsurface structures and velocity variations is available.
KW - Inversion
KW - Seismic refraction
KW - Uncertainty
Y1 - 2015
U6 - https://doi.org/10.1111/1365-2478.12240
SN - 0016-8025
SN - 1365-2478
VL - 63
IS - 5
SP - 1188
EP - 1197
PB - Wiley-Blackwell
CY - Hoboken
ER -
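The ensemble idea in the Rumpf & Tronicke record above can be sketched with a generic particle swarm optimiser: sample many candidate models, keep every model whose misfit falls below an acceptance threshold, and read uncertainty from the spread of that ensemble. The two-parameter placeholder forward problem, the threshold, and all tuning constants below are assumptions for illustration, not the authors' implementation.

    # Sketch: particle swarm optimization collecting an ensemble of acceptable models.
    import numpy as np

    rng = np.random.default_rng(0)

    def misfit(model):
        # Placeholder forward problem: relative squared distance to a hidden model.
        # A real traveltime inversion would compare computed and observed first breaks.
        true = np.array([1500.0, 3200.0])          # e.g., layer velocities in m/s
        return float(np.sum(((model - true) / true) ** 2))

    def pso(n_particles=30, n_iter=200, bounds=(500.0, 5000.0),
            w=0.7, c1=1.5, c2=1.5, accept=1e-3):
        lo, hi = bounds
        x = rng.uniform(lo, hi, size=(n_particles, 2))   # positions (candidate models)
        v = np.zeros_like(x)                             # particle velocities
        pbest, pbest_f = x.copy(), np.array([misfit(m) for m in x])
        gbest = pbest[pbest_f.argmin()].copy()
        ensemble = []                                    # models explaining data equally well
        for _ in range(n_iter):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([misfit(m) for m in x])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()
            ensemble.extend(m.copy() for m, fi in zip(x, f) if fi < accept)
        return gbest, np.array(ensemble)

    best, ensemble = pso()
    # Spread of the acceptable ensemble serves as a simple uncertainty estimate.
    print(best, ensemble.std(axis=0) if len(ensemble) else "no acceptable models")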
TY - JOUR
A1 - Ktenidou, Olga-Joan
A1 - Roumelioti, Zafeiria
A1 - Abrahamson, Norman
A1 - Cotton, Fabrice Pierre
A1 - Pitilakis, Kyriazis
A1 - Hollender, Fabrice
T1 - Understanding single-station ground motion variability and uncertainty (sigma)
BT - lessons learnt from EUROSEISTEST
JF - Bulletin of earthquake engineering : official publication of the European Association for Earthquake Engineering
N2 - Accelerometric data from the well-studied EUROSEISTEST valley are used to investigate ground motion uncertainty and variability. We define a simple local ground motion prediction equation (GMPE) and investigate changes in the standard deviation (σ) and its components, the between-event variability (τ) and the within-event variability (φ). Improving seismological metadata significantly reduces τ (by 30–50%), which in turn reduces the total σ. Improving site information reduces the systematic site-to-site variability, φ_S2S (by 20–30%), in turn reducing φ and, ultimately, σ. Our standard deviations are lower than global values from the literature and closer to path-specific than site-specific values. However, our data have insufficient azimuthal coverage for single-path analysis. Certain stations show higher ground-motion variability, possibly due to topography, basin-edge or downgoing-wave effects. Sensitivity checks show that three recordings per event is a sufficient data selection criterion; however, one of the dataset's advantages is the large number of recordings per station (9–90), which yields good site-term estimates. We examine the uncertainty components for periods from 0.01 to 2 s, binning our data by magnitude; at smaller magnitudes, τ decreases and φ_SS increases, possibly due to κ and source-site trade-offs. Finally, we investigate the alternative approach of computing φ_SS using existing GMPEs instead of creating an ad hoc local GMPE. This is important where data are insufficient to create one, or when site-specific PSHA is performed. We show that global GMPEs may still capture φ_SS, provided that (1) the magnitude scaling errors are accommodated by the event terms, and (2) there are no distance scaling errors (use of a regionally applicable model). Site terms (φ_S2S) computed by different global GMPEs (using different site proxies) vary significantly, especially for hard-rock sites. This indicates that GMPEs may be poorly constrained where they are sometimes most needed, i.e., for hard rock.
KW - Ground motion
KW - Variability
KW - Uncertainty
KW - Single station sigma
KW - Site response
Y1 - 2018
U6 - https://doi.org/10.1007/s10518-017-0098-6
SN - 1570-761X
SN - 1573-1456
VL - 16
IS - 6
SP - 2311
EP - 2336
PB - Springer
CY - Dordrecht
ER -
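The variance components in the Ktenidou et al. record above (τ, φ_S2S, φ_SS) can be sketched with a simple moment-based decomposition of ground-motion residuals. Published studies estimate these terms with mixed-effects regression; the groupwise means below only illustrate the bookkeeping, and the arrays are hypothetical.

    # Sketch: splitting total residuals into between-event (tau), site-to-site
    # (phi_S2S), and single-station within-event (phi_SS) components.
    import numpy as np

    def decompose(residuals, event_id, station_id):
        residuals = np.asarray(residuals, dtype=float)
        event_id, station_id = np.asarray(event_id), np.asarray(station_id)
        # Event terms: mean residual per earthquake.
        eta = {e: residuals[event_id == e].mean() for e in np.unique(event_id)}
        within = residuals - np.array([eta[e] for e in event_id])
        # Station terms: mean within-event residual per station.
        s2s = {s: within[station_id == s].mean() for s in np.unique(station_id)}
        remainder = within - np.array([s2s[s] for s in station_id])
        tau = np.std(list(eta.values()), ddof=1)
        phi_s2s = np.std(list(s2s.values()), ddof=1)
        phi_ss = remainder.std(ddof=1)
        return tau, phi_s2s, phi_ss

    tau, phi_s2s, phi_ss = decompose(
        residuals=[0.3, 0.1, -0.2, -0.4, 0.2, 0.0],   # hypothetical log-residuals
        event_id=[1, 1, 2, 2, 3, 3],
        station_id=["A", "B", "A", "B", "A", "B"],
    )
    print(tau, phi_s2s, phi_ss)
    # Ergodic sigma recombines as sqrt(tau**2 + phi_s2s**2 + phi_ss**2);
    # single-station sigma drops the site term: sqrt(tau**2 + phi_ss**2).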
TY - JOUR
A1 - Fortesa, Josep
A1 - García-Comendador, Julian
A1 - Calsamiglia, A.
A1 - López-Tarazón, José Andrés
A1 - Latron, J.
A1 - Alorda, B.
A1 - Estrany, Joan
T1 - Comparison of stage/discharge rating curves derived from different recording systems
BT - Consequences for streamflow data and water management in a Mediterranean island
JF - The science of the total environment : an international journal for scientific research into the environment and its relationship with man
N2 - Obtaining representative hydrometric values is essential for characterizing extreme events and hydrological dynamics and for detecting possible changes in long-term hydrology. Reliability of streamflow data requires temporal continuity and maintenance of the gauging stations, whose data are affected by epistemic and random sources of error. An assessment of the uncertainties of discharge meterings and stage-discharge rating curves was carried out by comparing the accuracy of the measuring instruments of two different hydrometric networks (i.e., one analogical and one digital) established at the same river location on the Mediterranean island of Mallorca. Furthermore, the effects of such uncertainties on the hydrological dynamics were assessed, considering the significant global change impacts besetting this island. The evaluation was carried out at four representative gauging stations of the hydrographic network with analogical (≈40 years) and digital (≈10 years) data series. The study revealed that the largest source of uncertainty in both the analogical (28–274%) and the digital (17–37%) networks was the stage-discharge rating curves. Their impact on the water resources was also evaluated at the event and annual scales, resulting in average differences in water yields of 183% and 142%, respectively. Such an improved understanding of hydrometric network uncertainties will greatly benefit the interpretation of long-term streamflow records by providing better insights for hydrologic and flood-hazard planning, management and modelling.
KW - Hydrometric networks
KW - Stage-discharge
KW - Metering
KW - Uncertainty
KW - Error propagation
Y1 - 2019
U6 - https://doi.org/10.1016/j.scitotenv.2019.02.158
SN - 0048-9697
SN - 1879-1026
VL - 665
SP - 968
EP - 981
PB - Elsevier Science
CY - Amsterdam
ER -
TY - JOUR
A1 - Wendt, Julia
A1 - Morriss, Jayne
T1 - An examination of intolerance of uncertainty and contingency instruction on multiple indices during threat acquisition and extinction training
JF - International journal of psychophysiology : official journal of the International Organization of Psychophysiology
N2 - Individuals who score high in self-reported Intolerance of Uncertainty (IU) tend to find uncertainty aversive. Prior research has demonstrated that, under uncertainty, individuals with high IU display difficulties in updating learned threat associations to safety associations. Importantly, recent research has shown that providing contingency instructions about threat and safety contingencies (i.e. reducing uncertainty) to individuals with high IU promotes the updating of learned threat associations to safety associations. Here we aimed to conceptually replicate IU- and contingency-instruction-based effects by conducting a secondary analysis of self-reported IU, ratings, skin conductance, and functional magnetic resonance imaging (fMRI) data recorded during uninstructed/instructed blocks of threat acquisition and threat extinction training (n = 48). Generally, no significant associations were observed between self-reported IU and differential responding to learned threat and safety cues for any measure during uninstructed/instructed blocks of threat acquisition and threat extinction training. There was some tentative evidence that higher IU was associated with greater ratings of unpleasantness and arousal to the safety cue after the experiment, and with greater skin conductance responses to the safety cue during extinction generally. Potential explanations for these null effects and directions for future research are discussed.
KW - Acquisition
KW - Extinction
KW - Threat
KW - Instructions
KW - Intolerance of Uncertainty
KW - Skin conductance
KW - fMRI
Y1 - 2022
U6 - https://doi.org/10.1016/j.ijpsycho.2022.05.005
SN - 0167-8760
SN - 1872-7697
VL - 177
SP - 171
EP - 178
PB - Elsevier
CY - Amsterdam [u.a.]
ER -
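As a closing illustration, the stage-discharge rating curves whose uncertainty dominates the error budget in the Fortesa et al. record above are commonly modelled with a power law, Q = a(h - h0)^b, fitted to gauged stage/discharge pairs. The sketch below fits such a curve to invented observations; the data, parameter values and starting guesses are assumptions for illustration, not values from the study.

    # Sketch: fitting a power-law stage-discharge rating curve Q = a*(h - h0)**b.
    import numpy as np
    from scipy.optimize import curve_fit

    def rating_curve(h, a, b, h0):
        # Clip keeps the base positive below the cease-to-flow stage h0.
        return a * np.clip(h - h0, 1e-6, None) ** b

    h_obs = np.array([0.4, 0.7, 1.1, 1.6, 2.2])     # stage (m), hypothetical gaugings
    q_obs = np.array([0.9, 3.1, 8.4, 18.0, 35.0])   # discharge (m^3/s), hypothetical

    params, _ = curve_fit(rating_curve, h_obs, q_obs, p0=(5.0, 1.8, 0.1), maxfev=10000)
    print(f"Q(h=2.0 m) = {rating_curve(2.0, *params):.1f} m^3/s")

Refitting the curve to subsets of gaugings, or to the analogical and digital records separately, and comparing the predicted discharges gives a simple handle on the rating-curve uncertainty the study quantifies.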