Between 1990 and 1994, around 1,000 properties in the former GDR that had been used by the Soviet Army and the NVA (the East German National People's Army) for military exercises were handed over to the federal and state governments. The largest military training areas lie in Brandenburg and are today partly integrated into large protected areas, while other sites are still actively used by the Bundeswehr. As a result of military operations, the soils of these training areas are often contaminated with unexploded ordnance, ammunition residues, fuel and lubricant residues, and in some cases even chemical warfare agents. On almost all of these properties, however, areas of high nature conservation value exist alongside the zones contaminated by ammunition and military exercises; in the open-land areas in particular, conservation value and ordnance contamination may well coincide. Characteristic of these open areas, which include dwarf-shrub heaths, dry grasslands, desert-like sand plains, and other nutrient-poor treeless habitats, are their large extent, their seclusion, and their particular use and management, i.e. the absence of agriculture, forestry, and settlement. These characteristics formed the basis for the development of a specially adapted flora and fauna. After military operations ended, large-scale succession (the gradual change in the composition of plant and animal communities) set in across wide areas, in places already transforming these open areas into forest and thus causing them to disappear. This in turn led to the loss of the animal and plant species bound to these open-land habitats. To preserve, shape, and develop these open areas, an interdisciplinary group of natural scientists therefore examined various methods and concepts for their respective effectiveness, so that measures suited to the respective site conditions could ultimately be initiated.
A prerequisite for initiating these measures is knowledge of the respective site conditions, i.e. the current state, as well as of how the areas develop, i.e. their dynamics. This allows the future development of the areas to be estimated so that measures can be deployed efficiently. Geographic information systems (GIS) play a decisive role in the digital documentation of biotope and land use types, since they can process large volumes of spatially and temporally referenced geometry and attribute data. A domain-specific GIS for military training areas was therefore designed and implemented. The work comprised the design of the database and the object model as well as of domain-specific modelling, analysis, and presentation functions. For integrating thematic data into the GIS database, a metadata catalogue was also developed and made available as an additional GIS tool. The base data for the GIS were obtained from remote sensing data, topographic maps, and field surveys. As an instrument for estimating future development, the simulation tool AST4D was developed, which can use the (raster) data of the GIS as input for simulations and, conversely, feed simulation results back into the GIS. In addition, the data can be visualized with spatial reference in AST4D. The mathematical construct underlying the tool is a so-called cellular automaton, with which the development of the areas can be simulated under different conditions. This made it possible to build different scenarios, i.e. to simulate the development of the areas with different (known) input parameters and the resulting different (unknown) end states. Before running any of the three simulation levels available in AST4D, user-specific settings can be made to match the respective study area.
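The core idea of such a cellular automaton can be sketched as follows. This is a minimal illustration, not the actual AST4D model: the three state codes, the 8-neighbour rule, and the transition threshold are assumptions made for the example.

```python
import numpy as np

# Illustrative succession states on a raster grid (assumed, not AST4D's).
OPEN, SHRUB, FOREST = 0, 1, 2

def step(grid, threshold=3):
    """Advance succession one time step: a cell moves to the next
    successional stage when at least `threshold` of its 8 neighbours
    are already at a later stage (edge cells use replicated borders)."""
    padded = np.pad(grid, 1, mode="edge")
    new = grid.copy()
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            window = padded[i:i + 3, j:j + 3]
            # count cells in the 3x3 window at a later stage than the centre
            later = int(np.sum(window > grid[i, j]))
            if grid[i, j] < FOREST and later >= threshold:
                new[i, j] = grid[i, j] + 1
    return new

def simulate(grid, steps):
    """Run several scenario steps; different thresholds or initial
    grids correspond to different scenario assumptions."""
    for _ in range(steps):
        grid = step(grid)
    return grid
```

Different scenarios, in the sense described above, would then correspond to varying the initial grid or the transition rules and comparing the resulting end states.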
Flood damage has increased significantly and is expected to rise further in many parts of the world. For assessing potential changes in flood risk, this paper presents an integrated model chain quantifying flood hazards and losses while considering climate and land use changes. In the case study region, risk estimates for the present and the near future illustrate that changes in flood risk by 2030 are relatively low compared to historic periods. While the impact of climate change on the flood hazard and risk by 2030 is slight or negligible, strong urbanisation associated with economic growth contributes to a remarkable increase in flood risk. It is therefore recommended to routinely consider land use scenarios and economic developments when assessing future flood risks. Further, adapted and sustainable risk management is necessary to counter rising flood losses, in which non-structural measures are becoming more and more important. The case study demonstrates that adaptation by non-structural measures such as stricter land use regulations or enhancement of private precaution is capable of reducing flood risk by around 30%. Ignoring flood risks, in contrast, always leads to further increasing losses (by 17% under our assumptions). These findings underline that private precaution and land use regulation could be taken into account as low-cost adaptation strategies to global climate change in many flood-prone areas. Since such measures reduce flood risk regardless of climate or land use changes, they can also be recommended as no-regret measures.
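In model chains of this kind, flood risk is commonly quantified as expected annual damage (EAD), the integral of damage over annual exceedance probability. A minimal sketch of that calculation, with made-up probabilities and damage figures purely for illustration:

```python
import numpy as np

def expected_annual_damage(exceedance_probs, damages):
    """Expected annual damage via trapezoidal integration of the
    loss-probability curve (damage as a function of annual
    exceedance probability)."""
    order = np.argsort(exceedance_probs)
    p = np.asarray(exceedance_probs, dtype=float)[order]
    d = np.asarray(damages, dtype=float)[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Hypothetical example: 10-, 100-, and 1000-year floods causing
# 1, 10, and 50 units of damage respectively.
ead = expected_annual_damage([0.1, 0.01, 0.001], [1.0, 10.0, 50.0])
```

A risk reduction such as the roughly 30% attributed above to non-structural measures would appear as a correspondingly lower damage curve and hence a lower EAD.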
Information on extreme precipitation for future climate is needed to assess the changes in the frequency and intensity of flooding. The primary source of information in climate change impact studies is climate model projections. However, due to the coarse resolution and biases of these models, they cannot be directly used in hydrological models. Hence, statistical downscaling is necessary to address climate change impacts at the catchment scale.
This study compares eight statistical downscaling methods (SDMs) often used in climate change impact studies. Four methods are based on change factors (CFs), three are bias correction (BC) methods, and one is a perfect prognosis method. The eight methods are used to downscale precipitation output from 15 regional climate models (RCMs) from the ENSEMBLES project for 11 catchments in Europe. The overall results point to an increase in extreme precipitation in most catchments in both winter and summer. For individual catchments, the downscaled time series tend to agree on the direction of the change but differ in the magnitude. Differences between the SDMs vary between the catchments and depend on the season analysed. Similarly, general conclusions cannot be drawn regarding the differences between CFs and BC methods. The performance of the BC methods during the control period also depends on the catchment, but in most cases they represent an improvement compared to RCM outputs. Analysis of the variance in the ensemble of RCMs and SDMs indicates that at least 30% and up to approximately half of the total variance is derived from the SDMs. This study illustrates the large variability in the expected changes in extreme precipitation and highlights the need for considering an ensemble of both SDMs and climate models. Recommendations are provided for the selection of the most suitable SDMs to include in the analysis.
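The two main SDM families compared above can be illustrated with toy implementations: a multiplicative change factor applied to observations, and an empirical quantile mapping as a bias correction. Both are minimal sketches on synthetic series; the methods actually evaluated in the study are more elaborate (e.g. seasonally stratified, applied to daily precipitation).

```python
import numpy as np

def change_factor(obs, rcm_control, rcm_future):
    """CF approach: scale the observed series by the RCM-projected
    relative change in the mean between control and future runs."""
    cf = np.mean(rcm_future) / np.mean(rcm_control)
    return np.asarray(obs, dtype=float) * cf

def quantile_mapping(rcm_series, obs, rcm_control):
    """BC approach (empirical quantile mapping): map each RCM value
    onto the observed distribution at the empirical quantile it
    occupies in the control run."""
    quantiles = np.linspace(0.0, 1.0, 101)
    ctrl_q = np.quantile(rcm_control, quantiles)
    obs_q = np.quantile(obs, quantiles)
    return np.interp(rcm_series, ctrl_q, obs_q)
```

By construction, quantile mapping applied to the control run itself reproduces the observed distribution, which is one way the control-period performance of BC methods is checked.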
Robust appraisals of climate impacts at different levels of global-mean temperature increase are vital to guide assessments of dangerous anthropogenic interference with the climate system. The 2015 Paris Agreement includes a two-headed temperature goal: "holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5 °C". Despite the prominence of these two temperature limits, a comprehensive overview of the differences in climate impacts at these levels is still missing. Here we provide an assessment of key impacts of climate change at warming levels of 1.5 °C and 2 °C, including extreme weather events, water availability, agricultural yields, sea-level rise and risk of coral reef loss. Our results reveal substantial differences in impacts between a 1.5 °C and 2 °C warming that are highly relevant for the assessment of dangerous anthropogenic interference with the climate system. For heat-related extremes, the additional 0.5 °C increase in global-mean temperature marks the difference between events at the upper limit of present-day natural variability and a new climate regime, particularly in tropical regions. Similarly, this warming difference is likely to be decisive for the future of tropical coral reefs. In a scenario with an end-of-century warming of 2 °C, virtually all tropical coral reefs are projected to be at risk of severe degradation due to temperature-induced bleaching from 2050 onwards. This fraction is reduced to about 90% in 2050 and projected to decline to 70% by 2100 for a 1.5 °C scenario. Analyses of precipitation-related impacts reveal distinct regional differences, and hot-spots of change emerge.
The regional reduction in median water availability for the Mediterranean is found to nearly double, from 9% to 17%, between 1.5 °C and 2 °C, and the projected lengthening of regional dry spells increases from 7% to 11%. Projections for agricultural yields differ between crop types as well as world regions. While some (in particular high-latitude) regions may benefit, tropical regions like West Africa, South-East Asia, as well as Central and northern South America are projected to face substantial local yield reductions, particularly for wheat and maize. Best-estimate sea-level rise projections based on two illustrative scenarios indicate a 50 cm rise by 2100 relative to year-2000 levels for a 2 °C scenario, and about 10 cm lower levels for a 1.5 °C scenario. In a 1.5 °C scenario, the rate of sea-level rise in 2100 would be reduced by about 30% compared to a 2 °C scenario. Our findings highlight the importance of regional differentiation in assessing both future climate risks and different vulnerabilities to incremental increases in global-mean temperature. The article provides a consistent and comprehensive assessment of existing projections and a good basis for future work on refining our understanding of the difference between impacts at 1.5 °C and 2 °C warming.
To understand past flood changes in the Rhine catchment, and in particular the role of anthropogenic climate change in extreme flows, an attribution study relying on a proper GCM (general circulation model) downscaling is needed. A downscaling based on conditioning a stochastic weather generator on weather patterns is a promising approach. This approach assumes a strong link between weather patterns and local climate, and sufficient GCM skill in reproducing weather pattern climatology. These presuppositions are evaluated here for the first time, using 111 years of daily climate data from 490 stations in the Rhine basin and comprehensively testing the number of classification parameters and GCM weather-pattern characteristics. A classification based on a combination of mean sea level pressure, temperature, and humidity from the ERA20C reanalysis of atmospheric fields over central Europe, with 40 weather types, was found to be the most appropriate for stratifying six local climate variables. The corresponding skill is quite diverse though, ranging from good for radiation to poor for precipitation. Especially for the latter, it was apparent that pressure fields alone cannot sufficiently stratify local variability. To test the skill of the latest generation of GCMs from the CMIP5 ensemble in reproducing the frequency, seasonality, and persistence of the derived weather patterns, output from 15 GCMs is evaluated. Most GCMs capture these characteristics well, but some models showed consistent deviations in all three evaluation criteria and should be excluded from further attribution analysis.
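The classification step can be illustrated with a simple clustering of daily atmospheric fields into a fixed number of weather types. A plain k-means is used here as a stand-in for the classification scheme actually evaluated in the study; the deterministic initialisation and the tiny synthetic fields are assumptions made for the example.

```python
import numpy as np

def kmeans_patterns(fields, k=4, iters=50):
    """Cluster daily fields (n_days, n_gridpoints) into k weather types.
    In practice `fields` would be standardized MSLP, optionally stacked
    with temperature and humidity, as in the combined classification.
    Returns (labels, centroids)."""
    fields = np.asarray(fields, dtype=float)
    # deterministic init: evenly spaced days serve as starting centroids
    centroids = fields[np.linspace(0, len(fields) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign each day to its nearest centroid (Euclidean distance)
        dists = np.linalg.norm(fields[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update each centroid to the mean of its member days
        for c in range(k):
            members = fields[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    return labels, centroids
```

GCM skill would then be assessed by classifying the model output with the same centroids and comparing the frequency, seasonality, and persistence of the resulting weather-type series against the reanalysis-based classification.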