One common approach to cope with floods is the implementation of structural flood protection measures, such as levees or flood-control reservoirs, which substantially reduce the probability of flooding at the time of implementation. Numerous scholars have problematized this approach. They have shown that increasing the levels of flood protection can attract more settlements and high-value assets in the areas protected by the new measures. Other studies have explored how structural measures can generate a sense of complacency, which can act to reduce preparedness. These paradoxical risk changes have been described as "levee effect", "safe development paradox" or "safety dilemma". In this commentary, we briefly review this phenomenon by critically analysing the intended benefits and unintended effects of structural flood protection, and then we propose an interdisciplinary research agenda to uncover these paradoxical dynamics of risk.
The July 2021 flood disaster in western Germany calls for a critical discussion of flood hazard estimation, the updating of flood hazard maps, and the communication of extreme flood scenarios. In this study, extreme value statistics for the annual maximum peak discharges at the Altenahr gauge in the Ahr valley were computed and compared with and without accounting for historical floods. The return period of the 2021 flood estimated with the generalized extreme value (GEV) distribution, when historical floods are included, ranges from about 2,600 to more than 58,700 years (90% confidence interval) with a median of about 8,600 years, whereas an estimate based only on the systematically gauged 74-year discharge record would theoretically yield a return period of more than 100 million years. Including the historical floods thus dramatically changes the flood quantiles that underlie hazard mapping. Nevertheless, fitting the GEV to the series extended by historical floods shows that the GEV model may not adequately represent the population of floods in the Ahr valley. The sample at hand may be a mixed one, in which the extreme floods are generated by processes distinct from those driving smaller events. The probabilities of extreme floods could therefore be considerably larger than the GEV model suggests; the application of a process-based mixture distribution should be investigated in future work. A comparison of the official hazard maps for extreme floods (HQextrem) in the Ahr valley with the areas inundated in July 2021 reveals a clear discrepancy in the affected areas and the need to reconsider the basis on which the extreme scenarios are derived. Hydrodynamic-numerical simulations of 1,000-year floods (HQ1000) that account for historical events, and of the largest historical flood of 1804, reflect the hazard of the July 2021 flood considerably better, although both scenarios still underestimate the inundated areas. Particular effects, such as the blockage of bridges by debris and geomorphological changes in the river channel, led to even larger inundated areas in July 2021 than the simulations indicate. Based on this analysis, a uniform definition of HQextrem for flood hazard mapping in Germany is proposed, oriented towards higher flood quantiles in the range of HQ1000. In addition, simulation-based reconstructions of the largest reliably documented historical floods and/or synthetic worst-case scenarios should be presented separately in the flood hazard maps. This is an important contribution towards better protecting the potentially affected population and disaster management from being surprised by very rare and extreme floods in the future.
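The core of the GEV-based return period estimation can be sketched in a few lines. This is a minimal illustration with synthetic annual maxima, not the Altenahr record; the shape, location, and scale values and the choice of peak are assumptions for demonstration only.

```python
# Illustrative sketch: fit a GEV to annual maximum discharges and estimate
# the return period of an extreme peak. Synthetic data, NOT the Ahr record.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# 74 synthetic annual maxima (mirroring the 74-year gauged record length)
ams = genextreme.rvs(c=-0.2, loc=100.0, scale=40.0, size=74, random_state=rng)

# Fit the three GEV parameters by maximum likelihood.
shape, loc, scale = genextreme.fit(ams)

# Return period T of a peak q is the reciprocal of its exceedance probability.
q_extreme = ams.max() * 3.0
exceed_prob = genextreme.sf(q_extreme, shape, loc=loc, scale=scale)
return_period = 1.0 / exceed_prob
print(f"Estimated return period for q={q_extreme:.0f}: {return_period:.0f} years")
```

As the abstract notes, such single-distribution estimates can explode to implausible return periods when the largest observed flood lies far outside the systematic record, which is exactly why incorporating historical floods, and possibly mixture distributions, matters.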
The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: first, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Second, the time variation of the flood occurrence rate is derived by a non-parametric kernel estimate, and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is relevant to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
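The index of dispersion mentioned above is simple to compute: for a homogeneous Poisson process the variance of annual event counts equals their mean, so a ratio above one indicates clustering. A minimal sketch with made-up event years (not the German POT data):

```python
# Index of dispersion of annual flood-event counts: var/mean ≈ 1 for a
# homogeneous Poisson process; values > 1 indicate temporal clustering.
# The event years below are synthetic stand-ins.
import numpy as np

event_years = np.array([1932, 1932, 1933, 1940, 1941, 1941, 1941,
                        1955, 1970, 1970, 1971, 1988, 2002, 2002])
years = np.arange(1932, 2006)                       # 1932-2005 study period
counts = np.array([(event_years == y).sum() for y in years])

index_of_dispersion = counts.var(ddof=1) / counts.mean()
print(f"Index of dispersion: {index_of_dispersion:.2f}")
```

Assessing significance would additionally require comparing this statistic against its sampling distribution under the Poisson null, e.g. via simulation, as the paper does with parametric and non-parametric tests.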
To understand past flood changes in the Rhine catchment, and in particular the role of anthropogenic climate change in extreme flows, an attribution study relying on a proper GCM (general circulation model) downscaling is needed. A downscaling based on conditioning a stochastic weather generator on weather patterns is a promising approach. This approach assumes a strong link between weather patterns and local climate, and sufficient GCM skill in reproducing weather pattern climatology. These assumptions are evaluated here for the first time, using 111 years of daily climate data from 490 stations in the Rhine basin and comprehensively testing the number of classification parameters and GCM weather pattern characteristics. A classification based on a combination of mean sea level pressure, temperature, and humidity from the ERA20C reanalysis of atmospheric fields over central Europe with 40 weather types was found to be the most appropriate for stratifying six local climate variables. The corresponding skill is quite diverse though, ranging from good for radiation to poor for precipitation. Especially for the latter it was apparent that pressure fields alone cannot sufficiently stratify local variability. To test the skill of the latest generation of GCMs from the CMIP5 ensemble in reproducing the frequency, seasonality, and persistence of the derived weather patterns, output from 15 GCMs is evaluated. Most GCMs are able to capture these characteristics well, but some models showed consistent deviations in all three evaluation criteria and should be excluded from further attribution analysis.
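Weather-type classifications of the kind described are often built by clustering daily atmospheric fields. The sketch below uses a basic k-means on synthetic pressure anomalies; the paper's actual classification method, 40-type configuration, and ERA20C inputs are not reproduced here, so everything in this snippet is an illustrative assumption.

```python
# Hedged sketch: derive weather types by k-means clustering of daily
# atmospheric fields. Fields, grid size, and number of types are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_days, n_grid, n_types = 500, 50, 4

fields = rng.normal(size=(n_days, n_grid))  # stand-in daily MSLP anomaly maps

# Plain k-means: iterate assignment and centroid update.
centroids = fields[rng.choice(n_days, n_types, replace=False)].copy()
for _ in range(20):
    d = ((fields[:, None, :] - centroids[None]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)                     # nearest weather type per day
    for k in range(n_types):
        if (labels == k).any():
            centroids[k] = fields[labels == k].mean(axis=0)

freq = np.bincount(labels, minlength=n_types) / n_days
print("Weather type frequencies:", np.round(freq, 2))
```

Evaluating a GCM against such a classification would then amount to projecting its daily fields onto the same centroids and comparing frequency, seasonality, and persistence of the resulting type sequences, as the abstract describes.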
Despite its societal relevance, the question whether fluctuations in flood occurrence or magnitude are coherent in space has hardly been addressed in quantitative terms. We investigate this question for Germany by analysing fluctuations in annual maximum series (AMS) values at 68 discharge gauges for the common time period 1932-2005. We find remarkable spatial coherence across Germany given its different flood regimes. For example, there is a tendency that flood-rich/-poor years in sub-catchments of the Rhine basin, which are dominated by winter floods, coincide with flood-rich/-poor years in the southern sub-catchments of the Danube basin, which have their dominant flood season in summer. Our findings indicate that coherence is caused rather by persistence in catchment wetness than by persistent periods of higher/lower event precipitation. Further, we propose to differentiate between event-type and non-event-type coherence. There are quite a number of hydrological years with considerable non-event-type coherence, i.e. AMS values of the 68 gauges are spread out through the year but in the same magnitude range. Years with extreme flooding tend to be of event-type and non-coherent, i.e. there is at least one precipitation event that affects many catchments to varying degrees. Although spatial coherence is a remarkable phenomenon, and large-scale flooding across Germany can lead to severe situations, extreme magnitudes across the whole country within one event or within one year were not observed in the investigated period.
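A simple way to quantify this kind of spatial coherence is to standardize the AMS at each gauge and look at cross-gauge correlations. The sketch below uses synthetic data with a shared "catchment wetness" signal; the gauge count, signal strength, and noise level are assumptions, not the study's data.

```python
# Sketch: spatial coherence as mean pairwise correlation of standardized
# annual maximum series (AMS) across gauges. Synthetic data with a shared
# wetness signal standing in for persistent catchment wetness.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_gauges = 74, 5
common = rng.normal(size=n_years)                 # shared wetness signal
ams = 0.7 * common[:, None] + 0.7 * rng.normal(size=(n_years, n_gauges))

z = (ams - ams.mean(axis=0)) / ams.std(axis=0)    # standardize per gauge
corr = np.corrcoef(z.T)                           # gauge-by-gauge correlation
mean_pairwise = corr[np.triu_indices(n_gauges, k=1)].mean()
print(f"Mean pairwise correlation: {mean_pairwise:.2f}")
```

In this toy setup the shared signal induces positive correlations between all gauge pairs even though each gauge has substantial local noise, mirroring how a common wetness state can synchronize flood-rich and flood-poor years across otherwise different flood regimes.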
This study refines the method for calibrating a glacio-hydrological model based on Hydrograph Partitioning Curves (HPCs), and evaluates its value in comparison to multidata set optimization approaches which use glacier mass balance, satellite snow cover images, and discharge. The HPCs are extracted from the observed flow hydrograph using catchment precipitation and temperature gradients. They indicate the periods when the various runoff processes, such as glacier melt or snow melt, dominate the basin hydrograph. The annual cumulative curve of the difference between average daily temperature and melt threshold temperature over the basin, as well as the annual cumulative curve of average daily snowfall on the glacierized areas, are used to identify the start and end dates of the snow and glacier ablation periods. Model parameters characterizing different runoff processes are calibrated on different HPCs in a stepwise and iterative way. Results show that the HPC-based method (1) delivers model-internal consistency comparable to that of the tri-data set calibration method; (2) improves the stability of calibrated parameter values across various calibration periods; and (3) estimates the contributions of runoff components similarly to the tri-data set calibration method. Our findings indicate the potential of the HPC-based approach as an alternative for hydrological model calibration in glacierized basins, where calibration data sets other than discharge are often unavailable or very costly to obtain.
Flood risk assessments are typically based on scenarios which assume homogeneous return periods of flood peaks throughout the catchment. This assumption is unrealistic for real flood events and may bias risk estimates for specific return periods. We investigate how three assumptions about the spatial dependence affect risk estimates: (i) spatially homogeneous scenarios (complete dependence), (ii) spatially heterogeneous scenarios (modelled dependence) and (iii) spatially heterogeneous but uncorrelated scenarios (complete independence). To this end, the model chain RFM (regional flood model) is applied to the Elbe catchment in Germany, accounting for the spatio-temporal dynamics of all flood generation processes, from the rainfall through catchment and river system processes to damage mechanisms. Different assumptions about the spatial dependence do not influence the expected annual damage (EAD); however, they bias the risk curve, i.e. the cumulative distribution function of damage. The widespread assumption of complete dependence strongly overestimates flood damage, by around 100%, for return periods larger than approximately 200 years. On the other hand, for small and medium floods with return periods smaller than approximately 50 years, damage is underestimated. The overestimation aggravates when risk is estimated for larger areas. This study demonstrates the importance of representing the spatial dependence of flood peaks and damage for risk assessments.
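The key finding, that dependence assumptions leave the EAD unchanged but reshape the tail of the risk curve, can be reproduced with a toy Monte Carlo experiment. The damage function and all numbers below are illustrative assumptions, not the RFM model chain.

```python
# Toy experiment: sum damages over sub-areas whose annual exceedance levels
# are either identical everywhere (complete dependence) or independent.
# EAD is (nearly) identical; the upper tail of the damage distribution is not.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_areas = 100_000, 10

u_dep = np.tile(rng.uniform(size=(n_years, 1)), (1, n_areas))  # one level everywhere
u_ind = rng.uniform(size=(n_years, n_areas))                   # independent levels

def damage(u):
    # Toy damage function: losses grow sharply with the exceedance level.
    return (u ** 8).sum(axis=1)

d_dep, d_ind = damage(u_dep), damage(u_ind)
print(f"EAD, complete dependence:   {d_dep.mean():.3f}")
print(f"EAD, complete independence: {d_ind.mean():.3f}")
print(f"99.5% damage quantile, dep vs ind: "
      f"{np.quantile(d_dep, 0.995):.2f} vs {np.quantile(d_ind, 0.995):.2f}")
```

Because linearity of expectation holds regardless of dependence, the two EAD values agree, while under complete dependence a single extreme year hits all sub-areas at once and inflates the high-return-period quantiles, matching the overestimation the study reports.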
Adaptation to flood risk
(2017)
As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, that is, consecutive flood events that occurred in the same region, with the second flood causing significantly lower damage. These success stories of risk reduction were selected across different socioeconomic and hydro-climatic contexts. The potential of societies to adapt is uncovered by describing triggered societal changes, as well as formal measures and spontaneous processes that reduced flood risk. This novel approach has the potential to build the basis for an international data collection and analysis effort to better understand and attribute changes in risk due to hydrological extremes in the framework of the IAHS's Panta Rhei initiative. Across all case studies, we find that the lower damage caused by the second event was mainly due to significant reductions in vulnerability, for example, via raised risk awareness, preparedness, and improvements in organizational emergency management. Thus, vulnerability reduction plays an essential role for successful adaptation. Our work shows that there is a high potential to adapt, but there remains the challenge to stimulate measures that reduce vulnerability and risk in periods in which extreme events do not occur.
A wide variety of processes controls the time of occurrence, duration, extent, and severity of river floods. Classifying flood events by their causative processes may assist in enhancing the accuracy of local and regional flood frequency estimates and support the detection and interpretation of any changes in flood occurrence and magnitudes. This paper provides a critical review of existing causative classifications of instrumental and preinstrumental series of flood events, discusses their validity and applications, and identifies opportunities for moving toward more comprehensive approaches. So far, no unified definition of causative mechanisms of flood events exists. Existing frameworks for classification of instrumental and preinstrumental series of flood events adopt different perspectives: hydroclimatic (large-scale circulation patterns and atmospheric state at the time of the event), hydrological (catchment-scale precipitation patterns and antecedent catchment state), and hydrograph-based (indirectly considering generating mechanisms through their effects on hydrograph characteristics). All of these approaches intend to capture the flood generating mechanisms and are useful for characterizing the flood processes at various spatial and temporal scales. However, uncertainty analyses with respect to indicators, classification methods, and data to assess the robustness of the classification are rarely performed, which limits the transferability across different geographic regions. It is argued that more rigorous testing is needed. There are opportunities for extending classification methods to include indicators of space-time dynamics of rainfall, antecedent wetness, and routing effects, which will make the classification schemes even more useful for understanding and estimating floods. This article is categorized under: Science of Water > Water Extremes; Science of Water > Hydrological Processes; Science of Water > Methods.
Variability of the Cold Season Climate in Central Asia. Part II: Hydroclimatic Predictability
(2019)
Central Asia (CA) is subject to a large variability of precipitation. This study presents a statistical model relating precipitation anomalies in three subregions of CA in the cold season (November-March) to various predictors in the preceding October. Promising forecast skill is achieved for two subregions covering 1) Uzbekistan, Turkmenistan, Kyrgyzstan, Tajikistan, and southern Kazakhstan and 2) Iran, Afghanistan, and Pakistan. ENSO in October is identified as the major predictor. Eurasian snow cover and the quasi-biennial oscillation further improve the forecast performance. To understand the physical mechanisms, an analysis of teleconnections between these predictors and the wintertime circulation over CA is conducted. The correlation analysis of predictors and large-scale circulation indices suggests a seasonal persistence of tropical circulation modes and a dynamical forcing of the westerly circulation by snow cover variations over Eurasia. An EOF analysis of pressure and humidity patterns allows separating the circulation variability over CA into westerly and tropical modes and confirms that the identified predictors affect the respective circulation characteristics. Based on the previously established weather type classification for CA, the predictors are investigated with regard to their effect on the regional circulation. The results suggest a modification of the Hadley cell due to ENSO variations, with enhanced moisture supply from the Arabian Gulf during El Niño. They further indicate an influence of Eurasian snow cover on the wintertime Arctic Oscillation (AO) and Northern Hemispheric Rossby wave tracks. Positive snow-cover anomalies favor weather types associated with dry conditions, while negative anomalies promote the formation of a quasi-stationary trough over CA, which typically occurs during positive AO conditions.
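The skeleton of such a statistical forecast model is a regression of seasonal precipitation anomalies on the October predictors. The sketch below uses ordinary least squares on synthetic series; the predictor columns, coefficient magnitudes, and record length are assumptions standing in for the real ENSO, snow cover, and QBO indices.

```python
# Hedged sketch: OLS forecast model for cold-season precipitation anomalies
# from October predictors. All series below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(7)
n = 60                                        # hypothetical years of data
X = rng.normal(size=(n, 3))                   # columns: ENSO, snow cover, QBO
true_beta = np.array([0.8, 0.3, 0.1])         # assumed influence strengths
y = X @ true_beta + 0.5 * rng.normal(size=n)  # Nov-Mar precipitation anomaly

# OLS with an intercept column
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
r = np.corrcoef(A @ coef, y)[0, 1]
print(f"Fitted coefficients: {np.round(coef, 2)}")
print(f"In-sample correlation skill: {r:.2f}")
```

In an actual forecast application, skill would of course be assessed out-of-sample, e.g. by leave-one-out cross-validation over the years, rather than by the in-sample correlation shown here.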