TY - JOUR
A1 - Winter, Benjamin
A1 - Schneeberger, Klaus
A1 - Dung, N. V.
A1 - Huttenlau, M.
A1 - Achleitner, S.
A1 - Stötter, J.
A1 - Merz, Bruno
A1 - Vorogushyn, Sergiy
T1 - A continuous modelling approach for design flood estimation on sub-daily time scale
JF - Hydrological sciences journal = Journal des sciences hydrologiques
N2 - Design flood estimation is an essential part of flood risk assessment. Flood frequency analyses and design storm approaches are commonly applied, while derived flood frequency analysis using continuous simulation has recently been receiving more attention. In this study, a continuous hydrological modelling approach on an hourly time scale, driven by a multi-site weather generator in combination with a k-nearest neighbour resampling procedure based on the method of fragments, is applied. The derived 100-year flood estimates in 16 catchments in Vorarlberg (Austria) are compared to (a) the flood frequency analysis based on observed discharges, and (b) a design storm approach. Besides the peak flows, the corresponding runoff volumes are analysed. The spatial dependence structure of the synthetically generated flood peaks is validated against observations. It is demonstrated that the continuous modelling approach achieves plausible results and shows a large variability in runoff volume across the flood events.
KW - derived flood frequency
KW - continuous modelling
KW - temporal disaggregation
KW - flood hazard
KW - synthetic flood events
Y1 - 2019
U6 - https://doi.org/10.1080/02626667.2019.1593419
SN - 0262-6667
SN - 2150-3435
VL - 64
IS - 5
SP - 539
EP - 554
PB - Routledge, Taylor & Francis Group
CY - Abingdon
ER -
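The resampling scheme named in the abstract above, a k-nearest neighbour procedure based on the method of fragments, can be sketched in a few lines. This is a generic illustration, not the paper's implementation: the neighbour search here uses only the daily rainfall total, and the choice of `k` and the kernel weights are arbitrary assumptions.

```python
import numpy as np

def disaggregate_daily(daily_total, obs_daily, obs_hourly, k=5, rng=None):
    """Split one daily rainfall total into 24 hourly values: pick one of the
    k observed days with the most similar daily total and rescale its hourly
    pattern (the "fragments") so that the hours sum to the target total."""
    rng = np.random.default_rng(rng)
    dist = np.abs(np.asarray(obs_daily, dtype=float) - daily_total)
    nn = np.argsort(dist)[:k]          # indices of the k most similar days
    w = 1.0 / np.arange(1, len(nn) + 1)  # closer neighbours are more likely
    chosen = rng.choice(nn, p=w / w.sum())
    fragments = obs_hourly[chosen] / obs_hourly[chosen].sum()
    return daily_total * fragments     # hourly values summing to daily_total
```

By construction the resampled day preserves the observed sub-daily temporal structure while matching the generated daily total, which is the core idea of the method of fragments.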
TY - JOUR
A1 - Kreibich, Heidi
A1 - Di Baldassarre, Giuliano
A1 - Vorogushyn, Sergiy
A1 - Aerts, Jeroen C. J. H.
A1 - Apel, Heiko
A1 - Aronica, Giuseppe T.
A1 - Arnbjerg-Nielsen, Karsten
A1 - Bouwer, Laurens M.
A1 - Bubeck, Philip
A1 - Caloiero, Tommaso
A1 - Chinh, Do T.
A1 - Cortes, Maria
A1 - Gain, Animesh K.
A1 - Giampa, Vincenzo
A1 - Kuhlicke, Christian
A1 - Kundzewicz, Zbigniew W.
A1 - Llasat, Maria Carmen
A1 - Mard, Johanna
A1 - Matczak, Piotr
A1 - Mazzoleni, Maurizio
A1 - Molinari, Daniela
A1 - Dung, Nguyen V.
A1 - Petrucci, Olga
A1 - Schröter, Kai
A1 - Slager, Kymo
A1 - Thieken, Annegret
A1 - Ward, Philip J.
A1 - Merz, Bruno
T1 - Adaptation to flood risk
BT - Results of international paired flood event studies
JF - Earth's Future
N2 - As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, that is, consecutive flood events that occurred in the same region, with the second flood causing significantly lower damage. These success stories of risk reduction were selected across different socioeconomic and hydro-climatic contexts. The potential of societies to adapt is uncovered by describing triggered societal changes, as well as formal measures and spontaneous processes that reduced flood risk. This novel approach has the potential to build the basis for an international data collection and analysis effort to better understand and attribute changes in risk due to hydrological extremes in the framework of the IAHS's Panta Rhei initiative. Across all case studies, we find that lower damage caused by the second event was mainly due to significant reductions in vulnerability, for example, via raised risk awareness, preparedness, and improvements of organizational emergency management. Thus, vulnerability reduction plays an essential role for successful adaptation. Our work shows that there is a high potential to adapt, but there remains the challenge to stimulate measures that reduce vulnerability and risk in periods in which extreme events do not occur.
KW - flooding
KW - vulnerability
KW - global environmental change
KW - adaptation
Y1 - 2017
U6 - https://doi.org/10.1002/2017EF000606
SN - 2328-4277
VL - 5
SP - 953
EP - 965
PB - Wiley
CY - Hoboken
ER -
TY - JOUR
A1 - Vorogushyn, Sergiy
A1 - Apel, Heiko
A1 - Kemter, Matthias
A1 - Thieken, Annegret
T1 - Analyse der Hochwassergefährdung im Ahrtal unter Berücksichtigung historischer Hochwasser
T1 - Analysis of flood hazard in the Ahr Valley considering historical floods
JF - Hydrologie und Wasserbewirtschaftung
N2 - The flood disaster in July 2021 in western Germany calls for a critical discussion on flood hazard assessment, revision of flood hazard maps and communication of extreme flood scenarios. In the presented work, extreme value analysis was carried out for annual maximum peak flow series at the Altenahr gauge on the river Ahr. We compared flood statistics with and without considering historical flood events. An estimate for the return period of the recent flood based on the Generalized Extreme Value (GEV) distribution considering historical floods ranges between about 2,600 and above 58,700 years (90% confidence interval) with a median of approximately 8,600 years, whereas an estimate based on the 74-year-long systematically recorded flow series would theoretically exceed 100 million years. Consideration of historical floods dramatically changes the flood quantiles that are used for the generation of official flood hazard maps. The fitting of the GEV to the time series with historical floods reveals, however, that the model potentially inadequately reflects the flood population. In this case, we might face a mixed sample, in which extreme floods result from very different processes compared to smaller floods. Hence, the probabilities of extreme floods could be much larger than those resulting from a single GEV model. The application of a process-based mixed flood distribution should be explored in future work.
The comparison of the official HQextrem flood maps for the Ahr Valley with the inundation areas from July 2021 shows a striking discrepancy in the affected areas and calls for a revision of the design values used to define extreme flood scenarios. The hydrodynamic simulations of a 1000-year return period flood considering historical events and of the 1804 flood scenario compare much better to the flooded areas from July 2021, though both scenarios still underestimated the flood extent.
Particular effects such as clogging of bridges and geomorphological changes of the river channel led to considerably larger flooded areas in July 2021 compared to the simulation results. Based on this analysis, we call for a consistent definition of HQextrem for flood hazard mapping in Germany, and suggest using high flood quantiles in the range of a 1,000-year flood. Flood maps should additionally include model-based reconstructions of the largest, reliably documented historical floods and/or synthetic worst-case scenarios. This would be an important step towards protecting potentially affected population and disaster management from surprises due to very rare and extreme flood events in future.
N2 - Die Hochwasserkatastrophe im Juli 2021 in Westdeutschland erfordert eine kritische Diskussion über die Abschätzung der Hochwassergefährdung, Aktualisierung von Hochwassergefahrenkarten und Kommunikation von extremen Hochwasserszenarien. In der vorliegenden Arbeit wurde die Extremwertstatistik für die jährlichen maximalen Spitzenabflüsse am Pegel Altenahr im Ahrtal mit und ohne Berücksichtigung historischer Hochwasser berechnet und verglichen. Die Schätzung der Wiederkehrperiode für das aktuelle Hochwasser mittels Generalisierter Extremwertverteilung (GEV) unter Berücksichtigung historischer Hochwasser schwankt zwischen etwa 2.600 und über 58.700 Jahren (90%-Konfidenzintervall) mit einem Median bei etwa 8.600 Jahren, wogegen die Schätzung, die nur auf der systematisch gemessenen Abflusszeitreihe von 74 Jahren basiert, theoretisch eine Wiederkehrperiode von über 100 Millionen Jahren ergeben würde. Die Berücksichtigung der historischen Hochwasser führt zu einer dramatischen Änderung der Hochwasserquantile, die für eine Gefahrenkartierung zugrunde gelegt werden. Die Anpassung der GEV an die Zeitreihe mit historischen Hochwassern zeigt dennoch, dass das GEV-Modell möglicherweise die Grundgesamtheit der Hochwasser im Ahrtal nicht adäquat abbilden kann. Es könnte sich im vorliegenden Fall um eine gemischte Stichprobe handeln, in der die extremen Hochwasser im Vergleich zu kleineren Ereignissen durch besondere Prozesse hervorgerufen werden. Somit könnten die Wahrscheinlichkeiten von extremen Hochwassern deutlich größer sein, als aus dem GEV-Modell hervorgeht. Hier sollte in Zukunft die Anwendung einer prozessbasierten Mischverteilung untersucht werden. Der Vergleich von amtlichen Gefahrenkarten zu Extremhochwassern (HQextrem) im Ahrtal mit den Überflutungsflächen vom Juli 2021 zeigt eine deutliche Diskrepanz in den betroffenen Gebieten und die Notwendigkeit, die Grundlagen zur Erstellung der Extremszenarien zu überdenken. Die hydrodynamisch-numerischen Simulationen von 1.000-jährlichen Hochwassern (HQ1000) unter Berücksichtigung historischer Ereignisse und des größten historischen Hochwassers 1804 können die Gefährdung des Juli-Hochwassers 2021 deutlich besser widerspiegeln, wenngleich auch diese beiden Szenarien die Überflutungsflächen unterschätzen. Besondere Effekte wie die Verklausung von Brücken und die geomorphologischen Änderungen im Flussschlauch führten zu noch größeren Überflutungsflächen im Juli 2021, als die Simulationsergebnisse zeigten. Basierend auf dieser Analyse wird eine einheitliche Festlegung von HQextrem bei Hochwassergefahrenkartierungen in Deutschland vorgeschlagen, die sich an höheren Hochwasserquantilen im Bereich von HQ1000 orientiert. Zusätzlich sollen simulationsbasierte Rekonstruktionen von den größten verlässlich dokumentierten historischen Hochwassern und/oder synthetische Worst-Case-Szenarien in den Hochwassergefahrenkarten gesondert dargestellt werden. Damit wird ein wichtiger Beitrag geleistet, um die potenziell betroffene Bevölkerung und das Katastrophenmanagement vor Überraschungen durch sehr seltene und extreme Hochwasser in Zukunft besser zu schützen.
KW - Extreme value statistics
KW - historical floods
KW - flood hazard mapping
KW - inundation simulation
KW - Ahr River
KW - Extremwertstatistik
KW - historische Hochwasser
KW - Gefahrenkarten
KW - Überflutungssimulation
KW - Ahr
Y1 - 2022
U6 - https://doi.org/10.5675/HyWa_2022.5_2
SN - 1439-1783
VL - 66
IS - 5
SP - 244
EP - 254
PB - Bundesanst. für Gewässerkunde
CY - Koblenz
ER -
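The core statistical step in the abstract above, fitting a GEV to an annual-maximum series and reading off the return period of a given peak, can be sketched with `scipy`. All numbers below are synthetic stand-ins (the Altenahr record is not reproduced here), and the sketch fits the systematic record only; the paper's treatment of historical floods requires a censored-likelihood fit that this snippet omits.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 74-year annual-maximum series standing in for a systematic
# gauge record (parameters chosen arbitrarily for illustration).
rng = np.random.default_rng(42)
amax = genextreme.rvs(c=-0.2, loc=100.0, scale=30.0, size=74, random_state=rng)

# Fit the GEV by maximum likelihood, then convert a peak discharge into a
# return period via the fitted survival function: T = 1 / P(X > peak).
shape, loc, scale = genextreme.fit(amax)
peak = 400.0                                   # hypothetical flood peak (m3/s)
p_exceed = genextreme.sf(peak, shape, loc=loc, scale=scale)
return_period = 1.0 / p_exceed                 # years, under the fitted model
```

Note that SciPy's `c` is the negative of the shape parameter ξ in the usual hydrological convention, so heavy upper tails correspond to `c < 0` here. The abstract's warning applies equally to this sketch: a single fitted GEV can assign vanishingly small probabilities to peaks far outside the systematic record.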
TY - THES
A1 - Vorogushyn, Sergiy
T1 - Analysis of flood hazard under consideration of dike breaches
T1 - Analyse der Hochwassergefährdung unter Berücksichtigung von Deichbrüchen
N2 - River reaches protected by dikes exhibit high damage potential due to strong value accumulation in the hinterland areas. While providing an efficient protection against low magnitude flood events, dikes may fail under the load of extreme water levels and long flood durations. Hazard and risk assessments for river reaches protected by dikes have not adequately considered the fluvial inundation processes up to now. Particularly, the processes of dike failures and their influence on the hinterland inundation and flood wave propagation lack comprehensive consideration. This study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. 
IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The model was developed and tested on a ca. 91 km heavily diked river reach on the German part of the Elbe River between gauges Torgau and Vockerode. The reach is characterised by low slope and fairly flat extended hinterland areas. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100, 200, 500, 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. In the disaggregated display mode, the dike hazard maps indicate the failure probabilities for each considered breach mechanism. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. Finally, scenarios of polder deployment for the extreme floods with T = 200, 500, 1000 a were simulated with IHAM. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions incorporating effects of technical flood protection measures. With its major outputs in form of novel probabilistic inundation and dike hazard maps, the IHAM system has a high practical value for decision support in flood management.
N2 - Entlang eingedeichter Flussabschnitte kann das Hinterland ein hohes Schadenspotential aufgrund der starken Akkumulation der Werte aufweisen. Obwohl Deiche einen effizienten Schutz gegen kleinere, häufiger auftretende Hochwässer bieten, können sie unter der Last hoher Wasserstände sowie langer Anstaudauer versagen. Gefährdungs- und Risikoabschätzungsmethoden für die eingedeichten Flussstrecken haben bisher die fluvialen Überflutungsprozesse nicht hinreichend berücksichtigt. Besonders die Prozesse der Deichbrüche und deren Einfluss auf Überflutung im Hinterland und Fortschreiten der Hochwasserwelle verlangen eine umfassende Betrachtung. Die vorliegende Studie setzt ihren Fokus auf die Entwicklung und Anwendung eines neuen Modellierungssystems, das eine umfassende Hochwassergefährdungsanalyse entlang eingedeichter Flussstrecken unter Berücksichtigung von Deichbrüchen ermöglicht. Das vorgeschlagene Inundation Hazard Assessment Model (IHAM) stellt ein hybrides probabilistisch-deterministisches Modell dar. Es besteht aus drei laufzeitgekoppelten Modellen: (1) einem 1D instationären hydrodynamisch-numerischen Modell für den Flussschlauch und die Vorländer zwischen den Deichen, (2) einem probabilistischen Deichbruchmodell, welches die möglichen Bruchstellen, Breschenbreiten und Breschenausflüsse berechnet, und (3) einem 2D raster-basierten Überflutungsmodell für das Hinterland, das auf dem Speicherzellenansatz und der Diffusionswellengleichung basiert. Das probabilistische Deichbruchmodell beschreibt Deichbrüche, die infolge von drei Bruchmechanismen auftreten: dem Überströmen, dem Piping im Deichuntergrund und dem Versagen der landseitigen Böschung als Folge des Sickerflusses und der Erosion im Deichkörper (Mikro-Instabilität).
Das 2D Speicherzellenmodell, angetrieben durch den Breschenausfluss als Randbedingung, berechnet ein erweitertes Spektrum der Hochwasserintensitätsindikatoren wie Überflutungstiefe, Fließgeschwindigkeit, Impuls, Überflutungsdauer und Wasseranstiegsrate. IHAM wird im Rahmen einer Monte-Carlo-Simulation ausgeführt und berücksichtigt die natürliche Variabilität der Hochwasserentstehungsprozesse, die in der Form der Hydrographen und deren Häufigkeit abgebildet wird, und die Zufälligkeit des Deichversagens, gegeben durch die Lokationen der Bruchstellen, die Zeitpunkte der Brüche und die Breschenbreiten. Das Modell wurde an einem ca. 91 km langen Flussabschnitt entwickelt und getestet. Dieser Flussabschnitt ist durchgängig eingedeicht und befindet sich an der deutschen Elbe zwischen den Pegeln Torgau und Vockerode. Die Szenarioberechnungen wurden von synthetischen Hydrographen für den Hauptstrom und Nebenfluss angetrieben, die für Hochwässer mit Wiederkehrintervallen von 100, 200, 500 und 1000 Jahren entwickelt wurden. Basierend auf den Modellierungsergebnissen wurden probabilistische Deichgefährdungskarten generiert. Sie zeigen die Versagenswahrscheinlichkeiten der diskretisierten Deichabschnitte für jede modellierte Hochwassermagnitude. Die Deichgefährdungskarten im disaggregierten Darstellungsmodus zeigen die Bruchwahrscheinlichkeiten für jeden betrachteten Bruchmechanismus. Neben den binären Überflutungsmustern, die die Wahrscheinlichkeit der Überflutung jeder Rasterzelle im Hinterland zeigen, generiert IHAM probabilistische Hochwassergefährdungskarten. Diese Karten stellen räumliche Muster der in Betracht gezogenen Hochwasserintensitätsindikatoren und entsprechende Jährlichkeiten dar. Schließlich wurden mit IHAM Szenarien mit Aktivierung von Poldern bei extremen Hochwässern mit Jährlichkeiten von 200, 500 und 1000 Jahren simuliert.
Das entwickelte IHAM Modellierungssystem stellt ein neues wissenschaftliches Werkzeug für die Untersuchung fluvialer Überflutungsdynamik in extremen Hochwassersituationen unter Berücksichtigung des Einflusses technischer Hochwasserschutzmaßnahmen dar. Das IHAM System hat eine hohe praktische Bedeutung für die Entscheidungsunterstützung im Hochwassermanagement aufgrund der neuartigen Deichbruch- und Hochwassergefährdungskarten, die das Hauptprodukt der Simulationen darstellen.
KW - Hochwasser
KW - Deichbruch
KW - Unsicherheitsanalyse
KW - Gefährdungskarten
KW - Polder
KW - Flood
KW - dike breach
KW - uncertainty analysis
KW - hazard maps
KW - polder
Y1 - 2008
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus-27646
ER -
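The Monte Carlo logic described in the thesis abstract above, repeated sampling of hydraulic loads and breach occurrences to obtain per-section failure probabilities, can be reduced to a toy example. Only the overtopping mechanism is mimicked here, with a hard crest threshold; the crest heights and the water-level distribution are invented for illustration, whereas the real IHAM couples 1D/2D hydrodynamics and three failure mechanisms.

```python
import numpy as np

# Minimal Monte Carlo sketch of probabilistic dike failure by overtopping:
# a section "fails" in a realization when the sampled peak water level
# exceeds its crest height.
rng = np.random.default_rng(3)
n_runs = 10_000
crest = np.array([5.8, 6.0, 6.2, 5.9])      # crest heights of 4 sections (m)

# one peak water level per Monte Carlo run, applied to all sections
level = rng.gumbel(loc=5.2, scale=0.4, size=(n_runs, 1))
fails = level > crest                        # (runs x sections) failure matrix
p_fail = fails.mean(axis=0)                  # per-section failure probability
```

The vector `p_fail` is the toy analogue of a disaggregated dike hazard map entry: one failure probability per discretised dike section for a given flood magnitude.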
TY - JOUR
A1 - Duethmann, Doris
A1 - Bolch, Tobias
A1 - Farinotti, Daniel
A1 - Kriegel, David
A1 - Vorogushyn, Sergiy
A1 - Merz, Bruno
A1 - Pieczonka, Tino
A1 - Jiang, Tong
A1 - Su, Buda
A1 - Güntner, Andreas
T1 - Attribution of streamflow trends in snow and glacier melt-dominated catchments of the Tarim River, Central Asia
JF - Water resources research
N2 - Observed streamflow of headwater catchments of the Tarim River (Central Asia) increased by about 30% over the period 1957-2004. This study aims at assessing to which extent these streamflow trends can be attributed to changes in air temperature or precipitation. The analysis includes a data-based approach using multiple linear regression and a simulation-based approach using a hydrological model. The hydrological model considers changes in both glacier area and surface elevation. It was calibrated using a multiobjective optimization algorithm with calibration criteria based on glacier mass balance and daily and interannual variations of discharge. The individual contributions to the overall streamflow trends from changes in glacier geometry, temperature, and precipitation were assessed using simulation experiments with a constant glacier geometry and with detrended temperature and precipitation time series. The results showed that the observed changes in streamflow were consistent with the changes in temperature and precipitation. In the Sari-Djaz catchment, increasing temperatures and related increase of glacier melt were identified as the dominant driver, while in the Kakshaal catchment, both increasing temperatures and increasing precipitation played a major role. Comparing the two approaches, an advantage of the simulation-based approach is the fact that it is based on process-based relationships implemented in the hydrological model instead of statistical links in the regression model. However, data-based approaches are less affected by model parameter and structural uncertainties and typically fast to apply. A complementary application of both approaches is recommended.
KW - trend analysis
KW - data-based
KW - simulation-based
KW - multiobjective calibration
KW - hydrological modeling
KW - glacier melt
Y1 - 2015
U6 - https://doi.org/10.1002/2014WR016716
SN - 0043-1397
SN - 1944-7973
VL - 51
IS - 6
SP - 4727
EP - 4750
PB - American Geophysical Union
CY - Washington
ER -
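The data-based branch of the attribution above, multiple linear regression of streamflow on the climate drivers, translates into a short sketch. All series below are synthetic stand-ins with an imposed temperature trend; the decomposition of the streamflow trend into driver contributions (regression coefficient times driver trend) illustrates the generic idea, not the paper's exact setup.

```python
import numpy as np

# Synthetic yearly series standing in for 1957-2004 records: temperature
# carries a trend, precipitation only interannual variability (assumed).
rng = np.random.default_rng(7)
years = np.arange(1957, 2005).astype(float)
temp = 0.03 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)
precip = rng.normal(300.0, 30.0, years.size)
flow = 25.0 * temp + 0.5 * precip + rng.normal(0.0, 10.0, years.size)

# Regress flow on the drivers, then attribute the flow trend to each driver
# as (regression coefficient) x (linear trend of that driver).
X = np.column_stack([temp, precip, np.ones_like(years)])
beta, *_ = np.linalg.lstsq(X, flow, rcond=None)
contrib_temp = beta[0] * np.polyfit(years, temp, 1)[0]
contrib_precip = beta[1] * np.polyfit(years, precip, 1)[0]
```

In this synthetic case the temperature contribution dominates, mirroring the Sari-Djaz result; the simulation-based counterpart in the paper replaces the regression with detrended-forcing experiments through a hydrological model.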
TY - GEN
A1 - Murawski, Aline
A1 - Bürger, Gerd
A1 - Vorogushyn, Sergiy
A1 - Merz, Bruno
T1 - Can local climate variability be explained by weather patterns?
BT - a multi-station evaluation for the Rhine basin
T2 - Postprints der Universität Potsdam : Mathematisch-Naturwissenschaftliche Reihe
N2 - To understand past flood changes in the Rhine catchment and in particular the role of anthropogenic climate change in extreme flows, an attribution study relying on a proper GCM (general circulation model) downscaling is needed. A downscaling based on conditioning a stochastic weather generator on weather patterns is a promising approach. This approach assumes a strong link between weather patterns and local climate, and sufficient GCM skill in reproducing weather pattern climatology. These presuppositions are unprecedentedly evaluated here using 111 years of daily climate data from 490 stations in the Rhine basin and comprehensively testing the number of classification parameters and GCM weather pattern characteristics. A classification based on a combination of mean sea level pressure, temperature, and humidity from the ERA20C reanalysis of atmospheric fields over central Europe with 40 weather types was found to be the most appropriate for stratifying six local climate variables. The corresponding skill is quite diverse though, ranging from good for radiation to poor for precipitation. Especially for the latter it was apparent that pressure fields alone cannot sufficiently stratify local variability. To test the skill of the latest generation of GCMs from the CMIP5 ensemble in reproducing the frequency, seasonality, and persistence of the derived weather patterns, output from 15 GCMs is evaluated. Most GCMs are able to capture these characteristics well, but some models showed consistent deviations in all three evaluation criteria and should be excluded from further attribution analysis.
T3 - Zweitveröffentlichungen der Universität Potsdam : Mathematisch-Naturwissenschaftliche Reihe - 525
KW - atmospheric circulation patterns
KW - stochastic rainfall model
KW - within-type variability
KW - river Rhine
KW - precipitation
KW - temperature
KW - trends
KW - classification
KW - Europe
KW - scenarios
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-410155
SN - 1866-8372
IS - 525
ER -
TY - JOUR
A1 - Murawski, Aline
A1 - Bürger, Gerd
A1 - Vorogushyn, Sergiy
A1 - Merz, Bruno
T1 - Can local climate variability be explained by weather patterns? A multi-station evaluation for the Rhine basin
JF - Hydrology and earth system sciences : HESS
N2 - To understand past flood changes in the Rhine catchment and in particular the role of anthropogenic climate change in extreme flows, an attribution study relying on a proper GCM (general circulation model) downscaling is needed. A downscaling based on conditioning a stochastic weather generator on weather patterns is a promising approach. This approach assumes a strong link between weather patterns and local climate, and sufficient GCM skill in reproducing weather pattern climatology. These presuppositions are unprecedentedly evaluated here using 111 years of daily climate data from 490 stations in the Rhine basin and comprehensively testing the number of classification parameters and GCM weather pattern characteristics. A classification based on a combination of mean sea level pressure, temperature, and humidity from the ERA20C reanalysis of atmospheric fields over central Europe with 40 weather types was found to be the most appropriate for stratifying six local climate variables. The corresponding skill is quite diverse though, ranging from good for radiation to poor for precipitation. Especially for the latter it was apparent that pressure fields alone cannot sufficiently stratify local variability. To test the skill of the latest generation of GCMs from the CMIP5 ensemble in reproducing the frequency, seasonality, and persistence of the derived weather patterns, output from 15 GCMs is evaluated. Most GCMs are able to capture these characteristics well, but some models showed consistent deviations in all three evaluation criteria and should be excluded from further attribution analysis.
Y1 - 2016
U6 - https://doi.org/10.5194/hess-20-4283-2016
SN - 1027-5606
SN - 1607-7938
VL - 20
SP - 4283
EP - 4306
PB - Copernicus
CY - Göttingen
ER -
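The notion of stratifying local climate by a fixed catalogue of weather types can be illustrated with a generic clustering of daily pressure fields. The paper's actual classification (combining MSLP, temperature and humidity into 40 types) is not reproduced; plain k-means on random surrogate fields merely shows the mechanics of assigning each day a type and deriving type frequencies.

```python
import numpy as np

# Toy weather-pattern classification: cluster daily "pressure fields"
# (random surrogates here) into a small number of weather types.
rng = np.random.default_rng(11)
n_days, n_gridpoints, n_types = 365, 50, 8   # far fewer types than the paper's 40
fields = rng.normal(size=(n_days, n_gridpoints))

# plain k-means (Lloyd's algorithm), no external dependencies
centroids = fields[rng.choice(n_days, n_types, replace=False)]
for _ in range(20):
    d = ((fields[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    labels = d.argmin(axis=1)                # weather type assigned to each day
    for k in range(n_types):
        if np.any(labels == k):
            centroids[k] = fields[labels == k].mean(axis=0)

freq = np.bincount(labels, minlength=n_types) / n_days   # type frequencies
```

Evaluating a GCM against such a classification then amounts to comparing its `freq` (and the seasonality and persistence of `labels`) with those derived from reanalysis, which is the test the abstract applies to the CMIP5 ensemble.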
TY - JOUR
A1 - Tarasova, Larisa
A1 - Merz, Ralf
A1 - Kiss, Andrea
A1 - Basso, Stefano
A1 - Blöschl, Günter
A1 - Merz, Bruno
A1 - Viglione, Alberto
A1 - Plötner, Stefan
A1 - Guse, Björn
A1 - Schumann, Andreas
A1 - Fischer, Svenja
A1 - Ahrens, Bodo
A1 - Anwar, Faizan
A1 - Bárdossy, András
A1 - Bühler, Philipp
A1 - Haberlandt, Uwe
A1 - Kreibich, Heidi
A1 - Krug, Amelie
A1 - Lun, David
A1 - Müller-Thomy, Hannes
A1 - Pidoto, Ross
A1 - Primo, Cristina
A1 - Seidel, Jochen
A1 - Vorogushyn, Sergiy
A1 - Wietzke, Luzie
T1 - Causative classification of river flood events
JF - Wiley Interdisciplinary Reviews : Water
N2 - A wide variety of processes controls the time of occurrence, duration, extent, and severity of river floods. Classifying flood events by their causative processes may assist in enhancing the accuracy of local and regional flood frequency estimates and support the detection and interpretation of any changes in flood occurrence and magnitudes. This paper provides a critical review of existing causative classifications of instrumental and preinstrumental series of flood events, discusses their validity and applications, and identifies opportunities for moving toward more comprehensive approaches. So far no unified definition of causative mechanisms of flood events exists. Existing frameworks for classification of instrumental and preinstrumental series of flood events adopt different perspectives: hydroclimatic (large-scale circulation patterns and atmospheric state at the time of the event), hydrological (catchment scale precipitation patterns and antecedent catchment state), and hydrograph-based (indirectly considering generating mechanisms through their effects on hydrograph characteristics). All of these approaches intend to capture the flood generating mechanisms and are useful for characterizing the flood processes at various spatial and temporal scales. However, uncertainty analyses with respect to indicators, classification methods, and data to assess the robustness of the classification are rarely performed which limits the transferability across different geographic regions. It is argued that more rigorous testing is needed. There are opportunities for extending classification methods to include indicators of space-time dynamics of rainfall, antecedent wetness, and routing effects, which will make the classification schemes even more useful for understanding and estimating floods. This article is categorized under: Science of Water > Water Extremes; Science of Water > Hydrological Processes; Science of Water > Methods.
KW - flood genesis
KW - flood mechanisms
KW - flood typology
KW - historical floods
KW - hydroclimatology of floods
Y1 - 2019
U6 - https://doi.org/10.1002/wat2.1353
SN - 2049-1948
VL - 6
IS - 4
PB - Wiley
CY - Hoboken
ER -
TY - JOUR
A1 - Merz, Bruno
A1 - Vorogushyn, Sergiy
A1 - Lall, Upmanu
A1 - Viglione, Alberto
A1 - Blöschl, Günter
T1 - Charting unknown waters - On the role of surprise in flood risk assessment and management
JF - Water resources research
N2 - Unexpected incidents, failures, and disasters are abundant in the history of flooding events. In this paper, we introduce the metaphors of terra incognita and terra maligna to illustrate unknown and wicked flood situations, respectively. We argue that surprise is a neglected element in flood risk assessment and management. Two sources of surprise are identified: (1) the complexity of flood risk systems, represented by nonlinearities, interdependencies, and nonstationarities and (2) cognitive biases in human perception and decision making. Flood risk assessment and management are particularly prone to cognitive biases due to the rarity and uniqueness of extremes, and the nature of human risk perception. We reflect on possible approaches to better understanding and reducing the potential for surprise and its adverse consequences which may be supported by conceptually charting maps that separate terra incognita from terra cognita, and terra maligna from terra benigna. We conclude that flood risk assessment and management should account for the potential for surprise and devastating consequences which will require a shift in thinking.
Y1 - 2015
U6 - https://doi.org/10.1002/2015WR017464
SN - 0043-1397
SN - 1944-7973
VL - 51
IS - 8
SP - 6399
EP - 6416
PB - American Geophysical Union
CY - Washington
ER -
TY - JOUR
A1 - Wietzke, Luzie M.
A1 - Merz, Bruno
A1 - Gerlitz, Lars
A1 - Kreibich, Heidi
A1 - Guse, Björn
A1 - Castellarin, Attilio
A1 - Vorogushyn, Sergiy
T1 - Comparative analysis of scalar upper tail indicators
JF - Hydrological sciences journal = Journal des sciences hydrologiques
N2 - Different upper tail indicators exist to characterize heavy tail phenomena, but no comparative study has been carried out so far. We evaluate the shape parameter (GEV), obesity index, Gini index and upper tail ratio (UTR) against a novel benchmark of tail heaviness - the surprise factor. Sensitivity analyses to sample size and changes in scale-to-location ratio are carried out in bootstrap experiments. The UTR replicates the surprise factor best but is most uncertain and only comparable between records of similar length. For samples with symmetric Lorenz curves, shape parameter, obesity and Gini indices provide consistent indications. For asymmetric Lorenz curves, however, the first two tend to overestimate, whereas Gini index tends to underestimate tail heaviness. We suggest the use of a combination of shape parameter, obesity and Gini index to characterize tail heaviness. These indicators should be supported with calculation of the Lorenz asymmetry coefficients and interpreted with caution.
KW - upper tail behaviour
KW - heavy-tailed distributions
KW - extremes
KW - diagnostics
KW - surprise
Y1 - 2020
U6 - https://doi.org/10.1080/02626667.2020.1769104
SN - 0262-6667
SN - 2150-3435
VL - 65
IS - 10
SP - 1625
EP - 1639
PB - Routledge, Taylor & Francis Group
CY - Abingdon
ER -
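Two of the scalar indicators compared in the abstract above, the Gini index and the obesity index, are easy to state in code. These are textbook definitions in a short Monte Carlo form, not the paper's implementation; the sample sizes and resampling count are arbitrary choices.

```python
import numpy as np

def gini_index(x):
    """Gini index of a positive sample: 0 for a perfectly even sample,
    values approaching 1 when a few large values dominate."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1.0) / n

def obesity_index(x, n_draws=20_000, rng=None):
    """Monte Carlo estimate of the obesity index:
    P(X(4) + X(1) > X(2) + X(3)) for four iid draws sorted ascending.
    Heavier tails push the index towards 1."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(rng)
    s = np.sort(rng.choice(x, size=(n_draws, 4), replace=True), axis=1)
    return np.mean(s[:, 3] + s[:, 0] > s[:, 1] + s[:, 2])
```

As the abstract cautions, such scalar summaries should be read together (and, for asymmetric Lorenz curves, alongside the Lorenz asymmetry coefficient) rather than in isolation.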