Design flood estimation is an essential part of flood risk assessment. Commonly applied are flood frequency analyses and design storm approaches, while the derived flood frequency approach using continuous simulation has recently been gaining attention. In this study, a continuous hydrological modelling approach on an hourly time scale, driven by a multi-site weather generator in combination with a k-nearest neighbour resampling procedure based on the method of fragments, is applied. The derived 100-year flood estimates in 16 catchments in Vorarlberg (Austria) are compared to (a) the flood frequency analysis based on observed discharges, and (b) a design storm approach. Besides the peak flows, the corresponding runoff volumes are analysed. The spatial dependence structure of the synthetically generated flood peaks is validated against observations. It can be demonstrated that the continuous modelling approach achieves plausible results and reveals a large variability in runoff volume across the flood events.
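A minimal sketch of the continuous-simulation idea behind this abstract: once a long synthetic series of annual maximum discharges is available, the T-year flood can be read directly off their empirical distribution. All distributions and parameter values below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for continuous-simulation output: 1,000 synthetic annual maxima
# drawn from a Gumbel distribution (purely illustrative, not the study's data).
annual_maxima = rng.gumbel(loc=100.0, scale=25.0, size=1000)

def empirical_t_year_flood(maxima, T):
    """Empirical T-year flood: the (1 - 1/T) quantile of annual maxima."""
    return float(np.quantile(maxima, 1.0 - 1.0 / T))

q100 = empirical_t_year_flood(annual_maxima, 100)
q10 = empirical_t_year_flood(annual_maxima, 10)
```

With a sufficiently long synthetic record, no distributional extrapolation is needed for the 100-year level, which is the main appeal of the derived flood frequency approach.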
Residential assets, comprising buildings and household contents, are a major source of direct flood losses. Existing damage models are mostly deterministic and limited to particular countries or flood types. Here, we compile building-level losses from Germany, Italy and the Netherlands covering a wide range of fluvial and pluvial flood events. Utilizing a Bayesian network (BN) for continuous variables, we find that relative losses (i.e. loss relative to exposure) to building structure and contents can be estimated with five variables: water depth, flow velocity, event return period, building usable floor space area and regional disposable income per capita. The model's ability to predict flood losses is validated for the 11 flood events contained in the sample. Predictions for the German and Italian fluvial floods were better than for pluvial floods or the 1993 Meuse river flood. Further, a case study of a 2010 coastal flood in France is used to test the BN model's performance for a type of flood not included in the survey dataset. Overall, the BN model achieved better results than any of 10 alternative damage models in reproducing average losses for the 2010 flood. An additional case study of a 2013 fluvial flood also showed good performance of the model. The study shows that data from many flood events can be combined to identify the most important factors driving flood losses across regions and time, and that the resulting damage models can be applied in an open-data framework.
Adaptation to flood risk
(2017)
As flood impacts are increasing in large parts of the world, understanding the primary drivers of changes in risk is essential for effective adaptation. To gain more knowledge on the basis of empirical case studies, we analyze eight paired floods, that is, consecutive flood events that occurred in the same region, with the second flood causing significantly lower damage. These success stories of risk reduction were selected across different socioeconomic and hydro-climatic contexts. The potential of societies to adapt is uncovered by describing triggered societal changes, as well as formal measures and spontaneous processes that reduced flood risk. This novel approach has the potential to build the basis for an international data collection and analysis effort to better understand and attribute changes in risk due to hydrological extremes in the framework of the IAHS's Panta Rhei initiative. Across all case studies, we find that lower damage caused by the second event was mainly due to significant reductions in vulnerability, for example, via raised risk awareness, preparedness, and improvements of organizational emergency management. Thus, vulnerability reduction plays an essential role for successful adaptation. Our work shows that there is a high potential to adapt, but there remains the challenge to stimulate measures that reduce vulnerability and risk in periods in which extreme events do not occur.
This paper introduces a novel measure to assess similarity between event hydrographs. It is based on cross recurrence plots (CRP) and recurrence quantification analysis (RQA), which have recently gained attention in a range of disciplines when dealing with complex systems. The method attempts to quantify the event runoff dynamics and is based on the time delay embedded phase space representation of discharge hydrographs. A phase space trajectory is reconstructed from the event hydrograph, and pairs of hydrographs are compared to each other based on the distance of their phase space trajectories. Time delay embedding allows considering the multidimensional relationships between different points in time within the event. Hence, the temporal succession of discharge values is taken into account, for example the impact of the initial conditions on the runoff event. We provide an introduction to cross recurrence plots and discuss their parameterization. An application example based on flood time series demonstrates how the method can be used to measure the similarity or dissimilarity of events, and how it can be used to detect events with rare runoff dynamics. It is argued that this method provides a more comprehensive approach to quantifying hydrograph similarity than conventional hydrological signatures.
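The core machinery described above, time-delay embedding plus a thresholded cross recurrence plot, can be sketched as follows. The embedding dimension, delay, threshold and toy hydrographs are illustrative choices, not the paper's parameterization.

```python
import numpy as np

def embed(x, dim=3, delay=2):
    """Time-delay embedding: map a series into dim-dimensional phase space."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

def cross_recurrence(x, y, dim=3, delay=2, eps=0.2):
    """Binary cross recurrence plot: CRP[i, j] = 1 where the phase-space
    states of the two hydrographs are closer than the threshold eps
    (eps is in the same units as the discharge values)."""
    X = embed(np.asarray(x, dtype=float), dim, delay)
    Y = embed(np.asarray(y, dtype=float), dim, delay)
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (d < eps).astype(int)

# Two similar synthetic event hydrographs (Gaussian-shaped peaks).
t = np.linspace(0.0, 1.0, 50)
h1 = np.exp(-((t - 0.40) / 0.1) ** 2)
h2 = np.exp(-((t - 0.45) / 0.1) ** 2)
crp = cross_recurrence(h1, h2)
similarity = crp.mean()  # fraction of recurrent states, a simple RQA-style score
```

The recurrence rate used here is only one of several RQA measures; determinism and diagonal-line statistics can be derived from the same binary matrix.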
Flood risk analyses often assume the same flood intensity along the river reach under study, i.e. discharges are calculated for a number of return periods T, e.g. 10 or 100 years, at several streamflow gauges. T-year discharges are regionalised and then transferred into T-year water levels, inundated areas and impacts. This approach assumes that (1) flood scenarios are homogeneous throughout a river basin, and (2) the T-year damage corresponds to the T-year discharge. Using a reach at the river Rhine, this homogeneous approach is compared with an approach that is based on four flood types with different spatial discharge patterns. For each type, a regression model was created and used in a Monte-Carlo framework to derive heterogeneous scenarios. Per scenario, four cumulative impact indicators were calculated: (1) the total inundated area, (2) the exposed settlement and industrial areas, (3) the exposed population and (4) the potential building loss. Their frequency curves were used to establish a ranking of eight past flood events according to their severity. The investigation revealed that the two assumptions of the homogeneous approach do not hold. It tends to overestimate event probabilities in large areas. Therefore, the generation of heterogeneous scenarios should receive more attention.
Observed streamflow of headwater catchments of the Tarim River (Central Asia) increased by about 30% over the period 1957-2004. This study aims at assessing to what extent these streamflow trends can be attributed to changes in air temperature or precipitation. The analysis includes a data-based approach using multiple linear regression and a simulation-based approach using a hydrological model. The hydrological model considers changes in both glacier area and surface elevation. It was calibrated using a multiobjective optimization algorithm with calibration criteria based on glacier mass balance and daily and interannual variations of discharge. The individual contributions to the overall streamflow trends from changes in glacier geometry, temperature, and precipitation were assessed using simulation experiments with a constant glacier geometry and with detrended temperature and precipitation time series. The results showed that the observed changes in streamflow were consistent with the changes in temperature and precipitation. In the Sari-Djaz catchment, increasing temperatures and related increase of glacier melt were identified as the dominant driver, while in the Kakshaal catchment, both increasing temperatures and increasing precipitation played a major role. Comparing the two approaches, an advantage of the simulation-based approach is the fact that it is based on process-based relationships implemented in the hydrological model instead of statistical links in the regression model. However, data-based approaches are less affected by model parameter and structural uncertainties and typically fast to apply. A complementary application of both approaches is recommended.
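The detrended-driver experiment logic, re-running a model with one driver's trend removed and differencing the resulting streamflow trends, can be illustrated with a simple regression stand-in for the hydrological model. All series and coefficients below are synthetic and only mimic the structure of the attribution exercise.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1957, 2005)
n = len(years)

# Synthetic drivers: a warming trend in temperature, no trend in precipitation.
temp = 0.03 * (years - years[0]) + rng.normal(0.0, 0.3, n)
precip = rng.normal(300.0, 30.0, n)
flow = 5.0 * temp + 0.1 * precip + rng.normal(0.0, 3.0, n)  # toy response

def linear_trend(y):
    """Slope of a least-squares line fit against the time index."""
    return np.polyfit(np.arange(len(y)), y, 1)[0]

def detrend(x):
    """Remove the linear trend while preserving the mean."""
    t = np.arange(len(x))
    slope = np.polyfit(t, x, 1)[0]
    return x - slope * (t - t.mean())

# Regression stand-in for the model, then a detrended-temperature experiment:
# the trend difference is the share attributable to temperature change.
A = np.column_stack([temp, precip, np.ones(n)])
coef = np.linalg.lstsq(A, flow, rcond=None)[0]
flow_detrended_T = np.column_stack([detrend(temp), precip, np.ones(n)]) @ coef
temp_contribution = linear_trend(flow) - linear_trend(flow_detrended_T)
```

In the study this differencing is done with the calibrated glacio-hydrological model rather than a regression, so glacier-geometry feedbacks are represented as well.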
To understand past flood changes in the Rhine catchment and in particular the role of anthropogenic climate change in extreme flows, an attribution study relying on a proper GCM (general circulation model) downscaling is needed. A downscaling based on conditioning a stochastic weather generator on weather patterns is a promising approach. This approach assumes a strong link between weather patterns and local climate, and sufficient GCM skill in reproducing weather pattern climatology. These presuppositions are evaluated here for the first time, using 111 years of daily climate data from 490 stations in the Rhine basin and comprehensively testing the number of classification parameters and GCM weather pattern characteristics. A classification based on a combination of mean sea level pressure, temperature, and humidity from the ERA20C reanalysis of atmospheric fields over central Europe with 40 weather types was found to be the most appropriate for stratifying six local climate variables. The corresponding skill is quite diverse though, ranging from good for radiation to poor for precipitation. Especially for the latter it was apparent that pressure fields alone cannot sufficiently stratify local variability. To test the skill of the latest generation of GCMs from the CMIP5 ensemble in reproducing the frequency, seasonality, and persistence of the derived weather patterns, output from 15 GCMs is evaluated. Most GCMs are able to capture these characteristics well, but some models showed consistent deviations in all three evaluation criteria and should be excluded from further attribution analysis.
Unexpected incidents, failures, and disasters are abundant in the history of flooding events. In this paper, we introduce the metaphors of terra incognita and terra maligna to illustrate unknown and wicked flood situations, respectively. We argue that surprise is a neglected element in flood risk assessment and management. Two sources of surprise are identified: (1) the complexity of flood risk systems, represented by nonlinearities, interdependencies, and nonstationarities and (2) cognitive biases in human perception and decision making. Flood risk assessment and management are particularly prone to cognitive biases due to the rarity and uniqueness of extremes, and the nature of human risk perception. We reflect on possible approaches to better understanding and reducing the potential for surprise and its adverse consequences which may be supported by conceptually charting maps that separate terra incognita from terra cognita, and terra maligna from terra benigna. We conclude that flood risk assessment and management should account for the potential for surprise and devastating consequences which will require a shift in thinking.
The link between streamflow extremes and climatology has been widely studied in recent decades. However, a study investigating the effect of large-scale circulation variations on the distribution of seasonal discharge extremes at the European level is missing. Here we fit a climate-informed generalized extreme value (GEV) distribution to about 600 streamflow records in Europe for each of the standard seasons, i.e., to winter, spring, summer and autumn maxima, and compare it with the classical GEV distribution with parameters invariant in time. The study adopts a Bayesian framework and covers the period 1950 to 2016. Five indices with proven influence on the European climate are examined independently as covariates, namely the North Atlantic Oscillation (NAO), the east Atlantic pattern (EA), the east Atlantic-western Russian pattern (EA/WR), the Scandinavia pattern (SCA) and the polar-Eurasian pattern (POL). It is found that for a high percentage of stations the climate-informed model is preferred to the classical model. Particularly for NAO during winter, a strong influence on streamflow extremes is detected for large parts of Europe (preferred to the classical GEV distribution for 46% of the stations). Climate-informed fits are characterized by spatial coherence and form patterns that resemble relations between the climate indices and seasonal precipitation, suggesting a prominent role of the considered circulation modes for flood generation. For certain regions, such as northwestern Scandinavia and the British Isles, yearly variations of the mean seasonal climate indices result in considerably different extreme value distributions and thus in highly different flood estimates for individual years that can also persist for longer time periods.
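A climate-informed GEV of the kind described above, with the location parameter linear in a circulation index, can be fit by maximum likelihood. This frequentist sketch with synthetic data only illustrates the covariate structure; the study itself works in a Bayesian framework, and all values here are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic winter maxima whose GEV location shifts with a NAO-like index
# (true covariate effect: +8 units per index unit; purely illustrative).
nao = rng.normal(size=300)
maxima = genextreme.rvs(c=-0.1, loc=100.0 + 8.0 * nao, scale=15.0, random_state=1)

def neg_loglik(params, z, covariate):
    """Negative log-likelihood of a GEV with location mu0 + mu1 * covariate.

    Note scipy's sign convention: its shape c equals minus the usual xi."""
    mu0, mu1, log_sigma, c = params
    return -np.sum(genextreme.logpdf(z, c=c, loc=mu0 + mu1 * covariate,
                                     scale=np.exp(log_sigma)))

res = minimize(neg_loglik,
               x0=[np.mean(maxima), 0.0, np.log(np.std(maxima)), -0.1],
               args=(maxima, nao), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
mu0, mu1, log_sigma, c = res.x
```

Comparing this fit against the stationary model (mu1 fixed at 0) via a likelihood ratio or an information criterion then plays the role of the model preference test reported in the abstract.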
Different upper tail indicators exist to characterize heavy tail phenomena, but no comparative study has been carried out so far. We evaluate the shape parameter (GEV), obesity index, Gini index and upper tail ratio (UTR) against a novel benchmark of tail heaviness - the surprise factor. Sensitivity analyses to sample size and changes in scale-to-location ratio are carried out in bootstrap experiments. The UTR replicates the surprise factor best but is most uncertain and only comparable between records of similar length. For samples with symmetric Lorenz curves, shape parameter, obesity and Gini indices provide consistent indications. For asymmetric Lorenz curves, however, the first two tend to overestimate, whereas Gini index tends to underestimate tail heaviness. We suggest the use of a combination of shape parameter, obesity and Gini index to characterize tail heaviness. These indicators should be supported with calculation of the Lorenz asymmetry coefficients and interpreted with caution.
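Two of the compared indicators, the Gini index and the obesity index, are straightforward to compute from a sample. The functions below follow their common sample estimators, applied to illustrative light- and heavy-tailed synthetic data (not the records analysed in the study).

```python
import numpy as np

def gini_index(x):
    """Sample Gini index via the mean-absolute-difference formulation."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    return (2 * np.arange(1, n + 1) - n - 1).dot(x) / (n * x.sum())

def obesity_index(x, n_draws=20000, seed=0):
    """Monte Carlo estimate of the obesity index: the probability that,
    among four random draws ordered x1 <= x2 <= x3 <= x4,
    x4 + x1 > x2 + x3 (0.5 for uniform, 0.75 for exponential data)."""
    rng = np.random.default_rng(seed)
    s = np.sort(rng.choice(x, size=(n_draws, 4), replace=True), axis=1)
    return float(np.mean(s[:, 3] + s[:, 0] > s[:, 1] + s[:, 2]))

rng = np.random.default_rng(1)
light = rng.exponential(size=5000)        # lighter upper tail
heavy = rng.pareto(1.0, size=5000) + 1.0  # Pareto(alpha=1): heavy upper tail
ob_light, ob_heavy = obesity_index(light), obesity_index(heavy)
g_light, g_heavy = gini_index(light), gini_index(heavy)
```

For the exponential sample the obesity index should land near its theoretical value of 0.75, while the Pareto sample pushes both indicators up, consistent with the abstract's observation that the two agree for symmetric Lorenz curves.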
Stochastic modeling of precipitation for estimation of hydrological extremes is an important element of flood risk assessment and management. The spatially consistent estimation of rainfall fields and their temporal variability remains challenging and is addressed by various stochastic weather generators.
In this study, two types of weather generators are evaluated against observed data and benchmarked regarding their ability to simulate spatio-temporal precipitation fields in the Rhine catchment. A multi-site station-based weather generator uses an auto-regressive model and estimates the spatial correlation structure between stations. Another weather generator is raster-based and uses the nearest-neighbor resampling technique for reshuffling daily patterns while preserving the correlation structure between the observations.
Both weather generators perform well and are comparable at the point (station) scale with regards to daily mean and 99.9th percentile precipitation as well as concerning wet/dry frequencies and transition probabilities. The areal extreme precipitation at the sub-basin scale is however overestimated in the station-based weather generator due to an overestimation of the correlation structure between individual stations. The auto-regressive model tends to generate larger rainfall fields in space for extreme precipitation than observed, particularly in summer. The weather generator based on nearest-neighbor resampling reproduces the observed daily and multi-day (5-, 10- and 20-day) extreme events at a similar magnitude. Improvements in performance regarding wet frequencies and transition probabilities are recommended for both models.
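Wet/dry frequencies and transition probabilities, one of the evaluation criteria above, reduce to simple conditional frequencies of a thresholded daily series. The threshold and the toy occurrence process below are illustrative assumptions.

```python
import numpy as np

def wet_dry_stats(precip, wet_threshold=0.1):
    """Wet-day frequency and transition probabilities P(wet|wet), P(wet|dry)
    from a daily precipitation series (threshold in mm/day, illustrative)."""
    wet = np.asarray(precip) >= wet_threshold
    prev, curr = wet[:-1], wet[1:]
    p_ww = curr[prev].mean()     # wet today given wet yesterday
    p_wd = curr[~prev].mean()    # wet today given dry yesterday
    return wet.mean(), p_ww, p_wd

# Toy two-state occurrence process: persistence makes P(wet|wet) > P(wet|dry).
rng = np.random.default_rng(3)
state = [True]
for _ in range(9999):
    state.append(rng.random() < (0.8 if state[-1] else 0.2))
precip = np.where(state, rng.exponential(5.0, size=10000), 0.0)
f_wet, p_ww, p_wd = wet_dry_stats(precip)
```

Comparing these three numbers between observed and simulated series is exactly the kind of point-scale benchmark the abstract refers to.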
Water stable isotope signatures can provide valuable insights into catchment-internal runoff processes. However, the ability of water isotope data to constrain the internal apportionment of runoff components in hydrological models for glacierized basins is not well understood. This study developed an approach to simultaneously model the water stable isotopic compositions and runoff processes in a glacierized basin in Central Asia. The fractionation and mixing processes of water stable isotopes in and from the various water sources were integrated into a glacio-hydrological model. The model parameters were calibrated on discharge, snow cover and glacier mass balance data, and additionally on the isotopic composition of streamflow. We investigated the value of water isotopic compositions for the calibration of model parameters, in comparison to calibration methods without using such measurements. Results indicate that: (1) The proposed isotope-hydrological integrated modeling approach was able to reproduce the isotopic composition of streamflow, and improved the model performance in the evaluation period; (2) Involving water isotopic composition for model calibration reduced the model parameter uncertainty, and helped to reduce the uncertainty in the quantification of runoff components; (3) The isotope-hydrological integrated modeling approach quantified the contributions of runoff components comparably to a three-component tracer-based end-member mixing analysis method for summer peak flows, and required less water tracer data. Our findings demonstrate the value of water isotopic compositions to improve the quantification of runoff components using hydrological models in glacierized basins.
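For reference, the tracer-based end-member mixing idea in its simplest two-component form is a one-line mass balance; the isotope values below are hypothetical.

```python
def two_component_mixing(delta_stream, delta_a, delta_b):
    """Fraction f of source A in streamflow from the isotope mass balance
    delta_stream = f * delta_a + (1 - f) * delta_b."""
    return (delta_stream - delta_b) / (delta_a - delta_b)

# Hypothetical delta-18O signatures in per mille (illustrative values only).
f_glacier = two_component_mixing(delta_stream=-12.0, delta_a=-15.0, delta_b=-10.0)
```

Here f_glacier comes out at 0.4, i.e. 40% of streamflow from source A. A three-component separation, as used in the paper's comparison, requires a second independent tracer to close the system of mixing equations.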
Sophisticated methods have been developed and have become standard in analysing floods and assessing flood risk. However, increasing critique of the current standards and scientific practice can be found both in the flood hydrology community and in the risk community, who argue that the considerable amount of information already available on natural disasters has not been adequately deployed and brought to effective use. We describe this phenomenon as a failure to synthesize knowledge, resulting from barriers and ignorance in the awareness, use and management of the entire spectrum of relevant content, that is, data, information and knowledge. In this paper we argue that the scientific community in flood risk research ignores event-specific analyses and documentation as another source of data. We present results from a systematic search that includes an intensive study of the sources and ways of information dissemination of flood-relevant publications. We obtain 186 documents that contain information on the sources, pathways, receptors and/or consequences for any of the 40 strongest trans-basin floods in Germany in the period 1952-2002. This study therefore provides the most comprehensive metadata collection of flood documentation for the considered geographical space and period. A total of 87.5% of all events have been documented, and especially the most severe floods have received extensive coverage. Only 30% of the material was produced in the scientific/academic environment, and the majority of all documents (about 80%) can be considered grey literature (i.e. literature not controlled by commercial publishers). Ignoring grey sources in flood research therefore also means ignoring the largest part of the knowledge available on single flood events (in Germany). Further, the results of this study underpin the rapid changes in information dissemination of flood event literature over the last decade.
We discuss the options and obstacles of incorporating this data into the knowledge-building process in light of the current technological developments and international, interdisciplinary debates for data curation.
For attributing hydrological changes to anthropogenic climate change, catchment models are driven by climate model output. A widespread approach to bridge the spatial gap between global climate and hydrological catchment models is to use a weather generator conditioned on weather patterns (WPs). This approach assumes that changes in local climate are characterized by between-type changes of patterns. In this study we test this assumption by analyzing a previously developed WP classification for the Rhine basin, which is based on dynamic and thermodynamic variables. We quantify changes in pattern characteristics and associated climatic properties. The amount of between- and within-type changes is investigated by comparing observed trends to trends resulting solely from WP occurrence. To overcome uncertainties in trend detection resulting from the selected time period, all possible periods in 1901-2010 with a minimum length of 31 years are analyzed. Increasing frequency is found for some patterns associated with high precipitation, although the trend sign highly depends on the considered period. Trends and interannual variations of WP frequencies are related to the long-term variability of large-scale circulation modes. Long-term WP internal warming is evident for summer patterns and enhanced warming for spring/autumn patterns since the 1970s. Observed trends in temperature and partly in precipitation are mainly associated with frequency changes of specific WPs, but some amount of within-type changes remains. The classification can be used for downscaling of past changes considering this limitation, but the inclusion of thermodynamic variables into the classification impedes the downscaling of future climate projections.
Compound flooding, such as the co-occurrence of fluvial floods and extreme coastal water levels (CWL), may lead to significant impacts in densely-populated Low Elevation Coastal Zones. They may overstrain disaster management owing to the co-occurrence of inundation from rivers and the sea. Recent studies are limited by analyzing joint dependence between river discharge and either CWL or storm surges, and little is known about return levels of compound flooding, accounting for the covariance between drivers. Here, we assess the compound flood severity and identify hotspots for northwestern Europe during 1970–2014, using a newly developed Compound Hazard Ratio (CHR) that compares the severity of compound flooding associated with extreme CWL with the unconditional T-year fluvial peak discharge. We show that extreme CWL and stronger storms greatly amplify fluvial flood hazards. Our results, based on frequency analyses of observational records during 2013/2014’s winter storm Xaver, reveal that the river discharge of the 50-year compound flood is up to 70% larger, conditioned on the occurrence of extreme CWL, than that of the at-site peak discharge. For this event, nearly half of the stream gauges show increased flood hazards, demonstrating the importance of including the compounding effect of extreme CWL in river flood risk management.
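The ratio-of-return-levels idea behind an indicator like the CHR can be sketched empirically. This simplified stand-in, conditional versus unconditional empirical quantiles of synthetic, positively dependent drivers, is not the paper's exact estimator.

```python
import numpy as np

def t_year_level(sample, T):
    """Empirical T-year level: the (1 - 1/T) quantile of the sample."""
    return float(np.quantile(sample, 1.0 - 1.0 / T))

def compound_hazard_ratio(discharge, coastal_level, T=50, cwl_quantile=0.95):
    """T-year discharge conditional on extreme coastal water levels,
    divided by the unconditional T-year discharge (values > 1 indicate
    amplification of fluvial hazard during compound events)."""
    extreme = coastal_level >= np.quantile(coastal_level, cwl_quantile)
    return t_year_level(discharge[extreme], T) / t_year_level(discharge, T)

# Synthetic, positively dependent drivers: a shared 'storminess' factor
# couples river discharge and coastal water level (all values invented).
rng = np.random.default_rng(7)
storminess = rng.normal(size=5000)
discharge = np.exp(0.5 * storminess + rng.normal(scale=0.5, size=5000))
cwl = storminess + rng.normal(scale=0.5, size=5000)
chr_value = compound_hazard_ratio(discharge, cwl)
```

Because the shared storm driver shifts the conditional discharge distribution upward, the ratio exceeds one, mirroring the amplification the abstract reports for storm Xaver.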
Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of local, catchment-specific characteristics, such as meteorology, topography and geology. These traditional views have been beneficial, but they have a narrow framing. In this paper we contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical approaches in flood estimation need to be complemented by the search for the causal mechanisms and dominant processes in the atmosphere, catchment and river system that leave their fingerprints on flood characteristics. (3) Natural climate variability leads to time-varying flood characteristics, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand the interactions between society and floods. (5) Given the global scale and societal importance, we call for the organization of an international multidisciplinary collaboration and data-sharing initiative to further understand the links between climate and flooding and to advance flood research.
From precipitation to damage
(2018)
Flood risk assessments for large river basins often involve piecing together smaller-scale assessments, leading to erroneous risk statements. We describe a coupled model chain, the Regional Flood Model (RFM), for quantifying flood risk at the scale of 100,000 km². It consists of a catchment model, a 1D-2D river network model, and a loss model. We introduce the model chain and present two applications. The first application, for the Elbe River basin with an area of 66,000 km², demonstrates that it is feasible to simulate the complete risk chain for large river basins in a continuous simulation mode with high temporal and spatial resolution. In the second application, RFM is coupled to a multi-site weather generator and applied to the Mulde catchment with an area of 6,000 km². This approach is able to provide very long time series of spatially heterogeneous patterns of precipitation, discharge, inundation, and damage. These patterns respect the spatial correlation of the different processes and are suitable for deriving large-scale risk estimates. We discuss how the RFM approach can be transferred to the continental scale.
The hydrological load causing flood hazard is in many instances not only determined by peak discharge, but is a multidimensional problem. While the methodology for multivariate frequency analysis is well established, the estimation of the associated uncertainty is rarely studied. In this paper, a method is developed to quantify the different sources of uncertainty for a bivariate flood frequency analysis. The method is exemplarily developed for the Mekong Delta (MD), one of the largest and most densely populated river deltas worldwide. Floods in the MD are the basis for the livelihoods of the local population, but they are also the major hazard. This hazard has, however, not been studied within the frame of a probabilistic flood hazard analysis. The nature of the floods in the MD suggests a bivariate approach, because the societal flood severity is determined by both peak discharge and flood volume. The uncertainty caused by the selection of statistical models and parameter estimation procedures is analyzed by applying different models and methods. For the quantification of the sampling uncertainty, two bootstrapping methods were applied. The developed bootstrapping-based uncertainty estimation method shows that large uncertainties are associated with the estimation of bivariate flood quantiles. This uncertainty is much larger than the model selection and fitting uncertainty. Given the rather long data series of 88 years, it is concluded that bivariate flood frequency analysis is expected to carry significant uncertainty and that the quantification and reduction of uncertainty merit greater attention.
Despite this uncertainty, the proposed approach has major advantages over a univariate approach, because (a) it reflects the two essential aspects of floods in this region, (b) the uncertainties are inherent in every bivariate frequency analysis in hydrology due to the generally limited length of observations and can hardly be avoided, and (c) a framework for the quantification of the uncertainties is given, which can be used and interpreted in the hazard assessment. In addition, it is shown by a parametric bootstrapping experiment how longer observation time series can reduce the sampling uncertainty. Based on this finding, it is concluded that bivariate frequency analyses in hydrology would greatly benefit from discharge time series augmented by proxy or historical data, or by causal hydrologic expansion of time series.
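The bootstrapping-based quantification of sampling uncertainty can be sketched for a bivariate AND-exceedance return period. The sample size of 88 mirrors the record length mentioned above; everything else (correlated lognormal marginals instead of a fitted bivariate model) is an illustrative simplification.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical 88-year record of peak discharge Q and flood volume V
# (correlated lognormal marginals; illustrative, not the Mekong data).
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=88)
Q, V = np.exp(z[:, 0]), np.exp(z[:, 1])

def joint_return_period(Q, V, q, v):
    """Empirical AND-exceedance return period: 1 / P(Q > q and V > v)."""
    p = np.mean((Q > q) & (V > v))
    return np.inf if p == 0 else 1.0 / p

def bootstrap_ci(Q, V, q, v, n_boot=2000, seed=0):
    """Nonparametric bootstrap interval for the joint return period,
    quantifying sampling uncertainty only."""
    rng = np.random.default_rng(seed)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(Q), size=len(Q))
        est.append(joint_return_period(Q[idx], V[idx], q, v))
    est = np.asarray(est)
    est = est[np.isfinite(est)]
    return np.percentile(est, [5, 95])

q80, v80 = np.quantile(Q, 0.8), np.quantile(V, 0.8)
T_hat = joint_return_period(Q, V, q80, v80)
lo, hi = bootstrap_ci(Q, V, q80, v80)
```

Even at this modest threshold the interval (lo, hi) is wide relative to the point estimate, which is the qualitative message of the abstract; at design-level quantiles the spread grows further.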
Hierarchical Bayesian Approach for Modeling Spatiotemporal Variability in Flood Damage Processes
(2019)
Flood damage processes are complex and vary between events and regions. State-of-the-art flood loss models are often developed on the basis of empirical damage data from specific case studies and do not perform well when transferred in space or time. This is because such localized models often cover only a small set of the possible damage processes, from a single event and region. On the other hand, a single generalized model covering multiple events and different regions ignores the variability in damage processes across regions and events caused by variables that are not explicitly accounted for at the level of individual households. We implement a hierarchical Bayesian approach to parameterize widely used depth-damage functions, resulting in a hierarchical (multilevel) Bayesian model (HBM) for flood loss estimation that accounts for spatiotemporal heterogeneity in damage processes. We test and confirm the hypothesis that, in transfer scenarios, HBMs are superior to generalized and localized regression models. To improve loss predictions for regions and events for which no empirical damage data are available, we use variables describing specific region and event characteristics, representing commonly available expert knowledge, as group-level predictors within the HBM.
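Partial pooling, the key idea of the hierarchical model, can be caricatured without any MCMC: per-event estimates are shrunk toward an across-event mean. The fixed shrinkage weight below is a crude stand-in for what a full HBM would infer from the data; events, slopes and the linear depth-damage form are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic damage data from three hypothetical flood events, each with its
# own true slope of a linear depth-damage relation (relative loss vs depth).
true_slopes = {"event_A": 0.10, "event_B": 0.16, "event_C": 0.22}
data = {}
for name, b in true_slopes.items():
    depth = rng.uniform(0.2, 3.0, size=40)                 # water depth, metres
    rel_loss = np.clip(b * depth + rng.normal(0, 0.05, 40), 0, 1)
    data[name] = (depth, rel_loss)

def pooled_and_partial(data, shrinkage=0.5):
    """Per-event least-squares slopes (through the origin), plus partially
    pooled slopes shrunk toward the across-event mean: a crude stand-in for
    hierarchical Bayesian partial pooling."""
    local = {g: float(np.sum(d * y) / np.sum(d * d)) for g, (d, y) in data.items()}
    grand = float(np.mean(list(local.values())))
    partial = {g: shrinkage * grand + (1 - shrinkage) * b for g, b in local.items()}
    return local, grand, partial

local, grand, partial = pooled_and_partial(data)
```

A real HBM learns the shrinkage from the between-event variance and, as in the abstract, can push it toward group-level predictors when no damage data exist for a new event.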