In many instances, the hydrological load causing flood hazard is not determined by peak discharge alone but is a multidimensional problem. While the methodology for multivariate frequency analysis is well established, the estimation of the associated uncertainty is rarely studied. In this paper, a method is developed to quantify the different sources of uncertainty in a bivariate flood frequency analysis. The method is demonstrated for the Mekong Delta (MD), one of the largest and most densely populated river deltas worldwide. Floods in the MD are the basis for the livelihoods of the local population, but they are also the major hazard. This hazard has, however, not been studied within the framework of a probabilistic flood hazard analysis. The nature of the floods in the MD suggests a bivariate approach, because the societal flood severity is determined by both peak discharge and flood volume. The uncertainties caused by the selection of statistical models and by the parameter estimation procedures are analyzed by applying different models and methods. For the quantification of the sampling uncertainty, two bootstrapping methods were applied. The developed bootstrapping-based uncertainty estimation method shows that large uncertainties are associated with the estimation of bivariate flood quantiles. This sampling uncertainty is much larger than the model selection and fitting uncertainty. Given the rather long data series of 88 years, it is concluded that any bivariate flood frequency analysis can be expected to carry significant uncertainty, and that the quantification and reduction of this uncertainty merit greater attention.
Despite this uncertainty, the proposed approach has major advantages compared to a univariate approach, because (a) it reflects the two essential aspects of floods in this region, (b) the uncertainties are inherent to every bivariate frequency analysis in hydrology due to the generally limited length of observation records and can hardly be avoided, and (c) a framework for the quantification of the uncertainties is provided, which can be used and interpreted in the hazard assessment. In addition, a parametric bootstrapping experiment shows how longer observation time series can reduce the sampling uncertainty. Based on this finding, it is concluded that bivariate frequency analyses in hydrology would greatly benefit from discharge time series augmented by proxy or historical data, or by causal hydrologic expansion of time series. (C) 2015 Elsevier B.V. All rights reserved.
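The sampling-uncertainty estimation described above can be illustrated with a minimal parametric bootstrap. The sketch below is a simplified, univariate stand-in for the study's bivariate analysis: the Gumbel distribution, the method-of-moments fit and all numeric settings are illustrative assumptions, not the fitted models of the paper.

```python
import math
import random

def fit_gumbel(sample):
    """Method-of-moments fit of a Gumbel distribution: returns (location mu, scale beta)."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta  # Euler-Mascheroni constant
    return mu, beta

def gumbel_quantile(mu, beta, p):
    """Quantile (inverse CDF) of the Gumbel distribution at non-exceedance probability p."""
    return mu - beta * math.log(-math.log(p))

def bootstrap_quantile_ci(sample, p=0.99, n_boot=2000, seed=42):
    """Parametric bootstrap: draw synthetic records from the fitted model,
    refit each, and report a 95% interval for the p-quantile estimate."""
    rng = random.Random(seed)
    mu, beta = fit_gumbel(sample)
    n = len(sample)
    estimates = []
    for _ in range(n_boot):
        synthetic = [gumbel_quantile(mu, beta, rng.random()) for _ in range(n)]
        b_mu, b_beta = fit_gumbel(synthetic)
        estimates.append(gumbel_quantile(b_mu, b_beta, p))
    estimates.sort()
    return estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot)]
```

In such an experiment, refitting to longer synthetic records (larger n) narrows the bootstrap interval, which is the mechanism behind the conclusion that extended time series reduce sampling uncertainty.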
Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of local, catchment-specific characteristics, such as meteorology, topography and geology. These traditional views have been beneficial, but they have a narrow framing. In this paper we contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical approaches in flood estimation need to be complemented by the search for the causal mechanisms and dominant processes in the atmosphere, catchment and river system that leave their fingerprints on flood characteristics. (3) Natural climate variability leads to time-varying flood characteristics, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand the interactions between society and floods. (5) Given the global scale and societal importance, we call for the organization of an international multidisciplinary collaboration and data-sharing initiative to further understand the links between climate and flooding and to advance flood research.
We investigate whether the distribution of maximum seasonal streamflow is significantly affected by the catchment or climate state of the season or month ahead. We fit the Generalized Extreme Value (GEV) distribution to extreme seasonal streamflow for around 600 stations across Europe, conditioning the GEV location and scale parameters on 14 indices that represent the season-ahead climate or catchment state. The comparison of these climate-informed models with the classical GEV distribution with time-constant parameters suggests a substantial potential for seasonal forecasting of flood probabilities. The potential varies between seasons and regions. Overall, the season-ahead catchment wetness shows the highest potential, although climate indices based on large-scale atmospheric circulation, sea surface temperature or sea ice concentration also show some skill for certain regions and seasons. Spatially coherent patterns and a substantial fraction of climate-informed models are promising signs for early alerts that could increase flood preparedness a season ahead.
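A crude way to see the effect of conditioning on the season-ahead state is to stratify years by a wetness index and fit a separate extreme-value model to each stratum. This is a deliberately simplified sketch of the climate-informed idea (the study instead conditions the GEV parameters continuously on 14 indices); the Gumbel model and the median split are illustrative assumptions.

```python
import math

EULER = 0.5772156649  # Euler-Mascheroni constant

def fit_gumbel(sample):
    """Method-of-moments Gumbel fit: returns (location, scale)."""
    n = len(sample)
    m = sum(sample) / n
    v = sum((x - m) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * v) / math.pi
    return m - EULER * beta, beta

def quantile(mu, beta, p):
    """Gumbel quantile at non-exceedance probability p."""
    return mu - beta * math.log(-math.log(p))

def climate_informed_quantiles(flows, wetness, p=0.9):
    """Fit separate Gumbel models to years with wet vs. dry season-ahead
    catchment state (split at the median wetness index) and return the
    p-quantile of seasonal peak flow under each state."""
    med = sorted(wetness)[len(wetness) // 2]
    wet = [q for q, w in zip(flows, wetness) if w >= med]
    dry = [q for q, w in zip(flows, wetness) if w < med]
    return quantile(*fit_gumbel(wet), p), quantile(*fit_gumbel(dry), p)
```

If the wet-state and dry-state quantiles differ clearly, the season-ahead index carries forecast skill for flood probabilities, which is the signal the climate-informed models exploit.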
Sedimentation in the floodplains of the Mekong Delta, Vietnam Part II: deposition and erosion
(2014)
Deposition and erosion play a key role in determining the sediment budget of a river basin, as well as floodplain sedimentation. Floodplain sedimentation, in turn, is a relevant factor for the design of flood protection measures, the productivity of agro-ecosystems, and ecological rehabilitation plans. In the Mekong Delta, erosion and deposition are important factors for geomorphological processes, such as the compensation of deltaic subsidence, as well as for agricultural productivity. Floodplain deposition also counteracts the increasing hazard posed by climate-change-induced sea level rise in the delta. Despite this importance, a sediment database of the Mekong Delta is lacking, and knowledge about erosion and deposition processes is limited. In the Vietnamese part of the delta, the annually flooded natural floodplains have been replaced by a dense system of channels, dikes, paddy fields, and aquaculture ponds, resulting in floodplain compartments protected by ring dikes. Agricultural productivity depends on the sediment and associated nutrient input to the floodplains by the annual floods. However, no quantitative information regarding their sediment trapping efficiency has been reported yet. The present study investigates deposition and erosion based on intensive field measurements in three consecutive years (2008, 2009, and 2010). Optical backscatter sensors are used in combination with sediment traps to interpret deposition and erosion processes at different locations. In our study area, the mean calculated deposition rate is 6.86 kg/m² (≈ 6 mm/year). The key parameters for calculating erosion and deposition are estimated, i.e. the critical bed shear stresses for deposition and erosion and the surface erosion rate constant. The bulk of the floodplain sediment deposition is found to occur during the initial stage of floodplain inundation.
This finding has direct implications for the operation of sluice gates aimed at optimizing sediment input and distribution in the floodplains.
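The critical-shear-stress parameters mentioned above enter standard cohesive-sediment formulations. A common choice, assumed here for illustration and not necessarily the exact model used in the study, is the Krone deposition law and the Partheniades erosion law:

```python
def deposition_flux(c, w_s, tau_b, tau_cd):
    """Krone deposition flux (kg/m^2/s) for suspended sediment concentration c
    (kg/m^3) and settling velocity w_s (m/s): settling occurs only while the
    bed shear stress tau_b is below the critical shear stress for deposition tau_cd."""
    if tau_b >= tau_cd:
        return 0.0
    return w_s * c * (1.0 - tau_b / tau_cd)

def erosion_flux(m_e, tau_b, tau_ce):
    """Partheniades erosion flux (kg/m^2/s) with erosion rate constant m_e:
    erosion occurs only while tau_b exceeds the critical shear stress for
    erosion tau_ce."""
    if tau_b <= tau_ce:
        return 0.0
    return m_e * (tau_b / tau_ce - 1.0)
```

Integrating the deposition flux over the inundation period yields a seasonal deposition mass per unit area, which is the kind of quantity that can be compared against the reported mean of 6.86 kg/m².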
Sophisticated methods have been developed and have become standard for analysing floods as well as for assessing flood risk. However, increasing critique of current standards and scientific practice can be found both in the flood hydrology community and in the risk community, which argue that the considerable amount of information already available on natural disasters has not been adequately deployed and brought to effective use. We describe this phenomenon as a failure to synthesize knowledge, resulting from barriers and ignorance in the awareness, use and management of the entire spectrum of relevant content, that is, data, information and knowledge. In this paper we argue that the scientific community in flood risk research ignores event-specific analyses and documentations as another source of data. We present results from a systematic search that includes an intensive study of the sources and ways of information dissemination of flood-relevant publications. We obtain 186 documents that contain information on the sources, pathways, receptors and/or consequences of any of the 40 strongest trans-basin floods in Germany in the period 1952-2002. This study therefore provides the most comprehensive metadata collection of flood documentations for the considered geographical space and period. A total of 87.5% of all events have been documented, and especially the most severe floods have received extensive coverage. Only 30% of the material was produced in the scientific/academic environment, and the majority of all documents (about 80%) can be considered grey literature (i.e. literature not controlled by commercial publishers). Ignoring grey sources in flood research therefore also means ignoring the largest part of the knowledge available on single flood events (in Germany). Furthermore, the results of this study underpin the rapid changes in the information dissemination of flood event literature over the last decade.
We discuss the options and obstacles of incorporating these data into the knowledge-building process in light of current technological developments and of international, interdisciplinary debates on data curation.
In hydrology, the storage-discharge relationship is a fundamental catchment property, and understanding what controls this relationship is at the core of catchment science. To date, there are no direct methods to measure water storage at catchment scales (10¹-10³ km²). In this study, we use direct measurements of terrestrial water storage dynamics by means of superconducting gravimetry in a small headwater catchment of the Regen River, Germany, to derive empirical storage-discharge relationships in nested catchments of increasing scale. Our results show that the local storage measurements are strongly related to streamflow dynamics at larger scales (> 100 km²; correlation coefficients = 0.78-0.81), but at the small scale no such relationship exists (~1 km²; correlation coefficient = -0.11). The geologic setting in the region can explain both the disconnection between local water storage and headwater runoff and the connectivity between headwater storage and streams draining larger catchment areas. More research is required to understand what controls the form of the observed storage-discharge relationships at the catchment scale. This study demonstrates that high-precision gravimetry can provide new insights into the complex relationship between the state and response of hydrological systems.
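The empirical screening described above amounts to correlating a gravimetric storage-anomaly series with streamflow series at gauges of different scale. A minimal sketch of the correlation step (the series themselves are placeholders, not the study's data):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series,
    e.g. a storage-anomaly series and a streamflow series."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Applied per gauge, a strong correlation at the large-scale gauges and a weak one at the headwater gauge would reproduce the scale-dependent pattern reported above.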
Flood hazard projections under climate change are typically derived by applying model chains consisting of the following elements: "emission scenario - global climate model - downscaling (possibly including bias correction) - hydrological model - flood frequency analysis". To date, this approach yields very uncertain results, owing to the difficulties of global and regional climate models in representing precipitation. The implementation of such model chains requires major effort, and their complexity is high.
For the Mekong River, we propose an alternative approach based on a shortened model chain: "emission scenario - global climate model - non-stationary flood frequency model". The underlying idea is to exploit a link between the Western Pacific monsoon and local flood characteristics: the variance of the monsoon drives a non-stationary flood frequency model, yielding a direct estimate of flood probabilities. This approach bypasses the uncertain precipitation simulations, since the monsoon variance is derived from large-scale wind fields, which are better represented by climate models. The simplicity of the monsoon-flood link makes it possible to derive large ensembles of flood projections under climate change. We conclude that this is a worthwhile, complementary approach to the typical model chains in catchments where a substantial link between climate and floods is found.
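The shortened chain can be sketched as a non-stationary flood frequency model whose location parameter follows a monsoon covariate. The Gumbel form, the linear link mu_t = a + b * I_t and all parameter values below are illustrative assumptions, not the fitted model of the study:

```python
import math

def exceedance_prob(q, mu, beta):
    """P(Q > q) under a Gumbel distribution with location mu and scale beta."""
    return 1.0 - math.exp(-math.exp(-(q - mu) / beta))

def nonstationary_exceedance(q, monsoon_index, a, b, beta):
    """Yearly exceedance probabilities of a fixed discharge q when the Gumbel
    location parameter is a linear function of a monsoon covariate I_t
    (derived from large-scale wind fields): mu_t = a + b * I_t."""
    return [exceedance_prob(q, a + b * i, beta) for i in monsoon_index]
```

Feeding climate-model projections of the monsoon index through such a link directly yields an ensemble of time-varying flood probabilities, without simulating precipitation or running a hydrological model.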
We investigate the usefulness of complex flood damage models for predicting relative damage to residential buildings in a spatial and temporal transfer context. We apply eight different flood damage models to predict relative building damage for five historic flood events in two different regions of Germany. Model complexity is measured in terms of the number of explanatory variables, which varies from 1 to 10 variables singled out from 28 candidate variables. Model validation is based on empirical damage data, whereby observation uncertainty is taken into consideration. The comparison of model predictive performance shows that additional explanatory variables besides water depth improve the predictive capability in a spatial and temporal transfer context, i.e., when the models are transferred to different regions and different flood events. Concerning the trade-off between predictive capability and reliability, the model structure seems more important than the number of explanatory variables. Among the models considered, the reliability of Bayesian-network-based predictions in space-time transfer is higher than for the remaining models, and the uncertainties associated with damage predictions are reflected more completely.
Reliable flood risk analyses, including the estimation of damage, are an important prerequisite for efficient risk management. However, not much is known about the flood damage processes affecting companies. We therefore conduct a flood damage assessment of companies in Germany with regard to two aspects. First, we identify relevant damage-influencing variables. Second, we assess the prediction performance of the developed damage models with respect to the gain from using an increasing amount of training data and from a sector-specific evaluation of the data. Random forests are trained with data from two post-event surveys conducted after the flood events of 2002 and 2013. For a sector-specific consideration, the data set is split into four subsets corresponding to the manufacturing, commercial, financial, and service sectors. Further, separate models are derived for three different company assets: buildings, equipment, and goods and stock. Calculated variable importance values reveal different variable sets relevant for the damage estimation, indicating significant differences in the damage process for the various company sectors and assets. With an increasing number of data used to build the models, prediction errors decrease. Yet the effect is rather small and seems to saturate at a data set size of several hundred observations. In contrast, the prediction improvement achieved by a sector-specific consideration is more distinct, especially for damage to equipment and to goods and stock. Consequently, sector-specific data acquisition and the consideration of sector-specific company characteristics in future flood damage assessments are expected to improve model performance more than a mere increase in data.
Observed streamflow in headwater catchments of the Tarim River (Central Asia) increased by about 30% over the period 1957-2004. This study aims at assessing to what extent these streamflow trends can be attributed to changes in air temperature or precipitation. The analysis includes a data-based approach using multiple linear regression and a simulation-based approach using a hydrological model. The hydrological model considers changes in both glacier area and surface elevation. It was calibrated using a multiobjective optimization algorithm with calibration criteria based on glacier mass balance and on daily and interannual variations of discharge. The individual contributions to the overall streamflow trends from changes in glacier geometry, temperature, and precipitation were assessed using simulation experiments with a constant glacier geometry and with detrended temperature and precipitation time series. The results showed that the observed changes in streamflow were consistent with the changes in temperature and precipitation. In the Sari-Djaz catchment, increasing temperatures and the related increase in glacier melt were identified as the dominant driver, while in the Kakshaal catchment both increasing temperatures and increasing precipitation played a major role. Comparing the two approaches, an advantage of the simulation-based approach is that it relies on the process-based relationships implemented in the hydrological model rather than on the statistical links of the regression model. However, data-based approaches are less affected by model parameter and structural uncertainties and are typically fast to apply. A complementary application of both approaches is recommended.
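The data-based attribution approach can be sketched as an ordinary least squares regression of streamflow on temperature and precipitation. Below is a minimal two-predictor OLS via the normal equations; the regression setup is a simplified assumption, and the study's actual covariates and preprocessing are more elaborate:

```python
def regress2(y, x1, x2):
    """Ordinary least squares for y = b0 + b1*x1 + b2*x2, solved via the
    normal equations (Cramer's rule) on centred predictors."""
    n = len(y)
    my, m1, m2 = sum(y) / n, sum(x1) / n, sum(x2) / n
    yc = [v - my for v in y]
    c1 = [v - m1 for v in x1]
    c2 = [v - m2 for v in x2]
    s11 = sum(a * a for a in c1)          # sum of squares of predictor 1
    s22 = sum(a * a for a in c2)          # sum of squares of predictor 2
    s12 = sum(a * b for a, b in zip(c1, c2))  # cross-product of predictors
    s1y = sum(a * b for a, b in zip(c1, yc))
    s2y = sum(a * b for a, b in zip(c2, yc))
    det = s11 * s22 - s12 * s12
    b1 = (s1y * s22 - s2y * s12) / det
    b2 = (s2y * s11 - s1y * s12) / det
    b0 = my - b1 * m1 - b2 * m2
    return b0, b1, b2
```

With streamflow regressed on temperature (x1) and precipitation (x2), multiplying each fitted coefficient by the observed trend of its predictor gives that driver's estimated contribution to the overall streamflow trend.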