Design flood estimation is an essential part of flood risk assessment. Flood frequency analysis and design storm approaches are commonly applied, while derived flood frequency analysis using continuous simulation has recently been gaining attention. In this study, a continuous hydrological modelling approach on an hourly time scale is applied, driven by a multi-site weather generator in combination with a k-nearest neighbour resampling procedure based on the method of fragments. The derived 100-year flood estimates in 16 catchments in Vorarlberg (Austria) are compared to (a) the flood frequency analysis based on observed discharges, and (b) a design storm approach. Besides the peak flows, the corresponding runoff volumes are analysed. The spatial dependence structure of the synthetically generated flood peaks is validated against observations. It can be demonstrated that the continuous modelling approach achieves plausible results and reveals a large variability in runoff volume across the flood events.
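As an illustration of the resampling step described above, the following sketch disaggregates a daily rainfall total to hourly values with the method of fragments, selecting one of the k observed days with the most similar daily total. All data here are synthetic, and the selection on the daily total alone is a simplification of the multi-site procedure used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical library of observed days: daily totals (mm) and their
# hourly "fragments" (24 fractions of the daily sum, each row sums to 1)
obs_daily = rng.gamma(2.0, 5.0, size=200)
raw = rng.random((200, 24))
fragments = raw / raw.sum(axis=1, keepdims=True)

def disaggregate(daily_total, k=5):
    """Method of fragments: pick one of the k observed days whose daily
    total is closest to daily_total and rescale its hourly pattern."""
    idx = np.argsort(np.abs(obs_daily - daily_total))[:k]
    choice = rng.choice(idx)
    return daily_total * fragments[choice]

hourly = disaggregate(12.5)
```

By construction the hourly values sum exactly to the daily total, so the daily statistics of the weather generator are preserved.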
Different upper tail indicators exist to characterize heavy-tail phenomena, but no comparative study has been carried out so far. We evaluate the shape parameter of the generalized extreme value (GEV) distribution, the obesity index, the Gini index and the upper tail ratio (UTR) against a novel benchmark of tail heaviness, the surprise factor. Sensitivity analyses with respect to sample size and changes in the scale-to-location ratio are carried out in bootstrap experiments. The UTR replicates the surprise factor best but is the most uncertain indicator and is only comparable between records of similar length. For samples with symmetric Lorenz curves, the shape parameter, obesity index and Gini index provide consistent indications. For asymmetric Lorenz curves, however, the first two tend to overestimate tail heaviness, whereas the Gini index tends to underestimate it. We suggest using a combination of shape parameter, obesity index and Gini index to characterize tail heaviness. These indicators should be supported by calculation of the Lorenz asymmetry coefficients and interpreted with caution.
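The sample-based indicators can be estimated in a few lines. The sketch below uses common textbook definitions: the Gini index from the sorted-sample formula and the obesity index as the probability that, in an ordered subsample of four, the sum of the extremes exceeds the sum of the middle values. The UTR form shown (record maximum over an upper quantile) is an assumption, since the exact normalization is not given here.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.pareto(2.0, 2000) + 1.0        # heavy-tailed synthetic record

def gini(sample):
    """Gini index from the sorted-sample formula."""
    s = np.sort(sample)
    n = len(s)
    return (2 * np.arange(1, n + 1) - n - 1) @ s / (n * s.sum())

def obesity(sample, n_draws=20000, rng=rng):
    """P(x1 + x4 > x2 + x3) for ordered random subsamples of size 4."""
    q = np.sort(rng.choice(sample, size=(n_draws, 4)), axis=1)
    return np.mean(q[:, 0] + q[:, 3] > q[:, 1] + q[:, 2])

def upper_tail_ratio(sample, q=0.9):
    """Assumed UTR form: record maximum over an upper quantile."""
    return sample.max() / np.quantile(sample, q)

g, ob, utr = gini(x), obesity(x), upper_tail_ratio(x)
```

For this Pareto sample the obesity index lies well above 0.5, while a thin-tailed sample would sit closer to it, which is the contrast the indicators are meant to capture.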
This paper introduces a novel measure to assess similarity between event hydrographs. It is based on cross recurrence plots (CRP) and recurrence quantification analysis (RQA), which have recently gained attention in a range of disciplines dealing with complex systems. The method quantifies event runoff dynamics and is based on the time-delay-embedded phase space representation of discharge hydrographs. A phase space trajectory is reconstructed from each event hydrograph, and pairs of hydrographs are compared based on the distance between their phase space trajectories. Time delay embedding allows the multidimensional relationships between different points in time within the event to be considered. Hence, the temporal succession of discharge values is taken into account, for example the impact of the initial conditions on the runoff event. We provide an introduction to cross recurrence plots and discuss their parameterization. An application example based on flood time series demonstrates how the method can be used to measure the similarity or dissimilarity of events and to detect events with rare runoff dynamics. It is argued that this method provides a more comprehensive approach to quantifying hydrograph similarity than conventional hydrological signatures.
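The core construction, time-delay embedding followed by a thresholded distance matrix, can be sketched compactly. The embedding dimension, delay and threshold eps below are arbitrary illustrative choices, not the parameterization discussed in the paper.

```python
import numpy as np

def embed(x, dim=3, delay=2):
    """Time-delay embedding of a 1-D series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])

def cross_recurrence(x, y, dim=3, delay=2, eps=0.5):
    """Binary cross recurrence plot: 1 where the phase-space trajectories
    of x and y are closer than eps (Euclidean distance)."""
    X, Y = embed(x, dim, delay), embed(y, dim, delay)
    dist = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (dist < eps).astype(int)

# two similar synthetic "hydrographs" recur frequently in phase space
t = np.linspace(0.0, 6.0 * np.pi, 200)
crp = cross_recurrence(np.sin(t), np.sin(t + 0.1))
```

RQA measures such as determinism are then computed from structures (e.g. diagonal lines) in this binary matrix.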
In an effort to reduce parameter uncertainties in constructing recurrence plots, and in particular to avoid potential artefacts, this paper presents a technique to derive an artefact-safe region of parameter sets. The technique exploits both the deterministic (incl. chaotic) and the stochastic signal characteristics captured by recurrence quantification (i.e. diagonal structures) and is useful when the evaluated signal is known to be deterministic. This study focuses on recurrence plots generated from the reconstructed phase space in order to represent the many real application scenarios in which not all variables describing a system are available (data scarcity). The technique involves randomly shuffling the original signal to destroy its deterministic characteristics. The purpose is to evaluate whether the determinism values of the original and the shuffled signal remain close together, which would suggest that the recurrence plot comprises artefacts. The use of such a determinism-sensitive region should be accompanied by standard embedding optimization approaches, e.g. indices such as the false nearest neighbour and mutual information, to yield a more reliable recurrence plot parameterization.
Stochastic modeling of precipitation for estimation of hydrological extremes is an important element of flood risk assessment and management. The spatially consistent estimation of rainfall fields and their temporal variability remains challenging and is addressed by various stochastic weather generators.
In this study, two types of weather generators are evaluated against observed data and benchmarked regarding their ability to simulate spatio-temporal precipitation fields in the Rhine catchment. A multi-site station-based weather generator uses an auto-regressive model and estimates the spatial correlation structure between stations. Another weather generator is raster-based and uses the nearest-neighbor resampling technique for reshuffling daily patterns while preserving the correlation structure between the observations.
Both weather generators perform well and are comparable at the point (station) scale with regard to daily mean and 99.9th-percentile precipitation as well as wet/dry frequencies and transition probabilities. Areal extreme precipitation at the sub-basin scale is, however, overestimated by the station-based weather generator due to an overestimation of the correlation structure between individual stations. The auto-regressive model tends to generate larger rainfall fields for extreme precipitation than observed, particularly in summer. The weather generator based on nearest-neighbor resampling reproduces the observed daily and multiday (5-, 10- and 20-day) extreme events at a similar magnitude. Improvements in performance regarding wet frequencies and transition probabilities are recommended for both models.
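The wet/dry statistics used in this benchmark can be computed directly from a daily series. The two-state Markov occurrence process and the 0.1 mm wet-day threshold in the sketch below are illustrative assumptions.

```python
import numpy as np

def wet_dry_stats(precip, wet_threshold=0.1):
    """Wet-day frequency and first-order transition probabilities
    P(wet|wet) and P(wet|dry) from a daily precipitation series (mm)."""
    wet = precip >= wet_threshold
    prev, nxt = wet[:-1], wet[1:]
    return wet.mean(), nxt[prev].mean(), nxt[~prev].mean()

# synthetic occurrence process with assumed transition probabilities
# P(wet|wet) = 0.7 and P(wet|dry) = 0.25, gamma-distributed wet amounts
rng = np.random.default_rng(0)
n = 3650
wet = np.empty(n, dtype=bool)
wet[0] = False
for t in range(1, n):
    wet[t] = rng.random() < (0.7 if wet[t - 1] else 0.25)
precip = np.where(wet, rng.gamma(2.0, 3.0, n), 0.0)

p_wet, p_ww, p_wd = wet_dry_stats(precip)
```

Comparing these statistics between observed and generated series is one of the point-scale checks referred to above.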
Sophisticated methods have been developed and have become standard in analysing floods and assessing flood risk. However, criticism of the current standards and scientific practice is increasingly voiced both in the flood hydrology community and in the risk community, who argue that the considerable amount of information already available on natural disasters has not been adequately deployed and brought to effective use. We describe this phenomenon as a failure to synthesize knowledge, resulting from barriers and ignorance in the awareness, use and management of the entire spectrum of relevant content, that is, data, information and knowledge. In this paper we argue that the scientific community in flood risk research ignores event-specific analyses and documentation as another source of data. We present results from a systematic search that includes an intensive study of the sources and ways of information dissemination of flood-relevant publications. We obtain 186 documents that contain information on the sources, pathways, receptors and/or consequences of any of the 40 strongest trans-basin floods in Germany in the period 1952-2002. This study therefore provides the most comprehensive metadata collection of flood documentation for the considered geographical space and period. A total of 87.5% of all events have been documented, and the most severe floods in particular have received extensive coverage. Only 30% of the material was produced in the scientific/academic environment, and the majority of all documents (about 80%) can be considered grey literature (i.e. literature not controlled by commercial publishers). Ignoring grey sources in flood research therefore also means ignoring the largest part of the knowledge available on single flood events (in Germany). Furthermore, the results of this study underpin the rapid changes in the information dissemination of flood event literature over the last decade.
We discuss the options and obstacles of incorporating this data into the knowledge-building process in light of current technological developments and international, interdisciplinary debates on data curation.
Flooding is an imminent natural hazard threatening most river deltas, e.g. the Mekong Delta. Appropriate flood management is thus required for the sustainable development of these often densely populated regions. Recently, traditional event-based hazard control has shifted towards a risk management approach in many regions, driven by intensive research leading to new legal regulations on flood management. However, a large-scale flood risk assessment does not exist for the Mekong Delta. In particular, the flood risk to paddy rice cultivation, the most important economic activity in the delta, has not been assessed yet. The present study was therefore developed to provide a first insight into delta-scale flood damage and risk to rice cultivation. The flood hazard was quantified by probabilistic flood hazard maps of the whole delta using bivariate extreme value statistics, synthetic flood hydrographs, and a large-scale hydraulic model. The flood risk to paddy rice was then quantified considering cropping calendars, rice phenology, and harvest times based on a time series of the enhanced vegetation index (EVI) derived from MODIS satellite data, and a published rice flood damage function. The proposed concept provides flood risk maps to paddy rice for the Mekong Delta in terms of expected annual damage and, owing to its generic approach, can be used as a blueprint for regions facing similar problems. Furthermore, the changes in flood risk to paddy rice caused by the land-use changes currently under discussion in the Mekong Delta were estimated. Two land-use scenarios, either intensifying or reducing rice cropping, were considered, and the changes in risk are presented in spatially explicit flood risk maps. The basic risk maps could serve as guidance for the authorities in developing spatially explicit flood management and mitigation plans for the delta.
The land-use change risk maps could further be used for adaptive risk management plans and as a basis for a cost-benefit analysis of the discussed land-use change scenarios. Additionally, the damage and risk maps may support the recently initiated agricultural insurance programme in Vietnam.
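The expected annual damage (EAD) metric used for the risk maps is the area under the damage-exceedance-probability curve. A minimal sketch with hypothetical damage figures:

```python
import numpy as np

# hypothetical damage estimates (million USD) for a set of return periods
return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])
damages = np.array([0.0, 1.2, 3.5, 8.0, 13.0, 20.0])

p = 1.0 / return_periods            # annual exceedance probabilities
order = np.argsort(p)
p, d = p[order], damages[order]

# expected annual damage: trapezoidal area under the damage-probability
# curve, integrated over exceedance probability
ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
```

In practice the curve is refined with more scenarios, and events below the damage-onset return period contribute nothing, as in the first entry here.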
Flood risk analyses often assume the same flood intensity along the river reach under study, i.e. discharges are calculated for a number of return periods T, e.g. 10 or 100 years, at several streamflow gauges. T-year discharges are regionalised and then transferred into T-year water levels, inundated areas and impacts. This approach assumes that (1) flood scenarios are homogeneous throughout a river basin, and (2) the T-year damage corresponds to the T-year discharge. Using a reach of the river Rhine, this homogeneous approach is compared with an approach based on four flood types with different spatial discharge patterns. For each type, a regression model was created and used in a Monte Carlo framework to derive heterogeneous scenarios. For each scenario, four cumulative impact indicators were calculated: (1) the total inundated area, (2) the exposed settlement and industrial areas, (3) the exposed population and (4) the potential building loss. Their frequency curves were used to rank eight past flood events according to their severity. The investigation revealed that the two assumptions of the homogeneous approach do not hold; it tends to overestimate event probabilities in large areas. The generation of heterogeneous scenarios should therefore receive more attention.
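The contrast between homogeneous and heterogeneous scenarios can be illustrated with a toy Monte Carlo: each flood type imposes its own spatial pattern of peaks, so a high peak quantile is rarely reached at all gauges simultaneously. The patterns, type probabilities and Gumbel marginal below are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# four flood types, each with its own spatial pattern: factors linking
# the peak at an index gauge to the peaks at three gauges along the reach
patterns = np.array([[1.00, 0.95, 0.90],   # basin-wide event
                     [1.00, 0.70, 0.45],   # upstream-centred event
                     [1.00, 1.10, 1.25],   # downstream-centred event
                     [1.00, 0.85, 0.60]])  # mixed event
type_prob = np.array([0.40, 0.25, 0.20, 0.15])

n_sc = 10000
types = rng.choice(4, size=n_sc, p=type_prob)
index_peak = rng.gumbel(1000.0, 300.0, n_sc)   # peak at the index gauge
noise = rng.normal(1.0, 0.05, size=(n_sc, 3))  # regression scatter
scenarios = index_peak[:, None] * patterns[types] * noise

# under heterogeneous scenarios a high peak quantile (here the empirical
# 99th percentile per gauge) is rarely exceeded at all gauges at once
q99 = np.quantile(scenarios, 0.99, axis=0)
joint = np.mean(np.all(scenarios > q99, axis=1))
```

The joint exceedance probability is necessarily below the per-gauge probability, which is the homogeneity assumption the study shows to be violated.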
Annually laminated (varved) lake sediments with intercalated detrital layers, resulting from sedimentary input by runoff events, are ideal archives for establishing precisely dated records of past extreme runoff events. In this study, the mid- to late Holocene varved sediments of Lake Mondsee (Upper Austria) were analysed by combining sedimentological, geophysical and geochemical methods. This approach makes it possible to distinguish two types of detrital layers related to different types of extreme runoff events (floods and debris flows) and to detect changes in flood activity during the last 7100 years. In total, 271 flood and 47 debris flow layers, deposited during spring and summer, were identified; these cluster in 18 main flood episodes (FE 1-18) with durations of 30-50 years each. These main flood periods occurred during the Neolithic (7100-7050 vyr BP and 6470-4450 vyr BP), the late Bronze Age and the early Iron Age (3300-3250 and 2800-2750 vyr BP), the late Iron Age (2050-2000 vyr BP), throughout the Dark Ages Cold Period (1500-1200 vyr BP), and at the end of the Medieval Warm Period and the Little Ice Age (810-430 vyr BP).
Summer flood episodes in Lake Mondsee are generally more abundant during the last 1500 years, often coinciding with major advances of Alpine glaciers. Prior to 1500 vyr BP, spring/summer floods and debris flows are generally less frequent, indicating a lower number of intense rainfall events triggering erosion. Compared with the increase of late Holocene flood activity in western and northwestern (NW) Europe, which commenced as early as 2800 yr BP, the hydro-meteorological shift in the Lake Mondsee region occurred much later. These time lags in the onset of increased hydrological activity might be due either to regional differences in atmospheric circulation patterns or to the sensitivity of the individual flood archives. The Lake Mondsee sediments represent the first precisely dated, several-millennia-long summer flood record for the northeastern (NE) Alps, a key region at the climatic boundary of Atlantic, Mediterranean and East European air masses, aiding a better understanding of regional and seasonal peculiarities of flood occurrence under changing climate conditions. (C) 2013 Elsevier Ltd. All rights reserved.
Especially for extreme precipitation or floods, there is considerable spatial and temporal variability in long-term trends or in the response of station time series to large-scale climate indices. Consequently, identifying trends or the sensitivity of these extremes to climate parameters can be marked by high uncertainty. When developing a nonstationary frequency analysis model, a key step is the identification of potential trends or effects of climate indices on the station series. An automatic clustering procedure that effectively pools stations with similar responses is desirable to reduce the estimation variance, thus improving the identification of trends or responses, and to account for spatial dependence. This paper presents a new hierarchical Bayesian approach for exploring the homogeneity of response in large-area data sets through a multicomponent mixture model. The approach allows the reduction of uncertainties through both full pooling and partial pooling of stations across automatically chosen subsets of the data. We apply the model to study the trends in annual maximum daily streamflow at 68 gauges over Germany. The effects of changing the number of clusters and the parameters used for clustering are demonstrated. The results show large, mainly upward trends at the gauges of the River Rhine Basin in western Germany and along the main stream of the Danube River in the south, as well as some small upward trends at gauges in central and northern Germany.
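Full versus partial pooling can be illustrated without the full mixture model: the empirical-Bayes sketch below shrinks noisy per-gauge trend estimates toward a pooled mean, which is the basic mechanism by which a hierarchical model reduces estimation variance. All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
n_gauges = 68

# noisy per-gauge trend estimates (slope of annual maxima) with known
# sampling variances -- all values synthetic
true_trend = rng.normal(0.3, 0.2, n_gauges)
se2 = rng.uniform(0.05, 0.3, n_gauges)
est = true_trend + rng.normal(0.0, np.sqrt(se2))

# empirical-Bayes partial pooling: shrink each estimate toward the
# pooled mean with a weight set by the between-gauge variance tau2
tau2 = max(est.var() - se2.mean(), 1e-6)
pooled_mean = np.average(est, weights=1.0 / (tau2 + se2))
w = tau2 / (tau2 + se2)
shrunk = pooled_mean + w * (est - pooled_mean)
```

Gauges with large sampling variance are pulled strongly toward the pooled mean; the hierarchical Bayesian model does this per cluster rather than over all stations at once.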
The link between streamflow extremes and climatology has been widely studied in recent decades. However, a study investigating the effect of large-scale circulation variations on the distribution of seasonal discharge extremes at the European level is missing. Here we fit a climate-informed generalized extreme value (GEV) distribution to about 600 streamflow records in Europe for each of the standard seasons, i.e., to winter, spring, summer and autumn maxima, and compare it with the classical GEV distribution with parameters invariant in time. The study adopts a Bayesian framework and covers the period 1950 to 2016. Five indices with proven influence on the European climate are examined independently as covariates, namely the North Atlantic Oscillation (NAO), the east Atlantic pattern (EA), the east Atlantic-western Russian pattern (EA/WR), the Scandinavia pattern (SCA) and the polar-Eurasian pattern (POL). It is found that for a high percentage of stations the climate-informed model is preferred to the classical model. Particularly for NAO during winter, a strong influence on streamflow extremes is detected for large parts of Europe (preferred to the classical GEV distribution for 46% of the stations). Climate-informed fits are characterized by spatial coherence and form patterns that resemble relations between the climate indices and seasonal precipitation, suggesting a prominent role of the considered circulation modes for flood generation. For certain regions, such as northwestern Scandinavia and the British Isles, yearly variations of the mean seasonal climate indices result in considerably different extreme value distributions and thus in highly different flood estimates for individual years that can also persist for longer time periods.
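A minimal version of a climate-informed model conditions the location parameter on a climate index. The sketch below uses the Gumbel limit of the GEV (shape → 0) to keep the likelihood compact, holds the other parameters fixed, and profiles the covariate coefficient on synthetic data; the full study instead fits all GEV parameters in a Bayesian framework.

```python
import numpy as np

def gumbel_nll(params, maxima, index):
    """Negative log-likelihood of a Gumbel model whose location varies
    linearly with a climate index: mu = a + b * index."""
    a, b, log_scale = params
    scale = np.exp(log_scale)
    z = (maxima - (a + b * index)) / scale
    return np.sum(np.log(scale) + z + np.exp(-z))

# synthetic seasonal maxima with a true NAO effect of 15 units
rng = np.random.default_rng(7)
nao = rng.normal(0.0, 1.0, 67)                   # 1950-2016: 67 seasons
maxima = 100.0 + 15.0 * nao + 10.0 * rng.gumbel(0.0, 1.0, 67)

# profile the covariate coefficient b on a grid (a and scale held fixed
# for brevity; a full fit would optimise all parameters jointly)
bs = np.linspace(-5.0, 30.0, 71)
nlls = np.array([gumbel_nll((100.0, b, np.log(10.0)), maxima, nao)
                 for b in bs])
b_hat = bs[np.argmin(nlls)]
```

Comparing the minimised likelihood with that of the stationary model (b = 0) is the model-preference check described above.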
Inventory of dams in Germany
(2021)
Dams are an important element of water resources management. Data about dams are crucial for practitioners, scientists, and policymakers for various purposes, such as seasonal forecasting of water availability or flood mitigation. However, detailed information on dams at the national level has so far not been freely available for Germany. We present the most comprehensive open-access dam inventory for Germany (DIG) to date. We have collected and combined information on dams from books, state agency reports, engineering reports, and internet pages. We have applied a priority rule that ensures the highest level of reliability for the dam information. Our inventory comprises 530 dams in Germany with information on name, location, river, start year of construction and operation, crest length, dam height, lake area, lake volume, purpose, dam structure, and building characteristics. We have used a global, satellite-based water surface raster to evaluate the locations of the dams. A significant proportion (63%) of the dams were built between 1950 and 2013. Our inventory shows that dams in Germany are mostly single-purpose (52%); 53% can be used for flood control, and 25% are involved in energy production. The inventory is freely available through GFZ (GeoForschungsZentrum) Data Services (https://doi.org/10.5880/GFZ.4.4.2020.005).
Reliable flood risk analyses, including the estimation of damage, are an important prerequisite for efficient risk management. However, not much is known about the flood damage processes affecting companies. We therefore conduct a flood damage assessment of companies in Germany with regard to two aspects. First, we identify relevant damage-influencing variables. Second, we assess the prediction performance of the developed damage models with respect to the gains from using an increasing amount of training data and from a sector-specific evaluation of the data. Random forests are trained with data from two post-event surveys conducted after the flood events of 2002 and 2013. For a sector-specific consideration, the data set is split into four subsets corresponding to the manufacturing, commercial, financial, and service sectors. Furthermore, separate models are derived for three company assets: buildings, equipment, and goods and stock. Calculated variable importance values reveal different variable sets relevant for damage estimation, indicating significant differences in the damage process across company sectors and assets. With an increasing number of data points used to build the models, prediction errors decrease; yet the effect is rather small and seems to saturate at a data set size of several hundred observations. In contrast, the prediction improvement achieved by a sector-specific consideration is more distinct, especially for damage to equipment and to goods and stock. Consequently, sector-specific data acquisition and the consideration of sector-specific company characteristics in future flood damage assessments are expected to improve model performance more than a mere increase in data.
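Variable importance of the kind reported here can be approximated by permutation: shuffle one predictor and measure the increase in prediction error. To stay dependency-free, the sketch below uses a linear surrogate model instead of a random forest; the data and variables are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 600

# synthetic company records: water depth drives the loss, firm size has
# a weak effect, and one variable is pure noise
depth = rng.uniform(0.0, 3.0, n)
log_size = rng.normal(2.0, 1.0, n)
noise_var = rng.normal(0.0, 1.0, n)
loss = 10.0 * depth + 0.5 * log_size + rng.normal(0.0, 2.0, n)

X = np.column_stack([depth, log_size, noise_var])
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, loss, rcond=None)
base_mse = np.mean((A @ coef - loss) ** 2)

# permutation importance: increase in error after shuffling one variable
importance = []
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    mse = np.mean((np.column_stack([np.ones(n), Xp]) @ coef - loss) ** 2)
    importance.append(mse - base_mse)
```

Shuffling the water-depth column degrades the prediction most, mirroring how importance rankings separate relevant from irrelevant damage-influencing variables.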
Hydrometeorological hazards caused losses of approximately 110 billion U.S. dollars in 2016 worldwide. Current damage estimations do not consider uncertainties in a comprehensive way, and they are not consistent between spatial scales. Aggregated land use data are used at larger spatial scales, although detailed exposure data at the object level, such as openstreetmap.org, are becoming increasingly available across the globe. We present a probabilistic approach for object-based damage estimation that represents uncertainties and is fully scalable in space. The approach is applied to, and validated on, company damage from the flood of 2013 in Germany. Damage estimates are more accurate than those of damage models using land use data, and the estimation works reliably at all spatial scales; it can therefore also be used for pre-event analyses and risk assessments. This method takes hydrometeorological damage estimation and risk assessment to the next level, making damage estimates and their uncertainties fully scalable in space, from the object to the country level, and enabling the exploitation of new exposure data.
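A probabilistic object-based estimate can be obtained by Monte Carlo sampling of a damage fraction per object; because aggregation is a plain sum over objects, the same samples yield consistent estimates at any spatial scale. The Beta damage-fraction model below is an assumption for illustration, not the distribution used in the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# hypothetical exposed objects: asset value (kEUR) and local water depth (m)
values = rng.lognormal(6.0, 1.0, 1000)
depth = rng.uniform(0.1, 1.5, 1000)

# instead of one deterministic damage fraction per object, draw many
# samples from an (assumed) Beta distribution whose mean grows with depth
n_mc = 2000
mean_frac = np.clip(0.2 * depth, 0.01, 0.99)
a = mean_frac * 10.0
b = (1.0 - mean_frac) * 10.0
frac = rng.beta(a, b, size=(n_mc, 1000))

# aggregation is a sum, so the estimate scales from object to region
# level; quantiles of the Monte Carlo totals express the uncertainty
total = frac @ values
lo, med, hi = np.percentile(total, [5, 50, 95])
```

Summing over any subset of objects (street, municipality, country) gives the estimate and its uncertainty at that scale from the same samples.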
Understanding and quantifying the total economic impacts of flood events is essential for flood risk management and adaptation planning. Yet detailed estimations of joint direct and indirect flood-induced economic impacts are rare. In this study, an innovative modeling procedure for the joint assessment of short-term direct and indirect economic flood impacts is introduced. The procedure is applied to 19 economic sectors in eight federal states of Germany after the flood events of 2013. The assessment of the direct economic impacts is object-based and considers the uncertainties associated with the hazard, the exposed objects and their vulnerability. The direct economic impacts are then coupled to a supply-side input-output model to estimate the indirect economic impacts. The procedure provides distributions of direct and indirect economic impacts that capture the associated uncertainties. The distributions of the direct economic impacts in the federal states are plausible when compared to reported values. The ratio between indirect and direct economic impacts shows that the sectors Manufacturing and Financial and Insurance Activities suffered the most from indirect economic impacts. These ratios also indicate that indirect economic impacts can be almost as high as direct economic impacts. They differ strongly between the economic sectors, indicating that applying a single factor as a proxy for the indirect impacts of all economic sectors is not appropriate.
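The coupling step can be sketched with a miniature supply-side (Ghosh-type) input-output model, in which a loss of primary inputs propagates forward through inter-sector deliveries. The 3-sector allocation matrix and direct losses are invented for illustration.

```python
import numpy as np

# hypothetical 3-sector allocation (Ghosh) matrix: share of sector i's
# output delivered to sector j as intermediate input
B = np.array([[0.10, 0.20, 0.10],
              [0.20, 0.10, 0.20],
              [0.10, 0.10, 0.20]])

direct = np.array([10.0, 5.0, 2.0])   # direct impacts per sector, mEUR

# supply-side propagation: total impact = direct @ (I - B)^-1
total = direct @ np.linalg.inv(np.eye(3) - B)
indirect = total - direct
ratios = indirect / direct
```

Because (I - B)^-1 expands into I + B + B² + ..., every round of forward propagation adds a non-negative indirect impact, and the sector-specific ratios differ, which is why a single proxy factor is inappropriate.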
The Limpopo Basin in southern Africa is prone to droughts which affect the livelihood of millions of people in South Africa, Botswana, Zimbabwe and Mozambique. Seasonal drought early warning is thus vital for the whole region. In this study, the predictability of hydrological droughts during the main runoff period from December to May is assessed using statistical approaches. Three methods (multiple linear models, artificial neural networks, random forest regression trees) are compared in terms of their ability to forecast streamflow with up to 12 months of lead time. The following four main findings result from the study. 1. There are stations in the basin at which standardised streamflow is predictable with lead times up to 12 months. The results show high inter-station differences of forecast skill but reach a coefficient of determination as high as 0.73 (cross validated). 2. A large range of potential predictors is considered in this study, comprising well-established climate indices, customised teleconnection indices derived from sea surface temperatures and antecedent streamflow as a proxy of catchment conditions. El Nino and customised indices, representing sea surface temperature in the Atlantic and Indian oceans, prove to be important teleconnection predictors for the region. Antecedent streamflow is a strong predictor in small catchments (with median 42% explained variance), whereas teleconnections exert a stronger influence in large catchments. 3. Multiple linear models show the best forecast skill in this study and the greatest robustness compared to artificial neural networks and random forest regression trees, despite their capabilities to represent nonlinear relationships. 4. Employed in early warning, the models can be used to forecast a specific drought level. 
Even if the coefficient of determination is low, the forecast models have better skill than a climatological forecast, as shown by an analysis of receiver operating characteristics (ROCs). Seasonal statistical forecasts in the Limpopo Basin show promising results, and it is thus recommended to employ them as a complement to existing forecasts in order to strengthen preparedness for droughts.
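The ROC analysis mentioned here compares hit and false-alarm rates of drought warnings over a range of decision thresholds. A self-contained sketch on synthetic forecasts:

```python
import numpy as np

def roc_points(forecast, event, thresholds):
    """Hit and false-alarm rates when a warning is issued whenever the
    forecast falls below a threshold."""
    hit = np.array([np.mean(forecast[event] < t) for t in thresholds])
    far = np.array([np.mean(forecast[~event] < t) for t in thresholds])
    return far, hit

rng = np.random.default_rng(3)
obs = rng.normal(0.0, 1.0, 500)                   # standardised streamflow
fcst = 0.7 * obs + 0.3 * rng.normal(0.0, 1.0, 500)
drought = obs < -0.8                              # observed drought level

far, hit = roc_points(fcst, drought, np.linspace(-2.0, 2.0, 41))
# a forecast with skill lies above the diagonal: area under the curve > 0.5
auc = np.sum(0.5 * (hit[1:] + hit[:-1]) * np.diff(far))
```

An area near 0.5 would mean no advantage over climatology even if other scores look acceptable, which is why ROC is used as the skill check here.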
We investigate the usefulness of complex flood damage models for predicting relative damage to residential buildings in a spatial and temporal transfer context. We apply eight different flood damage models to predict relative building damage for five historic flood events in two different regions of Germany. Model complexity is measured in terms of the number of explanatory variables, which varies from 1 up to 10 variables singled out from 28 candidate variables. Model validation is based on empirical damage data, with observation uncertainty taken into consideration. The comparison of model predictive performance shows that additional explanatory variables besides the water depth improve predictive capability in a spatial and temporal transfer context, i.e., when the models are transferred to different regions and different flood events. Concerning the trade-off between predictive capability and reliability, the model structure seems more important than the number of explanatory variables. Among the models considered, the reliability of Bayesian-network-based predictions in space-time transfer is higher than for the remaining models, and the uncertainties associated with damage predictions are reflected more completely.
Private precaution is an important component of contemporary flood risk management and climate adaptation. However, quantitative knowledge about vulnerability reduction via private precautionary measures is scarce, and their effects are hardly considered in loss modeling and risk assessments. Yet such knowledge is a prerequisite for temporally dynamic flood damage and risk modeling and thus for the evaluation of risk management and adaptation strategies. To quantify the average reduction in the vulnerability of residential buildings achieved by private precaution, empirical vulnerability data (n = 948) are used. Households with and without precautionary measures undertaken before the flood event are classified into treatment and non-treatment groups and matched. Post-matching regression is used to quantify the treatment effect. Additionally, we test state-of-the-art flood loss models regarding their capability to capture this difference in vulnerability. The estimated average treatment effect of implementing private precaution is between 11 and 15 thousand EUR per household, confirming the significant effectiveness of private precautionary measures in reducing flood vulnerability. Of all the tested flood loss models, the expert Bayesian-network-based model BN-FLEMOps and the rule-based loss model FLEMOps perform best in capturing the difference in vulnerability due to private precaution. The use of such loss models is therefore suggested for flood risk assessments to effectively support evaluations and decision making for adaptive flood risk management.
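The matching idea can be sketched with synthetic data: treated (precaution) households are paired with untreated households of similar water depth, and the average treatment effect is the mean loss difference across pairs. The assumed true effect of -13 kEUR and the single matching covariate are illustrative simplifications; the study matches on several covariates and adds post-matching regression.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 948

# synthetic households: loss (kEUR) grows with water depth; precaution
# is assumed to reduce the loss by 13 kEUR on average
depth = rng.uniform(0.1, 2.0, n)
precaution = rng.random(n) < 0.4
loss = 40.0 * depth + rng.normal(0.0, 5.0, n)
loss[precaution] -= 13.0

# nearest-neighbour matching on water depth: for every treated household
# find the untreated household with the most similar depth
treated = np.flatnonzero(precaution)
control = np.flatnonzero(~precaution)
pairs = control[np.argmin(np.abs(depth[treated][:, None]
                                 - depth[control][None, :]), axis=1)]
att = np.mean(loss[treated] - loss[pairs])
```

Matching removes the confounding influence of water depth, so the pairwise loss difference isolates the effect of precaution rather than differences in hazard exposure.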
Hierarchical Bayesian Approach for Modeling Spatiotemporal Variability in Flood Damage Processes
(2019)
Flood damage processes are complex and vary between events and regions. State-of-the-art flood loss models are often developed on the basis of empirical damage data from specific case studies and do not perform well when transferred in space or time. This is because such localized models often cover only a small set of the possible damage processes of one event and one region. A single generalized model covering multiple events and regions, on the other hand, ignores the variability in damage processes across regions and events caused by variables that are not explicitly accounted for at the level of individual households. We implement a hierarchical Bayesian approach to parameterize widely used depth-damage functions, resulting in a hierarchical (multilevel) Bayesian model (HBM) for flood loss estimation that accounts for spatiotemporal heterogeneity in damage processes. We test and confirm the hypothesis that, in transfer scenarios, HBMs are superior to generalized and localized regression models. To improve loss predictions for regions and events for which no empirical damage data are available, we use variables describing region- and event-specific characteristics, representing commonly available expert knowledge, as group-level predictors within the HBM.
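A lightweight stand-in for the hierarchical model is a two-step scheme: estimate a depth-damage slope per region-event, then partially pool the slopes toward values predicted by a group-level covariate. The fixed pooling weight below replaces what the HBM infers from the data; everything is synthetic.

```python
import numpy as np

rng = np.random.default_rng(9)

# six synthetic region-events; each has its own depth-damage slope,
# partly explained by a group-level covariate (e.g. event magnitude)
n_groups, n_per = 6, 30
group_cov = rng.normal(0.0, 1.0, n_groups)
true_slope = 0.15 + 0.05 * group_cov
data = []
for g in range(n_groups):
    d = rng.uniform(0.0, 3.0, n_per)
    r = np.clip(true_slope[g] * d + rng.normal(0.0, 0.05, n_per), 0.0, 1.0)
    data.append((d, r))

# step 1: localized (no-pooling) slope estimate per group
slopes = np.array([np.polyfit(d, r, 1)[0] for d, r in data])

# step 2: regress the slopes on the group-level covariate and partially
# pool each slope toward the regression line (fixed weight here; an HBM
# would infer the pooling strength from the data)
coef = np.polyfit(group_cov, slopes, 1)
predicted = np.polyval(coef, group_cov)
w = 0.5
partial = w * predicted + (1.0 - w) * slopes
```

For a new region-event with no damage data, only the group-level regression is available, which is exactly how the expert-knowledge covariates improve out-of-sample predictions in the HBM.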