TY - JOUR
A1 - Doerr, Benjamin
A1 - Krejca, Martin S.
T1 - Significance-based estimation-of-distribution algorithms
JF - IEEE Transactions on Evolutionary Computation
N2 - Estimation-of-distribution algorithms (EDAs) are randomized search heuristics that create a probabilistic model of the solution space, which is updated iteratively, based on the quality of the solutions sampled according to the model. As previous works show, this iteration-based perspective can lead to erratic updates of the model, in particular, to bit-frequencies approaching a random boundary value. In order to overcome this problem, we propose a new EDA based on the classic compact genetic algorithm (cGA) that takes into account a longer history of samples and updates its model only with respect to information which it classifies as statistically significant. We prove that this significance-based cGA (sig-cGA) optimizes the commonly regarded benchmark functions OneMax (OM), LeadingOnes, and BinVal all in quasilinear time, a result shown for no other EDA or evolutionary algorithm so far. For the recently proposed stable compact genetic algorithm, an EDA that tries to prevent erratic model updates by imposing a bias toward the uniformly distributed model, we prove that it optimizes OM only in a time exponential in its hypothetical population size. Similarly, we show that the convex search algorithm cannot optimize OM in polynomial time.
KW - heuristic algorithms
KW - sociology
KW - statistics
KW - history
KW - probabilistic logic
KW - benchmark testing
KW - genetic algorithms
KW - estimation-of-distribution algorithm (EDA)
KW - run time analysis
KW - theory
Y1 - 2020
U6 - https://doi.org/10.1109/TEVC.2019.2956633
SN - 1089-778X
SN - 1941-0026
VL - 24
IS - 6
SP - 1025
EP - 1034
PB - Institute of Electrical and Electronics Engineers
CY - New York, NY
ER -
TY - JOUR
A1 - Kreibich, Heidi
A1 - Botto, Anna
A1 - Merz, Bruno
A1 - Schröter, Kai
T1 - Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO
JF - Risk Analysis
N2 - Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO represents the variation range of loss estimates of the other models in the case study well.
KW - Damage modeling
KW - multiparameter
KW - probabilistic
KW - uncertainty
KW - validation
Y1 - 2016
U6 - https://doi.org/10.1111/risa.12650
SN - 0272-4332
SN - 1539-6924
VL - 37
IS - 4
SP - 774
EP - 787
PB - Wiley
CY - Hoboken
ER -
TY - JOUR
A1 - Rözer, Viktor
A1 - Kreibich, Heidi
A1 - Schröter, Kai
A1 - Müller, Meike
A1 - Sairam, Nivedita
A1 - Doss-Gollin, James
A1 - Lall, Upmanu
A1 - Merz, Bruno
T1 - Probabilistic Models Significantly Reduce Uncertainty in Hurricane Harvey Pluvial Flood Loss Estimates
JF - Earth's Future
N2 - Pluvial flood risk is mostly excluded in urban flood risk assessment. However, the risk of pluvial flooding is a growing challenge, with a projected increase of extreme rainstorms compounding with ongoing global urbanization. Although pluvial flooding, which occurs when rainfall rates exceed the capacity of urban drainage systems, has long been considered a flood type with minimal impacts, the aftermath of rainfall-triggered flooding during Hurricane Harvey and other events shows the urgent need to assess its risk. Due to its local extent and small-scale variations, the quantification of pluvial flood risk requires risk assessments at high spatial resolutions. While flood hazard and exposure information is becoming increasingly accurate, the estimation of losses is still a poorly understood component of pluvial flood risk quantification. We use a new probabilistic multivariable modeling approach to estimate pluvial flood losses of individual buildings, explicitly accounting for the associated uncertainties. Except for water depth, the common most important predictor, we identified different drivers for whether a loss occurs at all and for the degree of loss. Applying this approach to estimate and validate building structure losses during Hurricane Harvey using a property-level data set, we find that the reliability and dispersion of predictive loss distributions vary widely depending on the model and the aggregation level of property-level loss estimates. Our results show that the use of multivariable zero-inflated beta models reduces the 90% prediction intervals for Hurricane Harvey building structure loss estimates on average by 78% (totaling US$3.8 billion) compared to commonly used models.
KW - pluvial flooding
KW - loss modeling
KW - urban flooding
KW - probabilistic
KW - Hurricane Harvey
KW - climate change adaptation
Y1 - 2019
U6 - https://doi.org/10.1029/2018EF001074
SN - 2328-4277
VL - 7
IS - 4
SP - 384
EP - 394
PB - American Geophysical Union
CY - Washington
ER -