Efforts have been made in the past to enhance building exposure models on a regional scale with increasing spatial resolution by integrating different data sources. This work follows a similar path and focuses on downscaling the existing SARA exposure model that was proposed for the residential building stock of the communes of Valparaíso and Viña del Mar (Chile). Although this model enabled great progress in harmonising building classes and characterising their differential physical vulnerabilities, it is now outdated and, in any case, spatially aggregated over large administrative units. Hence, to more accurately consider the impact of future earthquakes on these cities, it is necessary to employ more reliable exposure models. For this purpose, we propose updating the existing model through a Bayesian approach that integrates ancillary data made increasingly available through Volunteered Geographic Information (VGI) activities. The spatial representation of the model is also optimised using higher-resolution aggregation units that avoid the inconvenience of incomplete building-by-building footprints. A worst-case earthquake scenario is presented to calculate direct economic losses, and a sensitivity analysis highlights the degree of uncertainty contributed by the exposure model in comparison with the other parameters used to generate the seismic ground motions. This example study shows the great potential of increasingly available VGI for updating worldwide building exposure models, as well as its importance in scenario-based seismic risk assessment.
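The Bayesian updating idea described above can be sketched with a conjugate Dirichlet-multinomial model: the outdated exposure model supplies prior proportions of building classes in an aggregation unit, and VGI-derived building counts act as observations. This is a minimal illustration under assumed numbers; the class names, prior proportions, prior strength, and counts below are all hypothetical, not values from the study.

```python
import numpy as np

# Hypothetical building classes and prior proportions from an existing
# (SARA-like) exposure model -- illustrative values, not from the study.
classes = ["masonry", "reinforced_concrete", "timber", "adobe"]
prior_props = np.array([0.40, 0.35, 0.15, 0.10])

# Encode the prior as a Dirichlet with an assumed pseudo-count that
# reflects how much trust is placed in the outdated model.
prior_strength = 50.0
alpha_prior = prior_props * prior_strength

# Hypothetical counts of buildings per class from VGI (e.g. crowd-mapped
# attributes) surveyed in one aggregation unit.
vgi_counts = np.array([12, 55, 8, 3])

# Conjugate update: Dirichlet prior + multinomial observations.
alpha_post = alpha_prior + vgi_counts
post_props = alpha_post / alpha_post.sum()

print(dict(zip(classes, np.round(post_props, 3))))
```

Because the VGI counts here favour reinforced concrete, the posterior shifts mass toward that class while the prior keeps rarely observed classes from vanishing entirely.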
A comprehensive study of seismic hazard and earthquake triggering is crucial for effective mitigation of earthquake risks. The destructive nature of earthquakes motivates researchers to work on forecasting despite the apparent randomness of earthquake occurrence. Understanding the underlying mechanisms and patterns is vital, given the potential for widespread devastation and loss of life. This thesis combines methodologies, including Coulomb stress calculations and aftershock analysis, to shed light on earthquake complexities, ultimately enhancing seismic hazard assessment.
The Coulomb failure stress (CFS) criterion is widely used to predict the spatial distribution of aftershocks following large earthquakes. However, uncertainties associated with CFS calculations arise from non-unique slip inversions and unknown fault networks, particularly due to the choice of the assumed aftershock (receiver) mechanisms. Recent studies have proposed alternative stress quantities and deep-neural-network approaches as superior to CFS with predefined receiver mechanisms. To challenge these propositions, I utilized 289 slip inversions from the SRCMOD database to calculate more realistic CFS values for a layered half-space and variable receiver mechanisms. The analysis also investigates the impact of the magnitude cutoff, grid-size variation, and aftershock duration on the ranking of the stress metrics using receiver operating characteristic (ROC) analysis. The results reveal that the performance of the stress metrics improves significantly after accounting for receiver variability, and for larger aftershocks and shorter time periods, without altering the relative ranking of the different stress metrics.
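The ROC analysis mentioned above scores how well a stress metric separates grid cells that host aftershocks from cells that do not. The following is a minimal, self-contained sketch on synthetic data (not SRCMOD results): each cell gets a stress value and a binary aftershock label, and the area under the ROC curve (AUC) summarises the metric's skill.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not SRCMOD data): each grid cell has a stress
# value and a label saying whether an aftershock occurred in it.
n = 2000
labels = rng.random(n) < 0.2            # True = cell hosts aftershocks
# Assume cells with aftershocks tend to have higher Coulomb stress change.
stress = rng.normal(0.0, 1.0, n) + 1.5 * labels

def roc_curve(score, label):
    """True/false positive rates as the decision threshold sweeps down."""
    order = np.argsort(-score)
    label = label[order]
    tpr = np.cumsum(label) / label.sum()
    fpr = np.cumsum(~label) / (~label).sum()
    return fpr, tpr

def auc(fpr, tpr):
    """Trapezoidal area under the ROC curve, anchored at the origin."""
    fpr = np.concatenate([[0.0], fpr])
    tpr = np.concatenate([[0.0], tpr])
    return np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0)

fpr, tpr = roc_curve(stress, labels)
print(f"AUC = {auc(fpr, tpr):.3f}")     # AUC > 0.5 means skill above random
```

Ranking several candidate stress metrics then amounts to computing one AUC per metric on the same cells and labels.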
To corroborate the Coulomb stress calculations with findings from earthquake source studies in more detail, I studied the source properties of the 2005 Kashmir earthquake and its aftershocks, aiming to unravel the seismotectonics of the NW Himalayan syntaxis. I simultaneously relocated the mainshock and its largest aftershocks using phase data, followed by a comprehensive analysis of the Coulomb failure stress changes on the aftershock planes. All large aftershocks lie in regions of positive stress change, indicating triggering by either coseismic or postseismic slip on the mainshock fault.
Finally, I investigated the relationship between mainshock-induced stress changes and the associated seismicity parameters, in particular those of the frequency-magnitude (Gutenberg-Richter) distribution and the temporal aftershock decay (Omori-Utsu law). For that purpose, I used my global data set of 127 mainshock-aftershock sequences with the calculated Coulomb stress changes (ΔCFS) and the alternative receiver-independent stress metrics in the vicinity of the mainshocks, and analyzed how the aftershock properties depend on the stress values. Surprisingly, the results show a clear positive correlation between the Gutenberg-Richter b-value and the induced stress, contrary to expectations from laboratory experiments. This observation highlights the significance of structural heterogeneity and strength variations for seismicity patterns. Furthermore, the study demonstrates that aftershock productivity increases nonlinearly with stress, while the Omori-Utsu parameters c and p systematically decrease with increasing stress changes. These partly unexpected findings have significant implications for future estimations of aftershock hazard.
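The Gutenberg-Richter b-value referred to above is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - (Mc - ΔM/2)), where Mc is the completeness magnitude and ΔM the binning width. A minimal sketch on a synthetic catalog (not the thesis data) with a known true b-value:

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's binning correction.

    mags : magnitudes (only those >= mc are used)
    mc   : magnitude of completeness
    dm   : magnitude binning width (0 for continuous magnitudes)
    """
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter sample with a true b-value of 1.0:
# magnitudes above mc are exponentially distributed with rate b*ln(10).
rng = np.random.default_rng(1)
mc, b_true = 2.0, 1.0
beta = b_true * np.log(10)
mags = mc + rng.exponential(1.0 / beta, size=50_000)

print(f"estimated b = {b_value_mle(mags, mc, dm=0.0):.3f}")
```

Correlating such per-sequence b-values against the stress metrics in the mainshock vicinity is the kind of analysis the paragraph describes.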
The findings of this thesis provide valuable insights into earthquake triggering mechanisms by examining the relationship between stress changes and aftershock occurrence. The results contribute to an improved understanding of earthquake behavior and can aid in the development of more accurate probabilistic seismic hazard forecasts and risk-reduction strategies.
The creation of building exposure models for seismic risk assessment is frequently challenging due to the lack of availability of detailed information on building structures. Different strategies have been developed in recent years to overcome this, including the use of census data, remote sensing imagery, and volunteered geographic information (VGI). This paper presents the development of a building-by-building exposure model based exclusively on openly available datasets, including both VGI and census statistics, which are defined at different levels of spatial resolution and for different moments in time. The initial model, stemming purely from building-level data, is enriched with statistics aggregated at the neighbourhood and city level by means of a Monte Carlo simulation that enables the generation of full realisations of damage estimates when using the exposure model in the context of an earthquake scenario calculation. Though applicable to any other region of interest where analogous datasets are available, the workflow and approach followed are explained by focusing on the case of the German city of Cologne, for which a scenario earthquake is defined and the potential damage is calculated. The resulting exposure model and damage estimates are presented, and it is shown that the latter are broadly consistent with damage data from the 1978 Albstadt earthquake, notwithstanding the differences in the scenario. Through this real-world application we demonstrate the potential of VGI and open data to be used for exposure modelling for natural risk assessment, when combined with suitable knowledge on building fragility and accounting for the inherent uncertainties.
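The Monte Carlo enrichment described above can be sketched as follows: buildings whose class is known from VGI keep it, while buildings with unknown class are sampled from aggregated neighbourhood statistics, yielding one full realisation of the exposure model per draw. All numbers, class names, and damage ratios below are hypothetical placeholders, not values from the Cologne study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical inputs: 1000 buildings; 300 have a known class from VGI,
# the rest are assigned probabilistically from neighbourhood statistics.
classes = np.array(["masonry", "rc", "timber"])
known = rng.choice(classes, size=300, p=[0.5, 0.4, 0.1])
neigh_props = np.array([0.45, 0.45, 0.10])   # assumed aggregated statistics

# Assumed mean damage ratio per class for one scenario ground motion.
mdr = {"masonry": 0.25, "rc": 0.10, "timber": 0.15}

def one_realisation():
    """Draw one complete exposure model and its city-wide damage ratio."""
    unknown = rng.choice(classes, size=700, p=neigh_props)
    stock = np.concatenate([known, unknown])
    return np.mean([mdr[c] for c in stock])

losses = np.array([one_realisation() for _ in range(500)])
print(f"mean damage ratio: {losses.mean():.3f} +/- {losses.std():.3f}")
```

The spread of the 500 realisations quantifies the part of the damage uncertainty that stems from the incomplete building-level information.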
Evaluation of a novel application of earthquake HVSR in site-specific amplification estimation
(2020)
Ground response analyses (GRA) model the vertical propagation of SH waves through flat-layered media (1DSH) and are widely carried out to evaluate local site effects in practice. The horizontal-to-vertical spectral ratio (HVSR) technique is a cost-effective approach to extract certain site-specific information, e.g., the site fundamental frequency (f0), but HVSR values cannot be directly used to approximate the levels of S-wave amplification. Motivated by the work of Kawase et al. (2019), we propose a procedure to correct earthquake HVSR amplitudes for direct amplification estimation. The empirical correction compensates HVSR with generic vertical amplification spectra categorized by the vertical fundamental frequency (f0v) via k-means clustering. In this investigation, we evaluate the effectiveness of the corrected HVSR in approximating observed linear amplifications in comparison with 1DSH modelling. We select a total of 90 KiK-net (Kiban Kyoshin network) surface-downhole sites which are found to have no velocity contrasts below their boreholes, so that their surface-to-borehole spectral ratios (SBSRs) can be taken as their empirical transfer functions (ETFs). 1DSH-based theoretical transfer functions (TTFs) are computed in the linear domain, considering uncertainties in Vs profiles through randomizations. Five goodness-of-fit metrics are adopted to gauge the closeness between observed (ETF) and predicted (i.e., TTF and corrected HVSR) amplifications in both amplitude and spectral shape over frequencies from f0 to 25 Hz. We find that the empirical correction to HVSR is highly effective and achieves a "good match" in both spectral shape and amplitude at the majority of the 90 KiK-net sites, as opposed to less than one-third for the 1DSH modelling. In addition, the empirical correction does not require a velocity model, which GRAs require, and thus has great potential in seismic hazard assessment.
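The correction idea can be sketched in a few lines: since HVSR underestimates S-wave amplification by roughly the vertical amplification V(f), multiplying the site's HVSR by a generic V(f) taken from its f0v cluster yields the corrected amplification estimate. Everything below is synthetic and purely illustrative; the resonance shape and the generic vertical spectrum are assumptions, not KiK-net results.

```python
import numpy as np

# Frequency axis covering the band used in the study (f0 to 25 Hz).
freqs = np.linspace(0.5, 25.0, 100)

# Assumed earthquake HVSR of one site: a synthetic resonance at 2 Hz.
hvsr = 1.0 + 3.0 * np.exp(-((freqs - 2.0) ** 2) / 0.5)

# Assumed generic vertical amplification for the site's f0v cluster,
# e.g. an average over sites in that cluster (here a mild ramp > 1).
v_generic = 1.0 + 0.04 * freqs

# Corrected amplification: HVSR compensated by vertical amplification.
corrected = hvsr * v_generic
print(f"peak amplification ~ {corrected.max():.2f} at "
      f"{freqs[corrected.argmax()]:.2f} Hz")
```

In the actual procedure the generic spectra come from clustering the 90 sites by f0v; here a single assumed spectrum stands in for that step.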
Along a subduction zone, great megathrust earthquakes recur either after long seismic gaps lasting several decades to centuries or over much shorter periods lasting hours to a few years, when cascading successions of earthquakes rupture nearby segments of the fault. We analyze a decade of continuous Global Positioning System observations along the South American continent to estimate changes in deformation rates between the 2010 Maule (M8.8) and 2015 Illapel (M8.3) Chilean earthquakes. We find that surface velocities increased after the 2010 earthquake, in response to continental-scale viscoelastic mantle relaxation and to a regional-scale increase in the degree of interplate locking. We propose that increased locking occurs transiently during a super-interseismic phase in segments adjacent to a megathrust rupture, responding to bending of both plates caused by coseismic slip and subsequent afterslip. Enhanced strain rates during a super-interseismic phase may therefore bring a megathrust segment closer to failure and may have triggered the 2015 event.
The impressive number of stream gauges in Chile, combined with a suite of past and recent large earthquakes, makes Chile a unique natural laboratory to study several streams that recorded responses to multiple seismic events. We document changes in discharge in eight streams in Chile following two or more large earthquakes. In all cases, discharge increases. Changes in discharge occur for peak ground velocities greater than about 7-11 cm/s. Above that threshold, neither the magnitude of the increase in discharge nor the total excess water increases with increasing peak ground velocity. While these observations are consistent with previous work in California, they conflict with lab experiments showing that the magnitude of permeability changes increases with increasing amplitude of ground motion. Instead, our study suggests that streamflow responses are binary.
Plain Language Summary: Earthquakes deform and shake the surface and the ground below. These changes may affect groundwater flow by increasing the permeability along newly formed cracks and/or clearing clogged pores. As a result, groundwater flow may substantially increase after earthquakes and remain elevated for several months. Here we document streamflow anomalies following multiple high-magnitude earthquakes in multiple streams in one of the most earthquake-prone regions worldwide, Chile. We take advantage of the dense monitoring network in Chile, which has recorded streamflow since the 1940s. We show that once a critical ground motion is exceeded, streamflow responses to earthquakes can be expected.
We address the question of whether all large-magnitude earthquakes produce an erosion peak in the subaerial components of fluvial catchments. We evaluate the sediment flux response to the Maule earthquake in the Chilean Andes (Mw 8.8) using daily suspended sediment records from 31 river gauges. The catchments cover drainage areas of 350 to around 10,000 km², including a wide range of topographic slopes and vegetation cover of the Andean western flank. We compare the 3- to 8-year postseismic record of sediment flux to each of the following preseismic periods: (1) all preseismic data, (2) a 3-year period prior to the seismic event, and (3) the driest preseismic periods, as drought conditions prevailed in the postseismic period. Following the earthquake, no increases in suspended sediment flux were observed for moderate to high percentiles of the streamflow distribution (mean, median, and ≥75th percentile). However, more than half of the examined stations showed increased sediment flux during baseflow. By using a Random Forest approach, we evaluate the contributions of seismic intensities, peak ground accelerations, co-seismic landslides, hydroclimatic conditions, topography, lithology, and land cover to explain the observed changes in suspended sediment concentration and fluxes. We find that the best predictors are hillslope gradient, low-vegetation cover, and changes in streamflow discharge. This finding suggests a combined first-order control of topography, land cover, and hydrology on the catchment-wide erosion response. We infer a reduced sediment connectivity due to the postseismic drought, which increased the residence time of sediment detached and remobilized following the Maule earthquake.
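The Random Forest step described above ranks candidate predictors by how much they explain the sediment-flux response. A minimal sketch with scikit-learn on synthetic data (the predictor names echo the abstract, but all values and the response relationship are assumed for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic stand-in for the catchment predictors named in the abstract
# (real data not reproduced here): hypothetical hillslope gradient,
# low-vegetation cover fraction, and peak ground acceleration.
n = 400
X = rng.random((n, 3))
feature_names = ["hillslope_gradient", "low_vegetation", "pga"]

# Assumed response: change in suspended sediment flux driven mostly by
# the first two predictors, plus noise.
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * rng.normal(size=n)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
ranking = sorted(zip(feature_names, rf.feature_importances_),
                 key=lambda t: -t[1])
for name, importance in ranking:
    print(f"{name:20s} {importance:.3f}")
```

The impurity-based `feature_importances_` sum to one, so the printed ranking directly identifies the dominant controls in this toy setup.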
Most of the deformation associated with the seismic cycle in subduction zones occurs offshore and has therefore been difficult to quantify with direct observations at millennial timescales. Here we study millennial deformation associated with an active splay-fault system in the Arauco Bay area off south-central Chile. We describe hitherto unrecognized drowned shorelines using high-resolution multibeam bathymetry, geomorphic, sedimentologic, and paleontologic observations and quantify uplift rates using a Landscape Evolution Model. Along a margin-normal profile, uplift rates are 1.3 m/ka near the edge of the continental shelf, 1.5 m/ka at the emerged Santa Maria Island, -0.1 m/ka at the center of the Arauco Bay, and 0.3 m/ka in the mainland. The bathymetry images a complex pattern of folds and faults representing the surface expression of the crustal-scale Santa Maria splay-fault system. We modeled surface deformation using two different structural scenarios: deep-reaching normal faults and deep-reaching reverse faults with shallow extensional structures. Our preferred model comprises a blind reverse fault extending from 3 km depth down to the plate interface at 16 km that slips at a rate between 3.0 and 3.7 m/ka. If all the splay-fault slip occurs during every great megathrust earthquake, with a recurrence of ~150-200 years, the fault would slip ~0.5 m per event, equivalent to a magnitude ~6.4 earthquake. However, if the splay fault slips only with a megathrust earthquake every ~1000 years, the fault would slip ~3.7 m per event, equivalent to a magnitude ~7.5 earthquake.
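The slip-per-event figures quoted above follow from simple rate-times-recurrence arithmetic, assuming slip accumulates steadily at the modelled millennial rate. A quick check:

```python
# Back-of-the-envelope check of the slip-per-event figures, assuming
# steady slip accumulation at the modelled millennial rate.
def slip_per_event(rate_m_per_ka, recurrence_yr):
    """Slip released per event if all accumulated slip is spent each cycle."""
    return rate_m_per_ka * recurrence_yr / 1000.0

# Slip in every megathrust earthquake (recurrence ~150-200 yr),
# with the modelled rate of 3.0-3.7 m/ka:
lo = slip_per_event(3.0, 150)     # 0.45 m
hi = slip_per_event(3.7, 200)     # 0.74 m

# Slip only once per ~1000 yr:
rare = slip_per_event(3.7, 1000)  # 3.7 m

print(f"frequent rupture: {lo:.2f}-{hi:.2f} m per event; "
      f"rare rupture: {rare:.1f} m per event")
```

The 0.45-0.74 m range brackets the ~0.5 m per event quoted for the frequent-rupture scenario, and the rare-rupture case reproduces the ~3.7 m figure.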
Earthquakes occurring close to hydrocarbon fields under production are often critically scrutinized as potentially induced or triggered. However, clear and testable rules to discriminate such events have rarely been developed and tested. The unresolved scientific problem may lead to lengthy public disputes with unpredictable impact on the local acceptance of exploitation and field operations. We propose a quantitative approach to discriminate induced, triggered, and natural earthquakes, which is based on testable input parameters. Maxima of occurrence probabilities are compared for the cases under question, and a single probability of being triggered or induced is reported. The uncertainties of earthquake location and other input parameters are considered by integration over probability density functions. The probability that events have been human-triggered or human-induced is derived from the modeling of Coulomb stress changes and a rate- and state-dependent seismicity model. In our case, a 3-D boundary element method has been adapted for the nuclei-of-strain approach to estimate the stress changes outside the reservoir, which are related to pore pressure changes in the field formation. The predicted rate of natural earthquakes is derived either from the background seismicity or, in the case of rare events, from an estimate of the tectonic stress rate. Instrumentally derived seismological information on the event location, source mechanism, and the size of the rupture plane is advantageous for the method. If the rupture plane has been estimated, discrimination between induced and merely triggered events is theoretically possible if the probability functions are convolved with a rupture fault filter.
We apply the approach to three recent mainshock events: (1) the M-w 4.3 Ekofisk 2001, North Sea, earthquake close to the Ekofisk oil field; (2) the M-w 4.4 Rotenburg 2004, Northern Germany, earthquake in the vicinity of the Söhlingen gas field; and (3) the M-w 6.1 Emilia 2012, Northern Italy, earthquake in the vicinity of a hydrocarbon reservoir. The three test cases cover the complete range of possible causes: clearly human-induced, not even human-triggered, and a case between the two extremes.
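The rate- and state-dependent seismicity ingredient of the scheme can be sketched with Dieterich's (1994) response of a seismicity rate to a Coulomb stress step: a step ΔCFS raises the background rate r to R(t), and the fraction (R - r)/R can be read as the probability that an event observed at time t was triggered. The parameter values below (Aσ, aftershock duration ta) are illustrative assumptions, not calibrated values from the study.

```python
import numpy as np

def seismicity_rate(dcfs_mpa, t_yr, a_sigma_mpa=0.01, ta_yr=10.0, r=1.0):
    """Seismicity rate R(t) after a Coulomb stress step (Dieterich 1994).

    R(t) = r / (1 + (exp(-dCFS/(A*sigma)) - 1) * exp(-t/ta))
    """
    gamma = (np.exp(-dcfs_mpa / a_sigma_mpa) - 1.0) * np.exp(-t_yr / ta_yr)
    return r / (1.0 + gamma)

t = 1.0                          # years after the stress step
for dcfs in (0.001, 0.01, 0.1):  # stress steps in MPa
    R = seismicity_rate(dcfs, t)
    p_trig = (R - 1.0) / R       # triggered-probability proxy
    print(f"dCFS={dcfs:5.3f} MPa  R/r={R:6.2f}  P(triggered)={p_trig:.2f}")
```

Larger stress steps yield larger transient rate increases and hence higher triggered probabilities, which is the qualitative behavior the discrimination method exploits; the full method additionally integrates over location and parameter uncertainties.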
Rapidly uplifting coastlines are frequently associated with convergent tectonic boundaries, like subduction zones, which are repeatedly ruptured by giant megathrust earthquakes. The coastal relief along tectonically active realms is shaped by the effect of sea-level variations and heterogeneous patterns of permanent tectonic deformation, which are accumulated through several cycles of megathrust earthquakes. However, the correlation between earthquake deformation patterns and the sustained long-term segmentation of forearcs, particularly in Chile, remains poorly understood. Furthermore, the methods used to estimate permanent deformation from geomorphic markers, like marine terraces, have remained qualitative and are based on unrepeatable procedures. This contrasts with the increasing resolution of digital elevation models, such as Light Detection and Ranging (LiDAR) topography and high-resolution bathymetric surveys.
Throughout this thesis I study permanent deformation in a holistic manner: from the methods to assess deformation rates to the processes involved in its accumulation. My research focuses particularly on two aspects: developing methodologies to assess permanent deformation using marine terraces, and comparing permanent deformation with seismic-cycle deformation patterns at different spatial scales along the rupture zone of the M8.8 2010 Maule earthquake. Two methods are developed to determine deformation rates from wave-built and wave-cut terraces, respectively. I selected an archetypal example of a wave-built terrace at Santa Maria Island, studying its stratigraphy and recognizing sequences of reoccupation events tied to eleven radiocarbon (14C) ages. I developed a method to link patterns of reoccupation with sea-level proxies by iterating relative sea-level curves for a range of uplift rates. I find the best fit between relative sea level and the stratigraphic patterns for an uplift rate of 1.5 ± 0.3 m/ka.
A graphical user interface named TerraceM® was developed in Matlab®. This novel software tool determines shoreline angles of wave-cut terraces under different geomorphic scenarios. To validate the methods, I selected test sites with available high-resolution LiDAR topography along the Maule earthquake rupture zone and in California, USA. The software determines the 3D location of the shoreline angle, which is a proxy for estimating permanent deformation rates. The method is based on linear interpolations that define the paleo-platform and cliff on swath profiles; the shoreline angle is then located by intersecting these interpolations. The accuracy and precision of TerraceM® were tested by comparing its results with previous assessments and through an experiment with students in a computer-lab setting at the University of Potsdam.
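The shoreline-angle estimate described above reduces to fitting one line to the paleo-platform section of a swath profile and one to the cliff section, then intersecting them. A minimal sketch on a synthetic profile (the geometry and noise levels are assumptions for illustration, not TerraceM® code):

```python
import numpy as np

rng = np.random.default_rng(7)

def shoreline_angle(x_platform, z_platform, x_cliff, z_cliff):
    """Intersect a platform line and a cliff line fitted by least squares."""
    m1, b1 = np.polyfit(x_platform, z_platform, 1)   # platform: z = m1*x + b1
    m2, b2 = np.polyfit(x_cliff, z_cliff, 1)         # cliff:    z = m2*x + b2
    x = (b2 - b1) / (m1 - m2)                        # intersection abscissa
    return x, m1 * x + b1

# Synthetic swath profile: gently dipping platform meeting a steep cliff
# at (x, z) = (100 m, 5 m), with small measurement noise.
xp = np.linspace(0.0, 90.0, 50)
zp = 5.0 - 0.02 * (xp - 100.0) + rng.normal(0.0, 0.05, xp.size)
xc = np.linspace(105.0, 140.0, 30)
zc = 5.0 + 0.5 * (xc - 100.0) + rng.normal(0.0, 0.05, xc.size)

x_sa, z_sa = shoreline_angle(xp, zp, xc, zc)
print(f"shoreline angle at x = {x_sa:.1f} m, elevation z = {z_sa:.2f} m")
```

The recovered elevation of the shoreline angle, compared against the paleo sea level of the terrace, is what yields the uplift rate.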
I combined the methods developed for wave-built and wave-cut terraces to assess regional patterns of permanent deformation along the 2010 Maule earthquake rupture. Wave-built terraces are dated using 12 infrared-stimulated luminescence (IRSL) ages, and shoreline angles of wave-cut terraces are estimated from 170 aligned swath profiles. The comparison of coseismic slip, interseismic coupling, and permanent deformation reveals three areas of high permanent uplift, terrace warping, and sharp fault offsets. These three areas correlate with regions of high slip and low coupling, as well as with the spatial limits of at least eight historical megathrust ruptures (M8-9.5). I propose that the zones of upwarping at Arauco and Topocalma reflect changes in the frictional properties of the megathrust, which result in discrete boundaries for the propagation of megathrust earthquakes.
To explore the application of geomorphic markers and quantitative morphology in offshore areas, I performed a local study of the patterns of permanent deformation inferred from hitherto unrecognized drowned shorelines in the Arauco Bay, at the southern part of the 2010 Maule earthquake rupture zone. A multidisciplinary approach, including morphometry, sedimentology, paleontology, 3D morphoscopy, and a Landscape Evolution Model, is used to recognize, map, and assess local rates and patterns of permanent deformation in submarine environments. The permanent deformation patterns are then reproduced using elastic models to assess deformation rates of an active submarine splay fault, defined as the Santa Maria Fault System (SMFS). The best fit suggests a reverse structure with a slip rate of 3.7 m/ka over the last 30 ka. The record of land-level changes during the earthquake cycle at Santa Maria Island suggests that most of the deformation may be accrued through splay-fault reactivation during megathrust earthquakes, like the 2010 Maule event. Considering a recurrence time of 150 to 200 years, as determined from historical and geological observations, slip of 0.3 to 0.7 m per event would be required to account for the 3.7 m/ka millennial slip rate. However, if the SMFS slips only every ~1000 years, representing a few megathrust earthquakes, then a slip of ~3.5 m per event would be required to account for the long-term rate. Such an event would be equivalent to a magnitude ~6.7 earthquake capable of generating a local tsunami.
The results of this thesis provide novel and fundamental information on the amount of permanent deformation accrued in the crust, and on the mechanisms responsible for this accumulation at millennial timescales along the rupture zone of the M8.8 2010 Maule earthquake. Furthermore, the results highlight the value of quantitative geomorphology and of repeatable methods for determining permanent deformation, improving the accuracy of marine terrace assessments and of vertical deformation-rate estimates in tectonically active coastal areas. This is vital information for adequate coastal-hazard assessment and for anticipating realistic earthquake and tsunami scenarios.