Institut für Geowissenschaften
Inferring a ground-motion prediction equation (GMPE) for a region in which only a small number of seismic events have been observed is a challenging task. One response to this data scarcity is to utilise data from other regions, in the hope that common patterns in the generation of ground motion can contribute to the development of a GMPE for the region in question. This is not an unreasonable course of action, since we expect regional GMPEs to be related to each other. In this work we model this relatedness by assuming that the regional GMPEs occupy a common low-dimensional manifold in the space of all possible GMPEs. As a consequence, the GMPEs are fitted jointly rather than independently of each other, borrowing predictive strength from each other's regional datasets. Experiments on a real dataset show that the manifold assumption yields better predictive performance than fitting regional GMPEs independently.
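The joint-fitting idea can be sketched numerically. The toy below is not the paper's method: it uses a linear subspace as a stand-in for the low-dimensional manifold, synthetic data, and hypothetical GMPE coefficients (intercept, magnitude term, log-distance term), with the projection step playing the role of "borrowing strength" across regions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each region r has a GMPE log(y) = c_r[0] + c_r[1]*M + c_r[2]*log(R).
# "Manifold" assumption (linearised): the c_r lie near a 1-D subspace
# c_r ~ c0 + t_r * v, so all regions share the centre c0 and direction v.
n_regions, n_obs = 5, 20
true_c0 = np.array([1.0, 0.8, -1.2])
true_v = np.array([0.1, 0.05, -0.1])
data = []
for r in range(n_regions):
    c = true_c0 + (r - 2) * true_v
    M = rng.uniform(4, 7, n_obs)
    R = rng.uniform(10, 100, n_obs)
    X = np.column_stack([np.ones(n_obs), M, np.log(R)])
    y = X @ c + 0.1 * rng.standard_normal(n_obs)
    data.append((X, y))

# Independent fits: ordinary least squares per region.
indep = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in data]

# Joint fit (linear stand-in for the manifold): find the leading
# direction of variation across regions, then project each region's
# coefficient vector onto the shared one-dimensional subspace.
C = np.array(indep)                      # (n_regions, 3)
c0 = C.mean(axis=0)
U, S, Vt = np.linalg.svd(C - c0, full_matrices=False)
v = Vt[0]                                # leading direction
joint = c0 + np.outer((C - c0) @ v, v)   # smoothed per-region coefficients

print(joint.shape)                       # one coefficient vector per region
```

The projected coefficients differ from the region-by-region estimates only in directions orthogonal to the shared subspace, which is where the regularisation across regions comes from.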
Aleatory variability in ground-motion prediction, represented by the standard deviation (sigma) of a ground-motion prediction equation, exerts a very strong influence on the results of probabilistic seismic-hazard analysis (PSHA). This is especially so at the low annual exceedance frequencies considered for nuclear facilities; in these cases, even small reductions in sigma can have a marked effect on the hazard estimates. Proper separation and quantification of aleatory variability and epistemic uncertainty can lead to defensible reductions in sigma. One such approach is the single-station sigma concept, which removes that part of sigma corresponding to repeatable site-specific effects. However, the site-to-site component must then be constrained by site-specific measurements or else modeled as epistemic uncertainty and incorporated into the modeling of site effects. The practical application of the single-station sigma concept, including the characterization of the dynamic properties of the site and the incorporation of site-response effects into the hazard calculations, is illustrated for a PSHA conducted at a rock site under consideration for the potential construction of a nuclear power plant.
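The arithmetic behind the single-station sigma concept is a simple variance decomposition, sketched below with hypothetical values (in natural-log units): the within-event variability phi contains a repeatable site-to-site term phi_s2s, and removing it leaves the single-station sigma.

```python
import math

# Illustrative variance decomposition (all numbers hypothetical):
# total sigma splits into between-event (tau) and within-event (phi)
# parts; phi in turn splits into a repeatable site-to-site term
# (phi_s2s) and a single-station remainder (phi_ss).
tau, phi_ss, phi_s2s = 0.35, 0.45, 0.35

phi = math.hypot(phi_ss, phi_s2s)       # within-event sigma
sigma_total = math.hypot(tau, phi)      # ergodic sigma
sigma_ss = math.hypot(tau, phi_ss)      # single-station sigma

print(round(sigma_total, 3), round(sigma_ss, 3))
```

Because the terms combine in quadrature, even a moderate site-to-site component produces a visible reduction in sigma once it is moved from the aleatory to the epistemic side of the hazard model.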
Response spectra are of fundamental importance in earthquake engineering and represent a standard measure in seismic design for the assessment of structural performance. However, unlike Fourier spectral amplitudes, the relationship of response spectral amplitudes to seismological source, path, and site characteristics is not immediately obvious and might even be considered counterintuitive for high oscillator frequencies. The understanding of this relationship is nevertheless important for seismic-hazard analysis. The purpose of the present study is to comprehensively characterize the variation of response spectral amplitudes due to perturbations of the causative seismological parameters. This is done by calculating the absolute parameter sensitivities (sensitivity coefficients), defined as the partial derivatives of the model output with respect to its input parameters. To derive sensitivities, we apply algorithmic differentiation (AD), a powerful approach used extensively for sensitivity analysis of complex models in meteorology and aerodynamics. To the best of our knowledge, AD has not yet been explored in the seismic-hazard context. Within the present study, AD was successfully implemented for a proven and extensively applied simulation program for response spectra (Stochastic Method SIMulation [SMSIM]) using the TAPENADE AD tool. We quantitatively assess the effects and importance of input-parameter perturbations on the shape of response spectra for different regional stochastic models. Additionally, we perform sensitivity analysis regarding adjustment issues of ground-motion prediction equations.
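Forward-mode AD, the basic mechanism behind tools like TAPENADE, can be illustrated in a few lines with dual numbers. This sketch is not SMSIM: the "spectrum" is a toy omega-squared shape and the parameter values are made up, but the derivative it returns is exact (to machine precision), unlike a finite-difference estimate.

```python
class Dual:
    """Minimal forward-mode AD: carry a value and its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__
    def __truediv__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val / o.val,
                    (self.dot * o.val - self.val * o.dot) / o.val**2)
    def __rtruediv__(self, o):
        return Dual(o) / self

def source_spectrum(f, fc):
    # Toy omega-squared source-spectrum shape: 1 / (1 + (f/fc)^2).
    r = f / fc
    return 1.0 / (1.0 + r * r)

# Sensitivity of the spectral amplitude at f = 5 Hz to the corner
# frequency fc = 2 Hz: seed d(fc)/d(fc) = 1 and propagate.
fc = Dual(2.0, 1.0)
amp = source_spectrum(5.0, fc)

# Analytic check: d/dfc [1/(1+(f/fc)^2)] = 2 f^2 / (fc^3 (1+(f/fc)^2)^2)
analytic = 2 * 5.0**2 / (2.0**3 * (1 + (5.0 / 2.0)**2)**2)
print(amp.val, amp.dot, analytic)
```

Every arithmetic operation carries its derivative along, which is exactly what an AD tool does for each statement of a full simulation code.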
Bayesian networks are a powerful and increasingly popular tool for reasoning under uncertainty, offering intuitive insight into (probabilistic) data-generating processes. They have been successfully applied to many different fields, including bioinformatics. In this paper, Bayesian networks are used to model the joint probability distribution of selected earthquake, site, and ground-motion parameters. This provides a probabilistic representation of the independencies and dependencies between these variables. In particular, contrary to classical regression, Bayesian networks do not distinguish between target and predictors, treating each variable as a random variable. The capability of Bayesian networks to model the ground-motion domain in probabilistic seismic hazard analysis is shown for a generic situation. A Bayesian network is learned based on a subset of the Next Generation Attenuation (NGA) dataset, using 3342 records from 154 earthquakes. Because no prior assumptions about dependencies between particular parameters are made, the learned network displays the most probable model given the data. The learned network shows that the ground-motion parameter (horizontal peak ground acceleration, PGA) is directly connected only to the moment magnitude, Joyner-Boore distance, fault mechanism, source-to-site azimuth, and depth to the 2.5 km/s shear-wave velocity horizon (Z2.5). In particular, the effect of V-S30 is mediated by Z2.5. Comparisons of the PGA distributions based on the Bayesian networks with the NGA model of Boore and Atkinson (2008) show a reasonable agreement in ranges of good data coverage.
One of the key challenges in the context of local site-effect studies is the determination of frequencies at which the shakeability of the ground is enhanced. In this context, the H/V technique has become increasingly popular, and peak frequencies of the H/V spectral ratio are sometimes interpreted as resonance frequencies of the transmission response. In the present study, assuming that the Rayleigh surface wave dominates the H/V spectral ratio, we analyse theoretically under which conditions this interpretation is justified and when it is not. We focus on 'layer over half-space' models which, although seemingly simple, capture many aspects of local site effects in real sedimentary structures. Our starting point is the ellipticity of Rayleigh waves. We use the exact formula for the H/V ratio presented by Malischewsky & Scherbaum (2004) to investigate the main characteristics of peak and trough frequencies. We present a simple formula indicating if and where H/V-ratio curves have sharp peaks, depending on the model parameters. In addition, we have constructed a map which demonstrates the relation between the H/V peak frequency and the peak frequency of the transmission response as a function of the layer's Poisson ratio and the impedance contrast. Finally, we have derived maps showing the relationship between the H/V peak and trough frequencies and key model parameters such as the impedance contrast. These maps are intended as diagnostic tools that can help guide the interpretation of H/V spectral-ratio diagrams in the context of site-effect studies.
Logic trees have become the most popular tool for the quantification of epistemic uncertainties in probabilistic seismic hazard assessment (PSHA). In a logic-tree framework, epistemic uncertainty is expressed in a set of branch weights, by which an expert or an expert group assigns degree-of-belief values to the applicability of the corresponding branch models. Despite the popularity of logic trees, however, one finds surprisingly few clear commitments to what logic-tree branch weights are assumed to be (even by hazard analysts designing logic trees). In the present paper we argue that it is important for hazard analysts to accept the probabilistic framework from the beginning when assigning logic-tree branch weights; in other words, to accept that logic-tree branch weights are probabilities in the axiomatic sense, independent of one's preferred philosophical interpretation of probability. We demonstrate that interpreting logic-tree branch weights merely as numerical measures of "model quality," which are then subsequently normalized to sum to unity, will with an increasing number of models inevitably lead to an apparent insensitivity of hazard curves to the logic-tree branch weights, which may even be mistaken for robustness of the results. Finally, we argue that assigning logic-tree branch weights in a sequential fashion may improve their logical consistency.
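The dilution effect described above can be made concrete with a small numerical sketch. All numbers here are hypothetical: bounded "quality" scores normalised over N branches give individual weights of order 1/N, so changing any single branch's score barely moves the weighted-mean hazard once N is large.

```python
import numpy as np

for n in (3, 30, 300):
    quality = np.linspace(0.5, 1.0, n)   # bounded model-quality scores
    hazard = np.logspace(-6, -5, n)      # hypothetical per-branch hazards
    w = quality / quality.sum()          # normalised to sum to one

    base = w @ hazard                    # weighted-mean hazard

    quality2 = quality.copy()
    quality2[0] *= 2.0                   # double one model's score
    w2 = quality2 / quality2.sum()
    shift = abs(w2 @ hazard - base) / base

    # Relative change in the hazard estimate shrinks roughly as 1/n.
    print(n, round(shift, 5))
```

The point is not that the weighted mean is wrong, but that with many branches the normalised weights all collapse towards 1/N, so the result looks insensitive to the weights regardless of how carefully they were assigned.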
Magnitude estimation for microseismicity induced during the KTB 2004/2005 injection experiment (2011)
We determined the magnitudes of 2540 microseismic events measured at a single 3C borehole geophone at the German Deep Drilling Site (known by the German acronym, KTB) during the injection phase 2004/2005. For this task we developed a three-step approach. First, we estimated local magnitudes of 104 larger events with a standard method based on amplitude measurements at near-surface stations. Second, we investigated a series of parameters to characterize the size of these events using the seismograms of the borehole sensor, and we compared them statistically with the local magnitudes. Third, we extrapolated the regression curve to obtain the magnitudes of 2436 events that were only measured at the borehole geophone. This method improved the magnitude of completeness for the KTB data set by more than one order of magnitude, down to M = -2.75. The resulting b-value for all events was 0.78, which is similar to the b-value obtained from taking only the larger events with standard local-magnitude estimation from near-surface stations, b = 0.86. The more complete magnitude catalog was required to study the magnitude distribution with time and to characterize the seismotectonic state of the KTB injection site. The event distribution with time was consistent with the prediction from theory, assuming pore-pressure diffusion as the underlying mechanism triggering the events. The value we obtained for the seismogenic index of -4 suggests that the seismic-hazard potential at the KTB site is comparatively low.
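The b-value computation in such studies is typically Aki's maximum-likelihood estimate from magnitudes above the completeness threshold. The sketch below uses a synthetic Gutenberg-Richter catalogue (not the KTB data); the true b-value and completeness magnitude merely echo the numbers quoted in the abstract.

```python
import math
import random

random.seed(42)

def b_value(mags, mc):
    # Aki's maximum-likelihood estimator:
    # b = log10(e) / (mean(M) - Mc), for magnitudes M >= Mc.
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Synthetic catalogue: magnitudes above Mc are exponentially distributed
# with rate beta = b * ln(10) under the Gutenberg-Richter law.
b_true, mc = 0.8, -2.75
beta = b_true * math.log(10)
mags = [mc + random.expovariate(beta) for _ in range(5000)]

print(round(b_value(mags, mc), 2))   # close to the true value of 0.8
```

With a few thousand events the estimator recovers b to within a few hundredths, which is why lowering the completeness magnitude (and thereby enlarging the usable catalogue) matters so much for borehole datasets.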
Enhancing the resolution and accuracy of surface ground-penetrating radar (GPR) reflection data by inverse filtering to recover a zero-phased band-limited reflectivity image requires a deconvolution technique that takes the mixed-phase character of the embedded wavelet into account. In contrast, standard stochastic deconvolution techniques assume that the wavelet is minimum phase and, hence, often meet with limited success when applied to GPR data. We present a new general-purpose blind deconvolution algorithm for mixed-phase wavelet estimation and deconvolution that (1) uses the parametrization of a mixed-phase wavelet as the convolution of the wavelet's minimum-phase equivalent with a dispersive all-pass filter, (2) includes prior information about the wavelet to be estimated in a Bayesian framework, and (3) relies on the assumption of a sparse reflectivity. Solving the normal equations using the data autocorrelation function provides an inverse filter that optimally removes the minimum-phase equivalent of the wavelet from the data, which leaves traces with a balanced amplitude spectrum but distorted phase. To compensate for the remaining phase errors, we invert in the frequency domain for an all-pass filter, thereby taking advantage of the fact that the action of the all-pass filter is exclusively contained in its phase spectrum. A key element of our algorithm, and a novelty in blind deconvolution, is the inclusion of prior information that allows resolving ambiguities in polarity and timing that cannot be resolved using the sparseness measure alone. We employ a global inversion approach for non-linear optimization to find the all-pass filter phase values for each signal frequency. We tested the robustness and reliability of our algorithm on synthetic data with different wavelets, 1-D reflectivity models of different complexity, varying levels of added noise, and different types of prior information. When applied to realistic synthetic 2-D data and 2-D field data, we obtain images with increased temporal resolution compared to the results of standard processing.
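The property the algorithm exploits, that an all-pass filter has unit amplitude response and therefore acts only on the phase, is easy to verify numerically. The sketch below is a generic FFT demonstration with a random stand-in trace, not the paper's inversion scheme.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 64
trace = rng.standard_normal(n)            # stand-in for a GPR trace

# Build an all-pass filter directly in the frequency domain:
# unit amplitude |A(f)| = 1, arbitrary phase (DC and Nyquist kept
# real so the filtered signal remains real-valued).
phase = rng.uniform(-np.pi, np.pi, n // 2 + 1)
phase[0] = 0.0
phase[-1] = 0.0
A = np.exp(1j * phase)

out = np.fft.irfft(np.fft.rfft(trace) * A, n)

# The amplitude spectrum is untouched; only the phase has changed.
print(np.allclose(np.abs(np.fft.rfft(out)),
                  np.abs(np.fft.rfft(trace))))
```

This is why the minimum-phase deconvolution step can balance the amplitude spectrum first, leaving the all-pass inversion to repair the phase without disturbing the amplitudes again.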
Tsunami early warning (TEW) is a challenging task, as a decision has to be made within a few minutes on the basis of incomplete and error-prone data. Deterministic warning systems have difficulties in integrating and quantifying the intrinsic uncertainties. In contrast, probabilistic approaches provide a framework that handles uncertainties in a natural way. Recently, we proposed a method using Bayesian networks (BNs) that takes into account the uncertainties of seismic source-parameter estimates in TEW. In this follow-up study, the method is applied to 10 recent large earthquakes offshore Sumatra and tested for its performance. We evaluated both the general model performance, given the best knowledge we have today about the source parameters of the 10 events, and the corresponding response to seismic source information evaluated in real time. We find that the resulting site-specific warning-level probabilities represent the available tsunami wave measurements and observations well. Difficulties occur in real-time tsunami assessment if the moment-magnitude estimate is severely over- or underestimated. In general, the probabilistic analysis reveals a considerable range of uncertainty in near-field TEW. By quantifying these uncertainties, the BN analysis provides important additional information that helps a decision maker in a warning centre deal with the complexity of TEW and reason under uncertainty.