Characterization of polarization attributes of seismic waves using continuous wavelet transforms
(2006)
Complex-trace analysis is the method of choice for analyzing polarized data. Because particle motion can be represented by instantaneous attributes that show distinct features for waves of different polarization characteristics, these attributes can be used to separate and characterize such waves. Traditional methods of complex-trace analysis give the instantaneous attributes only as a function of time or frequency. However, for transient wave types or seismic events that overlap in time, an estimate of the polarization parameters requires analysis of the time-frequency dependence of these attributes. We propose a method to map instantaneous polarization attributes of seismic signals in the wavelet domain and explicitly relate these attributes to the wavelet-transform coefficients of the analyzed signal. We compare our method with traditional complex-trace analysis using numerical examples. An advantage of our method is the possibility of performing the complete wave-mode separation/filtering process in the wavelet domain and its ability to provide the frequency dependence of ellipticity, which contains important information on the subsurface structure. Furthermore, using 2-C synthetic and real seismic shot gathers, we show how to use the method to separate different wave types and identify zones of interfering wave modes.
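As a minimal illustration of the instantaneous attributes that underlie complex-trace analysis (not the wavelet-domain method of this paper), the sketch below computes the analytic signal of a synthetic amplitude-modulated trace via an FFT-based Hilbert transform; all signal parameters are invented for the example.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based Hilbert transform: returns x + i*H[x]."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[1:n // 2] = 2.0
        h[n // 2] = 1.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 500.0
t = np.arange(0.0, 2.0, 1.0 / fs)
envelope_true = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)   # slow AM
trace = envelope_true * np.sin(2 * np.pi * 25.0 * t)      # 25 Hz carrier

z = analytic_signal(trace)
inst_amplitude = np.abs(z)                            # instantaneous envelope
inst_phase = np.unwrap(np.angle(z))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)    # ~25 Hz everywhere
```

Because the envelope spectrum is well separated from the carrier, the instantaneous amplitude recovers the modulation and the instantaneous frequency stays at the carrier frequency.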
Magnitude estimation for microseismicity induced during the KTB 2004/2005 injection experiment
(2011)
We determined the magnitudes of 2540 microseismic events recorded by a single three-component (3C) borehole geophone at the German Continental Deep Drilling Site (known by its German acronym, KTB) during the 2004/2005 injection phase. For this task we developed a three-step approach. First, we estimated local magnitudes of 104 larger events with a standard method based on amplitude measurements at near-surface stations. Second, we investigated a series of parameters characterizing the size of these events using the seismograms of the borehole sensor, and we compared them statistically with the local magnitudes. Third, we extrapolated the regression curve to obtain the magnitudes of 2436 events that were recorded only by the borehole geophone. This method improved the magnitude of completeness of the KTB data set by more than one order of magnitude, down to M = -2.75. The resulting b-value for all events was 0.78, similar to the b-value of 0.86 obtained from only the larger events with standard local-magnitude estimation at near-surface stations. The more complete magnitude catalog was required to study the magnitude distribution over time and to characterize the seismotectonic state of the KTB injection site. The event distribution over time was consistent with theoretical predictions assuming pore-pressure diffusion as the underlying mechanism triggering the events. The value of -4 we obtained for the seismogenic index suggests that the seismic hazard potential at the KTB site is comparatively low.
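The three-step extrapolation can be sketched with synthetic numbers (the proxy relation, its coefficients, and the scatter are all invented for illustration; only the event counts follow the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)

# Step 1: local magnitudes ML for the 104 larger events (synthetic here),
# assuming a linear proxy relation ML ~ log10(peak amplitude) + const.
ml_true = rng.uniform(0.0, 1.2, 104)
log_amp = ml_true - 3.0                         # borehole amplitude proxy
ml_obs = ml_true + rng.normal(0.0, 0.1, 104)    # ML with measurement scatter

# Step 2: least-squares regression of ML on the borehole size proxy
slope, intercept = np.polyfit(log_amp, ml_obs, 1)

# Step 3: extrapolate to small events recorded only by the borehole sensor
log_amp_small = rng.uniform(-5.8, -3.0, 2436)
ml_small = slope * log_amp_small + intercept
```

The regression recovers the assumed unit slope, and the extrapolated magnitudes extend well below the completeness level of the surface network.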
Earthquake rupture length and width estimates are needed in many seismological applications. Magnitude estimates are often available, whereas the geometrical extent of the rupture fault is mostly lacking, so scaling relations are needed to derive length and width from magnitude. Most frequently used are the relationships of Wells and Coppersmith (1994), derived from a large dataset including all slip types except thrust-faulting events in subduction environments. However, many applications deal with earthquakes in subduction zones because of their high seismic and tsunamigenic potential, and no well-established scaling relations between moment magnitude and length/width exist for subduction events. In this study, we compiled a large database of source-parameter estimates for 283 earthquakes. All focal mechanisms are represented, with special focus on (large) subduction-zone events. Scaling relations were fitted with linear least-squares as well as orthogonal regression and analyzed with regard to the difference between continental and subduction-zone/oceanic relationships. Additionally, we tested the effect of technical progress in earthquake parameter estimation on the scaling relations, as well as the influence of different fault mechanisms. For a given moment magnitude, we found shorter but wider rupture areas for thrust events compared to Wells and Coppersmith (1994). The thrust-event relationships for pure continental and pure subduction-zone rupture areas were found to be almost identical. The scaling relations differ significantly between slip types. Excluding events prior to 1964, when the worldwide standard seismic network was established, had a remarkable effect on strike-slip scaling relations: the data do not show any saturation of rupture width for strike-slip earthquakes. Generally, rupture area seems to scale with mean slip independent of magnitude. The aspect ratio L/W, however, depends on moment and differs for each slip type.
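Fitting a magnitude-length scaling relation of the form log10(L) = a + b*Mw can be sketched on synthetic data (a and b are invented here, loosely in the range of published relations such as Wells and Coppersmith, 1994):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic magnitude-length pairs following log10(L) = a + b*Mw + scatter
a_true, b_true = -2.44, 0.59
mw = rng.uniform(5.5, 9.0, 283)
log_len = a_true + b_true * mw + rng.normal(0.0, 0.15, mw.size)

# Ordinary least squares treats Mw as error-free; orthogonal regression
# (as also used in the study) would allow errors in both variables.
b_ols, a_ols = np.polyfit(mw, log_len, 1)

length_mw8_km = 10.0 ** (a_ols + b_ols * 8.0)   # predicted rupture length
```

With the assumed coefficients, an Mw 8 event maps to a rupture length on the order of 200 km; the choice between least-squares and orthogonal regression matters most when the scatter in both variables is comparable.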
Tsunami early warning (TEW) is a challenging task, as a decision has to be made within a few minutes on the basis of incomplete and error-prone data. Deterministic warning systems have difficulties integrating and quantifying the intrinsic uncertainties. In contrast, probabilistic approaches provide a framework that handles uncertainties in a natural way. Recently, we proposed a method using Bayesian networks (BNs) that takes into account the uncertainties of seismic source-parameter estimates in TEW. In this follow-up study, the method is applied to 10 recent large earthquakes offshore Sumatra and tested for its performance. We evaluated both the general model performance, given the best knowledge we have today about the source parameters of the 10 events, and the corresponding response to seismic source information evaluated in real time. We find that the resulting site-specific warning-level probabilities represent the available tsunami wave measurements and observations well. Difficulties occur in real-time tsunami assessment if the moment-magnitude estimate is severely over- or underestimated. In general, the probabilistic analysis reveals a considerable range of uncertainties in near-field TEW. By quantifying these uncertainties, the BN analysis provides important additional information that helps a decision maker in a warning centre deal with the complexity of TEW and reason under uncertainty.
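The core probabilistic step can be reduced to a toy Bayesian update over discrete warning levels (all numbers below are invented; a real TEW Bayesian network has many more nodes covering location, depth, slip, and sensor evidence):

```python
import numpy as np

levels = ["no threat", "advisory", "warning"]
prior = np.array([0.90, 0.07, 0.03])      # prior over site-specific levels

# Likelihood of the observed (uncertain) magnitude estimate under each level
likelihood = np.array([0.05, 0.40, 0.80])

posterior = prior * likelihood            # Bayes' rule, then normalize
posterior /= posterior.sum()
```

Even a strongly "no threat" prior shifts noticeably toward the higher warning levels once an observation that is much more likely under those levels arrives, which is exactly the kind of quantified uncertainty a decision maker can act on.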
Records from ocean bottom seismometers (OBSs) are highly contaminated by noise, which is much stronger than at most land stations, especially on the horizontal components. As a consequence, the high energy of oceanic noise at frequencies below 1 Hz considerably complicates the analysis of teleseismic earthquake signals recorded by OBSs.
Previous studies suggested different approaches to remove low-frequency noise from OBS recordings but focused mainly on the vertical component. The records of the horizontal components, which are crucial for many methods in passive seismological analysis of body and surface waves, could not be much improved in the teleseismic frequency band. Here we introduce a noise-reduction method, derived from the harmonic-percussive separation algorithms used in Zali et al. (2021), that separates long-lasting narrowband signals from broadband transients in the OBS signal. This leads to significant noise reduction of OBS records on both the vertical and horizontal components and increases the earthquake signal-to-noise ratio (SNR) without distorting the broadband earthquake waveforms, as demonstrated through tests with synthetic data. Both SNR and cross-correlation coefficients show significant improvements for different realistic noise realizations. The application of denoised signals in surface-wave analysis and receiver functions is discussed through tests with synthetic and real data.
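The generic median-filtering idea behind harmonic-percussive separation (in the style of FitzGerald, 2010; not the exact algorithm of Zali et al.) can be sketched on a toy spectrogram: medians along time keep long-lasting narrowband ridges, medians along frequency keep broadband transients, and soft masks split the two.

```python
import numpy as np

def running_median(a, k, axis):
    """Running median of odd length k along an axis, edge-padded."""
    pad = [(0, 0), (0, 0)]
    pad[axis] = (k // 2, k // 2)
    ap = np.pad(a, pad, mode="edge")
    windows = [np.take(ap, range(i, i + a.shape[axis]), axis=axis)
               for i in range(k)]
    return np.median(np.stack(windows), axis=0)

# Toy magnitude "spectrogram": rows = frequency bins, columns = time frames.
S = np.zeros((64, 128))
S[20, :] = 1.0    # long-lasting narrowband signal (noise-like, horizontal)
S[:, 60] = 1.0    # broadband transient (earthquake-like, vertical)

H = running_median(S, 17, axis=1)  # smoothing in time keeps horizontal ridges
P = running_median(S, 17, axis=0)  # smoothing in frequency keeps vertical ones

mask_h = H / (H + P + 1e-12)       # soft mask for the narrowband component
narrowband_part = mask_h * S       # would be subtracted as "noise"
transient_part = (1.0 - mask_h) * S
```

On real data the same masking is applied to an STFT of the seismogram before inverting back to the time domain.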
The Ceres earthquake of 29 September 1969 is the largest known earthquake in southern Africa. Digitized analog recordings from Worldwide Standardized Seismograph Network stations (Powell and Fries, 1964) are used to retrieve the point-source moment tensor and the most likely centroid depth of the event using full-waveform modeling. A scalar seismic moment of 2.2-2.4 x 10^18 N·m, corresponding to a moment magnitude of 6.2-6.3, is found. The analysis confirms the pure strike-slip mechanism previously determined from onset polarities by Green and Bloch (1971). Overall, good agreement with the fault orientation previously estimated from local aftershock recordings is found. The centroid depth can be constrained to less than 15 km. In a second analysis step, we use a higher-order moment-tensor-based inversion scheme for simple extended rupture models to constrain the lateral fault dimensions. We find that rupture propagated unilaterally for 4.7 s from east-southeast to west-northwest for about 17 km (average rupture velocity of about 3.6 km/s).
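The reported numbers are easy to check with the standard moment-magnitude relation (Hanks and Kanamori, 1979) and the quoted rupture length and duration:

```python
import math

def moment_magnitude(m0_newton_metre):
    """Hanks & Kanamori (1979): Mw = (2/3) * (log10(M0 [N*m]) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metre) - 9.1)

mw_low = moment_magnitude(2.2e18)    # ~6.16, i.e. Mw 6.2
mw_high = moment_magnitude(2.4e18)   # ~6.19
v_rupture = 17.0 / 4.7               # km/s, ~3.6 km/s average
```

Both ends of the moment range round to the quoted Mw 6.2-6.3, and 17 km over 4.7 s gives the average rupture velocity.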
The most recent intense earthquake swarm in the Vogtland lasted from 6 October 2008 until January 2009. Magnitudes exceeded M 3.5 several times in October, making it the largest swarm since 1985/86. In contrast to the swarms of 1985 and 2000, the seismic moment release was concentrated near the swarm onset. The focal area and temporal evolution are similar to those of the swarm in 2000. Our working hypothesis is that ascending upper-mantle fluids trigger swarm earthquakes at a low stress level. To monitor the seismicity, the University of Potsdam operated a small-aperture seismic array at 10 km epicentral distance between 18 October 2008 and 18 March 2009. Consisting of 12 seismic stations and 3 additional microphones, the array is capable of detecting earthquakes down to very low magnitudes (M < -1) as well as associated air waves. We use array techniques to determine properties of the incoming wavefield: noise, direct P and S waves, and converted phases.
We introduce a method for computing instantaneous polarization attributes from multicomponent signals. It improves on the standard covariance method (SCM) because it does not depend on the window size used to compute the covariance matrix. We overcome the window-size problem by deriving an approximate analytical formula for the cross-energy matrix in which the time window is determined automatically and adaptively. The proposed method improves polarization analysis as applied to multicomponent seismic data, for example in waveform separation and filtering.
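For context, the standard covariance method that this approach improves upon can be sketched in a few lines: the eigenvalues of the windowed 2x2 covariance matrix define the polarization ellipse, and their ratio distinguishes rectilinear from elliptical motion (the synthetic signals and the fixed window are illustrative; the window-size sensitivity is precisely the SCM weakness the abstract addresses).

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
# Rectilinear (P-wave-like) and circular (Rayleigh-wave-like) 2-C motions
linear = np.vstack([np.cos(2 * np.pi * 10 * t),
                    0.5 * np.cos(2 * np.pi * 10 * t)])
circular = np.vstack([np.cos(2 * np.pi * 10 * t),
                      np.sin(2 * np.pi * 10 * t)])

def ellipticity(window):
    """Minor/major axis ratio from the 2x2 covariance-matrix eigenvalues."""
    lam = np.sort(np.linalg.eigvalsh(np.cov(window)))
    return np.sqrt(max(lam[0], 0.0) / lam[1])

e_linear = ellipticity(linear)      # near 0: rectilinear
e_circular = ellipticity(circular)  # near 1: circular
```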
In low-seismicity regions, such as France or Germany, the estimation of probabilistic seismic hazard must cope with the difficult identification of active faults and the small amount of seismic data available. Since the probabilistic hazard method was introduced, most studies have assumed a Poissonian occurrence of earthquakes. Here we propose a method that enables the inclusion of time and space dependences between earthquakes in the probabilistic estimation of hazard. Combining the Epidemic Type Aftershock Sequence (ETAS) seismicity model with a Monte Carlo technique, aftershocks are naturally accounted for in the hazard determination. The method is applied to the Pyrenees region in southern France. The impact on hazard of declustering and of the usual assumption that earthquakes occur according to a Poisson process is quantified, showing that aftershocks contribute on average less than 5 per cent to the probabilistic hazard, with an upper bound around 18 per cent.
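A minimal one-generation branching sketch conveys the flavour of an ETAS-style Monte Carlo simulation (all parameter values are invented; a full ETAS model cascades over generations and adds Omori-law time delays and spatial kernels):

```python
import numpy as np

rng = np.random.default_rng(1)

b, alpha, k0 = 1.0, 0.9, 0.008       # GR b-value, productivity parameters
mu, years, m_min = 5.0, 1000.0, 3.0  # background rate (events/yr), duration

# Poissonian background events with Gutenberg-Richter magnitudes
n_bg = rng.poisson(mu * years)
mags = m_min + rng.exponential(1.0 / (b * np.log(10.0)), n_bg)
mags = np.minimum(mags, 6.5)         # truncate the GR tail

# Each background event triggers aftershocks with exponentially
# magnitude-dependent expected productivity
n_aft = rng.poisson(k0 * 10.0 ** (alpha * (mags - m_min)))

aftershock_fraction = n_aft.sum() / (n_bg + n_aft.sum())
```

Repeating such simulations and feeding each synthetic catalogue through a hazard calculation is what lets the aftershock contribution to hazard be quantified.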
In the estimation of dispersion with the help of wavelet analysis, considerable emphasis has been placed on extracting the group velocity using the modulus of the wavelet transform. In this paper we give an asymptotic expression for the full propagator in wavelet space that comprises the phase velocity as well. This operator establishes a relationship between the signals observed at two different stations during wave propagation in a dispersive and attenuating medium. Numerical and experimental examples are presented to show that the method accurately models seismic wave dispersion and attenuation.
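The phase information the modulus discards can be illustrated with a simple two-station Fourier analogue (not the wavelet propagator of the paper): at a single frequency, the phase of the cross-spectrum gives the travel-time delay and hence the phase velocity. All numbers are invented, and the sketch assumes the phase lag stays below pi (no cycle ambiguity, which real dispersion analysis must resolve).

```python
import numpy as np

fs, f0 = 200.0, 5.0         # sampling rate (Hz), analysis frequency (Hz)
dx, c_true = 100.0, 2500.0  # station separation (m), true phase velocity (m/s)
t = np.arange(0, 4.0, 1.0 / fs)

s1 = np.sin(2 * np.pi * f0 * t)
s2 = np.sin(2 * np.pi * f0 * (t - dx / c_true))  # arrival delayed by dx/c

S1, S2 = np.fft.rfft(s1), np.fft.rfft(s2)
k = int(round(f0 * t.size / fs))                 # FFT bin of f0
dphi = np.angle(S1[k] * np.conj(S2[k]))          # phase lag, assumes |dphi| < pi
c_est = dx / (dphi / (2 * np.pi * f0))           # c = dx / travel-time delay
```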
In this paper, two sets of earthquake ground-motion relations for estimating peak ground and response spectral acceleration are developed for sites in southern Spain and southern Norway using a recently published composite approach. For this purpose, seven empirical ground-motion relations developed from strong-motion data recorded in different parts of the world were employed. The different relations were first adjusted through a number of transformations to convert the differing choices of independent parameters to a single one. After these transformations, which account for the scatter they introduce, were performed, the equations were modified to account for differences between the host and target regions, using the stochastic method to compute host-to-target conversion factors. Finally, functions were fitted to the derived ground-motion estimates to obtain sets of seven individual equations for use in probabilistic seismic hazard assessment for southern Spain and southern Norway. The relations are compared with local ones published for the two regions. The composite methodology calls for setting up independent logic trees for the median values and for the sigma values, in order to properly separate epistemic and aleatory uncertainties after the corrections and conversions.
The deterministic calculation of earthquake scenarios using complete waveform modelling plays an increasingly important role in estimating shaking hazard in seismically active regions. Here we apply 3-D numerical modelling of seismic wave propagation to M 6+ earthquake scenarios in the Lower Rhine Embayment, one of the seismically most active regions in central Europe. Using a 3-D basin model derived from geology, borehole information and seismic experiments, we aim to demonstrate the strong dependence of ground shaking on hypocentre location and basin structure. The simulations are carried out up to frequencies of ca. 1 Hz. As expected, the basin structure leads to strong lateral variations in peak ground motion, amplification and shaking duration. Depending on the source-basin-receiver geometry, the effects correlate with basin depth and the slope of the basin flanks; yet the basin also affects peak ground motion, and the shaking hazard estimated from it, outside the basin. Comparison with measured seismograms for one of the earthquakes shows that some of the main characteristics of the wave motion are reproduced. Cumulating the derived seismic intensities from the three modelled earthquake scenarios leads to a predominantly basin-correlated intensity distribution for our study area.
In this paper, we propose a method for surface-wave characterization based on the deformation of the wavelet transform of the analysed signal. The phase velocity, the group velocity and the attenuation coefficient are estimated using a model-based approach to determine the propagation operator in the wavelet domain, which depends nonlinearly on a set of unknown parameters. These parameters explicitly define the phase velocity, the group velocity and the attenuation. Under the assumption that the difference between waveforms observed at a pair of stations is solely due to the dispersion characteristics and the intrinsic attenuation of the medium, we then seek the set of unknown parameters of this model. Finding the model parameters then becomes an optimization problem, which is solved by minimizing an appropriately defined cost function. We show that, unlike time-frequency methods that exploit only the squared modulus of the transform, we can achieve a complete characterization of surface waves in a dispersive and attenuating medium. Using both synthetic examples and experimental data, we also show that it is in principle possible to separate different modes in both the time domain and the frequency domain.
An important task of seismic hazard assessment consists of estimating the rate of seismic moment release, which is correlated with the rate of tectonic deformation and the seismic coupling. However, estimates of deformation depend on the type of information utilized (e.g. geodetic, geological, seismic) and include large uncertainties. We therefore estimate the deformation rate in the Lower Rhine Embayment (LRE), Germany, using an integrated approach in which the uncertainties have been systematically incorporated. On the basis of a new homogeneous earthquake catalogue, we initially determine the frequency-magnitude distribution by statistical methods. In particular, we focus on an adequate estimation of the upper bound of the Gutenberg-Richter relation and demonstrate the importance of additional palaeoseismological information. The integration of seismological and geological information yields a probability distribution for the upper-bound magnitude. Using this distribution together with the distribution of Gutenberg-Richter a and b values, we perform Monte Carlo simulations to derive the seismic moment release as a function of observation time. The seismic moment release estimated from synthetic earthquake catalogues of short length is found to systematically underestimate the long-term moment rate, which can be determined analytically. The moment release recorded in the LRE over the last 250 yr is in good agreement with the probability distribution resulting from the Monte Carlo simulations. Furthermore, the long-term distribution is, within its uncertainties, consistent with the moment rate derived from geological measurements, indicating almost complete seismic coupling in this region. By means of Kostrov's formula, we additionally calculate the full deformation-rate tensor using the distribution of known focal mechanisms in the LRE. Finally, we use the same approach to calculate the seismic moment and deformation rates for two subsets of the catalogue corresponding to the east- and west-dipping faults, respectively.
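The systematic underestimation from short catalogues can be reproduced in a stripped-down Monte Carlo sketch (all parameter values are invented): synthetic catalogues are drawn from a truncated Gutenberg-Richter law, magnitudes are converted to scalar moment via log10 M0 = 1.5*Mw + 9.1 (N*m), and the simulated rates are compared with the analytical long-term rate.

```python
import numpy as np

rng = np.random.default_rng(7)

m_min, m_max, b = 2.0, 6.5, 1.0
rate, years = 10.0, 250.0          # events/yr above m_min; catalogue length

beta = b * np.log(10.0)
norm = 1.0 - np.exp(-beta * (m_max - m_min))

moment_rates = np.empty(2000)
for i in range(moment_rates.size):
    n = rng.poisson(rate * years)
    u = rng.random(n)
    m = m_min - np.log(1.0 - u * norm) / beta     # inverse-CDF sampling
    moment_rates[i] = np.sum(10.0 ** (1.5 * m + 9.1)) / years

# Analytical long-term moment rate for the same truncated GR law
c = 1.5 * np.log(10.0)
mean_m0 = 10.0 ** (1.5 * m_min + 9.1) * beta / norm \
          * (np.exp((c - beta) * (m_max - m_min)) - 1.0) / (c - beta)
long_term_rate = rate * mean_m0
```

Because the moment sum is dominated by the rare largest events, a typical 250-yr catalogue falls short of the analytical rate even though the ensemble mean matches it, which is the bias the abstract describes.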
The statistics of time delays between successive earthquakes has recently been claimed to be universal and to show the existence of clustering beyond the duration of aftershock bursts. We demonstrate that these claims are unjustified. Stochastic simulations with Poissonian background activity and triggered Omori-type aftershock sequences are shown to reproduce the interevent-time distributions observed on different spatial and magnitude scales in California. Thus the empirical distribution can be explained without any additional long-term clustering. Furthermore, we find that the shape of the interevent-time distribution, which can be approximated by the gamma distribution, is determined by the percentage of mainshocks in the catalog. This percentage can be calculated from the mean and variance of the interevent times and varies between 5% and 90% for different regions in California. Our investigation of stochastic simulations indicates that the interevent-time distribution provides a nonparametric reconstruction of the mainshock magnitude-frequency distribution that is superior to standard declustering algorithms.
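The link between clustering and the gamma shape of the interevent-time distribution can be illustrated with a toy simulation (cluster parameters invented, simpler than a full Omori sequence): a Poissonian catalogue yields a shape near 1, while adding aftershock bursts pushes the mean-squared-over-variance shape estimate well below 1.

```python
import numpy as np

rng = np.random.default_rng(3)

def gamma_shape(times):
    """mu^2 / sigma^2 of the interevent times; equals the shape
    parameter when the interevent times are gamma distributed."""
    dt = np.diff(np.sort(times))
    return dt.mean() ** 2 / dt.var()

T = 10000.0

# Purely Poissonian catalogue: exponential interevent times, shape ~ 1
poisson_times = rng.uniform(0.0, T, 2000)

# Clustered catalogue: Poissonian mainshocks, each followed by a short
# burst of aftershocks, which lowers the shape parameter
main = rng.uniform(0.0, T, 1000)
aft = [tm + rng.exponential(0.5, rng.poisson(1.0)) for tm in main]
clustered_times = np.concatenate([main] + aft)

shape_poisson = gamma_shape(poisson_times)
shape_clustered = gamma_shape(clustered_times)
```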
Logic trees are widely used in probabilistic seismic hazard analysis as a tool to capture the epistemic uncertainty associated with the seismogenic sources and with the ground-motion prediction models used in estimating the hazard. Combining two or more ground-motion relations within a logic tree will generally require several conversions to be made, because several definitions are available for both the predicted ground-motion parameters and the explanatory parameters within the predictive ground-motion relations. Procedures for making the conversions for each of these factors are presented, using a suite of predictive equations in current use for illustration. The sensitivity of the resulting ground-motion models to these conversions is shown to be pronounced for some of the parameters, especially the measure of source-to-site distance, highlighting the need to take into account any incompatibilities among the selected equations. Procedures are also presented for assigning weights to the branches in the ground-motion section of the logic tree in a transparent fashion, considering both the intrinsic merits of the individual equations and their degree of applicability to the particular application.
We introduce a method of wavefield separation for multicomponent data sets based on the continuous wavelet transform. Our method is a further generalization of the approach proposed by Morozov and Smithson: by using the continuous wavelet transform, we can achieve a better separation of wave types by designing the filter in the time-frequency domain. Furthermore, using the instantaneous polarization attributes defined in the wavelet domain, we show how to construct filters tailored to separate different wave types (elliptically or linearly polarized), followed by an inverse wavelet transform to obtain the desired wave type in the time domain. Using synthetic and experimental data, we show how the present method can be used for wavefield separation.
This study presents results of ambient noise measurements from temporary single-station and small-scale array deployments in the northeast of Basle. H/V spectral ratios were determined along various profiles crossing the eastern master fault of the Rhine Rift Valley and the adjacent sedimentary rift fills. The fundamental H/V peak frequencies decrease along the profile towards the east, consistent with the dip of the Tertiary sediments within the rift. Using existing empirical relationships between H/V frequency peaks and the depth of the dominant seismic contrast, derived on the basis of the lambda/4-resonance hypothesis and a power-law depth dependence of the S-wave velocity, we obtain thicknesses of the rift fill from about 155 m in the west to 280 m in the east. This is in agreement with previous studies. The array analysis of the ambient noise wavefield yielded a stable dispersion relation consistent with Rayleigh-wave propagation velocities. We conclude that a significant amount of surface-wave energy is contained in the observed wavefield. The computed ellipticity of fundamental-mode Rayleigh waves for the velocity-depth models used to estimate the sediment thicknesses agrees with the observed H/V spectra over a large frequency band.
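The simplest form of the lambda/4-resonance hypothesis, for a constant-velocity layer, relates thickness directly to the fundamental peak frequency: h = Vs / (4 * f0). The velocity and peak frequencies below are hypothetical values chosen only to be consistent with the 155-280 m range quoted above (the study itself used a power-law depth-dependent Vs, which makes the relation nonlinear):

```python
# Quarter-wavelength rule for a constant-velocity sediment layer
def thickness(vs_m_per_s, f0_hz):
    return vs_m_per_s / (4.0 * f0_hz)

# Illustrative back-calculation: Vs = 400 m/s with peak frequencies that
# reproduce the reported western and eastern thicknesses
h_west = thickness(400.0, 0.645)   # ~155 m
h_east = thickness(400.0, 0.357)   # ~280 m
```

Lower fundamental frequencies thus map to thicker sediments, matching the observed eastward decrease of the H/V peaks.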