Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step in successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task, and applying standard detection techniques such as the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows classification to start from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action makes it possible to learn classifier properties from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. Especially the latter feature is a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system on a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast setup of a well-working classification system.
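The core idea of the abstract above — learn one hidden Markov model per signal class and assign an incoming sequence to the class whose model scores it highest — can be illustrated with a minimal sketch. This is not the authors' implementation: it uses a discrete two-symbol observation alphabet and hand-specified toy model parameters ("noise" and "event" are hypothetical class names), whereas real seismic classifiers work on continuous waveform features learned from data.

```python
import math

def forward_loglik(obs, pi, A, B):
    # Log-likelihood of a discrete observation sequence under an HMM,
    # computed with the scaled forward algorithm to avoid underflow.
    # pi: initial state probs, A: transition matrix, B: emission matrix.
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]  # forward probs at t=0
    loglik = 0.0
    for t, o in enumerate(obs):
        if t > 0:
            alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                     for i in range(n)]
        scale = sum(alpha)
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]  # rescale to keep values finite
    return loglik

def classify(obs, models):
    # Pick the class whose HMM assigns the highest likelihood to obs.
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

# Toy per-class models (pi, A, B) over symbols {0: quiet, 1: energetic}:
models = {
    "noise": ([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]], [[0.9, 0.1], [0.9, 0.1]]),
    "event": ([1.0, 0.0], [[0.7, 0.3], [0.0, 1.0]], [[0.2, 0.8], [0.6, 0.4]]),
}
```

A sequence dominated by energetic symbols, e.g. `classify([1, 1, 1, 0, 1, 0], models)`, is then labeled `"event"`; an all-quiet sequence falls to `"noise"`.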
Computing the magnitude of an earthquake requires correcting for propagation effects from the source to the receivers. This is often accomplished by performing numerical simulations using a suitable Earth model. In this work, the energy magnitude M(e) is considered, and its determination is performed using theoretical spectral amplitude decay functions over teleseismic distances based on the global Earth model AK135Q. Since the high-frequency part of the source spectrum (above the corner frequency) has to be considered in computing M(e), the influence of propagation and site effects may not be negligible, and they could bias single-station M(e) estimates. Therefore, in this study we assess the inter- and intrastation distributions of errors by considering the M(e) residuals computed for a large data set of earthquakes recorded at teleseismic distances by seismic stations deployed worldwide. To separate the inter- and intrastation contributions of errors, we apply a maximum likelihood approach to the M(e) residuals. We show that the interstation errors (describing a sort of site effect for a station) are within +/- 0.2 magnitude units for most stations, and that their spatial distribution reflects the expected lateral variation affecting the velocity and attenuation of the Earth's structure in the uppermost layers, not accounted for by the 1-D AK135Q model. The variance of the intrastation error distribution (describing the record-to-record component of variability) is larger than the interstation one (0.240 against 0.159), and the spatial distribution of the errors is not random but shows specific patterns depending on the source-to-station paths. The empirically determined set of coefficients may be used in the future to account for heterogeneities of the real Earth not considered in the theoretical calculations of the spectral amplitude decay functions used to correct the recorded data for propagation effects.
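The separation described above — splitting magnitude residuals into a per-station (interstation) term and a record-to-record (intrastation) remainder — can be sketched with a simple moment-based decomposition. Note that this is a simplified stand-in for the maximum likelihood approach the study actually applies: here the station term is just the mean residual at each station, and the two variance components are computed directly from the station terms and the within-station remainders.

```python
from collections import defaultdict
from statistics import mean, pvariance

def decompose_residuals(records):
    # records: list of (station, residual) pairs.
    # Returns per-station terms, interstation variance (between stations),
    # and intrastation variance (record-to-record, within stations).
    by_station = defaultdict(list)
    for sta, r in records:
        by_station[sta].append(r)
    # Interstation term: mean residual per station ("site effect" proxy).
    station_terms = {sta: mean(rs) for sta, rs in by_station.items()}
    # Intrastation remainder: residual minus its station's term.
    within = [r - station_terms[sta] for sta, r in records]
    inter_var = pvariance(list(station_terms.values()))
    intra_var = pvariance(within)
    return station_terms, inter_var, intra_var
```

For example, `decompose_residuals([("A", 0.1), ("A", 0.3), ("B", -0.2), ("B", 0.0)])` yields station terms of 0.2 for A and -0.1 for B, with the remaining scatter attributed to the record-to-record component.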
Due to increasing demands and competition for high-quality groundwater resources in many parts of the world, there is an urgent need for efficient methods that shed light on the interplay between complex natural settings and anthropogenic impacts. Thus, a new approach is introduced that aims to identify and quantify the predominant processes or factors of influence driving groundwater and lake water dynamics on a catchment scale. The approach involves a non-linear dimension reduction method called Isometric feature mapping (Isomap). This method is applied to time series of groundwater head and lake water level data from a complex geological setting in Northeastern Germany. Two factors explaining more than 95% of the observed spatial variations are identified: (1) the anthropogenic impact of a waterworks in the study area and (2) natural groundwater recharge with different degrees of dampening at the respective sites of observation. The approach enables a presumption-free assessment of the existing geological conception of the catchment, leading to its extension. Previously unknown hydraulic connections between two aquifers are identified, as are connections between surface water bodies and groundwater. (C) 2014 Elsevier B.V. All rights reserved.
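Isomap's distinguishing step is the one sketched below: instead of straight-line (Euclidean) distances between observation sites' time-series vectors, it approximates geodesic distances along the data manifold via shortest paths through a k-nearest-neighbour graph. This is a minimal pure-Python sketch of that first stage only, under toy assumptions; the full method (as used in the study) then embeds these distances into a low-dimensional space with classical multidimensional scaling, which is omitted here.

```python
import math

def geodesic_distances(points, k=2):
    # Isomap stage 1: approximate on-manifold distances by shortest paths
    # through a k-nearest-neighbour graph (edges symmetrized).
    n = len(points)
    d = [[math.dist(p, q) for q in points] for p in points]
    INF = float("inf")
    g = [[INF] * n for _ in range(n)]
    for i in range(n):
        g[i][i] = 0.0
        # Keep only the k nearest neighbours of i (skip i itself).
        for j in sorted(range(n), key=lambda j: d[i][j])[1:k + 1]:
            g[i][j] = g[j][i] = d[i][j]
    # Floyd-Warshall all-pairs shortest paths over the neighbour graph.
    for m in range(n):
        for i in range(n):
            for j in range(n):
                if g[i][m] + g[m][j] < g[i][j]:
                    g[i][j] = g[i][m] + g[m][j]
    return g
```

For three points along a bend, `geodesic_distances([(0, 0), (1, 1), (2, 0)], k=1)` gives a distance of 2*sqrt(2) between the endpoints (the path through the middle point), larger than their Euclidean distance of 2 — exactly the curvature information Isomap preserves and plain linear methods discard.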