A novel idea for optimal time-delay state space reconstruction from uni- and multivariate time series is presented. The entire embedding process is treated as a game, in which each move corresponds to an embedding cycle and is subject to evaluation through an objective function. The embedding procedure can thus be modeled as a tree, in which each leaf holds a specific value of the objective function. Using a Monte Carlo ansatz, the proposed algorithm populates the tree with many leaves by computing different possible embedding paths, and the final embedding is chosen as the path that ends at the leaf with the lowest achieved value of the objective function. The method aims to prevent getting stuck in a local minimum of the objective function and can be used in a modular way, enabling practitioners to choose both a statistic for possible delays in each embedding cycle and a suitable objective function themselves. The proposed method guarantees optimization of the chosen objective function over the parameter space of the delay embedding, provided the tree is sampled sufficiently. As a proof of concept, we demonstrate the superiority of the proposed method over classical time-delay embedding methods in a variety of application examples. We compare recurrence-plot-based statistics inferred from reconstructions of a Lorenz-96 system and highlight improved forecast accuracy for map-like model data as well as for palaeoclimate isotope time series. Finally, we utilize state space reconstruction for the detection of causality, and its strength, between observables of a gas-turbine-type thermoacoustic combustor.
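The core idea of the abstract, sampling many embedding "paths" (delay sequences) and keeping the one with the lowest objective value, can be sketched minimally as follows. This is not the authors' algorithm: the random path sampling, the parameter ranges, and the objective (mean nearest-neighbour distance in the reconstructed space) are all illustrative placeholders standing in for the tree search and the user-chosen objective function described in the text.

```python
import numpy as np

rng = np.random.default_rng(42)

def embed(x, delays):
    """Time-delay embedding of a scalar series with the given delays."""
    max_d = max(delays)
    n = len(x) - max_d
    return np.column_stack([x[max_d - d : max_d - d + n] for d in delays])

def objective(Y):
    """Toy stand-in for an objective function: mean nearest-neighbour
    distance in the reconstructed state space."""
    D = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(D, np.inf)
    return D.min(axis=1).mean()

def mc_embedding(x, max_dim=4, max_delay=15, n_trials=50):
    """Monte Carlo sampling of embedding 'paths' (delay sequences);
    each trial corresponds to one leaf of the embedding tree, and the
    path ending at the lowest objective value is returned."""
    best_cost, best_delays = np.inf, None
    for _ in range(n_trials):
        dim = int(rng.integers(2, max_dim + 1))
        extra = rng.choice(np.arange(1, max_delay + 1),
                           size=dim - 1, replace=False)
        delays = [0] + sorted(int(d) for d in extra)
        cost = objective(embed(x, delays))
        if cost < best_cost:
            best_cost, best_delays = cost, delays
    return best_cost, best_delays

# toy univariate series: noisy sinusoid
x = np.sin(0.1 * np.arange(500)) + 0.05 * rng.standard_normal(500)
cost, delays = mc_embedding(x)
```

In the proposed method each "move" would be guided by a delay statistic rather than drawn uniformly at random, but the minimum-over-sampled-leaves structure is the same.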
Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step in successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task, and applying standard detection techniques such as the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows classification to start from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and a few hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. Especially the latter feature is a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast setup of a well-working classification system.
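The STA/LTA trigger mentioned above as the standard baseline is simple enough to sketch directly: it compares a short-term moving average of the signal energy with a long-term one and declares a trigger when the ratio exceeds a threshold. The window lengths, threshold, and synthetic "event" below are illustrative choices, not values from the study.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Classical STA/LTA characteristic function: ratio of a short-term
    to a long-term trailing average of the signal energy."""
    energy = x ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # short-term average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # long-term average
    # align both windows so they end on the same sample
    n = min(len(sta), len(lta))
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(5000)                       # background noise
trace[3000:3200] += np.sin(np.linspace(0, 40 * np.pi, 200))   # synthetic "event"

cf = sta_lta(trace, n_sta=50, n_lta=1000)
triggered = bool(np.any(cf > 4.0))   # illustrative trigger threshold
```

The manual step the abstract seeks to avoid is exactly what follows a trigger like this one: deciding *which* class of event caused it, which is where the per-class hidden Markov models come in.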
Computing the magnitude of an earthquake requires correcting for the propagation effects from the source to the receivers. This is often accomplished by performing numerical simulations using a suitable Earth model. In this work, the energy magnitude Me is considered, and its determination is performed using theoretical spectral amplitude decay functions over teleseismic distances based on the global Earth model AK135Q. Since the high-frequency part of the source spectrum (above the corner frequency) has to be considered in computing Me, the influence of propagation and site effects may not be negligible and could bias single-station Me estimates. Therefore, in this study we assess the inter- and intrastation distributions of errors by considering the Me residuals computed for a large data set of earthquakes recorded at teleseismic distances by seismic stations deployed worldwide. To separate the inter- and intrastation error contributions, we apply a maximum likelihood approach to the Me residuals. We show that the interstation errors (describing a kind of site effect for a station) are within +/- 0.2 magnitude units for most stations, and that their spatial distribution reflects the expected lateral variation in the velocity and attenuation of the Earth's structure in the uppermost layers, which is not accounted for by the 1-D AK135Q model. The variance of the intrastation error distribution (describing the record-to-record component of variability) is larger than the interstation one (0.240 against 0.159), and the spatial distribution of the errors is not random but shows specific patterns depending on the source-to-station paths. The set of empirically determined coefficients may be used in the future to account for heterogeneities of the real Earth not considered in the theoretical calculations of the spectral amplitude decay functions used to correct the recorded data for propagation effects.
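The separation of magnitude residuals into an interstation (per-station bias) and an intrastation (record-to-record) component is a classical one-way random-effects decomposition. As a sketch of the idea, not the authors' maximum likelihood implementation, the following uses the balanced-design moment estimator on synthetic residuals; the station count, record count, and "true" standard deviations are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic residuals: r_ij = eta_i (interstation bias) + eps_ij (record error)
n_sta, n_rec = 40, 30
tau, sigma = 0.15, 0.25                      # assumed "true" std. deviations
eta = tau * rng.standard_normal(n_sta)       # one bias per station
r = eta[:, None] + sigma * rng.standard_normal((n_sta, n_rec))

# one-way ANOVA moment estimates of the two variance components
grand = r.mean()
sta_means = r.mean(axis=1)
ms_within = ((r - sta_means[:, None]) ** 2).sum() / (n_sta * (n_rec - 1))
ms_between = n_rec * ((sta_means - grand) ** 2).sum() / (n_sta - 1)

sigma2_intra = ms_within                                  # record-to-record
sigma2_inter = max((ms_between - ms_within) / n_rec, 0.0)  # station-to-station
```

With an unbalanced record count per station, as in a real teleseismic data set, an iterative maximum likelihood scheme replaces these closed-form moments, but the two recovered components play the same roles as the 0.159 and 0.240 variances quoted in the abstract.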
Due to increasing demands on and competition for high-quality groundwater resources in many parts of the world, there is an urgent need for efficient methods that shed light on the interplay between complex natural settings and anthropogenic impacts. We therefore introduce a new approach that aims to identify and quantify the predominant processes or factors of influence driving groundwater and lake water dynamics at the catchment scale. The approach involves a non-linear dimension reduction method called Isometric feature mapping (Isomap), which is applied to time series of groundwater head and lake water level data from a complex geological setting in Northeastern Germany. Two factors explaining more than 95% of the observed spatial variations are identified: (1) the anthropogenic impact of a waterworks in the study area, and (2) natural groundwater recharge with different degrees of dampening at the respective observation sites. The approach enables a presumption-free assessment of the existing geological conception of the catchment, leading to an extension of that conception. Previously unknown hydraulic connections between two aquifers are identified, as are connections between surface water bodies and groundwater.
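The workflow of embedding many monitoring sites by the similarity of their level time series can be sketched with scikit-learn's Isomap (assuming that library; the synthetic two-factor data below, with a step-like pumping signal and a seasonal recharge signal, only mimics the kind of structure described in the abstract and is not the study's data).

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(2)

# synthetic stand-in for site time series: each column is one observation
# site, driven by two latent factors (waterworks pumping + seasonal recharge)
t = np.arange(1500)
pumping = np.where(t > 700, 1.0, 0.0)        # step-like anthropogenic impact
recharge = np.sin(2 * np.pi * t / 365.0)     # seasonal recharge signal
n_sites = 25
w = rng.uniform(0, 1, size=(2, n_sites))     # site-specific mixing weights
heads = (np.outer(pumping, w[0]) + np.outer(recharge, w[1])
         + 0.05 * rng.standard_normal((len(t), n_sites)))

# embed the *sites* (columns) in 2-D: sites responding to the same latent
# driver end up close together in the low-dimensional map
coords = Isomap(n_neighbors=5, n_components=2).fit_transform(heads.T)
```

Inspecting the two embedding coordinates against known drivers (pumping records, recharge estimates) is what allows factors like the two reported in the abstract to be identified and interpreted.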