#### First comparison of array-derived rotational ground motions with direct ring laser measurements (2006)

Recently, ring laser technology has provided the first consistent observations of rotational ground motions around a vertical axis induced by earthquakes. "Consistent," in this context, implies that the observed waveforms and amplitudes are compatible with collocated recordings of translational ground motions. In particular, transverse accelerations should be in phase with rotation rate and their ratio proportional to local horizontal phase velocity assuming plane-wave propagation. The ring laser installed at the Fundamental station Wettzell in the Bavarian Forest, Southeast Germany, is recording the rotation rate around a vertical axis, theoretically a linear combination of the space derivatives of the horizontal components of motion. This suggests that, in principle, rotation can be derived from seismic-array experiments by "finite differencing." This has been attempted previously in several studies; however, the accuracy of these observations could never be tested in the absence of direct measurements. We installed a double cross-shaped array of nine stations from December 2003 to March 2004 around the ring laser instrument and observed several large earthquakes on both the ring laser and the seismic array. Here we present for the first time a comparison of array-derived rotations with direct measurements of rotations for ground motions induced by the M 6.3 Al Hoceima, Morocco, earthquake of 24 February 2004. With complete 3D synthetic seismograms calculated for this event we show that even low levels of noise may considerably influence the accuracy of the array-derived rotations when the minimum number of required stations (three) is used. Nevertheless, when using all nine stations, the overall fit between direct and array-derived measurements is surprisingly good (maximum correlation coefficient of 0.94).
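The array "finite differencing" idea can be sketched numerically: fit a plane to each horizontal component across the array and combine the fitted spatial gradients into a rotation rate about the vertical axis, Ω_z = ½(∂v_y/∂x − ∂v_x/∂y). The function below is a minimal, hypothetical version of this (least-squares gradient fit over station offsets), not the authors' actual processing code:

```python
import numpy as np

def array_rotation_rate(coords, vx, vy):
    """Estimate the rotation rate about the vertical axis from array data.

    coords : (n, 2) station offsets [m] relative to a reference point
    vx, vy : (n,) horizontal velocity samples at one time instant [m/s]

    Fits a plane v ~ c0 + c1*x + c2*y to each component (least squares)
    and returns 0.5 * (dvy/dx - dvx/dy).
    """
    A = np.hstack([np.ones((len(coords), 1)), coords])  # columns: 1, x, y
    gx, *_ = np.linalg.lstsq(A, vx, rcond=None)  # [v0, dvx/dx, dvx/dy]
    gy, *_ = np.linalg.lstsq(A, vy, rcond=None)  # [v0, dvy/dx, dvy/dy]
    return 0.5 * (gy[1] - gx[2])

# Sanity check on a rigid-rotation field vx = -omega*y, vy = omega*x,
# for which the estimator should recover omega exactly.
coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0],
                   [100.0, 100.0], [-50.0, 30.0]])
omega = 1e-6
rot = array_rotation_rate(coords, -omega * coords[:, 1], omega * coords[:, 0])
```

With only three stations the fit is exactly determined and any noise maps directly into the gradients, which is consistent with the sensitivity to noise reported above; more stations overdetermine the plane fit and average noise down.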

#### Slow Fourier transform (2013)

Shallowly situated evaporites in built-up areas are of relevance for urban and cultural development and hydrological regulation. The hazard of sinkholes, subrosion depressions and gypsum karst is often difficult to evaluate and may change quickly with anthropogenic influence. The geophysical exploration of evaporites in metropolitan areas is often not feasible with active industrial techniques. We collect and combine different types of passive geophysical data, such as microgravity, ambient vibrations, deformation and hydrological information, to study the roof morphology of shallow evaporites beneath Hamburg, Northern Germany. The application of a novel gravity inversion technique leads to a 3-D depth model of the salt diapir under study. We compare the gravity-based depth model to pseudo-depths from H/V measurements and depth estimates from small-scale seismological array data. While the general range and trend of the diapir roof are consistent, a few anomalous regions are identified where H/V pseudo-depths indicate shallower structures not observed in gravity or array data. These are interpreted as shallow residual caprock floaters and zones of increased porosity. The shallow salt structure clearly correlates with a relative subsidence on the order of 2 mm yr⁻¹. The combined interpretation of roof morphology, yearly subsidence rates, chemical analyses of groundwater and of hydraulic head in aquifers indicates that the salt diapir beneath Hamburg is subject to significant ongoing dissolution that may affect subrosion depressions, sinkhole distribution and land usage. The combined analysis of passive geophysical data may serve as an example for the study of shallow evaporites beneath other urban areas.
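A common first-order way to turn an H/V fundamental resonance frequency into a pseudo-depth is the quarter-wavelength relation h = Vs / (4 f0) for a soft layer over a stiff half-space. The function below is a minimal sketch of that generic relation with illustrative values; it is not the calibration actually used in the study:

```python
def hv_pseudo_depth(f0_hz, vs_m_per_s):
    """Quarter-wavelength pseudo-depth of a resonating soft layer.

    f0_hz      : H/V fundamental resonance frequency [Hz]
    vs_m_per_s : average shear-wave velocity of the layer [m/s] (assumed)
    Returns the interface (here: diapir roof) pseudo-depth in metres.
    """
    return vs_m_per_s / (4.0 * f0_hz)

# Illustrative numbers only: f0 = 1 Hz and Vs = 400 m/s give 100 m.
depth = hv_pseudo_depth(1.0, 400.0)
```

Because the result scales linearly with the assumed Vs, a biased velocity model shifts all pseudo-depths, which is one reason such estimates are cross-checked against independent gravity and array constraints as done above.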

We investigate the usefulness of complex flood damage models for predicting relative damage to residential buildings in a spatial and temporal transfer context. We apply eight different flood damage models to predict relative building damage for five historic flood events in two different regions of Germany. Model complexity is measured in terms of the number of explanatory variables, which varies from 1 to 10 variables singled out from 28 candidate variables. Model validation is based on empirical damage data, and observation uncertainty is taken into consideration. The comparison of model predictive performance shows that additional explanatory variables besides the water depth improve the predictive capability in a spatial and temporal transfer context, i.e., when the models are transferred to different regions and different flood events. Concerning the trade-off between predictive capability and reliability, the model structure seems more important than the number of explanatory variables. Among the models considered, the reliability of Bayesian network-based predictions in space-time transfer is larger than for the remaining models, and the uncertainties associated with damage predictions are reflected more completely.
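The simplest end of the model-complexity range described above is a one-variable depth-damage function scored against empirical damage records from a different region or event. The functional form, coefficient, and data below are entirely illustrative and are not the models or observations from the study:

```python
import numpy as np

def depth_damage(depth_m, coef=0.15):
    """Hypothetical single-variable depth-damage function.

    Maps water depth [m] to relative building damage in [0, 1] using an
    illustrative square-root shape; coef is a made-up fit parameter.
    """
    return np.clip(coef * np.sqrt(np.maximum(depth_m, 0.0)), 0.0, 1.0)

# Transfer-style validation sketch: score the model on damage records
# from a region it was not fitted to (all values invented).
depth_b = np.array([0.2, 0.5, 1.0, 2.0, 3.5])       # observed depths [m]
observed_b = np.array([0.05, 0.08, 0.16, 0.22, 0.30])  # relative damage
mae = np.mean(np.abs(depth_damage(depth_b) - observed_b))
```

Comparing such a baseline MAE against multi-variable models on the same transfer set is one simple way to operationalize the predictive-capability comparison described above.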

One of the major challenges in engineering seismology is the reliable prediction of site-specific ground motion for particular earthquakes observed at specific distances. For larger events, a special problem arises at short distances with the source-to-site distance measure, because distance metrics based on a point-source model are no longer appropriate. As a consequence, different attenuation relations differ in the distance metric that they use. In addition to being a source of confusion, this makes it difficult to quantitatively compare or combine different ground-motion models, for example, in the context of Probabilistic Seismic Hazard Assessment, in cases where ground-motion models with different distance metrics occupy neighboring branches of a logic tree. In such a situation, very crude assumptions about source sizes and orientations often have to be used to derive an estimate of the particular metric required. Even if this solves the problem of providing a number to put into the attenuation relation, a serious problem remains: when converting distance measures, the corresponding uncertainties map onto the estimated ground motions according to the laws of error propagation. To make matters worse, conversion of distance metrics can cause the uncertainties of the adapted ground-motion model to become magnitude and distance dependent, even if they are not in the original relation. To treat this problem quantitatively, the variability increase caused by the distance metric conversion has to be quantified. For this purpose, we have used well-established scaling laws to determine explicit distance conversion relations using regression analysis on simulated data. We demonstrate that, for all practical purposes, most popular distance metrics can be related to the Joyner-Boore distance using models based on gamma distributions to express the shape of the "residual function." The functional forms are magnitude and distance dependent and are expressed as polynomials. We compare the performance of these relations with manually derived individual distance estimates for the Landers, Imperial Valley, and Chi-Chi earthquakes.
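The error-propagation point can be illustrated with a small Monte Carlo experiment: evaluate a ground-motion relation at Joyner-Boore distances drawn from an assumed conversion with scatter, and measure the extra standard deviation this induces in the predicted log ground motion. The attenuation coefficients, conversion relation, and scatter below are all made up for illustration and are not the relations derived in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical attenuation relation: ln(PGA) = a - b * ln(R_jb + c)
a, b, c = 1.0, 1.2, 10.0
def ln_pga(r_jb):
    return a - b * np.log(r_jb + c)

# Suppose R_jb is only available through a conversion from epicentral
# distance, with residual scatter sigma_conv (illustrative values).
r_epi = 30.0                 # km, known epicentral distance
mean_r_jb = 0.9 * r_epi      # hypothetical mean conversion
sigma_conv = 5.0             # km, scatter of the conversion
samples = mean_r_jb + sigma_conv * rng.standard_normal(100_000)
samples = np.clip(samples, 0.0, None)  # distances cannot be negative

# Ground-motion variability induced purely by the distance conversion;
# it adds to the relation's own sigma and, because the slope of ln_pga
# changes with distance, it is distance dependent.
extra_sigma = ln_pga(samples).std()
```

A first-order check: the induced sigma is roughly |d ln(PGA)/dR| × sigma_conv = b × sigma_conv / (R + c), so it grows as the site moves closer to the source, matching the magnitude- and distance-dependent uncertainty described above.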

This study presents an unsupervised feature selection and learning approach for the discovery and intuitive imaging of significant temporal patterns in seismic single-station or network recordings. For this purpose, the data are parametrized by real-valued feature vectors for short time windows using standard analysis tools for seismic data, such as frequency-wavenumber, polarization, and spectral analysis. We use Self-Organizing Maps (SOMs) for a data-driven feature selection, visualization and clustering procedure, which is particularly suitable for high-dimensional data sets. Our feature selection method is based on significance testing using the Wald-Wolfowitz runs test for individual features and on correlation hunting with SOMs in feature subsets. Using synthetics composed of Rayleigh and Love waves and real-world data, we show the robustness and the improved discriminative power of this approach compared to feature subsets manually selected from individual wavefield parametrization methods. Furthermore, the capability of the clustering and visualization techniques to investigate the discrimination of wave phases is shown by means of synthetic waveforms and regional earthquake recordings.
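The Wald-Wolfowitz runs test applied to an individual feature can be sketched as follows: sort the feature values, count runs of class labels in the sorted order, and standardize the run count; far fewer runs than expected under randomness (a strongly negative z-value) indicates that the feature separates the classes. This is a minimal generic implementation, not the authors' code:

```python
import numpy as np

def runs_test_z(values, labels):
    """Wald-Wolfowitz runs test z-statistic for one feature.

    values : (n,) real-valued feature samples
    labels : (n,) two-class labels for each sample
    Sorts values, counts runs of labels in the sorted order, and
    standardizes with the classical mean/variance of the run count.
    """
    order = np.argsort(values)
    s = np.asarray(labels)[order]
    runs = 1 + int(np.sum(s[1:] != s[:-1]))
    n1 = int(np.sum(s == s[0]))
    n2 = len(s) - n1
    n = n1 + n2
    mu = 2.0 * n1 * n2 / n + 1.0
    var = 2.0 * n1 * n2 * (2.0 * n1 * n2 - n) / (n**2 * (n - 1))
    return (runs - mu) / np.sqrt(var)

# A feature that separates the two classes perfectly yields only 2 runs.
z = runs_test_z(np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0]),
                np.array([0, 0, 0, 1, 1, 1]))
```

In a selection pipeline, features whose z falls below a significance threshold would be kept before the SOM-based correlation hunting stage described above.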

In probabilistic seismic-hazard analysis, epistemic uncertainties are commonly treated within a logic-tree framework in which the branch weights express the degree of belief of an expert in a set of models. For the calculation of the distribution of hazard curves, these branch weights represent subjective probabilities. A major challenge for experts is to provide logically consistent weight estimates (in the sense of Kolmogorov's axioms), to be aware of the multitude of heuristics, and to minimize the biases that affect human judgment under uncertainty. We introduce a platform-independent, interactive program enabling us to quantify, elicit, and transfer expert knowledge into a set of subjective probabilities by applying experimental design theory, following the approach of Curtis and Wood (2004). Instead of determining the set of probabilities for all models in a single step, the computer-driven elicitation process is performed as a sequence of evaluations of relative weights for small subsets of models. From these, the probabilities for the whole model set are determined as a solution of an optimization problem. The result of this process is a set of logically consistent probabilities together with a measure of confidence, determined from the amount of conflicting information provided by the expert during the relative weighting process. We experiment with different scenarios simulating likely expert behaviors in the context of knowledge elicitation and show the impact this has on the results. The overall aim is to provide a smart elicitation technique, and our findings serve as a guide for practical applications.
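The step from relative weights on small model subsets to one consistent probability set can be sketched as a log-linear least-squares problem: each judgment "model i is `ratio` times as likely as model j" becomes a linear equation in log-weights, and the fit residual reflects how much the judgments conflict. The pairwise judgments below are hypothetical, and the actual program's experimental-design-driven optimization and confidence measure are more elaborate than this sketch:

```python
import numpy as np

# Hypothetical pairwise judgments over 4 models: (i, j, ratio) means
# the expert rates model i as `ratio` times as likely as model j.
pairs = [(0, 1, 2.0), (1, 2, 1.5), (0, 2, 4.0), (2, 3, 2.0)]
n_models = 4

# Each judgment gives log(w_i) - log(w_j) = log(ratio); solve in the
# least-squares sense with the gauge condition log(w_0) = 0.
rows, rhs = [], []
for i, j, ratio in pairs:
    r = np.zeros(n_models)
    r[i], r[j] = 1.0, -1.0
    rows.append(r)
    rhs.append(np.log(ratio))
rows.append(np.eye(n_models)[0])  # gauge: fixes the free overall scale
rhs.append(0.0)

logw, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
w = np.exp(logw)
p = w / w.sum()  # logically consistent probabilities (non-negative, sum to 1)
```

Note that the judgments above are deliberately inconsistent (the chain 0→1→2 implies a ratio of 3, not the stated 4); the least-squares residual absorbs the conflict, playing a role analogous to the confidence measure described above.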