We evaluate the Hamiltonian particle method (HPM) and the Nambu discretization applied to the shallow-water equations on the sphere using the test suggested by Galewsky et al. (2004). Both simulations show excellent conservation of energy and are stable in long-term simulations. We also repeat the test using the ICOSWP scheme for comparison with the two conservative spatial discretization schemes. The HPM simulation captures the main features of the reference solution, but a wavenumber-5 pattern dominates the simulations on the ICON grid at relatively low spatial resolutions. Nevertheless, the agreement in statistics between the three schemes indicates qualitatively similar behavior in long-term integration.
Ternutator identities
(2009)
The ternary commutator or ternutator, defined as the alternating sum of the product of three operators, has recently drawn much attention as an interesting structure generalizing the commutator. The ternutator satisfies cubic identities analogous to the quadratic Jacobi identity for the commutator. We present various forms of these identities and discuss the possibility of using them to define ternary algebras.
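For concreteness, the ternutator of three matrices can be computed directly from its definition as an alternating sum over all six orderings; a minimal sketch (the function name and the matrix representation of the operators are illustrative, not taken from the paper):

```python
import itertools
import numpy as np

def ternutator(a, b, c):
    """Alternating sum of the products of three operators:
    [a, b, c] = sum over permutations sigma in S3 of
                sgn(sigma) * x_sigma(0) x_sigma(1) x_sigma(2)."""
    ops = (a, b, c)
    even = {(0, 1, 2), (1, 2, 0), (2, 0, 1)}  # even permutations carry sign +1
    total = np.zeros_like(a)
    for perm in itertools.permutations(range(3)):
        sign = 1 if perm in even else -1
        total = total + sign * ops[perm[0]] @ ops[perm[1]] @ ops[perm[2]]
    return total
```

Because the sum is alternating, the ternutator vanishes whenever two arguments coincide and changes sign under any transposition of its arguments, mirroring the antisymmetry of the ordinary commutator.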
Multisymplectic methods have recently been proposed as a generalization of symplectic ODE methods to the case of Hamiltonian PDEs. Their excellent long-time behavior for a variety of Hamiltonian wave equations has been demonstrated in a number of numerical studies. A theoretical investigation and justification of multisymplectic methods is still largely missing. In this paper, we study linear multisymplectic PDEs and their discretization by means of numerical dispersion relations. It is found that multisymplectic methods in the sense of Bridges and Reich [Phys. Lett. A, 284 (2001), pp. 184-193] and Reich [J. Comput. Phys., 157 (2000), pp. 473-499], such as Gauss-Legendre Runge-Kutta methods, possess a number of desirable properties such as nonexistence of spurious roots and conservation of the sign of the group velocity. A certain CFL-type restriction on Delta t/Delta x might be required for methods of order higher than second in time. It is also demonstrated by means of the explicit midpoint method that multistep methods may exhibit spurious roots in the numerical dispersion relation for any value of Delta t/Delta x despite being multisymplectic in the sense of discrete variational mechanics [J. E. Marsden, G. P. Patrick, and S. Shkoller, Commun. Math. Phys., 199 (1999), pp. 351-395].
We analyze the notions of monotonicity and complete monotonicity for continuous-time Markov chains taking values in a finite partially ordered set. As in discrete time, the two notions are not equivalent. However, we show that there are partially ordered sets for which monotonicity and complete monotonicity coincide in continuous time but not in discrete time.
Finding non-Gaussian components of high-dimensional data is an important preprocessing step for efficient information processing. This article proposes a new linear method to identify the "non-Gaussian subspace" within a very general semi-parametric framework. Our proposed method, called NGCA (non-Gaussian component analysis), is based on a linear operator which, to any arbitrary nonlinear (smooth) function, associates a vector belonging to the low-dimensional non-Gaussian target subspace, up to an estimation error. By applying this operator to a family of different nonlinear functions, one obtains a family of different vectors lying in a vicinity of the target space. As a final step, the target space itself is estimated by applying PCA to this family of vectors. We show that this procedure is consistent in the sense that the estimation error tends to zero at a parametric rate, uniformly over the family. Numerical examples demonstrate the usefulness of our method.
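The pipeline described in the abstract (operator applied to a family of nonlinear functions, then PCA on the resulting vectors) can be sketched in a few lines. Everything below is an illustrative assumption rather than the paper's exact estimator: the synthetic data, the tanh test functions, and the Stein-identity form of the operator, beta_h = E[x h(x)] - E[grad h(x)], which is valid for whitened data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: coordinate 0 is Rademacher (strongly non-Gaussian),
# the remaining coordinates are standard normal, so the covariance is
# (approximately) the identity -- i.e. the data are already whitened.
n, d = 20000, 5
x = rng.standard_normal((n, d))
x[:, 0] = rng.choice([-1.0, 1.0], size=n)

# For each smooth h, form beta_h = E[x h(x)] - E[grad h(x)].  By Stein's
# identity this vanishes along purely Gaussian directions, so beta_h lies
# (up to estimation error) near the non-Gaussian target subspace.
betas = []
for _ in range(200):
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    h = np.tanh(x @ w)                    # h(x) = tanh(w . x)
    grad = (1.0 - h**2)[:, None] * w      # grad h(x) = sech^2(w . x) * w
    betas.append((x * h[:, None]).mean(axis=0) - grad.mean(axis=0))

# Final PCA step: the leading right singular vector of the stacked betas
# estimates the (here one-dimensional) non-Gaussian subspace.
_, _, vt = np.linalg.svd(np.array(betas))
estimate = vt[0]
```

With this synthetic data the dominant entry of `estimate` is expected on coordinate 0, recovering the planted non-Gaussian direction.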
The field equations following from a Lagrangian L(R) will be deduced and solved for special cases. If L is a non-linear function of the curvature scalar, then these equations are of fourth order in the metric. In the introduction we present the history of these equations, beginning with the paper by H. Weyl from 1918, who first discussed them as an alternative to Einstein's theory. In the third part, we give details on the cosmic no-hair theorem, i.e., how within fourth-order gravity with L = R + R^2 the inflationary phase of cosmic evolution turns out to be a transient attractor. Finally, the Bicknell theorem, i.e. the conformal relation from fourth-order gravity to scalar-tensor theory, will be briefly presented.
Formal Poincaré lemma
(2007)
Creation of topographic maps
(2014)
Location analyses are among the most common tasks when working with spatial data and geographic information systems. Automating the most frequently used procedures is therefore an important aspect of improving their usability. In this context, this project aims to design and implement a workflow providing some basic tools for a location analysis. For the implementation with jABC, the workflow was applied to the problem of finding a suitable location for placing an artificial reef. For this analysis, three parameters (bathymetry, slope, and grain size of the ground material) were taken into account, processed, and visualized with The Generic Mapping Tools (GMT), which were integrated into the workflow as jETI-SIBs. The implemented workflow thereby showed that combining jABC with GMT results in a user-centric and user-friendly tool with high-quality cartographic output.