We introduce the concept of TRAP (Traces and Permutations), which can roughly be viewed as a wheeled PROP (Products and Permutations) without unit. TRAPs are equipped with a horizontal concatenation and partial trace maps.
Continuous morphisms on an infinite-dimensional topological space and smooth kernels (respectively, smoothing operators) on a closed manifold form a TRAP but not a wheeled PROP.
We build the free objects in the category of TRAPs as TRAPs of graphs and show that a TRAP can be completed to a unitary TRAP (or wheeled PROP).
We further show that it can be equipped with a vertical concatenation, which, on the TRAP of linear homomorphisms of a vector space, amounts to the usual composition. The vertical concatenation in the TRAP of smooth kernels gives rise to generalised convolutions.
Graphs whose vertices are decorated by smooth kernels (respectively, smoothing operators) on a closed manifold form a TRAP. From their universal properties we build smooth amplitudes associated with these graphs.
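On the TRAP of linear maps, the partial trace operation contracts one input slot with one output slot. A minimal NumPy sketch (the function name, index convention, and identity-map check are ours, not from the paper):

```python
import numpy as np

def partial_trace(A):
    """Contract the first output slot with the first input slot of a
    linear map V (x) V -> V (x) V, given as a 4-index array A[o1, o2, i1, i2].
    Returns the induced linear map V -> V as a 2-index array."""
    return np.einsum('kbkd->bd', A)

d = 3
# Identity on V (x) V, reshaped into its four tensor indices.
I2 = np.eye(d * d).reshape(d, d, d, d)
# Tracing out one tensor slot of the identity yields d times the identity on V.
traced = partial_trace(I2)
```

The full trace of an endomorphism of V is recovered by contracting its single input slot with its single output slot, which is the usual matrix trace.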
In this article we prove upper bounds for the Laplace eigenvalues λ_k below the essential spectrum for strictly negatively curved Cartan-Hadamard manifolds. Our bound is given in terms of k² and specific geometric data of the manifold. This also applies to the particular case of non-compact manifolds whose sectional curvature tends to −∞, where no essential spectrum is present due to a theorem of Donnelly and Li. The result stands in clear contrast to Laplacians on graphs, where such a bound fails to be true in general.
The spatio-temporal epidemic type aftershock sequence (ETAS) model is widely used to describe the self-exciting nature of earthquake occurrences. While traditional inference methods provide only point estimates of the model parameters, we aim at a fully Bayesian treatment of model inference, which naturally allows us to incorporate prior knowledge and to quantify the uncertainty of the resulting estimates. To this end, we introduce a highly flexible, non-parametric representation of the spatially varying ETAS background intensity through a Gaussian process (GP) prior. Combined with classical triggering functions, this results in a new model formulation, namely the GP-ETAS model. We enable tractable and efficient Gibbs sampling by deriving an augmented form of the GP-ETAS inference problem. This novel sampling approach allows us to assess the posterior model variables conditioned on observed earthquake catalogues, i.e., the spatial background intensity and the parameters of the triggering function. Empirical results on two synthetic data sets indicate that GP-ETAS outperforms standard models and thus demonstrate its predictive power for observed earthquake catalogues, including uncertainty quantification for the estimated parameters. Finally, a case study for the L'Aquila region, Italy, with the devastating event on 6 April 2009, is presented.
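The ETAS conditional intensity is the sum of a background rate and contributions from all past events. A hedged sketch with standard ETAS ingredients (Omori-law temporal decay, exponential magnitude productivity, Gaussian spatial kernel); all parameter values are illustrative, not the paper's estimates, and GP-ETAS replaces the constant background by a Gaussian-process draw:

```python
import numpy as np

def etas_intensity(t, x, y, events, mu=0.2, K=0.05, alpha=1.5,
                   c=0.01, p=1.3, sigma=0.5, m0=3.0):
    """Conditional intensity lambda(t, x, y) of a simple spatio-temporal
    ETAS model. `events` holds rows (t_i, x_i, y_i, m_i) of past events.
    Parameter values are invented for illustration."""
    lam = mu  # constant background rate (a GP prior in GP-ETAS)
    for t_i, x_i, y_i, m_i in events:
        if t_i >= t:
            continue  # only earlier events trigger
        dt = t - t_i
        productivity = K * np.exp(alpha * (m_i - m0))
        omori = (p - 1) / c * (1 + dt / c) ** (-p)       # normalised Omori law
        r2 = (x - x_i) ** 2 + (y - y_i) ** 2
        spatial = np.exp(-r2 / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
        lam += productivity * omori * spatial
    return lam

past = np.array([[0.0, 0.0, 0.0, 5.0]])            # one magnitude-5 event at the origin
lam_near = etas_intensity(0.1, 0.0, 0.0, past)     # elevated by the aftershock term
lam_far = etas_intensity(0.1, 100.0, 100.0, past)  # essentially the background mu
```

The temporal kernel integrates to one, so `K * exp(alpha * (m - m0))` is the expected number of direct aftershocks per event.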
A sufficient quantitative understanding of aluminium (Al) toxicokinetics (TK) in humans is still lacking, although it is highly desirable for risk assessment of Al exposure. Baseline exposure and the risk of contamination severely limit the feasibility of TK studies administering the naturally occurring isotope Al-27, both in animals and humans. These limitations are absent in studies with Al-26 as a tracer, but tissue data are limited to animal studies. A TK model capable of inter-species translation to make valid predictions of Al levels in humans, especially in toxicologically relevant tissues such as bone and brain, is urgently needed. Here, we present: (i) a curated dataset comprising all eligible studies with single doses of Al-26 tracer administered as citrate or chloride salts orally and/or intravenously to rats and humans, including ultra-long-term kinetic profiles for plasma, blood, liver, spleen, muscle, bone, brain, kidney, and urine up to 150 weeks; and (ii) the development of a physiology-based (PB) model for Al TK after intravenous and oral administration of aqueous Al citrate and Al chloride solutions in rats and humans. Based on the comprehensive curated Al-26 dataset, we estimated substance-dependent parameters within a non-linear mixed-effect modelling context. The model fitted the heterogeneous Al-26 data very well and was successfully validated against datasets in rats and humans. The presented PBTK model for Al, based on the most extensive and diverse dataset of Al exposure to date, constitutes a major advancement in the field, thereby paving the way towards a more quantitative risk assessment in humans.
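The core of any physiology-based TK model is a system of linear exchange ODEs between compartments. A toy two-compartment sketch (plasma and bone with urinary elimination; the compartment choice, rate constants, and Euler scheme are ours for illustration, not the paper's model):

```python
def simulate_two_compartment(dose, k_pb=0.05, k_bp=0.002, k_el=0.3,
                             dt=0.01, t_end=100.0):
    """Toy linear kinetic model: plasma <-> bone exchange plus
    elimination from plasma via urine. Rate constants (per week)
    and the forward-Euler step are invented for illustration."""
    n = int(t_end / dt)
    plasma, bone = dose, 0.0
    for _ in range(n):
        d_plasma = -(k_pb + k_el) * plasma + k_bp * bone
        d_bone = k_pb * plasma - k_bp * bone
        plasma += dt * d_plasma
        bone += dt * d_bone
    return plasma, bone

# Simulate 100 weeks after a unit intravenous dose.
plasma, bone = simulate_two_compartment(dose=1.0)
```

The slow back-transfer `k_bp` makes bone a long-term depot, which is the qualitative behaviour the ultra-long-term Al-26 profiles are needed to constrain.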
We derive Onsager-Machlup functionals for countable product measures on weighted ℓ^p subspaces of the sequence space ℝ^ℕ. Each measure in the product is a shifted and scaled copy of a reference probability measure on ℝ that admits a sufficiently regular Lebesgue density. We study the equicoercivity and Gamma-convergence of sequences of Onsager-Machlup functionals associated with convergent sequences of measures within this class. We use these results to establish analogous results for probability measures on separable Banach or Hilbert spaces, including Gaussian, Cauchy, and Besov measures with summability parameter 1 ≤ p ≤ 2. Together with Part I of this paper, this provides a basis for analysis of the convergence of maximum a posteriori estimators in Bayesian inverse problems and most likely paths in transition path theory.
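For orientation, the standard Gaussian instance of an Onsager-Machlup functional (a textbook fact in this literature, not a result specific to the paper) reads:

```latex
% For a centred Gaussian measure \mu = N(0, C) on a separable Hilbert
% space H, with Cameron--Martin space E = C^{1/2}H, the Onsager--Machlup
% functional is
%   I(u) = \tfrac{1}{2}\lVert u \rVert_E^2
%        = \tfrac{1}{2}\lVert C^{-1/2} u \rVert_H^2, \qquad u \in E,
% characterised by the small-ball limit
%   \lim_{r \to 0} \frac{\mu(B_r(u))}{\mu(B_r(0))} = \exp\bigl(-I(u)\bigr).
```

The product measures of the paper generalise this picture to non-Gaussian reference densities, where the functional is built from the negative log-density instead of a squared norm.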
We introduce the class of "smooth rough paths" and study their main properties. Working in a smooth setting allows us to discard sewing arguments and focus on algebraic and geometric aspects. Specifically, a Maurer-Cartan perspective is the key to a purely algebraic form of Lyons' extension theorem, the renormalization of rough paths following up on [Bruned et al.: A rough path perspective on renormalization, J. Funct. Anal. 277(11), 2019], as well as a related notion of "sum of rough paths". We first develop our ideas in a geometric rough path setting, as this best resonates with recent works on signature varieties, as well as with the renormalization of geometric rough paths. We then explore extensions to the quasi-geometric and the more general Hopf algebraic setting.
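The first two signature levels of a smooth (here: piecewise-linear) path can be computed directly as iterated integrals; a hedged sketch (function name and sampling are ours), with the shuffle identity as a sanity check:

```python
import numpy as np

def signature_level2(path):
    """Levels 1 and 2 of the signature of a piecewise-linear path,
    path: array of shape (n, d). Level 1 is the total increment;
    level 2 is the matrix of iterated integrals
    int_{s<t} dgamma^i(s) dgamma^j(t), whose antisymmetric part is
    the Levy area."""
    increments = np.diff(path, axis=0)
    s1 = increments.sum(axis=0)
    d = path.shape[1]
    s2 = np.zeros((d, d))
    running = np.zeros(d)
    for dx in increments:
        # Contribution of this segment: cross terms with the path so far
        # plus the segment's own (triangular) self-integral.
        s2 += np.outer(running, dx) + 0.5 * np.outer(dx, dx)
        running += dx
    return s1, s2

t = np.linspace(0.0, 1.0, 50)
path = np.stack([t, t ** 2], axis=1)   # a smooth planar path, sampled
s1, s2 = signature_level2(path)
```

The shuffle relation `s2 + s2.T == outer(s1, s1)` holds exactly for any path, which is the first instance of the algebraic structure the smooth setting lets one isolate.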
The Bayesian solution to a statistical inverse problem can be summarised by a mode of the posterior distribution, i.e. a maximum a posteriori (MAP) estimator. The MAP estimator essentially coincides with the (regularised) variational solution to the inverse problem, seen as minimisation of the Onsager-Machlup (OM) functional of the posterior measure. An open problem in the stability analysis of inverse problems is to establish a relationship between the convergence properties of solutions obtained by the variational approach and by the Bayesian approach. To address this problem, we propose a general convergence theory for modes that is based on the Gamma-convergence of OM functionals, and apply this theory to Bayesian inverse problems with Gaussian and edge-preserving Besov priors. Part II of this paper considers more general prior distributions.
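In the linear-Gaussian case the coincidence of MAP estimation and variational regularisation is explicit: the OM functional is a Tikhonov functional and the MAP estimator solves its normal equations. A small sketch (problem sizes and noise levels are our choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear inverse problem y = A u + eta with eta ~ N(0, s2 I) and
# Gaussian prior u ~ N(0, g2 I). The Onsager-Machlup functional of the
# posterior is the Tikhonov functional
#   I(u) = ||A u - y||^2 / (2 s2) + ||u||^2 / (2 g2),
# and the MAP estimator is its minimiser.
A = rng.normal(size=(20, 5))
u_true = rng.normal(size=5)
s2, g2 = 0.01, 1.0
y = A @ u_true + rng.normal(scale=np.sqrt(s2), size=20)

# Normal equations for the minimiser of I(u).
H = A.T @ A / s2 + np.eye(5) / g2
u_map = np.linalg.solve(H, A.T @ y / s2)
```

Gamma-convergence of the OM functionals is exactly what is needed for such minimisers to converge when the prior (and hence the functional) is varied.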
Forecast verification (2021)
The philosophy of forecast verification is rather different between deterministic and probabilistic verification metrics: generally speaking, deterministic metrics measure differences, whereas probabilistic metrics assess reliability and sharpness of predictive distributions. This article considers the root-mean-square error (RMSE), which can be seen as a deterministic metric, and the probabilistic metric Continuous Ranked Probability Score (CRPS), and demonstrates that under certain conditions, the CRPS can be mathematically expressed in terms of the RMSE when these metrics are aggregated. One of the required conditions is the normality of distributions. The other condition is that, while the forecast ensemble need not be calibrated, any bias or over/underdispersion cannot depend on the forecast distribution itself. Under these conditions, the CRPS is a fraction of the RMSE, and this fraction depends only on the heteroscedasticity of the ensemble spread and the measures of calibration. The derived CRPS-RMSE relationship for the case of perfect ensemble reliability is tested on simulations of idealised two-dimensional barotropic turbulence. Results suggest that the relationship holds approximately despite the normality condition not being met.
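For Gaussian forecasts the CRPS has a well-known closed form, and for a perfectly reliable Gaussian ensemble the aggregated CRPS/RMSE ratio is the constant 1/√π ≈ 0.564. A sketch checking this by Monte Carlo (the experiment design is ours; the closed form is standard):

```python
import numpy as np
from math import erf, exp, pi, sqrt

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) at outcome y."""
    z = (y - mu) / sigma
    pdf = exp(-0.5 * z * z) / sqrt(2 * pi)
    cdf = 0.5 * (1 + erf(z / sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / sqrt(pi))

# Perfect reliability: outcomes drawn from the forecast distribution itself.
rng = np.random.default_rng(1)
sigma = 2.0
ys = rng.normal(0.0, sigma, size=100_000)
mean_crps = np.mean([crps_gaussian(0.0, sigma, y) for y in ys])
rmse = np.sqrt(np.mean(ys ** 2))
ratio = mean_crps / rmse   # close to 1/sqrt(pi)
```

Miscalibration that does not depend on the forecast distribution only rescales this fraction, which is the aggregation result the article derives.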
Alpine ecosystems on the Tibetan Plateau are being threatened by ongoing climate warming and intensified human activities. Ecological time-series obtained from sedimentary ancient DNA (sedaDNA) are essential for understanding past ecosystem and biodiversity dynamics on the Tibetan Plateau and their responses to climate change at a high taxonomic resolution. Hitherto, only a few, albeit promising, studies have been published on this topic. The potential and limitations of using sedaDNA on the Tibetan Plateau are not fully understood. Here, we (i) provide updated knowledge of and a brief introduction to the suitable archives, region-specific taphonomy, state-of-the-art methodologies, and research questions of sedaDNA on the Tibetan Plateau; (ii) review published and ongoing sedaDNA studies from the Tibetan Plateau; and (iii) give some recommendations for future sedaDNA study designs. Based on the current knowledge of taphonomy, we infer that deep glacial lakes with freshwater and high clay sediment input, such as those from the southern and southeastern Tibetan Plateau, may have a high potential for sedaDNA studies. Metabarcoding (for microorganisms and plants), metagenomics (for ecosystems), and hybridization capture (for prehistoric humans) are three primary sedaDNA approaches that have been successfully applied on the Tibetan Plateau, but their power is still limited by several technical issues, such as PCR bias and incompleteness of taxonomic reference databases. Setting up high-quality and open-access regional taxonomic reference databases for the Tibetan Plateau should be given priority in the future. To conclude, the archival, taphonomic, and methodological conditions of the Tibetan Plateau are favorable for performing sedaDNA studies. More research should be encouraged to address questions about long-term ecological dynamics at ecosystem scale and to bring the paleoecology of the Tibetan Plateau into a new era.
We propose a global geomagnetic field model for the last 14 thousand years, based on thermoremanent records. We call the model ArchKalmag14k. ArchKalmag14k is constructed by modifying recently proposed algorithms based on space-time correlations. Due to the amount of data and the complexity of the model, the full Bayesian posterior is numerically intractable. To tackle this, we sequentialize the inversion by implementing a Kalman filter with a fixed time step. Every step consists of a prediction, based on a degree-dependent temporal covariance, and a correction via Gaussian process regression. Dating errors are treated via a noisy input formulation. Cross correlations are reintroduced by a smoothing algorithm, and model parameters are inferred from the data. Due to the specific statistical nature of the proposed algorithms, the model comes with space- and time-dependent uncertainty estimates. The new model ArchKalmag14k shows less variation in the large-scale degrees than comparable models. Local predictions represent the underlying data and agree with comparable models where the location is sampled well. Uncertainties are larger for earlier times and in regions of sparse data coverage. We also use ArchKalmag14k to analyze the appearance and evolution of the South Atlantic anomaly together with reverse flux patches at the core-mantle boundary, considering the model uncertainties. While we find good agreement with earlier models for recent times, our model suggests a different evolution of intensity minima prior to 1650 CE. In general, our results suggest that prior to 6000 BCE the data are not sufficient to support global models.
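The predict/correct cycle described above can be sketched as one step of a linear Kalman filter; the matrices below are generic placeholders, not the degree-dependent covariances of ArchKalmag14k:

```python
import numpy as np

def kalman_step(m, P, y, F, Q, H, R):
    """One predict/correct cycle of a linear Kalman filter:
    a model-based prediction followed by a Gaussian correction
    (conditioning on the observation, as in GP regression)."""
    # Prediction: propagate mean and covariance through the dynamics F,
    # inflating with the process noise Q.
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # Correction: condition the Gaussian prediction on the observation y
    # with observation operator H and noise covariance R.
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = P_pred - K @ S @ K.T
    return m_new, P_new

# One step with identity dynamics and observation (toy numbers, ours).
m0, P0 = np.zeros(2), np.eye(2)
m1, P1 = kalman_step(m0, P0, y=np.ones(2), F=np.eye(2),
                     Q=0.1 * np.eye(2), H=np.eye(2), R=np.eye(2))
```

The posterior covariance `P_new` is what furnishes the space- and time-dependent uncertainty estimates; a smoothing pass then propagates information backwards in time.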