We consider the Cauchy problem for the heat equation in a cylinder C_T = X × (0, T) over a domain X in R^n, with data on a strip lying on the lateral surface. The strip is of the form S × (0, T), where S is an open subset of the boundary of X. The problem is ill-posed. Under natural restrictions on the configuration of S, we derive an explicit formula for solutions of this problem.
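The lateral Cauchy problem described above can be written schematically as follows (a sketch assembled from the abstract; the symbols f and g for the Cauchy data are assumed, not taken from the paper):

```latex
\begin{aligned}
  (\partial_t - \Delta)\, u &= 0 && \text{in } C_T = X \times (0,T),\\
  u &= f && \text{on } S \times (0,T),\\
  \partial_\nu u &= g && \text{on } S \times (0,T),
\end{aligned}
```

where ∂_ν denotes the outward normal derivative on the boundary of X. The task is to recover u in all of C_T from the data (f, g) prescribed only on the strip S × (0, T), which is what makes the problem ill-posed.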
When trying to extend the Hodge theory for elliptic complexes on compact closed manifolds to the case of compact manifolds with boundary, one is led to a boundary value problem for the Laplacian of the complex which is usually referred to as the Neumann problem. We study the Neumann problem for a larger class of sequences of differential operators on a compact manifold with boundary. These are sequences of small curvature, i.e., sequences with the property that the composition of any two neighbouring operators has order less than two.
Maximal subsemigroups of some semigroups of order-preserving mappings on a countably infinite set
(2017)
In this paper, we study the maximal subsemigroups of several semigroups of order-preserving transformations on the natural numbers and the integers, respectively. We determine all maximal subsemigroups of the monoid of all order-preserving injections on the set of natural numbers as well as on the set of integers. Further, we give all maximal subsemigroups of the monoid of all bijections on the integers. For the monoid of all order-preserving transformations on the natural numbers, we also classify all its maximal subsemigroups containing a particular set of transformations.
We show a connection between the CDE′ inequality introduced in Horn et al. (Volume doubling, Poincaré inequality and Gaussian heat kernel estimate for nonnegative curvature graphs. arXiv:1411.5087v2, 2014) and the CDψ inequality established in Münch (Li–Yau inequality on finite graphs via non-linear curvature dimension conditions. arXiv:1412.3340v1, 2014). In particular, we introduce a CDφψ inequality as a slight generalization of CDψ which turns out to be equivalent to CDE′ with appropriate choices of φ and ψ. We use this to prove that the CDE′ inequality implies the classical CD inequality on graphs, and that the CDE′ inequality with curvature bound zero holds on Ricci-flat graphs.
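For context, the classical curvature-dimension condition CD(K, n) referred to above is phrased in terms of the standard Bakry–Émery Γ-calculus (this is the textbook formulation, not notation taken from the paper): a graph Laplacian Δ satisfies CD(K, n) if, for all functions f,

```latex
\Gamma(f) = \tfrac{1}{2}\bigl(\Delta(f^2) - 2 f \,\Delta f\bigr), \qquad
\Gamma_2(f) = \tfrac{1}{2}\bigl(\Delta \Gamma(f) - 2\,\Gamma(f, \Delta f)\bigr),
```

```latex
\text{CD}(K, n):\qquad \Gamma_2(f) \;\ge\; \frac{1}{n}\,(\Delta f)^2 + K\,\Gamma(f).
```

The CDE′, CDψ, and CDφψ inequalities discussed in the abstract are non-linear modifications of this condition adapted to the discrete setting of graphs.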
In this study, we investigate the climatology of high-latitude total electron content (TEC) variations as observed by the dual-frequency Global Navigation Satellite Systems (GNSS) receivers onboard the Swarm satellite constellation. The distribution of TEC perturbations as a function of geographic/magnetic coordinates and seasons reasonably agrees with that of the Challenging Minisatellite Payload observations published earlier. Categorizing the high-latitude TEC perturbations according to line-of-sight directions between Swarm and GNSS satellites, we can deduce their morphology with respect to the geomagnetic field lines. In the Northern Hemisphere, the perturbation shapes are mostly aligned with the L shell surface, and this anisotropy is strongest in the nightside auroral (substorm) and subauroral regions and weakest in the central polar cap. The results are consistent with the well-known two-cell plasma convection pattern of the high-latitude ionosphere, which is approximately aligned with L shells at auroral regions and crossing different L shells for a significant part of the polar cap. In the Southern Hemisphere, the perturbation structures exhibit noticeable misalignment to the local L shells. Here the direction toward the Sun has an additional influence on the plasma structure, which we attribute to photoionization effects. The larger offset between geographic and geomagnetic poles in the south than in the north is responsible for the hemispheric difference.
The global prevalence of rapid and extensive land use change necessitates hydrologic modelling methodologies capable of handling non-stationarity. This is particularly true in the context of Hydrologic Forecasting using Data Assimilation. Data Assimilation has been shown to dramatically improve forecast skill in hydrologic and meteorological applications, although such improvements are conditional on using bias-free observations and model simulations. A hydrologic model calibrated to a particular set of land cover conditions has the potential to produce biased simulations when the catchment is disturbed. This paper sheds new light on the impacts of bias or systematic errors in hydrologic data assimilation, in the context of forecasting in catchments with changing land surface conditions and a model calibrated to pre-change conditions. We posit that in such cases, the impact of systematic model errors on assimilation or forecast quality is dependent on the inherent prediction uncertainty that persists even in pre-change conditions. Through experiments on a range of catchments, we develop a conceptual relationship between total prediction uncertainty and the impacts of land cover changes on the hydrologic regime to demonstrate how forecast quality is affected when using state estimation Data Assimilation with no modifications to account for land cover changes. This work shows that systematic model errors as a result of changing or changed catchment conditions do not always necessitate adjustments to the modelling or assimilation methodology, for instance through re-calibration of the hydrologic model, time varying model parameters or revised offline/online bias estimation.
Mental arithmetic is characterised by a tendency to overestimate addition and to underestimate subtraction results: the operational momentum (OM) effect. Here, motivated by contentious explanations of this effect, we developed and tested an arithmetic heuristics and biases model that predicts reverse OM due to cognitive anchoring effects. Participants produced bi-directional lines with lengths corresponding to the results of arithmetic problems. In two experiments, we found regular OM with zero problems (e.g., 3+0, 3-0) but reverse OM with non-zero problems (e.g., 2+1, 4-1). In a third experiment, we tested the prediction of our model. Our results suggest the presence of at least three competing biases in mental arithmetic: a more-or-less heuristic, a sign-space association and an anchoring bias. We conclude that mental arithmetic exhibits shortcuts for decision-making similar to traditional domains of reasoning and problem-solving.
Ancient genomes have revolutionized our understanding of Holocene prehistory and, particularly, the Neolithic transition in western Eurasia. In contrast, East Asia has so far received little attention, despite representing a core region at which the Neolithic transition took place independently ~3 millennia after its onset in the Near East. We report genome-wide data from two hunter-gatherers from Devil’s Gate, an early Neolithic cave site (dated to ~7.7 thousand years ago) located in East Asia, on the border between Russia and Korea. Both of these individuals are genetically most similar to geographically close modern populations from the Amur Basin, all speaking Tungusic languages, and, in particular, to the Ulchi. The similarity to nearby modern populations and the low levels of additional genetic material in the Ulchi imply a high level of genetic continuity in this region during the Holocene, a pattern that markedly contrasts with that reported for Europe.
Prospective and retrospective evaluation of five-year earthquake forecast models for California
(2017)
This paper is concerned with the filtering problem in continuous time. Three algorithmic solution approaches for this problem are reviewed: (i) the classical Kalman-Bucy filter, which provides an exact solution for the linear Gaussian problem; (ii) the ensemble Kalman-Bucy filter (EnKBF), which is an approximate filter and represents an extension of the Kalman-Bucy filter to nonlinear problems; and (iii) the feedback particle filter (FPF), which represents an extension of the EnKBF and furthermore provides for a consistent solution in the general nonlinear, non-Gaussian case. The common feature of the three algorithms is the gain times error formula to implement the update step (to account for conditioning due to the observations) in the filter. In contrast to the commonly used sequential Monte Carlo methods, the EnKBF and FPF avoid the resampling of the particles in the importance sampling update step. Moreover, the feedback control structure provides for error correction potentially leading to smaller simulation variance and improved stability properties. The paper also discusses the issue of nonuniqueness of the filter update formula and formulates a novel approximation algorithm based on ideas from optimal transport and coupling of measures. Performance of this and other algorithms is illustrated for a numerical example.
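The "gain times error" update common to the three filters can be illustrated with a minimal sketch of an ensemble Kalman-Bucy-type filter for a scalar linear Gaussian model, discretized by Euler–Maruyama. This is a generic illustration under assumed dynamics, not the paper's algorithm; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear Gaussian model: dX = a X dt + s dB,  dY = h X dt + r dW
a, s, h, r = -0.5, 0.5, 1.0, 0.2
dt, steps, N = 0.01, 500, 200          # time step, number of steps, ensemble size

x_true = 1.0                           # hidden signal
ens = rng.normal(1.0, 0.5, N)          # initial ensemble of particles

for _ in range(steps):
    # Simulate the truth and the observation increment
    x_true += a * x_true * dt + s * np.sqrt(dt) * rng.normal()
    dY = h * x_true * dt + r * np.sqrt(dt) * rng.normal()

    # Prediction step: propagate each particle through the model
    ens += a * ens * dt + s * np.sqrt(dt) * rng.normal(size=N)

    # Update step: gain times error (deterministic EnKBF-style feedback)
    m, P = ens.mean(), ens.var()
    K = P * h / r**2                          # Kalman gain from ensemble statistics
    ens += K * (dY - h * (ens + m) / 2 * dt)  # innovation drives each particle

print(f"truth={x_true:.3f}  estimate={ens.mean():.3f}")
```

Note that the update is a feedback control term applied to every particle; no importance weights or resampling appear, which is the point of contrast with sequential Monte Carlo made in the abstract.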
Broad-spectrum antibiotic combination therapy is frequently applied due to increasing resistance development of infective pathogens. The objective of the present study was to evaluate two common empiric broad-spectrum combination therapies consisting of either linezolid (LZD) or vancomycin (VAN) combined with meropenem (MER) against Staphylococcus aureus (S. aureus) as the most frequent causative pathogen of severe infections. A semimechanistic pharmacokinetic-pharmacodynamic (PK-PD) model mimicking a simplified bacterial life-cycle of S. aureus was developed upon time-kill curve data to describe the effects of LZD, VAN, and MER alone and in dual combinations. The PK-PD model was successfully (i) evaluated with external data from two clinical S. aureus isolates and further drug combinations and (ii) challenged to predict common clinical PK-PD indices and breakpoints. Finally, clinical trial simulations were performed that revealed that the combination of VAN-MER might be favorable over LZD-MER due to an unfavorable antagonistic interaction between LZD and MER.
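The general shape of a time-kill model like the one described can be sketched with a toy example: logistic bacterial growth combined with a concentration-dependent Emax kill term. This is a deliberately simplified illustration, not the paper's semimechanistic life-cycle model; all parameters and the function name are invented for the sketch.

```python
def simulate(conc, t_end=24.0, dt=0.01):
    """Return bacterial count (CFU/mL) after t_end hours at a fixed drug conc (mg/L)."""
    kg, bmax = 1.0, 1e9        # growth rate (1/h) and carrying capacity (hypothetical)
    kmax, ec50 = 2.0, 1.0      # max kill rate (1/h), half-maximal concentration (hypothetical)
    b = 1e6                    # starting inoculum
    for _ in range(int(t_end / dt)):
        growth = kg * b * (1 - b / bmax)          # logistic growth
        kill = kmax * conc / (ec50 + conc) * b    # Emax drug effect
        b = max(b + (growth - kill) * dt, 1.0)    # Euler step, floored at 1 CFU/mL
    return b

print(f"no drug: {simulate(0.0):.2e}, high dose: {simulate(8.0):.2e}")
```

In a full PK-PD analysis the concentration would itself follow a pharmacokinetic time course, and interaction terms between two drugs (e.g., the LZD-MER antagonism reported above) would modify the kill term.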
The knowledge of the largest expected earthquake magnitude in a region is one of the key issues in probabilistic seismic hazard calculations and the estimation of worst-case scenarios. Earthquake catalogues are the most informative source for the inference of earthquake magnitudes. We analysed the earthquake catalogue for Central Asia with respect to the largest expected magnitudes m_T in a pre-defined time horizon T_f, using a recently developed statistical methodology, extended by the explicit probabilistic consideration of magnitude errors. For this aim, we assumed broad error distributions for historical events, whereas the magnitudes of recently recorded instrumental earthquakes had smaller errors. The results indicate high probabilities for the occurrence of large events (M >= 8), even in short time intervals of a few decades. The expected magnitudes relative to the assumed maximum possible magnitude are generally higher for intermediate-depth earthquakes (51-300 km) than for shallow events (0-50 km). For long future time horizons, for example a few hundred years, earthquakes with M >= 8.5 have to be taken into account, although, apart from the 1889 Chilik earthquake, it is probable that no such event occurred during the observation period of the catalogue.