New porous materials based on covalently connected monomers are presented. The key step of the synthesis is an acetalisation reaction. In previous years we used acetalisation reactions extensively to build up various molecular rods. Building on this approach, we investigated porous polymeric materials. Here we present the results of these studies on the synthesis of 1D polyacetals and porous 3D polyacetals. Scrambling experiments with 1D acetals proved that exchange reactions occur between different building blocks, as evidenced by MALDI-TOF mass spectrometry. Based on these results we synthesized porous 3D polyacetals under the same mild conditions.
Flooding is an imminent natural hazard threatening most river deltas, e.g. the Mekong Delta. Appropriate flood management is thus required for the sustainable development of these often densely populated regions. Recently, traditional event-based hazard control has shifted towards a risk management approach in many regions, driven by intensive research leading to new legal regulations on flood management. However, a large-scale flood risk assessment does not exist for the Mekong Delta. In particular, the flood risk to paddy rice cultivation, the most important economic activity in the delta, has not been assessed yet. Therefore, the present study was developed to provide a first insight into delta-scale flood damages and risks to rice cultivation. The flood hazard was quantified by probabilistic flood hazard maps of the whole delta using bivariate extreme value statistics, synthetic flood hydrographs, and a large-scale hydraulic model. The flood risk to paddy rice was then quantified considering cropping calendars, rice phenology, and harvest times based on a time series of the enhanced vegetation index (EVI) derived from MODIS satellite data, and a published rice flood damage function. The proposed concept yielded flood risk maps for paddy rice in the Mekong Delta in terms of expected annual damage. Due to its generic approach, the concept can serve as a blueprint for regions facing similar problems. Furthermore, the changes in flood risk to paddy rice caused by land-use changes currently under discussion in the Mekong Delta were estimated. Two land-use scenarios, either intensifying or reducing rice cropping, were considered, and the changes in risk were presented in spatially explicit flood risk maps. The basic risk maps could serve as guidance for the authorities to develop spatially explicit flood management and mitigation plans for the delta.
The land-use change risk maps could further be used for adaptive risk management plans and as a basis for a cost-benefit analysis of the discussed land-use change scenarios. Additionally, the damage and risk maps may support the recently initiated agricultural insurance programme in Vietnam.
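The risk metric used here, expected annual damage, integrates scenario damages over their annual exceedance probabilities. A minimal sketch of that computation (the return periods and damage figures below are hypothetical, not values from the study):

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Expected annual damage: integral of damage over the annual
    exceedance probability p = 1/T of each flood scenario."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                      # integrate from rare to frequent
    p, d = p[order], d[order]
    return float(np.sum(np.diff(p) * (d[1:] + d[:-1]) / 2.0))  # trapezoidal rule

# Hypothetical damages (million USD) for the 100-, 50-, 10- and 2-year floods
ead = expected_annual_damage([100, 50, 10, 2], [400.0, 300.0, 120.0, 10.0])
```

Evaluating such an integral per grid cell yields a spatially explicit risk map in units of damage per year.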
Modern single-particle tracking techniques produce extensive time series of diffusive motion in a wide variety of systems, from single-molecule motion in living cells to movement ecology. The quest is to decipher the physical mechanisms encoded in the data and thus to better understand the probed systems. We here augment recently proposed machine-learning techniques for decoding anomalous-diffusion data to include an uncertainty estimate in addition to the predicted output. To avoid the black-box problem, a Bayesian deep learning technique named Stochastic Weight Averaging-Gaussian (SWAG) is used to train models for both the classification of the diffusion model and the regression of the anomalous diffusion exponent of single-particle trajectories. Evaluating their performance, we find that these models can achieve a well-calibrated error estimate while maintaining high prediction accuracies. In the analysis of the output uncertainty predictions we relate these to properties of the underlying diffusion models, thus providing insights into the learning process of the machine and the relevance of the output.
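The SWAG idea can be illustrated on a toy one-parameter model: collect weight snapshots along the SGD trajectory, fit a Gaussian over the collected weights, and sample from it at prediction time to obtain an uncertainty estimate. A minimal numpy sketch, where the data, model, and learning rate are illustrative assumptions rather than the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise; the "network" is a single weight w.
x = rng.normal(size=200)
y = 2.0 * x + 0.1 * rng.normal(size=200)

# SGD with snapshot collection: the weight trajectory late in
# training approximates a posterior distribution over weights.
w, lr, snapshots = 0.0, 0.05, []
for epoch in range(300):
    for i in rng.permutation(len(x)):
        grad = 2.0 * (w * x[i] - y[i]) * x[i]   # gradient of squared error
        w -= lr * grad
    if epoch >= 100:                             # collect after burn-in
        snapshots.append(w)

# Diagonal SWAG: fit a Gaussian to the collected weights ...
mu, sigma = np.mean(snapshots), np.std(snapshots)

# ... and sample weights to get a predictive mean and uncertainty.
x_test = 1.5
preds = [wi * x_test for wi in mu + sigma * rng.normal(size=500)]
pred_mean, pred_std = np.mean(preds), np.std(preds)
```

The spread of the sampled predictions is the calibrated error estimate; the full method additionally keeps a low-rank covariance term over the weight snapshots.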
We evaluate the spatial and temporal evolution of Earth's long-wavelength surface dynamic topography since the Jurassic using a series of high-resolution global mantle convection models. These models are Earth-like in terms of convective vigour, thermal structure, surface heat-flux and the geographic distribution of heterogeneity. The models generate a degree-2-dominated spectrum of dynamic topography with negative amplitudes above subducted slabs (i.e. circum-Pacific regions and southern Eurasia) and positive amplitudes elsewhere (i.e. Africa, north-western Eurasia and the central Pacific). Model predictions are compared with published observations and subsidence patterns from well data, both globally and for the Australian and southern African regions. We find that our models reproduce the long-wavelength component of these observations, although observed smaller-scale variations are not reproduced. We subsequently define "geodynamic rules" for how different surface tectonic settings are affected by mantle processes: (i) locations in the vicinity of a subduction zone show large negative dynamic topography amplitudes; (ii) regions far away from convergent margins feature long-term positive dynamic topography; and (iii) rapid variations in dynamic support occur along the margins of overriding plates (e.g. the western US) and at points located on a plate that rapidly approaches a subduction zone (e.g. India and the Arabian Peninsula). Our models provide a predictive quantitative framework linking mantle convection with plate tectonics and sedimentary basin evolution, thus improving our understanding of how subduction and mantle convection affect the spatio-temporal evolution of basin architecture.
Observations of rift and rifted margin architecture suggest that significant spatial and temporal structural heterogeneity develops during the multiphase evolution of continental rifting. Inheritance is often invoked to explain this heterogeneity, such as preexisting anisotropies in rock composition, rheology, and deformation. Here, we use high-resolution 3-D thermal-mechanical numerical models of continental extension to demonstrate that rift-parallel heterogeneity may develop solely through fault network evolution during the transition from distributed to localized deformation. In our models, the initial phase of distributed normal faulting is seeded through randomized initial strength perturbations in an otherwise laterally homogeneous lithosphere extending at a constant rate. Continued extension localizes deformation onto lithosphere-scale faults, which are laterally offset by tens of km and discontinuous along-strike. These results demonstrate that rift- and margin-parallel heterogeneity of large-scale fault patterns may in part be a natural byproduct of fault network coalescence.
We investigate the ergodic properties of a random walker performing (anomalous) diffusion on a random fractal geometry. Extensive Monte Carlo simulations of the motion of tracer particles on an ensemble of realisations of percolation clusters are performed for a wide range of percolation densities. Single trajectories of the tracer motion are analysed to quantify the time averaged mean squared displacement (MSD) and to compare this with the ensemble averaged MSD of the particle motion. Other complementary physical observables associated with ergodicity are studied as well. It turns out that the time averaged MSD of individual realisations exhibits non-vanishing fluctuations even in the limit of very long observation times as the percolation density approaches the critical value. This apparent non-ergodic behaviour concurs with the ergodic behaviour on the ensemble averaged level. We demonstrate how the non-vanishing fluctuations in single particle trajectories are analytically expressed in terms of the fractal dimension and the cluster size distribution of the random geometry, thus being of purely geometrical origin. Moreover, we reveal that the convergence scaling law to ergodicity, which is known to be inversely proportional to the observation time T for ergodic diffusion processes, follows a power-law ∼ T^(−h) with h < 1 due to the fractal structure of the accessible space. These results provide useful measures for differentiating the subdiffusion on random fractals from an otherwise closely related process, namely, fractional Brownian motion. Implications of our results on the analysis of single particle tracking experiments are provided.
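The central comparison, time averaged versus ensemble averaged MSD, can be sketched numerically. The snippet below uses ordinary Brownian trajectories as a simple stand-in for tracer motion on percolation clusters (generating the clusters themselves is beyond a short sketch) and computes the relative scatter of the time averaged MSD, the quantity whose non-vanishing limit signals the apparent non-ergodicity discussed above:

```python
import numpy as np

rng = np.random.default_rng(1)

def tamsd(traj, lag):
    """Time averaged MSD of a single trajectory at a given lag time."""
    disp = traj[lag:] - traj[:-lag]
    return np.mean(disp**2)

# Stand-in ensemble: N ordinary Brownian trajectories of length T.
T, N = 10_000, 200
trajs = np.cumsum(rng.normal(size=(N, T)), axis=1)

lag = 10
ta = np.array([tamsd(tr, lag) for tr in trajs])     # per-trajectory TAMSD
ea = np.mean((trajs[:, lag] - trajs[:, 0])**2)      # ensemble MSD at same lag

# Relative scatter of the TAMSD across trajectories: for an ergodic
# process this vanishes as the observation time T grows.
eb = np.var(ta) / np.mean(ta)**2
```

On critical percolation clusters the analogous scatter stays finite for long T, whereas here it decays with T, which is the contrast the abstract exploits.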
F2C2 (2012)
Background: Flux coupling analysis (FCA) has become a useful tool in the constraint-based analysis of genome-scale metabolic networks. FCA detects dependencies between reaction fluxes of metabolic networks at steady state. On the one hand, this can help in the curation of reconstructed metabolic networks by verifying whether the coupling between reactions is in agreement with experimental findings. On the other hand, FCA can aid in defining intervention strategies to knock out target reactions.
Results: We present a new method F2C2 for FCA, which is orders of magnitude faster than previous approaches. As a consequence, FCA of genome-scale metabolic networks can now be performed in a routine manner.
Conclusions: We propose F2C2 as a fast tool for the computation of flux coupling in genome-scale metabolic networks. F2C2 is freely available for non-commercial use at https://sourceforge.net/projects/f2c2/files/.
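The core flux-coupling question, whether blocking one reaction forces another to zero at steady state, can be posed as a pair of linear programs. Below is a minimal scipy illustration on a toy three-reaction network; this is not the F2C2 algorithm itself, whose speed-up comes from avoiding most such LP solves:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix for metabolites A, B and reactions
# R1: -> A,  R2: A -> B,  R3: B ->
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])
n = S.shape[1]

def max_flux(target, knockout=None, ub=10.0):
    """Maximise flux through `target` at steady state (S v = 0),
    optionally forcing the `knockout` reaction to carry zero flux."""
    bounds = [(0.0, ub)] * n
    if knockout is not None:
        bounds[knockout] = (0.0, 0.0)
    c = np.zeros(n)
    c[target] = -1.0                 # linprog minimises, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
    return -res.fun

# R2 is directionally coupled to R1: blocking R1 also blocks R2.
coupled = max_flux(1) > 1e-6 and max_flux(1, knockout=0) < 1e-6
```

Repeating this test for every reaction pair is what makes naive FCA expensive on genome-scale networks, and what F2C2 accelerates by orders of magnitude.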
Proposing relevant perturbations to biological signaling networks is central to many problems in biology and medicine because it allows for enabling or disabling certain biological outcomes. In contrast to quantitative methods that permit fine-grained (kinetic) analysis, qualitative approaches allow for addressing large-scale networks. This is accomplished by more abstract representations such as logical networks. We elaborate upon such a qualitative approach aiming at the computation of minimal interventions in logical signaling networks relying on Kleene's three-valued logic and fixpoint semantics. We address this problem within answer set programming and show that it greatly outperforms previous work using dedicated algorithms.
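On very small networks the minimal-intervention problem can be illustrated by brute force. The sketch below enumerates node fixings of increasing size in a hypothetical Boolean network until the desired output holds for every input state; the work above instead uses answer set programming with three-valued logic, which scales to large networks where enumeration is hopeless:

```python
from itertools import combinations, product

# Hypothetical logical network: output O = (A and B) or C,
# with C = A and not I.  Goal: force O to False for all inputs.
nodes = ["A", "B", "I"]

def output(state, fixed):
    s = {**state, **fixed}          # interventions override the state
    C = s["A"] and not s["I"]
    return (s["A"] and s["B"]) or C

def minimal_interventions(goal=False):
    """Enumerate node fixings of increasing size until the goal output
    holds for every possible input state (brute-force illustration)."""
    for size in range(len(nodes) + 1):
        hits = [dict(zip(combo, values))
                for combo in combinations(nodes, size)
                for values in product([False, True], repeat=size)
                if all(output(dict(zip(nodes, st)), dict(zip(combo, values))) == goal
                       for st in product([False, True], repeat=len(nodes)))]
        if hits:
            return hits             # all minimal solutions of this size

solutions = minimal_interventions()
```

In this toy network fixing A to False is the unique minimal intervention, since both paths to the output pass through A.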
Background: Inferring regulatory interactions between genes from transcriptomics time-resolved data, yielding reverse engineered gene regulatory networks, is of paramount importance to systems biology and bioinformatics studies. Accurate methods to address this problem can ultimately provide a deeper insight into the complexity, behavior, and functions of the underlying biological systems. However, the large number of interacting genes coupled with short and often noisy time-resolved read-outs of the system renders the reverse engineering a challenging task. Therefore, the development and assessment of methods which are computationally efficient, robust against noise, applicable to short time series data, and preferably capable of reconstructing the directionality of the regulatory interactions remains a pressing research problem with valuable applications.
Results: Here we perform the largest systematic analysis of a set of similarity measures and scoring schemes within the scope of the relevance network approach which are commonly used for gene regulatory network reconstruction from time series data. In addition, we define and analyze several novel measures and schemes which are particularly suitable for short transcriptomics time series. We also compare the 21 considered measures and 6 scoring schemes according to their ability to correctly reconstruct such networks from short time series data by calculating summary statistics based on the corresponding specificity and sensitivity. Our results demonstrate that rank- and symbol-based measures have the highest performance in inferring regulatory interactions. In addition, the proposed scoring scheme based on asymmetric weighting has proven valuable in reducing the number of false positive interactions. On the other hand, Granger causality as well as information-theoretic measures, frequently used in inference of regulatory networks, show low performance on the short time series analyzed in this study.
Conclusions: Our study is intended to serve as a guide for choosing a particular combination of similarity measures and scoring schemes suitable for reconstruction of gene regulatory networks from short time series data. We show that further improvement of algorithms for reverse engineering can be obtained if one considers measures rooted in the study of symbolic dynamics or ranks, in contrast to the application of common similarity measures which do not consider the temporal character of the employed data. Moreover, we establish that the asymmetric weighting scoring scheme together with symbol-based measures (for low noise levels) and rank-based measures (for high noise levels) is the most suitable choice.
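A rank-based, direction-aware score of the kind favoured in these conclusions can be illustrated with a lagged Spearman correlation: a regulator should predict its target's future better than the reverse. The snippet below is a simplified stand-in, not one of the 21 measures from the study; the synthetic series and one-step lag are assumptions:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Synthetic short time series: gene g2 follows g1 with a one-step delay.
T = 20
g1 = np.cumsum(rng.normal(size=T))
g2 = np.roll(g1, 1) + 0.1 * rng.normal(size=T)
g2[0] = 0.0

def lagged_spearman(x, y, lag=1):
    """Rank correlation between x(t) and y(t + lag)."""
    rho, _ = spearmanr(x[:-lag], y[lag:])
    return rho

# Asymmetry between the two directions suggests the edge g1 -> g2.
forward = lagged_spearman(g1, g2)
backward = lagged_spearman(g2, g1)
edge_g1_to_g2 = forward > backward
```

Because ranks are invariant under monotone transformations of the expression values, such measures are robust to the measurement noise typical of short transcriptomics series.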
We study the thermal Markovian diffusion of tracer particles in a 2D medium with spatially varying diffusivity D(r), mimicking recently measured, heterogeneous maps of the apparent diffusion coefficient in biological cells. For this heterogeneous diffusion process (HDP) we analyse the mean squared displacement (MSD) of the tracer particles, the time averaged MSD, the spatial probability density function, and the first passage time dynamics from the cell boundary to the nucleus. Moreover, we examine the non-ergodic properties of this process, which are important for the correct physical interpretation of time averages of observables obtained from single particle tracking experiments. From extensive computer simulations of the 2D stochastic Langevin equation we present an in-depth study of this HDP. In particular, we find that the MSDs along the radial and azimuthal directions in a circular domain obey anomalous and Brownian scaling, respectively. We demonstrate that the time averaged MSD stays linear as a function of the lag time and the system thus reveals a weak ergodicity breaking. Our results will enable one to rationalise the diffusive motion of larger tracer particles such as viruses or submicron beads in biological cells.
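The simulated process can be sketched with a simple Euler scheme for the overdamped Langevin equation with position-dependent diffusivity, dx = sqrt(2 D(r)) dW in the Itô interpretation. The diffusivity profile D(r) = D0 (1 + r²) below is a hypothetical choice for illustration, not the map used in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_hdp(n_steps=2000, dt=1e-3, n_traj=100):
    """Euler scheme (Ito interpretation) for dx = sqrt(2 D(r)) dW in 2D,
    with a hypothetical diffusivity D(r) = D0 * (1 + r^2)."""
    D0 = 1.0
    pos = np.zeros((n_traj, 2))          # all tracers start at the origin
    msd = np.zeros(n_steps)
    for step in range(n_steps):
        r2 = np.sum(pos**2, axis=1)
        D = D0 * (1.0 + r2)              # position-dependent diffusivity
        noise = rng.normal(size=(n_traj, 2))
        pos += np.sqrt(2.0 * D * dt)[:, None] * noise
        msd[step] = np.mean(np.sum(pos**2, axis=1))
    return msd

msd = simulate_hdp()
```

With a diffusivity growing away from the origin the ensemble MSD grows faster than linearly, a simple instance of the anomalous scaling that a heterogeneous D(r) can produce; other profiles (e.g. decaying towards the boundary) yield subdiffusive behaviour instead.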