530 Physics
Document Type
- Article (596)
- Doctoral Thesis (342)
- Postprint (102)
- Preprint (50)
- Other (47)
- Habilitation Thesis (22)
- Master's Thesis (10)
- Review (9)
- Monograph/Edited Volume (3)
- Course Material (2)
Keywords
- diffusion (39)
- anomalous diffusion (28)
- gamma rays: general (20)
- Synchronisation (16)
- synchronization (16)
- cosmic rays (14)
- ISM: supernova remnants (12)
- organic solar cells (12)
- stochastic processes (12)
- data analysis (10)
Institute
- Institut für Physik und Astronomie (1093)
- Interdisziplinäres Zentrum für Dynamik komplexer Systeme (52)
- Extern (30)
- Mathematisch-Naturwissenschaftliche Fakultät (25)
- Institut für Chemie (18)
- Institut für Geowissenschaften (13)
- Institut für Mathematik (12)
- Institut für Biochemie und Biologie (6)
- Potsdam Institute for Climate Impact Research (PIK) e. V. (5)
- Institut für Umweltwissenschaften und Geographie (4)
This work introduces an embedded approach for the prediction of Solar Particle Events (SPEs) in space applications by combining real-time Soft Error Rate (SER) measurement with an SRAM-based detector and an offline-trained machine learning model. The proposed approach is intended for self-adaptive fault-tolerant multiprocessing systems employed in space applications. With respect to the state of the art, our solution allows the SER to be predicted 1 h in advance and enables fine-grained hourly tracking of SER variations during SPEs as well as under normal conditions. Therefore, the target system can activate the appropriate radiation-hardening mechanisms before the onset of high radiation levels. Based on a comparison of five different machine learning algorithms trained with the public space flux database, the preliminary results indicate that the best prediction accuracy is achieved with a recurrent neural network (RNN) with long short-term memory (LSTM).
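As an illustration only, here is a minimal sketch of the kind of LSTM-based hourly forecaster described above, written with tf.keras on synthetic data; the window length, layer sizes, and training settings are assumptions for the sketch, not the authors' configuration.

```python
import numpy as np
import tensorflow as tf

# Synthetic hourly SER-like series (placeholder for the real SRAM detector data).
rng = np.random.default_rng(0)
series = np.abs(rng.normal(1.0, 0.2, size=5000)).astype("float32")

# Sliding windows: use the last 24 hourly values to predict the value 1 h ahead.
window = 24
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.1)

# 1 h ahead prediction from the most recent 24-hour window.
next_ser = model.predict(series[-window:][None, :, None])
```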
The radiation belts of the Earth, filled with energetic electrons, comprise complex and dynamic systems that pose a significant threat to satellite operation. While various models of electron flux both for low and relativistic energies have been developed, the behavior of medium energy (120-600 keV) electrons, especially in the MEO region, remains poorly quantified. At these energies, electrons are driven by both convective and diffusive transport, and their prediction usually requires sophisticated 4D modeling codes. In this paper, we present an alternative approach using the Light Gradient Boosting (LightGBM) machine learning algorithm. The Medium Energy electRon fLux In Earth's outer radiatioN belt (MERLIN) model takes as input the satellite position and a combination of geomagnetic indices and solar wind parameters, including the time history of velocity, and does not use persistence. MERLIN is trained on >15 years of GPS electron flux data and tested on more than 1.5 years of measurements. Tenfold cross validation shows that the model predicts the MEO radiation environment well, both in terms of dynamics and amplitudes of flux. Evaluation on the test set shows high correlation between the predicted and observed electron flux (0.8) and low values of absolute error. The MERLIN model can have wide space weather applications, providing information for the scientific community in the form of radiation belt reconstructions, as well as for industry in satellite mission design, nowcasting of the MEO environment, and surface charging analysis.
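A hedged sketch of the general modelling recipe the abstract describes (gradient-boosted regression of electron flux on position, geomagnetic, and solar wind features, evaluated with 10-fold cross-validation); the feature columns, target, and hyperparameters below are illustrative assumptions, not the MERLIN setup.

```python
import numpy as np
import lightgbm as lgb
from sklearn.model_selection import KFold, cross_val_score

# Placeholder features: position, a geomagnetic index, and solar wind speed with a lag.
rng = np.random.default_rng(1)
n = 10_000
X = np.column_stack([
    rng.uniform(4.0, 5.5, n),      # radial position (L-shell-like)
    rng.uniform(0.0, 24.0, n),     # magnetic local time
    rng.normal(0.0, 1.0, n),       # standardised geomagnetic index
    rng.normal(400.0, 80.0, n),    # solar wind speed
    rng.normal(400.0, 80.0, n),    # solar wind speed, lagged (time history)
])
y = rng.lognormal(mean=2.0, sigma=1.0, size=n)   # synthetic electron "flux"

# Regress log10(flux) and score with 10-fold cross-validation.
model = lgb.LGBMRegressor(n_estimators=500, learning_rate=0.05)
scores = cross_val_score(model, X, np.log10(y),
                         cv=KFold(n_splits=10, shuffle=True), scoring="r2")
print("mean CV R^2:", scores.mean())
```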
Inferring causal relations from observational time series data is a key problem across science and engineering whenever experimental interventions are infeasible or unethical. Increasing data availability over the past few decades has spurred the development of a plethora of causal discovery methods, each addressing particular challenges of this difficult task. In this paper, we focus on an important challenge that is at the core of time series causal discovery: regime-dependent causal relations. Often dynamical systems feature transitions depending on some, often persistent, unobserved background regime, and different regimes may exhibit different causal relations. Here, we assume a persistent and discrete regime variable leading to a finite number of regimes within which we may assume stationary causal relations. To detect regime-dependent causal relations, we combine the conditional independence-based PCMCI method [based on a condition-selection step (PC) followed by the momentary conditional independence (MCI) test] with a regime learning optimization approach. PCMCI allows for causal discovery from high-dimensional and highly correlated time series. Our method, Regime-PCMCI, is evaluated on a number of numerical experiments demonstrating that it can distinguish regimes with different causal directions, time lags, and signs of causal links, as well as changes in the variables' autocorrelation. Furthermore, Regime-PCMCI is applied to observations of the El Niño Southern Oscillation and Indian rainfall, demonstrating skill on real-world datasets as well.
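The sketch below is not Regime-PCMCI itself; it only illustrates the regime idea on a toy bivariate system in which the causal direction flips with a persistent background regime, recovering different lag-1 coefficients per regime once regime labels are available (Regime-PCMCI learns these labels and uses conditional independence tests rather than plain regressions).

```python
import numpy as np

rng = np.random.default_rng(2)
T = 2000
regime = (np.arange(T) // 500) % 2           # persistent, alternating background regime
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    if regime[t] == 0:                        # regime 0: x(t-1) drives y(t)
        x[t] = 0.5 * x[t - 1] + rng.normal()
        y[t] = 0.8 * x[t - 1] + rng.normal()
    else:                                     # regime 1: y(t-1) drives x(t)
        y[t] = 0.5 * y[t - 1] + rng.normal()
        x[t] = 0.8 * y[t - 1] + rng.normal()

# With regime labels in hand, fit separate lag-1 regressions per regime.
for r in (0, 1):
    idx = np.where(regime[1:] == r)[0] + 1
    lagged = np.column_stack([x[idx - 1], y[idx - 1]])
    coef_y, *_ = np.linalg.lstsq(lagged, y[idx], rcond=None)
    coef_x, *_ = np.linalg.lstsq(lagged, x[idx], rcond=None)
    print(f"regime {r}: y(t) ~ {coef_y.round(2)} . [x(t-1), y(t-1)], "
          f"x(t) ~ {coef_x.round(2)} . [x(t-1), y(t-1)]")
```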
According to Radzikowski's celebrated results, bisolutions of a wave operator on a globally hyperbolic spacetime are of the Hadamard form iff they are given by a linear combination of distinguished parametrices $\tfrac{i}{2}(\tilde{G}_{aF}-\tilde{G}_{F}+\tilde{G}_{A}-\tilde{G}_{R})$ in the sense of Duistermaat and Hörmander [Acta Math. 128, 183–269 (1972)] and Radzikowski [Commun. Math. Phys. 179, 529 (1996)]. Inspired by the construction of the corresponding advanced and retarded Green operators $G_A$, $G_R$ as done by Bär, Ginoux, and Pfäffle [Wave Equations on Lorentzian Manifolds and Quantization (European Mathematical Society (EMS), Zürich, 2007)], we construct the remaining two Green operators $G_F$, $G_{aF}$ locally in terms of Hadamard series. Afterward, we provide the global construction of $\tfrac{i}{2}(\tilde{G}_{aF}-\tilde{G}_{F})$, which relies on new techniques such as a well-posed Cauchy problem for bisolutions and a patching argument using Čech cohomology. This leads to global bisolutions of the Hadamard form, each of which can be chosen to be a Hadamard two-point function, i.e., the smooth part can be adapted such that, additionally, the symmetry and the positivity conditions are exactly satisfied.
A comet is a highly dynamic object, undergoing a permanent state of change. These changes have to be carefully classified and considered according to their intrinsic temporal and spatial scales. The Rosetta mission has, through its contiguous in-situ and remote sensing coverage of comet 67P/Churyumov-Gerasimenko (hereafter 67P) over the time span of August 2014 to September 2016, monitored the emergence, culmination, and winding down of the gas and dust comae. This provided an unprecedented data set and has spurred a large effort to connect in-situ and remote sensing measurements to the surface. In this review, we address our current understanding of cometary activity and the challenges involved when linking comae data to the surface. We give the current state of research by describing what we know about the physical processes involved from the surface to a few tens of kilometres above it with respect to the gas and dust emission from cometary nuclei. Further, we describe how complex multidimensional cometary gas and dust models have developed from the Halley encounter of 1986 to today. This includes the study of inhomogeneous outgassing and determination of the gas and dust production rates. Additionally, the different approaches used and results obtained to link coma data to the surface will be discussed. We discuss forward and inversion models and we describe the limitations of the respective approaches. The current literature suggests that there does not seem to be a single uniform process behind cometary activity. Rather, activity seems to be the consequence of a variety of erosion processes, including the sublimation of both water ice and more volatile material, but possibly also more exotic processes such as fracture and cliff erosion under thermal and mechanical stress, sub-surface heat storage, and a complex interplay of these processes. Seasons and the nucleus shape are key factors for the distribution and temporal evolution of activity and imply that the heliocentric evolution of activity can be highly individual for every comet, and generalisations can be misleading.
Heat waves are increasingly common in many countries across the globe, including Germany, where this study is set. Heat poses severe health risks, especially for vulnerable groups such as the elderly and children. This case study explores visitors' behavior and perceptions during six weekends in the summer of 2018 at a 6-month open-air horticultural show. Data from a face-to-face survey (n = 306) and behavioral observations (n = 2750) were examined using correlation analyses, ANOVA, and multiple regression analyses. Differences in weather perception, risk awareness, adaptive behavior, and activity level were observed between rainy days (maximum daily temperature <25°C), warm summer days (25°-30°C), and hot days (>30°C). Respondents reported a high level of heat risk awareness, but most (90%) were unaware of actual heat warnings. During hot days, more adaptive measures were reported and observed. Older respondents reported taking the highest number of adaptive measures. We observed the highest level of adaptation in children, but they also showed the highest activity level. Based on our results, we discuss how to facilitate individual adaptation to heat stress at open-air events by taking the heterogeneity of visitors into account. To mitigate negative health outcomes for citizens in the future, we argue for tailored risk communication aimed at vulnerable groups.
SIGNIFICANCE STATEMENT: People around the world are facing higher average temperatures. While higher temperatures make open-air events a popular leisure time activity in summer, heat waves are a threat to health and life. Since there is not much research on how visitors of such events perceive different weather conditions, especially hot temperatures, we explored this in our case study in southern Germany at an open-air horticultural show in the summer of 2018. We discovered deficits both in people's awareness of current heat risk and in the heat adaptation they carry out themselves. Future research should further investigate the risk perception and adaptation behavior of private individuals, whereas event organizers and authorities need to continually focus on risk communication and facilitate individual adaptation of their visitors.
We introduce a thermofield-based formulation of the multilayer multiconfigurational time-dependent Hartree (MCTDH) method to study finite temperature effects on non-adiabatic quantum dynamics from a non-stochastic, wave function perspective. Our approach is based on the formal equivalence of bosonic many-body theory at zero temperature with a doubled number of degrees of freedom and the thermal quasi-particle representation of bosonic thermofield dynamics (TFD). This equivalence allows for a transfer of bosonic many-body MCTDH as introduced by Wang and Thoss to the finite temperature framework of thermal quasi-particle TFD. As an application, we study temperature effects on the ultrafast internal conversion dynamics in pyrazine. We show that finite temperature effects can be efficiently accounted for in the construction of multilayer expansions of thermofield states in the framework presented herein. Furthermore, we find our results to agree well with existing studies on the pyrazine model based on the pMCTDH method.
In X-ray computed tomography (XCT), an X-ray beam of intensity I0 is transmitted through an object and its attenuated intensity I is measured when it exits the object. The attenuation of the beam depends on the attenuation coefficients along its path. The attenuation coefficients provide information about the structure and composition of the object and can be determined through mathematical operations that are referred to as reconstruction.
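The dependence sketched above is commonly written as the Beer-Lambert law, with $\mu(s)$ the linear attenuation coefficient along the ray path; reconstruction then amounts to recovering $\mu$ from many such line integrals:

```latex
I = I_0 \, \exp\!\left(-\int_{\mathrm{ray}} \mu(s)\, \mathrm{d}s\right)
\qquad\Longleftrightarrow\qquad
-\ln\frac{I}{I_0} = \int_{\mathrm{ray}} \mu(s)\, \mathrm{d}s
```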
The standard reconstruction algorithms are based on the filtered backprojection (FBP) of the measured data. While these algorithms are fast and relatively simple, they do not always succeed in computing a precise reconstruction, especially from under-sampled data.
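A minimal sketch of this FBP pipeline using scikit-image on the Shepp-Logan phantom; the angular sampling and filter are arbitrary choices for illustration, and the keyword `filter_name` assumes a recent scikit-image version.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

image = shepp_logan_phantom()                         # ground-truth test object
theta = np.linspace(0.0, 180.0, 60, endpoint=False)   # deliberately under-sampled angles
sinogram = radon(image, theta=theta)                  # simulated parallel-beam projections
reco = iradon(sinogram, theta=theta, filter_name="ramp")  # filtered backprojection

# Under-sampling shows up as streak artefacts and a larger reconstruction error.
print("RMSE:", np.sqrt(np.mean((reco - image) ** 2)))
```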
Alternatively, an image or volume can be reconstructed by solving a system of linear equations. Typically, the system of equations is too large to be solved directly, but its solution can be approximated by iterative methods, such as the Simultaneous Iterative Reconstruction Technique (SIRT) and Conjugate Gradient Least Squares (CGLS).
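As a toy illustration (not tied to any real CT geometry), the standard SIRT update $x_{k+1} = x_k + C A^{T} R\,(b - A x_k)$, with $R$ and $C$ the inverse row and column sums of the system matrix $A$, can be sketched as follows:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((200, 100))        # toy nonnegative system (projection) matrix
x_true = rng.random(100)
b = A @ x_true                    # simulated, noise-free projection data

R = 1.0 / A.sum(axis=1)           # inverse row sums
C = 1.0 / A.sum(axis=0)           # inverse column sums

x = np.zeros(100)
for _ in range(200):              # SIRT iterations
    x += C * (A.T @ (R * (b - A @ x)))

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```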
This dissertation focuses on the development of a novel iterative algorithm, the Direct Iterative Reconstruction of Computed Tomography Trajectories (DIRECTT). After its reconstruction principle is explained, its performance is assessed for real parallel- and cone-beam CT (including under-sampled) data and compared to that of other established algorithms. Finally, it is demonstrated how the shape of the measured object can be modelled into DIRECTT to achieve even better reconstruction results.
The random nature of self-amplified spontaneous emission (SASE) is a well-known challenge for x-ray core level spectroscopy at SASE free-electron lasers (FELs). Especially in time-resolved experiments that require a combination of good temporal and spectral resolution, jitter and drifts in the spectral characteristics and relative arrival time, as well as power fluctuations, can smear out spectral-temporal features. We present a combination of methods for the analysis of time-resolved photoelectron spectra based on power and time corrections as well as self-referencing of a strong photoelectron line. Based on sulfur 2p photoelectron spectra of 2-thiouracil taken at the SASE FEL FLASH2, we show that it is possible to correct for some of the photon energy drift and jitter even when reliable shot-to-shot photon energy data are not available. The quality of pump-probe difference spectra improves as random jumps in energy between delay points are significantly reduced. The data analysis allows us to identify coherent oscillations of 1 eV shift in the mean position of a photoelectron line of 4 eV width, with an error of less than 0.1 eV.
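A rough numpy sketch of the self-referencing step: estimate, for each shot, the centroid of a strong reference photoelectron line within a fixed energy window and shift that shot's spectrum so the centroid is constant; the energy grid, window, and line shape below are illustrative assumptions, not the FLASH2 data.

```python
import numpy as np

rng = np.random.default_rng(4)
energy = np.linspace(160.0, 175.0, 600)        # kinetic-energy grid (eV), illustrative
n_shots = 500
jitter = rng.normal(0.0, 0.4, n_shots)         # per-shot photon-energy jitter (eV)

# Synthetic spectra: one strong reference line whose position jitters shot to shot.
spectra = np.exp(-0.5 * ((energy[None, :] - (167.0 + jitter[:, None])) / 0.8) ** 2)
spectra += rng.normal(0.0, 0.02, spectra.shape)

# Self-referencing: line centroid in a fixed window, then shift each shot back.
win = (energy > 164.0) & (energy < 170.0)
centroids = (spectra[:, win] * energy[win]).sum(axis=1) / spectra[:, win].sum(axis=1)
shift = centroids - centroids.mean()
corrected = np.array([np.interp(energy + s, energy, spec)
                      for s, spec in zip(shift, spectra)])
```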
Global quantum thermometry
(2021)
A paradigm shift in quantum thermometry is proposed. To date, thermometry has relied on local estimation, which is useful to reduce statistical fluctuations once the temperature is very well known. In order to estimate temperatures in cases where few measurement data or no substantial prior knowledge are available, we build instead a method for global quantum thermometry. Based on scaling arguments, a mean logarithmic error is shown here to be the correct figure of merit for thermometry. Its full minimization provides an operational and optimal rule to postprocess measurements into a temperature reading, and it establishes a global precision limit. We apply these results to the simulated outcomes of measurements on a spin gas, finding that the local approach can lead to biased temperature estimates in cases where the global estimator converges to the true temperature. The global framework thus enables a reliable approach to data analysis in thermometry experiments.