Volatile supply and sales markets, coupled with increasing product individualization and complex production processes, present significant challenges for manufacturing companies. These companies must navigate and adapt to ever-shifting external and internal factors while ensuring robustness against process variabilities and unforeseen events. This has a pronounced impact on production control, which serves as the operational intersection between production planning and the shop-floor resources and necessitates the capability to manage intricate process interdependencies effectively. Considering the increasing dynamics and product diversification, alongside the need to maintain constant production performance, the implementation of innovative control strategies becomes crucial.
In recent years, the integration of Industry 4.0 technologies and machine learning methods has gained prominence in addressing emerging challenges in production applications. Within this context, this cumulative thesis analyzes deep learning based production systems on the basis of five publications. Particular attention is paid to applications of deep reinforcement learning, aiming to explore its potential in dynamic control contexts. The analysis reveals that deep reinforcement learning excels in various applications, especially in dynamic production control tasks. Its efficacy can be attributed to its interactive learning and real-time operational model. However, despite its evident utility, there are notable structural, organizational, and algorithmic gaps in the prevailing research. A predominant portion of deep reinforcement learning based approaches is limited to specific job shop scenarios and often overlooks the potential synergies of combined resources. The analysis furthermore highlights the rare implementation of multi-agent systems and semi-heterarchical systems in practical settings, and a notable gap remains in the integration of deep reinforcement learning into a hyper-heuristic.
To bridge these research gaps, this thesis introduces a deep reinforcement learning based hyper-heuristic for the control of modular production systems, developed in accordance with the design science research methodology. Implemented within a semi-heterarchical multi-agent framework, this approach achieves a threefold reduction in control and optimisation complexity while ensuring high scalability, adaptability, and robustness of the system. In comparative benchmarks, this control methodology outperforms rule-based heuristics, reducing throughput times and tardiness, and effectively incorporates customer- and order-centric metrics. The control artifact facilitates rapid scenario generation, motivating further research efforts and bridging the gap to real-world applications. The overarching goal is to foster a synergy between theoretical insights and practical solutions, thereby enriching scientific discourse and addressing current industrial challenges.
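The control idea can be illustrated with a toy sketch. Here, tabular Q-learning stands in for the deep agent, and the hyper-heuristic selects among two classical dispatching rules on a single machine; all job parameters are invented for illustration, and none of this reproduces the thesis artifact:

```python
import random
from collections import defaultdict

# Toy hyper-heuristic: a Q-learning agent learns which dispatching rule
# (SPT or EDD) to apply at each decision point on a single machine.
RULES = [
    lambda queue: min(queue, key=lambda j: j["proc"]),  # 0: shortest processing time
    lambda queue: min(queue, key=lambda j: j["due"]),   # 1: earliest due date
]

def run_episode(Q, rng, eps=0.1, alpha=0.1, gamma=0.9, n_jobs=30):
    queue = [{"proc": rng.randint(1, 9), "due": rng.randint(5, 80)}
             for _ in range(n_jobs)]
    t, total_tardiness = 0, 0
    while queue:
        state = min(len(queue), 5)                 # coarse state: bucketed queue size
        if rng.random() < eps:
            action = rng.randrange(len(RULES))     # explore
        else:
            action = max(range(len(RULES)), key=lambda a: Q[(state, a)])
        job = RULES[action](queue)
        queue.remove(job)
        t += job["proc"]
        tardiness = max(0, t - job["due"])
        total_tardiness += tardiness
        next_state = min(len(queue), 5)
        best_next = max(Q[(next_state, a)] for a in range(len(RULES))) if queue else 0.0
        # Reward is negative tardiness of the job just completed.
        Q[(state, action)] += alpha * (-tardiness + gamma * best_next - Q[(state, action)])
    return total_tardiness

Q = defaultdict(float)
rng = random.Random(0)
history = [run_episode(Q, rng) for _ in range(300)]
print(f"mean tardiness, first 50 episodes: {sum(history[:50]) / 50:.0f}")
print(f"mean tardiness, last 50 episodes:  {sum(history[-50:]) / 50:.0f}")
```

The deep variant replaces the lookup table with a neural network over a richer state, but the interactive learning loop is the same.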
Klossowski, who had originally started as a religious seeker of truth in his younger years, will – after his « reversal » – feel himself invested with the role of a « heretic » struggling with the libidinous search for truth. Even as the creator of a perverted metaphysics, he remains a seeker of the revelation of being, now in the role of the divine « adversary » who, thrown back on himself, tends to imitate a religious mystic. The divine is replaced by the whispers of the demon, which Klossowski experiences as « la complicité d'une force "démonique" » in the creation of his artworks. The Diana myth becomes a parable for the act of artistic creation. Sexuality, understood as the primordial ground of creative force that shapes the signe unique, the phantasm, shifts metaphysics to « phantasmaphysics » (Foucault), in which the mystery of the divine is exposed as a delusion (Wahnbild).
Bienaymé-Galton-Watson processes can be used to study specific, evolving populations. These populations consist of individuals that reproduce identically, randomly, autonomously and independently of one another, each living for exactly one generation. The n-th generation arises as a random sum of the individuals of the (n-1)-th generation. The relevance of these processes stems from their history and from their significance both inside and outside mathematics. The history of the Bienaymé-Galton-Watson processes is traced through the development of the concept up to the present day, citing researchers from various disciplines who contributed insights to the topic and applied the concept in their fields; this establishes the extra-mathematical significance. The intra-mathematical importance, in turn, follows from the concept of branching processes, which goes back to the Bienaymé-Galton-Watson processes. Branching processes are among the most expressive models for describing population growth. Moreover, their current importance rests on the applicability of branching processes and Bienaymé-Galton-Watson processes in epidemiology; the Ebola and Corona pandemics are presented as fields of application. The processes serve as decision support for policy-makers and allow statements about the effects of measures taken against the pandemics. Alongside the processes themselves, the conditional expectation for discrete random variables, the probability generating function and the random sum are introduced. These concepts simplify the description of the processes and thus form the basis of the analysis. In addition, the required and more advanced properties of the fundamental topics and of the processes are stated and proved.
The chapter culminates in the proof of the criticality theorem, which yields a statement about the extinction of the process in the different cases and hence about the extinction probability. The cases are distinguished by the expected number of offspring of an individual. It turns out that a process with an expected offspring number less than or equal to one dies out with certainty, while for an expected number greater than one the population need not die out. Individual examples are then presented, such as the linear fractional case, a population of mouse fibroblasts (connective-tissue cells) and the question that originally motivated the processes. These are examined using the results obtained, and selected random dynamics are simulated in the following chapter. The simulations are carried out by a program written in Python and realised by means of the inversion method. They illustrate, by way of example, the evolution of the processes in the different criticality cases. In addition, the frequencies of the individual population sizes are presented as histograms. These confirm the difference between the cases and make clear the applicability of Bienaymé-Galton-Watson processes to more complex problems. The histograms support the statement that each individual population size occurs only finitely often, a statement raised by Galton and used in the extinction-explosion dichotomy. The presented findings on the topic and the examination of the concept conclude with a didactic analysis, which takes into account the Fundamental Ideas, the Fundamental Ideas of stochastics and the guiding idea "Daten und Zufall" (data and chance).
It emerges that, depending on the chosen perspective, treating Bienaymé-Galton-Watson processes in school is plausible and can benefit the students. As an example, the curriculum framework (Rahmenlehrplan) for Berlin and Brandenburg is analysed and compared with the core curriculum (Kernlehrplan) of North Rhine-Westphalia. The design of the Berlin-Brandenburg curriculum does not support the conclusion that Bienaymé-Galton-Watson processes should be used: the underlying guiding idea turns out not to be fully compatible with some of the Fundamental Ideas of stochastics. A modification of the curriculum towards a stronger orientation on the Fundamental Ideas would therefore make the application of the processes possible. This claim is supported by considering a North Rhine-Westphalian lesson plan for stochastic processes and transferring it to Bienaymé-Galton-Watson processes. In addition, a concept map and a "Vernetzungspentagraph" (networking pentagraph) in the sense of von der Bank are designed to highlight this aspect.
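The simulation approach described above can be sketched in a few lines of Python. The sketch assumes a Poisson offspring distribution sampled via the inversion method; the thesis's own program may differ in its details:

```python
import math
import random

def poisson_inversion(lam, u):
    # Inversion method: walk up the Poisson CDF until it exceeds the uniform draw u.
    k, p = 0, math.exp(-lam)
    cdf = p
    while u > cdf:
        k += 1
        p *= lam / k
        cdf += p
    return k

def bgw_path(mean_offspring, generations, rng, cap=10_000):
    # One Bienaymé-Galton-Watson trajectory: each generation is the random
    # sum of the offspring of all individuals in the previous generation.
    z, path = 1, [1]
    for _ in range(generations):
        z = sum(poisson_inversion(mean_offspring, rng.random()) for _ in range(z))
        path.append(z)
        if z == 0 or z > cap:        # extinct, or clearly exploding
            break
    return path

rng = random.Random(1)
extinct_frac = {}
for m in (0.8, 1.0, 1.5):            # subcritical, critical, supercritical regimes
    runs = [bgw_path(m, 50, rng) for _ in range(500)]
    extinct_frac[m] = sum(path[-1] == 0 for path in runs) / len(runs)
    print(f"m = {m}: extinct in {extinct_frac[m]:.0%} of 500 runs")
```

The three regimes of the criticality theorem show up directly: for mean offspring at most one, almost every run dies out within 50 generations, while the supercritical case survives with positive probability.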
Diagenetic trends of synthetic reservoir sandstone properties assessed by digital rock physics
(2021)
Quantifying interactions and dependencies among geometric, hydraulic and mechanical properties of reservoir sandstones is of particular importance for the exploration and utilisation of the geological subsurface and can be assessed by synthetic sandstones comprising the microstructural complexity of natural rocks. In the present study, three highly resolved samples of the Fontainebleau, Berea and Bentheim sandstones are generated by means of a process-based approach, which combines the gravity-driven deposition of irregularly shaped grains and their diagenetic cementation by three different schemes. The resulting evolution in porosity, permeability and rock stiffness is examined and compared to the respective micro-computed tomographic (micro-CT) scans. The grain contact-preferential scheme implies a progressive clogging of small throats and consequently produces considerably less connected and stiffer samples than the two other schemes. By contrast, uniform quartz overgrowth continuously alters the pore space and leads to the lowest elastic properties. The proposed stress-dependent cementation scheme combines both approaches of contact cement and quartz overgrowth, resulting in granulometric, hydraulic and elastic properties equivalent to those of the respective micro-CT scans, where bulk moduli deviate only slightly, by 0.8%, 4.9% and 2.5% for the Fontainebleau, Berea and Bentheim sandstone, respectively. The synthetic samples can be further altered to examine the impact of mineral dissolution or precipitation as well as fracturing on various petrophysical correlations, which is of particular relevance for numerous aspects of a sustainable subsurface utilisation.
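The porosity-permeability coupling that the study quantifies from microstructures can be illustrated, in a much cruder way, with the classical Kozeny-Carman relation; this is not the paper's digital-rock-physics workflow, and the grain size and shape constant below are assumed values:

```python
def kozeny_carman(phi, d_grain, c=180.0):
    """Kozeny-Carman permeability estimate in m^2 for porosity phi and
    grain diameter d_grain in m; c is an empirical shape constant."""
    return (d_grain ** 2 / c) * phi ** 3 / (1.0 - phi) ** 2

# Progressive cementation reduces porosity; permeability falls much faster
# because of the phi^3 / (1 - phi)^2 dependence.
perms = {phi: kozeny_carman(phi, d_grain=250e-6) for phi in (0.25, 0.15, 0.05)}
for phi, k in perms.items():
    print(f"phi = {phi:.2f}  k = {k / 9.869e-13:.3f} darcy")   # 1 darcy = 9.869e-13 m^2
```

Schemes that clog small throats first (as the contact-preferential cementation does) deviate from such a single smooth trend, which is exactly why the microstructure-resolved approach of the paper is needed.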
Coarse-grained molecular model for the Glycosylphosphatidylinositol anchor with and without protein
(2020)
Glycosylphosphatidylinositol (GPI) anchors are a unique class of complex glycolipids that anchor a great variety of proteins to the extracellular leaflet of plasma membranes of eukaryotic cells. These anchors can exist either with or without an attached protein called GPI-anchored protein (GPI-AP) both in vitro and in vivo. Although GPIs are known to participate in a broad range of cellular functions, it is to a large extent unknown how these are related to GPI structure and composition. Their conformational flexibility and microheterogeneity make it difficult to study them experimentally. Simplified atomistic models are amenable to all-atom computer simulations in small lipid bilayer patches but not suitable for studying their partitioning and trafficking in complex and heterogeneous membranes. Here, we present a coarse-grained model of the GPI anchor constructed with a modified version of the MARTINI force field that is suited for modeling carbohydrates, proteins, and lipids in an aqueous environment using MARTINI's polarizable water. The nonbonded interactions for sugars were reparametrized by calculating their partitioning free energies between polar and apolar phases. In addition, sugar-sugar interactions were optimized by adjusting the second virial coefficients of osmotic pressures for solutions of glucose, sucrose, and trehalose to match with experimental data. With respect to the conformational dynamics of GPI-anchored green fluorescent protein, the accessible time scales are now at least an order of magnitude larger than for the all-atom system. This is particularly important for fine-tuning the mutual interactions of lipids, carbohydrates, and amino acids when comparing to experimental results. We discuss the prospective use of the coarse-grained GPI model for studying protein-sorting and trafficking in membrane models.
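The virial-coefficient calibration mentioned above can be sketched as follows: a second osmotic virial coefficient B2 is recovered from osmotic-pressure data via Pi/(c R T) = 1/M + B2 c. The data points below are synthetic placeholders generated from an assumed B2, not the experimental values used in the study:

```python
# Extracting B2 from osmotic pressure vs. concentration by linear least squares.
R, T = 8.314, 298.15                 # J/(mol K), K
M = 0.342                            # kg/mol (molar mass of sucrose)
conc = [10.0, 20.0, 40.0, 80.0]      # kg/m^3
B2_true = 5.0e-4                     # mol m^3 / kg^2 (assumed value)
pi = [c * R * T * (1.0 / M + B2_true * c) for c in conc]   # osmotic pressure, Pa

# Fit y = Pi / (c R T) against c: the slope is B2, the intercept 1/M.
y = [p / (c * R * T) for p, c in zip(pi, conc)]
n = len(conc)
cbar, ybar = sum(conc) / n, sum(y) / n
slope = sum((c - cbar) * (yi - ybar) for c, yi in zip(conc, y)) \
        / sum((c - cbar) ** 2 for c in conc)
print(f"fitted B2 = {slope:.2e} mol m^3 / kg^2")
```

In the force-field work, the sugar-sugar nonbonded parameters are adjusted until the B2 computed from simulated osmotic pressures matches the experimental value obtained this way.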
Research on novel and advanced biomaterials is an indispensable step towards their applications in desirable fields such as tissue engineering, regenerative medicine, cell culture, or biotechnology. The work presented here focuses on such a promising material: polyelectrolyte multilayer (PEM) composed of hyaluronic acid (HA) and poly(L-lysine) (PLL). This gel-like polymer surface coating is able to accumulate (bio-)molecules such as proteins or drugs and release them in a controlled manner. It serves as a mimic of the extracellular matrix (ECM) in composition and intrinsic properties. These qualities make the HA/PLL multilayers a promising candidate for multiple bio-applications such as those mentioned above. The work presented aims at the development of a straightforward approach for assessment of multi-fractional diffusion in multilayers (first part) and at control of local molecular transport into or from the multilayers by laser light trigger (second part).
The mechanism of loading and release is governed by the interaction of the bioactives with the multilayer constituents and, overall, by diffusion. The diffusion of a molecule in HA/PLL multilayers exhibits multiple fractions with different diffusion rates. Approaches that can assess the mobility of molecules in such a complex system are limited. This shortcoming motivated the design of the novel evaluation tool presented here.
The tool employs a simulation-based approach for evaluating the data acquired by the fluorescence recovery after photobleaching (FRAP) method. In this approach, possible fluorescence recovery scenarios are first simulated and then compared with the acquired data, optimizing the model parameters until a sufficient match is achieved. Fluorescent latex particles of different sizes and fluorescein in an aqueous medium are utilized as test samples to validate the analysis results. The diffusion of the protein cytochrome c in HA/PLL multilayers is evaluated as well.
This tool significantly broadens the possibilities for analysing spatiotemporal FRAP data originating from multi-fractional diffusion, while striving to be widely applicable. It has the potential to elucidate the mechanisms of molecular transport and to empower rational engineering of drug release systems.
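The simulate-then-compare idea can be sketched with a toy two-fraction recovery model: candidate curves are generated over a parameter grid and the set with the smallest squared error against the measured curve is kept. This is a drastic simplification of the actual tool, and all parameter values are invented:

```python
import math

def recovery(t, f_fast, tau_fast, tau_slow):
    # Two mobile fractions recovering at different rates (fully mobile pool).
    return f_fast * (1 - math.exp(-t / tau_fast)) \
         + (1 - f_fast) * (1 - math.exp(-t / tau_slow))

# "Measured" curve: synthetic data from known parameters (0.6, 2.0, 30.0).
times = [0.5 * i for i in range(1, 120)]
measured = [recovery(t, 0.6, 2.0, 30.0) for t in times]

# Grid search: simulate each candidate scenario, keep the best match.
best, best_err = None, float("inf")
for f_fast in (0.2, 0.4, 0.6, 0.8):
    for tau_fast in (1.0, 2.0, 4.0):
        for tau_slow in (10.0, 30.0, 60.0):
            err = sum((recovery(t, f_fast, tau_fast, tau_slow) - m) ** 2
                      for t, m in zip(times, measured))
            if err < best_err:
                best, best_err = (f_fast, tau_fast, tau_slow), err
print(best)
```

The real tool simulates full spatiotemporal recovery images rather than a single curve, which is what makes multi-fractional diffusion in a layered film tractable.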
The second part of the work focuses on the fabrication of such a spatiotemporally controlled drug release system employing the HA/PLL multilayer. This release system comprises layers of various functionalities that together form a sandwich structure. The bottom layer, which serves as a reservoir, is formed by an HA/PLL PEM deposited on a planar glass substrate. On top of the PEM, a layer of so-called hybrids is deposited. The hybrids consist of thermoresponsive poly(N-isopropylacrylamide) (PNIPAM)-based hydrogel microparticles with surface-attached gold nanorods. The layer of hybrids is intended to serve as a gate that controls the local molecular transport through the PEM-solution interface. The possibility of stimulating the molecular transport by near-infrared (NIR) laser irradiation is explored.
From several tested approaches for depositing the hybrids onto the PEM surface, a drying-based approach was identified as optimal. Experiments examining the functionality of the fabricated sandwich at elevated temperature document the reversible volume phase transition of the PEM-attached hybrids while the sandwich remains stable. Further, the gold nanorods were shown to effectively absorb light in the tissue- and cell-friendly NIR spectral region while transducing the energy of the light into heat; a rapid and reversible shrinkage of the PEM-attached hybrids was thereby achieved. Finally, dextran was employed as a model transport molecule. It loads into the PEM reservoir within a few seconds with a partition constant of 2.4, while it releases spontaneously in a slower, sustained manner. Local laser irradiation of a sandwich containing fluorescein isothiocyanate-tagged dextran leads to a gradual reduction of the fluorescence intensity in the irradiated region.
The fabricated release system employs the well-established photoresponsivity of the hybrids in an innovative setting. The results of this research are a step towards a spatially controlled, on-demand drug release system and pave the way to spatiotemporally controlled drug release.
The approaches developed in this work have the potential to elucidate the molecular dynamics in ECM and to foster engineering of multilayers with properties tuned to mimic the ECM. The work aims at spatiotemporal control over the diffusion of bioactives and their presentation to the cells.
Evaluating the performance of self-adaptive systems is challenging due to their interactions with often highly dynamic environments. In the specific case of self-healing systems, the performance evaluations of self-healing approaches and their parameter tuning rely on the considered characteristics of failure occurrences and the resulting interactions with the self-healing actions. In this paper, we first study the state of the art for evaluating the performance of self-healing systems by means of a systematic literature review. We provide a classification of different input types for such systems and analyse the limitations of each input type. A main finding is that the employed inputs are often not sophisticated with regard to the assumed characteristics of failure occurrences. To further study the impact of the identified limitations, we present experiments demonstrating that wrong assumptions regarding the characteristics of the failure occurrences can result in large performance prediction errors, disadvantageous design-time decisions concerning the selection of alternative self-healing approaches, and disadvantageous deployment-time decisions concerning parameter tuning. Furthermore, the experiments indicate that employing multiple alternative input characteristics can help with reducing the risk of premature disadvantageous design-time decisions.
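The impact of failure-occurrence assumptions can be illustrated with a toy experiment: the same mean failure rate, injected either as Poisson arrivals or in bursts, produces very different total downtime when repairs are serialised with a fixed repair time. All parameters are assumptions chosen for illustration, not the paper's setup:

```python
import random

def downtime(interarrivals, repair=1.0):
    # One repair crew: failures queue up; downtime of a failure is the
    # time from its occurrence until its repair completes.
    t, busy_until, down = 0.0, 0.0, 0.0
    for gap in interarrivals:
        t += gap
        start = max(t, busy_until)
        busy_until = start + repair
        down += busy_until - t
    return down

rng = random.Random(0)
n, mean_gap = 2000, 5.0
# Input A: Poisson failures (exponential inter-arrival times).
poisson = [rng.expovariate(1.0 / mean_gap) for _ in range(n)]
# Input B: bursts of 5 failures in quick succession, then a long quiet
# period, keeping the same overall mean inter-arrival time of 5.0.
bursty = []
for _ in range(n // 5):
    bursty += [0.1] * 4 + [5 * mean_gap - 0.4]

d_poisson, d_bursty = downtime(poisson), downtime(bursty)
print(f"Poisson failures: total downtime {d_poisson:.0f}")
print(f"bursty failures:  total downtime {d_bursty:.0f}")
```

A performance prediction calibrated on one input type transfers poorly to the other, which is the kind of error the paper's experiments quantify.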
Background: Recent research reported height-biased migration of taller individuals, and a Monte Carlo simulation showed that such preferential migration of taller individuals into network hubs can induce a secular trend of height. In the simulation model, taller agents in the hubs raise the overall height of all individuals in the network by a community effect. However, the actual network structure was seen to influence the strength of this effect. In this paper, the background and the influence of the network structure on the strength of the migration-driven secular trend are investigated. Material and methods: Three principal network types are analyzed: networks derived from street connections in Switzerland, more regular fishing-net-like networks, and randomly generated ones. Our networks have between 10 and 152 nodes and between 20 and 307 edges connecting the nodes. Depending on the network size, between 5,000 and 90,000 agents with an average height of 170 cm (SD 6.5 cm) are initially released into the network. In each iteration, new agents are regenerated based on the average body height of the previous iteration and, to a certain proportion, corrected by the body heights in the neighboring nodes. After the new agents are generated, a certain number of them migrate into neighboring nodes; the model preferentially lets taller agents migrate into network hubs. Migration is balanced by back migration of the same number of agents from nodes with high centrality measures to less connected nodes. The latter is random as well, but not biased by agent height. Furthermore, the distribution of agents per node and its correlation with the centrality of the nodes are varied in a systematic manner. After 100 iterations, the secular trend, i.e. the gain in body height, is investigated for the different networks in relation to the network properties. Results: We observe an increase in average agent body height after 100 iterations if height-biased migration is enabled.
The increase rate depends on the magnitude of the neighboring-node correction factor, the population distribution, the relationship between the population in the nodes and their centrality, as well as on the network topology. Networks with uniform-like distributions of the agents across the nodes, uncorrelated associations between node centrality and agent number per node, as well as very heterogeneous networks with very different node centralities lead to the largest gains in average body height. Conclusion: Our simulations show that height-biased migration into network hubs can possibly contribute to the secular trend of height increase in the human population. The strength of this "tall by migration" effect depends on the actual properties of the underlying network. This mechanism may also be significant for social networks, where hubs are represented by well-connected individuals and edges by their personal relationships. However, the high number of iterations needed to achieve significant effects in the more natural network structures of our models calls for further studies to test the relevance and real effect sizes in real-world scenarios.
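The "tall by migration" mechanism can be sketched in miniature. The toy model below uses a star network with one hub and height-neutral back migration, as described above; the network size, agent counts and correction weights are illustrative choices, not the paper's parameters:

```python
import random

# Star network: node 0 is the hub, connected to every leaf node.
rng = random.Random(42)
nodes = {i: [rng.gauss(170, 6.5) for _ in range(200)] for i in range(5)}
HUB = 0

for _ in range(100):
    # Reproduction: each node regenerates agents around its own mean,
    # partly corrected toward the mean of neighboring nodes (community effect).
    means = {i: sum(h) / len(h) for i, h in nodes.items()}
    neigh = {i: means[HUB] if i != HUB else
                sum(means[j] for j in nodes if j != HUB) / (len(nodes) - 1)
             for i in nodes}
    for i in nodes:
        target = 0.8 * means[i] + 0.2 * neigh[i]
        nodes[i] = [rng.gauss(target, 6.5) for _ in range(len(nodes[i]))]
    # Height-biased migration: each leaf sends its tallest agent to the hub;
    # the hub returns a randomly chosen agent (height-neutral back migration).
    for i in nodes:
        if i == HUB:
            continue
        tallest = max(nodes[i])
        nodes[i].remove(tallest)
        nodes[HUB].append(tallest)
        nodes[i].append(nodes[HUB].pop(rng.randrange(len(nodes[HUB]))))

final_mean = sum(sum(h) for h in nodes.values()) / sum(len(h) for h in nodes.values())
print(f"mean height after 100 iterations: {final_mean:.1f} cm")
```

Migration alone conserves the population mean; the gain arises because the reproduction step over-weights the (taller) hub through the neighbor correction, which is the community effect described above.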
The North China Plain (NCP) is one of the most productive and intensively farmed agricultural regions in China. High doses of mineral nitrogen (N) fertiliser, often combined with flood irrigation, are applied, resulting in N surpluses, groundwater depletion and environmental pollution. The objectives of this thesis were to use the HERMES model to simulate the N cycle in winter wheat (Triticum aestivum L.)–summer maize (Zea mays L.) double crop rotations and to show the performance of the HERMES model, of the new ammonia volatilisation sub-module and of the new nitrification inhibition tool in the NCP. Further objectives were to assess the model's potential to save N and water at plot and county scale, in both the short and the long term. Additionally, improved management strategies were to be identified with the help of a model-based nitrogen fertiliser recommendation (NFR) and adapted irrigation.
Results showed that the HERMES model performed well under the growing conditions of the NCP and was able to describe the relevant processes related to soil–plant interactions concerning N and water during a 2.5-year field experiment. No differences in grain yield could be found between the real-time model-based NFR and the other treatments of the plot-scale experiments in Quzhou County. Simulations with increasing amounts of irrigation resulted in significantly higher N leaching, higher N requirements of the NFR and reduced yields. Thus, conventional flood irrigation as currently practised by the farmers bears great uncertainties, and exact irrigation amounts should be known for future simulation studies. In the best-practice scenario simulation at plot scale, N input and N leaching, but also irrigation water, could be reduced strongly within 2 years. The model-based NFR in combination with adapted irrigation therefore had the highest potential to reduce nitrate leaching, compared to farmers' practice and mineral N (Nmin)-reduced treatments. The calibrated and validated ammonia volatilisation sub-module of the HERMES model also worked well under the climatic and soil conditions of northern China, and simple ammonia volatilisation approaches gave satisfactory results compared to process-oriented approaches. In the simulation with ammonium sulphate nitrate with nitrification inhibitor (ASNDMPP), ammonia volatilisation was higher than in the simulation without the inhibitor, while the result for nitrate leaching was the opposite. Although nitrification worked well in the model, nitrification-borne nitrous oxide emissions should be considered in the future. Simulated annual long-term (31 years) N losses for the whole of Quzhou County in Hebei Province were 296.8 kg N ha−1 under the common farmers' practice treatment and 101.7 kg N ha−1 under the optimised treatment including NFR and automated irrigation (OPTai).
Spatial differences in simulated N losses throughout Quzhou County could only be attributed to different N inputs. Simulations of an optimised treatment could save on average more than 260 kg N ha−1 a−1 of fertiliser input and 190 kg N ha−1 a−1 of N losses, as well as around 115.7 mm a−1 of water, compared to farmers' practice. These long-term simulation results showed a lower N and water saving potential than the short-term simulations and underline the necessity of long-term simulations to overcome the effect of high initial N stocks in the soil.
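The quoted loss rates imply the avoided-loss figure directly. The back-of-the-envelope check below uses only the two simulated loss rates from the text; the county cropland area is a placeholder assumption, not Quzhou County's real figure:

```python
# Per-hectare N-loss rates from the long-term (31-year) simulations.
loss_farmers = 296.8        # kg N / ha / a, common farmers' practice
loss_optai   = 101.7        # kg N / ha / a, optimised treatment (OPTai)
saving_per_ha = loss_farmers - loss_optai
print(f"avoided N losses: {saving_per_ha:.1f} kg N/ha/a")

# Scaling to the county: the area below is an assumed placeholder value.
area_ha = 50_000
print(f"illustrative county-scale saving: {saving_per_ha * area_ha / 1000:.0f} t N/a")
```

The roughly 195 kg N ha−1 a−1 of avoided losses is consistent with the "190 kg N ha−1 a−1" saving quoted above.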
Additionally, the OPTai treatment worked best on clay loam soil, except for a high simulated denitrification loss, while the simulations using farmers' practice irrigation could not match the actual water needs, resulting in yield decline, especially for winter wheat. Thus, a precise adaptation of management to actual weather conditions and plant growth needs is necessary for future simulations. However, the optimised treatments did not seem able to maintain the soil organic matter pools, even with full crop residue input. Extra organic inputs seem to be required to maintain soil quality in the optimised treatments.
With regard to data input requirements, HERMES is a relatively simple model for simulating the N cycle. It can support the interpretation of management options at plot, county and regional scale for extension and research staff. In combination with other N- and water-saving methods, too, the model promises to be a useful tool.