Anthropogenic use has altered many floodplains in Central Europe, with retention areas in particular being severely reduced. While floodplains have long been a focus of scientific study, large knowledge deficits remain on the question of floodplain reactivation. On the one hand, very few such projects have been realized so far; on the other hand, long-term monitoring is necessary to observe how biocoenoses adapt to changed site conditions. Computer-based modelling of landscape development, as realized in the present work, lends itself to analysing the consequences of such interventions. The aim of this work was to use a Geographic Information System (GIS) to depict the development potential of the landscape under different dike-relocation variants at the level of biotope types. The goal was not a universally valid floodplain model but a model for a concrete application case. The approach developed was also intended to be suitable for landscape-planning practice. Areas on the Middle Elbe near Rogätz and Sandau, both in the northern part of Saxony-Anhalt, were chosen as example areas. The present work is divided into two parts. The first part presents the surveys and analyses that form the basis of the model development. For this purpose, the biotope types of the example areas were mapped comprehensively and supplemented by point-based vegetation surveys. Site-ecological data on hydrology and soil science were available from the research project "Rückgewinnung von Retentionsflächen und Altauenreaktivierung an der Mittleren Elbe in Sachsen-Anhalt" of the Federal Ministry of Education and Research (BMBF). The aim of the analysis was to identify key hydrological and soil factors within the recent floodplain that lead to the formation of particular biotope types.
In the second part of the work, a model for biotope-type potentials on the planned dike-relocation areas was developed. The model operates on the database of the GIS files used, which is based on data on the current state and was extended by predictions of site ecology (hydrology and soil) for the dike-relocation case from the BMBF project. A further prerequisite for the modelling was the elaboration of guiding visions (Leitbilder), in which different land-use scenarios for the landscape after dike relocation were hypothetically defined. The intensity of use in particular was varied, from a variant of intensive agricultural and silvicultural use, through so-called integrated development goals from the BMBF project, to a nature-conservation variant. In addition, a future potential natural vegetation was modelled. The model was validated for the recent floodplain under the intensive-use variant, which comes closest to current use. When information on the existing biotope type was included in the model as a correction factor, a hit rate of over 90 % could be achieved for many biotope types. For biotope types of smaller areal extent, this value lay between 20 and 40 % owing to the narrower data basis. As a result, landscape development is available as biotope potential for the different dike variants and guiding visions in the example areas. As a simplified regionalization of the point-based vegetation data, the model tested to what extent the modelled biotope areas correspond to the characteristics of the phytosociological relevés from the recent floodplain. In that case, the plant community was assigned to the respective spatial unit considered ecologically uniform within the study. The biotope prognosis areas can thus be partially specified in phytosociological terms.
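A minimal sketch of the rule-based core of such a biotope-type model. The thresholds, class names, and attribute fields below are invented for illustration; they are not the thesis's actual classification rules:

```python
# Sketch of a rule-based biotope-type assignment from site-ecological key
# factors (hydrology, soil, land use). All thresholds and class names are
# hypothetical illustrations, not the thesis's actual rules.

def assign_biotope(flood_days_per_year, groundwater_depth_m, land_use):
    """Map site factors of one GIS polygon to a biotope-type potential."""
    if flood_days_per_year > 100:
        return "open water / reed bed"
    if flood_days_per_year > 30:
        return "softwood floodplain forest" if land_use == "none" else "wet grassland"
    if groundwater_depth_m < 1.5:
        return "hardwood floodplain forest" if land_use == "none" else "moist grassland"
    return "arable land" if land_use == "intensive" else "mesic grassland"

# Applying the rules to every polygon of a (hypothetical) GIS attribute table:
polygons = [
    {"id": 1, "flood": 120, "gw": 0.2, "use": "none"},
    {"id": 2, "flood": 45,  "gw": 1.0, "use": "intensive"},
    {"id": 3, "flood": 5,   "gw": 3.0, "use": "intensive"},
]
prognosis = {p["id"]: assign_biotope(p["flood"], p["gw"], p["use"]) for p in polygons}
```

Validation against the mapped current state, as described above, would then amount to comparing such predictions with the surveyed biotope types polygon by polygon.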
The present work is one of the few studies to date that address the consequences of floodplain reactivation for landscape development. It demonstrates a way to design prognosis models for biotope types and vegetation on the basis of limited field surveys. Such models can contribute to the understanding of interventions in the natural balance, such as dike relocations, and support impact assessment.
Systems of Systems (SoS) have received a lot of attention recently. In this thesis we focus on SoS that are built atop the techniques of Service-Oriented Architectures and thus combine the benefits and challenges of both paradigms. For this thesis we understand SoS as ensembles of single autonomous systems that are integrated into a larger system, the SoS. The interesting fact about these systems is that the previously isolated systems are still maintained, improved and developed on their own. Structural dynamics is an issue in SoS, as systems can join and leave the ensemble at any point in time. This, together with the fact that the cooperation among the constituent systems is not necessarily observable, means that we consider these systems as open systems. Of course, the system has a clear boundary at each point in time, but this boundary could only be identified by halting the complete SoS; halting a system of that size, however, is practically impossible. Often SoS are combinations of software systems and physical systems. Hence a failure in the software system can have a serious physical impact, which easily makes an SoS of this kind a safety-critical system. The contribution of this thesis is a modelling approach that extends OMG's SoaML and basically relies on collaborations and roles as an abstraction layer above the components. This allows us to describe SoS at an architectural level. We also give a formal semantics for our modelling approach, which employs hybrid graph-transformation systems. The modelling approach is accompanied by a modular verification scheme that is able to cope with the complexity constraints implied by the SoS's structural dynamics and size. Building such autonomous systems as SoS without evolution at the architectural level (i.e. the adding and removing of components and services) is inadequate. Therefore our approach directly supports the modelling and verification of evolution.
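As an illustrative sketch only (the class names below are invented; the thesis itself works with SoaML models and hybrid graph-transformation systems), the role-and-collaboration abstraction and the structural dynamics of an open ensemble might look like this:

```python
# Illustrative sketch: constituent systems join and leave an open SoS at
# run time, and a collaboration is checked against the roles it requires.
# All names are invented; this is not the thesis's SoaML formalism.

class ConstituentSystem:
    def __init__(self, name, roles):
        self.name = name
        self.roles = set(roles)

class Collaboration:
    """A collaboration is enabled when every required role is filled."""
    def __init__(self, required_roles):
        self.required_roles = set(required_roles)

    def is_enabled(self, ensemble):
        filled = set().union(*(s.roles for s in ensemble)) if ensemble else set()
        return self.required_roles <= filled

transport = Collaboration({"shuttle", "dispatcher"})
ensemble = [ConstituentSystem("shuttle-1", {"shuttle"})]
before = transport.is_enabled(ensemble)           # dispatcher role still missing
ensemble.append(ConstituentSystem("control", {"dispatcher"}))
after = transport.is_enabled(ensemble)            # a system joined at run time
```

The point of the abstraction is that verification can argue about roles and collaborations without enumerating the concrete, ever-changing set of components.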
In this work, an approach to paleoclimate reconstruction for tropical East Africa is presented. After a short summary of modern climate conditions in the tropics and the peculiarities of the East African climate, the potential of reconstructing climate from paleolake sediments is discussed. As demonstrated, the hydrologic sensitivity of high-elevation closed-basin lakes in the Central Kenya Rift offers valuable guarantees for the establishment of long-term climate records. Temporal fluctuations of the limnological characteristics preserved in the lake sediments are used to identify variations in the Quaternary climate history. Based on diatom analyses in radiocarbon- and 40Ar/39Ar-dated sediments, a chronology of paleoecologic fluctuations is developed for the Central Kenya Rift lakes Nakuru, Elmenteita and Naivasha. At least during the penultimate interglacial (around 140 to 60 kyr BP) and during the last interglacial (around 12 to 4 kyr BP), these lakes experienced several transgression-regression cycles at time intervals of about 11,000 years. Additionally, a long-term trend of lake evolution is found, suggesting a general succession from deep freshwater lakes towards more saline waters during the last million years. Using ecologic transfer functions and a simple lake-balance model, the observed paleohydrologic fluctuations are linked to potential precipitation-evaporation changes in the lake basins. Although tectonic influences on the drainage pattern and the effect of varied seepage are also investigated, it can be shown that even a small increase in precipitation of about 30±10 % may have affected the hydrologic budget of the intra-rift lakes within the reconstructed range. The findings of this study help to assess the natural climate variability of East Africa. They furthermore reflect the sensitivity of the Central Kenya Rift lakes to fluctuations of large-scale climate parameters, such as solar radiation and sea-surface temperatures of the Indian Ocean.
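A minimal sketch of the logic behind such a lake-balance model, assuming a closed basin and purely illustrative parameter values (not the reconstructed figures for the rift lakes):

```python
# Closed-basin lake-balance sketch: the equilibrium lake area is reached
# when net evaporation from the lake surface balances the runoff input
# from the catchment. All parameter values are illustrative placeholders.

def equilibrium_lake_area(catchment_area_km2, precip_m, evap_m, runoff_coeff):
    """Solve A_lake * (E - P) = A_catchment * P * c for A_lake (km2)."""
    net_input = catchment_area_km2 * precip_m * runoff_coeff
    net_lake_loss = evap_m - precip_m      # evaporation exceeds rainfall
    return net_input / net_lake_loss

today = equilibrium_lake_area(1000.0, 0.9, 1.8, 0.1)
wetter = equilibrium_lake_area(1000.0, 0.9 * 1.3, 1.8, 0.1)   # +30 % precipitation
```

Even this toy balance shows the strong hydrologic sensitivity of closed basins: a 30 % precipitation increase nearly doubles the equilibrium lake area, because it simultaneously raises the input and lowers the net loss per unit lake surface.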
Biomimicry is the art of mimicking nature to overcome a particular technical or scientific challenge. The approach studies how evolution has found solutions to the most complex problems in nature, which makes it a powerful method for science. In combination with the rapid development of manufacturing and information technologies into the digital age, structures and materials that were previously thought unrealizable can now be created with a simple sketch and the touch of a button. The primary goal of this doctoral thesis was to investigate how digital tools, such as programming, modelling, 3D-design tools and 3D-printing, with the help of biomimicry, could lead to new analysis methods in science and new medical devices in medicine.
The Electrical Discharge Machining (EDM) process is commonly applied to shape or mold hard metals that are difficult to work with conventional machinery. A workpiece submerged in an electrolyte is deformed while in close vicinity to an electrode. When a high voltage is applied between the workpiece and the electrode, sparks are generated that create cavities in the substrate; the removed material is flushed away by the electrolyte. Such surfaces are usually analysed in terms of roughness; in this work, a novel curvature analysis method is presented as an alternative. In addition, to better understand how the surface changes over the processing time of the EDM process, a digital impact model was created that carves craters and ridges into an originally flat substrate. These substrates were then analysed with the curvature analysis method at different processing times of the model. It was found that a substrate reaches an equilibrium at around 10,000 impacts. The proposed curvature analysis method has potential for the design of new cell-culture substrates for stem cells.
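A minimal sketch of such a digital impact model, with arbitrary grid size, crater shape and impact count (not the parameters of the thesis model):

```python
import random

# Sketch of a digital EDM impact model: random spark impacts carve
# bowl-shaped craters into an initially flat height field and pile a
# small ridge around each rim. All parameters are illustrative.

def simulate_edm(n_impacts, size=64, radius=3, depth=1.0, seed=42):
    rng = random.Random(seed)
    h = [[0.0] * size for _ in range(size)]
    for _ in range(n_impacts):
        cx, cy = rng.randrange(size), rng.randrange(size)
        for x in range(max(0, cx - radius), min(size, cx + radius + 1)):
            for y in range(max(0, cy - radius), min(size, cy + radius + 1)):
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                if d2 <= radius ** 2:
                    h[x][y] -= depth * (1 - d2 / radius ** 2)   # crater bowl
                elif d2 <= (radius + 1) ** 2:
                    h[x][y] += 0.2 * depth                      # rim ridge
    return h

surface = simulate_edm(500)
roughness = max(max(row) for row in surface) - min(min(row) for row in surface)
```

Running such a simulation for increasing impact counts and analysing the resulting height field (here via curvature, in the thesis's case) is how an equilibrium surface state can be detected.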
The Venus flytrap can shut its jaws at an amazing speed. The shutting mechanism may be interesting for use in science and is an example of a so-called mechanically bi-stable system: there are two stable states. In this work, two truncated-pyramid structures were modelled using a non-linear mechanical model called the Chained Beam Constraint Model (CBCM). The structure with a slope angle of 30 degrees is not bi-stable, whereas the structure with a slope angle of 45 degrees is bi-stable. Developing this idea further with PEVA, which has a shape-memory effect, the structure that is not bi-stable could be programmed to be bi-stable and then switched back again. This could be used as an energy-storage system. Another species with an interesting mechanism is the tapeworm. Some species of this animal have a crown of hooks and suckers located on its side. The parasite is commonly found in the lower intestine of mammals and attaches to the intestinal walls using its suckers. When the tapeworm has found a suitable spot, it ejects its hooks and permanently attaches to the wall. This function could be used in minimally invasive medicine to gain better control of implants during the implantation process. Using the CBCM and a 3D printer capable of tuning how hard or soft a printed part is, a design strategy was developed to investigate how one could create a device that mimics the tapeworm. In the end, a prototype was created that was able to attach to a pork loin at an underpressure of 20 kPa and to eject its hooks at an underpressure of 50 kPa or above.
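The notion of bi-stability can be illustrated with a generic double-well energy landscape. The quartic potential below is a stand-in for illustration, not the CBCM itself:

```python
# A bi-stable system has two local minima in its energy landscape.
# This generic double-well potential stands in for the CBCM mechanics;
# it is an illustration, not the thesis's structural model.

def potential(x, a=1.0, b=2.0):
    return a * x**4 - b * x**2          # double well for b > 0

def local_minima(f, lo=-2.0, hi=2.0, step=0.01):
    """Brute-force scan for local minima of f on a 1-D grid."""
    xs = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    ys = [f(x) for x in xs]
    return [round(xs[i], 2) for i in range(1, len(xs) - 1)
            if ys[i] < ys[i - 1] and ys[i] < ys[i + 1]]

stable_states = local_minima(potential)             # two stable states
mono = local_minima(lambda x: potential(x, b=-1))   # b < 0: a single well
```

Changing the sign of the quadratic term plays the role of the slope angle (or the PEVA shape-memory programming) in the thesis: it switches the structure between one and two stable states.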
These three projects demonstrate how digital tools and biomimicry can be used together to arrive at applicable solutions in science and in medicine.
Research-based learning (forschendes Lernen) and the digital transformation are two of the most important influences on the development of higher-education didactics in the German-speaking world. While research-based learning, as a normative theory, describes what ought to be done, digital tools, old and new alike, determine what can be done in many areas.
In the present work, a process model is established that attempts to systematize research-based learning with regard to interactive, group-based processes. Based on the developed model, a software prototype was implemented that can accompany the entire research process. Group formation, feedback and reflection processes, and peer assessment are supported with educational technologies. The developments were deployed in a qualitative experiment in order to gain systemic knowledge about the possibilities and limits of digital support for research-based learning.
Between Modelling and Stakeholder Participation: Knowledge Production in Energy Transition Research
(2023)
The decarbonization of the energy system is part of the CO2-reduction strategy agreed internationally under the Paris Climate Agreement to combat climate change. Following the negotiations and the adoption of the climate targets, political decision-makers worldwide now face the question of how to achieve them. This produces a high political demand for knowledge about the direct and indirect effects of different instruments and the potential development pathways of an energy transition. This societal need for scientific answers on solution options has been taken up within climate impact research, more precisely climate policy impact research. The relatively new branch of energy transition research has developed worldwide, but faces a double challenge. First, the object of research does not exist in a vacuum but within economic, social and political contexts, here called societal embeddedness; for the question of how the energy transition can be achieved is also debated outside science and thus constitutes an arena in which different interests and narratives are negotiated. Second, the object under investigation lies in the future, here subsumed under the term structural non-knowledge. These two conditions mean that conventional methods of empirical social research do not apply, and that an opening and transformation of science towards new methods is needed (Nowotny 2001, Ravetz 2006, Schneidewind 2013). In this work I examine two ways in which energy transition research deals with the challenge of producing knowledge under the conditions of structural non-knowledge and societal embeddedness. On the one hand, this is done by involving stakeholders, i.e. non-scientific actors, in the research process.
On the other hand, the use of complex econometric models to calculate implications and energy-economic development pathways has become a central means of knowledge generation in energy transition research. The structural condition of non-knowledge, understood as a problem, is thereby countered insofar as the results of stakeholder involvement and of modelling work undoubtedly provide new knowledge. There is disagreement, however, about what this knowledge actually refers to: Are stakeholders bringing interests or legitimate perspectives into the research process, and are models simplified representations of the world or expressions of the modeller's own conceptions?
Water shortage is a serious threat for many societies worldwide. In drylands, water management measures such as the construction of reservoirs are affected by eroded sediments transported in the rivers. Thus, the capability of assessing water and sediment fluxes at the river-basin scale is of vital importance to support management decisions and policy making. This subject was addressed by the DFG-funded SESAM project (Sediment Export from large Semi-Arid catchments: Measurements and Modelling). As part of this project, this thesis focuses on (1) the development and implementation of an erosion module for a meso-scale catchment model, (2) the development of upscaling and generalization methods for the parameterization of such a model, (3) the execution of measurements to obtain data required for the modelling and (4) the application of the model to different study areas and its evaluation. The research was carried out in two meso-scale dryland catchments in NE Spain: Ribera Salada (200 km²) and Isábena (450 km²). Addressing objective 1, WASA-SED, a spatially semi-distributed model for water and sediment transport at the meso-scale, was developed. The model simulates runoff and erosion processes at the hillslope scale, transport processes of suspended and bedload fluxes in the river reaches, and retention and remobilisation processes of sediments in reservoirs. This thesis introduces the model concept, presents current model applications and discusses its capabilities and limitations. Modelling at larger scales faces the dilemma of describing relevant processes while maintaining a manageable demand for input data and computation time. WASA-SED addresses this challenge by employing an innovative catena-based upscaling approach: the landscape is represented by characteristic toposequences. For deriving these toposequences with regard to multiple attributes (e.g.
topography, soils, vegetation), the LUMP algorithm (Landscape Unit Mapping Program) was developed, addressing objective 2. It incorporates an algorithm to retrieve representative catenas and their attributes, based on a Digital Elevation Model and supplemental spatial data. These catenas are classified to provide the discretization for the WASA-SED model. For objective 3, water and sediment fluxes were monitored at the catchment outlet of the Isábena and at some of its sub-catchments. For sediment-yield estimation, the intermittent measurements of suspended sediment concentration (SSC) had to be interpolated. This thesis presents a comparison of traditional sediment rating curves (SRCs), generalized linear models (GLMs) and non-parametric regression using Random Forests (RF) and Quantile Regression Forests (QRF). The observed SSCs are highly variable and range over six orders of magnitude. For these data, traditional SRCs performed poorly, as did GLMs, despite including other relevant process variables (e.g. rainfall intensities, discharge characteristics). RF and QRF proved to be very robust and performed favourably in reproducing the sediment dynamics. QRF additionally excels at providing estimates of the accuracy of the predictions. Subsequent analysis showed that most of the sediment was exported during intense storms of late summer; later floods yielded successively less sediment. Comparing sediment generation to yield at the outlet suggested considerable storage effects within the river channel. Addressing objective 4, the WASA-SED model was parameterized for the two study areas in NE Spain and applied with different foci. For the Ribera Salada, the uncalibrated model yielded reasonable results for runoff and sediment. It provided quantitative measures of the change in runoff and sediment yield for different land uses. Additional land-management scenarios were presented and compared to impacts caused by climate-change projections.
In contrast, the application to the Isábena focussed on exploring the full potential of the model's predictive capabilities. The calibrated model achieved acceptable performance for the validation period in terms of water and sediment fluxes. The inadequate representation of the lower sub-catchments considerably reduced model performance, while results for the headwater catchments showed good agreement despite stark contrasts in sediment yield. In summary, the application of WASA-SED to three catchments proved the model framework to be a practicable multi-scale approach. It successfully links the hillslope to the catchment scale and integrates the three components hillslope, river and reservoir in one model. Thus, it provides a feasible approach for tackling issues of water and sediment yield at the meso-scale. The crucial role of processes like transmission losses and sediment storage in the river has been identified. Further advances can be expected when the representation of the connectivity of water and sediment fluxes (intra-hillslope, hillslope-river, intra-river) is refined and input data improve.
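A traditional sediment rating curve of the kind compared above is a power law, SSC = a * Q^b, fitted as a linear regression in log-log space. A minimal sketch on synthetic data (not the Isábena record, whose much larger scatter is exactly why the thesis turns to Random Forests and Quantile Regression Forests):

```python
import math
import random

# Sketch of a classical sediment rating curve: SSC = a * Q**b, fitted by
# ordinary least squares in log-log space. The data are synthetic.

def fit_rating_curve(discharge, ssc):
    lx = [math.log(q) for q in discharge]
    ly = [math.log(c) for c in ssc]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    return a, b          # SSC is estimated as a * Q**b

rng = random.Random(0)
q = [rng.uniform(0.5, 50.0) for _ in range(200)]              # discharge
c = [2.0 * qi ** 1.5 * math.exp(rng.gauss(0, 0.1)) for qi in q]  # noisy SSC
a, b = fit_rating_curve(q, c)
```

With mild log-normal noise the power law is recovered well; with scatter over six orders of magnitude, as observed in the field data, such a single global curve breaks down.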
Monoclonal antibodies (mAbs) are an innovative group of drugs with increasing clinical importance in oncology, combining high specificity with generally low toxicity. There are, however, numerous challenges associated with the development of mAbs as therapeutics. A mechanistic understanding of the factors that govern the pharmacokinetics (PK) of mAbs is critical for drug development and the optimisation of effective therapies; in particular, adequate dosing strategies can improve patients' quality of life and lower drug costs. Physiologically-based PK (PBPK) models offer a physiological and mechanistic framework, which is of advantage in the context of animal-to-human extrapolation. Unlike for small-molecule drugs, however, there is no consensus on how to model mAb disposition in a PBPK context. Current PBPK models for mAb PK vary hugely in their representation of physiology and in their parameterisation. Their complexity poses a challenge for their application, e.g., in translating knowledge from animal species to humans.
In this thesis, we developed and validated a consensus PBPK model for mAb disposition that takes into account recent insights into mAb distribution (antibody biodistribution coefficients and interstitial immunoglobulin G (IgG) pharmacokinetics) to predict tissue PK across several pre-clinical species and humans based on plasma data only. The model allows a priori prediction of target-independent (unspecific) mAb disposition processes as well as of mAb disposition in concentration ranges in which the unspecific clearance (CL) dominates target-mediated CL processes. This is often the case for mAb therapies at steady-state dosing.
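A heavily simplified sketch of a disposition model in this spirit, with one plasma and one lumped tissue compartment and an antibody biodistribution coefficient (ABC) relating tissue to plasma concentration. All parameter values are illustrative placeholders, not the consensus model itself:

```python
# Heavily simplified sketch of mAb distribution: one plasma and one lumped
# tissue compartment; the tissue equilibrates towards ABC * plasma
# concentration. All parameter values are illustrative, not the thesis model.

def simulate(dose, v_plasma=3.0, v_tissue=10.0, flow=1.0, abc=0.3,
             cl_plasma=0.2, t_end=200.0, dt=0.01):
    c_p, c_t = dose / v_plasma, 0.0
    for _ in range(int(t_end / dt)):
        # exchange drives the tissue towards its steady-state ratio (ABC)
        transfer = flow * (c_p - c_t / abc)
        c_p += (-transfer - cl_plasma * c_p) / v_plasma * dt   # Euler step
        c_t += transfer / v_tissue * dt
    return c_p, c_t

c_plasma, c_tissue = simulate(dose=300.0)
ratio = c_tissue / c_plasma     # approaches the ABC in the terminal phase
```

The tissue-to-plasma ratio settling near the ABC is what makes plasma-only data sufficient for predicting tissue PK in such a framework.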
The consensus PBPK model was then used and refined to address two important problems:
1) Immunodeficient mice are crucial models for evaluating mAb efficacy in cancer therapy. Protection from elimination by binding to the neonatal Fc receptor is known to be a major pathway influencing the unspecific CL of both endogenous and therapeutic IgG. The concentration of endogenous IgG, however, is reduced in immunodeficient mouse models, and the resulting effect on unspecific mAb CL is unknown, yet of great importance for the extrapolation to humans in the context of mAb cancer therapy.
2) The distribution of mAbs into solid tumours is of great interest. To comprehensively investigate mAb distribution within tumour tissue and its implications for therapeutic efficacy, we extended the consensus PBPK model by a detailed tumour distribution model incorporating a cell-level model for mAb-target interaction. We studied the impact of variations in tumour microenvironment on therapeutic efficacy and explored the plausibility of different mechanisms of action in mAb cancer therapy.
The mathematical findings and observed phenomena shed new light on therapeutic utility and dosing regimens in mAb cancer treatment.
This work presents mathematical and computational approaches covering various aspects of metabolic network modelling, especially regarding the limited availability of detailed kinetic knowledge on reaction rates. It is shown that precise mathematical formulations of problems are needed i) to find appropriate and, if possible, efficient algorithms to solve them, and ii) to determine the quality of the approximate solutions found. Furthermore, some means are introduced to gain insights into the dynamic properties of metabolic networks, either directly from the network structure or by additionally incorporating steady-state information. Finally, an approach to identifying key reactions in metabolic networks is introduced, which helps to develop simple yet useful kinetic models. The rise of novel techniques renders genome sequencing increasingly fast and cheap. In the near future, this will allow biological networks to be analyzed not only for species but also for individuals. Hence, the automatic reconstruction of metabolic networks presents itself as a means of evaluating this huge amount of experimental data. A mathematical formulation as an optimization problem is presented, taking into account existing knowledge and experimental data as well as the probabilistic predictions of various bioinformatical methods. The reconstructed networks are optimized to have large connected components of high accuracy, hence avoiding fragmentation into small isolated subnetworks. The usefulness of this formalism is exemplified by the reconstruction of the sucrose biosynthesis pathway in Chlamydomonas reinhardtii. The problem is shown to be computationally demanding and therefore necessitates efficient approximation algorithms. The problem of minimal nutrient requirements for genome-scale metabolic networks is then analyzed.
Given a metabolic network and a set of target metabolites, the inverse scope problem has as its objective the determination of a minimal set of metabolites that have to be provided in order to produce the target metabolites. These target metabolites might stem from experimental measurements and therefore be known to be produced by the metabolic network under study, or be given as the desired end-products of a biotechnological application. The inverse scope problem is shown to be computationally hard to solve. However, I conjecture that the complexity strongly depends on the number of directed cycles within the metabolic network, which might guide the development of efficient approximation algorithms. Assuming mass-action kinetics, chemical reaction network theory (CRNT) allows conclusions about multistability to be drawn directly from the structure of metabolic networks. Although CRNT was originally based on mass-action kinetics, it is shown how to incorporate further reaction schemes by emulating molecular enzyme mechanisms. CRNT is used to compare several models of the Calvin cycle, which differ in size and level of abstraction. Definite results are obtained for small models, but the available set of theorems and algorithms provided by CRNT cannot be applied to larger models due to the computational limitations of the currently available implementations. Given the stoichiometry of a metabolic network together with steady-state fluxes and concentrations, structural kinetic modelling allows the dynamic behavior of the metabolic network to be analyzed even if the explicit rate equations are not known. In particular, this sampling approach is used to study the stabilizing effects of allosteric regulation in a model of human erythrocytes. Furthermore, the reactions of that model can be ranked according to their impact on the stability of the steady state.
The most important reactions in that respect are identified as hexokinase, phosphofructokinase and pyruvate kinase, which are known to be highly regulated and almost irreversible. Kinetic modelling approaches using standard rate equations are compared and evaluated against reference models for erythrocytes and hepatocytes. These simplified kinetic models reproduce the temporal behavior acceptably for small changes around a given steady state, but fail to capture important characteristics for larger changes. The aforementioned approach of ranking reactions according to their influence on stability is used to identify a small number of key reactions. These reactions are modelled in detail, including knowledge about allosteric regulation, while all other reactions are still described by simplified rate equations. The resulting so-called hybrid models can capture the characteristics of the reference models significantly better than the simplified models alone. Such hybrid models might serve as a good starting point for the kinetic modelling of genome-scale metabolic networks, as they provide reasonable results even when experimental data, for instance on allosteric regulation, are lacking for the vast majority of enzymatic reactions.
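The scope concept underlying the inverse scope problem above (all metabolites producible from a seed set by iterative network expansion) can be sketched on an invented toy network:

```python
# Network expansion ("scope") sketch: starting from seed metabolites,
# repeatedly fire every reaction whose substrates are all available and
# add its products, until a fixed point is reached. The toy network below
# is invented for illustration; the inverse scope problem then asks for a
# minimal seed set whose scope contains given targets.

def scope(reactions, seeds):
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if set(substrates) <= available and not set(products) <= available:
                available |= set(products)
                changed = True
    return available

toy_network = [
    (["glc", "atp"], ["g6p", "adp"]),
    (["g6p"], ["f6p"]),
    (["f6p", "atp"], ["fbp", "adp"]),
]
with_atp = scope(toy_network, ["glc", "atp"])   # reaches fbp
without_atp = scope(toy_network, ["glc"])       # stuck: no reaction can fire
```

Computing a scope is cheap; the hardness of the inverse problem comes from searching over candidate seed sets.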
The business problem of inefficient processes, imprecise process analyses and simulations, and non-transparent artificial neuronal network models can be overcome by an easy-to-use modeling concept. With the aim of developing a flexible and efficient approach to modeling, simulating, and optimizing processes, this work proposes a flexible Concept of Neuronal Modeling (CoNM). The modeling concept, which is described by the modeling language designed for it and its mathematical formulation, and which is connected to a technical substantiation, is based on a collection of novel sub-artifacts. As these have been implemented as a computational model, the set of CoNM tools carries out novel kinds of Neuronal Process Modeling (NPM), Neuronal Process Simulation (NPS), and Neuronal Process Optimization (NPO). The efficacy of the designed artifacts was demonstrated rigorously by means of six experiments and a simulator of real industrial production processes.