Between 1990 and 1994, around 1,000 properties in the former GDR that had been used by the Soviet Army and the NVA for military exercises were handed over to the federal and state governments. The largest military training areas lie in Brandenburg; some of them are now partly integrated into large protected areas, while others are still actively used by the Bundeswehr. As a consequence of military operations, the soils of these training areas are often contaminated with unexploded ordnance, munition residues, fuel and lubricant residues, and in some cases even chemical warfare agents. However, in addition to the areas contaminated by munitions and military exercises, almost all of these properties also contain areas of high conservation value; in the open-land areas in particular, conservation value and ordnance contamination can well coincide. These open areas, which include dwarf-shrub heaths, dry grasslands, desert-like sand plains and other nutrient-poor treeless habitats, are characterized by their large extent, their seclusion, and their particular use and management, i.e. the absence of agriculture, forestry and settlements. These characteristics formed the basis for the development of a specially adapted flora and fauna. After military use ended, large-scale succession (the gradual change in the composition of plant and animal communities) set in across wide areas; it has already transformed some of these open areas into forest and thus made them disappear, which in turn led to the loss of the animal and plant species bound to these open-land areas. To preserve, shape and develop these open areas, an interdisciplinary group of natural scientists therefore examined various methods and concepts for their respective effectiveness, so that the measures suited to the respective site conditions could finally be initiated.
Initiating these measures requires, on the one hand, knowledge of the respective site conditions (the current state) and, on the other, knowledge of how the areas are developing (their dynamics). This makes it possible to estimate the future development of the areas so that measures can be deployed efficiently. Geographic information systems (GIS) play a decisive role in the digital documentation of biotope and land-use types, since they can process large volumes of spatially and temporally referenced geometry and attribute data. A domain-specific GIS for military training areas was therefore designed and implemented. The work comprised the design of the database and the object model as well as domain-specific modelling, analysis and presentation functions. For integrating domain data into the GIS database, a metadata catalogue was also developed and made available as an additional GIS tool. The base data for the GIS were derived from remote sensing data, topographic maps and field surveys. As an instrument for estimating future development, the simulation tool AST4D was developed; it can use the (raster) data of the GIS as input for simulations, feed simulation results back into the GIS, and visualize the data with spatial reference. The mathematical construct underlying the tool is a so-called cellular automaton, with which the development of the areas can be simulated under different assumptions. This made it possible to build scenarios, i.e. to simulate the development with different (known) input parameters and the resulting different (unknown) end states. Before running any of the three simulation levels available in AST4D, user-specific settings can be made, adapted to the respective study area.
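The cellular-automaton idea underlying a tool like AST4D can be illustrated with a minimal sketch. The states, neighbourhood rule and parameters below are assumptions for illustration only, not the actual AST4D model: each raster cell holds a vegetation state, and succession advances a cell by one stage when enough of its neighbours are already at a later stage.

```python
import numpy as np

# Minimal cellular-automaton sketch of open-land succession.
# States and the transition rule are hypothetical, NOT AST4D's:
# 0 = bare sand, 1 = dry grassland, 2 = heath, 3 = forest
rng = np.random.default_rng(0)

def step(grid):
    """Advance succession by one time step.

    A cell moves to the next state if at least three of its eight
    neighbours are already at that state or beyond.
    """
    padded = np.pad(grid, 1, mode="edge")
    new = grid.copy()
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            nbrs = padded[i:i + 3, j:j + 3]      # 3x3 neighbourhood
            target = grid[i, j] + 1
            # The centre cell never counts: it is always below `target`.
            if target <= 3 and (nbrs >= target).sum() >= 3:
                new[i, j] = target
    return new

grid = rng.integers(0, 4, size=(20, 20))
for _ in range(5):
    grid = step(grid)
print(grid.mean())  # states only increase, so the mean drifts toward forest (3)
```

Running such a rule from different initial rasters (e.g. GIS-derived biotope maps) with different parameters is exactly what makes scenario building possible: known inputs, initially unknown end states.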
A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be striven for.
Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative view on the functioning of lake ecosystems. We end with a set of specific recommendations that may be of help in the further development of lake ecosystem models.
Business Process Management has become an integral part of modern organizations in the private and public sector for improving their operations. In the course of Business Process Management efforts, companies and organizations assemble large process model repositories with many hundreds or thousands of business process models that carry a large amount of information. With the advent of large business process model collections, new challenges arise, such as structuring and managing a large number of process models, maintaining them, and assuring their quality.
This need is addressed by business process architectures, which have been introduced for organizing and structuring business process model collections. A variety of business process architecture approaches have been proposed that align business processes along aspects of interest, e.g., goals, functions, or objects. They provide a high-level categorization of single processes while ignoring their interdependencies, thus hiding valuable information. The production of goods or the delivery of services is often realized by a complex system of interdependent business processes. Hence, taking a holistic view of business process interdependencies becomes a major necessity to organize and analyze processes and to assess the impact of their design and redesign. Visualizing business process interdependencies reveals hidden and implicit information in a process model collection.
In this thesis, we present a novel Business Process Architecture approach for representing and analyzing business process interdependencies on an abstract level. We propose a formal definition of our Business Process Architecture approach, design correctness criteria, and develop analysis techniques for assessing their quality. We describe a methodology for applying our Business Process Architecture approach top-down and bottom-up. This includes techniques for Business Process Architecture extraction from, and decomposition to, process models, while considering consistency issues between the business process architecture and the process model level. Building on our extraction algorithm, we present a novel technique to identify and visualize data interdependencies in Business Process Data Architectures. Our Business Process Architecture approach provides business process experts, managers, and other users of a process model collection with an overview that allows reasoning about a large set of process models and understanding and analyzing their interdependencies in a facilitated way. In this regard, we evaluated our Business Process Architecture approach in an experiment and provide implementations of selected techniques.
The literature contains a sizable number of publications where weather types are used to decompose climate shifts or trends into contributions of frequency and mean of those types. They are all based on the product rule, that is, a transformation of a product of sums into a sum of products, the latter providing the decomposition. While there is nothing to argue about the transformation itself, its interpretation as a climate shift or trend decomposition is bound to fail. While the case of a climate shift may be viewed as an incomplete description of a more complex behaviour, trend decomposition indeed produces bogus trends, as demonstrated by a synthetic counterexample with well-defined trends in type frequency and mean. Consequently, decompositions based on that transformation, be it for climate shifts or trends, must not be used.
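The product rule in question can be written out explicitly. With notation assumed here (not taken from the paper): if climate state $k$ has weather-type frequencies $f_i^{(k)}$ and type means $m_i^{(k)}$, the overall mean is $\bar{x}^{(k)} = \sum_i f_i^{(k)} m_i^{(k)}$, and a shift between two states transforms as

```latex
\Delta\bar{x}
  = \bar{x}^{(2)} - \bar{x}^{(1)}
  = \sum_i \left( m_i^{(1)}\,\Delta f_i
                + f_i^{(1)}\,\Delta m_i
                + \Delta f_i\,\Delta m_i \right),
\qquad
\Delta f_i = f_i^{(2)} - f_i^{(1)},\quad
\Delta m_i = m_i^{(2)} - m_i^{(1)}.
```

The first term is commonly read as the "frequency" contribution and the second as the "within-type" contribution; the paper's point is precisely that this reading of the algebraic identity as a physical decomposition of shifts or trends is not justified.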
A numerical framework is developed to study the hysteresis of elastic properties of porous ceramics as a function of temperature. The numerical model can employ experimentally measured crystallographic orientation distributions and coefficient of thermal expansion values. For realistic modeling of the microstructure, Voronoi polygons are used to generate polycrystalline grains. Some grains are treated as voids to represent the material's porosity. To model intercrystalline cracking, cohesive elements are inserted along grain boundaries. Crack healing (recovery of the initial properties) upon closure is taken into account with special cohesive elements implemented in the commercial code ABAQUS. The numerical model can be used to estimate the fracture properties governing the cohesive behavior through an inverse analysis procedure. The model is applied to a porous cordierite ceramic. The obtained fracture properties are further used to successfully simulate general non-linear macroscopic stress-strain curves of cordierite, thereby validating the model.
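The microstructure-generation step can be sketched on a raster. Grain count, porosity and resolution below are assumed values for illustration; the cohesive-element modelling itself lives in ABAQUS and is not reproduced here.

```python
import numpy as np

# Raster sketch of a Voronoi-based porous microstructure
# (assumed parameters; the cohesive-element part is done in ABAQUS
# and is not reproduced here).
rng = np.random.default_rng(42)

n_grains = 50
porosity = 0.1                       # fraction of grains treated as voids
res = 200                            # raster resolution

seeds = rng.random((n_grains, 2))    # grain centres in a unit square
ys, xs = np.mgrid[0:res, 0:res] / res

# Label every pixel with its nearest seed -> Voronoi grains.
d2 = (xs[..., None] - seeds[:, 0]) ** 2 + (ys[..., None] - seeds[:, 1]) ** 2
labels = d2.argmin(axis=-1)

# Mark a random subset of grains as pores and give the remaining grains
# an in-plane orientation (uniform here; a measured orientation
# distribution could be sampled instead).
is_pore = rng.random(n_grains) < porosity
orientation = rng.uniform(0.0, 180.0, n_grains)  # degrees

pore_fraction = is_pore[labels].mean()  # areal porosity of the raster
```

Grain boundaries, where the cohesive elements would be inserted, are the pixel interfaces between differently labelled cells.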
The identification of vulnerabilities in IT infrastructures is crucial for enhancing security, because many incidents result from already known vulnerabilities that could have been resolved. Thus, the initial identification of vulnerabilities has to be used to directly resolve the related weaknesses and mitigate attack possibilities. The nature of vulnerability information requires collecting and normalizing the information prior to any utilization, because it is widely distributed across different sources, each with its own format. Therefore, a comprehensive vulnerability model was defined and different sources were integrated into one database. Furthermore, different analytic approaches were designed and implemented in the HPI-VDB, which directly benefit from the comprehensive vulnerability model and especially from the logical preconditions and postconditions.
Firstly, different approaches to detecting vulnerabilities both in the IT systems of average users and in the corporate networks of large companies are presented. These approaches mainly focus on identifying all installed applications, since this is a fundamental step in the detection. The detection is realized differently depending on the target use case: the experience of the user as well as the layout and capabilities of the target infrastructure are taken into account. Furthermore, a passive, lightweight detection approach was developed that utilizes existing information in corporate networks to identify applications.
In addition, two approaches to representing the results as attack graphs are illustrated by comparing traditional attack graphs with a simplified graph version, which was also integrated into the database. The implementation of these use cases pays particular attention to usability. Besides the analytic approaches, high data quality of the vulnerability information had to be achieved and guaranteed. The problems of receiving incomplete or unreliable vulnerability information are addressed with different correction mechanisms: corrections can be carried out via correlation or lookup mechanisms against reliable sources or identifier dictionaries. Furthermore, a machine-learning-based verification procedure is presented that allows automatic derivation of important characteristics from the textual descriptions of vulnerabilities.
The sharing economy
(2020)
Purpose: Quantitative bibliometric approaches were used to statistically and objectively explore patterns in the sharing economy literature.
Design/methodology/approach: Journal (co-)citation analysis, author (co-)citation analysis, institution citation and co-operation analysis, keyword co-occurrence analysis, document (co-)citation analysis and burst detection analysis were conducted on a bibliometric data set of sharing economy publications.
Findings: Sharing economy research is multi- and interdisciplinary. Journals focused on products liability, organizing framework, profile characteristics, diverse economies, consumption system and everyday life themes. Authors focused on profile characteristics, sharing economy organization, social connections, first principle and diverse economy themes. No institution dominated the research field. Keyword co-occurrence analysis identified organizing framework, tourism industry, consumer behavior, food waste, generous exchange and quality cue as research themes. Document co-citation analysis found research themes relating to the tourism industry, exploring public acceptability, agri-food system, commercial orientation, products liability and social connection. The most cited authors, institutions and documents are reported.
Research limitations/implications: The study did not exclusively focus on publications in top-tier journals. Future studies could run analyses on top-tier journals alone and then on less renowned journals alone. To address the concern of potentially fuzzy results, reviews could focus on business and/or management research alone. Longitudinal reviews conducted over several points in time are warranted. Future reviews could combine qualitative and quantitative approaches.
Originality/value: We contribute by analyzing information relating to the population of all sharing economy articles, and by employing several quantitative bibliometric approaches that enable the identification of trends in the themes and patterns of the growing literature.
The Wisconsin Card Sorting Test (WCST) is used to test higher-level executive functions or switching, depending on the measures chosen in a study and its goal. Many measures can be extracted from the WCST, but how to assign them to specific cognitive skills remains unclear. Thus, the current study first aimed at identifying which measures test the same cognitive abilities. Second, we compared the performance of mono- and multilingual children in the identified abilities because there is some evidence that bilingualism can improve executive functions. We tested 66 monolingual and 56 multilingual (i.e., bi- and trilingual) primary school children (mean age = 109 months) in an online version of the classic WCST. A principal component analysis revealed four factors: problem-solving, monitoring, efficient errors, and perseverations. Because the assignment of measures to factors is only partially coherent across the literature, we identified this as one of the sources of task impurity. In the second part, we calculated regression analyses to test for group differences while controlling for intelligence as a predictor for executive functions and for confounding variables such as age, German lexicon size, and socioeconomic status. Intelligence predicted problem-solving and perseverations. In the monitoring component (measured by the reaction times preceding a rule switch), multilinguals outperformed monolinguals, thereby supporting the view that bi- or multilingualism can improve processing speed related to monitoring.
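The two analysis steps described above can be sketched generically. The data, dimensions and variable names below are synthetic assumptions, not the study's actual pipeline: a principal component analysis on standardized WCST measures, followed by an OLS regression of one component score on group membership while controlling for covariates.

```python
import numpy as np

# Generic two-step sketch (synthetic data; dimensions and variable
# names are assumptions, not the study's actual data or pipeline).
rng = np.random.default_rng(1)
n, n_measures = 122, 8                       # 122 children, 8 WCST measures

X = rng.normal(size=(n, n_measures))         # WCST measures
group = rng.integers(0, 2, n)                # 0 = monolingual, 1 = multilingual
covars = rng.normal(size=(n, 3))             # age, lexicon size, SES (standardized)

# --- Step 1: PCA via SVD on standardized measures ------------------
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()              # variance ratio per component
scores = Z @ Vt.T                            # component scores per child

# --- Step 2: OLS regression of a component on group + covariates ---
y = scores[:, 0]                             # e.g. a "monitoring" component
design = np.column_stack([np.ones(n), group, covars])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
# beta[1] is the group effect adjusted for the covariates.
```

With real data, the factor count would be chosen from the explained-variance ratios (or a parallel analysis), and the group coefficient would be tested for significance.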
Throughfall, that is, the fraction of rainfall that passes through the forest canopy, is strongly influenced by rainfall and forest stand characteristics, which are in turn both subject to seasonal dynamics. Disentangling the complex interplay of these controls is challenging and only possible with long-term monitoring and a large number of throughfall events measured in parallel at different forest stands. We therefore based our analysis on 346 rainfall events across six different forest stands at the long-term terrestrial environmental observatory TERENO Northeast Germany. These forest stands included pure stands of beech, pine and young pine, and mixed stands of oak-beech, pine-beech and pine-oak-beech. Throughfall was overall relatively low, amounting to 54-68% of incident rainfall in summer. Based on the large number of events, it was possible to investigate not only mean or cumulative throughfall but also its statistical distribution. The distributions of throughfall fractions show distinct differences between the three types of forest stands (deciduous, mixed and pine). The distributions of the deciduous stands have a pronounced peak at low throughfall fractions and a secondary peak at high fractions in summer, as well as a pronounced peak at higher throughfall fractions in winter. Interestingly, the mixed stands behave like deciduous stands in summer and like pine stands in winter: their summer distributions are similar to those of the deciduous stands, but the winter peak at high throughfall fractions is much less pronounced. The seasonal comparison further revealed that the woody components and the leaves differed in their throughfall response to incident rainfall, especially at higher rainfall intensities.
These results are of interest for estimating forest water budgets and in the context of hydrological and land surface modelling where poor simulation of throughfall would adversely impact estimates of evaporative recycling and water availability for vegetation and runoff.
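The core event-based quantity behind these distributions is simple to compute. The numbers below are synthetic stand-ins (the event count matches the abstract, but the rainfall values and canopy behaviour are assumptions): per event, the throughfall fraction is throughfall divided by incident rainfall, and its histogram is the distribution discussed above.

```python
import numpy as np

# Sketch of the event-based throughfall-fraction analysis
# (synthetic rainfall and canopy response; only the event count of
# 346 is taken from the text).
rng = np.random.default_rng(7)

rainfall = rng.gamma(shape=2.0, scale=5.0, size=346)        # event rainfall, mm
throughfall = rainfall * rng.uniform(0.3, 0.9, size=346)    # below-canopy, mm

fraction = throughfall / rainfall            # throughfall fraction per event
hist, edges = np.histogram(fraction, bins=np.linspace(0.0, 1.0, 11))
mean_fraction = fraction.mean()
```

Splitting the events by season and stand type before histogramming yields the per-stand seasonal distributions whose peaks are compared in the text.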