TY - JOUR
A1 - Albert, Aviad
A1 - Nicenboim, Bruno
T1 - Modeling sonority in terms of pitch intelligibility with the nucleus attraction principle
JF - Cognitive science : a multidisciplinary journal of anthropology, artificial intelligence, education, linguistics, neuroscience, philosophy, psychology ; journal of the Cognitive Science Society
N2 - Sonority is a fundamental notion in phonetics and phonology, central to many descriptions of the syllable and to various useful predictions in phonotactics. Although widely accepted, sonority lacks a clear basis in speech articulation or perception, given that traditional formal principles in linguistic theory are often exclusively based on discrete units in symbolic representation and are typically not designed to be compatible with auditory perception, sensorimotor control, or general cognitive capacities. In addition, traditional sonority principles also exhibit systematic gaps in empirical coverage. Against this backdrop, we propose the incorporation of symbol-based and signal-based models to adequately account for sonority in a complementary manner. We claim that sonority is primarily a perceptual phenomenon related to pitch, driving the optimization of syllables as pitch-bearing units in all language systems. We suggest a measurable acoustic correlate of sonority in terms of periodic energy, and we provide a novel principle that can account for syllabic well-formedness, the nucleus attraction principle (NAP). We present perception experiments that test our two NAP-based models against four traditional sonority models, and we use a Bayesian data analysis approach to test and compare them. Our symbolic NAP model outperforms all the other models we test, while our continuous bottom-up NAP model is in second place, along with the best-performing traditional models. We interpret the results as providing strong support for our proposals: (i) the designation of periodic energy as the acoustic correlate of sonority; (ii) the incorporation of continuous entities in phonological models of perception; and (iii) the dual-model strategy that separately analyzes symbol-based top-down processes and signal-based bottom-up processes in speech perception.
KW - Sonority
KW - Pitch intelligibility
KW - Periodic energy
KW - Bayesian data analysis
KW - Speech perception
KW - Phonetics and phonology
Y1 - 2022
U6 - https://doi.org/10.1111/cogs.13161
SN - 0364-0213
SN - 1551-6709
VL - 46
IS - 7
PB - Wiley
CY - Hoboken
ER -

TY - JOUR
A1 - Arboleda-Zapata, Mauricio
A1 - Guillemoteau, Julien
A1 - Tronicke, Jens
T1 - A comprehensive workflow to analyze ensembles of globally inverted 2D electrical resistivity models
JF - Journal of applied geophysics
N2 - Electrical resistivity tomography (ERT) aims at imaging the subsurface resistivity distribution and provides valuable information for different geological, engineering, and hydrological applications. To obtain a subsurface resistivity model from measured apparent resistivities, stochastic or deterministic inversion procedures may be employed. Typically, the inversion of ERT data results in non-unique solutions; i.e., an ensemble of different models explains the measured data equally well. In this study, we perform inference analysis of model ensembles generated using a well-established global inversion approach to assess uncertainties related to the non-uniqueness of the inverse problem. Our interpretation strategy starts by establishing model selection criteria based on different statistical descriptors calculated from the data residuals. Then, we perform cluster analysis considering the inverted resistivity models and the corresponding data residuals. Finally, we evaluate model uncertainties and residual distributions for each cluster. To illustrate the potential of our approach, we use a particle swarm optimization (PSO) algorithm to obtain an ensemble of 2D layer-based resistivity models from a synthetic data example and a field data set collected in Loon-Plage, France. Our strategy performs well for both synthetic and field data and allows us to extract different plausible model scenarios with their associated uncertainties and data residual distributions. Although we demonstrate our workflow using 2D ERT data and a PSO-based inversion approach, the proposed strategy is general and can be adapted to analyze model ensembles generated from other kinds of geophysical data and using different global inversion approaches.
KW - Near-surface geophysics
KW - Electrical resistivity tomography
KW - Non-uniqueness
KW - Global inversion
KW - Particle swarm optimization
KW - Ensemble analysis
Y1 - 2021
U6 - https://doi.org/10.1016/j.jappgeo.2021.104512
SN - 0926-9851
SN - 1879-1859
VL - 196
PB - Elsevier
CY - Amsterdam
ER -

TY - JOUR
A1 - Arguello de Souza, Felipe Augusto
A1 - Samprogna Mohor, Guilherme
A1 - Guzman Arias, Diego Alejandro
A1 - Sarmento Buarque, Ana Carolina
A1 - Taffarello, Denise
A1 - Mendiondo, Eduardo Mario
T1 - Droughts in São Paulo
BT - challenges and lessons for a water-adaptive society
JF - Urban water journal
N2 - The literature has suggested that droughts and societies are mutually shaped and that both therefore require a better understanding of their coevolution for risk reduction and water adaptation. Although the São Paulo Metropolitan Region drew attention because of the 2013-2015 drought, this was not the first such event. This paper revisits this event and the 1985-1986 drought to compare the evolution of drought risk management. Documents and hydrological records are analyzed to evaluate the hazard intensity, preparedness, exposure, vulnerability, responses, and mitigation aspects of both events. Although the hazard intensity and exposure of the latter event were larger than those of the former, the delay in policy implementation and the dependency of service areas on a single reservoir exposed the region to higher vulnerability. In addition to the structural and non-structural tools implemented just after the events, this work raises the possibility of rainwater reuse for reducing the stress on reservoirs.
KW - droughts
KW - urban water supply
KW - water crisis
KW - drought risk
KW - paired event analysis
KW - vulnerability
Y1 - 2022
U6 - https://doi.org/10.1080/1573062X.2022.2047735
SN - 1573-062X
SN - 1744-9006
VL - 20
IS - 10
SP - 1682
EP - 1694
PB - Taylor & Francis
CY - London [u.a.]
ER -

TY - JOUR
A1 - Blume, Theresa
A1 - Schneider, Lisa
A1 - Güntner, Andreas
T1 - Comparative analysis of throughfall observations in six different forest stands
BT - Influence of seasons, rainfall and stand characteristics
JF - Hydrological processes
N2 - Throughfall, that is, the fraction of rainfall that passes through the forest canopy, is strongly influenced by rainfall and forest stand characteristics, which are in turn both subject to seasonal dynamics. Disentangling the complex interplay of these controls is challenging and only possible with long-term monitoring and a large number of throughfall events measured in parallel at different forest stands. We therefore based our analysis on 346 rainfall events across six different forest stands at the long-term terrestrial environmental observatory TERENO Northeast Germany. These forest stands included pure stands of beech, pine, and young pine, and mixed stands of oak-beech, pine-beech, and pine-oak-beech. Throughfall was overall relatively low, at 54-68% of incident rainfall in summer. Based on the large number of events, it was possible to investigate not only mean or cumulative throughfall but also its statistical distribution. The distributions of throughfall fractions show distinct differences between the three types of forest stands (deciduous, mixed, and pine). The distributions of the deciduous stands have a pronounced peak at low throughfall fractions and a secondary peak at high fractions in summer, as well as a pronounced peak at higher throughfall fractions in winter. Interestingly, the mixed stands behave like deciduous stands in summer and like pine stands in winter: their summer distributions are similar to those of the deciduous stands, but the winter peak at high throughfall fractions is much less pronounced. The seasonal comparison further revealed that the wooden components and the leaves differed in their throughfall response to incident rainfall, especially at higher rainfall intensities. These results are of interest for estimating forest water budgets and in the context of hydrological and land surface modelling, where poor simulation of throughfall would adversely impact estimates of evaporative recycling and water availability for vegetation and runoff.
KW - forest hydrology
KW - forest stand characteristics
KW - interception
KW - leaf area index
KW - rainfall characteristics
KW - seasonal effects
KW - stratified event analysis
KW - throughfall
KW - tree species effects
Y1 - 2021
U6 - https://doi.org/10.1002/hyp.14461
SN - 0885-6087
SN - 1099-1085
VL - 36
IS - 3
PB - Wiley
CY - Hoboken
ER -

TY - JOUR
A1 - Buljak, Vladimir
A1 - Bruno, Giovanni
T1 - Numerical modeling of thermally induced microcracking in porous ceramics
BT - an approach using cohesive elements
JF - Journal of the European Ceramic Society
N2 - A numerical framework is developed to study the hysteresis of elastic properties of porous ceramics as a function of temperature. The developed numerical model is capable of employing experimentally measured crystallographic orientation distributions and coefficient of thermal expansion values. For realistic modeling of the microstructure, Voronoi polygons are used to generate polycrystalline grains. Some grains are treated as voids to simulate the material's porosity. To model intercrystalline cracking, cohesive elements are inserted along grain boundaries. Crack healing (recovery of the initial properties) upon closure is taken into account with special cohesive elements implemented in the commercial code ABAQUS. The numerical model can be used to estimate the fracture properties governing the cohesive behavior through an inverse analysis procedure. The model is applied to a porous cordierite ceramic. The obtained fracture properties are further used to successfully simulate general non-linear macroscopic stress-strain curves of cordierite, thereby validating the model.
KW - analysis
KW - Cohesive finite elements
KW - Interfacial strength
Y1 - 2018
U6 - https://doi.org/10.1016/j.jeurceramsoc.2018.03.041
SN - 0955-2219
SN - 1873-619X
VL - 38
IS - 11
SP - 4099
EP - 4108
PB - Elsevier
CY - Oxford
ER -

TY - JOUR
A1 - Bürger, Gerd
T1 - A counterexample to decomposing climate shifts and trends by weather types
JF - International Journal of Climatology
N2 - The literature contains a sizable number of publications where weather types are used to decompose climate shifts or trends into contributions of the frequency and mean of those types. They are all based on the product rule, that is, a transformation of a product of sums into a sum of products, the latter providing the decomposition. While there is nothing to argue about the transformation itself, its interpretation as a climate shift or trend decomposition is bound to fail. While the case of a climate shift may be viewed as an incomplete description of a more complex behaviour, trend decomposition indeed produces bogus trends, as demonstrated by a synthetic counterexample with well-defined trends in type frequency and mean. Consequently, decompositions based on that transformation, be it for climate shifts or trends, must not be used.
KW - analysis
KW - climate
KW - statistical methods
Y1 - 2018
U6 - https://doi.org/10.1002/joc.5519
SN - 0899-8418
SN - 1097-0088
VL - 38
IS - 9
SP - 3732
EP - 3735
PB - Wiley
CY - Hoboken
ER -

TY - JOUR
A1 - Czapka, Sophia
A1 - Festman, Julia
T1 - Wisconsin Card Sorting Test reveals a monitoring advantage but not a switching advantage in multilingual children
JF - Journal of experimental child psychology : JECP
N2 - The Wisconsin Card Sorting Test (WCST) is used to test higher-level executive functions or switching, depending on the measures chosen in a study and its goal. Many measures can be extracted from the WCST, but how to assign them to specific cognitive skills remains unclear. Thus, the current study first aimed at identifying which measures test the same cognitive abilities. Second, we compared the performance of mono- and multilingual children on the identified abilities, because there is some evidence that bilingualism can improve executive functions. We tested 66 monolingual and 56 multilingual (i.e., bi- and trilingual) primary school children (mean age = 109 months) in an online version of the classic WCST. A principal component analysis revealed four factors: problem solving, monitoring, efficient errors, and perseverations. Because the assignment of measures to factors is only partially coherent across the literature, we identified this as one of the sources of task impurity. In the second part, we calculated regression analyses to test for group differences while controlling for intelligence as a predictor of executive functions and for confounding variables such as age, German lexicon size, and socioeconomic status. Intelligence predicted problem solving and perseverations. In the monitoring component (measured by the reaction times preceding a rule switch), multilinguals outperformed monolinguals, thereby supporting the view that bi- or multilingualism can improve processing speed related to monitoring.
KW - Executive functions
KW - Switching
KW - Monitoring
KW - Multilingualism
KW - Factor analysis
KW - Bilingual advantage
Y1 - 2021
U6 - https://doi.org/10.1016/j.jecp.2020.105038
SN - 0022-0965
SN - 1096-0457
VL - 204
PB - Elsevier
CY - Amsterdam
ER -

TY - THES
A1 - Eid-Sabbagh, Rami-Habib
T1 - Business process architectures
BT - concepts, formalism, and analysis
N2 - Business Process Management has become an integral part of modern organizations in the private and public sector for improving their operations. In the course of Business Process Management efforts, companies and organizations assemble large process model repositories with many hundreds and thousands of business process models bearing a large amount of information. With the advent of large business process model collections, new challenges arise, such as structuring and managing a large number of process models, their maintenance, and their quality assurance. These challenges are addressed by business process architectures, which have been introduced for organizing and structuring business process model collections. A variety of business process architecture approaches have been proposed that align business processes along aspects of interest, e.g., goals, functions, or objects. They provide a high-level categorization of single processes while ignoring their interdependencies, thus hiding valuable information. The production of goods or the delivery of services is often realized by a complex system of interdependent business processes. Hence, taking a holistic view of business process interdependencies becomes a major necessity in order to organize, analyze, and assess the impact of their re-/design. Visualizing business process interdependencies reveals hidden and implicit information in a process model collection. In this thesis, we present a novel Business Process Architecture approach for representing and analyzing business process interdependencies on an abstract level. We propose a formal definition of our Business Process Architecture approach, design correctness criteria, and develop analysis techniques for assessing their quality. We describe a methodology for applying our Business Process Architecture approach top-down and bottom-up. This includes techniques for Business Process Architecture extraction from, and decomposition to, process models while considering consistency issues between the business process architecture and process model levels. Using our extraction algorithm, we present a novel technique to identify and visualize data interdependencies in Business Process Data Architectures. Our Business Process Architecture approach provides business process experts, managers, and other users of a process model collection with an overview that allows reasoning about a large set of process models and understanding and analyzing their interdependencies in a facilitated way. In this regard, we evaluated our Business Process Architecture approach in an experiment and provide implementations of selected techniques.
N2 - Geschäftsprozessmanagement nimmt heutzutage eine zentrale Rolle zur Verbesserung von Geschäftsabläufen in Organisationen des öffentlichen und privaten Sektors ein. Im Laufe von Geschäftsprozessmanagementprojekten entstehen große Prozessmodellsammlungen mit hunderten und tausenden Prozessmodellen, die vielfältige Informationen enthalten. Mit der Entstehung großer Prozessmodellsammlungen entstehen neue Herausforderungen. Diese beinhalten die Strukturierung und Organisation vieler Prozessmodelle, ihre Pflege und Aktualisierung sowie ihre Qualitätssicherung. Mit diesen Herausforderungen befassen sich Geschäftsprozessarchitekturen. Viele der aktuellen Geschäftsprozessarchitekturen ordnen Geschäftsprozesse nach bestimmten Aspekten von Interesse, zum Beispiel nach Zielen, Funktionen oder Geschäftsobjekten. Diese Herangehensweisen bieten eine sehr abstrakte Kategorisierung von einzelnen Geschäftsprozessen, wobei sie wichtige Abhängigkeiten zwischen Prozessen ignorieren und so wertvolle Informationen verbergen. Die Produktion von Waren und das Anbieten von Dienstleistungen bilden ein komplexes System untereinander abhängiger Geschäftsprozesse. Diesbezüglich ist es unabdingbar, eine ganzheitliche Sicht auf Geschäftsprozesse und ihre Abhängigkeiten zu schaffen, um die Geschäftsprozesse zu organisieren, zu analysieren und zu optimieren. Die Darstellung von Geschäftsprozessabhängigkeiten zeigt versteckte und implizite Informationen auf, die bisher in Geschäftsprozesssammlungen verborgen blieben. In dieser Arbeit stellen wir eine Geschäftsprozessarchitekturmethodik vor, die es erlaubt, Geschäftsprozessabhängigkeiten auf einer abstrakten Ebene darzustellen und zu analysieren. Wir führen eine formale Definition unserer Geschäftsprozessarchitektur und entsprechende Korrektheitskriterien ein. Darauf aufbauend stellen wir Analysetechniken für unsere Geschäftsprozessarchitektur vor. In einem Anwendungsrahmenwerk erläutern wir die Top-down- und Bottom-up-Anwendung unserer Geschäftsprozessarchitekturmethodik. Dies beinhaltet die Beschreibung von Algorithmen zur Extraktion von Geschäftsprozessarchitekturen und zur Generierung von Prozessmodellen aus Geschäftsprozessarchitekturen, die die Konsistenz zwischen den Elementen auf Prozessmodellebene und Geschäftsprozessarchitekturebene gewährleisten. Aufbauend auf dem Extraktionsalgorithmus stellen wir eine neue Technik zur Identifizierung, Extraktion und Visualisierung von versteckten Datenabhängigkeiten zwischen Prozessmodellen in Geschäftsprozessdatenarchitekturen vor. Unsere Arbeit stellt Geschäftsprozessexperten, Managern und Nutzern einer Geschäftsprozessmodellsammlung eine Methodik zur Verfügung, die es ihnen ermöglicht und vereinfacht, eine Übersicht über Prozesse und ihre Abhängigkeiten zu erstellen, diese zu verstehen und zu analysieren. Diesbezüglich haben wir unsere Geschäftsprozessarchitekturmethodik in einem empirischen Experiment auf ihre Anwendbarkeit und Effektivität untersucht und zur weiteren Evaluierung ausgewählte Algorithmen implementiert.
KW - business process architecture
KW - bpm
KW - formalism
KW - analysis
KW - abstraction
KW - Prozessarchitektur
KW - Geschäftsprozessmanagement
KW - Analyse
KW - Abstraktion
Y1 - 2015
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-79719
ER -

TY - JOUR
A1 - Fredrich, Viktor
A1 - Bouncken, Ricarda B.
A1 - Tiberius, Victor
T1 - Dyadic business model convergence or divergence in alliances?
BT - a configurational approach
JF - Journal of business research
N2 - In this study, we contribute to the scholarly conversation on firm-level business model changes following a neoconfigurational approach. By exploring configurations of business model changes over time, we add the direction of business model change, namely business model convergence or divergence, as a vital avenue for the business model innovation literature. We identify necessary business model convergence and divergence recipes in a sample of N = 217 strategic dyadic alliances. First, technological proximity emerges as a single precondition for both converging and diverging business models. Second, business models between competitors either converge through complementarities or tend not to change relative to each other. Third, equity participation enables business model divergence through co-specialization. We conclude with a discussion of business model trajectories and future research directions.
KW - Business model innovation
KW - Business model changes
KW - Convergence vs. divergence
KW - Strategic alliances
KW - Fuzzy-set qualitative comparative analysis (fsQCA)
Y1 - 2022
U6 - https://doi.org/10.1016/j.jbusres.2022.08.046
SN - 0148-2963
SN - 1873-7978
VL - 153
SP - 300
EP - 308
PB - Elsevier
CY - New York
ER -

TY - THES
A1 - Gawron, Marian
T1 - Towards automated advanced vulnerability analysis
N2 - The identification of vulnerabilities in IT infrastructures is a crucial problem in enhancing security, because many incidents have resulted from already known vulnerabilities that could have been resolved. Thus, the initial identification of vulnerabilities has to be used to directly resolve the related weaknesses and mitigate attack possibilities. The nature of vulnerability information requires collecting and normalizing the information prior to any utilization, because it is widely distributed across different sources, each with its own format. Therefore, a comprehensive vulnerability model was defined and the different sources have been integrated into one database. Furthermore, different analytic approaches have been designed and implemented in the HPI-VDB, which directly benefit from the comprehensive vulnerability model and especially from the logical preconditions and postconditions. First, different approaches to detect vulnerabilities in both the IT systems of average users and the corporate networks of large companies are presented. These approaches mainly focus on the identification of all installed applications, since this is a fundamental step in the detection. The detection is realized differently depending on the target use case; the experience of the user, as well as the layout and capabilities of the target infrastructure, are considered. Furthermore, a passive, lightweight detection approach was developed that utilizes existing information on corporate networks to identify applications. In addition, two different approaches to represent the results using attack graphs are illustrated in a comparison between traditional attack graphs and a simplified graph version, which was integrated into the database as well. The implementation of these use cases for vulnerability information pays particular attention to usability. Besides the analytic approaches, high data quality of the vulnerability information had to be achieved and guaranteed. The problems of receiving incomplete or unreliable vulnerability information are addressed with different correction mechanisms. The corrections can be carried out with correlation or lookup mechanisms in reliable sources or identifier dictionaries. Furthermore, a machine-learning-based verification procedure is presented that allows an automatic derivation of important characteristics from the textual description of the vulnerabilities.
N2 - Die Erkennung von Schwachstellen ist ein schwerwiegendes Problem bei der Absicherung von modernen IT-Systemen. Mehrere Sicherheitsvorfälle hätten durch die vorherige Erkennung von Schwachstellen verhindert werden können, da in diesen Vorfällen bereits bekannte Schwachstellen ausgenutzt wurden. Der Stellenwert der Sicherheit von IT-Systemen nimmt immer weiter zu, was auch mit der Aufmerksamkeit, die seit kurzem auf die Sicherheit gelegt wird, zu begründen ist. Somit nimmt auch der Stellenwert einer Schwachstellenanalyse der IT-Systeme immer mehr zu, da hierdurch potenzielle Angriffe verhindert und Sicherheitslücken geschlossen werden können. Die Informationen über Sicherheitslücken liegen in verschiedenen Quellen in unterschiedlichen Formaten vor. Aus diesem Grund wird eine Normalisierungsmethode benötigt, um die verschiedenen Informationen in ein einheitliches Format zu bringen. Das damit erzeugte Datenmodell wird in der HPI-VDB gespeichert, in die auch darauf aufbauende Analyseansätze integriert wurden. Diese Analysemethoden profitieren direkt von den Eigenschaften des Datenmodells, das maschinenlesbare Vor- und Nachbedingungen enthält. Zunächst wurden verschiedene Methoden zur Erkennung von Schwachstellen in IT-Systemen von durchschnittlichen Nutzern und auch in Systemen von großen Firmen entwickelt. Hierbei wird der Identifikation der installierten Programme die größte Aufmerksamkeit beigemessen, da sie der grundlegende Bestandteil der Erkennung von Schwachstellen ist. Für die Ansätze werden weiterhin die Erfahrung des Nutzers und die Eigenschaften der Zielumgebung berücksichtigt. Zusätzlich wurden zwei weitere Ansätze zur Repräsentation der Ergebnisse integriert. Hierfür wurden traditionelle Angriffsgraphen mit einer vereinfachten Variante verglichen, die auch in die Datenbank integriert wurde. Des Weiteren spielt die Datenqualität eine wichtige Rolle, da die Effizienz der Analysemethoden von der Qualität der Informationen abhängt. Deshalb wurden Probleme wie Unvollständigkeit und Unzuverlässigkeit der Informationen mit verschiedenen Korrekturansätzen bewältigt. Diese Korrekturen werden mithilfe von Korrelationen und Maschinellem Lernen bewerkstelligt, wobei die automatische Ausführbarkeit eine grundlegende Anforderung darstellt.
KW - IT-security
KW - vulnerability
KW - analysis
KW - IT-Sicherheit
KW - Schwachstelle
KW - Analyse
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:kobv:517-opus4-426352
ER -