The knowledge of the contemporary in situ stress state is a key issue for safe and sustainable subsurface engineering. However, information on the orientation and magnitudes of the stress state is limited and often not available for the areas of interest. Therefore, 3-D geomechanical-numerical modelling is used to estimate the in situ stress state and the distance of faults from failure for application in subsurface engineering. The main challenge in this approach is to bridge the gap in scale between the widely scattered data used for calibration of the model and the high resolution in the target area required for the application. We present a multi-stage 3-D geomechanical-numerical approach which provides a state-of-the-art model of the stress field for a reservoir-scale area from widely scattered data records. To this end, we first use a large-scale regional model which is calibrated by available stress data and provides the full 3-D stress tensor at discrete points in the entire model volume. The modelled stress state is subsequently used for the calibration of a smaller-scale model located within the large-scale model, in an area without any observed stress data records. We exemplify this approach with two stages for the area around Munich in the German Molasse Basin. As an example of application, we estimate the scalar values of slip tendency and fracture potential from the model results as measures of the criticality of fault reactivation in the reservoir-scale model. The modelling results show that variations due to uncertainties in the input data are mainly introduced by the uncertain material properties and by missing SHmax magnitude estimates needed for a more reliable model calibration. This leads to the conclusion that, at this stage, the model's reliability depends only on the amount and quality of available stress information rather than on the modelling technique itself or on local details of the model geometry.
Any improvements in modelling and increases in model reliability can only be achieved using more high-quality data for calibration.
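The slip tendency mentioned above is commonly defined as the ratio of resolved shear to normal stress on a fault plane. A minimal numerical sketch of that resolution step, assuming a stress tensor given in its principal-axis frame, is shown below; the stress values, fault orientation and function name are illustrative assumptions, not model output, and the fracture potential measure is not reproduced here.

```python
import numpy as np

def slip_tendency(stress: np.ndarray, normal: np.ndarray) -> float:
    """Ratio of resolved shear to normal stress on a plane.

    `stress` is a symmetric 3x3 in situ stress tensor (compression positive),
    `normal` the unit normal of the fault plane.
    """
    n = normal / np.linalg.norm(normal)
    traction = stress @ n                                # traction vector on the plane
    sigma_n = float(n @ traction)                        # normal stress component
    tau = float(np.linalg.norm(traction - sigma_n * n))  # resolved shear stress
    return tau / sigma_n

# Illustrative values only: principal stresses of 60/40/30 MPa and a fault
# plane dipping 45 degrees. A slip tendency approaching a typical friction
# coefficient (~0.6) would flag the fault as critically stressed.
sigma = np.diag([60.0, 40.0, 30.0])   # MPa, principal-axis frame
n_45 = np.array([np.sin(np.radians(45.0)), 0.0, np.cos(np.radians(45.0))])
print(round(slip_tendency(sigma, n_45), 3))   # 15 MPa shear / 45 MPa normal
```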
Compared to their inorganic counterparts, organic semiconductors suffer from relatively low charge carrier mobilities. Therefore, expressions derived for inorganic solar cells to correlate characteristic performance parameters with material properties are prone to fail when applied to organic devices. This is especially true for the classical Shockley equation commonly used to describe current-voltage (JV) curves, as it assumes a high electrical conductivity of the charge transporting material. Here, an analytical expression for the JV curves of organic solar cells is derived based on a previously published analytical model. This expression, bearing a functional dependence similar to that of the Shockley equation, delivers a new figure of merit α to express the balance between free charge recombination and extraction in low-mobility photoactive materials. This figure of merit is shown to determine critical device parameters such as the apparent series resistance and the fill factor.
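The derived expression and the figure of merit α are not reproduced in this abstract. As a hedged illustration of the starting point only, the classical Shockley equation with a photocurrent term can be evaluated numerically to extract the open-circuit voltage and fill factor it predicts; all parameter values below are assumptions, not fit results.

```python
import numpy as np

# Classical Shockley-type JV characteristic of an illuminated diode:
#   J(V) = J0 * (exp(V / Vt) - 1) - Jph
J0 = 1e-10    # dark saturation current density, mA/cm^2 (assumed)
Jph = 20.0    # photogenerated current density, mA/cm^2 (assumed)
Vt = 0.0259   # thermal voltage kT/q at room temperature, V

def J(V):
    return J0 * (np.exp(V / Vt) - 1.0) - Jph

Voc = Vt * np.log(Jph / J0 + 1.0)   # open-circuit voltage: J(Voc) = 0
Jsc = -J(0.0)                       # short-circuit current density
V = np.linspace(0.0, Voc, 2000)
P = -J(V) * V                       # extracted power density
FF = P.max() / (Jsc * Voc)          # fill factor
print(f"Voc = {Voc:.3f} V, FF = {FF:.2f}")
```

In an organic device, low mobility makes the measured fill factor fall well below this ideal-diode value, which is exactly the regime the figure of merit α is meant to quantify.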
In the past, floods were managed mainly by flood control mechanisms. The focus was set on the reduction of flood hazard; the potential consequences were of minor interest. Nowadays, river flooding is increasingly seen from the risk perspective, including possible consequences. Moreover, the large-scale picture of flood risk has become increasingly important for disaster management planning, national risk assessments and the (re-)insurance industry. It is therefore widely accepted that risk-oriented flood management approaches at the basin scale are needed. However, large-scale flood risk assessment methods for areas of several tens of thousands of km² are still in early stages. Traditional flood risk assessments are performed reach-wise, assuming constant probabilities for the entire reach or basin. This might be helpful on a local basis, but where large-scale patterns are important this approach is of limited use. Assuming a T-year flood (e.g. 100 years) for the entire river network is unrealistic and would lead to an overestimation of flood risk at the large scale. Additionally, due to the lack of damage data, the probability of peak discharge or rainfall is usually used as a proxy for damage probability to derive flood risk. With a continuous and long-term simulation of the entire flood risk chain, the spatial variability of probabilities can be considered and flood risk can be derived directly from damage data in a consistent way.
The objective of this study is the development and application of a full flood risk chain, appropriate for the large scale and based on long-term, continuous simulation. The novel approach of ‘derived flood risk based on continuous simulations’ is introduced, in which the synthetic discharge time series is used as input to flood impact models and flood risk is derived directly from the resulting synthetic damage time series.
The bottleneck at this scale is the hydrodynamic simulation. To find suitable hydrodynamic approaches for the large scale, a benchmark study with simplified 2D hydrodynamic models was performed. A raster-based approach with inertia formulation and a relatively high resolution of 100 m, in combination with a fast 1D channel routing model, was chosen.
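The inertia formulation referred to above can be sketched for a single cell interface in the spirit of the widely used local-inertia scheme for raster-based flood models; this is a minimal illustration, not the model's actual implementation, and the roughness and flow values are assumed.

```python
def inertial_flux(q_old, h_flow, slope, dt, n_man=0.03, g=9.81):
    """One explicit update of the unit-width discharge q [m^2/s] across a
    cell interface; friction is treated semi-implicitly, which keeps the
    explicit scheme stable at larger time steps.

    slope is the water-surface slope at the interface and n_man is
    Manning's roughness (assumed value).
    """
    num = q_old - g * h_flow * dt * slope
    den = 1.0 + g * dt * n_man ** 2 * abs(q_old) / h_flow ** (7.0 / 3.0)
    return num / den

# Spin-up from rest over a plane with 0.1% slope and 2 m flow depth:
q = 0.0
for _ in range(5000):
    q = inertial_flux(q, 2.0, 0.001, 1.0)
# At steady state the scheme recovers Manning's equation,
# |q| = h^(5/3) * sqrt(S) / n.
print(round(abs(q), 3))
```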
To investigate the suitability of the continuous simulation of a full flood risk chain for the large scale, all model parts were integrated into a new framework, the Regional Flood Model (RFM). RFM consists of the hydrological model SWIM, a 1D hydrodynamic river network model, a 2D raster-based inundation model and the flood loss model FELMOps+r. Subsequently, the model chain was applied to the Elbe catchment, one of the largest catchments in Germany. For the proof of concept, a continuous simulation was performed for the period 1990–2003. Results were evaluated and, as far as possible, validated against observed data for this period. Although each model part introduced its own uncertainties, results and runtime were generally found to be adequate for the purpose of continuous simulation at the large catchment scale.
Finally, RFM was applied to a meso-scale catchment in the east of Germany to perform, for the first time, a flood risk assessment with the novel approach of ‘derived flood risk based on continuous simulations’. To this end, RFM was driven by long-term synthetic meteorological input data generated by a weather generator. A virtual climate time series of 100 x 100 years was generated and served as input to RFM, providing a subsequent 100 x 100 years of spatially consistent river discharge series, inundation patterns and damage values. On this basis, flood risk curves and the expected annual damage could be derived directly from damage data, providing a large-scale picture of flood risk. In contrast to traditional flood risk analyses, in which homogeneous return periods are assumed for the entire basin, the presented approach provides a coherent large-scale picture of flood risk. The spatial variability of occurrence probability is respected. Additionally, data and methods are consistent. Catchment and floodplain processes are represented in a holistic way. Antecedent catchment conditions are implicitly taken into account, as are physical processes such as storage effects, flood attenuation or channel–floodplain interactions and the related damage-influencing effects. Finally, the simulation of a virtual period of 100 x 100 years, and consequently the large data set of flood loss events, enabled the calculation of flood risk directly from damage distributions. Problems associated with the transfer of probabilities of rainfall or peak runoff to probabilities of damage, as is common in traditional approaches, are bypassed.
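The step from a synthetic damage time series to risk curves and expected annual damage can be sketched as follows; the synthetic damage series, its distribution and all parameter values are stand-ins for illustration, not RFM output.

```python
import numpy as np

rng = np.random.default_rng(42)
# Stand-in for the 100 x 100 years of simulated annual flood damage:
# most years have zero damage, occasional years heavy-tailed losses.
n_years = 10_000
flood_year = rng.random(n_years) < 0.05
damage = np.where(flood_year,
                  rng.lognormal(mean=2.0, sigma=1.0, size=n_years), 0.0)

# Expected annual damage is simply the mean of the annual damage series.
ead = damage.mean()

# Empirical risk curve: damage vs. annual exceedance probability,
# using Weibull plotting positions on the ranked annual damages.
sorted_dmg = np.sort(damage)[::-1]
exceed_p = np.arange(1, n_years + 1) / (n_years + 1)

# Damage of the "100-year" event (1% annual exceedance) in this series:
d100 = sorted_dmg[np.searchsorted(exceed_p, 0.01)]
print(f"EAD = {ead:.2f}, 100-yr damage = {d100:.2f}")
```

Because the probabilities are read off the damage series itself, no transfer from rainfall or discharge return periods to damage return periods is needed, which is the core of the ‘derived flood risk’ idea.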
RFM and the ‘derived flood risk based on continuous simulations’ approach have the potential to provide flood risk statements for national planning, reinsurance purposes and other questions where spatially consistent, large-scale assessments are required.
In recent decades, the Greenland Ice Sheet has been losing mass and has thereby contributed to global sea-level rise. The rate of ice loss is highly relevant for coastal protection worldwide, and the loss is likely to increase under future warming. Beyond a critical temperature threshold, a meltdown of the Greenland Ice Sheet is induced by the self-reinforcing feedback between its lowering surface elevation and its increasing surface mass loss: the more ice that is lost, the lower the ice surface and the warmer the surface air temperature, which fosters further melting and ice loss. The computation of this rate has so far relied on complex numerical models, which are the appropriate tools for capturing the complexity of the problem. By contrast, we aim here at a conceptual understanding by deriving a purposefully simple equation for the self-reinforcing feedback, which is then used to estimate the melt time for different levels of warming from three observable characteristics of the ice sheet itself and its surroundings. The analysis is purely conceptual in nature; it omits important processes such as ice dynamics that would be required for applications to sea-level rise on centennial timescales. If, however, the volume loss is dominated by the feedback, the resulting logarithmic equation unifies existing numerical simulations and shows that the melt time depends strongly on the level of warming, with a critical slow-down near the threshold: the median time to lose 10% of the present-day ice volume varies between about 3500 years for a temperature level of 0.5 degrees C above the threshold and 500 years for 5 degrees C. Unless future observations show a significantly higher melting sensitivity than currently observed, a complete meltdown is unlikely within the next 2000 years without significant ice-dynamical contributions.
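The feedback described above can be cast, purely illustratively, as a minimal ordinary differential equation; the symbols and the linear closure below are assumptions for exposition and are not taken from the underlying paper.

```latex
% f(t): fraction of ice volume lost; \Delta T: warming above the threshold;
% \mu: melt-rate sensitivity; \Gamma: feedback (lapse-rate) warming at full loss.
\frac{df}{dt} = \mu \left( \Delta T + \Gamma f \right)
\quad \Longrightarrow \quad
t(f) = \frac{1}{\mu \Gamma} \, \ln\!\left( 1 + \frac{\Gamma f}{\Delta T} \right)
```

Under this toy closure the time to lose a given fraction of the ice volume depends logarithmically on the warming and diverges as ΔT approaches zero, mirroring the critical slow-down near the threshold described in the abstract.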
Abrechnung mit dem Archiv
(2016)
The third Working Paper in the KFG Working Paper Series analyses the state and prospects of international disarmament treaties under the aegis of the United Nations. While the thirty years between the Cuban Missile Crisis and the fall of the Iron Curtain were a successful period for disarmament, no disarmament treaties other than the Arms Trade Treaty have since been concluded within the United Nations. The current mood is described as hesitant to negative, even though there is a need to catch up by adapting disarmament treaties to today's political realities and to the current state of technology. As a solution, the author proposes creating a better disarmament climate through a policy of small steps, giving the discourse a new direction on the basis of additional protocols to existing treaties and, if necessary, by moving to other bodies.
The present work is a case study contributing to the major planning project “Suedlink”. It is structured as follows: first, in a theoretical part, the underlying theories of social acceptance (Wüstenhagen et al., 2007), steps of participation (Münnich, 2014) and governance theory (Benz and Dose, 2011) are elaborated. Second, the relevant methods are discussed. Third, in a qualitative analytical part, the information gathered from the expert interviews is analyzed with the help of the aforementioned theories. Fourth, an empirical quantitative analysis of data on public acceptance of Suedlink is presented.
In this case study, two questions are answered with the use of qualitative and quantitative methods. First, which governance aspects were relevant for the priority use of underground cables in the construction of high-voltage direct current transmission lines? For this question, an intensive document analysis and several expert interviews were conducted. Second, the central question of the present work is whether local and/or individual factors affect public acceptance of Suedlink. Here, it is of particular interest whether the priority use of underground cables affected people’s acceptance of Suedlink. In order to answer both questions, an online survey was conducted among citizen initiatives, district administrators and individuals in social media from March to July 2016. Thereafter, the data were analyzed with descriptive quantitative methods. The data show that underground cables do not necessarily increase public acceptance (see also Menges and Beyer, 2013). Rather, individual and local criteria were relevant for the survey respondents: for example, the quality of participation, the distance between home and transmission lines, and the additional financial burden (taxes, higher electricity prices) were important for the evaluation. In addition, survey respondents who participated in citizen initiatives were more critical of the priority use of underground cables and of Suedlink in general. Likewise, residential homeowners rejected every form of transmission line.
Achtet alles Existierende
(2016)
Adorno und die Kabbala
(2016)
In the ninth volume of the series, Ansgar Martins traces kabbalistic threads in the philosophy of Theodor W. Adorno (1903–1969). Within the framework of his radical materialist project, the Frankfurt social critic nevertheless also drew on ‘theological’ figures of interpretation. Through their mutual friend Walter Benjamin (1892–1940), Adorno encountered the work of the Kabbalah scholar Gershom Scholem (1897–1982). A lifelong correspondence developed between Frankfurt and Jerusalem.
For Adorno, against the background of total capitalist socialization, any religious creation of meaning appears impossible in modernity. To the tradition of Jewish mysticism, however, he attributes an inner affinity to this hopeless logic of ‘decay’: it seems to him to call for the inevitable secularization of religious contents. Adorno’s kabbalistic marginalia draw on a broad horizon of Jewish messianic ideas. He never denies that he is concerned with a decidedly this-worldly realization of revealed promises of salvation: transcendence is to be thought of as fulfilled immanence, as realized utopia. Yet it is precisely in this concern that Adorno himself sees his agreement with the Kabbalah.
Adorno’s kabbalistic motifs, which go back to Scholem’s research, are examined here in detail on the basis of his writings and lectures. In his understanding of the philosophical tradition, as well as in his model of metaphysical experience, he explicitly sought to connect with interpretations of the Kabbalah: the unattainable archetype of philosophy, he held, is the interpretation of revealed scripture. Like secularized holy texts, works by Beethoven, Goethe, Kafka and Schönberg thus became occasions for ‘mystical’ interpretations. Their detailed examination makes it possible to concretize the much-invoked Jewish heritage of Adorno’s philosophy and to bring into view noteworthy details ranging from the Negative Dialectics to his aesthetics.
Although there is ample evidence linking insecure attachment styles and intimate partner violence (IPV), little is known about the psychological processes underlying this association, especially from the victim’s perspective. The present study examined how attachment styles relate to the experience of sexual and psychological abuse, directly or indirectly through destructive conflict resolution strategies, both self-reported and attributed to their opposite-sex romantic partner. In an online survey, 216 Spanish undergraduates completed measures of adult attachment style, engagement and withdrawal conflict resolution styles shown by self and partner, and victimization by an intimate partner in the form of sexual coercion and psychological abuse. As predicted, anxious and avoidant attachment styles were directly related to both forms of victimization. Also, an indirect path from anxious attachment to IPV victimization was detected via destructive conflict resolution strategies. Specifically, anxiously attached participants reported a higher use of conflict engagement by themselves and by their partners. In addition, engagement reported by the self and perceived in the partner was linked to an increased probability of experiencing sexual coercion and psychological abuse. Avoidant attachment was linked to higher withdrawal in conflict situations, but the paths from withdrawal to perceived partner engagement, sexual coercion, and psychological abuse were non-significant. No gender differences in the associations were found. The discussion highlights the role of anxious attachment in understanding escalating patterns of destructive conflict resolution strategies, which may increase the vulnerability to IPV victimization.
Affekte im Konflikt
(2016)
This article re-examines the relationship between Africa and the International Criminal Court (ICC). It traces the successive changes in the African attitude towards the Court, from states' euphoria, to hostility against its work, to regional counter-initiatives under the umbrella of the African Union (AU). The main argument goes beyond the idea of "the Court that Africa wants" in order to identify the concrete reasons which may have prompted, if not enticed, the majority of African states to become ICC members and actively cooperate with it, when paradoxically some great powers have decided to stay outside its jurisdiction. It also seeks to understand, from a political and legal viewpoint, which parameters have changed since then to provoke the hostile attitude against the Court's work and the entrance of the AU into the debate through the African Common Position on the ICC. Lastly, this article explores African alternatives to the contested ICC justice system. It examines the need to reform the Rome Statute in order to give more independence, credibility and legitimacy to the ICC, and the Court's duplication to some extent by the new "Criminal Court of the African Union". Particular attention is paid to the resistance against this idea of reforming the ICC justice system.
Age of acquisition (AOA) is a psycholinguistic variable that significantly influences behavioural measures (response times and accuracy rates) in tasks that require lexical and semantic processing. Unlike the origin of semantic typicality (TYP), which is assumed to lie at the semantic level, the origin of AOA effects is controversial: different theories propose that AOA effects originate either at the semantic level or at the link between semantics and phonology (the lemma level).
The dissertation aims at investigating the influence of AOA, and its interdependence with the semantic variable TYP, on semantic processing in particular, in order to pinpoint the origin of AOA effects. To this end, three studies were conducted that considered the variables AOA and TYP in semantic processing tasks (category verification and animacy decision) by means of behavioural and, in part, electrophysiological (ERP) data, and in different populations (healthy young and elderly participants, and semantically impaired individuals with aphasia (IWA)).
The behavioural and electrophysiological data of the three studies provide evidence for distinct processing levels of the variables AOA and TYP. The data further support previous assumptions on a semantic origin for TYP but question the same for AOA. The findings, however, support an origin of AOA effects at the transition between the word form (phonology) and the semantic level that can be captured at the behavioural but not at the electrophysiological level.
Air pollution is the number one environmental cause of premature deaths in Europe. Despite extensive regulations, air pollution remains a challenge, especially in urban areas. For studying summertime air quality in the Berlin-Brandenburg region of Germany, the Weather Research and Forecasting Model with Chemistry (WRF-Chem) is set up and evaluated against meteorological and air quality observations from monitoring stations as well as from a field campaign conducted in 2014. The objective is to assess which resolution and level of detail in the input data is needed for simulating urban background air pollutant concentrations and their spatial distribution in the Berlin-Brandenburg area. The model setup includes three nested domains with horizontal resolutions of 15, 3 and 1 km and anthropogenic emissions from the TNO-MACC III inventory. We use RADM2 chemistry and the MADE/SORGAM aerosol scheme. Three sensitivity simulations are conducted updating input parameters to the single-layer urban canopy model based on structural data for Berlin, specifying land use classes on a sub-grid scale (mosaic option) and downscaling the original emissions to a resolution of ca. 1 km x 1 km for Berlin based on proxy data including traffic density and population density. The results show that the model simulates meteorology well, though urban 2m temperature and urban wind speeds are biased high and nighttime mixing layer height is biased low in the base run with the settings described above. We show that the simulation of urban meteorology can be improved when specifying the input parameters to the urban model, and to a lesser extent when using the mosaic option. On average, ozone is simulated reasonably well, but maximum daily 8 h mean concentrations are underestimated, which is consistent with the results from previous modelling studies using the RADM2 chemical mechanism. Particulate matter is underestimated, which is partly due to an underestimation of secondary organic aerosols. 
NOx (NO + NO2) concentrations are simulated reasonably well on average, but nighttime concentrations are overestimated due to the model's underestimation of the mixing layer height, and urban daytime concentrations are underestimated. The daytime underestimation is improved when using downscaled, and thus locally higher emissions, suggesting that part of this bias is due to deficiencies in the emission input data and their resolution. The results further demonstrate that a horizontal resolution of 3 km improves the results and spatial representativeness of the model compared to a horizontal resolution of 15 km. With the input data (land use classes, emissions) at the level of detail of the base run of this study, we find that a horizontal resolution of 1 km does not improve the results compared to a resolution of 3 km. However, our results suggest that a 1 km horizontal model resolution could enable a detailed simulation of local pollution patterns in the Berlin-Brandenburg region if the urban land use classes, together with the respective input parameters to the urban canopy model, are specified with a higher level of detail and if urban emissions of higher spatial resolution are used.
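The nesting and scheme choices described above would, in a standard WRF-Chem setup, appear as namelist entries roughly like the fragment below. The variable names follow common WRF/WRF-Chem namelist conventions, but the fragment is illustrative and incomplete (dy, time control and most physics entries are omitted), and the exact option values should be checked against the WRF-Chem documentation rather than taken from this sketch.

```text
&domains
 max_dom           = 3,
 dx                = 15000, 3000, 1000,   ! m; 15 km / 3 km / 1 km nests
 parent_grid_ratio = 1, 5, 3,
/

&physics
 sf_urban_physics  = 1, 1, 1,   ! single-layer urban canopy model
 sf_surface_mosaic = 1,         ! sub-grid land-use tiles (mosaic run)
/

&chem
 chem_opt          = 2, 2, 2,   ! RADM2 gas phase with MADE/SORGAM aerosols
/
```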
The present study deals with a hitherto little-noticed aspect of Humboldt's American journey (1799–1804). While his studies of flora and fauna attract great attention to this day, Humboldt's contribution to the development of modern agriculture has received little notice. During his stay in Lima he obtained samples of guano, bird droppings from the Chincha Islands off the Peruvian coast. Some of the samples he brought back from this journey were analysed by scientists in France and Germany. The results showed extraordinarily high contents of plant nutrients, in particular nitrogen and phosphorus. In the following decades, guano became an important fertilizer and the trigger of a boom in Europe and Peru. The consequences of this development are visible to this day and are described here with particular attention to phosphorus.
Alexander von Humboldt's Essay on the Geography of Plants has endured as one of his principal scientific proposals, the foundation of what is known today as "biogeography". The origin of this concept remains diffuse up to the simultaneous publication of the work in Paris and Tübingen in 1807. This article sets out to contrast the first manuscript version of the essay, written in 1803 in Guayaquil and then read in 1805 at the Institut National de Paris, with the contemporaneous work of the New Granadan Francisco José de Caldas, with whom Humboldt lived in Quito during the first half of 1802.