German foreign policy is in the midst of a far-reaching transformation. Contrary to disciplinary expectations, this process is neither properly captured by descriptions in the liberal tradition ("Europeanisation", "Civilian Power") nor by Realist expectations that Germany is doomed to "remilitarise" and/or "renationalise". However, the key term of foreign policy discourse, "normalisation", is an unmistakable code, signalling a rediscovery of traditional Realpolitik practices which fit Germany's current environment. The paper argues that rather than merely playing the role of an obedient disciple of Realpolitik socialisers, Germany ought to rehabilitate the foreign policy tradition of the Bonn Republic in support of an active Idealpolitik transformation of its environment. The article serves as a starting point for a debate on German foreign policy in the upcoming issues of WeltTrends.
This article investigates the fictional narratives written by "Subcomandante Marcos" of the Zapatista movement EZLN. It is shown that Marcos uses three distinct frames of reference in his fictional account of the Zapatista guerrilla: an ethnic, a national and a post-national one. Contrary to other studies that emphasize the harmony between the three levels, it can be argued that there are fundamental tensions between them. In particular, there is a tension between the ethnic discourse and the Mexican nationalist discourse, which envisions a nation that is not dominated by a single ethno-cultural group. Finally, it can be deduced from these tensions that the EZLN guerrilla is subject to divergent pressures.
Hegemonialmächte im Vorderen und Mittleren Orient : die Dritte Partei in internationalen Konflikten [Hegemonic powers in the Near and Middle East: the third party in international conflicts]
(1997)
During the last five decades, hegemons played an important role in de-escalating international conflicts in the subregion defined as the core of the Oriens Islamicus. Statistical analysis of large datasets shows that half of all conflicts remained without any interference from the hegemonial powers at all, both on a global scale and in the subregion. In all other cases, however, hegemons (especially superpowers in the role of patrons) tended more often to act as (power) mediators when their client state was engaged in conflict with a client of the opposing superpower in the Oriens Islamicus than they did globally. They did this in their own interest, in order to avoid direct involvement, i.e. the possible danger of a nuclear escalation. In contrast to what conventional mediation theory would predict, they were more effective in conflict de-escalation than other mediators, especially in conflicts between Israel and its Arab neighbours. The end of bipolarity in the international system also brought this mechanism of de-escalation to an end. It leaves the hegemon(s) as a potentially powerful third party on the one hand, but on the other their inclination to become involved in regional conflicts remains rather low as long as their basic national interests in the area are not at stake.
The layer-by-layer assembly (LBL) of polyelectrolytes has been extensively studied for the preparation of ultrathin films due to the versatility of the build-up process. The control of the permeability of these layers is particularly important as there are potential drug delivery applications. Multilayered polyelectrolyte microcapsules are also of great interest due to their possible use as microcontainers. This work presents two methods that can serve as drug delivery systems, both of which can encapsulate an active molecule and tune the release properties of the active species. Poly(N-isopropylacrylamide) (PNIPAM) is known to be a thermo-sensitive polymer with a Lower Critical Solution Temperature (LCST) around 32 °C; above this temperature PNIPAM is insoluble in water and collapses. It is also known that the LCST decreases with the addition of salt. This work shows Differential Scanning Calorimetry (DSC) and Confocal Laser Scanning Microscopy (CLSM) evidence that the LCST of PNIPAM can be tuned with salt type and concentration. Microcapsules were used to encapsulate this thermo-sensitive polymer, resulting in a reversible and tunable stimuli-responsive system. The encapsulation of the PNIPAM inside the capsules was proven with Raman spectroscopy, DSC (bulk LCST measurements), AFM (thickness change), SEM (morphology change) and CLSM (in situ LCST measurement inside the capsules). The exploitation of the capsules as microcontainers is advantageous not only because of the protection the capsules give to the active molecules, but also because it facilitates easier transport. The second system investigated demonstrates the ability to reduce the permeability of polyelectrolyte multilayer films by the addition of charged wax particles. The incorporation of this hydrophobic coating leads to a reduced water sensitivity, particularly after heating, which melts the wax, forming a barrier layer.
This was demonstrated by neutron reflectivity, which showed a decreased presence of D₂O in planar polyelectrolyte films after annealing, consistent with the formation of a barrier layer. The permeability of capsules could also be decreased by the addition of a wax layer, as shown by an increase in recovery time in Fluorescence Recovery After Photobleaching (FRAP) measurements. In general, two advanced methods, potentially suitable for drug delivery systems, have been proposed. In both cases, if biocompatible elements are used to fabricate the capsule wall, these systems provide a stable method of encapsulating active molecules. Stable encapsulation, coupled with the ability to tune the wall thickness, gives the ability to control the release profile of the molecule of interest.
Collisions of black holes and neutron stars, called mixed binaries in the following, are interesting for at least two reasons. Firstly, they are expected to emit a large amount of energy as gravitational waves, which could be measured by the new detectors. The form of those waves is expected to carry information about the internal structure of such systems. Secondly, collisions of such objects are the prime suspects for short gamma-ray bursts, although the exact mechanism of the energy emission is so far unknown. In the past, the Newtonian theory of gravitation and modifications of it were often used for numerical simulations of collisions of mixed binary systems. However, near such objects the gravitational forces are so strong that the use of General Relativity is necessary for accurate predictions. General relativistic simulations pose many problems; systems of two neutron stars and systems of two black holes have, however, been studied extensively in the past, and many of those problems have been solved. One of the remaining problems has been the treatment of hydrodynamics at excision boundaries. Inside excision regions, no evolution is carried out. Such regions are often used inside black holes to circumvent instabilities of the numerical methods near the singularity. Methods to handle hydrodynamics at such boundaries are described and tested in this work. One important test, and the first application of those methods, was the simulation of a neutron star collapsing to a black hole. The success of these simulations, and in particular the performance of the excision methods, was an important step towards simulations of mixed binaries. Initial data are necessary for every numerical simulation; however, the construction of such initial data for general relativistic situations is in general very complicated. In this work it is shown how to obtain initial data for mixed binary systems using an existing method for initial data of two black holes. These initial data have been used for evolutions of such systems, and the problems encountered are discussed. One of the problems is an instability arising from the combination of different methods, which could be solved by dissipation of appropriate strength. Another problem is the expected drift of the black hole towards the neutron star. It is shown that this can be solved by using special gauge conditions, which prevent the black hole from moving on the computational grid. The methods and simulations shown in this work are only the starting point for a much more detailed study of mixed binary systems. Better methods, better models, and simulations with higher resolution and improved gauge conditions will be the focus of future work. It is expected that such detailed studies can give information about the emitted gravitational waves, which is important in view of the newly built gravitational-wave detectors. In addition, these simulations could give insight into the processes responsible for short gamma-ray bursts.
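The role of an excision region can be illustrated in one dimension. The following is a minimal, purely illustrative sketch (not the scheme developed in this work): a scalar field is advected on a grid, cells inside the excision region are simply not evolved, and the cell at the excision boundary is filled by extrapolation from the evolved side so that the neighbouring evolved cell always has valid upwind data.

```python
import numpy as np

def step(u, excised, c=1.0, dx=0.1, dt=0.05):
    """One upwind step of u_t + c*u_x = 0; excised cells are not evolved."""
    new = u.copy()
    for i in range(1, len(u)):
        if excised[i]:
            continue                           # no evolution inside the excision
        new[i] = u[i] - c * dt / dx * (u[i] - u[i - 1])
    # Fill the cell at the excision boundary by linear extrapolation from
    # the evolved side, so the adjacent evolved cell has valid upwind data.
    for i in np.where(excised)[0]:
        if i + 2 < len(u) and not excised[i + 1] and not excised[i + 2]:
            new[i] = 2.0 * new[i + 1] - new[i + 2]
    return new

x = np.linspace(0.0, 10.0, 101)
u = np.exp(-((x - 7.0) ** 2))                  # pulse far from the excision
u0 = u.copy()
excised = x < 2.0                              # region where nothing is evolved
for _ in range(20):
    u = step(u, excised)                       # evolution remains finite
```

In the general relativistic case the excised region lies inside the black hole horizon, so no physical information can propagate out of it; the extrapolation only serves the numerical stencil.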
Nicotine, in a wide variety of delivery forms, reduces the costs of invalid cues in the spatial cueing paradigm across several species. Which subprocess exactly is affected by nicotine has not been investigated so far. The common interpretation is that nicotine facilitates the disengagement of attention from a previously attended location. In five studies, three electrophysiological and two behavioural, three possible mechanisms of nicotine action were investigated in non-smokers. Experiments 1 and 2 addressed the question of whether nicotine modulates sensory gain control. For this purpose, event-related potentials (ERPs) were recorded in the Posner paradigm and the effect of nicotine on the attention-associated components P1 and N1 was examined. Nicotine reduced the costs of invalid cues when attention was directed by endogenous cues, but not with exogenous cues. The P1 and N1 components were unaffected by nicotine; the assumption of an effect on sensory suppression thus receives no support. Experiments 3 and 4 investigated whether nicotine has an effect on costly involuntary shifts of attention, i.e. distractions. In Experiment 3, distractions were triggered by deviant stimulus features in a spatial sustained-attention paradigm, and the effect of nicotine on a distraction-associated ERP component, the P3a, was examined. In Experiment 4, a distraction was triggered by additional stimuli in a cueing paradigm and the effect of nicotine on the reaction-time costs was investigated. Nicotine showed no influence on distraction costs in either study, and no effect on the P3a component in Experiment 3. In Experiment 4, the effect of nicotine on the disengagement of attention was additionally investigated by varying the difficulty of disengagement. Here, too, no nicotine effect was found. However, neither the frequently reported general shortening of reaction times nor the reduction of the costs of invalid cues could be replicated in either study, so that, on the one hand, no statement can be made about the effect of nicotine on distractions or the attentional disengagement process and, on the other hand, the question arose under which conditions nicotine shows a differential effect at all. In the final experiment, the frequency of response demands on the one hand and the temporal aspects of attentional orienting on the other were therefore varied, and the effect of nicotine on the validity effect, the reaction-time difference between validly and invalidly cued targets, was examined. In individuals who showed evidence of attentional orienting in all conditions, nicotine tended to reduce the validity effect in the most event-sparse condition, when voluntary orienting of attention was only rarely required. This may be taken as an indication that nicotine supports the top-down allocation of attentional resources under conditions that place high demands on vigilance.
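The validity effect referred to above, the reaction-time difference between invalidly and validly cued targets in the Posner paradigm, reduces to a simple computation. A minimal sketch on hypothetical trial data (the values are invented for illustration, not taken from the experiments):

```python
# Hypothetical trial data: each trial records the cue type and the
# reaction time in milliseconds.
trials = [
    {"cue": "valid", "rt": 312}, {"cue": "valid", "rt": 298},
    {"cue": "invalid", "rt": 355}, {"cue": "invalid", "rt": 341},
]

def validity_effect(trials):
    """Mean RT(invalid) minus mean RT(valid): the cost of an invalid cue."""
    mean = lambda xs: sum(xs) / len(xs)
    valid = [t["rt"] for t in trials if t["cue"] == "valid"]
    invalid = [t["rt"] for t in trials if t["cue"] == "invalid"]
    return mean(invalid) - mean(valid)
```

A positive value indicates a cueing cost; the final experiment asks whether nicotine shrinks this difference under particular vigilance demands.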
The advent of large-scale and high-throughput technologies has recently caused a shift in focus in contemporary biology from decades of reductionism towards a more systemic view. Alongside the availability of genome sequences, the exploration of organisms utilizing such approaches should give rise to a more comprehensive understanding of complex systems. Domestication and intensive breeding of crop plants have led to a parallel narrowing of their genetic basis. The potential to improve crops by conventional breeding using elite cultivars is therefore rather limited, and molecular technologies such as marker assisted selection (MAS) are currently being exploited to re-introduce allelic variance from wild species. Molecular breeding strategies have mostly focused on the introduction of yield- or resistance-related traits to date. However, given that medical research has highlighted the importance of crop compositional quality in the human diet, this research field is rapidly becoming more important. The chemical composition of biological tissues can be efficiently assessed by metabolite profiling techniques, which allow the multivariate detection of the metabolites of a given biological sample. Here, a GC/MS metabolite profiling approach has been applied to investigate natural variation of tomatoes with respect to the chemical composition of their fruits. The establishment of a mass spectral and retention index (MSRI) library was a prerequisite for this work in order to establish a framework for the identification of metabolites from a complex mixture. As mass spectral and retention index information is highly important for the metabolomics community, this library was made publicly available. Metabolite profiling of tomato wild species revealed large differences in chemical composition, especially of amino and organic acids, as well as in sugar composition and secondary metabolites. Intriguingly, the analysis of a set of S. pennellii introgression lines (IL) identified 889 quantitative trait loci of compositional quality and 326 yield-associated traits. These traits are characterized by increases/decreases not only of single metabolites but also of entire metabolic pathways, thus highlighting the potential of this approach in uncovering novel aspects of metabolic regulation. Finally, the biosynthetic pathway of the phenylalanine-derived fruit volatiles phenylethanol and phenylacetaldehyde was elucidated via a combination of metabolic profiling of natural variation, stable isotope tracer experiments and reverse genetic experimentation.
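Identification against a mass spectral and retention index (MSRI) library typically combines a retention-index window with a spectral similarity score. The following is a hedged sketch of such a lookup; the compound names, spectra and thresholds are invented for illustration and are not taken from the library described above:

```python
import math

def cosine(a, b):
    """Cosine similarity of two spectra given as {m/z: intensity} dicts."""
    dot = sum(a.get(mz, 0) * b.get(mz, 0) for mz in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def identify(spectrum, ri, library, ri_window=5.0, min_score=0.7):
    """Best-scoring library hit within the retention-index window, or None."""
    best = None
    for name, (lib_ri, lib_spec) in library.items():
        if abs(ri - lib_ri) > ri_window:
            continue                      # RI filter rejects this candidate
        score = cosine(spectrum, lib_spec)
        if score >= min_score and (best is None or score > best[1]):
            best = (name, score)
    return best

# Invented library entries: name -> (retention index, spectrum)
library = {
    "citrate":   (1810.0, {273: 100, 375: 40, 147: 80}),
    "glutamate": (1620.0, {246: 100, 128: 60, 147: 70}),
}
hit = identify({273: 95, 375: 45, 147: 85}, 1812.0, library)
```

Restricting candidates by retention index before spectral matching is what makes the combined MSRI information more discriminating than mass spectra alone.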
In semi-arid savannas, unsustainable land use can lead to degradation of entire landscapes, e.g. in the form of shrub encroachment. This leads to habitat loss and is assumed to reduce species diversity. In BIOTA phase 1, we investigated the effects of land use on population dynamics at the farm scale. In phase 2 we scale up to consider the whole regional landscape, consisting of a diverse mosaic of farms with different historic and present land-use intensities. This mosaic creates a heterogeneous, dynamic pattern of structural diversity at a large spatial scale. Understanding how the region-wide dynamic land-use pattern affects the abundance of animal and plant species requires the integration of processes on large as well as on small spatial scales. In our multidisciplinary approach, we integrate information from remote sensing, genetic and ecological field studies as well as small-scale process models in a dynamic region-wide simulation tool.
Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung workshop, 9-10 February 2006.
Decisions for the conservation of biodiversity and the sustainable management of natural resources are typically made at large scales, i.e. the landscape level. However, understanding and predicting the effects of land use and climate change at scales relevant for decision-making requires including both large-scale vegetation dynamics and small-scale processes, such as soil-plant interactions. Integrating the results of multiple BIOTA subprojects enabled us to include the necessary data from soil science, botany, socio-economics and remote sensing in a high-resolution, process-based and spatially explicit model. Using an example from a sustainably used research farm and a communally used and degraded farming area in semiarid southern Namibia, we show the power of simulation models as a tool to integrate processes across disciplines and scales.
With the political changes in the states of the former Eastern Bloc, many formerly military sites underwent a profound change in use. Military training areas, as heavily disturbed components of our cultural landscape, feature habitat mosaics of high conservation value with specialised communities over large areas. The change in use is associated with changes in vegetation structure (succession) and further landscape-ecological processes. The former Döberitz military training area north of the state capital Potsdam looks back on a long history of military use (the first manoeuvres of the "Soldier King" took place in 1713). After 1992, the Döberitzer Heide nature reserve (3,415 ha) and the Ferbitzer Bruch nature reserve (1,155 ha) were designated. As protected areas under the Birds Directive, they are part of the coherent Natura 2000 network of protected areas of the European Community. Despite its protection status and the residual military contamination, the area, as the largest contiguous natural area in the core conurbation of the state of Brandenburg, is under high use pressure.
In their effort to manage agricultural land in a site-specific way, a growing number of farms collect information on the spatio-temporal distribution of soil and plant characteristics on their fields. This information is used directly (real-time approach) or indirectly (map approach) to dose fertilisers and plant-protection agents (precision farming). Data are collected primarily with vehicle-mounted sensors and with VIS and NIR aerial images taken from light aircraft. The first farms are purchasing processed satellite remote-sensing data from service providers. Agricultural and agricultural-engineering research aims to identify the underlying patterns (e.g. of yield potential) and thereby keep the farms' effort for regular data acquisition low.
Fluvial systems are one of the major features shaping a landscape. They adjust to the prevailing tectonic and climatic setting and therefore are very sensitive markers of changes in these systems. If their response to tectonic and climatic forcing is quantified, and if the climatic signal is excluded, it is possible to derive a local deformation history. Here, we investigate fluvial terraces and erosional surfaces in the southern Chilean forearc to assess the long-term geomorphic, and hence tectonic, evolution. Remote sensing and field studies of the Nahuelbuta Range show that the long-term deformation of the Chilean forearc is manifested by breaks in topography, sequences of differentially uplifted marine, alluvial and strath terraces, as well as tectonically modified river courses and drainage basins. We used SRTM-90 data as the basic elevation information for extracting and delineating drainage networks. We calculated hypsometric curves as an indicator for basin uplift, stream-length gradient indices to identify stream segments with anomalous slopes, and longitudinal river profiles as well as DS-plots to identify knickpoints and other anomalies. In addition, we investigated topography with elevation-slope graphs, profiles, and DEMs to reveal erosional surfaces. During the first field trip we measured palaeoflow directions, performed pebble counting and sampled the fluvial terraces in order to apply cosmogenic nuclide dating (¹⁰Be, ²⁶Al) as well as provenance analyses. Our preliminary analysis of the Coastal Cordillera indicates a clear segmentation between the northern and southern parts of the Nahuelbuta Range. The Lanalhue Fault, a NW-SE striking fault zone oblique to the plate boundary, defines the segment boundary. Furthermore, we find a complex drainage re-organisation, including a drainage reversal and a wind gap on the divide between the Tirúa and Pellahuén basins east of the town of Tirúa.
The coastal basins lost most of their Andean sediment supply areas that existed in the Tertiary and in part during early Pleistocene time. Between the Bío-Bío and Imperial rivers, no Andean river is currently capable of traversing the Coastal Cordillera, suggesting ongoing Quaternary uplift of the entire range. From the spatial distribution of geomorphic surfaces in this region, two uplift signals may be derived: (1) a long-term differential uplift process, active since the Miocene and possibly caused by underplating of subducted trench sediments, and (2) a younger, local uplift affecting only the northern part of the Nahuelbuta Range that may be caused by the interaction of the forearc with the subduction of the Mocha Fracture Zone at the latitude of the Arauco peninsula. Our approach thus contributes to deciphering the characteristics of forearc development at active convergent margins using long-term geomorphic indicators. Furthermore, it is expected that our ongoing assessment will constrain repeatedly active zones of deformation.
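Two of the geomorphic indices used in this study have simple closed forms: the stream-length gradient index SL = (dH/dL)·L, where L is the distance from the divide to the midpoint of the segment, and the hypsometric integral, which can be approximated as (mean − min)/(max − min) of basin elevations. A sketch on a synthetic river profile (not the SRTM data analysed here):

```python
def sl_indices(dist, elev):
    """Stream-length gradient index SL = (dH/dL) * L for each segment.
    dist: downstream distance from the divide (m); elev: elevation (m)."""
    out = []
    for i in range(len(dist) - 1):
        dL = dist[i + 1] - dist[i]
        dH = elev[i] - elev[i + 1]            # elevation drop over the segment
        L = 0.5 * (dist[i] + dist[i + 1])     # distance to the segment midpoint
        out.append(dH / dL * L)
    return out

def hypsometric_integral(elevations):
    """Approximation (mean - min) / (max - min); high values suggest uplift."""
    lo, hi = min(elevations), max(elevations)
    return (sum(elevations) / len(elevations) - lo) / (hi - lo)

dist = [1000.0, 2000.0, 3000.0, 4000.0]       # synthetic profile, m
elev = [900.0, 800.0, 750.0, 600.0]           # m a.s.l.
sl = sl_indices(dist, elev)                   # the steep last segment stands out
hi_int = hypsometric_integral(elev)
```

A segment whose SL value jumps well above its neighbours, like the last one here, is the kind of anomaly used to flag knickpoints in the longitudinal profiles.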
The rigorous development, application and validation of distributed hydrological models requires evaluating data in a spatially distributed way. In particular, spatial model predictions, such as the distribution of soil moisture, runoff-generating areas, nutrient-contributing areas or erosion rates, are to be assessed against spatially distributed observations. Model inputs, such as the distribution of modelling units derived by GIS and remote sensing analyses, should also be evaluated against ground-based observations of landscape characteristics. So far, however, quantitative methods of spatial field comparison have rarely been used in hydrology. In this paper, we present algorithms that allow the comparison of observed and simulated spatial hydrological data. The methods can be applied to binary and categorical data on regular grids. They comprise cell-by-cell algorithms, cell-neighbourhood approaches that account for fuzziness of location, and multi-scale algorithms that evaluate the similarity of spatial fields with changing resolution. All methods provide a quantitative measure of the similarity of two maps. The comparison methods are applied in two mountainous catchments in southern Germany (Brugga, 40 km²) and Austria (Löhnersbach, 16 km²). As an example of binary hydrological data, the distribution of saturated areas is analyzed in both catchments. For categorical data, vegetation zones that are associated with different runoff generation mechanisms are analyzed in the Löhnersbach. Mapped spatial patterns are compared to simulated patterns from terrain index calculations and from satellite image analysis. It is discussed how particular features of visual similarity between the spatial fields are captured by the quantitative measures, leading to recommendations on suitable algorithms in the context of evaluating distributed hydrological models.
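Two of the algorithm families described, cell-by-cell comparison and cell-neighbourhood comparison accounting for fuzziness of location, can be sketched for binary maps on a regular grid. The code below is a simplified illustration of the idea, not the paper's actual measures:

```python
import numpy as np

def cell_by_cell(a, b):
    """Fraction of cells on which the two binary maps agree exactly."""
    return float(np.mean(a == b))

def neighbourhood_agreement(a, b, radius=1):
    """A cell of `a` counts as matched if `b` holds the same value anywhere
    within `radius` cells: a simple way to tolerate fuzziness of location."""
    n, m = a.shape
    hits = 0
    for i in range(n):
        for j in range(m):
            i0, i1 = max(0, i - radius), min(n, i + radius + 1)
            j0, j1 = max(0, j - radius), min(m, j + radius + 1)
            if (b[i0:i1, j0:j1] == a[i, j]).any():
                hits += 1
    return hits / (n * m)

# Toy maps: the simulated saturated area is shifted by one cell.
obs = np.array([[1, 0, 0],
                [1, 1, 0],
                [0, 0, 0]])
sim = np.array([[0, 1, 0],
                [1, 1, 0],
                [0, 0, 0]])
strict = cell_by_cell(obs, sim)            # penalises the one-cell shift
fuzzy = neighbourhood_agreement(obs, sim)  # tolerates it
```

The contrast between the two scores on a shifted pattern is exactly why neighbourhood approaches are attractive for evaluating simulated saturated-area maps, where small positional errors should not be counted as total failures.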
Modern software systems are complex constructs that are frequently deployed in combination with other technical and business systems. For the vendors of such systems, meeting the often far-reaching requirements regarding their adaptability is a major challenge. To satisfy these requirements, integrating a virtual machine into the system in question has proven effective in many cases. The dissertation is aimed in particular at readers facing the task of integrating virtual machines into existing systems, and it seeks to present clearly the relationships relevant for such integration decisions. Typically, the integration of a virtual machine into a system raises a number of distinct problems. Since these problems are often closely intertwined, considering them in isolation is usually not useful. They are therefore introduced by means of a central, very extensive example from industrial practice: the integration of the Java Virtual Machine into the SAP R/3 Application Server. Following this practical example, the discussion of the integration problem is deepened with reference to a selection of further integration examples described in the literature. The main difficulty in treating the integration problem was that the available descriptions of the systems used as examples were only partially suitable as a basis for the discussion. To create a usable basis for discussion, it was therefore necessary to produce a homogeneous, consistent model of these systems. The systems were modelled using the Fundamental Modeling Concepts (FMC). The resulting models, together with the comparison, based on these models, of the different approaches to solving typical integration problems, form the main contribution of the dissertation. In connection with the integration of virtual machines into existing systems, there is frequently a need to have the integrated virtual machine execute several "programs" at the same time. Given the design of many of today's widespread virtual machines, realising resource-efficient multiprogramming is a major challenge. The presentation of the range of measures for realising such resource-efficient multiprogramming constitutes a second major contribution of the dissertation.