Soil in a changing world is subject to both anthropogenic and environmental stresses. Soil monitoring is essential to assess the magnitude of changes in soil variables and how they affect ecosystem processes and human livelihoods. However, we cannot always be sure which sampling design is best for a given monitoring task. We employed rotational stratified simple random sampling (rotStRS) for the estimation of temporal changes in the spatial mean of saturated hydraulic conductivity (K_s) at three sites in central Panama in 2009, 2010, and 2011. To assess this design's efficiency we compared the resulting estimates of the spatial mean and variance for 2009 with those gained from stratified simple random sampling (StRS), which was effectively the data obtained on the first sampling occasion, and with an equivalent unexecuted simple random sampling (SRS). The poor performance of geometrical stratification and the weak predictive relationship between measurements of successive years yielded no advantage of sampling designs more complex than SRS. The failure of stratification may be attributed to the small large-scale variability of K_s. Revisiting previously sampled locations was not beneficial because of the large small-scale variability in combination with destructive sampling, resulting in poor consistency between revisited samples. We conclude that for our K_s monitoring scheme, repeated SRS is as effective as rotStRS. Some problems of small-scale variability might be overcome by collecting several samples at close range to reduce the effect of small-scale variation. Finally, we give recommendations on the key factors to consider when deciding whether to use stratification and rotation in a soil monitoring scheme.
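The designs compared above differ mainly in how the mean and its sampling variance are estimated. A minimal sketch, on synthetic data (the grid, strata, and sample sizes are illustrative assumptions, not the study's setup), of SRS versus stratified SRS estimation:

```python
import random
import statistics

random.seed(1)

# Synthetic "field": log-normally distributed K_s values on a 20 x 20 grid,
# split into 4 equal geometric strata (quadrants). Purely illustrative.
field = [[random.lognormvariate(2.0, 1.0) for _ in range(20)] for _ in range(20)]

def srs_mean(field, n):
    """Simple random sampling: mean and estimated variance of the mean."""
    cells = [v for row in field for v in row]
    sample = random.sample(cells, n)
    return statistics.mean(sample), statistics.variance(sample) / n

def strs_mean(field, n_per_stratum):
    """Stratified SRS: sample each quadrant, combine with equal weights."""
    strata = []
    for qi in (0, 10):
        for qj in (0, 10):
            strata.append([field[i][j] for i in range(qi, qi + 10)
                                       for j in range(qj, qj + 10)])
    means, variances = [], []
    for cells in strata:
        s = random.sample(cells, n_per_stratum)
        means.append(statistics.mean(s))
        variances.append(statistics.variance(s) / n_per_stratum)
    est = sum(means) / 4                 # equal stratum weights of 1/4
    var = sum(v / 16 for v in variances)  # (1/4)^2 per stratum
    return est, var

print(srs_mean(field, 40))
print(strs_mean(field, 10))
```

Stratification only pays off when between-stratum variability is large relative to within-stratum variability; with the small large-scale variability reported above, both estimators behave similarly.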
Many perceptual and cognitive tasks permit or require the integrated cooperation of specialized sensory channels, detectors, or other functionally separate units. In compound detection or discrimination tasks, 1 prominent general mechanism to model the combination of the output of different processing channels is probability summation. The classical example is the binocular summation model of Pirenne (1943), according to which a weak visual stimulus is detected if at least 1 of the 2 eyes detects this stimulus; as we review briefly, exactly the same reasoning is applied in numerous other fields. It is generally accepted that this mechanism necessarily predicts performance based on 2 (or more) channels to be superior to single channel performance, because 2 separate channels provide "2 chances" to succeed with the task. We argue that this reasoning is misleading because it neglects the increased opportunity with 2 channels not just for hits but also for false alarms and that there may well be no redundancy gain at all when performance is measured in terms of receiver operating characteristic curves. We illustrate and support these arguments with a visual detection experiment involving different spatial uncertainty conditions. Our arguments and findings have important implications for all models that, in one way or another, rest on, or incorporate, the notion of probability summation for the analysis of detection tasks, 2-alternative forced-choice tasks, and psychometric functions.
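The core of the argument above is one line of algebra: an OR-combination of independent channels raises the probability of a response to signal and noise trials alike. A minimal numeric sketch (the single-channel hit and false-alarm rates are hypothetical):

```python
def prob_summation(p, n=2):
    """P(at least one of n independent channels responds), 1 - (1-p)^n."""
    return 1 - (1 - p) ** n

# Hypothetical single-channel rates: hit rate 0.6, false-alarm rate 0.2.
hit_1, fa_1 = 0.6, 0.2
hit_2 = prob_summation(hit_1)   # 1 - 0.4**2 = 0.84
fa_2 = prob_summation(fa_1)     # 1 - 0.8**2 = 0.36

# Both rates rise, so the two-channel ROC point moves up *and* to the
# right; a higher hit rate alone does not demonstrate a redundancy gain.
print(hit_2, fa_2)
```

Whether performance actually improves must therefore be judged on the full ROC, not on hit rates at a fixed criterion.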
The term Linked Data refers to connected information sources comprising structured data about a wide range of topics and for a multitude of applications. In recent years, the conceptional and technical foundations of Linked Data have been formalized and refined. To this end, well-known technologies have been established, such as the Resource Description Framework (RDF) as a Linked Data model or the SPARQL Protocol and RDF Query Language (SPARQL) for retrieving this information. Whereas most research has been conducted in the area of generating and publishing Linked Data, this thesis presents novel approaches for improved management. In particular, we illustrate new methods for analyzing and processing SPARQL queries. Here, we present two algorithms suitable for identifying structural relationships between these queries. Both algorithms are applied to a large number of real-world requests to evaluate the performance of the approaches and the quality of their results. Based on this, we introduce different strategies enabling optimized access of Linked Data sources. We demonstrate how the presented approach facilitates effective utilization of SPARQL endpoints by prefetching results relevant for multiple subsequent requests. Furthermore, we contribute a set of metrics for determining technical characteristics of such knowledge bases. To this end, we devise practical heuristics and validate them through thorough analysis of real-world data sources. We discuss the findings and evaluate their impact on utilizing the endpoints. Moreover, we detail the adoption of a scalable infrastructure for improving Linked Data discovery and consumption. As we outline in an exemplary use case, this platform is eligible both for processing and provisioning the corresponding information.
Morphological systems are constrained in how they interact with each other. One case that has been widely studied in the psycholinguistic literature is the avoidance of plurals inside compounds (e.g. *rats eater vs. rat eater) in English and other languages, the so-called plurals-in-compounds effect. Several previous studies have shown that both adult and child speakers are sensitive to this contrast, but the question of whether semantic, morphological, or surface-form constraints are responsible for the plurals-in-compounds effect remains controversial. The present study provides new empirical evidence from adult and child English to resolve this controversy. Graded linguistic judgments were obtained from 96 children (age range: 7;06 to 12;08) and 32 adults. In the task, participants were asked to rate compounds containing different kinds of singular or plural modifiers. The results indicated that both children and adults disliked regular plurals inside compounds, whereas irregular plurals were rated as marginal and singulars as fully acceptable. Furthermore, acceptability ratings were found not to be affected by the phonological surface form of a compound-internal modifier. We conclude that semantic and morphological (rather than surface-form) constraints are responsible for the plurals-in-compounds effect, in both children and adults.
This study examines the course and driving forces of recent vegetation change in the Mongolian steppe. A sediment core covering the last 55 years from a small closed-basin lake in central Mongolia was analyzed for its multi-proxy record at annual resolution. Pollen analysis shows that highest abundances of planted Poaceae and highest vegetation diversity occurred during 1977-1992, reflecting agricultural development in the lake area. A decrease in diversity and an increase in Artemisia abundance after 1992 indicate enhanced vegetation degradation in recent times, most probably because of overgrazing and farmland abandonment. Human impact is the main factor for the vegetation degradation within the past decades as revealed by a series of redundancy analyses, while climate change and soil erosion play subordinate roles. High Pediastrum (a green alga) influx, high atomic total organic carbon/total nitrogen (TOC/TN) ratios, abundant coarse detrital grains, and the decrease of δ13C(org) and δ15N since about 1977 but particularly after 1992 indicate that abundant terrestrial organic matter and nutrients were transported into the lake and caused lake eutrophication, presumably because of intensified land use. Thus, we infer that the transition to a market economy in Mongolia since the early 1990s not only caused dramatic vegetation degradation but also affected the lake ecosystem through anthropogenic changes in the catchment area.
Who becomes a father, and when?
(2014)
Modern motor vehicles are equipped with a multitude of sensors required for smooth technical operation. These include vehicle-specific sensors (e.g. engine speed and vehicle speed) as well as environment-specific sensors (e.g. air pressure and ambient temperature). Increasing technical connectivity makes it possible to use these data from the vehicle electronics outside the vehicle for a wide variety of purposes.
This thesis aims to contribute to making this new kind of mass data usable as geoinformation in the sense of the "Extended Floating Car Data" (XFCD) concept and to applying it to spatio-temporal visualizations (for visual analysis). In this context, the perspective of environmental and traffic monitoring is specifically considered, with requirements and potentials examined by means of expert interviews. The question arises which data the vehicle electronics can supply and how these can be captured, processed, visualized, and publicly provided in a way that is as automated as possible. Besides theoretical and technical foundations of data acquisition and use, the focus lies on methods of cartographic visualization. A further question is whether a technical implementation is possible using open source software exclusively. The thesis culminates in a two-part approach comprising, on the one hand, the visualization for an exemplary application scenario and, on the other, a prototypical implementation ranging from data acquisition in the vehicle, using the legally mandated on-board diagnostics (OBD) interface and a smartphone-based workflow, to web-based visualization.
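The on-board diagnostics interface mentioned above exposes sensor values as standardized mode-01 PIDs whose byte-level decoding formulas are public (SAE J1979). A minimal sketch of decoding a few such payloads (the frames shown are made-up example values):

```python
def decode_pid(pid, data):
    """Decode a raw OBD-II mode-01 response payload (list of data bytes)."""
    if pid == 0x0C:                              # engine RPM
        return (256 * data[0] + data[1]) / 4.0   # revolutions per minute
    if pid == 0x0D:                              # vehicle speed
        return float(data[0])                    # km/h
    if pid == 0x33:                              # absolute barometric pressure
        return float(data[0])                    # kPa
    raise ValueError(f"unsupported PID {pid:#04x}")

# Example payloads (the data bytes following the 41 <PID> response echo):
print(decode_pid(0x0C, [0x1A, 0xF8]))   # 1726.0 rpm
print(decode_pid(0x0D, [0x50]))         # 80.0 km/h
```

Pairing each decoded value with a GPS fix and a timestamp on the smartphone yields exactly the kind of XFCD record the thesis feeds into its visualization pipeline.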
Geometric generalization is a fundamental concept in the digital mapping process. An increasing amount of spatial data is provided on the web as well as a range of tools to process it. This jABC workflow is used for the automatic testing of web-based generalization services like mapshaper.org by executing its functionality, overlaying both datasets before and after the transformation and displaying them visually in a .tif file. Mostly Web Services and command line tools are used to build an environment where ESRI shapefiles can be uploaded, processed through a chosen generalization service and finally visualized in Irfanview.
The Dansgaard-Oeschger oscillations and Heinrich events described in North Atlantic sediments and Greenland ice are expressed in the climate of the tropics, for example, as documented in Arabian Sea sediments. Given the strength of this teleconnection, we seek to reconstruct its range of environmental impacts. We present geochemical and sedimentological data from core SO130-289KL from the Indus submarine slope spanning the last ~80 kyr. Elemental and grain size analyses consistently indicate that interstadials are characterized by an increased contribution of fluvial suspension from the Indus River. In contrast, stadials are characterized by an increased contribution of aeolian dust from the Arabian Peninsula. Decadal-scale shifts at climate transitions, such as onsets of interstadials, were coeval with changes in productivity-related proxies. Heinrich events stand out as especially dry and dusty events, indicating a dramatically weakened Indian summer monsoon, potentially increased winter monsoon circulation, and increased aridity on the Arabian Peninsula. This finding is consistent with other paleoclimate evidence for continental aridity in the northern tropics during these events. Our results strengthen the evidence that circum-North Atlantic temperature variations translate to hydrological shifts in the tropics, with major impacts on regional environmental conditions such as rainfall, river discharge, aeolian dust transport, and ocean margin anoxia.
Context. It is not yet clear whether magnetic fields play an essential role in shaping planetary nebulae (PNe), or whether stellar rotation alone and/or a close binary companion, stellar or substellar, can account for the variety of the observed nebular morphologies.
Aims. In a quest for empirical evidence verifying or disproving the role of magnetic fields in shaping planetary nebulae, we follow up on previous attempts to measure the magnetic field in a representative sample of PN central stars.
Methods. We obtained low-resolution polarimetric spectra with FORS2 installed on the Antu telescope of the VLT for a sample of 12 bright central stars of PNe with different morphologies, including two round nebulae, seven elliptical nebulae, and three bipolar nebulae. Two targets are Wolf-Rayet type central stars.
Results. For the majority of the observed central stars, we do not find any significant evidence for the existence of surface magnetic fields. However, our measurements may indicate the presence of weak mean longitudinal magnetic fields of the order of 100 Gauss in the central star of the young elliptical planetary nebula IC 418 as well as in the Wolf-Rayet type central star of the bipolar nebula Hen 2-113 and the weak emission line central star of the elliptical nebula Hen 2-131. A clear detection of a 250 G mean longitudinal field is achieved for the A-type companion of the central star of NGC 1514. Some of the central stars show a moderate night-to-night spectrum variability, which may be the signature of a variable stellar wind and/or rotational modulation due to magnetic features.
Conclusions. Since our analysis indicates only weak fields, if any, in a few targets of our sample, we conclude that strong magnetic fields of the order of kG are not widespread among PNe central stars. Nevertheless, simple estimates based on a theoretical model of magnetized wind bubbles suggest that even weak magnetic fields below the current detection limit of the order of 100 G may well be sufficient to contribute to the shaping of the surrounding nebulae throughout their evolution. Our current sample is too small to draw conclusions about a correlation between nebular morphology and the presence of stellar magnetic fields.
Process models specify behavioral execution constraints between activities as well as between activities and data objects. A data object is characterized by its states and state transitions represented as object life cycle. For process execution, all behavioral execution constraints must be correct. Correctness can be verified via soundness checking which currently only considers control flow information. For data correctness, conformance between a process model and its object life cycles is checked. Current approaches abstract from dependencies between multiple data objects and require fully specified process models although, in real-world process repositories, often underspecified models are found. Coping with these issues, we introduce the concept of synchronized object life cycles and we define a mapping of data constraints of a process model to Petri nets extending an existing mapping. Further, we apply the notion of weak conformance to process models to tell whether each time an activity needs to access a data object in a particular state, it is guaranteed that the data object is in or can reach the expected state. Then, we introduce an algorithm for an integrated verification of control flow correctness and weak data conformance using soundness checking.
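The notion of weak conformance described above — whenever an activity needs a data object in a particular state, the object is in or can reach that state — can be pictured as a reachability test over the object life cycle. An illustrative sketch (a toy life cycle, not the paper's Petri-net mapping or synchronization mechanism):

```python
from collections import deque

def reachable(transitions, start, target):
    """True if `target` equals `start` or is reachable via life-cycle transitions."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Illustrative life cycle of an "order" data object.
lifecycle = {
    "created":   ["confirmed", "cancelled"],
    "confirmed": ["shipped"],
    "shipped":   ["delivered"],
}

# Activity "ship order" requires state "shipped"; the object is "created".
print(reachable(lifecycle, "created", "shipped"))    # True: weakly conformant
print(reachable(lifecycle, "delivered", "created"))  # False: violation
```

The paper's contribution goes beyond this sketch by synchronizing multiple life cycles and verifying conformance jointly with control-flow soundness on the Petri-net mapping.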
The economic impact analysis contained in this book shows how irrigation farming is particularly susceptible to certain water management policies in the Australian Murray-Darling Basin, one of the world's largest river basins and Australia's most fertile region. By comparing different pricing and non-pricing water management policies with the help of the Water Integrated Market Model, it is found that the impact of water-demand-reducing policies is most severe on crops that need to be intensively irrigated and are at the same time less water productive. A combination of increasingly frequent and severe droughts and the application of policies that decrease agricultural water demand in the same region will create a situation in which the highly water-dependent crops rice and cotton cannot be cultivated at all.
Leaching of dissolved C in arable hummocky ground moraine soil landscapes is characterized by a spatial continuum of more or less erosion-affected Luvisols, Calcaric Regosols at exposed positions, and Colluvic Regosols in depressions. Our objective was to estimate the fluxes of dissolved C in four differently eroded soils as affected by erosion-induced pedological and soil structural alterations. In this model study, we considered landscape position effects by adapting the water table as the bottom boundary condition and erosion effects by using pedon-specific soil hydraulic properties. The one-dimensional vertical water movement was described with the Richards equation using HYDRUS-1D. Solute fluxes were obtained by combining calculated water fluxes with concentrations of dissolved organic and inorganic C (DOC and DIC, respectively) measured from soil solution extracted by suction cups at biweekly intervals. In the 3-yr period (2010-2012), DOC fluxes at the 2-m soil depth were similar at the three non-colluvic locations, with -0.8 ± 0.1 g m^-2 yr^-1 (i.e., outflow), but were 0.4 g m^-2 yr^-1 (i.e., input) in the depression. The DIC fluxes ranged from -10.2 g m^-2 yr^-1 for the eroded Luvisol, -9.2 g m^-2 yr^-1 for the Luvisol, and -6.1 g m^-2 yr^-1 for the Calcaric Regosol to 3.2 g m^-2 yr^-1 for the Colluvic Regosol. The temporal variations in DOC and DIC fluxes were controlled by water fluxes. The spatially distributed leaching results corroborate the hypothesis that the effects of soil erosion influence fluxes through modified hydraulic and transport properties and terrain-dependent boundary conditions.
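Combining modeled water fluxes with measured concentrations, as described above, amounts to a sum over the biweekly sampling intervals. A minimal sketch with invented numbers (not the study's data; sign convention: negative water flux = downward outflow):

```python
def solute_flux(water_fluxes_mm, concentrations_mg_per_l):
    """Cumulative solute flux in g m^-2 from per-interval water fluxes
    (mm, i.e. litres per m^2) and solute concentrations (mg per litre)."""
    return sum(q * c for q, c in
               zip(water_fluxes_mm, concentrations_mg_per_l)) / 1000.0

# Three hypothetical biweekly intervals: 1 mm of water flux = 1 l m^-2.
q = [-12.0, -5.0, 2.0]     # water flux per interval, mm
c = [8.0, 10.0, 15.0]      # DOC concentration per interval, mg/l
print(solute_flux(q, c))   # net DOC flux in g m^-2 over the period
```

The division by 1000 converts mg m^-2 to g m^-2; annual fluxes like those quoted above follow by summing all intervals within a year.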
Preface
(2014)
The outcome of the 2013 Bundestag election decided the future of the European model even before the 2014 European elections had taken place in Germany. With the result of this Bundestag election, an export and finance Keynesianism, co-instituted by the hegemon Germany for Europe's elites as well, will be established for a long time to come.
From Ulfila to Rekkared
(2014)
For about 200 years, from the end of the 4th century until 589 CE, the Visigoths adhered to a Christian confession that deviated from the orthodoxy of the imperial church. At the Third Council of Toledo, King Rekkared ended this state of religious alterity by converting to Catholicism. The ancient accounts paint only a superficially coherent picture of Gothic Christianity. Eike Faber, by contrast, demonstrates contradictions and errors in the transmitted sources and offers a series of new interpretations. Closer examination sharpens the contours of the dogmatic content of Gothic Christianity, brings out the motives for adopting the new faith, and refutes the traditional dating of this event. Finally, it becomes clear what function the continued adherence to a demonstratively different religious conviction had for the Goths. Striking milestones of Gothic history, such as Ulfila's Bible translation, the conquest and sack of Rome in 410 CE, and King Rekkared's demonstrative abandonment of religious difference, become intelligible only within this new, comprehensive context.
From Budapest to Strasbourg
(2014)
In 1964, the Hungarian cantor Marcel (Martón) Lorand, coming from the Neolog Great Synagogue of Budapest, took up the cantorial post at the Orthodox Synagogue de la Paix in Strasbourg. This contribution approaches his life and work through his record recordings and through the recollections of contemporaries. Lorand's example is used to consider the challenges a cantor may face when moving from one congregation to another. In one of the most important congregational functions, as leader of the synagogue service, a cantor must be flexible and able to adapt to local liturgical customs.
The aim of this work was the development of methods for the synthesis of phenol-based natural products, with sustainability placed at the forefront of method development. This means, for example, that unnecessary reaction steps were to be avoided by condensing several synthetic steps into one (tandem reaction). Furthermore, in the spirit of sustainability, reagents and solvents that are as non-toxic as possible were to be used, along with catalysts that can be reused multiple times. Within this work, methods for the construction of biphenols via Pd/C-catalyzed Suzuki-Miyaura couplings were developed. These methods are highly efficient in that the otherwise customary synthetic route of three reaction steps was reduced to a single stage. The reaction conditions were also designed so that plain water could serve as a completely non-toxic solvent. Moreover, a catalyst was chosen for these reactions that could be separated from the reaction mixture by simple filtration and reused several times in further reactions. The broad applicability of the methods was demonstrated by the synthesis of more than 100 compounds. With the developed methods, 14 natural products were synthesized, some for the first time. Such substances are produced, among others, by the economically important pome fruit crops (apples, pears) as a defense against pests. Consequently, these methods provided a synthetic route to potential crop protection agents. In the second part of this work, access to the likewise phenol-derived chromanones, chromones, and coumarins was investigated.
In these investigations, the development of two new tandem reactions revealed a sustainable and step-economical synthetic route to substituted benzo(dihydro)pyrones. By combining, for the first time, the Claisen rearrangement with an oxa-Michael addition or a conjugate addition, two completely atom-economical reactions were linked, enabling a highly efficient synthesis of allyl- and prenyl-substituted chromanones and chromones. Furthermore, allyl- and prenyl-substituted coumarins were obtained by applying a Claisen rearrangement-Wittig-lactonization reaction. The outstanding feature of these methods is that the respective natural product core was constructed and a lipophilic side chain generated in a single step. The development of these methods is of high pharmaceutical relevance, since they give access to compounds that possess both the required pharmacological scaffold and a side chain that considerably increases uptake, and thus efficacy, in the organism. In total, 15 chromanone, chromone, and coumarin natural products were synthesized using the developed methods, some for the first time.
On the Struggle for Peace
(2014)
The Kv-like (potassium voltage-dependent) K+ channels at the plasma membrane, including the inward-rectifying KAT1 K+ channel of Arabidopsis (Arabidopsis thaliana), are important targets for manipulating K+ homeostasis in plants. Gating modification, especially, has been identified as a promising means by which to engineer plants with improved characteristics in mineral and water use. Understanding plant K+ channel gating poses several challenges, despite many similarities to that of mammalian Kv and Shaker channel models. We have used site-directed mutagenesis to explore residues that are thought to form two electrostatic countercharge centers on either side of a conserved phenylalanine (Phe) residue within the S2 and S3 alpha-helices of the voltage sensor domain (VSD) of Kv channels. Consistent with molecular dynamic simulations of KAT1, we show that the voltage dependence of the channel gate is highly sensitive to manipulations affecting these residues. Mutations of the central Phe residue favored the closed KAT1 channel, whereas mutations affecting the countercharge centers favored the open channel. Modeling of the macroscopic current kinetics also highlighted a substantial difference between the two sets of mutations. We interpret these findings in the context of the effects on hydration of amino acid residues within the VSD and with an inherent bias of the VSD, when hydrated around a central Phe residue, to the closed state of the channel.
After more than a decade of multidisciplinary studies of the Central American subduction zone mainly in the framework of two large research programmes, the US MARGINS program and the German Collaborative Research Center SFB 574, we here review and interpret the data pertinent to quantify the cycling of mineral-bound volatiles (H2O, CO2, Cl, S) through this subduction system. For input-flux calculations, we divide the Middle America Trench into four segments differing in convergence rate and slab lithological profiles, use the latest evidence for mantle serpentinization of the Cocos slab approaching the trench, and for the first time explicitly include subduction erosion of forearc basement. Resulting input fluxes are 40-62 (53) Tg/Ma/m H2O, 7.8-11.4 (9.3) Tg/Ma/m CO2, 1.3-1.9 (1.6) Tg/Ma/m Cl, and 1.3-2.1 (1.6) Tg/Ma/m S (bracketed are mean values for entire trench length). Output by cold seeps on the forearc amounts to 0.625-1.25 Tg/Ma/m H2O partly derived from the slab sediments as determined by geochemical analyses of fluids and carbonates. The major volatile output occurs at the Central American volcanic arc that is divided into ten arc segments by dextral strike-slip tectonics. Based on volcanic edifice and widespread tephra volumes as well as calculated parental magma masses needed to form observed evolved compositions, we determine long-term (10^5 years) average magma and K2O fluxes for each of the ten segments as 32-242 (106) Tg/Ma/m magma and 0.28-2.91 (1.38) Tg/Ma/m K2O (bracketed are mean values for entire Central American volcanic arc length). Volatile/K2O concentration ratios derived from melt inclusion analyses and petrologic modelling then allow us to calculate volatile fluxes as 1.02-14.3 (6.2) Tg/Ma/m H2O, 0.02-0.45 (0.17) Tg/Ma/m CO2, and 0.07-0.34 (0.22) Tg/Ma/m Cl. The same approach yields long-term sulfur fluxes of 0.12-1.08 (0.54) Tg/Ma/m while present-day open-vent SO2-flux monitoring yields 0.06-2.37 (0.83) Tg/Ma/m S.
Input-output comparisons show that the arc water fluxes only account for up to 40 % of the input even if we include an "invisible" plutonic component constrained by crustal growth. With 20-30 % of the H2O input transferred into the deeper mantle as suggested by petrologic modeling, there remains a deficiency of, say, 30-40 % in the water budget. At least some of this water is transferred into two upper-plate regions of low seismic velocity and electrical resistivity whose sizes vary along arc: one region widely envelopes the melt ascent paths from slab top to arc and the other extends obliquely from the slab below the forearc to below the arc. Whether these reservoirs are transient or steady remains unknown.
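The flux scaling used above (volatile flux = melt-inclusion volatile/K2O concentration ratio x long-term K2O flux) is plain arithmetic. An illustrative check, where the H2O/K2O ratio of 4.5 is a hypothetical value chosen to roughly reproduce the quoted mean fluxes:

```python
def volatile_flux(k2o_flux, volatile_to_k2o):
    """Volatile flux (Tg/Ma/m) from a K2O flux and a volatile/K2O ratio."""
    return k2o_flux * volatile_to_k2o

# Mean arc K2O flux quoted in the text: 1.38 Tg/Ma/m. A hypothetical
# H2O/K2O ratio of 4.5 gives ~6.2 Tg/Ma/m, near the quoted mean H2O flux.
print(volatile_flux(1.38, 4.5))
```

The same one-liner, applied segment by segment with CO2/K2O, Cl/K2O, and S/K2O ratios, yields the per-volatile ranges listed in the abstract.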
VMP1-deficient Chlamydomonas exhibits severely aberrant cell morphology and disrupted cytokinesis
(2014)
Background: The versatile Vacuole Membrane Protein 1 (VMP1) has previously been investigated in six species. It has been shown to be essential in macroautophagy, where it takes part in autophagy initiation. In addition, VMP1 has been implicated in organellar biogenesis; endo-, exo-, and phagocytosis and protein secretion; apoptosis; and cell adhesion. These roles underlie its proven involvement in pancreatitis, diabetes, and cancer in humans.
Results: In this study we analyzed a VMP1 homologue from the green alga Chlamydomonas reinhardtii. CrVMP1 knockdown lines showed severe phenotypes, mainly affecting cell division as well as the morphology of cells and organelles. We also provide several pieces of evidence for its involvement in macroautophagy.
With the growth of virtualization and cloud computing, more and more forensic investigations rely on being able to perform live forensics on a virtual machine using virtual machine introspection (VMI). Inspecting a virtual machine through its hypervisor enables investigation without risking contamination of the evidence, crashing the computer, etc. To broaden access to these techniques for investigators and researchers, we have developed a new VMI monitoring language. This language is based on a review of the most commonly used VMI techniques to date, and it enables the user to monitor the virtual machine's memory, events, and data streams. A prototype implementation of our monitoring system was implemented in KVM, though implementation on any hypervisor that uses the common x86 virtualization hardware assistance support should be straightforward. Our prototype outperforms the proprietary VMware VProbes in many cases, with a maximum performance loss of 18% for a realistic test case, which we consider acceptable. Our implementation is freely available under a liberal software distribution license.
Software maintenance encompasses any changes made to a software system after its initial deployment and is thereby one of the key phases in the typical software-engineering lifecycle. In software maintenance, we primarily need to understand structural and behavioral aspects, which are difficult to obtain, e.g., by code reading. Software analysis is therefore a vital tool for maintaining these systems: it provides the (preferably automated) means to extract and evaluate information from their artifacts, such as software structure, runtime behavior, and related processes. However, such analysis typically results in massive raw data, so that even experienced engineers face difficulties directly examining, assessing, and understanding these data. Among other things, they require tools with which to explore the data if no clear question can be formulated beforehand. For this, software analysis and visualization provide its users with powerful interactive means. These enable the automation of tasks and, particularly, the acquisition of valuable and actionable insights into the raw data. For instance, one means for exploring runtime behavior is trace visualization. This thesis aims at extending and improving the tool set for visual software analysis by concentrating on several open challenges in the fields of dynamic and static analysis of software systems. This work develops a series of concepts and tools for the exploratory visualization of the respective data to support users in finding and retrieving information on the system artifacts concerned. This is a difficult task, due to the lack of appropriate visualization metaphors; in particular, the visualization of complex runtime behavior poses various questions and challenges of both a technical and conceptual nature.
This work focuses on a set of visualization techniques for visually representing control-flow related aspects of software traces from shared-memory software systems: A trace-visualization concept based on icicle plots aids in understanding both single-threaded as well as multi-threaded runtime behavior on the function level. The concept’s extensibility further allows the visualization and analysis of specific aspects of multi-threading such as synchronization, the correlation of such traces with data from static software analysis, and a comparison between traces. Moreover, complementary techniques for simultaneously analyzing system structures and the evolution of related attributes are proposed. These aim at facilitating long-term planning of software architecture and supporting management decisions in software projects by extensions to the circular-bundle-view technique: An extension to 3-dimensional space allows for the use of additional variables simultaneously; interaction techniques allow for the modification of structures in a visual manner. The concepts and techniques presented here are generic and, as such, can be applied beyond software analysis for the visualization of similarly structured data. The techniques' practicability is demonstrated by several qualitative studies using subject data from industry-scale software systems. The studies provide initial evidence that the techniques' application yields useful insights into the subject data and its interrelationships in several scenarios.
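The icicle-plot concept described above can be illustrated with a minimal layout sketch: each call in a trace becomes a rectangle whose width corresponds to its duration and whose row corresponds to its call depth, with children laid out left to right beneath their parent. This is a generic illustration of the layout idea, not the thesis's implementation; the node fields and function name are hypothetical.

```python
# Hedged sketch of an icicle-plot layout for a call tree: width = duration,
# row = call depth. Field names ("name", "duration", "children") are
# illustrative, not taken from the thesis.

def icicle_layout(node, x0=0.0, depth=0, rects=None):
    """Return (name, x, width, depth) rectangles for a call-tree node."""
    if rects is None:
        rects = []
    rects.append((node["name"], x0, node["duration"], depth))
    child_x = x0
    for child in node.get("children", []):
        # Each child occupies a horizontal slice below its parent.
        icicle_layout(child, child_x, depth + 1, rects)
        child_x += child["duration"]
    return rects

trace = {"name": "main", "duration": 10.0, "children": [
    {"name": "parse", "duration": 4.0, "children": []},
    {"name": "render", "duration": 5.0, "children": []},
]}
rects = icicle_layout(trace)
print(rects)
```

Rendering these rectangles (e.g., with any 2D drawing library) yields the characteristic icicle layout; multi-threaded traces can be shown as one such plot per thread.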
A workflow for visualizing server connections using the Google Maps API was built in the jABC. It makes use of three basic services: an XML-based IP-address geolocation web service, a command-line tool, and the Static Maps API. The result of the workflow is a URL leading to an image file of a map showing server connections between a client and a target host.
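The final step of such a workflow, composing a Static Maps URL that draws a line between two geolocated hosts, can be sketched as follows. This is a minimal illustration, not the jABC workflow itself; the coordinates, function name, and API-key placeholder are assumptions, and the geolocation lookup is taken as already done.

```python
# Sketch: build a Google Static Maps URL showing a path between a client and
# a target host, given (lat, lon) pairs already obtained from an IP-geolocation
# service. "YOUR_API_KEY" is a placeholder, not a real credential.
from urllib.parse import urlencode

def static_map_url(client, target, size="640x400", key="YOUR_API_KEY"):
    """Return a Static Maps URL with a polyline from client to target."""
    path = "color:0x0000ff|weight:3|{},{}|{},{}".format(*client, *target)
    params = {
        "size": size,  # image dimensions in pixels
        "path": path,  # polyline between the two endpoints
        "key": key,    # API key placeholder (assumption)
    }
    return "https://maps.googleapis.com/maps/api/staticmap?" + urlencode(params)

url = static_map_url((52.4, 13.0), (37.4, -122.1))
print(url)
```

Opening the resulting URL in a browser returns the map image; in the workflow this URL is the final output passed to the user.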
Vielheit statt Einheit
(2014)
The project "Medienbildung in der LehrerInnenbildung" (media education in teacher training) aims to promote the sustainable use of digital media in the teacher-training degree programs at the University of Potsdam. Using music-teacher training (Chair of Music Pedagogy and Music Didactics) as an example, a concept for the use of video podcasts in school-based practical phases was developed to support student teachers in lesson planning. The subject-specific implementation of this e-learning approach and the associated opportunities and challenges are presented, underlining the importance of cooperation between subject didactics and media didactics in finding a needs-oriented solution that is practically feasible.
Using density functional theory and Ab Initio Molecular Dynamics with Electronic Friction (AIMDEF), we study the adsorption and dissipative vibrational dynamics of hydrogen atoms chemisorbed on free-standing lead films of increasing thickness. Lead films are known for the oscillatory behaviour of certain properties with increasing thickness; e.g., energy and electron spill-out change in a discontinuous manner due to quantum size effects [G. Materzanini, P. Saalfrank, and P.J.D. Lindan, Phys. Rev. B 63, 235405 (2001)]. Here, we demonstrate that oscillatory features also arise for hydrogen chemisorbed on lead films. Besides stationary properties of the adsorbate, we concentrate on the finite vibrational lifetimes of H-surface vibrations. As shown by AIMDEF, damping via vibration-electron-hole-pair coupling clearly dominates over the vibration-phonon channel, in particular for high-frequency modes. Vibrational relaxation times are a characteristic function of layer thickness due to the oscillating behaviour of the embedding surface electronic density. Implications derived from AIMDEF for frictional many-atom dynamics and for physisorbed species are also given. (C) 2014 AIP Publishing LLC.
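In the electronic-friction picture underlying AIMDEF, the vibration-electron-hole-pair damping channel is commonly modeled by a friction term in a Langevin-type equation of motion, whose coefficient sets the vibrational energy relaxation time. The following is a generic textbook-style relation for illustration; the symbols (effective mass \(\mu\), adsorbate coordinate \(q\), electronic friction coefficient \(\eta_{\mathrm{el}}\)) are not taken from the paper's notation.

```latex
% Generic electronic-friction picture (illustrative, not the paper's notation):
% the adsorbate coordinate q obeys a Langevin-type equation with an electronic
% friction coefficient \eta_{el}, which sets the vibrational lifetime \tau.
\begin{align}
  \mu \ddot{q} &= -\frac{\partial V}{\partial q}
                 - \eta_{\mathrm{el}}\,\dot{q} + R(t), \\
  \tau &= \frac{\mu}{\eta_{\mathrm{el}}}
\end{align}
```

Since \(\eta_{\mathrm{el}}\) depends on the embedding electronic density at the adsorbate, an oscillating surface electron density with film thickness translates directly into oscillating lifetimes \(\tau\), consistent with the quantum-size effects described above.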
Dieter Adelmann's working library is held by the Potsdam University Library and is cataloged in this volume; his papers and the finding aid are held by the Potsdam University Archive. Dieter Adelmann was born on 1 February 1936 in Eisenach, Thuringia. He studied philosophy, German studies, and sociology at the Freie Universität Berlin and at the University of Heidelberg, where he received his doctorate in 1968 under Dieter Henrich and Hans-Georg Gadamer with the thesis "Einheit des Bewusstseins als Grundlage der Philosophie Hermann Cohens". From 1968 to 1970 Adelmann was director of the "Collegium Academicum" of the University of Heidelberg; from 1970 to 1974 he was regional executive director of the SPD in Baden-Württemberg (responsible for political planning) and at times also constituency assistant to the SPD member of the Bundestag Horst Ehmke. Adelmann then worked as a publicist with the graphic artist and current president of the Berlin Academy of Arts, Klaus Staeck, before being employed from July 1977 through September 1979 at the Vorwärts in the parties-and-programs section. After leaving the Vorwärts, Adelmann worked freelance in Bonn, among other things as a journalist. In 1995 he was a research associate in the edition of Hermann Cohen's works at the Moses Mendelssohn Center and at the Chair of Interior Design at the Technische Universität Dresden. After the end of his work in Potsdam, Adelmann was a freelance philosopher and Cohen scholar until his death on 30 September 2008.
The Galactic center is an interesting region for high-energy (0.1-100 GeV) and very-high-energy (E > 100 GeV) gamma-ray observations. Potential sources of GeV/TeV gamma-ray emission have been suggested, e.g., the accretion of matter onto the supermassive black hole, cosmic rays from a nearby supernova remnant (e.g., Sgr A East), particle acceleration in a plerion, or the annihilation of dark matter particles. The Galactic center has been detected by EGRET and by Fermi/LAT in the MeV/GeV energy band. At TeV energies, the Galactic center was detected with moderate significance by the CANGAROO and Whipple 10 m telescopes and with high significance by H.E.S.S., MAGIC, and VERITAS. We present the results from three years of VERITAS observations conducted at large zenith angles, resulting in a detection of the Galactic center at the level of 18 standard deviations at energies above ~2.5 TeV. The energy spectrum is derived and is found to be compatible with hadronic, leptonic, and hybrid emission models discussed in the literature. Future, more detailed measurements of the high-energy cutoff and better constraints on the high-energy flux variability will help to refine and/or disentangle the individual models.