Rezension von zwei Büchern des Rabbiners Shimon Gershon Rosenberg
This article examines the figure of R. Hanania ben Dosa.
Úvod
(2011)
The present study is based on research conducted between 2007 and 2009. It examines seasonal labour migration from the Polish region of Konin, where economically motivated labour migration, as in similarly structured areas of Poland, has a long tradition reaching back to the 19th century. It considers seasonal migration abroad together with its economic, social and spatial effects, both from the perspective of the individual and his immediate surroundings and from that of the society and the migrants' region of origin.
The aim of this study was first to draw the attention of the library profession to the concept and significance of participatory budgeting and to encourage engagement with the topic. The public library can become an object of discussion and participation between citizens, politics and administration when the participatory budgeting process is used to achieve modernisation outcomes in a city through targeted funding. The study investigated whether participatory budgeting has the potential to contribute to the modernisation of public library services. Information was gathered, processed and evaluated by means of interviews.
Some 20 years after the end of the Soviet Union, a large share of predominantly rural regions in the Russian Federation remain in a structural crisis that manifests itself at the economic, social and political level. Even though rural areas have proved to be the supposed losers of the transformation, they are in many respects internally differentiated and display diverse problems and development paths that testify to how the challenges of the system change were handled. This is illustrated by the example of the German National Rayon Altai (DNR Altai), whose transformation phase is reconstructed in the present study. The DNR Altai is a special case in many respects, since, as the spatial focal point of development policy for Russian Germans, it was embedded in the German federal funding landscape. With the gradual withdrawal of the German funding institutions, however, the question arises of sustainable structures, the continuation of projects and the future of Russian-German culture in the Altai.
Zwischen Heterogenität und Hierarchie in der Bildung - Studien zur Unvollendbarkeit der Demokratie
(2011)
From the contents:
1 Introduction
2 Developments on the German labour market
2.1 The standard employment relationship and its significance for the German labour market
2.2 Flexibility of employment relationships
2.3 The development of employment relationships in the Federal Republic of Germany
3 The German system of social security
3.1 The Federal Republic of Germany as a conservative welfare state
3.2 The precariousness of atypical employment in the German system of social security
4 The flexicurity strategies of other countries
4.1 Denmark
4.2 The Netherlands
5 The performance of the labour-market and social-policy models presented, and conclusions for the Federal Republic
5.1 The performance of the models in Denmark, Germany and the Netherlands
5.2 Starting points for the Federal Republic of Germany
6 Conclusion and outlook
The international community of states usually opposes secessionist efforts to split up existing states. At the same time, it applies instruments of development policy in many countries and thereby also intervenes in the political process there. The study examines how development policy can be designed so that it does not, as a side effect, help a secessionist movement to succeed. Alongside the targeted promotion of economic growth, it also considers the instrument of decentralisation, which is often proposed as a means of "pacifying" separatist aspirations. Before that, however, it is shown that a policy aiming to prevent secessions can, at least in many cases, also be coherently justified in terms of moral philosophy. The study concludes with three case studies of secessions on the territory of the former Soviet Union.
Mathematics plays a considerable, if ambivalent, role in physics education. Often it even becomes an obstacle to learning physics and cannot unfold its emancipatory potential. The present study provides two building blocks for a well-founded conception of how to deal with mathematics when learning physics. In the theoretical part, aspects of the role of mathematics in physics drawn from the philosophy of science are reviewed and made accessible, in context, to the physics education research community. In addition, research findings on learners' views of physics and mathematics, and in the field of epistemology, are compiled. In the empirical part, views on the role of mathematics in physics held by pupils in grades 10 and 12 and by pre-service physics teachers in their undergraduate studies are surveyed by questionnaire and evaluated using content-analytical and statistical methods. Among other things, the results show that, contrary to common opinion, mathematics in physics lessons is not negatively connoted for learners, but, at least for younger learners, it carries formal and algorithmic connotations.
Zur Ektoparasitenfauna der Fledermäuse in Sachsen-Anhalt : Ectoparasites of bats in Saxony-Anhalt
(2011)
During the summer of 2010, several mist nettings for the monitoring of bat species were performed in Saxony-Anhalt. Captured individuals were examined for ectoparasitic infestation. The aim was to update the ectoparasite fauna of this state and to collect data on the distribution of individual species. In this regard, results of previous surveys are summarised. In the present study, nine out of thirteen bat species were found to be infested with a total of one flea species, one bat fly species and eight mite species. The infestation with fleas was below expectations. Six of the spinturnicid mite species occurring in Germany could be ascertained for Saxony-Anhalt. These are Spinturnix acuminatus (Koch, 1836), S. andegavinus (Kolenati, 1857), S. helvetiae Deunff, Keller & Aellen, 1986, S. mystacinus (Kolenati, 1857), S. plecotinus (Koch, 1839) and S. punctata (Sundevall, 1833). Details about the infestation with parasites (abundances) of the respective bat species are presented. Further information on the biology of spinturnicid mites is given, and infestation characteristics are compared with those of other surveys. Keywords: ectoparasites, bats, Chiroptera, gamasine mite, Acari, Spinturnix, Ischnopsyllidae, Nycteribiidae, Saxony-Anhalt, Germany
This article examines Kleist's dramatic texts above all with regard to the "scandal" of the body and attempts to find an explanation for it. The explanation lies, first, in the body's falling out of the paradigm of the aesthetic body as a beautiful body. Second, it lies in the fact that Kleist, with his literary means, undertook a drastic performance of the body for the stage play without having an established theatrical practice for it. The use of the body as a sign in stage performance was thus strategically performed by the author Kleist into a statement that, alongside the statements of the verbal text, constructs a further body-language texture.
Zentralamerika
(2011)
Zentralamerika
(2011)
Yeast hexokinase isoenzyme ScHxk2 : stability of a two-domain protein with discontinuous domains
(2011)
The hexokinase isoenzyme 2 of Saccharomyces cerevisiae (ScHxk2) represents an archetype of a two-domain protein with the active site located in a cleft between the two domains. Binding of the substrate glucose results in a rigid body movement of the two domains leading to a cleft closure of the active site. Both domains of this enzyme are composed of discontinuous peptide sequences. This structural feature is reflected in the stability and folding of the ScHxk2 protein. Structural transitions induced by urea treatment resulted in the population of a thermodynamically stable folding intermediate, which, however, does not correspond to a molecule with one domain folded and the other unfolded. As demonstrated by different spectroscopic techniques, both domains are structurally affected by the partial denaturation. The intermediate possesses only 40% of the native secondary structural content and a substantial increase in the Stokes radius as judged by circular dichroism and dynamic light scattering analyses. One-dimensional ¹H NMR data prove that all tryptophan residues are in a non-native environment in the intermediate, indicating substantial changes in the tertiary structure. Still, the intermediate possesses quite a high stability for a transition intermediate of about ΔG = −22 kJ mol⁻¹.
The TorD family of specific chaperones is divided into four subfamilies dedicated to molybdoenzyme biogenesis and a fifth one, exemplified by YcdY of Escherichia coli, for which no defined partner has been identified so far. We propose that YcdY is the chaperone of YcdX, a zinc protein involved in the swarming motility process of E. coli, since YcdY interacts with YcdX and increases its activity in vitro.
The enzyme xanthine dehydrogenase (XDH) from the purple photosynthetic bacterium Rhodobacter capsulatus catalyzes the oxidation of hypoxanthine to xanthine and of xanthine to uric acid as part of purine metabolism. The native electron acceptor is NAD⁺, but herein we show that uric acid in its 2-electron oxidized form is able to act as an artificial electron acceptor from XDH in an electrochemically driven catalytic system. Hypoxanthine oxidation is also observed, with the novel production of uric acid in a series of two consecutive 2-electron oxidation reactions via xanthine. XDH exhibits native activity in terms of its pH optimum and inhibition by allopurinol.
The exponential growth in the number of web sites and Internet users has made the WWW the most important global information resource. From information publishing and electronic commerce to entertainment and social networking, the Web allows inexpensive and efficient access to the services provided by individuals and institutions. The basic units for distributing these services are the web sites scattered throughout the world. However, the extreme fragility of web services and content, the strong competition between similar services supplied by different sites, and the wide geographic distribution of web users create an urgent need for web managers to track and understand the usage interest of their web customers. This thesis, "X-tracking the Usage Interest on Web Sites", aims to meet this need. "X" carries two meanings: first, that usage interest differs between web sites; second, that usage interest is depicted from multiple aspects: internal and external, structural and conceptual, objective and subjective. "Tracking" indicates that our focus is on locating and measuring the differences and changes among usage patterns. This thesis presents methodologies for discovering usage interest on three kinds of web sites: public information portal sites, e-learning sites that provide various streaming lectures, and social sites that host public discussions on IT issues. On each kind of site, we concentrate on different issues related to mining usage interest. Educational information portal sites were the first implementation scenario for discovering usage patterns and optimizing the organization of web services. In such cases, usage patterns are modeled as frequent page sets, navigation paths, navigation structures or graphs. A necessary prerequisite, however, is to rebuild individual behaviors from the usage history. We give a systematic study of how to rebuild individual behaviors.
In addition, this thesis presents a new strategy for building content clusters based on pair browsing retrieved from usage logs. The difference between such clusters and the original web structure reveals the distance between the destinations on the usage side and the expectations on the design side. Moreover, we study the problem of tracking changes of usage patterns over their life cycles. The changes are described from an internal side, integrating conceptual and structural features, and from an external side covering physical features; and from a local side, measuring the difference between two time spans, and a global side, showing the change tendency along the life cycle. A platform, Web-Cares, is developed to discover usage interest, to measure the difference between usage interest and site expectation, and to track the changes of usage patterns. An e-learning site provides teaching materials such as slides, recorded lecture videos and exercise sheets. We focus on discovering the learning interest in streaming lectures, such as Real Media, MP4 and Flash clips. Compared to an information portal site, the usage of streaming lectures encapsulates variables such as viewing time and actions during the learning process. The learning interest is discovered by answering six questions, which cover finding the relations between pieces of lectures and the preference among different forms of lectures. We place particular emphasis on detecting changes of learning interest in the same course across different semesters. The differences in content and structure between two course versions leverage the changes in learning interest. We give an algorithm for measuring the difference in learning interest, integrated with a similarity comparison between courses. A search engine, TASK-Moniminer, is created to help teachers query the learning interest in their streaming lectures on the tele-TASK site.
A social site acts as an online community that attracts web users to discuss common topics and share information of interest. Compared to public information portal sites and e-learning sites, the rich interactions among users and web content bring a wider range of content quality and, on the other hand, provide more possibilities to express and model usage interest. We propose a framework for finding and recommending high-reputation articles in a social site. We observed that reputation falls into global and local categories, and that the quality of articles with high reputation is related to their content features. Based on these observations, our framework is implemented by first finding the articles with global or local reputation, then clustering articles based on their content relations, and finally selecting and recommending articles from each cluster based on their reputation ranks.
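The pair-browsing idea underlying the content clustering above can be sketched in a few lines. This is a minimal illustration under assumed inputs (one list of visited pages per session), not the thesis's actual Web-Cares implementation:

```python
from collections import Counter
from itertools import combinations

def frequent_page_pairs(sessions, min_support=2):
    """Count how often two pages are visited within the same session.

    sessions: iterable of page-ID lists, one list per user session.
    Returns the pairs whose co-occurrence count reaches min_support.
    """
    counts = Counter()
    for pages in sessions:
        # Deduplicate within a session and fix an ordering so that
        # ("a", "b") and ("b", "a") are counted as the same pair.
        for pair in combinations(sorted(set(pages)), 2):
            counts[pair] += 1
    return {pair: c for pair, c in counts.items() if c >= min_support}

sessions = [["a", "b", "c"], ["a", "b"], ["b", "c"], ["a", "b", "d"]]
pairs = frequent_page_pairs(sessions, min_support=2)
```

Frequent pairs of this kind can then feed any standard clustering routine; the gap between such usage-driven clusters and the site's designed link structure is exactly the distance the thesis measures.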
We have used x-ray waveguides as highly confining optical elements for nanoscale imaging of unstained biological cells using the simple geometry of in-line holography. The well-known twin-image problem is effectively circumvented by a simple and fast iterative reconstruction. The algorithm which combines elements of the classical Gerchberg-Saxton scheme and the hybrid-input-output algorithm is optimized for phase-contrast samples, well-justified for imaging of cells at multi-keV photon energies. The experimental scheme allows for a quantitative phase reconstruction from a single holographic image without detailed knowledge of the complex illumination function incident on the sample, as demonstrated for freeze-dried cells of the eukaryotic amoeba Dictyostelium discoideum. The accessible resolution range is explored by simulations, indicating that resolutions on the order of 20 nm are within reach applying illumination times on the order of minutes at present synchrotron sources.
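The alternating-projection principle behind the twin-image reconstruction described above can be sketched as follows. This is a generic Gerchberg-Saxton-style illustration, with a plain FFT standing in for the actual near-field propagator and a simple support mask standing in for the phase-object constraint; it is not the authors' optimized algorithm:

```python
import numpy as np

def reconstruct(measured_amplitude, support, n_iter=100, seed=0):
    """Alternate between the detector-plane data constraint and a
    sample-plane support constraint (hypothetical simplification)."""
    rng = np.random.default_rng(seed)
    # Start from the measured magnitudes with a random phase guess.
    field = measured_amplitude * np.exp(
        1j * rng.uniform(0, 2 * np.pi, measured_amplitude.shape))
    for _ in range(n_iter):
        sample = np.fft.ifft2(field)                # "back-propagate" to the sample plane
        sample = np.where(support, sample, 0.0)     # enforce the object support
        field = np.fft.fft2(sample)                 # "propagate" to the detector plane
        field = measured_amplitude * np.exp(1j * np.angle(field))  # keep measured magnitudes
    return np.fft.ifft2(field)
```

In the paper's setting, the FFT would be replaced by the Fresnel near-field propagator for the waveguide geometry, and the object constraint by the pure-phase assumption justified at multi-keV photon energies.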
X-ray observations of the double-binary OB-star system QZ Car (HD 93206) obtained with the Chandra X-ray Observatory over a period of roughly 2 years are presented. The respective orbits of systems A (O9.7 I + B2 V, P_A = 21 days) and B (O8 III + O9 V, P_B = 6 days) are reasonably well sampled by the observations, allowing the origin of the X-ray emission to be examined in detail. The X-ray spectra can be well fitted by an attenuated three-temperature thermal plasma model, characterized by cool, moderate, and hot plasma components at kT ≃ 0.2, 0.7, and 2 keV, respectively, and a circumstellar absorption of ≃ 0.2 × 10²² cm⁻². Although the hot plasma component could be indicating the presence of wind-wind collision shocks in the system, the model fluxes calculated from spectral fits, with an average value of ≃ 7 × 10⁻¹³ erg s⁻¹ cm⁻², do not show a clear correlation with the orbits of the two constituent binaries. A semi-analytical model of QZ Car reveals that a stable momentum balance may not be established in either system A or B. Yet, despite this, system B is expected to produce an observed X-ray flux well in excess of the observations. If one considers the wind of the O8 III star to be disrupted by mass transfer, the model and observations are in far better agreement, which lends support to the previous suggestion of mass transfer in the O8 III + O9 V binary. We conclude that the X-ray emission from QZ Car can be reasonably well accounted for by a combination of contributions mainly from the single stars and the mutual wind-wind collision between systems A and B.
We investigate the connections between the magnetic fields and the X-ray emission of massive stars. Our study shows that the X-ray properties of known strongly magnetic stars are diverse: while some comply with the predictions of the magnetically confined wind model, others do not. We conclude that strong, hard, and variable X-ray emission may be a sufficient attribute of magnetic massive stars, but it is not a necessary one. We address the general properties of X-ray emission from "normal" massive stars, especially the long-standing mystery of the correlations between the parameters of X-ray emission and fundamental stellar properties. Recent developments in stellar structure modeling show that small-scale surface magnetic fields may be common. We suggest a "hybrid" scenario that could explain the X-ray emission from massive stars by a combination of magnetic mechanisms on the surface and shocks in the stellar wind. Both the magnetic mechanisms and the wind shocks are triggered by convective motions in sub-photospheric layers. This scenario opens the door for a natural explanation of the well-established correlation between bolometric and X-ray luminosities.
Working memory maintenance of grasp-target information in the human posterior parietal cortex
(2011)
Event-related functional magnetic resonance imaging was applied to identify cortical areas involved in maintaining target information in working memory used for an upcoming grasping action. Participants had to grasp with their thumb and index finger of the dominant right hand three-dimensional objects of different size and orientation. Reaching-to-grasp movements were performed without visual feedback either immediately after object presentation or after a variable delay of 2-12 s. The right inferior parietal cortex demonstrated sustained neural activity throughout the delay, which overlapped with activity observed during encoding of the grasp target. Immediate and delayed grasping activated similar motor-related brain areas and showed no differential activity. The results suggest that the right inferior parietal cortex plays an important functional role in working memory maintenance of grasp-related information. Moreover, our findings confirm the assumption that brain areas engaged in maintaining information are also involved in encoding the same information, and thus extend previous findings on working memory function of the posterior parietal cortex in saccadic behavior to reach-to-grasp movements.
Work-related behavior and experience patterns of entrepreneurs compared to teachers and physicians
(2011)
Purpose: This study examined the status of health-related behavior and experience patterns of entrepreneurs in comparison with teachers and physicians to identify specific health risks and resources.
Methods: Entrepreneurs (n = 632), teachers (n = 5,196), and physicians (n = 549) were surveyed in a cross-sectional design. The questionnaire Work-related Behavior and Experience Patterns (AVEM) was used for all professions and, in addition, two scales (health prevention and self-confidence) from the Checklist for Entrepreneurs in the sample of entrepreneurs.
Results: The largest proportion of the entrepreneurs (45%) presented with a healthy pattern (compared with 18.4% of teachers and 18.3% of physicians). Thirty-eight percent of entrepreneurs showed a risk pattern of overexertion and stress, followed by teachers (28.9%) and physicians (20.6%). Unambitious or burnout patterns were seen in only 9.3%/8.2% of entrepreneurs, respectively, compared with 25.3%/27.3% of teachers and 39.6%/21.5% of physicians. While the distribution of patterns in teachers and physicians differed significantly between genders, a gender difference was not found among entrepreneurs. Entrepreneurs with the risk pattern of overexertion scored significantly (P < 0.01) lower in self-confidence and health care than those with the healthy pattern.
Conclusions: The development of a successful enterprise depends, in part, on the health of the entrepreneur. The large proportion of entrepreneurs with the healthy pattern, irrespective of gender, may support the notion that self-selection effects of healthy individuals into this special career might be important. At the same time, a large proportion was at risk for overexertion and might benefit from measures to cope with professional demands and stress and to promote a healthy behavior pattern.
In line with the objective, various composting variants for woody material were first tested in order to identify an optimal technology that is also feasible for developing countries. For this purpose, wood chips of two different wood types (hardwood and softwood) were placed in planting pots and mixed with various natural nitrogen sources. These batches were regularly moistened with compost water. After four weeks, two different earthworm species (Dendrobaena veneta and Eisenia fetida) were added. From this point onward, fresh water was used for moistening. In the next step, the qualitatively best variant was tested with further natural nitrogen sources that could be made available in developing countries. For all compost variants, a variety of soil-physical state variables (e.g. density, water-holding capacity) and soil-chemical state variables (e.g. electrical conductivity, total contents of biophilic elements, soil reaction, organic matter content, cation exchange capacity) were determined in the laboratory. The qualitatively best mixture was then, in a further series of experiments, mixed in various proportions with Tertiary overburden sand from lignite mining. The grass mixture RSM 7.2.1 was sown into these experimental mixtures, which were regularly watered, and the growth height was measured. After 42 days, the grass was harvested, and the biometric parameters, the nutrient contents (plant-available fractions), the soil reaction, the effective and potential cation exchange capacity, and the buffering capacities of the mixed substrates were determined. The next experimental variants were carried out as field trials in Lower Lusatia. For their realisation, Arkadolith® was admixed as a further additive. The plot areas were established on both Tertiary and Quaternary overburden sands.
RSM 7.2.1 was sown into one sub-variant, and an autochthonous grass mixture into the other. These experiments were terminated after 6 months; all parameters were determined in the same way as in the greenhouse trials. On the basis of all the experimental series, the best compost qualities and their optimal production variants could be identified. A further task was to investigate how, compared with the combustion of wood mass, the CO2 emission into the atmosphere can be reduced by wood composting. For this purpose, the CO2 release was measured during the various composting variants. For comparison, the same mass of wood was burned in each case. The results showed that, compared with the thermal utilisation of wood, the CO2 emission can be reduced by up to 50%. In addition, energy-rich organic matter can be supplied to the soil, enabling the development of soil organisms. A further experiment aimed to determine the stability of the wood composts. Furthermore, it was to be investigated whether an increase in stability can be achieved by adding pyrogenic carbon. These investigations were carried out by means of thermogravimetry. All important composting variants were measured both with various amounts of added pyrogenic carbon and without it. The topsoil of a fen gley, which naturally has a relatively high content of organic matter, served as a reference substance. The results showed that in the low-temperature range the water binding in the natural soil is stronger.
In the fraction of oxidisable organic matter, measured in the medium temperature range, the natural soil substance is likewise more stable, which suggests a more intensive bond between the organic and inorganic components, that is, more stable organo-mineral complexes. In the range of higher temperatures (T > 550 °C), no appreciable organic components were detectable in the natural soil. By contrast, the compost variants showed a high proportion of stable fractions, above all aromatic compounds. These findings appear significant above all for the practical application of the wood composts with regard to their long-term effect. The addition of pyrogenic carbon showed no additional stabilising effect.
Humidity is an important determinant of mycotoxin production (DON, ZEA) by Fusarium species in grain ears. From a landscape perspective, humidity is not evenly distributed across fields. Rather, the topographically controlled redistribution of water within a single field leads to spatially heterogeneous soil water content and air humidity. We therefore hypothesized that the spatial distribution of mycotoxins is related to these topographically controlled factors. To test this hypothesis, we studied the mycotoxin concentrations at contrasting topographic relief positions, i.e. hilltops and depressions characterized by soils of different soil moisture regimes, on ten winter wheat fields in 2006 and 2007. Maize was the preceding crop, and minimum tillage was practiced in the fields. The different topographic positions were associated with moderate differences in DON and ZEA concentrations in 2006, but with significant differences in 2007, with six times higher median ZEA and two times higher median DON detected at depression sites compared to the hilltops. The depression sites correspond to a higher topographic wetness index as well as redoximorphic properties in soil profiles, which empirically supports our hypothesis, at least for years with wetter conditions during the time windows sensitive for Fusarium infection.
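The topographic wetness index used above is commonly defined as TWI = ln(a / tan β), where a is the specific upslope contributing area and β the local slope. A minimal sketch (the numbers are illustrative, not the study's field data):

```python
import math

def topographic_wetness_index(upslope_area, slope_rad):
    """TWI = ln(a / tan(beta)); higher values indicate wetter positions."""
    return math.log(upslope_area / math.tan(slope_rad))

# A flat depression accumulating much water scores higher than a steep hilltop
# with a small contributing area (hypothetical values).
depression = topographic_wetness_index(upslope_area=500.0, slope_rad=math.radians(1))
hilltop = topographic_wetness_index(upslope_area=20.0, slope_rad=math.radians(10))
```

In practice both inputs are derived cell by cell from a digital elevation model; the sketch only shows why depressions map onto higher index values.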
Wissenschaftliches Schreiben
(2011)
Writing is hard work, and it requires both experience and orientation. This text, now published in a fourth, expanded edition, offers guidance on academic writing in its various forms: from the excerpt and the literature review, through the written examination and the essay, to the final thesis. You will also find suggestions for oral examinations and the thesis defence. The text is a concentrated aid both for first-year students and for those approaching the end of their studies.
Psychotherapeutic interventions require empirical as well as scientific assessment. Specifically, proven efficacy of psychotherapy for children and adolescents is essential. Thus, studies examining treatment efficacy and meta-analyses are necessary to compare effect sizes of individual therapeutic interventions between treatment groups and waiting-list control groups. Assessment of 138 primary studies from 1993-2009 documented the efficacy of psychotherapy for children and adolescents. Furthermore, behavioural therapy outperformed non-behavioural interventions, as 90% of behavioural interventions showed larger effect sizes compared to non-behavioural psychotherapy. Analysis of moderator variables demonstrated improved treatment efficacy for individual therapy, inclusion of the family, treatment of internalised disorders, and clinical samples. Stability of psychotherapeutic treatment effects over time was demonstrated.
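Effect sizes of the kind compared above are typically standardized mean differences between a treatment group and a waiting-list control group. A minimal sketch of Cohen's d with a pooled standard deviation; the numbers are illustrative, not data from the meta-analysis:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference between treatment and control groups."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# Hypothetical symptom-improvement scores: treatment vs. waiting-list control.
d = cohens_d(mean_t=10.0, mean_c=8.0, sd_t=2.0, sd_c=2.0, n_t=30, n_c=30)  # d = 1.0
```

A meta-analysis then pools such per-study values, usually weighting each d by the inverse of its sampling variance.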
The open source computational fluid dynamics (CFD) wind model (CFD-WEM) for wind erosion research in the Xilingele grassland in Inner Mongolia (autonomous region, China) is compared with two open source CFD models, Gerris and OpenFOAM. The evaluation of these models was based on software technology, implemented methods, handling, accuracy and calculation speed. All models were applied to the same wind tunnel data set. Results show that the simplest model, CFD-WEM, has the highest calculation speed with acceptable accuracy, while the most powerful, OpenFOAM, produces the simulation with the highest accuracy and the lowest calculation speed. Gerris lies between CFD-WEM and OpenFOAM: it calculates faster than OpenFOAM, and it is capable of solving a range of CFD problems. CFD-WEM is the optimal model to be developed further for wind erosion research in the Inner Mongolia grassland, considering its efficiency and the uncertainties of the other input data. However, for other applications using CFD technology, Gerris and OpenFOAM can be good choices. This paper shows the powerful capability of open source CFD software in wind erosion studies and advocates greater involvement of open source technology in wind erosion and related ecological research.
Wiedergelesen
(2011)
Since 2008, the small but fine rubric "Wiedergelesen" (re-read) has complemented the issues of the foreign policy journal WeltTrends. To fill this rubric, famous books of political science and treasured rarities are fished out of bookshelves, dusted off and read again in the light of current interests. The aim is always both to understand these writings in the context of their own time and to make them useful for ours, for example Huntington's The Clash of Civilizations, Lipset's Political Man, Paine's Rights of Man or Linz's An Authoritarian Regime: The Case of Spain.
Wie geht es weiter
(2011)
Spatial numerical associations (SNAs) are prevalent, yet their origin is poorly understood. We first consider the possible prime role of reading habits in shaping SNAs and list three observations that argue against a prominent influence of reading: (1) directional reading habits for numbers may conflict with those for non-numerical symbols; (2) short-term experimental manipulations can overrule the impact of decades of reading experience; (3) SNAs predate the acquisition of reading. As a promising alternative, we discuss behavioral, neuroscientific, and neuropsychological evidence in support of finger counting as the most likely initial determinant of SNAs. Implications of this "manumerical cognition" stance for the distinction between grounded, embodied, and situated cognition are discussed.
Trait-based studies have become extremely common in plant ecology. Trait-based approaches often rely on the tacit assumption that intraspecific trait variability (ITV) is negligible compared to interspecific variability, so that species can be characterized by mean trait values. Yet, numerous recent studies have challenged this assumption by showing that ITV significantly affects various ecological processes. Accounting for ITV may thus strengthen trait-based approaches, but measuring trait values on a large number of individuals per species and site is not feasible. Therefore, it is important and timely to synthesize existing knowledge on ITV in order to (1) decide critically when ITV should be considered, and (2) establish methods for incorporating this variability. Here we propose a practical set of rules to identify circumstances under which ITV should be accounted for. We formulate a spatial trait variance partitioning hypothesis to highlight the spatial scales at which ITV cannot be ignored in ecological studies. We then refine a set of four consecutive questions on the research question, the spatial scale, the sampling design, and the type of studied traits, to determine case-by-case if a given study should quantify ITV and test its effects. We review methods for quantifying ITV and develop a step-by-step guideline to design and interpret simulation studies that test for the importance of ITV. Even in the absence of quantitative knowledge on ITV, its effects can be assessed by varying trait values within species within realistic bounds around the known mean values. We finish with a discussion of future requirements to further incorporate ITV within trait-based approaches. This paper thus delineates a general framework to account for ITV and suggests a direction towards a more quantitative trait-based ecology.
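The variance-partitioning idea behind this framework can be illustrated with a small simulation; this is a hedged sketch (species names, trait bounds and sample sizes are invented), showing how total trait variance splits into a between-species and a within-species (ITV) component:

```python
import random
import statistics

def variance_partition(samples):
    """Split total trait variance into between-species and
    within-species (ITV) components for a flat list of
    (species, trait_value) observations."""
    by_species = {}
    for sp, val in samples:
        by_species.setdefault(sp, []).append(val)
    grand_mean = statistics.fmean(v for _, v in samples)
    n = len(samples)
    # Variance of species means around the grand mean (weighted by n)
    between = sum(len(vals) * (statistics.fmean(vals) - grand_mean) ** 2
                  for vals in by_species.values()) / n
    # Residual variance of individuals around their species mean (ITV)
    within = sum((v - statistics.fmean(by_species[sp])) ** 2
                 for sp, v in samples) / n
    return between, within

random.seed(1)
species_means = {"A": 10.0, "B": 14.0, "C": 20.0}
# Draw individuals whose trait values vary within +/- 20% of the species mean
obs = [(sp, random.uniform(0.8 * m, 1.2 * m))
       for sp, m in species_means.items() for _ in range(50)]
between, within = variance_partition(obs)
```

Varying the width of the within-species bounds, as suggested in the abstract, shows at what point ITV starts to blur the between-species signal.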
What is visualization?
(2011)
Over the last 20 years, information visualization has become a common tool in science and a growing presence in the arts and culture at large. However, the use of visualization in cultural research is still in its infancy. Based on two years of work analyzing video games, cinema, TV, animation, manga and other media at the Software Studies Initiative at the University of California, San Diego, a number of visualization techniques and methods particularly useful for cultural and media research are presented.
Which repair strategy does the language system deploy when it gets garden-pathed, and what can regressive eye movements in reading tell us about reanalysis strategies? Several influential eye-tracking studies on syntactic reanalysis (Frazier & Rayner, 1982; Meseguer, Carreiras, & Clifton, 2002; Mitchell, Shen, Green, & Hodgson, 2008) have addressed this question by examining scanpaths, i.e., sequential patterns of eye fixations. However, in the absence of a suitable method for analyzing scanpaths, these studies relied on simplified dependent measures that are arguably ambiguous and hard to interpret. We address the theoretical question of repair strategy by developing a new method that quantifies scanpath similarity. Our method reveals several distinct fixation strategies associated with reanalysis that went undetected in a previously published data set (Meseguer et al., 2002). One prevalent pattern suggests re-parsing of the sentence, a strategy that has been discussed in the literature (Frazier & Rayner, 1982); however, readers differed tremendously in how they orchestrated the various fixation strategies. Our results suggest that the human parsing system non-deterministically adopts different strategies when confronted with the disambiguating material in garden-path sentences.
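One simple way to quantify scanpath similarity, in the spirit of the method described above (though not the authors' exact measure), is to code fixations as sentence-region labels and compare the resulting sequences with a normalized edit distance:

```python
def levenshtein(a, b):
    """Edit distance between two sequences: insertions, deletions
    and substitutions each cost 1."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

def scanpath_similarity(p, q):
    """Normalized similarity in [0, 1]; 1 means identical fixation sequences."""
    if not p and not q:
        return 1.0
    return 1.0 - levenshtein(p, q) / max(len(p), len(q))

# Hypothetical fixation sequences coded as sentence-region labels
reread  = ["R1", "R2", "R3", "R1", "R2", "R3"]  # full re-parse of the sentence
regress = ["R1", "R2", "R3", "R2", "R3"]        # selective regression
```

Clustering readers by pairwise similarity of such sequences is one way to expose the distinct fixation strategies the abstract refers to.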
Urban forests fulfil various functions, among them supporting restoration and meeting the aesthetic needs of urban residents. This article examines attitudes towards differently managed forests on the one hand and their influence on psychological well-being on the other. Results of empirical studies in both fields show some inconsistency, suggesting that people have a more positive attitude towards wild forest areas, while the effect on well-being is more positive after a walk in tended forest areas. A discussion follows on the link between the perception and the effects of urban forests. An outlook on open questions reveals the need for longitudinal research. The article concludes with management implications.
Weimarer Klassik
(2011)
The World Wide Web is becoming increasingly important as an application platform. However, the development of Web applications is often more complex than for the desktop. Web-based development environments like Lively Webwerkstatt can mitigate this problem by making the development process more interactive and direct. By moving the development environment into the Web, applications can be developed collaboratively in a Wiki-like manner. This report documents the results of the project seminar on Web-based Development Environments 2010. In this seminar, participants extended the Web-based development environment Lively Webwerkstatt. They worked in small teams on current research topics in Web development and tool support for programmers and implemented their results in the Webwerkstatt environment.
To understand the evolution and morphology of planetary nebulae, a detailed knowledge of their central stars is required. Central stars that exhibit emission lines in their spectra, indicating stellar mass loss, allow us to study the evolution of planetary nebulae in action. Emission-line central stars constitute about 10 % of all central stars. Half of them are practically hydrogen-free Wolf-Rayet type central stars of the carbon sequence, [WC], whose spectra show strong emission lines of carbon and oxygen. In this contribution we address the weak emission-line central stars (wels). These stars are poorly analyzed and their hydrogen content is mostly unknown. We obtained optical spectra, including the important hydrogen Balmer lines, for four weak emission-line central stars. We present the results of our analysis, provide spectral classifications and discuss possible explanations for the formation and evolution of these stars.
Wavelet modelling of the gravity field by domain decomposition methods: an example over Japan
(2011)
With the advent of satellite gravity, large gravity data sets of unprecedented quality at low and medium resolution have become available. For local, high-resolution field modelling, they need to be combined with surface gravity data. Such models are then used for various applications, from the study of the Earth's interior to the determination of oceanic currents. Here we show how to realize such a combination in a flexible way using spherical wavelets and a domain decomposition approach. This iterative method, based on the Schwarz algorithms, allows us to split a large problem into smaller ones and avoids the calculation of the entire normal system, which may be huge if high resolution is sought over wide areas. A subdomain is defined as the harmonic space spanned by a subset of the wavelet family. Based on the localization properties of the wavelets in space and frequency, we define hierarchical subdomains of wavelets at different scales. On each scale, blocks of subdomains are defined using a tailored spatial splitting of the area. The data weighting and regularization are iteratively adjusted for the subdomains, which allows us to handle heterogeneity in the data quality or the gravity variations. Different levels of approximation of the subdomain normals are also introduced, corresponding to building local averages of the data at different resolution levels.
We first provide the theoretical background on domain decomposition methods. Then, we validate the method with synthetic data, considering two kinds of noise: white noise and coloured noise. We then apply the method to data over Japan, where we combine a satellite-based geopotential model, EIGEN-GL04S, and a local gravity model from a combination of land and marine gravity data and an altimetry-derived marine gravity model. A hybrid spherical harmonics/wavelet model of the geoid is obtained at about 15 km resolution and a corrector grid for the surface model is derived.
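The Schwarz-type iteration over subdomains can be sketched in miniature; the example below is a hedged toy version (a small random dense system, with two overlapping coefficient blocks standing in for wavelet subdomains), not the authors' implementation:

```python
import random

def solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    aug = [list(M[i]) + [b[i]] for i in range(n)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(aug[r][k]))
        aug[k], aug[p] = aug[p], aug[k]
        for r in range(k + 1, n):
            f = aug[r][k] / aug[k][k]
            for c in range(k, n + 1):
                aug[r][c] -= f * aug[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (aug[k][n]
                - sum(aug[k][c] * x[c] for c in range(k + 1, n))) / aug[k][k]
    return x

def schwarz_lsq(A, y, blocks, sweeps=100):
    """Solve min ||A x - y|| by cycling over overlapping blocks of unknowns:
    each step solves the small normal system of one block while the
    remaining coefficients stay fixed (multiplicative Schwarz)."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(sweeps):
        for idx in blocks:
            # Residual with the contribution of block idx removed
            r = [y[i] - sum(A[i][j] * x[j] for j in range(n) if j not in idx)
                 for i in range(m)]
            # Normal equations restricted to this block: (Ai^T Ai) xb = Ai^T r
            N = [[sum(A[i][j] * A[i][k] for i in range(m)) for k in idx]
                 for j in idx]
            rhs = [sum(A[i][j] * r[i] for i in range(m)) for j in idx]
            for j, v in zip(idx, solve(N, rhs)):
                x[j] = v
    return x

random.seed(0)
m, n = 30, 6
A = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
x_true = [random.gauss(0, 1) for _ in range(n)]
y = [sum(A[i][j] * x_true[j] for j in range(n)) for i in range(m)]
# Two overlapping subdomains of coefficients (think coarse / fine wavelet scales)
blocks = [(0, 1, 2, 3), (2, 3, 4, 5)]
x = schwarz_lsq(A, y, blocks)
```

Because each step only factors a small block of the normal equations, the full normal matrix never has to be assembled, which is the point of the domain decomposition.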
The multifaceted history of the monastery of Dobrilugk is presented on the basis of a selection of monastic charters now held in the Brandenburgisches Landeshauptarchiv. The reasons for the selection are varied. On the one hand, exhibits were chosen for the 2011 charter exhibition of the same name, shown in the former refectory of the Dobrilugk monastery, that particularly appeal to the viewer through their outward appearance, for instance through an impressive seal or the issuer's autograph signature. On the other hand, the selection, taking the issuers of the charters into account, shows in how many ways the monastery of Dobrilugk was embedded in the events of the medieval world.
Walter Boehlich: Kritiker
(2011)
Vorwort
(2011)
This Letter reports on new methods and a consistent model for voltage-tunable optical transmission gratings. Elastomeric gratings were molded from holographically written surface relief gratings in an azobenzene sol-gel material and placed on top of a transparent electroactive elastomeric substrate. Two different electroactive substrate elastomers were employed, with a large range of prestretches. A novel finite-deformation theory was found to match the device response excellently, without fitting parameters. The results clearly show that the grating underwent pure-shear deformation and, more surprisingly, that the mechanical properties of the electroactive substrate did not affect device actuation.
Vitamin A metabolism is changed in donors after living-kidney transplantation: an observational study
(2011)
Background: The kidneys are essential for the metabolism of vitamin A (retinol) and its transport proteins retinol-binding protein 4 (RBP4) and transthyretin. Little is known about changes in serum concentrations after living donor kidney transplantation (LDKT) as a consequence of unilateral nephrectomy, although an association of these parameters with the risk of cardiovascular disease and insulin resistance has been suggested. We therefore analyzed the concentrations of retinol, RBP4, apoRBP4 and transthyretin in the serum of 20 living kidney donors and the respective recipients at baseline as well as 6 weeks and 6 months after LDKT.
Results: As a consequence of LDKT, the kidney function of the recipients improved, while that of the donors was moderately reduced, within 6 weeks after LDKT. With regard to vitamin A metabolism, the recipients showed higher levels of retinol, RBP4, transthyretin and apoRBP4 before LDKT than the donors. After LDKT, the levels of all four parameters decreased in the serum of the recipients, while the retinol, RBP4 and apoRBP4 serum levels of the donors increased and remained elevated during the follow-up period of 6 months.
Conclusion: LDKT is generally regarded as beneficial for allograft recipients and not particularly detrimental for the donors. However, this study demonstrates that a moderate reduction of kidney function by unilateral nephrectomy resulted in an imbalance of components of vitamin A metabolism, with a significant increase in retinol, RBP4 and apoRBP4 concentrations in the serum of the donors.
A business process is a set of steps designed to be executed in a certain order to achieve business value. Such processes are often driven by and documented using process models. Nowadays, process models are also applied to drive process execution; thus, correctness of business process models is a must. Much work has been devoted to checking general, domain-independent correctness criteria, such as soundness. However, business processes must also adhere to and show compliance with various regulations and constraints, the so-called compliance requirements. These are domain-dependent requirements.
In many situations, verifying compliance on the model level is of great value, since violations can be resolved at an early stage, prior to execution. However, this calls for formal verification techniques, e.g., model checking, that are too complex for business experts to apply. In this paper, we utilize a visual language, BPMN-Q, to express compliance requirements visually, in a way similar to that used by business experts to build process models. Still, using a pattern-based approach, each BPMN-Q graph has a formal temporal-logic expression in computation tree logic (CTL). Moreover, the user is able to express constraints, i.e., compliance rules, regarding both control flow and data flow aspects. In order to provide valuable feedback to the user in case of violations, we rely on temporal logic querying approaches as well as BPMN-Q to visually highlight paths in a process model whose execution causes violations.
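The pattern-to-CTL mapping can be illustrated as follows; the pattern names and formula templates below are generic compliance patterns chosen for illustration, not the exact BPMN-Q translation rules of the paper:

```python
def leads_to(a, b):
    """Response pattern: whenever activity a executes,
    activity b must eventually follow."""
    return f"AG({a} -> AF({b}))"

def precedes(a, b):
    """Precedence pattern: activity b may only execute after
    activity a has executed (weak-until formulation)."""
    return f"A[!{b} W {a}]"

# Hypothetical compliance rule for a contracting process
rule = leads_to("ObtainApproval", "ArchiveContract")
```

A model checker evaluating such a formula over the process model's state space either confirms the rule or returns a counterexample path, which is what the visual feedback highlights.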
Miniature eye movements jitter the retinal image unceasingly, raising the question of how perceptual continuity is achieved during visual fixation. Recent work discovered suppression of visual bursts in the superior colliculus around the time of microsaccades, tiny jerks of the eyes that support visual perception while gaze is fixed. This finding suggests that corollary discharge, supporting visual stability when rapid eye movements drastically shift the retinal image, may also exist for the smallest saccades.
Business processes are commonly modeled using a graphical modeling language. The most widespread notation for this purpose is business process diagrams in the Business Process Modeling Notation (BPMN). In this article, we use the visual query language BPMN-Q for expressing patterns that are related to possible problems in such business process diagrams. We discuss two classes of problems that can be found frequently in real-world models: sequence flow errors and model fragments that can make the model difficult to understand.
By using a query processor, a business process modeler is able to identify possible errors in business process diagrams. Moreover, the erroneous parts of the business process diagram can be highlighted when an instance of an error pattern is found. This way, the modeler gets easy-to-understand feedback in the visual modeling language he or she is familiar with. This is an advantage over current validation methods, which usually lack this kind of intuitive feedback.
Virtualization and cloud computing are currently among the most important buzzwords for operators of IT infrastructures. There is a multitude of different technologies, products and business models for completely different application scenarios. This study first gives a detailed overview of current developments in the concepts and technologies of virtualization, from classic server virtualization through infrastructures for virtual desktops to application virtualization, and attempts a classification of the virtualization variants. The discussion of the cloud computing concept introduces its basic principles as well as various architectural variants and use cases. A detailed examination of the advantages of cloud computing, as well as of the possible concerns that must be considered when using cloud resources in an enterprise, shows that cloud computing offers great opportunities but is not suitable for every application or for every legal and economic setting. The subsequent market overview of virtualization technology shows that the major vendors (Citrix, Microsoft and VMware) each offer products for almost all virtualization variants, and highlights decisive differences and the respective strengths of the vendors. For example, the Citrix solution for virtual desktop infrastructures is very mature, whereas Microsoft can only fall back on standard technology here. VMware, as the market leader, has achieved the widest adoption in data centers and is the only vendor to offer true fault tolerance. Microsoft, in turn, scores with the seamless integration of its virtualization products into existing Windows infrastructures. In the area of cloud computing systems, several open-source software projects exist that are indeed suitable for the productive operation of so-called private clouds.
We report on very high energy (>100 GeV) gamma-ray observations of Swift J164449.3+573451, an unusual transient object first detected by the Swift Observatory and later detected by multiple radio, optical, and X-ray observatories. A total exposure of 28 hr was obtained on Swift J164449.3+573451 with the Very Energetic Radiation Imaging Telescope Array System (VERITAS) during 2011 March 28-April 15. We do not detect the source and place a differential upper limit on the emission at 500 GeV during these observations of 1.4 x 10^-12 erg cm^-2 s^-1 (99% confidence level). We also present time-resolved upper limits and use a flux limit averaged over the X-ray flaring period to constrain various emission scenarios that can accommodate both the radio-through-X-ray emission detected from the source and the lack of detection by VERITAS.
We present the results of observations of the TeV binary LS I +61° 303 with the VERITAS telescope array between 2008 and 2010, at energies above 300 GeV. In the past, both ground-based gamma-ray telescopes VERITAS and MAGIC have reported detections of TeV emission near the apastron phases of the binary orbit. The observations presented here show no strong evidence for TeV emission during these orbital phases; however, during observations taken in late 2010, significant emission was detected from the source close to the phase of superior conjunction (much closer to periastron passage) at a 5.6 standard deviation (5.6 sigma) post-trials significance. In total, between 2008 October and 2010 December a total exposure of 64.5 hr was accumulated with VERITAS on LS I +61° 303, resulting in an excess at the 3.3 sigma significance level for constant emission over the entire integrated data set. The flux upper limits derived for emission during the previously reliably active TeV phases (i.e., close to apastron) are less than 5% of the Crab Nebula flux in the same energy range. This result stands in apparent contrast to previous observations by both MAGIC and VERITAS, which detected the source during these phases at 10% of the Crab Nebula flux. During the two-year span of observations, a large amount of X-ray data were also accrued on LS I +61° 303 by the Swift X-ray Telescope and the Rossi X-ray Timing Explorer Proportional Counter Array. We find no evidence for a correlation between emission in the X-ray and TeV regimes during 20 directly overlapping observations. We also comment on data obtained contemporaneously by the Fermi Large Area Telescope.
We present the results of 16 Swift-triggered gamma-ray burst (GRB) follow-up observations taken with the Very Energetic Radiation Imaging Telescope Array System (VERITAS) telescope array from 2007 January to 2009 June. The median energy threshold and response time of these observations were 260 GeV and 320 s, respectively. Observations had an average duration of 90 minutes. Each burst is analyzed independently in two modes: over the whole duration of the observations and again over a shorter timescale determined by the maximum VERITAS sensitivity to a burst with a t^-1.5 time profile. This temporal model is characteristic of GRB afterglows with high-energy, long-lived emission that have been detected by the Large Area Telescope on board the Fermi satellite. No significant very high energy (VHE) gamma-ray emission was detected and upper limits above the VERITAS threshold energy are calculated. The VERITAS upper limits are corrected for gamma-ray extinction by the extragalactic background light and interpreted in the context of the keV emission detected by Swift. For some bursts the VHE emission must have less power than the keV emission, placing constraints on inverse Compton models of VHE emission.
Previous research has shown that high phonotactic frequencies facilitate the production of regularly inflected verbs in English-learning children with specific language impairment (SLI) but not with typical development (TD). We asked whether this finding can be replicated for German, a language with a much more complex inflectional verb paradigm than English. Using an elicitation task, the production of inflected nonce verb forms (3rd person singular with -t suffix) with either high- or low-frequency subsyllables was tested in sixteen German-learning children with SLI (ages 4;1-5;1), sixteen TD children matched for chronological age (CA) and fourteen TD children matched for verbal age (VA) (ages 3;0-3;11). The findings revealed that children with SLI, but not CA- or VA-matched children, showed differential performance between the two types of verbs, producing more inflectional errors when the verb forms resulted in low-frequency subsyllables than when they resulted in high-frequency subsyllables, replicating the results from English-learning children.
The morphological features in the deviations of the total electron content (TEC) of the ionosphere from the undisturbed background state are analyzed as possible precursors of the earthquake of January 12, 2010 (21:53 UT (16:53 LT), 18.46° N, 72.5° W, M 7.0) in Haiti. To identify these features, global and regional differential TEC maps were computed from the global 2-h TEC maps provided by NASA in the IONEX format. For the considered earthquake, long-lived disturbances, presumably of seismic origin, were localized in the near-epicenter area and were accompanied by similar effects in the magnetoconjugate region. Both decreases and increases in the local TEC were observed over the period from 22 UT on January 10 to 08 UT on January 12, 2010. The horizontal dimensions of the anomalies were ~40° in longitude and ~20° in latitude, with the magnitude of the TEC disturbances reaching ~40% relative to the background near the epicenter and more than 50% in the magnetoconjugate area. No significant geomagnetic disturbances were observed within January 1-12, 2010, i.e., the detected TEC anomalies were manifestations of the interplay between processes in the lithosphere-atmosphere-ionosphere system.
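The differential-map construction can be sketched for a single grid cell; the background definition used here (mean of the preceding seven days at the same UT epoch) and the numbers are illustrative assumptions, not the exact procedure of the study:

```python
def differential_tec(series, window=7):
    """Relative TEC deviation (%) from a running background, defined as
    the mean of the preceding `window` days at the same UT epoch.
    `series` maps day index -> TEC value at one grid cell and UT."""
    out = {}
    for day in sorted(series):
        prev = [series[d] for d in range(day - window, day) if d in series]
        if len(prev) == window:          # require a full background window
            bg = sum(prev) / window
            out[day] = 100.0 * (series[day] - bg) / bg
    return out

# Hypothetical TEC values (TECU) at one grid cell, 22 UT, days 1-12
tec = {d: 20.0 for d in range(1, 13)}
tec[11] = 28.0   # a hypothetical ~40% enhancement on day 11
dev = differential_tec(tec)
```

Mapping such deviations over a latitude-longitude grid yields the differential TEC maps in which localized, long-lived anomalies can be searched for.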
In this thesis, we discuss the formulation of variational problems on supermanifolds. Supermanifolds incorporate bosonic as well as fermionic degrees of freedom. Fermionic fields take values in the odd part of an appropriate Grassmann algebra and thus exhibit anticommutative behaviour. However, a systematic treatment of these Grassmann parameters requires a description of spaces as functors, e.g. from the category of Grassmann algebras into the category of sets (or topological spaces, manifolds). After an introduction to the general ideas of this approach, we use it to describe the resulting supermanifolds of fields/maps. We show that each map is uniquely characterized by a family of differential operators of appropriate order. Moreover, we demonstrate that each of these maps is uniquely characterized by its component fields, i.e. by the coefficients in a Taylor expansion with respect to the odd coordinates. In general, the component fields are only locally defined. We present a way to circumvent this limitation: by enlarging the supermanifold in question, we show that it is possible to work with globally defined components. We eventually use this formalism to study variational problems. More precisely, we study a super version of the geodesic and a generalization of harmonic maps to supermanifolds. Equations of motion are derived from an energy functional, and we show how to decompose them into components. Finally, in special cases, we can prove the existence of critical points by reducing the problem to equations from ordinary geometric analysis. After solving these component equations, it is possible to show that their solutions give rise to critical points in the functor spaces of fields.
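The Taylor expansion in the odd coordinates mentioned above can be sketched for the simple case of two odd coordinates; the symbols below are chosen for illustration only:

```latex
\Phi(x,\theta) \;=\; \varphi(x) \;+\; \theta^{\alpha}\,\psi_{\alpha}(x)
\;+\; \theta^{1}\theta^{2}\,F(x), \qquad \alpha = 1,2,
```

with component fields $\varphi$, $\psi_{\alpha}$ and $F$. The expansion terminates because the odd coordinates anticommute, $\theta^{\alpha}\theta^{\beta} = -\theta^{\beta}\theta^{\alpha}$, so in particular $(\theta^{\alpha})^{2} = 0$.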
Standing stocks are typically easier to measure than process rates such as production. Hence, stocks are often used as indicators of ecosystem functions, although the latter are generally more strongly related to rates than to stocks. The regulation of stocks and rates, and thus their variability over time, may differ, as stocks constitute the net result of production and losses. Based on long-term, high-frequency measurements in a large, deep lake, we explore the variability patterns in primary and bacterial production and relate them to those of the corresponding standing stocks, i.e. chlorophyll concentration, phytoplankton biomass and bacterial biomass. We employ different methods (coefficient of variation, spline fitting and spectral analysis) which complement each other in assessing the variability present in the plankton data at different temporal scales. In phytoplankton, we found that the overall variability of primary production is dominated by fluctuations at low frequencies, such as the annual cycle, whereas in the stocks, and chlorophyll in particular, higher frequencies contribute substantially to the overall variance. This suggests that using standing stocks instead of rate measures leads to an under- or overestimation of food shortage for consumers during distinct periods of the year. The range of annual variation in bacterial production is 8 times greater than that in biomass, showing that the variability of bacterial activity (e.g. oxygen consumption, remineralisation) would be underestimated if biomass were used. The P/B ratios were variable, and although clear trends are present in both bacteria and phytoplankton, no systematic relationship between stock and rate measures was found for the two groups. Hence, standing stock and process rate measures exhibit different variability patterns, and care is needed when interpreting the mechanisms and implications of the variability encountered.
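The coefficient of variation, the simplest of the three variability measures mentioned, can be sketched as follows; the two synthetic weekly series (an annual cycle for the rate, a weaker annual cycle plus faster jitter for the stock) are invented for illustration:

```python
import math

def coefficient_of_variation(xs):
    """CV = population standard deviation / mean (scale-free variability)."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return math.sqrt(var) / mean

weeks = range(52)
# Rate: dominated by a single low-frequency (annual) cycle
rate = [10 + 8 * math.sin(2 * math.pi * t / 52) for t in weeks]
# Stock: weaker annual cycle plus higher-frequency fluctuations
stock = [10 + 2 * math.sin(2 * math.pi * t / 52)
         + 3 * math.sin(2 * math.pi * t / 4) for t in weeks]
cv_rate = coefficient_of_variation(rate)
cv_stock = coefficient_of_variation(stock)
```

Here the rate series has the larger overall CV, but nearly all of its variance sits at the annual frequency, while the stock's variance is spread across frequencies, which is the distinction a spectral analysis makes explicit.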
Among cationic polyelectrolytes with different molecular architectures, only hyperbranched poly(ethyleneimine) with a maltose shell is suited to tailor the morphological transformation of anionic vesicles into tube-like networks. The interaction features of these materials partly mimic the biological features of tubular proteins in nature.