Humidity is an important determinant of mycotoxin production (DON, ZEA) by Fusarium species in grain ears. From a landscape perspective, humidity is not evenly distributed across fields. Rather, the topographically controlled redistribution of water within a single field leads to spatially heterogeneous soil water content and air humidity. We therefore hypothesized that the spatial distribution of mycotoxins is related to these topographically controlled factors. To test this hypothesis, we studied mycotoxin concentrations at contrasting topographic relief positions, i.e. hilltops and depressions characterized by soils of different soil moisture regimes, on ten winter wheat fields in 2006 and 2007. Maize was the preceding crop and minimum tillage was practiced in the fields. The different topographic positions were associated with moderate differences in DON and ZEA concentrations in 2006, but with significant differences in 2007, with six times higher median ZEA and two times higher median DON detected at depression sites compared to hilltops. The depression sites correspond to a higher topographic wetness index as well as redoximorphic properties in soil profiles, which empirically supports our hypothesis, at least for years with wetter conditions in the time windows sensitive for Fusarium infection.
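The contrast between hilltops and depressions can be made quantitative with the topographic wetness index mentioned above. A minimal sketch using the standard formulation TWI = ln(a / tan β); the catchment-area and slope values below are invented for illustration, not taken from the study:

```python
import numpy as np

def topographic_wetness_index(upslope_area, slope_deg):
    """Topographic wetness index TWI = ln(a / tan(beta)).

    upslope_area: specific catchment area a (m^2 per unit contour width)
    slope_deg:    local slope beta in degrees (must be > 0)
    """
    beta = np.radians(slope_deg)
    return np.log(upslope_area / np.tan(beta))

# A depression (large contributing area, gentle slope) scores higher
# than a hilltop (small contributing area, steeper slope).
twi_depression = topographic_wetness_index(500.0, 1.0)
twi_hilltop = topographic_wetness_index(20.0, 5.0)
```

Higher TWI values thus flag the wetter depression sites where elevated DON and ZEA medians were observed.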
Wissenschaftliches Schreiben
(2011)
Writing is hard work. It requires both experience and guidance. This text, now in its fourth, expanded edition, offers advice on academic writing in its various forms: from the excerpt and the literature review, through the written exam and the essay, to the final thesis. You will also find suggestions on oral examinations and the disputation. The text is a concentrated aid both for beginning students and for those approaching the end of their degree.
Psychotherapeutic interventions require empirical as well as scientific assessment. Specifically, proven efficacy of psychotherapy for children and adolescents is essential. Thus, studies examining treatment efficacy and meta-analyses are necessary to compare effect sizes of individual therapeutic interventions between treatment groups and waiting-list control groups. Assessment of 138 primary studies from 1993-2009 documented the efficacy of psychotherapy for children and adolescents. Furthermore, behavioural therapy outperformed non-behavioural interventions, as 90% of behavioural interventions showed larger effect sizes compared to non-behavioural psychotherapy. Analysis of moderator variables demonstrated improved treatment efficacy for individual therapy, inclusion of the family, treatment of internalised disorders, and in clinical samples. Stability of psychotherapeutic treatment effects over time was demonstrated.
The open source computational fluid dynamics (CFD) wind model CFD-WEM, developed for wind erosion research in the Xilingele grassland in Inner Mongolia (autonomous region, China), is compared with two open source CFD models, Gerris and OpenFOAM. The models were evaluated with respect to software technology, implemented methods, handling, accuracy and calculation speed, and all were applied to the same wind tunnel data set. Results show that the simplest model, CFD-WEM, has the highest calculation speed with acceptable accuracy, while the most powerful, OpenFOAM, produces the most accurate simulation at the lowest calculation speed. Gerris lies between CFD-WEM and OpenFOAM: it calculates faster than OpenFOAM and is capable of solving a range of different CFD problems. Considering its efficiency and the uncertainties of the other input data, CFD-WEM is the optimal model for further development for wind erosion research in the Inner Mongolia grassland. For other applications of CFD technology, however, Gerris and OpenFOAM can be good choices. This paper demonstrates the capability of open source CFD software in wind erosion studies and advocates greater involvement of open source technology in wind erosion and related ecological research.
Wiedergelesen
(2011)
Since 2008, the small but fine column "Wiedergelesen" (re-read) has complemented the issues of the foreign-policy journal WeltTrends. To fill this column, famous books of political science and treasured rarities are fished out of the bookshelves, dusted off, and read in the light of current interest - read again. The aim is always both to understand these writings in their own time and to make them useful for ours - e.g. Huntington's The Clash of Civilizations, Lipset's Political Man, Paine's Rights of Man, or Linz's An Authoritarian Regime: The Case of Spain.
Wie geht es weiter
(2011)
Spatial numerical associations (SNAs) are prevalent, yet their origin is poorly understood. We first consider the possible prime role of reading habits in shaping SNAs and list three observations that argue against a prominent influence of this role: (1) directional reading habits for numbers may conflict with those for non-numerical symbols, (2) short-term experimental manipulations can overrule the impact of decades of reading experience, and (3) SNAs predate the acquisition of reading. As a promising alternative, we discuss behavioral, neuroscientific, and neuropsychological evidence in support of finger counting as the most likely initial determinant of SNAs. Implications of this "manumerical cognition" stance for the distinction between grounded, embodied, and situated cognition are discussed.
Trait-based studies have become extremely common in plant ecology. Trait-based approaches often rely on the tacit assumption that intraspecific trait variability (ITV) is negligible compared to interspecific variability, so that species can be characterized by mean trait values. Yet, numerous recent studies have challenged this assumption by showing that ITV significantly affects various ecological processes. Accounting for ITV may thus strengthen trait-based approaches, but measuring trait values on a large number of individuals per species and site is not feasible. Therefore, it is important and timely to synthesize existing knowledge on ITV in order to (1) decide critically when ITV should be considered, and (2) establish methods for incorporating this variability. Here we propose a practical set of rules to identify circumstances under which ITV should be accounted for. We formulate a spatial trait variance partitioning hypothesis to highlight the spatial scales at which ITV cannot be ignored in ecological studies. We then refine a set of four consecutive questions on the research question, the spatial scale, the sampling design, and the type of studied traits, to determine case-by-case if a given study should quantify ITV and test its effects. We review methods for quantifying ITV and develop a step-by-step guideline to design and interpret simulation studies that test for the importance of ITV. Even in the absence of quantitative knowledge on ITV, its effects can be assessed by varying trait values within species within realistic bounds around the known mean values. We finish with a discussion of future requirements to further incorporate ITV within trait-based approaches. This paper thus delineates a general framework to account for ITV and suggests a direction towards a more quantitative trait-based ecology.
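The tacit assumption discussed above, that within-species variance (ITV) is small relative to between-species variance, can be checked with a one-way ANOVA-style variance decomposition. An illustrative sketch only, not the authors' protocol; the species names and trait values are invented:

```python
import numpy as np

def partition_trait_variance(trait_values_by_species):
    """Partition total trait variance into a between-species and a
    within-species (ITV) component. By the one-way ANOVA identity,
    the two components sum exactly to the total variance."""
    all_values = np.concatenate(list(trait_values_by_species.values()))
    grand_mean = all_values.mean()
    n_total = all_values.size

    # Variance of species means around the grand mean, sample-weighted.
    between = sum(v.size * (v.mean() - grand_mean) ** 2
                  for v in trait_values_by_species.values()) / n_total
    # Variance of individuals around their own species mean (ITV).
    within = sum(((v - v.mean()) ** 2).sum()
                 for v in trait_values_by_species.values()) / n_total
    return between, within

rng = np.random.default_rng(0)
traits = {"sp_a": rng.normal(10.0, 1.0, 50),   # e.g. leaf size, arbitrary units
          "sp_b": rng.normal(14.0, 1.0, 50)}
between, within = partition_trait_variance(traits)
```

A large `within / (between + within)` ratio at the relevant spatial scale would be one signal that mean trait values alone are insufficient.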
Which repair strategy does the language system deploy when it gets garden-pathed, and what can regressive eye movements in reading tell us about reanalysis strategies? Several influential eye-tracking studies on syntactic reanalysis (Frazier & Rayner, 1982; Meseguer, Carreiras, & Clifton, 2002; Mitchell, Shen, Green, & Hodgson, 2008) have addressed this question by examining scanpaths, i.e., sequential patterns of eye fixations. However, in the absence of a suitable method for analyzing scanpaths, these studies relied on simplified dependent measures that are arguably ambiguous and hard to interpret. We address the theoretical question of repair strategy by developing a new method that quantifies scanpath similarity. Our method reveals several distinct fixation strategies associated with reanalysis that went undetected in a previously published data set (Meseguer et al., 2002). One prevalent pattern suggests re-parsing of the sentence, a strategy that has been discussed in the literature (Frazier & Rayner, 1982); however, readers differed tremendously in how they orchestrated the various fixation strategies. Our results suggest that the human parsing system non-deterministically adopts different strategies when confronted with the disambiguating material in garden-path sentences.
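One simple way to quantify scanpath similarity is an edit distance over the sequence of regions fixated. The measure actually developed in the paper is richer (it also weights fixation durations and spatial positions), so the following is only an illustrative stand-in, with invented region sequences:

```python
def scanpath_distance(s, t):
    """Levenshtein edit distance between two fixation sequences,
    each a list of region-of-interest labels. Smaller distance
    means more similar scanpaths."""
    m, n = len(s), len(t)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[m][n]

# Two readers who both regress to region 2 after the disambiguating
# region 4 have similar scanpaths; a strictly left-to-right reader
# does not.
rereader_a = [1, 2, 3, 4, 2, 3, 4]
rereader_b = [1, 2, 3, 4, 2, 4]
linear     = [1, 2, 3, 4]
```

Clustering readers by such pairwise distances is what lets distinct fixation strategies emerge from the data.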
Urban forests fulfil various functions, among them the restoration process and aesthetical needs of urban residents. This article reflects the attitudes towards different managed forests on the one hand and their influence on psychological well-being on the other. Results of empirical approaches from both fields show some inconsistency, suggesting that people have a more positive attitude towards wild forest areas, while the effect on well-being is more positive after a walk in tended forest areas. A discussion follows on the link between perception and the effect of urban forests. An outlook on necessary research reveals the need for longitudinal research. The article concludes by showing management implications.
Weimarer Klassik
(2011)
Wavelet modelling of the gravity field by domain decomposition methods: an example over Japan
(2011)
With the advent of satellite gravimetry, large gravity data sets of unprecedented quality at low and medium resolution have become available. For local, high-resolution field modelling, they need to be combined with surface gravity data. Such models are then used for various applications, from the study of the Earth's interior to the determination of oceanic currents. Here we show how to realize such a combination in a flexible way using spherical wavelets and a domain decomposition approach. This iterative method, based on the Schwarz algorithms, makes it possible to split a large problem into smaller ones and avoids the calculation of the entire normal system, which may be huge if high resolution is sought over wide areas. A subdomain is defined as the harmonic space spanned by a subset of the wavelet family. Based on the localization properties of the wavelets in space and frequency, we define hierarchical subdomains of wavelets at different scales. On each scale, blocks of subdomains are defined by a tailored spatial splitting of the area. The data weighting and regularization are iteratively adjusted for the subdomains, which makes it possible to handle heterogeneity in the data quality or in the gravity variations. Different levels of approximation of the subdomain normals are also introduced, corresponding to building local averages of the data at different resolution levels.
We first provide the theoretical background on domain decomposition methods. Then, we validate the method with synthetic data, considering two kinds of noise: white noise and coloured noise. We then apply the method to data over Japan, where we combine a satellite-based geopotential model, EIGEN-GL04S, and a local gravity model from a combination of land and marine gravity data and an altimetry-derived marine gravity model. A hybrid spherical harmonics/wavelet model of the geoid is obtained at about 15 km resolution and a corrector grid for the surface model is derived.
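The Schwarz splitting idea can be illustrated on a plain linear system: each subdomain repeatedly solves only its local restriction of the residual equation, so the full normal system is never assembled. A minimal sketch of the alternating (multiplicative) Schwarz variant, with a toy tridiagonal system standing in for the wavelet normal equations:

```python
import numpy as np

def alternating_schwarz(A, b, subdomains, n_sweeps=50):
    """Solve A x = b by sweeping over overlapping subdomains (lists
    of unknown indices). In each step, only the small local system
    A[idx, idx] is solved against the current residual, so the full
    system is never factorized at once."""
    x = np.zeros_like(b)
    for _ in range(n_sweeps):
        for idx in subdomains:
            idx = np.asarray(idx)
            r = b - A @ x                      # global residual
            A_loc = A[np.ix_(idx, idx)]        # local normal block
            x[idx] += np.linalg.solve(A_loc, r[idx])
    return x

# Toy symmetric positive-definite system split into two
# overlapping subdomains (indices 3 and 4 are shared).
n = 8
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = alternating_schwarz(A, b, subdomains=[range(0, 5), range(3, 8)])
```

The overlap between subdomains is what lets the local corrections propagate and the iteration converge.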
The varied history of the Dobrilugk monastery is presented through a selection of monastic charters that are today held in the Brandenburgisches Landeshauptarchiv. The reasons for this selection are manifold. On the one hand, the exhibits chosen for the charter exhibition of the same name, shown in 2011 in the former refectory of the Dobrilugk monastery, appeal to the viewer through their outward appearance - an impressive seal or an autograph signature of the issuer. On the other hand, considering the issuers of the charters, the selection shows in how many ways the Dobrilugk monastery was embedded in the medieval world.
Walter Boehlich : Kritiker
(2011)
Vorwort
(2011)
Vorwort
(2011)
Vorwort
(2011)
This Letter reports on new methods and a consistent model for voltage-tunable optical transmission gratings. Elastomeric gratings were molded from holographically written surface relief gratings in an azobenzene sol-gel material. These were placed on top of a transparent electroactive elastomeric substrate. Two different electroactive substrate elastomers were employed, with a large range of prestretches. A novel finite-deformation theory was found to match the device response excellently, without fitting parameters. The results clearly show that the grating underwent pure-shear deformation and, more surprisingly, that the mechanical properties of the electroactive substrate did not affect device actuation. (C) 2011 Optical Society of America
Vitamin A metabolism is changed in donors after living-kidney transplantation: an observational study
(2011)
Background: The kidneys are essential for the metabolism of vitamin A (retinol) and its transport proteins retinol-binding protein 4 (RBP4) and transthyretin. Little is known about changes in serum concentrations after living donor kidney transplantation (LDKT) as a consequence of unilateral nephrectomy, although an association of these parameters with the risk of cardiovascular disease and insulin resistance has been suggested. Therefore, we analyzed the concentrations of retinol, RBP4, apoRBP4 and transthyretin in the serum of 20 living-kidney donors and the respective recipients at baseline as well as 6 weeks and 6 months after LDKT.
Results: As a consequence of LDKT, the kidney function of recipients was improved while the kidney function of donors was moderately reduced within 6 weeks after LDKT. With regard to vitamin A metabolism, the recipients revealed higher levels of retinol, RBP4, transthyretin and apoRBP4 before LDKT in comparison to donors. After LDKT, the levels of all four parameters decreased in serum of the recipients, while retinol, RBP4 as well as apoRBP4 serum levels of donors increased and remained increased during the follow-up period of 6 months.
Conclusion: LDKT is generally regarded as beneficial for allograft recipients and not particularly detrimental for the donors. However, this study demonstrates that a moderate reduction of kidney function by unilateral nephrectomy results in an imbalance of components of vitamin A metabolism, with a significant increase of retinol, RBP4 and apoRBP4 concentrations in the serum of donors.
A business process is a set of steps designed to be executed in a certain order to achieve a business value. Such processes are often driven by and documented using process models. Nowadays, process models are also applied to drive process execution, so correctness of business process models is a must. Much work has been devoted to checking general, domain-independent correctness criteria, such as soundness. However, business processes must also adhere to and show compliance with various regulations and constraints, the so-called compliance requirements. These are domain-dependent requirements.
In many situations, verifying compliance on the model level is of great value, since violations can be resolved at an early stage, prior to execution. However, this calls for formal verification techniques, e.g., model checking, that are too complex for business experts to apply. In this paper, we utilize a visual language, BPMN-Q, to express compliance requirements visually, in a way similar to that used by business experts to build process models. Still, using a pattern-based approach, each BPMN-Q graph has a formal temporal logic expression in computation tree logic (CTL). Moreover, the user is able to express constraints, i.e., compliance rules, regarding both control flow and data flow aspects. In order to provide valuable feedback to the user in case of violations, we rely on temporal logic querying approaches as well as BPMN-Q to visually highlight the paths in a process model whose execution causes violations.
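The kind of feedback described above, reporting the execution paths that violate a rule, can be sketched on a toy acyclic process graph. The rule checked below is the response pattern "after `trigger`, `required` must eventually follow", i.e. CTL AG(trigger → AF required); the graph model and function names are hypothetical, not the BPMN-Q API:

```python
def violating_paths(edges, start, end, trigger, required):
    """Enumerate execution paths of an acyclic process graph that
    violate 'after trigger, required must eventually follow'.
    Returning the offending paths (rather than a bare yes/no) is
    what enables visual highlighting in the model."""
    succ = {}
    for u, v in edges:
        succ.setdefault(u, []).append(v)

    bad = []
    def walk(node, path, armed):
        # 'armed' means: trigger was seen and required not yet seen.
        armed = (armed or node == trigger) and node != required
        if node == end:
            if armed:
                bad.append(path)
            return
        for nxt in succ.get(node, []):
            walk(nxt, path + [nxt], armed)
    walk(start, [start], False)
    return bad

# Toy process: after 'approve', 'archive' must follow; the lower
# branch via 'ship' skips it and is reported as a violation.
edges = [("start", "approve"), ("approve", "archive"),
         ("archive", "end"), ("approve", "ship"), ("ship", "end")]
violations = violating_paths(edges, "start", "approve" and "end", "approve", "archive") if False else \
             violating_paths(edges, "start", "end", "approve", "archive")
```

A real checker must of course also handle cycles and gateway semantics, which the model-checking machinery in the paper covers.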
Miniature eye movements jitter the retinal image unceasingly, raising the question of how perceptual continuity is achieved during visual fixation. Recent work discovered suppression of visual bursts in the superior colliculus around the time of microsaccades, tiny jerks of the eyes that support visual perception while gaze is fixed. This finding suggests that corollary discharge, supporting visual stability when rapid eye movements drastically shift the retinal image, may also exist for the smallest saccades.
Business processes are commonly modeled using a graphical modeling language. The most widespread notation for this purpose is business process diagrams in the Business Process Modeling Notation (BPMN). In this article, we use the visual query language BPMN-Q for expressing patterns that are related to possible problems in such business process diagrams. We discuss two classes of problems that can be found frequently in real-world models: sequence flow errors and model fragments that can make the model difficult to understand.
By using a query processor, a business process modeler is able to identify possible errors in business process diagrams. Moreover, the erroneous parts of the business process diagram can be highlighted when an instance of an error pattern is found. This way, the modeler gets an easy-to-understand feedback in the visual modeling language he or she is familiar with. This is an advantage over current validation methods, which usually lack this kind of intuitive feedback.
We report on very high energy (>100 GeV) gamma-ray observations of Swift J164449.3+573451, an unusual transient object first detected by the Swift Observatory and later detected by multiple radio, optical, and X-ray observatories. A total exposure of 28 hr was obtained on Swift J164449.3+573451 with the Very Energetic Radiation Imaging Telescope Array System (VERITAS) during 2011 March 28-April 15. We do not detect the source and place a differential upper limit on the emission at 500 GeV during these observations of 1.4 x 10(-12) erg cm(-2) s(-1) (99% confidence level). We also present time-resolved upper limits and use a flux limit averaged over the X-ray flaring period to constrain various emission scenarios that can accommodate both the radio-through-X-ray emission detected from the source and the lack of detection by VERITAS.
We present the results of observations of the TeV binary LS I +61° 303 with the VERITAS telescope array between 2008 and 2010, at energies above 300 GeV. In the past, both ground-based gamma-ray telescopes VERITAS and MAGIC have reported detections of TeV emission near the apastron phases of the binary orbit. The observations presented here show no strong evidence for TeV emission during these orbital phases; however, during observations taken in late 2010, significant emission was detected from the source close to the phase of superior conjunction (much closer to periastron passage) at a 5.6 standard deviation (5.6 sigma) post-trials significance. In total, between 2008 October and 2010 December, a total exposure of 64.5 hr was accumulated with VERITAS on LS I +61° 303, resulting in an excess at the 3.3 sigma significance level for constant emission over the entire integrated data set. The flux upper limits derived for emission during the previously reliably active TeV phases (i.e., close to apastron) are less than 5% of the Crab Nebula flux in the same energy range. This result stands in apparent contrast to previous observations by both MAGIC and VERITAS, which detected the source during these phases at 10% of the Crab Nebula flux. During the two-year span of observations, a large amount of X-ray data was also accrued on LS I +61° 303 by the Swift X-ray Telescope and the Rossi X-ray Timing Explorer Proportional Counter Array. We find no evidence for a correlation between emission in the X-ray and TeV regimes during 20 directly overlapping observations. We also comment on data obtained contemporaneously by the Fermi Large Area Telescope.
We present the results of 16 Swift-triggered Gamma-ray burst (GRB) follow-up observations taken with the Very Energetic Radiation Imaging Telescope Array System (VERITAS) telescope array from 2007 January to 2009 June. The median energy threshold and response time of these observations were 260 GeV and 320 s, respectively. Observations had an average duration of 90 minutes. Each burst is analyzed independently in two modes: over the whole duration of the observations and again over a shorter timescale determined by the maximum VERITAS sensitivity to a burst with a t(-1.5) time profile. This temporal model is characteristic of GRB afterglows with high-energy, long-lived emission that have been detected by the Large Area Telescope on board the Fermi satellite. No significant very high energy (VHE) gamma-ray emission was detected and upper limits above the VERITAS threshold energy are calculated. The VERITAS upper limits are corrected for gamma-ray extinction by the extragalactic background light and interpreted in the context of the keV emission detected by Swift. For some bursts the VHE emission must have less power than the keV emission, placing constraints on inverse Compton models of VHE emission.
Previous research has shown that high phonotactic frequencies facilitate the production of regularly inflected verbs in English-learning children with specific language impairment (SLI) but not with typical development (TD). We asked whether this finding can be replicated for German, a language with a much more complex inflectional verb paradigm than English. Using an elicitation task, the production of inflected nonce verb forms (3rd person singular with -t suffix) with either high- or low-frequency subsyllables was tested in sixteen German-learning children with SLI (ages 4;1-5;1), sixteen TD children matched for chronological age (CA) and fourteen TD children matched for verbal age (VA) (ages 3;0-3;11). The findings revealed that children with SLI, but not CA- or VA-matched children, showed differential performance between the two types of verbs, producing more inflectional errors when the verb forms resulted in low-frequency subsyllables than when they resulted in high-frequency subsyllables, replicating the results from English-learning children.
The morphological features in the deviations of the total electron content (TEC) of the ionosphere from the undisturbed background state are analyzed as possible precursors of the earthquake of January 12, 2010 (21:53 UT (16:53 LT), 18.46° N, 72.5° W, M 7.0) in Haiti. To identify these features, global and regional differential TEC maps based on global 2-h TEC maps provided by NASA in the IONEX format were plotted. For the considered earthquake, long-lived disturbances, presumably of seismic origin, were localized in the near-epicenter area and were accompanied by similar effects in the magnetoconjugate region. Both decreases and increases in the local TEC were observed over the period from 22 UT on January 10 to 08 UT on January 12, 2010. The horizontal dimensions of the anomalies were ~40° in longitude and ~20° in latitude, with the magnitude of the TEC disturbances reaching ~40% relative to the background near the epicenter and more than 50% in the magnetoconjugate area. No significant geomagnetic disturbances were observed within January 1-12, 2010; i.e., the detected TEC anomalies were manifestations of interplay between processes in the lithosphere-atmosphere-ionosphere system.
Standing stocks are typically easier to measure than process rates such as production. Hence, stocks are often used as indicators of ecosystem functions, although the latter are generally more strongly related to rates than to stocks. The regulation of stocks and rates, and thus their variability over time, may differ, as stocks constitute the net result of production and losses. Based on long-term high-frequency measurements in a large, deep lake, we explore the variability patterns in primary and bacterial production and relate them to those of the corresponding standing stocks, i.e. chlorophyll concentration, phytoplankton and bacterial biomass. We employ different, complementary methods (coefficient of variation, spline fitting and spectral analysis) for assessing the variability present in the plankton data at different temporal scales. In phytoplankton, we found that the overall variability of primary production is dominated by fluctuations at low frequencies, such as the annual cycle, whereas in stocks, and chlorophyll in particular, higher frequencies contribute substantially to the overall variance. This suggests that using standing stocks instead of rate measures leads to an under- or overestimation of food shortage for consumers during distinct periods of the year. The range of annual variation in bacterial production is 8 times greater than that of biomass, showing that the variability of bacterial activity (e.g. oxygen consumption, remineralisation) would be underestimated if biomass were used. The P/B ratios were variable, and although clear trends are present in both bacteria and phytoplankton, no systematic relationship between stock and rate measures was found for the two groups. Hence, standing stock and process rate measures exhibit different variability patterns, and care is needed when interpreting the mechanisms and implications of the variability encountered.
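The spectral step described above, attributing variance to low (e.g. annual) versus higher frequencies, can be sketched with a discrete Fourier transform. A toy example with a synthetic daily series; real limnological data would first need gap filling and detrending:

```python
import numpy as np

def variance_by_frequency(series, dt_days=1.0, split_days=365.0):
    """Split the variance of a mean-removed time series into a
    low-frequency part (periods >= split_days, e.g. the annual
    cycle) and a high-frequency part, using Parseval's theorem:
    the per-frequency powers sum to the total variance."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(x.size, d=dt_days)
    power = np.abs(X) ** 2 / x.size ** 2   # sums to var(x)
    cutoff = 1.0 / split_days + 1e-12      # tolerance for on-grid freqs
    low = power[(freqs != 0) & (np.abs(freqs) <= cutoff)].sum()
    high = power[np.abs(freqs) > cutoff].sum()
    return low, high

# Four years of daily values: an annual cycle plus a weaker
# ~10-day fluctuation (invented amplitudes).
t = np.arange(4 * 365)
series = np.sin(2 * np.pi * t / 365.0) + 0.3 * np.sin(2 * np.pi * t / 10.0)
low, high = variance_by_frequency(series)
```

A rate series dominated by `low` and a stock series with substantial `high` would reproduce the contrast reported for production versus chlorophyll.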
Using cationic polyelectrolytes with different molecular architectures, only hyperbranched poly(ethyleneimine) with maltose shell is suited to tailor the morphological transformation of anionic vesicles into tube-like networks. The interaction features of those materials partly mimic biological features of tubular proteins in nature.
Transarea studies focus upon spaces as created by the movements that criss-cross them. From this point of view, from its very beginnings, literature is closely interrelated with a vectorial (and much less with a purely spatial) conception of history - and with urbanity, which plays a decisive role in Gilgamesh's travels through a (narrative) cosmos centered upon the city of Uruk. This article explores the city as a transareal space of movement in three examples of literature, with no fixed abode, around the turn of the millennium, i.e. Assia Djebar's Les Nuits de Strasbourg, Emine Sevgi Oezdamar's Istanbul-Berlin Trilogy, and Cecile Wajsbrot's L'ile aux musees. These three writers project, in a very specific way, cities in motion as anagrammatic and fractal structures.
Amorphous materials represent a large and important emerging area of materials science. Amorphous oxides are key technological oxides in applications such as the gate dielectric in complementary metal-oxide-semiconductor (CMOS) devices and in SONOS (silicon-oxide-nitride-oxide-silicon) and TANOS (TaN-Al2O3-Si3N4-SiO2-silicon) flash memories. These technologies are required for the high packing density of today's integrated circuits. Therefore, the investigation of defect states in these structures is crucial. In this work we present X-ray synchrotron measurements of amorphous alumina, with an energy resolution about 5-10 times higher than is attainable with standard spectrometers. We demonstrate that our experimental results are in agreement with calculated spectra of amorphous alumina that we have generated by stochastic quenching. This first-principles method, which we have recently developed, is found to be superior to molecular dynamics in simulating the rapid gas-to-solid transition that takes place as this material is deposited for thin-film applications. We detect and analyze in detail states in the band gap that originate from oxygen pairs. Similar states were previously found in amorphous alumina by other spectroscopic methods and were assigned to oxygen vacancies claimed to act mutually as electron and hole traps. The oxygen pairs that we probe in this work act as hole traps only and will influence the information retention in electronic devices. Oxygen pairs have already been found in amorphous silica, so they may be a feature characteristic of other amorphous metal oxides as well.
Untitled
(2011)
Background: Inferring regulatory interactions between genes from transcriptomics time-resolved data, yielding reverse engineered gene regulatory networks, is of paramount importance to systems biology and bioinformatics studies. Accurate methods to address this problem can ultimately provide a deeper insight into the complexity, behavior, and functions of the underlying biological systems. However, the large number of interacting genes coupled with short and often noisy time-resolved read-outs of the system renders the reverse engineering a challenging task. Therefore, the development and assessment of methods which are computationally efficient, robust against noise, applicable to short time series data, and preferably capable of reconstructing the directionality of the regulatory interactions remains a pressing research problem with valuable applications.
Results: Here we perform the largest systematic analysis of a set of similarity measures and scoring schemes within the scope of the relevance network approach, which are commonly used for gene regulatory network reconstruction from time series data. In addition, we define and analyze several novel measures and schemes which are particularly suitable for short transcriptomics time series. We also compare the considered 21 measures and 6 scoring schemes according to their ability to correctly reconstruct such networks from short time series data by calculating summary statistics based on the corresponding specificity and sensitivity. Our results demonstrate that rank- and symbol-based measures have the highest performance in inferring regulatory interactions. In addition, the proposed scoring scheme based on asymmetric weighting has been shown to be valuable in reducing the number of false positive interactions. On the other hand, Granger causality as well as information-theoretic measures, frequently used in the inference of regulatory networks, show low performance on the short time series analyzed in this study.
Conclusions: Our study is intended to serve as a guide for choosing a particular combination of similarity measures and scoring schemes suitable for reconstruction of gene regulatory networks from short time series data. We show that further improvement of algorithms for reverse engineering can be obtained if one considers measures that are rooted in the study of symbolic dynamics or ranks, in contrast to the application of common similarity measures which do not consider the temporal character of the employed data. Moreover, we establish that the asymmetric weighting scoring scheme together with symbol based measures (for low noise level) and rank based measures (for high noise level) are the most suitable choices.
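As an illustration of the rank-based measures that performed best, a relevance network can be built from pairwise Spearman correlations of short expression time series. This sketch omits tie handling and the asymmetric-weighting scoring scheme evaluated in the study, and the toy data are invented:

```python
import numpy as np

def spearman_matrix(data):
    """Pairwise Spearman rank correlation between genes
    (rows = genes, columns = time points). Assumes distinct
    values per row, so simple double-argsort ranking suffices."""
    ranks = np.argsort(np.argsort(data, axis=1), axis=1).astype(float)
    ranks -= ranks.mean(axis=1, keepdims=True)
    norm = np.sqrt((ranks ** 2).sum(axis=1))
    return (ranks @ ranks.T) / np.outer(norm, norm)

def relevance_network(data, threshold=0.9):
    """Relevance network: keep an undirected edge wherever the
    absolute rank correlation exceeds the threshold."""
    rho = spearman_matrix(data)
    adj = np.abs(rho) >= threshold
    np.fill_diagonal(adj, False)
    return adj

# Toy example: gene 1 tracks gene 0 monotonically over five time
# points; gene 2 is unrelated.
data = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                 [2.1, 3.9, 6.2, 7.8, 10.5],
                 [5.0, 1.0, 4.0, 2.0, 3.0]])
net = relevance_network(data)
```

Because the ranks discard amplitude information, such measures are robust to the monotone distortions and noise typical of short transcriptomics series, which is consistent with their strong performance in the comparison.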
The present study investigated whether visual and kinesthetic stimuli are stored as multisensory or modality-specific representations in unimodal and crossmodal working memory tasks. To this end, angle-shaped movement trajectories were presented to 16 subjects in delayed matching-to-sample tasks, either visually or kinesthetically during encoding and recognition. During the retention interval, a secondary visual or kinesthetic interference task was inserted either immediately or with a delay after encoding. The modality of the interference task interacted significantly with the encoding modality. After visual encoding, memory was more impaired by a visual than by a kinesthetic secondary task, while after kinesthetic encoding the pattern was reversed. The time at which the secondary task had to be performed interacted with the encoding modality as well. For visual encoding, memory was more impaired when the secondary task had to be performed at the beginning of the retention interval. In contrast, memory after kinesthetic encoding was more affected when the secondary task was introduced later in the retention interval. The findings suggest that working memory traces are maintained in a modality-specific format characterized by distinct consolidation processes that take longer after kinesthetic than after visual encoding.
The main thread of this review article is to identify how to account for the trajectory of American power in the region. Leaving behind the vast amount of highly politicised and hastily compiled volumes of recent years (notwithstanding valuable exceptions), the monographs by Lawrence Freedman, Trita Parsi and Olivier Roy attempt to subtly disentangle the intricacies of US involvement in the region from highly distinct perspectives. One caveat for International Relations theorists is that none of these authors intends to provide a theoretical framework for his examination. However, since IR theory has damagingly neglected history in recent decades, the works under review here at least partly compensate for this disciplinary and intellectual failure.
In conclusion, Freedman's in-depth approach as a diplomatic historian, with its underlying reference to the various traditions in US foreign policy thinking, is most illuminating, while Parsi's contestable account focuses too narrowly on the Iran-Israel relationship. Roy's explications fail to show how and why the 'ideological' element in US foreign policy came to carry far more weight after 2001 than it did in the 1990s.
Parallel communicating finite automata (PCFAs) are systems of several finite state automata which process a common input string in a parallel way and are able to communicate by sending their states upon request. We consider deterministic and nondeterministic variants and distinguish four working modes. It is known that these systems in the most general mode are as powerful as one-way multi-head finite automata. It is additionally known that the number of heads corresponds to the number of automata in PCFAs in a constructive way. Thus, undecidability results as well as results on the hierarchies induced by the number of heads carry over from multi-head finite automata to PCFAs in the most general mode. Here, we complement these undecidability and hierarchy results also for the remaining working modes. In particular, we show that classical decidability questions are not semi-decidable for any type of PCFAs under consideration. Moreover, it is proven that the number of automata in the system induces infinite hierarchies for deterministic and nondeterministic PCFAs in three working modes.
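The communication-by-request mechanism central to PCFAs can be sketched with a toy simulator. Everything below — the states, transitions, and the accepted language — is a hypothetical illustration of a non-returning two-component system, not a construction from the paper:

```python
def run_pcfa(word, deltas, starts, accepting, query_states):
    """Toy PCFA run: all components read the same input symbol in
    lockstep; a component entering a query state "qN" then receives
    component N's current state (non-returning communication)."""
    states = list(starts)
    for sym in word:
        # transition phase: each component steps on the common symbol
        states = [deltas[i].get((states[i], sym), "reject")
                  for i in range(len(states))]
        # communication phase: "qN" is replaced by component N's state
        states = [states[int(s[1]) - 1] if s in query_states else s
                  for s in states]
    return all(s in acc for s, acc in zip(states, accepting))

# Hypothetical system accepting words over {a, b} that end in b and
# contain an even number of a's: component 1 tracks the parity of a's,
# component 2 queries component 1's state on every b.
d1 = {("even", "a"): "odd", ("odd", "a"): "even",
      ("even", "b"): "even", ("odd", "b"): "odd"}
d2 = {("s", "a"): "s", ("s", "b"): "q1",
      ("even", "a"): "s", ("even", "b"): "q1",
      ("odd", "a"): "s", ("odd", "b"): "q1"}
system = ([d1, d2], ["even", "s"], [{"even", "odd"}, {"even"}], {"q1"})
print(run_pcfa("aab", *system), run_pcfa("ab", *system))  # True False
```

The example only shows the synchronization and communication phases; the power of PCFAs (equivalence to one-way multi-head automata in the most general mode) comes from iterating exactly this request mechanism over more components.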
Un episodio en la vida del pintor viajero : mala literatura y escritura fulminante en César Aira
(2011)
Ultrasound (20 kHz, 29 W cm⁻²) is employed to form three types of erbium oxide nanoparticles in the presence of multiwalled carbon nanotubes as a template material in water. The nanoparticles are (i) erbium carboxioxide nanoparticles deposited on the external walls of multiwalled carbon nanotubes and bulk Er₂O₃ with (ii) hexagonal and (iii) spherical geometries. Each type of ultrasonically formed nanoparticle reveals Er³⁺ photoluminescence from the crystal lattice. The main advantage of the erbium carboxioxide nanoparticles on the carbon nanotubes is their electromagnetic emission in the visible region, which had not been examined before. On the other hand, the photoluminescence of hexagonal erbium oxide nanoparticles is long-lived (μs) and enables the higher-energy transition (⁴S₃/₂ → ⁴I₁₅/₂), which is not observed for spherical nanoparticles. Our work is unique in combining, for the first time, spectroscopy of Er³⁺ electronic transitions in the host crystal lattices of nanoparticles with geometries established by ultrasound in an aqueous solution of carbon nanotubes employed as a template material. The work can be of great interest for "green" chemistry synthesis of photoluminescent nanoparticles in water.
Lahn M, Dosche C, Hille C. Two-photon microscopy and fluorescence lifetime imaging reveal stimulus-induced intracellular Na+ and Cl- changes in cockroach salivary acinar cells. Am J Physiol Cell Physiol 300: C1323-C1336, 2011. First published February 23, 2011; doi: 10.1152/ajpcell.00320.2010. The intracellular ion homeostasis in cockroach salivary acinar cells during salivation is not satisfactorily understood, mainly owing to technical problems: strong tissue autofluorescence and ineffective quantification of ion concentrations. To minimize these problems, we describe the successful application of two-photon (2P) microscopy, partly in combination with fluorescence lifetime imaging microscopy (FLIM), to record intracellular Na⁺ and Cl⁻ concentrations ([Na⁺]ᵢ, [Cl⁻]ᵢ) in cockroach salivary acinar cells. Quantitative 2P-FLIM Cl⁻ measurements with the dye N-(ethoxycarbonylmethyl)-6-methoxy-quinolinium bromide indicate that the resting [Cl⁻]ᵢ is 1.6 times above the Cl⁻ electrochemical equilibrium but is not influenced by pharmacological inhibition of the Na⁺-K⁺-2Cl⁻ cotransporter (NKCC) and anion exchanger using bumetanide and 4,4'-diisothiocyanatodihydrostilbene-2,2'-disulfonic acid disodium salt. In contrast, rapid Cl⁻ reuptake after extracellular Cl⁻ removal is almost entirely NKCC-mediated, both in the absence and presence of dopamine. However, in physiological saline [Cl⁻]ᵢ does not change during dopamine stimulation, although dopamine stimulates fluid secretion in these glands. On the other hand, dopamine causes a decrease in the sodium-binding benzofuran isophthalate tetra-ammonium salt (SBFI) fluorescence and an increase in the Sodium Green fluorescence after 2P excitation. This opposite behavior of the two dyes suggests a dopamine-induced [Na⁺]ᵢ rise in the acinar cells, which is supported by the determined 2P-action cross sections of SBFI. The [Na⁺]ᵢ rise is Cl⁻ dependent and inhibited by bumetanide. The Ca²⁺ ionophore ionomycin also causes a bumetanide-sensitive [Na⁺]ᵢ rise. We propose that a Ca²⁺-mediated NKCC activity in acinar peripheral cells, attributable to dopamine stimulation, mediates basolateral Na⁺ uptake during saliva secretion and that the concomitantly transported Cl⁻ is recycled back to the bath.
Turning wind into power : effects of stakeholder networks on renewable energy governance in India
(2011)
Polymer foams and void-containing polymer-film systems with internally charged voids combine large piezoelectricity with mechanical flexibility and elastic compliance. This new class of soft materials (often called ferro- or piezoelectrets) has attracted considerable attention from science and industry. It has been found that the voids can be internally charged by means of dielectric barrier discharges (DBDs) under high electric fields. The charged voids can be considered as man-made macroscopic dipoles. Depending on the ferroelectret structure and the pressure of the internal gas, the voids may be highly compressible. Consequently, very large dipole-moment changes can be induced by mechanical or electrical stresses, leading to large piezoelectricity. DBD charging of the voids is a critical process for rendering polymer foams piezoelectric. Thus, a comprehensive exploration of DBD charging is essential for understanding and optimizing piezoelectricity in ferroelectrets. Recent studies show that DBDs in the voids are triggered when the internal electric field reaches a threshold value according to Townsend's model of Paschen breakdown. During the DBDs, charges of opposite polarity are generated and trapped at the top and bottom internal surfaces of the gas-filled voids, respectively. The deposited charges induce an electric field opposite to the externally applied one and thus extinguish the DBDs. Back discharges may eventually be triggered when the external voltage is reduced or turned off. To optimize the efficiency of DBD charging, the geometry (in particular the height) of the voids and the type of gas and its pressure inside the voids are essential factors to consider and optimize. In addition, the influence of the plasma treatment on the internal void surfaces during the DBDs should be taken into account.
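The Paschen threshold mentioned above can be evaluated with the classical Paschen curve. The gas constants A and B and the secondary-emission coefficient γ below are textbook order-of-magnitude values for air, used purely for illustration, not parameters from this work:

```python
import math

def paschen_breakdown_voltage(pd, A=15.0, B=365.0, gamma=0.01):
    """Classical Paschen curve V_b = B*p*d / ln(A*p*d / ln(1 + 1/gamma)).
    pd is the pressure-gap product in Torr*cm; A (1/(Torr*cm)),
    B (V/(Torr*cm)) and gamma are illustrative values for air."""
    k = math.log(1.0 + 1.0 / gamma)
    if A * pd <= k:
        raise ValueError("no self-sustained discharge at this p*d")
    return B * pd / math.log(A * pd / k)

# Breakdown voltage for a hypothetical ~13 um air gap at atmospheric
# pressure (pd ~ 1 Torr*cm): on the order of 300 V.
print(round(paschen_breakdown_voltage(1.0), 1))
```

The curve has a minimum: for the small, gas-filled voids of ferroelectrets, both decreasing and increasing the pressure-gap product away from that minimum raises the voltage needed to ignite the DBD, which is why void height and internal gas pressure are the key optimization parameters named above.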
A thermosensitive statistical copolymer based on oligo(ethylene glycol) methacrylates incorporating biotin was synthesized by free radical copolymerisation. The influence of added avidin on its thermoresponsive behaviour was investigated. The specific binding of avidin to the biotinylated copolymers provoked a marked increase of the lower critical solution temperature.
Vertical flow filters and vertical flow constructed wetlands are established wastewater treatment systems and have also been proposed for the treatment of contaminated groundwater. This study investigates the removal processes of volatile organic compounds in a pilot-scale vertical flow filter. The filter is intermittently irrigated with contaminated groundwater containing benzene, MTBE and ammonium as the main contaminants. The system is characterized by unsaturated conditions and high contaminant removal efficiency. The aim of the present study is to evaluate the contribution of biodegradation and volatilization to the overall removal of benzene and MTBE. Tracer tests and flow rate measurements showed a highly transient flow and heterogeneous transport regime. Radon-222, naturally occurring in the treated groundwater, was used as a gas tracer and indicated a high volatilization potential. Radon-222 behavior was reproduced by numerical simulations and extrapolated for benzene and MTBE, and indicated these compounds also have a high volatilization potential. In contrast, passive sampler measurements on top of the filter detected only low benzene and MTBE concentrations. Biodegradation potential was evaluated by the analysis of catabolic genes involved in organic compound degradation and a quantitative estimation of biodegradation was derived from stable isotope fractionation analysis. Results suggest that despite the high volatilization potential, biodegradation is the predominant mass removal process in the filter system, which indicates that the volatilized fraction of the contaminants is still subject to subsequent biodegradation. In particular, the upper filter layer located between the injection tubes and the surface of the system might also contribute to biodegradation, and might play a crucial role in avoiding the emission of volatilized contaminants into the atmosphere.
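The quantitative estimate of biodegradation from stable isotope fractionation mentioned above is commonly derived with the Rayleigh model. The sketch below uses hypothetical δ¹³C values and an illustrative enrichment factor, not measurements from this study:

```python
def biodegradation_percent(delta0, delta_t, epsilon):
    """Rayleigh model: remaining fraction
    f = ((delta_t + 1000) / (delta0 + 1000)) ** (1000 / epsilon);
    returns the percentage degraded B = (1 - f) * 100.
    delta values and epsilon are in permil (epsilon is negative)."""
    f = ((delta_t + 1000.0) / (delta0 + 1000.0)) ** (1000.0 / epsilon)
    return (1.0 - f) * 100.0

# Hypothetical benzene carbon isotope shift: source -28.0 permil,
# sample -25.0 permil, enrichment factor epsilon = -2.0 permil.
print(round(biodegradation_percent(-28.0, -25.0, -2.0), 1))  # → 78.6
```

Because volatilization fractionates carbon isotopes far less than biodegradation does, an enrichment of this kind supports the study's conclusion that biodegradation, not volatilization, dominates mass removal.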
Trapped between in and out : the post-institutional liminality of ex-prisoners in East Berlin
(2011)
This article examines particular spaces that recur in Wolfgang Hilbig's prose and that carry a specific affective charge. Among them are the kitchens, which in Hilbig's work undergo a gender transformation from a formerly female into a male workspace. Likewise, idyllic forest settlements in Hilbig's stories become sites permeated by the murder of the Jews. In general, Hilbig's spaces are subject to transformations that bring about metamorphoses.
Transarchipelische Szenografien : mobile Inszenierungen des Globalen im 16. und 17. Jahrhundert
(2011)
Questions
What are the most likely environmental drivers for compositional herb layer changes as indicated by trait differences between winner and loser species?
Location
Weser-Elbe region (NW Germany).
Methods
We resurveyed the herb layer communities of ancient forest patches on base-rich sites of 175 semi-permanent plots. Species traits were tested for their ability to discriminate between winner and loser species using logistic regression analyses and deviance partitioning.
Results
Of 115 species tested, 31 were identified as winner species and 30 as loser species. Winner species had higher seed longevity, flowered later in the season and more often had an oceanic distribution compared to loser species. Loser species tended to have a higher specific leaf area, were more susceptible to deer browsing and had a performance optimum at higher soil pH compared to winner species. The loser species also represented several ancient forest and threatened species. Deviance partitioning indicated that local drivers (i.e. disturbance due to forest management) were primarily responsible for the species shifts, while regional drivers (i.e. browsing pressure and acidification from atmospheric deposition) and global drivers (i.e. climate warming) had moderate effects. There was no evidence that canopy closure, drainage or eutrophication contributed to herb layer changes.
Conclusions
The relative importance of the different drivers as indicated by the winner and loser species differs from that found in previous long-term studies. Relating species traits to species performance is a valuable tool that provides insight into the environmental drivers that are most likely responsible for herb layer changes.
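The logistic regression analyses relating traits to winner/loser status can be sketched as follows. The trait values and winner labels below are hypothetical, and this is a plain gradient-descent fit, not the authors' statistical pipeline:

```python
import math

def fit_logistic(x, y, lr=0.1, steps=5000):
    """Gradient-descent fit of P(winner) = sigmoid(b0 + b1 * x)
    for a single species trait x and binary status y (1 = winner)."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += (p - yi) / n          # gradient w.r.t. intercept
            g1 += (p - yi) * xi / n     # gradient w.r.t. slope
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Hypothetical seed-longevity index per species (higher = longer-lived
# seed bank) and observed status (1 = winner, 0 = loser).
trait  = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3, 2.9, 3.4]
winner = [0,   0,   0,   1,   0,   1,   1,   1]
b0, b1 = fit_logistic(trait, winner)
print(b1 > 0)  # positive slope: higher seed longevity predicts winners
```

In the study, one such model per trait (with deviance partitioning across trait groups) is what links trait differences to candidate environmental drivers.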
Objectives: Today, the doping attitudes of athletes can be measured either by asking athletes directly or with indirect attitude measurement procedures such as the implicit association test (IAT). Indirect measures may be helpful, for example, when the psychological effects of doping prevention programs are to be evaluated. In the present study we analyzed and compared the measurement properties of two recently published IATs.
Design: The IATs "doping substance vs. tea blend" and "doping substance vs. legal nutritional supplement" were presented to two randomly assigned independent samples of 102 athletes (44 male, 58 female; mean age 23.6 years) from different sports. Both IATs were complemented by a control IAT "word vs. non-word".
Methods: In order to test central measurement properties of both IATs, distributions of measured values, correlations with the control IAT, reliability analyses, and analyses of error rates were performed.
Results: Results pointed to a rather negative doping attitude in most athletes. Because error rates in the "doping vs. supplement" IAT were high (12%) and adaptational learning effects across test blocks were substantial (η² = .22), indicating that participants had difficulties correctly assigning the word stimuli to the respective category, we see slight advantages for the "doping vs. tea" IAT (e.g., satisfactory internal scale consistency, Cronbach's α = .78, among athletes reporting regular involvement in competitions).
Conclusion: The less satisfactory measurement properties of the "doping vs. supplement" IAT can possibly be explained by the fact that the boundaries between (legal) supplements and (illegal) doping substances have been shifted from time to time so that athletes were not sure whether substances were legal or not.
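An IAT effect is typically scored from response latencies in compatible versus incompatible pairing blocks. The sketch below is a simplified version of such a D-type score with hypothetical reaction times; the published IATs use the full multi-block scoring algorithm:

```python
import statistics

def iat_d_score(compatible_ms, incompatible_ms):
    """Simplified IAT D measure: (mean incompatible - mean compatible)
    latency, divided by the SD of all latencies pooled across blocks."""
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)
    return (statistics.mean(incompatible_ms)
            - statistics.mean(compatible_ms)) / pooled_sd

# Hypothetical reaction times (ms): slower in the incompatible block,
# as expected for a negative implicit attitude toward doping.
compat   = [620, 650, 700, 640, 690, 660]
incompat = [780, 820, 760, 840, 800, 790]
print(round(iat_d_score(compat, incompat), 2))
```

The error rates and block-wise learning effects discussed above matter precisely because trials with wrong category assignments distort these latency means.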
Benzyl methacrylate (BzMA) propagation rate coefficients, kp, were determined in ionic liquids and common organic solvents via pulsed-laser polymerization with subsequent polymer analysis by size-exclusion chromatography (PLP-SEC). The aim of the work is to gain a deeper understanding of the solvent influence on kp and to develop a general correlation between solvent-induced variations in kp and solvent properties. Applying a linear solvation energy relationship (LSER), which correlates kp with solvatochromic solvent parameters, suggests that dipolarity/polarizability determines the solvent influence on kp. To compare the solvent influence on BzMA kp with data for methyl methacrylate, hydroxypropyl methacrylate, and 2-ethoxyethyl methacrylate, normalized kp data were treated by a single LSER, providing a universal treatment of the solvent influence on the propagation kinetics of the four monomers. Further, the predictive capabilities of this universal correlation were tested with additional monomers from the methacrylate family.
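An LSER of the kind described is a linear fit of ln kp against solvatochromic parameters. The sketch below fits the one-parameter case, ln kp = c0 + s·π* (π* = dipolarity/polarizability), on invented solvent data; the study's actual correlation and coefficients are not reproduced here:

```python
def fit_lser(pi_star, ln_kp):
    """Least-squares fit of the one-parameter Kamlet-Taft LSER
    ln kp = c0 + s * pi_star; returns (c0, s)."""
    n = len(pi_star)
    mx = sum(pi_star) / n
    my = sum(ln_kp) / n
    s = (sum((x - mx) * (y - my) for x, y in zip(pi_star, ln_kp))
         / sum((x - mx) ** 2 for x in pi_star))
    c0 = my - s * mx
    return c0, s

# Hypothetical solvents as (pi*, ln kp) pairs, spanning a nonpolar
# solvent (pi* = 0) up to a strongly dipolar one (pi* = 1).
pi_star = [0.00, 0.27, 0.55, 0.73, 1.00]
ln_kp   = [6.20, 6.35, 6.52, 6.60, 6.78]
c0, s = fit_lser(pi_star, ln_kp)
print(s > 0)  # positive s: kp rises with solvent dipolarity/polarizability
```

Normalizing kp per monomer before such a fit is what allows a single slope s to describe several methacrylates at once, as done in the paper.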
Point-of-care testing (POCT) systems which allow for a sensitive, quantitative detection of protein markers are extremely useful for the early detection and therapy progress monitoring of cancer. However, currently commercially available POCT devices are mainly limited to the qualitative detection of protein markers. In this study we demonstrate the successive miniaturization of a sensitive and fast assay for the quantitative detection of prostate-specific antigen (PSA) using a well established and clinically approved homogeneous time-resolved fluoroimmunoassay technology (TRACE (R)) on a commercial plate-reader system (KRYPTOR (R)). Regarding the initial requirements for the development of POCT devices we applied a 30-fold assay volume reduction (150 mu L to 5 mu L) to achieve a reasonable lab-on-a-chip volume and a 24-fold and 120-fold excitation pulse energy reduction to achieve reasonable pulse energies for low-cost miniature excitation sources. Due to highly efficient optimization of key POCT parameters our miniaturized PSA assay achieved a 30% increased sensitivity and a 2-fold improved limit of detection compared to the standard plate-reader method. Our results demonstrate the successful implementation of key parameters for a significant miniaturization and for cost reduction in the clinically approved KRYPTOR (R) platform for protein detection. The technological alterations required are easy-to-implement and can be immediately adapted for more than 30 diagnostic protein markers already available for the KRYPTOR (R) platform. These features strongly recommend our assay format to be utilized in innovative, sensitive, quantitative POCT of protein markers.
Overland flow is an important hydrological pathway in many forests of the humid tropics. Its generation is subject to topographic controls at differing spatial scales. Our objective was to identify such controls on the occurrence of overland flow in a lowland tropical rainforest. To this end, we installed 95 overland flow detectors (OFDs) in four nested subcatchments of the Lutzito catchment on Barro Colorado Island, Panama, and monitored the temporal frequency of overland flow occurrence during 18 rainfall events at each OFD location. For each such location, we derived three non-digital terrain attributes and 17 digital ones, of which 15 were based on Digital Elevation Models (DEMs) of three different resolutions. These attributes then served as input into a Random Forest ensemble tree model to elucidate the importance and the partial and joint dependencies of topographic controls on overland flow occurrence.
Lutzito features a high median temporal frequency of overland flow occurrence of 0.421 among OFD locations. However, temporal frequencies of overland flow occurrence vary strongly in space among these locations and among the subcatchments of the Lutzito catchment. This variability is best explained by (1) microtopography, (2) coarse terrain slope and (3) various measures of distance-to-channel, with the contribution of all other terrain attributes being small. Microtopographic features such as concentrated flowlines and wash areas produce the highest temporal frequencies, whereas the occurrence of overland flow drops sharply for flow distances and terrain slopes beyond certain threshold values.
Our study contributes to understanding both the spatial controls on overland flow generation and the limitations of terrain attributes for the spatially explicit prediction of overland flow frequencies.
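A typical DEM-derived terrain attribute of the kind used as model input above is the topographic wetness index. The sketch below evaluates it for two invented OFD locations; the values are purely illustrative:

```python
import math

def topographic_wetness_index(specific_catchment_area, slope_deg):
    """TWI = ln(a / tan(beta)), a standard DEM-derived terrain attribute:
    a = specific catchment area (m), beta = local slope angle."""
    return math.log(specific_catchment_area
                    / math.tan(math.radians(slope_deg)))

# Hypothetical locations: a flow-concentrating hollow (large upslope
# area, gentle slope) versus a steep planar hillslope segment.
print(round(topographic_wetness_index(500.0, 5.0), 2))  # wet, convergent
print(round(topographic_wetness_index(20.0, 30.0), 2))  # dry, steep
```

High-TWI cells mark where runoff converges, which is consistent with the finding that microtopographic flow concentration produces the highest overland flow frequencies.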