In order to investigate the temporal characteristics of cognitive processing, we apply multivariate phase synchronization analysis to event-related potentials. The experimental design combines a semantic incongruity in a sentence context with a physical mismatch (a color change). In the ERP average, these result in an N400 component and a P300-like positivity, respectively. The synchronization analysis shows global desynchronization in the theta band around 288 ms after stimulus presentation for the semantic incongruity, while the physical mismatch elicits an increase of global synchronization in the alpha band around 204 ms. Both of these effects clearly precede those in the ERP average. Moreover, the delay between the synchronization effect and the ERP component correlates with the complexity of the cognitive processes.
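A global phase-synchronization measure of this general kind can be sketched in a few lines. The sketch below (band-pass filtering, Hilbert-transform phases, and a mean pairwise phase-locking value) is a minimal illustration, not the multivariate method of the study itself; all signal parameters are invented.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def global_phase_sync(data, fs, band):
    """Mean pairwise phase-locking value (PLV) across channels.

    data: (n_channels, n_samples) array; fs: sampling rate in Hz;
    band: (low, high) band edges in Hz. Returns a value in [0, 1].
    """
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, data, axis=1)           # zero-phase band-pass
    phases = np.angle(hilbert(filtered, axis=1))        # instantaneous phase
    n = data.shape[0]
    plv = [np.abs(np.mean(np.exp(1j * (phases[i] - phases[j]))))
           for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(plv))

# Illustrative data: four synthetic channels sharing a 6 Hz (theta) component.
fs = 250.0
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
data = np.array([np.sin(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(t.size)
                 for _ in range(4)])
sync = global_phase_sync(data, fs, (4.0, 8.0))  # near 1 for phase-locked channels
```

Comparing such an index between conditions over a sliding window is one common way to localize synchronization effects in time.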
Public pensions in the U.S.
(2005)
Contents:
The Public Old Age Insurance of the U.S.
- Historical overview
- Technical details
- Individual equity and social adequacy
The Economic Problem of Old Age
- Risks and economic security
- Old age, retirement, and individual precaution
- Insurance markets, market failures, and social insurance
- Options for public pension systems
The Problems of Social Security
- The financial balance of OASDI
- Causes of the long-run problems
- Rates of return
- Conclusion: the case for Social Security reform
Proposed Remedies
- Full, partial, or no privatization?
- The President's Commission to Strengthen Social Security
- Kotlikoff's Personal Security System
- The Diamond-Orszag Three-Part plan
Many European countries have experienced a significant increase in unemployment in recent years. This paper reviews several theoretical models that try to explain this phenomenon. Predominantly, these models posit a link between the poor performance of European labor markets and the high level of market regulation. Commonly referred to as the Eurosclerosis debate, prominent approaches consider insider-outsider relationships, search models, and the influence of hiring and firing costs on equilibrium employment. The paper presents empirical evidence for each model and studies the relevance of the identified rigidities as a determinant of high unemployment in Europe. Furthermore, a case study analyzes the unemployment problem in Germany and critically discusses new reform efforts. In particular, this section analyzes whether the recently enacted Hartz reforms can induce higher employment.
Face-to-face communication is multimodal. In unscripted spoken discourse we can observe the interaction of several "semiotic layers", modalities of information such as syntax, discourse structure, gesture, and intonation. We explore the role of gesture and intonation in structuring and aligning information in spoken discourse through a study of the co-occurrence of pitch accents and gestural apices. Metaphorical spatialization through gesture also plays a role in conveying the contextual relationships between the speaker, the government and other external forces in a naturally-occurring political speech setting.
This paper investigates the structural properties of morphosyntactically marked focus constructions, focussing on the often neglected non-focal sentence part in African tone languages. Based on new empirical evidence from five Gur and Kwa languages, we claim that these focus expressions have to be analysed as biclausal constructions even though they do not represent clefts containing restrictive relative clauses. First, we relativize the partly overgeneralized assumptions about structural correspondences between the out-of-focus part and relative clauses, and second, we show that our data do in fact support the hypothesis of a clause coordinating pattern as present in clause sequences in narration. It is argued that we deal with a non-accidental, systematic feature and that grammaticalization may conceal such basic narrative structures.
The Semantics of Ellipsis
(2005)
There are four phenomena that are particularly troublesome for theories of ellipsis: the existence of sloppy readings when the relevant pronouns cannot possibly be bound; an ellipsis being resolved in such a way that an ellipsis site in the antecedent is not understood in the way it was there; an ellipsis site drawing material from two or more separate antecedents; and ellipsis with no linguistic antecedent. These cases are accounted for by means of a new theory that involves copying syntactically incomplete antecedent material and an analysis of silent VPs and NPs that makes them into higher order definite descriptions that can be bound into.
We present a system for the linguistic exploration and analysis of lexical cohesion in English texts. Using an electronic thesaurus-like resource, Princeton WordNet, and the Brown Corpus of English, we have implemented a process of annotating text with lexical chains and a graphical user interface for inspection of the annotated text. We describe the system and report on some sample linguistic analyses carried out using the combined thesaurus-corpus resource.
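The chaining step can be illustrated with a toy relatedness resource standing in for WordNet. The greedy attachment strategy below is a simplified sketch of the general lexical-chaining idea, not the annotated system's actual algorithm, and the thesaurus entries are invented.

```python
# A toy thesaurus standing in for Princeton WordNet: word -> related words.
# (Illustrative data only; the real system uses WordNet's semantic relations.)
THESAURUS = {
    "car": {"automobile", "vehicle", "wheel"},
    "automobile": {"car", "vehicle"},
    "wheel": {"car"},
    "banana": {"fruit"},
    "fruit": {"banana", "apple"},
    "apple": {"fruit"},
}

def related(w1, w2):
    """Symmetric relatedness test against the toy resource."""
    return w2 in THESAURUS.get(w1, set()) or w1 in THESAURUS.get(w2, set())

def lexical_chains(words):
    """Greedy chaining: attach each word to the first chain containing
    a related member; otherwise start a new chain."""
    chains = []
    for w in words:
        for chain in chains:
            if any(related(w, m) for m in chain):
                chain.append(w)
                break
        else:
            chains.append([w])
    return chains

chains = lexical_chains(["car", "automobile", "banana", "wheel", "fruit"])
# → [["car", "automobile", "wheel"], ["banana", "fruit"]]
```

The resulting chains can then be attached to the text as annotations and inspected in a viewer, which is essentially what the described system does at a much larger scale.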
Fronting of a non-finite VP across a finite main verb, akin to German "VP topicalization", can also be found in Czech and Polish. The paper discusses evidence from large corpora for this process and some of its properties, both syntactic and information-structural. Based on this case, criteria for more user-friendly searching and retrieval of corpus data in syntactic research are developed.
Multiple hierarchies
(2005)
In this paper, we present the Multiple Annotation approach, which solves two problems: the problem of annotating overlapping structures, and the problem that occurs when documents should be annotated according to different, possibly heterogeneous tag sets. This approach has many advantages: it is based on XML, the modeling of alternative annotations is possible, each level can be viewed separately, and new levels can be added at any time. The files can be regarded as an interrelated unit, with the text serving as the implicit link. Two representations of the information contained in the multiple files (one in Prolog and one in XML) are described. These representations serve as a base for several applications.
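The text-as-implicit-link idea can be illustrated in a few lines: each annotation level lives in its own XML file, and the levels are related solely through the identical character data they mark up. The element names and the two layers below are invented for the sketch.

```python
import xml.etree.ElementTree as ET

# Two hypothetical annotation layers over the same text: one syntactic,
# one prosodic. Tag names are illustrative, not the paper's actual tag sets.
syntax_layer = "<s><np>Peter</np> <vp>sleeps</vp></s>"
prosody_layer = "<utt><stressed>Peter</stressed> sleeps</utt>"

def raw_text(xml_string):
    """Concatenate all character data of a document, ignoring markup."""
    return "".join(ET.fromstring(xml_string).itertext())

# The identical raw text is the implicit link between the layers:
assert raw_text(syntax_layer) == raw_text(prosody_layer) == "Peter sleeps"
```

Because each layer is a well-formed XML document of its own, overlapping structures across layers pose no problem, and a new level can be added without touching the existing files.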
This paper describes the standardization problems that come up in a diachronic corpus: it has to cope with differing standards with regard to diplomaticity, annotation, and header information. Such highly heterogeneous texts must be standardized to allow for comparative research without (too much) loss of information.
Unity in diversity
(2005)
This paper describes the creation and preparation of TUSNELDA, a collection of corpus data built for linguistic research. This collection contains a number of linguistically annotated corpora which differ in various aspects such as language, text types / data types, encoded annotation levels, and the linguistic theories underlying the annotation. The paper focuses on this variation on the one hand and on the way these heterogeneous data are integrated into one resource on the other.
The papers in this volume were presented at the workshop 'Heterogeneity in Linguistic Databases', which took place on July 9, 2004 at the University of Potsdam. The workshop was organized by project D1, 'Linguistic Database for Information Structure: Annotation and Retrieval', a member project of the SFB 632, a collaborative research center entitled 'Information Structure: the Linguistic Means for Structuring Utterances, Sentences and Texts'. The workshop brought together both developers and users of linguistic databases from a number of research projects which work on an empirical basis, all of which have to cope with different sorts of heterogeneity: primary linguistic data and annotated information may be heterogeneous, as well as the data structures representing them. The first four papers (by Wagner, Schmidt, Lüdeling, and Witt) address aspects of heterogeneous data from the point of view of database developers; the remaining three papers (by Meyer, Smith, and Teich/Fankhauser) focus on data exploitation by the users.
Collisions of black holes and neutron stars, called mixed binaries in the following, are interesting for at least two reasons. First, they are expected to emit a large amount of energy as gravitational waves, which could be measured by the new detectors; the form of these waves is expected to carry information about the internal structure of such systems. Second, collisions of such objects are the prime suspects for short gamma-ray bursts, whose exact energy-emission mechanism is still unknown. In the past, the Newtonian theory of gravitation and modifications of it were often used for numerical simulations of mixed binary systems. However, near such compact objects the gravitational forces are so strong that General Relativity is necessary for accurate predictions. General relativistic simulations pose many problems, but systems of two neutron stars and systems of two black holes have been studied extensively in the past, and many of those problems have been solved. One of the remaining problems has been the treatment of hydrodynamics at excision boundaries. Inside excision regions no evolution is carried out; such regions are often used inside black holes to circumvent instabilities of the numerical methods near the singularity. Methods to handle hydrodynamics at such boundaries are described and tested in this work. One important test, and the first application of these methods, was the simulation of a neutron star collapsing to a black hole. The success of these simulations, and in particular the performance of the excision methods, was an important step towards simulations of mixed binaries. Initial data are necessary for every numerical simulation; however, constructing such initial data for general relativistic situations is in general very complicated.
In this work it is shown how to obtain initial data for mixed binary systems using an existing method for initial data of two black holes. These initial data have been used for evolutions of such systems, and the problems encountered are discussed. One problem is instabilities caused by the interplay of different numerical methods, which could be solved by dissipation of appropriate strength. Another is the expected drift of the black hole towards the neutron star; it is shown that this can be prevented by special gauge conditions which keep the black hole from moving across the computational grid. The methods and simulations shown in this work are only a starting point for a much more detailed study of mixed binary systems. Better methods and models, simulations with higher resolution, and improved gauge conditions will be the focus of future work. It is expected that such detailed studies can give information about the emitted gravitational waves, which is important in view of the newly built gravitational-wave detectors. In addition, these simulations could give insight into the processes responsible for short gamma-ray bursts.
In this paper, we consider families of time-Markov fields (or reciprocal classes) which have the same bridges as a Brownian diffusion. We characterize each class as the set of solutions of an integration by parts formula on the space of continuous paths C([0,1]; R^d). Our techniques provide a characterization of gradient diffusions by a duality formula and, in the case of reversibility, a generalization of a result of Kolmogorov.
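For concreteness, the classical duality (integration by parts) formula of Malliavin calculus for a Brownian motion B, of which characterizations of the kind described above are generalizations, reads in its standard form:

```latex
% For a smooth cylindrical functional F on C([0,1]; R^d) and a
% Cameron-Martin direction g (with derivative \dot g in L^2[0,1]):
\mathbb{E}\left[ D_g F \right]
  = \mathbb{E}\left[ F \int_0^1 \dot g(t)\, \mathrm{d}B_t \right],
\qquad
D_g F := \lim_{\varepsilon \to 0}
  \frac{F(\omega + \varepsilon g) - F(\omega)}{\varepsilon}.
```

The characterization result replaces the Wiener measure on the left by the laws in a reciprocal class, identifying them as exactly the solutions of such a formula.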
Adsorption layers of soluble surfactants enable and govern a variety of phenomena in surface and colloid science, such as foams. The ability of a surfactant solution to form wet foam lamellae is governed by the surface dilatational rheology: only systems having a non-vanishing imaginary part in their surface dilatational modulus, E, are able to form wet foams. The aim of this thesis is to illuminate the dissipative processes that give rise to the imaginary part of the modulus. Two competing models are discussed in the literature. The reorientation model assumes that the surfactants adsorb in two distinct states, differing in their orientation. This model is able to describe the frequency dependence of the modulus E; however, it assumes reorientation dynamics in the millisecond time regime. In order to assess this model, we designed an SHG pump-probe experiment that addresses the orientation dynamics. The results reveal that the orientation dynamics occur in the picosecond time regime, in strong contradiction to the two-state model. The second model regards the interface as an interphase: the adsorption layer consists of a topmost monolayer and an adjacent sublayer, and the dissipative process is due to the molecular exchange between the two layers. The assessment of this model required the design of an experiment that discriminates between the surface compositional term and the sublayer contribution. Such an experiment has been successfully designed, and results on elastic and viscoelastic surfactants provided evidence for the correctness of the model. Because of its inherent surface specificity, surface SHG is a powerful analytical tool that can be used to gain information on the molecular dynamics and reorganization of soluble surfactants; SHG measurements are central elements of both experiments. However, they impose several structural requirements on the model system. During the course of this thesis, a proper model system has been identified and characterized. The combination of several linear and nonlinear optical techniques allowed for a detailed picture of the interfacial architecture of these surfactants.
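In oscillating-interface experiments, the complex dilatational modulus E = dγ/d ln A is commonly extracted as the ratio of the surface-tension response to the relative area perturbation; its phase gives the split into elastic (real) and viscous (imaginary) parts. A minimal numerical sketch of this extraction follows; all amplitudes, the 0.5 Hz frequency, and the 0.3 rad phase shift are invented for illustration.

```python
import numpy as np

def dilatational_modulus(area, gamma):
    """Magnitude and phase of the complex surface dilatational modulus
    E = dgamma/d(ln A), extracted from sinusoidal oscillation signals
    via their fundamental Fourier components."""
    a = np.fft.rfft(np.log(area) - np.mean(np.log(area)))   # (ln A) spectrum
    g = np.fft.rfft(gamma - np.mean(gamma))                 # tension spectrum
    k = np.argmax(np.abs(a[1:])) + 1                        # fundamental bin
    E = g[k] / a[k]                                         # complex ratio
    return np.abs(E), np.angle(E)   # |E| and phase phi; E'' = |E| * sin(phi)

# Illustrative data (made-up numbers): 10 % relative area oscillation at
# 0.5 Hz, tension responding with |E| = 20 mN/m and a 0.3 rad phase shift.
fs = 100.0
t = np.arange(0.0, 20.0, 1.0 / fs)
area = 1e-6 * (1.0 + 0.1 * np.sin(2 * np.pi * 0.5 * t))     # m^2
gamma = 50.0 + 2.0 * np.sin(2 * np.pi * 0.5 * t + 0.3)      # mN/m
E_abs, phi = dilatational_modulus(area, gamma)              # ~20.0, ~0.3
```

A purely elastic layer gives phi = 0 (real E), while a non-zero phase, i.e. a non-vanishing imaginary part, signals a dissipative process of the kind the thesis investigates.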