The complement fragments C3a and C5a were purified from zymosan-activated human serum by column chromatographic procedures after the bulk of the proteins had been removed by acidic polyethylene glycol precipitation. In the isolated in situ perfused rat liver C3a increased glucose and lactate output and reduced flow. Its effects were enhanced in the presence of the carboxypeptidase inhibitor DL-mercaptomethyl-3-guanidinoethylthio-propanoic acid (MERGETPA) and abolished by preincubation of the anaphylatoxin with carboxypeptidase B or with Fab fragments of an anti-C3a monoclonal antibody. The C3a effects were partially inhibited by the thromboxane antagonist BM13505. C5a had no effect. It is concluded that locally but not systemically produced C3a may play an important role in the regulation of local metabolism and hemodynamics during inflammatory processes in the liver.
In the isolated rat liver perfused in situ, stimulation of the nerve bundles around the portal vein and the hepatic artery caused an increase in urate formation that was inhibited by the α1-blocker prazosin and the xanthine oxidase inhibitor allopurinol. Moreover, nerve stimulation increased glucose and lactate output and decreased perfusion flow. Infusion of noradrenaline had similar effects. Compared to nerve stimulation, infusion of glucagon led to a less pronounced increase in urate formation and a twice as large increase in glucose output, but a decrease in lactate release, without affecting the flow rate. Insulin had no effect on any of the parameters studied.
I discuss observational evidence – independent of the direct spectral diagnostics of stellar winds themselves – suggesting that mass-loss rates for O stars need to be revised downward by roughly a factor of three or more, in line with recent observed mass-loss rates for clumped winds. These independent constraints include the large observed mass-loss rates in LBV eruptions, the large masses of evolved massive stars like LBVs and WNH stars, WR stars in lower metallicity environments, observed rotation rates of massive stars at different metallicity, supernovae that seem to defy expectations of high mass-loss rates in stellar evolution, and other clues. I pay particular attention to the role of feedback that would result from higher mass-loss rates, driving the star to the Eddington limit too soon, and therefore making higher rates appear highly implausible. Some of these arguments by themselves may have more than one interpretation, but together they paint a consistent picture that steady line-driven winds of O-type stars have lower mass-loss rates and are significantly clumped.
While inequality of opportunity (IOp) in earnings is well studied, the literature on IOp in individual net wealth is scarce to non-existent. This is problematic because both theoretical and empirical evidence show that positions in the wealth and income distributions can significantly diverge. We measure ex-ante IOp in net wealth for Germany using data from the Socio-Economic Panel (SOEP). Ex-ante IOp is defined as the contribution of circumstances to the inequality in net wealth before effort is exerted. The SOEP allows for a direct mapping from individual circumstances to individual net wealth and for a detailed decomposition of net wealth inequality into a variety of circumstances, among them childhood background, intergenerational transfers, and regional characteristics. The ratio of inequality of opportunity to total inequality is stable from 2002 to 2019. This is in sharp contrast to labor earnings, where ex-ante IOp is declining over time. Our estimates suggest that about 62% of the inequality in net wealth is due to circumstances. The most important circumstances are intergenerational transfers, parental occupation, and the region of birth. In contrast, gender and individuals' own education are the most important circumstances for earnings.
This article is a summary of the work carried out by the Ministry of Education in Turkey, in terms of the development of a new ICT Curriculum, together with the e-Training of teachers who will play an important role in the forthcoming pilot study. Based on recent literature on the topic, the article starts by introducing the “F@tih Project”, a national project that aims to effectively integrate technology into schools. After assessing teachers’ and students’ ICT competencies, as defined internationally, the review continues with the proposed model for the e-training of teachers. Summarizing the process of development of the new ICT curriculum, researchers underline key points of the curriculum such as dimensions, levels and competencies. Then teachers’ e-training approaches, together with selected tools, are explained in line with the importance and stages of action research that will be used throughout the pilot implementation of the curriculum and e-training process.
When the Jewish Theological Seminary in Breslau opened its doors in 1854, it established a novel form of rabbinical education: the systematic combination of Jewish studies at the seminary in parallel with university studies. The Breslau seminary became the model for most later institutions for rabbinical training in Europe and the United States. The seminaries were the new sites of modern Jewish scholarship, especially the academic study of Judaism (Wissenschaft des Judentums). Their function and goal were to preserve, (re)organize, and transmit Jewish knowledge in the modern age. As such, they became central nodes in Jewish scholarly networks. This case study highlights the multi-nodal connections between the Conservative seminaries in Breslau, Philadelphia, New York, Budapest, and Vienna. At the same time, it is intended to provide an example of the potential of transnational and transfer studies for the history of Jewish religious learning in Europe and the United States.
We present a concept for better integration of practical teaching into student teacher education in Computer Science. As an introduction to the workshop, different possible scenarios are discussed on the basis of examples. Afterwards, workshop participants will have the opportunity to discuss the application of these concepts in other settings.
Integration of digital elevation models and satellite images to investigate geological processes.
(2006)
In order to better understand the geological boundary conditions for ongoing or past surface processes, geologists face two important questions: 1) How can we gain additional knowledge about geological processes by analyzing digital elevation models (DEM) and satellite images? and 2) Do these efforts present a viable approach for more efficient research? Here, we present case studies at a variety of scales and levels of resolution to illustrate how remote sensing techniques can substantially complement and enhance classical geological approaches. Commonly, satellite- and DEM-based studies are used as a first step in assessing areas of geologic interest. While in the past the analysis of satellite imagery (e.g. Landsat TM) and aerial photographs was carried out to characterize regional geologic characteristics, particularly structure and lithology, geologists have increasingly ventured into a process-oriented approach. This entails assessing structures and geomorphic features with a concept that includes active tectonics or tectonic activity on time scales relevant to humans. In addition, these efforts involve analyzing and quantifying the processes acting at the surface by integrating different remote sensing and topographic data (e.g. SRTM-DEM, SSM/I, GPS, Landsat 7 ETM, Aster, Ikonos…). A combined structural and geomorphic study in the hyperarid Atacama Desert demonstrates the use of satellite and digital elevation data for assessing geological structures formed by long-term (millions of years) feedback mechanisms between erosion and crustal bending (Zeilinger et al., 2005). The medium-term change of landscapes over hundreds of thousands to millions of years in a more humid setting is shown in an example from southern Chile.
Based on an analysis of rivers/watersheds combined with landscape parameterization using digital elevation models, the geomorphic evolution and change in drainage pattern in the coastal Cordillera can be quantified and put into the context of seismotectonic segmentation of a tectonically active region. This has far-reaching implications for earthquake rupture scenarios and hazard mitigation (K. Rehak, see poster at IMAF Workshop). Two examples illustrate short-term processes on decadal, centennial and millennial time scales: One study uses orogen-scale precipitation gradients derived from remotely sensed passive microwave data (Bookhagen et al., 2005a). It demonstrates how debris flows were triggered as the response of slopes to abnormally strong rainfall in the interior parts of the Himalaya during intensified monsoons. The area of the orogen that receives high amounts of precipitation during intensified monsoons also contains numerous landslide deposits of up to 1 km³ in volume that were generated during intensified monsoon phases at about 27 and 9 ka (Bookhagen et al., 2005b). Another project in the Swiss Alps compared sets of aerial photographs recorded in different years. By calculating high-resolution surfaces, the mass transport in a landslide could be reconstructed (M. Schwab, Universität Bern). All these examples, although representing only a short and limited selection of projects using remote sensing data in geology, share the common goal of quantifying geological processes. With increasing data resolution and new sensors, future projects will enable us to recognize even more patterns and/or structures indicative of geological processes in tectonically active areas. This is crucial for the analysis of natural hazards like earthquakes, tsunamis and landslides, as well as those hazards that are related to climatic variability.
The integration of remotely sensed data at different spatial and temporal scales with field observations is becoming increasingly important. Many presently highly populated and increasingly utilized regions are subject to significant environmental pressure and often constitute areas of concentrated economic value. Combined remote sensing and ground-truthing in these regions is particularly important, as geologic, seismic and hydrologic data may be limited there due to the recency of infrastructural development. Monitoring ongoing processes and evaluating the remotely sensed data in terms of recurrence of events will greatly enhance our ability to assess and mitigate natural hazards. Document 1: slide set | Document 2: abstract. Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop, 9-10 February 2006
There has been a substantial increase in the percentage of publications with co-authors located in departments in different countries across 12 major journals of psychology. The results are evidence of a remarkable internationalization of psychological research, starting in the mid-1970s and increasing in rate at the beginning of the 1990s. This growth occurs against a constant number of articles with authors from the same country and is not due to a concomitant increase in the number of co-authors per article. Thus, international collaboration in psychology is clearly on the rise.
Finite state methods for natural language processing often require the construction and the intersection of several automata. In this paper, we investigate the question of determining the best order in which these intersections should be performed. We take as an example lexical disambiguation in polarity grammars. We show that there is no efficient way to minimize the state complexity of these intersections.
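The intersections discussed here rest on the standard product construction, which can be sketched as follows (a minimal illustration of the general technique, not the paper's polarity-grammar setting; the representation and all names are hypothetical). The final language does not depend on the order of pairwise intersections, but the sizes of the intermediate automata do, which is what makes the ordering question non-trivial.

```python
# A DFA is sketched as (states, alphabet, transitions, start, accepting),
# with transitions a dict mapping (state, symbol) -> state.

def intersect(d1, d2):
    """Product construction, restricted to states reachable from the start."""
    _, alpha, delta1, s1, f1 = d1
    _, _, delta2, s2, f2 = d2
    start = (s1, s2)
    states, frontier, delta = {start}, [start], {}
    while frontier:
        p, q = frontier.pop()
        for a in alpha:
            r = (delta1[(p, a)], delta2[(q, a)])
            delta[((p, q), a)] = r
            if r not in states:
                states.add(r)
                frontier.append(r)
    accepting = {(p, q) for (p, q) in states if p in f1 and q in f2}
    return (states, alpha, delta, start, accepting)

# Example: strings over {a, b} with an even number of a's ...
even_a = ({0, 1}, {"a", "b"},
          {(0, "a"): 1, (0, "b"): 0, (1, "a"): 0, (1, "b"): 1}, 0, {0})
# ... intersected with strings ending in b.
ends_b = ({"x", "y"}, {"a", "b"},
          {("x", "a"): "x", ("x", "b"): "y",
           ("y", "a"): "x", ("y", "b"): "y"}, "x", {"y"})
prod = intersect(even_a, ends_b)
```

Counting `len(prod[0])` after each pairwise step is exactly the state-complexity quantity whose minimization over intersection orders is at issue.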
The most recent trend in the studies of LF intervention effects makes crucial reference to focusing effects on the interveners, and this paper critically examines the representative analyses of the focus-based approach. While each analysis has its own merits and shortcomings, I argue that a pragmatic analysis that does not make appeal to syntactic configurations is better equipped to deal with many of the complex and delicate facts surrounding intervention effects.
The process of introducing compulsory ICT education at primary school level in the Czech Republic should be completed next year. Programming and Information, two topics from the basics of computer science, have been included in a new textbook. The question is whether the new chapters of the textbook are comprehensible for primary school teachers, who have undergone no training in computer science. The paper reports on a pilot verification project in which pre-service primary school teachers were trained to teach these informatics topics.
The layer-by-layer (LBL) assembly of polyelectrolytes has been extensively studied for the preparation of ultrathin films due to the versatility of the build-up process. The control of the permeability of these layers is particularly important because of potential drug delivery applications. Multilayered polyelectrolyte microcapsules are also of great interest due to their possible use as microcontainers. This work presents two methods that can be employed as drug delivery systems, both of which can encapsulate an active molecule and tune the release properties of the active species. Poly(N-isopropylacrylamide) (PNIPAM) is known to be a thermo-sensitive polymer with a Lower Critical Solution Temperature (LCST) around 32 °C; above this temperature PNIPAM is insoluble in water and collapses. It is also known that the LCST decreases with the addition of salt. This work shows Differential Scanning Calorimetry (DSC) and Confocal Laser Scanning Microscopy (CLSM) evidence that the LCST of PNIPAM can be tuned with salt type and concentration. Microcapsules were used to encapsulate this thermo-sensitive polymer, resulting in a reversible and tunable stimuli-responsive system. The encapsulation of PNIPAM inside the capsules was proven with Raman spectroscopy, DSC (bulk LCST measurements), AFM (thickness change), SEM (morphology change) and CLSM (in situ LCST measurement inside the capsules). The exploitation of the capsules as microcontainers is advantageous not only because of the protection the capsules give to the active molecules, but also because it facilitates easier transport. The second system investigated demonstrates the ability to reduce the permeability of polyelectrolyte multilayer films by the addition of charged wax particles. The incorporation of this hydrophobic coating leads to a reduced water sensitivity, particularly after heating, which melts the wax and forms a barrier layer.
This conclusion was proven with neutron reflectivity, which showed a decreased presence of D2O in planar polyelectrolyte films after annealing created the barrier layer. The permeability of capsules could also be decreased by the addition of a wax layer, as proved by the increase in recovery time measured in Fluorescence Recovery After Photobleaching (FRAP) experiments. In general, two advanced methods potentially suitable for drug delivery systems have been proposed. In both cases, if biocompatible elements are used to fabricate the capsule wall, these systems provide a stable method of encapsulating active molecules. Stable encapsulation, coupled with the ability to tune the wall thickness, makes it possible to control the release profile of the molecule of interest.
American occupying forces made the promotion of Jewish-Christian dialogue part of their plans for postwar German reconstruction. They sought to export American models of Jewish-Christian cooperation to Germany, while simultaneously validating and valorizing claims about the connection between democracy and tri-faith religious pluralism in the United States. The small size of the Jewish population in Germany meant that Jews did not set the terms of these discussions, and evidence shows that both German and American Jews expressed skepticism about participating in dialogue in the years immediately following the Holocaust. But opting out would have meant that discussions in Germany about the Judeo-Christian tradition that the American government advanced as the centerpiece of postwar democratic reconstruction would take place without a Jewish contribution. American Jewish leaders, present in Germany and in the US, therefore decided to opt in, not because they supported the project, but because it seemed far riskier to be left out.
Under Brazil's ex-president Bolsonaro, deforestation of the Amazon increased dramatically. An Austrian NGO filed a complaint to the Prosecutor of the International Criminal Court (ICC) against Bolsonaro in October 2021, accusing him of crimes against humanity against the backdrop of his involvement in environmental destruction. This paper deals with the question of whether this initiative constitutes a promising means of juridification to mitigate conflicts revolving around mass deforestation in Brazil. It thematizes attempts to juridify environmental destruction in international criminal law and examines the Climate Fund Case at the Brazilian Supreme Court. Finally, emerging problems and arguments in favour of starting preliminary examinations at the ICC against Bolsonaro are illuminated. This paper provides arguments as to why the initiative might be a promising undertaking, even though it is unlikely that Bolsonaro will be arrested.
We present the tool Kato which is, to the best of our knowledge, the first tool for plagiarism detection that is directly tailored to answer-set programming (ASP). Kato aims at finding similarities between (segments of) logic programs to help detect cases of plagiarism. Currently, the tool is realised for DLV programs, but it is designed to handle various logic-programming syntax versions. We review the basic features and the underlying methodology of the tool.
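To give a feel for what program-level similarity means here, the following is a naive token-overlap baseline (a generic sketch, not Kato's actual algorithm; the tokenizer and rule strings are hypothetical). Note that simple predicate and variable renaming, a typical plagiarism move, already lowers this score sharply, which motivates the more structural comparisons a dedicated tool needs.

```python
# Naive baseline: Jaccard similarity over the token sets of two logic programs.

def tokenize(rule):
    """Crudely split an ASP-style rule into tokens."""
    return set(rule.replace("(", " ").replace(")", " ")
                   .replace(",", " ").replace(":-", " :- ").split())

def jaccard(rules_a, rules_b):
    """Similarity in [0, 1] between two lists of program rules."""
    ta = set().union(*(tokenize(r) for r in rules_a))
    tb = set().union(*(tokenize(r) for r in rules_b))
    return len(ta & tb) / len(ta | tb)

# Two versions of transitive closure; the second merely renames identifiers.
p1 = ["reach(X,Y) :- edge(X,Y)", "reach(X,Y) :- edge(X,Z), reach(Z,Y)"]
p2 = ["path(A,B) :- edge(A,B)", "path(A,B) :- edge(A,C), path(C,B)"]
```

Despite being structurally identical programs, `jaccard(p1, p2)` is low because renaming hides the overlap, exactly the kind of disguise a segment-based, syntax-aware detector is meant to see through.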
Magnetic fields influence the dynamics of hot-star winds and create large-scale structure. Based on numerical magnetohydrodynamic (MHD) simulations, we model the wind of θ¹ Ori C, and then use the SEI method to compute synthetic line profiles for a range of viewing angles as a function of rotational phase. The resulting dynamic spectrum for a moderately strong line shows a distinct modulation, but with a phase that seems at odds with available observations.
The higher education structure in Malaysia has experienced significant changes since the implementation of the Private Higher Educational Institutions Act of 1996. The unprecedented expansion of the higher education sector and the increasing autonomy conferred to universities have created a huge demand for competent university leadership that supports the development of higher education in Malaysia. This article discusses the very first national multiplication training in Malaysia in 2014 and analyses such outcomes as the identification of good practices for future initiatives and applications in university leadership training.
Deans at Institutions of Higher Education are seldom recipients of effective or specific professional management training, institutional mentorship, and coaching despite an increasing demand on them to play a more dynamic leadership role in the face of ever-changing local and global challenges. To address this deficiency, the inaugural Malaysian Chapter of the International Deans’ Course (MyIDC) was held in three parts over 2019 and 2020. In this paper, findings related to feedback on the programme are presented and discussed. Responses from the participants from two sets of surveys, and written feedback provided by two IDC international trainers involved in MyIDC were analysed. These reveal potential areas of improvement for the forthcoming MyIDC programme, such as in terms of planning and organisation, duration, content, and delivery. The article explores the lessons learnt from the MyIDC 2019/2020 training programme and discusses the improvements that can be made arising from the feedback received.
A multitype Dawson-Watanabe process is conditioned, in subcritical and critical cases, on non-extinction in the remote future. On every finite time interval, its distribution is absolutely continuous with respect to the law of the unconditioned process. A martingale problem characterization is also given. Several results on the long-time behavior of the conditioned mass process, the conditioned multitype Feller branching diffusion, are then proved. The general case is considered first, where the mutation matrix, which models the interaction between the types, is irreducible. Several two-type models with decomposable mutation matrices are analyzed as well.
Local Orders, Global Chaos
(1999)
The authors used the frameworks of reciprocal determinism and occupational socialization to study the effects of work characteristics (consisting of control and complexity of work) on personal initiative (PI)--mediated by control orientation (a 2nd-order factor consisting of control aspiration, perceived opportunity for control, and self-efficacy) and the reciprocal effects of PI on changes in work characteristics. They applied structural equation modeling to a longitudinal study with 4 measurement waves (N = 268) in a transitional economy: East Germany. Results confirm the model plus 1 additional, nonhypothesized effect. Work characteristics had a synchronous effect on PI via control orientation (full mediation). There were also effects of control orientation and of PI on later changes in work characteristics: As predicted, PI functioned as partial mediator, changing work characteristics in the long term (reciprocal effect); unexpectedly, there was a 2nd reciprocal effect of an additional lagged partial mediation of control orientation on later work characteristics.
This paper outlines a newly-developed method to include the effects of time variability in the radiative transfer code CMFGEN. It is shown that the flow timescale is often large compared to the variability timescale of LBVs. Thus, time-dependent effects significantly change the velocity law and density structure of the wind, affecting the derivation of the mass-loss rate, volume filling factor, wind terminal velocity, and luminosity. The results of this work are directly applicable to all active LBVs in the Galaxy and in the LMC, such as AG Car, HR Car, S Dor and R 127, and could result in a revision of stellar and wind parameters. The mass-loss rate evolution of AG Car during the last 20 years is presented, highlighting the need for time-dependent models to correctly interpret the evolution of LBVs.
In this work an extension of CSSR algorithm using Maximum Entropy Models is introduced. Preliminary experiments to perform Named Entity Recognition with this new system are presented.
We describe an experiment to gather original data on geometrical aspects of pointing. In particular, we focus on the concept of the pointing cone, a geometrical model of a pointing gesture's extension. In our setting we employed methodological and technical procedures of a new type to integrate data from annotations as well as from tracker recordings. We combined exact information on position and orientation with raters' classifications. Our first results seem to challenge classical linguistic and philosophical theories of demonstration in that they advise separating pointing from reference.
The rigorous development, application and validation of distributed hydrological models requires the evaluation of data in a spatially distributed way. In particular, spatial model predictions, such as the distribution of soil moisture, runoff-generating areas, nutrient-contributing areas or erosion rates, are to be assessed against spatially distributed observations. Model inputs, such as the distribution of modelling units derived by GIS and remote sensing analyses, should also be evaluated against ground-based observations of landscape characteristics. So far, however, quantitative methods of spatial field comparison have rarely been used in hydrology. In this paper, we present algorithms that allow the comparison of observed and simulated spatial hydrological data. The methods can be applied to binary and categorical data on regular grids. They comprise cell-by-cell algorithms, cell-neighbourhood approaches that account for fuzziness of location, and multi-scale algorithms that evaluate the similarity of spatial fields with changing resolution. All methods provide a quantitative measure of the similarity of two maps. The comparison methods are applied in two mountainous catchments in southern Germany (Brugga, 40 km²) and Austria (Löhnersbach, 16 km²). As an example of binary hydrological data, the distribution of saturated areas is analyzed in both catchments. For categorical data, vegetation zones that are associated with different runoff generation mechanisms are analyzed in the Löhnersbach. Mapped spatial patterns are compared to simulated patterns from terrain index calculations and from satellite image analysis. It is discussed how particular features of visual similarity between the spatial fields are captured by the quantitative measures, leading to recommendations on suitable algorithms in the context of evaluating distributed hydrological models.
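The cell-by-cell and cell-neighbourhood ideas can be sketched as follows (an illustrative simplification of the algorithm families described, not the paper's exact measures), for binary maps stored as nested lists:

```python
def cell_by_cell(obs, sim):
    """Strict agreement: fraction of cells where the two maps match."""
    cells = [(i, j) for i in range(len(obs)) for j in range(len(obs[0]))]
    hits = sum(obs[i][j] == sim[i][j] for (i, j) in cells)
    return hits / len(cells)

def neighbourhood(obs, sim):
    """Fuzzy-location agreement: a cell counts as a hit if the observed
    category occurs anywhere in the simulated map's 3x3 window."""
    n, m = len(obs), len(obs[0])
    hits = 0
    for i in range(n):
        for j in range(m):
            window = [sim[x][y]
                      for x in range(max(0, i - 1), min(n, i + 2))
                      for y in range(max(0, j - 1), min(m, j + 2))]
            hits += obs[i][j] in window
    return hits / (n * m)

# A pattern that is displaced by one cell: strict agreement is zero,
# but the fuzzy measure recognizes the near-miss.
obs = [[1, 0], [0, 1]]
sim = [[0, 1], [1, 0]]
```

The gap between the two scores on the displaced pattern is precisely why accounting for fuzziness of location matters when comparing, say, mapped and simulated saturated areas.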
It has recently been demonstrated that the presentation of a rare target in a visual oddball paradigm induces a prolonged inhibition of microsaccades. In the field of electrophysiology, the amplitude of the P300 component in event-related potentials (ERP) has been shown to be sensitive to the stimulus category (target vs. non-target) of the eliciting stimulus, its overall probability, and the preceding stimulus sequence. In the present study we further specify the functional underpinnings of the prolonged microsaccadic inhibition in the visual oddball task, showing that the stimulus category, the frequency of a stimulus and the preceding stimulus sequence influence microsaccade rate. Furthermore, by co-recording ERPs and eye movements, we were able to demonstrate that, despite being largely sensitive to the same experimental manipulations, the amplitude of the P300 and the microsaccadic inhibition predict each other very weakly, and thus constitute two independent measures of the brain's response to rare targets in the visual oddball paradigm.
We present preliminary results of a tailored atmosphere analysis of six Galactic WC stars using UV, optical, and mid-infrared Spitzer IRS data. With these data, we are able to sample regions from 10 to 10³ stellar radii, and thus to determine wind clumping in different parts of the wind. Ultimately, the derived wind parameters will be used to accurately measure neon abundances, and so to test predicted nuclear-reaction rates.
Previous hydrometric studies demonstrated the prevalence of overland flow as a hydrological pathway in the tropical rain forest catchment of South Creek, northeast Queensland. The purpose of this study was to use this information in a mixing analysis aimed at identifying the sources of storm flow and estimating their contributions during two events in February 1993. K and acid-neutralizing capacity (ANC) were used as tracers because they provided the best separation, in a two-dimensional mixing plot, of the potential sources: saturation overland flow, soil water from depths of 0.3, 0.6, and 1.2 m, and hillslope groundwater. It was necessary to distinguish between saturation overland flow, generated at the soil surface and following unchanneled pathways, and overland flow in incised pathways. This latter type of overland flow was a mixture of saturation overland flow (event water) with high concentrations of K and a low ANC, soil water (preevent water) with low concentrations of K and a low ANC, and groundwater (preevent water) with low concentrations of K and a high ANC. The same sources explained the streamwater chemistry during the two events with strongly differing rainfall and antecedent moisture conditions. The contribution of saturation overland flow dominated the storm flow during the first, high-intensity, 178-mm event, while the contribution of soil water reached 50% during peak flow of the second, low-intensity, 44-mm event 5 days later. This latter result is remarkably similar to soil water contributions to storm flow in mountainous forested catchments of the southeastern United States. In terms of event and preevent water, the storm flow hydrograph of the high-intensity event is dominated by event water and that of the low-intensity event by preevent water.
This study highlights the problems of applying mixing analyses to overland flow-dominated catchments and soil environments with a poorly developed vertical chemical zonation and emphasizes the need for independent hydrometric information for a complete characterization of watershed hydrology and chemistry.
This article explores the multi-directional geographic trajectories and ties of Jews who came to the United States in the 19th century, working to complicate simplistic understandings of “German” Jewish immigration. It focuses on the case study of Henry Cohn, an ordinary Russian-born Jew whose journeys took him to Prussia, New York, Savannah, and California. Once in the United States he returned to Europe twice, the second time permanently, although a grandson ended up in California, where he worked to ensure the preservation of Cohn’s records. This story highlights how Jews navigated and transgressed national boundaries in the 19th century and the limitations of the historical narratives that have been constructed from their experiences.
The requirements of modern e-learning techniques are changing. Aspects such as community interaction, flexibility, pervasive learning and increasing mobility in communication habits are becoming more important. To meet these challenges, e-learning platforms must provide support for mobile learning. Most approaches try to adapt centralised and static e-learning mechanisms to mobile devices. However, it is often not technically possible for all kinds of devices to be connected to a central server. Therefore, we introduce an application of a mobile e-learning network that operates in a fully decentralised manner with the help of an underlying ad hoc network architecture. Furthermore, the concept of an ad hoc messaging network (AMNET) is used as the base system architecture for our approach to implementing a platform for pervasive mobile e-learning.
We analyze anaphoric phenomena in the context of building an input understanding component for a conversational system for tutoring mathematics. In this paper, we report the results of data analysis of two sets of corpora of dialogs on mathematical theorem proving. We exemplify anaphoric phenomena, identify factors relevant to anaphora resolution in our domain and extensions to the input interpretation component to support it.
We apply the 3-dimensional radiative transport code Wind3D to 3D hydrodynamic models of Corotating Interaction Regions (CIRs) to fit the detailed variability of Discrete Absorption Components (DACs) observed in Si iv UV resonance lines of HD 64760 (B0.5 Ib). We discuss important effects of the hydrodynamic input parameters on these large-scale equatorial wind structures that determine the detailed morphology of the DACs computed with 3D transfer. The best-fit model reveals that the CIR in HD 64760 is produced by a source at the base of the wind that lags behind the stellar surface rotation. The non-corotating coherent wind structure is an extended density wave produced by a local increase of only 0.6% in the smooth symmetric wind mass-loss rate.
Claiming that cross-speaker "but" can signal correction in dialogue, we start by describing the types of corrections "but" can communicate, focusing on the Speech Act (SA) communicated in the previous turn and the ways in which "but" can correct what is communicated. We address whether "but" corrects the proposition, the direct SA or the discourse relation communicated in the previous turn, and also briefly consider other relations signalled by cross-turn "but". After presenting a typology of the situations "but" can correct, we show how these corrections can be modelled in the Information State model of dialogue, motivating this work by showing how it can potentially be used to avoid misunderstandings. We wrap up by showing how the model presented here updates beliefs in the Information State representation of the dialogue and can be used to facilitate response deliberation.
We model the line profile variability (lpv) in spectra of clumped stellar atmospheres using the Stochastic Clump Model (SCM) of the winds of early-type stars. In this model the formation of dense inhomogeneities (clumps) in the line driven winds is considered as being a stochastic process. It is supposed that the emission due to clumps mainly contributes to the intensities of emission lines in the stellar spectra. It is shown that in the framework of the SCM it is possible to reproduce both the mean line profiles and a common pattern of the lpv.
Many hot stars exhibit stochastic polarimetric variability, thought to arise from clumping low in the wind. Here we investigate the wind properties required to reproduce this variability using analytic models, with particular emphasis on Luminous Blue Variables. We find that the winds must be highly structured, consisting of a large number of optically-thin clumps; while we find that the overall level of polarization should scale with mass-loss rate – consistent with observations of LBVs. The models also predict variability on very short timescales, which is supported by the results of a recent polarimetric monitoring campaign.
Monolayers of rod-shaped and disc-shaped liquid crystalline compounds at the air-water interface
(1986)
Calamitic (rod-shaped) and discotic (disc-shaped) thermotropic liquid crystalline (LC) compounds were spread at the air-water interface, and their ability to form monolayers was studied. The calamitic LCs investigated were found to form monolayers which behave analogously to conventional amphiphiles such as fatty acids. The spreading of the discotic LCs produced monolayers as well, but with a behaviour different from that of classical amphiphiles. The areas occupied per molecule are too small to allow the contact of all hydrophilic groups with the water surface and the packing of all hydrophobic chains. Various molecular arrangements of the discotics at the water surface that fit the spreading data are discussed.
Mothers of Seafaring
(2023)
The article aims to trace the contribution of Jewish women in the Yishuv’s maritime history. Taking the example of Henrietta Diamond, a founding member and chairperson of the Zebulun Seafaring Society, the article seeks to explore the representation and role of women in a growing Jewish maritime domain from the 1930s to the 1950s. It examines Zionist narratives on the ‘New Jew’ and the Jewish body and studies their relevance for the emerging field of maritime activities in the Yishuv. By contextualizing the work and depiction of Henrietta Diamond, the article sheds new light on the gendered notions that underlay the emergence of the Jewish maritime domain and illustrates the patterns of inclusion and exclusion in it.
Demonstratives, in particular gestures that "only" accompany speech, are not a big issue in current theories of grammar. If we deal with gestures, one big problem is fixing their function; the other is how to integrate the representations originating from different channels and, ultimately, how to determine their composite meanings. The growing interest in multi-modal settings, computer simulations, human-machine interfaces and VR applications increases the need for theories of multimodal structures and events. In our workshop contribution we focus on the integration of multimodal contents and investigate different approaches dealing with this problem, such as Johnston et al. (1997) and Johnston (1998), Johnston and Bangalore (2000), Chierchia (1995), Asher (2005), and Rieser (2005).
Multiple hierarchies
(2005)
In this paper, we present the Multiple Annotation approach, which solves two problems: the problem of annotating overlapping structures, and the problem that occurs when documents should be annotated according to different, possibly heterogeneous tag sets. This approach has many advantages: it is based on XML, the modeling of alternative annotations is possible, each level can be viewed separately, and new levels can be added at any time. The files can be regarded as an interrelated unit, with the text serving as the implicit link. Two representations of the information contained in the multiple files (one in Prolog and one in XML) are described. These representations serve as a base for several applications.
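The level-per-file idea can be sketched with a small stand-off example (the tag sets, offsets and helper function below are hypothetical illustrations, not the paper's actual annotation schemes; the base text serves as the implicit link between the levels):

```python
import xml.etree.ElementTree as ET

# Shared base text; each annotation level lives in its own XML document
# with its own, possibly heterogeneous tag set (illustrative examples).
text = "Peter sleeps"

syntax_layer = '<syntax><np start="0" end="5"/><vp start="6" end="12"/></syntax>'
prosody_layer = '<prosody><phrase start="0" end="12" tone="H*L"/></prosody>'

def spans(layer_xml):
    """View one level separately: map each element to the text span it covers."""
    root = ET.fromstring(layer_xml)
    return {el.tag: text[int(el.get("start")):int(el.get("end"))] for el in root}
```

Because both layers anchor into the same character offsets, overlapping structures (two syntactic constituents vs. one prosodic phrase) coexist without conflict, and a new level can be added at any time as another file.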
Two examples of our biophotonic research utilizing nanoparticles are presented, namely laser-based fluoroimmuno analysis and in-vivo optical oxygen monitoring. Results of the work include significantly enhanced sensitivity of a homogeneous fluorescence immunoassay and markedly improved spatial resolution of oxygen gradients in root nodules of a legume species.
In the old days (pre ∼1990) hot stellar winds were assumed to be smooth, which made life fairly easy and bothered no one. Then after suspicious behaviour had been revealed, e.g. stochastic temporal variability in broadband polarimetry of single hot stars, it took the emerging CCD technology developed in the preceding decades (∼1970-80’s) to reveal that these winds were far from smooth. It was mainly high-S/N, time-dependent spectroscopy of strong optical recombination emission lines in WR, and also a few OB and other stars with strong hot winds, that indicated that all hot stellar winds are likely pervaded by thousands of multiscale (compressible supersonic turbulent?) structures, whose driver is probably some kind of radiative instability. Quantitative estimates of clumping-independent mass-loss rates came from various fronts, mainly dependent directly on density (e.g. electron-scattering wings of emission lines, UV spectroscopy of weak resonance lines, and binary-star properties including orbital-period changes, electron-scattering, and X-ray fluxes from colliding winds) rather than the more common, easier-to-obtain but clumping-dependent density-squared diagnostics (e.g. free-free emission in the IR/radio and recombination lines, of which the favourite has always been Hα). Many big questions still remain, such as: What do the clumps really look like? Do clumping properties change as one recedes from the mother star? Is clumping universal? Does the relative clumping correction depend on $\dot{M}$ itself?
The optical spectrum of Eta Carinae (η Car) is prominent in H i, He i and Fe ii wind lines, all of which vary both in absorption and emission with phase. The phase dependence is a consequence of the interaction between the two objects in the η Car binary (η Car A & B). The binary system is enshrouded by ejecta from previous mass ejection events and consequently, η Car B is not directly observable. We have traced the He i lines over η Car’s spectroscopic period, using HST/STIS data obtained with medium spectral, but high angular, resolving power, and created a radial velocity curve for the system. The He i lines are formed in the core of the system, and appear to be a composite of multiple features formed in spatially separated regions. The sources of their irregular line profiles are still not fully understood, but can be attributed to emission/absorption near the wind-wind interface and/or a direct consequence of η Car A’s massive, clumpy wind. This paper discusses the spectral variability and the narrow emission structure of the He i lines, and how clumpiness of the winds may impede the construction of a reliable radial velocity curve, which is necessary for characterizing especially η Car B.
Morphological analyses based on word syntax approaches can encounter difficulties with long distance dependencies. The reason is that in some cases an affix has to have access to the inner structure of the form with which it combines. One solution is the percolation of features from the inner morphemes to the outer morphemes with some process of feature unification. However, the obstacle of percolation constraints or stipulated features has led some linguists to argue in favour of other frameworks such as, e.g., realizational morphology or parallel approaches like optimality theory. This paper proposes a linguistic analysis of two long distance dependencies in the morphology of Russian verbs, namely secondary imperfectivization and deverbal nominalization. We show how these processes can be reanalysed as local dependencies. Although finite-state frameworks are not bound by such linguistically motivated considerations, we present an implementation of our analysis as proposed in [1] that does not complicate the grammar or enlarge the network disproportionately.
Overwhelming observational and theoretical evidence suggests that the winds of massive stars are highly clumped. We briefly discuss the influence of clumping on model diagnostics and the difficulties of allowing for the influence of clumping on model spectra. Because of its simplicity, and because of computational ease, most spectroscopic analyses incorporate clumping using the volume filling factor. The biases introduced by this approach are uncertain. To investigate alternative clumping models, and to help determine the validity of parameters derived using the volume filling factor method, we discuss results derived using an alternative model in which we assume that the wind is composed of optically thick shells.
Investigations with frequency domain photon density waves allow elucidation of absorption and scattering properties of turbid media. The temporal and spatial propagation of intensity modulated light with frequencies up to more than 1 GHz can be described by the P1 approximation to the Boltzmann transport equation. In this study, we establish requirements for the appropriate choice of turbid model media and characterize mixtures of isosulfan blue as absorber and polystyrene beads as scatterer. For these model media, the independent determination of absorption and reduced scattering coefficients over large absorber and scatterer concentration ranges is demonstrated with a frequency domain photon density wave spectrometer employing intensity and phase measurements at various modulation frequencies.
This paper investigates the structural properties of morphosyntactically marked focus constructions, focussing on the often neglected non-focal sentence part in African tone languages. Based on new empirical evidence from five Gur and Kwa languages, we claim that these focus expressions have to be analysed as biclausal constructions even though they do not represent clefts containing restrictive relative clauses. First, we relativize the partly overgeneralized assumptions about structural correspondences between the out-of-focus part and relative clauses, and second, we show that our data do in fact support the hypothesis of a clause coordinating pattern as present in clause sequences in narration. It is argued that we deal with a non-accidental, systematic feature and that grammaticalization may conceal such basic narrative structures.
A constraint programming system combines two essential components: a constraint solver and a search engine. The constraint solver reasons about satisfiability of conjunctions of constraints, and the search engine controls the search for solutions by iteratively exploring a disjunctive search tree defined by the constraint program. The Monadic Constraint Programming framework gives a monadic definition of constraint programming where the solver is defined as a monad threaded through the monadic search tree. Search and search strategies can then be defined as first-class objects that can themselves be built or extended by composable search transformers. Search transformers give a powerful and unifying approach to viewing search in constraint programming, and the resulting constraint programming system is first class and extremely flexible.
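The separation of search tree, search engine, and composable transformer can be sketched in Python (the framework itself is monadic Haskell; the names `Leaf`, `Fail`, `Choice`, `dfs` and `depth_bounded` below are illustrative assumptions, not its API):

```python
from dataclasses import dataclass

# A first-class disjunctive search tree: Leaf(solution) | Fail | Choice(l, r).
@dataclass
class Leaf:
    value: object

class Fail:
    pass

@dataclass
class Choice:
    left: object
    right: object

def dfs(tree):
    """The search engine: explore the disjunctive tree depth-first."""
    if isinstance(tree, Leaf):
        yield tree.value
    elif isinstance(tree, Choice):
        yield from dfs(tree.left)
        yield from dfs(tree.right)

def depth_bounded(n, tree):
    """A composable search transformer: prune all choices below depth n."""
    if isinstance(tree, Choice):
        if n == 0:
            return Fail()
        return Choice(depth_bounded(n - 1, tree.left),
                      depth_bounded(n - 1, tree.right))
    return tree

tree = Choice(Leaf(1), Choice(Leaf(2), Leaf(3)))
```

Because the tree and the transformer are ordinary values, transformers compose: `depth_bounded` can wrap any tree before any engine runs it, which is the flexibility the framework's first-class search objects provide.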
Parafoveal Load of Word N+1 Modulates Preprocessing Effectiveness of Word N+2 in Chinese Reading
(2010)
Preview benefits (PBs) from two words to the right of the fixated one (i.e., word N+2) and associated parafoveal-on-foveal effects are critical for proposals of distributed lexical processing during reading. This experiment examined parafoveal processing during reading of Chinese sentences, using a boundary manipulation of N+2-word preview with low- and high-frequency words N+1. The main findings were (a) an identity PB for word N+2 that was (b) primarily observed when word N+1 was of high frequency (i.e., an interaction between frequency of word N+1 and PB for word N+2), and (c) a parafoveal-on-foveal frequency effect of word N+1 for fixation durations on word N. We discuss implications for theories of serial attention shifts and parallel distributed processing of words during reading.
The boundary paradigm (Rayner, 1975) with a novel preview manipulation was used to examine the extent of parafoveal processing of words to the right of fixation. Words n+1 and n+2 had either correct or incorrect previews prior to fixation (prior to crossing the boundary location). In addition, the manipulation utilized either a high or low frequency word in word n+1 location on the assumption that it would be more likely that n+2 preview effects could be obtained when word n+1 was high frequency. The primary findings were that there was no evidence for a preview benefit for word n+2 and no evidence for parafoveal-on-foveal effects when word n+1 is at least four letters long. We discuss implications for models of eye-movement control in reading.
Eye fixation durations during normal reading correlate with processing difficulty but the specific cognitive mechanisms reflected in these measures are not well understood. This study finds support in German readers’ eye fixations for two distinct difficulty metrics: surprisal, which reflects the change in probabilities across syntactic analyses as new words are integrated, and retrieval, which quantifies comprehension difficulty in terms of working memory constraints. We examine the predictions of both metrics using a family of dependency parsers indexed by an upper limit on the number of candidate syntactic analyses they retain at successive words. Surprisal models all fixation measures and regression probability. By contrast, retrieval does not model any measure in serial processing. As more candidate analyses are considered in parallel at each word, retrieval can account for the same measures as surprisal. This pattern suggests an important role for ranked parallelism in theories of sentence comprehension.
Parsing costs as predictors of reading difficulty: An evaluation using the Potsdam Sentence Corpus
(2008)
The surprisal of a word on a probabilistic grammar constitutes a promising complexity metric for human sentence comprehension difficulty. Using two different grammar types, surprisal is shown to have an effect on fixation durations and regression probabilities in a sample of German readers’ eye movements, the Potsdam Sentence Corpus. A linear mixed-effects model was used to quantify the effect of surprisal while taking into account unigram and bigram frequency, word length, and empirically-derived word predictability; the so-called “early” and “late” measures of processing difficulty both showed an effect of surprisal. Surprisal is also shown to have a small but statistically non-significant effect on empirically-derived predictability itself. This work thus demonstrates the importance of including parsing costs as a predictor of comprehension difficulty in models of reading, and suggests that a simple identification of syntactic parsing costs with early measures and late measures with durations of post-syntactic events may be difficult to uphold.
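As a toy illustration of the complexity metric itself, surprisal can be computed from corpus counts; the bigram version below is a hedged sketch (the corpus and function are hypothetical, not the Potsdam Sentence Corpus or the grammars used in the paper):

```python
import math
from collections import Counter

# Tiny illustrative corpus; a maximum-likelihood bigram model without smoothing.
corpus = "the dog saw the cat the cat saw the dog".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus[:-1])  # count only words that occur in context position

def surprisal(prev, word):
    """Surprisal in bits: -log2 P(word | prev). High surprisal = unexpected word."""
    return -math.log2(bigrams[(prev, word)] / unigrams[prev])
```

A grammar-based surprisal, as in the paper, replaces the bigram probability with the probability mass of the syntactic analyses consistent with the sentence prefix, but the logarithmic form of the metric is the same.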
We present an algorithm that computes a function assigning consecutive integers to the trees recognized by a deterministic, acyclic, finite-state, bottom-up tree automaton. Such a function is called a minimal perfect hash function. It can be used to identify trees recognized by the automaton, and its value may be seen as an index into other data structures. We also present an algorithm for inverted hashing.
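The counting idea behind such a ranking can be sketched as follows (a hedged illustration assuming a simple tuple encoding of trees and transitions; this is not the paper's algorithm or data structures):

```python
import math

# A tree is (symbol, (child, ...)). Because the automaton is deterministic and
# acyclic, each state accepts a finite, disjoint set of trees, so ranks can be
# computed by counting trees per state and per transition.
class TreeAutomaton:
    def __init__(self, transitions, finals):
        # transitions: ordered list of (symbol, child_states, result_state);
        # the list order fixes the enumeration order of the trees.
        self.transitions = transitions
        self.finals = finals
        self.by_state = {}
        for sym, kids, q in transitions:
            self.by_state.setdefault(q, []).append((sym, kids))
        self._count = {}

    def count(self, q):
        """Number of trees evaluating to state q (finite, since acyclic)."""
        if q not in self._count:
            self._count[q] = sum(
                math.prod(self.count(k) for k in kids)
                for _, kids in self.by_state.get(q, []))
        return self._count[q]

    def state(self, tree):
        """Deterministic bottom-up evaluation of a tree to its state."""
        sym, children = tree
        kid_states = tuple(self.state(c) for c in children)
        for s, kids, q in self.transitions:
            if (s, kids) == (sym, kid_states):
                return q
        raise ValueError("tree not accepted")

    def hash(self, tree):
        """Minimal perfect hash: consecutive integers over the accepted trees."""
        q = self.state(tree)
        index = sum(self.count(f) for f in self.finals[:self.finals.index(q)])
        return index + self._rank(tree, q)

    def _rank(self, tree, q):
        sym, children = tree
        kid_states = tuple(self.state(c) for c in children)
        r = 0
        for s, kids in self.by_state[q]:  # trees via earlier transitions
            if (s, kids) == (sym, kid_states):
                break
            r += math.prod(self.count(k) for k in kids)
        radix = 1  # mixed-radix rank of the children, rightmost varying fastest
        for c, k in zip(reversed(children), reversed(kid_states)):
            r += self._rank(c, k) * radix
            radix *= self.count(k)
        return r

# Toy automaton accepting exactly f(a,a) and f(a,b).
aut = TreeAutomaton(
    [('a', (), 'qa'), ('b', (), 'qb'),
     ('f', ('qa', 'qa'), 'qf'), ('f', ('qa', 'qb'), 'qf')],
    finals=['qf'])
```

Inverted hashing then runs the same counting in reverse: given an integer, peel off the final-state offset, pick the transition whose cumulative count covers it, and decode the children from the mixed-radix remainder.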
In the most abstract definition of its operational semantics, the declarative and concurrent programming language CHR is trivially non-terminating for a significant class of programs. Common refinements of this definition, in closing the gap to real-world implementations, compromise on declarativity and/or concurrency. Building on recent work and the notion of persistent constraints, we introduce an operational semantics avoiding trivial non-termination without compromising on its essential features.
Cinnamic acid moieties were incorporated into amphiphilic compounds containing one and two alkyl chains. These lipid-like compounds with photoreactive units undergo self-organization to form monolayers at the gas-water interface and bilayer structures (vesicles) in aqueous solutions. The photoreaction of the cinnamic acid moiety induced by 254 nm UV light was investigated in the crystalline state, in monolayers, in vesicles and in solution in organic solvents. The single-chain amphiphiles undergo dimerization to yield photoproducts with twice the molecular weight of the corresponding monomers in organized systems. The photoreaction of amphiphiles containing two cinnamic acid groups occurs via two mechanisms: The intramolecular dimerization produces bicycles, with retention of the molecular weight of the corresponding monomer. The intermolecular reaction leads to oligomeric and polymeric photoproducts. In contrast to the single-chain amphiphiles, photodimerization processes of lipoids containing two cinnamic acid moieties also occur in solution in organic solvents.
In recent years, statistical machine translation has demonstrated its usefulness within a wide variety of translation applications. In this line, phrase-based alignment models have become the reference to follow in order to build competitive systems. Finite-state models are always an interesting framework because there are well-known efficient algorithms for their representation and manipulation. This document is a contribution to the evolution of finite-state models towards a phrase-based approach. The inference of stochastic transducers that are based on bilingual phrases is carefully analysed from a finite-state point of view. Indeed, the algorithmic phenomena that have to be taken into account in order to deal with such phrase-based finite-state models at decoding time are also detailed in depth.
The birth of the Yishuv’s national shipping company, ZIM was preceded by private enterprise; the sea had not traditionally been a focus of the Zionist movement. In the 1930s, a five-year span of private commercial shipping saw three companies in the Jewish community in Palestine – Palestine Shipping Company, Palestine Maritime Lloyd, and Atid – before shipping was cut short by the outbreak of the Second World War. Despite their brief lifespans and their negligible contribution to general shipping, these companies constituted an important milestone. Their existence helped shift the Yishuv leadership’s attitudes about shipping’s importance for the community and the need for it to be supported by national institutions.
The use of nano zerovalent iron (nZVI) for environmental remediation is a promising new technique for in situ remediation. Due to its high surface area and high reactivity, nZVI is able to dechlorinate organic contaminants and render them harmless. However, fast aggregation and sedimentation limit the mobility of nZVI and thus its capability for source and plume remediation. Carbo-Iron is a newly developed material consisting of activated carbon particles (d50 = 0.8 µm) that are plated with nZVI particles. These particles combine the mobility of activated carbon and the reactivity of nZVI. This paper presents the first results of the transport experiments.
Amphiphilic derivatives of octadiene and docosadiene were investigated in monolayers and Langmuir-Blodgett multilayers, with respect to their self-organization and their polymerization behavior. All amphiphiles investigated form monolayers. However, only acid and alcohol derivatives were able to build up multilayers. Those multilayers are rapidly photopolymerized via a two-step process: irradiation with long-wavelength UV light yields soluble polymers, whereas additional irradiation with short-wavelength UV light produces insoluble and presumably cross-linked polymers. The reaction mechanism is discussed according to the polymer characterization by UV spectroscopy, small-angle X-ray scattering, NMR spectroscopy, and gel permeation chromatography. All multilayers undergo structural changes during the polymerization; substantial changes result in defects in the polymerized layers, as observed by scanning electron microscopy. In contrast to the acids and alcohols, the deposition of monolayers of the aldehyde derivatives did not yield well-ordered multilayers, but rather amorphous films. In this different film structure, the photopolymerization process differs from the one observed in multilayers.
The factors that determine the efficiency of energy transfer in aquatic food webs have been investigated for many decades. The plant-animal interface is the most variable and least predictable of all levels in the food web. In order to study determinants of food quality in a large lake and to test the recently proposed central importance of the long-chained eicosapentaenoic acid (EPA) at the pelagic producer-grazer interface, we tested the importance of polyunsaturated fatty acids (PUFAs) at the pelagic producer-consumer interface by correlating sestonic food parameters with somatic growth rates of a clone of Daphnia galeata. Daphnia growth rates were obtained from standardized laboratory experiments spanning one season with Daphnia feeding on natural seston from Lake Constance, a large pre-alpine lake. Somatic growth rates were fitted to sestonic parameters by using a saturation function. A moderate amount of variation was explained when the model included the elemental parameters carbon (r2 = 0.6) and nitrogen (r2 = 0.71). A tighter fit was obtained when sestonic phosphorus was incorporated (r2 = 0.86). The nonlinear regression with EPA was relatively weak (r2 = 0.77), whereas the highest degree of variance was explained by three C18-PUFAs. The best (r2 = 0.95), and only significant, correlation of Daphnia's growth was found with the C18-PUFA α-linolenic acid (α-LA; C18:3n-3). This correlation was weakest in late August when C:P values increased to 300, suggesting that mineral and PUFA-limitation of Daphnia's growth changed seasonally. Sestonic phosphorus and some PUFAs showed not only tight correlations with growth, but also with sestonic α-LA content. We computed Monte Carlo simulations to test whether the observed effects of α-LA on growth could be accounted for by EPA, phosphorus, or one of the two C18-PUFAs, stearidonic acid (C18:4n-3) and linoleic acid (C18:2n-6).
With >99 % probability, the correlation of growth with α-LA could not be explained by any of these parameters. In order to test for EPA limitation of Daphnia's growth, in parallel with experiments on pure seston, growth was determined on seston supplemented with chemostat-grown, P-limited Stephanodiscus hantzschii, which is rich in EPA. Although supplementation increased the EPA content 80-800x, no significant changes in the nonlinear regression of the growth rates with α-LA were found, indicating that growth of Daphnia on pure seston was not EPA limited. This indicates that the two fatty acids, EPA and α-LA, were not mutually substitutable biochemical resources and points to different physiological functions of these two PUFAs. These results support the PUFA-limitation hypothesis for sestonic C:P < 300 but are contrary to the hypothesis of a general importance of EPA, since no evidence for EPA limitation was found. It is suggested that the resource ratios of EPA and α-LA rather than the absolute concentrations determine which of the two resources is limiting growth.
A wide range of additional forward chaining applications could be realized with deductive databases if their rule formalism, their immediate consequence operator, and their fixpoint iteration process were more flexible. Deductive databases normally represent knowledge using stratified Datalog programs with default negation. But many practical applications of forward chaining require an extensible set of user-defined built-in predicates. Moreover, they often need function symbols for building complex data structures, and the stratified fixpoint iteration has to be extended by aggregation operations. We present a new language, Datalog*, which extends Datalog by stratified meta-predicates (including default negation), function symbols, and user-defined built-in predicates, which are implemented and evaluated top-down in Prolog. All predicates are subject to the same backtracking mechanism. The bottom-up fixpoint iteration can aggregate the derived facts after each iteration based on user-defined Prolog predicates.
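The bottom-up evaluation rests on iterating an immediate consequence operator to a fixpoint, which can be sketched for plain Datalog with the classic transitive-closure program (a minimal illustration; Datalog*'s meta-predicates, function symbols, and Prolog built-ins are not modelled here):

```python
# Program:  path(X,Y) :- edge(X,Y).
#           path(X,Z) :- edge(X,Y), path(Y,Z).
def consequences(edges, paths):
    """One application of the immediate consequence operator T_P."""
    derived = set(edges)          # first rule: every edge is a path
    for (x, y) in edges:          # second rule: join edge with current paths
        for (a, z) in paths:
            if a == y:
                derived.add((x, z))
    return derived

def fixpoint(edges):
    """Iterate T_P from the empty set until no new facts are derived."""
    paths = set()
    while True:
        new = consequences(edges, paths)
        if new == paths:
            return paths
        paths = new
```

Datalog* hooks into exactly this loop: the body predicates of each rule may be user-defined Prolog built-ins evaluated top-down, and an aggregation step over the derived facts may run after each iteration.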
Preface
(2010)
The workshops on (constraint) logic programming (WLP) are the annual meeting of the Society of Logic Programming (GLP e.V.) and bring together researchers interested in logic programming, constraint programming, and related areas like databases, artificial intelligence and operations research. In this decade, previous workshops took place in Dresden (2008), Würzburg (2007), Vienna (2006), Ulm (2005), Potsdam (2004), Dresden (2002), Kiel (2001), and Würzburg (2000). Contributions to the workshops deal with all theoretical, experimental, and application aspects of constraint programming (CP) and logic programming (LP), including foundations of constraint/logic programming. Some of the special topics are constraint solving and optimization, extensions of functional logic programming, deductive databases, data mining, nonmonotonic reasoning, interaction of CP/LP with other formalisms like agents, XML, JAVA, program analysis, program transformation, program verification, meta programming, parallelism and concurrency, answer set programming, implementation and software techniques (e.g., types, modularity, design patterns), applications (e.g., in production, environment, education, internet), constraint/logic programming for semantic web systems and applications, reasoning on the semantic web, data modelling for the web, semistructured data, and web query languages.
The site of confluence of the artery and the portal vein in the liver still appears to be controversial. Anatomical studies suggested a presinusoidal or an intrasinusoidal confluence in the first, second or even final third of the sinusoids. The objective of this investigation was to study the problem with functional biochemical techniques. Rat livers were perfused through the hepatic artery and simultaneously either in the orthograde direction from the portal vein to the hepatic vein or in the retrograde direction from the hepatic vein to the portal vein. Arterial flow was linearly dependent on arterial pressure between 70 cm H2O and 120 cm H2O at a constant portal or hepatovenous pressure of 18 cm H2O. An arterial pressure of 100 cm H2O was required for the maintenance of a homogeneous orthograde perfusion of the whole parenchyma and of a physiologic ratio of arterial to portal flow of about 1:3. Glucagon was infused either through the artery or the portal vein and hepatic vein, respectively, to a submaximally effective "calculated" sinusoidal concentration after mixing of 0.1 nmol/L. During orthograde perfusions, arterial and portal glucagon caused the same increases in glucose output. Yet during retrograde perfusions, hepatovenous glucagon elicited metabolic alterations equal to those in orthograde perfusions, whereas arterial glucagon effected changes strongly reduced to between 10% and 50%. Arterially infused trypan blue was distributed homogeneously in the parenchyma during orthograde perfusions, whereas it reached clearly smaller areas of parenchyma during retrograde perfusions. Finally, arterially applied acridine orange was taken up by all periportal hepatocytes in the proximal half of the acinus during orthograde perfusions but only by a much smaller portion of periportal cells in the proximal third of the acinus during retrograde perfusions.
These findings suggest that in rat liver, the hepatic artery and the portal vein mix before and within the first third of the sinusoids, rather than in the middle or even last third.
INTEGRAL tripled the number of super-giant high-mass X-ray binaries (sgHMXB) known in the Galaxy by revealing absorbed and fast transient (SFXT) systems. Quantitative constraints on the wind clumping of massive stars can be obtained from the study of the hard X-ray variability of SFXT. A large fraction of the hard X-ray emission is emitted in the form of flares with a typical duration of 3 ksec, frequency of 7 days and luminosity of $10^{36}$ erg/s. Such flares are most probably emitted by the interaction of a compact object orbiting at $\sim10~R_*$ with wind clumps ($10^{22 ... 23}$ g) representing a large fraction of the stellar mass-loss rate. The density ratio between the clumps and the inter-clump medium is $10^{2 ... 4}$. The parameters of the clumps and of the inter-clump medium, derived from the SFXT flaring behavior, are in good agreement with the macro-clumping scenario and line-driven instability simulations. SFXT are likely to have larger orbital radii than classical sgHMXB.
Problem solving is one of the central activities performed by computer scientists as well as by computer science learners. Whereas the teaching of algorithms and programming languages is usually well structured within a curriculum, the development of learners’ problem-solving skills is largely implicit and less structured. Students at all levels often face difficulties in problem analysis and solution construction. The basic assumption of the workshop is that without some formal instruction on effective strategies, even the most inventive learner may resort to unproductive trial-and-error problem-solving processes. Hence, it is important to teach problem-solving strategies and to guide teachers on how to teach their pupils this cognitive tool. Computer science educators should be aware of the difficulties and acquire appropriate pedagogical tools to help their learners gain and experience problem-solving skills.
A fine-grained slope that exhibits slow movement rates was investigated to understand how geohydrological processes contribute to a consecutive development of mass movements in the Vorarlberg Alps, Austria. For that purpose, intensive hydrometeorological, hydrogeological and geotechnical observations as well as surveying of surface movement rates were conducted during 1998–2001. Subsurface water dynamics at the creeping slope turned out to be dominated by a three-dimensional pressure system. The pressure reaction is triggered by fast infiltration of surface water and subsequent lateral water flow in the south-western part of the hillslope. The related pressure signal was shown to propagate further downhill, causing fast reactions of the piezometric head at 5.5 m depth on a daily time scale. The observed pressure reactions might belong to a temporary hillslope water body that extends further downhill. The related buoyancy forces could be one of the driving forces for the mass movement. A physically based hydrological model was adopted to model simultaneously surface and subsurface water dynamics including evapotranspiration and runoff production. It was possible to reproduce surface runoff and observed pressure reactions in principle. However, as soil hydraulic functions were only estimated from pedotransfer functions, a quantitative comparison between observed and simulated subsurface dynamics is not feasible. Nevertheless, the results suggest that it is possible to reconstruct important spatial structures based on sparse observations in the field which allow reasonable simulations with a physically based hydrological model. Copyright 2005 John Wiley & Sons, Ltd. Key words: rainfall-induced landslides; soil creep; hydrological modelling; Vorarlberg; Austria; pressure propagation.
Higher education institutions in Guinea face many challenges, including reporting responsibilities, globalisation, and massification. Institutional evaluations of higher education and research institutions in 2013 could not initiate the implementation of change processes within the institutions. Recently, however, various initiatives have been started to change this situation, with the aim of raising awareness and building capabilities for quality assurance structures in Guinean HEIs. So far, the emphasis has been put on quality enhancement in higher education, especially on teaching evaluation and curriculum development, as well as on establishing quality assurance structures. This article gives an overview of the state of play, takes stock of the activities that have been initiated to set up quality assurance mechanisms in higher education and research institutions, and presents perspectives for further development of the quality approach in Guinea. The project ‘Quality Assurance Multiplication 2017-2018’ serves as an example to describe approaches and activities in setting up stable quality assurance structures, and to strengthen and raise awareness for a ‘quality culture’.
By quantitatively fitting simple emission line profile models that include both atomic opacity and porosity to the Chandra X-ray spectrum of ζ Pup, we are able to explore the trade-offs between reduced mass-loss rates and wind porosity. We find that reducing the mass-loss rate of ζ Pup by roughly a factor of four, to 1.5 × 10⁻⁶ M⊙ yr⁻¹, enables simple non-porous wind models to provide good fits to the data. If, on the other hand, we take the literature mass-loss rate of 6 × 10⁻⁶ M⊙ yr⁻¹, then to produce X-ray line profiles that fit the data, extreme porosity lengths – of h∞ ≈ 3 R∗ – are required. Moreover, these porous models do not provide better fits to the data than the non-porous, low optical depth models. Additionally, such huge porosity lengths do not seem realistic in light of 2-D numerical simulations of the wind instability.
Mass accretion onto compact objects through accretion disks is a common phenomenon in the universe. It is seen in objects ranging from active galactic nuclei through cataclysmic variables (CVs) to young stellar objects. Because CVs are fairly easy to observe, they provide an ideal opportunity to study accretion disks in great detail and thus help us to understand accretion in these other classes of objects as well. Mass accretion in these objects is often accompanied by mass outflow from the disks. This accretion disk wind, at least in CVs, is thought to be radiatively driven, similar to O star winds. WOMPAT, a 3-D Monte Carlo radiative transfer code for accretion disk winds of CVs, is presented.
Deductive databases need general formulas in rule bodies, not only conjunctions of literals. This is well known since the work of Lloyd and Topor about extended logic programming. Of course, formulas must be restricted in such a way that they can be effectively evaluated in finite time, and produce only a finite number of new tuples (in each iteration of the TP-operator: the fixpoint can still be infinite). It is also necessary to respect binding restrictions of built-in predicates: many of these predicates can be executed only when certain arguments are ground. Whereas for standard logic programming rules, questions of safety, allowedness, and range-restriction are relatively easy and well understood, the situation for general formulas is a bit more complicated. We give a syntactic analysis of formulas that guarantees the necessary properties.
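For the standard conjunctive-body case that the abstract contrasts with general formulas, the classical safety (range-restriction) condition can be sketched as follows. This is a generic textbook check, not the paper's analysis, and all function names and the sample built-in predicates are made up for illustration:

```python
def vars_of(args):
    """Variables are identifiers starting with an uppercase letter (Prolog style)."""
    return {a for a in args if isinstance(a, str) and a[:1].isupper()}

def is_safe(head, body, builtins=frozenset({"lt", "neq"})):
    """Range-restriction check for one rule with a conjunctive body.

    head is (pred, args); body is a list of (pred, args, positive) literals.
    The rule is safe iff every variable occurring in the head, in a negated
    literal, or in a built-in predicate also occurs in a positive,
    non-built-in (database) literal of the body, so it is bound to a value
    before the negation or built-in is evaluated.
    """
    bound = set()
    for pred, args, positive in body:
        if positive and pred not in builtins:
            bound |= vars_of(args)
    need = vars_of(head[1])
    for pred, args, positive in body:
        if not positive or pred in builtins:
            need |= vars_of(args)
    return need <= bound
```

For example, p(X) :- q(X), not r(X) is safe, while p(X) :- q(X), lt(X, Y) is not, because Y never becomes bound by a database literal.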
We study the time variability of emission lines in three WNE stars: WR 2 (WN2), WR 3 (WN3ha) and WR 152 (WN3). While WR 2 shows no variability above the noise level, the other two stars do show variations, which in WR 152 resemble those of other WR stars but in WR 3 are very fast. From these motions, we deduce a value of β ∼ 1 for WR 3, like that seen in O stars, and β ∼ 2–3 for WR 152, intermediate between other WR stars and WR 3.
Just and Carpenter (1980) presented a theory of reading based on eye fixations wherein their "psycholinguistic" variables accounted for 72% of the variance in word gaze durations. This comment raises some statistical and theoretical problems with their use of simultaneous regression analysis of gaze duration measures and with the resulting theory of reading. A major problem was the confounding of perceptual with psycholinguistic factors. New eye fixation data are presented to support these criticisms. Analysis of fixations within words revealed that most gaze duration variance was contributed by number of fixations rather than by fixation duration.
Small livestock is an important resource for rural human populations in dry climates. How strongly will climate change affect the capacity of the rangeland? We used hierarchical modelling to scale quantitatively the growth of shrubs and annual plants, the main food of sheep and goats, to the landscape extent in the eastern Mediterranean region. Without grazing, productivity increased in a sigmoid way with mean annual precipitation. Grazing reduced productivity more strongly the drier the landscape. At a point just under the stocking capacity of the vegetation, productivity declined precipitously with more intense grazing due to a lack of seed production of annuals. We repeated simulations with precipitation patterns projected by two contrasting IPCC scenarios. Compared to results based on historic patterns, productivity and stocking capacity did not differ in most cases. Thus, grazing intensity remains the stronger impact on landscape productivity in this dry region even in the future.
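The sigmoid dependence of ungrazed productivity on mean annual precipitation described above can be made concrete with a logistic response. All parameter values in this sketch are hypothetical, chosen only to illustrate the shape, and are not taken from the study:

```python
import math

def productivity(map_mm, p_max=250.0, k=0.015, map_half=300.0):
    """Illustrative sigmoid response of annual productivity (arbitrary units)
    to mean annual precipitation map_mm (mm).

    p_max is the asymptotic productivity, map_half the precipitation at which
    half of p_max is reached, and k the steepness; all values hypothetical.
    """
    return p_max / (1.0 + math.exp(-k * (map_mm - map_half)))
```

Productivity rises slowly in very dry landscapes, steeply around map_half, and saturates in wet ones, which is the qualitative pattern the simulations report for the no-grazing case.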
We present XMM-Newton Reflection Grating Spectrometer observations of pairs of X-ray emission line profiles from the O star ζ Pup that originate from the same He-like ion. The two profiles in each pair have different shapes and cannot both be consistently fit by models assuming the same wind parameters. We show that the differences in profile shape can be accounted for in a model including the effects of resonance scattering, which affects the resonance line in the pair but not the intercombination line. This implies that resonance scattering is also important in single resonance lines, where its effect is difficult to distinguish from a low effective continuum optical depth in the wind. Thus, resonance scattering may help reconcile X-ray line profile shapes with literature mass-loss rates.
The P v λλ1118, 1128 resonance doublet is an extraordinarily useful diagnostic of O-star winds, because it bypasses the traditional problems associated with determining mass-loss rates from UV resonance lines. We discuss critically the assumptions and uncertainties involved with using P v to diagnose mass-loss rates, and conclude that the large discrepancies between mass-loss rates determined from P v and the rates determined from “density squared” emission processes pose a significant challenge to the “standard model” of hot-star winds. The disparate measurements can be reconciled if the winds of O-type stars are strongly clumped on small spatial scales, which in turn implies that mass-loss rates based on Hα or radio emission are too large by up to an order of magnitude.
In semi-arid savannas, unsustainable land use can lead to degradation of entire landscapes, e.g. in the form of shrub encroachment. This leads to habitat loss and is assumed to reduce species diversity. In BIOTA phase 1, we investigated the effects of land use on population dynamics at the farm scale. In phase 2 we scale up to consider the whole regional landscape, consisting of a diverse mosaic of farms with different historic and present land use intensities. This mosaic creates a heterogeneous, dynamic pattern of structural diversity at a large spatial scale. Understanding how the region-wide dynamic land use pattern affects the abundance of animal and plant species requires the integration of processes on large as well as on small spatial scales. In our multidisciplinary approach, we integrate information from remote sensing, genetic and ecological field studies as well as small-scale process models in a dynamic region-wide simulation tool. Presented at the workshop of the Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, 9–10 February 2006.
Received views of utterance context in pragmatic theory characterize the occurrent subjective states of interlocutors using notions like common knowledge or mutual belief. We argue that these views are not compatible with the uncertainty and robustness of context-dependence in human-human dialogue. We present an alternative characterization of utterance context as objective and normative. This view reconciles the need for uncertainty with received intuitions about coordination and meaning in context, and can directly inform computational approaches to dialogue.
Classical SDRT (Asher and Lascarides, 2003) discussed essential features of dialogue like adjacency pairs or corrections and updating. Recent work in SDRT (Asher, 2002, 2005) aims at the description of natural dialogue. We use this work to model situated communication, i.e. dialogue in which sub-sentential utterances and gestures (pointing and grasping) are used as conventional modes of communication. We show that in addition to cognitive modelling in SDRT, capturing mental states and speech-act related goals, special postulates are needed to extract meaning out of contexts. Gestural meaning anchors Discourse Referents in contextually given domains. Both sorts of meaning are fused with the meaning of fragments to arrive at fully developed dialogue moves. This task accomplished, the standard SDRT machinery, tagged SDRSs, rhetorical relations, the update mechanism, and the Maximize Discourse Coherence constraint generate coherent structures. In sum, meanings from different verbal and non-verbal sources are assembled using extended SDRT to form coherent wholes.
Since Harris’ parser in the late 50s, multiword units have been progressively integrated into parsers. Nevertheless, for the most part, they are still restricted to compound words, which are more stable and less numerous. Actually, language is full of semi-fixed expressions that also form basic semantic units: semi-fixed adverbial expressions (e.g. time), collocations. Like compounds, the identification of these structures limits the combinatorial complexity induced by lexical ambiguity. In this paper, we detail an experiment that largely integrates these notions in a finite-state procedure of segmentation into super-chunks, preliminary to a parser. We show that the chunker, developed for French, reaches 92.9% precision and 98.7% recall. Moreover, multiword units realize 36.6% of the attachments within nominal and prepositional phrases.
The effect of moderate rates of nitrogen deposition on ground floor vegetation is poorly predicted by uncontrolled surveys or fertilization experiments using high rates of nitrogen (N) addition. We compared the temporal trends of ground floor vegetation in permanent plots with moderate (7–13 kg ha⁻¹ year⁻¹) and lower bulk N deposition (4–6 kg ha⁻¹ year⁻¹) in southern Sweden during 1982–1998. We examined whether trends differed between growth forms (vascular plants and bryophytes) and vegetation types (three types of coniferous forest, deciduous forest, and bog). Trends of site-standardized cover and richness varied among growth forms, vegetation types, and deposition regions. Cover of vascular plants in spruce forests decreased at the same rate with both moderate and low deposition; in pine forests it decreased faster with moderate deposition, and in bogs it decreased faster with low deposition. Cover of bryophytes in spruce forests increased at the same rate with both moderate and low deposition; in pine forests it decreased faster with moderate deposition, and in bogs and deciduous forests there was a strong non-linear increase with moderate deposition. The number of vascular plant species remained constant with moderate deposition and decreased with low deposition. We found no trend in the number of bryophyte species. We propose that the decrease of cover and species number with low deposition was related to normal ecosystem development (increased shading), suggesting that N deposition maintained or increased the competitiveness of some species in the moderate-deposition region. Deposition had no consistent negative effect on vegetation, suggesting that it is less important than normal successional processes.
The topography of first-order catchments in a region of western Amazonia was found to exhibit distinctive, recurrent features: a steep, straight lower side slope, a flat or nearly flat terrace at an intermediate elevation between valley floor and interfluve, and an upper side slope connecting interfluve and intermediate terrace. A detailed survey of soil-saturated hydraulic conductivity (Ksat)–depth relationships, involving 740 undisturbed soil cores, was conducted in a 0.75-ha first-order catchment. The sampling approach was stratified with respect to the above slope units. Exploratory data analysis suggested fourth-root transformation of batches from the 0–0.1 m depth interval, log transformation of batches from the subsequent 0.1 m depth increments, and the use of robust estimators of location and scale. The Ksat of the steep lower side slope decreased from 46 to 0.1 mm/h over the overall sampling depth of 0.4 m. The corresponding decrease was from 46 to 0.1 mm/h on the intermediate terrace, from 335 to 0.01 mm/h on the upper side slope, and from 550 to 0.015 mm/h on the interfluve. A depthwise comparison of these slope units led to the formulation of several hypotheses concerning the link between Ksat and topography.
The spectral efficiency of blackness induction was measured in three normal trichromatic observers and in one deuteranomalous observer. The psychophysical task was to adjust the radiance of a monochromatic 60–120′ annulus until a 45′ central broadband field just turned black and its contour became indiscriminable from a dark surrounding gap that separated it from the annulus. The reciprocal of the radiance required to induce blackness with annulus wavelengths between 420 and 680 nm was used to define a spectral-efficiency function for the blackness component of the achromatic process. For each observer, the shape of this blackness-sensitivity function agreed with the spectral-efficiency function based on heterochromatic flicker photometry when measured with the same 60–120′ annulus. Both of these functions matched the Commission Internationale de l'Eclairage Vλ function except at short wavelengths. Ancillary measurements showed that the latter difference in sensitivity can be ascribed to nonuniformities of preretinal absorption, since the annular field excluded the central 60′ of the fovea. Thus our evidence indicates that, at least to a good first approximation, induced blackness is inversely related to the spectral-luminosity function. These findings are consistent with a model that separates the achromatic and the chromatic pathways.
Many methods have been proposed for the simulation of constrained mechanical systems. The most obvious of these have mild instabilities and drift problems. Consequently, stabilization techniques have been proposed. A popular stabilization method is Baumgarte's technique, but the choice of parameters to make it robust has been unclear in practice. Some of the simulation methods that have been proposed and used in computations are reviewed here, from a stability point of view. This involves concepts of differential-algebraic equation (DAE) and ordinary differential equation (ODE) invariants. An explanation of the difficulties that may be encountered using Baumgarte's method is given, and a discussion of why a further quest for better parameter values for this method will always remain frustrating is presented. It is then shown how Baumgarte's method can be improved. An efficient stabilization technique is proposed, which may employ explicit ODE solvers in case of nonstiff or highly oscillatory problems and which relates to coordinate projection methods. Examples of a two-link planar robotic arm and a squeezing mechanism illustrate the effectiveness of this new stabilization method.
Many methods have been proposed for the stabilization of higher index differential-algebraic equations (DAEs). Such methods often involve constraint differentiation and problem stabilization, thus obtaining a stabilized index reduction. A popular method is Baumgarte stabilization, but the choice of parameters to make it robust is unclear in practice. Here we explain why the Baumgarte method may run into trouble. We then show how to improve it. We further develop a unifying theory for stabilization methods which includes many of the various techniques proposed in the literature. Our approach is to (i) consider stabilization of ODEs with invariants, (ii) discretize the stabilizing term in a simple way, generally different from the ODE discretization, and (iii) use orthogonal projections whenever possible. The best methods thus obtained are related to methods of coordinate projection. We discuss them and make concrete algorithmic suggestions.
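The Baumgarte idea discussed in the two abstracts above, replacing the twice-differentiated constraint g'' = 0 by g'' + 2*alpha*g' + beta^2*g = 0, can be sketched on a planar pendulum in Cartesian coordinates. This is a minimal illustration under assumed gains (alpha = beta = 5) and a plain RK4 integrator; it is not the improved stabilization method the papers ultimately propose:

```python
def pendulum_rhs(state, m=1.0, L=1.0, grav=9.81, alpha=5.0, beta=5.0):
    """Planar pendulum in Cartesian coordinates with Baumgarte stabilization.

    Constraint g = (x^2 + y^2 - L^2)/2 = 0. Instead of enforcing g'' = 0,
    Baumgarte's method enforces g'' = -2*alpha*g' - beta^2*g, which damps
    the constraint drift introduced by the discretization.
    """
    x, y, vx, vy = state
    g = 0.5 * (x * x + y * y - L * L)
    gdot = x * vx + y * vy
    # g'' = x*ax + y*ay + vx^2 + vy^2, with ax = -lam*x/m and
    # ay = -grav - lam*y/m; solving g'' = -2*alpha*gdot - beta^2*g
    # for the Lagrange multiplier lam gives:
    lam = m * (vx * vx + vy * vy - grav * y
               + 2.0 * alpha * gdot + beta ** 2 * g) / (x * x + y * y)
    return (vx, vy, -lam * x / m, -grav - lam * y / m)

def rk4_step(state, dt):
    """One classical Runge-Kutta (RK4) step."""
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = pendulum_rhs(state)
    k2 = pendulum_rhs(shift(state, k1, dt / 2))
    k3 = pendulum_rhs(shift(state, k2, dt / 2))
    k4 = pendulum_rhs(shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def simulate(state, dt=1e-3, steps=5000):
    """Integrate the stabilized system and return the final state."""
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state
```

Starting from the horizontal rest position (1, 0, 0, 0), the constraint x^2 + y^2 = L^2 stays satisfied to high accuracy over the run; with alpha = beta = 0 the drift is simply no longer damped, which is the instability the abstracts refer to.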
The interest in extensions of the logic programming paradigm beyond the class of normal logic programs is motivated by the need for an adequate representation and processing of knowledge. One of the most difficult problems in this area is to find an adequate declarative semantics for logic programs. In the present paper a general preference criterion is proposed that selects the ‘intended’ partial models of generalized logic programs; it is a conservative extension of the stationary semantics for normal logic programs of [Prz91]. The presented preference criterion defines a partial model of a generalized logic program as intended if it is generated by a stationary chain. It turns out that the stationary generated models coincide with the stationary models on the class of normal logic programs. The general well-founded semantics of such a program is defined as the set-theoretical intersection of its stationary generated models. For normal logic programs the general well-founded semantics equals the well-founded semantics.
I perform and analyse the first ever calculations of rotating stellar iron core collapse in {3+1} general relativity that start out with presupernova models from stellar evolutionary calculations and include a microphysical finite-temperature nuclear equation of state, an approximate scheme for electron capture during collapse and neutrino pressure effects. Based on the results of these calculations, I obtain the to-date most realistic estimates for the gravitational wave signal from collapse, bounce and the early postbounce phase of core collapse supernovae. I supplement my {3+1} GR hydrodynamic simulations with 2D Newtonian neutrino radiation-hydrodynamic supernova calculations focussing on (1) the late postbounce gravitational wave emission owing to convective overturn, anisotropic neutrino emission and protoneutron star pulsations, and (2) on the gravitational wave signature of accretion-induced collapse of white dwarfs to neutron stars.
Significant seasonal variation in size at settlement has been observed in newly settled larvae of Dreissena polymorpha in Lake Constance. Diet quality, which varies temporally and spatially in freshwater habitats, has been suggested as a significant factor influencing life history and development of freshwater invertebrates. Accordingly, experiments were conducted with field-collected larvae to test the hypothesis that diet quality can determine planktonic larval growth rates, size at settlement and subsequent post-metamorphic growth rates. Larvae were fed one of two diets or starved. One diet was composed of cyanobacterial cells which are deficient in polyunsaturated fatty acids (PUFAs), and the other was a mixed diet rich in PUFAs. Freshly metamorphosed animals from the starvation treatment had a carbon content per individual 70% lower than that of larvae fed the mixed diet. This apparent exhaustion of larval internal reserves resulted in a 50% reduction of the postmetamorphic growth rates. Growth was also reduced in animals previously fed the cyanobacterial diet. Hence, low food quantity or low food quality during the larval stage of D. polymorpha leads to irreversible effects for postmetamorphic animals, and is related to inferior competitive abilities.
Transitional justice is conventionally theorized as how a society deals with past injustices after regime change and alongside democratization. Nonetheless, scholars have not reached a consensus on what is to be included or excluded. Recent ideas of transformative justice seek to expand the understanding of transitional justice to include systemic restructuring and socioeconomic considerations. In the context of Nicaragua — where two transitions occurred within an 11-year span — very little transitional justice took place, in terms of the conventional concept of top-down legalistic mechanisms; however, distinct structural changes and socioeconomic policies can be found with each regime change. By analyzing the transformative justice elements of Nicaragua’s dual transition, this chapter seeks to expand the understanding of transitional justice to include how these factors influence goals of transitions such as sustainable peace and reconciliation for past injustices. The results argue for increased attention to transformative justice theories and a more nuanced conception of justice.
An approach to the development of fluorescent probes to follow polymerizations in situ using fluorinated cross-conjugated enediynes (Y-enynes) is reported. Different substitution patterns in the Y-enynes result in distinct solvatochromic behavior. β,β-Bis(phenylethynyl)pentafluorostyrene 7, which bears no donor substituents and only fluorine at the styrene moiety, shows no solvatochromism. Donor-substituted β,β-bis(3,4,5-trimethoxyphenylethynyl)pentafluorostyrene 8 and β,β-bis(4-butyl-2,3,5,6-tetrafluorophenylethynyl)-3,4,5-trimethoxystyrene 9 exhibit solvatochromism upon change of solvent polarity. Y-enyne 8 showed the largest solvatochromic shift (94 nm bathochromic shift) upon changing solvent from cyclohexane to acetonitrile. A smaller solvatochromic response (44 nm bathochromic shift) was observed for 9. Lippert–Mataga treatment of 8 and 9 yields slopes of −10,800 and −6,400 cm⁻¹, respectively. This corresponds to a change in dipole moment of 9.6 and 6.9 D, respectively. The solvatochromic behavior of 8 and 9 supports the formation of an intramolecular charge transfer (ICT) state. The low fluorescence quantum yields are caused by competitive double bond rotation. The fluorescence decay time of 9 decreases in methyltetrahydrofuran from 2.1 ns at 77 K to 0.11 ns at 200 K. Efficient single bond rotation in 9 was frozen at −50 °C in a configuration in which the trimethoxyphenyl ring is perpendicular to the fluorinated rings. Compounds 7–9 are photostable. The X-ray structure of 7 shows that it is not planar and that its conjugation is distorted. Y-enyne 7 stacks in the solid state, showing coulombic, acetylene–arene, and fluorine–π interactions.
We exploit time-series $FUSE$ spectroscopy to {\it uniquely} probe spatial structure and clumping in the fast wind of the central star of the H-rich planetary nebula NGC~6543 (HD~164963). Episodic and recurrent optical depth enhancements are discovered in the P{\sc v} absorption troughs, with some evidence for a $\sim$ 0.17-day modulation time-scale. The characteristics of these features are essentially identical to the ‘discrete absorption components’ (DACs) commonly seen in the UV lines of massive OB stars, suggesting the temporal structures seen in NGC~6543 likely have a physical origin that is similar to that operating in massive, luminous stars. The mechanism for forming coherent perturbations in the outflows is therefore apparently operating equally in radiation-pressure-driven winds of widely differing momenta ($\dot{M} v_\infty R_\star^{0.5}$) and flow times, as represented by OB stars and CSPN.