From 18 to 20 September 2014, scholars working in cultural and film studies gathered at the University of Potsdam for a symposium dedicated to Andrej Tarkovskij, the first international one on this subject. The 25 participants came from nine countries. Since quite a few of them also have what is now called a "migration biography", the multiplicity of perspectives arising from their different origins was further amplified, although the mode of scholarly discourse provides a clearly moderating corrective to it. The present volume essentially contains the contributions presented there, but also those of experts who were unable to come to Potsdam in person.
In this talk, I would like to share my experiences gained from participating in four CSP solver competitions and the second ASP solver competition. In particular, I’ll talk about how various programming techniques can make huge differences in solving some of the benchmark problems used in the competitions. These techniques include global constraints, table constraints, and problem-specific propagators and labeling strategies for selecting variables and values. I’ll present these techniques with experimental results from B-Prolog and other CLP(FD) systems.
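The talk's own examples are in B-Prolog, which is not reproduced here. As a rough, language-neutral illustration of one of the points above, that a labeling strategy alone can change search effort, here is a minimal Python sketch of forward-checking search on N-queens with two variable-selection strategies (static order vs. first-fail); it is a generic sketch, not the speaker's code.

```python
# Forward-checking CSP search on N-queens with pluggable variable selection.
# Variables are rows, values are columns; constraints: distinct columns and
# distinct diagonals.

def solve(n, select):
    domains = {v: set(range(n)) for v in range(n)}
    stats = {"nodes": 0}

    def consistent(assign, var, val):
        return all(val != w and abs(val - w) != abs(var - u)
                   for u, w in assign.items())

    def search(assign):
        if len(assign) == n:
            return dict(assign)          # snapshot of a full solution
        var = select(domains, assign)
        for val in sorted(domains[var]):
            stats["nodes"] += 1
            if consistent(assign, var, val):
                assign[var] = val
                # forward checking: prune values conflicting with (var, val)
                pruned = {u: {w for w in domains[u]
                              if w == val or abs(w - val) == abs(u - var)}
                          for u in domains if u not in assign}
                for u, rm in pruned.items():
                    domains[u] -= rm
                if all(domains[u] for u in domains if u not in assign):
                    result = search(assign)
                    if result:
                        return result
                for u, rm in pruned.items():   # undo pruning on backtrack
                    domains[u] |= rm
                del assign[var]
        return None

    return search({}), stats["nodes"]

# Two labeling strategies: fixed order vs. first-fail (smallest domain).
static = lambda doms, assign: min(v for v in doms if v not in assign)
first_fail = lambda doms, assign: min(
    (v for v in doms if v not in assign), key=lambda v: len(doms[v]))
```

Comparing the node counts returned for the two strategies on larger boards gives a feel for why such choices matter in the benchmark problems mentioned above.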
Integration of digital elevation models and satellite images to investigate geological processes.
(2006)
In order to better understand the geological boundary conditions for ongoing or past surface processes, geologists face two important questions: 1) How can we gain additional knowledge about geological processes by analyzing digital elevation models (DEM) and satellite images? 2) Do these efforts present a viable approach to more efficient research? Here, we present case studies at a variety of scales and levels of resolution to illustrate how remote sensing techniques can substantially complement and enhance classical geological approaches. Commonly, satellite- and DEM-based studies are used as a first step in assessing areas of geologic interest. While in the past the analysis of satellite imagery (e.g. Landsat TM) and aerial photographs was carried out to characterize regional geologic characteristics, particularly structure and lithology, geologists have increasingly ventured into a process-oriented approach. This entails assessing structures and geomorphic features with a concept that includes active tectonics, or tectonic activity on time scales relevant to humans. In addition, these efforts involve analyzing and quantifying the processes acting at the surface by integrating different remote sensing and topographic data (e.g. SRTM-DEM, SSM/I, GPS, Landsat 7 ETM, Aster, Ikonos…). A combined structural and geomorphic study in the hyperarid Atacama desert demonstrates the use of satellite and digital elevation data for assessing geological structures formed by long-term (millions of years) feedback mechanisms between erosion and crustal bending (Zeilinger et al., 2005). The medium-term change of landscapes over hundreds of thousands to millions of years in a more humid setting is shown in an example from southern Chile.
Based on an analysis of rivers and watersheds combined with landscape parameterization using digital elevation models, the geomorphic evolution and change in drainage pattern in the coastal Cordillera can be quantified and put into the context of seismotectonic segmentation of a tectonically active region. This has far-reaching implications for earthquake rupture scenarios and hazard mitigation (K. Rehak, see poster at the IMAF Workshop). Two examples illustrate short-term processes on decadal, centennial and millennial time scales. One study uses orogen-scale precipitation gradients derived from remotely sensed passive microwave data (Bookhagen et al., 2005a). It demonstrates how debris flows were triggered as a response of slopes to abnormally strong rainfall in the interior parts of the Himalaya during intensified monsoons. The area of the orogen that receives high amounts of precipitation during intensified monsoons also contains numerous landslide deposits of up to 1 km³ in volume that were generated during intensified monsoon phases at about 27 and 9 ka (Bookhagen et al., 2005b). Another project, in the Swiss Alps, compared sets of aerial photographs recorded in different years; by calculating high-resolution surfaces, the mass transport in a landslide could be reconstructed (M. Schwab, Universität Bern). All these examples, although representing only a short and limited selection of projects using remote sensing data in geology, share the goal of quantifying geological processes. With increasing data resolution and new sensors, future projects will enable us to recognize even more patterns and/or structures indicative of geological processes in tectonically active areas. This is crucial for the analysis of natural hazards like earthquakes, tsunamis and landslides, as well as of hazards related to climatic variability.
The integration of remotely sensed data at different spatial and temporal scales with field observations is becoming increasingly important. Many presently highly populated and increasingly utilized regions are subject to significant environmental pressure and often constitute areas of concentrated economic value. Combined remote sensing and ground-truthing in these regions is particularly important, as geologic, seismic and hydrologic data may be limited there due to the recency of infrastructural development. Monitoring ongoing processes and evaluating the remotely sensed data in terms of recurrence of events will greatly enhance our ability to assess and mitigate natural hazards. Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop of 9-10 February 2006
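As a hedged illustration of the kind of DEM processing discussed above (not taken from the talk): computing slope from a gridded elevation model is a basic landscape-parameterization step on which drainage and watershed analyses build.

```python
# Slope from a gridded DEM via finite differences (np.gradient).
# Synthetic data only; real DEMs (e.g. SRTM) would be loaded from raster files.
import numpy as np

def slope_deg(dem, cellsize):
    """Slope in degrees from a 2-D elevation grid with square cells."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)       # elevation change per metre
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Synthetic tilted plane: 1 m rise per 10 m horizontal, i.e. ~5.71 degrees.
x = np.arange(50) * 10.0                            # 10 m cell size
dem = np.tile(0.1 * x, (50, 1))
s = slope_deg(dem, 10.0)
```

From such slope grids, flow directions and contributing areas can then be accumulated to delineate channels, the quantities used in the watershed analyses mentioned above.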
In this paper, we present a finite-state approach to constituency and, with it, an analysis of coordination phenomena involving so-called non-constituents. We show that non-constituents can be seen as parts of fully-fledged constituents and can therefore be coordinated in the same way. We have implemented an algorithm based on finite-state automata that generates an LFG grammar assigning valid analyses to non-constituent coordination structures in German.
Nested complementation plays an important role in expressing counter-free, i.e. star-free and first-order definable, languages and their hierarchies. In addition, methods that compile phonological rules into finite-state networks use double-nested complementation or "double negation". This paper reviews how double-nested complementation extends to a relatively new operation, generalized restriction (GR), coined by the author (Yli-Jyrä and Koskenniemi 2004). This operation encapsulates a double-nested complementation and the elimination of a concatenation marker, the diamond, whose finite occurrences align concatenations in the arguments of the operation. The paper demonstrates that the GR operation has interesting potential for expressing regular languages, various kinds of grammars, bimorphisms and relations. This motivates further study of optimized implementations of the operator.
Generalized Two-Level Grammar (GTWOL) provides a new method for the compilation of parallel replacement rules into transducers. The current paper identifies the role of generalized lenient composition (GLC) in this method. Thanks to the GLC operation, the compilation method becomes bipartite and easily extensible to capture various application modes. In the light of three notions of obligatoriness, a modification to the compilation method is proposed. We argue that the bipartite design makes the implementation of parallel obligatoriness, directionality, and length- and rank-based application modes extremely easy, which is the main result of the paper.
We analyze anaphoric phenomena in the context of building an input understanding component for a conversational system for tutoring mathematics. In this paper, we report the results of a data analysis of two sets of corpora of dialogs on mathematical theorem proving. We exemplify anaphoric phenomena, identify factors relevant to anaphora resolution in our domain, and propose extensions to the input interpretation component to support it.
The most massive stars are those with the shortest but most active lives. One group of massive stars, the Luminous Blue Variables (LBVs), of which only a few objects are known, is of particular interest concerning the stability of stars. They have a high mass-loss rate and are close to being unstable, all the more so as rotation becomes an important factor in the stellar evolution of these stars. Through massive stellar winds and sometimes giant eruptions, LBV nebulae are formed. Various aspects of evolution in the LBV phase lead, besides the large-scale morphological and kinematical differences, to a diversity of small structures like clumps, rims, and outflows in these nebulae.
The presentation first gives an overview of the parameters for water balance modelling that can in general be derived from remote sensing (RS) data. The description of the derivation procedures for these parameters from RS data focuses on land use, vegetation indices, and actual evapotranspiration (ETr). The methods for determining ETr from optical RS data can be roughly grouped as follows: • direct derivation of evapotranspiration from radiometrically determined surface temperatures; • derivation of model input data, such as global radiation, albedo, leaf area index (LAI) and NDVI, from RS data for use in soil-vegetation-atmosphere transfer (SVAT) and energy balance models such as SEBAL (Bastiaansen et al. 1998); • combined use of different sensors, such as SAR-ERS1, LANDSAT-TM and NOAA-AVHRR, with SVAT models and hydrological catchment models. These methods were validated in measurement campaigns such as Lotrex10E-HIBE, FIFE and HAPEX-Sahel, in which the ETr derived from the respective sensor was compared with measured ETr rates from anchor stations within a defined area. These anchor stations derived ETr from profile, eddy-flux, or scintillometer measurements. Continuous longer time series of ETr are only possible with RS data of high repetition rate and high temporal resolution, such as NOAA-AVHRR or MODIS; Landsat-TM, by contrast, yields only "snapshots" of ETr on individual days. Therefore, multi-sensor methods (e.g. combining Landsat-TM with NOAA-AVHRR) have often been used, or the RS data have been used only to collect time-invariant input data (e.g. land use) and for the spatial validation of the ETr computations of hydrological models.
The second part of the talk presents an application example: an attempt at the spatial validation of a water balance model using NDVI-ETr data products from Landsat-TM5 data for the Stobber catchment. A further application example, the integration of land-use data products from Landsat-TM5 data into water balance modelling for the Ucker catchment, concludes the talk.
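The talk names NDVI data products without giving formulas; for orientation, the standard NDVI definition (a generic sketch, not the speaker's implementation) can be written as:

```python
# NDVI = (NIR - Red) / (NIR + Red), computed per pixel from reflectance
# bands, e.g. Landsat-TM bands 4 (NIR) and 3 (Red).
import numpy as np

def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index; eps guards against 0/0."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Values near 1 indicate dense vegetation, values near 0 bare soil; such per-pixel indices are the kind of model input (alongside LAI and albedo) fed into the SVAT and energy balance models mentioned above.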
Constitutional jurisdiction in the Russian Federation and in the Federal Republic of Germany
(2013)
The conference volume contains the papers and discussion contributions of the round-table discussion on constitutional jurisdiction held in Moscow at the Kutafin Moscow State Law University on 9 and 10 October 2012. It covers selected legal-historical and legal-political questions as well as current legal problems of constitutional jurisdiction in the Russian Federation and the Federal Republic of Germany, from the perspective of both legal practice and scholarship: in particular, the development of constitutional jurisdiction past and present; the status, legal nature and tasks of the constitutional courts in the subjects of the Russian Federation and in the German Länder; and constitutional courts and legislation. Special questions of constitutional jurisdiction are also discussed, e.g. the institution of the Plenipotentiary Representative of the President in the Russian Constitutional Court, interim relief before the BVerfG, and legal protection against excessively long proceedings before the BVerfG in Germany.
INTEGRAL tripled the number of super-giant high-mass X-ray binaries (sgHMXB) known in the Galaxy by revealing absorbed and fast transient (SFXT) systems. Quantitative constraints on the wind clumping of massive stars can be obtained from the study of the hard X-ray variability of SFXT. A large fraction of the hard X-ray emission is emitted in the form of flares with a typical duration of 3 ksec, frequency of 7 days and luminosity of $10^{36}$ erg/s. Such flares are most probably emitted by the interaction of a compact object orbiting at $\sim10~R_*$ with wind clumps ($10^{22 ... 23}$ g) representing a large fraction of the stellar mass-loss rate. The density ratio between the clumps and the inter-clump medium is $10^{2 ... 4}$. The parameters of the clumps and of the inter-clump medium, derived from the SFXT flaring behavior, are in good agreement with the macro-clumping scenario and line-driven instability simulations. SFXT are likely to have larger orbital radii than classical sgHMXB.
The talk sketches the history of German Romance studies, gives a brief overview of the state of French philology as of 2003, and summarizes the disciplinary challenges in the Franco-German cultural and political context. It then makes three proposals for changing school education and university teaching on the basis of a broad understanding of culture: 1. the introduction of a new, universally compulsory school subject "Europa-Kunde" ("Connaissances de l'Europe"), to be taught Europe-wide in addition to, or in cooperation with, the subject of history; 2. the systematic complementation of traditional literary and linguistic studies with courses on Franco-German cultural relations; 3. the extension of the traditional Romance studies teaching canon with seminars from the field of cultural management.
We present the results of Monte Carlo mass-loss predictions for massive stars covering a wide range of stellar parameters. We critically test our predictions against a range of observed mass-loss rates, in light of the recent discussions on wind clumping. We also present a model to compute the clumping-induced polarimetric variability of hot stars, and we compare this with observations of Luminous Blue Variables, for which polarimetric variability is larger than for O and Wolf-Rayet stars. Luminous Blue Variables comprise an ideal testbed for studies of wind clumping and wind geometry, as well as for wind-strength calculations, and we propose they may be direct supernova progenitors.
The interdisciplinary workshop "Stochastic Processes with Applications in the Natural Sciences" was held in Bogotá at the Universidad de los Andes from December 5 to December 9, 2016. It brought together researchers from Colombia, Germany, France, Italy and Ukraine, who communicated recent progress in mathematical research related to stochastic processes with applications in biophysics.
The present volume collects three of the four courses held at this meeting by Angelo Valleriani, Sylvie Rœlly and Alexei Kulik.
A particular aim of this collection is to inspire young scientists in setting up research goals within the wide scope of fields represented in this volume.
Angelo Valleriani, PhD in high-energy physics, is leader of the group "Stochastic processes in complex and biological systems" at the Max Planck Institute of Colloids and Interfaces, Potsdam.
Sylvie Rœlly, Docteur en Mathématiques, is the head of the chair of Probability at the University of Potsdam.
Alexei Kulik, Doctor of Sciences, is a leading researcher at the Institute of Mathematics of the Ukrainian National Academy of Sciences.
Recent studies of massive O-type stars present clear evidence of inhomogeneous and clumped winds. O-type (H-rich) central stars of planetary nebulae (CSPNs) are in some ways the low-mass, low-luminosity analogues of those massive stars. In this contribution, we present preliminary results of our ongoing multi-wavelength (FUV, UV and optical) study of the winds of Galactic CSPNs. Particular emphasis will be given to the clumping factors derived by means of optical lines (Hα and He II 4686) and "classic" FUV (and UV) lines.
Magnetic fields influence the dynamics of hot-star winds and create large-scale structure. Based on numerical magnetohydrodynamic (MHD) simulations, we model the wind of θ¹ Ori C, and then use the SEI method to compute synthetic line profiles for a range of viewing angles as a function of rotational phase. The resulting dynamic spectrum for a moderately strong line shows a distinct modulation, but with a phase that seems at odds with available observations.
While there is strong evidence for clumping in the winds of massive hot stars, very little is known about clumping in the winds from Central Stars. We have checked [WC]-type CSPN winds for clumping by inspecting the electron-scattering line wings. At least for three stars we found indications for wind inhomogeneities.
Scrambling and interfaces
(2013)
This paper proposes a novel analysis of the Russian OVS construction and argues that the parametric variation in the availability of OVS cross-linguistically depends on the type of relative interpretative argument prominence that a language encodes via syntactic structure. When thematic and information-structural prominence relations do not coincide, only one of them can be structurally/linearly represented. The relation that is not structurally/linearly encoded must be made visible at the PF interface either via prosody or morphology.
Claiming that cross-speaker "but" can signal correction in dialogue, we start by describing the types of corrections "but" can communicate by focusing on the Speech Act (SA) communicated in the previous turn and address the ways in which "but" can correct what is communicated. We address whether "but" corrects the proposition, the direct SA or the discourse relation communicated in the previous turn. We will also briefly address other relations signalled by cross-turn "but". After presenting a typology of the situations "but" can correct, we will address how these corrections can be modelled in the Information State model of dialogue, motivating this work by showing how it can be used to potentially avoid misunderstandings. We wrap up by showing how the model presented here updates beliefs in the Information State representation of the dialogue and can be used to facilitate response deliberation.
Throughout 2020 and 2021, schools were temporarily closed to slow the spread of SARS-CoV-2. For some periods, children were locked out of sports in schools (physical education lessons, school sports working groups) and of organized sports in sports clubs, which often resulted in physical inactivity. Did these restrictions affect children's physical fitness? The EMOTIKON project (www.uni-potsdam.de/emotikon) annually assesses the physical fitness (cardiorespiratory endurance [6-minute-run test], coordination [star-run test], speed [20-m sprint test], lower [standing long jump test] and upper [ball push test] limbs muscle power, and balance [one-legged stance test]) of all third graders in the Federal State of Brandenburg, Germany. Participation is mandatory for all public primary schools. In the falls of 2016 to 2021, 83,476 keyage children (i.e., children enrolled according to the legal key date, aged between eight and nine years in third grade) from 512 schools were assessed with the EMOTIKON test battery. We tested the Covid pandemic effect on a composite score of the four highly correlated physical fitness tests assessing cardiorespiratory endurance, coordination, speed and powerLOW, and on another composite score of the three running tests (cardiorespiratory endurance, coordination, speed), as well as separately on all six physical fitness components. Secular trends for each of the physical fitness components and differences between schools and children were taken into account in linear mixed models. We found a negative Covid pandemic effect on the two composite physical fitness scores, as well as on cardiorespiratory endurance, coordination, and speed, and a positive Covid pandemic effect on powerLOW. Coordination showed the largest negative Covid pandemic effect, also passing the threshold of smallest meaningful change (SMC, i.e., 0.2 Cohen's d) when accumulated across two years.
Given the educational context, Covid pandemic effects were also compared relative to the expected age-related development of the physical fitness components between eight and nine years. The Covid pandemic-related developmental costs/gains ranged from three to seven months relative to a longitudinal age effect, and from five to 17 months relative to a cross-sectional age effect. We propose that a longitudinal assessment yields a more reliable estimate of the developmental (age-related) gain than a cross-sectional one; we therefore consider the smaller Covid pandemic-related developmental costs/gains to be more credible. Interestingly, at the school level, "fitter" schools (with a relatively higher grand mean) exhibited larger negative Covid pandemic effects than schools with a lower physical fitness score. Negative Covid pandemic effects for the three run tasks were also found by Bähr et al. (2022), who tested the physical fitness of 16,496 Thuringian third graders from 292 schools with the same six physical fitness tests used in EMOTIKON. Our results may be used to prioritize health-related interventions.
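The published analysis relies on linear mixed models, which are not reproduced here. Purely to illustrate the composite-score and smallest-meaningful-change arithmetic mentioned above, here is a minimal sketch on made-up numbers (not the EMOTIKON code):

```python
# Composite fitness score: z-standardize each test, then average per child.
# Effect sizes are expressed as Cohen's d and compared against SMC = 0.2.
import math
import statistics

def zscores(values):
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return [(v - m) / s for v in values]

def composite(*components):
    """Average the per-child z-scores of several fitness tests."""
    return [statistics.mean(vals) for vals in zip(*map(zscores, components))]

def cohens_d(a, b):
    """Standardized mean difference with pooled standard deviation."""
    na, nb = len(a), len(b)
    sa, sb = statistics.stdev(a), statistics.stdev(b)
    pooled = math.sqrt(((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2))
    return (statistics.mean(a) - statistics.mean(b)) / pooled

SMC = 0.2  # smallest meaningful change, in Cohen's d units
```

An effect whose |d| exceeds SMC would count as practically meaningful in the sense used above; the real analysis additionally adjusts for secular trends and school/child random effects.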
E-Learning Symposium 2012
(2013)
This conference volume contains the contributions presented at the E-Learning Symposium 2012 at the University of Potsdam on current applications, innovative processes and the latest results in the field of e-learning. Lecturers, e-learning practitioners and decision-makers exchanged their knowledge about established and planned concepts related to the student life cycle. The focus was on the direct support of teaching and learning processes, and on presentation, activation and cooperation through the use of new and established technologies.
Luminous Blue Variables show strong changes in their stellar winds on time scales of typically years to decades as they expand and contract radially at approximately constant luminosity. Micro-variability on shorter time scales and with smaller amplitudes can be observed superimposed on the larger-scale radial changes. I will show long-term time series of high-resolution spectra which we have collected over the past 20 years for many of the well-known LBVs, together with a few time series of weekly sampling (HR Car, R40, R71, R110, R127, S Dor) covering time windows of up to a few months. Wind variability is seen on short and intermediate time scales, with the line profiles changing from P Cygni to inverse P Cygni and double-peaked profiles, sometimes for the same star and spectral line. On longer time scales the ionisation levels of all chemical elements change drastically due to the strong change of the temperature at the stellar surface. While in the long term the characteristic radial changes may have an impact on the overall mass-loss rates, the variabilities and asymmetries on short and intermediate time scales may cause false estimates of the mass-loss rates when confronting models with the observed line profiles.
Aspect-oriented middleware is a promising technology for the realisation of dynamic reconfiguration in heterogeneous distributed systems. However, like other dynamic reconfiguration approaches, AO-middleware-based reconfiguration requires that the consistency of the system be maintained across reconfigurations. AO-middleware-based reconfiguration is an ongoing research topic and several consistency approaches have been proposed. However, most of these approaches tend to be targeted at specific contexts, whereas for distributed systems it is crucial to cover a wide range of operating conditions. In this paper we propose an approach that offers distributed, dynamic reconfiguration in a consistent manner, featuring a flexible framework-based consistency management scheme to cover a wide range of operating conditions. We evaluate this approach by investigating its configurability and transparency, and we also quantify the performance overheads of the associated consistency mechanisms.
Recent models of Information Structure (IS) identify a low-level contrast feature that functions within the topic and focus of the utterance. This study investigates the exact nature of this feature based on empirical evidence from a controlled read-speech experiment on the prosodic realization of different levels of contrast in Modern Greek. Results indicate that only correction is truly contrastive, and that it is similarly realized in both topic and focus, suggesting that contrast is an independent IS dimension. A non-default focus position is further identified as a parameter that triggers a prosodically marked rendition, similar to correction.
We report on new mass-loss rate estimates for O stars in six massive binaries using the amplitude of orbital-phase dependent, linear-polarimetric variability caused by electron scattering off free electrons in the winds. Our estimated mass-loss rates for luminous O stars are independent of clumping. They suggest similar clumping corrections as for WR stars and do not support the recently proposed reduction in mass-loss rates of O stars by one or two orders of magnitude.
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2013. JWST will find the first stars and galaxies that formed in the early universe, connecting the Big Bang to our own Milky Way galaxy. JWST will peer through dusty clouds to see stars forming planetary systems, connecting the Milky Way to our own Solar System. JWST's instruments are designed to work primarily in the infrared range of 1 - 28 μm, with some capability in the visible range. JWST will have a large mirror, 6.5 m in diameter, and will be diffraction-limited at 2 μm (0.1 arcsec resolution). JWST will be placed in an L2 orbit about 1.5 million km from the Earth. The instruments will provide imaging, coronography, and multi-object and integral-field spectroscopy across the 1 - 28 μm wavelength range. The breakthrough capabilities of JWST will enable new studies of massive star winds from the Milky Way to the early universe.
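The quoted 0.1 arcsec resolution at 2 μm for a 6.5 m mirror follows from the diffraction limit; a back-of-envelope check via the Rayleigh criterion (a generic sketch, not JWST's actual optical model):

```python
# Rayleigh criterion: theta = 1.22 * lambda / D (radians), converted to arcsec.
import math

def rayleigh_arcsec(wavelength_m, diameter_m):
    theta_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(theta_rad) * 3600.0

theta = rayleigh_arcsec(2e-6, 6.5)  # ~0.08 arcsec, consistent with ~0.1 quoted
```

The same formula shows why a 6.5 m aperture is needed: halving the diameter would double the diffraction-limited spot size at a given wavelength.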
I discuss observational evidence – independent of the direct spectral diagnostics of stellar winds themselves – suggesting that mass-loss rates for O stars need to be revised downward by roughly a factor of three or more, in line with recent observed mass-loss rates for clumped winds. These independent constraints include the large observed mass-loss rates in LBV eruptions, the large masses of evolved massive stars like LBVs and WNH stars, WR stars in lower metallicity environments, observed rotation rates of massive stars at different metallicity, supernovae that seem to defy expectations of high mass-loss rates in stellar evolution, and other clues. I pay particular attention to the role of feedback that would result from higher mass-loss rates, driving the star to the Eddington limit too soon, and therefore making higher rates appear highly implausible. Some of these arguments by themselves may have more than one interpretation, but together they paint a consistent picture that steady line-driven winds of O-type stars have lower mass-loss rates and are significantly clumped.
A wide range of additional forward-chaining applications could be realized with deductive databases if their rule formalism, their immediate consequence operator, and their fixpoint iteration process were more flexible. Deductive databases normally represent knowledge using stratified Datalog programs with default negation. But many practical applications of forward chaining require an extensible set of user-defined built-in predicates. Moreover, they often need function symbols for building complex data structures, and the stratified fixpoint iteration has to be extended by aggregation operations. We present a new language, Datalog*, which extends Datalog by stratified meta-predicates (including default negation), function symbols, and user-defined built-in predicates, which are implemented and evaluated top-down in Prolog. All predicates are subject to the same backtracking mechanism. The bottom-up fixpoint iteration can aggregate the derived facts after each iteration based on user-defined Prolog predicates.
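Datalog* itself is implemented in Prolog; as a language-neutral sketch of the bottom-up fixpoint iteration it builds on (without the stratified negation, function symbols and built-ins the paper adds), consider naive evaluation of transitive closure in Python:

```python
# Naive bottom-up Datalog evaluation: iterate the immediate consequence
# operator until no new facts are derivable. Facts are tuples
# (predicate, arg1, arg2); rules are functions mapping a fact set to
# derivable head facts.

def fixpoint(edb, rules):
    """Iterate the immediate consequence operator to a fixpoint."""
    facts = set(edb)
    while True:
        new = {h for rule in rules for h in rule(facts)} - facts
        if not new:
            return facts
        facts |= new

# EDB: a chain a -> b -> c -> d.
edges = {("path", "a", "b"), ("path", "b", "c"), ("path", "c", "d")}

def trans_rule(facts):
    # path(X, Z) :- path(X, Y), path(Y, Z).
    return {("path", x, z)
            for (p, x, y) in facts for (q, y2, z) in facts
            if p == q == "path" and y == y2}

closure = fixpoint(edges, [trans_rule])
```

In Datalog* terms, the per-iteration hook that this loop lacks is exactly where the paper's user-defined aggregation of derived facts would plug in.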
The 8th Conference on Didactics of Informatics in Higher Education (HDI) took place in September 2018 in Frankfurt, together with the German e-learning conference DeLFI (Deutsche E-Learning Fachtagung Informatik), under the joint motto "Digitalisierungswahnsinn? - Wege der Bildungstransformationen" ("Digitalization madness? Paths of educational transformation").
The HDI is devoted to all questions of informatics education in higher education. This year's focal points included, among others:
- Analysis of the content of, and the competencies to be aimed at in, informatics courses
- Learning to program & getting started in software development
- Special topics: data science, theoretical computer science, and scientific working methods
The conference addresses selected questions from these topic areas, which are treated in depth in talks by recognized experts and in submitted contributions.
Personal papers (Nachlässe) are private property and are therefore not subject to any obligation of transfer. The wish of the bequeather regarding the further preservation of his written legacy therefore takes precedence over all of our wishes. We cannot demand; we can only ask, make offers through our own services, and convince future bequeathers or their heirs. The absence of any institutional responsibility for taking over personal papers creates friction between the institutions that seek to acquire them: archives, libraries, museums, collections. The wish to acquire the papers of a given person, whether scientist, artist or politician, therefore always exists in several places at once. Unfortunately, it is then usually chance that decides where the papers will be kept and used for research in the future. The question arises whether we should hope and wait for such chance, or whether we should rather pursue a committed acquisition policy coordinated among the archives. ------------ Contributions on the topic "Personal papers at university archives and archives of scientific institutions", presented at the spring meeting of Fachgruppe 8, "Archivists at university archives and archives of scientific institutions", on 16/17 June at the University of Potsdam.
A constraint programming system combines two essential components: a constraint solver and a search engine. The constraint solver reasons about the satisfiability of conjunctions of constraints, and the search engine controls the search for solutions by iteratively exploring a disjunctive search tree defined by the constraint program. The Monadic Constraint Programming framework gives a monadic definition of constraint programming where the solver is defined as a monad threaded through the monadic search tree. Search and search strategies can then be defined as first-class objects that can themselves be built or extended by composable search transformers. Search transformers give a powerful and unifying approach to viewing search in constraint programming, and the resulting constraint programming system is first-class and extremely flexible.
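The framework itself is a Haskell design in which the solver is literally a monad; purely as an illustrative sketch (none of the names below come from the actual framework, and the "solver" here is only a toy store of variable bindings), the core idea can be shown by making the search tree an explicit data structure and threading the solver state through its evaluation:

```python
# Sketch of the idea behind monadic constraint programming: the search
# tree is first-class data, and the solver state is threaded through
# its evaluation. All names here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Fail:            # dead branch
    pass

@dataclass
class Ok:              # solution leaf
    pass

@dataclass
class Assign:          # post a binding constraint, then continue
    var: str
    val: int
    rest: object

@dataclass
class Choice:          # disjunction: explore both branches
    left: object
    right: object

def solve(store, tree):
    """Depth-first evaluation, threading the binding store through the tree."""
    if isinstance(tree, Fail):
        return []
    if isinstance(tree, Ok):
        return [store]
    if isinstance(tree, Assign):
        if tree.var in store and store[tree.var] != tree.val:
            return []                        # conflicting binding: prune
        return solve({**store, tree.var: tree.val}, tree.rest)
    if isinstance(tree, Choice):
        return solve(store, tree.left) + solve(store, tree.right)
    raise TypeError(tree)

# x in {1, 2}; y = x; the branch with y = 1 is a dead end
example = Choice(Assign("x", 1, Assign("y", 1, Fail())),
                 Assign("x", 2, Assign("y", 2, Ok())))
```

Because the tree is plain data, alternative search strategies (breadth-first, depth-bounded, and so on) become alternative traversal functions over the same tree, which is the flexibility the abstract describes.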
Playing with information : how political games encourage the player to cross the magic circle
(2008)
The concept of the magic circle suggests that the experience of play is separated from reality. However, in order to interact with a game’s rule system, the player has to make meaningful interpretations of its representations – and representations are never neutral. Games with political content refer in their representations explicitly to social discourses. Cues within their representational layers provoke the player to link the experience of play to mental concepts of reality.
We present preliminary results of a tailored atmosphere analysis of six Galactic WC stars using UV, optical, and mid-infrared Spitzer IRS data. With these data, we are able to sample regions from 10 to 10³ stellar radii, and thus to determine wind clumping in different parts of the wind. Ultimately, the derived wind parameters will be used to accurately measure neon abundances, and thereby test predicted nuclear-reaction rates.
Morphological analyses based on word-syntax approaches can encounter difficulties with long-distance dependencies. The reason is that in some cases an affix has to have access to the inner structure of the form with which it combines. One solution is the percolation of features from the inner morphemes to the outer morphemes with some process of feature unification. However, the obstacle of percolation constraints or stipulated features has led some linguists to argue in favour of other frameworks such as, e.g., realizational morphology or parallel approaches like optimality theory. This paper proposes a linguistic analysis of two long-distance dependencies in the morphology of Russian verbs, namely secondary imperfectivization and deverbal nominalization. We show how these processes can be reanalysed as local dependencies. Although finite-state frameworks are not bound by such linguistically motivated considerations, we present an implementation of our analysis as proposed in [1] that does not complicate the grammar or enlarge the network disproportionately.
Goal-oriented dialog as a collaborative subordinated activity involving collective acceptance
(2006)
Modeling dialog as a collaborative activity consists notably in specifying the content of the Conversational Common Ground and the kind of social mental state involved. In previous work (Saget, 2006), we claimed that Collective Acceptance is the proper social attitude for modeling Conversational Common Ground in the particular case of goal-oriented dialog. Here we provide a formalization of Collective Acceptance, along with elements for integrating this attitude into a rational model of dialog, and finally a model of referential acts as part of a collaborative activity. The particular case of reference has been chosen in order to exemplify our claims.
In a production experiment and two follow-up perception experiments on read German we investigated the (de-)coding of discourse-new, inferentially and textually accessible and given discourse referents by prosodic means. Results reveal that a decrease in the referent’s level of givenness is reflected by an increase in its prosodic prominence (expressed by differences in the status and type of accent used) providing evidence for the relevance of different intermediate types of information status between the poles given and new. Furthermore, perception data indicate that the degree of prosodic prominence can serve as the decisive cue for decoding a referent’s level of givenness.
We present one-dimensional, time-dependent models of the clumps generated by the line-deshadowing instability. In order to follow the clumps out to distances of more than 1000 R∗, we use an efficient moving-box technique. We show that, within the approximations, the wind can remain clumped well into the formation region of the radio continuum.
In semi-arid savannas, unsustainable land use can lead to degradation of entire landscapes, e.g. in the form of shrub encroachment. This leads to habitat loss and is assumed to reduce species diversity. In BIOTA phase 1, we investigated the effects of land use on population dynamics at the farm scale. In phase 2 we scale up to consider the whole regional landscape, consisting of a diverse mosaic of farms with different historic and present land-use intensities. This mosaic creates a heterogeneous, dynamic pattern of structural diversity at a large spatial scale. Understanding how the region-wide dynamic land-use pattern affects the abundance of animal and plant species requires the integration of processes on large as well as on small spatial scales. In our multidisciplinary approach, we integrate information from remote sensing, genetic and ecological field studies as well as small-scale process models in a dynamic region-wide simulation tool. <hr> Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung workshop, 9-10 February 2006.
Gamma-rays can be produced by the interaction of a relativistic jet and the matter of the stellar wind in the subclass of massive X-ray binaries known as “microquasars”. The relativistic jet is ejected from the surroundings of the compact object and interacts with cold protons from the stellar wind, producing pions that then quickly decay into gamma-rays. Since the resulting gamma-ray emissivity depends on the target density, the detection of rapid variability in microquasars with GLAST and the new generation of Cherenkov imaging arrays could be used to probe the clumped structure of the stellar wind. In particular, we show here that the relative fluctuation in gamma-rays may scale with the square root of the ratio of porosity length to binary separation, $\sqrt{h/a}$, implying for example a ca. 10% variation in gamma-ray emission for a quite moderate porosity, h/a ∼ 0.01.
The 4th Conference on Computer Science Didactics in Higher Education (HDI) continues a series that began in 1998 in Stuttgart under the heading "Informatik und Ausbildung". Since then, these conferences have served teachers in university-level computer science as a forum for information and discourse on current didactic and education-policy developments in computer science education. Current topics include, in particular, questions of the educational relevance of computer science content and the challenge posed by a stronger orientation toward competencies in computer science. The contributions submitted to HDI 2010 in Paderborn illustrate various efforts to engage with relevant problems of computer science didactics at universities in Germany (and in part also abroad). The breadth of the submissions also created difficulties in the reviewing process. In the end, only three of the numerous submissions convinced the reviewers so thoroughly that they were accepted without restriction in their full versions. Nine further submissions were reviewed predominantly positively despite criticism, so we have included them in the conference as short papers or discussion papers.
Massive stars usually form groups such as OB associations. Their fast stellar winds collectively sweep up the surrounding interstellar medium (ISM) to generate superbubbles. Observations suggest that the evolution of superbubbles in the surrounding ISM can be very irregular. Numerical simulations considering these conditions could help to understand the evolution of these superbubbles and to clarify the dynamics of these objects, as well as the difference between the observed X-ray luminosities and those predicted by the standard model (Weaver et al. 1977).
The H.E.S.S. collaboration recently reported the discovery of VHE γ-ray emission coincident with the young stellar cluster Westerlund 2. This system is known to host a population of hot, massive stars and, most particularly, the WR binary WR 20a. Particle acceleration to TeV energies in Westerlund 2 can be accomplished in several alternative scenarios; we therefore only discuss energetic constraints based on the total available kinetic energy in the system, the actual mass-loss rates of the respective cluster members, and the implied gamma-ray production from processes such as inverse Compton scattering or neutral pion decay. From the inferred gamma-ray luminosity of the order of 10³⁵ erg/s, implications for the efficiency of converting available kinetic energy into non-thermal radiation associated with stellar winds in the Westerlund 2 cluster are discussed, considering either the presence or absence of wind clumping.
Observational evidence exists that the winds of massive stars are clumped. Many massive-star systems are known as non-thermal particle production sites, as indicated by their synchrotron emission in the radio band. As a consequence they are also considered candidate sites for non-thermal high-energy photon production up to gamma-ray energies. The present work considers the expected effects of wind clumpiness on the emitting relativistic particle spectrum in colliding-wind systems, built up from the pool of thermal wind particles through diffusive particle acceleration, and taking into account inverse Compton and synchrotron losses. In comparison to a homogeneous wind, a clumpy wind causes flux variations of the emitting particle spectrum when a clump enters the wind collision region. It is found that the spectral features associated with this variability move temporally from low to high energy bands, with the time shift between any two spectral bands depending on clump size, filling factor, and the energy dependence of particle energy gains and losses.
Fluvial systems are one of the major features shaping a landscape. They adjust to the prevailing tectonic and climatic setting and therefore are very sensitive markers of changes in these systems. If their response to tectonic and climatic forcing is quantified and if the climatic signal is excluded, it is possible to derive a local deformation history. Here, we investigate fluvial terraces and erosional surfaces in the southern Chilean forearc to assess a long-term geomorphic and hence tectonic evolution. Remote sensing and field studies of the Nahuelbuta Range show that the long-term deformation of the Chilean forearc is manifested by breaks in topography, sequences of differentially uplifted marine, alluvial and strath terraces as well as tectonically modified river courses and drainage basins. We used SRTM-90-data as basic elevation information for extracting and delineating drainage networks. We calculated hypsometric curves as an indicator for basin uplift, stream-length gradient indices to identify stream segments with anomalous slopes, and longitudinal river profiles as well as DS-plots to identify knickpoints and other anomalies. In addition, we investigated topography with elevation-slope graphs, profiles, and DEMs to reveal erosional surfaces. During the first field trip we already measured palaeoflow directions, performed pebble counting and sampled the fluvial terraces in order to apply cosmogenic nuclide dating (<sup>10</sup>Be, <sup>26</sup>Al) as well as provenance analyses. Our preliminary analysis of the Coastal Cordillera indicates a clear segmentation between the northern and southern parts of the Nahuelbuta Range. The Lanalhue Fault, a NW-SE striking fault zone oblique to the plate boundary, defines the segment boundary. Furthermore, we find a complex drainage re-organisation including a drainage reversal and wind gap on the divide between the Tirúa and Pellahuén basins east of the town Tirúa.
The coastal basins lost most of their Andean sediment supply areas that existed in Tertiary and in part during early Pleistocene time. Between the Bío-Bío and Imperial rivers, no Andean river is currently capable of traversing the Coastal Cordillera, suggesting ongoing Quaternary uplift of the entire range. From the spatial distribution of geomorphic surfaces in this region, two uplift signals may be derived: (1) a long-term differential uplift process, active since the Miocene and possibly caused by underplating of subducted trench sediments; (2) a younger, local uplift affecting only the northern part of the Nahuelbuta Range, which may be caused by the interaction of the forearc with the subduction of the Mocha Fracture Zone at the latitude of the Arauco peninsula. Our approach thus provides first results in our attempt to decipher the characteristics of forearc development at active convergent margins using long-term geomorphic indicators. Furthermore, it is expected that our ongoing assessment will constrain repeatedly active zones of deformation.
Avatime, a Kwa language of Ghana, has an additive particle tsyɛ that at first sight looks similar to additive particles such as too and also in English. However, on closer inspection, the Avatime particle behaves differently. Contrary to what is usually claimed about additive particles, tsyɛ does not only associate with focused elements. Moreover, unlike its English equivalents, tsyɛ does not come with a requirement of identity between the expressed proposition and an alternative. Instead, it indicates that the proposition it occurs in is similar to or compatible with a presupposed alternative proposition.
Clumping in O-star winds
(2007)
Discussion : X-rays
(2007)
We exploit time-series $FUSE$ spectroscopy to {\it uniquely} probe spatial structure and clumping in the fast wind of the central star of the H-rich planetary nebula NGC~6543 (HD~164963). Episodic and recurrent optical-depth enhancements are discovered in the P{\sc v} absorption troughs, with some evidence for a $\sim$ 0.17-day modulation time-scale. The characteristics of these features are essentially identical to the ‘discrete absorption components’ (DACs) commonly seen in the UV lines of massive OB stars, suggesting the temporal structures seen in NGC~6543 likely have a physical origin that is similar to that operating in massive, luminous stars. The mechanism for forming coherent perturbations in the outflows is therefore apparently operating equally in radiation-pressure-driven winds of widely differing momenta ($\dot{M} v_\infty R_\star^{0.5}$) and flow times, as represented by OB stars and CSPN.
Decisions for the conservation of biodiversity and sustainable management of natural resources are typically related to large scales, i.e. the landscape level. However, understanding and predicting the effects of land use and climate change on scales relevant for decision-making requires including both large-scale vegetation dynamics and small-scale processes, such as soil-plant interactions. Integrating the results of multiple BIOTA subprojects enabled us to include the necessary data from soil science, botany, socio-economics and remote sensing in a high-resolution, process-based and spatially explicit model. Using an example from a sustainably used research farm and a communally used and degraded farming area in semiarid southern Namibia, we show the power of simulation models as a tool to integrate processes across disciplines and scales.
This paper focuses on the way computer games refer to the context of their formation and asks how they might stimulate the user’s understanding of the world around him. The central question is: Do computer games have the potential to inspire our reflection about moral and ethical issues? And if so, by which means do they achieve this? Drawing on concepts of ethical criticism in literary studies as proposed by Wayne C. Booth and Martha Nussbaum, I will argue in favor of an ethical criticism of computer games. Two aspects will be brought into focus: the ethical reflection in the artifact as a whole, and the recipient’s emotional involvement. The paper aims at evaluating the interaction of game content and game structure in order to give an adequate insight into the way computer games function and affect us.
The emergence of information extraction (IE) oriented pattern engines has been observed during the last decade. Most of them rely heavily on finite-state devices. This paper introduces ExPRESS, a new extraction pattern engine whose rules are regular expressions over flat feature structures. The underlying pattern language is a blend of two previously introduced IE-oriented pattern formalisms, namely JAPE, used in the widely known GATE system, and the unification-based XTDL formalism used in SProUT. A brief technical overview of ExPRESS, its pattern language and the pool of its native linguistic components is given. Furthermore, the implementation of the grammar interpreter is addressed as well.
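As a rough illustration of the underlying idea only (this is not the actual ExPRESS, JAPE, or XTDL syntax, and every name below is invented), a token can be modeled as a flat feature structure and a pattern element as a partial feature structure that matches a token when its feature-value pairs all agree, as in flat unification without variables:

```python
# Illustrative sketch of matching over flat feature structures.
# Tokens and pattern elements are plain dicts; a pattern element
# matches a token if its feature-value pairs are a subset of the
# token's features.

def matches_elem(pattern_elem, token):
    """A pattern element unifies with a token if all its features agree."""
    return all(token.get(f) == v for f, v in pattern_elem.items())

def matches_seq(pattern, tokens):
    """A plain sequence pattern (standing in for a full regular
    expression) matches a token sequence elementwise."""
    return len(pattern) == len(tokens) and all(
        matches_elem(p, t) for p, t in zip(pattern, tokens)
    )

# Hypothetical rule: a title token followed by a capitalised word.
title_then_name = [{"type": "title"}, {"orth": "capitalized"}]

tokens = [
    {"surface": "Dr.", "type": "title"},
    {"surface": "Smith", "orth": "capitalized"},
]
```

A real engine compiles full regular expressions (with alternation, repetition, and capture groups) over such elements; the sketch covers only plain sequences to show the per-token matching step.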
This paper explores the role of the intentional stance in games, arguing that any question of artificial intelligence has as much to do with co-opting the player’s interpretation of actions as intelligent as with any actual fixed-state systems attached to agents. It demonstrates how a few simple and, in system terms, cheap tricks can both support and enhance existing AI. These include representational characteristics, importing behavioral expectations from real life, constraining these expectations using diegetic devices, and managing social interrelationships to create the illusion of a greater intelligence than is ever actually present. It is concluded that complex artificial intelligence is often of less importance to the experience of intelligent agents in play than the creation of a space where the intentional stance can be evoked and supported.
We present an extension to a comprehensive context model that has been successfully employed in a number of practical conversational dialogue systems. The model supports the task of multimodal fusion as well as that of reference resolution in a uniform manner. Our extension consists of integrating implicitly mentioned concepts into the context model and we show how they serve as candidates for reference resolution.
In honour of Seymour Papert
(2018)
Forth is nice and flexible, but to a philosopher and teacher educator Logo is the more impressive language. Both are relatives of Lisp, but Forth has a reverse Polish notation whereas Logo has an infix notation. Logo allows top-down programming, Forth only bottom-up. Logo enables recursive programming, Forth does not. Logo includes turtle graphics, Forth has nothing comparable. So what to do if you can't get Logo and have no information about its inner architecture? This should be a case of "empirical modelling": how can you model observable results of the behaviour of Logo in terms of Forth? The main steps to solve this problem are shown in the first part of the paper.
The second part of the paper discusses the problem of modelling and shows that the modelling of making and the modelling of recognition have the same mathematical structure. So "empirical modelling" can also serve for modelling desired behaviour of technical systems.
The last part of the paper will show that the heuristic potential of a problem which should be modeled is more important than the programming language. The Picasso construal shows, in a very simple way, how children of different ages can model emotional relations in human behaviour with a simple Logo system.
This volume contains the conference materials of the German-Russian symposium on "Constitutional Development in Russia and Germany", which took place on 25 and 26 September 2013 in Potsdam. The conference was held on the occasion of the 20th anniversary, in December 2013, of the Russian constitution. Its thematic focal points are constitutional genesis, constitutional amendment, constitutional principles, state constitutions, the further development of the constitution through constitutional jurisprudence, and fundamental rights, each treated from both a Russian and a German perspective. In addition, one contribution each addresses current problems of the realization of human rights in Russia and a comparison of the integration of foreigners in Germany and Russia.
On 25 June 2016, the Forschungskreis Vereinte Nationen (United Nations Research Association) held its thirteenth conference, in cooperation with the "Forum internationale Ordnung" of the German Federal Foreign Office. In their programme, the Potsdam UN conferences traditionally combine scholarship and practice, with participants from a range of disciplines.
The 2016 conference was devoted to the topic "The Role of the United Nations in Multilateral Development Cooperation".
Development, as an important goal of the United Nations and its member states, is also significant for other fields of action and policy. The stocktaking and outlook undertaken at the conference therefore concerned not only development policy in the narrower sense, where the Sustainable Development Goals (SDGs) adopted in 2015 were up for debate, but also included environmental and human-rights aspects.
The talks on development were framed by two lectures on current questions of UN policy: the election of the new Secretary-General of the United Nations in New York under a reformed election procedure that offers more transparency and more opportunities for participation by UN member states and NGOs, and the reorganization of UN affairs within the Federal Foreign Office.
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such software projects furthermore include many internal APIs that developers must understand and use properly. Depending on their intended purpose, these APIs are used more or less frequently, and by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as their use has been shown to be highly error-prone in previous work. We study defect rates and developer expertise to consider, e.g., whether widely used APIs are more defect-prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
In this work an extension of the CSSR algorithm using Maximum Entropy Models is introduced. Preliminary experiments using this new system to perform Named Entity Recognition are presented.
Dynamical simulation of the “velocity-porosity” reduction in observed strength of stellar wind lines
(2007)
I use dynamical simulations of the line-driven instability to examine the potential role of the resulting flow structure in reducing the observed strength of wind absorption lines. Instead of the porosity length formalism used to model effects on continuum absorption, I suggest reductions in line strength can be better characterized in terms of a velocity clumping factor that is insensitive to spatial scales. Examples of dynamic spectra computed directly from instability simulations do exhibit a net reduction in absorption, but only at a modest 10-20% level that is well short of the ca. factor 10 required by recent analyses of PV lines.
X-ray spectroscopy is a sensitive probe of stellar winds. X-rays originate from optically thin shock-heated plasma deep inside the wind and propagate outwards through the absorbing cool material. Recent analyses of the line ratios from He-like ions in the X-ray spectra of O-stars highlighted problems with this general paradigm: the measured line ratios of the highest ions are consistent with the location of the hottest X-ray emitting plasma very close to the base of the wind, perhaps indicating the presence of a corona, while measurements from lower ions conform with the wind-embedded shock model. Generally, to correctly model the emerging X-ray spectra, detailed knowledge of the cool-wind opacities based on stellar atmosphere models is a prerequisite. A nearly grey stellar-wind opacity for the X-rays is deduced from the analyses of high-resolution X-ray spectra. This indicates that the stellar winds are strongly clumped. Furthermore, the nearly symmetric shape of X-ray emission line profiles can be explained if the wind clumps are radially compressed. In massive binaries the orbital variations of the X-ray emission make it possible to probe the opacity of the stellar wind; the results support the picture of strong wind clumping. In high-mass X-ray binaries, the stochastic X-ray variability and the extent of the stellar-wind region photoionized by X-rays provide further strong evidence that stellar winds consist of dense clumps.
We present the tool Kato, which is, to the best of our knowledge, the first tool for plagiarism detection directly tailored to answer-set programming (ASP). Kato aims at finding similarities between (segments of) logic programs to help detect cases of plagiarism. Currently, the tool is realised for DLV programs, but it is designed to handle various logic-programming syntax versions. We review the basic features and the underlying methodology of the tool.
Krajobraz Języka
(2003)
The optical spectrum of Eta Carinae (η Car) is prominent in H I, He I and Fe II wind lines, all of which vary both in absorption and emission with phase. The phase dependence is a consequence of the interaction between the two objects in the η Car binary (η Car A & B). The binary system is enshrouded by ejecta from previous mass-ejection events, and consequently η Car B is not directly observable. We have traced the He I lines over η Car’s spectroscopic period, using HST/STIS data obtained with medium spectral, but high angular, resolving power, and created a radial velocity curve for the system. The He I lines are formed in the core of the system, and appear to be a composite of multiple features formed in spatially separated regions. The sources of their irregular line profiles are still not fully understood, but can be attributed to emission/absorption near the wind-wind interface and/or a direct consequence of η Car A’s massive, clumpy wind. This paper will discuss the spectral variability, the narrow emission structure of the He I lines, and how the clumpiness of the winds may impede the construction of a reliable radial velocity curve, necessary for characterizing η Car B in particular.
Metacommunicative circles
(2008)
The paper uses Gregory Bateson’s concept of metacommunication to explore the boundaries of the ‘magic circle’ in play and computer games. It argues that the idea of a self-contained “magic circle” ignores the constant negotiations among players which establish the realm of play. The “magic circle” is no fixed ontological entity but is set up by metacommunicative play. The paper further pursues the question of whether metacommunication can also be found in single-player computer games, and comes to the conclusion that metacommunication is implemented in single-player games by means of metalepsis.
This first volume of the DIGAREC Series holds the proceedings of the conference “The Philosophy of Computer Games”, held at the University of Potsdam from May 8-10, 2008. The contributions of the conference address three fields of computer game research that are philosophically relevant and, likewise, to which philosophical reflection is crucial. These are: ethics and politics, the action-space of games, and the magic circle. All three topics are interlinked and constitute the paradigmatic object of computer games: Whereas the first describes computer games on the outside, looking at the cultural effects of games as well as on moral practices acted out with them, the second describes computer games on the inside, i.e. how they are constituted as a medium. The latter finally discusses the way in which a border between these two realms, games and non-games, persists or is already transgressed in respect to a general performativity.
Extending Alexander Galloway’s analysis of the action-image in videogames, this essay explores the concept in relation to its source: the analysis of cinema by the French philosopher Gilles Deleuze. The applicability of the concept to videogames will, therefore, be considered through a comparison between the First Person Shooter S.T.A.L.K.E.R. and Andrey Tarkovsky’s film Stalker. This analysis will compellingly explore the nature of videogame-action, its relation to player-perceptions and its location within the machinic and ludic schema.
Proceedings of TripleA 10
(2024)
The TripleA workshop series was founded in 2014 by linguists from Potsdam and Tübingen with the aim of providing a platform for researchers who conduct theoretically informed linguistic fieldwork on meaning. Its focus is particularly on languages that are under-represented in the current research landscape, including but not limited to languages of Africa, Asia, and Australia, hence TripleA.
For its 10th anniversary, TripleA returned to the University of Potsdam from 7 to 9 June 2023.
The programme included 21 talks dealing with no less than 22 different languages, including three invited talks given by Sihwei Chen (Academia Sinica), Jérémy Pasquereau (Laboratoire de Linguistique de Nantes, CNRS) and Agata Renans (Ruhr-Universität Bochum). Nine of these (invited or peer-reviewed) talks are featured in this volume.
Hα observations of Rigel obtained on 184 nights during the past ten years with the 1-m telescope and échelle spectrograph of Ritter Observatory are surveyed. The line profiles were classified in terms of morphology. About 1/4 of them are of P Cygni type, about 15% inverse P Cygni, about 25% double-peaked, about 1/3 pure absorption, and a few are single emission lines. Transformation of the profile from one type to another typically takes a few days. Although the line stays in absorption for extended intervals, only one high-velocity absorption event of the intensity reported by Kaufer et al. (1996a) was observed, in late 2006. Late in this event, Hα absorption occurred farther to the red than the red wing of a plausible photospheric absorption component, an indication of infalling material. In general, as the absorption events come to an end, the emission typically returns with an inverse P Cygni profile. The Hα profile class shows no obvious correlation with the radial velocity of C II λ6578, a photospheric absorption line.
General Discussion
(2007)
In the old days (pre ∼1990) hot stellar winds were assumed to be smooth, which made life fairly easy and bothered no one. Then, after suspicious behaviour had been revealed, e.g. stochastic temporal variability in broadband polarimetry of single hot stars, it took the emerging CCD technology developed in the preceding decades (∼1970-80s) to reveal that these winds were far from smooth. It was mainly high-S/N, time-dependent spectroscopy of strong optical recombination emission lines in WR stars, and also in a few OB and other stars with strong hot winds, that indicated that all hot stellar winds are likely pervaded by thousands of multiscale (compressible supersonic turbulent?) structures, whose driver is probably some kind of radiative instability. Quantitative estimates of clumping-independent mass-loss rates came from various fronts, mainly those dependent directly on density (e.g. electron-scattering wings of emission lines, UV spectroscopy of weak resonance lines, and binary-star properties including orbital-period changes, electron scattering, and X-ray fluxes from colliding winds) rather than the more common, easier-to-obtain but clumping-dependent density-squared diagnostics (e.g. free-free emission in the IR/radio and recombination lines, of which the favourite has always been Hα). Many big questions still remain, such as: What do the clumps really look like? Do clumping properties change as one recedes from the mother star? Is clumping universal? Does the relative clumping correction depend on $\dot{M}$ itself?
This paper approaches the debate over the notion of the “magic circle” through an exploratory analysis of the unfolding of identities/differences in gameplay via Derrida’s différance. Initially, différance is related to the notion of play and identity/difference in Derrida’s perspective. Next, the notion of the magic circle is analyzed through Derrida’s play, emphasizing the dynamics of différance to understand gameplay as a process and questioning its boundaries. Finally, the focus shifts toward the implications of the interplay of identities and differences during gameplay.
A key problem for models of dialogue is to explain the mechanisms involved in generating and responding to clarification requests. We report a 'Maze task' experiment that investigates the effect of 'spoof' clarification requests on the development of semantic co-ordination. The results provide evidence of both local and global semantic co-ordination phenomena that are not captured by existing dialogue co-ordination models.
We review the effects of clumping on the profiles of resonance doublets. By allowing the ratio of the doublet oscillator strengths to be a free parameter, we demonstrate that doublet profiles contain more information than is normally utilized. In clumped (or porous) winds, this ratio can lie between unity and the ratio of the f-values, and can change as a function of velocity and time, depending on the fraction of the stellar disk that is covered by material moving at a particular velocity at a given moment. Using these insights, we present the results of SEI modeling of a sample of B supergiants, ζ Pup and a time series for a star whose terminal velocity is low enough to make the components of its Si IV λλ1400 independent. These results are interpreted within the framework of the Oskinova et al. (2007) model, and demonstrate how the doublet profiles can be used to extract information about wind structure.
We discuss the results of time-resolved spectroscopy of three presumably single Population I Wolf-Rayet stars in the Small Magellanic Cloud, where the ambient metallicity is $\sim 1/5 Z_\odot$. We were able to detect and follow numerous small-scale wind-embedded inhomogeneities in all observed stars. The general properties of the moving features, such as their velocity dispersions, emissivities and average accelerations, closely match the corresponding characteristics of small-scale inhomogeneities in the winds of Galactic Wolf-Rayet stars.
How does a shared lexicon arise in a population of agents with differing lexicons, and how can this shared lexicon be maintained over multiple generations? In order to gain some insight into these questions, we present an ALife model in which the lexicon dynamics of populations that possess and lack metacommunicative interaction (MCI) capabilities are compared. We ran a series of experiments on multi-generational populations whose initial state involved agents possessing distinct lexicons. These experiments reveal clear differences in the lexicon dynamics of populations that acquire words solely by introspection, contrasted with populations that learn using MCI or using a mixed strategy of introspection and MCI. The lexicon diverges at a faster rate for an introspective population, eventually collapsing to one single form which is associated with all meanings. This contrasts sharply with MCI-capable populations, in which a lexicon is maintained and every meaning is associated with a unique word. We also investigated the effect of increasing the meaning space and showed that it speeds up the lexicon divergence for all populations irrespective of their acquisition method.
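The contrast the abstract describes can be sketched as a toy naming-game simulation. Everything below — the interaction rules, the population size, the update scheme — is a hypothetical reconstruction for illustration, not the paper's actual model: under MCI the hearer can request clarification and so aligns its word for the intended meaning, while under introspection alone the hearer must guess which meaning was meant, which lets word/meaning associations drift:

```python
import random

MEANINGS = ["m1", "m2", "m3"]
VOCAB = ["wa", "wo", "wi", "we"]  # hypothetical word forms

def make_agent():
    # each agent maps every meaning to some word from its lexicon
    return {m: random.choice(VOCAB) for m in MEANINGS}

def interact(speaker, hearer, use_mci):
    """One communicative round (toy rules, not the paper's)."""
    m = random.choice(MEANINGS)
    w = speaker[m]
    if use_mci:
        # MCI: the hearer can ask "what do you mean?", so it reliably
        # associates the speaker's word with the intended meaning
        hearer[m] = w
    else:
        # introspection only: the hearer silently guesses the intended
        # meaning, so the word may attach to the wrong meaning
        hearer[random.choice(MEANINGS)] = w

def shared_fraction(pop):
    # fraction of meanings on which the whole population agrees
    agree = sum(len({agent[m] for agent in pop}) == 1 for m in MEANINGS)
    return agree / len(MEANINGS)

for use_mci in (False, True):
    random.seed(1)
    pop = [make_agent() for _ in range(10)]
    for _ in range(2000):
        speaker, hearer = random.sample(pop, 2)
        interact(speaker, hearer, use_mci)
    print(f"MCI={use_mci}: shared meanings = {shared_fraction(pop):.2f}")
```

In runs of this sketch, the introspective population tends to lose meaning distinctions (the abstract's "one single form ... associated with all meanings"), while the MCI population converges on a stable shared mapping.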
Demonstratives, in particular gestures that "only" accompany speech, are not a big issue in current theories of grammar. If we deal with gestures, fixing their function is one big problem; the other is how to integrate the representations originating from different channels and, ultimately, how to determine their composite meanings. The growing interest in multi-modal settings, computer simulations, human-machine interfaces and VR applications increases the need for theories of multimodal structures and events. In our workshop contribution we focus on the integration of multimodal contents and investigate different approaches dealing with this problem, such as Johnston et al. (1997) and Johnston (1998), Johnston and Bangalore (2000), Chierchia (1995), Asher (2005), and Rieser (2005).
Classical SDRT (Asher and Lascarides, 2003) discussed essential features of dialogue such as adjacency pairs, corrections, and updating. Recent work in SDRT (Asher, 2002, 2005) aims at the description of natural dialogue. We use this work to model situated communication, i.e. dialogue in which sub-sentential utterances and gestures (pointing and grasping) are used as conventional modes of communication. We show that, in addition to cognitive modelling in SDRT capturing mental states and speech-act related goals, special postulates are needed to extract meaning out of contexts. Gestural meaning anchors discourse referents in contextually given domains. Both sorts of meaning are fused with the meaning of fragments to arrive at fully developed dialogue moves. This task accomplished, the standard SDRT machinery — tagged SDRSs, rhetorical relations, the update mechanism, and the Maximize Discourse Coherence constraint — generates coherent structures. In sum, meanings from different verbal and non-verbal sources are assembled using extended SDRT to form coherent wholes.
This paper suggests an approach to studying the rhetoric of persuasive computer games through comparative analysis. A comparison of the military propaganda game AMERICA’S ARMY to similar shooter games reveals an emphasis on discipline and constraints in all main aspects of the games, demonstrating a preoccupation with ethos more than pathos. Generalizing from this, a model for understanding game rhetoric through balances of freedom and constraints is proposed.