The influence of the wind on the total continuum of OB supergiants is discussed. For wind velocity distributions with β > 1.0, the wind can strongly influence the total continuum emission, even at optical wavelengths. Comparing the continuum emission of clumped and unclumped winds, especially for stars with high β values, yields flux differences of up to 30%, with a maximum in the near-IR. Continuum observations at these wavelengths are therefore an ideal tool for discriminating between clumped and unclumped winds of OB supergiants.
While there is strong evidence for clumping in the winds of massive hot stars, very little is known about clumping in the winds of Central Stars. We have checked [WC]-type CSPN winds for clumping by inspecting their electron-scattering line wings. For at least three stars we found indications of wind inhomogeneities.
Stellar winds play an important role in the evolution of massive stars and their cosmic environment. Multiple lines of evidence, coming from spectroscopy, polarimetry, variability, stellar ejecta, and hydrodynamic modeling, suggest that stellar winds are non-stationary and inhomogeneous. This is referred to as 'wind clumping'. The urgent need to understand this phenomenon is boosted by its far-reaching implications. Most importantly, all techniques to derive empirical mass-loss rates are more or less corrupted by wind clumping. Consequently, mass-loss rates are extremely uncertain. Within their range of uncertainty, completely different scenarios for the evolution of massive stars are obtained. Settling these questions for Galactic OB, LBV and Wolf-Rayet stars is a prerequisite for understanding stellar clusters and galaxies, or for predicting the properties of first-generation stars. In order to develop a consistent picture and understanding of clumped stellar winds, an international workshop on 'Clumping in Hot Star Winds' was held in Potsdam, Germany, from 18 to 22 June 2007. About 60 participants, comprising almost all leading experts in the field, gathered for one week of extensive exchange and discussion. The Scientific Organizing Committee (SOC) included John Brown (Glasgow), Joseph Cassinelli (Madison), Paul Crowther (Sheffield), Alex Fullerton (Baltimore), Wolf-Rainer Hamann (Potsdam, chair), Anthony Moffat (Montreal), Stan Owocki (Newark), and Joachim Puls (Munich). These proceedings contain the invited and contributed talks presented at the workshop, and document the extensive discussions.
X-ray spectroscopy is a sensitive probe of stellar winds. X-rays originate from optically thin shock-heated plasma deep inside the wind and propagate outwards through the absorbing cool material. Recent analyses of the line ratios of He-like ions in the X-ray spectra of O-stars highlighted problems with this general paradigm: the measured line ratios of the highest ions are consistent with the hottest X-ray emitting plasma being located very close to the base of the wind, perhaps indicating the presence of a corona, while measurements from lower ions conform to the wind-embedded shock model. Generally, correctly modeling the emerging X-ray spectra requires detailed knowledge of the cool-wind opacities based on stellar atmosphere models. A nearly grey stellar-wind opacity for X-rays is deduced from analyses of high-resolution X-ray spectra. This indicates that the stellar winds are strongly clumped. Furthermore, the nearly symmetric shape of X-ray emission line profiles can be explained if the wind clumps are radially compressed. In massive binaries, the orbital variations of the X-ray emission make it possible to probe the opacity of the stellar wind; the results support the picture of strong wind clumping. In high-mass X-ray binaries, the stochastic X-ray variability and the extent of the stellar-wind region photoionized by X-rays provide further strong evidence that stellar winds consist of dense clumps.
Clumps in hot star winds can originate from shock compression due to the line-driven instability. One-dimensional hydrodynamic simulations reveal a radial wind structure consisting of highly compressed shells separated by voids and colliding with fast clouds. Two-dimensional simulations are still largely missing, despite first attempts. Clumpiness dramatically affects the radiative transfer and thus all wind diagnostics in the UV, the optical, and X-rays. The microturbulence approximation applied hitherto is currently being superseded by more sophisticated radiative transfer in stochastic media. Besides clumps, i.e. jumps in the density stratification, so-called kinks in the velocity law, i.e. jumps in dv/dr, play a prominent role in hot star winds. Kinks are a new type of radiative-acoustic shock and propagate at super-Abbottic speed.
Modeling expanding atmospheres is a difficult task because of the extreme non-LTE situation, the need to account for complex model atoms, especially for the iron-group elements with their millions of lines, and because of the supersonic expansion. Adequate codes have been developed, e.g., by Hillier (CMFGEN), the Munich group (Puls, Pauldrach), and in Potsdam (PoWR code, Hamann et al.). While early work was based on the assumption of a smooth and homogeneous spherical stellar wind, the need to account for clumping became obvious about ten years ago. A relatively simple first-order clumping correction was readily implemented into the model codes. However, its simplifying assumptions are severe. Most importantly, the clumps are taken to be optically thin at all frequencies (“microclumping”). We discuss the consequences of this approximation and describe an approach to account for optically thick clumps (“macroclumping”). First results demonstrate that macroclumping can generally reduce the strength of spectral features, depending on their optical thickness. The recently reported discrepancy between the Hα diagnostic and the Pv resonance lines in O star spectra can be resolved without decreasing the mass-loss rates when macroclumping is taken into account.
This volume contains the program and abstracts of the 26th Symposium of the Clinical Psychology and Psychotherapy Section of the German Psychological Society (Deutsche Gesellschaft für Psychologie), held at the University of Potsdam from 1 to 3 May 2008. About 450 congress participants present the current state of research and knowledge in clinical psychology and psychotherapy in Germany. Words of welcome are given by Brandenburg's Minister for Labour, Social Affairs, Health and Family, Dagmar Ziegler; the President of the University of Potsdam, Prof. Dr.-Ing. Dr. Sabine Kunst; and Prof. Dr. Michael Linden as representative of the German Society for Psychiatry, Psychotherapy and Neurology (DGPPN). The main topics of the congress include factors influencing the mental health of older people, impulsivity, sleep and dream research in clinical psychology, the treatment of eating disorders, efficacy studies on mental disorders in childhood and adolescence, anxiety and depression, the treatment of victims of war and torture, risk and protective factors in child development, and obesity in childhood and adolescence. In addition to the talks, about 150 posters are presented. The program also includes the presentation of the Klaus Grawe Award for the Advancement of Innovative Research in Clinical Psychology and Psychotherapy to Prof. Dr. Timothy J. Strauman of Duke University (USA), the awarding of the prizes for young researchers and for posters, and a pre-conference workshop for doctoral students of clinical psychology on "Behavioural and Molecular Genetics".
The emergence of information extraction (IE) oriented pattern engines has been observed during the last decade. Most of them make heavy use of finite-state devices. This paper introduces ExPRESS – a new extraction pattern engine whose rules are regular expressions over flat feature structures. The underlying pattern language is a blend of two previously introduced IE-oriented pattern formalisms, namely JAPE, used in the widely known GATE system, and the unification-based XTDL formalism used in SProUT. A brief technical overview of ExPRESS, its pattern language, and the pool of its native linguistic components is given. Furthermore, the implementation of the grammar interpreter is also addressed.
Metacommunicative circles
(2008)
The paper uses Gregory Bateson’s concept of metacommunication to explore the boundaries of the ‘magic circle’ in play and computer games. It argues that the idea of a self-contained “magic circle” ignores the constant negotiations among players which establish the realm of play. The “magic circle” is not a fixed ontological entity but is set up by metacommunicative play. The paper further pursues the question of whether metacommunication can also be found in single-player computer games, and comes to the conclusion that metacommunication is implemented in single-player games by means of metalepsis.
Being "in the game"
(2008)
When people describe themselves as being “in the game,” this is often taken to mean they have a sense of presence, i.e. they feel like they are in the virtual environment (Brown/Cairns 2004). Presence research traditionally focuses on user experiences in virtual reality systems (e.g. head-mounted displays, CAVE-like systems). The experience of gaming, in contrast, is very different. Gamers willingly submit to the rules of the game, learn arbitrary relationships between the controls and the screen output, and take on the persona of their game character. Also, whereas presence in VR systems is immediate, presence in gaming is gradual. Given these differences, one can question the extent to which people feel present during gaming. A qualitative study was conducted to explore what gamers actually mean when they describe themselves as being “in the game.” Thirteen gamers were interviewed, and the resulting grounded theory suggests that being “in the game” does not necessarily mean presence (i.e. feeling like you are the character and present in the VE). Some people use this phrase simply to emphasize their high involvement in the game. These findings differ from those of Brown and Cairns in suggesting that, at the highest state of immersion, not everybody experiences presence. Furthermore, the experience of presence does not appear to depend on the game being in the first-person perspective or on the gamer being able to empathize with the character. Future research should investigate why some people experience presence and others do not. Possible explanations include: use of language, perception of presence, personality traits, and types of immersion.
This paper approaches the debate over the notion of the “magic circle” through an exploratory analysis of the unfolding of identities/differences in gameplay by way of Derrida’s différance. Initially, différance is related to the notions of play and identity/difference in Derrida’s perspective. Next, the magic circle is analyzed through Derrida’s notion of play, emphasizing the dynamics of différance in order to understand gameplay as a process and to question its boundaries. Finally, the focus shifts toward the implications of the interplay of identities and differences during gameplay.
This paper describes a two-level formalism where feature structures are used in contextual rules. Whereas usual two-level grammars describe rational sets over symbol pairs, this new formalism uses tree structured regular expressions. They allow an explicit and precise definition of the scope of feature structures. A given surface form may be described using several feature structures. Feature unification is expressed in contextual rules using variables, like in a unification grammar. Grammars are compiled in finite state multi-tape transducers.
Since Harris’ parser in the late 1950s, multiword units have been progressively integrated into parsers. Nevertheless, for the most part, they are still restricted to compound words, which are more stable and less numerous. In fact, language is full of semi-fixed expressions that also form basic semantic units: semi-fixed adverbial expressions (e.g. of time) and collocations. Like compounds, the identification of these structures limits the combinatorial complexity induced by lexical ambiguity. In this paper, we detail an experiment that largely integrates these notions into a finite-state procedure of segmentation into super-chunks, preliminary to parsing. We show that the chunker, developed for French, reaches 92.9% precision and 98.7% recall. Moreover, multiword units account for 36.6% of the attachments within nominal and prepositional phrases.
Finite state methods for natural language processing often require the construction and the intersection of several automata. In this paper, we investigate the question of determining the best order in which these intersections should be performed. We take as an example lexical disambiguation in polarity grammars. We show that there is no efficient way to minimize the state complexity of these intersections.
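The cost structure behind this ordering question can be made concrete with the standard product construction. The sketch below (Python, illustrative only; it does not reproduce the paper's polarity-grammar setting) intersects two DFAs while keeping only reachable product states. The number of reachable product states depends on the operand pair, which is exactly what makes the choice of intersection order matter in practice.

```python
from collections import deque

def intersect(dfa1, dfa2):
    """Product construction restricted to reachable states.

    Each DFA is a triple (start, accepting_set, transitions), where
    transitions maps (state, symbol) -> state.
    """
    s1, acc1, t1 = dfa1
    s2, acc2, t2 = dfa2
    start = (s1, s2)
    trans, acc = {}, set()
    seen, queue = {start}, deque([start])
    symbols = {sym for (_, sym) in t1} & {sym for (_, sym) in t2}
    while queue:
        p, q = queue.popleft()
        if p in acc1 and q in acc2:
            acc.add((p, q))
        for sym in symbols:
            if (p, sym) in t1 and (q, sym) in t2:
                nxt = (t1[(p, sym)], t2[(q, sym)])
                trans[((p, q), sym)] = nxt
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return start, acc, trans

def accepts(dfa, word):
    """Run the DFA on a word; missing transitions mean rejection."""
    state, acc, trans = dfa
    for sym in word:
        if (state, sym) not in trans:
            return False
        state = trans[(state, sym)]
    return state in acc

# Example: L(A) = strings over {a, b} with an even number of a's;
# L(B) = strings ending in b.
A = (0, {0}, {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 0, (1, 'b'): 1})
B = (0, {1}, {(0, 'a'): 0, (0, 'b'): 1, (1, 'a'): 0, (1, 'b'): 1})
start, acc, trans = intersect(A, B)
```

Here the product happens to reach all four state pairs; with other operand pairs large parts of the product may be unreachable, which is why a clever ordering can shrink intermediate results even though, as the paper shows, no efficient general minimization strategy exists.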
We present an algorithm that computes a function assigning consecutive integers to the trees recognized by a deterministic, acyclic, bottom-up finite-state tree automaton. Such a function is called a minimal perfect hashing. It can be used to identify trees recognized by the automaton, and its value may serve as an index into other data structures. We also present an algorithm for inverted hashing.
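The flavor of this construction can be sketched with its classic string-automaton analogue (my illustration, not the paper's tree algorithm): for an acyclic DFA, memoize the number of words accepted from each state; the rank of an accepted word, obtained by summing the counts of lexicographically smaller branches along its path, is then a minimal perfect hash into 0..N-1.

```python
from functools import lru_cache

# Illustrative acyclic DFA accepting exactly {"ab", "ac", "b"}.
# trans maps (state, symbol) -> state; acc is the accepting set; 0 is the start.
trans = {(0, 'a'): 1, (0, 'b'): 2, (1, 'b'): 3, (1, 'c'): 3}
acc = {2, 3}

def out_edges(state):
    """Outgoing edges of a state, sorted by symbol."""
    return sorted((sym, nxt) for (st, sym), nxt in trans.items() if st == state)

@lru_cache(maxsize=None)
def count(state):
    """Number of words accepted from this state (finite, since the DFA is acyclic)."""
    return (state in acc) + sum(count(nxt) for _, nxt in out_edges(state))

def rank(word):
    """Minimal perfect hash: index of `word` in the radix order of the language."""
    r, state = 0, 0
    for sym in word:
        if state in acc:          # the empty continuation sorts first
            r += 1
        for s, nxt in out_edges(state):
            if s < sym:
                r += count(nxt)   # skip all words routed through smaller branches
        state = trans[(state, sym)]
    assert state in acc, "word not in the language"
    return r
```

The paper's contribution lifts this idea from strings to trees recognized by a bottom-up tree automaton, and its inverted hashing corresponds to running the numbering in reverse, from an index back to the recognized structure.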
In this work, an extension of the CSSR algorithm using Maximum Entropy models is introduced. Preliminary experiments using this new system to perform Named Entity Recognition are presented.
In a common description, to play a game is to step inside a concrete or metaphorical magic circle where special rules apply. In video game studies, this description has received an inordinate amount of criticism which the paper argues has two primary sources: 1. a misreading of the basic concept of the magic circle and 2. a somewhat rushed application of traditional theoretical concerns onto games. The paper argues that games studies must move beyond conventional criticisms of binary distinctions and rather look at the details of how games are played. Finally, the paper proposes an alternative metaphor for game-playing, the puzzle piece.
This paper highlights the different ways of perceiving video games and video game content, incorporating interactive and non-interactive methods. It examines the varying cognitive and emotive reactions of persons who are used to playing video games as well as of persons who are unfamiliar with the aesthetics and the most basic gameplay rules incorporated in video games. Additionally, the principle of “flow” serves as a theoretical and philosophical foundation. A small case study featuring two games was conducted to emphasize the numerous possible ways of perceiving video games.
Landscape aesthetics drawing on philosophy and psychology allow us to understand computer games from a new angle. The landscapes of computer games can be understood as environments or as images. This difference creates two options: 1. we experience environments or images, or 2. we experience landscape simultaneously as both. Psychologically, the first option can be backed up by a Vygotskian framework (this option highlights certain non-mainstream subject positions), the second by a Piagetian one (highlighting the cognitive mapping of game worlds).
This text compares the special characteristics of game space in computer-generated environments with those in non-computerized play situations. In doing so, it challenges the concept of the magic circle as a deliberately delineated sphere of play with specific rules to be upheld by the players. Computer games, too, provide a virtual playing environment containing the rules of the game as well as the various possibilities for action. But both the hardware and the software facilitate the player’s actions rather than constraining them. This makes computer games fundamentally different: in contrast to traditional game spaces or limits, the computer-generated environment does not rely on the players’ awareness to uphold these rules. Thus, there is no magic circle.
Most play spaces support completely different actions from those we would normally perform when moving through real space, outside of play. This paper therefore discusses the relationship between selected game rules and game spaces in connection with the behaviors, or possible behaviors, of the player. Space is seen as a modifier or catalyst of player behavior. Six categories of game space are covered: joy-of-movement, exploration, tactical, social, performative, and creative spaces. Joy of movement is examined in detail, with a briefer explanation of the other categories.
The paper aims to bring the experience of playing videogames closer to objective knowledge, where the experience can be assessed and falsified via an operational concept. The theory focuses on explaining the basic elements that form the core of the process of the experience. The name of puppetry is introduced after discussing the similarities in the importance of experience for both videogames and theatrical puppetry. Puppetry, then, operationalizes the gaming experience into a concept that can be assessed.
This paper explores the role of the intentional stance in games, arguing that any question of artificial intelligence has as much to do with co-opting the player’s interpretation of actions as intelligent as with any actual fixed-state systems attached to agents. It demonstrates how a few simple and, in system terms, cheap tricks can both support and enhance existing AI. These include representational characteristics, importing behavioral expectations from real life, constraining these expectations using diegetic devices, and managing social interrelationships to create the illusion of a greater intelligence than is ever actually present. It is concluded that complex artificial intelligence is often of less importance to the experience of intelligent agents in play than the creation of a space where the intentional stance can be evoked and supported.
We introduce and discuss a number of issues that arise in the process of building a finite-state morphological analyzer for Urdu, in particular issues with potential ambiguity and non-concatenative morphology. Our approach allows for an underlyingly similar treatment of both Urdu and Hindi via a cascade of finite-state transducers that transliterates the two very different scripts into a common ASCII transcription system. As this transliteration system is based on the same XFST tools in which the common Urdu/Hindi morphological analyzer is implemented, no compatibility problems arise.
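The script-unification step can be pictured with a toy fragment. The mapping below is invented for illustration and is not the paper's actual transcription scheme (which is implemented as XFST transducers, not a Python table); it only shows the idea of folding a Devanagari letter and its Perso-Arabic Urdu counterpart onto one ASCII symbol, so that a single downstream analyzer can serve both languages.

```python
# Illustrative fragment: both scripts map to one common ASCII transcription.
# The particular letters and ASCII symbols here are assumptions for the demo.
COMMON = {
    "क": "k",  # Devanagari letter KA (Hindi)
    "ک": "k",  # Arabic letter keheh, used for k in Urdu
    "ब": "b",  # Devanagari letter BA (Hindi)
    "ب": "b",  # Arabic letter beh, used for b in Urdu
}

def to_ascii(text):
    """Transliterate known letters to the common ASCII scheme; pass others through."""
    return "".join(COMMON.get(ch, ch) for ch in text)
```

After this step, the Hindi and Urdu spellings of a shared word become identical strings, which is what lets one morphological analyzer cover both.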
Nested complementation plays an important role in expressing counter-free, i.e. star-free and first-order definable, languages and their hierarchies. In addition, methods that compile phonological rules into finite-state networks use double-nested complementation, or “double negation”. This paper reviews how double-nested complementation extends to a relatively new operation, generalized restriction (GR), coined by the author (Yli-Jyrä and Koskenniemi 2004). This operation encapsulates a double-nested complementation and the elimination of a concatenation marker, the diamond, whose finite occurrences align concatenations in the arguments of the operation. The paper demonstrates that the GR operation has interesting potential for expressing regular languages, various kinds of grammars, bimorphisms, and relations. This motivates further study of optimized implementations of the operator.
This article describes a HMM-based word-alignment method that can selectively enforce a contiguity constraint. This method has a direct application in the extraction of a bilingual terminological lexicon from a parallel corpus, but can also be used as a preliminary step for the extraction of phrase pairs in a Phrase-Based Statistical Machine Translation system. Contiguous source words composing terms are aligned to contiguous target language words. The HMM is transformed into a Weighted Finite State Transducer (WFST) and contiguity constraints are enforced by specific multi-tape WFSTs. The proposed method is especially suited when basic linguistic resources (morphological analyzer, part-of-speech taggers and term extractors) are available for the source language only.
Jesper Juul has convincingly argued that the conflict over the proper object of study has shifted from “rules or story” to “player or game.” But a key component of digital games is still missing from either of these oppositions: that of the computer itself. This paper offers a way of thinking about the phenomenology of the videogame from the perspective of the computer rather than the game or the player.
This paper suggests an approach to studying the rhetoric of persuasive computer games through comparative analysis. A comparison of the military propaganda game AMERICA’S ARMY to similar shooter games reveals an emphasis on discipline and constraints in all main aspects of the games, demonstrating a preoccupation with ethos more than pathos. Generalizing from this, a model for understanding game rhetoric through balances of freedom and constraints is proposed.
The space-image
(2008)
In recent computer game research a paradigmatic shift is observable: games today are first and foremost conceived of as a new medium characterized by their status as an interactive image. The shift in attention towards this aspect becomes apparent in a new approach that is, first and foremost, aware of the spatiality of games and their spatial structures. This approach rejects traditional ones on the grounds that the medial specificity of games can no longer be reduced to textual or ludic properties but has to be located in their medially constituted spatiality. For this purpose, seminal studies on the spatiality of computer games are reviewed and their advantages and disadvantages discussed. Building on this, and against the background of the philosophical method of phenomenology, we propose three steps for describing computer games as space-images: with this method it is possible to describe games with respect to the possible appearance of spatiality in a pictorial medium.
One of the informal properties often used to describe a new virtual world is its degree of openness. Yet what is an “open” virtual world? Does the phrase mean generally the same thing to different people? What distinguishes an open world from a less open world? Why does openness matter anyway? The answers to these questions cast light on an important, but shadowy, and uneasy, topic for virtual worlds: the relationship between those who construct the virtual, and those who use these constructions.
Generalized Two-Level Grammar (GTWOL) provides a new method for the compilation of parallel replacement rules into transducers. The current paper identifies the role of generalized lenient composition (GLC) in this method. Thanks to the GLC operation, the compilation method becomes bipartite and easily extensible to capture various application modes. In the light of three notions of obligatoriness, a modification to the compilation method is proposed. We argue that the bipartite design makes the implementation of parallel obligatoriness, directionality, and length- and rank-based application modes extremely easy, which is the main result of the paper.
Morphological analyses based on word-syntax approaches can encounter difficulties with long-distance dependencies. The reason is that in some cases an affix has to have access to the inner structure of the form with which it combines. One solution is the percolation of features from the inner morphemes to the outer morphemes with some process of feature unification. However, the obstacle of percolation constraints or stipulated features has led some linguists to argue in favour of other frameworks such as, e.g., realizational morphology, or of parallel approaches like optimality theory. This paper proposes a linguistic analysis of two long-distance dependencies in the morphology of Russian verbs, namely secondary imperfectivization and deverbal nominalization. We show how these processes can be reanalysed as local dependencies. Although finite-state frameworks are not bound by such linguistically motivated considerations, we present an implementation of our analysis as proposed in [1] that does not complicate the grammar or enlarge the network disproportionately.
This paper presents a system for the detection and correction of syntactic errors. It combines a robust morphosyntactic analyser and two groups of finite-state transducers specified using the Xerox Finite State Tool (xfst). One of the groups is used for the description of syntactic error patterns while the second one is used for the correction of the detected errors. The system has been tested on a corpus of real texts, containing both correct and incorrect sentences, with good results.
Temporal propositions are mapped to sets of strings that witness (in a precise sense) the propositions over discrete linear Kripke frames. The strings are collected into regular languages to ensure the decidability of entailments given by inclusions between languages. (Various notions of bounded entailment are shown to be expressible as language inclusions.) The languages unwind computations implicit in the logical (and temporal) connectives via a system of finite-state constraints adapted from finite-state morphology. Applications to Hybrid Logic and non-monotonic inertial reasoning are briefly considered.
This paper describes the key aspects of the system SynCoP (Syntactic Constraint Parser) developed at the Berlin-Brandenburgische Akademie der Wissenschaften. The parser makes it possible to combine syntactic tagging and chunking by means of constraint grammar using weighted finite-state transducers (WFSTs). Chunks are interpreted as local dependency structures within syntactic tagging. The linguistic theories are formulated as criteria which are formalized by a semiring; these criteria allow for structural preferences and gradual grammaticality. The parser is essentially a cascade of WFSTs. To find the most likely syntactic readings, a best-path search is used.
In this paper, we present a finite-state approach to constituency and therewith an analysis of coordination phenomena involving so-called non-constituents. We show that non-constituents can be seen as parts of fully-fledged constituents and therefore be coordinated in the same way. We have implemented an algorithm based on finite state automata that generates an LFG grammar assigning valid analyses to non-constituent coordination structures in the German language.
In recent years, statistical machine translation has demonstrated its usefulness within a wide variety of translation applications. In this line, phrase-based alignment models have become the reference to follow in order to build competitive systems. Finite-state models remain an interesting framework because well-known efficient algorithms exist for their representation and manipulation. This document is a contribution to the evolution of finite-state models towards a phrase-based approach. The inference of stochastic transducers that are based on bilingual phrases is carefully analysed from a finite-state point of view. Likewise, the algorithmic phenomena that have to be taken into account in order to deal with such phrase-based finite-state models at decoding time are detailed in depth.
Playing with information : how political games encourage the player to cross the magic circle
(2008)
The concept of the magic circle suggests that the experience of play is separated from reality. However, in order to interact with a game’s rule system, the player has to make meaningful interpretations of its representations – and representations are never neutral. Games with political content refer in their representations explicitly to social discourses. Cues within their representational layers provoke the player to link the experience of play to mental concepts of reality.
MMORPGs such as WORLD OF WARCRAFT can be understood as interactive representations of war. Within the frame provided by the program, the players experience martial conflicts and thus a “virtual war.” The game world, however, requires a technical and largely invisible infrastructure which has to be protected against attacks: infrastructure here means, e.g., the servers on which the data of the player characters and of the game world are saved, as well as the user accounts, which have to be protected, among other things, from “identity theft.” Besides the war on the virtual surface of the program, we therefore describe the invisible war over this infrastructure, the outbreak of which is constantly feared by the developers and operators of online worlds, requiring them to take precautions. Furthermore, we focus on “virtual game worlds” as places of complete surveillance. Since action in these worlds is always associated with the production of data, total observation is theoretically possible, and it is put into practice by the so-called “game master.” The observation of different communication channels (including user forums) serves to monitor and subtly direct the actions on the virtual battlefield without the player feeling that his freedom is being limited. Finally, we compare the fictional theater of war in WORLD OF WARCRAFT to the vision of “Network-Centric Warfare,” since it has often been observed that the analysis of MMORPGs is useful to the real conduct of war. We point out, however, what an unrealistic theater of war WORLD OF WARCRAFT really is.
This paper focuses on the way computer games refer to the context of their formation and asks how they might stimulate the user’s understanding of the world around him. The central question is: do computer games have the potential to inspire our reflection on moral and ethical issues? And if so, by which means do they achieve this? Drawing on concepts of ethical criticism in literary studies as proposed by Wayne C. Booth and Martha Nussbaum, I will argue in favor of an ethical criticism for computer games. Two aspects are brought into focus: the ethical reflection in the artifact as a whole, and the recipient’s emotional involvement. The paper aims at evaluating the interaction of game content and game structure in order to give an adequate insight into the way computer games function and affect us.
This first volume of the DIGAREC Series holds the proceedings of the conference “The Philosophy of Computer Games”, held at the University of Potsdam from May 8-10, 2008. The contributions of the conference address three fields of computer game research that are philosophically relevant and, likewise, to which philosophical reflection is crucial. These are: ethics and politics, the action-space of games, and the magic circle. All three topics are interlinked and constitute the paradigmatic object of computer games: Whereas the first describes computer games on the outside, looking at the cultural effects of games as well as on moral practices acted out with them, the second describes computer games on the inside, i.e. how they are constituted as a medium. The latter finally discusses the way in which a border between these two realms, games and non-games, persists or is already transgressed in respect to a general performativity.
Extending Alexander Galloway’s analysis of the action-image in videogames, this essay explores the concept in relation to its source: the analysis of cinema by the French philosopher Gilles Deleuze. The applicability of the concept to videogames is therefore considered through a comparison between the First Person Shooter S.T.A.L.K.E.R. and Andrey Tarkovsky’s film Stalker. This analysis explores the nature of videogame action, its relation to player perceptions, and its location within the machinic and ludic schema.
The papers contained in this issue share the insight that the different components of the grammar sometimes impose conflicting requirements on the grammar’s output, and that, in order to handle such conflicts, it seems advantageous to combine aspects from minimalist and OT modelling. The papers show that this can be undertaken in a multiplicity of ways, by using varying proportions of each framework, and offer a broad range of perspectives for future research.
The workshop addressed all questions devoted to the teaching of computer science topics in higher education. These include, among others: subject-didactic concepts for teaching individual computer science topics; methodological solutions, such as special forms of teaching and learning and implementation concepts; study concepts and curricula, particularly in connection with Bachelor’s and Master’s programmes; e-learning approaches, provided they pursue a recognizable didactic concept; and empirical results and comparative studies. The conference was devoted to selected questions from this range of topics, which were treated in depth through talks by distinguished experts, through submitted contributions, and through a presentation session.
The 4th Conference on University Didactics of Computer Science (HDI) continues a series that began in Stuttgart in 1998 under the heading “Informatik und Ausbildung”. Since then, these conferences have served teachers of computer science in higher education as a forum for information and discourse on current didactic and education-policy developments in computer science education. Currently these include, in particular, questions of the educational relevance of computer science content and the challenge posed by a stronger competence orientation in computer science. The contributions submitted to HDI 2010 in Paderborn illustrate various efforts to engage with relevant problems of computer science didactics at universities in Germany (and in part abroad). The breadth of the spectrum of submissions also posed problems for the reviewing process. In the end, only three of the numerous submissions convinced the reviewers sufficiently to be accepted unconditionally as full papers. Nine further submissions received predominantly positive reviews despite criticism, so we included them in the conference as short papers or discussion papers.
Abstract interpretation-based model checking provides an approach to verifying properties of infinite-state systems. In practice, most previous work on abstract model checking is either restricted to verifying universal properties or develops special techniques for temporal logics, such as modal transition systems or other dual transition systems. By contrast, we apply completely standard techniques for constructing abstract interpretations to the abstraction of a CTL semantic function, without restricting the kind of properties that can be verified. Furthermore, we show that this leads directly to an implementation of abstract model checking algorithms for abstract domains based on constraints, making use of an SMT solver.
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such software projects furthermore include many internal APIs that developers must understand and use properly. According to the intended purpose of these APIs, they are more or less frequently used, and used by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as the use of these has been shown to be highly error prone in previous work. We study defect rates and developer expertise, to consider e.g., whether widely used APIs are more defect prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
Preface
(2010)
Aspect-oriented programming, component models, and design patterns are modern and actively evolving techniques for improving the modularization of complex software. In particular, these techniques hold great promise for the development of "systems infrastructure" software, e.g., application servers, middleware, virtual machines, compilers, operating systems, and other software that provides general services for higher-level applications. The developers of infrastructure software are faced with increasing demands from application programmers needing higher-level support for application development. Meeting these demands requires careful use of software modularization techniques, since infrastructural concerns are notoriously hard to modularize. Aspects, components, and patterns provide very different means to deal with infrastructure software, but despite their differences, they have much in common. For instance, component models try to free the developer from the need to deal directly with services like security or transactions. These are primary examples of crosscutting concerns, and modularizing such concerns is the main target of aspect-oriented languages. Similarly, design patterns like Visitor and Interceptor facilitate the clean modularization of otherwise tangled concerns. Building on the ACP4IS meetings at AOSD 2002-2009, this workshop aims to provide a highly interactive forum for researchers and developers to discuss the application of and relationships between aspects, components, and patterns within modern infrastructure software. The goal is to put aspects, components, and patterns into a common reference frame and to build connections between the software engineering and systems communities.
Because software development is increasingly expensive and time-consuming, software reuse gains importance. Aspect-oriented software development modularizes crosscutting concerns, which enables their systematic reuse. The literature provides a number of AOP patterns and best practices for developing reusable aspects, based on compelling examples for concerns like tracing, transactions and persistence. However, such best practices are lacking for systematically reusing invasive aspects. In this paper, we present the ‘callback mismatch problem’. This problem arises in the context of abstraction mismatch, in which the aspect is required to issue a callback to the base application. As a consequence, the composition of invasive aspects is cumbersome to implement, difficult to maintain, and impossible to reuse. We motivate this problem with a real-world example, show that it persists in the current state of the art, and outline the need for advanced aspectual composition mechanisms to deal with it.
We assume a deterministic cyclic scheduling of partitions at the operating-system level for a multiprocessor system. In this paper, we propose a tool for generating such schedules. We use constraint-based programming and develop methods and concepts for a combined interactive and automatic partition scheduling system. This paper is also devoted to basic methods and techniques for modeling and solving this partition scheduling problem. Initial application of our partition scheduling tool has proved successful and demonstrated the suitability of the methods used.
An important characteristic of Service-Oriented Architectures is that clients do not depend on the service implementation's internal assignment of methods to objects. It is perhaps the most important technical characteristic that differentiates them from more common object-oriented solutions. This characteristic makes clients and services malleable, allowing them to be rearranged at run-time as circumstances change. That improvement in malleability is impaired by requiring clients to direct service requests to particular services. Ideally, the clients are totally oblivious to the service structure, as they are to aspect structure in aspect-oriented software. Removing knowledge of a method implementation's location, whether in object or service, requires re-defining the boundary line between programming language and middleware, making clearer specification of dependence on protocols, and bringing the transaction-like concept of failure scopes into language semantics as well. This paper explores consequences and advantages of a transition from object-request brokering to service-request brokering, including the potential to improve our ability to write more parallel software.
A constraint programming system combines two essential components: a constraint solver and a search engine. The constraint solver reasons about satisfiability of conjunctions of constraints, and the search engine controls the search for solutions by iteratively exploring a disjunctive search tree defined by the constraint program. The Monadic Constraint Programming framework gives a monadic definition of constraint programming where the solver is defined as a monad threaded through the monadic search tree. Search and search strategies can then be defined as first-class objects that can themselves be built or extended by composable search transformers. Search transformers give a powerful and unifying approach to viewing search in constraint programming, and the resulting constraint programming system is first class and extremely flexible.
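The tree-plus-transformer idea can be rendered, very loosely, outside Haskell. The following Python sketch is only an illustration of the concept (the framework in the paper is monadic and far more general, and all names here are invented): a search tree is a solution leaf, a dead end, or a binary choice point, and a "search transformer" wraps a base traversal to modify the strategy, here by bounding the depth.

```python
# Toy disjunctive search tree and a composable depth-bounding transformer.
# Illustrative analogue only; not the Monadic Constraint Programming API.

class Sol:      # solved leaf carrying a solution value
    def __init__(self, value): self.value = value

class Fail:     # dead end (failed branch)
    pass

class Choice:   # disjunctive branch point
    def __init__(self, left, right): self.left, self.right = left, right

def dfs(tree):
    """Plain depth-first enumeration of all solutions in the tree."""
    if isinstance(tree, Sol):
        yield tree.value
    elif isinstance(tree, Choice):
        yield from dfs(tree.left)
        yield from dfs(tree.right)
    # Fail yields nothing

def depth_bounded(search, limit):
    """Search transformer: wrap `search`, pruning branch points
    that lie at depth >= limit."""
    def bounded(tree, depth=0):
        if isinstance(tree, Choice):
            if depth >= limit:
                return          # prune this whole subtree
            yield from bounded(tree.left, depth + 1)
            yield from bounded(tree.right, depth + 1)
        else:
            yield from search(tree)
    return bounded

tree = Choice(Sol(1), Choice(Sol(2), Choice(Sol(3), Fail())))
all_solutions = list(dfs(tree))              # [1, 2, 3]
shallow = list(depth_bounded(dfs, 2)(tree))  # [1, 2]: deepest branch pruned
```

Because the transformer takes and returns a search procedure, transformers of this shape compose by ordinary function application, which is the (much simplified) point of the framework's composable search transformers.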
Enforcing security policies in distributed systems is difficult, in particular when a system contains untrusted components. We designed AspectKE*, a distributed AOP language based on a tuple space, to tackle this issue. In AspectKE*, aspects can enforce access-control policies that depend on the future behavior of running processes. One of the key language features is the predicates and functions that extract results of static program analysis, which are useful for defining security aspects that have to know about the future behavior of a program. AspectKE* also provides a novel variable-binding mechanism for pointcuts, so that pointcuts can uniformly specify join points based on both static and dynamic information about the program. Our implementation strategy performs the fundamental static analysis at load time, so as to keep runtime overheads minimal. We implemented a compiler for AspectKE*, and demonstrate the usefulness of AspectKE* through a security aspect for a distributed chat system.
Component-based software development (CBSD) and aspect-oriented software development (AOSD) are two complementary approaches. However, existing proposals for integrating aspects into component models are direct transpositions of object-oriented AOSD techniques to components. In this article, we propose a new approach based on views. Our proposal introduces crosscutting components quite naturally and can be integrated into different component models.
The interest in extensions of the logic programming paradigm beyond the class of normal logic programs is motivated by the need for an adequate representation and processing of knowledge. One of the most difficult problems in this area is to find an adequate declarative semantics for logic programs. In the present paper a general preference criterion is proposed that selects the ‘intended’ partial models of generalized logic programs; it is a conservative extension of the stationary semantics for normal logic programs of [Prz91]. The presented preference criterion defines a partial model of a generalized logic program as intended if it is generated by a stationary chain. It turns out that the stationary generated models coincide with the stationary models on the class of normal logic programs. The general well-founded semantics of such a program is defined as the set-theoretical intersection of its stationary generated models. For normal logic programs the general well-founded semantics equals the well-founded semantics.
Different properties of programs implemented in Constraint Handling Rules (CHR) have already been investigated. Proving these properties in CHR is considerably simpler than proving them in any type of imperative programming language, which triggered the proposal of a methodology for mapping imperative programs into equivalent CHR programs. The equivalence of both programs implies that if a property is satisfied for one, then it is satisfied for the other. The mapping methodology can be put to other beneficial uses. One such use is the automatic generation of global constraints, in an attempt to demonstrate the benefits of having a rule-based implementation for constraint solvers.
In the most abstract definition of its operational semantics, the declarative and concurrent programming language CHR is trivially non-terminating for a significant class of programs. Common refinements of this definition, in closing the gap to real-world implementations, compromise on declarativity and/or concurrency. Building on recent work and the notion of persistent constraints, we introduce an operational semantics avoiding trivial non-termination without compromising on its essential features.
We present the tool Kato which is, to the best of our knowledge, the first tool for plagiarism detection that is directly tailored for answer-set programming (ASP). Kato aims at finding similarities between (segments of) logic programs to help detecting cases of plagiarism. Currently, the tool is realised for DLV programs but it is designed to handle various logic-programming syntax versions. We review basic features and the underlying methodology of the tool.
In this talk, I would like to share my experiences gained from participating in four CSP solver competitions and the second ASP solver competition. In particular, I’ll talk about how various programming techniques can make huge differences in solving some of the benchmark problems used in the competitions. These techniques include global constraints, table constraints, and problem-specific propagators and labeling strategies for selecting variables and values. I’ll present these techniques with experimental results from B-Prolog and other CLP(FD) systems.
We describe a framework to support the implementation of web-based systems to manipulate data stored in relational databases. Since the conceptual model of a relational database is often specified as an entity-relationship (ER) model, we propose to use the ER model to generate a complete implementation in the declarative programming language Curry. This implementation contains operations to create and manipulate entities of the data model, supports authentication, authorization, session handling, and the composition of individual operations to user processes. Furthermore and most important, the implementation ensures the consistency of the database w.r.t. the data dependencies specified in the ER model, i.e., updates initiated by the user cannot lead to an inconsistent state of the database. In order to generate a high-level declarative implementation that can be easily adapted to individual customer requirements, the framework exploits previous works on declarative database programming and web user interface construction in Curry.
Preface
(2010)
The workshops on (constraint) logic programming (WLP) are the annual meeting of the Society of Logic Programming (GLP e.V.) and bring together researchers interested in logic programming, constraint programming, and related areas like databases, artificial intelligence and operations research. In this decade, previous workshops took place in Dresden (2008), Würzburg (2007), Vienna (2006), Ulm (2005), Potsdam (2004), Dresden (2002), Kiel (2001), and Würzburg (2000). Contributions to workshops deal with all theoretical, experimental, and application aspects of constraint programming (CP) and logic programming (LP), including foundations of constraint/logic programming. Some of the special topics are constraint solving and optimization, extensions of functional logic programming, deductive databases, data mining, nonmonotonic reasoning, interaction of CP/LP with other formalisms like agents, XML, JAVA, program analysis, program transformation, program verification, meta programming, parallelism and concurrency, answer set programming, implementation and software techniques (e.g., types, modularity, design patterns), applications (e.g., in production, environment, education, internet), constraint/logic programming for semantic web systems and applications, reasoning on the semantic web, data modelling for the web, semistructured data, and web query languages.
The workshops on (constraint) logic programming (WLP) are the annual meeting of the Society of Logic Programming (GLP e.V.) and bring together researchers interested in logic programming, constraint programming, and related areas like databases, artificial intelligence and operations research. The 23rd WLP was held in Potsdam on September 15–16, 2009. The presentations at WLP 2009 were grouped into the major areas: Databases, Answer Set Programming, Theory and Practice of Logic Programming, as well as Constraints and Constraint Handling Rules.
Aspect-oriented middleware is a promising technology for the realisation of dynamic reconfiguration in heterogeneous distributed systems. However, like other dynamic reconfiguration approaches, AO-middleware-based reconfiguration requires that the consistency of the system be maintained across reconfigurations. AO-middleware-based reconfiguration is an ongoing research topic and several consistency approaches have been proposed. Most of these, however, tend to be targeted at specific contexts, whereas for distributed systems it is crucial to cover a wide range of operating conditions. In this paper we propose an approach that offers distributed, dynamic reconfiguration in a consistent manner, featuring a flexible framework-based consistency management scheme to cover a wide range of operating conditions. We evaluate the approach by investigating its configurability and transparency, and we also quantify the performance overheads of the associated consistency mechanisms.
In this paper we consider a simple syntactic extension of Answer Set Programming (ASP) for dealing with (nested) existential quantifiers and double negation in rule bodies, in a way close to the recent proposal RASPL-1. The semantics for this extension simply resorts to Equilibrium Logic (or, equivalently, to the General Theory of Stable Models), which provides a logic-programming interpretation for any arbitrary theory in the syntax of Predicate Calculus. We present a translation of this syntactic class into standard logic programs with variables (either disjunctive or normal, depending on the input rule heads), as allowed by current ASP solvers. The translation relies on the introduction of auxiliary predicates, and the main result shows that it preserves strong equivalence modulo the original signature.
The difference-list technique is described in the literature as an effective method for extending lists to the right without calls of append/3. There are some proposals for the automatic transformation of list programs into difference-list programs. We are, however, interested in the construction of difference-list programs by the programmer, avoiding the need for a transformation step. In [GG09] it was demonstrated how left-recursive procedures with a dangling call of append/3 can be transformed into right recursion using the unfolding technique. To simplify the writing of difference-list programs, a new cons/2 procedure was introduced. In the present paper, we investigate how efficiency is influenced by using cons/2. We measure the efficiency of procedures using the accumulator technique, cons/2, DCGs, and difference lists, and compute the resulting speedup with respect to the simple procedure definition using append/3. Four Prolog systems were investigated, and we found different behaviour concerning the speedup obtained by difference lists. One result of our investigation is that a piece of advice often given in the literature for avoiding calls of append/3 could not be confirmed in this strong formulation.
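The cost contrast behind the append/3 question has a well-known asymptotic analogue outside Prolog: rebuilding a list by repeated concatenation (like a dangling append/3 call in a left recursion) is quadratic, while threading an accumulator is linear. The Python sketch below illustrates only that quadratic-versus-linear contrast under invented function names; it is not a reimplementation of the paper's Prolog benchmarks.

```python
# Quadratic list building (each step copies the whole list so far)
# versus linear building with an accumulator. Illustrative analogue
# of the append/3 vs. accumulator/difference-list contrast.
import timeit

def concat_style(n):
    out = []
    for i in range(n):
        out = out + [i]      # copies all of `out` each step: O(n^2) total
    return out

def accumulator_style(n):
    acc = []
    for i in range(n):
        acc.append(i)        # amortised O(1) extension: O(n) total
    return acc

# Both build the same list; only the cost profile differs.
slow = timeit.timeit(lambda: concat_style(2000), number=5)
fast = timeit.timeit(lambda: accumulator_style(2000), number=5)
```

The measured ratio grows with the input size, which mirrors why the Prolog literature advises replacing dangling append/3 calls; the paper's point is that how much this pays off in practice varies between systems and techniques.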
We propose a paraconsistent declarative semantics of possibly inconsistent generalized logic programs which allows for arbitrary formulas in the body and in the head of a rule (i.e. does not depend on the presence of any specific connective, such as negation(-as-failure), nor on any specific syntax of rules). For consistent generalized logic programs this semantics coincides with the stable generated models introduced in [HW97], and for normal logic programs it yields the stable models in the sense of [GL88].
A wide range of additional forward-chaining applications could be realized with deductive databases if their rule formalism, their immediate consequence operator, and their fixpoint iteration process were more flexible. Deductive databases normally represent knowledge using stratified Datalog programs with default negation. But many practical applications of forward chaining require an extensible set of user-defined built-in predicates. Moreover, they often need function symbols for building complex data structures, and the stratified fixpoint iteration has to be extended by aggregation operations. We present a new language Datalog*, which extends Datalog by stratified meta-predicates (including default negation), function symbols, and user-defined built-in predicates, which are implemented and evaluated top-down in Prolog. All predicates are subject to the same backtracking mechanism. The bottom-up fixpoint iteration can aggregate the derived facts after each iteration based on user-defined Prolog predicates.
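The bottom-up fixpoint iteration that Datalog* builds on can be sketched for plain Datalog, without the meta-predicates, built-ins, or aggregation the paper adds. In this Python sketch (all names and the toy ancestor program are illustrative only), the immediate consequence operator derives every rule head whose body is satisfied by the current facts, and iteration stops when no new facts appear:

```python
# Naive bottom-up evaluation of plain Datalog.
# Atoms are (predicate, args) tuples; single uppercase letters are variables.

def immediate_consequence(rules, facts):
    """One application of the T_P operator over the current fact set."""
    derived = set(facts)
    for head, body in rules:
        for subst in match_body(body, facts, {}):
            derived.add(substitute(head, subst))
    return derived

def match_body(body, facts, subst):
    """Enumerate substitutions making every body atom match some fact."""
    if not body:
        yield dict(subst)
        return
    pred, args = body[0]
    for fpred, fargs in facts:
        if fpred != pred or len(fargs) != len(args):
            continue
        s, ok = dict(subst), True
        for a, f in zip(args, fargs):
            if a.isupper():                  # variable: bind consistently
                if s.get(a, f) != f:
                    ok = False
                    break
                s[a] = f
            elif a != f:                     # constant: must match exactly
                ok = False
                break
        if ok:
            yield from match_body(body[1:], facts, s)

def substitute(atom, subst):
    pred, args = atom
    return (pred, tuple(subst.get(a, a) for a in args))

def fixpoint(rules, facts):
    """Iterate T_P until no new facts are derived."""
    facts = set(facts)
    while True:
        new = immediate_consequence(rules, facts)
        if new == facts:
            return facts
        facts = new

# anc(X,Y) :- par(X,Y).   anc(X,Z) :- par(X,Y), anc(Y,Z).
rules = [
    (("anc", ("X", "Y")), [("par", ("X", "Y"))]),
    (("anc", ("X", "Z")), [("par", ("X", "Y")), ("anc", ("Y", "Z"))]),
]
facts = {("par", ("a", "b")), ("par", ("b", "c"))}
model = fixpoint(rules, facts)   # contains anc(a,c) after two iterations
```

Datalog*'s extensions hook into exactly these places: built-in and meta-predicates would be evaluated top-down (in Prolog) during body matching, and an aggregation step would compress `facts` after each `immediate_consequence` round.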
Deductive databases need general formulas in rule bodies, not only conjunctions of literals. This has been well known since the work of Lloyd and Topor on extended logic programming. Of course, formulas must be restricted in such a way that they can be effectively evaluated in finite time and produce only a finite number of new tuples (in each iteration of the TP operator; the fixpoint can still be infinite). It is also necessary to respect the binding restrictions of built-in predicates: many of these predicates can be executed only when certain arguments are ground. Whereas for standard logic programming rules the questions of safety, allowedness, and range-restriction are relatively easy and well understood, the situation for general formulas is a bit more complicated. We give a syntactic analysis of formulas that guarantees the necessary properties.
We introduce a simple approach extending the input language of Answer Set Programming (ASP) systems by multi-valued propositions. Our approach is implemented as a (prototypical) preprocessor translating logic programs with multi-valued propositions into logic programs with Boolean propositions only. Our translation is modular and heavily benefits from the expressive input language of ASP. The resulting approach, along with its implementation, allows for solving interesting constraint satisfaction problems in ASP, showing a good performance.
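The flavor of such a preprocessing translation can be sketched as follows. The Boolean encoding below (one atom per value, plus an "exactly one value" condition) is a standard textbook scheme and an assumption for illustration, not a description of the tool's actual output format; the predicate names and rule syntax are invented:

```python
# Sketch: translate a multi-valued proposition over a finite domain into
# Boolean atoms plus rules enforcing that exactly one value holds.
# Hypothetical output syntax, loosely ASP-like.

def translate(prop, domain):
    atoms = [f"{prop}({v})" for v in domain]
    rules = []
    # at least one value: a disjunction over all value atoms
    rules.append(" | ".join(atoms) + ".")
    # at most one value: forbid every pair of distinct values
    for i, a in enumerate(atoms):
        for b in atoms[i + 1:]:
            rules.append(f":- {a}, {b}.")
    return rules

for line in translate("color", ["red", "green", "blue"]):
    print(line)
# first line printed: color(red) | color(green) | color(blue).
```

The translation is modular in the sense the abstract describes: each multi-valued proposition is rewritten independently, and the rest of the program can refer to the generated Boolean atoms unchanged.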
Since 2007, the Kutafin Moscow State Law Academy and the Faculty of Law of the University of Potsdam have cooperated to great mutual benefit. Beyond the scholarly insights gained, the partnership contributes to mutual understanding and helps to overcome the antagonisms that marked the relationship between Germany and Russia in the 20th century. The Week of Russian Law in Potsdam and the Week of German Law in Moscow take place in alternating years. This volume presents the lectures of renowned Moscow scholars. The reader will find the lectures in Russian, each with a summary in German. The thematic spectrum is broad, ranging from criminalistics, international law and the Europeanization of Russian law to introductions to Russian banking law and the banking system of the Russian Federation, and to private international law in Russia.
The contributions in this collected volume were presented and discussed at a research and doctoral seminar held in Potsdam in December 2010, attended by scholars of the St. Petersburg State University of Economics and Finance and by scholars of the Chairs of Statistics and Econometrics and of Economics, in particular Economic Theory, of the University of Potsdam. The publication of the papers shows, on the one hand, the diversity of research fields at both universities, which arises from the different focal points of their academic units; on the other hand, it exemplifies the different research traditions and research styles at the two universities. The contributions deal both with selected industries and with specific questions of spatial economics.
Current Questions of Human Rights Protection: 1st Potsdam Human Rights Day on 26 October 2011
(2012)
On the occasion of the appointment of Prof. Dr. Andreas Zimmermann, LL.M. (Harvard) and Prof. Dr. Logi Gunnarsson as the new directors of the Human Rights Centre, the Potsdam Human Rights Day took place on 26 October 2011 under the theme "Current Questions of Human Rights Protection". In keeping with the interdisciplinary orientation of the MenschenRechtsZentrum of the University of Potsdam, the two directors addressed philosophical and legal problems of human rights and their protection in their introductory lectures, each from the perspective of his own discipline.
The HDI conference series on the university didactics of computer science is organized by the section Computer Science and Training / Didactics of Computer Science (IAD) of the German Informatics Society (Gesellschaft für Informatik e. V., GI). It serves teachers of computer science in university degree programmes as a forum for information and exchange on new didactic approaches and education-policy topics in higher education, from the disciplinary perspective of computer science. This fifth HDI 2012 was organized at the University of Hamburg. Its special motto, "Computer Science for a Sustainable Future", was chosen in order to discuss, in particular, questions of the educational relevance of computer science content, the competences required of students in computer-science-oriented degree programmes, and the role of computer science in the development of higher education.
Avatime, a Kwa language of Ghana, has an additive particle tsyɛ that at first sight looks similar to additive particles such as too and also in English. However, on closer inspection, the Avatime particle behaves differently. Contrary to what is usually claimed about additive particles, tsyɛ does not only associate with focused elements. Moreover, unlike its English equivalents, tsyɛ does not come with a requirement of identity between the expressed proposition and an alternative. Instead, it indicates that the proposition it occurs in is similar to or compatible with a presupposed alternative proposition.
In this paper, doubling in Russian Sign Language and Sign Language of the Netherlands is discussed. In both sign languages different constituents (including verbs, nouns, adjectives, adverbs, and whole clauses) can be doubled. It is shown that doubling in both languages has common functions and exhibits a similar structure, despite some differences. On this basis, a unified pragmatic explanation for many doubling phenomena on both the discourse and the clause-internal levels is provided, namely that the main function of doubling both in RSL and NGT is foregrounding of the doubled information.
E-Learning Symposium 2012
(2013)
This conference volume contains the contributions presented at the E-Learning Symposium 2012 at the University of Potsdam on current applications, innovative processes and the latest results in the field of e-learning. Teachers, e-learning practitioners and decision-makers exchanged their knowledge of established and planned concepts related to the student life cycle. The focus was on the direct support of teaching and learning processes, and on presentation, activation and cooperation through the use of new and established technologies.
On July 20/21, 2012, an international workshop on the global impact of the euro financial crisis was held at the University of Potsdam. Prof. Dr. Detlev Hummel, Chair of Finance and Banking, hosted the event. Academic colleagues from Beijing, Moscow and Connecticut (USA) as well as domestic capital-market and banking experts presented their analyses. Different aspects of national and international financial markets were examined, with a focus on the European region, China and Russia. Mistakes and failures of banking regulation were identified as one, but not the sole, cause of the economic problems. A lack of budget discipline among some politicians and the loss of business competitiveness of certain European nations were mentioned as well. Some members of the European Union did not succeed in mastering the challenges of the global economy. Structural issues in some states impede their competitiveness in the global market, for example in relation to China. The participants pointed out a number of other reasons for the crisis, such as dubious distribution practices and a lack of transparency of certain financial products. Furthermore, the remuneration and incentive schemes of investment banks, and especially the reckless risk-management policies of large banks, were identified as further factors in the crisis. The participants of the international workshop in Potsdam agreed that the birth of the euro currency was a political event and will remain a challenge. The reform of banking supervision and further steps towards an economic and fiscal union are new research tasks.
The International Conference on Informatics in Schools: Situation, Evolution and Perspectives – ISSEP – is a forum for researchers and practitioners in the area of Informatics education, both in primary and secondary schools. It provides an opportunity for educators to reflect upon the goals and objectives of this subject, its curricula and various teaching/learning paradigms and topics, possible connections to everyday life, and various ways of establishing Informatics education in schools. The conference also addresses teaching/learning materials, various forms of assessment, traditional and innovative educational research designs, Informatics’ contribution to preparing children for the 21st century, and motivating competitions, projects and activities that support informatics education in schools.
The papers collected in this volume were presented at a Graduate/Postgraduate Student Conference with the title Information Structure: Empirical Perspectives on Theory held on December 2 and 3, 2011 at Potsdam-Griebnitzsee. The main goal of the conference was to connect young researchers working on information structure (IS) related topics and to discuss various IS categories such as givenness, focus, topic, and contrast. The aim of the conference was to find at least partial answers to the following questions: What IS categories are necessary? Are they gradient/continuous? How can one deal with optionality or redundancy? How are IS categories encoded grammatically? How do different empirical methods contribute to distinguishing between the influence of different IS categories on language comprehension and production? To answer these questions, a range of languages (Avatime, Chinese, German, Ishkashimi, Modern Greek, Old Saxon, Russian, Russian Sign Language and Sign Language of the Netherlands) and a range of phenomena from phonology, semantics, and syntax were investigated. The presented theories and data were based on different kinds of linguistic evidence: syntactic and semantic fieldwork, corpus studies, and phonological experiments. The six papers presented in this volume discuss a variety of IS categories, such as emphasis and contrast (Stavropoulous, Titov), association with focus and topics (van Putten, Karvovskaya), and givenness and backgrounding (Kimmelmann, Röhr).
Scrambling and interfaces
(2013)
This paper proposes a novel analysis of the Russian OVS construction and argues that the parametric variation in the availability of OVS cross-linguistically depends on the type of relative interpretative argument prominence that a language encodes via syntactic structure. When thematic and information-structural prominence relations do not coincide, only one of them can be structurally/linearly represented. The relation that is not structurally/linearly encoded must be made visible at the PF interface either via prosody or morphology.
Violence is omnipresent in contemporary Slavic literatures: as an echo of the revolutions, wars, dictatorships, and systemic upheavals of the 20th century, as a reaction to ongoing and newly erupting conflicts, and as fascination, sensation, and sales incentive. Violence appears both as a narrative-aesthetic, traditional element of literary representation and as a powerful, taboo-breaking motif. This volume collects the results of an international conference at the Universität Hamburg that addressed this topic in autumn 2012 under the triad "Verbrechen – Fiktion – Vermarktung" (crime, fiction, marketing). The broad spectrum of the literatures examined (from East and West Slavic to South Slavic literatures, from prose to poetry and drama), the view beyond literature (to film and music, among other fields), and the variety of themes, modes of representation, and analytical approaches yield a multifaceted picture that allows an approach to the question of what is specific to literary representations of violence.
The paper discusses the distribution and meaning of the additive particle -m@s in Ishkashimi. -m@s receives different semantic associations while staying in the same syntactic position. Thus, structurally combined with an object, it can semantically associate with the focused object or with the whole focused VP; similarly, combined with the subject it can semantically associate with the focused subject and with the whole focused sentence.
In a production experiment and two follow-up perception experiments on read German we investigated the (de-)coding of discourse-new, inferentially and textually accessible and given discourse referents by prosodic means. Results reveal that a decrease in the referent’s level of givenness is reflected by an increase in its prosodic prominence (expressed by differences in the status and type of accent used) providing evidence for the relevance of different intermediate types of information status between the poles given and new. Furthermore, perception data indicate that the degree of prosodic prominence can serve as the decisive cue for decoding a referent’s level of givenness.
Recent models of Information Structure (IS) identify a low-level contrast feature that functions within the topic and focus of the utterance. This study investigates the exact nature of this feature based on empirical evidence from a controlled read speech experiment on the prosodic realization of different levels of contrast in Modern Greek. Results indicate that only correction is truly contrastive, and that it is similarly realized in both topic and focus, suggesting that contrast is an independent IS dimension. Non-default focus position is further identified as a parameter that triggers a prosodically marked rendition, similar to correction.
The Kutafin Moscow State Law University (Academy) and the Faculty of Law of the Universität Potsdam have cooperated since 2007 with great scholarly benefit. Their joint efforts to understand and develop the law in Russia and Germany find expression in the Weeks of Russian Law. The contributions reflect legal scholarship in two countries whose schools of law have long been connected by a historical bond. Ius est ars aequi et boni: both partners strive for the true and the good through the law. The thematic focus of the 2nd Week of Russian Law (Potsdam, 2012) was, alongside questions of civil law, the interaction and mutual influence of state and church in Russia and Germany. Where harsh antagonism between state and church dominates, where one side even fights the other, injustice and arbitrariness prevail. But where mutual trust and goodwill exist between state and church, and where conflicts are resolved under the rule of law and thus peacefully, the true can be found and the good flourishes. Promoting and spreading this insight was one aim of the conference.
Constitutional Jurisdiction in the Russian Federation and the Federal Republic of Germany
(2013)
This volume contains the papers and discussion contributions of the round-table talks on constitutional jurisdiction held at the Kutafin Moscow State Law University in Moscow on 9 and 10 October 2012. It treats selected questions of legal history and legal policy as well as current legal problems of constitutional jurisdiction in the Russian Federation and the Federal Republic of Germany, from the perspectives of both legal practice and scholarship: in particular, the development of constitutional jurisdiction past and present; the status, legal nature, and tasks of the constitutional courts in the subjects of the Russian Federation and in the German Länder; and the relationship between constitutional courts and legislation. It also discusses special questions of constitutional jurisdiction, e.g. the institution of the President's Plenipotentiary Representative at the Constitutional Court in Russia, interim relief before the Bundesverfassungsgericht (BVerfG), and legal protection against excessively long proceedings before the BVerfG in Germany.
The "International Conference for the 10th Anniversary of the Institute of Comparative Law" took place on 24 May 2013 in Szeged. At this four-language conference, more than thirty participants presented their research. Zoltán Péteri's essay views the discipline from the perspective of the history of science. Katalin Kelemen and Balázs Fekete examine the paths taken by attempts to classify the legal systems of Eastern Europe in the late phase of the upheavals of the 1980s and 1990s. The historical perspective on legal history and comparative law is also reflected in other essays, above all those of Szilvia Bató, Magdolna Gedeon, and Béla Szabó P., as well as those of Péter Mezei and Tünde Szűcs. Attila Badó analyses comparative law from the viewpoints of law, sociology, and political science, drawing on studies of the system of judicial sanctions in the USA. This political-science dimension is also emphasized in the essays by Carine Guemar and Laureline Congnard on current questions of European integration. A number of essays pursue conventional normative comparison in the fields of constitutional law (Jordane Arlettaz and Péter Kruzslicz), company law (Kitti Bakos-Kovács), copyright law (Dóra Hajdú), and tax law (Judit Jacsó). A further group, the essays of János Bóka and Erzsébet Csatlós, examines the use of the comparative method in judicial practice. Comparative law is a dynamically developing discipline. The conference and this volume not only honour the work of the Institute of Comparative Law to date but also point to new goals.
The most important principles, however, remain firmly anchored even in a constantly changing legal and intellectual environment. The institute's motto is "instruere et docere omnes qui edoceri desiderant" ("to teach all who wish to be taught"). In the decades to come, too, we will be guided by the will to learn and to teach, by the freedom of research, and by the transmission and further development of Hungarian and global legal culture.
E-Learning Symposium 2014
(2014)
The proceedings of the E-Learning Symposium 2014 at the Universität Potsdam survey the diverse target groups and fields of application currently addressed in e-learning research. Whereas the previous symposium in 2012 focused discussion on teachers and the various ways of activating students and designing courses, this year a large share of the contributions puts the students themselves at the centre of attention. That not only the content of a learning medium matters for learning success, but also its entertainment value and the enjoyment learners feel while acquiring knowledge, is vividly shown in Linda Breitlauch's keynote "Faites vos Jeux" (place your bets). The contribution by Zoerner et al. combines the idea of game-based learning with the still topical theme of mobile learning; in this research area, too, the focus on the learner plays an increasingly prominent role. The invited talk by Christoph Rensing goes a step further towards individualization, dealing with the adaptivity of mobile learning applications: available context information is used to support individual learning processes in a targeted way. All contributions concerned with mobile applications and games also address the interpersonal component of learning. Besides mobility, the exchange of learning objects between learners (cf. the contribution by Zoerner et al.) and cooperation between learners (cf. the contribution by Kallookaran and Robra-Bissantz) are discussed. Interpersonal contact also plays a role in the contributions without a game or app focus.
Tutors, for example, are employed to moderate learning processes, and learning groups are formed to give problem-oriented learning more weight (cf. the contribution by Mach and Dirwelis) or to work closer to students' needs (as described in the invited talk by Tatiana N. Noskova and in the contribution by Mach and Dirwelis). In evaluation, too, the step away from anonymous, aggregated statistics towards individualized user profiles is examined in the area of learning analytics (cf. the contribution by Ifenthaler). Besides the focus on learners and their mobility, transmediality is moving into the centre of research. While the keynote, with its focus on games, already touches on this, further contributions aim to map processes from the analogue world into the digital world as faithfully as possible. Learning content that used to be made accessible to teachers and learners through pictures and texts is now enriched with further media, especially video, to improve understanding. This is suitable, for example, for learning movement sequences in sport (cf. the contribution by Owassapian and Hensinger) or practical music exercises such as body percussion (described in the contribution by Buschmann and Glasemann). Learner focus, personal exchange, mobility, and transmediality are thus some of the core themes awaiting you in this volume. The frequent combination of several of these themes also shows that none of them is marginal: it is their sum that comes together in e-learning, allowing a new quality to be reached in teaching, study, and research.
Human Rights in the Immigration Society: 2nd Potsdam MenschenRechtsTag, 22 November 2012
(2014)
At the Potsdam MenschenRechtsTag, scheduled close to the International Human Rights Day on 10 December, the MenschenRechtsZentrum of the Universität Potsdam discusses important human rights topics with a concrete socio-political or legal-political dimension. At the end of 2012 the focus was on the human rights of immigrants. From a fundamental philosophical perspective it was argued that restrictions of the human rights status of this group can be justified only with difficulty and in individual cases; a practical and legal-policy perspective revealed a concrete need for reform in the asylum procedure, which has since been met at least in part.
This volume contains the materials of the German-Russian symposium "Verfassungsentwicklung in Russland und Deutschland" (constitutional development in Russia and Germany), held in Potsdam on 25 and 26 September 2013 on the occasion of the 20th anniversary of the Russian Constitution in December 2013. Its thematic focal points are the genesis of constitutions, constitutional amendment, constitutional principles, the constitutions of the federal states, the further development of the constitution through constitutional jurisdiction, and fundamental rights, each treated from both a Russian and a German perspective. In addition, one contribution each deals with current problems of realizing human rights in Russia and with a comparison of the integration of foreigners in Germany and Russia.
KEYCIT 2014
(2015)
In our rapidly changing world it is increasingly important not only to be an expert in a chosen field of study but also to be able to respond to developments, master new approaches to solving problems, and fulfil changing requirements in the modern world and in the job market. In response to these needs key competencies in understanding, developing and using new digital technologies are being brought into focus in school and university programmes. The IFIP TC3 conference "KEYCIT – Key Competences in Informatics and ICT (KEYCIT 2014)" was held at the University of Potsdam in Germany from July 1st to 4th, 2014 and addressed the combination of key competencies, Informatics and ICT in detail. The conference was organized into strands focusing on secondary education, university education and teacher education (organized by IFIP WGs 3.1 and 3.3) and provided a forum to present and to discuss research, case studies, positions, and national perspectives in this field.
The HDI 2014 conference in Freiburg on university-level informatics education (Hochschuldidaktik der Informatik, HDI) was once again organized by the special interest group Informatik und Ausbildung / Didaktik der Informatik (IAD) of the Gesellschaft für Informatik e. V. (GI). It serves informatics lecturers in higher-education degree programmes as a forum for information and exchange on new didactic approaches and on education-policy topics in higher education, from the disciplinary perspective of informatics.
HDI 2014 is already the sixth edition of the conference. Its special motto, "Gestalten und Meistern von Übergängen" (shaping and mastering transitions), directs particular attention to the transitions from school to university, from bachelor's to master's studies, from studies to the doctorate, and from studies to working life.
Wolf-Rayet Stars
(2015)
Nearly 150 years ago, the French astronomers Charles Wolf and Georges Rayet described stars with very conspicuous spectra that are dominated by bright and broad emission lines. Since termed Wolf-Rayet stars after their discoverers, these objects turned out to represent important stages in the life of massive stars.
As the first conference in a long time specifically dedicated to Wolf-Rayet stars, an international workshop was held in Potsdam, Germany, from 1 to 5 June 2015. About 100 participants, comprising most of the leading experts in the field as well as many young scientists, gathered for one week of extensive scientific exchange and discussions. Considerable progress was reported throughout, e.g. on finding such stars, modeling and analyzing their spectra, understanding their evolutionary context, and studying their circumstellar nebulae. While some major questions regarding Wolf-Rayet stars remain open 150 years after their discovery, it is clear today that these objects are not just interesting stars in their own right, but also keystones in the evolution of galaxies.
These proceedings summarize the talks and posters presented at the Potsdam Wolf-Rayet workshop. Moreover, they also include the questions, comments, and discussions emerging after each talk, thereby giving a rare overview not only about the research, but also about the current debates and unknowns in the field. The Scientific Organizing Committee (SOC) included Alceste Bonanos (Athens), Paul Crowther (Sheffield), John Eldridge (Auckland), Wolf-Rainer Hamann (Potsdam, Chair), John Hillier (Pittsburgh), Claus Leitherer (Baltimore), Philip Massey (Flagstaff), George Meynet (Geneva), Tony Moffat (Montreal), Nicole St-Louis (Montreal), and Dany Vanbeveren (Brussels).
HPI Future SOC Lab
(2015)
The Future SOC Lab at HPI is a cooperation between the Hasso-Plattner-Institut and various industry partners. Its mission is to enable and promote exchange between the research community and industry.
The lab provides interested researchers with an infrastructure of the latest hardware and software, free of charge, for research purposes. This includes technologies that are in part not yet available on the market and that would typically be unaffordable in an ordinary university setting, e.g. servers with up to 64 cores and 2 TB of main memory. These offerings are aimed particularly at researchers in computer science and business informatics. Focus areas include cloud computing, parallelization, and in-memory technologies.
This technical report presents the results of the research projects of the year 2015. Selected projects presented their results on 15 April 2015 and 4 November 2015 at the Future SOC Lab Day events.
Every year, the Hasso Plattner Institute (HPI) invites guests from industry and academia to a collaborative scientific workshop on the topic "Operating the Cloud". Our goal is to provide a forum for the exchange of knowledge and experience between industry and academia. HPI's Future SOC Lab is thus a fitting environment to host this event, which is also supported by BITKOM.
On the occasion of this workshop we called for submissions of research papers and practitioners’ reports. “Operating the Cloud” aims to be a platform for productive discussions of innovative ideas, visions, and upcoming technologies in the field of cloud operation and administration.
These proceedings publish the results of the second HPI cloud symposium "Operating the Cloud" 2014. We thank the authors for their exciting presentations and insights into their current work and research, and we look forward to further interesting submissions for the upcoming symposium in 2015.
The twelfth conference of the Forschungskreis Vereinte Nationen was held on 28 June 2014 at the Universität Potsdam under the theme "Konzepte für die Reform der Vereinten Nationen" (concepts for the reform of the United Nations). It brought out the complex relationship between the necessity of reform and expectations of feasibility on the one hand, and the absence of far-reaching structural reforms alongside the "smaller", pragmatic reform steps actually taken on the other. With its four papers the conference, in the tradition of the Potsdam UN conferences, connected scholarship and practice with the participation of different disciplines.
This brochure is intended to give all interested readers an insight into selected focal points of the reform debate:
A keynote paper on reform processes in the UN system as a whole is followed by three texts that examine the relationship between the United Nations and non-state actors, consider the relatively new institution of the Human Rights Council as part of the reform efforts, and finally discuss results and trends in the reform of the working methods of the Security Council.