This paper highlights the different ways of perceiving video games and video game content, incorporating interactive and non-interactive methods. It examines the varying cognitive and emotive reactions of people who are used to playing video games and of people who are unfamiliar with the aesthetics and the most basic gameplay rules of video games. Additionally, the principle of “Flow” serves as a theoretical and philosophical foundation. A small case study featuring two games was conducted to emphasize the numerous possible ways of perceiving video games.
Deterministic cyclic scheduling of partitions at the operating-system level is assumed for a multiprocessor system. In this paper, we propose a tool for generating such schedules. We use constraint-based programming and develop methods and concepts for a combined interactive and automatic partition scheduling system. The paper is also devoted to basic methods and techniques for modeling and solving this partition scheduling problem. Initial application of our partition scheduling tool has proved successful and demonstrated the suitability of the methods used.
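The constraint-based formulation can be illustrated with a minimal sketch. All names, the data, and the backtracking search below are illustrative assumptions, not the paper's tool: partitions with a duration and a period are assigned start offsets on one processor's major cycle so that no two instances ever overlap.

```python
# A minimal sketch of constraint-based partition scheduling (illustrative,
# not the authors' tool): hypothetical partitions with (duration, period)
# are placed on a major cycle so that their instances never overlap.

def schedule(partitions, major_cycle):
    """Assign a start offset to each partition via backtracking search.

    partitions: dict name -> (duration, period); each period divides
    major_cycle. Returns dict name -> offset, or None if no feasible
    deterministic cycle exists.
    """
    names = list(partitions)

    def busy(offset, duration, period):
        # All time slots occupied by the partition's instances in the cycle.
        return {t % major_cycle
                for start in range(offset, major_cycle, period)
                for t in range(start, start + duration)}

    def extend(i, occupied, assignment):
        if i == len(names):
            return assignment
        name = names[i]
        duration, period = partitions[name]
        for offset in range(period - duration + 1):
            slots = busy(offset, duration, period)
            if not (slots & occupied):  # non-overlap constraint
                result = extend(i + 1, occupied | slots,
                                dict(assignment, **{name: offset}))
                if result is not None:
                    return result
        return None  # backtrack

    return extend(0, set(), {})
```

For example, `schedule({"A": (2, 8), "B": (1, 4), "C": (2, 8)}, 8)` places A at offset 0, B at 2, and C at 3; an over-constrained instance returns `None`, which is where an interactive front end like the one described above could let the user relax constraints.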
In recent years, statistical machine translation has demonstrated its usefulness within a wide variety of translation applications. Along this line, phrase-based alignment models have become the reference to follow in building competitive systems. Finite-state models remain an interesting framework because well-known efficient algorithms exist for their representation and manipulation. This document is a contribution to the evolution of finite-state models towards a phrase-based approach. The inference of stochastic transducers based on bilingual phrases is carefully analysed from a finite-state point of view. The algorithmic phenomena that have to be taken into account in order to handle such phrase-based finite-state models at decoding time are also detailed in depth.
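A toy illustration of the phrase-based finite-state view (the phrase table and the monotone decoder are invented for illustration, not taken from the document): each bilingual phrase acts like a transducer arc that consumes a source phrase and emits a target phrase with a probability, and decoding searches for the best-scoring segmentation of the input.

```python
# Toy sketch of a phrase-based model viewed as a finite-state transducer:
# each bilingual phrase is an arc consuming a source phrase and emitting
# a target phrase with a probability. Illustrative data, not the paper's.

import math

# Hypothetical phrase table: source phrase -> (target words..., probability)
PHRASES = {
    ("la", "casa"): [("the", "house", 0.9)],
    ("la",): [("the", 0.8)],
    ("casa",): [("house", 0.7)],
    ("verde",): [("green", 0.9)],
}

def decode(source):
    """Monotone phrase-based decoding: best segmentation of `source`
    into phrases, scored by the product of phrase probabilities."""
    n = len(source)
    # best[i] = (log-probability, translation of source[:i])
    best = {0: (0.0, [])}
    for i in range(n):
        if i not in best:
            continue
        for j in range(i + 1, n + 1):
            src = tuple(source[i:j])
            for *tgt, p in PHRASES.get(src, []):
                score = best[i][0] + math.log(p)
                if j not in best or score > best[j][0]:
                    best[j] = (score, best[i][1] + list(tgt))
    return best.get(n, (None, None))[1]
```

Here `decode(["la", "casa", "verde"])` prefers the two-word phrase ("la casa" → "the house") over composing single-word arcs, which is exactly the kind of segmentation ambiguity the inference and decoding algorithms discussed above must handle.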
This paper outlines a newly developed method to include the effects of time variability in the radiative transfer code CMFGEN. It is shown that the flow timescale is often large compared to the variability timescale of LBVs. Thus, time-dependent effects significantly change the velocity law and density structure of the wind, affecting the derivation of the mass-loss rate, volume filling factor, wind terminal velocity, and luminosity. The results of this work are directly applicable to all active LBVs in the Galaxy and in the LMC, such as AG Car, HR Car, S Dor and R 127, and could result in a revision of stellar and wind parameters. The mass-loss rate evolution of AG Car during the last 20 years is presented, highlighting the need for time-dependent models to correctly interpret the evolution of LBVs.
We investigate the effect of wind clumping on the dynamics of Wolf-Rayet winds, by means of the Potsdam Wolf-Rayet (PoWR) hydrodynamic atmosphere models. In the limit of microclumping the radiative acceleration is generally enhanced. We examine the reasons for this effect and show that the resulting wind structure depends critically on the assumed radial dependence of the clumping factor D(r). The observed terminal wind velocities for WR stars imply that D(r) increases to very large values in the outer part of the wind, in agreement with the assumption of detached expanding shells.
The spatially-resolved winds of the massive binary, Eta Carinae, extend an arcsecond on the sky, well beyond the 10 to 20 milliarcsecond binary orbital dimension. Stellar wind line profiles, observed at very different angular resolutions with VLTI/AMBER, HST/STIS and VLT/UVES, provide spatial information on the extended wind interaction structure as it changes with orbital phase. These same wind lines, observable in the starlight scattered off the foreground lobe of the dusty Homunculus, provide time-variant line profiles viewed from significantly different angles. Comparisons of direct and scattered wind profiles observed in the same epoch and at different orbital phases provide insight into the extended wind structure and offer the potential for three-dimensional imaging of the outer wind structures. Massive, long-lasting clumps, including the nebular Weigelt blobs, originated during the two historical ejection events. Wind interactions with these clumps are quite noticeable in spatially-resolved spectroscopy. As the 2009.0 minimum approaches, analysis of existing spectra and 3-D modeling are providing bases for key observations to gain further understanding of this complex massive binary.
Aktuelle Fragen des Menschenrechtsschutzes : 1. Potsdamer Menschenrechtstag am 26. Oktober 2011
(2012)
On the occasion of the appointment of Prof. Dr. Andreas Zimmermann, LL.M. (Harvard) and Prof. Dr. Logi Gunnarsson as the new directors of the MenschenRechtsZentrum, the Potsdam Human Rights Day was held on 26 October 2011 under the theme “Current Questions of Human Rights Protection”. In keeping with the interdisciplinary orientation of the MenschenRechtsZentrum of the University of Potsdam, the two directors devoted their introductory lectures, each from the perspective of his own discipline, to philosophical and legal problems of human rights and their protection.
The space-image
(2008)
In recent computer game research a paradigmatic shift is observable: games today are first and foremost conceived as a new medium characterized by their status as an interactive image. The shift in attention towards this aspect becomes apparent in a new approach that is, above all, aware of the spatiality of games and their spatial structures. It rejects traditional approaches on the grounds that the medial specificity of games can no longer be reduced to textual or ludic properties but has to be seen in their medially constituted spatiality. For this purpose, seminal studies on the spatiality of computer games are reviewed and their advantages and disadvantages discussed. Building on this, and against the background of the philosophical method of phenomenology, we propose three steps for describing computer games as space-images: with this method it is possible to describe games with respect to the possible appearance of spatiality in a pictorial medium.
Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop of 9-10 February 2006
Modeling expanding atmospheres is a difficult task because of the extreme non-LTE situation, the need to account for complex model atoms, especially for the iron-group elements with their millions of lines, and because of the supersonic expansion. Adequate codes have been developed, e.g., by Hillier (CMFGEN), the Munich group (Puls, Pauldrach), and in Potsdam (PoWR code, Hamann et al.). While early work was based on the assumption of a smooth and homogeneous spherical stellar wind, the need to account for clumping became obvious about ten years ago. A relatively simple first-order clumping correction was readily implemented into the model codes. However, its simplifying assumptions are severe. Most importantly, the clumps are taken to be optically thin at all frequencies (“microclumping”). We discuss the consequences of this approximation and describe an approach to account for optically thick clumps (“macroclumping”). First results demonstrate that macroclumping can generally reduce the strength of spectral features, depending on their optical thickness. The recently reported discrepancy between the Hα diagnostic and the P V resonance lines in O-star spectra can be resolved without decreasing the mass-loss rates when macroclumping is taken into account.
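The first-order correction referred to above can be summarized by the standard microclumping scaling (a textbook argument, not taken from this abstract): with all material confined to clumps occupying a volume fraction f_V = 1/D and a void interclump medium,

```latex
\rho_{\mathrm{cl}} = D\,\langle\rho\rangle , \qquad
\langle\rho^{2}\rangle = f_V\,\rho_{\mathrm{cl}}^{2}
                       = D\,\langle\rho\rangle^{2} ,
\qquad\text{hence}\qquad
\dot{M}_{\rho^{2}\text{-diag}} \simeq \sqrt{D}\;\dot{M}_{\mathrm{true}} .
```

That is, density-squared diagnostics such as Hα overestimate the mass-loss rate by a factor of roughly √D if clumping is ignored, while diagnostics linear in density (such as the P V resonance lines) are unaffected in the microclumping limit; this is the origin of the Hα versus P V discrepancy mentioned in the abstract.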
Component-based software development (CBSD) and aspect-oriented software development (AOSD) are two complementary approaches. However, existing proposals for integrating aspects into component models are direct transpositions of object-oriented AOSD techniques to components. In this article, we propose a new approach based on views. Our proposal introduces crosscutting components quite naturally and can be integrated into different component models.
We describe a framework to support the implementation of web-based systems to manipulate data stored in relational databases. Since the conceptual model of a relational database is often specified as an entity-relationship (ER) model, we propose to use the ER model to generate a complete implementation in the declarative programming language Curry. This implementation contains operations to create and manipulate entities of the data model, and supports authentication, authorization, session handling, and the composition of individual operations into user processes. Furthermore, and most importantly, the implementation ensures the consistency of the database w.r.t. the data dependencies specified in the ER model, i.e., updates initiated by the user cannot lead to an inconsistent state of the database. In order to generate a high-level declarative implementation that can be easily adapted to individual customer requirements, the framework exploits previous work on declarative database programming and web user interface construction in Curry.
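The core idea, generating entity operations that make constraint-violating updates impossible, can be sketched as follows. The paper generates Curry code from an ER model; this stand-in is Python, and the `Entity` class, its attributes, and the uniqueness constraint are invented for illustration.

```python
# Simplified sketch (Python stand-in, not the paper's Curry framework):
# an entity description derived from an ER model is turned into insert
# operations that enforce the declared constraints, so no user-initiated
# update can leave the store inconsistent.

class Entity:
    def __init__(self, name, attributes, unique=()):
        self.name = name
        self.attributes = attributes
        self.unique = set(unique)   # uniqueness constraints from the ER model
        self.rows = []

    def insert(self, **values):
        if set(values) != set(self.attributes):
            raise ValueError(f"{self.name}: wrong attributes {sorted(values)}")
        for attr in self.unique:
            if any(row[attr] == values[attr] for row in self.rows):
                raise ValueError(f"{self.name}.{attr} must be unique")
        self.rows.append(values)
        return values
```

A generated web operation would call `insert` and report the raised error back to the user; because every update path goes through such generated operations, the store can never reach a state violating the declared dependencies.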
An important characteristic of Service-Oriented Architectures is that clients do not depend on the service implementation's internal assignment of methods to objects. It is perhaps the most important technical characteristic that differentiates them from more common object-oriented solutions. This characteristic makes clients and services malleable, allowing them to be rearranged at run-time as circumstances change. That improvement in malleability is impaired by requiring clients to direct service requests to particular services. Ideally, the clients are totally oblivious to the service structure, as they are to aspect structure in aspect-oriented software. Removing knowledge of a method implementation's location, whether in object or service, requires re-defining the boundary line between programming language and middleware, making clearer specification of dependence on protocols, and bringing the transaction-like concept of failure scopes into language semantics as well. This paper explores consequences and advantages of a transition from object-request brokering to service-request brokering, including the potential to improve our ability to write more parallel software.
The HDI conference series on the didactics of informatics in higher education is organized by the section Informatics and Education / Didactics of Informatics (IAD) of the Gesellschaft für Informatik e. V. (GI). It serves lecturers of informatics in higher-education degree programmes as a forum for information and exchange on new didactic approaches and on educational-policy topics in higher education, from the disciplinary perspective of informatics. This fifth HDI, in 2012, was organized at the University of Hamburg. Its special motto, “Informatics for a Sustainable Future”, was chosen in particular to discuss questions of the educational relevance of informatics content, of the competencies needed by students in informatics-oriented degree programmes, and of the role of informatics in the development of higher education.
The interest in extensions of the logic programming paradigm beyond the class of normal logic programs is motivated by the need for an adequate representation and processing of knowledge. One of the most difficult problems in this area is to find an adequate declarative semantics for logic programs. In the present paper a general preference criterion is proposed that selects the ‘intended’ partial models of generalized logic programs; it is a conservative extension of the stationary semantics for normal logic programs of [Prz91]. The presented preference criterion defines a partial model of a generalized logic program as intended if it is generated by a stationary chain. It turns out that the stationary generated models coincide with the stationary models on the class of normal logic programs. The general well-founded semantics of such a program is defined as the set-theoretical intersection of its stationary generated models. For normal logic programs the general well-founded semantics equals the well-founded semantics.
We propose a paraconsistent declarative semantics of possibly inconsistent generalized logic programs which allows for arbitrary formulas in the body and in the head of a rule (i.e. does not depend on the presence of any specific connective, such as negation(-as-failure), nor on any specific syntax of rules). For consistent generalized logic programs this semantics coincides with the stable generated models introduced in [HW97], and for normal logic programs it yields the stable models in the sense of [GL88].
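For the special case of normal logic programs mentioned above, stable models in the sense of [GL88] can be characterized operationally via the Gelfond-Lifschitz reduct. The following is a minimal illustrative sketch (not code from the paper), with a rule represented as a triple (head, positive body, negative body):

```python
# Minimal sketch of the stable-model semantics for *normal* logic programs
# ([GL88]), the special case mentioned above. A rule is (head, pos, neg).

def reduct(program, candidate):
    """Gelfond-Lifschitz reduct: drop rules whose negative body meets the
    candidate set; strip the negative literals from the remaining rules."""
    return [(head, pos) for head, pos, neg in program
            if not (set(neg) & candidate)]

def least_model(definite_program):
    """Least Herbrand model of a definite program via fixpoint iteration."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, pos in definite_program:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(program, candidate):
    """candidate is stable iff it is the least model of its own reduct."""
    return least_model(reduct(program, set(candidate))) == set(candidate)
```

For the classic program {p ← not q; q ← not p}, both {p} and {q} are stable while ∅ and {p, q} are not, which is exactly the non-determinism that the stable generated models of [HW97] lift to generalized programs with arbitrary rule bodies and heads.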
Overwhelming observational and theoretical evidence suggests that the winds of massive stars are highly clumped. We briefly discuss the influence of clumping on model diagnostics and the difficulties of allowing for the influence of clumping on model spectra. Because of its simplicity and computational ease, most spectroscopic analyses incorporate clumping using the volume filling factor. The biases introduced by this approach are uncertain. To investigate alternative clumping models, and to help determine the validity of parameters derived using the volume filling factor method, we discuss results derived using an alternative model in which we assume that the wind is composed of optically thick shells.
Mass loss is a very important aspect of the life of massive stars. After briefly reviewing its importance, we discuss the impact of the recently proposed downward revision of mass loss rates due to clumping (difficulty in forming Wolf-Rayet stars and production of critically rotating stars). Although a small reduction might be allowed, large reduction factors around ten are disfavoured. We then discuss the possibility of significant mass loss at very low metallicity due to stars reaching break-up velocities and especially due to the metal enrichment of the surface of the star via rotational and convective mixing. This significant mass loss may help the first very massive stars avoid the fate of pair-creation supernovae, the chemical signature of which is not observed in extremely metal-poor stars. The chemical composition of the very low metallicity winds is very similar to that of the most metal-poor star known to date, HE 1327-2326, and offers an interesting explanation for the origin of the metals in this star. We also discuss the importance of mass loss in the context of long and soft gamma-ray bursts and pair-creation supernovae. Finally, we would like to stress that mass loss in the cooler parts of the HR diagram (luminous blue variable and yellow and red supergiant stages) is much more uncertain than in the hot part. More work needs to be done in these areas to better constrain the evolution of the most massive stars.
MMORPGs such as WORLD OF WARCRAFT can be understood as interactive representations of war. Within the frame provided by the program, the players experience martial conflicts and thus a “virtual war.” The game world, however, requires a technical and largely invisible infrastructure which has to be protected against attacks: infrastructure here means, e.g., the servers on which the data of the player characters and of the game’s world are saved, as well as the user accounts, which have to be protected, among other things, from “identity theft.” Besides the war on the virtual surface of the program, we therefore describe the invisible war over the infrastructure, whose outbreak is constantly feared by the developers and operators of online worlds and requires them to take precautions. Furthermore, we focus on “virtual game worlds” as places of complete surveillance. Since action in these worlds is always associated with the production of data, total observation is theoretically possible and is put into practice by the so-called “game master.” The observation of different communication channels (including user forums) serves to monitor and subtly direct the actions on the virtual battlefield without the player feeling that his freedom is being limited. Finally, we compare the fictional theater of war in WORLD OF WARCRAFT to the vision of “Network-Centric Warfare,” since it has often been observed that the analysis of MMORPGs is useful to the real trade of war. However, we point out what an unrealistic theater of war WORLD OF WARCRAFT really is.
The International Conference on Informatics in Schools: Situation, Evolution and Perspectives – ISSEP – is a forum for researchers and practitioners in the area of informatics education, in both primary and secondary schools. It provides an opportunity for educators to reflect upon the goals and objectives of this subject, its curricula, various teaching/learning paradigms and topics, possible connections to everyday life, and various ways of establishing informatics education in schools. The conference also addresses teaching/learning materials, various forms of assessment, traditional and innovative educational research designs, informatics’ contribution to preparing children for the 21st century, and motivating competitions, projects, and activities supporting informatics education in school.
This paper presents a system for the detection and correction of syntactic errors. It combines a robust morphosyntactic analyser and two groups of finite-state transducers specified using the Xerox Finite State Tool (xfst). One of the groups is used for the description of syntactic error patterns while the second one is used for the correction of the detected errors. The system has been tested on a corpus of real texts, containing both correct and incorrect sentences, with good results.
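The two-group architecture can be sketched with plain regular expressions as a hedged stand-in for the paper's xfst transducers (the error patterns below are invented examples, not the paper's actual error grammar): one group of patterns detects an error, and each is paired with a correction rewrite.

```python
# Hedged stand-in for the paper's two transducer groups (specified there
# with the Xerox Finite State Tool; plain Python regexes here): detection
# patterns flag a syntactic error, and each is paired with a correction.

import re

# (name, detection pattern, correction) - illustrative patterns only.
RULES = [
    ("doubled word", re.compile(r"\b(\w+)\s+\1\b"), r"\1"),
    ("'a' before vowel", re.compile(r"\ba(\s+[aeiou]\w*)\b"), r"an\1"),
]

def detect(sentence):
    """First group: report which error patterns match the sentence."""
    return [name for name, pattern, _ in RULES if pattern.search(sentence)]

def correct(sentence):
    """Second group: apply the correction paired with each detected error."""
    for _, pattern, replacement in RULES:
        sentence = pattern.sub(replacement, sentence)
    return sentence
```

A real system would run the patterns over the output of the morphosyntactic analyser rather than raw text, which is what lets the transducers express genuinely syntactic (agreement-level) error patterns.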
We report FUSE observations in 2005–2006 of three O-type, double-lined spectroscopic binaries in the Magellanic Clouds. The systems have very short periods (1.4–2.25 d), represent rare, young evolutionary stages of massive stars and binaries, and provide a unique glimpse at some of the most massive systems that form in dense clusters of massive stars. Improved orbit parameters, including revised masses, for LH54-425 are derived from new CTIO spectroscopy. The systems are: LH54-425 in the LMC (O3V + O5V, P=2.25 d, 62+37 M⊙), J053441-693139 in the LMC (O2-3If + O6V, P=1.4 d, 41+27 M⊙), and Hodge 53-47 in the SMC (O6V + O4-5IIIf, P=2.2 d, 24+14 M⊙, where the O4 star appears to be less massive than the O6 star). Their short periods indicate that wind interaction and mass transfer are likely important factors in their evolution. The spectra enable quantitative and systematic studies of phase-dependent stellar wind properties, wind collision effects in O+O binaries at lower metallicities, improved radial velocity curves, and FUV spectro-photometric changes as a function of orbital phase.
Problem statement
- Geoecological process research attempts to reproduce, for large landscape sections, the processes taking place in nature and influenced by humans with the help of models
- exact recording of the characteristics of the study area is an essential prerequisite for a realistic representation
- models are currently able neither to include all ongoing processes nor to process precise input data describing the initial state
- model input data are often not available at the necessary precision
- in models, the characteristics of a study area are described via soil, groundwater influence, and land use
- land use comprises largely static elements (use types forest, water bodies, settlement) and highly dynamic elements (annual change of crop on each field)
- there is a need for detailed (spatially and temporally explicit) input of the distribution of field crops over the modelling period, since agriculture is regarded as one of the major sources of diffuse nutrient input into the ecosystem
State of research
- for recording agricultural crops from remote-sensing data, multitemporal classification has proved useful, because the various crops cannot be reliably separated on the basis of a single scene
- classification is carried out with supervised methods using training areas in the data set for which the cultivated crop is known
- additional (fuzzy) information on the probability of a crop's occurrence is included in the classification (suitability for cultivation as a function of slope, precipitation, elevation, soil)
The results of these classifications are usually not transferable to other landscape sections and growing years, because the spectral signature of a crop varies with changed soil and weather conditions.
Approach
- on the basis of satellite data and cultivation records from 15 consecutive years (35 acquisition dates), annual curves of the spectral characteristics of important field crops, independent of weather and soil and describing the growth cycle of the plants, were to be derived
- these curves are to be used instead of training areas for the multitemporal classification of data from a single growing year
Conclusions and outlook
- in principle the approach appears successful, but the quality of the result still varies with the usability of the scenes employed
- the method represents a substantial advance over the previous training-area-based procedure
- at least in the study area it can be applied repeatedly without further knowledge of cultivation records; only an exact phenological dating of the scenes used is required
- for other areas (variation in precipitation and soil), the phenological dating of the curves must be adapted (their shape remains usable)
- the optimal image combination for separating all crops is: early/mid April - mid May - early July - mid August - mid September
- this combination should be obtainable given improved availability of data
- dry spells in May and June appear problematic, so that winter cereals that ripen too quickly are not recognized correctly; soil information needs to be included
- the separation of root crops remains problematic (as in previous methods) and leads to excessive shares in the result; depending on their cultivated share they are better neglected
- the inclusion of fuzzy information appears useful:
- relation between soil quality and crop (suitability of a soil for a crop)
- water availability at the site (as a function of soil storage capacity, groundwater connection, and precipitation)
- cumulative precipitation up to the acquisition date (drought index)
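The core of the approach, classifying a pixel's multitemporal signature against crop-specific seasonal reference curves instead of training areas, can be sketched as follows. The crops, index values, and distance measure are invented for illustration and are not the poster's actual data.

```python
# Hedged sketch of classification against seasonal reference curves
# (illustrative data): each crop is represented by a reference curve of
# its spectral characteristic at the acquisition dates; a pixel's
# multitemporal signature is assigned to the best-matching crop.

# Hypothetical vegetation-index values at five acquisition dates
# (early/mid April, mid May, early July, mid August, mid September).
REFERENCE_CURVES = {
    "winter wheat": [0.3, 0.7, 0.8, 0.3, 0.2],
    "sugar beet":   [0.1, 0.2, 0.6, 0.8, 0.7],
    "grassland":    [0.5, 0.6, 0.6, 0.6, 0.5],
}

def classify(pixel_signature):
    """Assign the crop whose reference curve has the smallest
    sum-of-squares distance to the observed signature."""
    def distance(curve):
        return sum((a - b) ** 2 for a, b in zip(pixel_signature, curve))
    return min(REFERENCE_CURVES, key=lambda crop: distance(REFERENCE_CURVES[crop]))
```

The phenological-dating adaptation mentioned above would correspond to shifting the reference curves along the time axis before matching; the fuzzy site information could enter as per-crop weights on the distance.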
EMOOCs 2021
(2021)
From June 22 to June 24, 2021, Hasso Plattner Institute, Potsdam, hosted the seventh European MOOC Stakeholder Summit (EMOOCs 2021) together with the eighth ACM Learning@Scale Conference.
Due to the COVID-19 situation, the conference was held fully online.
The boost in digital education worldwide as a result of the pandemic was also one of the main topics of this year’s EMOOCs. All institutions of learning have been forced to transform and redesign their educational methods, moving from traditional models to hybrid or completely online models at scale. The lessons learned, drawn from practical experience and research, were explored at EMOOCs 2021 in six tracks and additional workshops covering various aspects of this field. In this publication, we present papers from the conference’s Experience Track, the Policy Track, the Business Track, the International Track, and the Workshops.
Being "in the game"
(2008)
When people describe themselves as being “in the game” this is often thought to mean they have a sense of presence, i.e. they feel like they are in the virtual environment (Brown/Cairns 2004). Presence research traditionally focuses on user experiences in virtual reality systems (e.g. head-mounted displays, CAVE-like systems). In contrast, the experience of gaming is very different. Gamers willingly submit to the rules of the game, learn arbitrary relationships between the controls and the screen output, and take on the persona of their game character. Also, whereas presence in VR systems is immediate, presence in gaming is gradual. Due to these differences, one can question the extent to which people feel present during gaming. A qualitative study was conducted to explore what gamers actually mean when they describe themselves as being “in the game.” Thirteen gamers were interviewed, and the resulting grounded theory suggests being “in the game” does not necessarily mean presence (i.e. feeling like you are the character and present in the VE). Some people use this phrase just to emphasize their high involvement in the game. These findings differ from those of Brown and Cairns in suggesting that at the highest state of immersion not everybody experiences presence. Furthermore, the experience of presence does not appear to depend on the game being in the first-person perspective or on the gamer being able to empathize with the character. Future research should investigate why some people experience presence and others do not. Possible explanations include: use of language, perception of presence, personality traits, and types of immersion.
In a common description, to play a game is to step inside a concrete or metaphorical magic circle where special rules apply. In video game studies, this description has received an inordinate amount of criticism which the paper argues has two primary sources: 1. a misreading of the basic concept of the magic circle and 2. a somewhat rushed application of traditional theoretical concerns onto games. The paper argues that games studies must move beyond conventional criticisms of binary distinctions and rather look at the details of how games are played. Finally, the paper proposes an alternative metaphor for game-playing, the puzzle piece.
An account is presented of the focus properties, common ground effect and dialogue behaviour of the accented German discourse marker "doch" and the accented sentence negation "nicht". It is argued that "doch" and "nicht" evoke as a focus alternative the logical complement of the proposition expressed by the sentence in which they occur, and that an analysis in terms of contrastive focus accounts for their effect on the common ground and their function in dialogue.
The paper discusses the distribution and meaning of the additive particle -m@s in Ishkashimi. -m@s receives different semantic associations while staying in the same syntactic position. Thus, structurally combined with an object, it can semantically associate with the focused object or with the whole focused VP; similarly, combined with the subject it can semantically associate with the focused subject and with the whole focused sentence.
The papers collected in this volume were presented at a Graduate/Postgraduate Student Conference with the title Information Structure: Empirical Perspectives on Theory held on December 2 and 3, 2011 at Potsdam-Griebnitzsee. The main goal of the conference was to connect young researchers working on information structure (IS) related topics and to discuss various IS categories such as givenness, focus, topic, and contrast. The aim of the conference was to find at least partial answers to the following questions: What IS categories are necessary? Are they gradient/continuous? How can one deal with optionality or redundancy? How are IS categories encoded grammatically? How do different empirical methods contribute to distinguishing between the influence of different IS categories on language comprehension and production? To answer these questions, a range of languages (Avatime, Chinese, German, Ishkashimi, Modern Greek, Old Saxon, Russian, Russian Sign Language and Sign Language of the Netherlands) and a range of phenomena from phonology, semantics, and syntax were investigated. The presented theories and data were based on different kinds of linguistic evidence: syntactic and semantic fieldwork, corpus studies, and phonological experiments. The six papers presented in this volume discuss a variety of IS categories, such as emphasis and contrast (Stavropoulous, Titov), association with focus and topics (van Putten, Karvovskaya), and givenness and backgrounding (Kimmelmann, Röhr).
Abbildende Spektrometrie
(2006)
E-Learning Symposium 2018
(2018)
In den vergangenen Jahren sind viele E-Learning-Innovationen entstanden. Einige davon wurden auf den vergangenen E-Learning Symposien der Universität Potsdam präsentiert: Das erste E-Learning Symposium im Jahr 2012 konzentrierte sich auf unterschiedliche Möglichkeiten der Studierendenaktivierung und Lehrgestaltung. Das Symposium 2014 rückte vor allem die Studierenden ins Zentrum der Aufmerksamkeit. Im Jahr 2016 kam es durch das Zusammengehen des Symposiums mit der DeLFI-Tagung zu einer Fokussierung auf technische Innovationen. Doch was ist aus den Leuchttürmen von gestern geworden, und brauchen wir überhaupt noch neue Leuchttürme? Das Symposium setzt sich in diesem Jahr unter dem Motto „Innovation und Nachhaltigkeit – (k)ein Gegensatz?“ mit mediengestützten Lehr- und Lernprozessen im universitären Kontext auseinander und reflektiert aktuelle technische sowie didaktische Entwicklungen mit Blick auf deren mittel- bis langfristigen Einsatz in der Praxis.
Dieser Tagungsband zum E-Learning Symposium 2018 an der Universität Potsdam beinhaltet eine Mischung von Forschungs- und Praxisbeiträgen aus verschiedenen Fachdisziplinen und eröffnet vielschichtige Perspektiven auf das Thema E-Learning. Dabei werden die Vielfalt der didaktischen Einsatzszenarien als auch die Potentiale von Werk-zeugen und Methoden der Informatik in ihrem Zusammenspiel beleuchtet.
In seiner Keynote widmet sich Reinhard Keil dem Motto des Symposiums und geht der Nachhaltigkeit bei E-Learning-Projekten auf den Grund. Dabei analysiert und beleuchtet er anhand seiner über 15-jährigen Forschungspraxis die wichtigsten Wirkfaktoren und formuliert Empfehlungen zur Konzeption von E-Learning-Projekten. Im Gegensatz zu rein auf Kostenersparnis ausgerichteten (hochschul-)politischen Forderungen proklamiert er den Ansatz der hypothesengeleiteten Technikgestaltung, in der Nachhaltigkeit als Leitfrage oder Forschungsstrategie verstanden werden kann. In eine ähnliche Richtung geht der Beitrag von Iris Braun et al., die über Erfolgsfaktoren beim Einsatz von Audience Response Systemen in der universitären Lehre berichten.
Ein weiteres aktuelles Thema, sowohl für die Bildungstechnologie als auch in den Bildungswissenschaften allgemein, ist die Kompetenzorientierung und –modellierung. Hier geht es darum (Problemlöse-)Fähigkeiten gezielt zu beschreiben und in den Mittelpunkt der Lehre zu stellen. Johannes Konert stellt in einem eingeladenen Vortrag zwei Projekte vor, die den Prozess beginnend bei der Definition von Kompetenzen, deren Modellierung in einem semantischen maschinenlesbaren Format bis hin zur Erarbeitung von Methoden zur Kompetenzmessung und der elektronischen Zertifizierung aufzeigen. Dabei geht er auf technische Möglichkeiten, aber auch Grenzen ein. Auf einer spezifischeren Ebene beschäftigt sich Sarah Stumpf mit digitalen bzw. mediendidaktischen Kompetenzen im Lehramtsstudium und stellt ein Framework für die Förderung ebensolcher Kompetenzen bei angehenden Lehrkräften vor.
The use of e-learning still poses several challenges. These often concern the connection between didactics and technology, maintaining attention, or the effort required to create interactive teaching and learning content. Three contributions in this volume address these issues in different contexts and present best practices and possible solutions: the contribution by Martina Wahl and Michael Hölscher deals with the particular context of blended learning scenarios in part-time degree programmes. The contribution by Ennio Marani and Isabel Jaisli concerns the publication of a globally freely available online course outside the major MOOC platforms and the associated didactic challenges, also with regard to motivation. Finally, Gregor Damnik et al. propose the automatic generation of exercises to increase interactivity and adaptivity in digital learning resources, in order to reduce the sometimes considerable effort of creating them.
Mobile apps and games are also part of the e-learning landscape. Two contributions deal with the use of e-learning tools in the healthcare context: Anna Tscherejkina and Anna Morgiel present mini-games for training socio-emotional competences in people with autism, and Stephanie Herbstreit et al. report on the use of a mobile learning app to improve clinical practical teaching.
The workshop addressed all questions concerning the teaching of computer science topics in higher education. These include, among others: subject-didactic concepts for teaching individual computer science topics; methodological solutions, such as special forms of teaching and learning, and implementation concepts; study concepts and curricula, particularly in connection with bachelor's and master's degree programmes; e-learning approaches, provided they pursue a recognisable didactic concept; and empirical results and comparative studies. The conference addressed selected questions from this complex of topics, which were treated in depth in talks by renowned experts, in submitted contributions, and in a presentation.
On the basis of the Dynamic Syntax framework, this paper argues that the production pressures in dialogue that determine alignment effects and given-versus-new information effects also drive the shift from case-rich free word order systems without clitic pronouns to systems with clitic pronouns in rigid relative order. The paper introduces the assumptions of Dynamic Syntax, in particular the building up of interpretation through structural underspecification and update, sketches the attendant account of production with close coordination of parsing and production strategies, and shows how what was, at the Latin stage, a purely pragmatic, production-driven decision about linear ordering became encoded in the clitics of the Medieval Spanish system, which then, through successive steps of routinization, yielded the modern systems with immediately pre-verbal fixed clitic templates.
We model the line profile variability (lpv) in spectra of clumped stellar atmospheres using the Stochastic Clump Model (SCM) of the winds of early-type stars. In this model, the formation of dense inhomogeneities (clumps) in the line-driven winds is treated as a stochastic process. The emission due to clumps is assumed to contribute mainly to the intensities of emission lines in the stellar spectra. It is shown that, within the framework of the SCM, it is possible to reproduce both the mean line profiles and a common pattern of the lpv.
In this paper, doubling in Russian Sign Language (RSL) and Sign Language of the Netherlands (NGT) is discussed. In both sign languages, different constituents (including verbs, nouns, adjectives, adverbs, and whole clauses) can be doubled. It is shown that, despite some differences, doubling in both languages has common functions and exhibits a similar structure. On this basis, a unified pragmatic explanation for many doubling phenomena on both the discourse and the clause-internal levels is provided: the main function of doubling in both RSL and NGT is foregrounding of the doubled information.
Developmental Gains in Physical Fitness Components of Keyage and Older-than-Keyage Third-Graders
(2022)
Children who were enrolled according to legal enrollment dates (i.e., keyage third-graders aged eight to nine years) exhibit a positive linear development of physical fitness (Fühner et al., 2021). However, children who were enrolled with a one-year delay or who repeated a grade (i.e., older-than-keyage [OTK] children aged nine to ten years in third grade) appear to exhibit poorer physical fitness than would be expected given their chronological age (Fühner et al., 2022). Because Fühner et al. (2022) compared the performance of OTK children to predicted test scores extrapolated from the data of keyage children, the observed physical fitness of these children could indicate either a delayed physical fitness development or physiological or psychological changes occurring during the tenth year of life. We investigate four hypotheses about this effect. (H1) OTK children are biologically younger than keyage children. A formula transforming OTK children's chronological age into a proxy for their biological age brings some of the observed cross-sectional age-related development in line with the development predicted from the data of keyage children, but large negative group differences remain. Hypotheses 2 to 4 were tested with a longitudinal assessment. (H2) Physiological changes due to biological maturation or psychological factors cause a stagnation of physical fitness development in the tenth year of life. H2 predicts a decline in performance from third to fourth grade for keyage children as well. (H3) OTK children exhibit an age-related (temporary) developmental delay in the tenth year of life but later catch up to the performance of age-matched keyage children. H3 predicts a larger developmental gain from third to fourth grade for OTK than for keyage children. (H4) OTK children exhibit a sustained physical fitness deficit and do not catch up over time.
H4 predicts a positive development for both keyage and OTK children, with no greater gain for OTK than for keyage children. The longitudinal study was based on a subset of children from the EMOTIKON project (www.uni-potsdam.de/emotikon). The physical fitness (cardiorespiratory endurance [6-minute-run test], coordination [star-run test], speed [20-m sprint test], lower-limb [standing long jump test] and upper-limb [ball push test] muscle power, and balance [one-legged stance test]) of 1,274 children (1,030 keyage and 244 OTK children) from 32 schools was tested in third grade and retested one year later in fourth grade. Results: (a) Both keyage and OTK children exhibit a positive longitudinal development from third to fourth grade in all six physical fitness components. (b) There is no evidence for a different longitudinal development of keyage and OTK children. (c) Keyage children (approximately 9.5 years old in fourth grade) outperform age-matched OTK children (approximately 9.5 years old in third grade) in all six physical fitness components. The results show that the physical fitness of OTK children is indeed impaired and support a sustained difference in physical fitness between the groups of keyage and OTK children (H4).
We describe an experiment to gather original data on geometrical aspects of pointing. In particular, we focus on the concept of the pointing cone, a geometrical model of a pointing gesture's extension. In our setting, we employed methodological and technical procedures of a new type to integrate data from annotations as well as from tracker recordings. We combined exact information on position and orientation with raters' classifications. Our first results seem to challenge classical linguistic and philosophical theories of demonstration in that they suggest separating pointing from reference.
The influence of the wind on the total continuum of OB supergiants is discussed. For wind velocity distributions with β > 1.0, the wind can have a strong influence on the total continuum emission, even at optical wavelengths. Comparing the continuum emission of clumped and unclumped winds, especially for stars with high β values, yields flux differences of up to 30%, with a maximum in the near-IR. Continuum observations at these wavelengths are therefore an ideal tool to discriminate between clumped and unclumped winds of OB supergiants.
We study the influence of clumping on the predicted wind structure of O-type stars. For this purpose, we artificially include clumping in our stationary wind models. When the clumps are assumed to be optically thin, the radiative line force increases compared to corresponding unclumped models, with a similar effect on either the mass-loss rate or the terminal velocity (depending on the onset of clumping). Optically thick clumps, in contrast, might be able to decrease the radiative force.
The rigorous development, application, and validation of distributed hydrological models requires evaluating data in a spatially distributed way. In particular, spatial model predictions, such as the distribution of soil moisture, runoff-generating areas, nutrient-contributing areas, or erosion rates, have to be assessed against spatially distributed observations. Model inputs, such as the distribution of modelling units derived by GIS and remote sensing analyses, should also be evaluated against ground-based observations of landscape characteristics. So far, however, quantitative methods of spatial field comparison have rarely been used in hydrology. In this paper, we present algorithms for comparing observed and simulated spatial hydrological data. The methods can be applied to binary and categorical data on regular grids. They comprise cell-by-cell algorithms, cell-neighbourhood approaches that account for fuzziness of location, and multi-scale algorithms that evaluate the similarity of spatial fields with changing resolution. All methods provide a quantitative measure of the similarity of two maps. The comparison methods are applied in two mountainous catchments in southern Germany (Brugga, 40 km²) and Austria (Löhnersbach, 16 km²). As an example of binary hydrological data, the distribution of saturated areas is analyzed in both catchments. For categorical data, vegetation zones that are associated with different runoff generation mechanisms are analyzed in the Löhnersbach. Mapped spatial patterns are compared to simulated patterns from terrain index calculations and from satellite image analysis. We discuss how particular features of visual similarity between the spatial fields are captured by the quantitative measures, leading to recommendations on suitable algorithms in the context of evaluating distributed hydrological models.
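The cell-by-cell and cell-neighbourhood comparisons described in this abstract can be illustrated with a minimal sketch for binary maps on regular grids. This is not the paper's actual implementation: the function names and the specific hit criterion (a match counts if a "true" cell of the other map lies within a Chebyshev radius) are assumptions chosen for illustration.

```python
import numpy as np

def cell_by_cell_agreement(observed, simulated):
    """Fraction of grid cells on which two binary maps agree."""
    observed = np.asarray(observed, dtype=bool)
    simulated = np.asarray(simulated, dtype=bool)
    return float(np.mean(observed == simulated))

def neighbourhood_agreement(observed, simulated, radius=1):
    """Cell-neighbourhood agreement accounting for fuzziness of location:
    a 'true' cell in one map counts as a hit if the other map has a
    'true' cell within `radius` cells (Chebyshev distance)."""
    observed = np.asarray(observed, dtype=bool)
    simulated = np.asarray(simulated, dtype=bool)

    def dilate(mask, r):
        # Binary dilation with a (2r+1) x (2r+1) square structuring element,
        # implemented by OR-ing shifted copies of the mask.
        rows, cols = mask.shape
        out = np.zeros_like(mask)
        for dr in range(-r, r + 1):
            for dc in range(-r, r + 1):
                shifted = np.zeros_like(mask)
                src = mask[max(0, -dr):rows - max(0, dr),
                           max(0, -dc):cols - max(0, dc)]
                shifted[max(0, dr):rows - max(0, -dr),
                        max(0, dc):cols - max(0, -dc)] = src
                out |= shifted
        return out

    hits_sim = simulated & dilate(observed, radius)   # simulated cells confirmed
    hits_obs = observed & dilate(simulated, radius)   # observed cells reproduced
    total = int(simulated.sum() + observed.sum())
    if total == 0:
        return 1.0  # two empty maps agree trivially
    return (int(hits_sim.sum()) + int(hits_obs.sum())) / total
```

With `radius=0` the neighbourhood measure reduces to an exact-overlap score, while larger radii reward simulated patterns that are displaced by a few cells, which is the motivation behind fuzzy-location comparisons of, e.g., mapped versus simulated saturated areas.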
HPI Future SOC Lab
(2015)
The Future SOC Lab at HPI is a cooperation between the Hasso-Plattner-Institut and various industry partners. Its mission is to enable and promote exchange between the research community and industry.
The lab provides interested researchers with an infrastructure of state-of-the-art hardware and software free of charge for research purposes. This includes technologies that are in some cases not yet available on the market and that a typical university could usually not afford, e.g. servers with up to 64 cores and 2 TB of main memory. These offerings are aimed in particular at researchers in the fields of computer science and business informatics. Focus areas include cloud computing, parallelization, and in-memory technologies.
This technical report presents the results of the research projects of the year 2015. Selected projects presented their results on April 15, 2015 and November 4, 2015 at the Future SOC Lab Day events.