We describe an experiment to gather original data on geometrical aspects of pointing. In particular, we focus on the concept of the pointing cone, a geometrical model of a pointing gesture's extension. In our setting we employed methodological and technical procedures of a new type to integrate data from annotations as well as from tracker recordings. We combined exact information on position and orientation with raters' classifications. Our first results seem to challenge classical linguistic and philosophical theories of demonstration in that they suggest separating pointing from reference.
Perfectionism is a personality disposition characterized by setting extremely high performance standards coupled with critical self-evaluations. Often conceived as positive, perfectionism can yield not only beneficial but also deleterious outcomes ranging from anxiety to burnout. In this proposal, we set out to investigate the role of technology and, particularly, social media in individuals' strivings for perfection. We lay out the theoretical basis for the possibility that social media plays a role in the development of perfectionism. To empirically test the hypothesized relationship, we propose a comprehensive study design based on the experience sampling method. Lastly, we provide an overview of the planned analysis and future steps.
Coming back for more
(2022)
Recent spikes in social networking site (SNS) usage times have launched investigations into the reasons for excessive SNS usage. Extending research on social factors (i.e., fear of missing out), this study considers the News Feed setup. More specifically, we suggest that the order of the News Feed (chronological vs. algorithmically assembled posts) affects usage behaviors. Against the background of the variable reward schedule, this study hypothesizes that the different orders afford serendipity to different degrees. Serendipity, defined as unexpected lucky encounters with information, resembles variable rewards. Studies have evidenced a relation between variable rewards and excessive behaviors. Similarly, we hypothesize that order-induced serendipitous encounters affect SNS usage times and explore this link in a two-wave survey with an experimental setup (users using either chronological or algorithmic News Feeds). While theoretically extending explanations for increased SNS usage times by considering the News Feed order, practically the study will offer recommendations for relevant stakeholders.
The influence of the wind on the total continuum of OB supergiants is discussed. For wind velocity distributions with β > 1.0, the wind can have a strong influence on the total continuum emission, even at optical wavelengths. Comparing the continuum emission of clumped and unclumped winds, especially for stars with high β values, yields flux differences of up to 30%, with the maximum in the near-IR. Continuum observations at these wavelengths are therefore an ideal tool to discriminate between clumped and unclumped winds of OB supergiants.
The devil in disguise
(2021)
Envy constitutes a serious issue on Social Networking Sites (SNSs), as this painful emotion can severely diminish individuals' well-being. With prior research mainly focusing on the affective consequences of envy in the SNS context, its behavioral consequences remain puzzling. While negative interactions among SNS users are an alarming issue, it remains unclear to what extent the harmful emotion of malicious envy contributes to these toxic dynamics. This study constitutes a first step in understanding malicious envy's causal impact on negative interactions within the SNS sphere. In an online experiment, we induce malicious envy and measure its immediate impact on users' negative behavior towards other users. Our findings show that malicious envy seems to be an essential factor fueling negativity among SNS users and further illustrate that this effect is especially pronounced when users are provided with an objective factor to mask their envy and justify their norm-violating negative behavior.
The envy spiral
(2020)
On Social Networking Sites (SNS), users disclose mostly positive and often self-enhancing information. Scholars refer to this phenomenon as the positivity bias in SNS communication (PBSC). However, while theoretical explanations for this phenomenon have been proposed, empirical proof of these theorized mechanisms is still missing. The project presented in this Research-in-Progress paper aims at explaining the PBSC with the mechanism specified in the self-enhancement envy spiral. Specifically, we hypothesize that feelings of envy drive people to post positive and self-enhancing content on SNS. To test this hypothesis, we developed an experimental design that allows us to examine the causal effect of envy on the positivity of users' subsequently posted content. In a preliminary study, we tested our manipulation of envy and showed its effectiveness in inducing different levels of envy between our groups. Our project will help to broaden the understanding of the complex dynamics of SNS and the potentially adverse driving forces underlying them.
We study the influence of clumping on the predicted wind structure of O-type stars. For this purpose we artificially include clumping into our stationary wind models. When the clumps are assumed to be optically thin, the radiative line force increases compared to corresponding unclumped models, with a similar effect on either the mass-loss rate or the terminal velocity (depending on the onset of clumping). Optically thick clumps, alternatively, might be able to decrease the radiative force.
The rigorous development, application and validation of distributed hydrological models requires evaluating data in a spatially distributed way. In particular, spatial model predictions, such as the distribution of soil moisture, runoff-generating areas, nutrient-contributing areas or erosion rates, are to be assessed against spatially distributed observations. Model inputs, such as the distribution of modelling units derived by GIS and remote sensing analyses, should also be evaluated against ground-based observations of landscape characteristics. So far, however, quantitative methods of spatial field comparison have rarely been used in hydrology. In this paper, we present algorithms for comparing observed and simulated spatial hydrological data. The methods can be applied to binary and categorical data on regular grids. They comprise cell-by-cell algorithms, cell-neighbourhood approaches that account for fuzziness of location, and multi-scale algorithms that evaluate the similarity of spatial fields with changing resolution. All methods provide a quantitative measure of the similarity of two maps. The comparison methods are applied in two mountainous catchments in southern Germany (Brugga, 40 km²) and Austria (Löhnersbach, 16 km²). As an example of binary hydrological data, the distribution of saturated areas is analyzed in both catchments. For categorical data, vegetation zones that are associated with different runoff generation mechanisms are analyzed in the Löhnersbach. Mapped spatial patterns are compared to simulated patterns from terrain index calculations and from satellite image analysis. It is discussed how particular features of visual similarity between the spatial fields are captured by the quantitative measures, leading to recommendations on suitable algorithms in the context of evaluating distributed hydrological models.
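As a minimal illustration of the kinds of comparison described above (not the authors' implementation), the following Python sketch computes a strict cell-by-cell agreement score and a simple cell-neighbourhood score that tolerates small location errors for binary maps on regular grids; the window radius and the toy maps are assumptions made only for this example.

```python
# Illustrative sketch: two comparison ideas for binary patterns on regular grids.
import numpy as np

def cell_by_cell_agreement(obs, sim):
    """Fraction of cells in which observed and simulated binary maps agree."""
    obs, sim = np.asarray(obs, bool), np.asarray(sim, bool)
    return np.mean(obs == sim)

def neighbourhood_agreement(obs, sim, radius=1):
    """Cell-neighbourhood comparison that accounts for fuzziness of location:
    a simulated 'wet' cell counts as a hit if an observed 'wet' cell lies
    within the given window radius, and vice versa."""
    obs, sim = np.asarray(obs, bool), np.asarray(sim, bool)
    def dilate(a, r):
        # grow the pattern by r cells in every direction
        # (np.roll wraps at the grid edges; adequate for this toy example)
        out = np.zeros_like(a)
        for di in range(-r, r + 1):
            for dj in range(-r, r + 1):
                out |= np.roll(np.roll(a, di, axis=0), dj, axis=1)
        return out
    hits_sim = (sim & dilate(obs, radius)).sum() / max(sim.sum(), 1)
    hits_obs = (obs & dilate(sim, radius)).sum() / max(obs.sum(), 1)
    return 0.5 * (hits_sim + hits_obs)

# Toy observed vs. simulated saturated-area maps
obs = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
sim = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
print(cell_by_cell_agreement(obs, sim))   # strict, per-cell score
print(neighbourhood_agreement(obs, sim))  # more forgiving of location errors
```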
HPI Future SOC Lab
(2015)
The Future SOC Lab at HPI is a cooperation between the Hasso Plattner Institute and various industry partners. Its mission is to enable and promote exchange between the research community and industry.
The lab provides interested researchers with an infrastructure of state-of-the-art hardware and software, free of charge, for research purposes. This includes technologies that are in some cases not yet available on the market and that would usually be unaffordable in a normal university setting, for example servers with up to 64 cores and 2 TB of main memory. These offers are aimed in particular at researchers in computer science and information systems. Focus areas include cloud computing, parallelization and in-memory technologies.
This technical report presents the results of the research projects carried out in 2015. Selected projects presented their results on 15 April 2015 and 4 November 2015 at the Future SOC Lab Day events.
Mass accretion onto compact objects through accretion disks is a common phenomenon in the universe. It is seen in all energy domains from active galactic nuclei through cataclysmic variables (CVs) to young stellar objects. Because CVs are fairly easy to observe, they provide an ideal opportunity to study accretion disks in great detail and thus help us to understand accretion also in other energy ranges. Mass accretion in these objects is often accompanied by mass outflow from the disks. This accretion disk wind, at least in CVs, is thought to be radiatively driven, similar to O star winds. WOMPAT, a 3-D Monte Carlo radiative transfer code for accretion disk winds of CVs is presented.
Intrasession reliability of insole in-shoe plantar pressure measurements in different foot areas
(2012)
Data sharing requires researchers to publish their (primary) data and any supporting research materials. With increased attention on reproducibility and more transparent research requiring sharing of data, the issues surrounding data sharing are moving beyond whether data sharing is beneficial, to what kind of research data should be shared and how. However, despite its benefits, data sharing still is not common practice in Information Systems (IS) research. The panel seeks to discuss the controversies related to data sharing in research, specifically focusing on the IS discipline. It remains unclear how the positive effects of data sharing that are often framed as extending beyond the individual researcher (e.g., openness for innovation) can be utilized while reducing the downsides often associated with negative consequences for the individual researcher (e.g., losing a competitive advantage). To foster data sharing practices in IS, the panel will address this dilemma by drawing on the panelists’ expertise.
Visual Social Networking Sites (SNSs) enable users to present themselves favorably to gain likes and the attention of others. Especially, Instagram is known for its focus on beauty, fitness, fashion, and dietary topics. Although a large body of research reports negative weight-related outcomes of SNS usage (e.g., body dissatisfaction, body image concerns), studies examining how SNS usage relates to these outcomes are scarce. Based on the visual normalization theory, we argue that SNS content facilitates normalization of so-called thin- and fit-ideals, thereby leading to biased perceptions of the average body weight in society. Therefore, this study tests whether Instagram use is associated with perceiving that the average person weighs less. Responses of 181 survey participants confirm that Instagram use is negatively related to average weight perception of both women and men. These findings contribute to the growing body of research on how SNS use relates to negative weight-related outcomes.
We present a formal analysis of iconic coverbal gesture. Our model describes the incomplete meaning of gesture that’s derivable from its form, and the pragmatic reasoning that yields a more specific interpretation. Our formalism builds on established models of discourse interpretation to capture key insights from the descriptive literature on gesture: synchronous speech and gesture express a single thought, but while the form of iconic gesture is an important clue to its interpretation, the content of gesture can be resolved only by linking it to its context.
The digitalization of value networks holds out the prospect of many advantages for the participating companies. Utilizing information platforms, cross-company data exchange enables increased efficiency of collaboration and offers space for new business models and services. In addition to the technological challenges, the fear of know-how leakage appears to be a significant roadblock that hinders the beneficial realization of new business models in digital ecosystems. This paper provides the necessary building blocks of digital participation and, in particular, identifies trust creation as a significant success factor within them. Based on these findings, it presents a solution concept that, by linking the identified building blocks, offers the individual actors of the digital value network the opportunity to retain sovereignty over their data and know-how and to use the potential of extensive networking. In particular, the presented concept takes into account the relevant dilemma that every actor (e.g., the machine users) has to be able to control his communicated data at any time and must have sufficient possibilities for intervention that, on the one hand, satisfy the need for protection of his knowledge and, on the other hand, do not excessively diminish the benefits of the system or the business. Taking up this perspective, this paper introduces a dedicated data sovereignty concept and shows a possible implementation.
We present XMM-Newton Reflection Grating Spectrometer observations of pairs of X-ray emission line profiles from the O star ζ Pup that originate from the same He-like ion. The two profiles in each pair have different shapes and cannot both be consistently fit by models assuming the same wind parameters. We show that the differences in profile shape can be accounted for in a model including the effects of resonance scattering, which affects the resonance line in the pair but not the intercombination line. This implies that resonance scattering is also important in single resonance lines, where its effect is difficult to distinguish from a low effective continuum optical depth in the wind. Thus, resonance scattering may help reconcile X-ray line profile shapes with literature mass-loss rates.
Landscape aesthetics drawing on philosophy and psychology allow us to understand computer games from a new angle. The landscapes of computer games can be understood as environments or images. This difference creates two options: 1. We experience environments or images, or 2. We experience landscape simultaneously as both. Psychologically, the first option can be backed up by a Vygotskian framework (this option highlights certain non-mainstream subject positions), the second by a Piagetian one (highlighting cognitive mapping of game worlds).
This text compares the special characteristics of the game space in computer-generated environments with that of non-computerized playing situations. In doing so, it challenges the concept of the magic circle as a deliberately delineated playing sphere with specific rules to be upheld by the players. Computer games, too, provide a virtual playing environment containing the rules of the game as well as the various action possibilities. But both the hardware and the software facilitate the player's actions rather than constraining them. This makes computer games fundamentally different: in contrast to traditional game spaces or limits, the computer-generated environment does not rely on the awareness of the player to uphold these rules. Thus, there is no magic circle.
Clumping in Galactic WN stars : a comparison of mass loss rates from UV/optical & radio diagnostics
(2007)
The mass loss rates and other parameters for a large sample of Galactic WN stars have been revised by Hamann et al. (2006), using the most up-to-date Potsdam Wolf-Rayet (PoWR) model atmospheres. For a sub-sample of these stars, measurements of their radio free-free emission exist. After harmonizing the adopted distances and terminal wind velocities, we compare the mass loss rates obtained from the two diagnostics. The differences are discussed as a possible consequence of different clumping contrasts in the line-forming and radio-emitting regions.
Scaling up CSP
(2023)
Concentrating solar power (CSP) is one of the few scalable technologies capable of delivering dispatchable renewable power. Therefore, many expect it to shoulder a significant share of system balancing in a renewable electricity future powered by cheap, intermittent PV and wind power: the IEA, for example, projects 73 GW of CSP by 2030 and several hundred GW by 2050 in its Net-Zero by 2050 pathway. In this paper, we assess how fast CSP can be expected to scale up and how long it would take to get new, high-efficiency CSP technologies to market, based on observed trends and historical patterns. We find that to meaningfully contribute to net-zero pathways, the CSP sector needs to reach and exceed the maximum historical annual growth rate of 30%/year last seen between 2010 and 2014 and maintain it for at least two decades. Any CSP deployment in the 2020s will rely mostly on mature existing technologies, namely parabolic troughs and molten-salt towers, but likely with adapted business models such as hybrid CSP-PV stations, combining the advantages of higher-cost dispatchable and low-cost intermittent power. New third-generation CSP designs are unlikely to play a role in markets during the 2020s, as they are still at or before the pilot stage and, judging from past pilot-to-market cycles for CSP, they will likely not be ready for market deployment before 2030. CSP can contribute to low-cost zero-emission energy systems by 2050, but to make that happen at the scale foreseen in current energy models, ambitious technology-specific policy support is necessary, as soon as possible and in several countries.
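For a sense of scale of the growth requirement stated above, the following back-of-the-envelope sketch compounds a 30%/year growth rate over several years; the starting capacity is a hypothetical placeholder, not a figure taken from the paper.

```python
# Compound-growth arithmetic only; the starting capacity is an assumed placeholder.
initial_capacity_gw = 7.0   # assumed installed CSP capacity at the start (NOT from the paper)
growth_rate = 0.30          # the historical maximum annual growth rate cited above

for years in (7, 20):
    capacity_gw = initial_capacity_gw * (1 + growth_rate) ** years
    print(f"{years} years at 30%/yr: {initial_capacity_gw:.0f} GW -> {capacity_gw:.0f} GW")
```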
Insertion of artificial cell surface receptors for antigen-specific labelling of hybridoma cells
(2012)
Most play spaces support completely different actions than we normally would think of when moving through real space, out of play. This paper therefore discusses the relationship between selected game rules and game spaces in connection to the behaviors, or possible behaviors, of the player. Space will be seen as a modifier or catalyst of player behavior. Six categories of game space are covered: Joy of movement, exploration, tactical, social, performative, and creative spaces. Joy of movement is examined in detail, with a briefer explanation of the other categories.
We apply the 3-dimensional radiative transport code Wind3D to 3D hydrodynamic models of Corotating Interaction Regions to fit the detailed variability of Discrete Absorption Components observed in Si IV UV resonance lines of HD 64760 (B0.5 Ib). We discuss important effects of the hydrodynamic input parameters on these large-scale equatorial wind structures that determine the detailed morphology of the DACs computed with 3D transfer. The best fit model reveals that the CIR in HD 64760 is produced by a source at the base of the wind that lags behind the stellar surface rotation. The non-corotating coherent wind structure is an extended density wave produced by a local increase of only 0.6% in the smooth symmetric wind mass-loss rate.
Both Alpine and Mediterranean areas are considered sensitive to so-called global change, understood as the combination of climate and land-use changes. All panels on climate evolution predict future scenarios of increasing frequency and magnitude of floods, which are likely to lead to huge geomorphic adjustments of river channels; a major metamorphosis of fluvial systems is therefore expected as a result of global change. Such pressures are likely to give rise to major ecological and economic changes and challenges that governments need to address as a matter of priority. Changes in river flow regimes associated with global change are therefore ushering in a new era, in which there is a critical need to evaluate hydro-geomorphological hazards from headwaters to lowland areas (flooding is not just a problem of being under water). A key question is how our understanding of these hazards associated with global change can be improved. Improvement has to come from integrated research that includes the climatological and physical conditions that could influence the hydrology and sediment generation, and hence the conveyance of water and sediments (including the river's capacity, i.e. amount of sediment, and competence, i.e. channel deformation), as well as the vulnerabilities and economic repercussions of changing hydrological hazards (including the evaluation of the hydro-geomorphological risks).
Within this framework, the purpose of this international symposium is to bring together researchers from several disciplines, such as hydrology, fluvial geomorphology, hydraulic engineering, environmental science, geography, economics (and any other related discipline), to discuss the effects of global change on the river system in relation to floods. The symposium is organized by means of invited talks given by prominent experts, oral lectures, poster sessions and discussion sessions for each individual topic; it will try to improve our understanding of how rivers are likely to evolve as a result of global change and hence address the flooding hazards associated with that fluvial environmental change.
Four main topics are going to be addressed:
- Modelling global change (i.e. climate and land-use) at relevant spatial (regional, local) and temporal (from the long-term to the single-event) scales.
- Measuring and modelling river floods from the hydrological, sediment transport (both suspended and bedload) and channel morphology points of view at different spatial (from the catchment to the reach) and temporal (from the long-term to the single-event) scales.
- Evaluation and assessment of current and future river flooding hazards and risks in a global change perspective.
- Catchment management to face river floods in a changing world.
We are very pleased to welcome you to Potsdam. We hope you will enjoy your participation in the International Symposium on the Effects of Global Change on Floods, Fluvial Geomorphology and Related Hazards in Mountainous Rivers and have an exciting and profitable experience. Finally, we would like to thank all speakers, participants, supporters, and sponsors for their contributions, which will surely make this event a remarkable and fruitful meeting. We acknowledge the valuable support of the European Commission (Marie Curie Intra-European Fellowship, Project "Floodhazards", PIEF-GA-2013-622468, Seventh EU Framework Programme) and the Deutsche Forschungsgemeinschaft (Research Training Group "Natural Hazards and Risks in a Changing World", NatRiskChange, GRK 2043/1), as the symposium would not have been possible without their help. Without your cooperation, this symposium would not have been possible or successful.
This paper suggests an approach to studying the rhetoric of persuasive computer games through comparative analysis. A comparison of the military propaganda game AMERICA’S ARMY to similar shooter games reveals an emphasis on discipline and constraints in all main aspects of the games, demonstrating a preoccupation with ethos more than pathos. Generalizing from this, a model for understanding game rhetoric through balances of freedom and constraints is proposed.
Demonstratives, in particular gestures that "only" accompany speech, are not a big issue in current theories of grammar. If we deal with gestures, fixing their function is one big problem; the other one is how to integrate the representations originating from different channels and, ultimately, how to determine their composite meanings. The growing interest in multi-modal settings, computer simulations, human-machine interfaces and VR applications increases the need for theories of multimodal structures and events. In our workshop contribution we focus on the integration of multimodal contents and investigate different approaches dealing with this problem, such as Johnston et al. (1997) and Johnston (1998), Johnston and Bangalore (2000), Chierchia (1995), Asher (2005), and Rieser (2005).
Classical SDRT (Asher and Lascarides, 2003) discussed essential features of dialogue like adjacency pairs, corrections and updating. Recent work in SDRT (Asher, 2002, 2005) aims at the description of natural dialogue. We use this work to model situated communication, i.e. dialogue in which sub-sentential utterances and gestures (pointing and grasping) are used as conventional modes of communication. We show that in addition to cognitive modelling in SDRT, capturing mental states and speech-act related goals, special postulates are needed to extract meaning out of contexts. Gestural meaning anchors Discourse Referents in contextually given domains. Both sorts of meaning are fused with the meaning of fragments to arrive at fully developed dialogue moves. This task accomplished, the standard SDRT machinery, tagged SDRSs, rhetorical relations, the update mechanism, and the Maximize Discourse Coherence constraint generate coherent structures. In sum, meanings from different verbal and non-verbal sources are assembled using extended SDRT to form coherent wholes.
How does a shared lexicon arise in a population of agents with differing lexicons, and how can this shared lexicon be maintained over multiple generations? In order to gain some insight into these questions, we present an ALife model in which the lexicon dynamics of populations that possess and lack metacommunicative interaction (MCI) capabilities are compared. We ran a series of experiments on multi-generational populations whose initial state involved agents possessing distinct lexicons. These experiments reveal some clear differences in the lexicon dynamics of populations that acquire words solely by introspection contrasted with populations that learn using MCI or using a mixed strategy of introspection and MCI. The lexicon diverges at a faster rate for an introspective population, eventually collapsing to one single form which is associated with all meanings. This contrasts sharply with MCI-capable populations, in which a lexicon is maintained where every meaning is associated with a unique word. We also investigated the effect of increasing the meaning space and showed that it speeds up the lexicon divergence for all populations irrespective of their acquisition method.
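To make the contrast between introspective and MCI-based acquisition concrete, here is a deliberately simplified toy simulation (not the authors' ALife model): listeners copy a speaker's word either onto a guessed meaning (introspection) or onto the correct meaning revealed by a clarification exchange (MCI). All parameters are illustrative assumptions.

```python
# Toy sketch of word-learning with and without metacommunicative interaction (MCI).
import random

MEANINGS = list(range(5))

def make_agent():
    # each agent starts with its own private word form for every meaning
    return {m: f"w{random.randrange(10**6)}" for m in MEANINGS}

def interact(speaker, listener, use_mci):
    meaning = random.choice(MEANINGS)
    word = speaker[meaning]
    if use_mci:
        # a clarification request reveals the intended meaning,
        # so the word is stored under the right meaning
        listener[meaning] = word
    else:
        # introspection: the listener guesses the intended meaning
        # and may attach the word to the wrong one
        listener[random.choice(MEANINGS)] = word

def distinct_forms(population):
    return len({w for agent in population for w in agent.values()})

for use_mci in (False, True):
    random.seed(1)
    population = [make_agent() for _ in range(20)]
    for _ in range(30000):
        i, j = random.sample(range(len(population)), 2)
        interact(population[i], population[j], use_mci)
    label = "MCI" if use_mci else "introspection"
    print(label, "->", distinct_forms(population), "distinct forms for", len(MEANINGS), "meanings")
```

In such a toy run, the introspective population typically ends up with far fewer distinct forms spread across all meanings, while the MCI population tends to keep roughly one shared word per meaning, mirroring the qualitative pattern reported above.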
We discuss the results of time-resolved spectroscopy of three presumably single Population I Wolf-Rayet stars in the Small Magellanic Cloud, where the ambient metallicity is $\sim 1/5 Z_\odot$. We were able to detect and follow numerous small-scale wind-embedded inhomogeneities in all observed stars. The general properties of the moving features, such as their velocity dispersions, emissivities and average accelerations, closely match the corresponding characteristics of small-scale inhomogeneities in the winds of Galactic Wolf-Rayet stars.
Social media constitute an important arena for public debates and steady interchange of issues relevant to society. To boost their reputation, commercial organizations also engage in political, social, or environmental debates on social media. To engage in this type of digital activism, organizations increasingly utilize the social media profiles of executive employees and other brand ambassadors. However, the relationship between brand ambassadors’ digital activism and corporate reputation is only vaguely understood. The results of a qualitative inquiry suggest that digital activism via brand ambassadors can be risky (e.g., creating additional surface for firestorms, financial loss) and rewarding (e.g., emitting authenticity, employing ‘megaphones’ for industry change) at the same time. The paper informs both scholarship and practitioners about strategic trade-offs that need to be considered when employing brand ambassadors for digital activism.
We review the effects of clumping on the profiles of resonance doublets. By allowing the ratio of the doublet oscillator strengths to be a free parameter, we demonstrate that doublet profiles contain more information than is normally utilized. In clumped (or porous) winds, this ratio can lie between unity and the ratio of the f-values, and can change as a function of velocity and time, depending on the fraction of the stellar disk that is covered by material moving at a particular velocity at a given moment. Using these insights, we present the results of SEI modeling of a sample of B supergiants, ζ Pup and a time series for a star whose terminal velocity is low enough to make the components of its Si IV λλ1400 doublet independent. These results are interpreted within the framework of the Oskinova et al. (2007) model, and demonstrate how the doublet profiles can be used to extract information about wind structure.
A key problem for models of dialogue is to explain the mechanisms involved in generating and responding to clarification requests. We report a 'Maze task' experiment that investigates the effect of 'spoof' clarification requests on the development of semantic co-ordination. The results provide evidence of both local and global semantic co-ordination phenomena that are not captured by existing dialogue co-ordination models.
Breaking down barriers
(2024)
Many researchers hesitate to provide full access to their datasets due to a lack of knowledge about research data management (RDM) tools and perceived fears, such as losing the value of one's own data. Existing tools and approaches often do not take into account these fears and missing knowledge. In this study, we examined how conversational agents (CAs) can provide a natural way of guidance through RDM processes and nudge researchers towards more data sharing. This work offers an online experiment in which researchers interacted with a CA on a self-developed RDM platform and a survey on participants’ data sharing behavior. Our findings indicate that the presence of a guiding and enlightening CA on an RDM platform has a constructive influence on both the intention to share data and the actual behavior of data sharing. Notably, individual factors do not appear to impede or hinder this effect.
This paper approaches the debate over the notion of “magic circle” through an exploratory analysis of the unfolding of identities/differences in gameplay through Derrida’s différance. Initially, différance is related to the notion of play and identity/difference in Derrida’s perspective. Next, the notion of magic circle through Derrida’s play is analyzed, emphasizing the dynamics of différance to understand gameplay as process; questioning its boundaries. Finally, the focus shifts toward the implications of the interplay of identities and differences during gameplay.
In the old days (pre ∼1990) hot stellar winds were assumed to be smooth, which made life fairly easy and bothered no one. Then after suspicious behaviour had been revealed, e.g. stochastic temporal variability in broadband polarimetry of single hot stars, it took the emerging CCD technology developed in the preceding decades (∼1970-80’s) to reveal that these winds were far from smooth. It was mainly high-S/N, time-dependent spectroscopy of strong optical recombination emission lines in WR, and also a few OB and other stars with strong hot winds, that indicated all hot stellar winds likely to be pervaded by thousands of multiscale (compressible supersonic turbulent?) structures, whose driver is probably some kind of radiative instability. Quantitative estimates of clumping-independent mass-loss rates came from various fronts, mainly dependent directly on density (e.g. electron-scattering wings of emission lines, UV spectroscopy of weak resonance lines, and binary-star properties including orbital-period changes, electron-scattering, and X-ray fluxes from colliding winds) rather than the more common, easier-to-obtain but clumping-dependent density-squared diagnostics (e.g. free-free emission in the IR/radio and recombination lines, of which the favourite has always been Hα). Many big questions still remain, such as: What do the clumps really look like? Do clumping properties change as one recedes from the mother star? Is clumping universal? Does the relative clumping correction depend on $\dot{M}$ itself?
General Discussion
(2007)
Hα observations of Rigel obtained on 184 nights during the past ten years with the 1-m telescope and échelle spectrograph of Ritter Observatory are surveyed. The line profiles were classified in terms of morphology. About 1/4 of them are of P Cygni type, about 15% inverse P Cygni, about 25% double-peaked, about 1/3 pure absorption, and a few are single emission lines. Transformation of the profile from one type to another typically takes a few days. Although the line stays in absorption for extended intervals, only one high-velocity absorption event of the intensity reported by Kaufer et al. (1996a) was observed, in late 2006. Late in this event, Hα absorption occurred farther to the red than the red wing of a plausible photospheric absorption component, an indication of infalling material. In general, as the absorption events come to an end, the emission typically returns with an inverse P Cygni profile. The Hα profile class shows no obvious correlation with the radial velocity of C II λ6578, a photospheric absorption line.
Proceedings of TripleA 10
(2024)
The TripleA workshop series was founded in 2014 by linguists from Potsdam and Tübingen with the aim of providing a platform for researchers that conduct theoretically-informed linguistic fieldwork on meaning. Its focus is particularly on languages that are under-represented in the current research landscape, including but not limited to languages of Africa, Asia, and Australia, hence TripleA.
For its 10th anniversary, TripleA returned to the University of Potsdam from 7 to 9 June 2023.
The programme included 21 talks dealing with no fewer than 22 different languages, including three invited talks given by Sihwei Chen (Academia Sinica), Jérémy Pasquereau (Laboratoire de Linguistique de Nantes, CNRS) and Agata Renans (Ruhr-Universität Bochum). Nine of these (invited or peer-reviewed) talks are featured in this volume.
Extending Alexander Galloway's analysis of the action-image in videogames, this essay explores the concept in relation to its source: the analysis of cinema by the French philosopher Gilles Deleuze. The applicability of the concept to videogames is therefore considered through a comparison between the first-person shooter S.T.A.L.K.E.R. and Andrey Tarkovsky's film Stalker. This analysis explores the nature of videogame action, its relation to player perceptions and its location within the machinic and ludic schema.
Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung Workshop vom 9. - 10. Februar 2006
The game itself?
(2020)
In this paper, we reassess the notion and current state of ludohermeneutics in game studies, and propose a more solid foundation for how to conduct hermeneutic game analysis. We argue that there can be no ludo-hermeneutics as such, and that every game interpretation rests in a particular game ontology, whether implicit or explicit. The quality of this ontology, then, determines a vital aspect of the quality of the analysis.
This first volume of the DIGAREC Series holds the proceedings of the conference "The Philosophy of Computer Games", held at the University of Potsdam from May 8-10, 2008. The contributions of the conference address three fields of computer game research that are philosophically relevant and to which, likewise, philosophical reflection is crucial. These are: ethics and politics, the action-space of games, and the magic circle. All three topics are interlinked and constitute the paradigmatic object of computer games: whereas the first describes computer games from the outside, looking at the cultural effects of games as well as at the moral practices acted out with them, the second describes computer games from the inside, i.e. how they are constituted as a medium. The third, finally, discusses the way in which a border between these two realms, games and non-games, persists or is already transgressed with respect to a general performativity.
CpG-oligonucleotides modulate sphingosine-1-phosphate metabolism in normal human keratinocytes
(2012)
Metacommunicative circles
(2008)
The paper uses Gregory Bateson’s concept of metacommunication to explore the boundaries of the ‘magic circle’ in play and computer games. It argues that the idea of a self-contained “magic circle” ignores the constant negotiations among players which establish the realm of play. The “magic circle” is no fixed ontological entity but is set up by metacommunicative play. The paper further pursues the question if metacommunication could also be found in single-player computer games, and comes to the conclusion that metacommunication is implemented in single-player games by the means of metalepsis.
The optical spectrum of Eta Carinae (η Car) is prominent in H I, He I and Fe II wind lines, all of which vary both in absorption and emission with phase. The phase dependence is a consequence of the interaction between the two objects in the η Car binary (η Car A & B). The binary system is enshrouded by ejecta from previous mass ejection events and consequently, η Car B is not directly observable. We have traced the He I lines over η Car's spectroscopic period, using HST/STIS data obtained with medium spectral, but high angular, resolving power, and created a radial velocity curve for the system. The He I lines are formed in the core of the system, and appear to be a composite of multiple features formed in spatially separated regions. The sources of their irregular line profiles are still not fully understood, but can be attributed to emission/absorption near the wind-wind interface and/or a direct consequence of η Car A's massive, clumpy wind. This paper will discuss the spectral variability and the narrow emission structure of the He I lines, and how the clumpiness of the winds may impede the construction of a reliable radial velocity curve, necessary for characterizing especially η Car B.
We present the tool Kato which is, to the best of our knowledge, the first tool for plagiarism detection that is directly tailored for answer-set programming (ASP). Kato aims at finding similarities between (segments of) logic programs to help detecting cases of plagiarism. Currently, the tool is realised for DLV programs but it is designed to handle various logic-programming syntax versions. We review basic features and the underlying methodology of the tool.
X-ray spectroscopy is a sensitive probe of stellar winds. X-rays originate from optically thin shock-heated plasma deep inside the wind and propagate outwards through the absorbing cool material. Recent analyses of the line ratios from He-like ions in the X-ray spectra of O-stars highlighted problems with this general paradigm: the measured line ratios of the highest ions are consistent with the location of the hottest X-ray emitting plasma very close to the base of the wind, perhaps indicating the presence of a corona, while measurements from lower ions conform with the wind-embedded shock model. Generally, to correctly model the emerging X-ray spectra, a detailed knowledge of the cool wind opacities based on stellar atmosphere models is a prerequisite. A nearly grey stellar wind opacity for the X-rays is deduced from the analyses of high-resolution X-ray spectra. This indicates that the stellar winds are strongly clumped. Furthermore, the nearly symmetric shape of X-ray emission line profiles can be explained if the wind clumps are radially compressed. In massive binaries the orbital variations of X-ray emission allow probing of the opacity of the stellar wind; the results support the picture of strong wind clumping. In high-mass X-ray binaries, the stochastic X-ray variability and the extent of the stellar-wind part photoionized by X-rays provide further strong evidence that stellar winds consist of dense clumps.
Dynamical simulation of the “velocity-porosity” reduction in observed strength of stellar wind lines
(2007)
I use dynamical simulations of the line-driven instability to examine the potential role of the resulting flow structure in reducing the observed strength of wind absorption lines. Instead of the porosity length formalism used to model effects on continuum absorption, I suggest reductions in line strength can be better characterized in terms of a velocity clumping factor that is insensitive to spatial scales. Examples of dynamic spectra computed directly from instability simulations do exhibit a net reduction in absorption, but only at a modest 10-20% level that is well short of the ca. factor 10 required by recent analyses of PV lines.
In this work, an extension of the CSSR algorithm using Maximum Entropy Models is introduced. Preliminary experiments using this new system to perform Named Entity Recognition are presented.
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such software projects furthermore include many internal APIs that developers must understand and use properly. Depending on their intended purpose, these APIs are used more or less frequently, and by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as their use has been shown to be highly error prone in previous work. We study defect rates and developer expertise, to consider, e.g., whether widely used APIs are more defect prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
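The kind of aggregation such a study involves can be sketched as follows; the records and expertise figures below are invented for illustration only (the API names are real Linux memory-management functions, but none of the numbers come from the paper).

```python
# Illustrative aggregation of per-API defect rates and committer expertise.
from collections import defaultdict
from statistics import median

# Invented toy records: (api, committer's prior commit count, use later fixed as a defect?)
uses = [
    ("kmalloc",      500, False), ("kmalloc", 12, True),  ("kmalloc", 900, False),
    ("kfree",         12, True),  ("kfree",  300, False),
    ("devm_kzalloc",  40, False), ("devm_kzalloc", 7, True),
]

stats = defaultdict(lambda: {"uses": 0, "defects": 0, "expertise": []})
for api, prior_commits, defective in uses:
    s = stats[api]
    s["uses"] += 1
    s["defects"] += int(defective)
    s["expertise"].append(prior_commits)

for api, s in sorted(stats.items()):
    rate = s["defects"] / s["uses"]
    print(f"{api:14s} uses={s['uses']} defect_rate={rate:.2f} "
          f"median_prior_commits={median(s['expertise'])}")
```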
Increasingly fast development cycles and individualized products pose major challenges for today's smart production systems in times of Industry 4.0. The systems must be flexible and continuously adapt to changing conditions while still guaranteeing high throughput and robustness against external disruptions. Deep reinforcement learning (RL) algorithms, which already achieved impressive success with Google DeepMind's AlphaGo, are increasingly being transferred to production systems to meet related requirements. Unlike supervised and unsupervised machine learning techniques, deep RL algorithms learn from recently collected sensor and process data in direct interaction with the environment and are able to make decisions in real time. As such, deep RL algorithms seem promising given their potential to provide decision support in complex environments, such as production systems, and simultaneously adapt to changing circumstances. While different use cases for deep RL have emerged, a structured overview and integration of findings on their application are missing. To address this gap, this contribution provides a systematic literature review of existing deep RL applications in the field of production planning and control as well as production logistics. From a performance perspective, it became evident that deep RL can beat heuristics significantly in overall performance and provides superior solutions for various industrial use cases. Nevertheless, safety and reliability concerns must be overcome before widespread use of deep RL is possible, which presumes more intensive testing of deep RL in real-world applications in addition to the already ongoing intensive simulations.
Enhancing economic efficiency in modular production systems through deep reinforcement learning
(2024)
In times of increasingly complex production processes and volatile customer demands, production adaptability is crucial for a company's profitability and competitiveness. The ability to cope with rapidly changing customer requirements and unexpected internal and external events guarantees robust and efficient production processes, requiring a dedicated control concept at the shop floor level. Yet in today's practice, conventional control approaches remain in use, which may not keep up with the dynamic behaviour due to their scenario-specific and rigid properties. To address this challenge, deep learning methods have increasingly been deployed due to their optimization and scalability properties. However, these approaches have often been tested in specific operational applications and focused on technical performance indicators such as order tardiness or total throughput. In this paper, we propose a deep reinforcement learning based production control to optimize combined techno-financial performance measures. Based on pre-defined manufacturing modules that are supplied and operated by multiple agents, positive effects were observed in terms of increased revenue and reduced penalties due to lower throughput times and fewer delayed products. The combined modular and multi-staged approach as well as the distributed decision-making further leverage scalability and transferability to other scenarios.
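As a rough illustration of the control idea (not the paper's multi-agent system), the following tabular Q-learning sketch routes arriving jobs to one of two toy production modules and is rewarded on a combined signal of revenue for finished jobs minus a penalty for waiting jobs; every state, action and number here is an assumption made for the example.

```python
# Toy tabular Q-learning for routing jobs between two production modules.
import random

ACTIONS = (0, 1, 2)          # how many of the two arriving jobs go to module 0
MAX_QUEUE = 5
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1

def step(queues, action):
    """Two jobs arrive; `action` of them go to module 0, the rest to module 1.
    Each module then finishes at most one job. Reward = revenue for finished
    jobs minus a penalty on jobs still waiting (toy techno-financial signal)."""
    q0 = min(queues[0] + action, MAX_QUEUE)
    q1 = min(queues[1] + (2 - action), MAX_QUEUE)
    finished = (q0 > 0) + (q1 > 0)
    q0, q1 = max(q0 - 1, 0), max(q1 - 1, 0)
    reward = 1.0 * finished - 0.3 * (q0 + q1)
    return (q0, q1), reward

Q = {}
def q(state, action):
    return Q.get((state, action), 0.0)

random.seed(0)
state = (0, 0)
for _ in range(20000):
    if random.random() < EPSILON:
        action = random.choice(ACTIONS)          # explore
    else:
        action = max(ACTIONS, key=lambda a: q(state, a))  # exploit
    nxt, reward = step(state, action)
    best_next = max(q(nxt, a) for a in ACTIONS)
    Q[(state, action)] = q(state, action) + ALPHA * (reward + GAMMA * best_next - q(state, action))
    state = nxt

# Greedy policy per visited state; splitting the jobs (action 1) should dominate,
# since it keeps both modules busy and queues short.
print({s: max(ACTIONS, key=lambda a: q(s, a)) for s in sorted({k[0] for k in Q})})
```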
Background:
Like most countries, Germany is currently recruiting international nurses due to staff shortages. While these nurses are mostly academically trained, the academisation of nursing in Germany has only just begun. This allows for a broader look at the participation of migrant nurses: how do care teams deal with the fact that immigrant colleagues are theoretically more highly qualified than long-established colleagues?
Methods:
Case studies were conducted in four inpatient care teams of two hospitals in 2022. Qualitative data include 26 observation protocols, 4 group discussions and 17 guided interviews. These were analysed using the documentary method and validated intersubjectively.
Results:
Due to current academisation efforts in Germany and the immigration of academically trained nursing staff from abroad, the areas of activity and responsibility of nursing in Germany are under negotiating pressure. This concerns, for example, basic care, which in Germany is provided by skilled workers but in other countries is mostly provided by assistants or relatives. The question of who should provide basic care, whether all nurses or only nursing assistants, documents the struggle between an established and a new understanding of care. In this context, the knowledge and skills of migrant and academically trained care workers become a crucial aspect in the struggle for a new professional identity for care in Germany.
Conclusions:
The specific situation in Germany makes it possible to show the potential for change that international care migration can constitute for destination countries. The far-reaching process of change in German nursing is given a further dimension not only by its academisation but also by the immigration of international and academically trained nursing staff, where inclusive or exclusive effects can already be observed.
Key messages: The increasing proportion of migrant nurses accelerates the current discussion on nursing in Germany. Conflict areas show up in everyday work of care teams and must be addressed there.
In honour of Seymour Papert
(2018)
Forth is nice and flexible, but to a philosopher and teacher educator Logo is the more impressive language. Both are relatives of Lisp, but Forth has a reverse Polish notation whereas Logo has an infix notation. Logo allows top-down programming, Forth only bottom-up. Logo enables recursive programming, Forth does not. Logo includes turtle graphics, Forth has nothing comparable. So what to do if you can't get Logo and have no information about its inner architecture? This should be a case of "empirical modelling": How can you model observable results of the behaviour of Logo in terms of Forth? The main steps to solve this problem are shown in the first part of the paper.
The second part of the paper discusses the problem of modelling and shows that the modelling of making and the modelling of recognition have the same mathematical structure. So "empirical modelling" can also serve for modelling desired behaviour of technical systems.
The last part of the paper will show that the heuristic potential of a problem which should be modeled is more important than the programming language. The Picasso construal shows, in a very simple way, how children of different ages can model emotional relations in human behaviour with a simple Logo system.
We present an extension to a comprehensive context model that has been successfully employed in a number of practical conversational dialogue systems. The model supports the task of multimodal fusion as well as that of reference resolution in a uniform manner. Our extension consists of integrating implicitly mentioned concepts into the context model and we show how they serve as candidates for reference resolution.
This paper explores the role of the intentional stance in games, arguing that any question of artificial intelligence has as much to do with the co-option of the player's interpretation of actions as intelligent as with any actual fixed-state systems attached to agents. It demonstrates how, by using a few simple and, in system terms, cheap tricks, existing AI can be both supported and enhanced. These include representational characteristics, importing behavioral expectations from real life, constraining these expectations using diegetic devices, and managing social interrelationships to create the illusion of a greater intelligence than is ever actually present. It is concluded that complex artificial intelligence is often of less importance to the experience of intelligent agents in play than the creation of a space where the intentional stance can be evoked and supported.
The emergence of information extraction (IE) oriented pattern engines has been observed during the last decade. Most of them rely heavily on finite-state devices. This paper introduces ExPRESS, a new extraction pattern engine whose rules are regular expressions over flat feature structures. The underlying pattern language is a blend of two previously introduced IE-oriented pattern formalisms, namely JAPE, used in the widely known GATE system, and the unification-based XTDL formalism used in SProUT. A brief technical overview of ExPRESS, its pattern language and the pool of its native linguistic components is given. Furthermore, the implementation of the grammar interpreter is addressed.
This paper focuses on the way computer games refer to the context of their formation and asks how they might stimulate the user's understanding of the world around him. The central question is: do computer games have the potential to inspire our reflection on moral and ethical issues? And if so, by which means do they achieve this? Drawing on concepts of ethical criticism in literary studies as proposed by Wayne C. Booth and Martha Nussbaum, I will argue in favor of an ethical criticism for computer games. Two aspects will be brought into focus: the ethical reflection in the artifact as a whole, and the recipient's emotional involvement. The paper aims at evaluating the interaction of game content and game structure in order to give an adequate insight into the way computer games function and affect us.