The paper discusses the distribution and meaning of the additive particle -m@s in Ishkashimi. -m@s receives different semantic associations while staying in the same syntactic position. Thus, structurally combined with an object, it can semantically associate with the focused object or with the whole focused VP; similarly, combined with the subject it can semantically associate with the focused subject and with the whole focused sentence.
The Moskauer Staatliche Juristische O. E. Kutafin Universität (Akademie) and the Faculty of Law of the University of Potsdam have cooperated since 2007 with great scholarly success. Their joint efforts toward the understanding and development of the law in Russia and in Germany find their expression in the Weeks of Russian Law. The contributions reflect the legal scholarship of two countries whose schools of law have long been connected by a historical bond. Ius est ars aequi et boni: both partners strive for the true and the good through the law. The thematic focus of the 2nd Week of Russian Law, Potsdam, 2012, was, besides questions of civil law, the interaction and mutual influence of state and church in Russia and Germany. Where harsh antagonism between state and church shapes the picture, where one side even fights the other, injustice and arbitrariness dominate. But where mutual trust and goodwill prevail between state and church, and where conflicts are settled under the rule of law and thus peacefully, the true can be found and the good flourishes. To promote and spread this insight was one aim of the conference.
Problem statement
• Geoecological process research attempts to reproduce, with the help of models, the natural and human-influenced processes in large landscape sections
• Exact knowledge of the characteristics of the study area is an essential precondition for a realistic representation
• Current models are able neither to include all ongoing processes nor to process precise input data describing the initial state
• Model input data are often not available at the required precision
• In models, the characteristics of a study area are described via soil, groundwater influence, and land use
• Land use has largely static elements (the use types forest, water bodies, settlement) and highly dynamic elements (the annual change of crop on each field)
• There is a need for detailed (spatially and temporally explicit) input of the distribution of field crops over the model period, since agriculture is regarded as one of the major sources of diffuse nutrient input into the ecosystem

State of research
• For mapping agricultural crops from remote sensing data, multitemporal classification has proven useful, because the various crops cannot be reliably separated on the basis of a single scene
• Classification is performed with supervised methods using training areas in the data set for which the cultivated crop is known
• Additional (fuzzy) information is included in the classification that indicates the probability of a crop's occurrence (cropping suitability depending on slope, precipitation, elevation, soil)
• The results of these classifications are usually not transferable to other landscape sections and growing seasons, because the spectral signature of a crop varies with changing soil and weather conditions
Approach
• On the basis of satellite data and cropping information from 15 consecutive years (35 acquisition dates), annual curves of the spectral characteristics of important field crops, independent of weather and soil, were to be derived that describe the growth cycle of the plants
• These curves are to be used instead of training areas for the multitemporal classification of data from a single growing season

Conclusions and outlook
• In principle the approach appears successful, but the quality of the result still varies with the usability of the scenes employed
• The method represents a substantial advance over the previous training-area-based procedure
• At least in the study area it can be applied repeatedly without further knowledge of cropping information; only an exact phenological dating of the scenes used is required
• For other regions (variation in precipitation and soil), the phenological dating of the curves must be adapted (their shape remains usable)
• The optimal image combination for separating all crops is: early/mid April, mid May, early July, mid August, mid September
• This combination should be obtainable as data availability improves
• Dry spells in May and June appear problematic, so that winter cereals that ripen too quickly are not recognized correctly; soil information needs to be included
• The separation of root crops remains problematic (as in previous procedures) and leads to inflated shares in the result; depending on the cropping share, they are better neglected
• Including fuzzy information appears useful: the relationship between soil quality and crop (cropping suitability of a soil for a crop), water availability at the site (depending on the storage capacity of the soil, groundwater connection, and precipitation), and the cumulative precipitation up to the acquisition date (drought index)
Document 1: slide set | Document 2: abstract. Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop of 9-10 February 2006
Personal papers (Nachlässe) are private property and are therefore not subject to any mandatory transfer. The wish of the testator regarding the future keeping of his written legacy consequently takes precedence over all our wishes. We cannot demand; we can only ask, make offers through our own services, and convince future testators or their heirs. The absence of any institutional responsibility for taking over personal papers creates friction between the institutions that seek to acquire them: archives, libraries, museums, collections. The desire to acquire the papers of a particular person, whether scientist, artist, or politician, is therefore always present at several places at once. Unfortunately, chance then usually decides where the papers will be kept and used for research in the future. The question arises whether we should hope and wait for such chance, or whether we should rather pursue a committed acquisition policy coordinated among the archives. ------------ Contributions on the topic "Nachlässe at university archives and archives of scientific institutions", presented at the spring meeting of Fachgruppe 8, "Archivists at university archives and archives of scientific institutions", on 16/17 June at the University of Potsdam.
We introduce a simple approach extending the input language of Answer Set Programming (ASP) systems by multi-valued propositions. Our approach is implemented as a (prototypical) preprocessor translating logic programs with multi-valued propositions into logic programs with Boolean propositions only. Our translation is modular and heavily benefits from the expressive input language of ASP. The resulting approach, along with its implementation, allows for solving interesting constraint satisfaction problems in ASP, showing a good performance.
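A minimal sketch of such a translation (an illustration of the general idea only, not the authors' prototypical preprocessor; the atom naming scheme and the exactly-one choice-rule encoding are assumptions):

```python
def translate_multivalued(prop, values):
    """Encode a multi-valued proposition as Boolean ASP atoms:
    one atom prop(v) per value, plus a cardinality (choice) rule
    "1 { ... } 1." enforcing that exactly one value holds."""
    atoms = [f"{prop}({v})" for v in values]
    return "1 { " + "; ".join(atoms) + " } 1."

# A proposition 'color' ranging over three values becomes three
# Boolean atoms constrained to exactly one true assignment.
rule = translate_multivalued("color", ["red", "green", "blue"])
print(rule)  # 1 { color(red); color(green); color(blue) } 1.
```

The generated rule is ordinary ASP syntax, so the rest of the program can remain purely Boolean, which is what makes such a translation modular.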
X-ray spectroscopy is a sensitive probe of stellar winds. X-rays originate from optically thin shock-heated plasma deep inside the wind and propagate outwards through the absorbing cool material. Recent analyses of the line ratios from He-like ions in the X-ray spectra of O-stars highlighted problems with this general paradigm: the measured line ratios of the highest ions are consistent with the hottest X-ray emitting plasma being located very close to the base of the wind, perhaps indicating the presence of a corona, while measurements from lower ions conform with the wind-embedded shock model. Generally, to correctly model the emerging X-ray spectra, detailed knowledge of the cool wind opacities based on stellar atmosphere models is a prerequisite. A nearly grey stellar wind opacity for the X-rays is deduced from the analyses of high-resolution X-ray spectra. This indicates that the stellar winds are strongly clumped. Furthermore, the nearly symmetric shape of X-ray emission line profiles can be explained if the wind clumps are radially compressed. In massive binaries, the orbital variations of the X-ray emission make it possible to probe the opacity of the stellar wind; the results support the picture of strong wind clumping. In high-mass X-ray binaries, the stochastic X-ray variability and the extent of the stellar-wind region photoionized by X-rays provide further strong evidence that stellar winds consist of dense clumps.
We summarize Chandra observations of the emission line profiles from 17 OB stars. The lines tend to be broad and unshifted. The forbidden/intercombination line ratios arising from He-like ions provide radial distance information for the X-ray emission sources, while the H-like to He-like line ratios provide X-ray temperatures, and thus also source temperature versus radius distributions. OB stars usually show power-law differential emission measure distributions versus temperature. In models of bow shocks, we find a power-law differential emission measure and a wide range of ion stages, and the bow shock flow around the clumps provides transverse velocities comparable to HWHM values. We find that the bow shock results for the line profile properties are consistent with the observations of X-ray line emission for a broad range of OB star properties.
Wolf-Rayet Stars
(2015)
Nearly 150 years ago, the French astronomers Charles Wolf and Georges Rayet described stars with very conspicuous spectra dominated by bright and broad emission lines. Since termed Wolf-Rayet stars after their discoverers, these objects turned out to represent important stages in the life of massive stars.
As the first conference in a long time specifically dedicated to Wolf-Rayet stars, an international workshop was held in Potsdam, Germany, from 1 to 5 June 2015. About 100 participants, comprising most of the leading experts in the field as well as many young scientists, gathered for one week of extensive scientific exchange and discussion. Considerable progress was reported throughout, e.g. on finding such stars, modeling and analyzing their spectra, understanding their evolutionary context, and studying their circumstellar nebulae. While some major questions regarding Wolf-Rayet stars remain open 150 years after their discovery, it is clear today that these objects are not just interesting stars in their own right, but also keystones in the evolution of galaxies.
These proceedings summarize the talks and posters presented at the Potsdam Wolf-Rayet workshop. They also include the questions, comments, and discussions that followed each talk, thereby giving a rare overview not only of the research but also of the current debates and open questions in the field. The Scientific Organizing Committee (SOC) included Alceste Bonanos (Athens), Paul Crowther (Sheffield), John Eldridge (Auckland), Wolf-Rainer Hamann (Potsdam, Chair), John Hillier (Pittsburgh), Claus Leitherer (Baltimore), Philip Massey (Flagstaff), George Meynet (Geneva), Tony Moffat (Montreal), Nicole St-Louis (Montreal), and Dany Vanbeveren (Brussels).
Luminous Blue Variables (LBVs) show strong changes in their stellar winds on time scales of typically years to decades, as they expand and contract radially at approximately constant luminosity. Micro-variability on shorter time scales and with smaller amplitudes can be observed superimposed on the larger-scale radial changes. I will show long-term time series of high-resolution spectra, collected over the past 20 years for many of the well-known LBVs, together with a few time series of weekly sampling (HR Car, R40, R71, R110, R127, S Dor) covering time windows of up to a few months. Wind variability is seen on short and intermediate time scales, with the line profiles changing from P Cygni to inverse P Cygni and double-peaked profiles, sometimes for the same star and spectral line. On longer time scales, the ionisation levels of all chemical elements change drastically due to the strong change of the temperature at the stellar surface. While in the long term the characteristic radial changes may have an impact on the overall mass-loss rates, the variability and asymmetries on short and intermediate time scales may cause false estimates of the mass-loss rates when confronting models with the observed line profiles.
The most massive stars are those with the shortest but most active lives. One group of massive stars, the Luminous Blue Variables (LBVs), of which only a few objects are known, is of particular interest with regard to stellar stability. LBVs have high mass-loss rates and are close to being unstable, all the more so as rotation becomes an important factor in the evolution of these stars. Through massive stellar winds and occasional giant eruptions, LBV nebulae are formed. Various aspects of evolution in the LBV phase lead, besides the large-scale morphological and kinematical differences, to a diversity of small structures such as clumps, rims, and outflows in these nebulae.
We discuss the results of time-resolved spectroscopy of three presumably single Population I Wolf-Rayet stars in the Small Magellanic Cloud, where the ambient metallicity is $\sim 1/5 Z_\odot$. We were able to detect and follow numerous small-scale wind-embedded inhomogeneities in all observed stars. The general properties of the moving features, such as their velocity dispersions, emissivities and average accelerations, closely match the corresponding characteristics of small-scale inhomogeneities in the winds of Galactic Wolf-Rayet stars.
The influence of the wind on the total continuum of OB supergiants is discussed. For wind velocity distributions with β > 1.0, the wind can have a strong influence on the total continuum emission, even at optical wavelengths. Comparing the continuum emission of clumped and unclumped winds, especially for stars with high β values, yields flux differences of up to 30%, with a maximum in the near-IR. Continuum observations at these wavelengths are therefore an ideal tool to discriminate between clumped and unclumped winds of OB supergiants.
One of the informal properties often used to describe a new virtual world is its degree of openness. Yet what is an “open” virtual world? Does the phrase mean generally the same thing to different people? What distinguishes an open world from a less open world? Why does openness matter anyway? The answers to these questions cast light on an important, but shadowy, and uneasy, topic for virtual worlds: the relationship between those who construct the virtual, and those who use these constructions.
The “output-orientation” is omnipresent in teacher education. To evaluate teachers' and students' performance, a wide range of quantitative questionnaires exists worldwide. One important goal of teaching evaluation is to increase the quality of teaching and learning. The author argues that the standard evaluations typically made at the end of the semester are problematic for two reasons. First, some of the questions are too general and don't offer concrete ideas as to what actions could be taken to make the courses better. Second, the evaluation is mostly made when the course is already over. In response to this criticism, Apelojg invented the Felix-App, which makes it possible to give feedback in real time by asking for the emotions and needs that occur during different learning situations. The idea is very simple: positive emotions and satisfied needs are helpful for the learning process, while negative emotions and unsatisfied needs have negative effects on it. First descriptive results show that “managing emotions” during classes can have positive effects on both motivation and emotions.
In this talk, I would like to share my experiences gained from participating in four CSP solver competitions and the second ASP solver competition. In particular, I’ll talk about how various programming techniques can make huge differences in solving some of the benchmark problems used in the competitions. These techniques include global constraints, table constraints, and problem-specific propagators and labeling strategies for selecting variables and values. I’ll present these techniques with experimental results from B-Prolog and other CLP(FD) systems.
Web Tracking
(2018)
Web tracking seems to be becoming ubiquitous in online business and leads to increased privacy concerns among users. This paper provides an overview of the current state of the art of web-tracking research, aiming to reveal the relevance and methodologies of this research area and to create a foundation for future work. In particular, this study addresses the following research questions: What methods are followed? What results have been achieved so far? What are potential future research areas? To these ends, a structured literature review based upon an established methodological framework is conducted. The identified articles are investigated with respect to the applied research methodologies and the aspects of web tracking they emphasize.
Visual Social Networking Sites (SNSs) enable users to present themselves favorably to gain likes and the attention of others. Instagram in particular is known for its focus on beauty, fitness, fashion, and dietary topics. Although a large body of research reports negative weight-related outcomes of SNS usage (e.g., body dissatisfaction, body image concerns), studies examining how SNS usage relates to these outcomes are scarce. Based on visual normalization theory, we argue that SNS content facilitates the normalization of so-called thin- and fit-ideals, thereby leading to biased perceptions of the average body weight in society. This study therefore tests whether Instagram use is associated with perceiving that the average person weighs less. Responses of 181 survey participants confirm that Instagram use is negatively related to the perceived average weight of both women and men. These findings contribute to the growing body of research on how SNS use relates to negative weight-related outcomes.
The management of knowledge in organizations considers both established long-term processes and cooperation in agile project teams. Since knowledge can be both tacit and explicit, its transfer from the individual to the organizational knowledge base poses a challenge in organizations. This challenge increases when the fluctuation of knowledge carriers is exceptionally high. Especially in large projects in which external consultants are involved, there is a risk that critical, company-relevant knowledge generated in the project will leave the company with the external knowledge carrier and thus be lost. In this paper, we show the advantages of an early warning system for knowledge management to avoid this loss. In particular, the potential of visual analytics in the context of knowledge management systems is presented and discussed. We present a project for the development of a business-critical software system and discuss the first implementations and results.
Component-based software development (CBSD) and aspect-oriented software development (AOSD) are two complementary approaches. However, existing proposals for integrating aspects into component models are direct transpositions of object-oriented AOSD techniques to components. In this article, we propose a new approach based on views. Our proposal introduces crosscutting components quite naturally and can be integrated into different component models.
This volume is the result of the 4th Week of Russian Law at the Faculty of Law of the University of Potsdam. Renowned scholars of the Moskauer Staatliche Juristische Universität O. E. Kutafin gave lectures on international law, constitutional and citizenship law, civil law, company and corporate law, financial law, and banking law. Russian law has been in transition since the end of the Soviet Union. The contributions testify to the high standard of jurisprudence at the Moskauer Staatliche Juristische Universität O. E. Kutafin. The Weeks of Russian Law help to make the law of the Russian Federation known here in Germany and to put it up for comparative legal discussion.
The H.E.S.S. collaboration recently reported the discovery of VHE γ-ray emission coincident with the young stellar cluster Westerlund 2. This system is known to host a population of hot, massive stars and, most notably, the WR binary WR 20a. Particle acceleration to TeV energies in Westerlund 2 can be accomplished in several alternative scenarios; we therefore only discuss energetic constraints based on the total available kinetic energy in the system, the actual mass-loss rates of the respective cluster members, and the implied gamma-ray production from processes such as inverse Compton scattering or neutral pion decay. From the inferred gamma-ray luminosity of the order of $10^{35}$ erg/s, implications for the efficiency of converting available kinetic energy into non-thermal radiation associated with stellar winds in the Westerlund 2 cluster are discussed under consideration of either the presence or absence of wind clumping.
Constitutional jurisdiction in the Russian Federation and the Federal Republic of Germany
(2013)
The conference volume contains the papers and discussion contributions of the round table on constitutional jurisdiction held at the Staatliche Juristische Kutafin-Universität in Moscow on 9 and 10 October 2012. It covers selected questions of legal history and legal policy as well as current legal problems of constitutional jurisdiction in the Russian Federation and the Federal Republic of Germany, from the perspective of both legal practice and scholarship: in particular the development of constitutional jurisdiction past and present; the status, legal nature, and tasks of the constitutional courts in the subjects of the Federation and in the German Länder; and constitutional courts and legislation. Special questions of constitutional jurisdiction are also discussed, e.g. the institution of the plenipotentiary representative of the President at the Constitutional Court in Russia, interim legal protection by the BVerfG, and legal protection against excessively long proceedings before the BVerfG in Germany.
This volume contains the conference materials of the German-Russian symposium on "Constitutional Development in Russia and Germany", held on 25 and 26 September 2013 in Potsdam. The conference was organized on the occasion of the 20th anniversary of the Russian Constitution of December 1993. Its thematic focal points are the genesis of constitutions, constitutional amendment, constitutional principles, the constitutions of the German Länder, the further development of the constitution through constitutional jurisdiction, and fundamental rights, each treated from the Russian and the German perspective. In addition, one contribution each deals with current problems of the realization of human rights in Russia and with a comparison of the integration of foreigners in Germany and Russia.
In contemporary Slavic literatures, violence is omnipresent: as an echo of the revolutions, wars, dictatorships, and system upheavals of the 20th century, as a reaction to ongoing and newly erupting conflicts, as fascination, sensation, and sales incentive. Violence appears as a narrative-aesthetic, traditional component of literary representation and as a powerful, taboo-breaking motif. This volume collects the results of an international conference at the University of Hamburg that addressed this topic in autumn 2012 under the triad "Verbrechen - Fiktion - Vermarktung" (crime, fiction, marketing). The broad spectrum of the literatures examined (East, West, and South Slavic; prose as well as poetry and drama), the look beyond literature (at film and music, among others), and the variety of topics, modes of representation, and analytical approaches yield a rich picture that allows an approach to the question of the specifics of literary representations of violence.
Verbal or visual? How information is distributed across speech and gesture in spatial dialog
(2006)
In spatial dialog, such as direction giving, humans make frequent use of speech-accompanying gestures. Some gestures convey largely the same information as speech, while others complement speech. This paper reports a study on how speakers distribute meaning across speech and gesture, and on which factors this depends. Utterance meaning and the wider dialog context were examined by statistically analyzing a corpus of direction-giving dialogs. Problems of speech production (as indicated by discourse markers and disfluencies), the communicative goals, and the information status were found to be influential, while feedback signals by the addressee did not have any influence.
Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung, workshop of 9-10 February 2006
Gamma-rays can be produced by the interaction of a relativistic jet and the matter of the stellar wind in the subclass of massive X-ray binaries known as “microquasars”. The relativistic jet is ejected from the surroundings of the compact object and interacts with cold protons from the stellar wind, producing pions that then quickly decay into gamma-rays. Since the resulting gamma-ray emissivity depends on the target density, the detection of rapid variability in microquasars with GLAST and the new generation of Cherenkov imaging arrays could be used to probe the clumped structure of the stellar wind. In particular, we show here that the relative fluctuation in gamma rays may scale with the square root of the ratio of porosity length to binary separation, $\sqrt{h/a}$, implying for example a ca. 10% variation in gamma ray emission for a quite moderate porosity, h/a ∼ 0.01.
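The quoted scaling is easy to verify numerically (a one-function sketch; the function name is ours):

```python
import math

def relative_gamma_fluctuation(h, a):
    """Relative gamma-ray flux fluctuation for porosity length h and
    binary separation a, scaling as sqrt(h/a) as stated above."""
    return math.sqrt(h / a)

# For a moderate porosity h/a = 0.01, the predicted variation is
# sqrt(0.01) = 0.1, i.e. the ca. 10% quoted in the abstract.
print(relative_gamma_fluctuation(0.01, 1.0))  # 0.1
```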
Despite the phenomenal growth of Big Data Analytics in the last few years, little research has been done to explicate the relationship between Big Data Analytics Capability (BDAC) and the indirect strategic value derived from such digital capabilities. We attempt to address this gap by proposing a conceptual model of the BDAC-innovation relationship using dynamic capability theory. The work expands on BDAC business value research and extends the nominal research done on BDAC and innovation. We focus on BDAC's relationship with different innovation objects, namely product, business process, and business model innovation, impacting all value chain activities. The insights gained will stimulate academic and practitioner interest in explicating the strategic value generated from BDAC and serve as a framework for future research on the subject.
We present an analysis of student language input in a corpus of tutoring dialogue in the domain of symbolic differentiation. Our focus on procedural tutoring makes the dialogue comparable to collaborative problem-solving (CPS). Existing CPS models describe the process of negotiating plans and goals, which also fits procedural tutoring. However, we provide a classification of student utterances and corpus annotation which shows that approximately 28% of non-trivial student language in this corpus is not accounted for by existing models, and addresses other functions, such as evaluating past actions or correcting mistakes. Our analysis can be used as a foundation for improving models of tutoring dialogue.
Ultrasound evaluation of the patellar tendon cross-sectional area and its relation to maximum force
(2012)
Turning shy on a winter's day: effects of season on personality and stress response in Microtus arvalis
(2013)
This paper explores the role of the intentional stance in games, arguing that any question of artificial intelligence has as much to do with co-opting the player's interpretation of actions as intelligent as with any actual fixed-state systems attached to agents. It demonstrates how, using a few simple and, in system terms, cheap tricks, existing AI can be both supported and enhanced. These include representational characteristics, importing behavioral expectations from real life, constraining these expectations using diegetic devices, and managing social interrelationships to create the illusion of a greater intelligence than is ever actually present. It is concluded that complex artificial intelligence is often of less importance to the experience of intelligent agents in play than the creation of a space where the intentional stance can be evoked and supported.
Different properties of programs implemented in Constraint Handling Rules (CHR) have already been investigated. Proving these properties in CHR is considerably simpler than proving them in any type of imperative programming language, which triggered the proposal of a methodology for mapping imperative programs into equivalent CHR programs. The equivalence of both programs implies that if a property is satisfied for one, then it is satisfied for the other. The mapping methodology can be put to other beneficial uses. One such use is the automatic generation of global constraints, in an attempt to demonstrate the benefits of a rule-based implementation of constraint solvers.
Generalized Two-Level Grammar (GTWOL) provides a new method for the compilation of parallel replacement rules into transducers. The current paper identifies the role of generalized lenient composition (GLC) in this method. Thanks to the GLC operation, the compilation method becomes bipartite and easily extensible to capture various application modes. In the light of three notions of obligatoriness, a modification to the compilation method is proposed. We argue that the bipartite design makes the implementation of parallel obligatoriness, directionality, and length- and rank-based application modes extremely easy, which is the main result of the paper.
Track and Treat
(2018)
E-Mail tracking mechanisms gather information on individual recipients' reading behavior. Previous studies show that e-mail newsletters commonly include tracking elements. However, prior work does not examine the degree to which e-mail senders actually employ gathered user information. The paper closes this research gap by means of an experimental study to clarify the use of tracking-based information. To that end, twelve mail accounts are created, each of which subscribes to a pre-defined set of newsletters from companies based in Germany, the UK, and the USA. Systematically varying e-mail reading patterns across accounts, each account simulates a different type of user with individual reading behavior. Assuming senders to track e-mail reading habits, we expect changes in mailer behavior. The analysis confirms the prominence of tracking in that over 92% of the newsletter e-mails contain tracking images. For 13 out of 44 senders an adjustment of communication policy in response to user reading behavior is observed. Observed effects include sending newsletters at different times, adapting advertised products to match the users' IT environment, increased or decreased mailing frequency, and mobile-specific adjustments. Regarding legal issues, not all companies that adapt the mail-sending behavior state the usage of such mechanisms in their privacy policy.
Because software development is increasingly expensive and time-consuming, software reuse gains importance. Aspect-oriented software development modularizes crosscutting concerns, which enables their systematic reuse. The literature provides a number of AOP patterns and best practices for developing reusable aspects, based on compelling examples for concerns like tracing, transactions and persistence. However, such best practices are lacking for systematically reusing invasive aspects. In this paper, we present the ‘callback mismatch problem’. This problem arises in the context of abstraction mismatch, in which the aspect is required to issue a callback to the base application. As a consequence, the composition of invasive aspects is cumbersome to implement, difficult to maintain and impossible to reuse. We motivate this problem with a real-world example, show that it persists in the current state of the art, and outline the need for advanced aspectual composition mechanisms to deal with it.
Artificial intelligence (AI)-based technologies can increasingly perform knowledge work tasks, such as medical diagnosis. It is expected that humans will not be replaced by AI but will work closely with AI-based technology (“augmentation”). Augmentation has ethical implications for humans (e.g., impact on autonomy, opportunities to flourish through work); thus, developers and managers of AI-based technology have a responsibility to anticipate and mitigate risks to human workers. However, doing so can be difficult, as AI encompasses a wide range of technologies, some of which enable fundamentally new forms of interaction. In this research-in-progress paper, we propose the development of a taxonomy to categorize unique characteristics of AI-based technology that influence this interaction and have ethical implications for human workers. The completed taxonomy will support researchers in forming cumulative knowledge on the ethical implications of augmentation and assist practitioners in the ethical design and management of AI-based technology in knowledge work.
We present an extension to a comprehensive context model that has been successfully employed in a number of practical conversational dialogue systems. The model supports the task of multimodal fusion as well as that of reference resolution in a uniform manner. Our extension consists of integrating implicitly mentioned concepts into the context model and we show how they serve as candidates for reference resolution.
The rise of open source models for software and hardware development has catalyzed the debate regarding sustainable business models. Open Source Software has already become a dominant part of the software industry, whereas Open Source Hardware is still a little-researched phenomenon but has the potential to do the same for manufacturing across a wide range of products. This article addresses this potential by introducing a research design to analyze the prototyping phase of six different Open Source Hardware projects tackling ecological, social, and economic challenges. Using a design science research methodology, a process model is developed to concretize the prototype development steps. The prototyping phase is important because it is where fundamental decisions are made that affect the openness of the final product. This paper aims to advance the discourse on open production as a concept that enables companies to apply the aspect of openness towards collaboration-oriented and sustainable business models.
This text compares the special characteristics of the game space in computer-generated environments with those in non-computerized playing situations. In doing so, it challenges the concept of the magic circle as a deliberately delineated playing sphere with specific rules to be upheld by the players. Computer games, too, provide a virtual playing environment containing the rules of the game as well as the various possibilities for action. But both the hardware and the software facilitate the player’s actions rather than constraining them. This makes computer games fundamentally different: in contrast to traditional game spaces or limits, the computer-generated environment does not rely on the players’ awareness to uphold these rules. Thus, there is no magic circle.
The variability of bone strength and skeletal robustness of young men - how it can be influenced
(2011)
The space-image
(2008)
In recent computer game research a paradigmatic shift is observable: games today are first and foremost conceived as a new medium characterized by their status as an interactive image. The shift in attention towards this aspect becomes apparent in a new approach that is, above all, aware of the spatiality of games and their spatial structures. This approach rejects traditional ones on the grounds that the medial specificity of games can no longer be reduced to textual or ludic properties, but has to be seen in their medially constituted spatiality. To this end, seminal studies on the spatiality of computer games are reviewed and their advantages and disadvantages are discussed. Building on this, and against the background of the philosophical method of phenomenology, we propose three steps in describing computer games as space-images: with this method it is possible to describe games with respect to the possible appearance of spatiality in a pictorial medium.
This paper suggests an approach to studying the rhetoric of persuasive computer games through comparative analysis. A comparison of the military propaganda game AMERICA’S ARMY to similar shooter games reveals an emphasis on discipline and constraints in all main aspects of the games, demonstrating a preoccupation with ethos more than pathos. Generalizing from this, a model for understanding game rhetoric through balances of freedom and constraints is proposed.
Many prediction tasks can be performed on the basis of users’ trace data. In this paper, we explore convergent thinking as a personality-related attribute and its relation to features gathered in interactive and non-interactive tasks of an online course. This is an under-utilized attribute that could be used to adapt online courses to learners’ creativity level and thereby enhance their motivation. To this end, we used the logfile data of a 60-minute Moodle course with N=128 learners, combined with the Remote Associates Test (RAT). Exploring the trace data, we found a weak correlation between interactive tasks and the RAT score, which was nonetheless the strongest correlation in the overall dataset. We trained a Random Forest regressor to predict convergent thinking from the trace data and analyzed the feature importances. The results show that the interactive tasks have the highest importance in the prediction, but the accuracy is very low. We discuss the potential for personalizing online courses and address further steps to improve applicability.
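The pipeline described in this abstract (trace features in, RAT score out, then inspecting feature importances) can be sketched as follows. This is a minimal illustration only: the feature names and the synthetic data are our own assumptions, not the study’s actual Moodle log variables or results.

```python
# Sketch: predicting a convergent-thinking score from course trace data
# with a Random Forest and reading off feature importances.
# Data and feature names are synthetic, illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_learners = 128  # matches the N reported in the abstract

# Hypothetical per-learner trace features extracted from the log.
X = np.column_stack([
    rng.integers(0, 20, n_learners),   # interactive task events
    rng.integers(0, 40, n_learners),   # non-interactive task events
    rng.uniform(0, 60, n_learners),    # minutes active in the course
])
# Hypothetical RAT scores: weakly driven by the interactive feature.
y = 0.3 * X[:, 0] + rng.normal(0, 3, n_learners)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

for name, imp in zip(["interactive", "non_interactive", "minutes_active"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```

In scikit-learn, `feature_importances_` sums to 1 across features, so the printed values can be read directly as relative importance shares.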
Jesper Juul has convincingly argued that the conflict over the proper object of study has shifted from “rules or story” to “player or game.” But a key component of digital games is still missing from either of these oppositions: that of the computer itself. This paper offers a way of thinking about the phenomenology of the videogame from the perspective of the computer rather than the game or the player.
This paper highlights the different ways of perceiving video games and video game content, incorporating interactive and non-interactive methods. It examines the varying cognitive and emotive reactions of people who are used to playing video games as well as of people who are unfamiliar with the aesthetics and the most basic gameplay rules incorporated within video games. Additionally, the principle of “flow” serves as a theoretical and philosophical foundation. A small case study featuring two games was conducted to emphasize the numerous possible ways of perceiving video games.
Avatime, a Kwa language of Ghana, has an additive particle tsyɛ that at first sight looks similar to additive particles such as too and also in English. However, on closer inspection, the Avatime particle behaves differently. Contrary to what is usually claimed about additive particles, tsyɛ does not only associate with focused elements. Moreover, unlike its English equivalents, tsyɛ does not come with a requirement of identity between the expressed proposition and an alternative. Instead, it indicates that the proposition it occurs in is similar to or compatible with a presupposed alternative proposition.
In a common description, to play a game is to step inside a concrete or metaphorical magic circle where special rules apply. In video game studies, this description has received an inordinate amount of criticism which the paper argues has two primary sources: 1. a misreading of the basic concept of the magic circle and 2. a somewhat rushed application of traditional theoretical concerns onto games. The paper argues that games studies must move beyond conventional criticisms of binary distinctions and rather look at the details of how games are played. Finally, the paper proposes an alternative metaphor for game-playing, the puzzle piece.
Landscape aesthetics drawing on philosophy and psychology allow us to understand computer games from a new angle. The landscapes of computer games can be understood as environments or images. This difference creates two options: 1. we experience environments or images, or 2. we experience landscape simultaneously as both. Psychologically, the first option can be backed up by a Vygotskian framework (this option highlights certain non-mainstream subject positions), the second by a Piagetian one (highlighting cognitive mapping of game worlds).
The interplay of autonomy goals and spousal support: a prospective study with couples facing cancer
(2012)
We study the influence of clumping on the predicted wind structure of O-type stars. For this purpose we artificially include clumping into our stationary wind models. When the clumps are assumed to be optically thin, the radiative line force increases compared to corresponding unclumped models, with a similar effect on either the mass-loss rate or the terminal velocity (depending on the onset of clumping). Optically thick clumps, alternatively, might be able to decrease the radiative force.
Mass loss is a very important aspect of the life of massive stars. After briefly reviewing its importance, we discuss the impact of the recently proposed downward revision of mass loss rates due to clumping (difficulty to form Wolf-Rayet stars and production of critically rotating stars). Although a small reduction might be allowed, large reduction factors around ten are disfavoured. We then discuss the possibility of significant mass loss at very low metallicity due to stars reaching break-up velocities and especially due to the metal enrichment of the surface of the star via rotational and convective mixing. This significant mass loss may help the first very massive stars avoid the fate of pair-creation supernova, the chemical signature of which is not observed in extremely metal-poor stars. The chemical composition of the very low metallicity winds is very similar to that of the most metal-poor star known to date, HE1327-2326, and offers an interesting explanation for the origin of the metals in this star. We also discuss the importance of mass loss in the context of long and soft gamma-ray bursts and pair-creation supernovae. Finally, we would like to stress that mass loss in the cooler parts of the HR diagram (luminous blue variable and yellow and red supergiant stages) is much more uncertain than in the hot part. More work needs to be done in these areas to better constrain the evolution of the most massive stars.
The game itself?
(2020)
In this paper, we reassess the notion and current state of ludo-hermeneutics in game studies and propose a more solid foundation for how to conduct hermeneutic game analysis. We argue that there can be no ludo-hermeneutics as such, and that every game interpretation rests on a particular game ontology, whether implicit or explicit. The quality of this ontology then determines a vital aspect of the quality of the analysis.
On July 20–21, 2012, an international workshop on the global impact of the euro financial crisis was held at the University of Potsdam. Prof. Dr. Detlev Hummel, Chair of Finance and Banking, hosted the event. Academic colleagues from Beijing, Moscow and Connecticut (USA) as well as domestic capital-market and banking experts presented their analyses. Different aspects of national and international financial markets were examined, with a focus on the European region, China and Russia. Mistakes and failures of banking regulation were identified as one, but not the sole, cause of the economic problems. A lack of budget discipline on the part of some politicians and the loss of competitiveness of certain European nations were mentioned, too. Some members of the European Union did not succeed in mastering the challenges of the global economy. There have been structural issues in some states that impede their competitiveness in the global market, for example with China. The participants pointed out a number of other reasons for the crisis, such as dubious distribution practices as well as a lack of transparency of certain financial products. Furthermore, the remuneration and incentive schemes of investment banks and especially the reckless risk-management policies of large banks were identified as further factors in the crisis. The participants of the international workshop in Potsdam agreed that the birth of the euro currency was a political event and will remain a challenge. The reform of banking supervision and further steps towards an economic and fiscal union are new research tasks.
The envy spiral
(2020)
On Social Networking Sites (SNS), users disclose mostly positive and often self-enhancing information. Scholars refer to this phenomenon as the positivity bias in SNS communication (PBSC). However, while theoretical explanations for this phenomenon have been proposed, empirical proof of these theorized mechanisms is still missing. The project presented in this research-in-progress paper aims at explaining the PBSC with the mechanism specified in the self-enhancement envy spiral. Specifically, we hypothesize that feelings of envy drive people to post positive and self-enhancing content on SNS. To test this hypothesis, we developed an experimental design that allows us to examine the causal effect of envy on the positivity of users’ subsequently posted content. In a preliminary study, we tested our manipulation of envy and demonstrated its effectiveness in inducing different levels of envy between our groups. Our project will help to broaden the understanding of the complex dynamics of SNS and the potentially adverse driving forces underlying them.
We review the effects of clumping on the profiles of resonance doublets. By allowing the ratio of the doublet oscillator strengths to be a free parameter, we demonstrate that doublet profiles contain more information than is normally utilized. In clumped (or porous) winds, this ratio can lie between unity and the ratio of the f-values, and can change as a function of velocity and time, depending on the fraction of the stellar disk that is covered by material moving at a particular velocity at a given moment. Using these insights, we present the results of SEI modeling of a sample of B supergiants, ζ Pup, and a time series for a star whose terminal velocity is low enough to make the components of its Si IV λλ1400 doublet independent. These results are interpreted within the framework of the Oskinova et al. (2007) model and demonstrate how doublet profiles can be used to extract information about wind structure.
e-ASTROGAM is a concept for a breakthrough observatory space mission carrying a gamma-ray telescope dedicated to the study of the non-thermal Universe in the photon energy range from 0.15 MeV to 3 GeV. The lower energy limit can be pushed down to energies as low as 30 keV for gamma-ray burst detection with the calorimeter. The mission is based on advanced space-proven detector technology, with unprecedented sensitivity, angular and energy resolution, combined with remarkable polarimetric capability. Thanks to its performance in the MeV-GeV domain, substantially improving on its predecessors, e-ASTROGAM will open a new window on the non-thermal Universe, making pioneering observations of the most powerful Galactic and extragalactic sources and elucidating the nature of their relativistic outflows and their effects on the surroundings. With a line sensitivity in the MeV energy range one to two orders of magnitude better than previous- and current-generation instruments, e-ASTROGAM will determine the origin of key isotopes fundamental for the understanding of supernova explosions and the chemical evolution of our Galaxy. The mission will be a major player in the multiwavelength, multimessenger time-domain astronomy of the 2030s, and will provide unique data of significant interest to a broad astronomical community, complementary to powerful observatories such as LISA, LIGO, Virgo, KAGRA, the Einstein Telescope and the Cosmic Explorer, IceCube-Gen2 and KM3NeT, SKA, ALMA, JWST, E-ELT, LSST, Athena, and the Cherenkov Telescope Array.
The paper deals with the rapid growth of embedded systems and their role within Internet-like structures (the Internet of Things) as devices that provide computing power and are more or less suited to analytical tasks. Using the example of a cyber-physical manufacturing system, a common objective function is developed with the intention of measuring efficient task processing within analytical infrastructures. A first validation is carried out on the basis of an expert panel.
MMORPGs such as WORLD OF WARCRAFT can be understood as interactive representations of war. Within the frame provided by the program, the players experience martial conflicts and thus a “virtual war.” The game world, however, requires a technical and as far as possible invisible infrastructure which has to be protected against attacks: infrastructure means, e.g., the servers on which the data of the player characters and the game’s world are saved, as well as the user accounts, which have to be protected, among other things, from “identity theft.” Besides the war on the virtual surface of the program, we will therefore describe the invisible war concerning the infrastructure, the outbreak of which is always feared by the developers and operators of online worlds, requiring them to take precautions. Furthermore, we would like to focus on “virtual game worlds” as places of complete surveillance. Since action in these worlds is always associated with the production of data, total observation is theoretically possible and is put into practice by the so-called “game master.” The observation of different communication channels (including user forums) serves to monitor and direct the actions on the virtual battlefield subtly, without players feeling that their freedom is being limited. Finally, we will compare the fictional theater of war in WORLD OF WARCRAFT to the vision of “Network-Centric Warfare,” since it has often been observed that the analysis of MMORPGs is useful to the real trade of war. However, we point out what an unrealistic theater of war WORLD OF WARCRAFT really is.
During the last few years there has been a tremendous growth of scientific activity in fields related to both physics and control theory: nonlinear dynamics, micro- and nanotechnologies, self-organization and complexity, etc. New horizons have opened and exciting new applications have emerged. Experts with different backgrounds who are starting to work together need more opportunities for information exchange to improve mutual understanding and cooperation. The conference "Physics and Control 2007" is the third international conference focusing on the borderland between physics and control, with emphasis on both theory and applications. Held in Potsdam, Germany, in 2007, the conference is located outside Russia for the first time. The major goal of the conference is to bring together researchers from different scientific communities and to gain general and unified perspectives on the study of controlled systems in physics, engineering, chemistry, biology and other natural sciences. We hope that the conference helps experts in control theory become acquainted with new and interesting problems, and helps experts in physics and related fields learn more about ideas and tools from modern control theory.
Test-retest-reliability of metabolic and cardiovascular load during isokinetic strength testing
(2012)
Temporal propositions are mapped to sets of strings that witness (in a precise sense) the propositions over discrete linear Kripke frames. The strings are collected into regular languages to ensure the decidability of entailments given by inclusions between languages. (Various notions of bounded entailment are shown to be expressible as language inclusions.) The languages unwind computations implicit in the logical (and temporal) connectives via a system of finite-state constraints adapted from finite-state morphology. Applications to Hybrid Logic and non-monotonic inertial reasoning are briefly considered.
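The core idea here — deciding entailment by checking inclusion between the regular languages that witness two propositions — can be illustrated with a small, self-contained sketch. The DFA encodings below (for "always p" and "eventually p" over a two-letter alphabet) are our own illustrative assumptions, not the paper’s finite-state constraint system:

```python
# Sketch: entailment as regular-language inclusion, decided on DFAs.
# L(a) ⊆ L(b) iff no reachable product state is accepting in a but
# rejecting in b (complement-intersect-emptiness, done as one search).
from collections import deque

def dfa_included(a, b, alphabet):
    """Check L(a) ⊆ L(b). Both DFAs are dicts with 'start', 'accept'
    (a set) and 'delta' (a total transition map over `alphabet`)."""
    start = (a['start'], b['start'])
    seen = {start}
    queue = deque([start])
    while queue:
        s1, s2 = queue.popleft()
        if s1 in a['accept'] and s2 not in b['accept']:
            return False  # a string reaching here separates the languages
        for sym in alphabet:
            nxt = (a['delta'][(s1, sym)], b['delta'][(s2, sym)])
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

# "Always p" over {p, q}: accepts non-empty strings consisting only of p.
ALWAYS_P = {
    'start': 'init', 'accept': {'ok'},
    'delta': {('init', 'p'): 'ok',   ('init', 'q'): 'dead',
              ('ok', 'p'): 'ok',     ('ok', 'q'): 'dead',
              ('dead', 'p'): 'dead', ('dead', 'q'): 'dead'},
}
# "Eventually p": accepts strings containing at least one p.
EVENTUALLY_P = {
    'start': 'no', 'accept': {'yes'},
    'delta': {('no', 'p'): 'yes',  ('no', 'q'): 'no',
              ('yes', 'p'): 'yes', ('yes', 'q'): 'yes'},
}

print(dfa_included(ALWAYS_P, EVENTUALLY_P, 'pq'))  # True: □p entails ◇p
print(dfa_included(EVENTUALLY_P, ALWAYS_P, 'pq'))  # False
```

Because both languages are regular, this inclusion check always terminates, which is exactly the decidability property the abstract attributes to collecting witness strings into regular languages.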
The term Personal Learning Environment (PLE) is associated with the desire to put learners in control of their own learning process, so that they are able to set and accomplish their desired learning goals at the right time with the learning environment of their choice. Gradually, such a learning environment comes to include various digital contents, services and tools, which together are summarized as a Virtual Learning Environment (VLE). Even though the construction of an individual PLE is a complex task, several approaches to support this process already exist. They mostly appear under the umbrella term PLE or with slight accentuations such as iPLE, which live especially within the context of institutions. This paper sums up the variety of attempts and technical approaches to establish a PLE and suggests a categorization for them.