Expanding modeling notations
(2021)
Creativity is a common aspect of business processes and thus needs a proper representation through process modeling notations. However, creative processes constitute highly flexible process elements, as new and unforeseeable outcomes are developed. This presents a challenge for modeling languages. Current methods for representing creativity-intensive work are largely unable to capture the creative specifics that are relevant to successfully running and managing these processes. We outline the concept of creative-intensive processes and present an example from a game design process in order to derive critical process aspects relevant to its modeling. Six aspects are identified, first and foremost process flexibility, as well as temporal uncertainty, experience, types of creative problems, phases of the creative process, and individual criteria. By first analyzing which aspects of creative work existing modeling notations already cover, we then discuss which modeling extensions need to be developed to better represent creativity within business processes. We argue that a proper representation of creative work would not only improve the management of these processes but could also enable process actors to run them more efficiently and to adjust them to better fit creative needs.
Despite the phenomenal growth of Big Data Analytics in the last few years, little research has been done to explicate the relationship between Big Data Analytics Capability (BDAC) and the indirect strategic value derived from such digital capabilities. We attempt to address this gap by proposing a conceptual model of the BDAC–innovation relationship using dynamic capability theory. The work expands on BDAC business value research and extends the nominal research done on BDAC and innovation. We focus on BDAC's relationship with different innovation objects, namely product, business process, and business model innovation, impacting all value chain activities. The insights gained will stimulate academic and practitioner interest in explicating the strategic value generated from BDAC and serve as a framework for future research on the subject.
Products or designs that have already been used successfully, past projects, or our own experiences can form the basis for the development of new products. As reference products or existing knowledge, they are reused in the development process and across generations of products. Furthermore, since products are developed in cooperation, the development of new product generations is characterized by knowledge-intensive processes in which information and knowledge are exchanged between different kinds of knowledge carriers. Knowledge transfer here describes the identification of knowledge, its transmission from the knowledge carrier to the knowledge receiver, and its application by the knowledge receiver, which includes the embodied knowledge of physical products. Initial empirical findings on the quantitative effects regarding the speed of knowledge transfer have already been examined. However, the factors influencing the quality of knowledge transfer, which would increase its efficiency and effectiveness in product development, have not yet been examined empirically. Therefore, this paper prepares an experimental setting for the empirical investigation of the quality of knowledge transfers.
Many prediction tasks can be performed based on users' trace data. In this paper, we explore convergent thinking as a personality-related attribute and its relation to features gathered in interactive and non-interactive tasks of an online course. This is an under-utilized attribute that could be used to adapt online courses to learners' creativity levels and thus enhance their motivation. We therefore used the logfile data of a 60-minute Moodle course with N=128 learners, combined with the Remote Associates Test (RAT). Exploring the trace data, we found a weak correlation between interactive tasks and the RAT score, which was the highest across the overall dataset. We trained a Random Forest Regressor to predict convergent thinking based on the trace data and analyzed the feature importance. The results show that the interactive tasks have the highest importance in the prediction, but the accuracy is very low. We discuss the potential for personalizing online courses and address further steps to improve applicability.
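As an illustrative aside (not taken from the study; feature names and values are invented), the correlation step between a trace feature and RAT scores could be sketched in a few lines of plain Python:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical trace feature (minutes spent on interactive tasks) and
# RAT scores for five learners; illustrative values only.
interactive_minutes = [12, 25, 8, 30, 18]
rat_scores = [9, 14, 7, 15, 12]

r = pearson_r(interactive_minutes, rat_scores)
print(round(r, 3))  # → 0.985
```

In a real learning-analytics pipeline this coefficient would be computed per feature over the full logfile dataset before feeding the features into a regressor.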
The devil in disguise
(2021)
Envy constitutes a serious issue on Social Networking Sites (SNSs), as this painful emotion can severely diminish individuals' well-being. With prior research mainly focusing on the affective consequences of envy in the SNS context, its behavioral consequences remain puzzling. While negative interactions among SNS users are an alarming issue, it remains unclear to what extent the harmful emotion of malicious envy contributes to these toxic dynamics. This study constitutes a first step towards understanding malicious envy's causal impact on negative interactions within the SNS sphere. In an online experiment, we experimentally induce malicious envy and measure its immediate impact on users' negative behavior towards other users. Our findings show that malicious envy seems to be an essential factor fueling negativity among SNS users and further illustrate that this effect is especially pronounced when users are provided with an objective factor to mask their envy and justify their norm-violating negative behavior.
The holocaust in the USSR
(2021)
This paper sketches the current status of international scholarship on the subject of the Holocaust in the USSR and its place in the wider military conflict of the Second World War. Research on this topic over the last 20 to 30 years has been truly international and the findings of this research cannot be sketched here without pointing to the contributions made by German, American, Russian, Israeli, British and Australian historians. Historians from these countries have made important contributions to our understanding of key questions relating to this subject. These questions address, among other things, pre-invasion orders issued to German units; the radicalisation of German policy, culminating in the root-and-branch extermination of Soviet Jewry; the network of ghettos set up on Soviet territory; the nature of the killing and the methods used to murder these victims; the total death toll of the Holocaust in the USSR; and the relationship between war and extermination, in which genocide can be regarded as an actual strategy of warfare pursued by the German Reich.
Halide perovskites
(2021)
Since the beginning of the recent global refugee crisis, researchers have been tackling many of its associated aspects, investigating how we can help to alleviate this crisis, in particular using the capabilities of ICTs. In our research, we investigated the use of ICT solutions by refugees to foster the social inclusion process in the host community. To tackle this topic, we conducted thirteen interviews with Syrian refugees in Germany. Our findings reveal different ICT usages by refugees and how these contribute to feeling empowered. Moreover, we show the sources of empowerment that refugees gain through ICT use. Finally, we identify two types of social inclusion benefits derived from these empowerment sources. Our results provide practical implications for different stakeholders and decision-makers on how ICT usage can empower refugees, how this can foster the social inclusion of refugees, and what should be considered to support them in their integration effort.
Increasingly fast development cycles and individualized products pose major challenges for today's smart production systems in times of Industry 4.0. The systems must be flexible and continuously adapt to changing conditions while still guaranteeing high throughput and robustness against external disruptions. Deep reinforcement learning (RL) algorithms, which have already achieved impressive success with Google DeepMind's AlphaGo, are increasingly being transferred to production systems to meet related requirements. Unlike supervised and unsupervised machine learning techniques, deep RL algorithms learn from recently collected sensor and process data in direct interaction with the environment and are able to make decisions in real time. As such, deep RL algorithms seem promising given their potential to provide decision support in complex environments such as production systems and to simultaneously adapt to changing circumstances. While different use cases for deep RL have emerged, a structured overview and integration of findings on their application are missing. To address this gap, this contribution provides a systematic literature review of existing deep RL applications in the field of production planning and control as well as production logistics. From a performance perspective, it became evident that deep RL can significantly beat heuristics in overall performance and provides superior solutions to various industrial use cases. Nevertheless, safety and reliability concerns must be overcome before the widespread use of deep RL is possible, which presupposes more intensive testing of deep RL in real-world applications beyond the already ongoing intensive simulations.
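As a minimal, illustrative sketch of the underlying learning principle (tabular Q-learning on an invented two-machine dispatching toy; real deep RL replaces the table with a neural network and handles multi-step dynamics):

```python
import random

random.seed(42)

# Toy setting: a job arrives with a short or long processing requirement
# (the "state"); the agent picks a machine (the "action") and receives
# the negative processing time as reward, so shorter is better.
PROC_TIME = {("short", "fast_machine"): 1, ("short", "slow_machine"): 2,
             ("long", "fast_machine"): 3, ("long", "slow_machine"): 6}
STATES = ["short", "long"]
ACTIONS = ["fast_machine", "slow_machine"]

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, epsilon = 0.1, 0.2  # learning rate and exploration rate

for episode in range(2000):
    state = random.choice(STATES)
    if random.random() < epsilon:          # explore a random machine
        action = random.choice(ACTIONS)
    else:                                  # exploit the current estimate
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    reward = -PROC_TIME[(state, action)]
    # One-step (bandit-style) update: no successor state in this toy example.
    Q[(state, action)] += alpha * (reward - Q[(state, action)])

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)  # the learned policy prefers the fast machine for both job types
```

Real production-control applications extend this loop with high-dimensional sensor states, delayed rewards over a schedule, and function approximation, but the interaction-and-update cycle is the same.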
Digitization and demographic change are enormous challenges for companies. Learning factories, as innovative learning places, can help prepare older employees for digital change, but they must be designed and configured based on these employees' specific learning requirements. To date, however, there are no particular recommendations to ensure effective age-appropriate training of blue-collar workers in learning factories. Therefore, based on a literature review, design characteristics and attributes of learning factories and the learning requirements of older employees are presented. Furthermore, didactical recommendations for realizing age-appropriate learning designs in learning factories, along with a conceptualized scenario, are outlined by synthesizing the findings.
Yes, we can (?)
(2021)
The COVID-19 crisis has caused an extreme situation for higher education institutions around the world, where exclusively virtual teaching and learning has become obligatory rather than an additional supporting feature. This has created opportunities to explore the potential and limitations of virtual learning formats. This paper presents four theses on virtual classroom teaching and learning that are discussed critically. We use existing theoretical insights extended by empirical evidence from a survey of more than 850 students on acceptance, expectations, and attitudes regarding the positive and negative aspects of virtual teaching. The survey responses were gathered from students at different universities during the first completely digital semester (Spring-Summer 2020) in Germany. We discuss similarities and differences between the subjects being studied and highlight the advantages and disadvantages of virtual teaching and learning. Against the background of existing theory and the gathered data, we emphasize the importance of social interaction, the combination of different learning formats, and thus context-sensitive hybrid learning as the learning form of the future.
The game itself?
(2020)
In this paper, we reassess the notion and current state of ludohermeneutics in game studies, and propose a more solid foundation for how to conduct hermeneutic game analysis. We argue that there can be no ludo-hermeneutics as such, and that every game interpretation rests in a particular game ontology, whether implicit or explicit. The quality of this ontology, then, determines a vital aspect of the quality of the analysis.
Public blockchain
(2020)
Blockchain has the potential to change business transactions to a major extent. Underlying consensus algorithms are the core mechanism for achieving consistency in distributed infrastructures. Their application aims for transparency and accountability in societal transactions. Because existing reviews do not holistically cover consensus algorithms, we aim to (1) identify prevalent consensus algorithms for public blockchains and (2) address the resource perspective with a sustainability consideration covering the three spheres of sustainability. Our systematic literature review identified 33 different consensus algorithms for public blockchains. Our contribution is twofold: first, we provide a systematic summary of consensus algorithms for public blockchains derived from the scientific literature as well as real-world applications and systemize them according to their research focus; second, we assess the sustainability of consensus algorithms using a representative sample and thereby highlight the gaps in the literature regarding the holistic sustainability of consensus algorithms.
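To make the resource perspective concrete: the proof-of-work family of consensus algorithms amounts to a brute-force nonce search, which is why its energy cost grows with the difficulty target. A toy sketch (invented block data, trivially low difficulty, not a real blockchain implementation):

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Find a nonce such that SHA-256(block_data + nonce) starts with
    `difficulty` hex zeros. Expected work grows by 16x per extra zero."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Toy difficulty of 4 hex zeros (~65k hashes on average); real networks
# use vastly harder targets, hence the sustainability debate.
nonce = proof_of_work("block#1", 4)
print(nonce)
```

Alternative families surveyed in such reviews (e.g., proof-of-stake) replace this hash race with stake-weighted validator selection, which is the main lever for reducing resource consumption.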
Developing a new paradigm
(2020)
Internet users commonly agree that it is important for them to protect their personal data. However, the same users readily disclose their data when requested by an online service. The dichotomy between privacy attitude and actual behaviour is commonly referred to as the "privacy paradox". Over twenty years of research have not been able to provide one comprehensive explanation for the paradox, and research seems even further from providing actual means to overcome it. We argue that the privacy paradox is not just an instantiation of the attitude-behaviour gap. Instead, we introduce a new paradigm explaining the paradox as the result of attitude-intention and intention-behaviour gaps. Historically, motivational goal-setting psychologists addressed the issue of intention-behaviour gaps in terms of the Rubicon Model of Action Phases and argued that commitment and volitional strength are essential mechanisms that fuel intentions and translate them into action. Thus, in this study we address the privacy paradox from a motivational psychological perspective by developing two interventions on Facebook and assessing whether the 287 participants of our online experiment actually change their privacy behaviour. The results demonstrate the presence of an intention-behaviour gap and the efficacy of our interventions in reducing the privacy paradox.
As Industry 4.0 infrastructures are highly evolutionary environments with volatile, time-dependent workloads for analytical tasks, the optimal dimensioning of IT hardware in particular is a challenge for decision makers, because the digital processing of these tasks can be decoupled from its physical place of origin. Flexible architecture models that allocate tasks efficiently with regard to multi-faceted aspects across a predefined set of local systems and external cloud services have been proven in small example scenarios. This paper provides a benchmark of existing task realization strategies, composed of (1) task distribution and (2) task prioritization, in a real-world scenario simulation. It identifies heuristics as the superior strategies.
How games spoil creativity
(2020)
The demand for a creative workforce is ever growing, and effective measures to improve individual creativity are being sought. This study analyzes the possibility of using games as a prime for a creative mindset. Two short entertainment games, plus a no-game comparison condition, were set up in three versions of an online study, along with two creativity tasks and scales to assess the individual creative mindset (fixed vs. growth, creative self-efficacy, and affect). Results indicate priming effects of the games, but in the opposite of the intended direction: gaming diminished creative test performance. Those playing the games reported more ideas in the open-ended creative problem task, but those answers were of lower quality, and they solved fewer closed-problem items compared to those not playing. An impact of further mindset differences could be ruled out.
From learners to educators
(2020)
The rapid growth of technology and its evolving potential to support the transformation of teaching and learning in post-secondary institutions pose a major challenge to the basic understanding of both the university and the communities it serves. In higher education, the standard forms of learning and teaching are increasingly being challenged, and a more comprehensive process of differentiation is taking place. Student-centered teaching methods are becoming increasingly important in course design, and the role of the lecturer is changing from knowledge mediator to moderator and learning companion. However, this is accelerating the need for strategically planned faculty support and a reassessment of the role of teaching and learning. Even though the benefits of experience-based learning approaches for the development of life skills are well known, most knowledge transfer in higher education is still realized through lectures. Teachers aim to design the curriculum and new assignments and to share insights into evolving pedagogy. Student engagement may be the most important factor in the learning success of university students, regardless of the university program or teaching format. Against this background, this article presents the development, application, and initial findings of an innovative learning concept. In this concept, students engage with a scientific topic, but instead of a presentation and a written elaboration, their examination consists of developing an online course (in terms of content, didactics, and concept) and implementing it in a state-of-the-art learning environment. The online courses include both self-created teaching material and interactive tasks. After a review process, the courses are made available to other students as learning material and are thus incorporated into the curriculum.
The envy spiral
(2020)
On Social Networking Sites (SNS), users disclose mostly positive and often self-enhancing information. Scholars refer to this phenomenon as the positivity bias in SNS communication (PBSC). However, while theoretical explanations for this phenomenon have been proposed, empirical proof of these theorized mechanisms is still missing. The project presented in this Research-in-Progress paper aims at explaining the PBSC with the mechanism specified in the self-enhancement envy spiral. Specifically, we hypothesize that feelings of envy drive people to post positive and self-enhancing content on SNS. To test this hypothesis, we developed an experimental design that allows us to examine the causal effect of envy on the positivity of users' subsequently posted content. In a preliminary study, we tested our manipulation of envy and showed its effectiveness in inducing different levels of envy between our groups. Our project will help to broaden the understanding of the complex dynamics of SNS and the potentially adverse driving forces underlying them.
The book offers a comprehensive overview of current research in Slavic linguistics from a theoretical and experimental perspective and across a variety of languages. The selected papers from the 11th European Conference on Formal Description of Slavic Languages (FDSL 11), which took place at the University of Potsdam in 2015, illustrate the advancement of Slavic linguistic studies and their relevance to the development of general linguistics. The guest paper by Noam Chomsky at the beginning of the book sends a clear signal in this direction and may be taken as an acknowledgement of the field.
Missing out on life
(2020)
Mobile devices have become an integral part of everyday life due to their portability. As the literature shows, technology use is not only beneficial but also has dark sides, such as addiction. Parents face the need to balance the perceived benefits and risks of children's exposure to mobile technologies. However, no study has uncovered what kinds of benefits and concerns parents consider when implementing technology-related rules. We built on qualitative responses of 300 parents of children aged two to thirteen to explore concerns about, and perceived benefits of, children's smartphone and tablet usage, as well as the rules parents have developed regarding technology use. Findings point to concerns regarding children's development, as well as benefits for both children and parents, and ultimately to new insights about mobile technology mediation. These results provide practical guidance for parents, physicians, and mobile industry stakeholders trying to ensure that children act responsibly with mobile technology.
Living in a world of plenty?
(2020)
Inequality in the distribution of economic wealth within populations has been rising steadily over the past century, having reached unprecedented highs in many Western societies. However, this development is not reflected in people’s perceptions of wealth inequality, as the public tends to underestimate it. Research suggests that inequality estimates are derived from personal reference groups, which, as we propose, are expanded by social network site (SNS) use. As content on SNSs frequently revolves around events of consumption, signaling enhanced overall population wealth, this study tests the hypothesis that SNS use distorts inequality perceptions downward, i.e., increases the perception of societal equality. Responses of 534 survey participants in the United States confirm that SNS use negatively predicts perceived inequality. The relationship is stronger the more SNS users perceive the content they encounter online as real, supporting the assumption that observing other people’s behavior online lowers estimates of nationwide wealth inequality. These findings provide novel insights on inequality misperceptions by suggesting individuals’ SNS use as a new predictor of perceived wealth inequality.
How messy is your news feed
(2020)
Social Networking Sites (SNSs) are pervasive in our daily lives. However, emerging reports suggest that people are increasingly dissatisfied with their experience of SNS News Feeds. Motivated by cognitive load theory, this paper postulates that the arrangement and presentation of information are important constituents of one's Facebook News Feed experience. Integrating these factors into the novel concept of 'perceived disorder', the paper hypothesizes that the perception of disorder elicited by the Facebook News Feed plays an important role in causing discontinuance intentions. Drawing on the Stressor-Strain-Outcome Model, we suggest that perceived disorder leads to SNS discontinuance intention and is partially mediated by SNS fatigue. The paper uses the responses of 268 Facebook users to investigate these relationships and introduces perceived disorder as a novel stressor. Besides adding to the existing body of literature, these insights are of relevance to internet service providers, policy makers, and SNS users.
Data sharing requires researchers to publish their (primary) data and any supporting research materials. With increased attention on reproducibility and more transparent research requiring sharing of data, the issues surrounding data sharing are moving beyond whether data sharing is beneficial, to what kind of research data should be shared and how. However, despite its benefits, data sharing still is not common practice in Information Systems (IS) research. The panel seeks to discuss the controversies related to data sharing in research, specifically focusing on the IS discipline. It remains unclear how the positive effects of data sharing that are often framed as extending beyond the individual researcher (e.g., openness for innovation) can be utilized while reducing the downsides often associated with negative consequences for the individual researcher (e.g., losing a competitive advantage). To foster data sharing practices in IS, the panel will address this dilemma by drawing on the panelists’ expertise.
Background:
Anti-TNFα monoclonal antibodies (mAbs) are a well-established treatment for patients with Crohn’s disease (CD). However, subtherapeutic concentrations of mAbs have been related to a loss of response during the first year of therapy [1]. Therefore, an appropriate dosing strategy is crucial to prevent the underexposure of mAbs in these patients. The aim of our study was to assess the impact of different dosing strategies (fixed dose or adapted to a body size descriptor) on drug exposure and target concentration attainment for two different anti-TNFα mAbs: infliximab (IFX, body weight (BW)-based dosing) and certolizumab pegol (CZP, fixed dosing). For this purpose, a comprehensive pharmacokinetic (PK) simulation study was performed.
Methods:
A virtual population of 1000 clinically representative CD patients was generated based on the distribution of CD patient characteristics from an in-house clinical database (n = 116). Seven dosing regimens were investigated: fixed dose and per BW, lean BW (LBW), body surface area, height, body mass index, and fat-free mass. The individual body size-adjusted doses were calculated from patient-generated body size descriptor values. Then, using published PK models for IFX and CZP in CD patients [2,3], 1000 concentration–time profiles were simulated for each patient to capture the typical profile of a specific patient as well as the range of possible individual profiles due to unexplained PK variability across patients. For each dosing strategy, the variability in maximum and minimum mAb concentrations (Cmax and Cmin, respectively) and the area under the concentration–time curve (AUC), as well as the per cent of patients reaching the target concentration, were assessed during maintenance therapy.
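As an illustrative sketch only (a generic one-compartment IV bolus model at steady state with invented parameter values, not the published IFX/CZP population PK models), the core of such a concentration-time simulation could look like:

```python
from math import exp

def steady_state_profile(dose_mg, cl_l_per_day, v_l, tau_days, n_points=100):
    """Steady-state concentration-time profile over one dosing interval for a
    one-compartment IV bolus model with first-order elimination."""
    k = cl_l_per_day / v_l                           # elimination rate constant
    accumulation = 1.0 / (1.0 - exp(-k * tau_days))  # multiple-dose accumulation
    times = [tau_days * i / (n_points - 1) for i in range(n_points)]
    return [(dose_mg / v_l) * accumulation * exp(-k * t) for t in times]

# Hypothetical IFX-like regimen: 5 mg/kg for a 70 kg patient every 8 weeks,
# with invented clearance and volume values.
profile = steady_state_profile(dose_mg=350, cl_l_per_day=0.3, v_l=5.0,
                               tau_days=56)
cmax, cmin = max(profile), min(profile)
print(round(cmax, 1), round(cmin, 2))
```

A full simulation study as described above would additionally draw individual PK parameters from the published population distributions and repeat this per virtual patient to quantify Cmax/Cmin/AUC variability and target attainment.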
Results:
For IFX and CZP, Cmin showed the highest variability between patients (CV ≈110% and CV ≈80%, respectively) with a similar extent across all dosing strategies. For IFX, the per cent of patients reaching the target (Cmin = 5 µg/ml) was similar across all dosing strategies (~15%). For CZP, the per cent of patients reaching the target average concentration of 17 µg/ml ranged substantially (52–71%), being the highest for LBW-adjusted dosing.
Conclusion:
Using a PK simulation approach, the different dosing regimens of IFX and CZP revealed the highest variability for Cmin, the PK parameter most commonly used to guide treatment decisions, independent of the dosing regimen. Our results demonstrate similar target attainment with fixed dosing of IFX compared with the currently recommended BW-based dosing. For CZP, the current fixed dosing strategy leads to a percentage of patients reaching the target comparable to the best-performing body size-adjusted dosing (66% vs. 71%, respectively).
In times of digitalization, the demand for organizational change is rising, and ways are needed to cope with fundamental changes at the organizational as well as the individual level. As a basis, learning and forgetting mechanisms need to be understood in order to guide a change process efficiently and successfully. Our research aims at a better understanding of individual differences and mechanisms in the change context through an experiment in which individuals learn and later re-learn a complex production process in a simulation setting. The individuals' performance, as well as their retentivity and prior knowledge, is assessed. Our results show that higher retentivity goes along with better learning and forgetting performance. Prior knowledge did not reveal such a relation to learning and forgetting performance. The influence of age and gender is discussed in detail.
Does a smile open all doors?
(2020)
Online photographs govern an individual’s choices across a variety of contexts. In sharing arrangements, facial appearance has been shown to affect the desire to collaborate, interest to explore a listing, and even willingness to pay for a stay. Because of the ubiquity of online images and their influence on social attitudes, it seems crucial to be able to control these aspects. The present study examines the effect of different photographic self-disclosures on the provider’s perceptions and willingness to accept a potential co-sharer. The findings from our experiment in the accommodation-sharing context suggest social attraction mediates the effect of photographic self-disclosures on willingness to host. Implications of the results for IS research and practitioners are discussed.
Background:
Childhood and adolescence are critical stages of life for mental health and well-being. Schools are a key setting for mental health promotion and illness prevention. One in five children and adolescents have a mental disorder, with about half of mental disorders beginning before the age of 14. Beneficial and explainable artificial intelligence can replace current paper-based and online approaches to school mental health surveys. This can enhance data acquisition, interoperability, data-driven analysis, trust, and compliance. This paper presents a model for using chatbots for non-obtrusive data collection and supervised machine learning models for data analysis, and discusses ethical considerations pertaining to the use of these models.
Methods:
For data acquisition, the proposed model uses chatbots which interact with students. The conversation log acts as the source of raw data for the machine learning. Pre-processing of the data is automated by filtering for keywords and phrases.
Existing survey results, obtained through current paper-based data collection methods, are evaluated by domain experts (health professionals). These can be used to create a test dataset to validate the machine learning models. Supervised learning can then be deployed to classify specific behaviour and mental health patterns.
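A minimal sketch of the keyword-filtering pre-processing step described above (log entries and keyword list are invented for illustration; real keyword lists would come from domain experts):

```python
# Hypothetical chatbot log entries and a toy keyword filter, as a sketch of
# the automated pre-processing step before supervised classification.
KEYWORDS = {"sad", "lonely", "stressed", "happy", "sleep"}

conversation_log = [
    "I felt stressed before the exam",
    "what time is lunch today",
    "I could not sleep last night",
]

def filter_log(entries, keywords):
    """Keep only entries containing at least one keyword (case-insensitive)."""
    return [e for e in entries if keywords & set(e.lower().split())]

relevant = filter_log(conversation_log, KEYWORDS)
print(relevant)  # the off-topic lunch question is filtered out
```

The retained entries would then be labeled by domain experts and used to train and validate the supervised models.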
Results:
We present a model that can be used to improve upon current paper-based data collection and manual data analysis methods. An open-source GitHub repository contains the necessary tools and components of this model. Privacy is respected through rigorous observance of confidentiality and data protection requirements. Critical reflection on these ethical and legal aspects is included in the project.
Conclusions:
This model strengthens mental health surveillance in schools. The same tools and components could be applied to other public health data. Future extensions of this model could also incorporate unsupervised learning to find clusters and patterns of unknown effects.
The “output-orientation” is omnipresent in teacher education. In order to evaluate teachers' and students' performances, a wide range of quantitative questionnaires exists worldwide. One important goal of teaching evaluation is to increase the quality of teaching and learning. The author argues that standard evaluations, typically administered at the end of the semester, are problematic for two reasons. First, some of the questions are too general and don't offer concrete ideas as to what kind of actions can be taken to improve the courses. Second, the evaluation is mostly conducted when the course is already over. In response to this criticism, Apelojg invented the Felix app, which offers the possibility to give feedback in real time by asking for the emotions and needs that occur during different learning situations. The idea is very simple: positive emotions and satisfied needs are helpful for the learning process, while negative emotions and unsatisfied needs have negative effects on it. First descriptive results show that “managing emotions” during classes can have positive effects on both motivation and emotions.
RAA2019
(2019)
These abstracts result from the 10th International Congress on the Application of Raman Spectroscopy in Art and Archaeology, held 3–7 September 2019 in Potsdam (Germany).
The RAA is an established biennial international conference series. Since its beginning in 2001, the RAA conferences have promoted Raman spectroscopy and played an important role in expanding the field of its applications in art history, history, archaeology, palaeontology, conservation and restoration, museology, degradation of cultural heritage, archaeometry, etc. Furthermore, the development of new instrumentation, especially for non-invasive measurements, receives great attention.
The Congress covers all topics of Raman spectroscopic applications in art and archaeology and focuses on the following themes:
• Material characterization and degradation processes
• Conservation issues affecting cultural heritage
• Raman spectroscopy of biological and organic materials
• Surface enhanced Raman spectroscopy
• Chemometrics in Raman spectroscopy
• Development of Raman techniques
• New Raman instrumentation and applications in the investigation of cultural heritage objects
• Raman spectroscopy in paleontology, paleoenvironment and archaeology
Modern web browsers are digital software platforms, as they allow third parties to extend functionality by providing extensions. Given the intense competition, differentiation through provided functionality is a key factor for browser platforms. As browsers progress, they constantly release new features. Browsers may thereby enter complementary markets if they add functionality formerly provided by third-party extensions, which is referred to as ‘platform coring’. Previous studies have missed the perspective of the involved parties. To address this gap, we conduct interviews with third-party and core developers in the security and privacy domain from Firefox and Chrome. In essence, the study provides three contributions: first, insights into stakeholder-specific issues concerning coring; second, measures to prevent coring; third, strategic guidance for developers and platform owners. Third-party developers have experienced, and core developers acknowledge, coring on browser platforms. While developers with extrinsic motivations assess coring negatively, developers with intrinsic motivations perceive it positively.
Accelerating knowledge
(2019)
As knowledge-intensive processes are often carried out in teams and demand knowledge transfers among various knowledge carriers, any optimization that accelerates knowledge transfers holds great economic potential. Exemplified by product development projects, knowledge transfers focus on knowledge acquired in former situations and product generations. An adjustment in the manifestation of a knowledge transfer in its concrete situation, here called an intervention, can therefore be directly connected to the adequate speed optimization of knowledge-intensive process steps. This contribution presents the specification of seven concrete interventions following an intervention template. Further, it describes the design and results of an expert workshop as a descriptive study. The workshop was used to assess the practical relevance of the interventions designed as well as to identify practical success factors and barriers to their implementation.
Track and Treat
(2018)
E-mail tracking mechanisms gather information on individual recipients’ reading behavior. Previous studies show that e-mail newsletters commonly include tracking elements. However, prior work does not examine the degree to which e-mail senders actually employ the gathered user information. This paper closes this research gap by means of an experimental study to clarify the use of tracking-based information. To that end, twelve mail accounts are created, each of which subscribes to a pre-defined set of newsletters from companies based in Germany, the UK, and the USA. Systematically varying e-mail reading patterns across accounts, each account simulates a different type of user with individual reading behavior. Assuming senders track e-mail reading habits, we expect changes in mailer behavior. The analysis confirms the prominence of tracking in that over 92% of the newsletter e-mails contain tracking images. For 13 out of 44 senders, an adjustment of communication policy in response to user reading behavior is observed. Observed effects include sending newsletters at different times, adapting advertised products to match the users’ IT environment, increased or decreased mailing frequency, and mobile-specific adjustments. Regarding legal issues, not all companies that adapt their mail-sending behavior disclose the use of such mechanisms in their privacy policy.
In honour of Seymour Papert
(2018)
Forth is nice and flexible, but to a philosopher and teacher educator, Logo is the more impressive language. Both are relatives of Lisp, but Forth uses reverse Polish notation whereas Logo has an infix notation. Logo allows top-down programming, Forth only bottom-up. Logo enables recursive programming, Forth does not. Logo includes turtle graphics, Forth has nothing comparable. So what do you do if you can't get Logo and have no information about its inner architecture? This should be a case of "empirical modelling": how can you model observable results of the behaviour of Logo in terms of Forth? The main steps to solve this problem are shown in the first part of the paper.
The second part of the paper discusses the problem of modelling and shows that the modelling of making and the modelling of recognition have the same mathematical structure. So "empirical modelling" can also serve for modelling desired behaviour of technical systems.
The last part of the paper will show that the heuristic potential of a problem which should be modeled is more important than the programming language. The Picasso construal shows, in a very simple way, how children of different ages can model emotional relations in human behaviour with a simple Logo system.
e-ASTROGAM is a concept for a breakthrough observatory space mission carrying a gamma-ray telescope dedicated to the study of the non-thermal Universe in the photon energy range from 0.15 MeV to 3 GeV. The lower energy limit can be pushed down to energies as low as 30 keV for gamma-ray burst detection with the calorimeter. The mission is based on an advanced space-proven detector technology, with unprecedented sensitivity, angular and energy resolution, combined with remarkable polarimetric capability. Thanks to its performance in the MeV-GeV domain, substantially improving on its predecessors, e-ASTROGAM will open a new window on the non-thermal Universe, making pioneering observations of the most powerful Galactic and extragalactic sources, elucidating the nature of their relativistic outflows and their effects on the surroundings. With a line sensitivity in the MeV energy range one to two orders of magnitude better than previous and current generation instruments, e-ASTROGAM will determine the origin of key isotopes fundamental for the understanding of supernova explosions and the chemical evolution of our Galaxy. The mission will be a major player of the multiwavelength, multimessenger time-domain astronomy of the 2030s, and provide unique data of significant interest to a broad astronomical community, complementary to powerful observatories such as LISA, LIGO, Virgo, KAGRA, the Einstein Telescope and the Cosmic Explorer, IceCube-Gen2 and KM3NeT, SKA, ALMA, JWST, E-ELT, LSST, Athena, and the Cherenkov Telescope Array.
Natural hazards such as floods, earthquakes, landslides, and multi-hazard events heavily affect human societies and call for better management strategies. Due to the severity of such events, it is of utmost importance to understand whether and how they change in response to evolving hydro-climatological, geophysical and socio-economic conditions. These conditions jointly determine the magnitude, frequency, and impact of disasters, and are changing in response to climate change and human behavior. Therefore, methods are needed for hazard and risk quantification that account for the transient nature of hazards and risks in response to changing natural and anthropogenically altered systems. The purpose of this conference is to bring together researchers from the natural sciences (e.g. hydrology, meteorology, geomorphology, hydraulic engineering, environmental science, seismology, geography), risk research, nonlinear systems dynamics, and applied mathematics to discuss new insights and developments concerning data science, changing systems, multi-hazard events and the linkage between hazards and vulnerabilities under unstable environmental conditions. Knowledge transfer, communication and networking will be key issues of the conference. The conference is organized by means of invited talks given by outstanding experts, oral presentations, poster sessions and discussions.
Web Tracking
(2018)
Web tracking seems to be becoming ubiquitous in online business and leads to increased privacy concerns among users. This paper provides an overview of the current state of the art of web-tracking research, aiming to reveal the relevance and methodologies of this research area and to create a foundation for future work. In particular, this study addresses the following research questions: What methods are followed? What results have been achieved so far? What are potential future research areas? To these ends, a structured literature review based upon an established methodological framework is conducted. The identified articles are investigated with respect to the applied research methodologies and the aspects of web tracking they emphasize.
The paper deals with the increasing growth of embedded systems and their role within Internet-like structures (Internet of Things), as systems that provide computing power and are more or less suited to analytical tasks. Using the example of a cyber-physical manufacturing system, a common objective function is developed with the intention of measuring efficient task processing within analytical infrastructures. A first validation is realized on the basis of an expert panel.
Integral Fourier operators
(2017)
This volume of contributions, based on lectures delivered at a school on Fourier Integral Operators held in Ouagadougou, Burkina Faso, 14–26 September 2015, provides an introduction to Fourier Integral Operators (FIO) for a readership of Master and PhD students as well as any interested layperson. Considering the wide spectrum of their applications and the richness of the mathematical tools they involve, FIOs lie at the crossroads of many fields. This volume offers the necessary background, whether analytic or geometric, to get acquainted with FIOs, complemented by more advanced material presenting various aspects of active research in the area.
Many markets are characterized by pricing competition. Typically, the competitors involved adjust their prices in response to one another with different frequencies. We analyze stochastic dynamic pricing models under competition for the sale of durable goods. Given a competitor’s pricing strategy, we show how to derive optimal response strategies that take the competitor’s anticipated price adjustments into account. We study the resulting price cycles and the associated expected long-term profits. We show that reaction frequencies have a major impact on a strategy’s performance. In order not to act predictably, our model also allows randomized reaction times to be included. Additionally, we study to which extent optimal response strategies of active competitors are affected by additional passive competitors that use constant prices. It turns out that optimized feedback strategies effectively avoid a decline in price. They help to gain profits, especially when aggressive competitors are involved.
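The price cycles that arise from mutual undercutting can be illustrated with a toy simulation. The undercutting rule, the price floor and ceiling, and all numeric parameters below are illustrative assumptions, not the paper's actual stochastic model:

```python
# Toy sketch of competitive price cycles: a seller undercuts its competitor
# by a fixed step until a price floor is reached, then resets to a high
# price, restarting the cycle. All parameters are illustrative.

FLOOR, CEILING, STEP = 5.0, 20.0, 1.0

def respond(competitor_price):
    """Best response: undercut if still profitable, otherwise reset."""
    if competitor_price - STEP >= FLOOR:
        return competitor_price - STEP
    return CEILING  # undercutting no longer profitable -> restart the cycle

def simulate(rounds, p_a=20.0, p_b=19.0):
    """Let two sellers A and B react to each other in alternating rounds."""
    prices = []
    for t in range(rounds):
        if t % 2 == 0:
            p_a = respond(p_b)
        else:
            p_b = respond(p_a)
        prices.append((round(p_a, 2), round(p_b, 2)))
    return prices

cycle = simulate(40)
```

Plotting `cycle` shows the sawtooth pattern of alternating undercutting and resets; varying the reaction frequency (here implicit in the strict alternation) changes the shape and profitability of the cycle.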
Coring on Digital Platforms
(2017)
Today’s mobile devices are part of powerful business ecosystems, which usually involve digital platforms. To better understand the complex phenomenon of coring and related dynamics, this paper presents a case study comparing iMessage as part of Apple’s iOS and WhatsApp. Specifically, it investigates activities regarding platform coring, as the integration of several functionalities provided by third-party applications in the platform core. The paper makes three contributions. First, a systematization of coring activities is developed. Coring modes are differentiated by the amount of coring and application maintenance. Second, the case study revealed that the phenomenon of platform coring is present on digital platforms for mobile devices. Third, the fundamentals of coring are discussed as a first step towards theoretical development. Even though coring constitutes a potential threat for third-party developers regarding their functional differentiation, an idea of what a beneficial partnership incorporating coring activities could look like is developed here.
The interdisciplinary workshop STOCHASTIC PROCESSES WITH APPLICATIONS IN THE NATURAL SCIENCES was held in Bogotá, at Universidad de los Andes, from December 5 to December 9, 2016. It brought together researchers from Colombia, Germany, France, Italy, and Ukraine, who communicated recent progress in mathematical research related to stochastic processes with applications in biophysics.
The present volume collects three of the four courses held at this meeting by Angelo Valleriani, Sylvie Rœlly and Alexei Kulik.
A particular aim of this collection is to inspire young scientists in setting up research goals within the wide scope of fields represented in this volume.
Angelo Valleriani, PhD in high energy physics, is group leader of the team "Stochastic processes in complex and biological systems" from the Max-Planck-Institute of Colloids and Interfaces, Potsdam.
Sylvie Rœlly, Docteur en Mathématiques, is the head of the chair of Probability at the University of Potsdam.
Alexei Kulik, Doctor of Sciences, is a leading researcher at the Institute of Mathematics of the Ukrainian National Academy of Sciences.
Traditional production systems are being enhanced by cyber-physical systems (CPS) and the Internet of Things. As a kind of next-generation system, these cyber-physical production systems (CPPS) are able to raise the level of autonomy of their production components. To find the optimal degree of autonomy in a given context, a research approach is formulated using a simulation concept. Based on requirements and assumptions, a cyber-physical market is modeled and qualitative hypotheses are formulated, which will be verified with the help of the CPPS of a hybrid simulation environment.
This article provides some insights into the complex relationships between thinking and behavioral patterns, biographical aspects, and teaching style. The data was analyzed in the Grounded Theory tradition and with the help of ATLAS.ti. The results presented here offer preliminary findings only, since the research is still ongoing. The focus is on the ways teachers deal with mistakes. Based on two case examples, it will be shown how the fear of making mistakes can lead to teacher-centered lessons, thereby limiting pupils' possibilities to learn autonomously.
E-Mail Tracking
(2016)
E-mail advertisement, as one instrument in the marketing mix, allows companies to collect fine-grained behavioural data about individual users’ e-mail reading habits, realised through sophisticated tracking mechanisms. Such tracking can be harmful for user privacy and security. This problem is especially severe since e-mail tracking techniques gather data without user consent. Striving to increase privacy and security in e-mail communication, the paper makes three contributions. First, a large database of newsletter e-mails is developed. This data facilitates investigating the prevalence of e-mail tracking among 300 global enterprises from Germany, the United Kingdom and the United States. Second, countermeasures are developed for automatically identifying and blocking e-mail tracking mechanisms without impeding the user experience. The approach consists of identifying important tracking descriptors and creating a neural network-based detection model. Last, the effectiveness of the proposed approach is established by means of empirical experimentation. The results suggest a classification accuracy of 99.99%.
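One kind of tracking descriptor such an approach might rely on is the hidden, personalized 1x1 "tracking pixel" image. The following stdlib-only sketch shows a hand-rolled heuristic for flagging such images; it is an illustration of the general idea, not the paper's neural-network model, and the example URLs are made up:

```python
# Illustrative heuristic: flag <img> tags that are both tiny (1x1 or 0x0)
# and carry a per-recipient token in the URL query string. This is a
# simplification, not the neural-network detector described in the paper.

from html.parser import HTMLParser

class PixelFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        tiny = a.get("width") in ("0", "1") and a.get("height") in ("0", "1")
        personalised = "?" in a.get("src", "")  # query string may identify the recipient
        if tiny and personalised:
            self.suspects.append(a["src"])

def find_tracking_pixels(html):
    finder = PixelFinder()
    finder.feed(html)
    return finder.suspects

newsletter = (
    '<img src="https://example.com/logo.png" width="120" height="40">'
    '<img src="https://example.com/t.gif?uid=abc123" width="1" height="1">'
)
print(find_tracking_pixels(newsletter))
```

A real detector needs many more descriptors (image position, CSS hiding, URL entropy, sender domain mismatches), which is precisely why a learned model outperforms single-rule heuristics like this one.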
The term Personal Learning Environment (PLE) is associated with the desire to put learners in control of their own learning process, so that they are able to set and accomplish the desired learning goals at the right time with the learning environment of their choice. Gradually, such a learning environment comes to include various digital content, services and tools; together these constitute the Virtual Learning Environment (VLE). Even though the construction of an individual PLE is a complex task, several approaches to support this process already exist. They mostly appear under the umbrella term PLE or with slight accentuations such as iPLE, which live especially within the context of institutions. This paper sums up the variety of attempts and technical approaches to establish a PLE and suggests a categorization for them.
In times of digitalization, the collection and modeling of business processes is still a challenge for companies. The demand for trustworthy process models that reflect the actual execution steps therefore increases. The respective kinds of processes significantly determine both business process analysis and the conception of future target processes, and they are the starting point for any kind of change initiative. Existing approaches to modeling as-is processes, like process mining, are exclusively focused on reconstruction. For this, transactional protocols and limited data from a single application system are used. Heterogeneous application landscapes and business processes that are executed across multiple application systems, by contrast, are one of the main challenges in process mining research. Using RFID technology is hence one approach to closing the existing gap between different application systems. This paper focuses on methods for data collection from real-world objects via RFID technology and possible combinations with application data (process mining) in order to realize a cross-system mining approach.
HPI Future SOC Lab
(2016)
The “HPI Future SOC Lab” is a cooperation of the Hasso Plattner Institute (HPI) and industrial partners. Its mission is to enable and promote exchange and interaction between the research community and the industrial partners.
The HPI Future SOC Lab provides researchers with free-of-charge access to a complete infrastructure of state-of-the-art hardware and software. This infrastructure includes components which might be too expensive for an ordinary research environment, such as servers with up to 64 cores and 2 TB main memory. The offerings address researchers particularly from, but not limited to, the areas of computer science and business information systems. Main areas of research include cloud computing, parallelization, and in-memory technologies.
This technical report presents results of research projects executed in 2016. Selected projects presented their results on April 5th and November 3rd, 2016 at the Future SOC Lab Day events.
KEYCIT 2014
(2015)
In our rapidly changing world it is increasingly important not only to be an expert in a chosen field of study but also to be able to respond to developments, master new approaches to solving problems, and fulfil changing requirements in the modern world and in the job market. In response to these needs key competencies in understanding, developing and using new digital technologies are being brought into focus in school and university programmes. The IFIP TC3 conference "KEYCIT – Key Competences in Informatics and ICT (KEYCIT 2014)" was held at the University of Potsdam in Germany from July 1st to 4th, 2014 and addressed the combination of key competencies, Informatics and ICT in detail. The conference was organized into strands focusing on secondary education, university education and teacher education (organized by IFIP WGs 3.1 and 3.3) and provided a forum to present and to discuss research, case studies, positions, and national perspectives in this field.
E-Mail tracking uses personalized links and pictures for gathering information on user behavior, for example, where, when, on what kind of device, and how often an e-mail has been read. This information can be very useful for marketing purposes. On the other hand, privacy and security requirements of customers could be violated by tracking. This paper examines how e-mail tracking works, how it can be detected automatically, and to what extent it is used in German e-commerce. We develop a detection model and software tool in order to collect and analyze more than 600 newsletter e-mails from companies of several different industries. The results show that the usage of e-mail tracking in Germany is prevalent but also varies depending on the industry.
Wolf-Rayet Stars
(2015)
Nearly 150 years ago, the French astronomers Charles Wolf and Georges Rayet described stars with very conspicuous spectra dominated by bright and broad emission lines. Later termed Wolf-Rayet stars after their discoverers, those objects turned out to represent important stages in the life of massive stars.
As the first conference in a long time specifically dedicated to Wolf-Rayet stars, an international workshop was held in Potsdam, Germany, from 1–5 June 2015. About 100 participants, comprising most of the leading experts in the field as well as many young scientists, gathered for one week of extensive scientific exchange and discussions. Considerable progress was reported throughout, e.g. on finding such stars, modeling and analyzing their spectra, understanding their evolutionary context, and studying their circumstellar nebulae. While some major questions regarding Wolf-Rayet stars still remain open 150 years after their discovery, it is clear today that these objects are not just interesting stars as such, but also keystones in the evolution of galaxies.
These proceedings summarize the talks and posters presented at the Potsdam Wolf-Rayet workshop. Moreover, they also include the questions, comments, and discussions emerging after each talk, thereby giving a rare overview not only about the research, but also about the current debates and unknowns in the field. The Scientific Organizing Committee (SOC) included Alceste Bonanos (Athens), Paul Crowther (Sheffield), John Eldridge (Auckland), Wolf-Rainer Hamann (Potsdam, Chair), John Hillier (Pittsburgh), Claus Leitherer (Baltimore), Philip Massey (Flagstaff), George Meynet (Geneva), Tony Moffat (Montreal), Nicole St-Louis (Montreal), and Dany Vanbeveren (Brussels).
Pak Choi Fed to Mice: Formation of DNA Adducts and Influence on Xenobiotic-Metabolizing Enzymes
(2015)
HPI Future SOC Lab
(2015)
The Future SOC Lab at HPI is a cooperation of the Hasso Plattner Institute with various industrial partners. Its mission is to enable and promote exchange between the research community and industry.
The Lab provides interested researchers with an infrastructure of the latest hardware and software, free of charge for research purposes. This includes technologies that in some cases are not yet available on the market and that could usually not be financed in an ordinary academic environment, e.g. servers with up to 64 cores and 2 TB main memory. These offerings are aimed particularly at researchers in the fields of computer science and business information systems. Some of the focus areas are cloud computing, parallelization, and in-memory technologies.
This technical report presents the results of the research projects of 2015. Selected projects presented their results on April 15th and November 4th, 2015 at the Future SOC Lab Day events.
Every year, the Hasso Plattner Institute (HPI) invites guests from industry and academia to a collaborative scientific workshop on the topic “Operating the Cloud”. Our goal is to provide a forum for the exchange of knowledge and experience between industry and academia. Hence, HPI’s Future SOC Lab is an ideal environment to host this event, which is also supported by BITKOM.
On the occasion of this workshop we called for submissions of research papers and practitioners’ reports. “Operating the Cloud” aims to be a platform for productive discussions of innovative ideas, visions, and upcoming technologies in the field of cloud operation and administration.
In these workshop proceedings, the results of the second HPI cloud symposium "Operating the Cloud" 2014 are published. We thank the authors for their exciting presentations and insights into their current work and research. Moreover, we look forward to more interesting submissions for the upcoming symposium in 2015.
Timing and magnitude of surface uplift are key to understanding the impact of crustal deformation and topographic growth on atmospheric circulation, environmental conditions, and surface processes. Uplift of the East African Plateau is linked to mantle processes, but paleoaltimetry data are too scarce to constrain plateau evolution and subsequent vertical motions associated with rifting. Here, we assess the paleotopographic implications of a beaked whale fossil (Ziphiidae) from the Turkana region of Kenya found 740 km inland from the present-day coastline of the Indian Ocean at an elevation of 620 m. The specimen is approximately 17 My old and represents the oldest derived beaked whale known, consistent with molecular estimates of the emergence of modern strap-toothed whales (Mesoplodon). The whale traveled from the Indian Ocean inland along an eastward-directed drainage system controlled by the Cretaceous Anza Graben and was stranded slightly above sea level. Surface uplift from near sea level coincides with paleoclimatic change from a humid environment to highly variable and much drier conditions, which altered biotic communities and drove evolution in east Africa, including that of primates.