Expanding modeling notations
(2021)
Creativity is a common aspect of business processes and thus needs a proper representation through process modeling notations. However, creative processes constitute highly flexible process elements, as new and unforeseeable outcomes are developed. This presents a challenge for modeling languages. Current methods for representing creativity-intensive work are largely unable to capture the creative specifics that are relevant to successfully running and managing these processes. We outline the concept of creativity-intensive processes and present an example from a game design process in order to derive critical process aspects relevant to its modeling. Six aspects are identified, first and foremost process flexibility, as well as temporal uncertainty, experience, types of creative problems, phases of the creative process, and individual criteria. By first analyzing which aspects of creative work existing modeling notations already cover, we then discuss which modeling extensions need to be developed to better represent creativity within business processes. We argue that a proper representation of creative work would not only improve the management of these processes, but could also enable process actors to run them more efficiently and adjust them to better fit creative needs.
Despite the phenomenal growth of Big Data Analytics in the last few years, little research has been done to explicate the relationship between Big Data Analytics Capability (BDAC) and the indirect strategic value derived from such digital capabilities. We attempt to address this gap by proposing a conceptual model of the BDAC-innovation relationship using dynamic capability theory. The work expands on BDAC business value research and extends the nominal research done on the BDAC-innovation link. We focus on BDAC's relationship with different innovation objects, namely product, business process, and business model innovation, impacting all value chain activities. The insights gained will stimulate academic and practitioner interest in explicating the strategic value generated from BDAC and serve as a framework for future research on the subject.
Products or designs that have already been used successfully, past projects, or our own experiences can form the basis for the development of new products. As reference products or existing knowledge, they are reused in the development process and across generations of products. Furthermore, since products are developed in cooperation, the development of new product generations is characterized by knowledge-intensive processes in which information and knowledge are exchanged between different kinds of knowledge carriers. Knowledge transfer here describes the identification of knowledge, its transmission from the knowledge carrier to the knowledge receiver, and its application by the knowledge receiver, which includes the embodied knowledge of physical products. Initial empirical findings on the quantitative effects regarding the speed of knowledge transfers have already been examined. However, the factors influencing the quality of knowledge transfer, which could increase the efficiency and effectiveness of knowledge transfer in product development, have not yet been examined empirically. Therefore, this paper prepares an experimental setting for the empirical investigation of the quality of knowledge transfers.
Increasingly fast development cycles and individualized products pose major challenges for today's smart production systems in times of Industry 4.0. The systems must be flexible and continuously adapt to changing conditions while still guaranteeing high throughputs and robustness against external disruptions. Deep reinforcement learning (RL) algorithms, which have already achieved impressive success with Google DeepMind's AlphaGo, are increasingly transferred to production systems to meet related requirements. Unlike supervised and unsupervised machine learning techniques, deep RL algorithms learn based on recently collected sensor and process data in direct interaction with the environment and are able to make decisions in real time. As such, deep RL algorithms seem promising given their potential to provide decision support in complex environments, such as production systems, and simultaneously adapt to changing circumstances. While different use cases for deep RL have emerged, a structured overview and integration of findings on their application are missing. To address this gap, this contribution provides a systematic literature review of existing deep RL applications in the field of production planning and control as well as production logistics. From a performance perspective, it became evident that deep RL can beat heuristics significantly in overall performance and provides superior solutions to various industrial use cases. Nevertheless, safety and reliability concerns must be overcome before widespread use of deep RL is possible, which presumes more intensive testing of deep RL in real-world applications beyond the already ongoing intensive simulations.
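The surveyed papers' architectures and hyperparameters are not given in the abstract; as a purely illustrative sketch of the reinforcement-learning principle it refers to, the following tabular Q-learning toy (not deep RL; the states, actions, rewards, and parameters are all invented) learns to advance a job through a short production line:

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning on a toy dispatching line: at each station the agent
    either waits (action 0) or advances the job (action 1); finishing the job
    pays +1, every other step costs 0.01. All numbers are illustrative."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]  # Q[state][action]
    for _ in range(episodes):
        s = 0
        for _ in range(50):  # cap episode length so "wait" loops terminate
            if s == n_states - 1:
                break
            # epsilon-greedy action selection
            a = rng.randrange(2) if rng.random() < eps else int(q[s][1] >= q[s][0])
            s2 = s + 1 if a == 1 else s
            r = 1.0 if s2 == n_states - 1 else -0.01
            # one-step temporal-difference update
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

Deep RL replaces the table `q` with a neural network, but the learning signal is the same temporal-difference update shown in the inner loop.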
Many prediction tasks can be done based on users’ trace data. In this paper, we explored convergent thinking as a personality-related attribute and its relation to features gathered in interactive and non-interactive tasks of an online course. This is an under-utilized attribute that could be used for adapting online courses according to the creativity level to enhance the motivation of learners. Therefore, we used the logfile data of a 60-minute Moodle course with N=128 learners, combined with the Remote Associates Test (RAT). We explored the trace data and found a weak correlation between interactive tasks and the RAT score, which was the highest considering the overall dataset. We trained a Random Forest Regressor to predict convergent thinking based on the trace data and analyzed the feature importance. The results show that the interactive tasks have the highest importance in the prediction, but the accuracy is very low. We discuss the potential for personalizing online courses and address further steps to improve the applicability.
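The study's pipeline uses a Random Forest Regressor with feature-importance analysis; as a library-free illustration of the importance-analysis step only, the sketch below implements permutation importance over an arbitrary predictor (the model, data, and parameters in the usage example are synthetic assumptions, not the study's):

```python
import random
import statistics

def permutation_importance(model, X, y, n_repeats=10, seed=0):
    """Permutation feature importance: the increase in mean squared error
    when a single feature column is shuffled, averaged over n_repeats.
    `model` is any callable mapping a feature row to a prediction."""
    rng = random.Random(seed)

    def mse(rows, targets):
        return statistics.fmean((model(r) - t) ** 2 for r, t in zip(rows, targets))

    base = mse(X, y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)  # break the column's link to the target
            Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            drops.append(mse(Xp, y) - base)
        importances.append(statistics.fmean(drops))
    return importances
```

A feature the model ignores yields an importance of zero, while informative features show a clear error increase, which mirrors the kind of ranking the abstract reports for interactive versus non-interactive task features.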
The devil in disguise
(2021)
Envy constitutes a serious issue on Social Networking Sites (SNSs), as this painful emotion can severely diminish individuals' well-being. With prior research mainly focusing on the affective consequences of envy in the SNS context, its behavioral consequences remain puzzling. While negative interactions among SNS users are an alarming issue, it remains unclear to what extent the harmful emotion of malicious envy contributes to these toxic dynamics. This study constitutes a first step in understanding malicious envy’s causal impact on negative interactions within the SNS sphere. Within an online experiment, we experimentally induce malicious envy and measure its immediate impact on users’ negative behavior towards other users. Our findings show that malicious envy seems to be an essential factor fueling negativity among SNS users and further illustrate that this effect is especially pronounced when users are provided with an objective factor to mask their envy and justify their norm-violating negative behavior.
The holocaust in the USSR
(2021)
This paper sketches the current status of international scholarship on the subject of the Holocaust in the USSR and its place in the wider military conflict of the Second World War. Research on this topic over the last 20 to 30 years has been truly international and the findings of this research cannot be sketched here without pointing to the contributions made by German, American, Russian, Israeli, British and Australian historians. Historians from these countries have made important contributions to our understanding of key questions relating to this subject. These questions address, among other things, pre-invasion orders issued to German units; the radicalisation of German policy, culminating in the root-and-branch extermination of Soviet Jewry; the network of ghettos set up on Soviet territory; the nature of the killing and the methods used to murder these victims; the total death toll of the Holocaust in the USSR; and the relationship between war and extermination, in which genocide can be regarded as an actual strategy of warfare pursued by the German Reich.
Halide perovskites
(2021)
The game itself?
(2020)
In this paper, we reassess the notion and current state of ludo-hermeneutics in game studies, and propose a more solid foundation for how to conduct hermeneutic game analysis. We argue that there can be no ludo-hermeneutics as such, and that every game interpretation rests on a particular game ontology, whether implicit or explicit. The quality of this ontology, then, determines a vital aspect of the quality of the analysis.
Public blockchain
(2020)
Blockchain has the potential to change business transactions to a major extent. Its underlying consensus algorithms are the core mechanism for achieving consistency in distributed infrastructures. Their application aims at transparency and accountability in societal transactions. As no existing review holistically covers consensus algorithms, we aim to (1) identify prevalent consensus algorithms for public blockchains, and (2) address the resource perspective with a sustainability consideration (covering the three spheres of sustainability). Our systematic literature review identified 33 different consensus algorithms for public blockchains. Our contribution is twofold: first, we provide a systematic summary of consensus algorithms for public blockchains derived from the scientific literature as well as real-world applications and systematize them according to their research focus; second, we assess the sustainability of consensus algorithms using a representative sample and thereby highlight the gaps in the literature regarding the holistic sustainability of consensus algorithms.
Developing a new paradigm
(2020)
Internet users commonly agree that it is important for them to protect their personal data. However, the same users readily disclose their data when requested by an online service. The dichotomy between privacy attitude and actual behaviour is commonly referred to as the “privacy paradox”. Over twenty years of research have not been able to provide one comprehensive explanation for the paradox and seem even further from providing actual means to overcome it. We argue that the privacy paradox is not just an instantiation of the attitude-behaviour gap. Instead, we introduce a new paradigm explaining the paradox as the result of attitude-intention and intention-behaviour gaps. Historically, motivational goal-setting psychologists addressed the issue of intention-behaviour gaps in terms of the Rubicon Model of Action Phases and argued that commitment and volitional strength are essential mechanisms that fuel intentions and translate them into action. Thus, in this study we address the privacy paradox from a motivational-psychological perspective by developing two interventions on Facebook and assessing whether the 287 participants of our online experiment actually change their privacy behaviour. The results demonstrate the presence of an intention-behaviour gap and the efficacy of our interventions in reducing the privacy paradox.
As Industry 4.0 infrastructures are seen as highly evolutionary environments with volatile, time-dependent workloads for analytical tasks, the optimal dimensioning of IT hardware in particular is a challenge for decision makers, because the digital processing of these tasks can be decoupled from its physical place of origin. Flexible architecture models that allocate tasks efficiently with regard to multi-facet aspects across a predefined set of local systems and external cloud services have been proven in small example scenarios. This paper provides a benchmark of existing task realization strategies, composed of (1) task distribution and (2) task prioritization, in a real-world scenario simulation. It identifies heuristics as the superior strategies.
How games spoil creativity
(2020)
The demand for a creative workforce is ever-growing, and effective measures to improve individual creativity are being sought. This study analyzes the possibility of using games as a prime for a creative mindset. Two short entertainment games, plus a no-game comparison condition, were set up in three versions of an online study, along with two creativity tasks and scales to assess the individual creative mindset (fixed vs. growth, creative self-efficacy, and affect). Results indicate priming effects of the games, but in the opposite of the intended direction: gaming diminished creative test performance. Those playing the games reported more ideas in the open-ended creative problem task, but those answers were of lower quality, and they solved fewer closed-problem items compared to those not playing. An impact of further mindset differences could be ruled out.
From learners to educators
(2020)
The rapid growth of technology and its evolving potential to support the transformation of teaching and learning in post-secondary institutions is a major challenge to the basic understanding of both the university and the communities it serves. In higher education, the standard forms of learning and teaching are increasingly being challenged and a more comprehensive process of differentiation is taking place. Student-centered teaching methods are becoming increasingly important in course design, and the role of the lecturer is changing from knowledge mediator to moderator and learning companion. However, this is accelerating the need for strategically planned faculty support and a reassessment of the role of teaching and learning. Even though the benefits of experience-based learning approaches for the development of life skills are well known, most knowledge transfer in higher education is still realized through lectures. Teachers aim to design the curriculum and new assignments, and to share insights into evolving pedagogy. Student engagement could be the most important factor in the learning success of university students, regardless of the university program or teaching format. Against this background, this article presents the development, application, and initial findings of an innovative learning concept. In this concept, students engage with a scientific topic, but instead of a presentation and a written elaboration, their examination consists of developing an online course, in terms of content, didactics, and concept, and implementing it in a state-of-the-art learning environment. The online courses include both self-created teaching material and interactive tasks. After a review process, the courses are made available to other students as learning material and are thus incorporated into the curriculum.
The envy spiral
(2020)
On Social Networking Sites (SNS), users disclose mostly positive and often self-enhancing information. Scholars refer to this phenomenon as the positivity bias in SNS communication (PBSC). However, while theoretical explanations for this phenomenon have been proposed, an empirical proof of these theorized mechanisms is still missing. The project presented in this Research-in-Progress paper aims at explaining the PBSC with the mechanism specified in the self-enhancement envy spiral. Specifically, we hypothesize that feelings of envy drive people to post positive and self-enhancing content on SNS. To test this hypothesis, we developed an experimental design allowing us to examine the causal effect of envy on the positivity of users’ subsequently posted content. In a preliminary study, we tested our manipulation of envy and showed its effectiveness in inducing different levels of envy between our groups. Our project will help to broaden the understanding of the complex dynamics of SNS and the potentially adverse driving forces underlying them.
Does a smile open all doors?
(2020)
Online photographs govern an individual’s choices across a variety of contexts. In sharing arrangements, facial appearance has been shown to affect the desire to collaborate, interest to explore a listing, and even willingness to pay for a stay. Because of the ubiquity of online images and their influence on social attitudes, it seems crucial to be able to control these aspects. The present study examines the effect of different photographic self-disclosures on the provider’s perceptions and willingness to accept a potential co-sharer. The findings from our experiment in the accommodation-sharing context suggest social attraction mediates the effect of photographic self-disclosures on willingness to host. Implications of the results for IS research and practitioners are discussed.
In times of digitalization, the demand for organizational change is rising, which calls for ways to cope with fundamental changes at the organizational as well as the individual level. As a basis, learning and forgetting mechanisms need to be understood in order to guide a change process efficiently and successfully. Our research aims to get a better understanding of individual differences and mechanisms in the change context by performing an experiment in which individuals learn and later re-learn a complex production process using a simulation setting. The individual’s performance, as well as retentivity and prior knowledge, is assessed. Our results show that higher retentivity goes along with better learning and forgetting performances. Prior knowledge did not reveal such a relation to the learning and forgetting performances. The influence of age and gender is discussed in detail.
The book offers a comprehensive overview of current research in Slavic linguistics from a theoretical and experimental perspective and across a variety of languages. The selected papers from the 11th European Conference on Formal Description of Slavic Languages (FDSL 11), which took place at the University of Potsdam in 2015, illustrate the advancement of Slavic linguistic studies and their outreach for the development of general linguistics. The guest paper by Noam Chomsky at the beginning of the book sends a clear signal in this direction and may be taken as an acknowledgement of the field.
Missing out on life
(2020)
Mobile devices have become an integral part of everyday life due to their portability. As literature shows, technology use is not only beneficial but also has dark sides, such as addiction. Parents face the need to balance perceived benefits and risks of children’s exposure to mobile technologies. However, no study has uncovered what kind of benefits and concerns parents consider when implementing technology-related rules. We built on qualitative responses of 300 parents of children aged two to thirteen to explore concerns about, and perceived benefits of, children’s smartphone and tablet usage, as well as the rules parents have developed regarding technology use. Findings point to concerns regarding children’s development, as well as benefits for both children and parents, and ultimately to new insights about mobile technology mediation. These results provide practical guidance for parents, physicians and mobile industry stakeholders, trying to ensure that children are acting responsibly with mobile technology.
Living in a world of plenty?
(2020)
Inequality in the distribution of economic wealth within populations has been rising steadily over the past century, having reached unprecedented highs in many Western societies. However, this development is not reflected in people’s perceptions of wealth inequality, as the public tends to underestimate it. Research suggests that inequality estimates are derived from personal reference groups, which, as we propose, are expanded by social network site (SNS) use. As content on SNSs frequently revolves around events of consumption, signaling enhanced overall population wealth, this study tests the hypothesis that SNS use distorts inequality perceptions downward, i.e., increases the perception of societal equality. Responses of 534 survey participants in the United States confirm that SNS use negatively predicts perceived inequality. The relationship is stronger the more SNS users perceive the content they encounter online as real, supporting the assumption that observing other people’s behavior online lowers estimates of nationwide wealth inequality. These findings provide novel insights on inequality misperceptions by suggesting individuals’ SNS use as a new predictor of perceived wealth inequality.
How messy is your news feed
(2020)
Social Networking Sites (SNSs) are pervasive in our daily lives. However, emerging reports suggest that people are increasingly dissatisfied with their experience of SNS News Feeds. Motivated by cognitive load theory, the paper postulates that the arrangement and presentation of information are important constituents of one’s Facebook News Feed experience. Integrating these factors into the novel concept of ‘perceived disorder’, this paper hypothesizes that the perception of disorder elicited by the Facebook News Feed plays an important role in causing discontinuance intentions. Drawing on the Stressor-Strain-Outcome Model, we suggest that perceived disorder leads to SNS discontinuance intention and is partially mediated by SNS fatigue. The paper uses the responses of 268 Facebook users to investigate these relationships and introduces perceived disorder as a novel stressor. Besides adding to the existing body of literature, these insights are of relevance to internet service providers, policy makers and SNS users.
Data sharing requires researchers to publish their (primary) data and any supporting research materials. With increased attention on reproducibility and more transparent research requiring sharing of data, the issues surrounding data sharing are moving beyond whether data sharing is beneficial, to what kind of research data should be shared and how. However, despite its benefits, data sharing still is not common practice in Information Systems (IS) research. The panel seeks to discuss the controversies related to data sharing in research, specifically focusing on the IS discipline. It remains unclear how the positive effects of data sharing that are often framed as extending beyond the individual researcher (e.g., openness for innovation) can be utilized while reducing the downsides often associated with negative consequences for the individual researcher (e.g., losing a competitive advantage). To foster data sharing practices in IS, the panel will address this dilemma by drawing on the panelists’ expertise.
Background:
Anti-TNFα monoclonal antibodies (mAbs) are a well-established treatment for patients with Crohn’s disease (CD). However, subtherapeutic concentrations of mAbs have been related to a loss of response during the first year of therapy1. Therefore, an appropriate dosing strategy is crucial to prevent the underexposure of mAbs for those patients. The aim of our study was to assess the impact of different dosing strategies (fixed dose or body size descriptor adapted) on drug exposure and the target concentration attainment for two different anti-TNFα mAbs: infliximab (IFX, body weight (BW)-based dosing) and certolizumab pegol (CZP, fixed dosing). For this purpose, a comprehensive pharmacokinetic (PK) simulation study was performed.
Methods:
A virtual population of 1000 clinically representative CD patients was generated based on the distribution of CD patient characteristics from an in-house clinical database (n = 116). Seven dosing regimens were investigated: fixed dose and per BW, lean BW (LBW), body surface area, height, body mass index and fat-free mass. The individual body size-adjusted doses were calculated from the generated patients’ body size descriptor values. Then, using published PK models for IFX and CZP in CD patients2,3, for each patient, 1000 concentration–time profiles were simulated to consider the typical profile of a specific patient as well as the range of possible individual profiles due to unexplained PK variability across patients. For each dosing strategy, the variability in maximum and minimum mAb concentrations (Cmax and Cmin, respectively), area under the concentration-time curve (AUC) and the per cent of patients reaching target concentration were assessed during maintenance therapy.
Results:
For IFX and CZP, Cmin showed the highest variability between patients (CV ≈110% and CV ≈80%, respectively) with a similar extent across all dosing strategies. For IFX, the per cent of patients reaching the target (Cmin = 5 µg/ml) was similar across all dosing strategies (~15%). For CZP, the per cent of patients reaching the target average concentration of 17 µg/ml ranged substantially (52–71%), being the highest for LBW-adjusted dosing.
Conclusion:
By using a PK simulation approach, the different dosing regimens of IFX and CZP revealed the highest variability for Cmin, the most commonly used PK parameter guiding treatment decisions, independent of the dosing regimen. Our results demonstrate similar target attainment with fixed dosing of IFX compared with the currently recommended BW-based dosing. For CZP, the current fixed dosing strategy leads to a comparable percentage of patients reaching the target as the best-performing body size-adjusted dosing (66% vs. 71%, respectively).
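The simulation logic described in this abstract can be sketched in miniature. The toy below uses a one-compartment IV model at steady state with invented PK parameters (not the published IFX/CZP models, and far simpler than the study's 1000-profiles-per-patient design) to compare the fraction of virtual patients whose trough concentration reaches a target under two dosing rules:

```python
import math
import random

def simulate_cmin(dose_fn, n=1000, tau=8 * 7 * 24, target=5.0, seed=1):
    """Fraction of virtual patients whose steady-state trough (Cmin) reaches
    `target` µg/ml under a 1-compartment IV model with dosing every `tau`
    hours. dose_fn maps body weight (kg) to dose (mg). All PK parameter
    values below are illustrative assumptions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        bw = max(40.0, rng.gauss(75, 15))  # body weight, kg
        # allometrically scaled clearance with lognormal between-patient variability
        cl = 0.0125 * (bw / 75) ** 0.75 * math.exp(rng.gauss(0, 0.3))  # L/h
        v = 5.0 * (bw / 75)                # volume of distribution, L
        e = math.exp(-(cl / v) * tau)      # fraction remaining after one interval
        cmin = (dose_fn(bw) / v) * e / (1 - e)  # superposition at steady state, mg/L
        hits += cmin >= target
    return hits / n
```

Comparing `simulate_cmin(lambda bw: 5.0 * bw)` (per-kg dosing) with `simulate_cmin(lambda bw: 375.0)` (fixed dosing) reproduces the abstract's kind of question: how much the dosing rule shifts target attainment once between-patient PK variability is simulated.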
The “output-orientation” is omnipresent in teacher education. In order to evaluate teachers' and students' performances, a wide range of different quantitative questionnaires exist worldwide. One important goal of teaching evaluation is to increase the quality of teaching and learning. The author argues that standard evaluations, which are typically made at the end of the semester, are problematic for two reasons. The first is that some of the questions are too general and don't offer concrete ideas as to what kind of actions can be taken to make the courses better. The second is that the evaluation is mostly made when the course is already over. Because of this criticism, Apelojg invented the Felix-App, which offers the possibility to give feedback in real time by asking for the emotions and needs that occur during different learning situations. The idea is very simple: positive emotions and satisfied needs are helpful for the learning process; negative emotions and unsatisfied needs have negative effects on it. First descriptive results show that “managing emotions” during classes can have positive effects on both motivation and emotions.
RAA2019
(2019)
These abstracts result from the 10th International Congress on the Application of Raman Spectroscopy in Art and Archaeology, held 3–7 September 2019 in Potsdam (Germany).
The RAA is an established biennial international conference series. Since their beginning in 2001, the RAA conferences have promoted Raman spectroscopy and played an important role in broadening the field of its applications in art history, history, archaeology, palaeontology, conservation and restoration, museology, degradation of cultural heritage, archaeometry, etc. Furthermore, the development of new instrumentation, especially for non-invasive measurements, receives great attention.
The Congress covers all topics of Raman spectroscopic applications in art and archaeology and focuses on the following themes:
• Material characterization and degradation processes
• Conservation issues affecting cultural heritage
• Raman spectroscopy of biological and organic materials
• Surface enhanced Raman spectroscopy
• Chemometrics in Raman spectroscopy
• Development of Raman techniques
• New Raman instrumentation and applications in cultural heritage objects investigations
• Raman spectroscopy in paleontology, paleoenvironment and archaeology
Modern web browsers are digital software platforms, as they allow third parties to extend functionality by providing extensions. Given the intense competition, differentiation through provided functionality is a key factor for browser platforms. As browsers progress, they constantly release new features. Browsers might thereby enter complementary markets if they add functionality formerly provided by third-party extensions, which is referred to as ‘platform coring’. Previous studies have missed the perspective of the involved parties. To address this gap, we conduct interviews with third-party and core developers in the security and privacy domain from Firefox and Chrome. In essence, the study provides three contributions: first, insights into stakeholder-specific issues concerning coring; second, measures to prevent coring; third, strategic guidance for developers and owners. Third-party developers reported experiencing, and core developers acknowledged, that coring occurs on browser platforms. While developers with extrinsic motivations assess coring negatively, developers with intrinsic motivations perceive it positively.
Accelerating knowledge
(2019)
As knowledge-intensive processes are often carried out in teams and demand knowledge transfers among various knowledge carriers, any optimization regarding the acceleration of knowledge transfers holds great economic potential. Exemplified with product development projects, knowledge transfers focus on knowledge acquired in former situations and product generations. An adjustment of the manifestation of a knowledge transfer in its concrete situation, here called an intervention, can therefore be directly connected to the adequate speed optimization of knowledge-intensive process steps. This contribution presents the specification of seven concrete interventions following an intervention template. Further, it describes the design and results of a workshop with experts as a descriptive study. The workshop was used to assess the practical relevance of the interventions designed as well as to identify practical success factors and barriers to their implementation.
Track and Treat
(2018)
E-mail tracking mechanisms gather information on individual recipients’ reading behavior. Previous studies show that e-mail newsletters commonly include tracking elements. However, prior work does not examine the degree to which e-mail senders actually employ gathered user information. The paper closes this research gap by means of an experimental study to clarify the use of tracking-based information. To that end, twelve mail accounts are created, each of which subscribes to a pre-defined set of newsletters from companies based in Germany, the UK, and the USA. Systematically varying e-mail reading patterns across accounts, each account simulates a different type of user with individual reading behavior. Assuming senders to track e-mail reading habits, we expect changes in mailer behavior. The analysis confirms the prominence of tracking in that over 92% of the newsletter e-mails contain tracking images. For 13 out of 44 senders, an adjustment of communication policy in response to user reading behavior is observed. Observed effects include sending newsletters at different times, adapting advertised products to match the users’ IT environment, increased or decreased mailing frequency, and mobile-specific adjustments. Regarding legal issues, not all companies that adapt their mail-sending behavior state the usage of such mechanisms in their privacy policy.
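The tracking images the study counts can be found with a simple heuristic scan of an e-mail's HTML body. The rules below (tiny declared dimensions, or a remote image URL carrying query parameters that could encode a recipient token) are illustrative assumptions, not the authors' actual detection method:

```python
from html.parser import HTMLParser

class TrackingPixelFinder(HTMLParser):
    """Flags <img> tags that look like tracking pixels: 0x0/1x1 dimensions,
    or a remote src carrying query parameters (a common, heuristic signal)."""
    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        tiny = a.get("width") in ("0", "1") or a.get("height") in ("0", "1")
        src = a.get("src", "")
        tokenized = src.startswith("http") and "?" in src
        if tiny or tokenized:
            self.suspects.append(src)

def find_tracking_pixels(html_body):
    """Return the src URLs of suspected tracking images in an HTML e-mail."""
    parser = TrackingPixelFinder()
    parser.feed(html_body)
    return parser.suspects
```

Real newsletters obfuscate trackers in many more ways (CSS-hidden images, per-recipient hostnames), so a production scanner would need a larger rule set; this sketch only shows the basic idea.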
Web Tracking
(2018)
Web tracking is becoming ubiquitous in online business and leads to increased privacy concerns among users. This paper provides an overview of the current state of the art of web-tracking research, aiming to reveal the relevance and methodologies of this research area and to create a foundation for future work. In particular, the study addresses the following research questions: What methods are followed? What results have been achieved so far? What are potential future research areas? To this end, a structured literature review based upon an established methodological framework is conducted. The identified articles are investigated with respect to the applied research methodologies and the aspects of web tracking they emphasize.
In honour of Seymour Papert
(2018)
Forth is nice and flexible, but to a philosopher and teacher educator, Logo is the more impressive language. Both are relatives of Lisp, but Forth uses reverse Polish notation whereas Logo uses infix notation. Logo allows top-down programming, Forth only bottom-up. Logo enables recursive programming, Forth does not. Logo includes turtle graphics, Forth has nothing comparable. So what can you do if you cannot get Logo and have no information about its inner architecture? This is a case of "empirical modelling": how can you model the observable behaviour of Logo in terms of Forth? The main steps to solve this problem are shown in the first part of the paper.
The second part of the paper discusses the problem of modelling and shows that the modelling of making and the modelling of recognition have the same mathematical structure. So "empirical modelling" can also serve for modelling desired behaviour of technical systems.
The last part of the paper shows that the heuristic potential of the problem to be modelled is more important than the programming language. The Picasso construal shows, in a very simple way, how children of different ages can model emotional relations in human behaviour with a simple Logo system.
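The Logo features the abstract singles out, recursion and turtle graphics, can themselves be "empirically modelled" by simulating the turtle's observable state. The following sketch is my own illustration (the function names and the square example are not from the paper): Logo's FORWARD and RIGHT become pure state transitions in Python, and a recursive procedure walks the turtle around a square.

```python
import math

# Minimal turtle-state model: position (x, y) plus heading in degrees.
# FORWARD and RIGHT are modelled as pure state transitions, so the
# turtle's observable behaviour can be inspected without any graphics.
def forward(state, dist):
    x, y, h = state
    rad = math.radians(h)
    return (x + dist * math.cos(rad), y + dist * math.sin(rad), h)

def right(state, angle):
    x, y, h = state
    return (x, y, (h - angle) % 360)

def polygon(state, side, n):
    """Recursive Logo-style procedure: REPEAT n [FORWARD side RIGHT 90]."""
    if n == 0:
        return [state]
    state = right(forward(state, side), 90)
    return [state] + polygon(state, side, n - 1)

path = polygon((0.0, 0.0, 0.0), 10.0, 4)
x, y, heading = path[-1]
# After four FORWARD/RIGHT-90 steps the turtle is back at the origin,
# facing its original heading (up to floating-point rounding).
print(abs(x) < 1e-9, abs(y) < 1e-9, heading == 0)
```

The same state-transition view is what lets Logo's behaviour be re-created in a language without built-in turtle graphics, which is the spirit of the Forth modelling exercise described in the first part of the paper.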
e-ASTROGAM is a concept for a breakthrough observatory space mission carrying a gamma-ray telescope dedicated to the study of the non-thermal Universe in the photon energy range from 0.15 MeV to 3 GeV. The lower energy limit can be pushed down to energies as low as 30 keV for gamma-ray burst detection with the calorimeter. The mission is based on advanced, space-proven detector technology with unprecedented sensitivity, angular and energy resolution, combined with remarkable polarimetric capability. Thanks to its performance in the MeV-GeV domain, substantially improving on its predecessors, e-ASTROGAM will open a new window on the non-thermal Universe, making pioneering observations of the most powerful Galactic and extragalactic sources and elucidating the nature of their relativistic outflows and their effects on the surroundings. With a line sensitivity in the MeV energy range one to two orders of magnitude better than previous and current generation instruments, e-ASTROGAM will determine the origin of key isotopes fundamental for the understanding of supernova explosions and the chemical evolution of our Galaxy. The mission will be a major player in the multiwavelength, multimessenger time-domain astronomy of the 2030s, and will provide unique data of significant interest to a broad astronomical community, complementary to powerful observatories such as LISA, LIGO, Virgo, KAGRA, the Einstein Telescope and the Cosmic Explorer, IceCube-Gen2 and KM3NeT, SKA, ALMA, JWST, E-ELT, LSST, Athena, and the Cherenkov Telescope Array.
Natural hazards such as floods, earthquakes, landslides, and multi-hazard events heavily affect human societies and call for better management strategies. Due to the severity of such events, it is of utmost importance to understand whether and how they change in response to evolving hydro-climatological, geo-physical and socio-economic conditions. These conditions jointly determine the magnitude, frequency, and impact of disasters, and are changing in response to climate change and human behavior. Therefore, methods are needed for hazard and risk quantification that account for the transient nature of hazards and risks in response to changing natural and anthropogenically altered systems. The purpose of this conference is to bring together researchers from the natural sciences (e.g. hydrology, meteorology, geomorphology, hydraulic engineering, environmental science, seismology, geography), risk research, nonlinear systems dynamics, and applied mathematics to discuss new insights and developments in data science, changing systems, multi-hazard events and the linkage between hazards and vulnerabilities under unstable environmental conditions. Knowledge transfer, communication and networking will be key issues of the conference. The conference is organized as invited talks given by outstanding experts, oral presentations, poster sessions and discussions.
The paper addresses the rapid growth of embedded systems and their role within Internet-like structures (the Internet of Things) as devices that provide computing power and are more or less suited to analytical tasks. Using the example of a cyber-physical manufacturing system, a common objective function is developed with the intention of measuring efficient task processing within analytical infrastructures. A first validation is carried out on the basis of an expert panel.
Integral Fourier operators
(2017)
This volume of contributions, based on lectures delivered at a school on Fourier Integral Operators held in Ouagadougou, Burkina Faso, 14–26 September 2015, provides an introduction to Fourier Integral Operators (FIO) for a readership of Master's and PhD students as well as any interested layperson. Considering the wide spectrum of their applications and the richness of the mathematical tools they involve, FIOs lie at the crossroads of many fields. This volume offers the necessary background, whether analytic or geometric, to become acquainted with FIOs, complemented by more advanced material presenting various aspects of active research in the area.
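For orientation, the standard definition of the objects the volume treats (general mathematical background, not a formula specific to this book): a Fourier integral operator $A$ acts on a function $u$ by

```latex
A u(x) = \iint e^{i\varphi(x,y,\theta)}\, a(x,y,\theta)\, u(y)\, \mathrm{d}y\, \mathrm{d}\theta ,
```

where $\varphi$ is a non-degenerate phase function, homogeneous of degree $1$ in the frequency variable $\theta$, and $a$ is an amplitude belonging to a suitable symbol class. Pseudodifferential operators arise as the special case $\varphi(x,y,\theta) = (x-y)\cdot\theta$.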