Filter
Publication year
- 2018 (195)
Document type
- Other (195)
Part of the bibliography
- yes (195)
Keywords
- E-Learning (3)
- Security Metrics (3)
- Security Risk Assessment (3)
- 3D Point Clouds (2)
- Cloud-Security (2)
- Internet of Things (2)
- Kanban (2)
- Lecture Video Archive (2)
- MQTT (2)
- Scrum (2)
- Secure Configuration (2)
- capstone course (2)
- embodied cognition (2)
- thermally stimulated discharge (2)
- 3D printing (1)
- 7924 (1)
- 7934 (1)
- 7959 (1)
- Aerosol (1)
- Agile methods (1)
- Algorithms (1)
- Android (1)
- Answer set programming (1)
- Application Container Security (1)
- Artikelindex (1)
- Asian American studies (1)
- Assays (1)
- Atlantic studies (1)
- Augmented reality (1)
- Automatic domain term extraction (1)
- BIM (1)
- BPMN (1)
- Basic English (1)
- Biological Assay (1)
- Black Pacific (1)
- Blockchain (1)
- Boolean Networks (1)
- Bourdieu (1)
- Business process models (1)
- Business process simulation (1)
- C. K. ogden (1)
- CKD (1)
- CPPS (1)
- CPS (1)
- CT (1)
- CWSI (1)
- Climate Change (1)
- Clock Tree Implementation (1)
- Cloud Audit (1)
- Cloud Service Provider (1)
- Collaborative learning (1)
- Curie transition (1)
- DEM (1)
- DMN (1)
- Danish (1)
- Data compression (1)
- Data mining (1)
- Data mining Machine learning (1)
- Data partitioning (1)
- Data profiling (1)
- Decision models (1)
- Discovery System (1)
- Distance Learning (1)
- Diverse solution enumeration (1)
- Dopamine (1)
- Dutch (1)
- Dänisch (1)
- E-Learning exam preparation (1)
- E-Lecture (1)
- Economic sociology (1)
- Edge Computing (1)
- Educational Data Mining (1)
- Embedded Programming (1)
- Emotion Mining (1)
- Energy efficiency (1)
- English dialects (1)
- Entropy (1)
- Epigenetic Biomarkers (1)
- Expert knowledge (1)
- Extensibility (1)
- Fabrication (1)
- Flash (1)
- Function of the spatial metaphor (1)
- GMDH (1)
- GTEx (1)
- Generalized additive mixed-effects modeling (1)
- Generalized knowledge constructin axiom (1)
- Geospatial intelligence (1)
- HLS (1)
- HTML5 (1)
- Haptics (1)
- Hebraica (1)
- Hierarchical Design (1)
- High Mountain Asia (1)
- IT project (1)
- Icelandic (1)
- Indoor Models (1)
- Industry 4.0 (1)
- Inertial measurement units (1)
- Information flow control (1)
- Information system (1)
- Inhibitory Control (1)
- Intelligence (1)
- Intent analysis (1)
- Interacting processes (1)
- International language (1)
- Internet of things (1)
- Interoperability (1)
- Isländisch (1)
- Isotype (1)
- Judaica (1)
- Katalog (1)
- Komplementierer (1)
- LDPE nanocomposites (1)
- Landsat (1)
- Learning (1)
- Lecture Recording (1)
- M2M (1)
- MOOC (1)
- MOOC Remote Lab (1)
- Machine (1)
- Machine Learning (1)
- Marine mammals (1)
- Maternal relationships (1)
- Meta-model (1)
- Metamaterials (1)
- Microservices Security (1)
- Migration (1)
- Minimum spanning tree (1)
- Mitochondrial DNA (1)
- Model checking (1)
- Moving Target Defense (1)
- N2/P3 (1)
- NAB (1)
- NETCONF (1)
- Natural Language Processing (1)
- Near infrared (1)
- Neural Networks (1)
- Niederländisch (1)
- Norwegian (1)
- Norwegisch (1)
- OH suppression (1)
- Ontology (1)
- Operator (1)
- Otto neurath (1)
- P(VDF-TrFE-CFE) terpolymer (1)
- PTH (1)
- Pacific studies (1)
- Passive Microwave (1)
- Peer assessment (1)
- Philosophy of language (1)
- Physical Implementation (1)
- Polygenic Risk Score (1)
- Population genetics (1)
- Predictive markers (1)
- Process modeling (1)
- Process-related data (1)
- Psychological Emotions (1)
- RNAseq (1)
- Raman lidar (1)
- Reward Anticipation (1)
- Risk factors (1)
- SMI (1)
- SPB (1)
- Satztyp (1)
- Schwedisch (1)
- Security analytics (1)
- Semantic Interoperability (1)
- Semantic Web (1)
- Semiotics (1)
- Service-Oriented (1)
- Servicification (1)
- Simulation process building (1)
- Smart Home Education (1)
- Snow (1)
- Social Media Analysis (1)
- Space and metaphor (1)
- Space and spatiality in the internet terminology (1)
- Spatial data handling systems (1)
- Static analysis (1)
- Structural health monitoring (1)
- Swedish (1)
- Syntax (1)
- TCGA (1)
- TIN (1)
- Team based assignment (1)
- Teamwork (1)
- Threat Models (1)
- Time series data (1)
- Time-resolved crystallography (1)
- Topic modeling (1)
- Transpacific studies (1)
- Tree maintenance (1)
- Triarchic Model of Psychopathy (1)
- UV (1)
- Unified logging system (1)
- Use cases Morphologic box (1)
- Video annotations (1)
- Vienna circle (1)
- Virtual reality (1)
- Vulnerability analysis (1)
- X-ray refraction (1)
- YANG (1)
- abstract concepts (1)
- accelerator architectures (1)
- accessibility (1)
- action observation (1)
- activities (1)
- additive manufacturing (1)
- apple (1)
- application (1)
- archipelagic studies (1)
- astronomy (1)
- astrophotonics (1)
- athletic performance (1)
- authentication (1)
- automated driving (1)
- avoid magnetometers (1)
- back motion assessment (1)
- balloon telescopes (1)
- basal body (1)
- behavior psychotherapy (1)
- behavioral (1)
- behavioral reasoning (1)
- blind (1)
- bridges (1)
- centriole (1)
- centrosome (1)
- chemical modification (1)
- chimera state (1)
- cilium (1)
- clause type (1)
- cloud monitoring (1)
- cognitive development (1)
- community effect on height (1)
- competitive growth (1)
- complementary actions (1)
- complementiser (1)
- computer-mediated therapy (1)
- conceptualization (1)
- continuous (1)
- cooperation and competition (1)
- creep (1)
- cross-over effect (1)
- crystallinity (1)
- crystallography (1)
- damage evolution (1)
- data integration (1)
- data transfer (1)
- democracy (1)
- democratic quality (1)
- detectors (1)
- development artifacts (1)
- discourse (1)
- distributional learning (1)
- domination (1)
- doping (1)
- drainage networks (1)
- drift correction (1)
- economy (1)
- electrets (1)
- electroacoustic probing (1)
- emotion measurement (1)
- enzymology (1)
- exercise (1)
- fabrication (1)
- far infrared (1)
- ferroelectrets (1)
- ferroelectric and paraelectric phases (1)
- fibre Bragg gratings (1)
- field (1)
- fitness (1)
- flow accumulation (1)
- force-feedback (1)
- gait (1)
- gaming (1)
- gene selection (1)
- hardware (1)
- human motion analysis (1)
- human-computer interaction (1)
- humanoid (1)
- imitation (1)
- indigenous studies (1)
- injury prevention (1)
- intentionality (1)
- international academic mobility (1)
- internationalization (1)
- inversion (1)
- joint action (1)
- joint angle estimation (1)
- joint lab (1)
- kinematics (1)
- labeling (1)
- large scale mechanism (1)
- left periphery (1)
- lexical tone (1)
- lidar (1)
- linke Peripherie (1)
- locomotion (1)
- low back pain (1)
- low-density polyethylene (1)
- low-duty-cycling (1)
- management zone (1)
- measurement (1)
- medical documentation (1)
- mental number line (1)
- metal matrix composite (1)
- method development (1)
- methodology (1)
- microscopy (1)
- microstructures (1)
- microtubules (1)
- mismatch negativity (1)
- multimodal wireless sensor network (1)
- negative numbers (1)
- non-photorealistic rendering (1)
- nonlinear dynamics (1)
- nonlocal coupling (1)
- note-taking (1)
- nucleus-associated body (1)
- observatory (1)
- oceanic discourse (1)
- oneM2M (1)
- oneM2M Ontology (1)
- operator (1)
- oxidative stress (1)
- partial synchronization (1)
- particle microphysics (1)
- phase lag (1)
- phase oscillator (1)
- photometer (1)
- physical fitness (1)
- plyometric training (1)
- point clouds (1)
- point-based rendering (1)
- polypropylene (1)
- porosity (1)
- pre-attentive discrimination (1)
- precision agriculture (1)
- programmable matter (1)
- quenching (1)
- radiography (1)
- real-time rendering (1)
- real-walking (1)
- recreational sport (1)
- recrystallization (1)
- regularization (1)
- relaxor-ferroelectric polymer (1)
- reliability (1)
- restoration (1)
- safety (1)
- security (1)
- security analytics (1)
- smartphone (1)
- social cognition (1)
- social robots (1)
- software engineering (1)
- soil moisture (1)
- space-charge and polarization profiles (1)
- spectroscopy (1)
- speech acoustics (1)
- speech perception (1)
- speech production (1)
- speech variability (1)
- spindle pole body (1)
- state (1)
- steel and concrete structures (1)
- stochastic filtering (1)
- strain gauges (1)
- strain sensors (1)
- strategic growth adjustments (1)
- strength training (1)
- study abroad (1)
- study-related student travel (1)
- stunting (1)
- style transfer (1)
- surface charge stability (1)
- synchronization (1)
- synchrotron X-ray refraction radiography (1)
- syntax (1)
- teacher education (1)
- teaching students (1)
- time series (1)
- tissue-awareness (1)
- tomography (1)
- transoceanic studies (1)
- triangle method (1)
- turing test (1)
- uncertainty quantification (1)
- undernutrition (1)
- user experience (1)
- validation against optical motion capture (1)
- variable geometry truss (1)
- verbal reports (1)
- verification (1)
- visualization (1)
- visually impaired (1)
- voice onset time (1)
- vowels (1)
- wake-up radio (1)
- web-based rendering (1)
- young adults (1)
Institute
- Hasso-Plattner-Institut für Digital Engineering GmbH (52)
- Institut für Physik und Astronomie (18)
- Institut für Geowissenschaften (17)
- Institut für Biochemie und Biologie (16)
- Department Psychologie (13)
- Department Sport- und Gesundheitswissenschaften (13)
- Institut für Informatik und Computational Science (13)
- Institut für Ernährungswissenschaft (12)
- Institut für Chemie (8)
- Department Linguistik (6)
- Institut für Mathematik (3)
- Juristische Fakultät (3)
- Sozialwissenschaften (3)
- Department Erziehungswissenschaft (2)
- Fachgruppe Politik- & Verwaltungswissenschaft (2)
- Hasso-Plattner-Institut für Digital Engineering gGmbH (2)
- Historisches Institut (2)
- Institut für Germanistik (2)
- Institut für Romanistik (2)
- Universitätsbibliothek (2)
- Institut für Anglistik und Amerikanistik (1)
- Institut für Umweltwissenschaften und Geographie (1)
- Lehreinheit für Wirtschafts-Arbeit-Technik (1)
- Wirtschaftswissenschaften (1)
- Öffentliches Recht (1)
A balance to death
(2018)
Leaf senescence plays a crucial role in nutrient recovery in late-stage plant development and requires vast transcriptional reprogramming by transcription factors such as ORESARA1 (ORE1). A proteolytic mechanism is now found to control ORE1 degradation, and thus senescence, during nitrogen starvation.
We analyze the problem of response suggestion in a closed domain in the real-world scenario of a digital library. We present a text-processing pipeline to generate question-answer pairs from chat transcripts. On this limited amount of training data, we compare retrieval-based, conditioned-generation, and dedicated representation learning approaches for response suggestion. Our results show that retrieval-based methods, which strive to find similar, known contexts, are preferable to parametric approaches from the conditioned-generation family when training data is limited. We do, however, identify a specific representation learning approach that is competitive with the retrieval-based approaches despite the training data limitation.
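The retrieval-based family favoured in the abstract can be illustrated with a minimal sketch: suggest the stored answer whose question is most similar to the incoming query. The bag-of-words cosine similarity, the `suggest_response` helper, and the sample pairs below are illustrative assumptions, not the authors' pipeline.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words vector as a Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse Counter vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest_response(query, qa_pairs):
    """Return the answer whose stored question is most similar to the query."""
    qv = vectorize(query)
    best = max(qa_pairs, key=lambda qa: cosine(qv, vectorize(qa[0])))
    return best[1]

# Hypothetical question-answer pairs mined from chat transcripts.
pairs = [
    ("how do i renew a borrowed book", "Use the renewal form in your library account."),
    ("what are the opening hours", "The library is open from 9am to 8pm on weekdays."),
]
print(suggest_response("when is the library open", pairs))
```

A real pipeline would use stronger text representations, but the nearest-neighbour lookup over known contexts is the core of the retrieval approach.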
Microservice Architectures (MSA) structure applications as a collection of loosely coupled services that implement business capabilities. The key advantages of MSA include inherent support for continuous deployment of large, complex applications, agility, and enhanced productivity. However, studies indicate that most MSA are homogeneous and introduce shared vulnerabilities, making them susceptible to multi-step attacks and offering economies-of-scale incentives to attackers. In this paper, we address the issue of shared vulnerabilities in microservices with a novel solution based on the concept of Moving Target Defense (MTD). Our mechanism works by performing risk analysis against microservices to detect and prioritize vulnerabilities. Thereafter, security-risk-oriented software diversification is employed, guided by a defined diversification index. The diversification is performed at runtime, leveraging both model-based and template-based automatic code generation techniques to automatically transform the programming languages and container images of the microservices. Consequently, the attack surfaces of the microservices are altered, introducing uncertainty for attackers while reducing the attackability of the microservices. Our experiments demonstrate the efficiency of our solution, with an average attack surface randomization success rate of over 70%.
Low back pain (LBP) is a leading cause of activity limitation. Objective assessment of the spinal motion plays a key role in diagnosis and treatment of LBP. We propose a method that facilitates clinical assessment of lower back motions by means of a wireless inertial sensor network. The sensor units are attached to the right and left side of the lumbar region, the pelvis and the thighs, respectively. Since magnetometers are known to be unreliable in indoor environments, we use only 3D accelerometer and 3D gyroscope readings. Compensation of integration drift in the horizontal plane is achieved by estimating the gyroscope biases from automatically detected initial rest phases. For the estimation of sensor orientations, both a smoothing algorithm and a filtering algorithm are presented. From these orientations, we determine three-dimensional joint angles between the thighs and the pelvis and between the pelvis and the lumbar region. We compare the orientations and joint angles to measurements of an optical motion tracking system that tracks each skin-mounted sensor by means of reflective markers. Eight subjects perform a neutral initial pose, then flexion/extension, lateral flexion, and rotation of the trunk. The root mean square deviation between inertial and optical angles is about one degree for angles in the frontal and sagittal plane and about two degrees for angles in the transverse plane (both values averaged over all trials). We choose five features that characterize the initial pose and the three motions. Interindividual differences of all features are found to be clearly larger than the observed measurement deviations. These results indicate that the proposed inertial sensor-based method is a promising tool for lower back motion assessment.
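The drift-compensation idea described above, estimating gyroscope biases from automatically detected rest phases and subtracting them before integration, can be sketched in one dimension. The functions and readings below are hypothetical simplifications of the full 3D orientation estimation, not the paper's algorithm.

```python
from statistics import mean

def estimate_bias(rest_samples):
    """Average gyroscope reading during a rest phase, where the true rate is zero."""
    return mean(rest_samples)

def integrate_angle(gyro, dt, bias=0.0):
    """Integrate bias-corrected angular rate (rad/s) into an angle (rad)."""
    angle = 0.0
    for omega in gyro:
        angle += (omega - bias) * dt
    return angle

# Hypothetical readings: a constant bias of 0.02 rad/s on top of zero true motion.
rest = [0.021, 0.019, 0.020, 0.020]
bias = estimate_bias(rest)

# One second of rotation at 0.5 rad/s, sampled at 100 Hz, with the same bias.
motion = [0.5 + bias] * 100
print(round(integrate_angle(motion, 0.01, bias), 3))
```

Without the bias estimate, the integrated angle would drift by the bias times the elapsed time, which is exactly the horizontal-plane drift the rest-phase calibration removes.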
3D point cloud technology facilitates the automated and highly detailed digital acquisition of real-world environments such as assets, sites, cities, and countries; the acquired 3D point clouds represent an essential category of geodata used in a variety of geoinformation applications and systems. In this paper, we present a web-based system for the interactive and collaborative exploration and inspection of arbitrarily large 3D point clouds. Our approach is based on standard WebGL on the client side and is able to render 3D point clouds with billions of points. It uses spatial data structures and level-of-detail representations to manage the 3D point cloud data and to deploy out-of-core and web-based rendering concepts. By providing functionality for both thin-client and thick-client applications, the system scales to client devices that are vastly different in computing capabilities. Different 3D point-based rendering techniques and post-processing effects are provided to enable task-specific and data-specific filtering and highlighting, e.g., based on per-point surface categories or temporal information. A set of interaction techniques allows users to collaboratively work with the data, e.g., by measuring distances and areas, by annotating, or by selecting and extracting data subsets. Additional value is provided by the system's ability to display additional, context-providing geodata alongside 3D point clouds and to integrate task-specific processing and analysis operations. We have evaluated the presented techniques and the prototype system with different data sets from aerial, mobile, and terrestrial acquisition campaigns with up to 120 billion points to show their practicality and feasibility.
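Level-of-detail management for huge point clouds generally rests on spatial subsampling: coarser levels keep fewer representative points. The grid-based sketch below illustrates the idea only; the paper describes its actual machinery merely as spatial data structures with level-of-detail representations, so the cell-grid scheme here is an assumption.

```python
def subsample(points, cell):
    """Keep one representative point per cubic grid cell of edge length `cell`."""
    seen = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        seen.setdefault(key, (x, y, z))  # first point in the cell wins
    return list(seen.values())

# Hypothetical points: two tight clusters one unit apart.
pts = [(0.1, 0.1, 0.0), (0.2, 0.2, 0.0), (1.5, 0.1, 0.0), (1.6, 0.2, 0.0)]
print(len(subsample(pts, 1.0)))   # coarse level: one point per 1-unit cell
print(len(subsample(pts, 0.05)))  # fine level: every point survives
```

An out-of-core renderer would precompute such levels per spatial node and stream only the cells visible at the required detail.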
The rapid digitalization of the Facility Management (FM) sector has increased the demand for mobile, interactive analytics approaches concerning the operational state of a building. These approaches provide the key to increasing stakeholder engagement associated with Operation and Maintenance (O&M) procedures of living and working areas, buildings, and other built environment spaces. We present a generic and fast approach to process and analyze given 3D point clouds of typical indoor office spaces to create corresponding up-to-date approximations of classified segments and object-based 3D models that can be used to analyze, record and highlight changes of spatial configurations. The approach is based on machine-learning methods used to classify the scanned 3D point cloud data using 2D images. This approach can be used to primarily track changes of objects over time for comparison, allowing for routine classification, and presentation of results used for decision making. We specifically focus on classification, segmentation, and reconstruction of multiple different object types in a 3D point-cloud scene. We present our current research and describe the implementation of these technologies as a web-based application using a services-oriented methodology.
We present a prototype of an integrated reasoning environment for educational purposes. The presented tool is a fragment of a proof assistant and automated theorem prover. We describe the existing and planned functionality of the theorem prover and especially the functionality of the educational fragment. This currently supports working with terms of the untyped lambda calculus and addresses both undergraduate students and researchers. We show how the tool can be used to support the students' understanding of functional programming and discuss general problems related to the process of building theorem proving software that aims at supporting both research and education.
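The kind of term manipulation such an educational tool must support can be sketched with a tiny beta-reduction stepper for the untyped lambda calculus. This is an illustrative toy with capture-naive substitution, not the tool's implementation; the tuple encoding of terms is an assumption.

```python
# Terms: ('var', name) | ('lam', name, body) | ('app', fn, arg)

def substitute(term, name, value):
    """Capture-naive substitution of `value` for free occurrences of `name`."""
    kind = term[0]
    if kind == 'var':
        return value if term[1] == name else term
    if kind == 'lam':
        if term[1] == name:          # the binder shadows `name`
            return term
        return ('lam', term[1], substitute(term[2], name, value))
    return ('app', substitute(term[1], name, value), substitute(term[2], name, value))

def reduce_once(term):
    """One leftmost-outermost beta-reduction step, or None if in normal form."""
    if term[0] == 'app':
        fn, arg = term[1], term[2]
        if fn[0] == 'lam':
            return substitute(fn[2], fn[1], arg)
        step = reduce_once(fn)
        if step is not None:
            return ('app', step, arg)
        step = reduce_once(arg)
        if step is not None:
            return ('app', fn, step)
    if term[0] == 'lam':
        step = reduce_once(term[2])
        if step is not None:
            return ('lam', term[1], step)
    return None

# (lambda x. x) y  reduces to  y
identity_app = ('app', ('lam', 'x', ('var', 'x')), ('var', 'y'))
print(reduce_once(identity_app))
```

A classroom-grade tool would add capture-avoiding renaming and a pretty-printer, but single-step reduction is the operation students trace by hand.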
High-throughput RNA sequencing (RNAseq) produces large data sets containing expression levels of thousands of genes. The analysis of RNAseq data leads to a better understanding of gene functions and interactions, which eventually helps to study diseases like cancer and develop effective treatments. Large-scale RNAseq expression studies on cancer comprise samples from multiple cancer types and aim to identify their distinct molecular characteristics. Analyzing samples from different cancer types implies analyzing samples of different tissue origin. Such multi-tissue RNAseq data sets require an analysis that accounts for the inherent tissue-related bias: the identified characteristics must not originate from differences between tissue types, but from actual differences between cancer types. However, current analysis procedures do not account for this aspect. We therefore propose to integrate tissue-awareness into the analysis of multi-tissue RNAseq data. We introduce an extension for gene selection that provides a tissue-wise context for every gene and can be flexibly combined with any existing gene selection approach. We also suggest expanding conventional evaluation with additional metrics that are sensitive to the tissue-related bias. Evaluations show that low-complexity gene selection approaches in particular profit from introducing tissue-awareness.
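One way to read "a tissue-wise context for every gene" is to run a base selector separately per tissue and keep only the genes selected in every tissue, so that no gene is chosen merely because tissues differ from each other. The variance-based base selector and the toy data below are assumptions for illustration, not the paper's extension.

```python
from statistics import variance

def top_variance_genes(samples, k):
    """Base selector: the k genes with the highest expression variance."""
    genes = samples[0].keys()
    ranked = sorted(genes, key=lambda g: variance([s[g] for s in samples]),
                    reverse=True)
    return set(ranked[:k])

def tissue_aware_selection(samples_by_tissue, k):
    """Apply the base selector within each tissue and intersect the results."""
    per_tissue = [top_variance_genes(s, k) for s in samples_by_tissue.values()]
    return set.intersection(*per_tissue)

# Hypothetical expression values for three genes in two tissues.
data = {
    "lung":  [{"A": 1, "B": 5, "C": 2}, {"A": 9, "B": 5, "C": 3}],
    "liver": [{"A": 2, "B": 1, "C": 2}, {"A": 8, "B": 9, "C": 3}],
}
print(tissue_aware_selection(data, 2))
```

Here gene B varies strongly only in liver, so it is dropped; gene A varies within both tissues and survives the tissue-wise filter.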
An energy consumption model for multimodal wireless sensor networks based on wake-up radio receivers
(2018)
Energy consumption is a major concern in Wireless Sensor Networks. A significant waste of energy occurs due to the idle listening and overhearing problems, which are typically avoided by turning off the radio while no transmission is ongoing. The classical approach to allowing the reception of messages in such situations is to use a low-duty-cycle protocol and to turn on the radio periodically, which reduces the idle listening problem but requires timers and usually unnecessary wakeups. A better solution is to turn on the radio only on demand by using a Wake-up Radio Receiver (WuRx). In this paper, an energy model is presented to estimate the energy savings in various multi-hop network topologies under several use cases when a WuRx is used instead of a classical low-duty-cycling protocol. The presented model also allows the benefit of various WuRx properties, such as support for addressing, to be estimated.
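The kind of comparison such an energy model enables can be sketched with back-of-the-envelope formulas for a duty-cycled radio versus an always-listening wake-up receiver. The two formulas and all power figures below are illustrative assumptions, not the paper's model.

```python
def duty_cycle_energy(t, period, on_time, p_on, p_off):
    """Energy (J) over time t (s) when the radio wakes every `period` seconds."""
    cycles = t / period
    return cycles * (on_time * p_on + (period - on_time) * p_off)

def wurx_energy(t, p_wurx, wakeups, on_time, p_on):
    """Energy (J) with an always-on wake-up receiver plus on-demand main-radio wakeups."""
    return t * p_wurx + wakeups * on_time * p_on

# Hypothetical figures: 60 mW active radio, 5 uW sleep, 10 uW WuRx.
t = 3600.0                                    # one hour
dc = duty_cycle_energy(t, period=1.0, on_time=0.01, p_on=0.060, p_off=5e-6)
wu = wurx_energy(t, p_wurx=10e-6, wakeups=10, on_time=0.05, p_on=0.060)
print(dc > wu)
```

With these numbers the duty-cycled node spends over 2 J per hour mostly on fruitless wakeups, while the WuRx node spends well under 0.1 J, which is the saving such a model quantifies across topologies.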
An Information System Supporting the Eliciting of Expert Knowledge for Successful IT Projects
(2018)
In order to guarantee the success of an IT project, a company must possess expert knowledge. The difficulty arises when experts no longer work for the company and their knowledge is nevertheless needed to realise an IT project. In this paper, we present the ExKnowIT information system, which supports the eliciting of expert knowledge for successful IT projects and consists of the following modules: (1) the identification of experts for successful IT projects, (2) the eliciting of expert knowledge on completed IT projects, (3) the expert knowledge base on completed IT projects, (4) the Group Method of Data Handling (GMDH) algorithm, and (5) new knowledge in support of decisions regarding the selection of a manager for a new IT project. The added value of our system is that three approaches complement each other: the elicitation of expert knowledge, the assessment of IT project success, and the discovery of new knowledge from the expert knowledge base in the form of a decision model.
The one-tube osmotic fragility (OF) test is a rapid test widely used for thalassemia screening in countries with limited resources. The test has an important limitation in that its accuracy relies on the observer's experience.
The iCheck Turbidity is a prototype portable nephelometer developed by BioAnalyt GmbH (Germany). In this study, we assessed the applicability of the iCheck Turbidity for checking the turbidity of the OF test.
ASEDS
(2018)
The massive adoption of social media has provided new ways for individuals to express their opinions and emotions online. In 2016, Facebook introduced a new feature that allows users to express their psychological emotions regarding published content using so-called Facebook reactions. In this paper, a framework for predicting the distribution of Facebook post reactions is presented. For this purpose, we collected a large number of Facebook posts together with their reaction labels using the proposed scalable Facebook crawler. The training process utilizes 3 million labeled posts from more than 64,000 unique Facebook pages from diverse categories. The evaluation on standard benchmarks using the proposed features shows promising results compared to previous research. The final model is able to predict the reaction distribution of Facebook posts with a recall score of 0.90 for the "Joy" emotion.
Manufacturing industries are undergoing a major paradigm shift towards more autonomy, making automated planning and scheduling a necessity. The Planning and Execution Competition for Logistics Robots in Simulation held at ICAPS is based on this scenario and provides an interesting testbed. However, the posed problem is challenging, as demonstrated by the somewhat weak results in 2017. The domain requires temporal reasoning and dealing with uncertainty. We propose a novel planning system based on Answer Set Programming and the Clingo solver to tackle these problems and incentivize robot cooperation. Our results show a significant performance improvement, both in terms of lower computational requirements and better game metrics.
The classification of vulnerabilities is a fundamental step in deriving formal attributes that allow deeper analysis, and it must therefore be performed in a timely and accurate manner. Since the current situation demands manual interaction in the classification process, timely processing becomes a serious issue. We therefore propose an automated alternative to manual classification, because the number of vulnerabilities identified per day can no longer be processed manually. We implemented two different approaches that automatically classify vulnerabilities based on the vulnerability description. We evaluated our approaches, which use Neural Networks and Naive Bayes methods respectively, on the basis of publicly known vulnerabilities.
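A Naive Bayes classifier over description text, one of the two approaches mentioned, can be sketched in a few lines of plain Python. The training descriptions and class labels below are invented for illustration and are not the evaluation data.

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Multinomial Naive Bayes over word counts with add-one smoothing."""

    def fit(self, texts, labels):
        self.counts = defaultdict(Counter)   # per-class word counts
        self.docs = Counter(labels)          # per-class document counts
        for text, label in zip(texts, labels):
            self.counts[label].update(text.lower().split())
        self.vocab = {w for c in self.counts.values() for w in c}
        return self

    def predict(self, text):
        def score(label):
            c, v = self.counts[label], len(self.vocab)
            total = sum(c.values())
            s = math.log(self.docs[label] / sum(self.docs.values()))
            for w in text.lower().split():
                s += math.log((c[w] + 1) / (total + v))
            return s
        return max(self.docs, key=score)

# Hypothetical two-class training set of vulnerability descriptions.
texts = ["buffer overflow in parser", "stack overflow via long input",
         "sql injection in login form", "blind sql injection in search"]
labels = ["memory", "memory", "injection", "injection"]
clf = NaiveBayes().fit(texts, labels)
print(clf.predict("heap overflow in image parser"))
```

Despite its simplicity, this word-count model is a common baseline for description-based classification; the neural approach trades its interpretability for richer text features.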
Beacon in the Dark
(2018)
The large amount of heterogeneous data in such email corpora renders manual investigation by experts infeasible. Auditors or journalists, for example, who are looking for irregular or inappropriate content or suspicious patterns are in desperate need of computer-aided exploration tools to support their investigations.
We present our Beacon system for the exploration of such corpora at different levels of detail. A distributed processing pipeline combines text mining methods and social network analysis to augment the already semi-structured nature of emails. The user interface ties into the resulting cleaned and enriched dataset. For the interface design, we identify three objectives of expert users: gaining an initial overview of the data to identify leads to investigate, understanding the context of the information at hand, and having meaningful filters to iteratively focus on a subset of emails. To this end, we make use of interactive visualisations based on rearranged and aggregated extracted information to reveal salient patterns.
Beware of SMOMBIES
(2018)
Several studies have evaluated the user's style of walking for the verification of a claimed identity and have shown high authentication accuracy in many settings. In this paper, we present a system that successfully verifies a user's identity based on many real-world smartphone placements and previously unconsidered interactions while walking. Our contribution is the division of all considered activities into three distinct subsets and a specific one-class Support Vector Machine per subset. Using sensor data of 30 participants collected in a semi-supervised study approach, we show that unsupervised verification is possible with very low false-acceptance and false-rejection rates. We furthermore show that these subsets can be distinguished with high accuracy and demonstrate that the system can be deployed on off-the-shelf smartphones.
Beyond Surveys
(2018)
Capsella
(2018)
Organic or inorganic (A) metal (M) halide (X) perovskites (AMX3) are semiconductor materials forming the basis for the development of highly efficient, low-cost, multijunction solar energy conversion devices. The best efficiencies nowadays are obtained with mixed compositions containing methylammonium, formamidinium, Cs and Rb, as well as iodine, bromine and chlorine as anions. Understanding fundamental properties such as the crystal structure and its effect on the band gap, as well as phase stability, is essential. In this systematic study, X-ray diffraction and photoluminescence spectroscopy were applied to evaluate the structural and optoelectronic properties of hybrid perovskites with mixed compositions.
Clause typing in Germanic
(2018)
The questionnaire investigates the functional left periphery of various finite clauses in Germanic languages, with particular attention paid to clause-typing elements and their combinations. The questionnaire is mostly concerned with clause typing in embedded clauses, but main clause counterparts are also considered for comparative purposes. The chief aim was to achieve comparable results across Germanic languages, though the standardised questionnaire may also be helpful in the study of other languages. Most questions examine the availability of various complementisers and clause-typing operators, and in some cases the movement of verbs to the left periphery is also taken into account. The questionnaire is split into seven major parts according to the types of clauses under scrutiny.
All instructions were given in English, and the individual questions concern translations of given sentences from English into the target language and/or ask for specific details about the constructions in the target language.
The present document contains the questionnaire itself (together with the instructions given at the beginning of the questionnaire and at the beginning of the individual sections, as well as the questions asking for personal data), the sociolinguistic data of the speakers, and the actual results for the individual languages. Five Germanic languages are included: Dutch, Danish, Icelandic, Norwegian and Swedish. For each language, two informants were recruited. Given the small number of informants, the present study serves as a qualitative investigation and as a basis for further, quantitative and experimental studies.
This Research-to-Practice paper examines the practical application of various forms of collaborative learning in MOOCs. Since 2012, about 60 MOOCs in the wider context of Information Technology and Computer Science have been conducted on our self-developed MOOC platform. The platform is also used by several customers, who either run their own platform instances or use our white-label platform. We, as well as some of our partners, have experimented with different approaches to collaborative learning in these courses. Based on the results of early experiments, surveys amongst our participants, and requests by our business partners, we have integrated several options for collaborative learning into the system. The results of our experiments are fed directly back into platform development, allowing us to fine-tune existing tools and add new ones where necessary. In this paper, we discuss the benefits and disadvantages of design decisions in a MOOC with regard to the various forms of collaborative learning. While our focus is on forms of large-group collaboration, two types of small-group collaboration on our platforms are also briefly introduced.
Riback et al. (Reports, 13 October 2017, p. 238) used small-angle x-ray scattering (SAXS) experiments to infer a degree of compaction for unfolded proteins in water versus chemical denaturant that is highly consistent with the results from Förster resonance energy transfer (FRET) experiments. There is thus no "contradiction" between the two methods, nor evidence to support their claim that commonly used FRET fluorophores cause protein compaction.
Kim et al. recently measured the structure factor of deeply supercooled water droplets (Reports, 22 December 2017, p. 1589). We raise several concerns about their data analysis and interpretation. In our opinion, the reported data do not lead to clear conclusions about the origins of water’s anomalies.
The centrosome is not only the largest and most sophisticated protein complex within a eukaryotic cell; in the light of evolution, it is also one of its most ancient organelles. This special issue of "Cells" features representatives of three main, structurally divergent centrosome types, i.e., centriole-containing centrosomes, yeast spindle pole bodies (SPBs), and amoebozoan nucleus-associated bodies (NABs). Here, I discuss their evolution and their key functions in microtubule organization, mitosis, and cytokinesis. Furthermore, I provide a brief history of centrosome research and highlight recently emerged topics, such as the role of centrioles in ciliogenesis, the relationship of centrosomes and centriolar satellites, the integration of centrosomal structures into the nuclear envelope and the involvement of centrosomal components in non-centrosomal microtubule organization.
We consider chimera states in a one-dimensional medium of nonlinear nonlocally coupled phase oscillators. Stationary inhomogeneous solutions of the Ott-Antonsen equation for a complex order parameter that correspond to fundamental chimeras have been constructed. Stability calculations reveal that only some of these states are stable. Direct numerical simulation has shown that, under certain conditions, these structures are transformed into breathing chimera regimes owing to the development of instability. Further development of the instability leads to turbulent chimeras.
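The underlying model, a ring of identical phase oscillators with nonlocal coupling and a phase lag (the Kuramoto-Sakaguchi form), can be simulated with a simple Euler scheme. The parameters below are illustrative assumptions; observing an actual chimera requires carefully chosen coupling ranges, lag values, and initial conditions, and typically larger systems.

```python
import math
import random

def order_parameter(phases):
    """Magnitude of the complex mean field: 1 = full synchrony, ~0 = incoherence."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)

def step(phases, k, radius, alpha, dt):
    """One Euler step: each oscillator couples to `radius` neighbours on either
    side of a ring, through sin(phase difference - alpha)."""
    n = len(phases)
    new = []
    for i, p in enumerate(phases):
        coupling = sum(math.sin(phases[(i + j) % n] - p - alpha)
                       for j in range(-radius, radius + 1) if j != 0)
        new.append(p + dt * k * coupling / (2 * radius))
    return new

random.seed(1)
phases = [random.uniform(0, 2 * math.pi) for _ in range(50)]
for _ in range(200):
    phases = step(phases, k=1.0, radius=10, alpha=1.0, dt=0.05)
print(0.0 <= order_parameter(phases) <= 1.0)
```

In a chimera state the local order parameter would be near 1 in one region of the ring and markedly lower in another; the global value computed here only summarizes overall coherence.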
Logical modeling has been widely used to understand and expand the knowledge about protein interactions among different pathways. Realizing this, the caspo-ts system has recently been proposed to learn logical models from time series data. It uses Answer Set Programming to enumerate Boolean Networks (BNs) given prior knowledge networks and phosphoproteomic time series data. In the resulting sequence of solutions, similar BNs are typically clustered together. This can be problematic for large-scale problems where we cannot explore the whole solution space in reasonable time. Our approach extends the caspo-ts system to cope with the important use case of finding diverse solutions of a problem with a large number of solutions. We first present the algorithm for finding diverse solutions and then demonstrate the results of the proposed approach on two different benchmark scenarios in systems biology: (1) an artificial dataset modeling TCR signaling and (2) the HPN-DREAM challenge dataset modeling breast cancer cell lines.
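The diversity objective can be illustrated with a minimal sketch (not the caspo-ts implementation, which enumerates via ASP): given a pool of already-enumerated BNs, represented here as sets of signed edges, greedily keep the solution farthest from everything selected so far.

```python
# Hypothetical sketch of diverse-solution selection. BNs are modeled as
# frozensets of edge identifiers; distance is the size of the symmetric
# difference (a Hamming-style distance between edge sets).
def hamming(a, b):
    return len(a ^ b)

def pick_diverse(solutions, k):
    # Greedy max-min: repeatedly add the solution whose minimum distance
    # to the already-chosen set is largest.
    chosen = [solutions[0]]
    while len(chosen) < k:
        best = max(solutions, key=lambda s: min(hamming(s, c) for c in chosen))
        chosen.append(best)
    return chosen

pool = [frozenset({1, 2, 3}), frozenset({1, 2, 4}),
        frozenset({7, 8, 9}), frozenset({1, 2, 3, 4})]
diverse = pick_diverse(pool, 2)   # skips the near-duplicates of pool[0]
```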
Business processes constantly generate, manipulate, and consume data that are managed by organizational databases. Despite being central to process modeling and execution, the link between processes and data is often handled by developers when the process is implemented, thus leaving the connection unexplored during conceptual design. In this paper, we introduce, formalize, and evaluate a novel conceptual view that bridges the gap between process and data models, and show the kinds of interesting insights that can be derived from this novel proposal.
In the course of patient treatment, psychotherapists aim to meet the challenges of being both a trusted, knowledgeable conversation partner and a diligent documentalist. We are developing the digital whiteboard system Tele-Board MED (TBM), which allows the therapist to take digital notes during the session together with the patient. This study investigates what therapists experience when they document with TBM in patient sessions for the first time and whether this documentation saves them time when writing official clinical documents. As the core of this study, we conducted four anamnesis session dialogues with behavior psychotherapists and volunteers acting in the role of patients. Following a mixed-method approach, the data collection and analysis involved self-reported emotion samples, user experience curves and questionnaires. We found that even in the very first patient session with TBM, therapists come to feel comfortable, develop a positive feeling and can concentrate on the patient. Regarding administrative documentation tasks, we found that with the TBM report generation feature, therapists save 60% of the time they normally spend on writing case reports for the health insurance company.
One paragraph of the manuscript of the paper was inadvertently omitted in the very final stage of its compilation due to a technical mistake. Since this paragraph discusses the declustering of the earthquake catalogue used and is therefore necessary for understanding the seismicity data preprocessing, the authors decided to provide this paragraph in the form of a correction. The respective paragraph belongs to chapter 2 of the paper, where it was placed originally, and should be inserted into the published paper before the second-to-last paragraph. The omitted text reads as follows:
We study the rupture processes of the Mw 8.1 Iquique earthquake (2014/04/01) and its largest aftershock, Mw 7.7 (2014/04/03), which ruptured the North Chile subduction zone. High-rate Global Positioning System (GPS) recordings and strong motion data are used to reconstruct the evolution of the slip amplitude, rise time and rupture time of both earthquakes. A two-step inversion scheme is adopted: first, prior models for both earthquakes are built from the inversion of the estimated static displacements; then, kinematic inversions in the frequency domain are carried out taking this prior information into account. The preferred model for the mainshock exhibits a seismic moment of 1.73 × 10^21 Nm (Mw 8.1) and a maximum slip of ∼9 m, while the aftershock model has a seismic moment of 3.88 × 10^20 Nm (Mw 7.7) and a maximum slip of ∼3 m. For both earthquakes, the final slip distributions show two asperities (a shallow one and a deep one) separated by an area with a significant slip deficit. This suggests a segmentation along-dip which might be related to a change of the dipping angle of the subducting slab inferred from gravimetric data. Along-strike, the areas where the seismic ruptures stopped seem to be well correlated with geological features observed from geophysical information (high-resolution bathymetry, gravimetry and coupling maps) that are representative of the long-term segmentation of the subduction margin. Considering the spatially limited portions that were broken by these two earthquakes, our results support the idea that the seismic gap is not yet filled.
The desiccation of the Aral Sea and the related changes in regional hydroclimatic conditions have been a hot topic for the past decades. The key problem of scientific research projects devoted to investigating the modern hydrological regime of the Aral Sea basin is its discontinuous nature: only a limited number of papers take the complex runoff formation system into account in its entirety. Addressing this challenge, we have developed a continuous prediction system for assessing freshwater inflow into the Small Aral Sea, based on coupling a stack of hydrological and data-driven models. Results show good prediction skill and confirm the possibility of developing a valuable water assessment tool that harnesses the power of both classical physically based and modern machine learning models for territories with a complex water management system and strong water-related data scarcity. The source code and data of the proposed system are available on GitHub (https://github.com/SMASHIproject/IWRM2018).
CSBAuditor
(2018)
Cloud Storage Brokers (CSB) provide seamless and concurrent access to multiple Cloud Storage Services (CSS) while abstracting cloud complexities from end-users. However, this multi-cloud strategy faces several security challenges, including enlarged attack surfaces, malicious insider threats, security complexities due to the integration of disparate components, and API interoperability issues. Novel security approaches are imperative to tackle these security issues. Therefore, this paper proposes CSBAuditor, a novel cloud security system that continuously audits CSB resources to detect malicious activities and unauthorized changes, e.g. bucket policy misconfigurations, and remediates these anomalies. The cloud state is maintained via a continuous snapshotting mechanism, thereby ensuring fault tolerance. We adopt the principles of chaos engineering by integrating Broker Monkey, a component that continuously injects failures into our reference CSB system, Cloud RAID. Hence, CSBAuditor is continuously tested for efficiency, i.e. its ability to detect the changes injected by Broker Monkey. CSBAuditor employs security metrics for risk analysis by computing severity scores for detected vulnerabilities using the Common Configuration Scoring System, thereby overcoming the limitation of insufficient security metrics in existing cloud auditing schemes. CSBAuditor has been tested using various strategies, including chaos engineering failure injection strategies. Our experimental evaluation validates the efficiency of our approach against the aforementioned security issues, with a detection and recovery rate of over 96%.
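The core of the continuous auditing idea can be sketched as a diff between the expected (snapshotted) and the observed cloud state; the keys and values below are illustrative and not Cloud RAID's actual data model.

```python
# Hypothetical sketch: detect unauthorized changes by diffing the expected
# state snapshot against the observed state. Each anomaly records what the
# value was supposed to be and what was actually found.
def diff_state(expected, observed):
    anomalies = {}
    for key in expected.keys() | observed.keys():
        if expected.get(key) != observed.get(key):
            anomalies[key] = (expected.get(key), observed.get(key))
    return anomalies

expected = {"bucket-a/policy": "private", "bucket-b/policy": "private"}
observed = {"bucket-a/policy": "public-read", "bucket-b/policy": "private"}
changes = diff_state(expected, observed)   # flags the misconfigured bucket
```

A remediation step would then restore each flagged key to its expected value.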
CurEx
(2018)
The integration of diverse structured and unstructured information sources into a unified, domain-specific knowledge base is an important task in many areas. A well-maintained knowledge base enables data analysis in complex scenarios, such as risk analysis in the financial sector or investigating large data leaks such as the Paradise or Panama Papers. Both the creation of such knowledge bases and their continuous maintenance and curation involve many complex tasks and considerable manual effort. With CurEx, we present a modular system that allows structured and unstructured data sources to be integrated into a domain-specific knowledge base. In particular, we (i) enable the incremental improvement of each individual integration component; (ii) enable the selective generation of multiple knowledge graphs from the information contained in the knowledge base; and (iii) provide two distinct user interfaces tailored to the needs of data engineers and end-users respectively. The former has curation capabilities and controls the integration process, whereas the latter focuses on the exploration of the generated knowledge graph.
Operational decisions in business processes can be modeled by using the Decision Model and Notation (DMN). The complementary use of DMN for decision modeling and of the Business Process Model and Notation (BPMN) for process design realizes the separation of concerns principle. For supporting separation of concerns during the design phase, it is crucial to understand which aspects of decision-making enclosed in a process model should be captured by a dedicated decision model. Whereas existing work focuses on the extraction of decision models from process control flow, the connection of process-related data and decision models is still unexplored. In this paper, we investigate how process-related data used for making decisions can be represented in process models and we distinguish a set of BPMN patterns capturing such information. Then, we provide a formal mapping of the identified BPMN patterns to corresponding DMN models and apply our approach to a real-world healthcare process.
Business process simulation is an important means for quantitative analysis of a business process and to compare different process alternatives. With the Business Process Model and Notation (BPMN) being the state-of-the-art language for the graphical representation of business processes, many existing process simulators support already the simulation of BPMN diagrams. However, they do not provide well-defined interfaces to integrate new concepts in the simulation environment. In this work, we present the design and architecture of a proof-of-concept implementation of an open and extensible BPMN process simulator. It also supports the simulation of multiple BPMN processes at a time and relies on the building blocks of the well-founded discrete event simulation. The extensibility is assured by a plug-in concept. Its feasibility is demonstrated by extensions supporting new BPMN concepts, such as the simulation of business rule activities referencing decision models and batch activities.
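The discrete event simulation building blocks mentioned above can be sketched as a time-ordered event queue with a plug-in registry of activity handlers; the names and interfaces here are illustrative, not the simulator's actual API.

```python
import heapq
import itertools

# Minimal discrete-event simulation core with a plug-in point: each event
# kind registers a handler, and handlers may schedule follow-up events.
class Simulator:
    def __init__(self):
        self.queue, self.now, self.handlers = [], 0.0, {}
        self._seq = itertools.count()   # tie-breaker for equal timestamps

    def register(self, kind, handler):  # plug-in registration
        self.handlers[kind] = handler

    def schedule(self, delay, kind, payload=None):
        heapq.heappush(self.queue, (self.now + delay, next(self._seq), kind, payload))

    def run(self):
        trace = []
        while self.queue:
            self.now, _, kind, payload = heapq.heappop(self.queue)
            trace.append((self.now, kind))
            self.handlers[kind](self, payload)
        return trace

sim = Simulator()
sim.register("start", lambda s, p: s.schedule(2.0, "task"))
sim.register("task", lambda s, p: s.schedule(1.0, "end"))
sim.register("end", lambda s, p: None)
sim.schedule(0.0, "start")
trace = sim.run()   # events fire in timestamp order
```

New BPMN concepts (e.g. batch activities) would plug in as additional handlers rather than changes to the core loop.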
For theoretical analyses, two specifics distinguish GP from many other areas of evolutionary computation. First, the variable-size representations, which in particular may yield bloat (i.e. the growth of individuals with redundant parts). Second, the role and realization of crossover, which is particularly central in GP due to the tree-based representation. Whereas some theoretical work on GP has studied the effects of bloat, crossover has had a surprisingly small share in this work. We analyze a simple crossover operator in combination with local search, where a preference for small solutions minimizes bloat (lexicographic parsimony pressure); the resulting algorithm is denoted Concatenation Crossover GP. For this purpose, three variants of the well-studied Majority test function with large plateaus are considered. We show that Concatenation Crossover GP can efficiently optimize these test functions, while local search cannot be efficient for all three variants, independent of employing bloat control.
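Lexicographic parsimony pressure itself is easy to state: between two candidates, better fitness wins, and ties go to the smaller individual. A minimal sketch (the representation as (fitness, size) pairs and the values are illustrative):

```python
# Sketch of lexicographic parsimony pressure. Candidates are (fitness, size)
# pairs: higher fitness wins; at equal fitness the smaller individual is
# preferred, which is what keeps bloat in check.
def better(a, b):
    return a if (a[0], -a[1]) > (b[0], -b[1]) else b

winner_tie = better((5, 10), (5, 7))   # equal fitness: smaller size wins
winner_fit = better((6, 20), (5, 3))   # fitness dominates size
```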
Development of a tool to identify intensive care patients at risk of meropenem therapy failure
(2018)
The Amyloid-precursor-like protein 1 (APLP1) is a neuronal type I transmembrane protein which plays a role in synaptic adhesion and synaptogenesis. Past investigations indicated that APLP1 is involved in the formation of protein-protein complexes that bridge the junctions between neighboring cells. Nevertheless, APLP1-APLP1 trans interactions have never been directly observed in higher eukaryotic cells. Here, we investigate APLP1 interactions and dynamics directly in living human embryonic kidney (HEK) cells, using fluorescence fluctuation spectroscopy techniques, namely cross-correlation scanning fluorescence correlation spectroscopy (sFCS) and Number&Brightness (N&B). Our results show that APLP1 forms homotypic trans complexes at cell-cell contacts. In the presence of zinc ions, the protein forms macroscopic clusters, exhibiting an even higher degree of trans binding and strongly reduced dynamics. Further evidence from Giant Plasma Membrane Vesicles and live cell actin staining suggests that the presence of an intact cortical cytoskeleton is required for zinc-induced cis multimerization. Subsequently, large adhesion platforms bridging interacting cells are formed through APLP1-APLP1 direct trans interactions. Taken together, our results provide direct evidence that APLP1 functions as a neuronal zinc-dependent adhesion protein and provide a more detailed understanding of the molecular mechanisms driving the formation of APLP1 adhesion platforms. Further, they show that fluorescence fluctuation spectroscopy techniques are useful tools for the investigation of protein-protein interactions at cell-cell adhesion sites.
Participants of the 2017 European Space Weather Week in Ostend, Belgium, discussed the stakeholder requirements for space weather-related models. It was emphasized that stakeholders show an increased interest in space weather-related models. Participants of the meeting discussed particular prediction indicators that can provide first-order estimates of the impact of space weather on engineering systems.
Skeletal muscle alterations during aging lead to dysfunctional metabolism, correlating with frailty and early mortality. The loss of proteostasis is a hallmark of aging, yet whether proteostasis loss plays a role in muscle aging remains elusive. To address this question, we collected two muscles, Soleus (SOL, type I) and Extensor digitorum longus (EDL, type II), from young (4 months) and old (25 months) C57BL/6 mice and evaluated the proteasomal system. Initial work showed decreased 26S activity in old SOL. EDL displayed lower proteasomal activity at both ages compared to SOL at either age. Moreover, to understand whether the so-called "fiber switch from fast to slow" occurs during aging, we performed western blots against sMHC and fMHC (slow and fast myosin heavy chain, respectively). Preliminary results suggest that young SOL is composed of slow-twitch fibers but also contains fast-twitch fibers, while young EDL seems to be mostly composed of fast-twitch fibers whose share declines during aging, suggesting the switch. In conclusion, EDL seems to have less proteasomal activity; however, whether this is a contributor to or a consequence of the muscle fiber switch during aging still needs further investigation.
Recently blockchain technology has been introduced to execute interacting business processes in a secure and transparent way. While the foundations for process enactment on blockchain have been researched, the execution of decisions on blockchain has not been addressed yet. In this paper we argue that decisions are an essential aspect of interacting business processes, and, therefore, also need to be executed on blockchain. The immutable representation of decision logic can be used by the interacting processes, so that decision taking will be more secure, more transparent, and better auditable. The approach is based on a mapping of the DMN language S-FEEL to Solidity code to be run on the Ethereum blockchain. The work is evaluated by a proof-of-concept prototype and an empirical cost evaluation.
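The paper maps S-FEEL decision tables to Solidity; as a language-neutral illustration of the underlying semantics (sketched here in Python, with made-up rules), a decision table with first-match semantics can be evaluated like this:

```python
# Illustrative evaluation of a decision table: the first rule whose input
# tests all match fires and yields the outcome. The rules below are
# invented examples, not taken from the paper.
rules = [
    ({"amount": lambda v: v < 1000, "risk": lambda v: v == "low"}, "approve"),
    ({"amount": lambda v: v >= 1000, "risk": lambda v: v == "low"}, "review"),
    ({"amount": lambda v: True, "risk": lambda v: v == "high"}, "reject"),
]

def decide(inputs):
    for tests, outcome in rules:
        if all(test(inputs[name]) for name, test in tests.items()):
            return outcome
    return None   # no rule matched

small_low = decide({"amount": 500, "risk": "low"})
large_high = decide({"amount": 2000, "risk": "high"})
```

On-chain, the same table would be compiled into an immutable Solidity function, so every interacting process evaluates identical, auditable decision logic.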
DualPanto
(2018)
We present a new haptic device that enables blind users to continuously track the absolute position of moving objects in spatial virtual environments, as is the case in sports or shooter games. Users interact with DualPanto by operating the "me" handle with one hand and by holding on to the "it" handle with the other hand. Each handle is connected to a pantograph haptic input/output device. The key feature is that the two handles are spatially registered with respect to each other. When guiding their avatar through a virtual world using the "me" handle, spatial registration enables users to track moving objects by having the device guide the output hand. This allows blind players of a 1-on-1 soccer game to race for the ball or evade an opponent; it allows blind players of a shooter game to aim at an opponent and dodge shots. In our user study, blind participants reported very high enjoyment when using the device to play (6.5/7).
Constructing and maintaining a tree topology in a distributed manner is a challenging task in wireless sensor networks (WSNs), because the nodes have limited computational and memory resources and the network changes over time. We propose the Dynamic Gallager-Humblet-Spira (D-GHS) algorithm, which builds and maintains a minimum spanning tree. To do so, we divide D-GHS into four phases, namely neighbor discovery, tree construction, data collection, and tree maintenance. In the neighbor discovery phase, the nodes collect information about their neighbors and the link quality. In the tree construction phase, D-GHS finds the minimum spanning tree by executing the Gallager-Humblet-Spira algorithm. In the data collection phase, the sink roots the minimum spanning tree at itself, and each node sends data packets. In the tree maintenance phase, the nodes repair the tree when communication failures occur. The emulation results show that D-GHS reduces the number of control messages and the energy consumption, at the cost of a slight increase in memory size and convergence time.
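A faithful distributed GHS implementation is involved; as a sketch of what the tree construction phase converges to, the same minimum spanning tree can be computed centrally with Kruskal's algorithm (union-find):

```python
# Centralized stand-in for the tree-construction phase: Kruskal's algorithm
# with union-find yields the same minimum spanning tree that the
# distributed GHS procedure converges to.
def kruskal(n, edges):                  # edges: (weight, u, v)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    mst = []
    for w, u, v in sorted(edges):       # lightest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                    # keep the edge only if it joins fragments
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

edges = [(1, 0, 1), (4, 1, 2), (3, 0, 2), (2, 2, 3)]
tree = kruskal(4, edges)                # three edges, total weight 6
```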
Editorial
(2018)
"Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it's the only thing that ever has. - Margaret Mead."
With the last issue of this year, we want to point out directions towards what will come and what challenges and opportunities lie ahead of us. Needed more than ever are joint creative efforts to find ways to collaborate and innovate in order to secure the wellbeing of our earth for the generations to come. We have found ourselves puzzled that we could assemble a sustainability issue without having a call for papers or a special issue. In fact, many of the submissions we currently receive deal with sustainable, ecological or novel approaches to management and organizations. While creativity and innovation are indisputably necessary ingredients for reaching the sustainable development goals, empirical proof and research in this area are still in their infancy. While the role of design and design thinking has been highlighted before for solving wicked societal problems, much more research is needed on the creative and innovative ways organisations and societies can take to find solutions to climate change, poverty, hunger and education. We would therefore like to call on you, our readers and writers, to tackle these problems with your research.
The first article in this issue addresses one of the above-named challenges: the role of innovation in achieving the transition to a low-carbon energy world. In "Innovating for low-carbon energy through hydropower: Enabling a conservation charity's transition to a low-carbon community", the authors John Gallagher, Paul Coughlan, A. Prysor Williams and Aonghus McNabola look at how an eco-design approach has supported a community's transition to low carbon. They highlight the importance of effective management as well as external collaboration, and how the key to success lay in fostering an open environment for creativity and idea sharing. The second article addresses another of the grand challenges, the future of mobility, and uses a design-driven approach to develop scenarios for mobility in cities. In "Designing radical innovations of meanings for society: envisioning new scenarios for smart mobility", the authors Claudio Dell'Era, Naiara Altuna and Roberto Verganti investigate how new meanings can be designed and proposed to society rather than to individuals in the particular context of smart mobility. Through two case studies, the authors argue for a multi-level perspective, taking the perspective of society to solve societal challenges while considering the needs of the individual. The latter is needed because we will not change if our needs are not addressed. Furthermore, the authors find that both meaning and technology need to be considered to create radical innovation for society. The role of meaning continues in the third article in this issue. The authors Marta Gasparin and William Green show in their article "Reconstructing meaning without redesigning products: The case of the Serie 7 chair" how meaning changes over time even though the product remains the same.
Through an in-depth retrospective study of the Serie 7 chair, the authors investigate the relationship between meaning and the materiality of the object, and show the importance of materiality in constructing product meaning over long periods. Translating this meaning over the course of the innovation process is an important task of management in order to gain buy-in from all involved stakeholders. In the following article, "A systematic approach for new technology development by using a biomimicry-based TRIZ contradiction matrix", the authors Byungun Yoon, Chaeguk Lim, Inchae Park and Dooseob Yoon develop a systematic process combining biomimicry and technology-based TRIZ in order to solve technological problems or develop new technologies based on completely new sources or combinations from technology and biology.
In the fifth article in this issue, "Innovating via Building Absorptive Capacity: Interactive Effects of Top Management Support of Learning, Employee Learning Orientation, and Decentralization Structure", the authors Li-Yun Sun, Chenwei Li and Yuntao Dong examine the effect of learning-related personal and contextual factors on organizational absorptive capability and subsequent innovative performance. The authors find positive effects as well as a moderating influence of decentralized organizational decision-making structures. In the sixth article, "Creativity within boundaries: social identity and the development of new ideas in franchise systems", the authors Fanny Simon, Catherine Allix-Desfautaux, Nabil Khelil and Anne-Laure Le Nadant address the paradox of balancing novelty and conformity for creativity in a franchise system. This research is one of the first we know of to explicitly address creativity and innovation in such a rigid and pre-determined system. Using a social identity perspective, they show that social control, which may be exerted by manipulating group identity, is an efficient lever to increase both the creation and the diffusion of ideas. Furthermore, they show that franchisees who do not conform to the norms of the group are stigmatized and must face pressure from the group to adapt their behaviors. This has important implications for future research. In the following article, "Exploring employee interactions and quality of contributions in intra-organisational innovation platforms", the authors Dimitra Chasanidou, Njål Sivertstol and Jarle Hildrum examine the user interactions in an intra-organisational innovation platform, and also address the influence of user interactions on idea development. The authors find that employees communicate through the innovation platform with different interaction, contribution and collaboration types, and propose three types of contribution quality: passive, efficient and balanced contribution.
In the eighth article, "Ready for Take-off: How Open Innovation influences startup success", Cristina Marullo, Elena Casprini, Alberto di Minin and Andrea Piccaluga seek to predict new venture success based on factors that can be observed in the pre-startup phase. The authors introduce different variables of founding teams and how these relate to startup success. Building on a large-scale dataset of business plans submitted at UC Berkeley, they show that teams with high skills diversity and past joint experience are much better able to prevent the risk of business failure at entry and to adapt their internal resources to market conditions. Furthermore, it is crucial for the team to integrate many external knowledge sources into their process (openness) in order to be successful. The crucial role of knowledge and how it is communicated and shared is the focal point of Natalya Sergeeva's and Anna Trifilova's article on "The role of storytelling in the innovation process". The authors show how storytelling has an important role to play when it comes to motivating employees to innovate and promoting innovation success stories inside and outside the organization. The deep human desire to hear and experience stories is also addressed in the last article in this issue, "Gamification Approaches to the Early Stage of Innovation", by Rui Patricio, Antonio Moreira and Francesco Zurlo. Using gamification approaches at the early stage of innovation promises to create better team coherence, let employees experience fun and engagement, improve communication and foster knowledge exchange. Using an analytical framework, the authors analyze 15 articles that have previously looked at gamification in the context of innovation management. They find that gamification indeed supports firms in becoming better at performing complex innovation tasks and managing innovation challenges.
Furthermore, gamification in innovation creates a space for inspiration, improves creativity and the generation of high potential ideas.
Editorial: Reaching to Grasp Cognition: Analyzing Motor Behavior to Investigate Social Interactions
(2018)
Embedded smart home — remote lab MOOC with optional real hardware experience for over 4000 students
(2018)
MOOCs (Massive Open Online Courses) are becoming more and more popular with learners of all ages, whether to study further or to learn about new subjects of interest. The purpose of this paper is to introduce a different MOOC course style. Typically, video content is shown that teaches the student new information. After watching a video, self-test questions can be answered. Finally, the student takes weekly exams and final exams composed of questions similar to the self-tests. Based on the points scored in the weekly and final exams, a certificate can be issued. Our approach extends the possibility to receive points for the final score with practical programming exercises on real hardware. It allows students to do embedded programming by communicating over GPIO pins to control LEDs and measure sensor values. Additionally, they can visualize values on an embedded display using web technologies, which are an essential part of embedded and smart home devices that communicate with common APIs. Students have the opportunity to solve all tasks within the online remote lab and at home on the same kind of hardware. The evaluation of this MOOC indicates that the design helps students learn an engineering technique with new technology approaches in an appropriate, modern, supportive and motivating way of teaching.
When students watch learning videos online, they usually need to watch several hours of video content; in the end, not every minute of a video is relevant for the exam. Additionally, students need to add notes to clarify issues of a lecture. There are several possibilities to enhance the metadata of a video; e.g., a typical way to add user-specific information to an online video is a comment function, which allows users to share their thoughts and questions with the public. In contrast to common video material found online, lecture videos are used for exam preparation. Due to this difference, the idea came up to annotate lecture videos with markers and personal notes for a better understanding of the taught content. In particular, students learning for an exam use their notes to refresh their memories. To support this learning method with lecture videos, we introduce an annotation feature in our video lecture archive. This functionality helps students keep track of their thoughts by providing an intuitive interface to easily add, modify or remove their ideas. The annotation function is integrated into the video player, so scrolling to a separate annotation area on the website is not necessary. Furthermore, the annotated notes can be exported together with the slide content to a PDF file, which can then be printed easily. Lecture video annotations support and motivate students to learn and to watch videos from an e-learning video archive.
Live migration is an important feature in modern software-defined datacenters and cloud computing environments. Dynamic resource management, load balancing, power saving and fault tolerance all depend on the live migration feature. Despite its importance, the cost of live migration cannot be ignored and may result in degraded service availability. Live migration cost includes the migration time, downtime, CPU overhead, network and power consumption. Many research articles discuss the problem of live migration cost with different scopes, such as analyzing the cost and relating it to the parameters that control it, proposing new migration algorithms that minimize the cost, and predicting the migration cost. To the best of our knowledge, most of the papers that discuss the migration cost problem focus on open-source hypervisors. Among the research articles that focus on VMware environments, none has proposed migration time, network overhead and power consumption models for live migration of single and multiple VMs. In this paper, we propose empirical models for the live migration time, network overhead and power consumption for single and multiple VM migration. The proposed models are obtained using a VMware-based testbed.
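As an illustration of what an empirical model of this kind looks like (the data points below are invented, not the paper's measurements), migration time can be fitted as a linear function of VM memory size with ordinary least squares:

```python
# Illustrative sketch: fit migration time as a linear function of VM memory
# size via ordinary least squares. The observations are made up to show the
# modeling step, not taken from the paper's VMware testbed.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx        # (seconds per GB, base overhead)

mem_gb = [1, 2, 4, 8]                    # hypothetical VM memory sizes
t_sec = [12, 19, 33, 61]                 # hypothetical migration times
slope, intercept = fit_line(mem_gb, t_sec)
predicted = slope * 16 + intercept       # extrapolate to a 16 GB VM
```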
An efficient Design Space Exploration (DSE) is imperative for the design of modern, highly complex embedded systems in order to steer the development towards optimal design points. The early evaluation of design decisions at system-level abstraction layer helps to find promising regions for subsequent development steps in lower abstraction levels by diminishing the complexity of the search problem. In recent works, symbolic techniques, especially Answer Set Programming (ASP) modulo Theories (ASPmT), have been shown to find feasible solutions of highly complex system-level synthesis problems with non-linear constraints very efficiently. In this paper, we present a novel approach to a holistic system-level DSE based on ASPmT. To this end, we include additional background theories that concurrently guarantee compliance with hard constraints and perform the simultaneous optimization of several design objectives. We implement and compare our approach with a state-of-the-art preference handling framework for ASP. Experimental results indicate that our proposed method produces better solutions with respect to both diversity and convergence to the true Pareto front.
The relentless improvement of silicon photonics is making optical interconnects and networks appealing for use in miniaturized systems, where electrical interconnects cannot keep up with the growing levels of core integration due to bandwidth density and power efficiency limitations. At the same time, solutions such as 3D stacking or 2.5D integration open the door to a fully dedicated process optimization for the photonic die. However, an architecture-level integration challenge arises between the electronic network and the optical one in such tightly-integrated parallel systems. It consists of adapting signaling rates, matching the different levels of communication parallelism, handling cross-domain flow control, addressing re-synchronization concerns, and avoiding protocol-dependent deadlock. The associated energy and performance overhead may offset the inherent benefits of the emerging technology itself. This paper explores a hybrid CMOS-ECL bridge architecture between 3D-stacked technology-heterogeneous networks-on-chip (NoCs). The different ways of overcoming the serialization challenge (i.e., through an improvement of the signaling rate and/or through space-/wavelength division multiplexing options) give rise to a configuration space that the paper explores, in search of the most energy-efficient configuration for high performance.
We describe how inversion symmetry separation of electronic state manifolds in resonant inelastic soft X-ray scattering (RIXS) can be applied to probe excited-state dynamics with compelling selectivity. In a case study of Fe L3-edge RIXS in the ferricyanide complex Fe(CN)63−, we demonstrate with multi-configurational restricted active space spectrum simulations how the information content of RIXS spectral fingerprints can be used to unambiguously separate species of different electronic configurations, spin multiplicities, and structures, with possible involvement in the decay dynamics of photo-excited ligand-to-metal charge-transfer. Specifically, we propose that this could be applied to confirm or reject the presence of a hitherto elusive transient Quartet species. Thus, RIXS offers a particular possibility to settle a recent controversy regarding the decay pathway, and we expect the technique to be similarly applicable in other model systems of photo-induced dynamics.
One of the most important aspects of a randomized algorithm is bounding its expected run time on various problems. Formally speaking, this means bounding the expected first-hitting time of a random process. The two arguably most popular tools to do so are the fitness level method and drift theory. The fitness level method considers arbitrary transition probabilities but only allows the process to move toward the goal. On the other hand, drift theory allows the process to move into any direction as long as it move closer to the goal in expectation; however, this tendency has to be monotone and, thus, the transition probabilities cannot be arbitrary. We provide a result that combines the benefit of these two approaches: our result gives a lower and an upper bound for the expected first-hitting time of a random process over {0,..., n} that is allowed to move forward and backward by 1 and can use arbitrary transition probabilities. In case that the transition probabilities are known, our bounds coincide and yield the exact value of the expected first-hitting time. Further, we also state the stationary distribution as well as the mixing time of a special case of our scenario.
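For the process described here (a birth-death chain on {0,...,n} with known transition probabilities), the exact expected first-hitting time of state n can be computed by a standard telescoping recurrence; this is a minimal sketch of that textbook computation, not the bounds derived in the paper:

```python
def expected_hitting_time(p, q, start=0):
    """Exact expected number of steps for a birth-death chain on
    {0,...,n} to first hit state n, starting from `start`.
    p[i]: probability of moving i -> i+1; q[i]: probability of
    moving i -> i-1 (q[0] must be 0); the rest is a self-loop."""
    n = len(p)                      # states 0..n, with n the target
    d = [0.0] * n                   # d[i] = E[T from i] - E[T from i+1]
    d[0] = 1.0 / p[0]
    for i in range(1, n):
        d[i] = (1.0 + q[i] * d[i - 1]) / p[i]
    return sum(d[start:])

# Fair +/-1 walk on {0,...,10}, reflecting at 0: E[T] = n^2 = 100.
n = 10
p = [1.0] + [0.5] * (n - 1)
q = [0.0] + [0.5] * (n - 1)
print(expected_hitting_time(p, q))  # 100.0
```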
For the last ten years, almost every theoretical result concerning the expected run time of a randomized search heuristic has used drift theory, making it arguably the most important tool in this domain. Its success is due to its ease of use and its powerful result: drift theory allows the user to derive bounds on the expected first-hitting time of a random process by bounding expected local changes of the process - the drift. This is usually far easier than bounding the expected first-hitting time directly. Due to the widespread use of drift theory, it is of utmost importance to have the best drift theorems possible. We improve the fundamental additive, multiplicative, and variable drift theorems by stating them in a form as general as possible and providing examples of why the restrictions we keep are still necessary. Our additive drift theorem for upper bounds only requires the process to be nonnegative, that is, we remove unnecessary restrictions like a finite, discrete, or bounded search space. As corollaries, the same is true for our upper bounds in the case of variable and multiplicative drift.
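The additive drift theorem mentioned here states that if a nonnegative process decreases by at least delta in expectation per step, then the expected hitting time of 0 is at most X_0/delta. A minimal Monte-Carlo sanity check (the toy process and parameters are our own illustration, not from the paper):

```python
import random

def hitting_time(x0, step):
    """Number of steps until the process driven by `step` reaches 0."""
    x, t = x0, 0
    while x > 0:
        x = step(x)
        t += 1
    return t

# Toy process: decrease by 1 with probability 1/2, else stay put.
# The additive drift is delta = 1/2, so the theorem gives
# E[T] <= X_0 / delta = 20 / 0.5 = 40 (with equality for this process).
random.seed(1)
runs = [hitting_time(20, lambda x: x - (random.random() < 0.5))
        for _ in range(20000)]
print(sum(runs) / len(runs))  # close to 40
```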
Several areas in Southeast Asia are highly vulnerable to climate change and unable to take immediate and effective countermeasures due to insufficient capabilities. Malaysia, in particular the east coast of peninsular Malaysia and Sarawak, is known as one of the regions vulnerable to flood disaster. Prolonged and intense rainfall, natural activities, and increased runoff are the main causes of flooding in this area. In addition, topographic conditions also contribute to the occurrence of flood disasters. Kuching city is located in the northwest of Borneo Island and is part of the Sarawak river catchment. The area is a developing state in Malaysia that has experienced rapid urbanization since the 2000s, which has led to insufficient availability of topographic and hydrological data. To deal with these challenging issues, this study presents a flood modelling framework that uses remote sensing technologies and machine learning techniques to acquire a digital elevation model (DEM) with improved accuracy for the non-surveyed areas. Intensity-duration-frequency (IDF) curves were derived from a climate model for various scenario simulations. The developed flood framework will benefit planners, policymakers, stakeholders, and researchers in the field of water resource management by providing better ideas and tools for dealing with flooding issues in the region.
Foreword
(2018)
Foreword
(2018)
This book aims at understanding the diversity of planetary and lunar magnetic fields and their interaction with the solar wind. A synergistic interdisciplinary approach combines newly developed tools for data acquisition and analysis, computer simulations of planetary interiors and dynamos, models of solar wind interaction, measurements of terrestrial rocks and meteorites, and laboratory investigations. The following chapters represent a selection of the scientific findings derived by the 22 projects within the DFG Priority Program "Planetary Magnetism" (PlanetMag). This introductory chapter gives an overview of the individual following chapters, highlighting their role in the overall goals of the PlanetMag framework. The diversity of the different contributions reflects the wide range of magnetic phenomena in our solar system. From the program we have excluded magnetism of the Sun, which is an independent broad research discipline, but we include the interaction of the solar wind with planets and moons. Within the subsequent 13 chapters of this book, the authors review the field centered on their research topic within PlanetMag. Here we briefly introduce the content of all the subsequent chapters and outline the context in which they should be seen.
We compare the robustness of humans and current convolutional deep neural networks (DNNs) on object recognition under twelve different types of image degradations. First, using three well-known DNNs (ResNet-152, VGG-19, GoogLeNet), we find the human visual system to be more robust to nearly all of the tested image manipulations, and we observe progressively diverging classification error patterns between humans and DNNs as the signal gets weaker. Secondly, we show that DNNs trained directly on distorted images consistently surpass human performance on the exact distortion types they were trained on, yet they display extremely poor generalisation abilities when tested on other distortion types. For example, training on salt-and-pepper noise does not imply robustness on uniform white noise and vice versa. Thus, changes in the noise distribution between training and testing constitute a crucial challenge to deep learning vision systems that can be systematically addressed in a lifelong machine learning approach. Our new dataset, consisting of 83K carefully measured human psychophysical trials, provides a useful reference for lifelong robustness against image degradations, set by the human visual system.
Grußwort
(2018)
Background:
The overall goal of the project 'StiEL' is to contribute to the professional development of teachers and other educational staff working at German secondary schools. The aim is to develop an evidence-based training concept for the inclusion of students with diverse abilities. The project is organized as a collaborative research effort of three partnering institutions and is funded by the German Federal Ministry of Education and Research from 2018 to 2021.
Methods:
To support the on-going transition towards inclusive school practices, a multi-stage approach is envisaged. The first phase aims at a scoping review of existing literature and programmes on inclusion. The overview is supplemented by interviews with school staff members. Training modules are developed in the second project phase. The third phase of StiEL puts the newly developed training program into practice. The knowledge and skills acquired by the participants through the training as well as the teaching and management of inclusive classrooms after the training are evaluated through longitudinal and ethnographic approaches. The final project phase creates a best practice manual and makes the modules available via open access databases.
Results:
The presentation will focus on the first phase and explore the health-related consequences of the transition towards an inclusive school system in Germany for the different participants. We will present preliminary results of expert interviews as well as selected results from the literature screening. According to our findings, current practice at German schools on the road to inclusion is very stressful for all participants. We will explore recommendations for health-promoting schools under conditions of inclusion.
Conclusions:
In terms of health-related consequences for all participants, the road to inclusion is very ambitious but also very stressful. Regarding the development of an inclusive school system, we need to focus much more on health and health promotion.
I can see it in your face
(2018)
The plant pathogen Pseudomonas syringae is a gram-negative bacterium that infects a wide range of plant species, including important crop plants. To suppress plant immunity and cause disease, P. syringae injects type-III effector proteins (T3Es) into the plant cell cytosol. In this study, we identified a novel target of the well-characterized bacterial T3E HopZ1a. HopZ1a is an acetyltransferase that was shown to disrupt vesicle transport during innate immunity by acetylating tubulin. Using a yeast two-hybrid screen, we identified a REMORIN (REM) protein from tobacco as a novel HopZ1a target. HopZ1a interacts with REM at the plasma membrane (PM), as shown by split-YFP experiments. Interestingly, we found that PBS1, a well-known kinase involved in plant immunity, also interacts with REM in pull-down assays and at the PM, as shown by BiFC. Furthermore, we confirmed that REM is phosphorylated by PBS1 in vitro. Overexpression of REM provokes the upregulation of defense genes and leads to disease-like phenotypes, pointing to a role of REM in plant immune signaling. Further protein-protein interaction studies reveal novel REM binding partners with a possible role in plant immune signaling. Thus, REM might act as an assembly hub for an immune signaling complex targeted by HopZ1a. Taken together, this is the first report describing that a REM protein is targeted by a bacterial effector. How HopZ1a might mechanistically manipulate the plant immune system through interfering with REM function will be discussed.
Imaginar la nación
(2018)
Impact of self-assessment of return to work on employable discharge from multi-component cardiac rehabilitation. Retrospective unicentric analysis of routine data from cardiac rehabilitation in patients below 65 years of age. Presentation in the "Cardiovascular rehabilitation revisited" high impact abstract session during ESC Congress 2018.
In university teaching today, it is common practice to record regular lectures and special events such as conferences and speeches. With these recordings, a large pool of video teaching material can be created quickly and easily. Typically, lectures have a length of about one and a half hours and usually take place once or twice a week, depending on the credit hours. Depending on the number of lectures and other events recorded, the number of available recordings grows rapidly, which makes an appropriate form of provisioning essential for students. This is usually done in the form of lecture video platforms. In this work, we investigated how lecture video platforms and the knowledge they contain can be improved and made more easily accessible to an increasing number of students. We developed a multistep process that we applied to our own lecture video web portal and that can be applied to other solutions as well.
The detection of all inclusion dependencies (INDs) in an unknown dataset is at the core of any data profiling effort. Apart from the discovery of foreign-key relationships, INDs can help perform data integration, integrity checking, schema (re-)design, and query optimization. With the advent of Big Data, the demand for efficient IND discovery algorithms that scale with the input data size increases. To this end, we propose S-INDD++ as a scalable system for detecting unary INDs in large datasets. S-INDD++ applies a new stepwise partitioning technique that helps discard a large number of attributes in early phases of the detection by processing the first partitions of smaller sizes. S-INDD++ also extends the concept of attribute clustering to decide which attributes to discard based on the clustering result of each partition. Moreover, in contrast to the state of the art, S-INDD++ does not require a partition to fit into main memory, which is a highly desirable property in the face of ever-growing datasets. We conducted an exhaustive evaluation of S-INDD++ by applying it to large datasets with thousands of attributes and more than 266 million tuples. The results show the clear superiority of S-INDD++ over the state of the art: S-INDD++ reduced the runtime by up to 50 % in comparison with BINDER, and by up to 98 % in comparison with S-INDD.
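The core notion of a unary IND (every value of column A also occurs in column B) can be sketched with a naive in-memory check; this is only an illustration of the problem definition with made-up column names, not the S-INDD++ algorithm, which uses stepwise partitioning and attribute clustering to scale beyond main memory:

```python
def unary_inds(columns):
    """Naive unary inclusion-dependency discovery: for every ordered
    pair of columns (A, B), report A <= B if every value of A also
    occurs in B.  `columns` maps column names to lists of values."""
    values = {col: set(vals) for col, vals in columns.items()}
    return [(a, b) for a in values for b in values
            if a != b and values[a] <= values[b]]

# Hypothetical example: a foreign-key-like dependency.
cols = {
    "orders.customer_id": [1, 2, 2, 3],
    "customers.id":       [1, 2, 3, 4],
}
print(unary_inds(cols))  # [('orders.customer_id', 'customers.id')]
```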
Previous work has shown that surface modification with orthophosphoric acid can significantly enhance the charge stability on the polypropylene (PP) surface by generating deeper traps. In the present study, thermally stimulated potential-decay measurements revealed that the chemical treatment may also significantly increase the number of available trapping sites on the surface. Thus, as a consequence, the so-called "cross-over" phenomenon, which is observed on as-received and thermally treated PP electrets, may be overcome in a certain range of initial charge densities. Furthermore, the discharge behavior of chemically modified samples indicates that charges can be injected from the treated surface into the bulk, and/or charges of opposite polarity can be pulled from the rear electrode into the bulk at elevated temperatures and at the high electric fields that are caused by the deposited charges. In the bulk, a lack of deep traps causes rapid charge decay already at temperatures around 95 °C.
The influence of chemical composition and crystallisation conditions on the ferroelectric and paraelectric phases and the resulting morphology in poly(vinylidene fluoride-trifluoroethylene-chlorofluoroethylene) (P(VDF-TrFE-CFE)) terpolymer films with 55.4/37.2/7.3 mol% or with 62.2/29.4/8.4 mol% of VDF/TrFE/CFE was studied. Poly(vinylidene fluoride-trifluoroethylene) (P(VDF-TrFE)) with 75/25 mol% VDF/TrFE was employed as reference material. Fourier-Transform Infrared Spectroscopy (FTIR) was used to determine the fractions of the relevant terpolymer phases, and X-Ray Diffraction (XRD) was employed to assess the crystalline morphology. The FTIR results show an increase of the fraction of paraelectric phases after annealing. On the other hand, the XRD results indicate a more stable paraelectric phase in the terpolymer with higher CFE content.
We propose a new temporal extension of the logic of Here-and-There (HT) and its equilibria obtained by combining it with dynamic logic over (linear) traces. Unlike previous temporal extensions of HT based on linear temporal logic, the dynamic logic features allow us to reason about the composition of actions. For instance, this can be used to exercise fine grained control when planning in robotics, as exemplified by GOLOG. In this paper, we lay the foundations of our approach, and refer to it as Linear Dynamic Equilibrium Logic, or simply DEL. We start by developing the formal framework of DEL and provide relevant characteristic results. Among them, we elaborate upon the relationships to traditional linear dynamic logic and previous temporal extensions of HT.
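The ability to "reason about the composition of actions" that distinguishes dynamic logic from LTL-based extensions can be illustrated by two standard dynamic-logic validities for sequential composition and iteration (these are textbook PDL equivalences, not formulas taken from the abstract):

```latex
[\alpha ; \beta]\varphi \;\leftrightarrow\; [\alpha][\beta]\varphi
\qquad
[\alpha^{*}]\varphi \;\leftrightarrow\; \varphi \wedge [\alpha][\alpha^{*}]\varphi
```

That is, asserting a property after executing α followed by β decomposes into nested modalities, and iteration unfolds one step at a time - the kind of fine-grained control over action sequences the abstract mentions for GOLOG-style planning.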
Introduction
(2018)
The present thematic set of studies comprises five concise review articles on the use of priming paradigms in different areas of bilingualism research. Their aim is to provide readers with a quick overview of how priming paradigms can be employed in particular subfields of bilingualism research and to make readers aware of the methodological issues that need to be considered when using priming techniques.
Introduction
(2018)
In Europe, different countries developed a rich variety of sub-municipal institutions. Out of the plethora of intra- and sub-municipal decentralization forms (reaching from local outposts of city administration to “quasi-federal” structures), this book focuses on territorial sub-municipal units (SMUs) which combine multipurpose territorial responsibility with democratic legitimacy and can be seen as institutions promoting the articulation and realization of collective choices at a sub-municipal level.
Country chapters follow a common pattern that facilitates systematic comparisons, while at the same time leaving enough space for national peculiarities and for the priorities chosen and highlighted by the authors, who also take advantage of any existing empirical surveys and case studies.