Introduction
(2018)
The present thematic set of studies comprises five concise review articles on the use of priming paradigms in different areas of bilingualism research. Their aim is to provide readers with a quick overview of how priming paradigms can be employed in particular subfields of bilingualism research and to make readers aware of the methodological issues that need to be considered when using priming techniques.
We present a prototype of an integrated reasoning environment for educational purposes. The presented tool is a fragment of a proof assistant and automated theorem prover. We describe the existing and planned functionality of the theorem prover and especially the functionality of the educational fragment. This currently supports working with terms of the untyped lambda calculus and addresses both undergraduate students and researchers. We show how the tool can be used to support the students' understanding of functional programming and discuss general problems related to the process of building theorem proving software that aims at supporting both research and education.
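The abstract does not specify the tool's interface, but the object it manipulates, terms of the untyped lambda calculus, can be sketched concretely. The following minimal Python sketch (illustrative only, not the presented tool) represents terms as tuples and performs normal-order beta reduction; substitution is deliberately capture-naive, which suffices for the closed example below.

```python
# Minimal untyped lambda calculus: terms are ('var', x), ('lam', x, body),
# or ('app', f, a); normalize() applies normal-order beta reduction.

def subst(term, x, val):
    kind = term[0]
    if kind == 'var':
        return val if term[1] == x else term
    if kind == 'lam':
        # Stop if the binder shadows x (substitution is capture-naive otherwise).
        return term if term[1] == x else ('lam', term[1], subst(term[2], x, val))
    return ('app', subst(term[1], x, val), subst(term[2], x, val))

def reduce_step(term):
    """Perform one normal-order beta step; return None if term is in normal form."""
    if term[0] == 'app':
        f, a = term[1], term[2]
        if f[0] == 'lam':                      # beta redex: (lambda x. body) a
            return subst(f[2], f[1], a)
        r = reduce_step(f)
        if r is not None:
            return ('app', r, a)
        r = reduce_step(a)
        if r is not None:
            return ('app', f, r)
    elif term[0] == 'lam':
        r = reduce_step(term[2])
        if r is not None:
            return ('lam', term[1], r)
    return None

def normalize(term, limit=1000):
    for _ in range(limit):
        nxt = reduce_step(term)
        if nxt is None:
            return term
        term = nxt
    raise RuntimeError("no normal form within step limit")

# (lambda x. x) y  reduces to  y
identity_app = ('app', ('lam', 'x', ('var', 'x')), ('var', 'y'))
assert normalize(identity_app) == ('var', 'y')
```

A theorem-proving assistant for education would add, on top of such a kernel, proper capture-avoiding substitution and step-by-step display of reductions for students.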
Manufacturing industries are undergoing a major paradigm shift towards more autonomy. Automated planning and scheduling then becomes a necessity. The Planning and Execution Competition for Logistics Robots in Simulation held at ICAPS is based on this scenario and provides an interesting testbed. However, the posed problem is challenging, as demonstrated by the somewhat weak results in 2017. The domain requires temporal reasoning and dealing with uncertainty. We propose a novel planning system based on Answer Set Programming and the Clingo solver to tackle these problems and incentivize robot cooperation. Our results show a significant performance improvement, both in terms of lower computational requirements and in terms of better game metrics.
Declarative languages for knowledge representation and reasoning provide constructs to define preference relations over the set of possible interpretations, so that preferred models represent optimal solutions of the encoded problem. We introduce the notion of approximation for replacing preference relations with stronger preference relations, that is, relations comparing more pairs of interpretations. Our aim is to accelerate the computation of a non-empty subset of the optimal solutions by means of highly specialized algorithms. We implement our approach in Answer Set Programming (ASP), where problems involving quantitative and qualitative preference relations can be addressed by ASPRIN, which implements a generic optimization algorithm. In contrast, chains of approximations allow us to reduce several preference relations to the preference relations associated with ASP's native weak constraints and heuristic directives. In this way, ASPRIN can now take advantage of several highly optimized algorithms implemented by ASP solvers for computing optimal solutions.
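The key property behind the approximation idea can be illustrated without an ASP solver: if a preference relation is replaced by a stronger one (ordering every pair the original orders, and more), then every optimum of the stronger relation is also an optimum of the original. The Python sketch below (illustrative, not ASPRIN's implementation) shows this for the Pareto relation over two costs, approximated by the lexicographic relation.

```python
# Approximating a preference relation by a stronger one (minimization):
# the lexicographic order refines the Pareto order, so its optima are a
# subset of the Pareto optima.

def pareto_better(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

def lex_better(a, b):
    return a < b  # Python tuple comparison is lexicographic

def optima(solutions, better):
    """Solutions to which no other solution is strictly preferred."""
    return {s for s in solutions if not any(better(t, s) for t in solutions)}

solutions = {(1, 3), (2, 2), (3, 1), (2, 4), (4, 4)}
pareto_opt = optima(solutions, pareto_better)
lex_opt = optima(solutions, lex_better)

# Lexicographic is stronger: it orders every pair Pareto orders, and more.
assert all(lex_better(a, b) for a in solutions for b in solutions
           if pareto_better(a, b))
# Hence its non-empty set of optima is a subset of the Pareto optima.
assert lex_opt and lex_opt <= pareto_opt
```

In ASP terms, computing the optima of the stronger relation is what highly optimized algorithms for weak constraints can do natively, yielding a non-empty subset of the originally preferred models.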
The Amyloid-precursor-like protein 1 (APLP1) is a neuronal type I transmembrane protein which plays a role in synaptic adhesion and synaptogenesis. Past investigations indicated that APLP1 is involved in the formation of protein-protein complexes that bridge the junctions between neighboring cells. Nevertheless, APLP1-APLP1 trans interactions have never been directly observed in higher eukaryotic cells. Here, we investigate APLP1 interactions and dynamics directly in living human embryonic kidney (HEK) cells, using fluorescence fluctuation spectroscopy techniques, namely cross-correlation scanning fluorescence correlation spectroscopy (sFCS) and Number&Brightness (N&B). Our results show that APLP1 forms homotypic trans complexes at cell-cell contacts. In the presence of zinc ions, the protein forms macroscopic clusters, exhibiting an even higher degree of trans binding and strongly reduced dynamics. Further evidence from Giant Plasma Membrane Vesicles and live cell actin staining suggests that the presence of an intact cortical cytoskeleton is required for zinc-induced cis multimerization. Subsequently, large adhesion platforms bridging interacting cells are formed through APLP1-APLP1 direct trans interactions. Taken together, our results provide direct evidence that APLP1 functions as a neuronal zinc-dependent adhesion protein and provide a more detailed understanding of the molecular mechanisms driving the formation of APLP1 adhesion platforms. Further, they show that fluorescence fluctuation spectroscopy techniques are useful tools for the investigation of protein-protein interactions at cell-cell adhesion sites.
Pace-of-life syndromes
(2018)
This introduction to the topical collection on Pace-of-life syndromes: a framework for the adaptive integration of behaviour, physiology, and life history provides an overview of conceptual, theoretical, methodological, and empirical progress in research on pace-of-life syndromes (POLSs) over the last decade. The topical collection has two main goals. First, we briefly describe the history of POLS research and provide a refined definition of POLS that is applicable to various key levels of variation (genetic, individual, population, species). Second, we summarise the main lessons learned from current POLS research included in this topical collection. Based on an assessment of the current state of the theoretical foundations and the empirical support of the POLS hypothesis, we propose (i) conceptual refinements of theory, particularly with respect to the role of ecology in the evolution of (sexual dimorphism in) POLS, and (ii) methodological and statistical approaches to the study of POLS at all major levels of variation. This topical collection further includes (iii) key empirical examples demonstrating how POLS structures may be studied in wild populations of (non-)human animals, and (iv) a modelling paper predicting POLS under various ecological conditions. Future POLS research will profit from the development of more explicit theoretical models and stringent empirical tests of model assumptions and predictions, increased focus on how ecology shapes (sex-specific) POLS structures at multiple hierarchical levels, and the usage of appropriate statistical tests and study designs.
Significance statement
As an introduction to the topical collection, we summarise current conceptual, theoretical, methodological and empirical progress in research on pace-of-life syndromes (POLSs), a framework for the adaptive integration of behaviour, physiology and life history at multiple hierarchical levels of variation (genetic, individual, population, species).
Mixed empirical support of POLSs, particularly at the within-species level, calls for an evaluation and refinement of the hypothesis. We provide a refined definition of POLS that facilitates testable predictions. Future research on POLS will profit from the development of more explicit theoretical models and stringent empirical tests of model assumptions and predictions, an increased focus on how ecology shapes (sex-specific) POLS structures at multiple hierarchical levels, and the usage of appropriate statistical tests and study designs.
Comparative text mining extends from genre analysis and political bias detection to the revelation of cultural and geographic differences, through to the search for prior art across patents and scientific papers. These applications use cross-collection topic modeling for the exploration, clustering, and comparison of large sets of documents, such as digital libraries. However, topic modeling on documents from different collections is challenging because of domain-specific vocabulary. We present a cross-collection topic model combined with automatic domain term extraction and phrase segmentation. This model distinguishes collection-specific and collection-independent words based on information entropy and reveals commonalities and differences of multiple text collections. We evaluate our model on patents, scientific papers, newspaper articles, forum posts, and Wikipedia articles. In comparison to state-of-the-art cross-collection topic modeling, our model achieves up to 13% higher topic coherence, up to 4% lower perplexity, and up to 31% higher document classification accuracy. More importantly, our approach is the first topic model that ensures disjunct general and specific word distributions, resulting in clear-cut topic representations.
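The entropy criterion mentioned above can be made concrete: a word spread evenly across collections has high entropy (collection-independent), while a word concentrated in one collection has low entropy (collection-specific). The following Python sketch illustrates this idea only; the words, counts, and threshold are invented for the example and are not from the paper's model.

```python
# Separating collection-specific from collection-independent vocabulary by
# the Shannon entropy of a word's occurrence distribution across collections.
import math

def entropy(counts):
    """Shannon entropy (bits) of a word's distribution across collections."""
    total = sum(counts)
    probs = [c / total for c in counts if c]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical occurrence counts in three collections (patents, papers, news).
counts = {
    "method":   [40, 38, 42],  # spread evenly -> high entropy -> general word
    "claimant": [55,  1,  0],  # concentrated  -> low entropy  -> specific word
}

THRESHOLD = 1.0  # bits; a free parameter of this sketch

def classify(word):
    return "general" if entropy(counts[word]) > THRESHOLD else "specific"

assert classify("method") == "general"
assert classify("claimant") == "specific"
```

In a cross-collection topic model, such a criterion drives whether a word is drawn from a shared or a collection-specific topic-word distribution.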
Over the past few years, studying abroad and other international educational experiences have become increasingly valued. Nevertheless, research shows that only a minority of students actually take part in academic mobility programs. But what is it that distinguishes those students who take up these international opportunities from those who do not? In this study we reviewed recent quantitative studies on why (primarily German) students choose to travel abroad or not. This revealed a pattern of predictive factors, indicating the key role played by students' personal and social background, as well as previous international travel and the course of studies they are enrolled in. The study then focuses on teaching students. Both facilitating and debilitating factors are discussed and included in a model illustrating the decision-making process these students use. Finally, we discuss the practical implications for ways in which international, study-related travel might be increased in the future. We suggest that higher education institutions analyze individual student characteristics and offer differentiated programs to better meet the needs of different groups, thus raising the likelihood of disadvantaged students participating in academic international travel.
The globally distributed sperm whale (Physeter macrocephalus) has a partly matrilineal social structure with predominant male dispersal. At the beginning of 2016, a total of 30 male sperm whales stranded in five different countries bordering the southern North Sea. It has been postulated that these individuals were on a migration route from the north to warmer temperate and tropical waters where females live in social groups. By including samples from four countries (n = 27), this event provided a unique chance to genetically investigate the maternal relatedness and the putative origin of these temporally and spatially co-occurring male sperm whales. To utilize existing genetic resources, we sequenced 422 bp of the mitochondrial control region, a molecular marker for which sperm whale data are readily available from the entire distribution range. Based on four single nucleotide polymorphisms (SNPs) within the mitochondrial control region, five matrilines could be distinguished within the stranded specimens, four of which matched published haplotypes previously described in the Atlantic. Among these male sperm whales, multiple matrilineal lineages co-occur. We analyzed the population differentiation and could show that the genetic diversity of these male sperm whales is comparable to the genetic diversity in sperm whales from the entire Atlantic Ocean. We confirm that within this stranding event, males do not comprise maternally related individuals and apparently include assemblages of individuals from different geographic regions. (c) 2017 Deutsche Gesellschaft für Säugetierkunde. Published by Elsevier GmbH. All rights reserved.
The rapid digitalization of the Facility Management (FM) sector has increased the demand for mobile, interactive analytics approaches concerning the operational state of a building. These approaches provide the key to increasing stakeholder engagement associated with Operation and Maintenance (O&M) procedures of living and working areas, buildings, and other built environment spaces. We present a generic and fast approach to process and analyze given 3D point clouds of typical indoor office spaces to create corresponding up-to-date approximations of classified segments and object-based 3D models that can be used to analyze, record, and highlight changes of spatial configurations. The approach is based on machine-learning methods used to classify the scanned 3D point cloud data using 2D images. This approach can be primarily used to track changes of objects over time for comparison, allowing for routine classification and presentation of results used for decision making. We specifically focus on classification, segmentation, and reconstruction of multiple different object types in a 3D point-cloud scene. We present our current research and describe the implementation of these technologies as a web-based application using a service-oriented methodology.
Currently we are witnessing profound changes in the geospatial domain. Driven by recent ICT developments such as web services, service-oriented computing, and open-source software, by an explosion of geodata and geospatial applications, and by rapidly growing communities of non-specialist users, the crucial issue is the provision and integration of geospatial intelligence in these rapidly changing, heterogeneous developments. This paper introduces the concept of Servicification into geospatial data processing. Its core idea is the provision of expertise through a flexible number of web-based software service modules. Selection and linkage of these services to user profiles, application tasks, data resources, or additional software allow for the compilation of flexible, time-sensitive geospatial data handling processes. Encapsulated in a string of discrete services, the approach presented here aims to provide non-specialist users with the geospatial expertise required for the effective, professional solution of a defined application problem. Providing users with geospatial intelligence in the form of web-based, modular services is a completely different approach to geospatial data processing. This novel concept puts geospatial intelligence, made available through services encapsulating rule bases and algorithms, in the centre and at the disposal of the users, regardless of their expertise.
Mobile expressive rendering has gained increasing popularity among users seeking casual creativity through image stylization, and supports the development of mobile artists as a new user group. In particular, neural style transfer has advanced as a core technology to emulate characteristics of manifold artistic styles. However, when it comes to creative expression, the technology still faces inherent limitations in providing low-level controls for localized image stylization. This work enhances state-of-the-art neural style transfer techniques with a generalized user interface offering interactive tools to facilitate a creative and localized editing process. To this end, we first propose a problem characterization representing trade-offs between visual quality, run-time performance, and user control. We then present MaeSTrO, a mobile app for the orchestration of neural style transfer techniques using iterative, multi-style generative and adaptive neural networks that can be locally controlled by on-screen painting metaphors. Initial user tests indicate different levels of satisfaction for the implemented techniques and interaction design.
OpenLL
(2018)
Today's rendering APIs lack robust functionality and capabilities for dynamic, real-time text rendering and labeling, which represent key requirements for 3D application design in many fields. As a consequence, most rendering systems are barely or not at all equipped with respective capabilities. This paper drafts the unified text rendering and labeling API OpenLL, intended to complement common rendering APIs, frameworks, and transmission formats. To this end, various uses of static and dynamic placement of labels are showcased and a text interaction technique is presented. Furthermore, API design constraints with respect to state-of-the-art text rendering techniques are discussed. This contribution is intended to initiate a community-driven specification of a free and open label library.
Impact of self-assessment of return to work on employable discharge from multi-component cardiac rehabilitation. Retrospective unicentric analysis of routine data from cardiac rehabilitation in patients below 65 years of age. Presentation in the "Cardiovascular rehabilitation revisited" high impact abstract session during ESC Congress 2018.
Screeninginstrumente
(2018)
For the last ten years, almost every theoretical result concerning the expected run time of a randomized search heuristic has used drift theory, making it arguably the most important tool in this domain. Its success is due to its ease of use and its powerful result: drift theory allows the user to derive bounds on the expected first-hitting time of a random process by bounding expected local changes of the process - the drift. This is usually far easier than bounding the expected first-hitting time directly. Due to the widespread use of drift theory, it is of utmost importance to have the best drift theorems possible. We improve the fundamental additive, multiplicative, and variable drift theorems by stating them in a form as general as possible and providing examples of why the restrictions we keep are still necessary. Our additive drift theorem for upper bounds only requires the process to be nonnegative, that is, we remove unnecessary restrictions like a finite, discrete, or bounded search space. As corollaries, the same is true for our upper bounds in the case of variable and multiplicative drift.
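For orientation, the additive drift theorem in its classical upper-bound form (the paper generalizes its prerequisites) can be stated as follows:

```latex
% Additive drift theorem (upper bound), classical form:
% let (X_t)_{t \ge 0} be a nonnegative random process and
% T = \min\{t : X_t = 0\} its first-hitting time of zero.
\[
  \mathbb{E}[X_t - X_{t+1} \mid X_t > 0] \;\ge\; \delta > 0
  \quad\Longrightarrow\quad
  \mathbb{E}[T \mid X_0] \;\le\; \frac{X_0}{\delta}.
\]
```

Intuitively, if the process decreases by at least delta in expectation whenever it is positive, it cannot take longer than its initial value divided by delta, in expectation, to reach zero.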
One of the most important aspects of a randomized algorithm is bounding its expected run time on various problems. Formally speaking, this means bounding the expected first-hitting time of a random process. The two arguably most popular tools for doing so are the fitness level method and drift theory. The fitness level method considers arbitrary transition probabilities but only allows the process to move toward the goal. On the other hand, drift theory allows the process to move in any direction as long as it moves closer to the goal in expectation; however, this tendency has to be monotone and, thus, the transition probabilities cannot be arbitrary. We provide a result that combines the benefits of these two approaches: our result gives a lower and an upper bound for the expected first-hitting time of a random process over {0,..., n} that is allowed to move forward and backward by 1 and can use arbitrary transition probabilities. In case the transition probabilities are known, our bounds coincide and yield the exact value of the expected first-hitting time. Further, we also state the stationary distribution as well as the mixing time of a special case of our scenario.
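For processes of exactly this shape, the expected first-hitting time can be computed exactly from a standard first-step recurrence. The Python sketch below (illustrative notation, not the paper's) does this for a process on {0, ..., n} that moves up with probability p[i] and down with probability 1 - p[i], where p[0] = 1 since the process cannot leave the state space downward.

```python
# Exact expected first-hitting time of state n for a +1/-1 process on
# {0, ..., n}: let d(i) be the expected time to move from i to i+1, then
#   d(0) = 1 / p_0,   d(i) = (1 + (1 - p_i) * d(i-1)) / p_i,
# and the hitting time from 0 is the sum of all d(i).

def hitting_time(p):
    """p[i] = probability of moving i -> i+1; requires p[0] == 1."""
    d = [1.0 / p[0]]
    for i in range(1, len(p)):
        q = 1.0 - p[i]
        d.append((1.0 + q * d[i - 1]) / p[i])
    return sum(d)  # expected time from 0 to n = len(p)

# Sanity check: a fair random walk reflecting at 0 hits n after n^2 steps
# in expectation (a classical result).
n = 10
p = [1.0] + [0.5] * (n - 1)
assert abs(hitting_time(p) - n * n) < 1e-9
```

When the transition probabilities are only bounded rather than known, plugging the bounds into the same recurrence yields upper and lower bounds instead of the exact value, which is the regime the abstract describes.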
The centrosome is not only the largest and most sophisticated protein complex within a eukaryotic cell; in the light of evolution, it is also one of its most ancient organelles. This special issue of "Cells" features representatives of three main, structurally divergent centrosome types, i.e., centriole-containing centrosomes, yeast spindle pole bodies (SPBs), and amoebozoan nucleus-associated bodies (NABs). Here, I discuss their evolution and their key functions in microtubule organization, mitosis, and cytokinesis. Furthermore, I provide a brief history of centrosome research and highlight recently emerged topics, such as the role of centrioles in ciliogenesis, the relationship of centrosomes and centriolar satellites, the integration of centrosomal structures into the nuclear envelope, and the involvement of centrosomal components in non-centrosomal microtubule organization.
For theoretical analyses there are two specifics distinguishing GP from many other areas of evolutionary computation. First, the variable-size representations, which in particular may yield bloat (i.e., the growth of individuals with redundant parts). Second, the role and realization of crossover, which is particularly central in GP due to the tree-based representation. Whereas some theoretical work on GP has studied the effects of bloat, crossover has received surprisingly little attention in this work. We analyze a simple crossover operator in combination with local search, where a preference for small solutions minimizes bloat (lexicographic parsimony pressure); the resulting algorithm is denoted Concatenation Crossover GP. For this purpose three variants of the well-studied Majority test function with large plateaus are considered. We show that Concatenation Crossover GP can efficiently optimize these test functions, while local search cannot be efficient for all three variants independent of employing bloat control.
Kim et al. recently measured the structure factor of deeply supercooled water droplets (Reports, 22 December 2017, p. 1589). We raise several concerns about their data analysis and interpretation. In our opinion, the reported data do not lead to clear conclusions about the origins of water’s anomalies.
I can see it in your face
(2018)
An essential, respected, and critical aspect of the modern practice of science and scientific publishing is peer review. The process of peer review facilitates best practices in scientific conduct and communication, ensuring that published manuscripts are accurate, valuable, and clearly communicated. The over 152 papers published in Tectonics in 2017 benefited from the time, effort, and expertise of our reviewers, who provided thoughtfully considered advice on each manuscript. This role is critical to advancing our understanding of the evolution of the continents and their margins, as these reviews lead to even clearer and higher-quality papers. In 2017, the over 423 papers submitted to Tectonics were the beneficiaries of more than 786 reviews provided by 562 members of the tectonics community and related disciplines. To everyone who has volunteered their time and intellect to peer reviewing, thank you for helping Tectonics and all other AGU Publications provide the best science possible.
The one-tube osmotic fragility (OF) test is a rapid test used widely for thalassemia screening in countries with limited resources. The test has an important limitation in that its accuracy relies on the observer's experience. The iCheck Turbidity is a prototype portable nephelometer developed by BioAnalyt (Bioanalyt GmbH, Germany). In this study, we assessed the applicability of the iCheck Turbidity for checking the turbidity of the OF-test
Editorial: Reaching to Grasp Cognition: Analyzing Motor Behavior to Investigate Social Interactions
(2018)
Utilizing quad-trees for efficient design space exploration with partial assignment evaluation
(2018)
Recently, it has been shown that constraint-based symbolic solving techniques offer an efficient way of deciding binding and routing options in order to obtain a feasible system-level implementation. In combination with various background theories, a feasibility analysis of the resulting system may already be performed on partial solutions. That is, infeasible subsets of mapping and routing options can be pruned early in the decision process, which speeds up solving accordingly. However, proper design space exploration including multi-objective optimization also requires an efficient structure for storing and managing non-dominated solutions. In this work, we propose and study the usage of the Quad-Tree data structure in the context of partial assignment evaluation during system synthesis. Our experiments show that unnecessary dominance checks can be avoided, which indicates a preference for Quad-Trees over a commonly used list-based implementation for large combinatorial optimization problems.
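The list-based baseline referred to above can be sketched in a few lines: every candidate is checked against the whole archive, and a quad-tree archive improves on exactly these scans by pruning dominance checks regionally. The Python sketch below (assuming minimization of all objectives; the points are invented) shows only the list-based archive.

```python
# List-based archive of non-dominated solutions (minimization).
# Every insertion scans the whole archive; a quad-tree archive avoids
# many of these dominance checks by partitioning objective space.

def dominates(a, b):
    """a dominates b if a is no worse in all objectives and differs."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def insert(archive, cand):
    if any(dominates(s, cand) for s in archive):
        return archive                    # candidate is dominated: drop it
    # otherwise keep candidate and evict everything it dominates
    return [s for s in archive if not dominates(cand, s)] + [cand]

archive = []
for point in [(3, 3), (1, 4), (2, 2), (4, 1), (2, 3)]:
    archive = insert(archive, point)

assert sorted(archive) == [(1, 4), (2, 2), (4, 1)]
```

Counting the calls to `dominates` in such a loop gives exactly the metric on which quad-tree and list-based archives are compared.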
Imaginar la nación
(2018)
The problem of constructing and maintaining a tree topology in a distributed manner is a challenging task in WSNs. This is because the nodes have limited computational and memory resources and the network changes over time. We propose the Dynamic Gallager-Humblet-Spira (D-GHS) algorithm that builds and maintains a minimum spanning tree. To do so, we divide D-GHS into four phases, namely neighbor discovery, tree construction, data collection, and tree maintenance. In the neighbor discovery phase, the nodes collect information about their neighbors and the link quality. In the tree construction, D-GHS finds the minimum spanning tree by executing the Gallager-Humblet-Spira algorithm. In the data collection phase, the sink roots the minimum spanning tree at itself, and each node sends data packets. In the tree maintenance phase, the nodes repair the tree when communication failures occur. The emulation results show that D-GHS reduces the number of control messages and the energy consumption, at the cost of a slight increase in memory size and convergence time.
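The tree that D-GHS maintains in a distributed fashion is an ordinary minimum spanning tree, which can be illustrated with a centralized sketch. The following Python example (illustrative names; Kruskal's algorithm rather than the distributed Gallager-Humblet-Spira procedure) computes the same tree the protocol would converge to, given link costs derived from link quality.

```python
# Centralized sketch of the tree D-GHS computes: a minimum spanning tree
# via Kruskal's algorithm with a union-find structure.

def kruskal(n, edges):
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # consider edges by ascending cost
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two fragments
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# 4 sensor nodes; edge = (link cost, u, v), cost derived from link quality.
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 2, 3), (5, 1, 3)]
mst = kruskal(4, edges)
assert len(mst) == 3 and sum(w for w, _, _ in mst) == 6
```

In the distributed setting, the same fragment-merging logic runs via message passing between neighbors, which is what the tree construction phase of D-GHS implements.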
High storage density magnetic devices rely on precise, reliable, and ultrafast switching of magnetic states. Optical control of magnetization using femtosecond lasers, without applying any external magnetic field, offers the advantage of switching magnetic states on ultrashort time scales and has therefore attracted significant attention. Recently, the so-called all-optical helicity-dependent switching (AO-HDS), in which a circularly polarized femtosecond laser pulse switches the magnetization of a ferromagnetic thin film as a function of the laser helicity, has been reported and demonstrated [1]. In more recent studies, it has been reported that AO-HDS is a general phenomenon in magnetic materials, ranging from rare earth - transition metal ferrimagnets (e.g., alloys, multilayers, and hetero-structure systems) to even ferromagnetic thin films. Among the numerous studies in the literature discussing the microscopic origin of AO-HDS in ferromagnets or ferrimagnetic alloys, the most renowned concepts are momentum transfer via the Inverse Faraday Effect (IFE) [1-3] and preferential thermal demagnetization of one magnetization direction by heating close to Tc (the Curie temperature) in the presence of magnetic circular dichroism (MCD) [4-6]. In this study, we investigate all-optical magnetic switching using a stationary femtosecond laser spot (3-5 μm) in TbFe alloys via photoemission electron microscopy (PEEM) and x-ray magnetic circular dichroism (XMCD) with a spatial resolution of approximately 30 nm. We spatially characterize the effect of laser heating and the local temperature profile created across the laser spot on AO-HDS in TbFe thin films. We find that AO-HDS occurs only in a 'ring'-shaped region surrounding the thermally demagnetized region formed by the laser spot, and that the formation of switched domains further relies on thermally induced domain wall motion.
Our temperature-dependent measurements highlight the importance of attainin...
Modern server systems with large NUMA architectures necessitate (i) data being distributed over the available computing nodes and (ii) NUMA-aware query processing to enable effective parallel processing in database systems. As these architectures incur significant latency and throughput penalties for accessing non-local data, queries should be executed as close as possible to the data. To further increase both performance and efficiency, data that is not relevant for the query result should be skipped as early as possible. One way to achieve this goal is horizontal partitioning to improve static partition pruning. As part of our ongoing work on workload-driven partitioning, we have implemented a recent approach called aggressive data skipping and extended it to handle both analytical and transactional access patterns. In this paper, we evaluate this approach with the workload and data of a production enterprise system of a Global 2000 company. The results show that over 80% of all tuples can be skipped on average, while the resulting partitioning schemata are surprisingly stable over time.
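The mechanism behind static partition pruning is simple to illustrate: with horizontal range partitioning on a column, a predicate on that column rules out whole partitions before any data is read. The Python sketch below uses an invented toy schema (partition names and key ranges are assumptions for the example, not from the evaluated system).

```python
# Static partition pruning over a range-partitioned table: given the
# (min, max) bounds of the partition key per partition, a range predicate
# determines which partitions can possibly contain matching tuples.

partitions = {            # partition id -> (min, max) of the partition key
    "p0": (0, 999),
    "p1": (1000, 1999),
    "p2": (2000, 2999),
}

def prune(partitions, lo, hi):
    """Return the partitions that may contain rows with lo <= key <= hi."""
    return [pid for pid, (pmin, pmax) in partitions.items()
            if pmax >= lo and pmin <= hi]

# Query predicate: key BETWEEN 1500 AND 1700 -> only p1 must be scanned.
assert prune(partitions, 1500, 1700) == ["p1"]
```

The fraction of tuples residing in pruned partitions is exactly the skipping rate the evaluation reports; workload-driven partitioning chooses the partition bounds so that typical predicates prune as much as possible.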