General intelligence has a substantial genetic component in children, adolescents, and adults, but environmental factors also correlate strongly with cognitive performance, as evidenced by a large (up to one SD) increase in average intelligence test results in the second half of the previous century. This change occurred in a period apparently too short to accommodate radical genetic change. These findings strongly suggest that environmental factors interact with the genotype, possibly by modifying epigenetic factors that regulate gene expression, and thus contribute to individual malleability. Such modification may also be reflected in recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events.
Audit - and then what?
(2019)
Current trends such as digital transformation, the Internet of Things, and Industry 4.0 are challenging the majority of learning factories. Regardless of whether it is a conventional learning factory, a model factory, or a digital learning factory, traditional approaches such as the monotonous execution of specific instructions satisfy neither learners' needs nor market requirements, and certainly not current technological developments. Contemporary teaching environments need a clear strategy, a road to follow, to cope successfully with these changes and develop towards digitized learning factories. This demand-driven necessity of transformation leads to another obstacle: assessing the status quo, then developing and implementing adequate action plans. This paper presents details of a maturity-based audit of the hybrid learning factory in the Research and Application Centre Industry 4.0 and a roadmap, derived from it, for the digitization of a learning factory.
Subject-oriented learning
(2019)
The transformation to a digitized company changes not only the work but also the social context for employees, and requires, inter alia, new knowledge and skills from them. Additionally, individual action problems arise. This contribution proposes the subject-oriented learning theory, in which employees' action problems are the starting point of training activities in learning factories. The subject-oriented learning theory is exemplified, and its advantages for vocational training in learning factories are pointed out both theoretically and practically. In particular, learners' individual action problems and the infrastructure are emphasized as starting points for learning processes and competence development.
High-dimensional data is particularly useful for data analytics research. In the healthcare domain, for instance, high-dimensional data analytics has been used successfully for drug discovery. Yet, in order to adhere to privacy legislation, data analytics service providers must guarantee anonymity for data owners. In the context of high-dimensional data, ensuring privacy is challenging because increased data dimensionality must be matched by an exponential growth in the size of the data to avoid sparse datasets. Syntactically, anonymising sparse datasets with methods that rely on statistical significance makes obtaining sound and reliable results a challenge. As such, strong privacy is only achievable at the cost of high information loss, rendering the data unusable for data analytics. In this paper, we make two contributions to addressing this problem from both the privacy and information loss perspectives. First, we show that by identifying dependencies between attribute subsets we can eliminate privacy-violating attributes from the anonymised dataset. Second, to minimise information loss, we employ a greedy search algorithm to determine and eliminate maximal partial unique attribute combinations. Thus, one only needs to find the minimal set of identifying attributes to prevent re-identification. Experiments on a health cloud based on the SAP HANA platform, using a semi-synthetic medical history dataset comprising 109 attributes, demonstrate the effectiveness of our approach.
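The abstract names a greedy search over unique attribute combinations but gives no implementation details. A minimal sketch of the general idea, assuming records are Python dicts (the brute-force subset search and all function names are illustrative, not the authors' algorithm):

```python
from itertools import combinations

def unique_combinations(rows, attrs, max_size=3):
    """Return attribute subsets whose value combinations uniquely
    identify at least one record (candidate quasi-identifiers)."""
    found = []
    for size in range(1, max_size + 1):
        for subset in combinations(attrs, size):
            counts = {}
            for row in rows:
                key = tuple(row[a] for a in subset)
                counts[key] = counts.get(key, 0) + 1
            if any(c == 1 for c in counts.values()):
                found.append(subset)
    return found

def greedy_suppress(rows, attrs, max_size=3):
    """Greedily drop the attribute that occurs in the most
    identifying subsets until no record is unique."""
    attrs = list(attrs)
    while True:
        risky = unique_combinations(rows, attrs, max_size)
        if not risky:
            return attrs
        hits = {a: sum(a in s for s in risky) for a in attrs}
        attrs.remove(max(hits, key=hits.get))
```

On a toy table where one `age` value appears only once, `greedy_suppress` drops `age` and keeps the remaining, non-identifying attributes. The real approach additionally exploits dependencies between attribute subsets to prune this search.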
Nowadays, structural health monitoring (SHM) of critical infrastructures is considered of primary importance, especially for managing transport infrastructure. However, most current SHM methodologies are based on point sensors that have various limitations concerning their spatial positioning capabilities, cost of development, and measurement range. This publication describes progress in the SENSKIN EC co-funded research project, which is developing a dielectric-elastomer sensor for monitoring transport infrastructure bridges, formed from a large, highly extensible capacitance-sensing membrane supported by advanced micro-electronic circuitry. The sensor under development provides spatial measurements of strain in excess of 10%, while the sensing system is designed to be easy to install, require low power in operation, require simple signal processing, and have the ability to self-monitor and report. An appropriate wireless sensor network, supported by local gateways, is also being designed and developed for the required data collection and exploitation. SENSKIN also develops a Decision Support System (DSS) for proactive condition-based structural interventions under normal operating conditions and for reactive emergency intervention following an extreme event. The latter is supported by a life-cycle costing (LCC) and life-cycle assessment (LCA) module that accounts for the total internal and external costs of the identified bridge rehabilitation and analyses the options, yielding figures for assessing the economic implications of the bridge rehabilitation work and the environmental impacts of the rehabilitation options and their associated secondary effects, respectively. The overall monitoring system will be evaluated and benchmarked on actual bridges of the Egnatia Highway (Greece) and the Bosporus Bridge (Turkey).
Network science is driven by the question of which properties large real-world networks have and how we can exploit them algorithmically. In the past few years, hyperbolic graphs have emerged as a very promising model for scale-free networks. The connection between hyperbolic geometry and complex networks gives insights in both directions: (1) Hyperbolic geometry forms the basis of a natural and explanatory model for real-world networks. Hyperbolic random graphs are obtained by choosing random points in the hyperbolic plane and connecting pairs of points that are geometrically close. The resulting networks share many structural properties with, for example, online social networks like Facebook or Twitter. They are thus well suited for algorithmic analyses in a more realistic setting. (2) Starting with a real-world network, hyperbolic geometry is well suited for metric embeddings. The vertices of a network can be mapped to points in this geometry such that geometric distances are similar to graph distances. Such embeddings have a variety of algorithmic applications, ranging from approximations based on efficient geometric algorithms to greedy routing using solely hyperbolic coordinates for navigation decisions.
JavaScript is the most popular programming language for web applications. Static analysis of JavaScript applications is highly challenging due to the language's dynamic constructs and event-driven asynchronous executions, which also give rise to many security-related bugs. Several static analysis tools to detect such bugs exist; however, research has not yet reported much on the precision and scalability trade-offs of these analyzers. As a further obstacle, JavaScript programs structured in Node.js modules need to be collected for analysis, but existing bundlers are either specific to their respective analysis tools or not particularly suitable for static analysis.
Mobile operating systems, such as Google's Android, have become a fixed part of our daily lives and are entrusted with a plethora of private information. Accordingly, their data protection mechanisms have been improved steadily over the last decade and, in particular for Android, the research community has explored various enhancements and extensions to the access control model. However, the vast majority of those solutions have been concerned with controlling access to data; equally important is the question of how to control the flow of data once released. Ignoring control over the dissemination of data between applications, or between components of the same app, opens the door for attacks such as permission re-delegation or privacy-violating third-party libraries. Controlling information flows is a long-standing problem, and one of the most recent, practice-oriented approaches to information flow control (IFC) is secure multi-execution.
In this paper, we present Ariel, the design and implementation of an IFC architecture for Android based on the secure multi-execution of apps. Ariel demonstrably extends Android's system with support for executing multiple instances of apps, and it is equipped with a policy lattice derived from the protection levels of Android's permissions as well as an I/O scheduler to achieve control over data flows between application instances. We demonstrate how secure multi-execution with Ariel can help to mitigate two prominent attacks on Android, permission re-delegations and malicious advertisement libraries.
Detect me if you can
(2019)
Spam bots have become a threat to online social networks through their malicious behavior, posting misinformation messages and influencing online platforms to fulfill their motives. As spam bots have become more advanced over time, creating algorithms to identify them remains an open challenge. Learning low-dimensional embeddings for nodes in graph-structured data has proven useful in various domains. In this paper, we propose a model based on graph convolutional neural networks (GCNNs) for spam bot detection. Our hypothesis is that to better detect spam bots, the social graph must be taken into consideration in addition to a feature set. GCNNs are able to leverage both the features of a node and the aggregated features of a node's neighborhood. We compare our approach with two methods that work solely on a feature set and solely on the structure of the graph, respectively. To our knowledge, this work is the first attempt at using graph convolutional neural networks for spam bot detection.
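The aggregation the abstract describes, combining a node's own features with those of its neighborhood, is what a single graph-convolution layer computes. A minimal NumPy sketch of one such layer in the widely used symmetric-normalisation form (not the authors' model):

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph-convolution layer: add self-loops, symmetrically
    normalise the adjacency, aggregate neighbour features, then
    apply a linear map and a ReLU non-linearity."""
    a_hat = adj + np.eye(adj.shape[0])           # self-loops: node keeps its own features
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt     # D^-1/2 (A + I) D^-1/2
    return np.maximum(a_norm @ features @ weights, 0.0)
```

In a bot-detection setting, `features` would hold per-account attributes (e.g. posting frequency) and `adj` the follower graph; stacking two or three such layers lets each account's representation absorb information from its extended neighborhood.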
The "Bachelor Project"
(2019)
One of the challenges of educating the next generation of computer scientists is teaching them to become team players who are able to communicate and interact not only with different IT systems, but also with coworkers and customers with a non-IT background. The "bachelor project" is a project based on teamwork and close collaboration with selected industry partners. The authors have hosted some of these teams since the spring term of 2014/15. In the paper at hand, we explain and discuss this concept and evaluate its success based on students' evaluations and reports. Furthermore, the technology stack used by the teams is evaluated to understand how self-organized students work in IT-related projects. We will show that, and why, the bachelor project is the most successful educational format in the perception of the students, and how these positive results can be improved by the mentors.