This chapter investigates the trajectory of establishing the Forest Stewardship Council (FSC) in the early 1990s as the first private transnational certification organization with an antagonistic stakeholder body. Its main contribution is a micro-analysis of the founding assembly in 1993. By investigating the role of brokers within the negotiation as one institutional scope condition for the occurrence of 'arguing', the chapter adopts a dramaturgical approach. It contends that the authority of brokers is not necessarily institutionally given, but needs to be gained: brokers have to prove situationally that their knowledge is relevant and that they are speaking impartially in the interest of progress rather than in their own. The chapter stresses the importance of the procedural knowledge that brokers provide, in contrast to policy knowledge.
It has been known for decades that the winds of massive stars are inhomogeneous (i.e. clumped). To properly model observed spectra of massive star winds, it is necessary to incorporate the 3-D nature of clumping into radiative transfer calculations. In this paper we present our full 3-D Monte Carlo radiative transfer code for inhomogeneous expanding stellar winds. We use a set of parameters to describe both the dense and the rarefied wind components. At the same time, we account for non-monotonic velocity fields. We show how the 3-D density and velocity wind inhomogeneities strongly affect the resonance line formation. We also show how wind clumping can resolve the discrepancy between P v and H alpha mass-loss rate diagnostics.
A balance to death
(2018)
Leaf senescence plays a crucial role in nutrient recovery in late-stage plant development and requires vast transcriptional reprogramming by transcription factors such as ORESARA1 (ORE1). A proteolytic mechanism is now found to control ORE1 degradation, and thus senescence, during nitrogen starvation.
We analyze the problem of response suggestion in a closed domain in a real-world scenario of a digital library. We present a text-processing pipeline to generate question-answer pairs from chat transcripts. On this limited amount of training data, we compare retrieval-based, conditioned-generation, and dedicated representation learning approaches for response suggestion. Our results show that retrieval-based methods that strive to find similar, known contexts are preferable over parametric approaches from the conditioned-generation family when the training data is limited. We do, however, identify a specific representation learning approach that is competitive with the retrieval-based approaches despite the training data limitation.
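As a rough illustration of the retrieval-based family of approaches the abstract favours, the following sketch answers a query with the stored response whose recorded question is most similar. The question-answer pairs and the use of scikit-learn are assumptions for illustration, not details from the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical question-answer pairs as generated from chat transcripts.
qa_pairs = [
    ("How do I renew a borrowed book?",
     "You can renew it in your library account under 'Loans'."),
    ("Where can I find historical maps?",
     "Historical maps are held in the special collections reading room."),
    ("What are the opening hours?",
     "The library is open 9am-8pm on weekdays."),
]

vectorizer = TfidfVectorizer()
question_matrix = vectorizer.fit_transform([q for q, _ in qa_pairs])

def suggest_response(query: str) -> str:
    """Return the stored answer whose recorded question is most similar."""
    similarity = cosine_similarity(vectorizer.transform([query]), question_matrix)
    return qa_pairs[similarity.argmax()][1]

print(suggest_response("What are your opening hours today?"))
```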
Increasing demand for analytical processing capabilities can be managed by replication approaches. However, evenly balancing the replicas' workload shares while at the same time minimizing the data replication factor is a highly challenging allocation problem. As optimal solutions are feasible only for small problem instances, effective heuristics are indispensable. In this paper, we test and compare state-of-the-art allocation algorithms for partial replication. By visualizing and exploring their (heuristic) solutions for different benchmark workloads, we are able to derive structural insights and to detect an algorithm's strengths as well as its potential for improvement. Further, our application enables end-to-end evaluations of different allocations to verify their theoretical performance.
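To make the allocation trade-off concrete, here is a minimal greedy heuristic in the same spirit (balance replica load while avoiding newly replicated fragments). The toy workload, the penalty weight, and the scoring rule are illustrative assumptions, not one of the algorithms compared in the paper.

```python
# Hypothetical workload: (query cost, set of accessed table fragments).
queries = [(10, {"A", "B"}), (8, {"B", "C"}), (6, {"C"}), (5, {"A", "D"})]
n_replicas = 2
frag_weight = 3.0   # trade-off: load balance vs. extra replicated fragments

loads = [0.0] * n_replicas
stored = [set() for _ in range(n_replicas)]

for cost, accessed in sorted(queries, key=lambda q: q[0], reverse=True):
    # Score each replica by its resulting load plus a penalty for every
    # fragment that would have to be newly replicated onto it.
    best = min(range(n_replicas),
               key=lambda r: loads[r] + cost
                             + frag_weight * len(accessed - stored[r]))
    loads[best] += cost
    stored[best] |= accessed

distinct = {f for s in stored for f in s}
print("loads per replica :", loads)                                        # [15.0, 14.0]
print("replication factor:", sum(len(s) for s in stored) / len(distinct))  # 1.25
```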
Microservice Architectures (MSA) structure applications as a collection of loosely coupled services that implement business capabilities. The key advantages of MSA include inherent support for continuous deployment of large, complex applications, agility, and enhanced productivity. However, studies indicate that most MSA are homogeneous and introduce shared vulnerabilities; they are thus open to multi-step attacks, which offer economies-of-scale incentives to attackers. In this paper, we address the issue of shared vulnerabilities in microservices with a novel solution based on the concept of Moving Target Defenses (MTD). Our mechanism works by performing risk analysis against microservices to detect and prioritize vulnerabilities. Thereafter, security-risk-oriented software diversification is employed, guided by a defined diversification index. The diversification is performed at runtime, leveraging both model- and template-based automatic code generation techniques to automatically transform programming languages and container images of the microservices. Consequently, the microservices' attack surfaces are altered, introducing uncertainty for attackers while reducing the attackability of the microservices. Our experiments demonstrate the efficiency of our solution, with an average attack-surface randomization success rate of over 70%.
A Landscape for Case Models
(2019)
Case Management is a paradigm to support knowledge-intensive processes. The different approaches developed for modeling these types of processes tend to result in scattered models due to the low abstraction level at which the inherently complex processes are therein represented. Thus, readability and understandability are more challenging than for traditional process models. By reviewing existing proposals in the field of process overviews and case models, this paper extends a case modeling language - the fragment-based Case Management (fCM) language - with the goal of modeling knowledge-intensive processes from a higher abstraction level - to generate a so-called fCM landscape. This proposal is empirically evaluated via an online experiment. Results indicate that interpreting an fCM landscape might be more effective and efficient than interpreting an informationally equivalent case model.
Low back pain (LBP) is a leading cause of activity limitation. Objective assessment of the spinal motion plays a key role in diagnosis and treatment of LBP. We propose a method that facilitates clinical assessment of lower back motions by means of a wireless inertial sensor network. The sensor units are attached to the right and left side of the lumbar region, the pelvis and the thighs, respectively. Since magnetometers are known to be unreliable in indoor environments, we use only 3D accelerometer and 3D gyroscope readings. Compensation of integration drift in the horizontal plane is achieved by estimating the gyroscope biases from automatically detected initial rest phases. For the estimation of sensor orientations, both a smoothing algorithm and a filtering algorithm are presented. From these orientations, we determine three-dimensional joint angles between the thighs and the pelvis and between the pelvis and the lumbar region. We compare the orientations and joint angles to measurements of an optical motion tracking system that tracks each skin-mounted sensor by means of reflective markers. Eight subjects perform a neutral initial pose, then flexion/extension, lateral flexion, and rotation of the trunk. The root mean square deviation between inertial and optical angles is about one degree for angles in the frontal and sagittal plane and about two degrees for angles in the transverse plane (both values averaged over all trials). We choose five features that characterize the initial pose and the three motions. Interindividual differences of all features are found to be clearly larger than the observed measurement deviations. These results indicate that the proposed inertial sensor-based method is a promising tool for lower back motion assessment.
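The rest-phase bias compensation described above can be illustrated with a one-axis toy example: the gyroscope bias is estimated from an initial rest phase and subtracted before integration. The sample rate, noise magnitudes, and single-axis reduction are assumptions for illustration; the paper's method estimates full 3D orientations from 3D accelerometer and gyroscope readings.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                                  # assumed sample rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)

bias = np.deg2rad(0.5)                      # constant gyroscope bias (rad/s)
true_rate = np.where(t >= 2.0, np.deg2rad(30.0) * np.sin(t), 0.0)
gyro = true_rate + bias + rng.normal(0.0, 1e-3, t.size)

# Treat the first two seconds as the automatically detected rest phase;
# the mean reading during rest is the bias estimate.
bias_estimate = gyro[t < 2.0].mean()

true_angle = np.cumsum(true_rate) / fs
angle_raw = np.cumsum(gyro) / fs                   # plain integration drifts
angle_comp = np.cumsum(gyro - bias_estimate) / fs  # drift largely removed

print(f"drift after 10 s, raw:         {np.rad2deg(angle_raw[-1] - true_angle[-1]):6.2f} deg")
print(f"drift after 10 s, compensated: {np.rad2deg(angle_comp[-1] - true_angle[-1]):6.2f} deg")
```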
General intelligence has a substantial genetic background in children, adolescents, and adults, but environmental factors also strongly correlate with cognitive performance, as evidenced by a strong (up to one SD) increase in average intelligence test results in the second half of the previous century. This change occurred in a period apparently too short to accommodate radical genetic changes. This strongly suggests that environmental factors interact with genotype, possibly by modifying epigenetic factors that regulate gene expression and thus contribute to individual malleability. Such modification might also be reflected in recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events.
Recently, Kocyan & Wiland-Szymańska (2016) published a thorough research article on one of the outstanding members of the family Hypoxidaceae on the Seychelles, which resulted in the erection of a new genus (Friedmannia Kocyan & Wiland-Szymańska 2016: 60) to accommodate the former Curculigo seychellensis Bojer ex Baker (1877: 368). However, it has turned out that the name Friedmannia Chantanachat & Bold (1962: 45) already exists in the literature for a green alga, which renders the new hypoxid genus illegitimate (Melbourne Code; McNeill et al. 2012). Therefore, we assign a new generic name to Curculigo seychellensis.
3D point cloud technology facilitates the automated and highly detailed digital acquisition of real-world environments such as assets, sites, cities, and countries; the acquired 3D point clouds represent an essential category of geodata used in a variety of geoinformation applications and systems. In this paper, we present a web-based system for the interactive and collaborative exploration and inspection of arbitrarily large 3D point clouds. Our approach is based on standard WebGL on the client side and is able to render 3D point clouds with billions of points. It uses spatial data structures and level-of-detail representations to manage the 3D point cloud data and to deploy out-of-core and web-based rendering concepts. By providing functionality for both thin-client and thick-client applications, the system scales to client devices that are vastly different in computing capabilities. Different 3D point-based rendering techniques and post-processing effects are provided to enable task-specific and data-specific filtering and highlighting, e.g., based on per-point surface categories or temporal information. A set of interaction techniques allows users to collaboratively work with the data, e.g., by measuring distances and areas, by annotating, or by selecting and extracting data subsets. Additional value is provided by the system's ability to display additional, context-providing geodata alongside 3D point clouds and to integrate task-specific processing and analysis operations. We have evaluated the presented techniques and the prototype system with different data sets from aerial, mobile, and terrestrial acquisition campaigns with up to 120 billion points to show their practicality and feasibility.
The rapid digitalization of the Facility Management (FM) sector has increased the demand for mobile, interactive analytics approaches concerning the operational state of a building. These approaches provide the key to increasing stakeholder engagement associated with Operation and Maintenance (O&M) procedures of living and working areas, buildings, and other built environment spaces. We present a generic and fast approach to process and analyze given 3D point clouds of typical indoor office spaces to create corresponding up-to-date approximations of classified segments and object-based 3D models that can be used to analyze, record, and highlight changes in spatial configurations. The approach is based on machine-learning methods used to classify the scanned 3D point cloud data using 2D images. It can primarily be used to track changes of objects over time for comparison, allowing for routine classification and the presentation of results used for decision making. We specifically focus on the classification, segmentation, and reconstruction of multiple different object types in a 3D point-cloud scene. We present our current research and describe the implementation of these technologies as a web-based application using a service-oriented methodology.
Rapid advances in location-acquisition technologies have led to large amounts of trajectory data. This data is the foundation for a broad spectrum of services driven and improved by trajectory data mining. However, for hybrid transactional and analytical workloads, the storing and processing of rapidly accumulating trajectory data is a non-trivial task. In this paper, we present a detailed survey of state-of-the-art trajectory data management systems. To determine the relevant aspects of and requirements for such systems, we developed a trajectory data mining framework, which summarizes the different steps in the trajectory data mining process. Based on the derived requirements, we analyze different concepts to store, compress, index, and process spatio-temporal data. There are various trajectory management systems which are optimized for scalability, data footprint reduction, elasticity, or query performance. To give a comprehensive overview, we describe and compare the different existing systems. Additionally, the observed similarities in the general structure of the different systems are consolidated in a general blueprint of trajectory management systems.
We present a prototype of an integrated reasoning environment for educational purposes. The presented tool is a fragment of a proof assistant and automated theorem prover. We describe the existing and planned functionality of the theorem prover and especially the functionality of the educational fragment. This currently supports working with terms of the untyped lambda calculus and addresses both undergraduate students and researchers. We show how the tool can be used to support the students' understanding of functional programming and discuss general problems related to the process of building theorem proving software that aims at supporting both research and education.
High-throughput RNA sequencing (RNAseq) produces large data sets containing expression levels of thousands of genes. The analysis of RNAseq data leads to a better understanding of gene functions and interactions, which eventually helps to study diseases like cancer and to develop effective treatments. Large-scale RNAseq expression studies on cancer comprise samples from multiple cancer types and aim to identify their distinct molecular characteristics. Analyzing samples from different cancer types implies analyzing samples of different tissue origin. Such multi-tissue RNAseq data sets require a meaningful analysis that accounts for the inherent tissue-related bias: the identified characteristics must not originate from the differences in tissue types, but from the actual differences in cancer types. However, current analysis procedures do not incorporate that aspect. Therefore, we propose to integrate tissue-awareness into the analysis of multi-tissue RNAseq data. We introduce an extension for gene selection that provides a tissue-wise context for every gene and can be flexibly combined with any existing gene selection approach, as sketched below. We suggest expanding conventional evaluation by additional metrics that are sensitive to the tissue-related bias. Evaluations show that especially low-complexity gene selection approaches profit from introducing tissue-awareness.
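A minimal sketch of how such a tissue-wise context could wrap an existing per-gene score: apply the score within each tissue separately and then aggregate, so tissue differences cannot dominate. The toy expression matrix, the variance score, and the mean aggregation are assumptions; the paper's actual extension and metrics are richer.

```python
import numpy as np

def tissue_aware_scores(expr, cancer_type, tissue, score_fn):
    """expr: samples x genes; score_fn scores genes on one tissue's samples."""
    per_tissue = [score_fn(expr[tissue == t], cancer_type[tissue == t])
                  for t in np.unique(tissue)]
    return np.mean(per_tissue, axis=0)   # aggregate the tissue-wise contexts

def variance_score(x, y):                # any existing selector plugs in here
    return x.var(axis=0)

rng = np.random.default_rng(2)
expr = rng.normal(size=(60, 5))
expr[:30, 0] += 3.0                      # gene 0 differs only between tissues
tissue = np.repeat(["lung", "kidney"], 30)
cancer = rng.choice(["typeA", "typeB"], size=60)

# Globally, gene 0 looks highly variable; tissue-wise, the bias disappears.
print("global variance      :", np.round(expr.var(axis=0), 2))
print("tissue-aware variance:", np.round(tissue_aware_scores(expr, cancer, tissue, variance_score), 2))
```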
Selecting initial points, determining the number of clusters, and finding proper cluster centers are still the main challenges in clustering processes. In this paper, we suggest a genetic-algorithm-based method which searches several solution spaces simultaneously. The solution spaces are population groups consisting of elements with similar structure. Elements in a group have the same size, while elements in different groups are of different sizes. The proposed algorithm processes the population in groups of chromosomes with one gene, two genes, up to k genes. These genes hold corresponding information about the cluster centers. In the proposed method, the crossover and mutation operators can accept parents of different sizes; this can lead to versatility in the population and information transfer among sub-populations. We implemented the proposed method and evaluated its performance on several random datasets as well as the Ruspini dataset. The experimental results show that the proposed method can effectively determine the appropriate number of clusters and recognize their centers. Overall, this research implies that using a heterogeneous population in the genetic algorithm can lead to better results.
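A compact sketch of the two ingredients highlighted above: chromosomes of different lengths holding candidate cluster centers, and a crossover that accepts parents of different sizes. The per-center fitness penalty is an illustrative stand-in for a cluster-validity criterion and, like the toy data, is not taken from the paper.

```python
import random

random.seed(0)

def fitness(centers, points, penalty=5.0):
    """Negative of (SSE to nearest center + a per-center penalty).
    The penalty keeps longer chromosomes from winning by SSE alone."""
    sse = sum(min((px - cx) ** 2 + (py - cy) ** 2 for cx, cy in centers)
              for px, py in points)
    return -(sse + penalty * len(centers))

def crossover(parent_a, parent_b):
    """Combine a random prefix of one parent with a random suffix of the
    other, so the child's length may differ from both parents' lengths."""
    cut_a = random.randint(1, len(parent_a))
    cut_b = random.randint(0, len(parent_b))
    return parent_a[:cut_a] + parent_b[cut_b:]

# Toy data: three Gaussian blobs.
points = [(random.gauss(mx, 0.3), random.gauss(my, 0.3))
          for mx, my in [(0, 0), (4, 4), (0, 4)] for _ in range(30)]

# Sub-populations of chromosomes holding 1 to 5 genes (candidate centers).
population = [[(random.uniform(-1, 5), random.uniform(-1, 5)) for _ in range(k)]
              for k in range(1, 6) for _ in range(20)]

for _ in range(60):  # selection plus size-mixing crossover
    population.sort(key=lambda ch: fitness(ch, points), reverse=True)
    survivors = population[:50]
    population = survivors + [crossover(random.choice(survivors),
                                        random.choice(survivors))
                              for _ in range(50)]

best = max(population, key=lambda ch: fitness(ch, points))
print("inferred number of clusters:", len(best))  # ideally 3 for this data
```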
Abrupt monsoon transitions as seen in paleorecords can be explained by moisture-advection feedback
(2016)
Chronic ankle instability (CAI) is not only an ankle issue but also affects the sensorimotor system. People with CAI show altered muscle activation in proximal joints such as the hip and knee. However, evidence is limited, as controversial results have been presented regarding changes in the activation of hip muscles in the CAI population. PURPOSE: To investigate the effect of CAI on the activity of hip muscles during normal walking and walking with perturbations. METHODS: 8 subjects with CAI (23 ± 2 years, 171 ± 7 cm and 65 ± 4 kg) and 8 controls (CON) matched by age, height, weight and dominant leg (25 ± 3 years, 172 ± 7 cm and 65 ± 6 kg) walked shod on a split-belt treadmill (1 m/s). Subjects performed 5 minutes of baseline walking and 6 minutes of walking with 10 perturbations (at 200 ms after heel contact with a 42 m/s² deceleration impulse) on each side. Electromyography signals from gluteus medius (Gmed) and gluteus maximus (Gmax) were recorded while walking. Muscle amplitudes (root mean square, normalized to maximum voluntary isometric contraction) were calculated at 200 ms before heel contact (Pre200), 100 ms after heel contact (Post100) during normal walking, and 200 ms after perturbations (Pert200). Differences between groups were examined using the Mann-Whitney U test with Bonferroni correction to account for multiple testing (adjusted α level p ≤ 0.0125). RESULTS: In Gmed, the CAI group showed lower muscle amplitude than the CON group after heel contact (Post100: 18±7 % vs. 47±21 %, p< .01) and after walking perturbations (31±13 % vs. 62±26 %, p< .01), but not before heel contact (Pre200: 5±2 % vs. 11±10 %, p= 0.195). In Gmax, no difference was found between the CAI and CON groups at any of the three time points (Pre200: 12±5 % vs. 17±12 %, p= 0.574; Post100: 41±21 % vs. 41±13 %, p= 1.00; Pert200: 79±46 % vs. 62±35 %, p= 0.505). CONCLUSION: People with CAI activated Gmed less than healthy controls in the feedback mechanism (after heel contact and during walking with perturbations), but not in the feedforward mechanism (before heel contact). Less activation of Gmed may affect balance in the frontal plane and increase the risk of recurrent ankle sprains, episodes of giving way, or feelings of ankle instability in patients with CAI during walking. Future studies should investigate the effect of Gmed strengthening or neuromuscular training on CAI rehabilitation.
Adapting to a changing environment: inspiration for planetary health from East African communities
(2022)
Industry 4.0 and the Internet of Things are recent developments that have led to the creation of new kinds of manufacturing data. Linking this new kind of sensor data to traditional business information is crucial for enterprises to take advantage of the data's full potential. In this paper, we present a demo which allows experiencing this data integration, both vertically between technical and business contexts and horizontally along the value chain. The tool simulates a manufacturing company, continuously producing both business and sensor data, and supports issuing ad-hoc queries that answer specific questions related to the business. In order to adapt to different environments, users can configure sensor characteristics to their needs.
Eighteen scientists met at Jurata, Poland, to discuss various aspects of the transition from adolescence to adulthood. This transition is a delicate period shaped by complex interactions between adolescents and the social group they belong to. Social identity, group identification and identity signalling, but also stress affecting basal salivary cortisol rhythms, hypertension, inappropriate nutrition causing latent and manifest obesity, and, in developing and under-developed countries, parasitosis causing anaemia and thereby impairing growth and development, are issues to be dealt with during this period of human development. In addition, some new aspects of the association between weight, height and head circumference in newborns were discussed, as well as intrauterine head growth and head circumference as health risk indicators.
Adsorption of amino acids on the magnetite-(111)-surface: a force field study (vol 19, 851, 2013)
(2016)
Working in iterations and repeatedly improving team workflows based on collected feedback is fundamental to agile software development processes. Scrum, the most popular agile method, provides dedicated retrospective meetings to reflect on the last development iteration and to decide on process improvement actions. However, agile methods do not prescribe how these improvement actions should be identified, managed or tracked in detail. The approaches to detect and remove problems in software development processes are therefore often based on intuition and prior experiences and perceptions of team members. Previous research in this area has focused on approaches to elicit a team's improvement opportunities as well as measurements regarding the work performed in an iteration, e.g. Scrum burn-down charts. Little research deals with the quality and nature of identified problems or how progress towards removing issues is measured. In this research, we investigate how agile development teams in the professional software industry organize their feedback and process improvement approaches. In particular, we focus on the structure and content of improvement and reflection meetings, i.e. retrospectives, and their outcomes. Researching how the vital mechanism of process improvement is implemented in practice in modern software development leads to a more complete picture of agile process improvement.
Aktives kommunales Debt Management in Deutschland : ein bisher vernachlässigtes Sparpotenzial
(2006)
Based on the whole of Humboldt's work, from its beginnings to the Cosmos, this dossier seeks to highlight the cosmopolitan orientation of the Prussian scholar and, above all, the American foundation of his approaches. For Humboldt, the American continent represents the diversity of the thinkable and the multirelationality of the imaginable: the key to understanding his worldview.
alt'ai is an agent-based simulation inspired by the aesthetics, culture and environmental conditions of the Altai mountain region on the borders between Russia, Kazakhstan, China and Mongolia. It is set in a scenario of a remote automated landscape populated by sentient machines, where biological species, machines and environments autonomously interact to produce unforeseeable visual outputs. It poses the question of designing future machine-to-machine authentication protocols that are based on the use of images encoding agent behavior, and the simulation provides a rich visual perspective on this challenge. The project pleads for a heavily aestheticized approach to design practice and highlights the importance of productively inefficient and information-redundant systems.
An energy consumption model for multimodal wireless sensor networks based on wake-up radio receivers
(2018)
Energy consumption is a major concern in Wireless Sensor Networks. A significant amount of energy is wasted due to the idle listening and overhearing problems, which are typically avoided by turning off the radio while no transmission is ongoing. The classical approach for allowing the reception of messages in such situations is to use a low-duty-cycle protocol and to turn on the radio periodically, which reduces the idle listening problem but requires timers and usually unnecessary wakeups. A better solution is to turn on the radio only on demand by using a Wake-up Radio Receiver (WuRx). In this paper, an energy model is presented to estimate the energy saving in various multi-hop network topologies under several use cases when a WuRx is used instead of a classical low-duty-cycling protocol. The presented model also allows for estimating the benefit of various WuRx properties, such as whether addressing is used.
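The flavour of such a model can be conveyed with a back-of-the-envelope comparison of daily charge consumption under duty cycling versus a WuRx. All current draws, timings, and traffic figures below are illustrative assumptions, not the paper's parameters.

```python
DAY = 24 * 3600  # seconds per day

# Assumed current draws (A) and timings (s); illustrative only.
I_SLEEP, I_RX, I_WURX = 2e-6, 15e-3, 8e-6
WAKE_PERIOD, WAKE_DURATION = 1.0, 5e-3      # duty cycling: 5 ms every second
MSGS_PER_DAY, RX_TIME = 48, 50e-3           # traffic assumption

def charge_duty_cycling():
    awake = (DAY / WAKE_PERIOD) * WAKE_DURATION + MSGS_PER_DAY * RX_TIME
    return awake * I_RX + (DAY - awake) * I_SLEEP

def charge_wurx():
    awake = MSGS_PER_DAY * RX_TIME           # main radio woken only on demand
    return awake * I_RX + (DAY - awake) * I_SLEEP + DAY * I_WURX

print(f"duty cycling : {charge_duty_cycling():.2f} C/day")  # ~6.69 C/day
print(f"wake-up radio: {charge_wurx():.2f} C/day")          # ~0.90 C/day
```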
Mobile sensing technology allows us to investigate human behaviour on a daily basis. In this study, we examined temporal orientation, which refers to the capacity of thinking or talking about personal events in the past and future. We utilise the mksense platform, which allows us to use the experience-sampling method. Individuals' thoughts and their relationship with smartphones' Bluetooth data are analysed to understand in which contexts people are influenced by social environments, such as the people they spend the most time with. As an exploratory study, we analyse the influence of social conditions through a collection of Bluetooth data and survey information from participants' smartphones. Preliminary results show that people are likely to focus on past events when interacting with closely related people, and on future planning when interacting with strangers. Similarly, people experience present temporal orientation when accompanied by known people. We believe that these findings are linked to emotions since, in its most basic state, emotion is a state of physiological arousal combined with an appropriate cognition. In this contribution, we envision a smartphone application for automatically inferring human emotions from a user's temporal orientation using Bluetooth sensors. We briefly elaborate on the influential factors of temporal orientation episodes and conclude with a discussion and lessons learned.
An Information System Supporting the Eliciting of Expert Knowledge for Successful IT Projects
(2018)
In order to guarantee the success of an IT project, it is necessary for a company to possess expert knowledge. The difficulty arises when experts no longer work for the company but their knowledge is still needed to realise an IT project. In this paper, the ExKnowIT information system, which supports the eliciting of expert knowledge for successful IT projects, is presented. It consists of the following modules: (1) the identification of experts for successful IT projects, (2) the eliciting of expert knowledge on completed IT projects, (3) the expert knowledge base on completed IT projects, (4) the Group Method of Data Handling (GMDH) algorithm, and (5) new knowledge in support of decisions regarding the selection of a manager for a new IT project. The added value of our system is that three approaches, namely the elicitation of expert knowledge, the success of an IT project, and the discovery of new knowledge gleaned from the expert knowledge base (otherwise known as the decision model), complement each other.
E-commerce marketplaces are highly dynamic, with constant competition. While this competition is challenging for many merchants, it also provides plenty of opportunities, e.g., by allowing them to automatically adjust prices in order to react to changing market situations. For practitioners, however, testing automated pricing strategies is time-consuming and potentially hazardous when done in production. Researchers, on the other hand, struggle to study how pricing strategies interact under heavy competition. As a consequence, we built an open continuous-time framework to simulate dynamic pricing competition called Price Wars. The microservice-based architecture provides a scalable platform for large competitions with dozens of merchants and a large random stream of consumers. Our platform stores each event in a distributed log. This allows us to provide different performance measures, enabling users to compare the profit and revenue of various repricing strategies in real time. For researchers, price trajectories are shown, which eases evaluating the mutual price reactions of competing strategies. Furthermore, merchants can access historical marketplace data and apply machine learning. By providing a set of customizable, artificial merchants, users can easily simulate both simple rule-based strategies and sophisticated data-driven strategies using demand learning to optimize their pricing strategies.
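As an example of the simple rule-based strategies such a platform lets users plug in, consider an undercutting merchant like the one below. The function signature and offer fields are hypothetical illustrations, not the Price Wars API.

```python
def reprice(own_offer: dict, competitor_offers: list,
            min_margin: float = 0.05, undercut: float = 0.01) -> float:
    """Undercut the cheapest competitor, but never go below cost + margin."""
    floor = own_offer["cost"] * (1 + min_margin)
    if not competitor_offers:
        return own_offer["cost"] * 2            # no competition: high margin
    cheapest = min(offer["price"] for offer in competitor_offers)
    return max(cheapest - undercut, floor)

offer = {"cost": 10.0, "price": 14.0}
competitors = [{"price": 13.5}, {"price": 12.8}]
print(reprice(offer, competitors))              # 12.79
```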
The project deals with the visual dimension of poetry's effect and the possibility of describing it analytically. To this end, the arrangement of verses and words, emphases, and other typographic structures of non-experimental poems since the end of the 18th century are examined in model analyses.
Xenikoudakis et al. report a partial mitochondrial genome of the extinct giant beaver Castoroides and estimate the origin of aquatic behavior in beavers at approximately 20 million years ago. This time estimate coincides with the extinction of terrestrial beavers and raises the question of whether the two events had a common cause.
The one-tube osmotic fragility (OF) test is a rapid test widely used for screening for thalassemia in countries with limited resources. The test has an important limitation in that its accuracy relies on observers' experience.
The iCheck Turbidity is a prototype of a portable nephelometer developed by BioAnalyt (BioAnalyt GmbH, Germany). In this study, we assessed the applicability of the iCheck Turbidity for checking the turbidity of the OF test.
ASEDS
(2018)
The massive adoption of social media has provided new ways for individuals to express their opinions and emotions online. In 2016, Facebook introduced a new reactions feature that allows users to express their psychological emotions regarding published content using so-called Facebook reactions. In this paper, a framework for predicting the distribution of Facebook post reactions is presented. For this purpose, we collected a large number of Facebook posts together with their reaction labels using the proposed scalable Facebook crawler. The training process utilizes 3 million labeled posts from more than 64,000 unique Facebook pages from diverse categories. The evaluation on standard benchmarks using the proposed features shows promising results compared to previous research. The final model is able to predict the reaction distribution on Facebook posts with a recall score of 0.90 for the "Joy" emotion.
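A strongly simplified sketch of the prediction task: learn a distribution over reaction labels from post text, here by training on a dominant-reaction label and reading off predict_proba. The toy posts and labels are invented, and the paper's features and model are more elaborate.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "Our team just won the championship!",
    "So sad to hear about the accident.",
    "This new tax proposal makes me furious.",
    "What a hilarious blooper reel!",
]
dominant_reaction = ["love", "sad", "angry", "haha"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(posts, dominant_reaction)

# predict_proba yields a distribution over the reaction labels.
for label, p in zip(model.classes_, model.predict_proba(["We won the match!"])[0]):
    print(f"{label}: {p:.2f}")
```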
Manufacturing industries are undergoing a major paradigm shift towards more autonomy. Automated planning and scheduling then becomes a necessity. The Planning and Execution Competition for Logistics Robots in Simulation held at ICAPS is based on this scenario and provides an interesting testbed. However, the posed problem is challenging, as also demonstrated by the somewhat weak results in 2017. The domain requires temporal reasoning and dealing with uncertainty. We propose a novel planning system based on Answer Set Programming and the Clingo solver to tackle these problems and incentivize robot cooperation. Our results show a significant performance improvement, both in terms of lower computational requirements and in terms of better game metrics.
Aspirin inhibits release of platelet-derived sphingosine-1-phosphate in acute myocardial infarction
(2013)
Cost models play an important role in the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures dominate today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, all accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
High-dimensional data is particularly useful for data analytics research. In the healthcare domain, for instance, high-dimensional data analytics has been used successfully for drug discovery. Yet, in order to adhere to privacy legislation, data analytics service providers must guarantee anonymity for data owners. In the context of high-dimensional data, ensuring privacy is challenging because increased data dimensionality must be matched by an exponential growth in the size of the data to avoid sparse datasets. Syntactically anonymising sparse datasets with methods that rely on statistical significance makes obtaining sound and reliable results a challenge. As such, strong privacy is only achievable at the cost of high information loss, rendering the data unusable for data analytics. In this paper, we make two contributions to addressing this problem from both the privacy and information loss perspectives. First, we show that by identifying dependencies between attribute subsets we can eliminate privacy-violating attributes from the anonymised dataset. Second, to minimise information loss, we employ a greedy search algorithm to determine and eliminate maximal partial unique attribute combinations. Thus, one only needs to find the minimal set of identifying attributes to prevent re-identification. Experiments on a health cloud based on the SAP HANA platform, using a semi-synthetic medical history dataset comprising 109 attributes, demonstrate the effectiveness of our approach.
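The greedy elimination of unique attribute combinations can be sketched as follows. The toy records, the combination size bound, and the "drop the most-implicated attribute" rule are illustrative assumptions rather than the paper's exact algorithm.

```python
from itertools import combinations

# Toy medical records (illustrative).
records = [
    {"age": 34, "zip": "10115", "condition": "flu"},
    {"age": 34, "zip": "10117", "condition": "flu"},
    {"age": 41, "zip": "10115", "condition": "cold"},
    {"age": 41, "zip": "10115", "condition": "flu"},
]

def unique_combos(rows, attrs, max_size=2):
    """Attribute subsets under which at least one record is unique."""
    found = []
    for size in range(1, max_size + 1):
        for combo in combinations(attrs, size):
            values = [tuple(r[a] for a in combo) for r in rows]
            if any(values.count(v) == 1 for v in values):
                found.append(combo)
    return found

attrs = list(records[0])
while (violating := unique_combos(records, attrs)):
    # Greedy choice: suppress the attribute occurring in most violating combos.
    worst = max(attrs, key=lambda a: sum(a in c for c in violating))
    attrs.remove(worst)

print("attributes safe to release:", attrs)  # -> ['age'] for this toy data
```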
Audit - and then what?
(2019)
Current trends such as digital transformation, the Internet of Things, or Industry 4.0 are challenging the majority of learning factories. Regardless of whether it is a conventional learning factory, a model factory, or a digital learning factory, traditional approaches such as the monotonous execution of specific instructions no longer satisfy learners' needs, market requirements, or current technological developments. Contemporary teaching environments need a clear strategy, a road to follow to successfully cope with these changes and develop towards digitized learning factories. This demand-driven necessity of transformation leads to another obstacle: assessing the status quo and developing and implementing adequate action plans. Within this paper, details of a maturity-based audit of the hybrid learning factory in the Research and Application Centre Industry 4.0 and a roadmap derived from it for the digitization of a learning factory are presented.
Auf dem Sprung
(2019)
Experience abroad is valued by many law firms and companies in the hiring process. It demonstrates that applicants have acquired language skills, intercultural competence, and organizational abilities. Nevertheless, students often have doubts about whether and how to fit a semester abroad into their law studies.
This article therefore aims to provide helpful answers and tips on how law students at the University of Potsdam can best integrate a semester abroad into their studies. It is meant as encouragement to apply for one.
The article is divided into three sections. First, I address the supposedly "right" time for a semester abroad. Then I outline which coursework can readily be credited. Finally, I present both the disadvantages and the advantages of an Erasmus semester from my own experience.
The classification of vulnerabilities is a fundamental step in deriving formal attributes that allow a deeper analysis. It is therefore required that this classification be performed in a timely and accurate manner. Since the classification process currently demands manual interaction, timely processing becomes a serious issue. Thus, we propose an automated alternative to manual classification, because the number of vulnerabilities identified per day can no longer be processed manually. We implemented two different approaches that are able to automatically classify vulnerabilities based on the vulnerability description. We evaluated our approaches, which use Neural Networks and the Naive Bayes method respectively, on the basis of publicly known vulnerabilities.
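For the Naive Bayes variant, a minimal sketch of description-based classification with scikit-learn might look as follows. The descriptions and class labels are toy examples; the paper evaluates on publicly known vulnerability data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy vulnerability descriptions and class labels (illustrative only).
descriptions = [
    "SQL injection in the login form allows reading arbitrary database rows",
    "Stack-based buffer overflow when parsing overly long header fields",
    "Reflected cross-site scripting via the unescaped search parameter",
    "Heap buffer overflow in the image decoder leads to code execution",
]
labels = ["sql-injection", "buffer-overflow", "xss", "buffer-overflow"]

classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(descriptions, labels)

print(classifier.predict(["buffer overflow in the TLS record parser"]))
# -> ['buffer-overflow']
```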
Back pain: the study of mechanisms and the translation in interventions within the MiSpEx network
(2016)
BDS-Kampagne in der Kommune
(2022)
Beacon in the Dark
(2018)
The large amount of heterogeneous data in email corpora renders experts' investigations by hand infeasible. Auditors or journalists, e.g., who are looking for irregular or inappropriate content or suspicious patterns, are in desperate need of computer-aided exploration tools to support their investigations.
We present our Beacon system for the exploration of such corpora at different levels of detail. A distributed processing pipeline combines text mining methods and social network analysis to augment the already semi-structured nature of emails. The user interface ties into the resulting cleaned and enriched dataset. For the interface design we identify three objectives expert users have: gain an initial overview of the data to identify leads to investigate, understand the context of the information at hand, and have meaningful filters to iteratively focus on a subset of emails. To this end we make use of interactive visualisations based on rearranged and aggregated extracted information to reveal salient patterns.
Beware of SMOMBIES
(2018)
Several studies have evaluated a user's style of walking for the verification of a claimed identity and showed high authentication accuracies in many settings. In this paper we present a system that successfully verifies a user's identity based on many real-world smartphone placements and previously disregarded interactions while walking. Our contribution is the distinction of all considered activities into three distinct subsets and a specific one-class Support Vector Machine per subset. Using sensor data of 30 participants collected in a semi-supervised study approach, we show that unsupervised verification is possible with very low false-acceptance and false-rejection rates. We furthermore show that these subsets can be distinguished with high accuracy and demonstrate that this system can be deployed on off-the-shelf smartphones.
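The core mechanism, one one-class SVM per activity subset trained only on the genuine user's data, can be sketched like this. The subset names and Gaussian toy features are assumptions for illustration, not the paper's features.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
subsets = ["trouser-pocket", "backpack", "in-hand"]

# Hypothetical gait feature vectors of the genuine user, one set per subset.
genuine = {s: rng.normal(loc=i, scale=0.3, size=(200, 8))
           for i, s in enumerate(subsets)}

# One one-class SVM per placement subset, trained on genuine data only.
models = {s: OneClassSVM(nu=0.05, gamma="scale").fit(x)
          for s, x in genuine.items()}

def verify(subset: str, features: np.ndarray) -> bool:
    """Accept the walker if the subset-specific model labels the sample inlier."""
    return models[subset].predict(features.reshape(1, -1))[0] == 1

print(verify("backpack", rng.normal(1.0, 0.3, 8)))  # genuine-like -> True
print(verify("backpack", rng.normal(5.0, 0.3, 8)))  # impostor-like -> False
```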
Beyond Surveys
(2018)
Blick in die Zukunft
(2019)
Seek and destroy: Filtration schemes and self-detoxifying protective fabrics based on the Zr(IV)-containing metal-organic frameworks (MOFs) MOF-808 and UiO-66 doped with LiOtBu have been developed that capture and hydrolytically detoxify simulants of nerve agents and mustard gas. Both MOFs function as highly catalytic elements in these applications.
We study the parameter sensitivity of hetero-polymeric DNA within the purview of DNA breathing dynamics. The degree of correlation between the mean bubble size and the model parameters is estimated for this purpose for three different DNA sequences. The analysis leads us to a better understanding of the sequence-dependent nature of the breathing dynamics of hetero-polymeric DNA. Out of the 14 model parameters for DNA stability in the statistical Poland-Scheraga approach, the hydrogen-bond interaction ε_hb(AT) for an AT base pair and the ring factor turn out to be the most sensitive parameters. In addition, the stacking interaction ε_st(TA-TA) for a TA-TA nearest-neighbor pair of base pairs is found to be the most sensitive one among all stacking interactions. Moreover, we also establish that the nature of a stacking interaction has a deciding effect on the DNA breathing dynamics, not the number of times a particular stacking interaction appears in a sequence. We show that the sensitivity analysis can be used as an effective measure to guide a stochastic optimization technique to find the kinetic rate constants related to the dynamics, as opposed to measuring the rate constants using the conventional unbiased way of optimization.
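The correlation-based sensitivity measure can be illustrated with a toy stand-in for the breathing simulation: sample perturbed parameter sets, evaluate a response, and correlate each parameter with the mean bubble size. The parameter names and the linear toy response are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
names = ["eps_hb(AT)", "eps_hb(GC)", "eps_st(TA-TA)", "ring_factor"]
samples = rng.uniform(0.9, 1.1, size=(500, len(names)))   # +/-10 % perturbations

def mean_bubble_size(params):
    """Toy response standing in for the Poland-Scheraga breathing simulation."""
    hidden = np.array([2.0, 0.3, 1.2, 1.6])   # made-up underlying sensitivities
    return params @ hidden + rng.normal(0.0, 0.05)

response = np.array([mean_bubble_size(p) for p in samples])

# Parameters with the largest |correlation| are the most sensitive ones.
for i, name in enumerate(names):
    r = np.corrcoef(samples[:, i], response)[0, 1]
    print(f"{name:>14s}  correlation with mean bubble size: {r:+.2f}")
```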
Bridging the Gap
(2019)
The recent restructuring of the electricity grid (i.e., the smart grid) introduces a number of challenges for today's large-scale computing systems. To operate reliably and efficiently, computing systems must not only adhere to technical limits (i.e., thermal constraints) but must also reduce operating costs, for example, by increasing their energy efficiency. Efforts to improve energy efficiency, however, are often hampered by inflexible software components that hardly adapt to underlying hardware characteristics. In this paper, we propose an approach to bridge the gap between inflexible software and heterogeneous hardware architectures. Our proposal introduces adaptive software components that dynamically adapt to heterogeneous processing units (i.e., accelerators) during runtime to improve the energy efficiency of computing systems.
Cain and Abel
(2021)
The biblical story of Cain and Abel in Genesis 4:1–16 appears as the first case of sibling rivalry in the Torah. It is the starting point of a socio-ethical process of human development within the book of Genesis. The sibling narrative also includes the first report of a homicide, more precisely a fratricide, as Cain slays his own brother Abel (Gen 4:8). The Jewish and Christian reception discourse of the Cain and Abel story developed early on to deal with a range of open questions and difficult passages in the biblical text. The basic assumptions of Jewish and Christian interpretations are initially similar in their attempts to explain God's preference for Abel's sacrifice and Cain's motivation for killing his brother.