Point clouds provide high-resolution topographic data, which is often classified into bare-earth, vegetation, and building points and then filtered and aggregated into gridded Digital Elevation Models (DEMs) or Digital Terrain Models (DTMs). Based on these equally spaced grids, flow-accumulation algorithms are applied to describe hydrologic and geomorphologic mass transport on the surface. In this contribution, we propose a stochastic point-cloud filtering that, together with spatial bootstrap sampling, allows for flow accumulation directly on point clouds using Facet-Flow Networks (FFN). This additionally provides a framework for quantifying uncertainties in point-cloud-derived metrics such as the Specific Catchment Area (SCA), even though the flow accumulation itself is deterministic.
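As a rough illustration of how a spatial bootstrap yields uncertainty bounds for a point-cloud metric such as the SCA, the following sketch (hypothetical helper names; not the FFN implementation from the paper) computes a percentile confidence interval by resampling per-point contributions with replacement:

```python
import random

def bootstrap_percentile_ci(values, stat=lambda v: sum(v) / len(v),
                            n_boot=1000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a point-cloud metric.

    `values` stands in for per-point contributions to a metric such as
    the specific catchment area; resampling with replacement mimics a
    spatial bootstrap over the point cloud.
    """
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(values) for _ in values]) for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Repeating this per facet of the flow network would give point-wise uncertainty bands around an otherwise deterministic accumulation result.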
Beacon in the Dark
(2018)
The large amount of heterogeneous data in these email corpora renders experts' investigations by hand infeasible. Auditors and journalists, for example, who are looking for irregular or inappropriate content or suspicious patterns, are in desperate need of computer-aided exploration tools to support their investigations.
We present our Beacon system for the exploration of such corpora at different levels of detail. A distributed processing pipeline combines text mining methods and social network analysis to augment the already semi-structured nature of emails. The user interface ties into the resulting cleaned and enriched dataset. For the interface design we identify three objectives expert users have: gain an initial overview of the data to identify leads to investigate, understand the context of the information at hand, and have meaningful filters to iteratively focus onto a subset of emails. To this end we make use of interactive visualisations based on rearranged and aggregated extracted information to reveal salient patterns.
Web-based E-Learning uses Internet technologies and digital media to deliver educational content to learners. In recent years, many universities have applied their capacity to producing Massive Open Online Courses (MOOCs), offering them with the expectation of rendering a comprehensive online apprenticeship. Typically, an online content-delivery process requires an Internet connection. However, broadband access has never been a readily available resource in many regions. In Africa, poor or absent networks are still the predominant experience of Internet users, who go offline each time a digital device disconnects from the network. As a result, learning processes in such regions are repeatedly disrupted, delayed or terminated. This paper raises the concern of E-Learning over poor and low-bandwidth connections and highlights the need for an Offline-Enabled mode. The paper also explores technical approaches aimed at enhancing the user experience in Web-based E-Learning, particularly in Africa.
The "Bachelor Project"
(2019)
One of the challenges of educating the next generation of computer scientists is to teach them to become team players who are able to communicate and interact not only with different IT systems, but also with coworkers and customers from a non-IT background. The “bachelor project” is a project format based on teamwork and close collaboration with selected industry partners. The authors have hosted some of the teams since the spring term of 2014/15. In the paper at hand, we explain and discuss this concept and evaluate its success based on students' evaluations and reports. Furthermore, the technology stack used by the teams is evaluated to understand how self-organized students work in IT-related projects. We show that, and why, the bachelor project is the most successful educational format in the perception of the students, and how these positive results can be further improved by the mentors.
Mobile expressive rendering has gained increasing popularity among users seeking casual creativity through image stylization, and supports the development of mobile artists as a new user group. In particular, neural style transfer has advanced as a core technology to emulate characteristics of manifold artistic styles. However, when it comes to creative expression, the technology still faces inherent limitations in providing low-level controls for localized image stylization. This work enhances state-of-the-art neural style transfer techniques with a generalized user interface offering interactive tools to facilitate a creative and localized editing process. We first propose a problem characterization representing trade-offs between visual quality, run-time performance, and user control. We then present MaeSTrO, a mobile app for the orchestration of neural style transfer techniques using iterative, multi-style generative and adaptive neural networks that can be locally controlled by on-screen painting metaphors. Initial user tests indicate different levels of satisfaction for the implemented techniques and interaction design.
DPP4 inhibition prevents AKI
(2017)
Logical modeling has been widely used to understand and expand the knowledge about protein interactions among different pathways. Realizing this, the caspo-ts system has been proposed recently to learn logical models from time series data. It uses Answer Set Programming to enumerate Boolean Networks (BNs) given prior knowledge networks and phosphoproteomic time series data. In the resulting sequence of solutions, similar BNs are typically clustered together. This can be problematic for large scale problems where we cannot explore the whole solution space in reasonable time. Our approach extends the caspo-ts system to cope with the important use case of finding diverse solutions of a problem with a large number of solutions. We first present the algorithm for finding diverse solutions and then we demonstrate the results of the proposed approach on two different benchmark scenarios in systems biology: (1) an artificial dataset to model TCR signaling and (2) the HPN-DREAM challenge dataset to model breast cancer cell lines.
Tikhonov regularization with oversmoothing penalty for linear statistical inverse learning problems
(2019)
In this paper, we consider the linear ill-posed inverse problem with noisy data in the statistical learning setting. The Tikhonov regularization scheme in Hilbert scales is considered in the reproducing kernel Hilbert space framework to reconstruct the estimator from the random noisy data. We discuss the rates of convergence for the regularized solution under the prior assumptions and link condition. For regression functions with smoothness given in terms of source conditions the error bound can explicitly be established.
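In standard notation (our choice of symbols, which need not match the paper's), Tikhonov regularization with an oversmoothing penalty in a Hilbert scale generated by an operator $L$ takes the form

```latex
\hat f_\lambda \;=\; \operatorname*{arg\,min}_{f\in\mathcal H}\;
  \frac{1}{n}\sum_{i=1}^{n}\bigl((A f)(x_i)-y_i\bigr)^2
  \;+\;\lambda\,\lVert L^{a} f\rVert_{\mathcal H}^{2},
```

where $A$ is the forward operator of the inverse problem, $(x_i, y_i)$ are the random noisy samples, $\mathcal H$ is the reproducing kernel Hilbert space, and the penalty is called oversmoothing when the regression function lies outside the domain of $L^{a}$.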
When local poverty is more important than your income: Mental health in minorities in inner cities
(2015)
The influence of chemical composition and crystallisation conditions on the ferroelectric and paraelectric phases and the resulting morphology in Poly(vinylidene fluoride-trifluoroethylene-chlorofluoroethylene) (P(VDF-TrFE-CFE)) terpolymer films with 55.4/37.2/7.3 mol% or with 62.2/29.4/8.4 mol% of VDF/TrFE/CFE was studied. Poly(vinylidene fluoride trifluoroethylene) (P(VDF-TrFE)) with 75/25 mol% VDF/TrFE was employed as reference material. Fourier-Transform Infrared Spectroscopy (FTIR) was used to determine the fractions of the relevant terpolymer phases, and X-Ray Diffraction (XRD) was employed to assess the crystalline morphology. The FTIR results show an increase of the fraction of paraelectric phases after annealing. On the other hand, XRD results indicate a more stable paraelectric phase in the terpolymer with higher CFE content.
Cardiovascular drift response over two different constant-load exercises in healthy non-athletes
(2019)
Cardiovascular drift (CV-d) is a steady increase in heart rate (HR) over time while performing constant load moderate intensity exercise (CME) > 20 min. CV-d presents problems for the prescription of exercise intensity by means of HR, because the work rate (WR) during exercise must be adjusted to maintain target HR, thus disturbing the intended effect of the exercise intervention. It has been shown that the increase in HR during CME is due to changes in WR and not to CV-d.
Business process simulation is an important means for the quantitative analysis of a business process and for comparing different process alternatives. With the Business Process Model and Notation (BPMN) being the state-of-the-art language for the graphical representation of business processes, many existing process simulators already support the simulation of BPMN diagrams. However, they do not provide well-defined interfaces for integrating new concepts into the simulation environment. In this work, we present the design and architecture of a proof-of-concept implementation of an open and extensible BPMN process simulator. It also supports the simulation of multiple BPMN processes at a time and relies on the building blocks of well-founded discrete event simulation. Extensibility is assured by a plug-in concept. Its feasibility is demonstrated by extensions supporting new BPMN concepts, such as the simulation of business rule activities referencing decision models and batch activities.
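The plug-in idea can be pictured as a discrete-event core whose handling of each BPMN element type is supplied by registered extensions. The following is a minimal sketch with invented names, not the simulator's actual interfaces:

```python
import heapq
from typing import Callable, Dict, List, Tuple

class Simulator:
    """Minimal discrete-event core with a plug-in registry keyed by
    BPMN element type."""

    def __init__(self) -> None:
        self.clock = 0.0
        self._queue: List[Tuple[float, int, str, dict]] = []
        self._seq = 0                      # tie-breaker for equal times
        self.handlers: Dict[str, Callable] = {}
        self.log: List[str] = []

    def register(self, element_type: str, handler: Callable) -> None:
        self.handlers[element_type] = handler      # the plug-in hook

    def schedule(self, delay: float, element_type: str, token: dict) -> None:
        self._seq += 1
        heapq.heappush(self._queue,
                       (self.clock + delay, self._seq, element_type, token))

    def run(self) -> None:
        while self._queue:
            self.clock, _, etype, token = heapq.heappop(self._queue)
            self.handlers[etype](self, token)

# A plug-in for a (hypothetical) business-rule activity:
def business_rule(sim: Simulator, token: dict) -> None:
    token["approved"] = token.get("amount", 0) < 1000   # decision-model stand-in
    sim.log.append(f"t={sim.clock}: rule evaluated -> {token['approved']}")

sim = Simulator()
sim.register("businessRule", business_rule)
sim.schedule(2.0, "businessRule", {"amount": 500})
sim.run()
```

New BPMN concepts then only require registering an additional handler; the event queue and clock remain untouched.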
The target article discusses the question of how educational makerspaces can become places supportive of knowledge construction. This question is too often neglected by people who run makerspaces, as they mostly explain how to use different tools and focus on the creation of a product. In makerspaces, pupils often also engage in physical computing activities, and thus in the creation of interactive artifacts containing embedded systems, such as smart shoes or wristbands, plant-monitoring systems or drink-mixing machines. This offers an opportunity to reflect on the teaching of physical computing in computer science education, where, similarly, the creation of the product is often so strongly focused upon that reflection on the learning process is pushed into the background.
Aspirin inhibits release of platelet-derived sphingosine-1-phosphate in acute myocardial infarction
(2013)
Minimising Information Loss on Anonymised High Dimensional Data with Greedy In-Memory Processing
(2018)
Minimising information loss on anonymised high dimensional data is important for data utility. Syntactic data anonymisation algorithms address this issue by generating datasets that are neither use-case specific nor dependent on runtime specifications. This results in anonymised datasets that can be re-used in different scenarios, which is performance efficient. However, syntactic data anonymisation algorithms incur high information loss on high dimensional data, making the data unusable for analytics. In this paper, we propose an optimised exact quasi-identifier identification scheme, based on the notion of k-anonymity, to generate anonymised high dimensional datasets efficiently and with low information loss. The optimised exact quasi-identifier identification scheme works by identifying and eliminating maximal partial unique column combination (mpUCC) attributes that endanger anonymity. By using in-memory processing to handle the attribute selection procedure, we significantly reduce the processing time required. We evaluated the effectiveness of our proposed approach with an enriched dataset drawn from multiple real-world data sources, and augmented with synthetic values generated in close alignment with the real-world data distributions. Our results indicate that in-memory processing drops attribute selection time for the mpUCC candidates from 400s to 100s, while significantly reducing information loss. In addition, we achieve a time-complexity speed-up of O(3^(n/3)) ≈ O(1.4422^n).
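The core notion of a unique column combination can be illustrated with a brute-force enumeration of minimal combinations whose projected value tuples are duplicate-free (for intuition only; the paper's greedy in-memory mpUCC search is far more refined):

```python
from itertools import combinations

def minimal_unique_column_combinations(rows, max_size=None):
    """Enumerate minimal column combinations whose value tuples are
    unique across all rows (candidate quasi-identifiers).

    `rows` is a list of equal-length tuples; the result lists column
    index combinations, smallest first.
    """
    n_cols = len(rows[0])
    max_size = max_size or n_cols
    minimal = []
    for size in range(1, max_size + 1):
        for combo in combinations(range(n_cols), size):
            # Skip supersets of an already-unique combination (not minimal).
            if any(set(c).issubset(combo) for c in minimal):
                continue
            projected = [tuple(r[i] for i in combo) for r in rows]
            if len(set(projected)) == len(projected):
                minimal.append(combo)
    return minimal
```

Such combinations are exactly the attributes that endanger anonymity: removing or generalising one column from each minimal combination breaks re-identification.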
High-dimensional data is particularly useful for data analytics research. In the healthcare domain, for instance, high-dimensional data analytics has been used successfully for drug discovery. Yet, in order to adhere to privacy legislation, data analytics service providers must guarantee anonymity for data owners. In the context of high-dimensional data, ensuring privacy is challenging because increased data dimensionality must be matched by an exponential growth in the size of the data to avoid sparse datasets. Syntactically anonymising sparse datasets with methods that rely on statistical significance makes obtaining sound and reliable results a challenge. As such, strong privacy is only achievable at the cost of high information loss, rendering the data unusable for data analytics. In this paper, we make two contributions to addressing this problem from both the privacy and information loss perspectives. First, we show that by identifying dependencies between attribute subsets we can eliminate privacy-violating attributes from the anonymised dataset. Second, to minimise information loss, we employ a greedy search algorithm to determine and eliminate maximal partial unique attribute combinations. Thus, one only needs to find the minimal set of identifying attributes to prevent re-identification. Experiments on a health cloud based on the SAP HANA platform using a semi-synthetic medical history dataset comprised of 109 attributes demonstrate the effectiveness of our approach.
Cost models play an important role for the efficient implementation of software systems. These models can be embedded in operating systems and execution environments to optimize execution at run time. Even though non-uniform memory access (NUMA) architectures are dominating today's server landscape, there is still a lack of parallel cost models that represent NUMA systems sufficiently. Therefore, the existing NUMA models are analyzed, and a two-step performance assessment strategy is proposed that incorporates low-level hardware counters as performance indicators. To support the two-step strategy, multiple tools are developed, all accumulating and enriching specific hardware event counter information, to explore, measure, and visualize these low-overhead performance indicators. The tools are showcased and discussed alongside specific experiments in the realm of performance assessment.
The overhead of moving data is the major limiting factor in today's hardware, especially in heterogeneous systems where data needs to be transferred frequently between host and accelerator memory. With the increasing availability of hardware-based compression facilities in modern computer architectures, this paper investigates the potential of hardware-accelerated I/O link compression as a promising approach to reduce data volumes and transfer time, thus improving the overall efficiency of accelerators in heterogeneous systems. Our considerations are focused on on-the-fly compression in both single-node and scale-out deployments. Based on a theoretical analysis, this paper demonstrates the feasibility of hardware-accelerated on-the-fly I/O link compression for many workloads in a scale-out scenario, and for some even in a single-node scenario. These findings are confirmed in a preliminary evaluation using software- and hardware-based implementations of the 842 compression algorithm.
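A minimal back-of-envelope model shows when on-the-fly link compression pays off (illustrative numbers and parameter names, not measurements from the paper):

```python
def transfer_time(volume_bytes, link_bw, compress_tp=None, ratio=1.0):
    """Estimated transfer time in seconds.

    With on-the-fly compression the link only carries volume/ratio
    bytes, so the effective bandwidth is link_bw * ratio, but the
    stream can instead be throttled by the compressor's throughput
    compress_tp (bytes/s).
    """
    if compress_tp is None:
        return volume_bytes / link_bw          # uncompressed baseline
    effective_bw = min(link_bw * ratio, compress_tp)
    return volume_bytes / effective_bw
```

With an assumed 8 GB/s link, a compression ratio of 2, and a 20 GB/s compressor, moving 1 GB drops from 0.125 s to 0.0625 s; if the compressor throughput fell below the link bandwidth, the compressor would become the bottleneck instead.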
This is a correction notice for ‘Post-adiabatic supernova remnants in an interstellar magnetic field: oblique shocks and non-uniform environment’ (DOI: https://doi.org/10.1093/mnras/sty1750), which was published in MNRAS 479, 4253–4270 (2018). The publisher regrets to inform that the colour was missing from the colour scales in Figs 8(a)–(d) and Figs 9(a) and (b). This has now been corrected online. The publisher apologizes for this error.
High-throughput RNA sequencing produces large gene expression datasets whose analysis leads to a better understanding of diseases like cancer. The nature of RNA-Seq data poses challenges to its analysis in terms of its high dimensionality, noise, and the complexity of the underlying biological processes. Researchers apply traditional machine learning approaches, e.g. hierarchical clustering, to analyze this data. Until the validation of the results, however, the analysis is based on the provided data only and completely misses the biological context. Yet gene expression data follows particular patterns: the underlying biological processes. In our research, we aim to integrate the available biological knowledge earlier in the analysis process. We want to adapt state-of-the-art data mining algorithms to consider the biological context in their computations and deliver meaningful results for researchers.
High-throughput RNA sequencing (RNAseq) produces large data sets containing expression levels of thousands of genes. The analysis of RNAseq data leads to a better understanding of gene functions and interactions, which eventually helps to study diseases like cancer and develop effective treatments. Large-scale RNAseq expression studies on cancer comprise samples from multiple cancer types and aim to identify their distinct molecular characteristics. Analyzing samples from different cancer types implies analyzing samples from different tissue origin. Such multi-tissue RNAseq data sets require a meaningful analysis that accounts for the inherent tissue-related bias: The identified characteristics must not originate from the differences in tissue types, but from the actual differences in cancer types. However, current analysis procedures do not incorporate that aspect. As a result, we propose to integrate a tissue-awareness into the analysis of multi-tissue RNAseq data. We introduce an extension for gene selection that provides a tissue-wise context for every gene and can be flexibly combined with any existing gene selection approach. We suggest to expand conventional evaluation by additional metrics that are sensitive to the tissue-related bias. Evaluations show that especially low complexity gene selection approaches profit from introducing tissue-awareness.
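The tissue-wise context can be pictured as standardising each gene's expression within its tissue before ranking, so that tissue-of-origin offsets cancel out. The following is a sketch of that idea with invented helper names, not the authors' actual extension:

```python
from statistics import mean, stdev

def tissue_wise_scores(expression, tissues):
    """Standardise one gene's per-sample expression within each tissue.

    `expression` holds the gene's value per sample, `tissues` the
    matching tissue label per sample.  Downstream gene selection on the
    returned scores ranks by within-tissue variation rather than by
    differences between tissue types.
    """
    scores = []
    for value, tissue in zip(expression, tissues):
        peers = [e for e, t in zip(expression, tissues) if t == tissue]
        mu = mean(peers)
        sd = stdev(peers) if len(peers) > 1 else 1.0
        scores.append((value - mu) / (sd or 1.0))
    return scores
```

A gene whose expression differs only between tissues (but not within them) scores near zero everywhere and would no longer be falsely selected as a cancer-type marker.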
In the course of patient treatments, psychotherapists aim to meet the challenges of being both a trusted, knowledgeable conversation partner and a diligent documentalist. We are developing the digital whiteboard system Tele-Board MED (TBM), which allows the therapist to take digital notes during the session together with the patient. This study investigates what therapists experience when they document with TBM in patient sessions for the first time, and whether this documentation saves them time when writing official clinical documents. As the core of this study, we conducted four anamnesis session dialogues with behavior psychotherapists and volunteers acting in the role of patients. Following a mixed-method approach, the data collection and analysis involved self-reported emotion samples, user experience curves and questionnaires. We found that even in the very first patient session with TBM, therapists come to feel comfortable, develop a positive feeling and can concentrate on the patient. Regarding administrative documentation tasks, we found that with the TBM report-generation feature, therapists save 60% of the time they normally spend on writing case reports for the health insurance.
An Information System Supporting the Eliciting of Expert Knowledge for Successful IT Projects
(2018)
In order to guarantee the success of an IT project, a company must possess expert knowledge. The difficulty arises when experts no longer work for the company and their knowledge is then needed to realise an IT project. In this paper, the ExKnowIT information system, which supports the eliciting of expert knowledge for successful IT projects, is presented. It consists of the following modules: (1) the identification of experts for successful IT projects, (2) the eliciting of expert knowledge on completed IT projects, (3) the expert knowledge base on completed IT projects, (4) the Group Method of Data Handling (GMDH) algorithm, and (5) new knowledge in support of decisions regarding the selection of a manager for a new IT project. The added value of our system is that three approaches, namely the elicitation of expert knowledge, the success of an IT project, and the discovery of new knowledge gleaned from the expert knowledge base (otherwise known as the decision model), complement each other.
Gamma-ray bursts (GRBs) are some of the Universe’s most enigmatic and exotic events. However, at energies above 10 GeV their behaviour remains largely unknown. Although space-based telescopes such as the Fermi-LAT have been able to detect GRBs in this energy range, their photon statistics are limited by the small detector size. Such limitations are not present in ground-based gamma-ray telescopes such as the H.E.S.S. experiment, which has now entered its second phase with the addition of a large 600 m² telescope to the centre of the array. Such a large telescope allows H.E.S.S. to access the sub-100-GeV energy range while still maintaining a large effective collection area, helping to potentially probe the short-timescale emission of these events.
We present a description of the H.E.S.S. GRB observation programme, summarising the performance of the rapid GRB repointing system and the conditions under which GRB observations are initiated. Additionally, we report on the GRB follow-ups made during the 2014–15 observation campaigns.
Factory Innovation Award
(2023)
Once again, the Hannover Messe brought together the leaders of industry to honour the year's pioneering innovations with the coveted Factory Innovation Award 2023. This renowned prize, presented for the first time on the Industrial Transformation Stage, marked the high point of a suspense-filled event.
Climate change entails an intensification of extreme weather events that can potentially trigger socioeconomic and energy system disruptions. As we approach 1 °C of global warming, we should start learning from historical extremes and explicitly incorporate such events in integrated climate-economy and energy systems models.
Predictive coding and its generalization to active inference offer a unified theory of brain function. The underlying predictive processing paradigm has gained significant attention in artificial intelligence research for its representation learning and predictive capacity. Here, we suggest that it is possible to integrate human and artificial generative models with a predictive coding network that processes sensations simultaneously with the signature of predictive coding found in human neuroimaging data. We propose a recurrent hierarchical predictive coding model that predicts low-dimensional representations of stimuli, electroencephalogram and physiological signals with variational inference. We suggest that in a shared environment, such hybrid predictive coding networks learn to incorporate the human predictive model in order to reduce prediction error. We evaluate the model on a publicly available EEG dataset of subjects watching one-minute-long video excerpts. Our initial results indicate that the model can be trained to predict visual properties such as the number, distance and motion of human subjects in videos.
We describe how inversion symmetry separation of electronic state manifolds in resonant inelastic soft X-ray scattering (RIXS) can be applied to probe excited-state dynamics with compelling selectivity. In a case study of Fe L3-edge RIXS in the ferricyanide complex [Fe(CN)6]3−, we demonstrate with multi-configurational restricted active space spectrum simulations how the information content of RIXS spectral fingerprints can be used to unambiguously separate species of different electronic configurations, spin multiplicities, and structures, with possible involvement in the decay dynamics of photo-excited ligand-to-metal charge-transfer. Specifically, we propose that this could be applied to confirm or reject the presence of a hitherto elusive transient quartet species. Thus, RIXS offers a particular possibility to settle a recent controversy regarding the decay pathway, and we expect the technique to be similarly applicable in other model systems of photo-induced dynamics.
Surface acoustic wave (SAW) devices are well known for gravimetric sensor applications. In biosensing applications, chemically and biochemically evoked adsorption processes at surfaces are detected in liquid environments using delay-line or resonator sensor configurations, preferably in combination with appropriate microfluidic devices. In this paper, a novel SAW-based impedance sensor type is introduced which uses only one interdigital electrode transducer (IDT) simultaneously as SAW generator and sensor element. It is shown that the amplitude of the reflected S11 signal directly depends on the input impedance of the SAW device. The input impedance is strongly influenced by mass adsorption, which causes a characteristic and measurable impedance mismatch.
In the present study, the charge distribution and the charge transport across the thickness of 2- and 3-dimensional polymer nanodielectrics were investigated. Chemically surface-treated polypropylene (PP) films and low-density polyethylene nanocomposite films with 3 wt% of magnesium oxide (LDPE/MgO) served as examples of 2-D and 3-D nanodielectrics, respectively. Surface charges were deposited onto the non-metallized surfaces of the one-side metallized polymer films and found to broaden and thus enter the bulk of the films upon thermal stimulation at suitably elevated temperatures. The resulting space-charge profiles in the thickness direction were probed by means of Piezoelectrically-generated Pressure Steps (PPSs). It was observed that the chemical surface treatment of PP, which leads to the formation of nano-structures, and the use of bulk nanoparticles in LDPE/MgO nanocomposites enhance charge trapping on or in the respective polymer films and also reduce charge transport inside the respective samples.
Published results on LDPE/MgO nanocomposites (3 wt%) show that they promise to be good electrical-insulation materials. In this work, the nanocomposites are examined as a potential (ferro-)electret material as well. Isothermal surface-potential decay measurements show that charged LDPE/MgO films still exhibit significant surface potentials after heating for 4 hours at 80 °C, which suggests good capabilities of LDPE/MgO nanocomposites to hold electric charges of both polarities. Open-tubular-channel ferroelectrets prepared from LDPE/MgO nanocomposite films show significant piezoelectricity with d33 coefficients of about 20 pC/N after charging and are stable up to temperatures of at least 80 °C. Thus LDPE/MgO nanocomposites may become available as a new ferroelectret material. To increase their d33 coefficients, it is desirable to optimize the charging conditions and the ferroelectret structure.
The design of embedded systems is becoming continuously more complex such that efficient system-level design methods are becoming crucial. Recently, combined Answer Set Programming (ASP) and Quantifier Free Integer Difference Logic (QF-IDL) solving has been shown to be a promising approach in system synthesis. However, this approach still has several restrictions limiting its applicability. In the paper at hand, we propose a novel ASP modulo Theories (ASPmT) system synthesis approach, which (i) supports more sophisticated system models, (ii) tightly integrates the QF-IDL solving into the ASP solving, and (iii) makes use of partial assignment checking. As a result, more realistic systems are considered and an early exclusion of infeasible solutions improves the entire system synthesis.
An efficient Design Space Exploration (DSE) is imperative for the design of modern, highly complex embedded systems in order to steer the development towards optimal design points. The early evaluation of design decisions at system-level abstraction layer helps to find promising regions for subsequent development steps in lower abstraction levels by diminishing the complexity of the search problem. In recent works, symbolic techniques, especially Answer Set Programming (ASP) modulo Theories (ASPmT), have been shown to find feasible solutions of highly complex system-level synthesis problems with non-linear constraints very efficiently. In this paper, we present a novel approach to a holistic system-level DSE based on ASPmT. To this end, we include additional background theories that concurrently guarantee compliance with hard constraints and perform the simultaneous optimization of several design objectives. We implement and compare our approach with a state-of-the-art preference handling framework for ASP. Experimental results indicate that our proposed method produces better solutions with respect to both diversity and convergence to the true Pareto front.
Utilizing quad-trees for efficient design space exploration with partial assignment evaluation
(2018)
Recently, it has been shown that constraint-based symbolic solving techniques offer an efficient way for deciding binding and routing options in order to obtain a feasible system-level implementation. In combination with various background theories, a feasibility analysis of the resulting system may already be performed on partial solutions. That is, infeasible subsets of mapping and routing options can be pruned early in the decision process, which speeds up the solving accordingly. However, allowing a proper design space exploration including multi-objective optimization also requires an efficient structure for storing and managing non-dominated solutions. In this work, we propose and study the usage of the Quad-Tree data structure in the context of partial assignment evaluation during system synthesis. Our experiments show that unnecessary dominance checks can be avoided, which indicates a preference for Quad-Trees over a commonly used list-based implementation for large combinatorial optimization problems.
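The bookkeeping being optimized here is the archive of non-dominated solutions. A plain list-based baseline, as a sketch for intuition (the Quad-Tree variant in the paper prunes these pairwise checks by partitioning points into dominance quadrants), looks like this:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

class ParetoArchive:
    """List-based archive of non-dominated objective vectors."""

    def __init__(self):
        self.points = []

    def try_insert(self, p):
        # Dominated (or duplicate) candidates are rejected early,
        # which is what enables pruning on partial assignments.
        if any(dominates(q, p) or q == p for q in self.points):
            return False
        # Keep only points the newcomer does not dominate.
        self.points = [q for q in self.points if not dominates(p, q)]
        self.points.append(p)
        return True
```

Every `try_insert` scans the whole list; a Quad-Tree keyed by dominance quadrant lets most of these comparisons be skipped, which is exactly the saving the experiments quantify.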
Despite ample research, understanding plant spread and predicting plants' ability to track projected climate changes remain a formidable challenge. We modelled the spread of North American wind-dispersed trees in current and future (c. 2060) conditions, accounting for variation in 10 key dispersal, demographic and environmental factors affecting population spread. Predicted spread rates vary substantially among 12 study species, primarily due to inter-specific variation in maturation age, fecundity and seed terminal velocity. Future spread is predicted to be faster if atmospheric CO2 enrichment would increase fecundity and advance maturation, irrespective of the projected changes in mean surface windspeed. Yet, for only a few species, predicted wind-driven spread will match future climate changes, conditioned on seed abscission occurring only in strong winds and environmental conditions favouring high survival of the farthest-dispersed seeds. Because such conditions are unlikely, North American wind-dispersed trees are expected to lag behind the projected climate range shift.
Nanocarriers
(2017)
Acute ankle sprain leads in 40% of all cases to chronic ankle instability (CAI). CAI is related to a variety of motor adaptations at the lower extremities. Previous investigations identified increased muscle activities while landing in CAI compared to healthy control participants. However, it remains unclear whether muscular alterations at the knee muscles are limited to the involved (unstable) ankle or are also present at the uninvolved leg. The latter might potentially indicate a risk of ankle sprain or future injury on the uninvolved leg. Purpose: To assess if there is a difference of knee muscle activities between the involved and uninvolved leg in participants with CAI during perturbed walking. Method: 10 participants (6 females; 4 males; 26±4 years; 169±9 cm; 65±7 kg) with unilateral CAI walked on a split-belt treadmill (1 m/s) for 5 minutes of baseline walking and 6 minutes of perturbed walking (left and right side, 10 perturbations each). Electromyography (EMG) measurements were performed at biceps femoris (BF) and rectus femoris (RF). EMG amplitudes (RMS; normalized to MVIC) were analyzed for 200 ms pre-heel contact (Pre200), 100 ms post-heel contact (Post100) and 200 ms after perturbation (Pert200). Data were analyzed by paired t-test/Wilcoxon test based on presence or absence of normal distribution (Bonferroni-adjusted α level p≤0.0125). Results: No statistical difference was found between involved and uninvolved leg for RF (Pre200: 4±2% and 11±22%, respectively, p=0.878; Post100: 10±5% and 18±31%, p=0.959; Pert200: 6±3% and 13±24%, p=0.721) as well as for BF (Pre200: 12±7% and 11±6%, p=0.576; Post100: 10±7% and 9±7%, p=0.732; Pert200: 7±4% and 7±7%, p=0.386). Discussion: No side differences in muscle activity could be revealed for the assessed feedforward and feedback responses (perturbed and unperturbed) in unilateral CAI. Reduced inter-individual variability of muscular activities at the involved leg might indicate a rather stereotypical response pattern.
It remains to be investigated whether muscular control at the knee is not affected by CAI, or whether both sides adapted in a similar way to the chronic condition at the ankle.
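The statistical procedure in the abstract, a paired comparison with a Bonferroni-adjusted significance level, can be sketched as follows (illustrative only, not the study's analysis code; the EMG values below are hypothetical, and the abstract's α ≤ 0.0125 corresponds to 0.05 divided by 4 comparisons):

```python
import statistics

def paired_t(x, y):
    """t statistic for paired samples: t = mean(d) / (sd(d) / sqrt(n))."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / n ** 0.5)

def bonferroni_alpha(alpha, m):
    """Per-comparison significance level for m comparisons."""
    return alpha / m

# Hypothetical EMG amplitudes (% MVIC), involved vs. uninvolved leg:
involved   = [4, 10, 6, 12, 10, 7]
uninvolved = [11, 18, 13, 11, 9, 7]
t = paired_t(involved, uninvolved)
alpha_adj = bonferroni_alpha(0.05, 4)   # 0.0125, as in the abstract
```

The resulting t statistic would then be compared against the t distribution with n−1 degrees of freedom, and a difference is declared significant only if its p-value falls below the adjusted α.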
Introduction
(2018)
Microstructure Characterisation of Advanced Materials via 2D and 3D X-Ray Refraction Techniques
(2018)
3D imaging techniques have an enormous potential to understand the microstructure, its evolution, and its link to mechanical, thermal, and transport properties. In this conference paper we report the use of a powerful, yet not so widespread, set of X-ray techniques based on refraction effects. X-ray refraction allows determining the internal specific surface (surface per unit volume) in a non-destructive, position- and orientation-sensitive fashion, and with nanometric detectability. We demonstrate showcases of ceramics and composite materials, where microstructural parameters could be achieved in a way unrivalled even by high-resolution techniques such as electron microscopy or computed tomography. We present in situ analysis of the damage evolution in an Al/Al2O3 metal matrix composite during tensile load and the identification of void formation (different kinds of defects, particularly unsintered powder hidden in pores, and small inhomogeneities such as cracks) in Ti64 parts produced by selective laser melting using synchrotron X-ray refraction radiography and tomography.