Rebuilding an Austrian Army
(2019)
After the Second World War, a new Austrian Army (the Bundesheer) was formed to guarantee the country's armed neutrality. But the period between 1938 and 1945 remained a point of contention. While some Austrian officers had been sidelined, the majority had served in the Wehrmacht and thus shared experiences and soldierly values. As Cold War realities necessitated a professional, experienced army, a group around Erwin Fussenegger (1908–1986) dominated the new Bundesheer, and deliberations on reforming the military culture and value system were postponed. At the same time, the Bundesheer managed to avoid becoming a mere continuation of the Wehrmacht.
While the shirt-sleeved reporter merely writes for the day, the literary author creates texts for eternity. Two authors who have been successful in both fields deliberately break with this cliché, handed down over centuries: Joseph Roth and Tom Wolfe, journalists and literary writers in one, question the difference in esteem commonly drawn between the two fields. Fanny Opitz traces the discursive environments in which the debate about the merits of journalism and literature was conducted at length: in the Germany of the 1920s, in the context of the broader cultural current of Neue Sachlichkeit (New Objectivity), and in the US-American New Journalism of the 1960s and 1970s.
Objective: To compare the diagnostic accuracy of three depression screening instruments in patients with coronary heart disease (CHD).
Methods: 1019 CHD patients completed the Patient Health Questionnaire (PHQ-9 and PHQ-2) and the Hospital Anxiety and Depression Scale (HADS-D), as well as a clinical interview (Composite International Diagnostic Interview) as the reference standard.
Results: Regarding diagnostic accuracy, the PHQ-9 and HADS-D were superior to the PHQ-2. Optimal cut-off values were 7 (PHQ-9 and HADS-D) and 2 (PHQ-2).
Conclusion: The PHQ-9 and HADS-D have comparable discriminatory power for depressive disorders in CHD patients.
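Optimal screening cut-offs like those reported here are commonly chosen by maximizing the Youden index (sensitivity + specificity - 1) against the reference standard. A minimal sketch on purely synthetic, hypothetical scores (the distributions below are invented for illustration, not the study data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical synthetic screening scores on a 0-27 scale (PHQ-9-like);
# interview-positive cases score higher on average than non-cases.
cases = np.clip(rng.normal(12, 4, 300).round(), 0, 27)
controls = np.clip(rng.normal(4, 3, 700).round(), 0, 27)
scores = np.concatenate([cases, controls])
truth = np.concatenate([np.ones(300), np.zeros(700)])  # 1 = interview-positive

def youden_cutoff(scores, truth):
    """Return the cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    best_c, best_j = None, -1.0
    for c in np.arange(scores.min(), scores.max() + 1):
        pred = scores >= c                     # screen positive at or above cut-off
        sens = pred[truth == 1].mean()
        spec = (~pred)[truth == 0].mean()
        j = sens + spec - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

cutoff, j = youden_cutoff(scores, truth)
```

In practice the choice of cut-off also weighs the relative costs of false negatives and false positives, so the Youden-optimal value is only a starting point.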
In September 2017, the IceCube Neutrino Observatory recorded a very-high-energy neutrino in directional coincidence with a blazar in an unusually bright gamma-ray state, TXS 0506+056 (refs. 1, 2). Blazars are prominent photon sources in the Universe because they harbour a relativistic jet whose radiation is strongly collimated and amplified. High-energy atomic nuclei known as cosmic rays can produce neutrinos; thus, the recent detection may help in identifying the sources of the diffuse neutrino flux (ref. 3) and of the energetic cosmic rays. Here we report a self-consistent analysis of the physical relation between the observed neutrino and the blazar, in particular the time evolution and spectral behaviour of the neutrino and photon emission. We demonstrate that a moderate enhancement in the number of cosmic rays during the flare can yield a very strong increase in the neutrino flux, which is limited by co-produced hard X-rays and teraelectronvolt gamma rays. We also test typical radiation models (refs. 4, 5) for compatibility and identify several model classes (refs. 6, 7) as incompatible with the observations. We investigate to what degree the findings can be generalized to the entire population of blazars, determine the relation between their output in photons, neutrinos and cosmic rays, and suggest how to optimize the strategy of future observations.
We show that elliptic complexes of (pseudo) differential operators on smooth compact manifolds with boundary can always be complemented to a Fredholm problem by boundary conditions involving global pseudodifferential projections on the boundary (similarly as the spectral boundary conditions of Atiyah, Patodi, and Singer for a single operator). We prove that boundary conditions without projections can be chosen if, and only if, the topological Atiyah-Bott obstruction vanishes. These results make use of a Fredholm theory for complexes of operators in algebras of generalized pseudodifferential operators of Toeplitz type which we also develop in the present paper.
Selenoneine and ergothioneine in human blood cells determined simultaneously by HPLC/ICP-QQQ-MS
(2019)
The possible relevance to human health of selenoneine and its sulfur analogue ergothioneine has generated interest in their quantitative determination in biological samples. To gain more insight into the similarities and differences of these two species, a method for their simultaneous quantitative determination in human blood cells using reversed-phase high performance liquid chromatography (RP-HPLC) coupled to inductively coupled plasma triple quadrupole mass spectrometry (ICP-QQQ-MS) is presented. Spectral interferences hampering the determination of sulfur and selenium by ICP-MS are overcome by introducing oxygen to the reaction cell. To access selenoneine and ergothioneine in the complex blood matrix, lysis of the cells with cold water followed by cut-off filtration (3000 Da) is performed. Recoveries based on blood cells spiked with selenoneine and ergothioneine were between 80% and 85%. The standard deviation of the method was around 0.10 mg S per L for ergothioneine (corresponding to relative standard deviations (RSD) between 10% and 1% for ergothioneine concentrations of 1-10 mg S per L) and 0.25 µg Se per L for selenoneine (RSDs of 25% to 2% for concentrations of 1-10 µg Se per L). The method was applied to blood cell samples from three volunteers, which showed selenoneine and ergothioneine concentrations in the ranges of 3.25 to 7.35 µg Se per L and 0.86 to 6.44 mg S per L, respectively. The method is expected to be of wide use in future studies investigating the dietary uptake of selenoneine and ergothioneine and their relevance to human health.
Winter is an important season for many limnological processes, which range from biogeochemical transformations to ecological interactions. Interest in the structure and function of lake ecosystems under ice is on the rise. Although limnologists working at polar latitudes have a long history of winter work, the knowledge required to sample successfully under winter conditions is not widely available, and relatively few limnologists receive formal training. In particular, the deployment and operation of equipment at temperatures below 0 °C pose considerable logistical and methodological challenges, as do the safety risks of sampling during the ice-covered period. Here, we consolidate information on winter lake sampling and describe effective methods to measure physical, chemical, and biological variables in and under ice. We describe variation in snow and ice conditions and discuss implications for sampling logistics and safety. We outline commonly encountered methodological challenges and make recommendations for best practices to maximize safety and efficiency when sampling through ice or deploying instruments in ice-covered lakes. Application of such practices over a broad range of ice-covered lakes will contribute to a better understanding of the factors that regulate lakes during winter and of how winter conditions affect the subsequent ice-free period.
Methylmercury (MeHg), an abundant environmental pollutant, has long been known to adversely affect neurodevelopment in both animals and humans. Several reports from epidemiological studies, as well as experimental data, indicate sex-specific susceptibility to this neurotoxicant; however, the molecular bases of this process are still not clear. In the present study, we used Caenorhabditis elegans (C. elegans) to investigate sex differences in the response to MeHg toxicity during development. Worms at different developmental stages (L1, L4, and adult) were treated with MeHg for 1 h. Lethality assays revealed that male worms exhibited significantly higher resistance to MeHg than hermaphrodites at the L4 and adult stages. However, the number of worms with degenerated neurons was unaffected by MeHg, both in males and hermaphrodites. The lower susceptibility of males was not related to changes in mercury (Hg) accumulation, which was analogous for both the wild-type (wt) and the male-rich him-8 strain. Total glutathione (GSH) levels decreased upon MeHg treatment in him-8, but not in wt. Moreover, a sex-dependent response of the cytoplasmic thioredoxin system was observed: males exhibited significantly higher expression of the thioredoxin TRX-1, and thioredoxin reductase TRXR-1 expression was downregulated upon MeHg treatment only in hermaphrodites. These outcomes indicate that redox status is an important contributor to sex-specific sensitivity to MeHg in C. elegans.
Non-predatory mortality of zooplankton provides an abundant yet little-studied source of high-quality labile organic matter (LOM) in aquatic ecosystems. Using laboratory microcosms, we followed the decomposition of organic carbon from fresh 13C-labelled Daphnia carcasses by natural bacterioplankton. The experimental setup comprised blank microcosms, that is, artificial lake water without any organic matter additions (B), and microcosms amended with natural humic matter (H), fresh Daphnia carcasses (D), or both humic matter and Daphnia carcasses (HD). Most of the carcass carbon was consumed and respired by the bacterial community within 15 days of incubation. A shift in the bacterial community composition, shaped by labile carcass carbon and by humic matter, was observed. Nevertheless, we did not observe a quantitative change in humic matter degradation by heterotrophic bacteria in the presence of LOM derived from carcasses. However, carcasses were the main factor driving the bacterial community composition, suggesting that the presence of large quantities of dead zooplankton might affect carbon cycling in aquatic ecosystems. Our results imply that organic matter derived from zooplankton carcasses is efficiently remineralized by a highly specific bacterial community, but does not interfere with the bacterial turnover of more refractory humic matter.
We continue our study of invariant forms of the classical equations of mathematical physics, such as the Maxwell equations or the Lamé system, on manifolds with boundary. To this end we interpret them in terms of the de Rham complex at a certain step. Using the structure of the complex, we gain insight that allows us to predict a degeneracy deeply encoded in the equations. In the present paper we develop an invariant approach to the classical Navier-Stokes equations.
Bacterial pore-forming toxins compromise plasmalemmal integrity, leading to Ca2+ influx, leakage of the cytoplasm, and cell death. Such lesions can be repaired by microvesicular shedding or by the endocytic uptake of the injured membrane sites. Cells have at their disposal an entire toolbox of repair proteins for the identification and elimination of membrane lesions. Sphingomyelinases catalyze the breakdown of sphingomyelin into ceramide and phosphocholine. Sphingomyelin is predominantly localized in the outer leaflet, where it is hydrolyzed by acid sphingomyelinase (ASM) after lysosomal fusion with the plasma membrane. The magnesium-dependent neutral sphingomyelinase (NSM)-2 is found at the inner leaflet of the plasmalemma. Because both sphingomyelinases have been ascribed a role in the cellular stress response, we investigated their role in plasma membrane repair and cellular survival after treatment with the pore-forming toxins listeriolysin O (LLO) or pneumolysin (PLY). Jurkat T cells, in which ASM or NSM-2 was down-regulated [ASM knockdown (KD) or NSM-2 KD cells], showed inverse reactions to toxin-induced membrane damage: ASM KD cells displayed reduced toxin resistance, decreased viability, and defects in membrane repair. In contrast, the down-regulation of NSM-2 led to an increase in viability and enhanced plasmalemmal repair. Yet, in addition to the increased plasmalemmal repair, the enhanced toxin resistance of NSM-2 KD cells also appeared to be dependent on the activation of p38/MAPK, which was constitutively activated, whereas in ASM KD cells, the p38/MAPK activation was constitutively blunted. Schoenauer, R., Larpin, Y., Babiychuk, E. B., Drucker, P., Babiychuk, V. S., Avota, E., Schneider-Schaulies, S., Schumacher, F., Kleuser, B., Koffel, R., Draeger, A. Down-regulation of acid sphingomyelinase and neutral sphingomyelinase-2 inversely determines the cellular resistance to plasmalemmal injury by pore-forming toxins.
We develop a numerical approach to reconstruct the phase dynamics of driven or coupled self-sustained oscillators. Employing a simple algorithm for computation of the phase of a perturbed system, we construct numerically the equation for the evolution of the phase. Our simulations demonstrate that the description of the dynamics solely by phase variables can be valid for rather strong coupling strengths and large deviations from the limit cycle. Coupling functions depend crucially on the coupling and are generally non-decomposable in phase response and forcing terms. We also discuss the limitations of the approach. Published under license by AIP Publishing.
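The general recipe of reconstructing a phase model from trajectories can be sketched generically: integrate two weakly coupled limit-cycle oscillators, take the polar angle as a simple phase estimate, and fit the phase velocity to a Fourier series in the phase difference by least squares. This is a minimal illustration under invented parameters, not the authors' algorithm:

```python
import numpy as np

def simulate(eps=0.1, w1=1.0, w2=1.7, dt=1e-3, n=100_000):
    """Euler integration of two diffusively coupled Stuart-Landau oscillators."""
    z1, z2 = 1.0 + 0.0j, 0.5 + 0.5j
    ph1, ph2 = np.empty(n), np.empty(n)
    for i in range(n):
        dz1 = (1 + 1j * w1) * z1 - abs(z1) ** 2 * z1 + eps * (z2 - z1)
        dz2 = (1 + 1j * w2) * z2 - abs(z2) ** 2 * z2 + eps * (z1 - z2)
        z1 += dt * dz1
        z2 += dt * dz2
        ph1[i] = np.angle(z1)
        ph2[i] = np.angle(z2)
    return np.unwrap(ph1), np.unwrap(ph2)

def fit_phase_model(phi, psi, dt, order=3):
    """Least-squares fit of dphi/dt = omega + Fourier series in (psi - phi)."""
    dphi = np.gradient(phi, dt)
    d = psi - phi
    cols = [np.ones_like(d)]
    for k in range(1, order + 1):
        cols += [np.cos(k * d), np.sin(k * d)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, dphi, rcond=None)
    return coef  # coef[0]: natural frequency; coef[2]: sin(psi - phi) coupling

phi1, phi2 = simulate()
coef = fit_phase_model(phi1[5000:], phi2[5000:], 1e-3)  # drop the transient
```

For this oscillator the polar angle coincides with the asymptotic phase, so the fitted constant recovers the natural frequency and the sin term recovers the coupling strength; for general limit cycles a protophase-to-phase transformation would be needed first.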
River ecosystems receive and process vast quantities of terrestrial organic carbon, the fate of which depends strongly on microbial activity. Variation in and controls of processing rates, however, are poorly characterized at the global scale. In response, we used a peer-sourced research network and a highly standardized carbon-processing assay to conduct a global-scale field experiment at more than 1000 river and riparian sites. We found that Earth’s biomes have distinct carbon-processing signatures. Slow processing is evident across latitudes, whereas rapid rates are restricted to lower latitudes. Both the mean rate and its variability decline with latitude, suggesting temperature constraints toward the poles and greater roles for other environmental drivers (e.g., nutrient loading) toward the equator. These results and data set the stage for unprecedented “next-generation biomonitoring” by establishing baselines that help quantify environmental impacts on the functioning of ecosystems at a global scale.
In crop modeling and yield predictions, the heterogeneity of agricultural landscapes is usually not accounted for. This heterogeneity often arises from landscape elements such as forests, hedges, or single trees and shrubs that cast shadows. Shading from forested areas or shrubs affects transpiration, temperature, and soil moisture, all of which affect the crop yield in the adjacent arable land. Transitional gradients of solar irradiance can be described as a function of the distance to the zero line (edge), the cardinal direction, and the height of the trees. The magnitude of yield reduction in transition zones is highly influenced by solar irradiance, a factor that is not yet implemented in crop growth models at the landscape level. We present a spatially explicit model for shading caused by forested areas in agricultural landscapes. With increasing distance to the forest, solar irradiance and yield increase. Our model predicts that the shading effect of the forested areas occurs up to 15 m from the forest edge for the simulated wheat yields and up to 30 m for simulated maize. Moreover, we estimated the spatial extent of the transition zones to calculate the regional yield reduction caused by shading at forest edges, which amounted to 5% to 8% in an exemplary region.
Multidrug-resistant (MDR) Pseudomonas aeruginosa strains with strong biofilm potential and virulence factors are a serious threat for hospitalized patients with compromised immunity. In this study, 34 P. aeruginosa isolates of human origin (17 MDR and 17 non-MDR clinical isolates) were checked for biofilm formation potential in enriched and minimal media. The biofilms were detected using the crystal violet method and a modified software package of the automated VideoScan screening method. The cytotoxic potential of the isolates was also investigated on HepG2, LoVo and T24 cell lines using automated VideoScan technology. Pulsed-field gel electrophoresis revealed 10 PFGE types in MDR and 8 in non-MDR isolates. Although all isolates showed biofilm formation potential, strong biofilm formation was found more often in enriched than in minimal media. Eight MDR isolates showed strong biofilm potential in both enriched and minimal media by both detection methods. A strong direct correlation between the crystal violet and VideoScan methods was observed in identifying strong biofilm-forming isolates. A high cytotoxic effect was observed for 4 isolates in all cell lines used, while 6 other isolates showed a high cytotoxic effect on the T24 cell line only. A strong association of multidrug resistance with biofilm formation was found, as strong biofilms were observed significantly more often in MDR isolates (p-value < 0.05) than in non-MDR isolates. No significant association of cytotoxic potential with multidrug resistance or biofilm formation was found (p-value > 0.05). The MDR isolates showing significant cytotoxic effects and strong biofilm formation pose a serious threat for hospitalized patients with weak immune systems.
Tree species diversity can positively affect the multifunctionality of forests. This is why conifer monocultures of Scots pine and Norway spruce, widely promoted in Central Europe since the 18th and 19th centuries, are currently being converted into mixed stands with naturally dominant European beech. Biodiversity is expected to benefit from these mixtures compared to pure conifer stands due to increased abiotic and biotic resource heterogeneity. Evidence for this assumption is, however, largely lacking. Here, we investigated the diversity of vascular plants, bryophytes and lichens at the plot (alpha diversity) and at the landscape (gamma diversity) level in pure and mixed stands of European beech and conifer species (Scots pine, Norway spruce, Douglas fir) in four regions in Germany. We aimed to identify compositions of pure and mixed stands in a hypothetical forest landscape that can optimize gamma diversity of vascular plants, bryophytes and lichens within regions. Results show that gamma diversity of the investigated groups is highest when a landscape comprises different pure stands rather than tree species mixtures at the stand scale. Species mainly associated with conifers rely on light regimes that are only provided in pure conifer forests, whereas mixtures of beech and conifers are more similar to beech stands. Combining pure beech and pure conifer stands at the landscape scale can increase landscape-level biodiversity and conserve species assemblages of both stand types, while landscapes solely composed of stand-scale tree species mixtures could reduce the combined biodiversity of the investigated groups by 7% to 20%.
School-level and, above all, classroom-level implementation processes usually aim at the professionalization of teachers. The intended change in teaching begins with a desired change in teachers' attitudes and behaviors, which only then can lead to altered routines in everyday practice. The Stages of Concern model by Hall and Hord (2006) is one of the few model-based, standardized approaches for examining the individual perspective of teachers in the implementation process. Using this model, the present article examines the affective-cognitive engagement of the participants in the implementation process as well as its relations to various aspects of communication and of perceived development. Based on a sample of N = 66 teachers, we show that, in particular, the aspects frequency of cooperation, communication among the teaching staff, and team experience predict affective-cognitive engagement. This engagement, especially with the consequences of the innovation, in turn conditions the perceived development in the implementation process.
In the 2008/09 school year, mixed-age learning (Jahrgangsübergreifendes Lernen, JÜL) was made compulsory in Berlin's school entry phase, but not all schools adopted the reform. In this study, we examine how schools differ in the characteristics of their student body depending on how quickly and comprehensively they implemented JÜL. We assumed that, given JÜL's goal of using heterogeneity productively for learning, the reform was particularly attractive for schools with a heterogeneous student body. Heterogeneity was operationalized via the proportions of children with (a) a first language other than German and (b) an exemption from co-payments for learning materials. We further examined whether children's achievement in German and mathematics differed between schools. As expected, the results show that schools with a heterogeneous student body implemented JÜL quickly and sustainably. Over time, after controlling for the heterogeneity of the student body, no achievement differences between the schools were found. The results are discussed with respect to the conditions under which schools implement reforms and how JÜL can affect educational outcomes.
Merchants on modern e-commerce platforms face a highly competitive environment. They compete against each other using automated dynamic pricing and ordering strategies. Successfully managing both inventory levels as well as offer prices is a challenging task as (i) demand is uncertain, (ii) competitors strategically interact, and (iii) optimized pricing and ordering decisions are mutually dependent. We show how to derive optimized data-driven pricing and ordering strategies which are based on demand learning techniques and efficient dynamic optimization models. We verify the superior performance of our self-adaptive strategies by comparing them to different rule-based as well as data-driven strategies in duopoly and oligopoly settings. Further, to study and to optimize joint dynamic ordering and pricing strategies on online marketplaces, we built an interactive simulation platform. To be both flexible and scalable, the platform has a microservice-based architecture and allows handling dozens of competing merchants and streams of consumers with configurable characteristics.
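The demand-learning step in such strategies can be sketched in a toy setting: estimate a price-response model from historical sales and then price to maximize expected revenue. Everything below (exponential demand, all parameter values) is a hypothetical illustration, not the authors' platform or models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: Poisson daily demand with rate a * exp(-b * price)
a_true, b_true = 50.0, 0.30
prices = np.repeat(np.arange(2.0, 12.0), 200)   # historical price experiments
sales = rng.poisson(a_true * np.exp(-b_true * prices))

# Demand learning: log-linear least-squares fit of mean sales vs. price
grid = np.unique(prices)
mean_sales = np.array([sales[prices == p].mean() for p in grid])
X = np.column_stack([np.ones_like(grid), grid])
coef, *_ = np.linalg.lstsq(X, np.log(mean_sales), rcond=None)
a_hat, b_hat = np.exp(coef[0]), -coef[1]

# For exponential demand, expected revenue p * a * exp(-b * p) peaks at p = 1/b
p_star = 1.0 / b_hat
```

With a unit ordering cost c the optimum shifts to c + 1/b; competition and inventory constraints, as in the strategies above, would require a dynamic optimization on top of the learned demand model.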
Editorial
(2019)
Introduction
(2019)
Over the past decades, it has become more and more obvious that ongoing globalisation processes have substantial impacts on the natural environment. Studies reveal that intensified global economic relations have caused or accelerated dramatic changes in the Earth system, defined as the sum of our planet’s interacting physical, chemical, biological and human processes (Schellnhuber et al. 2004). Climate change, biodiversity loss, disrupted biogeochemical cycles, and land degradation are often cited as emblematic problems of global environmental change (Rockström et al. 2009; Steffen et al. 2015). In this context, the term Anthropocene has lately received widespread attention and gained some prominence in the academic literature.
Chronic non-specific back pain (CNLBP) is internationally among the most common pain phenomena and can be career-limiting for athletes. Nearly one third of annual training days lost are attributed to CNLBP. In the development of chronic pain, a multifactorial etiological model with a significant influence of psychosocial risk factors is well established. Although this is well researched in the general population, there is comparatively little work on it in sports science. This topic is therefore taken up in three multicenter studies and numerous sub-studies of the MiSpEx network (Medicine in Spine-Exercise-Network, funding period 2011-2018). In line with the recommendation of early diagnosis of chronification factors in the German national care guideline for low back pain ("Nationale Versorgungsleitlinie Kreuzschmerz"), the network is concerned, among other things, with the review, development, and evaluation of diagnostic instruments. The present article describes the development of a diagnostic of psychosocial risk factors that allows both an assessment of the risk of developing CNLBP and an individual assignment to (training) interventions. The development rationale is described, and various methodological approaches and decision sequences are reflected upon.
REFS-D
(2019)
The aim of the present article is the psychometric evaluation and validation of a German-language version of the Referee Self-Efficacy Scale (REFS). The original English-language REFS measures referees' self-efficacy with the subscales game knowledge, decision making, pressure, and communication. The items were translated into German using translation-back-translation. The structure and psychometric properties of the German items were examined in a sample of 265 German-speaking soccer referees. Since the scale assignment proposed in the English-language original could not be replicated after translation into German, items with deficient scale properties were excluded from the German-language version (REFS-D). The analyses yield an eight-item scale with three subscales: game performance, pressure, and communication. The REFS-D shows satisfactory internal consistencies and significant medium-sized correlations with general self-efficacy. Despite some limitations, the REFS-D, as an economical scale, provides a starting point for future research.
This article examines whether, and to what extent, early language competencies predict numerical competencies. Numerical, expressive and receptive verbal, and grammatical abilities were assessed twice, three months apart, in 72 three-year-old children. Structural equation models show that language and numerical abilities are still barely distinct at this age. Numerical competencies, by contrast, already show high inter-individual developmental stability at this age. A substantial influence of language competence on the growth of mathematical competence in the fourth year of life could not be demonstrated. We discuss the results against the background of current theses on the relation between language and number in development.
Previous cross-modal priming studies showed that lexical decisions to words after a pronoun were facilitated when these words were semantically related to the pronoun's antecedent. These studies suggested that semantic priming effectively measured antecedent retrieval during coreference. We examined whether these effects extended to implicit reading comprehension using the N400 response. The results of three experiments did not yield strong evidence of semantic facilitation due to coreference. Further, the comparison with two additional experiments showed that N400 facilitation effects were reduced in sentences (vs. word pair paradigms) and were modulated by the case morphology of the prime word. We propose that priming effects in cross-modal experiments may have resulted from task-related strategies. More generally, the impact of sentence context and morphological information on priming effects suggests that they may depend on the extent to which the upcoming input is predicted, rather than automatic spreading activation between semantically related words.
Many machine learning problems can be characterized by mutual contamination models. In these problems, one observes several random samples from different convex combinations of a set of unknown base distributions and the goal is to infer these base distributions. This paper considers the general setting where the base distributions are defined on arbitrary probability spaces. We examine three popular machine learning problems that arise in this general setting: multiclass classification with label noise, demixing of mixed membership models, and classification with partial labels. In each case, we give sufficient conditions for identifiability and present algorithms for the infinite and finite sample settings, with associated performance guarantees.
Drawing on the whole of Humboldt's oeuvre, from its beginnings to the Cosmos, this dossier seeks to highlight the cosmopolitan orientation of the Prussian scholar and, above all, the American foundation of his approaches. For Humboldt, the American continent represents the diversity of the thinkable and the multi-relationality of the imaginable: the key to understanding his worldview.
General intelligence has a substantial genetic background in children, adolescents, and adults, but environmental factors also correlate strongly with cognitive performance, as evidenced by a strong (up to one SD) increase in average intelligence test results in the second half of the previous century. This change occurred in a period apparently too short to accommodate radical genetic changes. This strongly suggests that environmental factors interact with genotype, possibly by modifying epigenetic factors that regulate gene expression, and thus contribute to individual malleability. This modification might also be reflected in recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events.
In this paper we develop a general framework for constructing and analyzing coupled Markov chain Monte Carlo samplers, allowing for both (possibly degenerate) diffusions and piecewise deterministic Markov processes. For many performance criteria of interest, including the asymptotic variance, the task of finding efficient couplings can be phrased in terms of problems related to optimal transport theory. We investigate general structural properties, proving a singularity theorem that has both geometric and probabilistic interpretations. Moreover, we show that these problems can often be solved approximately, and we support our findings with numerical experiments. For the particular objective of estimating the variance of a Bayesian posterior, our analysis suggests using novel techniques in the spirit of antithetic variates. Addressing the convergence to equilibrium of coupled processes, we furthermore derive a modified Poincaré inequality.
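The classical antithetic-variates idea invoked here can be illustrated in plain Monte Carlo (the paper's construction couples the processes themselves, which is more involved): pairing each standard normal draw z with -z yields negatively correlated samples and a lower-variance estimator of E[f(Z)] for monotone f. A minimal sketch with f = exp, whose expectation under N(0, 1) is exp(1/2):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
f = np.exp                       # integrand; E[f(Z)] = exp(0.5) for Z ~ N(0, 1)

z = rng.standard_normal(n)
plain = f(z)                     # ordinary Monte Carlo samples
anti = 0.5 * (f(z) + f(-z))      # antithetic pairs (z, -z), negatively correlated

est_plain, est_anti = plain.mean(), anti.mean()
var_plain, var_anti = plain.var(), anti.var()   # per-sample variances
```

Here the per-sample variance drops from about 4.7 to about 1.5; even accounting for the doubled number of function evaluations per antithetic sample, the antithetic estimator is more efficient for this integrand.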
Influence of the Main Border Faults on the 3D Hydraulic Field of the Central Upper Rhine Graben
(2019)
The Upper Rhine Graben (URG) is an active rift with a high geothermal potential. Despite being a well-studied area, the three-dimensional interaction of the main controlling factors of the thermal and hydraulic regime is still not fully understood. Therefore, we have used a data-based 3D structural model of the lithological configuration of the central URG for some conceptual numerical experiments of 3D coupled simulations of fluid and heat transport. To assess the influence of the main faults bordering the graben on the hydraulic and the deep thermal field, we carried out a sensitivity analysis on fault width and permeability. Depending on the assigned width and permeability of the main border faults, fluid velocity and temperatures are affected only in the direct proximity of the respective border faults. Hence, the hydraulic characteristics of these major faults do not significantly influence the graben-wide groundwater flow patterns. Instead, the different scenarios tested provide a consistent image of the main characteristics of fluid and heat transport as they have in common: (1) a topography-driven basin-wide fluid flow perpendicular to the rift axis from the graben shoulders to the rift center, (2) a N/NE-directed flow parallel to the rift axis in the center of the rift and, (3) a pronounced upflow of hot fluids along the rift central axis, where the streams from both sides of the rift merge. This upflow axis is predicted to occur predominantly in the center of the URG (northern and southern model area) and shifted towards the eastern boundary fault (central model area).
Shear-induced platelet adherence and activation in an in-vitro dynamic multiwell-plate system
(2019)
Circulating blood cells are exposed to varying flow conditions when contacting cardiovascular devices. For a profound understanding of the complex interplay between blood components/cells and cardiovascular implant surfaces, testing under varying shear conditions is required. Here, we study the influence of arterial and venous shear conditions on the in vitro evaluation of the thrombogenicity of polymer-based implant materials. Medical-grade poly(dimethyl siloxane) (PDMS), polyethylene terephthalate (PET) and polytetrafluoroethylene (PTFE) films were included as reference materials. The polymers were exposed to whole blood from healthy humans. Blood was agitated orbitally at low (venous shear stress: 2.8 dyne cm(-2)) and high (arterial shear stress: 22.2 dyne cm(-2)) agitation speeds in a well-plate based test system. Numbers of non-adherent platelets, platelet activation (P-Selectin positive platelets), platelet function (PFA-100 closure times) and platelet adhesion (laser scanning microscopy, LSM) were determined. Microscopic data and counting of the circulating cells revealed increasing numbers of material-surface adherent platelets with increasing agitation speed. Activation of the platelets was also substantially increased under the high shear conditions (P-Selectin levels, PFA-100 closure times). At low agitation speed, the platelet densities did not differ between the three materials. At the high agitation speed, the lowest platelet densities were observed on PDMS, intermediate levels on PET and the highest on PTFE. While activation of the circulating platelets was affected by the implant surfaces in a similar manner, PFA-100 closure times did not reflect this trend. Differences in the thrombogenicity of the studied polymers were more pronounced when tested at high agitation speed due to the induced shear stresses.
Testing under varying shear stresses thus led to a different evaluation of implant thrombogenicity, which emphasizes the need for testing under various flow conditions. Our data further confirm earlier findings in which the same reference implants were tested under static (rather than dynamic) conditions and with fresh human platelet-rich plasma instead of whole blood. This supports the view that using common reference materials may improve inter-study comparisons, even under varying test conditions.
Non-swelling hydrophobic poly(n-butyl acrylate) network (cPnBA) is a candidate material for synthetic vascular grafts owing to its low toxicity and tailorable mechanical properties. Mesenchymal stem cells (MSCs) are an attractive cell type for accelerating endothelialization because of their superior anti-thrombosis and immune modulatory function. Further, they can differentiate into smooth muscle cells or endothelial-like cells and secrete pro-angiogenic factors such as vascular endothelial growth factor (VEGF). MSCs are sensitive to substrate mechanical properties, altering their major cellular behaviors and functions in response to substrate elasticity. Here, we cultured human adipose-derived mesenchymal stem cells (hADSCs) on cPnBAs with different mechanical properties (cPnBA250, Young’s modulus (E) = 250 kPa; cPnBA1100, E = 1100 kPa) matching the elasticity of native arteries, and investigated their cellular response to the materials, including cell attachment, proliferation, viability, apoptosis, senescence and secretion. The cPnBA allowed high cell attachment and showed negligible cytotoxicity. F-actin assembly of hADSCs decreased on cPnBA films compared to a classical tissue culture plate. The difference in cPnBA elasticity did not show dramatic effects on cell attachment, morphology, cytoskeleton assembly, apoptosis or senescence. Cells on cPnBA250, which showed a lower proliferation rate, had significantly higher VEGF secretion activity. These results demonstrated that tuning polymer elasticity to regulate human stem cells might be a potential strategy for constructing stem cell-based artificial blood vessels.
Packungen aus Kreisscheiben
(2019)
The English seafarer Sir Walter Raleigh once wondered how he could stack as many cannonballs as possible in his ship's hold. In 1611, Johannes Kepler subsequently formulated a conjecture about the optimal arrangement of the spheres. This conjecture would prove to be one of the hardest mathematical nuts in history. Even in the plane, densest packings of congruent circles are a challenge. In 1892 and 1910, Axel Thue published (criticized) proofs that the hexagonal circle packing is optimal. Only in 1940 did László Fejes Tóth finally deliver a watertight proof of this fact. A variant of the problem asks for
packings of finitely many congruent disks that minimize a certain quadratic energy: this intriguing geometric problem was posed by Tóth in 1967 and has still not been completely solved. In this contribution, the authors propose an original probabilistic method for constructing approximations of the solution in the plane.
In recent years, named entity linking (NEL) tools were primarily developed as general-purpose approaches, whereas today numerous tools focus on specific domains, such as the mapping of persons and organizations only, or the annotation of locations or events in microposts. However, the available benchmark datasets necessary for the evaluation of NEL tools do not reflect this specializing trend. We have analyzed the evaluation process applied in the NEL benchmarking framework GERBIL [in: Proceedings of the 24th International Conference on World Wide Web (WWW’15), International World Wide Web Conferences Steering Committee, Republic and Canton of Geneva, Switzerland, 2015, pp. 1133–1143; Semantic Web 9(5) (2018), 605–625] and all its benchmark datasets. Based on these insights we have extended the GERBIL framework to enable a more fine-grained evaluation and in-depth analysis of the available benchmark datasets with respect to different emphases. This paper presents the implementation of an adaptive filter for arbitrary entities and customized benchmark creation, as well as the automated determination of typical NEL benchmark dataset properties, such as the extent of content-related ambiguity and diversity. These properties are integrated on different levels, which also makes it possible to tailor customized new datasets out of the existing ones by remixing documents based on desired emphases. Besides a new system library to enrich provided NIF [in: International Semantic Web Conference (ISWC’13), Lecture Notes in Computer Science, Vol. 8219, Springer, Berlin, Heidelberg, 2013, pp. 98–113] datasets with statistical information, best practices for dataset remixing are presented, along with an in-depth analysis of the performance of entity linking systems on special-focus datasets.
We consider the dynamics of oscillators in a Kuramoto ensemble that are not included in the common synchronized cluster, where the mean field is subject to fluctuations. The fluctuations can either be related to the finite size of the ensemble or be superimposed on the mean field in the form of common noise due to the constructive features of the system. It is shown that the states of such oscillators with close natural frequencies appear correlated with each other, since the mean-field fluctuations act as common noise. We quantify the effect with the synchronization index of two oscillators, which is calculated numerically and analytically as a function of the frequency difference and the noise intensity. The results are rigorous for large ensembles with additional noise superimposed on the mean field, and are qualitatively true for systems where the mean-field fluctuations are due to the finite size of the ensemble. In the latter case, the effect is found to be independent of the number of oscillators in the ensemble.
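The common-noise mechanism described here can be reproduced qualitatively with a few lines of numerics (a toy model of our own, not the paper's exact setup): two uncoupled phase oscillators share a single noise realization entering through a sin(phase) sensitivity term, and their synchronization index drops as the frequency detuning grows.

```python
import numpy as np

def sync_index(detuning, sigma=0.8, dt=0.01, steps=50_000, seed=2):
    # Toy model (our illustration, not the paper's exact equations): two
    # phase oscillators driven by one COMMON noise realization, integrated
    # with the Euler-Maruyama scheme.
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(steps) * np.sqrt(dt)   # shared noise increments
    w1, w2 = 1.0, 1.0 + detuning
    p1, p2 = 0.0, 2.0                               # distinct initial phases
    diff = np.empty(steps)
    for k in range(steps):
        p1 += w1 * dt + sigma * np.sin(p1) * xi[k]
        p2 += w2 * dt + sigma * np.sin(p2) * xi[k]
        diff[k] = p1 - p2
    tail = diff[steps // 2:]                        # discard the transient
    return float(np.abs(np.exp(1j * tail).mean()))  # index in [0, 1]

print(sync_index(0.0))   # identical frequencies: index close to 1
print(sync_index(0.5))   # detuned oscillators: noticeably lower index
```

With zero detuning the common noise contracts the phase difference to zero, which is the noise-induced synchronization effect the abstract quantifies.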
Literary criticism, particularly ecocriticism, occupies an uneasy position with regard to activism: reading books (or plays, or poems) seems like a rather leisurely activity to be undertaking if our environment—our planet—is in crisis. And yet, critiquing the narratives that structure worlds and discourses is key to the activities of the (literary) critic in this time of crisis. If this crisis manifests as a ‘crisis of imagination’ (e.g. Ghosh), I argue that this is not so much a crisis of the absence of texts that address the environmental disaster, but rather a failure to comprehend the presences of the Anthropocene in the present. To interpret (literary) texts in this framework must entail acknowledging and scrutinising the extent of the incapacity of the privileged reader to comprehend the crisis as presence and present rather than spatially or temporally remote. The readings of the novels Carpentaria (2006) and The Swan Book (2013) by Waanyi writer Alexis Wright (Australia) trace the uneven presences of Anthropocenes in the present by way of bringing future worlds (The Swan Book) to the contemporary (Carpentaria). In both novels, protagonists must forge survival amongst ruins of the present and future: the depicted worlds, in particular the representations of the disenfranchisement of indigenous inhabitants of the far north of the Australian continent, emerge as a critique of the intersections of capitalist and colonial projects that define modernity and its impact on the global climate.
We have developed a method for deriving systems of closed equations for the dynamics of order parameters in the ensembles of phase oscillators. The Ott-Antonsen equation for the complex order parameter is a particular case of such equations. The simplest nontrivial extension of the Ott-Antonsen equation corresponds to two-bunch states of the ensemble. Based on the equations obtained, we study the dynamics of multi-bunch chimera states in coupled Kuramoto-Sakaguchi ensembles. We show an increase in the dimensionality of the system dynamics for two-bunch chimeras in the case of identical phase elements and a transition to one-bunch "Abrams chimeras" for imperfect identity (in the latter case, the one-bunch chimeras become attractive).
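For reference, the standard form of the Ott-Antonsen equation for the Kuramoto model with coupling strength K and a Lorentzian frequency distribution of half-width Delta centred at omega_0 (a textbook special case, not the extended equation hierarchy derived in this paper) is:

```latex
\dot{z} = (\,i\omega_0 - \Delta\,)\,z + \frac{K}{2}\bigl(z - \bar{z}\,z^{2}\bigr),
\qquad
z(t) = \lim_{N\to\infty}\frac{1}{N}\sum_{j=1}^{N} e^{\,i\varphi_j(t)},
```

where z is the complex order parameter. The multi-bunch equations of the paper generalize this single closed equation.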
We provide explicit examples of positive and power-bounded operators on c₀ and ℓ∞ which are mean ergodic but not weakly almost periodic. As a consequence we prove that a countably order complete Banach lattice on which every positive and power-bounded mean ergodic operator is weakly almost periodic is necessarily a KB-space. This answers several open questions from the literature. Finally, we prove that if T is a positive mean ergodic operator with zero fixed space on an arbitrary Banach lattice, then so is every power of T.
Composite actuators consisting of magnetic nanoparticles dispersed in a crystallizable multiphase polymer system can be remotely controlled by alternating magnetic fields (AMF). These actuators contain spatially segregated crystalline domains with chemically different compositions. Here, the crystalline domain associated with the lower melting transition range is responsible for actuation, while the crystalline domain associated with the higher melting transition range determines the geometry of the shape change. This paper reports magnetomechanical actuators which are based on a single crystalline domain of oligo(omega-pentadecalactone) (OPDL) along with covalently integrated iron(III) oxide nanoparticles (ioNPs). Different geometrical modes of actuation, such as a reversible change in length or twisting, were implemented by a magnetomechanical programming procedure. For an individual actuation mode, the degree of actuation could be tailored by variation of the magnetic field strength. This material design can easily be extended to composites containing other magnetic nanoparticles, e.g. with a high magnetic susceptibility.
By using synchrotron X-ray powder diffraction, the temperature-dependent phase diagram of the hybrid perovskite tri-halide compounds methyl ammonium lead iodide (MAPbI3, MA+ = CH3NH3+) and methyl ammonium lead bromide (MAPbBr3), as well as of their solid solutions, has been established. The existence of a large miscibility gap between 0.29 ≤ x ≤ 0.92 (±0.02) for the MAPb(I1−xBrx)3 solid solution has been proven. A systematic study of the lattice parameters for the solid solution series at room temperature revealed distinct deviations from Vegard's law. Furthermore, temperature-dependent measurements showed that, for iodine-rich compositions, the temperature dependency of the lattice parameters varies strongly with composition. In contrast, the bromine-rich compositions show an unusually low dependency of the phase transition temperature on the degree of substitution.
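For context, Vegard's law, the reference against which the reported deviations are measured, is simply the linear interpolation of the lattice parameter between the two end members:

```latex
a\bigl(\mathrm{MAPb(I}_{1-x}\mathrm{Br}_{x})_{3}\bigr)
\approx (1-x)\, a(\mathrm{MAPbI}_{3}) + x\, a(\mathrm{MAPbBr}_{3}),
```

so a deviation means the measured lattice parameter bows away from this straight line as a function of the bromine fraction x.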
HexagDLy is a Python library extending the PyTorch deep learning framework with convolution and pooling operations on hexagonal grids. It aims to ease the access to convolutional neural networks for applications that rely on hexagonally sampled data, as commonly found, for example, in ground-based astroparticle physics experiments.
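The core difficulty such a library has to solve is that a hexagonal neighbourhood is not a square stencil. A minimal illustration (plain Python using standard axial coordinates, independent of HexagDLy's actual API) of why hexagonal kernels differ from square ones:

```python
# Axial coordinates (q, r) for a hexagonal grid: each cell has exactly six
# nearest neighbours, unlike the square neighbourhoods that a standard
# Conv2d kernel assumes.
AXIAL_NEIGHBOURS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_neighbours(q, r):
    # The six cells a radius-1 hexagonal kernel would cover around (q, r)
    return [(q + dq, r + dr) for dq, dr in AXIAL_NEIGHBOURS]

def hex_distance(a, b):
    # Grid distance between two axial cells via the cube-coordinate formula
    dq, dr = a[0] - b[0], a[1] - b[1]
    return (abs(dq) + abs(dr) + abs(dq + dr)) // 2

print(hex_neighbours(0, 0))            # six cells, all at distance 1
print(hex_distance((0, 0), (2, -1)))   # -> 2
```

A hexagonal convolution then weights and sums exactly these six neighbours (plus the centre), which is what the library implements on top of PyTorch tensors.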
Shape-memory polymer actuators often contain crystallizable polyester segments. Here, the influence of accelerated hydrolytic degradation on the actuation performance of copolymer networks based on oligo(epsilon-caprolactone) dimethacrylate (OCL) and n-butyl acrylate is studied. The semi-crystalline OCL was utilized as crosslinker with molecular weights of 2.3 and 15.2 kg mol(-1) (ratio: 1:1 wt%), and n-butyl acrylate (25 wt% relative to OCL content) acted as softening agent creating the polymer main-chain segments within the network architecture. The copolymer networks were programmed by 50% elongation and were degraded by alkaline hydrolysis using sodium hydroxide solution (pH = 13). Experiments were performed at 40 degrees C, within the broad melting range of the actuators. The degradation of the test specimens was monitored via the sample mass, which was reduced by 25 wt% within 105 d. Degradation products, fragments of OCL with molecular masses ranging from 400 to 50,000 g mol(-1), could be detected by NMR spectroscopy and GPC measurements. The cleavage of ester groups included in the OCL segments resulted in a decrease of the melting temperature (T-m) related to the actuator domains (amorphous at the temperature of degradation) while, simultaneously, the T-m associated with the skeleton domain increased (semi-crystalline at the temperature of degradation). The alkaline hydrolysis decreased the polymer chain orientation of the OCL domains until a random alignment of crystalline domains was obtained. This result was confirmed by cyclic thermomechanical actuation tests. The performance of directed movements decreased almost linearly as a function of degradation time, resulting in the loss of functionality once the orientation of the polymer chains had disappeared. Here, the actuators were able to provide reversible movements for up to 91 d when the accelerated bulk degradation procedure using alkaline hydrolysis (pH = 13) was applied.
Accordingly, a lifetime of more than one year can be expected under physiological conditions (pH = 7.4) when, e.g., artificial muscles for biomimetic robots are addressed as a potential application for this kind of shape-memory polymer actuator.
The aim of this study is to examine the individual and school-related conditions of parental support at home for the school-related learning processes of lower secondary school students. Furthermore, it was examined to what extent this support is associated with changes in students' intrinsic motivation and academic self-concept. Beyond its longitudinal design, the study's contribution to the state of research lies in the analysis of possible moderators of these relationships. Questionnaire data from n = 157 students (mean age = 14.5) and their parents were used for the analyses. The central finding is that parents support their children at home when they perceive their own school engagement as useful. Support in the home environment is positively related to changes in intrinsic motivation. Both the class teacher's willingness to cooperate, as perceived by parents, and the diversity of parental engagement in the school environment moderate the relationship between parental support at home and academic self-concept. Limitations, such as bias in the parent sample, and practical implications are discussed.
Electronic health (e-health) is one of the most popular applications of information and communication technologies and has contributed immensely to health delivery through the provision of quality health services and ubiquitous access at lower cost. Even though this mode of health service is increasingly becoming known and used in developing nations, these countries face a myriad of challenges when implementing and deploying e-health services on both small and large scales. It is estimated that Africa alone carries the highest share of the global disease burden, despite a certain level of e-health adoption. This paper analyzes the progress so far and the current state of e-health in developing countries, particularly in Africa, and proposes a framework for further improvement.
SiO2 is the main component of silicate melts and thus controls their network structure and physical properties. The compressibility and viscosities of melts at depth are governed by their short-range atomic and electronic structure. We measured the O K-edge and the Si L2,3-edge in silica up to 110 GPa using X-ray Raman scattering spectroscopy, and found a striking match to calculated spectra based on structures from molecular dynamics simulations. Between 20 and 27 GPa, Si[4] species are converted into a mixture of Si[5] and Si[6] species, and between 60 and 70 GPa, Si[6] becomes dominant at the expense of Si[5], with no further increase up to at least 110 GPa. Coordination higher than 6 is only reached beyond 140 GPa, corroborating results from Brillouin scattering. Network-modifying elements in silicate melts may shift this change in coordination to lower pressures, and thus magmas could be denser than residual solids at the depth of the core-mantle boundary.
A significant percentage of urban traffic is caused by the search for parking spots. One possible approach to improve this situation is to guide drivers along routes which are likely to have free parking spots. The task of finding such a route can be modeled as a probabilistic graph problem which is NP-complete. Thus, we propose heuristic approaches for solving this problem and evaluate them experimentally. For this, we use probabilities of finding a parking spot which are based on publicly available empirical data from TomTom International B.V. Additionally, we propose a heuristic that relies exclusively on conventional road attributes. Our experiments show that this algorithm comes within a factor of 1.3 of the baseline in our cost measure. Finally, we complement our experiments with results from a field study, comparing the success rates of our algorithms against real human drivers.
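The probabilistic objective can be sketched as follows (a hypothetical minimal model; the edge probabilities, graph encoding and greedy rule are our own illustration, not the authors' algorithm): the chance of finding at least one free spot along a route is one minus the product of the per-edge miss probabilities, and a naive baseline extends the route greedily towards the most promising next edge.

```python
# Hypothetical sketch of the parking-search objective: p maps each street
# segment (edge id) to the probability of finding a free spot there.
def route_success(route, p):
    # P(at least one free spot) = 1 - prod over edges of (1 - p_e)
    miss = 1.0
    for e in route:
        miss *= 1.0 - p[e]
    return 1.0 - miss

def greedy_route(graph, p, length, start):
    # graph: node -> list of (neighbour, edge_id); walk `length` edges,
    # always taking the neighbouring edge with the highest spot probability
    route, node = [], start
    for _ in range(length):
        nxt, edge = max(graph[node], key=lambda ne: p[ne[1]])
        route.append(edge)
        node = nxt
    return route

p = {"a": 0.3, "b": 0.1, "c": 0.5}
print(route_success(["a", "b", "c"], p))   # 1 - 0.7 * 0.9 * 0.5 = 0.685
```

Real heuristics additionally have to trade this success probability off against detour cost, which is what makes the underlying graph problem hard.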
Hot subdwarf B (sdB) stars are evolved core-helium-burning stars that have lost most of their hydrogen envelope due to binary interaction on the red giant branch. As sdB stars in wide binary systems can only be created by stable Roche lobe overflow (RLOF), they are a great test sample to constrain the theoretical models for stable mass loss on the red giant branch. We present here the findings of a long-term monitoring program of wide sdB+MS binaries. We found two main features in the orbital parameters. The majority of the systems have eccentric orbits, with systems on longer orbital periods having a higher eccentricity. As these systems have undergone mass loss near the tip of the RGB, tidal circularisation theory predicts them to be circularised. Our observations suggest that efficient eccentricity-pumping mechanisms are active during the mass-loss phase. Secondly, we find a strong correlation between the mass ratio and the orbital period. Using binary evolution models, this relation is used to derive both an upper and a lower limit on the initial mass ratio at which RLOF will be stable. These limits depend on the core mass of the sdB progenitor.
This paper addresses the morpho-phonological, syntactic and pragmatic properties of postverbal subject constructions in Awing. Analogous to other inversion constructions in Bantu literature (Marten & Van der Wal 2014), Awing has a construction in which the subject occurs immediately after the verb, resulting in a subject or sentence focus interpretation. However in Awing, crucially, a VSX clause cannot host a subject marker, but must contain a certain le morpheme in sentence-initial position. Following Baker (2003) and Collins (2004), I argue that the subject marker triggers movement of the subject from Spec/vP, explaining why it is banned in VSX clauses. I further claim that although the subject is interpreted as focus, it is not in a lower focus phrase (Belletti 2004), but rather trapped in Spec/vP. Awing postverbal subject constructions also exhibit verb doubling: VSVO. I argue that verb doubling is due to Case requirement: In canonical SVO clauses the subject marker and the verb value the nominative and accusative Cases, respectively. In VSVO constructions, on the contrary, the verb values both nominative and accusative Cases, thus forcing syntax to spell out two copies of the same verb.
Audit - and then what?
(2019)
Current trends such as digital transformation, the Internet of Things, or Industry 4.0 are challenging the majority of learning factories. Regardless of whether it is a conventional learning factory, a model factory, or a digital learning factory, traditional approaches such as the monotonous execution of specific instructions do not satisfy learners' needs, market requirements, or, especially, current technological developments. Contemporary teaching environments need a clear strategy, a road to follow to successfully cope with the changes and develop towards digitized learning factories. This demand-driven necessity of transformation leads to another obstacle: assessing the status quo and developing and implementing adequate action plans. Within this paper, details of a maturity-based audit of the hybrid learning factory in the Research and Application Centre Industry 4.0, and a roadmap derived from it for the digitization of a learning factory, are presented.
Subject-oriented learning
(2019)
The transformation to a digitized company changes not only the work but also the social context for employees and requires, inter alia, new knowledge and skills from them. Additionally, individual action problems arise. This contribution proposes the subject-oriented learning theory, in which the employees' action problems are the starting point of training activities in learning factories. In this contribution, the subject-oriented learning theory is exemplified and the respective advantages for vocational training in learning factories are pointed out both theoretically and practically. In particular, the individual action problems of learners and the infrastructure are emphasized as starting points for learning processes and competence development.
High-dimensional data is particularly useful for data analytics research. In the healthcare domain, for instance, high-dimensional data analytics has been used successfully for drug discovery. Yet, in order to adhere to privacy legislation, data analytics service providers must guarantee anonymity for data owners. In the context of high-dimensional data, ensuring privacy is challenging because increased data dimensionality must be matched by an exponential growth in the size of the data to avoid sparse datasets. Syntactically, anonymising sparse datasets with methods that rely on statistical significance makes obtaining sound and reliable results a challenge. As such, strong privacy is only achievable at the cost of high information loss, rendering the data unusable for data analytics. In this paper, we make two contributions to addressing this problem from both the privacy and information loss perspectives. First, we show that by identifying dependencies between attribute subsets we can eliminate privacy-violating attributes from the anonymised dataset. Second, to minimise information loss, we employ a greedy search algorithm to determine and eliminate maximal partial unique attribute combinations. Thus, one only needs to find the minimal set of identifying attributes to prevent re-identification. Experiments on a health cloud based on the SAP HANA platform, using a semi-synthetic medical history dataset comprising 109 attributes, demonstrate the effectiveness of our approach.
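The greedy idea of eliminating unique attribute combinations can be sketched in a few lines (our own toy version, not the paper's exact algorithm or the SAP HANA implementation): drop attributes until no record's quasi-identifier combination is unique, preferring the drop that removes the most unique combinations.

```python
from collections import Counter

# Toy sketch (our own greedy, inspired by but not identical to the paper's
# method): a record is re-identifiable if its attribute combination is
# unique; drop attributes until no such combination remains.
def unique_rows(rows, attrs):
    keys = [tuple(r[a] for a in attrs) for r in rows]
    counts = Counter(keys)
    return sum(1 for k in keys if counts[k] == 1)

def greedy_anonymise(rows, attrs):
    attrs = list(attrs)
    while attrs and unique_rows(rows, attrs) > 0:
        # drop the attribute whose removal leaves the fewest unique rows
        best = min(attrs,
                   key=lambda a: unique_rows(rows, [x for x in attrs if x != a]))
        attrs.remove(best)
    return attrs                      # attributes safe to publish together

rows = [
    {"zip": "10115", "age": 34, "sex": "f"},
    {"zip": "10115", "age": 34, "sex": "m"},
    {"zip": "10117", "age": 34, "sex": "f"},
    {"zip": "10117", "age": 34, "sex": "m"},
]
print(greedy_anonymise(rows, ["zip", "age", "sex"]))
```

On this toy table all three attributes together single out every record, but dropping "zip" leaves each remaining combination shared by two records.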
Introduction to CTA Science
(2019)
Ground-based gamma-ray astronomy is a young field with enormous scientific potential. The possibility of astrophysical measurements at teraelectronvolt (TeV) energies was demonstrated in 1989 with the detection of a clear signal from the Crab nebula above 1 TeV with the Whipple 10 m imaging atmospheric Cherenkov telescope (IACT). Since then, the instrumentation for, and techniques of, astronomy with IACTs have evolved to the extent that a flourishing new scientific discipline has been established, with the detection of more than 150 sources and a major impact in astrophysics and more widely in physics. The current major arrays of IACTs, H.E.S.S., MAGIC, and VERITAS, have demonstrated the huge physics potential at these energies as well as the maturity of the detection technique. Many astrophysical source classes have been established, some with many well-studied individual objects, but there are indications that the known sources represent the tip of the iceberg in terms of both individual objects and source classes. The Cherenkov Telescope Array (CTA) will transform our understanding of the high-energy universe and will explore questions in physics of fundamental importance. As a key member of the suite of new and upcoming major astroparticle physics experiments and observatories, CTA will exploit synergies with gravitational wave and neutrino observatories as well as with classical photon observatories. CTA will address a wide range of major questions in and beyond astrophysics, which can be grouped into three broad themes…
Nowadays, structural health monitoring (SHM) of critical infrastructures is considered of primary importance, especially for managing transport infrastructure. However, most current SHM methodologies are based on point sensors that show various limitations relating to their spatial positioning capabilities, cost of development, and measurement range. This publication describes the progress in the SENSKIN EC co-funded research project, which is developing a dielectric-elastomer sensor, formed from a large, highly extensible capacitance-sensing membrane and supported by advanced micro-electronic circuitry, for monitoring transport infrastructure bridges. The sensor under development provides spatial measurements of strain in excess of 10%, while the sensing system is being designed to be easy to install, require low power in operation, require simple signal processing, and have the ability to self-monitor and report. An appropriate wireless sensor network is also being designed and developed, supported by local gateways for the required data collection and exploitation. SENSKIN also develops a Decision-Support System (DSS) for proactive condition-based structural interventions under normal operating conditions and reactive emergency intervention following an extreme event. The latter is supported by a life-cycle-costing (LCC) and life-cycle-assessment (LCA) module responsible for the total internal and external costs of the identified bridge rehabilitation and the analysis of options, yielding figures for assessing the economic implications of the bridge rehabilitation work and the environmental impacts of the bridge rehabilitation options and of the associated secondary effects, respectively. The overall monitoring system will be evaluated and benchmarked on actual bridges of the Egnatia Highway (Greece) and the Bosphorus Bridge (Turkey).
Recent years have seen a considerable broadening of the ambitions in urban sustainability policy-making. With its Sustainable Development Goal (SDG) 11, "Making cities and human settlements inclusive, safe, resilient and sustainable", the 2030 Agenda stresses the critical role of cities in achieving sustainable development. In the context of SDG 17 on partnerships, emphasis is also placed on the role of researchers and other scientific actors as change agents in the sustainability transformation. Against this backdrop, this article sheds light on different pathways through which science can contribute to urban sustainability. In particular, we discern four forms of science-policy-society interactions as key vectors: 1. sharing knowledge and providing scientific input to urban sustainability policy-making; 2. implementing transformative research projects; 3. contributing to local capacity building; and 4. self-governing towards sustainability. The pathways of influence are illustrated with empirical examples, and their interlinkages and limitations are discussed. We contend that there are numerous opportunities for actors from the field of sustainability science to engage with political and societal actors to enhance sustainable development at the local level.
The Author as Researcher
(2019)
This article proposes a new perspective on avant-garde travel writing through the lens of scientific field work, investigating these new writing techniques in Boris Pil’niak’s expedition prose. In the 1920s, the researching writer represents a hidden, but influential counterpart to the widely propagated figure of the working writer. While the author as producer combines word and deed in an operative act, the author as researcher investigates the production of knowledge. This entails revising the centrality of facts. Literature as artistic research subverts factography by going beyond the horizons of veristic data registration to include uncharted realms and vague possibilities. This exploration leads to specific genres: the author as researcher tries his hand at a kind of laboratory text, a prolific genre at the intersection of testing equipment, recording media, and hypothetical thought. Not confined to a sterile lab, avant-garde writer-researchers, as members of research expeditions, oscillate between their home writing desks and the remote depths of the emerging USSR. At the same time, they explore writing practices situated between data acquisition, sampling, fact-finding, observation and recording.
Signals for 2 degrees C
(2019)
The targets of the Paris Agreement make it necessary to redirect finance flows towards sustainable, low-carbon infrastructures and technologies. Currently, the potential of institutional investors to help finance this transition is widely discussed. Thus, this paper takes a closer look at influence factors for green investment decisions of large European insurance companies. With a mix of qualitative and quantitative methods, the importance of policy, market and civil society signals is evaluated. In summary, respondents favor measures that promote green investment, such as feed-in tariffs or adjustments of capital charges for green assets, over ones that make carbon-intensive investments less attractive, such as the phase-out of fossil fuel subsidies or a carbon price. While investors currently see a low impact of the carbon price, they rank a substantial reform as an important signal for the future. Respondents also emphasize that policy signals have to be coherent and credible to coordinate expectations.
Editorial
(2019)
A Fuzzy Rule-Based Model for Remote Monitoring of Preterm in the Intensive Care Unit of Hospitals
(2019)
The use of Remote patient monitoring (RPM) systems to monitor critically ill patients in the Intensive Care Unit (ICU) has enabled quality, real-time healthcare management. Fuzzy logic as an approach to designing RPM systems provides a means for encapsulating the subjective decision-making process of medical experts in an algorithm suitable for computer implementation. In this paper, a remote monitoring system for preterm infants in neonatal ICU incubators is modeled and simulated. The model was designed with 4 input variables (body temperature, heart rate, respiratory rate, and oxygen saturation level) and 1 output variable (the action performed, represented as ACT). ACT decides whether an alert is generated and also determines the message displayed when a notification is required. ACT classifies the clinical priority of the monitored preterm infant into 5 different fields: code blue, code red, code yellow, code green, and code black. The model was simulated using the Fuzzy Logic Toolbox of MATLAB R2015a. About 216 IF-THEN rules were formulated to monitor the input data fed into the model. The performance of the model was evaluated using a confusion matrix to determine the model’s accuracy, precision, sensitivity, specificity, and false alarm rate. The experimental results obtained show that the fuzzy-based system is capable of producing satisfactory results when used for monitoring and classifying the clinical statuses of neonates in ICU incubators.
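The rule-based classification can be sketched as follows. This is a toy Python illustration, not the paper's 216-rule MATLAB model; both the threshold ranges and the assignment of rule outcomes to color codes are invented for illustration.

```python
def fuzzify(value, low, high):
    """Map a crisp reading to a coarse linguistic state."""
    if value < low:
        return "low"
    if value > high:
        return "high"
    return "normal"

# Hypothetical normal ranges per input variable (illustrative only).
RANGES = {
    "temperature": (36.0, 37.5),    # degrees C
    "heart_rate": (120, 170),       # beats per minute
    "respiratory_rate": (40, 60),   # breaths per minute
    "spo2": (90, 100),              # oxygen saturation, percent
}

def classify(vitals):
    """Return the action/priority code (ACT) for one set of readings."""
    if any(vitals.get(k) is None for k in RANGES):
        return "code black"          # missing/invalid sensor reading
    states = {k: fuzzify(vitals[k], *RANGES[k]) for k in RANGES}
    abnormal = sum(1 for s in states.values() if s != "normal")
    if (states["spo2"] == "low" and states["heart_rate"] == "low") or abnormal >= 3:
        return "code blue"           # critical: immediate alert
    if abnormal == 2:
        return "code red"
    if abnormal == 1:
        return "code yellow"
    return "code green"              # all vitals in range, no alert

print(classify({"temperature": 36.8, "heart_rate": 140,
                "respiratory_rate": 50, "spo2": 97}))   # prints "code green"
```

A real fuzzy inference system would additionally use graded membership functions and defuzzification rather than the crisp thresholds used here.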
Network science is driven by the question of which properties large real-world networks have and how we can exploit them algorithmically. In the past few years, hyperbolic graphs have emerged as a very promising model for scale-free networks. The connection between hyperbolic geometry and complex networks gives insights in both directions: (1) Hyperbolic geometry forms the basis of a natural and explanatory model for real-world networks. Hyperbolic random graphs are obtained by choosing random points in the hyperbolic plane and connecting pairs of points that are geometrically close. The resulting networks share many structural properties with, for example, online social networks such as Facebook or Twitter. They are thus well suited for algorithmic analyses in a more realistic setting. (2) Starting with a real-world network, hyperbolic geometry is well-suited for metric embeddings. The vertices of a network can be mapped to points in this geometry such that geometric distances are similar to graph distances. Such embeddings have a variety of algorithmic applications, ranging from approximations based on efficient geometric algorithms to greedy routing that uses only hyperbolic coordinates for navigation decisions.
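The construction described in (1) can be sketched directly: sample points in a hyperbolic disk of radius R and connect every pair whose hyperbolic distance is below R. This is a minimal sketch of the standard model; the parameter names and the fixed radial exponent alpha = 1 are our choices.

```python
import math
import random

def sample_points(n, R, alpha=1.0, seed=0):
    """Sample n points (r, theta) in a hyperbolic disk of radius R."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        theta = rng.uniform(0.0, 2.0 * math.pi)        # uniform angle
        u = rng.random()                               # invert the radial CDF
        r = math.acosh(1.0 + u * (math.cosh(alpha * R) - 1.0)) / alpha
        pts.append((r, theta))
    return pts

def hyp_dist(p, q):
    """Hyperbolic distance between two points in polar coordinates."""
    (r1, t1), (r2, t2) = p, q
    dt = math.pi - abs(math.pi - abs(t1 - t2))         # angular difference
    x = (math.cosh(r1) * math.cosh(r2)
         - math.sinh(r1) * math.sinh(r2) * math.cos(dt))
    return math.acosh(max(x, 1.0))                     # guard rounding below 1

def hyperbolic_random_graph(n, R):
    """Connect every pair of sampled points closer than R."""
    pts = sample_points(n, R)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if hyp_dist(pts[i], pts[j]) < R]
    return pts, edges

pts, edges = hyperbolic_random_graph(100, 6.0)
```

Nodes sampled near the disk center end up close to everything and become high-degree hubs, which is what produces the heavy-tailed degree distribution characteristic of scale-free networks.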
JavaScript is the most popular programming language for web applications. Static analysis of JavaScript applications is highly challenging due to the language's dynamic constructs and event-driven asynchronous executions, which also give rise to many security-related bugs. Several static analysis tools to detect such bugs exist; however, research has not yet reported much on the precision and scalability trade-offs of these analyzers. As a further obstacle, JavaScript programs structured in Node.js modules need to be collected for analysis, but existing bundlers are either specific to their respective analysis tools or not particularly suitable for static analysis.
Mobile operating systems, such as Google's Android, have become a fixed part of our daily lives and are entrusted with a plethora of private information. Accordingly, their data protection mechanisms have been improved steadily over the last decade and, in particular for Android, the research community has explored various enhancements and extensions to the access control model. However, the vast majority of those solutions have been concerned with controlling access to data; equally important is the question of how to control the flow of data once released. Ignoring control over the dissemination of data between applications or between components of the same app opens the door for attacks such as permission re-delegation or privacy-violating third-party libraries. Controlling information flows is a long-standing problem, and one of the most recent and practically oriented approaches to information flow control is secure multi-execution.
In this paper, we present Ariel, the design and implementation of an IFC architecture for Android based on the secure multi-execution of apps. Ariel demonstrably extends Android's system with support for executing multiple instances of apps, and it is equipped with a policy lattice derived from the protection levels of Android's permissions as well as an I/O scheduler to achieve control over data flows between application instances. We demonstrate how secure multi-execution with Ariel can help to mitigate two prominent attacks on Android, permission re-delegations and malicious advertisement libraries.
Internet connectivity of cloud services is of exceptional importance for both their providers and consumers. This article demonstrates the outlines of a method for measuring cloud-service connectivity at the internet protocol level from a client's perspective. For this, we actively collect connectivity data via traceroute measurements from PlanetLab to several major cloud services. Furthermore, we construct graph models from the collected data, and analyse the connectivity of the services based on important graph-based measures. Then, random and targeted node removal attacks are simulated, and the corresponding vulnerability of cloud services is evaluated. Our results indicate that cloud service hosts are, on average, much better connected than average hosts. However, when interconnecting nodes are removed in a targeted manner, cloud connectivity is dramatically reduced.
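The vulnerability analysis described above, comparing intact connectivity against targeted node removal, can be sketched on a toy topology (the node names and structure here are invented, not measured data):

```python
from collections import defaultdict, deque

def largest_component(adj, removed):
    """Size of the largest connected component after removing some nodes."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

# Hub-and-spoke toy topology: two transit hubs interconnecting leaf hosts.
edges = [("hub1", l) for l in ("a", "b", "c")] + \
        [("hub2", l) for l in ("d", "e", "f")] + [("hub1", "hub2")]
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

print(largest_component(adj, removed=set()))        # 8: fully connected

# Targeted attack: remove nodes in order of decreasing degree.
hubs = sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:2]
print(largest_component(adj, removed=set(hubs)))    # 1: the graph shatters
```

The toy graph mirrors the article's finding: removing a few well-connected interconnecting nodes fragments the network far more than random removal would.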
Detect me if you can
(2019)
Spam bots have become a threat to online social networks with their malicious behavior, posting misinformation and influencing online platforms to fulfill their motives. As spam bots have become more advanced over time, creating algorithms to identify bots remains an open challenge. Learning low-dimensional embeddings for nodes in graph-structured data has proven to be useful in various domains. In this paper, we propose a model based on graph convolutional neural networks (GCNNs) for spam bot detection. Our hypothesis is that to better detect spam bots, in addition to defining a feature set, the social graph must also be taken into consideration. GCNNs are able to leverage both the features of a node and the aggregated features of a node’s neighborhood. We compare our approach with two methods that work solely on a feature set or solely on the structure of the graph. To our knowledge, this work is the first attempt at using graph convolutional neural networks for spam bot detection.
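The neighborhood aggregation that a graph convolutional layer performs can be sketched in a few lines. This is only the propagation step; a real GCNN additionally applies learned weight matrices and a nonlinearity, and the toy features below are invented.

```python
def gcn_aggregate(features, adj):
    """One mean-aggregation step over a graph.

    features: {node: [f1, f2, ...]} feature vector per node
    adj:      {node: set of neighbour nodes}
    Each node's new representation averages its own features
    with those of its neighbours.
    """
    out = {}
    for v, feats in features.items():
        neighbourhood = [feats] + [features[w] for w in adj[v]]
        out[v] = [sum(col) / len(neighbourhood)
                  for col in zip(*neighbourhood)]
    return out

# Toy graph: node 0 (bot-like posting pattern) linked to nodes 1 and 2.
features = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [0.0, 1.0]}
adj = {0: {1, 2}, 1: {0}, 2: {0}}
print(gcn_aggregate(features, adj)[0])   # approx. [0.333, 0.667]
```

Stacking several such layers lets each node's representation reflect increasingly distant parts of the social graph, which is why graph structure can complement a plain feature set for bot detection.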
The aim of this paper is to discuss Nicolai Hartmann’s conception of personhood as developed in his philosophy of spiritual being. Many contemporary accounts of personhood are systematically focused on rational phenomena such as self-consciousness or practical reasoning, which are understood as ‘conditions of personhood’. Apart from having some technical problems, those accounts limit our self-understanding as persons to distinct rational properties and often fail to consider the sociocultural aspects of the personal situation. Nicolai Hartmann, although respecting the role of reason, understands personhood particularly as participation in a shared spiritual sphere called Objektiver Geist (objective spirit), which includes various intersubjective phenomena such as language, religion, morals, the arts, and the
sciences. Being part of this sphere seems to be more fundamental than having distinct rational properties, which require a spiritual frame in order to be exercised. Further, it is shown that Hartmann’s ontology of the person also includes a notion of being affected by the existential weight of situations and other persons’ actions, an idea often maintained by phenomenological positions. By regarding rational, intersubjective and affective aspects, Hartmann’s philosophy of the person succeeds in offering a broad articulation of our self-understanding and may also be seen as providing a background for understanding certain phenomena that are part of the personal situation.
Spontaneous and induced platelet aggregation in apparently healthy subjects in relation to age
(2019)
Thrombotic disorders remain the leading cause of mortality and morbidity, despite the fact that anti-platelet therapies and vascular implants are successfully used today. As life expectancy is increasing in western societies, specific knowledge about the processes leading to thrombosis in the elderly is essential for adequate therapeutic management of platelet dysfunction and for tailoring blood-contacting implants. This study addresses the limited available data on platelet function in apparently healthy subjects in relation to age, particularly in view of subjects of old age (80-98 years). Apparently healthy subjects between 20 and 98 years were included in this study. Platelet function was assessed by light transmission aggregometry and comprised experiments on spontaneous as well as ristocetin-, ADP- and collagen-induced platelet aggregation. The data of this study revealed a non-linear increase in the maximum spontaneous platelet aggregation with age (from 3.3% +/- 3.3% to 10.9% +/- 5.9%). The maximum induced aggregation decreased with age for ristocetin (from 85.8% +/- 7.2% to 75.0% +/- 7.8%), ADP (from 88.5% +/- 4.6% to 64.8% +/- 7.3%) and collagen (from 89.5% +/- 3.0% to 64.0% +/- 4.0%) in a non-linear manner (linear regression analysis). These observations indicate that during aging, circulating platelets become increasingly activated but lose their full aggregatory potential, a phenomenon that was earlier termed "platelet exhaustion". In this study we extended the limited existing data on spontaneous and induced platelet aggregation of apparently healthy donors above the age of 75 years. The presented data indicate that the extrapolation of data from a middle-aged group does not necessarily predict platelet function in apparently healthy subjects of old age. This emphasizes the need for respective studies to improve our understanding of thrombotic processes in elderly humans.
Lipid-containing adipocytes can dedifferentiate into fibroblast-like cells under appropriate culture conditions, which are known as dedifferentiated fat (DFAT) cells. However, the relatively low dedifferentiation efficiency of the established protocols limits their widespread application. In this study, we found that adipocyte dedifferentiation could be promoted via periodic exposure to cold (10 degrees C) in vitro. The lipid droplets in mature adipocytes were reduced by culturing the cells in periodic cooling/heating cycles (10-37 degrees C) for one week. The periodic temperature change led to the down-regulation of the adipogenic genes (FABP4, Leptin) and up-regulation of the mitochondrial uncoupling-related genes (UCP1, PGC-1 alpha, and PRDM16). In addition, enhanced expression of the cell proliferation marker Ki67 was observed in the dedifferentiated fibroblast-like cells after periodic exposure to cold, as compared to cells cultured at 37 degrees C. Our in vitro model provides a simple and effective approach to promote lipolysis and can be used to improve the dedifferentiation efficiency of adipocytes towards multipotent DFAT cells.
Above and underground hydrological processes depend on soil moisture (SM) variability, driven by different environmental factors that are seldom well monitored, leading to a misunderstanding of soil water temporal patterns. This study investigated the stability of the SM temporal dynamics under different monitoring temporal resolutions around the border between two soil types in a tropical watershed. Four locations were instrumented in a small-scale watershed (5.84 km²) on the tropical coast of Northeast Brazil, encompassing different soil types (Espodossolo Humiluvico or Carbic Podzol, and Argissolo Vermelho-Amarelo or Haplic Acrisol), land covers (Atlantic Forest, bush vegetation, and grassland) and topographies (flat and moderate slope). The SM was monitored at a temporal resolution of one hour along the 2013-2014 hydrological year and then resampled at resolutions of 6 h, 12 h, 1 day, 2 days, 4 days, 7 days, and 15 days. Descriptive statistics, temporal variability, time-stability ranking, and hierarchical clustering revealed uneven associations among SM time components. The results show that the time-invariant component ruled SM temporal variability over the time-varying parcel, at both high and low temporal resolutions. Time-steps longer than 2 days affected the mean statistical metrics of the SM time-variant parcel. Additionally, SM at downstream and upstream sites behaved differently, suggesting that the temporal mean was regulated by steady soil properties (slope, restrictive layer, and soil texture), whereas the temporal anomalies were driven by climate (rainfall) and hydrogeological (groundwater level) factors. Therefore, it is concluded that around the border between tropical soil types, the distinct behaviour of the time-variant and time-invariant components of SM time series reflects different combinations of their soil properties.
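The split into a time-invariant component (the temporal mean per location) and a time-variant anomaly, after resampling to a coarser monitoring step, can be sketched as follows. The data values are invented toy numbers; the study works with year-long hourly series and resolutions from 6 h up to 15 days.

```python
def resample(series, step):
    """Average consecutive blocks of `step` hourly readings."""
    return [sum(series[i:i + step]) / len(series[i:i + step])
            for i in range(0, len(series), step)]

def decompose(series):
    """Split a series into its temporal mean and the anomalies around it."""
    mean = sum(series) / len(series)          # time-invariant component
    anomalies = [v - mean for v in series]    # time-variant component
    return mean, anomalies

# Toy hourly soil-moisture values (volumetric fraction, invented):
hourly = [0.20, 0.22, 0.21, 0.30, 0.28, 0.27, 0.25, 0.24]
six_hourly = resample(hourly, 6)              # coarser monitoring resolution
mean, anomalies = decompose(hourly)
print(mean)                                   # the time-invariant component
```

In the study's terms, steady soil properties govern `mean`, while rainfall and groundwater dynamics drive the `anomalies`; coarse resampling (steps longer than 2 days) starts to distort the statistics of the latter.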
Kyub
(2019)
We present an interactive editing system for laser cutting called kyub. Kyub allows users to create models efficiently in 3D, which it then unfolds into the 2D plates laser cutters expect. Unlike earlier systems, such as FlatFitFab, kyub affords construction based on closed box structures, which allows users to turn very thin material, such as 4 mm plywood, into objects capable of withstanding large forces, such as chairs users can actually sit on. To afford such sturdy construction, every kyub project begins with a simple finger-joint "boxel", a structure we found to be capable of withstanding over 500 kg of load. Users then extend their model by attaching additional boxels. Boxels merge automatically, resulting in larger, yet equally strong structures. While the concept of stacking boxels allows kyub to offer the strong affordance and ease of use of a voxel-based editor, boxels are not confined to a grid and readily combine with kyub's various geometry deformation tools. In our technical evaluation, objects built with kyub withstood hundreds of kilograms of load. In our user study, non-engineers rated the learnability of kyub 6.1/7.
In this paper, we establish the underlying foundations of mechanisms that are composed of cell structures, known as metamaterial mechanisms. Such metamaterial mechanisms were previously shown to implement complete mechanisms in the cell structure of a 3D-printed material, without the need for assembly. However, their design is highly challenging. A mechanism consists of many cells that are interconnected and impose constraints on each other. This leads to unobvious and non-linear behavior of the mechanism, which impedes user design. In this work, we investigate the underlying topological constraints of such cell structures and their influence on the resulting mechanism. Based on these findings, we contribute a computational design tool that automatically creates a metamaterial mechanism from user-defined motion paths. This tool is only feasible because our novel abstract representation of the global constraints greatly reduces the search space of possible cell arrangements.
When it comes to autobiographical narratives, the most spontaneous and natural manner of telling is preferable. But neither individually told narratives nor those grounded in the communicative repertoire of a social group are easily comparable. A clearly identifiable tertium comparationis is mandatory. We present the results of an experimental ‘Narrative Priming’ setting with French students. A potentially underlying model of narrating from personal experience was activated via a narrative prime, and in a second step, the participants were asked to tell a narrative of their own. The analysis focuses on similarities and differences between the primes and the students’ narratives. The results give evidence that it is possible to elicit a set of comparable narratives via a prime and to activate an underlying narrative template. Meaningful differences are discussed as generational and age-related styles. The transcriptions from the participants who authorized their publication are available online.
The "Bachelor Project"
(2019)
One of the challenges of educating the next generation of computer scientists is to teach them to become team players who are able to communicate and interact not only with different IT systems, but also with coworkers and customers from a non-IT background. The “bachelor project” is a project format based on team work and close collaboration with selected industry partners. The authors have hosted some of the teams since the spring term of 2014/15. In the paper at hand we explain and discuss this concept and evaluate its success based on students' evaluations and reports. Furthermore, the technology stack used by the teams is evaluated to understand how self-organized students work in IT-related projects. We show that, and why, the bachelor project is the most successful educational format in the perception of the students, and how these positive results can be further improved by the mentors.
Currently, we can observe a transformation of our technical world into a networked one, in which, besides embedded systems interacting with the physical world, the interconnection of these nodes in the cyber world is becoming a reality. In parallel, there is a strong trend to employ artificial intelligence techniques, and in particular machine learning, to make software behave smartly. Often, cyber-physical systems must be self-adaptive at the level of the individual system in order to operate as elements in open, dynamic, and deviating overall structures and to adapt to open and dynamic contexts while being developed, operated, evolved, and governed independently.
In this presentation, we will first discuss the envisioned future scenarios for cyber-physical systems, with an emphasis on the synergies networking can offer, and then characterize the challenges that result for the design, production, and operation of these systems. We will then discuss to what extent our current capabilities, in particular concerning software engineering, match these challenges, and where substantial improvements in software engineering are crucial. In today's software engineering for embedded systems, models are used to plan systems upfront, to maximize envisioned properties on the one hand and minimize cost on the other. When applying the same ideas to software for smart cyber-physical systems, it soon turned out that these systems often exhibit more subtle links between the involved models and the requirements, users, and environment. Self-adaptation and runtime models have been advocated as concepts to cover the demands that result from these subtler links. Lately, both trends have been brought together more thoroughly by the notion of self-aware computing systems. We will review the underlying causes, discuss some of our work in this direction, and outline related open challenges and potential for future approaches to software engineering for smart cyber-physical systems.
MOOCs in Secondary Education
(2019)
Computer science education in German schools is often less than optimal. It is only mandatory in a few of the federal states and there is a lack of qualified teachers. As a MOOC (Massive Open Online Course) provider with a German background, we developed the idea to implement a MOOC addressing pupils in secondary schools to fill this gap. The course targeted high school pupils and enabled them to learn the Python programming language. In 2014, we successfully conducted the first iteration of this MOOC with more than 7000 participants. However, the share of pupils in the course was not quite satisfactory. So we conducted several workshops with teachers to find out why they had not used the course to the extent that we had imagined. The paper at hand explores and discusses the steps we have taken in the following years as a result of these workshops.
Metamorphic geology
(2019)
From object to process
(2019)
One of the most difficult tasks today is trying to grasp the presence of computing. The almost ubiquitous and diverse forms of networked computers (in all their stationary, mobile, embedded, and autonomous modes) create a nearly overwhelming complexity. To speak of what is at once evasive and present, the paper proposes to reconsider the concept of the interface, its historical roots, and its heuristic advantages for an analysis and critique of the current and especially everyday spread of computerization. The question of interfaces leads to isolable conditions and processes of conduction, as well as to the complexity of the cooperation formed by them. It opens both an investigative horizon and a mode of analysis, which always asks for the further interface levels involved in the phenomenon under investigation. As an example, the paper turns to the displacement of the file with the launch of the iPhone in 2007 and its comeback in 2017 with the "Files" apps. Both developments are profoundly related to the establishment of computers as permanently networked machines, whereby their functionality, depresentations, and ideology come into focus.
This book is about the building of alliances and about joint activities between two groups of social movement actors ascribed increasing relevance for the functioning and the eventual amendment of democratic capitalism. The chapters provide a well-balanced mix of theoretical and empirical accounts on the political, social and economic catalysts behind the changing motives finding expression in a multitude of novel types of joint collective action and inter-organizational alliances. The contributors to this volume go beyond attempting to place unions, movements, crises, precariousness, protests and coalitions at the centre of the research. Instead, they focus on actors who themselves transcend clear-cut social camps. They look at the values and motives underlying collective action by both types of actors as much as at their structural and strategic properties, and inter-organizational relations and networks. This creates a fresh, genuine and historically valid account of the incompatibilities and the commonalities of movements and unions, and of prospects for inter-organizational learning.
Preface
(2019)
A Cloud Storage Broker (CSB) provides value-added cloud storage services for enterprise usage by leveraging a multi-cloud storage architecture. However, this raises several challenges for managing resources and their access control across multiple Cloud Service Providers (CSPs) for authorized CSB stakeholders. In this paper we propose a unified cloud access control model that provides an abstraction of the CSPs' services for centralized and automated cloud resource and access control management across multiple CSPs. Our proposal offers role-based access control for CSB stakeholders to access cloud resources by assigning the necessary privileges and access control lists for cloud resources and CSB stakeholders, respectively, following the privilege separation concept and the least privilege principle. We implemented our unified model in a CSB system called CloudRAID for Business (CfB); the evaluation shows that it provides system- and cloud-level security services for CfB as well as centralized resource and access control management across multiple CSPs.
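The combination of role-based privileges, per-resource access control lists, and the least privilege principle can be sketched as follows. All names, roles, and resources here are invented for illustration; this is not CloudRAID's actual API.

```python
# Roles carry only the minimum privileges they need (least privilege).
ROLE_PRIVILEGES = {
    "csb_admin":   {"create", "read", "update", "delete"},
    "tenant_user": {"read", "update"},
    "auditor":     {"read"},
}

# Each cloud resource carries an ACL naming the roles allowed to touch it.
RESOURCE_ACL = {
    "bucket/finance": {"csb_admin", "auditor"},
    "bucket/public":  {"csb_admin", "tenant_user", "auditor"},
}

def is_allowed(role, resource, operation):
    """Grant only if the role is on the resource's ACL *and* holds the privilege."""
    return (role in RESOURCE_ACL.get(resource, set())
            and operation in ROLE_PRIVILEGES.get(role, set()))

print(is_allowed("auditor", "bucket/finance", "read"))      # True
print(is_allowed("tenant_user", "bucket/finance", "read"))  # False: not on ACL
```

Requiring both checks to pass implements privilege separation: the ACL controls *which* resources a stakeholder can reach, while the role controls *what* operations they may perform there.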
The idea for this book arose out of discontent with essentially three shortcomings in the recent literature on the present state of politics in Western democracies and on forms of collective action. The general message resulting from research in the political economy and in forms of democracy is disastrous. We are confronted with a mix of decline, fragmentation, individualization, diminishing trust in institutions hollowed out from the inside, the hoarding of power by small political and economic elites, and the increasing marginalization and pauperization of vast parts of the population. While the accuracy of these trends shall not be called into question, it is noteworthy, and this is the first shortcoming, to what extent that literature tends to neglect one crucial aspect, namely the capacity of those suffering most from the above malaise to come together and search for possibilities of collectively halting, reversing, or otherwise influencing decline in defence of their needs and interests. The second shortcoming concerns the literatures on precisely these actors, namely established trade union research and research on social movements. While both fields acknowledge the extent of the current crisis and have submitted numerous books and articles on how their respective research targets are reacting to it, the situation continues to remain one of mutual indifference. There is hardly any cross-fertilization beyond the boundaries of established research traditions. At the same time, empirical reality seems to suggest that forms of joint activity by both types of actors may have become more advanced than theoretical reflection is so far prepared to admit. As observed by Fantasia and Stepan-Norris (2004: 561), students of each of the two forms of collective action "(…) mutually neglect each other". At best, trade union researchers and social movement research envisage their counterpart in purely instrumental
Increasing demand for analytical processing capabilities can be managed by replication approaches. However, evenly balancing the replicas' workload shares while at the same time minimizing the data replication factor is a highly challenging allocation problem. As optimal solutions are only applicable to small problem instances, effective heuristics are indispensable. In this paper, we test and compare state-of-the-art allocation algorithms for partial replication. By visualizing and exploring their (heuristic) solutions for different benchmark workloads, we are able to derive structural insights and to detect an algorithm's strengths as well as its potential for improvement. Further, our application enables end-to-end evaluations of different allocations to verify their theoretical performance.
Workload-Driven Fragment Allocation for Partially Replicated Databases Using Linear Programming
(2019)
In replication schemes, replica nodes can process read-only queries on snapshots of the master node without violating transactional consistency. By analyzing the workload, we can identify query access patterns and replicate data depending on its access frequency. In this paper, we define a linear programming (LP) model to calculate the set of partial replicas with the lowest overall memory capacity while evenly balancing the query load. Furthermore, we propose a scalable decomposition heuristic to calculate solutions for larger problem sizes. While guaranteeing the same performance as state-of-the-art heuristics, our decomposition approach calculates allocations with up to a 23% lower memory footprint for the TPC-H benchmark.
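One plausible shape for such a model (our notation, not necessarily the paper's exact formulation): let x_{ij} = 1 if fragment i of size s_i is placed on replica j out of k replicas, let c_q be the cost of query q, F(q) the set of fragments q accesses, and y_{qj} the share of q's load routed to replica j. Minimizing total memory under an even load balance then reads:

```latex
\begin{aligned}
\min \quad & \sum_{j=1}^{k} \sum_{i=1}^{n} s_i \, x_{ij}
  && \text{(total memory across all replicas)} \\
\text{s.t.} \quad
& y_{qj} \le x_{ij}
  && \forall q,\ \forall j,\ \forall i \in F(q)
  \quad \text{(route $q$ to $j$ only if $j$ holds all of $q$'s fragments)} \\
& \sum_{j=1}^{k} y_{qj} = 1
  && \forall q \quad \text{(each query's load is fully assigned)} \\
& \sum_{q} c_q \, y_{qj} = \frac{1}{k} \sum_{q} c_q
  && \forall j \quad \text{(even load balance across replicas)} \\
& x_{ij} \in \{0,1\}, \qquad y_{qj} \ge 0
\end{aligned}
```

With binary x_{ij} this is strictly a mixed-integer program; a decomposition heuristic, such as the one the paper proposes for larger instances, would then solve smaller subproblems of this general shape.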
Data analytics are moving beyond the limits of a single data processing platform. A cross-platform query optimizer is necessary to enable applications to run their tasks over multiple platforms efficiently and in a platform-agnostic manner. For the optimizer to be effective, it must consider data movement costs across different data processing platforms. In this paper, we present the graph-based data movement strategy used by RHEEM, our open-source cross-platform system. In particular, we (i) model the data movement problem as a new graph problem, which we prove to be NP-hard, and (ii) propose a novel graph exploration algorithm, which allows RHEEM to discover multiple hidden opportunities for cross-platform data processing.
An efficient selection of indexes is indispensable for database performance. For large problem instances with hundreds of tables, existing approaches are not suitable: They either exhibit prohibitive runtimes or yield far from optimal index configurations by strongly limiting the set of index candidates or not handling index interaction explicitly. We introduce a novel recursive strategy that does not exclude index candidates in advance and effectively accounts for index interaction. Using large real-world workloads, we demonstrate the applicability of our approach. Further, we evaluate our solution end to end with a commercial database system using a reproducible setup. We show that our solutions are near-optimal for small index selection problems. For larger problems, our strategy outperforms state-of-the-art approaches in both scalability and solution quality.
This article aims to sum up the main results of a research project conducted in 2016 and 2017 on the situation of 1190 Romanian migrants in Western Europe and to give an overview of the push and pull factors, transnational family structures, and the challenges and difficulties of the Romanian survey respondents living in Germany, France, the United Kingdom and Italy. It also considers the role of personal networks, which represent an important motor of migration and constitute the main motive for the choice of a certain destination region. These migration networks lead to the construction of transnational social spaces between Romania and the destination country and strongly influence the search for housing or jobs, but they can also influence the integration process abroad.
For a singularly perturbed parabolic-ODE system we construct the asymptotic expansion in the small parameter for the case in which the degenerate equation has a double root. Such systems, which are called partly dissipative reaction-diffusion systems, are used to model various natural processes, including signal transmission along axons, solid combustion and the kinetics of some chemical reactions. It turns out that the algorithm for constructing the boundary layer functions and the behavior of the solution in the boundary layers differ essentially from those in the case of a simple root. The multizonal behaviour of the initial and boundary layers is described.
New Public Governance (NPG) as a paradigm for collaborative forms of public service delivery and Blockchain governance are trending topics for researchers and practitioners alike. Thus far, each topic has, on the whole, been discussed separately. This paper presents the preliminary results of ongoing research which aims to shed light on the more concrete benefits of Blockchain for the purpose of NPG. For the first time, a conceptual analysis is conducted on process level to spot benefits and limitations of Blockchain-based governance. Per process element, Blockchain key characteristics are mapped to functional aspects of NPG from a governance perspective. The preliminary results show that Blockchain offers valuable support for governments seeking methods to effectively coordinate co-producing networks. However, the extent of benefits of Blockchain varies across the process elements. It becomes evident that there is a need for off-chain processes. It is, therefore, argued in favour of intensifying research on off-chain governance processes to better understand the implications for and influences on on-chain governance.
This study explores the theoretical and political potentials of Édouard Glissant’s philosophy of relation and its approach to the issues of borders, migration, and the setup of political communities as proposed by his pensée nouvelle de la frontière (new border thought), against the background of the German migration crisis of 2015. The main argument of this article is that Glissant’s work offers an alternative epistemological and normative framework through which the contemporary political issues arising around the phenomenon of repressive border regimes can be studied. To demonstrate this point, this article works with Glissant’s border thought as an analytical lens and proposes a pathway for studying the contemporary German border regime. Particular emphasis is placed on the identification of potential areas where a Glissantian politics of relation could intervene with the goal of transforming borders from impermeable walls into points of passage. By exploring the political implications of his border thought, as well as the larger philosophical context from which it emerges, while using a transdisciplinary approach that borrows from literary and political studies, this work contributes to ongoing debates in postcolonial studies on borders and borderlessness, as well as Glissant’s political legacy in the twenty-first century.
TrussFormer
(2019)
We present TrussFormer, an integrated end-to-end system that allows users to 3D print large-scale kinetic structures, i.e., structures that involve motion and deal with dynamic forces. TrussFormer builds on TrussFab, from which it inherits the ability to create static large-scale truss structures from 3D printed connectors and PET bottles. TrussFormer adds movement to these structures by placing linear actuators into them: either manually, wrapped in reusable components called assets, or by demonstrating the intended movement. TrussFormer verifies that the resulting structure is mechanically sound and will withstand the dynamic forces resulting from the motion. To fabricate the design, TrussFormer generates the underlying hinge system that can be printed on standard desktop 3D printers. We demonstrate TrussFormer with several example objects, including a six-legged walking robot and a 4 m tall animatronic dinosaur with five degrees of freedom.
Peer cultural socialisation
(2019)
This study investigated how peers can contribute to cultural minority students’ cultural identity, life satisfaction, and school values (school importance, utility, and intrinsic values) by talking about cultural values, beliefs, and behaviours associated with heritage and mainstream culture (peer cultural socialisation). We further distinguished between heritage and mainstream identity as two separate dimensions of cultural identity. Analyses were based on self-reports of 662 students of the first, second, and third migrant generation in Germany (Mean age = 14.75 years, 51% female). Path analyses revealed that talking about heritage culture with friends was positively related to heritage identity. Talking about mainstream culture with friends was negatively associated with heritage identity, but positively with mainstream identity as well as school values. Both dimensions of cultural identity related to higher life satisfaction and more positive school values. As expected, heritage and mainstream identity mediated the link between peer cultural socialisation and adjustment outcomes. Findings highlight the potential of peers as socialisation agents to help promote cultural belonging as well as positive adjustment of cultural minority youth in the school context.
Mobile sensing technology allows us to investigate human behaviour on a daily basis. In this study, we examined temporal orientation, which refers to the capacity to think or talk about personal events in the past and future. We utilise the mksense platform, which allows us to use the experience-sampling method. Individuals' thoughts and their relationship with the smartphone's Bluetooth data are analysed to understand in which contexts people are influenced by social environments, such as the people they spend the most time with. As an exploratory study, we analyse the influence of social conditions through a collection of Bluetooth data and survey information from participants' smartphones. Preliminary results show that people are likely to focus on past events when interacting with closely related people, and to focus on future planning when interacting with strangers. Similarly, people experience a present temporal orientation when accompanied by people they know. We believe that these findings are linked to emotions since, in its most basic state, emotion is a state of physiological arousal combined with an appropriate cognition. In this contribution, we envision a smartphone application for automatically inferring human emotions from users' temporal orientation using Bluetooth sensors, briefly elaborate on the influential factors of temporal orientation episodes, and conclude with a discussion and lessons learned.
We discuss canonical representations of the de Rham cohomology on a compact manifold with boundary. They are obtained by minimising the energy integral in a Hilbert space of differential forms that belong, along with their exterior derivatives, to the domain of the adjoint operator. The corresponding Euler-Lagrange equations reduce to an elliptic boundary value problem on the manifold, which is usually referred to as the Neumann problem after Spencer.
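The energy-minimisation step can be sketched in standard Hodge-theoretic notation (this is an illustration of the general approach, not the paper's exact functional-analytic setup):

```latex
% Minimise the L^2 energy over a fixed de Rham class [\omega], \omega closed:
E(u) = \int_M \langle u, u \rangle \, dV ,
\qquad u = \omega + d\alpha .
% A minimiser is closed and coclosed (the Euler--Lagrange equations):
du = 0, \qquad d^{*}u = 0 ,
% together with a natural Neumann-type boundary condition, e.g. the
% vanishing of the normal part of u on the boundary:
\iota^{*}_{\partial M}(\star u) = 0 .
```

Reducing these first-order conditions to a second-order elliptic boundary value problem for the potential is what leads to the Neumann problem mentioned above.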
In this paper, we consider counting and projected model counting of extensions in abstract argumentation for various semantics. When asking for projected counts, we are interested in counting the number of extensions of a given argumentation framework, where multiple extensions that are identical when restricted to the projected arguments count as only one projected extension. We establish classical complexity results and parameterized complexity results when the problems are parameterized by the treewidth of the undirected argumentation graph. To obtain upper bounds for counting projected extensions, we introduce novel algorithms that exploit small treewidth of the undirected argumentation graph of the input instance by dynamic programming (DP). Our algorithms run in time double or triple exponential in the treewidth, depending on the considered semantics. Finally, we take the exponential time hypothesis (ETH) into account and establish lower bounds for bounded-treewidth algorithms that count extensions and projected extensions.
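For intuition, the two counting problems can be illustrated by naive enumeration over a toy framework (stable semantics only; the framework and argument names are invented, and brute-force enumeration is an illustrative stand-in, not the treewidth-based dynamic programming described above, which avoids enumerating all subsets):

```python
from itertools import combinations

# A tiny abstract argumentation framework (invented for illustration):
# arguments and a directed attack relation.
ARGS = {"a", "b", "c", "d", "e"}
ATTACKS = {("a", "b"), ("b", "a"), ("b", "c"), ("c", "d")}

def is_stable(candidate):
    """A set E is stable iff it is conflict-free and attacks
    every argument outside E."""
    ext = set(candidate)
    if any((x, y) in ATTACKS for x in ext for y in ext):
        return False  # not conflict-free
    outside = ARGS - ext
    return all(any((x, y) in ATTACKS for x in ext) for y in outside)

def stable_extensions():
    """Enumerate all stable extensions by brute force."""
    for r in range(len(ARGS) + 1):
        for subset in combinations(sorted(ARGS), r):
            if is_stable(subset):
                yield frozenset(subset)

def count_extensions():
    """Plain counting: number of stable extensions."""
    return sum(1 for _ in stable_extensions())

def count_projected(projection):
    """Projected counting: extensions identical on the projected
    arguments count only once."""
    return len({ext & set(projection) for ext in stable_extensions()})
```

Here the framework has two stable extensions, {a, c, e} and {b, d, e}; projecting onto {e} collapses them into one projected extension, which is exactly the distinction between counting and projected counting.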
The centenary of the publication of Ferdinand de Saussure's Cours de linguistique générale (1916) invites us to reconsider the importance of this work and the role of its author in the foundation of a linguistics integrated into a semiology. There is no doubt that this author was extremely important for the development of structural linguistics in Europe and that, with his concept of the linguistic sign, he did pioneering work for the semiological turn. But the favourable reception of a theory in the scientific community is explained not only by its intrinsic quality but also by several external conditions. These conditions are analysed on three levels: (1) the method of the Neogrammarians reaching its limits, which then prompted the study of the unity of signifier and signified; (2) the simplification and exaggeration of structural thought in the Cours, published in 1916 by Charles Bally and Albert Sechehaye; and (3) the preparation of the reception of semiological thought by several parallel works.
In this chapter, we explore the aforementioned paradigm shifts and how they offer an avenue for new research. We first elucidate what precisely “mental imagery,” the parent construct of motor imagery, is and explain the research milestones that have shaped our understanding of this complex topic. The construct of motor imagery has become a thriving research topic thanks to the development of the action simulation model by Marc Jeannerod, which provided a framework in which imagery and movement are viewed as part of an action continuum (Jeannerod 1994, 2006).
Of Sources and Files
(2019)
Files produced by the secret police forces of former Eastern Bloc countries are complex documents, not completely reliable and yet not fully untrustworthy either—or as the British historian Timothy Garton Ash has remarked, “There is a truth that can be found [in a secret police file]. Not a single, absolute Truth with a capital T but still a real and important one” (2002, 282). As historical documents—texts anchored in a time and place and resulting from specific circumstances—files in general “supplement or rework ‘reality’” and are never “mere sources that divulge facts about ‘reality’” (LaCapra 1985, 11).
High-throughput RNA sequencing produces large gene expression datasets whose analysis leads to a better understanding of diseases like cancer. The nature of RNA-Seq data poses challenges to its analysis in terms of its high dimensionality, noise, and the complexity of the underlying biological processes. Researchers apply traditional machine learning approaches, e.g. hierarchical clustering, to analyse this data. Until the validation of results, the analysis is based on the provided data only and completely misses the biological context. However, gene expression data follows particular patterns: the underlying biological processes. In our research, we aim to integrate the available biological knowledge earlier in the analysis process. We want to adapt state-of-the-art data mining algorithms to consider the biological context in their computations and deliver meaningful results for researchers.
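A minimal sketch of the purely data-driven analysis described above (the synthetic matrix, group structure, and all parameters are assumptions for illustration; real RNA-Seq analysis would start from normalised read counts):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Synthetic stand-in for an expression matrix (samples x genes):
# two simulated sample groups with shifted mean expression.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=0.0, scale=0.5, size=(10, 50))
group_b = rng.normal(loc=3.0, scale=0.5, size=(10, 50))
expression = np.vstack([group_a, group_b])

# Pairwise distances between sample profiles, then average-linkage
# hierarchical clustering cut into two clusters.
distances = pdist(expression, metric="euclidean")
tree = linkage(distances, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
```

The clustering recovers the two simulated groups from the expression values alone; the biological context that the research aims to integrate plays no role in this computation, which is precisely the gap identified above.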