Keywords
- Internet (2)
- MOOC (2)
- affect (2)
- carbon dioxide (2)
- embodied cognition (2)
- 2.5D Treemaps (1)
- Absorption kinetics (1)
- Aluminium (1)
- Aluminium adjuvants (1)
- Aufklärung (1)
- BMI (1)
- Bandwidth (1)
- Buntsandstein (1)
- CO2 storage monitoring (1)
- Cloud Native Applications (1)
- Cloud-Security (1)
- DPP-4 inhibitors (1)
- Dialysis patients (1)
- Digital Learning Factory (1)
- Dipeptidyl peptidase IV (1)
- Durkheim (1)
- Durkheim’s German Reception, Max Weber, Georg Simmel, Jürgen Habermas (1)
- E-Learning (1)
- ERPs (1)
- Educational Technology (1)
- Enlightenment (1)
- Exploratory interfaces (1)
- Gamification (1)
- Geoelectrical imaging (1)
- Growth faltering (1)
- Growth modelling (1)
- In vitro dissolution (1)
- Industrial IoT Competences (1)
- Information Visualization (1)
- Massive Open Online Courses (1)
- Media retrieval (1)
- Memory management (1)
- Mobile Learning (1)
- Mobiles (1)
- Mortality (1)
- Multi-perspective Views (1)
- Multidimensional scaling (1)
- Obesity (1)
- Offline-Enabled (1)
- Overview plus Detail (1)
- P300 (1)
- PHEMA (1)
- Parallel programming (1)
- Performance analysis (1)
- Photon Density Wave Spectroscopy (1)
- SAW impedance sensor (1)
- Secular trend (1)
- Security-as-a-Service (1)
- Serum intact-parathyroid hormone level (1)
- Strategic growth adjustment (1)
- Student Training (1)
- Toxicokinetic modelling (1)
- Treemaps (1)
- Ubiquitous (1)
- User study (1)
- Vocational Training (1)
- Vulnerability Assessment (1)
- abstract concepts (1)
- action words (1)
- acute kidney injury (1)
- age of acquisition (1)
- agreement processing (1)
- algae cultivation (1)
- anaphor resolution (1)
- bilingualism (1)
- blind feeling (1)
- brain rhythms (1)
- brain synchronization (1)
- carbon cycle (1)
- child development (1)
- climate change (1)
- coefficient of determination (1)
- complex networks (1)
- connectivity (1)
- contingent encounters (1)
- critical period for language (1)
- damage (1)
- data based model (1)
- digital rock physics (1)
- digital technologies (1)
- dissolved (1)
- e-learning (1)
- effective elastic properties (1)
- eighteenth century (1)
- emotions (1)
- encoding (1)
- enjoyment (1)
- epilepsy (1)
- exercise (1)
- feelings (1)
- fermentation (1)
- fiber spectroscopy (1)
- filler-gap dependencies (1)
- gas storage (1)
- geothermal reservoir (1)
- gliptins (1)
- graph analysis (1)
- historiography (1)
- human rights (1)
- hybrid nanomaterials (1)
- hydrogen (1)
- implicit (1)
- interference (1)
- ischemia reperfusion injury (1)
- language (1)
- language acquisition (1)
- learning (1)
- lifespan (1)
- mattering (1)
- memory (1)
- memory retrieval (1)
- mental simulation (1)
- meteorological extremes (1)
- methane (1)
- microfluidic (1)
- mobile phone (1)
- modernity (1)
- multiple light scattering (1)
- multivariate regression (1)
- neural synchronization (1)
- non-linear dynamics (1)
- norepinephrine (1)
- numerical (1)
- numerical simulation (1)
- old/new effect (1)
- permanent downhole electrode array (1)
- physical activity (1)
- process analytical technology (1)
- regulation (1)
- renewable energy (1)
- risk analysis (1)
- saline aquifer (1)
- scenario analysis (1)
- sensation (1)
- sensitive periods (1)
- sentence comprehension (1)
- smartphones (1)
- storage capacity (1)
- tablet computers (1)
- transcutaneous vagus nerve stimulation (1)
- tropical cyclones (1)
- vulnerability (1)
- water management (1)
- wondering (1)
Institute
- Institut für Biochemie und Biologie (22)
- Institut für Physik und Astronomie (17)
- Institut für Geowissenschaften (11)
- Department Sport- und Gesundheitswissenschaften (10)
- Hasso-Plattner-Institut für Digital Engineering gGmbH (10)
- Department Psychologie (8)
- Institut für Ernährungswissenschaft (6)
- Department Linguistik (5)
- Institut für Informatik und Computational Science (5)
- Institut für Chemie (4)
Preclinical studies in cell culture systems as well as in whole-animal chronic kidney disease (CKD) models showed that parathyroid hormone (PTH) oxidized at its two methionine residues (positions 8 and 18) suffers a loss of function. This has so far not been considered in the development of the PTH assays used in current clinical practice. Patients with advanced CKD are subject to oxidative stress, and plasma proteins (including PTH) are targets for oxidants. In patients with CKD, a considerable but variable fraction (about 70 to 90%) of measured PTH appears to be oxidized. Oxidized PTH (oxPTH) does not interact with the PTH receptor, resulting in a loss of biological activity. Currently used intact PTH (iPTH) assays detect both oxPTH and non-oxidized PTH (n-oxPTH). Clinical studies demonstrated that bioactive n-oxPTH, but neither iPTH nor oxPTH, is associated with mortality in CKD patients.
We develop a simple two-zone interpretation of the broadband baseline Crab nebula spectrum between $10^{-5}$ eV and ~100 TeV by using two distinct log-parabola energetic electron distributions. We determine analytically the very-high-energy photon spectrum as originating from inverse-Compton scattering of the far-infrared soft ambient photons within the nebula off a first population of electrons energized at the nebula termination shock. The broad and flat 200 GeV peak jointly observed by Fermi/LAT and MAGIC is naturally reproduced. The synchrotron radiation from a second energetic electron population explains the spectrum from the radio range up to ~10 keV. We infer from observations the energy dependence of the microscopic probability that the accelerating electrons remain in the proximity of the shock.
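For reference, a log-parabola electron energy distribution of the kind invoked here is conventionally parameterized as (notation assumed, not taken from the paper):

$$ N(E) \;=\; N_0 \left(\frac{E}{E_0}\right)^{-\alpha \,-\, \beta \log_{10}(E/E_0)} $$

where $E_0$ is a reference energy, $\alpha$ the spectral index at $E_0$, and $\beta$ the curvature of the spectrum in log-log space.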
Because aluminium is potentially toxic to the nervous system and bone, the safety of aluminium exposure from adjuvants in vaccines and subcutaneous immune therapy (SCIT) products has to be continuously re-evaluated, especially regarding concomitant administrations. For this purpose, knowledge of the absorption and disposition of aluminium in plasma and tissues is essential. Pharmacokinetic data after vaccination in humans, however, are not available and are, for methodological and ethical reasons, difficult to obtain. To overcome these limitations, we discuss the possibility of an in vitro-in silico approach combining a toxicokinetic model for aluminium disposition with biorelevant kinetic absorption parameters for adjuvants. We critically review the available kinetic aluminium-26 data for model building and, on the basis of a reparameterized toxicokinetic model (Nolte et al., 2001), identify the main modelling gaps. The potential of in vitro dissolution experiments for predicting the intramuscular absorption kinetics of aluminium after vaccination is explored. It becomes apparent that detailed in vitro dissolution and in vivo absorption data are needed to establish an in vitro-in vivo correlation (IVIVC) for aluminium adjuvants. We conclude that a combination of new experimental data and further refinement of the Nolte model has the potential to fill a gap in aluminium risk assessment.
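To make the in vitro-in silico idea concrete, here is a minimal sketch of the coupling: a first-order absorption ("dissolution") rate feeding an intramuscular depot into a one-compartment plasma model with first-order elimination. All parameter values are illustrative placeholders, not those of the Nolte model:

```python
import numpy as np

# Minimal sketch: first-order release from an intramuscular aluminium depot
# into a one-compartment plasma model. All values are assumed placeholders,
# not the parameters of the Nolte et al. (2001) model.
ka = 0.002   # 1/day, depot-to-plasma absorption rate (assumed)
ke = 0.1     # 1/day, plasma elimination rate (assumed)
V = 15.0     # L, plasma volume of distribution (assumed)
dose = 0.5   # mg aluminium deposited by the adjuvant (assumed)

dt = 0.1                          # days, Euler step
t = np.arange(0.0, 365.0, dt)
depot, plasma, conc = dose, 0.0, []
for _ in t:
    absorbed = ka * depot * dt    # in vitro-informed dissolution/absorption
    depot -= absorbed
    plasma += absorbed - ke * plasma * dt
    conc.append(plasma / V)       # mg/L

i = int(np.argmax(conc))
print(f"peak plasma concentration: {conc[i]:.2e} mg/L at day {t[i]:.0f}")
```

In an IVIVC setting, the constant ka would be replaced by a release profile measured in the in vitro dissolution experiments.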
The P300 and the LC-NE system: New insights from transcutaneous vagus nerve stimulation (tVNS)
(2017)
Gamma-ray bursts (GRBs) are some of the Universe's most enigmatic and exotic events. At energies above 10 GeV, however, their behaviour remains largely unknown. Although space-based telescopes such as Fermi-LAT have been able to detect GRBs in this energy range, their photon statistics are limited by the small detector size. Such limitations are not present in ground-based gamma-ray telescopes such as the H.E.S.S. experiment, which has now entered its second phase with the addition of a large 600 m² telescope at the centre of the array. Such a large telescope allows H.E.S.S. to access the sub-100 GeV energy range while still maintaining a large effective collection area, helping to potentially probe the short-timescale emission of these events.
We present a description of the H.E.S.S. GRB observation programme, summarising the performance of the rapid GRB repointing system and the conditions under which GRB observations are initiated. Additionally, we report on the GRB follow-ups made during the 2014-15 observation campaigns.
The German Enlightenment
(2017)
The term Enlightenment (or Aufklärung) remains heavily contested. Even when historians delimit the remit of the concept, assigning it to a particular historical period rather than to an intellectual or moral programme, the public resonance of the Enlightenment remains high and problematic—especially when equated in an essentialist manner with modernity or some core values of ‘the West’. This Forum has been convened to discuss recent research on the Enlightenment in Germany, different views of the term and its ideological use in public discourse outside academia (and sometimes within it).
Massive Open Online Courses (MOOCs) have left their mark on the face of education in recent years. At the Hasso Plattner Institute (HPI) in Potsdam, Germany, we are actively developing a MOOC platform, which provides our research with a plethora of e-learning topics, such as learning analytics, automated assessment, peer assessment, teamwork, online proctoring, and gamification. We run several instances of this platform. On openHPI, we provide our own courses from within the HPI context. Further instances are openSAP, openWHO, and mooc.HOUSE, the smallest of these platforms, targeting customers with a less extensive course portfolio. In 2013, we started to work on the gamification of our platform. By now, we have implemented about two thirds of the features that we had initially evaluated as useful for our purposes. About a year ago, we activated the implemented gamification features on mooc.HOUSE. Before activating the features on openHPI as well, we examined and re-evaluated our initial considerations based on the data collected so far and the changes in other contexts of our platforms.
Structural health monitoring (SHM) activities are of primary importance for managing transport infrastructure; however, most SHM methodologies are based on point sensors that have limitations in terms of spatial positioning requirements, development cost and measurement range. This paper describes the progress of the SENSKIN EC project, whose objective is to develop a dielectric-elastomer and microelectronics-based sensor, formed from a large, highly extensible capacitance-sensing membrane supported by advanced microelectronic circuitry, for monitoring transport infrastructure bridges. Such a sensor could provide spatial measurements of strain in excess of 10%. The actual sensor, along with the data acquisition module, the communication module and the power electronics, is integrated into a compact unit, the SENSKIN device, which is energy-efficient, requires simple signal processing and is easy to install over various surface types. In terms of communication, SENSKIN devices interact with each other to form the SENSKIN system: a fully distributed and autonomous wireless sensor network that is able to self-monitor. The SENSKIN system utilizes Delay-/Disruption-Tolerant Networking technologies to ensure that strain measurements are received by the base station even under extreme conditions where normal communications are disrupted. This paper describes the architecture of the SENSKIN system and the development and testing of the first SENSKIN prototype sensor, the data acquisition system, and the communication system.
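As background on why a highly extensible capacitive membrane can read out strain directly, consider the standard idealization (not taken from the project documents) of an incompressible dielectric-elastomer capacitor under uniaxial stretch $\lambda$: the electrode area grows as $\lambda^{1/2}$ while the dielectric thins as $\lambda^{-1/2}$, so

$$ C(\lambda) \;=\; \varepsilon\,\frac{A(\lambda)}{d(\lambda)} \;=\; \varepsilon\,\frac{\lambda^{1/2} A_0}{\lambda^{-1/2} d_0} \;=\; \lambda\, C_0 \quad\Longrightarrow\quad \frac{\Delta C}{C_0} \;=\; \lambda - 1, $$

i.e. the relative capacitance change tracks the strain approximately linearly, even for strains well beyond 10%.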
Since the Shallow Structure Hypothesis (SSH) was first put forward in 2006, it has inspired a growing body of research on grammatical processing in nonnative (L2) speakers. More than 10 years later, we think it is time for the SSH to be reconsidered in the light of new empirical findings and current theoretical assumptions about human language processing. The purpose of our critical commentary is twofold: to clarify some issues regarding the SSH and to sketch possible ways in which this hypothesis might be refined and improved to better account for L1 and L2 speakers’ performance patterns.
Surface acoustic wave (SAW) devices are well known for gravimetric sensor applications. In biosensing applications, chemically and biochemically evoked adsorption processes at surfaces are detected in liquid environments using delay-line or resonator sensor configurations, preferably in combination with appropriate microfluidic devices. In this paper, a novel SAW-based impedance sensor type is introduced which uses only one interdigital electrode transducer (IDT) simultaneously as SAW generator and sensor element. It is shown that the amplitude of the reflected $S_{11}$ signal directly depends on the input impedance of the SAW device. The input impedance is strongly influenced by mass adsorption, which causes a characteristic and measurable impedance mismatch.
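The stated dependence follows the standard reflection-coefficient relation (a textbook result, with $Z_0$ the reference impedance of the measurement system):

$$ S_{11} \;=\; \frac{Z_\mathrm{in} - Z_0}{Z_\mathrm{in} + Z_0}, $$

so any adsorption-induced change of the input impedance $Z_\mathrm{in}$ shifts the magnitude of the reflected $S_{11}$ signal away from its matched value.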
The keynote article (Mayberry & Kluender, 2017) makes an important contribution to questions concerning the existence and characteristics of sensitive periods in language acquisition. Specifically, by comparing groups of non-native L1 and L2 signers, the authors have been able to ingeniously disentangle the effects of maturation from those of early language exposure. Based on L1 versus L2 contrasts, the paper convincingly argues that L2 learning is a less clear test of sensitive periods. Nevertheless, we believe Mayberry and Kluender underestimate the evidence for maturational factors in L2 learning, especially that coming from recent research.
Root infinitives on Twitter
(2017)
Recently, a multitude of empirically derived damage models have been applied to project future tropical cyclone (TC) losses for the United States. In their study, Geiger et al (2016 Environ. Res. Lett. 11 084012) compared two approaches that differ in the scaling of losses with socio-economic drivers: the commonly used approach, resulting in a sub-linear scaling of historical TC losses with a nation's affected gross domestic product (GDP), and the disentangled approach, which shows a sub-linear increase with affected population and a super-linear scaling of relative losses with per capita income. Statistics cannot determine which approach is preferable, but since process understanding demands that the loss depends on both GDP per capita and population, an approach that accounts for both separately is preferable to one which assumes a specific relation between the two dependencies. In the accompanying comment, Rybski et al argue that there is no rigorous evidence for the conclusion that high income does not protect against hurricane losses. Here we affirm that our conclusion is drawn correctly and reply to further remarks raised in the comment, highlighting the adequacy of our approach but also the potential for future extensions of our research.
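Schematically, in our notation rather than the paper's: with affected population $P$ and per-capita income $g = \mathrm{GDP}/P$, the conventional approach fits

$$ L \;\propto\; (P\,g)^{\gamma} \;=\; \mathrm{GDP}^{\gamma}, $$

whereas the disentangled approach estimates separate exponents,

$$ L \;\propto\; P^{\alpha}\, g^{\beta}, $$

recovering the conventional form only in the special case $\alpha = \beta = \gamma$.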
New data from the LEADER trial show that the glucagon-like peptide 1 receptor agonist liraglutide protects against diabetic nephropathy in patients with type 2 diabetes mellitus. The renoprotective efficacy of liraglutide is not, however, as great as that reported for the sodium-glucose cotransporter 2 inhibitor empagliflozin in the EMPA-REG OUTCOME trial.
Background: Evidence that home telemonitoring (HTM) for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health-economic advantage. The CardioBBEAT trial was therefore designed to prospectively assess the health-economic impact of a dedicated home monitoring system for patients with CHF, based on actual costs obtained directly from patients' health care providers.
Methods: Between January 2010 and June 2013, 621 patients (mean age 63.0 ± 11.5 years, 88% male) with a confirmed diagnosis of CHF (LVEF ≤ 40%) were enrolled and randomly assigned to two study groups comprising usual care with and without an interactive bi-directional HTM system (Motiva®). The primary endpoint was the incremental cost-effectiveness ratio (ICER) established from the groups' difference in total cost and in the combined clinical endpoint "days alive and not in hospital nor inpatient care per potential days in study" within the 12-month follow-up. Secondary outcome measures were total mortality and health-related quality of life (SF-36, WHO-5 and KCCQ).
Results: In the intention-to-treat analysis, total mortality (HR 0.81; 95% CI 0.45-1.45) and days alive and not in hospital (343.3 ± 55.4 vs. 347.2 ± 43.9; p = 0.909) were not significantly different between HTM and usual care. While the resulting primary-endpoint ICER was not positive (-181.9; 95% CI -1626.2 to 1628.9), quality of life assessed by SF-36, WHO-5 and KCCQ as a secondary endpoint was significantly higher in the HTM group at 6 and 12 months of follow-up.
Conclusions: This first simultaneous assessment of the clinical and economic outcome of HTM in patients with CHF did not demonstrate superior incremental cost-effectiveness compared to usual care. On the other hand, quality of life was improved. It remains open whether the tested HTM solution represents a useful innovative approach in the current health care setting.
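For reference, the ICER used as primary endpoint is the conventional ratio of the between-group differences in mean cost and mean effect:

$$ \mathrm{ICER} \;=\; \frac{\bar{C}_{\mathrm{HTM}} - \bar{C}_{\mathrm{UC}}}{\bar{E}_{\mathrm{HTM}} - \bar{E}_{\mathrm{UC}}}, $$

with $C$ the total cost per patient and $E$ the combined clinical endpoint (days alive and not in hospital nor inpatient care per potential days in study).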
The identification of vulnerabilities relies on detailed information about the target infrastructure. Gathering the necessary information is a crucial step that requires intensive scanning, or mature expertise and knowledge about the system, even though the information may already be available in a different context. In this paper, we propose a new method to detect vulnerabilities that reuses existing information and eliminates the need for a comprehensive scan of the target system. Since our approach is able to identify vulnerabilities without the additional effort of a scan, we increase the overall performance of the detection. Because of this reuse and the removal of active testing procedures, our approach can be classified as passive vulnerability detection. We explain the approach and illustrate the additional possibility of increasing the security awareness of users. To this end, we applied the approach to an experimental setup and extracted security-relevant information from web logs.
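A minimal sketch of the passive idea, assuming the logs expose software banners: harvest product/version strings already present in existing web logs and match them against a local vulnerability list, with no scan of the target. All product names, versions and advisory IDs below are hypothetical:

```python
import re
from collections import defaultdict

# Hypothetical local vulnerability list: product -> (vulnerable version
# prefix, advisory ID). All entries are illustrative placeholders.
VULN_DB = {
    "ExampleHTTPd": [("2.4", "ADV-0001")],
    "ExampleSSH": [("7.2", "ADV-0002")],
}

# Matches banners such as "ExampleHTTPd/2.4.1" appearing in log lines.
BANNER_RE = re.compile(r"(ExampleHTTPd|ExampleSSH)/([\w.]+)")

def scan_logs(lines):
    """Passively collect product/version observations and flag known issues."""
    findings = defaultdict(set)
    for line in lines:
        for product, version in BANNER_RE.findall(line):
            for vuln_prefix, advisory in VULN_DB.get(product, []):
                if version.startswith(vuln_prefix):  # naive prefix match
                    findings[f"{product}/{version}"].add(advisory)
    return findings

if __name__ == "__main__":
    sample = ['10.0.0.5 - - [12/May/2017] "GET / HTTP/1.1" 200 512 '
              '"ExampleHTTPd/2.4.1"']
    for target, advisories in scan_logs(sample).items():
        print(target, "->", ", ".join(sorted(advisories)))
```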
The letter to the editor focuses largely on the peer-review system, which is why we address only the substantive points relating to our work. We examined pain-psychological interventions, defined as described, i.e. psychological interventions whose primary goal was pain reduction.
The extracted outcome measures, such as quality of life or depression, were derived from the main outcomes examined in the primary studies and not from the search strategy.
For the assessment of the methodological quality of the primary studies, one criterion of the score developed by Johannsen and colleagues [2] could not be taken into account, since the included primary studies did not permit a meta-analytic synthesis. Taking this into account, the comparability of the two values is preserved.
The evidence synthesis was carried out narratively in text and tabular form, i.e. as a structured summary and discussion of the studies [1].
To sharpen the focus of our work, we would of course have undertaken a more extensive comparison, as well as a check of quotations and translations, had we received the relevant feedback before publication.
Preface
(2017)
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary to couple chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization to estimate Young's moduli for five synthetic microstructures and successfully validate our results for comparable geometries against the analytical Mori-Tanaka approach. Further, we demonstrate that considering specific rock microstructures is of paramount importance, since calculated elastic properties may deviate by up to 230% for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young's moduli is significantly improved when detailed spatial information is employed.
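For the two-phase case, the Mori-Tanaka estimate used for validation takes the standard form (notation assumed here; the study may employ a multi-phase generalization):

$$ \mathbb{C}^{\mathrm{MT}} \;=\; \mathbb{C}_0 \;+\; \phi\,\left(\mathbb{C}_1 - \mathbb{C}_0\right) : \mathbb{A}^{\mathrm{dil}} \left[(1-\phi)\,\mathbb{I} + \phi\,\mathbb{A}^{\mathrm{dil}}\right]^{-1}, $$

where $\mathbb{C}_0$ and $\mathbb{C}_1$ are the matrix and inclusion stiffness tensors, $\phi$ the inclusion volume fraction, $\mathbb{A}^{\mathrm{dil}}$ the dilute (Eshelby) strain-concentration tensor, and $\mathbb{I}$ the fourth-order identity; effective Young's moduli follow from $\mathbb{C}^{\mathrm{MT}}$.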
Photonic sensing in highly concentrated biotechnical processes by photon density wave spectroscopy
(2017)
Photon Density Wave (PDW) spectroscopy is introduced as a new approach for photonic sensing in highly concentrated biotechnical processes. It independently quantifies the absorption and reduced scattering coefficients, calibration-free and as a function of time, thus describing the optical properties of the biomaterial in the vis/NIR range during processing. As examples of industrial relevance, enzymatic milk coagulation, beer mashing, and algae cultivation in photobioreactors are discussed.
Preclinical assessment of penetration not only in intact but also in barrier-disrupted skin is important to explore the surplus value of novel drug delivery systems, which can be specifically designed for diseased skin. Here, we characterized physical and chemical barrier disruption protocols for short-term ex vivo skin cultures with regard to structural integrity and physiological and biological parameters. Further, we compared the penetration of dexamethasone (Dex) in different nanoparticle-based formulations in stratum corneum, epidermis and dermis extracts of intact vs. barrier-disrupted skin, as well as by dermal microdialysis, at 6, 12 and 24 hours after topical application. Dex was quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simultaneously, we investigated Dex efficacy by interleukin (IL) analysis. Tape stripping (TS) and 4-hour exposure to 5% sodium lauryl sulfate (SLS) were identified as highly effective barrier disruption methods, as assessed by reproducible changes in transepidermal water loss (TEWL) and an IL-6/8 increase that was more pronounced in SLS-treated skin. The barrier state also has a significant impact on the Dex penetration kinetics: for all formulations, TS greatly increased the dermal Dex concentration, despite the fact that nanocrystals quickly and effectively penetrated both intact and barrier-disrupted skin, reaching significantly higher dermal Dex concentrations after 6 hours compared to Dex cream. The surplus value of encapsulation in ethyl cellulose nanocarriers was mostly observed when applied on intact skin, generally showing a delayed Dex penetration. Estimation of cytokines was limited due to the trauma caused by probe insertion. In summary, ex vivo human skin is a highly interesting short-term preclinical model for analysing the penetration and efficacy of novel drug delivery systems.
Background: Infliximab (IFX), an anti-TNF monoclonal antibody approved for the treatment of inflammatory bowel disease, is dosed per kg body weight (BW). However, the rationale for body-size adjustment has not been unequivocally demonstrated [1], and first attempts to improve IFX therapy have been undertaken [2]. The aim of our study was to assess the impact of different dosing strategies (i.e. body-size-adjusted and fixed dosing) on drug exposure and pharmacokinetic (PK) target attainment. For this purpose, a comprehensive simulation study was performed using patient characteristics (n = 116) from an in-house clinical database.
Methods: IFX concentration-time profiles of 1000 virtual, clinically representative patients were generated using a previously published PK model for IFX in patients with Crohn's disease [3]. For each patient, 1000 profiles accounting for PK variability were considered. IFX exposure during maintenance treatment was compared across the following dosing strategies: (i) fixed dose, and dosing per (ii) BW, (iii) lean BW (LBW), (iv) body surface area (BSA), (v) height (HT), (vi) body mass index (BMI) and (vii) fat-free mass (FFM). For each dosing strategy, the variability in maximum concentration (Cmax), minimum concentration Cmin (= C8weeks) and area under the concentration-time curve (AUC), as well as the percentage of patients achieving the PK target (Cmin ≥ 3 μg/mL [4]), were assessed.
Results: For all dosing strategies, the variability of Cmin (CV ≈ 110%) was highest compared to Cmax and AUC, and was of similar extent regardless of dosing strategy. The proportion of patients reaching the PK target (≈⅓) was approximately equal for all dosing strategies.
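The flavour of such a simulation can be illustrated with a deliberately simplified sketch: a one-compartment model with log-normal interindividual variability, comparing fixed and weight-based maintenance dosing against the trough target. All parameter values are assumed for illustration and are not those of the published population PK model [3]:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simplified one-compartment IV bolus model; all values are assumed
# placeholders, not the parameters of the published model [3].
N = 1000                                        # virtual patients
weight = rng.normal(70, 15, N).clip(40, 140)    # body weight [kg] (assumed)
CL = 0.3 * np.exp(rng.normal(0, 0.3, N))        # clearance [L/day] (assumed)
V = 5.0 * np.exp(rng.normal(0, 0.2, N))         # central volume [L] (assumed)
tau, target = 56.0, 3.0                         # 8-week interval, Cmin target

def cmin_steady_state(dose_mg):
    """Steady-state trough for repeated IV bolus dosing, per patient."""
    ke = CL / V
    return (dose_mg / V) * np.exp(-ke * tau) / (1.0 - np.exp(-ke * tau))

for label, dose in [("fixed 350 mg", np.full(N, 350.0)),
                    ("5 mg/kg BW", 5.0 * weight)]:
    cmin = cmin_steady_state(dose)
    print(f"{label}: {100 * np.mean(cmin >= target):.1f}% reach "
          f"Cmin >= {target} ug/mL")
```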
This paper describes architectural extensions for a dynamically scheduled processor so that it can be used in three different operation modes, ranging from high performance to high reliability. With minor hardware extensions of the control path, the resources of the superscalar data path can be used either for high-performance execution, fail-safe operation, or fault-tolerant operation. This makes the processor architecture a very good candidate for applications with dynamically changing reliability requirements, e.g. automotive applications. The paper reports the hardware overhead of the extensions and investigates the performance penalties introduced by the fail-safe and fault-tolerant modes. Furthermore, a comprehensive fault simulation was carried out to investigate the fault coverage of the proposed approach.
Web-based E-Learning uses Internet technologies and digital media to deliver educational content to learners. In recent years, many universities have applied their capacity to producing Massive Open Online Courses (MOOCs), offering them with the expectation of rendering a comprehensive online apprenticeship. Typically, an online content delivery process requires an Internet connection. However, broadband access has never been a readily available resource in many regions. In Africa, poor or non-existent networks are still the predominant experience of Internet users, with devices frequently going offline whenever they disconnect from a network. As a result, learning processes in such regions are repeatedly disrupted, delayed or terminated. This paper raises the concern of E-Learning under poor and low bandwidth; in fact, it highlights the need for an Offline-Enabled mode. The paper also explores technical approaches aimed at enhancing the user experience in Web-based E-Learning, particularly in Africa.
Tailed bacteriophages specific for Gram-negative bacteria encounter lipopolysaccharide (LPS) during the first infection steps. Yet, it is not well understood how the biochemistry of these initial interactions relates to the subsequent events that orchestrate phage adsorption and tail rearrangements to initiate cell entry. For many phages, the long O-antigen chains found on the LPS of smooth bacterial strains serve as an essential receptor recognized by their tailspike proteins (TSP). Many TSP are depolymerases, and O-antigen cleavage has been described as a necessary step for subsequent orientation towards a secondary receptor. However, O-antigen-specific host attachment does not always entail O-antigen degradation. In this issue of Molecular Microbiology, Prokhorov et al. report that coliphage G7C carries a TSP that deacetylates the O-antigen but does not degrade it, whereas rough strains or strains lacking O-antigen acetylation remain unaffected. Bacteriophage G7C specifically functionalizes its tail by attaching the deacetylase TSP directly to a second TSP that is nonfunctional on the host's O-antigen. This challenges the view that bacteriophages use their TSP only to clear their way to a secondary receptor. Rather, O-antigen-specific phages may employ enzymatically active TSP as a tool for irreversible LPS membrane binding to initiate subsequent infection steps.
Nanocarriers
(2017)
Water management tools are necessary to guarantee the preservation of natural resources while ensuring optimal utilization. Linear regression models are a simple and quick solution for creating prognostic capabilities. Multivariate models show higher precision than univariate models. In the case of Waiwera, implementing individual production rates is more accurate than applying just the total production rate. A maximum of approximately 1,075 m³/day can be pumped while ensuring a water level of at least 0.5 m a.s.l. in the monitoring well. The model should be renewed annually to incorporate new data and current water level trends and thus maintain its quality.
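A minimal sketch of such a multivariate model, with synthetic stand-in data (the actual model is fitted to the measured per-well production rates and water levels at Waiwera):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: four wells, one year of daily production rates.
days = 365
q = rng.uniform(50, 300, size=(days, 4))          # per-well rate [m3/day]
level = (1.2 - q @ np.array([8e-4, 6e-4, 9e-4, 5e-4])
         + rng.normal(0, 0.05, days))             # water level [m a.s.l.]

# Multivariate linear regression: level ~ b0 + sum_i b_i * q_i
X = np.column_stack([np.ones(days), q])
coef, *_ = np.linalg.lstsq(X, level, rcond=None)

# Predicted level when ~1,075 m3/day is spread evenly over the four wells.
q_new = np.full(4, 1075.0 / 4.0)
print(f"predicted level: {coef[0] + q_new @ coef[1:]:.2f} m a.s.l.")
```

Renewing such a model annually, as suggested above, simply means re-estimating the coefficients on the extended record.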
Moving Forces
(2017)
Throughout a large part of the twentieth century, the body was interpreted as a field of signs, the meaning of which pointed to an unconscious dimension. At the height of the popularity of structuralism, Jacques Lacan deemed the unconscious to be “structured like a language.” Starting in the early 1990s, however, a deep shift occurred in the way the body was interpreted. A new movement cast tremendous doubt on the hegemony of language and instead advocated a performative, pictorial, and affective approach — the so-called material turn — which encompassed all of these. In the words of Karen Barad, this turn inquired as to why meaning, history, and truth are assigned to language only, whereas the movements of materiality are given less prominence: “How did language come to be more trustworthy than matter? Why are language and culture granted their own agency and historicity while matter is figured as passive and immutable?” With this shift toward the material, bodies began to be seen in a different light and their materiality understood as something that follows its own laws and movements, which cannot be understood exclusively in terms of social-cultural codes. Instead, these laws and movements call into question the very dichotomies of nature/culture and body/spirit.
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to integrate the geoelectrical method into the measurement programme for tracking the CO2 plume. The example of the Ketzin site shows that during all phases of the CO2 storage reservoir development, the resistivity measurements and their tomographic interpretation contribute beneficially to the measurement, monitoring and verification (MMV) protocol. The most important impact of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
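Such saturation estimates typically rest on an Archie-type resistivity index (a standard petrophysical relation, not a result of the cited study): with $\rho_0$ the baseline resistivity of the fully brine-saturated reservoir, $\rho$ the monitored resistivity, and $n$ the saturation exponent,

$$ \frac{\rho}{\rho_0} \;=\; S_w^{-n} \quad\Longrightarrow\quad S_{\mathrm{CO_2}} \;=\; 1 - S_w \;=\; 1 - \left(\frac{\rho_0}{\rho}\right)^{1/n}, $$

so the resistivity increase imaged by the electrode array maps directly onto CO2 saturation.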
During their evolution, massive stars are characterized by a significant loss of mass, either via spherically symmetric stellar winds or by aspherical mass-loss mechanisms, namely outflowing equatorial disks. However, the scenario that leads to the formation of a disk or rings of gas and dust around these objects is still under debate: is it a viscous disk, an outflowing disk-forming wind, or some other mechanism? It is also unclear how the various physical mechanisms acting on the circumstellar environment of the stars affect its shape, density, kinematic, and thermal structure. We assume that the disk-forming mechanism is viscous transport within an equatorial outflowing disk of a rapidly or even critically rotating star. We study the hydrodynamic and thermal structure of the optically thick, dense parts of outflowing circumstellar disks that may form around, e.g., Be stars, sgB[e] stars, or Pop III stars. We calculate self-consistent, time-dependent models of the inner dense region of the disk, which is strongly affected both by irradiation from the central star and by viscous heating. We also simulate the dynamical effects of collisions between expanding supernova ejecta and the circumstellar disks that may form around sgB[e] stars and, e.g., LBVs or Pop III stars.
Mixed-projection treemaps
(2017)
This paper presents a novel technique for combining 2D and 2.5D treemaps using multi-perspective views to leverage the advantages of both treemap types. It enables a new form of overview+detail visualization for tree-structured data and contributes new concepts for real-time rendering of and interaction with treemaps. The technique operates by tilting the graphical elements representing inner nodes using affine transformations and animated state transitions. We explain how to mix orthogonal and perspective projections within a single treemap. Finally, we show application examples that benefit from the reduced interaction overhead.
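The paper's technique tilts the graphical elements of inner nodes via affine transformations within one treemap; as a loose, hypothetical illustration of mixing projection types in a single scene (not the authors' implementation), one can blend OpenGL-style orthographic and perspective projection matrices per element. Linear matrix interpolation is a crude approximation that is only well-behaved for moderate blends:

```python
import numpy as np

def perspective(fov_y, aspect, near, far):
    """OpenGL-style symmetric perspective projection matrix."""
    f = 1.0 / np.tan(fov_y / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

def orthographic(half_h, aspect, near, far):
    """OpenGL-style symmetric orthographic projection matrix."""
    return np.array([
        [1.0 / (half_h * aspect), 0.0, 0.0, 0.0],
        [0.0, 1.0 / half_h, 0.0, 0.0],
        [0.0, 0.0, 2.0 / (near - far), (far + near) / (near - far)],
        [0.0, 0.0, 0.0, 1.0],
    ])

def mixed(t, fov_y=0.8, half_h=1.0, aspect=1.6, near=0.1, far=100.0):
    """Blend orthographic (t=0) into perspective (t=1) for one element."""
    return ((1.0 - t) * orthographic(half_h, aspect, near, far)
            + t * perspective(fov_y, aspect, near, far))

print(mixed(0.5).round(3))  # halfway between the two projections
```

Animating t per inner node would mimic the animated state transitions described above.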
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods, which assign the discrete value of a maximum entropy distribution to equal the traffic flow rate; it differs in using a distribution to represent each flow rate. The method is shown to handle uncertainty in a more elegant way and to give similar results to traditional methods. It is able to incorporate more of the observed data through the entropy function, prior distribution and integration limits, potentially allowing better inferences to be made.
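As background, the classical entropy-maximizing derivation that this work generalizes (the doubly constrained gravity model) chooses the trip matrix $T_{ij}$ by

$$ \max_{T_{ij}} \; -\sum_{ij} T_{ij}\left(\ln T_{ij} - 1\right) \quad \text{s.t.} \quad \sum_j T_{ij} = O_i, \;\; \sum_i T_{ij} = D_j, \;\; \sum_{ij} T_{ij}\, c_{ij} = C, $$

whose solution has the familiar gravity form $T_{ij} = A_i O_i B_j D_j\, e^{-\beta c_{ij}}$. The proposed method replaces each such point estimate of the flow rate with a distribution.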
This paper discusses a new approach for designing and deploying Security-as-a-Service (SecaaS) applications using cloud-native design patterns. Current SecaaS approaches do not efficiently handle the increasing threats to computer systems and applications. For example, requests for security assessments increase drastically after a high-risk security vulnerability is disclosed. In such scenarios, SecaaS applications are unable to scale dynamically to serve the requests. A root cause of this challenge is the employment of architectures not specifically fitted to cloud environments. Cloud-native design patterns resolve this challenge by enabling properties such as massive scalability and resiliency via the combination of microservice patterns and cloud-focused design patterns. However, adopting these patterns is a complex process during which several security issues are introduced. In this work, we investigate these security issues, and we redesign and deploy a monolithic SecaaS application using cloud-native design patterns while considering appropriate, layered security counter-measures, i.e. at the application and the cloud networking layer. Our prototype implementation outperforms traditional, monolithic applications with an average Scanner Time of 6 minutes, without compromising security. Our approach can be employed for designing secure, scalable and performant SecaaS applications that effectively handle unexpected increases in security assessment requests.