The origin of ambling horses
(2016)
Horseback riding is the most fundamental use of domestic horses and has had a huge influence on the development of human societies for millennia. Over time, riding techniques and the style of riding improved. Therefore, horses with the ability to perform comfortable gaits (e.g. ambling or pacing), so-called ‘gaited’ horses, have been highly valued by humans, especially for long distance travel. Recently, the causative mutation for gaitedness in horses has been linked to a substitution causing a premature stop codon in the DMRT3 gene (DMRT3_Ser301STOP) [1]. In mice, Dmrt3 is expressed in spinal cord interneurons and plays an important role in the development of limb movement coordination [1]. Genotyping the position in 4396 modern horses from 141 breeds revealed that nowadays the mutated allele is distributed worldwide with an especially high frequency in gaited horses and breeds used for harness racing [2]. Here, we examine historic horse remains for the DMRT3 SNP, tracking the origin of gaitedness to Medieval England between 850 and 900 AD. The presence of the corresponding allele in Icelandic horses (9th–11th century) strongly suggests that ambling horses were brought from the British Isles to Iceland by Norse people. Considering the high frequency of the ambling allele in early Icelandic horses, we believe that Norse settlers selected for this comfortable mode of horse riding soon after arrival. The absence of the allele in samples from continental Europe (including Scandinavia) at this time implies that ambling horses may have spread from Iceland and maybe also the British Isles across the continent at a later date.
The Gradient Symbolic Computation (GSC) model presented in the keynote article (Goldrick, Putnam & Schwarz) constitutes a significant theoretical development, not only as a model of bilingual code-mixing, but also as a general framework that brings together symbolic grammars and graded representations. The authors are to be commended for successfully integrating a theory of grammatical knowledge with the voluminous research on lexical co-activation in bilinguals. It is, however, unfortunate that a certain conception of bilingualism was inherited from this latter research tradition, one in which the contrast between native and non-native language takes a back seat.
Adsorption of amino acids on the magnetite-(111)-surface: a force field study (vol 19, 851, 2013)
(2016)
Storm runoff from the Marikina River Basin frequently causes flood events in the Philippine capital region Metro Manila. This paper presents and evaluates a system to predict short-term runoff from the upper part of that basin (380 km²). It was designed as a possible component of an operational warning system yet to be installed. For the purpose of forecast verification, hindcasts of streamflow were generated for a period of 15 months with a time-continuous, conceptual hydrological model. The latter was fed with real-time observations of rainfall. Both ground observations and weather radar data were tested as rainfall forcings. The radar-based precipitation estimates clearly outperformed the rain-gauge-based estimates in the hydrological verification. Nevertheless, the quality of the deterministic short-term runoff forecasts was found to be limited. For the radar-based predictions, the reduction of variance for lead times of 1, 2 and 3 hours was 0.61, 0.62 and 0.54, respectively, with reference to a no-forecast scenario, i.e. persistence. The probability of detection for major increases in streamflow was typically less than 0.5. Given the significance of flood events in the Marikina Basin, more effort needs to be put into the reduction of forecast errors and the quantification of remaining uncertainties.
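The verification scores quoted above can be computed in a few lines. The following is an illustrative sketch only (the function names and the synthetic streamflow series are ours, not taken from the study): reduction of variance measures skill relative to a persistence forecast, and probability of detection is the fraction of observed events that were also forecast.

```python
import numpy as np

def reduction_of_variance(obs, fcst, lead):
    """Skill relative to persistence: 1 - MSE(forecast) / MSE(persistence).

    The persistence "forecast" for lead k simply carries the last
    observation forward: obs[t] is used as the prediction for time t + k.
    """
    mse_fcst = np.mean((obs[lead:] - fcst[lead:]) ** 2)
    mse_pers = np.mean((obs[lead:] - obs[:-lead]) ** 2)
    return 1.0 - mse_fcst / mse_pers

def probability_of_detection(obs_event, fcst_event):
    """Fraction of observed events that were forecast: hits / (hits + misses)."""
    hits = np.sum(obs_event & fcst_event)
    misses = np.sum(obs_event & ~fcst_event)
    return hits / (hits + misses)

# toy example with synthetic streamflow
rng = np.random.default_rng(0)
obs = np.cumsum(rng.normal(size=200)) + 50    # random-walk "streamflow"
fcst = obs + rng.normal(scale=0.5, size=200)  # a fairly accurate forecast
rv = reduction_of_variance(obs, fcst, lead=3)
```

A reduction of variance of 1 would mean a perfect forecast; 0 means no improvement over persistence, which is the no-forecast baseline used in the paper.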
Recent advances in high-throughput sequencing experiments and their theoretical descriptions have driven fast dynamics in the "chromatin and epigenetics" field, with new concepts appearing at a high rate. This field includes, but is not limited to, the study of DNA-protein-RNA interactions, chromatin packing properties at different scales, regulation of gene expression and protein trafficking in the cell nucleus, binding-site search in the crowded chromatin environment, and the modulation of physical interactions by covalent chemical modifications of the binding partners. The current special issue does not claim to cover the field in full; rather, it aims to capture its development and provide a snapshot of the most recent concepts and approaches. The eighteen open-access articles comprising this issue strike a delicate balance between current theoretical and experimental biophysical approaches to uncovering chromatin structure and understanding epigenetic regulation, allowing a free flow of new ideas and preliminary results.
A harmonized data file as the basis for comparative analysis of quality of life in the Candidate Countries and the European Union member states, based on seven different data sets: one Eurobarometer survey covering 13 Candidate Countries with an identical set of variables, conducted in April 2002, and six Standard Eurobarometer surveys on different subjects, fielded in different years, each with a set of questions identical to those of the CC Eurobarometer. Selected aggregate indicators of quality of life ... describing the social situation in the EU15 and the Candidate Countries.
Prevention of Cognitive Decline: A Physical Exercise Perspective on Brain Health in the Long Run
(2016)
Pancreatic secretory zymogen-granule membrane glycoprotein 2 (GP2) has been identified as a major autoantigenic target in Crohn’s disease patients. It has recently been reported that a long and a short isoform of GP2 exist, and that the short isoform is often detected by GP2-specific autoantibodies. These GP2-specific autoantibodies are being discussed as new serological markers for the diagnosis and therapeutic monitoring of inflammatory bowel diseases. To investigate this further, camelid nanobodies were generated by phage display and selected against the short isoform of GP2 in order to obtain specific tools for discriminating between the two isoforms. Nanobodies are single-domain antibodies derived from camelid heavy-chain-only antibodies and are characterized by high stability and solubility. The selected candidates were expressed, purified and validated with regard to their binding properties in different enzyme-linked immunosorbent assay formats, immunofluorescence, immunohistochemistry and surface plasmon resonance spectroscopy. Four different nanobodies were selected, of which three recognize the short isoform of GP2 very specifically and one shows a high binding capacity for both isoforms. The KD values measured for all nanobodies, ranging from 1.3 nM down to 2.3 pM, indicate highly specific binders suitable for application as diagnostic tools in inflammatory bowel disease.
Monoclonal antibodies are highly valuable tools in biomedicine, but their generation by hybridoma technology is very time-consuming and elaborate. In order to circumvent these drawbacks, an in vitro immunization approach was established by which murine as well as human monoclonal antibodies against a viral coat protein could be developed. The in vitro immunization process was performed by isolating murine hematopoietic stem cells or human monocytes and differentiating them in vitro into immature dendritic cells. After antigen loading, the cells were co-cultivated with naive T and B lymphocytes for three days in order to obtain antigen-specific B lymphocytes in culture, followed by fusion with murine myeloma cells or human/murine heteromyeloma cells. Antigen-specific hybridomas were selected, and the generated antibodies were purified and characterized in this study by ELISA, western blot, gene sequencing and affinity measurements. Furthermore, their characteristics were compared to those of a monoclonal antibody against the same target generated by conventional hybridoma technology. Isotype detection revealed a murine IgM and a human IgG4 antibody, in comparison to an IgG1 for the conventionally generated antibody. The antibodies derived from in vitro immunization did show a lower affinity for the antigen than the conventionally generated one, probably owing to the significantly shorter B-cell maturation (three days) during the immunization process. Nevertheless, they were suitable for building a sandwich-based detection system. The in vitro immunization approach therefore appears to be a good and, in particular, fast alternative to conventional hybridoma technology.
Twenty-four scientists met at Aschauhof, Altenhof, Germany, to discuss the associations between child growth and development and nutrition, health, environment and psychology. Meta-analyses of body height, height variability and household inequality in historic and modern growth studies published since 1794 highlighted the enormously flexible patterns of child and adolescent height and weight increments throughout history, which depend not only on genetics, prenatal development, nutrition, health and economic circumstances, but also reflect social interactions. A Quality of Life in Short Stature Youth Questionnaire was presented for the cross-cultural assessment of health-related quality of life in children. Changes in child body proportions in recent history, the relation between height and longevity in historic Dutch samples, and measures of body height in skeletal remains were among the topics of this meeting. Bayesian approaches and Monte Carlo simulations offer new statistical tools for the study of human growth.
Seek and destroy: Filtration schemes and self-detoxifying protective fabrics based on the Zr(IV)-containing metal-organic frameworks (MOFs) MOF-808 and UiO-66 doped with LiOtBu have been developed that capture and hydrolytically detoxify simulants of nerve agents and mustard gas. Both MOFs function as highly catalytic elements in these applications.
Traditional economic theory could not explain, much less predict, the near collapse of the financial system and its long-lasting effects on the global economy. Since the 2008 crisis, there has been increasing interest in using ideas from complexity theory to make sense of economic and financial markets. Concepts, such as tipping points, networks, contagion, feedback, and resilience have entered the financial and regulatory lexicon, but actual use of complexity models and results remains at an early stage. Recent insights and techniques offer potential for better monitoring and management of highly interconnected economic and financial systems and, thus, may help anticipate and manage future crises.
It is argued that, despite differences in cultural norms and practices, the evidence for a link between violent media use and aggression is remarkably consistent across different countries. Along with evidence that different operationalizations of violent media use also converge across countries, these findings strengthen the conclusion that violent media are a risk factor for aggression and validate the psychological explanations for these effects. However, we need comparative studies based on a consistent methodology and a theory-based selection of cultural difference variables to properly examine the potential impact of culture on the association between violent media use and aggression.
Abrupt monsoon transitions as seen in paleorecords can be explained by moisture-advection feedback
(2016)
The German Enlightenment
(2017)
The term Enlightenment (or Aufklärung) remains heavily contested. Even when historians delimit the remit of the concept, assigning it to a particular historical period rather than to an intellectual or moral programme, the public resonance of the Enlightenment remains high and problematic—especially when equated in an essentialist manner with modernity or some core values of ‘the West’. This Forum has been convened to discuss recent research on the Enlightenment in Germany, different views of the term and its ideological use in public discourse outside academia (and sometimes within it).
The interview offers a reconstruction of the German reception of Durkheim since the middle of the 1970s. Hans Joas, who was one of its major protagonists, discusses the backdrop that finally permitted a scholarly examination of Durkheim’s sociology in Germany. Focussing on his personal reception Joas then gives an account of the Durkheimian themes that inspire his work.
Just after the publication of the Theory of Communicative Action in 1981, a new generation of interpreters started a different reception of Durkheim in Germany. Hans-Peter Müller, sociologist and editor of the German translation of Leçons de sociologie, reconstructs the history of the German reception of Durkheim and explains the reasons for his interest in the French sociologist. He offers various insights into the background that allowed the post-Habermasian generation to reach a new understanding of Durkheim’s work, illuminating the scientific and political conditions from which this new sensibility emerged.
New data from the LEADER trial show that the glucagon-like peptide 1 receptor agonist liraglutide protects against diabetic nephropathy in patients with type 2 diabetes mellitus. The renoprotective efficacy of liraglutide is not, however, as great as that reported for the sodium-glucose cotransporter 2 inhibitor empagliflozin in the EMPA-REG OUTCOME trial.
Editorial
(2017)
Tailed bacteriophages specific for Gram-negative bacteria encounter lipopolysaccharide (LPS) during the first steps of infection. Yet it is not well understood how the biochemistry of these initial interactions relates to the subsequent events that orchestrate phage adsorption and tail rearrangements to initiate cell entry. For many phages, the long O-antigen chains found on the LPS of smooth bacterial strains serve as the essential receptor recognized by their tailspike proteins (TSP). Many TSP are depolymerases, and O-antigen cleavage has been described as a necessary step for subsequent orientation towards a secondary receptor. However, O-antigen-specific host attachment need not always be accompanied by O-antigen degradation. In this issue of Molecular Microbiology, Prokhorov et al. report that coliphage G7C carries a TSP that deacetylates O-antigen but does not degrade it; rough strains, or strains lacking O-antigen acetylation, remain unaffected. Bacteriophage G7C specifically functionalizes its tail by attaching the deacetylase TSP directly to a second TSP that is nonfunctional on the host's O-antigen. This challenges the view that bacteriophages use their TSP only to clear their way to a secondary receptor. Rather, O-antigen-specific phages may employ enzymatically active TSP as a tool for irreversible LPS membrane binding to initiate the subsequent infection steps.
Kijko et al. (2016) present various methods to estimate parameters that are relevant for probabilistic seismic-hazard assessment. One of these parameters, although not the most influential, is the maximum possible earthquake magnitude m(max). I show that the proposed estimation of m(max) is based on an erroneous equation related to a misuse of the estimator in Cooke (1979) and leads to unstable results. So far, reported finite estimations of m(max) arise from data selection, because the estimator in Kijko et al. (2016) diverges with finite probability. This finding is independent of the assumed distribution of earthquake magnitudes. For the specific choice of the doubly truncated Gutenberg-Richter distribution, I illustrate the problems by deriving explicit equations. Finally, I conclude that point estimators are generally not a suitable approach to constrain m(max).
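For context, the Cooke (1979)-type estimator at issue has the general form (notation ours; see the cited papers for the exact variants):

```latex
\hat{m}_{\max} \;=\; m_{\max}^{\mathrm{obs}}
  \;+\; \int_{m_{\min}}^{\,m_{\max}^{\mathrm{obs}}} \left[ F_M(m) \right]^{n} \mathrm{d}m ,
```

where \(F_M\) is the cumulative distribution of magnitudes and \(n\) is the sample size; the abstract's point is that the variant used by Kijko et al. (2016) misapplies this estimator, so that the resulting estimate of m(max) diverges with finite probability.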
DPP4 inhibition prevents AKI
(2017)
The design of embedded systems is becoming ever more complex, so that efficient system-level design methods are crucial. Recently, combined Answer Set Programming (ASP) and Quantifier-Free Integer Difference Logic (QF-IDL) solving has been shown to be a promising approach to system synthesis. However, this approach still has several restrictions limiting its applicability. In the paper at hand, we propose a novel ASP modulo Theories (ASPmT) system synthesis approach, which (i) supports more sophisticated system models, (ii) tightly integrates the QF-IDL solving into the ASP solving, and (iii) makes use of partial assignment checking. As a result, more realistic systems are considered, and an early exclusion of infeasible solutions improves the entire system synthesis.
Gamma-ray bursts (GRBs) are some of the Universe’s most enigmatic and exotic events. However, at energies above 10 GeV their behaviour remains largely unknown. Although space-based telescopes such as the Fermi-LAT have been able to detect GRBs in this energy range, their photon statistics are limited by the small detector size. Such limitations are not present in ground-based gamma-ray telescopes such as the H.E.S.S. experiment, which has now entered its second phase with the addition of a large 600 m² telescope at the centre of the array. Such a large telescope allows H.E.S.S. to access the sub-100-GeV energy range while still maintaining a large effective collection area, helping to potentially probe the short-timescale emission of these events.
We present a description of the H.E.S.S. GRB observation programme, summarising the performance of the rapid GRB repointing system and the conditions under which GRB observations are initiated. Additionally, we report on the GRB follow-ups made during the 2014-15 observation campaigns.
HESS J1826-130
(2017)
HESS J1826-130 is an unidentified hard-spectrum source discovered by H.E.S.S. along the Galactic plane, with a spectral index of Gamma = 1.6 and an exponential cut-off at about 12 TeV. While the source does not have a clear counterpart at longer wavelengths, the very hard spectrum at TeV energies implies that electrons or protons accelerated up to several hundreds of TeV are responsible for the emission. In the hadronic case, the VHE emission can be produced by runaway cosmic rays colliding with the dense molecular clouds spatially coincident with the H.E.S.S. source.
E-commerce marketplaces are highly dynamic, with constant competition. While this competition is challenging for many merchants, it also provides plenty of opportunities, e.g., by allowing them to adjust prices automatically in order to react to changing market situations. For practitioners, however, testing automated pricing strategies is time-consuming and potentially hazardous when done in production. Researchers, on the other hand, struggle to study how pricing strategies interact under heavy competition. As a consequence, we built Price Wars, an open continuous-time framework to simulate dynamic pricing competition. Its microservice-based architecture provides a scalable platform for large competitions with dozens of merchants and a large random stream of consumers. The platform stores each event in a distributed log, which allows it to provide different performance measures, enabling users to compare the profit and revenue of various repricing strategies in real time. For researchers, price trajectories are shown, which makes it easier to evaluate the mutual price reactions of competing strategies. Furthermore, merchants can access historical marketplace data and apply machine learning. By providing a set of customizable, artificial merchants, users can easily simulate both simple rule-based strategies and sophisticated data-driven strategies that use demand learning to optimize their pricing.
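As an illustration of the kind of rule-based strategy such a platform can host, here is a minimal undercutting merchant. The `Offer` type and all parameter names are hypothetical and not part of the Price Wars API:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    merchant_id: str
    price: float

def undercut_strategy(own_id, offers, cost, undercut=0.05, margin=0.10):
    """Rule-based repricing: undercut the cheapest competitor by a fixed
    amount, but never price below cost plus a minimum margin."""
    competitors = [o.price for o in offers if o.merchant_id != own_id]
    floor = cost * (1.0 + margin)
    if not competitors:
        return max(floor, cost * 1.5)  # no competition: take a high margin
    return max(floor, min(competitors) - undercut)
```

A data-driven strategy would replace the fixed undercut with a price chosen from a learned demand model, which is exactly the comparison the platform is meant to enable.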
The ionospheric delay of global navigation satellite system (GNSS) signals is typically compensated by adding a single correction value to the pseudorange measurement of a GNSS receiver. Yet this neglects the dispersive nature of the ionosphere. In this context, we analyze the ionospheric signal distortion beyond a constant delay. These effects become increasingly significant with growing signal bandwidth and hence more important for new broadband navigation signals. Using measurements of the Galileo E5 signal captured with a high-gain antenna, we verify that the expected influence can indeed be observed and compensated. A new method to estimate the total electron content (TEC) from a single-frequency high-gain antenna measurement of a broadband GNSS signal is proposed and described in detail. The received signal is de facto unaffected by multipath and interference because of the narrow aperture angle of the antenna used, which generally reduces these error sources in the result. We would like to point out that such measurements are independent of the code correlation used in standard receiver applications; the method is therefore also usable without knowledge of the signal coding. Results of the TEC estimation process are shown and discussed in comparison with common TEC products such as TEC maps and dual-frequency receiver estimates.
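The dispersion at issue can be illustrated with the standard first-order relation between TEC and ionospheric group delay, d = 40.3 · TEC / f². This is a textbook approximation, not the estimation method of the paper:

```python
def iono_delay_m(tec_tecu, freq_hz):
    """First-order ionospheric group delay (in metres) for a signal at the
    given carrier frequency. 1 TECU = 1e16 electrons/m^2."""
    tec = tec_tecu * 1e16
    return 40.3 * tec / freq_hz ** 2

# dispersion across the wideband Galileo E5 signal for 10 TECU:
d_e5a = iono_delay_m(10.0, 1176.45e6)  # E5a carrier
d_e5b = iono_delay_m(10.0, 1207.14e6)  # E5b carrier
```

Across the roughly 30 MHz separating the E5a and E5b carriers, the delay difference for 10 TECU is already at the decimetre level, which is why a constant correction is insufficient for broadband signals and why the signal itself carries TEC information.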
Mixed-projection treemaps
(2017)
This paper presents a novel technique for combining 2D and 2.5D treemaps using multi-perspective views to leverage the advantages of both treemap types. It enables a new form of overview+detail visualization for tree-structured data and contributes new concepts for real-time rendering of and interaction with treemaps. The technique operates by tilting the graphical elements representing inner nodes using affine transformations and animated state transitions. We explain how to mix orthogonal and perspective projections within a single treemap. Finally, we show application examples that benefit from the reduced interaction overhead.
Photonic sensing in highly concentrated biotechnical processes by photon density wave spectroscopy
(2017)
Photon Density Wave (PDW) spectroscopy is introduced as a new approach for photonic sensing in highly concentrated biotechnical processes. It independently quantifies the absorption and reduced scattering coefficients, calibration-free and as a function of time, thus describing the optical properties of the biomaterial in the vis/NIR range during processing. As examples of industrial relevance, enzymatic milk coagulation, beer mashing, and algae cultivation in photobioreactors are discussed.
Nanocarriers
(2017)
Background: Evidence that home telemonitoring (HTM) for patients with chronic heart failure (CHF) offers clinical benefit over usual care is controversial, as is evidence of a health economic advantage. Therefore, the CardioBBEAT trial was designed to prospectively assess the health economic impact of a dedicated home monitoring system for patients with CHF based on actual costs directly obtained from patients’ health care providers.
Methods: Between January 2010 and June 2013, 621 patients (mean age 63.0 ± 11.5 years, 88% male) with a confirmed diagnosis of CHF (LVEF ≤ 40%) were enrolled and randomly assigned to two study groups comprising usual care with and without an interactive bi-directional HTM system (Motiva®). The primary endpoint was the Incremental Cost-Effectiveness Ratio (ICER), established by the groups’ difference in total cost and in the combined clinical endpoint “days alive and not in hospital nor inpatient care per potential days in study” within the follow-up of 12 months. Secondary outcome measures were total mortality and health-related quality of life (SF-36, WHO-5 and KCCQ).
Results: In the intention-to-treat analysis, total mortality (HR 0.81; 95% CI 0.45–1.45) and days alive and not in hospital (343.3 ± 55.4 vs. 347.2 ± 43.9; p = 0.909) were not significantly different between HTM and usual care. While the resulting primary-endpoint ICER was not positive (−181.9; 95% CI −1626.2 to 1628.9), quality of life assessed by SF-36, WHO-5 and KCCQ as a secondary endpoint was significantly higher in the HTM group at 6 and 12 months of follow-up.
Conclusions: This first simultaneous assessment of the clinical and economic outcomes of HTM in patients with CHF did not demonstrate superior incremental cost-effectiveness compared to usual care. On the other hand, quality of life was improved. It remains open whether the tested HTM solution represents a useful innovative approach in the current health care setting.
Root infinitives on Twitter
(2017)
Recently, Kocyan & Wiland-Szymańska (2016) published a thorough research article on one of the outstanding members of the family Hypoxidaceae on the Seychelles, which resulted in the erection of a new genus (Friedmannia Kocyan & Wiland-Szymańska 2016: 60) to accommodate the former Curculigo seychellensis Bojer ex Baker (1877: 368). However, it has turned out that the name Friedmannia Chantanachat & Bold (1962: 45) already exists in the literature for a green alga, which renders the new hypoxid genus illegitimate (Melbourne Code; McNeill et al. 2012). Therefore, we assign a new generic name to Curculigo seychellensis.
Gaussianity Fair
(2017)
Editorial
(2017)
Since the Shallow Structure Hypothesis (SSH) was first put forward in 2006, it has inspired a growing body of research on grammatical processing in nonnative (L2) speakers. More than 10 years later, we think it is time for the SSH to be reconsidered in the light of new empirical findings and current theoretical assumptions about human language processing. The purpose of our critical commentary is twofold: to clarify some issues regarding the SSH and to sketch possible ways in which this hypothesis might be refined and improved to better account for L1 and L2 speakers’ performance patterns.
The keynote article (Mayberry & Kluender, 2017) makes an important contribution to questions concerning the existence and characteristics of sensitive periods in language acquisition. Specifically, by comparing groups of non-native L1 and L2 signers, the authors have been able to ingeniously disentangle the effects of maturation from those of early language exposure. Based on L1 versus L2 contrasts, the paper convincingly argues that L2 learning is a less clear test of sensitive periods. Nevertheless, we believe Mayberry and Kluender underestimate the evidence for maturational factors in L2 learning, especially that coming from recent research.
Emergency Care in Germany Being Re-Assessed: Hybrid Medical Care Model Seen as Potential Answer
(2017)
In recent years, “transnationalism” has become a key concept for historians and other scholars in the humanities and social sciences. However, its overuse threatens to dilute what would otherwise be a distinct approach with promising heuristic potential. This danger seems especially pronounced when the notion of transnationalism is applied to Jewish history, which, paradoxically, most scholars would agree, is at its core transnational. Many studies have analyzed how Jewries in different times and places, from the biblical era to the present, have been shaped by people, ideas, texts, and institutions that migrated across state lines and between cultures. So what is new about transnationalism in Jewish Studies? What new insights does it offer?
American Jewry offers an obvious arena to test transnationalism’s significance as an approach to historical research within Jewish studies. As a “nation of nations,” the United States is made up of a distinct and unique society, built on ideas of diversity and pluralism, and transcending old European concepts of nation and state. The transformative incorporation in American life of cultural, political, and social traditions brought from abroad is one feature of this distinctiveness. American Jewish history and culture, in particular, are best understood in the context of interaction with Jews in other places, both because of American Jews’ roots in and continued entanglement with Europe, and because of their differences from other Jews.
These considerations guided the participants in a roundtable that formed a prologue to an international conference held July 20–22, 2016, at the School of Jewish Theology at the University of Potsdam and the Center for Jewish Studies Berlin-Brandenburg, Germany. The conference title, “Re-Framing American Jewish History and Thought: New Transnational Perspectives,” indicated the organizers’ conviction that the transnational approach does have the potential to shed fresh light on the American Jewish experience. The participants were asked to bring their experiences to the table, in an effort to clarify what transnationalism might mean for American Jewish Studies, and where it might yield new approaches and insights.
The conference brought together some thirty scholars of various disciplines from Europe, Israel, and the United States. In addition to exploring a relatively new approach (at least, in the field of American Jewish Studies), the conference also served a second purpose: to further the interest in American Jewry as a subject of scholarly attention in countries outside the U.S., where the topic has been curiously neglected. The assumption underlying the conference was that a transnational perspective on American Jewry would bring to bear the particular interests and skills of scholars working outside the American academy, and thereby complement, rather than replicate, the ways American Jewish Studies have been pursued in North America itself.
We compare Visual Berrypicking, an interactive approach that allows users to explore large and highly faceted information spaces using similarity-based two-dimensional maps, with traditional browsing techniques. For large datasets, current projection methods used to generate map-like overviews suffer from increased computational costs and a loss of accuracy, resulting in inconsistent visualizations. We propose interactively aligning inexpensive small maps, showing local neighborhoods only, which ideally creates the impression of panning a large map. For evaluation, we designed a web-based prototype for movie exploration and compared it to the web interface of The Movie Database (TMDb) in an online user study. Results suggest that users are able to effectively explore large movie collections by hopping from one neighborhood to the next. Additionally, due to the projection of movie similarities, interesting links between movies can be found more easily, and thus, compared to browsing, serendipitous discoveries are more likely.
In this extended abstract, we analyze the current challenges for the envisioned Self-Adaptive CPS. In addition, we outline our results in approaching these challenges with SMARTSOS [10], a generic approach based on extensions of graph transformation systems that employs open and adaptive collaborations and models at runtime for trustworthy self-adaptation, self-organization, and evolution of the individual systems and at the system-of-systems level, taking into account the independent development, operation, management, and evolution of these systems.
In this paper, the applicability of deep downhole geoelectrical monitoring for detecting CO2-related signatures is evaluated after a nearly ten-year period of CO2 storage at the Ketzin pilot site. Deep downhole electrode arrays have so far been studied as part of a multi-physical monitoring concept at four CO2 pilot test sites worldwide. For these sites, it was considered important to include the geoelectrical method in the measurement programme for tracking the CO2 plume. The example of the Ketzin site shows that, during all phases of the development of the CO2 storage reservoir, the resistivity measurements and their tomographic interpretation contribute beneficially to the measurement, monitoring and verification (MMV) protocol. The most important asset of a permanent electrode array is its potential as a tool for estimating reservoir saturations.
Predicting macroscopic elastic rock properties requires detailed information on microstructure
(2017)
Predicting variations in macroscopic mechanical rock behaviour due to microstructural changes driven by mineral precipitation and dissolution is necessary to couple chemo-mechanical processes in geological subsurface simulations. We apply 3D numerical homogenization models to estimate Young’s moduli for five synthetic microstructures, and successfully validate our results for comparable geometries against the analytical Mori-Tanaka approach. Further, we demonstrate that considering specific rock microstructures is of paramount importance, since the calculated elastic properties may deviate by up to 230% for the same mineral composition. Moreover, agreement between simulated and experimentally determined Young’s moduli is significantly improved when detailed spatial information is employed.
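The spread that microstructure leaves open can be bounded even without full homogenization. As a hedged illustration (classical Voigt-Reuss bounds, not the 3D numerical scheme or the Mori-Tanaka model used in the study; the mineral moduli are round illustrative numbers):

```python
import numpy as np

def voigt_reuss_bounds(fractions, moduli):
    """Classical Voigt (upper) and Reuss (lower) bounds on the effective
    Young's modulus of a composite, using volume fractions alone."""
    f = np.asarray(fractions, dtype=float)
    e = np.asarray(moduli, dtype=float)
    assert abs(f.sum() - 1.0) < 1e-12
    e_voigt = float(np.sum(f * e))        # iso-strain assumption
    e_reuss = float(1.0 / np.sum(f / e))  # iso-stress assumption
    return e_reuss, e_voigt

# 70% of a stiff phase (~95 GPa) + 30% of a soft phase (~20 GPa)
lo, hi = voigt_reuss_bounds([0.7, 0.3], [95.0, 20.0])
```

Even this two-phase example leaves the effective modulus anywhere between roughly 45 and 72 GPa depending on geometry, which is the point the abstract makes: composition alone does not determine the elastic response.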
Water management tools are necessary to guarantee the preservation of natural resources while ensuring their optimal utilization. Linear regression models are a simple and quick solution for creating prognostic capabilities. Multivariate models show higher precision than univariate models; in the case of Waiwera, using the individual production rates is more accurate than using just the total production rate. A maximum of approximately 1,075 m³/day can be pumped while ensuring a water level of at least 0.5 m a.s.l. in the monitoring well. The model should be renewed annually to incorporate new data and current water-level trends and thus maintain its quality.
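The multivariate-versus-univariate point can be illustrated with ordinary least squares on synthetic data. The production rates, coefficients and noise below are invented for the sketch and have nothing to do with the Waiwera data:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic stand-ins for individual production rates (m^3/day) at three bores
q = rng.uniform(100, 500, size=(365, 3))
# synthetic water level (m a.s.l.): falls as pumping rises, plus noise;
# each bore is given a different (made-up) influence on the level
level = 2.0 - q @ np.array([8e-4, 5e-4, 3e-4]) + rng.normal(0, 0.02, 365)

# multivariate model: level ~ b0 + b1*q1 + b2*q2 + b3*q3
X = np.column_stack([np.ones(len(q)), q])
coef, *_ = np.linalg.lstsq(X, level, rcond=None)

# univariate alternative: level ~ total production only
X1 = np.column_stack([np.ones(len(q)), q.sum(axis=1)])
coef1, *_ = np.linalg.lstsq(X1, level, rcond=None)

def r2(X, y, b):
    """Coefficient of determination of a fitted linear model."""
    resid = y - X @ b
    return 1 - resid.var() / y.var()
```

Because the bores influence the level differently, the total-production model cannot capture the variation that the individual rates explain, so its R² is systematically lower, mirroring the Waiwera finding.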
Integration and development of the energy supply in China and worldwide is a challenge for the years to come. The innovative idea presented here is based on an extension of the “power-to-gas-to-power” technology by establishing a closed carbon cycle. It is an implementation of a low-carbon energy system based on carbon dioxide capture and storage (CCS) to store and reuse wind and solar energy. The Chenjiacun storage project in China compares well with the German case study for the towns Potsdam and Brandenburg/Havel in the Federal State of Brandenburg based on the Ketzin pilot site for CCS.
Editorial
(2017)
The maximum entropy method is used to predict flows on water distribution networks. This analysis extends the water distribution network formulation of Waldrip et al. (2016), Journal of Hydraulic Engineering (ASCE), by the use of a continuous relative entropy defined on a reduced parameter set. This reduction in the parameters over which the entropy is defined ensures consistency between different representations of the same network. The performance of the proposed reduced-parameter method is demonstrated with a one-loop network case study.
The maximum entropy method is used to derive an alternative gravity model for a transport network. The proposed method builds on previous methods, which assign the discrete values of a maximum entropy distribution to equal the traffic flow rates; the proposed method, however, uses a distribution to represent each flow rate. It is shown to handle uncertainty more elegantly and to give similar results to traditional methods, while incorporating more of the observed data through the entropy function, prior distribution and integration limits, potentially allowing better inferences to be made.
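Both abstracts rest on the same core construction: the maximum-entropy distribution under moment constraints is of exponential-family form, with Lagrange multipliers fixed by the constraints. A minimal discrete sketch of that machinery (ours, not the Waldrip et al. formulation):

```python
import numpy as np

def maxent_discrete(x, mean, iters=200):
    """Maximum-entropy distribution on support x with a prescribed mean.

    The solution of max H = -sum(p*log p) s.t. sum(p) = 1, sum(p*x) = mean
    is exponential-family: p_i proportional to exp(-lam * x_i). The single
    multiplier lam is found here by bisection.
    """
    x = np.asarray(x, dtype=float)

    def mean_for(lam):
        w = np.exp(-lam * (x - x.mean()))  # shift for numerical stability
        p = w / w.sum()
        return p @ x, p

    lo, hi = -50.0, 50.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        m, p = mean_for(mid)
        if m > mean:
            lo = mid  # increasing lam shifts weight to small x, lowering the mean
        else:
            hi = mid
    return p
```

With the mean constraint placed at the centre of a symmetric support, the multiplier vanishes and the uniform (maximum-entropy, no-information) distribution is recovered; off-centre means yield the expected exponential tilt.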