Dielectrophoretic functionalization of nanoelectrode arrays for the detection of influenza viruses
(2017)
Structural health monitoring (SHM) activities are of prime importance for managing transport infrastructure; however, most SHM methodologies are based on point-based sensors that have limitations in terms of their spatial positioning requirements, cost of development and measurement range. This paper describes the progress of the SENSKIN EC project, whose objective is to develop a dielectric-elastomer and micro-electronics-based sensor, formed from a large, highly extensible capacitance sensing membrane supported by advanced microelectronic circuitry, for monitoring transport infrastructure bridges. Such a sensor could provide spatial measurements of strain in excess of 10%. The actual sensor along with the data acquisition module, the communication module and the power electronics are all integrated into a compact unit, the SENSKIN device, which is energy-efficient, requires simple signal processing and is easy to install over various surface types. In terms of communication, SENSKIN devices interact with each other to form the SENSKIN system: a fully distributed and autonomous wireless sensor network that is able to self-monitor. The SENSKIN system utilizes Delay-/Disruption-Tolerant Networking technologies to ensure that the strain measurements will be received by the base station even under extreme conditions where normal communications are disrupted. This paper describes the architecture of the SENSKIN system and the development and testing of the first SENSKIN prototype sensor, the data acquisition system, and the communication system.
Preclinical assessment of penetration not only in intact but also in barrier-disrupted skin is important to explore the surplus value of novel drug delivery systems, which can be specifically designed for diseased skin. Here, we characterized physical and chemical barrier disruption protocols for short-term ex vivo skin cultures with regard to structural integrity and physiological and biological parameters. Further, we compared the penetration of dexamethasone (Dex) in different nanoparticle-based formulations in stratum corneum, epidermis and dermis extracts of intact vs. barrier-disrupted skin, as well as by dermal microdialysis at 6, 12 and 24 hours after topical application. Dex was quantified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simultaneously, we investigated Dex efficacy by interleukin (IL) analysis. Tape-stripping (TS) and 4-hour exposure to 5% sodium lauryl sulfate (SLS) were identified as highly effective barrier disruption methods, as assessed by reproducible transepidermal water loss (TEWL) changes and an IL-6/8 increase that was more pronounced in SLS-treated skin. The barrier state also has a significant impact on the Dex penetration kinetics: for all formulations, TS greatly increased the dermal Dex concentration, despite the fact that nanocrystals quickly and effectively penetrated both intact and barrier-disrupted skin, reaching significantly higher dermal Dex concentrations after 6 hours compared to Dex cream. The surplus value of encapsulation in ethyl cellulose nanocarriers could mostly be observed when applied on intact skin, in general showing a delayed Dex penetration. Estimation of cytokines was limited due to the trauma caused by probe insertion. In summary, ex vivo human skin is a highly interesting short-term preclinical model for the analysis of penetration and efficacy of novel drug delivery systems.
Embedded smart home
(2017)
The popularity of MOOCs has increased considerably in recent years. A typical MOOC course consists of video content, self-tests after a video, and homework, which is normally in multiple-choice format. After solving these homework assignments for every week of a MOOC, the final exam certificate can be issued once the student has reached a sufficient score. There are also some attempts to include practical tasks, such as programming, in MOOCs for grading. Nevertheless, until now there has been no known way to teach embedded system programming in a MOOC course where the programming can be done in a remote lab and where grading of the tasks is additionally possible. This embedded programming includes communication over GPIO pins to control LEDs and measure sensor values. We started a MOOC course called "Embedded Smart Home" as a pilot to prove the concept of teaching real hardware programming in a MOOC environment under real-life MOOC conditions with over 6000 students. Furthermore, students with their own real hardware also have the possibility to program on it and have their results graded in the MOOC course. Finally, we evaluate our approach and analyze student acceptance of this way of offering a course on embedded programming. We also analyze the hardware usage and working time of students solving tasks to find out whether real hardware programming is an advantage and a motivating factor that supports students' learning success.
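As a rough illustration of the kind of GPIO task described above (a minimal sketch assuming a Raspberry-Pi-class device and the RPi.GPIO library; the pin numbers and the blink-until-button-press behaviour are illustrative and not taken from the course):

import time
import RPi.GPIO as GPIO  # assumed to be available on the remote-lab hardware

LED_PIN = 18     # hypothetical output pin wired to an LED
BUTTON_PIN = 23  # hypothetical input pin wired to a push button

GPIO.setmode(GPIO.BCM)            # use Broadcom pin numbering
GPIO.setup(LED_PIN, GPIO.OUT)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

try:
    # Blink the LED until the button is pressed (the input then reads low).
    while GPIO.input(BUTTON_PIN):
        GPIO.output(LED_PIN, GPIO.HIGH)
        time.sleep(0.5)
        GPIO.output(LED_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()                # release the pins for the next student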
Selection of initial points, the number of clusters and finding proper cluster centers remain the main challenges in clustering processes. In this paper, we suggest a genetic algorithm-based method which searches several solution spaces simultaneously. The solution spaces are population groups consisting of elements with a similar structure. Elements in a group have the same size, while elements in different groups are of different sizes. The proposed algorithm processes the population in groups of chromosomes with one gene, two genes, up to k genes. These genes hold the corresponding information about the cluster centers. In the proposed method, the crossover and mutation operators can accept parents of different sizes; this can lead to versatility in the population and information transfer among sub-populations. We implemented the proposed method and evaluated its performance on several random datasets as well as the Ruspini dataset. The experimental results show that the proposed method can effectively determine the appropriate number of clusters and recognize their centers. Overall, this research implies that using a heterogeneous population in the genetic algorithm can lead to better results.
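The following Python sketch illustrates the variable-length chromosome idea (chromosomes with one to k genes, each gene a cluster center, and crossover that accepts parents of different sizes). It is only a toy under assumed operators and a crude fitness; the paper's actual operators and fitness criterion are not reproduced here.

import random
import numpy as np

def fitness(centers, data):
    # Compactness (negative total squared distance to the nearest center)
    # minus a crude penalty per center, so adding centers is not always rewarded.
    centers = np.asarray(centers)
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return -np.sum(d.min(axis=1) ** 2) - 0.5 * len(centers)

def crossover(parent_a, parent_b, k_max):
    # Parents may carry different numbers of genes (cluster centers);
    # the child takes a random subset from each, capped at k_max genes.
    genes = (random.sample(parent_a, random.randint(1, len(parent_a)))
             + random.sample(parent_b, random.randint(1, len(parent_b))))
    return random.sample(genes, min(len(genes), k_max))

def mutate(chromosome, data, rate=0.1):
    # Nudge every center towards a randomly chosen data point.
    return [c + rate * (data[random.randrange(len(data))] - c) for c in chromosome]

# Toy run on random 2-D data, with sub-populations of 1..k_max genes.
rng = np.random.default_rng(0)
data = rng.random((200, 2))
k_max = 6
population = [[data[random.randrange(len(data))] for _ in range(k)]
              for k in range(1, k_max + 1) for _ in range(20)]

for _ in range(50):
    ranked = sorted(population, key=lambda c: fitness(c, data), reverse=True)
    parents = ranked[: len(ranked) // 2]
    children = [mutate(crossover(*random.sample(parents, 2), k_max), data)
                for _ in range(len(population) - len(parents))]
    population = parents + children

best = max(population, key=lambda c: fitness(c, data))
print("estimated number of clusters:", len(best))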
The identification of vulnerabilities relies on detailed information about the target infrastructure. Gathering the necessary information is a crucial step that requires intensive scanning or mature expertise and knowledge about the system, even though the information may already be available in a different context. In this paper we propose a new method to detect vulnerabilities that reuses existing information and eliminates the need for a comprehensive scan of the target system. Since our approach is able to identify vulnerabilities without the additional effort of a scan, we are able to increase the overall performance of the detection. Because of the reuse and the removal of active testing procedures, our approach can be classified as passive vulnerability detection. We explain the approach and illustrate the additional possibility of increasing the security awareness of users. To this end, we applied the approach to an experimental setup and extracted security-relevant information from web logs.
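A toy sketch of the log-reuse idea in Python (not the authors' tooling): it pulls product/version tokens out of the user-agent field of common Apache-style access-log lines and matches them against a made-up list of known-vulnerable versions.

import re

# Hypothetical knowledge base: (product, vulnerable-version prefix) pairs.
KNOWN_VULNERABLE = {
    ("Firefox", "45."): "outdated browser, multiple known CVEs",
    ("Apache", "2.2."): "end-of-life server branch",
}

# Only the quoted user-agent string at the end of each combined-log line is needed.
USER_AGENT_RE = re.compile(r'"([^"]*)"\s*$')
PRODUCT_RE = re.compile(r"(Firefox|Chrome|Apache|MSIE)[/ ](\d[\w.]*)")

def findings_from_log(lines):
    findings = set()
    for line in lines:
        ua = USER_AGENT_RE.search(line)
        if not ua:
            continue
        for product, version in PRODUCT_RE.findall(ua.group(1)):
            for (vuln_product, prefix), note in KNOWN_VULNERABLE.items():
                if product == vuln_product and version.startswith(prefix):
                    findings.add((product, version, note))
    return findings

# Usage: pass any iterable of access-log lines, e.g. open("access.log").
sample = ['127.0.0.1 - - [10/Oct/2017:13:55:36 +0200] "GET / HTTP/1.1" 200 2326 '
          '"-" "Mozilla/5.0 (X11; Linux) Gecko/20100101 Firefox/45.0"']
for product, version, note in findings_from_log(sample):
    print(product, version, "->", note)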
Dissolved CO2 storage in geological formations with low pressure, low risk and large capacities
(2017)
Geological CO2 storage is a mitigation technology to reduce CO2 emissions from fossil fuel combustion. However, major concerns are the pressure increase and saltwater displacement in the mainly targeted deep groundwater aquifers due to injection of supercritical CO2. The suggested solution is storage of CO2 exclusively in the dissolved state. In our exemplary regional case study of the North East German Basin based on a highly resolved temperature and pressure distribution model and a newly developed reactive transport coupling, we have quantified that 4.7 Gt of CO2 can be stored in solution compared to 1.5 Gt in the supercritical state.
This paper discusses a new approach for designing and deploying Security-as-a-Service (SecaaS) applications using cloud native design patterns. Current SecaaS approaches do not efficiently handle the increasing threats to computer systems and applications. For example, requests for security assessments increase drastically after a high-risk security vulnerability is disclosed. In such scenarios, SecaaS applications are unable to scale dynamically to serve the requests. A root cause of this challenge is the employment of architectures not specifically fitted to cloud environments. Cloud native design patterns resolve this challenge by enabling certain properties, e.g. massive scalability and resiliency, via the combination of microservice patterns and cloud-focused design patterns. However, adopting these patterns is a complex process, during which several security issues are introduced. In this work, we investigate these security issues and redesign and deploy a monolithic SecaaS application using cloud native design patterns, while considering appropriate, layered security counter-measures, i.e. at the application and cloud networking layers. Our prototype implementation outperforms traditional, monolithic applications with an average Scanner Time of 6 minutes, without compromising security. Our approach can be employed for designing secure, scalable and performant SecaaS applications that effectively handle unexpected increases in security assessment requests.
Web-based E-Learning uses Internet technologies and digital media to deliver educational content to learners. In recent years, many universities have applied their capacity to producing Massive Open Online Courses (MOOCs). They have been offering MOOCs with the expectation of rendering a comprehensive online apprenticeship. Typically, an online content delivery process requires an Internet connection. However, broadband access has never been a readily available resource in many regions. In Africa, Internet users still predominantly experience poor or no network coverage, and a digital device goes offline each time it disconnects from a network. As a result, learning processes in such regions are frequently disrupted, delayed or terminated. This paper raises the concern of E-Learning under poor and low-bandwidth conditions and highlights the need for an Offline-Enabled mode. The paper also explores technical approaches aimed at enhancing the user experience in Web-based E-Learning, particularly in Africa.
Massive Open Online Courses (MOOCs) have left their mark on the face of education in recent years. At the Hasso Plattner Institute (HPI) in Potsdam, Germany, we are actively developing a MOOC platform, which provides our research with a plethora of e-learning topics, such as learning analytics, automated assessment, peer assessment, team-work, online proctoring, and gamification. We run several instances of this platform. On openHPI, we provide our own courses from within the HPI context. Further instances are openSAP, openWHO, and mooc.HOUSE, which is the smallest of these platforms, targeting customers with a less extensive course portfolio. In 2013, we started to work on the gamification of our platform. By now, we have implemented about two thirds of the features that we initially evaluated as useful for our purposes. About a year ago we activated the implemented gamification features on mooc.HOUSE. Before activating the features on openHPI as well, we examined and re-evaluated our initial considerations based on the data collected so far and on the changes in other contexts of our platforms.
Recently, a multitude of empirically derived damage models have been applied to project future tropical cyclone (TC) losses for the United States. In their study, Geiger et al (2016 Environ. Res. Lett. 11 084012) compared two approaches that differ in the scaling of losses with socio-economic drivers: the commonly-used approach, resulting in a sub-linear scaling of historical TC losses with a nation's affected gross domestic product (GDP), and the disentangled approach, which shows a sub-linear increase with affected population and a super-linear scaling of relative losses with per capita income. Statistics cannot determine which approach is preferable, but since process understanding demands that the loss depends on both GDP per capita and population, an approach that accounts for both separately is preferable to one which assumes a specific relation between the two dependencies. In the accompanying comment, Rybski et al argued that there is no rigorous evidence for the conclusion that high income does not protect against hurricane losses. Here we affirm that our conclusion is drawn correctly and reply to further remarks raised in the comment, highlighting the adequacy of our approach but also the potential for future extension of our research.
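Schematically, and in our own notation rather than the paper's (the exponent values are not given in the abstract), the two scaling approaches can be written as

\[ L \propto \mathrm{GDP}_{\text{affected}}^{\,\alpha}, \qquad \alpha < 1 \quad \text{(commonly-used approach)}, \]
\[ L \propto P_{\text{affected}}^{\,\beta}\, g^{\,\gamma}, \qquad g = \mathrm{GDP}/P, \quad \beta < 1 \quad \text{(disentangled approach)}, \]

so that, at fixed affected population, relative losses scale as \( L/\mathrm{GDP} \propto g^{\,\gamma-1} \), which is super-linear in per capita income \( g \) whenever \( \gamma - 1 > 1 \).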
Root infinitives on Twitter
(2017)
Editorial
(2017)
We develop a simple two-zone interpretation of the broadband baseline Crab nebula spectrum between 10^-5 eV and ~100 TeV by using two distinct log-parabola energetic electron distributions. We determine analytically the very-high-energy photon spectrum as originating from inverse-Compton scattering of the far-infrared soft ambient photons within the nebula off a first population of electrons energized at the nebula termination shock. The broad and flat 200 GeV peak jointly observed by Fermi/LAT and MAGIC is naturally reproduced. The synchrotron radiation from a second energetic electron population explains the spectrum from the radio range up to ~10 keV. We infer from observations the energy dependence of the microscopic probability that the accelerating electrons remain in proximity of the shock.
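For reference, a log-parabola particle distribution has the standard form (our symbols, not the paper's):

\[ N(\gamma) \propto \left(\frac{\gamma}{\gamma_0}\right)^{-\left[\,s \,+\, r\,\log(\gamma/\gamma_0)\right]}, \]

where \( \gamma_0 \) is a reference Lorentz factor, \( s \) the spectral index at \( \gamma_0 \), and \( r \) the spectral curvature; each of the two electron populations mentioned above would carry its own set of \( (s, r, \gamma_0) \).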
Since the Shallow Structure Hypothesis (SSH) was first put forward in 2006, it has inspired a growing body of research on grammatical processing in nonnative (L2) speakers. More than 10 years later, we think it is time for the SSH to be reconsidered in the light of new empirical findings and current theoretical assumptions about human language processing. The purpose of our critical commentary is twofold: to clarify some issues regarding the SSH and to sketch possible ways in which this hypothesis might be refined and improved to better account for L1 and L2 speakers’ performance patterns.
Photonic sensing in highly concentrated biotechnical processes by photon density wave spectroscopy
(2017)
Photon Density Wave (PDW) spectroscopy is introduced as a new approach for photonic sensing in highly concentrated biotechnical processes. It independently quantifies the absorption and reduced scattering coefficients, calibration-free and as a function of time, thus describing the optical properties of the biomaterial in the vis/NIR range during processing. As examples of industrial relevance, enzymatic milk coagulation, beer mashing, and algae cultivation in photobioreactors are discussed.
Background: Infliximab (IFX), an anti-TNF monoclonal antibody approved for the treatment of inflammatory bowel disease, is dosed per kg body weight (BW). However, the rationale for body size adjustment has not been unequivocally demonstrated [1], and first attempts to improve IFX therapy have been undertaken [2]. The aim of our study was to assess the impact of different dosing strategies (i.e. body size-adjusted and fixed dosing) on drug exposure and pharmacokinetic (PK) target attainment. For this purpose, a comprehensive simulation study was performed, using patient characteristics (n=116) from an in-house clinical database.
Methods: IFX concentration-time profiles of 1000 virtual, clinically representative patients were generated using a previously published PK model for IFX in patients with Crohn's disease [3]. For each patient, 1000 profiles accounting for PK variability were considered. The IFX exposure during maintenance treatment was compared for the following dosing strategies: i) fixed dose, and dosing per ii) BW, iii) lean BW (LBW), iv) body surface area (BSA), v) height (HT), vi) body mass index (BMI) and vii) fat-free mass (FFM). For each dosing strategy, the variability in maximum concentration Cmax, minimum concentration Cmin (= C8weeks) and area under the concentration-time curve (AUC), as well as the percentage of patients achieving the PK target, Cmin = 3 μg/mL [4], were assessed.
Results: For all dosing strategies, the variability of Cmin (CV ≈ 110%) was highest compared to Cmax and AUC, and was of similar extent regardless of dosing strategy. The proportion of patients reaching the PK target (≈ ⅓) was approximately equal for all dosing strategies.
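A minimal sketch, with made-up numbers rather than the published population-PK model, of how the trough-based metrics above could be tabulated from simulated profiles:

import numpy as np

rng = np.random.default_rng(0)
TARGET = 3.0  # μg/mL trough target from the abstract

# Hypothetical log-normal trough (Cmin) distributions per dosing strategy;
# the medians below are illustrative only and do not come from the study.
strategies = {"fixed": 2.0, "BW": 2.1, "BSA": 2.05, "FFM": 2.15}
cv = 1.1  # ≈110% coefficient of variation, as reported for Cmin

sigma = np.sqrt(np.log(1 + cv**2))          # log-normal shape parameter from the CV
for name, median in strategies.items():
    cmin = rng.lognormal(mean=np.log(median), sigma=sigma, size=100_000)
    attained = np.mean(cmin >= TARGET)      # fraction reaching the PK target
    observed_cv = cmin.std() / cmin.mean()
    print(f"{name:>6}: target attainment {attained:.0%}, CV {observed_cv:.0%}")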
Since aluminium is potentially toxic to the nervous system and bone, the safety of aluminium exposure from adjuvants in vaccines and subcutaneous immune therapy (SCIT) products has to be continuously re-evaluated, especially regarding concomitant administrations. For this purpose, knowledge of the absorption and disposition of aluminium in plasma and tissues is essential. Pharmacokinetic data after vaccination in humans, however, are not available and are difficult to obtain for methodological and ethical reasons. To overcome these limitations, we discuss the possibility of an in vitro-in silico approach combining a toxicokinetic model for aluminium disposition with biorelevant kinetic absorption parameters from adjuvants. We critically review available kinetic aluminium-26 data for model building and, on the basis of a reparameterized toxicokinetic model (Nolte et al., 2001), we identify the main modelling gaps. The potential of in vitro dissolution experiments for the prediction of intramuscular absorption kinetics of aluminium after vaccination is explored. It becomes apparent that detailed in vitro dissolution and in vivo absorption data are needed to establish an in vitro-in vivo correlation (IVIVC) for aluminium adjuvants. We conclude that a combination of new experimental data and further refinement of the Nolte model has the potential to fill a gap in aluminium risk assessment.
The design of embedded systems is becoming continuously more complex, such that efficient system-level design methods are becoming crucial. Recently, combined Answer Set Programming (ASP) and Quantifier-Free Integer Difference Logic (QF-IDL) solving has been shown to be a promising approach for system synthesis. However, this approach still has several restrictions limiting its applicability. In the paper at hand, we propose a novel ASP modulo Theories (ASPmT) system synthesis approach, which (i) supports more sophisticated system models, (ii) tightly integrates the QF-IDL solving into the ASP solving, and (iii) makes use of partial assignment checking. As a result, more realistic systems are considered, and an early exclusion of infeasible solutions improves the entire system synthesis.
It has been observationally established that winds of hot massive stars have highly variable characteristics. The variability evident in the winds is believed to be caused by structures on a broad range of spatial scales. Small-scale structures (clumping) in the stellar winds of hot stars are a possible consequence of an instability appearing in their radiation hydrodynamics. To understand how clumping may influence the calculation of theoretical spectra, different clumping properties and their 3D nature have to be taken into account. The properties of clumping have been examined using our 3D radiative transfer calculations. Effects of clumping for the case of the B[e] phenomenon are discussed.
Conclusion
(2016)
This chapter revisits the role of new modes of governance in areas of limited statehood. First, it states that there is no linear relationship between degrees of statehood and the overall effectiveness of new modes of sustainability governance. Second, the chapter states that, in most cases, national governments are hesitant about, or even actively hamper, the development of new modes of governance. Third, it shows that the absence of the shadow of hierarchy can indeed lead to ineffective new modes of governance. However, the shadow of hierarchy does not necessarily need to be cast by states. Finally, the author reviews the complexities involved in participatory practices, stressing the importance of institutional structures and knowledgeable brokers. The chapter concludes by outlining fields for future research.
Introduction
(2016)
The Paris Agreement on climate change and the Sustainable Development Goals (SDGs) rely on new modes of governance for implementation. Indeed, new modes of governance such as market-based instruments, public-private partnerships or multi-stakeholder initiatives have been praised for playing a pivotal role in effective and legitimate sustainability governance. Yet, do they also deliver in areas of limited statehood? States such as Malaysia or the Dominican Republic partly lack the ability to implement and enforce rules; their statehood is limited. This introduction provides the analytical framework of this volume and critically examines the performance of new modes of governance in areas of limited statehood, drawing on the book’s in-depth case studies on issues of climate change, biodiversity, and health.
This chapter investigates the trajectory of establishing the Forest Stewardship Council (FSC) in the early 1990s as the first private transnational certification organization with an antagonistic stakeholder body. Its main contribution is a micro-analysis of the founding assembly in 1993. By investigating the role of brokers within the negotiation as one institutional scope condition for ‘arguing’ having occurred, the chapter adopts a dramaturgical approach. It contends that the authority of brokers is not necessarily institutionally given, but needs to be gained: brokers have to prove situationally that their knowledge is relevant and that they are speaking impartially in the interest of progress rather than their own. The chapter stresses the importance of procedural knowledge which brokers provide in contrast to policy knowledge.
Gilbert et al. conclude that evidence from the Open Science Collaboration’s Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted.
Harmonized data file serving as the basis for a comparative analysis of quality of life in the Candidate Countries and the European Union member states, based on seven different data sets: one Eurobarometer survey covering 13 Candidate Countries with an identical set of variables, conducted in April 2002, and six Standard Eurobarometer surveys on different subjects, fielded in different years, each with a set of questions identical to those of the CC Eurobarometer. Selected aggregate indicators of quality of life ... describing the social situation in the EU15 and the Candidate Countries.
Pancreatic secretory zymogen-granule membrane glycoprotein 2 (GP2) has been identified as a major autoantigenic target in Crohn’s disease patients. It was recently discussed that a long and a short isoform of GP2 exist, and that the short isoform is often detected by GP2-specific autoantibodies. In inflammatory bowel diseases, these GP2-specific autoantibodies are discussed as new serological markers for diagnosis and therapeutic monitoring. To investigate this further, camelid nanobodies were generated by phage display and selected against the short isoform of GP2 in order to isolate specific tools for the discrimination of both isoforms. Nanobodies are single-domain antibodies derived from camelid heavy-chain-only antibodies and are characterized by high stability and solubility. The selected candidates were expressed, purified and validated regarding their binding properties in different enzyme-linked immunosorbent assay formats, immunofluorescence, immunohistochemistry and surface plasmon resonance spectroscopy. Four different nanobodies could be selected, of which three recognize the short isoform of GP2 very specifically and one shows a high binding capacity for both isoforms. The KD values measured for all nanobodies were between 1.3 nM and 2.3 pM, indicating highly specific binders suitable for application as a diagnostic tool in inflammatory bowel disease.
Monoclonal antibodies are highly valuable tools in biomedicine, but their generation by hybridoma technology is very time-consuming and elaborate. In order to circumvent the existing drawbacks, an in vitro immunization approach was established by which murine as well as human monoclonal antibodies against a viral coat protein could be developed. The in vitro immunization process was performed by isolating murine hematopoietic stem cells or human monocytes and differentiating them in vitro into immature dendritic cells. After antigen loading, the cells were co-cultivated with naive T and B lymphocytes for three days in order to obtain antigen-specific B lymphocytes in culture, followed by fusion with murine myeloma cells or human/murine heteromyeloma cells. Antigen-specific hybridomas were selected, and the generated antibodies were purified and characterized in this study by ELISA, western blot, gene sequencing and affinity measurements. Further, the characteristics were compared to those of a monoclonal antibody against the same target generated by conventional hybridoma technology. Isotype detection revealed a murine IgM and a human IgG4 antibody, in comparison to an IgG1 for the conventionally generated antibody. The antibodies derived from in vitro immunization indeed showed a lower affinity for the antigen compared to the conventionally generated one, which is probably due to the significantly shorter B cell maturation (3 days) during the immunization process. Nevertheless, they were suitable for building up a sandwich-based detection system. Therefore, the in vitro immunization approach seems to be a good and particularly fast alternative to conventional hybridoma technology.
Preface to BPM 2014
(2016)
In this Comment, we review the results of pattern formation in a reaction-diffusion-advection system following the kinetics of the Gray-Scott model. A recent paper by Das [Phys. Rev. E 92, 052914 (2015)] shows that spatiotemporal chaos of the intermittency type can disappear as the advective flow is increased. This study, however, refers to a single point in the space of kinetic parameters of the original Gray-Scott model. Here we show that the wealth of patterns increases substantially as some of these parameters are changed. In addition to spatiotemporal intermittency, defect-mediated turbulence can also be found. In all cases, however, the chaotic behavior is seen to disappear as the advective flow is increased, following a scenario similar to what was reported in our earlier work [I. Berenstein and C. Beta, Phys. Rev. E 86, 056205 (2012)] as well as by Das. We also point out that a similar phenomenon can be found in other reaction-diffusion-advection models, such as the Oregonator model for the Belousov-Zhabotinsky reaction under flow conditions.
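For orientation, the Gray-Scott kinetics with diffusion and a uniform advective flow of speed \( \phi \) take the standard form below (our notation; whether one or both species are advected depends on the flow configuration studied):

\[ \frac{\partial u}{\partial t} = D_u \nabla^2 u - \phi\,\frac{\partial u}{\partial x} - u v^2 + F(1-u), \]
\[ \frac{\partial v}{\partial t} = D_v \nabla^2 v - \phi\,\frac{\partial v}{\partial x} + u v^2 - (F + k)\,v, \]

where \( F \) is the feed rate and \( k \) the removal rate; these are the kinetic parameters whose variation is discussed above.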
The origin of ambling horses
(2016)
Horseback riding is the most fundamental use of domestic horses and has had a huge influence on the development of human societies for millennia. Over time, riding techniques and the style of riding improved. Therefore, horses with the ability to perform comfortable gaits (e.g. ambling or pacing), so-called ‘gaited’ horses, have been highly valued by humans, especially for long distance travel. Recently, the causative mutation for gaitedness in horses has been linked to a substitution causing a premature stop codon in the DMRT3 gene (DMRT3_Ser301STOP) [1]. In mice, Dmrt3 is expressed in spinal cord interneurons and plays an important role in the development of limb movement coordination [1]. Genotyping the position in 4396 modern horses from 141 breeds revealed that nowadays the mutated allele is distributed worldwide with an especially high frequency in gaited horses and breeds used for harness racing [2]. Here, we examine historic horse remains for the DMRT3 SNP, tracking the origin of gaitedness to Medieval England between 850 and 900 AD. The presence of the corresponding allele in Icelandic horses (9th–11th century) strongly suggests that ambling horses were brought from the British Isles to Iceland by Norse people. Considering the high frequency of the ambling allele in early Icelandic horses, we believe that Norse settlers selected for this comfortable mode of horse riding soon after arrival. The absence of the allele in samples from continental Europe (including Scandinavia) at this time implies that ambling horses may have spread from Iceland and maybe also the British Isles across the continent at a later date.
The Gradient Symbolic Computation (GSC) model presented in the keynote article (Goldrick, Putnam & Schwarz) constitutes a significant theoretical development, not only as a model of bilingual code-mixing, but also as a general framework that brings together symbolic grammars and graded representations. The authors are to be commended for successfully integrating a theory of grammatical knowledge with the voluminous research on lexical co-activation in bilinguals. It is, however, unfortunate that a certain conception of bilingualism was inherited from this latter research tradition, one in which the contrast between native and non-native language takes a back seat.
Adsorption of amino acids on the magnetite-(111)-surface: a force field study (vol 19, 851, 2013)
(2016)
Recent advances in high-throughput sequencing experiments and their theoretical descriptions have driven the fast dynamics of the "chromatin and epigenetics" field, with new concepts appearing at a high rate. This field includes, but is not limited to, the study of DNA-protein-RNA interactions, chromatin packing properties at different scales, regulation of gene expression and protein trafficking in the cell nucleus, binding-site search in the crowded chromatin environment, and modulation of physical interactions by covalent chemical modifications of the binding partners. The current special issue does not pretend to full coverage of the field, but rather aims to capture its development and provide a snapshot of the most recent concepts and approaches. The eighteen open-access articles comprising this issue provide a delicate balance between current theoretical and experimental biophysical approaches to uncovering chromatin structure and understanding epigenetic regulation, allowing a free flow of new ideas and preliminary results.
Seek and destroy: Filtration schemes and self-detoxifying protective fabrics based on the Zr(IV)-containing metal-organic frameworks (MOFs) MOF-808 and UiO-66 doped with LiOtBu have been developed that capture and hydrolytically detoxify simulants of nerve agents and mustard gas. Both MOFs function as highly catalytic elements in these applications.
Traditional economic theory could not explain, much less predict, the near collapse of the financial system and its long-lasting effects on the global economy. Since the 2008 crisis, there has been increasing interest in using ideas from complexity theory to make sense of economic and financial markets. Concepts such as tipping points, networks, contagion, feedback, and resilience have entered the financial and regulatory lexicon, but the actual use of complexity models and results remains at an early stage. Recent insights and techniques offer potential for better monitoring and management of highly interconnected economic and financial systems and, thus, may help anticipate and manage future crises.
It is argued that, despite differences in cultural norms and practices, the evidence for a link between violent media use and aggression is remarkably consistent across different countries. Along with evidence that different operationalizations of violent media use also converge across countries, these findings strengthen the conclusion that violent media are a risk factor for aggression and validate the psychological explanations for these effects. However, we need comparative studies based on a consistent methodology and a theory-based selection of cultural difference variables to properly examine the potential impact of culture on the association between violent media use and aggression.
Abrupt monsoon transitions as seen in paleorecords can be explained by moisture-advection feedback
(2016)
Storm runoff from the Marikina River Basin frequently causes flood events in the Philippine capital region Metro Manila. This paper presents and evaluates a system to predict short-term runoff from the upper part of that basin (380 km²). It was designed as a possible component of an operational warning system yet to be installed. For the purpose of forecast verification, hindcasts of streamflow were generated for a period of 15 months with a time-continuous, conceptual hydrological model. The latter was fed with real-time observations of rainfall. Both ground observations and weather radar data were tested as rainfall forcings. The radar-based precipitation estimates clearly outperformed the raingauge-based estimates in the hydrological verification. Nevertheless, the quality of the deterministic short-term runoff forecasts was found to be limited. For the radar-based predictions, the reduction of variance for lead times of 1, 2 and 3 hours was 0.61, 0.62 and 0.54, respectively, with reference to a no-forecast scenario, i.e. persistence. The probability of detection for major increases in streamflow was typically less than 0.5. Given the significance of flood events in the Marikina Basin, more effort needs to be put into the reduction of forecast errors and the quantification of remaining uncertainties.
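A small sketch, in our own notation and with synthetic data, of the two verification scores quoted above: the reduction of variance relative to a persistence (no-forecast) reference and the probability of detection for threshold events.

import numpy as np

def reduction_of_variance(obs, forecast, reference):
    # 1 - MSE(forecast) / MSE(reference); 1 is a perfect forecast, 0 means
    # no better than the reference (here: persistence, the last observed value).
    mse_f = np.mean((forecast - obs) ** 2)
    mse_r = np.mean((reference - obs) ** 2)
    return 1.0 - mse_f / mse_r

def probability_of_detection(obs_event, forecast_event):
    # hits / (hits + misses) for binary events such as "major rise in streamflow".
    hits = np.sum(forecast_event & obs_event)
    misses = np.sum(~forecast_event & obs_event)
    return hits / (hits + misses)

# Toy usage with synthetic data (the real study used 15 months of hindcasts).
obs = np.random.gamma(2.0, 5.0, size=1000)
persistence = np.roll(obs, 1)                            # "no-forecast" reference
forecast = obs + np.random.normal(0, 3, size=obs.size)   # imperfect forecast

print("RV :", reduction_of_variance(obs[1:], forecast[1:], persistence[1:]))
print("POD:", probability_of_detection(obs[1:] > 20, forecast[1:] > 20))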