Predator-prey cycles rank among the most fundamental concepts in ecology, are predicted by the simplest ecological models and enable, theoretically, the indefinite persistence of predator and prey(1-4). However, it remains an open question for how long cyclic dynamics can be self-sustained in real communities. Field observations have been restricted to a few cycle periods(5-8) and experimental studies indicate that oscillations may be short-lived without external stabilizing factors(9-19). Here we performed microcosm experiments with a planktonic predator-prey system and repeatedly observed oscillatory time series of unprecedented length that persisted for up to around 50 cycles or approximately 300 predator generations. The dominant type of dynamics was characterized by regular, coherent oscillations with a nearly constant predator-prey phase difference. Despite constant experimental conditions, we also observed shorter episodes of irregular, non-coherent oscillations without any significant phase relationship. However, the predator-prey system showed a strong tendency to return to the dominant dynamical regime with a defined phase relationship. A mathematical model suggests that stochasticity is probably responsible for the reversible shift from coherent to non-coherent oscillations, a notion that was supported by experiments with external forcing by pulsed nutrient supply. Our findings empirically demonstrate the potential for infinite persistence of predator and prey populations in a cyclic dynamic regime that shows resilience in the presence of stochastic events.
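The cyclic dynamics referred to above can be illustrated with the textbook Lotka–Volterra predator-prey model. This is a minimal sketch, not the authors' model of the planktonic system, and the parameter values are arbitrary illustrative choices:

```python
import numpy as np

def lotka_volterra(state, alpha=1.0, beta=1.0, delta=1.0, gamma=1.0):
    """Classic predator-prey equations: prey x grows and is eaten,
    predator y grows by predation and dies at a constant rate."""
    x, y = state
    return np.array([alpha * x - beta * x * y,
                     delta * x * y - gamma * y])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(x0=2.0, y0=1.0, dt=0.01, t_end=60.0):
    state = np.array([x0, y0])
    traj = [state]
    for _ in range(int(t_end / dt)):
        state = rk4_step(lotka_volterra, state, dt)
        traj.append(state)
    return np.array(traj)

traj = simulate()
prey = traj[:, 0]
# count completed cycles via downward crossings of the
# equilibrium prey level x* = gamma / delta = 1
crossings = np.sum((prey[:-1] > 1.0) & (prey[1:] <= 1.0))
```

With these parameters the system orbits the coexistence equilibrium (1, 1) indefinitely, which is the idealized "infinite persistence" regime the experiments probe.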
We elaborate on the possibilities and needs to integrate design thinking into requirements engineering, drawing from our research and project experiences. We suggest three approaches for tailoring and integrating design thinking and requirements engineering with complementary synergies and point at open challenges for research and practice.
Three poly(tetrafluoroethylene-hexafluoropropylene-vinylidenefluoride) (TFE-HFP-VDF or THV) terpolymers (Dyneon®) with different monomer ratios are investigated to demonstrate the concept of "modified" PTFE for space-charge electrets. HFP and VDF monomers distort the highly ordered PTFE molecules, which effectively enhances processability but adversely affects space-charge storage. In particular, the VDF component renders the material polar and probably also more conductive, partially undermining the space-charge-storage capabilities of PTFE. Nevertheless, the terpolymer THV815, with a TFE/HFP/VDF wt% ratio of 76.1/10.9/13, combines easy processability with relatively good space-charge stability. Our results shed light on novel concepts for space-charge electret materials with enhanced processing properties and reasonable charge-storage capabilities.
A second peak in the extreme ultraviolet sometimes appears during the gradual phase of solar flares, which is known as the EUV late phase (ELP). Stereotypically ELP is associated with two separated sets of flaring loops with distinct sizes, and it has been debated whether ELP is caused by additional heating or extended plasma cooling in the longer loop system. Here we carry out a survey of 55 M-and-above GOES-class flares with ELP during 2010-2014. Based on the flare-ribbon morphology, these flares are categorized as circular-ribbon (19 events), two-ribbon (23 events), and complex-ribbon (13 events) flares. Among them, 22 events (40%) are associated with coronal mass ejections, while the rest are confined. An extreme ELP, with the late-phase peak exceeding the main-phase peak, is found in 48% of two-ribbon flares, 37% of circular-ribbon flares, and 31% of complex-ribbon flares, suggesting that additional heating is more likely present during ELP in two-ribbon than in circular-ribbon flares. Overall, cooling may be the dominant factor causing the delay of the ELP peak relative to the main-phase peak, because the loop system responsible for the ELP emission is generally larger than, and well separated from, that responsible for the main-phase emission. All but one of the circular-ribbon flares can be well explained by a composite "dome-plate" quasi-separatrix layer (QSL). Only half of these show a magnetic null point, with its fan and spine embedded in the dome and plate, respectively. The dome-plate QSL, therefore, is a general and robust structure characterizing circular-ribbon flares.
Self-propelled rods
(2019)
A wide range of experimental systems including gliding, swarming and swimming bacteria, in vitro motility assays, and shaken granular media are commonly described as self-propelled rods. Large ensembles of such entities display a large variety of self-organized, collective phenomena, including the formation of moving polar clusters, polar and nematic dynamic bands, mobility-induced phase separation, topological defects, and mesoscale turbulence, among others. Here, we give a brief survey of experimental observations and review the theoretical description of self-propelled rods. Our focus is on the emergent pattern formation of ensembles of dry self-propelled rods governed by short-ranged, contact-mediated interactions and their wet counterparts that are also subject to long-ranged hydrodynamic flows. Altogether, self-propelled rods provide an overarching theme covering many aspects of active matter containing well-explored limiting cases. Their collective behavior not only bridges the well-studied regimes of polar self-propelled particles and active nematics, and includes active phase separation, but also reveals a rich variety of new patterns.
The German start-up subsidy (SUS) program for the unemployed has recently undergone a major makeover, altering its institutional setup, adding an additional layer of selection and leading to ambiguous predictions of the program's effectiveness. Using propensity score matching (PSM) as our main empirical approach, we provide estimates of long-term effects of the post-reform subsidy on individual employment prospects and labor market earnings up to 40 months after entering the program. Our results suggest large and persistent long-term effects of the subsidy on employment probabilities and net earned income. These effects are larger than what was estimated for the pre-reform program. Extensive sensitivity analyses within the standard PSM framework reveal that the results are robust to different choices regarding the implementation of the weighting procedure and also with respect to deviations from the conditional independence assumption. As a further assessment of the results' sensitivity, we go beyond the standard selection-on-observables approach and employ an instrumental variable setup using regional variation in the likelihood of receiving treatment. Here, we exploit the fact that the reform increased the discretionary power of local employment agencies in allocating active labor market policy funds, allowing us to obtain a measure of local preferences for SUS as the program of choice. The results based on this approach give rise to similar estimates. Thus, our results indicating that SUS are still an effective active labor market program after the reform do not appear to be driven by "hidden bias."
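The propensity score matching step can be sketched as follows. The data, covariate structure, and treatment effect below are synthetic and invented for illustration; this does not reproduce the study's actual specification:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# synthetic data: treatment uptake depends on observed covariates,
# and the outcome also depends on the first covariate (confounding)
n = 2000
X = rng.normal(size=(n, 3))
p_treat = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 0.3 * X[:, 1])))
treated = rng.random(n) < p_treat
outcome = 2.0 * treated + X[:, 0] + rng.normal(scale=0.5, size=n)  # true effect = 2

# 1) estimate propensity scores with a logistic regression
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2) match each treated unit to its nearest control on the score
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))

# 3) average treatment effect on the treated (ATT)
att = outcome[treated].mean() - outcome[~treated][idx.ravel()].mean()
```

Because treatment and outcome share a confounder, the naive treated-control difference is biased upward; matching on the estimated score recovers an estimate close to the true effect of 2.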
The numerical prediction of radiative transport is a challenging task due to the complexity of the radiative transport equation. We apply the lattice Boltzmann method (LBM), originally developed for fluid flow problems, to solve the radiative transport in volume. One model (meso RTLBM) is derived directly from a discretization of the radiative transport equation, yielding a precise but numerically costly scheme. The second model (macro RTLBM) solves the Helmholtz equation, which is a proper approximation for highly scattering volumes. Both numerical algorithms are validated against Monte-Carlo data for a set of 35 optical parameters, which correspond to radiative transport ranging from ballistic to diffuse regimes. Together with a set of four benchmark simulations, the comprehensive validation assesses the overall quality of, and reveals asymptotic trends for, radiative transport LBM. Furthermore, an accuracy map is presented, which summarizes the error for all parameters. This graph allows the validity range of both radiative transport LBMs to be determined at a glance. Finally, comprehensive guidelines are formulated to facilitate the choice of the radiative transport LBM model.
The knowledge of transformation pathways and identification of transformation products (TPs) of veterinary drugs is important for animal health, food, and environmental matters. The active agent Monensin (MON) belongs to the ionophore antibiotics and is widely used as a veterinary drug against coccidiosis in broiler farming. However, no electrochemically (EC) generated TPs of MON have been described so far. In this study, the online coupling of EC and mass spectrometry (MS) was used for the generation of oxidative TPs. EC conditions were optimized with respect to working electrode material, solvent, modifier, and potential polarity. Subsequent LC/HRMS (liquid chromatography/high resolution mass spectrometry) and MS/MS experiments were performed to identify the structures of derived TPs by a suspected target analysis. The obtained EC results were compared to TPs observed in metabolism tests with microsomes and hydrolysis experiments of MON. Five previously undescribed TPs of MON were identified in our EC/MS based study, and one TP, which was already known from the literature and found by a microsomal assay, could be confirmed. Two and three further TPs were found as products in microsomal tests and following hydrolysis, respectively. We found decarboxylation, O-demethylation and acid-catalyzed ring-opening reactions to be the major mechanisms of MON transformation.
Dermal Delivery of the High-Molecular-Weight Drug Tacrolimus by Means of Polyglycerol-Based Nanogels
(2019)
Polyglycerol-based thermoresponsive nanogels (tNGs) have been shown to have excellent skin hydration properties and to be valuable delivery systems for sustained release of drugs into skin. In this study, we compared the skin penetration of tacrolimus formulated in tNGs with a commercial 0.1% tacrolimus ointment. The penetration of the drug was investigated in ex vivo abdominal and breast skin, while different methods for skin barrier disruption were investigated to improve skin permeability or simulate inflammatory conditions with compromised skin barrier. The amount of penetrated tacrolimus was measured in skin extracts by liquid chromatography tandem-mass spectrometry (LC-MS/MS), whereas the inflammatory markers IL-6 and IL-8 were detected by enzyme-linked immunosorbent assay (ELISA). Higher amounts of tacrolimus penetrated in breast as compared to abdominal skin or in barrier-disrupted as compared to intact skin, confirming that the stratum corneum is the main barrier for tacrolimus skin penetration. The anti-proliferative effect of the penetrated drug was measured in skin tissue/Jurkat cells co-cultures. Interestingly, tNGs exhibited similar anti-proliferative effects as the 0.1% tacrolimus ointment. We conclude that polyglycerol-based nanogels represent an interesting alternative to paraffin-based formulations for the treatment of inflammatory skin conditions.
OpenForecast
(2019)
The development and deployment of new operational runoff forecasting systems are a strong focus of the scientific community due to the crucial importance of reliable and timely runoff predictions for early warnings of floods and flash floods for local businesses and communities. OpenForecast, the first operational runoff forecasting system in Russia open for public use, is presented in this study. We developed OpenForecast based only on open-source software and open data: the GR4J hydrological model, ERA-Interim meteorological reanalysis, and ICON deterministic short-range meteorological forecasts. Daily forecasts were generated for two basins in the European part of Russia. Simulation results showed a limited efficiency in reproducing the spring flood of 2019. Although the simulations managed to capture the timing of flood peaks, they failed in estimating flood volume. However, further implementation of a parsimonious data assimilation technique significantly alleviates simulation errors. The revealed limitations of the proposed operational runoff forecasting system provided a foundation to outline its further development and improvement.
Forest structure is a crucial component in the assessment of whether a forest is likely to act as a carbon sink under changing climate. Detailed 3D structural information about the tundra–taiga ecotone of Siberia is mostly missing and still underrepresented in current research due to the remoteness and restricted accessibility. Field based, high-resolution remote sensing can provide important knowledge for the understanding of vegetation properties and dynamics. In this study, we test the applicability of consumer-grade Unmanned Aerial Vehicles (UAVs) for rapid calculation of stand metrics in treeline forests. We reconstructed high-resolution photogrammetric point clouds and derived canopy height models for 10 study sites from NE Chukotka and SW Yakutia. Subsequently, we detected individual tree tops using a variable-window size local maximum filter and applied a marker-controlled watershed segmentation for the delineation of tree crowns. With this, we successfully detected 67.1% of the validation individuals. Simple linear regressions of observed and detected metrics show a better correlation (R2) and lower relative root mean square percentage error (RMSE%) for tree heights (mean R2 = 0.77, mean RMSE% = 18.46%) than for crown diameters (mean R2 = 0.46, mean RMSE% = 24.9%). The comparison between detected and observed tree height distributions revealed that our tree detection method was unable to representatively identify trees <2 m. Our results show that plot sizes for vegetation surveys in the tundra–taiga ecotone should be adapted to the forest structure and have a radius of >15–20 m to capture homogeneous and representative forest stands. Additionally, we identify sources of omission and commission errors and give recommendations for their mitigation. In summary, the efficiency of the used method depends on the complexity of the forest’s stand structure.
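The two detection steps named above (local-maximum filtering of a canopy height model, then marker-controlled watershed segmentation of crowns) can be sketched on a synthetic canopy height model. The fixed window size, height threshold, and toy Gaussian crowns are assumptions for illustration; the study used a variable-window filter on real UAV-derived data:

```python
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed

rng = np.random.default_rng(42)

# synthetic canopy height model (CHM): two Gaussian "crowns" on noisy flat ground
yy, xx = np.mgrid[0:60, 0:60]
chm = (6.0 * np.exp(-((yy - 20) ** 2 + (xx - 15) ** 2) / 40.0)
       + 4.0 * np.exp(-((yy - 40) ** 2 + (xx - 42) ** 2) / 30.0))
chm += rng.normal(scale=0.05, size=chm.shape)

# 1) local-maximum filter: a pixel is a tree top if it equals the
#    neighbourhood maximum and exceeds a minimum height threshold
window = 9          # fixed window here; the study used a variable-window filter
min_height = 2.0    # trees below ~2 m were not reliably detected in the study
local_max = ndimage.maximum_filter(chm, size=window)
tops = (chm == local_max) & (chm > min_height)
markers, n_trees = ndimage.label(tops)

# 2) marker-controlled watershed on the inverted CHM delineates one crown per top
crowns = watershed(-chm, markers, mask=chm > 0.5)
```

Each detected top seeds exactly one watershed basin, so the number of crown segments equals the number of detected trees.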
In this paper Lie group method in combination with Magnus expansion is utilized to develop a universal method applicable to solving a Sturm–Liouville problem (SLP) of any order with arbitrary boundary conditions. It is shown that the method has ability to solve direct regular (and some singular) SLPs of even orders (tested for up to eight), with a mix of (including non-separable and finite singular endpoints) boundary conditions, accurately and efficiently. The present technique is successfully applied to overcome the difficulties in finding suitable sets of eigenvalues so that the inverse SLP problem can be effectively solved. The inverse SLP algorithm proposed by Barcilon (1974) is utilized in combination with the Magnus method so that a direct SLP of any (even) order and an inverse SLP of order two can be solved effectively.
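The Magnus approach to a direct SLP can be sketched for the simplest second-order case. This is an illustrative shooting scheme using the first-order Magnus (exponential-midpoint) integrator with q = 0, not the paper's full higher-order algorithm:

```python
import numpy as np
from scipy.linalg import expm

def shoot(lam, q=lambda x: 0.0, n=200):
    """Integrate -y'' + q(x) y = lam * y on [0, 1] with y(0) = 0, y'(0) = 1
    via the exponential-midpoint Magnus method; return y(1)."""
    h = 1.0 / n
    Y = np.array([0.0, 1.0])          # state (y, y')
    for k in range(n):
        xm = (k + 0.5) * h            # midpoint of the step
        A = np.array([[0.0, 1.0],
                      [q(xm) - lam, 0.0]])
        Y = expm(h * A) @ Y           # exponential of the midpoint generator
    return Y[0]

# bisection on lam for the Dirichlet condition y(1) = 0;
# for q = 0 the first eigenvalue is pi^2 ~ 9.8696
lo, hi = 5.0, 15.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if shoot(lo) * shoot(mid) <= 0:
        hi = mid
    else:
        lo = mid
lam1 = 0.5 * (lo + hi)
```

For constant coefficients the midpoint Magnus step is exact, so the bisection converges to the analytic eigenvalue; for varying q(x) the scheme retains the Lie-group property of staying on the exact solution manifold of the averaged generator.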
In this paper, we investigate the continuous version of modified iterative Runge–Kutta-type methods for nonlinear inverse ill-posed problems proposed in a previous work. The convergence analysis is proved under the tangential cone condition, a modified discrepancy principle (i.e., the stopping time T is a solution of ‖F(x^δ(T)) − y^δ‖ = τδ₊ for some δ₊ > δ), and an appropriate source condition. We obtain the optimal rate of convergence.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. Having observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, here we aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, and subsequently related it to functional activation in an a priori region of interest encompassing the NAcc and amygdala, and to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging, and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r(s) = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; possible different involvement in association with disease trajectory should be investigated in future studies.
This study presents the first suite of apatite fission-track (AFT) ages from the SE part of the Western Sudetes. AFT cooling ages from the Orlica–Śnieżnik Dome and the Upper Nysa Klodzka Graben range from Late Cretaceous (84 Ma) to Early Palaeocene-Middle Eocene (64-45 Ma). The first stage of basin evolution (~100-90 Ma) was marked by the formation of a local extensional depocentre and disruption of the Mesozoic planation surface. Subsequent far-field convergence of European microplates resulted in Coniacian-Santonian (~89-83 Ma) thrust faulting. AFT data from both metamorphic basement and Mesozoic sedimentary cover indicate homogenous Late Cretaceous burial of the entire Western Sudetes. Thermal history modeling suggests that the onset of cooling could be constrained between 89 and 63 Ma with a climax during the Palaeocene-Middle Eocene basin inversion phase.
This thesis examines the extent to which the Beutelsbach Consensus is taken into account in a selected textbook for politics lessons at lower secondary level in Brandenburg. To approach this question, the three principles of the consensus are first outlined: the prohibition against overwhelming the student, the requirement to present controversial issues as controversial, and student orientation. Since the consensus, although shared by a majority of subject-didactics scholars, is repeatedly the subject of debate, approaches to updating or extending it are presented in a first step, followed by current points of contention. A brief interim conclusion then develops a clear understanding of the consensus, which is indispensable for the textbook analysis.
The next step discusses the role of textbooks as teaching and learning media, focusing in particular on why textbooks are especially suitable for analysis in the context of this thesis. Against this background, the concept of textbook analysis is introduced. Within this framework, the focus of the investigation (the requirement of controversy) and its object (the controversy over migration and integration) are narrowed down. The textbook Politik und Co. 1 is then analyzed using the research instrument developed (a coding guide). Finally, the results are summarized and the chosen approach is critically reflected upon.
Narratives are shaping our understanding of the world. They convey values and norms and point to desirable future developments. In this way, they justify and legitimize political actions and social practices. Once a narrative has emerged and this world view is supported by broad societal groups, narratives can provide powerful momentum to trigger innovation and changes in the course of action. Narratives, however, are not necessarily based on evidence and precise categories, but can instead be vague and ambiguous in order to be acceptable and attractive to different actors. However, the more open and inclusive a narrative is, the less impact can be expected. We investigate whether there is a shared narrative in research for the sustainable economy and how this can be evaluated in terms of its potential societal impact. The paper carves out the visions for the future that have been underlying the research projects conducted within the German Federal Ministry of Education and Research (BMBF) funding programme "The Sustainable Economy". It then analyzes whether these visions are compatible with narratives dominating societal discourse on the sustainable economy, and concludes how the use of visions and narratives in research can contribute to fostering societal transformations.
Bank filtration (BF) is an established indirect water-treatment technology. The quality of water gained via BF depends on the subsurface capture zone, the mixing ratio (river water versus ambient groundwater), spatial and temporal distribution of subsurface travel times, and subsurface temperature patterns. Surface-water infiltration into the adjacent aquifer is determined by the local hydraulic gradient and riverbed permeability, which could be altered by natural clogging, scouring and artificial decolmation processes. The seasonal behaviour of a BF system in Germany, and its development during and about 6 months after decolmation (canal reconstruction), was observed with a long-term monitoring programme. To quantify the spatial and temporal variation in the BF system, a transient flow and heat transport model was implemented and two model scenarios, 'with' and 'without' canal reconstruction, were generated. Overall, the simulated water heads and temperatures matched those observed. Increased hydraulic connection between the canal and aquifer caused by the canal reconstruction led to an increase of ~23% in the already high share of BF water abstracted by the nearby waterworks. Subsurface travel-time distribution substantially shifted towards shorter travel times. Flow paths with travel times <200 days increased by ~10% and those with <300 days by 15%. Generally, the periodic temperature signal, and the summer and winter temperature extrema, increased and penetrated deeper into the aquifer. The joint hydrological and thermal effects caused by the canal reconstruction might increase the potential of biodegradable compounds to further penetrate into the aquifer, also by potentially affecting the redox zonation in the aquifer.
When does life end?
(2019)
If you look at the question of end-of-life legislation, one basic question – or rather THE basic question – is particularly interesting: What is the "end of life"? What is death? Of course, one can approach this question theologically or philosophically, but also legally and especially medically. Since the 1960s, medical progress has made it possible to distinguish between different individual points of time within the natural dying process. However, this raises the question as to which of these points of time is relevant for criminal law. This question, which is usually considered very emotionally, will be examined in more detail in the paper.
Pride is linked to conviviality, to the practice of life-with-an-other, and to an awareness of the limitations of the life forms and life norms which guide and regulate the life of culturally, socially, and historically defined communities. Assuming this link, pride in living-together and conviviality appear as concepts creating a framework for future perspectives. But these concepts need a space in which they can unfold critically and confidently with a view to the future. For millennia, the literatures of the world have created this space of simulation and experimentation in which knowledge of how-to-live-with-an-other has been put down on paper through the open-ended tradition of writing. It is the space of the life forms and life norms of conviviality: it offers us prospective knowledge for the future by translating the imaginable into the thinkable, and the readable into the livable.
Assessments of psychotherapeutic competencies play a crucial role in research and training. However, research on the reliability and validity of such assessments is sparse. This study aimed to provide an overview of the current evidence and to provide an average interrater reliability (IRR) of psychotherapeutic competence ratings. A systematic review was conducted, and 20 studies reported in 32 publications were collected. These 20 studies were included in a narrative synthesis, and 20 coefficients were entered into the meta-analysis. Most primary studies referred to cognitive-behavioral therapies and the treatment of depression, used the Cognitive Therapy Scale, based ratings on videos, and trained the raters. Our meta-analysis revealed a pooled ICC of 0.82, but at the same time severe heterogeneity. The evidence map highlighted a variety of variables related to competence assessments. Further aspects influencing the reliability of competence ratings and regarding the considerable heterogeneity are discussed in detail throughout the manuscript.
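The pooling step behind such a meta-analytic ICC can be sketched with Fisher's z-transform and inverse-variance weights. The study ICCs and sample sizes below are hypothetical illustrations, not the coefficients from the review:

```python
import math

# hypothetical (study ICC, number of rated cases) pairs
studies = [(0.75, 30), (0.88, 45), (0.80, 25), (0.85, 60), (0.78, 40)]

# Fisher z-transform each coefficient; the large-sample variance of z
# is ~1/(n - 3), so weight each study by n - 3
zs = [(math.atanh(r), n - 3) for r, n in studies]
z_pooled = sum(z * w for z, w in zs) / sum(w for _, w in zs)

# back-transform the pooled z to the correlation scale
icc_pooled = math.tanh(z_pooled)
```

A fixed-effect pool like this is only defensible when heterogeneity is low; given the severe heterogeneity reported above, a random-effects model (adding a between-study variance component to each weight) would be the appropriate extension.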
Education
(2019)
Vives emphasizes needlework as an appropriate occupation for all women, even for ‘a princess or a queen’. A wide variety of schools run by individual tradesmen or women offered instruction in certain fields, such as writing and calculus, while schools erected or licensed by the authorities concentrated on religious education. A large group of orphanages founded during the sixteenth and early seventeenth centuries provided a sound education for boys and girls. Authorities, parents and educational thinkers of the time were much less concerned with girls’ education than with that of boys. Private tutoring at home concentrated on the same subjects but, when boys were instructed at home, some girls had a chance to participate in a more academically oriented education. In most educational settings, be it at day schools, boarding schools or in private homes, teachers, mothers and governesses were expected to raise good housewives, pious mothers and obedient spouses.
Concepts and theory
(2019)
There is no threat to Western democracies today comparable to the rise of right-wing populism. While it has played an increasing role at least since the 1990s, only the social consequences of the global financial crisis of 2008 gave it the breakthrough that led to the UK's 'Brexit' and the election of Donald Trump as US President in 2016, as well as promoting what has been called left populism in countries hit hardest by both the banking crisis and the consequent neo-liberal austerity politics in the EU, such as Greece and Portugal.
In 2017, the French Front National (FN) attracted many voters in the French Presidential elections; we have seen the radicalization of the Alternative für Deutschland (AfD) in Germany and the formation of a centre-right government in Austria. Further, we have witnessed the consolidation of autocratic regimes, as in the EU member states Poland and Greece. All these manifestations of right-wing populism share a common feature: they attack or even compromise the core elements of democratic societies, such as the separation of powers, the protection of minorities, or the rule of law.
Despite a broad debate on the re-emergence of ‘populism’ in the transition from the twentieth to the twenty-first century that has brought forth many interesting findings, a lack of sociological reasoning cannot be denied, as sociology itself withdrew from theorising populism decades ago and largely left the field to political sciences and history. In a sense, Populism and the Crisis of Democracy considers itself a contribution to begin filling this lacuna. Written in a direct and clear style, this set of volumes will be an invaluable reference for students and scholars in the field of political theory, political sociology and European Studies.
This volume Concepts and Theory offers new and fresh perspectives on the debate on populism. Starting from complaints about the problems of conceptualising populism that in recent years have begun to revolve around themselves, the chapters offer a fundamental critique of the term and concept of populism, theoretically inspired typologies and descriptions of currently dominant concepts, and ways to elaborate on them. With regard to theory, the volume offers approaches that exceed the disciplinary horizon of political science that so far has dominated the debate. As sociological theory so far has been more or less absent in the debate on populism, only few efforts have been made to discuss populism more intensely within different theoretical contexts in order to explain its dynamics and processes. Thus, this volume offers critical views on the debate on populism from the perspectives of political economy and the analysis of critical historical events, the links of analyses of populism with social movement mobilisation, the significance of ‘superfluous populations’ in the rise of populism and an analysis of the exclusionary character of populism from the perspective of the theory of social closure.
Leben in der ehemaligen DDR
(2019)
This article draws on the experience from an ongoing research project employing respondent-driven sampling (RDS) to survey (illicit) 24-hour home care workers. We highlight issues around the preparatory work and the fielding of the survey to provide researchers with useful insights on how to implement RDS when surveying populations for which the method has not yet been used. We conclude the article with ethical considerations that occur when employing RDS.
This paper compares the usability of data stemming from probability sampling with data stemming from nonprobability sampling. It develops six research scenarios that differ in their research goals and in their assumptions about the data-generating process. It is shown that inference from nonprobability samples implies demanding assumptions about the homogeneity of the units being studied. Researchers who are not willing to make these assumptions are generally better off using data from probability sampling, regardless of the amount of nonresponse. However, even in cases where data from probability sampling is clearly preferable, data stemming from nonprobability sampling may contribute to the cumulative scientific endeavour of pinpointing a plausible interval for the parameter of interest.
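The homogeneity point can be illustrated with a toy simulation (all numbers and variable names here are illustrative, not from the paper): when inclusion in a nonprobability sample correlates with the trait being measured, the sample mean is biased, while a probability sample with equal inclusion probabilities recovers the population mean up to sampling error.

```python
import random

random.seed(1)

# Hypothetical population: each unit carries a trait value 0..9.
population = [(i, i % 10) for i in range(10000)]  # (id, trait)
true_mean = sum(t for _, t in population) / len(population)  # 4.5

# Probability sample: every unit has the same known inclusion chance.
prob_sample = random.sample(population, 500)
prob_mean = sum(t for _, t in prob_sample) / len(prob_sample)

# Nonprobability sample: units with higher trait values opt in more
# often, so the estimate is biased upward unless the trait is
# homogeneous across units.
nonprob_sample = [(i, t) for i, t in population if random.random() < t / 20]
nonprob_mean = sum(t for _, t in nonprob_sample) / len(nonprob_sample)
```

Under this opt-in mechanism the expected nonprobability mean is E[t²]/E[t] ≈ 6.3 rather than 4.5, which is exactly the kind of gap that only homogeneity assumptions can rule out.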
Classical Wolf-Rayet (cWR) stars are at a crucial evolutionary stage for constraining the fates of massive stars. The feedback of these hot, hydrogen-depleted stars dominates their surroundings through tremendous injections of ionizing radiation and kinetic energy. The strength of a Wolf-Rayet (WR) wind decides the eventual mass of its remnant, likely a massive black hole. However, despite their major influence and importance for gravitational wave detection statistics, WR winds are particularly poorly understood. In this paper, we introduce the first set of hydrodynamically consistent stellar atmosphere models for cWR stars of both the carbon (C) and the nitrogen (N) sequence, i.e. WC and WN stars, as a function of the stellar luminosity-to-mass ratio (or Eddington Gamma) and metallicity. We demonstrate the inapplicability of the CAK wind theory for cWR stars and confirm earlier findings that their winds are launched at the (hot) iron (Fe) opacity peak. For log Z/Z⊙ > -2, Fe is also the main accelerator throughout the wind. Contrasting previous claims of a sharp lower mass-loss limit for WR stars, we obtain a smooth transition to optically thin winds. Furthermore, we find a strong dependence of the mass-loss rates on Eddington Gamma, both at solar and subsolar metallicity. Increases in WC carbon and oxygen abundances turn out to slightly reduce the predicted mass-loss rates. Calculations at subsolar metallicities indicate that below the metallicity of the Small Magellanic Cloud, WR mass-loss rates decrease much faster than previously assumed, potentially allowing for high black hole masses even in the local Universe.
The ability to reflect is considered an essential element of Education for Sustainable Development (ESD) and a key competence for learners and educators in ESD (UNECE Strategy for ESD, 2012). Despite its high importance, little is known about how reflective thinking can be identified, influenced or increased in the classroom. The objective of this study is therefore to address this need by developing an empirical multi-stage model designed to help educators diagnose different levels of reflective thinking and to identify factors that influence students’ reflective thinking about sustainability. Based on a 4–8-week project with grade 10 and 11 students studying sustainability, reflective thinking performance was analysed using weblogs as reflective journals. In addition, qualitative semi-structured interviews were conducted with the teachers to understand the learning environment and the personal value they assigned to ESD in their geography classes. To determine the levels of reflective thinking achieved by the students, the study built on the work of Dewey (1933) and pre-existing multi-stage models of reflective thinking (Bain, Ballantyne, & Packer, 1999; Chen, Wei, Wu, & Uden, 2009). Using a qualitative, iterative data analysis, the study adapted the stage models to be applicable in ESD and found great differences in the students’ reflection levels. Furthermore, the study identified eight factors that influence students’ reflective thinking about sustainability. The outcomes of this study may be valuable for educators in high school and higher education who seek to diagnose their students’ reflective thinking performance and facilitate reflection about sustainability.
We examine how and under what conditions informal institutional constraints, such as precedent and doctrine, are likely to affect collective choice within international organisations even in the absence of powerful bureaucratic agents. With a particular focus on the United Nations Security Council, we first develop a theoretical account of why such informal constraints might affect collective decisions even of powerful and strategically behaving actors. We show that precedents provide focal points that allow adopting collective decisions in coordination situations despite diverging preferences. Reliance on previous cases creates tacitly evolving doctrine that may develop incrementally. Council decision-making is also likely to be facilitated by an institutional logic of escalation driven by institutional constraints following from the typically staged response to crisis situations. We explore the usefulness of our theoretical argument with evidence from the Council doctrine on terrorism that has evolved since 1985. The key decisions studied include the 1992 sanctions resolution against Libya and the 2001 Council response to the 9/11 attacks. We conclude that, even within intergovernmentally structured international organisations, member states do not operate on a clean slate, but in a highly institutionalised environment that shapes their opportunities for action.
Professional development on fostering students’ academic language proficiency across the curriculum
(2019)
This meta-analysis aggregates effects from 10 studies evaluating professional development interventions aimed at qualifying in-service teachers to support their students in mastering academic language skills while teaching their respective subject areas. The analysis of a subset of studies revealed a small non-significant weighted training effect on teachers' cognition (g' = 0.21, SE = 0.14). An effect aggregation including all studies (with 650 teachers) revealed a medium to large weighted overall effect on teachers' classroom practices (g' = 0.71, SE = 0.16). Methodological variables moderated the effect magnitude. Nevertheless, the results suggest professional development is beneficial for improving teachers' practice.
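A common way to obtain weighted effects like those reported above is fixed-effect inverse-variance aggregation; a minimal sketch follows, with purely illustrative effect sizes and standard errors, not the studies' actual data.

```python
def weighted_effect(effects, ses):
    """Fixed-effect inverse-variance aggregation of effect sizes.

    Each study's weight is 1/SE^2; the pooled effect is the weighted
    mean, and its standard error is sqrt(1 / sum of weights).
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Illustrative values only:
g, se = weighted_effect([0.5, 0.9, 0.7], [0.2, 0.3, 0.25])
```

Note how studies with smaller standard errors pull the pooled estimate toward their own effect, which is why methodological variables can moderate the aggregated magnitude.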
This chapter analyses whether and how democracy is actually threatened by big-data-based operations and what role international law can play in responding to this possible threat. It shows how both state and non-state actors may undermine democracy through big-data operations. Although democracy as such is a rather underdeveloped concept in international law, which is often more concerned with effectivity than legitimacy, international law protects against these challenges via a democracy-based approach rooted in international human rights law on the one hand and the principle of non-intervention on the other. Thus, although democracy does not play a major role in international law, international law is nevertheless able to protect democracy against challenges from the inside as well as from the outside.
Duplicate detection algorithms produce clusters of database records, each cluster representing a single real-world entity. As most of these algorithms use pairwise comparisons, the resulting (transitive) clusters can be inconsistent: not all records within a cluster are sufficiently similar to be classified as duplicates. Thus, one of many subsequent clustering algorithms can further improve the result. We explain in detail, compare, and evaluate many of these algorithms, and introduce three new clustering algorithms in the specific context of duplicate detection. Two of our three new algorithms use the structure of the input graph to create consistent clusters. Our third algorithm, like many other clustering algorithms, focuses on the edge weights instead. For the evaluation, in contrast to related work, we experiment on true real-world datasets and, in addition, examine in great detail various pair-selection strategies used in practice. While no overall winner emerges, we are able to identify the best approaches for different situations. In scenarios with larger clusters, our proposed algorithm, Extended Maximum Clique Clustering (EMCC), and Markov Clustering show the best results. EMCC especially outperforms Markov Clustering regarding the precision of the results and has the additional advantage that it can also be used in scenarios where edge weights are not available.
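The inconsistency of transitive clusters can be made concrete with a small sketch (function names are ours for illustration; this is not the EMCC algorithm itself): merging pairwise duplicate decisions by transitivity can yield a cluster that is not a clique of pairwise matches.

```python
from itertools import combinations

def transitive_clusters(records, duplicate_pairs):
    """Union pairwise duplicate decisions into connected components."""
    parent = {r: r for r in records}

    def find(r):
        while parent[r] != r:
            parent[r] = parent[parent[r]]  # path halving
            r = parent[r]
        return r

    for a, b in duplicate_pairs:
        parent[find(a)] = find(b)

    clusters = {}
    for r in records:
        clusters.setdefault(find(r), set()).add(r)
    return list(clusters.values())

def is_consistent(cluster, duplicate_pairs):
    """A cluster is consistent only if every pair inside it was
    classified as a duplicate, i.e. the cluster forms a clique."""
    pairs = {frozenset(p) for p in duplicate_pairs}
    return all(frozenset((a, b)) in pairs
               for a, b in combinations(cluster, 2))

# A-B and B-C were classified as duplicates, but A-C was not: the
# transitive cluster {A, B, C} is inconsistent, which is what the
# subsequent clustering algorithms are meant to repair.
recs = ["A", "B", "C"]
pairs = [("A", "B"), ("B", "C")]
(cluster,) = [c for c in transitive_clusters(recs, pairs) if len(c) > 1]
```

Clique-oriented approaches resolve such cases by splitting the component, for example into {A, B} and {C}, at the cost of possibly separating true duplicates.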
Industry 4.0, based on increasingly progressive digitalization, is a global phenomenon that affects every part of our work. The Internet of Things (IoT) is pushing the process of automation, culminating in the total autonomy of cyber-physical systems. This process is accompanied by a massive amount of data, information, and new dimensions of flexibility. As the amount of available data increases, their specific timeliness decreases. Mastering Industry 4.0 requires humans to master the new dimensions of information and to adapt to relevant ongoing changes. Intentional forgetting can make a difference in this context, as it discards nonprevailing information and actions in favor of prevailing ones. Intentional forgetting is the basis of any adaptation to change, as it ensures that nonprevailing memory items are not retrieved while prevailing ones are retained. This study presents a novel experimental approach that was introduced in a learning factory (the Research and Application Center Industry 4.0) to investigate intentional forgetting as it applies to production routines. In the first experiment (N = 18), in which the participants collectively performed 3046 routine-related actions (t1 = 1402, t2 = 1644), the results showed that highly proceduralized actions were more difficult to forget than actions that were less well-learned. Additionally, we found that the quality of the cues that trigger the execution of routine actions had no effect on the extent of intentional forgetting.
Selfish Network Creation focuses on modeling real world networks from a game-theoretic point of view. One of the classic models by Fabrikant et al. (2003) is the network creation game, where agents correspond to nodes in a network which buy incident edges for the price of alpha per edge to minimize their total distance to all other nodes. The model is well-studied but still has intriguing open problems. The most famous conjectures state that the price of anarchy is constant for all alpha and that for alpha >= n all equilibrium networks are trees. We introduce a novel technique for analyzing stable networks for high edge-price alpha and employ it to improve on the best known bound for the latter conjecture. In particular we show that for alpha > 4n - 13 all equilibrium networks must be trees, which implies a constant price of anarchy for this range of alpha. Moreover, we also improve the constant upper bound on the price of anarchy for equilibrium trees.
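The cost an agent minimizes in this model can be sketched as follows, assuming the usual definition from the abstract (alpha per bought edge plus the sum of hop distances to all other nodes; function and variable names are ours):

```python
from collections import deque

def agent_cost(n, bought, all_edges, agent, alpha):
    """Cost of one agent in the network creation game: alpha per
    bought edge plus the sum of shortest-path distances (in the
    undirected network) to all other nodes."""
    adj = {v: set() for v in range(n)}
    for u, v in all_edges:
        adj[u].add(v)
        adj[v].add(u)
    # BFS distances from the agent.
    dist = {agent: 0}
    queue = deque([agent])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return alpha * len(bought[agent]) + sum(dist[v] for v in range(n))

# A star over 4 nodes whose centre (node 0) buys all three edges:
edges = [(0, 1), (0, 2), (0, 3)]
bought = {0: [(0, 1), (0, 2), (0, 3)], 1: [], 2: [], 3: []}
centre_cost = agent_cost(4, bought, edges, 0, alpha=2.0)  # 3*2 + 3 = 9
leaf_cost = agent_cost(4, bought, edges, 1, alpha=2.0)    # 0 + 5 = 5
```

A network is an equilibrium when no agent can lower this cost by buying, dropping, or swapping its own edges; the tree conjecture discussed above concerns which networks satisfy this for large alpha.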
Based on the current debate on international practices, with its focus on taken-for-granted everyday practices, we examine how Security Council practices may affect member state action and collective decisions on intrastate conflicts. We outline a concept that integrates the structuring effect of practices and their emergence from interaction among reflective actors. It promises to overcome the unresolved tension between understanding practices as a social regularity and as a fluid entity. We analyse the constitutive mechanisms of two Council practices that affect collective decisions on intrastate conflicts and elucidate how even reflective Council members become enmeshed in the constraints of evolving practices and their normative implications. (1) Previous Council decisions create precedent pressure and give rise to a virtually uncontested permissive Council practice that defines the purview for intervention in such conflicts. (2) A ratcheting practice forces opponents to choose between accepting steadily reinforced Council action, as occurred regarding Sudan/Darfur, and outright blockade, as in the case of Syria. We conclude that practices constitute a source of influence that is not captured by the traditional perspectives on Council activities as the consequence of geopolitical interests or of externally evolving international norms such as the ‘responsibility to protect’ (R2P).
Little is known about how far-reaching decisions in UN Security Council sanctions committees are made. Developing a novel committee governance concept and using examples drawn from sanctions imposed on Iraq, Al-Qaida, Congo, Sudan and Iran, this book shows that Council members tend to follow the will of the powerful, whereas sanctions committee members often decide according to the rules. This is surprising since both Council and committees are staffed by the same member states.
Offering a fascinating account of Security Council micro-politics and decision-making processes on sanctions, this rigorous comparative and theory-driven analysis treats the Council and its sanctions committees as distinguishable entities that may differ in decision practice despite having the same members. Drawing extensively on primary documents, diplomatic cables, well-informed press coverage, reports by close observers and extensive interviews with committee members, Council diplomats and sanctions experts, it contrasts with the conventional wisdom on decision-making within these bodies, which suggests that the powerful permanent members would not accept rule-based decisions against their interests.
This book will be of interest to policy practitioners and scholars working in the broad field of international organizations and international relations theory as well as those specializing in sanctions, international law, the Security Council and counter-terrorism.
Demokratie, Krieg und Tod
(2019)
From an international perspective, the peace process in Liberia has generally been described as a successful model for international peacebuilding interventions. But how do Liberians perceive the peace process in their country? The aim of this paper is to complement an institutionalist approach, which looks at the security and justice mechanisms in Liberia, with insights into local perceptions in order to answer the following questions: how do Liberians perceive the peace process in their country, and which institutions have supported the establishment of sustaining peace? After briefly introducing the background of the Liberian conflict and the data collection, I present first results, analysing the mechanisms linking two peacebuilding institutions (peacekeeping and transitional justice) with the establishment of sustaining peace in Liberia.
Do simulation games, as an active learning method, improve the learning outcomes of students of peace and conflict studies (Friedens- und Konfliktforschung, FuK)? This article examines several UN simulations in order to assess their effectiveness across three areas of knowledge (factual knowledge, procedural knowledge and soft skills). In contrast to theoretical claims about the positive effects of active learning environments on students' learning outcomes, empirical evidence is limited. This article systematically reviews earlier claims about the learning effects of UN simulations and demonstrates their added value for peace and conflict studies. To obtain comprehensive data, we evaluate three simulation games covering a range of simulation characteristics: a short simulation of the UN Security Council, a regional UN simulation, and the participation of two delegations in the National Model United Nations. The results show that simulation games as a teaching method have positive effects on students' learning outcomes: they lead to better knowledge about the UN and foster soft skills as well as the capacity for reflection.
Voltaire-Preis
(2019)