This paper compares the usability of data stemming from probability sampling with data stemming from nonprobability sampling. It develops six research scenarios that differ in their research goals and in their assumptions about the data-generating process. It shows that inference from data stemming from nonprobability sampling implies demanding assumptions about the homogeneity of the units being studied. Researchers who are unwilling to make these assumptions are generally better off using data from probability sampling, regardless of the amount of nonresponse. However, even in cases where data from probability sampling is clearly advisable, data stemming from nonprobability sampling may contribute to the cumulative scientific endeavour of pinpointing a plausible interval for the parameter of interest.
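The core point of the abstract, that nonprobability samples stay biased when inclusion depends on the quantity being studied, can be illustrated with a small simulation. This is a hypothetical sketch, not material from the paper: the population, the self-selection propensity `(y - 20) / 100`, and all sample sizes are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical heterogeneous population: outcome y varies across units.
population = [random.gauss(50.0, 10.0) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# Probability sample: every unit has the same, known inclusion probability.
prob_sample = random.sample(population, 500)

# Nonprobability sample: inclusion propensity grows with y itself,
# mimicking self-selection (e.g. opt-in web panels).
nonprob_sample = [y for y in population if random.random() < (y - 20) / 100]

est_prob = sum(prob_sample) / len(prob_sample)
est_nonprob = sum(nonprob_sample) / len(nonprob_sample)

print(f"true mean      : {true_mean:.1f}")
print(f"probability    : {est_prob:.1f}  (n={len(prob_sample)})")
print(f"nonprobability : {est_nonprob:.1f}  (n={len(nonprob_sample)})")
```

The small probability sample lands near the true mean, while the much larger self-selected sample stays biased upward, no matter how large it grows, unless one imposes homogeneity assumptions that justify ignoring the selection mechanism.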
Classical Wolf-Rayet (cWR) stars are at a crucial evolutionary stage for constraining the fates of massive stars. The feedback of these hot, hydrogen-depleted stars dominates their surroundings through tremendous injections of ionizing radiation and kinetic energy. The strength of a Wolf-Rayet (WR) wind decides the eventual mass of its remnant, likely a massive black hole. However, despite their major influence and importance for gravitational wave detection statistics, WR winds are particularly poorly understood. In this paper, we introduce the first set of hydrodynamically consistent stellar atmosphere models for cWR stars of both the carbon (C) and the nitrogen (N) sequence, i.e. WC and WN stars, as a function of stellar luminosity-to-mass ratio (or Eddington Gamma) and metallicity. We demonstrate the inapplicability of the CAK wind theory for cWR stars and confirm earlier findings that their winds are launched at the (hot) iron (Fe) opacity peak. For log Z/Z☉ > -2, Fe is also the main accelerator throughout the wind. Contrasting previous claims of a sharp lower mass-loss limit for WR stars, we obtain a smooth transition to optically thin winds. Furthermore, we find a strong dependence of the mass-loss rates on Eddington Gamma, both at solar and subsolar metallicity. Increases in WC carbon and oxygen abundances turn out to slightly reduce the predicted mass-loss rates. Calculations at subsolar metallicities indicate that below the metallicity of the Small Magellanic Cloud, WR mass-loss rates decrease much faster than previously assumed, potentially allowing for high black hole masses even in the local Universe.
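The "Eddington Gamma" used above as the luminosity-to-mass parameter is, in its standard electron-scattering form (a textbook definition, not reproduced from the paper), the ratio of radiative acceleration to gravity:

```latex
\Gamma_e = \frac{\kappa_e L}{4 \pi c \, G M}
```

where $\kappa_e$ is the electron-scattering opacity, $L$ the stellar luminosity, and $M$ the stellar mass; winds launched near $\Gamma_e \to 1$ are radiation-pressure dominated.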
The ability to reflect is considered an essential element of Education for Sustainable Development (ESD) and a key competence for learners and educators in ESD (UNECE Strategy for ESD, 2012). Despite its high importance, little is known about how reflective thinking can be identified, influenced or increased in the classroom. Therefore, the objective of this study is to address this need by developing an empirical multi-stage model designed to help educators diagnose different levels of reflective thinking and to identify factors that influence students’ reflective thinking about sustainability. Based on a 4–8-week project with grade 10 and 11 students studying sustainability, reflective thinking performance was analysed using weblogs as reflective journals. In addition, qualitative semi-structured interviews were conducted with the teachers to comprehend the learning environment and the personal value they assigned to ESD in their geography class. To determine the levels of reflective thinking achieved by the students, the study built on the work of Dewey (1933) and pre-existing multi-stage models of reflective thinking (Bain, Ballantyne, & Packer, 1999; Chen, Wei, Wu, & Uden, 2009). Using a qualitative, iterative data analysis, the study adapted the stage models to be applicable in ESD and found great differences in the students’ reflection levels. Furthermore, the study identified eight factors that influence students’ reflective thinking about sustainability. The outcomes of this study may be valuable for educators in high school and higher education who seek to diagnose their students’ reflective thinking performance and facilitate reflection about sustainability.
We examine how and under what conditions informal institutional constraints, such as precedent and doctrine, are likely to affect collective choice within international organisations even in the absence of powerful bureaucratic agents. With a particular focus on the United Nations Security Council, we first develop a theoretical account of why such informal constraints might affect collective decisions even of powerful and strategically behaving actors. We show that precedents provide focal points that enable collective decisions to be adopted in coordination situations despite diverging preferences. Reliance on previous cases creates a tacitly and incrementally evolving doctrine. Council decision-making is also likely to be facilitated by an institutional logic of escalation driven by institutional constraints following from the typically staged response to crisis situations. We explore the usefulness of our theoretical argument with evidence from the Council doctrine on terrorism that has evolved since 1985. The key decisions studied include the 1992 sanctions resolution against Libya and the 2001 Council response to the 9/11 attacks. We conclude that, even within intergovernmentally structured international organisations, member states do not operate on a clean slate, but in a highly institutionalised environment that shapes their opportunities for action.
Professional development on fostering students’ academic language proficiency across the curriculum
(2019)
This meta-analysis aggregates effects from 10 studies evaluating professional development interventions aimed at qualifying in-service teachers to support their students in mastering academic language skills while teaching their respective subject areas. The analysis of a subset of studies revealed a small non-significant weighted training effect on teachers' cognition (g' = 0.21, SE = 0.14). An effect aggregation including all studies (with 650 teachers) revealed a medium to large weighted overall effect on teachers' classroom practices (g' = 0.71, SE = 0.16). Methodological variables moderated the effect magnitude. Nevertheless, the results suggest professional development is beneficial for improving teachers' practice.
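Weighted effects of the kind reported above (g' with a standard error) are typically obtained by inverse-variance weighting. A minimal sketch of a fixed-effect aggregation follows; the study-level effect sizes and standard errors are illustrative numbers, not the data from the ten studies analysed:

```python
def fixed_effect(effects, ses):
    """Inverse-variance weighted mean effect and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]          # precision of each study
    pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical study-level effects (Hedges' g) and standard errors.
g, se = fixed_effect([0.9, 0.5, 0.7], [0.20, 0.25, 0.30])
print(f"g' = {g:.2f}, SE = {se:.2f}")
```

Precise studies (small SE) dominate the pooled estimate; a random-effects model would additionally widen the weights by a between-study variance component.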
This chapter aims to analyse whether and how democracy is actually threatened by big-data-based operations and what role international law can play in responding to this possible threat. It shows how big-data-based operations challenge democracy and how international law can help in defending it. The chapter focuses on how both state and non-state actors may undermine democracy through big-data operations. Although democracy as such is a rather underdeveloped concept in international law, which is often more concerned with effectivity than legitimacy, international law protects against these challenges via a democracy-based approach rooted in international human rights law on the one hand and the principle of non-intervention on the other. Thus, although democracy does not play a major role in international law, international law is nevertheless able to protect democracy against challenges from the inside as well as the outside.
Duplicate detection algorithms produce clusters of database records, each cluster representing a single real-world entity. As most of these algorithms use pairwise comparisons, the resulting (transitive) clusters can be inconsistent: not all records within a cluster are sufficiently similar to be classified as duplicates. Thus, one of many subsequent clustering algorithms can further improve the result. We explain in detail, compare, and evaluate many of these algorithms and introduce three new clustering algorithms in the specific context of duplicate detection. Two of our three new algorithms use the structure of the input graph to create consistent clusters; our third algorithm, like many other clustering algorithms, focuses on the edge weights instead. For evaluation, in contrast to related work, we experiment on true real-world datasets and, in addition, examine in great detail various pair-selection strategies used in practice. While no overall winner emerges, we are able to identify the best approaches for different situations. In scenarios with larger clusters, our proposed algorithm, Extended Maximum Clique Clustering (EMCC), and Markov Clustering show the best results. EMCC especially outperforms Markov Clustering regarding the precision of the results and has the additional advantage that it can also be used in scenarios where edge weights are not available.
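The inconsistency the abstract describes, where taking the transitive closure of pairwise matches groups records that were never classified as duplicates of each other, can be sketched as follows. The records, the one-dimensional similarity rule, and the threshold are toy assumptions for illustration; this is not one of the paper's algorithms:

```python
from itertools import combinations

# Toy records with one numeric attribute each; two records are classified
# as duplicates when their values differ by at most 1.0 (illustrative rule).
records = {"r1": 0.0, "r2": 0.8, "r3": 1.6}

def is_match(a, b):
    return abs(records[a] - records[b]) <= 1.0

matches = {frozenset(p) for p in combinations(records, 2) if is_match(*p)}

# Transitive closure via union-find: any chain of pairwise matches
# (r1~r2, r2~r3) ends up in a single cluster.
parent = {r: r for r in records}
def find(r):
    while parent[r] != r:
        parent[r] = parent[parent[r]]   # path compression
        r = parent[r]
    return r
for a, b in matches:
    parent[find(a)] = find(b)

clusters = {}
for r in records:
    clusters.setdefault(find(r), set()).add(r)

# A cluster is consistent only if every pair inside it was itself a match.
for members in clusters.values():
    inconsistent = any(frozenset(p) not in matches
                       for p in combinations(sorted(members), 2))
    print(sorted(members), "inconsistent" if inconsistent else "consistent")
```

Here r1 and r3 land in the same cluster although they were never matched (their distance is 1.6). A subsequent clustering step would repair this, for example by requiring every in-cluster pair to match (clique-based approaches such as EMCC) or by operating on the edge weights (such as Markov Clustering).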
Industry 4.0, based on increasingly progressive digitalization, is a global phenomenon that affects every part of our work. The Internet of Things (IoT) is pushing the process of automation, culminating in the total autonomy of cyber-physical systems. This process is accompanied by a massive amount of data, information, and new dimensions of flexibility. As the amount of available data increases, its specific timeliness decreases. Mastering Industry 4.0 requires humans to master the new dimensions of information and to adapt to relevant ongoing changes. Intentional forgetting can make a difference in this context, as it discards nonprevailing information and actions in favor of prevailing ones. Intentional forgetting is the basis of any adaptation to change, as it ensures that nonprevailing memory items are not retrieved while prevailing ones are retained. This study presents a novel experimental approach that was introduced in a learning factory (the Research and Application Center Industry 4.0) to investigate intentional forgetting as it applies to production routines. In the first experiment (N = 18), in which the participants collectively performed 3046 routine-related actions (t1 = 1402, t2 = 1644), the results showed that highly proceduralized actions were more difficult to forget than actions that were less well-learned. Additionally, we found that the quality of cues that trigger the execution of routine actions had no effect on the extent of intentional forgetting.