Alcohol intoxication is known to affect many aspects of human behavior and cognition; one such affected system is articulation during speech production. Although much research has revealed that alcohol negatively impacts pronunciation in a first language (L1), there is only initial evidence suggesting a potential beneficial effect of inebriation on articulation in a non-native language (L2). The aim of this study was thus to compare the effect of alcohol consumption on pronunciation in an L1 and an L2. Participants who had ingested different amounts of alcohol provided speech samples in their L1 (Dutch) and L2 (English), and native speakers of each language subsequently rated the pronunciation of these samples on their intelligibility (for the L1) and accent nativelikeness (for the L2). These data were analyzed with generalized additive mixed modeling. Participants' blood alcohol concentration indeed negatively affected pronunciation in the L1, but it produced no significant effect on the L2 accent ratings. The expected negative impact of alcohol on L1 articulation can be explained by a reduction in fine motor control. We present two hypotheses to account for the absence of any effects of intoxication on L2 pronunciation: (1) there may be a reduction in L1 interference on L2 speech due to decreased motor control, or (2) alcohol may produce a differential effect on each of the two linguistic subsystems.
Objective:
Depression and coronary heart disease (CHD) are highly comorbid conditions. Brain-derived neurotrophic factor (BDNF) plays an important role in cardiovascular processes. Depressed patients typically show decreased BDNF concentrations. We analysed the relationship between BDNF and depression in a sample of patients with CHD and additionally distinguished between cognitive-affective and somatic depression symptoms. We also investigated whether BDNF was associated with somatic comorbidity burden, acute coronary syndrome (ACS) or congestive heart failure (CHF).
Methods:
The following variables were assessed for 225 hospitalised patients with CHD: BDNF concentrations, depression [Patient Health Questionnaire-9 (PHQ-9)], somatic comorbidity (Charlson Comorbidity Index), CHF, ACS, platelet count, smoking status and antidepressant treatment.
Results:
Regression models revealed that BDNF was not associated with depression severity. Although depressed patients (PHQ-9 score >7) had significantly lower BDNF concentrations than non-depressed patients (p = 0.04), this difference was no longer significant after controlling for confounders (p = 0.15). Associations of BDNF with cognitive-affective symptoms and with somatic comorbidity burden narrowly missed statistical significance (p = 0.08 and p = 0.06, respectively). BDNF was reduced in patients with CHF (p = 0.02). After covariate adjustment, there was no significant association between BDNF and ACS.
Conclusion:
Serum BDNF concentrations are associated with cardiovascular dysfunction. Somatic comorbidities should be considered when investigating the relationship between depression and BDNF.
Gender stereotypes influence subjective beliefs about the world, and this is reflected in our use of language. But do gender biases in language transparently reflect subjective beliefs? Or is the process of translating thought to language itself biased? During the 2016 United States (N = 24,863) and 2017 United Kingdom (N = 2,609) electoral campaigns, we compared participants' beliefs about the gender of the next head of government with their use and interpretation of pronouns referring to the next head of government. In the United States, even when the female candidate was expected to win, she pronouns were rarely produced and induced substantial comprehension disruption. In the United Kingdom, where the incumbent female candidate was heavily favored, she pronouns were preferred in production but yielded no comprehension advantage. These and other findings suggest that the language system itself is a source of implicit biases above and beyond previously known biases, such as those measured by the Implicit Association Test.
Making sense of the world
(2020)
For human infants, the first years after birth are a period of intense exploration: getting to understand their own competencies in interaction with a complex physical and social environment. In contemporary neuroscience, the predictive-processing framework has been proposed as a general working principle of the human brain, namely the optimization of predictions about sensory inputs from the environment and about the consequences of one's own actions. However, the predictive-processing framework has rarely been applied to infancy research. We argue that a predictive-processing framework may provide a unifying perspective on several phenomena of infant development and learning that may seem unrelated at first sight. These phenomena include statistical learning principles, infants' motor and proprioceptive learning, and infants' basic understanding of their physical and social environment. We discuss how a predictive-processing perspective can advance the understanding of infants' early learning processes in theory, research, and application.
Only the right noise?
(2020)
Seminal work by Werker and colleagues (Stager & Werker [1997] Nature, 388, 381-382) has found that 14-month-old infants do not show evidence for learning minimal pairs in the habituation-switch paradigm. However, when multiple speakers produce the minimal pair in acoustically variable ways, infants' performance improves in comparison to a single-speaker condition (Rost & McMurray [2009] Developmental Science, 12, 339-349). The current study further extends these results and assesses how different kinds of input variability affect 14-month-olds' minimal pair learning in the habituation-switch paradigm, testing German-learning infants. The first two experiments investigated word learning when the labels were spoken by a single speaker versus when the labels were spoken by multiple speakers. In the third experiment we studied whether non-acoustic variability, implemented as visual variability of the objects presented together with the labels, would also affect minimal pair learning. We found enhanced learning in the multiple-speaker compared to the single-speaker condition, confirming previous findings with English-learning infants. In contrast, visual variability of the presented objects did not support learning. These findings both confirm and better delimit the beneficial role of speech-specific variability in minimal pair learning. Finally, we review different proposals on the mechanisms via which variability confers benefits to learning and outline likely principles that underlie this benefit. We highlight among these the multiplicity of acoustic cues signalling phonemic contrasts and the presence of relations among these cues. It is in these relations that we trace part of the source of the apparently paradoxical benefit of variability in learning.
In his essay, Mel Ainscow looks at inclusion and equity from an international perspective and makes suggestions on how to develop inclusive education in a ‘whole-system approach’. After discussing different conceptions of inclusion and equity, he describes international policies which address them. From this international macro-level, Ainscow zooms in to the meso-level of the school and its immediate environment, defining dimensions to be considered for an inclusive school development. One of these dimensions is the ‘use of evidence’. In my comment, I want to focus on this dimension and discuss its scope and the potential to apply it in inclusive education development. As a first and important precondition, Ainscow explains that different circumstances lead to different linguistic uses of the term ‘inclusive education’. Thus, the term ‘inclusive education’ does not refer to an identical set of objectives across countries, and neither does the term ‘equity’.
Rats are a reservoir of human- and livestock-associated methicillin-resistant Staphylococcus aureus (MRSA). However, the composition of the natural S. aureus population in wild and laboratory rats is largely unknown. Here, 144 nasal S. aureus isolates from free-living wild rats, captive wild rats and laboratory rats were genotyped and profiled for antibiotic resistances and human-specific virulence genes. The nasal S. aureus carriage rate was higher among wild rats (23.4%) than laboratory rats (12.3%). Free-living wild rats were primarily colonized with isolates of clonal complex (CC) 49 and CC130 and maintained these strains even in husbandry. Moreover, upon livestock contact, CC398 isolates were acquired. In contrast, laboratory rats were colonized with many different S. aureus lineages—many of which are commonly found in humans. Five captive wild rats were colonized with CC398-MRSA. Moreover, a single CC30-MRSA and two CC130-MRSA were detected in free-living or captive wild rats. Rat-derived S. aureus isolates rarely harbored the phage-carried immune evasion gene cluster or superantigen genes, suggesting long-term adaptation to their host. Taken together, our study revealed a natural S. aureus population in wild rats, as well as a colonization pressure on wild and laboratory rats by exposure to livestock- and human-associated S. aureus, respectively.
Strong hydroclimatic controls on vulnerability to subsurface nitrate contamination across Europe
(2020)
Subsurface contamination due to excessive nutrient surpluses is a persistent and widespread problem in agricultural areas across Europe. The vulnerability of a particular location to pollution from reactive solutes, such as nitrate, is determined by the interplay between hydrologic transport and biogeochemical transformations. Current studies on the controls of subsurface vulnerability do not consider the transient behaviour of transport dynamics in the root zone. Here, using state-of-the-art hydrologic simulations driven by observed hydroclimatic forcing, we demonstrate the strong spatiotemporal heterogeneity of hydrologic transport dynamics and reveal that these dynamics are primarily controlled by the hydroclimatic gradient of the aridity index across Europe. Contrasting the space-time dynamics of transport times with reactive timescales of denitrification in soil indicates that approximately 75% of the cultivated areas across Europe are potentially vulnerable to nitrate leaching for at least one-third of the year. We find that neglecting the transient nature of transport and reaction timescales leads to underestimation of the extent of vulnerable regions by almost 50%. Therefore, future vulnerability and risk assessment studies must account for the transient behaviour of transport and biogeochemical transformation processes.
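The contrast between transport times and denitrification timescales described above amounts to a Damköhler-type comparison: a location is vulnerable in a given period when water (and the nitrate it carries) moves through the root zone faster than denitrification can degrade it. The following minimal sketch illustrates that screening logic only; the function, the monthly travel times, and the 100-day denitrification timescale are hypothetical illustrations, not values or code from the study.

```python
# Hypothetical Damkoehler-type vulnerability screen: a site counts as
# vulnerable to nitrate leaching in a given month when the root-zone
# travel time is shorter than the denitrification timescale, i.e. nitrate
# can exit the root zone before it is degraded. All numbers are made up.

def vulnerable_fraction(travel_times, reaction_time):
    """Fraction of the year in which transport outpaces denitrification."""
    vulnerable = [t < reaction_time for t in travel_times]
    return sum(vulnerable) / len(vulnerable)

# Illustrative monthly travel times (days) for a humid vs. an arid site.
humid_site = [20, 25, 30, 40, 60, 80, 90, 85, 60, 40, 30, 25]
arid_site = [90, 95, 150, 200, 260, 300, 320, 310, 250, 180, 130, 110]
denitrification_timescale = 100  # days, illustrative

print(vulnerable_fraction(humid_site, denitrification_timescale))  # 1.0
print(vulnerable_fraction(arid_site, denitrification_timescale))   # ~0.17
```

With these made-up inputs, the humid site is vulnerable year-round while the arid site is vulnerable only briefly, mirroring the abstract's point that the aridity gradient controls transport dynamics and hence vulnerability.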
Forage availability has been suggested as one driver of the observed decline in honey bees. However, little is known about the effects of its spatiotemporal variation on colony success. We present a modeling framework for assessing honey bee colony viability in cropping systems. Based on two real farmland structures, we developed a landscape generator to design cropping systems varying in crop species identity, diversity, and relative abundance. The landscape scenarios generated were evaluated using the existing honey bee colony model BEEHAVE, which links foraging to in-hive dynamics. We thereby explored how different cropping systems determine spatiotemporal forage availability and, in turn, honey bee colony viability (e.g., time to extinction, TTE) and resilience (indicated by, e.g., brood mortality). To assess overall colony viability, we developed metrics, P(H) and P(P), which quantified how much nectar and pollen provided by a cropping system per year was converted into a colony's adult worker population. Both crop species identity and diversity determined the temporal continuity in nectar and pollen supply and thus colony viability. Overall farmland structure and relative crop abundance were less important, but details mattered. For monocultures and for four-crop species systems composed of cereals, oilseed rape, maize, and sunflower, P(H) and P(P) were below the viability threshold. Such cropping systems showed frequent, badly timed, and prolonged forage gaps leading to detrimental cascading effects on life stages and in-hive work force, which critically reduced colony resilience. Four-crop systems composed of rye-grass-dandelion pasture, trefoil-grass pasture, sunflower, and phacelia ensured continuous nectar and pollen supply resulting in TTE > 5 yr, and P(H) (269.5 kg) and P(P) (108 kg) being above viability thresholds for 5 yr.
Overall, trefoil-grass pasture, oilseed rape, buckwheat, and phacelia improved the temporal continuity in forage supply and colony viability. Our results are hypothetical, as they are obtained from simplified landscape settings, but they nevertheless match empirical observations, in particular the viability threshold. Our framework can be used to assess the effects of cropping systems on honey bee viability and to develop land-use strategies that help maintain pollination services by avoiding prolonged and badly timed forage gaps.