Ultraschall Berlin
(2014)
In Germany, active bat rabies surveillance was conducted between 1993 and 2012. A total of 4546 oropharyngeal swab samples from 18 bat species were screened for the presence of EBLV-1-, EBLV-2- and BBLV-specific RNA. Overall, 0.15% of oropharyngeal swab samples tested EBLV-1 positive, with the majority originating from Eptesicus serotinus. Interestingly, out of seven RT-PCR-positive oropharyngeal swabs subjected to virus isolation, viable virus was isolated from a single serotine bat (E. serotinus). Additionally, about 1226 blood samples were tested serologically, and varying virus neutralizing antibody titres were found in at least eight different bat species. The detection of viral RNA and seroconversion in repeatedly sampled serotine bats indicates long-term circulation of the virus in a particular bat colony. The limitations of random-based active bat rabies surveillance compared with passive bat rabies surveillance, and the possible application of targeted approaches in future research on bat lyssavirus dynamics and maintenance, are discussed.
This study presents results from a cross-modal priming experiment investigating inflected verb forms of German. A group of late learners of German with Russian as their native language (L1) was compared to a control group of German L1 speakers. The experiment showed different priming patterns for the two participant groups. The L1 German data yielded a stem-priming effect for inflected forms involving regular affixation and a partial priming effect for irregular forms irrespective of stem allomorphy. By contrast, the data from the late bilinguals showed reduced priming effects for both regular and irregular forms. We argue that late learners rely more on lexically stored inflected word forms during word recognition and less on morphological parsing than native speakers.
Object and action naming in Russian- and German-speaking monolingual and bilingual children
(2014)
The present study investigates the influence of word category on naming performance in two populations: bilingual and monolingual children. The question is whether and, if so, to what extent monolingual and bilingual children differ with respect to noun and verb naming and whether a noun bias exists in the lexical abilities of bilingual children. Picture naming of objects and actions by Russian-German bilingual children (aged 4-7 years) was compared to age-matched monolingual children. The results clearly demonstrate a naming deficit of bilingual children in comparison to monolingual children that increases with age. Noun learning is more fragile in bilingual contexts than is verb learning. In bilingual language acquisition, nouns do not predominate over verbs as much as is seen in monolingual German and Russian children. The results are discussed with respect to semantic-conceptual aspects and language-specific features of nouns and verbs, and the impact of input on the acquisition of these word categories.
Previous research has shown that high phonotactic frequencies facilitate the production of regularly inflected verbs in English-learning children with specific language impairment (SLI) but not with typical development (TD). We asked whether this finding can be replicated for German, a language with a much more complex inflectional verb paradigm than English. Using an elicitation task, the production of inflected nonce verb forms (3rd person singular with -t suffix) with either high- or low-frequency subsyllables was tested in sixteen German-learning children with SLI (ages 4;1–5;1), sixteen TD children matched for chronological age (CA) and fourteen TD children matched for verbal age (VA) (ages 3;0–3;11). The findings revealed that children with SLI, but not CA- or VA-children, showed differential performance between the two types of verbs, producing more inflectional errors when the verb forms resulted in low-frequency subsyllables than when they resulted in high-frequency subsyllables, replicating the results from English-learning children.
Regular and irregular inflection in children's production has been examined in many previous studies. Yet, little is known about the processes involved in children's recognition of inflected words. To gain insight into how children process inflected words, the current study examines regular -t and irregular -n participles of German using the cross-modal priming technique, testing 108 monolingual German-speaking children in two age groups (group I, mean age: 8;4; group II, mean age: 9;9) and a control group of adults. Although both age groups of children had the same full priming effect as adults for -t forms, only children of age group II showed an adult-like (partial) priming effect for -n participles. We argue that children (within the age range tested) employ the same mechanisms for regular inflection as adults but that the lexical retrieval processes required for irregular forms become more efficient when children get older.
Although morphosyntax has been identified as a major source of difficulty for adult (nonnative) language learners, most previous studies have examined a limited set of largely affix-based phenomena. Little is known about word-based morphosyntax in late bilinguals and of how morphosyntax is represented and processed in a nonnative speaker's lexicon. To address these questions, we report results from two behavioral experiments investigating stem variants of strong verbs in German (which encode features such as tense, person, and number) in groups of advanced adult learners as well as native speakers of German. Although the late bilinguals were highly proficient in German, the results of a lexical priming experiment revealed clear native-nonnative differences. We argue that lexical representation and processing relies less on morphosyntactic information in a nonnative than in a native language.
This article investigates the nature of preposition copying and preposition pruning structures in present-day English. We begin by illustrating the two phenomena and consider how they might be accounted for in syntactic terms, and go on to explore the possibility that preposition copying and pruning arise for processing reasons. We then report on two acceptability judgement experiments examining the extent to which native speakers of English are sensitive to these types of 'error' in language comprehension. Our results indicate that preposition copying creates redundancy rather than ungrammaticality, whereas preposition pruning creates processing problems for comprehenders that may render it unacceptable in timed (but not necessarily in untimed) judgement tasks. Our findings furthermore illustrate the usefulness of combining corpus studies and experimentally elicited data for gaining a clearer picture of usage and acceptability, and the potential benefits of examining syntactic phenomena from both a theoretical and a processing perspective.
Using the eye-movement monitoring technique in two reading comprehension experiments, this study investigated the timing of constraints on wh-dependencies (so-called island constraints) in first- and second-language (L1 and L2) sentence processing. The results show that both L1 and L2 speakers of English are sensitive to extraction islands during processing, suggesting that memory storage limitations affect L1 and L2 comprehenders in essentially the same way. Furthermore, these results show that the timing of island effects in L1 compared to L2 sentence comprehension is affected differently by the type of cue (semantic fit versus filled gaps) signaling whether dependency formation is possible at a potential gap site. Even though L1 English speakers showed immediate sensitivity to filled gaps but not to lack of semantic fit, proficient German-speaking learners of English as an L2 showed the opposite sensitivity pattern. This indicates that initial wh-dependency formation in L2 processing is based on semantic feature matching rather than being structurally mediated as in L1 comprehension.
This study investigates whether number dissimilarities on subject and object DPs facilitate the comprehension of subject- and object-extracted centre-embedded relative clauses in children with Grammatical Specific Language Impairment (G-SLI). We compared the performance of a group of English-speaking children with G-SLI (mean age: 12;11) with that of two groups of younger typically developing (TD) children, matched on grammar and receptive vocabulary, respectively. All groups were more accurate on subject-extracted relative clauses than object-extracted ones and, crucially, they all showed greater accuracy for sentences with dissimilar number features (i.e., one singular, one plural) on the head noun and the embedded DP. These findings are interpreted in the light of current psycholinguistic models of sentence comprehension in TD children and provide further insight into the linguistic nature of G-SLI.
The effect of implicitly incentivized faking on explicit and implicit measures of doping attitude
(2015)
The Implicit Association Test (IAT) aims to measure participants' automatic evaluation of an attitude object and is useful especially for the measurement of attitudes related to socially sensitive subjects, e.g. doping in sports. Several studies indicate that IAT scores can be faked on instruction. But fully or semi-instructed research scenarios might not properly reflect what happens in more realistic situations, when participants secretly decide to try faking the test. The present study is the first to investigate IAT faking when there is only an implicit incentive to do so. Sixty-five athletes (mean age 22.83 ± 2.45 years; 25 women) were randomly assigned to an incentive-to-fake condition or a control condition. Participants in the incentive-to-fake condition were manipulated to believe that athletes with lenient doping attitudes would be referred to a tedious 45-minute anti-doping program. Attitudes were measured with the pictorial doping brief IAT (BIAT) and with the Performance Enhancement Attitude Scale (PEAS). A one-way MANOVA revealed significant differences between conditions after the manipulation in PEAS scores, but not in the doping BIAT. In the light of our hypothesis this suggests that participants successfully faked an exceedingly negative attitude to doping when completing the PEAS, but were unsuccessful in doing so on the reaction-time-based test. This study assessed BIAT faking in a setting that aimed to resemble a situation in which participants want to hide their attempts to cheat. The two measures of attitude were differentially affected by the implicit incentive. Our findings provide evidence that the pictorial doping BIAT is relatively robust against spontaneous and naive faking attempts. (B)IATs might be less prone to faking than implied by previous studies.
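To illustrate why reaction-time-based measures like the (B)IAT are hard to fake spontaneously, the sketch below computes a simplified IAT D-score: the mean latency difference between the incompatible and compatible pairing blocks, divided by the pooled standard deviation of all trials. This is an illustrative simplification with hypothetical data, not the scoring procedure used in the study.

```python
# Simplified IAT D-score sketch (illustrative only, hypothetical data).
# Faking requires deliberately slowing responses in one block, which is
# difficult to do convincingly without instruction.
from statistics import mean, stdev

def d_score(compatible_rts, incompatible_rts):
    """Response latencies in ms. Larger positive values indicate a
    stronger automatic association with the 'compatible' pairing."""
    pooled_sd = stdev(compatible_rts + incompatible_rts)
    return (mean(incompatible_rts) - mean(compatible_rts)) / pooled_sd

# Slower responses in the incompatible block yield a positive D-score.
print(d_score([600, 620, 610, 630], [700, 720, 710, 730]))
```

Because the score depends on latency distributions rather than on a self-reported rating, shifting it requires controlled manipulation of response speed, which is one reason naive faking attempts tend to fail.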
Background: Continuous treatment is an important indicator of medication adherence in dementia. However, long-term studies in larger clinical settings are lacking, and little is known about moderating effects of patient and service characteristics.
Methods: Data from 12,910 outpatients with dementia (mean age 79.2 years; SD = 7.6 years) treated between January 2003 and December 2013 in Germany were included. Continuous treatment was analysed using Kaplan-Meier curves and log-rank tests. In addition, multivariate Cox regression models were fitted with continuous treatment as dependent variable and the predictors antidementia agent, age, gender, medical comorbidities, physician specialty, and health insurance status.
Results: After one year of follow-up, nearly 60% of patients continued drug treatment. Donepezil (HR: 0.88; 95% CI: 0.82-0.95) and memantine (HR: 0.85; 0.79-0.91) patients were less likely to discontinue treatment than rivastigmine users. Patients were also less likely to discontinue if they were treated by specialist physicians rather than general practitioners (HR: 0.44; 0.41-0.48). Younger male patients and patients with private health insurance had a lower discontinuation risk. Regarding comorbidity, patients were more likely to be continuously treated with the index substance if heart failure or hypertension had been diagnosed at baseline.
Conclusions: Our results imply that, besides the type of antidementia agent, the involvement of a specialist in the complex process of prescribing antidementia drugs can provide meaningful benefits to patients in terms of more disease-specific and continuous treatment.
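The continuation figures above rest on Kaplan-Meier estimation: at each discontinuation time, the survival probability is multiplied by the fraction of still-at-risk patients who continue, with censored patients dropping out of the risk set without lowering the curve. A minimal sketch with synthetic data (not the study's analysis, which additionally used log-rank tests and multivariate Cox regression):

```python
# Minimal Kaplan-Meier estimator (illustrative sketch, synthetic data).
# Each observation is (time, event): event=1 marks treatment
# discontinuation, event=0 marks censoring (e.g. end of follow-up).
def kaplan_meier(data):
    """Return [(time, survival_probability)] at each event time."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        events = sum(1 for time, ev in data if time == t and ev == 1)
        removed = sum(1 for time, _ in data if time == t)
        if events:
            # Survival drops by the hazard events / n_at_risk at time t.
            surv *= 1 - events / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed  # both events and censorings leave the risk set
        i += removed
    return curve

# Hypothetical follow-up times (in years) for five patients.
print(kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 1), (5, 0)]))
```

The hazard ratios reported above come from a Cox model fitted on top of this survival framework; an HR below 1 (e.g. 0.44 for specialist care) means a proportionally lower instantaneous risk of discontinuation.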
Processes involved in late bilinguals' production of morphologically complex words were studied using an event-related brain potentials (ERP) paradigm in which EEGs were recorded during participants' silent productions of English past- and present-tense forms. Twenty-three advanced second language speakers of English (first language [L1] German) were compared to a control group of 19 L1 English speakers from an earlier study. We found a frontocentral negativity for regular relative to irregular past-tense forms (e.g., asked vs. held) during (silent) production, and no difference for the present-tense condition (e.g., asks vs. holds), replicating the ERP effect obtained for the L1 group. This ERP effect suggests that combinatorial processing is involved in producing regular past-tense forms, in both late bilinguals and L1 speakers. We also suggest that this paradigm is a useful tool for future studies of online language production.
Degeneration of the intervertebral disc – triggered by ageing, mechanical stress, traumatic injury, infection, inflammation and other factors – has a significant role in the development of low back pain. Back pain not only has a high prevalence, but also a major socio-economic impact. With the ageing population, its occurrence and costs are expected to grow even more in the future. Disc degeneration is characterized by matrix breakdown, loss of proteoglycans and thus water content, disc height loss and an increase in inflammatory molecules. The accumulation of cytokines such as interleukin (IL)-1β, IL-8 or tumor necrosis factor (TNF)-α, together with age-related immune deficiency, leads to so-called inflammaging – low-grade, chronic inflammation with a crucial role in pain development. Despite the relevance of these molecular processes, current therapies target symptoms, but not underlying causes. This review describes the biological and biomechanical changes that occur in a degenerated disc, discusses the connection between disc degeneration and inflammaging, highlights factors that enhance the inflammatory processes in disc pathologies and suggests future research avenues.
The Gradient Symbolic Computation (GSC) model presented in the keynote article (Goldrick, Putnam & Schwarz) constitutes a significant theoretical development, not only as a model of bilingual code-mixing, but also as a general framework that brings together symbolic grammars and graded representations. The authors are to be commended for successfully integrating a theory of grammatical knowledge with the voluminous research on lexical co-activation in bilinguals. It is, however, unfortunate that a certain conception of bilingualism was inherited from this latter research tradition, one in which the contrast between native and non-native language takes a back seat.
The semantics of focus particles like only requires a set of alternatives (Rooth, 1992). In two experiments, we investigated the impact of such particles on the retrieval of alternatives that are mentioned in the prior context or unmentioned. The first experiment used a probe recognition task and showed that focus particles interfere with the recognition of mentioned alternatives and the rejection of unmentioned alternatives relative to a condition without a particle. A second lexical decision experiment demonstrated priming effects for mentioned and unmentioned alternatives (compared with an unrelated condition) while focus particles caused additional interference effects. Overall, our results indicate that focus particles trigger an active search for alternatives and lead to a competition between mentioned alternatives, unmentioned alternatives, and the focused element.
We asked whether invariant phonetic indices for syllable structure can be identified in a language where word-initial consonant clusters, regardless of their sonority profile, are claimed to be parsed heterosyllabically. Four speakers of Moroccan Arabic were recorded, using Electromagnetic Articulography. Pursuing previous work, we employed temporal diagnostics for syllable structure, consisting of static correspondences between any given phonological organisation and its presumed phonetic indices. We show that such correspondences offer only a partial understanding of the relation between syllabic organisation and continuous indices of that organisation. We analyse the failure of the diagnostics and put forth a new approach in which different phonological organisations prescribe different ways in which phonetic indices change as phonetic parameters are scaled. The main finding is that invariance is found in these patterns of change, rather than in static correspondences between phonological constructs and fixed values for their phonetic indices.
How much is too much?
(2010)
Although dietary nutrient intake is often adequate, nutritional supplement use is common among elite athletes. However, high-dose supplements or the use of multiple supplements may exceed the recommended daily allowance (RDA) of particular nutrients or even result in a daily intake above tolerable upper limits (UL). The present case report presents nutritional intake data and supplement use of a highly trained male swimmer competing at international level. Habitual energy and micronutrient intake were analysed by 3 d dietary reports. Supplement use and dosage were assessed, and total amount of nutrient supply was calculated. Micronutrient intake was evaluated based on RDA and UL as presented by the European Scientific Committee on Food, and maximum permitted levels in supplements (MPL) are given. The athlete's diet provided adequate micronutrient content well above RDA except for vitamin D. Simultaneous use of ten different supplements was reported, resulting in excess intake above tolerable UL for folate, vitamin E and Zn. Additionally, daily supplement dosage was considerably above MPL for nine micronutrients consumed as artificial products. Risks and possible side effects of exceeding UL by the athlete are discussed. Athletes with high energy intake may be at risk of exceeding UL of particular nutrients if multiple supplements are added. Therefore, dietary counselling of athletes should include assessment of habitual diet and nutritional supplement intake. Educating athletes to balance their diets instead of taking supplements might be prudent to prevent health risks that may occur with long-term excess nutrient intake.
Drawing on phonology research within the generative linguistics tradition, stochastic methods, and notions from complex systems, we develop a modelling paradigm linking phonological structure, expressed in terms of syllables, to speech movement data acquired with 3D electromagnetic articulography and X-ray microbeam methods. The essential variable in the models is syllable structure. When mapped to discrete coordination topologies, syllabic organization imposes systematic patterns of variability on the temporal dynamics of speech articulation. We simulated these dynamics under different syllabic parses and evaluated simulations against experimental data from Arabic and English, two languages claimed to parse similar strings of segments into different syllabic structures. Model simulations replicated several key experimental results, including the fallibility of past phonetic heuristics for syllable structure, and exposed the range of conditions under which such heuristics remain valid. More importantly, the modelling approach consistently diagnosed syllable structure proving resilient to multiple sources of variability in experimental data including measurement variability, speaker variability, and contextual variability. Prospects for extensions of our modelling paradigm to acoustic data are also discussed.
Strategic sexual signals
(2016)
The color red has special meaning in mating-relevant contexts. Wearing red can enhance perceptions of women's attractiveness and desirability as a potential romantic partner. Building on recent findings, the present study examined whether women's (N = 74) choice to display the color red is influenced by the attractiveness of an expected opposite-sex interaction partner. Results indicated that female participants who expected to interact with an attractive man displayed red (on clothing, accessories, and/or makeup) more often than a baseline consisting of women in a natural environment with no induced expectation. In contrast, when women expected to interact with an unattractive man, they eschewed red, displaying it less often than in the baseline condition. Findings are discussed with respect to evolutionary and cultural perspectives on mate evaluation and selection.
Much previous experimental research on morphological processing has focused on surface and meaning-level properties of morphologically complex words, without paying much attention to the morphological differences between inflectional and derivational processes. Realization-based theories of morphology, for example, assume specific morpholexical representations for derived words that distinguish them from the products of inflectional or paradigmatic processes. The present study reports results from a series of masked priming experiments investigating the processing of inflectional and derivational phenomena in native (L1) and non-native (L2) speakers in a non-Indo-European language, Turkish. We specifically compared regular (Aorist) verb inflection with deadjectival nominalization, both of which are highly frequent, productive and transparent in Turkish. The experiments demonstrated different priming patterns for inflection and derivation, specifically within the L2 group. Implications of these findings are discussed both for accounts of L2 morphological processing and for the controversial linguistic distinction between inflection and derivation.
Two experiments tested how faithfully German children aged 4;5 to 5;6 reproduce ditransitive sentences that are unmarked or marked with respect to word order and focus (Exp1) or definiteness (Exp2). Adopting an optimality theory (OT) approach, it is assumed that in the German adult grammar word order is ranked lower than focus and definiteness. Faithfulness of children's reproductions decreased as markedness of inputs increased; unmarked structures were reproduced most faithfully, and unfaithful outputs most often had an unmarked form. Consistent with the OT proposal, children were more tolerant of inputs marked for word order than for focus; in conflict with the proposal, children were less tolerant of inputs marked for word order than for definiteness. Our results suggest that the linearization of objects in German double object constructions is affected by focus and definiteness, but that prosodic principles may have an impact on the position of a focused constituent.
This study investigates phenomena that have been claimed to be indicative of Specific Language Impairment (SLI) in German, focusing on subject-verb agreement marking. Longitudinal data from fourteen German-speaking children with SLI, seven monolingual and seven Turkish-German successive bilingual children, were examined. We found similar patterns of impairment in the two participant groups. Both the monolingual and the bilingual children with SLI had correct (present vs. preterit) tense marking and produced syntactically complex sentences such as embedded clauses and wh-questions, but were limited in reliably producing correct agreement-marked verb forms. These contrasts indicate that agreement marking is impaired in German-speaking children with SLI, without any necessary concurrent deficits in either the CP-domain or in tense marking. Our results also show that it is possible to identify SLI from an early successive bilingual child's performance in one of her two languages.
Restrictions on addition
(2012)
Children up to school age have been reported to perform poorly when interpreting sentences containing restrictive and additive focus particles by treating sentences with a focus particle in the same way as sentences without it. Careful comparisons between results of previous studies indicate that this phenomenon is less pronounced for restrictive than for additive particles. We argue that this asymmetry is an effect of the presuppositional status of the proposition triggered by the additive particle. We tested this in two experiments with German-learning three- and four-year-olds using a method that made the exploitation of the information provided by the particles highly relevant for completing the task. Three-year-olds already performed remarkably well with sentences both with auch 'also' and with nur 'only'. Thus, children can consider the presuppositional contribution of the additive particle in their sentence interpretation and can exploit the restrictive particle as a marker of exhaustivity.
Extract: Topics in psycholinguistics and the neurocognition of language rarely attract the attention of journalists or the general public. One topic that has done so, however, is the potential benefits of bilingualism for general cognitive functioning and development, and as a precaution against cognitive decline in old age. Sensational claims have been made in the public domain, mostly by journalists and politicians. Recently (September 4, 2014) The Guardian reported that “learning a foreign language can increase the size of your brain”, and Michael Gove, the UK's previous Education Secretary, noted in an interview with The Guardian (September 30, 2011) that “learning languages makes you smarter”. The present issue of BLC addresses these topics by providing a state-of-the-art overview of theoretical and experimental research on the role of bilingualism for cognition in children and adults.
Masked priming research with late (non-native) bilinguals has reported facilitation effects following morphologically derived prime words (scanner - scan). However, unlike for native speakers, there are suggestions that purely orthographic prime-target overlap (scandal - scan) also produces priming in non-native visual word recognition. Our study directly compares orthographically related and derived prime-target pairs. While native readers showed morphological but not formal overlap priming, the two prime types yielded the same magnitudes of facilitation for non-natives. We argue that early word recognition processes in a non-native language are more influenced by surface-form properties than in one's native language.
Saccades to single targets in peripheral vision are typically characterized by an undershoot bias. Putting this bias to a test, Kapoula [1] used a paradigm in which observers were presented with two different sets of target eccentricities that partially overlapped each other. Her data were suggestive of a saccadic range effect (SRE): There was a tendency for saccades to overshoot close targets and undershoot far targets in a block, suggesting that there was a response bias towards the center of eccentricities in a given block. Our Experiment 1 was a close replication of the original study by Kapoula [1]. In addition, we tested whether the SRE is sensitive to top-down requirements associated with the task, and we also varied the target presentation duration. In Experiments 1 and 2, we expected to replicate the SRE for a visual discrimination task. The simple visual saccade-targeting task in Experiment 3, entailing minimal top-down influence, was expected to elicit a weaker SRE. Voluntary saccades to remembered target locations in Experiment 3 were expected to elicit the strongest SRE. Contrary to these predictions, we did not observe an SRE in any of the tasks. Our findings complement the results reported by Gillen et al. [2], who failed to find the effect in a saccade-targeting task with a very brief target presentation. Together, these results suggest that unlike arm movements, saccadic eye movements are not biased towards making saccades of a constant, optimal amplitude for the task.
The aim of the present study was to test the functional relevance of the spatial concepts UP and DOWN for words that use these concepts either literally (space) or metaphorically (time, valence). Functional relevance would imply a symmetrical relationship between the spatial concepts and words related to them: processing a word activates the related spatial concept on the one hand, and activating the concept eases the retrieval of a related word on the other. To test the latter, participants' body position was manipulated to be either upright or head-down tilted in order to activate the related spatial concept. Participants then produced, in a within-subject design, previously memorized words from the concepts space, time and valence to the pace of a metronome. All words were related either to the spatial concept UP or DOWN. The results, including Bayesian analyses, show (1) a significant interaction between body position and words using the concepts UP and DOWN literally, (2) a marginally significant interaction between body position and temporal words and (3) no effect between body position and valence words. However, post-hoc analyses suggest no difference between experiments. We therefore conclude that integrating sensorimotor experiences is indeed of functional relevance for all three concepts of space, time and valence, but that the strength of this functional relevance depends on how closely words are linked to mental concepts representing vertical space.
The present study examines the effect of language experience on vocal emotion perception in a second language. Native speakers of French with varying levels of self-reported English ability were asked to identify emotions from vocal expressions produced by American actors in a forced-choice task, and to rate their pleasantness, power, alertness and intensity on continuous scales. Stimuli included emotionally expressive English speech (emotional prosody) and non-linguistic vocalizations (affect bursts), and a baseline condition with Swiss-French pseudo-speech. Results revealed effects of English ability on the recognition of emotions in English speech but not in non-linguistic vocalizations. Specifically, higher English ability was associated with less accurate identification of positive emotions, but not with the interpretation of negative emotions. Moreover, higher English ability was associated with lower ratings of pleasantness and power, again only for emotional prosody. This suggests that second language skills may sometimes interfere with emotion recognition from speech prosody, particularly for positive emotions.
The present study aimed to integrate findings from technology acceptance research with research on applicant reactions to new technology for the emerging selection procedure of asynchronous video interviewing. One hundred six volunteers experienced asynchronous video interviewing and completed several questionnaires, including one on the applicants' personalities. In line with previous technology acceptance research, the data revealed that perceived usefulness and perceived ease of use predicted attitudes toward asynchronous video interviewing. Furthermore, openness was found to moderate the relation between perceived usefulness and attitudes toward this particular selection technology. No significant effects emerged for computer self-efficacy, job interview self-efficacy, extraversion, neuroticism, and conscientiousness. Theoretical and practical implications are discussed.
Purpose: The acquisition of skills is essential to the conceptualization of cognitive-behavioural therapy. Yet what experiences are encountered and what skills are actually learned during therapy, and whether patients and therapists have concurrent views of these, remains poorly understood. Method: An explorative pilot study with semi-structured, corresponding interview guides was conducted. Pilot data from our outpatient unit were transcribed and content-analyzed following current guidelines. Results: The responses of 18 participants (patients and their psychotherapists) were assigned to six main categories. Educational and cognitive aspects were mentioned most frequently and consistently by both groups. Having learned behavioural alternatives attained the second highest agreement between perspectives. Conclusions: Patients and therapists valued CBT as an opportunity to learn new skills, which is also an important prerequisite for the maintenance of therapeutic change. We discuss limitations to generalizability as well as theoretical and therapeutic implications.
Background: Core-specific sensorimotor exercises have been shown to enhance neuromuscular activity of the trunk, improve athletic performance and prevent back pain. However, the dose-response relationship and, therefore, the dose required to improve trunk function are still under debate. The purpose of the present trial is to compare four different intervention strategies of sensorimotor exercise with respect to the resulting improvements in trunk function.
Methods/design: A single-blind, four-armed, randomized controlled trial with a 3-week (home-based) intervention phase and two measurement days pre- and post-intervention (M1/M2) is designed. Experimental procedures on both measurement days will include evaluation of maximum isokinetic and isometric trunk strength (extension/flexion, rotation) including perturbations, as well as neuromuscular trunk activity while performing strength testing. The primary outcome is trunk strength (peak torque). Neuromuscular activity (amplitude, latencies as a response to perturbation) serves as the secondary outcome. The control group will perform a standardized exercise program of four sensorimotor exercises (three sets of 10 repetitions) in each of six training sessions (30 min duration) over 3 weeks. The intervention groups’ programs differ in the number of exercises, sets per exercise and, therefore, overall training amount (group I: six sessions, three exercises, two sets; group II: six sessions, two exercises, two sets; group III: six sessions, one exercise, three sets). The intervention programs of groups I, II and III include additional perturbations for all exercises to increase both the difficulty and the efficacy of the exercises performed. Statistical analysis will be performed after examining the underlying assumptions for parametric and non-parametric testing.
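The four arms differ only in their overall training amount; under the stated session, exercise and set counts, the accumulated dose over the 3-week phase can be tallied as follows (a minimal sketch; "total sets" as the volume metric is our illustrative assumption, not the trial's prespecified dose measure):

```python
# Total sets per 3-week arm, from the protocol's session x exercise x set
# counts (control: 6 sessions, 4 exercises, 3 sets; groups I-III as stated).
arms = {
    "control":   (6, 4, 3),
    "group_I":   (6, 3, 2),
    "group_II":  (6, 2, 2),
    "group_III": (6, 1, 3),
}

def total_sets(sessions, exercises, sets_per_exercise):
    """Crude dose proxy: sets accumulated over the whole intervention phase."""
    return sessions * exercises * sets_per_exercise

volumes = {arm: total_sets(*counts) for arm, counts in arms.items()}
print(volumes)  # control=72, I=36, II=24, III=18
```

This makes the dose gradient across arms explicit: each intervention group trains at a fraction of the control group's volume while adding perturbations.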
Discussion: The results of the study will be clinically relevant, not only for researchers but also for (sports) therapists, physicians, coaches, athletes and the general population who have the aim of improving trunk function.
Introduction
To date, several meta-analyses have clearly demonstrated that resistance and plyometric training are effective in improving physical fitness in children and adolescents. However, a methodological limitation of meta-analyses is that they synthesize results from different studies and hence ignore important differences across studies (i.e., mixing apples and oranges). Therefore, we aimed to examine comparative intervention studies that assessed the effects of age, sex, maturation, and resistance or plyometric training descriptors (e.g., training intensity, volume) on measures of physical fitness while holding other variables constant.
Methods
To identify relevant studies, we systematically searched multiple electronic databases (e.g., PubMed) from inception to March 2018. We included resistance and plyometric training studies in healthy young athletes and non-athletes aged 6 to 18 years that investigated the effects of moderator variables (e.g., age, maturity, sex, etc.) on components of physical fitness (i.e., muscle strength and power).
Results
Our systematic literature search revealed a total of 75 eligible resistance and plyometric training studies, including 5,138 participants. Mean duration of resistance and plyometric training programs amounted to 8.9 ± 3.6 weeks and 7.1 ± 1.4 weeks, respectively. Our findings showed that maturation affects plyometric and resistance training outcomes differently, with the former eliciting greater adaptations pre-peak height velocity (PHV) and the latter around and post-PHV. Sex has no major impact on resistance training-related outcomes (e.g., maximal strength, 10-repetition maximum). In terms of plyometric training, around-PHV boys appear to respond with larger performance improvements (e.g., jump height, jump distance) compared with girls. Different types of resistance training (e.g., body weight, free weights) are effective in improving measures of muscle strength (e.g., maximum voluntary contraction) in untrained children and adolescents. Effects of plyometric training in untrained youth primarily follow the principle of training specificity. Although only 6 of the 75 comparative studies investigated resistance or plyometric training in trained individuals, positive effects were reported in all 6 studies (e.g., on maximum strength and vertical jump height, respectively).
Conclusions
The present review article identified research gaps (e.g., training descriptors, modern alternative training modalities) that should be addressed in future comparative studies.
Introduction
We investigated blood glucose (BG) and hormone response to aerobic high-intensity interval exercise (HIIE) and moderate continuous exercise (CON) matched for mean load and duration in type 1 diabetes mellitus (T1DM).
Material and Methods
Seven trained male subjects with T1DM performed a maximal incremental exercise test, followed by HIIE and CON at 3 different mean intensities, below (A) and above (B) the first lactate turn point and below the second lactate turn point (C), on a cycle ergometer. Subjects were adjusted to the ultra-long-acting insulin degludec (Tresiba/Novo Nordisk, Denmark). Before exercise, standardized meals were administered, and the short-acting insulin dose was reduced by 25% (A), 50% (B), and 75% (C) depending on mean exercise intensity. During exercise, BG, adrenaline, noradrenaline, dopamine, cortisol, glucagon, insulin-like growth factor-1, blood lactate, heart rate, and gas exchange variables were measured. For 24 h after exercise, interstitial glucose was measured by a continuous glucose monitoring system.
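The pre-exercise insulin adjustment follows a simple tiered rule: the higher the planned mean intensity, the larger the reduction of the short-acting dose. A hypothetical helper illustrating the stated reductions (the tier labels A/B/C mirror the study's conditions; the function names and example dose are our own):

```python
def insulin_reduction(tier):
    """Map the planned exercise-intensity tier to the fraction of the usual
    short-acting insulin dose that is withheld before exercise.
    Tiers per the study: 'A' below the first lactate turn point,
    'B' above it, 'C' below the second lactate turn point."""
    reductions = {"A": 0.25, "B": 0.50, "C": 0.75}
    return reductions[tier]

def adjusted_dose(usual_dose_units, tier):
    """Short-acting dose actually administered before exercise."""
    return usual_dose_units * (1.0 - insulin_reduction(tier))

# Hypothetical usual dose of 8 units, planned intensity tier B:
print(adjusted_dose(8.0, "B"))  # 4.0 units
```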
Results
BG decrease during HIIE was significantly smaller for B (p = 0.024) and tended to be smaller for A and C compared to CON. No differences were found for post-exercise interstitial glucose, acute hormone response, and carbohydrate utilization between HIIE and CON for A, B, and C. In HIIE, blood lactate for A (p = 0.006) and B (p = 0.004) and respiratory exchange ratio for A (p = 0.003) and B (p = 0.003) were significantly higher compared to CON but not for C.
Conclusion
Hypoglycemia did not occur during or after HIIE and CON when ultra-long-acting insulin was used and our methodological approach for exercise prescription was applied. HIIE led to a smaller BG decrease than CON, even though both exercise modes were matched for mean load and duration and markedly higher peak workloads were applied in HIIE. Therefore, HIIE and CON can be safely performed in T1DM.
There is evidence for cortical contribution to the regulation of human postural control. Interference from concurrently performed cognitive tasks supports this notion, and the lateral prefrontal cortex (lPFC) has been suggested to play a prominent role in the processing of purely cognitive as well as cognitive-postural dual tasks. The degree of cognitive-motor interference varies greatly between individuals, but it is unresolved whether individual differences in the recruitment of specific lPFC regions during cognitive dual tasking are associated with individual differences in cognitive-motor interference. Here, we investigated inter-individual variability in a cognitive-postural multitasking situation in healthy young adults (n = 29) in order to relate it to inter-individual variability in lPFC recruitment during cognitive multitasking. For this purpose, a one-back working memory task was performed either as a single task or as a dual task in order to vary cognitive load. Participants performed these cognitive single and dual tasks either during upright stance on a balance pad that was placed on top of a force plate or during fMRI measurement with little to no postural demands. We hypothesized dual one-back task performance to be associated with lPFC recruitment when compared to single one-back task performance. In addition, we expected individual variability in lPFC recruitment to be associated with postural performance costs during concurrent dual one-back performance. As expected, behavioral performance costs in postural sway during dual one-back performance varied largely between individuals, as did lPFC recruitment during dual one-back performance. Most importantly, individuals who recruited the right mid-lPFC to a larger degree during dual one-back performance also showed greater postural sway, as measured by larger performance costs in total center of pressure displacements.
This effect was selective to the high-load dual one-back task and suggests a crucial role of the right lPFC in allocating resources during cognitive-motor interference. Our study provides further insight into the mechanisms underlying cognitive-motor multitasking and its impairments.
Are individual differences in reading speed related to extrafoveal visual acuity and crowding?
(2015)
Readers differ considerably in their speed of self-paced reading. One factor known to influence fixation durations in reading is the preprocessing of words in parafoveal vision. Here we investigated whether individual differences in reading speed or in the amount of information extracted from upcoming words (the preview benefit) can be explained by basic differences in extrafoveal vision, i.e., the ability to recognize peripheral letters with or without the presence of flanking letters. Forty participants were given an adaptive test to determine their eccentricity thresholds for the identification of letters presented either in isolation (extrafoveal acuity) or flanked by other letters (crowded letter recognition). In a separate eye-tracking experiment, the same participants read lists of words from left to right, while the preview of the upcoming words was manipulated with the gaze-contingent moving window technique. Relationships between dependent measures were analyzed on the observational level and with linear mixed models. We obtained highly reliable estimates both for extrafoveal letter identification (acuity and crowding) and for measures of reading speed (overall reading speed, size of preview benefit). Reading speed was higher in participants with larger uncrowded windows. However, the strength of this relationship was moderate, and it was only observed when other sources of variance in reading speed (e.g., the occurrence of regressive saccades) were eliminated. Moreover, the size of the preview benefit, an important factor in normal reading, was larger in participants with better extrafoveal acuity. Together, these results indicate a significant albeit moderate contribution of extrafoveal vision to individual differences in reading speed.
It has been proposed that in online sentence comprehension the dependency between a reflexive pronoun such as himself/herself and its antecedent is resolved using exclusively syntactic constraints. Under this strictly syntactic search account, Principle A of the binding theory, which requires that the antecedent c-command the reflexive within the same clause in which the reflexive occurs, constrains the parser's search for an antecedent. The parser thus ignores candidate antecedents that might match agreement features of the reflexive (e.g., gender) but are ineligible as potential antecedents because they are in structurally illicit positions. An alternative possibility accords no special status to structural constraints: in addition to using Principle A, the parser also uses non-structural cues such as gender to access the antecedent. According to cue-based retrieval theories of memory (e.g., Lewis and Vasishth, 2005), the use of non-structural cues should result in increased retrieval times and occasional errors when candidates partially match the cues, even if the candidates are in structurally illicit positions. In this paper, we first show how the retrieval processes that underlie reflexive binding are naturally realized in the Lewis and Vasishth (2005) model. We present the predictions of the model under the assumption that both structural and non-structural cues are used during retrieval, and provide a critical analysis of previous empirical studies that failed to find evidence for the use of non-structural cues, suggesting that these failures may be Type II errors. We use this analysis and the results of further modeling to motivate a new empirical design that we use in an eye-tracking study. The results of this study confirm the key predictions of the model concerning the use of non-structural cues, and are inconsistent with the strictly syntactic search account.
These results present a challenge for theories advocating the infallibility of the human parser in the case of reflexive resolution, and provide support for the inclusion of agreement features such as gender in the set of retrieval cues.
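In the cue-based retrieval framework of Lewis and Vasishth (2005), each memory candidate's activation is its base-level activation plus cue-weighted associative boosts from matching retrieval cues, and retrieval latency decreases exponentially with activation. A structurally illicit distractor that matches only the gender cue still receives partial activation, which is what produces slowdowns and occasional misretrievals. A minimal sketch of this mechanism (parameter values are illustrative defaults, not the paper's fitted values; fan and noise terms are omitted):

```python
import math

def activation(base, cue_matches, assoc_strength=1.5, cue_weight=1.0):
    """ACT-R style activation: base level plus spreading activation from
    each matching retrieval cue (1 = cue matches, 0 = it does not)."""
    return base + sum(cue_weight * assoc_strength * m for m in cue_matches)

def retrieval_latency(act, latency_factor=0.14):
    """Retrieval latency in seconds: T = F * exp(-A)."""
    return latency_factor * math.exp(-act)

# Cues: [structural (Principle A), gender]. The licit antecedent matches
# both; a structurally illicit distractor matches only the gender cue.
target = activation(0.0, [1, 1])
distractor = activation(0.0, [0, 1])
print(retrieval_latency(target) < retrieval_latency(distractor))  # True
```

The sketch reproduces the qualitative prediction tested in the paper: a fully matching antecedent is retrieved faster, while partial matches slow retrieval and, once retrieval noise is added, can occasionally win the competition.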
We report two experiments and Bayesian modelling of the collected data. In both experiments, participants performed a long-lag primed picture naming task. Black-and-white line drawings were used as targets, which were overtly named by the participants, and their naming latencies were measured. In both experiments, primes consisted of past-participle verb forms (er tanzt/er hat getanzt “he dances/he has danced”) and the relationship between primes and targets was either morphological or unrelated. Experiment 1 additionally had phonologically and semantically related prime-target pairs as well as present-tense primes. In both Experiments 1 and 2, participants showed significantly faster naming latencies for morphologically related targets relative to the unrelated verb primes. In Experiment 1, no priming effects were observed in the phonologically and semantically related control conditions. In addition, the production latencies were not influenced by verb type.
It is a common finding across languages that young children have problems in understanding patient-initial sentences. We used Tagalog, a verb-initial language with a reliable voice-marking system and highly frequent patient voice constructions, to test the predictions of several accounts that have been proposed to explain this difficulty: the frequency account, the Competition Model, and the incremental processing account. Study 1 presents an analysis of Tagalog child-directed speech, which showed that the dominant argument order is agent-before-patient and that morphosyntactic markers are highly valid cues to thematic role assignment. In Study 2, we used a combined self-paced listening and picture verification task to test how Tagalog-speaking adults and 5- and 7-year-old children process reversible transitive sentences. Results showed that adults performed well in all conditions, while children's accuracy and listening times for the first noun phrase indicated more difficulty in interpreting patient-initial sentences in the agent voice compared to the patient voice. The patient voice advantage is partly explained by both the frequency account and incremental processing account.
Moving arms
(2018)
Embodied cognition postulates a bi-directional link between the human body and its cognitive functions. Whether this holds for higher cognitive functions such as problem solving is unknown. We predicted that arm movement manipulations performed by the participants could affect the problem-solving solutions. We tested this prediction in quantitative reasoning tasks that allowed two solutions to each problem (addition or subtraction). In two studies with healthy adults (N=53 and N=50), we found an effect of problem-congruent movements on problem solutions. Consistent with embodied cognition, sensorimotor information gained via right or left arm movements affects the solution in different types of problem-solving tasks.
This study examined psychometric properties of figure rating scales, particularly the effects of ascending silhouette ordering, in 153 children, 9 to 13 years old. Two versions of Collins’s (1991) figural rating scale were presented: the original scale (figures arranged in ascending order) and a modified version (randomized figure ordering). Ratings of current and ideal figure were elicited, and body dissatisfaction was calculated. All children were randomly assigned to one of two subgroups and completed both scale versions in a different sequence. There were no significant differences in figure selection and body dissatisfaction between the two figure orderings. Regarding the selection of the current figure, results showed that girls were more affected by the silhouette ordering than boys. Our results suggest that figure rating scales are both valid and reliable, whereby correlation coefficients reveal greater stability for ideal figure selections and body dissatisfaction ratings when using the scale with ascending figure ordering.
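Body dissatisfaction on figure rating scales is conventionally computed as the signed discrepancy between the selected current and ideal silhouettes; the following sketch assumes that common convention (current minus ideal) rather than the paper's exact scoring:

```python
def body_dissatisfaction(current_figure, ideal_figure):
    """Signed discrepancy score on a figure rating scale: positive values
    mean the selected current figure is larger than the ideal figure;
    zero indicates body satisfaction."""
    return current_figure - ideal_figure

# E.g., current silhouette 5 and ideal silhouette 3 on a 7-figure scale:
print(body_dissatisfaction(5, 3))  # 2
```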
Is there an ideal time window for language acquisition after which nativelike representation and processing are unattainable? Although this question has been heavily debated, no consensus has been reached. Here, we present evidence for a sensitive period in language development and show that it is specific to grammar. We conducted a masked priming task with a group of Turkish-German bilinguals and examined age of acquisition (AoA) effects on the processing of complex words. We compared a subtle but meaningful linguistic contrast, that between grammatical inflection and lexical-based derivation. The results showed a highly selective AoA effect on inflectional (but not derivational) priming. In addition, the effect displayed a discontinuity indicative of a sensitive period: Priming from inflected forms was nativelike when acquisition started before the age of 5 but declined with increasing AoA. We conclude that the acquisition of morphological rules expressing morphosyntactic properties is constrained by maturational factors.
Background: Infection with human immunodeficiency virus (HIV) affects muscle mass, altering independent activities of people living with HIV (PLWH). Resistance training alone (RT) or combined with aerobic exercise (AE) is linked to improved muscle mass and strength maintenance in PLWH. These exercise benefits have been the focus of different meta-analyses, although only a limited number of studies had been identified up to 2013/4. An up-to-date systematic review and meta-analysis concerning the effect of RT alone or combined with AE on strength parameters and hormones is of high value, since more recent studies dealing with these types of exercise in PLWH have been published.
Methods: A search for randomized controlled trials evaluating the effects of RT alone, AE alone or the combination of both (AERT) in PLWH was performed in five web databases up to December 2017. Risk of bias and study quality were assessed using the PEDro scale. The weighted mean difference (WMD) of the change from baseline to post-intervention was calculated, and the I² statistic for heterogeneity was computed.
Results: Thirteen studies reported strength outcomes, and eight presented a low risk of bias. The overall change in upper body strength was 19.3 kg (95% CI: 9.8 to 28.8, p < 0.001) after AERT and 17.5 kg (95% CI: 16 to 19.1, p < 0.001) after RT. The lower body change was 29.4 kg (95% CI: 18.1 to 40.8, p < 0.001) after RT and 10.2 kg (95% CI: 6.7 to 13.8, p < 0.001) after AERT. Changes were larger after controlling for risk of bias in upper and lower body strength, and for supervised exercise in lower body strength. A significant change towards lower levels of IL-6 was found (-2.4 ng/dl; 95% CI: -2.6 to -2.1; p < 0.001).
Conclusion: Both resistance training alone and combined with aerobic exercise showed a positive change when studies with low risk of bias and professional supervision were analyzed, improving upper and, more critically, lower body muscle strength. This study also found that exercise had a lowering effect on IL-6 levels in PLWH.
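The pooled weighted mean difference and the I² heterogeneity statistic mentioned above follow standard formulas: inverse-variance weighting for the pooled estimate, and I² = max(0, (Q − df)/Q) × 100 from Cochran's Q. A minimal sketch under fixed-effect assumptions (the numbers in the test are illustrative, not the review's data):

```python
def pooled_wmd(effects, variances):
    """Fixed-effect inverse-variance pooled weighted mean difference."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

def i_squared(effects, variances):
    """I^2 from Cochran's Q: the percentage of total variability across
    studies attributable to heterogeneity rather than chance."""
    weights = [1.0 / v for v in variances]
    pooled = pooled_wmd(effects, variances)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
```

With identical study effects, Q is zero and I² is 0%; as between-study disagreement grows relative to within-study variance, I² approaches 100%.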
Accuracy of training recommendations based on a treadmill multistage incremental exercise test
(2018)
Competitive runners will occasionally undergo exercise testing in a laboratory setting to obtain predictive and prescriptive information regarding their performance. The present research aimed to assess whether the physiological demands of lab-based treadmill running (TM) can simulate those of over-ground (OG) running using a commonly used protocol. Fifteen healthy volunteers with a weekly mileage of ≥ 20 km over the past 6 months and treadmill experience participated in this cross-sectional study. Two stepwise incremental tests until volitional exhaustion were performed in a fixed order within one week in an outpatient clinic research laboratory and on an outdoor athletic track. Running velocity (IATspeed), heart rate (IATHR) and lactate concentration at the individual anaerobic threshold (IATbLa) were primary endpoints. Additionally, distance covered (DIST), maximal heart rate (HRmax), maximal blood lactate concentration (bLamax) and rate of perceived exertion (RPE) at IATspeed were analyzed. IATspeed, DIST and HRmax were not statistically significantly different between conditions, whereas bLamax and RPE at IATspeed differed significantly (p < 0.05). Apart from RPE at IATspeed, IATspeed, DIST, HRmax and bLamax correlated strongly between conditions (r = 0.815–0.988). High reliability between conditions provides strong evidence that running on a treadmill is physiologically comparable to OG running and that training recommendations can be made with assurance.
Hatred directed at members of groups due to their origin, race, gender, religion, or sexual orientation is not new, but it has taken on a new dimension in the online world. To date, very little is known about online hate among adolescents. It is also unknown how online disinhibition might influence the association between being a bystander and being a perpetrator of online hate. Thus, the present study examined the associations between being a bystander of online hate and being a perpetrator of online hate, and the moderating role of toxic online disinhibition in this relationship. In total, 1480 students aged between 12 and 17 years were included in this study. Results revealed positive associations between being an online hate bystander and being a perpetrator, regardless of whether adolescents had or had not been victims of online hate themselves. The results also showed an association between toxic online disinhibition and online hate perpetration. Further, toxic online disinhibition moderated the relationship between being a bystander of online hate and being a perpetrator of online hate. Implications for prevention programs and future research are discussed.
Static (one-legged stance) and dynamic (star excursion balance) postural control tests were performed by 14 adolescent athletes with and 17 without back pain to determine reproducibility. The total displacement, mediolateral and anterior-posterior displacements of the centre of pressure in mm for the static, and the normalized and composite reach distances for the dynamic tests were analysed. Intraclass correlation coefficients, 95% confidence intervals, and a Bland-Altman analysis were calculated for reproducibility. Intraclass correlation coefficients for subjects with (0.54 to 0.65), (0.61 to 0.69) and without (0.45 to 0.49), (0.52 to 0.60) back pain were obtained on the static test for right and left legs, respectively. Likewise, (0.79 to 0.88), (0.75 to 0.93) for subjects with and (0.61 to 0.82), (0.60 to 0.85) for those without back pain were obtained on the dynamic test for the right and left legs, respectively. Systematic bias was not observed between test and retest of subjects on both static and dynamic tests. The one-legged stance and star excursion balance tests have fair to excellent reliabilities on measures of postural control in adolescent athletes with and without back pain. They can be used as measures of postural control in adolescent athletes with and without back pain.
Do properties of individual languages shape the mechanisms by which they are processed? By virtue of their non-concatenative morphological structure, the recognition of complex words in Semitic languages has been argued to rely strongly on morphological information and on decomposition into root and pattern constituents. Here, we report results from a masked priming experiment in Hebrew in which we contrasted verb forms belonging to two morphological classes, Paal and Piel, which display similar properties, but crucially differ on whether they are extended to novel verbs. Verbs from the open-class Piel elicited familiar root priming effects, but verbs from the closed-class Paal did not. Our findings indicate that, similarly to other (e.g., Indo-European) languages, down-to-the-root decomposition in Hebrew does not apply to stems of non-productive verbal classes. We conclude that the Semitic word processor is less unique than previously thought: Although it operates on morphological units that are combined in a non-linear way, it engages the same universal mechanisms of storage and computation as those seen in other languages.
Understanding a sentence and integrating it into the discourse depends upon the identification of its focus, which, in spoken German, is marked by accentuation. In the case of written language, which lacks explicit cues to accent, readers have to draw on other kinds of information to determine the focus. We study the joint or interactive effects of two kinds of information that have no direct representation in print but have each been shown to be influential in the reader's text comprehension: (i) the (low-level) rhythmic-prosodic structure that is based on the distribution of lexically stressed syllables, and (ii) the (high-level) discourse context that is grounded in the memory of previous linguistic content. Systematically manipulating these factors, we examine the way readers resolve a syntactic ambiguity involving the scopally ambiguous focus operator auch (engl. "too") in both oral (Experiment 1) and silent reading (Experiment 2). The results of both experiments attest that discourse context and local linguistic rhythm conspire to guide the syntactic and, concomitantly, the focus-structural analysis of ambiguous sentences. We argue that reading comprehension requires the (implicit) assignment of accents according to the focus structure and that, by establishing a prominence profile, the implicit prosodic rhythm directly affects accent assignment.
Many educational technology proponents support the Technological Pedagogical Content Knowledge (TPACK) model as a way to conceptualize teaching with technology, but recent TPACK research shows a need for empirical studies regarding the development of this knowledge. This proof-of-concept study applies mixed methods to investigate the meta-cognitive awareness produced by teachers who participate in the Graphic Assessment of TPACK Instrument (GATI). This process involves creating graphical representations (circles of differing sizes and the degree of their overlap) that represent what teachers understand to be their current and aspired TPACK. This study documented teachers' explanations during a think-aloud procedure as they created their GATI figures. The in-depth data from two German teachers who participated in the process captured the details of their experience and demonstrated the potential of the GATI to support teachers in reflecting on their professional knowledge and in determining their own professional development activities. These findings will be informative to future pilot studies involving the larger design of the GATI process, to better understand the role of teachers' meta-conceptual awareness, and to better ascertain how the GATI might be used to support professional development on a larger scale.
Between-school variation in students' achievement, motivation, affect, and learning strategies
(2017)
To plan group-randomized trials where treatment conditions are assigned to schools, researchers need design parameters that provide information about between-school differences in outcomes as well as the amount of variance that can be explained by covariates at the student (L1) and school (L2) levels. Most previous research has offered these parameters for U.S. samples and for achievement as the outcome. This paper and the online supplementary materials provide design parameters for 81 countries in three broad outcome categories (achievement, affect and motivation, and learning strategies) for domain-general and domain-specific (mathematics, reading, and science) measures. Sociodemographic characteristics were used as covariates. Data from representative samples of 15-year-old students stemmed from five cycles of the Programme for International Student Assessment (PISA; total number of students/schools: 1,905,147/70,098). Between-school differences as well as the amount of variance explained at L1 and L2 varied widely across countries and educational outcomes, demonstrating the limited generalizability of design parameters across these dimensions. The use of the design parameters to plan group-randomized trials is illustrated.
This study investigates the comprehension of wh-questions in individuals with aphasia (IWA) speaking Turkish, a non-wh-movement language, and German, a wh-movement language. We examined six German-speaking and 11 Turkish-speaking IWA using picture-pointing tasks. Findings from our experiments show that the Turkish IWA responded more accurately to both object who and object which questions than to subject questions, while the German IWA performed better for subject which questions than in all other conditions. Using random forest models, a machine learning technique used in tree-structured classification, on the individual data revealed that both the Turkish and German IWA’s response accuracy is largely predicted by the presence of overt and unambiguous case marking. We discuss our results with regard to different theoretical approaches to the comprehension of wh-questions in aphasia.
The starting point of this contribution is the potential risk to health and performance from the combination of elite sporting careers with the pursuit of education. In European sport science and politics, structural measures to promote dual careers in elite sports have been discussed increasingly of late. In addition to organisational measures, there are calls for educational-psychological intervention programmes supporting the successful management of dual careers at the individual level. This paper presents an appropriate intervention programme and its evaluation: stress-resistance training for elite athletes (SRT-EA). It comprises 10 units, each lasting 90 minutes. It is intended for athletes and aims to improve their resistance to chronic stress. The evaluation was carried out in a quasi-experimental design, with three points of measurement (baseline, immediately after, and three months after) and two non-randomised groups: an intervention group (n = 128) and an untreated control group (n = 117). Participants were between 13 and 20 years of age (53.5% male) and represented
various Olympic sports. Outcome variables were assessed with questionnaires. Significant short- and mid-term intervention effects were found. The intervention increased stress-related knowledge, general self-efficacy, and stress sensitivity, while chronic stress level, stress symptoms, and stress reactivity were reduced. In line with the intention of the intervention, the results showed short- and mid-term, small to medium-sized effects. Accordingly, separate measurements at the end of the intervention and three months later showed mostly positive subjective experiences. Thus, the results reinforce the hope that educational-psychological stress-management interventions can support dual careers.
The aim of our study was to examine the extent to which linguistic approaches to sentence comprehension deficits in aphasia can account for differential impairment patterns in the comprehension of wh-questions in bilingual persons with aphasia (PWA). We investigated the comprehension of subject and object wh-questions in both Turkish, a wh-in-situ language, and German, a wh-fronting language, in two bilingual PWA using a sentence-to-picture matching task. Both PWA showed differential impairment patterns in their two languages. SK, an early bilingual PWA, had particular difficulty comprehending subject which-questions in Turkish but performed normally across all conditions in German. CT, a late bilingual PWA, performed more poorly for object which-questions in German than in all other conditions, whilst in Turkish his accuracy was at chance level across all conditions. We conclude that the observed patterns of selective cross-linguistic impairments cannot solely be attributed either to difficulty with wh-movement or to problems with the integration of discourse-level information. Instead, our results suggest that differences between our PWA's individual bilingualism profiles (e.g. onset of bilingualism, premorbid language dominance) considerably affected the nature and extent of their impairments.
Using behavioral observation for the longitudinal study of anger regulation in middle childhood
(2017)
Assessing anger regulation via self-reports is fraught with problems, especially among children. Behavioral observation provides an ecologically valid alternative for measuring anger regulation. The present study uses data from two waves of a longitudinal study to present a behavioral observation approach for measuring anger regulation in middle childhood. At T1, 599 children from Germany (6–10 years old) were observed during an anger eliciting task, and the use of anger regulation strategies was coded. At T2, 3 years later, the observation was repeated with an age-appropriate version of the same task. Partial metric measurement invariance over time demonstrated the structural equivalence of the two versions. Maladaptive anger regulation between the two time points showed moderate stability. Validity was established by showing correlations with aggressive behavior, peer problems, and conduct problems (concurrent and predictive criterion validity). The study presents an ecologically valid and economic approach to assessing anger regulation strategies in situations.
Schools are a major context for academic and socio-emotional development, but also an important acculturative context. This is notably the case in adolescence, which is a critical period for the development of a social and ethnic identity, as well as moral reasoning and intergroup attitudes. How schools approach cultural diversity issues is therefore likely to affect these developmental and acculturative processes and adaptation outcomes. In the present article, the manifestation and effects of the most prominent approaches to cultural diversity, namely those guided by a perspective of equality and inclusion, and those guided by a perspective of cultural pluralism, are reviewed and compared in the context of multi-ethnic schools. The aim is to explore when and how the potential of cultural diversity can best flourish, enhancing the academic and socio-emotional development of culturally diverse students.
Although all bilinguals encounter cross-language interference (CLI), some bilinguals are more susceptible to interference than others. Here, we report on language performance of late bilinguals (Russian/German) on two bilingual tasks (interview, verbal fluency), their language use and switching habits. The only between-group difference was CLI: one group consistently produced significantly more errors of CLI on both tasks than the other (thereby replicating our findings from a bilingual picture naming task). This striking group difference in language control ability can only be explained by differences in cognitive control, not in language proficiency or language mode.
Background: The goal of this study was to estimate the prevalence of and risk factors for diagnosed depression in heart failure (HF) patients in German primary care practices.
Methods: This study was a retrospective database analysis in Germany utilizing the Disease Analyzer® Database (IMS Health, Germany). The study population included 132,994 patients between 40 and 90 years of age from 1,072 primary care practices. The observation period was between 2004 and 2013. Follow-up lasted up to five years and ended in April 2015. A total of 66,497 HF patients were selected after applying the exclusion criteria, and the same number of controls (66,497) were chosen and matched (1:1) to HF patients on the basis of age, sex, health insurance, depression diagnosis in the past, and follow-up duration after the index date.
Results: HF was a strong risk factor for diagnosed depression (p < 0.0001). A total of 10.5% of HF patients and 6.3% of matched controls developed depression after one year of follow-up (p < 0.001). Depression was documented in 28.9% of the HF group and 18.2% of the control group after the five-year follow-up (p < 0.001). Cancer, dementia, osteoporosis, stroke, and osteoarthritis were associated with a higher risk of developing depression. Male gender and private health insurance were associated with lower risk of depression.
Conclusions: The risk of diagnosed depression is significantly increased in patients with HF compared to patients without HF in primary care practices in Germany.
Changes in free symptom attributions in hypochondriasis after cognitive therapy and exposure therapy
(2016)
Background: Cognitive-behavioural therapy can change dysfunctional symptom attributions in patients with hypochondriasis. Past research has used forced-choice answer formats, such as questionnaires, to assess these misattributions; however, with this approach, idiosyncratic attributions cannot be assessed. Free associations are an important complement to existing approaches that assess symptom attributions. Aims: With this study, we contribute to the current literature by using an open-response instrument to investigate changes in freely associated attributions after exposure therapy (ET) and cognitive therapy (CT) compared with a wait list (WL). Method: The current study is a re-examination of a formerly published randomized controlled trial (Weck, Neng, Richtberg, Jakob and Stangier, 2015) that investigated the effectiveness of CT and ET. Seventy-three patients with hypochondriasis were randomly assigned to CT, ET or a WL, and completed a 12-week treatment (or waiting period). Before and after the treatment or waiting period, patients completed an Attribution task in which they had to spontaneously attribute nine common bodily sensations to possible causes in an open-response format. Results: Compared with the WL, both CT and ET reduced the frequency of somatic attributions regarding severe diseases (CT: Hedges's g = 1.12; ET: Hedges's g = 1.03) and increased the frequency of normalizing attributions (CT: Hedges's g = 1.17; ET: Hedges's g = 1.24). Only CT changed the attributions regarding moderate diseases (Hedges's g = 0.69). Changes in somatic attributions regarding mild diseases and psychological attributions were not observed. Conclusions: Both CT and ET are effective for treating freely associated misattributions in patients with hypochondriasis. This study supplements research that used a forced-choice assessment.
This paper examines phonological phrasing in the Kwa language Akan. Regressive [+ATR] vowel harmony between words (RVH) serves as a hitherto unreported diagnostic of phonological phrasing. In this paper I discuss VP-internal and NP-internal structures, as well as SVO(O) and serial verb constructions. RVH is a general process in Akan grammar, although it is blocked in certain contexts. The analysis of phonological phrasing relies on universal syntax-phonology mapping constraints whereby lexically headed syntactic phrases are mapped onto phonological phrases. Blocking contexts call for a domain-sensitive analysis of RVH assuming recursive prosodic structure which makes reference to maximal and non-maximal phonological phrases. It is proposed (i) that phonological phrase structure is isomorphic to syntactic structure in Akan, and (ii) that the process of RVH is blocked at the edge of a maximal phonological phrase; this is formulated in terms of a domain-sensitive CrispEdge constraint.
Feeling Half-Half?
(2018)
Growing up in multicultural environments, Turkish-heritage individuals in Europe face specific challenges in combining their multiple cultural identities to form a coherent sense of self. Drawing from social identity complexity, this study explores four modes of combining cultural identities and their variation in relational contexts. Problem-centered interviews with Turkish-heritage young adults in Austria revealed the preference for complex, supranational labels, such as multicultural. Furthermore, most participants described varying modes of combining cultural identities over time and across relational contexts. Social exclusion experiences throughout adolescence related to perceived conflict of cultural identities, whereas multicultural peer groups supported perceived compatibility of cultural identities. Findings emphasize the need for complex, multidimensional approaches to study ethnic minorities' combination of cultural identities.
Sensitivity to salience
(2018)
Sentence comprehension is optimised by indicating entities as salient through linguistic (i.e., information-structural) or visual means. We compare how salience of a depicted referent due to a linguistic (i.e., topic status) or visual cue (i.e., a virtual person’s gaze shift) modulates sentence comprehension in German. We investigated processing of sentences with varying word order and pronoun resolution by means of self-paced reading and an antecedent choice task, respectively. Our results show that linguistic as well as visual salience cues immediately speeded up reading times of sentences mentioning the salient referent first. In contrast, for pronoun resolution, linguistic and visual cues modulated antecedent choice preferences less congruently. In sum, our findings speak in favour of a significant impact of linguistic and visual salience cues on sentence comprehension, substantiating that salient information delivered via language as well as the visual environment is integrated in the current mental representation of the discourse.
In a self-paced reading study on German sluicing, Paape (Paape, 2016) found that reading times were shorter at the ellipsis site when the antecedent was a temporarily ambiguous garden-path structure. As a post-hoc explanation of this finding, Paape assumed that the antecedent's memory representation was reactivated during syntactic reanalysis, making it easier to retrieve. In two eye tracking experiments, we subjected the reactivation hypothesis to further empirical scrutiny. Experiment 1, carried out in French, showed no evidence in favor of the reactivation hypothesis. Instead, results for one out of the three types of garden-path sentences that were tested suggest that subjects sometimes failed to resolve the temporary ambiguity in the antecedent clause, and subsequently failed to resolve the ellipsis. The results of Experiment 2, a conceptual replication of Paape's (Paape, 2016) original study carried out in German, are compatible with the reactivation hypothesis, but leave open the possibility that the observed speedup for ambiguous antecedents may be due to occasional retrievals of an incorrect structure.
Much research on language control in bilinguals has relied on the interpretation of the costs of switching between two languages. Of the two types of costs that are linked to language control, switching costs are assumed to be transient in nature and modulated by trial-specific manipulations (e.g., by preparation time), while mixing costs are supposed to be more stable and less affected by trial-specific manipulations. The present study investigated the effect of preparation time on switching and mixing costs, revealing that both types of costs can be influenced by trial-specific manipulations.
Rhythm perception is assumed to be guided by a domain-general auditory principle, the Iambic/Trochaic Law, stating that sounds varying in intensity are grouped as strong-weak, and sounds varying in duration are grouped as weak-strong. Recently, Bhatara et al. (2013) showed that rhythmic grouping is influenced by native language experience, French listeners having weaker grouping preferences than German listeners. This study explores whether L2 knowledge and musical experience also affect rhythmic grouping. In a grouping task, French late learners of German listened to sequences of coarticulated syllables varying in either intensity or duration. Data on their language and musical experience were obtained by a questionnaire. Mixed-effect model comparisons showed influences of musical experience as well as L2 input quality and quantity on grouping preferences. These results imply that adult French listeners' sensitivity to rhythm can be enhanced through L2 and musical experience.
Background: Dementia is a psychiatric condition whose development is associated with numerous aspects of life. Our aim was to estimate dementia risk factors in German primary care patients.
Methods: The case-control study included primary care patients (70-90 years) with first diagnosis of dementia (all-cause) during the index period (01/2010-12/2014) (Disease Analyzer, Germany), and controls without dementia matched (1:1) to cases on the basis of age, sex, type of health insurance, and physician. Practice visit records were used to verify that there had been 10 years of continuous follow-up prior to the index date. Multivariate logistic regression models were fitted with dementia as a dependent variable and the potential predictors.
Results: The mean age for the 11,956 cases and the 11,956 controls was 80.4 (SD: 5.3) years. 39.0% of them were male and 1.9% had private health insurance. In the multivariate regression model, the following variables were linked to a significant extent with an increased risk of dementia: diabetes (OR: 1.17; 95% CI: 1.10-1.24), lipid metabolism (1.07; 1.00-1.14), stroke incl. TIA (1.68; 1.57-1.80), Parkinson's disease (PD) (1.89; 1.64-2.19), intracranial injury (1.30; 1.00-1.70), coronary heart disease (1.06; 1.00-1.13), mild cognitive impairment (MCI) (2.12; 1.82-2.48), mental and behavioral disorders due to alcohol use (1.96; 1.50-2.57). The use of statins (OR: 0.94; 0.90-0.99), proton-pump inhibitors (PPI) (0.93; 0.90-0.97), and antihypertensive drugs (0.96, 0.94-0.99) were associated with a decreased risk of developing dementia.
Conclusions: Risk factors for dementia found in this study are consistent with the literature. Nevertheless, the associations between statin, PPI and antihypertensive drug use, and decreased risk of dementia need further investigations.
Background: Given the well-established association between perceived stress and quality of life (QoL) in dementia patients and their partners, our goal was to identify whether relationship quality and dyadic coping would operate as mediators between perceived stress and QoL.
Methods: 82 dyads of dementia patients and their spousal caregivers were included in a cross-sectional assessment from a prospective study. QoL was assessed with the Quality of Life in Alzheimer's Disease scale (QoL-AD) for dementia patients and the WHO Quality of Life-BREF for spousal caregivers. Perceived stress was measured with the Perceived Stress Scale (PSS-14). Both partners were assessed with the Dyadic Coping Inventory (DCI). Analyses of correlation as well as regression models including mediator analyses were performed.
Results: We found negative correlations between stress and QoL in both partners (QoL-AD: r = -0.62; p < 0.001; WHO-QOL Overall: r = -0.27; p = 0.02). Spousal caregivers had a significantly lower DCI total score than dementia patients (p < 0.001). Dyadic coping was a significant mediator of the relationship between stress and QoL in spousal caregivers (z = 0.28; p = 0.02), but not in dementia patients. Likewise, relationship quality significantly mediated the relationship between stress and QoL in caregivers only (z = -2.41; p = 0.02).
Conclusions: This study identified dyadic coping as a mediator of the relationship between stress and QoL in (caregiving) partners of dementia patients. In patients, however, we found a direct negative effect of stress on QoL. The findings suggest the importance of stress-reducing and dyadic interventions for dementia patients and their partners, respectively.
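Mediation effects such as those reported above are commonly tested with a Sobel-type test of the indirect effect (the abstract does not state which test was used). A minimal sketch of that statistic, with purely illustrative path coefficients that are not taken from the study:

```python
import math

def sobel_z(a: float, sa: float, b: float, sb: float) -> float:
    """Sobel test statistic for the indirect (mediated) effect a*b,
    where a is the predictor->mediator path and b is the
    mediator->outcome path, with standard errors sa and sb."""
    return (a * b) / math.sqrt(b**2 * sa**2 + a**2 * sb**2)

# Illustrative values only: stress -> dyadic coping (a), coping -> QoL (b)
z = sobel_z(a=-0.40, sa=0.10, b=0.55, sb=0.15)
print(round(z, 2))
```

A |z| above 1.96 would indicate a significant indirect effect at the 5% level, mirroring the significant mediation found for caregivers but not for patients.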
Previous research informs us about facilitators of employees’ promotive voice. Yet little is known about what determines whether a specific idea for constructive change brought up by an employee will be approved or rejected by a supervisor. Drawing on interactionist theories of motivation and personality, we propose that a supervisor will be least likely to support an idea when it threatens the supervisor’s power motive, and when it is perceived to serve the employee’s own striving for power. The prosocial versus egoistic intentions attributed to the idea presenter are proposed to mediate the latter effect. We conducted three scenario-based studies in which supervisors evaluated fictitious ideas voiced by employees that – if implemented – would have power-related consequences for them as a supervisor. Results show that the higher a supervisor’s explicit power motive was, the less likely they were to support a power-threatening idea (Study 1, N = 60). Moreover, idea support was less likely when the idea was proposed by an employee who was described as high (rather than low) on power motivation (Study 2, N = 79); attributed prosocial intentions mediated this effect. Study 3 (N = 260) replicates these results.
Several personality dispositions with common features capturing sensitivities to negative social cues have recently been introduced into psychological research. To date, however, little is known about their interrelations, their conjoint effects on behavior, or their interplay with other risk factors. We asked N = 349 adults from Germany to rate their justice, rejection, moral disgust, and provocation sensitivity, hostile attribution bias, trait anger, and forms and functions of aggression. The sensitivity measures were mostly positively correlated, particularly those with an egoistic focus (victim justice, rejection, and provocation sensitivity, hostile attributions, and trait anger) and those with an altruistic focus (observer justice, perpetrator justice, and moral disgust sensitivity). The sensitivity measures had independent and differential effects on forms and functions of aggression when considered simultaneously and when controlling for hostile attributions and anger. They could not be integrated into a single factor of interpersonal sensitivity or reduced to other well-known risk factors for aggression. The sensitivity measures, therefore, require consideration in predicting and preventing aggression.
Background: Deception can distort psychological tests on socially sensitive topics. Understanding the cerebral processes that are involved in such faking can be useful in the detection and prevention of deception. Previous research shows that faking a brief implicit association test (BIAT) evokes a characteristic ERP response. It is not yet known whether temporarily available self-control resources moderate this response. We randomly assigned 22 participants (15 females, 24.23 ± 2.91 years old) to a counterbalanced repeated-measurements design. Participants first completed a Brief-IAT (BIAT) on doping attitudes as a baseline measure and were then instructed to fake a negative doping attitude both when self-control resources were depleted and non-depleted. Cerebral activity during BIAT performance was assessed using high-density EEG.
Results: Compared to the baseline BIAT, event-related potentials showed a first interaction at the parietal P1, while significant post hoc differences were found only at the later occurring late positive potential. Here, significantly decreased amplitudes were recorded for ‘normal’ faking, but not in the depletion condition. In source space, enhanced activity was found for ‘normal’ faking in the bilateral temporoparietal junction. Behaviorally, participants successfully faked the BIAT in both conditions.
Conclusions: Results indicate that temporarily available self-control resources do not affect overt faking success on a BIAT. However, differences were found on an electrophysiological level. This indicates that while self-control resources play a negligible role in deliberate test faking at the phenotypical level, the underlying cerebral processes are markedly different.
Aim: We aimed to identify patient characteristics and comorbidities that correlate with the initial exercise capacity of cardiac rehabilitation (CR) patients and to study the significance of patient characteristics, comorbidities and training methods for training achievements and final fitness of CR patients.
Methods: We studied 557 consecutive patients (51.7 ± 6.9 years; 87.9% men) admitted to a three-week in-patient CR. Cardiopulmonary exercise testing (CPX) was performed at discharge. Exercise capacity (watts) at entry, gain in training volume and final physical fitness (assessed by peak O2 utilization, VO2peak) were analysed using analysis of covariance (ANCOVA) models.
Results: Mean training intensity was 90.7 ± 9.7% of maximum heart rate (81% continuous/19% interval training, 64% additional strength training). A total of 12.2 ± 2.6 bicycle exercise training sessions were performed. An average increase in training volume of more than 100% was achieved (difference end/beginning of CR: 784 ± 623 watts × min). In the multivariate model, the gain in training volume was significantly associated with smoking, age and exercise capacity at entry of CR. The physical fitness level achieved at discharge from CR, as assessed by VO2peak, was mainly dependent on age, but also on various factors related to training, namely exercise capacity at entry, increase of training volume and training method.
Conclusion: CR patients were trained in line with current guidelines at moderate-to-high intensity and achieved a considerable increase in their training volume. The physical fitness level achieved at discharge from CR depended on various factors associated with training, which supports the recommendation that CR should be offered to all cardiac patients.
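Training volume in watt-minutes, as in the reported gain of 784 ± 623 watts × min, is conventionally the product of workload and duration summed over a session's training bouts. A minimal sketch under that assumption, with made-up numbers rather than study data:

```python
def training_volume(bouts):
    """Training volume as the sum of workload x duration over all
    bouts of a session, in watt-minutes."""
    return sum(watts * minutes for watts, minutes in bouts)

# Made-up example: a single bout per session, volume doubling
# from CR entry to discharge (a >100% gain, as reported on average)
entry = training_volume([(50, 15)])       # 750 watt-min
discharge = training_volume([(75, 20)])   # 1500 watt-min
gain_percent = (discharge - entry) / entry * 100
print(discharge - entry, gain_percent)
```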
Editorial
(2016)
Background: It has previously been shown that conditioning activities consisting of repetitive hops have the potential to induce better drop jump (DJ) performance in recreationally active individuals. In the present pilot study, we investigated whether repetitive conditioning hops can also increase reactive jump and sprint performance in sprint-trained elite athletes competing at an international level.
Methods: Jump and sprint performances of 5 athletes were randomly assessed under 2 conditions. The control condition (CON) comprised 8 DJs and 4 trials of 30-m sprints. The intervention condition (HOP) consisted of 10 maximal repetitive two-legged hops that were conducted 10 s prior to each single DJ and sprint trial. DJ performance was analyzed using a one-dimensional ground reaction force plate. Step length (SL), contact time (CT), and sprint time (ST) during the 30-m sprints were recorded using an opto-electronic measurement system.
Results: Following the conditioning activity, DJ height and external DJ peak power were both significantly increased by 11% compared to the control condition. All other variables did not show any significant differences between HOP and CON.
Conclusions: In the present pilot study, we were able to demonstrate large improvements in DJ performance even in sprint-trained elite athletes following a conditioning activity consisting of maximal two-legged repetitive hops. This strengthens the hypothesis that plyometric conditioning exercises can induce performance enhancements in elite athletes that are even greater than those observed in recreationally active athletes. In addition, it appears that the transfer of these effects to other stretch-shortening cycle activities is limited, as we did not observe any changes in sprint performance following the plyometric conditioning activity.
The genesis of chronic pain is explained by a biopsychosocial model. It hypothesizes an interdependency between environmental and genetic factors provoking aberrant long-term changes in biological and psychological regulatory systems. Physiological effects of psychological and physical stressors may play a crucial role in these maladaptive processes. Specifically, long-term demands on the stress response system may moderate central pain processing and influence descending serotonergic and noradrenergic signals from the brainstem, regulating nociceptive processing at the spinal level. However, the underlying mechanisms of this pathophysiological interplay still remain unclear. This paper aims to shed light on possible pathways between physical (exercise) and psychological stress and the potential neurobiological consequences in the genesis and treatment of chronic pain, highlighting evolving concepts and promising research directions in the treatment of chronic pain. Two treatment forms (exercise and mindfulness-based stress reduction as exemplary therapies), their interaction, and their dose-response relationships will be discussed in more detail, which might pave the way to a better understanding of alterations in the pain matrix and help to develop future prevention and therapeutic concepts.
Objective: To investigate associations between socioeconomic status (SES) indicators (education, job position, income, multidimensional index) and the genesis of chronic low back pain (CLBP).
Design: Longitudinal field study (baseline and 6-month follow-up).
Setting: Four medical clinics across Germany.
Participants: 352 people were included according to the following criteria: (1) between 18 and 65 years of age, (2) intermittent pain and (3) an understanding of the study and the ability to answer a questionnaire without help. Exclusion criteria were: (1) pregnancy, (2) inability to stand upright, (3) inability to give sick leave information, (4) signs of serious spinal pathology, (5) acute pain in the past 7 days or (6) an incomplete SES indicators questionnaire.
Outcome measures: Subjective intensity and disability of CLBP.
Results: Analysis showed that job position was the best single predictor of CLBP intensity, followed by a multidimensional index. Education and income had no significant association with intensity. Subjective disability was best predicted by job position, succeeded by the multidimensional index and education, while income again had no significant association.
Conclusion: The results showed that SES indicators have associations of differing strength with the genesis of CLBP and should therefore not be used interchangeably. Job position was found to be the single most important indicator. These results could be helpful in the planning of back pain care programmes, but in general, more research on the relationship between SES and health outcomes is needed.
The regular monitoring of physical fitness and sport-specific performance is important in elite sports to increase the likelihood of success in competition. This study aimed to systematically review and critically appraise the methodological quality, validation data, and feasibility of sport-specific performance assessments in Olympic combat sports such as amateur boxing, fencing, judo, karate, taekwondo, and wrestling. A systematic search was conducted in the electronic databases PubMed, Google Scholar, and ScienceDirect up to October 2017. Studies in combat sports were included that reported validation data (e.g., reliability, validity, sensitivity) of sport-specific tests. Overall, 39 studies were eligible for inclusion in this review. The majority of studies (74%) contained sample sizes <30 subjects. Nearly 1/3 of the reviewed studies lacked a sufficient description (e.g., anthropometrics, age, expertise level) of the included participants. Seventy-two percent of studies did not sufficiently report inclusion/exclusion criteria of their participants. In 62% of the included studies, the description and/or inclusion of familiarization session(s) was either incomplete or nonexistent. Sixty percent of studies did not report any details about the stability of testing conditions. Approximately half of the studies examined reliability measures of the included sport-specific tests (intraclass correlation coefficient [ICC] = 0.43–1.00). Content validity was addressed in all included studies, criterion validity (only the concurrent aspect of it) in approximately half of the studies, with correlation coefficients ranging from r = −0.41 to 0.90. Construct validity was reported in 31% of the included studies and predictive validity in only one. Test sensitivity was addressed in 13% of the included studies. The majority of studies (64%) ignored and/or provided incomplete information on test feasibility and methodological limitations of the sport-specific test.
In 28% of the included studies, insufficient information or a complete lack of information was provided in the respective field of the test application. Several methodological gaps exist in studies that used sport-specific performance tests in Olympic combat sports. Additional research should adopt more rigorous validation procedures in the application and description of sport-specific performance tests in Olympic combat sports.
Findings on the perceptual reorganization of lexical tones are mixed. Some studies report good tone discrimination abilities for all tested age groups, others report decreased or enhanced discrimination with increasing age, and still others report U-shaped developmental curves. Since prior studies have used a wide range of contrasts and experimental procedures, it is unclear how specific task requirements interact with discrimination abilities at different ages. In the present work, we tested German and Cantonese adults on their discrimination of Cantonese lexical tones, as well as German-learning infants between 6 and 18 months of age on their discrimination of two specific Cantonese tones using two different types of experimental procedures. The adult experiment showed that German native speakers can discriminate between lexical tones, but native Cantonese speakers show significantly better performance. The results from German-learning infants suggest that 6- and 18-month-olds discriminate tones, while 9-month-olds do not, supporting a U-shaped developmental curve. Furthermore, our results revealed an effect of methodology, with good discrimination performance at 6 months after habituation but not after familiarization. These results support three main conclusions. First, habituation can be a more sensitive procedure for measuring infants' discrimination than familiarization. Second, the previous finding of a U-shaped curve in the discrimination of lexical tones is further supported. Third, discrimination abilities at 18 months appear to reflect mature perceptual sensitivity to lexical tones, since German adults also discriminated the lexical tones with high accuracy.
Infants start learning the prosodic properties of their native language before 12 months, as shown by the emergence of a trochaic bias in English-learning infants between 6 and 9 months (Jusczyk et al., 1993), and in German-learning infants between 4 and 6 months (Hohle et al., 2009, 2014), while French-learning infants do not show a bias at 6 months (Hohle et al., 2009). This language-specific emergence of a trochaic bias is supported by the fact that English and German are languages with trochaic predominance in their lexicons, while French is a language with phrase-final lengthening but lacking lexical stress. We explored the emergence of a trochaic bias in bilingual French/German infants to study whether the developmental trajectory would be similar to that of monolingual infants and whether the amount of relative exposure to the two languages has an impact on the emergence of the bias. Accordingly, we replicated Hohle et al. (2009) with 24 bilingual 6-month-olds learning French and German simultaneously. All infants had been exposed to both languages for 30 to 70% of the time from birth. Using the Headturn Preference Procedure, infants were presented with two lists of stimuli, one made up of several occurrences of the pseudoword /GAba/ with word-initial stress (trochaic pattern), the second made up of several occurrences of the pseudoword /gaBA/ with word-final stress (iambic pattern). The stimuli were recorded by a native German female speaker. Results revealed that these French/German bilingual 6-month-olds have a trochaic bias (as evidenced by a preference to listen to the trochaic pattern). Hence, their listening preference is comparable to that of monolingual German-learning 6-month-olds, but differs from that of monolingual French-learning 6-month-olds, who did not show any preference (Hohle et al., 2009). Moreover, the size of the trochaic bias in the bilingual infants was not correlated with their amount of exposure to German.
The present results thus establish that the development of a trochaic bias in simultaneous bilinguals is not delayed compared to monolingual German-learning infants (Hohle et al., 2009) and is rather independent of the amount of exposure to German relative to French.
Drugs as instruments
(2016)
Neuroenhancement (NE) is the non-medical use of psychoactive substances to produce a subjective enhancement in psychological functioning and experience. So far, empirical investigations of individuals' motivation for NE have been hampered by the lack of a theoretical foundation. This study aimed to apply drug instrumentalization theory to user motivation for NE. We argue that NE should be defined and analyzed from a behavioral perspective rather than in terms of the characteristics of substances used for NE. In the empirical study we explored user behavior by analyzing relationships between drug options (use of over-the-counter products, prescription drugs, illicit drugs) and postulated drug instrumentalization goals (e.g., improved cognitive performance, counteracting fatigue, improved social interaction). Questionnaire data from 1438 university students were subjected to exploratory and confirmatory factor analysis to address the question of whether analysis of drug instrumentalization should be based on the assumption that users are aiming to achieve a certain goal and choose their drug accordingly, or whether NE behavior is more strongly rooted in a decision to try or use a certain drug option. We used factor mixture modeling to explore whether users could be separated into qualitatively different groups defined by a shared "goal × drug option" configuration. Our results indicate, first, that individuals' decisions about NE are ultimately based on personal attitudes to drug options (e.g., willingness to use an over-the-counter product but not to abuse prescription drugs) rather than motivated by the desire to achieve a specific goal (e.g., fighting tiredness) for which different drug options might be tried. Second, data analyses suggested two qualitatively different classes of users.
Both predominantly used over-the-counter products, but "neuroenhancers" might be characterized by a higher propensity to instrumentalize over-the-counter products for virtually all investigated goals whereas "fatigue-fighters" might be inclined to use over-the-counter products exclusively to fight fatigue. We believe that psychological investigations like these are essential, especially for designing programs to prevent risky behavior.
Due to maturation of the postural control system and secular declines in motor performance, adolescents experience deficits in postural control during standing and walking while concurrently performing cognitive interference tasks. Thus, adequately designed balance training programs may help to counteract these deficits. While the general effectiveness of youth balance training is well-documented, there is hardly any information available on the specific effects of single-task (ST) versus dual-task (DT) balance training. Therefore, the objectives of this study were (i) to examine static/dynamic balance performance under ST and DT conditions in adolescents and (ii) to study the effects of ST versus DT balance training on static/dynamic balance under ST and DT conditions in adolescents. Twenty-eight healthy girls and boys aged 12–13 years were randomly assigned to either 8 weeks of ST or DT balance training. Before and after training, postural sway and spatio-temporal gait parameters were registered under ST (standing/walking only) and DT conditions (standing/walking while concurrently performing an arithmetic task). At baseline, significantly slower gait speed (p < 0.001, d = 5.1), shorter stride length (p < 0.001, d = 4.8), and longer stride time (p < 0.001, d = 3.8) were found for DT compared to ST walking but not standing. Training resulted in significant pre–post decreases in DT costs for gait velocity (p < 0.001, d = 3.1), stride length (-45%, p < 0.001, d = 2.4), and stride time (-44%, p < 0.01, d = 1.9). Training did not induce any significant changes (p > 0.05, d = 0–0.1) in DT costs for all parameters of secondary task performance during standing and walking. Training produced significant pre–post increases (p = 0.001; d = 1.47) in secondary task performance while sitting. The observed increase was significantly greater for the ST training group (p = 0.04; d = 0.81). 
For standing, no significant changes were found over time irrespective of the experimental group. We conclude that adolescents showed impaired DT compared to ST walking but not standing. ST and DT balance training resulted in significant and similar changes in DT costs during walking. Thus, there appears to be no preference for either ST or DT balance training in adolescents.
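The dual-task (DT) costs reported above are conventionally computed as the relative performance decrement from single-task (ST) to DT conditions. A minimal sketch of that standard formula, with made-up gait-speed values rather than study data:

```python
def dual_task_cost(single_task: float, dual_task: float) -> float:
    """Relative dual-task cost in percent: the performance decrement
    under dual-task relative to single-task conditions."""
    return (single_task - dual_task) / single_task * 100

# Illustrative gait speeds in m/s (made-up values, not study data):
# walking alone vs. walking while doing an arithmetic task
cost = dual_task_cost(1.30, 1.10)
print(round(cost, 1))  # 15.4 (% slower under dual-task conditions)
```

A training-related decrease in this cost, as reported for gait velocity, stride length, and stride time, means the gap between ST and DT performance narrowed after the intervention.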
Concrete-operational thinking depicts an important aspect of cognitive development. A promising approach in promoting these skills is the instruction of strategies. The construction of such instructional programs requires insights into the mental operations involved in problem-solving. In the present paper, we address the question to which extent variations of the effect of isolated and combined mental operations (strategies) on correct solution of concrete-operational concepts can be observed. Therefore, a cross-sectional design was applied. The use of mental operations was measured by thinking-aloud reports from 80 first- and second-graders (N = 80) while solving tasks depicting concrete-operational thinking. Concrete-operational thinking was assessed using the subscales conservation of numbers, classification and sequences of the TEKO. The verbal reports were transcribed and coded with regard to the mental operations applied per task. Data analyses focused on tasks level, resulting in the analyses of N = 240 tasks per subscale. Differences regarding the contribution of isolated and combined mental operations (strategies) to correct solution were observed. Thereby, the results indicate the necessity of selection and integration of appropriate mental operations as strategies. The results offer insights in involved mental operations while solving concrete-operational tasks and depict a contribution to the construction of instructional programs.