Sentence comprehension requires the assignment of thematic relations between the verb and its noun arguments in order to determine who is doing what to whom. In some languages, such as English, word order is the primary syntactic cue. In other languages, such as German, case-marking is additionally used to assign thematic roles. During development children have to acquire the thematic relevance of these syntactic cues and weigh them against semantic cues. Here we investigated the processing of syntactic cues and semantic cues in 2- and 3-year-old children by analyzing their behavioral and neurophysiological responses. Case-marked subject-first and object-first sentences (syntactic cue) including animate and inanimate nouns (semantic cue) were presented auditorily. The semantic animacy cue either conflicted with or supported the thematic roles assigned by syntactic case-marking. In contrast to adults, for whom semantics did not interfere with case-marking, children attended to both syntactic and semantic cues, with a stronger reliance on semantic cues in early development. Children’s event-related brain potentials indicated sensitivity to syntactic information but increased processing costs when case-marking and animacy assigned conflicting thematic roles. These results demonstrate an early developmental sensitivity and an ongoing shift towards the use of syntactic cues during sentence comprehension.
Injuries in professional soccer are a significant concern for teams, caused, among other factors, by high training load. This cohort study describes the relationship between workload parameters and the occurrence of non-contact injuries during weeks with high and low workload in professional soccer players throughout the season. Twenty-one professional soccer players aged 28.3 ± 3.9 years who competed in the Iranian Persian Gulf Pro League participated in this 48-week study. External load was monitored using a global positioning system (GPS, GPSPORTS Systems Pty Ltd), and the type of injury was documented daily by the team's medical staff. Odds ratios (OR) and relative risks (RR) were calculated for non-contact injuries in high- and low-load weeks according to acute workload (AW), chronic workload (CW), acute to chronic workload ratio (ACWR), and AW variation (Δ-AW) values. Using the Poisson distribution, the intervals between previous and new injuries were estimated. Overall, 12 non-contact injuries occurred during high-load weeks and 9 during low-load weeks. For ACWR and Δ-AW, there was a significantly increased risk of sustaining non-contact injuries (p < 0.05) during high-load weeks (OR: 4.67 and 4.07, respectively). Finally, the expected time between injuries was significantly shorter in high-load weeks for ACWR (1.25 vs. 3.33, rate ratio time (RRT)) and Δ-AW (1.33 vs. 3.45, RRT) compared with low-load weeks. The risk of sustaining injuries was significantly greater during high-workload weeks for ACWR and Δ-AW compared with low-workload weeks. The high ORs observed in high-load weeks indicate a significant relationship between workload and the occurrence of non-contact injuries. The predicted time to a new injury is shorter in high-load weeks than in low-load weeks; accordingly, the frequency of injuries is higher during high-load weeks for ACWR and Δ-AW.
ACWR and Δ-AW appear to be good indicators for estimating injury risk and the time interval between injuries.
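The workload metrics above are simple ratios. As a minimal sketch (with made-up numbers, not the study's data), the ACWR and the odds ratio from a 2×2 injury table can be computed like this:

```python
def acwr(acute_load, chronic_load):
    """Acute:chronic workload ratio: the current week's load divided by
    the rolling mean of the preceding weeks (here, 4 weeks)."""
    return acute_load / chronic_load

def odds_ratio(injured_high, uninjured_high, injured_low, uninjured_low):
    """Odds ratio of injury in high- vs. low-load weeks from a 2x2 table."""
    return (injured_high / uninjured_high) / (injured_low / uninjured_low)

# Arbitrary GPS-derived weekly loads in arbitrary units (illustrative only)
weekly_loads = [2400, 2600, 2500, 2700]
acute = 3500                               # current week's load
chronic = sum(weekly_loads) / len(weekly_loads)
print(round(acwr(acute, chronic), 2))      # -> 1.37 (elevated ACWR)
print(round(odds_ratio(12, 88, 9, 191), 2))  # -> 2.89 (hypothetical counts)
```

The injury and non-injury week counts passed to `odds_ratio` are invented for illustration; the study's actual ORs (4.67 and 4.07) come from its own exposure data.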
Working memory load-dependent brain response predicts behavioral training gains in older adults
(2014)
In the domain of working memory (WM), a sigmoid-shaped relationship between WM load and brain activation patterns has been demonstrated in younger adults. It has been suggested that age-related alterations of this pattern are associated with changes in neural efficiency and capacity. At the same time, WM training studies have shown that some older adults are able to increase their WM performance through training. In this study, functional magnetic resonance imaging during an n-back WM task at different WM load levels was applied to compare blood oxygen level-dependent (BOLD) responses between younger and older participants and to predict gains in WM performance after a subsequent 12-session WM training procedure in older adults. We show that increased neural efficiency and capacity, as reflected by more "youth-like" brain response patterns in regions of interest of the frontoparietal WM network, were associated with better behavioral training outcome beyond the effects of age, sex, education, gray matter volume, and baseline WM performance. Furthermore, at low difficulty levels, decreases in BOLD response were found after WM training. Results indicate that both neural efficiency (i.e., decreased activation at comparable performance levels) and capacity (i.e., increasing activation with increasing WM load) of a WM-related network predict plasticity of the WM system, whereas WM training may specifically increase neural efficiency in older adults.
There is a wealth of evidence showing that increasing the distance between an argument and its head leads to more processing effort, namely, locality effects; these are usually associated with constraints in working memory (DLT: Gibson, 2000; activation-based model: Lewis and Vasishth, 2005). In SOV languages, however, the opposite effect has been found: antilocality (see discussion in Levy et al., 2013). Antilocality effects can be explained by the expectation-based approach as proposed by Levy (2008) or by the activation-based model of sentence processing as proposed by Lewis and Vasishth (2005). We report an eye-tracking and a self-paced reading study with sentences in Spanish together with measures of individual differences to examine the distinction between expectation- and memory-based accounts, and within memory-based accounts the further distinction between DLT and the activation-based model. The experiments show that (i) antilocality effects as predicted by the expectation account appear only for high-capacity readers; (ii) increasing dependency length by interposing material that modifies the head of the dependency (the verb) produces stronger facilitation than increasing dependency length with material that does not modify the head; this is in agreement with the activation-based model but not with the expectation account; and (iii) a possible outcome of memory load on low-capacity readers is the increase in regressive saccades (locality effects as predicted by memory-based accounts) or, surprisingly, a speedup in the self-paced reading task; the latter consistent with good-enough parsing (Ferreira et al., 2002). In sum, the study suggests that individual differences in working memory capacity play a role in dependency resolution, and that some of the aspects of dependency resolution can be best explained with the activation-based model together with a prediction component.
We report two corpus analyses to examine the impact of animacy, definiteness, givenness and type of referring expression on the ordering of double objects in the spontaneous speech of German-speaking two- to four-year-old children and the child-directed speech of their mothers. The first corpus analysis revealed that definiteness, givenness and type of referring expression influenced word order variation in child language and child-directed speech when the type of referring expression distinguished between pronouns and lexical noun phrases. These results correspond to previous child language studies in English (e.g., de Marneffe et al. 2012). Extending the scope of previous studies, our second corpus analysis examined the role of different pronoun types on word order. It revealed that word order in child language and child-directed speech was predictable from the types of pronouns used. Different types of pronouns were associated with different sentence positions but also showed a strong correlation to givenness and definiteness. Yet, the distinction between pronoun types diminished the effects of givenness so that givenness had an independent impact on word order only in child-directed speech but not in child language. Our results support a multi-factorial approach to word order in German. Moreover, they underline the strong impact of the type of referring expression on word order and suggest that it plays a crucial role in the acquisition of the factors influencing word order variation.
We examined the effects of argument-head distance in SVO and SOV languages (Spanish and German), while taking into account readers' working memory capacity and controlling for expectation (Levy, 2008) and other factors. We predicted only locality effects, that is, a slowdown produced by increased dependency distance (Gibson, 2000; Lewis and Vasishth, 2005). Furthermore, we expected stronger locality effects for readers with low working memory capacity. Contrary to our predictions, low-capacity readers showed faster reading with increased distance, while high-capacity readers showed locality effects. We suggest that while the locality effects are compatible with memory-based explanations, the speedup of low-capacity readers can be explained by an increased probability of retrieval failure. We present a computational model based on ACT-R built under the previous assumptions, which is able to give a qualitative account for the present data and can be tested in future research. Our results suggest that in some cases, interpreting longer RTs as indexing increased processing difficulty and shorter RTs as facilitation may be too simplistic: The same increase in processing difficulty may lead to slowdowns in high-capacity readers and speedups in low-capacity ones. Ignoring individual level capacity differences when investigating locality effects may lead to misleading conclusions.
Regulatory focus is a motivational construct that describes humans’ motivational orientation during goal pursuit. It is conceptualized as a chronic, trait-like, as well as a momentary, state-like orientation. Whereas there is a large number of measures to capture chronic regulatory focus, measures for its momentary assessment are only just emerging. This paper presents the development and validation of a measure of Momentary–Chronic Regulatory Focus. Our development incorporates the distinction between self-guide and reference-point definitions of regulatory focus. Ideals and ought striving are the promotion and prevention dimension in the self-guide system; gain and non-loss regulatory focus are the respective dimensions within the reference-point system. Three-survey-based studies test the structure, psychometric properties, and validity of the measure in its version to assess chronic regulatory focus (two samples of working participants, N = 389, N = 672; one student sample [time 1, N = 105; time 2, n = 91]). In two further studies, an experience sampling study with students (N = 84, k = 1649) and a daily-diary study with working individuals (N = 129, k = 1766), the measure was applied to assess momentary regulatory focus. Multilevel analyses test the momentary measure’s factorial structure, provide support for its sensitivity to capture within-person fluctuations, and provide evidence for concurrent construct validity.
Introduction: The goal of the present study was to identify the prospective relations between weight/shape and muscularity concerns and emotional problems in adolescents. Methods: Self-report data of 966 German male and female adolescents were analyzed in a cross-lagged panel design. Results: Analyses of latent means revealed significant correlations between weight/shape concern and emotional problems as well as between muscularity concern and emotional problems in both genders. Moreover, weight/shape concern predicted emotional problems prospectively, but only in girls. Regarding muscularity concern, we could not find any prospective relation with emotional problems in boys or girls from the general population. Conclusions: It is assumed that, as appearance is highly relevant for girls' self-concept, concerns about one's looks might promote emotional problems. Thus, weight/shape concern should be addressed in the prevention of emotional problems in adolescent girls, whereas further research is necessary to investigate the contribution of muscularity concern in this context.
This study is the first to use kinematic data to assess lingual carryover coarticulation in children. We investigated whether the developmental decrease previously attested in anticipatory coarticulation, as well as the relation between coarticulatory degree and the consonantal context, also characterize carryover coarticulation. Sixty-two children and 13 adults, all native speakers of German, were recruited according to five age cohorts: three-year-olds, four-year-olds, five-year-olds, seven-year-olds, and adults. Tongue movements during the production of ə.CV.Cə utterances (C = /b, d, g/, V = /i, y, e, a, o, u/) were recorded with ultrasound. We measured vowel-induced horizontal displacement of the tongue dorsum within the last syllable and compared the resulting coarticulatory patterns between age cohorts and consonantal contexts. Results indicate that the degree of vocalic carryover coarticulation decreases with age. Vocalic prominence within an utterance as well as its change across childhood depended on the postvocalic consonant’s articulatory demands for the tongue dorsum (i.e., its coarticulatory resistance): Low resistant /b/ and /g/ allowed for more vocalic perseveration and a continuous decrease, while the highly resistant /d/ displayed lower coarticulation degrees and discontinuous effects. These findings parallel those in anticipation suggesting a similar organization of anticipatory and carryover coarticulation. Implications for theories of speech production are discussed.
Validation of two accelerometers to determine mechanical loading of physical activities in children
(2015)
The purpose of this study was to assess the validity of accelerometers using force plates (i.e., ground reaction force (GRF)) during the performance of different tasks of daily physical activity in children. Thirteen children (10.1 years (range 5.4-15.7), 3 girls) wore two accelerometers (ActiGraph GT3X+ (ACT), GENEA (GEN)) at the hip that provide raw acceleration signals at 100 Hz. Participants completed different tasks (walking, jogging, running, landings from boxes of different height, rope skipping, dancing) on a force plate. GRF was collected for one step per trial (10 trials) for ambulatory movements and for all landings (10 trials), rope skips and dance procedures. Accelerometer outputs as peak loading (g) per activity were averaged. ANOVA, correlation analyses and Bland-Altman plots were computed to determine the validity of the accelerometers against GRF. There was a main effect of task, with acceleration values increasing with locomotion speed and landing height (P < 0.001). Data from ACT and GEN correlated with GRF (r = 0.90 and 0.89, respectively) and with each other (r = 0.98), but both accelerometers consistently overestimated GRF. The new generation of accelerometer models that allow raw signal detection are reasonably accurate for measuring impact loading of bone in children, although they systematically overestimate GRF.
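The validation statistics named above, Pearson correlation and Bland-Altman agreement, can be sketched with standard-library Python. The paired peak-load values below are invented for illustration, not the study's measurements:

```python
import statistics as st

def bland_altman(measure_a, measure_b):
    """Bias and 95% limits of agreement between two methods
    (e.g., accelerometer peak load vs. force-plate GRF, unit-matched)."""
    diffs = [a - b for a, b in zip(measure_a, measure_b)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = st.mean(x), st.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired peak loads (in g) from accelerometer vs. force plate
acc = [1.8, 2.4, 3.1, 3.9, 5.2]
grf = [1.5, 2.0, 2.7, 3.3, 4.6]
bias, loa = bland_altman(acc, grf)
print(round(pearson_r(acc, grf), 3), round(bias, 2))
```

A high correlation combined with a positive bias, as in this toy example, mirrors the study's pattern: the devices track GRF closely but systematically overestimate it.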
Depression is the most prevalent psychiatric disorder in the general population. Despite a large demand for efficient treatment options, the majority of older depressed adults does not receive adequate treatment: Additional low-threshold treatments are needed for this age group. Over the past two decades, a growing number of randomized controlled trials (RCT) have been conducted, testing the efficacy of physical exercise in the alleviation of depression in older adults. This meta-analysis systematically reviews and evaluates these studies; some subanalyses testing specific effects of different types of exercise and settings are also performed. In order to be included, exercise programs of the RCTs had to fulfill the criteria of exercise according to the American College of Sports Medicine, including a sample mean age of 60 or above and an increased level of depressive symptoms. Eighteen trials with 1,063 participants fulfilled our inclusion criteria. A comparison of the posttreatment depression scores between the exercise and control groups revealed a moderate effect size in favor of the exercise groups (standardized mean difference (SMD) of –0.68, p < .001). The effect was comparable to the results achieved when only the eleven trials with low risk of bias were included (SMD = –0.63, p < .001). The subanalyses showed significant effects for all types of exercise and for supervised interventions. The results of this meta-analysis suggest that physical exercise may serve as a feasible, additional intervention to fight depression in older adults. However, because of small sample sizes of the majority of individual trials and high statistical heterogeneity, results must be interpreted carefully.
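The pooled effect size reported above is a standardized mean difference. As a minimal sketch of how an SMD is computed for a single trial (Cohen's d with pooled standard deviation; the depression scores below are hypothetical):

```python
import statistics as st

def smd(group_a, group_b):
    """Standardized mean difference (Cohen's d with pooled SD).
    Negative values favor group_a when lower scores mean less depression."""
    n1, n2 = len(group_a), len(group_b)
    s1, s2 = st.stdev(group_a), st.stdev(group_b)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (st.mean(group_a) - st.mean(group_b)) / pooled_sd

# Hypothetical posttreatment depression scores (lower = less depressed)
exercise = [9, 11, 8, 12, 10, 7]
control = [13, 15, 12, 16, 14, 11]
print(round(smd(exercise, control), 2))
```

A meta-analysis then combines such per-trial SMDs, weighting each by its precision; the pooled figure of -0.68 above is of this kind.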
Background: The COVID-19 pandemic has highlighted the importance of scientific endeavors. The goal of this systematic review is to evaluate the quality of the research on physical activity (PA) behavior change and its potential to contribute to policy-making processes in the early days of COVID-19 related restrictions.
Methods: Following PRISMA guidelines, we conducted a systematic review of the methodological quality of current research, searching PubMed and Web of Science for articles on PA behavior change published within 365 days after COVID-19 was declared a pandemic by the World Health Organization (WHO). Items from the JBI checklist and the AXIS tool were used for additional risk-of-bias assessment. Evidence mapping is used for better visualization of the main results. Conclusions about the significance of published articles are based on hypotheses about PA behavior change in the light of the COVID-19 pandemic.
Results: Among the 1,903 identified articles, 36% were opinion pieces, 53% empirical studies, and 9% reviews. Of the 332 studies included in the systematic review, 213 used self-report measures to recollect prepandemic behavior, often in small convenience samples. Most focused on changes in PA volume, whereas changes in PA types were rarely measured. The majority had methodological reporting flaws. Few had very large samples with objective measures in a repeated-measures design (before and during the pandemic). In addition to the expected decline in PA duration, these studies show that many of those who were active prepandemic continued to be active during the pandemic.
Conclusions: Research responded quickly at the onset of the pandemic. However, most of the studies lacked robust methodology, and PA behavior change data lacked the accuracy needed to guide policy makers. To improve the field, we propose the implementation of longitudinal cohort studies by larger organizations such as WHO to ease access to data on PA behavior, and suggest those institutions set clear standards for this research. Researchers need to ensure a better fit between the measurement method and the construct being measured, and use both objective and subjective measures where appropriate to complement each other and provide a comprehensive picture of PA behavior.
Recent research has indicated that university students sometimes use caffeine pills for neuroenhancement (NE; non-medical use of psychoactive substances or technology to produce a subjective enhancement in psychological functioning and experience), especially during exam preparation. In our factorial survey experiment, we manipulated the evidence participants were given about the prevalence of NE amongst peers and measured the resulting effects on the psychological predictors included in the Prototype-Willingness Model of risk behavior. Two hundred and thirty-one university students were randomized to a high prevalence condition (read faked research results overstating usage of caffeine pills amongst peers by a factor of 5; 50%), low prevalence condition (half the estimated prevalence; 5%) or control condition (no information about peer prevalence). Structural equation modeling confirmed that our participants’ willingness and intention to use caffeine pills in the next exam period could be explained by their past use of neuroenhancers, attitude to NE and subjective norm about use of caffeine pills whilst image of the typical user was a much less important factor. Provision of inaccurate information about prevalence reduced the predictive power of attitude with respect to willingness by 40-45%. This may be because receiving information about peer prevalence which does not fit with their perception of the social norm causes people to question their attitude. Prevalence information might exert a deterrent effect on NE via the attitude-willingness association. We argue that research into NE and deterrence of associated risk behaviors should be informed by psychological theory.
In the context of back pain, great emphasis has been placed on the importance of trunk stability, especially in situations requiring compensation of repetitive, intense loading induced during high-performance activities, e.g., jumping or landing. This study aims to evaluate trunk muscle activity during drop jump in adolescent athletes with back pain (BP) compared to athletes without back pain (NBP). Eleven adolescent athletes suffering from back pain (BP: m/f: n = 4/7; 15.9 ± 1.3 y; 176 ± 11 cm; 68 ± 11 kg; 12.4 ± 10.5 h/week training) and 11 matched athletes without back pain (NBP: m/f: n = 4/7; 15.5 ± 1.3 y; 174 ± 7 cm; 67 ± 8 kg; 14.9 ± 9.5 h/week training) were evaluated. Subjects conducted 3 drop jumps onto a force plate (ground reaction force). Bilateral 12-lead SEMG (surface electromyography) was applied to assess trunk muscle activity. Ground contact time [ms], maximum vertical jump force [N], jump time [ms] and the jump performance index [m/s] were calculated for drop jumps. SEMG amplitudes (RMS: root mean square [%]) for all 12 single muscles were normalized to MIVC (maximum isometric voluntary contraction) and analyzed in 4 time windows (100 ms pre- and 200 ms post-initial ground contact, 100 ms pre- and 200 ms post-landing) as outcome variables. In addition, muscles were grouped and analyzed as ventral and dorsal muscles, as well as straight and transverse trunk muscles. Drop jump ground reaction force variables did not differ between NBP and BP (p > 0.05). Mm obliquus externus and internus abdominis presented higher SEMG amplitudes (1.3–1.9-fold) for BP (p < 0.05). Mm rectus abdominis, erector spinae thoracic/lumbar and latissimus dorsi did not differ (p > 0.05). The muscle group analysis over the whole jumping cycle showed statistically significantly higher SEMG amplitudes for BP in the ventral (p = 0.031) and transverse muscles (p = 0.020) compared to NBP.
Higher activity of transverse, but not straight, trunk muscles might indicate a specific compensation strategy to support trunk stability in athletes with back pain during drop jumps. Therefore, exercises favoring the transverse trunk muscles could be recommended for back pain treatment.
The purpose of this study was to investigate the effects of back extensor fatigue on performance measures and electromyographic (EMG) activity of leg and trunk muscles during jumping on stable and unstable surfaces.
Before and after a modified Biering-Sorensen fatigue protocol for the back extensors, countermovement jumps (CMJ) and lateral jumps (LJ) were performed on a force plate under stable and unstable (balance pad on the force plate) conditions. Performance measures (LJ contact time and CMJ height) and leg and trunk muscle EMG activity were tested in 14 experienced male jumpers during 2 time intervals for CMJ (braking phase, push-off phase) and 5 intervals for LJ (-30 to 0, 0-30, 30-60, 60-90, and 90-120 ms) in non-fatigued and fatigued conditions.
A significant main effect of test (fatigue) (p = 0.007, f = 0.57) was observed for CMJ height. EMG analysis showed a significant fatigue-induced decrease in biceps femoris and gastrocnemius activity with CMJ (p = 0.008, f = 0.58 and p = 0.04, f = 0.42, respectively). LJ contact time was not affected by fatigue or surface interaction. EMG activity was significantly lower in the tibialis anterior with LJ following fatigue (p = 0.05, f = 0.405). A test × surface (p = 0.04, f = 0.438) interaction revealed that the non-fatigued unstable CMJ gastrocnemius EMG activity was lower than in the non-fatigued stable condition during the onset-of-force phase.
The findings revealed that fatiguing the trunk negatively impacts CMJ height and muscle activity during the performance of CMJs. However, skilled jumpers are not additionally affected by a moderately unstable surface as compared to a stable surface.
Background: Chronic kidney disease (CKD) is a frequent comorbidity among elderly patients and those with cardiovascular disease. CKD carries prognostic relevance. We aimed to describe patient characteristics, risk factor management and control status of patients in cardiac rehabilitation (CR), differentiated by presence or absence of CKD.
Design and methods: Data from 92,071 inpatients with adequate information to calculate glomerular filtration rate (GFR) based on the Cockcroft-Gault formula were analyzed at the beginning and the end of a 3-week CR stay. CKD was defined as estimated GFR <60 ml/min/1.73 m².
Results: Compared with non-CKD patients, CKD patients were significantly older (72.0 versus 58.0 years) and more often had diabetes mellitus, arterial hypertension, and atherothrombotic manifestations (previous stroke, peripheral arterial disease), but fewer were current or previous smokers or had a CHD family history. Exercise capacity was much lower in CKD (59 vs. 92 Watts). Fewer patients with CKD were treated with percutaneous coronary intervention (PCI), but more had coronary artery bypass graft (CABG) surgery. Patients with CKD less frequently received statins, acetylsalicylic acid (ASA), clopidogrel, beta blockers, and angiotensin converting enzyme (ACE) inhibitors than non-CKD patients, and more frequently received angiotensin receptor blockers, insulin and oral anticoagulants. In CKD, mean low density lipoprotein cholesterol (LDL-C), total cholesterol, and high density lipoprotein cholesterol (HDL-C) were slightly higher at baseline, while triglycerides were substantially lower. This lipid pattern did not change at the discharge visit, but overall control rates for all described parameters (with the exception of HDL-C) improved substantially. At discharge, systolic blood pressure (BP) was higher in CKD (124 versus 121 mmHg) and diastolic BP was lower (72 versus 74 mmHg). At discharge, 68.7% of CKD versus 71.9% of non-CKD patients had LDL-C <100 mg/dl. Physical fitness on exercise testing improved substantially in both groups. When the Modification of Diet in Renal Disease (MDRD) formula was used for CKD classification, there was no clinically relevant change in these results.
Conclusion: Within a short period of 3-4 weeks, CR led to substantial improvements in key risk factors such as lipid profile, blood pressure, and physical fitness for all patients, even if CKD was present.
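The Cockcroft-Gault formula used above to define CKD estimates creatinine clearance from age, weight, and serum creatinine. A minimal sketch (the patient values are hypothetical; note that Cockcroft-Gault returns ml/min rather than the ml/min/1.73 m² of body-surface-normalized eGFR, so applying the study's threshold directly is a simplification):

```python
def cockcroft_gault(age, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (ml/min) via Cockcroft-Gault:
    ((140 - age) * weight) / (72 * serum creatinine), * 0.85 for women."""
    crcl = ((140 - age) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Hypothetical patient; the study's CKD cutoff was an estimated GFR < 60
egfr = cockcroft_gault(age=72, weight_kg=70, serum_creatinine_mg_dl=1.4,
                       female=False)
print(round(egfr, 1), "-> CKD" if egfr < 60 else "-> no CKD")
# prints: 47.2 -> CKD
```

The study's sensitivity check with the MDRD formula corresponds to swapping this estimator for a different one and re-deriving the CKD classification.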
Exploring generalisation following treatment of language deficits in aphasia can provide insights into the functional relation of the cognitive processing systems involved. In the present study, we first review treatment outcomes of interventions targeting sentence processing deficits and, second, report a treatment study examining the occurrence of practice effects and generalisation in sentence comprehension and production. In order to explore the potential linkage between processing systems involved in comprehending and producing sentences, we investigated whether improvements generalise within (i.e., uni-modal generalisation in comprehension or in production) and/or across modalities (i.e., cross-modal generalisation from comprehension to production or vice versa). Two individuals with aphasia displaying co-occurring deficits in sentence comprehension and production were trained on complex, non-canonical sentences in both modalities. Two evidence-based treatment protocols were applied in a crossover intervention study with the sequence of treatment phases being randomly allocated. Both participants benefited significantly from treatment, leading to uni-modal generalisation in both comprehension and production. However, cross-modal generalisation did not occur. The magnitude of uni-modal generalisation in sentence production was related to participants’ sentence comprehension performance prior to treatment. These findings support the assumption of modality-specific sub-systems for sentence comprehension and production, being linked uni-directionally from comprehension to production.
Background: Travel-related conditions have an impact on the quality of oral anticoagulation therapy (OAT) with vitamin K antagonists. No predictors of travel activity or of travel-associated haemorrhagic or thromboembolic complications in patients on OAT are known.
Methods: A standardised questionnaire was sent to 2500 patients on long-term OAT in Austria, Switzerland and Germany. 997 questionnaires were received (responder rate 39.9%). Ordinal or logistic regression models with travel activity before and after onset of OAT or travel-associated haemorrhages and thromboembolic complications as outcome measures were applied.
Results: 43.4% changed travel habits since the onset of OAT, with 24.9% and 18.5% reporting decreased or increased travel activity, respectively. Long-distance worldwide travel before OAT or having suffered thromboembolic complications was associated with reduced travel activity. Increased travel activity was associated with more intensive travel experience, longer duration of OAT, higher education, or performing patient self-management (PSM). Travel-associated haemorrhages or thromboembolic complications were reported by 6.5% and 0.9% of the patients, respectively. Former thromboembolic complications, former bleedings and PSM were significant predictors of travel-associated complications.
Conclusions: OAT can also increase travel intensity. Specific medical advice prior to travelling to prevent complications should be given especially to patients with former bleedings or thromboembolic complications and to those performing PSM.
Working memory (WM) performance declines with age. However, several studies have shown that WM training may lead to performance increases not only in the trained task, but also in untrained cognitive transfer tasks. It has been suggested that transfer effects occur if training task and transfer task share specific processing components that are supposedly processed in the same brain areas. In the current study, we investigated whether single-task WM training and training-related alterations in neural activity might support performance in a dual-task setting, thus assessing transfer effects to higher-order control processes in the context of dual-task coordination. A sample of older adults (age 60–72) was assigned to either a training or control group. The training group participated in 12 sessions of an adaptive n-back training. At pre- and post-measurement, all participants performed a multimodal dual-task to assess transfer effects. This task consisted of two simultaneous delayed match to sample WM tasks using two different stimulus modalities (visual and auditory) that were performed either in isolation (single-task) or in conjunction (dual-task). A subgroup also participated in functional magnetic resonance imaging (fMRI) during the performance of the n-back task before and after training. While no transfer to single-task performance was found, dual-task costs in both the visual modality (p < 0.05) and the auditory modality (p < 0.05) decreased at post-measurement in the training but not in the control group. In the fMRI subgroup of the training participants, neural activity changes in left dorsolateral prefrontal cortex (DLPFC) during one-back predicted post-training auditory dual-task costs, while neural activity changes in right DLPFC during three-back predicted visual dual-task costs. Results might indicate an improvement in central executive processing that could facilitate both WM and dual-task coordination.
Previous clinical research found that invasive vagus nerve stimulation (VNS) enhanced word recognition memory in epileptic patients, an effect assumed to be related to the activation of brainstem arousal systems. In this study, we applied non-invasive transcutaneous auricular VNS (tVNS) to replicate and extend the previous work. Using a single-blind, randomized, between-subject design, 60 healthy volunteers received active or sham stimulation during a lexical decision task, in which emotional and neutral stimuli were classified as words or non-words. In a subsequent recognition memory task (1 day after stimulation), participants' memory performance on these words and their subjective memory confidence were tested. Salivary alpha-amylase (sAA) levels, a putative indirect measure of central noradrenergic activation, were also measured before and after stimulation. During encoding, pleasant words were more accurately detected than neutral and unpleasant words. However, no tVNS effects were observed on task performance or on overall sAA level changes. tVNS also did not modulate overall recognition memory, which was particularly enhanced for pleasant emotional words. However, when hit rates were split based on confidence ratings reflecting familiarity- and recollection-based memory, higher recollection-based memory performance (irrespective of emotional category) was observed during active stimulation than during sham stimulation. To summarize, we replicated prior findings of enhanced processing and memory for emotional (pleasant) words. Whereas tVNS showed no effects on word processing, subtle effects on recollection-based memory performance emerged, which may indicate that tVNS facilitates hippocampus-mediated consolidation processes.
There is an ongoing debate about how to test and operationalize self-control. This limited understanding is in large part due to a variety of different tests and measures used to assess self-control, as well as the lack of empirical studies examining the temporal dynamics during the exertion of self-control. In order to track changes that occur over the course of exposure to a self-control task, we investigate and compare behavioral, subjective, and physiological indicators during the exertion of self-control. Participants completed both a task requiring inhibitory control (Go/No-Go task) and a control task (two-choice task). Behavioral performance and pupil size were measured during the tasks. Subjective vitality was measured before and after the tasks. While pupil size and subjective vitality showed similar trajectories in the two tasks, behavioral performance decreased in the inhibitory control-demanding task, but not in the control task. However, behavioral, subjective, and physiological measures were not significantly correlated. These results suggest that there is a disconnect between different measures of self-control with high intra- and interindividual variability. Theoretical and methodological implications for self-control theory and future empirical work are discussed.
As an emerging sub-field of music information retrieval (MIR), music imagery information retrieval (MIIR) aims to retrieve information from brain activity recorded during music cognition, such as listening to or imagining music pieces. This is a highly interdisciplinary endeavor that requires expertise in MIR as well as cognitive neuroscience and psychology. The OpenMIIR initiative strives to foster collaborations between these fields to advance the state of the art in MIIR. As a first step, electroencephalography (EEG) recordings of music perception and imagination have been made publicly available, enabling MIR researchers to easily test and adapt their existing approaches for music analysis, such as fingerprinting, beat tracking, or tempo estimation, on this new kind of data. This paper reports on first results of MIIR experiments using these OpenMIIR datasets and points out how these findings could drive new research in cognitive neuroscience.
Two experiments examined how individuals respond to a restriction presented within an approach versus an avoidance frame. In Study 1, working on a problem-solving task, participants were initially free to choose their strategy but were told to change it for a second task. The message to change was embedded in either an approach or an avoidance frame. When confronted with an avoidance rather than an approach frame, participants' reactance toward the request was greater and, in turn, led to impaired performance. The role of reactance as a response to a threat to freedom was explicitly examined in Study 2, in which participants evaluated a potential change in policy affecting their program of study; here, we explicitly varied whether a restriction was present or absent and whether the message was embedded in an approach versus an avoidance frame. When the change was communicated with an avoidance frame and as a restriction, participants showed the highest resistance in terms of reactance, message agreement, and evaluation of the communicator. The difference in agreement with the change was mediated by reactance only when a restriction was present. Overall, avoidance goal frames were associated with more resistance to change on different levels of experience (reactance, performance, and person perception), and reactance mediated the effect of goal frame on other outcomes only when a restriction was present.
It is a common finding across languages that young children have problems in understanding patient-initial sentences. We used Tagalog, a verb-initial language with a reliable voice-marking system and highly frequent patient voice constructions, to test the predictions of several accounts that have been proposed to explain this difficulty: the frequency account, the Competition Model, and the incremental processing account. Study 1 presents an analysis of Tagalog child-directed speech, which showed that the dominant argument order is agent-before-patient and that morphosyntactic markers are highly valid cues to thematic role assignment. In Study 2, we used a combined self-paced listening and picture verification task to test how Tagalog-speaking adults and 5- and 7-year-old children process reversible transitive sentences. Results showed that adults performed well in all conditions, while children’s accuracy and listening times for the first noun phrase indicated more difficulty in interpreting patient-initial sentences in the agent voice compared to the patient voice. The patient voice advantage is partly explained by both the frequency account and the incremental processing account.
Meaning-making in the brain has become one of the most intensely discussed topics in cognitive science. Traditional theories of cognition that emphasize abstract symbol manipulation often face a dead end: the symbol grounding problem. The embodiment idea tries to overcome this barrier by assuming that the mind is grounded in sensorimotor experiences. A recent surge in behavioral and brain-imaging studies has therefore focused on the role of the motor cortex in language processing. Convincing evidence has accumulated that processing concrete, action-related words relies on sensorimotor activation. Abstract concepts, however, still pose a distinct challenge for embodied theories of cognition. Fully embodied abstraction mechanisms have been formulated, but sensorimotor activation alone seems unlikely to close the explanatory gap. In this respect, ideas of integration areas, such as convergence zones or the ‘hub-and-spoke’ model, not only appear to be the most promising candidates to account for the discrepancies between concrete and abstract concepts but could also help to unite the field of cognitive science again. The current review identifies milestones in cognitive science research and recent achievements that highlight fundamental challenges, key questions, and directions for future research.
Previous studies have shown that multilingual speakers are influenced by their native (L1) and non-native (L2) grammars when learning a new language. But, so far, these studies have mostly used untimed metalinguistic tasks. Here we examine whether multilinguals’ prior grammars also affect their sensitivity to morphosyntactic constraints during processing. We use speeded judgment and self-paced reading tasks to examine the comprehension of German possessive pronouns. To investigate whether native and non-native grammars differentially affect participants’ performance, we compare two groups of non-native German speakers with inverse L1–L2 distributions: a group with L1 Spanish – L2 English, and a group with L1 English – L2 Spanish. We show that the reading profiles of both groups are modulated by their L1 grammar, with L2 proficiency selectively affecting participants’ judgment accuracy but not their reading times. We propose that reading comprehension is mainly influenced by multilinguals’ native grammar, but that knowledge of an L2 grammar can further increase sensitivity to morphosyntactic violations in an additional language.
The Role of Interoceptive Sensibility and Emotional Conceptualization for the Experience of Emotions
(2021)
The theory of constructed emotions suggests that different psychological components, including core affect (mental and neural representations of bodily changes), and conceptualization (meaning-making based on prior experiences and semantic knowledge), are involved in the formation of emotions. However, little is known about their role in experiencing emotions. In the current study, we investigated how individual differences in interoceptive sensibility and emotional conceptualization (as potential correlates of these components) interact to moderate three important aspects of emotional experiences: emotional intensity (strength of emotion felt), arousal (degree of activation), and granularity (ability to differentiate emotions with precision). To this end, participants completed a series of questionnaires assessing interoceptive sensibility and emotional conceptualization and underwent two emotion experience tasks, which included standardized material (emotion differentiation task; ED task) and self-experienced episodes (day reconstruction method; DRM). Correlational analysis showed that individual differences in interoceptive sensibility and emotional conceptualization were related to each other. Principal Component Analysis (PCA) revealed two independent factors that were referred to as sensibility and monitoring. The Sensibility factor, interpreted as beliefs about the accuracy of an individual in detecting internal physiological and emotional states, predicted higher granularity for negative words. The Monitoring factor, interpreted as the tendency to focus on the internal states of an individual, was negatively related to emotional granularity and intensity. Additionally, Sensibility scores were more strongly associated with greater well-being and adaptability measures than Monitoring scores. 
Our results indicate that independent processes underlying individual differences in interoceptive sensibility and emotional conceptualization contribute to emotion experiencing.
The processing of nonverbal auditory stimuli has not yet been sufficiently investigated in patients with aphasia. Using a duration discrimination task, we examined whether patients with left-sided cerebrovascular lesions were able to perceive time differences on the scale of approximately 150 ms. Further linguistic and memory-related tasks were used to characterize more precisely the relationships between performance on the auditory nonverbal task and selective linguistic or mnemonic disturbances. All examined conduction aphasics showed increased thresholds in the duration discrimination task. These elevated thresholds were strongly correlated with reduced performance on the repetition and working memory tasks. This was interpreted as an indication of a pronounced disturbance in integrating auditory verbal information into a long-term window (sampling disturbance), resulting in an additional load on working memory. In order to determine the lesion topography of patients with sampling disturbances, the anatomical and psychophysical data were correlated on the basis of a voxelwise statistical approach. It was found that tissue damage extending through the insula, the posterior superior temporal gyrus, and the supramarginal gyrus causes impairments in the sequencing of time-sensitive information.
This paper reports the results of a production experiment that explores the prosodic realization of focus in Hungarian, a language that is characterized by obligatory syntactic focus marking. Our study investigates narrow focus in sentences in which focus is unambiguously marked by syntactic means, comparing it to broad focus sentences. Potential independent effects of the salience (textual givenness) of the background of the narrow focus and the contrastiveness of the focus are controlled for and are also examined.
The results show that both continuous phonetic measures and categorical factors such as the distribution of contour types are affected by the focus-related factors, despite the presence of syntactic focus marking. The phonetic effects found are mostly parallel to those of typical prosodic focus marking languages like English. The prosodic prominence required of focus is realized through changes to the scaling and slope of F0 targets and contours. The asymmetric prominence relation between the focus and the background can be expressed not only by the phonetic marking of the prominence of the focused element, but also by the phonetic marking of the reduced prominence of the background. Furthermore, contrastiveness of focus and (textual) givenness of the background show independent phonetic effects, both of them affecting the realization of the background. These results are argued to shed light on alternative approaches to the information structural notion of contrastive focus and the relation between the notions of focus and givenness. (C) 2014 Elsevier B.V. All rights reserved.
Background
High prevalence rates have been reported for physical inactivity, mobility limitations, and falls in older adults. Home-based exercise might be an adequate means to increase physical activity by improving health- (i.e., muscle strength) and skill-related components of physical fitness (i.e., balance), particularly in times of restricted physical activity due to pandemics.
Objective
The objective of this study was to examine the effects of home-based balance exercises conducted during daily tooth brushing on measures of balance and muscle strength in healthy older adults.
Methods
Fifty-one older adults were randomly assigned to a balance exercise group (n = 27; age: 65.1 ± 1.1 years) or a passive control group (n = 24; age: 66.2 ± 3.3 years). The intervention group conducted balance exercises over a period of eight weeks twice daily for three minutes each during their daily tooth brushing routine. Pre- and post-intervention, tests were included for the assessment of static steady-state balance (i.e., Romberg test), dynamic steady-state balance (i.e., 10-m single and dual-task walk test using a cognitive and motor interference task), proactive balance (i.e., Timed-Up-and-Go Test [TUG], Functional-Reach-Test [FRT]), and muscle strength (i.e., Chair-Rise-Test [CRT]).
Results
Irrespective of group, the statistical analysis revealed significant main effects for time (pre vs. post) for dual-task gait speed (p < .001, 1.12 ≤ d ≤ 2.65), TUG (p < .001, d = 1.17), FRT (p = .002, d = 0.92), and CRT (p = .002, d = 0.94) but not for single-task gait speed and for the Romberg-Test. No significant group × time interactions were found for any of the investigated variables.
Conclusions
The applied lifestyle balance training program conducted twice daily during tooth brushing routines appears not to be sufficient in terms of exercise dosage and difficulty level to enhance balance and muscle strength in healthy adults aged 60–72 years. Consequently, structured balance training programs using higher exercise dosages and/or more difficult balance tasks are recommended for older adults to improve balance and muscle strength.
The predictions of two contrasting approaches to the acquisition of transitive relative clauses were tested within the same groups of German-speaking participants aged from 3 to 5 years old. The input frequency approach predicts that object relative clauses with inanimate heads (e.g., the pullover that the man is scratching) are comprehended earlier and more accurately than those with an animate head (e.g., the man that the boy is scratching). In contrast, the structural intervention approach predicts that object relative clauses with two full NP arguments mismatching in number (e.g., the man that the boys are scratching) are comprehended earlier and more accurately than those with number-matching NPs (e.g., the man that the boy is scratching). These approaches were tested in two steps. First, we ran a corpus analysis to ensure that object relative clauses with number-mismatching NPs are not more frequent than object relative clauses with number-matching NPs in child-directed speech. Next, the comprehension of these structures was tested experimentally in 3-, 4-, and 5-year-olds by means of a color naming task. By comparing the predictions of the two approaches within the same participant groups, we were able to uncover that the effects predicted by the input frequency and the structural intervention approaches co-exist and that both influence children's performance on transitive relative clauses, but in a manner that is modulated by age. These results reveal that sensitivity to animacy mismatch is already demonstrated by 3-year-olds and show that animacy is initially deployed more reliably than number to interpret relative clauses correctly. In all age groups, the animacy mismatch appears to explain children's performance, thus showing that the comprehension of frequent object relative clauses is enhanced compared to the other conditions.
Starting with 4-year-olds but especially in 5-year-olds, the number mismatch supported comprehension—a facilitation that is unlikely to be driven by input frequency. Once children fine-tune their sensitivity to verb agreement information around the age of four, they are also able to deploy number marking to overcome the intervention effects. This study highlights the importance of testing experimentally contrasting theoretical approaches in order to characterize the multifaceted, developmental nature of language acquisition.
Background:
Several standards have been developed to assess the methodological quality of systematic reviews (SR). One widely used tool is the AMSTAR. A recent update, AMSTAR 2, is a 16-item evaluation tool that enables a detailed assessment of SR that include randomised controlled trials (RCT) or non-randomised studies (NRS) of healthcare interventions.
Methods:
A cross-sectional study of SR on pharmacological or psychological interventions in major depression in adults was conducted. SR published during 2012-2017 were sampled from MEDLINE, EMBASE and the Cochrane Database of SR. Methodological quality was assessed using AMSTAR 2. Potential predictive factors associated with quality were examined.
Results:
In rating overall confidence in the results of the 60 SR, four reviews were rated "high", two "moderate", one "low", and 53 "critically low". The mean AMSTAR 2 percentage score was 45.3% (standard deviation 22.6%), with a wide range from 7.1% to 93.8%. Predictors of higher quality were: type of review (higher quality in Cochrane Reviews), inclusion of only randomized trials, and higher journal impact factor.
Limitations:
AMSTAR 2 is not intended to be used for the generation of a percentage score.
Conclusions:
According to AMSTAR 2 the overall methodological quality of SR on the treatment of adult major depression needs improvement. Although there is a high need for summarized information in the field of mental health, this work demonstrates the need to critically assess SR before using their findings. Better adherence to established reporting guidelines for SR is needed.
Research within the framework of Basic Psychological Need Theory (BPNT) finds strong associations between basic need frustration and depressive symptoms. This study examined the role of rumination as an underlying mechanism in the association between basic psychological need frustration and depressive symptoms. A cross-sectional sample of N = 221 adults (55.2% female, mean age = 27.95, range = 18–62, SD = 10.51) completed measures assessing their level of basic psychological need frustration, rumination, and depressive symptoms. Correlational analyses and multiple mediation models were conducted. Brooding partially mediated the relation between need frustration and depressive symptoms. BPNT and Response Styles Theory are compatible and can further advance knowledge about depression vulnerabilities.
Background
Due to inconclusive evidence on the effects of foot orthoses treatment on lower limb kinematics and kinetics in children, studies are needed that particularly evaluate the long-term use of foot orthoses on lower limb alignment during walking. Thus, the main objective of this study was to evaluate the effects of long-term treatment with arch support foot orthoses versus a sham condition on lower extremity kinematics and kinetics during walking in children with flexible flat feet.
Methods
Thirty boys aged 8–12 years with flexible flat feet participated in this study. While the experimental group (n = 15) used medial arch support foot orthoses during everyday activities over a period of four months, the control group (n = 15) received flat 2-mm-thick insoles (i.e., sham condition) for the same time period. Before and after the intervention period, walking kinematics and ground reaction forces were collected.
Results
Significant group by time interactions were observed during walking at preferred gait speed for maximum ankle eversion, maximum ankle internal rotation angle, minimum knee abduction angle, maximum knee abduction angle, maximum knee external rotation angle, maximum knee internal rotation angle, maximum hip extension angle, and maximum hip external rotation angle in favor of the foot orthoses group. In addition, statistically significant group by time interactions were detected for maximum posterior and vertical ground reaction forces in favor of the foot orthoses group.
Conclusions
The long-term use of arch support foot orthoses proved to be feasible and effective in boys with flexible flat feet to improve lower limb alignment during walking.
Background: The regular assessment of hormonal and mood state parameters in professional soccer are proposed as good indicators during periods of intense training and/or competition to avoid overtraining.
Objective: The aim of this study was to analyze hormonal, psychological, workload and physical fitness parameters in elite soccer players in relation to changes in training and match exposure during a congested period of match play.
Methods: Sixteen elite soccer players from a team playing in the first Tunisian soccer league were evaluated three times (T1, T2, and T3) over 12 weeks. The non-congested period of match play was from T1 to T2, when the players played 6 games over 6 weeks. The congested period was from T2 to T3, when the players played 10 games over 6 weeks. From T1 to T3, players performed the Yo-Yo intermittent recovery test level 1 (YYIR1), the repeated shuttle sprint ability test (RSSA), the countermovement jump test (CMJ), and the squat jump test (SJ). Plasma Cortisol (C), Testosterone (T), and the T/C ratio were analyzed at T1, T2, and T3. Players had their mood dimensions (tension, depression, anger, vigor, fatigue, confusion, and a Total Mood Disturbance) assessed through the Profile of Mood State questionnaire (POMS). Training session rating of perceived exertion (sRPE) was also recorded on a daily basis in order to quantify internal training load and elements of monotony and strain.
Results: Significant performance declines (T1 < T2 < T3) were found for SJ performance (p = 0.04, effect size [ES] ES₁₋₂ = 0.15−0.06, ES₂₋₃ = 0.24) from T1 to T3. YYIR1 performance improved significantly from T1 to T2 and declined significantly from T2 to T3 (p = 0.001, ES₁₋₂ = 0.24, ES₂₋₃ = −2.54). Mean RSSA performance was significantly higher (p = 0.019, ES₁₋₂ = −0.47, ES₂₋₃ = 1.15) in T3 compared with T2 and T1. Best RSSA performance was significantly higher in T3 when compared with T2 and T1 (p = 0.006, ES₂₋₃ = 0.47, ES₁₋₂ = −0.56), but significantly lower in T2 when compared with T1. T and T/C were significantly lower in T3 when compared with T2 and T1 (T: p = 0.03, ES₃₋₂ = −0.51, ES₃₋₁ = −0.51; T/C: p = 0.017, ES₃₋₂ = −1.1, ES₃₋₁ = −1.07). Significant decreases were found for the vigor scores in T3 when compared to T2 and T1 (p = 0.002, ES₁₋₂ = 0.31, ES₃₋₂ = −1.25). A significant increase was found in fatigue scores in T3 as compared to T1 and T2 (p = 0.002, ES₁₋₂ = 0.43, ES₂₋₃ = 0.81). A significant increase was found from T1 < T2 < T3 in tension score (p = 0.002, ES₁₋₂ = 1.1, ES₂₋₃ = 0.2) and anger score (p = 0.03, ES₁₋₂ = 0.47, ES₂₋₃ = 0.33) over the study period. Total mood disturbance increased significantly (p = 0.02, ES₁₋₂ = 0.91, ES₂₋₃ = 1.1) from T1 to T3. Between T1 and T2, significant relationships were observed between workload and changes in T (r = 0.66, p = 0.003) and the T/C ratio (r = 0.62, p = 0.01). There were significant relationships between performance in RSSAbest and training load parameters (workload: r = 0.52, p = 0.03; monotony: r = 0.62, p = 0.01; strain: r = 0.62, p = 0.009). Between T2 and T3, there was a significant relationship between Δ% of total mood disturbance and Δ% of YYIR1 (r = −0.54, p = 0.04), RSSAbest (r = 0.58, p = 0.01), SJ (r = −0.55, p = 0.01), T (r = 0.53, p = 0.03), and T/C (r = 0.5, p = 0.04).
Conclusion: An intensive period of congested match play significantly compromised elite soccer players’ physical and mental fitness. These changes were related to psychological but not hormonal parameters, even though significant alterations were detected for selected hormonal measures. Mood monitoring could be a simple and useful tool to determine the degree of preparedness for match play during a congested period in professional soccer.
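The between-timepoint effect sizes reported above (ES₁₋₂, ES₂₋₃, etc.) are Cohen's d values. As an illustrative sketch only, with made-up jump heights rather than the study's data, such a standardized mean difference can be computed from two sets of scores as follows:

```python
import math

def cohens_d(x, y):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    # Sample variances (denominator n - 1)
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

# Illustrative squat-jump heights (cm) at two time points -- NOT the study's data
t1 = [38.1, 40.2, 36.5, 39.0, 37.7]
t2 = [37.2, 39.1, 36.0, 38.2, 36.9]
print(round(cohens_d(t1, t2), 2))
```

Signed values (as in the abstract) simply reflect the direction of the mean difference between the two time points being compared.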
Background: There is evidence that fully recovered COVID-19 patients usually resume physical exercise, but do not perform at the same intensity level performed prior to infection. The aim of this study was to evaluate the impact of COVID-19 infection and recovery as well as muscle fatigue on cardiorespiratory fitness and running biomechanics in female recreational runners.
Methods: Twenty-eight females were divided into a group of hospitalized and recovered COVID-19 patients (COV, n = 14, at least 14 days following recovery) and a group of healthy age-matched controls (CTR, n = 14). Ground reaction forces from stepping on a force plate during barefoot overground running at 3.3 m/s were measured before and after a fatiguing protocol. The fatigue protocol consisted of incrementally increasing running speed until reaching a score of 13 on the 6–20 Borg scale, followed by steady-state running until exhaustion. The effects of group and fatigue were assessed for steady-state running duration, steady-state running speed, ground contact time, vertical instantaneous loading rate, and peak propulsion force.
Results: COV runners completed only 56% of the running time achieved by the CTR (p < 0.0001), and at a 26% slower steady-state running speed (p < 0.0001). There were fatigue-related reductions in loading rate (p = 0.004) without group differences. Increased ground contact time (p = 0.002) and reduced peak propulsion force (p = 0.005) were found for COV when compared to CTR.
Conclusion: Our results suggest that female runners who recovered from COVID-19 showed compromised running endurance and altered running kinetics in the form of longer stance periods and weaker propulsion forces. More research is needed in this area using larger sample sizes to confirm our study findings.
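The vertical instantaneous loading rate used as an outcome above is commonly computed as the peak of the sample-to-sample slope of the vertical ground reaction force during early stance. A minimal sketch under that assumption, using a few synthetic force samples rather than the study's data:

```python
def peak_loading_rate(vgrf, sample_rate_hz):
    """Peak vertical instantaneous loading rate (N/s):
    maximum finite-difference slope between consecutive GRF samples."""
    slopes = [(b - a) * sample_rate_hz for a, b in zip(vgrf, vgrf[1:])]
    return max(slopes)

# Synthetic vertical GRF samples (N) at 1000 Hz during early stance -- illustrative only
vgrf = [0, 120, 310, 560, 790, 940, 1010, 1050]
print(peak_loading_rate(vgrf, 1000))
```

In practice the force signal would be filtered and the analysis windowed to the impact phase; this sketch only shows the finite-difference idea behind the measure.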
Objective: To determine the effects of low- vs. high-intensity aerobic and resistance training on motor and cognitive function, brain activation, brain structure, and neurochemical markers of neuroplasticity, and the associations thereof, in healthy young and older adults and in patients with multiple sclerosis, Parkinson's disease, and stroke. Design: Systematic review and robust variance estimation meta-analysis with meta-regression. Data sources: Systematic search of MEDLINE, Web of Science, and CINAHL databases. Results: Fifty studies with 60 intervention arms and 2283 analyzed participants were included. Due to the low number of studies, the three patient groups were combined and analyzed as a single group. Overall, low-intensity (g = 0.19, p = 0.024) and high-intensity exercise (g = 0.40, p = 0.001) improved neuroplasticity. Exercise intensity scaled with neuroplasticity only in healthy young adults but not in healthy older adults or patient groups. Exercise-induced improvements in neuroplasticity were associated with changes in motor but not cognitive outcomes. Conclusion: Exercise intensity is an important variable to dose and individualize the exercise stimulus for healthy young individuals but not necessarily for healthy older adults and neurological patients. This conclusion warrants caution because studies are needed that directly compare the effects of low- vs. high-intensity exercise on neuroplasticity to determine whether such changes are mechanistically and incrementally linked to improved cognition and motor function.
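The pooled effect sizes (g) reported in this meta-analysis follow the Hedges' g convention, i.e., Cohen's d scaled by a small-sample bias correction. A hedged sketch of that conversion with purely illustrative numbers (not values from the review):

```python
def hedges_g(d, n1, n2):
    """Hedges' g: Cohen's d multiplied by the small-sample correction
    J ~ 1 - 3 / (4 * df - 1), where df = n1 + n2 - 2."""
    df = n1 + n2 - 2
    j = 1.0 - 3.0 / (4.0 * df - 1.0)
    return d * j

# Illustrative: d = 0.42 from two groups of 15 participants each
print(round(hedges_g(0.42, 15, 15), 3))
```

The correction shrinks d slightly, which matters most for the small intervention arms typical of exercise-neuroplasticity trials; with large samples, g and d are nearly identical.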
The Human Takes It All
(2020)
Background: The increasing involvement of social robots in human lives raises the question as to how humans perceive social robots. Little is known about human perception of synthesized voices.
Aim: To investigate which synthesized voice parameters predict the speaker's eeriness and voice likability; to determine if individual listener characteristics (e.g., personality, attitude toward robots, age) influence synthesized voice evaluations; and to explore which paralinguistic features subjectively distinguish humans from robots/artificial agents.
Methods: 95 adults (62 females) listened to randomly presented audio-clips of three categories: synthesized (Watson, IBM), humanoid (robot Sophia, Hanson Robotics), and human voices (five clips/category). Voices were rated on intelligibility, prosody, trustworthiness, confidence, enthusiasm, pleasantness, human-likeness, likability, and naturalness. Speakers were rated on appeal, credibility, human-likeness, and eeriness. Participants' personality traits, attitudes to robots, and demographics were obtained.
Results: The human voice and human speaker characteristics received reliably higher scores on all dimensions except for eeriness. Synthesized voice ratings were positively related to participants' agreeableness and neuroticism. Females rated synthesized voices more positively on most dimensions. Surprisingly, interest in social robots and attitudes toward robots played almost no role in voice evaluation. Contrary to the expectations of an uncanny valley, when the ratings of human-likeness for both the voice and the speaker characteristics were higher, they seemed less eerie to the participants. Moreover, when the speaker's voice was more humanlike, it was more liked by the participants. This latter point was only applicable to one of the synthesized voices. Finally, pleasantness and trustworthiness of the synthesized voice predicted the likability of the speaker's voice. Qualitative content analysis identified intonation, sound, emotion, and imageability/embodiment as diagnostic features.
Discussion: Humans clearly prefer human voices, but manipulating diagnostic speech features might increase acceptance of synthesized voices and thereby support human-robot interaction. There is limited evidence that human-likeness of a voice is negatively linked to the perceived eeriness of the speaker.
Background: The standard method to treat physically active patients with anterior cruciate ligament (ACL) rupture is ligament reconstruction surgery. The rehabilitation training program is very important to improve functional performance in recreational athletes following ACL reconstruction.
Objectives: The aims of this study were to compare the effects of three different training programs, eccentric training (ECC), plyometric training (PLYO), or combined eccentric and plyometric training (COMB), on dynamic balance (Y-BAL), the Lysholm Knee Scale (LKS), the return to sport index (RSI), and the leg symmetry index (LSI) for the single leg hop test for distance in elite female athletes after ACL surgery.
Materials and Methods: Fourteen weeks after rehabilitation from surgery, 40 elite female athletes (20.3 ± 3.2 years), who had undergone an ACL reconstruction, participated in a short-term (6 weeks; twice a week) training study. All participants received the same rehabilitation protocol prior to the training study. Athletes were randomly assigned to three experimental groups, ECC (n = 10), PLYO (n = 10), and COMB (n = 10), and to a control group (CON: n = 10). Testing was conducted before and after the 6-week training programs and included the Y-BAL, LKS, and RSI. LSI was assessed after the 6-week training programs only.
Results: Adherence rate was 100% across all groups and no training or test-related injuries were reported. No significant between-group baseline differences (pre-6-week training) were observed for any of the parameters. Significant group-by-time interactions were found for Y-BAL (p < 0.001, ES = 1.73), LKS (p < 0.001, ES = 0.76), and RSI (p < 0.001, ES = 1.39). Contrast analysis demonstrated that COMB yielded significantly greater improvements in Y-BAL, LKS, and RSI (all p < 0.001), in addition to significantly better performances in LSI (all p < 0.001), than CON, PLYO, and ECC, respectively.
Conclusion: Combined (eccentric/plyometric) training seems to represent the most effective training method, as it exerts positive effects on both stability and functional performance in the post-ACL-surgical rehabilitation period of elite female athletes.
Combining training of muscle strength and cardiorespiratory fitness within a training cycle could increase athletic performance more than single-mode training. However, the physiological effects produced by each training modality could also interfere with each other, improving athletic performance less than single-mode training. Because anthropometric, physiological, and biomechanical differences between young and adult athletes can affect the responses to exercise training, young athletes might respond differently to concurrent training (CT) compared with adults. Thus, the aim of the present systematic review with meta-analysis was to determine the effects of concurrent strength and endurance training on selected physical fitness components and athletic performance in youth. A systematic literature search of PubMed and Web of Science identified 886 records. The studies included in the analyses examined children (girls age 6–11 years, boys age 6–13 years) or adolescents (girls age 12–18 years, boys age 14–18 years), compared CT with single-mode endurance (ET) or strength training (ST), and reported at least one strength/power-related (e.g., jump height), endurance-related (e.g., peak VO2, exercise economy), or performance-related (e.g., time trial) outcome. We calculated weighted standardized mean differences (SMDs). CT compared to ET produced small effects in favor of CT on athletic performance (n = 11 studies, SMD = 0.41, p = 0.04) and trivial effects on cardiorespiratory endurance (n = 4 studies, SMD = 0.04, p = 0.86) and exercise economy (n = 5 studies, SMD = 0.16, p = 0.49) in young athletes. A sub-analysis of chronological age revealed a trend toward larger effects of CT vs. ET on athletic performance in adolescents (SMD = 0.52) compared with children (SMD = 0.17). CT compared with ST had small effects in favor of CT on muscle power (n = 4 studies, SMD = 0.23, p = 0.04).
In conclusion, CT is more effective than single-mode ET or ST in improving selected measures of physical fitness and athletic performance in youth. Specifically, CT compared with ET improved athletic performance in children and particularly adolescents. Finally, CT was more effective than ST in improving muscle power in youth.
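The weighted SMDs pooled in meta-analyses like this one are typically inverse-variance weighted standardized mean differences with a small-sample (Hedges) correction. A minimal illustrative sketch of that computation (the function names and the fixed-effect weighting are our own assumptions, not the authors' actual analysis code):

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    j = 1 - 3 / (4 * (n_t + n_c) - 9)  # bias-correction factor J
    return d * j

def pooled_smd(studies):
    """Fixed-effect inverse-variance pooling of per-study SMDs.
    `studies` is a list of (g, n_t, n_c) tuples."""
    num = den = 0.0
    for g, n_t, n_c in studies:
        # Approximate sampling variance of g
        var = (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))
        num += g / var
        den += 1 / var
    return num / den
```

A full replication would additionally model between-study heterogeneity (random-effects weighting), which this fixed-effect sketch omits for brevity.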
Background: Telerehabilitation can contribute to the maintenance of successful rehabilitation regardless of location and time.
Objective: The aim of the study was to investigate a specific three-month interactive telerehabilitation with regard to effectiveness in functioning and return to work compared to usual aftercare.
Methods: From August 2016 to December 2017, 111 patients (mean 54.9 years old; SD 6.8; 54.3% female) with hip or knee replacement were enrolled in the randomized controlled trial. At discharge from inpatient rehabilitation and after three months, their distance in the 6-minute walk test was assessed as the primary endpoint. Other functional parameters, including health related quality of life, pain, and time to return to work, were secondary endpoints.
Results: Patients in the intervention group performed telerehabilitation for an average of 55.0 minutes (SD 9.2) per week. Adherence was high, at over 75%, until the 7th week of the three-month intervention phase. Almost all the patients and therapists used the communication options. Both the intervention group (average difference 88.3 m; SD 57.7; P=.95) and the control group (average difference 79.6 m; SD 48.7; P=.95) increased their distance in the 6-minute walk test. Improvements in other functional parameters, as well as in quality of life and pain, were achieved in both groups. Notably, the proportion of patients who had returned to work was higher in the intervention group (64.6%) than in the control group (46.2%; P=.01).
Conclusions: The effect of the investigated telerehabilitation therapy in patients following knee or hip replacement was equivalent to the usual aftercare in terms of functional testing, quality of life, and pain. Since a significantly higher return-to-work rate could be achieved, this therapy might be a promising supplement to established aftercare.
Objective: There is a lack of brief rating scales for the reliable assessment of psychotherapeutic skills, which do not require intensive rater training and/or a high level of expertise. Thus, the objective is to validate a 14-item version of the Clinical Communication Skills Scale (CCSS-S).
Methods: Using a sample of N = 690 video-based ratings of role-plays with simulated patients, we calculated a confirmatory factor analysis and an exploratory structural equation modeling (ESEM), assessed convergent validities, determined inter-rater reliabilities, and compared these across raters who were either psychology students, advanced psychotherapy trainees, or experts.
Results: Correlations with other competence rating scales were high (rs = 0.86–0.89). The intraclass correlations ranged between moderate and good [ICC(2,2) = 0.65–0.80], with student raters yielding the lowest scores. The one-factor model only marginally fit the data, but the internal consistencies were excellent (α = 0.91–0.95). The ESEM yielded a two-factor solution (Collaboration and Structuring and Exploration Skills).
Conclusion: The CCSS-S is a brief and valid rating scale that reliably assesses basic communication skills, which is particularly useful for psychotherapy training using standardized role-plays. To ensure good inter-rater reliabilities, it is still advisable to employ raters with at least some clinical experience. Future studies should further investigate the one- or two-factor structure of the instrument.
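The ICC(2,2) reported above corresponds to a two-way random-effects, absolute-agreement model with averaged raters. A minimal sketch of how such an ICC can be derived from the ANOVA mean squares of a subjects-by-raters ratings matrix (illustrative only, not the authors' code):

```python
def icc_2k(ratings):
    """ICC(2,k): two-way random effects, absolute agreement, average of k raters.
    `ratings` is a list of subject rows, each holding k rater scores."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    # Two-way ANOVA decomposition into subject, rater, and error mean squares
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)
```

With two raters in perfect agreement the ICC is 1.0; a constant offset between raters lowers it, because absolute agreement (not mere consistency) is required.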
Conceptualisation is the first step of speech production and describes the process by which we map our thoughts onto spoken language. Recent studies suggest that some people with language impairments have conceptualisation deficits manifested by information selection and sequencing difficulties. In this study, we examined conceptualisation in the complex picture descriptions of individuals with and without aphasia. We analysed the number and the order of main concepts (ideas produced by >= 60% of unimpaired speakers) and non-main concepts (e.g. irrelevant details). Half of the individuals with aphasia showed a reduced number of main concepts that could not be fully accounted for by their language production deficits. Moreover, individuals with aphasia produced a larger amount of marginally relevant information and showed greater variability in the order of main concepts. Both findings support the idea that conceptualisation deficits are a relatively common impairment in people with aphasia.
Background: As the prevalence of cardiac disease has continuously increased in recent years, so has cardiac treatment, especially cardiac catheterization. The procedure of a cardiac catheterization is challenging for both patients and practitioners. Several potential stressors of psychological or physical nature can occur during the procedure. The objective of the study is to develop and implement a stress management intervention for both practitioners and patients that aims to reduce the psychological and physical strain of a cardiac catheterization.
Methods: The clinical study (DRKS00026624) includes two randomized controlled intervention trials with parallel groups, for patients with elective cardiac catheterization and practitioners at the catheterization lab, in two clinic sites of the Ernst-von-Bergmann clinic network in Brandenburg, Germany. Both groups received different interventions for stress management. The intervention for patients comprises a psychoeducational video presenting different stress management techniques, plus standardized medical information about the cardiac catheterization examination. The control condition comprises the medical patient education practiced in hospitals before the examination (usual care). Primary and secondary outcomes are measured by physiological parameters and validated questionnaires, the day before (M1) and after (M2) the cardiac catheterization and at a postal follow-up 6 months later (M3). It is expected that patients receiving standardized information and psychoeducation show fewer complications during cardiac catheterization procedures, better pre- and post-operative wellbeing, regeneration, and mood, and lower stress levels over time. The intervention for practitioners includes an 8-week mindfulness-based stress reduction (MBSR) program supervised by an experienced MBSR practitioner directly at the clinic site, and an operative guideline. It is expected that practitioners receiving the intervention show improved perceived and chronic stress, occupational health, physical and mental function, a higher effort-reward balance, regeneration, and quality of life. Primary and secondary outcomes are measured by physiological parameters (heart rate variability, saliva cortisol) and validated questionnaires and will be assessed before (M1) and after (M2) the MBSR intervention and at a postal follow-up 6 months later (M3). Physiological biomarkers in practitioners will be assessed before (M1) and after the intervention (M2) on two working days and two days off.
Intervention effects in both groups (practitioners and patients) will be evaluated separately using multivariate variance analysis.
Discussion: This study evaluates the effectiveness of two stress management intervention programs for patients and practitioners in the cardiac catheterization laboratory. The study will disclose strains during a cardiac catheterization affecting both patients and practitioners. For practitioners, it may contribute to improved working conditions and occupational safety, preservation of earning capacity, and avoidance of participation restrictions and loss of performance. In both groups, less anxiety and stress and fewer complications before and during the procedures can be expected. The study may add knowledge on how to eliminate stressful exposures and contribute to more (psychological) security, fewer output losses, and less exhaustion during work. The resulting stress management guidelines, training manuals, and standardized patient education should be transferred into clinical routines.
Decades of research have demonstrated that physical stress (PS) stimulates bone remodeling and affects bone structure and function through complex mechanotransduction mechanisms. Recent research has laid ground to the hypothesis that mental stress (MS) also influences bone biology, eventually leading to osteoporosis and increased bone fracture risk. These effects are likely exerted by modulation of hypothalamic–pituitary–adrenal axis activity, resulting in an altered release of growth hormones, glucocorticoids and cytokines, as demonstrated in human and animal studies. Furthermore, molecular cross talk between mental and PS is thought to exist, with either synergistic or preventative effects on bone disease progression depending on the characteristics of the applied stressor. This mini review will explain the emerging concept of MS as an important player in bone adaptation and its potential cross talk with PS by summarizing the current state of knowledge, highlighting newly evolving notions (such as intergenerational transmission of stress and its epigenetic modifications affecting bone) and proposing new research directions.
Background
Psychotherapy is highly effective and widely acknowledged for treating various mental disorders. Nevertheless, methods for teaching effective psychotherapeutic approaches and competencies have received little investigation. Training and supervision are the main strategies for teaching therapist competencies, and standardized role-plays with simulated patients (i.e., trained individuals playing someone with a mental disorder) seem useful for evaluating training approaches. In medical education, this procedure is now internationally established. However, so far, little use has been made of standardized role-plays to evaluate training and supervision in the area of clinical psychology and psychotherapy.
Methods
In this study, standardized role-plays are used to evaluate methods for training and supervision. Central cognitive behavioral approaches for treating depression are taught in the training. The first experiment compares an active training approach (i.e., model learning) with a passive one (i.e., reading manual-based instructions). The second experiment compares a direct supervision technique (i.e., supervision based on video analysis) with an indirect one (i.e., supervision based on verbal reporting). In each experiment, 68 bachelor’s and master’s students of psychology will be randomly assigned to the experimental and control groups. Each student takes part in three role-plays (baseline, post and 3-month follow-up), which are all videotaped. Two independent raters assess therapist competencies in each role-play on the basis of a standardized competence scale.
Discussion
The research project aims to contribute to the development of specific training and supervision methods in order to improve psychotherapy training and patient care.
Sprint and jump performances in highly trained young soccer players of different chronological age
(2020)
Objective
The aim of this study was to examine the effects of two different sprint-training regimes on sprint and jump performances according to age in elite young male soccer players over the course of one soccer season.
Methods
Players were randomly assigned to two training groups. Group 1 performed systematic change-of-direction sprints (CODST, U19 [n = 9], U17 [n = 9], U15 [n = 10]) while group 2 conducted systematic linear sprints (LST, U19 [n = 9], U17 [n = 9], U15 [n = 9]). Training volumes were similar between groups (40 sprints per week x 30 weeks = 1200 sprints per season). Pre and post training, all players performed tests for the assessment of linear and slalom sprint speed (5-m and 10-m), countermovement jump, and maximal aerobic speed performance.
Results
For all physical fitness measures, the baseline-adjusted means data (ANCOVA) across the age groups showed no significant differences between LST and CODST at post (0.061 < p < 0.995; 0.0017 < d < 1.01). The analyses of baseline-adjusted means for all physical fitness measures for U15, U17, and U19 (LST vs. CODST) revealed no significant differences between LST and CODST for U15 (0.213 < p < 0.917; 0.001 < d < 0.087), U17 (0.132 < p < 0.976; 0.001 < d < 0.310), and U19 (0.300 < p < 0.999; 0.001 < d < 0.049) at post.
Conclusions
The results from this study showed that both LST and CODST induced significant changes in the sprint, lower limb power, and aerobic performances of young elite soccer players. Since no significant differences were observed between LST and CODST, the changes are most likely due to training and/or maturation. Therefore, more research is needed to elucidate whether CODST, LST, or a combination of both is beneficial for youth soccer athletes' performance development.
In numerical processing, the functional role of Spatial-Numerical Associations (SNAs, such as the association of smaller numbers with left space and larger numbers with right space, the Mental Number Line hypothesis) is debated. Most studies demonstrate SNAs with lateralized responses, and there is little evidence that SNAs appear when no response is required. We recorded passive holding grip forces in no-go trials during number processing. In Experiment 1, participants performed a surface numerical decision task (“Is it a number or a letter?”). In Experiment 2, we used a deeper semantic task (“Is this number larger or smaller than five?”). Despite instruction to keep their grip force constant, participants' spontaneous grip force changed in both experiments: Smaller numbers led to larger force increase in the left than in the right hand in the numerical decision task (500–700 ms after stimulus onset). In the semantic task, smaller numbers again led to larger force increase in the left hand, and larger numbers increased the right-hand holding force. This effect appeared earlier (180 ms) and lasted longer (until 580 ms after stimulus onset). This is the first demonstration of SNAs with passive holding force. Our result suggests that (1) explicit motor response is not a prerequisite for SNAs to appear, and (2) the timing and strength of SNAs are task-dependent.
Introduction: The body-specificity hypothesis states that in right-handers, positive concepts should be associated with the right side and negative concepts with the left side of the body. Following this hypothesis, our study postulated that negative out-group ethnic stereotypes would be associated with the left side, and positive in-group stereotypes would be associated with the right side.
Methods: The experiment consisted of two parts. First, we measured the spatial mapping of ethnic stereotypes by using a sensibility judgment task, in which participants had to decide whether a sentence was sensible or not by pressing either a left or a right key. The sentences included German vs. Arabic proper names. Second, we measured implicit ethnic stereotypes in the same participants using the Go/No-go Association Task (GNAT), in which Arabic vs. German proper names were presented in combination with positive vs. negative adjectives. Right-handed German native speakers (N = 92) participated in an online study.
Results: As predicted, in the GNAT, participants reacted faster to German names combined with positive adjectives and to Arabic names combined with negative adjectives, which is diagnostic of existing valenced in- and out-group ethnic stereotypes. However, we failed to find any reliable effects in the sensibility judgment task, i.e., there was no evidence of spatial mapping of positive and negative ethnic stereotypes. There was no correlation between the results of the two tasks at the individual level. Further Bayesian analysis and exploratory analysis in the left-handed subsample (N = 9) corroborated the evidence in favor of null results.
Discussion: Our study suggests that ethnic stereotypes are not automatically mapped in a body-specific manner.
We quantified the acute and chronic effects of whole body vibration on athletic performance or its proxy measures in competitive and/or elite athletes.
Systematic literature review and meta-analysis.
Whole body vibration combined with exercise had an overall 0.3% acute effect on maximal voluntary leg force (-6.4%, effect size = -0.43, 1 study), leg power (4.7%, weighted mean effect size = 0.30, 6 studies), flexibility (4.6%, effect size = -0.12 to 0.22, 2 studies), and athletic performance (-1.9%, weighted mean effect size = 0.26, 6 studies) in 191 (103 male, 88 female) athletes representing eight sports (overall effect size = 0.28). Whole body vibration combined with exercise had an overall 10.2% chronic effect on maximal voluntary leg force (14.6%, weighted mean effect size = 0.44, 5 studies), leg power (10.7%, weighted mean effect size = 0.42, 9 studies), flexibility (16.5%, effect size = 0.57 to 0.61, 2 studies), and athletic performance (-1.2%, weighted mean effect size = 0.45, 5 studies) in 437 (169 male, 268 female) athletes (overall effect size = 0.44).
Whole body vibration has small and inconsistent acute and chronic effects on athletic performance in competitive and/or elite athletes. These findings lead to the hypothesis that neuromuscular adaptive processes following whole body vibration are not specific enough to enhance athletic performance. Thus, other types of exercise programs (e.g., resistance training) are recommended if the goal is to improve athletic performance.
Background: We assessed the effects of gender, in association with a four-week small-sided games (SSGs) training program during Ramadan intermittent fasting (RIF), on changes in psychometric and physiological markers in professional male and female basketball players.
Methods: Twenty-four professional basketball players from the first Tunisian (Tunisia) division participated in this study. The players were dichotomized by sex (males [GM = 12]; females [GF = 12]). Both groups completed a 4 weeks SSGs training program with 3 sessions per week. Psychometric (e.g., quality of sleep, fatigue, stress, and delayed onset of muscle soreness [DOMS]) and physiological parameters (e.g., heart rate frequency, blood lactate) were measured during the first week (baseline) and at the end of RIF (post-test).
Results: Post hoc tests showed a significant increase in stress levels in both groups (GM [−81.11%; p < 0.001, d = 0.33, small]; GF [−36.53%; p = 0.001, d = 0.25, small]). Concerning physiological parameters, ANCOVA revealed significantly lower heart rates in favor of GM at post-test (1.70%, d = 0.38, small, p = 0.002).
Conclusions: Our results showed that SSGs training at the end of the RIF negatively impacted psychometric parameters of male and female basketball players. It can be concluded that there are sex-mediated effects of training during RIF in basketball players, and this should be considered by researchers and practitioners when programing training during RIF.
There is evidence of substantial benefit of cardiac rehabilitation (CR) for patients with low exercise capacity at admission. Nevertheless, some patients are not able to perform an initial exercise stress test (EST). We aimed to describe this group using data of 1094 consecutive patients after a cardiac event (71 +/- 7 years, 78% men) enrolled in nine centres for inpatient CR. We analysed sociodemographic and clinical variables (e.g. cardiovascular risk factors, comorbidities, complications at admission), amount of therapy (e.g. exercise training, nursing care) and the results of the initial and the final 6-min walking test (6MWT) with respect to the application of an EST. Fifteen per cent of patients did not undergo an EST (non-EST group). In multivariable analysis, the probability of obtaining an EST was higher for men [odds ratio (OR) 1.89, P=0.01], a 6MWT (per 10 m, OR 1.07, P<0.01) and lower for patients with diabetes mellitus (OR 0.48, P<0.01), NYHA-class III/IV (OR 0.27, P<0.01), osteoarthritis (OR 0.39, P<0.01) and a longer hospital stay (per 5 days, OR 0.87, P=0.02). The non-EST group received fewer therapy units of exercise training, but more units of nursing care and physiotherapy than the EST group. However, there were no significant differences between both groups in the increase of the 6MWT during CR (123 vs. 108 m, P=0.122). The present study confirms the feasibility of an EST at the start of CR as an indicator of disease severity. Nevertheless, patients without EST benefit from CR even if exercising less. Thus, there is a justified need for individualized, comprehensive and interdisciplinary CR.
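The odds ratios above come from a multivariable model, where each adjusted OR is the exponential of the corresponding logistic-regression coefficient. A hedged sketch of both the unadjusted 2x2-table OR and the coefficient-to-OR conversion (function names and inputs are illustrative, not taken from the study):

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted OR from a 2x2 table: a/b = exposed with/without the outcome,
    c/d = unexposed with/without the outcome."""
    return (a * d) / (b * c)

def or_with_ci(beta, se, z=1.96):
    """Adjusted OR and 95% CI from a logistic-regression coefficient
    and its standard error."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)
```

For example, an OR of 1.89 for male sex simply means the model coefficient for that predictor is ln(1.89), holding the other covariates constant.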
Objectives: Chronic back pain (CBP) can lead to disability and burden. In addition to its medical causes, its development is influenced by psychosocial risk factors, the so-called flag factors, which are categorized and integrated into many treatment guidelines. Currently, most studies investigate single flag factors, which limit the estimation of individual factor significance in the development of chronic pain. Furthermore, factors concerning patients’ lifestyle, biography and treatment history are often neglected. Therefore, the objectives of the present study are to identify commonly neglected factors of CBP and integrate them into an analysis model comparing their significance with established flag factors.
Methods: A total of 24 patients and therapists were cross-sectionally interviewed to identify commonly neglected factors of CBP. Subsequently, the impact of these factors was surveyed in a longitudinal study. In two rehabilitation clinics, CBP patients (n = 145) were examined before and 6 months after a 3-week inpatient rehabilitation. Outcome variables, chronification factor pain experience (CF-PE) and chronification factor disability (CF-D), were ascertained with confirmatory factor analysis (CFA) of standardized questionnaires. Predictors were evaluated using stepwise calculations of simple and multiple regression models.
Results: Through interviews, medical history, iatrogenic factors, poor compliance, critical life events (LEs), social support (SS) type and effort–reward were identified as commonly neglected factors. However, only the final three held significance in comparison to established factors such as depression and pain-related cognitions. Longitudinally, lifestyle factors found to influence future pain were initial pain, physically demanding work, nicotine consumption, gender and rehabilitation clinic. LEs were unexpectedly found to be a strong predictor of future pain, as were the protective factors, reward at work and perceived SS.
Discussion: These findings provide insight into often overlooked factors in the development of CBP, suggesting that more detailed operationalization and superordinate frameworks would benefit further research.
Conclusion: In particular, LEs should be taken into account in future research. Protective factors should be integrated in therapeutic settings.
Reward expectation and affective responses across psychiatric disorders - A dimensional approach
(2014)
Processing of reward is the basis of adaptive behavior of the human being. Neural correlates of reward processing seem to be influenced by developmental changes from adolescence to late adulthood. The aim of this study is to uncover these neural correlates during a slot machine gambling task across the lifespan. Therefore, we used functional magnetic resonance imaging to investigate 102 volunteers in three different age groups: 34 adolescents, 34 younger adults, and 34 older adults. We focused on the core reward areas ventral striatum (VS) and ventromedial prefrontal cortex (VMPFC), the valence processing associated areas, anterior cingulate cortex (ACC) and insula, as well as information integration associated areas, dorsolateral prefrontal cortex (DLPFC), and inferior parietal lobule (IPL). Results showed that VS and VMPFC were characterized by a hyperactivation in adolescents compared with younger adults. Furthermore, the ACC and insula were characterized by a U-shape pattern (hypoactivation in younger adults compared with adolescents and older adults), whereas the DLPFC and IPL were characterized by a J-shaped form (hyperactivation in older adults compared with younger groups). Furthermore, a functional connectivity analysis revealed an elevated negative functional coupling between the inhibition-related area rIFG and VS in younger adults compared with adolescents. Results indicate that lifespan-related changes during reward anticipation are characterized by different trajectories in different reward network modules and support the hypothesis of an imbalance in maturation of striatal and prefrontal cortex in adolescents. Furthermore, these results suggest compensatory age-specific effects in fronto-parietal regions. Hum Brain Mapp 35:5153-5165, 2014.
Background
Millions of people in Germany suffer from chronic pain, whose course and intensity are multifactorial. Besides physical injuries, certain psychosocial risk factors are involved in the disease process. The national health care guidelines for the diagnosis and treatment of non-specific low back pain recommend screening psychosocial risk factors as early as possible, so that therapy can be adapted to patient needs (e.g., unimodal or multimodal). However, such a procedure has been difficult to implement in practice and has not yet been integrated into the rehabilitation care structures across the country.
Methods
The aim of this study is to implement an individualized therapy and aftercare program within the rehabilitation offer of the German Pension Insurance in the area of orthopedics and to examine its success and sustainability in comparison to the previous standard aftercare program.
The study is a multicenter randomized controlled trial including 1204 patients from six orthopedic rehabilitation clinics. A 2:1 allocation ratio of the intervention group (individualized and home-based rehabilitation aftercare) to the control group (regular outpatient rehabilitation aftercare) is used. Upon admission to the rehabilitation clinic, participants in the intervention group will be screened for their psychosocial risk profile. Depending on this profile, they then receive either unimodal or multimodal therapy, together with an individualized training program. The program is instructed in the clinic (approximately 3 weeks) and is then continued independently at home for 3 months. The success of the program is examined by means of a total of four surveys. The co-primary outcomes are the Characteristic Pain Intensity and Disability scores assessed with the German version of the Chronic Pain Grade questionnaire (CPG).
Discussion
An improvement in terms of pain, work ability, patient compliance, and acceptance is expected for our intervention program compared with the standard aftercare. The study contributes to providing individualized care also to patients living far away from clinical centers.
Trial registration
DRKS, DRKS00020373. Registered on 15 April 2020
The purpose of this study was to examine the test-retest reliability, and convergent and discriminative validity of a new taekwondo-specific change-of-direction (COD) speed test with striking techniques (TST) in elite taekwondo athletes. Twenty (10 males and 10 females) elite (athletes who compete at national level) and top-elite (athletes who compete at national and international level) taekwondo athletes with an average training background of 8.9 ± 1.3 years of systematic taekwondo training participated in this study. During the two-week test-retest period, various generic performance tests measuring COD speed, balance, speed, and jump performance were carried out during the first week and as a retest during the second week. Three TST trials were conducted with each athlete and the best trial was used for further analyses. The relevant performance measure derived from the TST was the time with striking penalty (TST-TSP). TST-TSP performances amounted to 10.57 ± 1.08 s for males and 11.74 ± 1.34 s for females. The reliability analysis of the TST performance was conducted after logarithmic transformation, in order to address the problem of heteroscedasticity. In both groups, the TST demonstrated a high relative test-retest reliability (intraclass correlation coefficients and 90% compatibility limits were 0.80 and 0.47 to 0.93, respectively). For absolute reliability, the TST’s typical error of measurement (TEM), 90% compatibility limits, and magnitudes were 4.6%, 3.4 to 7.7, for males, and 5.4%, 3.9 to 9.0, for females. The homogeneous sample of taekwondo athletes meant that the TST’s TEM exceeded the usual smallest important change (SIC) with 0.2 effect size in the two groups. 
The new test showed mostly very large correlations with linear sprint speed (r = 0.71 to 0.85) and dynamic balance (r = −0.71 and −0.74), large correlations with COD speed (r = 0.57 to 0.60) and vertical jump performance (r = −0.50 to −0.65), and moderate correlations with horizontal jump performance (r = −0.34 to −0.45) and static balance (r = −0.39 to −0.44). Top-elite athletes showed better TST performances than their elite counterparts. Receiver operating characteristic analysis indicated that the TST effectively discriminated between top-elite and elite taekwondo athletes. In conclusion, the TST is a valid and sensitive test for evaluating COD speed with taekwondo-specific skills, and it is reliable when considering the ICC and TEM. Although the TST's usefulness for detecting small performance changes in the present population is questionable, the test can detect moderate changes in taekwondo-specific COD speed.
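The reliability statistics reported above (a typical error of measurement expressed as a percentage after log transformation, to address heteroscedasticity) follow a standard computation on paired test-retest scores. The sketch below is illustrative only: the data are hypothetical, not the study's, and the function name is our own.

```python
import math
from statistics import stdev

def typical_error_cv(test, retest):
    """Typical error of measurement as a coefficient of variation (%).

    Computed on log-transformed paired scores: the SD of the
    within-subject log differences, divided by sqrt(2) because each
    difference contains error from both trials, then back-transformed.
    """
    diffs = [math.log(r) - math.log(t) for t, r in zip(test, retest)]
    return 100 * (math.exp(stdev(diffs) / math.sqrt(2)) - 1)

# Hypothetical paired TST times in seconds (for illustration only)
test = [10.2, 11.0, 10.8, 11.5, 10.4, 11.9]
retest = [10.5, 10.8, 11.1, 11.2, 10.6, 12.3]
print(f"TEM = {typical_error_cv(test, retest):.1f}%")
```

Comparing such a TEM against the smallest important change (0.2 of the between-athlete SD) is what underlies the usefulness judgment in the abstract.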
This study investigated associations between variables of trunk muscle strength (TMS), spinal mobility, and balance in seniors. Thirty-four seniors (sex: 18 female, 16 male; age: 70 ± 4 years; activity level: 13 ± 7 hr/week) were tested for maximal isometric strength (MIS) of the trunk extensors, flexors, lateral flexors, and rotators, spinal mobility, and steady-state, reactive, and proactive balance. Significant correlations were detected between all measures of TMS and static steady-state balance (r = .43–.57, p < .05). Significant correlations were also observed between specific measures of TMS and dynamic steady-state balance (r = .42–.55, p < .05). No significant correlations were found between the variables of TMS and reactive/proactive balance, or between the variables of spinal mobility and balance. Regression analyses revealed that TMS explains between 1% and 33% of the total variance of the respective balance parameters. The findings indicate that TMS is related to measures of steady-state balance, which may imply that TMS-promoting exercises should be integrated into strength training for seniors.
Finger-based numerical representations have gained increasing research interest. However, their description and assessment often refer to different numerical principles of ordinality, cardinality and 1-to-1 correspondence. Our aim was to investigate similarities and differences between these principles in finger-based numerical representations. Sixty-eight healthy adults performed ordinal finger counting, cardinal finger montring (showing the number of gestures) and finger-to-number mapping with twisted arms and fingers. We found that counting gestures and montring postures were identical for Number 10 but differed to varying degrees for other numbers. Interestingly, there was no systematic relation between finger-to-number mapping and ordinal finger counting habits. These data question the assumption of a unitary embodied finger-based numerical representation, but suggest that different finger-based representations co-exist and can be recruited flexibly depending on the numerical aspects to be conveyed.
Referential Coding Does Not Rely on Location Features: Evidence for a Nonspatial Joint Simon Effect
(2015)
The joint Simon effect (JSE) shows that the presence of another agent can change one's representation of one's task and/or action. According to the spatial response coding approach, this is because another person in one's peri-personal space automatically induces the spatial coding of one's own action, which in turn invites spatial stimulus-response priming. According to the referential coding approach, the presence of another person or event creates response conflict, which the actor is assumed to solve by emphasizing response features that discriminate between one's own response and that of the other. The 2 approaches often make the same predictions, but the spatial response coding approach considers spatial location as the only dimension that can drive response coding, whereas the referential coding approach allows for other dimensions as well. To compare these approaches, the authors ran 2 experiments to see whether a nonspatial JSE can be demonstrated. Participants responded to the geometrical shape of a central colored stimulus by pressing a left or right button, while wearing gloves of the same or a different color as the stimuli. Participants performed the task individually, either by responding to both stimulus shapes (Experiment 1) or by responding to only 1 of the 2 shapes (Experiment 2), and in the presence of a coactor. Congruence between stimulus and glove color affected performance in the 2-choice and the joint tasks but not in the individual go/no-go task. This demonstration of a nonspatial JSE is inconsistent with the spatial response coding approach but supports the referential coding approach.
Understanding the development of spoken language in young children has become increasingly important for advancing basic theories of language acquisition and for clinical practice. However, such a goal requires refined measurements of speech articulation (e.g., from the tongue), which are difficult to obtain from young children. In recent years though, technological advances have allowed developmental researchers to make significant steps in that direction. For instance, movements of the tongue, an articulator that is essential for spoken language, can now be tracked and recorded in children with ultrasound imaging. This technique has opened novel research avenues in (a)typical language acquisition, enabling researchers to reliably capture what has long remained invisible in the speech of young children. Within this context, we have designed an experimental platform for the recording and the processing of kinematic data: SOLLAR (Sonographic and Optical Linguo-Labial Articulatory Recording system). The method has been tailored for children, but it is suitable for adults. In the present article, we introduce the recording environment developed to record over 100 children and 30 adults within SOLLAR. We then describe SOLLAR’s data processing framework, providing examples of data visualization and a summary of strengths and limitations.
Stress levels experienced by school-aged elite athletes are pronounced, but data on their mental health status are widely lacking. In our study, we examined self-reported psychological symptoms and chronic mood. Data from a representative sample of 866 elite student-athletes (aged 12-15 years), enrolled in high-performance sport programming in German Elite Schools of Sport, were compared with data from 80 student-athletes from the same schools who had just been deselected from elite sport promotion, and from 432 age- and sex-matched non-sport students from regular schools (without such programming). Anxiety symptoms were least prevalent in female elite student-athletes. In male elite student-athletes, only symptoms of posttraumatic stress were less prevalent than in the other groups. Somatoform symptoms were generally more frequent in athletes, a trend that was significantly pronounced in deselected athletes. Deselected athletes showed an increased risk for psychological symptoms compared with both other groups. Regarding chronic mood, deselected athletes again showed less positive scores. While there was a trend toward high-performance sport being associated with better psychological health, at least in girls, preventative programs should take into account that deselection from elite sport programming may be associated with specific risks for mental disorders.
Prosody and information status in typological perspective - Introduction to the Special Issue
(2015)
The aim of this review was to describe and summarize the scientific literature on programming parameters related to jump or plyometric training in male and female soccer players of different ages and fitness levels. A literature search was conducted in the electronic databases PubMed, Web of Science and Scopus using keywords related to the main topic of this study (e.g., “ballistic” and “plyometric”). According to the PICOS framework, the population for the review was restricted to soccer players involved in jump or plyometric training. Among 7556 identified studies, 90 were eligible for inclusion. Only 12 studies were found for females. Most studies (n = 52) were conducted with youth male players. Moreover, only 35 studies determined the effectiveness of a given jump training programming factor. Based on the limited available research, it seems that a dose of 7 weeks (1–2 sessions per week), with ~80 jumps (specific or combined types) per session, at near-maximal or maximal intensity, with adequate recovery between repetitions (<15 s), sets (≥30 s) and sessions (≥24–48 h), using progressive overload and taper strategies, using appropriate surfaces (e.g., grass), and applied in a well-rested state, when combined with other training methods, would maximize the outcomes of effective and safe plyometric-jump training interventions aimed at improving soccer players' physical fitness. In conclusion, jump training is an effective and easy-to-administer training approach for youth, adult, male and female soccer players. However, optimal programming for plyometric-jump training in soccer is yet to be determined in future research.
Purpose: Although subjective knowledge about the prognosis of an advanced disease is extremely important for coping and treatment planning, the concept of prognostic awareness (PA) remains inconsistently defined. The aims of the scoping review were to synthesize a definition of PA from the most recent literature, describe preconditions, correlates and consequences, and suggest a conceptual model.
Methods: By using scoping review methodology, we searched the Web of Science and PubMed databases, and included publications, reviews, meta-analyses or guidelines on all physical diagnoses, as well as publications offering a conceptual or an operational definition of PA. The data were analyzed by means of content analysis techniques.
Results: Of the 24 included publications, 21 referred exclusively to cancer, one to patients with hip fractures and two to palliative care in general. The deduced definition of PA comprised the following facets: adequate estimation of chances for recovery, knowledge of limited time to live, adequate estimation of life expectancy, knowledge of therapy goals, and knowledge of the course of the disease. Further content analysis results were mapped graphically and in a detailed table.
Conclusion: There appears to be a lack of theoretical embedding of PA that in turn influences the methods used for empirical investigation. Drawing on a clear conceptual definition, longitudinal or experimental studies would be desirable.
For life-long learning, an effective learning strategy repertoire is particularly important during the acquisition of knowledge in lower secondary school—an educational level characterized by the transition into more autonomous learning environments with increasingly complex academic demands. Using latent profile analysis, we explored the occurrence of different secondary school learner profiles depending on their various combinations of cognitive and metacognitive learning strategy use, as well as their differences in perceived autonomy support, intrinsic motivation, and gender. Data were collected from 576 ninth-grade students in Uganda using self-report questionnaires. Four learner profiles were identified: competent strategy user, struggling user, surface-level learner, and deep-level learner. Gender differences were noted in students' use of elaboration and organization strategies to learn Physics, in favor of girls. In terms of profile membership, significant differences in gender, intrinsic motivation, and perceived autonomy support were also noted. Girls were 2.4–2.7 times more likely than boys to be members of the competent strategy user and surface-level learner profiles. Additionally, higher levels of intrinsic motivation predicted an increased likelihood of membership in the deep-level learner profile, while higher levels of perceived teacher autonomy support predicted an increased likelihood of membership in the competent strategy user profile compared with the other profiles. Finally, implications of the findings are discussed.
Recent studies have suggested that musical rhythm perception ability can affect the phonological system. The most prevalent causal account for developmental dyslexia is the phonological deficit hypothesis. As rhythm is a subpart of phonology, we hypothesized that reading deficits in dyslexia are associated with rhythm processing in speech and in music. In a rhythmic grouping task, adults with diagnosed dyslexia and age-matched controls listened to speech streams with syllables alternating in intensity, duration, or neither, and indicated whether they perceived a strong-weak or weak-strong rhythm pattern. Additionally, their reading and musical rhythm abilities were measured. Results showed that adults with dyslexia had lower musical rhythm abilities than adults without dyslexia. Moreover, lower musical rhythm ability was associated with lower reading ability in dyslexia. However, speech grouping by adults with dyslexia was not impaired when musical rhythm perception ability was controlled: like adults without dyslexia, they showed consistent preferences. However, rhythmic grouping was predicted by musical rhythm perception ability, irrespective of dyslexia. The results suggest associations among musical rhythm perception ability, speech rhythm perception, and reading ability. This highlights the importance of considering individual variability to better understand dyslexia and raises the possibility that musical rhythm perception ability is a key to phonological and reading acquisition.
In a self-paced reading study on German sluicing, Paape (2016) found that reading times were shorter at the ellipsis site when the antecedent was a temporarily ambiguous garden-path structure. As a post-hoc explanation of this finding, Paape assumed that the antecedent's memory representation was reactivated during syntactic reanalysis, making it easier to retrieve. In two eye-tracking experiments, we subjected the reactivation hypothesis to further empirical scrutiny. Experiment 1, carried out in French, showed no evidence in favor of the reactivation hypothesis. Instead, results for one of the three types of garden-path sentences that were tested suggest that subjects sometimes failed to resolve the temporary ambiguity in the antecedent clause, and subsequently failed to resolve the ellipsis. The results of Experiment 2, a conceptual replication of Paape's (2016) original study carried out in German, are compatible with the reactivation hypothesis, but leave open the possibility that the observed speedup for ambiguous antecedents may be due to occasional retrievals of an incorrect structure.
Sexual aggression is a major public health issue worldwide, but most knowledge is derived from studies conducted in North America and Western Europe. Little research has been conducted on the prevalence of sexual aggression in developing countries, including Chile. This article presents the first systematic review of the evidence on the prevalence of sexual aggression victimization and perpetration among women and men in Chile. Furthermore, it reports differences in prevalence rates in relation to victim and perpetrator characteristics and victim–perpetrator relationships. A total of N = 28 studies were identified by a three-stage literature search, including the screening of academic databases, publications of Chilean institutions, and reference lists. A great heterogeneity was found for prevalence rates of sexual victimization, ranging between 1.0% and 51.9% for women and 0.4% and 48.0% for men. Only four studies provided perpetration rates, which varied between 0.8% and 26.8% for men and 0.0% and 16.5% for women. No consistent evidence emerged for differences in victimization rates in relation to victims’ gender, age, and education. Perpetrators were more likely to be persons known to the victim. Conceptual and methodological differences between the studies are discussed as reasons for the great variability in prevalence rates, and recommendations are provided for a more harmonized and gender-inclusive approach for future research on sexual aggression in Chile.
Presupposition triggers differ with respect to whether their presupposition is easily accommodatable. The presupposition of focus-sensitive additive particles like also or too is often classified as hard to accommodate, i.e., these triggers are infelicitous if their presupposition is not entailed by the immediate linguistic or non-linguistic context. We tested two competing accounts for the German additive particle auch concerning this requirement: First, that it requires a focus alternative to the whole proposition to be salient, and second, that it merely requires an alternative to the focused constituent (e.g., an individual) to be salient. We conducted two experiments involving felicity judgments as well as questions asking for the truth of the presupposition to be accommodated. Our results suggest that the latter account is too weak: mere previous mention of a potential alternative to the focused constituent is not enough to license the use of auch. However, our results also suggest that the former account is too strong: when an alternative of the focused constituent is prementioned and certain other accommodation-enhancing factors are present, the context does not have to entail the presupposed proposition. We tested the following two potentially accommodation-enhancing factors: First, whether the discourse can be construed to be from the perspective of the individual that the presupposition is about, and second, whether the presupposition is needed to establish coherence between the host sentence of the additive particle and the preceding context. The factor coherence was found to play a significant role. Our results thus corroborate the results of other researchers showing that discourse participants go to great lengths in order to identify a potential presupposition to accommodate, and we contribute to these results by showing that coherence is one of the factors that enhance accommodation.
Postural control is important to cope with demands of everyday life. It has been shown that both attentional demand (i.e., cognitive processing) and fatigue affect postural control in young adults. However, their combined effect is still unresolved. Therefore, we investigated the effects of fatigue on single- (ST) and dual-task (DT) postural control. Twenty young subjects (age: 23.7 ± 2.7) performed an all-out incremental treadmill protocol. After each completed stage, one-legged-stance performance on a force platform under ST (i.e., one-legged-stance only) and DT conditions (i.e., one-legged-stance while subtracting serial 3s) was registered. On a second test day, subjects conducted the same balance tasks for the control condition (i.e., non-fatigued). Results showed that heart rate, lactate, and ventilation increased following fatigue (all p < 0.001; d = 4.2–21). Postural sway and sway velocity increased during DT compared to ST (all p < 0.001; d = 1.9–2.0) and fatigued compared to non-fatigued condition (all p < 0.001; d = 3.3–4.2). In addition, postural control deteriorated with each completed stage during the treadmill protocol (all p < 0.01; d = 1.9–3.3). The addition of an attention-demanding interference task did not further impede one-legged-stance performance. Although both additional attentional demand and physical fatigue affected postural control in healthy young adults, there was no evidence for an overadditive effect (i.e., fatigue-related performance decrements in postural control were similar under ST and DT conditions). Thus, attentional resources were sufficient to cope with the DT situations in the fatigue condition of this experiment.
Increased Achilles tendon (AT) and patellar tendon (PT) thickness has been demonstrated in adolescent athletes compared with non-athletes. However, it is unclear whether these changes are of pathological origin or reflect physiological adaptation to training. The aim of this study was to determine physiological AT and PT thickness adaptation in adolescent elite athletes compared with non-athletes, considering sex and sport. In a longitudinal study design with two measurement days (M1/M2) within an interval of 3.2 ± 0.8 years, 131 healthy adolescent elite athletes (m/f: 90/41) from 13 different sports and 24 recreationally active controls (m/f: 6/18) were included. Both ATs and PTs were measured at standardized reference points. Athletes were divided into four sport categories [ball (B), combat (C), endurance (E), and explosive strength sports (S)]. Descriptive analysis (mean ± SD) and statistical testing for group differences were performed (α = 0.05). AT thickness did not differ significantly between measurement days, neither in athletes (5.6 ± 0.7 mm/5.6 ± 0.7 mm) nor in controls (4.8 ± 0.4 mm/4.9 ± 0.5 mm, p > 0.05). For PTs, athletes presented increased thickness at M2 (M1: 3.5 ± 0.5 mm, M2: 3.8 ± 0.5 mm, p < 0.001). In general, males had thicker ATs and PTs than females (p < 0.05). Considering sex and sport, only male athletes from B, C, and S showed significantly higher PT thickness at M2 compared with controls (p ≤ 0.01). Sport-specific adaptation in tendon thickness in adolescent elite athletes can thus be detected in the PTs of male athletes participating in sports with high repetitive jumping and strength components. Sonographic microstructural analysis might provide enhanced insight into tendon material properties, enabling differentiation by sex and by the influence of different sports.
Background
The aim of this study was to analyze the shoulder functional profile (rotation range of motion [ROM] and strength), upper and lower body performance, and throwing speed of U13 versus U15 male handball players, and to establish the relationship between these measures of physical fitness and throwing speed.
Methods
One hundred and nineteen young male handball players (under-13 [U13], n = 85; under-15 [U15], n = 34) volunteered to participate in this study. The participating athletes had a mean background of systematic handball training of 5.5 ± 2.8 years, and they exercised on average 540 ± 10.1 min per week, including sport-specific team handball training and strength and conditioning programs. Players were tested for passive shoulder range of motion (ROM) for both internal (IR) and external rotation (ER) and isometric strength (i.e., IR and ER) of the dominant/non-dominant shoulders, overhead medicine ball throw (OMB), hip isometric abductor (ABD) and adductor (ADD) strength, hip ROM, jumps (countermovement jump [CMJ] and triple leg-hop [3H] for distance), a linear sprint test, the modified 505 change-of-direction (COD) test, and handball throwing speed (7 m [HT7] and 9 m [HT9]).
Results
U15 players outperformed U13 players in upper (i.e., HT7 and HT9 speed, OMB, absolute IR and ER strength of the dominant and non-dominant sides; Cohen's d: 0.76–2.13) and lower body (i.e., CMJ, 3H, 20-m sprint and COD, hip ABD and ADD; d: 0.70–2.33) performance measures. Regarding shoulder ROM outcomes, a lower IR ROM was found for the dominant side in the U15 group compared with U13, together with a higher ER ROM on both sides in U15 (d: 0.76–1.04). It seems that primarily anthropometric characteristics (i.e., body height, body mass) and upper body strength/power (OMB distance) are the most important factors explaining the variance in throwing speed in male handball players, particularly in U13.
Conclusions
Findings from this study imply that regular performance monitoring is important for performance development and for minimizing injury risk of the shoulder in both age categories of young male handball players. Besides measures of physical fitness, anthropometric data should be recorded because handball throwing performance is related to these measures.
Objective: To determine immediate performance measures for short-term, multicomponent cardiac rehabilitation (CR) in clinical routine in patients of working age, taking into account cardiovascular risk factors, physical performance, social medicine, and subjective health parameters, and to explore the underlying dimensionality.
Design: Prospective observational multicenter register study in 12 rehabilitation centers throughout Germany.
Setting: Comprehensive 3-week CR.
Importance Alcohol consumption (AC) leads to death and disability worldwide. Ongoing discussions on potential negative effects of the COVID-19 pandemic on AC need to be informed by real-world evidence.
Objective To examine whether lockdown measures are associated with AC and consumption-related temporal and psychological within-person mechanisms.
Design, Setting, and Participants This quantitative, intensive, longitudinal cohort study recruited 1743 participants from 3 sites from February 20, 2020, to February 28, 2021. Data were provided before and within the second lockdown of the COVID-19 pandemic in Germany: before lockdown (October 2 to November 1, 2020); light lockdown (November 2 to December 15, 2020); and hard lockdown (December 16, 2020, to February 28, 2021).
Main Outcomes and Measures Daily ratings of AC (main outcome) captured during 3 lockdown phases (main variable) and temporal (weekends and holidays) and psychological (social isolation and drinking intention) correlates.
Results Of the 1743 screened participants, 189 (119 [63.0%] male; median [IQR] age, 37 [27.5-52.0] years) with at least 2 alcohol use disorder (AUD) criteria according to the Diagnostic and Statistical Manual of Mental Disorders (Fifth Edition) yet without the need for medically supervised alcohol withdrawal were included. These individuals provided 14 694 smartphone ratings from October 2020 through February 2021. Multilevel modeling revealed significantly higher AC (grams of alcohol per day) on weekend days vs weekdays (β = 11.39; 95% CI, 10.00-12.77; P < .001). Alcohol consumption was above the overall average on Christmas (β = 26.82; 95% CI, 21.87-31.77; P < .001) and New Year’s Eve (β = 66.88; 95% CI, 59.22-74.54; P < .001). During the hard lockdown, perceived social isolation was significantly higher (β = 0.12; 95% CI, 0.06-0.15; P < .001), but AC was significantly lower (β = −5.45; 95% CI, −8.00 to −2.90; P = .001). Independent of lockdown, intention to drink less alcohol was associated with lower AC (β = −11.10; 95% CI, −13.63 to −8.58; P < .001). Notably, differences in AC between weekend and weekdays decreased both during the hard lockdown (β = −6.14; 95% CI, −9.96 to −2.31; P = .002) and in participants with severe AUD (β = −6.26; 95% CI, −10.18 to −2.34; P = .002).
Conclusions and Relevance This 5-month cohort study found no immediate negative associations of lockdown measures with overall AC. Rather, weekend-weekday and holiday AC patterns exceeded lockdown effects. Differences in AC between weekend days and weekdays evinced that weekend drinking cycles decreased as a function of AUD severity and lockdown measures, indicating a potential mechanism of losing and regaining control. This finding suggests that temporal patterns and drinking intention constitute promising targets for prevention and intervention, even in high-risk individuals.
Objective: We aimed to characterize patients after an acute cardiac event regarding their negative expectations around returning to work and the impact on work capacity upon discharge from cardiac rehabilitation (CR).
Methods: We analyzed routine data of 884 patients (52±7 years, 76% men) who attended 3 weeks of inpatient CR after an acute coronary syndrome (ACS) or cardiac surgery between October 2013 and March 2015. The primary outcome was patients' work capacity status (fit vs unfit) at discharge from CR. Furthermore, sociodemographic data (eg, age, sex, and education level), diagnoses, functional data (eg, exercise stress test and 6-min walking test [6MWT]), the Hospital Anxiety and Depression Scale (HADS), and self-assessment of the occupational prognosis (negative expectations and/or unemployment, Würzburger screening) at admission to CR were considered.
Results: A negative occupational prognosis was detected in 384 patients (43%). Of these, 368 (96%) expected not to return to work after CR and/or were unemployed before CR (29%, n=113). Affected patients showed reduced exercise capacity (bicycle stress test: 100 W vs 118 W, P<0.01; 6MWT: 380 m vs 421 m, P<0.01) and were more likely to receive a depression diagnosis (12% vs 3%, P<0.01), as well as higher scores on the HADS. At discharge from CR, 21% of this group (n=81) were fit for work (vs 35% of patients with a normal occupational prognosis; n=175, P<0.01). Sick leave before the cardiac event (OR 0.4, 95% CI 0.2–0.6, P<0.01), negative occupational expectations (OR 0.4, 95% CI 0.3–0.7, P<0.01), and depression (OR 0.3, 95% CI 0.1–0.8, P=0.01) reduced the likelihood of achieving work capacity upon discharge. In contrast, higher exercise capacity was positively associated with work capacity.
Conclusion: Patients with a negative occupational prognosis often showed reduced physical performance and suffered from a high psychosocial burden. In addition, patients’ occupational expectations predicted work capacity at discharge from CR. Affected patients should be identified at admission to allow for targeted psychosocial care.
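The odds ratios and 95% confidence intervals reported in analyses like the one above follow the standard 2×2-table computation with the log-odds standard error. A minimal sketch, using hypothetical counts rather than the study's data (the function name `odds_ratio_ci` is my own):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

A CI that includes 1 (as in this hypothetical example) indicates a non-significant association at the chosen level.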
Background
Postoperative delirium is a common disorder in older adults that is associated with higher morbidity and mortality, prolonged cognitive impairment, development of dementia, higher institutionalization rates, and rising healthcare costs. The probability of delirium after surgery increases with patients’ age, pre-existing cognitive impairment, and comorbidities, and its diagnosis and treatment depend on the medical staff’s knowledge of diagnostic criteria, risk factors, and treatment options. In this study, we will investigate whether a cross-sectoral and multimodal intervention for preventing delirium can reduce the prevalence of delirium and postoperative cognitive decline (POCD) in patients older than 70 years undergoing elective surgery. Additionally, we will analyze whether the intervention is cost-effective.
Methods
The study will be conducted at five medical centers (with two or three surgical departments each) in the southwest of Germany. The study employs a stepped-wedge design with cluster randomization of the medical centers. Measurements are performed at six consecutive time points: preadmission, preoperatively, and postoperatively with daily delirium screening up to day 7, and POCD evaluations at 2, 6, and 12 months after surgery. The recruitment goal is to enroll 1500 patients older than 70 years undergoing elective operative procedures (cardiac, thoracic, vascular, proximal big joint and spine, genitourinary, gastrointestinal, and general elective surgery procedures).
Discussion
Results of the trial should form the basis of future standards for preventing delirium and POCD in surgical wards. Key aims are the improvement of patient safety and quality of life, as well as the reduction of the long-term risk of conversion to dementia. Furthermore, from an economic perspective, we expect benefits and decreased costs for hospitals, patients, and healthcare insurances.
Trial registration
German Clinical Trials Register, DRKS00013311. Registered on 10 November 2017.
The pathophysiology of Parkinson’s disease (PD) is still not understood. Some investigations show changed oscillatory behaviour of brain circuits or changed variability of, e.g., gait parameters in PD. The aim of this study was to investigate whether or not the motor output differs between PD patients and healthy controls. Patients without tremor were investigated in the medication-off state while performing a special bilateral isometric motor task. Force and accelerations (ACC) were recorded, as well as the mechanomyography (MMG) of the biceps brachii, brachioradialis and pectoralis major muscles, using piezoelectric sensors during the bilateral motor task at 60% of the maximal isometric contraction. The frequency, a specific power ratio, the amplitude variation and the slope of amplitudes were analysed. The results indicate that the oscillatory behaviour of motor output in PD patients without tremor deviates from controls: the 95% confidence intervals of the power ratio and of the amplitude variation of all signals are disjoint between PD and controls and show significant differences in group comparisons (power ratio: p = 0.000–0.004, r = 0.441–0.579; amplitude variation: p = 0.000–0.001, r = 0.37–0.67). The mean frequency shows a significant difference for ACC (p = 0.009, r = 0.43), but not for MMG. It remains open whether this muscular output reflects changes of brain circuits and whether the results are reproducible and specific for PD.
Speech and action sequences are continuous streams of information that can be segmented into sub-units. In both domains, this segmentation can be facilitated by perceptual cues contained within the information stream. In speech, prosodic cues (e.g., a pause, pre-boundary lengthening, and pitch rise) mark boundaries between words and phrases, while boundaries between actions of an action sequence can be marked by kinematic cues (e.g., a pause, pre-boundary deceleration). The processing of prosodic boundary cues evokes an event-related potential (ERP) component known as the Closure Positive Shift (CPS), and it is possible that the CPS reflects domain-general cognitive processes involved in segmentation, given that the CPS is also evoked by boundaries between subunits of non-speech auditory stimuli. This study further probed the domain-generality of the CPS and its underlying processes by investigating electrophysiological correlates of the processing of boundary cues in sequences of spoken verbs (auditory stimuli; Experiment 1; N = 23 adults) and actions (visual stimuli; Experiment 2; N = 23 adults). The EEG data from both experiments revealed a CPS-like, broadly distributed positivity during the 250 ms prior to the onset of the post-boundary word or action, indicating similar electrophysiological correlates of boundary processing across domains. This suggests that the cognitive processes underlying speech and action segmentation might also be shared.
Degenerative disc disease is associated with increased expression of pro-inflammatory cytokines in the intervertebral disc (IVD). However, it is not completely clear how inflammation arises in the IVD and which cellular compartments are involved in this process. Recently, the endoplasmic reticulum (ER) has emerged as a possible modulator of inflammation in age-related disorders. In addition, ER stress has been associated with the microenvironment of degenerated IVDs. Therefore, the aim of this study was to analyze the effects of ER stress on inflammatory responses in degenerated human IVDs and associated molecular mechanisms. Gene expression of the ER stress marker GRP78 and the pro-inflammatory cytokines IL-6, IL-8, IL-1β, and TNF-α was analyzed in human surgical IVD samples (n = 51, Pfirrmann grade 2–5). The expression of GRP78 positively correlated with the degeneration grade in lumbar IVDs and with IL-6, but not with IL-1β and TNF-α. Another set of human surgical IVD samples (n = 25) was used to prepare primary cell cultures. The ER stress inducer thapsigargin (Tg, 100 and 500 nM) activated gene and protein expression of IL-6 and induced phosphorylation of p38 MAPK. Both inhibition of p38 MAPK by SB203580 (10 μM) and knockdown of the ER stress effector CCAAT-enhancer-binding protein homologous protein (CHOP) reduced gene and protein expression of IL-6 in Tg-treated cells. Furthermore, the effects of an inflammatory microenvironment on ER stress were tested. TNF-α (5 and 10 ng/mL) did not activate ER stress, while IL-1β (5 and 10 ng/mL) activated gene and protein expression of GRP78, but did not influence [Ca2+]i flux and expression of CHOP, indicating that pro-inflammatory cytokines alone may not induce ER stress in vivo. This study showed that IL-6 release in the IVD can be initiated following ER stress and that ER stress mediates IL-6 release through p38 MAPK and CHOP.
Therapeutic targeting of ER stress response may reduce the consequences of the harsh microenvironment in degenerated IVD.
Background: Outcome quality management requires the consecutive registration of defined variables. The aim was to identify relevant parameters in order to objectively assess the in-patient rehabilitation outcome.
Methods: From February 2009 to June 2010, 1253 patients (70.9 +/- 7.0 years, 78.1% men) at 12 rehabilitation clinics were enrolled. Items concerning sociodemographic data, the impairment group (surgery, conservative/interventional treatment), cardiovascular risk factors, structural and functional parameters and subjective health were tested in respect of their measurability, sensitivity to change and their propensity to be influenced by rehabilitation.
Results: The majority of patients (61.1%) were referred for rehabilitation after cardiac surgery, 38.9% after conservative or interventional treatment for an acute coronary syndrome. Functionally relevant comorbidities were seen in 49.2% (diabetes mellitus, stroke, peripheral artery disease, chronic obstructive lung disease). In three key areas 13 parameters were identified as being sensitive to change and subject to modification by rehabilitation: cardiovascular risk factors (blood pressure, low-density lipoprotein cholesterol, triglycerides), exercise capacity (resting heart rate, maximal exercise capacity, maximal walking distance, heart failure, angina pectoris) and subjective health (IRES-24 (indicators of rehabilitation status): pain, somatic health, psychological well-being and depression as well as anxiety on the Hospital Anxiety and Depression Scale).
Conclusion: The outcome of in-patient rehabilitation in elderly patients can be comprehensively assessed by the identification of appropriate key areas, that is, cardiovascular risk factors, exercise capacity and subjective health. This may well serve as a benchmark for internal and external quality management.
Aim: The aim of the study was to identify common orthopedic sports injury profiles in adolescent elite athletes with respect to age, sex, and anthropometrics.
Methods: A retrospective data analysis of 718 orthopedic presentations among 381 adolescent elite athletes from 16 different sports to a sports medical department was performed. Recorded data of history and clinical examination included area, cause and structure of acute and overuse injuries. Injury-events were analyzed in the whole cohort and stratified by age (11–14/15–17 years) and sex. Group differences were tested by chi-squared tests. Logistic regression analysis was applied examining the influence of the factors age, sex, and body mass index (BMI) on the outcome variables area and structure (α = 0.05).
Results: Higher proportions of injury-events were reported for females (60%) and athletes of the older age group (66%) than for males and younger athletes. The most frequently injured area was the lower extremity (47%) followed by the spine (30.5%) and the upper extremity (12.5%). Acute injuries were mainly located at the lower extremity (74.5%), while overuse injuries were predominantly observed at the lower extremity (41%) as well as the spine (36.5%). Joints (34%), muscles (22%), and tendons (21.5%) were found to be the most often affected structures. The injured structures differed between the age groups (p = 0.022), with the older age group presenting three times more frequently with ligament pathology events (5.5% vs 2%) and less frequently with bony problems (11% vs 20.5%) than athletes of the younger age group. The injured area differed between the sexes (p = 0.005), with males having fewer spine injury-events (25.5% vs 34%) but more upper extremity injuries (18% vs 9%) than females. Regression analysis showed a statistically significant influence of BMI (p = 0.002) and age (p = 0.015) on structure, whereas area was significantly influenced by sex (p = 0.005).
Conclusion: Soft-tissue overuse injuries are the most common reasons for orthopedic presentations of adolescent elite athletes. The lower extremity and the spine are most often affected, and sex- and age-specific differences in the affected area and structure must be considered. Prevention strategies addressing these injury-event profiles should therefore already be implemented in early adolescence, taking age, sex and injury entity into account.
The purpose of this study was to compare static balance performance and muscle activity during one-leg standing on the dominant and nondominant leg under various sensory conditions with increased levels of task difficulty. Thirty healthy young adults (age: 23 +/- 2 years) performed one-leg standing tests for 30 s under three sensory conditions (ie, eyes open/firm ground; eyes open/foam ground [elastic pad on top of the balance plate]; eyes closed/firm ground). Center of pressure displacements and activity of four lower leg muscles (ie, m. tibialis anterior [TA], m. soleus [SOL], m. gastrocnemius medialis [GAS], m. peroneus longus [PER]) were analyzed. An increase in sensory task difficulty resulted in deteriorated balance performance (P < .001, effect size [ES] = .57-2.54) and increased muscle activity (P < .001, ES = .50-1.11) for all but two muscles (ie, GAS, PER). However, regardless of the sensory condition, one-leg standing on the dominant as compared with the nondominant limb did not produce statistically significant differences in various balance (P > .05, ES = .06-.22) and electromyographic (P > .05, ES = .03-.13) measures. This indicates that the dominant and the nondominant leg can be used interchangeably during static one-leg balance testing in healthy young adults.
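Effect sizes (ES) like those reported in the balance study above are commonly computed as Cohen's d, the mean difference divided by the pooled standard deviation. A minimal stdlib sketch with made-up sway values, not the study's data:

```python
import math

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled SD."""
    nx, ny = len(x), len(y)
    mx = sum(x) / nx
    my = sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

# hypothetical center-of-pressure sway values (mm) under two sensory conditions
eyes_open   = [12.0, 14.5, 11.2, 13.8, 12.9]
eyes_closed = [16.1, 18.3, 15.4, 17.9, 16.6]
d = cohens_d(eyes_closed, eyes_open)
```

By the usual convention, d around 0.2 is a small, 0.5 a medium, and 0.8 or more a large effect.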
Various behavioural studies show that the semantic typicality (TYP) and age of acquisition (AOA) of a specific word influence processing time and accuracy during the performance of lexical-semantic tasks. This study examines the influence of TYP and AOA on semantic processing at the behavioural (response times and accuracy data) and electrophysiological levels using an auditory category-member-verification task. Reaction time data reveal independent TYP and AOA effects, while in the accuracy data and the event-related potentials predominantly effects of TYP are found. The present study thus confirms previous findings and extends evidence found in the visual modality to the auditory modality, demonstrating a modality-independent influence on semantic word processing. However, with regard to the influence of AOA, the diverging results raise questions about the origin of AOA effects as well as the interpretation of offline and online data. Hence, the results are discussed against the background of recent theories on N400 correlates in semantic processing, and an argument in favour of a complementary use of research techniques is made.
In the first years of life, children differ greatly from adults in the temporal organization of their speech gestures in fluent language production. However, dissent remains as to the maturational direction of such organization. The present study sheds new light on this process by tracking the development of anticipatory vowel-to-vowel coarticulation in a cross-sectional investigation of 62 German children (from 3.5 to 7 years of age) and 13 adults. It focuses on gestures of the tongue, a complex organ whose spatiotemporal control is indispensable for speech production. The goal of the study was threefold: 1) investigate whether children as well as adults initiate the articulation for a target vowel in advance of its acoustic onset, 2) test if the identity of the intervocalic consonant matters and finally, 3) describe age-related developments of these lingual coarticulatory patterns. To achieve this goal, ultrasound tongue imaging was used to record lingual movements and quantify changes in coarticulation degree as a function of consonantal context and age. Results from linear mixed effects models indicate that like adults, children initiate vowels' lingual gestures well ahead of their acoustic onset. Second, while the identity of the intervocalic consonant affects the degree of vocalic anticipation in adults, it does not in children at any age. Finally, the degree of vowel-to-vowel coarticulation is significantly higher in all cohorts of children than in adults. However, among children, a developmental decrease of vocalic coarticulation is only found for sequences including the alveolar stop /d/, which requires finer spatiotemporal coordination of the tongue's subparts compared to labial and velar stops.
Altogether, results suggest greater gestural overlap in child than in adult speech and support the view of a non-uniform and protracted maturation of lingual coarticulation calling for thorough considerations of the articulatory intricacies from which subtle developmental differences may originate.
Background: Castilian Spanish, Catalan, Galician, and European Portuguese are the most widely spoken languages of the Ibero-Romance group. An increasing number of authors have addressed the impact of aphasia on the morphosyntax of these varieties. However, accurate linguistic characterisations are scarce and the different sources of data have not yet been compiled. Aims: To stimulate state-of-the-art research, we provide a comprehensive summary of morphosyntactic aspects of Ibero-Romance and a review of how these are affected in non-fluent aphasia. The topics we deal with are the use of verb argument structure and morphology, sentential negation and word order, definite articles, personal and reflexive pronouns, passives, topicalised constructions, questions, and relative clauses. Methods & Procedures: Exhaustive fieldwork and a search of PubMed, Web of Science, and Medline records were performed to retrieve studies focused on morphosyntactic issues concerning the Ibero-Romance varieties. A total of 27 studies produced by 46 authors of varying backgrounds emerged. We did not review studies of category-specific deficits or aspects related to bilingual aphasia, although we assume that most speakers of Galician and Catalan are bilingual. Studies of spontaneous speech were included when no controlled experimental tasks were available. Outcomes & Results: The morphosyntactic commonalities of Ibero-Romance have been tackled from different theoretical perspectives. There are asymmetries in the findings, which we explain by the use of different tasks (and task complexity) and individual differences between participants. Conclusions: Discourse-linking factors as well as deviations from the canonical pattern are recurrent answers to these asymmetries. A comprehensive theory of impairments in non-fluent aphasia integrating relevant aspects of both structural and processing accounts seems necessary.
Walking while concurrently performing cognitive and/or motor interference tasks is the norm rather than the exception during everyday life and there is evidence from behavioral studies that it negatively affects human locomotion. However, there is hardly any information available regarding the underlying neural correlates of single- and dual-task walking. We had 12 young adults (23.8 ± 2.8 years) walk while concurrently performing a cognitive interference (CI) or a motor interference (MI) task. Simultaneously, neural activation in frontal, central, and parietal brain areas was registered using a mobile EEG system. Results showed that the MI task but not the CI task affected walking performance in terms of significantly decreased gait velocity and stride length and significantly increased stride time and tempo-spatial variability. Average activity in alpha and beta frequencies was significantly modulated during both CI and MI walking conditions in frontal and central brain regions, indicating an increased cognitive load during dual-task walking. Our results suggest that impaired motor performance during dual-task walking is mirrored in neural activation patterns of the brain. This finding is in line with established cognitive theories arguing that dual-task situations overstrain cognitive capabilities resulting in motor performance decrements.
Objective: Alexithymia relates to difficulties recognizing and describing emotions. It has been linked to subjectively increased interoceptive awareness (IA) and to psychiatric illnesses such as major depressive disorder (MDD) and somatization. MDD in turn is characterized by aberrant emotion processing and IA on the subjective as well as on the neural level. However, a link between neural activity in response to IA and alexithymic traits in health and depression remains unclear.
Methods: A well-established fMRI task was used to investigate neural activity during IA (heartbeat counting) and exteroceptive awareness (tone counting) in non-psychiatric controls (NC) and MDD. Firstly, comparing MDD and NC, a linear relationship between IA-related activity and scores of the Toronto Alexithymia Scale (TAS) was investigated through whole-brain regression. Secondly, NC were divided by median-split of TAS scores into groups showing low (NC-low) or high (NC-high) alexithymia. MDD and NC-high showed equally high TAS scores. Subsequently, IA-related neural activity was compared on a whole-brain level between the three independent samples (MDD, NC-low, NC-high).
Results: Whole-brain regressions between MDD and NC revealed neural differences during IA as a function of TAS-DD (subscale difficulty describing feelings) in the supragenual anterior cingulate cortex (sACC; BA 24/32), which were due to negative associations between TAS-DD and IA-related activity in NC. Contrasting NC subgroups after median-split on a whole-brain level, high TAS scores were associated with decreased neural activity during IA in the sACC and increased insula activity. Though having equally high alexithymia scores, NC-high showed increased insula activity during IA compared to MDD, whilst both groups showed decreased activity in the sACC.
Conclusions: Within the context of decreased sACC activity during IA in alexithymia (NC-high and MDD), increased insula activity might mirror a compensatory mechanism in NC-high, which is disrupted in MDD.
Background
Doping presents a potential health risk for young athletes. Prevention programs are intended to prevent doping by educating athletes about banned substances. However, such programs have their limitations in practice. This led Germany to introduce the National Doping Prevention Plan (NDPP), in hopes of ameliorating the situation among young elite athletes. Two studies examined 1) the degree to which the NDPP led to improved prevention efforts in elite sport schools, and 2) the extent to which newly developed prevention activities of the national anti-doping agency (NADA) based on the NDPP have improved knowledge among young athletes within elite sports schools.
Methods
The first objective was investigated in a longitudinal study (Study I: t0 = baseline, t1 = follow-up 4 years after NDPP introduction) with N = 22 teachers engaged in doping prevention in elite sports schools. The second objective was evaluated in a cross-sectional comparison study (Study II) in N = 213 elite sports school students (54.5% male, 45.5% female, age M = 16.7 ± 1.3 years); all students had received the improved NDPP measures in school, one student group had additionally received NADA anti-doping activities, and a control group had not. Descriptive statistics were calculated, followed by McNemar tests, Wilcoxon tests and analysis of covariance (ANCOVA).
Results
Results indicate that, 4 years after the introduction of the NDPP, there have been only limited structural changes with regard to the frequency, type, and scope of doping prevention in elite sport schools. In Study II, by contrast, elite sport school students who received the additional NADA anti-doping activities performed better on an anti-doping knowledge test than students who did not take part (F(1, 207) = 33.99, p < 0.001), although this difference was small.
Conclusion
The integration of doping prevention in elite sport schools as part of the NDPP was only partially successful. The results of the evaluation indicate that the introduction of the NDPP has contributed more to a change in the content of doping prevention activities than to a structural transformation of anti-doping education in elite sport schools. Moreover, while students who received additional education in the form of the NDPP “booster sessions” had significantly more knowledge about doping than students who did not receive such education, this difference was small and may not translate into actual behavior.
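The McNemar tests mentioned in the Methods above compare paired binary outcomes (e.g., a yes/no answer at t0 vs. t1), using only the discordant pairs. A stdlib-only sketch of the uncorrected chi-square version, with hypothetical counts rather than the study's data:

```python
import math

def mcnemar(b, c):
    """McNemar chi-square statistic (without continuity correction)
    for paired binary data; b and c are the discordant-pair counts
    (changed in one direction vs. the other)."""
    chi2 = (b - c) ** 2 / (b + c)
    # p-value for a chi-square with 1 df via the complementary error function
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical discordant pairs: 10 changed yes->no, 3 changed no->yes
chi2, p = mcnemar(10, 3)
```

For small discordant counts an exact binomial version is usually preferred; the chi-square form shown here is the large-sample approximation.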
Different systems for habitual versus goal-directed control are thought to underlie human decision-making. Working memory is known to shape these decision-making systems and their interplay, and is known to support goal-directed decision making even under stress. Here, we investigated if and how decision systems are differentially influenced by breaks filled with diverse everyday life activities known to modulate working memory performance. We used a within-subject design in which young adults listened to music and played a video game during breaks interleaved with trials of a sequential two-step Markov decision task, designed to assess habitual as well as goal-directed decision making. Based on a neurocomputational model of task performance, we observed that for individuals with a rather limited working memory capacity, video gaming as compared to music reduced reliance on the goal-directed decision-making system, while a rather large working memory capacity prevented such a decline. Our findings suggest differential effects of everyday activities on key decision-making processes.
The improvement of power is an objective in the training of athletes. In order to identify effective methods of exercise, basic research is required regarding the mechanisms of muscular activity. The purpose of this study was to investigate whether or not a muscular pre-activation prior to an external impulse-like force impact has an effect on the maximal explosive eccentric Adaptive Force (xpAFeccmax). This power capability combines different probable power-enhancing mechanisms. To measure the xpAFeccmax, an innovative pneumatic device was used. During the measurement, the subject tries to hold an isometric position as long as possible. At the moment the subject’s maximal isometric holding strength is exceeded, the action merges into eccentric muscle action. This process is very close to motions in sports in which an adaptation of the neuromuscular system is required, e.g., force impacts caused by uneven surfaces during skiing. To investigate the effect of pre-activation on the xpAFeccmax of the quadriceps femoris muscle, n = 20 subjects had to pass three different pre-activation levels in a randomized order (level 1: 0.4 bar, level 2: 0.8 bar, level 3: 1.2 bar). After adjusting the standardized pre-pressure by pushing against the interface, an impulse-like load impacted on the distal tibia of the subject. During this, the xpAFeccmax was detected. The maximal voluntary isometric contraction (MVIC) was also measured. The torque values of the xpAFeccmax were compared with regard to the pre-activation levels. The results show a significant positive relation between the pre-activation of the quadriceps femoris muscle and the xpAFeccmax (male: p = 0.000, η² = 0.683; female: p = 0.000, η² = 0.907). The average percentage increase of torque amounted to +28.15 ± 25.4% between MVIC and xpAFeccmax at pre-pressure level 1, +12.09 ± 7.9% for the xpAFeccmax comparing pre-pressure levels 1 vs. 2, and +2.98 ± 4.2% comparing levels 2 and 3.
A higher but not maximal muscular activation prior to a fast impacting eccentric load seems to produce an immediate increase of force outcome. Different possible physiological explanatory approaches and the use as a potential training method are discussed.
Background
Isometric muscle actions can be performed either by initiating the action, e.g., pulling on an immovable resistance (PIMA), or by reacting to an external load, e.g., holding a weight (HIMA). In the present study, we mainly examined whether these modalities can be differentiated by oxygenation variables as well as by time to task failure (TTF). Furthermore, we analyzed whether these variables are changed by intermittent voluntary muscle twitches during weight holding (Twitch). It was assumed that twitches during a weight-holding task change the character of the isometric muscle action from reacting (≙ HIMA) to acting (≙ PIMA).
Methods
Twelve subjects (two dropouts) randomly performed two tasks (HIMA vs. PIMA or HIMA vs. Twitch, n = 5 each) with the elbow flexors at 60% of maximal torque, maintained until muscle failure with each arm. Local capillary venous oxygen saturation (SvO2) and the relative hemoglobin amount (rHb) were measured by light spectrometry.
Results
Within subjects, no significant differences were found between tasks regarding the behavior of SvO2 and rHb, the slope and extent of deoxygenation (max. SvO2 decrease), SvO2 level at global rHb minimum, and time to SvO2 steady states. The TTF was significantly longer during Twitch and PIMA (incl. Twitch) compared to HIMA (p = 0.043 and 0.047, respectively). There was no substantial correlation between TTF and maximal deoxygenation independently of the task (r = − 0.13).
Conclusions
HIMA and PIMA seem to have a similar microvascular oxygen and blood supply. The supply might be sufficient, as expressed by homeostatic steady states of SvO2 in all trials and increases in rHb in most of the trials. Intermittent voluntary muscle twitches might not provide additional supply but do extend the TTF. A changed neuromuscular control is discussed as a possible explanation.
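The correlation between TTF and maximal deoxygenation reported in the Results above (r = −0.13) is a Pearson coefficient. A minimal stdlib sketch with hypothetical values, not the study's measurements:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical time-to-task-failure (s) and max. deoxygenation (%) values
ttf = [120, 150, 180, 210, 240]
deoxy = [22, 25, 20, 27, 24]
r = pearson_r(ttf, deoxy)
```

Values near 0, such as the r = −0.13 reported above, indicate at most a weak linear relationship between the two measures.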