Satisfaction and frustration of the needs for autonomy, competence, and relatedness, as assessed with the 24-item Basic Psychological Need Satisfaction and Frustration Scale (BPNSFS), have been found to be crucial indicators of individuals’ psychological health. To increase the usability of this scale within a clinical and health services research context, we aimed to validate a German short version (12 items) of this scale in individuals with depression, including the examination of the relations from need frustration and need satisfaction to ill-being and quality of life (QOL). This cross-sectional study involved 344 adults diagnosed with depression (Mage (SD) = 47.5 years (11.1); 71.8% females). Confirmatory factor analyses indicated that the short version of the BPNSFS was not only reliable, but also fitted a six-factor structure (i.e., satisfaction/frustration × type of need). Subsequent structural equation modeling showed that need frustration related positively to indicators of ill-being and negatively to QOL. Surprisingly, need satisfaction did not predict differences in ill-being or QOL. The short form of the BPNSFS represents a practical instrument to measure need satisfaction and frustration in people with depression. Further, the results support recent evidence on the importance of especially need frustration in the prediction of psychopathology.
Background: Knowing and, if necessary, altering competitive athletes' real attitudes towards the use of banned performance-enhancing substances is an important goal of worldwide doping prevention efforts. However, athletes will not always be willing to report their real opinions. Reaction time-based attitude tests help conceal the ultimate goal of measurement from the participant and impede strategic answering. This study investigated how well a reaction time-based attitude test discriminated between athletes who were doping and those who were not. We investigated whether athletes whose urine samples were positive for at least one banned substance (dopers) evaluated doping more favorably than clean athletes (non-dopers).
Methods: We approached a group of 61 male competitive bodybuilders and collected urine samples for biochemical testing. The pictorial doping Brief Implicit Association Test (BIAT) was used for attitude measurement. This test quantifies the difference in response latencies (in milliseconds) to stimuli representing related concepts (i.e. doping-dislike/like-[health food]).
Results: Prohibited substances were found in 43% of all tested urine samples. Dopers had more lenient attitudes to doping than non-dopers (Hedges's g = -0.76). D-scores greater than -0.57 (CI95 = -0.72 to -0.46) might be indicative of a rather lenient attitude to doping. Evidence of combined administration of substances, complementary administration of substances to treat side effects, and use of stimulants to promote loss of body fat was common in the urine samples.
Conclusion: This study demonstrates that athletes' attitudes to doping can be assessed indirectly with a reaction time-based test, and that their attitudes are related to their behavior. Although bodybuilders may be more willing to reveal their attitude to doping than other athletes, these results still provide evidence that the pictorial doping BIAT may be useful in athletes from other sports, perhaps as a complementary measure in evaluations of the effectiveness of doping prevention interventions.
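For readers unfamiliar with the effect size reported above, a minimal sketch of Hedges's g (the bias-corrected standardized mean difference) follows; the group statistics used are hypothetical, not the study's data.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Bias-corrected standardized mean difference (Hedges's g)."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp  # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor
    return d * j

# Hypothetical example: two groups of 20, means 10 vs. 8, common SD 2
print(round(hedges_g(10, 2, 20, 8, 2, 20), 3))  # ≈ 0.98
```

A negative g, as in the abstract, simply means the first group scored lower than the second on the chosen scale direction.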
Background
Artificial intelligence (AI) is one of the most promising areas in medicine with many possibilities for improving health and wellness. Already today, diagnostic decision support systems may help patients to estimate the severity of their complaints. This fictional case study aimed to test the diagnostic potential of an AI algorithm for common sports injuries and pathologies.
Methods
Based on a literature review and clinical expert experience, five fictional “common” cases of acute and subacute injuries or chronic sport-related pathologies were created: concussion, ankle sprain, muscle pain, chronic knee instability (after ACL rupture), and tennis elbow. The symptoms of these cases were entered into a freely available chatbot-guided AI app and its diagnoses were compared to the pre-defined injuries and pathologies.
Results
The app asked between 25 and 36 questions per patient, with optional explanations of certain questions or illustrative photos on demand. It was stressed that the symptom analysis would not replace a doctor’s consultation. A 23-yr-old male patient case with a mild concussion was correctly diagnosed. An ankle sprain of a 27-yr-old female without ligament or bony lesions was also detected and an ER visit was suggested. Muscle pain in the thigh of a 19-yr-old male was correctly diagnosed. In the case of a 26-yr-old male with chronic ACL instability, the algorithm did not sufficiently cover the chronic aspect of the pathology, but the given recommendation of seeing a doctor would have helped the patient. Finally, the condition of chronic epicondylitis in a 41-yr-old male was correctly detected.
Conclusions
All chosen injuries and pathologies were either correctly diagnosed or at least accompanied by the right advice on when it is urgent to see a medical specialist. However, the quality of AI-based results could presumably depend on the data-driven experience of these programs as well as on the understanding of their users. Further studies should compare existing AI programs and their diagnostic accuracy for medical injuries and pathologies.
Background:
Deception can distort psychological tests on socially sensitive topics. Understanding the cerebral processes involved in such faking can be useful in the detection and prevention of deception. Previous research shows that faking a brief implicit association test (BIAT) evokes a characteristic ERP response. It is not yet known whether temporarily available self-control resources moderate this response. We randomly assigned 22 participants (15 females, 24.23 ± 2.91 years old) to a counterbalanced repeated-measurements design. Participants first completed a Brief-IAT (BIAT) on doping attitudes as a baseline measure and were then instructed to fake a negative doping attitude both when self-control resources were depleted and non-depleted. Cerebral activity during BIAT performance was assessed using high-density EEG.
Results:
Compared to the baseline BIAT, event-related potentials showed a first interaction at the parietal P1, while significant post hoc differences were found only at the later occurring late positive potential. Here, significantly decreased amplitudes were recorded for ‘normal’ faking, but not in the depletion condition. In source space, enhanced activity was found for ‘normal’ faking in the bilateral temporoparietal junction. Behaviorally, participants successfully faked the BIAT in both conditions.
Conclusions:
Results indicate that temporarily available self-control resources do not affect overt faking success on a BIAT. However, differences were found on an electrophysiological level. This indicates that while self-control resources play a negligible role in deliberate test faking at a phenotypical level, the underlying cerebral processes are markedly different.
We are glad to introduce the Second Journal Club of Volume Five, Second Issue. This edition focuses on relevant studies published in the last few years in the field of resistance training, chosen by our Editorial Board members and their colleagues. We hope to stimulate your curiosity in this field and to share with you our passion for sport, seen also from the scientific point of view. The Editorial Board members wish you an inspiring read.
The use of functional music in gait training, termed rhythmic auditory stimulation (RAS), and treadmill training (TT) have both been shown to be effective in stroke patients (SP). The combination of RAS and treadmill training (RAS-TT) has not been clinically evaluated to date. The aim of the study was to evaluate the efficacy of RAS-TT on functional gait in SP. The protocol followed the design of an explorative study with a rater-blinded three-arm prospective randomized controlled parallel-group design. Forty-five independently walking SP with a hemiparesis of the lower limb or an unsafe and asymmetrical walking pattern were recruited. RAS-TT was carried out over 4 weeks with TT and neurodevelopmental treatment based on the Bobath approach (NDT) serving as control interventions. For RAS-TT functional music was adjusted individually while walking on the treadmill. Pre- and post-assessments consisted of the fast gait speed test (FGS), a gait analysis with the locometre (LOC), 3 min walking time test (3MWT), and an instrumental evaluation of balance (IEB). Raters were blinded to group assignments. An analysis of covariance (ANCOVA) was performed with affiliated measures from pre-assessment and time between stroke and start of study as covariates. Thirty-five participants (mean age 63.6 ± 8.6 years, mean time between stroke and start of study 42.1 ± 23.7 days) completed the study (11 RAS-TT, 13 TT, 11 NDT). Significant group differences occurred in the FGS for adjusted post-measures in gait velocity [F(2,34) = 3.864, p = 0.032; partial η² = 0.205] and cadence [F(2,34) = 7.656, p = 0.002; partial η² = 0.338]. Group contrasts showed significantly higher values for RAS-TT. Stride length results did not vary between the groups. LOC, 3MWT, and IEB did not indicate group differences. One patient was withdrawn from TT because of pain in one arm.
The study provides first evidence for a higher efficacy of RAS-TT in comparison to the standard approaches TT and NDT in restoring functional gait in SP. The results support the implementation of functional music in neurological gait rehabilitation and its use in combination with treadmill training.
Serious knee pain and related disability have an annual prevalence of approximately 25% in those over the age of 55 years. As curative treatments for the common knee problems are not available to date, knee pathologies typically progress and often lead to osteoarthritis (OA). While the roles that the meniscus plays in knee biomechanics are well characterized, biological mechanisms underlying meniscus pathophysiology and roles in knee pain and OA progression are not fully clear. Experimental treatments for knee disorders that are successful in animal models often produce unsatisfactory results in humans due to species differences or the inability to fully replicate disease progression in experimental animals. The use of animals with spontaneous knee pathologies, such as dogs, can significantly help address this issue. As microscopic and macroscopic anatomy of the canine and human menisci are similar, spontaneous meniscal pathologies in canine patients are thought to be highly relevant for translational medicine. However, it is not clear whether the biomolecular mechanisms of pain, degradation of extracellular matrix, and inflammatory responses are species dependent. The aims of this review are (1) to provide an overview of the anatomy, physiology, and pathology of the human and canine meniscus, (2) to compare the known signaling pathways involved in spontaneous meniscus pathology between both species, and (3) to assess the relevance of dogs with spontaneous meniscal pathology as a translational model. Understanding these mechanisms in human and canine meniscus can help to advance diagnostic and therapeutic strategies for painful knee disorders and improve clinical decision making.
Background: Chronic ankle instability, developing from ankle sprain, is one of the most common sports injuries. Besides it being an ankle issue, chronic ankle instability can also cause additional injuries. Investigating the epidemiology of chronic ankle instability is an essential step to develop an adequate injury prevention strategy. However, the epidemiology of chronic ankle instability remains unknown. Therefore, the purpose of this study was to investigate the epidemiology of chronic ankle instability through valid and reliable self-reported tools in active populations.
Methods: An electronic search was performed on PubMed and Web of Science in July 2020. The inclusion criteria for articles were: peer-reviewed, published between 2006 and 2020, using one of the valid and reliable tools to evaluate ankle instability, determining chronic ankle instability based on the criteria of the International Ankle Consortium, and including the outcome of epidemiology of chronic ankle instability. The risk of bias of the included studies was evaluated with an adapted tool for the sports injury review method.
Results: After removing duplicated studies, 593 articles were screened for eligibility. Twenty full-texts were screened and finally nine studies were included, assessing 3804 participants in total. The participants were between 15 and 32 years old and represented soldiers, students, athletes and active individuals with a history of ankle sprain. The prevalence of chronic ankle instability was 25%, ranging between 7 and 53%. The prevalence of chronic ankle instability within participants with a history of ankle sprains was 46%, ranging between 9 and 76%. Five included studies identified chronic ankle instability based on the standard criteria, and four studies applied adapted exclusion criteria to conduct the study. Five out of nine included studies showed a low risk of bias.
Conclusions: The prevalence of chronic ankle instability shows a wide range. This could be due to the different exclusion criteria, age, sports discipline, or other factors among the included studies. For future studies, standardized criteria to investigate the epidemiology of chronic ankle instability are required, and such studies should be prospective. Factors affecting the prevalence of chronic ankle instability should be investigated and clearly described.
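The pooled prevalence figures reported above follow from participant-weighted pooling across studies of different sizes. A minimal sketch with invented study-level counts (not the review's data):

```python
def pooled_prevalence(cases, totals):
    """Overall prevalence when participants are pooled across studies."""
    return sum(cases) / sum(totals)

# Hypothetical study-level counts: (participants with CAI, total participants)
studies = [(50, 400), (120, 800), (30, 100)]
cases, totals = zip(*studies)
print(round(pooled_prevalence(cases, totals), 3))  # 0.154
```

Note that pooling this way weights large studies more heavily than simply averaging the study-level prevalences, which is one reason reported ranges can differ from the pooled figure.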
Combining training of muscle strength and cardiorespiratory fitness within a training cycle could increase athletic performance more than single-mode training. However, the physiological effects produced by each training modality could also interfere with each other, improving athletic performance less than single-mode training. Because anthropometric, physiological, and biomechanical differences between young and adult athletes can affect the responses to exercise training, young athletes might respond differently to concurrent training (CT) compared with adults. Thus, the aim of the present systematic review with meta-analysis was to determine the effects of concurrent strength and endurance training on selected physical fitness components and athletic performance in youth. A systematic literature search of PubMed and Web of Science identified 886 records. The studies included in the analyses examined children (girls age 6–11 years, boys age 6–13 years) or adolescents (girls age 12–18 years, boys age 14–18 years), compared CT with single-mode endurance (ET) or strength training (ST), and reported at least one strength/power-related (e.g., jump height), endurance-related (e.g., peak V̇O2, exercise economy), or performance-related (e.g., time trial) outcome. We calculated weighted standardized mean differences (SMDs). CT compared to ET produced small effects in favor of CT on athletic performance (n = 11 studies, SMD = 0.41, p = 0.04) and trivial effects on cardiorespiratory endurance (n = 4 studies, SMD = 0.04, p = 0.86) and exercise economy (n = 5 studies, SMD = 0.16, p = 0.49) in young athletes. A sub-analysis of chronological age revealed a trend toward larger effects of CT vs. ET on athletic performance in adolescents (SMD = 0.52) compared with children (SMD = 0.17). CT compared with ST had small effects in favor of CT on muscle power (n = 4 studies, SMD = 0.23, p = 0.04).
In conclusion, CT is more effective than single-mode ET or ST in improving selected measures of physical fitness and athletic performance in youth. Specifically, CT compared with ET improved athletic performance in children and particularly adolescents. Finally, CT was more effective than ST in improving muscle power in youth.
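The weighted SMDs reported above are conventionally pooled by inverse-variance weighting, so more precise studies count for more. A fixed-effect sketch with invented study values (the abstract does not report the study-level inputs):

```python
def pooled_smd(smds, variances):
    """Fixed-effect inverse-variance weighted mean of study-level SMDs."""
    weights = [1.0 / v for v in variances]
    return sum(w * s for w, s in zip(weights, smds)) / sum(weights)

# Hypothetical studies: SMDs with their sampling variances
print(pooled_smd([0.6, 0.2], [0.05, 0.10]))  # ≈ 0.467
```

The more precise study (variance 0.05) pulls the pooled estimate toward its own SMD of 0.6.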
Background/Purpose
Muscular reflex responses of the lower extremities to sudden gait disturbances are related to postural stability and injury risk. Chronic ankle instability (CAI) has been shown to affect activities related to the distal leg muscles while walking. Its effects on proximal muscle activities of the leg, both for the injured (IN) and uninjured side (NON), remain unclear. Therefore, the aim was to compare differences in the motor control strategy at ipsilateral and contralateral proximal joints during unperturbed and perturbed walking between individuals with CAI and matched controls.
Materials and methods
In a cross-sectional study, 13 participants with unilateral CAI and 13 controls (CON) walked on a split-belt treadmill with and without random left- and right-sided perturbations. EMG amplitudes of muscles at lower extremities were analyzed 200 ms after perturbations, 200 ms before, and 100 ms after (Post100) heel contact while walking. Onset latencies were analyzed at heel contacts and after perturbations. Statistical significance was set at alpha≤0.05 and 95% confidence intervals were applied to determine group differences. Cohen’s d effect sizes were calculated to evaluate the extent of differences.
Results
Participants with CAI showed increased EMG amplitudes for NON-rectus abdominus at Post100 and shorter latencies for IN-gluteus maximus after heel contact compared to CON (p < 0.05). Overall, leg muscles (rectus femoris, biceps femoris, and gluteus medius) activated earlier and less bilaterally (d = 0.30–0.88), and trunk muscles (bilateral rectus abdominus and NON-erector spinae) activated earlier and more for the CAI group than for the CON group (d = 0.33–1.09).
Conclusion
Unilateral CAI alters the pattern of the motor control strategy around proximal joints bilaterally. Neuromuscular training for the muscles whose motor control strategy is altered by CAI could be taken into consideration when planning rehabilitation for CAI.
Background
In health research, indicators of socioeconomic status (SES) are often used interchangeably and often lack theoretical foundation. This makes it difficult to compare results from different studies and to explore the relationship between SES and health outcomes. To aid researchers in choosing appropriate indicators of SES, this article proposes and tests a theory-based selection of SES indicators using chronic back pain as a health outcome.
Methods
Strength-of-relationship predictions were made using Brunner & Marmot’s model of ‘social determinants of health’. Subsequently, a longitudinal study was conducted with 66 patients receiving in-patient treatment for chronic back pain. Sociodemographic variables, four SES indicators (education, job position, income, multidimensional index) and back pain intensity and disability were obtained at baseline. Both pain dimensions were assessed again 6 months later. Using linear regression, the predictive strength of each SES indicator on pain intensity and disability was estimated and compared to the theory-based prediction.
Results
Chronic back pain intensity was best predicted by the multidimensional index (beta = 0.31, p < 0.05), followed by job position (beta = 0.29, p < 0.05) and education (beta = −0.29, p < 0.05); whereas, income exerted no significant influence. Back pain disability was predicted strongest by education (beta = −0.30, p < 0.05) and job position (beta = 0.29, p < 0.05). Here, multidimensional index and income had no significant influence.
Conclusions
The choice of SES indicators influences predictive power on both back pain dimensions, suggesting SES predictors cannot be used interchangeably. Therefore, researchers should carefully consider prior to each study which SES indicator to use. The introduced framework can be valuable in supporting this decision because it allows for a stable prediction of SES indicators’ influence and their hierarchy for a specific health outcome.
Long COVID patients show symptoms such as fatigue, muscle weakness and pain. Adequate diagnostics are still lacking. Investigating muscle function might be a beneficial approach. The holding capacity (maximal isometric Adaptive Force; AFisomax) was previously suggested to be especially sensitive to impairments. This longitudinal, non-clinical study aimed to investigate the AF in long COVID patients and their recovery process. AF parameters of elbow and hip flexors were assessed in 17 patients at three time points (pre: long COVID state, post: immediately after first treatment, end: recovery) by an objectified manual muscle test. The tester applied an increasing force on the limb of the patient, who had to resist isometrically for as long as possible. The intensities of 13 common symptoms were queried. At pre, patients started to lengthen their muscles at ~50% of the maximal AF (AFmax), which was then reached during eccentric motion, indicating unstable adaptation. At post and end, AFisomax increased significantly to ~99% and 100% of AFmax, respectively, reflecting stable adaptation. AFmax was statistically similar for all three time points. Symptom intensity decreased significantly from pre to end. The findings revealed a substantially impaired maximal holding capacity in long COVID patients, which returned to normal function with substantial health improvement. AFisomax might be a suitable, sensitive functional parameter to assess long COVID patients and to support the therapy process.
Purpose
To test whether the negative relationship between perceived stress and quality of life (Hypothesis 1) can be buffered by perceived social support in patients with dementia as well as in caregivers individually (Hypothesis 2: actor effects) and across partners (Hypothesis 3: partner effects and actor-partner effects).
Method
A total of 108 couples (N = 216 individuals) comprised of one individual with early-stage dementia and one caregiving partner were assessed at baseline and one month apart. Moderation effects were investigated by applying linear mixed models and actor-partner interdependence models.
Results
Although the stress-quality of life association was more pronounced in caregivers (beta = -.63, p < .001) compared to patients (beta = -.31, p < .001), this association was equally moderated by social support in patients (beta = .14, p < .05) and in caregivers (beta = .13, p < .05). Partner-buffering and actor-partner-buffering effects from one partner to his or her counterpart were not present.
Conclusion
The stress-buffering effect has been replicated in individuals with dementia and caregivers but not across partners. Interventions to improve quality of life through perceived social support should not only focus on caregivers, but should incorporate both partners.
Background: To handle competition demands, sparring drills are used for specific technical–tactical training as well as physical–physiological conditioning in combat sports. While the effects of different area sizes and numbers of within-round sparring partners on physiological and perceptive responses in combat sports were examined in previous studies, technical and tactical aspects were not investigated. This study investigated the effect of varying the number of within-round sparring partners (i.e., at a time; 1 vs. 1, 1 vs. 2, and 1 vs. 4) and area size (2 m × 2 m, 4 m × 4 m, and 6 m × 6 m) on the technical–tactical aspects of small combat games in kickboxing.
Method: Twenty male kickboxers (mean ± standard deviation, age: 20.3 ± 0.9 years), regularly competing in regional and national events randomly performed nine different kickboxing combats, lasting 2 min each. All combats were video recorded and analyzed using the software Dartfish.
Results: Results showed that the total number of punches was significantly higher in 1 versus 4 compared with 1 versus 1 (p = 0.011, d = 0.83). Further, the total number of kicks was significantly higher in 1 versus 4 compared with 1 versus 1 and 1 versus 2 (p < 0.001; d = 0.99 and d = 0.83, respectively). Moreover, the total number of kick combinations was significantly higher in 1 versus 4 compared with 1 versus 1 and 1 versus 2 (p < 0.001; d = 1.05 and d = 0.95, respectively). The same outcome was significantly lower in 2 m × 2 m compared with 4 m × 4 m and 6 m × 6 m areas (p = 0.010 and d = − 0.45; p < 0.001 and d = − 0.6, respectively). The number of block-and-parry was significantly higher in 1 versus 4 compared with 1 versus 1 (p < 0.001, d = 1.45) and 1 versus 2 (p = 0.046, d = 0.61) and in 2 m × 2 m compared with 4 m × 4 m and 6 × 6 m areas (p < 0.001; d = 0.47 and d = 0.66, respectively). Backwards lean actions occurred more often in 2 m × 2 m compared with 4 m × 4 m (p = 0.009, d = 0.53) and 6 m × 6 m (p = 0.003, d = 0.60). However, the number of foot defenses was significantly lower in 2 m × 2 m compared with 6 m × 6 m (p < 0.001, d = 1.04) and 4 m × 4 m (p = 0.004, d = 0.63). Additionally, the number of clinches was significantly higher in 1 versus 1 compared with 1 versus 2 (p = 0.002, d = 0.7) and 1 versus 4 (p = 0.034, d = 0.45).
Conclusions: This study provides practical insights into how to manipulate within-round sparring partners’ number and/or area size to train specific kickboxing technical–tactical fundamentals.
Real options are widely applied in strategic and operational decision-making, allowing for managerial flexibility in uncertain contexts. Increased scholarly interest has led to an extensive but fragmented research landscape. We aim to measure and systematize the research field quantitatively. To achieve this goal, we conduct bibliometric performance analyses and bibliographic coupling analyses with an in-depth content review. The results of the performance analyses show an increasing interest in real options since the beginning of the 2000s and identify the most influential journals and authors. The science mappings reveal six and seven research clusters over the last two decades. Based on an in-depth analysis of their themes, we develop a research framework comprising antecedents, application areas, internal and external contingencies, and uncertainty resolution through real option valuation or reasoning. We identify several gaps in that framework, which we propose to tackle in future research.
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear. Prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, a sample of one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression experiences (the Hospital Anxiety and Depression Scale), and pain-related cognitions (the Fear Avoidance Beliefs Questionnaire). Repeated measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the sub-sample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes of the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but there was no effect on the other variables of interest.
This may be partially explained by participants' low comorbidities at baseline. Results are important regarding cost-free home-based alternatives for back pain patients and prevention tasks. Furthermore, this study closes the gap of missing long-term effect analyses in this field.
Background: The use of psychoactive substances to neuroenhance cognitive performance is prevalent. Neuroenhancement (NE) in everyday life and doping in sport might rest on similar attitudinal representations, and both behaviors can be theoretically modeled by comparable means-to-end relations (substance-performance). A behavioral (not substance-based) definition of NE is proposed, with assumed functionality as its core component. It is empirically tested whether different NE variants (lifestyle drug, prescription drug, and illicit substance) can be regressed to school stressors.
Findings: Participants were 519 students (25.8 +/- 8.4 years old, 73.1% female). Logistic regressions indicate that a modified doping attitude scale can predict all three NE variants. Multiple NE substance abuse was frequent. Overwhelming demands in school were associated with lifestyle and prescription drug NE.
Conclusions: Researchers should be sensitive to probable structural similarities between enhancement in everyday life and in sport and should systematically explore where findings from one domain can be adapted for the other. Policy makers should be aware that students might misperceive NE as an acceptable means of coping with stress in school, and should help raise societal awareness of NE among young people in general.
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the most predictive stress-related pattern set for CLBP for a 1-year diagnosis.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role for upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric variables (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarkers (CPI: RMSE = 12.21; DISS: RMSE = 8.94) could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression and the emergence of a “hypocortisolemic symptom triad,” whereby social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that makes it possible to identify individuals at higher risk for upcoming pain disorders and can be used in practice.
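The abstract above reports prediction accuracy as the root mean squared error (RMSE) of the derived pattern sets. As a minimal illustration of how that metric is computed (the pain-intensity scores below are invented for the example, not study data):

```python
import math

def rmse(observed, predicted):
    """Root mean squared error between observed and model-predicted scores."""
    assert len(observed) == len(predicted)
    return math.sqrt(
        sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    )

# Hypothetical characteristic pain intensity scores (0-100 scale), illustration only
observed = [40, 55, 30, 70]
predicted = [42, 50, 35, 64]
print(round(rmse(observed, predicted), 2))  # 4.74
```

Lower RMSE means predictions sit closer to the observed scores, which is how the psychometric and biomarker sets are compared in the abstract.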
Purpose: The aim of this study was to compare the effects of moderate intensity, low volume (MILV) vs. low intensity, high volume (LIHV) strength training on sport-specific performance, measures of muscular fitness, and skeletal muscle mass in young kayakers and canoeists.
Methods: Semi-elite young kayakers and canoeists (N = 40, 13 ± 0.8 years, 11 girls) performed either MILV (70–80% 1-RM, 6–12 repetitions per set) or LIHV (30–40% 1-RM, 60–120 repetitions per set) strength training for one season. Linear mixed-effects models were used to compare effects of training condition on changes over time in 250 and 2,000 m time trials, handgrip strength, underhand shot throw, average bench pull power over 2 min, and skeletal muscle mass. Both between- and within-subject designs were used for analysis. An alpha of 0.05 was used to determine statistical significance.
Results: Between- and within-subject analyses showed that monthly changes were greater in LIHV vs. MILV for the 2,000 m time trial (between: 9.16 s, SE = 2.70, p < 0.01; within: 2,000 m: 13.90 s, SE = 5.02, p = 0.01) and bench pull average power (between: 0.021 W⋅kg–1, SE = 0.008, p = 0.02; within: 0.010 W⋅kg–1, SE = 0.009, p > 0.05). Training conditions did not affect other outcomes.
Conclusion: Young sprint kayakers and canoeists benefit from LIHV more than MILV strength training in terms of 2,000 m performance and muscular endurance (i.e., 2 min bench pull power).
The aim of this study is to monitor short-term seasonal development of young Olympic weightlifters’ anthropometry, body composition, physical fitness, and sport-specific performance. Fifteen male weightlifters aged 13.2 ± 1.3 years participated in this study. Tests for the assessment of anthropometry (e.g., body-height, body-mass), body-composition (e.g., lean-body-mass, relative fat-mass), muscle strength (grip-strength), jump performance (drop-jump (DJ) height, countermovement-jump (CMJ) height, DJ contact time, DJ reactive-strength-index (RSI)), dynamic balance (Y-balance-test), and sport-specific performance (i.e., snatch and clean-and-jerk) were conducted at different time-points (i.e., T1 (baseline), T2 (9 weeks), T3 (20 weeks)). Strength tests (i.e., grip strength, clean-and-jerk and snatch) and training volume were normalized to body mass. Results showed small-to-large increases in body-height, body-mass, lean-body-mass, and lower-limbs lean-mass from T1-to-T2 and T2-to-T3 (∆0.7–6.7%; 0.1 ≤ d ≤ 1.2). For fat-mass, a significant small-sized decrease was found from T1-to-T2 (∆13.1%; d = 0.4) and a significant increase from T2-to-T3 (∆9.1%; d = 0.3). A significant main effect of time was observed for DJ contact time (d = 1.3) with a trend toward a significant decrease from T1-to-T2 (∆–15.3%; d = 0.66; p = 0.06). For RSI, significant small increases from T1-to-T2 (∆9.9%, d = 0.5) were noted. Additionally, a significant main effect of time was found for snatch (d = 2.7) and clean-and-jerk (d = 3.1) with significant small-to-moderate increases for both tests from T1-to-T2 and T2-to-T3 (∆4.6–11.3%, d = 0.33 to 0.64). The other tests did not change significantly over time (0.1 ≤ d ≤ 0.8). Results showed significantly higher training volume for sport-specific training during the second period compared with the first period (d = 2.2). 
Five months of Olympic weightlifting contributed to significant changes in anthropometry, body-composition, and sport-specific performance. However, hardly any significant gains were observed for measures of physical fitness. Coaches are advised to design training programs that target a variety of fitness components to lay an appropriate foundation for later performance as an elite athlete.
The Adaptive Force (AF) reflects the neuromuscular capacity to adapt to external loads during holding muscle actions, similar to motions in real life and sports. The maximal isometric AF (AFisomax) was considered to be the most relevant parameter and was assumed to have major importance regarding injury mechanisms and the development of musculoskeletal pain. The aim of this study was to investigate the behavior of different torque parameters over the course of 30 repeated maximal AF trials. In addition, maximal holding vs. maximal pushing isometric muscle actions were compared. A side consideration was the behavior of torques over the course of repeated AF actions when comparing strength and endurance athletes. The elbow flexors of n = 12 males (six strength/six endurance athletes, non-professionals) were measured 30 times (120 s rest) using a pneumatic device. Maximal voluntary isometric contraction (MVIC) was measured pre and post. MVIC, AFisomax, and AFmax (maximal torque of one AF measurement) were evaluated using different considerations and statistical tests. AFmax and AFisomax declined over the course of the 30 trials [regression slope (mean ± standard deviation): AFmax = −0.323 ± 0.263; AFisomax = −0.45 ± 0.45]. The decline from start to end amounted to −12.8% ± 8.3% (p < 0.001) for AFmax and −25.41% ± 26.40% (p < 0.001) for AFisomax. AF parameters declined more in strength vs. endurance athletes. Strength athletes showed a rather stable decline for AFmax and a plateau formation for AFisomax after 15 trials. In contrast, endurance athletes reduced their AFmax especially after the first five trials and remained at a rather similar level for AFisomax. The maximum AFisomax across all 30 trials amounted to 67.67% ± 13.60% of MVIC (p < 0.001, n = 12), supporting the hypothesis of two types of isometric muscle action (holding vs. pushing).
The findings provided the first data on the behavior of torque parameters after repeated isometric–eccentric actions and revealed further insights into neuromuscular control strategies. Additionally, they highlight the importance of investigating AF parameters in athletes based on the different behaviors compared to MVIC. This is assumed to be especially relevant regarding injury mechanisms.
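The decline across the 30 trials in the abstract above is summarized as a regression slope per trial. A minimal sketch of that computation (ordinary least squares of torque against trial number; the torque values below are hypothetical, not study data):

```python
def trial_slope(torques):
    """OLS slope of torque values against trial number 1..n."""
    n = len(torques)
    xs = list(range(1, n + 1))
    mean_x = sum(xs) / n
    mean_y = sum(torques) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, torques))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# A hypothetical linear decline of 0.3 torque units per trial over 30 trials
declining = [60 - 0.3 * i for i in range(30)]
print(trial_slope(declining))  # -0.3 (up to floating-point error)
```

A negative slope, as reported for AFmax and AFisomax, quantifies how much torque is lost per additional trial.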
Rehabilitation after autologous chondrocyte implantation for isolated cartilage defects of the knee
(2017)
Autologous chondrocyte implantation for treatment of isolated cartilage defects of the knee has become well established. Although various publications report technical modifications, clinical results, and cell-related issues, little is known about appropriate and optimal rehabilitation after autologous chondrocyte implantation. This article reviews the literature on rehabilitation after autologous chondrocyte implantation and presents a rehabilitation protocol that has been developed considering the best available evidence and has been successfully used for several years in a large number of patients who underwent autologous chondrocyte implantation for cartilage defects of the knee.
Background: Neuroenhancement (NE), the use of psychoactive substances in order to enhance a healthy individual's cognitive functioning from a proficient to an even higher level, is prevalent in student populations. According to the strength model of self-control, people fail to self-regulate and fall back on their dominant behavioral response when finite self-control resources are depleted. An experiment was conducted to test the hypothesis that ego-depletion will prevent students who are unfamiliar with NE from trying it.
Findings: 130 undergraduates who denied having tried NE before (43% female, mean age = 22.76 ± 4.15 years) were randomly assigned to either an ego-depletion or a control condition. The dependent variable was taking an "energy-stick" (a legal nutritional supplement containing low doses of caffeine, taurine, and vitamin B), offered as a potential means of enhancing performance on the bogus concentration task that followed. Logistic regression analysis showed that ego-depleted participants were about three times less likely to take the substance, OR = 0.37, p = .01.
Conclusion: This experiment found that trying NE for the first time was more likely if an individual's cognitive capacities were not depleted. This means that mental exhaustion is not predictive of NE in students for whom NE is not the dominant response. Trying NE for the first time is therefore more likely to occur as a thoughtful attempt at self-regulation than as an automatic behavioral response in stressful situations. We therefore recommend targeting interventions at this inter-individual difference. Students without previous reinforcing NE experience should be provided with information about the possible negative health outcomes of NE. Reconfiguring structural aspects of the academic environment (e.g. lessening workloads) might help to deter current users.
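The ego-depletion finding above is reported as an odds ratio (OR = 0.37). A minimal sketch of how an OR relates to a logistic-regression coefficient and to group odds (the numbers are purely illustrative, not the study's raw data):

```python
import math

def odds_ratio_from_coef(b):
    """Convert a logistic-regression coefficient (log-odds) to an odds ratio."""
    return math.exp(b)

def prob_from_odds(odds):
    """Convert odds to a probability."""
    return odds / (1 + odds)

# An OR of 0.37 corresponds to a coefficient of ln(0.37) in the fitted model
b = math.log(0.37)
print(round(odds_ratio_from_coef(b), 2))  # 0.37

# If the control group's odds of taking the substance were 1:1 (hypothetical),
# the depleted group's odds would be 1 * 0.37, i.e. a probability of about 0.27
print(round(prob_from_odds(1 * 0.37), 2))  # 0.27
```

This is why OR = 0.37 is read as the depleted group being roughly three times less likely (1/0.37 ≈ 2.7) to take the substance.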
Introduction: Adequate cognitive function in patients is a prerequisite for successful implementation of patient education and lifestyle coping in comprehensive cardiac rehabilitation (CR) programs. Although the association between cardiovascular diseases and cognitive impairments (CIs) is well known, the prevalence particularly of mild CI in CR and the characteristics of affected patients have been insufficiently investigated so far.
Methods: In this prospective observational study, 496 patients (54.5 ± 6.2 years, 79.8% men) with coronary artery disease following an acute coronary event (ACE) were analyzed. Patients were enrolled within 14 days of discharge from the hospital in a 3-week inpatient CR program. Patients were tested for CI using the Montreal Cognitive Assessment (MoCA) upon admission to and discharge from CR. Additionally, sociodemographic, clinical, and physiological variables were documented. The data were analyzed descriptively and in a multivariate stepwise backward elimination regression model with respect to CI.
Results: At admission to CR, CI (MoCA score < 26) was identified in 182 patients (36.7%). Compared to the no-CI group, the CI group showed a higher prevalence of smoking (65.9 vs 56.7%, P = 0.046), heavy (physically demanding) workloads (26.4 vs 17.8%, P < 0.001), sick leave longer than 1 month prior to CR (28.6 vs 18.5%, P = 0.026), reduced exercise capacity (102.5 vs 118.8 W, P = 0.006), and a shorter 6-min walking distance (401.7 vs 421.3 m, P = 0.021). The age- and education-adjusted model showed positive associations with CI only for sick leave of more than 1 month prior to ACE (odds ratio [OR] 1.673, 95% confidence interval 1.07–2.79; P = 0.03) and heavy workloads (OR 2.18, 95% confidence interval 1.42–3.36; P < 0.01).
Conclusion: The prevalence of CI in CR was considerably high, affecting more than one-third of cardiac patients. Besides age and education level, CI was associated with heavy workloads and a longer sick leave before ACE.
Background: The back pain screening tool Risk-Prevention-Index Social (RPI-S) identifies the individual psychosocial risk for low back pain chronification and supports the allocation of patients at risk in additional multidisciplinary treatments. The study objectives were to evaluate (1) the prognostic validity of the RPI-S for a 6-month time frame and (2) the clinical benefit of the RPI-S.
Methods: In a multicenter single-blind 3-armed randomized controlled trial, n = 660 persons (age 18–65 years) were randomly assigned to a twelve-week uni- or multidisciplinary exercise intervention or control group. Psychosocial risk was assessed by the RPI-S domain social environment (RPI-SSE) and the outcome pain by the Chronic Pain Grade Questionnaire (baseline M1, 12-weeks M4, 24-weeks M5). Prognostic validity was quantified by the root mean squared error (RMSE) within the control group. The clinical benefit of RPI-SSE was calculated by repeated measures ANOVA in intervention groups.
Results: A subsample of n = 274 participants (mean age = 38.0 years, SD 13.1) was analyzed, of whom 30% were classified as at risk based on their psychosocial profile. The half-year prognostic validity was good (RMSE for disability of 9.04 at M4 and 9.73 at M5; RMSE for pain intensity of 12.45 at M4 and 14.49 at M5). People at risk showed a significantly stronger reduction in pain disability and intensity at M4/M5 if they participated in the multidisciplinary exercise treatment. Subjects not at risk showed a smaller reduction in pain disability in both interventions and no group differences for pain intensity. Regarding disability due to pain, around 41% of the sample would have received an unsuitable treatment without the back pain screening.
Conclusion: The RPI-SSE demonstrated good prognostic validity and applicability, and its clinical benefit was confirmed by the clear advantage of offering an individualized treatment.
There is ample evidence that youth resistance training (RT) is safe, enjoyable, and effective for different markers of performance (e.g., muscle strength, power, linear sprint speed) and health (e.g., injury prevention). Accordingly, the first aim of this narrative review is to present and discuss the relevance of muscle strength for youth physical development. The second purpose is to report evidence on the effectiveness of RT on muscular fitness (muscle strength, power, muscle endurance), movement skill performance, and injury prevention in youth. There is evidence that RT is effective in enhancing measures of muscle fitness in children and adolescents, irrespective of sex. Additionally, numerous studies indicate that RT has positive effects on fundamental movement skills (e.g., jumping, running, throwing) in youth regardless of age, maturity, training status, and sex. Further, irrespective of age, sex, and training status, regular exposure to RT (e.g., plyometric training) decreases the risk of sustaining injuries in youth. This implies that RT should be a meaningful element of youths’ exercise programming. This has been acknowledged by global (e.g., World Health Organization) and national (e.g., National Strength and Conditioning Association) health- and performance-related organizations, which is why they recommend performing RT as an integral part of weekly exercise programs to promote muscular strength and fundamental movement skills and to prevent injuries in youth.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. While we previously observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, we here aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, to relate it to functional activation in an a-priori region of interest encompassing the NAcc and amygdala, and to relate it to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging, and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed a PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r(s) = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; possible different involvement in association with disease trajectory should be investigated in future studies.
In sports and movement sciences, isometric muscle function is usually measured by pushing against a stable resistance. However, subjectively one can either hold or push isometrically, and several investigations suggest a distinction between these forms. The aim of this study was to investigate whether these two forms of isometric muscle action can be distinguished by objective parameters in an interpersonal setting. Twenty subjects were grouped into 10 same-sex pairs, in which one partner performed the pushing isometric muscle action (PIMA) and the other partner executed the holding isometric muscle action (HIMA). The partners had contact at the distal forearms via an interface, which included a strain gauge and an acceleration sensor. The mechanical oscillations of the triceps brachii muscle (MMGtri), its tendon (MTGtri), and the abdominal muscle (MMGobl) were recorded by a piezoelectric-sensor-based measurement system. Each partner performed three 15-s trials (80% MVIC) and two fatiguing trials (90% MVIC) during PIMA and HIMA, respectively. Parameters to compare PIMA and HIMA were the mean frequency, the normalized mean amplitude, the amplitude variation, the power in the frequency range of 8 to 15 Hz, a special power-frequency ratio, and the number of task failures during HIMA or PIMA (partner who quit the task). A “HIMA failure” occurred in 85% of trials (p < 0.001). No significant differences between PIMA and HIMA were found for the mean frequency and normalized amplitude. The MMGobl showed significantly higher values of amplitude variation (15 s: p = 0.013; fatiguing: p = 0.007) and of the power-frequency ratio (15 s: p = 0.040; fatiguing: p = 0.002) during HIMA, and a higher power in the range of 8 to 15 Hz during PIMA (15 s: p = 0.001; fatiguing: p = 0.011). MMGtri and MTGtri showed no significant differences. Based on the findings, it is suggested that a holding and a pushing isometric muscle action can be distinguished objectively, whereby a more complex neural control is assumed for HIMA.
Degenerative disc disease is associated with increased expression of pro-inflammatory cytokines in the intervertebral disc (IVD). However, it is not completely clear how inflammation arises in the IVD and which cellular compartments are involved in this process. Recently, the endoplasmic reticulum (ER) has emerged as a possible modulator of inflammation in age-related disorders. In addition, ER stress has been associated with the microenvironment of degenerated IVDs. Therefore, the aim of this study was to analyze the effects of ER stress on inflammatory responses in degenerated human IVDs and associated molecular mechanisms. Gene expression of ER stress marker GRP78 and pro-inflammatory cytokines IL-6, IL-8, IL-1 beta, and TNF-alpha was analyzed in human surgical IVD samples (n = 51, Pfirrmann grade 2-5). The expression of GRP78 positively correlated with the degeneration grade in lumbar IVDs and IL-6, but not with IL-1 beta and TNF-alpha. Another set of human surgical IVD samples (n = 25) was used to prepare primary cell cultures. ER stress inducer thapsigargin (Tg, 100 and 500 nM) activated gene and protein expression of IL-6 and induced phosphorylation of p38 MAPK. Both inhibition of p38 MAPK by SB203580 (10 mu M) and knockdown of ER stress effector CCAAT-enhancer-binding protein homologous protein (CHOP) reduced gene and protein expression of IL-6 in Tg-treated cells. Furthermore, the effects of an inflammatory microenvironment on ER stress were tested. TNF-alpha (5 and 10 ng/mL) did not activate ER stress, while IL-1 beta (5 and 10 ng/mL) activated gene and protein expression of GRP78, but did not influence [Ca2+](i) flux and expression of CHOP, indicating that pro-inflammatory cytokines alone may not induce ER stress in vivo. This study showed that IL-6 release in the IVD can be initiated following ER stress and that ER stress mediates IL-6 release through p38 MAPK and CHOP. 
Therapeutic targeting of ER stress response may reduce the consequences of the harsh microenvironment in degenerated IVD.
Intervertebral disc (IVD) cells are naturally exposed to high osmolarity and complex mechanical loading, which drive microenvironmental osmotic changes. Age- and degeneration-induced degradation of the IVD's extracellular matrix causes osmotic imbalance, which, together with an altered function of cellular receptors and signalling pathways, instigates local osmotic stress. Cellular responses to osmotic stress include osmoadaptation and activation of pro-inflammatory pathways. This review summarises the current knowledge on how IVD cells sense local osmotic changes and translate these signals into physiological or pathophysiological responses, with a focus on inflammation. Furthermore, it discusses the expression and function of putative membrane osmosensors (e.g. solute carrier transporters, transient receptor potential channels, aquaporins and acid-sensing ion channels) and osmosignalling mediators [e.g. tonicity response element-binding protein/nuclear factor of activated T-cells 5 (TonEBP/NFAT5), nuclear factor kappa-light-chain-enhancer of activated B cells (NF-kappa B)] in healthy and degenerated IVDs. Finally, an overview of the potential therapeutic targets for modifying osmosensing and osmosignalling in degenerated IVDs is provided.
Background Recent shoulder injury prevention programs have utilized resistance exercises combined with different forms of instability, with the goal of eliciting functional adaptations and thereby reducing the risk of injury. However, it is still unknown how an unstable weight mass (UWM) affects the muscular activity of the shoulder stabilizers. Aim of the study was to assess neuromuscular activity of dynamic shoulder stabilizers under four conditions of stable and UWM during three shoulder exercises. It was hypothesized that a combined condition of weight with UWM would elicit greater activation due to the increased stabilization demand. Methods Sixteen participants (7 m/9 f) were included in this cross-sectional study and prepared with an EMG-setup for the: Mm. upper/lower trapezius (U.TA/L.TA), lateral deltoid (DE), latissimus dorsi (LD), serratus anterior (SA) and pectoralis major (PE). A maximal voluntary isometric contraction test (MVIC; 5 s.) was performed on an isokinetic dynamometer. Next, internal/external rotation (In/Ex), abduction/adduction (Ab/Ad) and diagonal flexion/extension (F/E) exercises (5 reps.) were performed with four custom-made-pipes representing different exercise conditions. First, the empty-pipe (P; 0.5 kg) and then, randomly ordered, water-filled-pipe (PW; 1 kg), weight-pipe (PG; 4.5 kg) and weight + water-filled-pipe (PWG; 4.5 kg), while EMG was recorded. Raw root-mean-square values (RMS) were normalized to MVIC (%MVIC). Differences between conditions for RMS%MVIC, scapular stabilizer (SR: U.TA/L.TA; U.TA/SA) and contraction (CR: concentric/eccentric) ratios were analyzed (paired t-test; p <= 0.05; Bonferroni adjusted alpha = 0.008). Results PWG showed significantly greater muscle activity for all exercises and all muscles except for PE compared to P and PW. Condition PG elicited muscular activity comparable to PWG (p > 0.008) with significantly lower activation of L.TA and SA in the In/Ex rotation. 
The SR ratio was significantly higher in PWG compared to P and PW. No significant differences were found for the CR ratio in all exercises and for all muscles. Conclusion Higher weight generated greater muscle activation whereas an UWM raised the neuromuscular activity, increasing the stabilization demands. Especially in the In/Ex rotation, an UWM increased the RMS%MVIC and SR ratio. This might improve training effects in shoulder prevention and rehabilitation programs.
Background: Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called Neuroenhancement (NE), has not yet been integrated into a behavioral theory that is able to predict performance. Job Demands-Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students’ self-reported NE can be integrated into JD-R Theory’s comprehensive approach to psychological health and performance.
Methods: 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance) were measured. Path models were used in order to test our data’s fit to hypothesized main effects and interactions.
Results: JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory’s health impairment process: Lifestyle drug NE (p < .05) as well as prescription drug NE (p < .001) is associated with higher burnout scores, and lifestyle drug NE aggravates the study demands-burnout interaction. In addition, prescription drug NE mitigates the protective influence of resources on burnout and on motivation.
Conclusion: According to our results, uninformed experimentation with NE (i.e., without medical supervision) might result in strain. Increased strain is related to decreased performance. From a public health perspective, intervention strategies should address these costs of non-supervised NE. With regard to future research, we propose modeling NE as a means to an end (i.e. performance enhancement) rather than as a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon.
Background
The metabolic syndrome (MetS) is a risk cluster for a number of secondary diseases. The implementation of prevention programs requires early detection of individuals at risk. However, access to health care providers is limited in structurally weak regions. Brandenburg, a rural federal state in Germany, has an especially high MetS prevalence and disease burden. This study aims to validate and test the feasibility of a setup for mobile diagnostics of MetS and its secondary diseases, to evaluate the MetS prevalence and its association with moderating factors in Brandenburg and to identify new ways of early prevention, while establishing a “Mobile Brandenburg Cohort” to reveal new causes and risk factors for MetS.
Methods
In a pilot study, setups for mobile diagnostics of MetS and secondary diseases will be developed and validated. A van will be equipped as an examination room using point-of-care blood analyzers and by mobilizing standard methods. In study part A, these mobile diagnostic units will be placed at different locations in Brandenburg to locally recruit 5000 participants aged 40-70 years. They will be examined for MetS and advice on nutrition and physical activity will be provided. Questionnaires will be used to evaluate sociodemographics, stress perception, and physical activity. In study part B, participants with MetS, but without known secondary diseases, will receive a detailed mobile medical examination, including MetS diagnostics, medical history, clinical examinations, and instrumental diagnostics for internal, cardiovascular, musculoskeletal, and cognitive disorders. Participants will receive advice on nutrition and an exercise program will be demonstrated on site. People unable to participate in these mobile examinations will be interviewed by telephone. If necessary, participants will be referred to general practitioners for further diagnosis.
Discussion
The mobile diagnostics approach enables early detection of individuals at risk, and their targeted referral to local health care providers. Evaluation of the MetS prevalence, its relation to risk-increasing factors, and the “Mobile Brandenburg Cohort” create a unique database for further longitudinal studies on the implementation of home-based prevention programs to reduce mortality, especially in rural regions.
Trial registration
German Clinical Trials Register, DRKS00022764; registered 07 October 2020—retrospectively registered.
Background:
Arising from the relevance of sensorimotor training in the therapy of nonspecific low back pain patients and from the value of individualized therapy, the present trial aims to test the feasibility and efficacy of individualized sensorimotor training interventions in patients suffering from nonspecific low back pain.
Methods and study design:
A multicentre, single-blind, two-armed randomized controlled trial is performed to evaluate the effects of a 12-week (3 weeks supervised centre-based and 9 weeks home-based) individualized sensorimotor exercise program. The control group stays inactive during this period. Outcomes are pain and pain-associated function as well as motor function in adults with nonspecific low back pain. Each participant is scheduled for five measurement dates: baseline (M1), following centre-based training (M2), following home-based training (M3), and at two follow-up time points 6 months (M4) and 12 months (M5) after M1. All investigations and the assessment of the primary and secondary outcomes are performed in a standardized order: questionnaires – clinical examination – biomechanics (motor function). Subsequent statistical procedures are executed after examination of the underlying assumptions for parametric or non-parametric testing.
Discussion:
The results of the study will be of clinical and practical relevance not only for researchers and policy makers but also for the general population suffering from nonspecific low back pain.
Trial registration:
Identification number DRKS00010129. Registered in the German Clinical Trials Register on 3 March 2016.
Mechanotendography (MTG) is a method for analyzing the mechanical oscillations of tendons during muscular actions. The aim of this investigation was to evaluate the technical reliability of a piezo-based measurement system used for MTG. The reliability measurements were performed using audio samples played by a subwoofer; the pressure waves generated this way were recorded by the piezo-based measurement system. A 40-Hz sine oscillation and four different MTG signals formerly recorded in vivo were converted into audio files and used as test signals. Five trials with each audio file were performed, and one audio file was used for repetition trials on another day. The signals’ correlation was estimated by Spearman correlation coefficients (MCC), intraclass correlation coefficients (ICC(3,1)), Cronbach’s alpha (CA), and mean distances (MD). All parameters were compared between repetition trials and randomly matched signals. The repetition trials showed high correlations (MCC: 0.86 ± 0.13, ICC: 0.89 ± 0.12, CA: 0.98 ± 0.03) and low MD (0.03 ± 0.03 V) and differed significantly from the randomly matched signals (MCC: 0.15 ± 0.10, ICC: 0.17 ± 0.09, CA: 0.37 ± 0.16, MD: 0.19 ± 0.01 V) (p = 0.001–0.043). This indicates excellent reliability of the measurement system. Provided that the skin above superficial tendons oscillates adequately, we consider this tool valid for application to the musculoskeletal system.
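One of the reliability measures named above is Cronbach's alpha across repeated trials. A minimal sketch of that computation (the three near-identical "signals" below are invented toy data, not MTG recordings):

```python
def cronbach_alpha(trials):
    """Cronbach's alpha over k repeated trials.
    `trials` is a list of k equally long signals (lists of samples)."""
    k = len(trials)
    n = len(trials[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(t) for t in trials)
    # Sum the k trials sample-by-sample to form the total signal
    totals = [sum(t[i] for t in trials) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Three hypothetical, nearly identical trials -> alpha close to 1
trials = [[1.0, 2.0, 3.0, 4.0], [1.1, 2.1, 2.9, 4.2], [0.9, 1.8, 3.1, 3.9]]
print(cronbach_alpha(trials))
```

Highly consistent repetitions push alpha toward 1, which matches the repetition-trial values (CA: 0.98 ± 0.03) reported in the abstract, while unrelated signals yield much lower values.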
Mechanical muscular oscillations are rarely the objective of investigations aiming to identify a biomarker for Parkinson’s disease (PD). Therefore, the aim of this study was to investigate whether or not this specific motor output differs between PD patients and controls. The novelty is that patients without tremor were investigated while performing a unilateral isometric motor task. The force of the arm flexors and the forearm acceleration (ACC) were recorded, as well as the mechanomyography of the biceps brachii (MMGbi), brachioradialis (MMGbra) and pectoralis major (MMGpect) muscles, using a piezoelectric-sensor-based system during a unilateral motor task at 70% of the maximal voluntary isometric contraction (MVIC). The frequency, a power-frequency ratio, the amplitude variation, the slope of amplitudes and their interlimb asymmetries were analysed. The results indicate that the oscillatory behavior of muscular output in PD without tremor deviates from controls in some parameters: significant differences appeared for the power-frequency ratio (p = 0.001, r = 0.43) and for the amplitude variation (p = 0.003, r = 0.34) of MMGpect. The interlimb asymmetries differed significantly concerning the power-frequency ratio of MMGbi (p = 0.013, r = 0.42) and MMGbra (p = 0.048, r = 0.39), as well as regarding the mean frequency (p = 0.004, r = 0.48) and amplitude variation of MMGpect (p = 0.033, r = 0.37). The mean (M) and coefficient of variation (CV) of the slope of ACC differed significantly (M: p = 0.022, r = 0.33; CV: p = 0.004, r = 0.43). All other parameters showed no significant differences between PD and controls. It remains open whether this altered mechanical muscular output is reproducible and specific to PD.
The manual muscle test (MMT) is a flexible diagnostic tool that is used in many disciplines and applied in several ways. Its main problem is the subjectivity of the test. The MMT in the version of a “break test” depends on the tester’s force rise and the patient’s ability to resist the applied force. As a first step, the reproducibility of the testers’ force profiles must be investigated for valid application. This study examined the force profiles of n = 29 testers (n = 9 experienced (Exp), n = 8 little experienced (LitExp), n = 12 beginners (Beg)). The testers performed 10 MMTs according to the test of the hip flexors, but against a fixed leg to exclude the patient’s reaction. A handheld device recorded the temporal course of the applied force. The results show significant differences between Exp and Beg concerning the starting force (padj = 0.029), the ratio of starting to maximum force (padj = 0.005) and the normalized mean Euclidean distances between the 10 trials (padj = 0.015). The slope is significantly higher in Exp vs. LitExp (p = 0.006) and Beg (p = 0.005). The results also indicate that experienced testers show inter-tester differences and partly even low intra-tester reproducibility. This highlights the necessity of an objective MMT assessment. Furthermore, agreement on a standardized force profile is required; a suggestion for this is given.
Several studies have investigated the effects of music on both submaximal and maximal exercise performance at a constant work rate. However, research examining the effects of music on pacing strategy during self-paced exercise is lacking. The aim of this study was to examine the effects of preferred music on performance and pacing during a 6-min run test (6-MSPRT) in young male adults. Twenty healthy male participants volunteered for this study. They performed two randomly assigned trials (with or without music) of the 6-MSPRT three days apart. Mean running speed, the adopted pacing strategy, total distance covered (TDC), peak and mean heart rate (HRpeak, HRmean), blood lactate (3 min after the test), and rating of perceived exertion (RPE) were measured. Listening to preferred music during the 6-MSPRT resulted in a significant TDC improvement (10%; p = 0.016; effect size (ES) = 0.80). A significantly faster mean running speed was observed when listening to music compared with no music. The improvement in TDC is explained by a significant overall increase in speed (main effect for conditions) during the music trial. Music failed to modify pacing patterns, as suggested by the similar reversed “J-shaped” profile in the two conditions. Blood lactate concentrations were significantly reduced by 9% (p = 0.006, ES = 1.09) after the 6-MSPRT with music compared to the control condition. No statistically significant differences were found between the test conditions for HRpeak, HRmean, and RPE. Therefore, listening to preferred music can have positive effects on exercise performance during the 6-MSPRT, such as greater TDC, faster running speeds, and reduced blood lactate levels, but has no effect on the pacing strategy.
Background
Overweight and obesity are growing health problems that are not restricted to adults. Childhood obesity is associated with metabolic, psychological and musculoskeletal comorbidities. However, knowledge about the effect of obesity on foot function across maturation is lacking. Decreased foot function with disproportional loading characteristics is expected in obese children. The aim of this study was to examine the foot loading characteristics during gait of normal-weight, overweight and obese children aged 1–12 years.
Methods
A total of 10382 children aged one to twelve years were enrolled in the study. Finally, 7575 children (m/f: n = 3630/3945; 7.0 ± 2.9 yr; 1.23 ± 0.19 m; 26.6 ± 10.6 kg; BMI: 17.1 ± 2.4 kg/m²) were included in the (complete case) data analysis. Children were categorized as normal-weight (≥3rd and <90th percentile; n = 6458), overweight (≥90th and <97th percentile; n = 746) or obese (>97th percentile; n = 371) according to the German reference system, which is based on age- and gender-specific body mass indices (BMI). Plantar pressure measurements were assessed during gait on an instrumented walkway. Contact area, arch index (AI), peak pressure (PP) and force-time integral (FTI) were calculated for the total foot, forefoot, midfoot and hindfoot. Data were analyzed descriptively (mean ± SD), followed by ANOVA/Welch test (according to homogeneity of variances: yes/no) for group differences according to BMI categorization (normal-weight, overweight, obesity) and for each age group from 1 to 12 yrs (post-hoc Tukey-Kramer/Dunnett’s C; α = 0.05).
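The weight categorization follows fixed percentile cut-offs. The German reference percentile curves themselves (age- and sex-specific) are not reproduced here, so the sketch below assumes the child’s BMI percentile has already been looked up; the function name and the handling of the boundary exactly at the 97th percentile are assumptions:

```python
def bmi_category(percentile: float) -> str:
    """Map an age- and sex-specific BMI percentile to the study's weight
    categories. Cut-offs as reported: >=3rd and <90th = normal-weight,
    >=90th and <97th = overweight, >97th = obese. A value exactly on the
    97th percentile is not specified in the abstract; treated as obese here."""
    if percentile < 3:
        return "underweight"      # below the study's normal-weight range
    if percentile < 90:
        return "normal-weight"
    if percentile < 97:
        return "overweight"
    return "obese"
```

For example, a child on the 92nd percentile would be classified as overweight, one above the 97th as obese.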
Results
Mean walking velocity was 0.95 ± 0.25 m/s, with no differences between normal-weight, overweight or obese children (p = 0.0841). The results show higher foot contact area, arch index, peak pressure and force-time integral in overweight and obese children (p < 0.001). Obese children showed 1.48-fold (1-year-olds) to 3.49-fold (10-year-olds) higher midfoot loading (FTI) compared to normal-weight children.
Conclusion
Additional body mass leads to higher overall load, with a disproportional impact on the midfoot area and the longitudinal foot arch, producing characteristic foot loading patterns. Even the feet of one- and two-year-old children are significantly affected. Childhood overweight and obesity are not compensated for by the musculoskeletal system. To avoid excessive foot loading with the potential risk of discomfort or pain in childhood, prevention strategies should be developed and validated for children with a high body mass index and functional changes in the midfoot area. The presented plantar pressure values could additionally serve as reference data to identify suspicious foot loading patterns in children.
Background
Recently, the incidence rate of back pain (BP) in adolescents has been reported at 21%. However, the development of BP in adolescent athletes is unclear. Hence, the purpose of this study was to examine the incidence of BP in young elite athletes in relation to gender and type of sport practiced.
Methods
Subjective BP was assessed in 321 elite adolescent athletes (m/f 57%/43%; 13.2 ± 1.4 years; 163.4 ± 11.4 cm; 52.6 ± 12.6 kg; 5.0 ± 2.6 training yrs; 7.6 ± 5.3 training h/week). Initially, all athletes were free of pain. The main outcome criterion was the incidence of back pain [%], analyzed in terms of pain development from the first measurement day (M1) to the second measurement day (M2) after 2.0 ± 1.0 years. Participants were classified into athletes who developed back pain (BPD) and athletes who did not develop back pain (nBPD). BP (acute or within the last 7 days) was assessed with a 5-step face scale (face 1–2 = no pain; face 3–5 = pain). BPD included all athletes who reported faces 1 or 2 at M1 and faces 3 to 5 at M2; nBPD included all athletes who reported face 1 or 2 at both M1 and M2. Data were analyzed descriptively. Additionally, a Chi² test was used to analyze gender- and sport-specific differences (α = 0.05).
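The Chi² test on the 2 × 2 gender-by-outcome table can be reproduced from first principles. A minimal sketch with illustrative cell counts reconstructed approximately from the reported group sizes and percentages (the exact counts are not given in the abstract, so these numbers are assumptions):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Hypothetical reconstruction: ~183 male / ~138 female athletes,
# of whom 22 vs. 10 developed back pain (rows: male, female;
# columns: BPD, nBPD).
chi2 = chi_square_2x2(22, 161, 10, 128)
critical_value = 3.841          # chi-square, df = 1, alpha = 0.05
significant = chi2 > critical_value   # False, consistent with p = 0.15
```

With these counts the statistic stays below the critical value, matching the abstract’s non-significant gender difference.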
Results
Thirty-two athletes were categorized as BPD (10%). The gender difference was 5% (m/f: 12%/7%) but did not reach statistical significance (p = 0.15). The incidence of BP ranged between 6 and 15% for the different sport categories: game sports (15%) showed the highest, and explosive strength sports (6%) the lowest, incidence. Neither anthropometric nor training characteristics significantly influenced BPD (p = 0.14 for gender to p = 0.90 for sports; r² = 0.0825).
Conclusions
BP incidence was lower in adolescent athletes than in young non-athletes and even in the general adult population. Consequently, it can be concluded that high-performance sport does not lead to an additional increase in back pain incidence during early adolescence. Nevertheless, back pain prevention programs should be implemented in the daily training routines of sport categories identified as showing high incidence rates.
Background: Change-of-direction (CoD) ability is a necessary physical quality in field sports and may vary in youth players according to their maturation status.
Objectives: The aim of this study was to compare the effectiveness of a 6-week CoD training intervention on dynamic balance (CS-YBT), horizontal jump (5JT), speed (10- and 30-m linear sprint times), and CoD with (15 m-CoD + B) and without (15 m-CoD) the ball in youth male soccer players at different levels of maturity [pre- and post-peak height velocity (PHV)].
Materials and Methods: Thirty elite male youth soccer players aged 10–17 years from the Tunisian first division participated in this study. The players were divided into pre- (G1, n = 15) and post-PHV (G2, n = 15) groups. Both groups completed a similar 6-week training program with two sessions per week of four CoD exercises. All players completed the following tests before and after the intervention: CS-YBT; 5JT; 10, 30, and 15 m-CoD; and 15 m-CoD + B. Data were analyzed using ANCOVA.
Results: All 30 players completed the study according to the study design and methodology. The adherence rate was 100% across both groups, and no training- or test-related injuries were reported. The pre-PHV and post-PHV groups showed significant improvements post-intervention in all dependent variables (post-test > pre-test; p < 0.01, d = 0.09–1.51). ANOVA revealed a significant group × time interaction only for CS-YBT (F = 4.45; p < 0.04; η² = 0.14), 5JT (F = 6.39; p < 0.02; η² = 0.18), and 15 m-CoD (F = 7.88; p < 0.01; η² = 0.22). CS-YBT, 5JT, and 15 m-CoD improved significantly more in the post-PHV group (+4.56%, effect size = 1.51; +4.51%, effect size = 1.05; and −3.08%, effect size = 0.51, respectively) than in the pre-PHV group (+2.77%, effect size = 0.85; +2.91%, effect size = 0.54; and −1.56%, effect size = 0.20, respectively).
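The percentage improvements and effect sizes reported above follow standard definitions. A minimal sketch of the pooled-standard-deviation variant of Cohen’s d and the percent-change calculation (illustrative only; the study may have used a different effect-size formula, and the data below are made up):

```python
import math

def cohens_d(pre, post):
    """Cohen's d for two score lists, using the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    m1, m2 = sum(pre) / n1, sum(post) / n2
    v1 = sum((x - m1) ** 2 for x in pre) / (n1 - 1)    # sample variances
    v2 = sum((x - m2) ** 2 for x in post) / (n2 - 1)
    sd_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m2 - m1) / sd_pooled

def percent_change(pre_mean, post_mean):
    """Relative pre-to-post change in percent, as in '+4.56%'."""
    return (post_mean - pre_mean) / pre_mean * 100
```

For sprint or CoD times, an improvement appears as a negative percent change (faster post-test), which is why the 15 m-CoD changes above carry a minus sign.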
Conclusion: The CoD training program improved balance, horizontal jump, and CoD without the ball in male preadolescent and adolescent soccer players, and this improvement was greater in the post-PHV players. The maturity status of the athletes should be considered when programming CoD training for soccer players.
Background: Doping attitude is a key variable in predicting athletes' intention to use forbidden performance-enhancing drugs. Indirect reaction-time-based attitude tests, such as the implicit association test, conceal the ultimate goal of measurement from the participant better than questionnaires do. Indirect tests are especially useful when socially sensitive constructs such as attitudes towards doping need to be described. The present study describes the development and validation of a novel picture-based brief implicit association test (BIAT) for testing athletes' attitudes towards doping in sport. It is intended to provide the basis for a transnationally compatible research instrument able to harmonize anti-doping research efforts.
Method: Following a known-group differences validation strategy, the doping attitudes of 43 athletes from bodybuilding (representative for a highly doping prone sport) and handball (as a contrast group) were compared using the picture-based doping-BIAT. The Performance Enhancement Attitude Scale (PEAS) was employed as a corresponding direct measure in order to additionally validate the results.
Results: As expected, indirectly measured doping attitudes, as tested with the picture-based doping-BIAT, were significantly less negative in the group of bodybuilders (η² = .11). The doping-BIAT and PEAS scores correlated significantly at r = .50 for bodybuilders and not significantly at r = .36 for handball players. The picture-based doping-BIAT exhibited a low error rate (7%) and satisfactory internal consistency (r_tt = .66).
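The effect size η² (eta squared) reported for the group difference is the proportion of total variance explained by group membership. A minimal sketch of its computation for a between-groups comparison (illustrative only, not the original analysis; the toy data are assumptions):

```python
def eta_squared(*groups):
    """Eta squared: SS_between / SS_total for a one-way group comparison.
    Each argument is a list of scores for one group."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    ss_total = sum((x - grand) ** 2 for x in all_vals)
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2
                     for g in groups)
    return ss_between / ss_total
```

An η² of .11 would thus mean that roughly 11% of the variance in BIAT scores is attributable to the bodybuilder-versus-handball grouping.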
Conclusions: The picture-based doping-BIAT constitutes a psychometrically tested method ready to be adopted by the international research community. The test can be administered via the internet, and all test material is available "open source". The test might be implemented, for example, as a new effect measure in the evaluation of prevention programs.
Introduction
Injury prevention programs (IPPs) are an inherent part of training in recreational and professional sports. Providing performance-enhancing benefits in addition to injury prevention may help adjust coaches' and athletes' attitudes towards implementing injury prevention in their daily routine. Conventional thinking among players and coaches alike seems to suggest that IPPs need to be specific to one's sport to allow for performance enhancement. This systematic literature review aims, first, to determine the nature of the exercises in IPPs and whether they are specific to the sport or based on general conditioning, and second, to determine whether general, sport-specific or mixed IPPs improve key performance indicators, with the aim of better facilitating the long-term implementation of these programs.
Methods
PubMed and Web of Science were electronically searched throughout March 2018. The inclusion criteria were randomized controlled trials, publication dates between Jan 2006 and Feb 2018, athletes (11–45 years), injury prevention programs, and predefined performance measures that could be categorized into balance, power, strength, speed/agility and endurance. The methodological quality of the included articles was assessed with the Cochrane Collaboration assessment tools.
Results
Of 6619 initial findings, 22 studies met the inclusion criteria. In addition, reference lists yielded a further 6 studies, making a total of 28. Nine studies used sport-specific IPPs, eleven used general and eight used mixed prevention strategies. Overall, general programs ranged from 29–57% in their effectiveness across performance outcomes. Mixed IPPs improved balance outcomes in 80% of cases but only 20–44% of outcomes in the other categories. Sport-specific programs led to larger-scale improvements in balance (66%), power (83%), strength (75%), and speed/agility (62%).
Conclusion
Sport-specific IPPs have the strongest influence on most performance indices, based on their significant improvements versus control groups. Other factors such as intensity, technical execution and compliance should be accounted for in future investigations in addition to exercise modality.
Eccentric exercise is discussed as a treatment option for clinical populations, but the specific responses in terms of muscle damage and systemic inflammation after repeated loading of large muscle groups have not been conclusively characterized. Therefore, this study tested the feasibility of an isokinetic protocol for repeated maximum eccentric loading of the trunk muscles. Nine asymptomatic participants (5 f/4 m; 34±6 yrs; 175±13 cm; 76±17 kg) performed three isokinetic 2-minute all-out trunk strength tests (1x concentric (CON), 2x eccentric (ECC1, ECC2), 2 weeks apart; flexion/extension, 60°/s, ROM 55°). Outcomes were peak torque, torque decline, total work, and indicators of muscle damage and inflammation (over 168 h). Statistical analyses were performed using the Friedman test (Dunn’s post-test). For ECC1 and ECC2, peak torque and total work were increased and torque decline was reduced compared to CON. Repeated ECC bouts yielded unaltered torque and work outcomes. Muscle damage markers were highest after ECC1 (soreness 48 h, creatine kinase 72 h; p<0.05). Their overall responses (area under the curve) were abolished post-ECC2 compared to post-ECC1 (p<0.05). Interleukin-6 was higher post-ECC1 than post-CON and attenuated post-ECC2 (p>0.05). Interleukin-10 and tumor necrosis factor-α were not detectable. All markers showed high inter-individual variability. The protocol was feasible for inducing muscle damage after exercising a large muscle group, but the pilot results indicated only weak systemic inflammatory responses in asymptomatic adults.
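The "overall responses (area under the curve)" of the damage and inflammation markers can be summarized by integrating each marker's time course over the 0–168 h sampling window. A minimal sketch using the trapezoidal rule (an assumed approach; the abstract does not specify the integration method, and the time points below are illustrative):

```python
def auc_trapezoid(times, values):
    """Area under a marker's time course (trapezoidal rule).
    times: sampling time points, e.g. hours post-exercise (ascending).
    values: marker concentrations at those time points."""
    if len(times) != len(values) or len(times) < 2:
        raise ValueError("need matched time/value series of length >= 2")
    return sum((t2 - t1) * (v1 + v2) / 2
               for t1, t2, v1, v2 in zip(times, times[1:],
                                         values, values[1:]))
```

Summarizing each participant's marker curve as a single AUC value is what allows the nonparametric comparison of overall responses between ECC1 and ECC2 despite high inter-individual variability.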
Muscle quality, defined as the ratio of muscle strength to muscle mass, disregards the underlying factors that influence muscle strength. The aim of this review was to investigate the relationship of phase angle (PhA), echo intensity (EI), muscular adipose tissue (MAT), muscle fiber type, fascicle pennation angle (θf), fascicle length (lf), muscle oxidative capacity, insulin sensitivity (IS), neuromuscular activation, and motor unit properties to muscle strength. A PubMed search was performed in 2021. The inclusion criteria were: (i) original research, (ii) human participants, (iii) adults (≥18 years). The exclusion criteria were: (i) no full text available, (ii) language other than English or German, (iii) pathologies. Forty-one studies were identified. Nine studies found a weak-to-moderate negative correlation (range r: [−0.26]–[−0.656], p < 0.05) between muscle strength and EI. Four studies found a weak-to-moderate positive correlation (range r: 0.177–0.696, p < 0.05) between muscle strength and PhA. Two studies found a moderate-to-strong negative correlation (range r: [−0.446]–[−0.87], p < 0.05) between muscle strength and MAT. Two studies found a weak-to-strong positive correlation (range r: 0.28–0.907, p < 0.05) between θf and muscle strength. Muscle oxidative capacity was found to be a predictor of muscle strength. This review highlights that the current definition of muscle quality should be expanded to encompass all relevant factors of muscle quality.
Extracellular vesicles: potential mediators of psychosocial stress contribution to osteoporosis?
(2021)
Osteoporosis is characterized by low bone mass and damage to the bone tissue’s microarchitecture, leading to increased fracture risk. Several studies have provided evidence for associations between psychosocial stress and osteoporosis through various pathways, including the hypothalamic-pituitary-adrenocortical axis, the sympathetic nervous system, and other endocrine factors. As psychosocial stress provokes oxidative cellular stress with consequences for mitochondrial function and cell signaling (e.g., gene expression, inflammation), it is of interest whether extracellular vesicles (EVs) may be a relevant biomarker in this context or act by transporting substances. EVs are intercellular communicators, transfer substances encapsulated in them, modify the phenotype and function of target cells, mediate cell-cell communication, and, therefore, have critical applications in disease progression and clinical diagnosis and therapy. This review summarizes the characteristics of EVs, their role in stress and osteoporosis, and their benefit as biological markers. We demonstrate that EVs are potential mediators of psychosocial stress and osteoporosis and may be beneficial in innovative research settings.
(1) Background: People with HIV (PWH) may perform more than one type of exercise cumulatively. The objective of this study was to investigate recreational exercise and its association with health-related quality of life (HRQOL) and comorbidities in relation to potential covariates. (2) Methods: The HIBES study (HIV-Begleiterkrankungen-Sport) is a cross-sectional study of people with HIV. Differences between non-exercisers and exercisers (cumulated vs. single type of exercise) were investigated using regression models based on 454 participants. (3) Results: Exercisers showed a higher HRQOL score than non-exercisers (Wilcoxon r = 0.2 to 0.239). Psychological disorders were identified as the main covariate. Participants performing exercise cumulatively showed higher scores in duration, frequency, and intensity than participants performing only one type of exercise. The mental health summary score was higher for both the cumulated and the single type of exercise if a psychological disorder existed. Duration and intensity were associated with an increase in HRQOL, whilst a stronger association between psychological disorders and exercise variables was evident. Exercise duration (minutes) showed a significant effect on HRQOL overall (standardized β = 0.1) and for participants with psychological disorders (standardized β = 0.3), respectively. (4) Conclusions: Psychological disorders and other covariates have a prominent effect on HRQOL and its association with exercise. For PWH with a psychological disorder, a stronger relationship between HRQOL and exercise duration and intensity emerged. However, the differentiation of high-HRQOL individuals warrants further investigation by considering additional factors.
Exercise or not?
(2023)
Objective: Individuals’ decisions to engage in exercise are often the result of in-the-moment choices between exercise and a competing behavioral alternative. The purpose of this study was to investigate processes that occur in-the-moment (i.e., situated processes) when individuals are faced with the choice between exercise and a behavioral alternative during a computerized task. These were analyzed against the background of interindividual differences in individuals’ automatic valuation and controlled evaluation of exercise.
Method: In a behavioral alternatives task, 101 participants were asked in 25 trials whether they would rather choose an exercise option or a behavioral alternative. Participants’ gaze behavior (first gaze and fixations) was recorded using eye tracking. An exercise-specific affect misattribution procedure (AMP) was used to assess participants’ automatic valuation of exercise before the task. After the task, self-reported feelings towards exercise (controlled evaluation) and usual weekly exercise volume were assessed. Mixed effects models with random effects for subjects and trials were used for data analysis.
Results: Choosing exercise was positively correlated with individuals’ automatic valuation (r = 0.20, p = 0.05), controlled evaluation (r = 0.58, p < 0.001), and weekly exercise volume (r = 0.43, p < 0.001). Participants showed no bias in their initial gaze or number of fixations towards the exercise or the non-exercise alternative. However, participants were 1.30 times more likely to fixate first, and more frequently, on the alternative they eventually chose; this gaze behavior was not related to individuals’ automatic valuation, controlled evaluation, or weekly exercise volume.
Conclusion: The results suggest that situated processes arising from defined behavioral alternatives may be independent of individuals’ general preferences. Despite one’s best general intention to exercise more, a non-exercise alternative may seem more appealing in the moment and eventually be chosen. New psychological theories of health behavior change should therefore better consider the role of potentially conflicting alternatives when it comes to initiating physical activity or exercise.
Models employed in exercise psychology highlight the role of reflective processes in explaining behavior change. However, as discussed in the social cognition literature, information-processing models also consider automatic processes (dual-process models). To examine the relevance of automatic processing in exercise psychology, we used a priming task to assess the automatic evaluations of exercise stimuli in physically active sport and exercise majors (n = 32), physically active nonsport majors (n = 31), and inactive students (n = 31). Results showed that physically active students responded faster to positive words after exercise primes, whereas inactive students responded more rapidly to negative words. Priming-task reaction times successfully predicted reported amounts of exercise in an ordinal regression model. Findings were obtained only with experiential items reflecting negative and positive consequences of exercise. The results illustrate the potential importance of dual-process models in exercise psychology.