(1) Background: People with HIV (PWH) may perform more than one type of exercise cumulatively. The objective of this study is to investigate recreational exercise and its association with health-related quality of life (HRQOL) and comorbidities in relation to potential covariates. (2) Methods: The HIBES study (HIV-Begleiterkrankungen-Sport) is a cross-sectional study of people with HIV. Differences between non-exercisers and exercisers (cumulated vs. single type of exercise) were investigated using regression models based on 454 participants. (3) Results: Exercisers showed a higher HRQOL score than non-exercisers (Wilcoxon r = 0.2 to 0.239). Psychological disorders were identified as the main covariate. Participants performing exercise cumulatively showed higher scores in duration, frequency, and intensity than participants performing only one type of exercise. The mental health summary score was higher for the cumulated and the single type of exercise if a psychological disorder existed. Duration and intensity were associated with an increase in HRQOL, and a stronger association between psychological disorders and exercise variables was evident. Exercise duration (minutes) showed a significant effect on HRQOL overall (standardized beta = 0.1) and for participants with psychological disorders (standardized beta = 0.3). (4) Conclusions: Psychological disorders and other covariates have a prominent effect on HRQOL and its association with exercise. For PWH with a psychological disorder, a stronger relationship between HRQOL and exercise duration and intensity emerged. However, differentiation of high-HRQOL individuals warrants further investigation by considering additional factors.
Background:
Deception can distort psychological tests on socially sensitive topics. Understanding the cerebral processes that are involved in such faking can be useful in detection and prevention of deception. Previous research shows that faking a brief implicit association test (BIAT) evokes a characteristic ERP response. It is not yet known whether temporarily available self-control resources moderate this response. We randomly assigned 22 participants (15 females, 24.23 ± 2.91 years old) to a counterbalanced repeated-measurements design. Participants first completed a Brief-IAT (BIAT) on doping attitudes as a baseline measure and were then instructed to fake a negative doping attitude both when self-control resources were depleted and non-depleted. Cerebral activity during BIAT performance was assessed using high-density EEG.
Results:
Compared to the baseline BIAT, event-related potentials showed a first interaction at the parietal P1, while significant post hoc differences were found only at the later occurring late positive potential. Here, significantly decreased amplitudes were recorded for ‘normal’ faking, but not in the depletion condition. In source space, enhanced activity was found for ‘normal’ faking in the bilateral temporoparietal junction. Behaviorally, participants successfully faked the BIAT in both conditions.
Conclusions:
Results indicate that temporarily available self-control resources do not affect overt faking success on a BIAT. However, differences were found on an electrophysiological level. This indicates that while, on a phenotypical level, self-control resources play a negligible role in deliberate test faking, the underlying cerebral processes are markedly different.
Background: Healthy university students have been shown to use psychoactive substances, expecting them to be functional means for enhancing their cognitive capacity, sometimes over and above an essentially proficient level. This behavior, called Neuroenhancement (NE), has not yet been integrated into a behavioral theory able to predict performance. Job Demands-Resources (JD-R) Theory, for example, assumes that strain (e.g. burnout) will occur and influence performance when job demands are high and job resources are limited at the same time. The aim of this study is to investigate whether or not university students’ self-reported NE can be integrated into JD-R Theory’s comprehensive approach to psychological health and performance.
Methods: 1,007 students (23.56 ± 3.83 years old, 637 female) participated in an online survey. Lifestyle drug, prescription drug, and illicit substance NE together with the complete set of JD-R variables (demands, burnout, resources, motivation, and performance) were measured. Path models were used in order to test our data’s fit to hypothesized main effects and interactions.
Results: JD-R Theory could successfully be applied to describe the situation of university students. NE was mainly associated with the JD-R Theory’s health impairment process: Lifestyle drug NE (p < .05) and prescription drug NE (p < .001) were associated with higher burnout scores, and lifestyle drug NE aggravates the study demands-burnout interaction. In addition, prescription drug NE mitigates the protective influence of resources on burnout and on motivation.
Conclusion: According to our results, the uninformed trying of NE (i.e., without medical supervision) might result in strain. Increased strain is related to decreased performance. From a public health perspective, intervention strategies should address these costs of non-supervised NE. With regard to future research we propose to model NE as a means to reach an end (i.e. performance enhancement) rather than a target behavior itself. This is necessary to provide a deeper understanding of the behavioral roots and consequences of the phenomenon.
Background: The use of psychoactive substances to neuroenhance cognitive performance is prevalent. Neuroenhancement (NE) in everyday life and doping in sport might rest on similar attitudinal representations, and both behaviors can be theoretically modeled by comparable means-to-end relations (substance-performance). A behavioral (not substance-based) definition of NE is proposed, with assumed functionality as its core component. It is empirically tested whether different NE variants (lifestyle drug, prescription drug, and illicit substance) can be regressed to school stressors.
Findings: Participants were 519 students (25.8 ± 8.4 years old, 73.1% female). Logistic regressions indicate that a modified doping attitude scale can predict all three NE variants. Multiple NE substance abuse was frequent. Overwhelming demands in school were associated with lifestyle and prescription drug NE.
Conclusions: Researchers should be sensitive to probable structural similarities between enhancement in everyday life and in sport, and should systematically explore where findings from one domain can be adapted for the other. Policy makers should be aware that students might misperceive NE as an acceptable means of coping with stress in school, and should help to build societal sensitivity to the topic of NE among young people in general.
Background: Neuroenhancement (NE), the use of psychoactive substances in order to enhance a healthy individual's cognitive functioning from a proficient to an even higher level, is prevalent in student populations. According to the strength model of self-control, people fail to self-regulate and fall back on their dominant behavioral response when finite self-control resources are depleted. An experiment was conducted to test the hypothesis that ego-depletion will prevent students who are unfamiliar with NE from trying it.
Findings: 130 undergraduates, who denied having tried NE before (43% female, mean age = 22.76 ± 4.15 years old), were randomly assigned to either an ego-depletion or a control condition. The dependent variable was taking an "energy-stick" (a legal nutritional supplement containing low doses of caffeine, taurine and vitamin B), offered as a potential means of enhancing performance on the bogus concentration task that followed. Logistic regression analysis showed that ego-depleted participants were three times less likely to take the substance, OR = 0.37, p = .01.
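To make the reported effect concrete, an odds ratio like OR = 0.37 can be reproduced from group-level proportions. The sketch below uses hypothetical proportions, not the study's raw data, purely to illustrate how an OR below 1 corresponds to a reduced likelihood of taking the substance:

```python
def odds(p):
    """Convert a probability into odds."""
    return p / (1 - p)

def odds_ratio(p_group, p_reference):
    """Odds ratio of an event in one group relative to a reference group."""
    return odds(p_group) / odds(p_reference)

# Hypothetical illustration (not the study's raw data): if 20% of
# ego-depleted and 40% of control participants took the substance,
# the odds ratio is (0.2/0.8) / (0.4/0.6) = 0.375, close to the
# reported OR = 0.37.
print(round(odds_ratio(0.20, 0.40), 3))
```

Note that an OR of 0.37 means the odds, not the probability, are roughly a third of the reference group's; "three times less likely" is an informal reading of the odds.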
Conclusion: This experiment found that trying NE for the first time was more likely if an individual's cognitive capacities were not depleted. This means that mental exhaustion is not predictive for NE in students for whom NE is not the dominant response. Trying NE for the first time is therefore more likely to occur as a thoughtful attempt at self-regulation than as an automatic behavioral response in stressful situations. We therefore recommend targeting interventions at this inter-individual difference. Students without previous reinforcing NE experience should be provided with information about the possible negative health outcomes of NE. Reconfiguring structural aspects in the academic environment (e.g. lessening workloads) might help to deter current users.
Background: The back pain screening tool Risk-Prevention-Index Social (RPI-S) identifies the individual psychosocial risk for low back pain chronification and supports the allocation of patients at risk in additional multidisciplinary treatments. The study objectives were to evaluate (1) the prognostic validity of the RPI-S for a 6-month time frame and (2) the clinical benefit of the RPI-S.
Methods: In a multicenter single-blind 3-armed randomized controlled trial, n = 660 persons (age 18–65 years) were randomly assigned to a twelve-week uni- or multidisciplinary exercise intervention or control group. Psychosocial risk was assessed by the RPI-S domain social environment (RPI-SSE) and the outcome pain by the Chronic Pain Grade Questionnaire (baseline M1, 12-weeks M4, 24-weeks M5). Prognostic validity was quantified by the root mean squared error (RMSE) within the control group. The clinical benefit of RPI-SSE was calculated by repeated measures ANOVA in intervention groups.
Results: A subsample of n = 274 participants (mean = 38.0 years, SD 13.1) was analyzed, of which 30% were classified as at risk in their psychosocial profile. The half-year prognostic validity was good (RMSE for disability of 9.04 at M4 and of 9.73 at M5; RMSE for pain intensity of 12.45 at M4 and of 14.49 at M5). People at risk showed a significantly stronger reduction in pain disability and intensity at M4/M5 if participating in the multidisciplinary exercise treatment. Subjects not at risk showed a smaller reduction in pain disability in both interventions and no group differences for pain intensity. Regarding disability due to pain, around 41% of the sample would have received an unsuitable treatment without the back pain screening.
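Prognostic validity here is quantified by the root mean squared error between predicted and observed pain scores. A minimal, stdlib-only sketch of the metric, with made-up scores rather than study data:

```python
import math

def rmse(predicted, observed):
    """Root mean squared error between predicted and observed scores."""
    assert len(predicted) == len(observed) and predicted
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)
    )

# Hypothetical disability scores on a 0-100 scale (not study data):
predicted = [20, 35, 50, 10]
observed = [25, 30, 60, 12]
print(round(rmse(predicted, observed), 2))
```

Lower RMSE means the screening's predictions track the later observed outcomes more closely; the reported values (e.g. 9.04 for disability at M4) are on the scale of the outcome itself.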
Conclusion: The RPI-SSE demonstrated good prognostic validity and applicability, and its clinical benefit was confirmed by the clear advantage of individualized treatment allocation.
Background Low back pain (LBP) is a common pain syndrome in athletes, responsible for 28% of missed training days/year. Psychosocial factors contribute to chronic pain development. This study aims to investigate the transferability of psychosocial screening tools developed in the general population to athletes and to define athlete-specific thresholds.
Methods Data from a prospective multicentre study on LBP were collected at baseline and 1-year follow-up (n=52 athletes, n=289 recreational athletes and n=246 non-athletes). Pain was assessed using the Chronic Pain Grade questionnaire. The psychosocial Risk Stratification Index (RSI) was used to obtain prognostic information regarding the risk of chronic LBP (CLBP). Individual psychosocial risk profiles were obtained with the Risk Prevention Index – Social (RPI-S). Differences between groups were calculated using general linear models and planned contrasts. Discrimination thresholds for athletes were defined with receiver operating characteristic (ROC) curves.
Results Athletes and recreational athletes showed significantly lower psychosocial risk profiles and prognostic risk for CLBP than non-athletes. ROC curves suggested that discrimination thresholds for athletes differ from those for non-athletes. Both screenings demonstrated very good sensitivity (RSI: 100%; RPI-S: 75%–100%) and specificity (RSI: 76%–93%; RPI-S: 71%–93%). The RSI revealed two risk classes for pain intensity (area under the curve (AUC) 0.92 (95% CI 0.85 to 1.0)) and pain disability (AUC 0.88 (95% CI 0.71 to 1.0)).
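Discrimination thresholds of the kind reported here are commonly derived from the ROC curve by maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch under assumed toy data; the scores and outcomes below are illustrative, not the RSI/RPI-S data:

```python
def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when score >= threshold predicts 'at risk'."""
    tp = sum(s >= threshold and y == 1 for s, y in zip(scores, labels))
    fn = sum(s < threshold and y == 1 for s, y in zip(scores, labels))
    tn = sum(s < threshold and y == 0 for s, y in zip(scores, labels))
    fp = sum(s >= threshold and y == 0 for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

def youden_threshold(scores, labels):
    """Candidate cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    return max(sorted(set(scores)),
               key=lambda t: sum(sens_spec(scores, labels, t)) - 1)

# Toy screening scores and chronic-pain outcomes (1 = developed CLBP):
scores = [1, 2, 3, 4, 5, 6, 7, 8]
labels = [0, 0, 0, 0, 1, 0, 1, 1]
cutoff = youden_threshold(scores, labels)
print(cutoff, sens_spec(scores, labels, cutoff))
```

In a population-specific calibration, the same procedure is run separately per group, which is why athlete-specific cut-offs can differ from those for non-athletes.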
Conclusions Both screening tools can be used for athletes. Athlete-specific thresholds will improve physicians’ decision making and allow stratified treatment and prevention.
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the most predictive stress-related pattern set for CLBP for a 1-year diagnosis.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role for upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric variables (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarkers (CPI: RMSE = 12.21; DISS: RMSE = 8.94) could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
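The least absolute shrinkage and selection operator (LASSO) used for variable selection above works by shrinking regression coefficients toward zero; its core is the soft-thresholding operator. In the sketch below, the predictor names echo the abstract, but the regularization strength and the selection step are purely illustrative, not the study's fitted models:

```python
def soft_threshold(coef, lam):
    """LASSO soft-thresholding: shrink a coefficient toward zero by lam,
    setting it exactly to zero when |coef| <= lam (variable selection)."""
    if coef > lam:
        return coef - lam
    if coef < -lam:
        return coef + lam
    return 0.0

# Illustrative standardized coefficients (names echo the abstract; the
# "noise" predictor and lam are invented for this sketch):
coefs = {"social overload": 0.45, "over-commitment": 0.28, "noise": 0.05}
lam = 0.10  # regularization strength, in practice chosen by cross-validation
selected = {name: soft_threshold(b, lam) for name, b in coefs.items()}
print(selected)  # weak predictors are zeroed out, strong ones are shrunk
```

This zeroing-out is what lets the procedure reduce a large candidate set to the seven psychometric and five biomarker predictors reported above.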
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression and the emergence of a “hypocortisolemic symptom triad,” in which social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that makes it possible to identify individuals at higher risk for upcoming pain disorders and can be used in practice.
Objective:
Depression and coronary heart disease (CHD) are highly comorbid conditions. Brain-derived neurotrophic factor (BDNF) plays an important role in cardiovascular processes. Depressed patients typically show decreased BDNF concentrations. We analysed the relationship between BDNF and depression in a sample of patients with CHD and additionally distinguished between cognitive-affective and somatic depression symptoms. We also investigated whether BDNF was associated with somatic comorbidity burden, acute coronary syndrome (ACS) or congestive heart failure (CHF).
Methods:
The following variables were assessed for 225 hospitalised patients with CHD: BDNF concentrations, depression [Patient Health Questionnaire-9 (PHQ-9)], somatic comorbidity (Charlson Comorbidity Index), CHF, ACS, platelet count, smoking status and antidepressant treatment.
Results:
Regression models revealed that BDNF was not associated with severity of depression. Although depressed patients (PHQ-9 score >7) had significantly lower BDNF concentrations compared to non-depressed patients (p = 0.04), this was not statistically significant after controlling for confounders (p = 0.15). Cognitive-affective symptoms and somatic comorbidity burden each closely missed a statistically significant association with BDNF concentrations (p = 0.08, p = 0.06, respectively). BDNF was reduced in patients with CHF (p = 0.02). There was no covariate-adjusted, significant association between BDNF and ACS.
Conclusion:
Serum BDNF concentrations are associated with cardiovascular dysfunction. Somatic comorbidities should be considered when investigating the relationship between depression and BDNF.
Exercise or not?
(2023)
Objective: Individuals’ decisions to engage in exercise are often the result of in-the-moment choices between exercise and a competing behavioral alternative. The purpose of this study was to investigate processes that occur in-the-moment (i.e., situated processes) when individuals are faced with the choice between exercise and a behavioral alternative during a computerized task. These were analyzed against the background of interindividual differences in individuals’ automatic valuation and controlled evaluation of exercise.
Method: In a behavioral alternatives task 101 participants were asked whether they would rather choose an exercise option or a behavioral alternative in 25 trials. Participants’ gaze behavior (first gaze and fixations) was recorded using eye-tracking. An exercise-specific affect misattribution procedure (AMP) was used to assess participants’ automatic valuation of exercise before the task. After the task, self-reported feelings towards exercise (controlled evaluation) and usual weekly exercise volume were assessed. Mixed effects models with random effects for subjects and trials were used for data analysis.
Results: Choosing exercise was positively correlated with individuals’ automatic valuation (r = 0.20, p = 0.05), controlled evaluation (r = 0.58, p < 0.001), and their weekly exercise volume (r = 0.43, p < 0.001). Participants showed no bias in their initial gaze or number of fixations towards the exercise or the non-exercise alternative. However, participants were 1.30 times more likely to fixate on the chosen alternative first and more frequently, but this gaze behavior was not related to individuals’ automatic valuation, controlled evaluation, or weekly exercise volume.
Conclusion: The results suggest that situated processes arising from defined behavioral alternatives may be independent of individuals’ general preferences. Despite one’s best general intention to exercise more, the choice of a non-exercise alternative behavior may seem more appealing in-the-moment and eventually be chosen. New psychological theories of health behavior change should therefore better consider the role of potentially conflicting alternatives when it comes to initiating physical activity or exercise.
Objective: The aim of the present study was to examine the effect of Cold Water Immersion (CWI) on the recovery of physical performance, hematological stress markers and perceived wellness (i.e., Hooper scores) following a simulated Mixed Martial Arts (MMA) competition.
Methods: Participants completed two experimental sessions in a counterbalanced order (CWI or passive recovery as control condition: CON) after a simulated MMA competition (3 × 5-min MMA rounds separated by 1 min of passive rest). During CWI, athletes were required to submerge their bodies, except the trunk, neck and head, in a seated position in a temperature-controlled bath (~10 °C) for 15 min. During CON, athletes were required to remain seated for 15 min in the same room at ambient temperature. Venous blood samples (creatine kinase, cortisol, and testosterone concentrations) were collected at rest (PRE-EX, i.e., before the MMA competition), immediately following the competition (POST-EX), immediately following recovery (POST-R) and 24 h post competition (POST-24), whilst physical fitness measures (squat jump, countermovement jump and 5- and 10-m sprints) were collected at PRE-EX, POST-R and POST-24, and perceptual measures (well-being Hooper index: fatigue, stress, delayed onset muscle soreness (DOMS), and sleep) at PRE-EX and POST-24.
Results: The main results indicate that POST-R sprint (5- and 10-m) performances were 'likely to very likely' (d = 0.64 and 0.65) impaired by prior CWI. However, moderate improvements in 10-m sprint performance were 'likely' evident at POST-24 after CWI compared with CON (d = 0.53). Additionally, the use of CWI 'almost certainly' resulted in a large overall improvement in Hooper scores (d = 1.93). Specifically, CWI 'almost certainly' resulted in improved sleep quality (d = 1.36), stress (d = 1.56) and perceived fatigue (d = 1.51), and 'likely' resulted in a moderate decrease in DOMS (d = 0.60).
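The probabilistic qualifiers above are anchored to standardized effect sizes (Cohen's d). A minimal sketch of d with a pooled standard deviation, using invented sprint times rather than the study's measurements:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    sd_pooled = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / sd_pooled

# Invented 10-m sprint times in seconds (lower = faster), not study data:
con = [1.85, 1.88, 1.84, 1.87]  # control condition at POST-24
cwi = [1.80, 1.82, 1.78, 1.81]  # after cold water immersion at POST-24
print(round(cohens_d(con, cwi), 2))  # positive d: CON slower than CWI
```

Conventionally, d values around 0.2, 0.5 and 0.8 are read as small, moderate and large, which is the scale behind the 'moderate' and 'large' labels in the abstract.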
Conclusion: The use of CWI resulted in an enhanced recovery of 10-m sprint performance, as well as improved perceived wellness 24-h following simulated MMA competition.
Models employed in exercise psychology highlight the role of reflective processes for explaining behavior change. However, as discussed in social cognition literature, information-processing models also consider automatic processes (dual-process models). To examine the relevance of automatic processing in exercise psychology, we used a priming task to assess the automatic evaluations of exercise stimuli in physically active sport and exercise majors (n = 32), physically active nonsport majors (n = 31), and inactive students (n = 31). Results showed that physically active students responded faster to positive words after exercise primes, whereas inactive students responded more rapidly to negative words. Priming task reaction times were successfully used to predict reported amounts of exercise in an ordinal regression model. Findings were obtained only with experiential items reflecting negative and positive consequences of exercise. The results illustrate the potential importance of dual-process models in exercise psychology.
Background
The metabolic syndrome (MetS) is a risk cluster for a number of secondary diseases. The implementation of prevention programs requires early detection of individuals at risk. However, access to health care providers is limited in structurally weak regions. Brandenburg, a rural federal state in Germany, has an especially high MetS prevalence and disease burden. This study aims to validate and test the feasibility of a setup for mobile diagnostics of MetS and its secondary diseases, to evaluate the MetS prevalence and its association with moderating factors in Brandenburg and to identify new ways of early prevention, while establishing a “Mobile Brandenburg Cohort” to reveal new causes and risk factors for MetS.
Methods
In a pilot study, setups for mobile diagnostics of MetS and secondary diseases will be developed and validated. A van will be equipped as an examination room using point-of-care blood analyzers and by mobilizing standard methods. In study part A, these mobile diagnostic units will be placed at different locations in Brandenburg to locally recruit 5000 participants aged 40-70 years. They will be examined for MetS and advice on nutrition and physical activity will be provided. Questionnaires will be used to evaluate sociodemographics, stress perception, and physical activity. In study part B, participants with MetS, but without known secondary diseases, will receive a detailed mobile medical examination, including MetS diagnostics, medical history, clinical examinations, and instrumental diagnostics for internal, cardiovascular, musculoskeletal, and cognitive disorders. Participants will receive advice on nutrition and an exercise program will be demonstrated on site. People unable to participate in these mobile examinations will be interviewed by telephone. If necessary, participants will be referred to general practitioners for further diagnosis.
Discussion
The mobile diagnostics approach enables early detection of individuals at risk, and their targeted referral to local health care providers. Evaluation of the MetS prevalence, its relation to risk-increasing factors, and the “Mobile Brandenburg Cohort” create a unique database for further longitudinal studies on the implementation of home-based prevention programs to reduce mortality, especially in rural regions.
Trial registration
German Clinical Trials Register, DRKS00022764; registered 07 October 2020—retrospectively registered.
The general purpose of this systematic review was to summarize, structure and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, results revealed a large heterogeneity in the measures applied to assess automatic evaluations of exercising and in the exercise variables. Small to large-sized significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various examined exercise variables and prompts researchers to differentiate more carefully between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparently reported reflections on the differing theoretical bases underlying the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration their individual advantages and disadvantages. Third, 12 studies were rated as providing first-grade evidence (lowest grade of evidence), five represent second-grade and three were rated as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise, and for investigating under which conditions automatic evaluations of exercising influence behavior.
Conclusions about the necessity of exercise interventions targeted at the alteration of automatic evaluations of exercising should therefore not be drawn too hastily.
Mechanical muscular oscillations are rarely the object of investigations aiming to identify a biomarker for Parkinson’s disease (PD). Therefore, the aim of this study was to investigate whether or not this specific motor output differs between PD patients and controls. The novelty is that patients without tremor were investigated performing a unilateral isometric motor task. The force of the arm flexors and the forearm acceleration (ACC) were recorded, as well as the mechanomyography of the biceps brachii (MMGbi), brachioradialis (MMGbra) and pectoralis major (MMGpect) muscles, using a piezoelectric-sensor-based system during a unilateral motor task at 70% of the MVIC. The frequency, a power-frequency-ratio, the amplitude variation, the slope of amplitudes and their interlimb asymmetries were analysed. The results indicate that the oscillatory behavior of muscular output in PD without tremor deviates from controls in some parameters: significant differences appeared for the power-frequency-ratio (p = 0.001, r = 0.43) and for the amplitude variation (p = 0.003, r = 0.34) of MMGpect. The interlimb asymmetries differed significantly concerning the power-frequency-ratio of MMGbi (p = 0.013, r = 0.42) and MMGbra (p = 0.048, r = 0.39), as well as regarding the mean frequency (p = 0.004, r = 0.48) and amplitude variation of MMGpect (p = 0.033, r = 0.37). The mean (M) and coefficient of variation (CV) of the slope of ACC differed significantly (M: p = 0.022, r = 0.33; CV: p = 0.004, r = 0.43). All other parameters showed no significant differences between PD and controls. It remains open whether this altered mechanical muscular output is reproducible and specific for PD.
The link between emotions and motor function has been known for decades but is still not clarified. The Adaptive Force (AF) describes the neuromuscular capability to adapt to increasing forces and was suggested to be especially vulnerable to interfering inputs. This study investigated the influence of pleasant and unpleasant food imagery on the manually assessed AF of the elbow and hip flexors, objectified by a handheld device, in 12 healthy women. The maximal isometric AF was significantly reduced during unpleasant vs. pleasant imagery and baseline (p < 0.001, dz = 0.98–1.61). During unpleasant imagery, muscle lengthening started at 59.00 ± 22.50% of maximal AF, in contrast to baseline and pleasant imagery, during which the isometric position could mostly be maintained during the entire force increase up to ~97.90 ± 5.00% of maximal AF. Healthy participants showed an immediately impaired holding function triggered by unpleasant imagery, presumably related to negative emotions. Hence, AF seems to be suitable for testing instantaneously the effect of emotions on motor function. Since musculoskeletal complaints can result from muscular instability, the findings provide insights into the understanding of the causal chain linking musculoskeletal pain and mental stress. A case example (current stress vs. positive imagery) suggests that the approach presented in this study might have future implications for psychomotor diagnostics and therapeutics.
The Adaptive Force (AF) reflects the neuromuscular capacity to adapt to external loads during holding muscle actions and is similar to motions in real life and sports. The maximal isometric AF (AFisomax) was considered to be the most relevant parameter and was assumed to have major importance regarding injury mechanisms and the development of musculoskeletal pain. The aim of this study was to investigate the behavior of different torque parameters over the course of 30 repeated maximal AF trials. In addition, maximal holding vs. maximal pushing isometric muscle actions were compared. A side consideration was the behavior of torques in the course of repeated AF actions when comparing strength and endurance athletes. The elbow flexors of n = 12 males (six strength/six endurance athletes, non-professionals) were measured 30 times (120 s rest) using a pneumatic device. Maximal voluntary isometric contraction (MVIC) was measured pre and post. MVIC, AFisomax, and AFmax (maximal torque of one AF measurement) were evaluated using different considerations and statistical tests. AFmax and AFisomax declined in the course of 30 trials [slope regression (mean ± standard deviation): AFmax = −0.323 ± 0.263; AFisomax = −0.45 ± 0.45]. The decline from start to end amounted to −12.8% ± 8.3% (p < 0.001) for AFmax and −25.41% ± 26.40% (p < 0.001) for AFisomax. AF parameters declined more in strength vs. endurance athletes. Thereby, strength athletes showed a rather stable decline for AFmax and a plateau formation for AFisomax after 15 trials. In contrast, endurance athletes reduced their AFmax especially after the first five trials and remained on a rather similar level for AFisomax. The maximum of AFisomax across all 30 trials amounted to 67.67% ± 13.60% of MVIC (p < 0.001, n = 12), supporting the hypothesis of two types of isometric muscle action (holding vs. pushing).
The findings provide the first data on the behavior of torque parameters after repeated isometric–eccentric actions and reveal further insights into neuromuscular control strategies. Additionally, they highlight the importance of investigating AF parameters in athletes, given their different behavior compared to MVIC. This is assumed to be especially relevant regarding injury mechanisms.
In sports and movement sciences, isometric muscle function is usually measured by pushing against a stable resistance. Subjectively, however, one can either hold or push isometrically, and several investigations suggest a distinction between these forms. The aim of this study was to investigate whether these two forms of isometric muscle action can be distinguished by objective parameters in an interpersonal setting. Twenty subjects were grouped into 10 same-sex pairs, in which one partner performed the pushing isometric muscle action (PIMA) and the other executed the holding isometric muscle action (HIMA). The partners had contact at the distal forearms via an interface that included a strain gauge and an acceleration sensor. The mechanical oscillations of the triceps brachii muscle (MMGtri), its tendon (MTGtri), and the abdominal muscle (MMGobl) were recorded by a piezoelectric-sensor-based measurement system. Each partner performed three 15-s trials (80% MVIC) and two fatiguing trials (90% MVIC) during PIMA and HIMA, respectively. The parameters used to compare PIMA and HIMA were the mean frequency, the normalized mean amplitude, the amplitude variation, the power in the frequency range of 8 to 15 Hz, a special power–frequency ratio, and the number of task failures during HIMA or PIMA (the partner who quit the task). A “HIMA failure” occurred in 85% of trials (p < 0.001). No significant differences between PIMA and HIMA were found for the mean frequency and normalized amplitude. The MMGobl showed significantly higher values of amplitude variation (15 s: p = 0.013; fatiguing: p = 0.007) and of the power–frequency ratio (15 s: p = 0.040; fatiguing: p = 0.002) during HIMA, and a higher power in the range of 8 to 15 Hz during PIMA (15 s: p = 0.001; fatiguing: p = 0.011). MMGtri and MTGtri showed no significant differences. Based on these findings, it is suggested that holding and pushing isometric muscle actions can be distinguished objectively, whereby a more complex neural control is assumed for HIMA.
Mechanotendography (MTG) is a method for analyzing the mechanical oscillations of tendons during muscular actions. The aim of this investigation was to evaluate the technical reliability of a piezo-based measurement system used for MTG. The reliability measurements were performed using audio samples played by a subwoofer; the pressure waves thereby generated were recorded by the piezo-based measurement system. A 40-Hz sine oscillation and four different MTG signals previously recorded in vivo were converted into audio files and used as test signals. Five trials were performed with each audio file, and one file was used for repetition trials on another day. The correlation between signals was estimated by Spearman correlation coefficients (MCC), intraclass correlation coefficients (ICC(3,1)), Cronbach’s alpha (CA), and mean distances (MD). All parameters were compared between repetition trials and randomized matched signals. The repetition trials showed high correlations (MCC: 0.86 ± 0.13, ICC: 0.89 ± 0.12, CA: 0.98 ± 0.03) and low MD (0.03 ± 0.03 V), and differed significantly from the randomized matched signals (MCC: 0.15 ± 0.10, ICC: 0.17 ± 0.09, CA: 0.37 ± 0.16, MD: 0.19 ± 0.01 V) (p = 0.001–0.043). This indicates excellent reliability of the measurement system. Presuming that the skin above superficial tendons oscillates adequately, we consider this tool valid for application to the musculoskeletal system.
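The reliability statistics used above (ICC(3,1) and Cronbach’s alpha over repeated trials of the same test signal) can be computed from a trials matrix in a few lines of numpy. This is a minimal sketch, not the authors’ analysis code; the function names and the synthetic 40-Hz test signal are illustrative.

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): two-way mixed effects, consistency, single measurement.
    data: (n_samples, k_trials) array -- rows are sample points of the
    signal, columns are repeated recordings of the same test signal."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)    # between samples
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)    # between trials
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

def cronbach_alpha(data):
    """Cronbach's alpha across the k trial columns."""
    n, k = data.shape
    item_vars = data.var(axis=0, ddof=1)
    total_var = data.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative check: five noisy replays of the same 40-Hz sine should
# yield near-perfect reliability, as in the repetition trials above.
t = np.linspace(0, 0.1, 200)
signal = np.sin(2 * np.pi * 40 * t)
rng = np.random.default_rng(0)
trials = np.column_stack([signal + 0.01 * rng.standard_normal(t.size)
                          for _ in range(5)])
print(icc_3_1(trials), cronbach_alpha(trials))  # both close to 1
```

Randomly matched, independent signals would drive both coefficients toward zero, mirroring the contrast between repetition and randomized matched signals reported above.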
Long COVID patients show symptoms such as fatigue, muscle weakness, and pain. Adequate diagnostics are still lacking, and investigating muscle function might be a beneficial approach. The holding capacity (maximal isometric Adaptive Force; AFisomax) was previously suggested to be especially sensitive to impairments. This longitudinal, non-clinical study aimed to investigate the AF in long COVID patients and their recovery process. AF parameters of the elbow and hip flexors were assessed in 17 patients at three time points (pre: long COVID state; post: immediately after first treatment; end: recovery) by an objectified manual muscle test. The tester applied an increasing force to the limb of the patient, who had to resist isometrically for as long as possible. The intensity of 13 common symptoms was queried. At pre, patients started to lengthen their muscles at ~50% of the maximal AF (AFmax), which was then reached during eccentric motion, indicating unstable adaptation. At post and end, AFisomax increased significantly to ~99% and 100% of AFmax, respectively, reflecting stable adaptation. AFmax was statistically similar across all three time points. Symptom intensity decreased significantly from pre to end. The findings revealed a substantially impaired maximal holding capacity in long COVID patients, which returned to normal function with substantial health improvement. AFisomax might be a suitable, sensitive functional parameter for assessing long COVID patients and supporting the therapy process.
Inter-brain synchronization is primarily investigated during social interactions but had not been examined during coupled muscle action between two persons until now. It was previously shown that mechanical muscle oscillations can develop coherent behavior between two isometrically interacting persons. This case study investigated whether inter-brain synchronization appears thereby, and whether inter- and intrapersonal muscle and brain coherence differ between two types of isometric muscle action. Electroencephalography (EEG) and mechanomyography/mechanotendography (MMG/MTG) of the right elbow extensors were recorded during six fatiguing trials of two coupled, isometrically interacting participants (70% MVIC). One partner performed the holding and the other the pushing isometric muscle action (HIMA/PIMA; tasks changed). The wavelet coherence of all signals (EEG, MMG/MTG, force, ACC) was analyzed intra- and interpersonally. The five longest coherence patches in 8–15 Hz and their weighted frequency were compared between real vs. random pairs and between HIMA vs. PIMA. Real vs. random pairs showed significantly higher coherence for intra-muscle, intra-brain, and inter-muscle-brain activity (p < 0.001 to 0.019). Inter-brain coherence was significantly higher for real vs. random pairs for EEG of right and central areas and for sub-regions of the left EEG (p = 0.002 to 0.025). Interpersonal muscle-brain synchronization was significantly higher than intrapersonal synchronization, and significantly higher for HIMA vs. PIMA. These preliminary findings indicate that inter-brain synchronization can arise during muscular interaction. It is hypothesized that both partners merge into one oscillating neuromuscular system. The results reinforce the hypothesis that HIMA is characterized by more complex control strategies than PIMA. This pilot study suggests investigating the topic further to verify these results in a larger sample.
The findings could contribute to the basic understanding of motor control and are relevant for functional diagnostics such as the manual muscle test, which is applied in several disciplines, e.g., neurology and physiotherapy.
Background: Change-of-direction (CoD) ability is necessary in field sports and may vary in youth players according to their maturation status.
Objectives: The aim of this study was to compare the effectiveness of a 6-week CoD training intervention on dynamic balance (CS-YBT), horizontal jump (5JT), speed (10- and 30-m linear sprint times), and CoD with (15 m-CoD + B) and without (15 m-CoD) the ball in youth male soccer players at different levels of maturity [pre- and post-peak height velocity (PHV)].
Materials and Methods: Thirty elite male youth soccer players aged 10–17 years from the Tunisian first division participated in this study. The players were divided into pre- (G1, n = 15) and post-PHV (G2, n = 15) groups. Both groups completed a similar 6-week training program with two sessions per week of four CoD exercises. All players completed the following tests before and after the intervention: CS-YBT; 5JT; 10, 30, and 15 m-CoD; and 15 m-CoD + B. Data were analyzed using ANCOVA.
Results: All 30 players completed the study according to the study design and methodology. The adherence rate was 100% across all groups, and no training- or test-related injuries were reported. The pre-PHV and post-PHV groups showed significant improvements post-intervention in all dependent variables (post-test > pre-test; p < 0.01, d = 0.09–1.51). ANOVA revealed a significant group × time interaction only for CS-YBT (F = 4.45; p < 0.04; η2 = 0.14), 5JT (F = 6.39; p < 0.02; η2 = 0.18), and 15 m-CoD (F = 7.88; p < 0.01; η2 = 0.22). CS-YBT, 5JT, and 15 m-CoD improved significantly more in the post-PHV group (+4.56%, effect size = 1.51; +4.51%, effect size = 1.05; and −3.08%, effect size = 0.51, respectively) than in the pre-PHV group (+2.77%, effect size = 0.85; +2.91%, effect size = 0.54; and −1.56%, effect size = 0.20, respectively).
Conclusion: The CoD training program improved balance, horizontal jump, and CoD without the ball in male preadolescent and adolescent soccer players, and this improvement was greater in the post-PHV players. The maturity status of the athletes should be considered when programming CoD training for soccer players.
Effects of the barbell load on the acceleration phase during the snatch in Olympic weightlifting (2020)
The load-dependent loss of vertical barbell velocity at the end of the acceleration phase limits the maximum weight that can be lifted. Thus, the purpose of this study was to analyze how increased barbell loads affect the vertical barbell velocity in the sub-phases of the acceleration phase during the snatch. It was hypothesized that the load-dependent velocity loss at the end of the acceleration phase is primarily associated with a velocity loss during the 1st pull. For this purpose, 14 male elite weightlifters lifted seven load stages from 70–100% of their personal best in the snatch. The load–velocity relationship was calculated using linear regression analysis to determine the velocity loss in the 1st pull, transition, and 2nd pull. A group mean data contrast analysis revealed the highest load-dependent velocity loss for the 1st pull (t = 1.85, p = 0.044, g = 0.49 [−0.05, 1.04]), which confirmed our study hypothesis. In contrast to the group mean data, individual athletes showed unique responses to increased loads during the acceleration sub-phases of the snatch. With the proposed method, individualized training recommendations on exercise selection and loading schemes can be derived to specifically improve the sub-phases of the snatch acceleration phase. Furthermore, the results highlight the importance of single-subject assessment when working with elite athletes in Olympic weightlifting.
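The per-athlete load–velocity regression described above amounts to a least-squares line per sub-phase, whose slope quantifies the velocity loss per unit of load. A minimal numpy sketch with invented numbers (not data from the study; the function name is illustrative):

```python
import numpy as np

def load_velocity_slope(loads_pct, velocities):
    """Least-squares linear fit of barbell velocity (m/s) against load
    (% of personal best); the slope is the velocity change per 1% load."""
    slope, intercept = np.polyfit(loads_pct, velocities, 1)
    return slope, intercept

# Hypothetical 1st-pull peak velocities of one athlete over seven load stages
loads = np.array([70, 75, 80, 85, 90, 95, 100])
velocities = np.array([1.45, 1.41, 1.37, 1.32, 1.28, 1.23, 1.19])
slope, intercept = load_velocity_slope(loads, velocities)
print(f"velocity loss: {slope:.4f} m/s per % load")
```

Fitting each sub-phase (1st pull, transition, 2nd pull) separately and comparing the slopes is one way to localize where an individual athlete loses velocity as the load increases.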
This study examined the concurrent validity of an inverse dynamics approach (force computed from barbell acceleration [reference method]) and a work–energy approach (force computed from work at the barbell [alternative method]) for measuring the mean vertical barbell force during the snatch, using kinematic data from video analysis. For this purpose, the acceleration phase of the snatch was analyzed in thirty male medal winners of the 2018 Weightlifting World Championships (age: 25.2 ± 3.1 years; body mass: 88.9 ± 28.6 kg). Vertical barbell kinematics were measured using custom-made 2D real-time video analysis software. Agreement between the two computational approaches was assessed using Bland–Altman analysis, Deming regression, and Pearson product–moment correlation. Further, principal component analysis in conjunction with multiple linear regression was used to assess whether individual differences between the two approaches are due to the waveforms of the acceleration time-series data. Results indicated no mean difference (p > 0.05; d = −0.04) and an extremely large correlation (r = 0.99) between the two approaches. Despite the high agreement, the total error of individual differences was 8.2% (163.0 N). The individual differences can be explained by a multiple linear regression model (R2adj = 0.86) on principal component scores from the principal component analysis of the vertical barbell acceleration time-series waveforms. These findings indicate that the individual errors of the force measures can be attributed to the inverse dynamics approach, which uses vertical barbell acceleration data from video analysis that are prone to error. Therefore, it is recommended to compute the mean vertical barbell force with the work–energy approach, as it does not rely on vertical barbell acceleration.
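The Bland–Altman analysis used above to assess agreement between the two force-computation approaches reduces to the bias (mean difference) and 95% limits of agreement. A hedged numpy sketch on synthetic force values, not the championship data:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods applied to the same cases."""
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic example: method B reads ~10 N higher with small random scatter
rng = np.random.default_rng(1)
force_a = rng.uniform(1500, 2500, size=30)           # e.g. mean barbell force [N]
force_b = force_a + 10 + rng.normal(0, 5, size=30)
bias, (lo, hi) = bland_altman(force_a, force_b)
print(bias, lo, hi)
```

Narrow limits of agreement around a near-zero bias would correspond to the high method agreement reported above; wide limits flag individual differences even when the mean difference is negligible.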
Introduction: Adequate cognitive function in patients is a prerequisite for the successful implementation of patient education and lifestyle coping in comprehensive cardiac rehabilitation (CR) programs. Although the association between cardiovascular diseases and cognitive impairments (CIs) is well known, the prevalence of mild CI in CR in particular, and the characteristics of affected patients, have been insufficiently investigated so far.
Methods: In this prospective observational study, 496 patients (54.5 ± 6.2 years, 79.8% men) with coronary artery disease following an acute coronary event (ACE) were analyzed. Patients were enrolled in a 3-week inpatient CR program within 14 days of discharge from the hospital. Patients were tested for CI using the Montreal Cognitive Assessment (MoCA) upon admission to and discharge from CR. Additionally, sociodemographic, clinical, and physiological variables were documented. The data were analyzed descriptively and in a multivariate stepwise backward elimination regression model with respect to CI.
Results: At admission to CR, CI (MoCA score < 26) was identified in 182 patients (36.7%). Significant differences between the CI and no-CI groups were found: the CI group showed a higher prevalence of smoking (65.9 vs. 56.7%, P = 0.046), heavy (physically demanding) workloads (26.4 vs. 17.8%, P < 0.001), and sick leave longer than 1 month prior to CR (28.6 vs. 18.5%, P = 0.026), as well as reduced exercise capacity (102.5 vs. 118.8 W, P = 0.006) and a shorter 6-min walking distance (401.7 vs. 421.3 m, P = 0.021). The age- and education-adjusted model showed positive associations with CI only for sick leave of more than 1 month prior to ACE (odds ratio [OR] 1.673, 95% confidence interval 1.07–2.79; P = 0.03) and heavy workloads (OR 2.18, 95% confidence interval 1.42–3.36; P < 0.01).
Conclusion: The prevalence of CI in CR was considerably high, affecting more than one-third of cardiac patients. Besides age and education level, CI was associated with heavy workloads and a longer sick leave before ACE.
Intervertebral disc (IVD) cells are naturally exposed to high osmolarity and complex mechanical loading, which drive microenvironmental osmotic changes. Age- and degeneration-induced degradation of the IVD's extracellular matrix causes osmotic imbalance, which, together with an altered function of cellular receptors and signalling pathways, instigates local osmotic stress. Cellular responses to osmotic stress include osmoadaptation and activation of pro-inflammatory pathways. This review summarises the current knowledge on how IVD cells sense local osmotic changes and translate these signals into physiological or pathophysiological responses, with a focus on inflammation. Furthermore, it discusses the expression and function of putative membrane osmosensors (e.g. solute carrier transporters, transient receptor potential channels, aquaporins and acid-sensing ion channels) and osmosignalling mediators [e.g. tonicity response element-binding protein/nuclear factor of activated T-cells 5 (TonEBP/NFAT5), nuclear factor kappa-light-chain-enhancer of activated B cells (NF-kappa B)] in healthy and degenerated IVDs. Finally, an overview of the potential therapeutic targets for modifying osmosensing and osmosignalling in degenerated IVDs is provided.
Background
Artificial intelligence (AI) is one of the most promising areas in medicine with many possibilities for improving health and wellness. Already today, diagnostic decision support systems may help patients to estimate the severity of their complaints. This fictional case study aimed to test the diagnostic potential of an AI algorithm for common sports injuries and pathologies.
Methods
Based on a literature review and clinical expert experience, five fictional “common” cases of acute and subacute injuries or chronic sport-related pathologies were created: concussion, ankle sprain, muscle pain, chronic knee instability (after ACL rupture), and tennis elbow. The symptoms of these cases were entered into a freely available chatbot-guided AI app, and its diagnoses were compared to the predefined injuries and pathologies.
Results
The app asked a mean of 25–36 questions per case, with optional explanations of certain questions or illustrative photos available on demand. It was stressed that the symptom analysis would not replace a doctor’s consultation. A 23-year-old male patient with a mild concussion was correctly diagnosed. An ankle sprain in a 27-year-old female without ligament or bony lesions was also detected, and an ER visit was suggested. Muscle pain in the thigh of a 19-year-old male was correctly diagnosed. In the case of a 26-year-old male with chronic ACL instability, the algorithm did not sufficiently cover the chronic aspect of the pathology, but the given recommendation of seeing a doctor would have helped the patient. Finally, chronic epicondylitis in a 41-year-old male was correctly detected.
Conclusions
All chosen injuries and pathologies were either correctly diagnosed or at least tagged with the right advice on when it is urgent to see a medical specialist. However, the quality of AI-based results presumably depends on the data-driven experience of these programs as well as on the understanding of their users. Further studies should compare existing AI programs and their diagnostic accuracy for medical injuries and pathologies.
In vitro three-dimensional (3D) tissue culture of skin has been of interest for almost a century. From skin biopsies in organ culture to vascularized organotypic full-thickness reconstructed human skin equivalents, in vitro tissue regeneration of 3D skin has reached a golden era. However, the reconstruction of 3D skin still has room to grow and develop. Reproducible methodology, physiological structure and tissue architecture, and perfusable vasculature are only recently becoming a reality, and the addition of more complex structures such as glands and tactile corpuscles requires advanced technologies. In this review, we discuss the current methodology for the biofabrication of 3D skin models, highlight the advantages and disadvantages of existing systems, and emphasize how new techniques can aid in the production of a truly physiologically relevant skin construct for preclinical innovation.
This meta-analysis aimed to assess the effects of plyometric jump training (PJT) on volleyball players’ vertical jump height (VJH), comparing changes with those observed in a matched control group. A literature search was conducted in the PubMed, MEDLINE, Web of Science, and SCOPUS databases. Only randomized controlled trials and studies that included a pre-to-post intervention assessment of VJH were included. They involved only healthy volleyball players, with no restrictions on age or sex. Data were independently extracted from the included studies by two authors. The Physiotherapy Evidence Database scale was used to assess the risk of bias and methodological quality of the eligible studies included in the review. From 7,081 records, 14 studies were meta-analysed. A moderate Cohen’s d effect size (ES = 0.82, p < 0.001) was observed for VJH, with moderate heterogeneity (I2 = 34.4%, p = 0.09) and no publication bias (Egger’s test, p = 0.59). Analyses of moderator variables revealed no significant differences for PJT program duration (≤8 vs. >8 weeks, ES = 0.79 vs. 0.87, respectively), frequency (≤2 vs. >2 sessions/week, ES = 0.83 vs. 0.78, respectively), total number of sessions (≤16 vs. >16 sessions, ES = 0.73 vs. 0.92, respectively), sex (female vs. male, ES = 1.3 vs. 0.5, respectively), age (≥19 vs. <19 years of age, ES = 0.89 vs. 0.70, respectively), or volume (>2,000 vs. <2,000 jumps, ES = 0.76 vs. 0.79, respectively). In conclusion, PJT appears to be effective in inducing improvements in volleyball players’ VJH. Improvements in VJH may be achieved by both male and female volleyball players, in different age groups, with programs of relatively low volume and frequency. Though PJT seems to be safe for volleyball players, an individualized approach according to player position is recommended, as some players (e.g., the libero) may be less prepared to sustain PJT loads.
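The pooled effect size and I² heterogeneity statistic reported above can be illustrated with inverse-variance random-effects pooling (DerSimonian–Laird). This is a minimal sketch on invented study-level effects, not the 14 included studies:

```python
import numpy as np

def random_effects_pool(es, var):
    """DerSimonian-Laird random-effects pooling.
    es: per-study effect sizes (e.g. Cohen's d); var: their sampling
    variances. Returns the pooled effect and the I^2 percentage."""
    es = np.asarray(es, float)
    w = 1.0 / np.asarray(var, float)
    fixed = np.sum(w * es) / np.sum(w)
    q = np.sum(w * (es - fixed) ** 2)               # Cochran's Q
    df = len(es) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    w_star = 1.0 / (np.asarray(var, float) + tau2)
    pooled = np.sum(w_star * es) / np.sum(w_star)
    return pooled, i2

# Invented study-level jump-height effects and their variances
d = [0.9, 0.7, 1.1, 0.6, 0.85]
v = [0.10, 0.08, 0.12, 0.09, 0.11]
pooled, i2 = random_effects_pool(d, v)
print(pooled, i2)
```

When Q does not exceed its degrees of freedom, tau² and I² are zero and the random-effects estimate collapses to the fixed-effect weighted mean; larger between-study scatter inflates both, as with the moderate I² = 34.4% reported above.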
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms, and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear, and prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability, and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, the sample of one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression (Hospital Anxiety and Depression Scale), and pain-related cognitions (Fear Avoidance Beliefs Questionnaire). Repeated-measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the sub-sample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain-related disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes in the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but no effect on the other variables of interest. This may be partially explained by the participants' low comorbidity at baseline. The results are important with regard to cost-free home-based alternatives for back pain patients and prevention tasks. Furthermore, this study addresses the lack of long-term effect analyses in this field.
Introduction
Injury prevention programs (IPPs) are an inherent part of training in recreational and professional sports. Providing performance-enhancing benefits in addition to injury prevention may help adjust coaches' and athletes' attitudes towards implementing injury prevention in their daily routine. Conventional thinking by players and coaches alike seems to suggest that IPPs need to be specific to one's sport to allow for performance enhancement. This systematic literature review aims, first, to determine the nature of IPP exercises and whether they are specific to the sport or based on general conditioning, and second, to establish whether general, sport-specific, or mixed IPPs improve key performance indicators, with the aim of better facilitating long-term implementation of these programs.
Methods
PubMed and Web of Science were electronically searched throughout March 2018. The inclusion criteria were: randomized controlled trials; publication dates between January 2006 and February 2018; athletes (11–45 years); injury prevention programs; and predefined performance measures that could be categorized into balance, power, strength, speed/agility, and endurance. The methodological quality of the included articles was assessed with the Cochrane Collaboration assessment tools.
Results
Of 6619 initial findings, 22 studies met the inclusion criteria. In addition, reference lists yielded a further 6 studies, for a total of 28. Nine studies used sport-specific IPPs, eleven general, and eight mixed prevention strategies. Overall, general programs ranged from 29–57% in their effectiveness across performance outcomes. Mixed IPPs improved balance outcomes in 80% of cases but other outcomes in only 20–44%. Sport-specific programs led to larger-scale improvements in balance (66%), power (83%), strength (75%), and speed/agility (62%).
Conclusion
Sport-specific IPPs have the strongest influence on most performance indices, based on significant improvements versus control groups. Other factors such as intensity, technical execution, and compliance should be accounted for in future investigations in addition to exercise modality.
Introduction
To date, several meta-analyses have clearly demonstrated that resistance and plyometric training are effective in improving physical fitness in children and adolescents. However, a methodological limitation of meta-analyses is that they synthesize results from different studies and hence ignore important differences across studies (i.e., mixing apples and oranges). Therefore, we aimed to examine comparative intervention studies that assessed the effects of age, sex, maturation, and resistance or plyometric training descriptors (e.g., training intensity, volume, etc.) on measures of physical fitness while holding other variables constant.
Methods
To identify relevant studies, we systematically searched multiple electronic databases (e.g., PubMed) from inception to March 2018. We included resistance and plyometric training studies in healthy young athletes and non-athletes aged 6 to 18 years that investigated the effects of moderator variables (e.g., age, maturity, sex, etc.) on components of physical fitness (i.e., muscle strength and power).
Results
Our systematic literature search revealed a total of 75 eligible resistance and plyometric training studies, including 5,138 participants. The mean duration of the resistance and plyometric training programs amounted to 8.9 ± 3.6 weeks and 7.1 ± 1.4 weeks, respectively. Our findings showed that maturation affects plyometric and resistance training outcomes differently, with the former eliciting greater adaptations pre-peak height velocity (PHV) and the latter around- and post-PHV. Sex has no major impact on resistance training related outcomes (e.g., maximal strength, 10 repetition maximum). In terms of plyometric training, around-PHV boys appear to respond with larger performance improvements (e.g., jump height, jump distance) compared with girls. Different types of resistance training (e.g., body weight, free weights) are effective in improving measures of muscle strength (e.g., maximum voluntary contraction) in untrained children and adolescents. Effects of plyometric training in untrained youth primarily follow the principle of training specificity. Although only 6 out of 75 comparative studies investigated resistance or plyometric training in trained individuals, positive effects were reported in all 6 studies (e.g., maximum strength and vertical jump height, respectively).
Conclusions
The present review article identified research gaps (e.g., training descriptors, modern alternative training modalities) that should be addressed in future comparative studies.
We are glad to introduce the Second Journal Club of Volume Five, Second Issue. This edition focuses on relevant studies published in recent years in the field of resistance training, chosen by our Editorial Board members and their colleagues. We hope to stimulate your curiosity in this field and to share with you our passion for sport, seen also from a scientific point of view. The Editorial Board members wish you inspiring reading.
Background: To handle competition demands, sparring drills are used for specific technical–tactical training as well as physical–physiological conditioning in combat sports. While the effects of different area sizes and numbers of within-round sparring partners on physiological and perceptive responses in combat sports were examined in previous studies, technical and tactical aspects were not investigated. This study investigated the effect of varying the number of within-round sparring partners (i.e., at a time; 1 vs. 1, 1 vs. 2, and 1 vs. 4) and the area size (2 m × 2 m, 4 m × 4 m, and 6 m × 6 m) on the technical–tactical aspects of small combat games in kickboxing.
Method: Twenty male kickboxers (mean ± standard deviation, age: 20.3 ± 0.9 years), regularly competing in regional and national events, randomly performed nine different kickboxing combats, lasting 2 min each. All combats were video recorded and analyzed using the software Dartfish.
Results: The total number of punches was significantly higher in 1 versus 4 compared with 1 versus 1 (p = 0.011, d = 0.83). Further, the total number of kicks was significantly higher in 1 versus 4 compared with 1 versus 1 and 1 versus 2 (p < 0.001; d = 0.99 and d = 0.83, respectively). Moreover, the total number of kick combinations was significantly higher in 1 versus 4 compared with 1 versus 1 and 1 versus 2 (p < 0.001; d = 1.05 and d = 0.95, respectively). The same outcome was significantly lower in the 2 m × 2 m area compared with the 4 m × 4 m and 6 m × 6 m areas (p = 0.010 and d = −0.45; p < 0.001 and d = −0.6, respectively). The number of block-and-parry actions was significantly higher in 1 versus 4 compared with 1 versus 1 (p < 0.001, d = 1.45) and 1 versus 2 (p = 0.046, d = 0.61), and in the 2 m × 2 m area compared with the 4 m × 4 m and 6 m × 6 m areas (p < 0.001; d = 0.47 and d = 0.66, respectively). Backwards lean actions occurred more often in 2 m × 2 m compared with 4 m × 4 m (p = 0.009, d = 0.53) and 6 m × 6 m (p = 0.003, d = 0.60). However, the number of foot defenses was significantly lower in 2 m × 2 m compared with 6 m × 6 m (p < 0.001, d = 1.04) and 4 m × 4 m (p = 0.004, d = 0.63). Additionally, the number of clinches was significantly higher in 1 versus 1 compared with 1 versus 2 (p = 0.002, d = 0.7) and 1 versus 4 (p = 0.034, d = 0.45).
Conclusions: This study provides practical insights into how to manipulate within-round sparring partners’ number and/or area size to train specific kickboxing technical–tactical fundamentals.
Background:
Arising from the relevance of sensorimotor training in the therapy of nonspecific low back pain patients and from the value of individualized therapy, the present trial aims to test the feasibility and efficacy of individualized sensorimotor training interventions in patients suffering from nonspecific low back pain.
Methods and study design:
A multicentre, single-blind, two-armed randomized controlled trial is performed to evaluate the effects of a 12-week (3 weeks supervised centre-based and 9 weeks home-based) individualized sensorimotor exercise program. The control group remains inactive during this period. Outcomes are pain and pain-associated function as well as motor function in adults with nonspecific low back pain. Each participant is scheduled for five measurement dates: baseline (M1), following centre-based training (M2), following home-based training (M3), and at two follow-up time points, 6 months (M4) and 12 months (M5) after M1. All investigations and the assessment of the primary and secondary outcomes are performed in a standardized order: questionnaires – clinical examination – biomechanics (motor function). Subsequent statistical procedures are executed after examining the underlying assumptions for parametric or non-parametric testing.
Discussion:
The results of the study will be of clinical and practical relevance not only for researchers and policy makers but also for the general population suffering from nonspecific low back pain.
Trial registration:
Identification number DRKS00010129; registered with the German Clinical Trials Register on 3 March 2016.
Background
Recently, the incidence rate of back pain (BP) in adolescents has been reported at 21%. However, the development of BP in adolescent athletes is unclear. Hence, the purpose of this study was to examine the incidence of BP in young elite athletes in relation to gender and type of sport practiced.
Methods
Subjective BP was assessed in 321 elite adolescent athletes (m/f 57%/43%; 13.2 ± 1.4 years; 163.4 ± 11.4 cm; 52.6 ± 12.6 kg; 5.0 ± 2.6 training yrs; 7.6 ± 5.3 training h/week). Initially, all athletes were free of pain. The main outcome criterion was the incidence of back pain [%] analyzed in terms of pain development from the first measurement day (M1) to the second measurement day (M2) after 2.0 ± 1.0 years. Participants were classified into athletes who developed back pain (BPD) and athletes who did not develop back pain (nBPD). BP (acute or within the last 7 days) was assessed with a 5-step face scale (face 1–2 = no pain; face 3–5 = pain). BPD included all athletes who reported face 1 or 2 at M1 and face 3 to 5 at M2. nBPD included all athletes who reported face 1 or 2 at both M1 and M2. Data were analyzed descriptively. Additionally, a Chi² test was used to analyze gender- and sport-specific differences (α = 0.05).
Results
Thirty-two athletes were categorized as BPD (10%). The gender difference was 5% (m/f: 12%/7%) but did not show statistical significance (p = 0.15). The incidence of BP ranged between 6 and 15% for the different sport categories. Game sports (15%) showed the highest, and explosive strength sports (6%) the lowest incidence. Anthropometrics or training characteristics did not significantly influence BPD (p = 0.14 gender to p = 0.90 sports; r2 = 0.0825).
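As an illustration, the gender comparison above can be reproduced with a standard Pearson Chi² test on a 2 × 2 contingency table. The counts below are approximations reconstructed from the reported percentages (321 athletes, 57% male, 12%/7% incidence), not the study's raw data, and the helper function is a minimal stdlib-only sketch:

```python
import math

def chi2_test_2x2(a, b, c, d):
    """Pearson Chi2 test (df = 1, no continuity correction) for the
    2x2 table [[a, b], [c, d]]; returns (chi2, p)."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # survival function of chi2 with df = 1
    return chi2, p

# Approximate counts: 183 boys (22 with BP, 161 without),
# 138 girls (10 with BP, 128 without)
chi2, p = chi2_test_2x2(22, 161, 10, 128)
```

With these reconstructed counts the test returns p ≈ 0.16, consistent with the non-significant gender difference reported above.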
Conclusions
BP incidence was lower in adolescent athletes compared to young non-athletes and even to the general adult population. Consequently, it can be concluded that high-performance sports do not lead to an additional increase in back pain incidence during early adolescence. Nevertheless, back pain prevention programs should be implemented into daily training routines for sport categories identified as showing high incidence rates.
Background
Overweight and obesity are increasing health problems that are not restricted to adults. Childhood obesity is associated with metabolic, psychological and musculoskeletal comorbidities. However, knowledge about the effect of obesity on foot function across maturation is lacking. Decreased foot function with disproportional loading characteristics is expected for obese children. The aim of this study was to examine the foot loading characteristics during gait of normal-weight, overweight and obese children aged 1–12 years.
Methods
A total of 10382 children aged one to twelve years were enrolled in the study. Finally, 7575 children (m/f: n = 3630/3945; 7.0 ± 2.9 yr; 1.23 ± 0.19 m; 26.6 ± 10.6 kg; BMI: 17.1 ± 2.4 kg/m²) were included for (complete case) data analysis. Children were categorized as normal-weight (≥3rd and <90th percentile; n = 6458), overweight (≥90th and <97th percentile; n = 746) or obese (>97th percentile; n = 371) according to the German reference system, which is based on age- and gender-specific body mass indices (BMI). Plantar pressure measurements were assessed during gait on an instrumented walkway. Contact area, arch index (AI), peak pressure (PP) and force time integral (FTI) were calculated for the total, fore-, mid- and hindfoot. Data were analyzed descriptively (mean ± SD) followed by ANOVA/Welch test (according to homogeneity of variances: yes/no) for group differences according to BMI categorization (normal-weight, overweight, obesity) and for each age group 1 to 12 yrs (post-hoc Tukey–Kramer/Dunnett's C; α = 0.05).
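For orientation, the arch index (AI) referenced above is commonly computed from footprint contact areas following Cavanagh and Rodgers; a minimal sketch, assuming the footprint (toes excluded) has already been segmented into fore-, mid- and hindfoot areas:

```python
def arch_index(forefoot_area, midfoot_area, hindfoot_area):
    """Arch index after Cavanagh & Rodgers: midfoot contact area as a
    fraction of the total footprint area (toes excluded). Higher values
    indicate a lower, flatter longitudinal arch."""
    return midfoot_area / (forefoot_area + midfoot_area + hindfoot_area)

# Hypothetical areas in cm^2: a footprint of 30 + 20 + 30 gives AI = 0.25
ai = arch_index(30.0, 20.0, 30.0)
```

This is an illustrative definition only; the study's plantar pressure software may segment the foot differently.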
Results
Mean walking velocity was 0.95 ± 0.25 m/s with no differences between normal-weight, overweight or obese children (p = 0.0841). Results show higher foot contact area, arch index, peak pressure and force time integral in overweight and obese children (p < 0.001). Obese children showed 1.48-fold (1-year-olds) to 3.49-fold (10-year-olds) higher midfoot loading (FTI) compared with normal-weight children.
Conclusion
Additional body mass leads to higher overall load, with a disproportional impact on the midfoot area and longitudinal foot arch, showing characteristic foot loading patterns. Even the feet of one- and two-year-old children are significantly affected. Childhood overweight and obesity are not compensated for by the musculoskeletal system. To avoid excessive foot loading with a potential risk of discomfort or pain in childhood, prevention strategies should be developed and validated for children with a high body mass index and functional changes in the midfoot area. The presented plantar pressure values could additionally serve as reference data to identify suspicious foot loading patterns in children.
Background
Back pain patients (BPP) show delayed muscle onset, increased co-contractions, and variability as response to quasi-static sudden trunk loading in comparison to healthy controls (H). However, it is unclear whether these results can validly be transferred to suddenly applied walking perturbations, an automated but more functional and complex movement pattern. There is an evident need to develop research-based strategies for the rehabilitation of back pain. Therefore, the investigation of differences in trunk stability between H and BPP in functional movements is of primary interest in order to define suitable intervention regimes. The purpose of this study was to analyse neuromuscular reflex activity as well as three-dimensional trunk kinematics between H and BPP during walking perturbations.
Methods
Eighty H (31m/49f; 29±9yrs; 174±10cm; 71±13kg) and 14 BPP (6m/8f; 30±8yrs; 171±10cm; 67±14kg) walked (1 m/s) on a split-belt treadmill while 15 right-sided perturbations (belt decelerating, 40 m/s², 50 ms duration; 200 ms after heel contact) were randomly applied. Trunk muscle activity was assessed using a 12-lead EMG set-up. Trunk kinematics were measured using a 3-segment model consisting of 12 markers (upper thoracic (UTA), lower thoracic (LTA), lumbar area (LA)). EMG-RMS ([%], 0–200 ms after perturbation) was calculated and normalized to the RMS of unperturbed gait. Latency (TON; ms) and time to maximum activity (TMAX; ms) were analysed. Total motion amplitude (ROM; [°]) and mean angle (Amean; [°]) for extension-flexion, lateral flexion and rotation were calculated (whole stride cycle; 0–200 ms after perturbation) for each of the three segments during unperturbed and perturbed gait. For ROM only, the perturbed step was normalized to the unperturbed step [%] for the whole stride as well as the 200 ms after perturbation. Data were analysed descriptively followed by a Student's t-test to account for group differences. Co-contraction was analysed between ventral and dorsal muscles (V:R) as well as the right-to-left side ratio (Sright:Sleft). The coefficient of variation (CV; %) was calculated (EMG-RMS; ROM) to evaluate the variability across the 15 perturbations for all groups. To account for the unequal distribution of participants to groups, an additional matched-group analysis was conducted: fourteen healthy controls out of group H were sex-, age- and anthropometrically matched (group Hmatched) to the BPP.
Results
No group differences were observed for EMG-RMS or CV analysis (EMG/ROM) (p > 0.025). Co-contraction analysis revealed no differences for V:R and Sright:Sleft between the groups (p > 0.025). BPP showed increased TON and TMAX, significant for Mm. rectus abdominis (p = 0.019) and erector spinae T9/L3 (p = 0.005/p = 0.015). ROM analysis over the unperturbed stride cycle revealed no differences between groups (p > 0.025). Normalization of the perturbed to the unperturbed step led to significant differences for the lumbar segment (LA) in lateral flexion, with BPP showing higher normalized ROM compared to Hmatched (p = 0.02). BPP showed a significantly more flexed posture (UTA (p = 0.02); LTA (p = 0.004)) during normal walking (Amean). Trunk posture (Amean) during perturbation showed higher trunk extension values in LTA segments for H/Hmatched compared to BPP (p = 0.003). The matched-group (BPP vs. Hmatched) analysis did not reveal any systematic changes in these results between groups.
Conclusion
BPP present impaired muscle response times and trunk posture, especially in the sagittal and transversal planes, compared to H. This could indicate reduced trunk stability and higher loading during gait perturbations.
This study aimed to determine the specific physical and basic gymnastics skills considered critical in gymnastics talent identification and selection as well as in promoting men’s artistic gymnastics performances. Fifty-one boys from a provincial gymnastics team (age 11.03 ± 0.95 years; height 1.33 ± 0.05 m; body mass 30.01 ± 5.53 kg; body mass index [BMI] 16.89 ± 3.93 kg/m²) regularly competing at national level voluntarily participated in this study. Anthropometric measures as well as the men’s artistic gymnastics physical test battery (i.e., International Gymnastics Federation [FIG] age group development programme) were used to assess the somatic and physical fitness profile of participants, respectively. The physical characteristics assessed were: muscle strength, flexibility, speed, endurance, and muscle power. Test outcomes were subjected to a principal components analysis to identify the most representative factors. The main findings revealed that power speed, isometric and explosive strength, strength endurance, and dynamic and static flexibility are the most determinant physical fitness aspects of the talent selection process in young male artistic gymnasts. These findings are of utmost importance for talent identification, selection, and development.
The use of functional music in gait training, termed rhythmic auditory stimulation (RAS), and treadmill training (TT) have both been shown to be effective in stroke patients (SP). The combination of RAS and treadmill training (RAS-TT) has not been clinically evaluated to date. The aim of the study was to evaluate the efficacy of RAS-TT on functional gait in SP. The protocol followed the design of an explorative study with a rater-blinded, three-arm, prospective randomized controlled parallel-group design. Forty-five independently walking SP with a hemiparesis of the lower limb or an unsafe and asymmetrical walking pattern were recruited. RAS-TT was carried out over 4 weeks, with TT and neurodevelopmental treatment based on the Bobath approach (NDT) serving as control interventions. For RAS-TT, functional music was adjusted individually while walking on the treadmill. Pre- and post-assessments consisted of the fast gait speed test (FGS), a gait analysis with the Locometre (LOC), the 3-min walking time test (3MWT), and an instrumental evaluation of balance (IEB). Raters were blinded to group assignments. An analysis of covariance (ANCOVA) was performed with affiliated measures from the pre-assessment and the time between stroke and start of the study as covariates. Thirty-five participants (mean age 63.6 ± 8.6 years, mean time between stroke and start of study 42.1 ± 23.7 days) completed the study (11 RAS-TT, 13 TT, 11 NDT). Significant group differences occurred in the FGS for adjusted post-measures in gait velocity [F(2,34) = 3.864, p = 0.032; partial η² = 0.205] and cadence [F(2,34) = 7.656, p = 0.002; partial η² = 0.338]. Group contrasts showed significantly higher values for RAS-TT. Stride length results did not vary between the groups. LOC, 3MWT, and IEB did not indicate group differences. One patient was withdrawn from TT because of pain in one arm.
The study provides first evidence for a higher efficacy of RAS-TT in comparison to the standard approaches TT and NDT in restoring functional gait in SP. The results support the implementation of functional music in neurological gait rehabilitation and its use in combination with treadmill training.
Background/Purpose
Muscular reflex responses of the lower extremities to sudden gait disturbances are related to postural stability and injury risk. Chronic ankle instability (CAI) has been shown to affect activities related to the distal leg muscles while walking. Its effects on the proximal muscle activities of the leg, both for the injured (IN) and uninjured side (NON), remain unclear. Therefore, the aim was to compare differences in the motor control strategy at ipsilateral and contralateral proximal joints during unperturbed and perturbed walking between individuals with CAI and matched controls.
Materials and methods
In a cross-sectional study, 13 participants with unilateral CAI and 13 controls (CON) walked on a split-belt treadmill with and without random left- and right-sided perturbations. EMG amplitudes of muscles at the lower extremities were analyzed 200 ms after perturbations, as well as 200 ms before and 100 ms after (Post100) heel contact while walking. Onset latencies were analyzed at heel contacts and after perturbations. Statistical significance was set at α ≤ 0.05, and 95% confidence intervals were applied to determine group differences. Cohen's d effect sizes were calculated to evaluate the extent of differences.
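The Cohen's d effect sizes mentioned above are conventionally computed with a pooled standard deviation for two independent groups; a minimal sketch (not the authors' code, sample data hypothetical):

```python
import math

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled
    (n-1 weighted) standard deviation."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd

# Two hypothetical EMG-amplitude samples differing by one unit on average
d = cohens_d([2.0, 4.0], [1.0, 3.0])
```

Values of d around 0.2, 0.5 and 0.8 are conventionally read as small, medium and large effects, which is how ranges such as d = 0.30–1.09 in the results are interpreted.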
Results
Participants with CAI showed increased EMG amplitudes for NON-rectus abdominis at Post100 and shorter latencies for IN-gluteus maximus after heel contact compared to CON (p < 0.05). Overall, leg muscles (rectus femoris, biceps femoris, and gluteus medius) activated earlier and less bilaterally (d = 0.30–0.88) and trunk muscles (bilateral rectus abdominis and NON-erector spinae) activated earlier and more for the CAI group than the CON group (d = 0.33–1.09).
Conclusion
Unilateral CAI bilaterally alters the pattern of the motor control strategy around the proximal joints. Neuromuscular training for the muscles whose motor control strategy is altered by CAI could be taken into consideration when planning rehabilitation for CAI.
Background: Chronic ankle instability, developing from ankle sprain, is one of the most common sports injuries. Besides being an ankle issue, chronic ankle instability can also cause additional injuries. Investigating the epidemiology of chronic ankle instability is an essential step towards developing an adequate injury prevention strategy. However, the epidemiology of chronic ankle instability remains unknown. Therefore, the purpose of this study was to investigate the epidemiology of chronic ankle instability through valid and reliable self-reported tools in active populations.
Methods: An electronic search was performed on PubMed and Web of Science in July 2020. The inclusion criteria for articles were: peer-reviewed, published between 2006 and 2020, using one of the valid and reliable tools to evaluate ankle instability, determining chronic ankle instability based on the criteria of the International Ankle Consortium, and including the outcome of epidemiology of chronic ankle instability. The risk of bias of the included studies was evaluated with an adapted tool for the sports injury review method.
Results: After removing duplicated studies, 593 articles were screened for eligibility. Twenty full-texts were screened and finally nine studies were included, assessing 3804 participants in total. The participants were between 15 and 32 years old and represented soldiers, students, athletes and active individuals with a history of ankle sprain. The prevalence of chronic ankle instability was 25%, ranging between 7 and 53%. The prevalence of chronic ankle instability within participants with a history of ankle sprains was 46%, ranging between 9 and 76%. Five included studies identified chronic ankle instability based on the standard criteria, and four studies applied adapted exclusion criteria to conduct the study. Five out of nine included studies showed a low risk of bias.
Conclusions: The prevalence of chronic ankle instability shows a wide range. This could be due to the different exclusion criteria, age, sports discipline, or other factors among the included studies. For future studies, standardized criteria to investigate the epidemiology of chronic ankle instability are required, and studies on the epidemiology of CAI should be prospective. Factors affecting the prevalence of chronic ankle instability should be investigated and clearly described.
Background: The aim of the present study was to verify the concurrent validity of the Gyko inertial sensor system for the assessment of vertical jump height. - Methods: Nineteen female sub-elite youth soccer players (mean age: 14.7 ± 0.6 years) performed three trials of countermovement (CMJ) and squat jumps (SJ), respectively. Maximal vertical jump height was simultaneously quantified with the Gyko system, a Kistler force-plate (i.e., gold standard), and another criterion device that is frequently used in the field, the Optojump system. - Results: Compared to the force-plate, the Gyko system showed a significant systematic bias for mean CMJ (−0.66 cm, p < 0.01, d = 1.41) and mean SJ (−0.91 cm, p < 0.01, d = 1.69) height. Random bias was ±3.2 cm for CMJ and ±4.0 cm for SJ height, and intraclass correlation coefficients (ICCs) were “excellent” (ICC = 0.87 for CMJ and 0.81 for SJ). Compared to the Optojump device, the Gyko system showed a significant systematic bias for mean CMJ (0.55 cm, p < 0.05, d = 0.94) but not for mean SJ (0.39 cm) height. Random bias was ±3.3 cm for CMJ and ±4.2 cm for SJ height, and ICC values were “excellent” (ICC = 0.86 for CMJ and 0.82 for SJ). - Conclusion: Consequently, apparatus-specific regression equations were provided to estimate true vertical jump height for the Kistler force-plate and the Optojump device from Gyko-derived data. Our findings indicate that the Gyko system cannot be used interchangeably with a Kistler force-plate or the Optojump device in trained individuals. It is suggested that practitioners apply the correction equations to estimate vertical jump height for the force-plate and the Optojump system from Gyko-derived data.
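A minimal sketch of how systematic bias (mean device−criterion difference) and random bias (≈1.96 × SD of the differences, i.e., 95% limits of agreement in the Bland–Altman sense) are typically obtained in validity studies such as the one above; the jump-height data below are hypothetical, not the study's measurements:

```python
import statistics

def agreement_stats(device, criterion):
    """Systematic bias = mean of (device - criterion) differences;
    random bias = 1.96 * SD of the differences (95% limits of agreement)."""
    diffs = [d - c for d, c in zip(device, criterion)]
    return statistics.mean(diffs), 1.96 * statistics.stdev(diffs)

# Hypothetical jump heights (cm): inertial-sensor device vs. force-plate criterion
bias, random_bias = agreement_stats([29.0, 31.5, 33.0], [30.0, 32.0, 34.0])
```

A negative systematic bias, as in the sketch, would mean the device underestimates the criterion on average, which mirrors the negative CMJ/SJ biases reported against the force-plate.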
Background:
It has previously been shown that conditioning activities consisting of repetitive hops have the
potential to induce better drop jump (DJ) performance in recreationally active individuals. In the present pilot study,
we investigated whether repetitive conditioning hops can also increase reactive jump and sprint performance in
sprint-trained elite athletes competing at an international level.
Methods:
Jump and sprint performances of 5 athletes were randomly assessed under 2 conditions. The control
condition (CON) comprised 8 DJs and 4 trials of 30-m sprints. The intervention condition (HOP) consisted of 10
maximal repetitive two-legged hops that were conducted 10 s prior to each single DJ and sprint trial. DJ
performance was analyzed using a one-dimensional ground reaction force plate. Step length (SL), contact time (CT),
and sprint time (ST) during the 30-m sprints were recorded using an opto-electronic measurement system.
Results:
Following the conditioning activity, DJ height and external DJ peak power were both significantly
increased by 11% compared to the control condition. All other variables did not show any significant differences
between HOP and CON.
Conclusions:
In the present pilot study, we were able to demonstrate large improvements in DJ performance even
in sprint-trained elite athletes following a conditioning activity consisting of maximal two-legged repetitive hops.
This strengthens the hypothesis that plyometric conditioning exercises can induce performance enhancements in
elite athletes that are even greater than those observed in recreationally active athletes. In addition, it appears that
the transfer of these effects to other stretch-shortening cycle activities is limited, as we did not observe any
changes in sprint performance following the plyometric conditioning activity.
Background: Life events (LEs) are associated with future physical and mental health. They are crucial for understanding the pathways to mental disorders as well as the interactions with biological parameters. However, deeper insight is needed into the complex interplay between the type of LE, its subjective evaluation and accompanying factors such as social support. The "Stralsund Life Event List" (SEL) was developed to facilitate this research.
Methods: The SEL is a standardized interview that assesses the time of occurrence and frequency of 81 LEs, their subjective emotional valence, the perceived social support during the LE experience and the impact of past LEs on present life. Data from 2265 subjects from the general population-based cohort study "Study of Health in Pomerania" (SHIP) were analysed. Based on the mean emotional valence ratings of the whole sample, LEs were categorized as "positive" or "negative". For verification, the SEL was related to lifetime major depressive disorder (MDD; Munich Composite International Diagnostic Interview), childhood trauma (Childhood Trauma Questionnaire), resilience (Resilience Scale) and subjective health (SF-12 Health Survey).
Results: The report of lifetime MDD was associated with more negative emotional valence ratings of negative LEs (OR = 2.96, p < 0.0001). Negative LEs (b = 0.071, p < 0.0001, beta = 0.25) and more negative emotional valence ratings of positive LEs (b = 3.74, p < 0.0001, beta = 0.11) were positively associated with childhood trauma. In contrast, more positive emotional valence ratings of positive LEs were associated with higher resilience (b = -7.05, p < 0.0001, beta = 0.13), and a lower present impact of past negative LEs was associated with better subjective health (b = 2.79, p = 0.001, beta = 0.05). The internal consistency of the generated scores varied considerably, but the mean value was acceptable (averaged Cronbach's alpha > 0.75).
Conclusions: The SEL is a valid instrument that enables the analysis of the number and frequency of LEs, their emotional valence, perceived social support and current impact on life on a global score and on an individual item level. Thus, we can recommend its use in research settings that require the assessment and analysis of the relationship between the occurrence and subjective evaluation of LEs as well as the complex balance between distressing and stabilizing life experiences.
Muscle quality, defined as the ratio of muscle strength to muscle mass, disregards underlying factors which influence muscle strength. The aim of this review was to investigate the relationship of phase angle (PhA), echo intensity (EI), muscular adipose tissue (MAT), muscle fiber type, fascicle pennation angle (θf), fascicle length (lf), muscle oxidative capacity, insulin sensitivity (IS), neuromuscular activation, and motor unit properties to muscle strength. A PubMed search was performed in 2021. The inclusion criteria were: (i) original research, (ii) human participants, (iii) adults (≥18 years). Exclusion criteria were: (i) no full-text, (ii) non-English or -German language, (iii) pathologies. Forty-one studies were identified. Nine studies found a weak–moderate negative (range r: [−0.26]–[−0.656], p < 0.05) correlation between muscle strength and EI. Four studies found a weak–moderate positive correlation (range r: 0.177–0.696, p < 0.05) between muscle strength and PhA. Two studies found a moderate–strong negative correlation (range r: [−0.446]–[−0.87], p < 0.05) between muscle strength and MAT. Two studies found a weak–strong positive correlation (range r: 0.28–0.907, p < 0.05) between θf and muscle strength. Muscle oxidative capacity was found to be a predictor of muscle strength. This review highlights that the current definition of muscle quality should be expanded to encompass all possible factors of muscle quality.
Serious knee pain and related disability have an annual prevalence of approximately 25% among those over the age of 55 years. As curative treatments for common knee problems are not available to date, knee pathologies typically progress and often lead to osteoarthritis (OA). While the roles that the meniscus plays in knee biomechanics are well characterized, the biological mechanisms underlying meniscus pathophysiology and its roles in knee pain and OA progression are not fully clear. Experimental treatments for knee disorders that are successful in animal models often produce unsatisfactory results in humans due to species differences or the inability to fully replicate disease progression in experimental animals. The use of animals with spontaneous knee pathologies, such as dogs, can significantly help address this issue. As the microscopic and macroscopic anatomy of the canine and human menisci are similar, spontaneous meniscal pathologies in canine patients are thought to be highly relevant for translational medicine. However, it is not clear whether the biomolecular mechanisms of pain, degradation of the extracellular matrix, and inflammatory responses are species dependent. The aims of this review are (1) to provide an overview of the anatomy, physiology, and pathology of the human and canine meniscus, (2) to compare the known signaling pathways involved in spontaneous meniscus pathology between both species, and (3) to assess the relevance of dogs with spontaneous meniscal pathology as a translational model. Understanding these mechanisms in the human and canine meniscus can help to advance diagnostic and therapeutic strategies for painful knee disorders and improve clinical decision making.
Degenerative disc disease is associated with increased expression of pro-inflammatory cytokines in the intervertebral disc (IVD). However, it is not completely clear how inflammation arises in the IVD and which cellular compartments are involved in this process. Recently, the endoplasmic reticulum (ER) has emerged as a possible modulator of inflammation in age-related disorders. In addition, ER stress has been associated with the microenvironment of degenerated IVDs. Therefore, the aim of this study was to analyze the effects of ER stress on inflammatory responses in degenerated human IVDs and the associated molecular mechanisms. Gene expression of the ER stress marker GRP78 and the pro-inflammatory cytokines IL-6, IL-8, IL-1β, and TNF-α was analyzed in human surgical IVD samples (n = 51, Pfirrmann grade 2–5). The expression of GRP78 positively correlated with the degeneration grade in lumbar IVDs and with IL-6, but not with IL-1β and TNF-α. Another set of human surgical IVD samples (n = 25) was used to prepare primary cell cultures. The ER stress inducer thapsigargin (Tg, 100 and 500 nM) activated gene and protein expression of IL-6 and induced phosphorylation of p38 MAPK. Both inhibition of p38 MAPK by SB203580 (10 µM) and knockdown of the ER stress effector CCAAT/enhancer-binding protein homologous protein (CHOP) reduced gene and protein expression of IL-6 in Tg-treated cells. Furthermore, the effects of an inflammatory microenvironment on ER stress were tested. TNF-α (5 and 10 ng/mL) did not activate ER stress, while IL-1β (5 and 10 ng/mL) activated gene and protein expression of GRP78 but did not influence [Ca2+]i flux and expression of CHOP, indicating that pro-inflammatory cytokines alone may not induce ER stress in vivo. This study showed that IL-6 release in the IVD can be initiated following ER stress and that ER stress mediates IL-6 release through p38 MAPK and CHOP.
Therapeutic targeting of ER stress response may reduce the consequences of the harsh microenvironment in degenerated IVD.
Background: The goal of this study was to estimate the prevalence of and risk factors for diagnosed depression in heart failure (HF) patients in German primary care practices.
Methods: This study was a retrospective database analysis in Germany utilizing the Disease Analyzer® database (IMS Health, Germany). The study population included 132,994 patients between 40 and 90 years of age from 1,072 primary care practices. The observation period was between 2004 and 2013. Follow-up lasted up to five years and ended in April 2015. A total of 66,497 HF patients were selected after applying exclusion criteria. The same number of 66,497 controls were chosen and matched (1:1) to HF patients on the basis of age, sex, health insurance, depression diagnosis in the past, and follow-up duration after the index date.
Results: HF was a strong risk factor for diagnosed depression (p < 0.0001). A total of 10.5% of HF patients and 6.3% of matched controls developed depression after one year of follow-up (p < 0.001). Depression was documented in 28.9% of the HF group and 18.2% of the control group after the five-year follow-up (p < 0.001). Cancer, dementia, osteoporosis, stroke, and osteoarthritis were associated with a higher risk of developing depression. Male gender and private health insurance were associated with a lower risk of depression.
Conclusions: The risk of diagnosed depression is significantly increased in patients with HF compared to patients without HF in primary care practices in Germany.
Background: Data on electrocardiographic and echocardiographic pre-participation screening findings in paediatric athletes are limited.
Methods and results: 10–15-year-old athletes (n = 343) were screened using electro- and echocardiography. The electrocardiogram (ECG) was normal in 220 (64%), mildly abnormal in 108 (31%), and distinctly abnormal in 15 (4%) athletes. Echocardiographic upper reference limits (URL, 97.5th percentile) for the left ventricular (LV) wall thickness in 10–11-year-old boys and girls were 9–10 mm and 8–9 mm, respectively; in 12–13-year-old boys and girls 9–10 mm; and in 14–15-year-old boys and girls 10–11 mm and 9–10 mm, respectively. Three athletes were excluded from competitive sports: one for symptomatic Wolff-Parkinson-White syndrome with a normal echocardiogram; one for negative T-waves in V1–V4 and a dilated right ventricle by echocardiography suggestive of (arrhythmogenic) right ventricular disease; and one for a normal ECG and a bicuspid aortic valve including an aneurysm of the ascending aorta detected by echocardiography. Related to echocardiographic findings, the sensitivity and specificity of the ECG to identify cardiovascular abnormalities were 38% and 64%, respectively. The ECG's positive-predictive and negative-predictive values were 13% and 88%, respectively. The numbers needed to screen and calculated costs were 172 for ECG (€7049), 172 for echocardiography (€11,530), and 114 combining ECG and echocardiography (€9323).
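The sensitivity, specificity and predictive values reported above follow from a standard 2 × 2 screening table (test result vs. echocardiographic finding); a minimal sketch with hypothetical counts, not the study's data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive value
    from the four cells of a 2x2 screening table."""
    return (tp / (tp + fn),   # sensitivity: abnormal cases flagged by the test
            tn / (tn + fp),   # specificity: normal cases cleared by the test
            tp / (tp + fp),   # PPV: flagged results that are truly abnormal
            tn / (tn + fn))   # NPV: cleared results that are truly normal

# Hypothetical counts: 8 true positives, 2 false positives,
# 2 false negatives, 8 true negatives
sens, spec, ppv, npv = screening_metrics(8, 2, 2, 8)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common the abnormality is in the screened population, which is why a test with moderate sensitivity can still yield a low PPV in a low-prevalence athlete cohort.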
Conclusions: Compared to adults, paediatric athletes presented with fewer distinctly abnormal ECGs, and there was no gender difference in paediatric athletes' ECG-pattern distribution. A combination of ECG and echocardiography for pre-participation screening of paediatric athletes is superior to ECG alone but 30% more costly.