Background Recent shoulder injury prevention programs have utilized resistance exercises combined with different forms of instability, with the goal of eliciting functional adaptations and thereby reducing the risk of injury. However, it is still unknown how an unstable weight mass (UWM) affects the muscular activity of the shoulder stabilizers. The aim of the study was to assess the neuromuscular activity of the dynamic shoulder stabilizers under four conditions of stable and unstable weight mass during three shoulder exercises. It was hypothesized that the combined condition of weight and UWM would elicit the greatest activation due to the increased stabilization demand. Methods Sixteen participants (7 m/9 f) were included in this cross-sectional study and prepared with an EMG setup for the Mm. upper/lower trapezius (U.TA/L.TA), lateral deltoid (DE), latissimus dorsi (LD), serratus anterior (SA) and pectoralis major (PE). A maximal voluntary isometric contraction test (MVIC; 5 s) was performed on an isokinetic dynamometer. Next, internal/external rotation (In/Ex), abduction/adduction (Ab/Ad) and diagonal flexion/extension (F/E) exercises (5 reps) were performed with four custom-made pipes representing the different exercise conditions: first the empty pipe (P; 0.5 kg) and then, in randomized order, the water-filled pipe (PW; 1 kg), the weight pipe (PG; 4.5 kg) and the weight + water-filled pipe (PWG; 4.5 kg), while EMG was recorded. Raw root-mean-square values (RMS) were normalized to MVIC (%MVIC). Differences between conditions for RMS%MVIC, scapular stabilizer ratios (SR: U.TA/L.TA; U.TA/SA) and contraction ratios (CR: concentric/eccentric) were analyzed (paired t-test; p ≤ 0.05; Bonferroni-adjusted α = 0.008). Results Compared to P and PW, PWG showed significantly greater muscle activity for all exercises and all muscles except PE. Condition PG elicited muscular activity comparable to PWG (p > 0.008), with significantly lower activation of L.TA and SA in In/Ex rotation.
The SR ratio was significantly higher in PWG than in P and PW. No significant differences were found for the CR ratio in any exercise or muscle. Conclusion Higher weight generated greater muscle activation, whereas an UWM raised the neuromuscular activity, increasing the stabilization demands. Especially in In/Ex rotation, an UWM increased the RMS%MVIC and the SR ratio. This might improve training effects in shoulder injury prevention and rehabilitation programs.
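The EMG processing described above (RMS amplitudes normalized to MVIC, then compared between conditions with Bonferroni-adjusted paired t-tests) can be sketched roughly as follows. All data values and the number of comparisons are hypothetical and serve only to illustrate the computation, not to reproduce the study's results.

```python
import numpy as np
from scipy import stats

def rms(signal):
    """Root-mean-square amplitude of an EMG signal segment."""
    return float(np.sqrt(np.mean(np.square(signal))))

def normalize_to_mvic(task_rms, mvic_rms):
    """Express a task RMS value as a percentage of the MVIC reference (%MVIC)."""
    return 100.0 * task_rms / mvic_rms

# Hypothetical %MVIC values per participant for two conditions (e.g., P vs. PWG)
p_condition   = np.array([22.1, 18.4, 25.0, 19.7, 23.3, 20.8, 24.5, 21.2])
pwg_condition = np.array([31.5, 27.9, 33.2, 28.8, 30.1, 29.4, 34.0, 30.6])

# Paired t-test with a Bonferroni-adjusted alpha (0.05 / 6 comparisons ~ 0.008)
ALPHA_ADJUSTED = 0.05 / 6
t_stat, p_value = stats.ttest_rel(p_condition, pwg_condition)
significant = p_value <= ALPHA_ADJUSTED
```

The adjusted alpha simply divides the nominal 0.05 level by the number of planned comparisons, which matches the 0.008 threshold reported in the abstract.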
Background Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which may develop from an acute ankle sprain, can negatively affect quality of life and ankle functionality and increases the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventative strategy against CAI in the basketball population, gathering epidemiological data is essential; however, such data on CAI in basketball are limited. Therefore, this study aims to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence. Methods In this cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. Besides non-standardized questions about demographics and their history of ankle sprains, participants filled out the standard Cumberland Ankle Instability Tool, which was applied to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (male = 243, female = 145; 22.3 ± 3.8 years; 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between genders, competitive levels and playing positions were determined using the Chi-square test. Results In the surveyed cohort, 26% had unilateral CAI, while 50% had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). This gender disparity was also evident in the sub-analyses: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence did not differ between competitive levels (p > 0.05) or among playing positions (p > 0.05). Conclusions CAI is highly prevalent in the basketball population. Gender affects the prevalence of CAI; regardless of competitive level and playing position, the prevalence of CAI is similar.
The characteristics of basketball contribute to the high prevalence. Prevention of CAI should be a focus in basketball, and gender should be taken into consideration when applying CAI prevention measures.
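A prevalence comparison of this kind (a Chi-square test on a gender-by-CAI contingency table) can be sketched as follows. The counts are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table: rows = gender, columns = CAI status.
# (Counts are illustrative only, not taken from the study.)
table = np.array([
    [170, 73],   # male:   CAI present, CAI absent
    [120, 25],   # female: CAI present, CAI absent
])

# Pearson's Chi-square test of independence (no continuity correction)
chi2, p, dof, expected = chi2_contingency(table, correction=False)
gender_effect = p <= 0.05  # does prevalence differ between genders?
```

`chi2_contingency` also returns the expected cell counts, which is a quick way to verify that the Chi-square approximation is appropriate (all expected counts comfortably above 5 here).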
Physical activity and exercise are effective approaches in the prevention and therapy of multiple diseases. Although the specific characteristics of lengthening contractions have the potential to be beneficial in many clinical conditions, eccentric training is not commonly used in clinical populations with metabolic, orthopaedic, or neurologic conditions. The purpose of this pilot study is to investigate the feasibility, functional benefits, and systemic responses of an eccentric exercise program focused on the trunk and lower extremities in people with low back pain (LBP) and multiple sclerosis (MS). A six-week eccentric training program with three weekly sessions is performed by people with LBP and MS. The program consists of ten exercises addressing strength of the trunk and lower extremities. The study follows a four-group design (N = 12 per group) in two study centers (Israel and Germany). Three groups perform the eccentric training program: A) a control group (healthy, asymptomatic); B) people with LBP; C) people with MS; group D (people with MS) receives standard-care physiotherapy. Baseline measurements are conducted before the first training session; post-measurements take place after the last session. Both comprise blood sampling, self-reported questionnaires, and mobility, balance, and strength testing. The feasibility of the eccentric training program will be evaluated using quantitative and qualitative measures related to the study process, compliance and adherence, safety, and overall program assessment. For a preliminary assessment of potential intervention effects, surrogate parameters related to mobility, postural control, muscle strength and systemic effects are assessed. The presented study will add knowledge regarding the safety, feasibility, and initial effects of eccentric training in people with orthopaedic and neurological conditions. The simple exercises, which are easily modifiable in complexity and intensity, are likely beneficial to other populations as well.
Thus, multiple applications and implementation pathways for the training program presented here are conceivable.
Core-specific stability exercises are known to improve trunk stability. The purpose of this study was to assess the effect of an additional 6 weeks of sensorimotor or resistance training on maximum isokinetic trunk strength and on the response to sudden dynamic trunk loading (STL) in highly trained adolescent athletes. The study was conducted as a single-blind, 3-armed randomized controlled trial. Twenty-four adolescent athletes (14 f/10 m; 16 ± 1 yrs; 178 ± 10 cm; 67 ± 11 kg; training sessions/week 15 ± 5; training h/week 22 ± 8) were randomized into a resistance training (RT; n = 7), a sensorimotor training (SMT; n = 10), and a control group (CG; n = 7). Athletes were instructed to perform standardized, center-based training for 6 weeks, twice per week, with a duration of 1 h per session. SMT consisted of four different core-specific sensorimotor exercises using unstable surfaces. RT consisted of four trunk strength exercises using strength training machines as well as an isokinetic dynamometer. All participants in the CG received unspecific, heart-rate-controlled, ergometer-based endurance training (50 min at a maximum heart rate of 130 bpm). Each training session was documented in an individual training diary for each athlete (e.g., level of SMT exercise, 1RM for strength exercises, pain). At baseline (M1) and after 6 weeks of intervention (M2), participants' maximum strength in trunk rotation (ROM: 63°) and flexion/extension (ROM: 55°) was tested on an isokinetic dynamometer (concentric/eccentric, 30°/s). STL was assessed in eccentric (30°/s) mode with additional dynamometer-induced perturbation as a marker of core stability. Peak torque [Nm] was calculated as the main outcome. The primary outcome measurements (trunk rotation/extension peak torque: con, ecc, STL) were statistically analyzed by means of a two-factor repeated-measures analysis of variance (α = 0.05).
Out of 12 possible sessions, athletes participated in 8 to 9 sessions (SMT: 9 ± 3; RT: 8 ± 3; CG: 8 ± 4). Regarding the main outcomes of trunk performance, the experimental groups showed no significant pre-post difference for maximum trunk strength testing or for perturbation compensation (p > 0.05). It is concluded that future interventions should exceed 6 weeks in duration, with at least 2 sessions per week, to induce enhanced trunk strength or compensatory responses to sudden, high-intensity trunk loading in already highly trained adolescent athletes, regardless of training regime.
Background Recent studies indicate the existence of a repeated bout effect on the contralateral, untrained limb following eccentric and isometric contractions. Aims This review aims to summarize the evidence on the magnitude and duration of this effect, and on differences between isometric and eccentric preconditioning exercises. Methods Medline, Cochrane, and Web of Science were searched from January 1971 until September 2020. Randomized controlled trials, case-control studies and cross-sectional studies were identified by combining keywords and synonyms (e.g., "contralateral", "exercise", "preconditioning", "protective effect"). At least two of the following outcome parameters were mandatory for study inclusion: strength, muscle soreness, muscle swelling, limb circumference, inflammatory blood markers or protective index (relative change of the aforementioned measures). Results After 1979 articles were identified, 13 studies were included. Most investigations examined the elbow flexors and utilized eccentric isokinetic protocols to induce the contralateral repeated bout effect. The magnitude of protection was reported in four studies; smaller values for the contralateral compared to the ipsilateral repeated bout effect were noted in three studies. The potential mechanism is thought to be of central neural nature, since no differences in peripheral muscle activity were observed. The time course was examined in three investigations. One study showed a smaller protective effect following isometric compared to eccentric preconditioning exercises. Conclusions The contralateral repeated bout effect demonstrates a smaller magnitude and a shorter duration than the ipsilateral repeated bout effect. Future research should incorporate long-term controlled trials with larger populations to identify the central mechanisms. This knowledge should be used in clinical practice to prospectively prepare immobilized limbs for an incremental load.
Repetitive overhead motions in combination with heavy loading have been identified as risk factors for the development of shoulder pain. However, the underlying mechanism is not fully understood. Altered scapular kinematics as a result of muscle fatigue is suspected to be a contributor. PURPOSE: To determine scapular kinematics and scapular muscle activity at the beginning and end of constant shoulder flexion and extension loading in asymptomatic individuals. METHODS: Eleven asymptomatic adults (28±4 yrs; 1.74±0.13 m; 74±16 kg) underwent maximum isokinetic loading of shoulder flexion (FLX) and extension (EXT) in the sagittal plane (ROM: 20-180°; concentric mode; 180°/s) until individual peak torque was reduced by 50%. Simultaneously, 3D scapular kinematics were assessed with a motion capture system, and scapular muscle activity with a 3-lead sEMG of the upper and lower trapezius (UT, LT) and serratus anterior (SA). Scapular position angles were calculated for every 20° increment between 20-120° humerothoracic positions. Muscle activity was quantified by amplitudes (RMS) over the total ROM. Descriptive analyses (mean±SD) of kinematics and muscle activity at the beginning (taskB) and end (taskE) of the loading task were followed by ANOVA and paired t-tests. RESULTS: At taskB, activity ranged from 589±343 mV to 605±250 mV during FLX and from 105±41 mV to 164±73 mV during EXT across muscles. At taskE, activity ranged from 594±304 mV to 875±276 mV during FLX and from 97±33 mV to 147±57 mV during EXT. Increased muscle activity was seen for LT and UT during FLX (meandiff = 141±113 mV for LT, p<0.01; 191±153 mV for UT, p<0.01). Scapular position angles continuously increased in upward rotation, posterior tilt and external rotation during FLX and reversed during EXT, both at taskB and taskE.
At taskE, the scapula showed greater external rotation (meandiff = 3.6±3.7°, p<0.05) during FLX and decreased upward rotation (meandiff = 1.9±2.3°, p<0.05) and posterior tilt (meandiff = 1.0±2.1°, p<0.05) during EXT across humeral positions. CONCLUSIONS: Force reduction as a consequence of fatiguing shoulder loading results in increased scapular muscle activity and minor alterations in scapular motion. Whether even small changes have a clinical impact by creating unfavorable subacromial conditions that potentially initiate pain remains unclear.
PURPOSE: To determine the feasibility of upright compared to supine MRI measurements for characterizing the lumbar spine in adolescent athletes (AA) with spondylolisthesis.
METHODS: Ten AA (n=10; m/f: 4/6; 14.5±1.7 y; 163±7 cm; 52±8 kg) from various sports, diagnosed with spondylolisthesis grade I-II according to Meyerding and confirmed by X-ray in a standing lateral view, were included. Open low-field MRI images (0.25 Tesla) in upright (82°) and supine (0°) position were evaluated by two observers. Medical imaging software was used to measure the anterior translation (AT, mm), lumbosacral joint angle (LSJA, °) and lordosis angle (LA, °). Reliability was analyzed by the intraclass correlation coefficient (ICC) and the standard error of measurement (SEM).
RESULTS: Due to motion artifacts in the upright position, the measures of three participants had to be excluded. Between observers, AT ranged from 4.2±2.7 mm to 5.5±1.9 mm (ICC=0.94, SEM=0.6 mm) in upright and from 4.9±2.4 mm to 5.9±3.0 mm (ICC=0.89, SEM=0.9 mm) in supine position. LSJA varied from 5.1±2.2° to 7.3±1.5° (ICC=0.54, SEM=1.5°) in upright and from 9.8±2.5° to 10±2.4° (ICC=0.73, SEM=1.1°) in supine position. LA differed from 58.8±14.6° to 61.9±6° (ICC=0.94, SEM=1.19°) in upright and from 51.9±11.7° to 52.6±11.1° (ICC=0.98, SEM=1.59°) in supine position.
CONCLUSIONS: Determination of AT and LA showed good to excellent reliability in both upright and supine positions. In contrast, the reliability of LSJA showed only moderate to good agreement between observers and should therefore be interpreted with caution. In addition, motion artifacts should be taken into consideration during upright imaging procedures.
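The ICC and SEM used above can be computed roughly as sketched below. The ICC model shown here (two-way, consistency, single rater, often written ICC(3,1)) and the measurement values are assumptions for illustration; the study's exact model may differ.

```python
import numpy as np

def icc_3_1(ratings):
    """ICC(3,1): two-way mixed effects, consistency, single rater.

    `ratings` is an (n_subjects, k_raters) array. This is an illustrative
    implementation; the exact ICC model used in the study may differ.
    """
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    row_means = r.mean(axis=1)   # per-subject means
    col_means = r.mean(axis=0)   # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)          # between subjects
    resid = r - row_means[:, None] - col_means[None, :] + grand   # interaction residual
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

def sem_from_icc(ratings, icc):
    """Standard error of measurement: SD * sqrt(1 - ICC)."""
    sd = np.asarray(ratings, dtype=float).std(ddof=1)
    return float(sd * np.sqrt(1.0 - icc))

# Hypothetical anterior-translation measurements (mm) by two observers
at_mm = np.array([[4.1, 4.4], [5.0, 5.3], [3.2, 3.5], [6.1, 6.0],
                  [4.8, 5.1], [5.5, 5.9], [2.9, 3.1]])
icc = icc_3_1(at_mm)
sem = sem_from_icc(at_mm, icc)
```

The SEM expresses measurement error in the units of the measurement itself (here mm), which is why the abstract reports it alongside the dimensionless ICC.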
Schomoller, A, Schugardt, M, Kotsch, P, and Mayer, F. The effect of body composition on cycling power during an incremental test in young athletes. J Strength Cond Res 35(11): 3225-3231, 2021. As body composition (BC) is a modifiable factor influencing sports performance, it is of interest for athletes and coaches to optimize BC to fulfill the specific physical demands of a sport discipline. The purpose of this study was to test the impact of body fat (BF) and fat-free mass (FFM) on aerobic performance in young athletes. Body composition parameters were evaluated across gender and age groups of young athletes undergoing their mandatory health examination. The maximal power (in Watts per kilogram body mass) of a stepwise incremental ergometer test was compared between 6 BC types: high BF, high FFM, high BF and high FFM, normal BC values, low BF, and low FFM. With increasing age (11-13 vs. 14-16 years), BF decreased and FFM increased in both genders. Both BC parameters, as well as body mass, correlated moderately with performance output (r = 0.36-0.6). Subjects with high BF, high FFM or both had significantly lower ergometer test results compared with those with low BF and FFM in all age and gender groups (p < 0.05). The finding that high levels of BF and FFM are detrimental to cycling power output is important to consider in disciplines that demand high levels of aerobic and anaerobic performance.
Cardiac remodeling in child and adolescent athletes in association with sport discipline and sex
(2020)
Continuous high training loads are associated with structural cardiac adaptations and the development of an athletic heart in adult athletes, especially in sport disciplines with high dynamic training components. In child and adolescent athletes, these effects are increasingly reported; however, study populations are still very small.
Objective: To assess the reliability of measurements of paraspinal muscle transverse relaxation times (T2 times) between two observers and within one observer at different time points.
Methods: 14 participants (9 f/5 m; 33 ± 5 years; 176 ± 10 cm; 73 ± 12 kg) underwent 2 consecutive MRI scans (M1, M2) on the same day, followed by 1 MRI scan 13-14 days later (M3) in a mobile 1.5 Tesla MRI. T2 times were calculated in T2-weighted turbo spin-echo sequences at the spinal level of the third lumbar vertebra (11 slices, 2 mm slice thickness, 1 mm interslice gap; echo times: 20, 40, 60, 80, 100 ms) for the M. erector spinae (ES) and M. multifidus (MF). The following reliability parameters were calculated for the agreement of T2 times between two different investigators (OBS1 & OBS2) on the same MRI (inter-rater reliability, IR) and by one investigator between different MRIs of the same participant (intersession variability, IS): test-retest variability (TRV; difference/mean × 100), coefficient of variation (CV; standard deviation/mean × 100), Bland-Altman analysis (systematic bias = mean of the differences; upper/lower limits of agreement = bias ± 1.96 × SD), and the intraclass correlation coefficient 3.1 (ICC) with absolute agreement, as well as its 95% confidence interval.
Results: Mean TRV for IR was 2.6% for ES and 4.2% for MF. Mean TRV for IS was 3.5% (ES) and 5.1% (MF). Mean CV for IR was 1.9% (ES) and 3.0% (MF). Mean CV for IS was 2.5% (ES) and 3.6% (MF). A systematic bias of 1.3 ms (ES) and 2.1 ms (MF) was detected for IR, and a systematic bias of 0.4 ms (ES) and 0.07 ms (MF) for IS. The ICC for IR was 0.94 (ES) and 0.87 (MF). The ICC for IS was 0.88 (ES) and 0.82 (MF).
Conclusion: The reliable assessment of paraspinal muscle T2 times justifies their use for scientific purposes. The applied technique can be recommended for future studies that aim to assess changes in T2 times, e.g., after an intense bout of eccentric exercise.
The progression or impediment of fundamental motor skill performance (FMSP) in children depends on internal and environmental factors. Shoes, as an environmental constraint, are believed to affect these movements, as children have been shown to perform qualitatively better with sports shoes than with flip-flop sandals. However, locomotor performance assessments based on biomechanical variables are limited. Therefore, the objective of this experiment was to assess the biomechanical effects of wearing shoes while performing fundamental motor skills in children. Barefoot and shod conditions were tested in healthy children between the ages of 4 and 7 years. They were asked to perform basic and advanced motor skills, including double-leg stance, horizontal jumps and walking, as well as counter-movement jumps, single-leg stance and sprinting. Postural control and ground reaction data were measured with two embedded force plates. A 3D motion capture system was used to analyse the spatiotemporal parameters of walking and sprinting. The findings showed that the parameters of single- and double-leg stance, horizontal and counter-movement jumps did not differ between barefoot and shod conditions. Most of the spatiotemporal variables, including cadence, stride length, stride time, and contact time of walking and sprinting, differed statistically between the barefoot and shod conditions. Consequently, the tested shoes did not change the performance and biomechanics of postural control and jumping tasks; however, the spatiotemporal gait parameters indicate changes in walking and sprinting characteristics with shoes in children.
Acute ankle sprain leads to chronic ankle instability (CAI) in 40% of all cases. CAI is related to a variety of motor adaptations at the lower extremities. Previous investigations identified increased muscle activity during landing in CAI compared to healthy control participants. However, it remains unclear whether muscular alterations at the knee muscles are limited to the involved (unstable) ankle or are also present in the uninvolved leg. The latter might potentially indicate a risk of ankle sprain or future injury in the uninvolved leg. Purpose: To assess whether knee muscle activity differs between the involved and uninvolved leg in participants with CAI during perturbed walking. Method: 10 participants (6 females, 4 males; 26±4 years; 169±9 cm; 65±7 kg) with unilateral CAI walked on a split-belt treadmill (1 m/s) for 5 minutes of baseline walking and 6 minutes of perturbed walking (left and right side, 10 perturbations each). Electromyography (EMG) measurements were performed at the biceps femoris (BF) and rectus femoris (RF). EMG amplitudes (RMS; normalized to MVIC) were analyzed for 200 ms pre-heel contact (Pre200), 100 ms post-heel contact (Post100) and 200 ms after perturbation (Pert200). Data were analyzed by paired t-test/Wilcoxon test based on the presence or absence of normal distribution (Bonferroni-adjusted α level p ≤ 0.0125). Results: No statistical difference was found between the involved and uninvolved leg for RF (Pre200: 4±2% and 11±22%, respectively, p=0.878; Post100: 10±5% and 18±31%, p=0.959; Pert200: 6±3% and 13±24%, p=0.721) or for BF (Pre200: 12±7% and 11±6%, p=0.576; Post100: 10±7% and 9±7%, p=0.732; Pert200: 7±4% and 7±7%, p=0.386). Discussion: No side differences in muscle activity could be revealed for the assessed feedforward and feedback responses (perturbed and unperturbed) in unilateral CAI. Reduced inter-individual variability of muscle activity in the involved leg might indicate a rather stereotypical response pattern.
It remains to be investigated whether muscular control at the knee is unaffected by CAI, or whether both sides adapted in a similar manner to the chronic condition at the ankle.
Chronic ankle instability (CAI) is not only an ankle issue but also affects the sensorimotor system. People with CAI show altered muscle activation in proximal joints such as the hip and knee. However, evidence is limited, and controversial results have been presented regarding changes in the activation of hip muscles in the CAI population. PURPOSE: To investigate the effect of CAI on the activity of hip muscles during normal walking and walking with perturbations. METHODS: 8 subjects with CAI (23 ± 2 years, 171 ± 7 cm and 65 ± 4 kg) and 8 controls (CON) matched by age, height, weight and dominant leg (25 ± 3 years, 172 ± 7 cm and 65 ± 6 kg) walked shod on a split-belt treadmill (1 m/s). Subjects performed 5 minutes of baseline walking and 6 minutes of walking with 10 perturbations (at 200 ms after heel contact, with a 42 m/s² deceleration impulse) on each side. Electromyography signals from the gluteus medius (Gmed) and gluteus maximus (Gmax) were recorded while walking. Muscle amplitudes (root mean square normalized to maximum voluntary isometric contraction) were calculated at 200 ms before heel contact (Pre200) and 100 ms after heel contact (Post100) during normal walking, and at 200 ms after perturbations (Pert200). Differences between groups were examined using the Mann-Whitney U test with Bonferroni correction to account for multiple testing (adjusted α level p ≤ 0.0125). RESULTS: In Gmed, the CAI group showed lower muscle amplitudes than the CON group after heel contact (Post100: 18±7% and 47±21%, p < .01) and after walking perturbations (Pert200: 31±13% and 62±26%, p < .01), but not before heel contact (Pre200: 5±2% and 11±10%, p = 0.195). In Gmax, no difference was found between the CAI and CON groups at any of the three time points (Pre200: 12±5% and 17±12%, p = 0.574; Post100: 41±21% and 41±13%, p = 1.00; Pert200: 79±46% and 62±35%, p = 0.505).
CONCLUSION: People with CAI activated Gmed less than healthy controls in the feedback mechanism (after heel contact and during walking with perturbations), but not in the feedforward mechanism (before heel contact). Lower activation of Gmed may affect balance in the frontal plane and increase the risk of recurrent ankle sprains, giving way, or feelings of ankle instability in patients with CAI during walking. Future studies should investigate the effect of Gmed strengthening or neuromuscular training on CAI rehabilitation.
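The group comparison described above (Mann-Whitney U test with a Bonferroni-adjusted α of 0.0125) can be sketched as follows; the amplitude values are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical Gmed %MVIC amplitudes at one time point (illustrative only)
cai = np.array([15.0, 22.0, 11.0, 18.0, 25.0, 14.0, 20.0, 17.0])
con = np.array([40.0, 55.0, 33.0, 47.0, 62.0, 38.0, 51.0, 49.0])

# Bonferroni correction for four comparisons: 0.05 / 4 = 0.0125
ALPHA_ADJUSTED = 0.05 / 4
u_stat, p_value = mannwhitneyu(cai, con, alternative="two-sided")
group_difference = p_value <= ALPHA_ADJUSTED
```

Because the Mann-Whitney U test only uses ranks, it needs no normality assumption, which suits the small group sizes (n = 8 per group) reported above.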
Eccentric (ECC) exercises might cause muscle damage, characterized by delayed-onset muscle soreness, elevated creatine kinase (CK) levels and local muscle oedema, shown by elevated T2 times in magnet resonance imaging (MRI) scans. Previous research suggests a high inter-individual difference regarding these systemic and local responses to eccentric workload. PURPOSE: To analyze ECC exercise-induced muscle damage in lumbar paraspinal muscles assessed via MRI. METHODS: Ten participants (3f/7m; 33±6y; 174±8cm; 71±12kg) were included in the study. Quantitative paraspinal muscle constitution of M. erector spinae and M. multifidius were assessed in supine position before and 72h after an intense eccentric trunk exercise bout in a mobile 1.5 tesla MRI device. MRI scans were recorded on spinal level L3 (T2-weighted TSE echo sequences, 11 slices, 2mm slice thickness, 3mm gap, echo times: 20, 40, 60, 80, 100ms, TR time: 2500ms). Muscle T2 times were calculated for manually traced regions of interest of the respective muscles with an imaging software. The exercise protocol was performed in an isokinetic device and consisted of 120sec alternating ECC trunk flexion-extension with maximal effort. Venous blood samples were taken before and 72h after the ECC exercise. Descriptive statistics (mean±SD) and t-testing for pre-post ECC exercises were performed. RESULTS: T2 times increased from pre- to post-ECC MRI measurements from 55±3ms to 79±28ms in M. erector spinae and from 62±5ms to 78±24ms in M. multifidius (p<0.001). CK increased from 126±97 U/L to 1447±20579 U/L. High SDs of T2 time and CK in post-ECC measures could be due to inter-individual reactions to ECC exercises. 3 participants showed high local and systemic reactions (HR) with T2 time increases of 120±24% (M. erector spinae) and 73±50% (M. multifidius). In comparison, the remaining 7 participants showed increases of 11±12% (M. erector spinae) and 7±9% (M. multifidius) in T2 time. 
Mean CK increased 9.5-fold in the 3 HR subjects compared with the remaining 7 subjects. CONCLUSIONS: The 120sec maximal ECC trunk flexion-extension protocol induced high amounts of muscle damage in 3 participants. Moderate to low responses were found in the remaining 7 subjects, suggesting that inter-individual predictors play a role in the physiological responses to ECC workload.
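The muscle T2 times reported here are typically obtained by fitting a mono-exponential decay S(TE) = S0·exp(−TE/T2) across the multi-echo signal. A minimal log-linear sketch with synthetic data (the function name and example values are illustrative, not the study's imaging pipeline):

```python
import numpy as np

def fit_t2(echo_times_ms, signals):
    """Estimate T2 via log-linear least squares: S(TE) = S0 * exp(-TE/T2)."""
    te = np.asarray(echo_times_ms, dtype=float)
    log_s = np.log(np.asarray(signals, dtype=float))
    slope, intercept = np.polyfit(te, log_s, 1)  # ln S = ln S0 - TE/T2
    return -1.0 / slope  # T2 in ms

# Synthetic voxel with T2 = 55 ms, echo times as in the protocol (20-100 ms)
te = [20, 40, 60, 80, 100]
s = [np.exp(-t / 55.0) for t in te]
print(round(fit_t2(te, s), 1))  # → 55.0
```

In practice the fit is done per region of interest or per voxel; noise makes a nonlinear fit preferable, but the log-linear version conveys the idea.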
Eccentric exercises (ECC) induce reversible muscle damage, delayed-onset muscle soreness, and an inflammatory reaction that is often followed by a systemic anti-inflammatory response. Thus, ECC might be beneficial for the treatment of metabolic disorders, which are frequently accompanied by low-grade systemic inflammation. However, the extent and time course of the systemic immune response after repeated ECC bouts are poorly characterized.
PURPOSE: To analyze the (anti-)inflammatory response after repeated ECC loading of the trunk.
METHODS: Ten healthy participants (33 ± 6 y; 173 ± 14 cm; 74 ± 16 kg) performed three isokinetic strength measurements of the trunk (concentric (CON), ECC1, ECC2, each 2 wks apart; flexion/extension, velocity 60°/s, 120 s MVC). Pre- and 4, 24, 48, 72, and 168 h post-exercise, muscle soreness (numeric rating scale, NRS) was assessed and blood samples were taken and analyzed [creatine kinase (CK), C-reactive protein (CRP), interleukin-6 (IL-6), IL-10, tumor necrosis factor-α (TNF-α)]. Statistics were done by Friedman's test with Dunn's post hoc test (α = .05).
RESULTS: Mean peak torque was higher during ECC1 (319 ± 142 Nm) than during CON (268 ± 108 Nm; p<.05) and not different between ECC1 and ECC2 (297 ± 126 Nm; p>.05). Markers of muscle damage (peaks post-ECC1: NRS 48h, 4.4±2.9; CK 72h, 14407 ± 19991 U/l) were higher after ECC1 than after CON and ECC2 (p<.05). The responses over 72h (stated as Area under the Curve, AUC) were abolished after ECC2 compared to ECC1 (p<.05) indicating the presence of the repeated bout effect. CRP levels were not changed. IL-6 levels increased 2-fold post-ECC1 (pre: 0.5 ± 0.4 vs. 72h: 1.0 ± 0.8 pg/ml). The IL-6 response was enhanced after ECC1 (AUC 61 ± 37 pg/ml*72h) compared to CON (AUC 33 ± 31 pg/ml*72h; p<.05). After ECC2, the IL-6 response (AUC 43 ± 25 pg/ml*72h) remained lower than post-ECC1, but the difference was not statistically significant. Serum levels of TNF-α and of the anti-inflammatory cytokine IL-10 were below detection limits. Overall, markers of muscle damage and immune response showed high inter-individual variability.
CONCLUSION: Despite maximal ECC loading of a large muscle group, no anti-inflammatory and only weak inflammatory responses were detected in healthy adults. Whether ECC elicits a different reaction in inflammatory clinical conditions remains unclear.
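The responses "stated as area under the curve" over 72 h are commonly computed with the trapezoidal rule over the sampling time points. A minimal sketch with hypothetical marker values (`response_auc` is an illustrative helper; the numbers are not study data):

```python
import numpy as np

def response_auc(hours, values, baseline=None):
    """Trapezoidal area under a marker's time course (e.g., IL-6 over 72 h);
    optionally corrected for a baseline level."""
    h = np.asarray(hours, dtype=float)
    v = np.asarray(values, dtype=float)
    if baseline is not None:
        v = v - baseline
    # Sum of trapezoids between consecutive sampling time points
    return float(np.sum((v[1:] + v[:-1]) * np.diff(h) / 2.0))

# Hypothetical IL-6 course (pg/ml) sampled at 0, 4, 24, 48, 72 h
hours = [0, 4, 24, 48, 72]
il6 = [0.5, 0.6, 0.9, 1.0, 1.0]
print(response_auc(hours, il6))  # units: pg/ml * h over the 72 h window
```

Baseline correction (passing the pre-exercise value) isolates the exercise-induced part of the response from the resting level.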
Background
Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which can develop from an acute ankle sprain, may negatively affect quality of life and ankle function and increase the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventive strategy against CAI in the basketball population, gathering epidemiological data is essential. However, epidemiological data on CAI in basketball are limited. Therefore, this study aims to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence.
Methods
In this cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. Besides non-standardized questions about demographics and their history of ankle sprains, participants filled out the standard Cumberland Ankle Instability Tool, which was applied to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (male = 243, female = 145, 22.3 ± 3.8 years, 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between genders, competitive levels, and playing positions were determined using the chi-square test.
Results
In the surveyed cohort, 26% had unilateral CAI and 50% had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). This gender disparity was also evident in the sub-analyses: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence did not differ between competitive levels (p > 0.05) or among playing positions (p > 0.05).
Conclusions
CAI is highly prevalent in the basketball population. Gender affects the prevalence of CAI, whereas the prevalence is similar regardless of competitive level and playing position. The characteristics of basketball likely contribute to the high prevalence, and prevention of CAI should therefore be a focus in basketball. When implementing CAI prevention measures, gender should be taken into consideration.
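The gender and subgroup comparisons above rest on a chi-square test over prevalence counts. A pure-NumPy sketch of the Pearson statistic for a 2×2 table (the counts are invented for illustration and are not the study's data):

```python
import numpy as np

def chi_square_2x2(table):
    """Pearson chi-square statistic (df = 1) for a 2x2 prevalence table."""
    t = np.asarray(table, dtype=float)
    row = t.sum(axis=1, keepdims=True)
    col = t.sum(axis=0, keepdims=True)
    expected = row * col / t.sum()  # expected counts under independence
    return float(((t - expected) ** 2 / expected).sum())

# Invented counts: rows = women/men, columns = CAI / no CAI (not study data)
stat = chi_square_2x2([[90, 55], [110, 133]])
print(round(stat, 2))  # compare against the chi-square critical value at df = 1
```

With one degree of freedom, values above 3.84 correspond to p < 0.05; library routines such as `scipy.stats.chi2_contingency` additionally return the p-value and can apply Yates' continuity correction.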
Purpose: To cross-culturally translate the Cumberland Ankle Instability Tool (CAIT) into a Taiwan-Chinese version (CAIT-TW), and to evaluate the validity, reliability, and cutoff score of the CAIT-TW for the Taiwan-Chinese athletic population. Materials and methods: The English version of the CAIT was translated into the CAIT-TW following a guideline for cross-cultural adaptation. 77 Taiwanese collegiate athletes with and 58 without chronic ankle instability filled out the CAIT-TW, the Taiwan-Chinese version of the Lower Extremity Functional Scale (LEFS-TW), and a Numeric Rating Scale (NRS). The construct validity, test-retest reliability, internal consistency, and cutoff score of the CAIT-TW were evaluated. Results: For construct validity, the Spearman's correlation coefficients were moderate (CAIT-TW vs. LEFS-TW: Rho = 0.39, p < 0.001) and strong (CAIT-TW vs. NRS: Rho = 0.76, p < 0.001). The test-retest reliability was excellent (ICC2.1 = 0.91, 95% confidence interval = 0.87-0.94, p < 0.001) with good internal consistency (Cronbach's alpha: 0.87). The receiver operating characteristic curve showed a cutoff score of 21.5 (Youden index: 0.73, sensitivity: 0.87, specificity: 0.85). Conclusions: The CAIT-TW is a valid and reliable tool to differentiate between stable and unstable ankles in athletes and may be applied in research or daily practice in Taiwan.
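The reported cutoff is derived from the ROC curve by maximizing the Youden index (J = sensitivity + specificity − 1). A minimal sketch, assuming lower CAIT scores indicate instability (the helper name and scores are invented for illustration, not study data):

```python
import numpy as np

def best_cutoff(scores_unstable, scores_stable):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.
    A positive test is score <= cutoff (lower scores = more unstable)."""
    all_scores = sorted(set(scores_unstable) | set(scores_stable))
    # Candidate cutoffs: midpoints between adjacent observed scores
    candidates = [(a + b) / 2 for a, b in zip(all_scores, all_scores[1:])]
    best = None
    for c in candidates:
        sens = np.mean([s <= c for s in scores_unstable])
        spec = np.mean([s > c for s in scores_stable])
        j = sens + spec - 1
        if best is None or j > best[1]:
            best = (c, j)
    return best

# Hypothetical CAIT-style scores: unstable ankles score lower
cutoff, j = best_cutoff([15, 18, 20, 21, 19], [24, 26, 27, 23, 28])
print(cutoff, round(j, 2))  # → 22.0 1.0
```

In the toy data the groups separate perfectly (J = 1); real score distributions overlap, which is why the study reports J = 0.73 rather than 1.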
Unexpected perturbations during locomotion can occur in daily life or sports. Adequate compensation for such perturbations is crucial for maintaining effective postural control. Studies utilising instrumented treadmills have previously validated perturbed walking protocols; however, responses to perturbed running protocols remain less investigated. Therefore, the purpose of this study was to investigate the feasibility of a new instrumented-treadmill perturbed running protocol. Fifteen participants (age = 28 ± 3 years; height = 172 ± 9 cm; weight = 69 ± 10 kg; 60% female) completed an 8-minute running protocol at a baseline velocity of 2.5 m/s (9 km/h), whilst 15 one-sided belt perturbations were applied (pre-set perturbation characteristics: 150 ms delay (post-heel contact); 2.0 m/s amplitude; 100 ms duration). Perturbation characteristics and EMG responses were recorded. Bland-Altman analysis (BLA) was employed (bias ± limits of agreement (LOA; bias ± 1.96*SD)) and the intra-individual variability of repeated perturbations was assessed via coefficients of variation (CV) (mean ± SD). On average, 9.4 ± 2.2 of 15 intended perturbations were successful. Perturbation delay was 143 ± 10 ms, amplitude was 1.7 ± 0.2 m/s and duration was 69 ± 10 ms. BLA showed -7 ± 13 ms for delay, -0.3 ± 0.1 m/s for amplitude and -30 ± 10 ms for duration. CV showed variability of 19 ± 4.5% for delay, 58 ± 12% for amplitude and 30 ± 7% for duration. EMG RMS amplitudes of the legs and trunk ranged from 113 ± 25% to 332 ± 305% compared to unperturbed gait. This study showed that the application of sudden perturbations during running is achievable, though with increased variability across individuals. Perturbations with the above characteristics appear to have elicited a neuromuscular response during running.
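The two agreement measures used here, Bland-Altman bias ± 1.96·SD of the differences and the coefficient of variation, can be sketched as follows (the delay values are illustrative, not the recorded data):

```python
import numpy as np

def bland_altman(intended, achieved):
    """Bias and limits of agreement (bias ± 1.96*SD of the differences)."""
    d = np.asarray(achieved, dtype=float) - np.asarray(intended, dtype=float)
    return d.mean(), 1.96 * d.std(ddof=1)

def coeff_variation(values):
    """Intra-individual variability as CV (%) = 100 * SD / mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Illustrative perturbation delays in ms (intended pre-set: 150 ms each)
achieved = [143, 150, 138, 145, 149]
bias, loa = bland_altman([150] * 5, achieved)
print(round(bias, 1))  # → -5.0
```

A negative bias means the achieved delays fell short of the pre-set value, matching the direction of the -7 ± 13 ms result reported in the abstract.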
We investigated the possibility of identifying motor units (MUs) with high-density surface electromyography (HDEMG) across experimental sessions on different days. Ten subjects performed submaximal knee extensions in three sessions on three days, each separated by one week, while EMG was recorded from the vastus medialis muscle with high-density electrode grids. The shapes of the MU action potentials (MUAPs) over multiple channels, extracted by HDEMG decomposition, were matched across sessions by cross-correlation. Forty and twenty percent of the decomposed MUs could be tracked across two and three sessions, respectively (average cross-correlation 0.85 ± 0.04). The estimated properties of the matched motor units were similar across sessions; for example, mean discharge rate and recruitment thresholds were measured with an intra-class correlation coefficient (ICC) > 0.80. These results strongly suggest that the same MUs were indeed identified across sessions. This possibility will allow monitoring changes in MU properties following interventions or during the progression of neuromuscular disorders.
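Matching MUAP shapes across sessions by cross-correlation can be sketched as follows: both waveforms are z-normalized and the peak of the full cross-correlation serves as a shift-invariant similarity score (the waveform, threshold, and helper name are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def match_muaps(muap_a, muap_b, threshold=0.8):
    """Similarity of two MUAP waveforms as the peak of the normalized
    cross-correlation; a match is declared above the given threshold."""
    a = np.asarray(muap_a, dtype=float)
    b = np.asarray(muap_b, dtype=float)
    a = (a - a.mean()) / (a.std() * len(a))  # joint normalization so that
    b = (b - b.mean()) / b.std()             # identical shapes score ~1.0
    r = np.correlate(a, b, mode="full").max()
    return r, r >= threshold

# Synthetic Gaussian "action potential" and a slightly shifted copy
t = np.linspace(0.0, 1.0, 200)
wave = np.exp(-((t - 0.5) ** 2) / 0.005)
r, matched = match_muaps(wave, np.roll(wave, 5))
print(bool(matched))  # → True
```

Taking the correlation maximum over all lags makes the score robust to small temporal misalignment between sessions, which is why the shifted copy still matches.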
Eccentric exercise is discussed as a treatment option for clinical populations, but specific responses in terms of muscle damage and systemic inflammation after repeated loading of large muscle groups have not been conclusively characterized. Therefore, this study tested the feasibility of an isokinetic protocol for repeated maximum eccentric loading of the trunk muscles. Nine asymptomatic participants (5 f/4 m; 34±6 yrs; 175±13 cm; 76±17 kg) performed three isokinetic 2-minute all-out trunk strength tests (1x concentric (CON), 2x eccentric (ECC1, ECC2), 2 weeks apart; flexion/extension, 60°/s, ROM 55°). Outcomes were peak torque, torque decline, total work, and indicators of muscle damage and inflammation (over 168 h). Statistics were done using the Friedman test (Dunn’s post-test). For ECC1 and ECC2, peak torque and total work were increased and torque decline reduced compared to CON. Repeated ECC bouts yielded unaltered torque and work outcomes. Muscle damage markers were highest after ECC1 (soreness 48 h, creatine kinase 72 h; p<0.05). Their overall responses (area under the curve) were abolished post-ECC2 compared to post-ECC1 (p<0.05). Interleukin-6 was higher post-ECC1 than CON, and attenuated post-ECC2 (p>0.05). Interleukin-10 and tumor necrosis factor-α were not detectable. All markers showed high inter-individual variability. The protocol was feasible to induce muscle damage indicators after exercising a large muscle group, but the pilot results indicated only weak systemic inflammatory responses in asymptomatic adults.
AIM: To analyze neuromuscular activity patterns of the trunk in healthy controls (H) and back pain patients (BPP) during one-handed lifting of light to heavy loads. METHODS: Seven subjects (3m/4f; 32 ± 7 years; 171 ± 7 cm; 65 ± 11 kg) were assigned to BPP (pain grade ≥ 2) and 36 (13m/23f; 28 ± 8 years; 174 ± 10 cm; 71 ± 12 kg) to H (pain grade ≤ 1). RESULTS: H and BPP did not differ significantly in anthropometrics (P > 0.05). All subjects were able to lift the light and middle loads, but 57% of BPP and 22% of H (all women) were not able to lift the heavy load; χ² analysis revealed statistically significant differences in task failure between H and BPP (P = 0.03). EMG-RMS ranged from 33% ± 10%/30% ± 9% (DL, 1 kg) to 356% ± 148%/283% ± 80% (VR, 20 kg) in H/BPP with no statistical difference between groups regardless of load (P > 0.05). However, the EMG-RMS of the VR was greatest in all lifting tasks for both groups and increased with heavier loads. CONCLUSION: Heavier loading leads to a 2- to 3-fold increase in trunk muscle activity with comparable patterns between groups. Heavy loading (20 kg) leads to task failure, especially in women with back pain.
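The EMG amplitudes in these studies are expressed as root-mean-square values normalized to the MVIC trial (%MVIC). A minimal sketch with synthetic signals (helper names are illustrative, not the studies' processing chain):

```python
import numpy as np

def rms(emg):
    """Root-mean-square amplitude of an EMG window."""
    e = np.asarray(emg, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

def normalize_to_mvic(task_emg, mvic_emg):
    """Express task RMS as a percentage of the MVIC RMS (%MVIC)."""
    return 100.0 * rms(task_emg) / rms(mvic_emg)

# Illustrative signals: the task window has half the MVIC amplitude
mvic = [1.0, -1.0, 1.0, -1.0]
task = [0.5, -0.5, 0.5, -0.5]
print(normalize_to_mvic(task, mvic))  # → 50.0
```

Normalizing to each subject's own MVIC makes amplitudes comparable across muscles and subjects, which is why values above 100% (as in the 20 kg lifts) indicate activity exceeding the isometric maximum trial.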
Background
The metabolic syndrome (MetS) is a risk cluster for a number of secondary diseases. The implementation of prevention programs requires early detection of individuals at risk. However, access to health care providers is limited in structurally weak regions. Brandenburg, a rural federal state in Germany, has an especially high MetS prevalence and disease burden. This study aims to validate and test the feasibility of a setup for mobile diagnostics of MetS and its secondary diseases, to evaluate the MetS prevalence and its association with moderating factors in Brandenburg and to identify new ways of early prevention, while establishing a “Mobile Brandenburg Cohort” to reveal new causes and risk factors for MetS.
Methods
In a pilot study, setups for mobile diagnostics of MetS and secondary diseases will be developed and validated. A van will be equipped as an examination room using point-of-care blood analyzers and by mobilizing standard methods. In study part A, these mobile diagnostic units will be placed at different locations in Brandenburg to locally recruit 5000 participants aged 40-70 years. They will be examined for MetS and advice on nutrition and physical activity will be provided. Questionnaires will be used to evaluate sociodemographics, stress perception, and physical activity. In study part B, participants with MetS, but without known secondary diseases, will receive a detailed mobile medical examination, including MetS diagnostics, medical history, clinical examinations, and instrumental diagnostics for internal, cardiovascular, musculoskeletal, and cognitive disorders. Participants will receive advice on nutrition and an exercise program will be demonstrated on site. People unable to participate in these mobile examinations will be interviewed by telephone. If necessary, participants will be referred to general practitioners for further diagnosis.
Discussion
The mobile diagnostics approach enables early detection of individuals at risk, and their targeted referral to local health care providers. Evaluation of the MetS prevalence, its relation to risk-increasing factors, and the “Mobile Brandenburg Cohort” create a unique database for further longitudinal studies on the implementation of home-based prevention programs to reduce mortality, especially in rural regions.
Trial registration
German Clinical Trials Register, DRKS00022764; registered 07 October 2020—retrospectively registered.
The effects of exercise interventions on non-specific chronic low back pain (CLBP) have been investigated in many studies, but the results are inconclusive regarding exercise types, efficiency, and sustainability. This may be because the influence of psychosocial factors on exercise-induced adaptation in CLBP is neglected. Therefore, this study assessed psychosocial characteristics that moderate and mediate the effects of sensorimotor exercise on LBP. A single-blind, 3-arm, multicenter randomized controlled trial was conducted over 12 weeks. 662 volunteers were randomly assigned to three groups: sensorimotor exercise (SMT), sensorimotor and behavioral training (SMT-BT), and regular routines (CG). Primary outcomes (pain intensity and disability) and psychosocial characteristics were assessed at baseline (M1) and follow-up (3/6/12/24 weeks, M2-M5). Multiple regression models were used to analyze whether psychosocial characteristics are moderators of the relationship between exercise and pain, meaning that psychosocial factors and exercise interact. Causal mediation analyses were conducted to analyze whether psychosocial characteristics mediate the exercise effect on pain. A total of 453 participants with intermittent pain (mean age = 39.5 ± 12.2 years, f = 62%) completed the training. Depressive symptomatology (at M4, M5), vital exhaustion (at M4), and perceived social support (at M5) were shown to be significant moderators of the relationship between exercise and the reduction of pain intensity. Furthermore, depressive mood (at M4), social satisfaction (at M4), and anxiety (at M5, SMT) significantly moderated the exercise effect on pain disability. The amount of moderation was of clinical relevance. In contrast, no psychosocial variables mediated exercise effects on pain.
In conclusion, psychosocial variables can moderate the effect of sensorimotor exercise on CLBP, which may explain conflicting past results regarding the merit of exercise interventions in CLBP. The results further suggest the early identification of psychosocial risk factors by diagnostic tools, which may essentially support the planning of personalized exercise therapy.
Level of Evidence: Level I.
Clinical Trial Registration: DRKS00004977, LOE: I, MiSpEx: grant-number: 080102A/11-14. https://www.drks.de/drks_web/navigate.do?navigationId=trial.HTML&TRIAL_ID=DRKS00004977.
Objective: This study investigated intraindividual differences of intratendinous blood flow (IBF) in response to running exercise in participants with Achilles tendinopathy.
Design: This is a cross-sectional study.
Setting: The study was conducted at the University Outpatient Clinic.
Participants: Sonographically detectable intratendinous blood flow was examined in the symptomatic and contralateral asymptomatic Achilles tendons of 19 participants (42 ± 13 years, 178 ± 10 cm, 76 ± 12 kg, VISA-A 75 ± 16) with clinically diagnosed unilateral Achilles tendinopathy and sonographically evident tendinosis.
Intervention: IBF was assessed using Doppler ultrasound “Advanced Dynamic Flow” before (Upre) and 5, 30, 60, and 120 min (U5–U120) after a standardized submaximal constant load run.
Main Outcome Measure: IBF was quantified by counting the number (n) of vessels in each tendon.
Results: At Upre, IBF was higher in symptomatic compared with asymptomatic tendons [mean 6.3 (95% CI: 2.8–9.9) and 1.7 (0.4–2.9), p < 0.01]. Overall, 63% of symptomatic and 47% of asymptomatic Achilles tendons responded to exercise, whereas 16 and 11% showed persisting IBF and 21 and 42% remained avascular throughout the investigation. At U5, IBF increased in both symptomatic and asymptomatic tendons [difference to baseline: 2.4 (0.3–4.5) and 0.9 (0.5–1.4), p = 0.05]. At U30 to U120, IBF was still increased in symptomatic but not in asymptomatic tendons [mean difference to baseline: 1.9 (0.8–2.9) and 0.1 (-0.9 to 1.2), p < 0.01].
Conclusion: Irrespective of pathology, 47–63% of Achilles tendons responded to exercise with an immediate acute physiological IBF increase by an average of one to two vessels (“responders”). The higher baseline IBF (approximately five vessels) and the prolonged exercise-induced IBF response found in symptomatic ATs indicate a pain-associated altered intratendinous “neovascularization.”
Background: Chronic ankle instability, which can develop from an ankle sprain, is one of the most common sports injuries. Beyond being an ankle issue, chronic ankle instability can also cause additional injuries. Investigating the epidemiology of chronic ankle instability is an essential step toward developing an adequate injury prevention strategy. However, the epidemiology of chronic ankle instability remains unknown. Therefore, the purpose of this study was to investigate the epidemiology of chronic ankle instability using valid and reliable self-reported tools in active populations.
Methods: An electronic search was performed on PubMed and Web of Science in July 2020. The inclusion criteria for articles were: peer-reviewed, published between 2006 and 2020, using one of the valid and reliable tools to evaluate ankle instability, determining chronic ankle instability based on the criteria of the International Ankle Consortium, and including an outcome on the epidemiology of chronic ankle instability. The risk of bias of the included studies was evaluated with a tool adapted for the sports injury review method.
Results: After removing duplicates, 593 articles were screened for eligibility. Twenty full texts were screened and nine studies were finally included, assessing 3804 participants in total. The participants were between 15 and 32 years old and comprised soldiers, students, athletes, and active individuals with a history of ankle sprain. The prevalence of chronic ankle instability was 25%, ranging between 7 and 53%. Among participants with a history of ankle sprains, the prevalence of chronic ankle instability was 46%, ranging between 9 and 76%. Five of the included studies identified chronic ankle instability based on the standard criteria, and four studies applied adapted exclusion criteria. Five out of nine included studies showed a low risk of bias.
Conclusions: The prevalence of chronic ankle instability shows a wide range. This could be due to the different exclusion criteria, ages, sports disciplines, or other factors among the included studies. Future studies require standardized criteria to investigate the epidemiology of chronic ankle instability, and the epidemiology of CAI should be investigated prospectively. Factors affecting the prevalence of chronic ankle instability should be investigated and clearly described.
The aim of this study was to investigate the effect of 6 weeks of sensorimotor training (SMT) or resistance training (RT) on maximum trunk strength and the response to sudden, high-intensity loading in athletes, compared to a control group (CG). The interventions showed no significant difference in maximum strength in concentric and eccentric testing (p > 0.05). For perturbation compensation, a higher peak torque response was present following SMT (extension: +24 Nm, 95% CI ± 19 Nm; rotation: +19 Nm, 95% CI ± 13 Nm) and RT (extension: +35 Nm, 95% CI ± 16 Nm; rotation: +5 Nm, 95% CI ± 4 Nm) compared to CG (extension: -4 Nm, 95% CI ± 16 Nm; rotation: -2 Nm, 95% CI ± 4 Nm) (p < 0.05).
Background: The relationship between exercise-induced intratendinous blood flow (IBF) and tendon pathology or training exposure is unclear.
Objective: This study investigates the acute effect of running exercise on sonographically detectable IBF in healthy and tendinopathic Achilles tendons (ATs) of runners and recreational participants.
Methods: 48 participants (43 ± 13 years, 176 ± 9 cm, 75 ± 11 kg) performed a standardized submaximal 30-min constant load treadmill run with Doppler ultrasound “Advanced dynamic flow” examinations before (Upre) and 5, 30, 60, and 120 min (U5-U120) afterward. Included were runners (>30 km/week) and recreational participants (<10 km/week) with healthy (Hrun, n = 10; Hrec, n = 15) or tendinopathic (Trun, n = 13; Trec, n = 10) ATs. IBF was assessed by counting number [n] of intratendinous vessels. IBF data are presented descriptively (%, median [minimum to maximum range] for baseline-IBF and IBF-difference post-exercise). Statistical differences for group and time point IBF and IBF changes were analyzed with Friedman and Kruskal-Wallis ANOVA (α = 0.05).
Results: At baseline, IBF was detected in 40% (3 [1–6]) of Hrun, in 53% (4 [1–5]) of Hrec, in 85% (3 [1–25]) of Trun, and in 70% (10 [2–30]) of Trec. At U5, IBF responded to exercise in 30% (3 [−1–9]) of Hrun, in 53% (4 [−2–6]) of Hrec, in 70% (4 [−10–10]) of Trun, and in 80% (5 [1–10]) of Trec. While IBF in 80% of healthy responding ATs returned to baseline at U30, IBF remained elevated until U120 in 60% of tendinopathic ATs. Within groups, IBF changes from Upre-U120 were significant for Hrec (p < 0.01), Trun (p = 0.05), and Trec (p < 0.01). Between groups, IBF changes in consecutive examinations were not significantly different (p > 0.05), but the IBF level was significantly higher at all measurement time points in tendinopathic versus healthy ATs (p < 0.05).
Conclusion: Irrespective of training status and tendon pathology, running leads to an immediate increase of IBF in responding tendons. This increase is short-lived in healthy ATs but prolonged in tendinopathic ATs. Training exposure does not alter IBF occurrence, but the IBF level is elevated in tendon pathology. While an immediate exercise-induced IBF increase is a physiological response, prolonged IBF is considered a pathological finding associated with Achilles tendinopathy.
Background
Rupture of the anterior cruciate ligament (ACL) can lead to impaired knee function. Reconstruction decreases the mechanical instability but might not have an impact on sensorimotor alterations.
Objective
Evaluation of the sensorimotor function measured with the active joint position sense (JPS) test in anterior cruciate ligament (ACL) reconstructed patients compared to the contralateral side and a healthy control group.
Methods
The databases MEDLINE, CINAHL, EMBASE, PEDro, Cochrane Library and SPORTDiscus were systematically searched from origin until April 2020. Studies published in English, German, French, Spanish or Italian were included. Evaluation of sensorimotor performance was restricted to the active joint position sense test in ACL-reconstructed participants or healthy controls. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Study quality was evaluated using the Quality Assessment Tool for Observational Cohort and Cross-Sectional Studies. Data were descriptively synthesized.
Results
Ten studies were included after application of the selection criteria. Higher angular deviation in the affected limb, reaching a significant difference (p < 0.001) in one study, was shown up to three months after surgery. Six months post-operatively, significantly less error (p < 0.01) was found in the reconstructed leg compared to the contralateral side and healthy controls. One or more years after ACL reconstruction, significant differences were inconsistent across the studies.
Conclusions
Altered sensorimotor function was present after ACL reconstruction. Due to inconsistencies and small magnitudes, the clinical relevance might be questionable. JPS testing can be performed in acutely injured persons, and prospective studies could enhance knowledge of sensorimotor function throughout the rehabilitative process.
Background/Purpose
Muscular reflex responses of the lower extremities to sudden gait disturbances are related to postural stability and injury risk. Chronic ankle instability (CAI) has been shown to affect activities of the distal leg muscles while walking. Its effects on proximal muscle activities of the leg, both for the injured (IN) and uninjured side (NON), remain unclear. Therefore, the aim was to compare the motor control strategy at ipsilateral and contralateral proximal joints during unperturbed and perturbed walking between individuals with CAI and matched controls.
Materials and methods
In a cross-sectional study, 13 participants with unilateral CAI and 13 controls (CON) walked on a split-belt treadmill with and without random left- and right-sided perturbations. EMG amplitudes of muscles at lower extremities were analyzed 200 ms after perturbations, 200 ms before, and 100 ms after (Post100) heel contact while walking. Onset latencies were analyzed at heel contacts and after perturbations. Statistical significance was set at alpha≤0.05 and 95% confidence intervals were applied to determine group differences. Cohen’s d effect sizes were calculated to evaluate the extent of differences.
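The Cohen's d effect sizes mentioned in the methods follow the standard pooled-standard-deviation formula. A minimal sketch, assuming the conventional definition (the variable names and data are illustrative, not the study's):

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d with pooled SD: (mean_a - mean_b) / SD_pooled.

    Illustrative sketch; not the authors' actual analysis code.
    """
    na, nb = len(group_a), len(group_b)
    # Pooled sample standard deviation across both groups
    pooled_sd = sqrt(((na - 1) * stdev(group_a) ** 2
                      + (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd
```

By convention, |d| around 0.2 is a small, 0.5 a medium, and 0.8 a large effect, which is how ranges such as d = 0.30–1.09 in the results are graded.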
Results
Participants with CAI showed increased EMG amplitudes for the NON-rectus abdominis at Post100 and shorter latencies for the IN-gluteus maximus after heel contact compared to CON (p<0.05). Overall, leg muscles (rectus femoris, biceps femoris, and gluteus medius) activated earlier and less bilaterally (d = 0.30–0.88), and trunk muscles (bilateral rectus abdominis and NON-erector spinae) activated earlier and more for the CAI group than for the CON group (d = 0.33–1.09).
Conclusion
Unilateral CAI bilaterally alters the pattern of the motor control strategy around proximal joints. Neuromuscular training for the muscles whose motor control strategy is altered by CAI could be considered when planning rehabilitation for CAI.
Chronic non-specific low back pain (CNLBP) is internationally one of the most common pain phenomena and can be career-limiting for athletes. Almost one third of annual training absences are attributed to CNLBP. In the development of chronic pain, a multifactorial etiological model with a significant influence of psychosocial risk factors is evident. Although this is already well researched in the general population, comparatively little work on it exists in sports science. This topic is therefore taken up in three multicenter studies and numerous sub-studies of the MiSpEx network (Medicine in Spine-Exercise-Network, funding period 2011–2018). In line with the recommendation of early diagnosis of chronification factors in the German national care guideline for low back pain ("Nationale Versorgungsleitlinie Kreuzschmerz"), the network is concerned, among other things, with the examination, development, and evaluation of diagnostic options. The present article describes the development of a diagnostic tool for psychosocial risk factors that allows both an assessment of the risk of developing CNLBP and an individual allocation to (training) interventions. The development rationale is described, reflecting on various methodological approaches and decision sequences.
BACKGROUND: Compensating for unstable situations is an important functional capability to maintain joint stability, compensate for perturbations, and prevent (re-)injury. Reduced maximum strength and altered neuromuscular activity are therefore expected when instability is introduced into load test situations. The possible effects of induced instability during maximum leg press tests in healthy individuals are not clear. OBJECTIVE: To compare isokinetic leg press (LP) strength and lower-leg muscle activity using stable (S) and unstable (UN) footplates. METHODS: 16 males (28 ± 4 yrs, 179 ± 7 cm, 75 ± 8 kg) performed five maximum LPs in concentric (CON) and eccentric (ECC) mode. Maximum force (Fmax) and muscle activity were measured under S and UN footplate conditions. The tested muscles comprised the tibialis anterior (TA), peroneus longus (PL) and soleus (SOL), and the activity of each was quantified against its MVIC. RESULTS: The main finding was a significant reduction in Fmax under the UN condition: 11.9 ± 11.3% in CON and 23.5 ± 47.8% in ECC (p < 0.05). Significant findings were also noted for the RMS-derived EMG values of PL and TA. CONCLUSION: Unstable LP reduced force generation and increased the activity of the PL and TA muscles, confirming greater neuromuscular effort to compensate for instability. This may have implications for resistance testing and training on an unstable base in the prevention and rehabilitation of injury to the neuromusculoskeletal system.
Research question: The purpose of this study was to evaluate the test-retest reliability of lower extremity kinematics during squat, hip abduction and lunge exercises captured by the Kinect, and to evaluate the agreement with a reference 3D camera-based motion system. Methods: Twenty-one healthy individuals performed five repetitions of each lower limb exercise on two different days. Movements were simultaneously assessed by the Kinect and the reference 3D motion system. Joint angles and positions of the lower limb were calculated for the sagittal and frontal plane. For the inter-session reliability and the agreement between the two systems, the standard error of measurement (SEM), bias with limits of agreement (LoA) and Pearson correlation coefficient (r) were calculated. Results: Parameters indicated varying reliability for the assessed joint angles and positions, and decreasing reliability with increasing task complexity. Across all exercises, measurement deviations were shown especially for small movement amplitudes. Variability was acceptable for joint angles and positions during the squat, partially acceptable during the hip abduction and predominantly unacceptable during the lunge. The agreement between systems was characterized by systematic errors. Overestimations by the Kinect were apparent for hip flexion during the squat and hip abduction/adduction during the hip abduction exercise, as well as for the knee positions during the lunge. Knee and hip flexion during hip abduction and lunge were underestimated by the Kinect. Significance: The Kinect system can reliably assess lower limb joint angles and positions during simple exercises. The validity of the system is, however, restricted. An application in the field of early orthopedic rehabilitation without further development of post-processing techniques seems so far limited.
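The agreement statistics reported above (bias with 95% limits of agreement and Pearson's r) can be sketched with a generic Bland-Altman-style computation. This is a minimal illustration of the standard formulas, not the authors' actual processing pipeline:

```python
from math import sqrt
from statistics import mean, stdev

def agreement_metrics(system_a, system_b):
    """Agreement between two measurement systems (e.g. Kinect vs.
    reference): bias, 95% limits of agreement, and Pearson's r.

    Illustrative sketch under standard definitions; not the study's code.
    """
    diffs = [a - b for a, b in zip(system_a, system_b)]
    bias = mean(diffs)
    loa = 1.96 * stdev(diffs)  # 95% limits of agreement: bias ± 1.96*SD
    # Pearson correlation coefficient between the two systems
    ma, mb = mean(system_a), mean(system_b)
    cov = sum((a - ma) * (b - mb) for a, b in zip(system_a, system_b))
    r = cov / sqrt(sum((a - ma) ** 2 for a in system_a)
                   * sum((b - mb) ** 2 for b in system_b))
    return bias, (bias - loa, bias + loa), r
```

A systematic error shows up as a non-zero bias, while wide LoA indicate large random disagreement between the systems.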
An association between static and dynamic postural control exists in adults with back pain. We aimed to determine whether this association also exists in adolescent athletes with the same condition. In all, 128 athletes with and without back pain performed three 15-s measurements of static (one-legged stance) and dynamic (star excursion balance test) postural control tests. All subjects and a matched subgroup of athletes with and without back pain were analyzed. The smallest center of pressure mediolateral and anterior-posterior displacements (mm) and the normalized highest reach distance were the outcome measures. No association was found between variables of the static and dynamic tests for all subjects and for the matched group with and without back pain. The control of static and dynamic posture in adolescent athletes with and without back pain might not be related.
Cardiovascular drift response over two different constant-load exercises in healthy non-athletes (2019)
Cardiovascular drift (CV-d) is a steady increase in heart rate (HR) over time while performing constant load moderate intensity exercise (CME) > 20 min. CV-d presents problems for the prescription of exercise intensity by means of HR, because the work rate (WR) during exercise must be adjusted to maintain target HR, thus disturbing the intended effect of the exercise intervention. It has been shown that the increase in HR during CME is due to changes in WR and not to CV-d.
Background: The back pain screening tool Risk-Prevention-Index Social (RPI-S) identifies the individual psychosocial risk for low back pain chronification and supports the allocation of patients at risk in additional multidisciplinary treatments. The study objectives were to evaluate (1) the prognostic validity of the RPI-S for a 6-month time frame and (2) the clinical benefit of the RPI-S.
Methods: In a multicenter single-blind 3-armed randomized controlled trial, n = 660 persons (age 18–65 years) were randomly assigned to a twelve-week uni- or multidisciplinary exercise intervention or control group. Psychosocial risk was assessed by the RPI-S domain social environment (RPI-SSE) and the outcome pain by the Chronic Pain Grade Questionnaire (baseline M1, 12-weeks M4, 24-weeks M5). Prognostic validity was quantified by the root mean squared error (RMSE) within the control group. The clinical benefit of RPI-SSE was calculated by repeated measures ANOVA in intervention groups.
Results: A subsample of n = 274 participants (mean = 38.0 years, SD 13.1) was analyzed, of which 30% were classified at risk in their psychosocial profile. The half-year prognostic validity was good (RMSE for disability of 9.04 at M4 and of 9.73 at M5; RMSE for pain intensity of 12.45 at M4 and of 14.49 at M5). People at risk showed significantly stronger reductions in pain disability and intensity at M4/M5 if participating in a multidisciplinary exercise treatment. Subjects at no risk showed a smaller reduction in pain disability in both interventions and no group differences for pain intensity. Regarding disability due to pain, around 41% of the sample would have received an ill-fitting treatment without the back pain screening.
Conclusion: The RPI-SSE demonstrated good prognostic validity and applicability, and its clinical benefit was confirmed by the clear advantage of an individualized treatment allocation.
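Prognostic validity above is quantified by the root mean squared error (RMSE) between predicted and observed pain outcomes. A generic sketch of that metric, assuming paired prediction/observation lists (the RPI-SSE prediction model itself is not reproduced here):

```python
from math import sqrt

def rmse(predicted, observed):
    """Root mean squared error between predicted and observed scores.

    Generic metric sketch; input data here are hypothetical, not the
    trial's pain scores.
    """
    n = len(predicted)
    return sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
```

Lower RMSE means the screening tool's risk prediction tracks the later pain outcome more closely, which is how values such as 9.04 vs. 14.49 compare across outcomes and time points.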
The research aimed to investigate back pain (BP) prevalence in a large cohort of young athletes with respect to age, gender, and sport discipline. BP (within the last 7 days) was assessed with a face scale (face 1–2 = no pain; face 3–5 = pain) in 2116 athletes (m/f 61%/39%; 13.3 ± 1.7 years; 163.0 ± 11.8 cm; 52.6 ± 13.9 kg; 4.9 ± 2.7 training years; 8.4 ± 5.7 training h/week). Four sports categories were devised (a: combat sports; b: game sports; c: explosive strength sports; d: endurance sports). The analysis was descriptive with regard to age, gender, and sport. In addition, 95% confidence intervals (CI) were calculated. In total, 168 (8%) athletes were allocated to the BP group: 9% of females and 7% of males reported BP. Athletes aged 11–13 years showed a prevalence of 2–4%, while prevalence increased to 12–20% in 14- to 17-year-olds. Considering sport discipline, prevalence ranged from 3% (soccer) to 14% (canoeing). Prevalences in weight lifting, judo, wrestling, rowing, and shooting were 10%; in boxing, soccer, handball, cycling, and horse riding, 6%. The 95% CI ranged between 0.08 and 0.11. BP exists in adolescent athletes, but it is uncommon and shows no gender differences. A prevalence increase after age 14 is obvious. Differentiated prevention programs in daily training routines might address sport discipline-specific BP prevalence.
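The prevalence estimates with 95% CI above can be illustrated with a simple normal-approximation interval for a proportion. The paper does not state which CI method it used, so this is an assumption for illustration only:

```python
from math import sqrt

def prevalence_ci(cases, n, z=1.96):
    """Point prevalence with a normal-approximation 95% CI.

    Assumed method for illustration; the study's exact CI
    computation is not specified in the abstract.
    """
    p = cases / n
    half = z * sqrt(p * (1 - p) / n)  # half-width of the interval
    return p, (max(0.0, p - half), min(1.0, p + half))
```

For example, 168 BP cases among 2116 athletes yield a point prevalence of about 0.08, consistent with the reported CI range of 0.08–0.11 across subgroups.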
Instrumented treadmills offer the potential to generate standardized walking perturbations which are particularly rapid and powerful. However, the technical requirements for releasing adequate perturbations regarding timing, duration and amplitude are demanding. This study investigated the test-retest reliability and validity of a new treadmill perturbation protocol releasing rapid and unexpected belt perturbations to provoke muscular reflex responses at the lower extremities and the trunk. Fourteen healthy participants underwent two identical treadmill walking protocols, consisting of 10 superimposed one-sided belt perturbations (100 ms duration; 2 m/s amplitude), triggered by a plantar pressure insole 200 ms after heel contact. Delay, duration and amplitude of the applied perturbations were recorded by 3D motion capture. Muscular reflex responses (within 200 ms) were measured at the lower extremities and the trunk (10-lead EMG). Data were analyzed descriptively (mean ± SD). Reliability was analyzed using test-retest variability (TRV%) and limits of agreement (LoA, bias ± 1.96*SD). Perturbation delay was 202 ± 14 ms, duration was 102 ± 4 ms and amplitude was 2.1 ± 0.01 m/s. TRV for perturbation delay, duration and amplitude ranged from 5.0% to 5.7%. LoA reached 3 ± 36 ms for delay, 2 ± 13 ms for duration and 0.0 ± 0.3 m/s for amplitude. EMG amplitudes following perturbations ranged between 106 ± 97% and 909 ± 979% of unperturbed gait, and EMG latencies between 82 ± 14 ms and 106 ± 16 ms. Minor differences between preset and observed perturbation characteristics and the results of the test-retest analysis prove a high validity with excellent reliability of the setup. Therefore, the protocol tested can be recommended to provoke muscular reflex responses at the lower extremities and the trunk in perturbed walking.
In the context of back pain, great emphasis has been placed on the importance of trunk stability, especially in situations requiring compensation of repetitive, intense loading induced during high-performance activities, e.g., jumping or landing. This study aims to evaluate trunk muscle activity during drop jumps in adolescent athletes with back pain (BP) compared to athletes without back pain (NBP). Eleven adolescent athletes suffering back pain (BP: m/f: n = 4/7; 15.9 ± 1.3 y; 176 ± 11 cm; 68 ± 11 kg; 12.4 ± 10.5 h/week training) and 11 matched athletes without back pain (NBP: m/f: n = 4/7; 15.5 ± 1.3 y; 174 ± 7 cm; 67 ± 8 kg; 14.9 ± 9.5 h/week training) were evaluated. Subjects conducted 3 drop jumps onto a force plate (ground reaction force). Bilateral 12-lead SEMG (surface electromyography) was applied to assess trunk muscle activity. Ground contact time [ms], maximum vertical jump force [N], jump time [ms] and the jump performance index [m/s] were calculated for the drop jumps. SEMG amplitudes (RMS: root mean square [%]) for all 12 single muscles were normalized to MVIC (maximum isometric voluntary contraction) and analyzed in 4 time windows (100 ms pre- and 200 ms post-initial ground contact, 100 ms pre- and 200 ms post-landing) as outcome variables. In addition, muscles were grouped and analyzed as ventral and dorsal muscles, as well as straight and transverse trunk muscles. Drop jump ground reaction force variables did not differ between NBP and BP (p > 0.05). Mm. obliquus externus and internus abdominis presented higher SEMG amplitudes (1.3–1.9-fold) for BP (p < 0.05). Mm. rectus abdominis, erector spinae thoracic/lumbar and latissimus dorsi did not differ (p > 0.05). The muscle group analysis over the whole jumping cycle showed statistically significantly higher SEMG amplitudes for BP in the ventral (p = 0.031) and transverse muscles (p = 0.020) compared to NBP.
Higher activity of transverse, but not straight, trunk muscles might indicate a specific compensation strategy to support trunk stability in athletes with back pain during drop jumps. Therefore, exercises favoring the transverse trunk muscles could be recommended for back pain treatment.
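The RMS%MVIC normalization described above (task-window RMS expressed as a percentage of the MVIC reference) can be sketched as follows. The windowing and sampling details are assumptions for illustration, not the authors' exact processing:

```python
from math import sqrt

def rms(signal):
    """Root mean square amplitude of an EMG window."""
    return sqrt(sum(x * x for x in signal) / len(signal))

def rms_percent_mvic(task_window, mvic_window):
    """Normalize a task EMG window to the MVIC reference (RMS%MVIC).

    Minimal sketch: both arguments are raw amplitude samples from the
    same muscle; real pipelines typically filter and rectify first.
    """
    return 100.0 * rms(task_window) / rms(mvic_window)
```

Normalizing to MVIC makes amplitudes comparable across muscles and participants, which is what allows statements such as "1.3–1.9-fold higher activity" between groups.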
Repetitive overhead movements have been identified as a main risk factor for developing shoulder complaints, with scapular muscle activity being altered. Reliable assessment of muscle activity is essential to differentiate between symptomatic and asymptomatic individuals. Therefore, the present study aimed to investigate the intra- and inter-session reliability of scapular muscle activity during maximal isokinetic shoulder flexion and extension. Eleven asymptomatic adults performed maximum-effort isokinetic shoulder flexion and extension (concentric and eccentric at 60 degrees/s) in a test-retest design. Muscle activity of the upper and lower trapezius and serratus anterior was assessed by sEMG. The root mean square was calculated for the whole ROM and for single movement phases of absolute and normalized muscle activity. Absolute (Bland-Altman analysis (bias, LoA), minimal detectable change (MDC)) and relative reliability parameters (intraclass correlation coefficient (ICC), coefficient of variation (CV)/test-retest variability (TRV)) were utilized for the evaluation of reproducibility. Intra-session reliability revealed ICCs between 0.56 and 0.98, averaged CVs of 18% and average MDCs of 81 mV. Inter-session reliability resulted in ICCs between 0.13 and 0.93, averaged TRVs of 21%, average MDCs of 15% and systematic and random error between −8 ± 60% and 12 ± 36%. Scapular muscle activity assessed in overhead movements can be measured reliably under maximum load conditions, though variability depends on the movement phase. Measurement variability does not exceed the magnitudes of altered scapular muscle activity reported in previous studies. Therefore, maximum load application is a promising approach for the evaluation of changes in scapular control related to pathologies.
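The minimal detectable change (MDC) reported above is conventionally derived from the standard error of measurement. A sketch under the standard formulas SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM; whether the authors used exactly these constants is an assumption:

```python
from math import sqrt

def sem_from_icc(pooled_sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return pooled_sd * sqrt(1.0 - icc)

def minimal_detectable_change(sem, z=1.96):
    """MDC95 = z * sqrt(2) * SEM (two measurements, 95% confidence).

    Changes smaller than the MDC cannot be distinguished from
    measurement error.
    """
    return z * sqrt(2.0) * sem
```

This is why a reported MDC (e.g. 15% of MVIC between sessions) is the benchmark against which pathological alterations in scapular activity must be larger to be considered real.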
Purpose Using a novel technique of high-density surface EMG decomposition and motor unit (MU) tracking, we compared changes in the properties of vastus medialis and vastus lateralis MU after endurance (END) and high-intensity interval training (HIIT). Methods Sixteen men were assigned to the END or the HIIT group (n = 8 each) and performed six training sessions for 14 d. Each session consisted of 8-12 x 60-s intervals at 100% peak power output separated by 75 s of recovery (HIIT) or 90-120 min continuous cycling at ~65% VO2peak (END). Pre- and postintervention, participants performed 1) incremental cycling to determine VO2peak and peak power output and 2) maximal, submaximal (10%, 30%, 50%, and 70% maximum voluntary contraction [MVC]), and sustained (until task failure at 30% MVC) isometric knee extensions while high-density surface EMG signals were recorded from the vastus medialis and vastus lateralis. EMG signals were decomposed (submaximal contractions) into individual MU by convolutive blind source separation. Finally, MU were tracked across sessions by semiblind source separation. Results After training, END and HIIT improved VO2peak similarly (by 5.0% and 6.7%, respectively). The HIIT group showed enhanced maximal knee extension torque by ~7% (P = 0.02), accompanied by an increase in discharge rate for high-threshold MU (50% knee extension MVC) (P < 0.05). By contrast, the END group increased their time to task failure by ~17% but showed no change in MU discharge rates (P > 0.05). Conclusions HIIT and END induce different adjustments in MU discharge rate despite similar improvements in cardiopulmonary fitness. Moreover, the changes induced by HIIT are specific for high-threshold MU. For the first time, we show that HIIT and END induce specific neuromuscular adaptations, possibly related to differences in exercise load intensity and training volume.
Background: Total hip or knee replacement is one of the most frequently performed surgical procedures. Physical rehabilitation following total hip or knee replacement is an essential part of the therapy to improve functional outcomes and quality of life. After discharge from inpatient rehabilitation, a subsequent postoperative exercise therapy is needed to maintain functional mobility. Telerehabilitation may be a potential innovative treatment approach. We aim to investigate the superiority of an interactive telerehabilitation intervention for patients after total hip or knee replacement, in comparison to usual care, regarding physical performance, functional mobility, quality of life and pain. Methods/design: This is an open, randomized controlled, multicenter superiority study with two prospective arms. One hundred and ten eligible and consenting participants with total knee or hip replacement will be recruited at admission to subsequent inpatient rehabilitation. After comprehensive, 3-week, inpatient rehabilitation, the intervention group performs a 3-month, interactive, home-based exercise training with a telerehabilitation system. For this purpose, the physiotherapist creates an individual training plan out of 38 different strength and balance exercises which were implemented in the system. Data about the quality and frequency of training are transmitted to the physiotherapist for further adjustment. Communication between patient and physiotherapist is possible with the system. The control group receives voluntary, usual aftercare programs. Baseline assessments are investigated after discharge from rehabilitation; final assessments 3 months later. The primary outcome is the difference in improvement between intervention and control group in 6-minute walk distance after 3 months. 
Secondary outcomes include differences in the Timed Up and Go Test, the Five-Times-Sit-to-Stand Test, the Stair Ascend Test, the Short-Form 36, the Western Ontario and McMaster Universities Osteoarthritis Index, the International Physical Activity Questionnaire, and postural control as well as gait and kinematic parameters of the lower limbs. Baseline-adjusted analysis of covariance models will be used to test for group differences in the primary and secondary endpoints. Discussion: We expect the intervention group to benefit from the interactive, home-based exercise training in many respects represented by the study endpoints. If successful, this approach could be used to enhance the access to aftercare programs, especially in structurally weak areas.
Increased Achilles (AT) and patellar tendon (PT) thickness has been shown in adolescent athletes compared to non-athletes. However, it is unclear whether these changes are of pathological or physiological origin due to training. The aim of this study was to determine physiological AT and PT thickness adaptation in adolescent elite athletes compared to non-athletes, considering sex and sport. In a longitudinal study design with two measurement days (M1/M2) within an interval of 3.2 +/- 0.8 years, 131 healthy adolescent elite athletes (m/f: 90/41) from 13 different sports and 24 recreationally active controls (m/f: 6/18) were included. Both ATs and PTs were measured at standardized reference points. Athletes were divided into 4 sport categories [ball (B), combat (C), endurance (E) and explosive strength sports (S)]. Descriptive analysis (mean ± SD) and statistical testing for group differences were performed (α = 0.05). AT thickness did not differ significantly between measurement days, neither in athletes (5.6 +/- 0.7 mm/5.6 +/- 0.7 mm) nor in controls (4.8 +/- 0.4 mm/4.9 +/- 0.5 mm, p > 0.05). For PTs, athletes presented increased thickness at M2 (M1: 3.5 +/- 0.5 mm, M2: 3.8 +/- 0.5 mm, p < 0.001). In general, males had thicker ATs and PTs than females (p < 0.05). Considering sex and sport, only male athletes from B, C, and S showed significantly higher PT thickness at M2 compared to controls (p <= 0.01). Sport-specific adaptation of tendon thickness in adolescent elite athletes can be detected in the PTs of male athletes participating in sports with highly repetitive jumping and strength components. Sonographic microstructural analysis might provide enhanced insight into tendon material properties, enabling differentiation of sex and the influence of different sports.
Purpose: Psychosocial variables are known risk factors for the development and chronification of low back pain (LBP). Psychosocial stress is one of these risk factors. Therefore, this study aims to identify the most important types of stress predicting LBP. Self-efficacy was included as a potential protective factor related to both stress and pain.
Participants and Methods: This prospective observational study assessed n = 1071 subjects with low back pain over 2 years. Psychosocial stress was evaluated in a broad manner using instruments assessing perceived stress, stress experiences in work and social contexts, vital exhaustion and life-event stress. Further, self-efficacy and pain (characteristic pain intensity and disability) were assessed. Using least absolute shrinkage and selection operator (LASSO) regression, important predictors of characteristic pain intensity and pain-related disability at 1-year and 2-year follow-up were analyzed.
Results: The final sample for the statistical analysis consisted of 588 subjects (age: 39.2 (± 13.4) years; baseline pain intensity: 27.8 (± 18.4); disability: 14.3 (± 17.9)). At the 1-year follow-up, the stress types “tendency to worry”, “social isolation” and “work discontent”, as well as vital exhaustion and negative life events, were identified as risk factors for both pain intensity and pain-related disability. At the 2-year follow-up, LASSO models identified the stress types “tendency to worry”, “social isolation”, “social conflicts”, and “perceived long-term stress” as potential risk factors for both pain intensity and disability. Furthermore, “self-efficacy” (“internality”, “self-concept”) and “social externality” play a role in reducing pain-related disability.
Conclusion: Stress experiences in social and work-related contexts were identified as important risk factors for LBP 1 or 2 years in the future, even in subjects with low initial pain levels. Self-efficacy turned out to be a protective factor for pain development, especially in the long-term follow-up. Results suggest a differentiation of stress types in addressing psychosocial factors in research, prevention and therapy approaches.
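The LASSO variable-selection step described above can be illustrated with scikit-learn: the L1 penalty shrinks coefficients of uninformative predictors to exactly zero, leaving the "important" stress measures. The feature layout, effect sizes and penalty strength below are invented for illustration and are not the study's data:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 588  # analysis sample size reported above
# five hypothetical stress measures (e.g. worry, isolation, work discontent, ...)
X = rng.normal(size=(n, 5))
# simulated pain score depending only on the first two predictors
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.5, size=n)

X_std = StandardScaler().fit_transform(X)  # LASSO needs comparable scales
lasso = Lasso(alpha=0.2).fit(X_std, y)
# predictors with non-zero coefficients are "selected"
selected = [i for i, c in enumerate(lasso.coef_) if abs(c) > 1e-8]
```

In applied work the penalty `alpha` would be chosen by cross-validation (e.g. `LassoCV`) rather than fixed by hand.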
Background: Exercising at intensities where fat oxidation rates are high has been shown to induce metabolic benefits in recreational and health-oriented sportsmen. The exercise intensity (Fat(peak)) eliciting peak fat oxidation rates is therefore of particular interest when aiming to prescribe exercise for the purpose of fat oxidation and related metabolic effects. Although running and walking are feasible and popular among the target population, no reliable protocols are available to assess Fat(peak) or its corresponding velocity (VPFO) during treadmill ergometry. Our purpose was, therefore, to assess the reliability and day-to-day variability of VPFO and Fat(peak) during treadmill ergometry running. Conclusion: In summary, relative and absolute reliability indicators for VPFO and Fat(peak) were found to be excellent. The observed LoA may now serve as a basis for future training prescriptions, although fat oxidation rates during prolonged exercise bouts at this intensity still need to be investigated.
Tendon adaptation due to mechanical loading is controversially discussed. However, data concerning the development of tendon thickness in adolescent athletes are sparse. The purpose of this study was to examine possible differences in Achilles (AT) and patellar tendon (PT) thickness in adolescent athletes while considering age, gender and sport-specific loading. In 500 adolescent competitive athletes from 16 different sports and 40 recreational controls, both ATs and PTs were measured sonographically. Subjects were divided into 2 age groups (< 13; ≥ 13 years) and 6 sport type categories (ball, combat, and water sports, combined disciplines, cycling, controls). In addition, 3 risk groups (low, moderate, high) were created according to the athlete’s risk of developing tendinopathy. AT and PT thickness did not differ significantly between age groups (AT/PT: < 13: 5.4 ± 0.7 mm / 3.6 ± 0.5 mm; ≥ 13: 5.3 ± 0.7 mm / 3.6 ± 0.5 mm). In both age groups males presented greater tendon thickness than females (p < 0.001). AT thickness was highest in ball sports/cyclists and lowest in controls (p ≤ 0.002). PT thickness was greatest in water sports and lowest in controls (p = 0.02). High-risk athletes presented slightly greater AT thickness compared to the low-risk group (p = 0.03). Increased AT and PT thickness in certain sport types compared to controls supports the hypothesis of structural tendon adaptation due to sport-specific loading.
Background: Overweight and obesity are increasing health problems that are not restricted to adults. Childhood obesity is associated with metabolic, psychological and musculoskeletal comorbidities. However, knowledge about the effect of obesity on foot function across maturation is lacking. Decreased foot function with disproportional loading characteristics is expected for obese children. The aim of this study was to examine foot loading characteristics during gait of normal-weight, overweight and obese children aged 1-12 years. Results: Mean walking velocity was 0.95 +/- 0.25 m/s, with no differences between normal-weight, overweight or obese children (p = 0.0841). Results show higher foot contact area, arch index, peak pressure and force-time integral in overweight and obese children (p < 0.001). Obese children showed 1.48-fold (1-year-olds) to 3.49-fold (10-year-olds) midfoot loading (FTI) compared to normal-weight children. Conclusion: Additional body mass leads to higher overall load, with disproportional impact on the midfoot area and longitudinal foot arch, showing characteristic foot loading patterns. Even the feet of one- and two-year-old children are significantly affected. Childhood overweight and obesity are not compensated by the musculoskeletal system. To avoid excessive foot loading with a potential risk of discomfort or pain in childhood, prevention strategies should be developed and validated for children with a high body mass index and functional changes in the midfoot area. The presented plantar pressure values could additionally serve as reference data to identify suspicious foot loading patterns in children.
BACK PAIN: THE STUDY OF MECHANISMS AND THE TRANSLATION IN INTERVENTIONS WITHIN THE MISPEX NETWORK
(2016)
Stumbling led to an increase in ROM, compared to unperturbed gait, in all segments and planes. These increases ranged between 107 +/- 26% (UTA/rotation) and 262 +/- 132% (UTS/lateral flexion), significant only in lateral flexion. EMG activity of the trunk was increased during stumbling (abdominal: 665 +/- 283%; back: 501 +/- 215%), without significant differences between muscles. Provoked stumbling leads to a measurable effect on the trunk, quantifiable by an increase in ROM and EMG activity, compared to normal walking. Greater abdominal muscle activity and ROM of lateral flexion may indicate a specific compensation pattern occurring during stumbling. (C) 2015 Elsevier Ltd. All rights reserved.
Objective: To assess the intra- and inter-session reliability of estimates of motor unit behavior and muscle fiber properties derived from high-density surface electromyography (HDEMG). Methods: Ten healthy subjects performed submaximal isometric knee extensions during three recording sessions (separate days) at 10%, 30%, 50% and 70% of their maximum voluntary effort. The discharge timings of motor units of the vastus lateralis and medialis muscles were automatically identified from HDEMG by a decomposition algorithm. We characterized the number of detected motor units, their discharge rates, the coefficient of variation of their inter-spike intervals (CoVisi), the action potential conduction velocity and the peak-to-peak amplitude. Reliability was assessed for each motor unit characteristic by the intra-class correlation coefficient (ICC). Additionally, a pulse-to-noise ratio (PNR) was calculated to verify the accuracy of the decomposition. Results: Good to excellent reliability within and between sessions was found for all motor unit characteristics at all force levels (ICCs > 0.8), with the exception of CoVisi, which presented poor reliability (ICC < 0.6). PNR was high and similar for both muscles, with values ranging between 45.1 and 47.6 dB (accuracy > 95%). Conclusion: Motor unit features can be assessed non-invasively and reliably within and across sessions over a wide range of force levels. Significance: These results suggest that it is possible to characterize motor units in longitudinal intervention studies. (C) 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
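Intraclass correlation coefficients like those reported above are computed from a subjects × sessions matrix of repeated measures. As a sketch, here is the two-way random-effects, single-measure form ICC(2,1) in plain numpy — one common variant; the abstract does not state which ICC form was used:

```python
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, single-measure ICC(2,1).

    data: 2-D array, rows = subjects, columns = sessions/raters.
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)   # per-subject means
    col_means = data.mean(axis=0)   # per-session means
    # sums of squares of the two-way ANOVA decomposition
    ssr = k * ((row_means - grand) ** 2).sum()   # between subjects
    ssc = n * ((col_means - grand) ** 2).sum()   # between sessions
    sse = ((data - grand) ** 2).sum() - ssr - ssc  # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

For a subject-by-session matrix with near-identical columns the function returns a value close to 1, matching the "good to excellent" range (ICC > 0.8) used above.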
This study aimed to determine the relative and absolute reliability of ultrasound (US) measurements of the thickness and echogenicity of the plantar fascia (PF) at different measurement stations along its length using a standardized protocol. Twelve healthy subjects (24 feet) were enrolled. The PF was imaged in the longitudinal plane. Subjects were assessed twice to evaluate intra-rater reliability. Quantitative evaluation of the thickness and echogenicity of the plantar fascia was performed using ImageJ, a digital image analysis and viewer software. Sonographic evaluation of the thickness and echogenicity of the PF showed high relative reliability, with an intraclass correlation coefficient of 0.88 at all measurement stations. However, the measurement stations for PF thickness and echogenicity that showed the highest intraclass correlation coefficients (ICCs) did not have the highest absolute reliability. Compared to other measurement stations, measuring PF thickness at 3 cm distal, and echogenicity at a region of interest 1 cm to 2 cm distal, from the insertion at the medial calcaneal tubercle showed the highest absolute reliability with the least systematic bias and random error. Also, reliability was higher using the mean of three measurements compared to a single measurement. To reduce discrepancies in the interpretation of thickness and echogenicity measurements of the PF, the absolute reliability of the different measurement stations should be considered in clinical practice and research, rather than relative reliability with the ICC.
Stability of the trunk is relevant in determining trunk response to different loading in everyday tasks initiated by the limbs. Descriptions of the trunk’s mechanical movement patterns in response to different loads while lifting objects are still under debate. Hence, the aim of this study was to analyze the influence of weight on 3-dimensional segmental motion of the trunk during 1-handed lifting. Ten asymptomatic subjects were included (29 ± 3 y; 1.79 ± 0.09 m; 75 ± 14 kg). Subjects lifted a light and a heavy load three times each from the ground onto a table. Three-dimensional segmental trunk motion was measured (12 markers; 3 segments: upper thoracic area [UTA], lower thoracic area [LTA], lumbar area [LA]). Outcomes were total motion amplitudes (ROM; [°]) for anterior flexion, lateral flexion, and rotation of each segment. The highest ROM was observed in the LTA segment (anterior flexion), and the smallest ROM in the UTA segment (lateral flexion). ROM differed for all planes between the 3 segments for both tasks (P < .001). There were no differences in ROM between light and heavy loads (P > .05). No interaction effects (load × segment) were observed, as ROM did not reveal differences between loading tasks. Regardless of weight, the 3 segments did reflect differences, supporting the relevance of multisegmental analysis.
Neuromuscular response of the trunk to sudden gait disturbances: Forward vs. backward perturbation
(2016)
The study aimed to analyse neuromuscular activity of the trunk comparing four different perturbations during gait. Thirteen subjects (28 +/- 3 yrs) walked (1 m/s) on a split-belt treadmill while 4 (belt) perturbations (F1, F2, B1, B2) were randomly applied. Perturbations differed, in terms of treadmill belt translation, in direction (forward (F)/backward (B)) and amplitude (20 m/s(2) (1)/40 m/s(2) (2)). Trunk muscle activity was assessed with a 12-lead EMG. EMG-RMS [%] (0-200 ms after perturbation; normalized to the RMS of normal gait) was analyzed for individual muscles and four trunk areas (ventral left/right; dorsal left/right). Ratios of ventral to dorsal muscle activity were calculated. Muscle onset [ms] was determined. Data analysis was conducted descriptively, followed by ANOVA (post hoc Tukey-Kramer (alpha = 0.05)). All perturbations led to an increase in EMG-RMS (428 +/- 289%). F1 showed the lowest and F2 the highest increase for the flexors. B2 showed the highest increase for the extensors. Significant differences between perturbations could be observed for 6 muscles, as well as for the 4 trunk areas. Ratio analysis revealed no significant differences between stimuli (range 1.25 (B1) to 1.71 (F2)). Muscle response time (ventral: 87.0 +/- 21.7 ms; dorsal: 88.4 +/- 17.0 ms) differed significantly between stimuli only for the dorsal muscles (p = 0.005). Magnitude significantly influences neuromuscular trunk response patterns in healthy adults. Regardless of direction, the ventral muscles always revealed a higher relative increase in activity while compensating for the walking perturbations. (C) 2016 Elsevier Ltd. All rights reserved.
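The EMG-RMS normalization used in this and several of the other studies above can be sketched as follows: the root-mean-square of a 0-200 ms post-perturbation window is expressed as a percentage of the RMS of unperturbed gait. Sampling rate, window length and signal values below are illustrative placeholders:

```python
import numpy as np

def rms(signal):
    """Root mean square of an EMG segment."""
    signal = np.asarray(signal, dtype=float)
    return np.sqrt(np.mean(signal ** 2))

def perturbation_response(emg, fs, onset_s, baseline_rms, window_s=0.2):
    """EMG-RMS in the 0-200 ms window after perturbation onset,
    expressed in % of the RMS of normal gait (as in the study)."""
    start = int(onset_s * fs)            # onset sample index
    stop = start + int(window_s * fs)    # end of the analysis window
    return 100.0 * rms(emg[start:stop]) / baseline_rms
```

A value of, say, 428% then means the post-perturbation window carried more than four times the RMS amplitude of normal gait, matching how the results above are reported.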
Background: Arising from the relevance of sensorimotor training in the therapy of nonspecific low back pain patients and from the value of individualized therapy, the present trial aims to test the feasibility and efficacy of individualized sensorimotor training interventions in patients suffering from nonspecific low back pain. Methods and study design: A multicentre, single-blind, two-armed randomized controlled trial to evaluate the effects of a 12-week (3 weeks supervised centre-based and 9 weeks home-based) individualized sensorimotor exercise program is performed. The control group stays inactive during this period. Outcomes are pain and pain-associated function, as well as motor function, in adults with nonspecific low back pain. Each participant is scheduled for five measurement dates: baseline (M1), following centre-based training (M2), following home-based training (M3) and at two follow-up time points 6 months (M4) and 12 months (M5) after M1. All investigations and the assessment of the primary and secondary outcomes are performed in a standardized order: questionnaires, clinical examination, biomechanics (motor function). Subsequent statistical procedures are executed after examination of the underlying assumptions for parametric or non-parametric testing. Discussion: The results of the study will be of clinical and practical relevance not only for researchers and policy makers but also for the general population suffering from nonspecific low back pain.
Prevalence of Achilles tendinopathy increases with age, leading to a weaker tendon with a predisposition to rupture. Conclusive evidence of the influence of age and pathology on Achilles tendon (AT) properties remains limited, as previous studies are based on standardized isometric conditions. This study investigates the influence of age and pathology on AT properties during a single-leg vertical jump (SLVJ). 10 children (C), 10 asymptomatic adults (A), and 10 tendinopathic patients (T) were included. AT elongation [mm] from rest to maximal displacement during a SLVJ on a force plate was sonographically assessed. AT compliance [mm/N] and strain [%] were calculated by dividing elongation by peak ground reaction force [N] and tendon length, respectively. One-way ANOVA followed by Bonferroni post-hoc correction (α = 0.05) was used to compare C with A and A with T. AT elongation (p = 0.004), compliance (p = 0.001), and strain were significantly higher in C (27 +/- 3 mm, 0.026 +/- 0.006 mm/N, 13 +/- 2%) compared to A (21 +/- 4 mm, 0.017 +/- 0.005 mm/N, 10 +/- 2%). No statistically significant differences (p > 0.05) were found between A and T (25 +/- 5 mm, 0.019 +/- 0.004 mm/N, 12 +/- 3%). During SLVJ, the tendon responded differently with regard to age and pathology, with children having the most compliant AT. The higher compliance found in healthy tendons might be considered a protective factor against load-related injuries.
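The two derived tendon measures above follow directly from their definitions: compliance divides elongation by peak ground reaction force, and strain divides elongation by resting length. A small sketch (the numbers in the usage lines are round placeholders of the same magnitude as the values reported above):

```python
def at_compliance(elongation_mm, peak_force_n):
    """Tendon compliance [mm/N] = elongation / peak ground reaction force."""
    return elongation_mm / peak_force_n

def at_strain(elongation_mm, resting_length_mm):
    """Tendon strain [%] = elongation relative to resting length."""
    return 100.0 * elongation_mm / resting_length_mm

# e.g. 21 mm elongation at a 1000 N peak force, and 20 mm on a 200 mm tendon
compliance = at_compliance(21.0, 1000.0)   # 0.021 mm/N
strain = at_strain(20.0, 200.0)            # 10.0 %
```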
Introduction
Injury prevention programs (IPPs) are an inherent part of training in recreational and professional sports. Providing performance-enhancing benefits in addition to injury prevention may help adjust coaches’ and athletes’ attitudes towards implementing injury prevention into the daily routine. Conventional thinking by players and coaches alike seems to suggest that IPPs need to be specific to one’s sport to allow for performance enhancement. This systematic literature review firstly aims to determine the nature of IPP exercises: whether they are specific to the sport or based on general conditioning. Secondly, it examines whether general, sports-specific or mixed IPPs improve key performance indicators, with the aim of better facilitating long-term implementation of these programs.
Methods
PubMed and Web of Science were searched electronically throughout March 2018. The inclusion criteria were: randomized controlled trials, publication dates between January 2006 and February 2018, athletes (11–45 years), injury prevention programs, and predefined performance measures that could be categorized into balance, power, strength, speed/agility and endurance. The methodological quality of the included articles was assessed with the Cochrane Collaboration assessment tools.
Results
Of 6619 initial findings, 22 studies met the inclusion criteria. In addition, reference lists unearthed a further 6 studies, making a total of 28. Nine studies used sports-specific IPPs, eleven general and eight mixed prevention strategies. Overall, general programs ranged from 29–57% in their effectiveness across performance outcomes. Mixed IPPs improved 80% of balance outcomes but only 20–44% of the others. Sports-specific programs led to larger-scale improvements in balance (66%), power (83%), strength (75%), and speed/agility (62%).
Conclusion
Sports-specific IPPs have the strongest influence on most performance indices, based on the significant improvements versus control groups. Other factors such as intensity, technical execution and compliance should be accounted for in future investigations, in addition to exercise modality.
Background: Telerehabilitation can contribute to the maintenance of successful rehabilitation regardless of location and time. The aim of this study was to investigate a specific three-month interactive telerehabilitation routine regarding its effectiveness in assisting patients with physical functionality and with returning to work compared to typical aftercare.
Objective: The aim of the study was to investigate a specific three-month interactive telerehabilitation with regard to effectiveness in functioning and return to work compared to usual aftercare.
Methods: From August 2016 to December 2017, 111 patients (mean 54.9 years old; SD 6.8; 54.3% female) with hip or knee replacement were enrolled in the randomized controlled trial. At discharge from inpatient rehabilitation and after three months, their distance in the 6-minute walk test was assessed as the primary endpoint. Other functional parameters, including health-related quality of life, pain, and time to return to work, were secondary endpoints.
Results: Patients in the intervention group performed telerehabilitation for an average of 55.0 minutes (SD 9.2) per week. Adherence was high, at over 75%, until the 7th week of the three-month intervention phase. Almost all the patients and therapists used the communication options. Both the intervention group (average difference 88.3 m; SD 57.7; P=.95) and the control group (average difference 79.6 m; SD 48.7; P=.95) increased their distance in the 6-minute walk test. Improvements in other functional parameters, as well as in quality of life and pain, were achieved in both groups. Of note, the proportion of patients back at work was higher in the intervention group (64.6%; P=.01) than in the control group (46.2%).
Conclusions: The effect of the investigated telerehabilitation therapy in patients following knee or hip replacement was equivalent to the usual aftercare in terms of functional testing, quality of life, and pain. Since a significantly higher return-to-work rate could be achieved, this therapy might be a promising supplement to established aftercare.
Aim: The aim of the study was to identify common orthopedic sports injury profiles in adolescent elite athletes with respect to age, sex, and anthropometrics.
Methods: A retrospective data analysis of 718 orthopedic presentations to a sports medical department among 381 adolescent elite athletes from 16 different sports was performed. Recorded data from history and clinical examination included the area, cause and structure of acute and overuse injuries. Injury events were analyzed in the whole cohort and stratified by age (11–14/15–17 years) and sex. Group differences were tested by chi-squared tests. Logistic regression analysis was applied to examine the influence of the factors age, sex, and body mass index (BMI) on the outcome variables area and structure (α = 0.05).
Results: Higher proportions of injury events were reported for females (60%) and athletes of the older age group (66%) than for males and younger athletes. The most frequently injured area was the lower extremity (47%), followed by the spine (30.5%) and the upper extremity (12.5%). Acute injuries were mainly located at the lower extremity (74.5%), while overuse injuries were predominantly observed at the lower extremity (41%) as well as the spine (36.5%). Joints (34%), muscles (22%), and tendons (21.5%) were found to be the most often affected structures. The injured structures differed between the age groups (p = 0.022), with the older age group presenting three times more frequently with ligament pathology (5.5%/2%) and less frequently with bony problems (11%/20.5%) than athletes of the younger age group. The injured area differed between the sexes (p = 0.005), with males having fewer spine injury events (25.5%/34%) but more upper extremity injuries (18%/9%) than females. Regression analysis showed a statistically significant influence of BMI (p = 0.002) and age (p = 0.015) on structure, whereas area was significantly influenced by sex (p = 0.005).
Conclusion: Soft-tissue overuse injuries are the most common reason for orthopedic presentations of adolescent elite athletes. Mostly, the lower extremity and the spine are affected, while sex and age characteristics of the affected area and structure must be considered. Therefore, prevention strategies addressing these injury profiles should already be implemented in early adolescence, taking age, sex as well as injury entity into account.
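The logistic-regression step described above (factors age, sex and BMI on a binary outcome such as "spine area affected") can be sketched with a plain-numpy Newton-Raphson fit on simulated data. The coefficients, the outcome definition and the covariate distributions below are invented for illustration only:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Newton-Raphson maximum-likelihood fit of a logistic regression."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))    # predicted probabilities
        w = p * (1.0 - p)                        # IRLS weights
        hessian = X.T @ (X * w[:, None])         # X' W X
        beta = beta + np.linalg.solve(hessian, X.T @ (y - p))
    return beta

rng = np.random.default_rng(7)
n = 718  # number of presentations analysed above
age = rng.uniform(11, 18, n)
sex = rng.integers(0, 2, n).astype(float)        # 0 = male, 1 = female
bmi = rng.normal(20.0, 2.5, n)
X = np.column_stack([np.ones(n), age, sex, bmi])  # intercept + predictors
true_beta = np.array([-2.5, 0.05, 0.5, 0.0])      # hypothetical effects
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ true_beta)))).astype(float)

beta_hat = fit_logistic(X, y)  # estimated log-odds coefficients
```

A statistics package would additionally report standard errors and p-values for each coefficient, which is what the p-values quoted above correspond to.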
Aim: The aim of the study was to identify common orthopedic sports injury profiles in adolescent elite athletes with respect to age, sex, and anthropometrics.
Methods: A retrospective data analysis of 718 orthopedic presentations among 381 adolescent elite athletes from 16 different sports to a sports medical department was performed. Recorded data of history and clinical examination included area, cause and structure of acute and overuse injuries. Injury-events were analyzed in the whole cohort and stratified by age (11–14/15–17 years) and sex. Group differences were tested by chi-squared-tests. Logistic regression analysis was applied examining the influence of factors age, sex, and body mass index (BMI) on the outcome variables area and structure (a = 0.05).
Results: Higher proportions of injury-events were reported for females (60%) and athletes of the older age group (66%) than males and younger athletes. The most frequently injured area was the lower extremity (47%) followed by the spine (30.5%) and the upper extremity (12.5%). Acute injuries were mainly located at the lower extremity (74.5%), while overuse injuries were predominantly observed at the lower extremity (41%) as well as the spine (36.5%). Joints (34%), muscles (22%), and tendons (21.5%) were found to be the most often affected structures. The injured structures were different between the age groups (p = 0.022), with the older age group presenting three times more frequent with ligament pathology events (5.5%/2%) and less frequent with bony problems (11%/20.5%) than athletes of the younger age group. The injured area differed between the sexes (p = 0.005), with males having fewer spine injury-events (25.5%/34%) but more upper extremity injuries (18%/9%) than females. Regression analysis showed statistically significant influence for BMI (p = 0.002) and age (p = 0.015) on structure, whereas the area was significantly influenced by sex (p = 0.005).
Conclusion: Events of soft-tissue overuse injuries are the most common reasons resulting in orthopedic presentations of adolescent elite athletes. Mostly, the lower extremity and the spine are affected, while sex and age characteristics on affected area and structure must be considered. Therefore, prevention strategies addressing the injury-event profiles should already be implemented in early adolescence taking age, sex as well as injury entity into account.
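The chi-squared testing described above for sex and age group differences can be illustrated without any statistics library; the 2×2 counts below are hypothetical and not taken from the study.

```python
def chi_squared_2x2(table):
    """Chi-squared statistic (df = 1, no continuity correction) for a 2x2 table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    # algebraic shortcut for sum((observed - expected)^2 / expected)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# hypothetical counts: rows = male/female, columns = spine injury yes/no
stat = chi_squared_2x2([[30, 90], [60, 110]])
# with df = 1, the critical value at alpha = 0.05 is 3.841
significant = stat > 3.841
```

In practice the p-value would be taken from the chi-squared distribution; comparing against the tabulated critical value is equivalent for a fixed α.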
How much is too much?
(2010)
Although dietary nutrient intake is often adequate, nutritional supplement use is common among elite athletes. However, high-dose supplements or the use of multiple supplements may exceed the recommended daily allowance (RDA) of particular nutrients or even result in a daily intake above tolerable upper limits (UL). The present case report presents nutritional intake data and supplement use of a highly trained male swimmer competing at international level. Habitual energy and micronutrient intake were analysed by 3 d dietary reports. Supplement use and dosage were assessed, and total amount of nutrient supply was calculated. Micronutrient intake was evaluated based on RDA and UL as presented by the European Scientific Committee on Food, and maximum permitted levels in supplements (MPL) are given. The athlete’s diet provided adequate micronutrient content well above RDA except for vitamin D. Simultaneous use of ten different supplements was reported, resulting in excess intake above tolerable UL for folate, vitamin E and Zn. Additionally, daily supplement dosage was considerably above MPL for nine micronutrients consumed as artificial products. Risks and possible side effects of exceeding UL by the athlete are discussed. Athletes with high energy intake may be at risk of exceeding UL of particular nutrients if multiple supplements are added. Therefore, dietary counselling of athletes should include assessment of habitual diet and nutritional supplement intake. Educating athletes to balance their diets instead of taking supplements might be prudent to prevent health risks
that may occur with long-term excess nutrient intake.
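The screening step described here, summing habitual dietary intake and supplement doses and flagging any nutrient whose total exceeds the tolerable UL, is simple bookkeeping. The sketch below uses invented intake values, not the case report's data.

```python
# all values are illustrative (per day), not from the case report
diet = {"folate_ug": 350, "vitamin_e_mg": 11, "zinc_mg": 10}
supplements = {"folate_ug": 800, "vitamin_e_mg": 290, "zinc_mg": 20}
upper_limit = {"folate_ug": 1000, "vitamin_e_mg": 300, "zinc_mg": 25}

def exceeds_ul(diet, supplements, upper_limit):
    """Return the nutrients whose combined daily intake exceeds the tolerable UL."""
    total = {k: diet.get(k, 0) + supplements.get(k, 0) for k in upper_limit}
    return sorted(k for k, v in total.items() if v > upper_limit[k])

flagged = exceeds_ul(diet, supplements, upper_limit)
```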
Background: Core-specific sensorimotor exercises are proven to enhance neuromuscular activity of the trunk, improve athletic performance and prevent back pain. However, the dose-response relationship and, therefore, the dose required to improve trunk function is still under debate. The purpose of the present trial will be to compare four different intervention strategies of sensorimotor exercises that will result in improved trunk function.
Methods/design: A single-blind, four-armed, randomized controlled trial with a 3-week (home-based) intervention phase and two measurement days pre and post intervention (M1/M2) is designed. Experimental procedures on both measurement days will include evaluation of maximum isokinetic and isometric trunk strength (extension/flexion, rotation) including perturbations, as well as neuromuscular trunk activity while performing strength testing. The primary outcome is trunk strength (peak torque). Neuromuscular activity (amplitude, latencies as a response to perturbation) serves as secondary outcome. The control group will perform a standardized exercise program of four sensorimotor exercises (three sets of 10 repetitions) in each of six training sessions (30 min duration) over 3 weeks. The intervention groups’ programs differ in the number of exercises, sets per exercise and, therefore, overall training amount (group I: six sessions, three exercises, two sets; group II: six sessions, two exercises, two sets; group III: six sessions, one exercise, three sets). The intervention programs of groups I, II and III include additional perturbations for all exercises to increase both the difficulty and the efficacy of the exercises performed. Statistical analysis will be performed after examining the underlying assumptions for parametric and non-parametric testing.
Discussion: The results of the study will be clinically relevant, not only for researchers but also for (sports) therapists, physicians, coaches, athletes and the general population who have the aim of improving trunk function.
Long-distance race car drivers are classified as athletes. The sport is physically and mentally demanding, requiring long hours of practice. Therefore, optimal dietary intake is essential for the health and performance of the athlete. The aim of the study was to evaluate dietary intake and to compare the data with dietary recommendations for athletes and for the general adult population according to the German Nutrition Society (DGE). A 24-h dietary recall during a competition preparation phase was obtained from 16 male race car drivers (28.3 ± 6.1 years, body mass index (BMI) of 22.9 ± 2.3 kg/m²). The mean intake of energy, nutrients, water and alcohol was recorded. The mean energy, vitamin B2, vitamin E, folate, fiber, calcium, water and alcohol intakes were 2124 ± 814 kcal/day, 1.3 ± 0.5 mg/day, 12.5 ± 9.5 mg/day, 231.0 ± 90.9 µg/day, 21.4 ± 9.4 g/day, 1104 ± 764 mg/day, 3309 ± 1522 mL/day and 0.8 ± 2.5 mL/day, respectively. Our study indicated that intakes of many of the nutrients studied, including energy and carbohydrate, were below the recommendations for athletes as well as the DGE reference values for the general population.
Introduction
Annually, 2 million sports-related injuries are reported in Germany, a large proportion of which occur in athletes. Multiple sport injury prevention programs designed to decrease acute and overuse injuries in athletes have been proven effective. Yet, it remains uncertain which of the programs' components, general or sports-specific, led to these positive effects. Although the superiority of sports-specific injury prevention programs has not been established, coaches and athletes alike prefer specialized over generalized exercise programs. Therefore, this systematic review aimed to present the available evidence on how general and sports-specific prevention programs affect injury rates in athletes.
Methods
PubMed and Web of Science were electronically searched throughout April 2018. The inclusion criteria were publication dates Jan 2006–Dec 2017, athletes (11–45 years), exercise-based injury prevention programs and injury incidence. The methodological quality was assessed with the Cochrane Collaboration assessment tools.
Results
Of the initial 6619 findings, 15 studies met the inclusion criteria. In addition, 13 studies were added from reference lists and external sources, making a total of 28 studies. Of these, one used a sports-specific, seven a general and 20 a mixed prevention strategy. Twenty-four studies reported reduced injury rates. Of the four ineffective programs, one was general and three were mixed.
Conclusion
General and mixed programs positively affect injury rates. Purely sports-specific programs remain uninvestigated, and despite wide discussion regarding their definition, no consensus has been reached. Defining this terminology and investigating the true effectiveness of such injury prevention programs (IPPs) is a potential avenue for future research.
Accuracy of training recommendations based on a treadmill multistage incremental exercise test
(2018)
Competitive runners will occasionally undergo exercise in a laboratory setting to obtain predictive and prescriptive information regarding their performance. The present research aimed to assess whether the physiological demands of lab-based treadmill running (TM) can simulate those of over-ground (OG) running using a commonly used protocol. Fifteen healthy volunteers with a weekly mileage of ≥ 20 km over the past 6 months and treadmill experience participated in this cross-sectional study. Two stepwise incremental tests until volitional exhaustion were performed in a fixed order within one week in an outpatient clinic research laboratory and on an outdoor athletic track. Running velocity (IATspeed), heart rate (IATHR) and lactate concentration at the individual anaerobic threshold (IATbLa) were primary endpoints. Additionally, distance covered (DIST), maximal heart rate (HRmax), maximal blood lactate concentration (bLamax) and rate of perceived exertion (RPE) at IATspeed were analyzed. IATspeed, DIST and HRmax were not statistically significantly different between conditions, whereas bLamax and RPE at IATspeed differed significantly (p < 0.05). Apart from RPE at IATspeed, IATspeed, DIST, HRmax and bLamax correlated strongly between conditions (r = 0.815–0.988). The high reliability between conditions provides strong evidence that running on a treadmill is physiologically comparable to OG running and that training recommendations can be made with assurance.
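The between-condition agreement reported above rests on Pearson correlations of paired endpoints. A minimal stdlib-only computation, with hypothetical paired velocities rather than the study's data, looks like this:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical IATspeed values (km/h): treadmill vs over-ground for five runners
tm = [11.2, 12.5, 13.1, 10.8, 12.0]
og = [11.0, 12.7, 13.4, 10.6, 12.1]
r = pearson_r(tm, og)  # close to 1 when the conditions agree strongly
```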
Static (one-legged stance) and dynamic (star excursion balance) postural control tests were performed by 14 adolescent athletes with and 17 without back pain to determine reproducibility. The total displacement, mediolateral and anterior-posterior displacements of the centre of pressure in mm for the static, and the normalized and composite reach distances for the dynamic tests were analysed. Intraclass correlation coefficients, 95% confidence intervals, and a Bland-Altman analysis were calculated for reproducibility. Intraclass correlation coefficients for subjects with (0.54 to 0.65), (0.61 to 0.69) and without (0.45 to 0.49), (0.52 to 0.60) back pain were obtained on the static test for right and left legs, respectively. Likewise, (0.79 to 0.88), (0.75 to 0.93) for subjects with and (0.61 to 0.82), (0.60 to 0.85) for those without back pain were obtained on the dynamic test for the right and left legs, respectively. Systematic bias was not observed between test and retest of subjects on both static and dynamic tests. The one-legged stance and star excursion balance tests have fair to excellent reliabilities on measures of postural control in adolescent athletes with and without back pain. They can be used as measures of postural control in adolescent athletes with and without back pain.
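A sketch of the two reproducibility statistics named above: an intraclass correlation (here ICC(2,1), one common variant; the exact model is not specified in the abstract) and Bland-Altman bias with 95% limits of agreement, computed on made-up test-retest data.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    data: n_subjects x k_sessions array."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def bland_altman(a, b):
    """Bias and 95% limits of agreement for paired measurements."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical test-retest sway paths (mm) for three subjects
sway = np.array([[210., 220.], [180., 175.], [250., 260.]])
icc = icc_2_1(sway)
bias, lo, hi = bland_altman(sway[:, 0], sway[:, 1])
```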
Background:
Arising from the relevance of sensorimotor training in the therapy of nonspecific low back pain patients and from the value of individualized therapy, the present trial aims to test the feasibility and efficacy of individualized sensorimotor training interventions in patients suffering from nonspecific low back pain.
Methods and study design:
A multicentre, single-blind two-armed randomized controlled trial to evaluate the
effects of a 12-week (3 weeks supervised centre-based and 9 weeks home-based) individualized sensorimotor exercise program is performed. The control group stays inactive during this period. Outcomes are pain, and pain-associated function as well as motor function in adults with nonspecific low back pain. Each participant is scheduled for five measurement dates: baseline (M1), following centre-based training (M2), following home-based training (M3) and at two follow-up time points 6 months (M4) and 12 months (M5) after M1. All investigations and the assessment of the primary and secondary outcomes are performed in a standardized order: questionnaires – clinical examination – biomechanics (motor function). Subsequent statistical procedures are executed after the examination of underlying assumptions for parametric or non-parametric testing.
Discussion:
The results of the study will be of clinical and practical relevance not only for researchers and policy makers but also for the general population suffering from nonspecific low back pain.
Trial registration:
Identification number DRKS00010129. German Clinical Trial registered on 3 March 2016.
Background:
Exercising at intensities where fat oxidation rates are high has been shown to induce metabolic benefits in recreational and health-oriented sportsmen. The exercise intensity (Fatpeak) eliciting peak fat oxidation rates is therefore of particular interest when aiming to prescribe exercise for the purpose of fat oxidation and related metabolic effects. Although running and walking are feasible and popular among the target population, no reliable protocols are available to assess Fatpeak as well as its actual velocity (VPFO) during treadmill ergometry. Our purpose was therefore to assess the reliability and day-to-day variability of VPFO and Fatpeak during treadmill ergometry running.
Methods:
Sixteen recreational athletes (f = 7, m = 9; 25 ± 3 y; 1.76 ± 0.09 m; 68.3 ± 13.7 kg; 23.1 ± 2.9 kg/m²) performed 2 different running protocols on 3 different days with standardized nutrition the day before testing. At day 1, peak oxygen uptake (VO2peak) and the velocities at the aerobic threshold (VLT) and at a respiratory exchange ratio (RER) of 1.00 (VRER) were assessed. At days 2 and 3, subjects ran an identical submaximal incremental test (Fat-peak test) composed of a 10 min warm-up (70% VLT) followed by 5 stages of 6 min with equal increments (stage 1 = VLT, stage 5 = VRER). Breath-by-breath gas exchange data was measured continuously and used to determine fat oxidation rates. A third order polynomial function was used to identify VPFO and subsequently Fatpeak. The reproducibility and variability of variables was verified with an intraclass correlation coefficient (ICC), Pearson's correlation coefficient, the coefficient of variation (CV) and the mean differences (bias) ± 95% limits of agreement (LoA).
Results:
ICC, Pearson's correlation and CV for VPFO and Fatpeak were 0.98, 0.97, 5.0%; and 0.90, 0.81, 7.0%, respectively. Bias ± 95% LoA was −0.3 ± 0.9 km/h for VPFO and −2 ± 8% of VO2peak for Fatpeak.
Conclusion:
In summary, relative and absolute reliability indicators for V PFO and Fat peak were found to be excellent. The observed LoA may now serve as a basis for future training prescriptions, although fat oxidation rates at prolonged exercise bouts at this intensity still need to be investigated.
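The VPFO determination described above (a third-order polynomial over the stage fat oxidation rates, then its maximum) can be sketched with NumPy; the stage velocities and fat oxidation rates below are invented for illustration, not taken from the study.

```python
import numpy as np

# hypothetical stage velocities (km/h) and measured fat oxidation rates (g/min)
velocity = np.array([7.0, 8.0, 9.0, 10.0, 11.0])
fat_ox = np.array([0.42, 0.51, 0.55, 0.48, 0.30])

# third-order polynomial fit, as in the protocol
poly = np.poly1d(np.polyfit(velocity, fat_ox, 3))

# evaluate on a fine grid; the velocity at the maximum is VPFO
grid = np.linspace(velocity.min(), velocity.max(), 1000)
v_pfo = grid[np.argmax(poly(grid))]
```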
Background Low back pain (LBP) is a common pain syndrome in athletes, responsible for 28% of missed training days/year. Psychosocial factors contribute to chronic pain development. This study aims to investigate the transferability of psychosocial screening tools developed in the general population to athletes and to define athlete-specific thresholds.
Methods Data from a prospective multicentre study on LBP were collected at baseline and 1-year follow-up (n=52 athletes, n=289 recreational athletes and n=246 non-athletes). Pain was assessed using the Chronic Pain Grade questionnaire. The psychosocial Risk Stratification Index (RSI) was used to obtain prognostic information regarding the risk of chronic LBP (CLBP). Individual psychosocial risk profile was gained with the Risk Prevention Index – Social (RPI-S). Differences between groups were calculated using general linear models and planned contrasts. Discrimination thresholds for athletes were defined with receiver operating characteristics (ROC) curves.
Results Athletes and recreational athletes showed significantly lower psychosocial risk profiles and prognostic risk for CLBP than non-athletes. ROC curves suggested that discrimination thresholds for athletes differ from those for non-athletes. Both screening tools demonstrated very good sensitivity (RSI: 100%; RPI-S: 75%–100%) and specificity (RSI: 76%–93%; RPI-S: 71%–93%). RSI revealed two risk classes for pain intensity (area under the curve (AUC) 0.92 (95% CI 0.85 to 1.0)) and pain disability (AUC 0.88 (95% CI 0.71 to 1.0)).
Conclusions Both screening tools can be used for athletes. Athlete-specific thresholds will improve physicians’ decision making and allow stratified treatment and prevention.
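The threshold derivation behind such ROC analyses can be sketched in plain Python: sweep candidate cut-offs, trace (1 − specificity, sensitivity) pairs, and pick the Youden-optimal point (one common criterion; the study's exact method is not stated in the abstract). The scores and outcomes below are invented.

```python
def roc_points(scores, labels):
    """(false-positive rate, true-positive rate, threshold) per cut-off.
    labels: 1 = developed chronic LBP, 0 = did not."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / neg, tp / pos, t))
    return points

def youden_threshold(scores, labels):
    """Cut-off maximizing Youden's J = sensitivity + specificity - 1."""
    return max(roc_points(scores, labels), key=lambda p: p[1] - p[0])[2]

# invented screening scores and 1-year chronicity outcomes
threshold = youden_threshold([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
```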
Background: Data on electrocardiographic and echocardiographic pre-participation screening findings in paediatric athletes are limited.
Methods and results: 10–15-year-old athletes (n = 343) were screened using electro- and echocardiography. The electrocardiogram (ECG) was normal in 220 (64%), mildly abnormal in 108 (31%), and distinctly abnormal in 15 (4%) athletes. Echocardiographic upper reference limits (URL, 97.5 percentile) for the left ventricular (LV) wall thickness in 10–11-year-old boys and girls were 9–10 mm and 8–9 mm, respectively; in 12–13-year-old boys and girls 9–10 mm; and in 14–15-year-old boys and girls 10–11 mm and 9–10 mm, respectively. Three athletes were excluded from competitive sports: one for symptomatic Wolff-Parkinson-White syndrome with a normal echocardiogram; one for negative T-waves in V1–V4 and a dilated right ventricle by echocardiography suggestive of (arrhythmogenic) right ventricular disease; and one for a normal ECG and a bicuspid aortic valve including an aneurysm of the ascending aorta detected by echocardiography. Related to echocardiographic findings, the sensitivity and specificity of the ECG to identify cardiovascular abnormalities were 38% and 64%, respectively. The ECG's positive-predictive and negative-predictive values were 13% and 88%, respectively. The numbers needed to screen and calculated costs were 172 for ECG (€7049), 172 for echocardiography (€11,530), and 114 combining ECG and echocardiography (€9323).
Conclusions: Compared to adults, paediatric athletes presented with fewer distinctly abnormal ECGs, and there was no gender difference in paediatric athletes' ECG-pattern distribution. A combination of ECG and echocardiography for pre-participation screening of paediatric athletes is superior to ECG alone but 30% more costly.
Increased Achilles tendon (AT) and patellar tendon (PT) thickness has been shown in adolescent athletes compared to non-athletes. However, it is unclear whether these changes are pathological or reflect physiological adaptation to training. The aim of this study was to determine physiological AT and PT thickness adaptation in adolescent elite athletes compared to non-athletes, considering sex and sport. In a longitudinal study design with two measurement days (M1/M2) within an interval of 3.2 ± 0.8 years, 131 healthy adolescent elite athletes (m/f: 90/41) out of 13 different sports and 24 recreationally active controls (m/f: 6/18) were included. Both ATs and PTs were measured at standardized reference points. Athletes were divided into 4 sport categories [ball (B), combat (C), endurance (E) and explosive strength sports (S)]. Descriptive analysis (mean ± SD) and statistical testing for group differences was performed (α = 0.05). AT thickness did not differ significantly between measurement days, neither in athletes (5.6 ± 0.7 mm/5.6 ± 0.7 mm) nor in controls (4.8 ± 0.4 mm/4.9 ± 0.5 mm, p > 0.05). For PTs, athletes presented increased thickness at M2 (M1: 3.5 ± 0.5 mm, M2: 3.8 ± 0.5 mm, p < 0.001). In general, males had thicker ATs and PTs than females (p < 0.05). Considering sex and sports, only male athletes from B, C, and S showed significantly greater PT thickness at M2 compared to controls (p ≤ 0.01). Sport-specific adaptation in tendon thickness in adolescent elite athletes can be detected in the PTs of male athletes participating in sports with high repetitive jumping and strength components. Sonographic microstructural analysis might provide enhanced insight into tendon material properties, enabling the differentiation of sex and the influence of different sports.
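The M1-vs-M2 comparison above is a paired design; a paired t statistic can be computed with the standard library alone. The thickness values below are hypothetical, and in practice the statistic would be compared against the t distribution for df = n − 1.

```python
import math

def paired_t(pre, post):
    """Paired t statistic for within-subject change (post - pre)."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    return mean / math.sqrt(var / n)

# hypothetical PT thickness (mm) for four athletes at M1 and M2
m1 = [3.5, 3.4, 3.6, 3.5]
m2 = [3.8, 3.6, 3.9, 3.9]
t_stat = paired_t(m1, m2)
```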
Background
Total hip or knee replacement is one of the most frequently performed surgical procedures. Physical rehabilitation following total hip or knee replacement is an essential part of the therapy to improve functional outcomes and quality of life. After discharge from inpatient rehabilitation, subsequent postoperative exercise therapy is needed to maintain functional mobility. Telerehabilitation may be an innovative treatment approach. We aim to investigate the superiority of an interactive telerehabilitation intervention for patients after total hip or knee replacement, in comparison to usual care, regarding physical performance, functional mobility, quality of life and pain.
Methods/design
This is an open, randomized controlled, multicenter superiority study with two prospective arms. One hundred and ten eligible and consenting participants with total knee or hip replacement will be recruited at admission to subsequent inpatient rehabilitation. After comprehensive, 3-week, inpatient rehabilitation, the intervention group performs 3 months of interactive, home-based exercise training with a telerehabilitation system. For this purpose, the physiotherapist creates an individual training plan from 38 different strength and balance exercises implemented in the system. Data about the quality and frequency of training are transmitted to the physiotherapist for further adjustment. Communication between patient and physiotherapist is possible through the system. The control group receives voluntary, usual aftercare programs. Baseline assessments are performed after discharge from rehabilitation; final assessments 3 months later. The primary outcome is the difference in improvement between intervention and control group in 6-minute walk distance after 3 months. Secondary outcomes include differences in the Timed Up and Go Test, the Five-Times-Sit-to-Stand Test, the Stair Ascend Test, the Short-Form 36, the Western Ontario and McMaster Universities Osteoarthritis Index, the International Physical Activity Questionnaire, and postural control as well as gait and kinematic parameters of the lower limbs. Baseline-adjusted analysis of covariance models will be used to test for group differences in the primary and secondary endpoints.
Discussion
We expect the intervention group to benefit from the interactive, home-based exercise training in many respects, as reflected by the study endpoints. If successful, this approach could be used to enhance access to aftercare programs, especially in structurally weak areas.
Introduction: Chronic low back pain (LBP) is a major cause of disability; early diagnosis and stratification of care remain challenges.
Objectives: This article describes the development of a screening tool for the 1-year prognosis of patients with high chronic LBP risk (risk stratification index) and for treatment allocation according to treatment-modifiable yellow flag indicators (risk prevention indices, RPI-S).
Methods: Screening tools were derived from a multicentre longitudinal study (n = 1071, age >18, intermittent LBP). The greatest prognostic predictors of 4 flag domains ("pain," "distress," "social-environment," "medical care-environment") were determined using least absolute shrinkage and selection operator regression analysis. Internal validity and prognosis error were evaluated after 1-year follow-up. Receiver operating characteristic curves for discrimination (area under the curve) and cutoff values were determined.
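The derivation pipeline described above (LASSO-based predictor selection followed by ROC/AUC discrimination and cutoff determination) can be sketched as below. The features and outcome are simulated stand-ins for the four flag domains; nothing here reproduces the study's actual predictors or coefficients.

```python
# Hedged sketch: L1-penalized (LASSO) logistic regression for predictor
# selection, then ROC/AUC and a Youden-index cutoff. Simulated data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
n = 1071  # cohort size reported in the abstract
X = rng.normal(size=(n, 8))  # hypothetical items from the four flag domains
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 1.0   # only two truly informative items
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)  # chronic-LBP outcome

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
selected = np.flatnonzero(lasso.coef_[0])      # predictors surviving shrinkage
scores = lasso.predict_proba(X)[:, 1]
auc = roc_auc_score(y, scores)                 # discrimination (area under the curve)
fpr, tpr, thr = roc_curve(y, scores)
cutoff = thr[np.argmax(tpr - fpr)]             # Youden-index cutoff value
print(selected, round(auc, 2), round(cutoff, 2))
```

In practice the study additionally evaluated internal validity and prognosis error at 1-year follow-up, which this in-sample sketch does not attempt.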
Results: The risk stratification index identified persons with increased risk of chronic LBP and accurately estimated expected pain intensity and disability on the Pain Grade Questionnaire (0-100 points) up to 1 year later, with an average prognosis error of 15 points. In addition, three risk classes were discerned with an accuracy of area under the curve = 0.74 (95% confidence interval 0.63-0.85). The RPI-S also distinguished persons with potentially modifiable prognostic indicators from the 4 flag domains and stratified allocation to biopsychosocial treatments accordingly.
Conclusion: The screening tools, developed in compliance with the PROGRESS and TRIPOD statements, showed good validity and prognostic strength. They improve on existing screening tools through their utility for secondary prevention, incorporation of exercise effect modifiers, exact pain estimations, and personalized allocation to multimodal treatments.
Background
Foot orthoses are usually assumed to be effective by mechanically optimizing dynamic rearfoot configuration. However, scientifically demonstrated effects of foot orthoses on kinematics have been only marginal. The aim of this study was to examine the effect of different medial arch-support heights in foot orthoses on rearfoot motion during gait.
Methods
Nineteen asymptomatic runners (36 ± 11 years, 180 ± 5 cm, 79 ± 10 kg; 41 ± 22 km/week) participated in the study. Trials were recorded at 3.1 mph (5 km/h) on a treadmill. Athletes walked barefoot (B) and with four non-customized medial arch-supported foot orthoses of varying arch heights (N: 0 mm, M: 30 mm, H: 35 mm, E: 40 mm). Six infrared cameras and the Oxford Foot Model were used to capture motion. The average stride in each condition was calculated from 50 gait cycles per condition. Eversion excursion and internal tibia rotation were analyzed. Descriptive statistics included the mean ± SD and 95% CIs. Differences between conditions were analyzed with a one-factor (orthosis condition) repeated-measures ANOVA (α = 0.05).
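The one-factor repeated-measures ANOVA described above can be sketched as follows. The eversion values are simulated and the condition labels merely mirror the abstract's coding; this is an illustrative analysis skeleton, not the study's data.

```python
# Minimal sketch of a one-factor (orthosis condition) repeated-measures ANOVA
# on eversion excursion. Values are simulated; B = barefoot, N/M/H/E = arch heights.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(2)
conditions = ["B", "N", "M", "H", "E"]
rows = []
for s in range(19):                    # nineteen runners
    base = rng.normal(5.0, 2.0)        # subject-specific eversion excursion (deg)
    for c in conditions:
        rows.append({"subject": s, "condition": c,
                     "eversion": base + rng.normal(0, 1.5)})
df = pd.DataFrame(rows)

# Within-subject factor: orthosis condition; F test at alpha = 0.05
res = AnovaRM(df, depvar="eversion", subject="subject",
              within=["condition"]).fit()
print(res.anova_table)
```

Averaging 50 gait cycles per condition before this step, as the study did, reduces within-condition noise in each subject's value.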
Results
Eversion excursion was lowest for N and highest for H (B: 4.6°±2.2°, 95% CI [3.1; 6.2] / N: 4.0°±1.7°, [2.9; 5.2] / M: 5.2°±2.6°, [3.6; 6.8] / H: 6.2°±3.3°, [4.0; 8.5] / E: 5.1°±3.5°, [2.8; 7.5]) (p > 0.05). The range of internal tibia rotation was lowest with orthosis H and highest with E (B: 13.3°±3.2°, 95% CI [11.0; 15.6] / N: 14.5°±7.2°, [9.2; 19.6] / M: 13.8°±5.0°, [10.8; 16.8] / H: 12.3°±4.3°, [9.0; 15.6] / E: 14.9°±5.0°, [11.5; 18.3]) (p > 0.05). Differences between conditions were small and intrasubject variation was high.
Conclusion
Our results indicate that different arch support heights have no systematic effect on eversion excursion or the range of internal tibia rotation and therefore might not exert a crucial influence on rear foot alignment during gait.