Sonographically detectable intratendinous blood flow (IBF) is found in 50%-88% of Achilles tendinopathy patients as well as in up to 35% of asymptomatic Achilles tendons (AT). Although IBF is frequently associated with tendon pathology, it may also represent a physiological regulation, for example increased blood flow in response to exercise. This study therefore aimed to investigate the acute effects of a standardized running exercise protocol on IBF, assessed with Doppler ultrasound (DU; Advanced Dynamic Flow), in healthy ATs. Ten recreationally active adults (5 f, 5 m; 29 +/- 3 years; 1.72 +/- 0.12 m; 68 +/- 16 kg; physical activity 206 +/- 145 min/week) with no history of AT pain and inconspicuous tendon structure performed 3 treadmill running tasks on separate days (M1-M3), with DU examinations directly before and 5, 30, 60, and 120 minutes after exercise. At M1, an incremental exercise test was used to determine the individual anaerobic threshold (IAT). At M2 and M3, participants performed 30-minute submaximal constant load tests (CL1/CL2) at an intensity 5% below the IAT. IBF in each tendon was quantified by counting the number of vessels. IBF increased in five ATs from no vessels at baseline to one to four vessels detectable solely 5 minutes after CL1 or CL2. One AT showed persisting IBF (three vessels) throughout all examinations; fourteen ATs revealed no IBF at all. Prolonged running thus led to a physiological, temporary appearance of IBF in 25% of asymptomatic ATs. To avoid exercise-induced IBF in clinical practice, DU examinations should be performed after 30 minutes of rest.
This study aimed to investigate back pain (BP) prevalence in a large cohort of young athletes with respect to age, gender, and sport discipline. BP (within the last 7 days) was assessed with a face scale (faces 1-2 = no pain; faces 3-5 = pain) in 2116 athletes (m/f 61%/39%; 13.3 +/- 1.7 years; 163.0 +/- 11.8 cm; 52.6 +/- 13.9 kg; 4.9 +/- 2.7 training years; 8.4 +/- 5.7 training h/week). Four sports categories were devised (a: combat sports; b: game sports; c: explosive strength sports; d: endurance sports). Data were analysed descriptively with regard to age, gender, and sport discipline; in addition, 95% confidence intervals (CI) were calculated. In total, 168 (8%) athletes were allocated to the BP group: 9% of females and 7% of males reported BP. Athletes aged 11-13 years showed a prevalence of 2-4%, which increased to 12-20% in 14- to 17-year-olds. By sport discipline, prevalence ranged from 3% (soccer) to 14% (canoeing); in weight lifting, judo, wrestling, rowing, and shooting it was 10%, and in boxing, soccer, handball, cycling, and horse riding 6%. The 95% CIs ranged between 0.08 and 0.11. BP exists in adolescent athletes but is uncommon and shows no gender differences; a prevalence increase after age 14 is evident. Differentiated prevention programs in daily training routines might address sport-discipline-specific BP prevalence.
Introduction: Chronic low back pain (LBP) is a major cause of disability; early diagnosis and stratification of care remain challenges.
Objectives: This article describes the development of a screening tool for the 1-year prognosis of patients with high chronic LBP risk (risk stratification index) and for treatment allocation according to treatment-modifiable yellow flag indicators (risk prevention indices, RPI-S).
Methods: Screening tools were derived from a multicentre longitudinal study (n = 1071, age >18, intermittent LBP). The greatest prognostic predictors of 4 flag domains ("pain," "distress," "social-environment," "medical care-environment") were determined using least absolute shrinkage and selection operator regression analysis. Internal validity and prognosis error were evaluated after 1-year follow-up. Receiver operating characteristic curves for discrimination (area under the curve) and cutoff values were determined.
Results: The risk stratification index identified persons with increased risk of chronic LBP and accurately estimated expected pain intensity and disability on the Pain Grade Questionnaire (0-100 points) up to 1 year later with an average prognosis error of 15 points. In addition, 3-risk classes were discerned with an accuracy of area under the curve = 0.74 (95% confidence interval 0.63-0.85). The RPI-S also distinguished persons with potentially modifiable prognostic indicators from 4 flag domains and stratified allocation to biopsychosocial treatments accordingly.
Conclusion: The screening tools, developed in compliance with the PROGRESS and TRIPOD statements, revealed good validation and prognostic strength. These tools improve on existing screening tools because of their utility for secondary preventions, incorporation of exercise effect modifiers, exact pain estimations, and personalized allocation to multimodal treatments.
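The two statistical steps named in the Methods (LASSO for predictor selection, ROC analysis for discrimination) can be illustrated with a short sketch. This is not the authors' code: the data are synthetic, and the number of candidate predictors and the signal structure are assumptions for illustration only; only the cohort size (n = 1071) is taken from the abstract.

```python
# Illustrative sketch (synthetic data, not the study's analysis code):
# LASSO shrinks irrelevant coefficients to exactly zero, yielding a sparse
# set of prognostic predictors; ROC AUC then quantifies discrimination.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, p = 1071, 20                              # cohort size as in the study; 20 hypothetical predictors
X = rng.normal(size=(n, p))
true_coef = np.zeros(p)
true_coef[:4] = [1.5, -1.0, 0.8, 0.6]        # assume only 4 predictors carry signal
risk = X @ true_coef + rng.normal(size=n)
y = (risk > np.quantile(risk, 0.7)).astype(int)  # top 30% labelled "chronic LBP"

# Cross-validated LASSO: predictors with nonzero coefficients are "selected"
lasso = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(lasso.coef_ != 0)

# Discrimination of the resulting linear risk score
auc = roc_auc_score(y, X @ lasso.coef_)
print("selected predictors:", selected)
print("AUC: %.2f" % auc)
```

The same pattern (sparse selection followed by ROC-based cutoff determination) underlies most screening-tool derivations of this kind.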
Background Low back pain (LBP) is a common pain syndrome in athletes, responsible for 28% of missed training days/year. Psychosocial factors contribute to chronic pain development. This study aims to investigate the transferability of psychosocial screening tools developed in the general population to athletes and to define athlete-specific thresholds.
Methods Data from a prospective multicentre study on LBP were collected at baseline and 1-year follow-up (n=52 athletes, n=289 recreational athletes and n=246 non-athletes). Pain was assessed using the Chronic Pain Grade questionnaire. The psychosocial Risk Stratification Index (RSI) was used to obtain prognostic information regarding the risk of chronic LBP (CLBP). Individual psychosocial risk profiles were obtained with the Risk Prevention Index – Social (RPI-S). Differences between groups were calculated using general linear models and planned contrasts. Discrimination thresholds for athletes were defined with receiver operating characteristic (ROC) curves.
Results Athletes and recreational athletes showed significantly lower psychosocial risk profiles and prognostic risk for CLBP than non-athletes. ROC curves indicated that discrimination thresholds for athletes differed from those for non-athletes. Both screenings demonstrated very good sensitivity (RSI: 100%; RPI-S: 75%–100%) and specificity (RSI: 76%–93%; RPI-S: 71%–93%). The RSI revealed two risk classes for pain intensity (area under the curve (AUC) 0.92 (95% CI 0.85 to 1.0)) and pain disability (AUC 0.88 (95% CI 0.71 to 1.0)).
Conclusions Both screening tools can be used for athletes. Athlete-specific thresholds will improve physicians’ decision making and allow stratified treatment and prevention.
Purpose Using a novel technique of high-density surface EMG decomposition and motor unit (MU) tracking, we compared changes in the properties of vastus medialis and vastus lateralis MUs after endurance (END) and high-intensity interval training (HIIT). Methods Sixteen men were assigned to the END or the HIIT group (n = 8 each) and performed six training sessions within 14 d. Each session consisted of 8-12 x 60-s intervals at 100% peak power output separated by 75 s of recovery (HIIT) or 90-120 min of continuous cycling at ~65% VO2peak (END). Pre- and postintervention, participants performed 1) incremental cycling to determine VO2peak and peak power output and 2) maximal, submaximal (10%, 30%, 50%, and 70% of maximum voluntary contraction [MVC]), and sustained (until task failure at 30% MVC) isometric knee extensions while high-density surface EMG signals were recorded from the vastus medialis and vastus lateralis. EMG signals from the submaximal contractions were decomposed into individual MUs by convolutive blind source separation, and MUs were tracked across sessions by semiblind source separation. Results After training, END and HIIT improved VO2peak similarly (by 5.0% and 6.7%, respectively). The HIIT group enhanced maximal knee extension torque by ~7% (P = 0.02), accompanied by an increased discharge rate of high-threshold MUs (50% knee extension MVC) (P < 0.05). By contrast, the END group increased time to task failure by ~17% but showed no change in MU discharge rates (P > 0.05). Conclusions HIIT and END induce different adjustments in MU discharge rate despite similar improvements in cardiopulmonary fitness, and the changes induced by HIIT are specific to high-threshold MUs. For the first time, we show that HIIT and END induce specific neuromuscular adaptations, possibly related to differences in exercise load intensity and training volume.
Background: Data on electrocardiographic and echocardiographic pre-participation screening findings in paediatric athletes are limited.
Methods and results: Athletes aged 10-15 years (n = 343) were screened using electro- and echocardiography. The electrocardiogram (ECG) was normal in 220 (64%), mildly abnormal in 108 (31%), and distinctly abnormal in 15 (4%) athletes. Echocardiographic upper reference limits (URL, 97.5th percentile) for left ventricular (LV) wall thickness were 9-10 mm and 8-9 mm in 10-11-year-old boys and girls, respectively; 9-10 mm in 12-13-year-old boys and girls; and 10-11 mm and 9-10 mm in 14-15-year-old boys and girls, respectively. Three athletes were excluded from competitive sports: one for symptomatic Wolff-Parkinson-White syndrome with a normal echocardiogram; one for negative T-waves in V1-V4 and a dilated right ventricle on echocardiography suggestive of (arrhythmogenic) right ventricular disease; and one with a normal ECG for a bicuspid aortic valve with an aneurysm of the ascending aorta detected by echocardiography. Relative to the echocardiographic findings, the sensitivity and specificity of the ECG for identifying cardiovascular abnormalities were 38% and 64%, respectively; the ECG's positive- and negative-predictive values were 13% and 88%, respectively. The numbers needed to screen and calculated costs were 172 for ECG (EUR 7049), 172 for echocardiography (EUR 11,530), and 114 for ECG and echocardiography combined (EUR 9323).
Conclusions: Compared to adults, paediatric athletes presented with fewer distinctly abnormal ECGs, and there was no gender difference in paediatric athletes' ECG-pattern distribution. A combination of ECG and echocardiography for pre-participation screening of paediatric athletes is superior to ECG alone but 30% more costly.
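The relationship between the reported sensitivity, specificity, and predictive values follows directly from Bayes' rule. The sketch below is illustrative only: the prevalence value is an assumption chosen so that the computed predictive values roughly reproduce the abstract's reported 13% and 88%; it is not a figure from the study.

```python
# Sketch: positive/negative predictive values from sensitivity, specificity,
# and prevalence via Bayes' rule. The prevalence is an assumed value for
# illustration, not reported study data.
def predictive_values(sens, spec, prev):
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Sensitivity 38% and specificity 64% as reported in the abstract;
# prevalence of echocardiographic abnormalities assumed to be 12.5%.
ppv, npv = predictive_values(sens=0.38, spec=0.64, prev=0.125)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # close to the reported 13% / 88%
```

This also shows why the ECG's predictive value is so low here: with a modest prevalence of abnormalities, even moderate specificity produces many false positives.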
Core-specific sensorimotor exercises are proven to enhance neuromuscular activity of the trunk, but the influence of high-intensity perturbations on training efficiency is unclear in this context. Sixteen participants (29 +/- 2 years; 175 +/- 8 cm; 69 +/- 13 kg) were prepared with a 12-lead bilateral trunk EMG. A warm-up on a dynamometer was followed by maximum voluntary isometric trunk (flexion/extension) contractions (MVC). Next, participants performed a one-legged stance with hip abduction under four randomly ordered conditions: on a stable surface (HA), on an unstable surface (HAP), on a stable surface with perturbation (HA + P), and on an unstable surface with perturbation (HAP + P). Afterwards, the bird dog (BD) was performed under the same conditions (BD, BDP, BD + P, BDP + P). A foam pad under the foot (HA) or the knee (BD) served as the unstable surface. Exercises were conducted on a moveable platform. Perturbations (acceleration 50 m/s^2; 100 ms duration; 10 repetitions) were applied randomly in the anterior-posterior direction. The root mean square (RMS) of the EMG, normalized to MVC (%), was calculated over the whole movement cycle. Muscles were grouped into ventral right and left (VR; VL) and dorsal right and left (DR; DL), and ventral-dorsal and right-left ratios were calculated (two-way repeated-measures ANOVA; alpha = 0.05). Amplitudes of all muscle groups were higher in bird dog than in hip abduction (p <= 0.0001; range BD: 14 +/- 3% (VR) to 53 +/- 4%; HA: 7 +/- 2% to 16 +/- 4% (DR)). EMG-RMS differed significantly (p < 0.001) between conditions and muscle groups per exercise; interaction effects were significant only for HA (p = 0.02). No significant differences were present in the EMG ratios (p > 0.05). Additional high-intensity perturbations during core-specific sensorimotor exercises thus lead to increased neuromuscular activity and therefore higher exercise intensities, although the beneficial effects on trunk function remain unclear. Nevertheless, BD is more suitable for addressing the trunk muscles.
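The amplitude measure used in this study (RMS of the EMG, normalized to the MVC trial and expressed in %MVC) is a standard computation that can be sketched briefly. This is not the study's analysis code: the signals are synthetic noise and the sampling rate and segment lengths are assumed values.

```python
# Illustrative sketch (synthetic signals, assumed sampling rate):
# root-mean-square EMG amplitude over a movement cycle, normalized to
# the amplitude of the maximum voluntary contraction (MVC) trial.
import numpy as np

def rms(signal):
    """Root mean square of an EMG signal segment."""
    return np.sqrt(np.mean(np.square(signal)))

rng = np.random.default_rng(1)
fs = 2000                                        # sampling rate in Hz (assumed)
mvc_trial = rng.normal(scale=1.0, size=fs * 3)   # 3-s maximal contraction
exercise  = rng.normal(scale=0.4, size=fs * 5)   # 5-s exercise movement cycle

# Express exercise amplitude as a percentage of the MVC reference (%MVC)
rms_norm = 100 * rms(exercise) / rms(mvc_trial)
print(f"EMG amplitude: {rms_norm:.0f} %MVC")
```

Normalizing to MVC in this way is what makes amplitudes comparable across muscles and participants, as in the 14-53 %MVC ranges reported above.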
Background
Total hip or knee replacement is one of the most frequently performed surgical procedures. Physical rehabilitation following total hip or knee replacement is an essential part of the therapy to improve functional outcomes and quality of life. After discharge from inpatient rehabilitation, subsequent postoperative exercise therapy is needed to maintain functional mobility. Telerehabilitation may be an innovative treatment approach here. We aim to investigate the superiority of an interactive telerehabilitation intervention for patients after total hip or knee replacement, compared with usual care, with regard to physical performance, functional mobility, quality of life and pain.
Methods/design
This is an open, randomized controlled, multicenter superiority study with two prospective arms. One hundred and ten eligible and consenting participants with total knee or hip replacement will be recruited at admission to subsequent inpatient rehabilitation. After comprehensive, 3-week, inpatient rehabilitation, the intervention group performs 3 months of interactive, home-based exercise training with a telerehabilitation system. For this purpose, the physiotherapist creates an individual training plan from 38 different strength and balance exercises implemented in the system. Data on the quality and frequency of training are transmitted to the physiotherapist for further adjustment, and the system allows communication between patient and physiotherapist. The control group receives voluntary, usual aftercare programs. Baseline assessments are performed at discharge from rehabilitation and final assessments 3 months later. The primary outcome is the difference between intervention and control group in the improvement in 6-minute walk distance after 3 months. Secondary outcomes include differences in the Timed Up and Go Test, the Five-Times-Sit-to-Stand Test, the Stair Ascend Test, the Short-Form 36, the Western Ontario and McMaster Universities Osteoarthritis Index, the International Physical Activity Questionnaire, and postural control as well as gait and kinematic parameters of the lower limbs. Baseline-adjusted analysis of covariance models will be used to test for group differences in the primary and secondary endpoints.
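The planned baseline-adjusted analysis of covariance can be sketched as follows. This is a minimal illustration on simulated data, not the trial's analysis script: the variable names, the simulated treatment effect, and the baseline walking distances are all assumptions; only the sample size (n = 110) is taken from the protocol.

```python
# Sketch of a baseline-adjusted ANCOVA (simulated data, hypothetical
# variable names): the follow-up outcome is regressed on group and the
# baseline value, so the group coefficient is the adjusted treatment effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 110                                  # planned total sample size
group = rng.integers(0, 2, size=n)       # 0 = usual care, 1 = telerehabilitation
baseline = rng.normal(400, 60, size=n)   # 6-minute walk distance at baseline (m), assumed
followup = baseline + 20 * group + rng.normal(0, 30, size=n)  # assumed 20 m effect

df = pd.DataFrame({"group": group, "baseline": baseline, "followup": followup})
model = smf.ols("followup ~ group + baseline", data=df).fit()
print("adjusted treatment effect (m):", round(model.params["group"], 1))
```

Adjusting for baseline in this way increases power relative to comparing raw change scores, which is why ANCOVA is the usual choice for randomized trials with a continuous primary endpoint.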
Discussion
We expect the intervention group to benefit from the interactive, home-based exercise training in many of the respects represented by the study endpoints. If successful, this approach could be used to improve access to aftercare programs, especially in structurally weak areas.