Background
Recently, the incidence rate of back pain (BP) in adolescents has been reported at 21%. However, the development of BP in adolescent athletes is unclear. Hence, the purpose of this study was to examine the incidence of BP in young elite athletes in relation to gender and type of sport practiced.
Methods
Subjective BP was assessed in 321 elite adolescent athletes (m/f 57%/43%; 13.2 ± 1.4 years; 163.4 ± 11.4 cm; 52.6 ± 12.6 kg; 5.0 ± 2.6 training years; 7.6 ± 5.3 training h/week). Initially, all athletes were free of pain. The main outcome criterion was the incidence of back pain [%], analyzed in terms of pain development from the first measurement day (M1) to the second measurement day (M2) after 2.0 ± 1.0 years. Participants were classified into athletes who developed back pain (BPD) and athletes who did not develop back pain (nBPD). BP (acute or within the last 7 days) was assessed with a 5-step face scale (faces 1–2 = no pain; faces 3–5 = pain). BPD included all athletes who reported face 1 or 2 at M1 and face 3 to 5 at M2. nBPD included all athletes who reported face 1 or 2 at both M1 and M2. Data were analyzed descriptively. Additionally, a Chi² test was used to analyze gender- and sport-specific differences (α = 0.05).
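The gender comparison described above amounts to a Chi² test on a 2 × 2 contingency table (gender × back-pain development). A minimal sketch of that computation is shown below; the counts are illustrative values reconstructed from the reported percentages (roughly 12% of the male and 7% of the female athletes), not the study's raw data.

```python
def chi2_statistic(table):
    """Pearson Chi² statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative counts (not the study's raw data):
# rows = male/female, columns = BPD/nBPD
table = [[22, 161],
         [10, 128]]
stat = chi2_statistic(table)
```

With one degree of freedom, a statistic of roughly 2.0 lies well below the 3.84 critical value at α = 0.05, which is consistent with the reported non-significant gender difference.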
Results
Thirty-two athletes were categorized as BPD (10%). The gender difference was 5% (m/f: 12%/7%) but was not statistically significant (p = 0.15). The incidence of BP ranged between 6% and 15% across the sport categories: game sports (15%) showed the highest incidence and explosive strength sports (6%) the lowest. Neither anthropometrics nor training characteristics significantly influenced BPD (p = 0.14 for gender to p = 0.90 for sports; r² = 0.0825).
Conclusions
BP incidence was lower in adolescent athletes compared to young non-athletes and even to the general adult population. Consequently, it can be concluded that high-performance sports do not lead to an additional increase in back pain incidence during early adolescence. Nevertheless, back pain prevention programs should be implemented into daily training routines for sport categories identified as showing high incidence rates.
Background
Back pain patients (BPP) show delayed muscle onset, increased co-contraction, and greater variability in response to quasi-static sudden trunk loading compared with healthy controls (H). However, it is unclear whether these results can validly be transferred to suddenly applied walking perturbations, an automated but more functional and complex movement pattern. There is an evident need to develop research-based strategies for the rehabilitation of back pain. Therefore, investigating differences in trunk stability between H and BPP during functional movements is of primary interest in order to define suitable intervention regimes. The purpose of this study was to analyse neuromuscular reflex activity as well as three-dimensional trunk kinematics in H and BPP during walking perturbations.
Methods
Eighty H (31 m/49 f; 29 ± 9 yrs; 174 ± 10 cm; 71 ± 13 kg) and 14 BPP (6 m/8 f; 30 ± 8 yrs; 171 ± 10 cm; 67 ± 14 kg) walked (1 m/s) on a split-belt treadmill while 15 right-sided perturbations (belt deceleration, 40 m/s², 50 ms duration, 200 ms after heel contact) were randomly applied. Trunk muscle activity was assessed using a 12-lead EMG set-up. Trunk kinematics were measured using a 3-segment model consisting of 12 markers (upper thoracic area (UTA), lower thoracic area (LTA), lumbar area (LA)). EMG-RMS ([%], 0–200 ms after perturbation) was calculated and normalized to the RMS of unperturbed gait. Latency (TON; ms) and time to maximum activity (TMAX; ms) were analysed. Total motion amplitude (ROM; [°]) and mean angle (Amean; [°]) for extension-flexion, lateral flexion and rotation were calculated (whole stride cycle; 0–200 ms after perturbation) for each of the three segments during unperturbed and perturbed gait. For ROM only, the perturbed step was normalized to the unperturbed step [%] for the whole stride as well as for the 200 ms after perturbation. Data were analysed descriptively, followed by Student's t-test to account for group differences. Co-contraction was analyzed between ventral and dorsal muscles (V:R) as well as the right-to-left-side ratio (Sright:Sleft). The coefficient of variation (CV; %) was calculated (EMG-RMS; ROM) to evaluate variability across the 15 perturbations for all groups. To account for the unequal distribution of participants between groups, an additional matched-group analysis was conducted: fourteen healthy controls from group H were sex-, age- and anthropometrically matched (group Hmatched) to the BPP.
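The EMG-RMS normalization described above reduces to computing the root mean square of the post-perturbation window and expressing it as a percentage of the unperturbed-gait RMS. A minimal sketch, with arbitrary illustrative sample values rather than real EMG data:

```python
import math

def rms(signal):
    """Root mean square of an EMG signal window."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def normalized_rms(perturbed_window, unperturbed_window):
    """EMG-RMS of the 0-200 ms post-perturbation window, expressed
    as a percentage of the RMS of unperturbed gait."""
    return 100.0 * rms(perturbed_window) / rms(unperturbed_window)

# Illustrative samples (arbitrary units), not real EMG data:
# each perturbed sample is exactly twice its unperturbed counterpart,
# so the normalized value is 200%.
ratio = normalized_rms([0.4, 0.6, 0.5, 0.7], [0.2, 0.3, 0.25, 0.35])
```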
Results
No group differences were observed for the EMG-RMS or CV analyses (EMG/ROM) (p > 0.025). Co-contraction analysis revealed no differences in V:R or Sright:Sleft between the groups (p > 0.025). BPP showed increased TON and TMAX, reaching significance for M. rectus abdominis (p = 0.019) and erector spinae T9/L3 (p = 0.005/p = 0.015). ROM analysis over the unperturbed stride cycle revealed no differences between groups (p > 0.025). Normalization of the perturbed to the unperturbed step led to significant differences for the lumbar segment (LA) in lateral flexion, with BPP showing higher normalized ROM compared with Hmatched (p = 0.02). BPP showed a significantly more flexed posture (UTA (p = 0.02); LTA (p = 0.004)) during normal walking (Amean). Trunk posture (Amean) during perturbation showed higher trunk extension values in the LTA segment for H/Hmatched compared with BPP (p = 0.003). The matched-group (BPP vs. Hmatched) analysis did not show any systematic changes in results between groups.
Conclusion
BPP present impaired muscle response times and altered trunk posture, especially in the sagittal and transverse planes, compared with H. This could indicate reduced trunk stability and higher loading during gait perturbations.
Background
Overweight and obesity are growing health problems that are not restricted to adults. Childhood obesity is associated with metabolic, psychological and musculoskeletal comorbidities. However, knowledge about the effect of obesity on foot function across maturation is lacking. Decreased foot function with disproportional loading characteristics is expected for obese children. The aim of this study was to examine foot loading characteristics during gait in normal-weight, overweight and obese children aged 1–12 years.
Methods
A total of 10382 children aged one to twelve years were enrolled in the study. Finally, 7575 children (m/f: n = 3630/3945; 7.0 ± 2.9 yrs; 1.23 ± 0.19 m; 26.6 ± 10.6 kg; BMI: 17.1 ± 2.4 kg/m²) were included in the (complete-case) data analysis. Children were categorized as normal-weight (≥ 3rd and < 90th percentile; n = 6458), overweight (≥ 90th and < 97th percentile; n = 746) or obese (> 97th percentile; n = 371) according to the German reference system, which is based on age- and gender-specific body mass indices (BMI). Plantar pressure measurements were assessed during gait on an instrumented walkway. Contact area, arch index (AI), peak pressure (PP) and force-time integral (FTI) were calculated for the total, fore-, mid- and hindfoot. Data were analyzed descriptively (mean ± SD), followed by ANOVA/Welch test (according to homogeneity of variances: yes/no) for group differences according to BMI categorization (normal-weight, overweight, obese) and for each age group from 1 to 12 yrs (post hoc Tukey-Kramer/Dunnett's C; α = 0.05).
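The categorization step above can be sketched directly from the reported percentile cut-offs. The code below codifies those cut-offs; note that the abstract's ranges leave exactly the 97th percentile unassigned, which is treated as overweight here as a labeled assumption.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m²."""
    return weight_kg / height_m ** 2

def weight_status(bmi_percentile):
    """Weight status from an age- and gender-specific BMI percentile,
    using the cut-offs reported for the German reference system.
    Exactly the 97th percentile is not covered by the reported ranges
    (overweight < 97th, obese > 97th); it is treated as overweight here."""
    if bmi_percentile < 3:
        return "underweight"      # below the study's inclusion range
    if bmi_percentile < 90:
        return "normal-weight"
    if bmi_percentile <= 97:
        return "overweight"
    return "obese"
```

Note that the percentile, not the raw BMI value, determines the category, since the reference distribution differs by age and gender.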
Results
Mean walking velocity was 0.95 ± 0.25 m/s, with no differences between normal-weight, overweight and obese children (p = 0.0841). Results show higher foot contact area, arch index, peak pressure and force-time integral in overweight and obese children (p < 0.001). Obese children showed 1.48-fold (1-year-olds) to 3.49-fold (10-year-olds) the midfoot loading (FTI) of normal-weight children.
Conclusion
Additional body mass leads to higher overall load, with a disproportional impact on the midfoot area and the longitudinal foot arch, showing characteristic foot loading patterns. The feet of one- and two-year-old children are already significantly affected. Childhood overweight and obesity are not compensated for by the musculoskeletal system. To avoid excessive foot loading, with its potential risk of discomfort or pain in childhood, prevention strategies should be developed and validated for children with a high body mass index and functional changes in the midfoot area. The presented plantar pressure values could additionally serve as reference data to identify suspicious foot loading patterns in children.
In the context of back pain, great emphasis has been placed on the importance of trunk stability, especially in situations requiring compensation of the repetitive, intense loading induced during high-performance activities, e.g., jumping or landing. This study aims to evaluate trunk muscle activity during the drop jump in adolescent athletes with back pain (BP) compared to athletes without back pain (NBP). Eleven adolescent athletes suffering from back pain (BP: m/f: n = 4/7; 15.9 ± 1.3 yrs; 176 ± 11 cm; 68 ± 11 kg; 12.4 ± 10.5 h/week training) and 11 matched athletes without back pain (NBP: m/f: n = 4/7; 15.5 ± 1.3 yrs; 174 ± 7 cm; 67 ± 8 kg; 14.9 ± 9.5 h/week training) were evaluated. Subjects performed 3 drop jumps onto a force plate (ground reaction force). Bilateral 12-lead SEMG (surface electromyography) was applied to assess trunk muscle activity. Ground contact time [ms], maximum vertical jump force [N], jump time [ms] and the jump performance index [m/s] were calculated for the drop jumps. SEMG amplitudes (RMS: root mean square [%]) for all 12 individual muscles were normalized to MIVC (maximum isometric voluntary contraction) and analyzed in 4 time windows (100 ms pre- and 200 ms post-initial ground contact; 100 ms pre- and 200 ms post-landing) as outcome variables. In addition, muscles were grouped and analyzed as ventral and dorsal muscles, as well as straight and transverse trunk muscles. Drop jump ground reaction force variables did not differ between NBP and BP (p > 0.05). Mm. obliquus externus and internus abdominis presented higher SEMG amplitudes (1.3–1.9-fold) for BP (p < 0.05). Mm. rectus abdominis, erector spinae thoracic/lumbar and latissimus dorsi did not differ (p > 0.05). The muscle-group analysis over the whole jumping cycle showed significantly higher SEMG amplitudes for BP in the ventral (p = 0.031) and transverse muscles (p = 0.020) compared to NBP.
Higher activity of transverse, but not straight, trunk muscles might indicate a specific compensation strategy to support trunk stability in athletes with back pain during drop jumps. Therefore, exercises favoring the transverse trunk muscles could be recommended for back pain treatment.
The Star Excursion Balance Test (SEBT) is effective in measuring dynamic postural control (DPC). This research aimed to determine whether DPC, as measured by the SEBT, differs between young athletes (YA) with back pain (BP) and those without (NBP). 53 BP YA and 53 NBP YA matched for age, height, weight, training years, training sessions/week and training minutes/session were studied. Participants performed 4 practice trials, after which 3 measurements in the anterior, posteromedial and posterolateral SEBT reach directions were recorded. Normalized reach distance was analyzed using the mean of all 3 measurements. There was no statistically significant difference (p > 0.05) between the reach distances of BP (87.2 ± 5.3, 82.4 ± 8.2, 78.7 ± 8.1) and NBP (87.8 ± 5.6, 82.4 ± 8.0, 80.0 ± 8.8) in the anterior, posteromedial and posterolateral directions, respectively. DPC in YA with BP, as assessed by the SEBT, was not different from that of NBP YA.
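The normalized reach distances above can be sketched as follows. The abstract does not state the reference length used for normalization; leg length is the usual choice for the SEBT and is assumed here, and the trial values are illustrative rather than study data.

```python
def normalized_reach(trials_cm, leg_length_cm):
    """Mean of the three recorded SEBT reach trials, expressed as a
    percentage of leg length (an assumed reference length; the
    abstract does not specify which was used)."""
    return 100.0 * (sum(trials_cm) / len(trials_cm)) / leg_length_cm

# Illustrative trial data (cm), not from the study
anterior = normalized_reach([70.0, 72.0, 74.0], 82.0)
```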
Introduction
Injury prevention programs (IPPs) are an inherent part of training in recreational and professional sports. Providing performance-enhancing benefits in addition to injury prevention may help adjust coaches' and athletes' attitudes towards implementing injury prevention in their daily routine. Conventional thinking by players and coaches alike seems to suggest that IPPs need to be specific to one's sport to allow for performance enhancement. This systematic literature review aims, first, to determine the nature of IPP exercises, i.e., whether they are specific to the sport or based on general conditioning. Second, it examines whether general, sports-specific or mixed IPPs improve key performance indicators, with the aim of better facilitating the long-term implementation of these programs.
Methods
PubMed and Web of Science were searched electronically throughout March 2018. Inclusion criteria were: randomized controlled trials; publication dates between January 2006 and February 2018; athletes (11–45 years); injury prevention programs; and predefined performance measures that could be categorized into balance, power, strength, speed/agility and endurance. The methodological quality of the included articles was assessed with the Cochrane Collaboration assessment tools.
Results
Of 6619 initial findings, 22 studies met the inclusion criteria. Reference lists yielded a further 6 studies, for a total of 28. Nine studies used sports-specific IPPs, eleven general, and eight mixed prevention strategies. Overall, general programs ranged from 29% to 57% in effectiveness across performance outcomes. Mixed IPPs improved balance outcomes in 80% of cases but only 20–44% of other outcomes. Sports-specific programs led to larger-scale improvements in balance (66%), power (83%), strength (75%) and speed/agility (62%).
Conclusion
Sports-specific IPPs had the strongest influence on most performance indices, based on significant improvements versus control groups. Other factors, such as intensity, technical execution and compliance, should be accounted for in future investigations in addition to exercise modality.
Abstract
Although genetic diversity is one of the key components of biodiversity, its drivers are still not fully understood. While it is known that genetic diversity is affected both by environmental parameters and by habitat history, these factors are not often tested together. Therefore, we analyzed 14 microsatellite loci in Abax parallelepipedus, a flightless, forest-dwelling ground beetle, from 88 plots in two study regions in Germany. We modeled the effects of historical and environmental variables on allelic richness, and found for one of the regions, the Schorfheide-Chorin, a significant effect of the depth of the litter layer, which is a main component of habitat quality, and of the sampling effort, which serves as an inverse proxy for local population size. For the other region, the Schwäbische Alb, none of the potential drivers showed a significant effect on allelic richness. We conclude that the genetic diversity in our study species is being driven by current local population sizes via environmental variables and not by historical processes in the studied regions. This is also supported by the lack of genetic differentiation between local populations sampled from ancient and from recent woodlands. We suggest that the potential effects of former fragmentation and recolonization processes have been mitigated by the large and stable local populations of Abax parallelepipedus in combination with the proximity of the ancient and recent woodlands in the studied landscapes.
Plant functional traits reflect individual and community ecological strategies. They allow the detection of directional changes in community dynamics and ecosystem processes, providing a complementary tool for assessing biodiversity beyond species richness. Analysis of functional patterns in plant communities provides mechanistic insight into biodiversity alterations due to anthropogenic activity. Although studies have considered the effects of either anthropogenic management or nutrient availability on functional traits in temperate grasslands, studies combining the effects of both drivers are scarce. Here, we assessed the impacts of management intensity (fertilization, mowing, grazing), nutrient stoichiometry (C, N, P, K) and vegetation composition on community-weighted means (CWMs) and functional diversity (Rao's Q) of seven plant traits in 150 grasslands in three regions of Germany, using 6 years of data. Land use and nutrient stoichiometry accounted for larger proportions of model variance in CWM and Rao's Q than species richness and productivity. Grazing affected all analyzed trait groups; fertilization and mowing impacted only generative traits. Grazing was clearly associated with nutrient retention strategies, that is, investing in durable structures and producing fewer, less variable seeds. Phenological variability was increased. Fertilization and mowing decreased seed number/mass variability, indicating competition-related effects. Impacts of nutrient stoichiometry on trait syndromes varied. Nutrient limitation (large N:P and C:N ratios) promoted species with conservative strategies, that is, investment in durable plant structures rather than fast growth, fewer seeds, and delayed flowering onset. In contrast to seed mass, leaf-economics variability was reduced under P shortage. Species diversity was positively associated with the variability of generative traits. Synthesis.
Land use, nutrient availability, species richness and plant functional strategies have been shown to interact in complex ways, driving community composition and vegetation responses to management intensity. We suggest that a deeper understanding of the underlying mechanisms shaping community assembly and biodiversity will require analyzing all of these parameters together.
Background
Total hip or knee replacement is one of the most frequently performed surgical procedures. Physical rehabilitation following total hip or knee replacement is an essential part of therapy, improving functional outcomes and quality of life. After discharge from inpatient rehabilitation, subsequent postoperative exercise therapy is needed to maintain functional mobility. Telerehabilitation may be an innovative treatment approach. We aim to investigate the superiority of an interactive telerehabilitation intervention for patients after total hip or knee replacement, compared with usual care, regarding physical performance, functional mobility, quality of life and pain.
Methods/design
This is an open, randomized controlled, multicenter superiority study with two prospective arms. One hundred and ten eligible and consenting participants with total knee or hip replacement will be recruited at admission to subsequent inpatient rehabilitation. After comprehensive 3-week inpatient rehabilitation, the intervention group performs 3 months of interactive, home-based exercise training with a telerehabilitation system. For this purpose, the physiotherapist creates an individual training plan from 38 different strength and balance exercises implemented in the system. Data on the quality and frequency of training are transmitted to the physiotherapist for further adjustment. Communication between patient and physiotherapist is possible through the system. The control group receives voluntary usual aftercare programs. Baseline assessments are conducted after discharge from rehabilitation; final assessments follow 3 months later. The primary outcome is the difference in improvement between the intervention and control groups in 6-minute walk distance after 3 months. Secondary outcomes include differences in the Timed Up and Go Test, the Five-Times-Sit-to-Stand Test, the Stair Ascend Test, the Short Form 36, the Western Ontario and McMaster Universities Osteoarthritis Index, the International Physical Activity Questionnaire, and postural control, as well as gait and kinematic parameters of the lower limbs. Baseline-adjusted analysis-of-covariance models will be used to test for group differences in the primary and secondary endpoints.
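The baseline-adjusted analysis described above can be sketched as an ordinary least-squares fit of follow-up on baseline plus a group indicator, where the group coefficient is the adjusted treatment effect. This is a minimal one-covariate sketch with constructed illustrative numbers, not the trial's statistical model, which may include further covariates.

```python
def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def adjusted_group_effect(y, baseline, group):
    """Least-squares fit of y = b0 + b1*baseline + b2*group via the
    normal equations; returns b2, the baseline-adjusted between-group
    difference (the primary contrast of a one-covariate ANCOVA)."""
    X = [[1.0, b, g] for b, g in zip(baseline, group)]
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(3)]
           for i in range(3)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(3)]
    return solve3(xtx, xty)[2]

# Constructed 6-minute-walk distances (m): follow-up = 10 + 0.5*baseline,
# plus a 30 m benefit for the intervention group (group = 1). Not trial data.
baseline = [400.0, 420.0, 440.0, 460.0]
group = [0, 0, 1, 1]
followup = [10 + 0.5 * b + 30 * g for b, g in zip(baseline, group)]
effect = adjusted_group_effect(followup, baseline, group)
```

Because the constructed data follow the model exactly, the fit recovers the 30 m group effect; with real trial data the estimate would carry sampling error and a confidence interval.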
Discussion
We expect the intervention group to benefit from the interactive, home-based exercise training in many respects represented by the study endpoints. If successful, this approach could be used to enhance the access to aftercare programs, especially in structurally weak areas.