Injuries in professional soccer are a significant concern for teams, and high training load is among their contributing causes. This cohort study describes the relationship between workload parameters and the occurrence of non-contact injuries during weeks with high and low workload in professional soccer players throughout the season. Twenty-one professional soccer players aged 28.3 ± 3.9 years who competed in the Iranian Persian Gulf Pro League participated in this 48-week study. External load was monitored using a global positioning system (GPS, GPSPORTS Systems Pty Ltd), and the type of injury was documented daily by the team's medical staff. Odds ratios (OR) and relative risks (RR) were calculated for non-contact injuries in high- and low-load weeks according to acute workload (AW), chronic workload (CW), acute:chronic workload ratio (ACWR), and AW variation (Δ-AW) values. Using a Poisson distribution, the interval between previous and new injuries was estimated. Overall, 12 non-contact injuries occurred during high-load and 9 during low-load weeks. There was a significantly increased risk of sustaining non-contact injuries (p < 0.05) during high-load weeks for ACWR (OR: 4.67) and Δ-AW (OR: 4.07). Finally, the expected time between injuries was significantly shorter in high-load weeks for ACWR (1.25 vs. 3.33, rate ratio time [RRT]) and Δ-AW (1.33 vs. 3.45, RRT) compared to low-load weeks. The risk of sustaining injuries was significantly larger during high-workload weeks for ACWR and Δ-AW compared with low-workload weeks. The high OR observed in high-load weeks indicates a significant relationship between workload and the occurrence of non-contact injuries. The predicted time to a new injury is shorter in high-load weeks than in low-load weeks; accordingly, injuries occur more frequently during high-load weeks for ACWR and Δ-AW.
ACWR and Δ-AW appear to be good indicators for estimating the injury risk, and the time interval between injuries.
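The workload indices above follow standard definitions: the ACWR divides the most recent week's load by the rolling mean of the preceding weeks, and the odds ratio comes from a 2×2 table of injured/uninjured weeks. A minimal sketch with made-up load values (not the study's data; the 4-week chronic window is an assumption):

```python
def acwr(weekly_loads, week, chronic_weeks=4):
    """ACWR for a given week: acute load (that week) divided by the
    rolling mean of the last `chronic_weeks` weeks (acute week included)."""
    acute = weekly_loads[week]
    chronic = sum(weekly_loads[week - chronic_weeks + 1 : week + 1]) / chronic_weeks
    return acute / chronic

def odds_ratio(inj_high, noinj_high, inj_low, noinj_low):
    """Odds ratio of injury in high- vs. low-load weeks from a 2x2 table."""
    return (inj_high / noinj_high) / (inj_low / noinj_low)

# Hypothetical weekly external loads (arbitrary units), weeks 0..4
loads = [300, 320, 310, 305, 450]
ratio = acwr(loads, week=4)  # a load spike raises the ratio above 1
print(round(ratio, 2))
```

A ratio well above 1 flags a week in which acute load spikes relative to the chronic baseline, which is the condition the study associates with elevated injury odds.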
The influence of muscular fatigue on tennis serve performance within regular training sessions is unclear. Therefore, the aim of the present study was to examine the effect of within-session exercise sequence on the tennis serve in youth tennis. Twenty-five young male (14.9 ± 0.9 years) and female (14.5 ± 0.9 years) players participated in this within-subject crossover study and were randomly, but sex-matched, assigned to different training sequences (serve exercise before tennis training [BTS] or after tennis training [ATS]). Pre- and post-tests included serve velocity and accuracy, shoulder strength, and range-of-motion (ROM) performance (internal/external rotation). Results showed that after one week of serve training conducted following the ATS sequence, significant decreases were found in serve performance (e.g., speed and accuracy), with standardized differences ranging from d = 0.29 to 1.13, as well as in shoulder function (strength [d = 0.20 to 1.0] and ROM [d = 0.17 to 0.31]) in both female and male players, compared to the BTS sequence. Based on the present findings, it appears more effective to implement serve training before regular tennis training in youth players. If applied after training, excessive levels of fatigue may cause shoulder imbalances that could be related to an increased injury risk.
The integration of balance and plyometric training has been shown to provide significant improvements in sprint, jump, agility, and other performance measures in young athletes. It is not known whether a specific within-session sequence of balance and plyometric exercises provides more effective training adaptations. The objective of the present study was to investigate the effects of a sequence of alternating pairs of exercises versus a block (series) of all balance exercises followed by a block of plyometric exercises on components of physical fitness such as muscle strength, power, speed, agility, and balance. Twenty-six male adolescent soccer players (13.9 ± 0.3 years) participated in an 8-week training program that either alternated individual balance (e.g., exercises on unstable surfaces) and plyometric (e.g., jumps, hops, rebounds) exercises or performed a block of balance exercises prior to a block of plyometric exercises within each training session. Pre- and post-training measures included proxies of strength, power, agility, sprint, and balance such as countermovement jumps, isometric back and knee extension strength, standing long jump, 10- and 30-m sprints, agility, standing stork, and Y-balance tests. Both groups exhibited significant, generally large-magnitude (effect sizes) training improvements for all measures, with mean performance increases of approximately 30% or more. There were no significant differences between the training groups over time. The results demonstrate the effectiveness of combining balance and plyometric exercises within a training session on components of physical fitness in young adolescents. The improved performance outcomes were not significantly influenced by the within-session exercise sequence.
Growth and maturation affect long-term physical performance, making the appraisal of athletic ability difficult. We sought to longitudinally track youth soccer players to assess the developmental trajectory of athletic performance over a 6-year period in an English Premier League academy. Age-specific z-scores were calculated for sprint and jump performance from a sample of male youth soccer players (n = 140). A case study approach was used to analyse the longitudinal curves of the six players with the longest tenure. The trajectories of the sprint times of players 1 and 3 were characterised by a marked difference in respective performance levels up until peak height velocity (PHV), when player 1 achieved a substantial increase in sprint speed and player 3 experienced a large decrease. Player 5 was consistently a better performer than player 2 until PHV, when the sprint and jump performance of the former markedly decreased and he was overtaken by the latter. Fluctuations in players' physical performance can occur quickly and in drastic fashion. Coaches must be aware that suppressed, or inflated, performance could be temporary, and selection and deselection decisions should not be made based on information gathered over a short time period.
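The age-specific z-scores used above standardise a player's result against same-age peers. A minimal sketch with invented sprint times (not academy data; the sign is inverted so that a higher z means better performance, since lower sprint times are faster):

```python
from statistics import mean, pstdev

def age_specific_z(value, peers):
    """z-score of a player's result relative to same-age peers."""
    return (value - mean(peers)) / pstdev(peers)

# Hypothetical 20-m sprint times (s) for one age group
peers = [3.20, 3.05, 3.35, 3.10, 3.25, 3.15]
z = -age_specific_z(3.00, peers)  # invert: faster than peers -> positive z
print(round(z, 2))
```

Tracking such z-scores per age band, rather than raw times, is what allows performance trajectories to be compared across players who mature at different rates.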
Validation of two accelerometers to determine mechanical loading of physical activities in children
(2015)
The purpose of this study was to assess the validity of accelerometers against force plates (i.e., ground reaction force [GRF]) during the performance of different tasks of daily physical activity in children. Thirteen children (10.1 [range 5.4-15.7] years, 3 girls) wore two accelerometers (ActiGraph GT3X+ [ACT], GENEA [GEN]) at the hip that provide raw acceleration signals at 100 Hz. Participants completed different tasks (walking, jogging, running, landings from boxes of different heights, rope skipping, dancing) on a force plate. GRF was collected for one step per trial (10 trials) for ambulatory movements and for all landings (10 trials), rope skips, and dance procedures. Accelerometer outputs as peak loading (g) per activity were averaged. ANOVA, correlation analyses, and Bland-Altman plots were computed to determine the validity of the accelerometers against GRF. There was a main effect of task, with acceleration values increasing with locomotion speed and landing height (P < 0.001). Data from ACT and GEN correlated with GRF (r = 0.90 and 0.89, respectively) and with each other (r = 0.98), but both accelerometers consistently overestimated GRF. The new generation of accelerometer models that allow raw signal detection is reasonably accurate for measuring the impact loading of bone in children, although they systematically overestimate GRF.
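The Bland-Altman analysis used above summarises agreement between two measurement methods as the mean difference (bias) and the 95% limits of agreement. A minimal sketch with hypothetical paired peak-loading values (not the study's data):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical peak loadings (g): accelerometer vs. force plate
acc = [1.8, 2.4, 3.1, 4.0, 2.9]
grf = [1.5, 2.0, 2.8, 3.5, 2.6]
bias, (lo, hi) = bland_altman(acc, grf)
print(round(bias, 2))  # positive bias: accelerometer reads higher than GRF
```

A consistently positive bias, as in this toy example, is exactly the pattern the study reports: the accelerometers track GRF closely (high r) but systematically overestimate it.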
Our experimental approach included two studies to determine discriminative validity and test-retest reliability (study 1) as well as ecological validity (study 2) of a judo ergometer system (JERGo©) while performing judo-specific movements. Sixteen elite (age: 23 ± 3 years) and 11 sub-elite (age: 16 ± 1 years) athletes participated in study 1, and 14 male sub-elite judo athletes participated in study 2. Discriminative validity and test-retest reliability of sport-specific parameters (mechanical work, maximal force) were assessed during pulling movements with and without tsukuri (kuzushi). Ecological validity of muscle activity was determined by performing pulling movements using the ergometer without tsukuri and during the same movements against an opponent. In both conditions, electromyographic activity of trunk (e.g., m. erector spinae) and upper limb muscles (e.g., m. biceps brachii) was assessed separately for the lifting and pulling arm. Elite athletes showed mostly greater mechanical work, maximal force, and power (0.12 ≤ d ≤ 1.80) compared with sub-elite athletes. The receiver operating characteristic analysis revealed acceptable validity of the JERGo© system to discriminate athletes of different performance levels, predominantly during kuzushi without tsukuri (area under the curve = 0.27-0.90). Moreover, small-to-medium discriminative validity was found to detect meaningful performance changes for mechanical work and maximal force. The JERGo© system showed small-to-high relative (ICC = 0.37-0.92) and absolute reliability (SEM = 10.8-18.8%). Finally, our analyses revealed acceptable correlations (r = 0.41-0.88) between muscle activity during kuzushi performed with the JERGo© system compared with a judo opponent. Our findings indicate that the JERGo© system is a valid and reliable instrument for the assessment and training of judo-specific pulling kinetics, particularly during the kuzushi movement without tsukuri.
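The absolute reliability reported above (SEM, expressed in %) is conventionally derived from the between-subject variability and the ICC via SEM = SD · √(1 − ICC). A sketch with invented values (the force mean, SD, and ICC below are illustrative, not the study's):

```python
import math

def sem_from_icc(sd, icc):
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1 - icc)

def sem_percent(sd, icc, mean_score):
    """SEM expressed relative to the group mean (as a percentage)."""
    return 100 * sem_from_icc(sd, icc) / mean_score

# Hypothetical: maximal pulling force, mean 400 N, SD 60 N, ICC = 0.90
print(round(sem_from_icc(60, 0.90), 1))  # ≈ 19.0 N
```

The lower the ICC, the larger the SEM, which is why a parameter can show acceptable relative reliability yet still carry a measurement error of 10-20% of the mean.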
Non-local or crossover (contralateral and non-stretched muscles) increases in range-of-motion (ROM) and balance have been reported following rolling of the quadriceps, hamstrings, and plantar flexors. Since there is limited information regarding plantar sole (foot) rolling effects, the objective of this study was to determine whether unilateral foot rolling would affect ipsilateral and contralateral measures of ROM and balance in young healthy adults. A randomized within-subject design was used to examine non-local effects of unilateral foot rolling on ipsilateral and contralateral limb ankle dorsiflexion ROM and a modified sit-and-reach test (SRT). Static balance was also tested during a 30-s single-leg stance test. Twelve participants performed three bouts of 60 s of unilateral plantar sole rolling using a roller on the dominant foot, with 60-s rest intervals between sets. ROM and balance measures were assessed in separate sessions at pre-intervention, immediately post-intervention, and 10 minutes post-intervention. To evaluate repeated-measures effects, two SRT pre-tests were implemented. Results demonstrated that the second SRT pre-test was 6.6% higher than the first (p = 0.009, d = 1.91). There were no statistically significant effects of foot rolling on any measures immediately or 10 min post-test. To conclude, unilateral foot rolling did not produce statistically significant increases in ipsilateral or contralateral dorsiflexion or SRT ROM, nor did it affect postural sway. Our statistically non-significant findings might be attributed to a lower degree of roller-induced afferent stimulation due to the smaller volume of myofascia and muscle compared to prior studies. Furthermore, ROM results from studies utilizing a single pre-test without a sufficient warm-up should be viewed critically.
The purpose of this study was to investigate the effects of back extensor fatigue on performance measures and electromyographic (EMG) activity of leg and trunk muscles during jumping on stable and unstable surfaces.
Before and after a modified Biering-Sorensen fatigue protocol for the back extensors, countermovement jumps (CMJ) and lateral jumps (LJ) were performed on a force plate under stable and unstable (balance pad on the force plate) conditions. Performance measures for the LJ (contact time) and CMJ (jump height) and the EMG activity of leg and trunk muscles were tested in 14 male experienced jumpers during 2 time intervals for the CMJ (braking phase, push-off phase) and 5 intervals for the LJ (-30 to 0, 0-30, 30-60, 60-90, and 90-120 ms) in non-fatigued and fatigued conditions.
A significant main effect of test (fatigue) (p = 0.007, f = 0.57) was observed for CMJ height. EMG analysis showed a significant fatigue-induced decrease in biceps femoris and gastrocnemius activity during the CMJ (p = 0.008, f = 0.58 and p = 0.04, f = 0.422, respectively). LJ contact time was not affected by fatigue or surface interaction. EMG activity was significantly lower in the tibialis anterior during the LJ following fatigue (p = 0.05, f = 0.405). A test × surface interaction (p = 0.04, f = 0.438) revealed that gastrocnemius EMG activity during the non-fatigued unstable CMJ was lower than in the non-fatigued stable condition during the onset-of-force phase.
The findings revealed that fatiguing the trunk negatively impacts CMJ height and muscle activity during the performance of CMJs. However, skilled jumpers are not additionally affected by a moderately unstable surface as compared to a stable surface.
TRIPOD
(2021)
Inertial measurement units (IMUs) enable easy to operate and low-cost data recording for gait analysis. When combined with treadmill walking, a large number of steps can be collected in a controlled environment without the need of a dedicated gait analysis laboratory. In order to evaluate existing and novel IMU-based gait analysis algorithms for treadmill walking, a reference dataset that includes IMU data as well as reliable ground truth measurements for multiple participants and walking speeds is needed. This article provides a reference dataset consisting of 15 healthy young adults who walked on a treadmill at three different speeds. Data were acquired using seven IMUs placed on the lower body, two different reference systems (Zebris FDMT-HQ and OptoGait), and two RGB cameras. Additionally, in order to validate an existing IMU-based gait analysis algorithm using the dataset, an adaptable modular data analysis pipeline was built. Our results show agreement between the pressure-sensitive Zebris and the photoelectric OptoGait system (r = 0.99), demonstrating the quality of our reference data. As a use case, the performance of an algorithm originally designed for overground walking was tested on treadmill data using the data pipeline. The accuracy of stride length and stride time estimations was comparable to that reported in other studies with overground data, indicating that the algorithm is equally applicable to treadmill data. The Python source code of the data pipeline is publicly available, and the dataset will be provided by the authors upon request, enabling future evaluations of IMU gait analysis algorithms without the need of recording new data.
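The between-system agreement reported above (r = 0.99) is a Pearson correlation over paired stride parameters. A minimal sketch with invented stride times (the actual pipeline is the authors' publicly available Python code; these values are not from the dataset):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical stride times (s) from the two reference systems
zebris   = [1.02, 1.05, 0.98, 1.10, 1.01]
optogait = [1.01, 1.06, 0.99, 1.09, 1.02]
print(round(pearson_r(zebris, optogait), 2))
```

A correlation this high between two independent reference systems is what justifies treating their measurements as ground truth for validating IMU-based stride estimates.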
Coaches and athletes in elite sports are constantly seeking to use innovative and advanced training strategies to efficiently improve strength/power performance in already highly-trained individuals. In this regard, high-intensity conditioning contractions have become a popular means to induce acute improvements primarily in muscle contractile properties, which are supposed to translate to subsequent power performances. This performance-enhancing physiological mechanism has previously been called postactivation potentiation (PAP). However, in contrast to the traditional mechanistic understanding of PAP that is based on electrically-evoked twitch properties, an increasing number of studies used the term PAP while referring to acute performance enhancements, even if physiological measures of PAP were not directly assessed. In this current opinion article, we compare the two main approaches (i.e., mechanistic vs. performance) used in the literature to describe PAP effects. We additionally discuss potential misconceptions in the general use of the term PAP. Studies showed that mechanistic and performance-related PAP approaches have different characteristics in terms of the applied research field (basic vs. applied), effective conditioning contractions (e.g., stimulated vs. voluntary), verification (lab-based vs. field tests), effects (twitch peak force vs. maximal voluntary strength), occurrence (consistent vs. inconsistent), and time course (largest effect immediately after vs. approximately 7 min after the conditioning contraction). Moreover, cross-sectional studies revealed inconsistent and trivial-to-large-sized associations between selected measures of mechanistic (e.g., twitch peak force) vs. performance-related PAP approaches (e.g., jump height). In an attempt to avoid misconceptions related to the two different PAP approaches, we propose to use two different terms.
Postactivation potentiation should only be used to indicate the increase in muscular force/torque production during an electrically-evoked twitch. In contrast, postactivation performance enhancement (PAPE) should be used to refer to the enhancement of measures of maximal strength, power, and speed following conditioning contractions. The implementation of this terminology would help to better differentiate between mechanistic and performance-related PAP approaches. This is important from a physiological point of view, but also when it comes to aggregating findings from PAP studies, e.g., in the form of meta-analyses, and translating these findings to the field of strength and conditioning.
Background
The importance of trunk muscle strength (TMS) for physical fitness and athletic performance has been demonstrated by studies reporting significant correlations between those capacities. However, evidence-based knowledge regarding the magnitude of correlations between TMS and proxies of physical fitness and athletic performance, as well as potential effects of core strength training (CST) on TMS, physical fitness, and athletic performance variables, is currently lacking for trained individuals.
Objective
The aims of this systematic review and meta-analysis were to quantify associations between variables of TMS, physical fitness, and athletic performance and the effects of CST on these measures in healthy trained individuals.
Data Sources
PubMed, Web of Science, and SPORTDiscus were systematically screened from January 1984 to March 2015.
Study Eligibility Criteria
Studies were included that investigated healthy trained individuals aged 16-44 years and tested at least one measure of TMS, muscle strength, muscle power, balance, and/or athletic performance.
Results
Small-sized relationships of TMS with physical performance measures (-0.05 ≤ r ≤ 0.18) were found in 15 correlation studies. Sixteen intervention studies revealed large effects of CST on measures of TMS (SMD = 1.07) but small-to-medium-sized effects on proxies of physical performance (0 ≤ SMD ≤ 0.71) compared with no training or regular training only. The methodological quality of the CST studies was low (median PEDro score = 4).
Conclusions
Our findings indicate that TMS plays only a minor role for physical fitness and athletic performance in trained individuals. In fact, CST appears to be an effective means to increase TMS but was associated with only limited gains in physical fitness and athletic performance measures when compared with no or only regular training.
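The standardised mean differences (SMD) pooled in this review are conventionally computed as the between-group mean difference divided by the pooled standard deviation. A sketch with invented group summaries (not values from the included studies):

```python
import math

def smd(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardised mean difference (Cohen's d with pooled SD)."""
    pooled_sd = math.sqrt(((n_exp - 1) * sd_exp ** 2 +
                           (n_ctrl - 1) * sd_ctrl ** 2)
                          / (n_exp + n_ctrl - 2))
    return (mean_exp - mean_ctrl) / pooled_sd

# Hypothetical trunk strength gains: CST group vs. control (arbitrary units)
print(round(smd(12.0, 4.0, 20, 7.0, 5.0, 20), 2))
```

Because the SMD is unit-free, it allows gains measured with different strength tests across studies to be pooled on one scale, with values near 1 conventionally read as large effects.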
The purpose of this study was to investigate the effects of plyometric training on stable (SPT) vs. highly unstable surfaces (IPT) on athletic performance in adolescent soccer players. Twenty-four male sub-elite soccer players (age: 15 ± 1 years) were assigned to 2 groups performing plyometric training for 8 weeks (2 sessions/week, 90 min each). The SPT group conducted plyometrics on stable surfaces and the IPT group on unstable surfaces. Tests included jump performance (countermovement jump [CMJ] height, drop jump [DJ] height, DJ performance index), sprint time, agility, and balance. Statistical analysis revealed significant main effects of time for CMJ height (p < 0.01, f = 1.44), DJ height (p < 0.01, f = 0.62), DJ performance index (p < 0.05, f = 0.60), 0-10-m sprint time (p < 0.05, f = 0.58), agility (p < 0.01, f = 1.15), and balance (p < 0.05, 0.46 ≤ f ≤ 1.36). Additionally, a Training group × Time interaction was found for CMJ height (p < 0.01, f = 0.66) in favor of the SPT group. Following 8 weeks of training, similar improvements in speed, agility, and balance were observed in the IPT and SPT groups. However, IPT appears to be less effective than SPT for increasing CMJ height. It is thus recommended that coaches use SPT if the goal is to improve jump performance.
Background
High prevalence rates have been reported for physical inactivity, mobility limitations, and falls in older adults. Home-based exercise might be an adequate means to increase physical activity by improving health- (i.e., muscle strength) and skill-related components of physical fitness (i.e., balance), particularly in times of restricted physical activity due to pandemics.
Objective
The objective of this study was to examine the effects of home-based balance exercises conducted during daily tooth brushing on measures of balance and muscle strength in healthy older adults.
Methods
Fifty-one older adults were randomly assigned to a balance exercise group (n = 27; age: 65.1 ± 1.1 years) or a passive control group (n = 24; age: 66.2 ± 3.3 years). The intervention group conducted balance exercises over a period of eight weeks twice daily for three minutes each during their daily tooth brushing routine. Pre- and post-intervention, tests were included for the assessment of static steady-state balance (i.e., Romberg test), dynamic steady-state balance (i.e., 10-m single and dual-task walk test using a cognitive and motor interference task), proactive balance (i.e., Timed-Up-and-Go Test [TUG], Functional-Reach-Test [FRT]), and muscle strength (i.e., Chair-Rise-Test [CRT]).
Results
Irrespective of group, the statistical analysis revealed significant main effects for time (pre vs. post) for dual-task gait speed (p < .001, 1.12 ≤ d ≤ 2.65), TUG (p < .001, d = 1.17), FRT (p = .002, d = 0.92), and CRT (p = .002, d = 0.94) but not for single-task gait speed and for the Romberg-Test. No significant group × time interactions were found for any of the investigated variables.
Conclusions
The applied lifestyle balance training program conducted twice daily during tooth brushing routines appears not to be sufficient in terms of exercise dosage and difficulty level to enhance balance and muscle strength in healthy adults aged 60–72 years. Consequently, structured balance training programs using higher exercise dosages and/or more difficult balance tasks are recommended for older adults to improve balance and muscle strength.
Background
Due to inconclusive evidence on the effects of foot orthoses treatment on lower limb kinematics and kinetics in children, studies are needed that particularly evaluate the long-term use of foot orthoses on lower limb alignment during walking. Thus, the main objective of this study was to evaluate the effects of long-term treatment with arch support foot orthoses versus a sham condition on lower extremity kinematics and kinetics during walking in children with flexible flat feet.
Methods
Thirty boys aged 8–12 years with flexible flat feet participated in this study. While the experimental group (n = 15) used medial arch support foot orthoses during everyday activities over a period of four months, the control group (n = 15) received flat 2-mm-thick insoles (i.e., sham condition) for the same time period. Before and after the intervention period, walking kinematics and ground reaction forces were collected.
Results
Significant group by time interactions were observed during walking at preferred gait speed for maximum ankle eversion, maximum ankle internal rotation angle, minimum knee abduction angle, maximum knee abduction angle, maximum knee external rotation angle, maximum knee internal rotation angle, maximum hip extension angle, and maximum hip external rotation angle in favor of the foot orthoses group. In addition, statistically significant group by time interactions were detected for maximum posterior, and vertical ground reaction forces in favor of the foot orthoses group.
Conclusions
The long-term use of arch support foot orthoses proved to be feasible and effective in boys with flexible flat feet to improve lower limb alignment during walking.
Background
Due to inconclusive evidence on the effects of foot orthoses treatment on lower limb kinematics and kinetics in children, studies are needed that particularly evaluate the long-term use of foot orthoses on lower limb alignment during walking. Thus, the main objective of this study was to evaluate the effects of long-term treatment with arch support foot orthoses versus a sham condition on lower extremity kinematics and kinetics during walking in children with flexible flat feet.
Methods
Thirty boys aged 8–12 years with flexible flat feet participated in this study. While the experimental group (n = 15) used medial arch support foot orthoses during everyday activities over a period of four months, the control group (n = 15) received flat 2-mm-thick insoles (i.e., sham condition) for the same time period. Before and after the intervention period, walking kinematics and ground reaction forces were collected.
Results
Significant group by time interactions were observed during walking at preferred gait speed for maximum ankle eversion, maximum ankle internal rotation angle, minimum knee abduction angle, maximum knee abduction angle, maximum knee external rotation angle, maximum knee internal rotation angle, maximum hip extension angle, and maximum hip external rotation angle in favor of the foot orthoses group. In addition, statistically significant group by time interactions were detected for maximum posterior and vertical ground reaction forces in favor of the foot orthoses group.
Conclusions
The long-term use of arch support foot orthoses proved feasible and effective for improving lower limb alignment during walking in boys with flexible flat feet.
Background: The regular assessment of hormonal and mood state parameters in professional soccer is proposed as a good indicator during periods of intense training and/or competition to avoid overtraining.
Objective: The aim of this study was to analyze hormonal, psychological, workload and physical fitness parameters in elite soccer players in relation to changes in training and match exposure during a congested period of match play.
Methods: Sixteen elite soccer players from a team playing in the first Tunisian soccer league were evaluated three times (T1, T2, and T3) over 12 weeks. The non-congested period of match play was from T1 to T2, when the players played 6 games over 6 weeks. The congested period was from T2 to T3, when the players played 10 games over 6 weeks. From T1 to T3, players performed the Yo-Yo intermittent recovery test level 1 (YYIR1), the repeated shuttle sprint ability test (RSSA), the countermovement jump test (CMJ), and the squat jump test (SJ). Plasma Cortisol (C), Testosterone (T), and the T/C ratio were analyzed at T1, T2, and T3. Players had their mood dimensions (tension, depression, anger, vigor, fatigue, confusion, and a Total Mood Disturbance) assessed through the Profile of Mood State questionnaire (POMS). Training session rating of perceived exertion (sRPE) was also recorded on a daily basis in order to quantify internal training load and elements of monotony and strain.
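The internal training load, monotony, and strain quantities derived from sRPE are commonly computed with Foster's session-RPE method; the sketch below illustrates that calculation (all function names and numbers are illustrative, not the study's data):

```python
# Foster's session-RPE method for internal load, monotony, and strain.
# All names and example values are illustrative.
from statistics import mean, pstdev

def session_load(rpe, minutes):
    """Internal load of one session = session RPE x duration (arbitrary units)."""
    return rpe * minutes

def weekly_metrics(daily_loads):
    """daily_loads: seven daily load values (0 for rest days)."""
    weekly_load = sum(daily_loads)
    monotony = mean(daily_loads) / pstdev(daily_loads)  # low day-to-day variation -> high monotony
    strain = weekly_load * monotony
    return weekly_load, monotony, strain

# Hypothetical week: five training days, two rest days
loads = [session_load(rpe, t) for rpe, t in
         [(6, 90), (5, 60), (0, 0), (7, 75), (4, 60), (8, 90), (0, 0)]]
week_load, monotony, strain = weekly_metrics(loads)
```

Strain rises both with total weekly load and with low variability across days, which is why monotony is tracked alongside raw load.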
Results: Significant performance declines (T1 < T2 < T3) were found for SJ performance (p = 0.04, effect size [ES] ES₁₋₂ = 0.15−0.06, ES₂₋₃ = 0.24) from T1 to T3. YYIR1 performance improved significantly from T1 to T2 and declined significantly from T2 to T3 (p = 0.001, ES₁₋₂ = 0.24, ES₂₋₃ = −2.54). Mean RSSA performance was significantly higher (p = 0.019, ES₁₋₂ = −0.47, ES₂₋₃ = 1.15) in T3 compared with T2 and T1. Best RSSA performance was significantly higher in T3 when compared with T2 and T1 (p = 0.006, ES₂₋₃ = 0.47, ES₁₋₂ = −0.56), but significantly lower in T2 when compared with T1. T and T/C were significantly lower in T3 when compared with T2 and T1 (T: p = 0.03, ES₃₋₂ = −0.51, ES₃₋₁ = −0.51; T/C: p = 0.017, ES₃₋₂ = −1.1, ES₃₋₁ = −1.07). Significant decreases were found for the vigor scores in T3 when compared to T2 and T1 (p = 0.002, ES₁₋₂ = 0.31, ES₃₋₂ = −1.25). A significant increase was found in fatigue scores in T3 as compared to T1 and T2 (p = 0.002, ES₁₋₂ = 0.43, ES₂₋₃ = 0.81). Significant increases were found from T1 < T2 < T3 in tension score (p = 0.002, ES₁₋₂ = 1.1, ES₂₋₃ = 0.2) and anger score (p = 0.03, ES₁₋₂ = 0.47, ES₂₋₃ = 0.33) over the study period. Total mood disturbance increased significantly (p = 0.02, ES₁₋₂ = 0.91, ES₂₋₃ = 1.1) from T1 to T3. Between T1-T2, significant relationships were observed between workload and changes in T (r = 0.66, p = 0.003) and the T/C ratio (r = 0.62, p = 0.01). There were significant relationships between performance in RSSAbest and training load parameters (workload: r = 0.52, p = 0.03; monotony: r = 0.62, p = 0.01; strain: r = 0.62, p = 0.009). Between T2-T3, there was a significant relationship between Δ% of total mood disturbance and Δ% of YYIR1 (r = −0.54, p = 0.04), RSSAbest (r = 0.58, p = 0.01), SJ (r = −0.55, p = 0.01), T (r = 0.53, p = 0.03), and T/C (r = 0.5, p = 0.04).
Conclusion: An intensive period of congested match play significantly compromised elite soccer players’ physical and mental fitness. These changes were related to psychological but not hormonal parameters, even though significant alterations were detected for selected measures. Mood monitoring could be a simple and useful tool to determine the degree of preparedness for match play during a congested period in professional soccer.
Purpose: To examine the effects of loaded (LPJT) versus unloaded plyometric jump training (UPJT) programs on measures of muscle power, speed, change of direction (CoD), and kicking-distance performance in prepubertal male soccer players. Methods: Participants (N = 29) were randomly assigned to a LPJT group (n = 13; age = 13.0 [0.7] y) using weighted vests or UPJT group (n = 16; age = 13.0 [0.5] y) using body mass only. Before and after the intervention, tests for the assessment of proxies of muscle power (ie, countermovement jump, standing long jump); speed (ie, 5-, 10-, and 20-m sprint); CoD (ie, Illinois CoD test, modified 505 agility test); and kicking-distance were conducted. Data were analyzed using magnitude-based inferences. Results: Within-group analyses for the LPJT group showed large and very large improvements for 10-m sprint time (effect size [ES] = 2.00) and modified 505 CoD (ES = 2.83) tests, respectively. For the same group, moderate improvements were observed for the Illinois CoD test (ES = 0.61), 5- and 20-m sprint time test (ES = 1.00 for both the tests), countermovement jump test (ES = 1.00), and the maximal kicking-distance test (ES = 0.90). Small enhancements in the standing long jump test (ES = 0.50) were apparent. Regarding the UPJT group, small improvements were observed for all tests (ES = 0.33-0.57), except 5- and 10-m sprint time (ES = 1.00 and 0.63, respectively). Between-group analyses favored the LPJT group for the modified 505 CoD (ES = 0.61), standing long jump (ES = 0.50), and maximal kicking-distance tests (ES = 0.57), but not for the 5-m sprint time test (ES = 1.00). Only trivial between-group differences were shown for the remaining tests (ES = 0.00-0.09). Conclusion: Overall, LPJT appears to be more effective than UPJT in improving measures of muscle power, speed, CoD, and kicking-distance performance in prepubertal male soccer players.
Background The aging process results in a number of functional (e.g., deficits in balance and strength/power performance), neural (e.g., loss of sensory/motor neurons), muscular (e.g., atrophy of type-II muscle fibers in particular), and bone-related (e.g., osteoporosis) deteriorations. Traditionally, balance and/or lower extremity resistance training were used to mitigate these age-related deficits. However, the effects of resistance training are limited and poorly translate into improvements in balance, functional tasks, activities of daily living, and fall rates. Thus, it is necessary to develop and design new intervention programs that are specifically tailored to counteract age-related weaknesses. Recent studies indicate that measures of trunk muscle strength (TMS) are associated with variables of static/dynamic balance, functional performance, and falls (i.e., occurrence, fear, rate, and/or risk of falls). Further, there is preliminary evidence in the literature that core strength training (CST) and Pilates exercise training (PET) have a positive influence on measures of strength, balance, functional performance, and falls in older adults.
Objective The objectives of this systematic literature review are: (a) to report potential associations between TMS/trunk muscle composition and balance, functional performance, and falls in old adults, and (b) to describe and discuss the effects of CST/PET on measures of TMS, balance, functional performance, and falls in seniors.
Data Sources A systematic approach was employed to capture all articles related to TMS/trunk muscle composition, balance, functional performance, and falls in seniors that were identified using the electronic databases PubMed and Web of Science (1972 to February 2013).
Study Selection A systematic approach was used to evaluate the 582 articles identified for initial review. Cross-sectional (i.e., relationship) or longitudinal (i.e., intervention) studies were included if they investigated TMS and an outcome-related measure of balance, functional performance, and/or falls. In total, 20 studies met the inclusionary criteria for review.
Study Appraisal and Synthesis Methods Longitudinal studies were evaluated using the Physiotherapy Evidence Database (PEDro) scale. Effect sizes (ES) were calculated whenever possible. For ease of discussion, the 20 articles were separated into three groups [i.e., cross-sectional (n = 6), CST (n = 9), PET (n = 5)].
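The effect sizes referred to above are of the standardized-mean-difference type; a minimal sketch of Cohen's d computed with a pooled standard deviation (all numbers are hypothetical, not taken from the reviewed studies):

```python
# Cohen's d: a standardized mean difference between two measurements,
# scaled by the pooled standard deviation. Example values are hypothetical.
from math import sqrt

def cohens_d(mean_post, mean_pre, sd_post, sd_pre, n_post, n_pre):
    pooled_sd = sqrt(((n_post - 1) * sd_post**2 + (n_pre - 1) * sd_pre**2)
                     / (n_post + n_pre - 2))
    return (mean_post - mean_pre) / pooled_sd

# Hypothetical strength scores before and after a core-training block
d = cohens_d(mean_post=55.0, mean_pre=50.0, sd_post=5.0, sd_pre=5.0,
             n_post=20, n_pre=20)  # -> 1.0 (a "large" effect by convention)
```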
Results The cross-sectional studies reported small-to-medium correlations between TMS/trunk muscle composition and balance, functional performance, and falls in older adults. Further, CST and/or PET proved to be feasible exercise programs for seniors with high adherence rates. Age-related deficits in measures of TMS, balance, functional performance, and falls can be mitigated by CST (mean strength gain = 30 %, mean effect size = 0.99; mean balance/functional performance gain = 23 %, mean ES = 0.88) and by PET (mean strength gain = 12 %, mean ES = 0.52; mean balance/functional performance gain = 18 %, mean ES = 0.71).
Limitations Given that the mean PEDro quality score did not reach the predetermined cut-off of ≥ 6 for the intervention studies, there is a need for more high-quality studies to explicitly identify the relevance of CST and PET to the elderly population.
Conclusions Core strength training and/or PET can be used as an adjunct or even alternative to traditional balance and/or resistance training programs for old adults. Further, CST and PET are easy to administer in a group setting or in individual fall preventive or rehabilitative intervention programs because little equipment and space is needed to perform such exercises.
Introduction:
In children, the impact of hearing loss on biomechanical gait parameters is not well understood. Thus, the objective of this study was to examine three-dimensional lower limb joint torques in deaf children compared to age-matched, normally hearing children while walking at preferred gait speed.
Methods:
Thirty prepubertal boys aged 8-14 were enrolled in this study and divided into a group with hearing loss (deaf group) and an age-matched healthy control group. Three-dimensional joint torques were analyzed during barefoot walking at preferred speed using Kistler force plates and a Vicon motion capture system.
Results:
Findings revealed that boys with hearing loss showed lower joint torques in ankle evertors, knee flexors, abductors and internal rotators as well as in hip internal rotators in both the dominant and non-dominant lower limbs (all p < 0.05; d = 1.23-7.00; 14-79%). Further, in the dominant limb, larger peak ankle dorsiflexor (p < 0.001; d = 1.83; 129%), knee adductor (p < 0.001; d = 3.20; 800%), and hip adductor torques (p < 0.001; d = 2.62; 350%) were found in deaf participants compared with controls.
Conclusion:
The observed altered lower limb torques during walking are indicative of unstable gait in children with hearing loss. More research is needed to elucidate whether physical training (e.g., balance and/or gait training) has the potential to improve walking performance in this patient group. (C) 2019 Elsevier Ltd. All rights reserved.
Background: There is evidence that fully recovered COVID-19 patients usually resume physical exercise, but do not perform at the same intensity level performed prior to infection. The aim of this study was to evaluate the impact of COVID-19 infection and recovery as well as muscle fatigue on cardiorespiratory fitness and running biomechanics in female recreational runners.
Methods: Twenty-eight females were divided into a group of hospitalized and recovered COVID-19 patients (COV, n = 14, at least 14 days following recovery) and a group of healthy age-matched controls (CTR, n = 14). Ground reaction forces from stepping on a force plate while barefoot overground running at 3.3 m/s were measured before and after a fatiguing protocol. The fatigue protocol consisted of incrementally increasing running speed until reaching a score of 13 on the 6–20 Borg scale, followed by steady-state running until exhaustion. The effects of group and fatigue were assessed for steady-state running duration, steady-state running speed, ground contact time, vertical instantaneous loading rate, and peak propulsion force.
Results: COV runners completed only 56% of the running time achieved by the CTR (p < 0.0001), and at a 26% slower steady-state running speed (p < 0.0001). There were fatigue-related reductions in loading rate (p = 0.004) without group differences. Increased ground contact time (p = 0.002) and reduced peak propulsion force (p = 0.005) were found for COV when compared to CTR.
Conclusion: Our results suggest that female runners who recovered from COVID-19 showed compromised running endurance and altered running kinetics in the form of longer stance periods and weaker propulsion forces. More research is needed in this area using larger sample sizes to confirm our study findings.
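The vertical instantaneous loading rate analyzed in this study is typically taken as the peak time-derivative of the vertical ground reaction force during early stance; the sketch below assumes that definition (the force trace, sampling rate, and function name are made up for illustration):

```python
# One common definition of vertical instantaneous loading rate (VILR):
# the peak first derivative of the vertical ground reaction force (GRF)
# during the rising portion of stance. Trace and sampling rate are made up.
def instantaneous_loading_rate(vgrf, fs):
    """vgrf: vertical GRF samples in body weights (BW); fs: sampling rate (Hz).
    Returns the peak sample-to-sample derivative in BW/s."""
    derivatives = [(b - a) * fs for a, b in zip(vgrf, vgrf[1:])]
    return max(derivatives)

# Toy impact transient sampled at 1000 Hz
trace = [0.0, 0.2, 0.6, 1.1, 1.4, 1.5, 1.45, 1.4]
vilr = instantaneous_loading_rate(trace, fs=1000)  # steepest rise: 0.5 BW per sample
```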
Objective: To determine the effects of low- vs. high-intensity aerobic and resistance training on motor and cognitive function, brain activation, brain structure, and neurochemical markers of neuroplasticity and the association thereof in healthy young and older adults and in patients with multiple sclerosis, Parkinson's disease, and stroke. Design: Systematic review and robust variance estimation meta-analysis with meta-regression. Data sources: Systematic search of MEDLINE, Web of Science, and CINAHL databases. Results: Fifty studies with 60 intervention arms and 2283 in-analyses participants were included. Due to the low number of studies, the three patient groups were combined and analyzed as a single group. Overall, low- (g=0.19, p = 0.024) and high-intensity exercise (g=0.40, p = 0.001) improved neuroplasticity. Exercise intensity scaled with neuroplasticity only in healthy young adults but not in healthy older adults or patient groups. Exercise-induced improvements in neuroplasticity were associated with changes in motor but not cognitive outcomes. Conclusion: Exercise intensity is an important variable to dose and individualize the exercise stimulus for healthy young individuals but not necessarily for healthy older adults and neurological patients. This conclusion warrants caution because studies are needed that directly compare the effects of low- vs. high-intensity exercise on neuroplasticity to determine if such changes are mechanistically and incrementally linked to improved cognition and motor function.
Swimming performance can be improved not only by in-water sport-specific training but also by means of dry land-training (e.g., plyometric jump training [PJT]). This study examined the effects of an 8-week PJT on proxies of muscle power and swimming performance in prepubertal male swimmers. Participants were randomly allocated to a PJT group (PJT; n = 14; age: 10.3 +/- 0.4 years, maturity-offset = -3 +/- 0.3) or a control group (CG; n = 12; age: 10.5 +/- 0.4 years, maturity-offset = -2.8 +/- 0.3). Swimmers in PJT and CG performed 6 training sessions per week. Each training session lasted between 80 and 90 minutes. Over the 8-week in-season training period, PJT performed two PJT sessions per week, each lasting 25 to 30 minutes (~1 hour per week), in replacement of sport-specific swimming drills. During that time, CG followed their regular sport-specific swimming training (e.g., coordination, breathing, improving swimming strokes). Overall training volume was similar between groups. Pre- and post-training, tests were conducted to assess proxies of muscle power (countermovement-jump [CMJ], standing-long-jump [SLJ]) and sport-specific swimming performances (15-, 25-, and 50-m front-crawl, 25-m kick without push [25-m kick WP], and 25-m front-crawl WP). No training or test-related injuries were detected over the course of the study. Between-group analyses derived from magnitude-based inferences showed trivial-to-large effects in favour of PJT for all tests (ES = 0.28 to 1.43). Within-group analyses for the PJT showed small performance improvements for CMJ (effect-size [ES] = 0.53), 25-m kick WP (ES = 0.25), and 50-m front crawl (ES = 0.56) tests. Moderate performance improvements were observed for the SLJ, 25-m front-crawl WP, 15-m and 25-m front-crawl tests (ES = 0.95, 0.60, 0.99, and 0.85, respectively). For CG, the within-group results showed trivial performance declines for the CMJ (ES = -0.13) and the 50-m front-crawl test (ES = -0.04).
In addition, trivial-to-small performance improvements were observed for the SLJ (ES = 0.09), 25-m kick WP (ES = 0.02), 25-m front-crawl WP (ES = 0.19), 25-m front-crawl (ES = 0.20), and 15-m front crawl (ES = 0.36). Short-term in-season PJT, integrated into the regular swimming training, was more effective than regular swimming training alone in improving jump and sport-specific swimming performances in prepubertal male swimmers.
Background/objective
Dry land-training (e.g., plyometric jump training) can be a useful means of improving swimming performance. This study examined the effects of an 8-week plyometric jump training (PJT) program on jump and sport-specific performances in prepubertal female swimmers.
Methods
Twenty-two girls were randomly assigned to either a plyometric jump training group (PJTG; n = 12, age: 10.01 ± 0.57 years, maturity-offset = -1.50 ± 0.50, body mass = 36.39 ± 6.32 kg, body height = 146.90 ± 7.62 cm, body mass index = 16.50 ± 1.73 kg/m2) or an active control (CG; n = 10, age: 10.50 ± 0.28 years, maturity-offset = -1.34 ± 0.51, body mass = 38.41 ± 9.42 kg, body height = 143.60 ± 5.05 cm, body mass index = 18.48 ± 3.77 kg/m2). Pre- and post-training, tests were conducted for the assessment of muscle power (e.g., countermovement-jump [CMJ], standing-long-jump [SLJ]). Sport-specific-performances were tested using the timed 25 and 50-m front crawl with a diving-start, timed 25-m front crawl without push-off from the wall (25-m WP), and a timed 25-m kick without push-off from the wall (25-m KWP).
Results
Findings showed a significant main effect of time for the CMJ (d = 0.78), the SLJ (d = 0.91), 25-m front crawl test (d = 2.5), and the 25-m-KWP (d = 1.38) test. Significant group × time interactions were found for CMJ, SLJ, 25-m front crawl, 50-m front crawl, 25-m KWP, and 25-m WP test (d = 0.29–1.63) in favor of PJTG (d = 1.34–3.50). No significant pre-post changes were found for CG (p > 0.05).
Conclusion
In sum, PJT is effective in improving muscle power and sport-specific performances in prepubertal swimmers. Therefore, PJT should be included from an early start into the regular training program of swimmers.
Background: The standard method to treat physically active patients with anterior cruciate ligament (ACL) rupture is ligament reconstruction surgery. The rehabilitation training program is very important to improve functional performance in recreational athletes following ACL reconstruction.
Objectives: The aims of this study were to compare the effects of three different training programs, eccentric training (ECC), plyometric training (PLYO), or combined eccentric and plyometric training (COMB), on dynamic balance (Y-BAL), the Lysholm Knee Scale (LKS), the return to sport index (RSI), and the leg symmetry index (LSI) for the single leg hop test for distance in elite female athletes after ACL surgery.
Materials and Methods: Fourteen weeks after rehabilitation from surgery, 40 elite female athletes (20.3 ± 3.2 years), who had undergone an ACL reconstruction, participated in a short-term (6 weeks; two times a week) training study. All participants received the same rehabilitation protocol prior to the training study. Athletes were randomly assigned to three experimental groups, ECC (n = 10), PLYO (n = 10), and COMB (n = 10), and to a control group (CON: n = 10). Testing was conducted before and after the 6-week training programs and included the Y-BAL, LKS, and RSI. LSI was assessed after the 6-week training programs only.
Results: Adherence rate was 100% across all groups and no training or test-related injuries were reported. No significant between-group baseline differences (pre-6-week training) were observed for any of the parameters. Significant group-by-time interactions were found for Y-BAL (p < 0.001, ES = 1.73), LKS (p < 0.001, ES = 0.76), and RSI (p < 0.001, ES = 1.39). Contrast analysis demonstrated that COMB yielded significantly greater improvements in Y-BAL, LKS, and RSI (all p < 0.001), in addition to significantly better performances in LSI (all p < 0.001), than CON, PLYO, and ECC, respectively.
Conclusion: In conclusion, combined (eccentric/plyometric) training seems to represent the most effective training method as it exerts positive effects on both stability and functional performance in the post-ACL-surgical rehabilitation period of elite female athletes.
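The leg symmetry index for the single leg hop test is conventionally the hop distance of the involved limb relative to the uninvolved limb, expressed as a percentage; a minimal sketch under that convention (the distances and the 90% cut-off shown are illustrative, not this study's criteria):

```python
# Leg symmetry index (LSI) for the single leg hop for distance:
# involved limb / uninvolved limb x 100. Values and cut-off are illustrative.
def leg_symmetry_index(involved_cm, uninvolved_cm):
    return involved_cm / uninvolved_cm * 100.0

lsi = leg_symmetry_index(involved_cm=135.0, uninvolved_cm=150.0)
symmetric_enough = lsi >= 90.0  # ~90% is a commonly cited return-to-sport threshold
```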
Combining training of muscle strength and cardiorespiratory fitness within a training cycle could increase athletic performance more than single-mode training. However, the physiological effects produced by each training modality could also interfere with each other, improving athletic performance less than single-mode training. Because anthropometric, physiological, and biomechanical differences between young and adult athletes can affect the responses to exercise training, young athletes might respond differently to concurrent training (CT) compared with adults. Thus, the aim of the present systematic review with meta-analysis was to determine the effects of concurrent strength and endurance training on selected physical fitness components and athletic performance in youth. A systematic literature search of PubMed and Web of Science identified 886 records. The studies included in the analyses examined children (girls age 6-11 years, boys age 6-13 years) or adolescents (girls age 12-18 years, boys age 14-18 years), compared CT with single-mode endurance (ET) or strength training (ST), and reported at least one strength/power-related (e.g., jump height), endurance-related (e.g., peak VO2, exercise economy), or performance-related (e.g., time trial) outcome. We calculated weighted standardized mean differences (SMDs). CT compared to ET produced small effects in favor of CT on athletic performance (n = 11 studies, SMD = 0.41, p = 0.04) and trivial effects on cardiorespiratory endurance (n = 4 studies, SMD = 0.04, p = 0.86) and exercise economy (n = 5 studies, SMD = 0.16, p = 0.49) in young athletes. A sub-analysis of chronological age revealed a trend toward larger effects of CT vs. ET on athletic performance in adolescents (SMD = 0.52) compared with children (SMD = 0.17). CT compared with ST had small effects in favor of CT on muscle power (n = 4 studies, SMD = 0.23, p = 0.04).
In conclusion, CT is more effective than single-mode ET or ST in improving selected measures of physical fitness and athletic performance in youth. Specifically, CT compared with ET improved athletic performance in children and particularly adolescents. Finally, CT was more effective than ST in improving muscle power in youth.
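The weighted SMDs reported here follow standard effect-size pooling; a minimal sketch of one common implementation (Hedges' small-sample correction with inverse-variance weights, applied to hypothetical study summaries, not necessarily the authors' exact procedure):

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    return d * (1 - 3 / (4 * (n_t + n_c) - 9))  # correction factor J

def weighted_smd(effects):
    """Inverse-variance weighted mean SMD over (g, n_t, n_c) study tuples."""
    num = den = 0.0
    for g, n_t, n_c in effects:
        # approximate sampling variance of a standardized mean difference
        var = (n_t + n_c) / (n_t * n_c) + g ** 2 / (2 * (n_t + n_c))
        num += g / var
        den += 1 / var
    return num / den
```

Studies with larger samples (smaller sampling variance) receive proportionally more weight in the pooled estimate.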
Maintaining and increasing walking speed in old age is clinically important because this activity of daily living predicts functional and clinical state. We reviewed evidence for the biomechanical mechanisms of how strength and power training increase gait speed in old adults. A systematic search yielded only four studies that reported changes in selected gait biomechanical variables after an intervention. A secondary analysis of 20 studies revealed an association of r² = 0.21 between the 22% and 12% increase, respectively, in quadriceps strength and gait velocity in 815 individuals age 72. In 6 studies, there was a correlation of r² = 0.16 between the 19% and 9% gains in plantarflexion strength and gait speed in 240 old volunteers age 75. In 8 studies, there was zero association between the 35% and 13% gains in leg mechanical power and gait speed in 150 old adults age 73. To increase the efficacy of intervention studies designed to improve gait speed and other critical mobility functions in old adults, there is a need for a paradigm shift from conventional (clinical) outcome assessments to more sophisticated biomechanical analyses that examine joint kinematics, kinetics, energetics, muscle-tendon function, and musculoskeletal modeling before and after interventions.
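The r² values above are coefficients of determination between paired percent gains; a minimal sketch (the per-study gains below are hypothetical, not the reviewed data):

```python
def r_squared(x, y):
    """Coefficient of determination r^2 between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy ** 2 / (sxx * syy)

# Hypothetical per-study percent gains: quadriceps strength vs. gait velocity
strength_gain = [15, 20, 22, 25, 30]
speed_gain = [8, 10, 13, 11, 16]
print(round(r_squared(strength_gain, speed_gain), 2))
```

An r² of 0.21 means that about a fifth of the between-study variance in gait-speed gains is shared with strength gains.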
Cognitive resources contribute to balance control. There is evidence that mental fatigue reduces cognitive resources and impairs balance performance, particularly in older adults and when balance tasks are complex, for example when trying to walk or stand while concurrently performing a secondary cognitive task.
We conducted a systematic literature search in PubMed (MEDLINE), Web of Science and Google Scholar to identify eligible studies and performed a random effects meta-analysis to quantify the effects of experimentally induced mental fatigue on balance performance in healthy adults. Subgroup analyses were computed for age (healthy young vs. healthy older adults) and balance task complexity (balance tasks with high complexity vs. balance tasks with low complexity) to examine the moderating effects of these factors on fatigue-mediated balance performance.
We identified 7 eligible studies with 9 study groups and 206 participants. Analysis revealed that performing a prolonged cognitive task had a small but significant effect (SMDwm = −0.38) on subsequent balance performance in healthy young and older adults. However, age- and task-related differences in balance responses to fatigue could not be confirmed statistically.
Overall, aggregation of the available literature indicates that mental fatigue generally reduces balance in healthy adults. However, interactions between cognitive resource reduction, aging and balance task complexity remain elusive.
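The random-effects pooling described above can be sketched with the DerSimonian-Laird estimator, one common choice (the effect sizes and variances below are illustrative, not the meta-analyzed data):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird tau^2 estimator."""
    w = [1.0 / v for v in variances]          # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    wr = [1.0 / (v + tau2) for v in variances]     # random-effects weights
    return sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
```

When between-study heterogeneity (tau²) is zero, the estimate collapses to the fixed-effect weighted mean.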
The regular monitoring of physical fitness and sport-specific performance is important in elite sports to increase the likelihood of success in competition. This study aimed to systematically review and critically appraise the methodological quality, validation data, and feasibility of sport-specific performance assessment in Olympic combat sports such as amateur boxing, fencing, judo, karate, taekwondo, and wrestling. A systematic search was conducted in the electronic databases PubMed, Google Scholar, and ScienceDirect up to October 2017. Studies in combat sports were included that reported validation data (e.g., reliability, validity, sensitivity) of sport-specific tests. Overall, 39 studies were eligible for inclusion in this review. The majority of studies (74%) had sample sizes of <30 subjects. Nearly one-third of the reviewed studies lacked a sufficient description (e.g., anthropometrics, age, expertise level) of the included participants. Seventy-two percent of studies did not sufficiently report inclusion/exclusion criteria of their participants. In 62% of the included studies, the description and/or inclusion of a familiarization session(s) was either incomplete or nonexistent. Sixty percent of studies did not report any details about the stability of testing conditions. Approximately half of the studies examined reliability measures of the included sport-specific tests (intraclass correlation coefficient [ICC] = 0.43–1.00). Content validity was addressed in all included studies, criterion validity (only the concurrent aspect of it) in approximately half of the studies, with correlation coefficients ranging from r = −0.41 to 0.90. Construct validity was reported in 31% of the included studies and predictive validity in only one. Test sensitivity was addressed in 13% of the included studies. The majority of studies (64%) ignored and/or provided incomplete information on test feasibility and methodological limitations of the sport-specific test.
In 28% of the included studies, insufficient information or a complete lack of information was provided in the respective field of the test application. Several methodological gaps exist in studies that used sport-specific performance tests in Olympic combat sports. Additional research should adopt more rigorous validation procedures in the application and description of sport-specific performance tests in Olympic combat sports.
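Reliability figures like the ICC range quoted above can be computed from repeated trials; a sketch of the one-way random-effects form ICC(1,1) (the reviewed studies may have used other ICC models):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1); ratings: subjects x repeated trials."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    # between-subject and within-subject mean squares from one-way ANOVA
    ms_between = k * sum((sum(r) / k - grand) ** 2 for r in ratings) / (n - 1)
    ms_within = (sum((x - sum(r) / k) ** 2 for r in ratings for x in r)
                 / (n * (k - 1)))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Perfectly reproducible trials give an ICC of 1.0; trial-to-trial noise relative to between-subject spread pulls the value down.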
Symptoms of anxiety and depression in young athletes using the Hospital Anxiety and Depression Scale
(2018)
Elite young athletes have to cope with multiple psychological demands such as training volume, mental and physical fatigue, spatial separation from family and friends, or time management problems, which may lead to reduced mental and physical recovery. While normative data regarding symptoms of anxiety and depression are available for the general population (Hinz and Brahler, 2011), hardly any information exists for adolescents in general and young athletes in particular. Therefore, the aim of this study was to assess overall symptoms of anxiety and depression in young athletes as well as possible sex differences. The survey was carried out within the scope of the study "Resistance Training in Young Athletes" (KINGS-Study). Between August 2015 and September 2016, 326 young athletes aged (mean ± SD) 14.3 ± 1.6 years completed the Hospital Anxiety and Depression Scale (HAD Scale). Regarding the analysis of age on the anxiety and depression subscales, age groups were classified as follows: late childhood (12-14 years) and late adolescence (15-18 years). The participating young athletes were recruited from Olympic weight lifting, handball, judo, track and field athletics, boxing, soccer, gymnastics, ice speed skating, volleyball, and rowing. Anxiety and depression scores were (mean ± SD) 4.3 ± 3.0 and 2.8 ± 2.9, respectively. In the anxiety subscale, 22 cases (6.7%) showed subclinical scores and 11 cases (3.4%) showed clinically relevant scores. In the depression subscale, 31 cases (9.5%) showed subclinical scores and 12 cases (3.7%) showed clinically relevant scores. No significant differences were found between male and female athletes (p ≥ 0.05). No statistically significant differences in the HADS scores were found between male athletes of late childhood and late adolescence (p ≥ 0.05). To the best of our knowledge, this is the first report describing questionnaire-based indicators of symptoms of anxiety and depression in young athletes.
Our data imply the need for sports medical as well as sports psychiatric support for young athletes. In addition, our results demonstrated that the chronological age classification did not influence HAD Scale outcomes. Future research should focus on sports medical and sports psychiatric interventional approaches with the goal of preventing anxiety and depression as well as teaching coping strategies to young athletes.
Purpose: The aim of this study was to compare the effects of moderate intensity, low volume (MILV) vs. low intensity, high volume (LIHV) strength training on sport-specific performance, measures of muscular fitness, and skeletal muscle mass in young kayakers and canoeists.
Methods: Semi-elite young kayakers and canoeists (N = 40, 13 ± 0.8 years, 11 girls) performed either MILV (70–80% 1-RM, 6–12 repetitions per set) or LIHV (30–40% 1-RM, 60–120 repetitions per set) strength training for one season. Linear mixed-effects models were used to compare effects of training condition on changes over time in 250 and 2,000 m time trials, handgrip strength, underhand shot throw, average bench pull power over 2 min, and skeletal muscle mass. Both between- and within-subject designs were used for analysis. An alpha of 0.05 was used to determine statistical significance.
Results: Between- and within-subject analyses showed that monthly changes were greater in LIHV vs. MILV for the 2,000 m time trial (between: 9.16 s, SE = 2.70, p < 0.01; within: 13.90 s, SE = 5.02, p = 0.01) and bench pull average power (between: 0.021 W⋅kg⁻¹, SE = 0.008, p = 0.02; within: 0.010 W⋅kg⁻¹, SE = 0.009, p > 0.05). Training conditions did not affect other outcomes.
Conclusion: Young sprint kayakers and canoeists benefit from LIHV more than MILV strength training in terms of 2,000 m performance and muscular endurance (i.e., 2 min bench pull power).
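The study fitted linear mixed-effects models to monthly changes; as a simplified stand-in, monthly change can be approximated by per-athlete least-squares slopes averaged within each training condition (hypothetical data; this two-stage approach is not the authors' model):

```python
def ols_slope(x, y):
    """Least-squares slope of y on x (e.g., time-trial seconds per month)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def mean_monthly_change(group):
    """Average per-athlete slope; group maps athlete -> (months, times)."""
    slopes = [ols_slope(m, t) for m, t in group.values()]
    return sum(slopes) / len(slopes)
```

Comparing the mean slopes of two groups then mirrors the between-subject contrast of monthly change reported above.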
From a health and performance-related perspective, it is crucial to evaluate subjective symptoms and objective signs of acute training-induced immunological responses in young athletes. The few available studies have focused on immunological adaptations following aerobic training; hardly any studies have been conducted on resistance-training-induced stress responses. Therefore, the aim of this observational study was to investigate subjective symptoms and objective signs of immunological stress responses following resistance training in young athletes. Fourteen (7 female and 7 male) track and field athletes with a mean age of 16.4 years and without any symptoms of upper or lower respiratory tract infections participated in this study. Over a period of 7 days, subjective symptoms using the Acute Recovery and Stress Scale (ARSS) and objective signs of immunological responses using capillary blood markers were assessed each morning and after the last training session. Differences between morning and evening sessions and associations between subjective and objective parameters were analyzed using generalized estimating equations (GEE). In post hoc analyses, daily change-scores of the ARSS dimensions were compared between participants and revealed specific changes in objective capillary blood samples. In the GEE models, recovery (ARSS) was characterized by a significant decrease while stress (ARSS) showed a significant increase between morning and evening training sessions. A concomitant increase in white blood cell count (WBC), granulocytes (GRAN), and percentage share of granulocytes (GRAN%) was found between morning and evening sessions. Of note, the percentage share of lymphocytes (LYM%) showed a significant decrease. Furthermore, using multivariate regression analyses, we identified that recovery was significantly associated with LYM%, while stress was significantly associated with WBC and GRAN%.
Post hoc analyses revealed significantly larger increases in the stress dimensions for participants who showed increases in GRAN%, and significantly larger decreases in the recovery dimensions for participants with decreases in LYM%. More specifically, daily change-scores of the recovery and stress dimensions of the ARSS were associated with specific changes in objective immunological markers (GRAN%, LYM%) between morning and evening training sessions. Our results indicate that changes in subjective symptoms of the recovery and stress dimensions of the ARSS were associated with specific changes in objectively measured immunological markers.
Sprint and jump performances in highly trained young soccer players of different chronological age
(2020)
Objective
The aim of this study was to examine the effects of two different sprint-training regimes on sprint and jump performances according to age in elite young male soccer players over the course of one soccer season.
Methods
Players were randomly assigned to two training groups. Group 1 performed systematic change-of-direction sprints (CODST, U19 [n = 9], U17 [n = 9], U15 [n = 10]) while group 2 conducted systematic linear sprints (LST, U19 [n = 9], U17 [n = 9], U15 [n = 9]). Training volumes were similar between groups (40 sprints per week × 30 weeks = 1,200 sprints per season). Before and after training, all players performed tests for the assessment of linear and slalom sprint speed (5-m and 10-m), countermovement jump, and maximal aerobic speed performance.
Results
For all physical fitness measures, the baseline-adjusted means data (ANCOVA) across the age groups showed no significant differences between LST and CODST at post (0.061 < p < 0.995; 0.0017 < d < 1.01). The analyses of baseline-adjusted means for all physical fitness measures for U15, U17, and U19 (LST vs. CODST) revealed no significant differences between LST and CODST for U15 (0.213 < p < 0.917; 0.001 < d < 0.087), U17 (0.132 < p < 0.976; 0.001 < d < 0.310), and U19 (0.300 < p < 0.999; 0.001 < d < 0.049) at post.
Conclusions
The results from this study showed that both LST and CODST induced significant changes in sprint, lower-limb power, and aerobic performances in young elite soccer players. Since no significant differences were observed between LST and CODST, the observed changes are most likely due to training and/or maturation. Therefore, more research is needed to elucidate whether CODST, LST, or a combination of both is beneficial for youth soccer athletes' performance development.
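Baseline-adjusted means of the kind compared in the ANCOVA above can be sketched as follows (a simplified one-covariate adjustment on hypothetical pre/post scores; the study's actual model may differ):

```python
def baseline_adjusted_means(groups):
    """ANCOVA-style adjusted post-test means.

    groups maps name -> (pre_scores, post_scores). A pooled within-group
    slope of post on pre shifts each group's post mean to the grand
    pre-test mean, removing baseline differences between groups.
    """
    grand_pre = [p for pre, _ in groups.values() for p in pre]
    grand_mean = sum(grand_pre) / len(grand_pre)
    sxy = sxx = 0.0
    for pre, post in groups.values():
        mp = sum(pre) / len(pre)
        mq = sum(post) / len(post)
        sxy += sum((a - mp) * (b - mq) for a, b in zip(pre, post))
        sxx += sum((a - mp) ** 2 for a in pre)
    b = sxy / sxx  # pooled within-group slope
    return {name: sum(post) / len(post) - b * (sum(pre) / len(pre) - grand_mean)
            for name, (pre, post) in groups.items()}
```

When the groups start at the same baseline, the adjusted means equal the raw post-test means; otherwise the adjustment credits or debits each group for its baseline advantage.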