Background: The regular assessment of hormonal and mood-state parameters in professional soccer is proposed as a good indicator during periods of intense training and/or competition to avoid overtraining.
Objective: The aim of this study was to analyze hormonal, psychological, workload and physical fitness parameters in elite soccer players in relation to changes in training and match exposure during a congested period of match play.
Methods: Sixteen elite soccer players from a team playing in the first Tunisian soccer league were evaluated three times (T1, T2, and T3) over 12 weeks. The non-congested period of match play was from T1 to T2, when the players played 6 games over 6 weeks. The congested period was from T2 to T3, when the players played 10 games over 6 weeks. From T1 to T3, players performed the Yo-Yo intermittent recovery test level 1 (YYIR1), the repeated shuttle sprint ability test (RSSA), the countermovement jump test (CMJ), and the squat jump test (SJ). Plasma Cortisol (C), Testosterone (T), and the T/C ratio were analyzed at T1, T2, and T3. Players had their mood dimensions (tension, depression, anger, vigor, fatigue, confusion, and a Total Mood Disturbance) assessed through the Profile of Mood State questionnaire (POMS). Training session rating of perceived exertion (sRPE) was also recorded on a daily basis in order to quantify internal training load and elements of monotony and strain.
Results: Significant performance declines (T1 < T2 < T3) were found for SJ performance (p = 0.04, effect size [ES] ES₁₋₂ = 0.15−0.06, ES₂₋₃ = 0.24) from T1 to T3. YYIR1 performance improved significantly from T1 to T2 and declined significantly from T2 to T3 (p = 0.001, ES₁₋₂ = 0.24, ES₂₋₃ = −2.54). Mean RSSA performance was significantly higher (p = 0.019, ES₁₋₂ = −0.47, ES₂₋₃ = 1.15) in T3 compared with T2 and T1. Best RSSA performance was significantly higher in T3 than in T2 and T1 (p = 0.006, ES₂₋₃ = 0.47, ES₁₋₂ = −0.56), but significantly lower in T2 than in T1. T and T/C were significantly lower in T3 than in T2 and T1 (T: p = 0.03, ES₃₋₂ = −0.51, ES₃₋₁ = −0.51; T/C: p = 0.017, ES₃₋₂ = −1.1, ES₃₋₁ = −1.07). Vigor scores decreased significantly in T3 compared with T2 and T1 (p = 0.002, ES₁₋₂ = 0.31, ES₃₋₂ = −1.25). Fatigue scores increased significantly in T3 compared with T1 and T2 (p = 0.002, ES₁₋₂ = 0.43, ES₂₋₃ = 0.81). Tension (p = 0.002, ES₁₋₂ = 1.1, ES₂₋₃ = 0.2) and anger scores (p = 0.03, ES₁₋₂ = 0.47, ES₂₋₃ = 0.33) increased significantly over the study period (T1 < T2 < T3). Total mood disturbance increased significantly (p = 0.02, ES₁₋₂ = 0.91, ES₂₋₃ = 1.1) from T1 to T3. Between T1 and T2, significant relationships were observed between workload and changes in T (r = 0.66, p = 0.003) and the T/C ratio (r = 0.62, p = 0.01). There were significant relationships between RSSAbest performance and training load parameters (workload: r = 0.52, p = 0.03; monotony: r = 0.62, p = 0.01; strain: r = 0.62, p = 0.009). Between T2 and T3, significant relationships were found between Δ% of total mood disturbance and Δ% of YYIR1 (r = −0.54, p = 0.04), RSSAbest (r = 0.58, p = 0.01), SJ (r = −0.55, p = 0.01), T (r = 0.53, p = 0.03), and T/C (r = 0.5, p = 0.04).
Conclusion: An intensive period of congested match play significantly compromised elite soccer players’ physical and mental fitness. These changes were related to psychological but not hormonal parameters, even though significant alterations were detected for selected hormonal measures. Mood monitoring could be a simple and useful tool to determine the degree of preparedness for match play during a congested period in professional soccer.
Background: The standard method to treat physically active patients with anterior cruciate ligament (ACL) rupture is ligament reconstruction surgery. The rehabilitation training program is very important to improve functional performance in recreational athletes following ACL reconstruction.
Objectives: The aims of this study were to compare the effects of three different training programs, eccentric training (ECC), plyometric training (PLYO), or combined eccentric and plyometric training (COMB), on dynamic balance (Y-BAL), the Lysholm Knee Scale (LKS), the return to sport index (RSI), and the leg symmetry index (LSI) for the single leg hop test for distance in elite female athletes after ACL surgery.
Materials and Methods: Fourteen weeks after rehabilitation from surgery, 40 elite female athletes (20.3 ± 3.2 years), who had undergone an ACL reconstruction, participated in a short-term (6 weeks; two times a week) training study. All participants received the same rehabilitation protocol prior to the training study. Athletes were randomly assigned to three experimental groups, ECC (n = 10), PLYO (n = 10), and COMB (n = 10), and to a control group (CON: n = 10). Testing was conducted before and after the 6-week training programs and included the Y-BAL, LKS, and RSI. LSI was assessed after the 6-week training programs only.
Results: Adherence rate was 100% across all groups and no training or test-related injuries were reported. No significant between-group baseline differences (pre-6-week training) were observed for any of the parameters. Significant group-by-time interactions were found for Y-BAL (p < 0.001, ES = 1.73), LKS (p < 0.001, ES = 0.76), and RSI (p < 0.001, ES = 1.39). Contrast analysis demonstrated that COMB yielded significantly greater improvements in Y-BAL, LKS, and RSI (all p < 0.001), in addition to significantly better performances in LSI (all p < 0.001), than CON, PLYO, and ECC, respectively.
Conclusion: In conclusion, combined (eccentric/plyometric) training seems to represent the most effective training method as it exerts positive effects on both stability and functional performance in the post-ACL-surgical rehabilitation period of elite female athletes.
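The leg symmetry index (LSI) used above is conventionally the ratio of the involved to the uninvolved limb on the single-leg hop for distance; the sketch below uses that common convention (the abstract does not state the exact formula) with invented hop distances:

```python
# Hedged sketch of the leg symmetry index for the single-leg hop test
# (assumed convention: involved / uninvolved x 100; distances are made up).
def leg_symmetry_index(involved_cm, uninvolved_cm):
    return involved_cm / uninvolved_cm * 100.0

lsi = leg_symmetry_index(152.0, 160.0)  # illustrative hop distances (cm)
```

An LSI of 90% or more is often used in practice as one return-to-sport criterion after ACL reconstruction, which is why the index is reported alongside RSI and LKS.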
Sprint and jump performances in highly trained young soccer players of different chronological age
(2020)
Objective
The aim of this study was to examine the effects of two different sprint-training regimes on sprint and jump performances according to age in elite young male soccer players over the course of one soccer season.
Methods
Players were randomly assigned to two training groups. Group 1 performed systematic change-of-direction sprints (CODST, U19 [n = 9], U17 [n = 9], U15 [n = 10]) while group 2 conducted systematic linear sprints (LST, U19 [n = 9], U17 [n = 9], U15 [n = 9]). Training volumes were similar between groups (40 sprints per week x 30 weeks = 1200 sprints per season). Pre and post training, all players performed tests for the assessment of linear and slalom sprint speed (5-m and 10-m), countermovement jump, and maximal aerobic speed performance.
Results
For all physical fitness measures, the baseline-adjusted means data (ANCOVA) across the age groups showed no significant differences between LST and CODST at post (0.061 < p < 0.995; 0.0017 < d < 1.01). The analyses of baseline-adjusted means for all physical fitness measures for U15, U17, and U19 (LST vs. CODST) revealed no significant differences between LST and CODST for U15 (0.213 < p < 0.917; 0.001 < d < 0.087), U17 (0.132 < p < 0.976; 0.001 < d < 0.310), and U19 (0.300 < p < 0.999; 0.001 < d < 0.049) at post.
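The d values reported here are Cohen's d effect sizes; the sketch below shows the standard pooled-SD formula (an assumed convention — the abstract does not specify which variant was used), with invented sprint times:

```python
# Hedged sketch of Cohen's d with a pooled standard deviation
# (assumed formula; the group values below are purely illustrative).
import math
from statistics import mean, stdev

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (mean(group1) - mean(group2)) / pooled

# Illustrative 10-m sprint times (s) for two training groups
d = cohens_d([1.45, 1.50, 1.48], [1.52, 1.55, 1.50])
```

Reporting d alongside p is what lets the abstract distinguish "no significant difference" from "trivially small difference" across the U15–U19 comparisons.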
Conclusions
The results from this study showed that both LST and CODST induced significant changes in sprint, lower-limb power, and aerobic performance in young elite soccer players. Since no significant differences were observed between LST and CODST, the observed changes are most likely due to training and/or maturation. Therefore, more research is needed to elucidate whether CODST, LST, or a combination of both is beneficial for youth soccer athletes’ performance development.
Background: We assessed the effects of gender, in association with a four-week small-sided games (SSGs) training program during Ramadan intermittent fasting (RIF), on changes in psychometric and physiological markers in professional male and female basketball players.
Methods: Twenty-four professional basketball players from the Tunisian first division participated in this study. The players were dichotomized by sex (males [GM, n = 12]; females [GF, n = 12]). Both groups completed a 4-week SSGs training program with three sessions per week. Psychometric (e.g., quality of sleep, fatigue, stress, and delayed onset muscle soreness [DOMS]) and physiological parameters (e.g., heart rate, blood lactate) were measured during the first week (baseline) and at the end of RIF (post-test).
Results: Post hoc tests showed a significant increase in stress levels in both groups (GM [−81.11%; p < 0.001, d = 0.33, small]; GF [−36.53%; p = 0.001, d = 0.25, small]). Concerning physiological parameters, ANCOVA revealed significantly lower heart rates in favor of GM at post-test (1.70%, d = 0.38, small, p = 0.002).
Conclusions: Our results showed that SSGs training at the end of RIF negatively impacted psychometric parameters of male and female basketball players. It can be concluded that there are sex-mediated effects of training during RIF in basketball players, and this should be considered by researchers and practitioners when programming training during RIF.
The purpose of this study was to examine the test-retest reliability, and convergent and discriminative validity of a new taekwondo-specific change-of-direction (COD) speed test with striking techniques (TST) in elite taekwondo athletes. Twenty (10 males and 10 females) elite (athletes who compete at national level) and top-elite (athletes who compete at national and international level) taekwondo athletes with an average training background of 8.9 ± 1.3 years of systematic taekwondo training participated in this study. During the two-week test-retest period, various generic performance tests measuring COD speed, balance, speed, and jump performance were carried out during the first week and as a retest during the second week. Three TST trials were conducted with each athlete and the best trial was used for further analyses. The relevant performance measure derived from the TST was the time with striking penalty (TST-TSP). TST-TSP performances amounted to 10.57 ± 1.08 s for males and 11.74 ± 1.34 s for females. The reliability analysis of the TST performance was conducted after logarithmic transformation, in order to address the problem of heteroscedasticity. In both groups, the TST demonstrated a high relative test-retest reliability (intraclass correlation coefficients and 90% compatibility limits were 0.80 and 0.47 to 0.93, respectively). For absolute reliability, the TST’s typical error of measurement (TEM), 90% compatibility limits, and magnitudes were 4.6%, 3.4 to 7.7, for males, and 5.4%, 3.9 to 9.0, for females. The homogeneous sample of taekwondo athletes meant that the TST’s TEM exceeded the usual smallest important change (SIC) with 0.2 effect size in the two groups. 
The new test showed mostly very large correlations with linear sprint speed (r = 0.71 to 0.85) and dynamic balance (r = −0.71 and −0.74), large correlations with COD speed (r = 0.57 to 0.60) and vertical jump performance (r = −0.50 to −0.65), and moderate correlations with horizontal jump performance (r = −0.34 to −0.45) and static balance (r = −0.39 to −0.44). Top-elite athletes showed better TST performances than their elite counterparts. Receiver operating characteristic analysis indicated that the TST effectively discriminated between top-elite and elite taekwondo athletes. In conclusion, the TST is a valid and sensitive test to evaluate COD speed with taekwondo-specific skills, and reliable when considering the ICC and TEM. Although the usefulness of the TST for detecting small performance changes in the present population is questionable, it can detect moderate changes in taekwondo-specific COD speed.
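The absolute-reliability statistics named above can be sketched with their standard definitions (assumed here, since the abstract does not give formulas): the typical error of measurement from test-retest differences, expressed as a CV%, and a smallest important change of 0.2 between-athlete SDs. The times below are invented for illustration:

```python
# Hedged sketch of TEM, TEM% (CV), and the smallest important change (SIC),
# using standard definitions; the test-retest times are purely illustrative.
import math
from statistics import mean, stdev

def tem(test1, test2):
    """Typical error = SD of test-retest differences / sqrt(2)."""
    diffs = [a - b for a, b in zip(test1, test2)]
    return stdev(diffs) / math.sqrt(2)

def tem_percent(test1, test2):
    """TEM expressed as a percentage of the grand mean (CV%)."""
    return tem(test1, test2) / mean(test1 + test2) * 100.0

def smallest_important_change(scores, es=0.2):
    """SIC = 0.2 x between-athlete SD (Hopkins-style convention)."""
    return es * stdev(scores)

t1 = [10.2, 10.8, 11.0, 10.5, 10.9]  # test times (s)
t2 = [10.4, 10.6, 11.2, 10.3, 11.0]  # retest times (s)
```

When the sample is homogeneous, the between-athlete SD (and hence the SIC) shrinks while the TEM does not, which is exactly why the abstract reports that the TST's TEM exceeded the SIC.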
There is controversy in the literature regarding the link between training load and injury rate. Thus, the aims of this non-interventional study were to evaluate the relationships between pre-season training load and biochemical markers, injury incidence, and performance during the first month of the competitive period in professional soccer players.
This study aimed to compare the training load of a professional under-19 soccer team (U-19) to that of an elite adult team (EAT) from the same club during the in-season period. Thirty-nine healthy soccer players (EAT [n = 20]; U-19 [n = 19]) were involved in the study, which spanned four weeks. Training load (TL) was monitored as external TL, using a global positioning system (GPS), and internal TL, using the rating of perceived exertion (RPE). TL data were recorded after each training session. During soccer matches, players’ RPEs were recorded. Internal TL was quantified daily by means of the session rating of perceived exertion (session-RPE) using Borg’s 0–10 scale. For GPS data, the selected running speed intensities (over 0.5 s time intervals) were 12–15.9 km/h, 16–19.9 km/h, 20–24.9 km/h, and >25 km/h (sprint). Distances covered between 16 and 19.9 km/h, >20 km/h, and >25 km/h were significantly higher in U-19 compared to EAT over the course of the study (p = 0.023, d = 0.243, small; p = 0.016, d = 0.298, small; and p = 0.001, d = 0.564, small, respectively). EAT players performed significantly fewer sprints per week than U-19 players (p = 0.002, d = 0.526, small). RPE was significantly higher in U-19 compared to EAT (p = 0.001, d = 0.188, trivial). The external and internal measures of TL were significantly higher in the U-19 group than in the EAT players. In conclusion, the results obtained show that the training load was greater in U-19 than in EAT.
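The GPS speed-zone distances above can be sketched as a simple binning of instantaneous speeds sampled every 0.5 s; the zone limits are taken from the abstract, while the binning logic itself is an assumed, simplified illustration of how such systems accumulate distance:

```python
# Hedged sketch of per-zone distance accumulation from GPS speeds
# (zone limits from the abstract; 0.5 s sampling assumed).
ZONES = [(12.0, 16.0), (16.0, 20.0), (20.0, 25.0), (25.0, float("inf"))]

def zone_distances(speeds_kmh, dt_s=0.5):
    """Accumulate distance (m) per speed zone from instantaneous speeds (km/h)."""
    dist = [0.0] * len(ZONES)
    for v in speeds_kmh:
        metres = v / 3.6 * dt_s  # distance covered during one sample interval
        for i, (lo, hi) in enumerate(ZONES):
            if lo <= v < hi:
                dist[i] += metres
                break
    return dist
```

Speeds below 12 km/h fall outside every zone and contribute nothing, matching the abstract's focus on moderate- and high-speed running only.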
Background: Change-of-direction (CoD) is a necessary physical ability of a field sport and may vary in youth players according to their maturation status.
Objectives: The aim of this study was to compare the effectiveness of a 6-week CoD training intervention on dynamic balance (CS-YBT), horizontal jump (5JT), speed (10- and 30-m linear sprint times), and CoD with (15 m-CoD + B) and without (15 m-CoD) the ball in youth male soccer players at different levels of maturity [pre- and post-peak height velocity (PHV)].
Materials and Methods: Thirty elite male youth soccer players aged 10–17 years from the Tunisian first division participated in this study. The players were divided into pre- (G1, n = 15) and post-PHV (G2, n = 15) groups. Both groups completed a similar 6-week training program with two sessions per week of four CoD exercises. All players completed the following tests before and after intervention: CS-YBT; 5 JT; 10, 30, and 15 m-CoD; and 15 m-CoD + B, and data were analyzed using ANCOVA.
Results: All 30 players completed the study according to the study design and methodology. Adherence rate was 100% across all groups, and no training or test-related injuries were reported. The pre-PHV and post-PHV groups showed significant improvements post-intervention for all dependent variables (post > pre; p < 0.01, d = 0.09–1.51). ANOVA revealed a significant group × time interaction only for CS-YBT (F = 4.45, p < 0.04, η² = 0.14), 5JT (F = 6.39, p < 0.02, η² = 0.18), and 15 m-CoD (F = 7.88, p < 0.01, η² = 0.22). CS-YBT, 5JT, and 15 m-CoD improved significantly more in the post-PHV group (+4.56%, effect size = 1.51; +4.51%, effect size = 1.05; and −3.08%, effect size = 0.51, respectively) than in the pre-PHV group (+2.77%, effect size = 0.85; +2.91%, effect size = 0.54; and −1.56%, effect size = 0.20, respectively).
Conclusion: The CoD training program improved balance, horizontal jump, and CoD without the ball in male preadolescent and adolescent soccer players, and this improvement was greater in the post-PHV players. The maturity status of the athletes should be considered when programming CoD training for soccer players.
Objective: MicroRNAs are implicated in several biological and pathological processes. We investigated the effects of high-intensity interval training (HIIT) and moderate-intensity continuous training (MICT) on molecular markers of diabetic cardiomyopathy in rats.
Methods: Eighteen male Wistar rats (260 ± 10 g; aged 8 weeks) with streptozotocin (STZ)-induced type 1 diabetes mellitus (55 mg/kg, IP) were randomly allocated to three groups: control, MICT, and HIIT. The two different training protocols were performed 5 days each week for 5 weeks. Cardiac performance (end-systolic and end-diastolic dimensions, ejection fraction), the expression of miR-206, HSP60, and markers of apoptosis (cleaved PARP and cytochrome C) were determined at the end of the exercise interventions.
Results: Both exercise interventions (HIIT and MICT) decreased blood glucose levels and improved cardiac performance, with greater changes in the HIIT group (p < 0.001, η² = 0.909). While the expression of miR-206 and apoptotic markers decreased with both training protocols (p < 0.001, η² = 0.967), HIIT caused greater reductions in apoptotic markers and a 20% greater reduction in miR-206 compared with the MICT protocol (p < 0.001). Furthermore, both training protocols enhanced the expression of HSP60 (p < 0.001, η² = 0.976), with a nearly 50% greater increase in the HIIT group compared with MICT.
Conclusions: Our results indicate that both exercise protocols, HIIT and MICT, have the potential to attenuate diabetic cardiomyopathy by modifying the expression of miR-206 and its downstream apoptotic targets. It seems, however, that HIIT is even more effective than MICT at modulating these molecular markers.