Strength training is an important means for performance development in young rowers. The purpose of this study was to examine the effects of a 9-week equal volume heavy-resistance strength training (HRST) versus strength endurance training (SET) in addition to regular rowing training on primary (e.g., maximal strength/power) and secondary outcomes (e.g., balance) in young rowers. Twenty-six female elite adolescent rowers were assigned to an HRST (n = 12; age: 13.2 ± 0.5 yrs; maturity-offset: +2.0 ± 0.5 yrs) or a SET group (n = 14; age: 13.1 ± 0.5 yrs; maturity-offset: +2.1 ± 0.5 yrs). HRST and SET comprised lower-limb (i.e., leg press/knee flexion/extension), upper-limb (i.e., bench press/pull; lat-pull down), and complex exercises (i.e., rowing ergometer). HRST performed four sets with 12 repetitions per set at an intensity of 75–95% of the one-repetition maximum (1-RM). SET conducted four sets with 30 repetitions per set at 50–60% of the 1-RM. Training volume was matched for overall repetitions × intensity × training sessions per week. Before and after training, tests were performed for the assessment of primary [i.e., maximal strength (e.g., bench pull/knee flexion/extension 1-RM/isometric handgrip test), muscle power (e.g., medicine-ball push test, triple hop, drop jump, and countermovement jump), anaerobic endurance (400-m run), sport-specific performance (700-m rowing ergometer trial)] and secondary outcomes [dynamic balance (Y-balance test), change-of-direction (CoD) speed (multistage shuttle-run test)]. The adherence rate was >87% and one athlete of each group dropped out. Overall, 24 athletes completed the study and no test- or training-related injuries occurred. Significant group × time interactions were observed for maximal strength, muscle power, anaerobic endurance, CoD speed, and sport-specific performance (p ≤ 0.05; 0.45 ≤ d ≤ 1.11).
Post hoc analyses indicated larger gains in maximal strength and muscle power following HRST (p ≤ 0.05; 1.81 ≤ d ≤ 3.58) compared with SET (p ≤ 0.05; 1.04 ≤ d ≤ 2.30). Furthermore, SET (p ≤ 0.01; d = 2.08) resulted in larger gains in sport-specific performance compared with HRST (p < 0.05; d = 1.30). Only HRST produced significant pre-post improvements for anaerobic endurance and CoD speed (p ≤ 0.05; 1.84 ≤ d ≤ 4.76). In conclusion, HRST in addition to regular rowing training was more effective than SET to improve selected measures of physical fitness (i.e., maximal strength, muscle power, anaerobic endurance, and CoD speed), whereas SET was more effective than HRST to enhance sport-specific performance in female elite young rowers.
The purpose of this systematic review with meta-analysis was to examine the effects of strength training (ST) on selected components of physical fitness (e.g., lower/upper limb maximal strength, muscular endurance, jump performance, cardiorespiratory endurance) and sport-specific performance in rowers. Only studies with an active control group were included if they examined the effects of ST on at least one proxy of physical fitness and/or sport-specific performance in rowers. Weighted and averaged standardized mean differences (SMD) were calculated using random-effects models. Subgroup analyses were computed to identify effects of ST type or expertise level on sport-specific performance. Our analyses revealed significant small effects of ST on lower limb maximal strength (SMD = 0.42, p = 0.05) and on sport-specific performance (SMD = 0.32, p = 0.05). Non-significant effects were found for upper limb maximal strength, upper/lower limb muscular endurance, jump performance, and cardiorespiratory endurance. Subgroup analyses for ST type and expertise level showed non-significant differences between the respective subgroups of rowers (p ≥ 0.32). Our systematic review with meta-analysis indicated that ST is an effective means for improving lower limb maximal strength and sport-specific performance in rowers. However, ST-induced effects are modulated neither by ST type nor by rowers' expertise level.
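The two computational steps named in this abstract — per-study standardized mean differences and random-effects pooling — can be sketched as follows. This is a minimal illustration only: the DerSimonian–Laird estimator used here is one common random-effects method, and the abstract does not state which estimator the authors applied.

```python
import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Cohen's d) using the pooled SD of two groups."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def pool_random_effects(effects, variances):
    """Pool per-study SMDs with a DerSimonian-Laird random-effects model.

    effects   -- list of per-study SMDs
    variances -- list of their sampling variances
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # heterogeneity Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

With homogeneous effects (Q below its degrees of freedom), tau² collapses to zero and the pooled SMD reduces to the fixed-effect average, which is a useful sanity check on an implementation.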
This study aimed to investigate the relationship between the acute to chronic workload ratio (ACWR), based upon participant session rating of perceived exertion (sRPE), using two models [(1) rolling averages (ACWRRA); and (2) exponentially weighted moving averages (ACWREWMA)] and the injury rate in young male team soccer players aged 17.1 ± 0.7 years during a competitive mesocycle. Twenty-two players were enrolled in this study and performed four training sessions per week with 2 days of recovery and 1 match day per week. During each training session and each weekly match, training time and sRPE were recorded. In addition, training impulse (TRIMP), monotony, and strain were subsequently calculated. The rate of injury was recorded for each soccer player over a period of 4 weeks (i.e., 28 days) using a daily questionnaire. The results showed that over the course of the study, the number of non-contact injuries was significantly higher than that for contact injuries (2.5 vs. 0.5, p = 0.01). There were also significant positive correlations between sRPE and training time (r = 0.411, p = 0.039), ACWRRA (r = 0.47, p = 0.049), and ACWREWMA (r = 0.51, p = 0.038). In addition, small-to-medium correlations were detected between ACWR and non-contact injury occurrence (ACWRRA, r = 0.31, p = 0.05; ACWREWMA, r = 0.53, p = 0.03). Explained variance (r²) for non-contact injury was significantly greater using the ACWREWMA model (ranging between 21 and 52%) compared with ACWRRA (ranging between 17 and 39%). In conclusion, the results of this study showed that the ACWREWMA model is more sensitive than ACWRRA to identify non-contact injury occurrence in male team soccer players during a short period in the competitive season.
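The two ACWR models compared in this abstract can be sketched from daily sRPE loads (RPE × session minutes). This is an illustrative sketch under common conventions (7-day acute and 28-day chronic windows, EWMA decay λ = 2/(span + 1)); the exact window lengths and smoothing constants used in the study are assumptions here.

```python
import numpy as np

def acwr_rolling(loads, acute=7, chronic=28):
    """ACWR via rolling averages: mean of the last `acute` days over the mean
    of the last `chronic` days of daily load (e.g., sRPE x minutes)."""
    loads = np.asarray(loads, dtype=float)
    return loads[-acute:].mean() / loads[-chronic:].mean()

def ewma(loads, span):
    """Exponentially weighted moving average with lambda = 2 / (span + 1)."""
    lam = 2.0 / (span + 1)
    out = float(loads[0])
    for x in loads[1:]:
        out = lam * x + (1 - lam) * out   # recent days weighted more heavily
    return out

def acwr_ewma(loads, acute=7, chronic=28):
    """ACWR via exponentially weighted moving averages (acute EWMA / chronic EWMA)."""
    return ewma(loads, acute) / ewma(loads, chronic)
```

For a constant daily load both models return 1.0, and a sudden spike in the last week pushes the ratio above 1.0 — the regime typically flagged as elevated injury risk.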
The relationship between residual stresses and microstructure associated with a laser powder bed fusion (LPBF) IN718 alloy has been investigated on specimens produced with three different scanning strategies (unidirectional Y-scan, 90 degrees XY-scan, and 67 degrees Rot-scan). Synchrotron X-ray energy-dispersive diffraction (EDXRD) combined with optical profilometry was used to study residual stress (RS) distribution and distortion upon removal of the specimens from the baseplate. The microstructural characterization of both the bulk and the near-surface regions was conducted using scanning electron microscopy (SEM) and electron backscatter diffraction (EBSD). On the top surfaces of the specimens, the highest RS values are observed in the Y-scan specimen and the lowest in the Rot-scan specimen, while this tendency is reversed on the lateral surfaces. A considerable amount of RS remains in the specimens after their removal from the baseplate, especially in the Y- and Z-direction (short specimen dimension and building direction (BD), respectively). The distortion measured on the top surface following baseplate thinning and subsequent removal is mainly attributed to the amount of RS released in the build direction. Importantly, it is observed that the additive manufacturing microstructures challenge the use of classic theoretical models for the calculation of diffraction elastic constants (DEC) required for diffraction-based RS analysis. It is found that when the Reuss model is used for the calculation of RS for different crystal planes, as opposed to the conventionally used Kroner model, the results exhibit lower scatter. This is discussed in the context of experimental measurements of DEC available in the literature for conventional and additively manufactured Ni-base alloys.
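For a cubic crystal such as the fcc matrix of IN718, the Reuss-model diffraction elastic constants mentioned above follow in closed form from the single-crystal compliances and the orientation factor Γ(hkl). The sketch below uses illustrative stiffness values assumed for a Ni-base fcc alloy; the actual single-crystal constants used in the study are not given in this abstract.

```python
# Assumed illustrative single-crystal stiffnesses (GPa) for a Ni-base fcc alloy;
# these are NOT the values from the study, only placeholders for the method.
C11, C12, C44 = 259.6, 179.0, 109.6

def cubic_compliances(c11, c12, c44):
    """Convert cubic single-crystal stiffnesses C_ij (GPa) to compliances S_ij (1/GPa)."""
    denom = (c11 - c12) * (c11 + 2 * c12)
    return (c11 + c12) / denom, -c12 / denom, 1.0 / c44

def reuss_dec(hkl, c11=C11, c12=C12, c44=C44):
    """Reuss-model diffraction elastic constants S1(hkl) and 1/2 S2(hkl).

    Gamma(hkl) = (h^2 k^2 + k^2 l^2 + l^2 h^2) / (h^2 + k^2 + l^2)^2
    """
    h, k, l = hkl
    gamma = (h*h*k*k + k*k*l*l + l*l*h*h) / (h*h + k*k + l*l) ** 2
    s11, s12, s44 = cubic_compliances(c11, c12, c44)
    s0 = s11 - s12 - 0.5 * s44            # elastic anisotropy term
    s1 = s12 + gamma * s0
    half_s2 = s11 - s12 - 3.0 * gamma * s0
    return s1, half_s2
```

Two standard limits make a quick check: for (100), Γ = 0 and ½S2 = S11 − S12; for (111), Γ = 1/3 and ½S2 = S44/2.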
Coaches and athletes in elite sports are constantly seeking to use innovative and advanced training strategies to efficiently improve strength/power performance in already highly-trained individuals. In this regard, high-intensity conditioning contractions have become a popular means to induce acute improvements primarily in muscle contractile properties, which are supposed to translate to subsequent power performances. This performance-enhancing physiological mechanism has previously been called postactivation potentiation (PAP). However, in contrast to the traditional mechanistic understanding of PAP that is based on electrically-evoked twitch properties, an increasing number of studies used the term PAP while referring to acute performance enhancements, even if physiological measures of PAP were not directly assessed. In this current opinion article, we compare the two main approaches (i.e., mechanistic vs. performance) used in the literature to describe PAP effects. We additionally discuss potential misconceptions in the general use of the term PAP. Studies showed that mechanistic and performance-related PAP approaches have different characteristics in terms of the applied research field (basic vs. applied), effective conditioning contractions (e.g., stimulated vs. voluntary), verification (lab-based vs. field tests), effects (twitch peak force vs. maximal voluntary strength), occurrence (consistent vs. inconsistent), and time course (largest effect immediately after vs. ~7 min after the conditioning contraction). Moreover, cross-sectional studies revealed inconsistent and trivial-to-large-sized associations between selected measures of mechanistic (e.g., twitch peak force) vs. performance-related PAP approaches (e.g., jump height). In an attempt to avoid misconceptions related to the two different PAP approaches, we propose to use two different terms.
Postactivation potentiation should only be used to indicate the increase in muscular force/torque production during an electrically-evoked twitch. In contrast, postactivation performance enhancement (PAPE) should be used to refer to the enhancement of measures of maximal strength, power, and speed following conditioning contractions. The implementation of this terminology would help to better differentiate between mechanistic and performance-related PAP approaches. This is important from a physiological point of view, but also when it comes to aggregating findings from PAP studies, e.g., in the form of meta-analyses, and translating these findings to the field of strength and conditioning.
The aim of this study was to establish maturation-, age-, and sex-specific anthropometric and physical fitness percentile reference values of young elite athletes from various sports. Anthropometric (i.e., standing and sitting body height, body mass, body mass index) and physical fitness (i.e., countermovement jump, drop jump, change-of-direction speed [i.e., T-test], trunk muscle endurance [i.e., ventral Bourban test], dynamic lower limbs balance [i.e., Y-balance test], hand grip strength) data of 703 male and female elite young athletes aged 8–18 years were collected to aggregate reference values according to maturation, age, and sex. Findings indicate that body height and mass were significantly higher (p<0.001; 0.95≤d≤1.74) in more compared to less mature young athletes as well as with increasing chronological age (p<0.05; 0.66≤d≤3.13). Furthermore, male young athletes were significantly taller and heavier compared to their female counterparts (p<0.001; 0.34≤d≤0.50). In terms of physical fitness, post-pubertal athletes showed better countermovement jump, drop jump, change-of-direction, and handgrip strength performances (p<0.001; 1.57≤d≤8.72) compared to pubertal athletes. Further, countermovement jump, drop jump, change-of-direction, and handgrip strength performances increased with increasing chronological age (p<0.05; 0.29≤d≤4.13). In addition, male athletes outperformed their female counterparts in the countermovement jump, drop jump, change-of-direction, and handgrip strength (p<0.05; 0.17≤d≤0.76). Significant age by sex interactions indicate that sex-specific differences were even more pronounced with increasing age. In conclusion, body height, body mass, and physical fitness increased with increasing maturational status and chronological age. Sex-specific differences appear to be larger as youth grow older. Practitioners can use the percentile values as approximate benchmarks for talent identification and development.
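Percentile reference values of the kind described above are computed per maturation/age/sex group. A minimal sketch, assuming only that benchmarks are plain empirical percentiles of each group's raw scores (the study's exact percentile set and smoothing, if any, are not stated in the abstract):

```python
import numpy as np

def percentile_benchmarks(values, pcts=(3, 10, 25, 50, 75, 90, 97)):
    """Empirical percentile reference values for one age/sex/maturation group.

    values -- raw test scores (e.g., countermovement jump heights in cm)
    pcts   -- percentile ranks to tabulate (an assumed, typical set)
    """
    return {p: float(np.percentile(values, p)) for p in pcts}
```

A practitioner would then locate an individual athlete's score between the tabulated ranks of the matching group to judge whether performance is below, at, or above the group norm.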
Performance- and health-related benefits of youth resistance training
There is ample evidence that youth resistance training (RT) is safe, joyful, and effective for different markers of performance (e.g., muscle strength, power, linear sprint speed) and health (e.g., injury prevention). Accordingly, the first aim of this narrative review is to present and discuss the relevance of muscle strength for youth physical development. The second purpose is to report evidence on the effectiveness of RT on muscular fitness (muscle strength, power, muscle endurance), on movement skill performance and injury prevention in youth. There is evidence that RT is effective in enhancing measures of muscle fitness in children and adolescents, irrespective of sex. Additionally, numerous studies indicate that RT has positive effects on fundamental movement skills (e.g., jumping, running, throwing) in youth regardless of age, maturity, training status, and sex. Further, irrespective of age, sex, and training status, regular exposure to RT (e.g., plyometric training) decreases the risk of sustaining injuries in youth. This implies that RT should be a meaningful element of youths’ exercise programming. This has been acknowledged by global (e.g., World Health Organization) and national (e.g., National Strength and Conditioning Association) health- and performance-related organizations, which is why they recommend performing RT as an integral part of weekly exercise programs to promote muscular strength and fundamental movement skills and to prevent injuries in youth.
Electroencephalographic (EEG) research indicates changes in adults' low frequency bands of frontoparietal brain areas executing different balance tasks with increasing postural demands. However, this issue remains unresolved for adolescents performing the same balance task with increasing difficulty. Therefore, we examined the effects of a progressively increasing balance task difficulty on balance performance and brain activity in adolescents. Thirteen healthy adolescents aged 16-17 years performed tests in bipedal upright stance on a balance board with six progressively increasing levels of task difficulty. Postural sway and cortical activity were recorded simultaneously using a pressure sensitive measuring system and EEG. The power spectrum was analyzed for theta (4-7 Hz) and alpha-2 (10-12 Hz) frequency bands in pre-defined frontal, central, and parietal clusters of electrocortical sources. Repeated measures analysis of variance (rmANOVA) showed a significant main effect of task difficulty for postural sway (p < 0.001; d = 6.36). Concomitantly, the power spectrum changed in frontal, bilateral central, and bilateral parietal clusters. RmANOVAs revealed significant main effects of task difficulty for theta band power in the frontal (p < 0.001, d = 1.80) and both central clusters (left: p < 0.001, d = 1.49; right: p < 0.001, d = 1.42) as well as for alpha-2 band power in both parietal clusters (left: p < 0.001, d = 1.39; right: p < 0.001, d = 1.05) and in the central right cluster (p = 0.005, d = 0.92). Increases in theta band power (frontal, central) and decreases in alpha-2 power (central, parietal) with increasing balance task difficulty may reflect increased attentional processes and/or error monitoring as well as increased sensory information processing due to increasing postural demands. In general, our findings are mostly in agreement with studies conducted in adults.
Similar to adult studies, our data with adolescents indicated the involvement of frontoparietal brain areas in the regulation of postural control. In addition, we detected that activity of selected brain areas (e.g., bilateral central) changed with increasing postural demands.
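The band-power quantity underlying these results can be illustrated with a simple periodogram. This is only a single-channel sketch of the general technique; the study analyzed clusters of electrocortical sources, and the sampling rate and spectral estimator below are assumptions, not the study's pipeline.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Absolute power of `signal` in the band [f_lo, f_hi] Hz, estimated from
    a simple periodogram (squared magnitude of the real FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return float(psd[mask].sum())

# Example: a synthetic 5 Hz oscillation concentrates its power in the
# theta band (4-7 Hz) rather than the alpha-2 band (10-12 Hz).
fs = 250.0                                  # assumed sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
theta_like = np.sin(2 * np.pi * 5.0 * t)
```

In practice, Welch-style averaging over windowed segments is usually preferred over a raw periodogram for EEG, since it trades frequency resolution for a lower-variance power estimate.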
Everything back to the (study) start? Factors in study success during the entry phase and at the midpoint of the degree
(2020)
High dropout rates, particularly during the study entry phase, have prompted universities in Germany to introduce a wide range of measures, although little is known so far about their effects. This article presents findings from a research project focused on the study entry phase, supplemented by analyses of the middle phase of the degree, whose aim was to identify the conditions for a successful start to university study and to derive recommendations for optimizing the entry phase. In addition to qualitative studies, the research design centered on a quantitative longitudinal survey at five universities (Potsdam, Mainz, Magdeburg, Kiel, and Greifswald). The analyses confirmed the guiding hypothesis that entry-phase measures contribute to greater study success above all when they foster academic and social integration into the university. Accordingly, factors such as identification with the field of study, self-efficacy, career- and achievement-oriented learning motivation, and academic integration are particularly important for study success. A positive influence of the social climate and of the research and practical orientation of teaching on study satisfaction was also demonstrated. Further analyses of the middle phase of the degree show that the same factors play a role in both phases (entry and midpoint), though with partly different weightings. Social integration, for instance, is a key predictor in both phases: in the entry phase it mainly concerns integration into the student body, and later in the course of study (midpoint) integration into the academic community (in the form of teaching staff). The opening question must therefore be answered as follows: yes, everything back to the start, but then with efforts to fully ensure the social and academic integration of all students.
The findings also draw attention to the apparently underestimated role of utilization-oriented motives.
Some studies reveal that adolescents with intellectual disabilities and developmental disabilities are more likely to be victims of both face-to-face bullying and cyberbullying. Research also suggests that these adolescents are likely to witness bullying victimization. More research is needed to better understand the negative outcomes associated with their experiences. The purpose of this short-term longitudinal study was to investigate the buffering effect of parental social support on the associations of cyberbullying victimization and bystanding to subjective health complaints, suicidal ideation, and non-suicidal self-harm. Participants were 121 adolescents (63% male; Mage = 14.10 years) with intellectual disabilities and developmental disorders who completed questionnaires on their face-to-face and cyberbullying victimization and bystanding, parental social support, subjective health complaints, suicidal ideation, and non-suicidal self-harm during the 7th grade (Time 1). In 8th grade (Time 2), they completed questionnaires on subjective health complaints, suicidal ideation, and non-suicidal self-harm. The findings revealed that the positive associations between Time 1 cyberbullying victimization and Time 2 subjective health complaints, suicidal ideation, and non-suicidal self-harm were stronger at lower levels of Time 1 parental social support, while high levels of Time 1 parental social support diminished these relationships. Similar patterns were found for Time 1 cyberbullying bystanding and Time 2 subjective health complaints. Parental social support has a buffering effect on the relationships among cyberbullying victimization, bystanding, and health outcomes among adolescents with intellectual and developmental disorders.
Background:
Cyberhate is a growing form of online aggression against a person or a group based on race, ethnicity, nationality, sexual orientation, gender, religion, or disability. The present study aims to examine the psychometric properties of the Coping with Cyberhate Questionnaire, the prevalence of coping strategies in Spanish adolescents, differences in coping strategies based on sex, age, and victim status, and the association between coping with cyberhate and adolescents' mental well-being.
Method:
The sample consisted of 1,005 adolescents between 12 and 18 years old (Mage = 14.28 years, SD = 1.63; 51.9% girls) who completed self-report measures on coping strategies, victimization status, and mental well-being.
Results:
The results of confirmatory factor analyses showed a structure for the Coping with Cyberhate Questionnaire composed of six factors, namely Distal advice, Assertiveness, Helplessness/Self-blame, Close support, Technical coping, and Retaliation. The questionnaire demonstrated acceptable internal consistency. The three most frequently endorsed coping strategies were Technical coping, Close support, and Assertiveness. In addition, lower Helplessness/Self-blame and higher Close support, Assertiveness, and Distal advice were significantly related to adolescents' better mental well-being.
Conclusion:
Prevention programs that educate adolescents about how to deal with cyberhate are needed.