International organizations (IOs) experience significant variation in their decision-making performance, or the extent to which they produce policy output. While some IOs are efficient decision-making machineries, others are plagued by deadlock. How can such variation be explained? Examining this question, the article makes three central contributions. First, we approach performance by looking at IO decision-making in terms of policy output and introduce an original measure of decision-making performance that captures annual growth rates in IO output. Second, we offer a novel theoretical explanation for decision-making performance. This account highlights the role of institutional design, pointing to how majoritarian decision rules, delegation of authority to supranational institutions, and access for transnational actors (TNAs) interact to affect decision-making. Third, we offer the first comparative assessment of the decision-making performance of IOs. While previous literature addresses single IOs, we explore decision-making across a broad spectrum of 30 IOs from 1980 to 2011. Our analysis indicates that IO decision-making performance varies across and within IOs. We find broad support for our theoretical account, showing the combined effect of institutional design features in shaping decision-making performance. Notably, TNA access has a positive effect on decision-making performance when pooling is greater, and delegation has a positive effect when TNA access is higher. We also find that pooling has an independent, positive effect on decision-making performance. All in all, these findings suggest that the institutional design of IOs matters for their decision-making performance, though primarily in more complex ways than expected in earlier research.
Purpose: To examine the effects of loaded (LPJT) versus unloaded plyometric jump training (UPJT) programs on measures of muscle power, speed, change of direction (CoD), and kicking-distance performance in prepubertal male soccer players. Methods: Participants (N = 29) were randomly assigned to an LPJT group (n = 13; age = 13.0 [0.7] y) using weighted vests or a UPJT group (n = 16; age = 13.0 [0.5] y) using body mass only. Before and after the intervention, tests for the assessment of proxies of muscle power (ie, countermovement jump, standing long jump); speed (ie, 5-, 10-, and 20-m sprint); CoD (ie, Illinois CoD test, modified 505 agility test); and kicking-distance were conducted. Data were analyzed using magnitude-based inferences. Results: Within-group analyses for the LPJT group showed large and very large improvements for 10-m sprint time (effect size [ES] = 2.00) and modified 505 CoD (ES = 2.83) tests, respectively. For the same group, moderate improvements were observed for the Illinois CoD test (ES = 0.61), 5- and 20-m sprint time test (ES = 1.00 for both the tests), countermovement jump test (ES = 1.00), and the maximal kicking-distance test (ES = 0.90). Small enhancements in the standing long jump test (ES = 0.50) were apparent. Regarding the UPJT group, small improvements were observed for all tests (ES = 0.33-0.57), except 5- and 10-m sprint time (ES = 1.00 and 0.63, respectively). Between-group analyses favored the LPJT group for the modified 505 CoD (ES = 0.61), standing long jump (ES = 0.50), and maximal kicking-distance tests (ES = 0.57), but not for the 5-m sprint time test (ES = 1.00). Only trivial between-group differences were shown for the remaining tests (ES = 0.00-0.09). Conclusion: Overall, LPJT appears to be more effective than UPJT in improving measures of muscle power, speed, CoD, and kicking-distance performance in prepubertal male soccer players.
Background:
COVID-19 has infected millions of people worldwide and is responsible for several hundred thousand fatalities. The COVID-19 pandemic has necessitated thoughtful resource allocation and early identification of high-risk patients. However, effective methods to meet these needs are lacking.
Objective:
The aims of this study were to analyze the electronic health records (EHRs) of patients who tested positive for COVID-19 and were admitted to hospitals in the Mount Sinai Health System in New York City; to develop machine learning models for making predictions about the hospital course of the patients over clinically meaningful time horizons based on patient characteristics at admission; and to assess the performance of these models at multiple hospitals and time points.
Methods:
We used Extreme Gradient Boosting (XGBoost) and baseline comparator models to predict in-hospital mortality and critical events at time windows of 3, 5, 7, and 10 days from admission. Our study population included harmonized EHR data from five hospitals in New York City for 4098 COVID-19-positive patients admitted from March 15 to May 22, 2020. The models were first trained on patients from a single hospital (n=1514) before or on May 1, externally validated on patients from four other hospitals (n=2201) before or on May 1, and prospectively validated on all patients after May 1 (n=383). Finally, we established model interpretability to identify and rank variables that drive model predictions.
Results:
Upon cross-validation, the XGBoost classifier outperformed baseline models, with an area under the receiver operating characteristic curve (AUC-ROC) for mortality of 0.89 at 3 days, 0.85 at 5 and 7 days, and 0.84 at 10 days. XGBoost also performed well for critical event prediction, with an AUC-ROC of 0.80 at 3 days, 0.79 at 5 days, 0.80 at 7 days, and 0.81 at 10 days. In external validation, XGBoost achieved an AUC-ROC of 0.88 at 3 days, 0.86 at 5 days, 0.86 at 7 days, and 0.84 at 10 days for mortality prediction. Similarly, the unimputed XGBoost model achieved an AUC-ROC of 0.78 at 3 days, 0.79 at 5 days, 0.80 at 7 days, and 0.81 at 10 days. Trends in performance on prospective validation sets were similar. At 7 days, acute kidney injury on admission, elevated LDH, tachypnea, and hyperglycemia were the strongest drivers of critical event prediction, while higher age, anion gap, and C-reactive protein were the strongest drivers of mortality prediction.
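To make the reported metric concrete, here is a minimal sketch (with hypothetical risk scores and labels, not the study's data) of how AUC-ROC can be computed: it equals the probability that a randomly chosen positive case receives a higher risk score than a randomly chosen negative case.

```python
def auc_roc(scores, labels):
    """Rank-based AUC-ROC: fraction of (positive, negative) pairs
    in which the positive case is scored higher (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted mortality risks for six admissions.
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]  # 1 = event within the time window
print(round(auc_roc(scores, labels), 3))  # 8 of 9 pairs ordered correctly
```

In the study, gradient-boosted tree classifiers (XGBoost) are evaluated with this metric at the 3-, 5-, 7-, and 10-day windows; the pairwise definition above is equivalent to the area under the ROC curve.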
Conclusions:
We trained and validated machine learning models, externally and prospectively, for predicting mortality and critical events in patients with COVID-19 at different time horizons. These models identified at-risk patients and uncovered underlying relationships that predicted outcomes.
Strategic management is the deliberate engagement of an administration with the challenges of fulfilling its mission and of ensuring and improving its ability to act, by clarifying measures of success, building an understanding of how to influence patterns of action, and fostering organizational learning. In this respect, it is not just about planning, but about understanding the emerging strategies of the administration in fulfilling its tasks and about using opportunities for performance improvement, taking into account stakeholder expectations, the resource base, and organizational capabilities.
Findings in the extant literature are mixed concerning when and how gender diversity benefits team performance. We develop and test a model that posits that gender-diverse teams outperform gender-homogeneous teams when perceived time pressure is low, whereas the opposite is the case when perceived time pressure is high. Drawing on the categorization-elaboration model (CEM; van Knippenberg, De Dreu, & Homan, 2004), we begin with the assumption that information elaboration is the process whereby gender diversity fosters positive effects on team performance. However, also in line with the CEM, we argue that this process can be disrupted by adverse team dynamics. Specifically, we argue that as time pressure increases, higher gender diversity leads to more team withdrawal, which, in turn, moderates the positive indirect effect of gender diversity on team performance via information elaboration such that this effect becomes weaker as team withdrawal increases. In an experimental study of 142 four-person teams, we found support for this model that explains why perceived time pressure affects the performance of gender-diverse teams more negatively than that of gender-homogeneous teams. Our study sheds new light on when and how gender diversity can become either an asset or a liability for team performance.
Purpose: Paradoxical leadership (PL) is an emerging perspective to understand how leaders help followers deal with paradoxical demands. Recently, the positive relationship between PL and follower performance was established. This paper builds on and extends this research by interpreting PL as sensegiving and developing theory about mediation in the relationship between PL and adaptive and proactive performance. Design/methodology/approach: The paper develops a new measure for PL as sensegiving and provides a test of the mediation model with data from two different sources and two measurement times in a German company. Findings: Multilevel mediation analysis (N = 154) supports the mediation model. Originality/value: The paper presents sensegiving about paradox as a core element of PL, which informs the choice of change-readiness as mediator. This study also develops and validates a scale to measure PL in future research.
Field-based sports require athletes to run sub-maximally over significant distances, often while contending with dynamic perturbations to preferred coordination patterns. The ability to adapt movement to maintain performance under such perturbations appears to be trainable through exposure to task variability, which encourages movement variability. The aim of the present study was to investigate the extent to which various wearable resistance loading magnitudes alter coordination and induce movement variability during running. To investigate this, 14 participants (three female and 11 male) performed 10 sub-maximal velocity shuttle runs with either no weight, 1%, 3%, or 5% of body weight attached to the lower limbs. Sagittal plane lower limb joint kinematics from one complete stride cycle in each run were assessed using functional data analysis techniques, both across the participant group and within individuals. At the group level, decreases in ankle plantarflexion following toe-off were evident in the 3% and 5% conditions, while increased knee flexion occurred during weight acceptance in the 5% condition compared with unloaded running. At the individual level, between-run joint angle profiles varied, with six participants exhibiting increased joint angle variability in one or more loading conditions compared with unloaded running. Loading of 5% decreased between-run ankle joint variability in two individuals, likely in accordance with the need to manage increased system load or the novelty of the task. In terms of joint coordination, the most considerable alterations occurred in the 5% loading condition at the hip-knee joint pair; however, only a minority of participants exhibited this tendency. Coaches should prescribe wearable resistance individually to perturb preferred coordination patterns and encourage movement variability without loading to the extent that movement options become limited.
We examined state evaluation anxiety, trait evaluation anxiety, and neuroticism in relation to New Zealand first-year university students' (n = 234) task performance on either a test or essay assessment. For both assessment types, the underlying components of state evaluation anxiety (cognitive worry, emotionality, and distraction) reflect linear, as opposed to nonlinear, associations with task performance. Results of several regression models show differential effects of both state evaluation anxiety and neuroticism on task performance depending on the assessment type. The multi-dimensionality of anxiety and its relative contribution to task performance across authentic types of assessment are discussed.
The load-dependent loss of vertical barbell velocity at the end of the acceleration phase limits the maximum weight that can be lifted. Thus, the purpose of this study was to analyze how increased barbell loads affect the vertical barbell velocity in the sub-phases of the acceleration phase during the snatch. It was hypothesized that the load-dependent velocity loss at the end of the acceleration phase is primarily associated with a velocity loss during the 1st pull. For this purpose, 14 male elite weightlifters lifted seven load stages from 70-100% of their personal best in the snatch. The load-velocity relationship was calculated using linear regression analysis to determine the velocity loss at the 1st pull, transition, and 2nd pull. A group mean data contrast analysis revealed the highest load-dependent velocity loss for the 1st pull (t = 1.85, p = 0.044, g = 0.49 [-0.05, 1.04]), which confirmed our study hypothesis. In contrast to the group mean data, individual athletes showed unique responses to increased loads during the acceleration sub-phases of the snatch. With the proposed method, individualized training recommendations on exercise selection and loading schemes can be derived to specifically improve the sub-phases of the snatch acceleration phase. Furthermore, the results highlight the importance of single-subject assessment when working with elite athletes in Olympic weightlifting.
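The load-velocity relationship described above reduces, per athlete and sub-phase, to an ordinary least-squares line; a minimal sketch with fabricated velocity values (the study's data are not reproduced here):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

loads = [70, 75, 80, 85, 90, 95, 100]  # seven load stages, % of personal best
v_first_pull = [1.45, 1.41, 1.36, 1.33, 1.28, 1.24, 1.20]  # fabricated m/s values
slope, intercept = linear_fit(loads, v_first_pull)
print(f"1st-pull velocity loss: {slope:.4f} m/s per % load")
```

The slope quantifies the load-dependent velocity loss for that sub-phase; comparing slopes across the 1st pull, transition, and 2nd pull is what the contrast analysis formalizes.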
Improving scalability and reward of utility-driven self-healing for large dynamic architectures
(2020)
Self-adaptation can be realized in various ways. Rule-based approaches prescribe the adaptation to be executed if the system or environment satisfies certain conditions. They result in scalable solutions but often with merely satisfying adaptation decisions. In contrast, utility-driven approaches determine optimal decisions by using an often costly optimization, which typically does not scale for large problems. We propose a rule-based and utility-driven adaptation scheme that combines the benefits of both approaches such that the adaptation decisions are optimal, whereas the computation scales by avoiding an expensive optimization. We use this adaptation scheme for architecture-based self-healing of large software systems. For this purpose, we define the utility for large dynamic architectures of such systems based on patterns that define issues the self-healing must address. Moreover, we use pattern-based adaptation rules to resolve these issues. Using a pattern-based scheme to define the utility and adaptation rules allows us to compute the impact of each rule application on the overall utility and to realize an incremental and efficient utility-driven self-healing. In addition to formally analyzing the computational effort and optimality of the proposed scheme, we thoroughly demonstrate its scalability and optimality in terms of reward in comparative experiments with a static rule-based approach as a baseline and a utility-driven approach using a constraint solver. These experiments are based on different failure profiles derived from real-world failure logs. We also investigate the impact of different failure profile characteristics on the scalability and reward to evaluate the robustness of the different approaches.
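The core idea can be illustrated with a toy sketch (invented pattern names and utility numbers, not the paper's actual model): when each issue pattern has a known utility impact and each rule a known effect, the best repair can be selected incrementally, without invoking a global optimizer.

```python
# Utility that each detected issue pattern subtracts from the overall
# architecture utility (invented values for illustration).
issues = {"crashed-component": 10.0, "connector-failure": 4.0}

# Pattern-based adaptation rules: which issue each resolves and its cost.
rules = {
    "restart-component":  {"resolves": "crashed-component", "cost": 1.0},
    "redeploy-component": {"resolves": "crashed-component", "cost": 3.0},
    "reconnect":          {"resolves": "connector-failure", "cost": 0.5},
}

def best_rule(issue):
    """Pick the applicable rule with the highest net utility gain,
    i.e., utility restored by resolving the issue minus the rule's cost."""
    candidates = [(name, issues[issue] - rule["cost"])
                  for name, rule in rules.items()
                  if rule["resolves"] == issue]
    return max(candidates, key=lambda c: c[1])

print(best_rule("crashed-component"))  # restart wins: net gain 9.0 vs 7.0
```

Because each rule's utility impact is known per application, only the affected part of the architecture needs re-evaluation after a repair, which is what makes the scheme incremental.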
While previous research underscores the role of leaders in stimulating employee voice behaviour, comparatively little is known about what affects leaders' support for such constructive but potentially threatening employee behaviours. We introduce leader-member exchange (LMX) quality as a central predictor of leaders' support for employees' ideas for constructive change. Apart from a general benefit of high LMX for leaders' idea support, we propose that high LMX is particularly critical to leaders' idea support if the idea voiced by an employee constitutes a power threat to the leader. We investigate leaders' attribution of prosocial and egoistic employee intentions as mediators of these effects. Hypotheses were tested in a quasi-experimental vignette study (N = 160), in which leaders evaluated a simulated employee idea, and a field study (N = 133), in which leaders evaluated an idea that had been voiced to them at work. Results show an indirect effect of LMX on leaders' idea support via attributed prosocial intentions but not via attributed egoistic intentions, and a buffering effect of high LMX on the negative effect of power threat on leaders' idea support. Results differed across studies with regard to the main effect of LMX on idea support.
Evaluating the performance of self-adaptive systems is challenging due to their interactions with often highly dynamic environments. In the specific case of self-healing systems, the performance evaluations of self-healing approaches and their parameter tuning rely on the considered characteristics of failure occurrences and the resulting interactions with the self-healing actions. In this paper, we first study the state-of-the-art for evaluating the performances of self-healing systems by means of a systematic literature review. We provide a classification of different input types for such systems and analyse the limitations of each input type. A main finding is that the employed inputs are often not sophisticated regarding the considered characteristics for failure occurrences. To further study the impact of the identified limitations, we present experiments demonstrating that wrong assumptions regarding the characteristics of the failure occurrences can result in large performance prediction errors, disadvantageous design-time decisions concerning the selection of alternative self-healing approaches, and disadvantageous deployment-time decisions concerning parameter tuning. Furthermore, the experiments indicate that employing multiple alternative input characteristics can help with reducing the risk of premature disadvantageous design-time decisions.
Injuries in professional soccer are a significant concern for teams, and they are caused, among other factors, by high training load. This cohort study describes the relationship between workload parameters and the occurrence of non-contact injuries during weeks with high and low workload in professional soccer players throughout the season. Twenty-one professional soccer players aged 28.3 ± 3.9 years who competed in the Iranian Persian Gulf Pro League participated in this 48-week study. The external load was monitored using a global positioning system (GPS, GPSPORTS Systems Pty Ltd), and the type of injury was documented daily by the team's medical staff. Odds ratios (OR) and relative risks (RR) were calculated for non-contact injuries for high- and low-load weeks according to acute workload (AW), chronic workload (CW), acute:chronic workload ratio (ACWR), and AW variation (Δ-AW) values. Using a Poisson distribution, the interval between previous and new injuries was estimated. Overall, 12 non-contact injuries occurred during high-load and 9 during low-load weeks. Based on the variables ACWR and Δ-AW, there was a significantly increased risk of sustaining non-contact injuries (p < 0.05) during high-load weeks for ACWR (OR: 4.67) and Δ-AW (OR: 4.07). Finally, the expected time between injuries was significantly shorter in high-load weeks for ACWR (1.25 vs. 3.33, rate ratio time [RRT]) and Δ-AW (1.33 vs. 3.45, RRT), respectively, compared to low-load weeks. The risk of sustaining injuries was significantly larger during high-workload weeks for ACWR and Δ-AW compared with low-workload weeks. The observed high ORs in high-load weeks indicate a significant relationship between workload and the occurrence of non-contact injuries. The predicted time to new injuries is shorter in high-load weeks compared to low-load weeks; accordingly, the frequency of injuries is higher during high-load weeks for ACWR and Δ-AW.
ACWR and Δ-AW appear to be good indicators for estimating injury risk and the time interval between injuries.
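As a rough sketch of the central indicator (with made-up weekly loads, and assuming the common definition of acute load as the latest week's load and chronic load as the mean of the preceding four weeks):

```python
def acwr(weekly_loads):
    """Acute:chronic workload ratio for the most recent week.
    Requires at least five weekly load values."""
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-5:-1]) / 4
    return acute / chronic

# Made-up weekly GPS-derived total distances (m) for one player.
loads = [22000, 24000, 23000, 25000, 33000]
print(round(acwr(loads), 2))  # a spike week pushes the ratio well above 1
```

Note that chronic-load operationalizations vary in the literature (rolling average vs. exponentially weighted moving average); the exact definition used here is an assumption, and the study's own computation should be taken from the paper.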
Impact of normal weight obesity on fundamental motor skills in pre-school children aged 3 to 6 years
(2017)
Normal weight obesity is defined as having excessive body fat but a normal BMI. Even though previous research revealed that excessive body fat in children inhibited their physical activity and decreased motor performance, there has been only little evidence about the motor performance of normal weight obese children. This study aims to establish whether normal weight obese pre-school children aged 3-6 years have a significantly worse level of fundamental motor skills compared to normal weight non-obese counterparts. The research sample consisted of 152 pre-schoolers selected from a specific district of Prague, the Czech Republic. According to the values from four skinfolds (triceps, subscapular, suprailiac, calf) and BMI, three categories of children aged 3-6 years were determined: A) normal weight obese, n = 51; B) normal weight non-obese, n = 52; C) overweight and obese, n = 49. The Movement Assessment Battery for Children (MABC-2) was used for the assessment of fundamental motor skills. Normal weight obese children had a significantly higher amount of adipose tissue (p < 0.001) than normal weight non-obese children but the same average BMI. Moreover, normal weight obese children did not have a significantly smaller amount of subcutaneous fat on the triceps and calf compared to their overweight and obese peers. In the majority of MABC-2 tests, normal weight obese pre-schoolers showed the poorest performance. Moreover, normal weight obese children had a significantly worse total standard score (38.82) compared to normal weight non-obese peers (52.27; p < 0.05). In addition, normal weight obese children had a more than three times higher frequency (OR = 3.69, 95% CI [1.10; 12.35]) of severe motor deficit (performance ≤ 5th centile of the MABC-2 norm). These findings are alarming, since indices like BMI are not able to identify normal weight obese individuals.
We recommend verifying the real proportion of normal weight obese children, as they are probably at higher risk of health and motor problems than the overweight and obese population due to their low lean mass.
Form and Content, Again
(2017)
The following statement suggests reconsidering recent debates on a theory of lyric in terms of form and content. Four aspects and issues of the ongoing debate are discussed. In a first step, it is necessary to establish the relation between authorial poetics and lyric theory, since it is often characterised by fuzzy boundaries. Secondly, in order to specify the problem of form in lyric theory, it is suggested to take a closer look at the performative in lyric practice. Another important aspect of form is the semantics of lyrical genres: lyrical genres mark an area in which form and content are intertwined and in which aspects of the form itself become semantic. Finally, the author argues that we should discuss, possibly assisted by a didactics sensitive to literary texts, whether and how theoretical proposals could be transformed into a practice of teaching poetry.
Background: High-intensity muscle actions have the potential to temporarily improve performance, an effect that has been denoted postactivation performance enhancement.
Objectives: This study determined the acute effects of different stretch-shortening (fast vs. slow) and strength (dynamic vs. isometric) exercises executed during one training session on subsequent balance performance in youth weightlifters.
Materials and Methods: Sixteen male and female young weightlifters, aged 11.3 ± 0.6 years, performed four strength exercise conditions in randomized order, including dynamic strength (DYN; 3 sets of 3 repetitions at 10 RM) and isometric strength exercises (ISOM; 3 sets of maintaining 3 s at 10 RM of the back squat), as well as fast (FSSC; 3 sets of 3 repetitions of 20-cm drop jumps) and slow (SSSC; 3 sets of 3 hurdle jumps over a 20-cm obstacle) stretch-shortening cycle protocols. Balance performance was tested before and after each of the four exercise conditions in bipedal stance on an unstable surface (i.e., BOSU ball with the flat side facing up) using two dependent variables, i.e., center of pressure surface area (CoP SA) and velocity (CoP V).
Results: There was a significant effect of time on CoP SA and CoP V [F(1,60)=54.37, d=1.88, p<0.0001; F(1,60)=9.07, d=0.77, p=0.003]. In addition, a statistically significant effect of condition on CoP SA and CoP V [F(3,60)=11.81, d=1.53, p<0.0001; F(3,60)=7.36, d=1.21, p=0.0003] was observed. Statistically significant condition-by-time interactions were found for the balance parameters CoP SA (p<0.003, d=0.54) and CoP V (p<0.002, d=0.70). Specific to contrast analysis, all specified hypotheses were tested and demonstrated that FSSC yielded significantly greater improvements than all other conditions in CoP SA and CoP V [p<0.0001 (d=1.55); p=0.0004 (d=1.19), respectively]. In addition, FSSC yielded significantly greater improvements compared with the two conditions for both balance parameters [p<0.0001 (d=2.03); p<0.0001 (d=1.45)].
Conclusion: Fast stretch-shortening cycle exercises appear to be more effective to improve short-term balance performance in young weightlifters. Due to the importance of balance for overall competitive achievement in weightlifting, it is recommended that young weightlifters implement dynamic plyometric exercises in the fast stretch-shortening cycle during the warm-up to improve their balance performance.
La música como rizoma
(2018)
This work seeks to lay the foundations of a new epistemology of music education, drawing inspiration from A Thousand Plateaus (Mil mesetas) by the French thinkers Deleuze and Guattari. Conceiving of music as a rhizome and of the music classroom as a rhizomorphic system allows us to analyse the characteristics of music-making as a decentred system in which performative forces interact, forces that transcend the score in its traditional sense as the principal and sole monument to be considered part of the curriculum, thereby distancing music education from the imaginary museum and its ideologues.
Understanding the music classroom as a performative social space makes it possible to study the characteristics of rhizomorphic systems in this field of knowledge. With the help of the principles of connectivity, heterogeneity, multiplicity, asignifying rupture, and cartography, it is shown that the universals of conventional musicology are not valid when taken as the sole reference for application to educational contexts.
Evaluating the performance of self-adaptive systems (SAS) is challenging due to their complexity and interaction with the often highly dynamic environment. In the context of self-healing systems (SHS), employing simulators has been shown to be the most dominant means for performance evaluation. Simulating a SHS also requires realistic fault injection scenarios. We study the state of the practice for evaluating the performance of SHS by means of a systematic literature review. We present the current practice and point out that a more thorough and careful treatment in evaluating the performance of SHS is required.
The aim of this study was to establish maturation-, age-, and sex-specific anthropometric and physical fitness percentile reference values of young elite athletes from various sports. Anthropometric (i.e., standing and sitting body height, body mass, body mass index) and physical fitness (i.e., countermovement jump, drop jump, change-of-direction speed [i.e., T-test], trunk muscle endurance [i.e., ventral Bourban test], dynamic lower limbs balance [i.e., Y-balance test], hand grip strength) of 703 male and female elite young athletes aged 8–18 years were collected to aggregate reference values according to maturation, age, and sex. Findings indicate that body height and mass were significantly higher (p<0.001; 0.95≤d≤1.74) in more compared to less mature young athletes as well as with increasing chronological age (p<0.05; 0.66≤d≤3.13). Furthermore, male young athletes were significantly taller and heavier compared to their female counterparts (p<0.001; 0.34≤d≤0.50). In terms of physical fitness, post-pubertal athletes showed better countermovement jump, drop jump, change-of-direction, and handgrip strength performances (p<0.001; 1.57≤d≤8.72) compared to pubertal athletes. Further, countermovement jump, drop jump, change-of-direction, and handgrip strength performances increased with increasing chronological age (p<0.05; 0.29≤d≤4.13). In addition, male athletes outperformed their female counterparts in the countermovement jump, drop jump, change-of-direction, and handgrip strength (p<0.05; 0.17≤d≤0.76). Significant age by sex interactions indicate that sex-specific differences were even more pronounced with increasing age. In conclusion, body height, body mass, and physical fitness increased with increasing maturational status and chronological age. Sex-specific differences appear to be larger as youth grow older. Practitioners can use the percentile values as approximate benchmarks for talent identification and development.
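Percentile reference values of this kind can be derived per maturation/age/sex group with a simple nearest-rank computation; a sketch with fabricated jump heights (not the study's data):

```python
import math

def percentile(values, p):
    """Nearest-rank percentile (p in 0-100) of a list of measurements."""
    ordered = sorted(values)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Fabricated countermovement jump heights (cm) for one age/sex group.
cmj = [24.1, 26.3, 27.0, 28.2, 29.5, 30.1, 31.4, 32.0, 33.8, 36.2]
for p in (10, 50, 90):
    print(f"P{p}: {percentile(cmj, p)} cm")
```

With real reference samples, an athlete's measurement is located between the tabulated percentile values to obtain an approximate benchmark, as the abstract suggests for talent identification.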
In the context of back pain, great emphasis has been placed on the importance of trunk stability, especially in situations requiring compensation of repetitive, intense loading induced during high-performance activities, e.g., jumping or landing. This study aims to evaluate trunk muscle activity during drop jump in adolescent athletes with back pain (BP) compared to athletes without back pain (NBP). Eleven adolescent athletes suffering from back pain (BP: m/f: n = 4/7; 15.9 ± 1.3 y; 176 ± 11 cm; 68 ± 11 kg; 12.4 ± 10.5 h/week training) and 11 matched athletes without back pain (NBP: m/f: n = 4/7; 15.5 ± 1.3 y; 174 ± 7 cm; 67 ± 8 kg; 14.9 ± 9.5 h/week training) were evaluated. Subjects conducted 3 drop jumps onto a force plate (ground reaction force). Bilateral 12-lead SEMG (surface electromyography) was applied to assess trunk muscle activity. Ground contact time [ms], maximum vertical jump force [N], jump time [ms] and the jump performance index [m/s] were calculated for drop jumps. SEMG amplitudes (RMS: root mean square [%]) for all 12 single muscles were normalized to MIVC (maximum isometric voluntary contraction) and analyzed in 4 time windows (100 ms pre- and 200 ms post-initial ground contact, 100 ms pre- and 200 ms post-landing) as outcome variables. In addition, muscles were grouped and analyzed as ventral and dorsal muscles, as well as straight and transverse trunk muscles. Drop jump ground reaction force variables did not differ between NBP and BP (p > 0.05). Mm obliquus externus and internus abdominis presented higher SEMG amplitudes (1.3-1.9-fold) for BP (p < 0.05). Mm rectus abdominis, erector spinae thoracic/lumbar and latissimus dorsi did not differ (p > 0.05). The muscle group analysis over the whole jumping cycle showed statistically significantly higher SEMG amplitudes for BP in the ventral (p = 0.031) and transverse muscles (p = 0.020) compared to NBP.
Higher activity of transverse, but not straight, trunk muscles might indicate a specific compensation strategy to support trunk stability in athletes with back pain during drop jumps. Therefore, exercises favoring the transverse trunk muscles could be recommended for back pain treatment.
Objectives
The aims of this study were to investigate the effects of a six-week in-season period of soccer training and games (congested period) on plasma volume variations (PV), hematological parameters, and physical fitness in elite players. In addition, we analyzed relationships between training load, hematological parameters and players’ physical fitness.
Methods
Eighteen elite players were evaluated before (T1) and after (T2) a six-week in-season period interspersed with 10 soccer matches. At T1 and T2, players performed the Yo-Yo intermittent recovery test level 1 (YYIR1), the repeated shuttle sprint ability test (RSSA), the countermovement jump test (CMJ), and the squat jump test (SJ). In addition, PV and hematological parameters (erythrocytes [M/mm3], hematocrit [%], hemoglobin [g/dl], mean corpuscular volume [fl], mean corpuscular hemoglobin content [pg], and mean hemoglobin concentration [%]) were assessed. Daily ratings of perceived exertion (RPE) were monitored in order to quantify the internal training load.
Results
From T1 to T2, significant performance declines were found for the YYIR1 (p < 0.001, effect size [ES] = 0.5), RSSA (p < 0.01, ES = 0.6) and SJ tests (p = 0.046, ES = 0.7). However, no significant changes were found for the CMJ (p = 0.86, ES = 0.1). Post-exercise RSSA blood lactate (p = 0.012, ES = 0.2) and PV (p < 0.01, ES = 0.7) increased significantly from T1 to T2. A significant decrease was found from T1 to T2 for the erythrocyte count (p = 0.002, ES = 0.5) and the hemoglobin concentration (p = 0.018, ES = 0.8). The hematocrit percentage was also significantly lower (p < 0.001, ES = 0.6) at T2. The mean corpuscular volume, mean corpuscular hemoglobin content and mean hemoglobin concentration did not differ statistically from T1 to T2. No significant relationships were detected between training load parameters and percentage changes in hematological parameters. However, a significant relationship was observed between training load and changes in RSSA performance (r = -0.60; p = 0.003).
Conclusions
An intensive period of “congested match play” over 6 weeks significantly compromised players’ physical fitness. These changes were not related to hematological parameters, even though significant alterations were detected for selected measures.
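The effect sizes (ES) reported alongside the p-values above are, in this literature, commonly Cohen's d. Assuming that convention, a minimal sketch of the pooled-SD formulation, using made-up numbers rather than study data:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d with pooled standard deviation:
    (mean_b - mean_a) / SD_pooled."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    sd_pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / sd_pooled

# Hypothetical T1/T2 values (not study data); here the 2-unit mean shift
# against a pooled SD of 1 gives d = 2.0:
d = cohens_d([1.0, 2.0, 3.0], [3.0, 4.0, 5.0])
```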
Accuracy of training recommendations based on a treadmill multistage incremental exercise test
(2018)
Competitive runners will occasionally undergo exercise testing in a laboratory setting to obtain predictive and prescriptive information regarding their performance. The present research aimed to assess whether the physiological demands of lab-based treadmill running (TM) can simulate those of over-ground (OG) running using a commonly applied protocol. Fifteen healthy volunteers with a weekly mileage of ≥ 20 km over the past 6 months and treadmill experience participated in this cross-sectional study. Two stepwise incremental tests until volitional exhaustion were performed in a fixed order within one week, one in an outpatient clinic research laboratory and one on an outdoor athletic track. Running velocity (IATspeed), heart rate (IATHR) and lactate concentration at the individual anaerobic threshold (IATbLa) were primary endpoints. Additionally, distance covered (DIST), maximal heart rate (HRmax), maximal blood lactate concentration (bLamax) and rating of perceived exertion (RPE) at IATspeed were analyzed. IATspeed, DIST and HRmax were not statistically significantly different between conditions, whereas bLamax and RPE at IATspeed differed significantly (p < 0.05). Apart from RPE at IATspeed, IATspeed, DIST, HRmax and bLamax correlated strongly between conditions (r = 0.815–0.988). The high reliability between conditions provides strong evidence that treadmill running is physiologically comparable to OG running and that training recommendations can be made with assurance.
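The reported agreement between conditions (r = 0.815–0.988) is a Pearson correlation over paired treadmill/over-ground measurements. A minimal sketch with hypothetical paired values (illustrative only; not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements,
    e.g. a variable measured on the treadmill (x) and over-ground (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly proportional hypothetical values give r = 1.0:
r = pearson_r([10.0, 12.0, 14.0], [11.0, 13.0, 15.0])
```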
The objective of this study was to investigate the effect of dietary citric acid (CA) on the performance and mineral metabolism of broiler chicks. A total of 1720 day-old Ross PM3 broiler chicks were randomly assigned to four groups (430 in each) and reared for a period of 35 days. The diets of groups 1, 2, 3 and 4 were supplemented with 0%, 0.25%, 0.75% or 1.25% CA by weight, respectively. Feed and faeces samples were collected weekly and analysed for acid-insoluble ash, calcium (Ca), phosphorus (P) and magnesium (Mg). The pH was measured in feed and faeces. At the age of 28 days, 10 birds from each group were slaughtered; tibiae were collected from each bird for the determination of bone mineral density, total ash, Ca, P, Mg and bone-breaking strength, and blood was collected for the serum measurement of osteocalcin, CrossLaps®, Ca, P, Mg and 1,25(OH)2-vitamin D. After finishing the trial on day 37, all chicks were slaughtered using the approved procedure. Birds that were fed CA diets were heavier (average body weights of 2030, 2079 and 2086 g in the 0.25%, 0.75% and 1.25% CA groups, respectively) than the control birds (1986 g). Feed conversion efficiency (weight gain in g per kg of feed intake) was also higher in the CA-fed groups (582, 595 and 587 g/kg feed intake for 0.25%, 0.75% and 1.25% CA, respectively) than in the control birds (565 g/kg feed intake). The digestibility of Ca, P and Mg increased in the CA-fed groups, especially with the diets supplemented with 0.25% and 0.75% CA. This finding was also supported by the results of the tibia analysis. At slaughter, the birds in the groups fed the CA diets had higher carcass weights and higher-graded carcasses. The estimated profit margin was highest for birds fed the diet containing 0.25% CA, followed by the 0.75% CA group.
Addition of CA up to a level of 1.25% of the diet increased performance, feed conversion efficiency, carcass weight and carcass quality, but only in numerical terms. The addition of CA up to 0.75% significantly increased the digestibility of macro minerals, bone ash content, bone mineral density and bone strength of the broiler chicks. It may, therefore, be concluded that the addition of 0.75% CA in a standard diet is suitable for growth, carcass traits, macromineral digestibility and bone mineral density of broiler chicks.
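The feed conversion efficiency figures above (weight gain in g per kg of feed) are simple ratios. A small sketch with hypothetical intake values, since the abstract does not report feed intake per bird:

```python
def feed_conversion_efficiency(weight_gain_g, feed_intake_kg):
    """Weight gain in grams per kilogram of feed consumed (g/kg),
    the efficiency measure used in the abstract."""
    return weight_gain_g / feed_intake_kg

# Hypothetical bird: 2000 g gained on 3.5 kg of feed gives roughly
# 571 g/kg, within the range of the reported group values (565-595 g/kg).
fce = feed_conversion_efficiency(2000.0, 3.5)
```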