The prevalence of obesity in the pediatric population has become a major public health issue. Indeed, the dramatic increase of this epidemic causes multiple and harmful consequences. Physical activity, particularly physical exercise, remains the cornerstone of interventions against childhood obesity. Given the conflicting findings in the relevant literature addressing the effects of exercise on adiposity and physical fitness outcomes in obese children and adolescents, the effect of duration-matched concurrent training (CT) [50% resistance training (RT) and 50% high-intensity interval training (HIIT)] on body composition and physical fitness in obese youth remains to be elucidated. Thus, the purpose of this study was to examine the effects of 9 weeks of CT compared to RT or HIIT alone on body composition and selected physical fitness components in healthy sedentary obese youth. Out of 73 participants, only 37 [14 males and 23 females; age 13.4 ± 0.9 years; body mass index (BMI): 31.2 ± 4.8 kg·m-2] were eligible and randomized into three groups: HIIT (n = 12): 3–4 sets × 12 runs at 80–110% peak velocity, with 10-s passive recovery between bouts; RT (n = 12): 6 exercises, 3–4 sets × 10 repetition maximum (RM); and CT (n = 13): 50% serial completion of RT and HIIT. CT promoted significantly greater gains than HIIT and RT in body composition (p < 0.01, d = large), 6-min walking test distance (6MWT-distance), and 6MWT-VO2max (p < 0.03, d = large). In addition, CT showed substantially greater improvements than HIIT in the medicine ball throw test (20.2 vs. 13.6%, p < 0.04, d = large). On the other hand, RT exhibited significantly greater gains in relative handgrip strength (p < 0.03, d = large) and countermovement jump (CMJ) performance (p < 0.01, d = large) than HIIT and CT. Overall, CT promoted greater benefits for fat mass loss, body mass loss, and cardiorespiratory fitness than the HIIT or RT modalities.
This study provides important information for practitioners and therapists on the application of effective exercise regimes with obese youth to induce significant and beneficial body composition changes. The applied CT program and the respective programming parameters in terms of exercise intensity and volume can be used by practitioners as an effective exercise treatment to fight the overweight and obesity pandemic in youth.
Background
The aim of this study was to analyze the shoulder functional profile (rotation range of motion [ROM] and strength), upper and lower body performance, and throwing speed of U13 versus U15 male handball players, and to establish the relationship between these measures of physical fitness and throwing speed.
Methods
One hundred and nineteen young male handball players (under-13 [U13], n = 85; under-15 [U15], n = 34) volunteered to participate in this study. The participating athletes had a mean background of systematic handball training of 5.5 ± 2.8 years and exercised on average 540 ± 10.1 min per week, including sport-specific team handball training and strength and conditioning programs. Players were tested for passive shoulder range of motion (ROM) for both internal (IR) and external rotation (ER) and isometric strength (i.e., IR and ER) of the dominant/non-dominant shoulders, overhead medicine ball throw (OMB), hip isometric abductor (ABD) and adductor (ADD) strength, hip ROM, jumps (countermovement jump [CMJ] and triple leg-hop [3H] for distance), a linear sprint test, a modified 505 change-of-direction (COD) test, and handball throwing speed (7 m [HT7] and 9 m [HT9]).
Results
U15 players outperformed U13 in upper (i.e., HT7 and HT9 speed, OMB, absolute IR and ER strength of the dominant and non-dominant sides; Cohen’s d: 0.76–2.13) and lower body (i.e., CMJ, 3H, 20-m sprint and COD, hip ABD and ADD; d: 0.70–2.33) performance measures. Regarding shoulder ROM outcomes, a lower IR ROM was found on the dominant side in the U15 group compared to the U13 group, and a higher ER ROM on both sides in U15 (d: 0.76–1.04). It seems that primarily anthropometric characteristics (i.e., body height, body mass) and upper body strength/power (OMB distance) are the most important factors that explain the throwing speed variance in male handball players, particularly in U13.
Conclusions
Findings from this study imply that regular performance monitoring is important for performance development and for minimizing injury risk of the shoulder in both age categories of young male handball players. Besides measures of physical fitness, anthropometric data should be recorded because handball throwing performance is related to these measures.
Aims: High-intensity interval training (HIIT) improves mitochondrial characteristics. This study compared the impact of two workload-matched HIIT protocols with different work:recovery ratios on regulatory factors related to mitochondrial biogenesis in the soleus muscle of diabetic rats.
Materials and methods: Twenty-four Wistar rats were randomly divided into four equal-sized groups: non-diabetic control, diabetic control (DC), diabetic with long recovery exercise [4–5 × 2-min running at 80%–90% of the maximum speed reached with 2-min of recovery at 40% of the maximum speed reached (DHIIT1:1)], and diabetic with short recovery exercise (5–6 × 2-min running at 80%–90% of the maximum speed reached with 1-min of recovery at 30% of the maximum speed reached [DHIIT2:1]). Both HIIT protocols were completed five times/week for 4 weeks while maintaining equal running distances in each session.
Results: Gene and protein expressions of PGC-1α, p53, and citrate synthase of the muscles increased significantly following DHIIT1:1 and DHIIT2:1 compared to DC (p ˂ 0.05). Most parameters, except for PGC-1α protein (p = 0.597), were significantly higher in DHIIT2:1 than in DHIIT1:1 (p ˂ 0.05). Both DHIIT groups showed significant increases in maximum speed with larger increases in DHIIT2:1 compared with DHIIT1:1.
Conclusion: Our findings indicate that both HIIT protocols can potently up-regulate gene and protein expression of PGC-1α, p53, and CS. However, DHIIT2:1 has superior effects compared with DHIIT1:1 in improving mitochondrial adaptive responses in diabetic rats.
The study examined the potential future changes of drought characteristics in the Greater Lake Malawi Basin in Southeast Africa. This region strongly depends on water resources to generate electricity and food. Future projections (considering both moderate and high emission scenarios) of temperature and precipitation from an ensemble of 16 bias-corrected climate model combinations were blended with a scenario-neutral response surface approach to analyze changes in: (i) the meteorological conditions, (ii) the meteorological water balance, and (iii) selected drought characteristics such as drought intensity, drought months, and drought events, which were derived from the Standardized Precipitation and Evapotranspiration Index. Changes were analyzed for a near-term (2021–2050) and a far-term period (2071–2100) with reference to 1976–2005. The effect of bias-correction (i.e., empirical quantile mapping) on the ability of the climate model ensemble to reproduce observed drought characteristics as compared to raw climate projections was also investigated. Results suggest that the bias-correction improves the climate models in terms of reproducing temperature and precipitation statistics but not drought characteristics. Still, despite the differences in the internal structures and uncertainties that exist among the climate models, they all agree on an increase of meteorological droughts in the future in terms of higher drought intensity and longer events. Drought intensity is projected to increase between +25 and +50% during 2021–2050 and between +131 and +388% during 2071–2100. This translates into +3 to +5 and +7 to +8 more drought months per year during the two periods, respectively. With longer lasting drought events, the number of drought events decreases. Projected droughts based on the high emission scenario are 1.7 times more severe than droughts based on the moderate scenario.
This means that droughts in this region will likely become more severe in the coming decades. Despite the inherent high uncertainties of climate projections, the results provide a basis for planning and (water-)management activities for climate change adaptation measures in Malawi. This is of particular relevance for water management issues relating to hydropower generation and food production, both for rain-fed and irrigated agriculture.
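The drought characteristics above (drought months, events, and intensity) can be derived from a monthly SPEI series by simple thresholding. The following is a minimal sketch assuming the common convention that a month with SPEI below −1 counts as a drought month and consecutive drought months form one event; the study's exact definitions may differ.

```python
def drought_stats(spei, threshold=-1.0):
    """Count drought months and events, and mean intensity, from monthly SPEI values."""
    months = sum(1 for v in spei if v < threshold)
    events = 0
    in_event = False
    for v in spei:
        if v < threshold and not in_event:
            events += 1          # a new run of sub-threshold months begins
            in_event = True
        elif v >= threshold:
            in_event = False
    # mean SPEI over drought months (more negative = more intense)
    intensity = (sum(v for v in spei if v < threshold) / months) if months else 0.0
    return months, events, intensity

# Hypothetical 5-month series: two separate drought events
print(drought_stats([0.2, -1.2, -1.5, 0.1, -1.1]))
```

Comparing these statistics between a baseline period (e.g., 1976–2005) and a projection period gives exactly the kind of "+N drought months per year" change the abstract reports.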
Injuries in professional soccer are a significant concern for teams, and they are caused, among other factors, by high training load. This cohort study describes the relationship between workload parameters and the occurrence of non-contact injuries during weeks with high and low workload in professional soccer players throughout the season. Twenty-one professional soccer players aged 28.3 ± 3.9 years who competed in the Iranian Persian Gulf Pro League participated in this 48-week study. The external load was monitored using a global positioning system (GPS, GPSPORTS Systems Pty Ltd) and the type of injury was documented daily by the team's medical staff. Odds ratio (OR) and relative risk (RR) were calculated for non-contact injuries for high- and low-load weeks according to acute (AW), chronic (CW), acute to chronic workload ratio (ACWR), and AW variation (Δ-Acute) values. Using a Poisson distribution, the interval between previous and new injuries was estimated. Overall, 12 non-contact injuries occurred during high-load and 9 during low-load weeks. Based on the variables ACWR and Δ-AW, there was a significantly increased risk of sustaining non-contact injuries (p < 0.05) during high-load weeks for ACWR (OR: 4.67) and Δ-AW (OR: 4.07). Finally, the expected time between injuries was significantly shorter in high-load weeks for ACWR [1.25 vs. 3.33, rate ratio time (RRT)] and Δ-AW (1.33 vs. 3.45, RRT), respectively, compared to low-load weeks. The risk of sustaining injuries was significantly larger during high-workload weeks for ACWR and Δ-AW compared with low-workload weeks. The observed high OR in high-load weeks indicates that there is a significant relationship between workload and the occurrence of non-contact injuries. The predicted time to new injuries is shorter in high-load weeks compared to low-load weeks. Therefore, the frequency of injuries is higher during high-load weeks for ACWR and Δ-AW.
ACWR and Δ-AW appear to be good indicators for estimating the injury risk, and the time interval between injuries.
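For readers unfamiliar with the ACWR metric used above, the following is a minimal sketch of the usual rolling-average definition (1-week acute load divided by a 4-week chronic average). The weekly load values and the exact windowing convention are illustrative assumptions, not necessarily the study's implementation.

```python
def acwr(weekly_loads, week_index, chronic_weeks=4):
    """Acute:chronic workload ratio for a given week.

    acute   = the week's total load
    chronic = rolling mean over the current and preceding weeks
    """
    acute = weekly_loads[week_index]
    start = max(0, week_index - chronic_weeks + 1)
    chronic_window = weekly_loads[start:week_index + 1]
    chronic = sum(chronic_window) / len(chronic_window)
    return acute / chronic if chronic > 0 else float("nan")

# Hypothetical weekly GPS-derived loads (arbitrary units); a sudden
# spike in the latest week pushes the ratio above 1.0.
loads = [300, 320, 310, 305, 450]
print(round(acwr(loads, week_index=4), 2))  # → 1.3
```

A ratio well above 1.0 corresponds to the "high-load week" condition associated with elevated injury odds in the abstract.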
Background
Maximal isokinetic strength ratios of joint flexors and extensors are important parameters to indicate the level of muscular balance at the joint. Further, in combat sports athletes, upper and lower limb muscle strength is affected by the type of sport. Thus, this study aimed to examine the differences in maximal isokinetic strength of the flexors and extensors and the corresponding flexor–extensor strength ratios of the elbows and knees in combat sports athletes.
Method
Forty male participants (age = 22.3 ± 2.5 years) from four different combat sports (amateur boxing, taekwondo, karate, and judo; n = 10 per sport) were tested for eccentric peak torque of the elbow/knee flexors (EF/KF) and concentric peak torque of the elbow/knee extensors (EE/KE) at three different angular velocities (60, 120, and 180°/s) on the dominant and non-dominant side using an isokinetic device.
Results
Analyses revealed significant, large-sized group × velocity × limb interactions for EF, EE, and EF–EE ratio, KF, KE, and KF–KE ratio (p ≤ 0.03; 0.91 ≤ d ≤ 1.75). Post-hoc analyses indicated that amateur boxers displayed the largest EE strength values on the non-dominant side at ≤ 120°/s and the dominant side at ≥ 120°/s (p < 0.03; 1.21 ≤ d ≤ 1.59). The largest EF–EE strength ratios were observed on amateur boxers’ and judokas’ non-dominant side at ≥ 120°/s (p < 0.04; 1.36 ≤ d ≤ 2.44). Further, we found lower KF–KE strength measures in karate (p < 0.04; 1.12 ≤ d ≤ 6.22) and judo athletes (p ≤ 0.03; 1.60 ≤ d ≤ 5.31) particularly on the non-dominant side.
Conclusions
The present findings indicated combat sport-specific differences in maximal isokinetic strength measures of EF, EE, KF, and KE particularly in favor of amateur boxers on the non-dominant side.
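The effect sizes (Cohen's d) reported throughout these abstracts compare two group means against their pooled standard deviation. A minimal sketch of the standard pooled-SD formula for two independent groups, with illustrative numbers only:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d for two independent samples using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    # unbiased sample variances
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical peak-torque values for two groups of athletes
print(cohens_d([2, 4, 6], [1, 3, 5]))  # → 0.5
```

By common convention, |d| ≥ 0.8 is interpreted as a large effect, which is the range ("1.12 ≤ d ≤ 6.22") reported for the group differences above.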
The aim of this review was to describe and summarize the scientific literature on programming parameters related to jump or plyometric training in male and female soccer players of different ages and fitness levels. A literature search was conducted in the electronic databases PubMed, Web of Science and Scopus using keywords related to the main topic of this study (e.g., “ballistic” and “plyometric”). According to the PICOS framework, the population for the review was restricted to soccer players involved in jump or plyometric training. Among 7556 identified studies, 90 were eligible for inclusion. Only 12 studies were found for females. Most studies (n = 52) were conducted with youth male players. Moreover, only 35 studies determined the effectiveness of a given jump training programming factor. Based on the limited available research, it seems that a dose of 7 weeks (1–2 sessions per week), with ~80 jumps (specific or combined types) per session, using near-maximal or maximal intensity, with adequate recovery between repetitions (<15 s), sets (≥30 s) and sessions (≥24–48 h), using progressive overload and taper strategies, using appropriate surfaces (e.g., grass), and applied in a well-rested state, when combined with other training methods, would increase the outcome of effective and safe plyometric-jump training interventions aimed at improving soccer players’ physical fitness. In conclusion, jump training is an effective and easy-to-administer training approach for youth, adult, male and female soccer players. However, optimal programming for plyometric-jump training in soccer is yet to be determined in future research.
Biological invasions may result from multiple introductions, which might compensate for reduced gene pools caused by bottleneck events, but could also dilute adaptive processes. A previous common-garden experiment showed heritable latitudinal clines in fitness-related traits in the invasive goldenrod Solidago canadensis in Central Europe. These latitudinal clines remained stable even in plants chemically treated with zebularine to reduce epigenetic variation. However, despite the heritability of traits investigated, genetic isolation-by-distance was non-significant. Utilizing the same specimens, we applied a molecular analysis of (epi)genetic differentiation with standard and methylation-sensitive (MSAP) AFLPs. We tested whether this variation was spatially structured among populations and whether zebularine had altered epigenetic variation. Additionally, we used genome scans to mine for putative outlier loci susceptible to selection processes in the invaded range. Despite the absence of isolation-by-distance, we found spatial genetic neighborhoods among populations and two AFLP clusters differentiating northern and southern Solidago populations. Genetic and epigenetic diversity were significantly correlated, but not linked to phenotypic variation. Hence, no spatial epigenetic patterns were detected along the latitudinal gradient sampled. Applying genome-scan approaches (BAYESCAN, BAYESCENV, RDA, and LFMM), we found 51 genetic and epigenetic loci putatively responding to selection. One of these genetic loci was significantly more frequent in populations at the northern range. Also, one epigenetic locus was more frequent in populations in the southern range, but this pattern was lost under zebularine treatment. Our results point to some genetic, but not epigenetic adaptation processes along a large-scale latitudinal gradient of S. canadensis in its invasive range.
Cosmic-ray neutron sensing (CRNS) has become an effective method to measure soil moisture at a horizontal scale of hundreds of metres and a depth of decimetres. Recent studies proposed operating CRNS in a network with overlapping footprints in order to cover root-zone water dynamics at the small catchment scale and, at the same time, to represent spatial heterogeneity. In a joint field campaign from September to November 2020 (JFC-2020), five German research institutions deployed 15 CRNS sensors in the 0.4 km2 Wüstebach catchment (Eifel mountains, Germany). The catchment is dominantly forested (but includes a substantial fraction of open vegetation) and features a topographically distinct catchment boundary. In addition to the dense CRNS coverage, the campaign featured a unique combination of additional instruments and techniques: hydro-gravimetry (to detect water storage dynamics also below the root zone); ground-based and, for the first time, airborne CRNS roving; an extensive wireless soil sensor network, supplemented by manual measurements; and six weighable lysimeters. Together with comprehensive data from the long-term local research infrastructure, the published data set (available at https://doi.org/10.23728/b2share.756ca0485800474e9dc7f5949c63b872; Heistermann et al., 2022) will be a valuable asset in various research contexts: to advance the retrieval of landscape water storage from CRNS, wireless soil sensor networks, or hydrogravimetry; to identify scale-specific combinations of sensors and methods to represent soil moisture variability; to improve the understanding and simulation of land–atmosphere exchange as well as hydrological and hydrogeological processes at the hillslope and the catchment scale; and to support the retrieval of soil water content from airborne and spaceborne remote sensing platforms.
Cyberhate represents a risk to adolescents’ development and peaceful coexistence in democratic societies. Yet, not much is known about the relationship between adolescents’ ability to cope with cyberhate and their cyberhate involvement. To fill current gaps in the literature and inform the development of media education programs, the present study investigated various coping strategies in a hypothetical cyberhate scenario as correlates for being cyberhate victims, perpetrators, and both victim–perpetrators. The sample consisted of 6829 adolescents aged 12–18 years old (Mage = 14.93, SD = 1.64; girls: 50.4%, boys: 48.9%, and 0.7% did not indicate their gender) from Asia, Europe, and North America. Results showed that adolescents who endorsed distal advice or endorsed technical coping showed a lower likelihood to be victims, perpetrators, or victim–perpetrators. In contrast, if adolescents felt helpless or endorsed retaliation to cope with cyberhate, they showed higher odds of being involved in cyberhate as victims, perpetrators, or victim–perpetrators. Finally, adolescents who endorsed close support as a coping strategy showed a lower likelihood to be victim–perpetrators, and adolescents who endorsed assertive coping showed higher odds of being victims. In conclusion, the results confirm the importance of addressing adolescents’ ability to deal with cyberhate to develop more tailored prevention approaches. More specifically, such initiatives should focus on adolescents who feel helpless or feel inclined to retaliate. In addition, adolescents should be educated to practice distal advice and technical coping when experiencing cyberhate. Implications for the design and instruction of evidence-based cyberhate prevention (e.g., online educational games, virtual learning environments) will be discussed.
Physical fatigue (PF) negatively affects postural control, resulting in impaired balance performance in young and older adults. Similar effects on postural control can be observed for mental fatigue (MF), mainly in older adults. Controversial results exist for young adults. There is a void in the literature on the effects of fatigue on balance and cortical activity. Therefore, this study aimed to examine the acute effects of PF and MF on postural sway and cortical activity. Fifteen healthy young adults aged 28 ± 3 years participated in this study. PF and MF protocols, comprising an all-out repeated sit-to-stand task and a computer-based attention network test, respectively, were applied in random order. Pre and post fatigue, cortical activity and postural sway (i.e., center of pressure displacements [CoPd], velocity [CoPv], and CoP variability [CV CoPd, CV CoPv]) were tested during a challenging bipedal balance board task. Absolute spectral power was calculated for theta (4–7.5 Hz), alpha-2 (10.5–12.5 Hz), beta-1 (13–18 Hz), and beta-2 (18.5–25 Hz) in frontal, central, and parietal regions of interest (ROI) and baseline-normalized. Inference statistics revealed a significant time-by-fatigue interaction for CoPd (p = 0.009, d = 0.39, Δ 9.2%) and CoPv (p = 0.009, d = 0.36, Δ 9.2%), and a significant main effect of time for CoP variability (CV CoPd: p = 0.001, d = 0.84; CV CoPv: p = 0.05, d = 0.62). Post hoc analyses showed a significant increase in CoPd (p = 0.002, d = 1.03) and CoPv (p = 0.003, d = 1.03) following PF but not MF. For cortical activity, a significant time-by-fatigue interaction was found for relative alpha-2 power in parietal (p < 0.001, d = 0.06) areas. Post hoc tests indicated larger alpha-2 power increases after PF (p < 0.001, d = 1.69, Δ 3.9%) compared to MF (p = 0.001, d = 1.03, Δ 2.5%). In addition, changes in parietal alpha-2 power and measures of postural sway did not correlate significantly, irrespective of the applied fatigue protocol.
No significant changes were found for the other frequency bands, irrespective of the fatigue protocol and ROI under investigation. Thus, the applied PF protocol resulted in increased postural sway (CoPd and CoPv) and CoP variability accompanied by enhanced alpha-2 power in the parietal ROI, while MF led to increased CoP variability and alpha-2 power in our sample of young adults. Potential underlying cortical mechanisms responsible for the greater increase in parietal alpha-2 power after PF were discussed but could not be clearly identified as the cause. Therefore, further research is needed to decipher alternative interpretations.
Inverted perovskite solar cells still suffer from significant non-radiative recombination losses at the perovskite surface and across the perovskite/C₆₀ interface, limiting the future development of perovskite-based single- and multi-junction photovoltaics. Therefore, more effective inter- or transport layers are urgently required. To tackle these recombination losses, we introduce ortho-carborane as an interlayer material that has a spherical molecular structure and a three-dimensional aromaticity. Based on a variety of experimental techniques, we show that ortho-carborane decorated with phenylamino groups effectively passivates the perovskite surface and essentially eliminates the non-radiative recombination loss across the perovskite/C₆₀ interface with high thermal stability. We further demonstrate the potential of carborane as an electron transport material, facilitating electron extraction while blocking holes from the interface. The resulting inverted perovskite solar cells deliver a power conversion efficiency of over 23% with a low non-radiative voltage loss of 110 mV, and retain >97% of the initial efficiency after 400 h of maximum power point tracking. Overall, the designed carborane based interlayer simultaneously enables passivation, electron-transport and hole-blocking and paves the way toward more efficient and stable perovskite solar cells.
The purpose of this study was to examine the test-retest reliability and the convergent and discriminative validity of a new taekwondo-specific change-of-direction (COD) speed test with striking techniques (TST) in elite taekwondo athletes. Twenty (10 males and 10 females) elite (athletes who compete at national level) and top-elite (athletes who compete at national and international level) taekwondo athletes with an average background of 8.9 ± 1.3 years of systematic taekwondo training participated in this study. During the two-week test-retest period, various generic performance tests measuring COD speed, balance, speed, and jump performance were carried out during the first week and as a retest during the second week. Three TST trials were conducted with each athlete and the best trial was used for further analyses. The relevant performance measure derived from the TST was the time with striking penalty (TST-TSP). TST-TSP performances amounted to 10.57 ± 1.08 s for males and 11.74 ± 1.34 s for females. The reliability analysis of the TST performance was conducted after logarithmic transformation to address the problem of heteroscedasticity. In both groups, the TST demonstrated high relative test-retest reliability (intraclass correlation coefficient and 90% compatibility limits: 0.80 and 0.47 to 0.93, respectively). For absolute reliability, the TST’s typical error of measurement (TEM), 90% compatibility limits, and magnitudes were 4.6%, 3.4 to 7.7, for males, and 5.4%, 3.9 to 9.0, for females. Given the homogeneous sample of taekwondo athletes, the TST’s TEM exceeded the usual smallest important change (SIC), based on a 0.2 effect size, in the two groups.
The new test showed mostly very large correlations with linear sprint speed (r = 0.71 to 0.85) and dynamic balance (r = −0.71 and −0.74), large correlations with COD speed (r = 0.57 to 0.60) and vertical jump performance (r = −0.50 to −0.65), and moderate correlations with horizontal jump performance (r = −0.34 to −0.45) and static balance (r = −0.39 to −0.44). Top-elite athletes showed better TST performances than their elite counterparts. Receiver operating characteristic analysis indicated that the TST effectively discriminated between top-elite and elite taekwondo athletes. In conclusion, the TST is a valid and sensitive test to evaluate COD speed with taekwondo-specific skills, and reliable when considering ICC and TEM. Although the usefulness of the TST is questionable for detecting small performance changes in the present population, the TST can detect moderate changes in taekwondo-specific COD speed.
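The typical error of measurement (TEM) reported above is commonly derived from the difference scores of log-transformed test and retest times, which expresses the error as a coefficient of variation (%). A minimal sketch of that calculation, with hypothetical trial times; the study's exact statistical procedure may differ in detail.

```python
import math

def tem_percent(test, retest):
    """Typical error of measurement (as a CV%) from log-transformed
    test-retest pairs: SD of the 100*ln difference scores / sqrt(2)."""
    diffs = [100 * (math.log(b) - math.log(a)) for a, b in zip(test, retest)]
    mean = sum(diffs) / len(diffs)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (len(diffs) - 1))
    return sd / math.sqrt(2)

# Hypothetical TST times (s) for three athletes, week 1 vs. week 2
print(tem_percent([10.6, 11.2, 10.9], [10.8, 11.0, 11.2]))
```

Log-transforming before computing the error is what addresses the heteroscedasticity mentioned in the abstract: the error then scales with the size of the measured value rather than being assumed constant.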
Background: There is evidence that fully recovered COVID-19 patients usually resume physical exercise, but do not perform at the same intensity level performed prior to infection. The aim of this study was to evaluate the impact of COVID-19 infection and recovery as well as muscle fatigue on cardiorespiratory fitness and running biomechanics in female recreational runners.
Methods: Twenty-eight females were divided into a group of hospitalized and recovered COVID-19 patients (COV, n = 14, at least 14 days following recovery) and a group of healthy age-matched controls (CTR, n = 14). Ground reaction forces were measured from stepping on a force plate during barefoot overground running at 3.3 m/s, before and after a fatiguing protocol. The fatigue protocol consisted of incrementally increasing running speed until reaching a score of 13 on the 6–20 Borg scale, followed by steady-state running until exhaustion. The effects of group and fatigue were assessed for steady-state running duration, steady-state running speed, ground contact time, vertical instantaneous loading rate and peak propulsion force.
Results: COV runners completed only 56% of the running time achieved by the CTR (p < 0.0001), and at a 26% slower steady-state running speed (p < 0.0001). There were fatigue-related reductions in loading rate (p = 0.004) without group differences. Increased ground contact time (p = 0.002) and reduced peak propulsion force (p = 0.005) were found for COV when compared to CTR.
Conclusion: Our results suggest that female runners who recovered from COVID-19 showed compromised running endurance and altered running kinetics in the form of longer stance periods and weaker propulsion forces. More research is needed in this area using larger sample sizes to confirm our study findings.
Objective: A role for microRNAs is implicated in several biological and pathological processes. We investigated the effects of high-intensity interval training (HIIT) and moderate-intensity continuous training (MICT) on molecular markers of diabetic cardiomyopathy in rats.
Methods: Eighteen male Wistar rats (260 ± 10 g; aged 8 weeks) with streptozotocin (STZ)-induced type 1 diabetes mellitus (55 mg/kg, IP) were randomly allocated to three groups: control, MICT, and HIIT. The two different training protocols were performed 5 days each week for 5 weeks. Cardiac performance (end-systolic and end-diastolic dimensions, ejection fraction), the expression of miR-206, HSP60, and markers of apoptosis (cleaved PARP and cytochrome C) were determined at the end of the exercise interventions.
Results: Both exercise interventions (HIIT and MICT) decreased blood glucose levels and improved cardiac performance, with greater changes in the HIIT group (p < 0.001, η2: 0.909). While the expressions of miR-206 and apoptotic markers decreased in both training protocols (p < 0.001, η2: 0.967), HIIT caused greater reductions in apoptotic markers and produced a 20% greater reduction in miR-206 compared with the MICT protocol (p < 0.001). Furthermore, both training protocols enhanced the expression of HSP60 (p < 0.001, η2: 0.976), with a nearly 50% greater increase in the HIIT group compared with MICT.
Conclusions: Our results indicate that both exercise protocols, HIIT and MICT, have the potential to reduce diabetic cardiomyopathy by modifying the expression of miR-206 and its downstream targets of apoptosis. It seems, however, that HIIT is more effective than MICT in modulating these molecular markers.
Background
Eating in the absence of hunger is quite common and often associated with increased energy intake co-existent with poorer food choices. Intuitive eating (IE), i.e., eating in accordance with internal hunger and satiety cues, may protect from overeating. IE, however, requires accurate perception and processing of one’s own bodily signals, also referred to as interoceptive sensitivity. Training interoceptive sensitivity might therefore be an effective method to promote IE and prevent overeating. As most studies on eating behavior are conducted in younger adults and close social relationships influence health-related behavior, this study focuses on middle-aged and older couples.
Methods
The present pilot randomized intervention study aims at investigating the feasibility and effectiveness of a 21-day mindfulness-based training program designed to increase interoceptive sensitivity. A total of N = 60 couples participating in the NutriAct Family Study, aged 50–80 years, will be recruited. This randomized-controlled intervention study comprises three measurement points (pre-intervention, post-intervention, 4-week follow-up) and a 21-day training that consists of daily mindfulness-based guided audio exercises (e.g., body scan). A three-arm intervention study design is applied to compare two intervention groups (training together as a couple vs. training alone) with a control group (no training). Each measurement point includes the assessment of self-reported and objective indicators of interoceptive sensitivity (primary outcome), self-reported indicators of intuitive and maladaptive eating (secondary outcomes), and additional variables. A training evaluation applying focus group discussions will be conducted to assess participants’ overall acceptance of the training and its feasibility.
Discussion
By investigating the feasibility and effectiveness of a mindfulness-based training program to increase interoceptive sensitivity, the present study will contribute to a deeper understanding of how to promote healthy eating in older age.
Privacy regulations and the physical distribution of heterogeneous data are often primary concerns for the development of deep learning models in a medical context. This paper evaluates the feasibility of differentially private federated learning for chest X-ray classification as a defense against data privacy attacks. To the best of our knowledge, we are the first to directly compare the impact of differentially private training on two different neural network architectures, DenseNet121 and ResNet50. Extending the federated learning environments previously analyzed in terms of privacy, we simulated a heterogeneous and imbalanced federated setting by distributing images from the public CheXpert and Mendeley chest X-ray datasets unevenly among 36 clients. Both non-private baseline models achieved an area under the receiver operating characteristic curve (AUC) of 0.94 on the binary classification task of detecting the presence of a medical finding. We demonstrate that both model architectures are vulnerable to privacy violation by applying image reconstruction attacks to local model updates from individual clients. The attack was particularly successful during later training stages. To mitigate the risk of a privacy breach, we integrated Rényi differential privacy with a Gaussian noise mechanism into local model training. We evaluate model performance and attack vulnerability for privacy budgets ε ∈ {1, 3, 6, 10}. The DenseNet121 achieved the best utility-privacy trade-off with an AUC of 0.94 for ε = 6. Model performance deteriorated slightly for individual clients compared to the non-private baseline. The ResNet50 only reached an AUC of 0.76 in the same privacy setting. Its performance was inferior to that of the DenseNet121 for all considered privacy constraints, suggesting that the DenseNet121 architecture is more robust to differentially private training.
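The local privatization step used in such DP-SGD-style training (clipping each update to a maximum L2 norm, then adding calibrated Gaussian noise before sharing) can be sketched as follows. This is a minimal illustration; the clipping norm and noise multiplier here are assumed values, not the study's settings.

```python
import numpy as np

def clip_and_noise(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a gradient/update to a maximum L2 norm, then add Gaussian noise.

    clip_norm and noise_multiplier are illustrative; in practice they are
    chosen to meet a target privacy budget via an accountant (e.g. Rényi DP).
    """
    rng = np.random.default_rng(rng)
    norm = np.linalg.norm(grad)
    # Scale down only if the norm exceeds the clipping bound.
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Noise scale is proportional to the clipping bound (sensitivity).
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

grad = np.array([3.0, 4.0])          # L2 norm 5.0, exceeds clip_norm
private_update = clip_and_noise(grad, rng=0)
```

With `noise_multiplier=0.0` the function reduces to pure clipping, which makes the sensitivity bound easy to verify.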
Objective: To examine the effect of plyometric jump training on skeletal muscle hypertrophy in healthy individuals.
Methods: A systematic literature search was conducted in the databases PubMed, SPORTDiscus, Web of Science, and Cochrane Library up to September 2021.
Results: Fifteen studies met the inclusion criteria. The main overall finding (44 effect sizes across 15 clusters; median = 2, range = 1–15 effects per cluster) indicated that plyometric jump training had small to moderate effects [standardised mean difference (SMD) = 0.47 (95% CIs = 0.23–0.71); p < 0.001] on skeletal muscle hypertrophy. Subgroup analyses for training experience revealed trivial to large effects in non-athletes [SMD = 0.55 (95% CIs = 0.18–0.93); p = 0.007] and trivial to moderate effects in athletes [SMD = 0.33 (95% CIs = 0.16–0.51); p = 0.001]. Regarding muscle groups, results showed moderate effects for the knee extensors [SMD = 0.72 (95% CIs = 0.66–0.78), p < 0.001] and equivocal effects for the plantar flexors [SMD = 0.65 (95% CIs = −0.25–1.55); p = 0.143]. As to the assessment methods of skeletal muscle hypertrophy, findings indicated trivial to small effects for prediction equations [SMD = 0.29 (95% CIs = 0.16–0.42); p < 0.001] and moderate to large effects for ultrasound imaging [SMD = 0.74 (95% CIs = 0.59–0.89); p < 0.001]. Meta-regression analysis indicated that the weekly session frequency moderates the effect of plyometric jump training on skeletal muscle hypertrophy, with a higher weekly session frequency inducing larger hypertrophic gains [β = 0.3233 (95% CIs = 0.2041–0.4425); p < 0.001]. We found no clear evidence that age, sex, total training period, single session duration, or the number of jumps per week moderate the effect of plyometric jump training on skeletal muscle hypertrophy [β = −0.0133 to 0.0433 (95% CIs = −0.0387 to 0.1215); p = 0.101–0.751].
Conclusion: Plyometric jump training can induce skeletal muscle hypertrophy, regardless of age and sex. There is evidence for relatively larger effects in non-athletes compared with athletes. Further, the weekly session frequency seems to moderate the effect of plyometric jump training on skeletal muscle hypertrophy, whereby more frequent weekly plyometric jump training sessions elicit larger hypertrophic adaptations.
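For readers less familiar with the effect-size metric used above, a standardised mean difference with its 95% CI can be computed from group summary statistics as sketched below. These are textbook formulas (Hedges' g); the review itself pooled clustered effect sizes with a multilevel model, which this snippet does not reproduce.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Standardised mean difference (Hedges' g) with an approximate 95% CI.

    Illustrative textbook computation, not the review's exact pipeline.
    """
    # Pooled standard deviation across both groups.
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sd_pooled
    # Small-sample bias correction factor J.
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    g = j * d
    # Approximate standard error of g.
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical pre/post strength-matched groups of 20 participants each.
g, ci = hedges_g(2.5, 1.0, 20, 2.0, 1.0, 20)
```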
The first step towards assessing hazards in seismically active regions involves mapping capable faults and estimating their recurrence times. While the mapping of active faults is commonly based on distinct geologic and geomorphic features evident at the surface, mapping blind seismogenic faults is complicated by the absence of on-fault diagnostic features. Here we investigated the Pichilemu Fault in coastal Chile, unknown until it generated a Mw 7.0 earthquake in 2010. The lack of evident surface faulting suggests activity along a partly-hidden blind fault. We used off-fault deformed marine terraces to estimate a fault-slip rate of 0.52 ± 0.04 m/ka, which, when integrated with satellite geodesy, suggests a 2.12 ± 0.2 ka recurrence time for Mw~7.0 normal-faulting earthquakes. We propose that extension in the Pichilemu region is associated with stress changes during megathrust earthquakes and accommodated by sporadic slip during upper-plate earthquakes, which has implications for assessing the seismic potential of cryptic faults along convergent margins and elsewhere.
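The recurrence time follows from dividing the slip released per event by the long-term slip rate. A back-of-the-envelope check, assuming a hypothetical ~1.1 m of coseismic slip per Mw 7.0 event (this per-event value is not stated in the abstract and is used only for illustration):

```python
# Recurrence time = slip per event / long-term slip rate.
slip_rate_m_per_ka = 0.52   # from deformed marine terraces (reported)
slip_per_event_m = 1.1      # assumed coseismic slip per Mw 7.0 event (illustrative)

recurrence_ka = slip_per_event_m / slip_rate_m_per_ka
# With these inputs the result is close to the reported 2.12 +/- 0.2 ka.
```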
Is There a Rural Penalty in Language Acquisition? Evidence From Germany's Refugee Allocation Policy
(2022)
Emerging evidence has highlighted the important role of local contexts for the integration trajectories of asylum seekers and refugees. Germany's policy of randomly allocating asylum seekers across the country may advantage some and disadvantage others in terms of opportunities for equal participation in society. This study explores whether asylum seekers who have been allocated to rural areas experience disadvantages in language acquisition compared with those allocated to urban areas. We derive testable assumptions using a Directed Acyclic Graph (DAG), which are then tested using large-N survey data (IAB-BAMF-SOEP refugee survey). We find that living in a rural area has no negative total effect on language skills. Further, the findings suggest that the “null effect” is the result of two processes which offset each other: while asylum seekers in rural areas have slightly lower access to formal, federally organized language courses, they have more regular exposure to German speakers.
Physical activity and exercise are effective approaches in the prevention and therapy of multiple diseases. Although the specific characteristics of lengthening contractions have the potential to be beneficial in many clinical conditions, eccentric training is not commonly used in clinical populations with metabolic, orthopaedic, or neurologic conditions. The purpose of this pilot study is to investigate the feasibility, functional benefits, and systemic responses of an eccentric exercise program focused on the trunk and lower extremities in people with low back pain (LBP) and multiple sclerosis (MS). A six-week eccentric training program with three weekly sessions is performed by people with LBP and MS. The program consists of ten exercises addressing strength of the trunk and lower extremities. The study follows a four-group design (N = 12 per group) in two study centers (Israel and Germany): three groups perform the eccentric training program: A) control group (healthy, asymptomatic); B) people with LBP; C) people with MS; group D (people with MS) receives standard care physiotherapy. Baseline measurements are conducted before the first training session and the post-measurement takes place after the last session; both comprise blood sampling, self-reported questionnaires, and mobility, balance, and strength testing. The feasibility of the eccentric training program will be evaluated using quantitative and qualitative measures related to the study process, compliance and adherence, safety, and overall program assessment. For preliminary assessment of potential intervention effects, surrogate parameters related to mobility, postural control, muscle strength, and systemic effects are assessed. The presented study will add knowledge regarding safety, feasibility, and initial effects of eccentric training in people with orthopaedic and neurological conditions. The simple exercises, which are easily modifiable in complexity and intensity, are likely beneficial to other populations.
Thus, multiple applications and implementation pathways for the herein presented training program are conceivable.
Basic psychological needs theory postulates that a social environment that satisfies individuals’ three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. The frustration of these needs, on the other hand, is associated with ill-being and depressive symptoms and has foremost been investigated in non-clinical samples; there is a paucity of research on need frustration in clinical samples. Survey data from adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) were compared with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs when compared to non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of self-determination theory and have preliminary implications for tailoring therapy for depression.
Background
Animal personality has emerged as a key concept in behavioral ecology. While many studies have demonstrated the influence of personality traits on behavioral patterns, its quantification, especially in wild animal populations, remains a challenge. Only a few studies have established a link between personality and recurring movements within home ranges, although these small-scale movements are of key importance for identifying ecological interactions and forming individual niches. In this regard, differences in space use among individuals might reflect different exploration styles between behavioral types along the shy-bold continuum.
Methods
We assessed among-individual differences in behavior in the European hare (Lepus europaeus), a characteristic mammalian herbivore in agricultural landscapes, using a standardized box emergence test for captive and wild hares. We determined an individual’s degree of boldness by measuring the latencies of behavioral responses in repeated emergence tests in captivity. During capture events of wild hares, we conducted a single emergence test and recorded behavioral responses proven to be stable over time in captive hares. Applying repeated novel environment tests in a near-natural enclosure, we further quantified aspects of exploration and activity in captive hares. We then investigated whether and how this among-individual behavioral variation is related to general activity and space use in a wild hare population. Wild and captive hares were treated similarly and GPS-collared with internal accelerometers prior to release to the wild or the outdoor enclosure, respectively. General activity was quantified as overall dynamic body acceleration (ODBA) obtained from the accelerometers. Finally, we tested whether boldness explained variation in (i) ODBA in both settings and (ii) home ranges and core areas across different time scales of GPS-collared hares in a wild population.
Results
We found three behavioral responses to be consistent over time in captive hares. ODBA was positively related to boldness (i.e., short latencies to make first contact with the new environment) in both captive and wild hares. Space use in wild hares also varied with boldness, with shy individuals having smaller core areas and larger home ranges than bold conspecifics (yet in some of the parameter space, this association was just marginally significant).
Conclusions
Against our prediction, shy individuals occupied relatively large home ranges but with small core areas. We suggest that this space use pattern reflects their avoidance of risky and energy-demanding competition for valuable resources. Carefully validated activity measurements (ODBA) from accelerometers provide a valuable tool to quantify aspects of animal personality along the shy-bold continuum remotely. Without directly observing—and possibly disturbing—focal individuals, this approach allows measuring variability in animal personality, especially in species that are difficult to assess with experiments. Considering that accelerometers are often already built into GPS units, we recommend activating them at least during the initial days of tracking to estimate individual variation in general activity and, if possible, matching them with a simple novelty experiment. Furthermore, information on individual behavioral types will help to facilitate mechanistic understanding of processes that drive spatial and ecological dynamics in heterogeneous landscapes.
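ODBA, the activity measure used throughout this study, is computed by removing the static (gravitational) component from each accelerometer axis and summing the absolute dynamic components. A minimal sketch, assuming a simple running-mean estimate of the static component and an illustrative window length (the study's exact smoothing settings are not stated):

```python
import numpy as np

def odba(acc, window=16):
    """Overall dynamic body acceleration from tri-axial accelerometer data.

    acc: (n, 3) array of raw acceleration samples. The static component is
    estimated per axis with a running mean and subtracted; ODBA is the sum
    of the absolute dynamic components. window is an illustrative choice.
    """
    acc = np.asarray(acc, dtype=float)
    kernel = np.ones(window) / window
    # Running-mean estimate of the static (gravitational) component per axis.
    static = np.vstack([
        np.convolve(acc[:, i], kernel, mode="same") for i in range(3)
    ]).T
    dynamic = acc - static
    return np.abs(dynamic).sum(axis=1)
```

A stationary animal (constant acceleration) yields ODBA near zero away from the window's edge effects, while movement inflates the dynamic components on all three axes.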
In semi-arid environments characterized by erratic rainfall and scattered primary production, migratory movements are a key survival strategy of large herbivores to track resources over vast areas. Veterinary Cordon Fences (VCFs), intended to reduce wildlife-livestock disease transmission, fragment large parts of southern Africa and have limited the movements of large wild mammals for over 60 years. Consequently, wildlife-fence interactions are frequent and often result in perforations of the fence, mainly caused by elephants. Yet, we lack knowledge about at which times fences act as barriers, how fences directly alter the energy expenditure of native herbivores, and what the consequences of impermeability are. We studied two years of movements of three common antelope species (springbok, kudu, eland) across a perforated part of Namibia's VCF separating a wildlife reserve and Etosha National Park using GPS telemetry, accelerometer measurements, and satellite imagery. We identified 2905 fence interaction events, which we used to evaluate critical times of encounters and direct fence effects on energy expenditure. Using vegetation type-specific greenness dynamics, we quantified what animals gained in terms of high-quality food resources from crossing the VCF. Our results show that the perforation of the VCF sustains herbivore-vegetation interactions in the savanna with its scattered resources. Fence permeability led to peaks in crossing numbers during the first flush of woody plants before the rain started. Kudu and eland often showed increased energy expenditure when crossing the fence. Energy expenditure was lowered during the frequent interactions of ungulates standing at the fence. We found no alteration of energy expenditure when springbok immediately found and crossed fence breaches. Our results indicate that constantly open gaps did not affect energy expenditure, while gaps with obstacles increased motion.
Closing gaps may have confused ungulates and modified their intended movements. The space use of the browsing, sedentary kudu was less affected by the VCF, whereas the migratory, mixed-feeding springbok and eland benefited from gaps by gaining forage quality and quantity after crossing. This highlights the importance of access to vast areas in allowing ungulates to track vital vegetation patches.
Glaciated high-alpine areas are fundamentally altered by climate change, with well-known implications for hydrology, e.g., due to glacier retreat, longer snow-free periods, and more frequent and intense summer rainstorms. While knowledge on how these hydrological changes will propagate to suspended sediment dynamics is still scarce, it is needed to inform mitigation and adaptation strategies. To understand the processes and source areas most relevant to sediment dynamics, we analyzed discharge and sediment dynamics in high temporal resolution as well as their patterns on several spatial scales, which to date few studies have done.
We used a nested catchment setup in the Upper Ötztal in Tyrol, Austria, where high-resolution (15 min) time series of discharge and suspended sediment concentrations are available for up to 15 years (2006–2020). The catchments of the gauges in Vent, Sölden and Tumpen range from 100 to almost 800 km² with 10 % to 30 % glacier cover and span an elevation range of 930 to 3772 m a.s.l. We analyzed discharge and suspended sediment yields (SSY), their distribution in space, their seasonality and spatial differences therein, and the relative importance of short-term events. We complemented our analysis by linking the observations to satellite-based snow cover maps, glacier inventories, mass balances and precipitation data.
Our results indicate that the areas above 2500 m a.s.l., characterized by glacier tongues and the most recently deglaciated areas, are crucial for sediment generation in all sub-catchments. This notion is supported by the synchronous spring onset of sediment export at the three gauges, which coincides with snowmelt above 2500 m but lags behind spring discharge onsets. This points to a limitation of suspended sediment supply as long as the areas above 2500 m are snow-covered. The positive correlation of annual SSY with glacier cover (among catchments) and glacier mass balances (within a catchment) further supports the importance of the glacier-dominated areas. The analysis of short-term events showed that summer precipitation events were associated with peak sediment concentrations and yields but on average accounted for only 21 % of the annual SSY in the headwaters. These results indicate that under current conditions, thermally induced sediment export (through snow and glacier melt) is dominant in the study area.
Our results extend the scientific knowledge on current hydro-sedimentological conditions in glaciated high-alpine areas and provide a baseline for studies on projected future changes in hydro-sedimentological system dynamics.
Quantifying neurological disorders from voice is a rapidly growing field of research and holds promise for unobtrusive and large-scale disorder monitoring. The data recording setup and data analysis pipelines are both crucial aspects to effectively obtain relevant information from participants. Therefore, we performed a systematic review to provide a high-level overview of practices across various neurological disorders and highlight emerging trends. PRISMA-based literature searches were conducted through PubMed, Web of Science, and IEEE Xplore to identify publications in which original (i.e., newly recorded) datasets were collected. Disorders of interest were psychiatric and neurodegenerative disorders, such as bipolar disorder, depression, and stress, as well as amyotrophic lateral sclerosis, Alzheimer's, and Parkinson's disease, and speech impairments (aphasia, dysarthria, and dysphonia). Of the 43 retrieved studies, Parkinson's disease is represented most prominently with 19 discovered datasets. Free speech and read speech tasks are most commonly used across disorders. Besides popular feature extraction toolkits, many studies utilise custom-built feature sets. Correlations of acoustic features with psychiatric and neurodegenerative disorders are presented. In terms of analysis, statistical analysis for significance of individual features is commonly used, as well as predictive modeling approaches, especially with support vector machines and a small number of artificial neural networks. An emerging trend and recommendation for future studies is to collect data in everyday life to facilitate longitudinal data collection and to capture the behavior of participants more naturally. Another emerging trend is to record additional modalities to voice, which can potentially increase analytical performance.
Thousands of glacier lakes have been forming behind natural dams in high mountains following glacier retreat since the early 20th century. Some of these lakes abruptly released pulses of water and sediment with disastrous downstream consequences. Yet it remains unclear whether the reported rise of these glacier lake outburst floods (GLOFs) has been fueled by a warming atmosphere and enhanced meltwater production, or simply a growing research effort. Here we estimate trends and biases in GLOF reporting based on the largest global catalog of 1,997 dated glacier-related floods in six major mountain ranges from 1901 to 2017. We find that the positive trend in the number of reported GLOFs has decayed distinctly after a break in the 1970s, coinciding with independently detected trend changes in annual air temperatures and in the annual number of field-based glacier surveys (a proxy of scientific reporting). We observe that GLOF reports and glacier surveys decelerated, while temperature rise accelerated in the past five decades. Enhanced warming alone can thus hardly explain the annual number of reported GLOFs, suggesting that temperature-driven glacier lake formation, growth, and failure are weakly coupled, or that outbursts have been overlooked. Indeed, our analysis emphasizes a distinct geographic and temporal bias in GLOF reporting, and we project that, on average, two to four out of five GLOFs might have gone unnoticed in the early to mid-20th century. We recommend that such biases should be considered, or better, corrected for, when attributing the frequency of reported GLOFs to atmospheric warming.
Cognitive resources contribute to balance control. There is evidence that mental fatigue reduces cognitive resources and impairs balance performance, particularly in older adults and when balance tasks are complex, for example when trying to walk or stand while concurrently performing a secondary cognitive task.
We conducted a systematic literature search in PubMed (MEDLINE), Web of Science and Google Scholar to identify eligible studies and performed a random effects meta-analysis to quantify the effects of experimentally induced mental fatigue on balance performance in healthy adults. Subgroup analyses were computed for age (healthy young vs. healthy older adults) and balance task complexity (balance tasks with high complexity vs. balance tasks with low complexity) to examine the moderating effects of these factors on fatigue-mediated balance performance.
We identified 7 eligible studies with 9 study groups and 206 participants. Analysis revealed that performing a prolonged cognitive task had a small but significant effect (SMDwm = −0.38) on subsequent balance performance in healthy young and older adults. However, age- and task-related differences in balance responses to fatigue could not be confirmed statistically.
Overall, aggregation of the available literature indicates that mental fatigue generally reduces balance in healthy adults. However, interactions between cognitive resource reduction, aging and balance task complexity remain elusive.
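A weighted-mean SMD of the kind reported above (SMDwm) is typically obtained by inverse-variance random-effects pooling. The sketch below uses the DerSimonian-Laird heterogeneity estimator, which may differ from the exact software defaults of this meta-analysis:

```python
import math

def random_effects_pool(effects, variances):
    """Inverse-variance random-effects pooling (DerSimonian-Laird).

    Generic illustration of how study-level effect sizes are combined into
    a weighted-mean SMD; not the review's exact analysis pipeline.
    """
    k = len(effects)
    w = [1 / v for v in variances]
    # Fixed-effect estimate and Cochran's Q heterogeneity statistic.
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    # Between-study variance, truncated at zero.
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight with the between-study variance added.
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, se
```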
In recent years, digital technologies have become a major means for providing health-related services, and this trend was strongly reinforced by the current Coronavirus disease 2019 (COVID-19) pandemic. As it is well-known that regular physical activity has positive effects on individual physical and mental health and thus is an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. However, in the course of this development, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, the above-mentioned terms are often used in several different ways, but also relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication which can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms which are applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
Importance Alcohol consumption (AC) leads to death and disability worldwide. Ongoing discussions on potential negative effects of the COVID-19 pandemic on AC need to be informed by real-world evidence.
Objective To examine whether lockdown measures are associated with AC and consumption-related temporal and psychological within-person mechanisms.
Design, Setting, and Participants This quantitative, intensive, longitudinal cohort study recruited 1743 participants from 3 sites from February 20, 2020, to February 28, 2021. Data were provided before and during the second lockdown of the COVID-19 pandemic in Germany: before lockdown (October 2 to November 1, 2020); light lockdown (November 2 to December 15, 2020); and hard lockdown (December 16, 2020, to February 28, 2021).
Main Outcomes and Measures Daily ratings of AC (main outcome) captured during 3 lockdown phases (main variable) and temporal (weekends and holidays) and psychological (social isolation and drinking intention) correlates.
Results Of the 1743 screened participants, 189 (119 [63.0%] male; median [IQR] age, 37 [27.5-52.0] years) with at least 2 alcohol use disorder (AUD) criteria according to the Diagnostic and Statistical Manual of Mental Disorders (Fifth Edition) yet without the need for medically supervised alcohol withdrawal were included. These individuals provided 14 694 smartphone ratings from October 2020 through February 2021. Multilevel modeling revealed significantly higher AC (grams of alcohol per day) on weekend days vs weekdays (β = 11.39; 95% CI, 10.00-12.77; P < .001). Alcohol consumption was above the overall average on Christmas (β = 26.82; 95% CI, 21.87-31.77; P < .001) and New Year’s Eve (β = 66.88; 95% CI, 59.22-74.54; P < .001). During the hard lockdown, perceived social isolation was significantly higher (β = 0.12; 95% CI, 0.06-0.15; P < .001), but AC was significantly lower (β = −5.45; 95% CI, −8.00 to −2.90; P = .001). Independent of lockdown, intention to drink less alcohol was associated with lower AC (β = −11.10; 95% CI, −13.63 to −8.58; P < .001). Notably, differences in AC between weekend and weekdays decreased both during the hard lockdown (β = −6.14; 95% CI, −9.96 to −2.31; P = .002) and in participants with severe AUD (β = −6.26; 95% CI, −10.18 to −2.34; P = .002).
Conclusions and Relevance This 5-month cohort study found no immediate negative associations of lockdown measures with overall AC. Rather, weekend-weekday and holiday AC patterns exceeded lockdown effects. Differences in AC between weekend days and weekdays evinced that weekend drinking cycles decreased as a function of AUD severity and lockdown measures, indicating a potential mechanism of losing and regaining control. This finding suggests that temporal patterns and drinking intention constitute promising targets for prevention and intervention, even in high-risk individuals.
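The weekend-weekday contrast at the heart of these findings is a within-person quantity. The simulation below sketches how such an effect can be recovered from daily ratings; all numbers are illustrative, loosely mirroring the reported weekend estimate of about +11.4 g/day, and the simple averaging used here is a simplified stand-in for the study's multilevel model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily alcohol consumption (g/day): 50 participants x 60 days,
# with a person-specific baseline (random intercept) and a +11.39 g/day
# weekend effect. All parameters are illustrative, not the study's data.
n_persons, n_days = 50, 60
baseline = rng.normal(30, 8, size=(n_persons, 1))
weekend = (np.arange(n_days) % 7 >= 5).astype(float)   # two weekend days per week
ac = baseline + 11.39 * weekend + rng.normal(0, 5, size=(n_persons, n_days))

# Within-person estimate of the weekend effect: each person's weekend mean
# minus weekday mean, averaged over persons. The person-specific baselines
# cancel out, which is what a random-intercept multilevel model exploits.
per_person = ac[:, weekend == 1].mean(axis=1) - ac[:, weekend == 0].mean(axis=1)
effect = per_person.mean()
```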
This study examines the access to healthcare for children and adolescents with three common chronic diseases (type-1 diabetes (T1D), obesity, or juvenile idiopathic arthritis (JIA)) within the 4th (Delta), 5th (Omicron), and beginning of the 6th (Omicron) wave (June 2021 until July 2022) of the COVID-19 pandemic in Germany in a cross-sectional study using three national patient registries. A paper-and-pencil questionnaire was given to parents of pediatric patients (<21 years) during routine check-ups. The questionnaire contains self-constructed items assessing the frequency of healthcare appointments and cancellations, remote healthcare, and satisfaction with healthcare. In total, 905 parents participated in the T1D-sample, 175 in the obesity-sample, and 786 in the JIA-sample. In general, satisfaction with healthcare (scale: 0–10; 10 reflecting the highest satisfaction) was quite high (median values: T1D 10, JIA 10, obesity 8.5). The proportion of children and adolescents with canceled appointments was relatively small (T1D 14.1%, JIA 11.1%, obesity 20%), with a median of one missed appointment in each group. Only a few parents (T1D 8.6%; obesity 13.1%; JIA 5%) reported obstacles regarding health services during the pandemic. To conclude, it seems that access to healthcare was largely preserved for children and adolescents with chronic health conditions during the COVID-19 pandemic in Germany.