Department Sport- und Gesundheitswissenschaften
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. Having previously observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, we here aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, and subsequently related it to functional activation in an a priori region of interest encompassing the NAcc and amygdala, and to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging, and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed a PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r_s = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; whether its involvement differs along the disease trajectory should be investigated in future studies.
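As a concrete illustration of the rank correlation (r_s) used to relate behavioral PIT strength to polygenic risk, a minimal Spearman correlation can be written in plain Python. The per-subject values below are invented toy data, not the study's:

```python
def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with tied values sharing the average of their ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend j over a run of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank for the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical toy data: per-subject PIT effect sizes and polygenic scores.
pit = [0.1, 0.4, 0.2, 0.8, 0.5, 0.3]
prs = [-0.5, 0.2, -0.1, 0.9, 0.3, 0.4]
print(spearman(pit, prs))
```

For data without ties (as above) this agrees with the classical formula 1 - 6*sum(d^2)/(n*(n^2 - 1)) on the rank differences d.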
Background
Generalized weakness and fatigue are underexplored symptoms in emergency medicine. Triage tools often underestimate patients presenting to the emergency department (ED) with these nonspecific symptoms (Nemec et al., 2010). At the same time, physicians' disease severity rating (DSR) on a scale from 0 (not sick at all) to 10 (extremely sick) predicts key outcomes in ED patients (Beglinger et al., 2015; Rohacek et al., 2015). Our goals were (1) to characterize ED patients with weakness and/or fatigue (W|F); (2) to explore to what extent physicians' DSR at triage can predict five key outcomes in ED patients with W|F; (3) to assess how well DSR performs relative to two commonly used benchmark methods, the Emergency Severity Index (ESI) and the Charlson Comorbidity Index (CCI); (4) to determine to what extent DSR provides predictive information beyond ESI, CCI, or their linear combination, i.e., whether ESI and CCI should be used alone or in combination with DSR; and (5) to determine to what extent ESI, CCI, or their linear combination provide predictive information beyond DSR alone, i.e., whether DSR should be used alone or in combination with ESI and/or CCI.
Methods
Prospective observational study conducted at a single center between 2013 and 2015 (analysis in 2018-2020; study team blinded to hypothesis). We studied an all-comer cohort of 3,960 patients (48% female, median age = 51 years, 94% completed 1-year follow-up). We examined two primary outcomes (acute morbidity (Bingisser et al., 2017; Weigel et al., 2017) and all-cause 1-year mortality) and three secondary outcomes (in-hospital mortality, hospitalization, and transfer to ICU). We assessed the predictive power (i.e., resolution, measured as the area under the ROC curve, AUC) of the scores and, using logistic regression, of their linear combinations.
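The resolution measure used here, the area under the ROC curve, can be sketched via its rank-sum (Mann-Whitney) formulation: the probability that a randomly chosen case receives a higher score than a randomly chosen non-case. The DSR values and outcomes below are hypothetical toy data, not the study cohort:

```python
def auc(scores, labels):
    """AUC as the Mann-Whitney win rate: fraction of positive/negative
    pairs where the positive case scores higher (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical toy data: DSR values (0-10) and a binary outcome.
dsr = [2, 5, 5, 7, 8, 4, 6, 9, 1, 3]
outcome = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]
print(auc(dsr, outcome))
```

An AUC of 0.5 corresponds to chance; the "well above chance" threshold reported in the Findings (roughly 0.70) means a randomly chosen case outranks a randomly chosen non-case about 70% of the time.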
Findings
Compared to patients without W|F (n = 3,227), patients with W|F (n = 733) showed higher prevalences of all five outcomes, reported more symptoms across both genders, and received higher DSRs (median = 4, interquartile range (IQR) = 3-6, vs. median = 3, IQR = 2-5). DSR predicted all five outcomes well above chance (AUCs of roughly 0.70 or higher), similarly well for patients with and without W|F, and as well as or better than ESI and CCI in both groups (except for 1-year mortality, where CCI performed better). For acute morbidity, hospitalization, and transfer to ICU, there is clear evidence that adding DSR to ESI and/or CCI improves predictions for both patient groups; for 1-year mortality and in-hospital mortality this holds for most, but not all, comparisons. Adding ESI and/or CCI to DSR generally did not improve performance, and sometimes decreased it.
Conclusions
The use of physicians' disease severity rating has not previously been investigated in patients with generalized weakness and fatigue. We show that physicians' predictions of acute morbidity, mortality, hospitalization, and transfer to ICU via their DSR are also accurate in these patients. However, across all patients, DSR is less predictive of acute morbidity for female than for male patients. Future research should investigate how emergency physicians judge their patients' clinical state at triage and how this can be improved and used in simple decision aids.
Beyond randomised studies
(2020)
Background: Habitual walking speed predicts many clinical conditions later in life, but it declines with age. However, which particular exercise intervention can minimize the age-related gait speed loss is unclear.
Purpose: Our objective was to determine the effects of strength, power, coordination, and multimodal exercise training on healthy old adults' habitual and fast gait speed.
Methods: We performed a computerized systematic literature search in PubMed and Web of Knowledge from January 1984 up to December 2014. Search terms included 'resistance training', 'power training', 'coordination training', 'multimodal training', and 'gait speed' (outcome term). Inclusion criteria were articles available in full text, publication within the past 30 years, human subjects, journal articles, clinical trials, randomized controlled trials, English as publication language, and subject age ≥65 years. The methodological quality of all eligible intervention studies was assessed using the Physiotherapy Evidence Database (PEDro) scale. We computed weighted average standardized mean differences of the intervention-induced adaptations in gait speed using a random-effects model and tested for overall and individual intervention effects relative to no-exercise controls.
Results: A total of 42 studies (mean PEDro score 5.0 ± 1.2) were included in the analyses (2495 healthy old adults; age 74.2 years [64.4-82.7]; body mass 69.9 ± 4.9 kg; height 1.64 ± 0.05 m; body mass index 26.4 ± 1.9 kg/m²; gait speed 1.22 ± 0.18 m/s). The search identified only one power training study; therefore, the subsequent analyses focused only on the effects of resistance, coordination, and multimodal training on gait speed. The three types of intervention improved gait speed in the three experimental groups combined (n = 1297) by 0.10 m/s (± 0.12) or 8.4% (± 9.7), with a large effect size (ES) of 0.84. Resistance (24 studies; n = 613; 0.11 m/s; 9.3%; ES: 0.84), coordination (eight studies; n = 198; 0.09 m/s; 7.6%; ES: 0.76), and multimodal training (19 studies; n = 486; 0.09 m/s; 8.4%; ES: 0.86) increased gait speed statistically significantly and to a similar extent.
Conclusions: Commonly used exercise interventions can functionally and clinically increase habitual and fast gait speed and help slow the loss of gait speed or delay its onset.
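The meta-analytic machinery described in the Methods, standardized mean differences pooled under a random-effects model, can be sketched in plain Python. This uses the common Hedges' g effect size and the DerSimonian-Laird estimator; the gait-speed summary statistics below are invented for illustration, not taken from the reviewed studies:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its approximate
    variance for one intervention-vs-control comparison."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample bias correction
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def random_effects(effects):
    """DerSimonian-Laird random-effects pooling of (g, var) pairs."""
    w = [1 / v for _, v in effects]
    ybar = sum(wi * g for (g, _), wi in zip(effects, w)) / sum(w)
    q = sum(wi * (g - ybar)**2 for (g, _), wi in zip(effects, w))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    wstar = [1 / (v + tau2) for _, v in effects]
    pooled = sum(wi * g for (g, _), wi in zip(effects, wstar)) / sum(wstar)
    return pooled, tau2

# Hypothetical gait-speed summaries (m/s): mean, SD, n for training vs control.
studies = [hedges_g(1.30, 0.15, 30, 1.20, 0.15, 30),
           hedges_g(1.25, 0.18, 25, 1.18, 0.17, 24),
           hedges_g(1.28, 0.20, 40, 1.15, 0.19, 38)]
pooled, tau2 = random_effects(studies)
print(pooled, tau2)
```

When the between-study heterogeneity estimate tau2 is zero, the random-effects pooled estimate reduces to the inverse-variance fixed-effect average.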
Background:
Osteoporosis is a growing public health problem. It is known that stress-related diseases such as depression may impair bone quality and lead to osteoporosis. The association between psychosocial stress and bone health may be triggered by alterations of mitochondrial function, cell signaling, and intercellular communication. In this context, extracellular vesicles (EVs) may be relevant due to their crucial role as intercellular communicators through the transfer of cargo.
Aim:
This narrative review aims to summarize whether the cargo of extracellular vesicles can constitute a biological link between psychosocial stress and osteoporosis.
Methods:
To address this research question, a thorough literature search was conducted using PubMed, Google Scholar, and ScienceDirect. The search keywords were 'allostatic load', 'bone remodeling', 'microRNA', 'osteoblast', and 'osteoclast'. A total of 21 articles were included in the narrative review.
Results:
We found that certain miRNAs in EVs, including miR-126a-3p, miR-128-3p, and miR-187-5p, have been described as crucial players in both psychosocial stress and osteoporosis.
Discussion:
This review describes EVs and their cargo as a potential mediator linking psychosocial stress and osteoporosis for the first time by highlighting common crucial miRNAs. However, based on the included studies, it is unclear whether EV-mediated transport of biological cargoes can alter the function of target cells in a real physiological environment.
Cell-free protein synthesis
(2020)
Proteins are the main source of drug targets and some of them possess therapeutic potential themselves. Among them, membrane proteins constitute approximately 50% of the major drug targets. In the drug discovery pipeline, rapid methods for producing different classes of proteins in a simple manner with high quality are important for structural and functional analysis. Cell-free systems are emerging as an attractive alternative for the production of proteins due to their flexible nature without any cell membrane constraints. In a bioproduction context, open systems based on cell lysates derived from different sources, and with batch-to-batch consistency, have acted as a catalyst for cell-free synthesis of target proteins. Most importantly, proteins can be processed for downstream applications like purification and functional analysis without the necessity of transfection, selection, and expansion of clones. In the last 5 years, there has been an increased availability of new cell-free lysates derived from multiple organisms, and their use for the synthesis of a diverse range of proteins. Despite this progress, major challenges still exist in terms of scalability, cost effectiveness, protein folding, and functionality. In this review, we present an overview of different cell-free systems derived from diverse sources and their application in the production of a wide spectrum of proteins. Further, this article discusses some recent progress in cell-free systems derived from Chinese hamster ovary and Sf21 lysates containing endogenous translocationally active microsomes for the synthesis of membrane proteins. We particularly highlight the usage of internal ribosomal entry site sequences for more efficient protein production, and also the significance of site-specific incorporation of non-canonical amino acids for labeling applications and creation of antibody drug conjugates using cell-free systems. 
We also discuss strategies to overcome the major challenges involved in commercializing cell-free platforms from a laboratory level for future drug development.
Field-based sports require athletes to run sub-maximally over significant distances, often while contending with dynamic perturbations to preferred coordination patterns. The ability to adapt movement to maintain performance under such perturbations appears to be trainable through exposure to task variability, which encourages movement variability. The aim of the present study was to investigate the extent to which various wearable resistance loading magnitudes alter coordination and induce movement variability during running. To investigate this, 14 participants (three female and 11 male) performed 10 sub-maximal velocity shuttle runs with either no weight or 1%, 3%, or 5% of body weight attached to the lower limbs. Sagittal plane lower limb joint kinematics from one complete stride cycle in each run were assessed using functional data analysis techniques, both across the participant group and within individuals. At the group level, decreases in ankle plantarflexion following toe-off were evident in the 3% and 5% conditions, while increased knee flexion occurred during weight acceptance in the 5% condition compared with unloaded running. At the individual level, between-run joint angle profiles varied, with six participants exhibiting increased joint angle variability in one or more loading conditions compared with unloaded running. Loading of 5% decreased between-run ankle joint variability in two individuals, likely reflecting the need to manage increased system load or the novelty of the task. In terms of joint coordination, the most considerable alterations occurred in the 5% loading condition at the hip-knee joint pair; however, only a minority of participants exhibited this tendency. Coaches should prescribe wearable resistance individually to perturb preferred coordination patterns and encourage movement variability, without loading to the extent that movement options become limited.