Electrical muscle stimulation (EMS) is an increasingly popular training method and has become a focus of research in recent years. New EMS devices offer a wide range of mobile applications for whole-body EMS (WB-EMS) training, e.g., the intensification of dynamic low-intensity endurance exercises through WB-EMS. The present study aimed to determine the differences in exercise intensity between conventional and WB-EMS-superimposed walking, and between conventional and WB-EMS-superimposed Nordic walking (WB-EMS-NW), during a treadmill test. Eleven participants (52.0 ± years; 85.9 ± 7.4 kg, 182 ± 6 cm, BMI 25.9 ± 2.2 kg/m²) performed a 10 min treadmill test at a given velocity (6.5 km/h) in four test conditions: walking (W) and Nordic walking (NW), each both conventional and WB-EMS-superimposed. Absolute oxygen uptake (VO2), oxygen uptake relative to body weight (rel. VO2), lactate, and the rating of perceived exertion (RPE) were measured before and after the test. WB-EMS intensity was adjusted individually according to participant feedback. Descriptive statistics are given as mean ± SD. For the statistical analyses, a one-factorial ANOVA for repeated measures and a two-factorial ANOVA [factors: EMS, W/NW, and their combination (EMS*W/NW)] were performed (α = 0.05). Significant effects were found for the EMS and W/NW factors for the outcome variables VO2 (EMS: p = 0.006, r = 0.736; W/NW: p < 0.001, r = 0.870), relative VO2 (EMS: p < 0.001, r = 0.850; W/NW: p < 0.001, r = 0.937), and lactate (EMS: p = 0.003, r = 0.771; W/NW: p = 0.003, r = 0.764), with both factors producing higher values. However, the differences in VO2 and relative VO2 are within the range of biological variability of ± 12%. The factor combination EMS*W/NW was statistically non-significant for all three variables. WB-EMS resulted in higher RPE values (p = 0.035, r = 0.613); RPE differences for W/NW and EMS*W/NW were not significant.
The current study results indicate that WB-EMS influences the parameters of exercise intensity. The impact on exercise intensity and the clinical relevance of WB-EMS-superimposed walking (WB-EMS-W) exercise is questionable because of the marginal differences in the outcome variables.
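The two-factorial repeated-measures analysis described above can be sketched as follows. This is a minimal illustration on synthetic VO2 values (all numbers are invented, not the study's data), using statsmodels' AnovaRM:

```python
# Sketch of a two-factorial repeated-measures ANOVA
# (factors: EMS on/off and gait form W/NW) on synthetic VO2 data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(42)
rows = []
for subj in range(11):                    # 11 participants, as in the study
    base = rng.normal(2.5, 0.3)           # hypothetical absolute VO2 (L/min)
    for ems in ("CON", "WB-EMS"):
        for gait in ("W", "NW"):
            vo2 = base
            vo2 += 0.15 if ems == "WB-EMS" else 0.0   # assumed EMS effect
            vo2 += 0.30 if gait == "NW" else 0.0      # assumed NW effect
            rows.append({"id": subj, "ems": ems, "gait": gait,
                         "vo2": vo2 + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="vo2", subject="id", within=["ems", "gait"]).fit()
print(res.anova_table)   # F and p for ems, gait, and the ems*gait interaction
```

With effects built into both factors but no interaction, the table mirrors the pattern the abstract reports: significant main effects and a non-significant factor combination.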
Secretory Wnt trafficking can be studied in the polarized epithelial monolayer of Drosophila wing imaginal discs (WID). In this tissue, Wg (Drosophila Wnt-1) is presented on the apical surface of its source cells before being internalized into the endosomal pathway. Long-range Wg secretion and spread depend on secondary secretion from endosomal compartments, but the exact post-endocytic fate of Wg is poorly understood. Here, we summarize and present three protocols for the immunofluorescence-based visualization and quantitation of different pools of intracellular and extracellular Wg in WID: (1) steady-state extracellular Wg; (2) dynamic Wg trafficking inside endosomal compartments; and (3) dynamic Wg release to the cell surface. Using a genetic driver system for gene manipulation specifically in the posterior part of the WID (EnGal4) provides a robust internal control that allows for direct comparison of signal intensities between the control and manipulated compartments of the same WID. It thereby also circumvents the high degree of staining variability usually associated with whole-tissue samples. In combination with the genetic manipulation of Wg pathway components that is easily feasible in Drosophila, these methods provide a tool-set for the dissection of secretory Wg trafficking and can help us understand how Wnt proteins travel along endosomal compartments for short- and long-range signal secretion.
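The EnGal4 internal control described above amounts to comparing signal intensity between the manipulated posterior and the control anterior compartment of the same disc. A minimal sketch of such a per-disc ratio on a synthetic image (the array values and the clean half-and-half split are hypothetical simplifications, not the protocol's segmentation):

```python
# Per-disc posterior/anterior intensity ratio on a synthetic image.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2D Wg immunofluorescence image of one disc (arbitrary units);
# left half = anterior (internal control), right half = posterior (manipulated).
img = rng.normal(100.0, 10.0, size=(128, 128))
img[:, 64:] *= 0.6          # simulate reduced Wg signal after manipulation

anterior = img[:, :64]
posterior = img[:, 64:]

# Dividing the posterior mean by the anterior mean of the same disc cancels
# disc-to-disc staining variability, which is the point of the internal control.
ratio = posterior.mean() / anterior.mean()
print(f"posterior/anterior intensity ratio: {ratio:.2f}")
```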
Background
Depression is one of the key factors contributing to difficulties in one's ability to work, and it is one of the major reasons why employees apply for psychotherapy and receive insurance subsidization of treatments. Hence, a growing number of studies rely on workability assessment scales as their primary outcome measure. The Work and Social Adjustment Scale (WSAS) has been documented as one of the most psychometrically reliable and valid tools specifically developed to assess workability and social functioning in patients with mental health problems. Yet the application of the WSAS in Germany has been limited by the lack of a validated German-language version. The objective of the present study was to translate the WSAS, a brief and easily administrable tool, into German and to test its psychometric properties in a sample of adults with depression.
Methods
Two hundred seventy-seven patients (M = 48.3 years, SD = 11.1) with mild to moderately severe depression were recruited. A multistep translation from English into the German language was performed and the factorial validity, criterion validity, convergent validity, discriminant validity, internal consistency, and floor and ceiling effects were examined.
Results
The confirmatory factor analysis confirmed the one-factor structure of the WSAS. Significant correlations with the WHODAS 2.0 questionnaire, a measure of functionality, demonstrated good convergent validity. Significant correlations with depression and quality of life demonstrated good criterion validity. The WSAS also demonstrated strong internal consistency (α = .89), and the absence of floor and ceiling effects indicated good sensitivity of the instrument.
Conclusions
The results of the present study demonstrated that the German version of the WSAS has good psychometric properties comparable to other international versions of this scale. The findings support a global assessment of psychosocial functioning using the WSAS sum score.
Training intervention effects on cognitive performance and neuronal plasticity — A pilot study
(2022)
Studies suggest that people suffering from chronic pain may have altered brain plasticity, along with altered functional connectivity between pain-processing brain regions. These changes may be related to decreased mood and cognitive performance. There is some debate as to whether physical activity combined with behavioral therapy (e.g., cognitive distraction, body scan) may counteract these changes. However, the underlying neuronal mechanisms are unclear. The aim of the current pilot study, with a 3-armed randomized controlled trial design, was to examine the effects of sensorimotor training for non-specific chronic low back pain on (1) cognitive performance; (2) fMRI activity co-fluctuations (functional connectivity) between pain-related brain regions; and (3) the relationship between functional connectivity and subjective variables (pain and depression). Six hundred sixty-two volunteers with non-specific chronic low back pain were randomly allocated to a unimodal (sensorimotor training) intervention, a multidisciplinary (sensorimotor training and behavioral therapy) intervention, or a control group within a multicenter study. A subsample of patients (n = 21) from one study center participated in the pilot study presented here. Measurements took place at baseline, during the intervention (3 weeks, M2), and after the intervention (12 weeks, M4; 24 weeks, M5). Cognitive performance was measured by the Trail Making Test and functional connectivity by fMRI. Pain perception and depression were assessed by the Von Korff questionnaire and the Hospital Anxiety and Depression Scale. Group differences were calculated by univariate and repeated-measures ANOVAs and Bayesian statistics; correlations by Pearson's r. Changes in and correlations of functional connectivity were analyzed within a pooled intervention group (uni- and multidisciplinary groups combined). Results revealed that participants with higher pain intensity at baseline showed higher functional connectivity between the pain-related brain areas used as ROIs in this study.
Though small sample sizes limit generalization, cognitive performance increased in the multidisciplinary group. Increased functional connectivity was observed in participants with increased pain ratings. Pain ratings and connectivity in pain-related brain regions decreased after the intervention. The results provide a preliminary indication that intervention effects can be achieved on the cognitive and neuronal level. The intervention may be suitable for therapy and prevention of non-specific chronic low back pain.
The primary aim of the current study was to examine the unique contribution of psychological need frustration and need satisfaction in the prediction of adults’ mental well-being and ill-being in a heterogeneous sample of adults (N = 334; Mage = 43.33, SD = 32.26; 53% females). Prior to this, validity evidence was provided for the German version of the Basic Psychological Need Satisfaction and Frustration Scale (BPNSFS) based on Self-Determination Theory (SDT). The validation analyses found the German BPNSFS to be a valid and reliable measure. Further, structural equation modeling (SEM) showed that both need satisfaction and frustration yielded unique and opposing associations with well-being. Specifically, the dimension of psychological need frustration predicted adults’ ill-being. Future research should examine whether frustration of psychological needs is involved in the onset and maintenance of psychopathology (e.g., major depressive disorder).
The Role of the Precuneus in Human Spatial Updating in a Real Environment Setting—A cTBS Study
(2022)
As we move through an environment, we update the positions of our body relative to other objects, even when some objects temporarily or permanently leave our field of view; this ability is termed egocentric spatial updating and plays an important role in everyday life. Still, our knowledge about its representation in the brain is scarce: previous studies, using virtual movements in virtual environments or patients with brain lesions, suggest that the precuneus might play an important role. However, whether this assumption also holds when healthy humans move in real environments, where full body-based cues are available in addition to the visual cues typically used in many VR studies, is unclear. Therefore, in this study we investigated the role of the precuneus in egocentric spatial updating in a real-environment setting in 20 healthy young participants who underwent two conditions in a cross-over design: (a) stimulation, achieved through applying continuous theta-burst stimulation (cTBS) to inhibit the precuneus, and (b) a sham condition (activated coil turned upside down). In both conditions, participants had to walk back blindfolded to objects they had previously memorized while walking with open eyes. Simplified trials (without spatial updating) were used as a control condition to make sure the participants were not affected by factors such as walking blindfolded, vestibular deficits, or working memory deficits. A significant interaction was found, with participants performing better in the sham condition than under real stimulation, showing smaller errors in both distance and angle. The results of our study provide evidence of an important role of the precuneus in real-environment egocentric spatial updating; studies on larger samples are necessary to confirm and further investigate this finding.
Background
Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which can develop from an acute ankle sprain, may negatively affect quality of life and ankle functionality and increase the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventive strategy against CAI in the basketball population, gathering epidemiological data is essential. However, epidemiological data on CAI in basketball are limited. Therefore, this study aims to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence.
Methods
In this cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. In addition to non-standardized questions about demographics and their history of ankle sprains, participants completed the standard Cumberland Ankle Instability Tool, which was used to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (male = 243, female = 145, 22.3 ± 3.8 years, 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between genders, competitive levels, and playing positions were determined using the chi-square test.
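The chi-square comparison of prevalences described above can be sketched on a 2×2 contingency table (gender × CAI status); the counts below are hypothetical placeholders, not the study's data:

```python
# Chi-square test of independence on a 2x2 gender-by-CAI contingency table.
from scipy.stats import chi2_contingency

#                 CAI  no CAI
table = [[110, 133],   # male   (hypothetical split of n = 243)
         [ 95,  50]]   # female (hypothetical split of n = 145)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```

A small p-value here would indicate that CAI prevalence differs between genders, which is the comparison the study reports.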
Results
In the surveyed cohort, 26% had unilateral CAI, while 50% had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). This gender disparity was also evident in the sub-analyses: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence showed no difference between competitive levels (p > 0.05) or among playing positions (p > 0.05).
Conclusions
CAI is highly prevalent in the basketball population, and gender affects its prevalence. Regardless of competitive level and playing position, the prevalence of CAI is similar. The characteristics of basketball contribute to the high prevalence, so prevention of CAI should be a focus in basketball. When applying CAI prevention measures, gender should be taken into consideration.
Introduction
Airway infection with pathogens and its associated pulmonary exacerbations (PEX) are the major causes of morbidity and premature death in cystic fibrosis (CF). Preventing or postponing chronic infections requires early diagnosis. However, limitations of conventional microbiology-based methods can hamper identification of exacerbations and specific pathogen detection. Analyzing volatile organic compounds (VOCs) in breath samples may be an interesting tool in this regard, as VOC biomarkers can characterize specific airway infections in CF.
Areas covered
We address the current achievements in VOC analysis and discuss studies assessing VOC biomarkers and fingerprints, i.e., combinations of multiple VOCs, in breath samples aiming at pathogen and PEX detection in people with CF (pwCF). We aim to provide bases for further research in this interesting field.
Expert opinion
Overall, VOC-based analysis is a promising tool for the diagnosis of infection and inflammation, with the potential to monitor disease progression in pwCF. Advantages over conventional diagnostic methods, including easy and non-invasive sampling procedures, may help to drive prompt, suitable therapeutic approaches in the future. Our review shall encourage further research, including validation of VOC-based methods. Specifically, longitudinal validation under standardized conditions is of interest in order to ensure repeatability and enable inclusion in the CF diagnostic routine.
Frailty assessment is recommended before elective transcatheter aortic valve implantation (TAVI) to determine post-interventional prognosis. Several studies have investigated frailty in TAVI patients using numerous assessments; however, it remains unclear which is the most appropriate tool for clinical practice. Therefore, we evaluated which frailty assessments are most commonly used and most meaningful for ≤30-day and ≥1-year prognosis in TAVI patients. Randomized controlled or observational studies (prospective/retrospective) investigating all-cause mortality in older (≥70 years) TAVI patients were identified (PubMed; May 2020). In total, 79 studies investigating frailty with 49 different assessments were included. As single markers of frailty, gait speed (23 studies) and serum albumin (16 studies) were used most often. Higher risk of 1-year mortality was predicted by slower gait speed (highest hazard ratio (HR): 14.71; 95% confidence interval (CI) 6.50–33.30) and lower serum albumin level (highest HR: 3.12; 95% CI 1.80–5.42). Composite indices (five items; seven studies) were associated with 30-day (highest odds ratio (OR): 15.30; 95% CI 2.71–86.10) and 1-year mortality (highest OR: 2.75; 95% CI 1.55–4.87). In conclusion, single markers of frailty, in particular gait speed, were widely used to predict 1-year mortality. Composite indices were appropriate, as was a comprehensive assessment of frailty.
Background
Elderly patients are a growing population in cardiac rehabilitation (CR). As postural control declines with age, assessment of impaired balance is important in older CR patients in order to predict fall risk and to initiate countermeasures. Functional balance tests are subjective, lack adequate sensitivity to small differences, and are subject to ceiling effects. A quantitative approach that measures postural control on a continuous scale is therefore desirable. Force plates are already used for this purpose in other clinical contexts and could therefore be a promising tool for older CR patients as well. However, the reliability of the assessment in this population is not fully known.
Research question
Analysis of test-retest reliability of center of pressure (CoP) measures for the assessment of postural control using a force plate in older CR patients.
Methods
156 CR patients (> 75 years) were enrolled. CoP measures (path length (PL), mean velocity (MV), and 95% confidence ellipse area (95CEA)) were analyzed twice with an interval of two days in between (bipedal narrow stance, eyes open (EO) and closed (EC), three trials per condition, 30 s per trial), using a force plate. For test-retest reliability estimation, absolute differences (Δ: T0−T1), intraclass correlation coefficients (ICC) with 95% confidence intervals, the standard error of measurement, and the minimal detectable change were calculated.
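The reliability statistics named above (ICC, standard error of measurement, minimal detectable change) can be sketched with the standard ANOVA-based formulas on synthetic test-retest data; this illustrates the general method, not code from the study:

```python
# ICC(2,1), SEM, and MDC95 on synthetic two-session test-retest data.
import numpy as np

def icc_2_1(data):
    """Two-way random-effects, absolute-agreement, single-measure ICC."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # sessions
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(1)
true_pl = rng.normal(60.0, 15.0, size=156)      # hypothetical path length (cm)
sessions = np.column_stack([true_pl + rng.normal(0, 3.0, 156),
                            true_pl + rng.normal(0, 3.0, 156)])

icc = icc_2_1(sessions)
sem = sessions.std(ddof=1) * np.sqrt(1 - icc)   # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem                 # minimal detectable change
print(f"ICC = {icc:.2f}, SEM = {sem:.1f} cm, MDC95 = {mdc95:.1f} cm")
```

With small session-to-session noise relative to between-subject spread, the ICC lands in the "excellent" range, mirroring the pattern the results report.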
Results
Under the EO condition, ICCs were excellent for PL and MV (0.95) and good for 95CEA (0.88), with Δ of 10.1 cm (PL), 0.3 cm/sec (MV), and 1.5 cm² (95CEA), respectively. Under the EC condition, ICCs were excellent (> 0.95) for all variables, with larger Δ (PL: 21.7 cm; MV: 0.7 cm/sec; 95CEA: 2.4 cm²).
Significance
In older CR patients, the assessment of CoP measures using a force plate shows good to excellent test-retest reliability.
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms, and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear. Prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability, and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, a sample from one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression experiences (the Hospital Anxiety and Depression Scale), and pain-related cognitions (the Fear Avoidance Beliefs Questionnaire). Repeated-measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the subsample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes in the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but no effect on the other variables of interest.
This may be partially explained by the participants' low comorbidity at baseline. The results are important with regard to cost-free, home-based alternatives for back pain patients and prevention, and this study closes the gap of missing long-term effect analyses in this field.
Stunting
(2021)
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the most predictive stress-related pattern set for CLBP for a 1-year diagnosis.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
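The LASSO-based variable selection and RMSE scoring described above can be sketched as follows, on synthetic data with hypothetical predictors (scikit-learn's LassoCV stands in for the study's exact procedure):

```python
# LASSO variable selection with RMSE as the accuracy measure, on synthetic data.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n, p = 110, 20                            # e.g. 110 participants, 20 stress markers
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.8, 0.5]     # only 5 markers truly predictive
y = X @ beta + rng.normal(0, 1.0, n)      # stand-in for characteristic pain intensity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LassoCV(cv=5, random_state=0).fit(X_tr, y_tr)

# The L1 penalty shrinks uninformative coefficients to exactly zero,
# leaving a sparse, most-predictive pattern set.
rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
selected = np.flatnonzero(model.coef_)
print(f"selected {selected.size} of {p} predictors, test RMSE = {rmse:.2f}")
```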
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and their joint occurrence as a symptom triad at baseline; mainly social-related stress types were relevant. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role in upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarker (CPI: RMSE = 12.21; DISS: RMSE = 8.94) variables could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression and the emergence of a “hypocortisolemic symptom triad,” whereby social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that enables the identification of individuals at higher risk for upcoming pain disorders and can be used in practice.
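The two prediction-accuracy measures used in this study, RMSE and the area under the ROC curve, can be computed in a few lines of plain Python. This is a generic illustration with invented numbers, not the study's data or code:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between observed and predicted values."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def roc_auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    the probability that a randomly chosen case outranks a non-case."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented example values (not taken from the study)
print(rmse([10.0, 20.0, 30.0], [12.0, 18.0, 33.0]))   # prediction error in outcome units
print(roc_auc([1, 1, 0, 0], [0.9, 0.6, 0.7, 0.2]))    # 0.75
```

An "apparent" AUC, as reported above, is computed on the same data used to fit the model and therefore tends to overestimate out-of-sample discrimination.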
In this report, we investigate small proteins involved in bacterial alternative respiratory systems that improve enzymatic efficiency through better anchorage and multimerization of membrane components. Using the small protein TorE of the respiratory TMAO reductase system as a model, we discovered that TorE is part of a subfamily of small proteins that are present in proteobacteria, in which they play a similar role for bacterial respiratory systems. We reveal by microscopy that, in Shewanella oneidensis MR-1, alternative respiratory systems are evenly distributed in the membrane, contrary to what has been described for Escherichia coli. Thus, the better efficiency of the respiratory systems observed in the presence of the small proteins is not due to a specific localization in the membrane, but rather to the formation of membranous complexes formed by TorE homologs with their c-type cytochrome partner protein. By an in vivo approach combining Clear Native electrophoresis and fluorescent translational fusions, we determined the 4:4 stoichiometry of the complexes. In addition, mild solubilization of the cytochrome indicates that the presence of the small protein reinforces its anchoring to the membrane. Therefore, assembly of the complex induced by this small protein improves the efficiency of the respiratory system.
The PNPLA3 reference single-nucleotide polymorphism rs738409 has been identified as a predisposing factor for nonalcoholic fatty liver disease. A simple method based on PCR and restriction fragment length polymorphism (RFLP) analysis had been published to detect the nonpathogenic PNPLA3 rs738409 allele. The presence of the pathogenic variant was deduced from the indigestibility of the corresponding PCR product with BtsCI, which recognizes the nonpathogenic allele. However, one cannot exclude that an enzymatic reaction fails for other, more trivial, reasons. For safe and secure detection of the pathogenic PNPLA3 rs738409 variant, we have further developed the PCR-RFLP method by adding a second restriction enzyme digest, clearly identifying the correct PNPLA3 alleles and in particular the pathogenic variant.
Method summary
The method presented here represents an improved genetic diagnosis of the PNPLA3 rs738409 alleles based on conventional and inexpensive molecular biological methods. We used methodology based on PCR and restriction fragment length polymorphisms and clearly identified both described alleles by the use of two restriction enzymes. Digestion of individuals' specific PNPLA3 PCR fragments with both enzymes in independent reactions clearly showed the PNPLA3 rs738409 genotype.
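The two-digest decision logic can be sketched in Python. GGATG is BtsCI's published recognition sequence; the second enzyme's site (`CCTGA`) and the toy fragments below are hypothetical placeholders, since the abstract does not name the second enzyme:

```python
def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def is_cut(fragment, site):
    """True if the enzyme's recognition site occurs on either strand."""
    return site in fragment or revcomp(site) in fragment

def genotype(fragment, site_nonpath, site_path):
    """Infer the allele from two independent digests of the same PCR fragment.

    site_nonpath cuts only the nonpathogenic allele, site_path only the
    pathogenic one -- a positive signal in each reaction replaces the
    'absence of digestion' inference of the single-enzyme protocol.
    """
    cut_a = is_cut(fragment, site_nonpath)
    cut_b = is_cut(fragment, site_path)
    if cut_a and not cut_b:
        return "nonpathogenic"
    if cut_b and not cut_a:
        return "pathogenic"
    return "ambiguous"  # both or neither cut: repeat the assay

# Toy fragments; GGATG is the BtsCI site, CCTGA is a hypothetical second site.
print(genotype("AAGGATGCC", "GGATG", "CCTGA"))  # nonpathogenic
print(genotype("AACCTGACC", "GGATG", "CCTGA"))  # pathogenic
```

The point of the second digest is exactly what the `ambiguous` branch captures: a failed reaction no longer masquerades as a positive genotype call.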
Risk communication plays a central role in public health emergencies: it must enable informed decisions, promote protective or life-preserving behavior, and maintain trust in public institutions. In addition, uncertainties about scientific evidence must be stated transparently, and irrational fears and rumors must be countered. Risk communication should involve the population in a participatory manner, and the public's risk perception and risk literacy must be assessed continuously. The current coronavirus disease 2019 (COVID-19) pandemic poses specific challenges for risk communication.
Knowledge on many important aspects of COVID-19 was, and often still is, uncertain or preliminary, e.g., regarding transmission, symptoms, long-term consequences, and immunity. Communication is shaped by scientific language and a multitude of figures and statistics, which can impair comprehensibility. Beyond official announcements and expert assessments, COVID-19 is discussed extensively on social media, where misinformation and speculation also spread; this "infodemic" complicates risk communication.
National and international research projects are intended to help make risk communication on COVID-19 more target-group specific and effective. These include, among others, exploratory studies on how people deal with COVID-19-related information; the COVID-19 Snapshot Monitoring (COSMO), a regularly conducted online survey on risk perception and protective behavior; and an interdisciplinary qualitative study comparing the design, implementation, and effectiveness of risk communication strategies in four countries.
The reliability of quantifying intratendinous vascularization by high-sensitivity Doppler ultrasound (advanced dynamic flow) has not yet been examined. Therefore, this study aimed to investigate the intraobserver and interobserver reliability of evaluating Achilles tendon vascularization by advanced dynamic flow using established scoring systems.
Methods
Three investigators evaluated vascularization in 67 recordings in a test-retest design, applying the Ohberg score, a modified Ohberg score, and a counting score. Intraobserver and interobserver agreement for the Ohberg score and modified Ohberg score was analyzed by the Cohen kappa and Fleiss kappa coefficients (absolute) and by the Kendall tau-b coefficient and Kendall coefficient of concordance (W; relative). The reliability of the counting score was analyzed by intraclass correlation coefficients (ICC 2.1 and 3.1), the standard error of measurement (SEM), and Bland-Altman analysis (bias and limits of agreement [LoA]).
Results
Intraobserver and interobserver agreement (absolute/relative) ranged from 0.61 to 0.87/0.87 to 0.95 and from 0.11 to 0.66/0.76 to 0.89 for the Ohberg score, and from 0.81 to 0.87/0.92 to 0.95 and from 0.64 to 0.80/0.88 to 0.93 for the modified Ohberg score, respectively. The counting score revealed an intraobserver ICC of 0.94 to 0.97 (SEM, 1.0-1.5; bias, -1; LoA, 3-4 vessels). The interobserver ICC for the counting score ranged from 0.91 to 0.98 (SEM, 1.0-1.9; bias, 0; LoA, 3-5 vessels).
Conclusions
The modified Ohberg score and the counting score showed excellent reliability and seem convenient for research and clinical practice. The Ohberg score revealed decent intraobserver but unexpectedly low interobserver reliability and therefore cannot be recommended.
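The Bland-Altman quantities used for the counting score (bias and 95% limits of agreement) can be sketched in plain Python; the paired rater counts below are invented for illustration, not study data:

```python
import math

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired ratings.

    a, b: paired vessel counts from two raters (or test/retest).
    Returns (bias, lower LoA, upper LoA).
    """
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))  # sample SD
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired counts (two raters scoring the same six recordings)
rater1 = [2, 4, 5, 3, 6, 1]
rater2 = [3, 4, 4, 3, 7, 2]
bias, lo, hi = bland_altman(rater1, rater2)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

A bias near zero with narrow limits of agreement, as in the counting-score results above, indicates that the two raters can be used interchangeably within a tolerance of a few vessels.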
Background and Study Aims
Recurrent laryngeal nerve palsy (RLNP) is a potential complication of anterior cervical discectomy and fusion (ACDF). There is still substantial disagreement on the actual prevalence of RLNP after ACDF, as well as on risk factors for postoperative RLNP. The aim of this study was to describe the prevalence of postoperative RLNP in a cohort of consecutive ACDF cases and to examine potential risk factors.
Materials and Methods
This retrospective study included patients who underwent ACDF between 2005 and 2019 at a single neurosurgical center. As part of clinical routine, RLNP was examined prior to and after surgery by independent otorhinolaryngologists using endoscopic laryngoscopy. As potential risk factors for postoperative RLNP, we examined patients' age, sex, body mass index, multilevel surgery, and duration of surgery.
Results
A total of 214 consecutive cases were included. The prevalence of preoperative RLNP was 1.4% (3/214) and the prevalence of postoperative RLNP was 9.0% (19/211). The number of operated levels was 1 in 73.5% (155/211), 2 in 24.2% (51/211), and 3 or more in 2.4% (5/211) of cases. Of all cases, 4.7% (10/211) were repeat surgeries. There was no difference in the prevalence of RLNP between the primary surgery group (9.0%, 18/183) and the repeat surgery group (10.0%, 1/10; p = 0.91). Also, there was no difference in any characteristic between subjects with postoperative RLNP and those without. We found no association between postoperative RLNP and patient age, sex, body mass index, duration of surgery, or number of levels (odds ratios between 0.24 and 1.05; p values between 0.20 and 0.97).
Conclusions
In our cohort, the prevalence of postoperative RLNP after ACDF was 9.0%. The fact that none of the examined variables was associated with the occurrence of RLNP supports the view that postoperative RLNP may depend more on direct mechanical manipulation during surgery than on specific patient or surgical characteristics.
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression compared with HIV-negative individuals. This cross-sectional study (n = 450, median age 44 years (range 19-75), 7.3% female) evaluated the prevalence rates and prevalence ratios (PR) of anxiety and/or depression in PLWH in relation to recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 0.36-0.91; p = 0.01), depression (PR = 0.41; 0.36-0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 0.24-0.75; p = 0.002) was found in exercising compared with non-exercising PLWH. Recreational exercise is associated with a lower risk for anxiety and/or depression. Further prospective studies are needed to provide insights into the direction of this association.
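A prevalence ratio of the kind reported above, together with an approximate 95% confidence interval on the log scale, can be sketched as follows; the counts are invented, not the study's data:

```python
import math

def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Prevalence ratio with a 95% CI via the log-normal approximation."""
    p1 = exposed_cases / exposed_total
    p0 = unexposed_cases / unexposed_total
    pr = p1 / p0
    # Standard error of log(PR) for independent binomial proportions
    se = math.sqrt(1 / exposed_cases - 1 / exposed_total
                   + 1 / unexposed_cases - 1 / unexposed_total)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Invented counts: 20/200 exercisers vs. 35/200 non-exercisers with anxiety
pr, lo, hi = prevalence_ratio(20, 200, 35, 200)
print(round(pr, 2), round(lo, 2), round(hi, 2))
```

A PR below 1 with an interval excluding 1, as in the results above, indicates a lower prevalence of the outcome in the exposed (here, exercising) group.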
Nonparametric goodness-of-fit testing for parametric covariate models in pharmacometric analyses
(2021)
The characterization of covariate effects on model parameters is a crucial step in pharmacokinetic/pharmacodynamic analyses. Although covariate selection criteria have been studied extensively, the choice of the functional relationship between covariates and parameters has received much less attention. Often, a particular simple class of covariate-to-parameter relationships (linear, exponential, etc.) is chosen ad hoc or based on domain knowledge, and statistical evaluation is limited to the comparison of a small number of such classes. Goodness-of-fit testing against a nonparametric alternative provides a more rigorous approach to covariate model evaluation, but no such test has been proposed so far. In this manuscript, we derive and evaluate nonparametric goodness-of-fit tests for parametric covariate models (the null hypothesis) against a kernelized Tikhonov-regularized alternative, transferring concepts from statistical learning to the pharmacological setting. The approach is evaluated in a simulation study on the estimation of the age-dependent maturation effect on the clearance of a monoclonal antibody, considering scenarios of varying data sparsity and residual error. The goodness-of-fit test correctly identified misspecified parametric models with high power in relevant scenarios. The case study provides proof of concept of the feasibility of the proposed approach, which is envisioned to be beneficial for applications that lack well-founded covariate models.
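The idea of pitting a parametric null model against a flexible nonparametric alternative can be illustrated, in drastically simplified form, by comparing the residuals of a linear fit with those of a kernel smoother. This sketch is not the authors' regularized test (which calibrates a proper test statistic against its null distribution); the data and bandwidth are invented:

```python
import math

def linear_fit(x, y):
    """Least-squares fit of the parametric null model y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    return [a + b * xi for xi in x]

def kernel_fit(x, y, bandwidth=1.0):
    """Nadaraya-Watson smoother with a Gaussian kernel: the flexible alternative."""
    preds = []
    for x0 in x:
        w = [math.exp(-0.5 * ((xi - x0) / bandwidth) ** 2) for xi in x]
        preds.append(sum(wi * yi for wi, yi in zip(w, y)) / sum(w))
    return preds

def rss(y, yhat):
    """Residual sum of squares."""
    return sum((yi - pi) ** 2 for yi, pi in zip(y, yhat))

# Invented covariate data with a saturating (maturation-like) effect
# that a straight line cannot capture
x = [0.5, 1, 2, 3, 4, 6, 8, 10]
y = [0.3, 0.5, 0.8, 0.9, 0.95, 1.0, 1.0, 1.0]

gap = rss(y, linear_fit(x, y)) - rss(y, kernel_fit(x, y, bandwidth=1.5))
# A clearly positive gap hints that the parametric form is misspecified
print(gap > 0)
```

The published test replaces this heuristic comparison with a statistic whose null distribution is known, so that "the gap is large" becomes a calibrated rejection at a chosen significance level.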
Following stroke, neuronal death takes place both in the infarct region and in brain areas distal to the lesion site, including the hippocampus. The hippocampus is critically involved in learning and memory processes and continuously generates new neurons. Dysregulation of adult neurogenesis may be associated with cognitive decline after a stroke lesion. In particular, proliferation of precursor cells and the formation of new neurons are increased after lesion. Within the first week, many new precursor cells die during development. How dying precursors are removed from the hippocampus and to what extent phagocytosis takes place after stroke is still not clear. Here, we evaluated the effect of a prefrontal stroke lesion on the phagocytic activity of microglia in the dentate gyrus (DG) of the hippocampus. Three-month-old C57BL/6J mice were injected once with the proliferation marker BrdU (250 mg/kg) 6 hr after a middle cerebral artery occlusion or sham surgery. The number of apoptotic cells and the phagocytic capacity of the microglia were evaluated by means of immunohistochemistry, confocal microscopy, and 3D reconstructions. We found a transient but significant increase in the number of apoptotic cells in the DG early after stroke, associated with impaired removal by microglia. Interestingly, phagocytosis of newly generated precursor cells was not affected. Our study shows that a prefrontal stroke lesion affects phagocytosis of apoptotic cells in the DG, a region distal to the lesion core. Whether disturbed phagocytosis might contribute to inflammatory and maladaptive processes, including cognitive impairment following stroke, needs to be further investigated.
Kraft und Kognition
(2023)
Empirical findings from cross-sectional studies in recent years point to an association between muscular strength and cognitive performance [10]. This observation is supported by longitudinal studies documenting improvements in cognitive performance following targeted resistance training interventions, which typically increase muscular strength [11]. However, the mechanisms underlying the association between muscular strength and cognitive performance are not yet fully understood and require further research [10,12]. Against this background, the research conducted for this dissertation had the overarching aim of investigating the mechanisms that may explain the association between muscular strength and cognitive performance. To this end, different populations (young adults and older adults with and without mild cognitive impairment) were studied using various methodological approaches (systematic literature review, dual-task paradigm, and functional near-infrared spectroscopy). The consecutively building studies conducted for this dissertation yielded the following main findings:
• To obtain a comprehensive overview of the current evidence on muscular strength and cognitive performance and the underlying neural correlates, a systematic literature review was conducted on this research topic. Its results document that targeted resistance training can, in addition to improving cognitive performance, lead to functional and structural changes in the brain, particularly in frontal brain regions [13]. Furthermore, the results of this review, which identified a limited number of available studies (n = 18), indicate the need for further research in this field [13].
• To test the hypothesis that higher cognitive processes are required to perform resistance exercises, an experimental study applied the dual-task paradigm to the squat exercise in younger healthy adults. The dual-task costs observed during the squat (compared with the control condition of standing) point to the involvement of higher cognitive processes in solving this movement task and confirm the hypothesis [14].
• To examine the hypothesis that specific neural correlates (functional brain activity) mediate the association between muscular strength and cognitive performance, the relationship between maximal handgrip strength (normalized to body mass index) and the cortical hemodynamic response was investigated in young healthy adults; the response was measured in prefrontal brain areas by functional near-infrared spectroscopy during a standardized cognitive test. In this cross-sectional study, the initial hypothesis could not be fully confirmed: although associations of maximal handgrip strength and cognitive performance with parameters of the hemodynamic response were observed, maximal handgrip strength was not related to short-term memory performance [16].
• To examine the assumption that a neurological condition (specifically, mild cognitive impairment), which is typically accompanied by changes in specific neural correlates (e.g., of the hippocampus [17-19] and the prefrontal cortex [20,21]), influences the association between muscular strength and cognitive performance, a cross-sectional study investigated the relationship between maximal handgrip strength (normalized to body mass index) and executive functions in older adults with the amnestic and non-amnestic subtypes of mild cognitive impairment as well as in healthy older adults. An association between maximal handgrip strength and executive functions was observed only in older adults with the amnestic subtype of mild cognitive impairment; no such correlation existed in older adults with the non-amnestic subtype or in healthy older adults [24].
• A perspective article outlined how, through the theory-guided use of the physiological effects that occur in a specific resistance training method that moderates peripheral blood flow with cuffs or bands, populations with low mechanical load tolerance in particular could benefit from the positive effects of resistance training on brain health [25].
Overall, the results of the consecutive studies combined in this dissertation point to the existence of shared neural correlates (e.g., the frontal cortex) that play an important role for both muscular strength and higher cognitive processes [26]. Considered together with the empirical evidence already available in the literature, the findings of this dissertation support the view that a relatively high level of muscular strength, and its maintenance through targeted resistance training interventions across the lifespan, can have positive effects on (brain) health [27].
Jenseits der Klinik
(2021)
Our contribution presents an interactive ethics concept developed in cooperation between the BruderhausDiakonie Reutlingen and the University of Tübingen to meet the particularities and needs of a complex organizational structure that unites several business areas and locations. We outline the main features of the interactive Nijmegen model, in which the cooperation between a committee located at the management level and situation-specific case consultations is intended to create a fruitful interplay of two indispensable modes of reflection ("top-down"/"bottom-up"). We show which challenges arose when implementing this model in the concrete organizational structure of the BruderhausDiakonie, and by which conceptual or implementation-related means they were addressed. At the center is the extension of the Nijmegen model by a linking element that coordinates the cooperation between the central committee and the decentralized case consultations and makes the interactive character of the model possible in the first place.
Dementia, as one of the most prevalent diseases, urges a better understanding of the central mechanisms responsible for clinical symptoms and necessitates improvement of current diagnostic capabilities. The brainstem nucleus locus coeruleus (LC) is a promising target for early diagnosis because of its early structural alterations and its relationship to functional disturbances in patients. In this study, we applied our improved method of localisation-based LC resting-state fMRI to investigate differences in central sensory signal processing by comparing functional connectivity (fc) of a patient group with mild cognitive impairment (MCI, n = 28) and an age-matched healthy control group (n = 29). MCI and control participants could be differentiated by their Mini-Mental State Examination (MMSE) scores (p < .001) and LC intensity ratio (p = .010). In the fMRI, LC fc to the anterior cingulate cortex (FDR p < .001) and left anterior insula (FDR p = .012) was elevated, and LC fc to the right temporoparietal junction (rTPJ, FDR p = .012) and posterior cingulate cortex (PCC, FDR p = .021) was decreased in the patient group. Importantly, LC-to-rTPJ connectivity was also positively correlated with MMSE scores in MCI patients (p = .017). Furthermore, we found a hyperactivation of the left-insula salience network in the MCI patients. Our results and our proposed disease model shed new light on the functional pathogenesis of MCI by pointing to attentional network disturbances, which could aid new therapeutic strategies and provide a marker for diagnosis and prediction of disease progression.
The study investigated the incidence of Achilles and patellar tendinopathy in adolescent elite athletes and non-athletic controls. Furthermore, predictive and associated factors for tendinopathy development were analyzed. The prospective study consisted of two measurement days (M1/M2) with an interval of 3.2 ± 0.9 years. 157 athletes (12.1 ± 0.7 years) and 25 controls (13.3 ± 0.6 years) without Achilles/patellar tendinopathy were included at M1. Clinical and ultrasound examinations of both Achilles (AT) and patellar tendons (PT) were performed. The main outcome measures were the incidence of tendinopathy and of structural intratendinous alterations (hypo-/hyperechogenicity, vascularization) at M2 [%]. The incidence of Achilles tendinopathy was 1% in athletes and 0% in controls. Patellar tendinopathy was more frequent in athletes (13%) than in controls (4%). The incidence of intratendinous alterations in ATs was 1-2% in athletes and 0% in controls, whereas in PTs it was 4-6% in both groups (p > 0.05). Intratendinous alterations at M2 were associated with patellar tendinopathy in athletes (p ≤ 0.01). Intratendinous alterations at M1, anthropometric data, training volume, sport, or sex did not predict tendinopathy development (p > 0.05). The incidence of tendinopathy and intratendinous alterations in adolescent athletes is low in ATs and more common in PTs. Development of intratendinous alterations in the PT is associated with tendinopathy; however, predictive factors could not be identified.
Objective
To improve consumer decision making, the results of risk assessments on food, feed, consumer products or chemicals need to be communicated not only to experts but also to non-expert audiences. The present study draws on evidence from literature reviews and focus groups with diverse stakeholders to identify content to integrate into an existing risk assessment communication (Risk Profile).
Methods
A combination of rapid literature reviews and focus groups with experts (risk assessors (n = 15), risk managers (n = 8)), and non-experts (general public (n = 18)) were used to identify content and strategies for including information about risk assessment results in the “Risk Profile” from the German Federal Institute for Risk Assessment. Feedback from initial focus groups was used to develop communication prototypes that informed subsequent feedback rounds in an iterative process. A final prototype was validated in usability tests with experts.
Results
Focus group feedback and suggestions from risk assessors were largely in line with findings from the literature. Risk managers and lay persons offered similar suggestions on how to improve the existing communication of risk assessment results (e.g., including more explanatory detail, reporting probabilities for individual health impairments, and specifying risks for subgroups in additional sections). Risk managers found information about quality of evidence important to communicate, whereas people from the general public found this information less relevant. Participants from lower educational backgrounds had difficulties understanding the purpose of risk assessments. User tests found that the final prototype was appropriate and feasible to implement by risk assessors.
Conclusion
An iterative and evidence-based process was used to develop content to improve the communication of risk assessments to the general public while being feasible to use by risk assessors. Remaining challenges include how to communicate dose-response relationships and standardise quality of evidence ratings across disciplines.
Boredom has been identified as one of the greatest psychological challenges when staying at home during quarantine and isolation. However, this does not mean that the situation necessarily causes boredom. On the basis of 13 explorative interviews with bored and non-bored persons who have been under quarantine or in isolation, we explain why boredom is related to a subjective interpretation process rather than being a direct consequence of the objective situation. Specifically, we show that participants vary significantly in their interpretations of staying at home and, thus, also in their experience of boredom. While the non-bored participants interpret the situation as a relief or as irrelevant, the bored participants interpret it as a major restriction that only some are able to cope with.
In recent years, digital technologies have become a major means of providing health-related services, a trend strongly reinforced by the current coronavirus disease 2019 (COVID-19) pandemic. As regular physical activity is well known to have positive effects on physical and mental health and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. However, in the course of this development, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, these terms are often used in several different ways, and also relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
OBJECTIVE: For effective control of the SARS-CoV-2 pandemic with vaccines, most people in a population need to be vaccinated. It is thus important to know how to inform the public with reference to individual preferences, while also acknowledging the societal preference to encourage vaccinations. According to the health care standard of informed decision-making, a comparison of the benefits and harms of (not) having the vaccination would be required to inform undecided and skeptical people. To test evidence-based fact boxes, an established risk communication format, and to inform their development, we investigated their contribution to knowledge and evaluations of COVID-19 vaccines.
METHODS: We conducted four studies (1, 2, and 4 were population-wide surveys with N = 1,942 to N = 6,056): Study 1 assessed the relationship between vaccination knowledge and intentions in Germany over three months. Study 2 assessed respective information gaps and needs of the population in Germany. In parallel, an experiment (Study 3) with a mixed design (presentation formats; pre-post-comparison) assessed the effect of fact boxes on risk perceptions and fear, using a convenience sample (N = 719). Study 4 examined how effective two fact box formats are for informing vaccination intentions, with a mixed experimental design: between-subjects (presentation formats) and within-subjects (pre-post-comparison).
RESULTS: Study 1 showed that vaccination knowledge and vaccination intentions increased between November 2020 and February 2021. Study 2 revealed objective information requirements and subjective information needs. Study 3 showed that the fact box format is effective in adjusting risk perceptions concerning COVID-19. Based on those results, fact boxes were revised and implemented with the help of a national health authority in Germany. Study 4 showed that simple fact boxes increase vaccination knowledge and positive evaluations in skeptics and undecideds.
CONCLUSION: Fact boxes can inform COVID-19 vaccination intentions of undecided and skeptical people without threatening societal vaccination goals of the population.
Extracellular vesicles: potential mediators of psychosocial stress contribution to osteoporosis?
(2021)
Osteoporosis is characterized by low bone mass and damage to the bone tissue’s microarchitecture, leading to increased fracture risk. Several studies have provided evidence for associations between psychosocial stress and osteoporosis through various pathways, including the hypothalamic-pituitary-adrenocortical axis, the sympathetic nervous system, and other endocrine factors. As psychosocial stress provokes oxidative cellular stress with consequences for mitochondrial function and cell signaling (e.g., gene expression, inflammation), it is of interest whether extracellular vesicles (EVs) may be a relevant biomarker in this context or act by transporting substances. EVs are intercellular communicators, transfer substances encapsulated in them, modify the phenotype and function of target cells, mediate cell-cell communication, and, therefore, have critical applications in disease progression and clinical diagnosis and therapy. This review summarizes the characteristics of EVs, their role in stress and osteoporosis, and their benefit as biological markers. We demonstrate that EVs are potential mediators of psychosocial stress and osteoporosis and may be beneficial in innovative research settings.
Background: Osteoporosis is characterized by decreased bone mass and destruction of the microarchitecture of bone tissue, which raises the risk of fracture. Psychosocial stress and osteoporosis are linked via the sympathetic nervous system, the hypothalamic-pituitary-adrenal axis, and other endocrine factors. Psychosocial stress exerts a series of effects on the organism, and the resulting long-term depletion at the cellular level is considered mitochondrial allostatic load, which includes mitochondrial dysfunction and oxidative stress. Extracellular vesicles (EVs) are involved in the process of mitochondrial allostatic load and may serve as biomarkers in this setting. As critical participants in cell-to-cell communication, EVs serve as transport vehicles for nucleic acids and proteins, alter the phenotypic and functional characteristics of their target cells, and promote cell-to-cell contact. Hence, they play a significant role in the diagnosis and therapy of many diseases, such as osteoporosis.
Aim: This narrative review attempts to outline the features of EVs, to investigate their involvement in both psychosocial stress and osteoporosis, and to analyze whether EVs can act as mediators between the two.
Methods: The online databases PubMed, Google Scholar, and Science Direct were searched for keywords related to the main topic of this study, and the availability of all selected studies was verified. The findings from the articles were then summarized and synthesized.
Results: Psychosocial stress affects bone remodeling through increased levels of mediators such as glucocorticoids and catecholamines, as well as increased glucose metabolism. Furthermore, psychosocial stress leads to mitochondrial allostatic load, including oxidative stress, which may affect bone remodeling. In vitro and in vivo data suggest that EVs might be involved in the link between psychosocial stress and bone remodeling through the transfer of bioactive substances and could thus be a mediator through which psychosocial stress contributes to osteoporosis.
Conclusions: According to the included studies, psychosocial stress affects bone remodeling and can thereby contribute to osteoporosis. By summarizing the specific properties of EVs and their functions in psychosocial stress and osteoporosis, respectively, this review demonstrates that EVs are possible mediators of both and show promise for innovative research areas.
Emotional memories are better remembered than neutral ones, but the mechanisms leading to this memory bias are not well understood in humans yet. Based on animal research, it is suggested that the memory-enhancing effect of emotion is based on central noradrenergic release, which is triggered by afferent vagal nerve activation. To test the causal link between vagus nerve activation and emotional memory in humans, we applied continuous noninvasive transcutaneous auricular vagus nerve stimulation (taVNS) during exposure to emotionally arousing and neutral scenes and tested subsequent, long-term recognition memory after 1 week. We found that taVNS, compared with sham, increased recollection-based memory performance for emotional, but not neutral, material. These findings were complemented by larger recollection-related brain potentials (parietal ERP Old/New effect) during retrieval of emotional scenes encoded under taVNS, compared with sham. Furthermore, brain potentials recorded during encoding also revealed that taVNS facilitated early attentional discrimination between emotional and neutral scenes. Extending animal research, our behavioral and neural findings confirm a modulatory influence of the vagus nerve on emotional memory formation in humans.
Macrophages in pathologically expanded dysfunctional white adipose tissue are exposed to a mix of potential modulators of inflammatory response, including fatty acids released from insulin-resistant adipocytes, increased levels of insulin produced to compensate for insulin resistance, and prostaglandin E-2 (PGE(2)) released from activated macrophages. The current study addressed the question of how palmitate might interact with insulin or PGE(2) to induce the formation of the chemotactic pro-inflammatory cytokine interleukin-8 (IL-8). Human THP-1 cells were differentiated into macrophages. In these macrophages, palmitate induced IL-8 formation. Insulin enhanced the induction of IL-8 formation by palmitate as well as the palmitate-dependent stimulation of PGE(2) synthesis. PGE(2) in turn elicited IL-8 formation on its own and enhanced the induction of IL-8 release by palmitate, most likely by activating the EP4 receptor. Since IL-8 causes insulin resistance and fosters inflammation, the increase in palmitate-induced IL-8 formation that is caused by hyperinsulinemia and locally produced PGE(2) in chronically inflamed adipose tissue might favor disease progression in a vicious feed-forward cycle.
Background
Generalized weakness and fatigue are underexplored symptoms in emergency medicine. Triage tools often underestimate patients presenting to the emergency department (ED) with these nonspecific symptoms (Nemec et al., 2010). At the same time, physicians' disease severity rating (DSR) on a scale from 0 (not sick at all) to 10 (extremely sick) predicts key outcomes in ED patients (Beglinger et al., 2015; Rohacek et al., 2015). Our goals were (1) to characterize ED patients with weakness and/or fatigue (W|F); (2) to explore to what extent physicians' DSR at triage can predict five key outcomes in ED patients with W|F; (3) to assess how well DSR performs relative to two commonly used benchmark methods, the Emergency Severity Index (ESI) and the Charlson Comorbidity Index (CCI); (4) to determine to what extent DSR provides predictive information beyond ESI, CCI, or their linear combination, i.e., whether ESI and CCI should be used alone or in combination with DSR; and (5) to determine to what extent ESI, CCI, or their linear combination provide predictive information beyond DSR alone, i.e., whether DSR should be used alone or in combination with ESI and/or CCI.
Methods
Prospective observational study between 2013-2015 (analysis in 2018-2020, study team blinded to hypothesis) conducted at a single center. We studied an all-comer cohort of 3,960 patients (48% female patients, median age = 51 years, 94% completed 1-year follow-up). We examined two primary outcomes (acute morbidity (Bingisser et al., 2017; Weigel et al., 2017) and all-cause 1-year mortality) and three secondary outcomes (in-hospital mortality, hospitalization, and transfer to ICU). We assessed the predictive power (i.e., resolution, measured as the area under the ROC curve, AUC) of the scores and, using logistic regression, of their linear combinations.
Findings
Compared to patients without W|F (n = 3,227), patients with W|F (n = 733) showed higher prevalences for all five outcomes, reported more symptoms across both genders, and received higher DSRs (median = 4, interquartile range (IQR) = 3-6, vs. median = 3, IQR = 2-5). DSR predicted all five outcomes well above chance (i.e., AUCs ≳ 0.70), similarly well for patients with and without W|F, and as well as or better than ESI and CCI in patients with and without W|F (except for 1-year mortality, where CCI performed better). For acute morbidity, hospitalization, and transfer to ICU, there is clear evidence that adding DSR to ESI and/or CCI improves predictions for both patient groups; for 1-year mortality and in-hospital mortality, this holds for most, but not all, comparisons. Adding ESI and/or CCI to DSR generally did not improve performance or even decreased it.
Conclusions
The use of physicians' disease severity rating has never been investigated in patients with generalized weakness and fatigue. We show that physicians' prediction of acute morbidity, mortality, hospitalization, and transfer to ICU through their DSR is also accurate in these patients. However, across all patients, DSR was less predictive of acute morbidity for female than for male patients. Future research should investigate how emergency physicians judge their patients' clinical state at triage and how this can be improved and used in simple decision aids.
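The resolution comparison described in the methods above, individual triage scores versus their logistic-regression combination, evaluated by AUC, can be sketched as follows. This is a hypothetical illustration on simulated data (the effect sizes and variable names `dsr`, `esi`, and `cci` are invented for demonstration, and scikit-learn is assumed to be available); it is not the study's actual analysis code.

```python
# Sketch: compare the resolution (AUC) of a single triage score with a
# logistic-regression combination of three scores. All data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
morbidity = rng.binomial(1, 0.3, n)        # simulated binary outcome (acute morbidity)
dsr = 2 * morbidity + rng.normal(4, 2, n)  # disease severity rating, 0-10 scale
esi = -morbidity + rng.normal(3, 1, n)     # Emergency Severity Index (lower = sicker)
cci = morbidity + rng.normal(2, 2, n)      # Charlson Comorbidity Index

# AUC of DSR alone
auc_dsr = roc_auc_score(morbidity, dsr)

# AUC of the linear combination of all three scores via logistic regression
X = np.column_stack([dsr, esi, cci])
model = LogisticRegression().fit(X, morbidity)
auc_combined = roc_auc_score(morbidity, model.predict_proba(X)[:, 1])

print(f"AUC(DSR alone):       {auc_dsr:.2f}")
print(f"AUC(DSR + ESI + CCI): {auc_combined:.2f}")
```

Comparing `auc_combined` against `auc_dsr` mirrors the paper's question of whether ESI and CCI add predictive information beyond DSR; in practice such comparisons would use cross-validation or a held-out sample rather than in-sample AUC.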
Psychiatric wards are an important element in the psychiatric care of people at acute risk of harming themselves or others. Unfortunately, aggression, violence (conflict), and the use of coercion (containment) occur repeatedly in this setting. Both the quantity and the quality of staff are regarded as decisive factors for handling such situations appropriately. Against this background, the present study examines the care situation on acute psychiatric wards. The hypothesis is that both the size of the acute psychiatric ward and the number of nursing staff influence the occurrence of conflict situations. Data were collected in 6 clinics on a total of 12 psychiatric wards, using the Patient Staff Conflict Checklist - Shift Report (PCC-SR) as the assessment instrument. A total of 2026 shifts (early, late, and night shifts) were recorded and analyzed. Nurse staffing levels varied considerably across wards. The results show that both ward size and the number of nursing staff on acute psychiatric wards have a significant influence on the occurrence of conflicts. They further show that the incidence of conflict behavior among patients differs both across the wards of the participating hospitals and across the shift types examined. Moreover, the degree to which an acute ward is locked and the size of a ward negatively influence the incidence of conflicts in the inpatient acute psychiatric context. Conflict behavior can lead to danger to self or others and to a variety of de-escalation and containment measures, which require corresponding staffing resources.
Introduction: Previous research suggests that about 30-40% of patients in cardiac rehabilitation have particular occupational problems (besondere berufliche Problemlage, BBPL). The factors hindering and promoting return to work have been studied extensively: for example, a positive perception of one's health, freedom from symptoms, and job satisfaction have been identified as promoting factors, while comorbidities, disease severity, motivational reasons, and age are examples of barriers. This study aimed to identify and describe the factors that determine the subjective occupational prospects of patients in cardiac rehabilitation directly following acute treatment (Anschlussheilbehandlung, AHB), and to derive impulses for a patient-centered approach in AHB.
Methods: In a qualitative, single-center interview study, a total of 20 patients with and without BBPL in cardiac AHB were interviewed as experts in order to elicit their subjective employment expectations and to better understand the patient perspective. The interviews were recorded, transcribed, and coded, and evaluated by means of thematic analysis.
Results: Seven key themes were identified. These included disease-related prior experiences and expectations for the future as prospective influencing factors. In addition, internal and external aspects, including health perception (incl. assessment of one's own capacity), the modifiability of working conditions, and the fear of falling ill again, were identified as significant themes. It also became clear that the patients with BBPL wanted to return to working life, but that the cardiac event had led to a perceived need to change lifestyle and priorities. The patients wanted to take time to implement these changes, and their social environment also supported prioritizing health.
Conclusion: These findings point to the need for a multiprofessional yet individually differentiated approach in cardiac AHB. Particular focus should be placed on taking patients' self-expectations into account, on individual goal-setting with regard to the occupational future, and on involving the social environment. Furthermore, a revision of the BBPL concept is proposed, since the attribution of such a problem status by the funding body appears paradoxical and stigmatizing.
This study investigated whether transcutaneous auricular vagus nerve stimulation (taVNS) enhances reversal learning and augments noradrenergic biomarkers (i.e., pupil size, cortisol, and salivary alpha-amylase [sAA]). We also explored the effect of taVNS on respiratory rate and cardiac vagal activity (CVA). Seventy-one participants received stimulation of either the cymba concha (taVNS) or the earlobe (sham) of the left ear. After learning a series of cue-outcome associations, the stimulation was applied before and throughout a reversal phase in which cue-outcome associations were changed for some (reversal), but not for other (distractor), cues. Tonic pupil size, salivary cortisol, sAA, respiratory rate, and CVA were assessed at different time points. Contrary to our hypothesis, taVNS was not associated with an overall improvement in performance on the reversal task. Compared to sham, the taVNS group performed worse for distractor than for reversal cues. taVNS did not increase tonic pupil size or sAA. Only post hoc analyses indicated that the cortisol decline was steeper in the sham group than in the taVNS group. Exploratory analyses showed that taVNS decreased respiratory rate but did not affect CVA. The weak and unexpected effects found in this study might relate to the lack of parameter optimization for taVNS and invite further investigation of the effects of taVNS on cortisol and respiratory rate.