Introduction: Previous studies suggest that about 30-40% of patients in cardiac rehabilitation present with specific occupational problems (German: besondere berufliche Problemlage, BBPL). The hindering and facilitating factors of returning to work have been studied extensively: a positive perception of one's own health, freedom from symptoms, and job satisfaction can be named as facilitators, while comorbidities, disease severity, motivational factors, and age are examples of barriers. This study aimed to identify and describe the factors that shape the subjective occupational prospects of patients in cardiac follow-up rehabilitation (Anschlussheilbehandlung, AHB), and to derive from them impulses for a patient-centered approach in the AHB.
Methods: In a qualitative, single-center interview study, a total of 20 patients with and without BBPL in cardiac AHB were interviewed as experts in order to elicit their subjective employment expectations and to better understand the patient perspective. The interviews were recorded, transcribed, and coded; data were analyzed by thematic analysis.
Results: Seven key themes were identified. These included illness-related past experiences and visions of the future as perspective-shaping factors. In addition, internal and external aspects emerged as significant themes, among them the perception of one's own health (including the self-assessment of one's capacity), the modifiability of working conditions, and the fear of falling ill again. It also became clear that the patients with BBPL wanted to return to working life, but that the cardiac event had created a perceived need for changes in lifestyle and priorities. The patients wanted to take time to implement these changes, and their social environment also supported prioritizing health.
Conclusion: These findings point to the need for a multiprofessional yet individually differentiated approach in cardiac AHB. A particular focus should lie on patients' self-expectations, their individual goals regarding their occupational future, and the involvement of the social environment. Furthermore, a revision of the BBPL concept is proposed, since the assignment of such a problem status by the funding body appears paradoxical and stigmatizing.
Secretory Wnt trafficking can be studied in the polarized epithelial monolayer of Drosophila wing imaginal discs (WID). In this tissue, Wg (Drosophila Wnt-1) is presented on the apical surface of its source cells before being internalized into the endosomal pathway. Long-range Wg secretion and spread depend on secondary secretion from endosomal compartments, but the exact post-endocytic fate of Wg is poorly understood. Here, we summarize and present three protocols for the immunofluorescence-based visualization and quantitation of different pools of intracellular and extracellular Wg in WID: (1) steady-state extracellular Wg; (2) dynamic Wg trafficking inside endosomal compartments; and (3) dynamic Wg release to the cell surface. Using a genetic driver system for gene manipulation specifically in the posterior part of the WID (EnGal4) provides a robust internal control that allows for direct comparison of signal intensities between control and manipulated compartments of the same WID. It thereby also circumvents the high degree of staining variability usually associated with whole-tissue samples. In combination with the genetic manipulation of Wg pathway components that is easily feasible in Drosophila, these methods provide a toolset for the dissection of secretory Wg trafficking and can help us understand how Wnt proteins travel along endosomal compartments for short- and long-range signal secretion.
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the stress-related pattern set most predictive of CLBP at 1 year.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role in upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric variables (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarkers (CPI: RMSE = 12.21; DISS: RMSE = 8.94) could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression and the emergence of a “hypocortisolemic symptom triad,” in which social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that makes it possible to identify individuals at higher risk of upcoming pain disorders and can be used in practice.
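The RMSE values reported above score how far a model's predicted pain scores fall from the observed ones; lower is better. A minimal sketch in Python of how such a score is computed; the pain-intensity values below are invented for illustration and are not the study's data:

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between observed and predicted scores."""
    assert len(y_true) == len(y_pred)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical characteristic pain intensity scores (0-100 scale), for illustration only.
observed = [40, 55, 30, 70, 50]
predicted = [38, 60, 35, 65, 52]
print(round(rmse(observed, predicted), 2))  # → 4.07
```

On this footing, a pattern set with DISS RMSE = 8.94 predicts disability scores more closely, on average, than one with RMSE = 9.81.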
Training intervention effects on cognitive performance and neuronal plasticity — A pilot study
(2022)
Studies suggest that people suffering from chronic pain may have altered brain plasticity, along with altered functional connectivity between pain-processing brain regions. These changes may be related to decreased mood and cognitive performance. There is some debate as to whether physical activity combined with behavioral therapy (e.g., cognitive distraction, body scan) may counteract them. However, the underlying neuronal mechanisms are unclear. The aim of the current pilot study, with a 3-armed randomized controlled trial design, was to examine the effects of sensorimotor training for non-specific chronic low back pain on (1) cognitive performance; (2) fMRI activity co-fluctuations (functional connectivity) between pain-related brain regions; and (3) the relationship between functional connectivity and subjective variables (pain and depression). Six hundred and sixty-two volunteers with non-specific chronic low back pain were randomly allocated to a unimodal (sensorimotor training) intervention, a multidisciplinary (sensorimotor training and behavioral therapy) intervention, or a control group within a multicenter study. A subsample of patients (n = 21) from one study center participated in the pilot study presented here. Measurements took place at baseline, during the intervention (3 weeks, M2), and after the intervention (12 weeks, M4, and 24 weeks, M5). Cognitive performance was measured by the Trail Making Test and functional connectivity by fMRI. Pain perception and depression were assessed by the Von Korff questionnaire and the Hospital Anxiety and Depression Scale. Group differences were calculated by univariate and repeated-measures ANOVA and Bayesian statistics; correlations by Pearson's r. Change and correlation of functional connectivity were analyzed within a pooled intervention group (uni- and multidisciplinary groups). Results revealed that participants with increased pain intensity at baseline showed higher functional connectivity between the pain-related brain areas used as ROIs in this study.
Though small sample sizes limit generalization, cognitive performance increased in the multimodal group. Increased functional connectivity was observed in participants with increased pain ratings. Pain ratings and connectivity in pain-related brain regions decreased after the intervention. The results provide a preliminary indication that intervention effects can potentially be achieved at the cognitive and neuronal levels. The intervention may be suitable for therapy and prevention of non-specific chronic low back pain.
Coronary artery disease (CAD) is the leading cause of death worldwide.
Statins reduce morbidity and mortality of CAD. Intake of n-3 polyunsaturated fatty acid (n-3 PUFAs), particularly eicosapentaenoic acid (EPA), is associated with reduced morbidity and mortality in patients with CAD. Previous data indicate that a higher conversion of precursor fatty acids (FAs) to arachidonic acid (AA) is associated with increased CAD prevalence.
Our study explored the FA composition in blood to assess n-3 PUFA levels in patients with and without CAD. We analyzed blood samples from 273 patients undergoing cardiac catheterization. Patients were stratified into those with clinically relevant CAD (n = 192) and those without (n = 81). FA analysis in full blood was performed by gas chromatography. Indicating increased formation of AA from precursors, the ratio of AA to dihomo-gamma-linolenic acid (DGLA), the delta-5 desaturase index (D5D index), was higher in CAD patients. CAD patients had significantly lower blood levels of omega-6 polyunsaturated FAs (n-6 PUFAs) and n-3 PUFAs, particularly EPA.
Thus, our study supports a role of increased EPA levels for cardioprotection.
Electrical muscle stimulation (EMS) is an increasingly popular training method and has become a focus of research in recent years. New EMS devices offer a wide range of mobile applications for whole-body EMS (WB-EMS) training, e.g., the intensification of dynamic low-intensity endurance exercises through WB-EMS. The present study aimed to determine the differences in exercise intensity between WB-EMS-superimposed and conventional walking, and between WB-EMS-superimposed and conventional Nordic walking, during a treadmill test. Eleven participants (52.0 ± years; 85.9 ± 7.4 kg, 182 ± 6 cm, BMI 25.9 ± 2.2 kg/m2) performed a 10 min treadmill test at a given velocity (6.5 km/h) in four different test situations: walking (W) and Nordic walking (NW), each in both the conventional and the WB-EMS-superimposed condition. Oxygen uptake in absolute terms (VO2) and relative to body weight (rel. VO2), lactate, and the rate of perceived exertion (RPE) were measured before and after the test. WB-EMS intensity was adjusted individually according to the feedback of the participant. Descriptive statistics are given as mean ± SD. For the statistical analyses, one-factorial ANOVA for repeated measures and two-factorial ANOVA [factors: EMS, W/NW, and their combination (EMS*W/NW)] were performed (α = 0.05). Significant effects were found for the EMS and W/NW factors for the outcome variables VO2 (EMS: p = 0.006, r = 0.736; W/NW: p < 0.001, r = 0.870), relative VO2 (EMS: p < 0.001, r = 0.850; W/NW: p < 0.001, r = 0.937), and lactate (EMS: p = 0.003, r = 0.771; W/NW: p = 0.003, r = 0.764), with both factors producing higher values. However, the differences in VO2 and relative VO2 are within the range of biological variability of ± 12%. The factor combination EMS*W/NW was statistically non-significant for all three variables. WB-EMS resulted in higher RPE values (p = 0.035, r = 0.613); RPE differences for W/NW and EMS*W/NW were not significant.
The current study results indicate that WB-EMS influences the parameters of exercise intensity. The impact on exercise intensity and the clinical relevance of WB-EMS-superimposed walking (WB-EMS-W) exercise is questionable because of the marginal differences in the outcome variables.
Emotional memories are better remembered than neutral ones, but the mechanisms leading to this memory bias are not well understood in humans yet. Based on animal research, it is suggested that the memory-enhancing effect of emotion is based on central noradrenergic release, which is triggered by afferent vagal nerve activation. To test the causal link between vagus nerve activation and emotional memory in humans, we applied continuous noninvasive transcutaneous auricular vagus nerve stimulation (taVNS) during exposure to emotionally arousing and neutral scenes and tested subsequent, long-term recognition memory after 1 week. We found that taVNS, compared with sham, increased recollection-based memory performance for emotional, but not neutral, material. These findings were complemented by larger recollection-related brain potentials (parietal ERP Old/New effect) during retrieval of emotional scenes encoded under taVNS, compared with sham. Furthermore, brain potentials recorded during encoding also revealed that taVNS facilitated early attentional discrimination between emotional and neutral scenes. Extending animal research, our behavioral and neural findings confirm a modulatory influence of the vagus nerve in emotional memory formation in humans.
The investigation of protein structures, functions and interactions often requires modifications to adapt protein properties to the specific application. Among many possible methods to equip proteins with new chemical groups, the utilization of orthogonal aminoacyl-tRNA synthetase/tRNA pairs enables the site-specific incorporation of non-canonical amino acids at defined positions in the protein. The open nature of cell-free protein synthesis reactions provides an optimal environment, as the orthogonal components do not need to be transported across the cell membrane and the impact on cell viability is negligible. In the present work, it was shown that the expression of orthogonal aminoacyl-tRNA synthetases in CHO cells prior to cell disruption enhanced the modification of the pharmaceutically relevant adenosine A2a receptor. For this purpose, in complement to transient transfection of CHO cells, an approach based on CRISPR/Cas9 technology was selected to generate a translationally active cell lysate harboring endogenous orthogonal aminoacyl-tRNA synthetase.
Stunting
(2021)
Cardiac rehabilitation
(2021)
Following stroke, neuronal death takes place both in the infarct region and in brain areas distal to the lesion site, including the hippocampus. The hippocampus is critically involved in learning and memory processes and continuously generates new neurons. Dysregulation of adult neurogenesis may be associated with cognitive decline after a stroke lesion. In particular, proliferation of precursor cells and the formation of new neurons are increased after lesion. Within the first week, many new precursor cells die during development. How dying precursors are removed from the hippocampus and to what extent phagocytosis takes place after stroke is still not clear. Here, we evaluated the effect of a prefrontal stroke lesion on the phagocytic activity of microglia in the dentate gyrus (DG) of the hippocampus. Three-month-old C57BL/6J mice were injected once with the proliferation marker BrdU (250 mg/kg) 6 hr after a middle cerebral artery occlusion or sham surgery. The number of apoptotic cells and the phagocytic capacity of the microglia were evaluated by means of immunohistochemistry, confocal microscopy, and 3D reconstructions. We found a transient but significant increase in the number of apoptotic cells in the DG early after stroke, associated with impaired removal by microglia. Interestingly, phagocytosis of newly generated precursor cells was not affected. Our study shows that a prefrontal stroke lesion affects phagocytosis of apoptotic cells in the DG, a region distal to the lesion core. Whether disturbed phagocytosis might contribute to inflammatory and maladaptive processes, including cognitive impairment following stroke, needs to be further investigated.
The reliability of quantifying intratendinous vascularization by high-sensitivity Doppler ultrasound advanced dynamic flow has not been examined yet. Therefore, this study aimed to investigate the intraobserver and interobserver reliability of evaluating Achilles tendon vascularization by advanced dynamic flow using established scoring systems. Methods: Three investigators evaluated vascularization in 67 recordings in a test-retest design, applying the Ohberg score, a modified Ohberg score, and a counting score. Intraobserver and interobserver agreement for the Ohberg score and modified Ohberg score was analyzed by the Cohen kappa and Fleiss kappa coefficients (absolute) and by the Kendall tau-b coefficient and Kendall coefficient of concordance (W; relative). The reliability of the counting score was analyzed by intraclass correlation coefficients (ICC) 2.1 and 3.1, the standard error of measurement (SEM), and Bland-Altman analysis (bias and limits of agreement [LoA]). Results: Intraobserver and interobserver agreement (absolute/relative) ranged from 0.61 to 0.87/0.87 to 0.95 and 0.11 to 0.66/0.76 to 0.89 for the Ohberg score, and from 0.81 to 0.87/0.92 to 0.95 and 0.64 to 0.80/0.88 to 0.93 for the modified Ohberg score, respectively. The counting score revealed an intraobserver ICC of 0.94 to 0.97 (SEM, 1.0-1.5; bias, -1; LoA, 3-4 vessels). The interobserver ICC for the counting score ranged from 0.91 to 0.98 (SEM, 1.0-1.9; bias, 0; LoA, 3-5 vessels). Conclusions: The modified Ohberg score and the counting score showed excellent reliability and seem convenient for research and clinical practice. The Ohberg score revealed decent intraobserver but unexpectedly low interobserver reliability and therefore cannot be recommended.
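The Cohen kappa used above for absolute agreement corrects raw rater agreement for the agreement expected by chance alone. A minimal sketch in Python; the two grade vectors are invented for illustration and do not come from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical grades
    (e.g. vascularization scores) to the same recordings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal grade frequencies.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical grades for 10 recordings by two raters (illustration only).
a = [0, 1, 1, 2, 2, 3, 0, 1, 2, 4]
b = [0, 1, 2, 2, 2, 3, 0, 1, 1, 4]
print(round(cohens_kappa(a, b), 2))  # → 0.74
```

A kappa of 1 means perfect agreement and 0 means no agreement beyond chance, which is why the values around 0.8 for the modified Ohberg score count as strong agreement while 0.11 does not.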
Background: The relationship between exercise-induced intratendinous blood flow (IBF) and tendon pathology or training exposure is unclear.
Objective: This study investigates the acute effect of running exercise on sonographically detectable IBF in healthy and tendinopathic Achilles tendons (ATs) of runners and recreational participants.
Methods: 48 participants (43 ± 13 years, 176 ± 9 cm, 75 ± 11 kg) performed a standardized submaximal 30-min constant load treadmill run with Doppler ultrasound “Advanced dynamic flow” examinations before (Upre) and 5, 30, 60, and 120 min (U5-U120) afterward. Included were runners (>30 km/week) and recreational participants (<10 km/week) with healthy (Hrun, n = 10; Hrec, n = 15) or tendinopathic (Trun, n = 13; Trec, n = 10) ATs. IBF was assessed by counting number [n] of intratendinous vessels. IBF data are presented descriptively (%, median [minimum to maximum range] for baseline-IBF and IBF-difference post-exercise). Statistical differences for group and time point IBF and IBF changes were analyzed with Friedman and Kruskal-Wallis ANOVA (α = 0.05).
Results: At baseline, IBF was detected in 40% (3 [1–6]) of Hrun, in 53% (4 [1–5]) of Hrec, in 85% (3 [1–25]) of Trun, and 70% (10 [2–30]) of Trec. At U5 IBF responded to exercise in 30% (3 [−1–9]) of Hrun, in 53% (4 [−2–6]) of Hrec, in 70% (4 [−10–10]) of Trun, and in 80% (5 [1–10]) of Trec. While IBF in 80% of healthy responding ATs returned to baseline at U30, IBF remained elevated until U120 in 60% of tendinopathic ATs. Within groups, IBF changes from Upre-U120 were significant for Hrec (p < 0.01), Trun (p = 0.05), and Trec (p < 0.01). Between groups, IBF changes in consecutive examinations were not significantly different (p > 0.05) but IBF-level was significantly higher at all measurement time points in tendinopathic versus healthy ATs (p < 0.05).
Conclusion: Irrespective of training status and tendon pathology, running leads to an immediate increase of IBF in responding tendons. This increase is short-lived in healthy ATs and prolonged in tendinopathic ATs. Training exposure does not alter IBF occurrence, but the IBF level is elevated in tendon pathology. While an immediate exercise-induced IBF increase is a physiological response, prolonged IBF is considered a pathological finding associated with Achilles tendinopathy.
In view of an aging society and the high costs of support and care in private households, the question arises as to what role assistive robots can play. This article addresses the extent to which care robots are currently accepted by the adult population in Germany, and the extent to which gender, age, and (professional or personal) experience influence the degree of this acceptance. The analyses are based on three representative surveys with a total of more than 7,000 respondents. Two surveys were conducted in the second half of 2017 on behalf of the German Academy of Science and Engineering (acatech) and the life insurer ERGO; the third was conducted in spring 2018 on behalf of the German Advisory Council for Consumer Affairs (Sachverständigenrat für Verbraucherfragen, SVRV). An in-depth, cumulative analysis of these surveys and datasets, which the authors helped design, with regard to assistive robotics has not been published so far. Despite the different usage scenarios for care robots covered by the surveys, the results of all three are strikingly consistent: in Germany, there is a significant minority of people who would already accept functioning care by robots, provided that human care is not replaced but only supported. A good third of respondents, with differences by age and gender, fundamentally reject assistance by robots.
Objective
For an effective control of the SARS-CoV-2 pandemic with vaccines, most people in a population need to be vaccinated. It is thus important to know how to inform the public with reference to individual preferences, while also acknowledging the societal preference to encourage vaccinations. According to the health care standard of informed decision-making, a comparison of the benefits and harms of (not) having the vaccination would be required to inform undecided and skeptical people. To test evidence-based fact boxes, an established risk communication format, and to inform their development, we investigated their contribution to knowledge and evaluations of COVID-19 vaccines.
Methods
We conducted four studies (1, 2, and 4 were population-wide surveys with N = 1,942 to N = 6,056): Study 1 assessed the relationship between vaccination knowledge and intentions in Germany over three months. Study 2 assessed respective information gaps and needs of the population in Germany. In parallel, an experiment (Study 3) with a mixed design (presentation formats; pre-post-comparison) assessed the effect of fact boxes on risk perceptions and fear, using a convenience sample (N = 719). Study 4 examined how effective two fact box formats are for informing vaccination intentions, with a mixed experimental design: between-subjects (presentation formats) and within-subjects (pre-post-comparison).
Results
Study 1 showed that vaccination knowledge and vaccination intentions increased between November 2020 and February 2021. Study 2 revealed objective information requirements and subjective information needs. Study 3 showed that the fact box format is effective in adjusting risk perceptions concerning COVID-19. Based on those results, fact boxes were revised and implemented with the help of a national health authority in Germany. Study 4 showed that simple fact boxes increase vaccination knowledge and positive evaluations in skeptics and undecideds.
Conclusion
Fact boxes can inform COVID-19 vaccination intentions of undecided and skeptical people without threatening societal vaccination goals of the population.
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression compared to HIV-negative individuals. This cross-sectional study (n = 450, median age 44 (19-75), 7.3% female) evaluates the prevalence rates and prevalence ratios (PR) of anxiety and/or depression in PLWH associated with recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 0.36-0.91; p = 0.01), depression (PR = 0.41; 0.36-0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 0.24-0.75; p = 0.002) was found in exercising compared to non-exercising PLWH. Recreational exercise is associated with a lower risk of anxiety and/or depression. Further prospective studies are needed to provide insights into the direction of this association.
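A prevalence ratio of the kind reported above is simply the prevalence of the outcome among the exposed divided by that among the unexposed; a minimal sketch with purely hypothetical counts (not the study's data), using the standard log-normal approximation for the 95% interval:

```python
import math

def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Prevalence ratio with a 95% CI via the log-normal approximation."""
    p1 = cases_exposed / n_exposed          # prevalence, exposed group
    p0 = cases_unexposed / n_unexposed      # prevalence, unexposed group
    pr = p1 / p0
    # Standard error of log(PR) for a cross-sectional 2x2 table
    se = math.sqrt(1 / cases_exposed - 1 / n_exposed
                   + 1 / cases_unexposed - 1 / n_unexposed)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, (lo, hi)

# Hypothetical counts for illustration only (not taken from the study):
pr, ci = prevalence_ratio(30, 300, 26, 150)
print(round(pr, 2), round(ci[0], 2), round(ci[1], 2))
```

With these made-up counts the PR comes out near 0.58 with an interval below 1, mirroring the direction (though not the exact values) of the reported associations.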
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear. Prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, a sample of one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression experiences (the Hospital Anxiety and Depression Scale), and pain-related cognitions (the Fear Avoidance Beliefs Questionnaire). Repeated measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the sub-sample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes of the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but there was no effect on the other variables of interest. This may be partially explained by participants' low comorbidities at baseline. The results are important regarding cost-free home-based alternatives for back pain patients and prevention tasks. Furthermore, this study closes the gap of missing long-term effect analyses in this field.
Basic psychological needs theory postulates that a social environment that satisfies individuals' three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. Conversely, the frustration of these needs is associated with ill-being and depressive symptoms, a link investigated foremost in non-clinical samples; yet, there is a paucity of research on need frustration in clinical samples. Survey data of adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) were compared with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs when compared to non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of self-determination theory and have preliminary implications for tailoring therapy for depression.
Association of primary allostatic load mediators and metabolic syndrome (MetS): A systematic review
(2022)
Allostatic load (AL) exposure may cause detrimental effects on the neuroendocrine system, leading to metabolic syndrome (MetS). The primary mediators of AL include serum dehydroepiandrosterone sulfate (DHEAS; a functional HPA-axis antagonist) as well as cortisol and urinary norepinephrine (NE) and epinephrine (EPI) excretion levels (assessed in 12-h urine as the gold standard for evaluating HPA-axis and sympathetic nervous system activity). However, the evidence of an association between the primary mediators of AL and MetS is limited. This systematic review aimed to critically examine the association between the primary mediators of AL and MetS. PubMed and Web of Science were searched for articles from January 2010 to December 2021, published in English. The search strategy focused on cross-sectional and case–control studies comprising adult participants with MetS, obesity, overweight, and without chronic diseases. The STROBE checklist was used to assess study quality. Of 770 studies, twenty-one studies with a total sample size of n = 10,666 met the eligibility criteria. Eighteen studies were cross-sectional, and three were case–control studies. The included studies had a completeness of reporting score of COR % = 87.0 ± 6.4%. Notably, cortisol as a primary mediator of AL showed an association with MetS in 50% (urinary cortisol), 40% (serum cortisol), 60% (salivary cortisol), and 100% (hair cortisol) of the studies. For DHEAS, 60% of the studies showed an association with MetS. In contrast, urinary EPI and urinary NE showed no association with MetS in 100% of the studies. In summary, there is a tendency for an association between higher serum, salivary, urinary, and hair cortisol as well as lower levels of DHEAS and MetS. Future studies focusing on longitudinal data are warranted for clarification and understanding of the association between the primary mediators of AL and MetS.
Boredom has been identified as one of the greatest psychological challenges when staying at home during quarantine and isolation. However, this does not mean that the situation necessarily causes boredom. On the basis of 13 explorative interviews with bored and non-bored persons who have been under quarantine or in isolation, we explain why boredom is related to a subjective interpretation process rather than being a direct consequence of the objective situation. Specifically, we show that participants vary significantly in their interpretations of staying at home and, thus, also in their experience of boredom. While the non-bored participants interpret the situation as a relief or as irrelevant, the bored participants interpret it as a major restriction that only some are able to cope with.
Psychiatric wards are an important element in the psychiatric care of people at acute risk of harming themselves or others. Unfortunately, aggression and violence (conflict) as well as the use of coercion (containment) repeatedly occur in this setting. Both the quantity and the quality of staff are regarded as decisive factors for the appropriate handling of these situations. Against this background, the present study examines the care situation on acute psychiatric wards. The hypothesis is that both the size of the acute psychiatric ward and the number of nursing staff influence the occurrence of conflict situations. For this purpose, data were collected in 6 clinics on a total of 12 psychiatric wards. The Patient Staff Conflict Checklist – Shift Report (PCC-SR) served as the assessment instrument. In total, 2,026 shifts (early, late and night shifts) were recorded and analyzed. The staffing of the wards with nursing personnel varied considerably. The results show that both ward size and the number of nurses on acute psychiatric wards have a significant influence on the occurrence of conflicts. The results further show that the incidence of conflict behavior among patients differs both across the wards of the participating hospitals and across the shift types examined. Moreover, the extent to which an acute ward is locked and the size of a ward have a negative influence on the incidence of conflicts in the acute inpatient psychiatric context. The occurrence of conflict behavior can lead to danger to self or others and to a variety of de-escalating and containment measures. Adequate staffing resources are required for this.
Risk communication plays a central role in public health emergencies: it must enable informed decisions, promote protective and life-saving behavior, and maintain trust in public institutions. In addition, uncertainties about scientific findings must be communicated transparently, and irrational fears and rumors must be refuted. Risk communication should involve the population in a participatory manner, and the population's risk perception and risk literacy must be assessed continuously. The current pandemic of coronavirus disease 2019 (COVID-19) poses specific challenges for risk communication.
Knowledge about many important aspects of COVID-19 was, and often still is, uncertain or preliminary, e.g., regarding transmission, symptoms, long-term consequences and immunity. Communication is characterized by scientific language and a multitude of figures and statistics, which can impair comprehensibility. In addition to official announcements and expert assessments, COVID-19 is communicated about extensively on social media, where misinformation and speculation are also spread; this "infodemic" complicates risk communication.
National and international research projects are intended to help make risk communication on COVID-19 more target-group-specific and more effective. These include, among others, exploratory studies on how people deal with COVID-19-related information, the COVID-19 Snapshot Monitoring (COSMO), a regularly conducted online survey on risk perception and protective behavior, and an interdisciplinary qualitative study comparing the design, implementation and effectiveness of risk communication strategies in 4 countries.
Background
Anticancer compound 3-bromopyruvate (3-BrPA) suppresses cancer cell growth via targeting glycolytic and mitochondrial metabolism. The malignant peripheral nerve sheath tumor (MPNST), a highly aggressive, therapy-resistant neoplasia associated with Neurofibromatosis type 1, shows high metabolic activity, and affected patients may therefore benefit from 3-BrPA treatment. To elucidate the specific mode of action, we used a controlled cell model overexpressing proteasome activator (PA) 28, which subsequently leads to p53 inactivation and oncogenic transformation and therefore reproduces an important pathway in MPNST and overall tumor pathogenesis.
Methods
Viability of MPNST cell lines S462, NSF1, and T265 in response to increasing doses (0-120 µM) of 3-BrPA was analyzed by CellTiter-Blue® assay. Additionally, we investigated viability, reactive oxygen species (ROS) production (dihydroethidium assay), nicotinamide adenine dinucleotide dehydrogenase activity (NADH-TR assay), and lactate production (lactate assay) in mouse B8 fibroblasts overexpressing PA28 in response to 3-BrPA application. For all experiments, normal and nutrient-deficient conditions were tested. MPNST cell lines were furthermore characterized immunohistochemically for Ki67, p53, bcl2, bcl6, cyclin D1, and p21.
Results
MPNST cells responded significantly and dose-dependently to 3-BrPA application, with S462 cells being most responsive. Human control cells showed reduced sensitivity. In the PA28-overexpressing cancer cell model, 3-BrPA application impaired mitochondrial NADH dehydrogenase activity only mildly and failed to significantly inhibit lactate production. PA28 overexpression was associated with functional glycolysis as well as partial resistance to stress provoked by nutrient deprivation. 3-BrPA treatment was not associated with an increase of ROS. Starvation sensitized MPNST cells to treatment.
Conclusions
Aggressive MPNST cells are sensitive to 3-BrPA therapy in vitro, with and without starvation. In a PA28-overexpression cancer cell model leading to p53 inactivation, thereby reflecting a key molecular feature of human NF1-associated MPNST, the known capacity of 3-BrPA to block mitochondrial activity and glycolysis was reproduced; however, oncogenic cells displayed partial resistance. To conclude, 3-BrPA was sufficient to reduce the viability of NF1-associated MPNST, potentially due to inhibition of glycolysis, which should prompt further studies and promises a potential benefit for NF1 patients.
Background
Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which can develop from an acute ankle sprain, may negatively affect quality of life and ankle functionality and increase the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventative strategy against CAI in the basketball population, gathering epidemiological data is essential. However, epidemiological data on CAI in basketball are limited. Therefore, this study aims to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence.
Methods
In a cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. Besides non-standardized questions about demographics and their history of ankle sprains, participants filled out the standardized Cumberland Ankle Instability Tool, which was applied to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (243 male, 145 female; 22.3 ± 3.8 years; 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between gender, competitive level and playing position were determined using the chi-square test.
Results
In the surveyed cohort, 26% had unilateral CAI while 50% had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). This gender disparity was also evident in sub-analyses: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence did not differ between competitive levels (p > 0.05) or among playing positions (p > 0.05).
Conclusions
CAI is highly prevalent in the basketball population. Gender affects the prevalence of CAI, whereas the prevalence is similar regardless of competitive level and playing position. The characteristics of basketball contribute to the high prevalence. Prevention of CAI should be a focus in basketball, and gender should be taken into consideration when applying CAI prevention measures.
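The gender comparison above rests on a chi-square test of a 2×2 table (CAI yes/no by gender). A minimal sketch of the Pearson statistic, using hypothetical counts rather than the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts for illustration (not the study's data):
# women: 120 with CAI, 25 without; men: 170 with CAI, 73 without
stat = chi_square_2x2(120, 25, 170, 73)
print(round(stat, 2))
```

A statistic this large at 1 degree of freedom corresponds to p < 0.01; in practice one would compare it against the chi-square distribution (e.g., via scipy.stats) rather than hand-tabulated critical values.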
Dementia, as one of the most prevalent diseases, urges a better understanding of the central mechanisms responsible for clinical symptoms and necessitates improvement of current diagnostic capabilities. The brainstem nucleus locus coeruleus (LC) is a promising target for early diagnosis because of its early structural alterations and its relationship to the functional disturbances in patients. In this study, we applied our improved method of localisation-based LC resting-state fMRI to investigate differences in central sensory signal processing by comparing the functional connectivity (fc) of a patient group with mild cognitive impairment (MCI, n = 28) with that of an age-matched healthy control group (n = 29). MCI and control participants could be differentiated in their Mini-Mental-State-Examination (MMSE) scores (p < .001) and LC intensity ratio (p = .010). In the fMRI, LC fc to anterior cingulate cortex (FDR p < .001) and left anterior insula (FDR p = .012) was elevated, and LC fc to right temporoparietal junction (rTPJ, FDR p = .012) and posterior cingulate cortex (PCC, FDR p = .021) was decreased in the patient group. Importantly, LC to rTPJ connectivity was also positively correlated with MMSE scores in MCI patients (p = .017). Furthermore, we found a hyperactivation of the left-insula salience network in the MCI patients. Our results and our proposed disease model shed new light on the functional pathogenesis of MCI by directing to attentional network disturbances, which could aid new therapeutic strategies and provide a marker for diagnosis and prediction of disease progression.
In this report, we investigate small proteins involved in bacterial alternative respiratory systems that improve the enzymatic efficiency through better anchorage and multimerization of membrane components. Using the small protein TorE of the respiratory TMAO reductase system as a model, we discovered that TorE is part of a subfamily of small proteins that are present in proteobacteria in which they play a similar role for bacterial respiratory systems. We reveal by microscopy that, in Shewanella oneidensis MR1, alternative respiratory systems are evenly distributed in the membrane, contrary to what has been described for Escherichia coli. Thus, the better efficiency of the respiratory systems observed in the presence of the small proteins is not due to a specific localization in the membrane, but rather to the formation of membranous complexes formed by TorE homologs with their c-type cytochrome partner protein. By an in vivo approach combining Clear Native electrophoresis and fluorescent translational fusions, we determined the 4:4 stoichiometry of the complexes. In addition, mild solubilization of the cytochrome indicates that the presence of the small protein reinforces its anchoring to the membrane. Therefore, assembly of the complex induced by this small protein improves the efficiency of the respiratory system.
Myasthenia gravis is an autoimmune disease affecting neuromuscular transmission and causing skeletal muscle weakness. Additionally, systemic inflammation, cognitive deficits and autonomic dysfunction have been described.
However, little is known about myasthenia gravis-related reorganization of the brain. In this study, we thus investigated the structural and functional brain changes in myasthenia gravis patients.
Eleven myasthenia gravis patients (age: 70.64 ± 9.27; 11 males) were compared to age-, sex- and education-matched healthy controls (age: 70.18 ± 8.98; 11 males). Most of the patients (n = 10, 91%) received cholinesterase inhibitors.
Structural brain changes were determined by applying voxel-based morphometry using high-resolution T1-weighted sequences. Functional brain changes were assessed with a neuropsychological test battery (including attention, memory and executive functions), a spatial orientation task and brain-derived neurotrophic factor blood levels.
Myasthenia gravis patients showed significant grey matter volume reductions in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus. Furthermore, myasthenia gravis patients showed significantly lower performance in executive functions, working memory (Spatial Span, P = 0.034, d = 1.466), verbal episodic memory (P = 0.003, d = 1.468) and somatosensory-related spatial orientation (Triangle Completion Test, P = 0.003, d = 1.200).
Additionally, serum brain-derived neurotrophic factor levels were significantly higher in myasthenia gravis patients (P = 0.001, d = 2.040). Our results indicate that myasthenia gravis is associated with structural and functional brain alterations. Especially the grey matter volume changes in the cingulate gyrus and the inferior parietal lobe could be associated with cognitive deficits in memory and executive functions.
Furthermore, deficits in somatosensory-related spatial orientation could be associated with the lower volumes in the inferior parietal lobe. Future research is needed to replicate these findings independently in a larger sample and to investigate the underlying mechanisms in more detail.
Klaus et al. compared myasthenia gravis patients to matched healthy control subjects and identified functional alterations in memory functions as well as structural alterations in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus.
Background and Study Aims
Recurrent laryngeal nerve palsy (RLNP) is a potential complication of anterior cervical discectomy and fusion (ACDF). There is still substantial disagreement on the actual prevalence of RLNP after ACDF as well as on risk factors for postoperative RLNP. The aim of this study was to describe the prevalence of postoperative RLNP in a cohort of consecutive cases of ACDF and to examine potential risk factors.
Materials and Methods
This retrospective study included patients who underwent ACDF between 2005 and 2019 at a single neurosurgical center. As part of the clinical routine, RLNP was examined prior to and after surgery by independent otorhinolaryngologists using endoscopic laryngoscopy. As potential risk factors for postoperative RLNP, we examined patients' age, sex, body mass index, multilevel surgery, and the duration of surgery.
Results
214 consecutive cases were included. The prevalence of preoperative RLNP was 1.4% (3/214) and the prevalence of postoperative RLNP was 9% (19/211). The number of operated levels was 1 in 73.5% (155/211), 2 in 24.2% (51/211), and 3 or more in 2.4% (5/211) of cases. Of all cases, 4.7% (10/211) were repeat surgeries. There was no difference in the prevalence of RLNP between the primary surgery group (9.0%, 18/183) and the repeat surgery group (10.0%, 1/10; p = 0.91). Also, there was no difference in any characteristic between subjects with and without postoperative RLNP. We found no association between postoperative RLNP and patients' age, sex, body mass index, duration of surgery, or number of levels (odds ratios between 0.24 and 1.05; p values between 0.20 and 0.97).
Conclusions
In our cohort, the prevalence of postoperative RLNP after ACDF was 9.0%. The fact that none of the examined variables was associated with the occurrence of RLNP supports the view that postoperative RLNP may depend more on direct mechanical manipulation during surgery than on specific patient or surgical characteristics.
Background:
From birth to young adulthood, the health and development of young people are strongly linked to their living situation, including their family's socioeconomic position (SEP) and living environment. The impact of regional characteristics on development in early childhood beyond family SEP has rarely been investigated. This study aimed to identify regional predictors of global developmental delay at school entry while taking family SEP into consideration.
Method:
We used representative, population-based data from the mandatory school entry examinations of the German federal state of Brandenburg in 2018/2019, covering n=22,801 preschool children. Applying binary multilevel models, we hierarchically analyzed the effects of regional deprivation, defined by the German Index of Socioeconomic Deprivation (GISD), and of rurality, operationalized as the inverted population density of the children's school district, on global developmental delay (GDD) while adjusting for family SEP (low, medium, and high).
Results:
Family SEP was strongly and significantly associated with GDD. Compared with children with high family SEP, children with medium SEP (female: OR=4.26, male: OR=3.46) and low SEP (female: OR=16.58, male: OR=12.79) showed higher odds of GDD. Furthermore, we found a smaller but independent additional effect of regional socioeconomic deprivation on GDD, with higher odds for children from more deprived school districts (female: OR=1.35, male: OR=1.20). Rurality, however, showed no significant association with GDD in preschool children beyond family SEP and regional deprivation.
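The odds ratios above each compare one group against a reference group. As an illustration only (the counts below are hypothetical and not the study's data), an unadjusted odds ratio can be read off a 2x2 exposure-outcome table:

```python
# Illustrative sketch with hypothetical counts -- not the study's data or code.
# An odds ratio (OR) compares the odds of an outcome (here: global
# developmental delay, GDD) between an exposure group and a reference group.

def odds_ratio(exposed_cases, exposed_noncases, ref_cases, ref_noncases):
    """OR = (a/b) / (c/d) for a 2x2 exposure-outcome table."""
    return (exposed_cases / exposed_noncases) / (ref_cases / ref_noncases)

# Hypothetical example: low-SEP group with 40 GDD vs. 160 non-GDD children,
# high-SEP reference group with 10 GDD vs. 390 non-GDD children
print(round(odds_ratio(40, 160, 10, 390), 2))  # → 9.75
```

Note that the ORs reported in the study come from binary multilevel models that adjust for covariates; the raw 2x2 computation above ignores such adjustment.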
Conclusion:
Family SEP and regional deprivation are risk factors for child development and of particular interest to promote health of children in early childhood and over the life course.
Background
Generalized weakness and fatigue are underexplored symptoms in emergency medicine. Triage tools often underestimate the urgency of patients presenting to the emergency department (ED) with these nonspecific symptoms (Nemec et al., 2010). At the same time, physicians' disease severity rating (DSR) on a scale from 0 (not sick at all) to 10 (extremely sick) predicts key outcomes in ED patients (Beglinger et al., 2015; Rohacek et al., 2015). Our goals were (1) to characterize ED patients with weakness and/or fatigue (W|F); and to explore (2) to what extent physicians' DSR at triage can predict five key outcomes in ED patients with W|F; (3) how well DSR performs relative to two commonly used benchmarks, the Emergency Severity Index (ESI) and the Charlson Comorbidity Index (CCI); (4) to what extent DSR provides predictive information beyond ESI, CCI, or their linear combination, i.e., whether ESI and CCI should be used alone or in combination with DSR; and (5) to what extent ESI, CCI, or their linear combination provide predictive information beyond DSR alone, i.e., whether DSR should be used alone or in combination with ESI and/or CCI.
Methods
A prospective observational study was conducted at a single center between 2013 and 2015 (analysis in 2018-2020; study team blinded to the hypothesis). We studied an all-comer cohort of 3,960 patients (48% female, median age = 51 years, 94% completed 1-year follow-up). We examined two primary outcomes (acute morbidity (Bingisser et al., 2017; Weigel et al., 2017) and all-cause 1-year mortality) and three secondary outcomes (in-hospital mortality, hospitalization, and transfer to the ICU). We assessed the predictive power (i.e., resolution, measured as the area under the ROC curve, AUC) of the scores and, using logistic regression, of their linear combinations.
Findings
Compared to patients without W|F (n = 3,227), patients with W|F (n = 733) showed a higher prevalence of all five outcomes, reported more symptoms across both genders, and received higher DSRs (median = 4, interquartile range (IQR) = 3-6, vs. median = 3, IQR = 2-5). DSR predicted all five outcomes well above chance (i.e., AUCs ≳ 0.70), similarly well for patients with and without W|F, and as well as or better than ESI and CCI in patients with and without W|F (except for 1-year mortality, where CCI performed better). For acute morbidity, hospitalization, and transfer to the ICU, there was clear evidence that adding DSR to ESI and/or CCI improved predictions for both patient groups; for 1-year mortality and in-hospital mortality this held for most, but not all, comparisons. Adding ESI and/or CCI to DSR generally did not improve performance and sometimes even decreased it.
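As a sketch of the performance measure used here (with made-up scores, not the study's data): the AUC equals the probability that a randomly drawn patient with the outcome receives a higher score than a randomly drawn patient without it (the Mann-Whitney formulation), with ties counting one half.

```python
# Illustrative sketch -- hypothetical scores, not the study's data or code.
# AUC via the Mann-Whitney formulation: the fraction of (positive, negative)
# pairs in which the positive case scores higher; ties count as 0.5.

def auc(scores_pos, scores_neg):
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical DSR values (0-10 scale) for patients with/without an outcome
print(round(auc([6, 7, 5, 8, 4], [3, 2, 4, 5, 1]), 3))  # → 0.92
```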
Conclusions
Physicians' disease severity ratings had not previously been investigated in patients with generalized weakness and fatigue. We show that physicians' prediction of acute morbidity, mortality, hospitalization, and transfer to the ICU through their DSR is accurate in these patients as well. Across all patients, however, DSR was less predictive of acute morbidity for female than for male patients. Future research should investigate how emergency physicians judge their patients' clinical state at triage and how this judgment can be improved and used in simple decision aids.
In recent years, digital technologies have become a major means of providing health-related services, a trend strongly reinforced by the Coronavirus disease 2019 (COVID-19) pandemic. Since regular physical activity is well known to have positive effects on physical and mental health and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. In the course of this development, however, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to the provision of health-related services such as physical interventions. Unfortunately, these terms are often used in several different ways and relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
Older adults with amnestic mild cognitive impairment (aMCI) who, in addition to their memory deficits, also suffer from frontal-executive dysfunctions have a higher risk of later developing dementia than older adults with aMCI without executive deficits and older adults with non-amnestic MCI (naMCI). Handgrip strength (HGS) is also correlated with the risk of cognitive decline in older adults. Hence, the current study aimed to investigate the associations between HGS and executive functioning in individuals with aMCI, naMCI, and healthy controls. Older, right-handed adults with aMCI, naMCI, and healthy controls (HC) underwent a handgrip strength measurement with a handheld dynamometer. Executive functions were assessed with the Trail Making Test (TMT A&B). Handgrip strength normalized to body mass index (nHGS) was calculated, and its associations with executive functions (operationalized through z-scores of the TMT B/A ratio) were investigated through partial correlation analyses (i.e., accounting for age, sex, and severity of depressive symptoms). Positive, low-to-moderate correlations between executive functioning and both right nHGS (rp(22) = 0.364; p = 0.063) and left nHGS (rp(22) = 0.420; p = 0.037) were observed in older adults with aMCI but not in those with naMCI or in HC. Our results suggest that higher levels of nHGS are linked to better executive functioning in aMCI but not in naMCI or HC. This relationship is perhaps driven by alterations in the integrity of the hippocampal-prefrontal network occurring in older adults with aMCI. Further research is needed to provide empirical evidence for this assumption.
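A partial correlation of the kind reported above can be understood via residualization: correlate the residuals of both variables after regressing each on the covariates. Below is a minimal sketch with a single covariate and synthetic numbers (not the study's data; the study additionally controlled for sex and depressive symptoms):

```python
# Minimal sketch -- synthetic numbers, not the study's data or code.
# Partial correlation via residualization: regress each variable on the
# covariate, then correlate the residuals.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def residuals(y, covariate):
    """Residuals of y after simple linear regression on the covariate."""
    n = len(y)
    mc, my = sum(covariate) / n, sum(y) / n
    beta = (sum((c - mc) * (v - my) for c, v in zip(covariate, y))
            / sum((c - mc) ** 2 for c in covariate))
    return [v - (my + beta * (c - mc)) for v, c in zip(y, covariate)]

def partial_corr(x, y, covariate):
    """Correlation of x and y, controlling for one covariate."""
    return pearson(residuals(x, covariate), residuals(y, covariate))

# Hypothetical grip-strength and executive-function scores, controlling for age
age = [65, 70, 75, 80, 85]
grip = [30.0, 28.5, 26.0, 24.5, 23.0]
ef_score = [1.4, 1.1, 1.2, 0.8, 0.7]
print(round(partial_corr(grip, ef_score, age), 3))
```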
Kraft und Kognition [Strength and Cognition]
(2023)
Empirical findings from cross-sectional studies in recent years point to an association between muscular strength and cognitive performance [10]. This observation is supported by longitudinal studies documenting improvements in cognitive performance following targeted resistance-training interventions, which typically increase muscular strength [11]. The mechanisms underlying the association between muscular strength and cognitive performance, however, are not yet fully understood and require further research [10,12]. Against this background, the studies conducted as part of this dissertation pursued the overarching goal of investigating the mechanisms that may explain the association between muscular strength and cognitive performance. To this end, different populations (young adults and older adults with and without mild cognitive impairment) were studied using several methodological approaches (systematic literature review, dual-task paradigm, and functional near-infrared spectroscopy). The consecutively building studies conducted within this dissertation yielded the following main findings:
• To obtain a comprehensive overview of the current evidence on muscular strength and cognitive performance and their underlying neural correlates, a systematic literature review was conducted. Its results document that targeted resistance training can, in addition to improving cognitive performance, lead to functional and structural changes in the brain, particularly in frontal brain regions [13]. Furthermore, the limited number of available studies identified in this review (n = 18) indicates the need for further research in this field [13].
• To test the hypothesis that executing resistance-training exercises requires higher cognitive processes, an experimental study applied the dual-task paradigm to the squat exercise in younger healthy adults. The dual-task costs observed during the squat (compared with the control condition of standing) point to the involvement of higher cognitive processes in solving this movement task and confirm the hypothesis [14].
• To examine the hypothesis that specific neural correlates (functional brain activity) mediate the association between muscular strength and cognitive performance, the relationship between maximal handgrip strength (normalized to body mass index) and the cortical hemodynamic response was investigated in young healthy adults; the hemodynamic response was measured in prefrontal brain areas by functional near-infrared spectroscopy during a standardized cognitive test. In this cross-sectional study, the initial hypothesis could not be fully confirmed: although associations between maximal handgrip strength and cognitive performance on the one hand and parameters of the hemodynamic response on the other were observed, maximal handgrip strength was not related to short-term memory performance [16].
• To examine the assumption that a neurological condition (specifically, mild cognitive impairment), which is typically accompanied by changes in specific neural correlates (e.g., of the hippocampus [17-19] and the prefrontal cortex [20,21]), influences the association between muscular strength and cognitive performance, a cross-sectional study investigated the relationship between maximal handgrip strength (normalized to body mass index) and executive functions in older adults with the amnestic and non-amnestic subtypes of mild cognitive impairment as well as in healthy older adults. An association between maximal handgrip strength and executive functions was observed only in older adults with the amnestic subtype of mild cognitive impairment, not in older adults with the non-amnestic subtype or in healthy older adults [24].
• A perspective article outlined how the theory-guided use of physiological effects that occur in a special resistance-training method, in which peripheral blood flow is moderated by cuffs or bands, could allow populations with low mechanical load tolerance in particular to benefit from the positive effects of resistance training on brain health [25].
Overall, the results of the consecutively building studies combined in this dissertation point to shared neural correlates (e.g., the frontal cortex) that play an important role both for muscular strength and for higher cognitive processes [26]. Considered together with the empirical evidence already available in the literature, the findings of this dissertation support the view that a relatively high level of muscular strength, and its maintenance through targeted resistance-training interventions across the lifespan, can have positive effects on (brain) health [27].
Macrophages in pathologically expanded, dysfunctional white adipose tissue are exposed to a mix of potential modulators of the inflammatory response, including fatty acids released from insulin-resistant adipocytes, increased levels of insulin produced to compensate for insulin resistance, and prostaglandin E2 (PGE2) released from activated macrophages. The current study addressed the question of how palmitate might interact with insulin or PGE2 to induce the formation of the chemotactic pro-inflammatory cytokine interleukin-8 (IL-8). Human THP-1 cells were differentiated into macrophages. In these macrophages, palmitate induced IL-8 formation. Insulin enhanced both the induction of IL-8 formation by palmitate and the palmitate-dependent stimulation of PGE2 synthesis. PGE2 in turn elicited IL-8 formation on its own and enhanced the induction of IL-8 release by palmitate, most likely by activating the EP4 receptor. Since IL-8 causes insulin resistance and fosters inflammation, the increase in palmitate-induced IL-8 formation caused by hyperinsulinemia and locally produced PGE2 in chronically inflamed adipose tissue might favor disease progression in a vicious feed-forward cycle.
Background
Depression is one of the key factors contributing to difficulties in one's ability to work and is one of the major reasons why employees apply for psychotherapy and receive insurance subsidization of treatments. Hence, a growing number of studies rely on work-ability assessment scales as their primary outcome measure. The Work and Social Adjustment Scale (WSAS) has been documented as one of the most psychometrically reliable and valid tools specifically developed to assess work ability and social functioning in patients with mental health problems. Yet, the application of the WSAS in Germany has been limited by the lack of a validated German-language version. The objective of the present study was to translate the WSAS, a brief and easily administered tool, into German and to test its psychometric properties in a sample of adults with depression.
Methods
Two hundred seventy-seven patients (mean age = 48.3 years, SD = 11.1) with mild to moderately severe depression were recruited. A multistep translation from English into German was performed, and factorial validity, criterion validity, convergent validity, discriminant validity, internal consistency, and floor and ceiling effects were examined.
Results
The confirmatory factor analysis confirmed the one-factor structure of the WSAS. Significant correlations with the WHODAS 2.0 questionnaire, a measure of functioning, demonstrated good convergent validity. Significant correlations with depression and quality of life demonstrated good criterion validity. The WSAS also demonstrated strong internal consistency (α = .89), and the absence of floor and ceiling effects indicated good sensitivity of the instrument.
Conclusions
The results of the present study demonstrate that the German version of the WSAS has good psychometric properties, comparable to other international versions of this scale. The findings support a global assessment of psychosocial functioning using the WSAS sum score.
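The internal-consistency coefficient reported above (Cronbach's α) can be computed from an items-by-respondents score matrix as α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch with toy scores (not the study's item data):

```python
# Illustrative sketch -- toy scores, not the study's item data or code.
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / var(totals)).

def cronbach_alpha(item_scores):
    """item_scores: list of k items, each a list of n respondents' scores."""
    k = len(item_scores)
    n = len(item_scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in item_scores) / var(totals))

# Toy example: two items answered by three respondents
print(round(cronbach_alpha([[1, 2, 3], [2, 4, 6]]), 3))  # → 0.889
```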
The primary aim of the current study was to examine the unique contributions of psychological need frustration and need satisfaction to the prediction of adults' mental well-being and ill-being in a heterogeneous sample of adults (N = 334; Mage = 43.33, SD = 32.26; 53% female). Prior to this, validity evidence was provided for the German version of the Basic Psychological Need Satisfaction and Frustration Scale (BPNSFS), which is based on Self-Determination Theory (SDT). The validation analyses found the German BPNSFS to be a valid and reliable instrument. Further, structural equation modeling (SEM) showed that need satisfaction and need frustration each yielded unique and opposing associations with well-being. Specifically, psychological need frustration predicted adults' ill-being. Future research should examine whether the frustration of psychological needs is involved in the onset and maintenance of psychopathology (e.g., major depressive disorder).
Extracellular vesicles
(2021)
Osteoporosis is characterized by low bone mass and damage to the bone tissue’s microarchitecture, leading to increased fracture risk. Several studies have provided evidence for associations between psychosocial stress and osteoporosis through various pathways, including the hypothalamic-pituitary-adrenocortical axis, the sympathetic nervous system, and other endocrine factors. As psychosocial stress provokes oxidative cellular stress with consequences for mitochondrial function and cell signaling (e.g., gene expression, inflammation), it is of interest whether extracellular vesicles (EVs) may be a relevant biomarker in this context or act by transporting substances. EVs are intercellular communicators, transfer substances encapsulated in them, modify the phenotype and function of target cells, mediate cell-cell communication, and, therefore, have critical applications in disease progression and clinical diagnosis and therapy. This review summarizes the characteristics of EVs, their role in stress and osteoporosis, and their benefit as biological markers. We demonstrate that EVs are potential mediators of psychosocial stress and osteoporosis and may be beneficial in innovative research settings.