Myasthenia gravis is an autoimmune disease affecting neuromuscular transmission and causing skeletal muscle weakness. Additionally, systemic inflammation, cognitive deficits and autonomic dysfunction have been described.
However, little is known about myasthenia gravis-related reorganization of the brain. In this study, we thus investigated the structural and functional brain changes in myasthenia gravis patients.
Eleven myasthenia gravis patients (age: 70.64 ± 9.27 years; 11 males) were compared to age-, sex- and education-matched healthy controls (age: 70.18 ± 8.98 years; 11 males). Most of the patients (n = 10, 91%) received cholinesterase inhibitors.
Structural brain changes were determined by applying voxel-based morphometry using high-resolution T1-weighted sequences. Functional brain changes were assessed with a neuropsychological test battery (including attention, memory and executive functions), a spatial orientation task and brain-derived neurotrophic factor blood levels.
Myasthenia gravis patients showed significant grey matter volume reductions in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus. Furthermore, myasthenia gravis patients showed significantly lower performance in executive functions, working memory (Spatial Span, P = 0.034, d = 1.466), verbal episodic memory (P = 0.003, d = 1.468) and somatosensory-related spatial orientation (Triangle Completion Test, P = 0.003, d = 1.200).
Additionally, serum brain-derived neurotrophic factor levels were significantly higher in myasthenia gravis patients (P = 0.001, d = 2.040). Our results indicate that myasthenia gravis is associated with structural and functional brain alterations. In particular, the grey matter volume changes in the cingulate gyrus and the inferior parietal lobe could be associated with the cognitive deficits in memory and executive functions.
Furthermore, deficits in somatosensory-related spatial orientation could be associated with the lower volumes in the inferior parietal lobe. Future research is needed to replicate these findings independently in a larger sample and to investigate the underlying mechanisms in more detail.
Klaus et al. compared myasthenia gravis patients to matched healthy control subjects and identified functional alterations in memory functions as well as structural alterations in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus.
Background:
From birth to young adulthood, health and development of young people are strongly linked to their living situation, including their family's socioeconomic position (SEP) and living environment. The impact of regional characteristics on development in early childhood beyond family SEP has been rarely investigated. This study aimed to identify regional predictors of global developmental delay at school entry taking family SEP into consideration.
Method:
We used representative, population-based data from mandatory school entry examinations of the German federal state of Brandenburg in 2018/2019 with n = 22,801 preschool children. By applying binary multilevel models, we hierarchically analyzed the effect of regional deprivation, defined by the German Index of Socioeconomic Deprivation (GISD), and rurality, operationalized as the inverted population density of the children's school district, on global developmental delay (GDD), while adjusting for family SEP (low, medium and high).
Results:
Family SEP was significantly and strongly linked to GDD. Compared to children with the highest family SEP, the odds of GDD were higher for children with a medium SEP (female: OR=4.26, male: OR=3.46) and a low SEP (female: OR=16.58, male: OR=12.79). Furthermore, we discovered a smaller, but additional and independent effect of regional socioeconomic deprivation on GDD, with higher odds for children from a more deprived school district (female: OR=1.35, male: OR=1.20). However, rurality did not show a significant link to GDD in preschool children beyond family SEP and regional deprivation.
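The odds ratios above summarize how much more likely GDD is in one SEP group than in the reference group. As a minimal sketch of how such an odds ratio is computed from a 2×2 table (the counts below are purely illustrative, not the Brandenburg study's data):

```python
# Odds ratio from a hypothetical 2x2 table (illustrative counts only,
# not taken from the school entry examination data).
#                 GDD   no GDD
# low SEP          30      70
# high SEP         10      90

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

or_low_vs_high = odds_ratio(30, 70, 10, 90)
print(round(or_low_vs_high, 2))  # 3.86
```

An OR above 1 here means the odds of GDD are higher in the low-SEP row than in the high-SEP reference row.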
Conclusion:
Family SEP and regional deprivation are risk factors for delayed child development and of particular interest for promoting children's health in early childhood and over the life course.
The investigation of protein structures, functions and interactions often requires modifications to adapt protein properties to the specific application. Among many possible methods to equip proteins with new chemical groups, the utilization of orthogonal aminoacyl-tRNA synthetase/tRNA pairs enables the site-specific incorporation of non-canonical amino acids at defined positions in the protein. The open nature of cell-free protein synthesis reactions provides an optimal environment, as the orthogonal components do not need to be transported across the cell membrane and the impact on cell viability is negligible. In the present work, it was shown that the expression of orthogonal aminoacyl-tRNA synthetases in CHO cells prior to cell disruption enhanced the modification of the pharmaceutically relevant adenosine A2a receptor. For this purpose, in complement to transient transfection of CHO cells, an approach based on CRISPR/Cas9 technology was selected to generate a translationally active cell lysate harboring endogenous orthogonal aminoacyl-tRNA synthetase.
Introduction: Airway infection with pathogens and its associated pulmonary exacerbations (PEX) are the major causes of morbidity and premature death in cystic fibrosis (CF). Preventing or postponing chronic infections requires early diagnosis. However, limitations of conventional microbiology-based methods can hamper identification of exacerbations and specific pathogen detection. Analyzing volatile organic compounds (VOCs) in breath samples may be an interesting tool in this regard, as VOC-biomarkers can characterize specific airway infections in CF.
Areas covered: We address the current achievements in VOC-analysis and discuss studies assessing VOC-biomarkers and fingerprints, i.e., a combination of multiple VOCs, in breath samples aiming at pathogen and PEX detection in people with CF (pwCF). We aim to provide bases for further research in this field.
Expert opinion: Overall, VOC-based analysis is a promising tool for diagnosis of infection and inflammation with potential to monitor disease progression in pwCF. Advantages over conventional diagnostic methods, including easy and non-invasive sampling procedures, may help to drive prompt, suitable therapeutic approaches in the future. Our review shall encourage further research, including validation of VOC-based methods. Specifically, longitudinal validation under standardized conditions is of interest in order to ensure repeatability and enable inclusion in CF diagnostic routine.
Objective
For an effective control of the SARS-CoV-2 pandemic with vaccines, most people in a population need to be vaccinated. It is thus important to know how to inform the public with reference to individual preferences, while also acknowledging the societal preference to encourage vaccinations. According to the health care standard of informed decision-making, a comparison of the benefits and harms of (not) having the vaccination would be required to inform undecided and skeptical people. To test evidence-based fact boxes, an established risk communication format, and to inform their development, we investigated their contribution to knowledge and evaluations of COVID-19 vaccines.
Methods
We conducted four studies (1, 2, and 4 were population-wide surveys with N = 1,942 to N = 6,056): Study 1 assessed the relationship between vaccination knowledge and intentions in Germany over three months. Study 2 assessed respective information gaps and needs of the population in Germany. In parallel, an experiment (Study 3) with a mixed design (presentation formats; pre-post-comparison) assessed the effect of fact boxes on risk perceptions and fear, using a convenience sample (N = 719). Study 4 examined how effective two fact box formats are for informing vaccination intentions, with a mixed experimental design: between-subjects (presentation formats) and within-subjects (pre-post-comparison).
Results
Study 1 showed that vaccination knowledge and vaccination intentions increased between November 2020 and February 2021. Study 2 revealed objective information requirements and subjective information needs. Study 3 showed that the fact box format is effective in adjusting risk perceptions concerning COVID-19. Based on those results, fact boxes were revised and implemented with the help of a national health authority in Germany. Study 4 showed that simple fact boxes increase vaccination knowledge and positive evaluations in skeptics and undecideds.
Conclusion
Fact boxes can inform COVID-19 vaccination intentions of undecided and skeptical people without threatening societal vaccination goals of the population.
Genetic engineering has provided humans the ability to transform organisms by direct manipulation of genomes within a broad range of applications including agriculture (e.g., GM crops) and the pharmaceutical industry (e.g., insulin production). Developments within the last 10 years have produced new tools for genome editing (e.g., CRISPR/Cas9) that can achieve much greater precision than previous forms of genetic engineering. Moreover, these tools could offer the potential for interventions on humans for both clinical and non-clinical purposes, resulting in a broad scope of applicability. However, their promising abilities and potential uses (including their applicability in humans for either somatic or heritable genome editing interventions) greatly increase their potential societal impacts and, as such, have brought an urgency to ethical and regulatory discussions about the application of such technology in our society. In this article, we explore different arguments (pragmatic, sociopolitical and categorical) that have been made in support of or in opposition to the new technologies of genome editing and their impact on the debate of the permissibility or otherwise of human heritable genome editing interventions in the future. For this purpose, reference is made to discussions on genetic engineering that have taken place in the field of bioethics since the 1980s. Our analysis shows that the dominance of categorical arguments has been reversed in favour of pragmatic arguments such as safety concerns. However, when it comes to involving the public in ethical discourse, we consider it crucial to widen the debate beyond such pragmatic considerations. In this article, we explore some of the key categorical as well as sociopolitical considerations raised by the potential uses of heritable genome editing interventions, as these considerations underline many of the societal concerns and values crucial for public engagement.
We also highlight how pragmatic considerations, despite their increasing importance in the work of recent authoritative sources, are unlikely to be the result of progress on outstanding categorical issues, but rather reflect the limited progress on these aspects and/or pressures in regulating the use of the technology.
Boredom has been identified as one of the greatest psychological challenges when staying at home during quarantine and isolation. However, this does not mean that the situation necessarily causes boredom. On the basis of 13 explorative interviews with bored and non-bored persons who have been under quarantine or in isolation, we explain why boredom is related to a subjective interpretation process rather than being a direct consequence of the objective situation. Specifically, we show that participants vary significantly in their interpretations of staying at home and, thus, also in their experience of boredom. While the non-bored participants interpret the situation as a relief or as irrelevant, the bored participants interpret it as a major restriction that only some are able to cope with.
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression compared to HIV-negative individuals. This cross-sectional study (n = 450, median age 44 (19-75), 7.3% females) evaluates the prevalence rates and prevalence ratio (PR) of anxiety and/or depression in PLWH associated with recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 0.36-0.91; p = 0.01), depression (PR = 0.41; 0.36-0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 0.24-0.75; p = 0.002) was found in exercising compared to non-exercising PLWH. Recreational exercise is associated with a lower risk for anxiety and/or depression. Further prospective studies are needed to provide insights on the direction of this association.
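A prevalence ratio such as those reported above is simply the prevalence of the outcome in the exposed group divided by the prevalence in the unexposed group. A minimal sketch with hypothetical counts (not the study's data):

```python
# Prevalence ratio (PR) from hypothetical counts; a PR below 1 means the
# outcome is less prevalent among the exposed (here: exercising) group.
def prevalence_ratio(cases_exposed: int, n_exposed: int,
                     cases_unexposed: int, n_unexposed: int) -> float:
    """PR = prevalence in exposed / prevalence in unexposed."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# e.g. 10/200 exercising vs 25/200 non-exercising participants with anxiety
pr = prevalence_ratio(10, 200, 25, 200)
print(round(pr, 2))  # 0.4
```

Unlike an odds ratio, the PR compares risks directly, which is why it is often preferred for common outcomes in cross-sectional designs.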
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the most predictive stress-related pattern set for CLBP for a 1-year diagnosis.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role for upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric variables (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarkers (CPI: RMSE = 12.21; DISS: RMSE = 8.94) could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression and the emergence of a “hypocortisolemic symptom triad,” in which social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that enables identification of individuals at higher risk for upcoming pain disorders and can be used in practice.
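Prediction accuracy in the methods above is quantified as root mean squared error (RMSE), the square root of the average squared deviation between observed and predicted scores. As a reference, a minimal RMSE computation (the score values below are hypothetical, not the study's data):

```python
import math

def rmse(observed: list, predicted: list) -> float:
    """Root mean squared error between observed and predicted values."""
    sq_errors = [(o - p) ** 2 for o, p in zip(observed, predicted)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Hypothetical pain-intensity scores for four participants
observed = [3.0, 5.0, 2.5, 7.0]
predicted = [2.5, 5.0, 4.0, 8.0]
print(round(rmse(observed, predicted), 2))  # 0.94
```

Lower RMSE indicates better agreement between model predictions and observations, which is why the biomarker set's smaller RMSE values signal a slightly more accurate pattern set.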
Background
Elderly patients are a growing population in cardiac rehabilitation (CR). As postural control declines with age, assessment of impaired balance is important in older CR patients in order to predict fall risk and to initiate counteracting steps. Functional balance tests are subjective, lack adequate sensitivity to small differences, and are subject to ceiling effects. A quantitative approach that measures postural control on a continuous scale is therefore desirable. Force plates are already used for this purpose in other clinical contexts and could therefore be a promising tool for older CR patients as well. However, in this population the reliability of the assessment is not fully known.
Research question
Analysis of test-retest reliability of center of pressure (CoP) measures for the assessment of postural control using a force plate in older CR patients.
Methods
156 CR patients (> 75 years) were enrolled. CoP measures (path length (PL), mean velocity (MV), and 95% confidence ellipse area (95CEA)) were analyzed twice with an interval of two days in between (bipedal narrow stance, eyes open (EO) and closed (EC), three trials for each condition, 30 s per trial), using a force plate. For test-retest reliability estimation, absolute differences (Δ: T0−T1), intraclass correlation coefficients (ICC) with 95% confidence intervals, standard error of measurement and minimal detectable change were calculated.
Results
Under the EO condition, ICC were excellent for PL and MV (0.95) and good for 95CEA (0.88), with Δ of 10.1 cm (PL), 0.3 cm/sec (MV) and 1.5 cm² (95CEA), respectively. Under the EC condition, ICC were excellent (> 0.95) for all variables, with larger Δ (PL: 21.7 cm; MV: 0.7 cm/sec; 95CEA: 2.4 cm²).
Significance
In older CR patients, the assessment of CoP measures using a force plate shows good to excellent test-retest reliability.
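The standard error of measurement (SEM) and minimal detectable change (MDC) named in the methods follow directly from the ICC and the between-subject standard deviation. A sketch of the standard formulas with hypothetical inputs (the SD of 10 cm is illustrative; only the ICC of 0.95 matches a value reported above):

```python
import math

def sem(sd: float, icc: float) -> float:
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95% confidence: 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2) * sem_value

s = sem(sd=10.0, icc=0.95)   # hypothetical SD of 10 cm for path length
print(round(s, 2))           # 2.24
print(round(mdc95(s), 2))    # 6.2
```

A change in a patient's CoP measure smaller than the MDC cannot be distinguished from measurement noise, which is why high ICCs (and thus small SEM/MDC) matter for tracking postural control over time.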
Basic psychological needs theory postulates that a social environment that satisfies individuals’ three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. On the other hand, the frustration of these needs is associated with ill-being and depressive symptoms foremost investigated in non-clinical samples; yet, there is a paucity of research on need frustration in clinical samples. Survey data were compared between adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs when compared to non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of the self-determination theory and have preliminary implications in tailoring therapy for depression.
Dementia, one of the most prevalent diseases, urges a better understanding of the central mechanisms responsible for clinical symptoms and necessitates improvement of current diagnostic capabilities. The brainstem nucleus locus coeruleus (LC) is a promising target for early diagnosis because of its early structural alterations and its relationship to the functional disturbances in the patients. In this study, we applied our improved method of localisation-based LC resting-state fMRI to investigate the differences in central sensory signal processing when comparing functional connectivity (fc) of a patient group with mild cognitive impairment (MCI, n = 28) and an age-matched healthy control group (n = 29). MCI and control participants could be differentiated in their Mini-Mental-State-Examination (MMSE) scores (p < .001) and LC intensity ratio (p = .010). In the fMRI, LC fc to anterior cingulate cortex (FDR p < .001) and left anterior insula (FDR p = .012) was elevated, and LC fc to right temporoparietal junction (rTPJ, FDR p = .012) and posterior cingulate cortex (PCC, FDR p = .021) was decreased in the patient group. Importantly, LC to rTPJ connectivity was also positively correlated to MMSE scores in MCI patients (p = .017). Furthermore, we found a hyperactivation of the left-insula salience network in the MCI patients. Our results and our proposed disease model shed new light on the functional pathogenesis of MCI by directing to attentional network disturbances, which could aid new therapeutic strategies and provide a marker for diagnosis and prediction of disease progression.
Association of primary allostatic load mediators and metabolic syndrome (MetS): A systematic review
(2022)
Allostatic load (AL) exposure may cause detrimental effects on the neuroendocrine system, leading to metabolic syndrome (MetS). The primary mediators of AL include serum dehydroepiandrosterone sulfate (DHEAS; a functional HPA axis antagonist) as well as cortisol, urinary norepinephrine (NE), and epinephrine (EPI) excretion levels (assessed in 12-h urine as the gold standard for evaluating HPA axis and sympathetic nervous system activity). However, the evidence of an association between the primary mediators of AL and MetS is limited. This systematic review aimed to critically examine the association between the primary mediators of AL and MetS. PubMed and Web of Science were searched for articles published in English from January 2010 to December 2021. The search strategy focused on cross-sectional and case–control studies comprising adult participants with MetS, obesity, or overweight and without chronic diseases. The STROBE checklist was used for study quality control. Of 770 studies, twenty-one studies with a total sample size of n = 10,666 met the eligibility criteria. Eighteen studies were cross-sectional, and three were case–control studies. The included studies had a completeness of reporting score of COR % = 87.0 ± 6.4%. Notably, cortisol as a primary mediator of AL showed an association with MetS in 50% (urinary cortisol), 40% (serum cortisol), 60% (salivary cortisol), and 100% (hair cortisol) of the studies. For DHEAS, 60% of the studies showed an association with MetS. In contrast, urinary EPI and urinary NE showed no association with MetS in 100% of the studies. In summary, there is a tendency for an association between higher serum, salivary, urinary, and hair cortisol as well as lower DHEAS levels and MetS. Future studies focusing on longitudinal data are warranted to clarify and understand the association between the primary mediators of AL and MetS.
In recent years, digital technologies have become a major means of providing health-related services, and this trend was strongly reinforced by the Coronavirus disease 2019 (COVID-19) pandemic. As regular physical activity is well known to have positive effects on physical and mental health, and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. In the course of this development, however, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, these terms are often used in several different ways, and also relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
Training intervention effects on cognitive performance and neuronal plasticity — A pilot study
(2022)
Studies suggest that people suffering from chronic pain may have altered brain plasticity, along with altered functional connectivity between pain-processing brain regions. These changes may be related to decreased mood and cognitive performance. There is some debate as to whether physical activity combined with behavioral therapy (e.g., cognitive distraction, body scan) may counteract these changes; however, the underlying neuronal mechanisms are unclear. The aim of the current pilot study, with a 3-armed randomized controlled trial design, was to examine the effects of sensorimotor training for non-specific chronic low back pain on (1) cognitive performance; (2) fMRI activity co-fluctuations (functional connectivity) between pain-related brain regions; and (3) the relationship between functional connectivity and subjective variables (pain and depression). Six hundred and sixty-two volunteers with non-specific chronic low back pain were randomly allocated to a unimodal (sensorimotor training) intervention, a multidisciplinary (sensorimotor training and behavioral therapy) intervention, or a control group within a multicenter study. A subsample of patients (n = 21) from one study center participated in the pilot study presented here. Measurements took place at baseline, during the intervention (3 weeks, M2) and after the intervention (12 weeks, M4, and 24 weeks, M5). Cognitive performance was measured with the Trail Making Test and functional connectivity with MRI. Pain perception and depression were assessed with the Von Korff questionnaire and the Hospital Anxiety and Depression Scale. Group differences were calculated with univariate and repeated measures ANOVAs and Bayesian statistics; correlations with Pearson's r. Changes and correlations of functional connectivity were analyzed within a pooled intervention group (uni- and multidisciplinary groups). Results revealed that participants with higher pain intensity at baseline showed higher functional connectivity between the pain-related brain areas used as ROIs in this study.
Though small sample sizes limit generalization, cognitive performance increased in the multidisciplinary group. Increased functional connectivity was observed in participants with increased pain ratings. Pain ratings and connectivity in pain-related brain regions decreased after the intervention. The results provide a preliminary indication that intervention effects can potentially be achieved at the cognitive and neuronal level. The intervention may be suitable for therapy and prevention of non-specific chronic low back pain.
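As the abstract notes, correlations were computed with Pearson's r. A minimal, self-contained sketch of this statistic follows; the data below are hypothetical illustrations, not values from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# hypothetical example: pain ratings vs. functional connectivity strength
pain = [2.0, 3.5, 5.0, 6.5, 8.0]
connectivity = [0.21, 0.30, 0.38, 0.49, 0.55]
r = pearson_r(pain, connectivity)
```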
Objective
To improve consumer decision making, the results of risk assessments on food, feed, consumer products or chemicals need to be communicated not only to experts but also to non-expert audiences. The present study draws on evidence from literature reviews and focus groups with diverse stakeholders to identify content to integrate into an existing risk assessment communication (Risk Profile).
Methods
A combination of rapid literature reviews and focus groups with experts (risk assessors (n = 15), risk managers (n = 8)), and non-experts (general public (n = 18)) were used to identify content and strategies for including information about risk assessment results in the “Risk Profile” from the German Federal Institute for Risk Assessment. Feedback from initial focus groups was used to develop communication prototypes that informed subsequent feedback rounds in an iterative process. A final prototype was validated in usability tests with experts.
Results
Focus group feedback and suggestions from risk assessors were largely in line with findings from the literature. Risk managers and lay persons offered similar suggestions on how to improve the existing communication of risk assessment results (e.g., including more explanatory detail, reporting probabilities for individual health impairments, and specifying risks for subgroups in additional sections). Risk managers found information about quality of evidence important to communicate, whereas people from the general public found this information less relevant. Participants from lower educational backgrounds had difficulties understanding the purpose of risk assessments. User tests found that the final prototype was appropriate and feasible to implement by risk assessors.
Conclusion
An iterative and evidence-based process was used to develop content to improve the communication of risk assessments to the general public while being feasible to use by risk assessors. Remaining challenges include how to communicate dose-response relationships and standardise quality of evidence ratings across disciplines.
Background
Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which can develop from an acute ankle sprain, may negatively affect quality of life and ankle functionality and increase the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventative strategy against CAI in the basketball population, gathering epidemiological data is essential; however, epidemiological data on CAI in basketball are limited. Therefore, this study aimed to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence.
Methods
In this cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. In addition to non-standardized questions about demographics and their history of ankle sprains, participants filled out the standard Cumberland Ankle Instability Tool, which was applied to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (male = 243, female = 145; 22.3 ± 3.8 years; 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between genders, competitive levels and playing positions were determined using the Chi-square test.
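The Chi-square test mentioned above compares prevalence across groups from a contingency table. A minimal sketch for a 2×2 table with 1 degree of freedom, using only the Python standard library; the counts below are hypothetical, not the study's data:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test of independence for a 2x2 table
    [[a, b], [c, d]] (e.g. CAI present/absent by gender), df = 1."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    observed = [a, b, c, d]
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # survival function of the chi-square distribution with 1 df
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical counts: CAI present/absent among 145 women and 243 men
chi2, p = chi2_2x2(120, 25, 170, 73)
```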
Results
In the surveyed cohort, 26% had unilateral CAI and 50% had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). This gender disparity was also evident in the sub-analyses: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence did not differ between competitive levels (p > 0.05) or among playing positions (p > 0.05).
Conclusions
CAI is highly prevalent in the basketball population, and gender affects its prevalence, whereas prevalence is similar regardless of competitive level and playing position. The characteristics of basketball contribute to this high prevalence. Prevention of CAI should therefore be a focus in basketball, and gender should be taken into consideration when applying CAI prevention measures.
The Role of the Precuneus in Human Spatial Updating in a Real Environment Setting—A cTBS Study
(2022)
As we move through an environment, we update the positions of our body relative to other objects, even when some objects temporarily or permanently leave our field of view; this ability is termed egocentric spatial updating and plays an important role in everyday life. However, knowledge about its representation in the brain remains scarce, with previous studies using virtual movements in virtual environments or patients with brain lesions suggesting that the precuneus might play an important role. Whether this assumption also holds when healthy humans move in real environments, where full body-based cues are available in addition to the visual cues typically used in many VR studies, is unclear. Therefore, in this study we investigated the role of the precuneus in egocentric spatial updating in a real environment setting in 20 healthy young participants who underwent two conditions in a cross-over design: (a) a stimulation condition, in which continuous theta-burst stimulation (cTBS) was applied to inhibit the precuneus, and (b) a sham condition (activated coil turned upside down). In both conditions, participants had to walk back, blindfolded, to objects they had previously memorized while walking with open eyes. Simplified trials (without spatial updating) were used as a control condition to make sure the participants were not affected by factors such as walking blindfolded, vestibular deficits or working memory deficits. A significant interaction was found: participants performed better in the sham condition than under real stimulation, showing smaller errors in both distance and angle. The results of our study provide evidence of an important role of the precuneus in real-environment egocentric spatial updating; studies with larger samples are necessary to confirm and further investigate this finding.
Older adults with amnestic mild cognitive impairment (aMCI) who, in addition to their memory deficits, also suffer from frontal-executive dysfunctions have a higher risk of later developing dementia than older adults with aMCI without executive deficits and older adults with non-amnestic MCI (naMCI). Handgrip strength (HGS) is also correlated with the risk of cognitive decline in older adults. Hence, the current study aimed to investigate the associations between HGS and executive functioning in individuals with aMCI, naMCI and healthy controls (HC). Older, right-handed adults with aMCI, naMCI, and HC performed a handgrip strength measurement with a handheld dynamometer. Executive functions were assessed with the Trail Making Test (TMT A & B). Normalized handgrip strength (nHGS, normalized to Body Mass Index (BMI)) was calculated, and its associations with executive functions (operationalized through z-scores of the TMT B/A ratio) were investigated through partial correlation analyses (i.e., accounting for age, sex, and severity of depressive symptoms). A positive, low-to-moderate correlation between right nHGS (rp(22) = 0.364; p = 0.063) and left nHGS (rp(22) = 0.420; p = 0.037) and executive functioning was observed in older adults with aMCI but not in those with naMCI or in HC. Our results suggest that higher levels of nHGS are linked to better executive functioning in aMCI but not in naMCI or HC. This relationship is perhaps driven by alterations in the integrity of the hippocampal-prefrontal network occurring in older adults with aMCI. Further research is needed to provide empirical evidence for this assumption.
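The partial correlation analysis described above amounts to correlating the residuals of both variables after regressing each on the covariates (age, sex, depressive symptoms). A minimal NumPy sketch; the data and variable names below are hypothetical, not the study's:

```python
import numpy as np

def partial_corr(x, y, covars):
    """Partial correlation of x and y controlling for covariates:
    regress x and y on the covariates (plus intercept), then
    correlate the two residual series."""
    X = np.column_stack([np.ones(len(x)), covars])
    res_x = x - X @ np.linalg.lstsq(X, x, rcond=None)[0]
    res_y = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return float(np.corrcoef(res_x, res_y)[0, 1])

# hypothetical data: grip strength, TMT B/A z-score, and covariates
rng = np.random.default_rng(42)
n = 24
covars = np.column_stack([rng.normal(70, 8, n),    # age
                          rng.integers(0, 2, n),   # sex
                          rng.normal(5, 2, n)])    # depressive symptoms
grip = rng.normal(30, 5, n)
tmt_z = 0.4 * grip + rng.normal(0, 3, n)
r_p = partial_corr(grip, tmt_z, covars)
```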
Introduction: Previous studies suggest that approximately 30-40% of patients in cardiac rehabilitation present with a specific occupational problem situation (besondere berufliche Problemlage, BBPL). The hindering and facilitating factors of returning to work have been studied extensively. For example, a positive perception of one's health, freedom from symptoms and job satisfaction can be named as facilitating factors, and comorbidities, disease severity, motivational reasons and age as obstacles. This study aimed to identify and describe the factors that determine the subjective occupational prospects of patients in cardiac follow-up rehabilitation (Anschlussheilbehandlung, AHB), and to derive from these factors impulses for a patient-centered approach in AHB.
Methods: In a qualitative, monocentric interview study, a total of 20 patients with and without BBPL in cardiac AHB were interviewed as experts in order to elicit their subjective employment expectations and to better understand the patient perspective. The interviews were recorded, transcribed and coded. The evaluation was carried out using thematic analysis.
Results: Seven key themes were identified. These included disease-related prior experiences as well as ideas about the future as prospective influencing factors. In addition, internal and external aspects, including health perception (incl. assessment of one's resilience), the modifiability of working conditions and the fear of falling ill again, were identified as significant themes. It also became clear that the BBPL patients wanted to return to working life, but that the cardiac event had led to a perceived need for changes in lifestyle and priorities. The patients wanted to take time to implement these changes, and their social environment also supported prioritizing their health.
Conclusion: These findings indicate the need for a multiprofessional yet individually differentiated approach in cardiac AHB. A particular focus should be placed on taking patients' self-expectations into account, on individual goal setting with regard to the occupational future, and on involving the social environment. Furthermore, a revision of the BBPL concept is proposed, since the assignment of such a problem situation by the funding body appears paradoxical and stigmatizing.