Older adults with amnestic mild cognitive impairment (aMCI) who, in addition to their memory deficits, also suffer from frontal-executive dysfunctions have a higher risk of later developing dementia than older adults with aMCI without executive deficits and older adults with non-amnestic MCI (naMCI). Handgrip strength (HGS) is also correlated with the risk of cognitive decline in older adults. Hence, the current study aimed to investigate the associations between HGS and executive functioning in individuals with aMCI, naMCI, and healthy controls (HC). Older, right-handed adults with aMCI, naMCI, and HC performed a handgrip strength measurement with a handheld dynamometer. Executive functions were assessed with the Trail Making Test (TMT A&B). Handgrip strength normalized to Body Mass Index (BMI), denoted nHGS, was calculated, and its associations with executive functions (operationalized through z-scores of the TMT B/A ratio) were investigated through partial correlation analyses accounting for age, sex, and severity of depressive symptoms. A positive, low-to-moderate correlation between right nHGS (rp (22) = 0.364; p = 0.063) and left nHGS (rp (22) = 0.420; p = 0.037) and executive functioning was observed in older adults with aMCI but not in those with naMCI or HC. Our results suggest that higher levels of nHGS are linked to better executive functioning in aMCI but not in naMCI and HC. This relationship is perhaps driven by alterations in the integrity of the hippocampal-prefrontal network occurring in older adults with aMCI. Further research is needed to provide empirical evidence for this assumption.
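The partial correlations reported here (nHGS with the TMT B/A z-score, controlling for age, sex, and depressive symptoms) can be illustrated with a minimal residual-method sketch. The data and variable names below are hypothetical, not the study's data:

```python
import numpy as np

def partial_corr(x, y, covars):
    """Partial correlation of x and y controlling for covariates:
    regress x and y on the covariates (plus an intercept) and
    correlate the residuals."""
    Z = np.column_stack([np.ones(len(x)), covars])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic example with hypothetical nHGS values, executive-function
# z-scores, and covariates (age, sex, depression score).
rng = np.random.default_rng(0)
n = 25
age = rng.normal(70, 5, n)
sex = rng.integers(0, 2, n).astype(float)
dep = rng.normal(5, 2, n)
nhgs = 0.02 * age + rng.normal(0, 1, n)
ef_z = 0.5 * nhgs + 0.01 * age + rng.normal(0, 1, n)
r_p = partial_corr(nhgs, ef_z, np.column_stack([age, sex, dep]))
print(round(r_p, 3))
```

The residual method is equivalent to the usual recursive partial-correlation formula but extends naturally to several covariates at once.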
Although chronic aluminum (Al) neurotoxicity is well documented, there are no well-established experimental protocols of Al exposure. In the current study, the toxic effects of sub-chronic Al exposure were evaluated in outbred male rats (gastrointestinal administration). Forty animals were used: 10 received an AlCl3 water solution (2 mg/kg Al per day) for 1 month, 10 received the same concentration of AlCl3 for 3 months, and 20 (10 per observation period) received saline as controls. After 30 and 90 days, the animals underwent behavioral tests: open field, passive avoidance, extrapolation escape task, and grip strength. At the end of the study, blood, liver, kidney, and brain were excised for analytical and morphological studies. The Al content was measured by inductively coupled plasma mass spectrometry. Essential trace elements (Co, Cr, Cu, Fe, Mg, Mn, Mo, Se, and Zn) were measured in whole blood samples. Although no morphological changes were observed in the brain, liver, or kidney at either exposure duration, dose-dependent Al accumulation and behavioral differences (increased locomotor activity after 30 days) between treatment and control groups were observed. Moreover, after 30 days of exposure, a strong positive correlation between Al content in the brain and blood of individual animals was established, which surprisingly disappeared by the third month. This may indicate adaptation of the neural barrier to Al exposure or saturation of Al transport into the brain. Notably, no clear neurodegenerative process was seen even after this rather prolonged sub-chronic Al exposure, so longer exposure periods are probably required.
Background
The anticancer compound 3-bromopyruvate (3-BrPA) suppresses cancer cell growth by targeting glycolytic and mitochondrial metabolism. The malignant peripheral nerve sheath tumor (MPNST), a very aggressive, therapy-resistant, Neurofibromatosis type 1-associated neoplasia, shows high metabolic activity, and affected patients may therefore benefit from 3-BrPA treatment. To elucidate the specific mode of action, we used a controlled cell model overexpressing proteasome activator (PA) 28, which subsequently leads to p53 inactivation and oncogenic transformation, thereby reproducing an important pathway in MPNST and tumor pathogenesis in general.
Methods
Viability of the MPNST cell lines S462, NSF1, and T265 in response to increasing doses (0–120 µM) of 3-BrPA was analyzed by CellTiter-Blue® assay. Additionally, we investigated viability, reactive oxygen species (ROS) production (dihydroethidium assay), nicotinamide adenine dinucleotide dehydrogenase activity (NADH-TR assay), and lactate production (lactate assay) in mouse B8 fibroblasts overexpressing PA28 in response to 3-BrPA application. All experiments were performed under both normal and nutrient-deficient conditions. The MPNST cell lines were furthermore characterized immunohistochemically for Ki67, p53, bcl2, bcl6, cyclin D1, and p21.
Results
MPNST cells responded significantly and dose-dependently to 3-BrPA application, with S462 cells being the most responsive. Human control cells showed reduced sensitivity. In the PA28-overexpressing cancer cell model, 3-BrPA application mildly impaired mitochondrial NADH dehydrogenase activity and failed to significantly inhibit lactate production. PA28 overexpression was associated with functional glycolysis as well as partial resistance to stress provoked by nutrient deprivation. 3-BrPA treatment was not associated with an increase in ROS. Starvation sensitized MPNST cells to treatment.
Conclusions
Aggressive MPNST cells are sensitive to 3-BrPA therapy in vitro, with and without starvation. In a PA28-overexpression cancer cell model leading to p53 inactivation, thereby reflecting a key molecular feature of human NF1-associated MPNST, the known ability of 3-BrPA to block mitochondrial activity and glycolysis was reproduced; however, oncogenic cells displayed partial resistance. In conclusion, 3-BrPA was sufficient to reduce the viability of NF1-associated MPNST cells, potentially due to inhibition of glycolysis; this finding should prompt further studies and promises a potential benefit for NF1 patients.
Association of primary allostatic load mediators and metabolic syndrome (MetS): A systematic review
(2022)
Allostatic load (AL) exposure may cause detrimental effects on the neuroendocrine system, leading to metabolic syndrome (MetS). The primary mediators of AL include serum dehydroepiandrosterone sulfate (DHEAS; a functional HPA-axis antagonist) as well as cortisol and urinary norepinephrine (NE) and epinephrine (EPI) excretion levels (assessed in 12-h urine as the gold standard for evaluating HPA-axis and sympathetic nervous system activity). However, evidence of an association between the primary mediators of AL and MetS is limited. This systematic review aimed to critically examine the association between the primary mediators of AL and MetS. PubMed and Web of Science were searched for articles published in English from January 2010 to December 2021. The search strategy focused on cross-sectional and case–control studies comprising adult participants with MetS, obesity, or overweight and without chronic diseases. The STROBE checklist was used to assess study quality. Of 770 studies, twenty-one with a total sample size of n = 10,666 met the eligibility criteria. Eighteen studies were cross-sectional and three were case–control studies. The included studies had a completeness of reporting score of COR % = 87.0 ± 6.4%. Notably, cortisol as a primary mediator of AL showed an association with MetS in 50% (urinary cortisol), 40% (serum cortisol), 60% (salivary cortisol), and 100% (hair cortisol) of the studies. For DHEAS, 60% of the studies showed an association with MetS. In contrast, urinary EPI and urinary NE showed no association with MetS in any of the studies. In summary, there is a tendency toward an association of higher serum, salivary, urinary, and hair cortisol and lower levels of DHEAS with MetS. Future studies focusing on longitudinal data are warranted to clarify and understand the association between the primary mediators of AL and MetS.
Background:
From birth to young adulthood, the health and development of young people are strongly linked to their living situation, including their family's socioeconomic position (SEP) and living environment. The impact of regional characteristics on early childhood development beyond family SEP has rarely been investigated. This study aimed to identify regional predictors of global developmental delay at school entry, taking family SEP into consideration.
Method:
We used representative, population-based data from mandatory school entry examinations of the German federal state of Brandenburg in 2018/2019 with n = 22,801 preschool children. Applying binary multilevel models, we hierarchically analyzed the effect of regional deprivation, defined by the German Index of Socioeconomic Deprivation (GISD), and rurality, operationalized as the inverted population density of the children's school district, on global developmental delay (GDD), while adjusting for family SEP (low, medium, and high).
Results:
Family SEP was significantly and strongly linked to GDD. Compared with children with high family SEP, children with medium SEP (female: OR = 4.26, male: OR = 3.46) and low SEP (female: OR = 16.58, male: OR = 12.79) had higher odds of GDD. Furthermore, we discovered a smaller but additional and independent effect of regional socioeconomic deprivation on GDD, with higher odds for children from more deprived school districts (female: OR = 1.35, male: OR = 1.20). However, rurality did not show a significant link to GDD in preschool children beyond family SEP and regional deprivation.
Conclusion:
Family SEP and regional deprivation are risk factors for child development and of particular interest for promoting children's health in early childhood and over the life course.
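The odds ratios in such analyses compare the odds of GDD between groups; as a minimal illustration of the arithmetic (with hypothetical prevalences, not figures from this study):

```python
import math

def odds(p):
    """Odds corresponding to a probability p."""
    return p / (1 - p)

# Hypothetical GDD prevalences in a low- and a high-SEP group
# (illustrative numbers only, not taken from the study):
p_low_sep, p_high_sep = 0.14, 0.04
or_low_vs_high = odds(p_low_sep) / odds(p_high_sep)
print(round(or_low_vs_high, 2))  # odds ratio, low vs. high SEP
```

In a (multilevel) logistic model, such an odds ratio corresponds to exp(β) for the group coefficient, which is why ORs above 1 indicate higher odds of the outcome in the index group.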
This study sought to analyze the relationship between in-season training workload and changes in aerobic power (VO2max), maximum and resting heart rate (HRmax and HRrest), and linear sprint medium (LSM) and short (LSS) tests in soccer players younger than 16 years (under-16 soccer players). We additionally aimed to explain changes in fitness levels during the in-season through regression models, considering accumulated load, baseline levels, and peak height velocity (PHV) as predictors. Twenty-three male sub-elite soccer players aged 15.5 ± 0.2 years (PHV: 13.6 ± 0.4 years; body height: 172.7 ± 4.2 cm; body mass: 61.3 ± 5.6 kg; body fat: 13.7% ± 3.9%; VO2max: 48.4 ± 2.6 mL⋅kg–1⋅min–1) were tested three times across the season (early-season (EaS), mid-season (MiS), and end-season (EnS)) for VO2max, HRmax, LSM, and LSS. Aerobic and speed variables gradually improved over the season and had a strong association with PHV. Moreover, HRmax improved from EaS to EnS; this was more evident in the intermediate period (from EaS to MiS) and had a strong association with VO2max. Regression analysis showed significant predictions for VO2max [F(2, 20) = 8.18, p ≤ 0.001] with an R2 of 0.45. In conclusion, meaningful variation in youth players' fitness levels can be observed across the season, and such changes can be partially explained by the load imposed.
Background: The enzyme-linked immunosorbent assay (ELISA) is an indispensable tool in clinical diagnostics, used to identify or differentiate diseases such as autoimmune illnesses, but also to monitor their progression or control the efficacy of drugs. One use case of ELISA is to differentiate between different states (e.g., healthy vs. diseased); another is to quantitatively assess the biomarker in question, such as autoantibodies. Thus, ELISA technology is also used for the discovery and verification of new autoantibodies. Of key interest, however, is the development of immunoassays for the sensitive and specific detection of such biomarkers at early disease stages. Users therefore have to deal with many parameters, such as buffer systems or antigen-autoantibody interactions, to successfully establish an ELISA. Often, fine-tuning such as testing several blocking substances is performed to yield high signal-to-noise ratios.
Methods: We developed an ELISA to detect IgA and IgG autoantibodies against chitinase-3-like protein 1 (CHI3L1), a newly identified autoantigen in inflammatory bowel disease (IBD), in the serum of control and disease groups (n = 23 each). Microwell plates with different surface modifications (PolySorp and MaxiSorp coating) were tested to detect reproducibility problems.
Results: We found a significant impact of the surface properties of the microwell plates. IgA antibody reactivity was significantly lower when measured on MaxiSorp-coated plates (p < 0.0001), as it was in the range of background noise. IgG antibody reactivity did not differ between the plates, but the plate surface had a significant influence on the test result (p = 0.0005).
Conclusion: With this report, we want to draw readers' attention to the properties of solid phases and their effects on the detection of autoantibodies by ELISA.
We want to sensitize the reader to the fact that choosing the wrong plate can lead to a false-negative test result, which in turn has serious consequences for the discovery of autoantibodies.
Electroencephalographic (EEG) research indicates changes in the low-frequency bands of adults' frontoparietal brain areas during balance tasks with increasing postural demands. However, this issue is unresolved for adolescents performing the same balance task with increasing difficulty. Therefore, we examined the effects of progressively increasing balance task difficulty on balance performance and brain activity in adolescents. Thirteen healthy adolescents aged 16–17 years performed tests in bipedal upright stance on a balance board with six progressively increasing levels of task difficulty. Postural sway and cortical activity were recorded simultaneously using a pressure-sensitive measuring system and EEG. The power spectrum was analyzed for theta (4–7 Hz) and alpha-2 (10–12 Hz) frequency bands in pre-defined frontal, central, and parietal clusters of electrocortical sources. Repeated-measures analysis of variance (rmANOVA) showed a significant main effect of task difficulty on postural sway (p < 0.001; d = 6.36). Concomitantly, the power spectrum changed in frontal, bilateral central, and bilateral parietal clusters. RmANOVAs revealed significant main effects of task difficulty on theta band power in the frontal (p < 0.001, d = 1.80) and both central clusters (left: p < 0.001, d = 1.49; right: p < 0.001, d = 1.42) as well as on alpha-2 band power in both parietal clusters (left: p < 0.001, d = 1.39; right: p < 0.001, d = 1.05) and in the central right cluster (p = 0.005, d = 0.92). Increases in theta band power (frontal, central) and decreases in alpha-2 power (central, parietal) with increasing balance task difficulty may reflect increased attentional processes and/or error monitoring as well as increased sensory information processing due to increasing postural demands. In general, our findings are mostly in agreement with studies conducted in adults.
Similar to adult studies, our data from adolescents indicated the involvement of frontoparietal brain areas in the regulation of postural control. In addition, we found that the activity of selected brain areas (e.g., bilateral central) changed with increasing postural demands.
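The band-power measures described above (theta: 4–7 Hz, alpha-2: 10–12 Hz) can be sketched with a simple periodogram. This is an illustrative toy computation on a synthetic signal, not the study's EEG source-analysis pipeline:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power of `signal` within the band [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

fs = 250.0                        # hypothetical sampling rate in Hz
t = np.arange(0, 4, 1 / fs)       # 4 s of data
# Synthetic "EEG" with a dominant 6 Hz (theta) component plus noise:
sig = np.sin(2 * np.pi * 6 * t) + 0.2 * np.random.default_rng(1).normal(size=t.size)
theta = band_power(sig, fs, 4, 7)     # theta band: 4-7 Hz
alpha2 = band_power(sig, fs, 10, 12)  # alpha-2 band: 10-12 Hz
print(theta > alpha2)  # the 6 Hz component dominates the theta band
```

In practice, EEG band power is usually estimated with windowed, averaged periodograms (e.g., Welch's method) on source- or channel-level data; the raw periodogram above only illustrates the band-integration step.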
Basic psychological needs theory postulates that a social environment that satisfies individuals' three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. Conversely, the frustration of these needs is associated with ill-being and depressive symptoms, which has foremost been investigated in non-clinical samples; there is a paucity of research on need frustration in clinical samples. Survey data from adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) were compared with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs when compared to non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than for satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of self-determination theory and have preliminary implications for tailoring therapy for depression.
Genetic engineering has given humans the ability to transform organisms by direct manipulation of genomes within a broad range of applications, including agriculture (e.g., GM crops) and the pharmaceutical industry (e.g., insulin production). Developments within the last 10 years have produced new tools for genome editing (e.g., CRISPR/Cas9) that can achieve much greater precision than previous forms of genetic engineering. Moreover, these tools could offer the potential for interventions on humans, for both clinical and non-clinical purposes, resulting in a broad scope of applicability. However, their promising abilities and potential uses (including their applicability in humans for either somatic or heritable genome editing interventions) greatly increase their potential societal impacts and, as such, have brought an urgency to ethical and regulatory discussions about the application of such technology in our society. In this article, we explore different arguments (pragmatic, sociopolitical, and categorical) that have been made in support of or in opposition to the new technologies of genome editing, and their impact on the debate over the permissibility or otherwise of human heritable genome editing interventions in the future. For this purpose, reference is made to discussions on genetic engineering that have taken place in the field of bioethics since the 1980s. Our analysis shows that the dominance of categorical arguments has been reversed in favour of pragmatic arguments such as safety concerns. However, when it comes to involving the public in ethical discourse, we consider it crucial to widen the debate beyond such pragmatic considerations. In this article, we explore some of the key categorical as well as sociopolitical considerations raised by the potential uses of heritable genome editing interventions, as these considerations underline many of the societal concerns and values crucial for public engagement.
We also highlight that the increasing importance of pragmatic considerations in the work of recent authoritative sources is unlikely to be the result of progress on outstanding categorical issues; rather, it reflects the limited progress on these aspects and/or pressures to regulate the use of the technology.
Cardiac rehabilitation
(2021)
The investigation of protein structures, functions, and interactions often requires modifications to adapt protein properties to the specific application. Among the many possible methods to equip proteins with new chemical groups, the use of orthogonal aminoacyl-tRNA synthetase/tRNA pairs enables the site-specific incorporation of non-canonical amino acids at defined positions in the protein. The open nature of cell-free protein synthesis reactions provides an optimal environment, as the orthogonal components do not need to be transported across the cell membrane and the impact on cell viability is negligible. In the present work, it was shown that the expression of orthogonal aminoacyl-tRNA synthetases in CHO cells prior to cell disruption enhanced the modification of the pharmaceutically relevant adenosine A2a receptor. For this purpose, in addition to transient transfection of CHO cells, an approach based on CRISPR/Cas9 technology was selected to generate a translationally active cell lysate harboring endogenous orthogonal aminoacyl-tRNA synthetase.
Background: The relationship between exercise-induced intratendinous blood flow (IBF) and tendon pathology or training exposure is unclear.
Objective: This study investigates the acute effect of running exercise on sonographically detectable IBF in healthy and tendinopathic Achilles tendons (ATs) of runners and recreational participants.
Methods: 48 participants (43 ± 13 years, 176 ± 9 cm, 75 ± 11 kg) performed a standardized submaximal 30-min constant-load treadmill run with Doppler ultrasound "Advanced dynamic flow" examinations before (Upre) and 5, 30, 60, and 120 min (U5-U120) afterward. Included were runners (>30 km/week) and recreational participants (<10 km/week) with healthy (Hrun, n = 10; Hrec, n = 15) or tendinopathic (Trun, n = 13; Trec, n = 10) ATs. IBF was assessed by counting the number [n] of intratendinous vessels. IBF data are presented descriptively (%, median [minimum to maximum range] for baseline IBF and the IBF difference post-exercise). Statistical differences in IBF and IBF changes between groups and time points were analyzed with Friedman and Kruskal-Wallis ANOVAs (α = 0.05).
Results: At baseline, IBF was detected in 40% (3 [1–6]) of Hrun, in 53% (4 [1–5]) of Hrec, in 85% (3 [1–25]) of Trun, and in 70% (10 [2–30]) of Trec. At U5, IBF responded to exercise in 30% (3 [−1–9]) of Hrun, in 53% (4 [−2–6]) of Hrec, in 70% (4 [−10–10]) of Trun, and in 80% (5 [1–10]) of Trec. While IBF in 80% of healthy responding ATs returned to baseline at U30, IBF remained elevated until U120 in 60% of tendinopathic ATs. Within groups, IBF changes from Upre to U120 were significant for Hrec (p < 0.01), Trun (p = 0.05), and Trec (p < 0.01). Between groups, IBF changes in consecutive examinations were not significantly different (p > 0.05), but the IBF level was significantly higher at all measurement time points in tendinopathic versus healthy ATs (p < 0.05).
Conclusion: Irrespective of training status and tendon pathology, running leads to an immediate increase of IBF in responding tendons. This increase is short-lived in healthy ATs and prolonged in tendinopathic ATs. Training exposure does not alter IBF occurrence, but the IBF level is elevated in tendon pathology. While an immediate exercise-induced IBF increase is a physiological response, prolonged IBF is considered a pathological finding associated with Achilles tendinopathy.
Drug target miRNA
(2017)
This volume provides a concise and technical discussion of recently developed approaches to overcome challenges in miRNA drug discovery. Drug Target miRNA: Methods and Protocols explores strategies to overcome pharmacodynamic and pharmacokinetic challenges. These strategies cover anti-sense agents targeting miRNA that are applied in advanced formulations or chemically optimized to increase delivery; small-molecule miRNA modulators to overcome anti-sense agents' limitations; general enhancers of miRNA maturation; and the Argonaute 2 protein and its pharmacokinetic parameters. Written in the highly successful Methods in Molecular Biology series format, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and tips on troubleshooting and avoiding known pitfalls. Cutting-edge and thorough, Drug Target miRNA: Methods and Protocols is a valuable resource for anyone interested in the ever-evolving field of miRNA drug discovery.
This study investigated whether transcutaneous auricular vagus nerve stimulation (taVNS) enhances reversal learning and augments noradrenergic biomarkers (i.e., pupil size, cortisol, and salivary alpha-amylase [sAA]). We also explored the effect of taVNS on respiratory rate and cardiac vagal activity (CVA). Seventy-one participants received stimulation of either the cymba conchae (taVNS) or the earlobe (sham) of the left ear. After learning a series of cue-outcome associations, the stimulation was applied before and throughout a reversal phase in which cue-outcome associations were changed for some (reversal), but not for other (distractor), cues. Tonic pupil size, salivary cortisol, sAA, respiratory rate, and CVA were assessed at different time points. Contrary to our hypothesis, taVNS was not associated with an overall improvement in performance on the reversal task. Compared to sham, the taVNS group performed worse for distractor than for reversal cues. taVNS did not increase tonic pupil size or sAA. Only post hoc analyses indicated that the cortisol decline was steeper in the sham than in the taVNS group. Exploratory analyses showed that taVNS decreased respiratory rate but did not affect CVA. The weak and unexpected effects found in this study might relate to the lack of parameter optimization for taVNS and invite further investigation of the effects of taVNS on cortisol and respiratory rate.
Background
Generalized weakness and fatigue are underexplored symptoms in emergency medicine. Triage tools often underestimate the severity of illness in patients presenting to the emergency department (ED) with these nonspecific symptoms (Nemec et al., 2010). At the same time, physicians' disease severity rating (DSR) on a scale from 0 (not sick at all) to 10 (extremely sick) predicts key outcomes in ED patients (Beglinger et al., 2015; Rohacek et al., 2015). Our goals were (1) to characterize ED patients with weakness and/or fatigue (W|F); (2) to explore to what extent physicians' DSR at triage can predict five key outcomes in these patients; (3) to examine how well DSR performs relative to two commonly used benchmarks, the Emergency Severity Index (ESI) and the Charlson Comorbidity Index (CCI); (4) to determine to what extent DSR provides predictive information beyond ESI, CCI, or their linear combination, i.e., whether ESI and CCI should be used alone or in combination with DSR; and (5) to determine to what extent ESI, CCI, or their linear combination provide predictive information beyond DSR alone, i.e., whether DSR should be used alone or in combination with ESI and/or CCI.
Methods
Prospective observational study conducted at a single center between 2013 and 2015 (analysis in 2018-2020, study team blinded to hypothesis). We studied an all-comer cohort of 3,960 patients (48% female, median age = 51 years, 94% completed 1-year follow-up). We examined two primary outcomes (acute morbidity (Bingisser et al., 2017; Weigel et al., 2017) and all-cause 1-year mortality) and three secondary outcomes (in-hospital mortality, hospitalization, and transfer to ICU). We assessed the predictive power (i.e., resolution, measured as the area under the ROC curve, AUC) of the scores and, using logistic regression, of their linear combinations.
Findings
Compared to patients without W|F (n = 3,227), patients with W|F (n = 733) showed a higher prevalence of all five outcomes, reported more symptoms across both genders, and received higher DSRs (median = 4, interquartile range (IQR) = 3-6, vs. median = 3, IQR = 2-5). DSR predicted all five outcomes well above chance (i.e., AUCs ≳ 0.70), similarly well for patients with and without W|F, and as well as or better than ESI and CCI in patients with and without W|F (except for 1-year mortality, where CCI performed better). For acute morbidity, hospitalization, and transfer to ICU, there is clear evidence that adding DSR to ESI and/or CCI improves predictions for both patient groups; for 1-year mortality and in-hospital mortality this holds for most, but not all, comparisons. Adding ESI and/or CCI to DSR generally did not improve performance and sometimes even decreased it.
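The comparisons above rest on the area under the ROC curve (AUC) as the measure of resolution, with linear combinations of scores fitted by logistic regression. As a hedged illustration of the underlying computation (the toy data and fixed weights below are invented for illustration and are not the study's data or fitted coefficients), a rank-based AUC can be sketched as:

```python
# Illustrative sketch (toy data, not the study's dataset or code):
# a rank-based AUC, the "resolution" measure used to compare DSR, ESI,
# and CCI, plus a linear combination of scores with hypothetical weights
# standing in for fitted logistic-regression coefficients.

def auc(scores, labels):
    """Probability that a random positive case scores higher than a
    random negative case; ties count as 0.5 (equivalent to the ROC AUC)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy triage data: disease severity rating (DSR, 0-10), Charlson
# Comorbidity Index (CCI), and a binary outcome (e.g., 1-year mortality)
dsr = [2, 6, 8, 5, 9, 1, 4, 4]
cci = [0, 1, 3, 2, 4, 0, 1, 2]
died = [0, 0, 1, 0, 1, 0, 1, 0]

auc_dsr = auc(dsr, died)                      # DSR alone
combined = [0.6 * d + 0.4 * c for d, c in zip(dsr, cci)]
auc_comb = auc(combined, died)                # hypothetical combination
```

Comparing `auc_dsr` with `auc_comb` mirrors the study's question of whether a linear combination of scores adds resolution beyond a single score.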
Conclusions
The use of physicians' disease severity rating has never been investigated in patients with generalized weakness and fatigue. We show that physicians' prediction of acute morbidity, mortality, hospitalization, and transfer to ICU through their DSR is also accurate in these patients. However, across all patients, DSR is less predictive of acute morbidity for female than for male patients. Future research should investigate how emergency physicians judge their patients' clinical state at triage and how this judgment can be improved and used in simple decision aids.
Macrophages in pathologically expanded dysfunctional white adipose tissue are exposed to a mix of potential modulators of inflammatory response, including fatty acids released from insulin-resistant adipocytes, increased levels of insulin produced to compensate insulin resistance, and prostaglandin E-2 (PGE(2)) released from activated macrophages. The current study addressed the question of how palmitate might interact with insulin or PGE(2) to induce the formation of the chemotactic pro-inflammatory cytokine interleukin-8 (IL-8). Human THP-1 cells were differentiated into macrophages. In these macrophages, palmitate induced IL-8 formation. Insulin enhanced the induction of IL-8 formation by palmitate as well as the palmitate-dependent stimulation of PGE(2) synthesis. PGE(2) in turn elicited IL-8 formation on its own and enhanced the induction of IL-8 release by palmitate, most likely by activating the EP4 receptor. Since IL-8 causes insulin resistance and fosters inflammation, the increase in palmitate-induced IL-8 formation that is caused by hyperinsulinemia and locally produced PGE(2) in chronically inflamed adipose tissue might favor disease progression in a vicious feed-forward cycle.
Emotional memories are better remembered than neutral ones, but the mechanisms leading to this memory bias are not well understood in humans yet. Based on animal research, it is suggested that the memory-enhancing effect of emotion is based on central noradrenergic release, which is triggered by afferent vagal nerve activation. To test the causal link between vagus nerve activation and emotional memory in humans, we applied continuous noninvasive transcutaneous auricular vagus nerve stimulation (taVNS) during exposure to emotionally arousing and neutral scenes and tested subsequent, long-term recognition memory after 1 week. We found that taVNS, compared with sham, increased recollection-based memory performance for emotional, but not neutral, material. These findings were complemented by larger recollection-related brain potentials (parietal ERP Old/New effect) during retrieval of emotional scenes encoded under taVNS, compared with sham. Furthermore, brain potentials recorded during encoding also revealed that taVNS facilitated early attentional discrimination between emotional and neutral scenes. Extending animal research, our behavioral and neural findings confirm a modulatory influence of the vagus nerve on emotional memory formation in humans.
Extracellular vesicles
(2021)
Osteoporosis is characterized by low bone mass and damage to the bone tissue’s microarchitecture, leading to increased fracture risk. Several studies have provided evidence for associations between psychosocial stress and osteoporosis through various pathways, including the hypothalamic-pituitary-adrenocortical axis, the sympathetic nervous system, and other endocrine factors. As psychosocial stress provokes oxidative cellular stress with consequences for mitochondrial function and cell signaling (e.g., gene expression, inflammation), it is of interest whether extracellular vesicles (EVs) may be a relevant biomarker in this context or act by transporting substances. EVs are intercellular communicators, transfer substances encapsulated in them, modify the phenotype and function of target cells, mediate cell-cell communication, and, therefore, have critical applications in disease progression and clinical diagnosis and therapy. This review summarizes the characteristics of EVs, their role in stress and osteoporosis, and their benefit as biological markers. We demonstrate that EVs are potential mediators of psychosocial stress and osteoporosis and may be beneficial in innovative research settings.
Background: The characteristics of osteoporosis are decreased bone mass and destruction of the microarchitecture of bone tissue, which raise the risk of fracture. Psychosocial stress and osteoporosis are linked via the sympathetic nervous system, the hypothalamic-pituitary-adrenal axis, and other endocrine factors. Psychosocial stress causes a series of effects on the organism, and this long-term depletion at the cellular level is considered mitochondrial allostatic load, including mitochondrial dysfunction and oxidative stress. Extracellular vesicles (EVs) are involved in the mitochondrial allostatic load process and may serve as biomarkers in this setting. As critical participants in cell-to-cell communication, EVs serve as transport vehicles for nucleic acids and proteins, alter the phenotypic and functional characteristics of their target cells, and promote cell-to-cell contact. Hence, they play a significant role in the diagnosis and therapy of many diseases, such as osteoporosis.
Aim: This narrative review attempts to outline the features of EVs, investigate their involvement in both psychosocial stress and osteoporosis, and analyze whether EVs can act as mediators between the two.
Methods: The online databases PubMed, Google Scholar, and ScienceDirect were searched for keywords related to the main topic of this study, and the availability of all selected studies was verified. Afterward, the findings from the articles were summarized and synthesized.
Results: Psychosocial stress affects bone remodeling through increased stress hormones such as glucocorticoids and catecholamines, as well as increased glucose metabolism. Furthermore, psychosocial stress leads to mitochondrial allostatic load, including oxidative stress, which may affect bone remodeling. In vitro and in vivo data suggest that EVs might be involved in the link between psychosocial stress and bone remodeling through the transfer of bioactive substances and might thus be a mediator of psychosocial stress leading to osteoporosis.
Conclusions: According to the included studies, psychosocial stress affects bone remodeling, contributing to osteoporosis. By summarizing the specific properties of EVs and their functions in psychosocial stress and osteoporosis, respectively, we demonstrate that EVs are possible mediators of both and hold promise for innovative research areas.
Extracellular vesicles: potential mediators of psychosocial stress contribution to osteoporosis?
(2021)
OBJECTIVE: For an effective control of the SARS-CoV-2 pandemic with vaccines, most people in a population need to be vaccinated. It is thus important to know how to inform the public with reference to individual preferences, while also acknowledging the societal preference to encourage vaccinations. According to the health care standard of informed decision-making, a comparison of the benefits and harms of (not) having the vaccination would be required to inform undecided and skeptical people. To test evidence-based fact boxes, an established risk communication format, and to inform their development, we investigated their contribution to knowledge and evaluations of COVID-19 vaccines.
METHODS: We conducted four studies (1, 2, and 4 were population-wide surveys with N = 1,942 to N = 6,056): Study 1 assessed the relationship between vaccination knowledge and intentions in Germany over three months. Study 2 assessed respective information gaps and needs of the population in Germany. In parallel, an experiment (Study 3) with a mixed design (presentation formats; pre-post-comparison) assessed the effect of fact boxes on risk perceptions and fear, using a convenience sample (N = 719). Study 4 examined how effective two fact box formats are for informing vaccination intentions, with a mixed experimental design: between-subjects (presentation formats) and within-subjects (pre-post-comparison).
RESULTS: Study 1 showed that vaccination knowledge and vaccination intentions increased between November 2020 and February 2021. Study 2 revealed objective information requirements and subjective information needs. Study 3 showed that the fact box format is effective in adjusting risk perceptions concerning COVID-19. Based on those results, fact boxes were revised and implemented with the help of a national health authority in Germany. Study 4 showed that simple fact boxes increase vaccination knowledge and positive evaluations in skeptics and undecideds.
CONCLUSION: Fact boxes can inform COVID-19 vaccination intentions of undecided and skeptical people without threatening societal vaccination goals of the population.
In recent years, digital technologies have become a major means of providing health-related services, and this trend was strongly reinforced by the Coronavirus disease 2019 (COVID-19) pandemic. As regular physical activity is well known to have positive effects on physical and mental health and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. In the course of this development, however, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, these terms are often used in several different ways, but also relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
Boredom has been identified as one of the greatest psychological challenges when staying at home during quarantine and isolation. However, this does not mean that the situation necessarily causes boredom. On the basis of 13 explorative interviews with bored and non-bored persons who have been under quarantine or in isolation, we explain why boredom is related to a subjective interpretation process rather than being a direct consequence of the objective situation. Specifically, we show that participants vary significantly in their interpretations of staying at home and, thus, also in their experience of boredom. While the non-bored participants interpret the situation as a relief or as irrelevant, the bored participants interpret it as a major restriction that only some are able to cope with.
Objective
To improve consumer decision making, the results of risk assessments on food, feed, consumer products or chemicals need to be communicated not only to experts but also to non-expert audiences. The present study draws on evidence from literature reviews and focus groups with diverse stakeholders to identify content to integrate into an existing risk assessment communication (Risk Profile).
Methods
A combination of rapid literature reviews and focus groups with experts (risk assessors (n = 15), risk managers (n = 8)), and non-experts (general public (n = 18)) were used to identify content and strategies for including information about risk assessment results in the “Risk Profile” from the German Federal Institute for Risk Assessment. Feedback from initial focus groups was used to develop communication prototypes that informed subsequent feedback rounds in an iterative process. A final prototype was validated in usability tests with experts.
Results
Focus group feedback and suggestions from risk assessors were largely in line with findings from the literature. Risk managers and lay persons offered similar suggestions on how to improve the existing communication of risk assessment results (e.g., including more explanatory detail, reporting probabilities for individual health impairments, and specifying risks for subgroups in additional sections). Risk managers found information about quality of evidence important to communicate, whereas people from the general public found this information less relevant. Participants from lower educational backgrounds had difficulties understanding the purpose of risk assessments. User tests found that the final prototype was appropriate and feasible to implement by risk assessors.
Conclusion
An iterative and evidence-based process was used to develop content to improve the communication of risk assessments to the general public while being feasible to use by risk assessors. Remaining challenges include how to communicate dose-response relationships and standardise quality of evidence ratings across disciplines.
The study investigated the incidence of Achilles and patellar tendinopathy in adolescent elite athletes and non-athletic controls. Furthermore, predictive and associated factors for tendinopathy development were analyzed. The prospective study consisted of two measurement days (M1/M2) with an interval of 3.2 +/- 0.9 years. 157 athletes (12.1 +/- 0.7 years) and 25 controls (13.3 +/- 0.6 years) without Achilles/patellar tendinopathy were included at M1. Clinical and ultrasound examinations of both Achilles (AT) and patellar tendons (PT) were performed. Main outcome measures were the incidence of tendinopathy and of structural intratendinous alterations (hypo-/hyperechogenicity, vascularization) at M2 [%]. The incidence of Achilles tendinopathy was 1% in athletes and 0% in controls. Patellar tendinopathy was more frequent in athletes (13%) than in controls (4%). The incidence of intratendinous alterations in ATs was 1-2% in athletes and 0% in controls, whereas in PTs it was 4-6% in both groups (p > 0.05). Intratendinous alterations at M2 were associated with patellar tendinopathy in athletes (p <= 0.01). Intratendinous alterations at M1, anthropometric data, training amount, sports discipline, and sex did not predict tendinopathy development (p > 0.05). The incidence of tendinopathy and intratendinous alterations in adolescent athletes is low in ATs and more common in PTs. Development of intratendinous alterations in the PT is associated with tendinopathy. However, predictive factors could not be identified.
Dementia, as one of the most prevalent diseases, urges a better understanding of the central mechanisms responsible for clinical symptoms and necessitates improvement of current diagnostic capabilities. The brainstem nucleus locus coeruleus (LC) is a promising target for early diagnosis because of its early structural alterations and its relationship to the functional disturbances in patients. In this study, we applied our improved method of localisation-based LC resting-state fMRI to investigate differences in central sensory signal processing by comparing the functional connectivity (fc) of a patient group with mild cognitive impairment (MCI, n = 28) and an age-matched healthy control group (n = 29). MCI and control participants could be differentiated by their Mini-Mental State Examination (MMSE) scores (p < .001) and LC intensity ratio (p = .010). In the fMRI, LC fc to the anterior cingulate cortex (FDR p < .001) and left anterior insula (FDR p = .012) was elevated, and LC fc to the right temporoparietal junction (rTPJ, FDR p = .012) and posterior cingulate cortex (PCC, FDR p = .021) was decreased in the patient group. Importantly, LC-to-rTPJ connectivity was also positively correlated with MMSE scores in MCI patients (p = .017). Furthermore, we found a hyperactivation of the left-insula salience network in the MCI patients. Our results and our proposed disease model shed new light on the functional pathogenesis of MCI by pointing to attentional network disturbances, which could aid new therapeutic strategies and provide a marker for diagnosis and prediction of disease progression.
Following stroke, neuronal death takes place both in the infarct region and in brain areas distal to the lesion site, including the hippocampus. The hippocampus is critically involved in learning and memory processes and continuously generates new neurons. Dysregulation of adult neurogenesis may be associated with cognitive decline after a stroke lesion. In particular, proliferation of precursor cells and the formation of new neurons are increased after lesion. Within the first week, many new precursor cells die during development. How dying precursors are removed from the hippocampus and to what extent phagocytosis takes place after stroke is still not clear. Here, we evaluated the effect of a prefrontal stroke lesion on the phagocytic activity of microglia in the dentate gyrus (DG) of the hippocampus. Three-month-old C57BL/6J mice were injected once with the proliferation marker BrdU (250 mg/kg) 6 hr after a middle cerebral artery occlusion or sham surgery. The number of apoptotic cells and the phagocytic capacity of the microglia were evaluated by means of immunohistochemistry, confocal microscopy, and 3D reconstructions. We found a transient but significant increase in the number of apoptotic cells in the DG early after stroke, associated with impaired removal by microglia. Interestingly, phagocytosis of newly generated precursor cells was not affected. Our study shows that a prefrontal stroke lesion affects phagocytosis of apoptotic cells in the DG, a region distal to the lesion core. Whether disturbed phagocytosis might contribute to inflammatory and maladaptive processes, including cognitive impairment, following stroke needs to be further investigated.
Nonparametric goodness-of-fit testing for parametric covariate models in pharmacometric analyses
(2021)
The characterization of covariate effects on model parameters is a crucial step in pharmacokinetic/pharmacodynamic analyses. Although covariate selection criteria have been studied extensively, the choice of the functional relationship between covariates and parameters has received much less attention. Often, a particular simple class of covariate-to-parameter relationships (linear, exponential, etc.) is chosen ad hoc or based on domain knowledge, and statistical evaluation is limited to the comparison of a small number of such classes. Goodness-of-fit testing against a nonparametric alternative provides a more rigorous approach to covariate model evaluation, but no such test has been proposed so far. In this manuscript, we derive and evaluate nonparametric goodness-of-fit tests for parametric covariate models (the null hypothesis) against a kernelized Tikhonov-regularized alternative, transferring concepts from statistical learning to the pharmacological setting. The approach is evaluated in a simulation study on the estimation of the age-dependent maturation effect on the clearance of a monoclonal antibody, considering scenarios of varying data sparsity and residual error. The goodness-of-fit test correctly identified misspecified parametric models with high power in the relevant scenarios. The case study provides proof of concept of the feasibility of the proposed approach, which is envisioned to be beneficial for applications that lack well-founded covariate models.
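As an illustration of the general idea (not the authors' implementation), the following minimal numpy sketch tests a parametric covariate model against a kernelized Tikhonov-regularized alternative: the parametric model is fitted by least squares, a kernel-ridge fit to its residuals serves as the test statistic, and a permutation distribution calibrates the p-value. The data-generating maturation curve, kernel length scale, and regularization strength are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a saturating maturation effect of age on clearance.
age = rng.uniform(0.1, 18.0, 80)
cl = 5.0 * age / (age + 2.0) + rng.normal(0, 0.3, 80)

def rbf_kernel(x, y, length=2.0):
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / (2 * length**2))

def kernel_stat(x, resid, lam=1.0):
    """Quadratic form of a Tikhonov (kernel-ridge) fit to the residuals:
    large when systematic structure remains after the parametric fit."""
    K = rbf_kernel(x, x)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), resid)
    return float((K @ alpha) @ resid)

def gof_pvalue(x, y, design, n_perm=500, seed=1):
    """Permutation goodness-of-fit test: H0 = the parametric model given
    by the design matrix; alternative = kernelized regularized fit."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    t_obs = kernel_stat(x, resid)
    r = np.random.default_rng(seed)
    t_perm = [kernel_stat(x, r.permutation(resid)) for _ in range(n_perm)]
    return float(np.mean([t >= t_obs for t in t_perm]))

# Misspecified linear covariate model vs. the correct saturating one.
X_lin = np.column_stack([np.ones_like(age), age])
p_lin = gof_pvalue(age, cl, X_lin)
X_sat = np.column_stack([np.ones_like(age), age / (age + 2.0)])
p_sat = gof_pvalue(age, cl, X_sat)
print(p_lin, p_sat)
```

With this setup the misspecified linear model yields a small p-value, whereas the correctly specified saturating model typically shows no systematic lack of fit.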
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression than HIV-negative individuals. This cross-sectional study (n = 450, median age 44 years (range 19-75), 7.3% female) evaluates the prevalence rates and prevalence ratios (PR) of anxiety and/or depression in PLWH in relation to recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 0.36-0.91; p = 0.01), depression (PR = 0.41; 0.36-0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 0.24-0.75; p = 0.002) was found in exercising compared with non-exercising PLWH. Recreational exercise is associated with a lower risk of anxiety and/or depression. Further prospective studies are needed to provide insight into the direction of this association.
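For readers unfamiliar with prevalence ratios, this small sketch computes a PR and a Wald log-scale confidence interval from a 2x2 table. The counts are made up (not the study data) and merely chosen so the PR comes out near the 0.57 reported for anxiety.

```python
import numpy as np

# Hypothetical 2x2 counts: exposure = recreational exercise,
# outcome = anxiety. These are NOT the study's numbers.
exercising = {"cases": 30, "total": 250}
non_exercising = {"cases": 42, "total": 200}

def prevalence_ratio(exposed, unexposed, z=1.96):
    """Prevalence ratio with a Wald confidence interval on the log scale."""
    p1 = exposed["cases"] / exposed["total"]        # prevalence, exposed
    p0 = unexposed["cases"] / unexposed["total"]    # prevalence, unexposed
    pr = p1 / p0
    # Standard error of log(PR) for independent binomial samples.
    se_log = np.sqrt((1 - p1) / exposed["cases"] + (1 - p0) / unexposed["cases"])
    lo = np.exp(np.log(pr) - z * se_log)
    hi = np.exp(np.log(pr) + z * se_log)
    return pr, lo, hi

pr, lo, hi = prevalence_ratio(exercising, non_exercising)
print(round(pr, 2), round(lo, 2), round(hi, 2))
```

A PR below 1 with an upper confidence limit below 1, as here, indicates a lower prevalence of the outcome among the exposed group.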
Background and Study Aims
Recurrent laryngeal nerve palsy (RLNP) is a potential complication of anterior cervical discectomy and fusion (ACDF). There is still substantial disagreement about the actual prevalence of RLNP after ACDF as well as about risk factors for postoperative RLNP. The aim of this study was to describe the prevalence of postoperative RLNP in a cohort of consecutive ACDF cases and to examine potential risk factors.
Materials and Methods
This retrospective study included patients who underwent ACDF between 2005 and 2019 at a single neurosurgical center. As part of the clinical routine, RLNP was assessed before and after surgery by independent otorhinolaryngologists using endoscopic laryngoscopy. As potential risk factors for postoperative RLNP, we examined patient age, sex, body mass index, multilevel surgery, and duration of surgery.
Results
A total of 214 consecutive cases were included. The prevalence of preoperative RLNP was 1.4% (3/214), and the prevalence of postoperative RLNP was 9% (19/211). The number of operated levels was 1 in 73.5% (155/211), 2 in 24.2% (51/211), and 3 or more in 2.4% (5/211) of cases. Of all cases, 4.7% (10/211) were repeat surgeries. There was no difference in the prevalence of RLNP between the primary surgery group (9.0%, 18/183) and the repeat surgery group (10.0%, 1/10; p = 0.91). Likewise, there was no difference in any of the examined characteristics between subjects with and without postoperative RLNP. We found no association between postoperative RLNP and patient age, sex, body mass index, duration of surgery, or number of levels (odds ratios between 0.24 and 1.05; p values between 0.20 and 0.97).
Conclusions
In our cohort, the prevalence of postoperative RLNP after ACDF was 9.0%. The fact that none of the examined variables was associated with the occurrence of RLNP supports the view that postoperative RLNP may depend more on direct mechanical manipulation during surgery than on specific patient or surgical characteristics.
The reliability of quantifying intratendinous vascularization by high-sensitivity Doppler ultrasound advanced dynamic flow has not yet been examined. Therefore, this study aimed to investigate the intraobserver and interobserver reliability of evaluating Achilles tendon vascularization by advanced dynamic flow using established scoring systems. Methods: Three investigators evaluated vascularization in 67 recordings in a test-retest design, applying the Ohberg score, a modified Ohberg score, and a counting score. Intraobserver and interobserver agreement for the Ohberg score and modified Ohberg score was analyzed by the Cohen kappa and Fleiss kappa coefficients (absolute) and by the Kendall tau b coefficient and Kendall coefficient of concordance (W; relative). The reliability of the counting score was analyzed by intraclass correlation coefficients (ICC) 2.1 and 3.1, the standard error of measurement (SEM), and Bland-Altman analysis (bias and limits of agreement [LoA]). Results: Intraobserver and interobserver agreement (absolute/relative) ranged from 0.61 to 0.87/0.87 to 0.95 and from 0.11 to 0.66/0.76 to 0.89 for the Ohberg score, and from 0.81 to 0.87/0.92 to 0.95 and from 0.64 to 0.80/0.88 to 0.93 for the modified Ohberg score, respectively. The counting score revealed an intraobserver ICC of 0.94 to 0.97 (SEM, 1.0-1.5; bias, -1; LoA, 3-4 vessels). The interobserver ICC for the counting score ranged from 0.91 to 0.98 (SEM, 1.0-1.9; bias, 0; LoA, 3-5 vessels). Conclusions: The modified Ohberg score and the counting score showed excellent reliability and seem convenient for research and clinical practice. The Ohberg score revealed decent intraobserver but unexpectedly low interobserver reliability and therefore cannot be recommended.
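The agreement statistics used above can be reproduced in a few lines. This numpy sketch computes Cohen's kappa and ICC(2,1) (two-way random effects, absolute agreement, single rater, in the Shrout and Fleiss scheme) on invented ratings, not the study's recordings.

```python
import numpy as np

# Hypothetical test-retest ratings: one rater scoring the same recordings
# twice on a 0-4 ordinal scale (NOT the study data).
r1 = np.array([0, 1, 2, 2, 3, 1, 0, 4, 2, 1, 3, 2])
r2 = np.array([0, 1, 2, 3, 3, 1, 0, 4, 2, 2, 3, 2])

def cohen_kappa(a, b):
    """Chance-corrected absolute agreement between two categorical ratings."""
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                        # observed
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # by chance
    return (po - pe) / (1 - pe)

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    data: subjects x raters matrix."""
    n, k = data.shape
    grand = data.mean()
    ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # rows
    ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # columns
    ss_e = ((data - data.mean(axis=1, keepdims=True)
             - data.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

print(round(cohen_kappa(r1, r2), 2))
print(round(icc_2_1(np.column_stack([r1, r2]).astype(float)), 2))
```

Kappa treats the scale as purely categorical, which is why a weighted or ordinal statistic (as the relative Kendall measures in the study) can differ markedly from it on the same data.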
The PNPLA3 reference single-nucleotide polymorphism rs738409 has been identified as a predisposing factor for nonalcoholic fatty liver disease. A simple method based on PCR and restriction fragment length polymorphism (RFLP) analysis had been published to detect the PNPLA3 rs738409 variant, in which BtsCI recognizes only the nonpathogenic allele. The presence of the pathogenic variant was therefore deduced from the indigestibility of the corresponding PCR product; however, one cannot exclude that an enzymatic reaction fails for other, more trivial, reasons. For safe and secure detection of the pathogenic PNPLA3 rs738409 variant, we have extended the PCR-RFLP method by adding a second restriction enzyme digest, clearly identifying the correct PNPLA3 alleles and in particular the pathogenic variant. Method summary: The method presented here represents an improved genetic diagnosis of the PNPLA3 rs738409 alleles based on conventional and inexpensive molecular biology methods. Using PCR and RFLP analysis, both described alleles are clearly identified by the use of two restriction enzymes: digestion of an individual's specific PNPLA3 PCR fragment with both enzymes in independent reactions unambiguously shows the PNPLA3 rs738409 genotype.
In this report, we investigate small proteins involved in bacterial alternative respiratory systems that improve enzymatic efficiency through better anchorage and multimerization of membrane components. Using the small protein TorE of the respiratory TMAO reductase system as a model, we discovered that TorE belongs to a subfamily of small proteins present in proteobacteria, in which they play a similar role for bacterial respiratory systems. We reveal by microscopy that, in Shewanella oneidensis MR-1, alternative respiratory systems are evenly distributed in the membrane, contrary to what has been described for Escherichia coli. Thus, the better efficiency of the respiratory systems observed in the presence of the small proteins is not due to a specific localization in the membrane, but rather to the formation of membranous complexes formed by TorE homologs with their c-type cytochrome partner protein. By an in vivo approach combining Clear Native electrophoresis and fluorescent translational fusions, we determined the 4:4 stoichiometry of the complexes. In addition, mild solubilization of the cytochrome indicates that the presence of the small protein reinforces its anchoring to the membrane. Therefore, assembly of the complex induced by this small protein improves the efficiency of the respiratory system.
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the stress-related pattern set most predictive of a CLBP diagnosis within 1 year.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort-reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator, LASSO) were applied. Prediction accuracy was calculated by the root mean squared error (RMSE) and receiver operating characteristic (ROC) curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years, 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood and with their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role in upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric variables (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarkers (CPI: RMSE = 12.21; DISS: RMSE = 8.94) could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression, as well as the emergence of a “hypocortisolemic symptom triad,” in which social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that makes it possible to identify individuals at higher risk for upcoming pain disorders and that can be used in practice.
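As a sketch of the variable-selection step described above, the following numpy-only coordinate-descent LASSO (run on simulated predictors and outcome, not the study's psychometric or biomarker data) shrinks irrelevant coefficients to exactly zero and reports the in-sample RMSE.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated data: 12 standardized candidate predictors, of which only
# three actually drive the outcome (invented, not the study variables).
n, p = 110, 12
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[[0, 3, 7]] = [1.5, -1.0, 0.8]
y = X @ beta_true + rng.normal(0, 1.0, n)

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove all effects except predictor j.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-threshold: coefficients with weak support go to zero.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return beta

beta_hat = lasso_cd(X, y, lam=40.0)
selected = np.flatnonzero(beta_hat != 0)
rmse = np.sqrt(np.mean((y - X @ beta_hat) ** 2))
print(selected, round(rmse, 2))
```

The penalty `lam` trades sparsity against fit; in practice it would be chosen by cross-validation rather than fixed as here.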
Myasthenia gravis is an autoimmune disease affecting neuromuscular transmission and causing skeletal muscle weakness. Additionally, systemic inflammation, cognitive deficits and autonomic dysfunction have been described.
However, little is known about myasthenia gravis-related reorganization of the brain. In this study, we thus investigated the structural and functional brain changes in myasthenia gravis patients.
Eleven myasthenia gravis patients (age: 70.64 +/- 9.27 years; 11 males) were compared to age-, sex- and education-matched healthy controls (age: 70.18 +/- 8.98 years; 11 males). Most of the patients (n = 10, 91%) received cholinesterase inhibitors.
Structural brain changes were determined by applying voxel-based morphometry using high-resolution T-1-weighted sequences. Functional brain changes were assessed with a neuropsychological test battery (including attention, memory and executive functions), a spatial orientation task and brain-derived neurotrophic factor blood levels.
Myasthenia gravis patients showed significant grey matter volume reductions in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus. Furthermore, myasthenia gravis patients showed significantly lower performance in executive functions, working memory (Spatial Span, P = 0.034, d = 1.466), verbal episodic memory (P = 0.003, d = 1.468) and somatosensory-related spatial orientation (Triangle Completion Test, P = 0.003, d = 1.200).
Additionally, serum brain-derived neurotrophic factor levels were significantly higher in myasthenia gravis patients (P = 0.001, d = 2.040). Our results indicate that myasthenia gravis is associated with structural and functional brain alterations. In particular, the grey matter volume changes in the cingulate gyrus and the inferior parietal lobe could be associated with the cognitive deficits in memory and executive functions.
Furthermore, deficits in somatosensory-related spatial orientation could be associated with the lower volumes in the inferior parietal lobe. Future research is needed to replicate these findings independently in a larger sample and to investigate the underlying mechanisms in more detail.
Klaus et al. compared myasthenia gravis patients to matched healthy control subjects and identified functional alterations in memory functions as well as structural alterations in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus.
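The effect sizes reported above are Cohen's d values for two independent groups. A minimal sketch with invented BDNF-like values (not the study data) shows the pooled-standard-deviation computation behind them.

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d for two independent groups, using the pooled SD."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1)
                      + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

# Hypothetical serum BDNF values (ng/mL) for two groups of 11, echoing
# the patient-vs-control comparison; these are NOT the study's data.
patients = np.array([32.1, 29.4, 35.0, 31.2, 33.8, 30.5,
                     34.2, 28.9, 32.7, 31.9, 33.1])
controls = np.array([24.0, 26.1, 23.5, 25.2, 24.8, 22.9,
                     25.7, 24.3, 23.9, 25.0, 24.5])
print(round(cohens_d(patients, controls), 2))
```

Values around 0.8 are conventionally called large, so a d above 2, as reported for BDNF, indicates almost non-overlapping group distributions.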
Stunting
(2021)
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms, and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear. Prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability, and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, a sample from one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression (Hospital Anxiety and Depression Scale), and pain-related cognitions (Fear-Avoidance Beliefs Questionnaire). Repeated-measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the subsample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes in the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but not on the other variables of interest. This may be partially explained by participants' low comorbidity at baseline. The results are relevant with regard to low-cost, home-based alternatives for back pain patients and to prevention. Furthermore, this study addresses the lack of long-term effect analyses in this field.