Genetic engineering has given humans the ability to transform organisms by direct manipulation of genomes within a broad range of applications, including agriculture (e.g., GM crops) and the pharmaceutical industry (e.g., insulin production). Developments within the last 10 years have produced new tools for genome editing (e.g., CRISPR/Cas9) that can achieve much greater precision than previous forms of genetic engineering. Moreover, these tools could enable interventions in humans for both clinical and non-clinical purposes, resulting in a broad scope of applicability. However, their promising abilities and potential uses (including their applicability in humans for either somatic or heritable genome editing interventions) greatly increase their potential societal impacts and, as such, have brought urgency to ethical and regulatory discussions about the application of such technology in our society. In this article, we explore different arguments (pragmatic, sociopolitical and categorical) that have been made in support of or in opposition to the new technologies of genome editing and their impact on the debate over the permissibility or otherwise of human heritable genome editing interventions in the future. For this purpose, reference is made to discussions on genetic engineering that have taken place in the field of bioethics since the 1980s. Our analysis shows that the dominance of categorical arguments has been reversed in favour of pragmatic arguments such as safety concerns. However, when it comes to involving the public in ethical discourse, we consider it crucial to widen the debate beyond such pragmatic considerations. In this article, we explore some of the key categorical as well as sociopolitical considerations raised by the potential uses of heritable genome editing interventions, as these considerations underlie many of the societal concerns and values crucial for public engagement.
We also highlight how pragmatic considerations, despite their increasing importance in the work of recent authoritative sources, are unlikely to be the result of progress on outstanding categorical issues, but rather reflect the limited progress on these aspects and/or pressures in regulating the use of the technology.
Analysis of physicians' probability estimates of a medical outcome based on a sequence of events
(2022)
A basic law of probability is that the probability of a conjunction of 2 independent events is the product of both components and cannot exceed the likelihood of either component. When this basic law is violated, it is known as the conjunction fallacy. In clinical practice, the conjunction fallacy may arise when physicians estimate the probability of an overall outcome that requires ≥2 steps to be successful. For example, if a successful procedure requires the success of step A and step B, then the probability of overall success of the procedure cannot exceed the likelihood of success of either step A or step B. The aim of this study was to determine whether physicians could correctly estimate the overall probability of success from 2 independent events.

This was a 3-part, Internet-based survey study designed to evaluate the presence of the conjunction fallacy in 2 separate obstetric contexts and 1 pulmonary context. Respondents were board-certified or board-eligible physicians in obstetrics and gynecology and pulmonary medicine, recruited from a commercial survey service. In each context, physicians were presented with scenarios related to their medical specialty and asked to judge the probability of the overall outcome, or conjunction, and of the 2 individual events, or conjuncts.

The first substudy, conducted April 2-4, 2021, described a delivery in brow presentation discovered during labor. To assess the overall probability of a successful spontaneous vaginal delivery, an obstetrician must consider the likelihood of the brow presentation converting to a deliverable position and the likelihood of vaginal delivery from the converted position. The second substudy, conducted November 2-11, 2021, described the diagnostic evaluation of a pulmonary nodule discovered incidentally. To assess the overall probability that a biopsy reveals cancer, the physician must consider the likelihood that the nodule is cancerous and the likelihood that the biopsy successfully and accurately detects cancer. The third substudy, conducted May 13-19, 2021, modified the first substudy and asked responding obstetricians to consider the likelihood of the individual conjuncts before estimating the overall probability of successful vaginal delivery.

The survey included responses from 215 physicians: 66% were male and 34% were female, with a mean (SD) age of 53.6 (9.5) years and a mean time since obtaining a medical degree of 27.5 (10.6) years. Overall, 78.1% of physicians committed the conjunction fallacy, estimating that the overall probability of success was greater than the likelihood of at least 1 of the 2 conjuncts. In the first substudy, 74.6% of 67 obstetricians committed the conjunction fallacy; respondents overestimated the combined probability by 12.8% (95% confidence interval [CI], 9.6%-16.1%) compared with the product of the 2 estimated conjuncts, a statistically significant deviation (t(66) = 7.94; P < 0.001; Cohen d = 0.97 [95% CI, 0.68-1.26]). In the second substudy, 86.9% of 84 pulmonologists committed the conjunction fallacy; respondents overestimated the combined probability by 19.8% (95% CI, 16.6%-23.0%), a statistically significant deviation (t(83) = 7.94; P < 0.001; Cohen d = 1.34 [95% CI, 1.04-1.64]). In the third substudy, 70.3% of 64 obstetricians committed the conjunction fallacy; respondents overestimated the combined probability by 18.0% (95% CI, 13.4%-22.5%), a statistically significant deviation (t(63) = 7.89; P < 0.001; Cohen d = 0.99 [95% CI, 0.68-1.28]).

In this study, it was common for seasoned obstetricians and pulmonologists to commit the conjunction fallacy. Given that physicians often need to estimate the successful outcome of a multistep procedure, they may be doing so in a flawed manner that may negatively impact decision-making.
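The conjunction rule the study tests can be sketched in a few lines. The probabilities below are made up for illustration and are not taken from the study's scenarios.

```python
def conjunction_probability(p_a: float, p_b: float) -> float:
    """Probability that two independent events both succeed: P(A and B) = P(A) * P(B)."""
    return p_a * p_b

def commits_conjunction_fallacy(p_a: float, p_b: float, estimate: float) -> bool:
    """True if an estimated overall probability exceeds either individual conjunct,
    which violates the basic law P(A and B) <= min(P(A), P(B))."""
    return estimate > min(p_a, p_b)

# Hypothetical example: 70% chance the presentation converts,
# 80% chance of vaginal delivery from the converted position.
p_convert, p_deliver = 0.70, 0.80
overall = conjunction_probability(p_convert, p_deliver)
print(round(overall, 2))  # 0.56

# An estimate of 0.75 for the overall success would commit the fallacy:
print(commits_conjunction_fallacy(p_convert, p_deliver, 0.75))  # True
```

Under this rule, any overall estimate above 0.70 (the smaller conjunct) is logically impossible, which is exactly the violation the surveys elicited.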
Frailty assessment is recommended before elective transcatheter aortic valve implantation (TAVI) to determine post-interventional prognosis. Several studies have investigated frailty in TAVI patients using numerous assessments; however, it remains unclear which is the most appropriate tool for clinical practice. Therefore, we evaluated which frailty assessments are mainly used and meaningful for ≤30-day and ≥1-year prognosis in TAVI patients. Randomized controlled or observational studies (prospective/retrospective) investigating all-cause mortality in older (≥70 years) TAVI patients were identified (PubMed; May 2020). In total, 79 studies investigating frailty with 49 different assessments were included. As single markers of frailty, mostly gait speed (23 studies) and serum albumin (16 studies) were used. Higher risk of 1-year mortality was predicted by slower gait speed (highest Hazard Ratio (HR): 14.71; 95% confidence interval (CI) 6.50–33.30) and lower serum albumin level (highest HR: 3.12; 95% CI 1.80–5.42). Composite indices (five items; seven studies) were associated with 30-day (highest Odds Ratio (OR): 15.30; 95% CI 2.71–86.10) and 1-year mortality (highest OR: 2.75; 95% CI 1.55–4.87). In conclusion, single markers of frailty, in particular gait speed, were widely used to predict 1-year mortality. Composite indices were appropriate, as was a comprehensive assessment of frailty.
Introduction
Elderly patients after hospitalisation for acute events due to age-related diseases (eg, joint or heart valve replacement surgery) are often characterised by markedly reduced functional health. Multicomponent rehabilitation (MR) is considered an appropriate approach to restore the functioning of these patients. However, its efficacy in improving functioning-related outcomes such as care dependency, activities of daily living (ADL), physical function and health-related quality of life (HRQL) remains unclear. We outline the research framework of a scoping review designed to map the available evidence on the effects of MR on the independence and functional capacity of elderly patients hospitalised for age-related diseases in four main medical specialties beyond geriatrics.
Methods and analysis
The biomedical databases (PubMed, Cochrane Library, ICTRP Search Platform, ClinicalTrials) and additionally Google Scholar will be systematically searched for studies comparing centre-based MR with usual care in patients ≥75 years of age, hospitalised for common acute events due to age-related diseases (eg, joint replacement, stroke) in one of the specialties of orthopaedics, oncology, cardiology or neurology. MR is defined as exercise training plus at least one additional component (eg, nutritional counselling), starting within 3 months after hospital discharge. Randomised controlled trials as well as prospective and retrospective controlled cohort studies will be included from inception and without language restriction. Studies investigating patients <75 years of age or other specialties (eg, geriatrics), studies using a different definition of rehabilitation, and studies with other designs will be excluded. Care dependency after at least a 6-month follow-up is set as the primary outcome. Physical function, HRQL, ADL, rehospitalisation and mortality will additionally be considered. Data for each outcome will be summarised, stratified by specialty, study design and type of assessment. Furthermore, quality assessment of the included studies will be performed.
Ethics and dissemination
Ethical approval is not required. Findings will be published in a peer-reviewed journal and presented at national and/or international congresses.
Introduction
Airway infection with pathogens and its associated pulmonary exacerbations (PEX) are the major causes of morbidity and premature death in cystic fibrosis (CF). Preventing or postponing chronic infections requires early diagnosis. However, limitations of conventional microbiology-based methods can hamper identification of exacerbations and specific pathogen detection. Analyzing volatile organic compounds (VOCs) in breath samples may be an interesting tool in this regard, as VOC biomarkers can characterize specific airway infections in CF.
Areas covered
We address the current achievements in VOC analysis and discuss studies assessing VOC biomarkers and fingerprints, i.e. combinations of multiple VOCs, in breath samples aiming at pathogen and PEX detection in people with CF (pwCF). We aim to provide a basis for further research in this interesting field.
Expert opinion
Overall, VOC-based analysis is a promising tool for diagnosis of infection and inflammation with potential to monitor disease progression in pwCF. Advantages over conventional diagnostic methods, including easy and non-invasive sampling procedures, may help to drive prompt, suitable therapeutic approaches in the future. Our review shall encourage further research, including validation of VOC-based methods. Specifically, longitudinal validation under standardized conditions is of interest in order to ensure repeatability and enable inclusion in CF diagnostic routine.
Introduction
Vagally mediated heart rate variability is an index of autonomic nervous system activity that is associated with a large variety of outcome variables including psychopathology and self-regulation. While practicing heart rate variability biofeedback over several weeks has been reliably associated with a number of positive outcomes, its acute effects are not well known. As the strongest association with vagally mediated heart rate variability has been found particularly within the attention-related subdomain of self-regulation, we investigated the acute effect of heart rate variability biofeedback on attentional control using the revised Attention Network Test.
Methods
Fifty-six participants were tested in two sessions. In one session each participant received a heart rate variability biofeedback intervention, and in the other session a control intervention of paced breathing at a normal ventilation rate. After the biofeedback or control intervention, participants completed the Attention Network Test using the Orienting Score as a measure of attentional control.
Results
Mixed models revealed that higher resting baseline vagally mediated heart rate variability was associated with better performance in attentional control, which suggests more efficient direction of attention to target stimuli. There was no significant main effect of the intervention on attentional control. However, an interaction effect indicated better performance in attentional control after biofeedback in individuals who reported higher current stress levels.
Discussion
The results point to acute beneficial effects of heart rate variability biofeedback on cognitive performance in highly stressed individuals. Although promising, the results need to be replicated in larger or more targeted samples in order to reach stronger conclusions about the effects.
This study investigated the incidence of Achilles and patellar tendinopathy in adolescent elite athletes and non-athletic controls. Furthermore, predictive and associated factors for tendinopathy development were analyzed. The prospective study consisted of two measurement days (M1/M2) with an interval of 3.2 ± 0.9 years. 157 athletes (12.1 ± 0.7 years) and 25 controls (13.3 ± 0.6 years) without Achilles/patellar tendinopathy were included at M1. Clinical and ultrasound examinations of both Achilles (AT) and patellar tendons (PT) were performed. Main outcome measures were the incidence of tendinopathy and of structural intratendinous alterations (hypo-/hyperechogenicity, vascularization) at M2 [%]. The incidence of Achilles tendinopathy was 1% in athletes and 0% in controls. Patellar tendinopathy was more frequent in athletes (13%) than in controls (4%). The incidence of intratendinous alterations in ATs was 1-2% in athletes and 0% in controls, whereas in PTs it was 4-6% in both groups (p > 0.05). Intratendinous alterations at M2 were associated with patellar tendinopathy in athletes (p ≤ 0.01). Intratendinous alterations at M1, anthropometric data, training amount, sport, or sex did not predict tendinopathy development (p > 0.05). The incidence of tendinopathy and intratendinous alterations in adolescent athletes is low in ATs and more common in PTs. Development of intratendinous alterations in PTs is associated with tendinopathy. However, predictive factors could not be identified.
This study investigated whether transcutaneous auricular vagus nerve stimulation (taVNS) enhances reversal learning and augments noradrenergic biomarkers (i.e., pupil size, cortisol, and salivary alpha-amylase [sAA]). We also explored the effect of taVNS on respiratory rate and cardiac vagal activity (CVA). Seventy-one participants received stimulation of either the cymba conchae (taVNS) or the earlobe (sham) of the left ear. After learning a series of cue-outcome associations, stimulation was applied before and throughout a reversal phase in which cue-outcome associations were changed for some (reversal), but not for other (distractor), cues. Tonic pupil size, salivary cortisol, sAA, respiratory rate, and CVA were assessed at different time points. Contrary to our hypothesis, taVNS was not associated with an overall improvement in performance on the reversal task. Compared to sham, the taVNS group performed worse for distractor than for reversal cues. taVNS did not increase tonic pupil size or sAA. Only post hoc analyses indicated that the cortisol decline was steeper in the sham group than in the taVNS group. Exploratory analyses showed that taVNS decreased respiratory rate but did not affect CVA. The weak and unexpected effects found in this study might relate to the lack of parameter optimization for taVNS and invite further investigation of the effects of taVNS on cortisol and respiratory rate.
Background: The enzyme-linked immunosorbent assay (ELISA) is an indispensable tool for clinical diagnostics to identify or differentiate diseases such as autoimmune illnesses, but also to monitor their progression or control the efficacy of drugs. One use case of ELISA is to differentiate between different states (e.g. healthy vs. diseased). Another goal is to quantitatively assess the biomarker in question, like autoantibodies. Thus, the ELISA technology is used for the discovery and verification of new autoantibodies, too. Of key interest, however, is the development of immunoassays for the sensitive and specific detection of such biomarkers at early disease stages. Therefore, users have to deal with many parameters, such as buffer systems or antigen-autoantibody interactions, to successfully establish an ELISA. Often, fine-tuning like testing of several blocking substances is performed to yield high signal-to-noise ratios.

Methods: We developed an ELISA to detect IgA and IgG autoantibodies against chitinase-3-like protein 1 (CHI3L1), a newly identified autoantigen in inflammatory bowel disease (IBD), in the serum of control and disease groups (n = 23, respectively). Microwell plates with different surface modifications (PolySorp and MaxiSorp coating) were tested to detect reproducibility problems.

Results: We found a significant impact of the surface properties of the microwell plates. IgA antibody reactivity was significantly lower, since it was in the range of background noise, when measured on MaxiSorp coated plates (p < 0.0001). The IgG antibody reactivity did not differ on the diverse plates, but the plate surface had a significant influence on the test result (p = 0.0005).

Conclusion: With this report, we want to draw readers' attention to the properties of solid phases and their effects on the detection of autoantibodies by ELISA. We want to sensitize the reader to the fact that the choice of the wrong plate can lead to a false negative test result, which in turn has serious consequences for the discovery of autoantibodies.
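The signal-to-noise concern raised above is often handled with a positivity cutoff. As a minimal sketch only: the rule below (mean of blank wells plus three standard deviations) is a common convention, not the cutoff rule of this paper, and all optical-density values are made up.

```python
import statistics

def positivity_cutoff(blank_ods: list[float], k: float = 3.0) -> float:
    """Cutoff = mean of blank/control optical densities plus k standard deviations.
    (A common convention; illustrative only, not taken from the study.)"""
    return statistics.mean(blank_ods) + k * statistics.stdev(blank_ods)

def signal_to_noise(sample_od: float, blank_ods: list[float]) -> float:
    """Ratio of a sample's optical density to the mean background."""
    return sample_od / statistics.mean(blank_ods)

blanks = [0.08, 0.10, 0.09, 0.11]   # hypothetical background ODs
cutoff = positivity_cutoff(blanks)
print(round(cutoff, 3))             # ~0.134

# A sample near background would read false negative on the wrong plate;
# a well-separated sample clears the cutoff comfortably:
print(signal_to_noise(0.45, blanks) > 2)  # True
```

On a plate where reactivity collapses into the background range (as reported here for IgA on MaxiSorp), true positives would fall below any such cutoff, which is exactly the false-negative risk the authors warn about.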
Background
Elderly patients are a growing population in cardiac rehabilitation (CR). As postural control declines with age, assessment of impaired balance is important in older CR patients in order to predict fall risk and to initiate countermeasures. Functional balance tests are subjective, lack adequate sensitivity to small differences, and are subject to ceiling effects. A quantitative approach measuring postural control on a continuous scale is therefore desirable. Force plates are already used for this purpose in other clinical contexts and could therefore be a promising tool also for older CR patients. However, in this population the reliability of the assessment is not fully known.
Research question
Analysis of test-retest reliability of center of pressure (CoP) measures for the assessment of postural control using a force plate in older CR patients.
Methods
156 CR patients (> 75 years) were enrolled. CoP measures (path length (PL), mean velocity (MV), and 95% confidence ellipse area (95CEA)) were analyzed twice with an interval of two days in between (bipedal narrow stance, eyes open (EO) and closed (EC), three trials per condition, 30 s per trial), using a force plate. For test-retest reliability estimation, absolute differences (Δ: T0-T1), intraclass correlation coefficients (ICC) with 95% confidence intervals, standard error of measurement and minimal detectable change were calculated.
Results
Under the EO condition, ICCs were excellent for PL and MV (0.95) and good for 95CEA (0.88), with Δ of 10.1 cm (PL), 0.3 cm/sec (MV) and 1.5 cm² (95CEA), respectively. Under the EC condition, ICCs were excellent (> 0.95) for all variables, with larger Δ (PL: 21.7 cm; MV: 0.7 cm/sec; 95CEA: 2.4 cm²).
Significance
In older CR patients, the assessment of CoP measures using a force plate shows good to excellent test-retest reliability.
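The reliability quantities named in the Methods follow directly from the pooled SD and the ICC via the standard formulas SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. A minimal sketch with made-up numbers (not the study's data):

```python
import math

def sem(sd_pooled: float, icc: float) -> float:
    """Standard error of measurement from pooled SD and ICC."""
    return sd_pooled * math.sqrt(1.0 - icc)

def mdc95(sem_value: float) -> float:
    """Minimal detectable change at 95% confidence."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Hypothetical inputs: pooled SD of path length 12.0 cm, ICC 0.95.
s = sem(12.0, 0.95)
print(round(s, 2))         # 2.68 cm
print(round(mdc95(s), 2))  # 7.44 cm
```

The MDC is the practically useful number here: a change in path length smaller than it cannot be distinguished from measurement noise in an individual patient.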
The Role of the Precuneus in Human Spatial Updating in a Real Environment Setting—A cTBS Study
(2022)
As we move through an environment, we update the positions of our body relative to other objects, even when some objects temporarily or permanently leave our field of view. This ability is termed egocentric spatial updating and plays an important role in everyday life. Still, our knowledge about its representation in the brain is scarce, with previous studies using virtual movements in virtual environments or patients with brain lesions suggesting that the precuneus might play an important role. However, whether this assumption also holds when healthy humans move in real environments, where full body-based cues are available in addition to the visual cues typically used in many VR studies, is unclear. Therefore, in this study we investigated the role of the precuneus in egocentric spatial updating in a real-environment setting in 20 healthy young participants who underwent two conditions in a cross-over design: (a) stimulation, achieved by applying continuous theta-burst stimulation (cTBS) to inhibit the precuneus, and (b) a sham condition (activated coil turned upside down). In both conditions, participants had to walk back blindfolded to objects they had previously memorized while walking with open eyes. Simplified trials (without spatial updating) were used as a control condition to make sure the participants were not affected by factors such as walking blindfolded, vestibular deficits or working memory deficits. A significant interaction was found, with participants performing better in the sham condition than during real stimulation, showing smaller errors in both distance and angle. The results of our study reveal evidence of an important role of the precuneus in real-environment egocentric spatial updating; studies on larger samples are necessary to confirm and further investigate this finding.
Functional near-infrared spectroscopy (fNIRS) allows for a reliable assessment of oxygenated blood flow in relevant brain regions. Recent advancements in immersive virtual reality (VR)-based technology have generated many new possibilities for its application, such as in stroke rehabilitation. In this study, we asked whether there is a difference in oxygenated hemoglobin (HbO2) within brain motor areas during hand/arm movements between immersive and non-immersive VR settings. Ten healthy young participants (24.3 ± 3.7 years, three females) were tested using a specially developed VR paradigm, called “bus riding”, whereby participants used their hand to steer a moving bus. Both immersive and non-immersive conditions stimulated the brain regions controlling hand movements, namely the motor cortex, but no significant differences in HbO2 could be found between the two conditions in any of the relevant brain regions. These results are to be interpreted with caution, as only ten participants were included in the study.
Although chronic aluminum neurotoxicity is well documented, there are no well-established experimental protocols of Al exposure. In the current study, the toxic effects of sub-chronic Al exposure were evaluated in outbred male rats (gastrointestinal administration). Forty animals were used: 10 were administered an AlCl3 water solution (2 mg/kg Al per day) for 1 month, 10 received the same concentration of AlCl3 for 3 months, and 20 (10 per observation period) received saline as control. After 30 and 90 days, the animals underwent behavioral tests: open field, passive avoidance, extrapolation escape task, and grip strength. At the end of the study, the blood, liver, kidney, and brain were excised for analytical and morphological studies. The Al content was measured by inductively coupled plasma mass spectrometry. Essential trace elements (Co, Cr, Cu, Fe, Mg, Mn, Mo, Se, and Zn) were measured in whole blood samples. Although no morphological changes were observed in the brain, liver, or kidney at either exposure duration, dose-dependent Al accumulation and behavioral differences (increased locomotor activity after 30 days) between treatment and control groups were observed. Moreover, after 30 days of exposure, a strong positive correlation between the Al content in the brain and blood of individual animals was established, which surprisingly disappeared by the third month. This may indicate adaptation of the neural barrier to Al exposure or saturation of Al transport into the brain. Notably, we did not observe a clear neurodegenerative process after this rather prolonged sub-chronic Al exposure, so longer exposure periods are probably required.
Diabetic foot ulcer (DFU) is a severe complication of diabetes and a challenging medical condition. Conventional treatments for DFU have not been effective enough to reduce amputation rates, which urges the need for additional treatments. Stem cell-based therapy for DFU has been investigated over the past years. Its therapeutic effect is mediated by promoting angiogenesis, secreting paracrine factors, stimulating vascular differentiation, suppressing inflammation, improving collagen deposition, and immunomodulation. It remains controversial which type and origin of stem cells, and which administration route, would be optimal for therapy. We reviewed the different types and origins of stem cells and the routes of administration used for the treatment of DFU in clinical and preclinical studies. Diabetes leads to impairment of the stem cells of affected patients, which makes it less ideal to use autologous stem cells and requires looking for a matching donor. Moreover, angioplasty can be complementary to stem cell therapy, and scaffolds have a positive impact on the healing process of DFU by stem cell-based therapy. In short, stem cell-based therapy is promising in the field of regenerative medicine, but more studies are still needed to determine the ideal type of stem cells required in therapy, their safety, proper dosing, and the optimal administration route.
Objective
To improve consumer decision making, the results of risk assessments on food, feed, consumer products or chemicals need to be communicated not only to experts but also to non-expert audiences. The present study draws on evidence from literature reviews and focus groups with diverse stakeholders to identify content to integrate into an existing risk assessment communication format (the "Risk Profile").
Methods
A combination of rapid literature reviews and focus groups with experts (risk assessors (n = 15), risk managers (n = 8)), and non-experts (general public (n = 18)) were used to identify content and strategies for including information about risk assessment results in the “Risk Profile” from the German Federal Institute for Risk Assessment. Feedback from initial focus groups was used to develop communication prototypes that informed subsequent feedback rounds in an iterative process. A final prototype was validated in usability tests with experts.
Results
Focus group feedback and suggestions from risk assessors were largely in line with findings from the literature. Risk managers and lay persons offered similar suggestions on how to improve the existing communication of risk assessment results (e.g., including more explanatory detail, reporting probabilities for individual health impairments, and specifying risks for subgroups in additional sections). Risk managers found information about quality of evidence important to communicate, whereas people from the general public found this information less relevant. Participants from lower educational backgrounds had difficulties understanding the purpose of risk assessments. User tests found that the final prototype was appropriate and feasible to implement by risk assessors.
Conclusion
An iterative and evidence-based process was used to develop content to improve the communication of risk assessments to the general public while being feasible to use by risk assessors. Remaining challenges include how to communicate dose-response relationships and standardise quality of evidence ratings across disciplines.
The PNPLA3 reference single-nucleotide polymorphism rs738409 has been identified as a predisposing factor for nonalcoholic fatty liver disease. A simple method based on PCR and restriction fragment length polymorphism (RFLP) analysis had been published to detect the nonpathogenic PNPLA3 rs738409 allele. The presence of the pathogenic variant was deduced from the indigestibility of the corresponding PCR product with BtsCI, which recognizes the nonpathogenic allele. However, one cannot exclude that an enzymatic reaction fails for other, more trivial, reasons. For safe and secure detection of the pathogenic PNPLA3 rs738409 variant, we have further developed the PCR-RFLP method by adding a second restriction enzyme digest, clearly identifying the correct PNPLA3 alleles and in particular the pathogenic variant.

METHOD SUMMARY

The method presented here represents an improved genetic diagnosis of the PNPLA3 rs738409 alleles based on conventional and inexpensive molecular biological methods. We used a methodology based on PCR and RFLP analysis and clearly identified both described alleles by the use of two restriction enzymes. Digestion of an individual's specific PNPLA3 PCR fragment with both enzymes in independent reactions clearly showed the PNPLA3 rs738409 genotype.
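The gain from the second digest can be illustrated as a decision table. The abstract does not name the second enzyme, so the sketch below only assumes the design it describes: one digest (BtsCI) cuts the nonpathogenic allele, and an independent second digest cuts the pathogenic allele.

```python
# Hypothetical genotype-calling logic for the two-enzyme PCR-RFLP design.
# Enzyme behavior beyond what the abstract states is an illustrative assumption.

def call_genotype(cut_by_nonpath_enzyme: bool, cut_by_path_enzyme: bool) -> str:
    """Infer the rs738409 genotype from whether each independent digest cut
    the PCR product."""
    if cut_by_nonpath_enzyme and cut_by_path_enzyme:
        return "heterozygous"              # both alleles present
    if cut_by_nonpath_enzyme:
        return "homozygous nonpathogenic"
    if cut_by_path_enzyme:
        return "homozygous pathogenic"
    return "digest failed - repeat"        # neither cut: likely technical failure

print(call_genotype(True, False))   # homozygous nonpathogenic
print(call_genotype(False, True))   # homozygous pathogenic
print(call_genotype(False, False))  # digest failed - repeat
```

The last branch is the whole point of the improvement: with a single enzyme, "no cut" is ambiguous between the pathogenic genotype and a failed reaction, whereas with two independent digests a double negative flags a technical failure instead of a genotype.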
Background
In cystic fibrosis (CF), acute respiratory exacerbations critically enhance pulmonary destruction. Since these mainly occur outside regular appointments, they remain unexplored. We previously elaborated a protocol for home-based upper airway (UAW) sampling obtaining nasal-lavage fluid (NLF), which, in contrast to sputum, does not require immediate processing. The aim of this study was to compare UAW inflammation and pathogen colonization during stable phases and exacerbations in CF patients and healthy controls.
Methods
Initially, we obtained NLF by rinsing each nostril with 10 ml of isotonic saline during stable phases. During exacerbations, subjects regularly collected NLF at home. CF patients directly submitted one aliquot for microbiological cultures. The remaining samples were immediately frozen until transfer on ice to our clinic, where PCR analyses were performed and interleukin (IL)-1 beta/IL-6/IL-8, neutrophil elastase (NE), matrix metalloproteinase (MMP)-9, and tissue inhibitor of metalloproteinase (TIMP)-1 were assessed.
Results
Altogether, 49 CF patients and 38 healthy controls (HCs) completed the study, and 214 NLF samples were analyzed. Of the 49 CF patients, 20 were at least intermittently colonized with P. aeruginosa and received azithromycin and/or inhaled antibiotics as standard therapy. At baseline, IL-6 and IL-8 tended to be elevated in CF patients compared to controls. During infection, inflammatory mediators increased in both cohorts, reaching significance only for IL-6 in controls (p=0.047). Inflammatory responses tended to be higher in controls [1.6-fold (NE) to 4.4-fold (MMP-9)], while in CF patients, mediators increased only moderately [1.2-1.5-fold (IL-6/IL-8/NE/TIMP-1/MMP-9)]. Patients receiving inhaled antibiotics or azithromycin (n=20 and n=15, respectively) showed lower levels of IL-1 beta/IL-6/IL-8 and NE during exacerbation than CF patients not receiving those antibiotics. In addition, CF patients receiving azithromycin showed MMP-9 levels significantly lower than those of CF patients not receiving azithromycin at both the stable phase and exacerbation. Altogether, rhinoviruses were the most frequently detected viruses, found at least once in n=24 (49.0%) of the 49 included CF patients and in n=26 (68.4%) of the 38 healthy controls over the 13-month duration of the study. Remarkably, during exacerbation, rhinovirus detection rates were significantly higher in the HC group than in CF patients (65.8% vs. 22.4%; p<0.0001).
Conclusion
Non-invasive and partially home-based UAW sampling opens new windows for the assessment of inflammation and pathogen colonization in the unified airway system.
Electroencephalographic (EEG) research indicates changes in adults' low-frequency bands of frontoparietal brain areas when executing balance tasks with increasing postural demands. However, this issue is unresolved for adolescents performing the same balance task with increasing difficulty. Therefore, we examined the effects of progressively increasing balance task difficulty on balance performance and brain activity in adolescents. Thirteen healthy adolescents aged 16-17 years performed tests in bipedal upright stance on a balance board with six progressively increasing levels of task difficulty. Postural sway and cortical activity were recorded simultaneously using a pressure-sensitive measuring system and EEG. The power spectrum was analyzed for theta (4-7 Hz) and alpha-2 (10-12 Hz) frequency bands in pre-defined frontal, central, and parietal clusters of electrocortical sources. Repeated measures analysis of variance (rmANOVA) showed a significant main effect of task difficulty for postural sway (p < 0.001; d = 6.36). Concomitantly, the power spectrum changed in frontal, bilateral central, and bilateral parietal clusters. RmANOVAs revealed significant main effects of task difficulty for theta band power in the frontal (p < 0.001, d = 1.80) and both central clusters (left: p < 0.001, d = 1.49; right: p < 0.001, d = 1.42) as well as for alpha-2 band power in both parietal clusters (left: p < 0.001, d = 1.39; right: p < 0.001, d = 1.05) and in the central right cluster (p = 0.005, d = 0.92). Increases in theta band power (frontal, central) and decreases in alpha-2 power (central, parietal) with increasing balance task difficulty may reflect increased attentional processes and/or error monitoring as well as increased sensory information processing due to rising postural demands. In general, our findings are mostly in agreement with studies conducted in adults.
Similar to adult studies, our data with adolescents indicated the involvement of frontoparietal brain areas in the regulation of postural control. In addition, we detected that activity of selected brain areas (e.g., bilateral central) changed with increasing postural demands.
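As a purely illustrative sketch, unrelated to the study's actual EEG pipeline and using an invented synthetic signal, the theta (4-7 Hz) and alpha-2 (10-12 Hz) band powers discussed above can be estimated from a sampled signal via the FFT:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power of a 1-D signal (sampled at fs Hz) in [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Synthetic example: a strong 6 Hz (theta) oscillation plus a weak 11 Hz one.
fs = 250                              # Hz; a typical EEG sampling rate (assumption)
t = np.arange(0, 4, 1.0 / fs)         # 4 s of data
sig = np.sin(2 * np.pi * 6 * t) + 0.1 * np.sin(2 * np.pi * 11 * t)

theta = band_power(sig, fs, 4, 7)     # theta band, 4-7 Hz
alpha2 = band_power(sig, fs, 10, 12)  # alpha-2 band, 10-12 Hz
```

With this construction the theta estimate dominates the alpha-2 estimate, mirroring how band-specific power is compared across task conditions in studies such as the one above.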
Real options are widely applied in strategic and operational decision-making, allowing for managerial flexibility in uncertain contexts. Increased scholarly interest has led to an extensive but fragmented research landscape. We aim to measure and systematize the research field quantitatively. To achieve this goal, we conduct bibliometric performance analyses and bibliographic coupling analyses with an in-depth content review. The results of the performance analyses show an increasing interest in real options since the beginning of the 2000s and identify the most influential journals and authors. The science mappings reveal six and seven research clusters over the last two decades. Based on an in-depth analysis of their themes, we develop a research framework comprising antecedents, application areas, internal and external contingencies, and uncertainty resolution through real option valuation or reasoning. We identify several gaps in that framework, which we propose to tackle in future research.
This study sought to analyze the relationship between in-season training workload and changes in aerobic power (VO2max), maximum and resting heart rate (HRmax and HRrest), and linear sprint medium (LSM) and short test (LSS) performance in soccer players younger than 16 years (under-16 soccer players). We additionally aimed to explain changes in fitness levels during the in-season through regression models, considering accumulated load, baseline levels, and peak height velocity (PHV) as predictors. Twenty-three male sub-elite soccer players aged 15.5 ± 0.2 years (PHV: 13.6 ± 0.4 years; body height: 172.7 ± 4.2 cm; body mass: 61.3 ± 5.6 kg; body fat: 13.7% ± 3.9%; VO2max: 48.4 ± 2.6 mL·kg⁻¹·min⁻¹) were tested three times across the season (i.e., early-season (EaS), mid-season (MiS), and end-season (EnS)) for VO2max, HRmax, LSM, and LSS. Aerobic and speed variables gradually improved over the season and had a strong association with PHV. Moreover, HRmax demonstrated improvements from EaS to EnS; however, this was more evident in the intermediate period (from EaS to MiS) and had a strong association with VO2max. Regression analysis showed significant predictions for VO2max [F(2, 20) = 8.18, p ≤ 0.001] with an R2 of 0.45. In conclusion, meaningful variation in youth players' fitness levels can be observed across the season, and such changes can be partially explained by the load imposed.
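For readers unfamiliar with the reported R², the following plain-Python sketch shows how a single-predictor least-squares fit and its R² are computed; the data are invented for illustration and are not the study's:

```python
def linear_fit_r2(xs, ys):
    """Ordinary least squares y = a + b*x; returns (a, b, R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot    # R^2 = explained share of variance

# Hypothetical accumulated-load (x) vs. fitness-change (y) pairs:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]
a, b, r2 = linear_fit_r2(xs, ys)
```

An R² of 0.45, as reported, would mean the predictors account for roughly 45% of the variance in VO2max change.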
Nudix hydrolase NUDT19 regulates mitochondrial function and ATP production in murine hepatocytes
(2022)
Changes in intracellular CoA levels are known to contribute to the development of non-alcoholic fatty liver disease (NAFLD) in type 2 diabetes (T2D) in humans and rodents. However, the underlying genetic basis is still poorly understood.
Due to their diverse susceptibility to metabolic diseases, mouse inbred strains have proven to be powerful tools for the identification of novel genetic factors underlying the pathophysiology of NAFLD and diabetes. Transcriptome analysis of mouse liver samples revealed the nucleoside diphosphate linked moiety X-type motif 19 (Nudt19) as a novel candidate gene for NAFLD and T2D development. Knockdown (KD) of Nudt19 increased mitochondrial and glycolytic ATP production rates in Hepa 1-6 cells by 41% and 10%, respectively.
The enforced utilization of glutamine or fatty acids as energy substrates reduced uncoupled respiration by 41% and 47%, respectively, in non-target (NT) siRNA-transfected cells.
This reduction was prevented upon Nudt19 KD. Furthermore, incubation with palmitate or oleate increased mitochondrial ATP production by 31% and 20%, respectively, and uncoupled respiration by 23% and 30%, in Nudt19 KD cells but not in NT cells.
The enhanced fatty acid oxidation in Nudt19 KD cells was accompanied by a 1.3-fold increased abundance of Pdk4.
This study is the first to describe Nudt19 as a regulator of hepatic lipid metabolism and a potential mediator of NAFLD and T2D development.
Background: Socially assistive devices (care robots, companions, smart screen assistants) have been advocated as a promising tool in elderly care in Western healthcare systems. Ethical debates indicate various challenges. One of the most prevalent arguments in the debate is the double-benefit argument claiming that socially assistive devices may not only provide benefits for autonomy and well-being of their users but might also be more efficient than other caring practices and might help to mitigate scarce resources in healthcare. Against this background, we used a subset of comparative empirical studies from a comprehensive systematic review on effects and perceptions of human-machine interaction with socially assistive devices to gather and appraise all available evidence supporting this argument from the empirical side.
Methods: Electronic databases and additional sources were queried using a comprehensive search strategy which generated 9851 records. Studies were screened independently by two authors. Methodological quality of studies was assessed. For 39 reports using a comparative study design, a narrative synthesis was performed.
Results: The data shows positive evidential support to claim that some socially assistive devices (Paro) might be able to contribute to the well-being and autonomy of their users. However, results also indicate that these positive findings may be heavily dependent on the context of use and the population. In addition, we found evidence that socially assistive devices can have negative effects on certain populations. Evidence regarding the claim of efficiency is scarce. Existing results indicate that socially assistive devices can be more effective than standard of care but are far less effective than plush toys or placebo devices.
Discussion: We suggest using the double-benefit argument with great caution as it is not supported by the currently available evidence. The occurrence of potentially negative effects of socially assistive devices requires more research and indicates a more complex ethical calculus than suggested by the double-benefit argument.
Nonparametric goodness-of-fit testing for parametric covariate models in pharmacometric analyses
(2021)
The characterization of covariate effects on model parameters is a crucial step during pharmacokinetic/pharmacodynamic analyses. While covariate selection criteria have been studied extensively, the choice of the functional relationship between covariates and parameters has received much less attention. Often, a particular simple class of covariate-to-parameter relationships (linear, exponential, etc.) is chosen ad hoc or based on domain knowledge, and statistical evaluation is limited to the comparison of a small number of such classes. Goodness-of-fit testing against a nonparametric alternative provides a more rigorous approach to covariate model evaluation, but no such test has been proposed so far. In this manuscript, we derive and evaluate nonparametric goodness-of-fit tests of parametric covariate models (the null hypothesis) against a kernelized, Tikhonov-regularized alternative, transferring concepts from statistical learning to the pharmacological setting. The approach is evaluated in a simulation study on the estimation of the age-dependent maturation effect on the clearance of a monoclonal antibody. Scenarios of varying data sparsity and residual error are considered. The goodness-of-fit test correctly identified misspecified parametric models with high power in the relevant scenarios. The case study provides proof of concept of the feasibility of the proposed approach, which is envisioned to be beneficial for applications that lack well-founded covariate models.
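The core idea can be sketched as follows: fit the parametric (here linear) covariate model, then measure how much residual structure a kernelized, ridge-penalized (Tikhonov-regularized) alternative still explains. The data, Gaussian kernel, and hyperparameters below are invented for illustration and do not reproduce the authors' actual test statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

def kernel_ridge_fit(x, y, gamma=1.0, lam=1.0):
    """Gaussian-kernel ridge (Tikhonov-regularized) in-sample predictions."""
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return K @ alpha

def misfit_score(x, y):
    """Mean squared signal that a kernel fit still finds in the residuals
    of a linear parametric model; large values hint at misspecification."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(np.mean(kernel_ridge_fit(x, resid) ** 2))

# Simulated covariate effects: one truly linear, one exponential
# (so the linear null model is misspecified for the latter).
x = rng.uniform(0.0, 3.0, 80)
y_lin = 1.0 + 0.5 * x + 0.05 * rng.normal(size=80)
y_exp = np.exp(0.5 * x) + 0.05 * rng.normal(size=80)

s_lin = misfit_score(x, y_lin)   # small: linear null fits
s_exp = misfit_score(x, y_exp)   # larger: kernel finds leftover structure
```

In an actual test, the statistic would be compared against its null distribution (e.g., via permutation or the asymptotics derived in the paper) rather than between two data sets.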
Extracellular vesicles
(2021)
Osteoporosis is characterized by low bone mass and damage to the bone tissue's microarchitecture, leading to increased fracture risk. Several studies have provided evidence for associations between psychosocial stress and osteoporosis through various pathways, including the hypothalamic-pituitary-adrenocortical axis, the sympathetic nervous system, and other endocrine factors. As psychosocial stress provokes oxidative cellular stress with consequences for mitochondrial function and cell signaling (e.g., gene expression, inflammation), it is of interest whether extracellular vesicles (EVs) may be a relevant biomarker in this context or act by transporting substances. EVs are intercellular communicators that transfer encapsulated substances, modify the phenotype and function of target cells, and mediate cell-cell communication; they therefore play critical roles in disease progression and have applications in clinical diagnosis and therapy. This review summarizes the characteristics of EVs, their role in stress and osteoporosis, and their benefit as biological markers. We demonstrate that EVs are potential mediators of psychosocial stress and osteoporosis and may be beneficial in innovative research settings.
Background
Depression is a leading cause of disability worldwide and a significant contributor to the global burden of disease. Altered leptin levels are known to be associated with depressive symptoms; however, the literature is inconsistent as to whether these levels are increased or decreased. Due to various limitations associated with commonly used antidepressant drugs, alternatives such as exercise therapy are gaining importance. Therefore, the current study investigated whether depressed patients have higher leptin levels than healthy controls and whether exercise is effective in reducing these levels.
Methods
Leptin levels of 105 participants with major depressive disorder (MDD; 45.7% female, age mean ± SEM: 39.1 ± 1.0) and 34 healthy controls (HC; 61.8% female, age mean ± SEM: 36.0 ± 2.0) were measured before and after a bicycle ergometer test. Additionally, the MDD group was separated into three groups: two endurance exercise intervention groups (EX) differing in their intensities, and a waiting list control group (WL). Leptin levels were measured pre and post a 12-week exercise intervention or the waiting period.
Results
Baseline data showed no significant differences in leptin levels between the MDD and HC groups. As expected, correlation analyses revealed significant relationships between leptin levels and body weight (HC: r = 0.474, p = 0.005; MDD: r = 0.198, p = 0.043) and, even more strongly, body fat content (HC: r = 0.755, p < 0.001; MDD: r = 0.675, p < 0.001). Neither the acute bicycle ergometer test nor the 12-week training intervention produced significant changes in circulating leptin levels.
Conclusion
Leptin levels were not altered in patients with major depression compared to healthy controls, and exercise had no effect on leptin levels, either acutely or after 12 weeks of endurance training.
Trial registration
The study was registered at the German register for clinical studies (DRKS) and the International Clinical Trials Registry Platform of the World Health Organization https://trialsearch.who.int/Trial2.aspx?TrialID=DRKS00008869 on 28/07/2015.
The primary aim of the current study was to examine the unique contribution of psychological need frustration and need satisfaction in the prediction of adults’ mental well-being and ill-being in a heterogeneous sample of adults (N = 334; Mage = 43.33, SD = 32.26; 53% females). Prior to this, validity evidence was provided for the German version of the Basic Psychological Need Satisfaction and Frustration Scale (BPNSFS) based on Self-Determination Theory (SDT). The results of the validation analyses found the German BPNSFS to be a valid and reliable measurement. Further, structural equation modeling (SEM) showed that both need satisfaction and frustration yielded unique and opposing associations with well-being. Specifically, the dimension of psychological need frustration predicted adults’ ill-being. Future research should examine whether frustration of psychological needs is involved in the onset and maintenance of psychopathology (e.g., major depressive disorder).
Background
Depression is one of the key factors contributing to difficulties in one's ability to work, and it is one of the major reasons why employees apply for psychotherapy and receive insurance subsidization of treatments. Hence, a growing number of studies rely on workability assessment scales as their primary outcome measure. The Work and Social Adjustment Scale (WSAS) has been documented as one of the most psychometrically reliable and valid tools specifically developed to assess workability and social functioning in patients with mental health problems. Yet, the application of the WSAS in Germany has been limited by the lack of a validated German-language version. The objective of the present study was to translate the WSAS, a brief and easily administrable tool, into German and to test its psychometric properties in a sample of adults with depression.
Methods
Two hundred seventy-seven patients (M = 48.3 years, SD = 11.1) with mild to moderately severe depression were recruited. A multistep translation from English into the German language was performed and the factorial validity, criterion validity, convergent validity, discriminant validity, internal consistency, and floor and ceiling effects were examined.
Results
Confirmatory factor analysis confirmed the one-factor structure of the WSAS. Significant correlations with the WHODAS 2.0 questionnaire, a measure of functionality, demonstrated good convergent validity. Significant correlations with depression and quality of life demonstrated good criterion validity. The WSAS also demonstrated strong internal consistency (α = .89), and the absence of floor and ceiling effects indicated good sensitivity of the instrument.
Conclusions
The results of the present study demonstrated that the German version of the WSAS has good psychometric properties comparable to other international versions of this scale. The findings recommend a global assessment of psychosocial functioning with the sum score of the WSAS.
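The internal-consistency figure reported above (α = .89) is Cronbach's alpha, computed from item and total-score variances. A minimal sketch with invented item scores (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists,
    one inner list per item, all of equal length (one entry per respondent)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Five WSAS-like items scored 0-8 by four hypothetical respondents:
items = [
    [4, 6, 2, 7],
    [5, 6, 3, 8],
    [3, 5, 2, 6],
    [4, 7, 2, 7],
    [5, 6, 1, 8],
]
alpha = cronbach_alpha(items)
```

Because these invented items rank respondents almost identically, alpha comes out high, which is the pattern behind a value like .89.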
Macrophages in pathologically expanded dysfunctional white adipose tissue are exposed to a mix of potential modulators of inflammatory response, including fatty acids released from insulin-resistant adipocytes, increased levels of insulin produced to compensate for insulin resistance, and prostaglandin E-2 (PGE(2)) released from activated macrophages. The current study addressed the question of how palmitate might interact with insulin or PGE(2) to induce the formation of the chemotactic pro-inflammatory cytokine interleukin-8 (IL-8). Human THP-1 cells were differentiated into macrophages. In these macrophages, palmitate induced IL-8 formation. Insulin enhanced the induction of IL-8 formation by palmitate as well as the palmitate-dependent stimulation of PGE(2) synthesis. PGE(2) in turn elicited IL-8 formation on its own and enhanced the induction of IL-8 release by palmitate, most likely by activating the EP4 receptor. Since IL-8 causes insulin resistance and fosters inflammation, the increase in palmitate-induced IL-8 formation that is caused by hyperinsulinemia and locally produced PGE(2) in chronically inflamed adipose tissue might favor disease progression in a vicious feed-forward cycle.
Older adults with amnestic mild cognitive impairment (aMCI) who, in addition to their memory deficits, also suffer from frontal-executive dysfunctions have a higher risk of developing dementia later in life than older adults with aMCI without executive deficits and older adults with non-amnestic MCI (naMCI). Handgrip strength (HGS) is also correlated with the risk of cognitive decline in the elderly. Hence, the current study aimed to investigate the associations between HGS and executive functioning in individuals with aMCI, naMCI, and healthy controls (HC). Older, right-handed adults with aMCI, naMCI, and HC underwent a handgrip strength measurement via a handheld dynamometer. Executive functions were assessed with the Trail Making Test (TMT A&B). Normalized handgrip strength (nHGS, normalized to body mass index (BMI)) was calculated, and its associations with executive functions (operationalized through z-scores of the TMT B/A ratio) were investigated through partial correlation analyses (i.e., accounting for age, sex, and severity of depressive symptoms). Low-to-moderate positive correlations between right nHGS (rp(22) = 0.364; p = 0.063) and left nHGS (rp(22) = 0.420; p = 0.037) and executive functioning were observed in older adults with aMCI but not in naMCI or HC. Our results suggest that higher levels of nHGS are linked to better executive functioning in aMCI but not in naMCI and HC. This relationship is perhaps driven by alterations in the integrity of the hippocampal-prefrontal network occurring in older adults with aMCI. Further research is needed to provide empirical evidence for this assumption.
In recent years, digital technologies have become a major means of providing health-related services, and this trend was strongly reinforced by the Coronavirus disease 2019 (COVID-19) pandemic. As it is well known that regular physical activity has positive effects on physical and mental health and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. However, in the course of this development, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, the above-mentioned terms are often used in different ways and relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health and to provide state-of-the-art definitions for them.
Background
Generalized weakness and fatigue are underexplored symptoms in emergency medicine. Triage tools often underestimate patients presenting to the emergency department (ED) with these nonspecific symptoms (Nemec et al., 2010). At the same time, physicians' disease severity rating (DSR) on a scale from 0 (not sick at all) to 10 (extremely sick) predicts key outcomes in ED patients (Beglinger et al., 2015; Rohacek et al., 2015). Our goals were (1) to characterize ED patients with weakness and/or fatigue (W|F); (2) to explore to what extent physicians' DSR at triage can predict five key outcomes in ED patients with W|F; (3) to assess how well DSR performs relative to two commonly used benchmark methods, the Emergency Severity Index (ESI) and the Charlson Comorbidity Index (CCI); (4) to determine to what extent DSR provides predictive information beyond ESI, CCI, or their linear combination, i.e., whether ESI and CCI should be used alone or in combination with DSR; and (5) to determine to what extent ESI, CCI, or their linear combination provide predictive information beyond DSR alone, i.e., whether DSR should be used alone or in combination with ESI and/or CCI.
Methods
Prospective observational study between 2013 and 2015 (analysis in 2018-2020, study team blinded to hypothesis) conducted at a single center. We studied an all-comer cohort of 3,960 patients (48% female, median age = 51 years, 94% completed 1-year follow-up). We examined two primary outcomes (acute morbidity (Bingisser et al., 2017; Weigel et al., 2017) and all-cause 1-year mortality) and three secondary outcomes (in-hospital mortality, hospitalization, and transfer to the ICU). We assessed the predictive power (i.e., resolution, measured as the area under the ROC curve, AUC) of the scores and, using logistic regression, their linear combinations.
Findings
Compared to patients without W|F (n = 3,227), patients with W|F (n = 733) showed higher prevalences for all five outcomes, reported more symptoms across both genders, and received higher DSRs (median = 4; interquartile range (IQR) = 3-6 vs. median = 3; IQR = 2-5). DSR predicted all five outcomes well above chance (i.e., AUCs ≳ 0.70), similarly well for patients with and without W|F, and as good as or better than ESI and CCI in patients with and without W|F (except for 1-year mortality, where CCI performed better). For acute morbidity, hospitalization, and transfer to the ICU, there is clear evidence that adding DSR to ESI and/or CCI improves predictions for both patient groups; for 1-year mortality and in-hospital mortality this holds for most, but not all, comparisons. Adding ESI and/or CCI to DSR generally did not improve performance or even decreased it.
Conclusions
The use of physicians' disease severity rating had not previously been investigated in patients with generalized weakness and fatigue. We show that physicians' prediction of acute morbidity, mortality, hospitalization, and transfer to the ICU through their DSR is also accurate in these patients. However, across all patients, DSR was less predictive of acute morbidity for female than for male patients. Future research should investigate how emergency physicians judge their patients' clinical state at triage and how this can be improved and used in simple decision aids.
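The resolution measure used in this study, the area under the ROC curve, has a convenient rank-based reading: it equals the probability that a randomly chosen patient who experiences the outcome received a higher score than one who did not. A small standard-library sketch with invented DSR values:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability of correct ranking; ties count 0.5."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos
        for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical DSR values (0-10 scale) for patients with vs. without an outcome:
auc_dsr = auc([6, 7, 5, 8, 4], [3, 4, 2, 5, 3, 1])
```

An AUC of 0.5 corresponds to chance-level discrimination, which is why values around 0.70 and above, as reported, indicate prediction well above chance.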
Background:
From birth to young adulthood, health and development of young people are strongly linked to their living situation, including their family's socioeconomic position (SEP) and living environment. The impact of regional characteristics on development in early childhood beyond family SEP has been rarely investigated. This study aimed to identify regional predictors of global developmental delay at school entry taking family SEP into consideration.
Method:
We used representative, population-based data from mandatory school entry examinations of the German federal state of Brandenburg in 2018/2019, covering n=22,801 preschool children. By applying binary multilevel models, we hierarchically analyzed the effect of regional deprivation, defined by the German Index of Socioeconomic Deprivation (GISD), and rurality, operationalized as the inverted population density of the children's school district, on global developmental delay (GDD) while adjusting for family SEP (low, medium, and high).
Results:
Family SEP was significantly and strongly linked to GDD. Children with high family SEP showed lower odds of GDD than children with medium SEP (female: OR=4.26, male: OR=3.46) or low SEP (female: OR=16.58, male: OR=12.79). Furthermore, we discovered a smaller but additional and independent effect of regional socioeconomic deprivation on GDD, with higher odds for children from more deprived school districts (female: OR=1.35, male: OR=1.20). However, rurality did not show a significant link to GDD in preschool children beyond family SEP and regional deprivation.
Conclusion:
Family SEP and regional deprivation are risk factors for child development and of particular interest to promote health of children in early childhood and over the life course.
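As background on the reported effect sizes: in (multilevel) logistic regression, a coefficient β on the log-odds scale corresponds to an odds ratio of exp(β). The coefficient below is hypothetical, chosen only to land near the order of the ORs reported above:

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient (log-odds scale) to an odds ratio."""
    return math.exp(beta)

beta_deprivation = 0.30                        # hypothetical coefficient, not from the study
or_deprivation = odds_ratio(beta_deprivation)  # ~1.35
```

Conversely, a reported OR can be mapped back to the model scale via β = ln(OR), which is how effects from different models are compared on a common footing.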
Background and Study Aims
Recurrent laryngeal nerve palsy (RLNP) is a potential complication of anterior cervical discectomy and fusion (ACDF). There is still substantial disagreement on the actual prevalence of RLNP after ACDF as well as on risk factors for postoperative RLNP. The aim of this study was to describe the prevalence of postoperative RLNP in a cohort of consecutive cases of ACDF and to examine potential risk factors.
Materials and Methods
This retrospective study included patients who underwent ACDF between 2005 and 2019 at a single neurosurgical center. As part of clinical routine, RLNP was examined prior to and after surgery by independent otorhinolaryngologists using endoscopic laryngoscopy. As potential risk factors for postoperative RLNP, we examined patients' age, sex, body mass index, multilevel surgery, and the duration of surgery.
Results
214 consecutive cases were included. The prevalence of preoperative RLNP was 1.4% (3/214) and the prevalence of postoperative RLNP was 9.0% (19/211). The number of operated levels was 1 in 73.5% (155/211), 2 in 24.2% (51/211), and 3 or more in 2.4% (5/211) of cases. Of all cases, 4.7% (10/211) were repeat surgeries. There was no difference in the prevalence of RLNP between the primary surgery group (9.0%, 18/183) and the repeat surgery group (10.0%, 1/10; p = 0.91). Likewise, no characteristic differed between subjects with and without postoperative RLNP. We found no association between postoperative RLNP and patients' age, sex, body mass index, duration of surgery, or number of levels (odds ratios between 0.24 and 1.05; p values between 0.20 and 0.97).
Conclusions
In our cohort, the prevalence of postoperative RLNP after ACDF was 9.0%. The fact that none of the examined variables was associated with the occurrence of RLNP supports the view that postoperative RLNP may depend more on direct mechanical manipulation during surgery than on specific patient or surgical characteristics.
This study aimed to validate the Norwegian version of the Basic Psychological Need Satisfaction and Frustration Scale (BPNSFS) and to examine its relations with indicators of well-being and ill-being. Additionally, despite the vast number of studies employing the BPNSFS, norms related to the BPNSFS are currently lacking.
Therefore, we also aimed to provide normative data for this scale. Data were collected among a representative sample of 326 participants (M age = 42.90 years, SD = 14.76; range 18-70) in Norway, of whom 49.7% were female.
Results yielded evidence for a six-factor structure (i.e., combining satisfaction/frustration with the type of need) and showed the subscales to be highly reliable. Subsequent structural equation modeling showed that both need satisfaction and need frustration related strongly to vitality, life satisfaction, and internalizing symptoms, but in opposite ways. Norm scores were provided, thereby differentiating between women and men and different age groups.
These findings support the use of the Norwegian BPNSFS and provide researchers and professionals with normative data on the most widely used tool to assess individuals' satisfaction and frustration of the basic psychological needs for autonomy, competence, and relatedness.
Recent theories suggest a shift from model-based, goal-directed to model-free, habitual decision-making in obsessive-compulsive disorder (OCD). However, it is yet unclear whether this shift in the decision process is heritable. We investigated 32 patients with OCD, 27 unaffected siblings (SIBs), and 31 healthy controls (HCs) using the two-step task. We computed behavioral and reaction time analyses and fitted a computational model to assess the balance between model-based and model-free control. Eighty subjects also underwent structural imaging. We observed a significant ordered effect for the shift towards model-free control in the direction OCD > SIB > HC in our computational parameter of interest. However, less directed analyses revealed no shift towards model-free control in OCD patients. Nonetheless, we found evidence for reduced model-based control in OCD patients compared to HCs and SIBs via second-stage reaction time analyses. In this measure, SIBs also showed higher levels of model-based control than HCs. Across all subjects, these effects were associated with the surface area of the left medial/right dorsolateral prefrontal cortex. Moreover, correlations between bilateral putamen/right caudate volumes and these effects varied as a function of group: they were negative in SIBs and OCD patients, but positive in HCs. Associations between fronto-striatal regions and model-based reaction time effects point to a potential endophenotype for OCD.
Myasthenia gravis is an autoimmune disease affecting neuromuscular transmission and causing skeletal muscle weakness. Additionally, systemic inflammation, cognitive deficits and autonomic dysfunction have been described.
However, little is known about myasthenia gravis-related reorganization of the brain. In this study, we thus investigated the structural and functional brain changes in myasthenia gravis patients.
Eleven myasthenia gravis patients (age: 70.64 ± 9.27 years; 11 males) were compared to age-, sex- and education-matched healthy controls (age: 70.18 ± 8.98 years; 11 males). Most of the patients (n = 10, 91%) received cholinesterase inhibitors.
Structural brain changes were determined by applying voxel-based morphometry using high-resolution T-1-weighted sequences. Functional brain changes were assessed with a neuropsychological test battery (including attention, memory and executive functions), a spatial orientation task and brain-derived neurotrophic factor blood levels.
Myasthenia gravis patients showed significant grey matter volume reductions in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus. Furthermore, myasthenia gravis patients showed significantly lower performance in executive functions, working memory (Spatial Span, P = 0.034, d = 1.466), verbal episodic memory (P = 0.003, d = 1.468) and somatosensory-related spatial orientation (Triangle Completion Test, P = 0.003, d = 1.200).
Additionally, serum brain-derived neurotrophic factor levels were significantly higher in myasthenia gravis patients (P = 0.001, d = 2.040). Our results indicate that myasthenia gravis is associated with structural and functional brain alterations. In particular, the grey matter volume changes in the cingulate gyrus and the inferior parietal lobe could be associated with the cognitive deficits in memory and executive functions.
Furthermore, deficits in somatosensory-related spatial orientation could be associated with the lower volumes in the inferior parietal lobe. Future research is needed to replicate these findings independently in a larger sample and to investigate the underlying mechanisms in more detail.
Klaus et al. compared myasthenia gravis patients to matched healthy control subjects and identified functional alterations in memory functions as well as structural alterations in the cingulate gyrus, in the inferior parietal lobe and in the fusiform gyrus.
The ability to catalyze diverse reactions with relevance for chemical and pharmaceutical research and industry has led to an increasing interest in fungal enzymes.
Considerable potential remains untapped, given the sheer number of new enzymes to be expected from the huge diversity of fungi.
Most of these fungal enzymes have not been characterized yet due to the lack of high throughput synthesis and analysis methods.
This bottleneck could be overcome by means of cell-free protein synthesis. In this study, cell-free protein synthesis based on eukaryotic cell lysates was utilized to produce a functional glycoside hydrolase (GH78) from the soft-rot fungus Xylaria polymorpha (Ascomycota).
The enzyme was successfully synthesized under different reaction conditions.
We characterized its enzymatic activities and immobilized the protein via FLAG-Tag interaction. Alteration of several conditions including reaction temperature, template design and lysate supplementation had an influence on the activity of cell-free synthesized GH78.
Consequently, this led to the production of purified GH78 with a specific activity of 15.4 U mg⁻¹.
The results of this study may be foundational for future high throughput fungal enzyme screenings, including substrate spectra analysis and mutant screenings.
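Specific activity as reported above is simply total enzyme activity divided by protein mass. The figures below are hypothetical and only illustrate the arithmetic, not the study's measurements:

```python
def specific_activity(total_units, protein_mg):
    """Specific activity in U/mg: enzyme units per milligram of protein.
    One unit (U) typically converts 1 umol of substrate per minute."""
    return total_units / protein_mg

# Hypothetical measurement (not values from the study):
print(specific_activity(total_units=7.7, protein_mg=0.5))  # 15.4 U/mg
```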
Brain activation during active balancing and its behavioral relevance in younger and older adults
(2022)
Age-related deterioration of balance control is widely regarded as an important phenomenon influencing quality of life and longevity, such that a more comprehensive understanding of the neural mechanisms underlying this process is warranted.
Specifically, previous studies have reported that older adults typically show higher neural activity during balancing as compared to younger counterparts, but the implications of this finding on balance performance remain largely unclear.
Using functional near-infrared spectroscopy (fNIRS), differences in the cortical control of balance between healthy younger (n = 27) and older (n = 35) adults were explored.
More specifically, the association between cortical functional activity and balance performance across and within age groups was investigated. To this end, we measured hemodynamic responses (i.e., changes in oxygenated and deoxygenated hemoglobin) while participants balanced on an unstable device.
As criterion variables for brain-behavior-correlations, we also assessed postural sway while standing on a free-swinging platform and while balancing on wobble boards with different levels of difficulty.
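fNIRS derives changes in oxygenated and deoxygenated hemoglobin from measured optical density changes, conventionally via the modified Beer-Lambert law over two wavelengths. The exact coefficients a given instrument uses are not stated in the abstract and are assumptions here; a minimal sketch of the inversion step:

```python
def mbll_concentrations(d_od1, d_od2, eps, d, dpf):
    """Invert the modified Beer-Lambert law for two wavelengths.
    d_od1, d_od2: optical density changes at wavelengths 1 and 2
    eps: extinction coefficients [[eHbO_l1, eHbR_l1], [eHbO_l2, eHbR_l2]]
    d: source-detector distance; dpf: differential pathlength factors.
    Returns (dHbO, dHbR); units follow the coefficients supplied."""
    a = d_od1 / (d * dpf[0])
    b = d_od2 / (d * dpf[1])
    (e11, e12), (e21, e22) = eps
    det = e11 * e22 - e12 * e21
    d_hbo = (a * e22 - b * e12) / det
    d_hbr = (b * e11 - a * e21) / det
    return d_hbo, d_hbr
```

Solving this 2x2 system per channel and time point yields the oxygenated/deoxygenated hemoglobin time courses used as measures of cortical activity.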
We found that older compared to younger participants had higher activity in prefrontal and lower activity in postcentral regions.
Subsequent robust regression analyses revealed that lower prefrontal brain activity was related to improved balance performance across age groups, indicating that higher activity of the prefrontal cortex during balancing reflects neural inefficiency.
We also present evidence that age serves as a moderator in the relationship between brain activity and balance, i.e., cortical hemodynamics generally appears to be a more important predictor of balance performance in older than in younger adults. Strikingly, we found that age differences in balance performance are mediated by balancing-induced activation of the superior frontal gyrus, suggesting that differential activation of this region reflects a mechanism involved in the aging of the neural control of balance.
Our study suggests that differences in functional brain activity between age groups are not a mere by-product of aging but are of direct behavioral relevance for balance performance.
Potential implications of these findings in terms of early detection of fall-prone individuals and intervention strategies targeting balance and healthy aging are discussed.
In this report, we investigate small proteins involved in bacterial alternative respiratory systems that improve enzymatic efficiency through better anchorage and multimerization of membrane components. Using the small protein TorE of the respiratory TMAO reductase system as a model, we discovered that TorE is part of a subfamily of small proteins present in proteobacteria, in which they play a similar role for bacterial respiratory systems. We reveal by microscopy that, in Shewanella oneidensis MR-1, alternative respiratory systems are evenly distributed in the membrane, contrary to what has been described for Escherichia coli. Thus, the better efficiency of the respiratory systems observed in the presence of the small proteins is not due to a specific localization in the membrane, but rather to the formation of membranous complexes formed by TorE homologs with their c-type cytochrome partner protein. By an in vivo approach combining Clear Native electrophoresis and fluorescent translational fusions, we determined the 4:4 stoichiometry of the complexes. In addition, mild solubilization of the cytochrome indicates that the presence of the small protein reinforces its anchoring to the membrane. Therefore, assembly of the complex induced by this small protein improves the efficiency of the respiratory system.
Dementia, as one of the most prevalent diseases, urges a better understanding of the central mechanisms responsible for clinical symptoms and necessitates improvement of actual diagnostic capabilities. The brainstem nucleus locus coeruleus (LC) is a promising target for early diagnosis because of its early structural alterations and its relationship to the functional disturbances in the patients. In this study, we applied our improved method of localisation-based LC resting-state fMRI to investigate the differences in central sensory signal processing when comparing functional connectivity (fc) of a patient group with mild cognitive impairment (MCI, n = 28) and an age-matched healthy control group (n = 29). MCI and control participants could be differentiated by their Mini-Mental State Examination (MMSE) scores (p < .001) and LC intensity ratio (p = .010). In the fMRI, LC fc to the anterior cingulate cortex (FDR p < .001) and left anterior insula (FDR p = .012) was elevated, and LC fc to the right temporoparietal junction (rTPJ, FDR p = .012) and posterior cingulate cortex (PCC, FDR p = .021) was decreased in the patient group. Importantly, LC to rTPJ connectivity was also positively correlated with MMSE scores in MCI patients (p = .017). Furthermore, we found a hyperactivation of the left-insula salience network in the MCI patients. Our results and our proposed disease model shed new light on the functional pathogenesis of MCI by pointing to attentional network disturbances, which could aid new therapeutic strategies and provide a marker for diagnosis and prediction of disease progression.
Background
Ankle sprain is the most common injury in basketball. Chronic ankle instability (CAI), which can develop from an acute ankle sprain, may negatively affect quality of life and ankle functionality and increase the risk of recurrent ankle sprains and post-traumatic osteoarthritis. To facilitate a preventative strategy for CAI in the basketball population, gathering epidemiological data is essential. However, epidemiological data on CAI in basketball are limited. Therefore, this study aims to investigate the prevalence of CAI in basketball athletes and to determine whether gender, competitive level, and playing position influence this prevalence.
Methods
In a cross-sectional study, a total of 391 Taiwanese basketball athletes from universities and sports clubs participated. In addition to non-standardized questions about demographics and their history of ankle sprains, participants filled out the standard Cumberland Ankle Instability Tool, which was applied to determine the presence of ankle instability. Questionnaires from 255 collegiate and 133 semi-professional basketball athletes (male = 243, female = 145; 22.3 ± 3.8 years; 23.3 ± 2.2 kg/m²) were analyzed. Differences in prevalence between gender, competitive level and playing position were determined using the chi-square test.
Results
In the surveyed cohort, 26% had unilateral CAI, while 50% of them had bilateral CAI. Women had a higher prevalence than men in the whole surveyed cohort (χ²(1) = 0.515, p = 0.003). Sub-analyses showed the same gender disparity: collegiate female athletes had a higher prevalence than collegiate male athletes (χ²(1) = 0.203, p = 0.001). Prevalence did not differ between competitive levels (p > 0.05) or among playing positions (p > 0.05).
Conclusions
CAI is highly prevalent in the basketball population, and gender affects its prevalence. Regardless of competitive level and playing position, the prevalence of CAI is similar. The characteristics of basketball contribute to this high prevalence. Prevention of CAI should therefore be a focus in basketball, and gender should be taken into consideration when applying CAI prevention measures.
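The gender comparison above rests on a chi-square test of a 2×2 table (CAI yes/no by gender). A minimal pure-Python sketch of the Pearson statistic; the counts below are hypothetical, not the study's data:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (df = 1) for the 2x2 table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = female/male, columns = CAI yes/no
print(chi2_2x2(80, 65, 100, 143))
```

In practice the p-value is then read from the chi-square distribution with 1 degree of freedom (e.g., via scipy.stats.chi2.sf).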
Background
Anticancer compound 3-bromopyruvate (3-BrPA) suppresses cancer cell growth via targeting glycolytic and mitochondrial metabolism. The malignant peripheral nerve sheath tumor (MPNST), a very aggressive, therapy-resistant neoplasia associated with neurofibromatosis type 1 (NF1), shows high metabolic activity, and affected patients may therefore benefit from 3-BrPA treatment. To elucidate the specific mode of action, we used a controlled cell model overexpressing proteasome activator (PA) 28, which subsequently leads to p53 inactivation and oncogenic transformation and thereby reproduces an important pathway in MPNST and general tumor pathogenesis.
Methods
Viability of MPNST cell lines S462, NSF1, and T265 in response to increasing doses (0–120 µM) of 3-BrPA was analyzed by CellTiter-Blue® assay. Additionally, we investigated viability, reactive oxygen species (ROS) production (dihydroethidium assay), nicotinamide adenine dinucleotide dehydrogenase activity (NADH-TR assay) and lactate production (lactate assay) in mouse B8 fibroblasts overexpressing PA28 in response to 3-BrPA application. For all experiments normal and nutrient-deficient conditions were tested. MPNST cell lines were furthermore characterized immunohistochemically for Ki67, p53, bcl2, bcl6, cyclin D1, and p21.
Results
MPNST cells responded significantly and dose-dependently to 3-BrPA application, with S462 cells being the most responsive. Human control cells showed reduced sensitivity. In the PA28-overexpressing cancer cell model, 3-BrPA application mildly impaired mitochondrial NADH dehydrogenase activity but failed to significantly inhibit lactate production. PA28 overexpression was associated with functional glycolysis as well as partial resistance to stress provoked by nutrient deprivation. 3-BrPA treatment was not associated with an increase in ROS. Starvation sensitized MPNST cells to treatment.
Conclusions
Aggressive MPNST cells are sensitive to 3-BrPA therapy in vitro, with and without starvation. In a PA28-overexpression cancer cell model leading to p53 inactivation, thereby reflecting a key molecular feature of human NF1-associated MPNST, the known capacity of 3-BrPA to block mitochondrial activity and glycolysis was reproduced; however, oncogenic cells displayed partial resistance. In conclusion, 3-BrPA was sufficient to reduce the viability of NF1-associated MPNST cells, potentially due to inhibition of glycolysis, which should prompt further studies and promises a potential benefit for NF1 patients.
Introduction
Attempts to improve cognitive abilities via transcranial direct current stimulation (tDCS) have led to ambiguous results, likely due to the method's susceptibility to methodological and inter-individual factors. Conventional tDCS, i.e., using an active electrode over brain areas associated with the targeted cognitive function and a supposedly passive reference, neglects stimulation effects on entire neural networks.
Methods
We investigated the advantage of frontoparietal network stimulation (right prefrontal anode, left posterior parietal cathode) over conventional and sham tDCS in modulating working memory (WM) capacity-dependent transfer effects of a single-session distractor inhibition (DIIN) training. Since previous results did not clarify whether electrode montage drives this individual transfer, we here compared conventional to frontoparietal and sham tDCS and reanalyzed the data of 124 young, healthy participants in a more robust way using linear mixed-effects modeling.
Results
The interaction of electrode montage and WM capacity resulted in systematic differences in transfer effects. While higher performance gains were observed with increasing WM capacity in the frontoparietal stimulation group, low WM capacity individuals benefited more in the sham condition. The conventional stimulation group showed subtle performance gains independent of WM capacity.
Discussion
Our results confirm our previous findings of WM capacity-dependent transfer effects on WM by a single-session DIIN training combined with tDCS and additionally highlight the pivotal role of the specific electrode montage. WM capacity-dependent differences in frontoparietal network recruitment, especially regarding the parietal involvement, are assumed to underlie this observation.
Symptom Checker Applications (SCA) are mobile applications often designed for the end-user to assist with symptom assessment and self-triage. SCA are meant to provide the user with easily accessible information about their own health conditions.
However, SCA raise questions regarding ethical, legal, and social aspects (ELSA), for example, regarding fair access to this new technology.
The aim of this scoping review is to identify the ELSA of SCA in the scientific literature. A scoping review was conducted to identify the ELSA of SCA. Ten databases (e.g., Web of Science and PubMed) were used. Studies on SCA that address ELSA, written in English or German, were included in the review.
The ELSA of SCA were extracted and synthesized using qualitative content analysis. A total of 25,061 references were identified, of which 39 were included in the analysis. The identified aspects were allotted to three main categories: (1) Technology; (2) Individual Level; and (3) Healthcare system.
The results show that there are controversial debates in the literature on the ethical and social challenges of SCA usage. Furthermore, the debates are characterised by a lack of a specific legal perspective and empirical data.
The review provides an overview on the spectrum of ELSA regarding SCA. It offers guidance to stakeholders in the healthcare system, for example, patients, healthcare professionals, and insurance providers and could be used in future empirical research to investigate the perspectives of those affected, such as users.
Boredom has been identified as one of the greatest psychological challenges when staying at home during quarantine and isolation. However, this does not mean that the situation necessarily causes boredom. On the basis of 13 explorative interviews with bored and non-bored persons who have been under quarantine or in isolation, we explain why boredom is related to a subjective interpretation process rather than being a direct consequence of the objective situation. Specifically, we show that participants vary significantly in their interpretations of staying at home and, thus, also in their experience of boredom. While the non-bored participants interpret the situation as a relief or as irrelevant, the bored participants interpret it as a major restriction that only some are able to cope with.
Association of primary allostatic load mediators and metabolic syndrome (MetS): A systematic review
(2022)
Allostatic load (AL) exposure may cause detrimental effects on the neuroendocrine system, leading to metabolic syndrome (MetS). The primary mediators of AL include serum dehydroepiandrosterone sulfate (DHEAS; a functional HPA axis antagonist) as well as cortisol and urinary norepinephrine (NE) and epinephrine (EPI) excretion levels (assessed in 12-h urine as the gold standard for evaluating HPA axis and sympathetic nervous system activity). However, evidence for an association between the primary mediators of AL and MetS is limited. This systematic review aimed to critically examine the association between the primary mediators of AL and MetS. PubMed and Web of Science were searched for articles published in English from January 2010 to December 2021. The search strategy focused on cross-sectional and case–control studies comprising adult participants with MetS, obesity, or overweight and without chronic diseases. The STROBE checklist was used for study quality control. Of 770 studies, twenty-one studies with a total sample size of n = 10,666 met the eligibility criteria. Eighteen studies were cross-sectional and three were case–control studies. The included studies had a completeness of reporting score of COR % = 87.0 ± 6.4%. Notably, cortisol as a primary mediator of AL showed an association with MetS in 50% (urinary cortisol), 40% (serum cortisol), 60% (salivary cortisol), and 100% (hair cortisol) of the studies. For DHEAS, 60% of the studies showed an association with MetS. In contrast, urinary EPI and urinary NE showed no association with MetS in any study. In summary, there is a tendency for an association between higher serum, salivary, urinary, and hair cortisol, as well as lower levels of DHEAS, and MetS. Future studies focusing on longitudinal data are warranted to clarify and understand the association between the primary mediators of AL and MetS.
Basic psychological needs theory postulates that a social environment that satisfies individuals' three basic psychological needs of autonomy, competence, and relatedness leads to optimal growth and well-being. The frustration of these needs, on the other hand, is associated with ill-being and depressive symptoms, which has foremost been investigated in non-clinical samples; research on need frustration in clinical samples remains scarce. Survey data of adult individuals with major depressive disorder (MDD; n = 115; 48.69% female; 38.46 years, SD = 10.46) were compared with those of a non-depressed comparison sample (n = 201; 53.23% female; 30.16 years, SD = 12.81). Need profiles were examined with a linear mixed model (LMM). Individuals with depression reported higher levels of frustration and lower levels of satisfaction in relation to the three basic psychological needs than non-depressed adults. The difference between depressed and non-depressed groups was significantly larger for frustration than for satisfaction regarding the needs for relatedness and competence. LMM correlation parameters confirmed the expected positive correlation between the three needs. This is the first study showing substantial differences in need-based experiences between depressed and non-depressed adults. The results confirm basic assumptions of self-determination theory and have preliminary implications for tailoring therapy for depression.
Background
Millions of people in Germany suffer from chronic pain, whose course and intensity are multifactorial. Besides physical injuries, certain psychosocial risk factors are involved in the disease process. The national health care guidelines for the diagnosis and treatment of non-specific low back pain recommend screening for psychosocial risk factors as early as possible in order to adapt the therapy to patient needs (e.g., unimodal or multimodal treatment). However, such a procedure has been difficult to implement in practice and has not yet been integrated into rehabilitation care structures across the country.
Methods
The aim of this study is to implement an individualized therapy and aftercare program within the rehabilitation offer of the German Pension Insurance in the area of orthopedics and to examine its success and sustainability in comparison to the previous standard aftercare program.
The study is a multicenter randomized controlled trial including 1204 patients from six orthopedic rehabilitation clinics, with a 2:1 allocation ratio to the intervention group (individualized and home-based rehabilitation aftercare) versus the control group (regular outpatient rehabilitation aftercare). Upon admission to the rehabilitation clinic, participants in the intervention group will be screened for their psychosocial risk profile. Depending on this profile, they then receive either a unimodal or a multimodal aftercare program, together with an individualized training program. The program is instructed in the clinic (approximately 3 weeks) and is afterwards continued independently at home for 3 months. The success of the program is examined by means of a total of four surveys. The co-primary outcomes are the Characteristic Pain Intensity and Disability scores assessed by the German version of the Chronic Pain Grade questionnaire (CPG).
Discussion
An improvement in terms of pain, work ability, patient compliance, and acceptance in our intervention program compared to the standard aftercare is expected. The study contributes to provide individualized care also to patients living far away from clinical centers.
Trial registration
DRKS, DRKS00020373. Registered on 15 April 2020
Development of chronic pain after a low back pain episode is associated with increased pain sensitivity, altered pain processing mechanisms and the influence of psychosocial factors. Although there is some evidence that multimodal therapy (such as behavioral or motor control therapy) may be an important therapeutic strategy, its long-term effect on pain reduction and psychosocial load is still unclear. Prospective longitudinal designs providing information about the extent of such possible long-term effects are missing. This study aims to investigate the long-term effects of a home-based uni- and multidisciplinary motor control exercise program on low back pain intensity, disability and psychosocial variables. Fourteen months after completion of a multicenter study comparing uni- and multidisciplinary exercise interventions, a sample of one study center (n = 154) was assessed once more. Participants filled in questionnaires regarding their low back pain symptoms (characteristic pain intensity and related disability), stress and vital exhaustion (short version of the Maastricht Vital Exhaustion Questionnaire), anxiety and depression experiences (the Hospital Anxiety and Depression Scale), and pain-related cognitions (the Fear Avoidance Beliefs Questionnaire). Repeated measures mixed ANCOVAs were calculated to determine the long-term effects of the interventions on characteristic pain intensity and disability as well as on the psychosocial variables. Fifty-four percent of the sub-sample responded to the questionnaires (n = 84). Longitudinal analyses revealed a significant long-term effect of the exercise intervention on pain disability. The multidisciplinary group missed statistical significance yet showed a medium-sized long-term effect. The groups did not differ in their changes of the psychosocial variables of interest. There was evidence of long-term effects of the interventions on pain-related disability, but there was no effect on the other variables of interest.
This may be partially explained by participants' low comorbidity at baseline. The results are important with regard to cost-free home-based alternatives for back pain patients and for prevention tasks. Furthermore, this study closes the gap of missing long-term effect analyses in this field.
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression compared to HIV-negative individuals. This cross-sectional study (n = 450, median age 44 (19-75), 7.3% females) evaluates the prevalence rates and prevalence ratio (PR) of anxiety and/or depression in PLWH associated with recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 0.36-0.91; p = 0.01), depression (PR = 0.41; 0.36-0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 0.24-0.75; p = 0.002) was found in exercising compared to non-exercising PLWH. Recreational exercise is associated with a lower risk for anxiety and/or depression. Further prospective studies are needed to provide insights on the direction of this association.
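A prevalence ratio like those reported above is the prevalence of the outcome among exposed (here, exercising) participants divided by the prevalence among unexposed participants, usually with a Wald-type confidence interval on the log scale. A sketch with made-up counts, not the study's data:

```python
import math

def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp, z=1.96):
    """Prevalence ratio with an approximate 95% Wald CI on the log scale."""
    p1 = cases_exp / n_exp      # prevalence among exposed
    p0 = cases_unexp / n_unexp  # prevalence among unexposed
    pr = p1 / p0
    se = math.sqrt((1 - p1) / cases_exp + (1 - p0) / cases_unexp)
    return pr, math.exp(math.log(pr) - z * se), math.exp(math.log(pr) + z * se)

# Hypothetical counts: 30/200 exercising vs 60/250 non-exercising
# participants screening positive for the outcome
pr, lo, hi = prevalence_ratio(30, 200, 60, 250)
```

A PR below 1 with a CI excluding 1 indicates a lower prevalence of the outcome in the exposed group.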
OBJECTIVE: For an effective control of the SARS-CoV-2 pandemic with vaccines, most people in a population need to be vaccinated. It is thus important to know how to inform the public with reference to individual preferences–while also acknowledging the societal preference to encourage vaccinations. According to the health care standard of informed decision-making, a comparison of the benefits and harms of (not) having the vaccination would be required to inform undecided and skeptical people. To test evidence-based fact boxes, an established risk communication format, and to inform their development, we investigated their contribution to knowledge and evaluations of COVID-19 vaccines.
METHODS: We conducted four studies (1, 2, and 4 were population-wide surveys with N = 1,942 to N = 6,056): Study 1 assessed the relationship between vaccination knowledge and intentions in Germany over three months. Study 2 assessed respective information gaps and needs of the population in Germany. In parallel, an experiment (Study 3) with a mixed design (presentation formats; pre-post-comparison) assessed the effect of fact boxes on risk perceptions and fear, using a convenience sample (N = 719). Study 4 examined how effective two fact box formats are for informing vaccination intentions, with a mixed experimental design: between-subjects (presentation formats) and within-subjects (pre-post-comparison).
RESULTS: Study 1 showed that vaccination knowledge and vaccination intentions increased between November 2020 and February 2021. Study 2 revealed objective information requirements and subjective information needs. Study 3 showed that the fact box format is effective in adjusting risk perceptions concerning COVID-19. Based on those results, fact boxes were revised and implemented with the help of a national health authority in Germany. Study 4 showed that simple fact boxes increase vaccination knowledge and positive evaluations in skeptics and undecideds.
CONCLUSION: Fact boxes can inform COVID-19 vaccination intentions of undecided and skeptical people without threatening societal vaccination goals of the population.
Background: The relationship between exercise-induced intratendinous blood flow (IBF) and tendon pathology or training exposure is unclear.
Objective: This study investigates the acute effect of running exercise on sonographic detectable IBF in healthy and tendinopathic Achilles tendons (ATs) of runners and recreational participants.
Methods: 48 participants (43 ± 13 years, 176 ± 9 cm, 75 ± 11 kg) performed a standardized submaximal 30-min constant load treadmill run with Doppler ultrasound “Advanced dynamic flow” examinations before (Upre) and 5, 30, 60, and 120 min (U5-U120) afterward. Included were runners (>30 km/week) and recreational participants (<10 km/week) with healthy (Hrun, n = 10; Hrec, n = 15) or tendinopathic (Trun, n = 13; Trec, n = 10) ATs. IBF was assessed by counting number [n] of intratendinous vessels. IBF data are presented descriptively (%, median [minimum to maximum range] for baseline-IBF and IBF-difference post-exercise). Statistical differences for group and time point IBF and IBF changes were analyzed with Friedman and Kruskal-Wallis ANOVA (α = 0.05).
Results: At baseline, IBF was detected in 40% (3 [1–6]) of Hrun, in 53% (4 [1–5]) of Hrec, in 85% (3 [1–25]) of Trun, and 70% (10 [2–30]) of Trec. At U5 IBF responded to exercise in 30% (3 [−1–9]) of Hrun, in 53% (4 [−2–6]) of Hrec, in 70% (4 [−10–10]) of Trun, and in 80% (5 [1–10]) of Trec. While IBF in 80% of healthy responding ATs returned to baseline at U30, IBF remained elevated until U120 in 60% of tendinopathic ATs. Within groups, IBF changes from Upre-U120 were significant for Hrec (p < 0.01), Trun (p = 0.05), and Trec (p < 0.01). Between groups, IBF changes in consecutive examinations were not significantly different (p > 0.05) but IBF-level was significantly higher at all measurement time points in tendinopathic versus healthy ATs (p < 0.05).
Conclusion: Irrespective of training status and tendon pathology, running leads to an immediate increase of IBF in responding tendons. This increase occurs shortly in healthy and prolonged in tendinopathic ATs. Training exposure does not alter IBF occurrence, but IBF level is elevated in tendon pathology. While an immediate exercise-induced IBF increase is a physiological response, prolonged IBF is considered a pathological finding associated with Achilles tendinopathy.
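The between-group comparisons above use the rank-based Kruskal-Wallis ANOVA. A minimal pure-Python sketch of the H statistic (without the tie correction that full implementations such as scipy.stats.kruskal apply):

```python
def rankdata(values):
    """Average ranks (1-based); tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic over the pooled ranks (no tie correction)."""
    pooled = [v for g in groups for v in g]
    ranks = rankdata(pooled)
    n = len(pooled)
    h, idx = 0.0, 0
    for g in groups:
        r = sum(ranks[idx:idx + len(g)])
        h += r * r / len(g)
        idx += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)
```

The p-value is then obtained from the chi-square distribution with k−1 degrees of freedom (k groups), which is what a library routine would report.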
The reliability of quantifying intratendinous vascularization by high-sensitivity Doppler ultrasound advanced dynamic flow has not been examined yet. Therefore, this study aimed to investigate the intraobserver and interobserver reliability of evaluating Achilles tendon vascularization by advanced dynamic flow using established scoring systems. Methods: Three investigators evaluated vascularization in 67 recordings in a test-retest design, applying the Ohberg score, a modified Ohberg score, and a counting score. Intraobserver and interobserver agreement for the Ohberg score and modified Ohberg score was analyzed by the Cohen kappa and Fleiss kappa coefficients (absolute), the Kendall tau-b coefficient, and the Kendall coefficient of concordance (W; relative). The reliability of the counting score was analyzed by intraclass correlation coefficients (ICC) 2.1 and 3.1, the standard error of measurement (SEM), and Bland-Altman analysis (bias and limits of agreement [LoA]). Results: Intraobserver and interobserver agreement (absolute/relative) ranged from 0.61 to 0.87/0.87 to 0.95 and 0.11 to 0.66/0.76 to 0.89 for the Ohberg score and from 0.81 to 0.87/0.92 to 0.95 and 0.64 to 0.80/0.88 to 0.93 for the modified Ohberg score, respectively. The counting score revealed an intraobserver ICC of 0.94 to 0.97 (SEM, 1.0-1.5; bias, -1; LoA, 3-4 vessels). The interobserver ICC for the counting score ranged from 0.91 to 0.98 (SEM, 1.0-1.9; bias, 0; LoA, 3-5 vessels). Conclusions: The modified Ohberg score and counting score showed excellent reliability and seem convenient for research and clinical practice. The Ohberg score revealed decent intraobserver but unexpectedly low interobserver reliability and therefore cannot be recommended.
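Absolute agreement for the categorical Ohberg scores above was quantified with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch with illustrative ratings:

```python
def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical ratings of equal length:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Illustrative ratings (e.g., vascularization grades 0-2) from two observers:
print(cohen_kappa([0, 1, 2, 1, 0, 2], [0, 1, 2, 2, 0, 1]))
```

Because kappa discounts chance agreement, it is lower than raw percent agreement whenever the rating distribution makes accidental matches likely.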
Following stroke, neuronal death takes place both in the infarct region and in brain areas distal to the lesion site, including the hippocampus. The hippocampus is critically involved in learning and memory processes and continuously generates new neurons. Dysregulation of adult neurogenesis may be associated with cognitive decline after a stroke lesion. In particular, proliferation of precursor cells and the formation of new neurons are increased after lesion. Within the first week, many new precursor cells die during development. How dying precursors are removed from the hippocampus, and to what extent phagocytosis takes place after stroke, is still not clear. Here, we evaluated the effect of a prefrontal stroke lesion on the phagocytic activity of microglia in the dentate gyrus (DG) of the hippocampus. Three-month-old C57BL/6J mice were injected once with the proliferation marker BrdU (250 mg/kg) 6 hr after a middle cerebral artery occlusion or sham surgery. The number of apoptotic cells and the phagocytic capacity of the microglia were evaluated by means of immunohistochemistry, confocal microscopy, and 3D reconstructions. We found a transient but significant increase in the number of apoptotic cells in the DG early after stroke, associated with impaired removal by microglia. Interestingly, phagocytosis of newly generated precursor cells was not affected. Our study shows that a prefrontal stroke lesion affects phagocytosis of apoptotic cells in the DG, a region distal to the lesion core. Whether disturbed phagocytosis might contribute to inflammatory and maladaptive processes, including cognitive impairment following stroke, needs to be further investigated.
Cardiac rehabilitation
(2021)
The investigation of protein structures, functions and interactions often requires modifications to adapt protein properties to the specific application. Among many possible methods to equip proteins with new chemical groups, the utilization of orthogonal aminoacyl-tRNA synthetase/tRNA pairs enables the site-specific incorporation of non-canonical amino acids at defined positions in the protein. The open nature of cell-free protein synthesis reactions provides an optimal environment, as the orthogonal components do not need to be transported across the cell membrane and the impact on cell viability is negligible. In the present work, it was shown that the expression of orthogonal aminoacyl-tRNA synthetases in CHO cells prior to cell disruption enhanced the modification of the pharmaceutically relevant adenosine A2a receptor. For this purpose, in complement to transient transfection of CHO cells, an approach based on CRISPR/Cas9 technology was selected to generate a translationally active cell lysate harboring endogenous orthogonal aminoacyl-tRNA synthetase.
Background: Patients with subjective cognitive decline (SCD) report memory deterioration and are at an increased risk of converting to Alzheimer's disease (AD), although psychophysical testing does not reveal any cognitive deficit.
Objective: Here, gustatory function is investigated as a potential predictor for an increased risk of progressive cognitive decline indicating higher AD risk in SCD.
Methods: Measures of smell and taste perception as well as neuropsychological data were assessed in patients with subjective cognitive decline (SCD). Subgroups with an increased likelihood of progression to preclinical AD (SCD+) and those with a lower likelihood (SCD-) were compared to healthy controls (HC), patients with mild cognitive impairment, and AD patients. The Sniffin' Sticks test comprised 12 odor items of different qualities, and taste was measured with 32 taste strips (sweet, salty, bitter, sour) of different concentrations.
Results: Only taste was able to distinguish between HC/SCD- and SCD+ patients.
Conclusion: This study provides a first hint that taste is a more sensitive marker than smell for detecting preclinical AD in SCD. Longitudinal observation of cognition and pathology is necessary to further evaluate taste perception as a predictor of pathological objective decline in cognition.
Emotional memories are better remembered than neutral ones, but the mechanisms leading to this memory bias are not yet well understood in humans. Based on animal research, it is suggested that the memory-enhancing effect of emotion is based on central noradrenergic release, which is triggered by afferent vagal nerve activation. To test the causal link between vagus nerve activation and emotional memory in humans, we applied continuous noninvasive transcutaneous auricular vagus nerve stimulation (taVNS) during exposure to emotionally arousing and neutral scenes and tested subsequent long-term recognition memory after 1 week. We found that taVNS, compared with sham, increased recollection-based memory performance for emotional, but not neutral, material. These findings were complemented by larger recollection-related brain potentials (parietal ERP Old/New effect) during retrieval of emotional scenes encoded under taVNS, compared with sham. Furthermore, brain potentials recorded during encoding also revealed that taVNS facilitated early attentional discrimination between emotional and neutral scenes. Extending animal research, our behavioral and neural findings confirm a modulatory influence of the vagus nerve on emotional memory formation in humans.
Electrical muscle stimulation (EMS) is an increasingly popular training method and has become a focus of research in recent years. New EMS devices offer a wide range of mobile applications for whole-body EMS (WB-EMS) training, e.g., the intensification of dynamic low-intensity endurance exercises through WB-EMS. The present study aimed to determine the differences in exercise intensity between conventional and WB-EMS-superimposed walking, and between conventional and WB-EMS-superimposed Nordic walking (WB-EMS-NW), during a treadmill test. Eleven participants (52.0 ± years; 85.9 ± 7.4 kg; 182 ± 6 cm; BMI 25.9 ± 2.2 kg/m2) performed a 10 min treadmill test at a given velocity (6.5 km/h) in four test situations: walking (W) and Nordic walking (NW), each in conventional and WB-EMS-superimposed form. Oxygen uptake, absolute (VO2) and relative to body weight (rel. VO2), lactate, and the rating of perceived exertion (RPE) were measured before and after the test. WB-EMS intensity was adjusted individually according to the feedback of the participant. Descriptive statistics are given as mean ± SD. For the statistical analyses, one-factorial ANOVA for repeated measures and two-factorial ANOVA [factors: EMS, W/NW, and their combination (EMS*W/NW)] were performed (α = 0.05). Significant effects were found for the EMS and W/NW factors for the outcome variables VO2 (EMS: p = 0.006, r = 0.736; W/NW: p < 0.001, r = 0.870), relative VO2 (EMS: p < 0.001, r = 0.850; W/NW: p < 0.001, r = 0.937), and lactate (EMS: p = 0.003, r = 0.771; W/NW: p = 0.003, r = 0.764), with both factors producing higher results. However, the differences in VO2 and relative VO2 are within the range of biological variability of ±12%. The factor combination EMS*W/NW was statistically non-significant for all three variables. WB-EMS resulted in higher RPE values (p = 0.035, r = 0.613); RPE differences for W/NW and EMS*W/NW were not significant.
The current study results indicate that WB-EMS influences the parameters of exercise intensity. The impact on exercise intensity and the clinical relevance of WB-EMS-superimposed walking (WB-EMS-W) exercise is questionable because of the marginal differences in the outcome variables.
Unspecific peroxygenases (UPOs, EC 1.11.2.1) are fungal enzymes that catalyze the oxyfunctionalization of non-activated hydrocarbons, making them valuable biocatalysts. Despite the increasing interest in UPOs that has led to the identification of thousands of putative UPO genes, only a few of these have been successfully expressed and characterized.
There is currently no universal expression system in place to explore their full potential. Cell-free protein synthesis has proven to be a sophisticated technique for the synthesis of difficult-to-express proteins.
In this work, we aimed to establish an insect-based cell-free protein synthesis (CFPS) platform to produce UPOs. CFPS relies on translationally active cell lysates rather than living cells.
The system parameters can thus be directly manipulated without having to account for cell viability, thereby making it highly adaptable.
The insect-based lysate contains translocationally active, ER-derived vesicles, called microsomes.
These microsomes have been shown to allow efficient translocation of proteins into their lumen, promoting post-translational modifications such as disulfide bridge formation and N-glycosylations.
In this study, the ability of a redox-optimized, vesicle-based, eukaryotic CFPS system to synthesize functional UPOs was explored. The influence of different reaction parameters, as well as the influence of translocation on enzyme activity, was evaluated for a short UPO from Marasmius rotula and a long UPO from Agrocybe aegerita.
The capability of the CFPS system described here was demonstrated by the successful synthesis of a novel UPO from Podospora anserina, thus qualifying CFPS as a promising tool for the identification and evaluation of novel UPOs and variants thereof.
Coronary artery disease (CAD) is the leading cause of death worldwide.
Statins reduce morbidity and mortality of CAD. Intake of n-3 polyunsaturated fatty acid (n-3 PUFAs), particularly eicosapentaenoic acid (EPA), is associated with reduced morbidity and mortality in patients with CAD. Previous data indicate that a higher conversion of precursor fatty acids (FAs) to arachidonic acid (AA) is associated with increased CAD prevalence.
Our study explored the FA composition in blood to assess n-3 PUFA levels in patients with and without CAD. We analyzed blood samples from 273 patients undergoing cardiac catheterization. Patients were stratified into those with clinically relevant CAD (n = 192) and those without (n = 81). FA analysis in full blood was performed by gas chromatography. The delta-5 desaturase index (D5D index), the ratio of dihomo-gamma-linolenic acid (DGLA) to AA, was higher in CAD patients, indicating increased formation of AA from precursors. CAD patients also had significantly lower blood levels of omega-6 polyunsaturated FAs (n-6 PUFAs) and n-3 PUFAs, particularly EPA.
Thus, our study supports a role of increased EPA levels for cardioprotection.
Training intervention effects on cognitive performance and neuronal plasticity — A pilot study
(2022)
Studies suggest that people suffering from chronic pain may show altered brain plasticity, along with altered functional connectivity between pain-processing brain regions. These changes may be related to decreased mood and cognitive performance. There is some debate as to whether physical activity combined with behavioral therapy (e.g., cognitive distraction, body scan) may counteract these changes; however, the underlying neuronal mechanisms are unclear. The aim of the current pilot study, with a three-armed randomized controlled trial design, was to examine the effects of sensorimotor training for non-specific chronic low back pain on (1) cognitive performance; (2) fMRI activity co-fluctuations (functional connectivity) between pain-related brain regions; and (3) the relationship between functional connectivity and subjective variables (pain and depression). A total of 662 volunteers with non-specific chronic low back pain were randomly allocated to a unimodal (sensorimotor training) intervention, a multidisciplinary (sensorimotor training and behavioral therapy) intervention, or a control group within a multicenter study. A subsample of patients (n = 21) from one study center participated in the pilot study presented here. Measurements took place at baseline, during the intervention (3 weeks, M2), and after the intervention (12 weeks, M4, and 24 weeks, M5). Cognitive performance was measured by the Trail Making Test and functional connectivity by fMRI. Pain perception and depression were assessed by the Von Korff questionnaire and the Hospital Anxiety and Depression Scale. Group differences were calculated by univariate and repeated-measures ANOVA and Bayesian statistics; correlations by Pearson's r. Changes and correlations in functional connectivity were analyzed within a pooled intervention group (uni- and multidisciplinary groups). Results revealed that participants with higher pain intensity at baseline showed higher functional connectivity between the pain-related brain areas used as ROIs in this study.
Though small sample sizes limit generalization, cognitive performance increased in the multimodal group. Increased functional connectivity was observed in participants with increased pain ratings. Pain ratings and connectivity in pain-related brain regions decreased after the intervention. The results provide preliminary indication that intervention effects can potentially be achieved on the cognitive and neuronal level. The intervention may be suitable for therapy and prevention of non-specific chronic low back pain.
Dysfunctional islets of Langerhans are a hallmark of type 2 diabetes (T2D). We hypothesize that differences in islet gene expression and alternative splicing, which can contribute to altered protein function, also participate in islet dysfunction. RNA sequencing (RNAseq) data from islets of obese diabetes-resistant and diabetes-susceptible mice were analyzed for alternative splicing and its putative genetic and epigenetic modulators. We focused on the expression levels of chromatin modifiers and on SNPs in regulatory sequences. We identified alternative splicing events in islets of diabetes-susceptible mice, among others in genes linked to insulin secretion, endocytosis, or ubiquitin-mediated proteolysis pathways. The expression of 54 histone and chromatin modifiers, which may modulate splicing, was markedly downregulated in islets of diabetic animals. Furthermore, diabetes-susceptible mice carry SNPs in RNA-binding protein motifs and in splice sites potentially responsible for alternative splicing events. They also exhibit a larger exon-skipping rate, e.g., in the diabetes gene Abcc8, which might affect protein function. Expression of the neuronal splicing factor Srrm4, which mediates the inclusion of microexons in mRNA transcripts, was markedly lower in islets of diabetes-prone compared to diabetes-resistant mice, correlating with preferential skipping of SRRM4 target exons. The repression of Srrm4 expression is presumably mediated via higher expression of miR-326-3p and miR-3547-3p in islets of diabetic mice. Thus, our study suggests that an altered splicing pattern in islets of diabetes-susceptible mice may contribute to an elevated T2D risk.
Stress and pain
(2022)
Introduction: Low back pain (LBP) leads to considerable impairment of quality of life worldwide and is often accompanied by psychosomatic symptoms.
Objectives: First, to assess the association between stress and chronic low back pain (CLBP) and its simultaneous appearance with fatigue and depression as a symptom triad. Second, to identify the stress-related pattern set most predictive of CLBP over a 1-year period.
Methods: In a 1-year observational study with four measurement points, a total of 140 volunteers (aged 18–45 years with intermittent pain) were recruited. The primary outcomes were pain [characteristic pain intensity (CPI), subjective pain disability (DISS)], fatigue, and depressive mood. Stress was assessed as chronic stress, perceived stress, effort reward imbalance, life events, and physiological markers [allostatic load index (ALI), hair cortisol concentration (HCC)]. Multiple linear regression models and selection procedures for model shrinkage and variable selection (least absolute shrinkage and selection operator) were applied. Prediction accuracy was calculated by root mean squared error (RMSE) and receiver-operating characteristic curves.
Results: A total of 110 participants completed the baseline assessments (28.2 ± 7.5 years; 38.1% female), including HCC, and a further 46 participants agreed to ALI laboratory measurements. Different stress types were associated with LBP, CLBP, fatigue, and depressive mood, and with their joint occurrence as a symptom triad at baseline; mainly social-related stress types were of relevance. Work-related stress, such as “excessive demands at work” [b = 0.51 (95% CI -0.23, 1.25), p = 0.18], played a role in upcoming chronic pain disability. “Social overload” [b = 0.45 (95% CI -0.06, 0.96), p = 0.080] and “over-commitment at work” [b = 0.28 (95% CI -0.39, 0.95), p = 0.42] were associated with an upcoming depressive mood within 1 year. Finally, seven psychometric (CPI: RMSE = 12.63; DISS: RMSE = 9.81) and five biomarker (CPI: RMSE = 12.21; DISS: RMSE = 8.94) variables could be derived as the most predictive pattern set for a 1-year prediction of CLBP. The biomarker set showed an apparent area under the curve of 0.88 for CPI and 0.99 for DISS.
Conclusion: Stress disrupts allostasis and favors the development of chronic pain, fatigue, and depression, and the emergence of a “hypocortisolemic symptom triad,” whereby social-related stressors play a significant role. For translational medicine, a predictive pattern set could be derived that enables the identification of individuals at higher risk for upcoming pain disorders and can be used in practice.
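As a hedged illustration of the two prediction-accuracy measures named in the methods of this abstract (root mean squared error and the area under the receiver-operating characteristic curve), here is a minimal Python sketch on invented data. It is not the study's actual model pipeline; the variable names and values are hypothetical.

```python
# Sketch of two prediction-accuracy metrics from the abstract, on invented data.

def rmse(y_true, y_pred):
    """Root mean squared error of continuous predictions."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def roc_auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability that
    a randomly chosen positive case scores higher than a random negative."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical observed vs. predicted pain-intensity scores (CPI-like scale)
pain_true = [10.0, 25.0, 40.0, 55.0]
pain_pred = [12.0, 20.0, 45.0, 50.0]
err = rmse(pain_true, pain_pred)

# Hypothetical binary outcome (1 = developed CLBP within one year) and risk scores
chronic = [1, 0, 1, 0, 1]
risk = [0.9, 0.2, 0.3, 0.4, 0.6]
auc = roc_auc(chronic, risk)
```

In practice, LASSO model fitting and cross-validated versions of these metrics would come from a statistics package (e.g., scikit-learn or glmnet) rather than hand-rolled functions; the sketch only makes the two headline numbers in the abstract concrete.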
Secretory Wnt trafficking can be studied in the polarized epithelial monolayer of Drosophila wing imaginal discs (WID). In this tissue, Wg (Drosophila Wnt-1) is presented on the apical surface of its source cells before being internalized into the endosomal pathway. Long-range Wg secretion and spread depend on secondary secretion from endosomal compartments, but the exact post-endocytic fate of Wg is poorly understood. Here, we summarize and present three protocols for the immunofluorescence-based visualization and quantitation of different pools of intracellular and extracellular Wg in WID: (1) steady-state extracellular Wg; (2) dynamic Wg trafficking inside endosomal compartments; and (3) dynamic Wg release to the cell surface. Using a genetic driver system for gene manipulation specifically in the posterior part of the WID (EnGal4) provides a robust internal control that allows for direct comparison of signal intensities between control and manipulated compartments of the same WID. It therefore also circumvents the high degree of staining variability usually associated with whole-tissue samples. In combination with the genetic manipulation of Wg pathway components, which is easily feasible in Drosophila, these methods provide a tool set for the dissection of secretory Wg trafficking and can help us to understand how Wnt proteins travel along endosomal compartments for short- and long-range signal secretion.
Background/objective: Negative emotional states, such as depression, anxiety, and stress, challenge health care due to their long-term consequences for mental disorders. Accumulating evidence indicates that regular physical activity (PA) can positively influence negative emotional states. Among possible candidates, resilience and exercise tolerance in particular have the potential to partly explain the positive effects of PA on negative emotional states. Thus, the aim of this study was to investigate the association between PA and negative emotional states, and further to determine the mediating effects of exercise tolerance and resilience in this relationship. Method: In total, 1117 Chinese college students (50.4% female; Mage = 18.90, SD = 1.25) completed a psychosocial battery, including the 21-item Depression Anxiety Stress Scale (DASS-21), the Connor-Davidson Resilience Scale (CD-RISC), the Preference for and Tolerance of the Intensity of Exercise Questionnaire (PRETIE-Q), and the International Physical Activity Questionnaire short form (IPAQ-SF). Regression analysis was used to test the serial multiple mediation, controlling for gender, age, and BMI. Results: PA, exercise intensity-tolerance, and resilience were significantly negatively correlated with negative emotional states (ps < .05). Further, exercise tolerance and resilience partially mediated the relationship between PA and negative emotional states. Conclusions: Resilience and exercise intensity-tolerance can be fostered through regular engagement in PA, and these variables play critical roles in the prevention of mental illness, especially among college students, who face various challenges. The recommended amount of PA should be incorporated into curricula or sports clubs within the campus environment.