Keywords
- depression (14)
- dementia (13)
- fMRI (11)
- working memory (10)
- HIV (7)
- aging (7)
- risk factors (7)
- Alcohol dependence (6)
- Depression (5)
- Pavlovian-to-instrumental transfer (5)
Institute
- Department Sport- und Gesundheitswissenschaften (70)
- Department Psychologie (28)
- Strukturbereich Kognitionswissenschaften (26)
- Fakultät für Gesundheitswissenschaften (11)
- Humanwissenschaftliche Fakultät (11)
- Extern (7)
- Department Grundschulpädagogik (1)
- Institut für Biochemie und Biologie (1)
- Institut für Mathematik (1)
- Interdisziplinäres Zentrum für Kognitive Studien (1)
Reward expectation and affective responses across psychiatric disorders - A dimensional approach
(2014)
Processing of reward is the basis of adaptive human behavior. Neural correlates of reward processing seem to be influenced by developmental changes from adolescence to late adulthood. The aim of this study is to uncover these neural correlates during a slot machine gambling task across the lifespan. Therefore, we used functional magnetic resonance imaging to investigate 102 volunteers in three different age groups: 34 adolescents, 34 younger adults, and 34 older adults. We focused on the core reward areas, ventral striatum (VS) and ventromedial prefrontal cortex (VMPFC); the valence-processing-associated areas, anterior cingulate cortex (ACC) and insula; and the information-integration-associated areas, dorsolateral prefrontal cortex (DLPFC) and inferior parietal lobule (IPL). Results showed that VS and VMPFC were characterized by hyperactivation in adolescents compared with younger adults. Furthermore, the ACC and insula were characterized by a U-shaped pattern (hypoactivation in younger adults compared with adolescents and older adults), whereas the DLPFC and IPL were characterized by a J-shaped pattern (hyperactivation in older adults compared with the younger groups). Furthermore, a functional connectivity analysis revealed an elevated negative functional coupling between the inhibition-related right inferior frontal gyrus (rIFG) and the VS in younger adults compared with adolescents. Results indicate that lifespan-related changes during reward anticipation are characterized by different trajectories in different reward network modules and support the hypothesis of an imbalance in the maturation of the striatum and prefrontal cortex in adolescents. Furthermore, these results suggest compensatory age-specific effects in fronto-parietal regions. Hum Brain Mapp 35:5153-5165, 2014. (c) 2014 Wiley Periodicals, Inc.
Background Aversive stimuli in the environment influence human actions. This includes valence-dependent influences on action selection, e.g., increased avoidance but decreased approach behavior. However, it is as yet unclear how aversive stimuli interact with complex learning and decision-making in the reward and avoidance domains. Moreover, the underlying computational mechanisms of these decision-making biases are unknown. Methods To elucidate these mechanisms, 54 healthy young male subjects performed a two-step sequential decision-making task, which allows computational modeling of different aspects of learning, e.g., model-free (habitual) and model-based (goal-directed) learning. We used a within-subject design, crossing task valence (reward vs. punishment learning) with emotional context (aversive vs. neutral background stimuli). We analyzed choice data, applied a computational model, and performed simulations. Results Whereas model-based learning was not affected, aversive stimuli interacted with model-free learning in a way that depended on task valence. Thus, aversive stimuli increased model-free avoidance learning but decreased model-free reward learning. The computational model confirmed this effect: the parameter lambda, which indicates the influence of reward prediction errors on decision values, was increased in the punishment condition but decreased in the reward condition when aversive stimuli were present. Further, simulations based on the inferred computational parameters captured these effects in the choice data. Exploratory analyses revealed that the observed biases were associated with subclinical depressive symptoms. Conclusion Our data show that aversive environmental stimuli affect complex learning and decision-making, and that this effect depends on task valence. Further, we provide a model of the underlying computations of this affective modulation.
Finally, our finding of increased decision-making biases in subjects reporting subclinical depressive symptoms matches recent reports of amplified Pavlovian influences on action selection in depression and suggests a potential vulnerability factor for mood disorders. We discuss our findings in the light of the involvement of the neuromodulators serotonin and dopamine.
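The role of the lambda parameter described above — how strongly a second-stage reward prediction error feeds back into first-stage values — can be sketched in a few lines. This is a minimal illustration of a standard two-step model-free update with an eligibility trace, not the authors' fitted model; all names and values are illustrative.

```python
def model_free_update(q1, q2, reward, alpha, lam):
    """One model-free value update for a single two-step trial.

    q1:     value of the chosen first-stage action
    q2:     value of the visited second-stage option
    alpha:  learning rate
    lam:    eligibility trace ("lambda"): weight of the
            second-stage reward prediction error on q1
    """
    delta1 = q2 - q1       # first-stage prediction error (no reward yet)
    delta2 = reward - q2   # second-stage reward prediction error
    q1_new = q1 + alpha * (delta1 + lam * delta2)
    q2_new = q2 + alpha * delta2
    return q1_new, q2_new
```

With lam = 1, a rewarded trial updates the first-stage value directly; with lam = 0, the reward reaches the first stage only indirectly via q2 on later trials — which is how the parameter captures the strength of model-free reward propagation.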
We investigated the efficacy of reminiscence therapy (RT) on symptoms of depression in patients with mild to moderate dementia. Out of 227 patients with mild to moderate dementia from a specialized physician's office, 27 pairs (N = 54; mean age 79.04 ± 6.16 years) who had either received treatment as usual (TAU) or TAU combined with RT were matched retrospectively according to age as well as cognitive and depressive symptom scores. After controlling for age and sex, symptoms of depression significantly decreased over time in the RT group compared to TAU (F(1,52) = 4.36; p < .05). RT is a promising option for the treatment of depression in mild to moderate dementia. Larger randomized controlled trials are needed.
Recreational exercising and self-reported cardiometabolic diseases in German people living with HIV
(2021)
Exercise is known for its beneficial effects on preventing cardiometabolic diseases (CMDs) in the general population. People living with the human immunodeficiency virus (PLWH) are prone to sedentarism, thus raising their already elevated risk of developing CMDs in comparison to individuals without HIV. The aim of this cross-sectional study was to determine if exercise is associated with a reduced risk of self-reported CMDs in a German HIV-positive sample (n = 446). Participants completed a self-report survey to assess exercise levels, date of HIV diagnosis, CD4 cell count, antiretroviral therapy, and CMDs. Participants were classified into exercising or sedentary conditions. Generalized linear models with Poisson regression were conducted to assess the prevalence ratio (PR) of PLWH reporting a CMD. Exercising PLWH were less likely to report a heart arrhythmia for every increase in exercise duration (PR: 0.20; 95% CI: 0.10–0.62, p < 0.01) and diabetes mellitus for every additional exercise session per week (PR: 0.40; 95% CI: 0.10–1, p < 0.01). Exercise frequency and duration are associated with a decreased risk of reporting arrhythmia and diabetes mellitus in PLWH. Further studies are needed to elucidate the mechanisms underlying exercise as a protective factor for CMDs in PLWH.
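Prevalence ratios of the kind reported above come from Poisson regression with a log link: for a single binary exposure with no covariates, the exponentiated coefficient reduces to the ratio of the two group prevalences. A minimal unadjusted sketch — the counts below are invented, not the study's data:

```python
def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Unadjusted prevalence ratio between an exposed group
    (e.g., exercising PLWH) and an unexposed group (sedentary).
    Equals exp(beta) of a log-link Poisson model with a single
    binary exposure and no covariates."""
    p_exposed = cases_exposed / n_exposed
    p_unexposed = cases_unexposed / n_unexposed
    return p_exposed / p_unexposed

# e.g., 25/100 exercising vs. 50/100 sedentary reporting a CMD
pr = prevalence_ratio(25, 100, 50, 100)  # 0.5: half the prevalence
```

In the actual analysis the PR is adjusted for covariates, so the reported values are model coefficients rather than this raw ratio.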
Sedentarism is a risk factor for depression and anxiety. People living with the human immunodeficiency virus (PLWH) have a higher prevalence of anxiety and depression compared to HIV-negative individuals. This cross-sectional study (n = 450, median age 44 (19-75), 7.3% female) evaluates the prevalence rates and prevalence ratio (PR) of anxiety and/or depression in PLWH associated with recreational exercise. A decreased likelihood of having anxiety (PR = 0.57; 95% CI: 0.36–0.91; p = 0.01), depression (PR = 0.41; 95% CI: 0.36–0.94; p = 0.01), and comorbid anxiety and depression (PR = 0.43; 95% CI: 0.24–0.75; p = 0.002) was found in exercising compared to non-exercising PLWH. Recreational exercise is associated with a lower risk for anxiety and/or depression. Further prospective studies are needed to provide insights on the direction of this association.
Background The Covid-19 pandemic led to increased work-related strain and psychosocial burden in nurses worldwide, resulting in high prevalences of mental health problems. Nurses in long-term care facilities seem to be especially affected by the pandemic. Nevertheless, there are few findings indicating possible positive changes for health care workers. Therefore, we investigated which psychosocial burdens and potential positive aspects nurses working in long-term care facilities experience during the Covid-19 pandemic. Methods We conducted a mixed-methods study among nurses and nursing assistants working in nursing homes in Germany. The survey contained the third German version of the Copenhagen Psychosocial Questionnaire (COPSOQ III). Using Welch's t-tests, we compared the COPSOQ results of our sample against a pre-pandemic reference group of geriatric nurses from Germany. Additionally, we conducted semi-structured interviews with geriatric nurses with a special focus on psychosocial stress, to reach a deeper understanding of their experiences on work-related changes and burdens during the pandemic. Data were analysed using thematic coding (Braun and Clarke). Results Our survey sample (n = 177) differed significantly from the pre-pandemic reference group in 14 out of 31 COPSOQ scales. Almost all of these differences indicated negative changes. Our sample scored significantly worse regarding the scales 'quantitative demands', 'hiding emotions', 'work-privacy conflicts', 'role conflicts', 'quality of leadership', 'support at work', 'recognition', 'physical demands', 'intention to leave profession', 'burnout', 'presenteeism' and 'inability to relax'. The interviews (n = 15) revealed six main themes related to nurses' psychosocial stress: 'overall working conditions', 'concern for residents', 'management of relatives', 'inability to provide terminal care', 'tensions between being infected and infecting others' and 'technicisation of care'. 
'Enhanced community cohesion' (interviews), 'meaning of work' and 'quantity of social relations' (COPSOQ III) were identified as positive effects of the pandemic. Conclusions Results clearly illustrate an aggravation of geriatric nurses' situation and psychosocial burden, and only a few positive changes due to the Covid-19 pandemic. Pre-existing hardships seem to have deteriorated further, and new stressors have added to nurses' strain. The perceived erosion of care, due to an overemphasis on the technical in relation to the social and emotional dimensions of care, seems to be especially burdensome to geriatric nurses.
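The group comparison described above uses Welch's t-test, which, unlike Student's t-test, does not assume equal variances in the two samples — appropriate when comparing a survey sample against a differently sized reference group. A minimal pure-Python sketch of the statistic (illustrative data, not the COPSOQ scores):

```python
import math

def welch_t(x, y):
    """Welch's t statistic: compares two sample means without
    assuming equal variances or equal sample sizes."""
    def mean(v):
        return sum(v) / len(v)

    def var(v):  # unbiased sample variance
        m = mean(v)
        return sum((e - m) ** 2 for e in v) / (len(v) - 1)

    return (mean(x) - mean(y)) / math.sqrt(var(x) / len(x) + var(y) / len(y))
```

In practice a library routine (e.g., SciPy's `ttest_ind` with `equal_var=False`) would also supply the Welch–Satterthwaite degrees of freedom and the p-value.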
Background: The aim of the present study was to investigate the psychometric characteristics of the Perceived Stress Scale (PSS) in a sample of dementia patients and their spousal caregivers. Methods: We investigated the reliability and validity of the 14-item PSS in a sample of 80 couples, each including one spouse who had been diagnosed with mild to moderate dementia (mean age 75.55, SD = 5.85, 38.7% female) and one spousal caregiver (mean age 73.06, SD = 6.75, 61.3% female). We also examined the factor structure and sensitivity of the scale with regard to gender differences. Results: Exploratory factor analysis of the PSS revealed a two-factor solution for the scale; the first factor reflected general stress while the second factor consisted of items reflecting the perceived ability to cope with stressors. A confirmatory factor analysis verified that the data were a better fit for the two-factor model than a one-factor model. The two factors of the PSS showed good reliability for patients as well as for caregivers, with alphas ranging between 0.73 and 0.82. Perceived stress was significantly positively correlated with depressive symptomatology in both caregivers and patients. Mean PSS scores did not significantly differ between male and female patients nor did they differ between male and female caregivers. Conclusion: The present data indicate that the PSS provides a reliable and valid measure of perceived stress in dementia patients and their caregivers.
Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed vs. habitual, or, more recently and based on statistical arguments, as model-free vs. model-based reinforcement-learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities, and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation.
Prevention of Cognitive Decline: A Physical Exercise Perspective on Brain Health in the Long Run
(2016)
Objective: To estimate the prevalence and type of antidepressant medication prescribed by German primary care physicians for patients with depression and osteoporosis. Methods: This study was a retrospective database analysis conducted in Germany utilizing the Disease Analyzer (R) Database (IMS Health, Germany). The study population included 3,488 female osteoporosis patients aged between 40 and 90 years recruited from 1,179 general practitioner practices and who were initially diagnosed with depression during the index period (January 2004 to December 2013). Follow-up lasted up to 12 months and was completed in August 2015. Also included in this study were 3,488 non-osteoporosis controls who were matched (1 : 1) to osteoporosis cases on the basis of age, health insurance coverage, severity of depression, and physician carrying out the diagnosis. Results: After 12 months of follow-up, 30.1% of osteoporosis and 29.9% of non-osteoporosis patients with mild depression (p = 0.783), 52.4% of osteoporosis and 48.0% of non-osteoporosis patients with moderate depression (p = 0.003), and 39.4% of osteoporosis and 35.1% of non-osteoporosis patients with severe depression (p = 0.147) were being treated with antidepressants. Osteoporosis patients with moderate depression had a higher chance of being prescribed antidepressant therapy at the initial diagnosis (hazard ratio (HR): 1.12, p = 0.014). No differences were found between osteoporosis and non-osteoporosis patients regarding the proportion of patients receiving selective serotonin reuptake inhibitors (SSRI)/serotonin-noradrenaline reuptake inhibitors (SNRI), tricyclic antidepressant (TCA), or other antidepressants. Osteoporosis patients were more often referred to hospitals or psychiatrists for consultation. Conclusion: Osteoporosis patients are more often treated initially with antidepressants than non-osteoporosis patients, especially within the groups of patients with moderate or severe depression.
TCA was the most frequently used antidepressant therapy class at initial diagnosis in both patient groups. Osteoporosis patients receive referrals to hospitals or psychiatrists more often than patients without osteoporosis.
Theoretical models and preceding studies have described age-related alterations in neuronal activation of frontoparietal regions in a working memory (WM) load-dependent manner. However, to date, underlying neuronal mechanisms of these WM load-dependent activation changes in aging remain poorly understood. The aim of this study was to investigate these mechanisms in terms of effective connectivity by application of dynamic causal modeling with Bayesian Model Selection. Eighteen healthy younger (age: 20-32 years) and 32 older (60-75 years) participants performed an n-back task with 3 WM load levels during functional magnetic resonance imaging (fMRI). Behavioral and conventional fMRI results replicated age group by WM load interactions. Importantly, the analysis of effective connectivity derived from dynamic causal modeling indicated an age- and performance-related reduction in WM load-dependent modulation of connectivity from dorsolateral prefrontal cortex to inferior parietal lobule. This finding provides evidence for the proposal that age-related WM decline manifests as deficient WM load-dependent modulation of neuronal top-down control and can integrate implications from theoretical models and previous studies of functional changes in the aging brain.
Predictors and prevalence of hazardous alcohol use in middle-late to late adulthood in Europe
(2022)
Objectives:
Even low to moderate levels of alcohol consumption can have detrimental health consequences, especially in older adults (OA). Although many studies report an increase in the proportion of drinkers among OA, there are regional variations. Therefore, we examined alcohol consumption and the prevalence of hazardous alcohol use (HAU) among men and women aged 50+ years in four European regions and investigated predictors of HAU.
Methods:
We analyzed data of N = 35,042 participants of the European SHARE study. We investigated differences in alcohol consumption (units last week) according to gender, age and EU-region using ANOVAs. Furthermore, logistic regression models were used to examine the effect of income, education, marital status, history of a low-quality parent-child relationship and smoking on HAU, also stratified for gender and EU-region. HAU was operationalized as binge drinking or risky drinking (>12.5 units of 10 ml alcohol/week).
Results:
Overall, past-week alcohol consumption was 5.0 units (±7.8), and the prevalence of HAU was 25.4% within our sample of European adults aged 50+ years. Male gender, younger age, and living in Western Europe were linked to both higher alcohol consumption and a higher risk of HAU. Income, education, smoking, a low-quality parent-child relationship, and living in Northern and especially Eastern Europe were positively associated with HAU. Stratified analyses revealed differences by region and gender.
Conclusions:
HAU was highly prevalent within this European sample of OA. Alcohol consumption and determinants of HAU differed between EU-regions, pointing to the need for risk-stratified population-level strategies to prevent HAU and subsequent alcohol use disorders.
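The operationalization of HAU described in the Methods can be expressed as a simple classifier. A minimal sketch, assuming risky drinking means exceeding 12.5 units of 10 ml pure alcohol per week; the function and parameter names are illustrative, not from the study:

```python
def is_hazardous_alcohol_use(units_last_week, binge=False):
    """Classify hazardous alcohol use (HAU) as binge drinking or
    risky drinking, here assumed to mean exceeding 12.5 units
    (10 ml pure alcohol each) in the past week."""
    RISKY_WEEKLY_UNITS = 12.5  # illustrative threshold per the abstract
    return binge or units_last_week > RISKY_WEEKLY_UNITS
```

Either criterion alone suffices: a binge episode flags HAU regardless of weekly volume, and vice versa.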
Poverty and social exclusion are closely related to an increased risk for the deterioration of mental health. In 2018 approximately 19% of the German population were threatened by poverty and the associated social ostracization. Migrant groups in particular often show an increased risk for poverty and are often exposed to multiple socioeconomic stress factors depending on the context of migration, pre-migration and post-migration social factors. Numerous studies have shown that societal exclusion, precarious living conditions and the residential environment negatively affect mental health beyond the effects of pre-migration risk factors. This article provides a review and discussion on the relationship between mental health, poverty and related constructs, such as social cohesion, social capital and social exclusion in general as well as in specific risk groups, such as migrant and refugee populations.
In 2011, about 1.1-1.4 million patients with dementia were living in Germany, a number expected to rise to three million by 2050. Dementia poses a major challenge to the healthcare system and neuropharmacological service provision. The aim of this study was to determine prescription rates for anti-dementia drugs as well as for neuroleptics, sedative-hypnotics and antidepressants in dementia using the complete nationwide outpatient claims data pertaining to the services of statutory health insurance. We controlled for gender, age, dementia diagnosis, physician specialty (general practitioner GP versus neuropsychiatry specialist physician NPSP), and rural and urban living area. In about one million prevalent dementia patients (N=1,014,710) in 2011, the prescription prevalence rate of anti-dementia drugs was 24.6%; it varied with gender, age, and diagnosis (highest in Alzheimer's disease; 42%), and was higher in patients treated by NPSPs (48% vs. 25% in GPs). At the same time, we found an alarmingly high rate of treatment with neuroleptics in dementia patients (35%), with an only slightly decreased risk in patients treated exclusively by NPSPs (OR=0.86). We found marginal differences between rural and urban areas. Our results show that the majority of anti-dementia drug prescriptions appear guideline-oriented, yet prescription rates are overall comparatively low. On the other hand, neuroleptic drugs, which are associated with excess morbidity and mortality in dementia, were prescribed very frequently, suggesting excess use given current guidelines. We therefore suggest that guideline implementation measures and increasing quality control procedures are needed with respect to the pharmacotherapy of this vulnerable population. (C) 2015 Elsevier B.V. and ECNP. All rights reserved.
Background/Aims: To analyze the duration of treatment with antipsychotics in German dementia patients. Methods: This study included patients aged 60 years and over with dementia who received a first-time antipsychotic prescription by psychiatrists between 2009 and 2013. The main outcome measure was the treatment rate for more than 6 months following the index date. Results: A total of 12,979 patients with dementia (mean age 82 years, 52.1% living in nursing homes) were included. After 2 years of follow-up, 54.8%, 57.2%, 61.1%, and 65.4% of patients aged 60-69, 70-79, 80-89, and 90-99 years, respectively, received antipsychotic prescriptions. 63.9% of subjects living in nursing homes and 55.0% of subjects living at home also continued their treatment (p-value < 0.001). Conclusion: The percentage of dementia patients treated with antipsychotics is very high.
Persistence with antidepressant drugs in patients with dementia: a retrospective database analysis
(2016)
Background: The aims of the present study are to determine what proportion of patients with dementia receives antidepressants, how long the treatment is administered, and what factors increase the risk of discontinuation. Methods: The study was based on the Disease Analyzer database and included 1,203 general practitioners (GP) and 209 neurologists/psychiatrists (NP). 12,281 patients with a diagnosis of dementia and an initial prescription of an antidepressant drug between January 2004 and December 2013 were included. The main outcome measure was antidepressant discontinuation rates within 6 months of the index date. Results: After 6 months of follow-up, 52.7% of dementia patients treated with antidepressants had stopped medication intake. There was a significantly decreased risk for treatment discontinuation for patients using selective serotonin reuptake inhibitors (SSRIs) or serotonin and norepinephrine reuptake inhibitors (SNRIs) compared to tricyclic antidepressants. There was a significantly increased risk of treatment discontinuation for older patients and patients treated in NP practice. Comorbidity of diabetes or history of stroke was associated with a decreased risk of treatment discontinuation. Conclusion: The study results show insufficient persistence in antidepressant treatment in dementia patients in a real-world setting. Improvement is needed to ensure that patients receive the treatment recommended in the guidelines.
Background: Given the well-established association between perceived stress and quality of life (QoL) in dementia patients and their partners, our goal was to identify whether relationship quality and dyadic coping would operate as mediators between perceived stress and QoL.
Methods: 82 dyads of dementia patients and their spousal caregivers were included in a cross-sectional assessment from a prospective study. QoL was assessed with the Quality of Life in Alzheimer's Disease scale (QoL-AD) for dementia patients and the WHO Quality of Life-BREF for spousal caregivers. Perceived stress was measured with the Perceived Stress Scale (PSS-14). Both partners were assessed with the Dyadic Coping Inventory (DCI). Analyses of correlation as well as regression models including mediator analyses were performed.
Results: We found negative correlations between stress and QoL in both partners (QoL-AD: r = -0.62; p < 0.001; WHO-QOL Overall: r = -0.27; p = 0.02). Spousal caregivers had a significantly lower DCI total score than dementia patients (p < 0.001). Dyadic coping was a significant mediator of the relationship between stress and QoL in spousal caregivers (z = 0.28; p = 0.02), but not in dementia patients. Likewise, relationship quality significantly mediated the relationship between stress and QoL in caregivers only (z = -2.41; p = 0.02).
Conclusions: This study identified dyadic coping as a mediator on the relationship between stress and QoL in (caregiving) partners of dementia patients. In patients, however, we found a direct negative effect of stress on QoL. The findings suggest the importance of stress reducing and dyadic interventions for dementia patients and their partners, respectively.
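Indirect (mediated) effects of the kind reported above, with z statistics for the mediation paths, are commonly assessed with the Sobel test. A minimal sketch — the path coefficients and standard errors below are made up, not the study's estimates:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel z statistic for an indirect (mediated) effect a*b,
    where a is the predictor->mediator path and b the
    mediator->outcome path, each with its standard error."""
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

# illustrative paths: stress -> dyadic coping (a) and
# dyadic coping -> QoL (b), with hypothetical standard errors
z = sobel_z(0.5, 0.1, 0.4, 0.1)
```

A |z| above roughly 1.96 indicates a significant indirect effect at p < .05; bootstrap confidence intervals are a common, less restrictive alternative.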
The role of perceived need support from exercise professionals in improving mental health was examined in a sample of older adults, thereby validating the short Health Care Climate Questionnaire. A total of 491 older people (M = 72.68 years; SD = 5.47) attending a health exercise program participated in this study. Cronbach's alpha was found to be high (alpha = .90). Satisfaction with the exercise professional correlated moderately with the short Health Care Climate Questionnaire mean value (r = .38; p < .01). The mediator analyses yielded support for the self-determination theory process model in older adults by showing both basic need satisfaction and frustration as mediating variables between perceived autonomy support and depressive symptoms. The short Health Care Climate Questionnaire is an economical instrument for assessing basic need satisfaction provided by the exercise therapist from the participant's perspective. Furthermore, this cross-sectional study supported the link from coaching style to the satisfaction/frustration of basic psychological needs, which, in turn, predicted mental health. Analyses of criterion validity suggest a revision of the construct by integrating need frustration.
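Internal-consistency figures such as the alpha = .90 reported above follow from Cronbach's formula: the number of items, their individual variances, and the variance of the total score. A pure-Python sketch with toy item scores (not the questionnaire data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item scores: `items` is a list of
    lists, one per questionnaire item, with respondents in the
    same order in each."""
    k = len(items)
    n_resp = len(items[0])

    def var(v):  # unbiased sample variance
        m = sum(v) / len(v)
        return sum((e - m) ** 2 for e in v) / (len(v) - 1)

    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))
```

Perfectly correlated items yield alpha = 1, while items whose variance is not shared with the total score pull alpha toward 0.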
Background: Pavlovian processes are thought to play an important role in the development, maintenance and relapse of alcohol dependence, possibly by influencing and usurping ongoing thought and behavior. The influence of Pavlovian stimuli on ongoing behavior is paradigmatically measured by Pavlovian-to-instrumental transfer (PIT) tasks. These involve multiple stages and are complex. Whether increased PIT is involved in human alcohol dependence is uncertain. We therefore aimed to establish and validate a modified PIT paradigm that would be robust, consistent and tolerated by healthy controls as well as by patients suffering from alcohol dependence, and to explore whether alcohol dependence is associated with enhanced PIT. Methods: Thirty-two recently detoxified alcohol-dependent patients and 32 age- and gender-matched healthy controls performed a PIT task with instrumental go/no-go approach behaviors. The task involved both Pavlovian stimuli associated with monetary rewards and losses, and images of drinks. Results: Both patients and healthy controls showed a robust and temporally stable PIT effect. Strengths of PIT effects to drug-related and monetary conditioned stimuli were highly correlated. Patients more frequently showed a PIT effect, and the effect was stronger in response to aversively conditioned CSs (conditioned suppression), but there was no group difference in response to appetitive CSs. Conclusion: The implementation of PIT has favorably robust properties in chronic alcohol-dependent patients and in healthy controls. It shows internal consistency between monetary and drug-related cues. The findings support an association of alcohol dependence with an increased propensity towards PIT.
Behavioral choice can be characterized along two axes. One axis distinguishes reflexive, model-free systems that slowly accumulate values through experience and a model-based system that uses knowledge to reason prospectively. The second axis distinguishes Pavlovian valuation of stimuli from instrumental valuation of actions or stimulus–action pairs. This results in four values and many possible interactions between them, with important consequences for accounts of individual variation. We here explored whether individual variation along one axis was related to individual variation along the other. Specifically, we asked whether individuals' balance between model-based and model-free learning was related to their tendency to show Pavlovian interferences with instrumental decisions. In two independent samples with a total of 243 participants, Pavlovian–instrumental transfer effects were negatively correlated with the strength of model-based reasoning in a two-step task. This suggests a potential common underlying substrate predisposing individuals to both have strong Pavlovian interference and be less model-based and provides a framework within which to interpret the observation of both effects in addiction.
In detoxified alcohol-dependent patients, alcohol-related stimuli can promote relapse. However, to date, the mechanisms by which contextual stimuli promote relapse have not been elucidated in detail. One hypothesis is that such contextual stimuli directly stimulate the motivation to drink via associated brain regions like the ventral striatum and thus promote alcohol seeking, intake and relapse. Pavlovian-to-Instrumental-Transfer (PIT) may be one of those behavioral phenomena contributing to relapse, capturing how Pavlovian conditioned (contextual) cues determine instrumental behavior (e.g. alcohol seeking and intake). We used a PIT paradigm during functional magnetic resonance imaging to examine the effects of classically conditioned Pavlovian stimuli on instrumental choices in n=31 detoxified patients diagnosed with alcohol dependence and n=24 healthy controls matched for age and gender. Patients were followed up over a period of 3 months. We observed that (1) there was a significant behavioral PIT effect for all participants, which was significantly more pronounced in alcohol-dependent patients; (2) PIT was significantly associated with blood oxygen level-dependent (BOLD) signals in the nucleus accumbens (NAcc) in subsequent relapsers only; and (3) PIT-related NAcc activation was associated with, and predictive of, critical outcomes (amount of alcohol intake and relapse during a 3 months follow-up period) in alcohol-dependent patients. These observations show for the first time that PIT-related BOLD signals, as a measure of the influence of Pavlovian cues on instrumental behavior, predict alcohol intake and relapse in alcohol dependence.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. Having observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, here we aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, and subsequently related it to functional activation in an a priori region of interest encompassing the NAcc and amygdala, and to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted; 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging, and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r(s) = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; possible different involvement in association with disease trajectory should be investigated in future studies.
Pavlovian-to-instrumental transfer (PIT) describes a phenomenon by which the values of environmental cues acquired through Pavlovian conditioning can motivate instrumental behavior. PIT may be one basic mechanism of action control that can characterize mental disorders on a dimensional level beyond current classification systems. We therefore review human PIT studies investigating subclinical and clinical mental syndromes. The literature presents an inhomogeneous picture concerning PIT. Enhanced PIT effects seem to be present in non-substance-related disorders, in overweight people, and in most studies of patients with alcohol use disorder (AUD), whereas no altered PIT effects were reported in tobacco use disorder and obesity. Regarding AUD and relapsing alcohol-dependent patients, there is mixed evidence of enhanced or unaltered PIT effects.
Additionally, there is evidence for aberrant corticostriatal activation and genetic risk, e.g., in association with high-risk alcohol consumption and relapse after alcohol detoxification. In patients with anorexia nervosa, stronger PIT effects elicited by low caloric stimuli were associated with increased disease severity.
In patients with depression, enhanced aversive PIT effects and a loss of action specificity associated with poorer treatment outcomes were reported. Patients with schizophrenia showed disrupted specific but intact general PIT effects. Patients with chronic back pain showed reduced PIT effects.
We provide possible reasons for the heterogeneity in PIT effects within and across mental disorders. Further, we emphasize the importance of reliable experimental tasks and provide test-retest data for a PIT task showing moderate to good reliability.
Finally, we point toward stress as a possible underlying factor that may explain stronger PIT effects in mental disorders, as there is some evidence that stress per se interacts with the impact of environmental cues on behavior by selectively increasing cue-triggered wanting.
To conclude, we discuss the results of the literature review in the light of Research Domain Criteria, suggesting future studies that comprehensively assess PIT across psychopathological dimensions.
Background: Pavlovian processes are thought to play an important role in the development, maintenance, and relapse of alcohol dependence, possibly by influencing and usurping ongoing thought and behavior. The influence of Pavlovian stimuli on ongoing behavior is paradigmatically measured with Pavlovian-to-instrumental transfer (PIT) tasks, which involve multiple stages and are complex. Whether increased PIT is involved in human alcohol dependence is uncertain. We therefore aimed to establish and validate a modified PIT paradigm that would be robust, consistent, and tolerated by healthy controls as well as by patients suffering from alcohol dependence, and to explore whether alcohol dependence is associated with enhanced Pavlovian-to-instrumental transfer.
Methods: 32 recently detoxified alcohol-dependent patients and 32 age- and gender-matched healthy controls performed a PIT task with instrumental go/no-go approach behaviors. The task involved both Pavlovian stimuli associated with monetary rewards and losses, and images of drinks.
Results: Both patients and healthy controls showed a robust and temporally stable PIT effect. Strengths of PIT effects elicited by drug-related and monetary conditioned stimuli (CSs) were highly correlated. Patients showed a PIT effect more frequently, and the effect was stronger in response to aversively conditioned CSs (conditioned suppression); there was no group difference in response to appetitive CSs.
Conclusion: The PIT implementation showed favorably robust properties in chronic alcohol-dependent patients and in healthy controls, with internal consistency between monetary and drug-related cues. The findings support an association of alcohol dependence with an increased propensity toward PIT.
Background: The patterns of benzodiazepine prescriptions in older adults are of general and scientific interest, as they are not yet well understood. The aim of this study was to compare the prescription patterns of benzodiazepines in elderly people in Germany and to determine the proportion treated by general practitioners (GP) and neuropsychiatrists (NP). Methods: This study included 31,268 and 6,603 patients between the ages of 65 and 100 with at least one benzodiazepine prescription in 2014 from GP and NP, respectively. Demographic data included age, gender, and type of health insurance coverage. The share of elderly people with benzodiazepine prescriptions was estimated in different age and disease groups for both GP and NP patients. The share of the six most commonly prescribed drugs was also calculated for each type of practice. Results: The share of people taking benzodiazepines prescribed by GP increased from 3.2% in patients aged between 65 and 69 years to 8.6% in patients aged between 90 and 100 years, whereas this share increased from 5.4% to 7.1% in those seen by NP. Benzodiazepines were frequently used by patients suffering from sleep disorders (GP: 33.9%; NP: 5.5%), depression (GP: 17.9%; NP: 29.8%), and anxiety disorders (GP: 14.5%; NP: 22.8%). Lorazepam (30.3%), oxazepam (24.7%), and bromazepam (24.3%) were the three most commonly prescribed drugs for GP patients. In contrast, lorazepam (60.4%), diazepam (14.8%), and oxazepam (11.2%) were those more frequently prescribed to NP patients. Conclusion: Prescription patterns of benzodiazepines in the elderly varied widely between GP and NP.
Importance Alcohol consumption (AC) leads to death and disability worldwide. Ongoing discussions on potential negative effects of the COVID-19 pandemic on AC need to be informed by real-world evidence.
Objective To examine whether lockdown measures are associated with AC and consumption-related temporal and psychological within-person mechanisms.
Design, Setting, and Participants This quantitative, intensive, longitudinal cohort study recruited 1743 participants from 3 sites from February 20, 2020, to February 28, 2021. Data were provided before and within the second lockdown of the COVID-19 pandemic in Germany: before lockdown (October 2 to November 1, 2020); light lockdown (November 2 to December 15, 2020); and hard lockdown (December 16, 2020, to February 28, 2021).
Main Outcomes and Measures Daily ratings of AC (main outcome) captured during 3 lockdown phases (main variable) and temporal (weekends and holidays) and psychological (social isolation and drinking intention) correlates.
Results Of the 1743 screened participants, 189 (119 [63.0%] male; median [IQR] age, 37 [27.5-52.0] years) with at least 2 alcohol use disorder (AUD) criteria according to the Diagnostic and Statistical Manual of Mental Disorders (Fifth Edition) yet without the need for medically supervised alcohol withdrawal were included. These individuals provided 14 694 smartphone ratings from October 2020 through February 2021. Multilevel modeling revealed significantly higher AC (grams of alcohol per day) on weekend days vs weekdays (β = 11.39; 95% CI, 10.00-12.77; P < .001). Alcohol consumption was above the overall average on Christmas (β = 26.82; 95% CI, 21.87-31.77; P < .001) and New Year’s Eve (β = 66.88; 95% CI, 59.22-74.54; P < .001). During the hard lockdown, perceived social isolation was significantly higher (β = 0.12; 95% CI, 0.06-0.15; P < .001), but AC was significantly lower (β = −5.45; 95% CI, −8.00 to −2.90; P = .001). Independent of lockdown, intention to drink less alcohol was associated with lower AC (β = −11.10; 95% CI, −13.63 to −8.58; P < .001). Notably, differences in AC between weekend and weekdays decreased both during the hard lockdown (β = −6.14; 95% CI, −9.96 to −2.31; P = .002) and in participants with severe AUD (β = −6.26; 95% CI, −10.18 to −2.34; P = .002).
Conclusions and Relevance This 5-month cohort study found no immediate negative associations of lockdown measures with overall AC. Rather, weekend-weekday and holiday AC patterns exceeded lockdown effects. Differences in AC between weekend days and weekdays showed that weekend drinking cycles decreased as a function of AUD severity and lockdown measures, indicating a potential mechanism of losing and regaining control. This finding suggests that temporal patterns and drinking intention constitute promising targets for prevention and intervention, even in high-risk individuals.
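The multilevel analysis reported above regresses daily consumption on day-level predictors while accounting for repeated measures within participants. A minimal sketch of that modeling approach, using entirely hypothetical simulated data and only a weekend indicator (the study's actual model specification and variables are not reproduced here):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: daily alcohol intake (grams) nested within participants,
# with a simulated weekend effect of +11 g/day, loosely echoing the reported contrast.
rng = np.random.default_rng(0)
n_subj, n_days = 30, 60
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_days),
    "weekend": np.tile((np.arange(n_days) % 7 >= 5).astype(int), n_subj),
})
subj_intercept = rng.normal(0, 5, n_subj)[df["subject"]]  # between-person variation
df["grams"] = 20 + 11 * df["weekend"] + subj_intercept + rng.normal(0, 8, len(df))

# Random-intercept multilevel model: grams ~ weekend, grouped by subject.
model = smf.mixedlm("grams ~ weekend", df, groups=df["subject"]).fit()
print(model.params["weekend"])  # recovered weekend effect, near the simulated 11
```

In this design the fixed effect for `weekend` plays the role of the study's weekend-vs-weekday coefficient, while the subject-level random intercept absorbs stable between-person differences in baseline consumption.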
Background
Postoperative delirium is a common disorder in older adults that is associated with higher morbidity and mortality, prolonged cognitive impairment, development of dementia, higher institutionalization rates, and rising healthcare costs. The probability of delirium after surgery increases with patients’ age, with pre-existing cognitive impairment, and with comorbidities, and its diagnosis and treatment is dependent on the knowledge of diagnostic criteria, risk factors, and treatment options of the medical staff. In this study, we will investigate whether a cross-sectoral and multimodal intervention for preventing delirium can reduce the prevalence of delirium and postoperative cognitive decline (POCD) in patients older than 70 years undergoing elective surgery. Additionally, we will analyze whether the intervention is cost-effective.
Methods
The study will be conducted at five medical centers (with two or three surgical departments each) in the southwest of Germany. The study employs a stepped-wedge design with cluster randomization of the medical centers. Measurements are performed at six consecutive points: preadmission, preoperative, and postoperative, with daily delirium screening up to day 7 and POCD evaluations at 2, 6, and 12 months after surgery. The recruitment goal is to enroll 1500 patients older than 70 years undergoing elective operative procedures (cardiac, thoracic, vascular, proximal big joint and spine, genitourinary, gastrointestinal, and general elective surgery procedures).
Discussion
Results of the trial should form the basis of future standards for preventing delirium and POCD in surgical wards. Key aims are the improvement of patient safety and quality of life, as well as the reduction of the long-term risk of conversion to dementia. Furthermore, from an economic perspective, we expect benefits and decreased costs for hospitals, patients, and healthcare insurances.
Trial registration
German Clinical Trials Register, DRKS00013311. Registered on 10 November 2017.
Objectives: In patients with early-stage dementia and their caregiving partners, reciprocal dyadic coping (DC) is crucial for preventing or reducing depressive symptoms in both partners. This study examines the relationships between ‘own DC’ and ‘perceived partner DC’ with depressive symptoms in couples coping with dementia on individual (actor effects) and cross-person (partner effects) levels.
Method: 164 individuals (82 patients with early-stage dementia and their 82 caregiving partners; ND = 82 dyads) participated in this prospective study with measures (DC, depressive symptoms, and dementia severity) taken at baseline and at six months. Each partner evaluated their own and the perceived partner DC. Actor–partner interdependence models were applied to the resulting four independent evaluations.
Results: Results differed substantially between patients and caregivers. DC was significantly related to patients' but not to caregivers' depressive symptoms when adjusting for individual coping. Perceived partner DC showed a negative association with depressive symptoms in patients, whereas own DC was adversely related to depressive symptoms in both actor and partner effects across individuals.
Conclusion: The adverse association of the caregiver's and the patient's own DC with the patient's depressive symptoms might be due to inappropriate efforts or to the loss of autonomy as a care-receiver. DC is important in both patients and caregivers, as shown by the negative association between perceived partner DC and depressive symptoms in the patients, which might inform interventions that target the couple as a whole.
When the demographic shift is discussed in medicine [1], the focus is usually on the rapid increase in the very old, among whom, owing to differential age-associated incidence rates, one primarily expects an increase in dementia, cardiovascular disease, cancer, and general multimorbidity and frailty [2]. This is undeniably correct, but it captures only part of the consequences of demographic change for psychiatric care. Psychiatric care must continue to keep the entire adult lifespan in view; otherwise, care bottlenecks for already vulnerable patients will be exacerbated, with consequences for morbidity and mortality at the population level [3].
No association of goal-directed and habitual control with alcohol consumption in young adults
(2017)
Alcohol dependence is a mental disorder that has been associated with an imbalance in behavioral control favoring model-free habitual over model-based goal-directed strategies. It is as yet unknown, however, whether such an imbalance reflects a predisposing vulnerability or results as a consequence of repeated and/or excessive alcohol exposure. We therefore examined the association of alcohol consumption with model-based goal-directed and model-free habitual control in 188 18-year-old social drinkers in a two-step sequential decision-making task while undergoing functional magnetic resonance imaging, before prolonged alcohol misuse could have led to severe neurobiological adaptations. Behaviorally, participants showed a mixture of model-free and model-based decision-making, as observed previously. Measures of impulsivity were positively related to alcohol consumption. In contrast, neither model-free nor model-based decision weights nor the trade-off between them were associated with alcohol consumption. There were also no significant associations between alcohol consumption and neural correlates of model-free or model-based decision quantities in either ventral striatum or ventromedial prefrontal cortex. Exploratory whole-brain functional magnetic resonance imaging analyses with a lenient threshold revealed early onset of drinking to be associated with an enhanced representation of model-free reward prediction errors in the posterior putamen. These results suggest that an imbalance between model-based goal-directed and model-free habitual control may not be a trait marker of alcohol intake per se.
Background/Aims: Even though cognitive behavioral therapy has become a relatively effective treatment for major depressive disorder and cognitive behavioral therapy-related changes of dysfunctional neural activations were shown in recent studies, remission rates still remain at an insufficient level. Therefore, the implementation of effective augmentation strategies is needed. In recent meta-analyses, exercise therapy (especially endurance exercise) was reported to be an effective intervention in major depressive disorder. Despite these findings, the underlying mechanisms of the antidepressant effect of exercise, especially in combination with cognitive behavioral therapy, have rarely been studied to date, and an investigation of its neural underpinnings is lacking. A better understanding of the psychological and neural mechanisms of exercise and cognitive behavioral therapy would be important for developing optimal treatment strategies in depression. The SPeED study (Sport/Exercise Therapy and Psychotherapy: evaluating treatment Effects in Depressive patients) is a randomized controlled trial to investigate underlying physiological, neurobiological, and psychological mechanisms of the augmentation of cognitive behavioral therapy with endurance exercise. We investigate whether a preceding endurance exercise program enhances the effect of subsequent cognitive behavioral therapy. Methods: This study will include 105 patients diagnosed with a mild or moderate depressive episode according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed.). The participants are randomized into one of three groups: a high-intensive or a low-intensive endurance exercise group or a waiting list control group. After the exercise program/waiting period, all patients receive an outpatient cognitive behavioral therapy treatment according to a standardized therapy manual.
At four measurement points, major depressive disorder symptoms (Beck Depression Inventory, Hamilton Rating Scale for Depression), (neuro)biological measures (neural activations during working memory, monetary incentive delay task, and emotion regulation, as well as cortisol levels and brain-derived neurotrophic factor), neuropsychological test performance, and questionnaires (psychological needs, self-efficacy, and quality of life) are assessed. Results: In this article, we report the design of the SPeED study and refer to important methodological issues such as including both high- and low-intensity endurance exercise groups to allow the investigation of dose-response effects and physiological components of the therapy effects. Conclusion: The main aims of this research project are to study effects of endurance exercise and cognitive behavioral therapy on depressive symptoms and to investigate underlying physiological and neurobiological mechanisms of these effects. Results may provide important implications for the development of effective treatment strategies in major depressive disorder, specifically concerning the augmentation of cognitive behavioral therapy by endurance exercise.
As indicated by previous research, aging is associated with a decline in working memory (WM) functioning, related to alterations in fronto-parietal neural activations. At the same time, previous studies showed that WM training in older adults may improve performance in the trained task (training effect) and, more importantly, also in untrained WM tasks (transfer effects). However, neural correlates of these transfer effects, which would improve understanding of their underlying mechanisms, have not yet been shown in older participants. In this study, we investigated blood-oxygen-level-dependent (BOLD) signal changes during n-back performance and an untrained delayed recognition (Sternberg) task following 12 sessions (45 min each) of adaptive n-back training in older adults. The Sternberg task used in this study allowed us to test for neural training effects independent of the specific task affordances of the trained task and to separate maintenance from updating processes. Thirty-two healthy older participants (60-75 years) were assigned either to an n-back training group or a no-contact control group. Before (t1) and after (t2) the training/waiting period, both the n-back task and the Sternberg task were conducted while BOLD signal was measured using functional magnetic resonance imaging (fMRI) in all participants. In addition, neuropsychological tests were performed outside the scanner. WM performance improved with training, and behavioral transfer to tests measuring executive functions, processing speed, and fluid intelligence was found. In the training group, BOLD signal in the right lateral middle frontal gyrus/caudal superior frontal sulcus (Brodmann area, BA 6/8) decreased in both the trained n-back task and the updating condition of the untrained Sternberg task at t2, compared to the control group. fMRI findings indicate a training-related increase in the processing efficiency of WM networks, potentially related to the process of WM updating.
Performance gains in untrained tasks suggest that transfer to other cognitive tasks remains possible in aging. (C) 2016 Elsevier Inc. All rights reserved.
The influence of Pavlovian conditioned stimuli on ongoing behavior may contribute to explaining how alcohol cues stimulate drug seeking and intake. Using a Pavlovian-instrumental transfer task, we investigated the effects of alcohol-related cues on approach behavior (i.e., instrumental response behavior) and its neural correlates, and related both to relapse after detoxification in alcohol-dependent patients. Thirty-one recently detoxified alcohol-dependent patients and 24 healthy controls underwent instrumental training, where approach or non-approach towards initially neutral stimuli was reinforced by monetary incentives. Approach behavior was tested during extinction with either alcohol-related or neutral stimuli (as Pavlovian cues) presented in the background during functional magnetic resonance imaging (fMRI). Patients were subsequently followed up for 6 months. We observed that alcohol-related background stimuli inhibited approach behavior in detoxified alcohol-dependent patients (t = -3.86, p < .001), but not in healthy controls (t = -0.92, p = .36). This behavioral inhibition was associated with neural activation in the nucleus accumbens (NAcc) (t((30)) = 2.06, p < .05). Interestingly, both effects were present only in subsequent abstainers, not in relapsers, and in those with mild but not severe dependence. Our data show that alcohol-related cues can acquire inhibitory behavioral features typical of aversive stimuli despite being accompanied by a stronger NAcc activation, suggesting salience attribution. The fact that these findings are restricted to abstinence and milder illness suggests that they may be potential resilience factors.
Different systems for habitual versus goal-directed control are thought to underlie human decision-making. Working memory is known to shape these decision-making systems and
their interplay, and is known to support goal-directed decision making even under stress. Here, we investigated if and how decision systems are differentially influenced by breaks filled with diverse everyday life activities known to modulate working memory performance. We used a within-subject design where young adults listened to music and played a video game during breaks interleaved with trials of a sequential two-step Markov decision task, designed to assess habitual as well as goal-directed decision making. Based on a neurocomputational model of task performance, we observed that for individuals with a rather limited working memory capacity video gaming as compared to music reduced reliance on the goal-directed decision-making system, while a rather large working memory capacity prevented such a decline. Our findings suggest differential effects of everyday activities on key decision-making processes.
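Neurocomputational models of the two-step task typically weight model-based and model-free action values at the first stage with a mixing parameter. A minimal, hypothetical sketch of that weighting (function name and values are illustrative; this is not the study's fitted model):

```python
# Hybrid valuation as commonly used for two-step-task models:
# a weight w mixes model-based (MB) and model-free (MF) action values at stage 1.
def hybrid_value(q_mb: float, q_mf: float, w: float) -> float:
    """Combine values; w = 1 -> fully goal-directed, w = 0 -> fully habitual."""
    assert 0.0 <= w <= 1.0, "mixing weight must lie in [0, 1]"
    return w * q_mb + (1.0 - w) * q_mf

# A reduced w (e.g., after an activity that taxes working memory) shifts
# stage-1 choices toward cached model-free values.
print(hybrid_value(1.0, 0.0, 0.7))  # → 0.7
```

In this framing, the reported decline in goal-directed control for low-working-memory individuals after video gaming corresponds to a smaller fitted w, while a larger working memory capacity keeps w closer to its pre-break level.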
Background: Human and animal work suggests a shift from goal-directed to habitual decision-making in addiction. However, the evidence for this in human alcohol dependence is as yet inconclusive. Methods: Twenty-six healthy controls and 26 recently detoxified alcohol-dependent patients underwent behavioral testing with a 2-step task designed to disentangle goal-directed and habitual response patterns. Results: Alcohol-dependent patients showed less evidence of goal-directed choices than healthy controls, particularly after losses. There was no difference in the strength of the habitual component. The group differences did not survive controlling for performance on the Digit Symbol Substitution Task. Conclusion: Chronic alcohol use appears to selectively impair goal-directed function, rather than promoting habitual responding. It appears to do so particularly after nonrewards, and this may be mediated by the effects of alcohol on more general cognitive functions subserved by the prefrontal cortex.
Background
The metabolic syndrome (MetS) is a risk cluster for a number of secondary diseases. The implementation of prevention programs requires early detection of individuals at risk. However, access to health care providers is limited in structurally weak regions. Brandenburg, a rural federal state in Germany, has an especially high MetS prevalence and disease burden. This study aims to validate and test the feasibility of a setup for mobile diagnostics of MetS and its secondary diseases, to evaluate the MetS prevalence and its association with moderating factors in Brandenburg and to identify new ways of early prevention, while establishing a “Mobile Brandenburg Cohort” to reveal new causes and risk factors for MetS.
Methods
In a pilot study, setups for mobile diagnostics of MetS and secondary diseases will be developed and validated. A van will be equipped as an examination room using point-of-care blood analyzers and by mobilizing standard methods. In study part A, these mobile diagnostic units will be placed at different locations in Brandenburg to locally recruit 5000 participants aged 40-70 years. They will be examined for MetS and advice on nutrition and physical activity will be provided. Questionnaires will be used to evaluate sociodemographics, stress perception, and physical activity. In study part B, participants with MetS, but without known secondary diseases, will receive a detailed mobile medical examination, including MetS diagnostics, medical history, clinical examinations, and instrumental diagnostics for internal, cardiovascular, musculoskeletal, and cognitive disorders. Participants will receive advice on nutrition and an exercise program will be demonstrated on site. People unable to participate in these mobile examinations will be interviewed by telephone. If necessary, participants will be referred to general practitioners for further diagnosis.
Discussion
The mobile diagnostics approach enables early detection of individuals at risk, and their targeted referral to local health care providers. Evaluation of the MetS prevalence, its relation to risk-increasing factors, and the “Mobile Brandenburg Cohort” create a unique database for further longitudinal studies on the implementation of home-based prevention programs to reduce mortality, especially in rural regions.
Trial registration
German Clinical Trials Register, DRKS00022764; registered 07 October 2020 (retrospectively registered).
Background: The purpose of this study was to analyze the prevalence of long-term benzodiazepine use in older adults treated in general and neuropsychiatric practices in Germany. Methods: This study included 32,182 patients over the age of 65 years who received benzodiazepine prescriptions for the first time between January 2010 and December 2014 in general and neuropsychiatric practices in Germany. Follow-up lasted until July 2016. The main outcome measure was the proportion of patients treated with benzodiazepines for >6 months. Results: The proportion of patients with benzodiazepine therapy for >6 months increased with age (65-70 years: 12.3%; 71-80 years: 15.5%; 81-90 years: 23.7%; >90 years: 31.6%) but did not differ significantly between men (15.5%) and women (17.1%). The proportion of patients who received benzodiazepines for >6 months was higher among those with sleep disorders (21.1%), depression (20.8%), and dementia (32.1%) than among those with anxiety (15.5%). By contrast, this proportion was lower among people diagnosed with adjustment disorders (7.7%) and back pain (3.8%). Conclusion: Overall, long-term use of benzodiazepines is common in older people, particularly in patients over the age of 80 and in those diagnosed with dementia, sleep disorders, or depression.
Rationale: Advances in neurocomputational modeling suggest that the valuation systems for goal-directed (deliberative) and habitual (automatic) decision-making may rely on distinct computational strategies for reinforcement learning, namely model-free vs. model-based learning. As a key theoretical difference, the model-based system strongly demands cognitive functions to plan actions prospectively based on an internal cognitive model of the environment, whereas valuation in the model-free system relies on rather simple learning rules from operant conditioning to retrospectively associate actions with their outcomes and is thus cognitively less demanding. Acute stress reactivity is known to impair model-based but not model-free choice behavior, with higher working memory capacity protecting the model-based system from acute stress. However, it is not clear what impact accumulated real-life stress has on model-free and model-based decision systems, or how this influence interacts with cognitive abilities. Methods: We used a sequential decision-making task distinguishing the relative contributions of both learning strategies to choice behavior, the Social Readjustment Rating Scale questionnaire to assess accumulated real-life stress, and the Digit Symbol Substitution Test to measure cognitive speed in 95 healthy subjects. Results: Individuals reporting high stress exposure who had low cognitive speed showed reduced model-based but increased model-free behavioral control. In contrast, subjects exposed to accumulated real-life stress who had high cognitive speed displayed increased model-based performance but reduced model-free control. Conclusion: These findings suggest that accumulated real-life stress exposure can enhance reliance on cognitive speed for model-based computations, which may ultimately protect the model-based system from the detrimental influences of such stress. The combination of accumulated real-life stress exposure and slower information processing, however, might favor model-free strategies. Thus, the balance between the two systems strongly depends on stressful experiences and individual cognitive capacities.
The aim was to analyze the risk of hip fracture in German primary care patients with dementia. This study included patients aged 65-90 from 1072 primary care practices who were first diagnosed with dementia between 2010 and 2013. Controls were matched (1:1) to cases for age, sex, and type of health insurance. The primary outcome was the diagnosis of hip fracture during the three-year follow-up period. A total of 53,156 dementia patients and 53,156 controls were included. A total of 5.3% of patients and 0.7% of controls displayed hip fracture after three years. Hip fracture occurred more frequently in dementia subjects living in nursing homes than in those living at home (9.2% versus 4.3%). Dementia, residence in nursing homes, and osteoporosis were risk factors for fracture development. Antidementia, antipsychotic, and antidepressant drugs generally had no significant impact on hip fracture risk when prescribed for less than six months. Dementia increased hip fracture risk in German primary care practices.
In recent years, digital technologies have become a major means of providing health-related services, a trend strongly reinforced by the Coronavirus disease 2019 (COVID-19) pandemic. Since regular physical activity has positive effects on physical and mental health and is thus an important prerequisite for healthy aging, digital technologies are also increasingly used to promote unstructured and structured forms of physical activity. In the course of this development, however, several terms (e.g., Digital Health, Electronic Health, Mobile Health, Telehealth, Telemedicine, and Telerehabilitation) have been introduced to refer to the application of digital technologies to provide health-related services such as physical interventions. Unfortunately, these terms are often used in different ways, and also relatively interchangeably. Given that ambiguous terminology is a major source of difficulty in scientific communication and can impede the progress of theoretical and empirical research, this article aims to make the reader aware of the subtle differences between the relevant terms applied at the intersection of physical activity and Digital Health, and to provide state-of-the-art definitions for them.
Objective To assess the structures for early diagnosis of dementia at hospitals in Germany.
Methods Questionnaire survey.
Results 14% of the 1758 contacted institutions responded. 52% reported offering such a service, largely using guideline-oriented procedures such as cerebrospinal fluid diagnostics. The diagnostic spectrum comprised dementias (46%) and diagnoses of mild or subjective cognitive impairment (41%).
Conclusion Guideline-based diagnostics and early-detection concepts are largely established in memory clinics.
(1) Background: People with HIV (PWH) may perform more than one type of exercise cumulatively. The objective of this study was to investigate recreational exercise and its association with health-related quality of life (HRQOL) and comorbidities in relation to potential covariates. (2) Methods: The HIBES study (HIV-Begleiterkrankungen-Sport) is a cross-sectional study of people with HIV. Differences between non-exercisers and exercisers (cumulative vs. single type of exercise) were investigated using regression models based on 454 participants. (3) Results: Exercisers showed a higher HRQOL score compared to non-exercisers (Wilcoxon r = 0.2 to 0.239). Psychological disorders were identified as the main covariate. Participants performing exercise cumulatively showed higher duration, frequency, and intensity compared to participants performing only one type of exercise. The mental health summary score was higher for both cumulative and single types of exercise if a psychological disorder existed. Duration and intensity were associated with an increase in HRQOL, and a stronger association between psychological disorders and exercise variables was evident. Exercise duration (minutes) showed a significant effect on HRQOL overall (standardized beta = 0.1) and for participants with psychological disorders (standardized beta = 0.3). (4) Conclusions: Psychological disorders and other covariates have a prominent effect on HRQOL and its association with exercise. For PWH with a psychological disorder, a stronger relationship between HRQOL and exercise duration and intensity emerged. However, the differentiation of high-HRQOL individuals warrants further investigation by considering additional factors.
Evaluation of technology-based interventions for informal caregivers of patients with dementia
(2019)
Objective: The aim of this study was to estimate the efficacy of technology-based interventions for informal caregivers of people with dementia (PWD). Methods: PubMed, PsycINFO, and Cochrane Library databases were searched in August 2018, with no restrictions on language or publication date. Two independent reviewers identified 33 eligible randomized controlled trials (RCTs) evaluating a technology-based intervention for informal carers of PWD. Meta-analyses for the outcome measures caregiver depression and caregiver burden were conducted, with subgroup analyses according to mode of delivery (telephone, computer/web-based, combined interventions). Methodological quality was rated using the Cochrane risk-of-bias assessment. Results: Meta-analyses revealed a small but significant postintervention effect of technology-based interventions on caregiver depression and caregiver burden. Combined interventions showed the strongest effects. Conclusion: Technology-based interventions have the potential to support informal caregivers of PWD. Because of advantages such as high flexibility and availability, technology-based interventions provide a promising alternative to "traditional services," e.g., for people living in rural areas. More high-quality RCTs for specific caregiver groups are needed.
Genetic and environmental factors both contribute to cognitive test performance. A substantial increase in average intelligence test results in the second half of the previous century within one generation is unlikely to be explained by genetic changes. One possible explanation for the strong malleability of cognitive performance measures is that environmental factors modify gene expression via epigenetic mechanisms. Epigenetic factors may help explain the recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events. The possible manifestation of malleable biomarkers contributing to variance in cognitive test performance, and thus possibly contributing to the "missing heritability" between estimates from twin studies and variance explained by genetic markers, is still unclear. Here we show in 1475 healthy adolescents from the IMaging and GENetics (IMAGEN) sample that general IQ (gIQ) is associated with (1) polygenic scores for intelligence, (2) epigenetic modification of the DRD2 gene, (3) gray matter density in the striatum, and (4) functional striatal activation elicited by temporarily surprising reward-predicting cues. Comparing their relative importance for the prediction of gIQ in an overlapping subsample, our results demonstrate neurobiological correlates of the malleability of gIQ and point to the equal importance of genetic variance, epigenetic modification of the DRD2 receptor gene, and functional striatal activation, known to influence dopamine neurotransmission. Peripheral epigenetic markers need confirmation in the central nervous system and should be tested in longitudinal settings specifically assessing individual and environmental factors that modify epigenetic structure.
Psychiatric wards are an important element in the psychiatric care of people at acute risk of harming themselves or others. Unfortunately, aggression, violence (conflict), and the use of coercion (containment) repeatedly occur in this setting. Both the quantity and the quality of staff are considered decisive factors for the appropriate handling of such situations. Against this background, the present study examines the care situation on acute psychiatric wards. The hypothesis is that both the size of the acute psychiatric ward and the number of nursing staff influence the occurrence of conflict situations. Data were collected in 6 clinics on a total of 12 psychiatric wards, using the Patient Staff Conflict Checklist – Shift Report (PCC-SR) as the recording instrument. A total of 2026 shifts (early, late, and night) were recorded and analyzed. Nurse staffing levels varied considerably across wards. The results show that both ward size and the number of nursing staff on acute psychiatric wards have a significant influence on the occurrence of conflicts. The results further show that the incidence of conflict behavior among patients differs both across the wards of the participating hospitals and across the shift types examined. Moreover, the degree to which an acute ward is locked and the size of a ward have a negative influence on the incidence of conflicts in the acute inpatient psychiatric context. The occurrence of conflict behavior can lead to danger to self or others and to a variety of de-escalation and containment measures, which require corresponding staffing resources.
Background
Recent research has emphasized the role of inflammatory processes in the pathophysiology of depression. Theories hypothesize that life events (LE) can affect the immune system and trigger depressive symptoms. LE are also considered among the best predictors of the onset and course of depressive disorders.
Methods
Observational study across three treatment settings: n = 208 depressed patients (75.5% female, mean age 46.6 years) were examined for depression (BDI-II), life events (ILE), and inflammatory markers (IL-6, CRP, fibrinogen, ICAM-1, TNF-alpha, E-selectin) at baseline (t0), 5-week (t1), and 5-month (t2) follow-up. Effects and interactions were analyzed with regression models.
Results
LE were associated with depressive symptoms at t0 (beta = .209; p = .002) and at both follow-ups. Except for CRP, which was linked to depressive symptoms at t2 (beta = -.190; p = .032), there were no effects of inflammatory markers on depressive symptoms. At t1, an interaction between CRP and LE in total (beta = -.249; p = .041) was found, as well as for LE in the past five years (beta = -.122; p = .027). Similar interactions were found between cumulative LE and ICAM-1 (beta = -.197; p = .003) and IL-6 (beta = -.425; p = .001).
Conclusion
The cumulative burden of LE affects symptoms and treatment outcome in depressive patients. There is some evidence that inflammatory markers may have long-term effects on treatment outcome, as they seem to weaken the relation between LE and depression.
Objective: The purpose of this systematic review and meta-analysis was to examine the effects of exercise on depression and anxiety in people living with HIV (PLWH) and to evaluate, through subgroup analysis, the effects of exercise type, frequency, supervision by exercise professionals, study quality, and control group conditions on these outcomes. Method: A literature search was conducted through four electronic databases from inception to February 2019. Considered for inclusion were randomized controlled trials (RCTs) investigating exercise interventions with depression or anxiety as outcomes in people living with HIV (>= 18 years of age). Ten studies were included (n = 479 participants, 49.67% female at baseline), and the standardized mean difference (SMD) and heterogeneity were calculated using random-effect models. An additional pre-post meta-analysis was also conducted. Results: A large effect in favor of exercise when compared to controls was found for depression (SMD = -0.84, 95%CI = [-1.57, -0.11], p = 0.02) and anxiety (SMD = -1.23, 95%CI = [-2.42, -0.04], p = 0.04). Subgroup analyses for depression revealed large effects for aerobic exercise only (SMD = -0.96, 95%CI = [-1.63, -0.30], p = 0.004), a frequency of >= 3 exercise sessions per week (SMD = -1.39, 95%CI = [-2.24, -0.54], p < 0.001), professionally supervised exercise (SMD = -1.40, 95%CI = [-2.46, -0.17], p = 0.03), and high-quality studies (SMD = -1.31, 95%CI = [-2.46, -0.17], p = 0.02). Conclusion: Exercise seems to decrease depressive symptoms and anxiety in PLWH, but further large, high-quality studies are needed to verify these effects.
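The standardized mean difference reported above can be sketched as Cohen's d with a pooled standard deviation. This is a simplified illustration with invented numbers; the review itself pooled study-level SMDs with random-effect models and meta-analyses typically also apply a Hedges' g small-sample correction.

```python
import math

# Sketch of the standardized mean difference (Cohen's d with pooled SD).
# All numbers are invented for illustration, not data from the review.

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between treatment and control groups."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

# An exercise group scoring lower on a depression scale than controls
# yields a negative SMD, matching the sign convention in the review.
print(round(smd(10.0, 4.0, 30, 14.0, 4.0, 30), 2))  # -> -1.0
```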
Bone pathology is frequent in stressed individuals. A comprehensive examination of mechanisms linking life stress, depression and disturbed bone homeostasis is missing. In this translational study, mice exposed to early life stress (MSUS) were examined for bone microarchitecture (μCT), metabolism (qPCR/ELISA), and neuronal stress mediator expression (qPCR) and compared with a sample of depressive patients with or without early life stress by analyzing bone mineral density (BMD) (DXA) and metabolic changes in serum (osteocalcin, PINP, CTX-I). MSUS mice showed a significant decrease in NGF, NPYR1, VIPR1 and TACR1 expression, higher innervation density in bone, and increased serum levels of CTX-I, suggesting a milieu in favor of catabolic bone turnover. MSUS mice had a significantly lower body weight compared to control mice, and this caused minor effects on bone microarchitecture. Depressive patients with experiences of childhood neglect also showed a catabolic pattern. A significant reduction in BMD was observed in depressive patients with childhood abuse and stressful life events during childhood. Therefore, future studies on prevention and treatment strategies for both mental and bone disease should consider early life stress as a risk factor for bone pathologies.
Effects of Aerobic and Resistance Exercise on Cardiovascular Parameters for People Living With HIV
(2019)
People living with HIV (PLWH) have limited exercise capacity because of anemia, neuromuscular disorders, and pulmonary limitations. We used a meta-analysis to examine the effect of aerobic and resistance exercise, alone and in combination, on cardiovascular parameters. Subgroup meta-analyses were conducted and long-term effects of exercise were investigated. A systematic literature search was conducted up to July/August 2017. The Physiotherapy Evidence Database (PEDro) scale was used to rate quality and assess the risk of bias of the papers. Standardized mean differences (SMDs) were calculated to assess the effect of exercise. Posttreatment comparison between the exercise and control groups revealed moderate and large effect sizes in favor of the intervention group for VO2max (SMD = 0.66, p < .0001) and the 6-minute walk test (SMD = 1.11, p = .0001). Exercise had a positive effect on cardiovascular parameters in PLWH and can thus be a preventive factor for PLWH dealing with multiple comorbidities.
Background: Infection with human immunodeficiency virus (HIV) affects muscle mass, altering the independent activities of people living with HIV (PLWH). Resistance training alone (RT) or combined with aerobic exercise (AE) is linked to improved muscle mass and strength maintenance in PLWH. These exercise benefits have been the focus of different meta-analyses, although only a limited number of studies had been identified up to 2013/2014. An up-to-date systematic review and meta-analysis concerning the effect of RT alone or combined with AE on strength parameters and hormones is of high value, since more recent studies dealing with these types of exercise in PLWH have been published. Methods: A systematic search for randomized controlled trials evaluating the effects of RT alone, AE alone, or the combination of both (AERT) in PLWH was performed through five web databases up to December 2017. Risk of bias and study quality were assessed using the PEDro scale. Weighted mean differences (WMDs) from baseline to post-intervention were calculated, and the I² statistic for heterogeneity was computed.
Results: Thirteen studies reported strength outcomes. Eight studies presented a low risk of bias. The overall change in upper body strength was 19.3 kg (95% CI: 9.8 to 28.8, p < 0.001) after AERT and 17.5 kg (95% CI: 16 to 19.1, p < 0.001) for RT. The change in lower body strength was 29.4 kg (95% CI: 18.1 to 40.8, p < 0.001) after RT and 10.2 kg (95% CI: 6.7 to 13.8, p < 0.001) for AERT. Changes were higher after controlling for risk of bias in upper and lower body strength and for supervised exercise in lower body strength. A significant change towards lower levels of IL-6 was found (-2.4 ng/dl; 95% CI: -2.6 to -2.1; p < 0.001). Conclusion: Both resistance training alone and combined with aerobic exercise showed a positive change when studies with low risk of bias and professional supervision were analyzed, improving upper and, more critically, lower body muscle strength. This study also found that exercise had a lowering effect on IL-6 levels in PLWH.
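The I² statistic named in the methods quantifies the share of between-study variability beyond chance. A minimal sketch with hypothetical per-study effects and variances (not data from this review):

```python
def i_squared(effects, variances):
    """Higgins' I² from per-study effect sizes and their variances.
    Uses fixed-effect weights w_i = 1/v_i; Q is Cochran's heterogeneity statistic."""
    w = [1 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# Hypothetical per-study WMDs (kg) and their variances
i2 = i_squared([19.3, 17.5, 29.4, 10.2], [4.0, 1.5, 9.0, 2.5])
```

Values above roughly 75% are conventionally read as considerable heterogeneity, which motivates the subgroup analyses described above.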
Objective: Demographic changes are increasing the pressure to improve therapeutic strategies against cognitive decline in Alzheimer disease (AD) and mild cognitive impairment (MCI). Besides drug treatment, physical activity seems to be a promising intervention target, as epidemiological and clinical studies suggest beneficial effects of exercise training on cognition. Using comparable inclusion and exclusion criteria, we analyzed the efficacy of drug therapy (cholinesterase inhibitors, memantine, and Ginkgo biloba) and of exercise interventions for improving cognition in AD and MCI populations. Methods: We searched The Cochrane Library, EBSCO, OVID, Web of Science, and U.S. Food and Drug Administration data from inception through October 30, 2013. We included randomized controlled trials in which at least one treatment arm consisted of an exercise or a pharmacological intervention for AD or MCI patients, and which had either a non-exposed control condition or a control condition that received another intervention. Treatment discontinuation rates and Standardized Mean Change scores using Raw score standardization (SMCR) of cognitive performance were calculated. Results: Discontinuation rates varied substantially, ranging between 0% and 49% with a median of 18%. Significantly increased discontinuation rates were found for galantamine and rivastigmine as compared to placebo in AD studies. Drug treatments resulted in a small pooled effect on cognition (SMCR: 0.23, 95% CI: 0.20 to 0.25) in AD studies (N = 45, 18,434 patients) and no effect in any of the MCI studies (N = 5, 3,693 patients; SMCR: 0.03, 95% CI: 0.00 to 0.005). Exercise interventions had a moderate to strong pooled effect size (SMCR: 0.83, 95% CI: 0.59 to 1.07) in AD studies (N = 4, 119 patients), and a small effect size (SMCR: 0.20, 95% CI: 0.11 to 0.28) in MCI (N = 6, 443 patients).
Conclusions: Drug treatments have a small but significant impact on cognitive functioning in AD and exercise has the potential to improve cognition in AD and MCI. Head-to-head trials with sufficient statistical power are necessary to directly compare efficacy, safety, and acceptability. Combining these two approaches might further increase the efficacy of each individual intervention. Identifier: PROSPERO (2013:CRD42013003910).
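The pooled effects reported in meta-analyses like this one are typically obtained by inverse-variance weighting. A minimal fixed-effect sketch with hypothetical per-trial values (a random-effects model would add a between-study variance term to each weight):

```python
from math import sqrt

def pool_fixed_effect(effects, ses):
    """Fixed-effect inverse-variance pooled estimate with a 95% CI.
    effects: per-study effect sizes; ses: their standard errors."""
    w = [1 / se**2 for se in ses]                       # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    se_pooled = sqrt(1 / sum(w))                        # SE of the pooled estimate
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-trial SMCRs and standard errors, not values from the review
est, ci = pool_fixed_effect([0.25, 0.20, 0.24], [0.05, 0.04, 0.06])
```

Precise studies (small standard errors) dominate the pooled estimate, which is why a few large AD drug trials can pin down a small effect tightly.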
Substance-dependent individuals often lack the ability to adjust decisions flexibly in response to changes in reward contingencies. Prediction errors (PEs) are thought to mediate flexible decision-making by updating the reward values associated with available actions. In this study, we explored whether the neurobiological correlates of PEs are altered in alcohol dependence. Behavioral and functional magnetic resonance imaging (fMRI) data were simultaneously acquired from 34 abstinent alcohol-dependent patients (ADP) and 26 healthy controls (HC) during a probabilistic reward-guided decision-making task with dynamically changing reinforcement contingencies. A hierarchical Bayesian inference method was used to fit and compare learning models with different assumptions about the amount of task-related information subjects may have inferred during the experiment. Here, we observed that the best-fitting model was a modified Rescorla-Wagner type model, the “double-update” model, which assumes that subjects infer that reward contingencies are anti-correlated and integrate both actual and hypothetical outcomes into their decisions. Moreover, comparison of the best-fitting model's parameters showed that ADP were less sensitive to punishments compared to HC. Hence, decisions of ADP after punishments were only loosely coupled with the expected reward values assigned to them. A correlation analysis between the model-generated PEs and the fMRI data revealed a reduced association between these PEs and the BOLD activity in the dorsolateral prefrontal cortex (DLPFC) of ADP. A hemispheric asymmetry was observed in the DLPFC when positive and negative PE signals were analyzed separately. The right DLPFC activity in ADP showed a reduced correlation with positive PEs. On the other hand, ADP, particularly patients with high dependence severity, recruited the left DLPFC to a lesser extent than HC for processing negative PE signals.
These results suggest that the DLPFC, which has been linked to adaptive control of action selection, may play an important role in cognitive inflexibility observed in alcohol dependence when reinforcement contingencies change. Particularly, the left DLPFC may contribute to this impaired behavioral adaptation, possibly by impeding the extinction of the actions that no longer lead to a reward.
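The "double-update" learning rule described above can be sketched as follows; the learning rate, initial values, and the example trial sequence are illustrative assumptions, not the fitted model from the study:

```python
def double_update_rw(choices, rewards, alpha=0.3, q0=0.5):
    """'Double-update' Rescorla-Wagner sketch for a two-option task with
    anti-correlated reward contingencies: the chosen option is updated with the
    actual outcome, the unchosen option with the hypothetical (opposite) outcome.
    choices: list of 0/1 selected actions; rewards: list of 0/1 actual outcomes."""
    q = [q0, q0]
    pes = []
    for c, r in zip(choices, rewards):
        pe = r - q[c]                              # prediction error for the chosen action
        pes.append(pe)
        q[c] += alpha * pe                         # update from the actual outcome
        q[1 - c] += alpha * ((1 - r) - q[1 - c])   # update from the hypothetical outcome
    return q, pes

# Two rewarded choices of option 0, then a punished choice of option 1
q, pes = double_update_rw([0, 0, 1], [1, 1, 0])
```

The trial-by-trial `pes` values are what such studies regress against BOLD activity; the hypothetical update is what distinguishes this model from a standard single-update Rescorla-Wagner learner.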
Alcohol use disorder (AUD) is the most common substance use disorder worldwide. Although dopamine-related findings were often observed in AUD, associated neurobiological mechanisms are still poorly understood. Therefore, in the present study, we investigate D2/3 receptor availability in healthy participants, participants at high risk (HR) to develop addiction (not diagnosed with AUD), and AUD patients in a detoxified stage, applying F-18-fallypride positron emission tomography (F-18-PET). Specifically, D2/3 receptor availability was investigated in (1) 19 low-risk (LR) controls, (2) 19 HR participants, and (3) 20 AUD patients after alcohol detoxification. Quality and severity of addiction were assessed with clinical questionnaires and (neuro)psychological tests. PET data were corrected for age of participants and smoking status. In the dorsal striatum, we observed significant reductions of D2/3 receptor availability in AUD patients compared with LR participants. Further, receptor availability in HR participants was observed to be intermediate between LR and AUD groups (linearly decreasing). Still, in direct comparison, no group difference was observed between LR and HR groups or between HR and AUD groups. Further, the score of the Alcohol Dependence Scale (ADS) was inversely correlated with D2/3 receptor availability in the combined sample. Thus, in line with a dimensional approach, striatal D2/3 receptor availability showed a linear decrease from LR participants to HR participants to AUD patients, which was paralleled by clinical measures. Our study shows that a core neurobiological feature in AUD seems to be detectable in an early, subclinical state, allowing more individualized alcohol prevention programs in the future.
Purpose: Postoperative cognitive dysfunction (POCD) appears in up to 30% of patients suffering from postoperative delirium (POD). Both are associated with higher mortality and postoperative complications, prolonged hospital stays, and increased costs. Multimodal models with pre-admission risk reduction counselling, perioperative monitoring, and training of multidisciplinary patient care providers have been shown to decrease the prevalence of both. The aim of our study is to understand how far those measures are known and implemented in routine care and to detect potential gaps in current practice regarding risk communication and the flow of information between the caregivers involved with patients at risk for POD/POCD. Patients and Methods: As part of a multicenter study, seven semi-structured focus group (FG) discussions with nurses and physicians from tertiary care hospitals (surgery, anesthesiology, and orthopedics, n = 31) and general practitioners (GPs) in private practice (n = 7) were performed. Transcribed discussions were analyzed using qualitative content analysis. Results: POD is present above all in the daily work of nurses, whereas physicians do not perceive it as a relevant problem. Physicians report that no regular risk assessment or risk communication is performed prior to elective surgery. Information about POD often gets lost during hand-offs and is not regularly reported in discharge letters. Thus, persisting cognitive dysfunction is often missed. The importance of standardized documentation and continuous education concerning risks, screening, and treatment was emphasized. The often-suggested preoperative medication adjustment was seen as less important; in contrast, avoiding withdrawal was regarded as far more important. Conclusion: Altogether, it seems that standards and available best practice concepts are rarely implemented. In contrast to physicians, nurses are highly aware of delirium and ask for standardized procedures and more responsibility.
Therefore, raising awareness regarding risks, screening tools, and effective preventive measures for POD/POCD seems an urgent goal. Nurses should have a central role in coordination and care of POD to prevent the risk for POCD.
Dimensional psychiatry
(2014)
A dimensional approach in psychiatry aims to identify core mechanisms of mental disorders across nosological boundaries.
We compared anticipation of reward between major psychiatric disorders, and investigated whether reward anticipation is impaired in several mental disorders and whether there is a common psychopathological correlate (negative mood) of such an impairment.
We used functional magnetic resonance imaging (fMRI) and a monetary incentive delay (MID) task to study the functional correlates of reward anticipation across major psychiatric disorders in 184 subjects with diagnoses of alcohol dependence (n = 26), schizophrenia (n = 44), major depressive disorder (MDD, n = 24), bipolar disorder (acute manic episode, n = 13), and attention deficit/hyperactivity disorder (ADHD, n = 23), as well as healthy controls (n = 54). Subjects' individual Beck Depression Inventory and State-Trait Anxiety Inventory scores were correlated with clusters showing significant activation during reward anticipation.
During reward anticipation, we observed significant group differences in ventral striatal (VS) activation: patients with schizophrenia, alcohol dependence, and major depression showed significantly less ventral striatal activation compared to healthy controls. Depressive symptoms correlated with dysfunction in reward anticipation regardless of diagnostic entity. There was no significant correlation between anxiety symptoms and VS functional activation.
Our findings demonstrate a neurobiological dysfunction related to reward prediction that transcended disorder categories and was related to measures of depressed mood. The findings underline the potential of a dimensional approach in psychiatry and strengthen the hypothesis that neurobiological research in psychiatric disorders can be targeted at core mechanisms that are likely to be implicated in a range of clinical entities.
The interruption of learning processes by breaks filled with diverse activities is common in everyday life. We investigated the effects of active computer gaming and passive relaxation (rest and music) breaks on working memory performance. Young adults were exposed to breaks involving (i) eyes-open resting, (ii) listening to music and (iii) playing the video game “Angry Birds” before performing the n-back working memory task. Based on linear mixed-effects modeling, we found that playing the “Angry Birds” video game during a short learning break led to a decline in task performance over the course of the task as compared to eyes-open resting and listening to music, although overall task performance was not impaired. This effect was associated with high levels of daily mind wandering and low self-reported ability to concentrate. These findings indicate that video games can negatively affect working memory performance over time when played in between learning tasks. We suggest further investigation of these effects because of their relevance to everyday activity.
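Scoring the n-back task used here follows a simple rule: a trial is a target when the stimulus matches the one presented n positions earlier. A minimal sketch with assumed letter stimuli:

```python
def nback_targets(stimuli, n=2):
    """Mark targets: stimulus i is a target if it matches stimulus i - n."""
    return [i >= n and stimuli[i] == stimuli[i - n] for i in range(len(stimuli))]

def nback_accuracy(stimuli, responses, n=2):
    """Proportion of trials on which the match/non-match response is correct."""
    targets = nback_targets(stimuli, n)
    return sum(t == r for t, r in zip(targets, responses)) / len(stimuli)

# For "ABABCAC" with n = 2, positions 2, 3, and 6 are targets
targets = nback_targets(list("ABABCAC"))
```

Performance "over the course of the task" in the study above would then be analyzed by computing such accuracy scores per block and modeling their trend with linear mixed-effects models.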
The interruption of learning processes by breaks filled with diverse activities is common in everyday life. This study investigated the effects of active computer gaming and passive relaxation (rest and music) breaks on auditory versus visual memory performance. Young adults were exposed to breaks involving (a) eyes-open resting, (b) listening to music, and (c) playing a video game, immediately after memorizing auditory versus visual stimuli. To assess learning performance, words were recalled directly after the break (a delay of 8 minutes 30 seconds) and were recalled and recognized again after 7 days. Based on linear mixed-effects modeling, it was found that playing the Angry Birds video game during a short learning break impaired long-term retrieval in auditory learning but enhanced long-term retrieval in visual learning compared with the music and rest conditions. These differential effects of video games on visual versus auditory learning suggest specific interference of common break activities with learning.
Demographic change will be accompanied not only by a rapid increase in the very old [1], which, given age-associated incidence rates, primarily means an increase in dementia and in patients with multimorbidity and frailty for geriatric psychiatric care [2], but also by an increase in younger old people aged 65 to 75, which for geriatric psychiatry means an increase in patients with addiction disorders, disorders of the schizophrenia spectrum, and affective disorders. Social factors will increasingly play a central role here, since, alongside the quality of medical care, it is above all the individual social situation of patients that will be associated with increased morbidity and mortality [3].
The goal of this study was to determine the prevalence of depression and its risk factors in patients with late-onset rheumatoid arthritis (RA) treated in German primary care practices. Longitudinal data from general practices (n = 1072) throughout Germany were analyzed. Individuals initially diagnosed with RA (2009-2013) were identified, and 7301 patients were included and matched (1:1) to 7301 controls. The primary outcome measure was the initial diagnosis of depression within 5 years after the index date in patients with and without RA. Cox proportional hazards models were used to adjust for confounders. The mean age was 72.2 years (SD: 7.6 years). A total of 34.9% of patients were men. Depression diagnoses were present in 22.0% of the RA group and 14.3% of the control group after a 5-year follow-up period (p < 0.001). In the multivariate regression model, RA was a strong risk factor for the development of depression (HR: 1.55, p < 0.001). There was a significant interaction of RA and diagnosed inflammatory polyarthropathies (IP) (RA*IP interaction: p < 0.001). Furthermore, dementia, cancer, osteoporosis, hypertension, and diabetes were associated with a higher risk of developing depression (p values < 0.001). The risk of depression is significantly higher in patients with late-onset RA than in patients without RA for subjects treated in primary care practices in Germany. RA patients should be screened routinely for depression in order to ensure improved treatment and management.
Background: The goal of this study was to estimate the prevalence of and risk factors for diagnosed depression in heart failure (HF) patients in German primary care practices.
Methods: This study was a retrospective database analysis in Germany utilizing the Disease Analyzer (R) Database (IMS Health, Germany). The study population included 132,994 patients between 40 and 90 years of age from 1,072 primary care practices. The observation period was between 2004 and 2013. Follow-up lasted up to five years and ended in April 2015. A total of 66,497 HF patients were selected after applying exclusion criteria. An equal number of controls (66,497) were chosen and matched (1:1) to HF patients on the basis of age, sex, health insurance, prior depression diagnosis, and follow-up duration after the index date.
Results: HF was a strong risk factor for diagnosed depression (p < 0.0001). A total of 10.5% of HF patients and 6.3% of matched controls developed depression after one year of follow-up (p < 0.001). Depression was documented in 28.9% of the HF group and 18.2% of the control group after the five-year follow-up (p < 0.001). Cancer, dementia, osteoporosis, stroke, and osteoarthritis were associated with a higher risk of developing depression. Male gender and private health insurance were associated with lower risk of depression.
Conclusions: The risk of diagnosed depression is significantly increased in patients with HF compared to patients without HF in primary care practices in Germany.
Aim: To determine the prevalence of depression and its risk factors among patients with coronary heart disease (CHD) treated in German primary care practices. Methods: Longitudinal data from nationwide general practices in Germany (n = 1072) were analyzed. Individuals initially diagnosed with CHD (2009-2013) were identified, and 59,992 patients were included and matched (1:1) to 59,992 controls. The primary outcome measure was an initial diagnosis of depression within five years after the index date among patients with and without CHD. Cox proportional hazards models were used to adjust for confounders. Results: Mean age was 68.0 years (SD = 11.3). A total of 55.9% of patients were men. After a five-year follow-up, 21.8% of the CHD group and 14.2% of the control group were diagnosed with depression (P < 0.001). In the multivariate regression model, CHD was a strong risk factor for developing depression (HR = 1.54, 95% CI: 1.49-1.59, P < 0.001). Prior depressive episodes, dementia, and eight other chronic conditions were associated with a higher risk of developing depression. Interestingly, older patients and women were also more likely to be diagnosed with depression compared with younger patients and men, respectively. Conclusion: The risk of depression is significantly increased among patients with CHD compared with patients without CHD treated in primary care practices in Germany. CHD patients should be routinely screened for depression to ensure improved treatment and management.
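The 1:1 matching used in these database studies can be sketched as a greedy nearest-neighbour procedure. The covariates, age tolerance, and records below are illustrative assumptions, not the actual Disease Analyzer matching algorithm:

```python
def greedy_match(cases, controls, keys=("sex",), numeric_key="age", tol=2):
    """Greedy 1:1 matching: exact match on the categorical keys, nearest
    value on the numeric key within `tol`; each control is used at most once."""
    available = list(controls)
    pairs = []
    for case in cases:
        candidates = [c for c in available
                      if all(c[k] == case[k] for k in keys)
                      and abs(c[numeric_key] - case[numeric_key]) <= tol]
        if candidates:
            best = min(candidates, key=lambda c: abs(c[numeric_key] - case[numeric_key]))
            pairs.append((case, best))
            available.remove(best)   # a control can only be matched once
    return pairs

# Hypothetical records: CHD cases matched to depression-free controls
cases = [{"id": 1, "sex": "F", "age": 70}, {"id": 2, "sex": "M", "age": 66}]
controls = [{"id": 10, "sex": "F", "age": 71}, {"id": 11, "sex": "M", "age": 80},
            {"id": 12, "sex": "M", "age": 67}]
pairs = greedy_match(cases, controls)
```

Greedy matching is order-dependent; production analyses typically use dedicated matching routines, but the principle of exact-plus-caliper matching is the same.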
Summary: 35,483 female osteoporosis patients were compared with 35,483 patients without osteoporosis regarding the incidence of depression. The risk of depression is significantly increased for patients with osteoporosis compared with patients without osteoporosis in primary care practices within Germany. Introduction: The objectives of the present study were to analyze the incidence of depression in German female patients with osteoporosis and to evaluate the risk factors for a depression diagnosis within this patient population. Methods: This study was a retrospective database analysis conducted in Germany utilizing the Disease Analyzer (R) Database (IMS Health, Germany). The study population included 70,966 patients between 40 and 80 years of age from 1072 primary care practices. The observation period was between 2004 and 2013. Follow-up duration was 5 years and was completed in April 2015. A total of 35,483 osteoporosis patients were selected after applying exclusion criteria, and 35,483 controls were chosen and then matched (1:1) to osteoporosis patients based on age, sex, health insurance coverage, prior depression diagnosis, and follow-up duration after the index date. The analyses of depression-free survival were carried out using Kaplan-Meier curves and log-rank tests. Cox proportional hazards models (dependent variable: depression) were used to adjust for confounders. Results: Depression diagnoses were present in 33.0% of the osteoporosis group and 22.7% of the control group after the 5-year follow-up (p < 0.001). Dementia, cancer, heart failure, coronary heart disease, and diabetes were associated with a higher risk of developing depression (p < 0.001). Private health insurance was associated with a lower risk of depression. There was no significant effect of fractures on depression risk.
Conclusion: The risk of depression is significantly increased for patients with osteoporosis in primary care practices within Germany.
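The Kaplan-Meier analysis named in the methods can be sketched from first principles; the follow-up times below are hypothetical, not study data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates from follow-up times and event flags
    (event = 1: outcome diagnosed; event = 0: censored at that time)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        removed = sum(1 for tt, _ in data if tt == t)  # events + censorings at t
        if d > 0:
            s *= 1 - d / n_at_risk                     # KM product-limit step
            curve.append((t, s))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up years until depression diagnosis (1 = event, 0 = censored)
curve = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 0, 1, 0])
```

The depression-free survival curves compared with log-rank tests in the study are exactly such step functions, one per group; libraries such as lifelines provide the same estimator with confidence intervals.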
There is evidence for a cortical contribution to the regulation of human postural control. Interference from concurrently performed cognitive tasks supports this notion, and the lateral prefrontal cortex (lPFC) has been suggested to play a prominent role in the processing of purely cognitive as well as cognitive-postural dual tasks. The degree of cognitive-motor interference varies greatly between individuals, but it is unresolved whether individual differences in the recruitment of specific lPFC regions during cognitive dual tasking are associated with individual differences in cognitive-motor interference. Here, we investigated inter-individual variability in a cognitive-postural multitasking situation in healthy young adults (n = 29) in order to relate it to inter-individual variability in lPFC recruitment during cognitive multitasking. For this purpose, a one-back working memory task was performed either as a single task or as a dual task in order to vary cognitive load. Participants performed these cognitive single and dual tasks either during upright stance on a balance pad that was placed on top of a force plate or during fMRI measurement with little to no postural demands. We hypothesized dual one-back task performance to be associated with lPFC recruitment when compared to single one-back task performance. In addition, we expected individual variability in lPFC recruitment to be associated with postural performance costs during concurrent dual one-back performance. As expected, behavioral performance costs in postural sway during dual one-back performance varied largely between individuals, and so did lPFC recruitment during dual one-back performance. Most importantly, individuals who recruited the right mid-lPFC to a larger degree during dual one-back performance also showed greater postural sway, as measured by larger performance costs in total center of pressure displacements.
This effect was selective to the high-load dual one-back task and suggests a crucial role of the right lPFC in allocating resources during cognitive-motor interference. Our study provides further insight into the mechanisms underlying cognitive-motor multitasking and its impairments.
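Total center of pressure displacement, the sway measure used above, is commonly computed as the summed point-to-point path length of the CoP trajectory recorded by the force plate. A minimal sketch with hypothetical samples:

```python
from math import hypot

def cop_path_length(xs, ys):
    """Total CoP displacement: summed Euclidean distance between
    successive center-of-pressure samples (a common postural sway measure)."""
    pts = list(zip(xs, ys))
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))

# Three hypothetical CoP samples (mm): two segments of length 5 and 6
length = cop_path_length([0, 3, 3], [0, 4, 10])
```

Dual-task performance costs would then be expressed as the relative increase in this path length under dual-task versus single-task stance.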
There is evidence for cortical contribution to the regulation of human postural control. Interference from concurrently performed cognitive tasks supports this notion, and the lateral prefrontal cortex (lPFC) has been suggested to play a prominent role in the processing of purely cognitive as well as cognitive-postural dual tasks. The degree of cognitive-motor interference varies greatly between individuals, but it is unresolved whether individual differences in the recruitment of specific lPFC regions during cognitive dual tasking are associated with individual differences in cognitive-motor interference. Here, we investigated inter-individual variability in a cognitive-postural multitasking situation in healthy young adults (n = 29) in order to relate these to inter-individual variability in lPFC recruitment during cognitive multitasking. For this purpose, a oneback working memory task was performed either as single task or as dual task in order to vary cognitive load. Participants performed these cognitive single and dual tasks either during upright stance on a balance pad that was placed on top of a force plate or during fMRI measurement with little to no postural demands. We hypothesized dual one-back task performance to be associated with lPFC recruitment when compared to single one-back task performance. In addition, we expected individual variability in lPFC recruitment to be associated with postural performance costs during concurrent dual one-back performance. As expected, behavioral performance costs in postural sway during dual-one back performance largely varied between individuals and so did lPFC recruitment during dual one-back performance. Most importantly, individuals who recruited the right mid-lPFC to a larger degree during dual one-back performance also showed greater postural sway as measured by larger performance costs in total center of pressure displacements. 
This effect was selective to the high-load dual one-back task and suggests a crucial role of the right lPFC in allocating resources during cognitive-motor interference. Our study provides further insight into the mechanisms underlying cognitive-motor multitasking and its impairments.
Continuous treatment with antidementia drugs in Germany 2003-2013: a retrospective database analysis
(2015)
Background: Continuous treatment is an important indicator of medication adherence in dementia. However, long-term studies in larger clinical settings are lacking, and little is known about moderating effects of patient and service characteristics.
Methods: Data from 12,910 outpatients with dementia (mean age 79.2 years; SD = 7.6 years) treated between January 2003 and December 2013 in Germany were included. Continuous treatment was analysed using Kaplan-Meier curves and log-rank tests. In addition, multivariate Cox regression models were fitted with continuous treatment as dependent variable and the predictors antidementia agent, age, gender, medical comorbidities, physician specialty, and health insurance status.
Results: After one year of follow-up, nearly 60% of patients continued drug treatment. Donepezil (HR: 0.88; 95% CI: 0.82-0.95) and memantine (HR: 0.85; 0.79-0.91) patients were less likely to discontinue treatment than rivastigmine users. Patients were also less likely to discontinue if they were treated by specialist physicians rather than general practitioners (HR: 0.44; 0.41-0.48). Younger male patients and patients with private health insurance had a lower discontinuation risk. Regarding comorbidity, patients were more likely to be continuously treated with the index substance if heart failure or hypertension had been diagnosed at baseline.
Conclusions: Our results imply that, besides the type of antidementia agent, involving a specialist in the complex process of prescribing antidementia drugs can provide meaningful benefits to patients in terms of more disease-specific and continuous treatment.
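The survival analysis named in the methods above (Kaplan-Meier curves for continuous treatment) can be sketched in plain Python. This is a minimal illustrative estimator, not the study's actual pipeline; the patient data below are invented toy values:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times:  follow-up time for each patient (e.g. days on the index drug)
    events: 1 if treatment was discontinued at that time, 0 if censored
    Returns a list of (time, survival_probability) steps.
    """
    survival = 1.0
    curve = []
    # walk through the distinct follow-up times in order
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # patients still at risk at t
        if d > 0:
            survival *= 1.0 - d / n  # conditional probability of continuing past t
            curve.append((t, survival))
    return curve

# toy example: 5 patients, 3 discontinuations, 2 censored observations
times = [100, 200, 200, 300, 365]
events = [1, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```

In practice one would fit the accompanying Cox regression with a dedicated library rather than by hand; the point here is only the step-function logic behind the reported curves.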
Cities and Mental Health
(2017)
Background: More than half of the global population currently lives in cities, with a trend toward further urbanization. Living in cities is associated with increased population density, traffic noise and pollution, but also with better access to health care and other commodities. Methods: This review is based on a selective literature search, providing an overview of the risk factors for mental illness in urban centers. Results: Studies have shown that the risk for serious mental illness is generally higher in cities than in rural areas. Epidemiological studies have associated growing up and living in cities with a considerably higher risk for schizophrenia. However, correlation is not causation, and living in poverty can both contribute to and result from impairments associated with poor mental health. Social isolation and discrimination, as well as poverty in the neighborhood, contribute to the mental health burden, while little is known about specific interactions between such factors and the built environment. Conclusion: Further insight into the interaction between the spatial heterogeneity of neighborhood resources and socio-ecological factors is warranted and requires interdisciplinary research.
Drugs of abuse elicit dopamine release in the ventral striatum, possibly biasing dopamine-driven reinforcement learning towards drug-related reward at the expense of non-drug-related reward. Indeed, in alcohol-dependent patients, reactivity in dopaminergic target areas is shifted from non-drug-related stimuli towards drug-related stimuli. Such 'hijacked' dopamine signals may impair flexible learning from non-drug-related rewards, and thus promote craving for the drug of abuse. Here, we used functional magnetic resonance imaging to measure ventral striatal activation by reward prediction errors (RPEs) during a probabilistic reversal learning task in recently detoxified alcohol-dependent patients and healthy controls (N=27). All participants also underwent 6-[F-18]fluoro-DOPA positron emission tomography to assess ventral striatal dopamine synthesis capacity. Neither ventral striatal activation by RPEs nor striatal dopamine synthesis capacity differed between groups. However, ventral striatal coding of RPEs correlated inversely with craving in patients. Furthermore, we found a negative correlation between ventral striatal coding of RPEs and dopamine synthesis capacity in healthy controls, but not in alcohol-dependent patients. Moderator analyses showed that the magnitude of the association between dopamine synthesis capacity and RPE coding depended on the amount of chronic, habitual alcohol intake. Despite the relatively small sample size, a power analysis supports the reported results. Using a multimodal imaging approach, this study suggests that dopaminergic modulation of neural learning signals is disrupted in alcohol dependence in proportion to long-term alcohol intake of patients. Alcohol intake may perpetuate itself by interfering with dopaminergic modulation of neural learning signals in the ventral striatum, thus increasing craving for habitual drug intake.
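The reward prediction error probed in this study is, in standard reinforcement-learning terms, the difference between received and expected reward. The following sketch shows RPE-driven value updating in a probabilistic reversal task of the kind described above; all parameter values (learning rate, reward probabilities, trial counts) are illustrative assumptions, not those of the study:

```python
import random

def simulate_reversal_learning(n_trials=200, alpha=0.3, seed=0):
    """Rescorla-Wagner value updating driven by reward prediction errors.

    On each trial the agent greedily picks the higher-valued of two
    options; option 0 is rewarded with p=0.8 until a mid-task reversal,
    after which option 1 becomes the better choice.
    """
    rng = random.Random(seed)
    values = [0.5, 0.5]  # expected reward per option
    rpes = []
    for t in range(n_trials):
        good = 0 if t < n_trials // 2 else 1      # contingency reversal at midpoint
        choice = 0 if values[0] >= values[1] else 1
        p_reward = 0.8 if choice == good else 0.2
        reward = 1.0 if rng.random() < p_reward else 0.0
        rpe = reward - values[choice]             # prediction error: outcome minus expectation
        values[choice] += alpha * rpe             # learning-rate-weighted update
        rpes.append(rpe)
    return values, rpes
```

In model-based fMRI analyses of this kind, trial-by-trial RPE traces like `rpes` are regressed against the BOLD signal; the sketch covers only the behavioral model.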
Objectives. Recent work suggests that a genetic variation associated with increased dopamine metabolism in the prefrontal cortex (catechol-O-methyltransferase Val158Met; COMT) amplifies age-related changes in working memory performance. Research on younger adults indicates that the influence of dopamine-related genetic polymorphisms on working memory performance increases when testing the cognitive limits through training. To date, this has not been studied in older adults. Method. Here we investigate the effect of COMT genotype on plasticity in working memory in a sample of 14 younger (aged 24-30 years) and 25 older (aged 60-75 years) healthy adults. Participants underwent adaptive training in the n-back working memory task over 12 sessions under increasing difficulty conditions. Results. Both younger and older adults exhibited sizeable behavioral plasticity through training (P < .001), which was larger in younger as compared to older adults (P < .001). Age-related differences were qualified by an interaction with COMT genotype (P < .001), and this interaction was due to decreased behavioral plasticity in older adults carrying the Val/Val genotype, while there was no effect of genotype in younger adults. Discussion. Our findings indicate that age-related changes in plasticity in working memory are critically affected by genetic variation in prefrontal dopamine metabolism.
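The adaptive training regime described above (n-back difficulty adjusted to performance across sessions) follows the logic of a simple staircase procedure. This sketch uses illustrative accuracy thresholds, not the study's actual protocol:

```python
def adapt_n_back_level(level, accuracy, step_up=0.9, step_down=0.7):
    """One staircase step for adaptive n-back training.

    Raise the working memory load after a high-accuracy block, lower it
    (never below 1-back) after a low-accuracy block, otherwise keep it
    unchanged. Thresholds are hypothetical, not those of the study.
    """
    if accuracy >= step_up:
        return level + 1
    if accuracy < step_down and level > 1:
        return level - 1
    return level

def run_training(block_accuracies, start_level=1):
    """Apply the staircase across a session; returns the level trajectory."""
    levels = [start_level]
    for acc in block_accuracies:
        levels.append(adapt_n_back_level(levels[-1], acc))
    return levels
```

A titration rule of this kind keeps each participant training near their individual capacity limit, which is what allows genotype effects on plasticity to surface.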
Timing and magnitude of surface uplift are key to understanding the impact of crustal deformation and topographic growth on atmospheric circulation, environmental conditions, and surface processes. Uplift of the East African Plateau is linked to mantle processes, but paleoaltimetry data are too scarce to constrain plateau evolution and subsequent vertical motions associated with rifting. Here, we assess the paleotopographic implications of a beaked whale fossil (Ziphiidae) from the Turkana region of Kenya found 740 km inland from the present-day coastline of the Indian Ocean at an elevation of 620 m. The specimen is approximately 17 My old and represents the oldest derived beaked whale known, consistent with molecular estimates of the emergence of modern strap-toothed whales (Mesoplodon). The whale traveled from the Indian Ocean inland along an eastward-directed drainage system controlled by the Cretaceous Anza Graben and was stranded slightly above sea level. Surface uplift from near sea level coincides with paleoclimatic change from a humid environment to highly variable and much drier conditions, which altered biotic communities and drove evolution in east Africa, including that of primates.
Aim of the study: This article examines how the guideline reformed in 2016 affects outpatient psychotherapeutic work and care, comparing urban versus rural areas and eastern versus western Germany.
Methods: An online survey was conducted among therapists working under statutory health insurance contracts. The questions addressed various innovations in the guideline.
Results: Regardless of region, respondents judged that the reform had not improved care.
Therapists working in the West and in urban areas more often referred patients to other psychotherapy practices after the initial consultation, whereas those in the East and in rural areas more often referred them to other support services.
Conclusion: Stronger incentives for psychotherapeutic work in rural areas should be created. Measures to reduce East-West inequalities in care density appear necessary.
Aim of the study: This article examines how aspects of the psychotherapy guideline reformed in 2016 are evaluated and used differently in practice from the perspective of the three guideline-approved treatment methods.
Methods: An online survey was conducted among psychotherapists working under statutory health insurance contracts (n = 987). The questions addressed the various innovations in the psychotherapy guideline.
Results: Significant differences emerged in, among other things, the use of the expanded authorities and the billing of certain services. The groups also differed in their applications for treatment quotas.
Conclusion: It appears worthwhile to evaluate elements of the guideline reform from the perspective of the preferred treatment method. The aspects that bear on direct work with patients appear most significant.