Introduction: Postoperative delirium (POD) is a common and serious adverse event of surgery in older people. Because of its great impact on patients' safety and quality of life, identification of modifiable risk factors could be useful. Although preoperative medication intake is assumed to be an important modifiable risk factor, the impact of anticholinergic drugs on the occurrence of POD seems underestimated in elective surgery. The aim of this study was to investigate the association between preoperative anticholinergic burden and POD. We hypothesized that a high preoperative anticholinergic burden is an independent, potentially modifiable predisposing and precipitating factor of POD in older people.
Methods: Between November 2017 and April 2019, 1,470 patients aged 70 years and older undergoing elective orthopedic, general, cardiac, or vascular surgery were recruited in the randomized, prospective, multicenter PAWEL trial. Anticholinergic burden of a sub-cohort of 899 patients, who did not receive a multimodal intervention for preventing POD, was assessed by two different tools at hospital admission: the established Anticholinergic Risk Scale (ARS) and the recently developed Anticholinergic Burden Score (ABS). POD was detected by the Confusion Assessment Method (CAM) and a validated post-discharge medical record review. Logistic regression analyses were performed to evaluate the association between anticholinergic burden and POD.
Results: POD was observed in 210 of 899 patients (23.4%). Both ARS and ABS were independently associated with POD. The association persisted after adjustment for relevant confounding factors such as age, sex, comorbidities, preoperative cognitive and physical status, number of prescribed drugs, surgery time, type of surgery and anesthesia, use of a heart-lung machine, and treatment in an intensive care unit. If a patient was taking one of the 56 drugs listed in the ABS, the risk for POD was 2.7-fold higher (OR = 2.74, 95% CI = 1.55-4.94); per additional point on the ARS, it was 1.5-fold higher (OR = 1.54, 95% CI = 1.15-2.02).
Conclusion: Preoperative anticholinergic drug exposure measured by ARS or ABS was independently associated with POD in older patients undergoing elective surgery. Therefore, identification, discontinuation, or substitution of anticholinergic medication prior to surgery may be a promising approach to reduce the risk of POD in older patients.
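To illustrate how an odds ratio like the ABS result (OR = 2.74, 95% CI = 1.55-4.94) is derived from a 2x2 exposure/outcome table, the sketch below uses synthetic counts, not the PAWEL data; the study's full analysis additionally adjusted for confounders via logistic regression.

```python
# Unadjusted odds ratio with a 95% CI from the standard log-OR normal
# approximation. Counts are synthetic, for illustration only.
import math

def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Return (OR, (lower, upper)) for a 2x2 exposure/outcome table."""
    or_ = (exposed_event * unexposed_no_event) / (exposed_no_event * unexposed_event)
    se = math.sqrt(1 / exposed_event + 1 / exposed_no_event
                   + 1 / unexposed_event + 1 / unexposed_no_event)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lower, upper)

# Synthetic counts: 40/100 exposed and 170/799 unexposed patients with POD.
print(odds_ratio(40, 60, 170, 629))
```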
Background
Postoperative delirium is a common disorder in older adults that is associated with higher morbidity and mortality, prolonged cognitive impairment, development of dementia, higher institutionalization rates, and rising healthcare costs. The probability of delirium after surgery increases with patients’ age, with pre-existing cognitive impairment, and with comorbidities, and its diagnosis and treatment are dependent on the knowledge of diagnostic criteria, risk factors, and treatment options of the medical staff. In this study, we will investigate whether a cross-sectoral and multimodal intervention for preventing delirium can reduce the prevalence of delirium and postoperative cognitive decline (POCD) in patients older than 70 years undergoing elective surgery. Additionally, we will analyze whether the intervention is cost-effective.
Methods
The study will be conducted at five medical centers (with two or three surgical departments each) in the southwest of Germany. The study employs a stepped-wedge design with cluster randomization of the medical centers. Measurements are performed at six consecutive points: preadmission, preoperative, and postoperative with daily delirium screening up to day 7, and POCD evaluations at 2, 6, and 12 months after surgery. Recruitment goals are to enroll 1500 patients older than 70 years undergoing elective operative procedures (cardiac, thoracic, vascular, proximal big joint and spine, genitourinary, gastrointestinal, and general elective surgery procedures).
Discussion
Results of the trial should form the basis of future standards for preventing delirium and POCD in surgical wards. Key aims are the improvement of patient safety and quality of life, as well as the reduction of the long-term risk of conversion to dementia. Furthermore, from an economic perspective, we expect benefits and decreased costs for hospitals, patients, and health insurers.
Trial registration
German Clinical Trials Register, DRKS00013311. Registered on 10 November 2017.
Genetic and environmental factors both contribute to cognitive test performance. A substantial increase in average intelligence test results in the second half of the previous century within one generation is unlikely to be explained by genetic changes. One possible explanation for the strong malleability of cognitive performance measures is that environmental factors modify gene expression via epigenetic mechanisms. Epigenetic factors may help to understand the recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events. The possible manifestation of malleable biomarkers contributing to variance in cognitive test performance, and thus possibly contributing to the "missing heritability" between estimates from twin studies and variance explained by genetic markers, is still unclear. Here we show in 1475 healthy adolescents from the IMaging and GENetics (IMAGEN) sample that general IQ (gIQ) is associated with (1) polygenic scores for intelligence, (2) epigenetic modification of the DRD2 gene, (3) gray matter density in the striatum, and (4) functional striatal activation elicited by temporarily surprising reward-predicting cues. Comparing their relative importance for the prediction of gIQ in an overlapping subsample, our results demonstrate neurobiological correlates of the malleability of gIQ and point to equal importance of genetic variance, epigenetic modification of the DRD2 receptor gene, and functional striatal activation, all known to influence dopamine neurotransmission. Peripheral epigenetic markers need confirmation in the central nervous system and should be tested in longitudinal settings specifically assessing individual and environmental factors that modify epigenetic structure.
Background: First rank symptoms (FRS) of schizophrenia have been used for decades for diagnostic purposes. In the new version of the DSM-5, the American Psychiatric Association (APA) has abolished any further reference to FRS of schizophrenia and treats them like any other "criterion A" symptom (e.g. any kind of hallucination or delusion) with regard to their diagnostic implication. The ICD-10 is currently under revision and may follow suit. In this review, we discuss central points of criticism that are directed against the continuous use of first rank symptoms (FRS) to diagnose schizophrenia.
Background: Demographic changes are leading to growing care needs of older people and creating a challenge for healthcare systems worldwide. Nursing homes (NHs) need to provide care for growing numbers of residents while ensuring high-quality care. We aimed to examine an innovative NH in Germany and apply a theory of change (ToC) approach to develop a best practice model (BPM) for therapeutic care in NHs.
Methods: A multimethod qualitative study conducted from February to July 2021 in Germany involved interviews with 14 staff members of an innovative NH and 10 directors and care managers of other NHs. The interview guidelines included questions on nursing practices, infrastructure, resources, interprofessional collaboration, and working culture. Additional material on the participating NH (website, promotion videos, newsletters, care documentation) was collected. Contextual literature on NH culture and therapeutic care in Germany, ToC methodology, and NH culture change was reviewed. Following a question-focused analysis of all material, we generated a ToC model towards a BPM of therapeutic care and meaningful living in NHs. Results were verified in interdisciplinary team meetings with study participants and other stakeholders to establish consensus.
Results: The participating NH's care concept aims to improve residents' functional abilities and wellbeing as well as staff members' job satisfaction. Central components of their approach include therapeutic elements such as music and movement in all nursing activities, multidisciplinary collaboration, a broad range of therapy and social activities, the continuation of therapy in everyday activities, a focus on individual life history, values, needs, and skills, social integration into the regional community, and the creation of a meaningful living environment for residents and staff.
Conclusion: The BPM we developed shows how a meaningful living environment can be created through therapeutic care and integrative activities. The ToC sheds light onto the contextual factors and cultural values which should be considered in the development of NH interventions. Research on not only biomedical aspects, but also psychosocial dynamics and narrative co-constructions in nursing practice should inform NH innovations. The ToC also highlights the importance of developing adequate political frameworks and infrastructures for implementing such innovative practices on a larger scale.
BACKGROUND: Addiction is supposedly characterized by a shift from goal-directed to habitual decision making, thus facilitating automatic drug intake. The two-step task allows distinguishing between these mechanisms by computationally modeling goal-directed and habitual behavior as model-based and model-free control. In addicted patients, decision making may also strongly depend upon drug-associated expectations. Therefore, we investigated model-based versus model-free decision making and its neural correlates as well as alcohol expectancies in alcohol-dependent patients and healthy controls and assessed treatment outcome in patients. METHODS: Ninety detoxified, medication-free, alcohol-dependent patients and 96 age- and gender-matched control subjects underwent functional magnetic resonance imaging during the two-step task. Alcohol expectancies were measured with the Alcohol Expectancy Questionnaire. Over a follow-up period of 48 weeks, 37 patients remained abstinent and 53 patients relapsed as indicated by the Alcohol Timeline Followback method. RESULTS: Patients who relapsed displayed reduced medial prefrontal cortex activation during model-based decision making. Furthermore, high alcohol expectancies were associated with low model-based control in relapsers, while the opposite was observed in abstainers and healthy control subjects. However, reduced model-based control per se was not associated with subsequent relapse. CONCLUSIONS: These findings suggest that poor treatment outcome in alcohol dependence does not simply result from a shift from model-based to model-free control but is instead dependent on the interaction between high drug expectancies and low model-based decision making. Reduced model-based medial prefrontal cortex signatures in those who relapse point to a neural correlate of relapse risk. These observations suggest that therapeutic interventions should target subjective alcohol expectancies.
In detoxified alcohol-dependent patients, alcohol-related stimuli can promote relapse. However, to date, the mechanisms by which contextual stimuli promote relapse have not been elucidated in detail. One hypothesis is that such contextual stimuli directly stimulate the motivation to drink via associated brain regions like the ventral striatum and thus promote alcohol seeking, intake and relapse. Pavlovian-to-Instrumental-Transfer (PIT) may be one of those behavioral phenomena contributing to relapse, capturing how Pavlovian conditioned (contextual) cues determine instrumental behavior (e.g. alcohol seeking and intake). We used a PIT paradigm during functional magnetic resonance imaging to examine the effects of classically conditioned Pavlovian stimuli on instrumental choices in n=31 detoxified patients diagnosed with alcohol dependence and n=24 healthy controls matched for age and gender. Patients were followed up over a period of 3 months. We observed that (1) there was a significant behavioral PIT effect for all participants, which was significantly more pronounced in alcohol-dependent patients; (2) PIT was significantly associated with blood oxygen level-dependent (BOLD) signals in the nucleus accumbens (NAcc) in subsequent relapsers only; and (3) PIT-related NAcc activation was associated with, and predictive of, critical outcomes (amount of alcohol intake and relapse during a 3 months follow-up period) in alcohol-dependent patients. These observations show for the first time that PIT-related BOLD signals, as a measure of the influence of Pavlovian cues on instrumental behavior, predict alcohol intake and relapse in alcohol dependence.
Importance Alcohol consumption (AC) leads to death and disability worldwide. Ongoing discussions on potential negative effects of the COVID-19 pandemic on AC need to be informed by real-world evidence.
Objective To examine whether lockdown measures are associated with AC and consumption-related temporal and psychological within-person mechanisms.
Design, Setting, and Participants This quantitative, intensive, longitudinal cohort study recruited 1743 participants from 3 sites from February 20, 2020, to February 28, 2021. Data were provided before and within the second lockdown of the COVID-19 pandemic in Germany: before lockdown (October 2 to November 1, 2020); light lockdown (November 2 to December 15, 2020); and hard lockdown (December 16, 2020, to February 28, 2021).
Main Outcomes and Measures Daily ratings of AC (main outcome) captured during 3 lockdown phases (main variable) and temporal (weekends and holidays) and psychological (social isolation and drinking intention) correlates.
Results Of the 1743 screened participants, 189 (119 [63.0%] male; median [IQR] age, 37 [27.5-52.0] years) with at least 2 alcohol use disorder (AUD) criteria according to the Diagnostic and Statistical Manual of Mental Disorders (Fifth Edition) yet without the need for medically supervised alcohol withdrawal were included. These individuals provided 14 694 smartphone ratings from October 2020 through February 2021. Multilevel modeling revealed significantly higher AC (grams of alcohol per day) on weekend days vs weekdays (β = 11.39; 95% CI, 10.00-12.77; P < .001). Alcohol consumption was above the overall average on Christmas (β = 26.82; 95% CI, 21.87-31.77; P < .001) and New Year’s Eve (β = 66.88; 95% CI, 59.22-74.54; P < .001). During the hard lockdown, perceived social isolation was significantly higher (β = 0.12; 95% CI, 0.06-0.15; P < .001), but AC was significantly lower (β = −5.45; 95% CI, −8.00 to −2.90; P = .001). Independent of lockdown, intention to drink less alcohol was associated with lower AC (β = −11.10; 95% CI, −13.63 to −8.58; P < .001). Notably, differences in AC between weekend and weekdays decreased both during the hard lockdown (β = −6.14; 95% CI, −9.96 to −2.31; P = .002) and in participants with severe AUD (β = −6.26; 95% CI, −10.18 to −2.34; P = .002).
Conclusions and Relevance This 5-month cohort study found no immediate negative associations of lockdown measures with overall AC. Rather, weekend-weekday and holiday AC patterns exceeded lockdown effects. Differences in AC between weekend days and weekdays showed that weekend drinking cycles decreased as a function of AUD severity and lockdown measures, pointing to a potential mechanism of losing and regaining control. This finding suggests that temporal patterns and drinking intention constitute promising targets for prevention and intervention, even in high-risk individuals.
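The weekend-vs-weekday contrast at the core of this study can be sketched as a simple within-person mean difference. The published analysis used multilevel regression with covariates; the data below are synthetic daily grams-of-alcohol ratings, for illustration only.

```python
# Within-person weekend-minus-weekday difference in daily alcohol intake,
# averaged across participants (a crude stand-in for a multilevel model).
from statistics import mean

def weekend_effect(daily):
    """daily: list of (participant_id, is_weekend, grams).
    Returns the average within-person difference, weekend minus weekday."""
    by_person = {}
    for pid, is_weekend, grams in daily:
        by_person.setdefault(pid, {True: [], False: []})[is_weekend].append(grams)
    diffs = [mean(days[True]) - mean(days[False])
             for days in by_person.values() if days[True] and days[False]]
    return mean(diffs)

# Two synthetic participants, each drinking more on weekend days:
daily = [(1, False, 10), (1, True, 25), (2, False, 0), (2, True, 12)]
print(weekend_effect(daily))
```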
Pavlovian-to-instrumental transfer (PIT) tasks examine the influence of Pavlovian stimuli on ongoing instrumental behaviour. Previous studies reported associations between a strong PIT effect, high-risk drinking and alcohol use disorder. This study investigated whether susceptibility to interference between Pavlovian and instrumental control is linked to risky alcohol use in a community sample of 18-year-old male adults. Participants (N = 191) were instructed to 'collect good shells' and 'leave bad shells' during the presentation of appetitive (monetary reward), aversive (monetary loss) or neutral Pavlovian stimuli. We compared instrumental error rates (ER) and functional magnetic resonance imaging (fMRI) brain responses between the congruent and incongruent conditions, as well as among high-risk and low-risk drinking groups. On average, individuals showed a substantial PIT effect, that is, increased ER when Pavlovian cues and instrumental stimuli were in conflict compared with congruent trials. Neural PIT correlates were found in the ventral striatum and the dorsomedial and lateral prefrontal cortices (lPFC). Importantly, high-risk drinking was associated with a stronger behavioural PIT effect, a decreased lPFC response and an increased neural response in the ventral striatum at the trend level. Moreover, high-risk drinkers showed weaker connectivity from the ventral striatum to the lPFC during incongruent trials. Our study links interference during PIT to drinking behaviour in healthy, young adults. High-risk drinkers showed higher susceptibility to Pavlovian cues, especially when they conflicted with instrumental behaviour, indicating lower interference control abilities. Increased activity in the ventral striatum (bottom-up), decreased lPFC response (top-down), and their altered interplay may contribute to poor interference control in the high-risk drinkers.
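The behavioural PIT contrast described above amounts to comparing error rates between incongruent and congruent trials. The sketch below uses an assumed trial encoding and synthetic data, not the study's preprocessing.

```python
# Error rate on incongruent trials (Pavlovian cue conflicts with the correct
# instrumental response) minus error rate on congruent trials; larger values
# mean stronger Pavlovian interference with instrumental control.
def pit_conflict_effect(trials):
    """trials: list of (congruent: bool, error: bool) pairs."""
    def error_rate(congruent):
        errors = [err for cong, err in trials if cong == congruent]
        return sum(errors) / len(errors)
    return error_rate(False) - error_rate(True)

# Synthetic example: 10% errors on congruent, 30% on incongruent trials.
trials = ([(True, False)] * 9 + [(True, True)]
          + [(False, False)] * 7 + [(False, True)] * 3)
print(pit_conflict_effect(trials))
```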
One of the major risk factors for global death and disability is alcohol, tobacco, and illicit drug use. While there is increasing knowledge with respect to individual factors promoting the initiation and maintenance of substance use disorders (SUDs), disease trajectories involved in losing and regaining control over drug intake (ReCoDe) are still not well described. Our newly formed German Collaborative Research Centre (CRC) on ReCoDe has an interdisciplinary approach funded by the German Research Foundation (DFG) with a 12-year perspective. The main goals of our research consortium are (i) to identify triggers and modifying factors that longitudinally modulate the trajectories of losing and regaining control over drug consumption in real life, (ii) to study underlying behavioral, cognitive, and neurobiological mechanisms, and (iii) to implement mechanism-based interventions. These goals will be achieved by: (i) using mobile health (m-health) tools to longitudinally monitor the effects of triggers (drug cues, stressors, and priming doses) and modifying factors (e.g., age, gender, physical activity, and cognitive control) on drug consumption patterns in real-life conditions and in animal models of addiction; (ii) the identification and computational modeling of key mechanisms mediating the effects of such triggers and modifying factors on goal-directed, habitual, and compulsive aspects of behavior from human studies and animal models; and (iii) developing and testing interventions that specifically target the underlying mechanisms for regaining control over drug intake.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced with functional activation in the nucleus accumbens (NAcc) and amygdala. While we observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, we here aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking and subsequently related functional activation in an a priori region of interest encompassing the NAcc and amygdala and related to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened, 209 assessed, resulting in 191 valid behavioral, 139 imaging and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r_s = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; possible different involvement in association with disease trajectory should be investigated in future studies.
Background: Human and animal work suggests a shift from goal-directed to habitual decision-making in addiction. However, the evidence for this in human alcohol dependence is as yet inconclusive. Methods: Twenty-six healthy controls and 26 recently detoxified alcohol-dependent patients underwent behavioral testing with a 2-step task designed to disentangle goal-directed and habitual response patterns. Results: Alcohol-dependent patients showed less evidence of goal-directed choices than healthy controls, particularly after losses. There was no difference in the strength of the habitual component. The group differences did not survive controlling for performance on the Digit Symbol Substitution Task. Conclusion: Chronic alcohol use appears to selectively impair goal-directed function, rather than promoting habitual responding. It appears to do so particularly after nonrewards, and this may be mediated by the effects of alcohol on more general cognitive functions subserved by the prefrontal cortex.
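A common way to summarize goal-directed (model-based) control on the two-step task is the reward x transition interaction in first-stage stay probabilities: model-based agents repeat a choice after rewarded common and unrewarded rare transitions. The trial encoding below is an assumption for this sketch, not the authors' analysis pipeline.

```python
# Stay-probability index of model-based control on the two-step task.
from statistics import mean

def model_based_index(trials):
    """trials: list of (rewarded, common_transition, stayed) booleans.
    Returns the reward x transition interaction; > 0 indicates a
    model-based pattern."""
    def p_stay(rewarded, common):
        sel = [stayed for r, c, stayed in trials if r == rewarded and c == common]
        return mean(sel) if sel else 0.0
    reward_effect_common = p_stay(True, True) - p_stay(False, True)
    reward_effect_rare = p_stay(True, False) - p_stay(False, False)
    return reward_effect_common - reward_effect_rare

# A purely model-based toy pattern yields the maximal index of 2:
trials = [(True, True, True), (False, True, False),
          (True, False, False), (False, False, True)]
print(model_based_index(trials))
```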
The influence of Pavlovian conditioned stimuli on ongoing behavior may contribute to explaining how alcohol cues stimulate drug seeking and intake. Using a Pavlovian-instrumental transfer task, we investigated the effects of alcohol-related cues on approach behavior (i.e., instrumental response behavior) and its neural correlates, and related both to relapse after detoxification in alcohol-dependent patients. Thirty-one recently detoxified alcohol-dependent patients and 24 healthy controls underwent instrumental training, where approach or non-approach towards initially neutral stimuli was reinforced by monetary incentives. Approach behavior was tested during extinction with either alcohol-related or neutral stimuli (as Pavlovian cues) presented in the background during functional magnetic resonance imaging (fMRI). Patients were subsequently followed up for 6 months. We observed that alcohol-related background stimuli inhibited the approach behavior in detoxified alcohol-dependent patients (t = -3.86, p < .001), but not in healthy controls (t = -0.92, p = .36). This behavioral inhibition was associated with neural activation in the nucleus accumbens (NAcc) (t(30) = 2.06, p < .05). Interestingly, both effects were only present in subsequent abstainers, but not relapsers, and in those with mild but not severe dependence. Our data show that alcohol-related cues can acquire inhibitory behavioral features typical of aversive stimuli despite being accompanied by a stronger NAcc activation, suggesting salience attribution. The fact that these findings are restricted to abstinence and milder illness suggests that they may be potential resilience factors.
One of the major risk factors for global death and disability is alcohol, tobacco, and illicit drug use. While there is increasing knowledge with respect to individual factors promoting the initiation and maintenance of substance use disorders (SUDs), disease trajectories involved in losing and regaining control over drug intake (ReCoDe) are still not well described. Our newly formed German Collaborative Research Centre (CRC) on ReCoDe has an interdisciplinary approach funded by the German Research Foundation (DFG) with a 12-year perspective. The main goals of our research consortium are (i) to identify triggers and modifying factors that longitudinally modulate the trajectories of losing and regaining control over drug consumption in real life, (ii) to study underlying behavioral, cognitive, and neurobiological mechanisms, and (iii) to implement mechanism-based interventions. These goals will be achieved by: (i) using mobile health (m-health) tools to longitudinally monitor the effects of triggers (drug cues, stressors, and priming doses) and modifying factors (e.g., age, gender, physical activity, and cognitive control) on drug consumption patterns in real-life conditions and in animal models of addiction; (ii) the identification and computational modeling of key mechanisms mediating the effects of such triggers and modifying factors on goal-directed, habitual, and compulsive aspects of behavior from human studies and animal models; and (iii) developing and testing interventions that specifically target the underlying mechanisms for regaining control over drug intake.
Behavioral choice can be characterized along two axes. One axis distinguishes reflexive, model-free systems that slowly accumulate values through experience and a model-based system that uses knowledge to reason prospectively. The second axis distinguishes Pavlovian valuation of stimuli from instrumental valuation of actions or stimulus–action pairs. This results in four values and many possible interactions between them, with important consequences for accounts of individual variation. We here explored whether individual variation along one axis was related to individual variation along the other. Specifically, we asked whether individuals' balance between model-based and model-free learning was related to their tendency to show Pavlovian interferences with instrumental decisions. In two independent samples with a total of 243 participants, Pavlovian–instrumental transfer effects were negatively correlated with the strength of model-based reasoning in a two-step task. This suggests a potential common underlying substrate predisposing individuals to both have strong Pavlovian interference and be less model-based and provides a framework within which to interpret the observation of both effects in addiction.
Background Anxiety and depressive disorders share common features of mood dysfunctions. This has stimulated interest in transdiagnostic dimensional research as proposed by the Research Domain Criteria (RDoC) approach by the National Institute of Mental Health (NIMH) aiming to improve the understanding of underlying disease mechanisms. The purpose of this study was to investigate the processing of RDoC domains in relation to disease severity in order to identify latent disorder-specific as well as transdiagnostic indicators of disease severity in patients with anxiety and depressive disorders.
Methods Within the German research network for mental disorders, 895 participants (n = 476 female, n = 602 anxiety disorder, n = 257 depressive disorder) were recruited for the Phenotypic, Diagnostic and Clinical Domain Assessment Network Germany (PD-CAN) and included in this cross-sectional study. We performed incremental regression models to investigate the association of four RDoC domains with disease severity in patients with affective disorders: Positive (PVS) and Negative Valence System (NVS), Cognitive Systems (CS) and Social Processes (SP).
Results The results confirmed a transdiagnostic relationship for all four domains, as we found significant main effects on disease severity within domain-specific models (PVS: β = -0.35; NVS: β = 0.39; CS: β = -0.12; SP: β = -0.32). We also found three significant interaction effects with main diagnosis showing a disease-specific association.
Limitations The cross-sectional study design prevents causal conclusions. Further limitations include possible outliers and heteroskedasticity in all regression models, which we appropriately controlled for.
Conclusion Our key results show that symptom burden in anxiety and depressive disorders is associated with latent RDoC indicators in transdiagnostic and disease-specific ways.
Pavlovian cues can influence ongoing instrumental behaviour via Pavlovian-to-instrumental transfer (PIT) processes. While appetitive Pavlovian cues tend to promote instrumental approach, they are detrimental when avoidance behaviour is required, and vice versa for aversive cues. We recently reported that susceptibility to interference between Pavlovian and instrumental control, assessed via a PIT task, was associated with risky alcohol use at age 18. We now investigated whether such susceptibility also predicts drinking trajectories until age 24, based on AUDIT (Alcohol Use Disorders Identification Test) consumption and binge drinking (grams of alcohol per drinking occasion) scores. The interference PIT effect, assessed at ages 18 and 21 during fMRI, was characterized by increased error rates (ER) and enhanced neural responses in the ventral striatum (VS) and the lateral and dorsomedial prefrontal cortices (dmPFC) during conflict, that is, when an instrumental approach was required in the presence of an aversive Pavlovian cue or vice versa. We found that a stronger VS response during conflict at age 18 was associated with a higher starting point of both drinking trajectories but predicted a decrease in binge drinking. At age 21, high ER and enhanced neural responses in the dmPFC were associated with increasing AUDIT-C scores over the next 3 years until age 24. Overall, susceptibility to interference between Pavlovian and instrumental control might be viewed as a predisposing mechanism towards hazardous alcohol use during young adulthood, and the identified high-risk group may profit from targeted interventions.
Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed vs. habitual, or, more recently and based on statistical arguments, as model-free vs. model-based reinforcement-learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities, and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation.
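The trade-off between the two valuation systems described above is commonly formalized as a weighted mixture of model-based and model-free action values feeding a softmax choice rule. A minimal sketch in Python, assuming the standard hybrid weighting (the values, weight, and inverse temperature below are illustrative, not data from the study):

```python
import math

def softmax(qs, beta=3.0):
    """Convert action values into choice probabilities (beta = inverse temperature)."""
    exps = [math.exp(beta * q) for q in qs]
    z = sum(exps)
    return [e / z for e in exps]

def hybrid_values(q_mf, q_mb, w):
    """Mix model-based and model-free values; w = 1 is fully model-based."""
    return [w * mb + (1 - w) * mf for mf, mb in zip(q_mf, q_mb)]

# hypothetical first-stage values for two actions
q_mf = [0.2, 0.6]   # model-free values accumulated from reward history
q_mb = [0.7, 0.3]   # model-based values from planning over the transition model

p_mb = softmax(hybrid_values(q_mf, q_mb, w=1.0))  # fully model-based agent
p_mf = softmax(hybrid_values(q_mf, q_mb, w=0.0))  # fully model-free agent
```

Fitting the weight w per participant is one common way such studies quantify individual differences in choice control.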
Rationale: Advances in neurocomputational modeling suggest that the valuation systems for goal-directed (deliberative) and habitual (automatic) decision-making may rely on distinct computational strategies for reinforcement learning, namely model-free vs. model-based learning. As a key theoretical difference, the model-based system strongly demands cognitive functions to plan actions prospectively based on an internal cognitive model of the environment, whereas valuation in the model-free system relies on rather simple learning rules from operant conditioning to retrospectively associate actions with their outcomes and is thus cognitively less demanding. Acute stress reactivity is known to impair model-based but not model-free choice behavior, with higher working memory capacity protecting the model-based system from acute stress. However, it is not clear what impact accumulated real-life stress has on model-free and model-based decision systems and how this influence interacts with cognitive abilities. Methods: We used a sequential decision-making task distinguishing the relative contributions of both learning strategies to choice behavior, the Social Readjustment Rating Scale questionnaire to assess accumulated real-life stress, and the Digit Symbol Substitution Test to test cognitive speed in 95 healthy subjects. Results: Individuals reporting high stress exposure who had low cognitive speed showed reduced model-based but increased model-free behavioral control. In contrast, subjects exposed to accumulated real-life stress with high cognitive speed displayed increased model-based performance but reduced model-free control. Conclusion: These findings suggest that accumulated real-life stress exposure can enhance reliance on cognitive speed for model-based computations, which may ultimately protect the model-based system from the detrimental influences of accumulated real-life stress.
The combination of accumulated real-life stress exposure and slower information processing capacities, however, might favor model-free strategies. Thus, the relative preference for either system strongly depends on stressful experiences and individual cognitive capacities.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. Having observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, we here aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, and subsequently related it to functional activation in an a priori region of interest encompassing the NAcc and amygdala and to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-TextRevision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r(s) = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; possible different involvement in association with disease trajectory should be investigated in future studies.
No association of goal-directed and habitual control with alcohol consumption in young adults
(2017)
Alcohol dependence is a mental disorder that has been associated with an imbalance in behavioral control favoring model-free habitual over model-based goal-directed strategies. It is as yet unknown, however, whether such an imbalance reflects a predisposing vulnerability or results as a consequence of repeated and/or excessive alcohol exposure. We therefore examined the association of alcohol consumption with model-based goal-directed and model-free habitual control in 188 18-year-old social drinkers in a two-step sequential decision-making task while undergoing functional magnetic resonance imaging, before prolonged alcohol misuse could have led to severe neurobiological adaptations. Behaviorally, participants showed a mixture of model-free and model-based decision-making as observed previously. Measures of impulsivity were positively related to alcohol consumption. In contrast, neither model-free nor model-based decision weights nor the trade-off between them were associated with alcohol consumption. There were also no significant associations between alcohol consumption and neural correlates of model-free or model-based decision quantities in either ventral striatum or ventromedial prefrontal cortex. Exploratory whole-brain functional magnetic resonance imaging analyses with a lenient threshold revealed early onset of drinking to be associated with an enhanced representation of model-free reward prediction errors in the posterior putamen. These results suggest that an imbalance between model-based goal-directed and model-free habitual control might not be a trait marker of alcohol intake per se.
A mechanism known as Pavlovian-to-instrumental transfer (PIT) describes a phenomenon by which the values of environmental cues acquired through Pavlovian conditioning can motivate instrumental behavior. PIT may be one basic mechanism of action control that can characterize mental disorders on a dimensional level beyond current classification systems. Therefore, we review human PIT studies investigating subclinical and clinical mental syndromes. The literature presents an inhomogeneous picture of PIT: while enhanced PIT effects seem to be present in non-substance-related disorders, overweight people, and most studies with AUD patients, no altered PIT effects were reported in tobacco use disorder and obesity. Regarding AUD and relapsing alcohol-dependent patients, there is mixed evidence of enhanced or unaltered PIT effects.
Additionally, there is evidence for aberrant corticostriatal activation and genetic risk, e.g., in association with high-risk alcohol consumption and relapse after alcohol detoxification. In patients with anorexia nervosa, stronger PIT effects elicited by low caloric stimuli were associated with increased disease severity.
In patients with depression, enhanced aversive PIT effects and a loss of action-specificity associated with poorer treatment outcomes were reported. Schizophrenic patients showed disrupted specific but intact general PIT effects. Patients with chronic back pain showed reduced PIT effects.
We provide possible reasons to understand the heterogeneity of PIT effects within and across mental disorders. Further, we stress the importance of reliable experimental tasks and provide test-retest data for a PIT task showing moderate to good reliability.
Finally, we point toward stress as a possible underlying factor that may explain stronger PIT effects in mental disorders, as there is some evidence that stress per se interacts with the impact of environmental cues on behavior by selectively increasing cue-triggered wanting.
To conclude, we discuss the results of the literature review in the light of Research Domain Criteria, suggesting future studies that comprehensively assess PIT across psychopathological dimensions.
Background:
Prejudices against minorities can be understood as habitually negative evaluations that are kept in spite of evidence to the contrary. Therefore, individuals with strong prejudices might be dominated by habitual or "automatic" reactions at the expense of more controlled reactions. Computational theories suggest individual differences in the balance between habitual/model-free and deliberative/model-based decision-making.
Methods:
127 subjects performed the Two-Step Task and completed the Blatant and Subtle Prejudice Scale.
Results:
Analyses of choices and reaction times, combined with computational modeling, showed that subjects with stronger blatant prejudices exhibited a shift away from model-based control. There was no association between these decision-making processes and subtle prejudices.
Conclusion:
These results support the idea that blatant prejudices toward minorities are related to a relative dominance of habitual decision-making. This finding has important implications for developing interventions that aim to change prejudices across societies.
This study aimed to relate well-established self-report and behavioral assessments to the latent constructs positive (PVS) and negative valence systems (NVS), cognitive systems (CS), and social processes (SP) of the Research Domain Criteria (RDoC) framework in a large transnosological population that cuts across DSM/ICD-10 disorder categories. One thousand four hundred and thirty-one participants (42.1% suffering from anxiety/fear-related, 18.2% from depressive, 7.9% from schizophrenia spectrum, 7.5% from bipolar, 3.4% from autism spectrum, 2.2% from other disorders, 18.4% healthy controls, and 0.2% with no diagnosis specified) recruited in studies within the German research network for mental disorders for the Phenotypic, Diagnostic and Clinical Domain Assessment Network Germany (PD-CAN) were examined with a Mini-RDoC-Assessment including behavioral and self-report measures. These data were analyzed with confirmatory factor analysis (CFA) to delineate the underlying latent RDoC structure. A revised four-factor model reflecting the core domains positive and negative valence systems as well as cognitive systems and social processes showed a good fit across this sample and a significantly better fit than a one-factor solution. The connections between the domains PVS, NVS and SP could be substantiated, indicating a universal latent structure spanning across known nosological entities. This study is the first to give an impression of the latent structure of, and intercorrelations between, four core Research Domain Criteria in a transnosological sample. We emphasize the possibility of using already existing and well-validated self-report and behavioral measurements to capture aspects of the latent structure informed by the RDoC matrix.
Mobile data collection of cognitive-behavioral tasks in substance use disorders: Where are we now?
(2022)
Introduction: Over the last decades, our understanding of the cognitive, motivational, and neural processes involved in addictive behavior has increased enormously. A plethora of laboratory-based and cross-sectional studies has linked cognitive-behavioral measures to between-subject differences in drinking behavior. However, such laboratory-based studies inevitably suffer from small sample sizes and the inability to link temporal fluctuations in task measures to fluctuations in real-life substance use. To overcome these problems, several existing behavioral tasks have been transferred to smartphones to allow studying cognition in the field. Method: In this narrative review, we first summarize studies that used existing behavioral tasks in the laboratory and self-reports of substance use with ecological momentary assessment (EMA) in the field. Next, we review studies on the psychometric properties of smartphone-based behavioral tasks. Finally, we review studies that used both smartphone-based tasks and self-reports with EMA in the field. Results: Overall, studies were scarce and heterogeneous both in tasks and in study outcomes. Nevertheless, existing findings are promising and point toward several methodological recommendations: concerning psychometrics, studies show that (although more systematic studies are necessary) task validity and reliability can be improved, for example, by analyzing several measurement sessions at once rather than analyzing sessions separately. Studies that use tasks in the field, moreover, show that power can be improved by choosing sampling schemes that combine time-based with event-based sampling, rather than relying on time-based sampling alone. Increasing sampling frequency can further increase power. However, as this also increases the burden on participants, more research is necessary to determine the ideal sampling frequency for each task.
Conclusion: Although more research is necessary to systematically study both the psychometrics of smartphone-based tasks and the frequency at which task measures fluctuate, existing studies are promising and reveal important methodological recommendations useful for researchers interested in implementing behavioral tasks in EMA studies.
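The recommendation to combine time-based with event-based sampling can be illustrated with a small scheduler. This is a hypothetical sketch: the function name, the minimum-gap rule, and the numbers are illustrative, not taken from any of the reviewed studies.

```python
def build_schedule(time_prompts, events, min_gap=1.0):
    """Merge fixed time-based prompts (in hours of the day) with event-triggered
    prompts, dropping event prompts that fall within `min_gap` hours of an
    already scheduled prompt to limit participant burden."""
    schedule = sorted(time_prompts)
    for e in sorted(events):
        if all(abs(e - t) >= min_gap for t in schedule):
            schedule.append(e)
    return sorted(schedule)

# hypothetical day: fixed prompts at 10:00, 14:00, 18:00
# plus event-triggered prompts at 13:30 (too close to 14:00) and 21:00
prompts = build_schedule([10, 14, 18], events=[13.5, 21])
# prompts == [10, 14, 18, 21]
```

In a real EMA deployment the minimum gap and the number of daily prompts would be tuned against the burden considerations the review raises.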
Bone pathology is frequent in stressed individuals. A comprehensive examination of mechanisms linking life stress, depression and disturbed bone homeostasis is missing. In this translational study, mice exposed to early life stress (MSUS) were examined for bone microarchitecture (μCT), metabolism (qPCR/ELISA), and neuronal stress mediator expression (qPCR) and compared with a sample of depressive patients with or without early life stress by analyzing bone mineral density (BMD) (DXA) and metabolic changes in serum (osteocalcin, PINP, CTX-I). MSUS mice showed a significant decrease in NGF, NPYR1, VIPR1 and TACR1 expression, higher innervation density in bone, and increased serum levels of CTX-I, suggesting a milieu in favor of catabolic bone turnover. MSUS mice had a significantly lower body weight compared to control mice, and this caused minor effects on bone microarchitecture. Depressive patients with experiences of childhood neglect also showed a catabolic pattern. A significant reduction in BMD was observed in depressive patients with childhood abuse and stressful life events during childhood. Therefore, future studies on prevention and treatment strategies for both mental and bone disease should consider early life stress as a risk factor for bone pathologies.
Drugs of abuse elicit dopamine release in the ventral striatum, possibly biasing dopamine-driven reinforcement learning towards drug-related reward at the expense of non-drug-related reward. Indeed, in alcohol-dependent patients, reactivity in dopaminergic target areas is shifted from non-drug-related stimuli towards drug-related stimuli. Such 'hijacked' dopamine signals may impair flexible learning from non-drug-related rewards, and thus promote craving for the drug of abuse. Here, we used functional magnetic resonance imaging to measure ventral striatal activation by reward prediction errors (RPEs) during a probabilistic reversal learning task in recently detoxified alcohol-dependent patients and healthy controls (N = 27). All participants also underwent 6-[F-18]fluoro-DOPA positron emission tomography to assess ventral striatal dopamine synthesis capacity. Neither ventral striatal activation by RPEs nor striatal dopamine synthesis capacity differed between groups. However, ventral striatal coding of RPEs correlated inversely with craving in patients. Furthermore, we found a negative correlation between ventral striatal coding of RPEs and dopamine synthesis capacity in healthy controls, but not in alcohol-dependent patients. Moderator analyses showed that the magnitude of the association between dopamine synthesis capacity and RPE coding depended on the amount of chronic, habitual alcohol intake. Despite the relatively small sample size, a power analysis supports the reported results. Using a multimodal imaging approach, this study suggests that dopaminergic modulation of neural learning signals is disrupted in alcohol dependence in proportion to patients' long-term alcohol intake. Alcohol intake may perpetuate itself by interfering with dopaminergic modulation of neural learning signals in the ventral striatum, thus increasing craving for habitual drug intake.
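The reward prediction error at the heart of such model-based fMRI analyses is the standard temporal-difference signal. A minimal sketch, with illustrative values for the learning rate and discount factor (not parameters from the study):

```python
def td_update(v, reward, v_next, alpha=0.1, gamma=0.9):
    """One temporal-difference update.
    delta = r + gamma * V(s') - V(s); V(s) moves by alpha * delta."""
    delta = reward + gamma * v_next - v
    return v + alpha * delta, delta

# hypothetical trial: state valued at 0.5 yields reward 1.0, terminal next state
v, delta = td_update(v=0.5, reward=1.0, v_next=0.0)
# delta = 1.0 + 0.9 * 0.0 - 0.5 = 0.5; updated v = 0.5 + 0.1 * 0.5 = 0.55
```

In RPE-coding analyses, the trial-by-trial delta series is what gets regressed against the ventral striatal BOLD signal.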
We investigated the efficacy of reminiscence therapy (RT) on symptoms of depression in patients with mild to moderate dementia. Out of 227 patients with mild to moderate dementia from a specialized physician’s office, 27 pairs (N = 54; mean age 79.04 ± 6.16 years) who had either received treatment as usual (TAU) or TAU combined with RT were matched retrospectively according to age as well as cognitive and depressive symptom scores. After controlling for age and sex, symptoms of depression significantly decreased over time in the RT group compared to TAU (F(1,52) = 4.36; p < .05). RT is a promising option for the treatment of depression in mild to moderate dementia. Larger randomized controlled trials are needed.
Mental disorders are among the greatest medical and social challenges facing us. They can occur at all stages of life and are among the most important commonly occurring diseases. In Germany 28 % of the population suffer from a mental disorder every year, while the lifetime risk of suffering from a mental disorder is almost 50 %. Mental disorders cause great suffering for those affected and their social network. Quantitatively speaking, they can be considered to be among those diseases creating the greatest burden for society due to reduced productivity, absence from work and premature retirement. The Federal Ministry of Education and Research is funding a new research network from 2015 to 2019 with up to 35 million euros to investigate mental disorders in order to devise and develop better therapeutic measures and strategies for this population by means of basic and translational clinical research. This is the result of a competitive call for research proposals entitled "research network for mental diseases". It is a nationwide network of nine consortia with up to ten psychiatric and clinical psychology partner institutions from largely university-based research facilities for adults and/or children and adolescents. Furthermore, three cross-consortia platform projects will seek to identify shared causes of diseases and new diagnostic modalities for anxiety disorders, attention deficit hyperactivity disorder (ADHD), autism, bipolar disorders, depression, schizophrenia and psychotic disorders as well as substance-related and addictive disorders. The spectrum of therapeutic approaches to be examined ranges from innovative pharmacological and psychotherapeutic treatment to novel brain stimulation procedures. In light of the enormous burden such diseases represent for society as a whole, a sustainable improvement in the financial support for those researching mental disorders seems essential.
This network aims to become a nucleus for long overdue and sustained support for a German center for mental disorders.
Background: Infection with human immunodeficiency virus (HIV) affects muscle mass, altering independent activities of people living with HIV (PLWH). Resistance training alone (RT) or combined with aerobic exercise (AE) is linked to improved muscle mass and strength maintenance in PLWH. These exercise benefits have been the focus of different meta-analyses, although only a limited number of studies had been identified up to 2013/4. An up-to-date systematic review and meta-analysis concerning the effect of RT alone or combined with AE on strength parameters and hormones is of high value, since more recent studies dealing with these types of exercise in PLWH have been published. Methods: A search for randomized controlled trials evaluating the effects of RT alone, AE alone, or the combination of both (AERT) in PLWH was performed in five web databases up to December 2017. Risk of bias and study quality were assessed using the PEDro scale. Weighted mean differences (WMD) from baseline to post-intervention changes were calculated, and the I² statistic for heterogeneity was computed.
Results: Thirteen studies reported strength outcomes. Eight studies presented a low risk of bias. The overall change in upper body strength was 19.3 kg (95% CI: 9.8 to 28.8, p < 0.001) after AERT and 17.5 kg (95% CI: 16 to 19.1, p < 0.001) for RT. Lower body change was 29.4 kg (95% CI: 18.1 to 40.8, p < 0.001) after RT and 10.2 kg (95% CI: 6.7 to 13.8, p < 0.001) for AERT. Changes were higher after controlling for the risk of bias in upper and lower body strength and for supervised exercise in lower body strength. A significant change towards lower levels of IL-6 was found (-2.4 ng/dl; 95% CI: -2.6 to -2.1; p < 0.001). Conclusion: Both resistance training alone and combined with aerobic exercise showed a positive change when studies with low risk of bias and professional supervision were analyzed, improving upper and, more critically, lower body muscle strength. Also, this study found that exercise had a lowering effect on IL-6 levels in PLWH.
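The pooling behind such WMD estimates can be sketched with fixed-effect inverse-variance weighting plus Cochran's Q and the I² heterogeneity statistic. The study-level effects and standard errors below are hypothetical placeholders, not the meta-analysis data:

```python
def pooled_wmd(effects):
    """Fixed-effect inverse-variance pooling.
    `effects` is a list of (wmd, standard_error) pairs, one per study.
    Returns the pooled WMD and the I^2 heterogeneity percentage."""
    weights = [1 / se ** 2 for _, se in effects]
    pooled = sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)
    # Cochran's Q: weighted squared deviations from the pooled estimate
    q = sum(w * (d - pooled) ** 2 for (d, _), w in zip(effects, weights))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, i2

# hypothetical strength changes (kg) with standard errors from three studies
pooled, i2 = pooled_wmd([(17.5, 0.8), (19.3, 4.8), (29.4, 5.8)])
```

Precise studies (small standard errors) dominate the pooled estimate, which is why the sketch's pooled value sits close to the first study's effect.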
Working memory (WM) performance declines with age. However, several studies have shown that WM training may lead to performance increases not only in the trained task, but also in untrained cognitive transfer tasks. It has been suggested that transfer effects occur if training task and transfer task share specific processing components that are supposedly processed in the same brain areas. In the current study, we investigated whether single-task WM training and training-related alterations in neural activity might support performance in a dual-task setting, thus assessing transfer effects to higher-order control processes in the context of dual-task coordination. A sample of older adults (age 60–72) was assigned to either a training or control group. The training group participated in 12 sessions of an adaptive n-back training. At pre- and post-measurement, a multimodal dual-task was performed in all participants to assess transfer effects. This task consisted of two simultaneous delayed match to sample WM tasks using two different stimulus modalities (visual and auditory) that were performed either in isolation (single-task) or in conjunction (dual-task). A subgroup also participated in functional magnetic resonance imaging (fMRI) during the performance of the n-back task before and after training. While no transfer to single-task performance was found, dual-task costs in both the visual modality (p < 0.05) and the auditory modality (p < 0.05) decreased at post-measurement in the training but not in the control group. In the fMRI subgroup of the training participants, neural activity changes in left dorsolateral prefrontal cortex (DLPFC) during one-back predicted post-training auditory dual-task costs, while neural activity changes in right DLPFC during three-back predicted visual dual-task costs. Results might indicate an improvement in central executive processing that could facilitate both WM and dual-task coordination.
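The adaptive n-back training can be sketched as a match detector plus a simple difficulty staircase. The accuracy thresholds and the example stimulus stream below are illustrative, not the study's actual adaptation rule:

```python
def nback_targets(stream, n):
    """Return positions whose stimulus matches the one presented n steps back."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

def adapt_level(n, accuracy, up=0.9, down=0.7):
    """Simple adaptive staircase: raise n after accurate blocks,
    lower it (never below 1) after inaccurate ones."""
    if accuracy >= up:
        return n + 1
    if accuracy < down:
        return max(1, n - 1)
    return n

# hypothetical letter stream for a 2-back block
hits = nback_targets("ABACAC", n=2)
# positions 2 (A matches A), 4 (A matches A), 5 (C matches C)
```

Keeping accuracy between the two thresholds holds each participant near their individual capacity limit, which is the point of adaptive training.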
Background: Given the well-established association between perceived stress and quality of life (QoL) in dementia patients and their partners, our goal was to identify whether relationship quality and dyadic coping would operate as mediators between perceived stress and QoL.
Methods: 82 dyads of dementia patients and their spousal caregivers were included in a cross-sectional assessment from a prospective study. QoL was assessed with the Quality of Life in Alzheimer's Disease scale (QoL-AD) for dementia patients and the WHO Quality of Life-BREF for spousal caregivers. Perceived stress was measured with the Perceived Stress Scale (PSS-14). Both partners were assessed with the Dyadic Coping Inventory (DCI). Correlation analyses and regression models, including mediation analyses, were performed.
Results: We found negative correlations between stress and QoL in both partners (QoL-AD: r = -0.62; p < 0.001; WHO-QOL Overall: r = -0.27; p = 0.02). Spousal caregivers had a significantly lower DCI total score than dementia patients (p < 0.001). Dyadic coping was a significant mediator of the relationship between stress and QoL in spousal caregivers (z = 0.28; p = 0.02), but not in dementia patients. Likewise, relationship quality significantly mediated the relationship between stress and QoL in caregivers only (z = -2.41; p = 0.02).
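Mediation z statistics like those reported here are commonly obtained with a Sobel-type test on the indirect effect. A minimal sketch with made-up path coefficients, not the study's estimates:

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel test for an indirect (mediated) effect a*b.

    a, se_a: path X -> mediator (e.g., stress -> dyadic coping) and its SE
    b, se_b: path mediator -> Y controlling for X (coping -> QoL) and its SE
    Returns the z statistic for the indirect effect.
    """
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)

def two_sided_p(z):
    """Two-sided p-value for a standard normal z statistic."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Hypothetical paths: stress lowers dyadic coping (a), coping raises QoL (b)
z = sobel_z(a=-0.40, se_a=0.10, b=0.50, se_b=0.15)
p = two_sided_p(z)
```

Bootstrap confidence intervals are often preferred over the Sobel test in small samples; the formula above only illustrates how the z statistic is formed.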
Conclusions: This study identified dyadic coping as a mediator of the relationship between stress and QoL in (caregiving) partners of dementia patients. In patients, however, we found a direct negative effect of stress on QoL. The findings underline the importance of stress-reducing and dyadic interventions for dementia patients and their partners, respectively.
Background: Continuous treatment is an important indicator of medication adherence in dementia. However, long-term studies in larger clinical settings are lacking, and little is known about moderating effects of patient and service characteristics.
Methods: Data from 12,910 outpatients with dementia (mean age 79.2 years; SD = 7.6 years) treated between January 2003 and December 2013 in Germany were included. Continuous treatment was analyzed using Kaplan-Meier curves and log-rank tests. In addition, multivariate Cox regression models were fitted with continuous treatment as the dependent variable and antidementia agent, age, gender, medical comorbidities, physician specialty, and health insurance status as predictors.
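The continuation curves behind this analysis follow the Kaplan-Meier product-limit estimator, which can be sketched in a few lines; the data below are toy values, not the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the 'still on treatment' curve.

    times:  follow-up time for each patient (e.g., months)
    events: 1 if treatment was discontinued at that time, 0 if censored
    Returns a list of (time, survival_probability) steps.
    """
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        # d = discontinuations at t; n = patients still at risk at t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d > 0:
            surv *= 1.0 - d / n  # product-limit update
            curve.append((t, surv))
    return curve

# Toy follow-up data: 8 patients, some censored (event = 0)
curve = kaplan_meier(times=[2, 3, 3, 5, 8, 8, 10, 12],
                     events=[1, 0, 1, 1, 0, 1, 0, 0])
```

Covariate effects (the hazard ratios reported below) then come from Cox regression on top of such survival data, which this sketch does not implement.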
Results: After one year of follow-up, nearly 60% of patients continued drug treatment. Donepezil (HR: 0.88; 95% CI: 0.82-0.95) and memantine (HR: 0.85; 0.79-0.91) users were less likely to discontinue treatment than rivastigmine users. Patients were also less likely to discontinue if they were treated by specialist physicians rather than general practitioners (HR: 0.44; 0.41-0.48). Younger male patients and patients with private health insurance had a lower discontinuation risk. Regarding comorbidity, patients were more likely to be continuously treated with the index substance if heart failure or hypertension had been diagnosed at baseline.
Conclusions: Our results imply that, besides the type of antidementia agent, involvement of a specialist in the complex process of prescribing antidementia drugs can provide meaningful benefits to patients in terms of more disease-specific and continuous treatment.
Background: Dementia is a psychiatric condition whose development is associated with numerous aspects of life. Our aim was to estimate dementia risk factors in German primary care patients.
Methods: The case-control study included primary care patients (70-90 years) with a first diagnosis of dementia (all-cause) during the index period (01/2010-12/2014) (Disease Analyzer, Germany), and controls without dementia matched (1:1) to cases on the basis of age, sex, type of health insurance, and physician. Practice visit records were used to verify that there had been 10 years of continuous follow-up prior to the index date. Multivariate logistic regression models were fitted with dementia as the dependent variable and the potential risk factors as predictors.
Results: The mean age for the 11,956 cases and the 11,956 controls was 80.4 (SD: 5.3) years. 39.0% of them were male and 1.9% had private health insurance. In the multivariate regression model, the following variables were linked to a significant extent with an increased risk of dementia: diabetes (OR: 1.17; 95% CI: 1.10-1.24), lipid metabolism disorders (1.07; 1.00-1.14), stroke incl. TIA (1.68; 1.57-1.80), Parkinson's disease (PD) (1.89; 1.64-2.19), intracranial injury (1.30; 1.00-1.70), coronary heart disease (1.06; 1.00-1.13), mild cognitive impairment (MCI) (2.12; 1.82-2.48), and mental and behavioral disorders due to alcohol use (1.96; 1.50-2.57). The use of statins (OR: 0.94; 0.90-0.99), proton-pump inhibitors (PPIs) (0.93; 0.90-0.97), and antihypertensive drugs (0.96; 0.94-0.99) was associated with a decreased risk of developing dementia.
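Odds ratios with confidence intervals like these are obtained by exponentiating logistic-regression coefficients and their interval bounds. A minimal sketch with a hypothetical coefficient (not one fitted in this study):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient to an odds ratio with 95% CI.

    beta: estimated log-odds coefficient for a predictor (e.g., diabetes)
    se:   its standard error
    The CI is symmetric on the log-odds scale, hence asymmetric as an OR.
    """
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient and standard error for a binary risk factor
or_, lo, hi = odds_ratio_ci(beta=0.157, se=0.030)
```

An OR above 1 with a lower CI bound above 1 indicates a significantly increased risk, which is how the predictors above are read.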
Conclusions: Risk factors for dementia found in this study are consistent with the literature. Nevertheless, the associations between statin, PPI, and antihypertensive drug use and a decreased risk of dementia need further investigation.
The aim was to analyze the risk of hip fracture in German primary care patients with dementia. This study included patients aged 65-90 from 1,072 primary care practices who were first diagnosed with dementia between 2010 and 2013. Controls were matched (1:1) to cases for age, sex, and type of health insurance. The primary outcome was the diagnosis of hip fracture during the three-year follow-up period. A total of 53,156 dementia patients and 53,156 controls were included. After three years, 5.3% of patients and 0.7% of controls had sustained a hip fracture. Hip fracture occurred more frequently in dementia subjects living in nursing homes than in those living at home (9.2% versus 4.3%). Dementia, residence in nursing homes, and osteoporosis were risk factors for fracture development. Antidementia, antipsychotic, and antidepressant drugs generally had no significant impact on hip fracture risk when prescribed for less than six months. Dementia increased hip fracture risk in German primary care practices.
Background: The goal of this study was to estimate the prevalence of and risk factors for diagnosed depression in heart failure (HF) patients in German primary care practices.
Methods: This study was a retrospective database analysis in Germany utilizing the Disease Analyzer (R) Database (IMS Health, Germany). The study population included 132,994 patients between 40 and 90 years of age from 1,072 primary care practices. The observation period was between 2004 and 2013. Follow-up lasted up to five years and ended in April 2015. A total of 66,497 HF patients were selected after applying exclusion criteria, and an equal number of 66,497 controls were matched (1:1) to HF patients on the basis of age, sex, health insurance, prior depression diagnosis, and follow-up duration after the index date.
Results: HF was a strong risk factor for diagnosed depression (p < 0.0001). A total of 10.5% of HF patients and 6.3% of matched controls developed depression after one year of follow-up (p < 0.001). Depression was documented in 28.9% of the HF group and 18.2% of the control group after the five-year follow-up (p < 0.001). Cancer, dementia, osteoporosis, stroke, and osteoarthritis were associated with a higher risk of developing depression. Male gender and private health insurance were associated with a lower risk of depression.
Conclusions: The risk of diagnosed depression is significantly increased in patients with HF compared to patients without HF in primary care practices in Germany.
The interruption of learning processes by breaks filled with diverse activities is common in everyday life. We investigated the effects of active computer gaming and passive relaxation (rest and music) breaks on working memory performance. Young adults were exposed to breaks involving (i) eyes-open resting, (ii) listening to music and (iii) playing the video game “Angry Birds” before performing the n-back working memory task. Based on linear mixed-effects modeling, we found that playing the “Angry Birds” video game during a short learning break led to a decline in task performance over the course of the task as compared to eyes-open resting and listening to music, although overall task performance was not impaired. This effect was associated with high levels of daily mind wandering and low self-reported ability to concentrate. These findings indicate that video games can negatively affect working memory performance over time when played in between learning tasks. We suggest further investigation of these effects because of their relevance to everyday activity.
Different systems for habitual versus goal-directed control are thought to underlie human decision-making. Working memory is known to shape these decision-making systems and their interplay, and to support goal-directed decision making even under stress. Here, we investigated whether and how the decision systems are differentially influenced by breaks filled with diverse everyday activities known to modulate working memory performance. We used a within-subject design in which young adults listened to music or played a video game during breaks interleaved with trials of a sequential two-step Markov decision task designed to assess habitual as well as goal-directed decision making. Based on a neurocomputational model of task performance, we observed that in individuals with a rather limited working memory capacity, video gaming, as compared to music, reduced reliance on the goal-directed decision-making system, whereas a rather large working memory capacity prevented such a decline. Our findings suggest differential effects of everyday activities on key decision-making processes.
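In the standard neurocomputational account of the two-step task, first-stage choices are modeled as a weighted mixture of model-based (goal-directed) and model-free (habitual) action values, with a single weight w capturing reliance on the goal-directed system. A minimal sketch of that mixing step, using hypothetical values rather than the study's fitted parameters:

```python
import math

def hybrid_first_stage_values(q_mf, q_mb, w):
    """Blend model-free (habitual) and model-based (goal-directed) values.

    q_mf, q_mb: dicts mapping first-stage action -> value from each system
    w:          weight on the model-based system
                (0 = fully habitual, 1 = fully goal-directed)
    """
    return {a: w * q_mb[a] + (1.0 - w) * q_mf[a] for a in q_mf}

def softmax_choice_probs(q_net, beta):
    """Softmax choice rule with inverse temperature beta."""
    exps = {a: math.exp(beta * v) for a, v in q_net.items()}
    z = sum(exps.values())
    return {a: e / z for a, e in exps.items()}

# Hypothetical values: the model-based system favors 'left',
# the model-free system favors 'right'.
q_mf = {"left": 0.2, "right": 0.6}
q_mb = {"left": 0.7, "right": 0.3}

# A high w shifts choice toward the goal-directed preference, a low w away from it
p_goal_directed = softmax_choice_probs(
    hybrid_first_stage_values(q_mf, q_mb, w=0.8), beta=3.0)
p_habitual = softmax_choice_probs(
    hybrid_first_stage_values(q_mf, q_mb, w=0.2), beta=3.0)
```

A reduced reliance on the goal-directed system, as observed after video gaming in low-capacity individuals, would correspond to a lower fitted w.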
Objective: To estimate the prevalence and the type of antidepressant medication prescribed by German psychiatrists to patients with depression and cardiovascular diseases (CVD). Methods: This study was a retrospective database analysis in Germany using the Disease Analyzer Database (IMS Health, Germany). The study population included 2,288 CVD patients between 40 and 90 years of age from 175 psychiatric practices. The observation period was between 2004 and 2013. Follow-up lasted up to 12 months and ended in April 2015. Also included were 2,288 non-CVD controls matched (1:1) to CVD cases on the basis of age, gender, health insurance coverage, depression severity, and diagnosing physician. Results: Mean age was 68.6 years; 46.2% of patients were men, and 5.9% had private health insurance coverage. Mild, moderate, or severe depression was present in 18.7%, 60.7%, and 20.6% of patients, respectively. Most patients received treatment within a year, many of them immediately after depression diagnosis. Patients with moderate and severe depression were more likely to receive treatment than patients with mild depression. There was no difference between CVD and non-CVD patients in the proportion treated. Nonetheless, CVD patients received selective serotonin reuptake inhibitors / serotonin-noradrenaline reuptake inhibitors (SSRIs/SNRIs) significantly more frequently. Conversely, patients without CVD were more often treated with tricyclic antidepressants (TCAs). Conclusion: There was no association between CVD and the initiation of depression treatment. Furthermore, CVD patients received SSRIs/SNRIs more frequently.