Refine
Document Type
- Article (21)
- Postprint (6)
- Other (3)
- Conference Proceeding (1)
Language
- English (31)
Is part of the Bibliography
- yes (31)
Keywords
- Pavlovian-to-instrumental transfer (5)
- Alcohol dependence (4)
- alcohol (3)
- interference control (3)
- reward anticipation (3)
- Dopamine (2)
- Pavlovian‐to‐instrumental transfer (2)
- Reinforcement learning (2)
- Relapse (2)
- addiction (2)
BACKGROUND: Pavlovian-to-instrumental transfer (PIT) describes the influence of conditioned stimuli on instrumental behaviors and is discussed as a key process underlying substance abuse. Here, we tested whether neural responses during alcohol-related PIT predict future relapse in alcohol-dependent patients and future drinking behavior in adolescents. METHODS: Recently detoxified alcohol-dependent patients (n = 52) and young adults without dependence (n = 136) underwent functional magnetic resonance imaging (fMRI) during an alcohol-related PIT paradigm, and their drinking behavior was assessed in a 12-month follow-up. To predict future drinking behavior from PIT activation patterns, we used a multivoxel classification scheme based on linear support vector machines. RESULTS: When training and testing the classification scheme in patients, PIT activation patterns predicted future relapse with 71.2% accuracy. Feature selection revealed that classification was based exclusively on activation patterns in medial prefrontal cortex. To probe the generalizability of this fMRI-based prediction of future drinking behavior, we applied the support vector machine classifier that had been trained on patients to PIT fMRI data from adolescents. An analysis of cross-classification predictions revealed that those young social drinkers who were classified as abstainers showed a greater reduction in alcohol consumption at 12-month follow-up than those classified as relapsers (Δ = −24.4 ± 6.0 g vs. −5.7 ± 3.6 g; p = .019). CONCLUSIONS: These results suggest that neural responses during PIT could constitute a generalized prognostic marker for future drinking behavior in established alcohol use disorder and in at-risk states.
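The approach described in this abstract, a linear classifier trained on activation patterns and scored by cross-validation, can be illustrated with a minimal sketch. This is not the authors' pipeline: it substitutes a simple nearest-class-mean linear rule for the support vector machine and uses synthetic two-feature "activation patterns" (all data, names, and parameters are invented for illustration), evaluated with leave-one-out cross-validation:

```python
import random

def class_means(train, labels):
    # mean "activation pattern" per class
    means = {}
    for c in sorted(set(labels)):
        rows = [x for x, l in zip(train, labels) if l == c]
        means[c] = [sum(col) / len(rows) for col in zip(*rows)]
    return means

def nearest_mean_classify(train, labels, x):
    # linear decision rule: assign x to the class whose mean pattern is closest
    means = class_means(train, labels)
    return min(means, key=lambda c: sum((xi - mi) ** 2
                                        for xi, mi in zip(x, means[c])))

def loo_accuracy(data, labels):
    # leave-one-out cross-validation: train on n-1 subjects, test on the held-out one
    hits = sum(nearest_mean_classify(data[:i] + data[i + 1:],
                                     labels[:i] + labels[i + 1:],
                                     data[i]) == labels[i]
               for i in range(len(data)))
    return hits / len(data)

# synthetic two-feature "activation patterns" for two outcome groups
random.seed(0)
abstainers = [[random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)] for _ in range(26)]
relapsers = [[random.gauss(2.0, 1.0), random.gauss(2.0, 1.0)] for _ in range(26)]
data, labels = abstainers + relapsers, ["abstain"] * 26 + ["relapse"] * 26

accuracy = loo_accuracy(data, labels)  # well above chance for separated groups
```

The key design point carried over from the abstract is that the classifier is never tested on a subject it was trained on; the reported 71.2% accuracy refers to exactly this kind of held-out prediction.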
The influence of Pavlovian conditioned stimuli on ongoing behavior may help explain how alcohol cues stimulate drug seeking and intake. Using a Pavlovian-instrumental transfer task, we investigated the effects of alcohol-related cues on approach behavior (i.e., instrumental response behavior) and its neural correlates, and related both to relapse after detoxification in alcohol-dependent patients. Thirty-one recently detoxified alcohol-dependent patients and 24 healthy controls underwent instrumental training, where approach or non-approach towards initially neutral stimuli was reinforced by monetary incentives. Approach behavior was tested during extinction with either alcohol-related or neutral stimuli (as Pavlovian cues) presented in the background during functional magnetic resonance imaging (fMRI). Patients were subsequently followed up for 6 months. We observed that alcohol-related background stimuli inhibited approach behavior in detoxified alcohol-dependent patients (t = -3.86, p < .001), but not in healthy controls (t = -0.92, p = .36). This behavioral inhibition was associated with neural activation in the nucleus accumbens (NAcc) (t(30) = 2.06, p < .05). Interestingly, both effects were present only in subsequent abstainers, not relapsers, and in those with mild but not severe dependence. Our data show that alcohol-related cues can acquire inhibitory behavioral features typical of aversive stimuli despite being accompanied by stronger NAcc activation, suggesting salience attribution. The fact that these findings are restricted to abstinence and milder illness suggests that they may be potential resilience factors.
Background: Human and animal work suggests a shift from goal-directed to habitual decision-making in addiction. However, the evidence for this in human alcohol dependence is as yet inconclusive. Methods: Twenty-six healthy controls and 26 recently detoxified alcohol-dependent patients underwent behavioral testing with a 2-step task designed to disentangle goal-directed and habitual response patterns. Results: Alcohol-dependent patients showed less evidence of goal-directed choices than healthy controls, particularly after losses. There was no difference in the strength of the habitual component. The group differences did not survive controlling for performance on the Digit Symbol Substitution Task. Conclusion: Chronic alcohol use appears to selectively impair goal-directed function, rather than promoting habitual responding. It appears to do so particularly after nonrewards, and this may be mediated by the effects of alcohol on more general cognitive functions subserved by the prefrontal cortex.
Mobile data collection of cognitive-behavioral tasks in substance use disorders: Where are we now?
(2022)
Introduction: Over the last decades, our understanding of the cognitive, motivational, and neural processes involved in addictive behavior has increased enormously. A plethora of laboratory-based and cross-sectional studies has linked cognitive-behavioral measures to between-subject differences in drinking behavior. However, such laboratory-based studies inevitably suffer from small sample sizes and the inability to link temporal fluctuations in task measures to fluctuations in real-life substance use. To overcome these problems, several existing behavioral tasks have been transferred to smartphones to allow studying cognition in the field. Method: In this narrative review, we first summarize studies that used existing behavioral tasks in the laboratory and self-reports of substance use with ecological momentary assessment (EMA) in the field. Next, we review studies on the psychometric properties of smartphone-based behavioral tasks. Finally, we review studies that used both smartphone-based tasks and self-reports with EMA in the field. Results: Overall, studies were scarce and heterogeneous in both tasks and study outcomes. Nevertheless, existing findings are promising and point toward several methodological recommendations: concerning psychometrics, studies show that - although more systematic studies are necessary - task validity and reliability can be improved, for example, by analyzing several measurement sessions at once rather than analyzing sessions separately. Studies that use tasks in the field, moreover, show that power can be improved by choosing sampling schemes that combine time-based with event-based sampling, rather than relying on time-based sampling alone. Increasing sampling frequency can further increase power. However, as this also increases the burden on participants, more research is necessary to determine the ideal sampling frequency for each task.
Conclusion: Although more research is necessary to systematically study both the psychometrics of smartphone-based tasks and the frequency at which task measures fluctuate, existing studies are promising and yield methodological recommendations for researchers interested in implementing behavioral tasks in EMA studies.
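The recommendation above, combining time-based with event-based sampling, can be sketched as a toy prompt scheduler. All parameters here (four prompts per day, three-hour spacing, a 30-minute minimum gap) are illustrative assumptions, not values from the reviewed studies:

```python
from datetime import datetime, timedelta

def time_based_prompts(day_start, n_prompts=4, spacing_hours=3):
    # fixed time-based schedule: evenly spaced prompts across the day
    return [day_start + timedelta(hours=spacing_hours * i) for i in range(n_prompts)]

def combined_schedule(time_prompts, event_prompts, min_gap=timedelta(minutes=30)):
    # event-based prompts always fire; time-based prompts falling within
    # min_gap of an event are dropped to limit participant burden
    kept = [t for t in time_prompts
            if all(abs(t - e) >= min_gap for e in event_prompts)]
    return sorted(kept + list(event_prompts))

day_start = datetime(2022, 5, 1, 9, 0)
timed = time_based_prompts(day_start)    # 09:00, 12:00, 15:00, 18:00
events = [datetime(2022, 5, 1, 12, 10)]  # e.g. a self-reported substance-use event
schedule = combined_schedule(timed, events)  # 12:00 prompt dropped, event kept
```

The design choice mirrors the burden trade-off noted in the abstract: event-based prompts capture the moments of interest, while nearby time-based prompts are thinned out so that total sampling frequency stays tolerable.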
Rationale: Advances in neurocomputational modeling suggest that the valuation systems for goal-directed (deliberative) and habitual (automatic) decision-making may rely on distinct computational strategies for reinforcement learning, namely model-free vs. model-based learning. As a key theoretical difference, the model-based system places strong demands on cognitive functions, because actions are planned prospectively based on an internal cognitive model of the environment, whereas valuation in the model-free system relies on rather simple learning rules from operant conditioning to retrospectively associate actions with their outcomes and is thus cognitively less demanding. Acute stress reactivity is known to impair model-based but not model-free choice behavior, with higher working memory capacity protecting the model-based system from acute stress. However, it is unclear what impact accumulated real-life stress has on model-free and model-based decision systems and how this influence interacts with cognitive abilities. Methods: We used a sequential decision-making task that distinguishes the relative contributions of both learning strategies to choice behavior, the Social Readjustment Rating Scale questionnaire to assess accumulated real-life stress, and the Digit Symbol Substitution Test to assess cognitive speed in 95 healthy subjects. Results: Individuals reporting high stress exposure who had low cognitive speed showed reduced model-based but increased model-free behavioral control. In contrast, subjects exposed to accumulated real-life stress with high cognitive speed displayed increased model-based performance but reduced model-free control. Conclusion: These findings suggest that accumulated real-life stress exposure can enhance reliance on cognitive speed for model-based computations, which may ultimately protect the model-based system from the detrimental influences of accumulated real-life stress.
The combination of accumulated real-life stress exposure and slower information-processing capacities, however, might favor model-free strategies. Thus, the relative reliance on either system strongly depends on stressful experiences and individual cognitive capacities.
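The model-free vs. model-based distinction drawn in this abstract can be made concrete with a minimal sketch (not the study's computational model): a model-free value cached and updated retrospectively by a temporal-difference rule, a model-based value computed prospectively from a one-step transition model, and a weighting parameter w mixing the two. State names, probabilities, and the learning rate are invented for illustration:

```python
def td_update(q_mf, state, action, reward, alpha=0.1):
    # model-free: retrospectively nudge the cached value toward the received reward
    q_mf[(state, action)] += alpha * (reward - q_mf[(state, action)])

def model_based_value(transitions, state_values, state, action):
    # model-based: prospectively average outcome values over the transition model
    return sum(p * state_values[s2]
               for s2, p in transitions[(state, action)].items())

def hybrid_value(q_mf, transitions, state_values, state, action, w):
    # w = 1: purely model-based control; w = 0: purely model-free control
    return (w * model_based_value(transitions, state_values, state, action)
            + (1 - w) * q_mf[(state, action)])

# toy two-step environment: two first-stage actions lead probabilistically
# to two second-stage states with known values
transitions = {("s0", "L"): {"sA": 0.7, "sB": 0.3},
               ("s0", "R"): {"sA": 0.3, "sB": 0.7}}
state_values = {"sA": 1.0, "sB": 0.0}
q_mf = {("s0", "L"): 0.0, ("s0", "R"): 0.0}

td_update(q_mf, "s0", "L", reward=1.0)  # q_mf[("s0","L")] is now 0.1
mb = model_based_value(transitions, state_values, "s0", "L")  # 0.7
mixed = hybrid_value(q_mf, transitions, state_values, "s0", "L", w=0.5)  # 0.4
```

The sketch shows why the model-based system is cognitively more demanding: it must look up and combine transition probabilities at choice time, whereas the model-free system only retrieves a single cached value.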
Genetic and environmental factors both contribute to cognitive test performance. A substantial increase in average intelligence test results in the second half of the previous century within one generation is unlikely to be explained by genetic changes. One possible explanation for the strong malleability of cognitive performance measures is that environmental factors modify gene expression via epigenetic mechanisms. Epigenetic factors may help to explain the recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events. The possible manifestation of malleable biomarkers contributing to variance in cognitive test performance, and thus possibly contributing to the "missing heritability" between estimates from twin studies and variance explained by genetic markers, is still unclear. Here we show in 1475 healthy adolescents from the IMaging and GENetics (IMAGEN) sample that general IQ (gIQ) is associated with (1) polygenic scores for intelligence, (2) epigenetic modification of the DRD2 gene, (3) gray matter density in the striatum, and (4) functional striatal activation elicited by temporarily surprising reward-predicting cues. Comparing their relative importance for the prediction of gIQ in an overlapping subsample, our results demonstrate neurobiological correlates of the malleability of gIQ and point to equal importance of genetic variance, epigenetic modification of the DRD2 receptor gene, and functional striatal activation, known to influence dopamine neurotransmission. Peripheral epigenetic markers are in need of confirmation in the central nervous system and should be tested in longitudinal settings specifically assessing individual and environmental factors that modify epigenetic structure.
Drunk decisions
(2018)
Background: Studies in humans and animals suggest a shift from goal-directed to habitual decision-making in addiction. We therefore tested whether acute alcohol administration reduces goal-directed and promotes habitual decision-making, and whether these effects are moderated by self-reported drinking problems. Methods: Fifty-three socially drinking males completed the two-step task in a randomised crossover design while receiving an intravenous infusion of ethanol (blood alcohol level = 80 mg%) or placebo. To minimise potential bias from long-standing heavy drinking and subsequent neuropsychological impairment, we tested 18- to 19-year-old adolescents. Results: Alcohol administration consistently reduced habitual, model-free decisions, while its effects on goal-directed, model-based behaviour varied as a function of drinking problems measured with the Alcohol Use Disorders Identification Test. While adolescents at low risk for drinking problems (scoring <8) exhibited an alcohol-induced numerical reduction in goal-directed choices, intermediate-risk drinkers showed a shift away from habitual towards goal-directed decision-making, such that alcohol possibly even improved their performance. Conclusions: We assume that alcohol disrupted basic cognitive functions underlying habitual and goal-directed decisions in low-risk drinkers, thereby enhancing hasty choices. Further, we speculate that intermediate-risk drinkers benefited from alcohol as a negative reinforcer that reduced unpleasant emotional states, possibly revealing a novel risk factor for drinking in adolescence.
Alcohol, tobacco, and illicit drug use are among the major risk factors for global death and disability. While there is increasing knowledge with respect to individual factors promoting the initiation and maintenance of substance use disorders (SUDs), the disease trajectories involved in losing and regaining control over drug intake (ReCoDe) are still not well described. Our newly formed German Collaborative Research Centre (CRC) on ReCoDe takes an interdisciplinary approach funded by the German Research Foundation (DFG) with a 12-year perspective. The main goals of our research consortium are (i) to identify triggers and modifying factors that longitudinally modulate the trajectories of losing and regaining control over drug consumption in real life, (ii) to study the underlying behavioral, cognitive, and neurobiological mechanisms, and (iii) to implement mechanism-based interventions. These goals will be achieved by (i) using mobile health (m-health) tools to longitudinally monitor the effects of triggers (drug cues, stressors, and priming doses) and modifying factors (e.g., age, gender, physical activity, and cognitive control) on drug consumption patterns in real-life conditions and in animal models of addiction; (ii) identifying and computationally modeling the key mechanisms mediating the effects of such triggers and modifying factors on goal-directed, habitual, and compulsive aspects of behavior in human studies and animal models; and (iii) developing and testing interventions that specifically target the underlying mechanisms for regaining control over drug intake.