SIRT6 is a NAD(+)-dependent deacetylase that modulates chromatin structure and safeguards genomic stability. Until now, SIRT6 has been assigned to the nucleus, and only nuclear targets of SIRT6 are known. Here, we demonstrate that in response to stress, C. elegans SIR-2.4 and its mammalian orthologue SIRT6 localize to cytoplasmic stress granules, interact with various stress granule components and induce their assembly. Loss of SIRT6 or inhibition of its catalytic activity in mouse embryonic fibroblasts impairs stress granule formation and delays disassembly during recovery, whereas deficiency of SIR-2.4 diminishes maintenance of P granules and decreases survival of C. elegans under stress conditions. Our findings uncover a novel, evolutionarily conserved function of SIRT6 in the maintenance of stress granules in response to stress.
Genetic and environmental factors both contribute to cognitive test performance. A substantial increase in average intelligence test results in the second half of the previous century within one generation is unlikely to be explained by genetic changes. One possible explanation for the strong malleability of cognitive performance measures is that environmental factors modify gene expression via epigenetic mechanisms. Epigenetic factors may help to explain the recent observations of an association between dopamine-dependent encoding of reward prediction errors and cognitive capacity, which was modulated by adverse life events. The possible manifestation of malleable biomarkers contributing to variance in cognitive test performance, and thus possibly to the "missing heritability" between estimates from twin studies and variance explained by genetic markers, is still unclear. Here we show in 1475 healthy adolescents from the IMaging and GENetics (IMAGEN) sample that general IQ (gIQ) is associated with (1) polygenic scores for intelligence, (2) epigenetic modification of the DRD2 gene, (3) gray matter density in the striatum, and (4) functional striatal activation elicited by temporarily surprising reward-predicting cues. Comparing the relative importance of these measures for the prediction of gIQ in an overlapping subsample, our results demonstrate neurobiological correlates of the malleability of gIQ and point to equal importance of genetic variance, epigenetic modification of the DRD2 receptor gene, and functional striatal activation, which is known to influence dopamine neurotransmission. Peripheral epigenetic markers need to be confirmed in the central nervous system and should be tested in longitudinal settings that specifically assess individual and environmental factors modifying epigenetic structure.
Alcohol, tobacco, and illicit drug use are among the major risk factors for global death and disability. While there is increasing knowledge with respect to individual factors promoting the initiation and maintenance of substance use disorders (SUDs), disease trajectories involved in losing and regaining control over drug intake (ReCoDe) are still not well described. Our newly formed German Collaborative Research Centre (CRC) on ReCoDe has an interdisciplinary approach funded by the German Research Foundation (DFG) with a 12-year perspective. The main goals of our research consortium are (i) to identify triggers and modifying factors that longitudinally modulate the trajectories of losing and regaining control over drug consumption in real life, (ii) to study the underlying behavioral, cognitive, and neurobiological mechanisms, and (iii) to implement mechanism-based interventions. These goals will be achieved by (i) using mobile health (m-health) tools to longitudinally monitor the effects of triggers (drug cues, stressors, and priming doses) and modifying factors (e.g., age, gender, physical activity, and cognitive control) on drug consumption patterns in real-life conditions and in animal models of addiction; (ii) identifying and computationally modeling key mechanisms mediating the effects of such triggers and modifying factors on goal-directed, habitual, and compulsive aspects of behavior in human studies and animal models; and (iii) developing and testing interventions that specifically target the underlying mechanisms for regaining control over drug intake.
In animals and humans, behavior can be influenced by irrelevant stimuli, a phenomenon called Pavlovian-to-instrumental transfer (PIT). In subjects with substance use disorder, PIT is even enhanced, with functional activation in the nucleus accumbens (NAcc) and amygdala. Having previously observed enhanced behavioral and neural PIT effects in alcohol-dependent subjects, we here aimed to determine whether behavioral PIT is enhanced in young men with high-risk compared to low-risk drinking, and subsequently related functional activation in an a-priori region of interest encompassing the NAcc and amygdala to polygenic risk for alcohol consumption. A representative sample of 18-year-old men (n = 1937) was contacted: 445 were screened and 209 assessed, resulting in 191 valid behavioral, 139 imaging and 157 genetic datasets. None of the subjects fulfilled criteria for alcohol dependence according to the Diagnostic and Statistical Manual of Mental Disorders-IV-Text Revision (DSM-IV-TR). We measured how instrumental responding for rewards was influenced by background Pavlovian conditioned stimuli predicting action-independent rewards and losses. Behavioral PIT was enhanced in high- compared to low-risk drinkers (b = 0.09, SE = 0.03, z = 2.7, p < 0.009). Across all subjects, we observed a PIT-related neural blood oxygen level-dependent (BOLD) signal in the right amygdala (t = 3.25, p(SVC) = 0.04, x = 26, y = -6, z = -12), but not in the NAcc. The strength of the behavioral PIT effect was positively correlated with polygenic risk for alcohol consumption (r(s) = 0.17, p = 0.032). We conclude that behavioral PIT and polygenic risk for alcohol consumption might be a biomarker for a subclinical phenotype of risky alcohol consumption, even if no drug-related stimulus is present. The association between behavioral PIT effects and the amygdala might point to habitual processes related to our PIT task.
In non-dependent young social drinkers, the amygdala rather than the NAcc is activated during PIT; a possibly differential involvement depending on disease trajectory should be investigated in future studies.
Drunk decisions
(2018)
Background: Studies in humans and animals suggest a shift from goal-directed to habitual decision-making in addiction. We therefore tested whether acute alcohol administration reduces goal-directed and promotes habitual decision-making, and whether these effects are moderated by self-reported drinking problems. Methods: Fifty-three socially drinking males completed the two-step task in a randomised crossover design while receiving an intravenous infusion of ethanol (blood alcohol level = 80 mg%) or placebo. To minimise potential bias from long-standing heavy drinking and subsequent neuropsychological impairment, we tested 18- to 19-year-old adolescents. Results: Alcohol administration consistently reduced habitual, model-free decisions, while its effects on goal-directed, model-based behaviour varied as a function of drinking problems measured with the Alcohol Use Disorders Identification Test. While adolescents at low risk for drinking problems (scoring <8) exhibited an alcohol-induced numerical reduction in goal-directed choices, intermediate-risk drinkers showed a shift away from habitual towards goal-directed decision-making, such that alcohol possibly even improved their performance. Conclusions: We assume that alcohol disrupted basic cognitive functions underlying habitual and goal-directed decisions in low-risk drinkers, thereby enhancing hasty choices. Further, we speculate that intermediate-risk drinkers benefited from alcohol as a negative reinforcer that reduced unpleasant emotional states, possibly revealing a novel risk factor for drinking in adolescence.
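The goal-directed versus habitual distinction probed by the two-step task is commonly formalized as a weighted mixture of model-based and model-free reinforcement learning. The following is a minimal illustrative sketch of that idea, not the fitting procedure used in the study; all variable names, parameter values, and the toy transition structure are assumptions:

```python
import numpy as np

def mf_update(q, a, r, alpha=0.5):
    """Model-free (habitual) update: nudge the cached value of the
    chosen first-stage action toward the received reward."""
    q[a] += alpha * (r - q[a])
    return q

def mb_values(trans, q_stage2):
    """Model-based (goal-directed) values: for each first-stage action,
    the expected best second-stage value under the known transitions."""
    return trans @ q_stage2.max(axis=1)

# Toy two-step setup: 2 first-stage actions, 2 second-stage states,
# 2 actions per second-stage state (all numbers are illustrative).
trans = np.array([[0.7, 0.3],    # action 0 -> state A with p = 0.7
                  [0.3, 0.7]])   # action 1 -> state B with p = 0.7
q2 = np.array([[0.8, 0.2],       # learned second-stage action values
               [0.1, 0.4]])

w = 0.6                          # weight on model-based control
q1_mf = np.array([0.5, 0.3])     # cached (model-free) first-stage values
q1 = w * mb_values(trans, q2) + (1 - w) * q1_mf  # hybrid valuation
```

A lower weight `w` produces choices dominated by the cached values (habitual behavior), while a higher `w` produces planning over the transition structure (goal-directed behavior); the study's contrast between model-free and model-based decisions corresponds to estimating such a balance from choice data.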