14 research outputs found

    Linking Striatal Dopamine and Decision-Making to Adolescent Risk-Taking

    Adolescence is characterized by a peak in risk-taking behaviors that increases the likelihood of problematic substance use, sexually transmitted diseases, and fatal accidents. Prominent neurodevelopmental theories suggest these behaviors are driven by the maturation of the striatal dopamine (DA) system and its modulation of prefrontal-striatal circuitry. To date, research in this area has been constrained both by the difficulty of assessing DA systems in vivo in human adolescents and by an incomplete understanding of the intermediate cognitive and affective processes linking striatal DA and risk-taking. This dissertation built upon a first-of-its-kind longitudinal neuroimaging dataset (N=144) using direct (positron emission tomography [PET]) and indirect (brain tissue iron) measures of striatal DA, resting-state functional connectivity data, field-standard risk-taking measures, and a validated, developmentally sensitive decision-making task. To increase statistical power, an additional sample (N=187) with key overlapping measures was also examined. Across three aims, mixed support was found for the hypothesized integrative psychobiological model. Consistent with prior work, significant developmental differences were found in risk-taking propensity measures (both adolescent peaks and age-related decreases), in brain iron-based, indirect measures of striatal DA (age-related increases), and in model-based learning during the decision-making task (age-related increases). However, associations between risk-taking propensity measures and striatal DA measures were small in magnitude and not statistically significant. An exploratory analysis of decision-making task performance found evidence for an association with indirect striatal DA measures: those with higher striatal iron for their age displayed more habitual responding during early adolescence.
There was also evidence that striatal tissue iron measures were associated with frontostriatal connectivity. Nevertheless, support for broader circuit-level hypotheses, in which developmental changes in dopamine processing drive changes in frontostriatal connectivity and, in turn, risk-taking propensity, was limited in this sample. Results suggest risk-taking may be related to striatal DA indirectly via decreased frontostriatal connectivity, although these associations were not developmentally sensitive in the current sample. These initial results establish testable hypotheses for larger developmental samples with more detailed phenotyping and expanded imaging metrics. Ultimately, this work can inform diverse neurodevelopmental pathways of adolescent risk-taking and contribute to biologically informed interventions for at-risk youth.

    Neural Correlates of Rewarded Response Inhibition in Youth at Risk for Problematic Alcohol Use

    Risk for substance use disorder (SUD) is associated with poor response inhibition and heightened reward sensitivity. During adolescence, incentives improve performance on response inhibition tasks and increase recruitment of cortical control areas (Geier et al., 2010) associated with SUD (Chung et al., 2011). However, it is unknown whether incentives moderate the relationship between response inhibition and trait-level psychopathology and personality features of substance use risk. We examined these associations in the current project using a rewarded antisaccade (AS) task (Geier et al., 2010) in youth at risk for substance use. Participants were 116 adolescents and young adults (ages 12–21) from the University of Pittsburgh site of the National Consortium on Adolescent Neurodevelopment and Alcohol (NCANDA) study, with neuroimaging data collected at baseline and 1-year follow-up visits. Building upon previous work using this task in normative developmental samples (Geier et al., 2010) and adolescents with SUD (Chung et al., 2011), we examined both trial-wise BOLD responses and those associated with individual task epochs (cue presentation, response preparation, and response), relating them to multiple substance use risk factors (externalizing and internalizing psychopathology, family history of substance use, and trait impulsivity). Results showed that externalizing psychopathology and high levels of trait impulsivity (positive urgency, SUPPS-P) were associated with general decreases in antisaccade performance. Accompanying this main effect of poor performance, positive urgency was associated with reduced recruitment of the frontal eye fields (FEF) and inferior frontal gyrus (IFG), both in a priori regions of interest and at the voxelwise level. Consistent with previous work, monetary incentives improved antisaccade behavioral performance and were associated with increased activation in the striatum and cortical control areas.
However, incentives did not moderate the association between response inhibition performance and any trait-level psychopathology or personality factor of substance use risk. Reward interactions were observed for BOLD responses at the task-epoch level; however, they were inconsistent across substance use risk types. These results may suggest that poor response inhibition and heightened reward sensitivity are not overlapping neurocognitive features of substance use risk. Alternatively, more subtle, common longitudinal processes might jointly explain reward sensitivity and response inhibition deficits in substance use risk.

    Adolescent development of cortical oscillations: Power, phase, and support of cognitive maturation.

    During adolescence, the integration of specialized functional brain networks related to cognitive control continues to increase. Slow-frequency oscillations (4–10 Hz) have been shown to support cognitive control processes, especially within prefrontal regions. However, it is unclear how neural oscillations contribute to functional brain network development and improvements in cognitive control during adolescence. To bridge this gap, we employed magnetoencephalography (MEG) to explore changes in oscillatory power and phase coupling across cortical networks in a sample of 68 adolescents and young adults. We found a redistribution of power from lower to higher frequencies throughout adolescence, such that delta band (1–3 Hz) power decreased, whereas beta band power (14–16 and 22–26 Hz) increased. Delta band power decreased with age most strongly in association networks within the frontal lobe and operculum. Conversely, beta band power increased throughout development, most strongly in processing networks and the posterior cingulate cortex, a hub of the default mode (DM) network. In terms of phase, theta band (5–9 Hz) phase-locking robustly decreased with development, following an anterior-to-posterior gradient, with the greatest decoupling occurring between association networks. Additionally, decreased slow-frequency phase-locking between frontolimbic regions was related to decreased impulsivity with age. Thus, greater decoupling of slow-frequency oscillations may afford functional networks greater flexibility during the resting state to instantiate control when required.
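Phase coupling of band-limited oscillations, as examined in this abstract, is commonly quantified with the phase-locking value (PLV): the magnitude of the mean phase-difference vector between two signals. The sketch below illustrates the metric on simulated signals with a shared 6 Hz component; the sampling rate, filter order, and signal construction are illustrative assumptions, not the study's MEG pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0  # sampling rate (Hz); assumed for this sketch
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)

# Two simulated channels sharing a 6 Hz (theta-band) rhythm at a fixed phase lag
x = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 6 * t + 0.3) + 0.5 * rng.normal(size=t.size)

# Band-pass filter to the 5-9 Hz theta band used in the abstract
b, a = butter(4, [5 / (fs / 2), 9 / (fs / 2)], btype="band")
xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)

# Instantaneous phases via the Hilbert transform, then PLV over time
dphi = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
plv = np.abs(np.mean(np.exp(1j * dphi)))
print(f"theta PLV: {plv:.2f}")
```

Because the two channels share a dominant theta rhythm with a constant phase offset, the PLV is close to 1; independent noise alone would drive it toward 0.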

    A canonical trajectory of executive function maturation from adolescence to adulthood

    Theories of human neurobehavioral development suggest executive functions mature from childhood through adolescence, underlying adolescent risk-taking and the emergence of psychopathology. Investigations with relatively small datasets or narrow subsets of measures have identified general executive function development, but the specific maturational timing and independence of potential executive function subcomponents remain unknown. Integrating four independent datasets (N = 10,766; 8–35 years old) with twenty-three measures from seventeen tasks, we provide a precise charting, multi-assessment investigation, and replication of executive function development from adolescence to adulthood. Across assessments and datasets, executive functions follow a canonical non-linear trajectory, with rapid and statistically significant development from late childhood to mid-adolescence (10–15 years old) before stabilizing to adult levels in late adolescence (18–20 years old). Age effects are well captured by domain-general processes that generate reproducible developmental templates across assessments and datasets. Results provide a canonical trajectory of executive function maturation that demarcates the boundaries of adolescence and can be integrated into future studies.

    Adolescent Risk-Taking Across Population Subgroups of the United States

    Aim: Adolescence is assumed to be the period of the lifespan with the highest risk-taking and sensation-seeking behaviors. A normative peak in risk-taking during adolescence is thought to underlie both adaptive (e.g., independence from caregivers) and potentially harmful (e.g., substance use) outcomes. There are, however, many population-level sociodemographic and psychological factors that have been speculated to be associated with risk-taking and sensation-seeking across the lifespan, and with mental health disorders where these behaviors are diagnostically relevant. Yet, it is not well known whether such factors influence the hypothesized lifespan peaks in risk-taking during adolescence and/or its link to harmful outcomes (e.g., substance use). Clarifying the generalizability of adolescent peaks in risk-taking across population-level sociodemographic and psychiatric factors is thus essential for establishing broadly applicable and inclusive models of behavioral development for both basic developmental science and clinical care. Large-scale national survey data provide a well-suited means of testing these goals and can inform the increasing number of smaller, targeted studies of adolescent risk-taking. Methods: National Survey on Drug Use and Health 2002–2019 data (N=1,005,421; 12–65 years old) were used to fit non-linear lifespan trajectories (via spline regression) of a self-reported risk-taking propensity measure. Variation in this measure across the lifespan, and its link to substance use during adolescence, was examined in independent models across 19 sociodemographic and psychiatric subgroups (sex, race/ethnicity, socioeconomic status, population density, religious affiliation, and mental health). Results: Self-reported risk-taking displayed a non-linear trajectory across the lifespan, with a robust peak during mid-adolescence (~16 years old) that matched basic science theories of adolescent development. While average-level ("main effects") differences in adolescent risk-taking were observed (e.g., males reported more risk-taking than females), population subgroups displayed highly consistent lifespan peaks in risk-taking during mid to late adolescence (18 of 19 subgroups displayed peaks between 15 and 18 years old). Likewise, across nearly all population subgroups, higher levels of risk-taking consistently distinguished adolescents who regularly use cannabis or alcohol from non-using adolescents. Conclusions: Results support predictions of adolescence as the period of the lifespan with the highest risk-taking, which potentially confers vulnerability to negative outcomes, including substance use. The consistency of adolescent peaks in risk-taking, and of links to regular cannabis and alcohol use, across nearly all population subgroups suggests risk-taking measures may be useful in large-scale screening and related prevention efforts.
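The core analytic step described here, fitting a non-linear lifespan trajectory by spline regression and locating its adolescent peak, can be sketched as follows. All data below are simulated (a Gaussian-shaped peak at age 16 plus noise), and the binning and smoothing parameters are illustrative assumptions, not the NSDUH analysis itself.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Simulated individual-level data: risk-taking propensity peaking near age 16
ages = rng.uniform(12, 65, 20000)
risk = np.exp(-((ages - 16) ** 2) / (2 * 4.0**2)) + rng.normal(0, 0.2, ages.size)

# Average within 1-year age bins, then fit a cubic smoothing spline to the means
bins = np.arange(12, 66)
centers = (bins[:-1] + bins[1:]) / 2
idx = np.digitize(ages, bins) - 1
means = np.array([risk[idx == i].mean() for i in range(len(centers))])
spline = UnivariateSpline(centers, means, k=3, s=0.01)

# Locate the peak of the fitted trajectory on a dense age grid
grid = np.linspace(12, 65, 1000)
peak_age = grid[np.argmax(spline(grid))]
print(f"estimated peak age: {peak_age:.1f}")
```

With this simulated peak the fitted trajectory recovers a maximum in mid-adolescence; in practice, spline regression on raw survey responses would also incorporate survey weights and subgroup-specific models.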

    Brain tissue iron neurophysiology and its relationship with the cognitive effects of dopaminergic modulation in children with and without ADHD

    Children with attention-deficit/hyperactivity disorder (ADHD) exhibit impairments in response inhibition. These impairments are ameliorated by modulating dopamine (DA) via the administration of rewards or stimulant medication such as methylphenidate (MPH). It is currently unclear whether intrinsic DA availability shapes these effects of dopaminergic modulation on response inhibition. Thus, we estimated intrinsic DA availability using magnetic resonance-based assessments of basal ganglia and thalamic tissue iron in 36 medication-naïve children with ADHD and 29 typically developing (TD) children (8–12 years) who underwent fMRI scans and completed standard and rewarded go/no-go tasks. Children with ADHD additionally participated in a double-blind, randomized, placebo-controlled, crossover MPH challenge. Using linear regressions covarying for age and sex, we found no group differences in brain tissue iron. We additionally found that higher putamen tissue iron was associated with worse response inhibition performance in all participants. Crucially, we observed that higher putamen and caudate tissue iron was associated with greater responsivity to MPH, as measured by improved task performance, in participants with ADHD. These results begin to clarify the role of subcortical brain tissue iron, a measure associated with intrinsic DA availability, in the cognitive effects of reward- and MPH-related dopaminergic modulation in children with ADHD and TD children.
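The group comparison described above, a linear regression of tissue iron on diagnostic group while covarying for age and sex, can be sketched with ordinary least squares on a design matrix. The data below are simulated (only the group sizes, 36 ADHD and 29 TD, come from the abstract); coefficients and the resulting t-statistic are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 65  # 36 ADHD + 29 TD, as in the study; all measurements below are simulated

group = np.r_[np.ones(36), np.zeros(29)]  # 1 = ADHD, 0 = TD
age = rng.uniform(8, 12, n)
sex = rng.integers(0, 2, n).astype(float)

# Simulated putamen tissue iron: depends on age and sex, with no true group effect
iron = 0.5 * age + 0.2 * sex + rng.normal(0, 1, n)

# OLS: iron ~ intercept + group + age + sex
X = np.column_stack([np.ones(n), group, age, sex])
beta, *_ = np.linalg.lstsq(X, iron, rcond=None)

# t-statistic for the group coefficient (beta[1])
resid = iron - X @ beta
dof = n - X.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
t_group = beta[1] / se[1]
print(f"group beta = {beta[1]:.2f}, t({dof}) = {t_group:.2f}")
```

Since the simulated data contain no true group difference, the group t-statistic stays small, mirroring the null group result reported in the abstract; the age covariate absorbs the age-related variance in iron.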