223 research outputs found

    Prefrontal neural correlates of memory for sequences


    Neural ensemble decoding reveals a correlate of viewer- to object-centered spatial transformation in monkey parietal cortex

    The parietal cortex contains representations of space in multiple coordinate systems, including retina-, head-, body-, and world-based systems. Previously, we found that when monkeys are required to perform spatial computations on objects, many neurons in parietal area 7a represent position in an object-centered coordinate system as well. Because visual information enters the brain in a retina-centered reference frame, generating an object-centered reference requires the brain to perform computations on the visual input. We provide evidence that area 7a contains a correlate of that computation. Specifically, area 7a contains neurons that code information in retina- and object-centered coordinate systems. The information in retina-centered coordinates emerges first, followed by the information in object-centered coordinates. We found that the strength and accuracy of these representations are correlated across trials. Finally, we found that retina-centered information could be used to predict subsequent object-centered signals, but not vice versa. These results are consistent with the hypothesis that either area 7a, or an area that precedes area 7a in the visual processing hierarchy, performs the retina- to object-centered transformation.
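The temporal-prediction finding above (retina-centered information predicting later object-centered information, but not the reverse) can be sketched as a lagged-correlation analysis on simulated decoded signals. Everything here is an illustrative assumption, not the study's data or method: the "object-centered" signal is modeled as a delayed, noisy copy of the "retina-centered" one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population-decoded signals (names illustrative, not from the paper):
# the object-centered signal is a delayed, noisy transform of the retina-centered one.
T, delay = 500, 8
retina = rng.standard_normal(T)
objectc = np.roll(retina, delay) + 0.3 * rng.standard_normal(T)
objectc[:delay] = rng.standard_normal(delay)  # no signal before the transform arrives

def lagged_r(x, y, lag):
    """Correlation between x[t] and y[t+lag] (positive lag: x predicts future y)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

lags = list(range(-20, 21))
rs = [lagged_r(retina, objectc, L) for L in lags]
best_lag = lags[int(np.argmax(rs))]
print(best_lag)  # positive: retina-centered signal predicts the later object-centered one
```

A positive best lag in one direction, with no matching peak in the other, is the signature the abstract describes.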

    Integration of auditory and visual communication information in the primate ventrolateral prefrontal cortex

    The integration of auditory and visual stimuli is crucial for recognizing objects, communicating effectively, and navigating through our complex world. Although the frontal lobes are involved in memory, communication, and language, there has been no evidence that the integration of communication information occurs at the single-cell level in the frontal lobes. Here, we show that neurons in the macaque ventrolateral prefrontal cortex (VLPFC) integrate audiovisual communication stimuli. The multisensory interactions included both enhancement and suppression of a predominantly auditory or a predominantly visual response, although multisensory suppression was the more common mode of response. The multisensory neurons were distributed across the VLPFC and within previously identified unimodal auditory and visual regions (O’Scalaidhe et al., 1997; Romanski and Goldman-Rakic, 2002). Thus, our study demonstrates, for the first time, that single prefrontal neurons integrate communication information from the auditory and visual domains, suggesting that these neurons are an important node in the cortical network responsible for communication.
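The enhancement/suppression contrast described above is commonly quantified with a multisensory index comparing the audiovisual response against the best unisensory response. The sketch below uses one common form of that index; the paper's exact measure may differ, and the firing rates are invented for illustration.

```python
def multisensory_index(audio, visual, av):
    """Percent change of the audiovisual response relative to the best
    unisensory response; positive = enhancement, negative = suppression.
    (A commonly used index; the paper's exact formula may differ.)"""
    best_uni = max(audio, visual)
    return 100.0 * (av - best_uni) / best_uni

# Illustrative firing rates (spikes/s) for one hypothetical VLPFC neuron
print(multisensory_index(12.0, 5.0, 18.0))  # enhancement: +50.0
print(multisensory_index(12.0, 5.0, 6.0))   # suppression: -50.0
```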

    Correlates of Auditory Decision-Making in Prefrontal, Auditory, and Basal Lateral Amygdala Cortical Areas

    Spatial selective listening and auditory choice underlie important processes, including attending to a speaker at a cocktail party and knowing how (or whether) to respond. To examine task encoding and the relative timing of potential neural substrates underlying these behaviors, we developed a spatial selective detection paradigm for monkeys, and recorded activity in primary auditory cortex (AC), dorsolateral prefrontal cortex (dlPFC), and the basolateral amygdala (BLA). A comparison of neural responses among these three areas showed that, as expected, AC encoded the side of the cue and target characteristics before dlPFC and BLA. Interestingly, AC also encoded the monkey's choice before dlPFC and around the time of BLA. Generally, BLA showed weak responses to all task features except the choice. Decoding analyses suggested that errors followed from a failure to encode the target stimulus in both AC and dlPFC, but again, these differences arose earlier in AC. The similarities between AC and dlPFC responses were abolished during passive sensory stimulation with identical trial conditions, suggesting that the robust sensory encoding in dlPFC is contextually gated. Thus, counter to a strictly PFC-driven decision process, in this spatial selective listening task, AC neural activity represents the sensory and decision information before dlPFC. Unlike in the visual domain, in this auditory task, the BLA does not appear to be robustly involved in selective spatial processing.

    SIGNIFICANCE STATEMENT: We examined neural correlates of an auditory spatial selective listening task by recording single-neuron activity in behaving monkeys from the amygdala, dorsolateral prefrontal cortex, and auditory cortex. We found that auditory cortex coded spatial cues and choice-related activity before dorsolateral prefrontal cortex or the amygdala. Auditory cortex also had robust delay-period activity. Therefore, we found that auditory cortex could support the neural computations that underlie the behavioral processes in the task.
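The relative-timing comparison in the abstract (choice information appearing in AC before dlPFC) can be sketched as a sliding-window decoding analysis. The data, onset times, and nearest-class-mean decoder below are all illustrative assumptions, not the study's actual recordings or methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trial-by-time responses for two areas; choice information
# (a mean shift between left/right trials) appears earlier in "AC" than "dlPFC".
n_trials, n_bins = 200, 40
choice = rng.integers(0, 2, n_trials)          # 0 = left, 1 = right
onset_bins = {"AC": 10, "dlPFC": 22}           # illustrative onset bins

def simulate(onset_bin):
    x = rng.standard_normal((n_trials, n_bins))
    x[:, onset_bin:] += 1.5 * choice[:, None]  # choice signal after onset
    return x

def decode_onset(x):
    """First time bin where a nearest-class-mean decoder (trained on the
    first half of trials, tested on the second) exceeds 70% accuracy."""
    half = n_trials // 2
    train, test = slice(0, half), slice(half, None)
    for t in range(n_bins):
        m0 = x[train, t][choice[train] == 0].mean()
        m1 = x[train, t][choice[train] == 1].mean()
        pred = (np.abs(x[test, t] - m1) < np.abs(x[test, t] - m0)).astype(int)
        if (pred == choice[test]).mean() > 0.7:
            return t
    return None

onsets = {area: decode_onset(simulate(b)) for area, b in onset_bins.items()}
print(onsets)  # "AC" onset earlier than "dlPFC" onset
```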

    Reinforcement learning in populations of spiking neurons

    Population coding is widely regarded as a key mechanism for achieving reliable behavioral responses in the face of neuronal variability. But in standard reinforcement learning, a flip side becomes apparent: learning slows down with increasing population size, since the global reinforcement becomes less and less related to the performance of any single neuron. We show that, in contrast, learning speeds up with increasing population size if feedback about the population response modulates synaptic plasticity in addition to global reinforcement. The two feedback signals (reinforcement and population-response signal) can be encoded by ambient neurotransmitter concentrations which vary slowly, yielding a fully online plasticity rule in which the learning of one stimulus is interleaved with the processing of the subsequent one. The assumption of a single additional feedback mechanism therefore reconciles biological plausibility with efficient learning.
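The credit-assignment dilution the abstract starts from (global reinforcement becoming less informative about any single neuron as the population grows) can be illustrated numerically. This toy simulation is a sketch of the problem, not of the paper's plasticity rule; the reward scheme is an invented stand-in.

```python
import numpy as np

rng = np.random.default_rng(2)

def reward_correlation(n_neurons, n_trials=20000):
    """Correlation between one neuron's stochastic output and the global
    reward (here: the population majority being 'correct'). Shows how the
    global signal dilutes as the population grows."""
    s = rng.integers(0, 2, (n_trials, n_neurons))   # each neuron fires with p = 0.5
    reward = (s.mean(axis=1) > 0.5).astype(float)   # reward if the majority fired
    return np.corrcoef(s[:, 0], reward)[0, 1]

small, large = reward_correlation(3), reward_correlation(101)
print(small, large)  # the neuron's output correlates far less with reward
                     # in the large population
```

This is why a second, population-response feedback signal helps: it restores per-neuron information that the scalar reward loses.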

    The statistical neuroanatomy of frontal networks in the macaque

    We were interested in gaining insight into the functional properties of frontal networks based upon their anatomical inputs. We took a neuroinformatics approach, carrying out maximum likelihood hierarchical cluster analysis on 25 frontal cortical areas based upon their anatomical connections, with 68 input areas representing exterosensory, chemosensory, motor, limbic, and other frontal inputs. The analysis revealed a set of statistically robust clusters. We used these clusters to divide the frontal areas into five groups: ventral-lateral, ventral-medial, dorsal-medial, dorsal-lateral, and caudal-orbital. Each of these groups was defined by a unique set of inputs. This organization provides insight into the differential roles of each group of areas and suggests a gradient whereby orbital and ventral-medial areas may be responsible for decision-making processes based on emotion and primary reinforcers, while lateral frontal areas are more involved in integrating affective and rational information into a common framework.
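The clustering approach described above can be sketched with off-the-shelf hierarchical clustering on a toy connectivity matrix. Note the substitutions: the paper uses maximum-likelihood hierarchical clustering, whereas this sketch uses average linkage on Hamming distance, and the area-by-input matrix below is synthetic with two planted groups.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(3)

# Toy stand-in for the paper's data: rows = frontal areas, columns = 68 input
# areas, entries = connection present/absent. Two planted groups of 5 areas
# each share a similar input profile (10% of entries flipped per area).
profile_a, profile_b = rng.random(68) < 0.3, rng.random(68) < 0.3
areas = np.array(
    [np.where(rng.random(68) < 0.1, ~p, p) for p in [profile_a] * 5 + [profile_b] * 5]
).astype(int)

# Average-linkage clustering on Hamming distance between input profiles
# (illustrative substitute for maximum-likelihood hierarchical clustering).
Z = linkage(pdist(areas, metric="hamming"), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # the two planted groups are recovered as two clusters
```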

    Stimulus-dependent maximum entropy models of neural population codes

    Neural populations encode information about their stimulus in a collective fashion, by joint activity patterns of spiking and silence. A full account of this mapping from stimulus to neural activity is given by the conditional probability distribution over neural codewords given the sensory input. To be able to infer a model for this distribution from large-scale neural recordings, we introduce a stimulus-dependent maximum entropy (SDME) model: a minimal extension of the canonical linear-nonlinear model of a single neuron to a pairwise-coupled neural population. The model is able to capture the single-cell response properties as well as the correlations in neural spiking due to shared stimulus and due to effective neuron-to-neuron connections. Here we show that in a population of 100 retinal ganglion cells in the salamander retina responding to temporal white-noise stimuli, dependencies between cells play an important encoding role. As a result, the SDME model gives a more accurate account of single cell responses and in particular outperforms uncoupled models in reproducing the distributions of codewords emitted in response to a stimulus. We show how the SDME model, in conjunction with static maximum entropy models of population vocabulary, can be used to estimate information-theoretic quantities like surprise and information transmission in a neural population.
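The pairwise maximum-entropy form underlying the SDME model can be written out by brute-force enumeration for a tiny population. In the SDME model the single-cell fields depend on the stimulus through a linear-nonlinear stage; in this minimal sketch they are fixed random numbers, so it illustrates only the model class, not the fitted model.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)

# Pairwise maximum-entropy (Ising-like) distribution over binary codewords
# for a tiny population; probabilities computed by exhaustive enumeration.
n = 4
h = rng.standard_normal(n)                 # single-cell fields
J = np.triu(rng.standard_normal((n, n)) * 0.2, 1)  # pairwise couplings

codewords = np.array(list(itertools.product([0, 1], repeat=n)))
energies = codewords @ h + np.einsum("ki,ij,kj->k", codewords, J, codewords)
probs = np.exp(energies)
probs /= probs.sum()                       # partition function normalizes

best = codewords[np.argmax(probs)]
print(probs.sum(), best)  # a valid distribution over the 2**n codewords
```

Brute force is only feasible for small n; fitting the real 100-cell model requires the approximate inference the paper describes.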