126 research outputs found

    Towards a Unified View on Pathways and Functions of Neural Recurrent Processing

    There are three neural feedback pathways to the primary visual cortex (V1): corticocortical, pulvinocortical, and cholinergic. What are the respective functions of these three projections? Possible functions range from contextual modulation of stimulus processing and feedback of high-level information to predictive processing (PP). How are these functions subserved by the different pathways, and can they be integrated into an overarching theoretical framework? We propose that corticocortical and pulvinocortical connections are involved in all three functions, whereas the role of cholinergic projections is limited by their slow response to stimuli. PP provides a broad explanatory framework under which stimulus-context modulation and high-level processing are subsumed, with multiple feedback pathways supplying mechanisms for inferring and interpreting what sensory inputs are about.

    Spike-based coupling between single neurons and populations across rat sensory cortices, perirhinal cortex, and hippocampus

    Cortical computations require coordination of neuronal activity within and across multiple areas. We characterized spiking relationships within and between areas by quantifying the coupling of single neurons to population firing patterns. Single-neuron population coupling (SNPC) was investigated using ensemble recordings from the hippocampal CA1 region and the somatosensory, visual, and perirhinal cortices. Within-area coupling was heterogeneous across structures, with area CA1 showing higher levels than the neocortical regions. In contrast to known anatomical connectivity, between-area coupling showed strong firing coherence of sensory neocortices with CA1, but less with perirhinal cortex. Cells in sensory neocortices and CA1 showed positive correlations between within- and between-area coupling; these correlations were weaker for perirhinal cortex. All four areas harbored broadcasting cells that coupled to multiple external areas, and this broadcasting was uncorrelated with within-area coupling strength. When examining correlations between SNPC and spatial coding, we found that, where such correlations were significant, they were negative. This result was consistent with an overall preservation of SNPC across different brain states, suggesting a strong dependence on intrinsic network connectivity. Overall, SNPC offers an important window onto cell-to-population synchronization in multi-area networks. Rather than pointing to specific information-coding functions, our results indicate a primary function of SNPC in dynamically organizing communication in systems composed of multiple, interconnected areas.
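    The abstract does not spell out how SNPC was computed, but population-coupling measures of this kind generally ask how strongly a neuron's binned spike counts track the summed activity of the rest of the recorded population. The sketch below illustrates that idea with a plain Pearson correlation; the function name, bin count, and simulated data are assumptions for illustration, and the paper's actual estimator (for example, a shuffle-corrected spike-triggered population rate) may differ. Between-area coupling can be sketched analogously by correlating each neuron with the population rate of a different area rather than its own.

    ```python
    import numpy as np

    def population_coupling(spike_counts):
        """Illustrative single-neuron population coupling (SNPC).

        spike_counts: array of shape (n_neurons, n_time_bins) of binned spikes.
        Returns one value per neuron: the Pearson correlation between that
        neuron's counts and the summed counts of all other neurons.
        """
        spike_counts = np.asarray(spike_counts, dtype=float)
        total = spike_counts.sum(axis=0)              # population activity per bin
        coupling = np.empty(spike_counts.shape[0])
        for i, counts in enumerate(spike_counts):
            pop_rate = total - counts                 # exclude the neuron itself
            coupling[i] = np.corrcoef(counts, pop_rate)[0, 1]
        return coupling

    # Toy usage: 50 neurons, 10,000 time bins of simulated Poisson spiking
    rng = np.random.default_rng(0)
    counts = rng.poisson(lam=0.2, size=(50, 10_000))
    print(population_coupling(counts).round(3))
    ```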

    Absence of direction-specific cross-modal visual–auditory adaptation in motion-onset event-related potentials

    Adaptation to visual or auditory motion affects within-modality motion processing, as reflected by visual or auditory free-field motion-onset evoked potentials (VEPs, AEPs). Here, a visual–auditory motion adaptation paradigm was used to investigate the effect of visual motion adaptation on VEPs and AEPs to leftward motion-onset test stimuli. Effects of visual adaptation to (i) scattered light flashes, (ii) motion in the same direction as the test stimulus, and (iii) motion in the opposite direction were compared. For the motion-onset VEPs, i.e. the intra-modal adaptation conditions, direction-specific adaptation was observed – the change-N2 (cN2) and change-P2 (cP2) amplitudes were significantly smaller after adaptation to motion in the same direction than in the opposite direction. For the motion-onset AEPs, i.e. the cross-modal adaptation condition, there was an effect of motion history only in the change-P1 (cP1), and this effect was not direction-specific – cP1 was smaller after scatter adaptation than after adaptation to motion in either direction. No effects were found for later components of the motion-onset AEPs. While the VEP results provided clear evidence for a direction-specific effect of motion adaptation within the visual modality, the AEP findings suggested merely a motion-related, but not a direction-specific, effect. In conclusion, the adaptation of veridical auditory motion detectors by visual motion is not reflected in the AEPs of the present study.
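    The abstract does not detail how the component amplitudes were extracted. A common approach, sketched below on simulated data and with assumed latency windows (none of these values are taken from the study), is to average epochs time-locked to motion onset and take the mean voltage within a fixed component window for each adaptation condition.

    ```python
    import numpy as np

    def component_amplitude(epochs, times, window):
        """Mean amplitude of an evoked-potential component.

        epochs: (n_trials, n_samples) single-electrode EEG epochs in microvolts,
                time-locked to motion onset.
        times:  (n_samples,) sample times in seconds.
        window: (t_start, t_end) latency window of the component of interest.
        """
        evoked = np.asarray(epochs).mean(axis=0)          # average across trials
        mask = (times >= window[0]) & (times <= window[1])
        return evoked[mask].mean()

    # Toy comparison of two adaptation conditions using noise-only epochs
    rng = np.random.default_rng(1)
    times = np.arange(-0.1, 0.5, 0.002)                   # 500 Hz sampling
    conditions = {
        "same direction": rng.normal(0.0, 5.0, (80, times.size)),
        "opposite direction": rng.normal(0.0, 5.0, (80, times.size)),
    }
    for name, epochs in conditions.items():
        amp = component_amplitude(epochs, times, (0.16, 0.20))  # assumed cN2-like window
        print(f"adaptation in {name}: mean amplitude = {amp:.2f} uV")
    ```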

    Emotional actions are coded via two mechanisms: with and without identity representation

    Accurate perception of an individual’s identity and emotion, derived from their actions and behavior, is essential for successful social functioning. Here we determined the role of identity in the representation of emotional whole-body actions using visual adaptation paradigms. Participants adapted to actors performing different whole-body actions in a happy or sad fashion. Following adaptation, subsequent neutral actions appeared to convey the opposite emotion. We demonstrate two different emotional action aftereffects with distinctive adaptation characteristics. For one short-lived aftereffect, adaptation to the emotion expressed by an individual biased the perception of the expression of emotion by other individuals, indicating an identity-independent representation of emotional actions. A second, longer-lasting aftereffect was observed in which adaptation to the emotion expressed by an individual produced longer-term biases in the perception of the expression of emotion only by the same individual, indicating an additional identity-dependent representation of emotional actions. Together, the presence of these two aftereffects indicates the existence of two mechanisms for coding emotional actions, only one of which takes into account the actor’s identity. The results we observe may parallel the processing of emotion from faces and voices.

    Threat modulates neural responses to looming visual stimuli

    Objects on a collision course with an observer produce a specific pattern of optical expansion on the retina known as looming, which in theory exactly specifies the time-to-collision (TTC) of the approaching object. We recently demonstrated that the affective content of looming stimuli influences perceived TTC, with threatening objects judged as approaching sooner than non-threatening objects. Here, we investigated the neural mechanisms by which perceived threat modulates spatiotemporal perception. Participants judged the TTC of threatening (snakes, spiders) or non-threatening (butterflies, rabbits) stimuli, which expanded in size at a rate indicating one of five TTCs. We analysed visual evoked potentials (VEPs) and oscillatory neural responses measured with electroencephalography (EEG). The arrival time of threatening stimuli was underestimated compared with that of non-threatening stimuli, although an interaction suggested that this underestimation was not constant across TTCs. Further, speed of approach and threat each modulated VEPs and oscillatory responses. Speed of approach modulated the parietal N1 and oscillations in the beta band. Threat modulated several VEP components (P1, frontal N1, occipital N1, EPN, and LPP) and oscillations in the alpha and high-gamma bands. The results for the high-gamma band suggest an interaction between these two factors. Previous evidence suggests that looming stimuli activate sensorimotor areas, even in the absence of an intended action. Our results show that threat disrupts the synchronization over the sensorimotor areas that are likely activated by the presentation of a looming stimulus.
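    The statement that looming "in theory exactly specifies" TTC refers to the classical optical tau relation: for an object approaching at constant speed, TTC is approximately its angular size divided by its rate of optical expansion. The sketch below illustrates that relation on simulated data; the function name and the approach scenario are assumptions for illustration, not details taken from the paper.

    ```python
    import numpy as np

    def tau_from_angular_size(theta, dt):
        """Estimate time-to-collision from optical expansion (the 'tau' relation).

        theta: array of angular sizes (radians) sampled every dt seconds.
        Returns TTC estimates (seconds): theta divided by its rate of change.
        """
        theta = np.asarray(theta, dtype=float)
        dtheta = np.gradient(theta, dt)            # rate of optical expansion
        return theta / dtheta

    # Toy example: a 0.2 m wide object starts 5 m away and approaches at 1 m/s.
    dt = 0.01
    t = np.arange(0.0, 4.0, dt)
    distance = 5.0 - 1.0 * t                       # stays positive over this interval
    theta = 2.0 * np.arctan(0.1 / distance)        # exact angular size over time
    ttc = tau_from_angular_size(theta, dt)
    print(f"tau estimate at t = 1 s: {ttc[100]:.2f} s (true remaining time 4.00 s)")
    ```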