
    Coding of shape from shading in area V4 of the macaque monkey

    Background: The shading of an object provides an important cue for recognition, especially for determining its 3D shape. However, the neuronal mechanisms that allow the recovery of 3D shape from shading are poorly understood. The aim of our study was to determine the neuronal basis of shape-from-shading coding in area V4 of the awake macaque monkey.
    Results: We recorded the responses of V4 cells to stimuli presented parafoveally while the monkeys fixated a central spot. The stimulus set comprised 8 different 3D shapes illuminated from 4 directions (from above, the left, the right and below), together with 2D controls for each stimulus. The results show that V4 neurons are broadly selective to 3D shape and to illumination direction, but without a preference for a single illumination direction. However, 3D-shape and illumination-direction selectivities are correlated, suggesting that V4 neurons can use the direction of illumination conveyed by the complex shading patterns on object surfaces. In addition, a large majority of V4 neurons (78%) responded differently to the 3D and 2D versions of the stimuli, although responses to 3D stimuli were not systematically stronger than those to the 2D controls. A hierarchical cluster analysis showed that the different classes of stimuli (3D, 2D controls) form clusters in the space of V4 population responses, suggesting a coding of 3D stimuli based on the population response. The different illumination directions also tend to cluster in this space.
    Conclusion: Together, these results show that area V4 participates, at the population level, in coding shape from the shading patterns produced by the illumination of corrugated object surfaces. V4 therefore provides important information for one of the steps of cortical processing of the 3D aspect of objects in a natural light environment.
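
    As a hedged illustration of the population analysis mentioned above (hierarchical clustering of stimulus classes in the space of V4 population responses), the sketch below clusters a matrix of trial-averaged responses. The stimulus labels, array shapes, distance metric and linkage method are assumptions made for the example, not the authors' actual pipeline.

        # Illustrative sketch (not the authors' code): hierarchical clustering of
        # stimuli in the space of V4 population responses. Assumes a matrix of
        # trial-averaged firing rates, one row per stimulus, one column per neuron.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(0)

        n_neurons = 60
        # Hypothetical stimulus set: 8 shapes x 4 illumination directions,
        # each in a "3D" and a "2D control" version.
        labels = [f"{cls}-shape{s}-illum{d}"
                  for cls in ("3D", "2D") for s in range(8) for d in range(4)]

        # Placeholder responses (stimuli x neurons); real data would be the
        # recorded V4 firing rates.
        responses = rng.gamma(shape=2.0, scale=5.0, size=(len(labels), n_neurons))

        # Distance between stimuli in population-response space, then
        # agglomerative (hierarchical) clustering with average linkage.
        dist = pdist(responses, metric="correlation")
        tree = linkage(dist, method="average")

        # Cut the tree into two clusters and see how well they separate
        # the 3D stimuli from the 2D controls.
        assignment = fcluster(tree, t=2, criterion="maxclust")
        for name, cluster in zip(labels, assignment):
            print(f"{name}: cluster {cluster}")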

    Les indices monoculaires de la perception tridimensionnelle : étude électrophysiologique de l'aire V4 du macaque vigile (Monocular cues to three-dimensional perception: an electrophysiological study of area V4 in the awake macaque)

    The shading and natural texture of an object provide important cues for its recognition, in particular for determining its 3D shape. The aim of this thesis is to determine the neuronal basis of the coding of these static monocular 3D cues in area V4 of the awake macaque monkey, using extracellular recordings in the awake animal. The first chapter shows that the natural texture of an object is encoded in visual area V4 on the basis of characteristics such as its luminance and heterogeneity. The second chapter addresses the coding of shape from shading, a powerful 3D cue, by analyzing neuronal responses to a set of 3D and 2D stimuli. The third chapter examines the response latencies of V4 neurons and reports latencies that are particularly short compared with those in the literature. Together, these results indicate that area V4 participates in the processing of the 3D shape of objects by encoding shading and natural texture as monocular sources of 3D information.
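
    The latency analysis in the third chapter could, for illustration, rely on a threshold rule like the one sketched below: latency taken as the first post-stimulus bin whose firing rate exceeds the baseline mean by a few standard deviations. The bin size, analysis window and threshold are assumptions, not the thesis's reported procedure.

        # Illustrative sketch (assumed method, not necessarily the thesis's):
        # estimate visual response latency as the first post-stimulus time bin
        # whose firing rate exceeds baseline mean + 3 SD.
        import numpy as np

        def response_latency(spike_times, n_trials, stim_onset=0.0,
                             baseline=(-0.2, 0.0), window=(0.0, 0.3),
                             bin_size=0.005, n_sd=3.0):
            """spike_times: spike times (s) pooled over trials, aligned so that
            the stimulus appears at `stim_onset`."""
            edges = np.arange(baseline[0], window[1] + bin_size, bin_size)
            counts, _ = np.histogram(spike_times, bins=edges)
            rate = counts / (n_trials * bin_size)        # firing rate per bin (sp/s)
            centers = edges[:-1] + bin_size / 2

            base = rate[(centers >= baseline[0]) & (centers < baseline[1])]
            threshold = base.mean() + n_sd * base.std()

            post = (centers >= stim_onset) & (centers < window[1])
            above = np.where(post & (rate > threshold))[0]
            if above.size == 0:
                return None                              # no detectable response
            return centers[above[0]] - stim_onset        # latency in seconds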

    Covert spatial selection in primate basal ganglia.

    The basal ganglia are important for action selection. They are also implicated in perceptual and cognitive functions that seem far removed from motor control. Here, we tested whether the role of the basal ganglia in selection extends to nonmotor aspects of behavior by recording neuronal activity in the caudate nucleus while animals performed a covert spatial attention task. We found that caudate neurons strongly select the spatial location of the relevant stimulus throughout the task, even in the absence of any overt action. This spatially selective activity depended on task and visual conditions and could be dissociated from goal-directed actions. Caudate activity was also sufficient to correctly identify every epoch in the covert attention task. These results provide a novel perspective on mechanisms of attention by demonstrating that the basal ganglia are involved in spatial selection and tracking of behavioral states even in the absence of overt orienting movements.
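
    As a rough sketch of the epoch-identification idea above (recovering the task epoch from caudate activity alone), the example below decodes epoch labels from spike counts with a cross-validated linear classifier. The epoch names, data shapes and classifier choice are assumptions, not the authors' reported analysis.

        # Illustrative sketch (assumed analysis, not the authors' code): decode the
        # epoch of a covert attention task from caudate spike counts.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)

        epochs = ["fixation", "cue", "delay", "target", "reward"]  # hypothetical epochs
        n_neurons, n_trials = 40, 200

        # Placeholder data: spike counts per (trial, epoch) and per neuron;
        # real data would be the recorded caudate responses.
        X = rng.poisson(lam=5.0, size=(n_trials * len(epochs), n_neurons)).astype(float)
        y = np.repeat(epochs, n_trials)

        # Multinomial logistic regression on z-scored counts, cross-validated.
        clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"epoch decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")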

    Attention-related modulation of caudate neurons depends on superior colliculus activity

    Recent work has implicated the primate basal ganglia in visual perception and attention, in addition to their traditional role in motor control. The basal ganglia, especially the caudate nucleus 'head' (CDh) of the striatum, receive indirect anatomical connections from the superior colliculus (SC), a midbrain structure that is known to play a crucial role in the control of visual attention. To test the possible functional relationship between these subcortical structures, we recorded CDh neuronal activity of macaque monkeys before and during unilateral SC inactivation in a spatial attention task. SC inactivation significantly altered the attention-related modulation of CDh neurons and strongly impaired the classification of task epochs based on CDh activity. Only inactivation of the SC on the same side of the brain as the recorded CDh neurons, not the opposite side, had these effects. These results demonstrate a novel interaction between SC activity and attention-related visual processing in the basal ganglia.

    Natural textures classification in area V4 of the macaque monkey.

    The natural texture of an object is an important cue for recognition. In real conditions, the incidence angle of light on natural textures produces a complex pattern of micro-shading that modifies the 3D rendering of surfaces. Little is known about the visual processing of material properties. The present work aims to study the coding of natural textures by neurons of area V4 of the awake macaque monkey. We used patches of natural textures drawn from the CURET database, illuminated from two or three different angles, together with their corresponding controls (scrambled Fourier phase). We recorded the responses of V4 neurons to stimuli flashed in their receptive fields (RFs) while the macaques performed a simple fixation task. We show that a large majority of V4 neurons responded to texture patches with a strong modulation across stimuli. The analysis of those responses indicates that V4 neurons integrate first- and second-order parameters of the image (mean luminance, SNR, and energy), which may be used to achieve texture clustering in a multidimensional space. This clustering was comparable to that obtained with a pyramid of Gabor filters and was not affected by illumination angle. Altogether, these results suggest that the V4 neuronal population acts as a set of filters able to classify textures independently of illumination angle. We conclude that area V4 contains mechanisms that are sensitive to the appearance of textured surfaces, even in an environment where illumination changes continuously.
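
    To make the image parameters named above concrete (mean luminance, SNR, energy), the sketch below computes one assumed definition of each per texture patch and then clusters the patches in that feature space. The definitions and the clustering step are illustrative, not the paper's exact pipeline.

        # Illustrative sketch (assumed definitions, not the paper's exact pipeline):
        # first- and second-order statistics of texture patches, then clustering
        # of the patches in that low-dimensional feature space.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def texture_features(patch):
            """patch: 2D array of pixel intensities in [0, 1]."""
            mean_lum = patch.mean()                  # first order: mean luminance
            snr = mean_lum / (patch.std() + 1e-9)    # assumed SNR definition
            energy = np.mean(patch ** 2)             # second order: mean energy
            return np.array([mean_lum, snr, energy])

        rng = np.random.default_rng(2)

        # Placeholder patches (real ones would be CURET textures, each rendered
        # under two or three illumination angles).
        patches = [rng.beta(a, b, size=(64, 64)) for a, b in
                   [(2, 5), (5, 2), (2, 2), (8, 3)] for _ in range(6)]

        features = np.stack([texture_features(p) for p in patches])

        # Agglomerative clustering of the patches in the 3-D feature space.
        tree = linkage(features, method="ward")
        clusters = fcluster(tree, t=4, criterion="maxclust")
        print(clusters)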

    Using the Blind Spot to Investigate Trans-Saccadic Perception

    We introduce a blind spot method to create image changes contingent on eye movements. One challenge of eye movement research is triggering display changes contingent on gaze. The eye-tracking system must capture an image of the eye, detect and track the pupil and corneal reflections to estimate gaze position, and then transfer these data to the computer that updates the display. All of these steps introduce delays that are often difficult to predict. To avoid these issues, we describe a simple blind spot method that generates gaze-contingent display manipulations without any eye-tracking system or display controls.
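
    A small sketch of the geometry such a blind spot method relies on: converting an approximate blind-spot location (roughly 15 deg temporal and 1.5 deg below fixation, about 5 deg across, textbook approximations rather than values from this paper) into screen pixels for a hypothetical monitor and viewing distance.

        # Illustrative sketch (approximate textbook values, not parameters from the
        # paper): locate the blind spot of the right eye on the screen so that a
        # display change placed there is invisible while the eye holds fixation.
        import math

        # Hypothetical display geometry.
        SCREEN_W_CM, SCREEN_W_PX = 52.0, 1920    # monitor width
        VIEW_DIST_CM = 57.0                      # viewing distance
        PX_PER_CM = SCREEN_W_PX / SCREEN_W_CM

        def deg_to_px(deg):
            """Visual angle (degrees) to pixels on screen at fixation distance."""
            cm = 2 * VIEW_DIST_CM * math.tan(math.radians(deg) / 2)
            return cm * PX_PER_CM

        # Approximate blind-spot center of the right eye, relative to fixation:
        # ~15 deg temporal (rightward) and ~1.5 deg below the horizontal meridian,
        # with a diameter of roughly 5 deg.
        blind_spot_x = deg_to_px(15.0)
        blind_spot_y = -deg_to_px(1.5)
        blind_spot_radius = deg_to_px(5.0) / 2

        print(f"blind spot center offset: ({blind_spot_x:.0f}, {blind_spot_y:.0f}) px,"
              f" radius ~{blind_spot_radius:.0f} px")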