
    Structural connectivity and functional properties of the macaque superior parietal lobule

    Despite the long-held belief that the macaque superior parietal lobule (SPL) is entirely occupied by Brodmann’s area 5, recent data show that macaque SPL also hosts a large cortical region with structural and functional features similar to those of Brodmann’s area 7. According to these data, the anterior part of SPL is occupied by a somatosensory-dominated cortical region that hosts three architecturally and functionally distinct areas (PE, PEci, PEip), while the caudal half of SPL is occupied by a bimodal somato-visual region that hosts four areas: PEc, MIP, PGm, and V6A. To date, the most studied areas of SPL are PE, PEc, and V6A. PE is essentially a high-order somatomotor area, while PEc and V6A are bimodal somatomotor–visuomotor areas, the former with predominant somatosensory input and the latter with predominant visual input. The functional properties of these areas and their anatomical connectivity strongly suggest their involvement in the control of limb movements. PE is suggested to be involved in the preparation/execution of limb movements, in particular movements of the upper limb; PEc in the control of movements of both upper and lower limbs, as well as in their interaction with the visual environment; and V6A in the control of reach-to-grasp movements performed with the upper limb. In humans, SPL is traditionally considered to have a different organization from that of macaques. Here, we review several lines of evidence suggesting that this is not the case, showing a similar structure for human and non-human primate SPLs.

    Interplay Between Grip and Vision in the Monkey Medial Parietal Lobe

    We aimed to understand the relative contribution of visual information and hand shaping to the neuronal activity of the medial posterior parietal area V6A, a newly added area in the monkey cortical grasping circuit. Two Macaca fascicularis performed a Reach-to-Grasp task in the dark and in the light, grasping objects of different shapes. We found that V6A contains Visual cells, activated only during grasping in the light; Motor neurons, equally activated during grasping in the dark and in the light; and Visuomotor cells, differently activated while grasping in the dark and in the light. Visual, Motor, and Visuomotor neurons were moderately or highly selective during grasping, whereas they reduced their selectivity during object observation without grasping. The use of the same experimental design employed in the dorsolateral grasping area AIP by other authors allowed us to compare the grasp-related properties of V6A and AIP. From these data and from the literature, a picture emerges with many similarities between the medial grasping area V6A and the lateral grasping area AIP: both areas update and control prehension, with V6A less sensitive than AIP to fine visual details of the objects to be grasped, but more involved in coordinating reaching and grasping.

    The posterior parietal area V6A: an attentionally-modulated visuomotor region involved in the control of reach-to-grasp action

    In the macaque, the posterior parietal area V6A is involved in the control of all phases of reach-to-grasp actions: the transport phase, given that reaching neurons are sensitive to the direction and amplitude of arm movement, and the grasping phase, since reaching neurons are also sensitive to wrist orientation and hand shaping. Reaching and grasping activity are corollary discharges which, together with the somatosensory and visual signals related to the same movement, allow V6A to act as a state estimator that signals discrepancies during the motor act in order to maintain consistency between the ongoing movement and the desired one. Area V6A is also able to encode the target of an action thanks to gaze-dependent visual neurons and real-position cells. Here, we advance the hypothesis that V6A also uses the spotlight of attention to guide goal-directed movements of the hand, and hosts a priority map that is specific for the guidance of reaching arm movements, combining bottom-up inputs such as visual responses with top-down signals such as reaching plans.
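The state-estimator idea invoked above can be made concrete with a toy forward-model loop. Everything here (the 1D state, the gain, the undershoot) is a hypothetical illustration of the general scheme, not a model of V6A's actual computation:

```python
# Toy 1D state estimator: an efference copy of the motor command predicts the
# next hand position; sensory feedback corrects the prediction, and the
# residual is the "discrepancy" signal described in the abstract.
# All numeric values are invented for illustration.

def estimate_step(prev_est, command, sensed, gain=0.3):
    predicted = prev_est + command        # forward model from efference copy
    discrepancy = sensed - predicted      # mismatch to be flagged / corrected
    return predicted + gain * discrepancy, discrepancy

est, pos = 0.0, 0.0
for _ in range(5):
    pos += 0.8                               # actual movement undershoots ...
    est, err = estimate_step(est, 1.0, pos)  # ... the intended 1.0 per step
print(round(err, 3))                         # persistent negative discrepancy
```

Because the actual movement keeps undershooting the command, the discrepancy stays negative throughout the loop; a controller reading this signal could use it to correct the ongoing movement.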

    The dorsal visual stream revisited: Stable circuits or dynamic pathways?

    In both the macaque and the human brain, information regarding visual motion flows from the extrastriate area V6 along two different paths: a dorsolateral one towards areas MT/V5, MST, and V3A, and a dorsomedial one towards the visuomotor areas of the superior parietal lobule (V6A, MIP, VIP). The dorsolateral visual stream is involved in many aspects of visual motion analysis, including the recognition of object motion and self motion. The dorsomedial stream uses visual motion information to continuously monitor the spatial location of objects while we are looking and/or moving around, to allow skilled reaching for and grasping of objects in structured, dynamically changing environments. Grasping activity is present in two areas of the dorsal stream, AIP and V6A. Area AIP is more involved than V6A in object recognition, whereas V6A is more involved in encoding vision for action. We suggest that V6A is involved in the fast control of prehension and plays a critical role in biomechanically selecting appropriate postures during reach-to-grasp behaviors. In everyday life, numerous functional networks, often involving the same cortical areas, are continuously in action in the dorsal visual stream, with each network dynamically activated or inhibited according to the context. The dorsolateral and dorsomedial streams represent only two examples of these networks. Many other streams have been described in the literature, but it is worth noting that the same cortical area, and even the same neurons within an area, are not specific for just one functional property, being part of networks that encode multiple functional aspects.
Our proposal is to conceive of the cortical streams not as fixed series of interconnected cortical areas, in which each area belongs univocally to one stream and is strictly involved in only one function, but as interconnected neuronal networks, often involving the same neurons, that are involved in a number of functional processes and whose activation changes dynamically according to the context.

    Perception meets action: fMRI and behavioural investigations of human tool use

    Tool use is essential to human life and culturally universal, common to hunter-gatherer and modern advanced societies alike. Although the neuroscience of simpler visuomotor behaviors like reaching and grasping has been studied extensively, relatively little is known about the brain mechanisms underlying learned tool use. With learned tool use, stored knowledge of object function and use supersedes requirements for action programming based on physical object properties. Contemporary models of tool use, based primarily on evidence from the study of brain-damaged individuals, implicate a set of specialized brain areas underlying the planning and control of learned actions with objects, distinct from areas devoted to more basic aspects of visuomotor control. The findings of the current thesis build on these existing theoretical models and provide new insights into the neural and behavioural mechanisms of learned tool use. In Project 1, I used fMRI to visualize brain activity in response to viewing tool use grasping. Grasping actions typical of how tools are normally grasped during use were found to preferentially activate occipitotemporal areas, including areas specialized for visual object recognition. The findings revealed sensitivity within this network to learned contextual associations tied to stored knowledge of tool-specific actions. The effects were seen to arise implicitly, in the absence of concurrent effects in visuomotor areas of parietofrontal cortex. These findings were taken to reflect the tuning of higher-order visual areas of occipitotemporal cortex to learned statistical regularities of the visual world, including the way in which tools are typically seen to be grasped and used. These areas are likely to represent an important source of input to visuomotor areas regarding learned conceptual knowledge of tool use. In Project 2, behavioural priming and the kinematics of real tool use grasping were explored.
Behavioural priming provides a window into the planning stages of actions. Participants grasped tools either to move them (grasp-to-move, GTM) or to demonstrate their common use (grasp-to-use, GTU), and grasping actions were preceded by a visual preview (prime) of either the same (congruent) or a different (incongruent) tool from the one then acted with. Behavioural priming was revealed as a reaction-time advantage for congruent trial types, thought to reflect the triggering of learned use-based motor plans by the viewing of tools at prime events. The findings from two separate experiments revealed differential sensitivity to priming according to task and task setting. When GTU and GTM tasks were presented separately, priming was specific to the GTU task. In contrast, when GTU and GTM tasks were presented in the same block of trials, in a mixed task setting, priming was evident for both tasks. Together, the findings indicate the importance of both task and task setting in shaping effects of action priming, likely driven by differences in the allocation of attentional resources. Differences in attention to particular object features, in this case tool identity, modulate the affordances driven by those features, which in turn determine priming. Beyond the physical properties of objects, knowledge and intention of use provide a mechanism by which affordances and the priming of actions may operate. Project 3 comprised a neuroimaging variant of the behavioural priming paradigm used in Project 2, with tools and tool use actions specially tailored for the fMRI environment. Preceding tool use with a visual preview of the tool to be used gave rise to reliable neural priming, measured as reduced BOLD activity. Neural priming of tool use was taken to reflect increased metabolic efficiency in the retrieval and implementation of stored tool use plans.
To demonstrate the specificity of priming for familiar tool use, a control task was used whereby actions with tools were determined not by tool identity but by arbitrarily learned associations with handle color. The findings revealed specificity for familiar tool-use priming in four distinct parietofrontal areas, including left inferior parietal cortex, previously implicated in the storage of learned tool use plans. The specificity of priming for tool-action and not color-action associations provides compelling evidence for tool-use-experience-dependent plasticity within parietofrontal areas.

    Decoding information for grasping from the macaque dorsomedial visual stream

    Neurodecoders have been developed by researchers mostly to control neuroprosthetic devices, but also to shed new light on neural functions. In this study, we show that signals representing grip configurations can be reliably decoded from neural data acquired from area V6A of the monkey medial posterior parietal cortex. Two Macaca fascicularis monkeys were trained to perform an instructed-delay reach-to-grasp task in the dark and in the light toward objects of different shapes. Population neural activity was extracted in various time intervals: during vision of the objects, the delay before movement, and grasp execution. This activity was used to train and validate a Bayes classifier for decoding objects and grip types. Recognition rates were well over chance level for all the epochs analyzed in this study. Furthermore, we detected slightly different decoding accuracies depending on the task's visual condition. Generalization analysis was performed by training and testing the system in different time intervals. This analysis demonstrated that a change of code occurred during the course of the task. Our classifier was able to discriminate grasp types well in advance of grasping onset. This feature might be important when timing is critical for sending signals to external devices before movement onset. Our results suggest that neural signals from the dorsomedial visual pathway can be a good substrate to feed neural prostheses for prehensile actions.
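The decoding approach described above can be sketched in a few lines: fit a Gaussian naive Bayes classifier to trial-by-neuron firing-rate matrices and measure held-out accuracy against chance. The firing-rate statistics below are invented stand-ins for population activity, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for population recordings: trial x neuron firing-rate
# matrices for two grip types, with grip-selective modulation of mean rates.
n_trials, n_neurons = 80, 20
base = rng.uniform(5, 20, n_neurons)   # baseline rate of each neuron (spikes/s)
mod = rng.normal(0, 4, n_neurons)      # grip-dependent modulation per neuron

X0 = rng.poisson(base, (n_trials, n_neurons)).astype(float)
X1 = rng.poisson(np.clip(base + mod, 0.1, None), (n_trials, n_neurons)).astype(float)

def fit_gaussian_nb(class_data):
    """Per-class mean/variance under a naive (independent-neuron) Gaussian model."""
    return [(X.mean(0), X.var(0) + 1e-6) for X in class_data]

def predict(params, x):
    """Pick the class with the highest Gaussian log-likelihood (equal priors)."""
    ll = [-0.5 * np.sum(np.log(2 * np.pi * v) + (x - m) ** 2 / v)
          for m, v in params]
    return int(np.argmax(ll))

# Train on the first half of the trials, test on the second half.
params = fit_gaussian_nb([X0[:40], X1[:40]])
test_set = [(x, 0) for x in X0[40:]] + [(x, 1) for x in X1[40:]]
acc = float(np.mean([predict(params, x) == y for x, y in test_set]))
print(acc)  # decoding accuracy on held-out trials; chance level is 0.5
```

The generalization analysis in the study corresponds to fitting `params` on activity from one epoch and evaluating on another; a drop in cross-epoch accuracy is what reveals a change of code over the course of the task.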

    Three-dimensional eye position signals shape both peripersonal space and arm movement activity in the medial posterior parietal cortex

    Research conducted over the last decades has established that the medial part of posterior parietal cortex (PPC) is crucial for controlling visually guided actions in human and non-human primates. Within this cortical sector there is area V6A, a crucial node of the parietofrontal network involved in arm movement control in both monkeys and humans. However, the encoding of action in depth by V6A cells had not been studied until recently. Recent neurophysiological studies show the existence in V6A neurons of signals related to the distance of targets from the eyes. These signals are integrated, often at the level of single cells, with information about the direction of gaze, thus encoding spatial location in 3D space. Moreover, 3D eye position signals seem to be further exploited at two additional levels of neural processing: (a) in determining whether targets are located in the peripersonal space or not, and (b) in shaping the spatial tuning of arm movement-related activity toward reachable targets. These findings are in line with studies of putative homolog regions in humans and together point to a role of medial PPC in encoding both the vergence angle of the eyes and peripersonal space. Besides its role in spatial encoding, including in depth, several findings demonstrate the involvement of this cortical sector in non-spatial processes.
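The link between vergence angle and peripersonal space follows from simple viewing geometry: the nearer the fixated target, the larger the vergence angle. A short sketch (the interpupillary distance is an assumed example value, and symmetric fixation is assumed):

```python
import math

def fixation_distance(vergence_deg, ipd_m=0.06):
    """Distance of a symmetrically fixated target from the eyes.

    vergence_deg: angle between the two lines of sight, in degrees.
    ipd_m: interpupillary distance in meters (0.06 m is an example value).
    """
    return ipd_m / (2 * math.tan(math.radians(vergence_deg) / 2))

# A large vergence angle implies a near, reachable (peripersonal) target;
# a small one implies a target well beyond reach.
near = fixation_distance(10.0)   # roughly a third of a meter: within reach
far = fixation_distance(1.0)     # several meters: extrapersonal space
```

In this geometric sense, a vergence signal alone is sufficient to classify a fixated target as inside or outside peripersonal space, which is consistent with the role proposed for medial PPC above.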

    Preparatory activity for purposeful arm movements in the dorsomedial parietal area V6A: Beyond the online guidance of movement

    Over the years, electrophysiological recordings in macaque monkeys performing visuomotor tasks have brought accumulating evidence for the expression of neuronal properties (e.g., selectivity in the visuospatial and somatosensory domains, encoding of visual affordances and motor cues) in the posterior parietal area V6A that characterize it as an ideal neural substrate for the online control of prehension. Interestingly, neuroimaging studies have suggested a role for putative human V6A in action preparation as well; moreover, pre-movement population activity in monkey V6A has recently been shown to convey grip-related information for upcoming grasping. Here we directly test whether macaque V6A neurons encode preparatory signals that effectively differentiate between dissimilar actions before movement. We recorded the activity of single V6A neurons during the execution of two visuomotor tasks requiring either reach-to-press or reach-to-grasp movements in different background conditions, and described the nature and temporal dynamics of V6A activity preceding movement execution. We found striking consistency between the neural discharges measured during pre-movement and movement epochs, suggesting that the former is preparatory activity tightly linked to the subsequent execution of particular motor actions. These findings strongly support a role for V6A beyond the online guidance of movement, with preparatory activity implementing suitable motor programs that subsequently support action execution.