
    Activation of cerebellum and basal ganglia during the observation and execution of manipulative actions

    Studies on action observation have mostly described the activation of a network of cortical areas, while less investigation has focused specifically on the activation and role of subcortical nodes. In the present fMRI study, we investigated the recruitment of the cerebellum and basal ganglia during the execution and observation of object manipulation performed with the right hand. The observation conditions consisted of: (a) observation of manipulative actions; (b) observation of sequences of random finger movements. In the execution conditions, participants had to perform the same actions or movements as in (a) and (b), respectively. A conjunction analysis showed significant shared activations during both observation and execution of manipulation in several subcortical structures, including: (1) cerebellar lobules V, VI, crus I, VIIIa, and VIIIb (bilaterally); (2) the globus pallidus (bilaterally) and the left subthalamic nucleus; (3) the red nucleus (bilaterally) and the left thalamus. These findings support the hypothesis that the action observation/execution network also involves subcortical structures, such as the cerebellum and basal ganglia, forming an integrated network. This points to possible mechanisms, involving these subcortical structures, that underlie the learning of new motor skills through action observation and imitation.
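    A minimal sketch of a minimum-statistic conjunction of the kind described above, assuming two t-maps (observation > baseline and execution > baseline) saved as NIfTI files; the file names and the voxel-wise threshold are illustrative assumptions, not values from the study.

```python
# Minimum-statistic conjunction sketch (hypothetical file names and threshold).
# A voxel survives only if its t-value exceeds the threshold in BOTH contrasts.
import numpy as np
import nibabel as nib

T_THRESH = 3.1  # assumed voxel-wise threshold

obs_img = nib.load("tmap_observation.nii.gz")  # t-map: observation > baseline
exe_img = nib.load("tmap_execution.nii.gz")    # t-map: execution > baseline

obs_t = obs_img.get_fdata()
exe_t = exe_img.get_fdata()

# Conjunction = minimum statistic over the two contrasts
conj = np.minimum(obs_t, exe_t)
mask = conj > T_THRESH

nib.save(nib.Nifti1Image(mask.astype(np.uint8), obs_img.affine),
         "conjunction_mask.nii.gz")
print(f"{int(mask.sum())} voxels shared by observation and execution")
```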

    Cortical Motor Organization, Mirror Neurons, and Embodied Language: An Evolutionary Perspective

    The recent conceptual achievement that the cortical motor system plays a crucial role not only in motor control but also in higher cognitive functions has given a new perspective on the involvement of the motor cortex in language perception and production as well. In particular, there is evidence that the matching mechanism based on mirror neurons can be involved in both phonological recognition and the retrieval of meaning, especially for action word categories, thus suggesting a contribution of an action–perception mechanism to the automatic comprehension of semantics. Furthermore, a comparison of the anatomo-functional properties of the frontal motor cortex among different primates and their communicative modalities indicates that the combination of voluntary control over the gestural communication system and over the vocal apparatus was the critical factor in the transition from gesture-based communication to a predominantly speech-based system. Finally, considering that the monkey and human premotor-parietal motor system, together with the prefrontal cortex, is involved in the sequential motor organization of actions and in the hierarchical combination of motor elements, we propose that elements of such motor organization have been exploited in other domains, including some aspects of the syntactic structure of language.

    Chronic neural probe for simultaneous recording of single-unit, multi-unit, and local field potential activity from multiple brain sites

    Drug-resistant focal epilepsy can be treated by resecting the epileptic focus, which requires precise localization of the focus using stereoelectroencephalography (SEEG) probes. As commercial SEEG probes offer only limited spatial resolution, probes with a higher channel count and greater design freedom, enabling the incorporation of both macro- and microelectrodes, would help increase spatial resolution and thus open new perspectives for investigating the mechanisms underlying focal epilepsy and its treatment. This work describes a new fabrication process for SEEG probes with materials and dimensions similar to those of clinical probes, enabling the recording of single-neuron activity at high spatial resolution. Polyimide is used as a biocompatible flexible substrate into which platinum electrodes and leads are... The resulting probe features match those of clinically approved devices. Tests in saline solution confirmed the probe's stability and functionality. Probes were implanted into the brain of one monkey (Macaca mulatta) trained to perform different motor tasks. Suitable configurations including up to 128 electrode sites allow the recording of task-related neuronal signals. Probes with 32 and 64 electrode sites were implanted in the posterior parietal cortex. Local field potentials and multi-unit activity were recorded as early as one hour after implantation. Stable single-unit activity was achieved for up to 26 days after implantation of a 64-channel probe. All recorded signals showed modulation during task execution. With the novel probes it is possible to record stable, biologically relevant data over a time span exceeding the time usually needed for epileptic focus localization in human patients. This is the first time that single units have been recorded along cylindrical polyimide probes chronically implanted 22 mm deep into the brain of a monkey, which suggests the potential usefulness of this probe for human applications.
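    As a side note on signal handling of this kind, a minimal sketch of how local field potentials and spike-band (multi-unit) activity are commonly separated from a broadband channel; the sampling rate and filter cutoffs below are conventional assumptions, not values reported in the abstract.

```python
# Band-splitting sketch for one broadband extracellular channel.
# Sampling rate and cutoffs are illustrative conventions, not study values.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 30_000.0  # assumed sampling rate (Hz)

def lfp_band(x, fs=FS, cutoff=300.0):
    """Low-pass filter extracting the local field potential."""
    sos = butter(4, cutoff, btype="low", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def spike_band(x, fs=FS, lo=300.0, hi=6000.0):
    """Band-pass filter isolating multi-/single-unit spiking activity."""
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

raw = np.random.randn(int(FS))  # 1 s of noise standing in for a recorded channel
lfp, mua = lfp_band(raw), spike_band(raw)
print(lfp.shape, mua.shape)
```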

    Neurons Controlling Voluntary Vocalization in the Macaque Ventral Premotor Cortex

    The voluntary control of phonation is a crucial achievement in the evolution of speech. In humans, the ventral premotor cortex (PMv) and Broca's area are known to be involved in voluntary phonation. In contrast, no neurophysiological data are available on the role of the oro-facial sector of the nonhuman primate PMv in this function. To address this issue, we recorded PMv neurons from two monkeys trained to emit coo-calls. Results showed that a population of motor neurons fires specifically during vocalization. About two thirds of them discharged before sound onset, while the remaining ones were time-locked to it. The response of vocalization-selective neurons was present only during conditioned (voluntary) but not spontaneous (emotional) sound emission. These data suggest that the control of vocal production exerted by PMv neurons constitutes a newly emerging property in the monkey lineage, shedding light on the evolution of phonation-based communication from a nonhuman primate species.

    Neuronal Chains for Actions in the Parietal Lobe: A Computational Model

    The inferior part of the parietal lobe (IPL) is known to play a very important role in sensorimotor integration. Neurons in this region code goal-related motor acts performed with the mouth, the hand, and the arm. It has been demonstrated that most IPL motor neurons coding a specific motor act (e.g., grasping) show markedly different activation patterns according to the final goal of the action sequence in which the act is embedded (grasping for eating or grasping for placing). Some of these neurons (parietal mirror neurons) show a similar selectivity during the observation of the same action sequences when executed by others. Thus, it appears that the neuronal response occurring during the execution and the observation of a specific grasping act codes not only the executed motor act but also the agent's final goal (intention).

    Decoding grip type and action goal during the observation of reaching-grasping actions: A multivariate fMRI study

    During the execution and observation of reaching-grasping actions, the brain must encode, at the same time, the final action goal and the type of grip necessary to achieve it. Recently, it has been proposed that the Mirror Neuron System (MNS) is involved not only in coding the final goal of the observed action, but also the type of grip used to grasp the object. However, the specific contribution of the different areas of the MNS, at both the cortical and subcortical level, to disentangling action goal and grip type is still unclear. Here, twenty human volunteers participated in an fMRI study in which they performed two tasks: (a) observation of four different types of actions, consisting of reaching to grasp a box handle with two possible grips (precision, hook) and two possible goals (open, close); (b) action execution, in which participants performed grasping actions similar to those presented during the observation task. A conjunction analysis revealed shared activated voxels for both action observation and execution within several cortical areas, including the dorsal and ventral premotor cortex, inferior and superior parietal cortex, intraparietal sulcus, primary somatosensory cortex, and cerebellar lobules VI and VIII. ROI analyses showed a main effect of grip type in several premotor and parietal areas and in cerebellar lobule VI, with higher BOLD activation during the observation of precision vs. hook actions. A grip × goal interaction was also present in the left inferior parietal cortex, with higher BOLD activity during precision-to-close actions. A multivariate pattern analysis (MVPA) revealed significant decoding accuracy for the grip model in all ROIs, whereas for the action goal model significant accuracy was observed only in the left inferior parietal cortex ROI. These findings indicate that a large network of cortical and cerebellar areas is involved in processing the type of grip, while the final action goal appears to be processed mainly within the inferior parietal region, suggesting a differential contribution of the areas activated in this study.
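    A minimal sketch of ROI-based MVPA decoding of the kind reported above, assuming trial-wise activation patterns have already been extracted for one ROI; the array shapes, classifier, and cross-validation scheme are illustrative assumptions rather than the study's actual pipeline.

```python
# ROI-based MVPA sketch: decode grip type (precision vs. hook) from
# trial-wise voxel patterns. Random data stand in for extracted betas.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200                   # assumed ROI size and trial count
X = rng.standard_normal((n_trials, n_voxels))  # trials x voxels
y_grip = rng.integers(0, 2, n_trials)          # 0 = precision, 1 = hook

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(clf, X, y_grip, cv=cv, scoring="accuracy")
print(f"grip decoding accuracy: {acc.mean():.2f} (chance = 0.50)")
```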

    Extending Feynman's Formalisms for Modelling Human Joint Action Coordination

    The recently developed Life-Space-Foam approach to goal-directed human action deals with individual actor dynamics. This paper applies the model to characterize the dynamics of co-action by two or more actors. This dynamics is modelled by: (i) a two-term joint action (including a cognitive/motivational potential and kinetic energy), and (ii) its associated adaptive path integral, representing an infinite-dimensional neural network. Its feedback adaptation loop has been derived from Bernstein's concept of the sensory-corrections loop in human motor control and Brooks' subsumption architectures in robotics. Potential applications of the proposed model in human-robot interaction research are discussed. Keywords: psychophysics, human joint action, path integrals. Comment: 6 pages, LaTeX.
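    A worked sketch of the kind of two-term action functional and path integral the abstract names, under assumed notation not taken from the paper: the x_i(t) are the actors' coordinates in the shared action space, Phi is the cognitive/motivational potential, and the transition amplitude sums over all co-action paths from the initial intention state A to the goal state B.

```latex
% Illustrative notation only; symbols are assumptions described in the lead-in.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  S[x] &= \int_{t_{\mathrm{ini}}}^{t_{\mathrm{fin}}}
          \Big( \tfrac{1}{2}\sum_{i} m_i\,\dot{x}_i^{2}(t)
                - \Phi\big(x_1(t),\dots,x_N(t)\big) \Big)\,dt \\
  % Transition amplitude as a sum over all co-action paths from A to B.
  % In the adaptive version, the measure \mathcal{D}[x] would additionally
  % carry feedback-updated weights, playing the role of synaptic weights in
  % an infinite-dimensional neural network.
  \langle B \mid A \rangle &= \int \mathcal{D}[x]\; e^{\,i\,S[x]}
\end{align}
\end{document}
```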

    Visual response of ventrolateral prefrontal neurons and their behavior-related modulation

    The ventral part of the lateral prefrontal cortex (VLPF) of the monkey receives strong visual input, mainly from the inferotemporal cortex. It has been shown that VLPF neurons can show visual responses during paradigms requiring the association of arbitrary visual cues with behavioral reactions. Further studies showed that there are also VLPF neurons responding to the presentation of specific visual stimuli, such as objects and faces. However, it is largely unknown whether VLPF neurons respond to and differentiate between stimuli belonging to different categories, even in the absence of a specific requirement to actively categorize these stimuli or to exploit them for choosing a given behavior. The first aim of the present study is to evaluate and map the responses of neurons of a large sector of the VLPF to a wide set of visual stimuli when monkeys simply observe them. Recent studies showed that visual responses to objects are also present in VLPF neurons coding action execution, when the objects are the target of the action. Thus, the second aim of the present study is to compare the visual responses of VLPF neurons when the same objects are simply observed or when they become the target of a grasping action. Our results indicate that: (1) part of the visually responsive VLPF neurons respond specifically to one stimulus or to a small set of stimuli, but there is no indication of a “passive” categorical coding; (2) VLPF neuronal visual responses to objects are often modulated by the task conditions in which the object is observed, with the strongest response when the object is the target of an action. These data indicate that the VLPF performs an early passive description of several types of visual stimuli, which can then be used for organizing and planning behavior. This could explain the modulation of visual responses both in associative learning and in natural behavior.

    How the Context Matters. Literal and Figurative Meaning in the Embodied Language Paradigm

    The involvement of the sensorimotor system in language understanding has been widely demonstrated. However, the role of context in these studies has only recently started to be addressed. Though words are bearers of a semantic potential, meaning is the product of a pragmatic process: it needs to be situated in a context to be disambiguated. The aim of this study was to test the hypothesis that the embodied simulation occurring during linguistic processing is contextually modulated, to the extent that the same sentence, depending on the context of utterance, leads to the activation of different effector-specific brain motor areas. To test this hypothesis, we asked subjects to give a motor response with the hand or the foot to the presentation of ambiguous idioms containing action-related words when these were preceded by context sentences. The results directly support our hypothesis only in relation to the comprehension of hand-related action sentences.