
    Dissociating Object Directed and Non-Object Directed Action in the Human Mirror System; Implications for Theories of Motor Simulation

    Mirror neurons are single cells found in macaque premotor and parietal cortices that are active during both action execution and action observation. In non-human primates, mirror neurons have only been found in relation to object-directed movements or communicative gestures, as non-object-directed actions of the upper limb are not well characterized in non-human primates. Mirror neurons provide important evidence for motor simulation theories of cognition, sometimes referred to as the direct matching hypothesis, which propose that observed actions are mapped onto associated motor schemata in a direct and automatic manner. This study, for the first time, directly compares mirror responses, defined as the overlap between action execution and observation, during object-directed and meaningless non-object-directed actions. We present functional MRI data that demonstrate a clear dissociation between object-directed and non-object-directed actions within the human mirror system. A premotor and parietal network was preferentially active during object-directed actions, whether observed or executed. Moreover, we report spatially correlated activity across multiple voxels for observation and execution of an object-directed action. In contrast to predictions made by motor simulation theory, no similar activity was observed for non-object-directed actions. These data demonstrate that object-directed and meaningless non-object-directed actions are subserved by different neuronal networks and that the human mirror response is significantly greater for object-directed actions. These data have important implications for understanding the human mirror system and for simulation theories of motor cognition. Subsequent theories of motor simulation must account for these differences, possibly by acknowledging the role of experience in modulating the mirror response.
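
    The across-voxel (spatial) correlation mentioned in this abstract can be pictured with a minimal sketch. It is purely illustrative and not the authors' analysis pipeline: the ROI, voxel count, and beta values are hypothetical synthetic data, and Python/SciPy are assumed for convenience.

```python
# Minimal sketch (illustrative only): an across-voxel correlation between
# observation and execution activation maps, one way a "mirror" overlap
# can be quantified. All data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-voxel beta estimates within a premotor ROI, one map per condition.
execute_betas = rng.normal(size=500)                        # execute object-directed action
observe_betas = 0.6 * execute_betas + rng.normal(size=500)  # observe the same action

# A reliable positive spatial correlation indicates shared voxel-level patterning.
r, p = stats.pearsonr(execute_betas, observe_betas)
print(f"across-voxel correlation: r = {r:.2f}, p = {p:.3g}")
```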

    Mirroring Intentional Forgetting in a Shared-Goal Learning Situation

    Background: Intentional forgetting refers to the surprising phenomenon that we can forget previously successfully encoded memories if we are instructed to do so. Here, we show that participants can not only intentionally forget episodic memories, but can also mirror the “forgetting performance” of an observed model. Methodology/Principal Findings: In four experiments a participant observed a model who took part in a memory experiment. In Experiments 1 and 2 observers saw a movie about the experiment, whereas in Experiments 3 and 4 the observers and the models took part together in a real laboratory experiment. The observed memory experiment was a directed forgetting experiment in which the models learned two lists of items and were instructed either to forget or to remember the first list. In Experiments 1 and 3 observers were instructed to simply observe the experiment (“simple observation” instruction). In Experiments 2 and 4, observers received instructions aimed at inducing the same learning goal for the observers and the models (“observation with goal-sharing” instruction). A directed forgetting effect (reliably lower recall of to-be-forgotten items) emerged only when observers received the “observation with goal-sharing” instruction (p < .001 in Experiment 2, and p < .05 in Experiment 4), and it was absent when observers received the “simple observation” instruction (p > .1 in Experiments 1 and 3). Conclusion: If people observe another person with the same intention to learn, and see that this person is instructed t
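
    As a rough illustration of the statistic behind the reported directed forgetting effect, the sketch below compares observers' List 1 recall between forget-cue and remember-cue conditions with an independent-samples t-test. The group sizes, recall proportions, and choice of test are assumptions made for the example; they are not taken from the paper.

```python
# Hedged sketch: a between-groups test of a directed forgetting effect
# (lower recall of to-be-forgotten List 1 items). Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical proportions of List 1 items recalled by observers in each condition.
recall_forget_cue = rng.normal(0.35, 0.10, size=20)    # model was cued to forget List 1
recall_remember_cue = rng.normal(0.55, 0.10, size=20)  # model was cued to remember List 1

t, p = stats.ttest_ind(recall_forget_cue, recall_remember_cue)
print(f"directed forgetting effect: t = {t:.2f}, p = {p:.4f}")
```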

    Processing of Hand-Related Verbs Specifically Affects the Planning and Execution of Arm Reaching Movements

    Even though a growing body of research has shown that the processing of action language affects the planning and execution of motor acts, several aspects of this interaction are still hotly debated. The directionality (i.e. does understanding action-related language induce facilitation or interference with the corresponding action?), the time course, and the nature of the interaction (i.e. under what conditions does the phenomenon occur?) are largely unclear. To further explore this topic we exploited a go/no-go paradigm in which healthy participants were required to perform arm reaching movements toward a target when verbs expressing either hand or foot actions were shown, and to refrain from moving when abstract verbs were presented. We found that reaction times (RTs) and error percentages increased when the verb involved the same effector used to give the response. This interference occurred very early, when the interval between verb presentation and the delivery of the go signal was 50 ms, and could be elicited until this delay was about 600 ms. In addition, RTs were faster when subjects used the right arm than when they used the left arm, suggesting that action–verb understanding is left-lateralized. Furthermore, when the color of the printed verb, and not its meaning, was the cue for movement execution, the differences in RTs and error percentages between verb categories disappeared, unequivocally indicating that the phenomenon occurs only when the semantic content of a verb has to be retrieved. These results are compatible with the theory of embodied language, which hypothesizes that comprehending verbal descriptions of actions relies on an internal simulation of the sensory–motor experience of the action, and they provide a new and detailed view of the interplay between action language and motor acts.
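
    The interference effect described above (slower, more error-prone arm responses after hand verbs than after foot verbs) lends itself to a simple paired comparison. The sketch below uses synthetic per-subject mean reaction times and a paired t-test; the sample size, RT values, and analysis choice are assumptions for illustration, not the authors' analysis.

```python
# Illustrative sketch: paired comparison of arm-reaching RTs after hand verbs
# (effector match) versus foot verbs (effector mismatch). Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subjects = 24

rt_foot_verbs = rng.normal(520, 40, n_subjects)                  # mismatch condition (ms)
rt_hand_verbs = rt_foot_verbs + rng.normal(25, 15, n_subjects)   # match condition: interference

t, p = stats.ttest_rel(rt_hand_verbs, rt_foot_verbs)
print(f"mean interference = {np.mean(rt_hand_verbs - rt_foot_verbs):.1f} ms, "
      f"t = {t:.2f}, p = {p:.4f}")
```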

    Motor imagery and action observation: cognitive tools for rehabilitation

    Rehabilitation, for a large part, may be seen as a learning process in which old skills have to be re-acquired and new ones have to be learned on the basis of practice. Active exercising creates a flow of sensory (afferent) information. It is known that motor recovery and motor learning have many aspects in common; both are largely based on response-produced sensory information. In the present article, it is asked whether active physical exercise is always necessary for creating this sensory flow. Numerous studies have indicated that motor imagery may result in the same plastic changes in the motor system as actual physical practice. Motor imagery is the mental execution of a movement without any overt movement and without any peripheral (muscle) activation. It has been shown that motor imagery leads to the activation of the same brain areas as actual movement. The present article discusses the role that motor imagery may play in neurological rehabilitation. Furthermore, it discusses to what extent the observation of a movement performed by another subject may play a similar role in learning. It is concluded that, although the clinical evidence is still meager, the use of motor imagery in neurological rehabilitation may be defended on theoretical grounds and on the basis of the results of experimental studies with healthy subjects.

    The Neural Basis of Cognitive Efficiency in Motor Skill Performance from Early Learning to Automatic Stages


    Neural dynamics of learning sound-action associations

    A motor component is a prerequisite to any communicative act, as one must inherently move to communicate. To learn to make a communicative act, the brain must be able to dynamically associate arbitrary percepts with the neural substrate underlying the prerequisite motor activity. We aimed to investigate whether brain regions involved in complex gestures (ventral premotor cortex, Brodmann Area 44) were involved in mediating associations between novel abstract auditory stimuli and novel gestural movements. In a functional magnetic resonance imaging (fMRI) study we asked participants to learn associations between previously unrelated novel sounds and meaningless gestures inside the scanner. We used functional connectivity analysis to eliminate the often-present confound of ‘strategic covert naming’ when dealing with BA44 and to rule out effects of non-specific reductions in signal. Brodmann Area 44, a region incorporating Broca's region, showed a strong, bilateral, negative correlation of BOLD (blood oxygen level dependent) response with learning of sound-action associations during data acquisition. The left inferior parietal lobule (l-IPL), bilateral loci in and around visual area V5, the right orbital frontal gyrus, the right hippocampus, the left parahippocampus, the right head of caudate, the right insula and the left lingual gyrus also showed decreases in BOLD response with learning. Concurrent with these decreases in BOLD response, a psychophysiological interaction (PPI) analysis revealed increasing connectivity between areas of the imaged network, as well as with the right middle frontal gyrus, as learning performance rose. The increasing connectivity therefore occurs within an increasingly energy-efficient network as learning proceeds. The strongest learning-related connectivity between regions was found when analysing BA44 and l-IPL seeds. The results clearly show that BA44 and the l-IPL are dynamically involved in linking gesture and sound and therefore provide evidence that one of the mechanisms required for the evolution of human communication is found within these motor regions.
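
    A psychophysiological interaction (PPI) regressor of the kind mentioned above is, in its simplest form, the product of a seed time series and a mean-centred task regressor, entered into a GLM alongside the main effects. The sketch below is a bare-bones illustration on synthetic data; the seed, target, regressor construction (e.g. without deconvolution), and scan count are assumptions, not the authors' pipeline.

```python
# Minimal PPI sketch (illustrative only): does connectivity between a seed and a
# target region change with the learning state? All signals here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_scans = 200

seed_ts = rng.normal(size=n_scans)            # hypothetical BA44 seed time series
task = np.repeat([0.0, 1.0], n_scans // 2)    # psychological regressor: early vs late learning
task_c = task - task.mean()                   # mean-centre the psychological variable
ppi = seed_ts * task_c                        # interaction (PPI) term

target_ts = rng.normal(size=n_scans)          # hypothetical l-IPL target time series

# GLM: intercept, seed main effect, task main effect, and the PPI term.
X = np.column_stack([np.ones(n_scans), seed_ts, task_c, ppi])
beta, *_ = np.linalg.lstsq(X, target_ts, rcond=None)
print(f"PPI beta (connectivity change with learning): {beta[3]:.3f}")
```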

    Applauding with Closed Hands: Neural Signature of Action-Sentence Compatibility Effects

    BACKGROUND: Behavioral studies have provided evidence for an action-sentence compatibility effect (ACE) that suggests a coupling of motor mechanisms and action-sentence comprehension. When both processes are concurrent, the action sentence primes the actual movement, and simultaneously, the action affects comprehension. The aim of the present study was to investigate brain markers of the bidirectional impact of language comprehension and motor processes. METHODOLOGY/PRINCIPAL FINDINGS: Participants listened to sentences describing an action that involved an open hand, a closed hand, or no manual action. Each participant was asked to press a button to indicate his/her understanding of the sentence. Each participant was assigned a hand-shape, either closed or open, which had to be used to activate the button. There were two groups (depending on the assigned hand-shape) and three categories (compatible, incompatible and neutral) defined according to the compatibility between the response and the sentence. ACEs were found in both groups. Brain markers of semantic processing exhibited an N400-like component around the Cz electrode position. This component distinguished between compatible and incompatible trials, with a greater negative deflection for incompatible trials. The motor response elicited a motor potential (MP) and a re-afferent potential (RAP), both of which were enhanced in the compatible condition. CONCLUSIONS/SIGNIFICANCE: The present findings provide the first cortical ACE measurements of semantic processing and the motor response. The N400-like effects suggest that incompatibility with motor processes interferes with sentence comprehension in a semantic fashion. Modulation of the motor potentials (MP and RAP) revealed a multimodal semantic facilitation of the motor response. Both results provide neural evidence of an action-sentence bidirectional relationship. Our results suggest that the ACE is not an epiphenomenal, post-sentence-comprehension process. Rather, motor-language integration occurring at verb onset supports a genuine and ongoing brain motor-language interaction.
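
    The N400-like comparison reported above can be pictured with a small averaging sketch: epoch the Cz signal around verb onset, average within condition, and take the incompatible-minus-compatible difference in a 300-500 ms window. The sampling rate, trial counts, window, and amplitudes below are assumed for illustration and do not reproduce the authors' EEG pipeline.

```python
# Illustrative ERP sketch: condition averages at Cz and an N400-window difference.
# All epochs are synthetic.
import numpy as np

rng = np.random.default_rng(4)
fs = 250                                   # assumed sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1 / fs)       # epoch from -200 to +800 ms around verb onset

compatible = rng.normal(0, 2, (80, times.size))         # trials x samples, in microvolts
incompatible = rng.normal(0, 2, (80, times.size))
incompatible[:, (times > 0.3) & (times < 0.5)] -= 1.5   # simulate a larger negativity

erp_comp = compatible.mean(axis=0)
erp_incomp = incompatible.mean(axis=0)

n400_win = (times > 0.3) & (times < 0.5)
diff = erp_incomp[n400_win].mean() - erp_comp[n400_win].mean()
print(f"N400-window difference (incompatible - compatible): {diff:.2f} µV")
```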

    Visuospatial Integration: Paleoanthropological and Archaeological Perspectives

    The visuospatial system integrates inner and outer functional processes, organizing spatial, temporal, and social interactions between the brain, body, and environment. These processes involve sensorimotor networks like the eye–hand circuit, which is especially important to primates, given their reliance on vision and touch as primary sensory modalities and the use of the hands in social and environmental interactions. At the same time, visuospatial cognition is intimately connected with memory, self-awareness, and simulation capacity. In the present article, we review issues associated with investigating visuospatial integration in extinct human groups through the use of anatomical and behavioral data gleaned from the paleontological and archaeological records. In modern humans, paleoneurological analyses have demonstrated noticeable and unique morphological changes in the parietal cortex, a region crucial to visuospatial management. Archaeological data provide information on hand–tool interaction, the spatial behavior of past populations, and their interaction with the environment. Visuospatial integration may represent a critical bridge between extended cognition, self-awareness, and social perception. As such, visuospatial functions are relevant to the hypothesis that human evolution is characterized by changes in brain–body–environment interactions and relations, which enhance integration between internal and external cognitive components through neural plasticity and the development of a specialized embodiment capacity. We therefore advocate the investigation of visuospatial functions in past populations through the paleoneurological study of anatomical elements and the archaeological analysis of visuospatial behaviors.