Interruption of visually perceived forward motion in depth evokes a cortical activation shift from spatial to intentional motor regions
Forward locomotion generates a radially expanding flow of visual motion which supports goal-directed walking. In stationary mode, wide-field visual presentation of optic flow stimuli evokes the illusion of forward self-motion. These effects illustrate an intimate relation between visual and motor processing. In the present fMRI study, we applied optic flow to identify distinct interfaces between circuitries implicated in vision and movement. The dorsal premotor cortex (PMd) was expected to contribute to wide-field forward motion flow (FFw), reflecting a pathway for externally triggered motor control. Medial prefrontal activation was expected to follow interrupted optic flow urging internally generated action. Data from 15 healthy subjects were analyzed with Statistical Parametric Mapping and confirmed this hypothesis. Right PMd activation was seen in FFw, together with activations of the posterior parietal cortex, ventral V5, and the right fusiform gyrus. Conjunction analysis of the transition from wide to narrow forward flow and reversed wide-field flow revealed selective dorsal medial prefrontal activation. These findings point to equivalent visuomotor transformations in locomotion and goal-directed hand movement, in which parietal-premotor circuitry is crucially implicated. Possible implications of an activation shift from spatial to intentional motor regions for understanding freezing of gait in Parkinson's disease are discussed: impaired medial prefrontal function in Parkinson's disease may reflect an insufficient internal motor drive when visual support from optic flow is reduced at the entrance of a narrow corridor. (C) 2010 Elsevier B.V. All rights reserved.
Multisensory mechanisms of body ownership and self-location
Having an accurate sense of the spatial boundaries of the body is a prerequisite for
interacting with the environment and is thus essential for the survival of any organism with
a central nervous system. Every second, our brain receives a staggering amount of
information from the body across different sensory channels, each of which features a
certain degree of noise. Despite the complexity of the incoming multisensory signals, the
brain manages to construct and maintain a stable representation of our own body and its
spatial relationships to the external environment. This natural “in-body” experience is such
a fundamental subjective feeling that most of us take it for granted. However, patients with
lesions in particular brain areas can experience profound disturbances in their normal sense
of ownership over their body (somatoparaphrenia) or lose the feeling of being located
inside their physical body (out-of-body experiences), suggesting that our “in-body” experience depends on intact neural circuitry in the temporal, frontal, and parietal brain regions.
The question at the heart of this thesis relates to how the brain combines visual, tactile, and
proprioceptive signals to build an internal representation of the bodily self in space.
Over the past two decades, perceptual body illusions have become an important tool for
studying the mechanisms underlying our sense of body ownership and self-location. The
most influential of these illusions is the rubber hand illusion, in which ownership of an
artificial limb is induced via the synchronous stroking of a rubber hand and an individual’s
hidden real hand. Studies of this illusion have shown that multisensory integration within
the peripersonal space is a key mechanism for bodily self-attribution. In Study I, we
showed that the default sense of ownership of one’s real hand, not just the sense of rubber
hand ownership, also depends on spatial and temporal multisensory congruence principles
implemented in fronto-parietal brain regions. In Studies II and III, we characterized two
novel perceptual illusions that provide strong support for the notion that multisensory
integration within the peripersonal space is intimately related to the sense of limb ownership, and we examined the role of vision in this process. In Study IV, we investigated a full-body version of the rubber hand illusion—the “out-of-body illusion”—and showed that it can
be used to induce predictable changes in one’s sense of self-location and body ownership.
Finally, in Study V, we used the out-of-body illusion to “perceptually teleport” participants
during brain imaging and identify activity patterns specific to the sense of self-location in a
given position in space. Together, these findings shed light on the role of multisensory
integration in building the experience of the bodily self in space and provide initial
evidence for how representations of body ownership and self-location interact in the brain.
Regional gray matter volumetric changes in autism associated with social and repetitive behavior symptoms.
Background: Although differences in brain anatomy in autism have been difficult to replicate using manual tracing methods, automated whole brain analyses have begun to find consistent differences in regions of the brain associated with the social cognitive processes that are often impaired in autism. We attempted to replicate these whole brain studies and to correlate regional volume changes with several autism symptom measures.
Methods: We performed MRI scans on 24 individuals diagnosed with DSM-IV autistic disorder and compared those to scans from 23 healthy comparison subjects matched on age. All participants were male. Whole brain, voxel-wise analyses of regional gray matter volume were conducted using voxel-based morphometry (VBM).
Results: Controlling for age and total gray matter volume, the volumes of the medial frontal gyri, left pre-central gyrus, right post-central gyrus, right fusiform gyrus, caudate nuclei, and the left hippocampus were larger in the autism group relative to controls. Regions exhibiting smaller volumes in the autism group were observed exclusively in the cerebellum. Significant partial correlations were found between the volumes of the caudate nuclei, multiple frontal and temporal regions, the cerebellum, and a measure of repetitive behaviors, controlling for total gray matter volume. Social and communication deficits in autism were also associated with caudate, cerebellar, and precuneus volumes, as well as with frontal and temporal lobe regional volumes.
Conclusion: Gray matter enlargement was observed in areas that have been functionally identified as important in social-cognitive processes, such as the medial frontal gyri, sensorimotor cortex, and middle temporal gyrus. Additionally, we have shown that VBM is sensitive to associations between social and repetitive behaviors and regional brain volumes in autism.
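The partial correlations reported in the Results can be sketched as correlating two variables after residualizing both on the covariate (here, total gray matter volume). This is a minimal illustration with synthetic data; the variable names (`caudate`, `score`) and effect sizes are invented and do not reflect the study's actual measures.

```python
import numpy as np

def partial_corr(x, y, covar):
    """Correlation between x and y after regressing out covar from both."""
    design = np.column_stack([np.ones(len(covar)), covar])
    # Residualize x and y on the covariate via ordinary least squares.
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic example: a regional volume and a symptom score that both
# scale with total gray matter, so the raw correlation is inflated
# relative to the partial correlation.
rng = np.random.default_rng(0)
total_gm = rng.normal(700.0, 50.0, 200)            # total gray matter (illustrative units)
caudate = 0.01 * total_gm + rng.normal(0, 0.3, 200)
score = 0.02 * total_gm + rng.normal(0, 0.6, 200)

raw = np.corrcoef(caudate, score)[0, 1]
partial = partial_corr(caudate, score, total_gm)
print(raw, partial)
```

Controlling for the shared covariate shrinks the correlation toward its true (here, near-zero) value, which is the same logic the abstract describes for separating regional effects from overall brain size.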
Electrophysiological responses to violations of expectation from eye gaze and arrow cues
Isolating processes within the brain that are specific to human behavior is a key goal for social neuroscience. The current research tested whether recent findings of enhanced negative ERPs in response to unexpected human gaze are unique to eye gaze stimuli by comparing the effects of gaze cues with the effects of an arrow cue. ERPs were recorded while participants (N=30) observed a virtual actor or an arrow that gazed (or pointed) either toward (object congruent) or away from (object incongruent) a flashing checkerboard. An enhanced negative ERP (N300) in response to object-incongruent compared to object-congruent trials was recorded for both eye gaze and arrow stimuli. The findings are interpreted as reflecting a domain-general mechanism for detecting unexpected events.
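The congruency contrast described above reduces to a difference wave: average the EEG epochs within each condition, then subtract the congruent average from the incongruent one and inspect the resulting negativity around 300 ms. The sketch below uses synthetic data; the amplitudes, latency, noise level, and sampling rate are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
times = np.arange(-100, 601)  # ms relative to cue onset, 1 kHz sampling (assumed)
n_trials = 80                 # illustrative trial count

def make_epochs(amplitude):
    # A Gaussian-shaped negativity peaking at 300 ms, plus trial-by-trial noise.
    erp = amplitude * np.exp(-((times - 300.0) ** 2) / (2 * 40.0 ** 2))
    return erp + rng.normal(0, 2.0, (n_trials, times.size))

# Condition averages: incongruent trials carry an enhanced negativity.
congruent = make_epochs(-1.0).mean(axis=0)
incongruent = make_epochs(-4.0).mean(axis=0)

# Difference wave: object incongruent minus object congruent.
diff = incongruent - congruent
window = (times >= 250) & (times <= 350)   # N300 analysis window
n300 = diff[window].mean()
print(n300)
```

The mean of the difference wave in the 250-350 ms window is clearly negative, while the pre-stimulus baseline hovers around zero, mirroring the enhanced N300 to incongruent trials reported in the abstract.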
Behavioral, Neural, and Computational Principles of Bodily Self-Consciousness
Recent work in human cognitive neuroscience has linked self-consciousness to the processing of multisensory bodily signals (bodily self-consciousness [BSC]) in fronto-parietal cortex and more posterior temporo-parietal regions. We highlight the behavioral, neurophysiological, neuroimaging, and computational laws that subtend BSC in humans and non-human primates. We propose that BSC includes body-centered perception (hand, face, and trunk), based on the integration of proprioceptive, vestibular, and visual bodily inputs, and involves spatio-temporal mechanisms integrating multisensory bodily stimuli within peripersonal space (PPS). We develop four major constraints of BSC (proprioception, body-related visual information, PPS, and embodiment) and argue that the fronto-parietal and temporo-parietal processing of trunk-centered multisensory signals in PPS is of particular relevance for theoretical models and simulations of BSC and eventually of self-consciousness.
Communicating emotions in expressive avatars
Avatars have become a fundamental part of collaborative virtual environments (CVEs). They are the visual embodiment of the user and are designed to address key issues in the interaction process between the user and the CVE. Giving avatars expressive abilities has been considered essential in human-computer interaction: having an avatar capable of expressing facial expressions as part of the computer interface increases human performance. Research has provided strong evidence that emotions can be effectively portrayed visually in avatars representing human users in collaborative virtual environments, for example through the manipulation of facial expressions, which are efficient carriers of emotion. However, avatars still exhibit only limited variation in their emotional expressions, which prevents them from becoming fully believable entities.