
    Spatiotemporal neural network dynamics for the processing of dynamic facial expressions.

    Elucidating the spatiotemporal dynamics of the neural network that processes facial expressions (Kyoto University press release, 2015-07-24). Dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of the underlying brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150-200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300-350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and the visual-motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions.
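
    As a rough illustration of the kind of time-window contrast described above, the Python sketch below simulates per-subject source amplitudes for one region and tests the faces-versus-mosaics difference in the 150-200 ms window. Everything here (subject count, sampling rate, effect sizes) is a synthetic assumption, not a value from the study.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    times = np.arange(-100, 500) / 1000.0      # epoch, -100..499 ms
    n_subjects = 20                            # assumed sample size

    def simulate(gain):
        """Per-subject ROI amplitude: a response peaking near 175 ms plus noise."""
        burst = gain * np.exp(-((times - 0.175) ** 2) / (2 * 0.02 ** 2))
        return burst + 0.1 * rng.standard_normal((n_subjects, times.size))

    faces = simulate(1.0)                      # stronger response to faces
    mosaics = simulate(0.6)                    # weaker response to mosaics

    # Average within 150-200 ms and contrast the conditions across subjects.
    win = (times >= 0.150) & (times < 0.200)
    t, p = stats.ttest_rel(faces[:, win].mean(axis=1),
                           mosaics[:, win].mean(axis=1))
    print(f"faces vs. mosaics, 150-200 ms: t({n_subjects - 1}) = {t:.2f}, p = {p:.3g}")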

    Inhibition-excitation balance in the parietal cortex modulates volitional control for auditory and visual multistability

    Perceptual organisation must select one interpretation from several alternatives to guide behaviour. Computational models suggest that this could be achieved through an interplay between inhibition and excitation across competing neural populations coding for each interpretation. Here, to test such models, we used magnetic resonance spectroscopy to measure non-invasively the concentrations of inhibitory γ-aminobutyric acid (GABA) and excitatory glutamate-glutamine (Glx) in several brain regions. Human participants first performed auditory and visual multistability tasks that produced spontaneous switching between percepts. We observed that longer percept durations during behaviour were associated with higher GABA/Glx ratios in the sensory area coding for each modality. When participants were asked to voluntarily modulate their perception, a common factor across modalities emerged: the GABA/Glx ratio in the posterior parietal cortex tended to be positively correlated with the amount of effective volitional control. Our results provide direct evidence that the balance between neural inhibition and excitation within sensory regions resolves perceptual competition. This powerful computational principle appears to be leveraged by both audition and vision, implemented independently across modalities but modulated by an integrated control process.

    Perceptual multistability describes an intriguing situation whereby an observer reports random changes in conscious perception for a physically unchanging stimulus 1,2. Multistability is a powerful tool with which to probe perceptual organisation, as it highlights perhaps the most fundamental issue faced by perception for any reasonably complex natural scene. Because the information encoded by sensory receptors is never sufficient to fully specify the state of the outside world 3, at each instant perception must choose between a number of competing alternatives. In realistic situations, this process produces a stable and useful representation of the world; in situations with intrinsically ambiguous information, the same process is revealed as multistable perception. A number of theoretical models have converged to pinpoint the generic computational principles likely to be required to explain multistability, and hence perceptual organisation 4-9. All of these models consider three core ingredients: inhibition between competing neural populations, adaptation within these populations, and neuronal noise. The precise role of each ingredient and their respective importance is still being debated: noise is introduced to induce fluctuations in each population and initiate stochastic perceptual switching in some models 7-9, whereas switching dynamics are solely determined by inhibition in others 5,6. Functional brain imaging in humans has provided results qualitatively compatible with these computational principles at several levels of the visual processing hierarchy 10.
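
    The three ingredients named above (inhibition, adaptation, noise) lend themselves to a compact simulation. The toy Python sketch below is not any of the cited models, and all parameter values are illustrative assumptions, but it shows how two mutually inhibiting populations with slow adaptation and noise produce stochastic alternation between percepts.

    import numpy as np

    rng = np.random.default_rng(1)
    dt, T = 1e-3, 60.0                  # 1 ms steps, 60 s simulated
    n = int(T / dt)
    tau, tau_a = 0.01, 2.0              # population / adaptation time constants (s)
    beta, phi = 3.0, 1.0                # inhibition and adaptation strengths
    I, sigma = 1.2, 0.15                # common input drive, noise level
    f = lambda x: np.maximum(x, 0.0)    # threshold-linear rate function

    r = np.array([0.9, 0.1])            # firing rates of the two populations
    a = np.zeros(2)                     # adaptation variables
    dominant = np.empty(n, dtype=int)

    for i in range(n):
        noise = sigma * rng.standard_normal(2) / np.sqrt(dt)
        drive = I - beta * r[::-1] - phi * a + noise   # cross-inhibition
        r += dt * (-r + f(drive)) / tau
        a += dt * (-a + r) / tau_a                     # slow adaptation
        dominant[i] = int(r[1] > r[0])

    # Percept durations are the run lengths of the dominant population.
    switches = np.flatnonzero(np.diff(dominant)) + 1
    durations = np.diff(np.r_[0, switches, n]) * dt
    print(f"{len(durations)} percepts, mean duration {durations.mean():.2f} s")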

    Neural networks for action representation: a functional magnetic-resonance imaging and dynamic causal modeling study

    Automatic mimicry is based on the tight linkage between motor and perceptual action representations, in which internal models play a key role. Based on their anatomical connection, we hypothesized that the direct effective connectivity from the posterior superior temporal sulcus (pSTS) to the ventral premotor area (PMv) forms an inverse internal model, converting a visual representation into a motor plan, and that the reverse connectivity forms a forward internal model, converting the motor plan into the sensory outcome of the action. To test this hypothesis, we employed dynamic causal-modeling analysis with functional magnetic-resonance imaging (fMRI). Twenty-four normal participants underwent a change-detection task involving two visually presented balls that were either manually rotated by the investigator's right hand (“Hand”) or automatically rotated. The effective connectivity from the pSTS to the PMv was enhanced by hand observation and suppressed by execution, corresponding to the inverse model. Opposite effects were observed from the PMv to the pSTS, suggesting the forward model. Additionally, both execution and hand observation commonly enhanced the effective connectivity from the pSTS to the inferior parietal lobule (IPL), from the IPL to the primary sensorimotor cortex (S/M1), from the PMv to the IPL, and from the PMv to the S/M1. The representation of the hand action was therefore implemented in the motor system, including the S/M1. During hand observation, effective connectivity toward the pSTS was suppressed whereas that toward the PMv and S/M1 was enhanced. Thus, the action-representation network acted as a dynamic feedback-control system during action observation.
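
    For readers unfamiliar with dynamic causal modeling, its state equation for fMRI is the bilinear form dx/dt = (A + sum_j u_j B_j) x + C u, where A holds baseline couplings, each B_j encodes how experimental input j modulates them, and C routes driving inputs. The Python sketch below illustrates that structure with made-up couplings loosely patterned on the connections named above; none of the values are the study's estimates.

    import numpy as np

    regions = ["pSTS", "PMv", "IPL", "S/M1"]
    nr = len(regions)

    A = -0.5 * np.eye(nr)          # intrinsic (self-decay) coupling
    A[1, 0] = 0.3                  # baseline pSTS -> PMv
    A[0, 1] = 0.3                  # baseline PMv -> pSTS
    A[2, 0] = A[2, 1] = 0.2        # pSTS -> IPL, PMv -> IPL
    A[3, 2] = A[3, 1] = 0.2        # IPL -> S/M1, PMv -> S/M1

    B_obs = np.zeros((nr, nr))     # modulation by the "hand observation" input
    B_obs[1, 0] = +0.4             # observation enhances pSTS -> PMv (inverse model)
    B_obs[0, 1] = -0.4             # ...and suppresses PMv -> pSTS (forward model)

    c = np.array([1.0, 0, 0, 0])   # visual driving input enters at pSTS

    def step(x, u_obs, dt=0.1):
        """One Euler step of dx/dt = (A + u_obs * B_obs) x + c * u_obs."""
        return x + dt * ((A + u_obs * B_obs) @ x + c * u_obs)

    x = np.zeros(nr)
    for _ in range(100):           # 10 s of sustained observation input
        x = step(x, u_obs=1.0)
    print(dict(zip(regions, np.round(x, 3))))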

    Rapid Amygdala Gamma Oscillations in Response to Eye Gaze

    Background: The eye gaze of other individuals conveys important social information and can trigger multiple psychological activities, some of which, such as emotional reactions and attention orienting, occur very rapidly. Although some neuroscientific evidence has suggested that the amygdala may be involved in such rapid gaze processing, no evidence has been reported concerning the speed at which the amygdala responds to eye gaze. Methodology/Principal Findings: To investigate this issue, we recorded electrical activity within the amygdalae of six subjects using intracranial electrodes. Subjects observed images of eyes and mosaics pointing in averted and straight directions. The amygdala showed higher gamma-band oscillations for eye gaze than for mosaics, peaking at 200 ms regardless of the direction of the gaze. Conclusion: These results indicate that the human amygdala rapidly processes eye gaze.
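
    A standard way to quantify such gamma-band responses is to band-pass filter the intracranial trace and take the squared Hilbert envelope. The Python sketch below does this on a synthetic trace containing a 50 Hz burst near 200 ms; the band edges (30-80 Hz) and all signal parameters are assumptions for illustration, not the study's analysis settings.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 1000                                    # sampling rate (Hz), assumed
    t = np.arange(-0.2, 0.6, 1 / fs)             # epoch around stimulus onset
    rng = np.random.default_rng(2)

    # Synthetic trial: background noise plus a gamma burst near 200 ms.
    burst = np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2)) * np.sin(2 * np.pi * 50 * t)
    lfp = 0.5 * rng.standard_normal(t.size) + 2.0 * burst

    b, a = butter(4, [30, 80], btype="bandpass", fs=fs)
    gamma = filtfilt(b, a, lfp)                  # zero-phase band-pass filter
    power = np.abs(hilbert(gamma)) ** 2          # instantaneous gamma power

    print(f"gamma power peaks at {1000 * t[np.argmax(power)]:.0f} ms after onset")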

    Widespread and lateralized social brain activity for processing dynamic facial expressions

    Dynamic facial expressions of emotion constitute a natural and powerful means of social communication in daily life. A number of previous neuroimaging studies have explored the neural mechanisms underlying the processing of dynamic facial expressions and indicated the activation of certain social brain regions (e.g., the amygdala) during such tasks. However, the activated brain regions were inconsistent across studies, and their laterality was rarely evaluated. To investigate these issues, we measured brain activity using functional magnetic resonance imaging in a relatively large sample (n = 51) during the observation of dynamic facial expressions of anger and happiness and their corresponding dynamic mosaic images. The observation of dynamic facial expressions, compared with dynamic mosaics, elicited stronger activity in the bilateral posterior cortices, including the inferior occipital gyri, fusiform gyri, and superior temporal sulci. The dynamic facial expressions also activated bilateral limbic regions, including the amygdalae and ventromedial prefrontal cortices, more strongly than the mosaics. Similar activation was found in the right inferior frontal gyrus (IFG) and left cerebellum. Laterality analyses comparing original and flipped images revealed right hemispheric dominance in the superior temporal sulcus and IFG and left hemispheric dominance in the cerebellum. These results indicate that the neural mechanisms underlying the processing of dynamic facial expressions include widespread social brain regions associated with perceptual, emotional, and motor functions, and involve a clearly lateralized (right cortical and left cerebellar) network like that involved in language processing.
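
    The flipped-image laterality test can be pictured as comparing each voxel's response with that of its left-right mirrored counterpart across subjects. The numpy sketch below illustrates the idea on toy statistical maps; real analyses would operate on spatially normalized NIfTI volumes, and all shapes and values here are made up.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_subjects, nx, ny, nz = 51, 16, 16, 16

    # Per-subject contrast maps (faces > mosaics) with a right-lateralized blob.
    maps = 0.5 * rng.standard_normal((n_subjects, nx, ny, nz))
    maps[:, 12:15, 8:11, 8:11] += 1.0            # simulated right-hemisphere effect

    flipped = maps[:, ::-1, :, :]                # mirror the left-right axis
    t, _ = stats.ttest_rel(maps.reshape(n_subjects, -1),
                           flipped.reshape(n_subjects, -1))

    # Voxels where the original exceeds the flipped map are right-dominant.
    print(f"max laterality t-value: {np.nanmax(t):.2f}")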

    Direction of Amygdala–Neocortex Interaction During Dynamic Facial Expression Processing

    In facial expression communication, influence flows from emotion to cognition (Kyoto University press release, 2017-03-02). Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies have revealed that some subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions, and their functional interaction, are involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from two studies and magnetoencephalography (MEG) data from one study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model with effective connectivity from the amygdala to the neocortex. A further analysis of the MEG data with increasing time windows demonstrated that this model was valid from 200 ms after stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion.
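
    Comparing forward, backward, and bidirectional models in DCM comes down to comparing their (approximate) log model evidences. The Python sketch below shows a fixed-effects comparison with a uniform model prior; the log-evidence numbers are invented for illustration, since in practice they come from model inversion (e.g., in SPM).

    import numpy as np

    log_evidence = {
        "forward (amygdala -> neocortex)":  -1200.0,   # invented values
        "backward (neocortex -> amygdala)": -1210.0,
        "bidirectional":                    -1205.0,
    }

    le = np.array(list(log_evidence.values()))
    post = np.exp(le - le.max())        # softmax of log-evidences gives
    post /= post.sum()                  # posterior model probabilities

    for name, p in zip(log_evidence, post):
        print(f"{name:36s} p(m|y) = {p:.4f}")
    # A log-evidence difference > 3 (Bayes factor > ~20) is the conventional
    # threshold for "strong" evidence in favor of one model.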

    Functional heterogeneity in the left lateral posterior parietal cortex during visual and haptic crossmodal dot-surface matching

    Background: Vision and touch are thought to contribute information to object perception in an independent but complementary manner. The left lateral posterior parietal cortex (LPPC) has long been associated with multisensory information processing, and it plays an important role in visual and haptic crossmodal information retrieval. However, it remains unclear how LPPC subregions are involved in visuo-haptic crossmodal retrieval processing. Methods: In the present study, we used an fMRI experiment with a crossmodal delayed match-to-sample paradigm to reveal the functional roles of LPPC subregions in unimodal and crossmodal dot-surface retrieval. Results: The visual-to-haptic condition enhanced the activity of the left inferior parietal lobule relative to the haptic unimodal condition, whereas the inverse condition enhanced the activity of the left superior parietal lobule. By contrast, activation of the left intraparietal sulcus did not differ significantly between the crossmodal and unimodal conditions. Seed-based resting-state connectivity analysis revealed that these three left LPPC subregions engage distinct networks, confirming their different functions in crossmodal retrieval processing. Conclusion: Taken together, the findings suggest that the functional heterogeneity of the left LPPC during visuo-haptic crossmodal dot-surface retrieval reflects that the left LPPC does not simply contribute to the retrieval of past information; rather, each subregion has a specific functional role in resolving different task requirements.
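
    Seed-based resting-state connectivity, as used above, amounts to correlating a seed region's mean time course with every other voxel's time course. The numpy sketch below illustrates this on synthetic data standing in for preprocessed resting-state fMRI; the volume count and voxel numbers are arbitrary assumptions.

    import numpy as np

    rng = np.random.default_rng(4)
    n_vols, n_voxels = 240, 5000                 # assumed scan length and mask size
    data = rng.standard_normal((n_vols, n_voxels))

    # Make the first 50 voxels (the "seed" region) share a common signal.
    seed_signal = rng.standard_normal(n_vols)
    data[:, :50] += 0.8 * seed_signal[:, None]

    seed = data[:, :50].mean(axis=1)             # mean seed time course

    # Pearson correlation between the seed and every voxel, vectorized.
    z = (data - data.mean(0)) / data.std(0)
    zs = (seed - seed.mean()) / seed.std()
    r_map = zs @ z / n_vols

    print(f"mean r in seed block: {r_map[:50].mean():.2f}, max elsewhere: {r_map[50:].max():.2f}")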

    Amygdala activity in response to forward versus backward dynamic facial expressions.

    Observations of dynamic facial expressions of emotion activate several brain regions, but the psychological functions of these regions remain unknown. To investigate this issue, we presented dynamic facial expressions of fear and happiness forwards or backwards, thus altering the emotional meaning of the facial expression while maintaining comparable visual properties. Thirteen subjects passively viewed the stimuli while being scanned using fMRI. After image acquisition, the subjects' emotions while perceiving the stimuli were investigated using valence and intensity scales. The left amygdala showed higher activity in response to forward compared with backward presentations, for both fearful and happy expressions. Amygdala activity showed a positive relationship with the intensity of the emotion experienced. These results suggest that the amygdala is involved not in the visual but in the emotional processing of dynamic facial expressions, including specifically the elicitation of subjective emotions.
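
    The reported brain-behavior relationship is, at heart, a correlation between per-subject amygdala contrast estimates and intensity ratings. The Python sketch below runs that test on simulated numbers (only the sample size of 13 is taken from the abstract).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 13                                               # subjects, per the abstract
    intensity = rng.uniform(1, 7, n)                     # post-scan intensity ratings
    amygdala = 0.4 * intensity + rng.standard_normal(n)  # simulated contrast estimates

    r, p = stats.pearsonr(intensity, amygdala)
    print(f"intensity vs. amygdala response: r({n - 2}) = {r:.2f}, p = {p:.3f}")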

    Commonalities and differences in the spatiotemporal neural dynamics associated with automatic attentional shifts induced by gaze and arrows.

    Gaze and arrows automatically trigger attentional shifts. Neuroimaging studies have identified a commonality in the spatial distribution of the neural activation involved in such attentional shifts. However, it remains unknown whether these activations occur with common temporal profiles. To investigate this issue, magnetoencephalography (MEG) was used to evaluate the neural activation involved in attentional shifts induced by gaze and arrows. MEG source-reconstruction analyses revealed that the superior temporal sulcus and the inferior frontal gyrus were commonly activated after 200 ms in response to directional versus non-directional cues. Regression analyses further revealed that the magnitude of brain activity in these areas and in the bilateral occipital cortex was positively related to the effect of attentional shift on reaction times under both the gaze and the arrow conditions. The results also revealed that some brain regions were activated specifically in response to directional versus non-directional gaze or arrow cues in the 350-400 ms time window. These results suggest that the neural mechanisms underlying attentional shifts induced by gaze and arrows share commonalities in their spatial distributions and temporal profiles, with some spatial differences at later time stages.
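
    The regression analyses mentioned above can be sketched as follows: each subject contributes a cueing effect on reaction times (invalid minus valid trials) and an ROI amplitude, and one asks whether the two covary. All numbers below are simulated assumptions, including the subject count.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    n = 20                                         # subjects, assumed
    rt_valid = rng.normal(420, 25, n)              # mean RT on validly cued trials (ms)
    rt_invalid = rt_valid + rng.normal(25, 10, n)  # invalid trials are slower
    cueing_effect = rt_invalid - rt_valid

    # Simulated ROI amplitude loosely coupled to the behavioral effect.
    sts_activity = 0.02 * cueing_effect + rng.normal(0, 0.3, n)

    slope, intercept, r, p, se = stats.linregress(cueing_effect, sts_activity)
    print(f"cueing effect predicts STS activity: r = {r:.2f}, p = {p:.3f}")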