10 research outputs found

    The spatial effect of fearful faces in the autonomic response

    Get PDF
    Peripersonal space (PPS) corresponds to the space around the body and it is defined by the location in space where multimodal inputs from bodily and external stimuli are integrated. Its extent varies according to the characteristics of external stimuli, e.g. the salience of an emotional facial expression. In the present study, we investigated the psycho-physiological correlates of the extension phenomenon. Specifically, we investigated whether an approaching human face showing either an emotionally negative (fearful) or positive (joyful) facial expression would differentially modulate PPS representation, compared to the same face with a neutral expression. To this aim, we continuously recorded the skin conductance response (SCR) of 27 healthy participants while they watched approaching 3D avatar faces showing fearful, joyful or neutral expressions, and then pressed a button to respond to tactile stimuli delivered on their cheeks at three possible delays (visuo-tactile trials). The results revealed that the SCR to fearful faces, but not joyful or neutral faces, was modulated by the apparent distance from the participant's body. SCR increased from very far space to far and then to near space. We propose that the proximity of the fearful face provided a cue to the presence of a threat in the environment and elicited a robust and urgent organization of defensive responses. In contrast, there would be no need to organize defensive responses to joyful or neutral faces and, as a consequence, no SCR differences were found across spatial positions. These results confirm the defensive function of PPS

    【02】Forum on Children with Foreign Roots 2011, Part 2: Considering effective handbooks for teachers and supporters ~ on 『教員必携 外国につながる子どもの教育 Q&A・翻訳資料』 ~

    Get PDF
    Alexithymia is a personality trait involving deficits in emotional processing. The personality construct has been extensively validated, but the underlying neural and physiological systems remain controversial. One theory suggests that low-level somatosensory mechanisms act as somatic markers of emotion, underpinning cognitive and affective impairments in alexithymia. In two separate samples (total N = 100), we used an established Quantitative Sensory Testing (QST) battery to probe multiple neurophysiological submodalities of somatosensation, and investigated their associations with the widely-used Toronto Alexithymia Scale (TAS-20). Experiment one found reduced sensitivity to warmth in people with higher alexithymia scores, compared to individuals with lower scores, without deficits in other somatosensory submodalities. Experiment two replicated this result in a new group of participants using a full-sample correlation between threshold for warm detection and TAS-20 scores. We discuss the relations between low-level thermoceptive function and cognitive processing of emotion

    Unseen fearful faces facilitate visual discrimination in the intact field

    Get PDF
    Implicit visual processing of emotional stimuli has been widely investigated since the classical studies on affective blindsight, in which patients with primary visual cortex lesions showed discriminatory abilities for unseen emotional stimuli in the absence of awareness. In addition, more recent evidence from hemianopic patients showed response facilitation and enhanced early visual encoding of seen faces, only when fearful faces were presented concurrently in the blind field. However, it is still unclear whether unseen fearful faces specifically facilitate visual processing of facial stimuli, or whether the facilitatory effect constitutes an adaptive mechanism prioritizing the visual analysis of any stimulus. To test this question, we tested a group of hemianopic patients who perform at chance in forced-choice discrimination tasks of stimuli in the blind field. Patients performed a go/no-go task in which they were asked to discriminate simple visual stimuli (Gabor patches) presented in their intact field, while fearful, happy and neutral faces were concurrently presented in the blind field. The results showed a reduction in response times to the Gabor patches presented in the intact field, when fearful faces were concurrently presented in the blind field, but only in patients with left hemispheric lesions. No facilitatory effect was observed in patients with right hemispheric lesions. These results suggest that unseen fearful faces are implicitly processed and can facilitate the visual analysis of simple visual stimuli presented in the intact field. This effect might be subserved by activity in the spared colliculo-amygdala-extrastriate pathway that promotes efficient visual analysis of the environment and rapid execution of defensive responses. Such a facilitation is observed only in patients with left lesions, favouring the hypothesis that the right hemisphere mediates implicit visual processing of fear signals

    Compensatory recovery after multisensory stimulation in hemianopic patients: Behavioral and neurophysiological components

    Get PDF
    Lateralized post-chiasmatic lesions of the primary visual pathway result in loss of visual perception in the field retinotopically corresponding to the damaged cortical area. However, patients with visual field defects have shown enhanced detection and localization of multisensory audio-visual pairs presented in the blind field. This preserved multisensory integrative ability (i.e., crossmodal blindsight) seems to be subserved by the spared retino-colliculo-dorsal pathway. According to this view, audio-visual integrative mechanisms could be used to increase the functionality of the spared circuit and, as a consequence, might represent an important tool for the rehabilitation of visual field defects. The present study tested this hypothesis, investigating whether exposure to systematic multisensory audio-visual stimulation could induce long-lasting improvements in the visual performance of patients with visual field defects. A group of 10 patients with chronic visual field defects were exposed to audio-visual training for 4 h daily, over a period of 2 weeks. Behavioral, oculomotor and electroencephalography (EEG) measures were recorded during several visual tasks before and after audio-visual training. After audio-visual training, improvements in visual search abilities, visual detection, self-perceived disability in daily life activities and oculomotor parameters were found, suggesting the implementation of more effective visual exploration strategies. At the electrophysiological level, after training, patients showed a significant reduction of the P3 amplitude in response to stimuli presented in the intact field, reflecting a reduction in attentional resources allocated to the intact field, which might co-occur with a shift of spatial attention towards the blind field.
More interestingly, both the behavioral improvements and the electrophysiological changes observed after training were found to be stable at a follow-up session (on average, 8 months after training), suggesting long-term effects of multisensory audio-visual training. These long-lasting effects seem to be subserved by the activation of the spared retino-colliculo-dorsal pathway, which promotes orienting responses towards the blind field, able to both compensate for the visual field loss and concurrently attenuate visual attention towards the intact field. These results extend previous findings by showing that audio-visual multisensory stimulation promotes long-term plastic changes in hemianopic patients, resulting in stable and long-lasting improvements in behavioral and electrophysiological measures
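P3 amplitude comparisons of this kind are typically obtained by averaging EEG epochs time-locked to the stimulus and taking the mean voltage of the average waveform in the P3 window (roughly 300-600 ms after onset). A minimal pre/post sketch on synthetic data (sampling rate, window bounds, trial counts and effect sizes are all assumed, not taken from the study):

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed)

def p3_amplitude(epochs: np.ndarray, t0: float = 0.3, t1: float = 0.6) -> float:
    """Mean voltage of the trial-averaged ERP in the P3 window (seconds)."""
    erp = epochs.mean(axis=0)            # average across trials -> ERP
    i0, i1 = int(t0 * FS), int(t1 * FS)  # window in samples
    return float(erp[i0:i1].mean())

rng = np.random.default_rng(0)
t = np.arange(int(0.8 * FS)) / FS                  # 0-800 ms epochs
p3 = np.exp(-((t - 0.45) ** 2) / (2 * 0.05 ** 2))  # Gaussian P3-like deflection

# Synthetic epochs: larger P3 before training, reduced P3 after (40 trials each).
pre = 8.0 * p3 + rng.normal(0, 1, (40, t.size))
post = 5.0 * p3 + rng.normal(0, 1, (40, t.size))

print(p3_amplitude(pre), p3_amplitude(post))  # post < pre, as after training
```

Averaging across trials before measuring is what makes the window mean an ERP amplitude rather than single-trial noise; the trial-to-trial noise shrinks with the square root of the trial count.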

    "Acoustical vision" of below threshold stimuli: Interaction among spatially converging audiovisual inputs

    No full text
    Crossmodal spatial integration between auditory and visual stimuli is a common phenomenon in space perception. The principles underlying such integration have been outlined by neurophysiological and behavioral studies in animals; this study investigated whether the integrative effects observed in animals also apply to humans. In this experiment we systematically varied the spatial disparity (0°, 16°, and 32°) and the temporal interval (0, 100, 200, 300, 400, and 500 ms) between the visual and the auditory stimuli. Normal subjects were required to detect visual stimuli presented below threshold either in unimodal visual conditions or in crossmodal audiovisual conditions. Signal detection measures were used. An enhancement of the perceptual sensitivity (d′) for luminance detection was found when the audiovisual stimuli followed a simple spatial and temporal rule, governing multisensory integration at the neuronal level. © Springer-Verlag 2004
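The sensitivity index d′ used in signal detection analyses like this one is the difference between the z-transformed hit and false-alarm rates, d′ = z(H) − z(F). A minimal sketch (the rates below are invented for illustration):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates: detection of a below-threshold flash improves when a
# sound is presented at the same location and time (values invented).
unimodal = d_prime(0.80, 0.20)    # visual-only baseline
crossmodal = d_prime(0.90, 0.20)  # spatially/temporally congruent audiovisual
print(unimodal, crossmodal)
```

Because d′ separates sensitivity from response bias, an increase under congruent audiovisual stimulation reflects genuinely better detection rather than a looser response criterion.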

    Lateralized emotional and movement-related body postures modulate the body specific N190 ERP component: different patterns in different hemispheres

    No full text
    The extent to which emotional and movement-related information conveyed by human body postures is represented already at visual encoding is still under debate. The present study investigated the modulations of the body specific ERP component (N190) by different body postures. Images of emotional (fearful and happy), neutral and static body postures were laterally presented in the left or right visual fields during a discrimination task between emotional and non-emotional body postures. The N190 component over the right hemisphere was differently modulated by each body posture presented: fear postures induced the highest N190 amplitudes, followed by neutral and happy postures. Static postures elicited the lowest N190 amplitudes. In contrast, the N190 component over the left hemisphere showed a significant difference only between the static body posture and movement-related postures: the static body postures led to the lowest N190 amplitude. These findings suggest that the visual encoding for bodies is affected by emotional and movement-related information and that the two hemispheres differentially contribute to the visual analysis: the right hemisphere, probably due to its prominent role in the processing of emotions, is responsible for a more detailed encoding, able to distinguish between the different emotional postures. In contrast, the left hemisphere plays a role only in the low level distinction between static and movement-related postures (emotional or neutral)

    Alpha oscillations reveal implicit visual processing of motion in hemianopia

    No full text
    After lesion or deafferentation of the primary visual cortex, hemianopic patients experience loss of conscious vision in their blind field. However, due to the spared colliculo-extrastriate pathway, they might retain the ability to implicitly process motion stimuli through the activation of spared dorsal-extrastriate areas, despite the absence of awareness. To test this hypothesis, Electroencephalogram (EEG) was recorded from a group of hemianopic patients without blindsight (i.e., who performed at chance in different forced-choice tasks), while motion stimuli, static stimuli or no stimuli (i.e., blank condition) were presented either in their intact or in their blind visual field. EEG analyses were performed in the time-frequency domain. The presentation of both motion and static stimuli in the intact field induced synchronization in the theta band and desynchronization both in the alpha and the beta band. In contrast, for stimuli presented in the blind field, significantly greater desynchronization in the alpha range was observed only after the presentation of motion stimuli, compared to the blank condition, over posterior parietal-occipital electrodes in the lesioned hemisphere, at a late time window (500–800 msec). No alpha desynchronization was elicited by static stimuli. These results show that hemianopic patients can process only visual signals relying on the activation of the dorsal pathway (i.e., motion stimuli) in the absence of awareness and suggest different patterns of electrophysiological activity for conscious and unconscious visual processing. Specifically, visual processing in the absence of awareness elicits an activity limited to the alpha range, most likely reflecting a "local" process, occurring within the extrastriate areas and not participating in inter-areal communication. This also suggests a response specificity in this frequency band for implicit visual processing.
In contrast, visual awareness evokes changes in different frequency bands, suggesting a "global" process, accomplished by activity in a wide range of frequencies, probably within and across cortical areas
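Alpha desynchronization of the kind reported here is commonly quantified as event-related desynchronization (ERD): the percentage change of alpha-band (8-12 Hz) power in a post-stimulus window relative to a pre-stimulus baseline. A sketch on synthetic data (sampling rate, window length and amplitudes are assumed, not taken from the study):

```python
import numpy as np

FS = 500  # sampling rate in Hz (assumed)

def alpha_power(signal: np.ndarray) -> float:
    """Mean spectral power in the 8-12 Hz band, via the FFT."""
    freqs = np.fft.rfftfreq(signal.size, d=1 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return float(power[band].mean())

def erd_percent(baseline: np.ndarray, event: np.ndarray) -> float:
    """ERD% = (event power - baseline power) / baseline power * 100."""
    p_base, p_event = alpha_power(baseline), alpha_power(event)
    return (p_event - p_base) / p_base * 100

t = np.arange(FS) / FS  # 1-second analysis windows
baseline = np.sin(2 * np.pi * 10 * t)        # strong 10 Hz alpha at rest
event = 0.5 * np.sin(2 * np.pi * 10 * t)     # alpha suppressed after stimulus

print(erd_percent(baseline, event))  # negative value: desynchronization
```

Halving the alpha amplitude quarters the alpha power, so the sketch yields an ERD of -75%; in real EEG the band power would be averaged over trials (and typically estimated with a tapered method such as Welch's) before taking the ratio.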

    Multisensory stimulation in hemianopic patients boosts orienting responses to the hemianopic field and reduces attentional resources to the intact field

    No full text
    Purpose: Lateralised lesions can disrupt inhibitory cross-callosal fibres which maintain interhemispheric equilibrium in attention networks, with a consequent attentional bias towards the ipsilesional field. Some evidence of this imbalance has also been found in hemianopic patients (Tant et al., 2002). The aim of the present study was to reduce this attentional bias in hemianopic patients by using multisensory stimulation capable of activating subcortical structures responsible for orienting attention, such as the superior colliculus. Methods: Eight hemianopic patients underwent a course of multisensory stimulation treatment for two weeks and their behavioural and electrophysiological performance was tested at three time intervals: baseline 1 (before treatment), control baseline 2 (two weeks after baseline 1 and immediately before treatment as a control for practice effects) and finally after treatment. Results: The results show improvements on various clinical measures, on orienting responses in the hemianopic field, and a reduction of electrophysiological activity (P3 amplitude) in response to stimuli presented in the intact visual field. Conclusions: These results suggest that the primary visual deficit in hemianopic patients might be accompanied by an ipsilesional attentional bias which might be reduced by multisensory stimulation

    Error monitoring is related to processing internal affective states

    No full text
    Detecting behavioral errors is critical for optimizing performance. Here, we tested whether error monitoring is enhanced in emotional task contexts, and whether this enhancement depends on processing internal affective states. Event-related potentials were recorded in individuals with low and high levels of alexithymia, that is, individuals with difficulties identifying and describing their feelings. We administered a face-word Stroop paradigm (Egner, Etkin, Gale, & Hirsch, 2008) in which the task was to classify emotional faces either with respect to their expression (happy or fearful; emotional task set) or with respect to their gender (female or male; neutral task set). The error-related negativity, a marker of rapid error monitoring, was enhanced in individuals with low alexithymia when they adopted the emotional task set. By contrast, individuals with high alexithymia did not show such an enhancement. Moreover, in the high-alexithymia group, the difference in the error-related negativities between the emotional and neutral task sets correlated negatively with difficulties identifying their own feelings, as measured by the Toronto Alexithymia Scale. These results show that error-monitoring activity is stronger in emotional task contexts and that this enhancement depends on processing internal affective states

    Decoupling of early V5 motion processing from visual awareness: A matter of velocity as revealed by transcranial magnetic stimulation

    Get PDF
    Motion information can reach V5/MT through two parallel routes: one conveying information at early latencies through a direct subcortical route and the other reaching V5 later via recurrent projections through V1. Here, we tested the hypothesis that input via the faster direct pathway depends on motion characteristics. To this end, we presented motion stimuli to healthy human observers at different velocities (4.4°/sec vs. 23°/sec) with static stimuli as controls while applying transcranial magnetic stimulation (TMS) pulses over V5 or V1. We probed for TMS interference with objective (two-alternative forced choice [2AFC]) and subjective (awareness) measures of motion processing at six TMS delays from stimulus onset (poststimulus window covered: 27–160 msec). Our results for V5–TMS showed earlier interference with objective performance for fast motion (53.3 msec) than slow motion (80 msec) stimuli. Importantly, TMS-induced decreases in objective measures of motion processing did correlate with decreases in subjective measures for slow but not fast motion stimuli. Moreover, V1–TMS induced a temporally unspecific interference with visual processing as it impaired the processing of both motion and static stimuli at the same delays. These results are in accordance with fast moving stimuli reaching V5 through a different route than slow moving stimuli. The differential latencies and coupling to awareness suggest distinct involvement of a direct (i.e., colliculo-extrastriate) connection bypassing V1 depending on stimulus velocity (fast vs. slow). Implication of a direct pathway in the early processing of fast motion may have evolved through its behavioral relevance