
    The orienting response and the motor system


    The effect of synesthetic associations between the visual and auditory modalities on the Colavita effect

    The Colavita effect refers to the phenomenon that, when confronted with an audiovisual stimulus, observers more often report having perceived the visual than the auditory component. The Colavita effect depends on low-level stimulus factors such as spatial and temporal proximity between the unimodal signals. Here, we examined whether the Colavita effect is modulated by synesthetic congruency between visual size and auditory pitch. If the Colavita effect depends on synesthetic congruency, we expect a larger Colavita effect for synesthetically congruent size/pitch combinations (large visual stimulus/low-pitched tone; small visual stimulus/high-pitched tone) than for synesthetically incongruent ones (large visual stimulus/high-pitched tone; small visual stimulus/low-pitched tone). Participants had to identify the stimulus type (visual, auditory, or audiovisual). The study replicated the Colavita effect: participants more often reported the visual than the auditory component of the audiovisual stimuli. Synesthetic congruency, however, had no effect on the magnitude of the Colavita effect. EEG recordings to congruent and incongruent audiovisual pairings showed a late frontal congruency effect at 400–550 ms and an occipitoparietal effect at 690–800 ms, with neural sources in the anterior cingulate and premotor cortex for the 400–550 ms window, and in the premotor cortex, inferior parietal lobule, and posterior middle temporal gyrus for the 690–800 ms window. The electrophysiological data show that synesthetic congruency was probably detected in a processing stage subsequent to the Colavita effect. We conclude that, in a modality detection task, the Colavita effect can be modulated by low-level structural factors but not by higher-order associations between auditory and visual inputs. Keywords: Synesthetic congruency; Audiovisual integration; Colavita effect; Event-related potential

    Sub-clinical levels of autistic traits impair multisensory integration of audiovisual speech

    Autism Spectrum Disorder (ASD) is a pervasive neurodevelopmental disorder characterized by restricted interests, repetitive behavior, deficits in social communication, and atypical multisensory perception. ASD symptoms are found to varying degrees in the general population. While impairments in multisensory speech processing are widely reported in clinical ASD populations, the impact of sub-clinical levels of autistic traits on multisensory speech perception is still unclear. The present study examined audiovisual (AV) speech processing in a large non-clinical adult population in relation to autistic traits measured by the Autism Quotient. AV speech processing was assessed using the McGurk illusion, a simultaneity judgment task, and a spoken word recognition task in background noise. We found that difficulty with Imagination was associated with lower susceptibility to the McGurk illusion. Furthermore, difficulty with Attention-switching was associated with a wider temporal binding window and reduced gain from lip-read speech. These results demonstrate that sub-clinical ASD symptomatology is related to reduced AV speech processing performance, and are consistent with the notion of a spectrum of ASD traits that extends into the general population.

    Neural correlates of impaired motor-auditory prediction in Autism Spectrum Disorder

    Individuals with autism spectrum disorder (ASD) have difficulties with the unexpected and unpredictable nature of external events. Prior knowledge of the statistics of the environment (i.e., priors, in terms of the Bayesian statistical framework) can aid in resolving these uncertainties. In individuals with ASD, these priors may either be ill-constructed or not appropriately combined with the actual sensory information, resulting in less precise or attenuated priors (Pellicano and Burr, 2012). These ‘hypo-priors’ may cause a greater reliance on bottom-up incoming sensory signals, which in turn leads to every stimulus being experienced afresh. Here, we tested the hypo-priors hypothesis by examining the neural underpinnings of prediction of the sensory consequences of motor actions in individuals with ASD and individuals with typical development (TD). In this experiment, subjects pressed a button at a steady pace, which generated a sound. In another condition, the sounds were replayed at the same pace. In individuals with TD, the auditory N1 potential induced by the sound was attenuated in the motor-auditory condition compared to the auditory-only condition, indicating that, as expected, the motor action predicted the sound and dampened the sensation (Bäss et al., 2008). In individuals with ASD, there was no auditory N1 attenuation, indicating that they relied more strongly on bottom-up auditory cues. These results show that individuals with ASD make less use of their priors to interpret the sensory environment and support the notion of hypo-priors as the underlying cause of atypical multisensory processing in ASD.

    From face to hand: Attentional bias towards expressive hands in social anxiety

    The eye-region conveys important emotional information that we spontaneously attend to. Socially submissive individuals avoid others' gaze, which is regarded as avoidance of others' emotional facial expressions. But this interpretation ignores the fact that there are other sources of emotional information besides the face. Here we investigate whether gaze-aversion is associated with increased attention to emotional signals from the hands. We used eye-tracking to compare eye-fixations of pre-selected high and low socially anxious students when labeling bodily expressions (Experiment 1), when labeling them with (non-)matching facial expressions (Experiment 2), and when passively viewing them (Experiment 3). High compared to low socially anxious individuals attended more to hand-regions. Our findings demonstrate that socially anxious individuals do attend to emotions, albeit to different signals than the eyes and the face. Our findings call for a closer investigation of alternative viewing patterns explaining gaze-avoidance and underscore that signals other than the eyes and face must be considered to reach conclusions about social anxiety.

    Detecting violations of temporal regularities in waking and sleeping two-month-old infants

    Correctly processing rapid sequences of sounds is essential for developmental milestones, such as language acquisition. We investigated the sensitivity of two-month-old infants to violations of a temporal regularity by recording event-related brain potentials (ERP) in an auditory oddball paradigm from 36 waking and 40 sleeping infants. Standard tones were presented at a regular 300 ms inter-stimulus interval (ISI). One deviant, otherwise identical to the standard, was preceded by a 100 ms ISI. Two other deviants, presented with the standard ISI, differed from the standard in their spectral makeup. We found significant differences between ERP responses elicited by the standard and each of the deviant sounds. The results suggest that the ability to extract both temporal and spectral regularities from a sound sequence is already functional within the first few months of life. The scalp distribution of all three deviant-stimulus responses was influenced by the infants' state of alertness.