25 research outputs found

    Effects of stimulus duration on audio-visual synchrony perception

    The integration of visual and auditory inputs in the human brain occurs only if the components are perceived in temporal proximity, that is, when the intermodal time difference falls within the so-called subjective synchrony range. We used the midpoint of this range to estimate the point of subjective simultaneity (PSS). We measured the PSS for audio-visual (AV) stimuli in a synchrony judgment task, in which subjects had to judge a given AV stimulus using three response categories (audio first, synchronous, video first). The relevant stimulus manipulation was the duration of the auditory and visual components. Results for unimodal auditory and visual stimuli have shown that the perceived onset shifts to relatively later positions with increasing stimulus duration. These unimodal shifts should be reflected in changing PSS values when AV stimuli with different durations of the auditory and visual components are used. The results for 17 subjects indeed showed a significant shift of the PSS across different duration combinations of the stimulus components. Because the shifts were approximately equal for duration changes in either of the components, no net shift of the PSS was observed as long as the durations of the two components were equal. This result indicates the need to appropriately account for unimodal timing effects when quantifying intermodal synchrony perception.
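    The PSS-as-midpoint estimate described in this abstract can be sketched in code. The following is a minimal illustration, not the authors' analysis pipeline: the function name, the 0.5 criterion, and the linear interpolation of the synchrony-range boundaries are all illustrative assumptions.

```python
import numpy as np

def estimate_pss(soas, p_sync, criterion=0.5):
    """Estimate the point of subjective simultaneity (PSS) as the midpoint
    of the subjective synchrony range: the SOA window over which the
    proportion of "synchronous" judgments exceeds a criterion.

    soas:   stimulus onset asynchronies in ms, sorted ascending
            (audio-leading negative by convention here)
    p_sync: proportion of "synchronous" responses at each SOA
    """
    soas = np.asarray(soas, dtype=float)
    p_sync = np.asarray(p_sync, dtype=float)

    above = p_sync >= criterion
    if not above.any():
        raise ValueError("no SOA reaches the synchrony criterion")
    idx = np.flatnonzero(above)
    first, last = idx[0], idx[-1]

    def crossing(i, j):
        # Linearly interpolate where p_sync crosses the criterion
        # between samples i and j; clamp at the tested SOA range.
        if i < 0 or j >= len(soas):
            return soas[np.clip(i + 1, 0, len(soas) - 1)]
        x0, x1, y0, y1 = soas[i], soas[j], p_sync[i], p_sync[j]
        if y1 == y0:
            return x0
        return x0 + (criterion - y0) * (x1 - x0) / (y1 - y0)

    lower = crossing(first - 1, first)   # audio-leading boundary
    upper = crossing(last, last + 1)     # video-leading boundary
    return (lower + upper) / 2.0         # midpoint = PSS
```

    With a symmetric synchrony-judgment profile this returns a PSS near 0 ms; a profile skewed toward video-leading SOAs yields a positive PSS, which is how the duration-dependent shifts reported above would show up.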

    Sensory Integration and the Perceptual Experience of Persons with Autism


    The COGs (context, object, and goals) in multisensory processing

    Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have traditionally been studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g., tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.

    Exploring the benefit of auditory spatial continuity

    Continuity of spatial location was recently shown to improve the ability to identify and recall a sequence of target digits presented in a mixture of confusable maskers [Best et al. (2008). Proc. Natl. Acad. Sci. U.S.A. 105, 13174–13178]. Three follow-up experiments were conducted to explore the basis of this improvement. The results suggest that the benefits of spatial continuity cannot be attributed to (a) the ability to plan where to direct attention in advance; (b) freedom from having to redirect attention across large distances; or (c) the challenge of filtering out signals that are confusable with the target.

    Redundancy gains in simple responses and go/no-go tasks

    In divided-attention tasks with two classes of target stimuli, participants typically respond more quickly if both targets are presented simultaneously, as compared with single-target presentation (redundant-signals effect). Different explanations exist for this effect, including serial, parallel, and coactivation models of information processing. In two experiments, we investigated redundancy gains in simple and go/no-go responses to auditory-visual stimuli presented with an onset asynchrony. In Experiment 1, go/no-go discrimination was performed for near-threshold and suprathreshold stimuli. Response times in both the simple and go/no-go responses were well explained by a common coactivation model assuming linear superposition of modality-specific activation. In Experiment 2, the go/no-go task was made more difficult. Participants had to respond to high-frequency tones or right-tilted Gabor patches and to withhold their response for low tones and left-tilted Gabors. Redundancy gains were consistent with coactivation models; however, channel-specific buildup of evidence seems to occur at different speeds in the two tasks. Response times of 1 participant support a serial self-terminating model of modality-specific information processing. Supplemental materials for this article may be downloaded from http://app.psychonomic-journals.org/content/supplemental
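    A standard diagnostic for distinguishing coactivation from a parallel race between channels, in studies like this one, is Miller's race-model inequality: F_AV(t) ≤ F_A(t) + F_V(t), where F is the cumulative response-time distribution. The sketch below is a generic illustration of that test using empirical CDFs, not the coactivation model fitted in this paper; the function and variable names are assumptions.

```python
import numpy as np

def race_model_violation(rt_audio, rt_visual, rt_redundant, t_grid):
    """Evaluate F_AV(t) - min(F_A(t) + F_V(t), 1) on a grid of times.

    Positive values violate the race-model inequality and are
    commonly interpreted as evidence for coactivation rather than
    a parallel race between modality-specific channels.
    """
    def ecdf(rts, t):
        # Empirical CDF: proportion of response times <= t.
        rts = np.sort(np.asarray(rts, dtype=float))
        return np.searchsorted(rts, t, side="right") / len(rts)

    bound = np.minimum(ecdf(rt_audio, t_grid) + ecdf(rt_visual, t_grid), 1.0)
    return ecdf(rt_redundant, t_grid) - bound
```

    If redundant (audio-visual) responses are faster than either unimodal distribution can account for, the returned values are positive at fast time points, consistent with the coactivation account favoured in the abstract above.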