
    Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations

    Kayser S, Philiastides MG, Kayser C. Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. Neuroimage. 2017;148:31-41.

    Selective attention to sound features mediates cross-modal activation of visual cortices.

    Contemporary schemas of brain organization now include multisensory processes both in low-level cortices and at early stages of stimulus processing. Evidence has also accumulated showing that unisensory stimulus processing can result in cross-modal effects. For example, task-irrelevant and lateralised sounds can activate visual cortices, a phenomenon referred to as the auditory-evoked contralateral occipital positivity (ACOP). Some claim this is an example of automatic attentional capture in visual cortices. Other results, however, indicate that context may play a determining role. Here, we investigated whether selective attention to spatial features of sounds is a determining factor in eliciting the ACOP. We recorded high-density auditory evoked potentials (AEPs) while participants selectively attended and discriminated sounds according to four possible stimulus attributes: location, pitch, speaker identity or syllable. Sound acoustics were held constant, and their location was always equiprobable (50% left, 50% right). The only manipulation was the sound dimension to which participants attended. We analysed the AEP data from healthy participants within an electrical neuroimaging framework. The presence of sound-elicited activations of visual cortices depended on the to-be-discriminated, goal-based dimension: the ACOP was elicited only when participants were required to discriminate sound location, but not when they attended to any of the non-spatial features. These results provide a further indication that the ACOP is not automatic. Moreover, our findings showcase the interplay between task relevance and spatial (un)predictability in determining the presence of the cross-modal activation of visual cortices.
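    The ACOP itself is conventionally quantified as the difference between event-related potentials at occipital electrodes contralateral versus ipsilateral to the sound's location. The sketch below illustrates that generic computation only, not the study's actual pipeline; the electrode indices, epoch layout, sampling rate, and analysis window are all illustrative assumptions.

```python
import numpy as np

# Hypothetical epoched EEG data: (n_trials, n_channels, n_samples),
# sampled at 500 Hz, epochs spanning -100 to 500 ms around sound onset.
FS = 500
times = -0.1 + np.arange(300) / FS

# Illustrative channel indices for left/right occipital electrodes
# (e.g. PO7/PO8 in a 10-10 montage); real indices depend on the montage.
LEFT_OCC, RIGHT_OCC = 58, 59

def acop_wave(epochs_left_sound, epochs_right_sound):
    """Contralateral-minus-ipsilateral occipital ERP difference.

    For left-side sounds the contralateral occipital site is in the
    right hemisphere, and vice versa.
    """
    contra = np.concatenate([
        epochs_left_sound[:, RIGHT_OCC, :],   # right hemisphere, left sound
        epochs_right_sound[:, LEFT_OCC, :],   # left hemisphere, right sound
    ])
    ipsi = np.concatenate([
        epochs_left_sound[:, LEFT_OCC, :],
        epochs_right_sound[:, RIGHT_OCC, :],
    ])
    return contra.mean(axis=0) - ipsi.mean(axis=0)

# Random data standing in for real recordings (120 trials, 64 channels)
rng = np.random.default_rng(0)
left_epochs = rng.normal(size=(120, 64, 300))
right_epochs = rng.normal(size=(120, 64, 300))
diff = acop_wave(left_epochs, right_epochs)

# The ACOP would appear as a sustained positivity in `diff` in a late
# post-stimulus window (here, an assumed 200-450 ms).
window = (times >= 0.2) & (times <= 0.45)
print(f"Mean contra-minus-ipsi amplitude in window: {diff[window].mean():.3f}")
```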

    Multisensory Approaches to Restore Visual Functions


    Space, time and motion in a multisensory world

    When interacting with environmental events, humans acquire information from different senses and combine these inputs within a coherent representation of the world. The present doctoral thesis aims at investigating how humans represent space, time, and motion through the auditory and visual sensory modalities. A predisposition of different sensory systems towards the processing of different domains of representation has been widely demonstrated, with hearing prevailing in representing the time domain and vision being the most reliable sense for processing the space domain. Given this strong link between sensory modality and domain of representation, one objective of this thesis is to deepen the knowledge of the neural organization of multisensory spatial and temporal skills in healthy adults. In addition, by using blindness as a model to unravel the role of vision in the development of spatio-temporal abilities, this thesis explores the interaction of the spatial and temporal domains in the acoustic motion perception of early blind individuals. The interplay between space and time has also been explained as the result of humans performing actions in the surrounding environment, since to carry out goal-directed motor behaviors it is useful to bind the spatial and temporal information of one's target into a shared mental map. In this regard, the present project also asks how the brain processes the spatio-temporal cues of external events when it comes to manually intercepting moving objects with one hand. Finally, in light of the above results, this dissertation incorporates the development of a novel portable device, named MultiTab, for the behavioral evaluation of the processing of space, time, and motor responses through the visual and acoustic sensory modalities. For the purposes of this thesis, four methodological approaches have been employed: i) electroencephalography (EEG), to explore the cortical activation associated with multisensory spatial and temporal tasks; ii) psychophysical methods, to measure the relationship between stimuli in motion and the acoustic speed perception of blind and sighted individuals (see the sketch after this abstract); iii) motion capture techniques, to measure indices of movement during an object-interception task; iv) the design and technical-behavioral validation of a new portable device. The studies in this dissertation indicate the following results. First, this thesis highlights an early cortical gain modulation of sensory areas that depends on the domain of representation to be processed, with auditory areas mainly involved in the multisensory processing of temporal inputs, and visual areas in that of spatial inputs. Moreover, for the spatial domain specifically, the neural modulation of visual areas is also influenced by the kind of spatial layout representing multisensory stimuli. Second, this project shows that lack of vision influences the ability to process the speed of moving sounds by altering how blind individuals make use of the sounds' temporal features. This result suggests that visual experience in the first years of life is a crucial factor when dealing with combined spatio-temporal information. Third, data from this thesis demonstrate that typically developing individuals manually intercepting a moving object with one hand take the item's spatio-temporal cues into consideration, adjusting their interceptive movements according to the object's speed. Finally, the design and validation of MultiTab show its utility in the evaluation of multisensory processing, such as the manual localization of audiovisual spatialized stimuli. Overall, findings from this thesis contribute to a more in-depth picture of how the human brain represents space, time, and motion through different senses. Moreover, they provide promising implications for exploring novel technological methods for the assessment and training of these dimensions in typical and atypical populations.
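    Regarding approach (ii), psychophysical estimates of speed perception are commonly obtained by fitting a psychometric function to binary judgments: the speed at which the fitted curve crosses 50% gives the point of subjective equality (PSE), and its slope parameter indexes the just-noticeable difference (JND). The following is a minimal sketch with invented example data, not the thesis's actual analysis, fitting a cumulative Gaussian with SciPy.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical speed-discrimination data: comparison sound speeds (m/s)
# and the proportion of "faster than reference" responses at each speed.
speeds = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
p_faster = np.array([0.05, 0.15, 0.35, 0.55, 0.80, 0.90, 0.97])

def psychometric(x, pse, sigma):
    """Cumulative Gaussian: PSE is the 50% point; sigma indexes the JND."""
    return norm.cdf(x, loc=pse, scale=sigma)

# Fit the curve; p0 gives rough starting guesses for PSE and sigma.
(pse, sigma), _ = curve_fit(psychometric, speeds, p_faster, p0=[5.0, 1.0])
print(f"PSE = {pse:.2f} m/s, sigma (JND index) = {sigma:.2f} m/s")
```

    Comparing fitted PSE and JND values between blind and sighted groups is a standard way to quantify differences in acoustic speed perception.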

    Space and time in the human brain


    Seeing with sound: Investigating the behavioural applications and neural correlates of human echolocation

    Some blind humans use the reflected echoes from self-produced signals to perceive their silent surroundings. Although the use of echolocation is well documented in animals such as bats and dolphins, comparatively little is known about human echolocation. The overarching goal of the work presented in this thesis was to shed light on some of the basic functions of human echolocation, including the perception of object shape, size, and material. I addressed these aspects of echolocation using behavioural psychophysics and neuroimaging. In Chapter 2 I show that blind echolocators were able to accurately identify the shape of 2D objects, but that their ability to do so depended on the use of head and body movements to ‘scan’ the objects’ edges. I suggest that these scanning movements may be similar to the many saccades made by sighted individuals when visually surveying an object or scene. In Chapter 3 I addressed the possibility that object size perception via echolocation shows size constancy, a perceptual phenomenon associated with vision. The results revealed that an expert echolocator accurately perceived the true physical size of objects independent of their distance, even though changes to distance directly affect size-related echo information. The results of this study highlight the ‘visual’ nature of echolocation, and suggest more parallels between the two modalities than previously known or theorized. Chapter 4 presents the results of a functional neuroimaging study aimed at uncovering the neural correlates of material processing via echolocation. By having echolocators listen to recordings of echoes reflected from surfaces of different materials, I show not only that they can determine the material properties of objects, but also that the neural processing underlying this ability may make use of a visual- and auditory-material processing area in the parahippocampal cortex. Taken together, the work presented in the current thesis describes some of the recent contributions to our understanding of human echolocation, with a particular emphasis on its apparent parallels with vision and visual processing. The results of this work show that accurate and reliable information can be extracted from echoes, thus supporting echolocation as a viable resource for the blind.
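    The size-constancy result in Chapter 3 rests on the fact that distance directly reshapes the echo signal. As a back-of-the-envelope illustration (an idealized point-reflector model under simple spherical-spreading assumptions, not the thesis's recorded stimuli), an echo's round-trip delay grows linearly with distance while its level falls steeply, so a constant perceived size implies the listener is compensating for distance.

```python
import math

# Idealized echo acoustics: how distance alters size-related echo cues.
# Point-reflector model with spherical spreading on both legs of the
# round trip (an assumption; real objects and rooms are more complex).
C = 343.0  # speed of sound in air, m/s

def echo_delay_s(distance_m: float) -> float:
    """Round-trip delay of an echo from a reflector at distance_m."""
    return 2.0 * distance_m / C

def echo_level_db(distance_m: float, ref_distance_m: float = 1.0) -> float:
    """Echo level relative to ref_distance_m; spherical spreading out
    and back gives roughly a 12 dB drop per doubling of distance."""
    return -40.0 * math.log10(distance_m / ref_distance_m)

for d in (0.5, 1.0, 2.0, 4.0):
    print(f"{d:3.1f} m: delay = {echo_delay_s(d) * 1000:5.2f} ms, "
          f"level = {echo_level_db(d):6.1f} dB re 1 m")
```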

    A Framework to Account for the Effects of Visual Loss on Human Auditory Abilities

    Until recently, a commonly held view was that blindness resulted in enhanced auditory abilities, underpinned by the beneficial effects of cross-modal neuroplasticity. This viewpoint has been challenged by studies showing that blindness results in poorer performance on some auditory spatial tasks. It is now clear that visual loss does not result in a general increase or decrease in all auditory abilities. Although several hypotheses have been proposed to explain why certain auditory abilities are enhanced while others are degraded, these are often limited to a specific subset of tasks. A comprehensive explanation encompassing auditory abilities assessed in fully blind and partially sighted populations and spanning spatial and non-spatial cognition has not so far been proposed. The current article proposes a framework comprising a set of nine principles that can be used to predict whether auditory abilities are enhanced or degraded. The validity of these principles is assessed by comparing their predictions with a wide range of empirical evidence concerning the effects of visual loss on spatial and non-spatial auditory abilities. Developmental findings and the effects of early- versus late-onset visual loss are discussed. Ways of improving auditory abilities for individuals with visual loss and reducing auditory spatial deficits are summarized. A new Perceptual Restructuring Hypothesis is proposed within the framework, positing that the auditory system is restructured, utilizing available cortical resources, to provide the most accurate information possible given the loss of the visual signal, with different auditory abilities becoming enhanced or degraded according to the nine principles.

    Echolocation in humans: an overview

    Bats and dolphins are known for their ability to use echolocation. They emit bursts of sound and listen to the echoes that bounce back to detect objects in their environment. What is less well known is that some blind people have learned to do the same thing, making mouth clicks, for example, and using the returning echoes from those clicks to sense obstacles and objects of interest in their surroundings. The current review explores some of the research that has examined human echolocation and the changes that have been observed in the brains of echolocation experts. We also discuss potential applications and assistive technology based on echolocation. Blind echolocation experts can sense small differences in the location of objects, differentiate between objects of various sizes and shapes, and even between objects made of different materials, just by listening to the reflected echoes from mouth clicks. Echolocation may thus enable some blind people to do things that are otherwise thought to be impossible without vision, potentially providing them with a high degree of independence in their daily lives and demonstrating that echolocation can serve as an effective mobility strategy in the blind. Neuroimaging has shown that the processing of echoes activates brain regions in blind echolocators that would normally support vision in the sighted brain, and that the patterns of these activations are modulated by the information carried by the echoes. This work is shedding new light on just how plastic the human brain is.

    The enlanguaged brain: Cognitive and neural mechanisms of linguistic influence on perception
