Multisensory Congruency as a Mechanism for Attentional Control over Perceptual Selection
The neural mechanisms underlying attentional selection of competing neural signals for awareness remain an unresolved issue. We studied attentional selection, using perceptually ambiguous stimuli in a novel multisensory paradigm that combined competing auditory and competing visual stimuli. We demonstrate that the ability to select, and attentively hold, one of the competing alternatives in either sensory modality is greatly enhanced when there is a matching cross-modal stimulus. Intriguingly, this multimodal enhancement of attentional selection seems to require a conscious act of attention, as passively experiencing the multisensory stimuli did not enhance control over the stimulus. We also demonstrate that congruent auditory or tactile information, and combined auditory–tactile information, aids attentional control over competing visual stimuli and vice versa. Our data suggest a functional role for recently found neurons that combine voluntarily initiated attentional functions across sensory modalities. We argue that these units provide a mechanism for structuring multisensory inputs that are then used to selectively modulate early (unimodal) cortical processing, boosting the gain of task-relevant features for willful control over perceptual awareness.
Spatially valid proprioceptive cues improve the detection of a visual stimulus
Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant’s arm, either in the same direction (validly cued) or in the opposite direction (invalidly cued) to the on-screen location of the mask. The d′ detection rate of the target increased when the direction of proprioceptive stimulus was compatible with the location of the visual target compared to when it was incompatible. These results suggest that proprioception influences the allocation of attention in visual space.
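The d′ measure reported above comes from signal detection theory: sensitivity is the difference between the z-transformed hit rate and false-alarm rate, so it separates true detectability from response bias. A minimal sketch of the standard computation (the specific rates below are hypothetical illustrations, not values from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates for validly vs. invalidly cued trials
d_valid = d_prime(0.80, 0.20)    # higher sensitivity
d_invalid = d_prime(0.65, 0.30)  # lower sensitivity
```

Because both rates are z-transformed, a participant who simply says "present" more often shifts criterion but leaves d′ unchanged, which is why the cueing effect described above is interpreted as a genuine change in detectability.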
Cross-modal extinction in a boy with severely autistic behaviour and high verbal intelligence
Anecdotal reports from individuals with autism suggest a loss of awareness to stimuli from one modality in the presence of stimuli from another. Here we document such a case in a detailed study of T.M., a 13-year-old boy with autism in whom significant autistic behaviors are combined with an uneven IQ profile of superior verbal and low performance abilities. Although T.M.'s speech is often unintelligible and his behavior is dominated by motor stereotypies and impulsivity, he can communicate by typing or pointing independently within a letter board. A series of experiments using simple and highly salient visual, auditory, and tactile stimuli demonstrated a hierarchy of cross-modal extinction, in which auditory information extinguished other modalities at various levels of processing. T.M. also showed deficits in shifting and sustaining attention. These results provide evidence for mono-channel perception in autism and suggest a general pattern of winner-takes-all processing in which a stronger stimulus-driven representation dominates behavior, extinguishing weaker representations.
Out of focus – brain attention control deficits in adult ADHD
Modern environments are full of information, and place high demands on the attention control mechanisms that allow the selection of information from one (focused attention) or multiple (divided attention) sources, react to changes in a given situation (stimulus-driven attention), and allocate effort according to demands (task-positive and task-negative activity). We aimed to reveal how attention deficit hyperactivity disorder (ADHD) affects the brain functions associated with these attention control processes in constantly demanding tasks. Sixteen adults with ADHD and 17 controls performed adaptive visual and auditory discrimination tasks during functional magnetic resonance imaging (fMRI). Overlapping brain activity in frontoparietal saliency and default-mode networks, as well as in the somato-motor, cerebellar, and striatal areas were observed in all participants. In the ADHD participants, we observed exclusive activity enhancement in the brain areas typically considered to be primarily involved in other attention control functions: During auditory-focused attention, we observed higher activation in the sensory cortical areas of irrelevant modality and the default-mode network (DMN). DMN activity also increased during divided attention in the ADHD group, in turn decreasing during a simple button-press task. Adding irrelevant stimulation resulted in enhanced activity in the salience network. Finally, the irrelevant distractors that capture attention in a stimulus-driven manner activated dorsal attention networks and the cerebellum. Our findings suggest that attention control deficits involve the activation of irrelevant sensory modality, problems in regulating the level of attention on demand, and may encumber top-down processing in cases of irrelevant information.
The role of the right temporoparietal junction in perceptual conflict: detection or resolution?
The right temporoparietal junction (rTPJ) is a polysensory cortical area that plays a key role in perception and awareness. Neuroimaging evidence shows activation of rTPJ in intersensory and sensorimotor conflict situations, but it remains unclear whether this activity reflects detection or resolution of such conflicts. To address this question, we manipulated the relationship between touch and vision using the so-called mirror-box illusion. Participants' hands lay on either side of a mirror, which occluded their left hand and reflected their right hand, but created the illusion that they were looking directly at their left hand. The experimenter simultaneously touched either the middle (D3) or the ring finger (D4) of each hand. Participants judged which finger was touched on their occluded left hand. The visual stimulus corresponding to the touch on the right hand was therefore either congruent (same finger as touch) or incongruent (different finger from touch) with the task-relevant touch on the left hand. Single-pulse transcranial magnetic stimulation (TMS) was delivered to the rTPJ immediately after touch. Accuracy in localizing the left touch was worse for D4 than for D3, particularly when visual stimulation was incongruent. However, following TMS, accuracy improved selectively for D4 in incongruent trials, suggesting that the effects of the conflicting visual information were reduced. These findings suggest a role of rTPJ in detecting, rather than resolving, intersensory conflict.
Early cross-modal interactions and adult human visual cortical plasticity revealed by binocular rivalry
In this research, binocular rivalry is used as a tool to investigate different aspects of visual and multisensory perception. Several experiments presented here demonstrated that touch specifically interacts with vision during binocular rivalry and that the interaction likely occurs at early stages of visual processing, probably V1 or V2. Another line of research also presented here demonstrated that human adult visual cortex retains an unexpectedly high degree of experience-dependent plasticity, by showing that a brief period of monocular deprivation produced important perceptual consequences on the dynamics of binocular rivalry, reflecting a homeostatic plasticity. In summary, this work shows that binocular rivalry is a powerful tool to investigate different aspects of visual perception and can be used to reveal unexpected properties of early visual cortex.
Somatic salience and sensory precision in persistent depression
Persistent depression is a debilitating health condition with a poor prognosis even in the context of current gold-standard pharmacological and psychological interventions. A better understanding of the mechanisms contributing to its maintenance is needed to facilitate the development of more targeted psychological interventions. Bayesian predictive processing models of depression propose that negative emotional and physiological outcomes arise in depressive illness as a result of disturbed interoceptive precision estimation in depressed individuals; however, evidence from the clinical and cognitive neuroscience literatures suggests the hypothesis that sensory precision is attenuated in persistent depression across sensory modalities in general. A series of studies was designed to index sensory precision across somatic and auditory modalities and to identify the level at which any disruption manifested in persistently-depressed participants relative to controls. Study 1 (Chapter 3) measured baseline signal discriminability under conditions of focused attention. Study 2 (Chapter 4) measured the impact of failures of voluntary attention on signal discriminability. Study 3 (Chapter 5) used a simulation approach to model sensory precision in the first two studies and to identify mechanisms which could successfully predict the data. Studies 4 (Chapter 6) and 5 (Chapter 7) measured attentional capture by task-irrelevant and predictive sensory cues respectively. Study 6 (Chapter 8) partially replicated Studies 4 and 5 and used the resulting data to estimate the group-level sensory precision and salience parameters of a predictive processing model of precision optimization. The results suggest that sensory precision is attenuated in persistent depression across sensory modalities, and that this attenuation results from disturbances of voluntary and involuntary attention rather than baseline perceptual sensitivity. 
Under conditions of voluntary attention, reduced sensory precision may result from efforts at resource conservation; and under conditions of involuntary attentional capture, it may be related to a loss of target discriminability and salience. Conversely, the bottom-up salience of somatic stimuli was uniquely enhanced among depressed participants and was predicted by high anxiety and by low interoceptive sensibility. These findings open up new avenues for investigation of the mechanisms underlying persistent forms of depression, and have direct implications for clinical practice with respect to psychological intervention.
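In the Bayesian predictive processing framework invoked above, "sensory precision" is typically formalized as the inverse variance of a sensory likelihood, and belief updating weights prediction errors by relative precision. A generic conjugate-Gaussian sketch of this idea (a textbook formalization, not the authors' specific model; all numbers are illustrative):

```python
def update_belief(prior_mean: float, prior_precision: float,
                  observation: float, sensory_precision: float):
    """Precision-weighted Gaussian belief update: the posterior mean is a
    precision-weighted average of the prior mean and the observation."""
    post_precision = prior_precision + sensory_precision
    post_mean = (prior_precision * prior_mean
                 + sensory_precision * observation) / post_precision
    return post_mean, post_precision

# With high sensory precision the observation dominates the posterior;
# with attenuated precision the posterior stays near the prior.
m_high, _ = update_belief(0.0, 1.0, 1.0, 4.0)  # pulled strongly toward 1.0
m_low, _ = update_belief(0.0, 1.0, 1.0, 0.5)   # remains closer to 0.0
```

Under this formalization, the attenuated sensory precision reported for persistently depressed participants corresponds to observations carrying less weight against prior expectations, regardless of baseline perceptual sensitivity.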
Are Neuronal Mechanisms of Attentional Modulation Universal Across Human Sensory and Motor Brain Maps?
One's experience of shifting attention from the color to the smell to the act of picking a flower seems like a unitary process applied, at will, to one modality after another. Yet, the unique experience of sight vs smell vs movement might suggest that the neural mechanisms of attention have been selectively optimized to employ each modality to greatest advantage. Relevant experimental data can be difficult to compare across modalities due to design and methodological heterogeneity. Here we outline some of the issues related to this problem and suggest how experimental data can be obtained across modalities using more uniform methods and measurements. The ultimate goal is to spur efforts across disciplines to provide a large and varied database of empirical observations that will either support the notion of a universal neural substrate for attention or more clearly identify to what degree attentional mechanisms are specialized for each modality.
Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans
Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is usually sufficient to understand speech; however, in noisy environments or when audition is impaired due to aging or disabilities, seeing mouth movements greatly improves speech perception. Although behavioral studies have well established this perceptual benefit, it is still not clear how the brain processes visual information from mouth movements to improve speech perception. To clarify this issue, I studied the neural activity recorded from the brain surfaces of human subjects using intracranial electrodes, a technique known as electrocorticography (ECoG). First, I studied responses to noisy speech in the auditory cortex, specifically in the superior temporal gyrus (STG). Previous studies identified the anterior parts of the STG as unisensory, responding only to auditory stimuli. On the other hand, posterior parts of the STG are known to be multisensory, responding to both auditory and visual stimuli, which makes it a key region for audiovisual speech perception. I examined how these different parts of the STG respond to clear versus noisy speech. I found that noisy speech decreased the amplitude and increased the across-trial variability of the response in the anterior STG. However, possibly due to its multisensory composition, posterior STG was not as sensitive to auditory noise as the anterior STG and responded similarly to clear and noisy speech. I also found that these two response patterns in the STG were separated by a sharp boundary demarcated by the posterior-most portion of Heschl’s gyrus. Second, I studied responses to silent speech in the visual cortex.
Previous studies demonstrated that visual cortex shows response enhancement when the auditory component of speech is noisy or absent; however, it was not clear which regions of the visual cortex specifically show this response enhancement and whether it is a result of top-down modulation from a higher region. To test this, I first mapped the receptive fields of different regions in the visual cortex and then measured their responses to visual (silent) and audiovisual speech stimuli. I found that visual regions that have central receptive fields show greater response enhancement to visual speech, possibly because these regions receive more visual information from mouth movements. I found similar response enhancement to visual speech in frontal cortex, specifically in the inferior frontal gyrus, premotor and dorsolateral prefrontal cortices, which have been implicated in speech reading in previous studies. I showed that these frontal regions display strong functional connectivity with visual regions that have central receptive fields during speech perception.
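The two response measures described for the STG (evoked amplitude and across-trial variability) can be computed generically from a trials-by-timepoints matrix. A minimal illustration on synthetic data (the array shapes, noise levels, and analysis choices here are assumptions for demonstration, not the dissertation's actual ECoG pipeline):

```python
import numpy as np

def trial_stats(trials: np.ndarray) -> tuple[float, float]:
    """trials: (n_trials, n_timepoints) array of response power.
    Returns (peak amplitude of the trial-averaged response,
             mean across-trial standard deviation)."""
    evoked = trials.mean(axis=0)                    # average over trials
    amplitude = float(evoked.max())
    variability = float(trials.std(axis=0).mean())  # trial-to-trial spread
    return amplitude, variability

# Synthetic example: "noisy speech" trials carry a weaker, more variable response
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, np.pi, 200))              # idealized evoked shape
clear = signal + 0.1 * rng.standard_normal((50, 200))
noisy = 0.5 * signal + 0.5 * rng.standard_normal((50, 200))

amp_clear, var_clear = trial_stats(clear)
amp_noisy, var_noisy = trial_stats(noisy)
```

On this toy data, the "noisy" condition reproduces the qualitative anterior-STG pattern described above: lower evoked amplitude together with higher across-trial variability.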