    The Behavioral Relevance of Multisensory Neural Response Interactions

    Sensory information can interact to impact perception and behavior. Foods are appreciated according to their appearance, smell, taste and texture. Athletes and dancers combine visual, auditory, and somatosensory information to coordinate their movements. In laboratory settings, detection and discrimination are likewise facilitated by multisensory signals. Research over the past several decades has shown that the requisite anatomy exists to support interactions between sensory systems in regions canonically designated as exclusively unisensory in their function and, more recently, that neural response interactions occur within these same regions, including even primary cortices and thalamic nuclei, at early post-stimulus latencies. Here, we review evidence concerning direct links between early, low-level neural response interactions and behavioral measures of multisensory integration.

    Discussing Gamma

    Topographic ERP Analyses: A Step-by-Step Tutorial Review

    In this tutorial review, we detail both the rationale for and the implementation of a set of analyses of surface-recorded event-related potentials (ERPs) that uses the reference-free spatial (i.e., topographic) information available from high-density electrode montages to render statistical information concerning modulations in response strength, latency, and topography both between and within experimental conditions. In these and other ways, these topographic analysis methods allow the experimenter to glean additional information and neurophysiologic interpretability beyond what is available from canonical waveform analyses. In this tutorial we present the example of somatosensory evoked potentials (SEPs) in response to stimulation of each hand to illustrate these points. For each step of these analyses, we provide the reader with both a conceptual and mathematical description of how the analysis is carried out, what it yields, and how to interpret its statistical outcome. We show that these topographic analysis methods are intuitive and easy-to-use approaches that can remove much of the guesswork often confronting ERP researchers and also assist in identifying the information contained within high-density ERP datasets.
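
    As a rough illustration of the reference-free measures such tutorials build on, the sketch below computes Global Field Power (GFP) and global map dissimilarity (DISS) from an electrodes-by-time ERP array; the array shapes, variable names, and simulated data are illustrative assumptions and not the tutorial's own code.

```python
import numpy as np

def global_field_power(erp):
    """GFP: spatial standard deviation across electrodes at each time frame.

    erp has shape (n_electrodes, n_times); the mean across electrodes is
    removed first, which makes the measure reference-free.
    """
    centered = erp - erp.mean(axis=0, keepdims=True)
    return np.sqrt((centered ** 2).mean(axis=0))

def global_dissimilarity(erp_a, erp_b):
    """DISS: GFP of the difference between two GFP-normalized maps.

    Values range from 0 (identical topographies) to 2 (inverted topographies)
    and are independent of overall response strength.
    """
    def normalize(erp):
        centered = erp - erp.mean(axis=0, keepdims=True)
        return centered / np.sqrt((centered ** 2).mean(axis=0))
    diff = normalize(erp_a) - normalize(erp_b)
    return np.sqrt((diff ** 2).mean(axis=0))

# Illustrative use with simulated SEP-like data: 64 electrodes, 500 time frames.
rng = np.random.default_rng(0)
sep_left = rng.standard_normal((64, 500))    # hypothetical left-hand stimulation ERP
sep_right = rng.standard_normal((64, 500))   # hypothetical right-hand stimulation ERP
print(global_field_power(sep_left).shape)               # one GFP value per time frame
print(global_dissimilarity(sep_left, sep_right).shape)  # one DISS value per time frame
```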

    Seeing things that are not there: illusions reveal how our brain constructs what we see

    What we perceive is not always what our eyes see. Vision, and perception more generally, should not be thought of as a webcam that just takes pictures of the world. This is not a fault in how our brains work, but rather is exemplary of how the brain constructs perception and takes advantage of its massive inter-connectedness in ways that are highly similar to social networks. The construction of perception is not only based on the information the eyes capture, but also on information stored in the brain and "guesses" based on this stored information. An illusory figure similar to that shown in Figure 1 is a laboratory example of this construction process and demonstrates well how the visual system works. In the real world, the visual system must handle situations of occlusion, noise, and equivocality (that is, when it is unclear which bits of what we see belong to one object versus another).

    Early, Low-Level Auditory-Somatosensory Multisensory Interactions Impact Reaction Time Speed

    Several lines of research have documented early-latency non-linear response interactions between audition and touch in humans and non-human primates. That these effects have been obtained under anesthesia, during passive stimulation, and during speeded reaction time tasks suggests that some multisensory effects are not directly influencing behavioral outcome. We investigated whether the initial non-linear neural response interactions have a direct bearing on the speed of reaction times. Electrical neuroimaging analyses were applied to event-related potentials in response to auditory, somatosensory, or simultaneous auditory–somatosensory multisensory stimulation that were in turn averaged according to trials leading to fast and slow reaction times (using a median split of individual subject data for each experimental condition). Responses to multisensory stimulus pairs were contrasted with each unisensory response as well as with the summed responses from the constituent unisensory conditions. Behavioral analyses indicated that neural response interactions were only implicated in the case of trials producing fast reaction times, as evidenced by facilitation in excess of probability summation. In agreement, supra-additive non-linear neural response interactions between the multisensory responses and the sum of the constituent unisensory responses were evident over the 40–84 ms post-stimulus period only when reaction times were fast, whereas subsequent effects (86–128 ms) were observed independently of reaction time speed. Distributed source estimations further revealed that these earlier effects followed from supra-additive modulation of activity within posterior superior temporal cortices. These results indicate the behavioral relevance of early multisensory phenomena.
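
    The median split and the test for facilitation in excess of probability summation described above can be sketched roughly as follows; the simulated reaction-time arrays and the quantile-based evaluation of the race model inequality (Miller, 1982) are assumptions for illustration, not the study's actual analysis pipeline.

```python
import numpy as np

def median_split(rts):
    """Split one condition's reaction times into fast and slow halves."""
    median = np.median(rts)
    return rts[rts <= median], rts[rts > median]

def race_model_violation(rt_aud, rt_som, rt_multi,
                         quantiles=np.arange(0.05, 1.0, 0.05)):
    """Facilitation in excess of probability summation (race model inequality).

    For each time point t, probability summation predicts
    P(RT_multi <= t) <= P(RT_aud <= t) + P(RT_som <= t).
    Positive returned values indicate violations of that bound, i.e.,
    multisensory facilitation beyond what two independent unisensory
    processes racing each other could produce.
    """
    t = np.quantile(rt_multi, quantiles)

    def ecdf(rts):
        # Empirical cumulative distribution function evaluated at t.
        return np.searchsorted(np.sort(rts), t, side="right") / rts.size

    bound = np.minimum(ecdf(rt_aud) + ecdf(rt_som), 1.0)
    return ecdf(rt_multi) - bound

# Illustrative use with simulated reaction times in milliseconds.
rng = np.random.default_rng(1)
rt_a = rng.normal(320, 40, 200)    # auditory-alone trials
rt_s = rng.normal(330, 40, 200)    # somatosensory-alone trials
rt_as = rng.normal(275, 35, 200)   # multisensory trials, markedly faster
fast_trials, slow_trials = median_split(rt_as)
print(race_model_violation(rt_a, rt_s, rt_as).round(3))
```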

    Comparing ICA-based and Single-Trial Topographic ERP Analyses

    Single-trial analysis of human electroencephalography (EEG) has recently been proposed for better understanding the contribution of individual subjects to a group-analysis effect as well as for investigating single-subject mechanisms. Independent Component Analysis (ICA) has been repeatedly applied to concatenated single-trial responses and at a single-subject level in order to extract those components that resemble activities of interest. More recently, we have proposed a single-trial method based on topographic maps that determines which voltage configurations are reliably observed at the event-related potential (ERP) level, taking advantage of repetitions across trials. Here, we investigated the correspondence between the maps obtained by ICA and the topographies obtained by the single-trial clustering algorithm that best explained the variance of the ERP. To do this, we used exemplar data provided on the EEGLAB website that are based on a dataset from a visual target detection task. We show there to be robust correspondence both at the level of the activation time courses and at the level of the voltage configurations of a subset of relevant maps. We additionally show the estimated inverse solution (based on low-resolution electromagnetic tomography) of two corresponding maps occurring at approximately 300 ms post-stimulus onset, as estimated by the two aforementioned approaches. The spatial distributions of the estimated sources significantly correlated and had in common a right parietal activation within Brodmann's Area (BA) 40. Despite their differences in terms of theoretical bases, the consistency between the results of these two approaches shows that their underlying assumptions are indeed compatible.
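
    A minimal sketch of how the correspondence between ICA component maps and clustering template maps might be quantified is given below; the use of scikit-learn's FastICA, the simple spatial-correlation matching, and the simulated data are assumptions for illustration rather than the method actually used in the paper.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)

# Simulated concatenated single trials: 32 electrodes x (100 trials * 300 samples).
eeg = rng.standard_normal((32, 100 * 300))

# ICA decomposition; columns of the mixing matrix are component scalp maps.
ica = FastICA(n_components=10, random_state=0, max_iter=1000)
activations = ica.fit_transform(eeg.T)   # (samples, components) time courses
ica_maps = ica.mixing_                   # (electrodes, components) topographies

# Template maps that a topographic clustering step would provide
# (random placeholders here, standing in for the cluster maps).
template_maps = rng.standard_normal((32, 4))

# Spatial correlation between every ICA map and every template map;
# absolute values are used because polarity is arbitrary in both methods.
corr = np.corrcoef(ica_maps.T, template_maps.T)[:10, 10:]
best_match = np.abs(corr).argmax(axis=0)
print(best_match)   # index of the ICA component best matching each template map
```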

    Automatic and Intrinsic Auditory "What" and "Where" Processing in Humans Revealed by Electrical Neuroimaging

    The auditory system includes two parallel functional pathways, one for treating sounds' identities and another for their spatial attributes (so-called "what" and "where" pathways). We examined the spatiotemporal mechanisms along auditory "what" and "where" pathways and whether they are automatically engaged in differentially processing the spatial and pitch information of identical stimuli. Electrical neuroimaging of auditory evoked potentials (i.e., statistical analyses of waveforms, field strength, topographies, and source estimations) was applied to a passive "oddball" paradigm comprising two varieties of blocks of trials. On "what" blocks, band-pass-filtered noises varied in pitch, independently of perceived location. On "where" blocks, the identical stimuli varied in perceived location, independently of pitch. Beginning 100 ms post-stimulus, the electric field topography significantly differed between conditions, indicative of the automatic recruitment of distinct intracranial generators. A distributed linear inverse solution and statistical analysis thereof revealed activations within superior temporal cortex and prefrontal cortex bilaterally that were common to both conditions, as well as regions within the right temporoparietal cortices that were selective for the "where" condition. These findings support models of automatic and intrinsic parallel processing of auditory information, such that segregated processing of spatial and pitch features may be an organizing principle of auditory function.
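
    The statement that the electric field topography differed between conditions rests on a nonparametric test of map differences; below is a minimal sketch of such a randomization test (often termed TANOVA), where the data shapes, the within-subject label shuffling, and the permutation count are illustrative assumptions.

```python
import numpy as np

def gmd(map_a, map_b):
    """Global map dissimilarity between two average-referenced, GFP-normalized maps."""
    def norm(m):
        m = m - m.mean()
        return m / np.sqrt((m ** 2).mean())
    return np.sqrt(((norm(map_a) - norm(map_b)) ** 2).mean())

def tanova(maps_a, maps_b, n_perm=1000, seed=0):
    """Randomization test of a topographic difference between two conditions.

    maps_a, maps_b: arrays of shape (n_subjects, n_electrodes), one scalp map
    per subject and condition at the latency of interest. Condition labels are
    shuffled within subjects to build the null distribution of the group-level GMD.
    """
    rng = np.random.default_rng(seed)
    observed = gmd(maps_a.mean(axis=0), maps_b.mean(axis=0))
    exceed = 0
    for _ in range(n_perm):
        flip = rng.integers(0, 2, size=maps_a.shape[0]).astype(bool)[:, None]
        perm_a = np.where(flip, maps_b, maps_a)
        perm_b = np.where(flip, maps_a, maps_b)
        if gmd(perm_a.mean(axis=0), perm_b.mean(axis=0)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)   # permutation p-value

# Illustrative use: 16 subjects, 64 electrodes, "what" vs. "where" maps at one latency.
rng = np.random.default_rng(3)
what_maps = rng.standard_normal((16, 64))
where_maps = rng.standard_normal((16, 64))
where_maps[:, :32] += 0.5   # impose a topographic difference over half the montage
print(tanova(what_maps, where_maps))
```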

    Spatiotemporal Analysis of Multichannel EEG: CARTOOL

    This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on quantitative assessment of changes of field topographies across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference independent and thus render statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarities, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in a freely available academic software package called CARTOOL. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using 3-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians to interpret multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
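
    The temporal segmentation based on topographic variations mentioned here amounts to clustering the momentary scalp maps into a few template topographies; the sketch below conveys the general idea with plain k-means over GFP-normalized maps, which is a simplification of the dedicated clustering implemented in CARTOOL and relies on simulated data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Simulated group-average ERP: 64 electrodes x 400 time frames.
erp = rng.standard_normal((64, 400))

# Average-reference and GFP-normalize every momentary map so that the
# clustering is driven by topography rather than by field strength.
maps = erp - erp.mean(axis=0, keepdims=True)
gfp = np.sqrt((maps ** 2).mean(axis=0))
maps = (maps / gfp).T                      # (time frames, electrodes)

# Cluster the momentary maps into a few template topographies and label each
# time frame with the template that best describes it.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(maps)
segmentation = kmeans.labels_              # cluster label per time frame
templates = kmeans.cluster_centers_        # (n_clusters, n_electrodes) template maps
print(segmentation[:20])
```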

    Emotional Pre-eminence of Human Vocalizations

    Human vocalizations (HV), as well as environmental sounds, convey a wide range of information, including emotional expressions. The latter have been relatively rarely investigated, and, in particular, it is unclear whether duration-controlled non-linguistic HV sequences can reliably convey both positive and negative emotional information. The aims of the present psychophysical study were: (i) to generate a battery of duration-controlled and acoustically controlled extreme-valence stimuli, and (ii) to compare the emotional impact of HV with that of other environmental sounds. A set of 144 HV and other environmental sounds was selected to cover emotionally positive, negative, and neutral values. Sequences of 2 s duration were rated on Likert scales by 16 listeners along three emotional dimensions (arousal, intensity, and valence) and two non-emotional dimensions (confidence in identifying the sound source and perceived loudness). The 2 s stimuli were reliably perceived as emotionally positive, negative, or neutral. We observed a linear relationship between intensity and arousal ratings and a "boomerang-shaped" intensity-valence distribution, as previously reported for longer, duration-variable stimuli. In addition, the emotional intensity ratings for HV were higher than for other environmental sounds, suggesting that HV constitute a characteristic class of emotional auditory stimuli. Furthermore, emotionally positive HV were more readily identified than other sounds, and emotionally negative stimuli, irrespective of their source, were perceived as louder than their positive and neutral counterparts. In conclusion, HV are a distinct emotional category of environmental sounds, and they retain this emotional pre-eminence even when presented for brief periods.
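
    The reported linear intensity-arousal relationship and the "boomerang-shaped" intensity-valence distribution can be illustrated with simple fits; the simulated ratings below and the quadratic approximation of the boomerang shape are assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(5)
n_stimuli = 144

# Simulated mean ratings per stimulus: valence centered on zero (negative to
# positive), with intensity highest at both valence extremes and arousal
# scaling roughly linearly with intensity.
valence = rng.uniform(-3.0, 3.0, n_stimuli)
intensity = 0.6 * valence ** 2 + rng.normal(0.0, 0.5, n_stimuli)
arousal = 0.8 * intensity + rng.normal(0.0, 0.5, n_stimuli)

# Linear relationship between intensity and arousal ratings.
slope, intercept = np.polyfit(intensity, arousal, deg=1)
r = np.corrcoef(intensity, arousal)[0, 1]

# "Boomerang-shaped" intensity-valence distribution approximated by a quadratic
# fit: intensity rises toward both the negative and the positive valence extremes.
a2, a1, a0 = np.polyfit(valence, intensity, deg=2)

print(f"arousal ~ {slope:.2f} * intensity + {intercept:.2f} (r = {r:.2f})")
print(f"intensity ~ {a2:.2f} * valence^2 + {a1:.2f} * valence + {a0:.2f}")
```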