
    The Behavioral Relevance of Multisensory Neural Response Interactions

    Sensory information can interact to impact perception and behavior. Foods are appreciated according to their appearance, smell, taste and texture. Athletes and dancers combine visual, auditory, and somatosensory information to coordinate their movements. Under laboratory settings, detection and discrimination are likewise facilitated by multisensory signals. Research over the past several decades has shown that the requisite anatomy exists to support interactions between sensory systems in regions canonically designated as exclusively unisensory in their function and, more recently, that neural response interactions occur within these same regions, including even primary cortices and thalamic nuclei, at early post-stimulus latencies. Here, we review evidence concerning direct links between early, low-level neural response interactions and behavioral measures of multisensory integration.

    Seeing things that are not there: illusions reveal how our brain constructs what we see

    What we perceive is not always what our eyes see. Vision, and perception more generally, should not be thought of as a webcam that just takes pictures of the world. This is not a fault in how our brains work, but rather is exemplary of how the brain constructs perception and takes advantage of its massive inter-connectedness in ways that are highly similar to social networks. The construction of perception is based not only on the information the eyes capture, but also on information stored in the brain and "guesses" derived from that stored information. An illusory figure similar to that shown in Figure 1 is a laboratory example of this construction process and demonstrates well how the visual system works. In the real world, the visual system must handle situations of occlusion, noise, and equivocality (that is, when it is unclear which bits of what we see belong to one object versus another).

    Early, Low-Level Auditory-Somatosensory Multisensory Interactions Impact Reaction Time Speed

    Several lines of research have documented early-latency non-linear response interactions between audition and touch in humans and non-human primates. That these effects have been obtained under anesthesia, during passive stimulation, and during speeded reaction time tasks would suggest that some multisensory effects do not directly influence behavioral outcome. We investigated whether the initial non-linear neural response interactions have a direct bearing on the speed of reaction times. Electrical neuroimaging analyses were applied to event-related potentials in response to auditory, somatosensory, or simultaneous auditory–somatosensory multisensory stimulation that were in turn averaged according to trials leading to fast and slow reaction times (using a median split of individual subject data for each experimental condition). Responses to multisensory stimulus pairs were contrasted with each unisensory response as well as with the summed responses from the constituent unisensory conditions. Behavioral analyses indicated that neural response interactions were only implicated in the case of trials producing fast reaction times, as evidenced by facilitation in excess of probability summation. In agreement, supra-additive non-linear interactions between the multisensory responses and the sum of the constituent unisensory responses were evident over the 40–84 ms post-stimulus period only when reaction times were fast, whereas subsequent effects (86–128 ms) were observed independently of reaction time speed. Distributed source estimations further revealed that these earlier effects followed from supra-additive modulation of activity within posterior superior temporal cortices. These results indicate the behavioral relevance of early multisensory phenomena.
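    The two behavioral analysis steps described above, the per-condition median split of reaction times and the test for facilitation in excess of probability summation, can be sketched as follows. This is a minimal illustration on simulated single-subject data, assuming a race-model-style comparison of cumulative RT distributions; the variable names, simulated distributions, and grid are assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (simulated data): median split of RTs and a probability-summation
# (race-model) check. Not the study's code; parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-subject reaction times in seconds for the three conditions
rt_aud = rng.gamma(shape=8.0, scale=0.040, size=200) + 0.15    # auditory alone
rt_som = rng.gamma(shape=8.0, scale=0.045, size=200) + 0.15    # somatosensory alone
rt_multi = rng.gamma(shape=8.0, scale=0.032, size=200) + 0.15  # simultaneous pair

def median_split(rts):
    """Split trials into fast and slow halves around the condition's median RT."""
    med = np.median(rts)
    return rts[rts <= med], rts[rts > med]

fast_multi, slow_multi = median_split(rt_multi)  # trials later averaged separately in the ERP analysis

def ecdf(rts, t_grid):
    """Empirical cumulative RT distribution evaluated on a time grid."""
    return np.mean(rts[:, None] <= t_grid[None, :], axis=0)

# Probability summation bound: without interaction, the multisensory CDF should not
# exceed the sum of the two unisensory CDFs (capped at 1).
t_grid = np.linspace(0.15, 0.80, 200)
violation = ecdf(rt_multi, t_grid) - np.minimum(ecdf(rt_aud, t_grid) + ecdf(rt_som, t_grid), 1.0)
print("max violation of probability summation:", violation.max())  # > 0 indicates facilitation beyond the bound
```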

    Spatiotemporal Analysis of Multichannel EEG: CARTOOL

    This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on quantitative assessment of changes of field topographies across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference independent and thus render statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarities, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in a freely available academic software package called CARTOOL. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using 3-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL therefore is a helpful tool for researchers as well as for clinicians to interpret multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
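    As a concrete illustration of the global topographic measures mentioned above, the following sketch computes Global Field Power and Global Map Dissimilarity from their standard definitions on simulated data. It is not code from CARTOOL, and the array shapes and variable names are assumptions.

```python
# Sketch of two standard topographic measures (not CARTOOL code): Global Field
# Power (GFP) and Global Map Dissimilarity (GMD), on simulated multichannel EEG.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_times = 64, 500
eeg = rng.standard_normal((n_channels, n_times))  # stand-in for one average-referenced epoch

def gfp(data):
    """Global Field Power: spatial standard deviation of the average-referenced map at each time point."""
    ref = data - data.mean(axis=0, keepdims=True)  # re-reference to the common average
    return np.sqrt((ref ** 2).mean(axis=0))

def gmd(map_a, map_b):
    """Global Map Dissimilarity: GFP of the difference between GFP-normalized, average-referenced maps."""
    a = map_a - map_a.mean()
    b = map_b - map_b.mean()
    a = a / np.sqrt((a ** 2).mean())
    b = b / np.sqrt((b ** 2).mean())
    return np.sqrt(((a - b) ** 2).mean())  # 0 = identical topographies, 2 = polarity-inverted

field_strength = gfp(eeg)                    # field strength over time
topo_change = gmd(eeg[:, 100], eeg[:, 101])  # topographic change between two successive samples
print(field_strength.shape, topo_change)
```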

    Automatic and Intrinsic Auditory "What" and "Where" Processing in Humans Revealed by Electrical Neuroimaging

    The auditory system includes 2 parallel functional pathways: one for processing sounds' identities and another for their spatial attributes (the so-called "what" and "where" pathways). We examined the spatiotemporal mechanisms along the auditory "what" and "where" pathways and whether they are automatically engaged in differentially processing the spatial and pitch information of identical stimuli. Electrical neuroimaging of auditory evoked potentials (i.e., statistical analyses of waveforms, field strength, topographies, and source estimations) was applied to a passive "oddball" paradigm comprising 2 varieties of blocks of trials. On "what" blocks, band-pass-filtered noises varied in pitch, independently of perceived location. On "where" blocks, the identical stimuli varied in perceived location, independently of pitch. Beginning 100 ms post-stimulus, the electric field topography significantly differed between conditions, indicative of the automatic recruitment of distinct intracranial generators. A distributed linear inverse solution and statistical analysis thereof revealed activations within superior temporal cortex and prefrontal cortex bilaterally that were common to both conditions, as well as regions within the right temporoparietal cortices that were selective for the "where" condition. These findings support models of automatic and intrinsic parallel processing of auditory information, such that segregated processing of spatial and pitch features may be an organizing principle of auditory function.
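    The topographic comparison described above (testing whether the electric field configuration differs between the "what" and "where" conditions) is often run as a randomization test on map dissimilarity. The sketch below illustrates that idea on simulated single-trial maps at one latency; the data, effect size, and helper names are assumptions rather than the study's actual analysis code.

```python
# Illustrative TANOVA-style randomization test on scalp topographies (simulated data).
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_channels = 80, 64
pattern = rng.standard_normal(n_channels)  # a fixed topography added to one condition
maps_what = rng.standard_normal((n_trials, n_channels))
maps_where = rng.standard_normal((n_trials, n_channels)) + 0.5 * pattern

def normalize(m):
    """Average-reference and GFP-normalize a map or a stack of maps (last axis = channels)."""
    m = m - m.mean(axis=-1, keepdims=True)
    return m / np.sqrt((m ** 2).mean(axis=-1, keepdims=True))

def gmd(a, b):
    """Global dissimilarity between two normalized maps."""
    return np.sqrt(((normalize(a) - normalize(b)) ** 2).mean())

observed = gmd(maps_what.mean(axis=0), maps_where.mean(axis=0))

# Null distribution: shuffle condition labels and recompute the dissimilarity
pooled = np.vstack([maps_what, maps_where])
null = np.empty(2000)
for k in range(null.size):
    perm = rng.permutation(len(pooled))
    null[k] = gmd(pooled[perm[:n_trials]].mean(axis=0), pooled[perm[n_trials:]].mean(axis=0))

p_value = np.mean(null >= observed)
print(f"GMD = {observed:.3f}, p = {p_value:.3f}")
```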

    Stochastic modulation of oscillatory neural activity.

    Rhythmic neural activity plays a central role in neural computation. Oscillatory activity has been associated with myriad functions such as homeostasis, attention, and cognition [1] as well as with neurological and psychiatric disorders, including Parkinson's disease, schizophrenia, and depression [2]. Despite this pervasiveness, little is known about the dynamic mechanisms by which the frequency and power of ongoing cyclical neural activity can be modulated, either externally (e.g. by external stimulation) or internally, via the modulatory drive of nearby neurons. While numerous studies have focused on neural rhythms and synchrony, it remains unresolved what mediates frequency transitions whereby the predominant power spectrum shifts from one frequency to another. Here, we provide computational perspectives regarding the responses of cortical networks to fast stochastic fluctuations (hereafter "noise") at frequencies in the range of 10-500 Hz, mimicked using Poisson shot-noise. Using a sparse and randomly connected network of neurons with time delay, we determine the functional impact of these fluctuations on network topology using mean-field approximations. We show how noise can be used to displace the equilibrium activity state of the population: the noise smoothly shifts the mean activity of the modeled neurons from a regime dominated by inhibition to a regime dominated by excitation. Moreover, we show that noise alone may support frequency transitions via a non-linear mechanism that operates in addition to resonance. Surprisingly, stochastic fluctuations non-monotonically modulate the network's oscillations, which lie in the beta band: the system's frequency is first slowed down and then accelerated as the stimulus intensity and/or rate increases. This non-linear effect is caused by a combination of input-induced linearization of the dynamics and enhanced network susceptibility. Our results provide insights regarding a potentially significant mechanism at play in synchronous neural systems, namely that ongoing activity rhythms can be externally and dynamically modulated, and they moreover indicate a candidate mechanism supporting frequency transitions. By altering the oscillation frequency of the network, power can be displaced from one frequency band to another. As such, the action of noise on oscillating neural systems must be regarded as strongly non-linear, recruiting more than resonance alone to operate on ongoing dynamics.
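    As a rough illustration of this setup, the sketch below drives a toy delayed mean-field rate equation with Poisson shot-noise and reads out the dominant frequency of the resulting activity for different input rates. The rate equation, time constants, delay, and gain are all assumptions made for the illustration; they are not the model or parameters used in the study.

```python
# Toy sketch: a delayed mean-field rate unit with inhibitory feedback, driven by
# Poisson shot-noise, and the peak frequency of its activity. Assumed parameters.
import numpy as np

def simulate(noise_rate_hz, seed=0, T=10.0, dt=1e-3):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    delay = int(0.02 / dt)            # 20 ms feedback delay (assumed)
    tau, tau_s = 0.01, 0.005          # membrane and synaptic time constants in seconds (assumed)
    w, amp = 8.0, 0.05                # feedback gain and shot-noise amplitude (assumed)
    x = np.zeros(n)                   # population activity
    s = 0.0                           # shot-noise state: exponentially filtered Poisson events
    for i in range(1, n):
        s = s * np.exp(-dt / tau_s) + amp * rng.poisson(noise_rate_hz * dt)
        feedback = x[i - delay] if i >= delay else 0.0
        drive = 1.0 / (1.0 + np.exp(-(1.0 - w * feedback + s)))  # sigmoidal rate function
        x[i] = x[i - 1] + dt * (drive - x[i - 1]) / tau
    # dominant frequency of the second half of the simulation (discard the transient)
    seg = x[n // 2:] - x[n // 2:].mean()
    spec = np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(seg.size, dt)
    return freqs[1:][np.argmax(spec[1:])]

for rate in (50, 200, 800):  # increasing shot-noise event rate (events per second)
    print(f"{rate:4d} Hz input -> peak activity frequency ~ {simulate(rate):.1f} Hz")
```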

    An Introductory Guide to Organizational Neuroscience

    The time is ripe for a renewed and interdisciplinary approach to organizational research that incorporates neuroscientific techniques. Like all methods, these techniques have methodological, analytical, and interpretational limitations; however, the potential gains from using them far outweigh these limitations. We have therefore assembled a succinct yet authoritative collection of articles on the topic of neuroscience in organizational research, to serve as a solid introduction to the methods of neuroscience and what they can accomplish. The special topic is organized into two parts. The first includes a set of accessible reviews of the palette of brain imaging, mapping, and stimulation techniques (fMRI, fNIRS, EEG, MEG, and NIBS), examples of the application of neuroscience methods to various disciplines including economics, marketing, finance, organizational behavior, and neuroethology, as well as an integrative translational critique of a variety of applications. The second is a collection of articles resulting from a competitive call for submissions that cover various neuroscience topics but also address important methodological and philosophical issues. The articles lay out a roadmap for the effective integration of neuroscientific methods into organizational research.

    Hemispheric competence for auditory spatial representation

    Sound localization relies on the analysis of interaural time and intensity differences, as well as on attenuation patterns produced by the outer ear. We investigated the relative contributions of interaural time and intensity difference cues to sound localization by testing 60 subjects, including 25 with focal left and 25 with focal right hemispheric brain damage. Group and single-case behavioural analyses, as well as anatomo-clinical correlations, confirmed that deficits were more frequent and much more severe after right than after left hemispheric lesions and for the processing of interaural time than intensity difference cues. For spatial processing based on interaural time difference cues, different error types were evident in the individual data. Deficits in discriminating between neighbouring positions occurred in both hemispaces after focal right hemispheric brain damage, but were restricted to the contralesional hemispace after focal left hemispheric brain damage. Alloacusis (perceptual shifts across the midline) occurred only after focal right hemispheric brain damage and was associated with minor or severe deficits in position discrimination. During spatial processing based on interaural intensity cues, deficits were less severe in the right than in the left hemispheric brain damage group, and no alloacusis occurred. These results, matched to anatomical data, suggest the existence of a binaural sound localization system predominantly based on interaural time difference cues and primarily supported by the right hemisphere. More generally, our data suggest that two distinct mechanisms contribute to: (i) the precise computation of spatial coordinates, allowing spatial comparison within the contralateral hemispace for the left hemisphere and the whole space for the right hemisphere; and (ii) the building up of global auditory spatial representations in right temporo-parietal cortices.
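    As a small worked example of the interaural time difference cue mentioned above, the spherical-head (Woodworth) approximation gives the expected delay between the ears as a function of source azimuth. The head radius and the use of this particular approximation are textbook assumptions for illustration, not values taken from the study.

```python
# Back-of-the-envelope ITD values from the spherical-head (Woodworth) approximation.
import numpy as np

def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (seconds) for a far-field source; 0 deg = straight ahead."""
    theta = np.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + np.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:2d} deg azimuth -> ITD ~ {itd_woodworth(az) * 1e6:3.0f} microseconds")
# Roughly 0, 260, 490, 660 microseconds: localization hinges on sub-millisecond timing differences.
```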