
    Exploring the Function of Neural Oscillations in Early Sensory Systems

    Neuronal oscillations appear throughout the nervous system, in structures as diverse as the cerebral cortex, hippocampus, subcortical nuclei, and sense organs. Whether neural rhythms contribute to normal function, are merely epiphenomena, or even interfere with physiological processing is a topic of vigorous debate. Sensory pathways are ideal for investigating oscillatory activity because their inputs can be defined. Thus, we will focus on sensory systems as we ask how neural oscillations arise and how they might encode information about the stimulus. We will highlight recent work in the early visual pathway that shows how oscillations can multiplex different types of signals to increase the amount of information that spike trains encode and transmit. Lastly, we will describe oscillation-based models of visual processing and explore how they might guide further research.

    Importance of spike timing in touch: an analogy with hearing?

    Touch is often conceived of as a spatial sense akin to vision. However, touch also involves the transduction and processing of signals that vary rapidly over time, inviting comparisons with hearing. In both sensory systems, first-order afferents produce spiking responses that are temporally precise, and the timing of these responses carries stimulus information. The precision and informativeness of spike timing in the two systems raise the possibility that both implement similar mechanisms to extract behaviorally relevant information from these precisely timed responses. Here, we explore the putative roles of spike timing in touch and hearing and discuss common mechanisms that may be involved in processing temporal spiking patterns.

    Functional Sensory Representations of Natural Stimuli: the Case of Spatial Hearing

    In this thesis I attempt to explain mechanisms of neuronal coding in the auditory system as a form of adaptation to the statistics of natural stereo sounds. To this end I analyse recordings of real-world auditory environments and construct novel statistical models of these data. I further compare regularities present in natural stimuli with known, experimentally observed neuronal mechanisms of spatial hearing. From a more general perspective, I use the binaural auditory system as a starting point to consider the notion of the function implemented by sensory neurons. In particular I argue for two closely related tenets: (1) the function of sensory neurons cannot be fully elucidated without understanding the statistics of the natural stimuli they process; (2) the function of sensory representations is determined by redundancies present in the natural sensory environment. I present evidence in support of the first tenet by describing and analysing marginal statistics of natural binaural sound. I compare the observed empirical distributions with knowledge from reductionist experiments. This comparison suggests that the complexity of the spatial hearing task in the natural environment is much higher than analytic, physics-based predictions imply. I discuss the possibility that early brainstem circuits such as the LSO and MSO do not "compute sound localization", as is often claimed in the experimental literature; I propose that they instead perform a signal transformation that constitutes the first step of a complex inference process. To support the second tenet I develop a hierarchical statistical model that learns a joint sparse representation of amplitude and phase information from natural stereo sounds. I demonstrate that the learned higher-order features reproduce properties of auditory cortical neurons when probed with spatial sounds. The reproduced aspects had been hypothesized to be a manifestation of a fine-tuned computation specific to the sound-localization task; here it is demonstrated that they instead reflect redundancies present in the natural stimulus. Taken together, the results presented in this thesis suggest that efficient coding is a strategy useful for discovering structures (redundancies) in the input data, whose meaning must be determined by the organism via environmental feedback.

    Temporal Coding of Periodicity Pitch in the Auditory System: An Overview

    This paper outlines a taxonomy of neural pulse codes and reviews neurophysiological evidence for interspike-interval-based representations of pitch and timbre in the auditory nerve and cochlear nucleus. Neural pulse codes can be divided into channel-based codes, temporal-pattern codes, and time-of-arrival codes. The timing of discharges in auditory nerve fibers reflects the time structure of acoustic waveforms, such that the interspike intervals that are produced precisely convey information about stimulus periodicities. Population-wide interspike interval distributions are constructed by summing intervals from the observed responses of many single Type I auditory nerve fibers. Features in such distributions correspond closely with the pitches heard by human listeners. The most common all-order interval present in the auditory nerve array almost invariably corresponds to the pitch frequency, whereas the relative fraction of pitch-related intervals among all others qualitatively corresponds to the strength of the pitch. Consequently, many diverse aspects of pitch perception can be explained in terms of such temporal representations. Similar stimulus-driven temporal discharge patterns are observed in major neuronal populations of the cochlear nucleus. Population-interval distributions constitute an alternative time-domain strategy for representing sensory information that complements spatially organized sensory maps. Similar autocorrelation-like representations are possible in other sensory systems in which neural discharges are time-locked to stimulus waveforms.
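The population-interval idea can be sketched computationally: pool all-order interspike intervals across fibers, histogram them, and read the pitch off the dominant interval. The following is a minimal sketch, not the paper's actual analysis; the fiber count, firing reliability, timing jitter, bin width, and the 0.8 peak threshold are all illustrative assumptions.

```python
import numpy as np

def all_order_intervals(spike_times, max_interval=0.02):
    """All positive pairwise differences between spike times, up to max_interval (s)."""
    t = np.asarray(spike_times)
    d = (t[:, None] - t[None, :]).ravel()
    return d[(d > 0) & (d <= max_interval)]

def estimate_pitch(spike_trains, bin_width=1e-4, max_interval=0.02, thresh=0.8):
    """Pool all-order intervals across fibers; return 1 / (dominant interval) in Hz.

    Large peaks also occur at integer multiples of the pitch period, so we take
    the SHORTEST peak that reaches `thresh` of the global maximum, then refine
    to the best bin within that peak.
    """
    pooled = np.concatenate([all_order_intervals(st, max_interval)
                             for st in spike_trains])
    counts, edges = np.histogram(
        pooled, bins=np.arange(0.0, max_interval + bin_width, bin_width))
    strong = set(np.flatnonzero(counts >= thresh * counts.max()))
    i = min(strong)                      # leftmost bin of the shortest strong peak
    cluster = [i]
    while i + 1 in strong:               # walk through the contiguous peak
        i += 1
        cluster.append(i)
    best = max(cluster, key=lambda j: counts[j])
    return 1.0 / (0.5 * (edges[best] + edges[best + 1]))

# Simulated auditory-nerve array phase-locked to a 200 Hz periodicity:
rng = np.random.default_rng(0)
period = 1.0 / 200.0
trains = []
for _ in range(30):
    cycle_times = np.arange(0.0, 0.5, period)
    keep = rng.random(cycle_times.size) < 0.7      # each fiber skips some cycles
    trains.append(np.sort(cycle_times[keep] + rng.normal(0.0, 2e-4, keep.sum())))
pitch = estimate_pitch(trains)                     # close to 200 Hz
```

The shortest-strong-peak rule stands in for the pitch-frequency readout described in the abstract: the pooled distribution has nearly equal peaks at every multiple of the period, and the pitch corresponds to the fundamental one.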

    Neural coding of pitch cues in the auditory midbrain of unanesthetized rabbits

    Pitch is an important attribute of auditory perception that conveys key features of music and speech and helps listeners extract useful information from complex auditory environments. Although the psychophysics of pitch perception has been studied extensively for over a century, the underlying neural mechanisms are still poorly understood. This thesis examines pitch cues in the inferior colliculus (IC), the core processing center in the mammalian auditory midbrain that relays and transforms convergent inputs from peripheral brainstem nuclei to the auditory cortex. Previous studies have shown that the IC can encode low-frequency fluctuations in stimulus envelope that are related to pitch, but most experiments were conducted in anesthetized animals, used stimuli that evoked only weak pitch sensations, and investigated a limited frequency range. Here, we used single-neuron recordings from the IC of normal-hearing, unanesthetized rabbits in response to a comprehensive set of complex auditory stimuli to explore the role of the IC in the neural processing of pitch. We characterized three neural codes for pitch cues: a temporal code for the stimulus envelope repetition rate (ERR) below 900 Hz, a rate code for ERR between 60 and 1600 Hz, and a rate-place code for frequency components individually resolved by the cochlea that is mainly available above 800 Hz. While the temporal code and the rate-place code are inherited from the auditory periphery, the rate code for ERR has not been characterized in processing stages prior to the IC. To help interpret our experimental findings, we used computational models to show that the IC rate code for ERR likely arises via temporal interaction of multiple synaptic inputs, and thus that the IC performs a temporal-to-rate code transformation between peripheral and cortical representations of pitch cues. We also show that the IC rate-place code is robust across a 40 dB range of sound levels and is likely strengthened by inhibitory synaptic inputs. Together, these three codes could provide neural substrates for the pitch of stimuli with various temporal and spectral compositions over the entire frequency range.
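The proposed temporal-to-rate transformation can be caricatured with a toy coincidence detector: a unit that emits an event whenever several phase-locked inputs arrive within a short window fires at a rate that tracks the envelope repetition rate, even though no single input carries such a rate code. This is a sketch under assumed parameters (six inputs, a 1 ms window, 0.3 ms jitter), not the computational model used in the thesis.

```python
import numpy as np

def coincidence_rate(err, n_inputs=6, k=4, window=1e-3, dur=1.0, jitter=3e-4, seed=1):
    """Events/s of a unit firing when >= k of n_inputs spike within `window` seconds.

    Each input fires once per envelope cycle (repetition rate `err` Hz) with
    Gaussian timing jitter; a refractory period of one window limits double
    counting within a single coincidence.
    """
    rng = np.random.default_rng(seed)
    cycle_times = np.arange(0.0, dur, 1.0 / err)
    spikes = np.sort(np.concatenate(
        [cycle_times + rng.normal(0.0, jitter, cycle_times.size)
         for _ in range(n_inputs)]))
    events, i, last = 0, 0, -np.inf
    for j in range(spikes.size):
        while spikes[j] - spikes[i] > window:      # keep window [t_j - window, t_j]
            i += 1
        if j - i + 1 >= k and spikes[j] - last > window:
            events += 1
            last = spikes[j]
    return events / dur

r50, r200 = coincidence_rate(50.0), coincidence_rate(200.0)   # output rate tracks ERR
```

Because the unit's output rate follows the repetition rate of its inputs, a downstream reader needs only a rate code; once the envelope period falls below the coincidence window, the transformation saturates, loosely mirroring an upper limit on such a rate code.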

    Cellular specializations for sound localization

    One of the key elements in auditory perception is the localization of sounds in space. The major cues used for localizing sounds in the azimuthal plane have long been recognized as interaural differences in time of arrival of a sound and amplitude differences between the two ears (Rayleigh 1907; Thompson 1878). High-frequency sounds are reflected by the head and thereby produce interaural level differences (ILDs) that are used for localization. The head does not reflect low-frequency sounds, so interaural timing differences (ITDs) are used instead. One of the cell groups of the auditory brainstem, the medial superior olive (MSO), functions in sound localization by comparing ITDs between the two ears. The MSO is defined as a binaural group of cells because it integrates input from the cochlear nucleus (CN) of each ear. Afferent nerve fibers from the ipsilateral CN are restricted to laterally oriented dendrites, and inputs from the contralateral CN are segregated to medially oriented dendrites (Stotler 1953). At low to moderate sound levels, activation from each cochlear nucleus is below action potential threshold, and MSO neurons generate action potentials only when inputs from both sides arrive within a short temporal window called the coincidence detection window. Several cellular specializations exist along the auditory pathway that aid MSO cells in their ability to detect changes in ITD. These specializations include large nerve terminals and distinct organelle complexes located within terminals, which facilitate fast, well-timed inhibitory inputs to MSO cells. Very little is known about the role of inhibition in sound localization, and a proper understanding of its role depends on knowledge of the cells that impinge on the MSO and of the pharmacology and kinetics of synaptic transmission in MSO cells. In addition, the membranes of MSO cells contain specific voltage-gated potassium (Kv) channels that are known to affect membrane electrical properties, but how these channels influence ITD sensitivity is unknown. The main goal of my research was to understand these cellular specializations that contribute to neural processing of ITDs.
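The coincidence-detection scheme attributed to the MSO can be illustrated with a toy model: an output spike requires an ipsilateral and a contralateral input within a narrow window, so the cell's firing rate peaks when the interaural time difference matches its internal delay (zero, in this sketch). All parameters here (a 500 Hz input, 0.1 ms jitter, a 0.2 ms coincidence window) are illustrative assumptions, not measured values.

```python
import numpy as np

def mso_response(itd, freq=500.0, dur=0.5, jitter=1e-4, window=2e-4, seed=2):
    """Spikes/s of a model MSO cell that fires when ipsilateral and
    contralateral inputs coincide within `window` seconds. Both inputs are
    phase-locked to a `freq` Hz tone; the contralateral train is shifted by
    the interaural time difference `itd` (s)."""
    rng = np.random.default_rng(seed)
    cycle_times = np.arange(0.0, dur, 1.0 / freq)
    ipsi = cycle_times + rng.normal(0.0, jitter, cycle_times.size)
    contra = cycle_times + itd + rng.normal(0.0, jitter, cycle_times.size)
    # count ipsi spikes with at least one contra spike inside the window
    hits = sum(np.any(np.abs(contra - t) <= window) for t in ipsi)
    return hits / dur

r_best = mso_response(0.0)     # ITD matches the (zero) internal delay: strong response
r_worst = mso_response(1e-3)   # half a cycle away: inputs rarely coincide
```

The inhibitory inputs and Kv channels that are the focus of this work would enter such a model as mechanisms that sharpen and stabilize the coincidence window; the sketch only captures the basic binaural comparison.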

    The temporal pattern of impulses in primary afferents analogously encodes touch and hearing information

    An open question in neuroscience is the contribution of temporal relations between individual impulses in primary afferents to conveying sensory information. We investigated this question in touch and hearing, looking for a shared coding scheme. In both systems, we artificially induced temporally diverse afferent impulse trains and probed the evoked perceptions in human subjects using psychophysical techniques. First, we investigated whether the temporal structure of a fixed number of impulses conveys information about the magnitude of tactile intensity. We found that clustering the impulses into periodic bursts elicited graded increases of intensity as a function of burst impulse count, even though fewer afferents were recruited throughout the longer bursts. The interval between successive bursts of peripheral neural activity (the burst-gap) has been demonstrated in our lab to be the most prominent temporal feature for coding skin vibration frequency, as opposed to either spike rate or periodicity. Second, given the similarities between the tactile and auditory systems, we explored the auditory system for an equivalent neural coding strategy. Using brief acoustic pulses, we showed that the burst-gap is a temporal code for pitch perception shared between the modalities. Following this evidence of parallels in temporal frequency processing, we next assessed the perceptual frequency equivalence between the two modalities using auditory and tactile pulse stimuli of simple and complex temporal features in cross-sensory frequency discrimination experiments. Identical temporal stimulation patterns in tactile and auditory afferents produced equivalent perceived frequencies, suggesting an analogous temporal frequency computation mechanism. The new insights into encoding tactile intensity through clustering of fixed-charge electric pulses into bursts suggest a novel approach to conveying varying contact forces to neural interface users, requiring no modulation of either stimulation current or base pulse frequency. Increasing control of the temporal patterning of pulses in cochlear implant users might improve pitch perception and speech comprehension. The perceptual correspondence between touch and hearing not only suggests the possibility of establishing cross-modal comparison standards for robust psychophysical investigations, but also supports the plausibility of cross-sensory substitution devices.
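The distinction between the candidate temporal codes (overall spike rate, burst periodicity, and the burst-gap) can be made concrete with a toy analysis: two pulse trains that share the same burst-gap but differ in pulses per burst have different rates and periodicities, yet a burst-gap readout assigns them the same frequency. This is an illustrative sketch; the pulse counts and intervals below are arbitrary choices, not the study's stimuli.

```python
import numpy as np

def burst_train(n_per_burst, intra_interval, gap, n_bursts):
    """Spike times (s) for periodic bursts of n_per_burst pulses spaced by
    intra_interval, with a silent `gap` between the last pulse of one burst
    and the first pulse of the next."""
    times, t = [], 0.0
    for _ in range(n_bursts):
        burst = [t + i * intra_interval for i in range(n_per_burst)]
        times.extend(burst)
        t = burst[-1] + gap
    return np.array(times)

def temporal_features(spikes, split=2e-3):
    """Three candidate frequency codes for a burst train (ISIs > split count as gaps)."""
    isis = np.diff(spikes)
    gap = isis[isis > split].mean()                  # mean burst-gap
    onsets = spikes[np.r_[True, isis > split]]       # first spike of each burst
    return {"spike_rate": (len(spikes) - 1) / (spikes[-1] - spikes[0]),
            "periodicity": 1.0 / np.diff(onsets).mean(),
            "inv_gap": 1.0 / gap}

# Same 5 ms burst-gap, different pulses per burst:
fa = temporal_features(burst_train(2, 1e-3, 5e-3, 20))
fb = temporal_features(burst_train(4, 1e-3, 5e-3, 20))
```

Here fa and fb disagree on spike rate and on burst periodicity but agree on inv_gap (200 Hz), which is the pattern of invariance the perceptual results point to: frequency judgments followed the gap, not the rate or the period.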