
    Advantages of two-ear listening for speech degraded by noise and reverberation

    Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2005. Includes bibliographical references (leaves 66-69). The current study investigates how the spatial locations of a target and masker influence consonant identification in anechoic and reverberant space. Reverberation was expected to interfere with the task both directly, by degrading consonant identification, and indirectly, by altering interaural cues and decreasing spatial unmasking. Performance was measured as a function of target-to-masker ratio (TMR) to obtain multiple points along the psychometric function. Results suggest that for consonant identification there is little spatial unmasking; however, in reverberant environments, performance improves with binaural listening even when the target and masker give rise to roughly the same interaural cues. It is hypothesized that the time-varying changes in TMR at both ears that result from reverberation can lead to such binaural listening advantages. The behavioral results are discussed with respect to an acoustic analysis that quantifies the expected improvement of binaural listening over monaural listening using an "independent looks" approach. by Sasha Devore. M.Eng.
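
    The "independent looks" idea can be sketched in a few lines: in each short time frame, an ideal binaural listener is assumed to use whichever ear momentarily has the better TMR, while a monaural listener is stuck with one ear. This is an illustrative reconstruction under those assumptions, not the thesis code; the frame-by-frame TMR values below are invented.

```python
# Hedged sketch of an "independent looks" estimate of binaural advantage.
# Assumption: in each time frame the binaural listener uses whichever ear
# has the higher target-to-masker ratio (TMR, in dB).

def better_ear_tmr(left_tmr_db, right_tmr_db):
    """Frame-by-frame TMR (dB) available to an ideal binaural listener."""
    return [max(l, r) for l, r in zip(left_tmr_db, right_tmr_db)]

def mean_advantage_db(left_tmr_db, right_tmr_db):
    """Average binaural advantage over the better single ear."""
    binaural = better_ear_tmr(left_tmr_db, right_tmr_db)
    best_monaural = max(
        sum(left_tmr_db) / len(left_tmr_db),
        sum(right_tmr_db) / len(right_tmr_db),
    )
    return sum(binaural) / len(binaural) - best_monaural

# Reverberation makes the per-ear TMR fluctuate somewhat independently,
# so the better-ear track can exceed either single ear on average.
left = [0.0, -6.0, 3.0, -2.0]   # invented per-frame TMRs, left ear
right = [-4.0, 2.0, -5.0, 1.0]  # invented per-frame TMRs, right ear
print(round(mean_advantage_db(left, right), 2))
```

    The advantage is zero when the two ears carry identical TMR tracks, which is why the binaural benefit reported above survives even when average interaural cues are matched.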

    Neural correlates and mechanisms of sound localization in everyday reverberant settings

    Thesis (Ph. D.)--Harvard-MIT Division of Health Sciences and Technology, 2009. Cataloged from PDF version of thesis. Includes bibliographical references (p. 161-176). Nearly all listening environments, indoors and outdoors alike, are full of boundary surfaces (e.g., walls, trees, and rocks) that produce acoustic reflections. These reflections interfere with the direct sound arriving at a listener's ears, distorting the binaural cues for sound localization. Yet human listeners have little difficulty localizing sounds in most settings. This thesis addresses fundamental questions regarding the neural basis of sound localization in everyday reverberant environments. In the first set of experiments, we investigate the effects of reverberation on the directional sensitivity of low-frequency auditory neurons sensitive to interaural time differences (ITD), the principal cue for localizing sounds containing low-frequency energy. Because reverberant energy builds up over time, the source location is represented relatively faithfully during the early portion of a sound, but this representation becomes increasingly degraded later in the stimulus. We show that the directional sensitivity of ITD-sensitive neurons in the auditory midbrain of anesthetized cats and awake rabbits follows a similar time course. However, the tendency of neurons to fire preferentially at the onset of a stimulus results in more robust directional sensitivity than expected, suggesting a simple mechanism for improving directional sensitivity in reverberation. To probe the role of temporal response dynamics, we use a conditioning paradigm to systematically alter the temporal response patterns of single neurons. Results suggest that making temporal response patterns less onset-dominated typically leads to poorer directional sensitivity in reverberation.
    In parallel behavioral experiments, we show that human lateralization judgments are consistent with predictions from a population rate model for decoding the observed midbrain responses, suggesting a subcortical origin for robust sound localization in reverberant environments. In the second part of the thesis, we examine the effects of reverberation on the directional sensitivity of neurons across the tonotopic axis in the awake rabbit auditory midbrain. We find that reverberation degrades the directional sensitivity of single neurons, although the amount of degradation depends on the characteristic frequency and the type of binaural cues available. When ITD is the only available directional cue, low-frequency neurons sensitive to ITD in the fine time structure maintain better directional sensitivity in reverberation than high-frequency neurons sensitive to ITD in the envelope. On the other hand, when both ITD and interaural level difference (ILD) cues are available, directional sensitivity is comparable throughout the tonotopic axis, suggesting that, at high frequencies, ILDs provide better directional information than envelope ITDs in reverberation. These findings can account for results from human psychophysical studies of spatial hearing in reverberant environments. This thesis marks fundamental progress towards elucidating the neural basis for spatial hearing in everyday settings. Overall, our results suggest that the information contained in the rate responses of neurons in the auditory midbrain is sufficient to account for human sound localization in reverberant environments. by Sasha Devore. Ph.D.

    Perceptual consequences of including reverberation in spatial auditory displays

    Proceedings of the 9th International Conference on Auditory Display (ICAD), Boston, MA, July 7-9, 2003. This paper evaluates the perceptual consequences of including reverberation in spatial auditory displays for rapidly-varying signals (obstruent consonants). Preliminary results suggest that the effect of reverberation depends on both syllable position and reverberation characteristics. As many of the non-speech sounds in an auditory display share acoustic features with obstruent consonants, these results are important when designing spatial auditory displays for non-speech signals as well.

    Accurate Sound Localization in Reverberant Environments Is Mediated by Robust Encoding of Spatial Cues in the Auditory Midbrain

    In reverberant environments, acoustic reflections interfere with the direct sound arriving at a listener's ears, distorting the spatial cues for sound localization. Yet, human listeners have little difficulty localizing sounds in most settings. Because reverberant energy builds up over time, the source location is represented relatively faithfully during the early portion of a sound, but this representation becomes increasingly degraded later in the stimulus. We show that the directional sensitivity of single neurons in the auditory midbrain of anesthetized cats follows a similar time course, although onset dominance in temporal response patterns results in more robust directional sensitivity than expected, suggesting a simple mechanism for improving directional sensitivity in reverberation. In parallel behavioral experiments, we demonstrate that human lateralization judgments are consistent with predictions from a population rate model decoding the observed midbrain responses, suggesting a subcortical origin for robust sound localization in reverberant environments. National Institutes of Health (U.S.) (Grant R01 DC002258); National Institutes of Health (U.S.) (Grant R01 DC05778-02); National Institutes of Health (U.S.) (Eaton-Peabody Laboratory core grant P30 DC005209); National Institutes of Health (U.S.) (Grant T32 DC0003
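
    The population-rate readout mentioned here can be illustrated with a minimal two-channel sketch, assuming the common simplification that lateral position is decoded from the difference between the summed rates of the two midbrain hemispheres. The sigmoidal rate-versus-ITD functions and their parameters below are invented for illustration; they are not the model fitted in the paper.

```python
import math

# Hedged sketch of a two-channel population rate decoder: lateral position
# is read out from the difference between the summed firing rates of
# ITD-sensitive populations in the two midbrain hemispheres. Each channel
# is modeled as a sigmoid of ITD with an invented slope.

def hemisphere_rate(itd_us, preferred_sign, slope=0.01):
    """Normalized population rate of one hemisphere's ITD-sensitive pool.
    preferred_sign = +1 for a channel tuned to right-leading ITDs."""
    return 1.0 / (1.0 + math.exp(-slope * preferred_sign * itd_us))

def decode_laterality(itd_us):
    """Signed laterality estimate: positive means a right-leading source."""
    right_channel = hemisphere_rate(itd_us, +1)
    left_channel = hemisphere_rate(itd_us, -1)
    return right_channel - left_channel

# The rate difference grows monotonically with ITD magnitude.
for itd in (-500, 0, 500):  # ITD in microseconds
    print(itd, round(decode_laterality(itd), 3))
```

    Because the readout depends only on average rates, not spike timing details, it can be compared directly with human lateralization judgments, which is the logic behind the subcortical-origin claim above.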

    Internal Cholinergic Regulation of Learning and Recall in a Model of Olfactory Processing

    In the olfactory system, cholinergic modulation has been associated with contrast modulation and changes in receptive fields in the olfactory bulb, as well as with the learning of odor associations in olfactory cortex. Computational modeling and behavioral studies suggest that cholinergic modulation could improve sensory processing and learning while preventing proactive interference when task demands are high. However, how sensory inputs and/or learning regulate incoming modulation has not yet been elucidated. Here we use a computational model of the olfactory bulb, piriform cortex (PC), and horizontal limb of the diagonal band of Broca (HDB) to explore how olfactory learning could regulate cholinergic inputs to the system in a closed feedback loop. In our model, the novelty of an odor is reflected in the firing rates and sparseness of cortical neurons responding to that odor, and these firing rates can directly regulate learning in the system by modifying cholinergic inputs. In the model, cholinergic neurons reduce their firing in response to familiar odors, reducing plasticity in the PC, but increase their firing in response to novel odors, increasing PC plasticity. Recordings from HDB neurons in awake behaving rats match predictions from the model by showing that a subset of neurons decrease their firing as an odor becomes familiar.
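
    The closed feedback loop described above can be caricatured in a few lines: cortical novelty drives cholinergic (HDB) output, which gates plasticity, which in turn reduces novelty on later exposures. The variables, the linear update rule, and all numbers below are assumptions for illustration, not the published model.

```python
# Hedged caricature of the closed cholinergic loop: novelty -> ACh ->
# plasticity -> familiarity -> reduced novelty on the next exposure.
# The linear update rule and the gain are illustrative assumptions.

def cholinergic_drive(novelty):
    """HDB output rises with cortical novelty (0 = familiar, 1 = novel)."""
    return max(0.0, min(1.0, novelty))

def present_odor(familiarity, learning_gain=0.5):
    """One odor exposure: ACh gates learning, which raises familiarity."""
    novelty = 1.0 - familiarity
    ach = cholinergic_drive(novelty)
    # Plasticity (and hence the familiarity increment) scales with ACh,
    # so learning slows down automatically as the odor becomes familiar.
    return familiarity + learning_gain * ach * novelty

familiarity = 0.0
for trial in range(5):
    familiarity = present_odor(familiarity)
    print(trial, round(familiarity, 3))
```

    The declining ACh signal across trials mirrors the reported finding that a subset of HDB neurons decrease their firing as an odor becomes familiar.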

    Identification of Specific Circular RNA Expression Patterns and MicroRNA Interaction Networks in Mesial Temporal Lobe Epilepsy

    Circular RNAs (circRNAs) regulate mRNA translation by binding to microRNAs (miRNAs), and their expression is altered in diverse disorders, including cancer, cardiovascular disease, and Parkinson’s disease. Here, we compare circRNA expression patterns in the temporal cortex and hippocampus of patients with pharmacoresistant mesial temporal lobe epilepsy (MTLE) and healthy controls. Nine circRNAs showed significant differential expression, including circRNA-HOMER1, which is expressed in synapses. Further, we identified miRNA binding sites within the sequences of differentially expressed (DE) circRNAs; expression levels of mRNAs correlated with changes in complementary miRNAs. Gene set enrichment analysis of mRNA targets revealed functions in heterocyclic compound binding, regulation of transcription, and signal transduction, which maintain the structure and function of hippocampal neurons. The circRNA–miRNA–mRNA interaction networks illuminate the molecular changes in MTLE, which may be pathogenic or an effect of the disease or its treatments, and suggest that DE circRNAs and associated miRNAs may be novel therapeutic targets.
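
    One way to read the circRNA-miRNA pairing step: a circRNA that sponges a miRNA should carry a predicted binding site for it and show expression inversely correlated with it across samples. The sketch below is a generic illustration of that screening logic, not the authors' pipeline; all identifiers, expression values, and thresholds are invented.

```python
import math

# Hedged sketch of a circRNA-miRNA screening step: keep candidate pairs
# where the circRNA has a predicted binding site for the miRNA and their
# expression is strongly negatively correlated across samples.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def candidate_pairs(circ_expr, mirna_expr, binding_sites, r_cutoff=-0.7):
    """Return (circRNA, miRNA) pairs with a binding site and r <= cutoff."""
    pairs = []
    for circ, mirnas in binding_sites.items():
        for mir in mirnas:
            r = pearson(circ_expr[circ], mirna_expr[mir])
            if r <= r_cutoff:
                pairs.append((circ, mir))
    return pairs

# Toy expression profiles across four samples (invented values).
circ_expr = {"circA": [1.0, 2.0, 3.0, 4.0]}
mirna_expr = {"miR-x": [4.0, 3.0, 2.0, 1.0], "miR-y": [1.0, 2.0, 3.0, 4.0]}
binding = {"circA": ["miR-x", "miR-y"]}
print(candidate_pairs(circ_expr, mirna_expr, binding))
```

    Pairs surviving this screen would then feed into the kind of circRNA-miRNA-mRNA network and enrichment analysis summarized above.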

    Odor preferences shape discrimination learning in rats.



    with distance

    and behavioral sensitivities to azimuth degrad