75 research outputs found

    Periodotopy in the gerbil inferior colliculus: local clustering rather than a gradient map

    Periodicities in sound waveforms are widespread, and shape important perceptual attributes of sound including rhythm and pitch. Previous studies have indicated that, in the inferior colliculus (IC), a key processing stage in the auditory midbrain, neurons tuned to different periodicities might be arranged along a periodotopic axis which runs approximately orthogonal to the tonotopic axis. Here we map out the topography of frequency and periodicity tuning in the IC of gerbils in unprecedented detail, using pure tones and different periodic sounds, including click trains, sinusoidally amplitude modulated (SAM) noise and iterated rippled noise. We found that while the tonotopic map exhibited a clear and highly reproducible gradient across all animals, periodotopic maps varied greatly across different types of periodic sound and from animal to animal. Furthermore, periodotopic gradients typically explained only about 10% of the variance in modulation tuning between recording sites. However, there was a strong local clustering of periodicity tuning at a spatial scale of ca. 0.5 mm, which also differed from animal to animal.
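    The "variance explained by a gradient" statistic in this abstract can be illustrated with a short sketch: fit a planar (linear) gradient to periodicity-tuning values at 3-D recording coordinates by least squares, and report the R². The data here are synthetic and purely illustrative, not the study's recordings or analysis code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic recording sites: 3-D coordinates (mm) and log best
    # modulation frequencies (BMFs). A weak spatial gradient plus
    # large site-to-site scatter -- an assumption for illustration.
    coords = rng.uniform(0.0, 2.0, size=(200, 3))            # x, y, z in mm
    bmf = 0.3 * coords[:, 0] + rng.normal(0.0, 0.9, 200)     # gradient + noise

    # Fit a planar gradient: bmf ~ b0 + b . coords (ordinary least squares)
    X = np.column_stack([np.ones(len(bmf)), coords])
    beta, *_ = np.linalg.lstsq(X, bmf, rcond=None)
    pred = X @ beta

    # Fraction of variance explained by the best-fitting gradient (R^2)
    r2 = 1.0 - np.sum((bmf - pred) ** 2) / np.sum((bmf - bmf.mean()) ** 2)
    print(f"gradient explains {100 * r2:.1f}% of variance")
    ```

    With noisy, weakly gradient-like data such as this, the planar fit captures only a small fraction of the variance, which is the kind of result the abstract reports.
    
    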

    Feel it in my bones: Composing multimodal experience through tissue conduction

    We outline here the feasibility of coherently utilising tissue conduction for spatial audio and tactile input. Tissue conduction display-specific compositional concerns are discussed; it is hypothesised that the qualia available through this medium substantively differ from those for conventional artificial means of appealing to auditory spatial perception. The implications include that spatial music experienced in this manner constitutes a new kind of experience, and that the ground rules of composition are yet to be established. We refer to results from listening experiences with one hundred listeners in an unstructured attribute elicitation exercise, where prominent themes such as “strange”, “weird”, “positive”, “spatial” and “vibrations” emerged. We speculate on future directions aimed at taking maximal advantage of the principle of multimodal perception to broaden the informational bandwidth of the display system. Some implications for composition for the hearing-impaired are elucidated.

    Acoustic Cues for Sound Source Distance and Azimuth in Rabbits, a Racquetball and a Rigid Spherical Model

    There are numerous studies measuring the transfer functions representing signal transformation between a source and each ear canal, i.e., the head-related transfer functions (HRTFs), for various species. However, only a handful of these address the effects of sound source distance on HRTFs. This is the first study of HRTFs in the rabbit where the emphasis is on the effects of sound source distance and azimuth on HRTFs. With the rabbit placed in an anechoic chamber, we made acoustic measurements with miniature microphones placed deep in each ear canal to a sound source at different positions (10–160 cm distance, ±150° azimuth). The sound was a logarithmically swept broadband chirp. For comparisons, we also obtained the HRTFs from a racquetball and a computational model for a rigid sphere. We found that (1) the spectral shape of the HRTF in each ear changed with sound source location; (2) interaural level difference (ILD) increased with decreasing distance and with increasing frequency, and ILDs can be substantial even at low frequencies when the source is close; and (3) interaural time difference (ITD) decreased with decreasing distance and generally increased with decreasing frequency. The observations in the rabbit were reproduced, in general, by those in the racquetball, albeit greater in magnitude in the rabbit. In the sphere model, the results were partly similar to and partly different from those in the racquetball and the rabbit. These findings refute the common notions that ILD is negligible at low frequencies and that ITD is constant across frequency. These misconceptions became evident when distance-dependent changes were examined.
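    The frequency dependence of ITD that this abstract highlights falls out of the standard rigid-sphere model: the classic high-frequency (ray-acoustics) Woodworth formula gives a smaller ITD than the low-frequency (diffraction) limit. The sketch below is a textbook spherical-head calculation, not the study's model code, and the 2.5 cm radius is a generic assumed sphere size.

    ```python
    import math

    def itd_sphere(azimuth_deg, radius_m=0.025, c=343.0, low_freq=True):
        """ITD predicted for a rigid sphere of radius a at speed of sound c.

        High-frequency limit (Woodworth ray model):
            ITD = (a / c) * (theta + sin(theta))
        Low-frequency limit (diffraction regime):
            ITD = (3a / c) * sin(theta)

        The low-frequency ITD is larger, consistent with ITDs that
        grow as frequency decreases.
        """
        theta = math.radians(azimuth_deg)
        a_over_c = radius_m / c
        if low_freq:
            return 3.0 * a_over_c * math.sin(theta)
        return a_over_c * (theta + math.sin(theta))

    # ITDs (in microseconds) at 90 degrees azimuth for the two limits
    itd_low = itd_sphere(90.0, low_freq=True) * 1e6    # low-frequency limit
    itd_high = itd_sphere(90.0, low_freq=False) * 1e6  # high-frequency limit
    print(round(itd_low), round(itd_high))
    ```

    For a far-field source the low-frequency limit exceeds the high-frequency one at every azimuth; near-field distance effects, which the study measures directly, require the full sphere solution rather than these limiting formulas.
    
    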

    Egocentric and allocentric representations in auditory cortex

    A key function of the brain is to provide a stable representation of an object’s location in the world. In hearing, sound azimuth and elevation are encoded by neurons throughout the auditory system, and auditory cortex is necessary for sound localization. However, the coordinate frame in which neurons represent sound space remains undefined: classical spatial receptive fields in head-fixed subjects can be explained either by sensitivity to sound source location relative to the head (egocentric encoding) or relative to the world (allocentric encoding). This coordinate frame ambiguity can be resolved by studying freely moving subjects; here we recorded spatial receptive fields in the auditory cortex of freely moving ferrets. We found that most spatially tuned neurons represented sound source location relative to the head across changes in head position and direction. We also recorded a small number of neurons in which sound location was represented in a world-centered coordinate frame. We used measurements of spatial tuning across changes in head position and direction to explore the influence of sound source distance and speed of head movement on auditory cortical activity and spatial tuning. Modulation depth of spatial tuning increased with distance for egocentric but not allocentric units, whereas, for both populations, modulation was stronger at faster movement speeds. Our findings suggest that early auditory cortex primarily represents sound source location relative to ourselves but that a minority of cells can represent sound location in the world independent of our own position.
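    The egocentric/allocentric distinction at the heart of this abstract is a coordinate transform: the same world (allocentric) source position maps to different head-relative (egocentric) azimuths as the head moves. A toy 2-D illustration of that transform, not the study's analysis code:

    ```python
    import math

    def egocentric_azimuth(source_xy, head_xy, head_dir_deg):
        """Sound azimuth relative to the head (egocentric), given the
        world-frame (allocentric) positions of source and head plus the
        head's facing direction in degrees."""
        dx = source_xy[0] - head_xy[0]
        dy = source_xy[1] - head_xy[1]
        world_az = math.degrees(math.atan2(dy, dx))  # allocentric bearing
        az = world_az - head_dir_deg                 # rotate into head frame
        return (az + 180.0) % 360.0 - 180.0          # wrap to (-180, 180]

    # Same world location, two head directions -> different egocentric azimuths
    az_facing = egocentric_azimuth((1.0, 0.0), (0.0, 0.0), 0.0)   # head faces source
    az_turned = egocentric_azimuth((1.0, 0.0), (0.0, 0.0), 90.0)  # head turned left
    print(az_facing, az_turned)
    ```

    An egocentric neuron's tuning follows the output of this transform as the head turns, whereas an allocentric neuron's tuning stays anchored to the world-frame bearing regardless of head direction.
    
    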

    Audiotactile interactions in temporal perception


    Auditory neuroscience: sound segregation in the brainstem?

    Separating a mixture of sounds into its constituent parts is a complex process likely to involve many processing stages. A new study suggests that the first steps in that process may already occur at the level of the first auditory processing centre in the brainstem.

    Auditory neuroscience: neuronal sensitivity in humans.

    Microelectrode recordings from the human auditory cortex suggest that the tuning of individual neurons can account for sound frequency discrimination thresholds, and that this tuning varies in a context-dependent fashion with the type of sound used to measure it.