
    Measuring vowel percepts in human listeners with behavioral response-triggered averaging

    A vowel can be largely defined by the frequencies of its first two formants, but the absolute frequencies for a given vowel vary from talker to talker and utterance to utterance. Given this variability, it is unclear what criteria listeners use to identify vowels. To estimate the vowel features for which people listen, we adapted a noise-based reverse-correlation method from auditory neurophysiological studies and vision research (Gold et al., 1999). Listeners presented with the stimulus, which had a random spectrum with levels in 60 frequency bins changing every 0.5 s, were asked to press a key whenever they heard the vowels [a] or [i:]. Reverse-correlation was used to average the spectrum of the noise prior to each key press, thus estimating the features of the vowels for which the participants were listening. The formant frequencies of these reverse-correlated vowels were similar to those of their respective whispered vowels. The success of this response-triggered technique suggests that it may prove useful for estimating other internal representations, including perceptual phenomena like tinnitus. References: Gold, J., Bennett, P. J., and Sekuler, A. B. (1999). “Identification of band-pass filtered faces and letters by human and ideal observers,” Vis. Res. 39(21), 3537–3560
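The response-triggered averaging described above can be sketched in a toy simulation. Everything specific below is an illustrative assumption, not a value from the study: the listener is modeled as a template matcher, the "formant" bins 8 and 30 are arbitrary, and zero response lag is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated version of the stimulus: noise with levels in 60 frequency
# bins, redrawn every 0.5 s (one row per 0.5 s frame).
n_frames, n_bins = 20000, 60
spectra_db = rng.uniform(-10.0, 10.0, size=(n_frames, n_bins))

# Hypothetical internal vowel template: the simulated listener presses a
# key whenever the current frame matches it well. Bins 8 and 30 stand in
# for two formant peaks and are illustrative, not values from the paper.
template = np.zeros(n_bins)
template[[8, 30]] = 1.0
scores = spectra_db @ template
presses = np.flatnonzero(scores > np.quantile(scores, 0.99))

# Behavioral response-triggered averaging: average the noise spectra at
# the frames that triggered presses (zero response lag assumed) to
# recover the listener's internal template.
recovered = spectra_db[presses].mean(axis=0)
top_bins = sorted(int(b) for b in np.argsort(recovered)[-2:])
print(top_bins)  # recovers the template's formant bins: [8, 30]
```

With a few hundred presses the average noise spectrum peaks sharply at the template bins, which is the same logic that lets the averaged pre-press spectra approximate the vowel the listener was monitoring for.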

    The internal representation of vowel spectra investigated using behavioral response-triggered averaging

    Listeners presented with noise were asked to press a key whenever they heard the vowels [a] or [i:]. The noise had a random spectrum, with levels in 60 frequency bins changing every 0.5 s. Reverse correlation was used to average the spectrum of the noise prior to each key press, thus estimating the features of the vowels for which the participants were listening. The formant frequencies of these reverse-correlated vowels were similar to those of their respective whispered vowels. The success of this response-triggered technique suggests that it may prove useful for estimating other internal representations, including perceptual phenomena like tinnitus

    The minimum monitoring signal-to-noise ratio for off-axis signals and its implications for directional hearing aids

    The signal-to-noise ratio (SNR) benefit of hearing aid directional microphones is dependent on the angle of the listener relative to the target, something that can change drastically and dynamically in a typical group conversation. When a new target signal is significantly off-axis, directional microphones lead to slower target orientation, more complex movements, and more reversals. This raises the question of whether there is an optimal design for directional microphones. In principle an ideal microphone would provide the user with sufficient directionality to help with speech understanding, but not attenuate off-axis signals so strongly that orienting to new signals was difficult or impossible. We investigated the latter part of this question. In order to measure the minimal monitoring SNR for reliable orientation to off-axis signals, we measured head-orienting behaviour towards targets of varying SNRs and locations for listeners with mild to moderate bilateral symmetrical hearing loss. Listeners were required to turn and face a female talker in background noise and movements were tracked using a head-mounted crown and infrared system that recorded yaw in a ring of loudspeakers. The target appeared randomly at ± 45, 90 or 135° from the start point. The results showed that as the target SNR decreased from 0 dB to −18 dB, first movement duration and initial misorientation count increased, then fixation error, and finally reversals increased. Increasing the target angle increased movement duration at all SNRs, decreased reversals (above −12 dB target SNR), and had little to no effect on initial misorientations. These results suggest that listeners experience some difficulty orienting towards sources as the target SNR drops below −6 dB, and that if one intends to make a directional microphone that is usable in a moving conversation, then off-axis attenuation should be no more than 12 dB

    Dynamic torso reflection filtering for interactive binaural spatial audio based on biologically constrained IMU drift compensation

    An audio system uses biologically constrained IMU drift compensation for spatial audio rendering to drive a dynamic filtering process that better reproduces the acoustic effects of head-on-torso orientation on the HRTF

    The moving minimum audible angle is smaller during self motion than during source motion

    We are rarely perfectly still: our heads rotate in three axes and move in three dimensions, constantly varying the spectral and binaural cues at the ear drums. In spite of this motion, static sound sources in the world are typically perceived as stable objects. This argues that the auditory system, in a manner not unlike the vestibulo-ocular reflex, works to compensate for self motion and stabilize our sensory representation of the world. We tested a prediction arising from this postulate: that self motion should be processed more accurately than source motion. We used an infrared motion tracking system to measure head angle, and real-time interpolation of head related impulse responses to create "head-stabilized" signals that appeared to remain fixed in space as the head turned. After being presented with pairs of simultaneous signals consisting of a man and a woman speaking a snippet of speech, normal and hearing impaired listeners were asked to report whether the female voice was to the left or the right of the male voice. In this way we measured the moving minimum audible angle (MMAA). This measurement was made while listeners were asked to turn their heads back and forth between ± 15° and the signals were stabilized in space. After this "self-motion" condition we measured MMAA in a second "source-motion" condition when listeners remained still and the virtual locations of the signals were moved using the trajectories from the first condition. For both normal and hearing impaired listeners, we found that the MMAA for signals moving relative to the head was ~1-2° smaller when the movement was the result of self motion than when it was the result of source motion, even though the motion with respect to the head was identical. These results as well as the results of past experiments suggest that spatial processing involves an ongoing and highly accurate comparison of spatial acoustic cues with self-motion cues
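The "head-stabilized" rendering described above rests on a simple relation: a source that is fixed in the world must be rendered at its world azimuth minus the current head yaw. A minimal sketch of that relation, with the wrapping convention ([-180, 180) degrees) chosen here as an assumption:

```python
def head_relative_azimuth(world_az_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of a world-fixed source relative to the head, in degrees,
    wrapped to [-180, 180)."""
    rel = world_az_deg - head_yaw_deg
    return (rel + 180.0) % 360.0 - 180.0

# As the head sweeps between -15 and +15 deg, a source fixed at 30 deg in
# the world moves from 45 deg to 15 deg relative to the head; selecting
# head-related impulse responses at this relative angle, frame by frame,
# makes the source appear stationary in space.
print(head_relative_azimuth(30.0, -15.0))  # 45.0
print(head_relative_azimuth(30.0, 15.0))   # 15.0
```

In the study the head yaw comes from the motion tracker and the impulse responses are interpolated in real time; the function above only captures the coordinate transform at the heart of that pipeline.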

    Biomimetic direction of arrival estimation for resolving front-back confusions in hearing aids

    Sound sources at the same angle in front or behind a two-microphone array (e.g., bilateral hearing aids) produce the same time delay and two estimates for the direction of arrival: A front-back confusion. The auditory system can resolve this issue using head movements. To resolve front-back confusion for hearing-aid algorithms, head movement was measured using an inertial sensor. Successive time-delay estimates between the microphones are shifted clockwise and counterclockwise by the head movement between estimates and aggregated in two histograms. The histogram with the largest peak after multiple estimates predicted the correct hemifield for the source, eliminating the front-back confusions
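The hemifield-disambiguation idea can be sketched numerically. The head-movement trace, estimation-noise level, and histogram bin width below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# A time-delay estimate between bilateral microphones is consistent with
# two directions of arrival: one in the front hemifield and its mirror
# in the back hemifield.
true_world_az = 40.0                               # source actually in front (deg)
head_yaw = np.cumsum(rng.uniform(-5.0, 5.0, 200))  # simulated head-movement trace

# Head-relative angle under each hypothesis (2 deg of estimation noise).
front_rel = true_world_az - head_yaw + rng.normal(0.0, 2.0, head_yaw.size)
back_rel = 180.0 - front_rel                       # mirrored (wrong) hypothesis


def wrap(a):
    """Wrap angles in degrees to [-180, 180)."""
    return (a + 180.0) % 360.0 - 180.0


# Shift each estimate back into world coordinates by the head yaw at the
# time of the estimate, then aggregate each hypothesis in a histogram.
bins = np.arange(-180, 181, 5)
front_hist, _ = np.histogram(wrap(front_rel + head_yaw), bins)
back_hist, _ = np.histogram(wrap(back_rel + head_yaw), bins)

# The consistent hypothesis piles up in one narrow bin as the head moves;
# the mirrored hypothesis smears, so the larger peak picks the hemifield.
print("front" if front_hist.max() > back_hist.max() else "back")
```

The key property is that only the correct hemifield hypothesis stays consistent in world coordinates as the head turns; the mirrored hypothesis drifts with twice the head rotation and spreads across many bins.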

    Influence of microphone housing on the directional response of piezoelectric MEMS microphones inspired by Ormia ochracea

    The influence of custom microphone housings on the acoustic directionality and frequency response of a multiband bio-inspired MEMS microphone is presented. The 3.2 mm by 1.7 mm piezoelectric MEMS microphone, fabricated by a cost-effective multi-user process, has four frequency bands of operation below 10 kHz, with a desired first-order directionality for all four bands. 7×7×2.5 mm³ 3-D-printed bespoke housings with varying acoustic access to the backside of the microphone membrane are investigated through simulation and experiment with respect to their influence on the directionality and frequency response to sound stimulus. Results show a clear link between directionality and acoustic access to the back cavity of the microphone. Furthermore, there was a change in direction of the first-order directionality with reduced height in this back cavity acoustic access. The required configuration for creating an identical directionality for all four frequency bands is investigated along with the influence of reducing the symmetry of the acoustic back cavity access. This paper highlights the overall requirement of considering housing geometries and their influence on acoustic behavior for bio-inspired directional microphones

    Auditory compensation for head rotation is incomplete

    Hearing is confronted by a similar problem to vision when the observer moves. The image motion that is created remains ambiguous until the observer knows the velocity of eye and/or head. One way the visual system solves this problem is to use motor commands, proprioception and vestibular information. These ‘extra-retinal signals’ compensate for self movement, converting image motion into head-centred coordinates, though not always perfectly. We investigated whether the auditory system also transforms coordinates by examining the degree of compensation for head rotation when judging a moving sound. Real-time recordings of head motion were used to change the ‘movement gain’ relating head movement to source movement across a loudspeaker array. We then determined psychophysically the gain that corresponded to a perceptually-stationary source. Experiment 1 showed that the gain was small and positive for a wide range of trained head speeds. Hence listeners perceived a stationary source as moving slightly opposite to the head rotation, in much the same way that observers see stationary visual objects move against a smooth pursuit eye movement. Experiment 2 showed the degree of compensation remained the same for sounds presented at different azimuths, although the precision of performance declined when the sound was eccentric. We discuss two possible explanations for incomplete compensation, one based on differences in the accuracy of signals encoding image motion and self-movement, and one concerning statistical optimisation that sacrifices accuracy for precision. We then consider the degree to which such explanations can be applied to auditory motion perception in moving listeners
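The "movement gain" manipulation can be written as a one-line relation between head rotation and source displacement. The gain value used below is hypothetical, chosen only to illustrate a small positive point of subjective stationarity like the one the study reports:

```python
def source_world_azimuth(start_az_deg: float, head_yaw_deg: float,
                         movement_gain: float) -> float:
    """World azimuth of the source after a head turn, under a given
    movement gain g: the source is displaced by g times the head yaw.
    g = 0 is a truly world-stationary source."""
    return start_az_deg + movement_gain * head_yaw_deg

# With a hypothetical gain of 0.1 at the point of subjective
# stationarity, a 20 deg head turn drags the "perceptually stationary"
# source 2 deg in the direction of the turn; equivalently, a truly
# stationary source (g = 0) appears to move slightly against the turn.
print(source_world_azimuth(0.0, 20.0, 0.1))  # 2.0
```

Psychophysically, the experiment varies this gain across trials and finds the value at which listeners report no source motion; compensation is complete only if that value is exactly zero.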

    The effect of hearing aid microphone mode on performance in an auditory orienting task

    OBJECTIVES: Although directional microphones on a hearing aid provide a signal-to-noise ratio benefit in a noisy background, the amount of benefit is dependent on how close the signal of interest is to the front of the user. It is assumed that when the signal of interest is off-axis, users can reorient themselves to the signal to make use of the directional microphones to improve signal-to-noise ratio. The present study tested this assumption by measuring the head-orienting behavior of bilaterally fit hearing-impaired individuals with their microphones set to omnidirectional and directional modes. The authors hypothesized that listeners using directional microphones would have greater difficulty in rapidly and accurately orienting to off-axis signals than they would when using omnidirectional microphones. DESIGN: The authors instructed hearing-impaired individuals to turn and face a female talker in simultaneous surrounding male-talker babble. Participants pressed a button when they felt they were accurately oriented in the direction of the female talker. Participants completed three blocks of trials with their hearing aids in omnidirectional mode and three blocks in directional mode, with mode order randomized. Using a Vicon motion tracking system, the authors measured head position and computed fixation error, fixation latency, trajectory complexity, and proportion of misorientations. RESULTS: Results showed that for larger off-axis target angles, listeners using directional microphones took longer to reach their targets than they did when using omnidirectional microphones, although they were just as accurate. They also used more complex movements and frequently made initial turns in the wrong direction. For smaller off-axis target angles, this pattern was reversed, and listeners using directional microphones oriented more quickly and smoothly to the targets than when using omnidirectional microphones. 
CONCLUSIONS: The authors argue that an increase in movement complexity indicates a switch from a simple orienting movement to a search behavior. For the most off-axis target angles, listeners using directional microphones appear to not know which direction to turn, so they pick a direction at random and simply rotate their heads until the signal becomes more audible. The changes in fixation latency and head orientation trajectories suggest that the decrease in off-axis audibility is a primary concern in the use of directional microphones, and listeners could experience a loss of initial target speech while turning toward a new signal of interest. If hearing-aid users are to receive maximum directional benefit in noisy environments, both adaptive directionality in hearing aids and clinical advice on using directional microphones should take head movement and orientation behavior into account
