Evidence of correlation between acoustic and visual features of speech
ABSTRACT This paper examines the degree of correlation between lip and jaw configuration and speech acoustics. The lip and jaw positions are characterised by a system of measurements taken from video images of the speaker's face and profile, and the acoustics are represented using line spectral pair parameters and a measure of RMS energy. A correlation is found between the measured acoustic parameters and a linear estimate of the acoustics recovered from the visual data. This correlation exists despite the simplicity of the visual representation and is in rough agreement with correlations measured in earlier work by Yehia et al. using different techniques. However, analysis of the estimation errors suggests that the visual information, as parameterised in our experiment, offers only a weak constraint on the acoustics. Results are discussed from the perspective of models of early audio-visual integration.
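The core analysis described above can be sketched numerically: fit a least-squares linear map from visual measurements to acoustic parameters, then correlate the measured acoustics with the linear estimate. The dimensions, synthetic data, and noise level below are illustrative assumptions, not the paper's actual features or values.

```python
import numpy as np

# Illustrative setup: 6 visual measurements per frame, 13 acoustic
# parameters (e.g. line spectral pairs plus RMS energy); synthetic data
# stands in for the real measurements.
rng = np.random.default_rng(0)
n_frames, n_vis, n_ac = 500, 6, 13
V = rng.standard_normal((n_frames, n_vis))               # visual features
W_true = rng.standard_normal((n_vis, n_ac))
A = V @ W_true + 0.5 * rng.standard_normal((n_frames, n_ac))  # acoustics

# Linear estimate of the acoustics from the visual data (least squares)
W, *_ = np.linalg.lstsq(V, A, rcond=None)
A_hat = V @ W

# Per-parameter correlation between measured and estimated acoustics
corr = [np.corrcoef(A[:, i], A_hat[:, i])[0, 1] for i in range(n_ac)]
print(f"mean correlation: {np.mean(corr):.2f}")
```

With real data the correlations are weaker and the residual analysis matters: a high correlation alone does not mean the visual features tightly constrain the acoustics, which is the paper's caveat.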
Understanding language evolution : beyond Pan-centrism
Language does not fossilize, but this does not mean that language's evolutionary timeline is lost forever. Great apes provide a window back in time on our last prelinguistic ancestor's communication and cognition. Phylogeny and cladistics implicitly conjure Pan (chimpanzees, bonobos) as a superior (often the only) model for language evolution compared with earlier diverging lineages, Gorilla and Pongo (orangutans). Here, in reviewing the literature, it is shown that Pan do not surpass other great apes along genetic, cognitive, ecological, or vocal traits that are putatively paramount for language onset and evolution. Instead, the idea is revived that only by abandoning single-species models and learning about the variation among great apes might there be a chance to retrieve lost fragments of the evolutionary timeline of language.
Audiovisual perception and speech intelligibility in deaf children with cochlear implants
A study of audio-visual speech perception in noise in school-age children: normal-hearing, cochlear-implanted, and dysphasic
A computational model for amplitude modulation extraction and analysis of simultaneous amplitude modulated signals
The existence of amplitude-modulation (AM) frequency-specific channels has previously been suggested on the basis of psychoacoustical experiments using an adaptation or masking paradigm. Electrophysiological studies have also shown that cochlear nucleus chop-S neurons are tuned to specific AM frequencies. We have developed a simple but physiologically plausible model of chop-S neuron properties based on a digital integration of the Hodgkin-Huxley equations. Although it does not include an exhaustive model of peripheral processing, our model produces results similar to physiological data. We recover a bandpass AM transfer function at high input levels and non-linear filtering. This model enables us to simulate physiological masking situations in the modulation domain. We examine the strength of the model's synchronous response to white noises modulated at two different frequencies and mixed at various intensity ratios. Simulations show that the chop-S model enhances AM separation of a well-tuned AM frequency added to another, relative to the auditory-nerve fibre inputs. Masking patterns computed by varying the masker-modulation frequencies show the recruitment of the neuron by these stimuli.
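The masking stimulus described above — white noise carrying two simultaneous amplitude modulations — can be sketched as follows, together with a crude envelope analysis showing that both AM frequencies are present in the modulation spectrum. This is only the stimulus/analysis side under assumed parameters (sampling rate, AM frequencies, smoothing window); it is not the authors' Hodgkin-Huxley-based neuron model.

```python
import numpy as np

# White noise modulated at two AM frequencies simultaneously
# (illustrative parameters, not the paper's values)
fs, dur = 16000, 1.0
t = np.arange(int(fs * dur)) / fs
f_sig, f_mask = 100.0, 40.0       # "signal" and "masker" AM frequencies (Hz)
env = (1 + np.sin(2 * np.pi * f_sig * t)) + (1 + np.sin(2 * np.pi * f_mask * t))
x = env * np.random.default_rng(1).standard_normal(t.size)

# Crude envelope extraction: half-wave rectification plus
# moving-average smoothing (~2 ms window)
rect = np.maximum(x, 0.0)
win = int(fs / 500)
smooth = np.convolve(rect, np.ones(win) / win, mode="same")

# Modulation spectrum of the envelope: peaks at both AM frequencies
spec = np.abs(np.fft.rfft(smooth - smooth.mean()))
freqs = np.fft.rfftfreq(smooth.size, 1 / fs)
```

A modulation-tuned neuron model would then respond selectively to one of these envelope components; the paper's point is that the chop-S model sharpens this separation relative to its auditory-nerve inputs.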
Speech intelligibility in noise – audio-visual processing
Audio-visual speech perception and intelligibility in noise: first results
The role of audio-visual balance in speech perception in normally-hearing and cochlear implanted children
Multimodal speech integration: Role of balance between auditory and visual cues
I don’t see what you mean: impact of visual degradation on audio-visual speech perception in children with cochlear implants