Duration discrimination in younger and older adults
Ten normal-hearing young adults and ten older adults were asked to identify the longer of two sequentially presented tones. The duration of the standard tone ranged from 1.5 ms to 1000 ms across blocks. Duration discrimination was not related to audiometric thresholds. These results show that older adults are at a much greater disadvantage than young adults when discriminating the very short durations (i.e., below 40 ms) that are characteristic of speech sounds, and that this disadvantage cannot be accounted for by hearing levels.
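The abstract does not spell out the psychophysical procedure beyond a two-interval comparison of tone durations. Purely as a hypothetical illustration, the sketch below simulates such a task with a 2-down/1-up adaptive staircase; the simulated listener, its `internal_jnd_ms` parameter, and the staircase settings are all invented for the example, not the study's method.

```python
"""Minimal sketch of a two-interval duration-discrimination task.

Hypothetical simulation: a 2-down/1-up staircase adjusts the comparison
tone's duration until the simulated listener's just-noticeable
difference (JND) is bracketed.
"""
import random

def simulated_listener(standard_ms, comparison_ms, internal_jnd_ms=5.0):
    """Return True if the listener correctly picks the longer interval.

    The listener is correct when the duration difference exceeds a noisy
    internal criterion; otherwise it guesses. (Unlike real listeners,
    this toy criterion does not scale with the standard duration.)
    """
    noise = random.gauss(0, internal_jnd_ms / 2)
    if abs(comparison_ms - standard_ms) > internal_jnd_ms + noise:
        return True
    return random.random() < 0.5  # guess

def run_staircase(standard_ms=40.0, start_delta_ms=20.0, n_trials=80):
    """2-down/1-up staircase converging on roughly 71% correct."""
    delta, correct_streak, reversals, last_direction = start_delta_ms, 0, [], None
    for _ in range(n_trials):
        correct = simulated_listener(standard_ms, standard_ms + delta)
        if correct:
            correct_streak += 1
            if correct_streak == 2:            # two correct in a row -> make it harder
                correct_streak, direction = 0, "down"
                delta = max(delta * 0.8, 0.5)
            else:
                continue                       # wait for the second correct response
        else:                                  # one error -> make it easier
            correct_streak, direction = 0, "up"
            delta *= 1.25
        if last_direction and direction != last_direction:
            reversals.append(delta)            # track staircase reversals
        last_direction = direction
    tail = reversals[-6:]
    return sum(tail) / len(tail) if tail else delta

if __name__ == "__main__":
    for std in (1.5, 40.0, 1000.0):  # standards spanning the reported range
        print(f"standard {std} ms -> estimated JND ~ {run_staircase(std):.1f} ms")
```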
Informational drives for sensor evolution
© 2012 Massachusetts Institute of Technology. Published under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. It has been hypothesized that the evolution of sensors is a pivotal driver for the evolution of organisms and, especially, as a crucial part of the perception-action loop, a driver for cognitive development. The questions of why and how this is the case are important: what are the principles that push the evolution of sensorimotor systems? An interesting aspect of this problem is the co-option of sensors for functions other than those originally driving their development (e.g. the auditory sense of bats being employed as a 'visual' modality). Even more striking is the phenomenon found in nature of sensors being driven to the limits of precision while starting from much simpler beginnings. While a large potential for diversification and exaptation is visible in the observed phenotypes, gaining a deeper understanding of why and how this can be achieved is a significant problem. In this paper, we introduce a formal and generic information-theoretic model for understanding potential drives of sensor evolution, both in terms of improving sensory ability and in terms of extending and/or shifting sensory function.
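The formal model itself is not reproduced in this abstract. As a rough, hypothetical illustration of the kind of quantity such an information-theoretic account might track, the sketch below estimates the mutual information I(W; S) between a toy world state W and a quantized sensor reading S, showing that a more precise sensor captures more bits about the world; the world, the sensor model, and all numbers are invented for the example.

```python
"""Sketch: mutual information between a world state and a sensor reading.

Hypothetical toy model (not the paper's formalism): a scalar world
state W is observed through a sensor that quantizes it into a few
levels after adding noise. Finer quantization -> higher I(W; S).
"""
import math
import random
from collections import Counter

def mutual_information(pairs):
    """Empirical I(W; S) in bits from a list of (w, s) samples."""
    n = len(pairs)
    joint = Counter(pairs)
    p_w = Counter(w for w, _ in pairs)
    p_s = Counter(s for _, s in pairs)
    mi = 0.0
    for (w, s), c in joint.items():
        p_ws = c / n
        mi += p_ws * math.log2(p_ws / ((p_w[w] / n) * (p_s[s] / n)))
    return mi

def sense(world, levels, noise=0.1):
    """Quantize a world value in [0, 1) into `levels` bins after adding noise."""
    noisy = min(max(world + random.gauss(0, noise), 0.0), 0.999)
    return int(noisy * levels)

if __name__ == "__main__":
    random.seed(0)
    worlds = [random.randrange(8) for _ in range(20000)]   # 8 discrete world states
    for levels in (2, 4, 8, 16):                           # increasingly precise sensors
        samples = [(w, sense(w / 8, levels)) for w in worlds]
        print(f"{levels:2d}-level sensor: I(W;S) = {mutual_information(samples):.2f} bits"
              f" (max {math.log2(8):.2f})")
```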
Development of an instrument for early detection of dementia in people with Down syndrome
The successful detection of early signs of dementia in people with Down syndrome could form a basis for useful early support and for drug treatment. This report describes the development and preliminary application of an interview and test instrument for the assessment of dementia among people with intellectual disability, together with a framework for diagnosis that combines the findings of the interview and test with the diagnostic criteria of ICD-10, DSM-IV and NINCDS-ADRDA. From among the tests and interview questions developed, those showing the most significant differences between participants in three groups, defined by level of intellectual disability and estimated dementia status, were retained. The report presents the assumptions behind the items used, describes the development process and the items themselves, and examines how well individual test items predict the presence of dementia. The authors conclude that a protocol combining testing and interview shows promise for detecting early signs of dementia in this population and could prove feasible for use in practice.
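The report's actual items and statistics are not reproduced here. Purely as a hypothetical illustration of the screening logic described (keeping the items that differ most across the three groups), the sketch below applies a generic Kruskal-Wallis test to simulated item scores; the group labels, item names, and the p < 0.01 cut-off are all assumptions for the example.

```python
"""Sketch: screening candidate assessment items by group differences.

Hypothetical data and a generic Kruskal-Wallis screen -- the report's
actual items, groups, and statistics are not reproduced here.
"""
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)
n = 30  # simulated participants per group

# Simulated scores on three candidate items for three participant groups.
# "memory_recall" and "orientation" are built to differ across groups.
groups = {
    "mild_ID_no_dementia": {"memory_recall": rng.normal(8, 1.5, n),
                            "orientation":   rng.normal(9, 1.0, n),
                            "naming":        rng.normal(7, 2.0, n)},
    "moderate_ID":         {"memory_recall": rng.normal(6, 1.5, n),
                            "orientation":   rng.normal(7, 1.5, n),
                            "naming":        rng.normal(7, 2.0, n)},
    "estimated_dementia":  {"memory_recall": rng.normal(4, 1.5, n),
                            "orientation":   rng.normal(5, 1.5, n),
                            "naming":        rng.normal(6, 2.0, n)},
}

results = []
for item in ["memory_recall", "orientation", "naming"]:
    samples = [groups[g][item] for g in groups]
    stat, p = kruskal(*samples)        # non-parametric test across the three groups
    results.append((item, p))

# Keep the items that discriminate best between the groups.
for item, p in sorted(results, key=lambda r: r[1]):
    decision = "keep" if p < 0.01 else "drop"
    print(f"{item:14s} p = {p:.4f} -> {decision}")
```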
The influence of external and internal motor processes on human auditory rhythm perception
Musical rhythm is composed of organized temporal patterns, and the processes underlying rhythm perception engage both auditory and motor systems. Despite behavioral and neuroscience evidence converging on this audio-motor interaction, relatively little is known about the effect of specific motor processes on auditory rhythm perception. This doctoral thesis investigated the influence of both external and internal motor processes on the way we perceive an auditory rhythm. The first half of the thesis sought to establish whether overt body movement has a facilitatory effect on our ability to perceive auditory rhythmic structure, and whether this effect is modulated by musical training. To this end, musicians and non-musicians performed a pulse-finding task either using natural body movement or through listening only, and produced their identified pulse by finger tapping. The results showed that overt movement benefited rhythm (pulse) perception, especially for non-musicians, confirming the facilitatory role of external motor activities in hearing the rhythm, as well as its interaction with musical training. The second half of the thesis tested the idea that indirect, covert motor input, such as that derived from visual stimuli, could influence the perceived structure of an auditory rhythm. Three experiments examined the subjectively perceived tempo of an auditory sequence under different visual motion stimulations, while the auditory and visual streams were presented independently of each other. The results revealed that the perceived auditory tempo was influenced by the concurrent visual motion conditions, and that the effect tracked the increase or decrease of visual motion speed. This supported the hypothesis that internal motor information extracted from visuomotor stimulation can be incorporated into the percept of an auditory rhythm. Taken together, the thesis concludes that, rather than merely reacting to the given auditory input, our motor system plays an important role in the perceptual processing of auditory rhythm. This can occur via both external and internal motor activities, and may not only influence how we hear a rhythm but also, under some circumstances, improve our ability to hear it.
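The thesis' own analysis code and data are not shown here. The following minimal sketch, with made-up tap times, illustrates how a produced pulse can be read off finger-tap timestamps via inter-tap intervals, which is the kind of measure a pulse-finding and tapping task yields.

```python
"""Sketch: estimating the produced pulse from finger-tap timestamps.

Hypothetical tap data -- not the thesis' recordings or analysis code.
The produced tempo is read off the median inter-tap interval (ITI),
and tapping steadiness from the coefficient of variation of the ITIs.
"""
from statistics import mean, median, stdev

def pulse_from_taps(tap_times_s):
    """Return (tempo_bpm, cv) from a list of tap onset times in seconds."""
    itis = [b - a for a, b in zip(tap_times_s, tap_times_s[1:])]
    tempo_bpm = 60.0 / median(itis)     # median ITI -> beats per minute
    cv = stdev(itis) / mean(itis)       # lower CV = steadier produced pulse
    return tempo_bpm, cv

if __name__ == "__main__":
    # Taps roughly every 500 ms (a 120-BPM pulse) with a little jitter.
    taps = [0.00, 0.51, 1.00, 1.49, 2.02, 2.50, 3.01, 3.52, 4.00]
    bpm, cv = pulse_from_taps(taps)
    print(f"produced pulse ~ {bpm:.1f} BPM, ITI coefficient of variation {cv:.3f}")
```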
Speech Development by Imitation
The Double Cone Model (DCM) is a model of how the brain transforms sensory input to motor commands through successive stages of data compression and expansion. We have tested a subset of the DCM on speech recognition, production and imitation. The experiments show that the DCM is a good candidate for an artificial speech processing system that can develop autonomously. We show that the DCM can learn a repertoire of speech sounds by listening to speech input. It is also able to link the individual elements of speech to sequences that can be recognized or reproduced, thus allowing the system to imitate spoken language.
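The DCM's actual stages are not specified in this summary. The sketch below is only a generic stand-in for a compress-then-expand mapping from a "sensory" vector to a "motor" vector, with hypothetical dimensions, synthetic paired data, and plain gradient descent, meant to make the compression/expansion idea concrete rather than to reproduce the model.

```python
"""Sketch: a compress-then-expand mapping from 'sensory' to 'motor' vectors.

Generic stand-in, not the DCM: a sensory vector is squeezed through a
small bottleneck code and expanded into a motor command vector, trained
with plain gradient descent on synthetic data.
"""
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired data: both the "sensory" input and the "motor" target
# derive from the same low-dimensional latent cause.
N, SENS, CODE, MOTOR = 256, 12, 3, 4
latent = rng.normal(size=(N, CODE))
X = latent @ rng.normal(scale=0.3, size=(CODE, SENS)) + 0.05 * rng.normal(size=(N, SENS))
Y = latent @ rng.normal(scale=0.5, size=(CODE, MOTOR))

# Encoder (compression) and decoder (expansion) weights.
W1 = 0.1 * rng.normal(size=(SENS, CODE))
b1 = np.zeros(CODE)
W2 = 0.1 * rng.normal(size=(CODE, MOTOR))
b2 = np.zeros(MOTOR)

lr = 0.01
for step in range(2001):
    H = np.tanh(X @ W1 + b1)       # compression to a 3-unit code
    P = H @ W2 + b2                # expansion to motor commands
    E = P - Y
    loss = (E ** 2).sum() / N
    # Backpropagation through the two stages.
    dP = 2 * E / N
    dW2, db2 = H.T @ dP, dP.sum(axis=0)
    dZ = (dP @ W2.T) * (1 - H ** 2)
    dW1, db1 = X.T @ dZ, dZ.sum(axis=0)
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2
    if step % 500 == 0:
        print(f"step {step:4d}  loss {loss:.4f}")
```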
A longitudinal study of phonological processing skills and reading in bilingual children
French/English bilingual children (N = 40) in French-language schools participated in an 8-month longitudinal study of the relation between phonological processing skills and reading in French and English. Participants were administered measures of phonological awareness, working memory, naming speed, and reading in both languages. The concurrent analyses show that phonological awareness skills in both French and English were uniquely predictive of reading performance in both languages after accounting for the influences of cognitive ability, reading ability, working memory, and naming speed. These findings support the hypothesis that phonological awareness is strongly related to beginning word reading skill in an alphabetic orthography. The longitudinal analyses also suggest that orthographic depth influences the phonological factors related to reading.
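The study's dataset is not available here. As a hypothetical illustration of the "uniquely predictive after accounting for controls" logic, the sketch below runs a hierarchical-regression-style comparison on simulated scores, reporting the gain in R² when phonological awareness is added to the control variables; all variable names and effect sizes are invented.

```python
"""Sketch: testing the unique contribution of phonological awareness.

Hypothetical data and variable names -- not the study's dataset. The
logic mirrors a hierarchical regression: compare the variance in
reading explained with and without phonological awareness, holding the
control variables constant.
"""
import numpy as np

rng = np.random.default_rng(7)
n = 40  # the study tested 40 children; the scores here are simulated

# Simulated predictors (roughly z-scored): controls plus phonological awareness.
cognitive = rng.normal(size=n)
working_memory = rng.normal(size=n)
naming_speed = rng.normal(size=n)
phon_awareness = 0.5 * working_memory + rng.normal(size=n)   # correlated with WM

# Simulated reading outcome with a genuine unique effect of phonological awareness.
reading = (0.3 * cognitive + 0.2 * working_memory - 0.2 * naming_speed
           + 0.6 * phon_awareness + rng.normal(scale=0.8, size=n))

def r_squared(y, predictors):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

controls = [cognitive, working_memory, naming_speed]
r2_controls = r_squared(reading, controls)
r2_full = r_squared(reading, controls + [phon_awareness])
print(f"controls only:            R^2 = {r2_controls:.3f}")
print(f"+ phonological awareness: R^2 = {r2_full:.3f}"
      f"  (unique R^2 gain = {r2_full - r2_controls:.3f})")
```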
Audio-tactile interactions and speech perception: comparisons between blind and sighted subjects
The present study investigated whether manual tactile information from a speaker's face modulates the decoding of speech when audio-tactile perception is compared with audio-only perception. Two groups of congenitally blind and sighted adults were compared. Participants performed a syllable decision task across three conditions: audio-only, congruent audio-tactile, and incongruent audio-tactile. In the auditory modality, the syllables were presented either in background white noise or without noise. The results demonstrate that manual tactile information relevant to recovering speech gestures modulates auditory speech perception when the acoustic information is degraded, and that these audio-tactile interactions occur similarly in blind and sighted untrained listeners, despite possible differences in sensory skills between the two groups.
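The study's actual stimuli and noise levels are not given in this summary. The sketch below, using a synthetic stand-in "syllable" and arbitrary SNR values, illustrates how a speech signal can be degraded with background white noise at a target signal-to-noise ratio, as in the noise condition described.

```python
"""Sketch: degrading a speech signal with white noise at a target SNR.

Illustrative only (synthetic 'syllable', hypothetical SNR values); the
study's actual stimuli and noise levels are not reproduced here.
"""
import numpy as np

def add_white_noise(signal, snr_db, rng=np.random.default_rng(1)):
    """Return signal + white noise scaled to the requested SNR in dB."""
    noise = rng.normal(size=signal.shape)
    sig_power = np.mean(signal ** 2)
    noise_power = np.mean(noise ** 2)
    target_noise_power = sig_power / (10 ** (snr_db / 10))
    noise *= np.sqrt(target_noise_power / noise_power)
    return signal + noise

if __name__ == "__main__":
    sr = 16000
    t = np.arange(int(0.3 * sr)) / sr                 # 300 ms stand-in "syllable"
    syllable = 0.5 * np.sin(2 * np.pi * 220 * t)      # a pure tone, not real speech
    for snr in (None, 10, 0, -5):
        out = syllable if snr is None else add_white_noise(syllable, snr)
        rms = np.sqrt(np.mean(out ** 2))
        label = "no noise" if snr is None else f"SNR {snr:+d} dB"
        print(f"{label:10s} -> mixture RMS {rms:.3f}")
```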
SHPbench – a smart hybrid prototyping based environment for early testing, verification and (user based) validation of advanced driver assistant systems of cars
Statistical analyses show that more than 90 percent of all car accidents result from human error. Advanced Driver Assistant Systems (ADAS) are intended to support and assist the driver and therefore contribute significantly to the reduction of accidents. ADAS are becoming more and more complex and demanding with respect to the hardware and software needed to fulfil the requirements placed on assistant systems today and in the future; they have to be considered multi-functional, multi-domain mechatronic systems. Smart Hybrid Prototyping (SHP) is a proven approach for handling the demands ADAS place on the development process: early integrated component and system testing, verification, and validation with a focus on the interaction with the driver can only be met reasonably and economically by utilizing SHP technology. For these purposes the SHPbench, an integrated development and validation environment, has recently been developed. The SHPbench's architecture and specification are presented and evaluated by applying a representative use case from an ADAS development process. This paper documents the use case setup, process steps and test results.
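SHPbench's own interfaces are not documented in this abstract. The sketch below is a generic, hypothetical example of the kind of early, scripted component test such an environment is meant to host: a simple time-to-collision rule for a forward-collision warning is exercised against a scripted approach scenario with an explicit pass/fail criterion. The threshold, scenario, and function names are all assumptions.

```python
"""Sketch: a scripted component test for a simple forward-collision warning.

Not SHPbench code -- a generic, hypothetical example of an early,
automated ADAS check: a time-to-collision (TTC) rule is exercised
against a scripted scenario and judged against a pass/fail criterion.
"""

TTC_WARN_S = 2.5   # warn when the predicted collision is less than 2.5 s away

def fcw_should_warn(gap_m: float, closing_speed_mps: float) -> bool:
    """Warn if the time-to-collision drops below the threshold."""
    if closing_speed_mps <= 0:          # not closing in on the lead vehicle
        return False
    return gap_m / closing_speed_mps < TTC_WARN_S

def run_scenario(dt: float = 0.1) -> float:
    """Ego car approaches a stopped lead vehicle; return the TTC at first warning."""
    gap, ego_speed = 80.0, 20.0         # metres, metres per second
    while gap > 0:
        if fcw_should_warn(gap, ego_speed):
            return gap / ego_speed      # TTC at the moment the warning fires
        gap -= ego_speed * dt           # advance the scripted scenario by one step
    return 0.0

if __name__ == "__main__":
    ttc_at_warning = run_scenario()
    assert ttc_at_warning >= 2.0, "warning fired too late"   # example pass/fail criterion
    print(f"FCW warning issued at TTC = {ttc_at_warning:.2f} s -> test passed")
```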
