100 research outputs found

    Orofacial cutaneous function in speech motor control and learning

    No full text
    Somatosensory signals from facial skin can provide a rich source of sensory input. However, it is not yet known how cutaneous input contributes to speech motor control and learning. This chapter introduces a kinesthetic role of orofacial cutaneous afferents in speech processing. We argue for the specificity of the orofacial somatosensory system from anatomical and physiological perspectives. The contribution of cutaneous afferents to speech production is evident in neurophysiological and psychophysical findings. Somatosensory modulation associated with facial skin deformation induces reflexive adjustment of articulatory motion in speech production, as well as adaptive motion change in speech motor learning. In addition, cutaneous mechanoreceptors are narrowly tuned at the skin lateral to the oral angle. An intriguing function of somatosensory inputs associated with facial skin deformation is that they interact with the processing of speech perception. Taken together, orofacial cutaneous afferents play an important role in both speech production and perception.

    Electro-Haptic Stimulation: A New Approach for Improving Cochlear-Implant Listening

    Get PDF
    Cochlear implants (CIs) have been remarkably successful at restoring speech perception for severely to profoundly deaf individuals. Despite their success, several limitations remain, particularly in CI users’ ability to understand speech in noisy environments, locate sound sources, and enjoy music. A new multimodal approach has been proposed that uses haptic stimulation to provide sound information that is poorly transmitted by the implant. This augmenting of the electrical CI signal with haptic stimulation (electro-haptic stimulation; EHS) has been shown to improve speech-in-noise performance and sound localization in CI users. There is also evidence that it could enhance music perception. We review the evidence of EHS enhancement of CI listening and discuss key areas where further research is required. These include understanding the neural basis of EHS enhancement, understanding the effectiveness of EHS across different clinical populations, and the optimization of signal-processing strategies. We also discuss the significant potential for a new generation of haptic neuroprosthetic devices to aid those who cannot access hearing-assistive technology, either because of biomedical or healthcare-access issues. While significant further research and development is required, we conclude that EHS represents a promising new approach that could, in the near future, offer a non-invasive, inexpensive means of substantially improving clinical outcomes for hearing-impaired individuals
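
    As a rough illustration of the audio-to-tactile idea behind EHS, the sketch below extracts a speech amplitude envelope and uses it to modulate a fixed vibrotactile carrier. The carrier frequency, envelope cutoff, and function name are illustrative assumptions, not the signal-processing strategy evaluated in the work reviewed here.

```python
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

def audio_to_haptic(audio, fs, carrier_hz=250.0, env_cutoff_hz=20.0):
    """Map an audio signal to a single-channel vibrotactile drive signal.

    Illustrative sketch: the speech amplitude envelope (information that is
    poorly transmitted by a CI in noise) modulates a carrier in the tactile
    sensitivity range.
    """
    # Amplitude envelope via the analytic signal
    envelope = np.abs(hilbert(audio))
    # Keep only slow (syllabic-rate) modulations
    sos = butter(4, env_cutoff_hz, btype="low", fs=fs, output="sos")
    envelope = np.clip(sosfilt(sos, envelope), 0.0, None)
    # Modulate a carrier near the skin's vibrotactile sensitivity peak
    t = np.arange(len(audio)) / fs
    return envelope * np.sin(2 * np.pi * carrier_hz * t)
```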

    Control space of apparent haptic motion

    Full text link

    Articulatory feature encoding and sensorimotor training for tactually supplemented speech reception by the hearing-impaired

    Get PDF
    Thesis (Ph.D.)--Harvard-MIT Division of Health Sciences and Technology, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 150-159). This thesis builds on previous efforts to develop tactile speech-reception aids for the hearing-impaired. Whereas conventional hearing aids mainly amplify acoustic signals, tactile speech aids convert acoustic information into a form perceptible via the sense of touch. By facilitating visual speechreading and providing sensory feedback for vocal control, tactile speech aids may substantially enhance speech communication abilities in the absence of useful hearing. Research for this thesis consisted of several lines of work. First, tactual detection and temporal order discrimination by congenitally deaf adults were examined in order to assess the practicability of encoding acoustic speech information as temporal relationships among tactual stimuli. Temporal resolution among most congenitally deaf subjects was deemed adequate for reception of tactually-encoded speech cues. Tactual offset-order discrimination thresholds substantially exceeded those measured for onset-order, underscoring fundamental differences between stimulus masking dynamics in the somatosensory and auditory systems. Next, a tactual speech transduction scheme was designed with the aim of extending the amount of articulatory information conveyed by an earlier vocoder-type tactile speech display strategy. The novel transduction scheme derives relative amplitude cues from three frequency-filtered speech bands, preserving the cross-channel timing information required for consonant voicing discriminations while retaining the low-frequency modulations that distinguish voiced and aperiodic signal components. Additionally, a sensorimotor training approach ("directed babbling") was developed with the goal of facilitating tactile speech acquisition through frequent vocal imitation of visuo-tactile speech stimuli and attention to tactual feedback from one's own vocalizations. A final study evaluated the utility of the tactile speech display in resolving ambiguities among visually presented consonants, following either standard or enhanced sensorimotor training. Profoundly deaf and normal-hearing participants trained to exploit tactually-presented acoustic information in conjunction with visual speechreading to facilitate consonant identification in the absence of semantic context. Results indicate that the present transduction scheme can enhance reception of consonant manner and voicing information and facilitate identification of syllable-initial and syllable-final consonants. The sensorimotor training strategy proved selectively advantageous for subjects demonstrating more gradual tactual speech acquisition. Simple, low-cost tactile devices may prove suitable for widespread distribution in developing countries, where hearing aids and cochlear implants remain unaffordable for most severely and profoundly deaf individuals. They have the potential to enhance verbal communication with minimal need for clinical intervention. By Theodore M. Moallem, Ph.D.
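
    A minimal sketch of the kind of three-band envelope transduction described above, assuming hypothetical band edges and a hypothetical smoothing cutoff (the thesis's actual filter parameters are not reproduced here); each returned envelope would drive one tactor while preserving cross-channel timing and low-frequency modulation cues.

```python
import numpy as np
from scipy.signal import butter, sosfilt

# Hypothetical band edges; the thesis's actual filter bands are not given here.
BANDS_HZ = [(100, 800), (800, 2500), (2500, 6000)]

def band_envelopes(speech, fs, mod_cutoff_hz=50.0):
    """Return per-band amplitude envelopes for driving three tactors.

    Each speech band is rectified and low-pass filtered, keeping the
    low-frequency modulations that distinguish voiced from aperiodic
    segments and preserving cross-channel timing cues.
    """
    envelopes = []
    lp = butter(2, mod_cutoff_hz, btype="low", fs=fs, output="sos")
    for lo, hi in BANDS_HZ:
        bp = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfilt(bp, speech)
        env = sosfilt(lp, np.abs(band))   # rectify, then smooth
        envelopes.append(np.clip(env, 0.0, None))
    return np.stack(envelopes)            # shape: (3, n_samples)
```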

    The Relationship of Somatosensory Perception and Fine-Force Control in the Adult Human Orofacial System

    Get PDF
    The orofacial area stands apart from other body systems in that it possesses a unique performance anatomy whereby oral musculature inserts directly into the facial skin, allowing for the generation of complex three-dimensional deformations of the orofacial system. This anatomical substrate provides for the tight temporal synchrony between self-generated cutaneous somatosensation and oromotor control during functional behaviors in this region, and provides the feedback needed to learn and maintain skilled orofacial behaviors. The Directions Into Velocities of Articulators (DIVA) model highlights the importance of the bidirectional relationship between sensation and production in the orofacial region in children learning speech. This relationship has not been as well established in the adult orofacial system. The purpose of this observational study was to begin assessing the perception-action relationship in healthy adults and to describe how this relationship may be altered as a function of healthy aging. The study was designed to determine the correspondence between orofacial cutaneous perception, measured with vibrotactile detection thresholds (VDT), and low-level static and dynamic force control tasks in three representative age cohorts. Correlational relationships among measures of somatosensory capacity and low-level skilled orofacial force control were determined for 60 adults (19-84 years). Using non-parametric Spearman's correlations with alpha set at 0.1, significant relationships were identified between the 5 Hz test probe and several 0.5 N low-level force control assessments in the static and slow ramp-and-hold conditions. These findings indicate that as vibrotactile detection thresholds increase (labial sensation decreases), the ability to maintain a low-level force endpoint decreases. Group data were analyzed using non-parametric Kruskal-Wallis tests, which identified significant differences between the 5 Hz test frequency probe and various 0.5 N skilled force assessments for group variables such as age, pure-tone hearing assessment, sex, speech usage, and smoking history. Future studies will begin the process of modeling this complex multivariate relationship in healthy individuals before moving to a disordered population.
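
    The statistical approach can be illustrated with a short sketch; the data below are synthetic stand-ins, not the study's measurements, and the variable names (vdt_5hz, force_error, age_group) are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr, kruskal

# Illustrative data only: vdt_5hz stands in for a vibrotactile detection
# threshold at the 5 Hz probe, force_error for a 0.5 N static force-control
# error, for 60 adults in three age cohorts.
rng = np.random.default_rng(0)
vdt_5hz = rng.lognormal(mean=0.0, sigma=0.5, size=60)
force_error = 0.1 * vdt_5hz + rng.normal(scale=0.05, size=60)
age_group = np.repeat(["young", "middle", "older"], 20)

# Non-parametric Spearman correlation, alpha = 0.1
rho, p = spearmanr(vdt_5hz, force_error)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}, significant: {p < 0.1}")

# Kruskal-Wallis test for a group difference in thresholds across age cohorts
h, p_kw = kruskal(*(vdt_5hz[age_group == g] for g in ("young", "middle", "older")))
print(f"Kruskal-Wallis H = {h:.2f}, p = {p_kw:.3f}")
```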

    Beyond language: The unspoken sensory-motor representation of the tongue in non-primates, non-human and human primates

    Get PDF
    The English idiom “on the tip of my tongue” commonly acknowledges that something is known, but it cannot be immediately brought to mind. This phrase accurately describes sensorimotor functions of the tongue, which are fundamental for many tongue-related behaviors (e.g., speech), but often neglected by scientific research. Here, we review a wide range of studies conducted on non-primates, non-human and human primates with the aim of providing a comprehensive description of the cortical representation of the tongue's somatosensory inputs and motor outputs across different phylogenetic domains. First, we summarize how the properties of passive non-noxious mechanical stimuli are encoded in the putative somatosensory tongue area, which has a conserved location in the ventral portion of the somatosensory cortex across mammals. Second, we review how complex self-generated actions involving the tongue are represented in more anterior regions of the putative somato-motor tongue area. Finally, we describe multisensory response properties of the primate and non-primate tongue area by also defining how the cytoarchitecture of this area is affected by experience and deafferentation

    The development of social processing in young children: insights from somatosensory activations during observation and experience of touch in typically developing children and speech processing in children with autism spectrum disorders

    Get PDF
    This thesis explores the neural mechanisms underlying the observation of touch and tactile processing in adults and typically developing children, and speech versus computerized-speech processing in children with autism spectrum disorders (ASD). Chapter 1 reviews the literature on mirror functioning, embodied cognition, and typical and atypical development of social and speech processing in infancy and childhood. Chapter 2 investigates the neural mechanisms underlying hand and object touch observation in adults. In Chapter 3, a similar procedure is employed to investigate tactile mirroring mechanisms in children. The findings demonstrate that these mechanisms are relatively developed in 4- to 5-year-old children. Chapter 4 further explores somatosensory activity during touch in adults and children. The findings reveal modulation of somatosensory beta (15-24 Hz) activity during touch in adults, but not in children. Chapter 5 examines the neural mechanisms underlying speech versus computerized-speech perception in children with ASD. The results suggest impaired classification of speech sounds preceded by computerized speech, and atypical lateralization of speech processing in children with ASD. Together, these findings make a notable contribution to our understanding of the typical development of tactile mirroring and touch processing mechanisms, and of social processing dysfunctions in children with ASD.
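
    As a minimal sketch of one way to quantify the 15-24 Hz beta-band activity mentioned above, a Welch spectrum from a single EEG channel can be averaged over that band; this illustration is not the analysis pipeline used in the thesis (no artifact rejection, baselining, or statistics are shown).

```python
import numpy as np
from scipy.signal import welch

def beta_power(eeg_channel, fs, band=(15.0, 24.0)):
    """Mean power spectral density in the beta band for one EEG channel.

    Illustrative only: a Welch spectrum averaged over 15-24 Hz, e.g. to
    compare a touch epoch against a baseline epoch.
    """
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()
```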

    Wave interaction in rotary vibro-tactile displays for human communication

    Get PDF
    This project began with the aim of developing an efficient vibrotactile communication device. A review of existing devices, mainly designed for speech communication, suggested that although adding an extra stimulator can improve performance in some situations, it can degrade performance in others. To explain these varied results, the properties of the human vibrotactile system involved in the perception of mechanical stimuli were studied. This study suggested that there is a great deal of interaction within the vibrotactile perceptual system, part of which is essential for a stimulus to be perceived at all. It also raised the question of the relative importance of the interaction that takes place before the tactile receptor as opposed to that occurring from the receptor onwards. Methods to reduce this interaction were introduced, and on this basis a novel rotary vibrator was developed. A psychophysical method was devised specifically to measure the interaction between the stimulation site and the tactile receptors. The method is based on the detection of "beats" arising from stimulation by two vibrators at slightly different frequencies. A system capable of driving a pair of similar vibrators at approximately 15 dB SL over the frequency range of 25-500 Hz was developed. The results of the psychophysical tests show that this method of measuring interaction is practical. In addition, the data from this study suggest that the perceived level of interaction differs between the two types of vibrators: the interaction is lower for the rotary vibrator than for the conventional perpendicular vibrator at frequencies below about 50 Hz. These findings offer a new way to look at the development of future vibrotactile devices.
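
    The beat-detection idea can be made concrete with a short numerical sketch: when two vibrations at slightly different frequencies interact before or at the receptor, their sum is amplitude-modulated at the difference frequency, and perceiving those beats indicates interaction. The drive frequencies and sample rate below are illustrative assumptions, not the thesis's test conditions.

```python
import numpy as np

# Illustrative parameters (assumptions, not the thesis's test conditions):
fs = 5000.0           # sample rate, Hz
f1, f2 = 50.0, 53.0   # drive frequencies of the two vibrators, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)

# If the two vibrations interact before (or at) the receptor they sum, and
# sin(2*pi*f1*t) + sin(2*pi*f2*t) = 2*cos(pi*(f1-f2)*t)*sin(pi*(f1+f2)*t),
# i.e. a tone near the mean frequency amplitude-modulated at |f1 - f2|.
combined = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
beat_envelope = 2.0 * np.abs(np.cos(np.pi * (f1 - f2) * t))

# Perceiving beats at the difference frequency is taken as evidence of
# interaction between the two stimulation sites.
print(f"beat rate: {abs(f1 - f2):.1f} Hz")
print(f"peak of the summed signal: {combined.max():.2f} (bounded by the envelope)")
```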

    Prediction of room acoustical parameters (A)

    Get PDF