
    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    Situations of sensory overload are becoming steadily more frequent as the ubiquity of technology approaches reality--particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities--those that today's computerized devices and displays largely engage--have become overloaded, creating possibilities for distraction, delay and high cognitive load, which in turn can lead to a loss of situational awareness and increase the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate, given that the skin is our largest sensory organ and has impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expression. This is largely due to the lack of a versatile, comprehensive design theory--specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed: Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning.
These applications were chosen because they presented opportunities for complementing communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units.
    Dissertation/Thesis. Ph.D. Computer Science, 201

    Recognizing prosody from the lips: is it possible to extract prosodic focus from lip features?

    The aim of this chapter is to examine the possibility of extracting prosodic information from lip features. We used two measurement techniques enabling automatic lip feature extraction to evaluate the "lip pattern" of prosodic focus in French. Two corpora with Subject-Verb-Object (SVO) sentences were designed. Four focus conditions (S, V, O or neutral) were elicited in a natural dialogue situation. In a first set of experiments, we recorded two speakers of French with front and profile video cameras; the speakers wore blue make-up and facial markers. In a second set, we recorded five speakers with a 3D optical tracker. An analysis of the lip features showed that visible articulatory lip correlates of focus exist for all speakers. Two types of patterns were observed: absolute and differential. A potential outcome of this study is to provide criteria for automatic visual detection of prosodic focus from lip data.
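The chapter reports the lip patterns only at this level of detail. As a rough illustration of how a "differential" criterion might drive automatic focus detection, here is a minimal sketch: the focused constituent is taken to be the one whose peak lip aperture most exceeds that speaker's neutral-condition peak. All names, spans, and numbers below are invented for illustration, not values from the study.

```python
import numpy as np

def detect_focus(lip_aperture, spans, neutral_peaks):
    """Pick the constituent (S, V, or O) whose peak lip aperture most
    exceeds the speaker's neutral-condition peak for that constituent."""
    scores = {}
    for label, (start, end) in spans.items():
        scores[label] = lip_aperture[start:end].max() - neutral_peaks[label]
    return max(scores, key=scores.get)

# Toy aperture trace with exaggerated lip opening over the verb span.
trace = np.concatenate([np.full(10, 1.0), np.full(10, 1.8), np.full(10, 1.0)])
spans = {"S": (0, 10), "V": (10, 20), "O": (20, 30)}
neutral = {"S": 1.0, "V": 1.0, "O": 1.0}
print(detect_focus(trace, spans, neutral))  # prints: V
```

A real detector would of course normalize per speaker and handle the "absolute" pattern as well; this only shows the shape of the differential comparison.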

    Musical Haptics

    Haptic Musical Instruments; Haptic Psychophysics; Interface Design and Evaluation; User Experience; Musical Performance


    Articulatory feature encoding and sensorimotor training for tactually supplemented speech reception by the hearing-impaired

    Thesis (Ph.D.)--Harvard-MIT Division of Health Sciences and Technology, 2011. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 150-159).
    This thesis builds on previous efforts to develop tactile speech-reception aids for the hearing-impaired. Whereas conventional hearing aids mainly amplify acoustic signals, tactile speech aids convert acoustic information into a form perceptible via the sense of touch. By facilitating visual speechreading and providing sensory feedback for vocal control, tactile speech aids may substantially enhance speech communication abilities in the absence of useful hearing. Research for this thesis consisted of several lines of work. First, tactual detection and temporal order discrimination by congenitally deaf adults were examined, in order to assess the practicability of encoding acoustic speech information as temporal relationships among tactual stimuli. Temporal resolution among most congenitally deaf subjects was deemed adequate for reception of tactually-encoded speech cues. Tactual offset-order discrimination thresholds substantially exceeded those measured for onset-order, underscoring fundamental differences between stimulus masking dynamics in the somatosensory and auditory systems. Next, a tactual speech transduction scheme was designed with the aim of extending the amount of articulatory information conveyed by an earlier vocoder-type tactile speech display strategy. The novel transduction scheme derives relative amplitude cues from three frequency-filtered speech bands, preserving the cross-channel timing information required for consonant voicing discriminations, while retaining low-frequency modulations that distinguish voiced and aperiodic signal components.
Additionally, a sensorimotor training approach ("directed babbling") was developed with the goal of facilitating tactile speech acquisition through frequent vocal imitation of visuo-tactile speech stimuli and attention to tactual feedback from one's own vocalizations. A final study evaluated the utility of the tactile speech display in resolving ambiguities among visually presented consonants, following either standard or enhanced sensorimotor training. Profoundly deaf and normal-hearing participants trained to exploit tactually-presented acoustic information in conjunction with visual speechreading to facilitate consonant identification in the absence of semantic context. Results indicate that the present transduction scheme can enhance reception of consonant manner and voicing information and facilitate identification of syllable-initial and syllable-final consonants. The sensorimotor training strategy proved selectively advantageous for subjects demonstrating more gradual tactual speech acquisition. Simple, low-cost tactile devices may prove suitable for widespread distribution in developing countries, where hearing aids and cochlear implants remain unaffordable for most severely and profoundly deaf individuals. They have the potential to enhance verbal communication with minimal need for clinical intervention.
    by Theodore M. Moallem. Ph.D.
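The abstract describes the transduction scheme only in outline: amplitude cues from three frequency-filtered speech bands, with slow modulations retained. As a rough illustration of that general idea (not the thesis's actual implementation), here is a minimal numpy sketch that band-splits a signal via FFT masking and extracts smoothed amplitude envelopes; the band edges and smoothing window are placeholder assumptions.

```python
import numpy as np

def band_envelopes(x, fs, bands=((60, 350), (350, 3000), (3000, 6000)), win_ms=25):
    """Split a signal into frequency bands (FFT masking), then rectify and
    smooth each band to obtain slow amplitude envelopes per channel."""
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win  # moving-average low-pass on the rectified band
    envs = []
    for lo, hi in bands:
        masked = np.where((freqs >= lo) & (freqs < hi), spectrum, 0.0)
        band = np.fft.irfft(masked, n=len(x))
        env = np.convolve(np.abs(band), kernel, mode="same")
        envs.append(env)
    return np.stack(envs)  # shape: (n_bands, n_samples)

# Demo: a 500 Hz tone should dominate the middle band's envelope.
fs = 16000
t = np.arange(fs) / fs
envs = band_envelopes(np.sin(2 * np.pi * 500 * t), fs)
assert envs[1].mean() > envs[0].mean() and envs[1].mean() > envs[2].mean()
```

In a tactile display, each envelope would drive one vibrotactile channel, so that cross-channel timing and low-frequency modulation cues survive the acoustic-to-tactile conversion.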

    On the design of visual feedback for the rehabilitation of hearing-impaired speech


    Musical Haptics

    This Open Access book offers an original interdisciplinary overview of the role of haptic feedback in musical interaction. Divided into two parts, Part I examines the tactile aspects of music performance and perception, discussing how they affect user experience and performance in terms of usability, functionality and perceived quality of musical instruments. Part II presents engineering, computational, and design approaches and guidelines that have been applied to render and exploit haptic feedback in digital musical interfaces. Musical Haptics introduces an emerging field that brings together engineering, human-computer interaction, applied psychology, musical aesthetics, and music performance. The latter, defined as the complex system of sensory-motor interactions between musicians and their instruments, presents a well-defined framework in which to study basic psychophysical, perceptual, and biomechanical aspects of touch, all of which will inform the design of haptic musical interfaces. Tactile and proprioceptive cues enable embodied interaction and inform sophisticated control strategies that allow skilled musicians to achieve high performance and expressivity. The use of haptic feedback in digital musical interfaces is expected to enhance user experience and performance, improve accessibility for disabled persons, and provide an effective means for musical tuition and guidance.

    Conference Proceedings of the Euroregio / BNAM 2022 Joint Acoustic Conference
