3,368 research outputs found

    A first investigation into the effectiveness of Tactons

    This paper reports two experiments relating to the design of Tactons (or tactile icons). The first experiment investigated perception of vibro-tactile "roughness" (created using amplitude-modulated sinusoids), and the results indicated that roughness could be used as a parameter for constructing Tactons. The second experiment is the first full evaluation of Tactons and uses three values of roughness identified in the first experiment, along with three rhythms, to create a set of Tactons. The results of this experiment showed that Tactons could be a successful means of communicating information in user interfaces, with an overall recognition rate of 71%, and recognition rates of 93% for rhythm and 80% for roughness.
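
    As a concrete illustration of the stimulus construction described above, the sketch below generates an amplitude-modulated sinusoid whose modulation controls perceived "roughness". The carrier frequency, modulation rates and depth values are illustrative assumptions, not parameters reported in the paper.

```python
# Amplitude-modulated sinusoid of the kind used to create vibro-tactile
# "roughness". Carrier and modulation frequencies are illustrative
# assumptions, not values taken from the paper.
import numpy as np

def am_sinusoid(duration_s=1.0, sample_rate=44100,
                carrier_hz=250.0, mod_hz=30.0, mod_depth=1.0):
    """Return an amplitude-modulated sine wave.

    carrier_hz : base vibration frequency (250 Hz is a common choice,
                 near the skin's peak sensitivity).
    mod_hz     : modulation frequency; slower modulation tends to be
                 perceived as rougher.
    mod_depth  : 0.0 (smooth, unmodulated) to 1.0 (fully modulated).
    """
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    envelope = 1.0 - mod_depth * 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_hz * t))
    return envelope * np.sin(2.0 * np.pi * carrier_hz * t)

# Three hypothetical roughness levels, smooth to rough:
smooth = am_sinusoid(mod_depth=0.0)
medium = am_sinusoid(mod_hz=50.0)
rough  = am_sinusoid(mod_hz=20.0)
```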

    Audio-tactile stimuli to improve health and well-being : a preliminary position paper

    From the literature and through common experience it is known that stimulation of the tactile (touch) sense or the auditory (hearing) sense can be used to improve people's health and well-being, for example to help people relax, feel better, sleep better or feel comforted. In this position paper we propose the concept of combined auditory-tactile stimulation and argue that it potentially has positive effects on human health and well-being by influencing a user's bodily and mental state. Such effects have, to date, not been fully explored in scientific research. The current relevant state of the art is briefly addressed and its limitations are indicated. Based on this, a vision is presented of how auditory-tactile stimulation could be used in healthcare and various other application domains. Three interesting research challenges in this field are identified: 1) identifying relevant mechanisms of human perception of combined auditory-tactile stimuli; 2) finding methods for automatic conversion between audio and tactile content; 3) using measurement and analysis of human bio-signals and behavior to adapt the stimulation optimally to the user. Ideas and possible routes to address these challenges are presented.
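
    The second challenge, automatic conversion between audio and tactile content, could in principle be approached by envelope following, as in the rough sketch below. The approach and all parameter values are assumptions made for illustration; the paper itself does not prescribe a conversion method.

```python
# Convert an audio signal into a tactile drive signal by extracting its
# amplitude envelope and re-centring it on a vibrotactile carrier.
# Carrier frequency and smoothing window are illustrative assumptions.
import numpy as np

def audio_to_tactile(audio, sample_rate, carrier_hz=200.0, window_s=0.02):
    """Map an audio signal to a vibration signal via its envelope."""
    window = max(1, int(sample_rate * window_s))
    kernel = np.ones(window) / window
    envelope = np.convolve(np.abs(audio), kernel, mode="same")  # smoothed |audio|
    envelope /= envelope.max() or 1.0                           # normalise to 0..1
    t = np.arange(len(audio)) / sample_rate
    return envelope * np.sin(2.0 * np.pi * carrier_hz * t)
```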

    To “Sketch-a-Scratch”

    A surface can be harsh and raspy, or smooth and silky, and everything in between. We are used to sensing these features with our fingertips as well as with our eyes and ears: the exploration of a surface is a multisensory experience. Tools, too, are often employed in the interaction with surfaces, since they augment our manipulation capabilities. “Sketch-a-Scratch” is a tool for the multisensory exploration and sketching of surface textures. The user’s actions drive a physical sound model of real materials’ response to interactions such as scraping, rubbing or rolling. Moreover, different input signals can be converted into 2D visual surface profiles, enabling the user to experience them visually, aurally and haptically.
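
    The sketch below illustrates the general idea of turning an arbitrary input signal into a surface profile and sonifying a scrape across it. It is only a crude stand-in: the actual system drives a physical sound model of real materials, and the resampling, speed and noise-based rendering here are invented for illustration.

```python
# Treat an arbitrary 1D input signal as a surface height profile and
# sonify dragging a probe across it. This is not the system's physical
# friction model; it is an illustrative approximation.
import numpy as np

def signal_to_profile(signal, length=512):
    """Resample any 1D signal into a surface height profile in [0, 1]."""
    x_old = np.linspace(0.0, 1.0, len(signal))
    x_new = np.linspace(0.0, 1.0, length)
    profile = np.interp(x_new, x_old, signal)
    profile -= profile.min()
    return profile / (profile.max() or 1.0)

def scrape(profile, speed=200.0, sample_rate=44100, duration_s=1.0):
    """Sonify a scrape across the profile at a constant speed (samples/s)."""
    n = int(sample_rate * duration_s)
    positions = (np.arange(n) * speed / sample_rate) % (len(profile) - 1)
    heights = profile[positions.astype(int)]
    bumps = np.abs(np.diff(heights, prepend=heights[0]))  # local slope ~ impact strength
    return bumps * np.random.uniform(-1.0, 1.0, n)        # noise bursts scaled by slope
```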

    Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback

    This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but can only convey a limited range of emotions within two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations: vibrotactile + thermal, vibrotactile + visual and visual + thermal. Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli to convey a range of emotions.
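
    The sketch below shows the kind of designer-facing lookup the paper describes, mapping regions of valence-arousal space to multimodal stimulus combinations. The structure is illustrative and every parameter value is invented; the paper's actual lookup resource should be consulted for validated stimuli.

```python
# Hypothetical valence-arousal lookup mapping affective quadrants to
# combined thermal + vibrotactile + visual stimuli. All values are
# invented for illustration, not taken from the paper's results.
from typing import NamedTuple

class Stimulus(NamedTuple):
    vibration_hz: float      # vibrotactile frequency
    thermal_delta_c: float   # temperature change relative to skin, degrees C
    visual_pattern: str      # abstract visual display

AFFECT_LOOKUP = {
    # (valence, arousal) quadrant -> candidate stimulus combination
    ("positive", "high"): Stimulus(250.0, +2.0, "fast_warm_pulse"),
    ("positive", "low"):  Stimulus(80.0,  +1.0, "slow_warm_glow"),
    ("negative", "high"): Stimulus(250.0, -2.0, "fast_cool_flash"),
    ("negative", "low"):  Stimulus(80.0,  -1.0, "slow_cool_fade"),
}

def stimulus_for(valence: str, arousal: str) -> Stimulus:
    """Return the candidate stimulus combination for a target quadrant."""
    return AFFECT_LOOKUP[(valence, arousal)]
```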

    Analysis on Using Synthesized Singing Techniques in Assistive Interfaces for Visually Impaired to Study Music

    Touch and hearing are the primary senses through which visually impaired people perceive the world, and their interaction with assistive technologies likewise focuses mainly on tactile and auditory interfaces. This paper discusses the validity of using the most appropriate singing-synthesis techniques as a mediator in assistive technologies built specifically to address their music-learning needs involving music scores and lyrics. Music scores with notation and lyrics are the main mediators in the musical communication channel between a composer and a performer. Visually impaired music lovers have little opportunity to access this mediator, since most scores exist only in a visual format. In a music score, the vocal performer’s melody is married to the full range of pleasant sound producible in the form of singing. Singing is best suited to a temporal-domain format, in contrast to a tactile format in the spatial domain. Therefore, converting the existing visual format into a singing output would be the most appropriate non-lossy transition, as shown by the initial research on an adaptive music score trainer for the visually impaired [1]. To extend this initial research, this study surveys existing singing-synthesis techniques and research on auditory interfaces.
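
    The sketch below is a minimal stand-in for the score-to-singing idea: a note list is rendered as pitched tones with vibrato so that melody and rhythm survive the move from the visual (spatial) score to a temporal audio format. Real singing synthesis would also render lyric syllables; the note data and vibrato settings here are invented.

```python
# Render a note list as pitched tones with gentle vibrato, as a crude
# proxy for a sung rendering of a score. Pitches, durations and vibrato
# parameters are illustrative assumptions.
import numpy as np

def note_to_tone(freq_hz, duration_s, sample_rate=44100,
                 vibrato_hz=5.5, vibrato_cents=30.0):
    """Render one note as a sine tone with vibrato."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    depth = freq_hz * (2.0 ** (vibrato_cents / 1200.0) - 1.0)   # cents -> Hz excursion
    inst_freq = freq_hz + depth * np.sin(2.0 * np.pi * vibrato_hz * t)
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / sample_rate
    return np.sin(phase)

# Hypothetical score fragment: (frequency in Hz, duration in seconds)
melody = [(261.63, 0.5), (293.66, 0.5), (329.63, 1.0)]   # C4, D4, E4
rendering = np.concatenate([note_to_tone(f, d) for f, d in melody])
```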

    Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction

    This paper presents an investigation into the effects of different feedback modalities on mid-air gesture interaction for infotainment systems in cars. Car crashes and near-crash events are most commonly caused by driver distraction. Mid-air interaction is a way of reducing driver distraction by reducing the visual demand of infotainment systems. Despite a range of available modalities, feedback in mid-air gesture systems is generally provided through visual displays. We conducted a simulated driving study to investigate how different types of multimodal feedback can support in-air gestures. The effects of different feedback modalities on eye-gaze behaviour and on the driving and gesturing tasks are considered. We found that feedback modality influenced gesturing behaviour; however, drivers corrected falsely executed gestures more often in non-visual conditions. Our findings show that non-visual feedback can reduce visual distraction significantly.
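
    The sketch below illustrates one way gesture-recognition events might be routed to non-visual feedback channels so a driver can confirm a gesture without glancing at a display. The event names, channels and their behaviour are hypothetical and are not taken from the paper's system.

```python
# Route gesture-recognition events to non-visual feedback channels.
# Event names and channel behaviour are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class GestureEvent:
    name: str          # e.g. "swipe_left", "tap"
    recognised: bool   # False if a gesture was executed but not matched

def play_earcon(event: GestureEvent) -> None:
    print(f"[audio] earcon for {event.name} ({'ok' if event.recognised else 'retry'})")

def pulse_wristband(event: GestureEvent) -> None:
    print(f"[tactile] {'single pulse' if event.recognised else 'double pulse'}")

FEEDBACK_CHANNELS: Dict[str, Callable[[GestureEvent], None]] = {
    "auditory": play_earcon,
    "tactile": pulse_wristband,
}

def on_gesture(event: GestureEvent, modalities=("auditory", "tactile")) -> None:
    """Dispatch one gesture event to each enabled non-visual channel."""
    for modality in modalities:
        FEEDBACK_CHANNELS[modality](event)

on_gesture(GestureEvent(name="swipe_left", recognised=True))
```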