
    Haptics for the development of fundamental rhythm skills, including multi-limb coordination

    This chapter considers the use of haptics for learning fundamental rhythm skills, including skills that depend on multi-limb coordination. Different sensory modalities have different strengths and weaknesses for the development of rhythm-related skills. For example, vision has low temporal resolution and performs poorly for tracking rhythms in real time, whereas hearing is highly accurate. In the case of multi-limbed rhythms, however, neither hearing nor sight is particularly well suited to communicating exactly which limb does what and when, or how the limbs coordinate. By contrast, haptics can work especially well in this area, by applying haptic signals independently to each limb. We review relevant theories, including embodied interaction and biological entrainment. We present a range of applications of the Haptic Bracelets, which are computer-controlled wireless vibrotactile devices, one attached to each wrist and ankle. Haptic pulses are used to guide users in playing rhythmic patterns that require multi-limb coordination. One immediate aim of the system is to support the development of practical rhythm skills and multi-limb coordination. A longer-term goal is to aid the development of a wider range of fundamental rhythm skills, including recognising, identifying, memorising, retaining, analysing, reproducing, coordinating, modifying and creating rhythms – particularly multi-stream (i.e. polyphonic) rhythmic sequences. Empirical results are presented. We reflect on related work, and discuss design issues for using haptics to support rhythm skills. Skills of this kind are essential not just to drummers and percussionists but also to keyboard players, and more generally to all musicians who need a firm grasp of rhythm.
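The core idea above – independent haptic channels, each pulsed at its own limb's onsets – can be sketched as a simple pulse scheduler. This is an illustrative sketch only; the limb names, the beats-based encoding and the function below are assumptions, not the chapter's actual implementation.

```python
# Illustrative sketch (not the Haptic Bracelets codebase): a multi-limb rhythm
# encoded as per-limb onset times in beats, flattened into one time-ordered
# pulse schedule with one vibrotactile channel per limb.

def schedule(rhythm: dict, bpm: float) -> list:
    """Flatten per-limb beat onsets into (seconds, limb) events sorted by time."""
    sec_per_beat = 60.0 / bpm
    events = [(beat * sec_per_beat, limb)
              for limb, beats in rhythm.items()
              for beat in beats]
    return sorted(events)

# A basic rock pattern: eighth-note hi-hat (right wrist), backbeat snare
# (left wrist), kick on beats 1 and 3 (right ankle); beats count from 0.
pattern = {
    "right_wrist": [0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5],
    "left_wrist":  [1, 3],
    "right_ankle": [0, 2],
}
events = schedule(pattern, bpm=120)
```

In a real system each (time, limb) event would trigger a pulse on the corresponding wireless vibrotactile; merging the streams into one sorted schedule makes the inter-limb coordination explicit in a way a single audio track cannot.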

    Design and Effect of Continuous Wearable Tactile Displays

    Our sense of touch is one of our core senses; while not as information-rich as sight or hearing, it tethers us to reality. The skin is the largest sensory organ in our body, and we rely on it so heavily that we rarely think about it. Tactile displays are currently understudied and underused: with the exception of notification actuators in smartphones and smartwatches, tactile cues today are mostly used to alert the user to an incoming call or text message. Continuous displays in particular - displays that do not merely send a single notification but stay active for an extended period and continuously communicate information - are rarely studied. This thesis explores the use of our vibration perception to create continuous tactile displays. Transmitting a continuous stream of tactile information to a user in a wearable format can help elevate tactile displays from notification channels to something closer to additional senses, enabling us to perceive our environment in new ways. This work provides a substantial step forward in the design, effect and use of continuous tactile displays in human-computer interaction. The main contributions include:

    Exploration of Continuous Wearable Tactile Interfaces. This thesis explores continuous tactile displays in different contexts and with different types of tactile information systems, across use cases in sports, gaming and business applications. The displays feature one- or multi-dimensional tactile patterns, temporal patterns and discrete tactile patterns.

    Automatic Generation of Personalized Vibration Patterns. A novel approach is described for designing vibrotactile patterns without expert knowledge, leveraging evolutionary algorithms to create personalized vibration patterns. The thesis presents the design of a human-centred evolutionary algorithm that generates abstract vibration patterns. The algorithm was tested in a user study, which offered evidence that interactive generation of abstract vibration patterns is possible and produces diverse sets of vibration patterns that can be recognized with high accuracy.

    Passive Haptic Learning for Vibration Patterns. Previous studies in passive haptic learning have shown surprisingly strong results for learning Morse code. If those findings could be confirmed and generalized, learning a new tactile alphabet could be made easier and happen in passing. This claim was therefore investigated in this thesis, and needed to be corrected and contextualized. A user study examined the effects of interaction design and distraction tasks on the capability to learn stimulus-stimulus associations with passive haptic learning. The thesis presents evidence that passive haptic learning of vibration patterns induces only a marginal learning effect, and is not a feasible or efficient way to learn vibration patterns that include more than two vibrations.

    Influence of Reference Frames for Spatial Tactile Stimuli. Designing wearable tactile stimuli that contain spatial information is challenging because of the wearer's natural body movement, so an important consideration is which reference frame to use for spatial cues. This thesis compared allocentric and egocentric reference frames on the wrist in a user study measuring induced cognitive load, reaction time and accuracy. It presents evidence that an allocentric reference frame drastically lowers cognitive load and slightly lowers reaction time while keeping the same accuracy as an egocentric reference frame, making a strong case for allocentric reference frames in tactile bracelets with several tactile actuators.
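The interactive-evolution idea in the second contribution can be sketched in a few lines. Everything here is an assumed encoding for illustration: a vibration pattern as a short sequence of (duration_ms, intensity) segments, truncation selection, and a stand-in rating function replacing the human in the loop. It is not the thesis's actual algorithm or API.

```python
import random

# Sketch of interactive evolution of vibration patterns (assumed encoding):
# each pattern is a list of (duration_ms, intensity) segments; a rating
# function stands in for the user's judgement of each candidate.

random.seed(0)  # reproducible for this sketch

def random_pattern(n_segments=4):
    return [(random.choice([100, 200, 400]), random.choice([0.0, 0.5, 1.0]))
            for _ in range(n_segments)]

def mutate(pattern, rate=0.3):
    # Resample each segment with probability `rate`, keep it otherwise.
    return [(random.choice([100, 200, 400]), random.choice([0.0, 0.5, 1.0]))
            if random.random() < rate else seg
            for seg in pattern]

def evolve(rate_fn, pop_size=8, generations=5):
    """Truncation selection plus mutation; rate_fn replaces the user rating."""
    population = [random_pattern() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=rate_fn, reverse=True)
        parents = scored[: pop_size // 2]
        children = [mutate(p) for p in parents]
        population = parents + children
    return max(population, key=rate_fn)

# Stand-in "user" who prefers long, strong buzzes.
best = evolve(lambda p: sum(d * i for d, i in p))
```

In the interactive setting the rating function is the wearer scoring each pattern after feeling it, which is what makes the resulting patterns personalized.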

    A Review of Smart Materials in Tactile Actuators for Information Delivery

    As the largest organ in the human body, the skin provides an important sensory channel through which humans receive external stimulation based on touch. From the information perceived through touch, people can infer the properties of objects, such as weight, temperature, texture, and motion. These properties reach the brain as nerve stimuli received by different kinds of receptors in the skin: mechanical, electrical, and thermal stimuli can excite these receptors and cause different information to be conveyed through the nerves. Technologies for actuators that provide mechanical, electrical or thermal stimuli have been developed, including static or vibrational actuation, electrostatic stimulation, focused ultrasound, and more. Smart materials, such as piezoelectric materials, carbon nanotubes, and shape memory alloys, play important roles in providing actuation for tactile sensation. This paper reviews the biological background of human tactile sensing, to give an understanding of how we sense and interact with the world through touch, as well as the conventional and state-of-the-art technologies of tactile actuators for tactile feedback delivery.

    The haptic iPod: passive learning of multi-limb rhythm skills

    Recent experiments showed that the use of haptic vibrotactile devices can support the learning of multi-limb rhythms [Holland et al., 2010]. These experiments centred on a tool called the Haptic Drum Kit, which combines vibrotactiles attached to the wrists and ankles, a computer system that controls them, and a MIDI drum kit. The system uses haptic signals in real time, relying on human entrainment mechanisms [Clayton, Sager and Will, 2004] rather than stimulus response, to support the user in playing multi-limbed rhythms. In the present paper, we give a preliminary report on a new experiment that aims to examine whether passive learning of multi-limb rhythms can occur through the silent playback of rhythmic stimuli via haptics while the subject is focusing on other tasks. The prototype system used for this new experiment is referred to as the Haptic iPod. Paper presented at the workshop When Words Fail: What can Music Interaction tell us about HCI? at the BCS HCI Conference 2011, Newcastle, UK.

    Somatic ABC's: A Theoretical Framework for Designing, Developing and Evaluating the Building Blocks of Touch-Based Information Delivery

    abstract: Situations of sensory overload are steadily becoming more frequent as the ubiquity of technology approaches reality--particularly with the advent of socio-communicative smartphone applications and pervasive, high-speed wireless networks. Although the ease of accessing information has improved our communication effectiveness and efficiency, our visual and auditory modalities--those modalities that today's computerized devices and displays largely engage--have become overloaded, creating possibilities for distraction, delay and high cognitive load, which in turn can lead to a loss of situational awareness, increasing the chances of life-threatening situations such as texting while driving. Surprisingly, alternative modalities for information delivery have seen little exploration. Touch, in particular, is a promising candidate given that it is our largest sensory organ, with impressive spatial and temporal acuity. Although some approaches have been proposed for touch-based information delivery, they are not without limitations, including high learning curves, limited applicability and/or limited expressiveness. This is largely due to the lack of a versatile, comprehensive design theory--specifically, a theory that addresses the design of touch-based building blocks for expandable, efficient, rich and robust touch languages that are easy to learn and use. Moreover, beyond design, there is a lack of implementation and evaluation theories for such languages. To overcome these limitations, a unified theoretical framework, inspired by natural spoken language, is proposed, called Somatic ABC's, for Articulating (designing), Building (developing) and Confirming (evaluating) touch-based languages. To evaluate the usefulness of Somatic ABC's, its design, implementation and evaluation theories were applied to create communication languages for two very different application areas: audio-described movies and motor learning. These applications were chosen because they presented opportunities to complement communication by offloading information, typically conveyed visually and/or aurally, to the skin. For both studies, it was found that Somatic ABC's aided the design, development and evaluation of rich somatic languages with distinct and natural communication units. Dissertation/Thesis, Ph.D. Computer Science, 201

    The Haptic Bracelets: learning multi-limb rhythm skills from haptic stimuli while reading

    The Haptic Bracelets are a system designed to help people learn multi-limbed rhythms (which involve multiple simultaneous rhythmic patterns) while they carry out other tasks. The Haptic Bracelets consist of vibrotactiles attached to each wrist and ankle, together with a computer system to control them. In this chapter, we report on an early empirical test of the capabilities of this system, and consider design implications. In the pre-test phase, participants were asked to play a series of multi-limb rhythms on a drum kit, guided by audio recordings. Participants' performances in this phase provided a base reference for later comparisons. During the following passive learning phase, away from the drum kit, just two rhythms from the set were silently 'played' to each subject via vibrotactiles attached to wrists and ankles, while participants carried out a 30-minute reading comprehension test. Different pairs of rhythms were chosen for different subjects to control for effects of rhythm complexity. In each case, the two rhythms were looped and alternated every few minutes. In the final phase, subjects were asked to play again at the drum kit the complete set of rhythms from the pre-test, including, of course, the two rhythms to which they had been passively exposed. Pending analysis of quantitative data focusing on accuracy, timing, number of attempts and number of errors, in this chapter we present preliminary findings based on participants' subjective evaluations. Most participants thought that the technology helped them to understand rhythms and to play rhythms better, and preferred haptic to audio to find out which limb to play when. Most participants indicated that they would prefer using a combination of haptics and audio for learning rhythms to either modality on its own. Replies to open questions were analysed to identify design issues, and implications for design improvements were considered.
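The passive-exposure protocol above (two rhythms looped and alternated across a 30-minute reading task) amounts to a simple block schedule. The 3-minute block length below is an assumption; the chapter says only "every few minutes".

```python
# Sketch of the passive-exposure schedule: two rhythms alternated in fixed
# blocks across a 30-minute reading session. Block length is an assumption.

def exposure_schedule(rhythms, session_min=30, block_min=3):
    """Return (start_minute, rhythm) pairs alternating through the session."""
    return [(start, rhythms[i % len(rhythms)])
            for i, start in enumerate(range(0, session_min, block_min))]

blocks = exposure_schedule(["rhythm_A", "rhythm_B"])
```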

    Mid-air haptic rendering of 2D geometric shapes with a dynamic tactile pointer

    An important challenge affecting ultrasonic mid-air haptics, in contrast to physical touch, is that we lose certain exploratory procedures such as contour following. This makes the task of perceiving geometric properties and identifying shapes more difficult. Meanwhile, the growing interest in mid-air haptics and its application to various new areas requires an improved understanding of how we perceive specific haptic stimuli, such as icons and control dials in mid-air. We address this challenge by investigating static and dynamic methods of displaying 2D geometric shapes in mid-air. We display a circle, a square, and a triangle, in either a static or a dynamic condition, using ultrasonic mid-air haptics. In the static condition, the shapes are presented as a full outline in mid-air, while in the dynamic condition, a tactile pointer is moved around the perimeter of the shapes. We measure participants' accuracy and confidence in identifying shapes in two controlled experiments (n1 = 34, n2 = 25). Results reveal that in the dynamic condition people recognise shapes significantly more accurately, and with higher confidence. We also find that representing polygons as a set of individually drawn haptic strokes, with a short pause at the corners, drastically enhances shape recognition accuracy. Our research supports the design of mid-air haptic user interfaces in application scenarios such as in-car interactions or assistive technology in education.
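The dynamic condition, including the corner pauses found to help recognition, can be sketched as sampling a pointer position along a polygon's perimeter at constant speed. All parameter names and values here are illustrative assumptions, not the paper's implementation or rendering parameters.

```python
import math

# Sketch of a "dynamic tactile pointer": sample the focal-point position as it
# travels a closed polygon's perimeter at constant speed, dwelling briefly at
# each corner (the per-stroke rendering the paper found to aid recognition).
# Speeds, pause length and units are assumptions for illustration.

def pointer_path(vertices, speed=0.1, corner_pause=0.1, dt=0.01):
    """Return (x, y) samples tracing the polygon once.
    speed in m/s, corner_pause in s, dt = sample period in s."""
    points = []
    n = len(vertices)
    for i in range(n):
        (x0, y0), (x1, y1) = vertices[i], vertices[(i + 1) % n]
        # dwell at the corner before drawing the next stroke
        points += [(x0, y0)] * round(corner_pause / dt)
        length = math.hypot(x1 - x0, y1 - y0)
        steps = max(1, round(length / (speed * dt)))
        points += [(x0 + (x1 - x0) * k / steps, y0 + (y1 - y0) * k / steps)
                   for k in range(steps)]
    return points

# A 4 cm square in the device's focal plane (assumed units: metres).
square = [(-0.02, -0.02), (0.02, -0.02), (0.02, 0.02), (-0.02, 0.02)]
path = pointer_path(square)
```

Feeding each sample in turn to the ultrasound array's focal-point controller would move the tactile pointer stroke by stroke, with a perceptible dwell at every corner.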

    Musical Haptics

    Haptic Musical Instruments; Haptic Psychophysics; Interface Design and Evaluation; User Experience; Musical Performance.

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by systems designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.