
    Multi-command Tactile Brain Computer Interface: A Feasibility Study

    The study presented explores the extent to which tactile stimuli delivered to the ten digits of a BCI-naive subject can serve as a platform for a brain-computer interface (BCI) that could be used in an interactive application such as robotic vehicle operation. The ten fingertips are used to evoke somatosensory brain responses, thus defining a tactile brain-computer interface (tBCI). Experimental results on subjects performing online (real-time) tBCI, using stimuli with a moderately fast inter-stimulus interval (ISI), validate the tBCI prototype, while the feasibility of the concept is demonstrated through the information-transfer rates obtained in the case study. Comment: Haptic and Audio Interaction Design 2013, Daejeon, Korea, April 18-19, 2013; 15 pages, 4 figures. The final publication will be available at link.springer.co
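    The information-transfer rate mentioned above is conventionally computed with the Wolpaw formula. As an illustrative sketch (the paper's exact computation and parameter values are not given here, so the example numbers are hypothetical), the ITR of a 10-command interface can be estimated as:

    ```python
    import math

    def wolpaw_itr(n_commands, accuracy, selections_per_min):
        """Wolpaw information-transfer rate (bits/min) for an n-class BCI.

        `accuracy` is the probability of a correct selection; at or below
        chance level the rate is reported as zero.
        """
        if accuracy >= 1.0:
            bits = math.log2(n_commands)
        elif accuracy <= 1.0 / n_commands:
            return 0.0
        else:
            bits = (math.log2(n_commands)
                    + accuracy * math.log2(accuracy)
                    + (1 - accuracy)
                    * math.log2((1 - accuracy) / (n_commands - 1)))
        return bits * selections_per_min

    # e.g. a 10-command tBCI at 70% accuracy, 6 selections per minute
    rate = wolpaw_itr(10, 0.70, 6)
    ```

    Because the bit rate collapses toward zero near chance accuracy, a faster ISI only pays off if classification accuracy survives the shorter stimulus exposure.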

    Doctor of Philosophy

    The study of haptic interfaces focuses on the use of the sense of touch in human-machine interaction. This dissertation presents a detailed investigation of lateral skin stretch at the fingertip as a means of communicating direction. Such tactile communication has applications in a variety of situations where traditional audio and visual channels are inconvenient, unsafe, or already saturated. Examples include handheld consumer electronics, where tactile communication would allow a user to control a device without having to look at it, or in-car navigation systems, where the audio and visual directions provided by existing GPS devices can distract the driver's attention away from the road. Lateral skin stretch, the displacement of the skin of the fingerpad in a plane tangent to the fingerpad, is a highly effective means of communicating directional information. Users are able to correctly identify the direction of skin stretch stimuli with skin displacements as small as 0.1 mm at rates as slow as 2 mm/s. Such stimuli can be rendered by a small, portable device suitable for integration into handheld devices. The design of the device-finger interface affects the ability of the user to perceive the stimuli accurately. A properly designed conical aperture effectively constrains the motion of the finger and provides an interface that is practical for use in handheld devices. When a handheld device renders directional tactile cues on the fingerpad, the user must often mentally rotate those cues from the reference frame of the finger to the world-centered reference frame where those cues are to be applied. Such mental rotation incurs a cognitive cost, requiring additional time to mentally process the stimuli. The magnitude of these cognitive costs is a function of the angle of rotation, and of the specific orientations of the arm, wrist, and finger.
Even with the difficulties imposed by the required mental rotations, lateral skin stretch is a promising means of communicating information through the sense of touch, with the potential to substantially improve certain types of human-machine interaction.

    The Change in Fingertip Contact Area as a Novel Proprioceptive Cue

    Humans, many animals, and certain robotic hands have deformable fingertip pads [1, 2]. Deformable pads have the advantage of conforming to the objects that are being touched, ensuring a stable grasp for a large range of forces and shapes. Pad deformations change with finger displacements during touch. Pushing a finger against an external surface typically produces an increase in the gross contact area [3], potentially providing a relative motion cue, a situation comparable to looming in vision [4]. The rate of increase of the contact area also depends on the compliance of the object [5]. Because objects normally do not suddenly change compliance, participants may interpret an artificially induced variation in compliance, which coincides with a change in the gross contact area, as a change in finger displacement, and consequently they may misestimate their finger’s position relative to the touched object. To test this, we asked participants to compare the perceived displacements of their finger while contacting an object varying pseudo-randomly in compliance from trial to trial. Results indicate a bias in the perception of finger displacement induced by the change in compliance, hence in contact area, indicating that participants interpreted the altered cutaneous input as a cue to proprioception. This situation highlights the capacity of the brain to take advantage of knowledge of the mechanical properties of the body and of the external environment.
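    The dependence of gross contact area on force and compliance can be illustrated with a Hertzian contact model. This is a deliberate simplification (real fingertip pads are nonlinear and viscoelastic, and the parameter values below are hypothetical), but it captures the qualitative cue described above:

    ```python
    import math

    def hertz_contact_area(force_n, pad_radius_m, e_star_pa):
        """Hertz contact of a sphere against a flat surface:
        contact radius a = (3*F*R / (4*E*))^(1/3), gross area = pi*a^2.
        A more compliant pad (smaller effective modulus E*) yields a
        larger contact area for the same normal force."""
        a = (3.0 * force_n * pad_radius_m / (4.0 * e_star_pa)) ** (1.0 / 3.0)
        return math.pi * a * a
    ```

    Because the area grows monotonically with force but at a rate set by the effective compliance, covertly changing compliance changes the area cue produced by the same finger displacement, which is exactly the manipulation the experiment exploits.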

    The interaction between motion and texture in the sense of touch

    Besides providing information on elementary properties of objects, like texture, roughness, and softness, the sense of touch is also important in building a representation of object movement and the movement of our hands. Neural and behavioral studies shed light on the mechanisms and limits of our sense of touch in the perception of texture and motion, and on its role in the control of hand movement. The interplay between the geometrical and mechanical properties of touched objects, such as shape and texture, the movement of the hand exploring the object, and the motion felt by touch will be discussed in this article. Interestingly, the interaction between motion and texture can generate perceptual illusions in touch. For example, the orientation and the spacing of the texture elements on a static surface induce the illusion of surface motion when we move our hand over it, or can elicit the perception of a curved trajectory during sliding, straight hand movements. In this work we present a multiperspective view that encompasses both the perceptual and the motor aspects, as well as the response of peripheral and central nerve structures, to analyze and better understand the complex mechanisms underpinning the tactile representation of texture and motion. Such a better understanding of the spatiotemporal features of the tactile stimulus can reveal novel transdisciplinary applications in neuroscience and haptics.

    Effects of Fusion between Tactile and Proprioceptive Inputs on Tactile Perception

    Tactile perception is typically considered the result of cortical interpretation of afferent signals from a network of mechanical sensors underneath the skin. Yet, tactile illusion studies suggest that tactile perception can be elicited without afferent signals from mechanoreceptors. Therefore, the extent to which tactile perception arises from isomorphic mapping of tactile afferents onto the somatosensory cortex remains controversial. We tested whether isomorphic mapping of tactile afferent fibers onto the cortex leads directly to tactile perception by examining whether it is independent of proprioceptive input: we evaluated the impact of different hand postures on the perception of a tactile illusion across the fingertips. Using the Cutaneous Rabbit Effect, a well-studied illusion evoking the perception that a stimulus occurs at a location where none has been delivered, we found that hand posture has a significant effect on the perception of the illusion across the fingertips. This finding emphasizes that tactile perception arises from the integration of perceived mechanical and proprioceptive input, and not purely from tactile interaction with the external environment.

    Brain Mechanism for Enhanced Hand Function with Remote Sensory Stimulation

    The neurological bases for remote vibration-enhanced sensory feedback and motor function remain poorly understood. The purpose of this dissertation was to identify and examine the effect of vibration on finger tactile sensation in healthy adults, and how imperceptible random vibration applied to the wrist changes cortical activity for fingertip sensation and precision grip. In a series of studies on healthy adults, white-noise vibration was applied to one of four locations (dorsum of the hand by the second knuckle, thenar and hypothenar areas, and volar wrist) at one of four intensities (zero, 60%, 80%, and 120% of the sensory threshold for each vibration location), while the fingertip sensation, the smallest vibratory signal that could be perceived on the thumb and index fingertip pads, was assessed. Vibration intensity significantly affected fingertip sensation (p < .01), with all intensities compared against the zero-vibration condition. The next step was to examine the cortical activity for this vibration-enhanced fingertip sensation. We measured somatosensory evoked potentials to assess the peak-to-peak response to light touch of the index fingertip with applied wrist vibration versus without. We observed increased peak-to-peak somatosensory evoked potentials with wrist vibration, especially an increased amplitude of the later component for the somatosensory, motor, and premotor cortex. These findings corroborate an enhanced cortical-level sensory response driven by vibration. It is possible that the cortical modulation observed here is the result of the establishment of transient networks for improved perception. Finally, we examined the effect of imperceptible vibration applied to the wrist on cortical control of precision grip. We measured β-band power to assess the peak-to-peak response while subjects performed a precision pinch with wrist vibration versus without.
We observed increased peak-to-peak β-band power amplitude with wrist vibration, especially event-related synchronization for the prefrontal, sensorimotor, motor, premotor, and supplementary motor areas. The enhanced motor function may be a result of greater recalibration following movement and faster motor learning.
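    Threshold-referenced white-noise stimulation of the kind described can be sketched as follows. This is a minimal illustration; the sampling rate, RMS normalization, and absence of band-limiting are assumptions, not details taken from the dissertation:

    ```python
    import numpy as np

    def white_noise_vibration(duration_s, fs_hz, threshold_amp, level):
        """Gaussian white-noise waveform whose RMS amplitude equals `level`
        times the subject's measured sensory-threshold amplitude
        (e.g. level = 0.6, 0.8, or 1.2 for the 60%, 80%, and 120%
        conditions; level = 0.0 yields silence)."""
        n = int(duration_s * fs_hz)
        noise = np.random.randn(n)
        rms = np.sqrt(np.mean(noise ** 2))
        if rms > 0:
            noise = noise / rms          # normalize to unit RMS
        return level * threshold_amp * noise
    ```

    Expressing intensity relative to each site's own sensory threshold is what makes the 120% condition perceptible and the 60% and 80% conditions subthreshold, regardless of where on the hand or wrist the tactor sits.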

    Crossmodal audio and tactile interaction with mobile touchscreens

    Touchscreen mobile devices often use cut-down versions of desktop user interfaces placing high demands on the visual sense that may prove awkward in mobile settings. The research in this thesis addresses the problems encountered by situationally impaired mobile users by using crossmodal interaction to exploit the abundant similarities between the audio and tactile modalities. By making information available to both senses, users can receive the information in the most suitable way, without having to abandon their primary task to look at the device. This thesis begins with a literature review of related work followed by a definition of crossmodal icons. Two icons may be considered to be crossmodal if and only if they provide a common representation of data, which is accessible interchangeably via different modalities. Two experiments investigated possible parameters for use in crossmodal icons with results showing that rhythm, texture and spatial location are effective. A third experiment focused on learning multi-dimensional crossmodal icons and the extent to which this learning transfers between modalities. The results showed identification rates of 92% for three-dimensional audio crossmodal icons when trained in the tactile equivalents, and identification rates of 89% for tactile crossmodal icons when trained in the audio equivalent. Crossmodal icons were then incorporated into a mobile touchscreen QWERTY keyboard. Experiments showed that keyboards with audio or tactile feedback produce fewer errors and greater speeds of text entry compared to standard touchscreen keyboards. The next study examined how environmental variables affect user performance with the same keyboard. The data showed that each modality performs differently with varying levels of background noise or vibration and the exact levels at which these performance decreases occur were established. 
The final study involved a longitudinal evaluation of a touchscreen application, CrossTrainer, focusing on longitudinal effects on performance with audio and tactile feedback, the impact of context on performance, and personal modality preference. The results show that crossmodal audio and tactile icons are a valid method of presenting information to situationally impaired mobile touchscreen users, with recognition rates of 100% over time. This thesis concludes with a set of guidelines on the design and application of crossmodal audio and tactile feedback to enable application and interface designers to employ such feedback in all systems.

    Augmenting Sensorimotor Control Using “Goal-Aware” Vibrotactile Stimulation during Reaching and Manipulation Behaviors

    We describe two sets of experiments that examine the ability of vibrotactile encoding of simple position error and combined object states (calculated from an optimal controller) to enhance performance of reaching and manipulation tasks in healthy human adults. The goal of the first experiment (tracking) was to follow a moving target with a cursor on a computer screen. Visual and/or vibrotactile cues were provided in this experiment, and vibrotactile feedback was redundant with visual feedback in that it did not encode any information above and beyond what was already available via vision. After only 10 minutes of practice using vibrotactile feedback to guide performance, subjects tracked the moving target with response latency and movement accuracy values approaching those observed under visually guided reaching. Unlike previous reports on multisensory enhancement, combining vibrotactile and visual feedback of performance errors conferred neither positive nor negative effects on task performance. In the second experiment (balancing), vibrotactile feedback encoded a corrective motor command as a linear combination of object states (derived from a linear-quadratic regulator implementing a trade-off between kinematic and energetic performance) to teach subjects how to balance a simulated inverted pendulum. Here, the tactile feedback signal differed from visual feedback in that it provided information that was not readily available from visual feedback alone. Immediately after applying this novel “goal-aware” vibrotactile feedback, time to failure was improved by a factor of three. Additionally, the effect of vibrotactile training persisted after the feedback was removed. 
These results suggest that vibrotactile encoding of appropriate combinations of state information may be an effective form of augmented sensory feedback that can be applied, among other purposes, to compensate for lost or compromised proprioception, as commonly observed, for example, in stroke survivors.
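    The "goal-aware" signal of the second experiment, a corrective command formed as a linear combination of object states via a linear-quadratic regulator, can be sketched as follows. This is an illustrative reconstruction: the pendulum parameters, cost weights, and the mapping to vibration intensity are assumptions, not the paper's values.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Inverted pendulum linearized about the upright equilibrium:
    # state x = [angle (rad), angular velocity (rad/s)]
    g, length, mass = 9.81, 1.0, 1.0
    A = np.array([[0.0, 1.0],
                  [g / length, 0.0]])
    B = np.array([[0.0],
                  [1.0 / (mass * length ** 2)]])

    # LQR trade-off: Q penalizes kinematic error, R penalizes effort
    Q = np.diag([10.0, 1.0])
    R = np.array([[0.1]])

    P = solve_continuous_are(A, B, Q, R)   # Riccati solution
    K = np.linalg.solve(R, B.T @ P)        # optimal gain, u = -K x

    def vibrotactile_command(x, max_intensity=1.0):
        """Map the corrective command u = -K x onto a bounded vibration
        cue: the sign selects the tactor (direction), the magnitude sets
        its amplitude."""
        u = float(-K @ x)
        return float(np.clip(u, -max_intensity, max_intensity))
    ```

    Because u is already a weighted sum of the object states, the cue tells the subject which way and how strongly to act, rather than merely reporting the current position error, which is what distinguishes it from the redundant feedback of the first experiment.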

    Sensory Communication

    Contains table of contents for Section 2, an introduction, and reports on twelve research projects.
    National Institutes of Health Grant 5 R01 DC00117
    National Institutes of Health Contract 2 P01 DC00361
    National Institutes of Health Grant 5 R01 DC00126
    National Institutes of Health Grant R01-DC00270
    U.S. Air Force - Office of Scientific Research Contract AFOSR-90-0200
    National Institutes of Health Grant R29-DC00625
    U.S. Navy - Office of Naval Research Grant N00014-88-K-0604
    U.S. Navy - Office of Naval Research Grant N00014-91-J-1454
    U.S. Navy - Office of Naval Research Grant N00014-92-J-1814
    U.S. Navy - Naval Training Systems Center Contract N61339-93-M-1213
    U.S. Navy - Naval Training Systems Center Contract N61339-93-C-0055
    U.S. Navy - Naval Training Systems Center Contract N61339-93-C-0083
    U.S. Navy - Office of Naval Research Grant N00014-92-J-4005
    U.S. Navy - Office of Naval Research Grant N00014-93-1-119

    W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality

    Despite the importance of softness, there is no evidence of wearable haptic systems able to deliver controllable softness cues. Here, we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that can be worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger-pad. A given stiffness profile is obtained by modulating the stretching state of the fabric through two motors, while a lifting mechanism places the fabric in contact with the user's finger-pad to enable passive softness rendering. In this paper, we describe the architecture of W-FYD and a thorough characterization of its stiffness workspace, frequency response, and softness rendering capabilities. We also computed the device's just-noticeable difference in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding direction perception; the effect of device weight was also considered. Furthermore, the performance of participants and their subjective quantitative evaluation in sliding direction detection and softness discrimination tasks are reported. Finally, applications of W-FYD in tactile augmented reality for open palpation are discussed, opening interesting perspectives in many fields of human-machine interaction.
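    The linear and non-linear stiffness rendering mentioned above can be sketched as a simple force-indentation law. This is an illustrative model only; the gain and exponent below are hypothetical and are not the W-FYD calibration:

    ```python
    def rendered_force(indentation_m, k=200.0, exponent=1.0):
        """Force-indentation law for a rendered softness profile.

        exponent = 1.0 gives linear stiffness F = k * d; exponent > 1.0
        gives a stiffening (non-linear) profile, as when the fabric is
        stretched further the deeper the finger presses."""
        if indentation_m <= 0.0:
            return 0.0  # no contact, no force
        return k * indentation_m ** exponent
    ```

    A display that can switch between such profiles on the fly is what makes it possible to measure separate just-noticeable differences for linear and non-linear stiffness rendering.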