
    Enabling audio-haptics

    This thesis deals with possible solutions for facilitating orientation, navigation, and overview of non-visual interfaces and virtual environments using sound in combination with force-feedback haptics. Applications with haptic force-feedback …

    Upper extremity rehabilitation using interactive virtual environments

    Stroke affects more than 700,000 people annually in the U.S. and is the leading cause of major disability. Recovery of upper extremity function remains particularly resistant to intervention, with 80% to 95% of persons demonstrating residual upper extremity impairments lasting beyond six months after the stroke. The NJIT Robot Assistive Virtual Rehabilitation (NJIT-RAVR) system has been developed to study optimal strategies for rehabilitation of arm and hand function. Several commercially available devices, such as the HapticMaster™, Cyberglove™, trakSTAR™, and Cybergrasp™, were integrated, and 11 simulations were developed to allow users to interact with virtual environments. The visual interfaces used in these simulations were programmed either in Virtools or in C++ using the OpenGL library. Stereoscopic glasses were used to enhance depth perception and to present movement targets to the subjects in a three-dimensional stereo workspace. Adaptive online and offline algorithms were developed to provide appropriate task difficulty and optimize outcomes.

    A pilot study was conducted with four stroke patients and two children with cerebral palsy to demonstrate the usability of this robot-assisted VR system. The RAVR system performed well, without unexpected glitches, during two weeks of training. No subjects experienced side effects such as dizziness, nausea, or disorientation while interacting with the virtual environment, and each subject was able to finish the training either with or without robotic adaptive assistance.

    To investigate optimal therapeutic approaches, forty stroke subjects were randomly assigned to two groups: Hand and Arm training Together (HAT) and Hand and Arm training Separately (HAS). Each group trained in similar virtual reality environments for three hours a day, four days a week, for two weeks. In addition, twelve stroke subjects participated as a control group and received conventional rehabilitation training of similar intensity and duration to the HAS and HAT groups. Clinical outcome measurements included the Jebsen Test of Hand Function, the Wolf Motor Function Test, and the ReachGrasp test; secondary outcome measurements were calculated from kinematic and kinetic data collected in real time at 100 Hz during training. Both the HAS and HAT groups showed significant improvement in clinical and kinematic outcome measurements, and clinical improvement compared favorably to randomized clinical trials reported in the literature; however, there was no significant difference in improvement between the two groups. Subjects from the control group improved in clinical measurements and in the ReachGrasp test. Compared to the control group, the combined HAS and HAT group showed a larger increase in movement speed during reaching and in the efficiency of lifting an object from the table on the ReachGrasp test.

    The NJIT-RAVR system was further modified to address the needs of children with hemiplegia due to cerebral palsy. Thirteen children with cerebral palsy completed a total of nine one-hour training sessions over three weeks. Nine of the children were trained using the RAVR system alone, and the other four received combined Constraint-Induced Movement Therapy and RAVR training. As a group, the children demonstrated improved performance across measurements of arm range of motion (AROM), motor function, kinematics, and motor control. While subjects' responses to the games varied, they performed each simulation while maintaining sufficient attention to improve both in robotic task performance and in measures of motor function.
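
    The abstract does not detail the adaptive online algorithms used to set task difficulty; the sketch below illustrates one common approach (a success-rate-driven staircase) under stated assumptions. The class name, window size, and thresholds are illustrative, not the NJIT-RAVR implementation.

    ```python
    from collections import deque

    class AdaptiveDifficulty:
        """Illustrative success-rate staircase; not the published NJIT-RAVR algorithm."""

        def __init__(self, level=0.5, window=10, step=0.05, lower=0.6, upper=0.8):
            self.level = level              # normalized task difficulty in [0, 1]
            self.results = deque(maxlen=window)
            self.step = step
            self.lower = lower              # ease the task below this success rate
            self.upper = upper              # harden the task above this success rate

        def record_trial(self, success: bool) -> float:
            """Update difficulty after each reach or grasp trial."""
            self.results.append(success)
            if len(self.results) == self.results.maxlen:
                rate = sum(self.results) / len(self.results)
                if rate > self.upper:
                    self.level = min(1.0, self.level + self.step)
                elif rate < self.lower:
                    self.level = max(0.0, self.level - self.step)
            return self.level
    ```

    In such a scheme, the returned level would be mapped onto simulation parameters (target size, required range of motion, or the amount of robotic assistance) between trials.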

    A Haptic System for Depicting Mathematical Graphics for Students with Visual Impairments

    When teaching students with visual impairments, educators generally rely on tactile tools to depict visual mathematical topics. Tactile media, such as embossed paper and simple manipulable materials, are typically used to convey graphical information. Although these tools are easy to use and relatively inexpensive, they are solely tactile and are not modifiable. Dynamic and interactive technologies such as pin matrices and haptic pens are also commercially available, but tend to be more expensive and less intuitive. This study aims to bridge the gap between easy-to-use tactile tools and dynamic, interactive technologies in order to facilitate the haptic learning of mathematical concepts. We developed a haptic assistive device using a Tanvas electrostatic touchscreen that provides the user with multimodal (haptic, auditory, and visual) output. Three methodological steps comprise this research: 1) a systematic literature review of the state of the art in the design and testing of tactile and haptic assistive devices, 2) a user-centered system design, and 3) testing of the system’s effectiveness via a usability study. The electrostatic touchscreen exhibits promise as an assistive device for displaying visual mathematical elements via the haptic modality.
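
    The abstract does not specify how graph geometry is delivered to the electrostatic display; a minimal sketch of one plausible mapping is shown below, assuming the graph is rasterized into a per-pixel friction map that the display hardware then renders. The grid size, line thickness, and the absence of any Tanvas SDK calls are all assumptions for illustration.

    ```python
    import numpy as np

    def friction_map(f, x_range=(-5.0, 5.0), y_range=(-5.0, 5.0),
                     width=320, height=240, line_px=4):
        """Rasterize y = f(x) into a friction-intensity grid (0 = smooth, 1 = rough).

        Hypothetical pre-processing step for an electrostatic-friction touchscreen;
        resolution and line thickness are illustrative, not device parameters.
        """
        grid = np.zeros((height, width))
        xs = np.linspace(x_range[0], x_range[1], width)
        ys = f(xs)
        # Map function values to row indices (top row corresponds to y_max).
        rows = (y_range[1] - ys) / (y_range[1] - y_range[0]) * (height - 1)
        for col, row in enumerate(rows):
            if not np.isfinite(row) or not (0 <= row < height):
                continue
            r0 = max(0, int(row) - line_px // 2)
            r1 = min(height, int(row) + line_px // 2 + 1)
            grid[r0:r1, col] = 1.0   # raised friction along the curve
        return grid

    # Example: a parabola rendered as a tactile line
    texture = friction_map(lambda x: 0.2 * x**2 - 3)
    ```

    Auditory and visual channels could then be driven from the same grid, consistent with the multimodal output the study describes.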

    Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback

    The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become increasingly commonplace, appearing on mobile devices, vending machines, ATMs, and the control panels of industrial machines, where conventional input devices cannot provide intuitive, rapid, and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored touchscreen interfaces and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard, static surfaces of conventional touchscreens poses challenges for interface design and touchscreen usability, particularly in distracting, low-visibility environments.

    Current technology allows the human tactile modality to be used in touchscreens. While the visual channel conveys graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selecting and controlling a more intuitive process, and tactile features can compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden.

    In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded. As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, giving users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then investigated with the aim of mapping haptic signals to different usage scenarios for performing primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces, and touchscreen applications.
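
    Detents and pulses are described only qualitatively; the sketch below shows one way a detent cue might be generated for a touchscreen slider, where the control is quantized into notch positions and a pulse is requested each time the finger crosses a notch. The function names, detent count, and the omitted actuator call are assumptions, not a specific haptic API.

    ```python
    def detent_slider(num_detents: int = 10):
        """Illustrative detent generator for a touchscreen slider control."""
        last_detent = None

        def update(position: float) -> bool:
            """position: normalized slider value in [0, 1].

            Returns True when a detent boundary is crossed; the caller would
            then trigger a short vibration pulse (actuator API not shown).
            """
            nonlocal last_detent
            detent = int(round(position * (num_detents - 1)))
            crossed = last_detent is not None and detent != last_detent
            last_detent = detent
            return crossed

        return update

    # Example: simulate a finger sweeping across the slider
    update = detent_slider()
    for pos in [0.0, 0.04, 0.11, 0.25, 0.26, 0.40]:
        if update(pos):
            print(f"detent pulse at position {pos:.2f}")
    ```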

    Sensory Communication

    Contains table of contents for Section 2, an introduction, and reports on twelve research projects.
    Supported by:
    National Institutes of Health Grant R01 DC00117
    National Institutes of Health Grant R01 DC02032
    National Institutes of Health/National Institute of Deafness and Other Communication Disorders Grant 2 R01 DC00126
    National Institutes of Health Grant 2 R01 DC00270
    National Institutes of Health Contract N01 DC-5-2107
    National Institutes of Health Grant 2 R01 DC00100
    U.S. Navy - Office of Naval Research Grant N61339-96-K-0002
    U.S. Navy - Office of Naval Research Grant N61339-96-K-0003
    U.S. Navy - Office of Naval Research Grant N00014-97-1-0635
    U.S. Navy - Office of Naval Research Grant N00014-97-1-0655
    U.S. Navy - Office of Naval Research Subcontract 40167
    U.S. Navy - Office of Naval Research Grant N00014-96-1-0379
    U.S. Air Force - Office of Scientific Research Grant F49620-96-1-0202
    National Institutes of Health Grant R01 NS33778
    Massachusetts General Hospital, Center for Innovative Minimally Invasive Therapy Research Fellowship Grant

    Playful haptic environment for engaging visually impaired learners with geometric shapes

    This thesis asserts that modern developments in technology have not been used as extensively as they could be to aid blind people in their learning objectives; the same could also be said of many other areas of their lives. In particular, in many countries blind students are discouraged from learning mathematics because of the intrinsically visual nature of many of the topics, particularly geometry, and for many young people mathematics is in any case not a subject that is easily or willingly tackled. The research presented here has thus sought to answer whether a playful haptic environment could be developed that would be attractive to blind users for learning about and interacting with geometric concepts. In the study, a software tool using a haptic interface was developed with certain playful characteristics. The environment gave blind users practice in interacting with three-dimensional geometric shapes and in investigating the size of these shapes and their cross-sections. The playful elements were enhanced by adding competitive features such as scores and time limits. The tests have shown that blind users can easily use the system to learn about three-dimensional shapes and that practice increases their confidence in recognising the shape and size of these objects.
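
    The scoring and time-limit mechanics are described only in general terms; a minimal sketch of how a timed shape-recognition round might be scored is given below. The callback names, point values, and time limit are hypothetical, and the haptic rendering itself is left to the device layer.

    ```python
    import time

    def play_round(present_shape, get_answer, shapes, time_limit=30.0, base_points=10):
        """Illustrative scoring loop for a timed shape-recognition game.

        present_shape(shape): render the named shape on the haptic device (not shown).
        get_answer(): blocking call returning the learner's guess as a string.
        """
        score = 0
        for shape in shapes:
            present_shape(shape)
            start = time.monotonic()
            guess = get_answer()
            elapsed = time.monotonic() - start
            if guess == shape and elapsed <= time_limit:
                # Faster correct answers earn a bonus, encouraging friendly competition.
                score += base_points + int(max(0.0, time_limit - elapsed))
            # Wrong or late answers score nothing but do not end the round.
        return score
    ```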

    Principles and Guidelines for Advancement of Touchscreen-Based Non-visual Access to 2D Spatial Information

    Graphical materials such as graphs and maps are often inaccessible to millions of blind and visually-impaired (BVI) people, which negatively impacts their educational prospects, ability to travel, and vocational opportunities. To address this longstanding issue, a three-phase research program was conducted that builds on and extends previous work establishing touchscreen-based haptic cuing as a viable alternative for conveying digital graphics to BVI users. Although promising, this approach poses unique challenges that can only be addressed by schematizing the underlying graphical information based on perceptual and spatio-cognitive characteristics pertinent to touchscreen-based haptic access. Towards this end, this dissertation empirically identified a set of design parameters and guidelines through a logical progression of seven experiments. Phase I investigated perceptual characteristics related to touchscreen-based graphical access using vibrotactile stimuli, with results establishing three core perceptual guidelines: (1) a minimum line width of 1mm should be maintained for accurate line-detection (Exp-1), (2) a minimum interline gap of 4mm should be used for accurate discrimination of parallel vibrotactile lines (Exp-2), and (3) a minimum angular separation of 4mm should be used for accurate discrimination of oriented vibrotactile lines (Exp-3). Building on these parameters, Phase II studied the core spatio-cognitive characteristics pertinent to touchscreen-based non-visual learning of graphical information, with results leading to the specification of three design guidelines: (1) a minimum width of 4mm should be used for supporting tasks that require tracing of vibrotactile lines and judging their orientation (Exp-4), (2) a minimum width of 4mm should be maintained for accurate line tracing and learning of complex spatial path patterns (Exp-5), and (3) vibrotactile feedback should be used as a guiding cue to support the most accurate line tracing performance (Exp-6). Finally, Phase III demonstrated that schematizing line-based maps based on these design guidelines leads to development of an accurate cognitive map. Results from Experiment-7 provide theoretical evidence in support of learning from vision and touch as leading to the development of functionally equivalent amodal spatial representations in memory. Findings from all seven experiments contribute to new theories of haptic information processing that can guide the development of new touchscreen-based non-visual graphical access solutions
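
    The perceptual guidelines above are stated as minimum physical dimensions; the sketch below shows how they might be enforced when schematizing a line map for vibrotactile rendering. The guideline constants restate values from the abstract, while the millimeter-to-pixel conversion, the assumed display density, and the helper names are illustrative.

    ```python
    # Minimum dimensions taken from the abstract's guidelines (in millimeters).
    MIN_DETECT_WIDTH_MM = 1.0   # line detection (Exp-1)
    MIN_INTERLINE_GAP_MM = 4.0  # discriminating parallel lines (Exp-2)
    MIN_TRACE_WIDTH_MM = 4.0    # tracing lines and judging orientation (Exp-4, Exp-5)

    def mm_to_px(mm: float, dpi: float = 264.0) -> int:
        """Convert millimeters to device pixels; the dpi value is an assumption."""
        return max(1, round(mm / 25.4 * dpi))

    def schematized_line_width(requested_mm: float, task: str = "trace") -> int:
        """Clamp a requested line width to the guideline minimum for the task."""
        minimum = MIN_TRACE_WIDTH_MM if task == "trace" else MIN_DETECT_WIDTH_MM
        return mm_to_px(max(requested_mm, minimum))

    # Example: a 0.5 mm hairline is widened to the 4 mm tracing minimum before rendering.
    print(schematized_line_width(0.5, task="trace"))
    ```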

    Haptic wearables as sensory replacement, sensory augmentation and trainer - a review

    Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed at improving function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded, body-worn devices that interact with the skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment: total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as a trainer. This review found that wearable haptic devices improved function for a variety of clinical applications, including rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss, and hearing loss. Future development of haptic wearables should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.