
    Exploring eye responsive control - from a head mounted to a remote system

    The Attention Responsive Technology (ART) system is designed to enable control of the environment by individuals for whom movement is difficult or undesirable. This paper reports further development of the ART system: its initial head-mounted eye-tracking technology has been replaced with a remotely mounted tracking system. The new system frees the user from wearing any head-mounted equipment, improving comfort and acceptability. Instead, the eye-tracking cameras and the scene camera are situated in a fixed position a short distance from the user; these track the user's eye gaze and field of view, respectively. The system suits many situations in which the user remains seated, for example in a wheelchair or at a workstation onto which the cameras can be mounted.

    Vision-based interface applied to assistive robots

    This paper presents two vision-based interfaces for disabled people to command a mobile robot for personal assistance. The interfaces differ in the image-processing algorithm used to detect and track two different body regions. The first interface detects and tracks movements of the user's head, and these movements are transformed into linear and angular velocities to command the mobile robot. The second interface detects and tracks movements of the user's hand, and these movements are transformed in the same way. The paper also presents the control laws for the robot. Experimental results demonstrate good performance and a balance between complexity and feasibility for real-time applications.

    Authors: Pérez Berenguer, María Elisa (Universidad Nacional de San Juan, Facultad de Ingeniería, Gabinete de Tecnología Médica; CONICET, Argentina); Soria, Carlos Miguel (CONICET, Instituto de Automática, Universidad Nacional de San Juan, Argentina); López Celani, Natalia Martina (Universidad Nacional de San Juan, Facultad de Ingeniería, Gabinete de Tecnología Médica; CONICET, Argentina); Nasisi, Oscar Herminio (Universidad Nacional de San Juan, Facultad de Ingeniería, Instituto de Automática, Argentina); Mut, Vicente Antonio (CONICET, Instituto de Automática, Universidad Nacional de San Juan, Argentina)
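    The head-to-velocity mapping described in this abstract can be sketched as follows. This is an illustrative sketch only, not the paper's actual control laws: it assumes a simple proportional mapping from normalized head displacement to robot velocities, with a dead zone to reject small involuntary movements (gains and thresholds are hypothetical).

    ```python
    def head_to_velocity(dx, dy, k_lin=0.5, k_ang=1.0, dead_zone=0.05):
        """Map normalized head displacement (dx, dy) in [-1, 1] to a robot
        linear velocity v (m/s) and angular velocity w (rad/s)."""
        # Nod forward/back controls forward speed; turning the head steers.
        v = k_lin * dy if abs(dy) > dead_zone else 0.0
        w = -k_ang * dx if abs(dx) > dead_zone else 0.0
        return v, w
    ```

    A real implementation would feed (v, w) to the robot's velocity controller at a fixed rate and re-estimate the head pose in every camera frame.
    
    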

    Wheelchair-based game design for older adults

    Few leisure activities are accessible to institutionalized older adults using wheelchairs; in consequence, they experience lower levels of perceived health than able-bodied peers. Video games have been shown to be an engaging leisure activity for older adults. In our work, we address the design of wheelchair-accessible motion-based games. We present KINECTWheels, a toolkit designed to integrate wheelchair movements into motion-based games, and Cupcake Heaven, a wheelchair-based video game designed for older adults using wheelchairs. Results of two studies show that KINECTWheels can be applied to make motion-based games wheelchair-accessible, and that wheelchair-based games engage older adults. Through the application of the wheelchair as an enabling technology in play, our work has the potential to encourage older adults to develop a positive relationship with their wheelchair. Copyright 2013 ACM.

    Designing wheelchair-based movement games

    People using wheelchairs have access to fewer sports and other physically stimulating leisure activities than nondisabled persons, and often lead sedentary lifestyles that negatively influence their health. While motion-based video games have demonstrated great potential for encouraging physical activity among nondisabled players, the accessibility of motion-based games is limited for persons with mobility disabilities, thus also limiting access to the potential health benefits of playing these games. In our work, we address this issue through the design of wheelchair-accessible motion-based game controls. We present KINECTWheels, a toolkit designed to integrate wheelchair movements into motion-based games. Building on the toolkit, we developed Cupcake Heaven, a wheelchair-based video game designed for older adults using wheelchairs, and we created Wheelchair Revolution, a motion-based dance game that is accessible to both persons using wheelchairs and nondisabled players. Evaluation results show that KINECTWheels can be applied to make motion-based games wheelchair-accessible, and that wheelchair-based games engage broad audiences in physically stimulating play. Through the application of the wheelchair as an enabling technology in games, our work has the potential to encourage players of all ages to develop a positive relationship with their wheelchair.
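    The integration of wheelchair movements into game input can be sketched roughly as below. The function name and the pose parameters are hypothetical, not taken from the KINECTWheels toolkit; the sketch only assumes that tracking yields a chair yaw angle and a forward speed, which a game binds to steering and thrust.

    ```python
    def chair_to_game_input(yaw_deg, speed, turn_threshold=10.0):
        """Translate a tracked wheelchair pose into a discrete game command.
        yaw_deg: chair rotation from the neutral heading, in degrees.
        speed:   forward speed of the chair, in m/s."""
        if yaw_deg > turn_threshold:
            return "steer_right"
        if yaw_deg < -turn_threshold:
            return "steer_left"
        return "thrust" if speed > 0.1 else "idle"
    ```

    A game loop would call this once per tracking frame and feed the resulting command into its normal input-handling path, so wheelchair and gamepad players share the same game logic.
    
    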

    Gaze-based teleprosthetic enables intuitive continuous control of complex robot arm use: Writing & drawing

    Eye tracking is a powerful means of assistive technology for people with movement disorders, paralysis, and amputations. We present a highly intuitive eye-tracking-controlled robot arm operating in three-dimensional space, driven by the user's gaze target point, that enables tele-writing and drawing. Usability and intuitiveness were assessed in a "tele"-writing experiment with 8 subjects who learned to operate the system within minutes of first-time use. These subjects were naive to the system and the task, and had to write three letters on a whiteboard with a whiteboard pen attached to the robot arm's endpoint. They were instructed to imagine they were writing text with the pen and to look where the pen should go, writing the letters as quickly and as accurately as possible given a letter-size template. Subjects were able to perform the task with facility and accuracy, and movements of the arm did not interfere with the subjects' ability to control their visual attention so as to enable smooth writing. Over five consecutive trials there was a significant decrease in the total time used and in the total number of commands sent to move the robot arm from the first to the second trial, but no further improvement thereafter, suggesting that after writing six letters subjects had mastered control of the system. Our work demonstrates that eye tracking is a powerful means to control robot arms in closed loop and in real time, outperforming other invasive and non-invasive approaches to brain-machine interfaces in terms of calibration time (<2 minutes), training time (<10 minutes), and interface technology cost. We suggest that gaze-based decoding of action intention may well become one of the most efficient ways to interface with robotic actuators, i.e. brain-robot interfaces, and become useful beyond paralysed and amputee users for the general teleoperation of robots and exoskeletons in human augmentation.
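    The closed-loop gaze-to-endpoint control described above can be illustrated with a minimal sketch. The abstract does not specify the decoder, so this assumes a simple scheme (names hypothetical) in which the 2-D gaze point on the whiteboard becomes the pen target, low-pass filtered to suppress fixation jitter before being commanded to the arm.

    ```python
    class GazePenController:
        """Turn a stream of gaze samples into smoothed pen-endpoint targets."""

        def __init__(self, alpha=0.2):
            self.alpha = alpha   # smoothing factor in (0, 1]; lower = smoother
            self.target = None   # filtered pen target on the board plane

        def update(self, gaze_xy):
            """Feed one gaze sample (x, y) in board coordinates; return the
            exponentially smoothed target to command the robot arm toward."""
            if self.target is None:
                self.target = gaze_xy
            else:
                x = (1 - self.alpha) * self.target[0] + self.alpha * gaze_xy[0]
                y = (1 - self.alpha) * self.target[1] + self.alpha * gaze_xy[1]
                self.target = (x, y)
            return self.target
    ```

    Because the user watches the pen while writing, the loop is naturally closed through vision: tracking error shows up as a gap between pen and gaze, which the next updates correct.
    
    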

    A Framework for Mouse Emulation that Uses a Minimally Invasive Tongue Palate Control Device utilizing Resistopalatography

    The ability to interface fluently with a robust human input device is a major challenge facing patients with severe levels of disability. This paper describes a new method of computer interaction that uses force-sensitive resistor array technology embedded into an intra-oral device (Resistopalatography) to emulate a USB Human Interface Device using standard drivers. The user manipulates the sensors with their tongue to produce position and force measurements; these are then analyzed to generate the metrics needed to control a mouse for computer input.
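    Deriving mouse metrics from such a sensor array can be sketched as follows. This is a hedged illustration, not the paper's method: it assumes the intra-oral device reports a grid of per-sensor force readings, takes the force-weighted centroid as the tongue position, and treats a press above a force threshold as a click.

    ```python
    def mouse_metrics(grid, click_threshold=5.0):
        """grid: 2-D list of force readings from the palate-mounted array.
        Returns ((cx, cy), clicked): the force-weighted centroid of tongue
        contact in sensor coordinates, and whether any sensor exceeds the
        click threshold. Returns (None, False) when there is no contact."""
        total = sum(sum(row) for row in grid)
        if total == 0:
            return None, False
        cx = sum(f * x for row in grid for x, f in enumerate(row)) / total
        cy = sum(f * y for y, row in enumerate(grid) for f in row) / total
        clicked = max(max(row) for row in grid) >= click_threshold
        return (cx, cy), clicked
    ```

    A cursor driver would then map frame-to-frame centroid deltas to relative mouse movement in the emulated HID reports.
    
    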

    Passive wireless tags for tongue controlled assistive technology interfaces

    Tongue control with low-profile, passive mouth tags is demonstrated as a human–device interface by communicating values of tongue–tag separation over a wireless link. Confusion matrices demonstrate user accuracy in targeting by tongue position. Accuracy increases dramatically after short training sequences, with errors falling to roughly 1% and zero missed targets. The speed at which users learn accurate targeting indicates that this is an intuitive device to operate. The significance of the work is that these very unobtrusive, innovative wireless tags can provide intuitive human–computer interfaces based on low-cost, disposable mouth-mounted technology. With the development of an appropriate reading system, control of assistive devices such as computer mice or wheelchairs could be possible for tetraplegics and others who retain fine motor control of their tongues. The tags contain no battery and are intended to fit directly on the hard palate, detecting tongue position in the mouth with no need for tongue piercings.
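    The targeting-by-separation scheme in this abstract can be sketched with a simple nearest-value decoder. The target names and calibrated separations below are illustrative assumptions, not values from the paper; the idea is only that each measured tongue-tag separation is assigned to the closest calibrated target, and misassignments populate the off-diagonal of the confusion matrix.

    ```python
    def decode_target(separation_mm,
                      calibrated={"left": 2.0, "centre": 6.0, "right": 10.0}):
        """Return the target whose calibrated tongue-tag separation (mm)
        is closest to the measured separation."""
        return min(calibrated, key=lambda t: abs(calibrated[t] - separation_mm))
    ```

    Training tightens each user's separation distribution around the calibrated values, which is consistent with the reported drop in errors after short practice.
    
    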