
    Integrated electromyogram and eye-gaze tracking cursor control system for computer users with motor disabilities

    This research pursued the conceptualization, implementation, and testing of a system that allows computer cursor control without requiring hand movement. The target user group consists of individuals who are unable to use their hands because of spinal dysfunction or other afflictions. The system inputs consisted of electromyogram (EMG) signals from muscles in the face and point-of-gaze coordinates produced by an eye-gaze tracking (EGT) system. Each input was processed by an algorithm that produced its own cursor update information, and these outputs were fused to produce effective and efficient cursor control. Experiments were conducted to compare the performance of EMG/EGT, EGT-only, and mouse cursor controls. They revealed that, although EMG/EGT control was slower than EGT-only and mouse control, it effectively controlled the cursor without a spatial accuracy limitation and also facilitated a reliable click operation.
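    The abstract describes two input streams, each producing its own cursor update, fused into one control signal. The sketch below is a minimal illustration of that two-stream idea under stated assumptions: gaze supplies the target position (damped to soften saccadic jumps) while an EMG burst triggers the click. All names, gains, and thresholds are illustrative, not the authors' implementation.

```python
# Hypothetical fusion of one frame of EGT + EMG input.
# gaze_gain and emg_click_threshold are assumed values for illustration.

def fuse_cursor_update(gaze_xy, cursor_xy, emg_amplitude,
                       emg_click_threshold=0.6, gaze_gain=0.3):
    """Return (new_cursor_xy, click) for one frame of EGT + EMG input."""
    gx, gy = gaze_xy
    cx, cy = cursor_xy
    # EGT stream: move the cursor a fraction of the way toward the gaze
    # point each frame, which damps abrupt saccadic jumps.
    new_xy = (cx + gaze_gain * (gx - cx), cy + gaze_gain * (gy - cy))
    # EMG stream: a strong voluntary facial-muscle contraction is the click.
    click = emg_amplitude >= emg_click_threshold
    return new_xy, click
```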

    Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability

    This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though these eye movements are subtle and largely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed system is novel because it adapts to each user's different, and potentially changing, jitter characteristics through the configuration and training of an artificial neural network (ANN) structured to minimize mouse jitter. The ANN is fed the user's eye-gaze behavior recorded during a short training session and learns the relationship between gaze coordinates and mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate the user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. In both tests, the resulting trajectories were significantly smoother and reached fixed or moving targets with relative ease, within a 5% error margin or deviation from the desired trajectories. The positive effects of this jitter reduction are presented graphically for visual appreciation.
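    The core idea is regression from raw (jittery) gaze coordinates to the intended cursor position, learned per user from a short training session. The paper trains a multilayer perceptron; the sketch below keeps the same regression idea but substitutes the simplest possible model, an affine map fit by least squares, on synthetic data. The jitter model (a scale/offset miscalibration plus noise) and all numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training session": true target positions on a 1000x1000 screen,
# measured with a systematic scale/offset error plus saccadic noise.
targets = rng.uniform(0, 1000, size=(200, 2))            # where the user looked
raw_gaze = 1.08 * targets + 30 + rng.normal(0, 20, size=(200, 2))

# Fit gaze -> target with an affine least-squares model (the user "profile";
# the paper would use a trained MLP here instead).
X = np.hstack([raw_gaze, np.ones((200, 1))])
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

def smooth(gaze_xy):
    """Apply the learned per-user profile to one raw gaze sample."""
    return np.array([gaze_xy[0], gaze_xy[1], 1.0]) @ W

# Jitter metric: mean distance from the true target, before vs. after.
before = np.linalg.norm(raw_gaze - targets, axis=1).mean()
after = np.linalg.norm(X @ W - targets, axis=1).mean()
```

On this synthetic data the learned profile removes the systematic component of the error, so `after` is substantially smaller than `before`, mirroring the jitter-reduction percentages the study reports for its ANN.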

    Evaluation of cervical posture improvement of children with cerebral palsy after physical therapy based on head movements and serious games

    Background: This paper presents preliminary results of a novel rehabilitation therapy for cervical and trunk control in children with cerebral palsy (CP), based on serious videogames and physical exercise. Materials: The therapy uses the ENLAZA Interface, a head mouse based on inertial technology, to control a set of serious videogames with movements of the head. Methods: Ten users with CP participated in the study. Whereas the control group (n=5) followed traditional therapies, the experimental group (n=5) complemented them with ten sessions of gaming with ENLAZA to exercise cervical flexion-extensions, rotations, and inclinations in a controlled, engaging environment. Results: The ten sessions yielded improvements in head and trunk control that were larger in the experimental group on the Visual Analogue Scale, Goal Attainment Scaling, and the Trunk Control Measurement Scale (TCMS). Significant differences (27% vs. 2% improvement) were found between the experimental and control groups on the TCMS (p<0.05). The kinematic assessment showed some improvements in active and passive range of motion, but no significant differences were found pre- and post-intervention. Conclusions: Physical therapy that combines serious games with traditional rehabilitation could allow children with CP to achieve larger functional improvements in the trunk and cervical regions. However, given the limited scope of this trial (n=10), additional studies are needed to corroborate this hypothesis.

    Arduino-based myoelectric control: Towards longitudinal study of prosthesis use

    Understanding how upper-limb prostheses are used in daily life helps improve the design and robustness of prosthesis control algorithms and prosthetic components. However, only a very small fraction of published research includes prosthesis use in community settings. Cost, limited battery life, and poor generalisation may be the main factors limiting home-based applications. In this work, we introduce the design of a cost-effective Arduino-based myoelectric control system with wearable electromyogram (EMG) sensors. The design considerations focused on home studies, so robustness, user-friendly control adjustments, and user support were the main concerns. Three control algorithms, namely direct control, abstract control, and linear discriminant analysis (LDA) classification, were implemented in the system. In this paper, we share our design principles and report the robustness of the system in continuous operation in the laboratory. In addition, we show a first real-time implementation of the abstract decoder for prosthesis control with an able-bodied participant.
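    Of the three control schemes named above, LDA classification is the most standard and can be sketched compactly. The example below trains a two-class LDA from scratch on synthetic two-channel EMG features; the feature choice (e.g., mean absolute value per channel), class layout, and numbers are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training features for two gestures, two EMG channels:
# "rest" vs. "hand close" (assumed labels for illustration).
rest = rng.normal([0.1, 0.1], 0.05, size=(100, 2))
close = rng.normal([0.6, 0.4], 0.05, size=(100, 2))

# Two-class LDA with a shared (pooled) covariance matrix.
mu0, mu1 = rest.mean(axis=0), close.mean(axis=0)
pooled = (np.cov(rest.T) + np.cov(close.T)) / 2
w = np.linalg.solve(pooled, mu1 - mu0)   # discriminant direction
b = -w @ (mu0 + mu1) / 2                 # threshold at the class midpoint

def predict(feature_vec):
    """Return 1 for 'hand close', 0 for 'rest'."""
    return int(feature_vec @ w + b > 0)
```

LDA's appeal for an embedded Arduino platform is that, once `w` and `b` are trained offline, each prediction is a single dot product and comparison.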

    Hybrid Human-Machine Interface to Mouse Control for Severely Disabled People

    This paper describes a hybrid human-machine interface, based on the electro-oculogram (EOG) and electromyogram (EMG), that allows mouse control of a personal computer using eye movement and the voluntary contraction of any facial muscle. The bioelectrical signals are sensed through adhesive electrodes and acquired by a custom-designed portable and wireless system. The mouse can be moved in any direction (vertical, horizontal, and diagonal) via two EOG channels, and the EMG signal is used to perform the mouse click action. Blinks are rejected by a decision algorithm, and natural reading of the screen is possible with specially designed software. A virtual keyboard was used for the experiments with healthy people and with a severely disabled patient. The results demonstrate intuitive and accessible control, evaluated in terms of performance, time for task execution, and user acceptance. In addition, a quantitative index to estimate the training impact was computed, with good results.
    Fil: López Celani, Natalia Martina. Universidad Nacional de San Juan. Facultad de Ingeniería. Departamento de Electrónica y Automática. Gabinete de Tecnología Médica; Argentina
    Fil: Orosco, Eugenio Conrado. Universidad Nacional de San Juan. Facultad de Ingeniería. Instituto de Automática; Argentina
    Fil: Pérez Berenguer, María Elisa. Universidad Nacional de San Juan. Facultad de Ingeniería. Departamento de Electrónica y Automática. Gabinete de Tecnología Médica; Argentina
    Fil: Bajinay, Sergio. Universidad Nacional de San Juan. Facultad de Ingeniería. Departamento de Electrónica y Automática. Gabinete de Tecnología Médica; Argentina
    Fil: Zanetti, Roberto. Universidad Nacional de San Juan. Facultad de Ingeniería. Departamento de Electrónica y Automática. Gabinete de Tecnología Médica; Argentina
    Fil: Valentinuzzi, Maximo. Universidad Nacional de San Juan. Facultad de Ingeniería. Departamento de Electrónica y Automática. Gabinete de Tecnología Médica; Argentina
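    The control scheme above maps two EOG channels to cursor direction and an EMG burst to the click. A minimal per-sample decoder for that mapping might look as follows; the thresholds are assumed values, and the paper's actual decision algorithm (including blink rejection) is more elaborate.

```python
# Hypothetical per-sample decoding of (EOG-horizontal, EOG-vertical, EMG).
# move_threshold and click_threshold are illustrative assumptions.

def decode(eog_h, eog_v, emg, move_threshold=0.3, click_threshold=0.5):
    """Map one sample to (dx, dy, click); dx, dy are in {-1, 0, 1}."""
    # Each EOG channel votes left/right or up/down when it exceeds the
    # threshold; both channels active at once gives a diagonal move.
    dx = int(eog_h > move_threshold) - int(eog_h < -move_threshold)
    dy = int(eog_v > move_threshold) - int(eog_v < -move_threshold)
    # A voluntary facial-muscle contraction performs the click.
    return dx, dy, emg > click_threshold
```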

    The potential of the BCI for accessible and smart e-learning

    The brain-computer interface (BCI) should be the accessibility solution “par excellence” for interactive and e-learning systems. There is a substantial tradition of research on the human electroencephalogram (EEG) and on BCI systems based, inter alia, on EEG measurement, yet we have not seen a viable BCI for e-learning. For some users, such as those with major psychomotor or cognitive impairments, a BCI-based interface is their first choice for good-quality interaction. There are many more for whom a BCI would be an attractive option given an acceptable learning overhead, including people with less severe disabilities and those in safety-critical conditions where cognitive overload or limited responses are likely. Recent progress has been modest, as there are many technical and accessibility problems to overcome. We present these issues and report a survey of fifty papers to capture the state of the art in BCI and its implications for e-learning.

    The Tongue Enables Computer and Wheelchair Control for People with Spinal Cord Injury

    The Tongue Drive System (TDS) is a wireless, wearable assistive technology designed to allow individuals with severe motor impairments such as tetraplegia to access their environment using voluntary tongue motion. Previous TDS trials used a magnetic tracer temporarily attached to the top surface of the tongue with tissue adhesive. We investigated TDS efficacy for controlling a computer and driving a powered wheelchair in two groups of able-bodied subjects and a group of volunteers with spinal cord injury (SCI) at C6 or above. All participants received a magnetic tongue barbell and used the TDS for five to six consecutive sessions. Performance was compared for TDS versus keypad and TDS versus a sip-and-puff device (SnP) using accepted measures of speed and accuracy. All performance measures improved over the course of the trial, and the gap between keypad and TDS performance narrowed for the able-bodied subjects. Although the participants with SCI were already familiar with the SnP, their performance measures were up to three times better with the TDS than with the SnP and continued to improve. TDS flexibility and the inherent characteristics of the human tongue enabled individuals with high-level motor impairments to access computers and drive wheelchairs at speeds faster than traditional assistive technologies but with comparable accuracy.
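    "Accepted measures of speed and accuracy" for pointing-interface comparisons commonly include Fitts-style throughput; whether this trial used exactly that index is an assumption, and the sketch below only shows the standard Shannon-formulation computation.

```python
import math

def throughput(distance, width, movement_time_s):
    """Fitts-style throughput: index of difficulty (bits) per second.

    distance: centre-to-centre distance to the target
    width:    target width along the movement axis
    """
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty / movement_time_s
```

For example, acquiring a target 700 units away and 100 units wide in 1.5 s gives log2(8) = 3 bits of difficulty and a throughput of 2 bits/s.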

    Passive wireless tags for tongue controlled assistive technology interfaces

    Tongue control with low-profile, passive mouth tags is demonstrated as a human-device interface by communicating values of tongue-tag separation over a wireless link. Confusion matrices are provided to demonstrate user accuracy in targeting by tongue position. Accuracy is found to increase dramatically after short training sequences, with errors falling close to 1% in magnitude and zero missed targets. The rate at which users learn accurate targeting indicates that this is an intuitive device to operate. The significance of the work is that innovative, very unobtrusive wireless tags can provide intuitive human-computer interfaces based on low-cost, disposable mouth-mounted technology. With the development of an appropriate reading system, control of assistive devices such as computer mice or wheelchairs could be possible for tetraplegics and others who retain fine motor control of their tongues. The tags contain no battery and are intended to fit directly on the hard palate, detecting tongue position in the mouth with no need for tongue piercings.
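    The confusion matrices used to report targeting accuracy are simple bookkeeping; a sketch of that bookkeeping is below. The target labels and trial data are made up for illustration.

```python
def confusion_matrix(true_targets, selected_targets, n_targets):
    """counts[i][j] = number of trials where target i was intended
    and target j was actually selected by tongue position."""
    counts = [[0] * n_targets for _ in range(n_targets)]
    for intended, selected in zip(true_targets, selected_targets):
        counts[intended][selected] += 1
    return counts

def accuracy(counts):
    """Overall targeting accuracy: diagonal mass over total trials."""
    total = sum(sum(row) for row in counts)
    correct = sum(counts[i][i] for i in range(len(counts)))
    return correct / total
```

Off-diagonal entries show which targets are confused with which, so per-target error patterns (not just overall accuracy) are visible, which is why the paper reports full matrices rather than a single number.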

    The human eye as human-machine interface

    Eye tracking as an interface for operating a computer has been under research for some time, and new systems are still being developed that offer encouragement to those whose illnesses prevent any other form of interaction with a computer. Although they use computer-vision processing and a camera, these systems are usually based on head-mounted technology and are therefore considered contact-type systems. This paper describes the implementation of a human-computer interface based on a fully non-contact eye-tracking vision system that allows people with tetraplegia to interact with a computer. As an assistive technology, a graphical user interface with special features was developed, including a virtual keyboard for user communication, fast access to pre-stored phrases and multimedia, and even Internet browsing. The system was developed with a focus on low cost, user-friendly functionality, and user independence and autonomy.
    The authors would like to thank Mr. Abel, his wife, and Mr. Sampaio for their important contributions to the success of this work. This work was supported by the Automation and Robotics Laboratory of the Algoritmi Research Center at the University of Minho in Guimaraes. This work is funded by FEDER through the Operational Competitiveness Programme (COMPETE) and by national funds through the Foundation for Science and Technology (FCT) in the scope of project FCOMP-01-0124-FEDER-022674.
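    Gaze-driven virtual keyboards like the one described commonly select a key when the gaze dwells on it long enough; the paper does not detail its selection mechanism, so this dwell-time sketch is purely an assumption for illustration.

```python
# Hypothetical dwell-time selection: a key is "typed" once the gaze stays
# on it for dwell_frames consecutive camera frames. The frame count is an
# assumed parameter; real systems tune dwell time per user.

def dwell_select(gaze_keys, dwell_frames=3):
    """Return the first key fixated for dwell_frames consecutive frames,
    or None if no fixation lasts that long."""
    run_key, run_len = None, 0
    for key in gaze_keys:
        run_len = run_len + 1 if key == run_key else 1
        run_key = key
        if run_len >= dwell_frames:
            return key
    return None
```

The dwell threshold trades speed against accidental selections (the "Midas touch" problem): too short and glances type keys, too long and typing slows down.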