
    Sensory System for Implementing a Human—Computer Interface Based on Electrooculography

    This paper describes a sensory system for implementing a human–computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time on a microcontroller-based platform running the Linux operating system. A continuous wavelet transform and a neural network are used to process and analyse the signals, yielding highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes.

    EOG-Based Eye Movement Classification and Application on HCI Baseball Game

    © 2013 IEEE. Electrooculography (EOG) is considered the most stable physiological signal for developing human–computer interfaces (HCIs) that detect eye-movement variations. EOG signal classification has gained traction in recent years as a means of overcoming physical inconvenience for paralyzed patients. In this paper, a robust technique for classifying eight directional eye movements is investigated by introducing a buffer concept, together with the variation of the slope, to avoid misclassification of EOG signals. Blink detection becomes complicated when only the magnitude of the signals is considered; hence, a correction technique is introduced to avoid misclassifying oblique eye movements. As a case study, these correction techniques are applied to an HCI baseball game operated by eye movements.
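The buffer idea can be sketched as a simplified stand-in: per-sample amplitude thresholds on two EOG channels assign one of eight directions, and a small buffer only emits a decision when several consecutive labels agree. The thresholds, channel conventions, and voting scheme below are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical deflection thresholds (volts); real systems calibrate per user.
H_TH = 50e-6
V_TH = 50e-6

def classify_direction(h, v):
    """Map one horizontal/vertical EOG sample pair to one of eight
    directions (or "none") by sign and magnitude thresholds."""
    horiz = "right" if h > H_TH else "left" if h < -H_TH else ""
    vert = "up" if v > V_TH else "down" if v < -V_TH else ""
    if horiz and vert:
        return f"{vert}-{horiz}"          # oblique movement, e.g. "up-left"
    return horiz or vert or "none"

def classify_buffered(h_samples, v_samples, votes=3):
    """Buffer per-sample decisions and emit a direction only when `votes`
    consecutive labels agree, suppressing spurious single-sample flips."""
    buffer, out = [], []
    for h, v in zip(h_samples, v_samples):
        buffer.append(classify_direction(h, v))
        if len(buffer) > votes:
            buffer.pop(0)
        if len(buffer) == votes and len(set(buffer)) == 1 and buffer[0] != "none":
            if not out or out[-1] != buffer[0]:   # one event per held movement
                out.append(buffer[0])
    return out

# A rightward saccade followed by an oblique up-left movement.
h = [0, 80e-6, 80e-6, 80e-6, -80e-6, -80e-6, -80e-6]
v = [0, 0, 0, 0, 70e-6, 70e-6, 70e-6]
print(classify_buffered(h, v))   # ['right', 'up-left']
```

The buffer is what prevents a single noisy sample from flipping the output between neighbouring directions, which is the misclassification effect the abstract refers to.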

    Biosignal‐based human–machine interfaces for assistance and rehabilitation : a survey

    As a definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs, which take such biosignals as inputs to control various applications. The current survey reviews the large literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were surveyed using specific keywords. The retrieved studies were screened at three levels (title, abstract, full text), and eventually 144 journal papers and 37 conference papers were included. Four macrocategories were used to classify the different biosignals for HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. A moderate increase can be observed in studies on robotic control, prosthetic control, and gesture recognition over the last decade, whereas studies on the other targets saw only a small increase. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance. However, they also increase HMIs' complexity, so their usefulness should be carefully evaluated for the specific application.

    Design of a Wearable Eye-Movement Detection System Based on Electrooculography Signals and Its Experimental Validation.

    In the assistive research area, human–computer interface (HCI) technology is used to help people with disabilities convey their intentions and thoughts to the outside world. Many HCI systems based on eye movement have been proposed to assist people with disabilities. However, due to the complexity of the necessary algorithms and the difficulty of hardware implementation, there are few general-purpose designs that consider practicality and stability in real life. To address these limitations, an HCI system based on electrooculography (EOG) is proposed in this study. The proposed classification algorithm provides eye-state detection, covering the fixation, saccade, and blinking states. Moreover, the algorithm can distinguish among ten kinds of saccade movements (i.e., up, down, left, right, farther left, farther right, up-left, down-left, up-right, and down-right). In addition, an HCI system was developed around this eye-movement classification algorithm, providing an eye-dialing interface that can be used to improve the lives of people with disabilities. The results illustrate the good performance of the proposed classification algorithm, and the EOG-based system, which can detect ten different eye-movement features, can be utilized in real-life applications.
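One common way to separate the three eye states, sketched below with placeholder thresholds (not the paper's algorithm), is to look at the signal slope: a fast deflection that returns to baseline within a short window is treated as a blink, while one that settles at a new level is a saccade; everything else is fixation.

```python
import numpy as np

def detect_events(eog, fs, slope_th=0.004, return_win_ms=150, level_tol=50e-6):
    """Classify fast EOG deflections as 'blink' or 'saccade'.
    A fast edge whose level returns to baseline within `return_win_ms`
    is a blink; one that settles at a new level is a saccade.
    All thresholds are illustrative placeholders."""
    deriv = np.gradient(eog) * fs                 # slope in V/s
    win = int(return_win_ms * fs / 1000)
    events, i = [], 0
    while i < len(eog):
        if abs(deriv[i]) > slope_th:
            base = eog[max(i - 5, 0)]             # level just before the edge
            j = min(i + win, len(eog) - 1)
            if abs(eog[j] - base) < level_tol:
                events.append((i, "blink"))
            else:
                events.append((i, "saccade"))
            i = j                                  # skip past the event
        i += 1
    return events

fs = 250
t = np.arange(0, 2, 1 / fs)
eog = np.zeros_like(t)
eog[(t > 0.4) & (t < 0.46)] = 300e-6               # 60 ms blink-like spike
eog[t > 1.2] = 150e-6                              # saccade: persistent level shift
print(detect_events(eog, fs))
```

Samples not covered by any event are implicitly fixation; a real system would also need the direction logic on top of this to reach the ten saccade classes.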

    Graphene textiles towards soft wearable interfaces for electroocular remote control of objects

    Study of eye movements (EMs) and measurement of the resulting biopotentials, referred to as electrooculography (EOG), may find increasing use in activity recognition, context awareness, mobile human-computer interaction (HCI), and personalized medicine, provided that the limitations of conventional “wet” electrodes are addressed. To overcome these limitations, this work reports, for the first time, the use and characterization of graphene-based electroconductive textile electrodes for EOG acquisition using a custom-designed embedded eye tracker. This self-contained wearable device consists of a headband with integrated textile electrodes and small, pocket-worn, battery-powered hardware with real-time signal processing that can stream data to a remote device over Bluetooth. The feasibility of the developed gel-free, flexible, dry textile electrodes was experimentally validated through side-by-side comparison with pre-gelled, wet silver/silver chloride (Ag/AgCl) electrodes, where the simultaneously and asynchronously recorded signals displayed correlations of up to ~87% and ~91%, respectively, over durations reaching one hundred seconds, repeated across several participants. Additionally, an automatic EM detection algorithm is developed, and the performance of the graphene-embedded “all-textile” EM sensor and its application as a control element for HCI is experimentally demonstrated. The success rate, ranging from 85% to 100% across eleven different EM patterns, demonstrates the applicability of the proposed algorithm in wearable EOG-based sensing and HCI applications with graphene textiles. The system-level integration and holistic design approach presented herein, which starts from the fundamental materials level and extends to the architecture and algorithm stages, will be instrumental in advancing the state of the art in wearable electronic devices based on sensing and processing of electrooculograms.
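The reported electrode comparison comes down to a correlation coefficient between simultaneously recorded channels. A sketch with synthetic data (all signal parameters are assumed, not taken from the paper) using the Pearson coefficient:

```python
import numpy as np

rng = np.random.default_rng(42)
fs, dur = 250, 100                       # 100 s recording at 250 Hz (assumed)
t = np.arange(0, dur, 1 / fs)

# Shared "true" EOG activity plus electrode-specific noise, gain, and offset.
truth = 200e-6 * np.sin(2 * np.pi * 0.3 * t)
agcl = truth + 8e-6 * rng.standard_normal(len(t))
textile = 0.9 * truth + 20e-6 * rng.standard_normal(len(t)) + 30e-6

# Pearson correlation between the two recordings; corrcoef removes the
# mean, so the textile electrode's DC offset does not affect the score.
r = np.corrcoef(agcl, textile)[0, 1]
print(round(r, 3))
```

Because the coefficient is invariant to gain and offset, a textile electrode with a weaker, offset signal can still correlate highly with the Ag/AgCl reference as long as the waveform shape is preserved, which is what the ~87% / ~91% figures capture.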