170 research outputs found

    Gyro-Accelerometer based Control of an Intelligent Wheelchair

    This paper presents a hands-free interface to control an electric wheelchair using head gestures, intended for people with severe disabilities such as multiple sclerosis, quadriplegia, and age-related impairments. The patient's head acceleration and rotation rate are used to control the intelligent wheelchair. Head gestures are detected using the accelerometer and gyroscope embedded on a single MPU6050 board, and the MEMS sensor outputs are combined with a Kalman filter for sensor fusion to build a highly accurate orientation sensor. The system uses an Arduino Mega microcontroller to perform the data processing, sensor fusion, and joystick emulation that control the intelligent wheelchair, and HC-SR04 ultrasonic sensors to provide safe navigation. The wheelchair can be controlled in two modes: in the first, it is driven by the usual joystick; in the second, the patient steers with head motion. The principal advantage of the proposed approach is that switching between the two control modes is soft, straightforward, and transparent to the user.
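
    The abstract does not reproduce the filter itself, but the fusion it describes (gyroscope rate integration corrected by an accelerometer-derived absolute angle) is a well-known pattern for the MPU6050. A minimal one-axis Python sketch follows; the noise parameters and the accelerometer pitch convention are illustrative assumptions, not values from the paper.

    import math

    class KalmanAngle:
        """One-axis Kalman filter fusing a gyro rate with an accelerometer angle:
        the gyroscope is smooth but drifts, the accelerometer is noisy but
        drift-free, so the filter tracks both the angle and the gyro bias."""

        def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
            self.q_angle, self.q_bias, self.r_measure = q_angle, q_bias, r_measure
            self.angle = 0.0                   # fused angle estimate (degrees)
            self.bias = 0.0                    # estimated gyro bias (deg/s)
            self.P = [[0.0, 0.0], [0.0, 0.0]]  # error covariance

        def update(self, accel_angle, gyro_rate, dt):
            # Predict: integrate the bias-corrected gyro rate.
            self.angle += dt * (gyro_rate - self.bias)
            self.P[0][0] += dt * (dt * self.P[1][1] - self.P[0][1]
                                  - self.P[1][0] + self.q_angle)
            self.P[0][1] -= dt * self.P[1][1]
            self.P[1][0] -= dt * self.P[1][1]
            self.P[1][1] += self.q_bias * dt
            # Correct with the accelerometer-derived absolute angle.
            s = self.P[0][0] + self.r_measure
            k0, k1 = self.P[0][0] / s, self.P[1][0] / s
            y = accel_angle - self.angle
            self.angle += k0 * y
            self.bias += k1 * y
            p00, p01 = self.P[0][0], self.P[0][1]
            self.P[0][0] -= k0 * p00
            self.P[0][1] -= k0 * p01
            self.P[1][0] -= k1 * p00
            self.P[1][1] -= k1 * p01
            return self.angle

    def accel_pitch(ax, ay, az):
        """Pitch (degrees) from raw accelerometer axes, one common convention."""
        return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

    At each sample, the fused angle returned by update(accel_pitch(ax, ay, az), gyro_rate, dt) is the kind of orientation estimate that would drive the joystick emulation.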

    Graphene textiles towards soft wearable interfaces for electroocular remote control of objects

    The study of eye movements (EMs) and measurement of the resulting biopotentials, referred to as electrooculography (EOG), may find increasing use in activity recognition, context awareness, mobile human-computer interaction (HCI), and personalized medicine, provided that the limitations of conventional “wet” electrodes are addressed. To overcome those limitations, this work reports, for the first time, the use and characterization of graphene-based electroconductive textile electrodes for EOG acquisition with a custom-designed embedded eye tracker. This self-contained wearable device consists of a headband with integrated textile electrodes and small, pocket-worn, battery-powered hardware that performs real-time signal processing and streams data to a remote device over Bluetooth. The feasibility of the developed gel-free, flexible, dry textile electrodes was validated through side-by-side comparison with pre-gelled, wet silver/silver chloride (Ag/AgCl) electrodes: simultaneously and asynchronously recorded signals showed correlations of up to ~87% and ~91%, respectively, over durations reaching a hundred seconds, repeated across several participants. Additionally, an automatic EM detection algorithm is developed, and the performance of the graphene-embedded “all-textile” EM sensor as a control element for HCI is experimentally demonstrated. Success rates ranging from 85% to 100% for eleven different EM patterns demonstrate the applicability of the proposed algorithm to wearable EOG-based sensing and HCI with graphene textiles. The system-level integration and holistic design approach presented herein, which runs from the fundamental materials level up to the architecture and algorithm stage, should help advance the state of the art in wearable devices that sense and process electrooculograms.
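
    The reported ~87% and ~91% figures amount to correlations between two simultaneously recorded EOG channels. A minimal sketch of that comparison, with hypothetical synthetic arrays standing in for the textile and Ag/AgCl traces:

    import numpy as np

    def eog_similarity(textile, agcl):
        """Pearson correlation between a textile-electrode EOG trace and a
        simultaneously recorded Ag/AgCl reference at the same sampling rate."""
        textile = textile - textile.mean()   # remove DC offset / electrode drift
        agcl = agcl - agcl.mean()
        return np.dot(textile, agcl) / (np.linalg.norm(textile) * np.linalg.norm(agcl))

    # Hypothetical example: two noisy recordings of the same saccade pattern,
    # ~100 s at 250 Hz, standing in for the paper's real traces.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 100.0, 25_000)
    saccades = np.sign(np.sin(0.3 * t))      # idealized left/right gaze switches
    print(eog_similarity(saccades + 0.3 * rng.standard_normal(t.size),
                         saccades + 0.3 * rng.standard_normal(t.size)))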

    Auxilio: A Sensor-Based Wireless Head-Mounted Mouse for People with Upper Limb Disability

    Upper limb disability may be caused by accidents, neurological disorders, or birth defects, and it restricts the affected individuals' interaction with a computer through a generic optical mouse. Our work proposes the design and development of a working prototype of a sensor-based, wireless, head-mounted Assistive Mouse Controller (AMC), Auxilio, which facilitates computer interaction for people with upper limb disability. Combining commercially available, low-cost motion and infrared sensors, Auxilio relies solely on head and cheek movements for mouse control. Its performance has been compared with that of a generic optical mouse in different pointing tasks as well as in typing tasks using a virtual keyboard. Furthermore, our work analyzes the usability of Auxilio with the System Usability Scale. The results of these experiments reveal the practicality and effectiveness of Auxilio as a head-mounted AMC for empowering people with upper limb disabilities.
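
    The abstract does not give Auxilio's control mapping, but a head-mounted AMC of this kind typically turns head rotation rates into cursor deltas and a cheek-proximity reading into clicks. A hedged sketch of that idea; the gain, dead zone, and threshold values are assumptions for illustration:

    # Sketch of a head-motion mouse loop in the spirit of Auxilio. The sensor
    # readings (gyro yaw/pitch rates, cheek IR reflectance) are hypothetical
    # stand-ins for the low-cost motion and infrared sensors it combines.

    GAIN = 8.0             # cursor pixels per deg/s of head rotation (assumed)
    DEADZONE = 2.0         # deg/s; ignore small involuntary head tremor
    IR_CLICK_LEVEL = 0.6   # normalized IR reading treated as a cheek "press"

    def head_to_cursor_delta(yaw_rate, pitch_rate):
        """Map head rotation rates (deg/s) to a cursor delta in pixels."""
        dx = GAIN * yaw_rate if abs(yaw_rate) > DEADZONE else 0.0
        dy = GAIN * pitch_rate if abs(pitch_rate) > DEADZONE else 0.0
        return dx, dy

    def cheek_click(ir_value, was_pressed):
        """Debounced click: fire once on the rising edge of the cheek-muscle
        bulge crossing the IR threshold, not continuously while held."""
        pressed = ir_value > IR_CLICK_LEVEL
        return (pressed and not was_pressed), pressed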

    EOG-Based Human–Computer Interface: 2000–2020 Review

    Electro-oculography (EOG)-based brain-computer interfaces (BCIs) are a relevant technology influencing physical medicine, daily life, gaming, and even aeronautics. EOG-based BCI systems record activity related to users' intentions, perceptions, and motor decisions, convert the biophysiological signals into commands for external hardware, and execute the operation the user expects through the output device. The EOG signal is used to identify and classify eye movements through active or passive interaction; both types of interaction can control the output device and support the user's communication with the environment. In aeronautics, EOG-BCI systems are being explored as a tool to replace manual commands and as a communication channel for conveying the user's intention quickly. This paper reviews the last two decades of EOG-based BCI studies and provides a structured design space with a large set of representative papers. Our purpose is to introduce the existing BCI systems based on EOG signals and to inspire the design of new ones. First, we highlight the basic components of EOG-based BCI studies: EOG signal acquisition, EOG device particularities, extracted features, translation algorithms, and interaction commands. Second, we provide an overview of EOG-based BCI applications in real and virtual environments, along with aeronautical applications. We conclude with a discussion of the current limits of EOG devices in existing systems, and we offer suggestions to guide future design inquiries.
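
    The pipeline the review identifies (acquisition, features, translation algorithm, interaction command) can be made concrete with a toy translation step. The following sketch maps baseline-corrected horizontal and vertical EOG amplitudes to a directional command; the threshold and channel conventions are assumptions, not values from any surveyed system:

    def translate_eog(h_amp, v_amp, thresh=150.0):
        """Toy EOG translation step: map baseline-corrected horizontal and
        vertical EOG amplitudes (microvolts) to an interaction command.
        Convention assumed here: positive h_amp = right, positive v_amp = up."""
        if abs(h_amp) < thresh and abs(v_amp) < thresh:
            return "idle"
        if abs(h_amp) >= abs(v_amp):
            return "right" if h_amp > 0 else "left"
        return "up" if v_amp > 0 else "down"

    assert translate_eog(220.0, 40.0) == "right"
    assert translate_eog(-30.0, -180.0) == "down"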

    Eye-tracking assistive technologies for individuals with amyotrophic lateral sclerosis

    Amyotrophic lateral sclerosis (ALS) is a progressive nervous system disorder that affects nerve cells in the brain and spinal cord, resulting in loss of muscle control. For individuals with ALS whose mobility is limited to eye movement, eye-tracking-based applications can enable basic tasks through suitable digital interfaces. This paper reviews existing eye-tracking software and hardware and sketches their application as assistive technology for coping with ALS. Eye tracking also provides a suitable alternative for controlling game elements. Furthermore, artificial intelligence has been used to improve eye-tracking technology, with significant gains in calibration and accuracy. Gaps in the literature are highlighted to offer directions for future research.
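
    Dwell-based selection is the standard gaze-only substitute for a mouse click in assistive systems of the kind this review covers. A minimal sketch of a dwell timer, with the dwell time and target radius chosen arbitrarily for illustration:

    import time

    class DwellSelector:
        """Gaze-only 'click': a target activates when the gaze point stays
        within radius_px of it for dwell_s consecutive seconds."""

        def __init__(self, dwell_s=1.0, radius_px=40):
            self.dwell_s, self.radius_px = dwell_s, radius_px
            self._enter_time = None

        def update(self, gaze_xy, target_xy, now=None):
            """Feed one gaze sample; returns True once the dwell completes."""
            now = time.monotonic() if now is None else now
            inside = (abs(gaze_xy[0] - target_xy[0]) <= self.radius_px and
                      abs(gaze_xy[1] - target_xy[1]) <= self.radius_px)
            if not inside:
                self._enter_time = None      # gaze left the target: reset timer
                return False
            if self._enter_time is None:
                self._enter_time = now       # gaze just entered the target
                return False
            if now - self._enter_time >= self.dwell_s:
                self._enter_time = None      # fire once, then re-arm
                return True
            return False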

    Improving Speech Intelligibility by Hearing Aid Eye-Gaze Steering: Conditions With Head Fixated in a Multitalker Environment

    The behavior of a person during a conversation typically involves both auditory and visual attention. Visual attention implies that the person directs his or her eye gaze toward the sound target of interest; hence, detecting the gaze may provide a steering signal for future hearing aids. The steering could drive a beamformer or select a specific audio stream from a set of remote microphones. Previous studies have shown that eye gaze can be measured through electrooculography (EOG). To explore the precision and real-time feasibility of the methodology, seven hearing-impaired persons were tested, seated with their heads fixed in front of three targets positioned at −30°, 0°, and +30° azimuth. Each target presented speech from the Danish DAT material, which was available for direct input to the hearing aid using head-related transfer functions. Speech intelligibility was measured in three conditions: a reference condition without any steering, a condition where eye gaze was estimated from EOG measurements to select the desired audio stream, and an ideal condition with steering based on an eye-tracking camera. EOG steering improved the sentence-correct score compared with the no-steering condition, although performance remained significantly below the ideal eye-tracking-camera condition. In conclusion, eye-gaze steering increases speech intelligibility, although real-time EOG steering still requires improved signal processing before it is feasible to implement in a hearing aid. Funding: EU Horizon 2020 Grant 644732.
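
    The EOG-steering condition reduces to mapping a horizontal EOG amplitude to one of the three talker directions. A toy sketch under an assumed linear EOG-to-angle calibration (horizontal EOG is roughly linear within about ±30° of gaze):

    def eog_to_azimuth(h_eog_uv, uv_per_degree=10.0):
        """Assumed linear calibration from horizontal EOG amplitude
        (microvolts) to gaze azimuth (degrees)."""
        return h_eog_uv / uv_per_degree

    def select_stream(azimuth_deg, targets=(-30.0, 0.0, 30.0)):
        """Pick the remote-microphone stream whose talker direction is
        nearest the estimated gaze azimuth (three-talker setup above)."""
        return min(range(len(targets)), key=lambda i: abs(targets[i] - azimuth_deg))

    # A gaze estimated at +21 deg selects the talker at +30 deg (index 2).
    assert select_stream(eog_to_azimuth(210.0)) == 2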