84 research outputs found

    EYECOM: an innovative approach for computer interaction

    The world is innovating rapidly, and there is a growing need for continuous interaction with technology. Unfortunately, few promising options exist for paralyzed people to interact with machines such as laptops, smartphones, and tablets, and commercial solutions such as Google Glass are too costly for many paralyzed people. To this end, the thesis proposes a retina-controlled device called EYECOM. The proposed device is built from cost-effective yet robust off-the-shelf IoT components (Arduino microcontrollers, XBee wireless radios, IR diodes, and an accelerometer). The device mounts easily onto eyeglasses, and a paralyzed person using it can interact with a machine through simple head movements and eye blinks. An IR diode in front of the eye illuminates the eye region, and a detector converts the reflected IR light into an electrical signal; when the eyelid closes, the reflection from the eye surface is disrupted, and this change in the detected value is recorded as a blink. To enable cursor movement on the computer screen, an accelerometer is used. The accelerometer is a small device, roughly the size of a thumb phalanx, that operates on the principle of axis-based motion sensing and can be worn as a ring. A microcontroller processes the inputs from the IR sensor and the accelerometer and transmits them wirelessly via an XBee radio to a second microcontroller attached to the computer. Using the proposed algorithm, the receiving microcontroller moves the cursor on the screen and triggers actions ranging from opening a document to operating text-to-speech software. EYECOM's features can help paralyzed persons continue contributing to the technological world and remain an active part of society: they can perform many tasks without depending on others, from reading a newspaper on the computer to activating text-to-speech software.
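
    The blink and cursor logic described above can be condensed into a minimal sketch. The Python snippet below is an illustration, not the thesis's actual firmware: the ADC threshold, debounce length, gain, and dead zone are assumed placeholders, and real Arduino code would read the photodiode and IMU directly.

```python
# Minimal sketch of the two EYECOM input paths: a blink detected from the
# reflected-IR reading, and accelerometer tilt mapped to a cursor delta.
# All constants are illustrative assumptions.

BLINK_THRESHOLD = 600   # hypothetical ADC value; reflection drops when the eyelid closes
BLINK_MIN_SAMPLES = 3   # debounce: require several consecutive low samples

def detect_blink(ir_samples, threshold=BLINK_THRESHOLD, min_run=BLINK_MIN_SAMPLES):
    """Return True if the reflected-IR signal stays below threshold long enough."""
    run = 0
    for value in ir_samples:
        run = run + 1 if value < threshold else 0
        if run >= min_run:
            return True
    return False

def tilt_to_cursor_delta(ax, ay, gain=10.0, deadzone=0.05):
    """Map accelerometer tilt (in g) to a cursor movement, ignoring small jitter."""
    dx = 0 if abs(ax) < deadzone else int(ax * gain)
    dy = 0 if abs(ay) < deadzone else int(ay * gain)
    return dx, dy

# A dip in reflected IR registers as a blink (i.e., a "click").
print(detect_blink([650, 640, 580, 575, 570, 660]))  # True
print(tilt_to_cursor_delta(0.3, -0.1))               # (3, -1)
```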

    BRAINWAVE MANEUVERED WHEELCHAIR

    Millions of people worldwide suffer from quadriplegia, paralysis, mobility disorders, or neuromuscular disorders in which the body below the neck cannot be controlled. The system developed in this project uses electroencephalogram (EEG) signals through a Brain Computer Interface (BCI), a promising and important technology, to help such people regain control using their own brain activity. Conventional EEG-based BCIs use gel-type electrodes; they are largely confined to hospitals and laboratories, require about 30 minutes of setup to acquire a usable brain signal, and are very costly. To overcome this, cup-type electrodes are used here, reducing the overall cost and making the system cost-effective. The system has also been made portable, so users can handle and carry it easily. Individuals with disabilities can operate the electric wheelchair using the EEG signatures of their eye movements, processed by algorithms implemented in MATLAB. Finally, the outcomes of the proposed system provide useful outputs for the user. Keywords: Algorithms; Brain Computer Interface (BCI); Electroencephalogram (EEG); Electric wheelchair; Eye movements
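
    The project's signal processing runs in MATLAB; as a language-neutral illustration, the Python sketch below shows one common way such a pipeline can turn frontal-channel EEG into wheelchair commands, by low-pass filtering and thresholding eye-movement deflections. The sampling rate, cutoff, threshold, and polarity convention are all assumptions, not the project's published parameters.

```python
# Hedged sketch: threshold large, slow deflections on a frontal EEG channel
# (eye movements dominate the low-frequency band there) and map them to
# wheelchair commands. Constants are illustrative only.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # Hz, assumed sampling rate

def eye_movement_command(eeg_frontal, fs=FS, thresh_uv=75.0):
    """Low-pass the channel and threshold large swings: a strong positive
    swing -> 'LEFT', strong negative -> 'RIGHT', else 'NONE'.
    Polarity conventions vary by montage."""
    b, a = butter(4, 10 / (fs / 2), btype="low")  # keep the slow EOG-like component
    smoothed = filtfilt(b, a, eeg_frontal)
    peak = smoothed[np.argmax(np.abs(smoothed))]
    if peak > thresh_uv:
        return "LEFT"
    if peak < -thresh_uv:
        return "RIGHT"
    return "NONE"

# Synthetic one-second window containing a large positive deflection.
t = np.linspace(0, 1, FS, endpoint=False)
window = 20 * np.random.randn(FS) + 120 * np.exp(-((t - 0.5) ** 2) / 0.005)
print(eye_movement_command(window))  # expected: LEFT
```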

    Detecting head movement using gyroscope data collected via in-ear wearables

    Head movement is an effective, natural, and simple way to indicate pointing toward an object, and head-movement detection technology has significant potential across diverse applications, as studies in the field confirm. Applications include human-computer interaction, external control of devices, powered wheelchair operation, driver drowsiness detection, video surveillance, and more. Given this diversity, the methods for detecting head movement are also wide-ranging: researchers have introduced acoustic-based, video-based, computer-vision-based, and inertial-sensor-based approaches over the years. Inertial sensor data can be generated by various wearables, for example wrist bands, smart watches, and head-mounted devices. This thesis employs eSense, a representative earable device with a built-in inertial sensor that produces gyroscope data. The eSense is a True Wireless Stereo (TWS) earbud augmented with a 6-axis inertial measurement unit, a microphone, and dual-mode Bluetooth (Bluetooth Classic and Bluetooth Low Energy). Features are extracted from the gyroscope data collected via the eSense device, and four machine learning models are then applied to detect head movement: Random Forest (RF), Support Vector Machine (SVM), Naïve Bayes, and Perceptron. Their performance is evaluated with four metrics: Accuracy, Precision, Recall, and F1 score. The results show that all four models can detect head movement. Random Forest performs best, detecting head movement with approximately 77% accuracy; the other three models are close to one another, with SVM, Naïve Bayes, and Perceptron reaching about 42%, 40%, and 39% accuracy, respectively. The Precision, Recall, and F1 results further verify that these models can distinguish head directions such as left, right, and straight.
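
    A compact way to see the whole pipeline is to extract per-window statistics from the gyroscope stream and compare the four named classifiers. The Python sketch below does this with scikit-learn on synthetic data standing in for labeled eSense windows; the window length and feature set are illustrative assumptions rather than the thesis's exact choices.

```python
# Hedged sketch of the pipeline: statistical features per gyroscope window,
# then the four classifiers named in the abstract, scored with the same
# four metrics. Synthetic data stands in for real eSense recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def window_features(w):
    """Per-axis mean, std, min, max over a gyroscope window of shape (n, 3)."""
    return np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)])

# Synthetic stand-in for labeled windows: 0=left, 1=right, 2=straight.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, 300)
X = np.array([window_features(rng.normal(lab - 1, 1.0, size=(50, 3))) for lab in labels])
Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)

for name, model in [("RF", RandomForestClassifier()), ("SVM", SVC()),
                    ("NB", GaussianNB()), ("Perceptron", Perceptron())]:
    y_pred = model.fit(Xtr, ytr).predict(Xte)
    p, r, f1, _ = precision_recall_fscore_support(yte, y_pred, average="macro",
                                                  zero_division=0)
    print(f"{name}: acc={accuracy_score(yte, y_pred):.2f} P={p:.2f} R={r:.2f} F1={f1:.2f}")
```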

    Smart Home Control for Disabled Using Brain Computer Interface

    An electroencephalography (EEG)-based smart home control system is one of the major applications of the Brain Computer Interface (BCI), allowing disabled people to maximize their capabilities at home. A BCI is a device that enables severely disabled people to communicate and interact with their environment using their brain waves. In this project, a Graphical User Interface (GUI) acts as the control and monitoring front end for home appliances, with the BCI as its input. A NeuroSky MindWave headset is used to detect the EEG signal from the brain, and a prototype is built around a Raspberry Pi 3 Model B+, a 4-channel 5 V relay module, a light bulb, and a fan. The raw brain-wave signal is processed to operate the home appliances, and the results agree well with the command signals used during the experiments. The developed system can be easily implemented in smart homes and has high potential for smart automation.
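
    The actuation step can be sketched briefly. Assuming attention values in the 0-100 range that NeuroSky's ThinkGear protocol reports, the Python snippet below drives a relay channel from a Raspberry Pi with the RPi.GPIO library; the pin assignments and threshold are hypothetical, and reading the headset itself is left out of scope.

```python
# Hedged sketch of the actuation step only: given an attention value from the
# headset, switch a relay channel on the Raspberry Pi. Pins and threshold are
# assumptions; many relay boards are active-low, so polarity may need flipping.
import RPi.GPIO as GPIO

RELAY_PINS = {"light": 17, "fan": 27}  # hypothetical BCM pins wired to the relay module
ATTENTION_THRESHOLD = 60               # illustrative cutoff on the 0-100 scale

def setup():
    GPIO.setmode(GPIO.BCM)
    for pin in RELAY_PINS.values():
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def apply_attention(appliance, attention):
    """Energize the relay while attention exceeds the threshold."""
    pin = RELAY_PINS[appliance]
    GPIO.output(pin, GPIO.HIGH if attention >= ATTENTION_THRESHOLD else GPIO.LOW)

# Example: a sustained attention reading of 72 turns the light on.
# setup(); apply_attention("light", 72)
```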

    Navigation assistant with a laser pointer for driving robotized wheelchairs

    Advisor: Eric Rohmer. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação. Abstract: Assistive robotics solutions help people recover the mobility and autonomy lost in their daily lives. This document presents a low-cost navigation assistant designed to let people paralyzed from the neck down drive a robotized wheelchair using a combination of head posture and facial expressions (smile and eyebrows up) to send commands to the chair. The assistant provides two navigation modes: manual and semi-autonomous. In manual navigation, a regular webcam with the OpenFace algorithm detects the user's head orientation and facial expressions (smile, eyebrows up) to compose commands that act directly on the wheelchair's movements (stop, go forward, turn right, turn left). In the semi-autonomous mode, the user controls a pan-tilt laser with his or her head to point at the desired destination on the ground and validates it with the eyebrows-up command, which makes the robotized wheelchair perform a rotation followed by a linear displacement to the chosen target. Although the assistant needs improvements, the results show that this solution may be a promising technology for people paralyzed from the neck down to control a robotized wheelchair.
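
    The manual mode's command-composition step can be illustrated with a short sketch. The Python snippet below assumes OpenFace-style per-frame outputs (head yaw plus action-unit intensities for smile and brow raising); the thresholds, the sign convention for yaw, and which expression maps to which command are assumptions for illustration, not the dissertation's actual logic.

```python
# Hedged sketch: compose a wheelchair command from head yaw and facial
# expression intensities, in the spirit of the manual navigation mode.
# OpenFace reports AU intensities on roughly a 0-5 scale.

YAW_THRESHOLD = 0.35   # rad; assumed dead zone before a turn is commanded
AU_THRESHOLD = 1.5     # assumed intensity cutoff for smile / eyebrows-up

def compose_command(yaw, au12_smile, au01_brow, au02_brow):
    """Assumed priority: stop (smile) > go forward (eyebrows up) > turn."""
    if au12_smile >= AU_THRESHOLD:
        return "STOP"
    if (au01_brow + au02_brow) / 2 >= AU_THRESHOLD:
        return "GO_FORWARD"
    if yaw <= -YAW_THRESHOLD:
        return "TURN_LEFT"
    if yaw >= YAW_THRESHOLD:
        return "TURN_RIGHT"
    return "HOLD"

print(compose_command(yaw=0.5, au12_smile=0.2, au01_brow=0.3, au02_brow=0.4))
# TURN_RIGHT
```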

    On Assisted Living of Paralyzed Persons through Real-Time Eye Features Tracking and Classification using Support Vector Machines

    Background: Eye features such as eye blinks and eyeball movements can serve as a module in assisted-living systems that allow a class of physically challenged people to "speak" using their eyes. The objective of this work is to design a real-time customized keyboard through which a physically challenged person can communicate with the outside world, for example to have a computer read a story or a document aloud, or to play games that exercise the nerves, through eye-feature tracking. Method: In a paralyzed person's environment, right-left and up-down eyeball movements act as scrolling, and an eye blink acts as a nod. The eye features are tracked using Support Vector Machines (SVMs). Results: A prototype keyboard was custom-designed to work with eye-blink detection and eyeball-movement tracking using SVMs and tested in a typical paralyzed-person environment under varied lighting conditions. Tests performed on male and female subjects of different ages showed a success rate of 92%. Conclusions: Since the system needs about 2 seconds to process one command, real-time performance is limited. Efficiency can be improved through the use of a depth-sensing camera, a faster processing environment, or motion estimation.
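
    A minimal sketch of the classification stage follows: an SVM labels eye-region feature vectors as one of the control events the keyboard listens for. The features below are synthetic placeholders, since the paper's real features come from tracked eye images under varied lighting; scikit-learn's SVC stands in for whatever SVM implementation the authors used.

```python
# Hedged sketch: train an SVM to map eye-region feature vectors to the
# scroll/nod events a customized keyboard could listen for. Synthetic
# clusters stand in for real tracked-eye features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

EVENTS = ["left", "right", "up", "down", "blink"]

rng = np.random.default_rng(1)
# 100 samples per event; cluster means separate the synthetic classes.
X = np.vstack([rng.normal(i, 0.5, size=(100, 4)) for i in range(len(EVENTS))])
y = np.repeat(EVENTS, 100)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
print(clf.predict([[3.1, 2.9, 3.0, 3.2]]))  # lands near the "down" cluster
```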

    Eye-tracking assistive technologies for individuals with amyotrophic lateral sclerosis

    Amyotrophic lateral sclerosis (ALS) is a progressive nervous system disorder that affects nerve cells in the brain and spinal cord, resulting in the loss of muscle control. For individuals with ALS whose mobility is limited to eye movement, eye-tracking-based applications can be used to accomplish basic tasks through certain digital interfaces. This paper reviews existing eye-tracking software and hardware and sketches their application as assistive technology for coping with ALS. Eye tracking also provides a suitable alternative for controlling game elements. Furthermore, artificial intelligence has been used to improve eye-tracking technology, with significant improvements in calibration and accuracy. Gaps in the literature are highlighted to offer directions for future research.

    Biosignal-based human–machine interfaces for assistance and rehabilitation: a survey

    By definition, a Human–Machine Interface (HMI) enables a person to interact with a device. Starting from elementary equipment, the recent development of novel techniques and unobtrusive devices for biosignal monitoring has paved the way for a new class of HMIs that take such biosignals as inputs to control various applications. This survey reviews the extensive literature of the last two decades on biosignal-based HMIs for assistance and rehabilitation, outlining the state of the art and identifying emerging technologies and potential future research trends. PubMed and other databases were searched using specific keywords; the retrieved studies were screened at three levels (title, abstract, full text), and 144 journal papers and 37 conference papers were ultimately included. Four macrocategories were used to classify the biosignals driving HMI control: biopotentials, muscle mechanical motion, body motion, and their combinations (hybrid systems). The HMIs were also classified by target application into six categories: prosthetic control, robotic control, virtual reality control, gesture recognition, communication, and smart environment control. An ever-growing number of publications has been observed over recent years. Most of the studies (about 67%) pertain to the assistive field, while 20% relate to rehabilitation and 13% to both assistance and rehabilitation. Studies focusing on robotic control, prosthetic control, and gesture recognition have increased moderately over the last decade, whereas the other targets saw only small increases. Biopotentials are no longer the leading control signals, and the use of muscle mechanical motion signals has risen considerably, especially in prosthetic control. Hybrid technologies are promising, as they could lead to higher performance; however, they also increase an HMI's complexity, so their usefulness should be carefully evaluated for each specific application.

    Potential of consumer EEG for real-time interactions in immersive VR

    Virtual reality (VR) is an active research subject that has received a great deal of attention in recent years. Multiple commercial VR devices, each improving on the last iteration, have become available to the wider public. In addition, interest in brain-computer interface (BCI) devices has grown rapidly; as these devices become more affordable and easier to use, more accessible options for measuring brain activity are at hand. In this study, our aim is to combine these two technologies to enhance interaction within a virtual environment. We sought to facilitate interaction in VR by using EEG signals, which were used to estimate the user's level of focus. Applying this concept in VR, we designed two use cases for further exploration: telekinesis and teleportation. Telekinesis was a fitting option for this study because it exploits the EEG while maintaining a captivating and engaging user experience; with teleportation, the goal was to explore different options for locomotion in VR. To test our solution, we built a test environment using the Unity engine and invited participants to gather feedback on the usability and accuracy of our methodology. For the evaluation, 13 study participants were divided into two groups: one group tested our actual focus-estimation solution, while the other used randomized values for the same purpose. Some key differences between the test groups were identified. We were able to create a working prototype in which users could interact with the environment using their EEG signals; with some improvements, this could be expanded into a more refined solution with a better user experience. There is a lot of potential in combining human brain signals with virtual environments to both enrich interaction and increase the immersion of virtual reality.
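
    One common way to estimate focus from consumer EEG, in the spirit described above, is a band-power ratio. The Python sketch below computes a beta-over-(alpha+theta) index from a one-channel window; the band edges, sampling rate, and any trigger threshold are assumptions rather than the thesis's exact formula.

```python
# Hedged sketch: a "focus index" as the ratio of beta-band power to
# alpha+theta power, computed from a Welch power spectral density.
import numpy as np
from scipy.signal import welch

FS = 256  # Hz, assumed headset sampling rate

def band_power(freqs, psd, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

def focus_index(eeg_window, fs=FS):
    """Beta / (alpha + theta) power ratio; higher suggests more focus."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs)
    beta = band_power(freqs, psd, 13, 30)
    alpha = band_power(freqs, psd, 8, 13)
    theta = band_power(freqs, psd, 4, 8)
    return beta / (alpha + theta + 1e-12)

# In a VR loop, a telekinesis grab could fire once the index passes a
# calibrated threshold, e.g.: if focus_index(window) > 1.0: trigger_grab()
rng = np.random.default_rng(2)
print(focus_index(rng.standard_normal(FS * 2)))
```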

    A Hybrid-Powered Wireless System for Multiple Biopotential Monitoring

    Chronic diseases are the leading cause of death in the United States and worldwide, and enormous healthcare costs are spent on them every year. These high medical costs are driving a transformation from in-hospital to out-of-hospital healthcare, and out-of-hospital scenarios require comfort and mobility along with quality care. Wearable electronics for well-being management offer good solutions here: long-term health monitoring is a practical and effective way to prevent and diagnose chronic diseases, and wearable devices for long-term biopotential monitoring are a major trend in out-of-hospital health monitoring. The biopotential signals captured in long-term monitoring provide essential information about various physiological conditions and are commonly used in diagnosing chronic diseases. This study aims to develop a hybrid-powered wireless wearable system for long-term monitoring of multiple biopotentials. For biopotential monitoring, non-contact electrodes are deployed in the wireless wearable system to provide a high level of comfort and flexibility for daily use. To supply the hybrid power, triboelectric energy harvesting, an alternative mechanism for harvesting human motion energy, is applied alongside the battery for long-term monitoring. For power management, an SSHI (synchronized switch harvesting on inductor) rectifying strategy matched to the triboelectric energy harvester (TEH) design is proposed, providing a new perspective on designing TEHs by considering their capacitance concurrently. Multiple biopotentials, including ECG, EMG, and EEG, were monitored to validate the performance of the wireless wearable system. With the investigations in this project, the wearable system for biopotential monitoring becomes more practical and can be applied in real-life scenarios, increasing the economic benefits of health-related wearable devices.
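
    The case for hybrid powering is ultimately a power-budget argument. The back-of-the-envelope Python sketch below shows how an average harvested power offsets the system draw and extends battery runtime; every number in it is an illustrative assumption, not a measurement from this study.

```python
# Hedged back-of-the-envelope sketch: how much runtime a triboelectric
# harvester buys back for a battery-powered biopotential monitor.
# All figures are illustrative assumptions.

BATTERY_MWH = 400.0    # assumed small wearable battery capacity, in mWh
SYSTEM_DRAW_MW = 5.0   # assumed average draw: sensing plus duty-cycled radio
HARVEST_MW = 0.8       # assumed average triboelectric harvest during activity

battery_only_h = BATTERY_MWH / SYSTEM_DRAW_MW
hybrid_h = BATTERY_MWH / (SYSTEM_DRAW_MW - HARVEST_MW)

print(f"battery only: {battery_only_h:.1f} h")  # 80.0 h
print(f"hybrid:       {hybrid_h:.1f} h")        # ~95.2 h
```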