568 research outputs found

    Robotic Autism Rehabilitation by Wearable Brain-Computer Interface and Augmented Reality

    An instrument based on the integration of a Brain-Computer Interface (BCI) and Augmented Reality (AR) is proposed for robotic autism rehabilitation. Flickering stimuli at fixed frequencies appear on the display of the AR glasses. When the user focuses on one of the stimuli, a Steady-State Visual Evoked Potential (SSVEP) is elicited over the occipital region. A single-channel electroencephalographic BCI detects the elicited SSVEP and sends the corresponding command to a mobile robot. The device's high wearability (single channel and dry electrodes) and training-free usability are fundamental for acceptance by children with Autism Spectrum Disorder (ASD). Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on 10 healthy adult subjects highlighted an average accuracy higher than 83%. Further preliminary tests at the Department of Translational Medical Sciences of the University of Naples Federico II on 3 ASD patients between 8 and 10 years old provided positive feedback on device acceptance and attentional performance.
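
    The abstract does not detail the detection algorithm, so the following is only a minimal sketch of how a single-channel SSVEP could be decoded by comparing spectral power at the flicker frequencies; the sampling rate, stimulus frequencies, and harmonic weighting are illustrative assumptions.

```python
# Minimal sketch of single-channel SSVEP detection via power spectral density.
# Sampling rate, flicker frequencies, and the use of the 2nd harmonic are
# illustrative assumptions, not values taken from the paper.
import numpy as np
from scipy.signal import welch

FS = 256.0                      # assumed EEG sampling rate (Hz)
STIM_FREQS = [10.0, 12.0, 15.0] # assumed flicker frequencies shown on the AR glasses

def detect_ssvep(eeg_window: np.ndarray) -> float:
    """Return the stimulus frequency whose power (fundamental + 2nd harmonic)
    dominates the occipital single-channel EEG window."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=len(eeg_window))
    scores = []
    for f in STIM_FREQS:
        score = 0.0
        for harmonic in (f, 2 * f):
            idx = np.argmin(np.abs(freqs - harmonic))
            score += psd[idx]
        scores.append(score)
    return STIM_FREQS[int(np.argmax(scores))]

# Example: a 2-second window of synthetic EEG dominated by a 12 Hz response.
t = np.arange(0, 2.0, 1.0 / FS)
window = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(len(t))
command_freq = detect_ssvep(window)  # each frequency maps to one robot command
```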

    Wearable Brain-Computer Interface Instrumentation for Robot-Based Rehabilitation by Augmented Reality

    An instrument for remote control of a robot by a wearable brain-computer interface (BCI) is proposed for rehabilitating children with attention-deficit/hyperactivity disorder (ADHD). Augmented reality (AR) glasses generate flickering stimuli, and a single-channel electroencephalographic BCI detects the elicited steady-state visual evoked potentials (SSVEPs). This allows the system to benefit from the robustness of SSVEPs while leaving the user's view of the robot's movements unobstructed. Together with the absence of training, the single channel maximizes the device's wearability, which is fundamental for acceptance by children with ADHD. Effectively controlling the movements of a robot through a new channel enhances rehabilitation engagement and effectiveness. A case study at an accredited rehabilitation center on ten healthy adult subjects highlighted an average accuracy higher than 83%, with an information transfer rate (ITR) of up to 39 bit/min. Further preliminary tests on four ADHD patients between six and eight years old provided highly positive feedback on device acceptance and attentional performance.
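
    The reported ITR figure can be read against the standard Wolpaw formula commonly used for SSVEP-BCIs. The sketch below computes it; the number of targets and trial duration are assumptions for illustration, not values taken from the paper.

```python
# Sketch of the Wolpaw information transfer rate (ITR) often reported for
# SSVEP-BCIs. The target count and trial duration below are assumed values.
import math

def itr_bits_per_min(n_targets: int, accuracy: float, trial_seconds: float) -> float:
    """Wolpaw ITR: bits per trial, scaled to bits per minute."""
    if accuracy <= 1.0 / n_targets:
        return 0.0  # at or below chance level the formula is not meaningful
    bits = math.log2(n_targets)
    bits += accuracy * math.log2(accuracy)
    if accuracy < 1.0:
        bits += (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1))
    return bits * (60.0 / trial_seconds)

# Example with assumed values: 4 flickering targets, 83% accuracy, 2 s trials.
print(itr_bits_per_min(n_targets=4, accuracy=0.83, trial_seconds=2.0))
```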

    IoT Based Virtual Reality Game for Physio-therapeutic Patients

    Biofeedback therapy trains patients to voluntarily control involuntary processes of their body. This non-invasive, drug-free treatment is also used to rehabilitate the physical impairments that may follow a stroke or a traumatic brain injury, and in neurological applications within occupational therapy. The idea behind this study is to use immersive gaming as a tool for physical rehabilitation, combining biofeedback and physical computing to get the patient emotionally involved in a game that requires them to perform the prescribed exercises in order to interact with it. The game is aimed at addressing the basic treatment for ‘Frozen Shoulder’. In this work, the physical motions are captured by wearable ultrasonic sensors attached temporarily to the patient's limbs. The data received from the sensors are then sent to the game via wireless serial communication. There are two main aspects to this study: motion capturing and game design. The current implementation of the application uses a single ultrasonic detector. The experimental results show that physiotherapy patients benefit from the IoT-based virtual reality game.
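
    The abstract does not specify the sensor protocol; the sketch below only illustrates the general idea of reading distance samples from a wearable ultrasonic sensor over a wireless serial link and mapping them to a normalized game input. The port name, baud rate, and distance-to-elevation mapping are hypothetical.

```python
# Hypothetical motion-capture side of the game: read distance samples from a
# wearable ultrasonic sensor over a (wireless) serial link and map them to a
# normalized input value the game engine can consume each frame.
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed serial port of the wireless receiver
BAUD = 9600             # assumed baud rate

def read_arm_elevation(link: serial.Serial) -> float:
    """Convert one distance reading (cm) into a 0..1 arm-elevation value."""
    line = link.readline().decode(errors="ignore").strip()
    try:
        distance_cm = float(line)
    except ValueError:
        return 0.0
    # Assumed mapping: 10 cm = arm fully raised, 60 cm = arm at rest.
    return max(0.0, min(1.0, (60.0 - distance_cm) / 50.0))

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            elevation = read_arm_elevation(link)
            # The game would use this value, e.g. to drive the avatar's shoulder.
            print(f"arm elevation: {elevation:.2f}")
```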

    Effect of sensory-based technologies on atypical sensory responses of children with Autism Spectrum Disorder: A systematic review

    © 2021 ACM, Inc. This is the accepted manuscript version of an article published in final form at https://doi.org/10.1145/3485768.3485782. Atypical sensory responses are among the most common issues observed in Autism Spectrum Disorder (ASD), affecting the development of a child's capability for social interaction, independent living and learning. In the past two decades, there has been a growing number of studies of technology-based interventions for atypical sensory responses of individuals with ASD. However, their effects and limitations have not been fully examined. This systematic review investigates the effects of sensory-based technologies (SBTs) on the atypical sensory responses of children with ASD. Publications reporting on the use of an SBT as an intervention tool were retrieved from four academic databases: PubMed, IEEE Xplore, ACM Digital Library and Web of Science. The search ultimately yielded 18 articles. The results indicated an emerging trend of studies investigating the effects of SBTs on atypical sensory responses over the past decade. Challenges and limitations were found across the studies, mainly because they adopted different methods and indicators, had small sample sizes, and used varying experimental designs. The findings indicate that the use of SBTs can effectively improve auditory and visual recognition, as well as other behavioural outcomes such as attention, in children with ASD. Future development of SBTs could integrate more advanced techniques, such as machine learning, to widen the scope of SBT usage and help more children with ASD.

    Creative Haptic Interface Design for the Aging Population

    Audiovisual human-computer interfaces still deliver the majority of content to the public; however, haptic interfaces offer a unique advantage over the dominant information infrastructure, particularly for users with a disability or with diminishing cognitive and physical skills, such as the elderly. The tactile sense allows users to integrate new, unobtrusive channels for digital information into their sensorium, channels that are less likely to be overwhelmed than vision and audition. Haptics research focuses on the development of hardware, improving the resolution, modality, and fidelity of the actuators. Despite the technological limitations, haptic interfaces have been shown to reinforce physical skill acquisition, therapy, and communication. This chapter presents key characteristics that intuitive tactile interfaces should capture for elderly end-users; sample projects showcase unique applications and designs that identify the limitations of the UI.

    Robotic Platforms for Assistance to People with Disabilities

    People with congenital and/or acquired disabilities constitute a large number of dependent people today. Robotic platforms to help people with disabilities are being developed with the aim of providing both rehabilitation treatment and assistance to improve their quality of life. A high demand for robotic platforms that provide assistance during rehabilitation is expected given the global health situation caused by the COVID-19 pandemic, which has left countries facing major challenges to ensure the health and autonomy of their disabled populations. Robotic platforms are necessary to ensure assistance and rehabilitation for disabled people in the current global situation, and their capacity must be continuously improved to benefit the healthcare sector in terms of chronic disease prevention, assistance, and autonomy. For this reason, research on human–robot interaction in these assistive environments must grow and advance, because this topic demands sensitive and intelligent robotic platforms equipped with complex sensory systems, rich handling functionalities, safe control strategies, and intelligent computer vision algorithms. This Special Issue has published eight papers covering recent advances in the field of robotic platforms to assist disabled people in daily or clinical environments. The papers address innovative solutions in this field, including affordable assistive robotic devices, new techniques in computer vision for intelligent and safe human–robot interaction, and advances in mobile manipulators for assistive tasks.

    A portable EEG-BCI framework enhanced by machine learning techniques

    Brain-Computer Interfaces (BCIs) allow direct communication between the human brain and external devices through the processing and interpretation of brain signals. Indeed, the BCI represents the ultimate achievement in human-machine interaction, eliminating all the intermediate physical steps between the intention of an action and its execution. Electroencephalography (EEG) plays a key role in BCIs, being the least invasive technique for capturing the brain's electrical activity. However, high-performance devices turn out to be uncomfortable and impractical to use outside dedicated facilities, mainly because of the large number of electrodes. Conversely, single-channel EEG devices, with fewer electrodes, provide weak and noisy signals that are difficult to interpret. In this PhD thesis, a portable BCI prototype enhanced by machine learning techniques for the classification of brain signals, in particular Steady-State Visual Evoked Potentials (SSVEPs), is proposed. The study embraces the design, realization, characterization, and optimization of a BCI built on top of a cost-effective single-channel EEG device. The results have been validated in both offline and online sessions thanks to the collaboration of volunteers equipped with the prototype. Owing to its usability, the proposed framework may broaden the range of state-of-the-art BCI applications.
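
    The thesis abstract does not describe the classification pipeline, so the sketch below shows one plausible arrangement: spectral features around the stimulation frequencies fed to an SVM. The feature definition, window length, stimulation frequencies, and classifier choice are assumptions for illustration.

```python
# Minimal sketch of a machine-learning pipeline for single-channel SSVEP
# classification. All parameters below are assumed, not taken from the thesis.
import numpy as np
from scipy.signal import welch
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256.0                       # assumed sampling rate (Hz)
STIM_FREQS = [10.0, 12.0, 15.0]  # assumed stimulation frequencies

def spectral_features(window: np.ndarray) -> np.ndarray:
    """Band power around each stimulation frequency and its 2nd harmonic."""
    freqs, psd = welch(window, fs=FS, nperseg=len(window))
    feats = []
    for f in STIM_FREQS:
        for harmonic in (f, 2 * f):
            band = (freqs >= harmonic - 0.5) & (freqs <= harmonic + 0.5)
            feats.append(psd[band].mean())
    return np.array(feats)

# X_raw: (n_trials, n_samples) single-channel EEG windows; y: target indices.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((30, int(2 * FS)))   # placeholder data
y = rng.integers(0, len(STIM_FREQS), size=30)    # placeholder labels

X = np.vstack([spectral_features(w) for w in X_raw])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
predicted = clf.predict(X[:5])   # predicted target index per trial
```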

    Robotic Solutions in Pediatric Rehabilitation

