399 research outputs found

    Physiological Responses During Hybrid BNCI Control of an Upper-Limb Exoskeleton

    When combined with assistive robotic devices, such as wearable robotics, brain/neural-computer interfaces (BNCI) have the potential to restore the ability of handicapped people to carry out activities of daily living. To improve the applicability of such systems, workload and stress should be reduced to a minimal level. Here, we investigated users' physiological reactions during exhaustive use of the different modalities of a hybrid control interface. Eleven BNCI-naive healthy volunteers participated in the experiments. All participants sat in a comfortable chair in front of a desk and wore a whole-arm exoskeleton as well as wearable devices for monitoring physiological, electroencephalographic (EEG) and electrooculographic (EOG) signals. The experimental protocol consisted of three phases: (i) set-up, calibration and BNCI training; (ii) familiarization; and (iii) an experimental phase during which each subject performed EEG and EOG tasks. After completing each task, the user filled in the NASA-TLX questionnaire and the self-assessment manikin (SAM). We found significant differences (p < 0.05) in heart rate variability (HRV) and skin conductance level (SCL) between the two biosignal modalities (EEG, EOG) of the BNCI. This indicates that EEG control is associated with a higher level of stress (associated with a decrease in HRV) and mental workload (associated with a higher SCL) when compared to EOG control. In addition, HRV and SCL modulations correlated with the subjects' workload perception and emotional responses assessed through the NASA-TLX questionnaire and SAM.
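
    The comparison reported here is a within-subject contrast of HRV and SCL between the two control modalities. The sketch below illustrates how such a paired comparison could be set up in Python; the RMSSD formula is a standard HRV index, but the synthetic data, sample counts, and choice of a Wilcoxon signed-rank test are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch only, not the authors' analysis pipeline: compare an HRV
# index (RMSSD) and mean skin conductance level (SCL) between the EEG- and
# EOG-control conditions with a paired non-parametric test. All data below are
# synthetic placeholders.
import numpy as np
from scipy import stats

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (a common HRV index)."""
    d = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(d ** 2)))

rng = np.random.default_rng(0)
n_subjects = 11  # matches the number of volunteers in the study

# Hypothetical per-subject recordings: RR intervals (ms) and SCL samples (microsiemens).
rr_eeg = [rng.normal(800, 40, 300) for _ in range(n_subjects)]
rr_eog = [rng.normal(800, 55, 300) for _ in range(n_subjects)]
scl_eeg = [rng.normal(6.0, 0.5, 1000) for _ in range(n_subjects)]
scl_eog = [rng.normal(5.5, 0.5, 1000) for _ in range(n_subjects)]

hrv_eeg = np.array([rmssd(r) for r in rr_eeg])
hrv_eog = np.array([rmssd(r) for r in rr_eog])
scl_eeg_mean = np.array([s.mean() for s in scl_eeg])
scl_eog_mean = np.array([s.mean() for s in scl_eog])

# Paired comparison across participants, significance threshold p < 0.05.
print("HRV (RMSSD):", stats.wilcoxon(hrv_eeg, hrv_eog))
print("Mean SCL:   ", stats.wilcoxon(scl_eeg_mean, scl_eog_mean))
```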

    A Python-based Brain-Computer Interface Package for Neural Data Analysis

    Anowar, Md Hasan, A Python-based Brain-Computer Interface Package for Neural Data Analysis. Master of Science (MS), December 2020, 70 pp., 4 tables, 23 figures, 74 references. Although a growing amount of research has been dedicated to neural engineering, only a handful of software packages are available for brain signal processing. Popular brain-computer interface packages depend on commercial software products such as MATLAB. Moreover, almost every brain-computer interface software package is designed for a specific neuro-biological signal; there is no single Python-based package that supports motor imagery, sleep, and stimulated brain signal analysis. The need for a brain-computer interface package that can serve as a free alternative to commercial software motivated me to develop a toolbox on the Python platform. In this thesis, the structure of MEDUSA, a brain-computer interface toolbox, is presented. The features of the toolbox are demonstrated with publicly available data sources. The MEDUSA toolbox provides a valuable tool for biomedical engineers and computational neuroscience researchers.
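
    As a rough illustration of the kind of processing such a toolbox wraps, the sketch below band-pass filters and epochs a synthetic multichannel EEG recording with NumPy/SciPy. It is not the MEDUSA API; the sampling rate, band limits, and function names are assumptions chosen for the example.

```python
# Generic EEG preprocessing sketch (band-pass filtering and epoching); this is
# NOT the MEDUSA API, whose actual interfaces are described in the thesis.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs, lo=8.0, hi=30.0, order=4):
    """Zero-phase band-pass filter, e.g. the mu/beta band used in motor imagery."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def epoch(eeg, fs, onsets_s, tmin=0.0, tmax=2.0):
    """Cut fixed-length epochs (channels x samples) around stimulus onsets."""
    n0, n1 = int(tmin * fs), int(tmax * fs)
    return np.stack([eeg[:, int(o * fs) + n0:int(o * fs) + n1] for o in onsets_s])

fs = 250.0                                   # hypothetical sampling rate (Hz)
eeg = np.random.randn(8, int(60 * fs))       # 8 channels, 60 s of synthetic data
filtered = bandpass(eeg, fs)
epochs = epoch(filtered, fs, onsets_s=[5, 15, 25, 35, 45])
print(epochs.shape)                          # (5, 8, 500)
```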

    Efficient Implementation and Design of A New Single-Channel Electrooculography-based Human-Machine Interface System

    A user-friendly wearable single-channel EOG-based human-computer interface for cursor control

    This paper presents a novel wearable single-channel electrooculography (EOG)-based human-computer interface (HCI) with a simple system design and robust performance. In the proposed system, EOG signals for control are generated from double eye blinks, collected by a commercial wearable device (the NeuroSky MindWave headset), and then converted into a sequence of commands that control cursor navigation and actions. The EOG-based cursor control system was tested on 8 subjects in indoor and outdoor environments; the average accuracy was 84.42% for indoor use and 71.50% for outdoor use. Compared with other existing EOG-based HCI systems, this system is highly user-friendly and does not require any training. Therefore, it has the potential to provide an easy-to-use and inexpensive assistive technique for locked-in patients who have lost most voluntary muscle control but retain proper eye function. © 2015 IEEE.
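
    The control scheme described above maps double eye blinks to cursor commands. The sketch below shows one generic way such a detector could work on a single EOG channel using simple threshold crossings; it is not the paper's implementation, and the sampling rate, amplitude threshold, timing windows, and command mapping are all assumptions.

```python
# Generic double-blink detector on a single EOG channel (not the paper's method).
# Threshold crossings are debounced, then consecutive blinks close in time are
# paired into "double blink" events that could drive cursor commands.
import numpy as np

FS = 512.0                   # hypothetical sampling rate (Hz)
BLINK_THRESH = 150.0         # assumed blink amplitude threshold (microvolts)
REFRACTORY = 0.2             # ignore re-crossings within 200 ms of a blink
DOUBLE_BLINK_MAX_GAP = 0.5   # max gap (s) between the two blinks of a pair

def blink_onsets(eog, fs=FS, thresh=BLINK_THRESH, refractory=REFRACTORY):
    """Times (s) where the signal rises above the threshold, debounced."""
    above = eog > thresh
    rising = (np.flatnonzero(~above[:-1] & above[1:]) + 1) / fs
    kept = []
    for t_on in rising:
        if not kept or t_on - kept[-1] >= refractory:
            kept.append(t_on)
    return kept

def double_blinks(onsets, max_gap=DOUBLE_BLINK_MAX_GAP):
    """Pair consecutive blinks occurring within max_gap seconds of each other."""
    events, i = [], 0
    while i + 1 < len(onsets):
        if onsets[i + 1] - onsets[i] <= max_gap:
            events.append(onsets[i])   # one double-blink event -> one command
            i += 2                     # consume both blinks of the pair
        else:
            i += 1
    return events

# Synthetic demo: two blinks 0.3 s apart (a double blink) plus one isolated blink.
rng = np.random.default_rng(1)
t = np.arange(0, 5, 1 / FS)
eog = 20 * rng.standard_normal(t.size)
for blink_t in (1.0, 1.3, 3.0):
    eog += 300 * np.exp(-((t - blink_t) ** 2) / 0.002)
print(double_blinks(blink_onsets(eog)))   # expect one event near t = 1 s
```

    Each detected double-blink event could then advance a command cycle such as {move left, move right, click}, which is the general idea behind converting blink sequences into cursor actions.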

    Electro-oculography in the Field of Assistive Interaction Communication

    There is sufficient evidence that electrical biosignals can feasibly be used in the field of alternative communication. They are particularly suitable for people with severe motor disorders or other physical impairments. Developing solutions for these users involves choosing sensors according to the user's needs and limitations and converting the user's intentions into commands, and the resulting system should be evaluated with an appropriate method. This paper presents alternative communication techniques based on electrooculography. DOI: 10.17762/ijritcc2321-8169.15035

    The multimodal edge of human aerobotic interaction

    This paper presents the idea of multimodal human-aerobotic interaction. An overview of the aerobotic system and its applications is given. The joystick-based controller interface and its limitations are discussed. Two techniques are suggested as emerging alternatives to the joystick-based controller interface used in human-aerobotic interaction. The first is a multimodal combination of speech, gaze, gesture, and other non-verbal cues already used in regular human-human interaction. The second is telepathic interaction via brain-computer interfaces. The potential limitations of these alternatives are highlighted, and considerations for further work are presented.

    EOG-Based Human–Computer Interface: 2000–2020 Review

    Electro-oculography (EOG)-based brain-computer interfaces (BCIs) are a relevant technology influencing physical medicine, daily life, gaming and even the aeronautics field. EOG-based BCI systems record activity related to users' intention, perception and motor decisions, convert these biophysiological signals into commands for external hardware, and execute the operation expected by the user through the output device. The EOG signal is used for identifying and classifying eye movements through active or passive interaction. Both types of interaction have the potential to control the output device and thereby support the user's communication with the environment. In the aeronautical field, EOG-BCI systems are being explored as a relevant tool to replace manual commands and as a communicative tool for conveying the user's intention more quickly. This paper reviews the last two decades of EOG-based BCI studies and provides a structured design space with a large set of representative papers. Our purpose is to introduce the existing BCI systems based on EOG signals and to inspire the design of new ones. First, we highlight the basic components of EOG-based BCI studies, including EOG signal acquisition, EOG device particularities, extracted features, translation algorithms, and interaction commands. Second, we provide an overview of EOG-based BCI applications in real and virtual environments, along with aeronautical applications. We conclude with a discussion of the current limits of EOG devices in existing systems. Finally, we provide suggestions to guide future design inquiries.
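
    To make the "translation algorithm" stage of such systems concrete, the toy sketch below maps a horizontal EOG deflection to a left or right command by its sign and amplitude. The threshold value and channel polarity convention are assumptions, not taken from any particular system covered by the review.

```python
# Toy translation step: classify one window of horizontal EOG as a left/right
# saccade command by its peak sign and amplitude (threshold is assumed).
import numpy as np

def classify_saccade(h_eog_window, thresh_uv=100.0):
    """Return 'left', 'right' or None for one window of horizontal EOG (microvolts)."""
    peak = h_eog_window[np.argmax(np.abs(h_eog_window))]
    if abs(peak) < thresh_uv:
        return None                 # no saccade detected in this window
    return "right" if peak > 0 else "left"

print(classify_saccade(np.array([5.0, 180.0, 40.0])))     # 'right'
print(classify_saccade(np.array([-20.0, -150.0, -8.0])))  # 'left'
print(classify_saccade(np.array([10.0, -30.0, 25.0])))    # None
```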

    Hybrid brain/neural interface and autonomous vision-guided whole-arm exoskeleton control to perform activities of daily living (ADLs)

    [EN] Background: The aging of the population and the progressive increase of life expectancy in developed countries are leading to a high incidence of age-related cerebrovascular diseases, which affect people's motor and cognitive capabilities and may result in the loss of arm and hand functions. Such conditions have a detrimental impact on people's quality of life. Assistive robots have been developed to help people with motor or cognitive disabilities perform activities of daily living (ADLs) independently. Most of the robotic systems proposed in the state of the art for assisting with ADLs are external manipulators and exoskeletal devices. The main objective of this study is to compare the performance of a hybrid EEG/EOG interface to perform ADLs when the user is controlling an exoskeleton rather than an external manipulator. Methods: Ten impaired participants (5 males and 5 females, mean age 52 +/- 16 years) were instructed to use both systems to perform a drinking task and a pouring task comprising multiple sub-tasks. For each device, two modes of operation were studied: synchronous mode (the user received a visual cue indicating the sub-task to be performed at each moment) and asynchronous mode (the user started and finished each of the sub-tasks independently). Fluent control was assumed when the time for successful initializations remained below 3 s, and reliable control when it remained below 5 s. The NASA-TLX questionnaire was used to evaluate the task workload. For the trials involving the exoskeleton, a custom Likert-scale questionnaire was used to evaluate the user's experience in terms of perceived comfort, safety, and reliability. Results: All participants were able to control both systems fluently and reliably. However, the results suggest better performance of the exoskeleton over the external manipulator (75% of successful initializations remained below 3 s with the exoskeleton and below 5 s with the external manipulator). Conclusions: Although the results of our study in terms of fluency and reliability of EEG control suggest better performance of the exoskeleton over the external manipulator, such results cannot be considered conclusive, due to the heterogeneity of the population under test and the relatively limited number of participants. This study was funded by the European Commission under the project AIDE (G.A. no: 645322), by the Spanish Ministry of Science and Innovation through the projects PID2019-108310RB-I00 and PLEC2022-009424, and by the Ministry of Universities and European Union ("financed by European Union-Next Generation EU") through a Margarita Salas grant for the training of young doctors. Catalán, JM.; Trigili, E.; Nann, M.; Blanco-Ivorra, A.; Lauretti, C.; Cordella, F.; Ivorra, E.... (2023). Hybrid brain/neural interface and autonomous vision-guided whole-arm exoskeleton control to perform activities of daily living (ADLs). Journal of NeuroEngineering and Rehabilitation. 20(1):1-16. https://doi.org/10.1186/s12984-023-01185-w
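
    The fluency and reliability criteria described above (successful initializations under 3 s and 5 s, respectively) lend themselves to a simple summary computation; the sketch below illustrates it on made-up initialization times, which are not data from the study.

```python
# Sketch of the fluency/reliability summary described above; the initialization
# times below are invented for illustration, not data from the study.
import numpy as np

def fluency_stats(init_times_s):
    """Fraction of successful initializations under the 3 s and 5 s thresholds."""
    t = np.asarray(init_times_s, dtype=float)
    return {"fluent (<3 s)": float(np.mean(t < 3.0)),
            "reliable (<5 s)": float(np.mean(t < 5.0))}

exo_times = [1.8, 2.4, 2.9, 3.6, 2.2, 4.1, 2.7, 1.9]     # hypothetical exoskeleton trials
manip_times = [3.2, 4.4, 2.9, 4.8, 3.7, 5.6, 4.1, 3.3]   # hypothetical manipulator trials
print("exoskeleton:", fluency_stats(exo_times))
print("manipulator:", fluency_stats(manip_times))
```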