Brain-Switches for Asynchronous Brain–Computer Interfaces: A Systematic Review
A brain–computer interface (BCI) has been studied extensively as a novel communication system that allows disabled people to interact using their brain activity. An asynchronous BCI system is more realistic and practical than a synchronous one, in that BCI commands can be generated whenever the user wants. However, the relatively low performance of asynchronous BCI systems is problematic because redundant BCI commands are required to correct false-positive operations. To significantly reduce the number of false-positive operations, a two-step approach has been proposed that uses a brain-switch to first determine whether the user intends to use the asynchronous BCI system before the system itself is operated. This study presents a systematic review of state-of-the-art brain-switch techniques and future research directions. To this end, we reviewed brain-switch research articles published from 2000 to 2019 in terms of their (a) neuroimaging modality, (b) paradigm, (c) operation algorithm, and (d) performance.
Recent and upcoming BCI progress: overview, analysis, and recommendations
Brain–computer interfaces (BCIs) are finally moving out of the laboratory and beginning to gain acceptance in real-world situations. As BCIs gain attention from broader groups of users, including persons with different disabilities and healthy users, numerous practical questions gain importance. What are the most practical ways to detect and analyze brain activity in field settings? Which devices and applications are most useful for different people? How can we make BCIs more natural and sensitive, and how can BCI technologies improve usability? What are some general trends and issues, such as combining different BCIs or assessing and comparing performance? This book chapter provides an overview of the different sections of this book, summarizing how the authors address these and other questions. We also present some predictions and recommendations that follow from our experience discussing these and other issues with our authors and with other researchers and developers in the BCI community. We conclude that, although some directions are hard to predict, the field is definitely growing and changing rapidly, and will continue doing so over the next several years.
Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges
In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles from human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology, including better EEG devices.
AJILE Movement Prediction: Multimodal Deep Learning for Natural Human Neural Recordings and Video
Developing useful interfaces between brains and machines is a grand challenge of neuroengineering. An effective interface has the capacity to not only interpret neural signals, but predict the intentions of the human to perform an action in the near future; prediction is made even more challenging outside well-controlled laboratory experiments. This paper describes our approach to detect and predict natural human arm movements in the future, a key challenge in brain–computer interfacing that has never before been attempted. We introduce the novel Annotated Joints in Long-term ECoG (AJILE) dataset; AJILE includes automatically annotated poses of 7 upper body joints for four human subjects over 670 total hours (more than 72 million frames), along with the corresponding simultaneously acquired intracranial neural recordings. The size and scope of AJILE greatly exceeds all previous datasets with movements and electrocorticography (ECoG), making it possible to take a deep learning approach to movement prediction. We propose a multimodal model that combines deep convolutional neural networks (CNN) with long short-term memory (LSTM) blocks, leveraging both ECoG and video modalities. We demonstrate that our models are able to detect movements and predict future movements up to 800 msec before movement initiation. Further, our multimodal movement prediction models exhibit resilience to simulated ablation of input neural signals. We believe a multimodal approach to natural neural decoding that takes context into account is critical in advancing bioelectronic technologies and human neuroscience.
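The AJILE abstract above describes a late-fusion design: a CNN branch over neural (ECoG) signals and an LSTM branch over video-derived pose trajectories, combined for movement prediction. The following is a minimal illustrative sketch of that fusion pattern only, not the authors' actual model; all shapes, sizes, weights, and the single-conv/single-LSTM simplification are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid-mode 1-D correlation of signal x with kernel w (toy CNN layer)."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations stacked as [input, forget, output, candidate]."""
    z = W @ x + U @ h + b
    H = len(h)
    i, f, o = (1.0 / (1.0 + np.exp(-z[g * H:(g + 1) * H])) for g in range(3))
    g = np.tanh(z[3 * H:])            # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy ECoG branch: one conv filter + global average pooling -> 1 scalar feature.
ecog = rng.standard_normal(256)       # one synthetic channel, 256 samples
kernel = rng.standard_normal(9)
ecog_feat = np.array([conv1d(ecog, kernel).mean()])

# Toy video branch: pose trajectory of 7 joints x 2 coords over 30 frames.
H = 8                                 # hidden size (arbitrary)
W = rng.standard_normal((4 * H, 14))
U = rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for frame in rng.standard_normal((30, 14)):
    h, c = lstm_step(frame, h, c, W, U, b)

# Late fusion: concatenate both embeddings, linear readout, sigmoid probability.
fused = np.concatenate([ecog_feat, h])
w_out = rng.standard_normal(len(fused))
p_move = 1.0 / (1.0 + np.exp(-(w_out @ fused)))
print(f"predicted movement probability: {p_move:.3f}")
```

The weights here are random, so the printed probability is meaningless; the point is the data flow: per-modality encoders producing fixed-size embeddings that are concatenated before a shared readout, which is what lets such a model degrade gracefully when one modality (e.g. some ECoG channels) is ablated.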
Moregrasp: Restoration of Upper Limb Function in Individuals with High Spinal Cord Injury by Multimodal Neuroprostheses for Interaction in Daily Activities
The aim of the MoreGrasp project is to develop a noninvasive, multimodal user interface, including a brain-computer interface (BCI), for intuitive control of a grasp neuroprosthesis to support individuals with high spinal cord injury (SCI) in everyday activities. We describe the current state of the project, including the EEG system, preliminary results of natural movement decoding in people with SCI, the new electrode concept for the grasp neuroprosthesis, the shared control architecture behind the system, and the implementation of a user-centered design.