Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm
[EN] Robotics has been successfully applied in the design of collaborative robots that assist people with motor disabilities. However, man-machine interaction is difficult for those who suffer severe motor disabilities. The aim of this study was to test the feasibility of a low-cost robotic arm control system with an EEG-based brain-computer interface (BCI). The BCI system relies on the Steady-State Visually Evoked Potentials (SSVEP) paradigm. A cross-platform application was developed in C++. This C++ platform, together with the open-source software OpenViBE, was used to control a Stäubli TX60 robot arm. Communication between OpenViBE and the robot was carried out through the Virtual Reality Peripheral Network (VRPN) protocol. EEG signals were acquired with the 8-channel Enobio amplifier from Neuroelectrics. For the processing of the EEG signals, Common Spatial Pattern (CSP) filters and a Linear Discriminant Analysis (LDA) classifier were used. Five healthy subjects tried the BCI. This work enabled the communication and integration of a well-known BCI development platform, OpenViBE, with the specific control software of a Stäubli TX60 robot arm using the VRPN protocol. It can be concluded from this study that it is possible to control the robotic arm with an SSVEP-based BCI using a reduced number of dry electrodes, which facilitates the use of the system. Funding for open access charge: Universitat Politècnica de València. Quiles Cucarella, E.; Dadone, J.; Chio, N.; García Moreno, E. (2022). Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm. Sensors. 22(13):1-26. https://doi.org/10.3390/s22135000
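The CSP-plus-LDA pipeline named in the abstract can be sketched as follows. This is a minimal illustration on synthetic two-class trials, not the authors' implementation: the channel count, the synthetic trial generator, the single pair of CSP filters, and the hand-rolled two-class LDA fit are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_trials(n_trials, boosted_ch, n_ch=8, n_samples=256):
    """Synthetic EEG-like trials; one channel gets extra variance per class."""
    X = rng.standard_normal((n_trials, n_ch, n_samples))
    X[:, boosted_ch, :] *= 3.0
    return X

def mean_cov(trials):
    """Average trace-normalized spatial covariance over trials."""
    covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
    return np.mean(covs, axis=0)

def csp_filters(c0, c1, n_pairs=1):
    """CSP via whitening of the composite covariance, then
    eigendecomposition of one whitened class covariance."""
    d, u = np.linalg.eigh(c0 + c1)
    p = np.diag(d ** -0.5) @ u.T            # whitening matrix
    d0, b = np.linalg.eigh(p @ c0 @ p.T)    # eigenvalues in ascending order
    w = b.T @ p                             # rows are spatial filters
    idx = np.r_[np.arange(n_pairs), np.arange(len(d0) - n_pairs, len(d0))]
    return w[idx]                           # most discriminative pair(s)

def features(trials, w):
    """Log-variance of CSP-filtered trials, the classic CSP feature."""
    return np.array([np.log(np.var(w @ x, axis=1)) for x in trials])

train0, train1 = make_trials(40, 0), make_trials(40, 1)
w = csp_filters(mean_cov(train0), mean_cov(train1))
f0, f1 = features(train0, w), features(train1, w)

# Minimal two-class LDA: pooled covariance, projection, midpoint threshold.
m0, m1 = f0.mean(0), f1.mean(0)
sw = np.cov(np.vstack([f0 - m0, f1 - m1]).T)
wv = np.linalg.solve(sw, m1 - m0)
bias = -wv @ (m0 + m1) / 2

predict = lambda f: (f @ wv + bias > 0).astype(int)
acc = np.mean(np.r_[predict(f0) == 0, predict(f1) == 1])
```

On this easy synthetic problem the two log-variance features separate the classes almost perfectly; real SSVEP data would of course need band-pass filtering and proper train/test splits first.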
Manual 3D Control of an Assistive Robotic Manipulator Using Alpha Rhythms and an Auditory Menu: A Proof-of-Concept
Brain–Computer Interfaces (BCIs) have been regarded as potential tools for individuals with severe motor disabilities, such as amyotrophic lateral sclerosis, which render interfaces that rely on movement unusable. This study aims to develop a dependent BCI system for manual end-point control of a robotic arm. A proof-of-concept system was devised using parieto-occipital alpha wave modulation and a cyclic menu with auditory cues. Users choose a movement to be executed and asynchronously stop that action when necessary. Tolerance intervals allowed users to cancel or confirm actions. Eight able-bodied subjects used the system to perform a pick-and-place task. To investigate potential learning effects, the experiment was conducted twice over two consecutive days. Subjects obtained satisfactory completion rates (84.0 ± 15.0% and 74.4 ± 34.5% for the first and second day, respectively) and high path efficiency (88.9 ± 11.7% and 92.2 ± 9.6%). Subjects took on average 439.7 ± 203.3 s to complete each task, but the robot was only in motion 10% of the time. There was no significant difference in performance between the two days. The developed control scheme provided users with intuitive control, but a considerable amount of time is spent waiting for the right target (auditory cue). Implementing other brain signals may increase its speed.
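The alpha-modulation switch underlying such a dependent BCI can be sketched as a relative band-power detector. The sampling rate, band limits, one-second window, and threshold below are illustrative assumptions, not values from the study:

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed)

def alpha_power(signal, fs=FS, band=(8.0, 12.0)):
    """Fraction of total spectral power falling in the alpha band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

def alpha_switch(signal, threshold=0.5):
    """True when alpha dominates the window, e.g. during eye closure."""
    return alpha_power(signal) > threshold

# Demo on synthetic one-second windows.
t = np.arange(FS) / FS
rng = np.random.default_rng(1)
rest = rng.standard_normal(FS)                     # broadband background
engaged = rest + 5.0 * np.sin(2 * np.pi * 10 * t)  # strong 10 Hz rhythm
```

In the paper's cyclic-menu scheme, a detector like this would confirm or stop the currently announced action; tolerance intervals then debounce accidental triggers.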
Co-adaptive control strategies in assistive Brain-Machine Interfaces
A large number of people with severe motor disabilities cannot access any of the
available control inputs of current assistive products, which typically rely on residual
motor functions. These patients are therefore unable to fully benefit from existing
assistive technologies, including communication interfaces and assistive robotics. In
this context, electroencephalography-based Brain-Machine Interfaces (BMIs) offer a
potential non-invasive solution to exploit a non-muscular channel for communication
and control of assistive robotic devices, such as a wheelchair, a telepresence
robot, or a neuroprosthesis. Still, non-invasive BMIs currently suffer from limitations,
such as lack of precision, robustness and comfort, which prevent their practical
implementation in assistive technologies.
The goal of this PhD research is to produce scientific and technical developments
to advance the state of the art of assistive interfaces and service robotics based on
BMI paradigms. Two main research paths to the design of effective control strategies
were considered in this project. The first is the design of hybrid systems based on
the combination of the BMI with gaze control, a motor function that is preserved
long into the disease in many paralyzed patients. This approach increases the
degrees of freedom available for control. The second approach consists of including
adaptive techniques in the BMI design, which transforms robotic tools and devices
into active assistants able to co-evolve with the user and learn new rules of
behavior to solve tasks, rather than passively executing external commands.
Following these strategies, the contributions of this work can be categorized
according to the type of mental signal exploited for control. These include:
1) the use of active signals for the development and implementation of hybrid
eye-tracking and BMI control policies, for both communication and control of robotic
systems; 2) the exploitation of passive mental processes to increase the adaptability
of an autonomous controller to the user's intention and psychophysiological state,
within a reinforcement learning framework; 3) the integration of active and passive
brain control signals, to achieve adaptation within the BMI architecture at the level
of feature extraction and classification.
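The second contribution, using passive signals as feedback for an adaptive controller, can be illustrated with a toy reinforcement-learning loop in which a noisy decoded error signal (as an error-related potential decoder would produce) replaces an explicit reward. The state/action space, decoder accuracy, and learning rates below are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

N_STATES, N_ACTIONS = 5, 2     # toy line-world: move left/right toward state 4
Q = np.zeros((N_STATES, N_ACTIONS))
ALPHA, GAMMA, EPS = 0.2, 0.9, 0.1
DECODER_ACC = 0.85             # probability the ErrP decoder labels correctly

def decoded_reward(correct_move):
    """Simulated passive feedback: the decoder flags wrong moves,
    but only with DECODER_ACC reliability."""
    observed = correct_move if rng.random() < DECODER_ACC else not correct_move
    return 0.0 if observed else -1.0

for _ in range(2000):
    s = int(rng.integers(0, N_STATES - 1))
    a = int(rng.integers(N_ACTIONS)) if rng.random() < EPS else int(Q[s].argmax())
    s_next = min(s + 1, N_STATES - 1) if a == 1 else max(s - 1, 0)
    r = decoded_reward(correct_move=(a == 1))   # moving right is "intended"
    target = r + GAMMA * Q[s_next].max()
    Q[s, a] += ALPHA * (target - Q[s, a])       # standard Q-learning update

policy = Q.argmax(axis=1)
```

Even with an imperfect decoder, the expected penalty for wrong moves remains larger than for intended ones, so the controller converges to the user's intended policy; this is the core idea behind co-adaptive control from passive signals.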
Aerospace medicine and biology: A continuing bibliography with indexes (supplement 341)
This bibliography lists 133 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during September 1990. Subject coverage includes: aerospace medicine and psychology, life support systems and controlled environments, safety equipment, exobiology and extraterrestrial life, and flight crew behavior and performance.
NOIR: Neural Signal Operated Intelligent Robots for Everyday Activities
We present Neural Signal Operated Intelligent Robots (NOIR), a
general-purpose, intelligent brain-robot interface system that enables humans
to command robots to perform everyday activities through brain signals. Through
this interface, humans communicate their intended objects of interest and
actions to the robots using electroencephalography (EEG). Our novel system
demonstrates success in an expansive array of 20 challenging, everyday
household activities, including cooking, cleaning, personal care, and
entertainment. The effectiveness of the system is improved by its synergistic
integration of robot learning algorithms, allowing for NOIR to adapt to
individual users and predict their intentions. Our work enhances the way humans
interact with robots, replacing traditional channels of interaction with
direct, neural communication. Project website: https://noir-corl.github.io/
Brain–Machine Interface and Visual Compressive Sensing-Based Teleoperation Control of an Exoskeleton Robot
This paper presents a teleoperation control for an exoskeleton robotic system based on the brain-machine interface and vision feedback. Vision compressive sensing, brain-machine reference commands, and adaptive fuzzy controllers in joint space have been effectively integrated to enable the robot to perform manipulation tasks guided by the human operator's mind. First, a visual-feedback link is implemented via video captured by a camera, allowing the operator to visualize the manipulator's workspace and the movements being executed. Then, the compressed images are used as feedback errors in a non-vector space for producing steady-state visual evoked potential electroencephalography (EEG) signals, which requires no prior information on features, in contrast to traditional visual servoing. The proposed EEG decoding algorithm generates control signals for the exoskeleton robot using features extracted from neural activity. Considering coupled dynamics and actuator input constraints during robot manipulation, a local adaptive fuzzy controller has been designed to drive the exoskeleton to track the trajectories intended in the human operator's mind and to provide a convenient way of dynamics compensation with minimal knowledge of the dynamic parameters of the exoskeleton robot. Extensive experimental studies with three subjects have been performed to verify the validity of the proposed method.
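The adaptive joint-control idea, tracking a desired trajectory while compensating unknown dynamics online, can be illustrated with a one-joint model-reference adaptive controller. This is a deliberately simplified stand-in (a linear first-order plant with MIT-rule adaptation), not the paper's adaptive fuzzy controller; every gain and plant parameter below is invented for the sketch:

```python
import numpy as np

# One-joint toy plant x' = a*x + b*u with (a, b) unknown to the controller;
# the reference model encodes the desired closed-loop dynamics.
A_TRUE, B_TRUE = 1.0, 3.0        # "unknown" plant parameters
AM, BM = -4.0, 4.0               # reference model x_m' = AM*x_m + BM*r
GAMMA, DT, STEPS = 2.0, 1e-3, 40_000

x = xm = 0.0
kx, kr = -2.0, 0.0               # kx starts at a stabilizing guess
errors = []
for i in range(STEPS):
    t = i * DT
    r = 1.0 if (t % 4.0) < 2.0 else -1.0   # square-wave reference
    u = kx * x + kr * r                    # adaptive control law
    e = x - xm                             # tracking error vs. model
    errors.append(abs(e))
    # MIT-rule gradient adaptation (sign of b assumed known and positive)
    kx -= GAMMA * e * x * DT
    kr -= GAMMA * e * r * DT
    # Euler integration of plant and reference model
    x += (A_TRUE * x + B_TRUE * u) * DT
    xm += (AM * xm + BM * r) * DT

early = float(np.mean(errors[: STEPS // 4]))
late = float(np.mean(errors[-(STEPS // 4):]))
```

The gains drift toward the matching values (kx* = (AM - A_TRUE)/B_TRUE, kr* = BM/B_TRUE), so tracking error shrinks over the run; the paper's fuzzy controller plays the same role for the coupled, nonlinear exoskeleton dynamics.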
A Consumer-tier based Visual-Brain Machine Interface for Augmented Reality Glasses Interactions
Objective. The Visual-Brain Machine Interface (V-BMI) provides a novel
interaction technique for Augmented Reality (AR) industries. Several
state-of-the-art works have demonstrated its high accuracy and real-time
interaction capabilities. However, most of these studies employ EEG devices
that are rigid and difficult to apply in real-life AR glasses application
scenarios. Here we develop a consumer-tier Visual-Brain Machine Interface
(V-BMI) system specialized for Augmented Reality (AR) glasses interactions.
Approach. The developed system consists of wearable hardware that offers
fast set-up, reliable recording, and a comfortable wearing experience,
tailored for AR glasses applications. Complementing this hardware, we have
devised a software framework that facilitates real-time interactions within
the system while accommodating a modular configuration to enhance
scalability. Main results. The developed hardware weighs only 110 g and
measures 120 x 85 x 23 mm, with an input impedance of 1 TOhm and a
peak-to-peak noise voltage below 1.5 uV. A V-BMI-based Angry Birds game and
an Internet of Things (IoT) AR application were designed, demonstrating the
technology's merits of intuitive experience and efficient interaction. The
real-time interaction accuracy is between 85% and 96% in commercial AR
glasses (DTI of 2.24 s and ITR of 65 bits/min).
Significance. Our study indicates that the developed system can provide an
essential hardware-software framework for consumer-based V-BMI AR glasses.
We also derive several pivotal design factors for a consumer-grade
V-BMI-based AR system: 1) dynamic adaptation of stimulation patterns and
classification methods via computer vision algorithms is necessary for AR
glasses applications; and 2) algorithmic localization to foster system
stability and latency reduction. Comment: 15 pages, 10 figures
Dynamics of embodied dissociated cortical cultures for the control of hybrid biological robots.
The thesis presents a new paradigm for studying the importance of interactions between an organism and its environment using a combination of biology and technology: embodying cultured cortical neurons via robotics. From this platform, explanations of the emergent neural network properties leading to cognition are sought through detailed electrical observation of neural activity. By growing the networks of neurons and glia over multi-electrode arrays (MEA), which can be used to both stimulate and record the activity of multiple neurons in parallel over months, a long-term real-time 2-way communication with the neural network becomes possible. A better understanding of the processes leading to biological cognition can, in turn, facilitate progress in understanding neural pathologies, designing neural prosthetics, and creating fundamentally different types of artificial cognition.
Here, methods were first developed to reliably induce and detect neural plasticity using MEAs. This knowledge was then applied to construct sensory-motor mappings and training algorithms that produced adaptive goal-directed behavior. To paraphrase the results, almost any stimulation could induce neural plasticity, while the inclusion of temporal and/or spatial information about neural activity was needed to identify plasticity. Interestingly, plasticity of action potential propagation in axons was observed. This notion runs counter to the dominant theories of neural plasticity, which focus on synaptic efficacies, and is suggestive of a vast and novel computational mechanism for learning and memory in the brain.
Adaptive goal-directed behavior was achieved by using patterned training stimuli, contingent on behavioral performance, to sculpt the network into behaviorally appropriate functional states: network plasticity was not only induced, but could be customized. Clinically, understanding the relationships between electrical stimulation, neural activity, and the functional expression of neural plasticity could assist neuro-rehabilitation and the design of neuroprosthetics. In a broader context, the networks were also embodied with a robotic drawing machine exhibited in galleries throughout the world. This provided a forum to educate the public and critically discuss neuroscience, robotics, neural interfaces, cybernetics, bio-art, and the ethics of biotechnology. Ph.D. Committee Chair: Steve M. Potter; Committee Member: Eric Schumacher; Committee Member: Robert J. Butera; Committee Member: Stephan P. DeWeerth; Committee Member: Thomas D. DeMars