
    Telepresence system controlled by a brain-computer interface: initial tests with ALS patients

    Brain-computer interfaces (BCIs) provide their users with communication and control using only their brain activity. They do not depend on the brain's usual output channels of peripheral nerves and muscles, opening a valuable new communication channel for people with severe neurological or muscular diseases such as amyotrophic lateral sclerosis (ALS), stroke, cerebral palsy, and spinal cord injury. Combining brain-computer interfaces with robotics can give users a physical entity, embodied in a real environment (anywhere in the world with Internet access), ready to perceive, explore, and interact, controlled solely by brain activity. Furthermore, it has been suggested that such systems could benefit ALS patients in the context of neurorehabilitation or the maintenance of neural activity. This master's thesis presents the complete development process of an initial prototype of a BCI-based telepresence system and its evaluation with healthy users, followed by its redesign (to meet the needs of real patients) and evaluation with ALS patients. The results demonstrated the feasibility of this technology with real patients

    Using Variable Natural Environment Brain-Computer Interface Stimuli for Real-time Humanoid Robot Navigation

    This paper addresses the challenge of humanoid robot teleoperation in a natural indoor environment via a Brain-Computer Interface (BCI). We leverage deep Convolutional Neural Network (CNN) based image and signal understanding to facilitate both real-time object detection and dry-Electroencephalography (EEG) based decoding of human cortical brain bio-signals. We employ recent advances in dry-EEG technology to stream and collect cortical waveforms from subjects while they fixate on variable Steady State Visual Evoked Potential (SSVEP) stimuli generated directly from the environment the robot is navigating. To these ends, we propose the use of novel variable BCI stimuli by utilising the real-time video streamed via the on-board robot camera as visual input for SSVEP, where the CNN-detected natural scene objects are altered and flickered at differing frequencies (10Hz, 12Hz and 15Hz). These stimuli are not akin to traditional stimuli, as both the dimensions of the flicker regions and their on-screen positions change depending on the scene objects detected. On-screen object selection via such a dry-EEG enabled SSVEP methodology facilitates the on-line decoding of human cortical brain signals, via a specialised secondary CNN, directly into teleoperation robot commands (approach object; move in a specific direction: right, left or back). This SSVEP decoding model is trained via a priori offline experimental data in which very similar visual input is present for all subjects. The resulting classification demonstrates high performance, with a mean accuracy of 85% for the real-time robot navigation experiment across multiple test subjects. Comment: Accepted as a full paper at the 2019 International Conference on Robotics and Automation (ICRA)
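    The decoding step described above, mapping the SSVEP flicker frequency a subject fixates on to a teleoperation command, can be sketched as follows. This is a minimal illustrative example using a simple spectral peak detector rather than the paper's CNN decoder; the sampling rate, the frequency-to-command mapping, and the function names are assumptions, not the authors' implementation.

    ```python
    import numpy as np

    # Stimulus frequencies follow the paper (10, 12 and 15 Hz); which command
    # each frequency maps to is an assumption made for this sketch.
    STIMULUS_COMMANDS = {10.0: "approach object", 12.0: "move left", 15.0: "move right"}

    def classify_ssvep(eeg: np.ndarray, fs: float) -> str:
        """Return the command whose stimulus frequency has the most
        spectral power in the given single-channel EEG segment."""
        spectrum = np.abs(np.fft.rfft(eeg - eeg.mean()))  # drop DC offset
        freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

        def band_power(f0: float, width: float = 0.5) -> float:
            # Power in a narrow band around the candidate flicker frequency.
            mask = (freqs >= f0 - width) & (freqs <= f0 + width)
            return float(spectrum[mask].sum())

        best = max(STIMULUS_COMMANDS, key=band_power)
        return STIMULUS_COMMANDS[best]
    ```

    A CNN decoder, as used in the paper, can learn subject-invariant features that a fixed band-power rule misses, but the band-power view is a useful mental model of what the classifier exploits.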

    Design of a six degree-of-freedom haptic hybrid platform manipulator

    Thesis (Master)--Izmir Institute of Technology, Mechanical Engineering, Izmir, 2010. Includes bibliographical references (leaves: 97-103). Text in English; abstract in Turkish and English. xv, 115 leaves. The word haptic, derived from the ancient Greek haptios, means related to touch. As an area of robotics, haptics technology provides the sense of touch for robotic applications that involve interaction with a human operator and the environment. The sense of touch, accompanied by visual feedback, is enough to gather most of the information about a given environment. It increases the precision of teleoperation and the level of sensation in virtual reality (VR) applications by conveying physical properties of the environment such as forces, motions, and textures. Currently, haptic devices are used in many VR and teleoperation applications. The objective of this thesis is to design a novel six degree-of-freedom (DOF) haptic desktop device with a new structure that has the potential to increase precision in haptics technology. First, previously developed haptic devices and manipulator structures are reviewed. Following this, conceptual designs are formed, and a hybrid-structured haptic device is designed, manufactured, and tested. The developed haptic device's control algorithm and VR application are implemented in Matlab Simulink. Integration of the mechanism with mechanical, electromechanical, and electronic components and the initial tests of the system are executed, and the results are presented. Based on the results, the performance of the developed device is discussed and future work is addressed

    Multisensory wearable interface for immersion and telepresence in robotics

    The idea of being present in a remote location has inspired researchers to develop robotic devices that enable humans to experience the feeling of telepresence. These devices require multiple channels of sensory feedback to provide a more realistic telepresence experience. In this work, we develop a wearable interface for immersion and telepresence that provides humans with the capability both to receive multisensory feedback from vision, touch, and audio and to remotely control a robot platform. Multimodal feedback from a remote environment is based on the integration of sensor technologies coupled to the sensory system of the robot platform. Remote control of the robot is achieved by a modularised architecture, which allows the user to visually explore the remote environment. We validated our work with multiple experiments in which participants, located at different venues, were able to successfully control the robot platform while visually exploring, touching, and listening to a remote environment. In our experiments we used two different robotic platforms: the iCub humanoid robot and the Pioneer LX mobile robot. These experiments show that our wearable interface is comfortable, easy to use, and adaptable to different robotic platforms. Furthermore, we observed that our approach allows humans to experience a vivid feeling of being present in a remote environment
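    The modularised architecture the abstract describes, one wearable front end driving two very different robots (the iCub humanoid and the Pioneer LX mobile base), suggests an adapter pattern behind a common interface. The sketch below illustrates that idea only; every class, method, and parameter name here is an assumption for illustration, not the authors' actual API.

    ```python
    from abc import ABC, abstractmethod

    class RobotPlatform(ABC):
        """Common interface the wearable front end talks to, so the same
        teleoperation code can drive any adapted robot platform."""

        @abstractmethod
        def move(self, linear: float, angular: float) -> str:
            """Send a velocity command; returns a human-readable status."""

    class ICubAdapter(RobotPlatform):
        def move(self, linear: float, angular: float) -> str:
            # A real adapter would translate this into iCub walking commands.
            return f"iCub: walk linear={linear:.2f} angular={angular:.2f}"

    class PioneerLXAdapter(RobotPlatform):
        def move(self, linear: float, angular: float) -> str:
            # A real adapter would publish wheel velocities to the mobile base.
            return f"PioneerLX: drive linear={linear:.2f} angular={angular:.2f}"

    def teleoperate(platform: RobotPlatform, head_yaw: float) -> str:
        # Map the operator's head yaw (e.g. from the wearable's tracker) to a
        # turning command; forward speed is fixed for this sketch.
        return platform.move(linear=0.3, angular=-0.5 * head_yaw)
    ```

    Swapping platforms then only requires a new adapter, which is consistent with the paper's claim that the interface is adaptable to different robots.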