2 research outputs found

    Navigation assistant with laser pointer for driving robotized wheelchairs (Assistente de navegação com apontador laser para conduzir cadeiras de rodas robotizadas)

    Advisor: Eric Rohmer. Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.
    Abstract: Assistive robotics solutions help people recover the mobility and autonomy they have lost in their daily lives. This document presents a low-cost navigation assistant designed to let people paralyzed from the neck down drive a robotized wheelchair using a combination of head orientation and facial expressions (smile and eyebrows up) to send commands to the chair. The assistant provides two navigation modes: manual and semi-autonomous. In manual navigation, a regular webcam with the OpenFace algorithm detects the user's head orientation and facial expressions (smile, eyebrows up) to compose commands and act directly on the wheelchair movements (stop, go forward, turn right, turn left). In the semi-autonomous mode, the user controls a pan-tilt laser with the head to point at the desired destination on the ground and validates it with the eyebrows-up command, which makes the robotized wheelchair perform a rotation followed by a linear displacement to the chosen target. Although the assistant needs improvements, the results showed that this solution may be a promising technology for people paralyzed from the neck down to control a robotized wheelchair.
    Master's - Computer Engineering - Master in Electrical Engineering - CAPE
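
    As a rough illustration of the manual mode described in the abstract above, the sketch below maps a detected head yaw angle and two expression flags (smile, eyebrows up) to the four discrete wheelchair commands. The threshold, the role given to each expression, and the function names are assumptions for illustration only, not the mapping actually used in the thesis.

# Illustrative sketch (not the thesis code): combine head yaw and two expression
# flags into the four discrete wheelchair commands named in the abstract.
# The threshold value and the role of each expression are assumptions.

from enum import Enum

class Command(Enum):
    STOP = "stop"
    GO_FORWARD = "go_forward"
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"

YAW_THRESHOLD_DEG = 15.0  # head turned this far left or right triggers a turn

def head_to_command(yaw_deg: float, smile: bool, eyebrows_up: bool) -> Command:
    """Map one frame of head orientation and expression flags to a command."""
    if eyebrows_up:            # assumed: eyebrows up always stops the chair
        return Command.STOP
    if not smile:              # assumed: smiling acts as a dead-man switch
        return Command.STOP
    if yaw_deg > YAW_THRESHOLD_DEG:
        return Command.TURN_LEFT
    if yaw_deg < -YAW_THRESHOLD_DEG:
        return Command.TURN_RIGHT
    return Command.GO_FORWARD

# Example: looking roughly straight ahead while smiling drives the chair forward.
print(head_to_command(yaw_deg=3.0, smile=True, eyebrows_up=False))  # Command.GO_FORWARD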

    Laser based driving assistance for smart robotic wheelchairs

    This paper presents ongoing work toward a novel driving-assistance system of a robotic wheelchair for people paralyzed from the neck down. The user's head posture is tracked to project a colored spot on the ground ahead with a pan-tilt mounted laser. The laser dot on the ground represents a potential close-range destination the operator wants to reach autonomously. The wheelchair is equipped with a low-cost depth camera (Kinect sensor) that builds a traversability map to determine whether the designated destination is reachable by the chair. If it is reachable, the red laser dot turns green, and the operator can validate the wheelchair destination via an electromyogram (EMG) device that detects the contraction of a specific muscle group. This validating action triggers the calculation of a path toward the laser-pointed target, based on the traversability map. The wheelchair is then controlled to follow this path autonomously. In the future, the stream of 3D point clouds acquired during the process will be used to map the environment and self-localize the wheelchair within it, in order to correct the pose estimate derived from the wheel encoders.
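
    As a rough geometric sketch of the laser-pointing step described above, the snippet below intersects the pan-tilt laser ray with a flat ground plane and looks the resulting point up in a chair-centred traversability grid, which is the kind of check that would turn the red dot green. The laser height, grid resolution, grid layout, and function names are illustrative assumptions, not the paper's implementation.

# Illustrative geometry sketch (not the paper's code): project the pan-tilt
# laser onto a flat ground plane and check the target cell of a chair-centred
# traversability grid. Laser height, grid resolution and layout are assumptions.

import math

LASER_HEIGHT_M = 1.2   # assumed height of the laser above the ground
CELL_SIZE_M = 0.10     # assumed grid resolution (metres per cell)

def laser_ground_point(pan_rad: float, tilt_rad: float) -> tuple:
    """Intersect the laser ray with the ground plane, in the chair frame."""
    if tilt_rad <= 0.0:  # tilt is measured downward from horizontal
        raise ValueError("laser must point below the horizon to hit the ground")
    forward = LASER_HEIGHT_M / math.tan(tilt_rad)  # distance along the heading
    return forward * math.cos(pan_rad), forward * math.sin(pan_rad)

def is_reachable(x: float, y: float, traversable: list) -> bool:
    """Look the ground point up in a boolean grid with the chair at its centre."""
    rows, cols = len(traversable), len(traversable[0])
    row = rows // 2 - round(x / CELL_SIZE_M)
    col = cols // 2 + round(y / CELL_SIZE_M)
    return 0 <= row < rows and 0 <= col < cols and traversable[row][col]

# Example: a 4 m x 4 m grid that is free everywhere; point about 1 m ahead, slightly right.
grid = [[True] * 40 for _ in range(40)]
x, y = laser_ground_point(pan_rad=math.radians(-10), tilt_rad=math.radians(50))
print(is_reachable(x, y, grid))  # True -> the red dot would turn green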