    Vision based interface system for hands free control of an intelligent wheelchair

    Abstract. Background: Due to the shift in the age structure of today's populations, the need for devices and technologies that support elderly and disabled people has been increasing. Traditionally, the wheelchair, whether powered or manual, is the most popular and important rehabilitation/assistive device for the disabled and the elderly. However, it remains highly restrictive, especially for the severely disabled. As a solution, Intelligent Wheelchairs (IWs) have received considerable attention as mobility aids. The purpose of this work is to develop an IW interface that provides a more convenient and efficient interface for people with disabilities in their limbs. Methods: This paper proposes an intelligent wheelchair (IW) control system for people with various disabilities. To accommodate a wide variety of user abilities, the proposed system uses face-inclination and mouth-shape information: the direction of the IW is determined by the inclination of the user's face, while proceeding and stopping are determined by the shape of the user's mouth. The system is composed of an electric powered wheelchair, a data acquisition board, ultrasonic/infra-red sensors, a PC camera, and a vision system. The vision system analyzes the user's gestures in three stages: detector, recognizer, and converter. In the detector, the facial region of the intended user is first obtained using Adaboost, after which the mouth region is detected based on edge information. The extracted features are sent to the recognizer, which recognizes the face inclination and mouth shape using statistical analysis and K-means clustering, respectively. These recognition results are then delivered to the converter to control the wheelchair. Results and conclusion: The advantages of the proposed system include 1) accurate recognition of the user's intention with minimal user motion and 2) robustness to cluttered backgrounds and time-varying illumination. To demonstrate these advantages, the proposed system was tested with 34 users in indoor and outdoor environments and the results were compared with those of other systems; the proposed system showed superior performance in terms of speed and accuracy. It thus provides a friendly and convenient interface for severely disabled people.
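
    The abstract outlines a three-stage pipeline: an AdaBoost-based face detector, a recognizer for face inclination and mouth shape (K-means), and a converter that maps recognition results to wheelchair commands. The following is only a hedged sketch of that kind of pipeline, not the authors' implementation: it uses OpenCV's AdaBoost-trained Haar cascade for the detector stage and scikit-learn K-means for mouth-shape clustering, and the thresholds, feature vectors, and command names are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): detector, recognizer and
# converter stages for a face-driven wheelchair interface. OpenCV's Haar
# cascade is an AdaBoost-trained detector; thresholds, feature vectors and
# command names below are assumptions.
import cv2
import numpy as np
from sklearn.cluster import KMeans

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray):
    """Detector stage: return the largest detected face box (x, y, w, h) or None."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return max(faces, key=lambda b: b[2] * b[3]) if len(faces) else None

def face_inclination(gray, box):
    """Recognizer stage (simplified): estimate left/right inclination from the
    horizontal asymmetry of edge energy inside the face box."""
    x, y, w, h = box
    edges = cv2.Canny(gray[y:y + h, x:x + w], 50, 150)
    left, right = edges[:, : w // 2].sum(), edges[:, w // 2:].sum()
    return (right - left) / max(left + right, 1)      # roughly -1 .. 1

def mouth_state(mouth_features, kmeans):
    """Recognizer stage: assign a mouth-shape feature vector to a learned cluster
    (e.g. 'closed' vs 'open')."""
    return int(kmeans.predict(np.asarray(mouth_features).reshape(1, -1))[0])

def convert(inclination, mouth_cluster, tilt_threshold=0.2):
    """Converter stage: map recognition results to a wheelchair command."""
    if mouth_cluster == 0:            # assumed 'closed mouth' cluster -> stop
        return "STOP"
    if inclination > tilt_threshold:
        return "TURN_RIGHT"
    if inclination < -tilt_threshold:
        return "TURN_LEFT"
    return "GO_FORWARD"

# Offline: fit K-means on previously extracted mouth-shape feature vectors
# (placeholder random data here, purely for illustration).
training_features = np.random.rand(100, 8)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(training_features)
```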

    Multimodal interface for an intelligent wheelchair

    Integrated Master's thesis. Informatics and Computing Engineering. Universidade do Porto. Faculdade de Engenharia. 201

    Patient classification for intelligent wheelchair adaptation

    Doctoral thesis in Informatics Engineering. The importance and concern given to the autonomy and independence of elderly people and patients suffering from some kind of disability has been growing significantly in the last few decades. Intelligent wheelchairs (IWs) are technologies that can increase the autonomy and independence of this population and are nowadays a very active research area. However, adaptation to users' specificities and experiments with real users are topics that still lack deeper study. The intelligent wheelchair developed in the context of the IntellWheels project is controlled at a high level through a flexible multimodal interface, using voice commands, facial expressions, head movements and a joystick as its main input modalities. This work aimed to develop a system enabling the automatic adaptation of the previously developed intelligent wheelchair to the user's characteristics. A methodology was created for building a user model. The research was based on the development of a data gathering system that collects and stores voice commands, facial expressions, and head and body movements from patients with distinct disabilities such as Cerebral Palsy. The wheelchair can be used in different situations in real and simulated environments, and a serious game was developed in which different tasks may be performed by users. The data were analysed using knowledge discovery methods in order to create an automatic patient classification system. Based on the classification results, a methodology was developed to select the best wheelchair interface and command language for each patient. Evaluation was performed in the context of Project FCT/RIPD/ADA/109636/2009 – "IntellWheels – Intelligent Wheelchair with Flexible Multimodal Interface". Experiments were conducted with a large set of patients suffering from severe physical constraints, in close collaboration with Escola Superior de Tecnologia de Saúde do Porto and Associação do Porto de Paralisia Cerebral. The experiments using the intelligent wheelchair were followed by user questionnaires, and the results were statistically analysed to assess the effectiveness and usability of adapting the intelligent wheelchair's multimodal interface to the user's characteristics. The results obtained in a simulated environment showed a score of 67 on the System Usability Scale, based on the opinions of a sample of cerebral palsy patients at the most severe levels (IV and V) of the Gross Motor Function scale. It was also statistically demonstrated that the interface assigned automatically by the tool was rated higher than the one suggested by the occupational therapists, showing that a command language adapted to each user can be assigned automatically. Experiments conducted with distinct control modes revealed the users' preference for shared control with an aid level matched to the patient's level of constraint. In conclusion, this work demonstrates that it is possible to automatically adapt an intelligent wheelchair to the user, with clear usability and safety benefits.
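
    The abstract describes classifying patients from collected interaction data and then selecting the best interface and command language for each one. The sketch below is a hypothetical illustration of that classification-to-recommendation step, not the IntellWheels code; the feature names, labels and data are made-up assumptions.

```python
# Hypothetical sketch of a patient-classification step that recommends an
# input modality (not the IntellWheels implementation). Feature names, labels
# and data are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Assumed features per patient: [voice command accuracy, head-movement range,
# facial-expression detection rate, joystick tremor score].
X = np.array([
    [0.95, 0.80, 0.70, 0.10],
    [0.30, 0.85, 0.60, 0.90],
    [0.20, 0.20, 0.90, 0.95],
    [0.90, 0.75, 0.65, 0.15],
    [0.25, 0.80, 0.55, 0.85],
    [0.15, 0.25, 0.85, 0.90],
])
# Assumed best-performing interface observed for each patient during data gathering.
y = np.array(["voice", "head", "face", "voice", "head", "face"])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

def recommend_interface(patient_features):
    """Return the input modality the trained model predicts for a new patient."""
    return model.predict(np.asarray(patient_features).reshape(1, -1))[0]

print(recommend_interface([0.22, 0.18, 0.88, 0.92]))   # e.g. 'face'
```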

    Detection of eye movements based on EEG signals and the SAX algorithm

    For patients with disabilities, particularly those with motor disabilities who have difficulty interacting with computers and other devices, Human-Machine Interaction (HMI) research may provide new ways to overcome this problem. In this paper, we propose the Brain-Computer Interface (BCI) approach as a potential technique: the patient uses a portable electroencephalography (EEG) device to give instructions to a computing device via eye movements. Classification algorithms have been investigated in past research to detect eye movements. Here we investigate another technique, the Symbolic Aggregate Approximation (SAX) algorithm, to assess its suitability and performance against known classification algorithms such as Support Vector Machine (SVM), k-Nearest Neighbour (KNN) and Decision Tree (DT).
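
    SAX converts a numeric time series into a short symbol string by z-normalizing the signal, reducing it with piecewise aggregate approximation (PAA), and discretizing the segment means with Gaussian breakpoints. The sketch below shows this standard SAX discretization applied to a single synthetic EEG-like window; it is a generic illustration of the algorithm rather than the paper's pipeline, and the segment count and alphabet size are assumptions.

```python
# Minimal sketch of standard SAX discretization applied to one signal window.
# Generic illustration of the algorithm, not the paper's pipeline; the segment
# count and alphabet size are assumptions.
import numpy as np
from scipy.stats import norm

def sax(signal, n_segments=8, alphabet_size=4):
    """Convert a 1-D signal into a SAX word (string of letters)."""
    x = np.asarray(signal, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)              # z-normalize
    # Piecewise Aggregate Approximation: mean of each (roughly) equal segment.
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    # Breakpoints splitting the standard normal into equiprobable regions.
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    symbols = np.digitize(paa, breakpoints)             # 0 .. alphabet_size-1
    return "".join(chr(ord("a") + s) for s in symbols)

# Example: a short synthetic 'EEG' window; windows corresponding to different
# eye movements yield different SAX words, which can then be compared or fed
# to a classifier.
window = np.sin(np.linspace(0, 2 * np.pi, 128)) + 0.1 * np.random.randn(128)
print(sax(window))    # prints an 8-letter SAX word, e.g. 'dddcbaaa'
```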

    Human motion analysis and simulation tools

    Motion analysis is currently an active research topic in Computational Vision, Computer Graphics, Machine Learning and Biomechanics, mainly due to its applicability to a wide spectrum of relevant applications in many areas. This work presents a detailed, broad and up-to-date survey of motion analysis and/or simulation software packages that have been developed both by the scientific community and by commercial entities for use in the field of biomechanics. The main contribution of this study, beyond the comprehensive listing of motion analysis tools, is the presentation of an effective framework to classify and compare motion simulation and analysis tools.

    State of knowledge on intelligent powered wheelchairs (IPWs) and recommendations for their continued development: a scoping review

    Background: The level of social participation among wheelchair users can be affected by factors such as accessibility, as well as their individual clinical profile, which can limit their full involvement in meaningful activities of daily living. To meet their needs in terms of mobility and social participation, different prototypes of intelligent powered wheelchairs (IPWs) are being developed in order to improve the mobility of people with physical, cognitive or sensory impairments who have difficulty using standard powered wheelchairs. Objective: The aim of this study was to map the existing literature on IPWs to better understand how the existing prototypes meet the needs of powered wheelchair users, to identify the limitations of current studies, and to make recommendations to better guide the ongoing development of IPWs. Methods: A scoping review was conducted in accordance with the six stages of Arksey and O'Malley's (2005) framework, later enhanced by Levac et al. (2010). All studies available until September 2020, written in English or in French, were included. Results: A total of 41 studies were included in the scoping review. The results suggest that the various technologies integrated into IPWs could meet some of the needs of powered wheelchair users (PWu), help improve their mobility, provide independence, and promote their social participation. Additional findings identified (a) other technologies that could provide users with a greater sense of independence and comfort, and (b) other possible clinical uses of IPWs. An important limitation of the literature is the lack of experimental studies that could assess the effectiveness of IPWs. The point of view of caregivers is also rarely reported in the literature. Conclusion: Further studies should be considered to improve the existing IPW prototypes.

    Gaze-Based Control of Robot Arm in Three-Dimensional Space

    Eye tracking technology has opened up a new communication channel for people with very restricted body movements. These devices have already been successfully applied as human-computer interfaces, e.g. for writing text or for controlling devices such as a wheelchair. This thesis proposes a Human-Robot Interface (HRI) that enables the user to control a robot arm in three-dimensional space using only two-dimensional gaze direction and the states of the eyes. The introduced interface provides all the commands required to translate, rotate, and open or close the gripper through the definition of different control modes. In each mode, different commands are provided, and the user's direct gaze direction is used to generate continuous robot commands. To distinguish between natural inspection eye movements and eye movements intended to control the robot arm, dynamic command areas are proposed. The dynamic command areas are defined around the robot gripper and are updated with its movements. To provide direct interaction with the user, gaze gestures and eye states are used to switch between different control modes. For the purpose of this thesis, two versions of the above-introduced HRI were developed. In the first version, only two simple gaze gestures and two eye states (closed eyes and eye winking) are used for switching. In the second version, four complex gaze gestures were used instead of the two simple gestures, and the positions of the dynamic command areas were optimized. The complex gaze gestures enable the user to switch directly from the initial mode to the desired control mode. These gestures are flexible and can be generated directly in the robot environment. For the recognition of complex gaze gestures, a novel algorithm based on Dynamic Time Warping (DTW) is proposed. The results of the studies conducted with both HRIs confirmed their feasibility and showed the high potential of the proposed interfaces as hands-free interfaces. Furthermore, the results of subjective and objective measurements showed that the usability of the interface with simple gaze gestures was improved by the integration of complex gaze gestures and the new positions of the dynamic command areas.
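
    The abstract mentions gaze-gesture recognition based on Dynamic Time Warping. The sketch below is a generic illustration of DTW applied to 2-D gaze trajectories with nearest-template matching, not the thesis implementation; the gesture templates and the distance threshold are made-up assumptions.

```python
# Generic sketch of DTW-based gaze-gesture matching (not the thesis code):
# compute the DTW distance between two 2-D gaze trajectories and pick the
# closest gesture template. Templates and threshold are assumptions.
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two sequences of 2-D points."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])        # local Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],            # insertion
                                 cost[i, j - 1],            # deletion
                                 cost[i - 1, j - 1])        # match
    return cost[n, m]

def recognize(trajectory, templates, threshold=5.0):
    """Return the name of the closest gesture template, or None if all are too far."""
    name, dist = min(((k, dtw_distance(trajectory, t)) for k, t in templates.items()),
                     key=lambda kv: kv[1])
    return name if dist < threshold else None

# Hypothetical gesture templates as normalized (x, y) gaze-point sequences.
templates = {
    "swipe_right": [(0.0, 0.5), (0.3, 0.5), (0.6, 0.5), (1.0, 0.5)],
    "swipe_up":    [(0.5, 1.0), (0.5, 0.7), (0.5, 0.3), (0.5, 0.0)],
}
observed = [(0.05, 0.52), (0.4, 0.5), (0.75, 0.49), (0.95, 0.5)]
print(recognize(observed, templates))    # expected: 'swipe_right'
```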