4 research outputs found

    Multimodal interface for an intelligent wheelchair

    Integrated Master's thesis (tese de mestrado integrado). Informatics and Computing Engineering. Universidade do Porto, Faculdade de Engenharia, 201

    User modelling for robotic companions using stochastic context-free grammars

    Creating models about others is a sophisticated human ability that robotic companions need to develop in order to have successful interactions. This thesis proposes user modelling frameworks to personalise the interaction between a robot and its user, and devises novel scenarios where robotic companions may apply these user modelling techniques. We tackle the creation of user models in a hierarchical manner, using a streamlined version of the Hierarchical Attentive Multiple-Models for Execution and Recognition (HAMMER) architecture to detect low-level user actions, and taking advantage of Stochastic Context-Free Grammars (SCFGs) to instantiate higher-level models which recognise uncertain and recursive sequences of low-level actions.

    We first discuss two distinct scenarios for robotic companions: a humanoid sidekick for power-wheelchair users and a companion for hospital patients. Next, we address the limitations of these scenarios by applying our user modelling techniques and designing two further scenarios that fully take advantage of the user model: a wheelchair driving tutor which models the user's abilities, and a musical collaborator which learns the preferences of its users.

    The methodology produced interesting results in all scenarios. Users preferred the actual robot over a simulator as a wheelchair sidekick. Hospital patients rated their interactions with the companion positively, independently of their age. Moreover, most users agreed that the musical collaborator had become a better accompanist with our framework. Finally, we observed that users' driving performance improved when the robotic tutor instructed them to repeat a task. As our workforce ages and the care requirements of our society grow, robots will need to play a role in helping us lead better lives. This thesis shows that, through the use of SCFGs, adaptive user models can be generated and then used by robots to assist their users.
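
    The abstract above describes recognising uncertain, recursive sequences of low-level actions with an SCFG. As a minimal sketch, the inside algorithm below scores an observed action sequence under a small grammar in Chomsky normal form; the grammar, action names and probabilities are invented for illustration and are not the thesis's actual model.

        from collections import defaultdict

        # Illustrative CNF grammar (assumed, not from the thesis):
        #   TASK  -> REACH GRASP   (0.6)  a single pick-up
        #   TASK  -> TASK TASK     (0.4)  recursive repetition of sub-tasks
        #   REACH -> 'reach' (1.0),  GRASP -> 'grasp' (1.0)
        binary = [("TASK", "REACH", "GRASP", 0.6),
                  ("TASK", "TASK", "TASK", 0.4)]
        lexical = [("REACH", "reach", 1.0),
                   ("GRASP", "grasp", 1.0)]

        def inside_probability(actions, start="TASK"):
            """P(grammar derives the observed action sequence), via the inside algorithm."""
            n = len(actions)
            # chart[i][j][A] = P(A derives actions[i:j])
            chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
            for i, act in enumerate(actions):                 # lexical layer
                for lhs, word, p in lexical:
                    if word == act:
                        chart[i][i + 1][lhs] += p
            for span in range(2, n + 1):                      # longer spans
                for i in range(n - span + 1):
                    j = i + span
                    for k in range(i + 1, j):                 # split point
                        for lhs, b, c, p in binary:
                            if chart[i][k][b] and chart[k][j][c]:
                                chart[i][j][lhs] += p * chart[i][k][b] * chart[k][j][c]
            return chart[0][n][start]

        # Two pick-ups in a row are still recognised, thanks to the recursive rule.
        print(inside_probability(["reach", "grasp", "reach", "grasp"]))  # 0.144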

    Human-Robot Interaction Strategies for Walker-Assisted Locomotion

    Neurological and age-related diseases affect human mobility at different levels, causing partial or total loss of that faculty. There is a significant need to improve safe and efficient ambulation for patients with gait impairments. In this context, walkers offer important benefits for human mobility, improving balance and reducing the load on the lower limbs. Most importantly, walkers encourage patients to use their residual mobility in different environments. In the field of robotic technologies for gait assistance, a new category of walkers has emerged that integrates robotics, electronics and mechanics; such devices are known as robotic walkers, intelligent walkers or smart walkers.

    One specific and important aspect common to assistive technologies and rehabilitation robotics is the intrinsic interaction between the human and the robot. In this thesis, the concept of Human-Robot Interaction (HRI) for human locomotion assistance is explored. This interaction comprises two interdependent components. On the one hand, the key role of the robot in physical HRI (pHRI) is to generate supplementary forces that empower human locomotion, which involves a net flux of power between the two actors. On the other hand, a crucial role of cognitive HRI (cHRI) is to make the human aware of the robot's possibilities while allowing the human to remain in control at all times.

    This doctoral thesis presents a new multimodal human-robot interface for testing and validating control strategies applied to a robotic walker for assisting human mobility and gait rehabilitation. The interface extracts navigation intentions from a novel sensor fusion method that combines: (i) a Laser Range Finder (LRF) to estimate the kinematics of the user's legs, (ii) wearable Inertial Measurement Unit (IMU) sensors to capture the orientations of the human and the robot, and (iii) force sensors to measure the physical interaction between the human's upper limbs and the robotic walker. Two closed control loops were developed to adapt the walker's position naturally and to perform body-weight-support strategies. First, a force interaction controller generates velocity commands for the walker based on the physical interaction at the upper limbs. Second, an inverse kinematic controller keeps the walker at a desired position relative to the human, improving that interaction.

    The proposed control strategies are suitable for natural human-robot interaction, as shown during the experimental validation. Moreover, the sensor fusion methods used to estimate the control inputs were presented and validated. In the experimental studies, the parameter estimation was precise and unbiased, and it showed repeatability when speed changes and continuous turns were performed.
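
    As a minimal sketch of the two closed control loops described above: an admittance law turns handle forces into walker velocity commands, and a proportional law holds the walker at a desired distance from the user's legs. All parameter values, names and the control structure are illustrative assumptions, not the thesis's actual controllers.

        # --- Force interaction loop: virtual dynamics  M*dv/dt + B*v = F ---
        class AdmittanceController:
            def __init__(self, mass=10.0, damping=8.0, inertia=2.0, rot_damping=4.0):
                self.mass, self.damping = mass, damping                # linear dynamics
                self.inertia, self.rot_damping = inertia, rot_damping  # rotational dynamics
                self.v = 0.0    # linear velocity command [m/s]
                self.w = 0.0    # angular velocity command [rad/s]

            def update(self, f_push, torque, dt):
                """Integrate the virtual dynamics one step; return (v, w)."""
                self.v += (f_push - self.damping * self.v) / self.mass * dt
                self.w += (torque - self.rot_damping * self.w) / self.inertia * dt
                return self.v, self.w

        # --- Position-keeping loop: slow down when the LRF-estimated leg  ---
        # --- distance grows (user falling behind), speed up when it shrinks ---
        def position_correction(leg_distance, desired=0.45, k_p=1.5):
            return k_p * (desired - leg_distance)   # velocity correction [m/s]

        force_loop = AdmittanceController()
        dt = 0.01                                   # 100 Hz control rate
        for _ in range(100):                        # 1 s with a constant push
            v, w = force_loop.update(f_push=20.0, torque=1.0, dt=dt)
            v += position_correction(leg_distance=0.50)   # user slightly behind
        print(f"v = {v:.2f} m/s, w = {w:.2f} rad/s")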