
    Dynamic bayesian networks for learning interactions between assistive robotic walker and human users

    Detection of individuals' intentions and actions from a stream of human behaviour is an open problem. Yet for robotic agents to be truly perceived as human-friendly entities, they need to respond naturally to physical interactions with the surrounding environment, most notably with the user. This paper proposes a generative probabilistic approach in the form of Dynamic Bayesian Networks (DBN) to seamlessly account for users' attitudes. A model is presented which can learn to recognize a subset of possible actions by the user of a gait-stability support power rollator walker, such as standing up, sitting down or assistive strolling, and adapt the behaviour of the device accordingly. The communication between the user and the device is implicit, without any explicit intention channel such as a keypad or voice. The end result is a decision-making mechanism that best matches the user's cognitive attitude towards a set of assistive tasks, effectively incorporating the evolving activity model of the user in the process. The proposed framework is evaluated in real-life conditions. © 2010 Springer-Verlag Berlin Heidelberg
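    The core machinery this abstract describes, recursive Bayesian state estimation over a small set of user activities, can be sketched in a few lines. The activity labels, transition probabilities and observation model below are invented for illustration; the paper learns such parameters from data.

```python
import numpy as np

# Hypothetical activity set for a walker user (assumed for this sketch).
ACTIONS = ["sitting_down", "standing_up", "strolling"]

# P(action_t | action_{t-1}): activities tend to persist between frames.
T = np.array([[0.90, 0.05, 0.05],
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])

# P(observation | action): rows = actions, columns = a discretised
# force/velocity cue measured on the walker (assumed encoding).
O = np.array([[0.7, 0.2, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.1, 0.8]])

def filter_step(belief, obs):
    """One step of recursive Bayesian filtering: predict, then update."""
    predicted = T.T @ belief           # predict with the transition model
    updated = predicted * O[:, obs]    # weight by observation likelihood
    return updated / updated.sum()     # renormalise to a distribution

belief = np.full(3, 1.0 / 3.0)         # uniform prior over activities
for obs in [2, 2, 2, 2]:               # repeated "strolling-like" cues
    belief = filter_step(belief, obs)

print(ACTIONS[int(np.argmax(belief))])  # the device adapts to this estimate
```

    A full DBN adds more hidden variables and learned dependencies, but this predict-update loop is the inference pattern underneath.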

    Perceiving user's intention-for-interaction: A probabilistic multimodal data fusion scheme

    Understanding people's intention, be it action or thought, plays a fundamental role in establishing coherent communication amongst people, especially in non-proactive robotics, where the robot has to understand explicitly when to start an interaction in a natural way. In this work, a novel approach is presented to detect people's intention-for-interaction. The proposed detector fuses multimodal cues, including estimated head pose, shoulder orientation and vocal activity detection, using a probabilistic discrete-state Hidden Markov Model. The multimodal detector achieves up to 80% correct detection rates, improving on purely audio- and RGB-D-based variants.
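    The fusion scheme this abstract outlines can be illustrated with a two-state HMM (no intention vs. intention-for-interaction) whose observation likelihood combines the three cues, here assumed conditionally independent given the hidden state. All probabilities below are invented for the sketch, not taken from the paper.

```python
# Hypothetical two-state intention HMM with naive-Bayes cue fusion.
STATES = ["no_intention", "intention"]

# Transition model: intention states tend to persist across frames.
trans = {"no_intention": {"no_intention": 0.8, "intention": 0.2},
         "intention":    {"no_intention": 0.2, "intention": 0.8}}

# P(cue fires | state) for each modality (assumed values).
cue_model = {
    "head_pose_towards_robot": {"no_intention": 0.2, "intention": 0.9},
    "shoulders_towards_robot": {"no_intention": 0.3, "intention": 0.8},
    "voice_activity":          {"no_intention": 0.1, "intention": 0.6},
}

def likelihood(state, cues):
    """Fuse the binary cues, assumed independent given the hidden state."""
    p = 1.0
    for name, fired in cues.items():
        q = cue_model[name][state]
        p *= q if fired else 1.0 - q
    return p

def filter_step(belief, cues):
    """Forward-algorithm step: predict via transitions, update via cues."""
    new = {}
    for s in STATES:
        pred = sum(belief[r] * trans[r][s] for r in STATES)
        new[s] = pred * likelihood(s, cues)
    z = sum(new.values())
    return {s: p / z for s, p in new.items()}

belief = {"no_intention": 0.9, "intention": 0.1}
frame = {"head_pose_towards_robot": True,
         "shoulders_towards_robot": True,
         "voice_activity": True}
for _ in range(3):          # a few frames of consistent engagement cues
    belief = filter_step(belief, frame)
```

    After a few frames of agreeing cues the posterior mass concentrates on "intention", which is the decision the robot would act on.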

    Learning Dynamic Systems for Intention Recognition in Human-Robot-Cooperation

    This thesis is concerned with intention recognition for a humanoid robot and investigates how the challenges of uncertain and incomplete observations, a high degree of detail in the models used, and real-time inference may be addressed by modeling the human rationale as hybrid, dynamic Bayesian networks and performing inference with these models. The key focus lies on the automatic identification of the employed nonlinear stochastic dependencies and on situation-specific inference.

    Intention Recognition for Partial-Order Plans Using Dynamic Bayesian Networks

    In this paper, a novel probabilistic approach to intention recognition for partial-order plans is proposed. The key idea is to exploit independences between subplans to substantially reduce the state space sizes in the compiled Dynamic Bayesian Networks. This makes inference more efficient. The main contributions are the computationally exploitable definition of subplan structures, the introduction of a novel Layered Intention Model and a Dynamic Bayesian Network representation with an inference mechanism that exploits consecutive and concurrent subplans' independences. The presented approach reduces the state space to the order of the most complex subplan and requires only minor changes in the standard inference mechanism. The practicability of this approach is demonstrated by recognizing the process of shelf assembly.
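    The state-space claim is worth making concrete. The arithmetic below is purely illustrative (the subplan sizes are invented, and this is not the paper's algorithm): a flat model must track every combination of subplan progress, while a factored model that exploits subplan independence is governed by the largest subplan alone.

```python
# Hypothetical partial-order plan with three independent subplans.
subplan_steps = [4, 5, 6]

# A flat DBN over the whole plan represents every combination of
# subplan progress: the state space is the product of the sizes.
flat_states = 1
for n in subplan_steps:
    flat_states *= n

# Exploiting independence, each subplan is tracked in its own factor,
# so the cost is on "the order of the most complex subplan".
factored_states = max(subplan_steps)

print(flat_states, factored_states)  # → 120 6
```

    Even at this toy scale the gap is a factor of twenty; it grows multiplicatively with each additional independent subplan.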

    The development of test action bank for active robot learning

    A thesis submitted to the University of Bedfordshire, in fulfilment of the requirements for the degree of Master of Science by research.

    In the rapidly expanding service robotics research area, interactions between robots and humans become increasingly common as more and more jobs will require cooperation between robots and their human users. It is therefore important to address cooperation between a robot and its user. Active Robot Learning (ARL) is a promising approach which facilitates a robot to develop high-order beliefs by actively performing test actions in order to obtain its user's intention from his responses to those actions. Test actions are crucial to ARL. This study carried out primary research on developing a Test Action Bank (TAB) to provide test actions for ARL. In this study, a verb-based task classifier was developed to extract tasks from the user's commands. Taught tasks and their corresponding test actions were proposed and stored in a database to establish the TAB. A backward test-action retrieval method was used to locate a task in a task tree and retrieve its test actions from the TAB. A simulation environment was set up with a service robot model and a user model to test the TAB and demonstrate some test actions. Simulations were also performed in this study; the results showed that the TAB can successfully provide test actions for different tasks and that the proposed service robot model can demonstrate them.

    Towards Intuitive Human-Robot Cooperation

    Human-robot cooperation calls for the treatment of human-machine communication channels, especially if humanoid robots are involved. In this paper, we consider implicit nonverbal channels given by recognizing the partner's intention and proactively executing tasks. We propose a method that keeps the human in the loop and allows for the systematic reduction of the uncertainty inherent in implicit cooperation. We present a benchmark scenario as well as preliminary implementation results.

    A multi-modal perception based assistive robotic system for the elderly

    Edited by Giovanni Maria Farinella, Takeo Kanade, Marco Leo, Gerard G. Medioni and Mohan Trivedi.

    In this paper, we present a multi-modal perception based framework to realize a non-intrusive domestic assistive robotic system. It is non-intrusive in that it only starts interaction with a user when it detects the user's intention to do so. All the robot's actions are based on multi-modal perception, which includes user detection based on RGB-D data, detection of the user's intention-for-interaction from RGB-D and audio data, and communication via user-distance-mediated speech recognition. The utilization of multi-modal cues in different parts of the robotic activity paves the way to successful robotic runs (94% success rate). Each presented perceptual component is systematically evaluated using appropriate datasets and evaluation metrics. Finally, the complete system is fully integrated on the PR2 robotic platform and validated through system sanity-check runs and user studies with the help of 17 volunteer elderly participants.

    Semiotics and Human-Robot Interaction

    Keywords: semi-autonomous robot, human-robot interaction, semiotics.

    This paper describes a robot control architecture supported on a human-robot interaction model obtained directly from semiotics concepts. The architecture is composed of a set of objects defined after a semiotic sign model. Simulation experiments using unicycle robots are presented that illustrate the interactions within a team of robots equipped with skills similar to those used in human-robot interactions.