2 research outputs found

    Encoding bi-manual coordination patterns from human demonstrations

    ABSTRACT Humans perform tasks such as bowl mixing bi-manually, but programming them on a robot can be challenging, especially in tasks that require force control or online stiffness modulation. In this paper, we first propose a user-friendly setup for demonstrating bi-manual tasks while collecting complementary information on the motion and forces sensed on a robotic arm, as well as the human hand configuration and grasp information. Second, to learn the task, we propose a method for extracting task constraints for each arm and coordination patterns between the arms. We use a statistical encoding of the data based on the extracted constraints and reproduce the task using a Cartesian impedance controller.
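
    The reproduction step relies on a Cartesian impedance controller. As an illustrative aside, the minimal sketch below shows the textbook impedance law (a virtual spring-damper in Cartesian space mapped to joint torques through the arm Jacobian); it is not the authors' implementation, and the gains, Jacobian, and setpoints are placeholder assumptions:

```python
import numpy as np

def cartesian_impedance_torques(x, xd, x_des, xd_des, J, K, D):
    """Textbook Cartesian impedance law (illustrative sketch, not the paper's code).

    x, xd         : current end-effector pose and velocity, shape (6,)
    x_des, xd_des : desired pose and velocity from the learned model, shape (6,)
    J             : 6xN arm Jacobian at the current joint configuration
    K, D          : 6x6 stiffness and damping matrices (can be modulated online)
    """
    # Virtual spring-damper wrench in Cartesian space:
    #   F = K (x_des - x) + D (xd_des - xd)
    wrench = K @ (x_des - x) + D @ (xd_des - xd)
    # Map the wrench to joint torques through the Jacobian transpose.
    return J.T @ wrench

# Example with placeholder values for a 7-DoF arm.
K = np.diag([400.0] * 3 + [30.0] * 3)   # translational / rotational stiffness
D = 2.0 * np.sqrt(K)                     # roughly critically damped
J = np.random.randn(6, 7)                # stand-in Jacobian
x, xd = np.zeros(6), np.zeros(6)
x_des = np.array([0.05, 0.0, 0.0, 0.0, 0.0, 0.0])
xd_des = np.zeros(6)
tau = cartesian_impedance_torques(x, xd, x_des, xd_des, J, K, D)
```

    The online stiffness modulation mentioned in the abstract amounts to varying K (and, correspondingly, D) over time rather than keeping them fixed.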

    Learning Task Priorities from Demonstrations

    Bimanual operations in humanoids offer the possibility of carrying out more than one manipulation task at the same time, which in turn introduces the problem of task prioritization. We address this problem from a learning-from-demonstration perspective by extending the Task-Parameterized Gaussian Mixture Model (TP-GMM) to Jacobian and null-space structures. The proposed approach is tested on bimanual skills but can be applied in any scenario where the prioritization between potentially conflicting tasks needs to be learned. We evaluate the proposed framework in two different tasks with humanoids that require learning priorities, and in a loco-manipulation scenario, showing that the approach can be exploited to learn the prioritization of multiple tasks in parallel.
    Comment: Accepted for publication in the IEEE Transactions on Robotics
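
    As background for the null-space structure mentioned above, the sketch below illustrates classical strict two-level task prioritization, where a secondary task is resolved in the null space of the primary task's Jacobian. It is a generic illustration with placeholder Jacobians and velocities, not the paper's TP-GMM-based formulation:

```python
import numpy as np

def prioritized_joint_velocities(J1, dx1, J2, dx2, damping=1e-6):
    """Two-level strict task priority (illustrative sketch).

    J1, dx1 : Jacobian and desired velocity of the primary task
    J2, dx2 : Jacobian and desired velocity of the secondary task
    """
    # Damped pseudo-inverse of the primary Jacobian.
    J1_pinv = J1.T @ np.linalg.inv(J1 @ J1.T + damping * np.eye(J1.shape[0]))
    # Null-space projector of the primary task: joint motions in this
    # subspace do not disturb task 1.
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1
    # The secondary task acts only through the null space of the primary task.
    J2N = J2 @ N1
    J2N_pinv = J2N.T @ np.linalg.inv(J2N @ J2N.T + damping * np.eye(J2.shape[0]))
    dq = J1_pinv @ dx1 + N1 @ (J2N_pinv @ (dx2 - J2 @ (J1_pinv @ dx1)))
    return dq

# Placeholder example: 7-DoF arm, 3-D position targets for both tasks.
J1, J2 = np.random.randn(3, 7), np.random.randn(3, 7)
dx1, dx2 = np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])
dq = prioritized_joint_velocities(J1, dx1, J2, dx2)
```

    Here the projector N1 guarantees that the secondary task cannot disturb the primary one; what the paper learns from demonstrations is, roughly, how potentially conflicting tasks should be placed within such a hierarchy.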