4 research outputs found

    Incremental Learning for Robot Perception through HRI

    Scene understanding and object recognition are difficult to achieve yet crucial skills for robots. Recently, Convolutional Neural Networks (CNNs) have shown success in this task. However, there is still a gap between their performance on image datasets and in real-world robotics scenarios. We present a novel paradigm for incrementally improving a robot's visual perception through active human interaction. In this paradigm, the user introduces novel objects to the robot by means of pointing and voice commands. Given this information, the robot visually explores the object and adds images of it to re-train the perception module. Our base perception module builds on recent developments in deep-learning-based object detection and recognition. Our method combines state-of-the-art CNNs from off-line batch learning with human guidance, robot exploration, and incremental on-line learning.
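
    To illustrate the kind of incremental on-line learning step the abstract describes, the sketch below adds a new object class to a CNN classifier and fine-tunes on images gathered during robot exploration. The replay buffer, output-layer surgery, and training loop are illustrative assumptions in PyTorch, not the paper's exact pipeline.

    # Minimal sketch of incremental class addition for an image classifier (PyTorch).
    # The replay buffer, model surgery, and training loop are illustrative
    # assumptions, not the exact pipeline described in the paper.
    import torch
    import torch.nn as nn
    from torchvision import models

    class IncrementalRecognizer:
        def __init__(self, num_classes):
            self.model = models.resnet18(weights="IMAGENET1K_V1")
            self.model.fc = nn.Linear(self.model.fc.in_features, num_classes)
            self.replay = []  # (image_tensor, label) pairs kept from earlier sessions

        def add_class(self):
            """Grow the output layer by one unit for a newly introduced object."""
            old_fc = self.model.fc
            new_fc = nn.Linear(old_fc.in_features, old_fc.out_features + 1)
            with torch.no_grad():
                new_fc.weight[:old_fc.out_features] = old_fc.weight
                new_fc.bias[:old_fc.out_features] = old_fc.bias
            self.model.fc = new_fc
            return old_fc.out_features  # label id assigned to the new object

        def finetune(self, new_images, new_label, epochs=3, lr=1e-4):
            """Re-train on exploration images of the new object plus the replay buffer.

            new_images: preprocessed CHW tensors (e.g. 3x224x224) of the novel object.
            """
            data = self.replay + [(img, new_label) for img in new_images]
            opt = torch.optim.Adam(self.model.parameters(), lr=lr)
            loss_fn = nn.CrossEntropyLoss()
            self.model.train()
            for _ in range(epochs):
                for img, label in data:
                    opt.zero_grad()
                    logits = self.model(img.unsqueeze(0))
                    loss = loss_fn(logits, torch.tensor([label]))
                    loss.backward()
                    opt.step()
            self.replay.extend((img, new_label) for img in new_images)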

    Hand and Arm Gesture-based Human-Robot Interaction: A Review

    The study of Human-Robot Interaction (HRI) aims to create close and friendly communication between humans and robots. In human-centered HRI, an essential aspect of implementing a successful and effective HRI is building a natural and intuitive interaction, including both verbal and nonverbal communication. As a prevalent nonverbal communication channel, hand and arm gestures occur ubiquitously in our daily lives. A considerable amount of work on gesture-based HRI is scattered across various research domains, yet a systematic understanding of this work is still lacking. This paper provides a comprehensive review of gesture-based HRI and focuses on advanced findings in this area. Following the stimulus-organism-response framework, the review covers: (i) generation of human gestures (stimulus); (ii) robot recognition of human gestures (organism); and (iii) robot reaction to human gestures (response). In addition, the review summarizes the research status of each element in the framework and analyzes the advantages and disadvantages of related works. In the last part, the paper discusses current research challenges in gesture-based HRI and suggests possible future directions.
    Comment: 10 pages, 1 figure
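
    As one concrete reading of the stimulus-organism-response framework used to structure the review, the sketch below wires a placeholder gesture recognizer to a table of robot reactions. The gesture labels, the recognizer rule, and the action mapping are hypothetical; they only illustrate how the three elements of the framework compose.

    # Schematic sketch of the stimulus-organism-response loop in gesture-based HRI.
    # Gesture classes, recognizer, and command mapping below are hypothetical
    # illustrations of the framework, not a system from any reviewed paper.
    from dataclasses import dataclass
    from typing import Callable, Dict, List, Tuple

    @dataclass
    class Gesture:                      # stimulus: hand/arm gesture observed by the robot
        name: str
        landmarks: List[Tuple[float, float]]  # e.g. keypoints from a vision front-end

    def recognize(landmarks: List[Tuple[float, float]]) -> str:
        """Organism: map observed landmarks to a gesture label (placeholder rule).

        A real system would use a trained classifier (HMM, CNN, transformer, ...).
        """
        return "wave" if landmarks else "none"

    # Response: map recognized gesture labels to robot actions.
    ACTIONS: Dict[str, Callable[[], None]] = {
        "wave": lambda: print("robot: greet the user"),
        "point": lambda: print("robot: look toward the pointed location"),
        "stop": lambda: print("robot: halt motion"),
        "none": lambda: None,
    }

    def hri_step(gesture: Gesture) -> None:
        label = recognize(gesture.landmarks)   # organism
        ACTIONS.get(label, ACTIONS["none"])()  # response

    hri_step(Gesture(name="demo", landmarks=[(0.1, 0.2), (0.3, 0.4)]))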

    From Human Physical Interaction To Online Motion Adaptation Using Parameterized Dynamical Systems

    In this work, we present an adaptive motion planning approach that lets impedance-controlled robots modify their tasks based on human physical interaction. We use a class of parameterized, time-independent dynamical systems for motion generation, where modulating the parameters allows for motion flexibility. To adapt to human interaction, we update the parameter of the dynamical system so as to reduce the tracking error, i.e., the discrepancy between the desired trajectory generated by the dynamical system and the real trajectory influenced by the human. We provide a theoretical analysis and several simulations of our method. Finally, we validate our approach in real-world experiments with a 7-DOF KUKA LWR 4+ robot performing tasks such as polishing and pick-and-place.
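
    To make the parameter-adaptation idea concrete, the sketch below uses a simple linear, time-independent dynamical system whose single speed parameter is reduced whenever the tracking error grows, so the generated motion yields to human pushes. The linear DS form, the update rule, and the gains are assumptions for illustration, not the paper's formulation.

    # Minimal sketch of a parameterized dynamical system whose parameter is
    # adapted online to reduce tracking error under human physical interaction.
    # The linear DS, the update rule, and the gains are illustrative assumptions.
    import numpy as np

    class AdaptiveDS:
        def __init__(self, target, theta=1.0, adapt_rate=0.5):
            self.target = np.asarray(target, dtype=float)  # attractor x*
            self.theta = theta                             # DS parameter (speed scaling)
            self.adapt_rate = adapt_rate

        def desired_velocity(self, x):
            """Time-independent DS: x_dot = -theta * (x - x*)."""
            return -self.theta * (np.asarray(x, dtype=float) - self.target)

        def adapt(self, x_desired, x_real):
            """Shrink the parameter when the human pushes the robot away from the
            DS trajectory (large tracking error), so the motion yields to the human."""
            error = np.linalg.norm(np.asarray(x_real) - np.asarray(x_desired))
            self.theta = max(0.1, self.theta - self.adapt_rate * error)

    # One control cycle: generate the desired motion, observe the (human-perturbed)
    # real state from the impedance-controlled robot, then adapt the parameter.
    ds = AdaptiveDS(target=[0.5, 0.0, 0.3])
    x_des = np.array([0.2, 0.1, 0.3])
    x_real = np.array([0.25, 0.2, 0.3])   # deviates due to human contact
    v = ds.desired_velocity(x_des)
    ds.adapt(x_des, x_real)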