
    Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation

    The development of robots that can physically cooperate with humans has attracted interest over the last decades. This effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intent in terms of intermediate or terminal goals in physical tasks. However, to work side by side with people, an autonomous robot additionally needs detailed information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. To our knowledge, no study has yet focused on characterizing such information. In this sense, this work is a pioneering effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns and propose five feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that a multi-class support vector machine (SVM) classifier achieves a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing the most informative features selected by the Minimum Redundancy Maximum Relevance (mRMR) method.
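
    As a rough illustration of the kind of pipeline the abstract describes (mRMR feature selection feeding a multi-class SVM), the sketch below uses scikit-learn with a greedy mRMR approximation: mutual information as the relevance term and mean absolute correlation with already-chosen features as a cheap redundancy proxy. The placeholder data, the feature count, and this particular mRMR variant are assumptions for illustration, not the paper's actual features or implementation.

        import numpy as np
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def greedy_mrmr(X, y, k):
            # Relevance of each feature to the interaction-pattern label.
            relevance = mutual_info_classif(X, y, random_state=0)
            selected, remaining = [], list(range(X.shape[1]))
            for _ in range(k):
                def score(j):
                    if not selected:
                        return relevance[j]
                    # Redundancy proxy: mean |correlation| with chosen features.
                    red = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                   for s in selected])
                    return relevance[j] - red
                best = max(remaining, key=score)
                selected.append(best)
                remaining.remove(best)
            return selected

        # X: per-window force/velocity/power features, y: pattern labels
        # (placeholder random data standing in for the real dataset).
        X, y = np.random.randn(200, 30), np.random.randint(0, 3, 200)
        idx = greedy_mrmr(X, y, k=10)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        print(cross_val_score(clf, X[:, idx], y, cv=5).mean())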

    A Shared-Control Teleoperation Architecture for Nonprehensile Object Transportation

    This article proposes a shared-control teleoperation architecture for robot manipulators transporting an object on a tray. Unlike many existing studies on remotely operated robots with firm grasping capabilities, we consider the case in which the object can, in principle, break contact with the robot end-effector. The proposed shared-control approach automatically regulates the remote robot motion commanded by the user and the end-effector orientation to prevent the object from sliding over the tray. Furthermore, the human operator is provided with haptic cues conveying the discrepancy between the commanded and executed robot motion, which assist the operator throughout task execution. We carried out trajectory-tracking experiments employing an autonomous 7-degree-of-freedom (DoF) manipulator and compared the proposed approach with two alternative control schemes (i.e., constant tray orientation and no motion adjustment). We also carried out a human-subjects study involving 18 participants, in which a 3-DoF haptic device was used to teleoperate the robot's linear motion and display haptic cues to the operator. In all experiments, the results clearly show that our control approach outperforms the other solutions in terms of sliding prevention, robustness, command tracking, and user preference.
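
    The geometric intuition behind regulating the tray orientation can be sketched with a friction-cone check: the contact force the tray must apply to the object is proportional to the commanded acceleration minus gravity, and if its tangential component exceeds the friction limit, tilting the tray normal toward that force direction zeroes the tangential load. The snippet below is only this intuition under simple rigid-body assumptions (point-mass object, known friction coefficient mu); it is not the paper's actual controller.

        import numpy as np

        G = np.array([0.0, 0.0, -9.81])  # gravity

        def tray_normal(a_cmd, mu):
            """Unit normal the tray should have so the object does not slide,
            given the commanded linear acceleration a_cmd of the end effector."""
            f = a_cmd - G                       # specific contact force (per unit mass)
            up = np.array([0.0, 0.0, 1.0])
            f_n = f @ up                        # normal component for a flat tray
            f_t = np.linalg.norm(f - f_n * up)  # tangential component for a flat tray
            if f_n > 0.0 and f_t <= mu * f_n:   # inside the friction cone: stay flat
                return up
            return f / np.linalg.norm(f)        # otherwise align the normal with f

        # Example: accelerating at 4 m/s^2 sideways with a slippery tray (mu = 0.2)
        # forces a tilt; at rest the tray stays flat.
        print(tray_normal(np.array([4.0, 0.0, 0.0]), mu=0.2))
        print(tray_normal(np.zeros(3), mu=0.2))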

    Admittance control for collaborative dual-arm manipulation

    Human-robot collaboration is an appealing solution for increasing the flexibility of production lines. In this context, we propose a kinematic control strategy for dual-arm robotic platforms that physically collaborate with human operators. Based on admittance control, our approach aims to improve the performance of object transportation tasks by acting on two levels: estimating and compensating for gravity effects on the one hand, and considering human intention in the cooperative task space on the other. An experimental study using virtual reality shows the effectiveness of our method in terms of reduced human energy expenditure.
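
    A minimal sketch of the two ingredients named above, under assumed parameters (the inertia M, damping D, and sensor sign convention are illustrative, not the paper's tuning): the payload's weight is first removed from the measured wrench so that only the human's intentional force drives the admittance dynamics M*a + D*v = f_human, which is then integrated into a Cartesian motion command.

        import numpy as np

        class Admittance:
            """Translational admittance law  M*a + D*v = f_human, where
            f_human is the measured force with the payload weight removed."""
            def __init__(self, M=5.0, D=20.0, dt=0.002):
                self.M, self.D, self.dt = M, D, dt
                self.v = np.zeros(3)

            def step(self, f_measured, payload_mass):
                g = np.array([0.0, 0.0, -9.81])
                # Gravity compensation (sign depends on the sensor convention).
                f_human = f_measured - payload_mass * g
                a = (f_human - self.D * self.v) / self.M  # admittance dynamics
                self.v += a * self.dt
                return self.v * self.dt  # Cartesian increment for the kinematic controller

        # A steady 10 N pull along x gradually drags the commanded frame along.
        ctrl = Admittance()
        for _ in range(5):
            print(ctrl.step(np.array([10.0, 0.0, 0.0]), payload_mass=0.0))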

    Representation and control of coordinated-motion tasks for human-robot systems

    It is challenging for robots to perform the variety of tasks found in human environments, since many human-centered tasks require coordination of both hands and may often involve cooperation with another human. Although such tasks require different types of coordinated movement, most existing methodologies have focused only on specific types of coordination. This thesis addresses the description and control of coordinated-motion tasks for human-robot systems, i.e., humanoid robots as well as multi-robot and human-robot systems.

    First, for bimanually coordinated-motion tasks in dual-manipulator systems, we propose the Extended-Cooperative-Task-Space (ECTS) representation, which extends the existing Cooperative-Task-Space (CTS) representation based on kinematic models of human bimanual movement from biomechanics. The ECTS representation can express the whole spectrum of dual-arm motion/force coordination using two sets of ECTS motion/force variables in a unified manner. The type of coordination can be selected through two meaningful coefficients, and during coordinated-motion tasks each set of variables directly describes a different aspect of the coordinated motion and force behavior. The operator can thus specify coordinated-motion/force tasks more intuitively in high-level descriptions, and the specified tasks can easily be reused in other situations with greater flexibility.

    Moreover, we present consistent procedures for using the ECTS representation to specify tasks in the upper-body and lower-body subsystems of humanoid robots, for manipulation and locomotion respectively. We also propose and discuss performance indices derived from the ECTS representation, which can be used to evaluate and optimize the performance of any type of dual-arm manipulation task. We show that using the ECTS representation for specifying both dual-arm manipulation and biped locomotion tasks greatly simplifies the motion-planning process, allowing the operator to focus on high-level task descriptions. Both upper-body and lower-body task specifications are demonstrated through whole-body task examples on a Hubo II+ robot carrying out dual-arm manipulation as well as biped locomotion in a simulation environment. We also present results from experiments on a dual-arm robot (Baxter) teleoperating various types of coordinated-motion tasks using a single 6D mouse interface.

    The specified upper- and lower-body tasks can be regarded as coordinated motions with constraints. To express the various constraints imposed across the whole body, we discuss the modeling of the whole-body structure and the associated computations for robotic systems with multiple kinematic chains. We then present a whole-body controller formulated as a quadratic program, which takes different types of constraints into account in a prioritized manner. We validate the whole-body controller in simulation on a Hubo II+ robot performing whole-body task examples subject to a number of motion and force constraints as well as actuation limits.

    Lastly, we discuss an extension of the ECTS representation, the Hierarchical Extended-Cooperative-Task-Space (H-ECTS) framework, which uses tree-structured graphical representations of coordinated-motion tasks for multi-robot and human-robot systems. The H-ECTS framework is validated through experiments on two Baxter robots cooperating with each other as well as with an additional human partner.
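
    To make the "two meaningful coefficients" concrete, here is one plausible reading of how CTS-style absolute/relative variables generalize: a weight alpha deciding which arm dominates the shared motion and a scale beta on the relative motion. The names alpha and beta and this exact parameterization are illustrative assumptions; the thesis' precise formulation may differ.

        import numpy as np

        def ects(x1, x2, alpha=0.5, beta=1.0):
            """Illustrative ECTS-style coordination variables for two
            end-effector positions x1, x2. alpha = 0.5, beta = 1 recovers the
            classical CTS midpoint/relative pair; alpha = 0 makes arm 1 the
            leader; beta = 0 freezes relative motion, as when two arms rigidly
            transport a shared object."""
            x_abs = (1.0 - alpha) * x1 + alpha * x2  # absolute (shared) variable
            x_rel = beta * (x2 - x1)                 # relative variable
            return x_abs, x_rel

        def arms_from_ects(x_abs, x_rel, alpha=0.5, beta=1.0):
            """Inverse map: arm targets from a task-space command
            (requires beta != 0)."""
            x2 = x_abs + (1.0 - alpha) * x_rel / beta
            x1 = x2 - x_rel / beta
            return x1, x2

    Commanding only x_abs while holding x_rel fixed then moves the carried object without stretching the grasp, which is what makes the high-level task descriptions mentioned above reusable across coordination types.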

    Human-Mechanical system interaction in Virtual Reality

    The present work aims to show the great potential of Virtual Reality (VR) technologies in the field of Human-Robot Interaction (HRI). It is foreseeable that in the not-too-distant future, cooperating robots will be increasingly present in human environments. Many authors believe that after the current information revolution we will witness a so-called "robotics revolution", with the spread of increasingly intelligent and autonomous robots capable of moving within our own environments. Since these machines must be able to interact with human beings safely, new design tools for the study of HRI are needed. The author believes that VR is an ideal design tool for studying the interaction between humans and automatic machines, since it allows designers to interact in real time with virtual robotic systems and to evaluate different control algorithms without the need for physical prototypes; it also shields the user from any risk related to physical experimentation. VR technologies also have a more immediate application in HRI: the study of the usability of interfaces for real-time-controlled robots. Such robots, for example microsurgery robots or teleoperated robots working in hostile environments, are already quite common. VR allows designers to evaluate the usability of such interfaces by relating their physical input to a virtual output. In particular, the author has developed a new software application for simulating robots and, more generally, mechanical systems in a virtual environment. The user can interact with one or more virtual manipulators and control them in real time by means of several input devices. Finally, an innovative approach to the modeling and control of a humanoid robot with a high degree of redundancy is discussed; a VR implementation of a virtual humanoid is useful for the study of both humanoid robots and human beings.
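
    The real-time control of virtual manipulators described above boils down to a fixed-rate loop in which device input is mapped to joint motion and the scene is redrawn each frame. The skeleton below is a generic sketch of such a loop; read_device and render are hypothetical stand-ins for device- and engine-specific calls, not functions from the author's application.

        import numpy as np

        def run(read_device, render, dt=1.0 / 60.0, steps=600):
            """Fixed-rate loop: an input device streams joint-velocity commands
            that drive a virtual 6-DoF manipulator within its joint limits."""
            q = np.zeros(6)            # joint angles of the virtual arm
            lo, hi = -np.pi, np.pi     # joint limits of the virtual model
            for _ in range(steps):
                dq = read_device()     # device axes mapped to joint velocities
                q = np.clip(q + dq * dt, lo, hi)
                render(q)              # redraw the virtual scene at the new pose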