
    A Visual-Based Shared Control Architecture for Remote Telemanipulation

    Cleaning up the past half century of nuclear waste represents the largest environmental remediation project in the whole of Europe. Nuclear waste must be sorted, segregated, and stored according to its radiation level in order to optimize maintenance costs. The objective of this work is to develop a shared control framework for the remote manipulation of objects using visual information. In the presented scenario, the human operator must control a system composed of two robotic arms, one equipped with a gripper and the other with a camera. In order to facilitate the operator's task, a subset of the gripper motions is assumed to be regulated by an autonomous algorithm exploiting the camera view of the scene. At the same time, the operator has control over the motions in the null space of the primary (autonomous) task by acting on a force feedback device. A novel force feedback algorithm is also proposed with the aim of informing the user about possible constraints of the robotic system such as, for instance, joint limits. Human/hardware-in-the-loop experiments with simulated slave robots and a real master device are finally reported to demonstrate the feasibility and effectiveness of the approach.
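    The division of labor described above follows the classic redundancy-resolution scheme: the autonomous visual task consumes some degrees of freedom, and the operator's commands are projected into the null space of the task Jacobian. A minimal sketch with a toy Jacobian (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def redundancy_resolution(J, x_dot_des, v_user):
    """Combine an autonomous primary task with user commands
    projected into its null space (classic redundancy resolution)."""
    J_pinv = np.linalg.pinv(J)            # task-space pseudoinverse
    N = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector of J
    # Primary (autonomous) task rate + user motion in the null space
    return J_pinv @ x_dot_des + N @ v_user

# Toy example: 3 joints, 2-dimensional primary task
J = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
q_dot = redundancy_resolution(J, np.array([0.1, 0.0]),
                              np.array([0.0, 0.0, 0.5]))
# The user's null-space command cannot perturb the primary task:
# J @ q_dot still equals the desired task rate [0.1, 0.0]
```

The projector `N` guarantees that whatever the operator commands, the camera-based primary task is executed unchanged.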

    Modulating Human Input for Shared Autonomy in Dynamic Environments


    Visual-Based Shared Control for Remote Telemanipulation with Integral Haptic Feedback

    Nowadays, one of the largest environmental challenges that European countries must face consists in dealing with the past half century of nuclear waste. In order to optimize maintenance costs, nuclear waste must be sorted, segregated, and stored according to its radiation level. Towards this end, in [1] we recently proposed a visual-based shared control architecture meant to assist a human operator in controlling two remote robotic arms (one equipped with a gripper and the other with a camera) during remote manipulation tasks of nuclear waste via a master device. The operator could then receive force cues informative of the feasibility of her/his motion commands during the task execution. The strategy presented in [1], albeit effective, suffers from a locality issue, since the operator can only provide instantaneous velocity commands (in a suitable task space) and receive instantaneous force feedback cues. On the other hand, the ability to 'steer' a whole future trajectory in task space, and to receive a corresponding integral force feedback along the whole planned trajectory (accounting for any constraint of the considered system), could significantly enhance the operator's performance, especially when dealing with complex manipulation tasks. The aim of this work is then to extend [1] towards a planning-based shared control architecture able to take these requirements into account. A human/hardware-in-the-loop experiment with simulated slave robots and a real master device is reported to demonstrate the feasibility and effectiveness of the proposed approach.
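    The integral force cue described above can be illustrated with a minimal sketch: a planned joint-space trajectory is sampled, and every sample close to a joint limit contributes a repulsive term, so the operator feels constraints along the whole plan rather than only at the current instant. The band width, gain, and averaging below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def integral_joint_limit_cue(trajectory, q_min, q_max, gain=1.0, band=0.1):
    """Accumulate a scalar force cue over a planned joint trajectory:
    each sample whose joints enter a soft band near a limit adds a
    repulsive contribution. The result summarizes constraint
    violations along the WHOLE plan, not just the current state."""
    cue = 0.0
    for q in trajectory:
        lower = np.maximum(0.0, (q_min + band) - q)  # near lower limit
        upper = np.maximum(0.0, q - (q_max - band))  # near upper limit
        cue += gain * float(np.sum(lower + upper))
    return cue / len(trajectory)  # average over the plan
```

A trajectory that stays well inside the limits yields a zero cue; one that grazes a limit yields a positive cue the operator can feel before executing the plan.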

    Perception-Aware Human-Assisted Navigation of Mobile Robots on Persistent Trajectories

    We propose a novel shared control and active perception framework combining the skills of a human operator in accomplishing complex tasks with the capabilities of a mobile robot in autonomously maximizing the information acquired by the onboard sensors to improve its state estimation. The human operator modifies at runtime suitable properties of a persistent cyclic path followed by the robot so as to achieve the given task (e.g., exploring an environment). At the same time, the path is concurrently adjusted by the robot with the aim of maximizing the collected information. This combined behavior enables the human operator to control the high-level task of the robot while the latter autonomously improves its state estimation. The user's commands are included in a task priority framework together with other relevant constraints, while the quality of the acquired information is measured by the Schatten norm of the Constructibility Gramian. The user is also provided with guidance feedback pointing in the direction that would maximize this information metric. We evaluated the proposed approach in two human-subject studies, testing the effectiveness of including the Constructibility Gramian in the task priority framework as well as the viability of providing either visual or haptic feedback to convey this information metric.
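    The Schatten p-norm used as the information metric is simply the l_p norm of the singular values of the Constructibility Gramian; a minimal sketch (the toy Gramian below is illustrative):

```python
import numpy as np

def schatten_norm(G, p=2):
    """Schatten p-norm of a matrix: the l_p norm of its singular
    values. For a symmetric positive semi-definite Gramian this gives
    a scalar summary of how much information the matrix encodes
    (p=1 reduces to the trace, p=2 to the Frobenius norm)."""
    s = np.linalg.svd(G, compute_uv=False)
    return float(np.sum(s**p) ** (1.0 / p))

G = np.diag([4.0, 1.0])    # toy 2x2 Gramian
metric = schatten_norm(G, p=2)   # sqrt(4^2 + 1^2) = sqrt(17)
```

Maximizing such a scalar over the path parameters is what lets the robot trade off the user's commands against estimation quality inside the task priority framework.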

    A Shared-Control Teleoperation Architecture for Nonprehensile Object Transportation

    This article proposes a shared-control teleoperation architecture for robot manipulators transporting an object on a tray. Unlike many existing studies on remotely operated robots with firm grasping capabilities, we consider the case in which, in principle, the object can break contact with the robot end-effector. The proposed shared-control approach automatically regulates the remote robot motion commanded by the user and the end-effector orientation to prevent the object from sliding over the tray. Furthermore, the human operator is provided with haptic cues informing about the discrepancy between the commanded and executed robot motion, which assist the operator throughout the task execution. We carried out trajectory tracking experiments employing an autonomous 7-degree-of-freedom (DoF) manipulator and compared the results obtained using the proposed approach with two different control schemes (i.e., constant tray orientation and no motion adjustment). We also carried out a human-subjects study involving 18 participants, in which a 3-DoF haptic device was used to teleoperate the robot linear motion and display haptic cues to the operator. In all experiments, the results clearly show that our control approach outperforms the other solutions in terms of sliding prevention, robustness, command tracking, and user preference.
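    The sliding condition such a controller must avoid can be illustrated with a simple Coulomb friction-cone test at a point contact. This is a deliberate simplification for illustration; the paper's full condition also involves the tray orientation and the object dynamics:

```python
import numpy as np

def object_slides(f_contact, normal, mu):
    """Coulomb friction-cone test: an object resting on a surface
    keeps sticking as long as the tangential component of the contact
    force stays within mu times the normal component. Returns True
    when the contact would slide (or separate)."""
    n = normal / np.linalg.norm(normal)
    f_n = float(f_contact @ n)                  # normal component
    f_t = np.linalg.norm(f_contact - f_n * n)   # tangential magnitude
    return bool(f_n <= 0.0 or f_t > mu * f_n)

# A large lateral acceleration pushes the required tangential force
# outside the friction cone, so the object would slide:
slides = object_slides(np.array([6.0, 0.0, 9.81]),
                       np.array([0.0, 0.0, 1.0]), mu=0.5)
```

Reorienting the tray (tilting the normal toward the commanded acceleration) is one way to bring the contact force back inside the cone, which is the intuition behind the orientation regulation described above.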

    A Haptic Shared-Control Architecture for Guided Multi-Target Robotic Grasping

    Although robotic telemanipulation has always been a key technology for the nuclear industry, little advancement has been seen over the last decades. Despite complex remote handling requirements, simple mechanically linked master-slave manipulators still dominate the field. Nonetheless, there is a pressing need for more effective robotic solutions able to significantly speed up the decommissioning of legacy radioactive waste. This paper describes a novel haptic shared-control approach for assisting a human operator in the sorting and segregation of different objects in a cluttered and unknown environment. A three-dimensional scan of the scene is used to generate a set of potential grasp candidates on the objects at hand. These grasp candidates are then used to generate guiding haptic cues, which assist the operator in approaching and grasping the objects. The haptic feedback is designed to be smooth and continuous as the user switches from one grasp candidate to the next, or from one object to another, avoiding discontinuities or abrupt changes. To validate our approach, we carried out two human-subject studies, enrolling 15 participants. We registered an average improvement of 20.8%, 20.1%, and 32.5% in terms of completion time, linear trajectory, and perceived effectiveness, respectively, between the proposed approach and standard teleoperation.
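    One way to obtain the smooth, continuous switching between grasp candidates described above is to blend the attraction toward all candidates with distance-based softmax weights, so the cue varies continuously as the user moves between candidates. This is an illustrative blending scheme under stated assumptions, not necessarily the paper's exact formulation:

```python
import numpy as np

def guidance_force(p, candidates, k=1.0, beta=5.0):
    """Smoothly blended attractive cue toward a set of grasp
    candidate positions. Each candidate is weighted by a softmax of
    its (negative) distance to the end-effector position p, so the
    resulting force has no discontinuity when the nearest candidate
    changes."""
    diffs = candidates - p                  # vectors toward candidates
    d = np.linalg.norm(diffs, axis=1)
    w = np.exp(-beta * d)
    w /= np.sum(w)                          # softmax weights, sum to 1
    return k * (w[:, None] * diffs).sum(axis=0)

# Midway between two symmetric candidates the pulls cancel smoothly,
# instead of snapping toward an arbitrary "nearest" one:
f = guidance_force(np.zeros(3),
                   np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]]))
```

As `beta` grows, the blend sharpens toward winner-take-all guidance; a moderate value keeps the transition between candidates gradual.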