
    A model-based residual approach for human-robot collaboration during manual polishing operations

    A fully robotized polishing of metallic surfaces may be insufficient for parts with complex geometric shapes, where manual intervention is still preferable. Within the EU SYMPLEXITY project, we consider tasks where manual polishing operations are performed in strict physical Human-Robot Collaboration (HRC) between a robot holding the part and a human operator equipped with an abrasive tool. During the polishing task, the robot should firmly keep the workpiece in a prescribed sequence of poses, by monitoring and resisting the external forces applied by the operator. However, the user may also wish to change the orientation of the part mounted on the robot, simply by pushing or pulling the robot body and thus changing its configuration. We propose a control algorithm that separates the external torques acting at the robot joints into two components: one due to the polishing forces applied at the end-effector level, the other due to the intentional physical interaction engaged by the human. The latter component is used to reconfigure the manipulator arm and, accordingly, its end-effector orientation. The workpiece position is instead kept fixed, by exploiting the intrinsic redundancy of this subtask. The controller uses an F/T sensor mounted at the robot wrist, together with our recently developed model-based technique (the residual method), which estimates online the joint torques due to contact forces/torques applied anywhere along the robot structure. In order to obtain a reliable residual, which is necessary to implement the control algorithm, an accurate robot dynamic model (including friction effects at the joints and drive gains) needs to be identified first. The complete dynamic identification and the proposed control method for the human-robot collaborative polishing task are illustrated on a 6R UR10 lightweight manipulator equipped with an ATI 6D sensor.
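
    The residual method referenced above is, in its standard form, a first-order observer driven by the generalized momentum. The sketch below is a minimal, illustrative implementation assuming an already-identified dynamic model (callables M, C, g for the mass matrix, Coriolis matrix and gravity vector) and logged joint data; it is not the SYMPLEXITY controller itself.

```python
import numpy as np

def momentum_residual(q_log, dq_log, tau_log, M, C, g, K, dt):
    """Generalized-momentum residual r(t), an online estimate of the
    external joint torques.

    q_log, dq_log, tau_log : (T, n) arrays of joint positions, velocities
        and commanded motor torques sampled at period dt.
    M, C, g : callables returning the identified mass matrix, Coriolis
        matrix and gravity vector (placeholders for an identified model).
    K : (n, n) diagonal residual gain matrix.
    """
    T, n = q_log.shape
    r = np.zeros((T, n))
    p0 = M(q_log[0]) @ dq_log[0]                  # initial generalized momentum
    integral = np.zeros(n)
    for k in range(1, T):
        q, dq, tau = q_log[k], dq_log[k], tau_log[k]
        # dot(p) = tau + C^T dq - g + tau_ext, so integrate the known part
        # plus the previous residual (first-order filter of tau_ext)
        integral += (tau + C(q, dq).T @ dq - g(q) + r[k - 1]) * dt
        r[k] = K @ (M(q) @ dq - p0 - integral)
    return r
```

    In the collaboration scheme described above, the wrist-measured polishing wrench could then be mapped through the Jacobian transpose and subtracted from this residual, leaving the component attributed to intentional contact along the arm.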

    Sensorless Physical Human-robot Interaction Using Deep-Learning

    Physical human-robot interaction has been an area of interest for decades. Collaborative tasks, such as joint compliance, demand high-quality joint torque sensing. While external torque sensors are reliable, they come with the drawbacks of being expensive and vulnerable to impacts. To address these issues, studies have been conducted to estimate external torques using only internal signals, such as joint states and current measurements. However, insufficient attention has been given to friction hysteresis approximation, which is crucial for tasks involving extensive dynamic-to-static state transitions. In this paper, we propose a deep-learning-based method that leverages a novel long-term memory scheme to achieve dynamics identification, accurately approximating the static hysteresis. We also introduce modifications to the well-known Residual Learning architecture, retaining high accuracy while reducing inference time. The robustness of the proposed method is illustrated through a joint compliance and a task compliance experiment. (Comment: 7 pages, ICRA 2024 submission)
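
    As a rough illustration of giving the estimator long-term memory so that history-dependent friction hysteresis can be captured, the following PyTorch sketch regresses external joint torques from internal signals only (joint states and motor currents). The architecture, dimensions and training data here are assumptions for illustration, not the paper's exact network.

```python
import torch
import torch.nn as nn

class ExternalTorqueEstimator(nn.Module):
    """Recurrent external-torque estimator driven by internal signals only.

    The LSTM carries state across time steps, so quantities that depend on
    the motion history (such as static friction hysteresis) can be
    approximated from joint positions, velocities and motor currents.
    """
    def __init__(self, n_joints=6, hidden=128):
        super().__init__()
        in_dim = 3 * n_joints                      # q, dq, motor currents
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_joints)

    def forward(self, x, state=None):
        h, state = self.lstm(x, state)             # x: (batch, time, 3*n_joints)
        return self.head(h), state                 # (batch, time, n_joints)

# One illustrative optimization step on dummy contact-free data, where the
# target external torque is zero and the network must explain the signals
# with dynamics and friction alone.
model = ExternalTorqueEstimator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 200, 18)                        # 8 windows, 200 steps, 6 joints
target = torch.zeros(8, 200, 6)
opt.zero_grad()
pred, _ = model(x)
loss = nn.functional.mse_loss(pred, target)
loss.backward()
opt.step()
```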

    Proprioceptive Robot Collision Detection through Gaussian Process Regression

    This paper proposes a proprioceptive collision detection algorithm based on Gaussian Process Regression. Compared to sensor-based collision detection and other proprioceptive algorithms, the proposed approach has minimal sensing requirements, since only the motor currents and the joint configurations are needed. The algorithm extends the standard Gaussian Process models adopted in learning the robot inverse dynamics, using a richer set of input locations and an ad-hoc kernel structure to model the complex and non-linear behaviors due to friction in quasi-static configurations. Tests performed on a Universal Robots UR10 show the effectiveness of the proposed algorithm in detecting when a collision has occurred. (Comment: Published at ACC 201)
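
    A minimal sketch of the proprioceptive idea, using scikit-learn in place of the paper's ad-hoc kernel: one GP per joint predicts the motor current expected in contact-free motion, and a collision is flagged when the measured current leaves the predictive confidence interval. The kernel choice and threshold rule are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_current_models(X, I):
    """Fit one GP per joint mapping the joint state to its motor current.

    X : (N, d) input locations (e.g. positions and velocities) logged on
        contact-free motions.
    I : (N, n) measured motor currents for the n joints.
    """
    models = []
    for j in range(I.shape[1]):
        kernel = RBF(length_scale=np.ones(X.shape[1])) + WhiteKernel()
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(X, I[:, j])
        models.append(gp)
    return models

def collision_flag(models, x, i_meas, n_sigma=3.0):
    """Flag a collision when any measured current falls outside the GP's
    predictive interval (a simple stand-in for the paper's detection test)."""
    for j, gp in enumerate(models):
        mean, std = gp.predict(x.reshape(1, -1), return_std=True)
        if abs(i_meas[j] - mean[0]) > n_sigma * std[0]:
            return True
    return False
```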

    A Framework of Hybrid Force/Motion Skills Learning for Robots

    Human factors and a human-centred design philosophy are highly desired in today's robotics applications such as human-robot interaction (HRI). Several studies have shown that endowing robots with human-like interaction skills can not only make them more likeable but also improve their performance. In particular, skill transfer by imitation learning can increase the usability and acceptability of robots for users without computer programming skills. In fact, besides positional information, the muscle stiffness of the human arm and the contact force with the environment also play important roles in understanding and generating human-like manipulation behaviours for robots, e.g., in physical HRI and tele-operation. To this end, we present a novel robot learning framework based on Dynamic Movement Primitives (DMPs), taking into consideration both the positional and the contact force profiles for human-robot skill transfer. Distinguished from conventional methods involving only motion information, the proposed framework combines two sets of DMPs, which are built to model the motion trajectory and the force variation of the robot manipulator, respectively. Thus, a hybrid force/motion control approach is taken to ensure accurate tracking and reproduction of the desired positional and force motor skills. Meanwhile, in order to simplify the control system, a momentum-based force observer is applied to estimate the contact force instead of employing force sensors. To deploy the learned motion-force robot manipulation skills to a broader variety of tasks, the generalization of these DMP models to actual situations is also considered. Comparative experiments have been conducted using a Baxter robot to verify the effectiveness of the proposed learning framework in real-world scenarios such as cleaning a table.
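
    A minimal single-DOF discrete DMP, shown only to make the structure of the framework concrete: one instance of this transformation system would be fit to the demonstrated position trajectory and a second one to the recorded contact-force profile, with the basis weights learned from a demonstration (e.g. by locally weighted regression, not shown here). All parameter values and names are illustrative assumptions.

```python
import numpy as np

class DMP1D:
    """Minimal discrete Dynamic Movement Primitive for a single DOF."""

    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.alpha, self.beta, self.alpha_x = alpha, beta, alpha_x
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))  # basis centres
        self.h = 1.0 / np.diff(self.c, append=self.c[-1] / 2) ** 2
        self.w = np.zeros(n_basis)          # learned from a demonstration

    def _forcing(self, x):
        psi = np.exp(-self.h * (x - self.c) ** 2)
        return x * (psi @ self.w) / (psi.sum() + 1e-10)

    def rollout(self, y0, goal, tau, dt, steps):
        """Integrate the transformation and canonical systems forward."""
        y, dy, x, out = y0, 0.0, 1.0, []
        for _ in range(steps):
            f = self._forcing(x)
            ddy = self.alpha * (self.beta * (goal - y) - dy) + f * (goal - y0)
            dy += ddy * dt / tau
            y += dy * dt / tau
            x += -self.alpha_x * x * dt / tau   # canonical phase decays to 0
            out.append(y)
        return np.array(out)

# With zero weights the primitive simply converges to the goal; fitting w to a
# demonstrated position (or force) profile reproduces and generalizes it.
trajectory = DMP1D().rollout(y0=0.0, goal=0.3, tau=1.0, dt=0.01, steps=200)
```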

    A sensorless virtual slave control scheme for kinematically dissimilar master-slave teleoperation

    The use of telerobotic systems is essential for remote handling (RH) operations in radioactive areas of scientific facilities that generate high doses of radiation. Recent developments in remote handling technology have seen a great deal of effort directed towards the design of modular remote handling control rooms equipped with a standard master arm that will be used to separately control a range of different slave devices. This application thus requires a kinematically dissimilar master-slave control scheme. In order to avoid drag, friction, and other non-linear and unmodelled slave-arm effects of the common position-position architecture in non-backdrivable slaves, this research has implemented a force-position control scheme. End-effector force is derived from motor torque values which, to avoid the use of radiation-intolerant and costly sensing devices, are inferred from motor current measurements. This has been demonstrated on a 1-DOF test rig with a permanent magnet synchronous motor teleoperated by a Sensable Phantom Omni® haptic master. This approach has been shown to allow accurate control while realistically conveying dynamic force information back to the operator.
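
    The force-position scheme above hinges on recovering the slave-side tip force from motor current rather than from a dedicated sensor. A sketch of that mapping, assuming identified torque constants, gear ratios and friction compensation (all placeholder names), might look as follows.

```python
import numpy as np

def estimate_tip_force(i_motor, q, jacobian, k_t, gear_ratio, tau_friction):
    """Estimate the end-effector wrench from motor currents (no F/T sensor).

    i_motor      : measured motor currents, shape (n,)
    jacobian(q)  : geometric Jacobian at configuration q, shape (6, n)
    k_t          : motor torque constants, shape (n,)
    gear_ratio   : transmission ratios, shape (n,)
    tau_friction : identified friction torques at the current velocity, shape (n,)
    """
    tau_joint = k_t * gear_ratio * i_motor - tau_friction    # joint torques
    # Least-squares mapping of joint torques to a tip wrench: F = (J^T)^+ tau
    wrench = np.linalg.pinv(jacobian(q).T) @ tau_joint
    return wrench                                            # [Fx, Fy, Fz, Mx, My, Mz]
```

    On the 1-DOF test rig described above the mapping essentially reduces to a single scalar gain, with the estimated force reflected back to the haptic master.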

    On Sensorless Collision Detection and Measurement of External Forces in Presence of Modeling Inaccuracies

    The field of human-robot interaction has garnered significant interest in the last decade. Every form of human-robot coexistence must guarantee the safety of the user. Safety in human-robot interaction is being vigorously studied, in areas such as collision avoidance, soft actuators, light-weight robots, computer vision techniques, soft tissue modeling, and collision detection. Despite these safety provisions, unwanted collisions can occur in the case of system faults. In such cases, before post-collision strategies are triggered, it is imperative to detect the collisions effectively. Tactile sensors, vision systems, sonar and Lidar sensors, and similar hardware allow collisions to be detected; however, due to the cost of such methods, more practical approaches are being investigated. A general goal remains to develop methods for fast detection of external contacts using minimal sensory information. The availability of position data and commanded torques in manipulators permits the development of observer-based techniques to measure external forces/torques. The presence of disturbances and inaccuracies in the model of the robot challenges the efficacy of such observers in the context of collision detection. The purpose of this thesis is to develop methods that reduce the effects of modeling inaccuracies in external force/torque estimation and increase the efficacy of collision detection. It comprises the following four parts:
    1. The KUKA Light-Weight Robot IV+ is commonly employed for research purposes. The regressor matrix, minimal inertial parameters, and the friction model of this robot are identified and presented in detail. To develop the model, relative weight analysis is employed for identification.
    2. Modeling inaccuracies and robot state approximation errors are considered simultaneously to develop model-based time-varying thresholds for collision detection. A metric is formulated to compare trajectories realizing the same task in terms of their collision detection and external force/torque estimation capabilities. A method for determining optimal trajectories with regard to accurate external force/torque estimation is also developed.
    3. The effects of velocity on external force/torque estimation errors are studied with and without the use of joint force/torque sensors. Velocity-based thresholds are developed and implemented to improve collision detection. The results are compared with the collision detection module integrated in the KUKA Light-Weight Robot IV+.
    4. An alternative joint-by-joint heuristic method is proposed to identify the effects of modeling inaccuracies on external force/torque estimation. Time-varying collision detection thresholds associated with the heuristic method are developed and compared with constant thresholds.
    In this work, the KUKA Light-Weight Robot IV+ is used for obtaining the experimental results. The robot is controlled via the Fast Research Interface and Visual C++ 2008. The experimental results confirm the efficacy of the proposed methodologies.
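
    Parts 2 and 3 revolve around time-varying, velocity-dependent thresholds applied to an external force/torque residual. A minimal sketch of such a test is given below; the threshold shape and constants are illustrative assumptions that would in practice be calibrated on contact-free motions.

```python
import numpy as np

def detect_collision(residual, dq, c0, c1):
    """Velocity-dependent threshold test on the external-torque residual.

    residual : current estimate of the external joint torques, shape (n,)
    dq       : joint velocities, shape (n,)
    c0, c1   : per-joint constants chosen so that the threshold envelops the
               residual caused by model error alone (larger at higher speed,
               where friction and inertial model errors grow).
    """
    threshold = c0 + c1 * np.abs(dq)
    return bool(np.any(np.abs(residual) > threshold))
```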

    Collision Detection and Contact Point Estimation Using Virtual Joint Torque Sensing Applied to a Cobot

    In physical human-robot interaction (pHRI) it is essential to reliably estimate and localize contact forces between the robot and the environment. In this paper, a complete contact detection, isolation, and reaction scheme is presented and tested on a new 6-dof industrial collaborative robot. We combine two popular methods, based on monitoring the energy and the generalized momentum, to detect and isolate collisions on the whole robot body in a more robust way. The experimental results show the effectiveness of our implementation on the LARA5 cobot, which relies only on motor current and joint encoder measurements. For validation purposes, contact forces are also measured using an external GTE CoboSafe sensor. After a successful collision detection, the contact point location is isolated using a combination of the residual method based on the generalized momentum with a contact particle filter (CPF) scheme. We show for the first time a successful implementation of such a combination on a real robot, without relying on joint torque sensor measurements.
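
    Alongside a generalized-momentum residual, the energy-based monitor mentioned above can be written as a scalar first-order observer of the external power. The sketch below assumes an identified model (callables M and g) and an illustrative gain; how the two signals are fused for detection (e.g. requiring either or both to exceed its threshold) is a design choice not specified here.

```python
import numpy as np

def energy_residual(q_log, dq_log, tau_log, M, g, k_e, dt):
    """Scalar energy residual sigma(t), a first-order estimate of the
    external power dq^T tau_ext (zero while no contact occurs).

    q_log, dq_log, tau_log : (T, n) logged joint positions, velocities and
        motor torques; M, g : identified mass matrix and gravity vector.
    """
    steps = q_log.shape[0]
    sigma = np.zeros(steps)
    kin0 = 0.5 * dq_log[0] @ M(q_log[0]) @ dq_log[0]   # initial kinetic energy
    integral = 0.0
    for k in range(1, steps):
        q, dq, tau = q_log[k], dq_log[k], tau_log[k]
        # d/dt (kinetic energy) = dq^T (tau - g) + dq^T tau_ext
        integral += (dq @ (tau - g(q)) + sigma[k - 1]) * dt
        kin = 0.5 * dq @ M(q) @ dq
        sigma[k] = k_e * (kin - kin0 - integral)
    return sigma   # compare |sigma| against a threshold tuned on contact-free runs
```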