Kinematics-Based Estimation of Contact Constraints Using Only Proprioception
Robots are increasingly required to perform tasks that involve contact with the environment. This paper addresses the problem of estimating environmental constraints on a robot's motion. We present a method which estimates such constraints by computing the null space of a set of velocity vectors that differ from the commanded velocities during contact. We further extend this method to handle unilateral constraints, for example when the robot touches a rigid surface. Unlike previous work, our method is based on kinematic analysis using only proprioceptive joint encoders, so it requires neither expensive force-torque sensors nor tactile sensors at the contact points, nor any use of vision. We first present results of experiments with a simulated robot in a variety of situations, and we analyse the effect of various levels of observation noise on the resulting contact estimates. Finally, we evaluate the performance of our method in two sets of experiments using a KUKA LWR IV manipulator, tasked with exploring and estimating the constraints caused by a horizontal surface and an inclined surface.
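The core computation this abstract describes, collecting the task-space velocities actually achieved during contact and taking their null space as the set of constrained directions, can be sketched roughly as follows. This is a minimal illustration under assumed conventions, not the paper's implementation; the function name and the SVD-based rank test are my own choices.

```python
import numpy as np

def contact_constraint_basis(measured_velocities, tol=1e-6):
    """Estimate contact constraint directions from proprioceptive data.

    measured_velocities: (N, d) array of task-space velocities actually
    achieved during contact (e.g. computed from joint encoders via the
    manipulator Jacobian). Directions blocked by the contact never appear
    in these vectors, so they span the null space of the stacked matrix.
    """
    V = np.asarray(measured_velocities, dtype=float)
    # Right-singular vectors with (near-)zero singular values span the
    # null space of V, i.e. the constrained directions.
    _, s, Vt = np.linalg.svd(V)
    rank = int(np.sum(s > tol * s.max()))
    return Vt[rank:].T  # (d, k) basis for the constrained subspace
```

For a robot sliding on a horizontal surface, for instance, the measured velocities lie in the x-y plane and the recovered one-dimensional basis is (up to sign) the surface normal.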
Unsupervised Contact Learning for Humanoid Estimation and Control
This work presents a method for contact state estimation using fuzzy clustering to learn contact probability for full, six-dimensional humanoid contacts. The data required for training come solely from proprioceptive sensors (end-effector contact wrench sensors and inertial measurement units, IMUs), and the method is completely unsupervised. The resulting cluster means are used to efficiently compute the probability of contact in each of the six end-effector degrees of freedom (DoFs) independently. This clustering-based contact probability estimator is validated in a kinematics-based base state estimator, in a simulation environment with realistic added sensor noise, for locomotion over rough, low-friction terrain on which the robot is subject to foot slip and rotation. The proposed base state estimator, which utilizes these six-DoF contact probability estimates, is shown to perform considerably better than one that determines kinematic contact constraints purely from measured normal force.
Comment: Submitted to the IEEE International Conference on Robotics and Automation (ICRA) 201
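A rough sketch of the clustering idea: a plain one-dimensional fuzzy c-means per DoF with two clusters (contact / no contact), reading the contact probability off as the membership in the higher-mean cluster. This is an assumed simplification for illustration; the paper's actual formulation, feature set, and number of clusters may differ.

```python
import numpy as np

def fuzzy_cmeans_1d(samples, n_clusters=2, m=2.0, iters=100):
    """Plain 1-D fuzzy c-means (Bezdek-style update rules).

    Returns (centers, U), where U[i, j] is the membership of sample i
    in cluster j. Centers are initialized at the data quantiles for a
    deterministic, well-separated start.
    """
    x = np.asarray(samples, dtype=float).reshape(-1, 1)
    centers = np.quantile(x, np.linspace(0.0, 1.0, n_clusters))[:, None]
    for _ in range(iters):
        d = np.abs(x - centers.T) + 1e-12          # (N, c) distances
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)   # fuzzy memberships
        Um = U ** m
        centers = (Um.T @ x) / Um.sum(axis=0)[:, None]
    return centers.ravel(), U

def contact_probability(sample, centers, m=2.0):
    """Membership of a new sample in the higher-mean ('contact') cluster."""
    d = np.abs(sample - centers) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    u = inv / inv.sum()
    return u[np.argmax(centers)]
```

Trained on, say, normal-force measurements, the cluster with the higher mean corresponds to contact, and new samples map smoothly to a probability rather than a hard threshold.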
Vision-based trajectory control of unsensored robots to increase functionality, without robot hardware modification
In nuclear decommissioning operations, very rugged remote manipulators are used which lack proprioceptive joint angle sensors. Hence these machines are simply tele-operated: a human operator controls each joint of the robot individually using a teach pendant or a set of switches. Moreover, decommissioning tasks often involve forceful interactions between the environment and powerful tools at the robot's end-effector. Such interactions can result in complex dynamics and large torques at the robot's joints, and can also lead to erratic movements of a mobile manipulator's base frame with respect to the task space. This Thesis seeks to address these problems by, firstly, showing how the configuration of such robots can be tracked in real time by a vision system and fed back into a trajectory control scheme. Secondly, the Thesis investigates the dynamics of robot-environment contacts and proposes several control schemes for detecting, coping with, and also exploiting such contacts. Several contributions are advanced in this Thesis. Specifically: a control framework is presented which exploits the constraints arising at contact points to reduce the commanded torques needed to perform tasks; methods are advanced to estimate the constraints arising from contacts in a number of situations, using only kinematic quantities; a framework is proposed to estimate the configuration of a manipulator using a single monocular camera; and finally, a general control framework is described which uses all of the above contributions to servo a manipulator. The results of a number of experiments are presented which demonstrate the feasibility of the proposed methods.
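The torque-reduction idea mentioned above, removing commanded-torque components that would only push against rigid contacts, is commonly realized by projecting the command into the null space of the contact constraint Jacobian. A minimal sketch of that standard projection follows; the function name and formulation are assumptions for illustration, not taken from the Thesis.

```python
import numpy as np

def constraint_consistent_torque(tau, Jc):
    """Remove commanded-torque components that act only against contacts.

    tau: (n,) commanded joint torques.
    Jc:  (k, n) contact constraint Jacobian (rows = constrained directions).
    Projects tau onto the null space of Jc, so the remaining torque does
    not push against the rigid contact.
    """
    n = Jc.shape[1]
    P = np.eye(n) - np.linalg.pinv(Jc) @ Jc  # null-space projector
    return P @ tau
```

Because the contact itself supplies the reaction forces along the constrained directions, the projected command can be substantially smaller than the original while achieving the same motion.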