Object Recognition Using Vision and Touch
A robotic system for object recognition is described that uses both active exploratory tactile sensing and passive stereo vision. The complementary nature of these sensing modalities allows the system to discover the underlying three-dimensional structure of the objects to be recognized. This structure is embodied in rich, hierarchical, viewpoint-independent 3-D models of the objects, which include curved surfaces, concavities and holes. The vision processing provides sparse 3-D data about regions of interest that are then actively explored by the tactile sensor, which is mounted on the end of a six-degree-of-freedom manipulator. A robust hierarchical procedure has been developed to integrate the visual and tactile data into accurate three-dimensional surface and feature primitives. This integration of vision and touch provides geometric measures of the surfaces and features that are used in a matching phase to find model objects consistent with the sensory data. Methods for verifying the hypotheses are presented, including the sensing of visually occluded areas with the tactile sensor. A number of experiments have been performed using real sensors and real, noisy data to demonstrate the utility of these methods and the ability of such a system to recognize objects that would be difficult for a system using vision alone.
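The matching phase described above can be illustrated with a minimal sketch: geometric measures extracted from fused vision/touch primitives are compared against stored model features, keeping only the models consistent with every measurement. All model names, feature names, and tolerances here are assumptions for illustration, not the paper's actual representation.

```python
# Illustrative sketch of consistency-based model matching.
# Hypothetical model database: each model stores geometric features
# (boolean structure flags and numeric surface measures).
models = {
    "mug":    {"has_hole": True,  "max_curvature": 0.8, "surface_area": 350.0},
    "block":  {"has_hole": False, "max_curvature": 0.0, "surface_area": 600.0},
    "bottle": {"has_hole": False, "max_curvature": 0.5, "surface_area": 420.0},
}

def consistent(model, sensed, rel_tol=0.15):
    """A model is consistent if boolean features match exactly and
    numeric features agree within a relative tolerance."""
    for name, value in sensed.items():
        expected = model[name]
        if isinstance(expected, bool):
            if expected != value:
                return False
        elif abs(expected - value) > rel_tol * max(abs(expected), 1e-9):
            return False
    return True

# Sensed primitives, as if fused from sparse stereo data and tactile exploration.
sensed = {"has_hole": True, "max_curvature": 0.75, "surface_area": 340.0}
candidates = [name for name, m in models.items() if consistent(m, sensed)]
print(candidates)  # ['mug']
```

In the real system the primitives are 3-D surfaces and features rather than scalar summaries, but the same consistency-filtering structure applies.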
Learning Latent Space Dynamics for Tactile Servoing
To achieve dexterous robotic manipulation, we need to endow our robot with tactile feedback capability, i.e. the ability to drive action based on tactile sensing. In this paper, we specifically address the challenge of tactile servoing: given the current tactile sensing and a target/goal tactile sensing -- memorized from a successful task execution in the past -- what is the action that will bring the current tactile sensing closer to the target tactile sensing at the next time step? We develop a data-driven approach to acquiring a dynamics model for tactile servoing by learning from demonstration. Moreover, our method represents the tactile sensing information as lying on a surface -- a 2D manifold -- and performs manifold learning, making it applicable to any tactile skin geometry. We evaluate our method on a contact-point tracking task using a robot equipped with a tactile finger. A video demonstrating our approach can be seen at https://youtu.be/0QK0-Vx7WkI
Comment: Accepted for publication at the International Conference on Robotics and Automation (ICRA) 2019. The final version for publication at ICRA 2019 is 7 pages (6 pages of technical content, including text, figures, tables, and acknowledgements, plus 1 page of references), while this arXiv version is 8 pages (added an Appendix and some extra details).
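The servoing idea in the abstract can be sketched with a deliberately simplified stand-in: assume (purely for illustration, not as the paper's actual model) a linear latent dynamics model z' = A z + B u fitted from demonstrations; the servo action is then the least-squares input driving the predicted next latent state toward the goal.

```python
import numpy as np

# Illustrative sketch of latent-space tactile servoing with an assumed
# linear latent transition model; A and B stand in for learned parameters.
rng = np.random.default_rng(1)

dz, du = 4, 2            # latent and action dimensions (assumed)
A = np.eye(dz) * 0.9     # stand-in for a learned latent transition matrix
B = rng.normal(size=(dz, du))

def servo_action(z_now, z_goal):
    """Least-squares action u that drives the predicted next latent state
    z' = A @ z_now + B @ u as close as possible to z_goal."""
    u, *_ = np.linalg.lstsq(B, z_goal - A @ z_now, rcond=None)
    return u

z_now = rng.normal(size=dz)
z_goal = np.zeros(dz)
for _ in range(100):                 # iterate the servo loop on the model
    u = servo_action(z_now, z_goal)
    z_now = A @ z_now + B @ u        # roll the model forward

print(np.linalg.norm(z_now - z_goal))
```

In the paper the encoding and dynamics are learned (and the tactile reading lives on a 2D manifold), but the control structure -- predict, compare to the goal, pick the action that closes the gap -- is the same.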
Shear-invariant Sliding Contact Perception with a Soft Tactile Sensor
Manipulation tasks often require robots to be continuously in contact with an
object. Therefore, tactile perception systems need to handle continuous contact data. Shear deformation causes the tactile sensor to output path-dependent readings, in contrast to discrete contact readings. As such, in some
continuous-contact tasks, sliding can be regarded as a disturbance over the
sensor signal. Here we present a shear-invariant perception method based on
principal component analysis (PCA) which outputs the required information about
the environment despite sliding motion. A compliant tactile sensor (the TacTip)
is used to investigate continuous tactile contact. First, we evaluate the
method offline using test data collected whilst the sensor slides over an edge.
Then, the method is used within a contour-following task applied to 6 objects
with varying curvatures; all contours are successfully traced. The method
demonstrates generalisation capabilities and could underlie a more
sophisticated controller for challenging manipulation or exploration tasks in
unstructured environments. A video showing the work described in the paper can be found at https://youtu.be/wrTM61-pieU
Comment: Accepted at ICRA 2019.
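The PCA step can be sketched as follows. This is a hypothetical minimal version, not the paper's pipeline: it assumes the sensor returns a flat vector of marker (pin) displacements, fits principal components by SVD, and treats the first few components as shear-dominated, projecting them out to leave a feature that is insensitive to sliding-induced deformation.

```python
import numpy as np

# Illustrative sketch of a PCA-based shear-invariant tactile feature.
# Data shapes and the number of shear components are assumptions.
rng = np.random.default_rng(0)

# Training data: N frames of pin displacements collected while the
# sensor slides over an edge (N frames x D displacement dimensions).
N, D = 200, 30
frames = rng.normal(size=(N, D))

# PCA by hand: centre the data, then take right singular vectors.
mean = frames.mean(axis=0)
centred = frames - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)

# Assume the leading components capture shear-dominated variation;
# keep only the remaining components as the invariant subspace.
n_shear = 2
invariant_basis = vt[n_shear:]          # shape (D - n_shear, D)

def shear_invariant_feature(frame):
    """Project a single tactile frame into the shear-invariant subspace."""
    return invariant_basis @ (frame - mean)

feature = shear_invariant_feature(rng.normal(size=D))
print(feature.shape)  # (28,)
```

A downstream controller (e.g. for contour following) would then act on this projected feature rather than the raw, path-dependent reading.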