Learning Latent Space Dynamics for Tactile Servoing
To achieve dexterous robotic manipulation, we need to endow our robot with tactile feedback capability, i.e. the ability to drive action based on tactile sensing. In this paper, we specifically address the challenge of tactile servoing: given the current tactile sensing and a target/goal tactile sensing --memorized from a successful task execution in the past-- what is the action that will bring the current tactile sensing closer to the target tactile sensing at the next time step? We develop a data-driven approach to acquiring a dynamics model for tactile servoing by learning from demonstration. Moreover, our method represents the tactile sensing information as lying on a surface --or a 2D manifold-- and performs manifold learning, making it applicable to any tactile skin geometry. We evaluate our method on a contact point tracking task using a robot equipped with a tactile finger. A video demonstrating our approach can be seen at https://youtu.be/0QK0-Vx7WkI

Comment: Accepted for publication at the International Conference on Robotics and Automation (ICRA) 2019. The final version for publication at ICRA 2019 is 7 pages (i.e. 6 pages of technical content, including text, figures, tables, acknowledgement, etc., and 1 page of bibliography/references), while this arXiv version is 8 pages (added Appendix and some extra details).
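To make the abstract's approach concrete, here is a minimal sketch of a latent-space servoing loop: encode the current and goal tactile readings, then search for the action whose predicted next latent state lands closest to the goal latent. Network sizes, names, and the gradient-based action search are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of latent-space tactile servoing: all dimensions and the
# action-selection scheme are assumptions, not the paper's actual model.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Maps a flattened tactile reading to a low-dimensional latent state."""
    def __init__(self, n_taxels=64, latent_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_taxels, 64), nn.ReLU(), nn.Linear(64, latent_dim))

    def forward(self, x):
        return self.net(x)

class LatentDynamics(nn.Module):
    """Predicts the next latent state given the current latent and an action."""
    def __init__(self, latent_dim=8, action_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + action_dim, 64), nn.ReLU(),
            nn.Linear(64, latent_dim))

    def forward(self, z, a):
        return self.net(torch.cat([z, a], dim=-1))

def servo_action(encoder, dynamics, x_now, x_goal, action_dim=3, steps=50, lr=0.1):
    """Pick the action whose predicted next latent is closest to the goal latent."""
    z_now = encoder(x_now).detach()
    z_goal = encoder(x_goal).detach()
    a = torch.zeros(action_dim, requires_grad=True)
    opt = torch.optim.Adam([a], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((dynamics(z_now, a) - z_goal) ** 2).sum()
        loss.backward()
        opt.step()
    return a.detach()

# Untrained networks, so the output is only structural: in practice both
# modules would be fit to demonstration data before servoing.
action = servo_action(TactileEncoder(), LatentDynamics(),
                      torch.randn(64), torch.randn(64))
```

In practice both networks would be trained jointly on demonstration trajectories; the point here is only the encode-predict-optimize structure.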
Tactile Mapping and Localization from High-Resolution Tactile Imprints
This work studies the problem of shape reconstruction and object localization
using a vision-based tactile sensor, GelSlim. The main contributions are the
recovery of local shapes from contact, an approach to reconstruct the tactile
shape of objects from tactile imprints, and an accurate method for object
localization of previously reconstructed objects. The algorithms can be applied
to a large variety of 3D objects and provide accurate tactile feedback for
in-hand manipulation. Results show that by exploiting the dense tactile
information we can reconstruct the shape of objects with high accuracy and do
on-line object identification and localization, opening the door to reactive
manipulation guided by tactile sensing. We provide videos and supplemental information on the project's website: http://web.mit.edu/mcube/research/tactile_localization.html

Comment: ICRA 2019, 7 pages, 7 figures. Website: http://web.mit.edu/mcube/research/tactile_localization.html Video: https://youtu.be/uMkspjmDbq
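The localization step of such a pipeline can be pictured as rigid registration of contact geometry against a previously reconstructed model. The sketch below uses the classic Kabsch/Procrustes alignment and assumes known point correspondences, which sidesteps the harder matching problem the paper actually solves; it is not the authors' method.

```python
# Hedged sketch: rigid alignment of a tactile imprint point cloud to a model
# point cloud, assuming known correspondences (a strong simplification).
import numpy as np

def rigid_align(imprint_pts, model_pts):
    """Return rotation R and translation t mapping imprint_pts onto model_pts."""
    mu_p, mu_q = imprint_pts.mean(axis=0), model_pts.mean(axis=0)
    P, Q = imprint_pts - mu_p, model_pts - mu_q
    U, _, Vt = np.linalg.svd(P.T @ Q)        # SVD of the cross-covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_q - R @ mu_p
    return R, t

# Sanity check: recover a known planar rotation and translation.
rng = np.random.default_rng(1)
pts = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, -0.2, 0.3])
R, t = rigid_align(pts, pts @ R_true.T + t_true)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```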
Effects of force-torque and tactile haptic modalities on classifying the success of robot manipulation tasks
We investigate which haptic sensing modalities, or combination of haptic sensing modalities, best enable a robot to determine whether it has successfully completed a manipulation task. In this paper, we consider haptic sensing modalities obtained from a wrist-mounted force-torque sensor and three types of fingertip sensors: a pair of FlexiForce force-sensing resistors, a pair of NumaTac sensors, and a pair of BioTac sensors. For each type of fingertip sensor, we simultaneously record force-torque and fingertip tactile data as the robot attempts to complete two manipulation tasks --a picking task and a scooping task-- two hundred times each. We leverage the resulting dataset to train and test a classification method using forty-one different haptic feature combinations, obtained from exhaustive combinations of individual modalities of the force-torque sensor and fingertip sensors. Our results show that the classification method's ability to distinguish between successful and unsuccessful task attempts depends on both the type of manipulation task and the subset of haptic modalities used to train and test the classification method.

Accepted manuscript.
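A scaled-down version of this exhaustive comparison is easy to express in code. The sketch below uses synthetic stand-in features, a random-forest classifier, and three modality blocks rather than the paper's forty-one combinations; every name and number in it is an assumption for illustration.

```python
# Hedged sketch: score every subset of haptic modalities with cross-validation.
# The feature blocks and labels are synthetic placeholders, not the dataset.
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # the paper records 200 attempts per task
modalities = {
    "force_torque": rng.normal(size=(n, 6)),   # wrist F/T: 3 forces + 3 torques
    "flexiforce":   rng.normal(size=(n, 2)),   # one scalar per fingertip
    "biotac":       rng.normal(size=(n, 19)),  # placeholder electrode features
}
y = rng.integers(0, 2, size=n)  # 1 = task success, 0 = failure

scores = {}
names = list(modalities)
for k in range(1, len(names) + 1):
    for subset in combinations(names, k):
        X = np.hstack([modalities[m] for m in subset])
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        scores[subset] = cross_val_score(clf, X, y, cv=5).mean()

best = max(scores, key=scores.get)
print(f"best modality subset: {best}, cv accuracy {scores[best]:.2f}")
```

On real data the interesting outcome is exactly the one the abstract reports: which subset wins depends on the task.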
An Integrated System for Dextrous Manipulation
This paper describes an integrated system for dextrous manipulation using a Utah-MIT hand that allows one to study the higher levels of control in a number of grasping and manipulation tasks. The system consists of a number of low-level system primitives for grasping, integrated hand and robotic arm movement, tactile sensors mounted on the fingertips, sensing primitives that utilize joint position, tendon force, and tactile array feedback, and a high-level programming environment that allows task-level scripts to be created for grasping and manipulation tasks. A number of grasping and manipulation tasks that have been implemented with this system are described.
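The layering the abstract describes, low-level primitives underneath task-level scripts, can be gestured at with a few stub functions. The primitive names and the script below are invented for illustration and bear no relation to the original system's actual interfaces.

```python
# Hedged sketch: a task-level script composed from low-level primitives, in the
# spirit of the described architecture. All names here are hypothetical stubs.
def move_arm(pose):
    print(f"arm -> {pose}")

def preshape_hand(grasp_type):
    print(f"hand preshape: {grasp_type}")

def close_until_contact(force_threshold_n):
    print(f"closing fingers until tendon force > {force_threshold_n} N")

def read_tactile_arrays():
    return {"fingertips_in_contact": 3}  # stub sensor reading

def pick_script(object_pose):
    """Task-level script: approach, preshape, close, verify via tactile feedback."""
    move_arm(object_pose)
    preshape_hand("three_finger_pinch")
    close_until_contact(force_threshold_n=1.5)
    if read_tactile_arrays()["fingertips_in_contact"] < 2:
        raise RuntimeError("grasp verification failed")

pick_script(object_pose=(0.4, 0.1, 0.05))
```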
Sim-to-Real Model-Based and Model-Free Deep Reinforcement Learning for Tactile Pushing
Object pushing presents a key non-prehensile manipulation problem that is
illustrative of more complex robotic manipulation tasks. While deep
reinforcement learning (RL) methods have demonstrated impressive learning
capabilities using visual input, a lack of tactile sensing limits their
capability for fine and reliable control during manipulation. Here we propose a
deep RL approach to object pushing using tactile sensing without visual input,
namely tactile pushing. We present a goal-conditioned formulation that allows
both model-free and model-based RL to obtain accurate policies for pushing an
object to a goal. To achieve real-world performance, we adopt a sim-to-real
approach. Our results demonstrate that it is possible to train on a single
object and a limited sample of goals to produce precise and reliable policies
that can generalize to a variety of unseen objects and pushing scenarios
without domain randomization. We experiment with the trained agents in harsh
pushing conditions, and show that with significantly more training samples, a
model-free policy can outperform a model-based planner, generating shorter and
more reliable pushing trajectories despite large disturbances. The simplicity of our training environment and the effective real-world performance highlight the value of rich tactile information for fine manipulation. Code and videos are available at https://sites.google.com/view/tactile-rl-pushing/.

Comment: Accepted by IEEE Robotics and Automation Letters (RA-L).
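The goal-conditioned formulation can be illustrated with a toy environment in which the observation contains tactile features plus the goal, and the reward is the negative distance to that goal. The 2D point-pushing dynamics and feature layout below are stand-ins chosen for brevity, not the paper's simulation.

```python
# Hedged sketch: a goal-conditioned tactile-pushing environment skeleton.
# Dynamics, tactile features, and thresholds are illustrative assumptions.
import numpy as np

class ToyTactilePush:
    """Push a point 'object' to a sampled goal using tactile + goal observations."""

    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.obj = self.rng.uniform(-1.0, 1.0, size=2)   # object position
        self.goal = self.rng.uniform(-1.0, 1.0, size=2)  # sampled goal
        self.tactile = np.zeros(4)                       # fake contact features
        return self._obs()

    def step(self, action):
        self.obj = self.obj + 0.05 * np.clip(action, -1.0, 1.0)
        self.tactile = self.rng.normal(scale=0.01, size=4)  # stand-in sensor
        dist = np.linalg.norm(self.obj - self.goal)
        reward = -dist                 # dense, goal-conditioned reward
        done = dist < 0.05
        return self._obs(), reward, done

    def _obs(self):
        # No vision: the policy sees tactile features and the goal offset only.
        return np.concatenate([self.tactile, self.goal - self.obj])

env = ToyTactilePush()
obs = env.reset()
obs, reward, done = env.step(np.array([1.0, 0.0]))
```

Both a model-free policy and a model-based planner can be trained against an interface like this; the paper's comparison between the two concerns sample cost versus robustness, not the interface itself.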