ChainQueen: A Real-Time Differentiable Physical Simulator for Soft Robotics
Physical simulators have been widely used in robot planning and control.
Among them, differentiable simulators are particularly favored, as they can be
incorporated into gradient-based optimization algorithms that are efficient in
solving inverse problems such as optimal control and motion planning.
Simulating deformable objects is, however, more challenging compared to rigid
body dynamics. The underlying physical laws of deformable objects are more
complex, and the resulting systems have orders of magnitude more degrees of
freedom, making them significantly more computationally expensive to
simulate. Computing gradients with respect to physical design or controller
parameters is typically even more computationally challenging. In this paper,
we propose a real-time, differentiable hybrid Lagrangian-Eulerian physical
simulator for deformable objects, ChainQueen, based on the Moving Least Squares
Material Point Method (MLS-MPM). MLS-MPM can simulate deformable objects
including contact and can be seamlessly incorporated into inference, control
and co-design systems. We demonstrate that our simulator achieves high
precision in both forward simulation and backward gradient computation. We have
successfully employed it in a diverse set of control tasks for soft robots,
including problems with nearly 3,000 decision variables.

Comment: In submission to ICRA 2019. Supplemental Video:
https://www.youtube.com/watch?v=4IWD4iGIsB4 Project Page:
https://github.com/yuanming-hu/ChainQuee
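The core idea of the abstract — differentiating through the simulator so that gradient descent can solve a control problem — can be illustrated with a toy example. The sketch below is hypothetical and has nothing to do with MLS-MPM: it uses a 1D point mass with hand-derived reverse-mode adjoints, purely to show how a differentiable forward simulation enables gradient-based optimization of control inputs.

```python
# Hypothetical minimal sketch: gradient-based control through a
# differentiable simulator (toy 1D point mass, NOT MLS-MPM).
import numpy as np

DT, STEPS = 0.1, 50

def simulate(u):
    # Forward pass: semi-implicit Euler for a unit mass driven by forces u.
    x, v = 0.0, 0.0
    for k in range(STEPS):
        v = v + u[k] * DT
        x = x + v * DT
    return x

def loss_and_grad(u, target):
    # Reverse pass: hand-derived adjoints of the two update rules above,
    # differentiating the squared final-position error w.r.t. each force.
    x = simulate(u)
    g_x, g_v = 2.0 * (x - target), 0.0
    g_u = np.zeros_like(u)
    for k in reversed(range(STEPS)):
        g_v = g_v + g_x * DT   # adjoint of: x = x + v * DT
        g_u[k] = g_v * DT      # adjoint of: v = v + u[k] * DT
    return (x - target) ** 2, g_u

# Gradient descent on the open-loop control sequence.
u = np.zeros(STEPS)
for _ in range(200):
    loss, g = loss_and_grad(u, target=1.0)
    u -= 0.05 * g
```

Because the whole forward rollout is differentiable, the optimizer receives exact gradients of the task loss with respect to every decision variable, which is what makes the inverse problems mentioned in the abstract (optimal control, motion planning, co-design) tractable at scale.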
Active Clothing Material Perception using Tactile Sensing and Deep Learning
Humans represent and discriminate objects in the same category by their
properties, and an intelligent robot should be able to do the same. In
this paper, we build a robot system that can autonomously perceive the object
properties through touch. We work on the common object category of clothing.
The robot moves under the guidance of an external Kinect sensor, and squeezes
the clothes with a GelSight tactile sensor, and then recognizes 11 properties
of the clothing from the tactile data. These properties include physical
properties, such as thickness, fuzziness, softness, and durability, and
semantic properties, such as wearing season and preferred washing methods. We
collect a dataset of 153 varied pieces of clothing and conduct 6,616 robot
exploration iterations on them. To extract useful information from the
high-dimensional sensory output, we apply Convolutional Neural Networks (CNNs)
on the tactile data for recognizing the clothing properties, and on the Kinect
depth images for selecting exploration locations. Experiments show that using
the trained neural networks, the robot can autonomously explore the unknown
clothes and learn their properties. This work proposes a new framework for
active tactile perception that combines vision and touch, and has the
potential to enable robots to help humans with varied clothing-related
housework.

Comment: ICRA 2018 accepted
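The pipeline described above maps a high-dimensional tactile image to a vector of clothing properties. The sketch below is a purely illustrative NumPy forward pass (not the paper's network or data): a small convolution, ReLU, global average pooling, and a sigmoid head with one independent output per property, mirroring the idea of predicting 11 properties from one tactile reading.

```python
# Hypothetical sketch: CNN-style multi-label property prediction from a
# tactile image (architecture and sizes are illustrative, not the paper's).
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    # Valid cross-correlation of a single-channel image with each kernel.
    kh, kw = kernels.shape[1:]
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((len(kernels), h, w))
    for c, k in enumerate(kernels):
        for i in range(h):
            for j in range(w):
                out[c, i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def predict_properties(tactile_img, kernels, w_fc):
    feats = np.maximum(conv2d(tactile_img, kernels), 0.0)  # conv + ReLU
    pooled = feats.mean(axis=(1, 2))                       # global avg pool
    logits = pooled @ w_fc                                 # linear head
    # One independent sigmoid per property: each of the 11 outputs is a
    # probability for one clothing property.
    return 1.0 / (1.0 + np.exp(-logits))

img = rng.random((16, 16))                 # stand-in for a GelSight frame
kernels = rng.standard_normal((4, 3, 3))   # 4 learned filters (random here)
w_fc = rng.standard_normal((4, 11))        # head for 11 properties
probs = predict_properties(img, kernels, w_fc)
```

Independent sigmoid outputs (rather than one softmax) fit this task because a garment can score high on several properties at once, e.g. both thick and durable.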