Proprioceptive Learning with Soft Polyhedral Networks
Proprioception is the "sixth sense" that detects limb postures with motor
neurons. It requires a natural integration between the musculoskeletal systems
and sensory receptors, which is challenging for modern robots that aim for
lightweight, adaptive, and sensitive designs at a low cost. Here, we present
the Soft Polyhedral Network with embedded vision for physical interaction,
capable of adaptive kinesthesia and viscoelastic proprioception by learning
kinetic features. This design enables passive adaptations to omni-directional
interactions, visually captured by a miniature high-speed motion tracking
system embedded inside for proprioceptive learning. The results show that the
soft network can infer real-time 6D forces and torques with accuracies of
0.25/0.24/0.35 N and 0.025/0.034/0.006 Nm in dynamic interactions. We also
incorporate viscoelasticity in proprioception during static adaptation by
adding a creep and relaxation modifier to refine the predicted results. The
proposed soft network combines simplicity in design, omni-adaptation, and
proprioceptive sensing with high accuracy, making it a versatile solution for
robotics at a low cost with more than 1 million use cycles for tasks such as
sensitive and competitive grasping, and touch-based geometry reconstruction.
This study offers new insights into vision-based proprioception for soft robots
in adaptive grasping, soft manipulation, and human-robot interaction.
Comment: 20 pages, 10 figures, 2 tables, submitted to the International Journal of Robotics Research for review
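The creep-and-relaxation refinement mentioned in the abstract can be pictured as an exponential stress-relaxation correction applied to a quasi-static force prediction. The sketch below is illustrative only: the relaxation amplitude and time constant are assumed values, not the paper's fitted parameters.

```python
import numpy as np

def relaxation_modifier(f_pred, t_hold, a=0.15, tau=2.0):
    """Refine a quasi-static force prediction with an exponential
    stress-relaxation term (standard-linear-solid-style approximation).

    a   : relative relaxation amplitude (assumed, not from the paper)
    tau : relaxation time constant in seconds (assumed)
    """
    return f_pred * (1.0 - a * (1.0 - np.exp(-t_hold / tau)))

# A 1.0 N prediction decays toward (1 - a) = 0.85 N during a long static hold.
forces = [relaxation_modifier(1.0, t) for t in (0.0, 2.0, 10.0)]
```

During dynamic interaction (`t_hold` near zero) the modifier leaves the prediction untouched, so it only reshapes the static-adaptation regime where viscoelastic creep matters.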
Towards Robotic Tree Manipulation: Leveraging Graph Representations
There is growing interest in automating agricultural tasks that require
intricate and precise interaction with specialty crops, such as trees and
vines. However, developing robotic solutions for crop manipulation remains a
difficult challenge due to complexities involved in modeling their deformable
behavior. In this study, we present a framework for learning the deformation
behavior of tree-like crops under contact interaction. Our proposed method
involves encoding the state of a spring-damper modeled tree crop as a graph.
This representation allows us to employ graph networks to learn both a forward
model for predicting resulting deformations, and a contact policy for inferring
actions to manipulate tree crops. We conduct a comprehensive set of experiments
in a simulated environment and demonstrate generalizability of our method on
previously unseen trees. Videos can be found on the project website:
https://kantor-lab.github.io/tree_gnn
Comment: 7 pages, 10 figures
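The graph encoding of a spring-damper tree can be sketched as follows: branch junctions become nodes, and each spring-damper segment becomes an edge carrying its rest length, stiffness, and damping. The feature choices and names below are illustrative assumptions, not the paper's exact encoding.

```python
import numpy as np

def tree_to_graph(positions, edges, k_spring=50.0, c_damp=0.8):
    """Encode a spring-damper tree crop as a graph for a learned forward model.

    positions : (N, 3) array of node positions (branch junctions)
    edges     : list of (parent, child) node-index pairs

    Node features: 3D position. Edge features: rest length, stiffness,
    damping. (Illustrative feature set, assumed for this sketch.)
    """
    senders = np.array([p for p, _ in edges])
    receivers = np.array([c for _, c in edges])
    rest_len = np.linalg.norm(positions[senders] - positions[receivers], axis=1)
    edge_feats = np.stack(
        [rest_len,
         np.full(len(edges), k_spring),
         np.full(len(edges), c_damp)],
        axis=1,
    )
    return {"nodes": positions, "senders": senders,
            "receivers": receivers, "edges": edge_feats}

# A 3-node "Y" tree: root at the origin, trunk node, one branch tip.
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0], [0.5, 0.0, 1.5]])
g = tree_to_graph(pos, [(0, 1), (1, 2)])
```

A graph network consuming this structure can then predict per-node displacements under a contact action (the forward model), or be inverted into a policy that scores candidate contacts.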
Bi-Touch: Bimanual Tactile Manipulation with Sim-to-Real Deep Reinforcement Learning
Bimanual manipulation with tactile feedback will be key to human-level robot
dexterity. However, this topic is less explored than single-arm settings,
partly due to the limited availability of suitable hardware and the complexity
of designing effective controllers for tasks with relatively large state-action
spaces. Here we introduce a dual-arm tactile robotic system (Bi-Touch) based on
the Tactile Gym 2.0 setup that integrates two affordable industrial-level robot
arms with low-cost high-resolution tactile sensors (TacTips). We present a
suite of bimanual manipulation tasks tailored towards tactile feedback:
bi-pushing, bi-reorienting and bi-gathering. To learn effective policies, we
introduce appropriate reward functions for these tasks and propose a novel
goal-update mechanism with deep reinforcement learning. We also apply these
policies to real-world settings with a tactile sim-to-real approach. Our
analysis highlights and addresses some challenges met during the sim-to-real
application, e.g. the learned policy tended to squeeze an object in the
bi-reorienting task due to the sim-to-real gap. Finally, we demonstrate the
generalizability and robustness of this system by experimenting with different
unseen objects with applied perturbations in the real world. Code and videos
are available at https://sites.google.com/view/bi-touch/.
Comment: Accepted by IEEE Robotics and Automation Letters (RA-L)
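A goal-update mechanism of the kind the abstract describes can be sketched as a curriculum that advances the commanded goal toward the final target once the policy reaches the current sub-goal. The tolerance and step size below are illustrative assumptions, not the Bi-Touch values.

```python
import numpy as np

def update_goal(current_goal, final_goal, achieved, tol=0.01, step=0.1):
    """Advance the training goal a fraction of the way toward the final
    goal once the current sub-goal is reached within tolerance `tol`.

    A minimal sketch of a goal-update curriculum for goal-conditioned RL;
    thresholds and step size are assumed, not taken from the paper.
    """
    if np.linalg.norm(achieved - current_goal) < tol:
        return current_goal + step * (final_goal - current_goal)
    return current_goal

goal = np.zeros(2)
final = np.array([1.0, 0.0])
# The policy reached the current sub-goal, so the goal moves 10% closer.
goal = update_goal(goal, final, achieved=np.zeros(2))
```

Called once per episode, this keeps early training goals reachable and gradually exposes the policy to the full task, which is one way to stabilize learning in large bimanual state-action spaces.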