Learning to Navigate Cloth using Haptics
We present a controller that allows an arm-like manipulator to navigate
deformable cloth garments in simulation through the use of haptic information.
The main challenge for such a controller is to avoid getting tangled in,
tearing, or punching through the deforming cloth. Our controller aggregates
force information from a number of haptic-sensing spheres placed along the
manipulator for guidance. Based on the haptic forces, each sphere updates its
target location, and the conflicts that arise among this set of desired
positions are resolved by solving an inverse kinematics problem with
constraints.
Reinforcement learning is used to train the controller for a single
haptic-sensing sphere, where a training run is terminated (and thus penalized)
when large forces are detected due to contact between the sphere and a
simplified model of the cloth. In simulation, we demonstrate successful
navigation of a robotic arm through a variety of garments, including an
isolated sleeve, a jacket, a shirt, and shorts. Our controller outperforms two
baseline controllers: one without haptics and another that was trained on large
forces between the sphere and the cloth but without early termination.
Comment: Supplementary video available at https://youtu.be/iHqwZPKVd4A.
Related publications: http://www.cc.gatech.edu/~karenliu/Robotic_dressing.htm
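As a rough illustration of the mechanism described above, the sketch below shows one way the per-sphere target update and the early-termination penalty could look. This is a minimal, assumption-laden sketch, not the authors' implementation: the force threshold, the retreat rule, and all names are invented, and the constrained inverse-kinematics solve that reconciles the per-sphere targets is omitted.

```python
# Minimal sketch (not the paper's code) of a per-sphere haptic target update
# and the early-termination penalty described in the abstract.
import numpy as np

FORCE_LIMIT = 5.0  # assumed threshold on contact-force magnitude

def update_sphere_target(target, haptic_force, step=0.01):
    """Retreat the sphere's desired position along the sensed contact force."""
    magnitude = np.linalg.norm(haptic_force)
    if magnitude < 1e-8:
        return target  # no contact: keep the current target
    direction = haptic_force / magnitude
    return target - step * min(magnitude, FORCE_LIMIT) * direction

def training_step(targets, forces):
    """One training step: terminate (and penalize) on excessive contact force."""
    if any(np.linalg.norm(f) > FORCE_LIMIT for f in forces):
        return targets, -1.0, True  # large force: episode ends with a penalty
    new_targets = [update_sphere_target(t, f) for t, f in zip(targets, forces)]
    # The conflicts among these per-sphere targets would then be resolved by a
    # constrained inverse-kinematics solve (omitted here).
    return new_targets, 0.0, False
```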
MaestROB: A Robotics Framework for Integrated Orchestration of Low-Level Control and High-Level Reasoning
This paper describes a framework called MaestROB. It is designed to make
robots perform complex tasks with high precision from simple high-level
instructions given in natural language or by demonstration. To realize this, it
manages a hierarchical structure, using knowledge stored as ontologies and
rules to bridge between the different levels of instruction. Accordingly, the
framework has multiple layers of processing components: perception and
actuation control at the low level; a symbolic planner and Watson APIs for
cognitive capabilities and semantic understanding; and, at its core,
orchestration of these components by a new open-source robot middleware called
Project Intu.
at its core. We show how this framework can be used in a complex scenario where
multiple actors (a human, a communication robot, and an industrial robot)
collaborate to perform a common industrial task. A human teaches an assembly
task to Pepper (a humanoid robot from SoftBank Robotics) using natural language
conversation and demonstration. Our framework helps Pepper perceive the human
demonstration and generate a sequence of actions for the UR5 (a collaborative
robot arm from Universal Robots), which ultimately performs the assembly (e.g.,
insertion) task.
Comment: IEEE International Conference on Robotics and Automation (ICRA) 2018.
Video: https://www.youtube.com/watch?v=19JsdZi0TW
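To make the layered flow concrete, here is a toy sketch of how a high-level instruction might be expanded into primitives by ontology-style rules and then dispatched to a low-level controller. Everything here is assumed for illustration; it is not MaestROB's or Project Intu's actual API.

```python
# Toy sketch (not MaestROB's API) of the layered flow the paper describes:
# a high-level instruction is expanded into symbolic steps via rules, and
# each step is dispatched to a low-level actuation layer.

# Assumed ontology fragment: task verbs map to sequences of motor primitives.
RULES = {
    "insert": ["align", "approach", "push"],
    "pick": ["approach", "grasp", "lift"],
}

def plan(instruction: str) -> list[str]:
    """Symbolic planning layer: expand the leading verb into primitives."""
    verb = instruction.split()[0].lower()
    return RULES.get(verb, [])

class PrintingController:
    """Stand-in for the low-level control layer (e.g. a UR5 arm driver)."""
    def run(self, primitive: str) -> None:
        print(f"executing primitive: {primitive}")

def orchestrate(instruction: str, controller: PrintingController) -> None:
    """Orchestration layer: route each planned step to the controller."""
    for primitive in plan(instruction):
        controller.run(primitive)

orchestrate("insert peg into hole", PrintingController())
```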