Multi-robot grasp planning for sequential assembly operations
This paper addresses the problem of finding robot configurations to grasp assembly parts during a sequence of collaborative assembly operations. We formulate the search for such configurations as a constraint satisfaction problem (CSP). Collision constraints in an operation and transfer constraints between operations determine the sets of feasible robot configurations. We show that solving the connected constraint graph with off-the-shelf CSP algorithms can quickly become infeasible even for a few sequential assembly operations. We present an algorithm which, through the assumption of feasible regrasps, divides the CSP into independent smaller problems that can be solved exponentially faster. The algorithm then uses local search techniques to improve this solution by removing a gradually increasing number of regrasps from the plan. The algorithm enables the user to stop the planner anytime and use the current best plan if the cost of removing regrasps from the plan exceeds the cost of executing those regrasps. We present simulation experiments to compare our algorithm's performance to a naive algorithm which directly solves the connected constraint graph. We also present a physical robot system which uses the output of our planner to grasp and bring parts together in assembly configurations.
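The decomposition idea described above can be sketched roughly as follows. This is a minimal illustration with made-up operation and grasp names, not the paper's implementation: assuming a regrasp is always feasible between consecutive operations, each operation's grasp can be chosen independently; a greedy local search then reuses the previous grasp wherever it remains feasible, removing regrasps from the plan.

```python
def plan_with_regrasps(operations, feasible_grasps):
    # Assuming regrasps between operations are free, each operation's
    # grasp assignment is an independent (much smaller) subproblem:
    # pick any feasible grasp per operation.
    return [next(iter(feasible_grasps[op])) for op in operations]

def remove_regrasps(operations, feasible_grasps, plan):
    # Greedy local search: reuse the previous operation's grasp when it
    # is also feasible for the current one, eliminating one regrasp.
    improved = list(plan)
    for i in range(1, len(operations)):
        if improved[i - 1] in feasible_grasps[operations[i]]:
            improved[i] = improved[i - 1]
    return improved
```

In this sketch the "cost" trade-off from the abstract corresponds to stopping the local search early and executing the remaining regrasps.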
A Certified-Complete Bimanual Manipulation Planner
Planning motions for two robot arms to move an object collaboratively is a
difficult problem, mainly because of the closed-chain constraint, which arises
whenever two robot hands simultaneously grasp a single rigid object. In this
paper, we propose a manipulation planning algorithm to bring an object from an
initial stable placement (position and orientation of the object on the support
surface) towards a goal stable placement. The key specificity of our algorithm
is that it is certified-complete: for a given object and a given environment,
we provide a certificate that the algorithm will find a solution to any
bimanual manipulation query in that environment whenever one exists. Moreover,
the certificate is constructive: at run-time, it can be used to quickly find a
solution to a given query. The algorithm is tested in software and hardware on
a number of large pieces of furniture. Comment: 12 pages, 7 figures, 1 table.
Planning simultaneous perception and manipulation
This thesis is concerned with deriving planning algorithms for robot manipulators. Manipulation has two effects, the robot has a physical effect on the object, and it also acquires information about the object. This thesis presents algorithms that treat both problems.
First, I present an extension of the well-known piano mover's problem where a robot pushing an object must plan its movements as well as those of the object. This requires simultaneous planning in the joint space of the robot and the configuration space of the object, in contrast to the original problem, which only requires planning in the latter space. The effects of a robot action on the object configuration are determined by the non-invertible rigid body mechanics. To solve this, a two-level planner is presented that coordinates planning in each space.
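The two-level structure can be sketched as follows. This is an assumed, simplified rendering of the idea, not the thesis' algorithm: the high level searches the object's configuration space, while a low-level check decides whether some robot pushing motion realizes each object transition; because pushing is not invertible, some transitions are one-way.

```python
from collections import deque

def two_level_plan(start, goal, object_neighbors, robot_can_push):
    """Breadth-first search over object configurations, expanding only
    the transitions that the low-level robot planner can realize."""
    frontier, parents = deque([start]), {start: None}
    while frontier:
        q = frontier.popleft()
        if q == goal:
            # Reconstruct the object-space path.
            path = []
            while q is not None:
                path.append(q)
                q = parents[q]
            return path[::-1]
        for nxt in object_neighbors(q):
            # Low-level query: can any robot motion push the object
            # from configuration q to nxt?
            if nxt not in parents and robot_can_push(q, nxt):
                parents[nxt] = q
                frontier.append(nxt)
    return None  # no realizable path
```

For example, on a 1-D line where the robot can only push the object to the right, a plan exists from 0 to 3 but not from 3 back to 0, illustrating the non-invertibility mentioned above.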
Second, I consider planning under uncertainty and in particular planning for information effects. I consider the case where a robot has to reach and grasp an object under pose uncertainty caused by shape incompleteness. The main novel outcome is to enable tactile information gain planning for a dexterous, high-degree-of-freedom manipulator with non-Gaussian pose uncertainty. The method is demonstrated in trials with both simulated and real robots.
A Hierarchical Architecture for Flexible Human-Robot Collaboration
This thesis is devoted to designing a software architecture for Human-
Robot Collaboration (HRC), to enhance robots' abilities for working
alongside humans. We propose FlexHRC, a hierarchical and
flexible human-robot cooperation architecture specifically designed
to provide collaborative robots with an extended degree of autonomy
when supporting human operators in tasks with high variability.
Along with FlexHRC, we have introduced novel techniques appropriate
for three interleaved levels, namely perception, representation,
and action, each one aimed at addressing specific traits of human-robot
cooperation tasks.
The Industry 4.0 paradigm emphasizes the crucial benefits that collaborative
robots could bring to the whole production process. In this
context, a yet unreached enabling technology is the design of robots
able to deal at all levels with humans' intrinsic variability, which is
not only a necessary element of a comfortable working experience
for humans but also a precious capability for efficiently dealing with
unexpected events. Moreover, a flexible assembly of semi-finished
products is one of the expected features of next-generation shop-floor
lines. Currently, such flexibility is placed on the shoulders of human
operators, who are responsible for handling product variability and are
therefore subject to potentially high stress levels and cognitive load
when dealing with complex operations. At the same time, operations
in the shop-floor are still very structured and well-defined. Collaborative
robots have been designed to take over this burden from human operators,
being flexible enough to support them in high-variability tasks while
they unfold.
As mentioned before, the FlexHRC architecture encompasses three levels:
perception, action, and representation. The perception level relies
on wearable sensors for human action recognition and point cloud
data for perceiving the objects in the scene. The action level embraces
four components: a robot execution manager, which decouples action
planning from robot motion planning and maps symbolic actions to the
robot controller command interface; a task-priority framework to
control the robot; a differential equation solver to simulate and
evaluate the robot behaviour on-the-fly; and a sampling-based method
for robot path planning. The representation
level depends on AND/OR graphs for representing and reasoning upon
human-robot cooperation models online, a task manager to plan, adapt,
and make decisions about robot behaviours, and a knowledge base to
store cooperation and workspace information.
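As a toy illustration of the AND/OR-graph idea (hypothetical node names and structure, not FlexHRC's implementation): a node is reachable if at least one of its alternative hyperarcs (the OR choice) has all of its child nodes achieved (the AND condition), so alternative human or robot actions can achieve the same subgoal.

```python
def feasible(node, graph, achieved):
    """Return True if `node` can be reached given the achieved leaves."""
    if node in achieved:
        return True
    arcs = graph.get(node, [])
    # OR over alternative hyperarcs; AND over each arc's children.
    return any(all(feasible(child, graph, achieved) for child in arc)
               for arc in arcs)

# Hypothetical table-assembly fragment: the table is assembled once all
# four legs are attached; leg1 can be attached by either the human or
# the robot (two alternative hyperarcs).
graph = {
    "table_assembled": [["leg1", "leg2", "leg3", "leg4"]],
    "leg1": [["human_attach_leg1"], ["robot_attach_leg1"]],
}
```

Reasoning online over such a graph lets the task manager adapt when the human takes over a subgoal the robot had planned to execute.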
We evaluated the FlexHRC functionalities against the application's
desired objectives. This evaluation is accompanied by several
experiments, namely a collaborative screwing task, coordinated
transportation of objects in a cluttered environment, a collaborative
table assembly task, and object positioning tasks.
The main contributions of this work are: (i) the design and
implementation of FlexHRC, which enables the functional requirements
necessary for shop-floor assembly applications, such as task- and
team-level flexibility, scalability, adaptability, and safety, to name
a few; (ii) the development of a task representation integrating a
hierarchical AND/OR graph whose online behaviour is formally specified
using First Order Logic; (iii) an in-the-loop simulation-based
decision-making process for the operations of collaborative robots
coping with the variability of human operator actions; (iv) robot
adaptation to the human's on-the-fly decisions and actions via human
action recognition; and (v) robot behaviour that is predictable to the
human user thanks to the task-priority-based control framework, the
introduced path planner, and the natural and intuitive communication
of the robot with the human.