Goal Set Inverse Optimal Control and Iterative Re-planning for Predicting Human Reaching Motions in Shared Workspaces
To enable safe and efficient human-robot collaboration in shared workspaces,
it is important for the robot to predict how a human will move when performing
a task. While predicting human motion for tasks not known a priori is very
challenging, we argue that single-arm reaching motions for known tasks in
collaborative settings (which are especially relevant for manufacturing) are
indeed predictable. Two hypotheses underlie our approach for predicting such
motions: First, that the trajectory the human performs is optimal with respect
to an unknown cost function, and second, that human adaptation to their
partner's motion can be captured well through iterative re-planning with the
above cost function. The key to our approach is thus to learn a cost function
which "explains" the motion of the human. To do this, we gather example
trajectories from pairs of participants performing a collaborative assembly
task using motion capture. We then use Inverse Optimal Control to learn a cost
function from these trajectories. Finally, we predict reaching motions from the
human's current configuration to a task-space goal region by iteratively
re-planning a trajectory using the learned cost function. Our planning
algorithm is based on the trajectory optimizer STOMP; it plans for a 23-DoF
human kinematic model and accounts for the presence of a moving collaborator
and obstacles in the environment. Our results suggest that, in most cases, our
method outperforms baseline methods when predicting motions. We also show that
our method outperforms baselines for predicting human motion when a human and a
robot share the workspace.
Comment: 12 pages, accepted for publication in IEEE Transactions on Robotics 201
Flexible human-robot cooperation models for assisted shop-floor tasks
The Industry 4.0 paradigm emphasizes the crucial benefits that collaborative
robots, i.e., robots able to work alongside and together with humans, could
bring to the whole production process. In this context, an enabling technology
yet unreached is the design of flexible robots able to deal at all levels with
humans' intrinsic variability, which is not only a necessary element for a
comfortable working experience for the person but also a precious capability
for efficiently dealing with unexpected events. In this paper, a sensing,
representation, planning and control architecture for flexible human-robot
cooperation, referred to as FlexHRC, is proposed. FlexHRC relies on wearable
sensors for human action recognition, AND/OR graphs for the representation of
and reasoning upon cooperation models, and a Task Priority framework to
decouple action planning from robot motion planning and control.
Comment: Submitted to Mechatronics (Elsevier)
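The AND/OR-graph representation of cooperation models mentioned above can be illustrated with a small sketch. The node types, the traversal helpers, and the toy fetch-then-screw assembly task are invented for illustration; FlexHRC's actual graphs and the reasoning machinery built on them are richer.

```python
class Node:
    """AND/OR-graph node: AND = all children required,
    OR = any one child suffices, LEAF = an executable action."""
    def __init__(self, name, kind="LEAF", children=()):
        self.name, self.kind = name, kind
        self.children = list(children)
        self.done = False               # only meaningful for leaves

def solved(node):
    """A node is solved when its leaf is done, all AND-children are
    solved, or any OR-child is solved."""
    if node.kind == "LEAF":
        return node.done
    results = [solved(c) for c in node.children]
    return all(results) if node.kind == "AND" else any(results)

def frontier(node):
    """Pending leaf actions that can still advance the cooperation."""
    if node.kind == "LEAF":
        return [] if node.done else [node]
    if solved(node):
        return []
    return [leaf for c in node.children for leaf in frontier(c)]

# toy cooperation model: fetch the part, then either agent screws it in
fetch = Node("fetch_part")
human = Node("human_screws")
robot = Node("robot_screws")
screw = Node("screw_part", "OR", [human, robot])
task  = Node("assemble", "AND", [fetch, screw])
```

The OR branch is what encodes flexibility: the model stays valid whether the human or the robot performs the screwing, so the planner can react to whichever action is actually recognized.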
Proceedings of the ECCS 2005 satellite workshop: embracing complexity in design - Paris 17 November 2005
Embracing complexity in design is one of the critical issues and challenges of the 21st century. As the realization grows that design activities and artefacts display properties associated with complex adaptive systems, so grows the need to use complexity concepts and methods to understand these properties and inform the design of better artefacts. It is a great challenge because complexity science represents an epistemological and methodological shift that promises a holistic approach to the understanding and operational support of design. But design is also a major contributor to complexity research.
Design science is concerned with problems that are fundamental in the sciences in general and the complexity sciences in particular. For instance, design has been perceived and studied as a ubiquitous activity inherent in every human activity, as the art of generating hypotheses, as a type of experiment, or as a creative co-evolutionary process. Design science and its established approaches and practices can be a great source of advancement and innovation in complexity science. These proceedings are the result of a workshop organized as part of the activities of a UK government AHRB/EPSRC-funded research cluster called Embracing Complexity in Design (www.complexityanddesign.net) and the European Conference in Complex Systems (complexsystems.lri.fr).
A Hierarchical Architecture for Flexible Human-Robot Collaboration
This thesis is devoted to the design of a software architecture for Human-Robot
Collaboration (HRC), to enhance robots' abilities for working alongside humans.
We propose FlexHRC, a hierarchical and flexible human-robot cooperation
architecture specifically designed to provide collaborative robots with an
extended degree of autonomy when supporting human operators in
high-variability tasks. Along with FlexHRC, we introduce novel techniques
appropriate for three interleaved levels, namely perception, representation,
and action, each aimed at addressing specific traits of human-robot
cooperation tasks.
The Industry 4.0 paradigm emphasizes the crucial benefits that collaborative
robots could bring to the whole production process. In this context, a yet
unreached enabling technology is the design of robots able to deal at all
levels with humans' intrinsic variability, which is not only a necessary
element for a comfortable working experience for humans but also a precious
capability for efficiently dealing with unexpected events. Moreover, flexible
assembly of semi-finished products is one of the expected features of
next-generation shop-floor lines. Currently, such flexibility rests on the
shoulders of human operators, who are responsible for coping with product
variability and are therefore subject to potentially high stress levels and
cognitive load when dealing with complex operations. At the same time,
operations on the shop-floor are still very structured and well defined.
Collaborative robots have been designed to shift part of this burden from
human operators to robots flexible enough to support them in high-variability
tasks as those tasks unfold.
As mentioned before, the FlexHRC architecture encompasses three levels:
perception, action, and representation. The perception level relies on
wearable sensors for human action recognition and on point cloud data for
perceiving objects in the scene. The action level embraces four components: a
robot execution manager for decoupling action planning from robot motion
planning and for mapping symbolic actions to the robot controller's command
interface, a Task Priority framework to control the robot, a differential
equation solver to simulate and evaluate the robot behaviour on-the-fly, and a
randomized method for robot path planning. The representation level depends on
AND/OR graphs for the online representation of and reasoning upon human-robot
cooperation models, a task manager to plan, adapt, and make decisions about
robot behaviours, and a knowledge base to store cooperation and workspace
information.
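The simulate-then-commit behaviour of the action level (evaluating the robot's predicted motion with a differential equation solver before executing it) can be sketched minimally as follows. The first-order dynamics, the explicit Euler integration, the workspace-radius feasibility check, and the candidate targets are all invented stand-ins, not FlexHRC's actual solver or criteria.

```python
import numpy as np

def simulate(x0, target, dt=0.05, steps=100, gain=2.0):
    """Forward-simulate toy first-order robot dynamics
    x' = gain * (target - x) with explicit Euler integration; a
    stand-in for the on-the-fly ODE evaluation in the action level."""
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(steps):
        x = x + dt * gain * (np.array(target) - x)
        path.append(x.copy())
    return np.array(path)

def choose_action(x0, candidates, workspace_radius=1.5):
    """Evaluate each candidate target in simulation and commit only to
    one whose whole predicted path stays inside the workspace."""
    for target in candidates:
        path = simulate(x0, target)
        if np.all(np.linalg.norm(path, axis=1) <= workspace_radius):
            return target
    return None          # no candidate is feasible; re-plan instead
```

Rejecting an action in simulation rather than during execution is what makes the resulting robot behaviour predictable to the human partner.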
We evaluated the FlexHRC functionalities against the desired application
objectives. This evaluation is accompanied by several experiments, namely a
collaborative screwing task, coordinated transportation of objects in a
cluttered environment, a collaborative table assembly task, and object
positioning tasks.
The main contributions of this work are: (i) the design and implementation of
FlexHRC, which meets the functional requirements of shop-floor assembly
applications, such as task- and team-level flexibility, scalability,
adaptability, and safety, to name just a few; (ii) the development of the task
representation, which integrates a hierarchical AND/OR graph whose online
behaviour is formally specified using First Order Logic; (iii) an in-the-loop
simulation-based decision-making process for the operations of collaborative
robots coping with the variability of human operator actions; (iv) the
adaptation of the robot to the human's on-the-fly decisions and actions via
human action recognition; and (v) robot behaviour that is predictable to the
human user, thanks to the task-priority-based control framework, the
introduced path planner, and natural, intuitive communication between the
robot and the human.