Interaction primitives for human-robot cooperation tasks
To engage in cooperative activities with human
partners, robots have to possess basic interactive abilities
and skills. However, programming such interactive skills is a
challenging task, as each interaction partner can have different
timing or an alternative way of executing movements. In this
paper, we propose to learn interaction skills by observing how
two humans engage in a similar task. To this end, we introduce
a new representation called Interaction Primitives. Interaction
primitives build on the framework of dynamic motor primitives
(DMPs) by maintaining a distribution over the parameters of
the DMP. With this distribution, we can learn the inherent
correlations of cooperative activities which allow us to infer the
behavior of the partner and to participate in the cooperation.
We will provide algorithms for synchronizing and adapting the
behavior of humans and robots during joint physical activities.
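The core inference step described in this abstract, conditioning a distribution over DMP parameters on an observed partner motion, can be sketched as simple Gaussian conditioning. The sketch below uses synthetic data; all dimensions and variable names are hypothetical and not taken from the paper:

```python
import numpy as np

# Hypothetical demo data: each demonstration yields a stacked weight vector
# [w_human; w_robot], obtained by fitting a DMP to each partner's trajectory.
rng = np.random.default_rng(0)
n_demos, n_h, n_r = 30, 5, 5
W = rng.normal(size=(n_demos, n_h + n_r))
W[:, n_h:] += 0.8 * W[:, :n_h]          # induce a human-robot correlation

# Maintain a distribution over DMP parameters across demonstrations.
mu = W.mean(axis=0)
Sigma = np.cov(W, rowvar=False)

def infer_robot_weights(w_h_obs):
    """Condition the joint Gaussian over [w_h, w_r] on observed human weights."""
    mu_h, mu_r = mu[:n_h], mu[n_h:]
    S_hh = Sigma[:n_h, :n_h]
    S_rh = Sigma[n_h:, :n_h]
    return mu_r + S_rh @ np.linalg.solve(S_hh, w_h_obs - mu_h)

w_r_pred = infer_robot_weights(W[0, :n_h])
```

The learned cross-covariance block `S_rh` is what encodes the "inherent correlations of cooperative activities" the abstract refers to; with zero correlation the prediction collapses to the mean robot behavior.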
Environment-adaptive interaction primitives for human-robot motor skill learning
© 2016 IEEE. In complex environments where robots are expected to co-operate with human partners, it is vital for the robot to consider properties of their collaborative activity in addition to the behavior of its partner. In this paper, we propose to learn such complex interactive skills by observing the demonstrations of a human-robot team with additional external attributes. We propose Environment-adaptive Interaction Primitives (EalPs) as an extension of Interaction Primitives. In cooperation tasks between human and robot under different environmental conditions, EalPs not only improve the robot's predicted motor skills from a brief observation of the human's motion, but also gain the ability to generalize to new environmental conditions by learning the relationships between each condition and the corresponding motor skills from training samples. Our method is validated in the collaborative task of covering objects with a plastic bag using a humanoid Baxter robot. To achieve the task successfully, the robot needs to coordinate with its partner while also considering information about the object to be covered.
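One way to realize the environment adaptation this abstract describes is to include an environment descriptor in the joint distribution over team parameters and condition on it for a new situation. The following is a minimal sketch under that assumption, with synthetic data and hypothetical dimensions (the paper's actual model may differ):

```python
import numpy as np

# Hypothetical training set: each demo pairs an environment descriptor e
# (e.g. object size) with the stacked motor-skill weights w of the team.
rng = np.random.default_rng(1)
n_demos, d_e, d_w = 40, 2, 8
E = rng.normal(size=(n_demos, d_e))
W = E @ rng.normal(size=(d_e, d_w)) + 0.1 * rng.normal(size=(n_demos, d_w))

Z = np.hstack([E, W])                 # joint samples [e; w]
mu, Sigma = Z.mean(axis=0), np.cov(Z, rowvar=False)

def condition_on_env(e_new):
    """Predict the weight distribution for a new environmental condition."""
    mu_e, mu_w = mu[:d_e], mu[d_e:]
    S_ee = Sigma[:d_e, :d_e]
    S_we = Sigma[d_e:, :d_e]
    S_ww = Sigma[d_e:, d_e:]
    K = S_we @ np.linalg.inv(S_ee + 1e-8 * np.eye(d_e))  # jitter for stability
    return mu_w + K @ (e_new - mu_e), S_ww - K @ S_we.T

mu_w_new, Sigma_w_new = condition_on_env(np.array([0.5, -0.3]))
```

The conditioned distribution can then be further refined by observing the human partner, as in plain Interaction Primitives.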
Towards a Platform-Independent Cooperative Human Robot Interaction System: III. An Architecture for Learning and Executing Actions and Shared Plans
Robots should be capable of interacting in a cooperative and adaptive manner with their human counterparts in open-ended tasks that can change in real-time. An important aspect of the robot behavior will be the ability to acquire new knowledge of the cooperative tasks by observing and interacting with humans. The current research addresses this challenge. We present results from a cooperative human-robot interaction system that has been specifically developed for portability between different humanoid platforms, by abstraction layers at the perceptual and motor interfaces. In the perceptual domain, the resulting system is demonstrated to learn to recognize objects and to recognize actions as sequences of perceptual primitives, and to transfer this learning, and recognition, between different robotic platforms. For execution, composite actions and plans are shown to be learnt on one robot and executed successfully on a different one. Most importantly, the system provides the ability to link actions into shared plans, that form the basis of human-robot cooperation, applying principles from human cognitive development to the domain of robot cognitive systems. © 2009-2011 IEEE
Learning Human-Robot Collaboration Insights through the Integration of Muscle Activity in Interaction Motion Models
Recent progress in human-robot collaboration makes fast and fluid
interactions possible, even when human observations are partial and occluded.
Methods like Interaction Probabilistic Movement Primitives (ProMP) model human
trajectories through motion capture systems. However, such representation does
not properly model tasks where similar motions handle different objects. Under
current approaches, a robot would not adapt its pose and dynamics for proper
handling. We integrate the use of Electromyography (EMG) into the Interaction
ProMP framework and utilize muscular signals to augment the human observation
representation. The contribution of our paper is increased task discernment
when trajectories are similar but tools are different and require the robot to
adjust its pose for proper handling. Interaction ProMPs are used with an
augmented vector that integrates muscle activity. Augmented time-normalized
trajectories are used in training to learn correlation parameters and robot
motions are predicted by finding the best weight combination and temporal
scaling for a task. Collaborative single task scenarios with similar motions
but different objects were used and compared. In one experiment only joint
angles were recorded; in the other, EMG signals were additionally integrated.
Task recognition was computed for both tasks. Observation state vectors with
augmented EMG signals were able to completely identify differences across
tasks, while the baseline method failed every time. Integrating EMG signals
into collaborative tasks significantly increases the ability of the system to
recognize nuances in the tasks that are otherwise imperceptible, up to 74.6% in
our studies. Furthermore, the integration of EMG signals for collaboration also
opens the door to a wide class of human-robot physical interactions based on
haptic communication that has been largely unexploited in the field. Comment: 7 pages, 2 figures, 2 tables. As submitted to Humanoids 201
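The abstract's key claim, that augmenting joint-angle observations with EMG features disambiguates tasks whose trajectories are nearly identical, can be illustrated with a toy likelihood-based recognizer. Everything below (task names, dimensions, values) is hypothetical and only demonstrates the idea:

```python
import numpy as np

# Two tasks share nearly identical joint-angle statistics but differ in
# muscle activity; augmenting the observation with EMG makes them separable.
mu_joints = np.array([0.3, -0.1, 0.7])            # shared across tasks
tasks = {
    "hand_over_tool_A": np.concatenate([mu_joints, [0.2, 0.1]]),
    "hand_over_tool_B": np.concatenate([mu_joints, [0.8, 0.6]]),
}

def recognize(obs):
    """Pick the task whose augmented [joints; EMG] model best explains obs.

    With an isotropic Gaussian model, log-likelihood ranks by squared distance.
    """
    scores = {name: -np.sum((obs - m) ** 2) for name, m in tasks.items()}
    return max(scores, key=scores.get)

# Joints barely differ from the shared mean; EMG clearly matches task B.
obs = np.concatenate([mu_joints + 0.01, [0.75, 0.55]])
task = recognize(obs)    # resolves to "hand_over_tool_B"
```

Dropping the last two EMG components from `obs` leaves the two scores essentially tied, mirroring the baseline failure reported in the abstract.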
Folding Assembly by Means of Dual-Arm Robotic Manipulation
In this paper, we consider folding assembly as an assembly primitive suitable
for dual-arm robotic assembly, that can be integrated in a higher level
assembly strategy. The system composed by two pieces in contact is modelled as
an articulated object, connected by a prismatic-revolute joint. Different
grasping scenarios were considered in order to model the system, and a simple
controller based on feedback linearisation is proposed, using force-torque
measurements to compute the contact point kinematics. The folding assembly
controller has been experimentally tested with two sample parts, in order to
showcase folding assembly as a viable assembly primitive. Comment: 7 pages, accepted for ICRA 201
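The step of computing contact-point kinematics from force-torque measurements, mentioned in this abstract, can be sketched with the standard minimum-norm contact-point formula for a pure-force contact (the paper's actual estimator may differ; the values below are illustrative):

```python
import numpy as np

# For a pure-force contact, the torque measured at the sensor satisfies
# tau = r x f, and the minimum-norm contact point is r = (f x tau) / ||f||^2.
def contact_point(f, tau):
    """Estimate the contact-point location from a force-torque reading."""
    return np.cross(f, tau) / np.dot(f, f)

f = np.array([0.0, 0.0, 10.0])        # 10 N along z
r_true = np.array([0.05, 0.02, 0.0])  # contact 5 cm / 2 cm off the sensor
tau = np.cross(r_true, f)             # simulated torque reading
r_est = contact_point(f, tau)         # recovers r_true here
```

Note the formula only recovers the component of the contact point perpendicular to the force direction; resolving the remaining degree of freedom requires extra geometric constraints, such as the articulated-object model used in the paper.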