Learning Task Priorities from Demonstrations
Bimanual operations in humanoids offer the possibility to carry out more than
one manipulation task at the same time, which in turn introduces the problem of
task prioritization. We address this problem from a learning from demonstration
perspective, by extending the Task-Parameterized Gaussian Mixture Model
(TP-GMM) to Jacobian and null space structures. The proposed approach is tested
on bimanual skills but can be applied in any scenario where the prioritization
between potentially conflicting tasks needs to be learned. We evaluate the
proposed framework in: two different tasks with humanoids requiring the
learning of priorities and a loco-manipulation scenario, showing that the
approach can be exploited to learn the prioritization of multiple tasks in
parallel.

Comment: Accepted for publication in the IEEE Transactions on Robotics.
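The null-space prioritization that such a learned model modulates can be sketched in a few lines of numpy. Here a scalar weight `alpha` stands in for the priority behavior that the paper learns from demonstrations via TP-GMM; the damped pseudo-inverse and the blending form are standard choices assumed for illustration, not the paper's exact formulation.

```python
import numpy as np

def damped_pinv(J, damping=1e-6):
    # Damped pseudo-inverse for numerical robustness near singularities.
    return J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))

def prioritized_velocity(J1, dx1, J2, dx2, alpha=1.0):
    """Blend two strict task hierarchies with weight alpha in [0, 1].

    alpha = 1 -> task 1 strictly dominates (task 2 runs in its null space);
    alpha = 0 -> the hierarchy is reversed.  In the paper the priority is
    learned from demonstrations; here it is a hand-set scalar.
    """
    n = J1.shape[1]
    N1 = np.eye(n) - damped_pinv(J1) @ J1   # null-space projector of task 1
    N2 = np.eye(n) - damped_pinv(J2) @ J2   # null-space projector of task 2
    dq_1_over_2 = damped_pinv(J1) @ dx1 + N1 @ (damped_pinv(J2) @ dx2)
    dq_2_over_1 = damped_pinv(J2) @ dx2 + N2 @ (damped_pinv(J1) @ dx1)
    return alpha * dq_1_over_2 + (1.0 - alpha) * dq_2_over_1
```

With compatible (here orthogonal) tasks, both task velocities are reproduced regardless of `alpha`; the weight matters exactly when the tasks conflict, which is the situation the paper's learned priorities resolve.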
Generation of dynamic motion for anthropomorphic systems under prioritized equality and inequality constraints
In this paper, we propose a solution to compute full-dynamic motions for a humanoid robot, accounting for various kinds of constraints such as dynamic balance or joint limits. As a first step, we propose a unification of task-based control schemes in inverse kinematics and inverse dynamics. Based on this unification, we generalize the cascade of quadratic programs that had previously been developed for inverse kinematics only. We then apply the solution to generate, in simulation, whole-body motions for a humanoid robot in unilateral contact with the ground, while ensuring dynamic balance on a non-horizontal surface.
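The equality-only core of such a cascade can be sketched as lexicographic least squares: each level solves its task as well as possible inside the null space of all higher levels. This is a toy sketch, not the paper's full dynamics formulation; handling the inequality constraints (joint limits, unilateral contact) would require an actual QP solver at each level, and the matrices below are illustrative assumptions.

```python
import numpy as np

def lexicographic_lsq(levels, n):
    """Strict-priority (lexicographic) least squares over equality levels.

    levels: list of (A, b) pairs, highest priority first.  Each level
    minimizes ||A x - b||^2 without degrading any higher-priority level,
    by restricting its correction to the remaining null space.
    """
    x = np.zeros(n)
    Z = np.eye(n)                          # basis of the remaining null space
    for A, b in levels:
        Ak = A @ Z                         # task Jacobian restricted to Z
        Ak_pinv = np.linalg.pinv(Ak)
        x = x + Z @ Ak_pinv @ (b - A @ x)  # correction stays inside Z
        Z = Z @ (np.eye(n) - Ak_pinv @ Ak) # shrink the available null space
    return x
```

For example, with a top-priority level fixing `x[0] = 1` and a lower level asking for `x[0] + x[1] = 5`, the cascade returns `x = [1, 4]`: the second level is solved without disturbing the first.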
Safety-related Tasks within the Set-Based Task-Priority Inverse Kinematics Framework
In this paper we present a framework that allows the motion control of a
robotic arm automatically handling different kinds of safety-related tasks. The
developed controller is based on a Task-Priority Inverse Kinematics algorithm
that allows the manipulator's motion while respecting constraints defined
either in the joint or in the operational space in the form of equality-based
or set-based tasks. This makes it possible to define, among others, tasks
such as joint limits, obstacle avoidance, or limits on the workspace in the
operational space. Additionally, an algorithm for the real-time computation of
the minimum distance between the manipulator and other objects in the
environment using depth measurements has been implemented, effectively allowing
obstacle avoidance tasks. Experiments with a Jaco manipulator, operating in
an environment where an RGB-D sensor is used for obstacle detection, show
the effectiveness of the developed system.
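The set-based activation logic built around such a minimum-distance measure can be sketched as follows, assuming the distance is computed between discrete control points on the arm and obstacle points extracted from the depth data. The threshold and gain are illustrative assumptions, not values from the paper.

```python
import numpy as np

def min_distance(points_robot, points_obs):
    # Pairwise distances between robot control points and obstacle points;
    # returns the minimum distance and the closest pair of points.
    d = np.linalg.norm(points_robot[:, None, :] - points_obs[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return d[i, j], points_robot[i], points_obs[j]

def avoidance_task(points_robot, points_obs, d_safe=0.3, gain=1.0):
    """Set-based obstacle-avoidance task.

    Inactive while the minimum distance lies inside the valid set
    (d >= d_safe); when active, it commands a Cartesian velocity along the
    obstacle-to-robot direction that pushes d back above d_safe.
    """
    d, p_robot, p_obs = min_distance(points_robot, points_obs)
    if d >= d_safe:
        return None                       # task inactive: constraint satisfied
    direction = (p_robot - p_obs) / d     # unit vector away from the obstacle
    return gain * (d_safe - d) * direction
```

When the task returns a velocity, a priority framework such as the one above inserts it at high priority; when it returns `None`, the task is dropped from the hierarchy and lower-priority tasks regain those degrees of freedom.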
Handling robot constraints within a Set-Based Multi-Task Priority Inverse Kinematics Framework
Set-Based Multi-Task Priority is a recent framework to handle inverse
kinematics for redundant structures. Both equality tasks, i.e., control
objectives to be driven to a desired value, and set-based tasks, i.e., control
objectives to be satisfied within a set/range of values, can be addressed in a
rigorous manner within a priority framework. In addition, optimization tasks,
driven by the gradient of a proper function, may be considered as well, usually
as lower-priority tasks. In this paper, the proper design of the tasks, their
priorities, and the use of a Set-Based Multi-Task Priority framework are proposed
in order to handle several constraints simultaneously in real-time. It is shown
that safety-related tasks such as, e.g., joint limits or kinematic
singularities may be properly handled by considering them both at a higher
priority as set-based tasks and at a lower priority within a proper
optimization functional. Experimental results on a 7-DOF Jaco$^2$
manipulator are presented.
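The dual treatment of a constraint, set-based at high priority and optimization functional at low priority, can be sketched for joint limits: the gradient of the classic mid-range joint cost is descended inside the primary task's null space, pulling joints toward the middle of their range whenever the primary task leaves redundancy. The cost, gain, and damped pseudo-inverse below are standard textbook choices assumed for illustration.

```python
import numpy as np

def joint_range_gradient(q, q_min, q_max):
    """Gradient of the mid-range cost
    w(q) = (1 / 2n) * sum_i ((q_i - q_mid,i) / (q_max,i - q_min,i))^2,
    a common optimization functional for joint-limit avoidance."""
    q_mid = 0.5 * (q_min + q_max)
    rng = q_max - q_min
    return (q - q_mid) / (len(q) * rng ** 2)

def ik_with_limit_avoidance(J, dx, q, q_min, q_max, k=1.0, damping=1e-6):
    # Primary task via damped pseudo-inverse; the joint-range cost is
    # minimized as a lower-priority task by projecting its negative
    # gradient into the primary task's null space.
    Jp = J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))
    N = np.eye(J.shape[1]) - Jp @ J
    dq_secondary = -k * joint_range_gradient(q, q_min, q_max)
    return Jp @ dx + N @ dq_secondary
```

The set-based side of the same constraint would sit above this as a hard task (as in the obstacle-avoidance sketch earlier): the gradient term merely keeps joints away from their limits, while the set-based task guarantees they are never crossed.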
Exploiting the robot kinematic redundancy for emotion conveyance to humans as a lower priority task
Current approaches do not allow robots to execute a task and simultaneously convey emotions to users through their body motions. This paper explores the capabilities of the Jacobian null space of a humanoid robot to convey emotions. A task-priority formulation has been implemented on a Pepper robot which allows the specification of a primary task (waving gesture, transportation of an object, etc.) and exploits the kinematic redundancy of the robot to convey emotions to humans as a lower-priority task. The emotions, defined by Mehrabian as points in the pleasure–arousal–dominance space, generate intermediate motion features (jerkiness, activity and gaze) that carry the emotional information. A map from these features to the joints of the robot is presented. A user study has been conducted in which emotional motions were shown to 30 participants. The results show that happiness and sadness are conveyed very well, calm moderately well, and fear poorly. An analysis of the dependencies between the motion features and the emotions perceived by the participants shows that activity correlates positively with arousal, jerkiness is not perceived by the user, and gaze conveys dominance when activity is low. The results indicate a strong influence of the most energetic motions of the emotional task and point out new directions for further research. Overall, the results show that the null-space approach can be regarded as a promising means to convey emotions as a lower-priority task.