39,343 research outputs found
Development of an intelligent object for grasp and manipulation research
Kõiva R, Haschke R, Ritter H. Development of an intelligent object for grasp and manipulation research. Presented at ICAR 2011, Tallinn, Estonia.
In this paper we introduce a novel device, called iObject, which is equipped with tactile and motion tracking sensors that allow for the evaluation of human and robot grasping and manipulation actions. Contact location and contact force, object acceleration in space (6D) and orientation relative to the earth (3D magnetometer) are measured and transmitted wirelessly over a Bluetooth connection. By allowing human-human, human-robot and robot-robot comparisons to be made, iObject is a versatile tool for studying manual interaction.
To demonstrate the efficiency and flexibility of iObject for the study of bimanual interactions, we report on a physiological experiment and evaluate the main parameters of the dual-handed manipulation task under consideration.
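The abstract lists the quantities one iObject sample carries (contact locations and forces, 6D acceleration, 3D magnetometer reading). A minimal sketch of a record type for such a sample is below; the field names and the `total_contact_force` helper are illustrative assumptions, not taken from the actual device firmware or protocol.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical container for one wireless iObject sample; names are
# illustrative assumptions, not the device's real data format.
@dataclass
class IObjectSample:
    timestamp_s: float
    # each contact: (location on object surface in metres, normal force in N)
    contacts: List[Tuple[Tuple[float, float, float], float]]
    # 6D motion: linear acceleration (x, y, z) + angular rates (rx, ry, rz)
    accel_6d: Tuple[float, float, float, float, float, float]
    # 3D magnetometer reading giving orientation relative to the earth
    mag_3d: Tuple[float, float, float]

    def total_contact_force(self) -> float:
        """Sum of normal forces over all active tactile contacts."""
        return sum(force for _, force in self.contacts)

sample = IObjectSample(
    timestamp_s=0.01,
    contacts=[((0.0, 0.01, 0.02), 1.5), ((0.0, -0.01, 0.02), 2.0)],
    accel_6d=(0.0, 0.0, 9.81, 0.0, 0.0, 0.0),
    mag_3d=(0.2, 0.0, 0.4),
)
print(sample.total_contact_force())  # 3.5
```

A stream of such records is what a human-vs-robot grasp comparison would consume, regardless of which agent held the object.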
Data-Driven Grasp Synthesis - A Survey
We review the work on data-driven grasp synthesis and the methodologies for sampling and ranking candidate grasps. We divide the approaches into three groups based on whether they synthesize grasps for known, familiar, or unknown objects. This structure allows us to identify common object representations and perceptual processes that facilitate the employed data-driven grasp synthesis technique. In the case of known objects, we concentrate on approaches that are based on object recognition and pose estimation. In the case of familiar objects, the techniques use some form of similarity matching against a set of previously encountered objects. Finally, for the approaches dealing with unknown objects, the core part is the extraction of specific features that are indicative of good grasps. Our survey provides an overview of the different methodologies and discusses open problems in the area of robot grasping. We also draw a parallel to the classical approaches that rely on analytic formulations.
Comment: 20 pages, 30 figures, submitted to IEEE Transactions on Robotics
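The survey's three-way taxonomy amounts to a dispatch on how well a new observation matches stored objects. A minimal sketch of that dispatch is below; the threshold values and function name are assumptions for illustration, not parameters from the survey.

```python
def grasp_strategy(best_similarity: float,
                   recognition_threshold: float = 0.95,
                   familiarity_threshold: float = 0.6) -> str:
    """Pick a synthesis route following the survey's known/familiar/unknown
    split. `best_similarity` is the highest match score against a database
    of previously encountered objects (thresholds are illustrative)."""
    if best_similarity >= recognition_threshold:
        # Known object: recognition + pose estimation, then look up grasps.
        return "known"
    if best_similarity >= familiarity_threshold:
        # Familiar object: transfer grasps from the most similar stored object.
        return "familiar"
    # Unknown object: extract local features indicative of good grasps.
    return "unknown"

print(grasp_strategy(0.99), grasp_strategy(0.7), grasp_strategy(0.1))
```

In a real pipeline each branch would invoke a very different perceptual process, which is exactly why the survey organizes the literature along this axis.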
Grasping bulky objects with two anthropomorphic hands
© 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
This paper presents an algorithm to compute precision grasps for bulky objects using two anthropomorphic hands. We use objects modeled as point clouds obtained from a sensor camera or from a CAD model. We then process the point clouds, dividing them into two sets of slices in which we look for triplets of points. Each triplet must satisfy some physical conditions based on the structure of the hands. The triplets of points from each set of slices are then evaluated to find a combination that satisfies the force-closure (FC) condition. Once a valid pair of triplets has been found, the inverse kinematics of the system is computed to determine whether the corresponding points are reachable by the hands; if so, motion planning and a collision check are performed to assess whether the final grasp configuration of the system is suitable. The paper includes some application examples of the proposed approach.
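The first two stages of the pipeline (splitting the cloud into two sets of slices, then filtering candidate triplets against hand-dependent physical conditions) can be sketched as follows. This is a simplified stand-in: the split is a single median cut and the only "physical condition" checked is a maximum finger span, whereas the paper's conditions are derived from the actual hand structure.

```python
import itertools
import math

def split_into_slices(points, axis=2):
    """Split a point cloud into two sets about the median along `axis`
    (a simplified stand-in for the paper's two sets of slices)."""
    coords = sorted(p[axis] for p in points)
    mid = coords[len(coords) // 2]
    upper = [p for p in points if p[axis] >= mid]
    lower = [p for p in points if p[axis] < mid]
    return upper, lower

def feasible_triplets(points, max_span):
    """Triplets whose pairwise distances fit within the hand's finger span.
    Real physical conditions would also involve surface normals, etc."""
    out = []
    for triplet in itertools.combinations(points, 3):
        pairs = itertools.combinations(triplet, 2)
        if all(math.dist(a, b) <= max_span for a, b in pairs):
            out.append(triplet)
    return out

# Toy 6-point cloud standing in for a sensed object.
cloud = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1), (1, 0, 1)]
upper, lower = split_into_slices(cloud)
print(len(feasible_triplets(upper, max_span=2.0)))  # 1 triplet fits this span
```

Downstream, each surviving pair of triplets (one per hand) would still need the force-closure test, inverse kinematics, and collision checking described in the abstract.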
Dexterous manipulation of unknown objects using virtual contact points
The manipulation of unknown objects is a problem of special interest in robotics, since it is not always possible to have exact models of the objects with which the robot interacts. This paper presents a simple strategy to manipulate unknown objects using a robotic hand equipped with tactile sensors. The hand configurations that allow the rotation of an unknown object are computed using only tactile and kinematic information obtained during the manipulation process, by reasoning about the desired and real positions of the fingertips during the manipulation. This is done taking into account that the desired positions of the fingertips are not physically reachable, since they are located in the interior of the manipulated object; they are therefore virtual positions with associated virtual contact points. The proposed approach was satisfactorily validated using three fingers of an anthropomorphic robotic hand (Allegro Hand), with the original fingertips replaced by tactile sensors (WTS-FT). In the experimental validation, several everyday objects with different shapes were successfully manipulated, rotating them without needing to know their shape or any other physical property.
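The core idea is that fingertip targets placed inside the object are unreachable, so the controller rotates those virtual targets and lets tactile feedback report where the fingertip actually stopped on the surface. A minimal 2D sketch of the two ingredients (rotating virtual targets, and measuring the virtual-vs-real discrepancy) is below; the function names and the planar simplification are assumptions for illustration, not the paper's formulation.

```python
import math

def rotate_targets(targets, angle_rad, center=(0.0, 0.0)):
    """Rotate virtual fingertip targets about a (hypothetical) object axis,
    here reduced to a 2D rotation about `center` for illustration."""
    cx, cy = center
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return [(cx + c * (x - cx) - s * (y - cy),
             cy + s * (x - cx) + c * (y - cy)) for x, y in targets]

def virtual_contact_error(desired_virtual, measured_contact):
    """Offset between the unreachable virtual target inside the object and
    the real contact point reported by the tactile sensor; this is the
    signal the strategy reasons about to keep the object rotating."""
    return tuple(d - m for d, m in zip(desired_virtual, measured_contact))

# Rotate one virtual target a quarter turn about the object axis.
new_targets = rotate_targets([(1.0, 0.0)], math.pi / 2)
err = virtual_contact_error((0.0, 0.8), (0.0, 1.0))  # target 0.2 inside surface
print(new_targets, err)
```

Because only the error between desired and sensed fingertip positions is used, no object model is ever required, which matches the abstract's claim of model-free rotation.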
Regrasp Planning using 10,000s of Grasps
This paper develops intelligent algorithms for robots to reorient objects. Given the initial and goal poses of an object, the proposed algorithms plan a sequence of robot poses and grasp configurations that reorient the object from its initial pose to the goal. While the topic has been studied extensively in previous work, this paper makes important improvements in grasp planning by using over-segmented meshes, in data storage by using a relational database, and in regrasp planning by mixing in real-world roadmaps. The improvements enable robots to do robust regrasp planning using 10,000s of grasps and their relationships in interactive time. The proposed algorithms are validated using various objects and robots.
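Storing tens of thousands of grasps and their relationships relationally makes candidate lookup an indexed query rather than a linear scan. A minimal sketch using SQLite is below; the schema, table names, and quality column are assumptions for illustration, since the paper's actual schema is not given in the abstract.

```python
import sqlite3

# Illustrative schema: one row per grasp, plus an edge table linking grasps
# that share a stable placement (the relationships used in regrasp planning).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE grasp (
    id INTEGER PRIMARY KEY,
    object_name TEXT NOT NULL,
    pose BLOB,               -- serialized 6D gripper pose (placeholder)
    quality REAL
);
CREATE TABLE regrasp_edge (
    from_grasp INTEGER REFERENCES grasp(id),
    to_grasp   INTEGER REFERENCES grasp(id)
);
CREATE INDEX idx_grasp_obj ON grasp(object_name, quality);
""")
conn.executemany(
    "INSERT INTO grasp (object_name, pose, quality) VALUES (?, ?, ?)",
    [("bunny", b"...", 0.9), ("bunny", b"...", 0.4), ("mug", b"...", 0.7)],
)
# Indexed retrieval of the best stored grasp for one object.
best = conn.execute(
    "SELECT id, quality FROM grasp WHERE object_name = ? "
    "ORDER BY quality DESC LIMIT 1", ("bunny",)).fetchone()
print(best)
```

With an index on `(object_name, quality)`, this kind of query stays interactive even at the 10,000s-of-grasps scale the paper targets.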
Safe Robotic Grasping: Minimum Impact-Force Grasp Selection
This paper addresses the problem of selecting from a choice of possible grasps, so that impact forces will be minimised if a collision occurs while the robot is moving the grasped object along a post-grasp trajectory. Such considerations are important for safety in human-robot interaction, where even a certified "human-safe" (e.g. compliant) arm may become hazardous once it grasps and begins moving an object, which may have significant mass, sharp edges, or other dangers. Additionally, minimising collision forces is critical to preserving the longevity of robots which operate in uncertain and hazardous environments, e.g. robots deployed for nuclear decommissioning, where removing a damaged robot from a contaminated zone for repairs may be extremely difficult and costly. Unwanted collisions between a robot and critical infrastructure (e.g. pipework) in such high-consequence environments can also be disastrous. In this paper, we investigate how the safety of the post-grasp motion can be considered during the pre-grasp approach phase, so that the selected grasp is optimal in terms of applying minimum impact forces if a collision occurs during a desired post-grasp manipulation. We build on the methods of augmented robot-object dynamics models and "effective mass", and propose a method for combining these concepts with modern grasp and trajectory planners, to enable the robot to achieve a grasp which maximises the safety of the post-grasp trajectory by minimising potential collision forces. We demonstrate the effectiveness of our approach through several experiments with both simulated and real robots.
Comment: To appear in IEEE/RAS IROS 201
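The "effective mass" concept gives the scalar inertia the environment would feel along a given impact direction u: m_u = 1 / (uᵀ Λ⁻¹ u), where Λ⁻¹ = J M(q)⁻¹ Jᵀ is the task-space inverse inertia of the augmented robot-object system. A minimal sketch of ranking candidate grasps by this quantity is below; it takes Λ⁻¹ as a precomputed small matrix rather than deriving it from a robot model, and the candidate structure is an assumption for illustration.

```python
def effective_mass(lambda_inv, u):
    """Effective mass along unit direction u: m_u = 1 / (u^T Λ⁻¹ u),
    with Λ⁻¹ the task-space inverse inertia (passed in precomputed here;
    a sketch, not the paper's implementation)."""
    n = len(u)
    quad = sum(u[i] * lambda_inv[i][j] * u[j]
               for i in range(n) for j in range(n))
    return 1.0 / quad

def pick_min_impact_grasp(candidates, impact_dir):
    """Select the grasp whose augmented robot+object dynamics give the
    smallest effective mass (hence smallest impact force) along impact_dir."""
    return min(candidates,
               key=lambda g: effective_mass(g["lambda_inv"], impact_dir))

# Two hypothetical candidate grasps with diagonal task-space inverse inertias.
g1 = {"name": "g1", "lambda_inv": [[0.5, 0.0], [0.0, 0.5]]}  # m_u = 2.0
g2 = {"name": "g2", "lambda_inv": [[1.0, 0.0], [0.0, 1.0]]}  # m_u = 1.0
print(pick_min_impact_grasp([g1, g2], (1.0, 0.0))["name"])  # g2
```

In the full method, Λ⁻¹ would change with both the grasp configuration and the arm posture along the planned trajectory, which is why the selection must be coupled to the trajectory planner rather than done once at grasp time.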