
    Proprioceptive Inference for Dual-Arm Grasping of Bulky Objects Using RoboSimian

    This work demonstrates dual-arm lifting of bulky objects based on inferred object properties (center of mass (COM) location, weight, and shape) using proprioception (i.e., force-torque measurements). Data-driven Bayesian models describe these quantities, which enables subsequent behaviors to depend on the confidence of the learned models. Experiments were conducted using the NASA Jet Propulsion Laboratory's (JPL) RoboSimian to lift a variety of cumbersome objects ranging in mass from 7 kg to 25 kg. The position of a supporting second manipulator was determined using a particle set and heuristics derived from the inferred object properties. For each bulky object, the supporting manipulator decreased the initial manipulator's load and distributed the wrench load more equitably across the two manipulators. Throughout the experiments, knowledge of the objects came from proprioception alone, without reliance on vision or other exteroceptive sensors.
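    The abstract does not spell out the Bayesian models themselves, so the sketch below only illustrates the underlying proprioceptive idea: with the object held statically, wrist force-torque readings taken at several orientations determine the object's mass and its center-of-mass lever arm. The least-squares formulation, variable names, and synthetic data are assumptions, not the paper's implementation.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def skew(v):
    """Skew-symmetric matrix S(v) such that S(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_mass_and_com(forces, torques):
    """Least-squares estimate of object mass and COM lever arm.

    forces, torques: (N, 3) gravity-induced wrench readings in the wrist
    sensor frame, taken at several wrist orientations while the object is
    held statically. The COM lever arm r satisfies tau_i = r x f_i, i.e.
    tau_i = -skew(f_i) @ r, which stacks into an overdetermined linear system.
    """
    forces = np.asarray(forces, dtype=float)
    torques = np.asarray(torques, dtype=float)
    mass = np.linalg.norm(forces, axis=1).mean() / G
    A = np.vstack([-skew(f) for f in forces])   # (3N, 3)
    b = torques.reshape(-1)                     # (3N,)
    com, *_ = np.linalg.lstsq(A, b, rcond=None)
    return mass, com

# Synthetic check: a 10 kg object with its COM offset from the wrist sensor.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r_true, m_true = np.array([0.2, 0.0, 0.05]), 10.0
    gravity_dirs = rng.normal(size=(20, 3))
    gravity_dirs /= np.linalg.norm(gravity_dirs, axis=1, keepdims=True)
    f = m_true * G * gravity_dirs
    tau = np.cross(np.broadcast_to(r_true, f.shape), f)
    print(estimate_mass_and_com(f, tau))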

    Integrating Vision and Physical Interaction for Discovery, Segmentation and Grasping of Unknown Objects

    In this work, image-processing methods and the ability of humanoid robots to physically interact with their environment are used in close interplay to identify unknown objects, to separate them from the background and from other objects, and ultimately to grasp them. In the course of this interactive exploration, object properties such as appearance and shape are also determined.
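    As a rough illustration of the interactive-segmentation idea (not the method of this thesis), the sketch below segments an unknown object by comparing camera frames captured before and after the robot deliberately nudges the scene; pixels that change are attributed to the moved object. The frame-differencing approach and the threshold value are assumptions.

```python
import numpy as np

def segment_by_interaction(frame_before, frame_after, threshold=25.0):
    """Segment an unknown object via the motion a deliberate push induces.

    frame_before, frame_after: (H, W) grayscale images taken before and
    after the robot pushes the scene. Pixels whose intensity changes by
    more than `threshold` are attributed to the moved object; everything
    else is treated as static background or clutter.
    Returns a boolean (H, W) mask of the pushed object.
    """
    diff = np.abs(frame_after.astype(np.float32) - frame_before.astype(np.float32))
    return diff > threshold
```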

    Inter-finger Small Object Manipulation with DenseTact Optical Tactile Sensor

    The ability to grasp and manipulate small objects in cluttered environments remains a significant challenge. This paper introduces a novel approach that uses a tactile-sensor-equipped gripper with eight degrees of freedom to overcome these limitations. We employ DenseTact 2.0 for the gripper, enabling precise control and improved grasp success rates, particularly for small objects ranging from 5 mm to 25 mm. Our integrated strategy coordinates the robot arm, gripper, and sensor to effectively manipulate and orient small objects for subsequent classification. We contribute a specialized dataset for classifying these objects based on tactile sensor output and a new control algorithm for in-hand orientation tasks. Our system achieves an 88% grasp success rate and successfully classifies small objects in cluttered scenarios.
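    The abstract does not describe the classifier itself; the following is a minimal sketch of how tactile sensor output could be mapped to object classes, assuming single-channel 64x64 tactile images and a small PyTorch CNN. The resolution, class count, and network layout are placeholders, not values from the paper.

```python
import torch
import torch.nn as nn

class TactileClassifier(nn.Module):
    """Minimal CNN mapping a single-channel tactile image to class logits."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                    # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),            # global average pool -> (B, 64, 1, 1)
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    model = TactileClassifier(num_classes=10)
    dummy = torch.randn(4, 1, 64, 64)   # batch of placeholder tactile images
    print(model(dummy).shape)           # torch.Size([4, 10])
```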

    Object Recognition and Localization: The Role of Tactile Sensors

    Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This thesis presents two approaches for haptic object recognition and localization in ground and underwater environments. The first approach, called Batch RANSAC and Iterative Closest Point augmented Sequential Filter (BRICPSF), is based on an innovative combination of a sequential filter, the Iterative Closest Point (ICP) algorithm, and a feature-based Random Sample Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D objects of complex shapes and performs complete six-degree-of-freedom localization of static objects. The algorithms are validated by experiments in simulation and on actual hardware. To our knowledge, this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired and provides a close integration between exploration and recognition. An edge-following exploration strategy is developed that receives feedback from the current state of recognition. A recognition-by-parts approach is developed that uses BRICPSF for object-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or directed towards new parts to reinforce the current recognition belief. This approach is validated by simulation experiments.
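    BRICPSF combines a sequential filter, RANSAC feature matching, and ICP; the sketch below implements only the ICP refinement stage in its simplest point-to-point form, registering sparse tactile contact points against a model point cloud. It is a generic ICP under those assumptions, not the thesis's implementation, and the brute-force nearest-neighbour search is only practical for small tactile point sets.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t aligning src to dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(src, model, iters=50, tol=1e-6):
    """Point-to-point ICP: register tactile contact points to a model cloud.

    src:   (N, 3) contact points measured by the tactile sensor.
    model: (M, 3) sampled surface points of a database object.
    Returns the accumulated (R, t) and the final mean residual.
    """
    R_total, t_total = np.eye(3), np.zeros(3)
    cur, prev_err = src.copy(), np.inf
    for _ in range(iters):
        # brute-force nearest neighbours (fine for small tactile point sets)
        d2 = ((cur[:, None, :] - model[None, :, :]) ** 2).sum(axis=2)
        nn = model[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, nn)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.linalg.norm(cur - nn, axis=1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, err
```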

    Dexterous grasping under shape uncertainty

    An important challenge in robotics is to achieve robust performance in object grasping and manipulation while dealing with noise and uncertainty. This paper presents an approach for improving the performance of dexterous grasping under shape uncertainty. In our approach, the uncertainty in object shape is parameterized and incorporated as a constraint into grasp planning. The proposed approach is used to plan feasible hand configurations for realizing planned contacts using different robotic hands. A compliant finger-closing scheme is devised by exploiting both the object shape uncertainty and tactile sensing at the fingertips. Experimental evaluation demonstrates that our method improves the performance of dexterous grasping under shape uncertainty.
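    The paper's actual grasp-planning constraint is not given in the abstract; as one way to make the idea concrete, the sketch below ranks two-finger antipodal contact pairs on a surface model with per-point uncertainty, rejecting pairs outside the friction cone and penalising contacts on poorly known regions. The scoring heuristic, friction model, and parameter names are assumptions, not the paper's formulation.

```python
import numpy as np

def rank_antipodal_grasps(points, normals, sigmas, mu=0.5, k_sigma=2.0):
    """Rank two-finger antipodal contact pairs on an uncertain surface model.

    points  : (N, 3) estimated surface points.
    normals : (N, 3) unit outward surface normals.
    sigmas  : (N,)   per-point standard deviation of the surface estimate,
                     used to penalise contacts on poorly known regions.
    A pair passes the friction-cone test if squeezing along the grasp axis
    keeps both contact forces inside cones of half-angle arctan(mu); the
    surviving pairs are ranked by alignment minus an uncertainty penalty.
    """
    half_angle = np.arctan(mu)
    ranked = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            axis = points[j] - points[i]
            width = np.linalg.norm(axis)
            if width < 1e-6:
                continue
            axis = axis / width
            # squeezing pushes +axis into contact i and -axis into contact j
            ang_i = np.arccos(np.clip(-normals[i] @ axis, -1.0, 1.0))
            ang_j = np.arccos(np.clip(normals[j] @ axis, -1.0, 1.0))
            if max(ang_i, ang_j) > half_angle:
                continue
            penalty = k_sigma * (sigmas[i] + sigmas[j]) / width
            ranked.append((half_angle - max(ang_i, ang_j) - penalty, i, j))
    ranked.sort(reverse=True)
    return [(i, j) for _, i, j in ranked]
```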