58 research outputs found

    Regrasp Planning using 10,000s of Grasps

    This paper develops intelligent algorithms for robots to reorient objects. Given the initial and goal poses of an object, the proposed algorithms plan a sequence of robot poses and grasp configurations that reorient the object from its initial pose to the goal. While the topic has been studied extensively in previous work, this paper makes important improvements in grasp planning by using over-segmented meshes, in data storage by using a relational database, and in regrasp planning by mixing real-world roadmaps. The improvements enable robots to do robust regrasp planning using 10,000s of grasps and their relationships in interactive time. The proposed algorithms are validated using various objects and robots.
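    The core of regrasp planning as described above can be framed as a graph search over a roadmap whose nodes pair an object placement with a grasp, and whose edges are feasible transfer or transit motions. The sketch below is a minimal illustration of that idea, not the paper's implementation; the node labels and toy roadmap are hypothetical.

    ```python
    from collections import deque

    def plan_regrasp(roadmap, start, goal):
        """Breadth-first search over a regrasp roadmap.

        roadmap: dict mapping each (placement, grasp) node to its neighbors;
        an edge means the robot can move between the two states.
        Returns the shortest node sequence from start to goal, or None.
        """
        queue = deque([[start]])
        visited = {start}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == goal:
                return path
            for nxt in roadmap.get(node, ()):
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append(path + [nxt])
        return None

    # Toy roadmap: the object must be set down in an intermediate placement
    # ("side") so the gripper can switch from grasp g0 to grasp g1.
    roadmap = {
        ("upright", "g0"): [("side", "g0")],
        ("side", "g0"): [("side", "g1")],
        ("side", "g1"): [("goal", "g1")],
    }
    path = plan_regrasp(roadmap, ("upright", "g0"), ("goal", "g1"))
    ```

    At the scale the paper targets (tens of thousands of grasps), the roadmap would be queried from a database rather than held in a literal dict, but the search structure is the same.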

    On CAD Informed Adaptive Robotic Assembly

    We introduce a robotic assembly system that streamlines the design-to-make workflow for going from a CAD model of a product assembly to a fully programmed and adaptive assembly process. Our system captures (in the CAD tool) the intent of the assembly process for a specific robotic workcell and generates a recipe of task-level instructions. By integrating visual sensing with deep-learned perception models, the robots infer the necessary actions to assemble the design from the generated recipe. The perception models are trained directly from simulation, allowing the system to identify various parts based on CAD information. We demonstrate the system with a workcell of two robots to assemble interlocking 3D part designs. We first build and tune the assembly process in simulation, verifying the generated recipe. Finally, the real robotic workcell assembles the design using the same behavior.
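    A "recipe of task-level instructions" for a two-robot workcell might look like the sketch below. This is a hypothetical data layout, assuming pick/insert verbs and a CAD-derived assembly order; the abstract does not specify the actual instruction schema.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Instruction:
        robot: str    # which arm in the workcell executes the step
        action: str   # task-level verb, e.g. "pick" or "insert"
        part: str     # CAD part identifier
        target: str   # named pose or mating feature from the CAD model

    def generate_recipe(assembly_order):
        """Turn a CAD-derived assembly order into task-level instructions,
        alternating parts between the workcell's two arms."""
        recipe = []
        for i, (part, mate) in enumerate(assembly_order):
            robot = f"arm_{i % 2}"
            recipe.append(Instruction(robot, "pick", part, "tray"))
            recipe.append(Instruction(robot, "insert", part, mate))
        return recipe

    recipe = generate_recipe([("part_A", "base.slot_1"),
                              ("part_B", "part_A.slot_2")])
    ```

    The point of a task-level recipe is that it names *what* to achieve (part and mating feature), leaving *how* (trajectories, grasps) to the perception-driven runtime.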

    Implementation of a robotic flexible assembly system

    As part of the Intelligent Task Automation program, a team developed enabling technologies for programmable, sensory-controlled manipulation in unstructured environments. These technologies include 2-D/3-D vision sensing and understanding, force sensing and high-speed force control, 2.5-D vision alignment and control, and multiple-processor architectures. The design of a flexible, programmable, sensor-controlled robotic assembly system for small electromechanical devices built on these technologies is described, along with ongoing implementation and integration efforts. Using vision, the system picks parts dumped randomly in a tray. Using vision and force control, it performs high-speed part mating, in-process monitoring/verification of expected results, and autonomous recovery from some errors. It is programmed off-line with semiautomatic action planning.
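    The pick / mate / verify / recover cycle described above can be sketched as a small retry loop. This is an illustrative skeleton under assumed function names (locate, pick, mate, verify), not the system's actual control code.

    ```python
    def assemble_one(locate, pick, mate, verify, max_retries=3):
        """One assembly cycle with in-process verification and retry."""
        for attempt in range(max_retries):
            pose = locate()      # 2-D/3-D vision: find a part in the tray
            pick(pose)           # vision-guided pick
            mate()               # force-controlled part mating
            if verify():         # in-process monitoring of expected result
                return True
            # autonomous recovery: fall through and try again from locate
        return False

    # Toy run: verification fails once, triggering one autonomous retry.
    calls = []
    state = {"n": 0}
    def locate():
        calls.append("locate")
        return (0.0, 0.0)
    def pick(pose):
        calls.append("pick")
    def mate():
        calls.append("mate")
    def verify():
        calls.append("verify")
        state["n"] += 1
        return state["n"] >= 2

    succeeded = assemble_one(locate, pick, mate, verify)
    ```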

    A Reactive Planning Framework for Dexterous Robotic Manipulation

    This thesis investigates a reactive motion planning and controller framework that enables robots to manipulate objects dexterously. We develop a robotic platform that can quickly and reliably replan actions based on sensed information. Robotic manipulation is subject to noise due to uncertainty in frictional contact information, and reactivity is key for robustness. The planning framework has been designed with generality in mind and naturally extends to a variety of robotic tasks, manipulators, and sensors. This design is validated experimentally on an ABB IRB 14000 dual-arm industrial collaborative robot. In this research, we are interested in dexterous robot manipulation, where the key challenge is to move an object from an initial location to a desired configuration. The robot uses a high-resolution tactile sensor to monitor the progress of the task and drive its reactive behavior, countering mistakes and unaccounted-for environmental conditions. The motion planning framework is integrated with a task planner that dictates the high-level manipulation behavior of the robot, as well as with a low-level controller that adapts robot motions based on measured tactile signals.
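    The reactive pattern the abstract describes, executing a step, comparing sensed feedback against the expected outcome, and replanning on mismatch, can be sketched as below. The step representation and callback names are hypothetical, assuming each step carries an expected tactile outcome.

    ```python
    def reactive_execute(steps, execute, replan, max_replans=5):
        """Sense-act loop: after each step, compare the observed (tactile)
        outcome with the expected one and replan from the sensed state on
        mismatch. Returns True if the plan completes."""
        pending = list(steps)
        replans = 0
        while pending:
            action, expected = pending.pop(0)
            observed = execute(action)   # stand-in for motion + sensing
            if observed != expected:
                if replans == max_replans:
                    return False         # give up after repeated failures
                replans += 1
                pending = replan(observed)  # new plan from sensed state
        return True

    # Toy task: the first "grasp" slips once, triggering one replan.
    log = []
    slip = {"count": 0}
    def execute(action):
        log.append(action)
        if action == "grasp" and slip["count"] == 0:
            slip["count"] += 1
            return "slipped"
        return {"grasp": "held", "place": "placed"}[action]

    def replan(state):
        return [("grasp", "held"), ("place", "placed")]

    ok = reactive_execute([("grasp", "held"), ("place", "placed")],
                          execute, replan)
    ```

    The design point is that recovery is a planner call, not a hand-coded error branch, which is what lets the same loop generalize across tasks and sensors.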