51 research outputs found

    Robotics Platforms Incorporating Manipulators Having Common Joint Designs

    Manipulators in accordance with various embodiments of the invention can be utilized to implement statically stable robots capable of both dexterous manipulation and versatile mobility. Manipulators in accordance with one embodiment of the invention include: an azimuth actuator; three elbow joints, each including two actuators that are offset to allow greater than 360-degree rotation of the joint; a first connecting structure that connects the azimuth actuator and a first of the three elbow joints; a second connecting structure that connects the first elbow joint and a second of the three elbow joints; a third connecting structure that connects the second elbow joint to a third of the three elbow joints; and an end-effector interface connected to the third of the three elbow joints.
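
    As a rough illustration of the joint arrangement described above, the sketch below models the chain (azimuth joint plus three pitch-type elbow joints ending at an end-effector interface) as a simple serial kinematic chain and composes a forward-kinematics pose. The link lengths, joint axes, and NumPy representation are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of the serial chain described above: an azimuth (yaw) joint
# followed by three elbow (pitch) joints and an end-effector interface.
# Link lengths and axis conventions are illustrative assumptions only.
import numpy as np

def rot_z(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

# (joint axis, offset to the next joint) pairs; lengths are placeholders.
# Because each elbow joint allows more than 360-degree rotation, no joint
# limits are imposed here.
CHAIN = [
    (rot_z, np.array([0.0, 0.0, 0.10])),  # azimuth actuator
    (rot_y, np.array([0.0, 0.0, 0.30])),  # first elbow joint
    (rot_y, np.array([0.0, 0.0, 0.30])),  # second elbow joint
    (rot_y, np.array([0.0, 0.0, 0.20])),  # third elbow joint -> end-effector interface
]

def forward_kinematics(q):
    """Compose the chain to get the end-effector pose (R, p) in the base frame."""
    R = np.eye(3)
    p = np.zeros(3)
    for (axis, offset), qi in zip(CHAIN, q):
        R = R @ axis(qi)      # rotate about the joint axis
        p = p + R @ offset    # advance along the link
    return R, p

R, p = forward_kinematics(np.radians([45, 30, -60, 90]))
print("end-effector position:", np.round(p, 3))
```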

    Supervised Autonomous Locomotion and Manipulation for Disaster Response with a Centaur-like Robot

    Mobile manipulation tasks are among the key challenges in search and rescue (SAR) robotics, requiring robots with flexible locomotion and manipulation abilities. Since the tasks are mostly unknown in advance, the robot has to adapt to a wide variety of terrains and workspaces during a mission. The centaur-like robot Centauro has a hybrid legged-wheeled base and an anthropomorphic upper body to carry out complex tasks in environments too dangerous for humans. Due to its high number of degrees of freedom, controlling the robot with direct teleoperation is challenging and exhausting. Supervised autonomy approaches promise to increase the quality and speed of control while keeping the flexibility to solve unknown tasks. We developed a set of operator assistance functionalities with different levels of autonomy to control the robot in challenging locomotion and manipulation tasks. The integrated system was evaluated in disaster response scenarios and showed promising performance.
    Comment: In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, October 2018.
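
    As a loose sketch of the "different levels of autonomy" idea, the snippet below routes an operator command differently depending on the selected assistance level. The level names, command types, and velocity limit are illustrative assumptions, not the Centauro control interface.

```python
# Illustrative sketch of operator-assistance levels; names and command
# structures are assumptions for illustration, not the Centauro software.
from enum import Enum, auto

class AssistanceLevel(Enum):
    DIRECT_TELEOPERATION = auto()    # operator drives joints/base directly
    ASSISTED_TELEOPERATION = auto()  # operator commands are filtered/limited
    SUPERVISED_AUTONOMY = auto()     # operator specifies goals, robot plans

def handle_command(level, command):
    """Route an operator command according to the selected assistance level."""
    if level is AssistanceLevel.DIRECT_TELEOPERATION:
        return {"type": "raw", "command": command}
    if level is AssistanceLevel.ASSISTED_TELEOPERATION:
        # e.g. cap velocities so direct control is less exhausting
        return {"type": "filtered", "command": command, "max_speed": 0.2}
    # supervised autonomy: the robot expands a goal into its own motion plan
    return {"type": "goal", "goal": command}

print(handle_command(AssistanceLevel.SUPERVISED_AUTONOMY, "grasp_valve"))
```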

    Proprioceptive Inference for Dual-Arm Grasping of Bulky Objects Using RoboSimian

    This work demonstrates dual-arm lifting of bulky objects based on inferred object properties (center of mass (COM) location, weight, and shape) using proprioception (i.e., force-torque measurements). Data-driven Bayesian models describe these quantities, which enables subsequent behaviors to depend on the confidence of the learned models. Experiments were conducted using the NASA Jet Propulsion Laboratory's (JPL) RoboSimian to lift a variety of cumbersome objects ranging in mass from 7 kg to 25 kg. The position of a supporting second manipulator was determined using a particle set and heuristics derived from the inferred object properties. For each bulky object, the supporting manipulator decreased the initial manipulator's load and distributed the wrench load more equitably across the two manipulators. Knowledge of the objects came from pure proprioception (i.e., without reliance on vision or other exteroceptive sensors) throughout the experiments.
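
    As a simplified illustration of inferring object properties from proprioception, the sketch below recovers a mass estimate and a partial center-of-mass offset from a single static wrist force-torque reading. The single-sample, statics-only formulation and sensor-frame conventions are assumptions for illustration; the paper itself uses data-driven Bayesian models over force-torque data rather than this closed-form estimate.

```python
# Hedged sketch: estimating mass and (partial) COM location of a held object
# from one static wrist force-torque reading. Frames and the single-sample
# estimate are simplifying assumptions, not the paper's Bayesian approach.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def estimate_mass_and_com(force, torque):
    """force, torque: wrench the object exerts on the wrist sensor (N, N*m),
    measured while the arm is static. Returns (mass, com_offset); the offset
    is the minimum-norm solution of torque = r x force, so its component
    along the force (gravity) direction is unobservable from a single pose."""
    force = np.asarray(force, float)
    torque = np.asarray(torque, float)
    mass = np.linalg.norm(force) / G
    r = np.cross(force, torque) / np.dot(force, force)
    return mass, r

# Example: a 10 kg object whose COM is offset 0.15 m along x from the sensor,
# with gravity along -z in the sensor frame.
f = np.array([0.0, 0.0, -10.0 * G])
tau = np.cross(np.array([0.15, 0.0, 0.0]), f)
print(estimate_mass_and_com(f, tau))
```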

    Supervised Remote Robot with Guided Autonomy and Teleoperation (SURROGATE): A Framework for Whole-Body Manipulation

    The use of the cognitive capabilities of humans to help guide the autonomy of robotic platforms, in what is typically called “supervised autonomy,” is becoming more commonplace in robotics research. The work discussed in this paper presents an approach to a human-in-the-loop mode of robot operation that integrates high-level human cognition and commanding with the intelligence and processing power of autonomous systems. Our framework for a “Supervised Remote Robot with Guided Autonomy and Teleoperation” (SURROGATE) is demonstrated on a robotic platform consisting of a pan-tilt perception head and two 7-DOF arms connected by a single 7-DOF torso, mounted on a tracked-wheel base. We present an architecture that allows high-level supervisory commands and intents to be specified by a user and then interpreted by the robotic system to perform whole-body manipulation tasks autonomously. We use the concept of “behaviors” to chain together sequences of “actions” for the robot to perform, which are then executed in real time.
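
    The sketch below illustrates one way the "behaviors chain together actions" idea could be structured: a behavior is an ordered list of actions executed until one fails. The class names and placeholder actions are assumptions for illustration, not the SURROGATE codebase.

```python
# Illustrative behavior/action chaining; names are hypothetical.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Action:
    name: str
    execute: Callable[[], bool]  # returns True on success

@dataclass
class Behavior:
    name: str
    actions: List[Action]

    def run(self) -> bool:
        """Execute actions in order, stopping at the first failure."""
        for action in self.actions:
            print(f"[{self.name}] executing action: {action.name}")
            if not action.execute():
                print(f"[{self.name}] action failed: {action.name}")
                return False
        return True

# Hypothetical whole-body pick behavior built from placeholder actions.
pick = Behavior("pick_object", [
    Action("look_at_target", lambda: True),
    Action("plan_whole_body_reach", lambda: True),
    Action("close_gripper", lambda: True),
    Action("lift_torso", lambda: True),
])
pick.run()
```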

    Architecture for in-space robotic assembly of a modular space telescope

    An architecture and conceptual design for a robotically assembled modular space telescope (RAMST) that enables extremely large space telescopes is presented. The distinguishing features of the RAMST architecture compared with prior concepts include the use of a modular deployable structure, a general-purpose robot, and advanced metrology, with the option of formation flying. To demonstrate the feasibility of the robotic assembly concept, we present a reference design using the RAMST architecture for a formation-flying 100 m telescope that is assembled in Earth orbit and operated at the Sun–Earth Lagrange Point 2.

    Tactile sensing and control of robotic manipulator integrating fiber Bragg grating strain-sensor

    Tactile sensing is an instrumental modality for robotic manipulation, as it provides information that is not accessible via remote sensors such as cameras or lidars. Touch is particularly crucial in unstructured environments, where the robot's internal representation of manipulated objects is uncertain. In this study we present the sensorization of an existing artificial hand, with the aim of achieving fine control of robotic limbs and perception of the physical properties of objects. Tactile feedback is conveyed by means of a soft sensor integrated at the fingertip of a robotic hand. The sensor consists of an optical fiber, housing fiber Bragg grating (FBG) transducers, embedded in a soft polymeric material integrated on a rigid hand. The ability of the system to acquire information was assessed through several tasks involving grasps of different objects in various conditions. Results show that a classifier based on the sensor outputs of the robotic hand is capable of accurately detecting both the size and the rigidity of the manipulated objects (99.36% and 100% accuracy, respectively). Furthermore, the outputs provide evidence of the ability to grab fragile objects without breakage or slippage, and to perform dynamic manipulation tasks that involve adapting finger positions based on the condition of the grasped objects.
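
    For context on the sensing principle, the sketch below applies the standard FBG strain relation, Δλ/λ ≈ (1 − p_e)·ε at constant temperature, to convert a measured Bragg wavelength shift into fingertip strain. The nominal wavelength and photo-elastic coefficient are typical silica-fiber values assumed for illustration rather than taken from the paper, and temperature compensation is omitted.

```python
# Hedged sketch of the standard FBG strain relation; constants are typical
# silica-fiber values (assumed), and thermal effects are ignored.
P_E = 0.22            # effective photo-elastic coefficient of silica (typical)
LAMBDA_B = 1550.0e-9  # nominal Bragg wavelength in metres (assumed)

def wavelength_shift_to_strain(delta_lambda):
    """Convert a Bragg wavelength shift (m) to axial strain (dimensionless)."""
    return delta_lambda / (LAMBDA_B * (1.0 - P_E))

# Example: a 0.1 nm shift corresponds to roughly 83 microstrain.
shift = 0.1e-9
print(f"{wavelength_shift_to_strain(shift) * 1e6:.1f} microstrain")
```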