
    Design and control of a multi-fingered robot hand provided with tactile feedback

    The design, construction, control, and application of a three-fingered robot hand with nine degrees of freedom and built-in multi-component force sensors are described. The adopted gripper kinematics are justified and optimized with respect to grasping and manipulation flexibility. The hand was constructed with miniature motor drive systems embedded in the fingers. The control is hierarchically structured and is implemented on a simple PC-AT computer. The hand's dexterity and intelligence are demonstrated in several experiments.
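The hierarchical control structure mentioned in the abstract can be sketched as three layers: a task level mapping symbolic commands to grasp goals, a finger-coordination level mapping goals to joint targets, and a joint-servo level. All interfaces, gains, and the toy kinematics below are illustrative assumptions, not taken from the paper:

```python
def joint_servo(target_angle, current_angle, kp=2.0):
    """Lowest level: proportional control of a single joint (toy gain)."""
    return kp * (target_angle - current_angle)

def finger_coordination(grasp_width):
    """Middle level: map a desired grasp width to per-joint targets
    (toy kinematics: split equally across three joints per finger)."""
    return [grasp_width / 3.0] * 3

def task_level(command):
    """Top level: map a symbolic command to a grasp width (illustrative)."""
    widths = {"open": 1.5, "close": 0.3}
    return widths[command]

# One pass down the hierarchy: command -> joint targets -> joint torques.
targets = finger_coordination(task_level("close"))
torques = [joint_servo(t, 0.0) for t in targets]
print(torques)
```

Each layer only talks to the one below it, which is what makes such a controller feasible on modest hardware like the PC-AT mentioned above.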

    Smart hands for the EVA retriever

    Dexterous robotic hands are required for the extravehicular activity retriever (EVAR) system being developed by the NASA Johnson Space Center (JSC). These hands, as part of the EVAR system, must be able to autonomously and securely grasp objects that inadvertently separate from the Space Station. Development of the required hands was initiated in 1987. Outlined here are the hand development activities, including design considerations, progress to date, and future plans. Several types of dexterous hands were evaluated, along with a proximity-sensing capability developed to initiate a reflexive, adaptive grasp. The evaluations resulted in the design and fabrication of a 6-degree-of-freedom (DOF) hand that has two fingers and a thumb arranged in an anthropomorphic configuration. Finger joint force and position sensors are included in the design, as well as infrared proximity sensors that allow initiation of the grasp sequence when an object is detected within the grasp envelope.
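The proximity-triggered reflexive grasp described above amounts to polling the infrared sensors and initiating the grasp sequence once any reading falls inside the grasp envelope. A minimal sketch, in which the envelope radius and the sensor readings are invented for illustration:

```python
GRASP_ENVELOPE_M = 0.10  # assumed detection range of the IR proximity sensors

def should_initiate_grasp(proximity_readings_m):
    """Return True once any proximity sensor sees an object
    inside the grasp envelope."""
    return any(d < GRASP_ENVELOPE_M for d in proximity_readings_m)

# Toy example: an object drifts toward the hand.
for dist in (0.30, 0.20, 0.12, 0.05):
    if should_initiate_grasp([dist]):
        print("grasp initiated at", dist, "m")
        break
```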

    Method and apparatus for positioning a robotic end effector

    A robotic end effector and operation protocol for a reliable grasp of a target object, irrespective of the target's contours, are disclosed. A robotic hand includes a plurality of jointed fingers, one of which, like a thumb, is in opposed relation to the others. Each finger comprises at least two jointed sections and is provided with reflective proximity sensors, one on the inner surface of each finger section. Each proximity sensor comprises a transmitter of a beam of radiant energy and means for receiving reflections of the transmitted energy when reflected by a target object and for generating electrical signals responsive thereto. On the fingers opposed to the thumb, the proximity sensors on the outermost finger sections are aligned in an outer sensor array, and the sensors on the intermediate finger sections and the innermost finger sections are similarly arranged to form an intermediate sensor array and an inner sensor array, respectively. The invention includes a computer system with software and/or circuitry for a protocol comprising the following steps in sequence: (1) approach-axis alignment to maximize the number of outer-array sensors which detect the target; (2) non-contact contour following of the target by the robot fingers to minimize target escape potential; and (3) closing to rigidize the target, including dynamically re-adjusting the end effector finger alignment to compensate for target motion. A signal conditioning circuit and gain adjustment means are included to maintain the dynamic range of low-power reflection signals.
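The three-step protocol can be read as a small sequential state machine: align, follow the contour, then close. A minimal sketch, where the hand interface and the simulated phase-completion conditions are entirely hypothetical stand-ins, not taken from the patent:

```python
class SimulatedHand:
    """Toy stand-in for the end effector; each phase completes after a
    fixed number of iterations (purely illustrative)."""
    def __init__(self):
        self.steps = {"align": 0, "follow": 0, "close": 0}
    def outer_array_detection_maximized(self):
        return self.steps["align"] >= 3
    def adjust_approach_axis(self):
        self.steps["align"] += 1
    def contour_matched(self):
        return self.steps["follow"] >= 2
    def follow_contour(self):
        self.steps["follow"] += 1
    def target_rigidized(self):
        return self.steps["close"] >= 1
    def close_and_compensate(self):
        self.steps["close"] += 1

def grasp_protocol(hand):
    # Step 1: approach-axis alignment to maximize outer-array detections.
    while not hand.outer_array_detection_maximized():
        hand.adjust_approach_axis()
    # Step 2: non-contact contour following to minimize escape potential.
    while not hand.contour_matched():
        hand.follow_contour()
    # Step 3: close and rigidize, compensating for target motion.
    while not hand.target_rigidized():
        hand.close_and_compensate()
    return "grasped"

print(grasp_protocol(SimulatedHand()))
```

The strict ordering matters: contour following only begins once the approach axis is aligned, which is what keeps the escape potential low before contact is ever made.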

    3D printed pneumatic soft actuators and sensors: their modeling, performance quantification, control and applications in soft robotic systems

    Continued technological progress in robotic systems has led to more applications where robots and humans operate in close proximity, and even in physical contact in some cases. Soft robots, which are primarily made of highly compliant and deformable materials, provide inherently safe features, unlike conventional robots that are made of stiff and rigid components. These robots are ideal for interacting safely with humans and operating in highly dynamic environments. Soft robotics is a rapidly developing field exploiting biomimetic design principles, novel sensor and actuation concepts, and advanced manufacturing techniques. This work presents novel soft pneumatic actuators and sensors that are directly 3D printed in one manufacturing step, without requiring postprocessing or support materials, using low-cost and open-source fused deposition modeling (FDM) 3D printers and an off-the-shelf, commercially available soft thermoplastic poly(urethane) (TPU). The performance of the soft actuators and sensors developed is optimized and predicted using finite element modeling (FEM) and, in some cases, analytical models. A hyperelastic material model is developed for the TPU based on its experimental stress-strain data for use in FEM analysis. The novel soft vacuum bending (SOVA) and linear (LSOVA) actuators reported can be used in diverse robotic applications including locomotion robots, adaptive grippers, parallel manipulators, artificial muscles, modular robots, prosthetic hands, and prosthetic fingers. Also, the novel soft pneumatic sensing chambers (SPSC) developed can be used in diverse interactive human-machine interfaces, including wearable gloves for virtual reality applications and controllers for soft adaptive grippers, soft push buttons for science, technology, engineering, and mathematics (STEM) education platforms, haptic feedback devices for rehabilitation, game controllers and throttle controllers for gaming, and bending sensors for soft prosthetic hands. These SPSCs are directly 3D printed and embedded in a monolithic soft robotic finger as position and touch sensors for real-time position and force control. One of the aims of soft robotics is to design and fabricate robotic systems with a monolithic topology, with actuators and sensors embedded, such that they can safely interact with their immediate physical environment. The results and conclusions of this thesis have significantly contributed to the realization of this aim.
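Fitting a hyperelastic material model to experimental stress-strain data, as the thesis does for its TPU, can be sketched with the simplest such model: an incompressible neo-Hookean solid under uniaxial tension, whose nominal stress is P = mu * (lambda - lambda^-2), linear in the shear modulus mu. The data below are synthetic and the model choice is illustrative; the thesis may well use a richer strain-energy function:

```python
import numpy as np

# Synthetic "experimental" uniaxial data generated from a known modulus.
stretch = np.linspace(1.0, 2.0, 20)          # stretch ratio lambda
f = stretch - stretch**-2                    # model basis: P = mu * f(lambda)
true_mu = 5.0                                # assumed shear modulus, MPa
stress = true_mu * f                         # nominal (engineering) stress

# Because P is linear in mu, least-squares fitting reduces to one dot-product
# ratio: mu = <f, P> / <f, f>.
mu_fit = float(np.dot(f, stress) / np.dot(f, f))
print(f"fitted shear modulus: {mu_fit:.2f} MPa")
```

With real, noisy data the same one-line estimator gives the least-squares optimum; multi-parameter models (Mooney-Rivlin, Ogden) would instead need a nonlinear solver.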

    Understanding of Object Manipulation Actions Using Human Multi-Modal Sensory Data

    Object manipulation actions represent an important share of the Activities of Daily Living (ADLs). In this work, we study how to enable service robots to use human multi-modal data to understand object manipulation actions, and how they can recognize such actions when humans perform them during human-robot collaboration tasks. The multi-modal data in this study consists of videos, hand motion data, applied forces as represented by the pressure patterns on the hand, and measurements of the bending of the fingers, collected as human subjects performed manipulation actions. We investigate two approaches. In the first, we show that the multi-modal signal (motion, finger bending, and hand pressure) generated by the action can be decomposed into a set of primitives that can be seen as its building blocks. These primitives are used to define 24 multi-modal primitive features. The primitive features can in turn serve as an abstract representation of the multi-modal signal and be employed for action recognition. In the second approach, visual features are extracted from the data using a pre-trained image-classification deep convolutional neural network and are subsequently used to train the classifier. We also investigate whether adding data from other modalities produces a statistically significant improvement in classifier performance. We show that both approaches produce comparable performance, which implies that image-based methods can successfully recognize human actions during human-robot collaboration. On the other hand, to provide training data from which the robot can learn to perform object manipulation actions, multi-modal data provides a better alternative.
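The primitive-feature idea in the first approach can be sketched as segmenting each modality's signal into a small alphabet of primitives and counting them into one abstract feature vector. The segmentation rule (rising/falling/steady per step), the threshold, and the toy traces below are all illustrative assumptions; the paper's actual 24 primitive features are defined differently:

```python
def to_primitives(signal, eps=0.1):
    """Map a 1-D signal to a sequence of primitives: 'u' (rising),
    'd' (falling), 's' (steady), one per sample-to-sample step."""
    prims = []
    for a, b in zip(signal, signal[1:]):
        if b - a > eps:
            prims.append("u")
        elif a - b > eps:
            prims.append("d")
        else:
            prims.append("s")
    return prims

def feature_vector(signals):
    """Concatenate per-modality primitive counts into a single
    abstract representation usable by any classifier."""
    vec = []
    for s in signals:
        p = to_primitives(s)
        vec += [p.count("u"), p.count("d"), p.count("s")]
    return vec

# Toy hand-pressure and finger-bending traces for a "squeeze" action.
pressure = [0.0, 0.5, 1.0, 1.0, 0.4]
bending = [0.0, 0.3, 0.6, 0.6, 0.6]
print(feature_vector([pressure, bending]))  # → [2, 1, 1, 2, 0, 2]
```

Because the representation is just counts of building blocks, it abstracts away timing and amplitude, which is what allows the same features to combine motion, bending, and pressure modalities.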