198 research outputs found

    Soft Fingertips with Tactile Sensing and Active Deformation for Robust Grasping of Delicate Objects

    Get PDF
    Soft fingertips have shown significant adaptability for grasping a wide range of object shapes thanks to their elasticity. This ability can be enhanced for grasping soft, delicate objects by adding touch sensing. However, in these cases, achieving complete restraint and robust grasps has proved challenging, as exerting additional force on a fragile object can damage it. This paper presents a novel soft fingertip design for delicate objects based on the concept of embedded air cavities, which provide the dual ability of adaptive sensing and active shape change. The pressurized air cavities act as soft tactile sensors, controlling gripper position from internal pressure variation; active fingertip deformation is achieved by applying positive pressure to these cavities, which then keeps a delicate object securely in position by form closure, despite externally applied forces. We demonstrate this improved grasping capability by comparing the displacement of grasped delicate objects exposed to high-speed motions. Results show that passive soft fingertips fail to restrain fragile objects at accelerations as low as 0.1 m/s²; in contrast, with the proposed fingertips, delicate objects remain completely secure even at accelerations of more than 5 m/s².
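    The sensing-then-inflating sequence described in the abstract can be sketched as a simple control flow: watch the cavity pressure for a rise above baseline (contact), then command positive pressure so the fingertip deforms around the object. This is a minimal illustrative sketch; the function names, threshold, and pressure values are our assumptions, not from the paper.

```python
# Hypothetical sketch of contact detection and active inflation from cavity
# pressure readings; CONTACT_THRESHOLD and inflate_pressure are illustrative.

CONTACT_THRESHOLD = 0.5  # assumed pressure rise (kPa) above baseline at contact

def detect_contact(pressure_trace, baseline):
    """Return the first index where cavity pressure rises above baseline
    by CONTACT_THRESHOLD, or None if no contact occurs."""
    for i, p in enumerate(pressure_trace):
        if p - baseline > CONTACT_THRESHOLD:
            return i
    return None

def grasp_sequence(pressure_trace, baseline, inflate_pressure=20.0):
    """Close until contact is sensed, then command positive pressure so the
    cavities deform the fingertip around the object (form closure).
    Returns (contact_step, commanded_pressure)."""
    step = detect_contact(pressure_trace, baseline)
    if step is None:
        return None, 0.0           # no object encountered; do not inflate
    return step, inflate_pressure  # actively inflate cavities after contact
```

    The same cavity thus serves two roles: a passive sensor before contact and an active actuator after it.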

    Innovative robot hand designs of reduced complexity for dexterous manipulation

    Get PDF
    This thesis investigates the mechanical design of robot hands to sensibly reduce system complexity in terms of the number of actuators and sensors, and the control needed for grasping and in-hand manipulation of unknown objects. Human hands are known to be the most complex, versatile, dexterous manipulators in nature, capable of everything from performing sophisticated surgery to carrying out a wide variety of daily tasks (e.g. preparing food, changing clothes, and playing instruments). However, why human hands can perform such fascinating tasks still eludes complete comprehension. Since at least the end of the sixteenth century, scientists and engineers have tried to match the sensory and motor functions of the human hand. As a result, many contemporary humanoid and anthropomorphic robot hands have been developed to closely replicate the appearance and dexterity of human hands, in many cases using sophisticated designs that integrate multiple sensors and actuators, which make them prone to error and difficult to operate and control, particularly under uncertainty. In recent years, several simplification approaches and solutions have been proposed to develop more effective and reliable dexterous robot hands. These techniques, based on underactuated mechanical designs, kinematic synergies, or compliant materials, among others, have opened up new ways to integrate hardware enhancements that facilitate grasping and dexterous manipulation control and improve reliability and robustness. Following this line of thought, this thesis studies four robot hand hardware aspects for enhancing grasping and manipulation, with a particular focus on dexterous in-hand manipulation, namely: i) the use of passive soft fingertips; ii) the use of rigid and soft active surfaces in robot fingers; iii) the use of robot hand topologies to create particular in-hand manipulation trajectories; and iv) the decoupling of grasping and in-hand manipulation by introducing a reconfigurable palm. In summary, the findings from this thesis provide important notions for understanding the significance of mechanical and hardware elements in the performance and control of human manipulation. These findings show great potential for developing robust, easily programmable, and economically viable robot hands capable of performing dexterous manipulation under uncertainty, while exhibiting a valuable subset of the functions of the human hand. Open Access

    Active Surface with Passive Omni-Directional Adaptation of Soft Polyhedral Fingers for In-Hand Manipulation

    Full text link
    Track systems effectively distribute loads, augmenting traction and maneuverability on unstable terrain by leveraging their expansive contact areas. This tracked-locomotion capability also aids in-hand manipulation of not only regular but also irregular objects. In this study, we present the design of a soft robotic finger with an active surface on an omni-adaptive network structure, which can be easily installed on existing grippers and achieves stability and dexterity for in-hand manipulation. The system's active surfaces initially transfer the object from the less compliant fingertip segment to the middle segment of the finger, which has superior adaptability. Despite the omni-directional deformation of the finger, in-hand manipulation can still be executed with controlled active surfaces. We characterize the soft finger's stiffness distribution and use simplified models to assess the feasibility of repositioning and reorienting a grasped object. A set of in-hand manipulation experiments was performed with the proposed fingers, demonstrating the dexterity and robustness of the strategy. Comment: 10 pages, 6 figures, 2 tables, submitted to ICRA 202
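    The repositioning idea above has a simple kinematic reading: with a conveyor-like active surface and no slip, the tangential displacement of the grasped object is approximately the time integral of the commanded surface velocity. The function below is our illustrative sketch of that relation; the no-slip assumption and all names are ours, not from the paper.

```python
# Illustrative no-slip kinematic model of an active-surface finger: the
# grasped object's tangential displacement is the integral of the commanded
# belt surface velocity over time.

def object_displacement(surface_velocities, dt):
    """Integrate commanded surface speeds (m/s), sampled every dt seconds,
    into an approximate object displacement (m), assuming no slip."""
    return sum(v * dt for v in surface_velocities)
```

    In practice the finger's omni-directional compliance and contact slip would make the true displacement deviate from this ideal, which is why the authors characterize stiffness and validate with physical experiments.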

    Development of a 3-Axis Multimodal Skin Sensor Module with Adjustable Sensitivity

    Get PDF
    Waseda University degree record number: Shin 8538. Waseda University.

    Force/Torque Sensing for Soft Grippers using an External Camera

    Full text link
    Robotic manipulation can benefit from wrist-mounted force/torque (F/T) sensors, but conventional F/T sensors can be expensive, difficult to install, and damaged by high loads. We present Visual Force/Torque Sensing (VFTS), a method that visually estimates the 6-axis F/T measurement that would be reported by a conventional F/T sensor. In contrast to approaches that sense loads using internal cameras placed behind soft exterior surfaces, our approach uses an external camera with a fisheye lens that observes a soft gripper. VFTS includes a deep learning model that takes a single RGB image as input and outputs a 6-axis F/T estimate. We trained the model on sensor data collected while teleoperating a robot (Stretch RE1 from Hello Robot Inc.) through manipulation tasks. VFTS outperformed F/T estimates based on motor currents, generalized to a novel home environment, and supported three autonomous tasks relevant to healthcare: grasping a blanket, pulling a blanket over a manikin, and cleaning a manikin's limbs. VFTS also performed well with a manually operated pneumatic gripper. Overall, our results suggest that an external camera observing a soft gripper can perform useful visual force/torque sensing for a variety of manipulation tasks. Comment: Accepted for presentation at the 2023 IEEE International Conference on Robotics and Automation (ICRA)
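    Comparing a visual estimator against a conventional sensor, as the abstract describes for VFTS versus motor-current estimates, needs a per-axis error metric over 6-axis vectors [Fx, Fy, Fz, Tx, Ty, Tz]. The per-axis RMSE below is our illustrative choice of metric, not necessarily the one used in the paper.

```python
import math

def rmse_per_axis(preds, targets):
    """Per-axis root-mean-square error over a batch of 6-axis F/T vectors
    [Fx, Fy, Fz, Tx, Ty, Tz]. One plausible way to compare an F/T estimator
    against ground-truth sensor readings (metric choice is ours)."""
    n = len(preds)
    return [
        math.sqrt(sum((p[a] - t[a]) ** 2 for p, t in zip(preds, targets)) / n)
        for a in range(6)
    ]
```

    Reporting forces (N) and torques (N·m) separately matters because their units and magnitudes differ, which is why the error is kept per axis rather than pooled.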

    Learning to Singulate Layers of Cloth using Tactile Feedback

    Full text link
    Robotic manipulation of cloth has applications ranging from fabric manufacturing to handling blankets and laundry. Cloth manipulation is challenging for robots largely due to cloth's many degrees of freedom, complex dynamics, and severe self-occlusion in folded or crumpled configurations. Prior work on robotic cloth manipulation relies primarily on vision sensors alone, which may pose challenges for fine-grained tasks such as grasping a desired number of cloth layers from a stack. In this paper, we propose to use tactile sensing for cloth manipulation; we attach a tactile sensor (ReSkin) to one of the two fingertips of a Franka robot and train a classifier to determine whether the robot is grasping a specific number of cloth layers. During test-time experiments, the robot uses this classifier as part of its policy to grasp one or two cloth layers, using tactile feedback to determine suitable grasping points. Experimental results over 180 physical trials suggest that the proposed method outperforms baselines that do not use tactile feedback and generalizes better to unseen cloth than methods that use image classifiers. Code, data, and videos are available at https://sites.google.com/view/reskin-cloth. Comment: IROS 2022. See https://sites.google.com/view/reskin-cloth for supplementary material
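    The policy described above, grasp, check the layer count with the tactile classifier, and retry elsewhere if the count is wrong, can be sketched as a short verification loop. The `classify` and `sample_point` interfaces below are hypothetical stand-ins for the trained tactile classifier and the grasp-point selection used in the paper.

```python
# Sketch of a grasp-and-verify singulation policy: attempt grasps until the
# tactile classifier reports the target number of cloth layers.

def singulate(target_layers, classify, sample_point, max_attempts=10):
    """classify(point) -> predicted layer count for a grasp at that point
    (stand-in for the ReSkin-based classifier); sample_point() -> next
    candidate grasp point. Returns the successful point, or None if no
    attempt yields the target layer count."""
    for _ in range(max_attempts):
        point = sample_point()
        if classify(point) == target_layers:
            return point  # tactile feedback confirms the desired layer count
    return None
```

    The key design point is that verification happens through touch after the grasp, so vision never needs to resolve individual layers in the stack.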

    Design of a 3D-printed soft robotic hand with distributed tactile sensing for multi-grasp object identification

    Get PDF
    Tactile object identification is essential in environments where vision is occluded or when intrinsic object properties such as weight or stiffness must be discriminated. The traditional robotic approach to this task has been to use rigid-bodied robots equipped with complex control schemes to explore different objects. However, whilst varying degrees of success have been demonstrated, these approaches are limited in their generalisability due to the complexity of the control schemes required for safe interaction with diverse objects. In this regard, Soft Robotics has garnered increased attention in the past decade for its ability to exploit Morphological Computation through the agent's body, simplifying the task by conforming naturally to the geometry of the objects being explored. This represents a paradigm shift in robot design, since Soft Robotics takes inspiration from biological solutions and embodies adaptability in order to interact with the environment, rather than relying on centralised computation. In this thesis, we formulate, simplify, and solve an object identification task using Soft Robotic principles. We design an anthropomorphic hand with human-like range of motion and compliance in both actuation and sensing. The range of motion is validated against the Feix GRASP taxonomy and the Kapandji thumb opposition test. The hand is monolithically fabricated using multi-material 3D printing, which enables different material properties to be exploited within the same body and limits variability between samples. The hand's compliance facilitates adaptable grasping of a wide range of objects, and the hand features integrated distributed tactile sensing. We emulate the human approach of integrating information from multiple contacts and grasps of objects to discriminate between them. Two bespoke neural networks are designed to extract patterns from both the tactile data and the relationships between grasps, achieving high classification accuracy.
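    The multi-grasp integration step above, combining evidence from several grasps into one object identity, can be illustrated with the simplest possible fusion rule: average the per-grasp class scores and take the argmax. This averaging rule is our illustrative stand-in; the thesis performs this step with bespoke neural networks.

```python
# Sketch of multi-grasp evidence fusion: average per-grasp class scores and
# return the best-scoring object label. Score averaging is an illustrative
# choice, not the method used in the thesis.

def fuse_grasps(per_grasp_scores):
    """per_grasp_scores: list of dicts mapping object label -> score, one
    dict per grasp. Returns the label with the highest mean score."""
    labels = per_grasp_scores[0].keys()
    mean = {
        label: sum(s[label] for s in per_grasp_scores) / len(per_grasp_scores)
        for label in labels
    }
    return max(mean, key=mean.get)
```

    Even this naive rule shows why multiple grasps help: a single ambiguous contact can be outvoted by more informative grasps of the same object.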