84 research outputs found

    Two-Fingered Haptic Device for Robot Hand Teleoperation

    A haptic feedback system is required to assist telerehabilitation with a robot hand. The system should present to an operator the reaction force measured at the robot hand. In this paper, we develop a force feedback device that presents a reaction force to the distal segments of the operator's thumb and middle finger, and to the basipodite of the middle finger, when the robot hand grasps an object. The device uses a shape memory alloy as an actuator, which makes it very compact, lightweight, and accurate.

    RoboSense At Edge: Detecting Slip, Crumple and Shape of the Object in Robotic Hand for Teleoperations

    Slip and crumple detection is essential for performing robust manipulation tasks with a robotic hand (RH), for example in remote surgery, and it has long been one of the challenging problems in the robotic manipulation community. In this work, we propose a machine learning (ML) based technique to detect slip and crumple, as well as the shape of the object currently held in the robotic hand. The proposed ML model detects slip, crumple, and shape from the force/torque exerted and the angular positions of the actuators in the RH. The model would be integrated into the control loop between a robotic hand (RH) and a haptic glove (HG), which would help reduce latency in teleoperation.
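The abstract does not specify which ML model is used, so as a minimal illustrative sketch (not the paper's method), the mapping from force/torque plus actuator-angle features to {stable, slip, crumple} labels can be shown with a nearest-centroid classifier; the feature layout and class names here are assumptions:

```python
import numpy as np

# Hypothetical feature vector: 6 force/torque readings + 4 actuator angles.
# A nearest-centroid rule stands in for "an ML model that maps RH sensor
# readings to slip/crumple/stable labels"; the real model is unspecified.

def fit_centroids(X, y):
    """Compute one mean feature vector (centroid) per class label."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def predict(centroids, x):
    """Assign the label whose centroid is closest in Euclidean distance."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

# Toy training data: rows are [fx, fy, fz, tx, ty, tz, q1..q4], one cluster
# per class so the behavior of the rule is easy to see.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, scale=0.1, size=(20, 10))
               for m in (0.0, 1.0, 2.0)])
y = np.array(["stable"] * 20 + ["slip"] * 20 + ["crumple"] * 20)

centroids = fit_centroids(X, y)
print(predict(centroids, np.full(10, 1.05)))  # input lies near the "slip" cluster
```

In a real pipeline the centroids would be replaced by a trained classifier, but the interface (sensor vector in, discrete contact state out) is the same.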

    Human to robot hand motion mapping methods: review and classification

    In this article, the variety of approaches proposed in the literature to address the problem of mapping human to robot hand motions is summarized and discussed. We attempt to organize into macro-categories the great quantity of published methods, which are often difficult to view from a general perspective due to their different fields of application, specific algorithms, terminology, and declared goals. First, a brief historical overview is given to show how the human-to-robot hand mapping problem emerged as both a conceptual and an analytical challenge that remains open today. Thereafter, the survey focuses on a classification of modern mapping methods into six categories: direct joint, direct Cartesian, task-oriented, dimensionality-reduction-based, pose-recognition-based, and hybrid mappings. For each category, the general view that links the related studies is provided, and representative references are highlighted. Finally, a concluding discussion along with the authors' point of view regarding desirable future trends is given. This work was supported in part by the European Commission's Horizon 2020 Framework Programme with the project REMODEL under Grant 870133 and in part by the Spanish Government under Grant PID2020-114819GB-I00.
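The simplest of the six categories, direct joint mapping, can be sketched in a few lines. This toy version (the linear per-joint normalization, the limit tuples, and the function name are our own illustration, not taken from the review) rescales each human joint angle onto the corresponding robot joint range:

```python
def direct_joint_mapping(human_angles, human_limits, robot_limits):
    """Map each human joint angle linearly onto the matching robot joint range.

    Illustrates the 'direct joint' category only: each joint is treated
    independently, so kinematic mismatch between hands is ignored.
    """
    mapped = []
    for q, (h_lo, h_hi), (r_lo, r_hi) in zip(human_angles, human_limits, robot_limits):
        t = (q - h_lo) / (h_hi - h_lo)   # normalize the human angle to [0, 1]
        t = min(max(t, 0.0), 1.0)        # clamp out-of-range sensor readings
        mapped.append(r_lo + t * (r_hi - r_lo))
    return mapped

# A human MCP joint at 45 deg (range 0-90) maps to the middle of a
# robot joint whose range is 0-1.57 rad.
print(direct_joint_mapping([45.0], [(0.0, 90.0)], [(0.0, 1.57)]))
```

The other categories in the survey (direct Cartesian, task-oriented, and so on) replace this per-joint rule with fingertip-position retargeting, task-level constraints, or learned models.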

    Multifingered robot hand compliant manipulation based on vision-based demonstration and adaptive force control

    Multifingered hand dexterous manipulation is quite challenging in the domain of robotics, and one remaining issue is how to achieve compliant behaviors. In this work, we propose a human-in-the-loop learning-control approach for acquiring compliant grasping and manipulation skills for a multifinger robot hand. This approach takes the depth image of the human hand as input and generates the desired force commands for the robot. A markerless vision-based teleoperation system is used for the task demonstration, and an end-to-end neural network model (i.e., TeachNet) is trained to map the pose of the human hand to the joint angles of the robot hand in real time. To endow the robot hand with compliant human-like behaviors, an adaptive force control strategy is designed to predict the desired force control commands based on the pose difference between the robot hand and the human hand during the demonstration. The force controller is derived from a computational model of the biomimetic control strategy in human motor learning, which adapts the control variables (impedance and feedforward force) online during the execution of the reference joint angles. The simultaneous adaptation of the impedance and feedforward profiles enables the robot to interact with the environment compliantly. Our approach has been verified in both simulation and real-world task scenarios on a multifingered robot hand (the Shadow Hand) and has shown more reliable performance than the widely used position control mode for obtaining compliant grasping and manipulation behaviors.
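The core adaptation idea, both a stiffness (impedance) term and a feedforward force term updated online from the tracking error, can be sketched as follows. The gains, the quadratic growth/leak update law, and the function name are illustrative assumptions, not the paper's actual computational model:

```python
def control_step(e, k, f_ff, alpha=0.5, beta=0.2, gamma=0.01):
    """One step of a biomimetic-style adaptive force law (illustrative only).

    e     : tracking error (desired - actual joint angle)
    k     : current stiffness (impedance) gain
    f_ff  : current feedforward force
    """
    k_new = k + alpha * e * e - gamma * k   # stiffness grows with error, leaks otherwise
    f_new = f_ff + beta * e                 # feedforward adapts toward the residual load
    u = k_new * e + f_new                   # commanded force
    return u, k_new, f_new

# Large error: both stiffness and feedforward increase, so the hand
# stiffens against disturbances; with zero error, stiffness slowly
# relaxes, which yields the compliant behavior described above.
u, k, f = control_step(e=1.0, k=1.0, f_ff=0.0)
```

The qualitative behavior (stiffen under error, relax when tracking is good) is what distinguishes this family of controllers from fixed-gain position control.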

    Perceptual Issues Improve Haptic Systems Performance


    A Novel Untethered Hand Wearable with Fine-Grained Cutaneous Haptic Feedback

    During open surgery, a surgeon relies not only on the detailed view of the organ being operated upon and on being able to feel its fine details, but also heavily on the combination of these two senses. In laparoscopic surgery, haptic feedback provides surgeons with information on the interaction forces between instrument and tissue. There have been many studies to date that mimic haptic feedback in laparoscopy-related telerobotics. However, cutaneous feedback is mostly restricted or limited in haptic-feedback-based minimally invasive studies. We argue that fine-grained information about the instrument's tip is needed in laparoscopic surgery and can be conveyed via cutaneous feedback. We propose an exoskeleton haptic hand wearable consisting of five 4 × 4 miniaturized fingertip actuator arrays, 80 actuators in total, to convey cutaneous feedback. The wearable is modular, lightweight, Bluetooth- and WiFi-enabled, and has a maximum power consumption of 830 mW. Software was developed to demonstrate rapid tactile actuation of edges, which allows the user to feel contours through cutaneous feedback. Initial tests were carried out in 2D, with the object displayed on a flat monitor. In the second phase, the wearable exoskeleton glove was further developed to let the user feel 3D virtual objects in a virtual reality (VR) environment presented through a VR headset. Both 2D and 3D objects were tested with our novel untethered haptic hand wearable. Our results show that users can identify cutaneous-feedback actuation from a single tap with 92.22% accuracy. Our wearable has an average latency of 46.5 ms, well below the 600 ms delay considered tolerable by surgeons in teleoperation. We therefore suggest our untethered hand wearable as a way to enhance multimodal perception in minimally invasive surgery, letting the surgeon naturally feel the immediate environment of the instruments.

    A human‐robot collaboration method for uncertain surface scanning

    Robots are increasingly expected to replace humans in many repetitive and high-precision tasks, of which surface scanning is a typical example. However, it is usually difficult for a robot to independently handle a surface scanning task with uncertainties in, for example, irregular surface shapes and surface properties. Moreover, such tasks usually require surface modelling with additional sensors, which can be time-consuming and costly. A human-robot collaboration-based approach is proposed that allows a human user and a robot to assist each other in scanning uncertain surfaces with uniform properties, such as scanning human skin in ultrasound examination. In this approach, teleoperation is used to obtain the operator's intent while allowing the operator to work remotely. After external force perception and friction estimation, the orientation of the robot end-effector is autonomously adjusted to remain as perpendicular to the surface as possible. Force control enables the robotic manipulator to maintain a constant contact force with the surface, and hybrid force/motion control ensures that force, position, and pose can be regulated without interfering with each other while reducing the operator's workload. The proposed method is validated using the Elite robot in a mock B-ultrasound scanning experiment.
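The perpendicularity adjustment can be illustrated with a small sketch: after friction estimation, the measured contact force minus its tangential component along the scanning direction approximates the surface normal, to which the tool axis is then aligned. The function names and the simple projection used here are our own illustration, not the paper's formulation:

```python
import numpy as np

def estimate_normal(f_measured, motion_dir):
    """Estimate the surface normal from the measured contact force.

    Removes the tangential (friction) component along the scanning
    direction and normalizes what remains (a simplified stand-in for
    the friction estimation described in the abstract).
    """
    motion_dir = motion_dir / np.linalg.norm(motion_dir)
    f_normal = f_measured - np.dot(f_measured, motion_dir) * motion_dir
    return f_normal / np.linalg.norm(f_normal)

def alignment_error(tool_z, normal):
    """Angle (rad) between the tool axis and the inward surface normal;
    the orientation controller would drive this toward zero."""
    c = np.clip(np.dot(tool_z, -normal), -1.0, 1.0)
    return np.arccos(c)

# Probe sliding along +y over a horizontal surface: the force has a small
# friction component in y and a dominant normal component in z.
n = estimate_normal(np.array([0.0, 0.2, 1.0]), np.array([0.0, 1.0, 0.0]))
err = alignment_error(np.array([0.0, 0.0, -1.0]), n)  # tool already perpendicular
```

In the full system this error feeds the orientation loop, while a separate force loop holds the contact force constant, which is the decoupling that hybrid force/motion control provides.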

    An Untethered Multimodal Haptic Hand Wearable

    Haptic primary colors correspond to temperature, vibration, and force. Previous studies combined these three haptic primary colors to produce different types of cutaneous sensations without the need to touch a real object. This study presents a low-cost untethered hand wearable with temperature, vibration, and force feedback, built from low-cost, commercial off-the-shelf components. A 26 mm annular Peltier element with a 10 mm hole is coupled to an 8 mm mini disc vibration motor, providing vibro-thermal tactile feedback to the user. Each of the other fingertips has an 8 mm disc vibration motor strapped on with Velcro. In addition, kinesthetic feedback is provided by a retractable ID badge holder with a small solenoid stopper, which acts as force feedback that restricts the fingers' movement. Hand and finger tracking is done with a Leap Motion Controller interfaced to a virtual setup with different geometric figures developed in Unity. We therefore argue that this prototype, as a whole, actuates both cutaneous and kinesthetic feedback, which would be useful in many virtual applications such as virtual reality (VR), teleoperated surgery, and teleoperated farming and agriculture.
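A driver for such a wearable must map each virtual-contact event onto the three haptic primary colors. The sketch below is a hypothetical mapping of our own (the command names, drive ranges, and thresholds are not taken from the paper): a signed Peltier drive for temperature, a PWM duty cycle for the vibration motors, and a boolean for the solenoid stopper:

```python
def contact_to_commands(temp_c, contact_force_n, touching):
    """Map a virtual-contact event to actuator commands (illustrative).

    temp_c          : temperature of the touched virtual material (deg C)
    contact_force_n : simulated contact force in newtons
    touching        : whether the fingertip is in contact this frame
    """
    return {
        # Peltier drive in [-1, 1]: positive heats, negative cools,
        # relative to an assumed 25 C skin-neutral setpoint.
        "peltier_drive": max(-1.0, min(1.0, (temp_c - 25.0) / 20.0)),
        # Vibration motor PWM duty scales with force, saturating at 5 N.
        "vibe_duty": min(1.0, contact_force_n / 5.0) if touching else 0.0,
        # Solenoid stopper engages to restrict finger motion on firm contact.
        "solenoid_lock": bool(touching and contact_force_n > 0.5),
    }

# Touching a warm object at 2.5 N: heat, half-duty vibration, fingers locked.
cmd = contact_to_commands(45.0, 2.5, True)
```

Per frame, the tracking system supplies `touching` and `contact_force_n` from the virtual scene, and these commands go out over the wearable's wireless link.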