
    Energy-based control approaches in human-robot collaborative disassembly


    Parallel Guiding Virtual Fixtures: Control and Stability

    Guiding virtual fixtures have been proposed as a method for human-robot co-manipulation. They constrain the motion of the robot to task-relevant trajectories, which enables the human to execute the task more efficiently, accurately and/or ergonomically. When sequences of different tasks must be solved, multiple guiding virtual fixtures are required, and the appropriate guide for the current task must be detected automatically. To this end, we propose a novel control scheme for multiple guiding virtual fixtures that are active in parallel. Furthermore, we determine under which conditions using multiple fixtures is stable. Finally, we perform a pilot study of a real-world application with a humanoid robot.
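The parallel-fixture idea above can be illustrated as a probability-weighted blend of attraction forces, where the guide closest to the current end-effector position is activated most strongly. This is a toy sketch (softmax weighting, nearest-waypoint attraction, made-up gains), not the stability-analysed controller of the paper:

```python
import math

def fixture_weights(pos, guides, beta=5.0):
    """Softmax activation: guides whose nearest waypoint lies closer to
    the current end-effector position receive higher weight."""
    dists = [min(math.dist(pos, p) for p in g) for g in guides]
    exps = [math.exp(-beta * d) for d in dists]
    total = sum(exps)
    return [e / total for e in exps]

def guidance_force(pos, guides, stiffness=50.0, beta=5.0):
    """Blend per-guide attraction forces toward each guide's nearest
    waypoint, weighted by the activations above (2D for simplicity)."""
    w = fixture_weights(pos, guides, beta)
    force = [0.0, 0.0]
    for wi, g in zip(w, guides):
        nearest = min(g, key=lambda p: math.dist(pos, p))
        force[0] += wi * stiffness * (nearest[0] - pos[0])
        force[1] += wi * stiffness * (nearest[1] - pos[1])
    return force
```

A real implementation would attract toward the nearest point on a continuous trajectory rather than discrete waypoints, but the weighting structure is the same.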

    Factors of Micromanipulation Accuracy and Learning

    Micromanipulation refers to manipulation under a microscope in order to perform delicate procedures. It is difficult for humans to manipulate objects accurately under a microscope due to tremor and imperfect perception, which limit performance. This project seeks to understand the factors affecting accuracy in micromanipulation, and to propose learning strategies for improving accuracy. Psychomotor experiments were conducted using computer-controlled setups to determine how various feedback modalities and learning methods can influence micromanipulation performance. In a first experiment, the static and motion accuracy of surgeons, medical students and non-medical students under different magnification levels and grip force settings were compared. A second experiment investigated whether the non-dominant hand, placed close to the target, can contribute to accurate pointing of the dominant hand. A third experiment tested a training strategy for micromanipulation that uses unstable dynamics to magnify motion error, a strategy previously shown to decrease deviation in large arm movements. Two virtual reality (VR) modules were then developed to train needle grasping and needle insertion, two primitive tasks in a microsurgery suturing procedure. The modules provided the trainee with a stereoscopic visual display and information on their grip, tool position and tool angles. Using the VR module, a study was conducted examining the effects of visual cues for training tool orientation. Results from these studies suggest that it is possible to learn and improve accuracy in micromanipulation using appropriate sensorimotor feedback and training.
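The error-magnifying training in the third experiment can be illustrated with the standard divergent force field formulation: a negative-stiffness field whose force grows with, and points along, the deviation from the target path. The gain value below is an illustrative assumption, not the thesis's setting:

```python
def divergent_force(lateral_error_m, k_unstable=120.0):
    """Divergent (negative-stiffness) force field: the force acts in
    the SAME direction as the lateral deviation from the target path,
    so small errors are amplified and must be actively suppressed by
    the trainee. k_unstable (N/m) is an illustrative value."""
    return k_unstable * lateral_error_m
```

Because the field destabilises rather than corrects, the trainee is forced to develop finer feedback control, which is the premise of this training strategy.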

    Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions

    In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality-enhanced telerobotic platform for intuitive remote teleoperation in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. This research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including the implementation of MR techniques to improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation. The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, a 3D-printed parallel gripper, and a customized mobile base, and is envisaged to be controlled by non-skilled operators who are physically separated from the robot workspace through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques such as 3D printing and laser cutting. Robot Operating System (ROS) and Unity 3D are employed in the development process to enable intuitive control of the robotic system and to ensure immersive and natural human-robot interactive teleoperation. This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based, velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control, enabling spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of end-effector position and orientation while readily adjusting the corresponding velocity of maneuvering. A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is also presented. The proposed 3D immersive telerobotic schemes provide users with depth perception through the merging of multiple 3D/2D views of the remote environment via the MR subspace. The mobile manipulator platform can be effectively controlled by non-skilled operators who are physically separated from the robot workspace through a velocity-based imitative motion mapping approach. Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot workspace. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints to guide the operator's hand movements along a conical guidance region, effectively aligning the welding torch and constraining the welding operation to a collision-free area. Overall, this thesis presents a complete telerobotic application using mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies. The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
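The velocity-centric motion mapping described above can be sketched as a proportional mapping from hand displacement to TCP velocity. The gain, deadzone, and saturation values here are illustrative assumptions, not the thesis's tuned parameters:

```python
def hand_to_tcp_velocity(hand_disp, gain=0.8, deadzone=0.01, v_max=0.25):
    """Velocity-centric mapping: displacement of the operator's hand
    from a reference point (m, per axis) commands a proportional TCP
    velocity (m/s), with a deadzone to suppress tremor and drift and a
    saturation limit for safety."""
    v = []
    for d in hand_disp:
        if abs(d) < deadzone:
            v.append(0.0)  # inside deadzone: no motion commanded
        else:
            # measure displacement from the deadzone edge, then clamp
            vi = gain * (d - deadzone * (1 if d > 0 else -1))
            v.append(max(-v_max, min(v_max, vi)))
    return v
```

The appeal of a velocity (rather than position) mapping is that a small, comfortable hand workspace can drive the robot across an arbitrarily large workspace, simply by holding an offset.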

    A Visual-Based Shared Control Architecture for Remote Telemanipulation

    Cleaning up the past half century of nuclear waste represents the largest environmental remediation project in the whole of Europe. Nuclear waste must be sorted, segregated and stored according to its radiation level in order to optimize maintenance costs. The objective of this work is to develop a shared control framework for the remote manipulation of objects using visual information. In the presented scenario, the human operator must control a system composed of two robotic arms, one equipped with a gripper and the other with a camera. In order to facilitate the operator's task, a subset of the gripper motions is assumed to be regulated by an autonomous algorithm exploiting the camera view of the scene. At the same time, the operator has control over the remaining null-space motions w.r.t. the primary (autonomous) task by acting on a force feedback device. A novel force feedback algorithm is also proposed with the aim of informing the user about possible constraints of the robotic system such as, for instance, joint limits. Human/hardware-in-the-loop experiments with simulated slave robots and a real master device are finally reported to demonstrate the feasibility and effectiveness of the approach.
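The shared-control split described above, an autonomous visual task plus operator authority over the remaining null-space motions, follows the standard redundancy-resolution form q̇ = J⁺ẋ + (I − J⁺J)q̇_user. A minimal sketch with hand-rolled matrix helpers; the Jacobian and its pseudo-inverse are assumed given, and this is the generic formulation rather than the paper's specific controller:

```python
def matmul(A, B):
    """Multiply matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matvec(A, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def shared_control_qdot(J_pinv, J, qdot_user, xdot_auto):
    """q_dot = J+ * xdot_auto + (I - J+ J) * qdot_user.
    The autonomous visual task fixes the task-space motion; the
    operator's input is projected into the null space, so it cannot
    disturb the primary task."""
    n = len(qdot_user)
    JpJ = matmul(J_pinv, J)
    N = [[(1.0 if i == j else 0.0) - JpJ[i][j] for j in range(n)]
         for i in range(n)]  # null-space projector
    q_auto = matvec(J_pinv, xdot_auto)
    q_null = matvec(N, qdot_user)
    return [a + b for a, b in zip(q_auto, q_null)]
```

For example, with a 2x3 Jacobian selecting the x/y task directions, the operator's joint-space input only moves the joint combination that leaves the camera-regulated task velocity untouched.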

    Tactile Arrays for Virtual Textures

    This thesis describes the development of three new tactile stimulators for active touch, i.e. devices to deliver virtual touch stimuli to the fingertip in response to exploratory movements by the user. All three stimulators are designed to provide spatiotemporal patterns of mechanical input to the skin via an array of contactors, each under individual computer control. Drive mechanisms are based on piezoelectric bimorphs in a cantilever geometry. The first of these is a 25-contactor array (5 × 5 contactors at 2 mm spacing). It is a rugged design with a compact drive system and is capable of producing strong stimuli when running from low voltage supplies. Combined with a PC mouse, it can be used for active exploration tasks. Pilot studies were performed which demonstrated that subjects could successfully use the device for discrimination of line orientation, simple shape identification and line following tasks. A 24-contactor stimulator (6 × 4 contactors at 2 mm spacing) with improved bandwidth was then developed. This features control electronics designed to transmit arbitrary waveforms to each channel (generated on-the-fly, in real time) and software for rapid development of experiments. It is built around a graphics tablet, giving high precision position capability over a large 2D workspace. Experiments using two-component stimuli (components at 40 Hz and 320 Hz) indicate that spectral balance within active stimuli is discriminable independent of overall intensity, and that the spatial variation (texture) within the target is easier to detect at 320 Hz than at 40 Hz. The third system developed (again 6 × 4 contactors at 2 mm spacing) was a lightweight modular stimulator developed for fingertip and thumb grasping tasks; furthermore it was integrated with force-feedback on each digit and a complex graphical display, forming a multi-modal Virtual Reality device for the display of virtual textiles.
It is capable of broadband stimulation with real-time generated outputs derived from a physical model of the fabric surface. In an evaluation study, virtual textiles generated from physical measurements of real textiles were ranked in categories reflecting key mechanical and textural properties. The results were compared with a similar study performed on the real fabrics from which the virtual textiles had been derived. There was good agreement between the ratings of the virtual textiles and the real textiles, indicating that the virtual textiles are a good representation of the real textiles and that the system delivers appropriate cues to the user.
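The two-component stimuli in the experiments above can be sketched as a per-contactor drive signal: 40 Hz and 320 Hz sinusoids whose local amplitude is modulated by the finger's position over a spatial grating. The grating wavelength and amplitude parameters are illustrative assumptions, not the thesis's stimulus set:

```python
import math

def contactor_sample(t, x_pos, a40, a320, wavelength=0.004):
    """Drive-signal sample for one contactor at time t (s), with the
    contactor at position x_pos (m) over a sinusoidal spatial grating.
    Spectral balance is set by the 40 Hz and 320 Hz amplitudes a40 and
    a320; spatial texture comes from the position-dependent envelope."""
    spatial = 0.5 * (1.0 + math.sin(2 * math.pi * x_pos / wavelength))
    return spatial * (a40 * math.sin(2 * math.pi * 40 * t)
                      + a320 * math.sin(2 * math.pi * 320 * t))
```

In an active-touch device, x_pos is updated from the tablet's position sensing as the user explores, so the envelope is locked to the virtual surface rather than to time.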

    The effects of realistic tactile haptic feedback on user surface texture perception

    Haptic interaction plays an important role in virtual reality and human-computer interaction paradigms. However, most haptic devices only create kinesthetic feedback or simple, unrealistic tactile feedback. This study presents theory and practice for creating realistic tactile feedback. The approach is based upon skin sensing capabilities, tactile perception principles, and tactile stimulation techniques, and uses a vibration sensor, controller, and actuator to create a tactile haptic device. The device is portable, small, light, and cost-effective. This study uses the device to create realistic tactile sensations from actual surface features, and measures the effects of tactile haptic feedback on user surface texture perception. Verification test results show that the device can create realistic tactile feedback that matches actual surface features well. User test results show that users can match actuator vibrations for 40-grit and 180-grit surface textures to the actual 40-grit and 180-grit surface textures 99.3% of the time.
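The grit-matching result suggests a simple first-order model of how a scanned texture maps to a vibration frequency: scan speed divided by the surface's characteristic spatial period. The particle-spacing estimate below (25.4 mm divided by grit number) is a rough illustrative assumption, not the calibration used in the study:

```python
def texture_vibration_freq(grit, scan_speed):
    """Approximate dominant vibration frequency (Hz) when scanning
    sandpaper of a given grit at scan_speed (m/s). Assumes an average
    particle spacing of roughly 25.4 mm / grit (one inch divided by
    the grit number), a crude illustrative estimate."""
    spacing_m = (25.4 / grit) / 1000.0  # approximate spacing in metres
    return scan_speed / spacing_m
```

Under this model a finer grit (smaller spacing) produces a higher vibration frequency at the same scan speed, which is why 40-grit and 180-grit surfaces are distinguishable from their vibration signatures alone.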

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 13th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2022, held in Hamburg, Germany, in May 2022. The 36 regular papers included in this book were carefully reviewed and selected from 129 submissions. They are organized in topical sections as follows: haptic science; haptic technology; and haptic applications.