
    Haptic teleoperation of mobile manipulator systems using virtual fixtures

    In order to make the task of controlling Mobile-Manipulator Systems (MMS) simpler, a novel command strategy that uses a single joystick is presented to replace the existing paradigm of using multiple joysticks. To improve efficiency and accuracy, virtual fixtures were implemented with the use of a haptic joystick. Instead of modeling the MMS as a single unit with three redundant degrees-of-freedom (DOF), the operator controls either the manipulator or the mobile base, with the command strategy choosing which one to move. The novel command strategy uses three modes of operation to automatically switch control between the manipulator and base. The three modes of operation are called near-target manipulation mode, off-target manipulation mode, and transportation mode. The system enters near-target manipulation mode only when close to a target of interest, and allows the operator to control the manipulator using velocity control. When the operator attempts to move the manipulator out of its workspace limits, the system temporarily enters transportation mode. When the operator moves the manipulator in a direction towards the manipulator's workspace, the system returns to near-target manipulation mode. In off-target manipulation mode, when the operator moves the manipulator to its workspace limits, the system retracts the arm near to the centre of its workspace to enter and remain in transportation mode. While in transportation mode the operator controls the base using velocity control. Two types of virtual fixtures are used: repulsive virtual fixtures and forbidden region virtual fixtures. Repulsive virtual fixtures are present in the form of six virtual walls forming a cube at the manipulator's workspace limits. When the operator approaches a virtual wall, a repulsive force is felt pushing the operator's hand away from the workspace limits.
The forbidden region virtual fixtures prevent the operator from driving into obstacles by disregarding motion commands that would result in a collision. The command strategy was implemented on the Omnibot MMS, and test results show that it was successful in improving simplicity, accuracy, and efficiency when teleoperating an MMS.
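    The repulsive virtual fixtures described above amount to a simple per-axis force law around the six virtual walls. A minimal sketch, assuming an axis-aligned cubic workspace; the function name, activation distance, and stiffness are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def repulsive_wall_force(p, lo, hi, d=0.05, k=50.0):
    """Repulsive force from six virtual walls forming a cube.

    p      -- hand/end-effector position in the workspace (3-vector)
    lo, hi -- opposite corners of the cubic workspace limits (3-vectors)
    d      -- activation distance from each wall (m); hypothetical value
    k      -- stiffness of the repulsive field (N/m); hypothetical value
    Returns a 3-vector force pushing the operator's hand away from any
    wall closer than d, and zero elsewhere.
    """
    p, lo, hi = map(np.asarray, (p, lo, hi))
    f = np.zeros(3)
    # Penetration into the activation band near each pair of walls
    f += k * np.maximum(0.0, d - (p - lo))   # push away from the lower walls
    f -= k * np.maximum(0.0, d - (hi - p))   # push away from the upper walls
    return f
```

    Rendered on the haptic joystick, this force is felt only near a wall, leaving free motion in the interior of the workspace.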

    Model-Augmented Haptic Telemanipulation: Concept, Retrospective Overview, and Current Use Cases

    Certain telerobotic applications, including telerobotics in space, pose particularly demanding challenges to both technology and humans. Traditional bilateral telemanipulation approaches often cannot be used in such applications due to technical and physical limitations such as long and varying delays, packet loss, and limited bandwidth, as well as high reliability, precision, and task duration requirements. In order to close this gap, we research model-augmented haptic telemanipulation (MATM) that uses two kinds of models: a remote model that enables shared autonomous functionality of the teleoperated robot, and a local model that aims to generate assistive augmented haptic feedback for the human operator. Several technological methods that form the backbone of the MATM approach have already been successfully demonstrated in accomplished telerobotic space missions. On this basis, we have applied our approach in more recent research to applications in the fields of orbital robotics, telesurgery, caregiving, and telenavigation. In the course of this work, we have advanced specific aspects of the approach that were of particular importance for each respective application, especially shared autonomy and haptic augmentation. This overview paper discusses the MATM approach in detail, presents the latest research results of the various technologies encompassed within this approach, provides a retrospective of DLR's telerobotic space missions, demonstrates the broad application potential of MATM based on the aforementioned use cases, and outlines lessons learned and open challenges.

    Dynamic virtual reality user interface for teleoperation of heterogeneous robot teams

    This research investigates the possibility of improving current teleoperation control for heterogeneous robot teams using modern Human-Computer Interaction (HCI) techniques such as Virtual Reality. It proposes a dynamic teleoperation Virtual Reality User Interface (VRUI) framework to improve the current approach to teleoperating heterogeneous robot teams.

    Shared-Control Teleoperation Paradigms on a Soft Growing Robot Manipulator

    Semi-autonomous telerobotic systems allow both humans and robots to exploit their strengths, while enabling personalized execution of a task. However, for new soft robots with degrees of freedom dissimilar to those of human operators, it is unknown how the control of a task should be divided between the human and robot. This work presents a set of interaction paradigms between a human and a soft growing robot manipulator, and demonstrates them in both real and simulated scenarios. The robot can grow and retract by eversion and inversion of its tubular body, a property we exploit to implement interaction paradigms. We implemented and tested six different paradigms of human-robot interaction, beginning with full teleoperation and gradually adding automation to various aspects of the task execution. All paradigms were demonstrated by two expert and two naive operators. Results show that humans and the soft robot manipulator can split control along degrees of freedom while acting simultaneously. In the simple pick-and-place task studied in this work, performance improves as the control is gradually given to the robot, because the robot can correct certain human errors. However, human engagement and enjoyment may be maximized when the task is at least partially shared. Finally, when the human operator is assisted by haptic feedback based on soft robot position errors, we observed that the improvement in performance is highly dependent on the expertise of the human operator.

    Telerobotic Sensor-based Tool Control Derived From Behavior-based Robotics Concepts

    Teleoperated task execution for hazardous environments is slow and requires highly skilled operators. Attempts to implement telerobotic assists to improve efficiency have been demonstrated in constrained laboratory environments but are not being used in the field because they are not appropriate for use on actual remote systems operating in complex unstructured environments using typical operators. This work describes a methodology for combining select concepts from behavior-based systems with telerobotic tool control in a way that is compatible with existing manipulator architectures used by remote systems typical to operations in hazardous environments. The purpose of the approach is to minimize the task instance modeling in favor of a priori task type models while using sensor information to register the task type model to the task instance. The concept was demonstrated for two tools useful to decontamination & dismantlement type operations: a reciprocating saw and a powered socket tool. The experimental results demonstrated that the approach works to facilitate traded control telerobotic tooling execution by enabling difficult tasks and by limiting tool damage. The role of the tools and tasks as drivers to the telerobotic implementation was better understood in the need for thorough task decomposition and the discovery and examination of the tool process signature.
The contributions of this work include: (1) the exploration and evaluation of select features of behavior-based robotics to create a new methodology for integrating telerobotic tool control with positional teleoperation in the execution of complex tool-centric remote tasks, (2) the simplification of task decomposition and the implementation of sensor-based tool control in such a way that eliminates the need for the creation of a task instance model for telerobotic task execution, and (3) the discovery, demonstrated use, and documentation of characteristic tool process signatures that have general value in the investigation of other tool control, tool maintenance, and tool development strategies above and beyond the benefit sustained for the methodology described in this work.

    Haptic Control of Mobile Manipulators Interacting with the Environment

    In modern society, the haptic control of robotic manipulators plays a central role in many industrial fields because of the improvement of human capabilities and the prevention of many hazards that it can provide. Many different studies focus on improving the operator experience, aiming at simplifying the control interface and increasing the level of intuitiveness that the system can provide to a non-trained user. This work focuses on the control of mobile manipulator platforms, which are gaining popularity in the industrial world because of their capability to merge manipulation of the environment with a potentially infinite workspace. In particular, three different aspects concerning the haptic shared control of mobile manipulators are studied. First, the manipulation of liquid containers is analyzed and a new feed-forward filtering technique able to guarantee slosh-free motion without any a priori knowledge of the imposed trajectory is proposed. Then, trajectory planning for a mobile base in an unstructured environment is considered. A new planner based on the properties of B-spline curves is studied and tested for both the haptic and the autonomous case. Finally, the control of a mobile manipulator by means of a single commercial haptic device is addressed. A new mapping technique able to provide an intuitive control interface for the human operator is presented. The effectiveness of the proposed works is confirmed via several experimental tests.
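    The B-spline planner above relies on standard properties of uniform cubic B-splines (C2 continuity and local control). As a minimal illustration of the underlying curve evaluation only, not the paper's planner, one spline segment can be computed from four consecutive control points:

```python
import numpy as np

def cubic_bspline_point(P0, P1, P2, P3, t):
    """Evaluate one segment of a uniform cubic B-spline at t in [0, 1].

    P0..P3 are consecutive control points (scalars or vectors); the
    resulting curve is C2-continuous across segments, the smoothness
    property a trajectory planner can exploit.
    """
    P = np.array([P0, P1, P2, P3], dtype=float)
    b = np.array([(1 - t) ** 3,
                  3 * t**3 - 6 * t**2 + 4,
                  -3 * t**3 + 3 * t**2 + 3 * t + 1,
                  t**3]) / 6.0                      # cubic B-spline basis
    return b @ P
```

    Sliding a window over successive control points and evaluating each segment this way yields a smooth path; note that the curve approximates rather than interpolates the control points.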

    Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation

    This work proposes a new interface for the teleoperation of mobile robots based on virtual reality that allows a natural and intuitive interaction and cooperation between the human and the robot, which is useful for many situations, such as inspection tasks, the mapping of complex environments, etc. Contrary to previous works, the proposed interface does not seek the realism of the virtual environment but provides all the minimum necessary elements that allow the user to carry out the teleoperation task in a more natural and intuitive way. The teleoperation is carried out in such a way that the human user and the mobile robot cooperate in a synergistic way to properly accomplish the task: the user guides the robot through the environment in order to benefit from the intelligence and adaptability of the human, whereas the robot is able to automatically avoid collisions with the objects in the environment in order to benefit from its fast response. The latter is carried out using the well-known potential field-based navigation method. The efficacy of the proposed method is demonstrated through experimentation with the Turtlebot3 Burger mobile robot in both simulation and real-world scenarios. In addition, usability and presence questionnaires were also conducted with users of different ages and backgrounds to demonstrate the benefits of the proposed approach. In particular, the results of these questionnaires show that the proposed virtual reality-based interface is intuitive, ergonomic and easy to use. This research was funded by the Spanish Government (Grant PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033) and by the Generalitat Valenciana (Grant GV/2021/181). Solanes, JE.; Muñoz García, A.; Gracia Calandin, LI.; Tornero Montserrat, J. (2022). Virtual Reality-Based Interface for Advanced Assisted Mobile Robot Teleoperation. Applied Sciences 12(12):1-22. https://doi.org/10.3390/app12126071
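    The potential field-based navigation method mentioned above combines an attractive force toward the goal with repulsive forces from nearby obstacles. A minimal sketch; the gains, influence radius, and step size are illustrative assumptions, not values from the paper:

```python
import numpy as np

def potential_field_step(q, goal, obstacles, k_att=1.0, k_rep=0.5, rho0=1.0, step=0.05):
    """One gradient-descent step of the classic potential-field method.

    q, goal   -- current and goal robot positions (2-vectors)
    obstacles -- list of obstacle positions
    k_att, k_rep, rho0, step -- illustrative gains, influence radius, step size
    """
    q, goal = np.asarray(q, float), np.asarray(goal, float)
    f = k_att * (goal - q)                    # attractive force toward the goal
    for obs in obstacles:
        diff = q - np.asarray(obs, float)
        rho = np.linalg.norm(diff)            # distance to the obstacle
        if 0.0 < rho < rho0:                  # repulsion only inside the influence radius
            f += k_rep * (1.0 / rho - 1.0 / rho0) / rho**2 * (diff / rho)
    return q + step * f
```

    In a shared-control setting like the one described, the attractive term would be replaced by the operator's guidance command, with the repulsive term supplying the automatic collision avoidance.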

    Bimanual robot control for surface treatment tasks

    This work develops a method to perform surface treatment tasks using a bimanual robotic system, i.e. two robot arms cooperatively performing the task. In particular, one robot arm holds the workpiece while the other robot arm has the treatment tool attached to its end-effector. Moreover, the human user teleoperates all six coordinates of the former robot arm and two coordinates of the latter robot arm, i.e. the teleoperator can move the treatment tool on the plane given by the workpiece surface. Furthermore, a force sensor attached to the treatment tool is used to automatically attain the desired pressure between the tool and the workpiece and to automatically keep the tool orientation orthogonal to the workpiece surface. In addition, to assist the human user during the teleoperation, several constraints are defined for both robot arms in order to avoid exceeding the allowed workspace, e.g. to avoid collisions with other objects in the environment. The theory used in this work to develop the bimanual robot control relies on sliding mode control and task prioritisation. Finally, the feasibility and effectiveness of the method are shown through experimental results using two robot arms.
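    As an illustration of the force regulation idea only, not the paper's controller, the desired tool pressure can be maintained by a one-dimensional first-order sliding-mode law on the force error, assuming a linear contact model and illustrative gains:

```python
import numpy as np

def smc_force_step(f_meas, f_des, z, K=0.05, dt=0.001):
    """One step of a first-order sliding-mode force regulator (1-D sketch).

    f_meas -- measured normal force from the tool-mounted sensor (N)
    f_des  -- desired contact force (N)
    z      -- current tool position along the surface normal (m)
    K, dt  -- switching gain and sample time (illustrative values)
    """
    s = f_meas - f_des                 # sliding variable: force error
    return z - dt * K * np.sign(s)     # push in/out opposite the error sign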