Intuitive Hand Teleoperation by Novice Operators Using a Continuous Teleoperation Subspace
Human-in-the-loop manipulation is useful when autonomous grasping cannot
deal sufficiently well with corner cases or cannot operate fast enough.
Using the teleoperator's hand as an input device can provide an intuitive
control method but requires mapping between pose spaces which may not be
similar. We propose a low-dimensional and continuous teleoperation subspace
which can be used as an intermediary for mapping between different hand pose
spaces. We present an algorithm to project between pose space and teleoperation
subspace. We use a non-anthropomorphic robot to experimentally prove that it is
possible for teleoperation subspaces to effectively and intuitively enable
teleoperation. In experiments, novice users completed pick and place tasks
significantly faster using teleoperation subspace mapping than they did using
state-of-the-art teleoperation methods.
Comment: ICRA 2018, 7 pages, 7 figures, 2 tables
Intuitive Human-Machine Interfaces for Non-Anthropomorphic Robotic Hands
As robots become more prevalent in our everyday lives, both in our workplaces and in our homes, it becomes increasingly likely that people who are not experts in robotics will be asked to interface with robotic devices. It is therefore important to develop robotic controls that are intuitive and easy for novices to use. Robotic hands, in particular, are very useful, but their high dimensionality makes creating intuitive human-machine interfaces for them complex. In this dissertation, we study the control of non-anthropomorphic robotic hands by non-roboticists in two contexts: collaborative manipulation and assistive robotics.
In the field of collaborative manipulation, the human and the robot work side by side as independent agents. Teleoperation allows the human to assist the robot when autonomous grasping is not able to deal sufficiently well with corner cases or cannot operate fast enough. Using the teleoperator’s hand as an input device can provide an intuitive control method, but finding a mapping between a human hand and a non-anthropomorphic robot hand can be difficult, due to the hands’ dissimilar kinematics. In this dissertation, we seek to create a mapping between the human hand and a fully actuated, non-anthropomorphic robot hand that is intuitive enough to enable effective real-time teleoperation, even for novice users.
We propose a low-dimensional and continuous teleoperation subspace which can be used as an intermediary for mapping between different hand pose spaces. We first propose the general concept of the subspace, its properties, and the variables needed to map from the human hand to a robot hand. We then propose three ways to populate the teleoperation subspace mapping. Two of our mappings use a dataglove to harvest information about the user's hand. We define the mapping between joint space and teleoperation subspace with an empirical definition, which requires a person to define hand motions in an intuitive, hand-specific way, and with an algorithmic definition, which is kinematically independent and uses objects to define the subspace. Our third mapping for the teleoperation subspace uses forearm electromyography (EMG) as a control input.
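The core idea of a shared low-dimensional subspace as an intermediary between dissimilar hand pose spaces can be sketched as a pair of projections. The sketch below assumes linear maps with arbitrary illustrative dimensions and random placeholder matrices; it is not the dissertation's actual calibration or algorithm.

```python
import numpy as np

# Illustrative mapping through a shared low-dimensional teleoperation
# subspace: human joint angles -> subspace point -> robot joint angles.
# The projection matrices are random placeholders (assumptions), not the
# empirically or algorithmically defined mappings from the dissertation.

rng = np.random.default_rng(0)

N_HUMAN = 20   # human hand joint angles (e.g., from a dataglove)
N_ROBOT = 4    # joints of a non-anthropomorphic robot hand
N_SUB = 3      # dimensionality of the shared teleoperation subspace

A_human = rng.standard_normal((N_SUB, N_HUMAN))  # human pose -> subspace
A_robot = rng.standard_normal((N_ROBOT, N_SUB))  # subspace -> robot pose

def human_to_subspace(q_human):
    """Project a human hand pose into the teleoperation subspace."""
    return A_human @ q_human

def subspace_to_robot(t):
    """Map a subspace point to robot joint angles."""
    return A_robot @ t

q_human = rng.standard_normal(N_HUMAN)
q_robot = subspace_to_robot(human_to_subspace(q_human))
```

Because the subspace is continuous, small changes in the human pose produce small changes in the commanded robot pose, which is what makes real-time teleoperation through such an intermediary feasible.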
Assistive orthotics is another area of robotics where human-machine interfaces are critical, since, in this field, the robot is attached to the hand of the human user. In this case, the goal is for the robot to assist the human with movements they would not otherwise be able to achieve. Orthotics can improve the quality of life of people who do not have full use of their hands. Human-machine interfaces for assistive hand orthotics that use EMG signals from the affected forearm as input are intuitive, and repeated use can strengthen the muscles of the user's affected arm. In this dissertation, we seek to create an EMG-based control for an orthotic device used by people who have had a stroke. We would like our control to enable functional motions when used in conjunction with an orthosis and to be robust to changes in the input signal.
We propose a control for a wearable hand orthosis which uses an easy-to-don, commodity forearm EMG band. We develop a supervised algorithm to detect a user's intent to open and close their hand, and pair this algorithm with a training protocol which makes our intent detection robust to changes in the input signal. We show that this algorithm, when used in conjunction with an orthosis over several weeks, can improve distal function in users. Additionally, we propose two semi-supervised intent detection algorithms designed to keep our control robust to changes in the input data while reducing the length and frequency of our training protocol.
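A supervised open/close intent detector of the kind described above can be sketched as a simple classifier over windowed EMG features. The sketch below uses per-channel mean absolute value features and a nearest-centroid rule on synthetic data; the channel count, feature, and classifier are illustrative assumptions, not the dissertation's actual pipeline.

```python
import numpy as np

# Sketch of supervised intent detection from forearm EMG: classify
# windows of multi-channel EMG as "open" or "close" using a
# nearest-centroid rule on per-channel mean absolute value (MAV)
# features. All parameters here are illustrative assumptions.

N_CH = 8  # channels of a hypothetical commodity forearm EMG band

def mav_features(window):
    """Mean absolute value per channel for one EMG window (T x N_CH)."""
    return np.mean(np.abs(window), axis=0)

def fit_centroids(windows, labels):
    """Average feature vector per class from labeled training windows."""
    feats = np.array([mav_features(w) for w in windows])
    return {lab: feats[np.array(labels) == lab].mean(axis=0)
            for lab in set(labels)}

def predict(centroids, window):
    """Assign the class whose centroid is nearest to the window's features."""
    f = mav_features(window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

# Synthetic training data: "close" windows have higher activation.
rng = np.random.default_rng(1)
open_w = [0.1 * rng.random((50, N_CH)) for _ in range(20)]
close_w = [0.5 + 0.1 * rng.random((50, N_CH)) for _ in range(20)]
centroids = fit_centroids(open_w + close_w, ["open"] * 20 + ["close"] * 20)
pred = predict(centroids, 0.5 + 0.1 * rng.random((50, N_CH)))
```

Retraining the centroids from a short labeled session is one plausible way a training protocol could restore robustness when the input signal drifts, e.g. after the band is re-donned.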
Aerospace medicine and biology: A continuing bibliography with indexes (supplement 344)
This bibliography lists 125 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during January 1989. Subject coverage includes: aerospace medicine and psychology, life support systems and controlled environments, safety equipment, exobiology and extraterrestrial life, and flight crew behavior and performance.
Planning grasping motions for humanoid robots
This paper addresses the problem of obtaining the motions required for a humanoid robot to perform grasp actions while trying to mimic the coordinated hand–arm movements humans make. The first step is data acquisition and analysis, which consists of capturing human movements while grasping several everyday objects (covering four possible grasp types), mapping them to the robot, and computing the hand motion synergies for the pre-grasp and grasp phases (per grasp type). Then the grasp and motion synthesis step is performed, which consists of generating potential grasps for a given object using the four family types, and planning the motions using a bi-directional multi-goal sampling-based planner, which efficiently guides the motion planning following the synergies in a reduced search space, resulting in paths with a human-like appearance. The approach has been tested in simulation, thoroughly compared with other state-of-the-art planning algorithms, obtaining better results, and also implemented in a real robot.
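Computing hand motion synergies from captured poses is commonly done with principal component analysis, which this abstract's pipeline is consistent with. The sketch below extracts a synergy basis from a synthetic pose matrix via SVD; the data shapes and joint count are illustrative assumptions, not the paper's actual capture setup.

```python
import numpy as np

# Sketch of extracting hand-motion synergies via PCA over captured hand
# poses; shapes and the latent structure are illustrative assumptions.

def compute_synergies(poses, n_synergies):
    """PCA on a (samples x joints) pose matrix.

    Returns the mean pose and the top principal directions (synergies),
    so any pose is approximated as mean + basis @ coefficients.
    """
    mean = poses.mean(axis=0)
    centered = poses - mean
    # SVD of the centered data; rows of vt are principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_synergies].T  # (joints x n_synergies) basis

rng = np.random.default_rng(2)
# Synthetic capture: 200 poses of a 16-joint hand that actually vary
# along only 2 latent directions (plus small noise).
latent = rng.standard_normal((200, 2))
mixing = rng.standard_normal((2, 16))
poses = latent @ mixing + 0.01 * rng.standard_normal((200, 16))
mean, basis = compute_synergies(poses, n_synergies=2)
```

Planning in the coefficient space of such a basis (rather than the full joint space) is what makes a reduced, human-like search space possible.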
Development and evaluation of mixed reality-enhanced robotic systems for intuitive tele-manipulation and telemanufacturing tasks in hazardous conditions
In recent years, with the rapid development of space exploration, deep-sea discovery, nuclear rehabilitation and management, and robotic-assisted medical devices, there is an urgent need for humans to interactively control robotic systems to perform increasingly precise remote operations. The value of medical telerobotic applications during the recent coronavirus pandemic has also been demonstrated and will grow in the future. This thesis investigates novel approaches to the development and evaluation of a mixed reality-enhanced telerobotic platform for intuitive remote teleoperation applications in dangerous and difficult working conditions, such as contaminated sites and undersea or extreme welding scenarios. This research aims to remove human workers from harmful working environments by equipping complex robotic systems with human intelligence and command/control via intuitive and natural human-robot interaction, including the implementation of MR techniques to improve the user's situational awareness, depth perception, and spatial cognition, which are fundamental to effective and efficient teleoperation.
The proposed robotic mobile manipulation platform consists of a UR5 industrial manipulator, a 3D-printed parallel gripper, and a customized mobile base, which is envisaged to be controlled by non-skilled operators who are physically separated from the robot working space through an MR-based vision/motion mapping approach. The platform development process involved CAD/CAE/CAM and rapid prototyping techniques, such as 3D printing and laser cutting. Robot Operating System (ROS) and Unity 3D are employed in the development process to enable the embedded system to intuitively control the robotic system and ensure the implementation of immersive and natural human-robot interactive teleoperation.
This research presents an integrated motion/vision retargeting scheme based on a mixed reality subspace approach for intuitive and immersive telemanipulation. An imitation-based velocity-centric motion mapping is implemented via the MR subspace to accurately track operator hand movements for robot motion control and to enable spatial velocity-based control of the robot tool center point (TCP). The proposed system allows precise manipulation of end-effector position and orientation to readily adjust the corresponding velocity of maneuvering.
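A velocity-centric motion mapping of the kind described above can be sketched as a proportional law: the operator's hand displacement from a reference point commands a TCP velocity, saturated for safety. The gain and speed limit below are illustrative assumptions, not the thesis's actual parameters.

```python
import numpy as np

# Minimal sketch of a velocity-centric motion mapping: hand displacement
# from a reference pose commands a proportional, clamped TCP velocity.
# Gain and limit values are illustrative assumptions.

GAIN = 2.0      # (1/s) displacement-to-velocity gain
V_MAX = 0.25    # (m/s) TCP speed limit

def tcp_velocity(hand_pos, ref_pos, gain=GAIN, v_max=V_MAX):
    """Map hand displacement (m) to a clamped TCP velocity command (m/s)."""
    v = gain * (np.asarray(hand_pos, dtype=float)
                - np.asarray(ref_pos, dtype=float))
    speed = np.linalg.norm(v)
    if speed > v_max:
        v *= v_max / speed  # preserve direction, clamp magnitude
    return v

# Large displacement saturates at V_MAX while keeping the direction.
v = tcp_velocity([0.3, 0.0, 0.0], [0.0, 0.0, 0.0])
```

One advantage of commanding velocity rather than position is that the operator can hold the hand at a fixed offset to produce a steady TCP motion, rather than having to sweep through the whole workspace.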
A mixed reality-based multi-view merging framework for immersive and intuitive telemanipulation of a complex mobile manipulator with integrated 3D/2D vision is presented. The proposed 3D immersive telerobotic schemes provide the users with depth perception through the merging of multiple 3D/2D views of the remote environment via MR subspace. The mobile manipulator platform can be effectively controlled by non-skilled operators who are physically separated from the robot working space through a velocity-based imitative motion mapping approach.
Finally, this thesis presents an integrated mixed reality and haptic feedback scheme for intuitive and immersive teleoperation of robotic welding systems. By incorporating MR technology, the user is fully immersed in a virtual operating space augmented by real-time visual feedback from the robot working space. The proposed mixed reality virtual fixture integration approach implements hybrid haptic constraints to guide the operator’s hand movements following the conical guidance to effectively align the welding torch for welding and constrain the welding operation within a collision-free area.
Overall, this thesis presents a complete telerobotic application using mixed reality and immersive elements to effectively translate the operator into the robot's space in an intuitive and natural manner. The results are thus a step forward in cost-effective and computationally efficient human-robot interaction research and technologies. The system presented is readily extensible to a range of potential applications beyond the robotic tele-welding and tele-manipulation tasks used to demonstrate, optimise, and prove the concepts.
Haptic Device Design and Teleoperation Control Algorithms for Mobile Manipulators
The increasing need for teleoperated robotic systems more and more often implies using, as slave devices, mobile platforms (terrestrial, aerial, or underwater) with integrated manipulation capabilities, provided e.g. by robotic arms with proper grasping/manipulation tools. Despite this, research activity in the teleoperation of robotic systems has mainly focused on the control of either fixed-base manipulators or mobile robots, without considering the integration of these two types of systems in a single device. Such combined robotic devices are usually referred to as mobile manipulators: systems composed of both a robotic manipulator and a mobile platform (on which the arm is mounted) whose purpose is to enlarge the manipulator's workspace. The combination of a mobile platform and a serial manipulator creates redundancy: a particular point in space can be reached by moving the manipulator, by moving the mobile platform, or by a combined motion of both. A synchronized motion of both devices then needs to be addressed. Although specific haptic devices explicitly oriented to the control of mobile manipulators need to be designed, there are no commercial solutions yet. For this reason it is often necessary to control such combined systems with traditional haptic devices not specifically oriented to the control of mobile manipulators.
The research activity presented in this Ph.D. thesis focuses first on the design of a teleoperation control scheme that allows the simultaneous control of both the manipulator and the mobile platform by means of a single haptic device characterized by a fixed base and an open kinematic chain. Second, the design of a novel cable-driven haptic device is addressed. Investigating the use of twisted string actuation in force rendering is the most interesting challenge of the latter activity.
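The base/arm redundancy described above is commonly resolved by distributing a desired end-effector velocity across all degrees of freedom with a weighted pseudoinverse, penalizing base motion more than arm motion. The sketch below illustrates that standard technique on a random placeholder Jacobian; the dimensions and weights are assumptions, not the thesis's actual control scheme.

```python
import numpy as np

# Sketch of weighted-pseudoinverse redundancy resolution for a mobile
# manipulator: a desired end-effector velocity is split between base and
# arm joint rates, with weights making base motion "expensive". The
# Jacobian, dimensions, and weights are illustrative placeholders.

def weighted_pinv_solve(J, x_dot, w):
    """Joint rates satisfying J @ q_dot = x_dot, minimizing sum(w_i * q_dot_i^2).

    Closed form: q_dot = W^-1 J^T (J W^-1 J^T)^-1 x_dot.
    """
    W_inv = np.diag(1.0 / np.asarray(w, dtype=float))
    return W_inv @ J.T @ np.linalg.solve(J @ W_inv @ J.T, x_dot)

rng = np.random.default_rng(3)
J = rng.standard_normal((3, 8))      # 3-DOF task; 2 base + 6 arm joints
w = [10.0, 10.0] + [1.0] * 6         # penalize base motion more
x_dot = np.array([0.1, 0.0, -0.05])  # desired end-effector velocity
q_dot = weighted_pinv_solve(J, x_dot, w)
```

With such weights, the arm absorbs most of the commanded motion while the base contributes only when needed, which is one plausible way to obtain the synchronized motion the abstract calls for.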
Motion planning using synergies: application to anthropomorphic dual-arm robots
Motion planning is a traditional field in robotics, but new problems nevertheless appear incessantly, due to continuous advances in robot development. In order to solve these new problems, as well as to improve the existing solutions to classical problems, new approaches are being proposed. A paradigmatic case is humanoid robotics, since the advances made in this field require motion planners not only to look efficiently for an optimal solution in the classic sense, i.e. optimizing the energy or time consumed in executing the plan, but also to look for human-like solutions, i.e. requiring the robot movements to be similar to those of human beings. This anthropomorphism in robot motion is desired not only for aesthetic reasons, but is also needed to allow better and safer human-robot collaboration: humans can more easily predict anthropomorphic robot motions, thus avoiding collisions and enhancing collaboration with the robot. Nevertheless, obtaining satisfactory performance from these anthropomorphic robotic systems requires the automatic planning of movements, which is still an arduous and non-evident task since the complexity of the planning problem increases exponentially with the number of degrees of freedom of the robotic system.
This doctoral thesis tackles the problem of planning the motions of dual-arm anthropomorphic robots (optionally with a mobile base). The main objective is twofold: obtaining robot motions that are both efficient and human-like at the same time. Trying to mimic human movements while reducing the complexity of the search space for planning purposes leads to the concept of synergies, which can be conceptually defined as correlations (in the joint configuration space as well as in the joint velocity space) between the degrees of freedom of the system. This work proposes new sampling-based motion-planning procedures that exploit the concept of synergies, both in the configuration and velocity spaces, coordinating the movements of the arms, the hands, and the mobile base of mobile anthropomorphic dual-arm robots.
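Sampling inside a synergy subspace instead of the full joint space can be sketched as drawing low-dimensional coefficients and mapping them through a fixed basis, so every sample stays on the human-like correlation subspace. The basis, dimensions, and joint limits below are illustrative assumptions, not the thesis's actual planner.

```python
import numpy as np

# Sketch of synergy-constrained sampling for motion planning: random
# synergy coefficients are mapped through a fixed basis to full joint
# configurations, so samples lie on the low-dimensional correlation
# subspace. Basis, dimensions, and limits are illustrative assumptions.

rng = np.random.default_rng(4)

N_JOINTS = 14   # e.g., two 7-DOF arms
N_SYN = 3       # synergy (latent) dimensions

mean_pose = np.zeros(N_JOINTS)
basis = rng.standard_normal((N_JOINTS, N_SYN))  # placeholder synergy basis
q_min, q_max = -3.0, 3.0                        # joint limits (rad)

def sample_synergy_config():
    """Draw one joint configuration on the synergy subspace within limits."""
    while True:
        coeffs = rng.uniform(-1.0, 1.0, N_SYN)
        q = mean_pose + basis @ coeffs
        if np.all((q >= q_min) & (q <= q_max)):
            return q  # rejection sampling keeps samples feasible

samples = [sample_synergy_config() for _ in range(100)]
```

A sampling-based planner fed with such configurations searches a 3-dimensional coefficient space rather than a 14-dimensional joint space, which is how synergies reduce planning complexity while biasing paths toward a human-like appearance.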