
    Doctor of Philosophy

    Humans generally have difficulty performing precision tasks with their unsupported hands. To compensate for this difficulty, people often seek to support or rest their hand and arm on a fixed surface. However, when the precision task needs to be performed over a workspace larger than what can be reached from a fixed position, a fixed support is no longer useful. This dissertation describes the development of the Active Handrest, a device that expands its user's dexterous workspace by providing ergonomic support and precise repositioning motions over a large workspace. The prototype Active Handrest is a planar computer-controlled support for the user's hand and arm. The device can be controlled through force input from the user, position input from a grasped tool, or a combination of inputs. The control algorithm of the Active Handrest converts the input(s) into device motions through admittance control, where the device's desired velocity is calculated proportionally to the input force or its equivalent. A robotic 2-axis admittance device was constructed as the initial Planar Active Handrest, or PAHR, prototype. Experiments were conducted to optimize the device's control input strategies. Large-workspace shape-tracing experiments were used to compare the PAHR to unsupported, fixed support, and passive movable support conditions. The Active Handrest was found to reduce task error and provide better speed-accuracy performance. Next, virtual fixture strategies were explored for the device. From the options considered, a virtual spring fixture strategy was chosen based on its effectiveness. An experiment was conducted to compare the PAHR with its virtual fixture strategy to traditional virtual fixture techniques for a grasped stylus. Virtual fixtures implemented on the Active Handrest were found to be as effective as fixtures implemented on a grasped tool. Finally, a higher degree-of-freedom Enhanced Planar Active Handrest, or E-PAHR, was constructed to provide support for large-workspace precision tasks while more closely following the planar motions of the human arm. Experiments were conducted to investigate appropriate control strategies and device utility. The E-PAHR was found to provide a skill level equal to that of the PAHR with reduced user force input and lower perceived exertion.
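
    The control law described here is a simple proportional admittance: the desired velocity is proportional to the applied force. Below is a minimal sketch of that mapping, with an illustrative gain, saturation limit, and function name that are assumptions rather than values from the dissertation.

```python
import numpy as np

def admittance_velocity(input_force_xy, admittance_gain=0.02, max_speed=0.15):
    """Map a planar input force (N) to a desired device velocity (m/s).

    Proportional admittance law: v_desired = K_a * f_input, saturated to a
    maximum speed so device motion stays bounded. Gains are illustrative.
    """
    f = np.asarray(input_force_xy, dtype=float)
    v = admittance_gain * f                 # proportional force-to-velocity mapping
    speed = np.linalg.norm(v)
    if speed > max_speed:                   # clamp to the speed limit
        v *= max_speed / speed
    return v

# Example: a 5 N push to the right commands a slow rightward repositioning motion.
print(admittance_velocity([5.0, 0.0]))      # -> approximately [0.1, 0.0] m/s
```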

    Doctor of Philosophy

    Most humans have difficulty performing precision tasks, such as writing and painting, without additional physical support to help steady their arm or offload its weight. To alleviate this problem, various passive and active devices have been developed. However, such devices often have a small workspace and lack scalable gravity compensation throughout the workspace and/or diversity in their applications. This dissertation describes the development of the Spatial Active Handrest (SAHR), a large-workspace manipulation aid that offloads the weight of the user's arm and increases the user's accuracy over a large three-dimensional workspace. The device has four degrees of freedom and allows the user to perform dexterous tasks within a large workspace that matches that of a human arm performing daily tasks. Users can move the device to a desired position and orientation using force or position inputs, or a combination of both. The SAHR converts the given input(s) to desired velocities.
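
    One way to read "force or position inputs, or a combination of both" is as a weighted blend of two admittance-style channels. The sketch below illustrates that interpretation only; the function name, gains, and blending scheme are assumptions, not the SAHR's actual controller.

```python
import numpy as np

def blended_velocity(grip_force, tool_pos, handrest_pos,
                     k_force=0.02, k_pos=1.5, force_weight=0.5, max_speed=0.2):
    """Blend a force input and a tool-position input into one desired velocity.

    The force channel is proportional to the applied force; the position
    channel drives the handrest toward the grasped tool. Weights and gains
    are illustrative.
    """
    v_force = k_force * np.asarray(grip_force, float)                       # force channel
    v_pos = k_pos * (np.asarray(tool_pos, float) - np.asarray(handrest_pos, float))  # position channel
    v = force_weight * v_force + (1.0 - force_weight) * v_pos               # weighted blend
    speed = np.linalg.norm(v)
    return v if speed <= max_speed else v * (max_speed / speed)             # speed limit

# Example: the tool is 4 cm ahead of the handrest and the user pushes gently forward.
print(blended_velocity([2.0, 0.0, 0.0], [0.04, 0.0, 0.0], [0.0, 0.0, 0.0]))
```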

    Haptic guidance for microrobotic intracellular injection

    The ability for a bio-operator to utilise a haptic device to manipulate a microrobot for intracellular injection offers immense benefits. One significant benefit is that the bio-operator can receive haptic guidance while performing the injection process. To address this, this paper investigates the use of haptic virtual fixtures for cell injection and proposes a novel force field virtual fixture. The guidance force felt by the bio-operator is determined by force field analysis within the virtual fixture. The proposed force field virtual fixture assists the bio-operator during intracellular injection by limiting the micropipette tip's motion to a conical volume and recommending the desired path for optimal injection. A virtual fixture plane is also introduced to prevent the bio-operator from moving the micropipette tip beyond the deposition target inside the cell. Simulation results demonstrate the operation of the guidance system.
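
    The fixture described confines the micropipette tip to a conical volume and stops it at a plane through the deposition target. The sketch below is a simplified geometric stand-in that clips commanded positions rather than computing the paper's guidance forces; all names and parameters are illustrative assumptions.

```python
import numpy as np

def constrain_tip(tip, apex, axis, half_angle_rad, plane_depth):
    """Project a commanded micropipette tip position into a cone whose apex
    is at the cell entry point, and clamp it at a depth plane at the target.
    """
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    p = np.asarray(tip, float) - np.asarray(apex, float)
    along = float(np.dot(p, axis))             # progress along the injection axis
    along = np.clip(along, 0.0, plane_depth)   # virtual fixture plane at target depth
    radial = p - np.dot(p, axis) * axis        # lateral offset from the axis
    r_max = along * np.tan(half_angle_rad)     # cone radius at this depth
    r = np.linalg.norm(radial)
    if r > r_max and r > 0.0:
        radial *= r_max / r                    # pull the tip back inside the cone
    return np.asarray(apex, float) + along * axis + radial

# Example: a tip commanded far off-axis is pulled back onto the cone surface.
print(constrain_tip([0.3, 0.4, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                    np.deg2rad(15.0), 1.0))
```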

    Haptics in Robot-Assisted Surgery: Challenges and Benefits

    Robotic surgery is transforming current surgical practice, not only by improving conventional surgical methods but also by introducing innovative robot-enhanced approaches that broaden the capabilities of clinicians. Being mainly of a man-machine collaborative type, surgical robots are seen as media that transfer pre- and intra-operative information to the operator and reproduce his or her motion, with appropriate filtering, scaling, or limitation, to physically interact with the patient. The field, however, is far from maturity and, more critically, is still a subject of controversy in medical communities. Limited or absent haptic feedback is reputed to be among the reasons that impede the further spread of surgical robots. In this paper, the objectives and challenges of deploying haptic technologies in surgical robotics are discussed, and a systematic review is performed of works that have studied the effects of providing haptic information to users in the major branches of robotic surgery. The review aims to encompass both classical works and state-of-the-art approaches, delivering a comprehensive and balanced survey both for researchers starting their work in this field and for experts.
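
    The "filtering, scaling, or limitation" applied when reproducing the operator's motion can be pictured with a toy master-to-slave mapping. The class below is an illustrative sketch only, not any particular surgical system's controller; all names and values are assumptions.

```python
import numpy as np

class MasterToSlaveMapper:
    """Reproduce operator hand motion with scaling, smoothing, and a motion limit."""

    def __init__(self, scale=0.2, smoothing=0.8, max_step=0.002):
        self.scale = scale          # motion scaling (e.g., 5:1 for fine manipulation)
        self.alpha = smoothing      # exponential low-pass filter coefficient
        self.max_step = max_step    # per-update motion limit (m)
        self._filtered = None

    def update(self, master_delta):
        d = self.scale * np.asarray(master_delta, float)        # scaling
        if self._filtered is None:
            self._filtered = d
        else:                                                   # filtering (tremor reduction)
            self._filtered = self.alpha * self._filtered + (1 - self.alpha) * d
        step = self._filtered
        n = np.linalg.norm(step)
        if n > self.max_step:                                   # limitation (safety clamp)
            step = step * (self.max_step / n)
        return step

# Example: a 1 cm master motion becomes a small, smoothed, bounded slave step.
mapper = MasterToSlaveMapper()
print(mapper.update([0.01, 0.0, 0.0]))
```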

    Intuitive, iterative and assisted virtual guides programming for human-robot comanipulation

    For a very long time, automation was driven by the use of traditional industrial robots placed in cages, programmed to repeat more or less complex tasks at their highest speed and with maximum accuracy. This robot-oriented solution is heavily dependent on hard automation, which requires pre-specified fixtures and time-consuming programming, hindering robots from becoming flexible and versatile tools. These robots have evolved towards a new generation of small, inexpensive, inherently safe and flexible systems that work hand in hand with humans. In these new collaborative workspaces the human can be included in the loop as an active agent. As a teacher and as a co-worker, he can influence the decision-making process of the robot. In this context, virtual guides are an important tool used to assist the human worker by reducing physical effort and cognitive overload during task accomplishment. However, the construction of virtual guides often requires expert knowledge and modeling of the task. These limitations restrict the usefulness of virtual guides to scenarios with unchanging constraints. To overcome these challenges and enhance the flexibility of virtual guide programming, this thesis presents a novel approach that allows the worker to create virtual guides by demonstration through an iterative method based on kinesthetic teaching and displacement splines. Thanks to this approach, the worker is able to iteratively modify the guides while being assisted by them, making the process more intuitive and natural while reducing its arduousness. Our approach allows local refinement of virtual guiding trajectories through physical interaction with the robot: a specific Cartesian keypoint of the guide can be moved, or a portion of the guide can be re-demonstrated. We also extended our approach to 6D virtual guides, where displacement splines are defined via Akima interpolation (for translation) and quadratic interpolation of quaternions (for orientation). The worker can initially define a virtual guiding trajectory and then use the assistance in translation to concentrate only on defining the orientation along the path. We demonstrated that these innovations provide a novel and intuitive solution that increases the operator's comfort during human-robot comanipulation in two industrial scenarios with a collaborative robot (cobot).
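
    A rough sketch of how such a 6D guide could be assembled from demonstrated waypoints, using SciPy's Akima interpolator for translation and SLERP as a simpler stand-in for the quadratic quaternion interpolation described in the thesis; function names and example data are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import Akima1DInterpolator
from scipy.spatial.transform import Rotation, Slerp

def build_guide(times, positions_xyz, quats_xyzw):
    """Build a 6D virtual-guide query function from demonstrated waypoints.

    Translation: Akima spline through the demonstrated positions.
    Orientation: SLERP between the demonstrated quaternions (a stand-in for
    the quadratic quaternion interpolation used in the thesis).
    """
    times = np.asarray(times, float)
    pos_spline = Akima1DInterpolator(times, np.asarray(positions_xyz, float))
    rot_interp = Slerp(times, Rotation.from_quat(quats_xyzw))

    def guide(t):
        t = float(np.clip(t, times[0], times[-1]))
        return pos_spline(t), rot_interp(t).as_quat()

    return guide

# Example: five demonstrated keypoints along a short approach motion.
g = build_guide(
    [0, 1, 2, 3, 4],
    [[0, 0, 0], [0.05, 0, 0.02], [0.10, 0, 0.05], [0.15, 0, 0.08], [0.20, 0, 0.10]],
    [[0, 0, 0, 1]] * 5,
)
print(g(2.5))   # interpolated position and orientation at t = 2.5
```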

    Haptic Guidance for Extended Range Telepresence

    A novel navigation assistance system for extended range telepresence is presented. The haptic information from the target environment is augmented with guidance commands that assist the user in reaching desired goals in the arbitrarily large target environment from the spatially restricted user environment. Furthermore, a semi-mobile haptic interface was developed whose lightweight design and configuration atop the user provide safe operation and high force display quality.
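
    Augmenting environment haptics with guidance commands can be pictured as adding a bounded, goal-directed force to the rendered contact force. The sketch below is an illustrative assumption about that combination, not the system's actual controller; names and gains are made up.

```python
import numpy as np

def augmented_force(f_environment, user_pos, goal_pos, k_guide=20.0, f_guide_max=3.0):
    """Add a guidance force pulling toward the current goal to the force
    rendered from the target environment."""
    f_env = np.asarray(f_environment, float)
    error = np.asarray(goal_pos, float) - np.asarray(user_pos, float)
    f_guide = k_guide * error                 # spring-like pull toward the goal
    mag = np.linalg.norm(f_guide)
    if mag > f_guide_max:                     # cap guidance so it never masks contact feedback
        f_guide *= f_guide_max / mag
    return f_env + f_guide

# Example: no contact force, goal 0.5 m ahead -> a capped 3 N guidance pull.
print(augmented_force([0.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.5, 0.0, 0.0]))
```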

    Spatial Motion Constraints Using Virtual Fixtures Generated by Anatomy


    Model Driven Robotic Assistance for Human-Robot Collaboration

    While robots routinely perform complex assembly tasks in highly structured factory environments, it is challenging to apply completely autonomous robotic systems to less structured manipulation tasks, such as surgery and machine assembly/repair, due to the limitations of machine intelligence, sensor data interpretation, and environment modeling. A practical yet effective approach to accomplish these tasks is human-robot collaboration, in which the human operator and the robot form a partnership and complement each other in performing a complex task. We recognize that humans excel at determining task goals and recognizing constraints, if given sufficient feedback about the interaction between the tool (e.g., the end-effector of the robot) and the environment. Robots are precise, unaffected by fatigue, and able to work in environments not suitable for humans. We hypothesize that by providing the operator with adequate information about the task, through visual and force (haptic) feedback, the operator can: (1) define the task model, in terms of task goals and virtual fixture constraints, through an interactive or immersive augmented reality interface, and (2) have the robot actively assist the operator to improve the execution time, quality, and precision of the tasks. We validate our approaches through implementations of both cooperative (i.e., hands-on) control and telerobotic systems, for image-guided robotic neurosurgery and for telerobotic manipulation tasks for satellite servicing under significant time delay.
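
    One common way to realize virtual fixture constraints in hands-on (cooperative) control is anisotropic admittance: full gain along the task's preferred direction, attenuated gain across it. The sketch below shows that generic construction with illustrative names and gains; it is not claimed to be the specific system described here.

```python
import numpy as np

def fixture_constrained_velocity(f_operator, preferred_dir, admittance=0.01, compliance=0.1):
    """Cooperative control with a guidance virtual fixture.

    The operator's force is mapped to velocity by admittance control, but the
    component across the fixture's preferred direction is attenuated by a
    compliance factor (0 = hard constraint, 1 = no constraint).
    """
    f = np.asarray(f_operator, float)
    d = np.asarray(preferred_dir, float)
    d = d / np.linalg.norm(d)
    f_along = np.dot(f, d) * d            # component along the task direction
    f_across = f - f_along                # component that would violate the fixture
    return admittance * (f_along + compliance * f_across)

# Example: the sideways part of a push is strongly attenuated, the forward part passes through.
print(fixture_constrained_velocity([1.0, 3.0, 0.0], [1.0, 0.0, 0.0]))
```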

    Shared control for natural motion and safety in hands-on robotic surgery

    In hands-on robotic surgery, the surgeon controls the tool's motion by applying forces and torques to the robot holding the tool, allowing the robot-environment interaction to be felt through the tool itself. To further improve results, shared control strategies are used to combine the strengths of the surgeon with those of the robot. One such strategy is active constraints, which prevent motion into regions deemed unsafe or unnecessary. While research on active constraints for rigid anatomy is well established, limited work has been performed on dynamic active constraints (DACs) for deformable soft tissue, particularly on strategies that handle multiple sensing modalities. In addition, attaching the tool to the robot imposes the end-effector dynamics on the surgeon, reducing dexterity and increasing fatigue. Current control policies on these systems compensate only for gravity, ignoring other dynamic effects. This thesis presents several research contributions to shared control in hands-on robotic surgery, which create a more natural motion for the surgeon and expand the usage of DACs to point clouds. A novel null-space-based optimization technique has been developed which minimizes the end-effector friction, mass, and inertia of redundant robots, creating a more natural motion that is closer to the feeling of the tool unattached to the robot. By operating in the null space, the surgeon is left in full control of the procedure. A novel DACs approach has also been developed which operates on point clouds. This allows its application to various sensing technologies, such as 3D cameras or CT scans, and therefore to various surgeries. Experimental validation in point-to-point motion trials and a virtual reality ultrasound scenario demonstrates a reduction in work when maneuvering the tool and improvements in accuracy and speed when performing virtual ultrasound scans. Overall, the results suggest that these techniques could increase ease of use for the surgeon and improve patient safety.
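
    The null-space idea can be summarized by the textbook redundancy-resolution law qdot = pinv(J) * xdot + (I - pinv(J) * J) * qdot0: the secondary term reshapes the arm's posture without disturbing the tool. The sketch below shows that generic construction only, not the thesis's specific friction/mass/inertia optimization; names are illustrative.

```python
import numpy as np

def nullspace_velocity(jacobian, x_dot_task, q_dot_secondary):
    """Track the task-space velocity exactly while pushing a secondary
    objective (e.g., a posture with lower apparent end-effector inertia)
    through the Jacobian null space."""
    J = np.asarray(jacobian, float)
    J_pinv = np.linalg.pinv(J)
    n = J.shape[1]
    primary = J_pinv @ np.asarray(x_dot_task, float)           # realizes the surgeon's commanded motion
    null_proj = np.eye(n) - J_pinv @ J                          # projector onto the null space of J
    return primary + null_proj @ np.asarray(q_dot_secondary, float)  # secondary motion is invisible at the tool

# Example: 7-joint arm, 6D task; the secondary term produces no tool motion.
J = np.random.default_rng(0).standard_normal((6, 7))
q_dot = nullspace_velocity(J, np.zeros(6), np.ones(7))
print(np.round(J @ q_dot, 10))   # ~zero task-space velocity, as expected
```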