
    Kinesthetic Haptics Sensing and Discovery with Bilateral Teleoperation Systems

    In the mechanical-engineering field of robotics, bilateral teleoperation is a classic yet still active research topic. In bilateral teleoperation, a human operator moves the master manipulator, and a slave manipulator is controlled to follow the motion of the master in a remote, potentially hostile environment. This dissertation focuses on kinesthetic perception analysis in teleoperation systems. Controller design is studied as the influential factor: controllers that provide different force-tracking capabilities are compared under the same experimental protocol. A 6-DOF teleoperation system is configured as the testbed; an innovative master manipulator is developed, and a 7-DOF redundant manipulator is used as the slave robot. A singularity-avoidance inverse kinematics algorithm is developed to resolve the redundancy of the slave manipulator. An experimental protocol is presented, and three dynamic attributes related to kinesthetic feedback are investigated: weight, center of gravity, and inertia. The results support our hypothesis: a controller that provides better force feedback improves performance in the experiments.
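The abstract does not specify which singularity-avoidance method the dissertation uses; a common choice for resolving the redundancy of a 7-DOF arm while staying well-behaved near singular configurations is damped least squares. The sketch below (function name and damping value are illustrative, not from the source) maps a desired 6-D end-effector twist to a 7-D joint velocity:

```python
import numpy as np

def dls_ik_step(J, dx, damping=0.05):
    """One damped-least-squares IK step: joint velocities for a desired
    end-effector twist dx; the damping term keeps joint speeds bounded
    even when the Jacobian loses rank at a singularity."""
    m = J.shape[0]
    # dq = J^T (J J^T + lambda^2 I)^-1 dx
    return J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(m), dx)

# Illustrative 6x7 Jacobian for a 7-DOF redundant arm.
J = np.random.default_rng(0).normal(size=(6, 7))
dx = np.array([0.01, 0.0, 0.0, 0.0, 0.0, 0.0])  # small Cartesian step
dq = dls_ik_step(J, dx)
print(dq.shape)  # one velocity per joint: (7,)
```

The damping trades a small tracking error away from singularities for bounded joint motion near them.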

    Complementary Situational Awareness for an Intelligent Telerobotic Surgical Assistant System

    Robotic surgical systems have contributed greatly to the advancement of Minimally Invasive Surgeries (MIS). More specifically, telesurgical robots have provided enhanced dexterity to surgeons performing MIS procedures. However, current robotic teleoperated systems have only limited situational awareness of the patient anatomy and surgical environment that would typically be available to a surgeon in open surgery. Although the endoscopic view enhances visualization of the anatomy, perceptual understanding of the environment and anatomy is still lacking due to the absence of sensory feedback. In this work, these limitations are addressed by developing a computational framework to provide Complementary Situational Awareness (CSA) in a surgical assistant. This framework aims to improve the human-robot relationship by providing elaborate guidance and sensory feedback capabilities for the surgeon in complex MIS procedures. Unlike traditional teleoperation, this framework enables the user to telemanipulate the situational model in a virtual environment and uses that information to command the slave robot with appropriate admittance gains and environmental constraints. Simultaneously, the situational model is updated based on the interaction of the slave robot with the task-space environment. However, developing such a system to provide real-time situational awareness requires that many technical challenges be met. Estimating intraoperative organ information requires continuous palpation primitives. Intraoperative surface information needs to be estimated in real time while the organ is being palpated/scanned. The model of the task environment needs to be updated in near real time using the estimated organ geometry so that the force feedback applied to the surgeon's hand corresponds to the actual location of the model.
This work presents a real-time framework that meets these requirements to provide situational awareness of the environment in the task space. Further, visual feedback is provided so the surgeon/developer can view near-video-frame-rate updates of the task model. All these functions execute in parallel with synchronized data exchange. The system is highly portable and can be incorporated into any existing telerobotic platform with minimal overhead.
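The abstract describes commanding the slave with admittance gains and environmental constraints from the situational model, but gives no equations. A minimal sketch of that idea, assuming a simple linear admittance (velocity proportional to measured force) with motion along an estimated surface normal projected out (all names and the gain value are illustrative):

```python
import numpy as np

def admittance_command(force, gain=0.002, normal=None):
    """Map a measured interaction force (N) to a commanded tool velocity
    (m/s). If a surface normal from the situational model is supplied,
    the velocity component along that normal is removed, constraining
    the tool to slide along the estimated surface."""
    v = gain * np.asarray(force, dtype=float)
    if normal is not None:
        n = np.asarray(normal, dtype=float)
        n = n / np.linalg.norm(n)
        v = v - np.dot(v, n) * n  # project out the constrained direction
    return v

# Pushing straight into the surface produces no commanded motion;
# tangential force components pass through scaled by the gain.
v = admittance_command([1.0, 0.5, 0.0], normal=[1.0, 0.0, 0.0])
print(v)
```

A real implementation would update the normal each cycle as the situational model is refined by palpation.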

    Interactive control of articulated structures in the virtual space.

    by Kwok Lai Ho Victor. Thesis (M.Phil.), Chinese University of Hong Kong, 1998. Includes bibliographical references (leaves 77-82). Abstract also in Chinese.
    Contents:
    Chapter 1: Introduction
    Chapter 2: Background (History of Robotics; Autonomous Robot Systems; 3D Windowing Simulators; Robot Simulation in VR)
    Chapter 3: Objective
    Chapter 4: Articulated Structures (Joints and Links; Degrees of Freedom; Denavit-Hartenberg Notation)
    Chapter 5: Virtual Manipulators (Arm (N-link) Structure; Hand Model)
    Chapter 6: Motion Control Techniques (Kinematics: Forward Kinematics, Inverse Kinematics, Solving Kinematics Problem, Redundancy, Singularities; Dynamics: Forward Dynamics, Inverse Dynamics; Combination of Two Control Modes; Constraints and Optimization)
    Chapter 7: Physical Feedback Systems (Touch Feedback; Force Feedback; Force/Touch Feedback Systems)
    Chapter 8: Virtual Object Manipulation (Previous Work; Physics-based Virtual-hand Grasping; Visual Correction: Joint Correction, Odd Finger Configurations; Active Grasping; Collision Detection of Complex Objects)
    Chapter 9: Experiments (System Architecture: Tracking System, Glove System, Host Computer; Experimental Results: General Application, Relationship between Frictional Coefficient and Mass of the Object)
    Chapter 10: Conclusions (Summary; Contributions; Future Work)
    Appendix A: Description Files (Scene Description; Hand Description)
    Bibliography
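Chapter 4 of the thesis covers Denavit-Hartenberg notation and Chapter 6 covers forward kinematics; the standard construction chains one homogeneous transform per link. A minimal sketch (the two-link example values are illustrative, not from the thesis):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one link in standard
    Denavit-Hartenberg convention: Rz(theta) Tz(d) Tx(a) Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-link transforms: base frame -> end-effector frame."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T

# Two-link planar arm (link lengths 0.3 m and 0.2 m), both joints at
# zero: the tip should lie on the x-axis at 0.3 + 0.2 = 0.5 m.
links = [(0.0, 0.0, 0.3, 0.0), (0.0, 0.0, 0.2, 0.0)]
tip = forward_kinematics(links)[:3, 3]
print(tip)
```

Inverse kinematics (Chapter 6.1.2) then inverts this map, which is where the redundancy and singularity issues listed in the contents arise.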

    Smart Navigation in Surgical Robotics

    Minimally invasive surgery, and laparoscopic surgery in particular, has profoundly changed how abdominal surgical interventions are performed. Laparoscopic surgery has since evolved toward even less invasive techniques such as Single Port Access Surgery. This technique consists of making a single incision through which the instruments and the laparoscopic camera are introduced via a single multi-port trocar. Its main advantages are a shorter hospital stay for the patient and better cosmetic results, since the trocar is usually inserted through the navel, leaving the scar hidden within it. However, because the instruments enter through the same trocar, the intervention is more difficult for the surgeon, who needs specific skills for this type of procedure. This thesis addresses the problem of navigating surgical instruments with teleoperated robotic platforms in single-port surgery. Specifically, a navigation method is proposed that provides a virtual remote center of rotation coinciding with the instruments' insertion point (the fulcrum point). To estimate this point, the forces exerted by the abdominal wall on the surgical instruments are used, measured by force sensors mounted at the base of the instruments. Because the instruments also interact with soft tissue inside the abdomen, which would distort the estimate of the insertion point, a method is needed to detect this situation. To that end, a tissue-interaction detector based on hidden Markov models is used, trained to detect four generic gestures.
In addition, this thesis proposes the use of haptic guidance to improve the surgeon's experience when using teleoperated robotic platforms. Specifically, Learning from Demonstration is applied to generate forces that guide the surgeon through specific tasks. The proposed navigation method has been implemented on the CISOBOT surgical platform developed by the Universidad de Málaga. The experimental results validate both the proposed navigation method and the soft-tissue interaction detector. A preliminary study of the haptic guidance system has also been carried out, using a generic peg-insertion task for the experiments needed to show that the proposed method can solve this task and similar ones.
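The thesis does not state its estimation algorithm in the abstract; one standard way to estimate a fulcrum point from base-mounted force/torque measurements is linear least squares over the relation tau = p x f (the torque at the sensor is the lever arm to the insertion point crossed with the wall reaction force). A sketch under that assumption, with illustrative synthetic data:

```python
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def estimate_fulcrum(forces, torques):
    """Least-squares estimate of the insertion (fulcrum) point p from
    wrench samples at the instrument base, using tau_i = p x f_i.
    Each sample constrains p in the plane orthogonal to f_i, so the
    forces must not all be parallel."""
    # tau = p x f = -(f x p) = -skew(f) @ p  -> stack and solve for p.
    A = np.vstack([-skew(f) for f in forces])
    b = np.hstack(torques)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Synthetic check: recover a known fulcrum from noise-free wrenches.
rng = np.random.default_rng(1)
p_true = np.array([0.05, -0.02, 0.30])        # fulcrum in base frame (m)
forces = rng.normal(size=(20, 3))             # varied wall reaction forces
torques = np.cross(p_true, forces)            # sensor torques they induce
p_est = estimate_fulcrum(forces, torques)
print(p_est)
```

This also makes the role of the tissue-interaction detector concrete: wrench samples taken while the tool presses on soft tissue violate the single-contact model and must be excluded from the fit.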

    Proceedings of the NASA Conference on Space Telerobotics, volume 2

    These proceedings contain papers presented at the NASA Conference on Space Telerobotics held in Pasadena, January 31 to February 2, 1989. The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required to apply telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest that can be addressed by future research.

    Intelligent Haptic Perception for Physical Robot Interaction

    Doctorate in Mechatronic Engineering. Thesis submitted: 8 January 2020; defended: 30 March 2020. The dream of having robots living among us is coming true thanks to the recent advances in Artificial Intelligence (AI). The gap that still exists between that dream and reality will be filled by scientific research, but manifold challenges are yet to be addressed. Handling the complexity and uncertainty of real-world scenarios is still the major challenge in robotics today. In this respect, novel AI methods are giving robots the capability to learn from experience and therefore to cope with real-life situations. Moreover, we live in a physical world in which physical interactions are both vital and natural. Thus, robots being developed to live among humans must perform tasks that require physical interaction. Haptic perception, conceived as the idea of feeling and processing tactile and kinesthetic sensations, is essential for making this physical interaction possible. This research is inspired by the dream of having robots among us and therefore addresses the challenge of developing robots with haptic perception capabilities that can operate in real-world scenarios. This PhD thesis tackles the problems of physical robot interaction using machine learning techniques. Three AI solutions are proposed for different physical robot interaction challenges: i) grasping and manipulation of humans' limbs; ii) tactile object recognition; iii) control of Variable-Stiffness-Link (VSL) manipulators. The ideas behind this research work have potential robotic applications such as search and rescue, healthcare, or rehabilitation. This dissertation is a compendium of previously published scientific articles.
The baseline of this research comprises five papers published in prestigious peer-reviewed scientific journals and international robotics conferences.

    Augmented Reality Navigation Interfaces Improve Human Performance In End-Effector Controlled Telerobotics

    On the International Space Station (ISS) and space shuttles, the National Aeronautics and Space Administration (NASA) has used robotic manipulators extensively to perform payload handling and maintenance tasks. Teleoperating robots requires expert skills, and optimal performance is crucial to mission completion and crew safety. Degradation in performance is observed when manual control is mediated through remote camera views, resulting in poor end-effector navigation quality and extended task completion times. This thesis explores the application of three-dimensional augmented reality (AR) interfaces specifically designed to improve human performance during end-effector controlled teleoperations. A modular telerobotic test bed was developed for this purpose and several experiments were conducted. In the first experiment, the effect of camera placement on end-effector manipulation performance was evaluated. Results show that increasing misalignment between the displayed end-effector and hand-controller axes (display-control misalignment) increases the time required to process a movement input. Simple AR movement cues were found to mitigate the adverse effects of camera-based teleoperation and made performance invariant to misalignment. Applying these movement cues to payload transport tasks correspondingly demonstrated improvements in free-space navigation quality over conventional end-effector control using multiple cameras. Collision-free teleoperation is also a critical requirement in space. To help operators guide robots safely, a novel method was evaluated: navigation plans computed by a planning agent are presented to the operator sequentially through an AR interface. The plans, in combination with the interface, allow the operator to guide the end-effector safely through collision-free regions of the remote environment. Experimental results show significant benefits in control performance, including reduced path deviation and travel distance.
Overall, the results show that AR interfaces can improve performance during manual control of remote robots and have tremendous potential in current and future teleoperated space robotic systems, as well as in contemporary military and surgical applications.
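The display-control misalignment studied above is, at its core, a fixed rotation between the hand-controller frame and the displayed camera frame. The thesis does not give its remapping formulation; a minimal sketch of the idea, assuming a planar case with a known camera yaw (function name and angles are illustrative):

```python
import numpy as np

def remap_input(hand_input, camera_yaw_deg):
    """Rotate a 2-D hand-controller input into the displayed camera
    frame, so that 'push right on the stick' moves the end effector
    right on screen regardless of where the camera is placed."""
    a = np.radians(camera_yaw_deg)
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return R @ np.asarray(hand_input, dtype=float)

# With the camera yawed 90 degrees, a rightward stick input must be
# remapped to an upward command in the original control frame.
cmd = remap_input([1.0, 0.0], 90.0)
print(cmd)
```

When no such compensation is applied, the operator must perform this rotation mentally, which is one plausible account of the longer input-processing times the first experiment measured; the AR movement cues make the required direction visible instead.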