321 research outputs found

    Design, Development, and Evaluation of a Teleoperated Master-Slave Surgical System for Breast Biopsy under Continuous MRI Guidance

    The goal of this project is to design and develop a teleoperated master-slave surgical system that can assist the physician in performing breast biopsy with a magnetic resonance imaging (MRI) compatible robotic system. MRI provides superior soft-tissue contrast compared to other imaging modalities such as computed tomography or ultrasound, and is used for both diagnostic and therapeutic procedures. The strong magnetic field and the limited space inside the MRI bore, however, rule out direct means of performing breast biopsy under real-time imaging. Current breast biopsy procedures therefore employ a blind targeting approach based on magnetic resonance (MR) images obtained a priori. Because of possible involuntary patient motion or inaccurate insertion through the registration grid, such an approach can lead to tool-tip positioning errors, degrading diagnostic accuracy and resulting in a long and painful process if repeated procedures are required. Hence, it is desirable to develop the aforementioned teleoperation system to take advantage of real-time MR imaging and avoid multiple biopsy needle insertions, improving procedure accuracy and reducing sampling errors. The design, implementation, and evaluation of the teleoperation system are presented in this dissertation. An MRI-compatible slave robot is implemented, consisting of a 1 degree-of-freedom (DOF) needle driver, a 3-DOF parallel mechanism, and a 2-DOF X-Y stage. The slave robot is actuated by pneumatic cylinders through long transmission lines, except for the 1-DOF needle driver, which is actuated by a piezo motor. Pneumatic actuation through long transmission lines is then investigated using proportional pressure valves, and controllers based on sliding mode control are presented. A dedicated master robot is also developed, and the kinematic map between the master and the slave robot is established. The two robots are integrated into a teleoperation system, and a graphical user interface is developed to provide visual feedback to the physician. MRI experiments show that the slave robot is MRI-compatible, and ex vivo tests show an over 85% success rate in targeting with the MRI-compatible robotic system. The success of in vivo animal experiments further confirms the potential of developing the proposed robotic system for clinical applications.
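    The abstract mentions sliding mode control of the pneumatically actuated axes without giving details. The sketch below shows a minimal boundary-layer sliding mode position controller for a single pneumatic axis; the double-integrator plant assumption, gains, and variable names are illustrative only and are not taken from the dissertation.

```python
import numpy as np

# Hypothetical plant/controller parameters -- not taken from the dissertation.
DT = 0.001          # control period [s]
LAMBDA = 20.0       # sliding-surface slope
K_SMC = 40.0        # switching gain
PHI = 0.02          # boundary-layer width used to soften chattering

def sliding_mode_pressure_command(x, x_dot, x_ref, x_ref_dot, x_ref_ddot):
    """Return a differential-pressure command for a 1-DOF pneumatic axis.

    s = e_dot + LAMBDA * e defines the sliding surface; the command combines
    an equivalent-control term (from an assumed double-integrator model) with
    a saturated switching term for robustness to unmodelled line dynamics.
    """
    e = x_ref - x
    e_dot = x_ref_dot - x_dot
    s = e_dot + LAMBDA * e
    u_eq = x_ref_ddot + LAMBDA * e_dot          # equivalent control (nominal model)
    u_sw = K_SMC * np.clip(s / PHI, -1.0, 1.0)  # boundary-layer switching term
    return u_eq + u_sw

# Example: one control step while tracking a 10 mm step reference.
u = sliding_mode_pressure_command(x=0.0, x_dot=0.0,
                                  x_ref=0.010, x_ref_dot=0.0, x_ref_ddot=0.0)
print(f"pressure-valve command (arbitrary units): {u:.3f}")
```

    The saturated switching term trades a small steady-state band of error for reduced chattering, which matters when the command is routed through long pneumatic transmission lines.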

    Context-aware learning for robot-assisted endovascular catheterization

    Endovascular intervention has become a mainstream treatment for cardiovascular diseases. However, multiple challenges remain, such as unwanted radiation exposure, limited two-dimensional image guidance, and insufficient force perception and haptic cues. Fast-evolving robot-assisted platforms improve the stability and accuracy of instrument manipulation, and the master-slave arrangement also removes radiation exposure to the operator. However, the integration of robotic systems into the current surgical workflow is still debatable, since repetitive, easy tasks gain little from robotic teleoperation. Current systems offer very low autonomy; additional autonomous features could bring further benefits such as reduced cognitive workload and human error, safer and more consistent instrument manipulation, and the ability to incorporate various medical imaging and sensing modalities. This research proposes frameworks for automated catheterisation based on different machine learning algorithms, including Learning from Demonstration, Reinforcement Learning, and Imitation Learning. These frameworks focus on integrating task context into the skill-learning process, thereby achieving better adaptation to different situations and safer tool-tissue interactions. Furthermore, the autonomous features were applied to a next-generation, MR-safe robotic catheterisation platform. The results provide important insights into improving catheter navigation through autonomous task planning and self-optimisation with clinically relevant factors, and motivate the design of intelligent, intuitive, and collaborative robots under non-ionising imaging modalities.
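    As a concrete illustration of the kind of learning framework described above, the sketch below shows a context-conditioned behaviour-cloning policy that maps catheter state plus an imaging/sensing context vector to a motion command. The dimensions, names, and training data are hypothetical; the thesis's actual Learning-from-Demonstration, Reinforcement Learning, and Imitation Learning pipelines are not reproduced here.

```python
import torch
import torch.nn as nn

# Illustrative sizes: catheter-tip state, encoded context features, and a
# two-component action (insertion velocity, rotation velocity).
STATE_DIM, CONTEXT_DIM, ACTION_DIM = 6, 8, 2

class ContextConditionedPolicy(nn.Module):
    """Small MLP that conditions the action on both state and context."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + CONTEXT_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, ACTION_DIM),
        )

    def forward(self, state, context):
        return self.net(torch.cat([state, context], dim=-1))

policy = ContextConditionedPolicy()
optim = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Stand-ins for demonstrations recorded on a master-slave platform.
states = torch.randn(256, STATE_DIM)
contexts = torch.randn(256, CONTEXT_DIM)
expert_actions = torch.randn(256, ACTION_DIM)

for _ in range(100):                      # behaviour-cloning loop
    pred = policy(states, contexts)
    loss = nn.functional.mse_loss(pred, expert_actions)
    optim.zero_grad()
    loss.backward()
    optim.step()
```

    Conditioning on the context vector is what lets the same learned skill adapt its commands to different anatomies or sensing conditions, rather than replaying a single fixed trajectory.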

    ISMCR 1994: Topical Workshop on Virtual Reality. Proceedings of the Fourth International Symposium on Measurement and Control in Robotics

    This symposium on measurement and control in robotics included sessions on: (1) rendering, including tactile perception and applied virtual reality; (2) applications in simulated medical procedures and telerobotics; (3) tracking sensors in a virtual environment; (4) displays for virtual reality applications; (5) sensory feedback, including a virtual environment application with partial gravity simulation; and (6) applications in education, entertainment, technical writing, and animation.

    Interactivity: the missing link between virtual reality technology and drug discovery pipelines

    The potential of virtual reality (VR) to contribute to drug design and development has been recognised for many years. Hardware and software developments now mean that this potential is beginning to be realised, and VR methods are being actively used in this sphere. A recent advance is to use VR not only to visualise and interact with molecular structures, but also to interact with molecular dynamics simulations 'on the fly' (interactive molecular dynamics in VR, iMD-VR), which is useful not only for flexible docking but also for examining binding processes and conformational changes. iMD-VR has been shown to be useful for creating complexes of ligands bound to target proteins, and was recently applied to peptide inhibitors of the SARS-CoV-2 main protease. In this review, we use the term 'interactive VR' to refer to software where interactivity is an inherent part of the user's VR experience, e.g., in making structural modifications or interacting with a physically rigorous molecular dynamics (MD) simulation, as opposed to simply using VR controllers to rotate and translate the molecule for enhanced visualisation. Here, we describe these methods and their application to problems relevant to drug discovery, highlighting the possibilities that they offer in this arena. We suggest that the ease of viewing and manipulating molecular structures and dynamics, and the ability to modify structures on the fly (e.g., adding or deleting atoms), makes modern interactive VR a valuable tool to add to the armoury of drug development methods.
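    To make the iMD-VR idea concrete, the toy sketch below adds a user-applied force (standing in for input from a VR controller) to the physical forces on one atom at every step of a velocity-Verlet integration. The two-atom harmonic "molecule" and all parameters are invented for illustration; real iMD-VR software streams user forces into a full MD engine rather than the hand-rolled integrator shown here.

```python
import numpy as np

# Toy two-atom system: a harmonic bond with rest length 1.0 (arbitrary units).
DT, MASS, K_BOND = 0.001, 1.0, 100.0
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])   # atom positions
vel = np.zeros_like(pos)

def physical_forces(p):
    """Forces from the harmonic bond between atom 0 and atom 1."""
    d = p[1] - p[0]
    r = np.linalg.norm(d)
    f = K_BOND * (r - 1.0) * d / r          # force on atom 0 (toward/away from atom 1)
    return np.array([f, -f])

user_force_on_atom1 = np.array([0.0, 5.0, 0.0])       # "pull" from the VR controller

for step in range(1000):                               # velocity-Verlet integration
    f = physical_forces(pos)
    f[1] += user_force_on_atom1                        # blend in the user's pull
    vel += 0.5 * DT * f / MASS
    pos += DT * vel
    f_new = physical_forces(pos)
    f_new[1] += user_force_on_atom1
    vel += 0.5 * DT * f_new / MASS

print("atom 1 displaced to:", np.round(pos[1], 3))
```

    The essential point is that the user's force enters the same integration loop as the physical forces, so the response the user sees and feels remains physically consistent with the simulation.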

    A virtual hand assessment system for efficient outcome measures of hand rehabilitation

    Previously held under moratorium from 1st December 2016 until 1st December 2021. Hand rehabilitation is an extremely complex and critical process in the medical rehabilitation field, mainly because of the highly articulated functionality of the hand. Recent research has focused on employing new technologies, such as robotics and system control, to improve the precision and efficiency of the standard clinical methods used in hand rehabilitation. However, the designs of these devices have either been oriented toward a particular hand injury or have depended heavily on subjective assessment techniques to evaluate progress. These limitations reduce the efficiency of hand rehabilitation devices by providing less effective results for restoring the lost functionality of a dysfunctional hand. In this project, a novel and efficient hand assessment system is produced that can objectively measure the restoration outcome and dynamically evaluate its performance. The proposed system uses a sensorised data glove to measure the ranges of motion of the hand joints, and a virtual reality system to provide an illustrative and safe visual assistance environment that can self-adjust to the subject's performance. The system implements an original finger performance measurement method for analysing the various hand functionalities. This is achieved by extracting multiple features of the hand digits' motions, such as speed, consistency of finger movements, and stability during hold positions. Furthermore, an advanced data glove calibration method was developed and implemented in order to accurately manipulate the virtual hand model and calculate the hand's kinematic movements in compliance with its biomechanical structure. The experimental studies were performed on a controlled group of 10 healthy subjects (aged 25 to 42 years). The results showed intra-subject reliability between trials (average cross-correlation ρ = 0.7) and inter-subject repeatability across the subjects' performance (p > 0.01 for the sessions with real objects, with a few departures in some of the virtual reality sessions). In addition, the finger performance values were found to be very efficient in detecting the multiple elements of finger performance, including the load effect on the forearm. Moreover, the electromyography measurements in the virtual reality sessions showed high sensitivity in detecting the tremor effect (the mean power frequency difference on the right extensor digitorum muscle is 176 Hz). Also, the finger performance values for the virtual reality sessions have the same average distance as the real-life sessions (RSQ = 0.07). Besides offering an efficient and quantitative evaluation of hand performance, the system was proven compatible with different hand rehabilitation techniques, where it can outline the primarily affected parts of the hand dysfunction. It can also be easily adjusted to comply with the subject's specifications and clinical hand assessment procedures to autonomously detect the classification task events and analyse them with high reliability. The developed system is also adaptable to disciplines other than hand rehabilitation, such as ergonomic studies, hand robot control, brain-computer interfaces, and other fields involving hand control.
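    The finger performance measures mentioned above (speed, consistency of movement, and stability during hold positions) can be illustrated with a short feature-extraction sketch over a single joint-angle trace from a data glove. The sampling rate, window boundaries, and exact metric definitions below are assumptions for illustration, not the thesis's actual formulations.

```python
import numpy as np

FS = 100.0                                  # assumed glove sampling rate [Hz]

def finger_performance(angles, hold_start, hold_end):
    """angles: 1-D array of one joint's flexion angle [deg] over a trial."""
    velocity = np.gradient(angles) * FS                    # deg/s
    speed = np.mean(np.abs(velocity))                      # average angular speed
    consistency = 1.0 / (1.0 + np.std(np.abs(velocity)))   # higher = smoother motion
    hold = angles[hold_start:hold_end]
    stability = 1.0 / (1.0 + np.std(hold))                 # higher = steadier hold
    return {"speed": speed, "consistency": consistency, "stability": stability}

# Synthetic trial: flex to 60 deg, hold with slight tremor, then release.
t = np.arange(0, 1.0, 1.0 / FS)
trial = np.concatenate([
    np.linspace(0, 60, 100),                        # flexion phase
    60 + 0.5 * np.sin(2 * np.pi * 8 * t),           # hold phase with 8 Hz tremor
    np.linspace(60, 0, 100),                        # release phase
])
print(finger_performance(trial, hold_start=100, hold_end=200))
```

    In practice such features would be computed per joint and per task event, then compared across sessions to quantify the restoration outcome.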

    A white paper: NASA virtual environment research, applications, and technology

    Research support for Virtual Environment technology development has been a part of NASA's human factors research program since 1985. Under the auspices of the Office of Aeronautics and Space Technology (OAST), initial funding was provided to the Aerospace Human Factors Research Division at Ames Research Center, which resulted in the origination of this technology. Since 1985, other Centers have begun using and developing this technology. At each research and space flight center, NASA missions have been major drivers of the technology. This White Paper was the joint effort of all the Centers that have been involved in the development of the technology and its applications to their unique missions. Appendix A lists those who worked to prepare the document, directed by Dr. Cynthia H. Null, Ames Research Center, and Dr. James P. Jenkins, NASA Headquarters. This White Paper describes the technology and its applications in NASA Centers (Chapters 1, 2 and 3), the potential roles it can take in NASA (Chapters 4 and 5), and a roadmap for the next 5 years (FY 1994-1998). The audience for this White Paper consists of managers, engineers, scientists, and the general public with an interest in Virtual Environment technology. Those who read the paper will determine whether this roadmap, or another, is to be followed.

    Mechanisms of motor learning: by humans, for robots

    Whenever we perform a movement and interact with objects in our environment, our central nervous system (CNS) adapts and controls the redundant system of muscles actuating our limbs to produce suitable forces and impedance for the interaction. As modern robots are increasingly used to interact with objects, humans, and other robots, they too need to continuously adapt their interaction forces and impedance to the situation. This thesis investigated these motor mechanisms in humans through a series of technical developments and experiments, and used the results to implement biomimetic motor behaviours on a robot. Original tools were first developed, enabling two novel motor imaging experiments using functional magnetic resonance imaging (fMRI). The first experiment investigated the neural correlates of force and impedance control to understand the control structure employed by the human brain. The second experiment developed a regressor-free technique to detect dynamic changes in brain activation during learning, and applied this technique to investigate changes in neural activity during adaptation to force fields and visuomotor rotations. In parallel, a psychophysical experiment investigated motor optimization in humans in a task characterized by multiple error-effort optima. Finally, a computational model derived from some of these results was implemented to exhibit human-like control and adaptation of force, impedance, and movement trajectory in a robot.
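    The closing sentence refers to a controller that adapts force and impedance together. The sketch below shows one simple, hypothetical version of that idea for a 1-DOF unit-mass joint: feedforward force and stiffness are raised while tracking error persists and decay again once it is small. The gains, plant, and disturbance are invented for illustration and are not the thesis's actual model.

```python
import numpy as np

DT, STEPS = 0.001, 5000
ALPHA_F, ALPHA_K = 5.0, 50.0     # adaptation rates for force and stiffness
GAMMA = 1e-4                     # forgetting factor (keeps effort low)
K_MIN, B = 5.0, 2.0              # baseline stiffness and damping

x, x_dot = 0.0, 0.0              # joint position and velocity
x_ref = 0.1                      # constant target position
u_ff, k_ad = 0.0, 0.0            # adapted feedforward force and extra stiffness

for _ in range(STEPS):
    e = x_ref - x
    disturbance = -1.0           # unknown constant load the controller must learn
    u = u_ff + (K_MIN + k_ad) * e - B * x_dot
    x_dot += DT * (u + disturbance)      # unit-mass dynamics, semi-implicit Euler
    x += DT * x_dot
    # Raise feedforward and stiffness while error persists; let both decay
    # (forgetting) once the error becomes small.
    u_ff += ALPHA_F * e * DT - GAMMA * u_ff
    k_ad = max(k_ad + ALPHA_K * abs(e) * DT - GAMMA * k_ad, 0.0)

print(f"residual error {x_ref - x:+.4f}, learned feedforward {u_ff:.3f}, "
      f"learned stiffness {K_MIN + k_ad:.2f}")
```

    The forgetting term is what drives the trade-off between error and effort: impedance and feedforward force relax as soon as they are no longer needed, qualitatively matching the human behaviour the abstract describes.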