
Using Kinect™ and a Haptic Interface for Implementation of Real-Time Virtual Fixtures

By Fredrik Rydén, Howard Jay Chizeck, Sina Nia Kosari, Hawkeye King and Blake Hannaford

Abstract

The use of haptic virtual fixtures is a potential tool to improve the safety of robotic and telerobotic surgery. They can "push back" on the surgeon to prevent unintended surgical tool movements into protected zones. Previous work has suggested generating virtual fixtures from preoperative images such as CT scans. However, these are difficult to establish and register in dynamic environments. This paper demonstrates automatic generation of real-time haptic virtual fixtures using a low-cost Xbox Kinect™ depth camera connected to a virtual environment. This allows generation of virtual fixtures and calculation of haptic forces, which are then passed on to a haptic device. This paper demonstrates that haptic forces can be successfully rendered from real-time environments containing both non-moving and moving objects. This approach has the potential to generate virtual fixtures from the patient in real time during robotic surgery.
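To illustrate the general idea described in the abstract (not the authors' actual algorithm, which is detailed in the full paper), the following is a minimal Python sketch of a "push back" force computed from depth-camera points: the tool tip is repelled from the nearest point in the cloud once it comes within an assumed influence distance. The function name, threshold, and spring constant are illustrative assumptions only.

    import numpy as np

    def repulsive_force(points, tip, d_max=0.02, k=200.0):
        """Simple repulsive 'virtual fixture' force on a haptic tool tip.

        points : (N, 3) array of 3-D points from a depth camera (metres)
        tip    : (3,) array, current haptic tool-tip position (metres)
        d_max  : influence distance; points farther away exert no force
        k      : spring constant (N/m) scaling the push-back force

        Returns a (3,) force vector pushing the tip away from the nearest point.
        Values here are hypothetical, not taken from the paper.
        """
        diffs = tip - points                   # vectors from each point to the tip
        dists = np.linalg.norm(diffs, axis=1)  # distance to every point
        i = np.argmin(dists)                   # nearest point in the cloud
        d = dists[i]
        if d >= d_max or d == 0.0:
            return np.zeros(3)                 # outside the fixture's influence
        direction = diffs[i] / d               # unit vector away from the surface
        return k * (d_max - d) * direction     # spring-like push-back force

    # Example: a flat "wall" of points and a tool tip 1 cm above it.
    wall = np.array([[x, y, 0.0] for x in np.linspace(0, 0.1, 11)
                                 for y in np.linspace(0, 0.1, 11)])
    tip = np.array([0.05, 0.05, 0.01])
    print(repulsive_force(wall, tip))          # force along +z, away from the wall

In a real-time system such as the one the paper describes, a loop of this kind would run at haptic rates, with the point cloud refreshed from the depth camera and the resulting force sent to the haptic device.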

Year: 2014
OAI identifier: oai:CiteSeerX.psu:10.1.1.413.5617
Provided by: CiteSeerX
