Abstract—Haptic virtual fixtures are a promising tool for improving the safety of robotic and telerobotic surgery. They can “push back” on the surgeon to prevent unintended surgical tool movements into protected zones. Previous work has suggested generating virtual fixtures from preoperative images such as CT scans; however, these are difficult to establish and register in dynamic environments. This paper demonstrates automatic generation of real-time haptic virtual fixtures using a low-cost Xbox Kinect™ depth camera connected to a virtual environment. This allows generation of virtual fixtures and calculation of haptic forces, which are then passed on to a haptic device. This paper demonstrates that haptic forces can be successfully rendered from real-time environments containing both stationary and moving objects. This approach has the potential to generate virtual fixtures from the patient in real time during robotic surgery.
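To make the core idea concrete, the sketch below shows one common way a repulsive virtual-fixture force could be computed from a depth camera's point cloud: find the obstacle point nearest the haptic tool tip and, once it is within an activation distance, push the tool away with a spring model. This is an illustrative sketch only, not the paper's implementation; the function name, the stiffness `k`, and the activation distance `d0` are assumptions.

```python
import math

def fixture_force(tool, points, d0=0.02, k=300.0):
    """Illustrative spring-model virtual-fixture force (not the paper's method).

    tool:   (x, y, z) haptic tool-tip position in metres
    points: iterable of (x, y, z) obstacle points from the depth camera
    d0:     assumed activation distance of the fixture (2 cm)
    k:      assumed virtual spring stiffness in N/m
    Returns an (fx, fy, fz) force pushing the tool away from the nearest
    point once the tool comes within d0 of the protected surface.
    """
    best, best_d = None, float("inf")
    for p in points:                      # nearest-neighbour search
        d = math.dist(tool, p)
        if d < best_d:
            best, best_d = p, d
    if best is None or best_d >= d0 or best_d == 0.0:
        return (0.0, 0.0, 0.0)           # outside the fixture: no force
    # unit vector from the nearest obstacle point toward the tool tip
    u = tuple((t - b) / best_d for t, b in zip(tool, best))
    mag = k * (d0 - best_d)              # force grows with penetration depth
    return tuple(mag * c for c in u)
```

In a real-time loop this would run at the haptic update rate, with the point cloud refreshed each camera frame so that moving objects reshape the fixture on the fly.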