Augmented Reality Assistance for Surgical Interventions using Optical See-Through Head-Mounted Displays
Augmented Reality (AR) offers an interactive user experience by enhancing the real-world environment with computer-generated visual cues and other perceptual information. It has been applied in many domains, e.g. manufacturing, entertainment, and healthcare, through different AR media. An Optical See-Through Head-Mounted Display (OST-HMD) is specialized AR hardware in which computer-generated graphics are overlaid directly onto the user's normal vision via optical combiners. Using an OST-HMD for surgical intervention has many potential perceptual advantages. As a novel concept, however, OST-HMD-based AR faces many technical and clinical challenges before it can be clinically useful, which motivates the work presented in this thesis.
On the technical side, we first investigate the display calibration of the OST-HMD, an indispensable procedure for creating an accurate AR overlay. We propose several methods to reduce user-related error, improve the robustness of the calibration, and remodel the calibration as a 3D-3D registration problem. Secondly, we devise methods and develop a hardware prototype to increase the user's visual acuity for both real and virtual content seen through the OST-HMD, aiding tasks that demand high visual acuity, e.g. dental procedures. Thirdly, we investigate the occlusion caused by the OST-HMD hardware itself, which limits the user's peripheral vision. We propose alternative indicators to alert the user to motion in unattended regions of the environment.
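The abstract does not give the thesis's exact calibration formulation, but the 3D-3D rigid registration problem it refers to has a well-known closed-form solution (the Kabsch/Umeyama method). As an illustrative sketch only, not the author's implementation, the registration step can be written as:

```python
import numpy as np

def register_3d_3d(P, Q):
    """Rigid 3D-3D registration (Kabsch method, no scale).

    Given corresponding point sets P and Q, each of shape (N, 3),
    find rotation R and translation t minimizing ||(P @ R.T + t) - Q||.
    """
    # Centroids of both point sets
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the SVD solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

In an OST-HMD calibration context, P might be points measured in a tracker's coordinate frame and Q the same points expressed in the display frame; solving for (R, t) then aligns virtual content with the user's view.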
From the clinical perspective, we identified several clinical use cases where OST-HMD-based AR is potentially helpful, developed applications integrated with current clinical systems, and conducted proof-of-concept evaluations. We first present a "virtual monitor" for image-guided surgery. It can replace real radiology monitors in the operating room, offering easier user control and more flexible positioning. We evaluated the virtual monitor in simulated percutaneous spine procedures. Secondly, we developed ARssist, an application for the bedside assistant in robotic surgery. With ARssist, the assistant can see the robotic instruments and the endoscope inside the patient's body. We evaluated the assistant's efficiency, safety, and ergonomics during two typical tasks: instrument insertion and instrument manipulation. ARssist significantly improved the performance of inexperienced users and significantly enhanced the confidence of experienced users. Lastly, we developed ARAMIS, which uses real-time 3D reconstruction and visualization to aid the laparoscopic surgeon, demonstrating the concept of "X-ray see-through" surgery. Our preliminary evaluation, based on a peg transfer task, validated the application and showed significant improvement in hand-eye coordination.
Overall, we have demonstrated that OST-HMD-based AR applications provide ergonomic improvements, e.g. in hand-eye coordination. In challenging situations or for novice users, these ergonomic improvements translate into improved task performance. With continued community effort, optical see-through augmented reality technology will become a useful interventional aid in the near future.
3D-in-2D Displays for ATC.
This paper reports on the efforts and accomplishments of the 3D-in-2D Displays for ATC project at the end of Year 1. We describe the invention of 10 novel 3D/2D visualisations, most of which were implemented using the ARToolkit augmented reality framework. These prototype implementations of visualisation and interaction elements can be viewed in the accompanying video. We have identified six candidate design concepts which we will research and develop further. These designs correspond to the early feasibility-studies stage of maturity as defined by the NASA Technology Readiness Level framework. We developed the Combination Display Framework from a review of the literature, and used it to analyse display designs in terms of the display techniques used and how they are combined. The insights gained from this framework then guided our inventions and the human-centered, iterative innovation process. Our designs are based on an understanding of user work practices. We also developed a simple ATC simulator for rapid experimentation and evaluation of design ideas. We expect that, if this project continues, the effort in Years 2 and 3 will focus on maturing the concepts and deploying them in an operational laboratory setting.