The introduction of image-guided surgical navigation (IGSN) has greatly
benefited technically demanding surgical procedures by providing real-time
support and guidance to the surgeon. \hi{Developing effective IGSN requires a
careful selection of both the surgical information to present and the medium
through which it is presented to the surgeon. This is not a trivial task,
however, given the broad array of available options.} To address this problem, we have
developed an open-source library that facilitates the development of multimodal
navigation systems in a wide range of surgical procedures relying on medical
imaging data. To provide guidance, our system calculates the minimum distance
between the surgical instrument and the anatomy and then presents this
information to the user through different mechanisms. The real-time performance
of our approach is achieved by calculating Signed Distance Fields at
initialization from segmented anatomical volumes. Using this framework, we
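The idea of precomputing a Signed Distance Field (SDF) from a segmented volume and then querying it per frame can be sketched as follows. This is an illustrative sketch only, not the library's actual API: the function names, the toy cube segmentation, and the nearest-voxel lookup are assumptions, and it uses SciPy's Euclidean distance transform as one plausible way to build the SDF.

```python
# Illustrative sketch of SDF-based distance guidance; names are hypothetical.
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance_field(seg, spacing=(1.0, 1.0, 1.0)):
    """SDF from a binary anatomy segmentation: positive outside, negative inside."""
    seg = seg.astype(bool)
    outside = distance_transform_edt(~seg, sampling=spacing)  # distance to anatomy
    inside = distance_transform_edt(seg, sampling=spacing)    # depth inside anatomy
    return outside - inside

# Precompute once at initialization (expensive) ...
seg = np.zeros((64, 64, 64), dtype=bool)
seg[20:40, 20:40, 20:40] = True  # toy "anatomy" region
sdf = signed_distance_field(seg)

# ... then query cheaply in real time for each instrument pose.
def min_distance(sdf, tip_voxel):
    """Nearest-voxel lookup of the tool tip's distance to the anatomy surface."""
    i, j, k = np.clip(tip_voxel, 0, np.array(sdf.shape) - 1)
    return sdf[i, j, k]
```

A negative query value would indicate that the instrument tip has entered the segmented structure, which is the kind of safety signal the feedback modalities can convey.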
developed a multimodal surgical navigation system to help surgeons navigate
anatomical variability in a skull base surgery simulation environment. Three
different feedback modalities were explored: visual, auditory, and haptic. To
evaluate the proposed system, a pilot user study was conducted in which four
clinicians performed mastoidectomy procedures with and without guidance. Each
condition was assessed using objective performance and subjective workload
metrics. This pilot user study showed improvements in procedural safety without
additional time or workload. These results demonstrate our pipeline's
successful use case in the context of mastoidectomy.

First two authors contributed equally.