The Reconstructive Metaverse – Collaboration in Real-Time Shared Mixed Reality Environments for Microsurgical Reconstruction
Plastic surgeons routinely use 3D models in their clinical practice, from 3D photography and surface imaging to 3D segmentations from radiological scans. However, these models continue to be viewed on flattened 2D screens that neither enable an intuitive understanding of 3D relationships nor support easy collaboration with colleagues. The Metaverse has been proposed as a new generation of applications built on modern Mixed Reality headset technology, allowing remote collaboration on virtual 3D models in a shared physical-virtual space in real time. We demonstrate the first use of the Metaverse in the context of reconstructive surgery, focusing on preoperative planning discussions and trainee education. Using a HoloLens headset with the Microsoft Mesh application, we performed planning sessions for 4 DIEP flaps in our reconstructive metaverse on virtual patient models segmented from routine CT angiography. In these sessions, surgeons discuss perforator anatomy and perforator selection strategies while comprehensively assessing the respective models. We demonstrate the workflow for a one-on-one interaction between an attending surgeon and a trainee in a video featuring both viewpoints as seen through the headset. We believe the Metaverse will provide novel opportunities to use the 3D models that are already created in everyday plastic surgery practice in a more collaborative, immersive, accessible, and educational manner.

Funding: Bayerisch-Kalifornisches Hochschulzentrum (https://doi.org/10.13039/501100014177); Bayerisches Forschungsinstitut für Digitale Transformation (https://doi.org/10.13039/100024171); Bavaria California Technology Center; Bayerisches Staatsministerium für Wissenschaft und Kunst (https://doi.org/10.13039/50110002171)
Leveraging the Apple Ecosystem: Easy Viewing and Sharing of Three-dimensional Perforator Visualizations via iPad/iPhone-based Augmented Reality
Summary: We introduce a novel technique using augmented reality (AR) on smartphones and tablets, making it possible for surgeons to review perforator anatomy in three dimensions on the go. Autologous breast reconstruction with abdominal flaps remains challenging due to the highly variable anatomy of the deep inferior epigastric artery. Computed tomography angiography has mitigated some, but not all, of these challenges. Previously, volume rendering and various headsets were used to enable better three-dimensional (3D) review for surgeons; however, surgeons have remained dependent on others to provide 3D imaging data. Leveraging the ubiquity of Apple devices, our approach permits surgeons to review 3D models of deep inferior epigastric artery anatomy, segmented from abdominal computed tomography angiography, directly on their iPhone or iPad. Segmentation can be performed in common radiology software. The models are converted to the universal scene description zipped (USDZ) format, which allows immediate use on Apple devices without third-party software. They can be easily shared using secure, Health Insurance Portability and Accountability Act–compliant sharing services already provided by most hospitals. Surgeons can simply open the file on their mobile device to explore the images in 3D using "object mode" natively, without additional applications, or can switch to AR mode to pin the model in their real-world surroundings for intuitive exploration. We believe patient-specific 3D anatomy models are a powerful tool for intuitive understanding and communication of complex perforator anatomy and would be a valuable addition to routine clinical practice and education. With this one-click solution on existing devices that is simple to implement, we hope to streamline the adoption of AR models by plastic surgeons.
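The conversion step above relies on the USDZ container format, which per the Pixar specification is simply a zip archive whose members are stored without compression (the full spec additionally requires 64-byte alignment of member data, which Apple's usdzconvert and Reality Converter handle). A minimal Python sketch of the packaging idea, assuming a segmented mesh has already been exported as a binary USD (.usdc) file; this illustrates the container only and is not the authors' pipeline:

```python
import zipfile

def package_usdz(usd_path: str, usdz_path: str) -> None:
    """Bundle a binary USD file into a .usdz archive.

    USDZ is a plain zip archive whose members are STORED
    (zero compression). Caveat: the full USDZ spec also requires
    64-byte alignment of member data, which zipfile does not
    enforce; Apple's usdzconvert / Reality Converter handle that.
    """
    with zipfile.ZipFile(usdz_path, "w",
                         compression=zipfile.ZIP_STORED) as zf:
        zf.write(usd_path, arcname="model.usdc")

def is_uncompressed_zip(usdz_path: str) -> bool:
    """Verify that every archive member is stored uncompressed,
    as the USDZ format requires."""
    with zipfile.ZipFile(usdz_path) as zf:
        return all(info.compress_type == zipfile.ZIP_STORED
                   for info in zf.infolist())
```

In practice, a production workflow would use Apple's official tooling, but the check above is a quick way to see why a USDZ file opens directly in Quick Look: it is an ordinary, uncompressed zip around standard USD content.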
Suture Packaging as a Marker for Intraoperative Image Alignment in Augmented Reality on Mobile Devices
Summary: Preoperative vascular imaging has become standard practice in the planning of microsurgical breast reconstruction. However, translating perforator locations from radiological findings to the patient's abdomen is often neither easy nor intuitive. Techniques using three-dimensional printing or patient-specific guides have been introduced to superimpose anatomy onto the abdomen for reference, and augmented and mixed reality are actively being investigated for perforator mapping by superimposing virtual models directly onto the patient. Most of these techniques have found only limited adoption because they are complicated or expensive, which also limits intraoperative use of augmented reality models; a further critical step is aligning the virtual model to the patient. We propose repurposing suture packaging as an image tracking marker, which allows quick and easy alignment of virtual models to the individual patient's anatomy. Suture packs are sterile, readily available, and can be used to align abdominal models on the patient. Using an iPad, the augmented reality model automatically aligns in the correct position with a suture pack serving as the tracking marker. Given the ubiquity of iPads, combining these devices with readily available suture packs will predictably lower the barrier to entry and utilization of this technology. Here we present our workflow along with its intraoperative utilization, and we additionally investigate the accuracy of this technology.
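Under the hood, marker-based alignment reduces to pose composition: the AR framework (image tracking on the iPad) reports the suture pack's pose in world coordinates, and the virtual abdominal model carries a fixed offset relative to the marker established at planning time. A minimal sketch of that composition using plain 4x4 homogeneous matrices; the function names and example offsets are illustrative, not taken from the paper:

```python
def mat_mul(a, b):
    """Multiply two 4x4 homogeneous transforms (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Pure-translation homogeneous transform."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def place_model(marker_in_world, model_in_marker):
    """Compose the tracked marker's world pose with the model's
    fixed marker-relative offset to obtain the model's world pose.
    This is the core of 'the model automatically aligns': once the
    marker is detected, no manual registration step is needed."""
    return mat_mul(marker_in_world, model_in_marker)
```

For example, if the suture pack is detected 0.3 m along the world x-axis and the model was stored 5 cm further along x and 2 cm along y relative to the marker, `place_model(translation(0.3, 0, 0), translation(0.05, 0.02, 0))` yields the model's world pose directly. In a real ARKit implementation the same composition happens implicitly when the model node is parented to the image anchor.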