Validation of a bovine rectal palpation simulator for training veterinary students
No abstract available
Evaluating Human Performance for Image-Guided Surgical Tasks
The following work focuses on the objective evaluation of human performance for two different interventional tasks: targeted prostate biopsy using a tracked biopsy device, and external ventricular drain placement using a mobile augmented reality device for visualization and guidance. In both tasks, a human performance methodology was applied that respects the trade-off between speed and accuracy for users conducting a series of targeting tasks with each device. This work outlines the development and application of performance evaluation methods for these devices, as well as details of the implementation of the mobile AR application. It was determined that the Fitts' Law methodology can be applied to evaluate tasks performed in each surgical scenario, and was sensitive enough to differentiate performance across a range spanning experienced and novice users. This methodology is valuable for the future development of training modules for these and other medical devices, and can provide details about the underlying characteristics of the devices and how they can be optimized with respect to human performance.
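The abstract does not specify which formulation of Fitts' Law was used; as a minimal sketch, assuming the common Shannon formulation, the speed–accuracy trade-off for a targeting task can be quantified like this (the function names and the example distances are illustrative, not taken from the paper):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits.

    distance: movement amplitude to the target (e.g., mm)
    width: effective target width along the axis of motion (same units)
    """
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits/s: index of difficulty divided by movement time.

    A higher throughput indicates better combined speed and accuracy,
    which is how experienced and novice users can be differentiated.
    """
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical targeting trial: a 4 mm target, 40 mm away, reached in 1.2 s.
trial_id = index_of_difficulty(40, 4)       # log2(11) ≈ 3.46 bits
trial_tp = throughput(40, 4, 1.2)           # ≈ 2.88 bits/s
```

Under this model, movement time is expected to grow linearly with the index of difficulty (MT = a + b·ID), so fitting that regression over trials of varying difficulty yields per-user parameters that can be compared across devices and experience levels.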
Microscope 2.0: An Augmented Reality Microscope with Real-time Artificial Intelligence Integration
The brightfield microscope is instrumental in the visual examination of both biological and physical samples at sub-millimeter scales. One key clinical application has been in cancer histopathology, where the microscopic assessment of tissue samples is used for the diagnosis and staging of cancer and thus guides clinical therapy. However, the interpretation of these samples is inherently subjective, resulting in significant diagnostic variability. Moreover, in many regions of the world, access to pathologists is severely limited due to a lack of trained personnel. In this regard, Artificial Intelligence (AI) based tools promise to improve the access and quality of healthcare. However, despite significant advances in AI research, integration of these tools into real-world cancer diagnosis workflows remains challenging because of the costs of image digitization and difficulties in deploying AI solutions. Here we propose a cost-effective solution to the integration of AI: the Augmented Reality Microscope (ARM). The ARM overlays AI-based information onto the current view of the sample through the optical pathway in real time, enabling seamless integration of AI into the regular microscopy workflow. We demonstrate the utility of the ARM in the detection of lymph node metastases in breast cancer and the identification of prostate cancer with a latency that supports real-time workflows. We anticipate that the ARM will remove barriers towards the use of AI in microscopic analysis and thus improve the accuracy and efficiency of cancer diagnosis. This approach is applicable to other microscopy tasks and AI algorithms in the life sciences and beyond.
Pilot study on virtual imaging for patient information on radiotherapy planning and delivery
It is widely accepted that health professionals might sometimes underestimate cancer patients' needs for information on the complex process of radiotherapy (RT) planning and delivery. Furthermore, relatives might also feel excluded from the treatment of their loved ones. This pilot study was carried out to assess whether both patients and their relatives would welcome further information on RT planning and delivery using the virtual reality (VR) system VERT. One hundred and fifty patients with different types of cancer receiving radical RT were included in the study. Using VERT on a one-to-one basis with an oncologist or a radiographer, patients and relatives were shown a standard room where RT is given, a linear accelerator, and how RT is planned and delivered using their own planning CT scans. Patients welcomed this information, as it helped them to reduce their fears about RT. Relatives also felt more involved in the treatment of their loved one. The results of this pilot study show that VR aids could become an important tool for delivering information on RT to both patients and relatives.
Robotic simulators for tissue examination training with multimodal sensory feedback
Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing a palpation action, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper reviews, for the first time, state-of-the-art medical simulators for tissue examination training, with a consideration of providing multimodal feedback to learn as many manual examination techniques as possible. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities in the development of pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators.
Focal Spot, Summer 2000