14 research outputs found

    Method for robotic motion compensation during PET imaging of mobile subjects

    Full text link
    Studies of the human brain during natural activities, such as locomotion, would benefit from the ability to image deep brain structures during these activities. While Positron Emission Tomography (PET) can image these structures, the bulk and weight of current scanners are not compatible with the desire for a wearable device. This has motivated the design of a robotic system to support a PET imaging system around the subject's head and to move the system to accommodate natural motion. We report here the design and experimental evaluation of a prototype robotic system that senses motion of a subject's head using parallel string encoders connected between the robot-supported imaging ring and a helmet worn by the subject. This measurement is used to robotically move the imaging ring (coarse motion correction) and to compensate for residual motion during image reconstruction (fine motion correction). Minimizing latency and measurement error are the key design goals for coarse and fine motion correction, respectively. The system is evaluated using recorded human head motions during locomotion, with a mock imaging system consisting of lasers and cameras, and is shown to provide an overall system latency of about 80 ms, which is sufficient for coarse motion correction and collision avoidance, as well as a measurement accuracy of about 0.5 mm for fine motion correction. Comment: 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
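    The coarse/fine split described above can be sketched as a simple control loop: a proportional servo step moves the imaging ring toward the measured head pose, and whatever offset remains is logged for reconstruction-time correction. This is an illustrative sketch only; the function names, the proportional control law, and the gain are assumptions, not the paper's actual controller.

```python
# Hypothetical sketch of coarse vs. fine motion correction (all names and the
# proportional control law are assumptions, not taken from the paper).

def coarse_step(ring_pos, head_pos, gain=0.5):
    """One servo tick: move the ring a fraction of the way toward the head (per axis, mm)."""
    return [r + gain * (h - r) for r, h in zip(ring_pos, head_pos)]

def residual(ring_pos, head_pos):
    """Leftover head-to-ring offset, handled during image reconstruction (fine correction)."""
    return [h - r for r, h in zip(ring_pos, head_pos)]

ring = [0.0, 0.0, 0.0]
head = [10.0, -4.0, 2.0]     # measured head displacement, mm (made-up values)
for _ in range(5):           # a few control ticks (the paper reports ~80 ms loop latency)
    ring = coarse_step(ring, head)
res = residual(ring, head)   # small residual left for fine correction
```

    With this gain, the residual shrinks by half each tick, illustrating why coarse correction only needs low latency rather than high accuracy: the sub-millimeter remainder is absorbed by the reconstruction-time fine correction.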

    Calibration and evaluation of a motion measurement system for PET imaging studies

    Full text link
    Positron Emission Tomography (PET) enables functional imaging of deep brain structures, but the bulk and weight of current systems preclude their use during many natural human activities, such as locomotion. The proposed long-term solution is to construct a robotic system that can support an imaging system surrounding the subject's head, and then move the system to accommodate natural motion. This requires a system to measure the motion of the head with respect to the imaging ring, for use by both the robotic system and the image reconstruction software. We report here the design, calibration, and experimental evaluation of a parallel string encoder mechanism for sensing this motion. Our results indicate that with kinematic calibration, the measurement system can achieve accuracy within 0.5 mm, especially for small motions. Comment: arXiv admin note: text overlap with arXiv:2311.1786
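    The core geometric problem behind a parallel string encoder mechanism is recovering a point's position from measured string lengths to known anchor points. A minimal sketch of that idea (not the paper's calibration method, and with made-up anchor geometry): subtracting one sphere equation from the others cancels the quadratic terms, leaving a small linear system.

```python
# Illustrative trilateration from four string lengths (a generic sketch; the
# paper's actual kinematic model and calibration procedure are not shown here).

def trilaterate(anchors, lengths):
    """anchors: four non-coplanar (x, y, z) points; lengths: string lengths to each."""
    (x0, y0, z0), *rest = anchors
    d0 = lengths[0]
    A, b = [], []
    for (xi, yi, zi), di in zip(rest, lengths[1:]):
        # Subtracting sphere 0 from sphere i cancels the |p|^2 term.
        A.append([2 * (xi - x0), 2 * (yi - y0), 2 * (zi - z0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2 + zi**2 - z0**2)

    # Solve the resulting 3x3 linear system A p = b by Cramer's rule.
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(A)
    p = []
    for c in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][c] = b[r]
        p.append(det3(Ac) / D)
    return tuple(p)
```

    In practice the anchor positions themselves are imperfectly known, which is exactly what kinematic calibration corrects; the abstract's 0.5 mm figure reflects accuracy after that calibration.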

    Optical Fiber-Based Needle Shape Sensing in Real Tissue: Single Core vs. Multicore Approaches

    Full text link
    Flexible needle insertion procedures are common in minimally invasive surgeries for diagnosing and treating prostate cancer. Bevel-tip needles give physicians the capability to steer the needle during long insertions, avoiding vital anatomical structures in the patient and reducing post-operative patient discomfort. To provide needle placement feedback to the physician, sensors are embedded in needles to determine the real-time 3D shape of the needle during operation, without needing to visualize the needle intra-operatively. Extensive research in fiber optics has produced a range of bio-compatible, MRI-compatible optical shape sensors that provide real-time shape feedback, such as single-core and multicore fiber Bragg gratings. In this paper, we directly compare single-core fiber-based and multicore fiber-based needle shape sensing using identically constructed, four-active-area sensorized bevel-tip needles inserted into phantom and ex vivo tissue on the same experimental platform. We found that for shape sensing in phantom tissue, the two needles performed identically (p-value of 0.164 > 0.05), but in ex vivo real tissue, the single-core fiber sensorized needle significantly outperformed the multicore fiber configuration (p-value of 0.0005 < 0.05). This paper also presents the experimental platform and method for directly comparing these optical shape sensors for the needle shape-sensing task, and provides direction, insight, and required considerations for future work on constructively optimizing sensorized needles.
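    A comparison like the one above rests on a per-insertion shape-error metric computed for each sensor type before any statistical test. A plausible such metric (assumed here, not taken from the paper) is the RMS deviation between the sensed needle shape and a ground-truth shape, both sampled as 3D points along the needle:

```python
# Minimal sketch of a shape-error metric (an assumption for illustration; the
# paper's exact metric and statistical procedure are not reproduced here).
import math

def shape_rmse(sensed, truth):
    """Root-mean-square Euclidean deviation between two equally sampled 3D shapes."""
    assert len(sensed) == len(truth)
    sq = [math.dist(p, q) ** 2 for p, q in zip(sensed, truth)]
    return math.sqrt(sum(sq) / len(sq))

truth  = [(0.0, 0.0, float(z)) for z in range(5)]   # straight reference shape, mm
sensed = [(0.1, 0.0, float(z)) for z in range(5)]   # sensed shape, 0.1 mm lateral offset
error_mm = shape_rmse(sensed, truth)
```

    Collecting this metric over repeated insertions for each needle yields the two samples on which a significance test (and hence the reported p-values) can be run.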

    Multi-Contact Force-Sensing Guitar for Training and Therapy

    Full text link
    Hand injuries from repetitive high strain and physical overload can hamper or even end a musician's career. To help musicians develop safer playing habits, we developed a multiple-contact force-sensing array that can substitute for a guitar fretboard. The system consists of 72 individual force-sensing modules, each containing a flexure and a photointerrupter that measures the corresponding deflection when forces are applied. The system is capable of measuring forces between 0 and 25 N applied anywhere within the first 12 frets at a rate of 20 Hz, with an average accuracy of 0.4 N and a resolution of 0.1 N. Accompanied by a GUI, the resulting prototype was received positively by novice and expert musicians as a useful tool for learning and injury prevention. Comment: IEEE Sensor Conference, 201
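    The sensing chain in each module (photointerrupter reading, flexure deflection, force) can be sketched as a linear conversion clipped to the stated 0-25 N range. All calibration constants below are made-up placeholders; the paper does not publish them.

```python
# Hypothetical per-module conversion: ADC reading -> deflection -> force.
# The constants are illustrative assumptions, not the device's calibration.

ADC_AT_REST = 512            # photointerrupter reading with no load (hypothetical)
MM_PER_COUNT = 0.002         # flexure deflection per ADC count (hypothetical)
STIFFNESS_N_PER_MM = 25.0    # flexure stiffness (hypothetical)

def counts_to_force(adc):
    """Convert a raw photointerrupter reading to a force in newtons."""
    deflection_mm = (ADC_AT_REST - adc) * MM_PER_COUNT
    force = deflection_mm * STIFFNESS_N_PER_MM
    return min(max(force, 0.0), 25.0)   # clamp to the sensor's stated 0-25 N range
```

    A per-module calibration of this kind (one rest offset and one gain per flexure) is what lets 72 mechanically similar modules report consistent forces across the fretboard.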

    Haptic-Enhanced Virtual Reality Simulator for Robot-Assisted Femur Fracture Surgery

    Full text link
    In this paper, we develop a virtual reality (VR) simulator for Robossis robot-assisted femur fracture surgery. Due to the steep learning curve for such procedures, a VR simulator is essential for training surgeons and staff. The Robossis Surgical Simulator (RSS) is designed to immerse users in a realistic surgical setting with the Robossis system, as used in a previously completed real-world cadaveric procedure. The RSS interfaces the Sigma-7 haptic controller with the Robossis Surgical Robot (RSR) and the Meta Quest VR headset. Results show that the RSR follows user commands in 6 DOF and prevents overlap of bone segments. This development demonstrates a promising avenue for future implementation of the Robossis system. Comment: This paper is submitted to the IEEE Haptic Symposium 202

    Design and Experimental Evaluation of a Haptic Robot-Assisted System for Femur Fracture Surgery

    Full text link
    In the face of challenges encountered during femur fracture surgery, such as high rates of malalignment and X-ray exposure to operating personnel, robot-assisted surgery has emerged as an alternative to conventional state-of-the-art surgical methods. This paper introduces the development of Robossis, a haptic system for robot-assisted femur fracture surgery. Robossis comprises a 7-DOF haptic controller and a 6-DOF surgical robot. A unilateral control architecture is developed to address the kinematic mismatch and the motion transfer between the haptic controller and the Robossis surgical robot. A real-time motion control pipeline handles the motion transfer and is evaluated through experimental testing. The analysis shows that the Robossis surgical robot adheres to the desired trajectory from the haptic controller with an average translational error of 0.32 mm and a rotational error of 0.07 deg. Additionally, a haptic rendering pipeline is developed to resolve the kinematic mismatch by constraining the haptic controller (user hand) movement within the permissible joint limits of the Robossis surgical robot. Lastly, in a cadaveric lab test, the Robossis system assisted surgeons during a mock femur fracture surgery. The results show that Robossis can provide an intuitive solution for surgeons performing femur fracture surgery. Comment: This paper is to be submitted to an IEEE journal
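    The joint-limit constraint and the tracking-error evaluation described above can be sketched in a few lines. This is a generic illustration under assumed names and limits, not the Robossis control implementation.

```python
# Illustrative sketch (hypothetical names and limit values): clamp a commanded
# configuration into the surgical robot's permissible joint range, and measure
# the translational tracking error between desired and achieved positions.

def clamp_to_limits(command, limits):
    """Clamp each commanded joint value into its (lo, hi) limit pair."""
    return [min(max(q, lo), hi) for q, (lo, hi) in zip(command, limits)]

def translational_error(desired, actual):
    """Euclidean distance between desired and achieved positions (mm)."""
    return sum((d - a) ** 2 for d, a in zip(desired, actual)) ** 0.5

# Example: the first joint command exceeds its limit and is clamped.
limits = [(-1.0, 1.0), (-3.0, 3.0)]          # hypothetical joint limits
clamped = clamp_to_limits([1.5, -2.0], limits)
```

    Clamping on the haptic side, rather than on the robot, is what lets the user feel the workspace boundary instead of silently commanding an unreachable pose, which is the role the haptic rendering pipeline plays here.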

    1.5 T augmented reality navigated interventional MRI: paravertebral sympathetic plexus injections

    Get PDF
    PURPOSE: The high contrast resolution and absent ionizing radiation of interventional magnetic resonance imaging (MRI) can be advantageous for paravertebral sympathetic nerve plexus injections. We assessed the feasibility and technical performance of MRI-guided paravertebral sympathetic injections using augmented reality navigation and a 1.5 T MRI scanner. METHODS: A total of 23 bilateral injections of the thoracic (8/23, 35%), lumbar (8/23, 35%), and hypogastric (7/23, 30%) paravertebral sympathetic plexus were prospectively planned in twelve human cadavers using a 1.5 Tesla (T) MRI scanner and an augmented reality navigation system. MRI-conditional needles were used. Gadolinium-DTPA-enhanced saline was injected. Outcome variables included the number of control magnetic resonance images, target error of the needle tip, punctures of critical nontarget structures, distribution of the injected fluid, and procedure length. RESULTS: Augmented reality-navigated MRI guidance at 1.5 T provided detailed anatomical visualization for successful targeting of the paravertebral space, needle placement, and perineural paravertebral injections in 46 of 46 targets (100%). A mean of 2 images (range, 1–5 images) was required to control needle placement. Changes of the needle trajectory occurred in 9 of 46 targets (20%) and changes of needle advancement occurred in 6 of 46 targets (13%), which were not statistically related to spinal region (P = 0.728 and P = 0.86, respectively) or cadaver size (P = 0.893 and P = 0.859, respectively). The mean error of the needle tip was 3.9±1.7 mm. There were no punctures of critical nontarget structures. The mean procedure length was 33±12 min. CONCLUSION: 1.5 T augmented reality-navigated interventional MRI can provide accurate imaging guidance for perineural injections of the thoracic, lumbar, and hypogastric sympathetic plexus.

    Robotic assistant for transperineal prostate interventions

    No full text
    Abstract. Numerous studies have demonstrated the efficacy of image-guided needle-based therapy and biopsy in the management of prostate cancer. The accuracy of traditional prostate interventions performed using transrectal ultrasound (TRUS) is limited by image fidelity, needle template guides, needle deflection, and tissue deformation. Magnetic Resonance Imaging (MRI) is an ideal modality for guiding and monitoring such interventions due to its excellent visualization of the prostate, its sub-structure, and surrounding tissues. We have designed a comprehensive robotic assistant system that allows prostate biopsy and brachytherapy procedures to be performed entirely inside a 3T closed MRI scanner. We present a detailed design of the robotic manipulator and an evaluation of its usability and MR compatibility.

    Systematic calibration of an integrated x-ray and optical tomography system for preclinical radiation research

    No full text
    The cone beam computed tomography (CBCT) guided small animal radiation research platform (SARRP) has been developed for focal tumor irradiation, allowing laboratory researchers to test basic biological hypotheses that can modify radiotherapy outcomes in ways that were not previously feasible. CBCT provides excellent bone to soft tissue contrast, but is incapable of differentiating tumors from surrounding soft tissue. Bioluminescence tomography (BLT), in contrast, allows direct visualization of even subpalpable tumors and quantitative evaluation of tumor response. Integration of BLT with CBCT offers complementary image information, with CBCT delineating anatomic structures and BLT differentiating luminescent tumors. The aim of this study was to develop a systematic method to calibrate an integrated CBCT and BLT imaging system that can be adopted onboard the SARRP to guide focal tumor irradiation. The integrated imaging system consists of CBCT, diffuse optical tomography (DOT), and BLT. The anatomy acquired from CBCT and the optical properties acquired from DOT serve as a priori information for the subsequent BLT reconstruction. Phantoms were designed and procedures were developed to calibrate the CBCT, DOT/BLT, and the entire integrated system. Geometrical calibration was performed to calibrate the CBCT system. Flat field correction was performed to correct the nonuniform response of the optical imaging system. Absolute emittance calibration was performed to convert the camera readout to the emittance at the phantom or animal surface, which enabled direct reconstruction of the bioluminescence source strength. Phantom and mouse imaging were performed to validate the calibration. All calibration procedures were successfully performed. CBCT of both a thin wire and a euthanized mouse revealed no spatial artifacts, validating the accuracy of the CBCT calibration.
The absolute emittance calibration was validated with a 650 nm laser source, resulting in a 3.0% difference between simulated and measured signal. The calibration of the entire system was confirmed through CBCT and BLT reconstruction of a bioluminescence source placed inside a tissue-simulating optical phantom. Using a spatial region constraint, the source position was reconstructed with less than 1 mm error and the source strength with less than 24% error. A practical and systematic method has been developed to calibrate an integrated x-ray and optical tomography imaging system, including the respective CBCT and optical tomography system calibrations and the geometrical calibration of the entire system. The method can be modified and adopted to calibrate CBCT and optical tomography systems that are operated independently, or hybrid x-ray and optical tomography imaging systems.
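    The flat field correction named in the abstract is a standard operation: each pixel's reading is normalized by the optical system's response to a uniform ("flat") exposure, after dark-frame subtraction. A minimal generic sketch (not the paper's exact procedure) on images flattened to 1-D pixel lists:

```python
# Standard flat-field correction sketch (generic illustration; the paper's
# actual correction pipeline and camera model are not reproduced here).

def flat_field_correct(raw, flat, dark):
    """Correct per-pixel gain nonuniformity: (raw - dark) scaled by mean flat response."""
    gain = [f - d for f, d in zip(flat, dark)]         # per-pixel response to uniform light
    mean_gain = sum(gain) / len(gain)                  # preserves the overall signal level
    return [(r - d) * mean_gain / g for r, d, g in zip(raw, dark, gain)]
```

    After this step a uniform scene produces a uniform image, which is the precondition for the subsequent absolute emittance calibration that maps camera counts to physical surface emittance.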