41 research outputs found

    Method for robotic motion compensation during PET imaging of mobile subjects

    Studies of the human brain during natural activities, such as locomotion, would benefit from the ability to image deep brain structures during these activities. While Positron Emission Tomography (PET) can image these structures, the bulk and weight of current scanners are not compatible with the desire for a wearable device. This has motivated the design of a robotic system that supports a PET imaging system around the subject's head and moves it to accommodate natural motion. We report here the design and experimental evaluation of a prototype robotic system that senses motion of a subject's head using parallel string encoders connected between the robot-supported imaging ring and a helmet worn by the subject. This measurement is used to robotically move the imaging ring (coarse motion correction) and to compensate for residual motion during image reconstruction (fine motion correction). Minimization of latency and of measurement error are the key design goals for coarse and fine motion correction, respectively. The system is evaluated using recorded human head motions during locomotion, with a mock imaging system consisting of lasers and cameras, and is shown to provide an overall system latency of about 80 ms, which is sufficient for coarse motion correction and collision avoidance, as well as a measurement accuracy of about 0.5 mm for fine motion correction. Comment: 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
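
As a rough illustration of the coarse motion-correction idea described above, the sketch below computes a proportional ring offset from the measured helmet pose in the imaging-ring frame; the function names, gain, and frames are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of coarse motion correction: given the measured pose of the
# helmet relative to the imaging ring, command the ring so the head returns
# to the ring's nominal center. Names and the gain value are hypothetical
# placeholders, not the paper's API.
import numpy as np

NOMINAL_CENTER = np.zeros(3)          # desired head position in ring frame (m)
GAIN = 0.8                            # assumed proportional gain on each correction step

def coarse_correction_step(T_ring_head: np.ndarray) -> np.ndarray:
    """Return the translational offset to apply to the ring.

    T_ring_head is a 4x4 homogeneous transform of the helmet frame expressed
    in the imaging-ring frame (assumed available from the string encoders).
    """
    head_pos = T_ring_head[:3, 3]
    error = head_pos - NOMINAL_CENTER
    return GAIN * error               # move the ring toward the head

# Example: head has drifted 20 mm forward and 5 mm up relative to the ring.
T = np.eye(4)
T[:3, 3] = [0.020, 0.0, 0.005]
print(coarse_correction_step(T))      # -> ring offset of about [0.016, 0, 0.004] m
```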

    Calibration and evaluation of a motion measurement system for PET imaging studies

    Positron Emission Tomography (PET) enables functional imaging of deep brain structures, but the bulk and weight of current systems preclude their use during many natural human activities, such as locomotion. The proposed long-term solution is to construct a robotic system that can support an imaging system surrounding the subject's head, and then move the system to accommodate natural motion. This requires a system to measure the motion of the head with respect to the imaging ring, for use by both the robotic system and the image reconstruction software. We report here the design, calibration, and experimental evaluation of a parallel string encoder mechanism for sensing this motion. Our results indicate that with kinematic calibration, the measurement system can achieve accuracy within 0.5 mm, especially for small motions. Comment: arXiv admin note: text overlap with arXiv:2311.1786
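
As an illustration of how a parallel string-encoder measurement can be resolved, the sketch below recovers a helmet-mounted point from measured string lengths by nonlinear least squares; the anchor coordinates and noise level are invented, and the paper's kinematic calibration (estimating the anchor and attachment geometry itself) is only noted in the comments.

```python
# A minimal sketch: each encoder reports the length of a string from a known
# anchor on the imaging ring to a point on the helmet, and the point is
# recovered by nonlinear least squares. Anchor coordinates here are made up
# for illustration; the paper's kinematic calibration additionally refines
# these anchor positions (and string-length offsets) from reference motions.
import numpy as np
from scipy.optimize import least_squares

anchors = np.array([     # assumed encoder anchor positions on the ring (m)
    [0.30, 0.00, 0.00],
    [-0.15, 0.26, 0.00],
    [-0.15, -0.26, 0.00],
    [0.00, 0.00, 0.30],
])

def residuals(p, lengths):
    # difference between measured string lengths and distances to candidate point p
    return np.linalg.norm(anchors - p, axis=1) - lengths

true_point = np.array([0.02, -0.01, 0.10])
measured = np.linalg.norm(anchors - true_point, axis=1) + np.random.normal(0, 2e-4, 4)

sol = least_squares(residuals, x0=np.zeros(3), args=(measured,))
print(sol.x)   # recovered helmet point, typically within ~0.5 mm of true_point
```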

    Optical Fiber-Based Needle Shape Sensing in Real Tissue: Single Core vs. Multicore Approaches

    Flexible needle insertion procedures are common in minimally invasive surgeries for diagnosing and treating prostate cancer. Bevel-tip needles give physicians the ability to steer the needle during long insertions to avoid vital anatomical structures in the patient and reduce post-operative patient discomfort. To provide needle placement feedback to the physician, sensors are embedded into needles to determine the real-time 3D shape of the needle during operation without needing to visualize the needle intra-operatively. Through extensive research in fiber optics, a range of biocompatible, MRI-compatible optical shape sensors have been developed to provide real-time shape feedback, such as single-core and multicore fiber Bragg gratings. In this paper, we directly compare single-core fiber-based and multicore fiber-based needle shape sensing through identically constructed, four-active-area sensorized bevel-tip needles inserted into phantom and ex vivo tissue on the same experimental platform. We found that for shape sensing in phantom tissue the two needles performed identically (p = 0.164 > 0.05), but in ex vivo real tissue the single-core fiber sensorized needle significantly outperformed the multicore fiber configuration (p = 0.0005 < 0.05). This paper also presents the experimental platform and method for directly comparing these optical shape sensors for the needle shape-sensing task, and provides direction, insight, and required considerations for future work on constructively optimizing sensorized needles.
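
For context on how curvature measurements from the active areas translate into a needle shape, the sketch below interpolates assumed curvature readings along the needle's arc length and integrates the local frame to obtain a 3D deflection; it is a generic reconstruction under invented values, not the specific pipeline used in the paper.

```python
# Rough sketch of a common shape-reconstruction approach: curvature components
# measured at the active areas are interpolated along arc length, then the
# local frame is integrated down the needle. Sensor locations and curvature
# values below are assumed for illustration.
import numpy as np

def reconstruct_shape(s_sensors, kappa_xy, length=0.13, ds=1e-3):
    s = np.arange(0.0, length, ds)
    kx = np.interp(s, s_sensors, kappa_xy[:, 0])
    ky = np.interp(s, s_sensors, kappa_xy[:, 1])
    R, p, shape = np.eye(3), np.zeros(3), [np.zeros(3)]
    for i in range(len(s) - 1):
        # curvature about the local x and y axes; the tangent is the local z axis
        omega = np.array([kx[i], ky[i], 0.0]) * ds
        angle = np.linalg.norm(omega)
        if angle > 1e-12:
            k = omega / angle
            K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
            R = R @ (np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K))
        p = p + R[:, 2] * ds          # advance along the current tangent
        shape.append(p.copy())
    return np.array(shape)

# Four active areas with assumed curvature readings (1/m):
s_aa = np.array([0.01, 0.05, 0.09, 0.12])
kappa = np.array([[0.5, 0.0], [1.0, 0.2], [1.5, 0.3], [1.8, 0.4]])
print(reconstruct_shape(s_aa, kappa)[-1])   # deflected tip position of a 130 mm needle (m)
```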

    Multi-Contact Force-Sensing Guitar for Training and Therapy

    Hand injuries from repetitive high strain and physical overload can hamper or even end a musician's career. To help musicians develop safer playing habits, we developed a multi-contact force-sensing array that can substitute for a guitar fretboard. The system consists of 72 individual force-sensing modules, each containing a flexure and a photointerrupter that measures the corresponding deflection when forces are applied. The system is capable of measuring forces between 0 and 25 N applied anywhere within the first 12 frets at a rate of 20 Hz, with an average accuracy of 0.4 N and a resolution of 0.1 N. Accompanied by a GUI, the resulting prototype was received positively by novice and expert musicians as a useful tool for learning and injury prevention. Comment: IEEE Sensor Conference, 201
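
As a sketch of the per-module sensing principle (photointerrupter reading → flexure deflection → force), the snippet below applies an assumed calibration polynomial and flexure stiffness; the coefficients are invented for illustration, not measured values from the prototype.

```python
# Each module's photointerrupter reading is mapped to flexure deflection via a
# per-module calibration curve, and force follows from the flexure stiffness.
# Calibration coefficients and stiffness below are made-up example values.
import numpy as np

CAL = np.poly1d([1.0e-5, 0.004, 0.0])   # assumed ADC counts -> deflection (mm)
STIFFNESS_N_PER_MM = 9.5                # assumed flexure stiffness

def counts_to_force(adc_counts: int) -> float:
    deflection_mm = CAL(adc_counts)
    return float(np.clip(STIFFNESS_N_PER_MM * deflection_mm, 0.0, 25.0))

print(counts_to_force(180))             # ~10 N on one fret module for this example calibration
```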

    Fiber bragg gratings for medical applications and future challenges: A review

    In recent decades, fiber Bragg gratings (FBGs) have become increasingly attractive for medical applications due to their unique properties, such as small size, biocompatibility, immunity to electromagnetic interference, high sensitivity, and multiplexing capability. FBGs have been employed in the development of surgical tools, assistive devices, wearables, and biosensors, showing great potential for medical use. This paper reviews FBG-based measuring systems, their working principles, and their applications in medicine and healthcare. Particular attention is given to sensing solutions for biomechanics, minimally invasive surgery, physiological monitoring, and medical biosensing. Strengths, weaknesses, open challenges, and future trends are also discussed to highlight how FBGs can meet the demands of next-generation medical devices and healthcare systems.
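
Since the review centers on FBG-based sensing, the standard relation between Bragg-wavelength shift, strain, and temperature may help make the principle concrete. The constants below are approximate textbook values for silica fiber, an assumption rather than figures taken from the review.

```python
# Standard FBG relation: delta_lambda / lambda = (1 - p_e) * strain + (alpha + xi) * delta_T,
# evaluated with typical (approximate) silica-fiber constants.
P_E = 0.22        # effective photo-elastic coefficient of silica
ALPHA = 0.55e-6   # thermal expansion coefficient (1/K)
XI = 6.7e-6       # thermo-optic coefficient (1/K)

def strain_from_shift(d_lambda_nm, lambda_b_nm=1550.0, delta_t_k=0.0):
    """Recover axial strain from a measured Bragg-wavelength shift."""
    rel_shift = d_lambda_nm / lambda_b_nm
    return (rel_shift - (ALPHA + XI) * delta_t_k) / (1.0 - P_E)

# At 1550 nm the strain sensitivity is ~1.2 pm per microstrain, so a 0.012 nm
# shift corresponds to roughly 10 microstrain:
print(strain_from_shift(0.012) * 1e6, "microstrain")
```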

    Haptic-Enhanced Virtual Reality Simulator for Robot-Assisted Femur Fracture Surgery

    In this paper, we develop a virtual reality (VR) simulator for the Robossis robot-assisted femur fracture surgery system. Due to the steep learning curve for such procedures, a VR simulator is essential for training surgeons and staff. The Robossis Surgical Simulator (RSS) is designed to immerse users in a realistic surgical setting with the Robossis system, modeled on a previously completed real-world cadaveric procedure. The RSS interfaces the Sigma-7 Haptic Controller with the Robossis Surgical Robot (RSR) and the Meta Quest VR headset. Results show that the RSR follows user commands in 6 DOF and prevents the overlapping of bone segments. This development demonstrates a promising avenue for future implementation of the Robossis system. Comment: This paper is submitted to the IEEE Haptic Symposium 202
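
As a minimal illustration of the kind of constraint implied by "prevents the overlapping of bone segments", the sketch below rejects commanded poses that would bring two bone fragments' bounding spheres closer than a minimum clearance; the geometry and thresholds are assumptions, not taken from the Robossis simulator.

```python
# Simple overlap check between two bone fragments approximated by bounding
# spheres; a commanded pose for the distal fragment is rejected if it would
# violate the required clearance. All values are illustrative.
import numpy as np

PROXIMAL_CENTER = np.array([0.0, 0.0, 0.0])   # fixed fragment center (m)
FRAGMENT_RADIUS = 0.018                        # assumed bounding radius (m)
MIN_GAP = 0.002                                # required clearance (m)

def is_pose_allowed(distal_center: np.ndarray) -> bool:
    separation = np.linalg.norm(distal_center - PROXIMAL_CENTER)
    return separation >= 2 * FRAGMENT_RADIUS + MIN_GAP

print(is_pose_allowed(np.array([0.05, 0.0, 0.0])))   # True: fragments stay apart
print(is_pose_allowed(np.array([0.02, 0.0, 0.0])))   # False: would overlap
```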

    Design and Experimental Evaluation of a Haptic Robot-Assisted System for Femur Fracture Surgery

    In the face of challenges encountered during femur fracture surgery, such as high rates of malalignment and X-ray exposure to operating personnel, robot-assisted surgery has emerged as an alternative to conventional state-of-the-art surgical methods. This paper introduces the development of Robossis, a haptic system for robot-assisted femur fracture surgery. Robossis comprises a 7-DOF haptic controller and a 6-DOF surgical robot. A unilateral control architecture is developed to address the kinematic mismatch and the motion transfer between the haptic controller and the Robossis surgical robot. A real-time motion control pipeline is designed to handle the motion transfer and is evaluated through experimental testing. The analysis shows that the Robossis surgical robot can adhere to the desired trajectory from the haptic controller with an average translational error of 0.32 mm and a rotational error of 0.07 deg. Additionally, a haptic rendering pipeline is developed to resolve the kinematic mismatch by constraining the haptic controller (user hand) movement within the permissible joint limits of the Robossis surgical robot. Lastly, in a cadaveric lab test, the Robossis system assisted surgeons during a mock femur fracture surgery. The results show that Robossis can provide an intuitive solution for surgeons performing femur fracture surgery. Comment: This paper is to be submitted to an IEEE journal.
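
To make the reported accuracy figures concrete, below is a minimal sketch of how translational and rotational tracking errors between commanded and actual robot poses are typically computed; it is a generic evaluation metric under assumed poses, not the paper's code.

```python
# Translational error as the Euclidean distance between commanded and actual
# positions; rotational error as the geodesic angle of the relative rotation.
# Poses below are invented for illustration.
import numpy as np

def translational_error_mm(p_cmd, p_act):
    return 1000.0 * np.linalg.norm(np.asarray(p_cmd) - np.asarray(p_act))

def rotational_error_deg(R_cmd, R_act):
    # angle of the relative rotation R_cmd^T R_act
    cos_theta = (np.trace(R_cmd.T @ R_act) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

p_cmd, p_act = [0.100, 0.050, 0.200], [0.1003, 0.050, 0.2001]
print(translational_error_mm(p_cmd, p_act))      # ~0.32 mm for this example
print(rotational_error_deg(np.eye(3), np.eye(3)))  # 0.0 deg
```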

    Interactive Multi-Stage Robotic Positioner for Intra-Operative MRI-Guided Stereotactic Neurosurgery

    Magnetic resonance imaging (MRI) demonstrates clear advantages over other imaging modalities in neurosurgery with its ability to delineate critical neurovascular structures and cancerous tissue in high-resolution 3D anatomical roadmaps. However, its application has been limited to interventions performed based on static pre/post-operative imaging, where errors accrue from stereotactic frame setup, image registration, and brain shift. To leverage the powerful intra-operative functions of MRI, e.g., instrument tracking and monitoring of physiological changes and tissue temperature, in MRI-guided bilateral stereotactic neurosurgery, a multi-stage robotic positioner is proposed. The system positions cannula/needle instruments using a lightweight (203 g) and compact (Ø97 × 81 mm) skull-mounted structure that fits within most standard imaging head coils. With an optimized soft robotics design, the system operates in two stages: i) manual coarse adjustment performed interactively by the surgeon (workspace of ±30°), and ii) automatic fine adjustment with precise (<0.2° orientation error), responsive (1.4 Hz bandwidth), and high-resolution (0.058°) soft robotic positioning. Orientation locking provides sufficient transmission stiffness (4.07 N/mm) for instrument advancement. The system's clinical workflow and accuracy are validated with lab-based (<0.8 mm) and MRI-based testing on skull phantoms (<1.7 mm) and a cadaver subject (<2.2 mm). Custom-made wireless omni-directional tracking markers facilitated robot registration under MRI.
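
As a rough illustration of the two-stage idea (coarse manual adjustment followed by fine soft robotic positioning), the sketch below splits a single-axis orientation command between an assumed coarse stage and a fine stage with the quoted 0.058° resolution; the split logic and numbers are illustrative only.

```python
# One-axis toy version of the two-stage positioning logic: the manual coarse
# stage absorbs the bulk of the reorientation (within its ±30° workspace) and
# the fine soft robotic stage corrects the residual at its finer resolution.
COARSE_RANGE_DEG = 30.0
FINE_RESOLUTION_DEG = 0.058

def split_orientation_command(target_deg: float):
    coarse = max(-COARSE_RANGE_DEG, min(COARSE_RANGE_DEG, round(target_deg)))
    residual = target_deg - coarse
    fine = round(residual / FINE_RESOLUTION_DEG) * FINE_RESOLUTION_DEG
    return coarse, fine, target_deg - (coarse + fine)

coarse, fine, err = split_orientation_command(12.73)
print(coarse, fine, abs(err))   # -> 13, about -0.29, residual ~0.02° (within the 0.058° resolution)
```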

    1.5 T augmented reality navigated interventional MRI: paravertebral sympathetic plexus injections

    PURPOSE: The high contrast resolution and absence of ionizing radiation in interventional magnetic resonance imaging (MRI) can be advantageous for paravertebral sympathetic nerve plexus injections. We assessed the feasibility and technical performance of MRI-guided paravertebral sympathetic injections utilizing augmented reality navigation and a 1.5 T MRI scanner. METHODS: A total of 23 bilateral injections of the thoracic (8/23, 35%), lumbar (8/23, 35%), and hypogastric (7/23, 30%) paravertebral sympathetic plexus were prospectively planned in twelve human cadavers using a 1.5 Tesla (T) MRI scanner and an augmented reality navigation system. MRI-conditional needles were used. Gadolinium-DTPA-enhanced saline was injected. Outcome variables included the number of control magnetic resonance images, target error of the needle tip, punctures of critical nontarget structures, distribution of the injected fluid, and procedure length. RESULTS: Augmented reality-navigated MRI guidance at 1.5 T provided detailed anatomical visualization for successful targeting of the paravertebral space, needle placement, and perineural paravertebral injections in 46 of 46 targets (100%). A mean of 2 images (range, 1–5 images) was required to control needle placement. Changes of the needle trajectory occurred in 9 of 46 targets (20%) and changes of needle advancement occurred in 6 of 46 targets (13%), neither of which was statistically related to spinal region (P = 0.728 and P = 0.86, respectively) or cadaver size (P = 0.893 and P = 0.859, respectively). The mean error of the needle tip was 3.9 ± 1.7 mm. There were no punctures of critical nontarget structures. The mean procedure length was 33 ± 12 min. CONCLUSION: 1.5 T augmented reality-navigated interventional MRI can provide accurate imaging guidance for perineural injections of the thoracic, lumbar, and hypogastric sympathetic plexus.
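
For readers interested in how the reported target error is typically quantified, the sketch below computes the Euclidean needle-tip error against planned targets and summarizes it as mean ± standard deviation; the coordinates are fabricated solely to demonstrate the computation and are not study data.

```python
# Target error per injection = distance between planned target and final
# needle-tip position; summarized as mean ± SD over all targets.
import numpy as np

planned = np.array([[12.0, 40.5, -3.0], [15.2, 38.0, -1.0], [11.0, 41.0, 0.5]])  # mm, fabricated
actual  = np.array([[14.1, 42.0, -2.0], [16.0, 40.5, -0.2], [13.5, 43.5, 1.0]])  # mm, fabricated

errors = np.linalg.norm(actual - planned, axis=1)
print(f"target error: {errors.mean():.1f} ± {errors.std(ddof=1):.1f} mm")
```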