180 research outputs found

    Robotic simulators for tissue examination training with multimodal sensory feedback

    Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing a palpation action, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper reviews, for the first time, state-of-the-art medical simulators for this training, with a focus on providing multimodal feedback so that as many manual examination techniques as possible can be learned. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities in pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators.

    W-FYD: a Wearable Fabric-based Display for Haptic Multi-Cue Delivery and Tactile Augmented Reality

    Despite the importance of softness, there is no evidence of wearable haptic systems able to deliver controllable softness cues. Here, we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based display for multi-cue delivery that can be worn on the user's finger and enables, for the first time, both active and passive softness exploration. It can also induce a sliding effect under the finger-pad. A given stiffness profile is obtained by modulating the stretching state of the fabric through two motors, while a lifting mechanism brings the fabric into contact with the user's finger-pad to enable passive softness rendering. In this paper, we describe the architecture of W-FYD and a thorough characterization of its stiffness workspace, frequency response, and softness rendering capabilities. We also computed the device's just-noticeable difference (JND) in both active and passive exploratory conditions, for linear and non-linear stiffness rendering as well as for sliding direction perception, and considered the effect of device weight. Furthermore, participants' performance and their subjective quantitative evaluations in sliding direction detection and softness discrimination tasks are reported. Finally, applications of W-FYD in tactile augmented reality for open palpation are discussed, opening interesting perspectives in many fields of human-machine interaction.
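As an illustration of how a just-noticeable difference can be estimated from discrimination data, the sketch below linearly interpolates the 75%-correct point of a psychometric curve. The function name and the data are hypothetical and illustrative, not the W-FYD study's actual procedure or results.

```python
# Hypothetical sketch: estimate a JND as the stimulus increment at which
# 75% of discrimination responses are correct, by linear interpolation.
# The increments and proportions below are synthetic example data.

def jnd_75(increments, p_correct):
    """Interpolate the increment at which 75% of responses are correct."""
    pts = list(zip(increments, p_correct))
    for (x0, p0), (x1, p1) in zip(pts, pts[1:]):
        if p0 <= 0.75 <= p1:
            return x0 + (0.75 - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("75% point not bracketed by the data")

incs = [0.05, 0.10, 0.15, 0.20]   # stiffness increments (N/mm), synthetic
pcs = [0.55, 0.70, 0.85, 0.95]    # proportion correct at each increment
```

In practice the 75% point is usually read off a fitted sigmoid rather than a piecewise-linear curve; interpolation is shown here only to keep the sketch self-contained.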

    A fabric-based approach for wearable haptics

    In recent years, wearable haptic systems (WHS) have gained increasing attention as a novel and exciting paradigm for human-robot interaction (HRI). These systems can be worn by users, carried around, and integrated into their everyday lives, thus enabling a more natural way of delivering tactile cues. At the same time, the design of these devices presents new issues: the challenge is the correct identification of design guidelines, with the two-fold goal of minimizing system encumbrance and increasing the effectiveness and naturalness of stimulus delivery. Fabrics can represent a viable solution to these issues. They are specifically conceived “to be worn”, and could be the key ingredient for developing wearable haptic interfaces suited to a more natural HRI. In this paper, the author reviews examples of fabric-based WHS that can be applied to different body locations and elicit different haptic perceptions for different application fields. Perspectives and future developments of this approach are also discussed.

    Tactile Sensing System for Lung Tumour Localization during Minimally Invasive Surgery

    Video-assisted thoracoscopic surgery (VATS) is becoming a prevalent method for lung cancer treatment. However, VATS suffers from the inability to accurately relay haptic information to the surgeon, often making tumour localization difficult. This limitation was addressed by the design of a tactile sensing system (TSS) consisting of a probe with a tactile sensor and interfacing visualization software. In this thesis, TSS performance was tested to determine the feasibility of implementing the system in VATS. This was accomplished through a series of ex vivo experiments: the tactile sensor was calibrated, the visualization software was modified to present haptic information visually to the user, and TSS performance was compared against human and robot palpation methods and conventional VATS instruments. It was concluded that the device offers the possibility of restoring to the surgeon the haptic information lost during surgery, thereby mitigating one of the current limitations of VATS.

    Investigating the Feasibility of Using Focussed Airborne Ultrasound as Tactile Feedback in Medical Simulators

    Novice medical practitioners commonly practice on live patients during real medical procedures. However, due to the practitioner's inexperience, mistakes are likely, exposing the patient to undue risk. To improve the training of novices, medical simulators create a virtual patient, providing a safe environment in which to practice. An important clinical skill is palpation, a physical examination technique in which practitioners use their hands to feel the patient's body and make a diagnosis. A virtual patient has a visual representation but, being virtual, is not physically present. Haptics technology adds to the training session by stimulating the physical sense of touch. A novel technique for stimulating tactile sensation has recently emerged: acoustic radiation pressure from focussed airborne ultrasound. It creates a focal point of concentrated acoustic pressure in a three-dimensional field, producing a force in mid-air, and has several advantages over conventional technologies. It was also initially theorised that airborne ultrasound would offer better controllability over the displayed sensation than a previous system called PalpSim, which consisted of a water-filled rubber tube permanently embedded in a block of silicone, and could therefore simulate a wider range of tactile sensations. This thesis investigates the feasibility of using focussed airborne ultrasound as tactile feedback in medical simulators. A tactile device called UltraSendo was custom built to simulate an arterial pulse and a thrill sensation, and was integrated with an augmented reality simulator displaying a virtual patient for user interaction. The simulator was brought to Ysbyty Glan Clwyd hospital for user feedback, and a wide range of responses were gathered. The majority of respondents felt the arterial pulse was not sufficiently realistic, while the thrill sensation received higher ratings and was judged acceptably realistic. This positive feedback suggests that airborne ultrasound can indeed provide tactile feedback in a medical context and is better at simulating a thrill sensation than a pulse sensation.
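The focusing principle described above can be sketched numerically: each transducer in a phased array fires with a delay that compensates for its distance to the focal point, so all wavefronts arrive there in phase. The array geometry and values below are illustrative assumptions, not UltraSendo's actual layout.

```python
import math

# Hedged sketch of phased-array focusing for airborne ultrasound.
# Transducers sit in the z = 0 plane; the focal point is above them.
# The farthest transducer from the focus fires first (zero delay).

SPEED_OF_SOUND = 343.0  # m/s in air (40 kHz carriers are typical here)

def focus_delays(transducer_xy, focus_xyz):
    """Per-transducer firing delays (s) so wavefronts meet at the focus."""
    dists = [math.dist((x, y, 0.0), focus_xyz) for x, y in transducer_xy]
    d_max = max(dists)
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]
```

For a focus directly above the array centre, symmetric transducers get equal delays, and the centre element (closest to the focus) fires last.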

    Microscope Embedded Neurosurgical Training and Intraoperative System

    In recent years, neurosurgery has been strongly influenced by new technologies. Computer Aided Surgery (CAS) offers several benefits for patients' safety, but fine techniques targeted at minimally invasive and atraumatic treatment are required, since intra-operative false movements can be devastating and result in patient deaths. The precision of the surgical gesture is related both to the accuracy of the available technological instruments and to the surgeon's experience. In this frame, medical training is particularly important. From a technological point of view, the use of Virtual Reality (VR) for surgeon training and Augmented Reality (AR) for intra-operative treatment offers the best results. In addition, traditional techniques for surgical training include the use of animals, phantoms, and cadavers. The main limitations of these approaches are that live tissue has different properties from dead tissue and that animal anatomy differs significantly from human anatomy. From the medical point of view, Low-Grade Gliomas (LGGs) are intrinsic brain tumours that typically occur in younger adults. The objective of treatment is to remove as much of the tumour as possible while minimizing damage to the healthy brain. Pathological tissue may closely resemble normal brain parenchyma when viewed through the neurosurgical microscope, so the tactile appreciation of the different consistency of tumour and normal brain requires considerable experience on the part of the neurosurgeon and is a vital point. The first part of this PhD thesis presents a system for realistic simulation (visual and haptic) of spatula palpation of an LGG. This is the first prototype of a training system using VR, haptics, and a real microscope for neurosurgery. The architecture can also be adapted for intra-operative purposes. In that case, the surgeon needs the basic setup for Image Guided Therapy (IGT) interventions: microscope, monitors, and navigated surgical instruments. The same virtual environment can be AR-rendered onto the microscope optics. The objective is to enhance the surgeon's intra-operative orientation by giving him a three-dimensional view and other information necessary for safe navigation inside the patient. These considerations motivated the second part of this work, which was devoted to improving a prototype AR stereoscopic microscope for neurosurgical interventions, developed in our institute in previous work. Completely new software has been developed to reuse the microscope hardware while enhancing both rendering performance and usability. Since AR and VR share the same platform, the system can be referred to as a Mixed Reality System for neurosurgery. All the components are open source or at least based on a GPL license.

    Haptic assessment of tissue stiffness in locating and identifying gynaecological cancer in human tissue

    Gynaecological surgeons are not able to gather adequate tissue feedback during minimal access surgery for cancer treatment. This can result in failure to locate tumour boundaries and to ensure these are completely resected within tumour-free resection margins. Surgeons achieve significantly better surgical and oncological outcomes if they can identify the precise location of a gynaecological tumour. Indeed, the true nature of a tumour, whether benign or cancerous, is often not known prior to surgery. If more details were available on the characteristics that differentiate gynaecological cancer in tumours, this would enable more accurate diagnosis and help in the planning of surgery. HYPOTHESIS: Haptic technology has the potential to enhance the surgeon's degree of perception during minimal access surgery. Alteration in tissue stiffness in gynaecological tumours, thought to be associated with the accelerated multiplication of cancer cells, should allow their location to be identified and help in determining the likelihood of malignancy. METHOD: Setting: (i) Guy's & St Thomas' Hospital; (ii) Dept of Informatics, King's College London. Permission from the National Research Ethics Committee and Research & Development (R&D) approval were sought from the National Health Service. The Phantom Omni, capable of 3D motion tracking and fitted with a nano-17 force sensor, was used to capture real-time position and force data. Uniaxial indentation palpation was used: the indentation depth was calculated as the displacement of the probe from the surface to the deepest point of each contact, and the tissue stiffness (TS) was then calculated from it. The haptic probe was tested first on silicone models with embedded nodules mimicking tumours. This was followed by assessing TS ex vivo with the haptic probe on fresh human gynaecological organs removed in surgery. Tissue stiffness maps were generated in real time by converting stiffness values into RGB values. Surgeons also manually palpated the organs and recorded the site of the tumour. Histology was used as the gold standard for tumour location and cancer diagnosis. Manual palpation and haptic data were compared for accuracy of tumour location, and the tissue stiffness measured by the haptic probe was compared between cancer and control specimens. Several data analysis techniques were applied to derive results. CONTRIBUTIONS: A haptic indentation probe was tested for the first time on fresh human gynaecological organs to locate cancer in a clinical setting. This is the first evaluation of the accuracy of cancer diagnosis in human gynaecological organs with a force-sensing haptic indentation probe measuring tissue stiffness.
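The stiffness-map step described above can be sketched as follows. The linear stiffness estimate (force over indentation depth) and the blue-to-red colour scale are illustrative assumptions, not the thesis's exact mapping or calibration range.

```python
# Illustrative sketch: stiffness as applied force over indentation depth,
# mapped onto a blue-to-red colour scale (stiffer = redder). The k_min
# and k_max bounds are made-up values for demonstration only.

def stiffness(force_n, indentation_mm):
    """Linear tissue stiffness estimate in N/mm."""
    return force_n / indentation_mm

def stiffness_to_rgb(k, k_min=0.1, k_max=2.0):
    """Map stiffness k (N/mm) to an (R, G, B) triple for a stiffness map."""
    t = max(0.0, min(1.0, (k - k_min) / (k_max - k_min)))
    return (int(round(255 * t)), 0, int(round(255 * (1 - t))))
```

Rendering one such colour per indentation site over the probed surface yields the real-time stiffness map described in the abstract.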

    Haptic feedback control designs in teleoperation systems for minimally invasive surgery


    A Framework for Tumor Localization in Robot-Assisted Minimally Invasive Surgery

    Manual palpation of tissue is frequently used in open surgery, e.g., for localization of tumors and buried vessels and for tissue characterization. The overall objective of this work is to explore how tissue palpation can be performed in Robot-Assisted Minimally Invasive Surgery (RAMIS) using laparoscopic instruments conventionally used in RAMIS. This thesis presents a framework in which a surgical tool is moved teleoperatively in a manner analogous to the repetitive pressing motion of a finger during manual palpation. Changes in parameters induced by this motion, such as the applied force and the resulting indentation depth, are interpreted to accurately determine the variation in tissue stiffness. This approach requires sensorizing the laparoscopic tool for force sensing; in our work, we used a da Vinci needle driver sensorized in our lab at CSTAR using Fiber Bragg Gratings (FBGs). A computer vision algorithm was developed for 3D surgical tool-tip tracking using the da Vinci's stereo endoscope, enabling us to measure the changes in surface indentation that result from pressing the needle driver on the tissue. The proposed palpation framework is based on the hypothesis that indentation depth is inversely proportional to tissue stiffness when a constant pressing force is applied. This was validated in a telemanipulated setup using the da Vinci surgical system with a phantom in which artificial tumors were embedded to represent areas of different stiffness. The high-stiffness regions representing tumors and the low-stiffness regions representing healthy tissue showed average indentation depths of 5.19 mm and 10.09 mm, respectively, while maintaining a maximum force of 8 N during robot-assisted palpation. These indentation depth variations were then separated using the k-means clustering algorithm to classify regions of low and high stiffness, and the results were presented in a colour-coded map. The unique feature of this framework is its use of a conventional laparoscopic tool and minimal redesign of the existing da Vinci surgical setup. Additional work includes a vision-based algorithm for tracking the motion of a tissue surface, such as that of the lung, resulting from respiratory and cardiac motion. The extracted motion information was analyzed to characterize lung tissue stiffness based on lateral strain variations as the surface inflates and deflates.
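The clustering step can be sketched with a minimal 1-D k-means (k = 2) over indentation depths. The depth samples below are synthetic, chosen only to echo the roughly 5 mm (tumour) versus 10 mm (healthy tissue) groups reported above; this is not the thesis code, which may well use a library implementation.

```python
# Minimal 1-D k-means (k = 2) sketch: split indentation-depth samples
# into "stiff" (shallow) and "soft" (deep) groups. Assumes both clusters
# stay non-empty, which holds for clearly bimodal data like this.

def kmeans_1d(values, iters=20):
    """Cluster values into two groups; returns the (low, high) centroids."""
    c_lo, c_hi = min(values), max(values)  # initial centroids
    for _ in range(iters):
        lo = [v for v in values if abs(v - c_lo) <= abs(v - c_hi)]
        hi = [v for v in values if abs(v - c_lo) > abs(v - c_hi)]
        if not lo or not hi:               # degenerate split: stop early
            break
        c_lo, c_hi = sum(lo) / len(lo), sum(hi) / len(hi)
    return c_lo, c_hi

depths_mm = [5.0, 5.3, 5.2, 10.1, 9.8, 10.3]   # synthetic indentation depths
c_stiff, c_soft = kmeans_1d(depths_mm)          # shallow = stiff (tumour)
```

Each palpation site can then be assigned to whichever centroid its depth is closer to, giving the two-class colour-coded map described in the abstract.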