395 research outputs found

    Haptic and visual simulation of bone dissection

    Marco Agus. Doctoral dissertation, Università degli Studi di Cagliari, Facoltà di Ingegneria, Dipartimento di Ingegneria Meccanica. In bone dissection virtual simulation, force restitution is the key to realistically mimicking a patient-specific operating environment. The force is rendered using haptic devices controlled by parametrized mathematical models of the bone–burr contact. This dissertation presents and discusses a haptic simulation of a bone-cutting burr that is being developed as a component of a training system for temporal bone surgery. A physically based model was used to describe the burr–bone interaction, including haptic force evaluation, the bone erosion process, and the resulting debris. The model was experimentally validated and calibrated using a custom experimental set-up consisting of a force-controlled robot arm holding a high-speed rotating tool and a contact-force measuring apparatus. Psychophysical testing was also carried out to assess individual reactions to the haptic environment. The results suggest that the simulator is capable of rendering the basic material differences required for bone-burring tasks. The current implementation, operating directly on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real-time haptic and visual feedback on a low-end multi-processing PC platform.
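The burr–bone contact model described above can be illustrated with a minimal sketch: an elastic force proportional to the tool's penetration into a voxel density grid, plus an erosion step that removes density near the burr tip. All names, constants, and the dict-based grid below are illustrative assumptions, not the dissertation's implementation.

```python
import math

def burr_step(density, tip, radius, k_elastic=400.0, k_erode=0.05):
    """One combined force/erosion step on a voxel density grid.

    Hypothetical sketch (not the dissertation's model):
    density : dict mapping (x, y, z) voxel -> bone density in [0, 1]
    tip     : burr-tip position (x, y, z) in voxel coordinates
    radius  : burr radius in voxels
    Returns (force_xyz, updated_density).
    """
    force = [0.0, 0.0, 0.0]
    eroded = dict(density)
    for (x, y, z), rho in density.items():
        if rho <= 0.0:
            continue
        d = [tip[0] - x, tip[1] - y, tip[2] - z]
        dist = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
        if dist >= radius:
            continue
        pen = (radius - dist) * rho           # penetration-weighted overlap
        if dist > 1e-9:
            # Elastic term pushes the tip out of dense material.
            for i in range(3):
                force[i] += k_elastic * pen * d[i] / dist
        # Erosion removes material in proportion to the overlap.
        eroded[(x, y, z)] = max(0.0, rho - k_erode * pen)
    return force, eroded
```

With a grid that is solid bone on one side, the net force points away from the bone and density is reduced near the tip, which is the qualitative behaviour the abstract describes.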

    The evaluation of a novel haptic machining VR-based process planning system using an original process planning usability method

    This thesis provides an original contribution to knowledge by creating a new process planning system: Haptic Aided Process Planning (HAPP). This system is based on the combination of haptics and virtual reality (VR). HAPP creates a simulated machining environment in which process plans are automatically generated from real-time logging of a user's interaction. Further, through the application of a novel usability test methodology, a deeper study of how this approach compares with conventional process planning was undertaken. An abductive research approach was selected, and an iterative, incremental development methodology chosen. Three development cycles were undertaken, with evaluation studies carried out at the end of each. Each study, the pre-pilot, pilot and industrial, identified progressive refinements to both the usability of HAPP and the usability evaluation method itself. HAPP provides process planners with an environment similar to the one with which they are already familiar. Visual images were used to represent tools and material, whilst a haptic interface enabled their movement and positioning by an operator in a manner comparable to their native setting. In this way an intuitive interface was developed that allows users to plan the machining of parts consisting of features that can be machined on a pillar drill, 2½D milling machine or centre lathe. The planning activities included single or multiple set-ups, fixturing and sequencing of cutting operations. The logged information was parsed and output to a process plan, including route sheets, operation sheets, tool lists and costing information, in a human-readable format.
The system evaluation revealed that, from an expert planner's perspective, HAPP is perceived to be 70% more satisfying to use, 66% more efficient in completing process plans (primarily due to the reduced cognitive load), more effective in producing a higher-quality output of information, and 20% more learnable than a traditional process planning approach.
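As a rough illustration of how a time-ordered interaction log might be collapsed into route-sheet steps, consecutive operations on the same machine can be grouped into setups. The event schema and field names below are hypothetical, not HAPP's actual log format.

```python
from itertools import groupby

def route_sheet(events):
    """Collapse a time-ordered interaction log into route-sheet steps.

    Hypothetical schema: each event is a dict with 'machine',
    'operation' and 'tool' keys, in the order the planner performed
    them. Consecutive operations on the same machine form one setup.
    """
    sheet = []
    for setup_no, (machine, ops) in enumerate(
            groupby(events, key=lambda e: e["machine"]), start=1):
        ops = list(ops)
        sheet.append({
            "setup": setup_no,
            "machine": machine,
            "operations": [o["operation"] for o in ops],
            "tools": sorted({o["tool"] for o in ops}),
        })
    return sheet
```

Grouping only *consecutive* events (rather than all events per machine) preserves the setup sequence the planner actually performed, which is what a route sheet records.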

    Towards a psychophysical evaluation of a surgical simulator for bone-burring

    The CRS4 experimental bone-burr simulator implements visual and haptic effects through the incorporation of a physics-based contact model and patient-specific data. Psychophysical tests demonstrate that, despite its simplified model and its inherent technological constraints, the simulator can articulate material differences, and that its users can learn to associate virtual bone with real bone material. Tests addressed both surface-probing and interior-drilling tasks. We also explore a haptic contrast sensitivity function based on the model's two main parameters: an elastic constant and an erosion factor. Both parameters manifest power-law-like sensitivity, with exponents of around two and three respectively. Further tests may reveal how well simulator users perceive fine differences in bone material, such as those encountered while drilling through real volume boundaries.
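A power-law exponent of this kind is commonly estimated by ordinary least squares on log-transformed data: if thresholds follow t = c·sⁿ, the slope of log t against log s recovers n. The sketch below is a generic estimator, not the paper's analysis code.

```python
import math

def power_law_exponent(stimulus, threshold):
    """Least-squares slope of log(threshold) vs log(stimulus).

    Generic estimator (illustrative, not the paper's code): for data
    following t = c * s**n, the log-log regression slope equals n.
    """
    xs = [math.log(s) for s in stimulus]
    ys = [math.log(t) for t in threshold]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

For synthetic data generated with exponent 2, the estimator returns 2 exactly, which is the kind of exponent the abstract reports for the elastic constant.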

    Investigating Precise Control in Spatial Interactions: Proxemics, Kinesthetics, and Analytics

    Augmented and Virtual Reality (AR/VR) technologies have reshaped the way in which we perceive the virtual world. In fact, recent technological advancements provide experiences that make the physical and virtual worlds almost indistinguishable. However, the physical world affords subtle sensorimotor cues that we subconsciously utilize to perform simple and complex tasks in our daily lives. The lack of this affordance in existing AR/VR systems makes their mainstream adoption over conventional 2D user interfaces difficult. As a case in point, existing spatial user interfaces (SUIs) lack the intuition to perform tasks in a manner that is perceptually familiar from the physical world. The broader goal of this dissertation lies in facilitating an intuitive spatial manipulation experience, specifically for motor control. We begin by investigating the role of proximity to an action in precise motor control for spatial tasks. We do so by introducing a new SUI called the Clock-Maker's Work-Space (CMWS), with the goal of enabling precise actions close to the body, akin to the physical world. Evaluating our setup against conventional mixed-reality interfaces, we find that CMWS affords precise actions for bi-manual spatial tasks. We further compare our SUI with a physical manipulation task and observe similarities in user behavior across both tasks. We subsequently narrow our focus to precise spatial rotation. We utilize haptics, specifically force feedback (kinesthetics), to augment fine motor control in spatial rotation tasks. By designing three kinesthetic rotation metaphors, we evaluate precise rotational control with and without haptic feedback for 3D shape manipulation. Our results show that haptics-based rotation algorithms allow for precise motor control in 3D space and also help reduce hand fatigue.
To understand precise control in its truest form, we investigate orthopedic surgery training through the analysis of bone-drilling tasks. We designed a hybrid physical-virtual simulator for bone-drilling training and collected physical data for analyzing precise drilling action. We also developed a Laplacian-based performance metric to help expert surgeons evaluate residents' training progress across successive years of orthopedic residency.
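The dissertation's Laplacian-based metric is not specified in the abstract, but the general idea of scoring a uniformly sampled drill trajectory by its discrete Laplacian can be sketched as follows; the sampling assumptions and the mean-squared aggregation are illustrative, not the published metric.

```python
def laplacian_roughness(path):
    """Mean squared discrete Laplacian of a sampled 3D tool path.

    Illustrative sketch (not the dissertation's metric): path is a
    list of (x, y, z) samples at a fixed rate. The discrete Laplacian
    p[i-1] - 2*p[i] + p[i+1] approximates local acceleration; its mean
    squared norm is small for smooth, controlled motion and large for
    jerky motion.
    """
    total = 0.0
    for i in range(1, len(path) - 1):
        for k in range(3):
            lap = path[i - 1][k] - 2 * path[i][k] + path[i + 1][k]
            total += lap * lap
    return total / max(1, len(path) - 2)
```

A constant-velocity straight path scores zero, while an oscillating path scores higher, so the measure orders trajectories by steadiness, which is one plausible way such a metric could separate residents by training year.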

    Investigation of a holistic human-computer interaction (HCI) framework to support the design of extended reality (XR) based training simulators

    In recent years, the use of Extended Reality (XR) based simulators for training has increased rapidly. In this context, there is a need to explore novel HCI-based approaches to designing more effective 3D training environments. A major impediment in this research area is the lack of a holistic HCI-based framework that serves as a foundation for integrating the design and assessment of HCI-based attributes such as affordance, cognitive load, and user-friendliness. This research addresses this need by investigating the creation of a holistic framework, along with a process for designing, building, and assessing training simulators using such a framework as a foundation. The core elements of the proposed framework include the adoption of participatory design principles, the creation of information-intensive process models of the target processes (relevant to the training activities), and design attributes related to affordance and cognitive load. A new attribute related to the affordance of 3D scenes is proposed (termed dynamic affordance), and its role in user comprehension of data-rich 3D training environments is studied. The framework is presented for the domain of orthopedic surgery. Rigorous user-involved assessment of the framework and simulation approach has highlighted the positive impact of the HCI-based framework and attributes on the acquisition of skills and knowledge by healthcare users.

    Virtual Reality Simulator for Training in Myringotomy with Tube Placement

    Myringotomy refers to a surgical incision in the eardrum, and it is often followed by ventilation tube placement to treat middle-ear infections. The procedure is difficult to learn; hence, the objectives of this work were to develop a virtual-reality training simulator, assess its face and content validity, and implement quantitative performance metrics and assess construct validity. A commercial digital gaming engine (Unity3D) was used to implement the simulator, with support for 3D visualization of digital ear models and for the major surgical tasks. A haptic arm co-located with the stereo scene was used to manipulate virtual surgical tools and to provide force feedback. A questionnaire was developed with 14 face validity questions focusing on realism and 6 content validity questions focusing on training potential. Twelve participants from the Department of Otolaryngology were recruited for the study. Responses to 12 of the 14 face validity questions were positive. One concern was with contact modeling related to tube insertion into the eardrum, and the second was with movement of the blade and forceps. The former could be resolved by using a higher-resolution digital model of the eardrum to improve contact localization; the latter by using a higher-fidelity haptic device. With regard to content validity, 64% of the responses were positive, 21% were neutral, and 15% were negative. In the final phase of this work, automated performance metrics were programmed and a construct validity study was conducted with 11 participants: 4 senior Otolaryngology consultants and 7 junior Otolaryngology residents. Each participant performed 10 procedures on the simulator while metrics were automatically collected. Senior Otolaryngologists took significantly less time to completion than junior residents, and junior residents made 2.8 times as many errors as experienced surgeons.
The senior surgeons also had significantly longer incision lengths, more accurate incision angles, and lower magnification, keeping both the umbo and annulus in view. All metrics were able to discriminate senior Otolaryngologists from junior residents with a significance of p < 0.002. The simulator has sufficient realism, training potential, and performance-discrimination ability to warrant a more resource-intensive skills transference study.
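As an illustration of how automated incision metrics of this kind might be computed from logged tool positions, the sketch below derives incision length and angular error from two endpoints on the eardrum image plane. The metric definitions, parameter names, and angle convention are assumptions for illustration, not the simulator's published formulas.

```python
import math

def incision_metrics(p0, p1, target_angle_deg):
    """Length and angular error of a straight incision in the plane.

    Hypothetical sketch (not the simulator's metric code):
    p0, p1           : incision endpoints (x, y) on the image plane
    target_angle_deg : ideal incision angle in degrees
    Returns (length, angular_error_deg) with the error folded into
    [0, 90] because an incision direction has no sign.
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx)) % 180.0
    target = target_angle_deg % 180.0
    err = abs(angle - target)
    err = min(err, 180.0 - err)   # fold: direction is sign-free
    return length, err
```

Metrics like these, collected automatically per procedure, are what allow the study to compare consultants and residents quantitatively.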

    Research on real-time physics-based deformation for haptic-enabled medical simulation

    This study developed an effective visuo-haptic surgical engine that handles a variety of surgical manipulations in real time. Soft-tissue models are based on biomechanical experiments and continuum mechanics for greater accuracy. Such models will increase the realism of future training systems and of VR/AR/MR implementations for the operating room.
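Full continuum-mechanics models are expensive, so real-time engines often fall back on simpler schemes between accurate solves. A damped mass-spring step, shown below as a generic stand-in rather than this study's solver, illustrates the basic update loop such engines run at haptic rates; all constants are illustrative.

```python
import math

def mass_spring_step(pos, vel, springs, dt=1e-3, k=50.0,
                     damping=0.5, mass=1.0):
    """One explicit-Euler step of a damped mass-spring mesh.

    Generic real-time deformation sketch (not this study's model):
    pos, vel : lists of [x, y, z] per node, updated in place
    springs  : list of (i, j, rest_length) connections
    """
    forces = [[0.0, 0.0, 0.0] for _ in pos]
    for i, j, rest in springs:
        d = [pos[j][a] - pos[i][a] for a in range(3)]
        length = math.sqrt(sum(c * c for c in d)) or 1e-12
        f = k * (length - rest)               # Hooke's law along the spring
        for a in range(3):
            forces[i][a] += f * d[a] / length
            forces[j][a] -= f * d[a] / length
    for n in range(len(pos)):
        for a in range(3):
            vel[n][a] += dt * (forces[n][a] / mass - damping * vel[n][a])
            pos[n][a] += dt * vel[n][a]
    return pos, vel
```

With a small time step this loop runs comfortably within a 1 kHz haptic update budget for meshes of a few thousand springs, which is why such approximations remain common despite their lower fidelity than continuum models.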

    Sonification as a reliable alternative to conventional visual surgical navigation

    Despite the undeniable advantages of image-guided surgical assistance systems in terms of accuracy, such systems have not yet fully met surgeons' needs or expectations regarding usability, time efficiency, and integration into the surgical workflow. On the other hand, perceptual studies have shown that presenting independent but causally correlated information via multimodal feedback involving different sensory modalities can improve task performance. This article investigates an alternative method for computer-assisted surgical navigation, introduces a novel four-degree-of-freedom (DOF) sonification methodology for navigated pedicle screw placement, and discusses advanced solutions based on multisensory feedback. The proposed method sonifies alignment tasks in four DOF using frequency-modulation synthesis. We compared the accuracy and execution time of the proposed sonification method with those of visual navigation, which is currently considered the state of the art. We conducted a phantom study in which 17 surgeons executed the pedicle screw placement task in the lumbar spine, guided by either the proposed sonification-based method or the traditional visual navigation method. The results demonstrated that the proposed method is as accurate as the state of the art while decreasing the surgeon's need to focus on visual navigation displays, preserving a natural focus on surgical tools and the targeted anatomy during task execution.
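A four-DOF frequency-modulation mapping can be sketched generically: each alignment error drives one synthesis parameter, so a surgeon hears which degree of freedom is off. The specific parameter assignments and constants below are hypothetical illustrations, not the study's published mapping.

```python
import math

def fm_sample(t, carrier_hz, mod_hz, mod_index, amp):
    """One sample of a frequency-modulated tone at time t (seconds)."""
    return amp * math.sin(2 * math.pi * carrier_hz * t
                          + mod_index * math.sin(2 * math.pi * mod_hz * t))

def sonify_alignment(errors, base_hz=440.0):
    """Map four alignment errors onto FM-synthesis parameters.

    Hypothetical mapping (not the article's design): each error is
    normalised to [0, 1], where 0 means perfectly aligned.
    Returns (carrier_hz, mod_hz, mod_index, amp) for fm_sample.
    """
    e1, e2, e3, e4 = errors
    carrier = base_hz * (1.0 + e1)    # pitch rises with the first error
    mod = 2.0 + 30.0 * e2             # modulation speeds up with the second
    index = 8.0 * e3                  # timbre roughens with the third
    amp = 0.2 + 0.8 * e4              # loudness grows with the fourth
    return carrier, mod, index, amp
```

When all four errors reach zero the output collapses to a quiet, pure base tone, giving a natural audible target for the alignment task.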

    A comprehensive evaluation of work and simulation based assessment in otolaryngology training

    Introduction: The otolaryngology curriculum requires trainees to show evidence of operative competence before completion of training. The General Medical Council recommended that structured assessment be used throughout training to monitor and guide trainee progression. Despite the reduction in operative exposure and the variation in trainee performance, a ‘one size fits all’ approach continues to be applied, and the number of procedures performed remains the main indicator of competence. Objectives: To analyse the utilisation, reliability and validity of workplace-based assessments in otolaryngology training; to identify, develop and validate a series of simulation platforms suitable for incorporation into the otolaryngology curriculum; and to develop a model of interchangeable workplace- and simulation-based assessment that reflects a trainee's trajectory, audits the delivery of training and sets milestones for modular learning. Methods: A detailed review of the literature identified a list of procedure-specific assessment tools as well as simulators suitable for use as assessment platforms. A simulation-integrated training programme was piloted, and models were tested for feasibility and for face, content and construct validity before being incorporated into the North London training programme. The outcomes of workplace- and simulation-based assessments of all core and specialty otolaryngology trainees were collated and analysed. Results: The outcomes of 6535 workplace-based assessments were analysed, and the strengths and weaknesses of four different assessment tools are highlighted. Validated platforms utilising cadavers, animal tissue, synthetic material and virtual reality simulators were incorporated into the curriculum. Sixty trainees and 40 consultants participated in the process and found it of great educational value. Conclusion: Assessment with structured feedback is integral to surgical training.
Assessment using validated simulation modules can complement that undertaken in the workplace. The outcomes of structured assessments can be used to monitor and guide trainee trajectory at individual and regional levels, and the derived learning curves can shape and audit future otolaryngological training.