64 research outputs found

    Virtual Hand Representations to Support Natural Interaction in Immersive Environment

    Get PDF
    Immersive Computing Technology (ICT) offers designers the unique ability to evaluate human interaction with product design concepts through the use of stereo viewing and 3D position tracking. These technologies provide designers with opportunities to create virtual simulations for numerous applications. To support the immersive experience of a virtual simulation, it is necessary to employ interaction techniques that are appropriately mapped to specific tasks. Numerous methods for interacting in virtual applications have been developed using wands, game controllers, and haptic devices. However, if the intent of the simulation is to gather information on how a person would interact in an environment, more natural interaction paradigms are needed. The use of 3D hand models coupled with position-tracked gloves provides for intuitive interactions in virtual environments. This paper presents several methods of representing a virtual hand model in the virtual environment to support natural interaction.
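    A minimal sketch, not taken from the paper, of how tracked glove data might drive a virtual hand model: per-joint flexion angles reported by a hypothetical glove are chained into joint positions for rendering. The three-segment finger model, segment lengths, and bend axis are all assumptions.

        # Illustrative only: planar forward kinematics for one finger of a virtual hand,
        # driven by flexion angles that a position-tracked glove might report.
        import numpy as np

        FINGER_SEGMENTS = [0.045, 0.025, 0.020]  # proximal/middle/distal lengths in metres (assumed)

        def finger_joint_positions(knuckle_pos, flexion_angles, segment_lengths=FINGER_SEGMENTS):
            """Chain the finger segments, bending each joint by its cumulative flexion
            angle (radians) in the hand's x-z plane; returns joint positions to render."""
            positions = [np.asarray(knuckle_pos, dtype=float)]
            angle = 0.0
            for length, flex in zip(segment_lengths, flexion_angles):
                angle += flex
                step = length * np.array([np.cos(angle), 0.0, -np.sin(angle)])
                positions.append(positions[-1] + step)
            return positions

        # Example: a finger curled by 0.2, 0.4 and 0.3 rad at its three joints.
        joints = finger_joint_positions([0.0, 0.0, 0.0], [0.2, 0.4, 0.3])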

    Impact of Ear Occlusion on In-Ear Sounds Generated by Intra-oral Behaviors

    Get PDF
    We conducted a case study with one volunteer and a recording setup to detect sounds induced by the actions: jaw clenching, tooth grinding, reading, eating, and drinking. The setup consisted of two in-ear microphones, where the left ear was semi-occluded with a commercially available earpiece and the right ear was occluded with a mouldable silicone earpiece. Investigations in the time and frequency domains demonstrated that for behaviors such as eating, tooth grinding, and reading, sounds could be recorded with both sensors. For jaw clenching, however, occluding the ear with a mouldable earpiece was necessary to enable its detection. This can be attributed to the fact that the mouldable earpiece sealed the ear canal and isolated it from the environment, resulting in a detectable change in pressure. In conclusion, our work suggests that detecting behaviors such as eating, grinding, and reading with a semi-occluded ear is possible, whereas behaviors such as clenching require complete occlusion of the ear to be easily detectable. Nevertheless, the latter approach may limit real-world applicability because it hinders hearing.
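    A minimal sketch, not the study's analysis code, of the kind of frequency-domain comparison described above: band-limited energy is computed for each ear-canal recording and compared to judge how detectable an intra-oral event is in the occluded versus the semi-occluded channel. The sampling rate, window sizes, and the 20-250 Hz band of interest are assumptions.

        # Hedged sketch: compare low-frequency energy between the two in-ear channels.
        import numpy as np
        from scipy.signal import spectrogram

        def band_energy(x, fs, f_lo=20.0, f_hi=250.0):
            """Time-resolved energy of x in a low-frequency band where jaw and
            occlusion-related sounds are expected to concentrate (limits illustrative)."""
            f, t, Sxx = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
            band = (f >= f_lo) & (f <= f_hi)
            return t, Sxx[band].sum(axis=0)

        def detectability_ratio(x_occluded, x_semi, fs):
            """Mean band energy in the occluded channel divided by that of the
            semi-occluded channel; values well above 1 suggest the event is only
            clearly visible with full occlusion."""
            _, e_occ = band_energy(x_occluded, fs)
            _, e_semi = band_energy(x_semi, fs)
            return e_occ.mean() / (e_semi.mean() + 1e-12)

        # Usage (hypothetical two-channel recording at 16 kHz):
        # ratio = detectability_ratio(right_occluded, left_semi_occluded, fs=16000)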

    Patient Specific Systems for Computer Assisted Robotic Surgery Simulation, Planning, and Navigation

    Get PDF
    The evolution of surgery, from modern surgery through the birth of medical imaging and the introduction of minimally invasive techniques, has in recent years seen the advent of surgical robotics. These systems make it possible to overcome the difficulties of endoscopic surgery, allowing improved surgical performance and a better quality of intervention. Information technology has contributed to this evolution since the beginning of the digital revolution, providing innovative medical imaging devices and computer assisted surgical systems. Subsequent progress in computer graphics brought innovative visualization modalities for medical datasets, and later the birth of virtual reality paved the way for virtual surgery. Although many surgical simulators already exist, there are no patient specific solutions. This thesis presents the development of patient specific software systems for preoperative planning, simulation and intraoperative assistance, designed for robotic surgery, in particular for bimanual robots that are becoming the future of single port interventions. The first software application is a virtual reality simulator for this kind of surgical robot. The system has been designed to validate the initial port placement and the operative workspace for the potential application of this surgical device. Given a bimanual robot with its own geometry and kinematics, and a patient specific 3D virtual anatomy, the surgical simulator allows the surgeon to choose the optimal positioning of the robot and of the access port in the abdominal wall. Additionally, it makes it possible to evaluate in a virtual environment whether dexterous movability of the robot is achievable, avoiding unwanted collisions with the surrounding anatomy in order to prevent potential damage in the real surgical procedure. Although the software has been designed for a specific bimanual surgical robot, it supports any open kinematic chain structure, as long as it can be described in our custom format. The robot's capability to accomplish specific tasks can be virtually tested using the deformable models, interacting directly with the target virtual organs while avoiding unwanted collisions with the surrounding anatomy not involved in the intervention. Moreover, the surgical simulator has been enhanced with algorithms and data structures to integrate biomechanical parameters into virtual deformable models (based on a mass-spring-damper network) of target solid organs, in order to properly reproduce the physical behaviour of the patient's anatomy during the interactions. The main biomechanical parameters (Young's modulus and density) have been integrated, allowing the automatic tuning of some model network elements, such as the node mass and the spring stiffness. The spring damping coefficient has been modeled using the Rayleigh approach. Furthermore, the developed method automatically detects the external layer, allowing the use of both surface and internal Young's moduli in order to model the main parts of dense organs: the stroma and the parenchyma. Finally, the model can be manually tuned to represent lesions with specific biomechanical properties. Additionally, some software modules of the simulator have been extended and integrated into a patient specific computer guidance system for intraoperative navigation and assistance in robotic single port interventions.
    This application provides guidance functionalities working in three different modalities: passive, as a surgical navigator; assistive, as a guide for single port placement; and active, as a tutor preventing unwanted collisions during the intervention. The simulation system has been tested by five surgeons, simulating the robot access port placement and evaluating the robot's movability and workspace inside the patient's abdomen. The tested functionalities, rated by expert surgeons, have shown good quality and performance of the simulation. Moreover, the integration of biomechanical parameters into deformable models has been tested with various material samples. The results have shown good visual realism while ensuring the performance required by an interactive simulation. Finally, the intraoperative navigator has been tested by performing a cholecystectomy on a synthetic patient mannequin, in order to evaluate the intraoperative navigation accuracy, the network communication latency and the overall usability of the system. The tests performed demonstrated the effectiveness and usability of the software systems developed, encouraging the introduction of the proposed solutions into clinical practice and the implementation of further improvements. Surgical robotics will be enhanced by an advanced integration of medical images into software systems, allowing detailed planning of surgical interventions by means of virtual surgery simulation based on patient specific biomechanical parameters. Furthermore, the advanced functionalities offered by these systems enable surgical robots to improve intraoperative surgical assistance, benefitting from knowledge of the virtual patient anatomy.
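    A minimal sketch, using simple textbook mappings rather than the thesis's actual formulas, of how biomechanical parameters can be turned into mass-spring-damper network parameters as described above: node masses derived from tissue density, spring stiffness from Young's modulus, and Rayleigh-style damping. All numeric values are placeholders.

        # Hedged sketch: biomechanical inputs -> mass-spring-damper parameters.
        import numpy as np

        def tune_node_masses(density, node_volumes):
            """Node mass from tissue density (kg/m^3) and the volume assigned to each node."""
            return density * np.asarray(node_volumes)

        def tune_spring_stiffness(young_modulus, cross_section_area, rest_length):
            """Axial stiffness of a spring treated as a small rod: k = E * A / L0."""
            return young_modulus * cross_section_area / rest_length

        def rayleigh_damping(mass, stiffness, alpha=0.1, beta=0.01):
            """Rayleigh-style damping coefficient: c = alpha*m + beta*k (alpha, beta illustrative)."""
            return alpha * mass + beta * stiffness

        # Example: one spring of a parenchyma model (all values are placeholders).
        m = tune_node_masses(density=1060.0, node_volumes=[1e-6])[0]   # ~1 cm^3 of tissue
        k = tune_spring_stiffness(young_modulus=5e3, cross_section_area=1e-4, rest_length=0.01)
        c = rayleigh_damping(m, k)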

    Development of an augmented reality guided computer assisted orthopaedic surgery system

    Get PDF
    Previously held under moratorium from 1st December 2016 until 1st December 2021. This body of work documents the development of a proof-of-concept augmented reality guided computer assisted orthopaedic surgery system – ARgCAOS. After initial investigation, a visible-spectrum, single-camera, tool-mounted tracking system based upon planar fiducial markers was implemented. The use of visible-spectrum cameras, as opposed to the infra-red cameras typically used by surgical tracking systems, allowed the captured image to be streamed to a display in an intelligible fashion. The tracking information defined the location of physical objects relative to the camera, allowing virtual models to be overlaid onto the camera image. This produced a convincing augmented experience, whereby the virtual objects appeared to be within the physical world, moving with both the camera and the markers as expected of physical objects. Analysis of the first generation system identified both accuracy and graphical inadequacies, prompting the development of a second generation system. This too was based upon a tool-mounted fiducial marker system, and improved performance to near-millimetre probing accuracy. A resection system was incorporated and, utilising the tracking information, controlled resection was performed, producing sub-millimetre accuracies. Several complications resulted from the tool-mounted approach, and therefore a third generation system was developed. This final generation deployed a stereoscopic visible-spectrum camera system affixed to a head-mounted display worn by the user. The system allowed the augmentation of the natural view of the user, providing convincing and immersive three-dimensional augmented guidance, with probing and resection accuracies of 0.55±0.04 and 0.34±0.04 mm, respectively.
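    A minimal sketch, not the ARgCAOS implementation, of visible-spectrum tracking of a planar fiducial marker of the kind described above, using OpenCV's ArUco module (4.7+ API). The camera intrinsics, marker dictionary, and 40 mm marker size are placeholder assumptions; the recovered pose is what a rendering layer would use to overlay virtual models on the camera image.

        # Hedged sketch: pose of a planar fiducial marker relative to a visible-spectrum camera.
        import cv2
        import numpy as np

        MARKER_SIZE = 0.04  # metres, illustrative
        camera_matrix = np.array([[800.0, 0.0, 320.0],
                                  [0.0, 800.0, 240.0],
                                  [0.0, 0.0, 1.0]])   # placeholder intrinsics
        dist_coeffs = np.zeros(5)

        # 3D corners of the square marker in its own frame (z = 0 plane).
        half = MARKER_SIZE / 2
        object_points = np.array([[-half,  half, 0], [ half,  half, 0],
                                  [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)

        detector = cv2.aruco.ArucoDetector(
            cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
            cv2.aruco.DetectorParameters())

        def marker_pose(frame):
            """Return (rvec, tvec) of the first detected marker relative to the camera,
            or None if no marker is found."""
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            corners, ids, _ = detector.detectMarkers(gray)
            if ids is None or len(corners) == 0:
                return None
            ok, rvec, tvec = cv2.solvePnP(object_points, corners[0].reshape(4, 2),
                                          camera_matrix, dist_coeffs)
            return (rvec, tvec) if ok else None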

    Modeling and Simulation in Engineering

    Get PDF
    This book provides an open platform to establish and share knowledge developed by scholars, scientists, and engineers from all over the world about various applications of modeling and simulation in the design process of products across engineering fields. The book consists of 12 chapters arranged in two sections (3D Modeling and Virtual Prototyping), reflecting the multidimensionality of applications related to modeling and simulation. Some of the most recent modeling and simulation techniques, as well as some of the most accurate and sophisticated software for treating complex systems, are applied. All the original contributions in this book are united by the basic principle of a successful modeling and simulation process: as complex as necessary, and as simple as possible. The idea is to choose the simplifying assumptions in a way that reduces the complexity of the model (in order to enable real-time simulation) without compromising the precision of the results.

    A Virtual University Infrastructure For Orthopaedic Surgical Training With Integrated Simulation

    No full text
    This thesis pivots around the fulcrum of surgical, educational and technological factors. Whilst no single conclusion is drawn, it is a multidisciplinary thesis exploring the juxtaposition of different academic domains that have a significant influence upon each other. The relationship centres on the engineering and computer science factors in learning technologies for surgery. Following a brief introduction to previous efforts in developing surgical simulation, this thesis considers education and learning in orthopaedics and the design and building of a simulator for shoulder surgery. The thesis then considers the assessment of such tools and their embedding into a virtual learning environment. It explains how the experiments performed clarified these issues and their actual significance. This leads to a discussion of the work, and conclusions are drawn regarding the progress of integrating distributed simulation within the healthcare environment, suggesting how future work can proceed.

    Evaluation techniques used to evaluate extended reality (XR) head mounted displays (HMDs) used in healthcare: A literature review

    Get PDF
    Extended Reality (XR) Head Mounted Displays (HMDs) are used across various healthcare pathways for staff/student education and training, and for improving patient experiences. As XR HMDs become more affordable and accessible and their acceptance increases, it is critical to document the techniques used for evaluating the technology, the processes of user engagement and immersion, and the outcomes. At present there is limited research on the techniques used to evaluate XR HMDs. This manuscript presents findings from 104 clinical studies that use XR HMDs. The aim of this review is to give the reader an insight into the current healthcare XR HMD landscape by presenting the different HMDs used, the variety of XR interventions and their applications across medical pathways, and the intended research outcomes of the XR applications. The manuscript further guides the reader toward a detailed documentation of the evaluation techniques used to investigate antecedents and consequences of using XR, and delivers a critical discussion and suggestions for improving XR evaluation practices. This paper will be of use to clinicians, academics, funding bodies and hospital decision makers who would like suggestions for evaluating the efficacy and effectiveness of XR HMDs. The authors hope to encourage discussion of the importance of improving XR evaluation practices.

    ISMCR 1994: Topical Workshop on Virtual Reality. Proceedings of the Fourth International Symposium on Measurement and Control in Robotics

    Get PDF
    This symposium on measurement and control in robotics included sessions on: (1) rendering, including tactile perception and applied virtual reality; (2) applications in simulated medical procedures and telerobotics; (3) tracking sensors in a virtual environment; (4) displays for virtual reality applications; (5) sensory feedback, including a virtual environment application with partial gravity simulation; and (6) applications in education, entertainment, technical writing, and animation.

    Restoring anatomy with TKA: from bone to soft tissue

    Get PDF