
    Design and evaluation of a virtual-reality laparoscopic surgery simulator actuated by magnetorheological clutches

    Laparoscopy is a surgical technique that offers a less invasive alternative to traditional abdominal surgery, allowing patients to recover faster and with less pain. From its introduction, this technique revolutionized surgery, but the revolution came at a cost: long and difficult training. Haptic simulators have attempted to ease this learning, but their high cost and large footprint keep them out of reach of most students. To address this problem, concepts using commercially available haptic devices have been proposed for building laparoscopic simulation platforms. These platforms, however, lack realism and do not simultaneously achieve the dynamic and kinetic performance required for adequate training: the electric motors they use force the designers of haptic devices to trade off output force against the system's dynamic response. This approach could instead be used with a new-generation haptic device, the T-Rex, recently developed by Exonetik, a company spun out of research at the Université de Sherbrooke. Unlike the haptic devices on the market, the T-Rex uses the magnetorheological actuator technology developed by Exonetik, which could achieve the dynamic and kinetic performance required for surgeon training. This research project presents a preliminary analysis of Exonetik's T-Rex as a virtual-reality laparoscopic surgery simulator. A virtual-reality laparoscopic surgery simulator using Exonetik's T-Rex as its haptic interface was designed.
Performance criteria were established from the literature for a quantitative evaluation of the system, and finite-element simulations were developed for a qualitative evaluation of the system with residents and surgeons. The quantitative evaluation shows that the system meets all four kinematic criteria and three of the four kinetic criteria. The results therefore demonstrate that magnetorheological actuators hold great potential for virtual-reality laparoscopic surgery simulators; however, friction in the system must be addressed in future iterations of the system.

    HAPTIC AND VISUAL SIMULATION OF BONE DISSECTION

    Marco Agus. In bone dissection virtual simulation, force restitution is the key to realistically mimicking a patient-specific operating environment. The force is rendered using haptic devices controlled by parametrized mathematical models that represent the bone-burr contact. This dissertation presents and discusses a haptic simulation of a bone-cutting burr that is being developed as a component of a training system for temporal bone surgery. A physically based model was used to describe the burr-bone interaction, including haptic force evaluation, the bone erosion process, and the resulting debris. The model was experimentally validated and calibrated using a custom experimental setup consisting of a force-controlled robot arm holding a high-speed rotating tool and a contact-force measuring apparatus. Psychophysical testing was also carried out to assess individual reactions to the haptic environment. The results suggest that the simulator is capable of rendering the basic material differences required for bone-burring tasks. The current implementation, operating directly on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real-time haptic and visual feedback on a low-end multi-processing PC platform.
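The voxel-based burr-bone interaction described above can be sketched as a single haptic-loop step: compute the bone material overlapping the spherical burr, push the burr away from the material centroid, and erode the touched voxels. This is a minimal illustrative sketch only; the constants and the erosion law are hypothetical stand-ins for the thesis's experimentally calibrated model.

```python
import numpy as np

VOXEL = 0.5          # voxel edge length, mm (assumed)
K_FORCE = 2.0        # contact stiffness per unit overlap volume (assumed)
ERODE_RATE = 0.05    # density removed per step per unit contact (assumed)

def burr_step(density, burr_pos, burr_r):
    """One haptic-loop step: return reaction force and erode voxels.

    density: 3D array in [0, 1], bone occupancy per voxel (modified in place).
    burr_pos: burr centre in voxel coordinates.
    burr_r: burr radius in voxels.
    """
    # Bounding box of voxels the spherical burr can touch.
    lo = np.maximum(np.floor(burr_pos - burr_r).astype(int), 0)
    hi = np.minimum(np.ceil(burr_pos + burr_r).astype(int) + 1, density.shape)
    idx = np.mgrid[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    centers = idx.transpose(1, 2, 3, 0) + 0.5            # voxel centres
    d = np.linalg.norm(centers - burr_pos, axis=-1)
    inside = d < burr_r
    sub = density[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]  # view, not copy
    overlap = sub * inside           # occupied material inside the burr
    mass = overlap.sum()
    if mass == 0:
        return np.zeros(3), density
    # Reaction force pushes the burr away from the material centroid.
    centroid = (overlap[..., None] * centers).sum(axis=(0, 1, 2)) / mass
    direction = burr_pos - centroid
    direction /= np.linalg.norm(direction) + 1e-12
    force = K_FORCE * mass * VOXEL**3 * direction
    # Erosion: remove density in proportion to local contact.
    sub -= ERODE_RATE * overlap
    np.clip(sub, 0.0, 1.0, out=sub)
    return force, density
```

Pressing the burr onto the top face of a solid block yields an upward reaction force while the surface voxels lose density, which is the qualitative behaviour the calibrated model reproduces quantitatively.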

    Collision Detection and Merging of Deformable B-Spline Surfaces in Virtual Reality Environment

    This thesis presents a computational framework for representing, manipulating and merging rigid and deformable freeform objects in a virtual reality (VR) environment. The core algorithms for collision detection, merging, and physics-based modeling used within this framework assume that all 3D deformable objects are B-spline surfaces. The interactive design tool can be represented as a B-spline surface, an implicit surface or a point, giving the user a variety of rigid or deformable tools. The collision detection system exploits the fact that the blending matrices used to discretize a B-spline surface are independent of the positions of the control points and can therefore be pre-calculated. Complex B-spline surfaces can be generated by merging various B-spline surface patches using the patch-merging algorithm presented in this thesis. Finally, the physics-based modeling system uses a mass-spring representation to determine the deformation and the reaction-force values provided to the user. This helps to simulate realistic material behaviour of the model and assists the user in validating the design before performing extensive product detailing or finite element analysis in commercially available CAD software. The novelty of the proposed method stems from the pre-calculated blending matrices, which are used to generate the points for graphical rendering, collision detection and the merging of B-spline patches, as well as the nodes for the mass-spring system. This approach reduces computational time by avoiding the need to solve complex equations for the B-spline blending functions and to invert large matrices. This alternative approach to mechanical concept design also helps to do away with the need to build prototypes for conceptualization and preliminary validation of an idea, thereby reducing the time, cost and resource wastage of the concept design phase.
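The pre-calculation idea above rests on the fact that B-spline basis weights depend only on the parameter samples and knots, not on the control points, so the blending matrix can be built once and reused every frame as a single matrix product. A minimal curve-version sketch (the surface case is the analogous tensor product), using the standard uniform cubic B-spline segment matrix; the function name and sampling scheme are illustrative, not the thesis's implementation:

```python
import numpy as np

# Standard uniform cubic B-spline per-segment basis in matrix form.
M = np.array([[ 1,  4,  1, 0],
              [-3,  0,  3, 0],
              [ 3, -6,  3, 0],
              [-1,  3, -3, 1]]) / 6.0

def blending_matrix(n_ctrl, samples_per_seg):
    """Precompute the (n_samples x n_ctrl) blending matrix once."""
    n_seg = n_ctrl - 3
    B = np.zeros((n_seg * samples_per_seg, n_ctrl))
    t = np.linspace(0, 1, samples_per_seg, endpoint=False)
    T = np.stack([np.ones_like(t), t, t**2, t**3], axis=1)  # power basis
    seg_weights = T @ M            # basis weights for one segment's samples
    for s in range(n_seg):
        rows = slice(s * samples_per_seg, (s + 1) * samples_per_seg)
        B[rows, s:s + 4] = seg_weights   # each segment spans 4 control points
    return B

# Precompute once...
B = blending_matrix(n_ctrl=8, samples_per_seg=16)
# ...then, every frame, the deformed sample points used for rendering,
# collision detection and mass-spring nodes are a single product:
P = np.random.rand(8, 3)   # control points, moving under deformation
points = B @ P
```

Because `B` never changes while the surface deforms, the per-frame cost is one dense matrix product rather than repeated basis-function evaluation, which is what makes the interactive rates reported above feasible.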

    Investigation of the use of meshfree methods for haptic thermal management of design and simulation of MEMS

    This thesis presents a novel approach that combines haptic sensing technology with a virtual environment (VE) for the thermal management of Micro-Electro-Mechanical Systems (MEMS) design. The goal is to shorten the development cycle by avoiding the costly iterative prototyping procedure. To this end, we use haptic feedback with virtual prototyping in an immersive environment. We also aim to improve the productivity and capability of the designer to better grasp the phenomena operating at the micro scale, and to augment computational steering through haptic channels. To validate the concept of haptic thermal management, we have implemented a demonstrator with a user-friendly interface that allows the user to intuitively "feel" the temperature field through our concept of haptic texturing. The temperature field in a simple MEMS component is modeled using the finite element method (FEM) or the finite difference method (FDM), and the user is able to feel thermal expansion through a combination of different haptic feedback cues. In haptic applications, the force-rendering loop needs to be updated at a frequency of 1 kHz to maintain continuity in the user's perception. When using FEM or FDM for our three-dimensional model, the computational cost increases rapidly as the mesh size is reduced to ensure accuracy. This constrains the complexity of the physical model used to approximate the temperature or stress field, and it would also be difficult to generate or refine the mesh in real time during the CAD process. To circumvent these limitations of conventional mesh-based techniques and avoid the burdensome task of generating and refining meshes, we investigate the potential of meshfree methods in the context of our haptic application. We review and compare the different meshfree formulations against the mesh-based FEM technique, and we have implemented the different methods for benchmark thermal-conduction and elasticity problems.
The main contribution of this thesis is to determine the relevance of the meshfree option, in terms of design flexibility and computational load, for a haptic physical model.
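To make the mesh-based baseline concrete, the kind of FDM thermal model the meshfree methods are benchmarked against can be sketched as a 1D explicit finite-difference heat-conduction solver. All parameters here are illustrative, not taken from the thesis; the point is that accuracy demands a fine grid and a small stable time step, which is exactly the cost that strains a 1 kHz force-rendering budget.

```python
import numpy as np

# 1D rod, hot left end: explicit FDM for dT/dt = alpha * d2T/dx2.
alpha = 1e-4   # thermal diffusivity, m^2/s (assumed)
dx = 1e-3      # grid spacing, m (assumed)
dt = 4e-3      # time step, s (assumed)
r = alpha * dt / dx**2
assert r <= 0.5  # explicit-scheme stability limit: halving dx quarters dt

T = np.zeros(101)
T[0] = 100.0                       # fixed hot boundary, degrees C
for _ in range(500):               # march 2 s of simulated time
    # Second-difference update on interior nodes (vectorized stencil).
    T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0], T[-1] = 100.0, 0.0       # re-impose Dirichlet boundaries
```

The stability constraint `r <= 0.5` is what makes fine 3D meshes expensive: refining the grid forces a quadratically smaller time step, so the per-update cost grows much faster than the node count, motivating the meshfree alternatives investigated above.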

    Advancing proxy-based haptic feedback in virtual reality

    This thesis advances haptic feedback for Virtual Reality (VR). Our work is guided by Sutherland's 1965 vision of the ultimate display, which calls for VR systems to control the existence of matter. To push towards this vision, we build upon proxy-based haptic feedback, a technique characterized by the use of passive tangible props. The goal of this thesis is to tackle the central drawback of this approach, namely its inflexibility, which hinders it from fulfilling the vision of the ultimate display. Guided by four research questions, we first showcase the applicability of proxy-based VR haptics by employing the technique for data exploration. We then extend the VR system's control over users' haptic impressions in three steps. First, we contribute the class of Dynamic Passive Haptic Feedback (DPHF) alongside two novel concepts for conveying kinesthetic properties, such as virtual weight and shape, through weight-shifting and drag-changing proxies. Conceptually orthogonal to this, we study how visual-haptic illusions can be leveraged to unnoticeably redirect the user's hand when reaching towards props. Here, we contribute a novel perception-inspired algorithm for Body Warping-based Hand Redirection (HR), an open-source framework for HR, and psychophysical insights. The thesis concludes by showing that the combination of DPHF and HR can outperform the individual techniques in terms of the achievable flexibility of proxy-based haptic feedback.
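The body-warping idea behind hand redirection can be sketched in a few lines: the rendered virtual hand is offset from the real hand by a warp that ramps up with reach progress, so the user's real hand lands on the physical prop while the virtual hand reaches the (displaced) virtual target. This is the classic linear ramp only, as an assumed baseline; the thesis contributes a perception-inspired gain model beyond it.

```python
import numpy as np

def warp(real_hand, start, real_target, virtual_target):
    """Body warping: map the real hand to a rendered virtual hand position.

    The offset between the virtual and real targets is applied gradually,
    scaled by the hand's progress along the reach toward the real target.
    """
    total = np.linalg.norm(real_target - start)
    # Progress in [0, 1]: projection of the reach onto the start-target line.
    progress = np.clip(np.dot(real_hand - start, real_target - start)
                       / (total**2 + 1e-12), 0.0, 1.0)
    offset = virtual_target - real_target
    return real_hand + progress * offset   # rendered (virtual) hand position

# At the start of the reach, no warp is applied; at the real target (the
# physical prop), the virtual hand coincides with the virtual target.
```

Keeping the warp zero at the start and growing it smoothly is what keeps the redirection below detection thresholds, which is exactly what the psychophysical studies mentioned above quantify.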

    Command and Control Systems for Search and Rescue Robots

    The novel application of unmanned systems in the domain of humanitarian Search and Rescue (SAR) operations has created a need to develop specific multi-robot Command and Control (RC2) systems. This societal application of robotics requires human-robot interfaces for controlling a large fleet of heterogeneous robots deployed in multiple domains of operation (ground, aerial and marine). This chapter provides an overview of the Command, Control and Intelligence (C2I) system developed within the scope of Integrated Components for Assisted Rescue and Unmanned Search operations (ICARUS). The life cycle of the system begins with a description of the use cases and deployment scenarios, defined in collaboration with SAR teams as end-users. This is followed by an illustration of the system design and architecture, the core technologies used in implementing the C2I, and the iterative integration phases with field deployments for evaluating and improving the system. The main subcomponents are a central Mission Planning and Coordination System (MPCS), field Robot Command and Control (RC2) subsystems with a portable force-feedback exoskeleton interface for robot-arm tele-manipulation, and field mobile devices. The distribution of these C2I subsystems and their communication links for unmanned SAR operations is described in detail. Field demonstrations of the C2I system with SAR personnel assisted by unmanned systems provide an outlook on implementing such systems in mainstream SAR operations in the future.

    Assessment of a novel patient-specific 3D printed multi-material simulator for endoscopic sinus surgery

    Background: Three-dimensional (3D) printing is an emerging tool in the creation of anatomical models for surgical training. Its use in endoscopic sinus surgery (ESS) has been limited because of the difficulty of replicating the anatomical details. Aim: To describe the development of a patient-specific 3D printed multi-material simulator for use in ESS, and to validate it as a training tool among a group of residents and experts in ear-nose-throat (ENT) surgery. Methods: Advanced material-jetting 3D printing technology was used to produce both the soft tissues and the bony structures of the simulator, to increase the anatomical realism and tactile feedback of the model. A total of 3 ENT residents and 9 ENT specialists were recruited to perform both non-destructive tasks and ESS steps on the model. The anatomical fidelity and the usefulness of the simulator in ESS training were evaluated through specific questionnaires. Results: The tasks were accomplished by 100% of participants, and the survey showed overall high scores for both anatomical fidelity and usefulness in training. Dacryocystorhinostomy, medial antrostomy, and turbinectomy were rated as accurately replicable on the simulator by 75% of participants. Positive scores were also obtained for ethmoidectomy and Draf procedures, while the replication of sphenoidotomy received neutral ratings from half of the participants. Conclusion: This study demonstrates that a 3D printed multi-material model of the sino-nasal anatomy can be generated with a high level of anatomical accuracy and haptic response. This technology has the potential to be useful in surgical training as an alternative or complementary tool to cadaveric dissection.