
    Design and evaluation of a virtual reality laparoscopic surgery simulator actuated by magnetorheological clutches

    Laparoscopy is a surgical technique that offers a less invasive alternative to traditional open abdominal surgery, allowing patients to recover faster and with less pain. From its introduction, the technique revolutionized surgery, but that revolution came at a price: a long and demanding training process. Haptic simulators have attempted to make this learning easier, but their high cost and large footprint put them out of reach of most trainees. To address this problem, simulation platforms for laparoscopic procedures built around commercially available haptic devices have been proposed. These platforms, however, are of limited fidelity and do not simultaneously achieve the dynamic and kinetic performance required for adequate training: the electric motors they rely on force designers of haptic devices to trade off output force against the dynamic response of the system. The same approach could, however, be used with a new-generation haptic device, the T-Rex, recently developed by Exonetik, a company spun off from research at the Université de Sherbrooke. Unlike haptic devices currently on the market, the T-Rex uses the magnetorheological actuator technology developed by Exonetik, which could deliver the dynamic and kinetic performance needed for surgical training. This research project presents a preliminary analysis of Exonetik's T-Rex as a virtual reality simulator for laparoscopic procedures. A virtual reality laparoscopic surgery simulator using the T-Rex as its haptic interface was designed. Performance criteria were established from the literature for a quantitative evaluation of the system, and finite element simulations were developed for a qualitative evaluation with residents and surgeons. The quantitative evaluation shows that the system meets all four kinematic criteria and three of the four kinetic criteria. The results therefore indicate that magnetorheological actuators hold considerable potential for virtual reality laparoscopic simulators; however, friction in the system must be addressed in future iterations
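
    The abstract does not list the individual performance criteria or their thresholds; the sketch below is only an illustration of how measured kinematic and kinetic metrics of a haptic interface could be checked against literature-derived thresholds of the kind described. All names and numbers are hypothetical placeholders.

```python
# Illustrative only: hypothetical criteria names and threshold values,
# not the actual ones established in the thesis.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    threshold: float
    larger_is_better: bool  # True if the measured value must exceed the threshold

def evaluate(measured: dict, criteria: list) -> dict:
    """Return a pass/fail verdict for each performance criterion."""
    verdicts = {}
    for c in criteria:
        value = measured[c.name]
        verdicts[c.name] = value >= c.threshold if c.larger_is_better else value <= c.threshold
    return verdicts

# Hypothetical kinetic criteria for a laparoscopic haptic interface.
criteria = [
    Criterion("peak_tip_force_N", 5.0, True),       # sustained force at the tool tip
    Criterion("force_bandwidth_Hz", 30.0, True),    # dynamic response of the actuation
    Criterion("backdrive_friction_N", 0.5, False),  # residual friction felt when moving freely
]
measured = {"peak_tip_force_N": 7.2, "force_bandwidth_Hz": 42.0, "backdrive_friction_N": 0.9}
print(evaluate(measured, criteria))  # the friction criterion fails, mirroring the unmet criterion above
```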

    Robotic simulators for tissue examination training with multimodal sensory feedback

    Tissue examination by hand remains an essential technique in clinical practice. Its effective application depends on skills in sensorimotor coordination, mainly involving haptic, visual, and auditory feedback. The skills clinicians have to learn can be as subtle as regulating finger pressure with breathing, choosing the palpation action, monitoring involuntary facial and vocal expressions in response to palpation, and using pain expressions both as a source of information and as a constraint on physical examination. Patient simulators can provide a safe learning platform for novice physicians before they examine real patients. This paper reviews state-of-the-art medical simulators for tissue examination training, for the first time with a focus on providing multimodal feedback so that as many manual examination techniques as possible can be learned. The study summarizes current advances in tissue examination training devices that simulate different medical conditions and provide different types of feedback modalities. Opportunities arising from developments in pain expression, tissue modeling, actuation, and sensing are also analyzed to support the future design of effective tissue examination simulators

    Developing a virtual reality environment for petrous bone surgery: a state-of-the-art review

    The increasing power of computers has led to the development of sophisticated systems that aim to immerse the user in a virtual environment. The benefits of this type of approach to the training of physicians and surgeons are immediately apparent. Unfortunately the implementation of “virtual reality” (VR) surgical simulators has been restricted by both cost and technical limitations. The few successful systems use standardized scenarios, often derived from typical clinical data, to allow the rehearsal of procedures. In reality we would choose a system that allows us not only to practice typical cases but also to enter our own patient data and use it to define the virtual environment. In effect we want to re-write the scenario every time we use the environment and to ensure that its behavior exactly duplicates the behavior of the real tissue. If this can be achieved then VR systems can be used not only to train surgeons but also to rehearse individual procedures where variations in anatomy or pathology present specific surgical problems. The European Union has recently funded a multinational 3-year project (IERAPSI, Integrated Environment for Rehearsal and Planning of Surgical Interventions) to produce a virtual reality system for surgical training and for rehearsing individual procedures. Building the IERAPSI system will bring together a wide range of experts and combine the latest technologies to produce a true, patient-specific virtual reality surgical simulator for petrous/temporal bone procedures. This article presents a review of the “state of the art” technologies currently available to construct a system of this type and an overview of the functionality and specifications such a system requires

    Investigation of the use of meshfree methods for haptic thermal management of design and simulation of MEMS

    This thesis presents a novel approach that uses haptic sensing technology combined with a virtual environment (VE) for the thermal management of Micro-Electro-Mechanical-Systems (MEMS) design. The goal is to reduce the development cycle by avoiding the costly iterative prototyping procedure. In this regard, we use haptic feedback with virtual prototyping in an immersive environment. We also aim to improve the productivity and capability of the designer to better grasp the phenomena operating at the micro-scale, and to augment computational steering through haptic channels. To validate the concept of haptic thermal management, we have implemented a demonstrator with a user-friendly interface which allows the user to intuitively "feel" the temperature field through our concept of haptic texturing. The temperature field in a simple MEMS component is modeled using the finite element method (FEM) or the finite difference method (FDM), and the user is able to feel thermal expansion through a combination of different haptic feedback channels. In haptic applications, the force rendering loop needs to be updated at a frequency of 1 kHz in order to maintain continuity in the user's perception. When using FEM or FDM for our three-dimensional model, the computational cost increases rapidly as the mesh size is reduced to ensure accuracy. This constrains the complexity of the physical model used to approximate the temperature or stress field. It would also be difficult to generate or refine the mesh in real time in a CAD process. In order to circumvent the limitations of conventional mesh-based techniques and to avoid the bothersome task of generating and refining the mesh, we investigate the potential of meshfree methods in the context of our haptic application. We review and compare the different meshfree formulations against the mesh-based FEM technique. We have implemented the different methods to benchmark thermal conduction and elastic problems. The main work of this thesis is to determine the relevance of the meshfree option in terms of design flexibility and computational cost for a haptic physical model
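
    As a rough illustration of the kind of coupling described above (not the thesis implementation), the sketch below solves a coarse two-dimensional steady-state heat conduction problem with a finite difference scheme and maps the temperature under a probe position to a haptic force magnitude; the grid size, gains and boundary conditions are assumptions.

```python
# Minimal sketch, assuming a square plate with one heated edge: a coarse 2D
# finite-difference (Jacobi) solve of steady-state heat conduction, with the
# local temperature under the probe mapped to a haptic force magnitude.
import numpy as np

def solve_heat_fdm(n=64, t_hot=350.0, t_cold=300.0, iters=2000):
    """Jacobi iteration of the Laplace equation with fixed-temperature edges."""
    T = np.full((n, n), t_cold)
    T[0, :] = t_hot  # heated edge (e.g. a resistive element)
    for _ in range(iters):
        T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] +
                                T[1:-1, :-2] + T[1:-1, 2:])
    return T

def haptic_force(T, probe_xy, gain=0.02, t_ref=300.0):
    """Map the temperature under the probe (coordinates in [0, 1]^2) to a force
    magnitude in newtons. Only this cheap lookup runs in the 1 kHz haptic loop;
    the expensive field solve is done offline or at a much lower rate."""
    i, j = (np.clip(np.asarray(probe_xy), 0.0, 1.0) * (T.shape[0] - 1)).astype(int)
    return gain * (T[i, j] - t_ref)

T = solve_heat_fdm()
print(haptic_force(T, (0.1, 0.5)))  # near the hot edge  -> larger force
print(haptic_force(T, (0.9, 0.5)))  # near the cold edge -> smaller force
```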

    Exodex Adam—A Reconfigurable Dexterous Haptic User Interface for the Whole Hand

    Applications for dexterous robot teleoperation and immersive virtual reality are growing. Haptic user input devices need to allow the user to intuitively command and seamlessly “feel” the environment they work in, whether virtual or a remote site through an avatar. We introduce the DLR Exodex Adam, a reconfigurable, dexterous, whole-hand haptic input device. The device comprises multiple modular, three degrees of freedom (3-DOF) robotic fingers, whose placement on the device can be adjusted to optimize manipulability for different user hand sizes. Additionally, the device is mounted on a 7-DOF robot arm to increase the user’s workspace. Exodex Adam uses a front-facing interface, with robotic fingers coupled to two of the user’s fingertips, the thumb, and two points on the palm. Including the palm, as opposed to only the fingertips as is common in existing devices, enables accurate tracking of the whole hand without additional sensors such as a data glove or motion capture. By providing “whole-hand” interaction with omnidirectional force-feedback at the attachment points, we enable the user to experience the environment with the complete hand instead of only the fingertips, thus realizing deeper immersion. Interaction using Exodex Adam can range from palpation of objects and surfaces to manipulation using both power and precision grasps, all while receiving haptic feedback. This article details the concept and design of the Exodex Adam, as well as use cases where it is deployed with different command modalities. These include mixed-media interaction in a virtual environment, gesture-based telemanipulation, and robotic hand–arm teleoperation using adaptive model-mediated teleoperation. Finally, we share the insights gained during our development process and use case deployments
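
    As a small illustration of model-mediated teleoperation in general (not DLR's adaptive implementation), the sketch below keeps a simplified local model of the remote environment, a planar surface with an estimated stiffness, so the haptic loop can render forces immediately while the remote site updates the model parameters at a slower, delay-tolerant rate; all class and parameter names are assumptions.

```python
# Illustrative sketch of model-mediated teleoperation: the fast haptic loop
# renders forces from a locally held environment model, while the remote site
# only updates the model parameters at a much lower, delay-tolerant rate.
from dataclasses import dataclass

@dataclass
class LocalSurfaceModel:
    """Simplified remote environment: a planar surface with estimated stiffness."""
    surface_z: float = 0.0     # estimated contact plane height (m)
    stiffness: float = 800.0   # estimated stiffness (N/m)

    def force(self, hand_z: float) -> float:
        """Penalty force rendered in the fast haptic loop (e.g. 1 kHz)."""
        penetration = self.surface_z - hand_z
        return self.stiffness * penetration if penetration > 0.0 else 0.0

    def update_from_remote(self, surface_z: float, stiffness: float) -> None:
        """Slow-rate update from the remote robot's sensed contact data."""
        self.surface_z, self.stiffness = surface_z, stiffness

model = LocalSurfaceModel()
model.update_from_remote(surface_z=0.02, stiffness=650.0)  # delayed remote estimate
print(model.force(hand_z=0.015))  # ~3.25 N rendered locally, without waiting on the link
```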

    Tactile Displays with Parallel Mechanism


    Haptic and Visual Simulation of Bone Dissection

    In bone dissection virtual simulation, force restitution represents the key to realistically mimicking a patient-specific operating environment. The force is rendered using haptic devices controlled by parametrized mathematical models that represent the bone-burr contact. This dissertation presents and discusses a haptic simulation of a bone-cutting burr, which is being developed as a component of a training system for temporal bone surgery. A physically based model was used to describe the burr-bone interaction, including haptic force evaluation, the bone erosion process and the resulting debris. The model was experimentally validated and calibrated using a custom experimental set-up consisting of a force-controlled robot arm holding a high-speed rotating tool and a contact force measuring apparatus. Psychophysical testing was also carried out to assess individual reaction to the haptic environment. The results suggest that the simulator is capable of rendering the basic material differences required for bone burring tasks. The current implementation, directly operating on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real-time haptic and visual feedback on a low-end multi-processing PC platform.
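
    The calibrated contact and erosion model of the dissertation is not reproduced here; the sketch below only illustrates the general voxel-based idea it describes, with the reaction force magnitude growing with the bone material overlapped by a spherical burr and that material being progressively removed. The grid size, gains and erosion rate are hypothetical.

```python
# Toy voxel burr-bone interaction (illustrative gains, not the calibrated model):
# the force magnitude grows with the bone density inside the spherical burr, and
# a fraction of that material is removed each timestep to mimic erosion/debris.
import numpy as np

def burr_step(density, burr_center, burr_radius, voxel_size,
              force_gain=2.0e-3, erosion_rate=0.05):
    """One haptic timestep: return a reaction force magnitude (N) and erode voxels."""
    axes = [np.arange(n) * voxel_size for n in density.shape]
    X, Y, Z = np.meshgrid(*axes, indexing="ij")
    inside = ((X - burr_center[0])**2 + (Y - burr_center[1])**2 +
              (Z - burr_center[2])**2) <= burr_radius**2
    overlap = density[inside].sum()          # amount of bone currently inside the burr
    density[inside] *= (1.0 - erosion_rate)  # remove a fraction of it as debris
    return force_gain * overlap              # direction would come from the local surface normal

# Toy volume: a 32^3 block of uniform bone density (e.g. thresholded CT data), 0.1 mm voxels.
bone = np.ones((32, 32, 32))
force = burr_step(bone, burr_center=(1.0e-3, 1.0e-3, 1.0e-3),
                  burr_radius=0.5e-3, voxel_size=1.0e-4)
print(f"reaction force ~ {force:.2f} N")
```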

    Digital Fabrication Approaches for the Design and Development of Shape-Changing Displays

    Get PDF
    Interactive shape-changing displays enable dynamic representations of data and information through physically reconfigurable geometry. The actuated physical deformations of these displays can be utilised in a wide range of new application areas, such as dynamic landscape and topographical modelling, architectural design, physical telepresence and object manipulation. Traditionally, shape-changing displays have a high development cost in mechanical complexity, technical skills and the time/finances required for fabrication. There is still a limited number of robust shape-changing displays that go beyond one-off prototypes. Specifically, there is limited focus on low-cost/accessible design and development approaches involving digital fabrication (e.g. 3D printing). To address this challenge, this thesis presents accessible digital fabrication approaches that support the development of shape-changing displays with a range of application examples – such as physical terrain modelling and interior design artefacts. Both laser cutting and 3D printing methods have been explored to ensure generalisability and accessibility for a range of potential users. The first design-led content generation explorations show that novice users, from the general public, can successfully design and present their own application ideas using the physical animation features of the display. By engaging with domain experts in designing shape-changing content to represent data specific to their work domains, the thesis was able to demonstrate the utility of shape-changing displays beyond novel systems and describe practical use-case scenarios and applications through rapid prototyping methods. This thesis then demonstrates new ways of designing and building shape-changing displays that go beyond the current implementation examples available (e.g. pin arrays and continuous-surface shape-changing displays). To achieve this, the thesis demonstrates how laser cutting and 3D printing can be utilised to rapidly fabricate deformable surfaces for shape-changing displays with embedded electronics. The thesis concludes with a discussion of research implications and future directions for this work
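
    As a purely illustrative aside (not part of the thesis), the sketch below shows how topographical content of the kind mentioned above might be mapped onto a hypothetical pin-array shape-changing display by down-sampling a terrain heightmap to per-pin actuator extensions; the grid dimensions and pin travel are assumptions.

```python
# Illustrative only: down-sample a terrain heightmap to the pin heights of a
# hypothetical 16x16 pin-array shape-changing display.
import numpy as np

def heightmap_to_pins(heightmap, pins_x=16, pins_y=16, travel_mm=40.0):
    """Average the heightmap over each pin's footprint and scale to the pin travel."""
    h, w = heightmap.shape
    pins = np.zeros((pins_y, pins_x))
    for i in range(pins_y):
        for j in range(pins_x):
            block = heightmap[i * h // pins_y:(i + 1) * h // pins_y,
                              j * w // pins_x:(j + 1) * w // pins_x]
            pins[i, j] = block.mean()
    lo, hi = pins.min(), pins.max()
    return (pins - lo) / (hi - lo + 1e-9) * travel_mm  # actuator extension in mm

terrain = np.random.rand(256, 256)          # stand-in for elevation data
print(heightmap_to_pins(terrain).round(1))  # 16x16 grid of pin extensions
```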