
    Instrumentation, Data, And Algorithms For Visually Understanding Haptic Surface Properties

    Autonomous robots need to efficiently walk over varied surfaces and grasp diverse objects. We hypothesize that the association between how such surfaces look and how they physically feel during contact can be learned from a database of matched haptic and visual data recorded from various end-effectors' interactions with hundreds of real-world surfaces. Testing this hypothesis required the creation of a new multimodal sensing apparatus, the collection of a large multimodal dataset, and the development of a machine-learning pipeline. This thesis begins by describing the design and construction of the Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short), an untethered handheld sensing device that emulates the capabilities of the human senses of vision and touch. Its sensory modalities include RGBD vision, egomotion, contact force, and contact vibration. Three interchangeable end-effectors (a steel tooling ball, an OptoForce three-axis force sensor, and a SynTouch BioTac artificial fingertip) allow for different material properties at the contact point and provide additional tactile data. We then detail the calibration process for the motion and force sensing systems, as well as several proof-of-concept surface discrimination experiments that demonstrate the reliability of the device and the utility of the data it collects. This thesis then presents a large-scale dataset of multimodal surface interaction recordings, including 357 unique surfaces such as furniture, fabrics, outdoor fixtures, and items from several private and public material sample collections. Each surface was touched with one, two, or three end-effectors, comprising approximately one minute per end-effector of tapping and dragging at various forces and speeds. We hope that the larger community of robotics researchers will find broad applications for the published dataset. Lastly, we demonstrate an algorithm that learns to estimate haptic surface properties given visual input. Surfaces were rated on hardness, roughness, stickiness, and temperature by the human experimenter and by a pool of purely visual observers. We then trained an algorithm to perform the same task as well as to infer quantitative properties calculated from the haptic data. Overall, the task of predicting haptic properties from vision alone proved difficult for both humans and computers, but a hybrid algorithm using a deep neural network and a support vector machine achieved a correlation between expected and actual regression output of approximately ρ = 0.3 to ρ = 0.5 on previously unseen surfaces.
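
    The pipeline described above pairs learned image features with a classical regressor. Below is a minimal sketch of that idea, assuming a pretrained ResNet-18 as the feature extractor, an RBF-kernel support vector regressor for a single property (roughness), and Spearman's ρ as the correlation measure; the data arrays are placeholders, and none of these choices are claimed to match the thesis' exact implementation.

    ```python
    # Hybrid vision-to-haptics regressor sketch: frozen CNN features + SVR.
    import numpy as np
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from sklearn.svm import SVR
    from scipy.stats import spearmanr

    # Pretrained CNN used as a frozen feature extractor.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()          # expose 512-d penultimate features
    backbone.eval()

    preprocess = T.Compose([
        T.ToTensor(),
        T.Resize((224, 224)),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def image_features(images):
        """images: list of HxWx3 uint8 surface photos -> (N, 512) feature matrix."""
        with torch.no_grad():
            batch = torch.stack([preprocess(im) for im in images])
            return backbone(batch).numpy()

    # Placeholders standing in for surface photos and haptically derived roughness.
    X_img = [np.random.randint(0, 255, (240, 320, 3), dtype=np.uint8) for _ in range(40)]
    y_rough = np.random.rand(40)

    feats = image_features(X_img)
    train, test = slice(0, 30), slice(30, 40)

    svr = SVR(kernel="rbf", C=1.0).fit(feats[train], y_rough[train])
    pred = svr.predict(feats[test])
    rho, _ = spearmanr(pred, y_rough[test])    # rank correlation on held-out surfaces
    print(f"Spearman rho on held-out surfaces: {rho:.2f}")
    ```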

    Microscope Embedded Neurosurgical Training and Intraoperative System

    In recent years, neurosurgery has been strongly influenced by new technologies. Computer Aided Surgery (CAS) offers several benefits for patient safety, but fine techniques aimed at minimally invasive and minimally traumatic treatments are required, since intra-operative false movements can be devastating, resulting in patient deaths. The precision of the surgical gesture is related both to the accuracy of the available technological instruments and to the surgeon's experience. In this frame, medical training is particularly important. From a technological point of view, the use of Virtual Reality (VR) for surgeon training and Augmented Reality (AR) for intra-operative treatment offers the best results. In addition, traditional techniques for training in surgery include the use of animals, phantoms and cadavers. The main limitation of these approaches is that live tissue has different properties from dead tissue and that animal anatomy differs significantly from human anatomy. From the medical point of view, Low-Grade Gliomas (LGGs) are intrinsic brain tumours that typically occur in younger adults. The objective of treatment is to remove as much of the tumour as possible while minimizing damage to the healthy brain. Pathological tissue may closely resemble normal brain parenchyma when viewed through the neurosurgical microscope. Tactile appreciation of the different consistency of the tumour compared to normal brain requires considerable experience on the part of the neurosurgeon and is a vital point. The first part of this PhD thesis presents a system for realistic simulation (visual and haptic) of spatula palpation of an LGG. This is the first prototype of a training system for neurosurgery that combines VR, haptics and a real microscope. The architecture can also be adapted for intra-operative purposes. In this instance, the surgeon needs the basic setup for Image Guided Therapy (IGT) interventions: microscope, monitors and navigated surgical instruments. The same virtual environment can be rendered as AR onto the microscope optics. The objective is to enhance the surgeon's intra-operative orientation by providing a three-dimensional view and other information necessary for safe navigation inside the patient. These considerations motivated the second part of this work, which was devoted to improving a prototype AR stereoscopic microscope for neurosurgical interventions developed in our institute in previous work. Completely new software was developed to reuse the microscope hardware, enhancing both rendering performance and usability. Since AR and VR share the same platform, the system can be referred to as a Mixed Reality System for neurosurgery. All the components are open source or at least released under a GPL license.
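
    The AR part of the system overlays the rendered virtual environment onto the microscope view. The sketch below illustrates only that compositing step, assuming a monoscopic camera frame, a pre-rendered overlay image with a mask, and a fixed blending factor; the real system works on the stereoscopic microscope optics with tracked, navigated instruments.

    ```python
    # Toy AR compositing: alpha-blend a rendered virtual scene onto a camera frame.
    import numpy as np
    import cv2

    def composite_ar(microscope_frame, rendered_overlay, overlay_mask, alpha=0.4):
        """Blend the rendered overlay onto the microscope frame.

        microscope_frame: HxWx3 uint8 BGR image from the microscope camera
        rendered_overlay: HxWx3 uint8 BGR rendering of the virtual scene
        overlay_mask:     HxW uint8 mask, >0 where virtual content exists
        """
        out = microscope_frame.copy()
        m = overlay_mask > 0
        blended = cv2.addWeighted(microscope_frame, 1.0 - alpha,
                                  rendered_overlay, alpha, 0.0)
        out[m] = blended[m]                  # blend only where virtual content exists
        return out

    # Placeholder frames standing in for the real video and render streams.
    frame = np.full((480, 640, 3), 80, dtype=np.uint8)
    render = np.zeros_like(frame)
    cv2.circle(render, (320, 240), 60, (0, 0, 255), -1)   # stand-in tumour rendering
    mask = cv2.cvtColor(render, cv2.COLOR_BGR2GRAY)

    ar_view = composite_ar(frame, render, mask)
    cv2.imwrite("ar_view.png", ar_view)
    ```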

    Influence of Haptic Communication on a Shared Manual Task in a Collaborative Virtual Environment

    With the advent of new haptic feedback devices, researchers are giving serious consideration to the incorporation of haptic communication in collaborative virtual environments. For instance, haptic interaction-based tools can be used in medical and related education, where students can train in minimally invasive surgery using virtual reality before approaching human subjects. Designing virtual environments that support haptic communication requires a deeper understanding of human haptic interaction. In this paper, human haptic collaboration is investigated. A collaborative virtual environment was designed to support a shared manual task. To evaluate this system, 60 medical students participated in an experimental study. Participants were asked to perform a needle insertion task in dyads after a training period. Results show that, compared to conventional training methods, visual-haptic training improves users' collaborative performance. In addition, we found that haptic interaction influences the partners' verbal communication when sharing haptic information, indicating that haptic communication training changes the nature of the users' mental representations. Finally, we found that haptic interaction increased the sense of copresence in the virtual environment: haptic communication facilitates users' collaboration in a shared manual task within a shared virtual environment. Design implications for including haptic communication in virtual environments are outlined.

    Patient-specific simulation environment for surgical planning and preoperative rehearsal

    Surgical simulation is common practice in the fields of surgical education and training. Numerous surgical simulators are available from commercial and academic organisations for the generic modelling of surgical tasks. However, a simulation platform that fulfils the key requirements for patient-specific surgical simulation of soft tissue, with an effective translation into clinical practice, has yet to be found. Patient-specific modelling is possible, but to date has been time-consuming, and consequently costly, because data preparation can be technically demanding. This motivated the research developed herein, which addresses the main challenges of biomechanical modelling for patient-specific surgical simulation. A novel implementation of soft tissue deformation and estimation of the patient-specific intraoperative environment is achieved using a position-based dynamics approach. This modelling approach overcomes the limitations of traditional physically-based approaches by providing a simulation of patient-specific models with visual and physical accuracy, stability and real-time interaction. Because the method is geometrically based, a calibration of the simulation parameters is performed, and the simulation framework is successfully validated through experimental studies. The capabilities of the simulation platform are demonstrated by the integration of different surgical planning applications relevant in the context of kidney cancer surgery. The simulation of pneumoperitoneum facilitates trocar placement planning and intraoperative surgical navigation. The implementation of deformable ultrasound simulation can assist surgeons in improving their scanning technique and defining an optimal procedural strategy. Furthermore, the simulation framework has the potential to support the development and assessment of hypotheses that cannot be tested in vivo. Specifically, the evaluation of feedback modalities, as a response to user-model interaction, demonstrates improved performance and justifies the need to integrate a feedback framework in the robot-assisted surgical setting.
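
    Position-based dynamics replaces force integration with direct constraint projection on predicted positions. The following sketch shows the core loop for simple distance constraints under gravity; the stiffness handling and the two-particle test case are chosen purely for illustration, and a soft-tissue model of the kind described above would add volume, shape and collision constraints.

    ```python
    # Minimal position-based dynamics (PBD) step with distance constraints.
    import numpy as np

    def pbd_step(x, v, inv_mass, edges, rest_len, dt=1e-2, iters=10, stiffness=0.9,
                 gravity=np.array([0.0, -9.81, 0.0])):
        """x: (N,3) positions, v: (N,3) velocities, inv_mass: (N,), edges: list of (i,j)."""
        # 1) Predict positions from current velocities and external acceleration.
        p = x + dt * v
        p[inv_mass > 0] += dt * dt * gravity
        # 2) Iteratively project distance constraints onto the predicted positions.
        for _ in range(iters):
            for (i, j), d0 in zip(edges, rest_len):
                diff = p[i] - p[j]
                dist = np.linalg.norm(diff)
                w = inv_mass[i] + inv_mass[j]
                if dist < 1e-9 or w == 0.0:
                    continue
                corr = stiffness * (dist - d0) / (dist * w) * diff
                p[i] -= inv_mass[i] * corr
                p[j] += inv_mass[j] * corr
        # 3) Recover velocities from the position change and commit.
        v_new = (p - x) / dt
        return p, v_new

    # Two-particle example: a stretched unit-length link with one pinned end.
    x = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
    v = np.zeros_like(x)
    inv_mass = np.array([0.0, 1.0])              # particle 0 is fixed
    edges, rest_len = [(0, 1)], [1.0]
    for _ in range(200):
        x, v = pbd_step(x, v, inv_mass, edges, rest_len)
    print(np.linalg.norm(x[1]))                  # stays close to the 1.0 rest length
    ```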

    Object grasping and manipulation in capuchin monkeys (genera Cebus and Sapajus)

    The abilities to perform skilled hand movements and to manipulate objects dexterously are landmarks in the evolution of primates. The study of how primates use their hands to grasp and manipulate objects in accordance with their needs sheds light on how these species are physically and mentally equipped to deal with the problems they encounter in their daily life. We report data on capuchin monkeys, highly manipulative platyrrhine species that usually spend a great deal of time in active manipulation to search for food and to prepare it for ingestion. Our aim is to provide an overview of current knowledge on the ability of capuchins to grasp and manipulate objects, with a special focus on how these species express their cognitive potential through manual behaviour. Data on the ability of capuchins to move their hands and on the neural correlates sustaining their actions are reported, as are findings on the manipulative ability of capuchins to anticipate future actions and to relate objects to other objects and substrates. The manual behaviour of capuchins is considered in different domains, such as motor planning, extractive foraging and tool use, in both captive and natural settings. Anatomofunctional and behavioural similarities to and differences from other haplorrhine species regarding manual dexterity are also discussed.

    Multimodal Human-Machine Interface For Haptic-Controlled Excavators

    The goal of this research is to develop a human-excavator interface for the haptic-controlled excavator that makes use of multiple human sensing modalities (visual, auditory, haptic) and efficiently integrates these modalities to provide an intuitive, efficient interface that is easy to learn and use and is responsive to operator commands. Two empirical studies were conducted to investigate conflict in the haptic-controlled excavator interface and to identify the level of force feedback that yields the best operator performance.

    Modeling and rendering for development of a virtual bone surgery system

    A virtual bone surgery system is developed to provide the potential of a realistic, safe, and controllable environment for surgical education. It can be used for training in orthopedic surgery, as well as for planning and rehearsal of bone surgery procedures...Using the developed system, the user can perform virtual bone surgery by simultaneously seeing bone material removal through a graphic display device, feeling the force via a haptic device, and hearing the sound of tool-bone interaction --Abstract, page iii
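
    The abstract above describes coupled visual material removal and haptic force rendering. Below is a toy sketch of one such update, assuming a boolean voxel model of bone, a spherical burr, and a penalty force proportional to the bone volume the burr currently overlaps; the resolution, gain, and units are invented for illustration and do not reflect the actual system.

    ```python
    # Toy voxel carving + penalty force, the two halves of a bone-burring update.
    import numpy as np

    VOXEL_SIZE = 0.5                           # mm per voxel (assumed)
    bone = np.ones((64, 64, 64), dtype=bool)   # True = intact bone material

    def carve_and_force(tool_tip_mm, tool_radius_mm=1.5, stiffness=0.8):
        """Remove bone voxels inside the burr sphere and return a reaction force.

        tool_tip_mm: (3,) tool-tip position in mm; returns a (3,) force vector (toy units).
        """
        centres = np.indices(bone.shape).reshape(3, -1).T * VOXEL_SIZE  # voxel centres
        d = np.linalg.norm(centres - tool_tip_mm, axis=1)
        inside = (d < tool_radius_mm) & bone.reshape(-1)
        removed = int(inside.sum())
        bone.reshape(-1)[inside] = False       # material removal shown on the display
        if removed == 0:
            return np.zeros(3)
        # Penalty force: magnitude grows with the overlapped bone volume, directed
        # from the centroid of the overlap toward the tool tip (pushes the burr out).
        centroid = centres[inside].mean(axis=0)
        direction = tool_tip_mm - centroid
        n = np.linalg.norm(direction)
        direction = direction / n if n > 1e-9 else np.zeros(3)
        return stiffness * removed * VOXEL_SIZE * direction

    # Burr touching the top face of the bone block; the force points back out (+y).
    force = carve_and_force(np.array([16.0, 31.5, 16.0]))
    print(force)   # this vector would be streamed to the haptic device servo loop
    ```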

    Nonlinear effects in finite elements analysis of colorectal surgical clamping

    Minimally Invasive Surgery (MIS) is a procedure whose applications have increased in the past few years across different types of surgery. As the number of application fields grows, new issues have arisen. In particular, instruments must be inserted through a trocar to access the abdominal cavity, without the capability of direct manipulation of tissues, so a loss of sensitivity occurs. Generally speaking, medical students and junior surgeons need many hours of practice before starting any surgical procedure, since they find it difficult to acquire the specific skills (hand–eye coordination, among others) required for this type of surgery. This is where surgical simulators present a promising training method, using an approach based on the Finite Element Method (FEM). The use of continuum mechanics, and especially Finite Element Analysis (FEA), has gained extensive application in the medical field for simulating soft tissues. In particular, colorectal simulations can be used to understand the interaction between the colon and the surrounding tissues, and also between the colon and the instruments. Although several works have considered small displacements, FEA applied to colorectal surgical procedures with large displacements is a topic that calls for more investigation. This work aims to investigate how FEA can describe the non-linear effects induced by material properties and by different approximating geometries, focusing on colorectal surgery as the test-case application. In more detail, it presents a comparison between simulations performed using both linear and hyperelastic models. These different mechanical behaviours are applied to different geometrical models (planar, cylindrical, 3D-SS and a real model from digital acquisitions, 3D-S) with the aim of evaluating the effects of geometric non-linearity. The final aim of the research is to provide a preliminary contribution to the simulation of the interaction between surgical instruments and colon tissue with multi-purpose FEA, in order to support the preliminary set-up of different bioengineering tasks such as force-contact evaluation or approximate modelling for virtual reality (surgical simulation). In particular, the contribution of this work focuses on a sensitivity analysis of the nonlinearities captured by FEA in the tissue-tool interaction, using an explicit FEA solver. In this way, we aim to demonstrate that the set-up of FEA computational surgical tools may be simplified, providing assistance to non-expert FEA engineers or clinicians in using FEA tools more precisely.
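
    The difference between the linear and hyperelastic models compared above can be illustrated with a one-dimensional worked example: nominal stress under uniaxial stretch for a linear elastic material versus an incompressible Neo-Hookean material with a matched small-strain modulus. The modulus value below is an assumed placeholder, not a measured colon property.

    ```python
    # Linear elastic vs incompressible Neo-Hookean nominal stress in uniaxial stretch.
    import numpy as np

    E = 5.0e3                 # Pa, assumed small-strain Young's modulus (placeholder)
    mu = E / 3.0              # shear modulus for an incompressible material (nu = 0.5)

    stretch = np.linspace(1.0, 1.5, 6)          # lambda = 1.0 ... 1.5 (0-50% strain)

    # Linear elasticity: P = E * (lambda - 1)
    P_linear = E * (stretch - 1.0)

    # Incompressible Neo-Hookean, uniaxial: P = mu * (lambda - lambda**-2)
    P_neo = mu * (stretch - stretch**-2)

    for lam, pl, pn in zip(stretch, P_linear, P_neo):
        print(f"lambda={lam:.1f}  linear={pl:7.1f} Pa  neo-Hookean={pn:7.1f} Pa")
    # The two models agree for small strains and diverge as the stretch grows,
    # which is the kind of discrepancy the FEA comparison above quantifies.
    ```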