
    Complementary Situational Awareness for an Intelligent Telerobotic Surgical Assistant System

    Robotic surgical systems have contributed greatly to the advancement of Minimally Invasive Surgery (MIS). More specifically, telesurgical robots have provided enhanced dexterity to surgeons performing MIS procedures. However, current teleoperated robotic systems have only limited situational awareness of the patient anatomy and surgical environment that would typically be available to a surgeon in open surgery. Although the endoscopic view enhances visualization of the anatomy, perceptual understanding of the environment and anatomy is still lacking due to the absence of sensory feedback. In this work, these limitations are addressed by developing a computational framework that provides Complementary Situational Awareness (CSA) in a surgical assistant. The framework aims to improve the human-robot relationship by providing elaborate guidance and sensory feedback capabilities for the surgeon in complex MIS procedures. Unlike traditional teleoperation, it enables the user to telemanipulate the situational model in a virtual environment and uses that information to command the slave robot with appropriate admittance gains and environmental constraints. Simultaneously, the situational model is updated based on the interaction of the slave robot with the task-space environment. Developing such a system to provide real-time situational awareness requires that many technical challenges be met. Estimating intraoperative organ information requires continuous palpation primitives. Intraoperative surface information needs to be estimated in real time while the organ is being palpated or scanned. The model of the task environment needs to be updated in near real time using the estimated organ geometry so that the force feedback applied to the surgeon's hand corresponds to the actual location of the model. This work presents a real-time framework that meets these requirements to provide situational awareness of the environment in the task space. Further, visual feedback is provided so that the surgeon or developer can view near-video-frame-rate updates of the task model. All these functions are executed in parallel and require synchronized data exchange. The system is highly portable and can be incorporated into any existing telerobotic platform with minimal overhead.
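    The abstract above describes commanding the slave robot with admittance gains under environmental constraints; the sketch below is a minimal, hypothetical illustration of that idea (not the authors' implementation), with invented function names, gains, and constraint model.

```python
# Minimal sketch (not the authors' implementation) of an admittance-style
# update loop: a measured interaction force is mapped to a commanded slave
# velocity, and a simple environmental constraint removes motion pushing
# into a forbidden plane. Gains, names, and the constraint model are illustrative.
import numpy as np

def admittance_step(f_measured, admittance_gain, constraint_normal=None, dt=0.001):
    """Map a measured force (N) to a commanded displacement (m) for one control cycle."""
    v_cmd = admittance_gain * f_measured          # admittance: velocity proportional to force
    if constraint_normal is not None:
        n = constraint_normal / np.linalg.norm(constraint_normal)
        v_into = np.dot(v_cmd, n)
        if v_into < 0.0:                          # remove the component pushing into the constraint
            v_cmd = v_cmd - v_into * n
    return v_cmd * dt                             # integrate over one control period

# Example: a 0.5 N lateral force with a gain of 0.02 (m/s)/N, constrained by a horizontal plane.
delta_x = admittance_step(np.array([0.5, 0.0, -0.2]), 0.02,
                          constraint_normal=np.array([0.0, 0.0, 1.0]))
```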

    Enhancing the E-Commerce Experience through Haptic Feedback Interaction

    The sense of touch is important in our everyday lives, and its absence makes it difficult to explore and manipulate everyday objects. Existing online shopping practice lacks the opportunity for physical evaluation, which people often use and value when making product choices. However, with recent advances in haptic research and technology, it is possible to simulate various physical properties such as heaviness, softness, deformation, and temperature. The research described here investigates the use of haptic feedback interaction to enhance e-commerce product evaluation, particularly haptic weight and texture evaluation. While other properties are equally important, weight and texture are fundamental to the shopping experience for many online products and can be simulated using cost-effective devices. Two initial psychophysical experiments were conducted using free-motion haptic exploration in order to more closely resemble conventional shopping: one to measure weight force thresholds and another to measure texture force thresholds. These measurements provide a better understanding of haptic device limitations for online shopping in terms of the range of stimuli available to represent physical products. The outcomes of the initial psychophysical studies were then used to produce the absolute stimuli employed in a comparative experimental study evaluating the user experience of haptic product evaluation. Although free haptic exploration was used in both psychophysical experiments, the results were relatively consistent with previous work on haptic discrimination. The threshold for weight force discrimination, represented as downward forces, was 10 percent. The threshold for texture force discrimination, represented as friction forces, was 14.1 percent when using the dynamic coefficient of friction at any level of static coefficient of friction. The comparative study of user experience with haptic product information indicated that haptic product evaluation does not change user performance significantly. Although there was an increase in the time taken to complete the task, the number of button-click actions tended to decrease. The results showed that haptic product evaluation can significantly increase confidence in shopping decisions. Nevertheless, the availability of haptic product evaluation does not necessarily lead to different product choices; rather, it complements other selection criteria such as price and appearance. The research findings from this work are a first step towards exploring haptic interaction in e-commerce environments. The findings not only lay the foundation for designing online haptic shopping but also provide empirical support for research in this direction.
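    As a rough illustration of how the reported discrimination thresholds could be applied, the following sketch treats the 10 percent weight and 14.1 percent friction values as Weber-style fractions; the function name, reference values, and decision rule are assumptions for illustration only, not code from the study.

```python
# Illustrative use of the reported thresholds: a comparison stimulus is assumed
# distinguishable when its relative difference from the reference exceeds the
# corresponding fraction. Reference values below are invented examples.
WEIGHT_THRESHOLD = 0.10      # reported downward-force discrimination threshold
FRICTION_THRESHOLD = 0.141   # reported dynamic-friction discrimination threshold

def is_discriminable(reference, comparison, threshold):
    """Return True if the relative difference exceeds the given Weber-style fraction."""
    return abs(comparison - reference) / reference > threshold

# Example: a 2.0 N reference weight force vs. a 2.3 N comparison (15% heavier).
print(is_discriminable(2.0, 2.3, WEIGHT_THRESHOLD))       # True: above the 10% threshold
# Example: dynamic friction coefficients 0.50 vs. 0.55 (10% difference).
print(is_discriminable(0.50, 0.55, FRICTION_THRESHOLD))   # False: below 14.1%
```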

    A technology-aided multi-modal training approach to assist abdominal palpation training and its assessment in medical education

    Kinaesthetic Learning Activities (KLA) are techniques for enhancing the motor learning process to provide a deep understanding of fundamental skills in particular disciplines. With KLA, learning takes place by carrying out a physical activity to transform empirical achievements into representative cognitive understanding. In disciplines such as medical education, frequent hands-on practice of certain motor skills plays a key role in the development of medical students' competency. Therefore, it is essential that clinicians master these core skills early in their educational journey and retain them for the entirety of their career. Transferring knowledge of dexterous motor skills, such as clinical examinations, from experts to novices demands a systematic approach to quantifying the relevant motor variables with the help of medical experts in order to form a reference best-practice model for the target skills. Additional information (augmented feedback) on certain aspects of movement can be extracted from this model and visualised via multi-modal sensory channels to enhance motor performance and the learning process. This thesis proposes a novel KLA methodology to significantly improve the quality of palpation training for medical students. In particular, it investigates whether the existing abdominal palpation skills acquisition process (motor performance and learning) can be enhanced by providing instructional concurrent and terminal augmented feedback on the forces applied by the learner's hand via autonomous multi-modal displays. This is achieved by: identifying key motor variables with the help of medical experts; forming a gold-standard model for the target skills by collecting pre-defined motor variables with an innovative quantification technique; designing assessment criteria by analysing the medical experts' data; and systematically evaluating the impact of instructional augmented feedback on medical students' motor performance with two distinct assessment approaches (machine-based and human-based). In addition, performance on a simpler task is evaluated using a game-based training method, to compare feedback visualisation techniques, such as the concurrent visual and auditory feedback used in a serious-games environment, with abstract visualisation of motor variables. A detailed between-participants study is presented to evaluate the effect of concurrent augmented feedback on participants' skills acquisition in the motor learning process. A significant improvement in medical students' motor performance was observed when augmented feedback on applied forces was presented visually (H(2) = 6.033, p < .05). Moreover, a positive correlation was found between computer-generated and human-generated scores, r = .62, p (one-tailed) < .05, indicating the potential of the computer-based assessment technique to assist the current assessment process in medical education. Similar results were achieved in a blind-folded (no-feedback) transfer test evaluating performance and short-term retention of skills in the game-based training approach. The accuracy of the exerted target force for participants in the game-playing group, who trained using the game approach (Mdn = 0.86), differed significantly from that of participants in the control group, who trained using an abstract visualisation of the exerted force value (Mdn = 1.56), U = 61, z = -2.137, p < .05, r = -0.36. Finally, the usability of both motor learning approaches was surveyed via feedback questionnaires, and positive responses were received from users. The research presented shows that concurrent augmented feedback significantly improves participants' motor control abilities. Furthermore, advanced visualisation techniques such as multi-modal displays increase participants' motivation to engage in learning and to retain motor skills.
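    The statistics quoted above (Kruskal-Wallis, Mann-Whitney, and Pearson correlation) could, for example, be computed with SciPy as in the sketch below; the group labels and scores are invented placeholders, not data from the study.

```python
# Sketch of the kind of non-parametric analysis reported above, using SciPy
# on hypothetical per-participant scores (the actual study data are not shown here).
from scipy import stats

visual_fb   = [0.81, 0.92, 0.77, 0.88, 0.85]   # e.g. force-accuracy scores, visual feedback group
auditory_fb = [0.74, 0.70, 0.79, 0.72, 0.76]   # auditory feedback group
control     = [0.61, 0.66, 0.58, 0.70, 0.63]   # no-feedback control group

# Kruskal-Wallis H test across the three training conditions (df = 2).
h_stat, p_kw = stats.kruskal(visual_fb, auditory_fb, control)

# Mann-Whitney U test comparing two groups on a transfer-test measure.
u_stat, p_mw = stats.mannwhitneyu(visual_fb, control, alternative="two-sided")

# Pearson correlation between computer-generated and examiner-generated scores.
machine_scores = [62, 75, 58, 80, 69, 71]
human_scores   = [60, 78, 55, 83, 65, 74]
r, p_corr = stats.pearsonr(machine_scores, human_scores)
```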

    Recent Developments and Future Challenges in Medical Mixed Reality

    As AR technology matures, we have seen many applications emerge in entertainment, education and training. However, the use of AR is not yet common in medical practice, despite the great potential of this technology to help not only learning and training in medicine, but also diagnosis and surgical guidance. In this paper, we present recent trends in the use of AR across all medical specialties and identify challenges that must be overcome to narrow the gap between academic research and the practical use of AR in medicine. A database of 1403 relevant research papers published over the last two decades has been reviewed using a novel research trend analysis method based on a text mining algorithm. We semantically identified 10 topics covering a variety of technologies and applications, based on the unbiased and impersonal clustering results from a Latent Dirichlet Allocation (LDA) model, and analysed the trend of each topic from 1995 to 2015. The statistical results reveal a taxonomy that best describes the development of medical AR research over the two decades, and the trend analysis provides a higher-level view of how the taxonomy has changed and where the focus is heading. Finally, based on these results, we provide an insightful discussion of the current limitations, challenges and future directions in the field. Our objective is to help researchers focus on the application areas in medical AR that are most needed, as well as to provide medical practitioners with the latest technology advancements.
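    A minimal sketch of the kind of LDA-based trend analysis described above, assuming a scikit-learn pipeline over paper abstracts; the corpus, parameters, and aggregation step are illustrative and not taken from the paper.

```python
# Assumed pipeline (not the authors' code): fit a 10-topic LDA model on paper
# abstracts and aggregate per-document topic weights by publication year.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = ["augmented reality surgical navigation ...",   # placeholder corpus;
             "image guided laparoscopic registration ..."]  # the study used 1403 papers
years = [2003, 2011]

vectorizer = CountVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(abstracts)

lda = LatentDirichletAllocation(n_components=10, random_state=0)
doc_topics = lda.fit_transform(X)                # per-document topic proportions

# Trend of each topic: mean topic weight per publication year.
year_array = np.array(years)
trend = {y: doc_topics[year_array == y].mean(axis=0) for y in sorted(set(years))}
```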

    A Haptic Display for Simulating Soft Tissue Palpation

    This work presents a novel device for practising palpation. During exploration, the forces at the hand and at the finger are rendered. In a virtual environment, the properties of the soft tissue can be adapted in many ways to different training scenarios. A user study evaluates the functionality of the haptic display and the virtual environment and demonstrates the benefits of the new training device.

    Medical Robotics

    The first generation of surgical robots is already being installed in a number of operating rooms around the world. Robotics is being introduced to medicine because it allows unprecedented control and precision of surgical instruments in minimally invasive procedures. So far, robots have been used to position an endoscope, perform gallbladder surgery, and correct gastroesophageal reflux and heartburn. The ultimate goal of the robotic surgery field is to design a robot that can be used to perform closed-chest, beating-heart surgery. The use of robotics in surgery will undoubtedly expand over the coming decades. Minimally Invasive Surgery (MIS) is a revolutionary approach in surgery: the operation is performed with instruments and viewing equipment inserted into the body through small incisions created by the surgeon, in contrast to open surgery with large incisions. This minimizes surgical trauma and damage to healthy tissue, resulting in shorter patient recovery times. The aim of this book is to provide an overview of the state of the art and to present new ideas, original results and practical experiences in this expanding area; many chapters concern advanced research in this growing field. The book provides critical analysis of clinical trials and assessment of the benefits and risks of applying these technologies. It is certainly a small sample of the research activity on Medical Robotics going on around the globe as you read it, but it covers a good deal of what has been done in the field recently, and as such it serves as a valuable source for researchers interested in the subjects involved, whether or not they are currently “medical roboticists”.

    Medical robots for MRI guided diagnosis and therapy

    Magnetic Resonance Imaging (MRI) provides the capability of imaging tissue with fine resolution and superior soft tissue contrast compared with conventional ultrasound and CT imaging, which makes it an important tool for clinicians to perform more accurate diagnosis and image-guided therapy. Medical robotic devices combining high-resolution anatomical images with real-time navigation are ideal for precise and repeatable interventions. Despite these advantages, the MR environment imposes constraints on mechatronic devices operating within it. This thesis presents a study on the design and development of robotic systems for particular MR interventions, investigating the MR compatibility of mechatronic components, actuation control, kinematics and workspace analysis, and the mechanical and electrical design of the robots. Two types of robotic systems have been developed and evaluated along these lines. (i) A device for MR-guided transrectal prostate biopsy: the system was designed from components proven to be MR compatible, actuated by pneumatic and ultrasonic motors, and tracked by optical position sensors and fiducial markers. Clinical trials were performed with the device on three patients, and the results demonstrated its capability to perform needle positioning under MR guidance, with a procedure time of around 40 minutes and no compromise of image quality, meeting our system specifications. (ii) Limb positioning devices to exploit the magic angle effect for diagnosis of tendinous injuries: two systems were designed for lower and upper limb positioning, actuated and tracked by methods similar to those of the first device. A group of volunteers was recruited to verify the functionality of the systems. The results demonstrate a clear enhancement of image quality, with an increase in signal intensity of up to 24 times in tendon tissue caused by the magic angle effect, showing the feasibility of applying the proposed devices in clinical diagnosis.
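    For context, the magic angle effect exploited by the limb-positioning devices follows the standard orientation dependence of the residual dipolar coupling in ordered collagen; the relation below is the textbook form, not a formula from the thesis.

```latex
% Textbook relation underlying the magic angle effect (not from the thesis):
% the residual dipolar coupling in ordered collagen scales with the fibre
% orientation \theta relative to the main field B_0 as
\[
  D(\theta) \;\propto\; \tfrac{1}{2}\bigl(3\cos^{2}\theta - 1\bigr),
\]
% which vanishes at the magic angle
\[
  3\cos^{2}\theta - 1 = 0
  \;\Longrightarrow\;
  \theta = \arccos\sqrt{1/3} \approx 54.7^{\circ},
\]
% lengthening T2 in tendon and increasing the observed signal intensity.
```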

    Haptics: Science, Technology, Applications

    This open access book constitutes the proceedings of the 12th International Conference on Human Haptic Sensing and Touch Enabled Computer Applications, EuroHaptics 2020, held in Leiden, The Netherlands, in September 2020. The 60 papers presented in this volume were carefully reviewed and selected from 111 submissions. They were organized in topical sections on haptic science, haptic technology, and haptic applications. This year's focus is on accessibility.

    Multimodal interaction: developing an interaction concept for a touchscreen incorporating tactile feedback

    The touchscreen, as an alternative user interface for applications that normally require mice and keyboards, has become more and more commonplace, showing up on mobile devices, on vending machines, on ATMs and in the control panels of machines in industry, where conventional input devices cannot provide intuitive, rapid and accurate user interaction with the content of the display. The exponential growth in processing power on the PC, together with advances in understanding human communication channels, has had a significant effect on the design of usable, human-factored interfaces on touchscreens, and on the number and complexity of applications available on touchscreens. Although computer-driven touchscreen interfaces provide programmable and dynamic displays, the absence of the expected tactile cues on the hard and static surfaces of conventional touchscreens challenges interface design and touchscreen usability, particularly in distracting, low-visibility environments. Current technology allows the human tactile modality to be used in touchscreens. While the visual channel converts graphics and text unidirectionally from the computer to the end user, tactile communication features a bidirectional information flow to and from the user as the user perceives and acts on the environment and the system responds to changing contextual information. Tactile sensations such as detents and pulses provide users with cues that make selection and control a more intuitive process. Tactile features can compensate for deficiencies in some of the human senses, especially in tasks which carry a heavy visual or auditory burden. In this study, an interaction concept for tactile touchscreens is developed with a view to employing the key characteristics of the human sense of touch effectively and efficiently, especially in distracting environments where vision is impaired and hearing is overloaded. As a first step toward improving the usability of touchscreens through the integration of tactile effects, different mechanical solutions for producing motion in tactile touchscreens are investigated, to provide a basis for selecting suitable vibration directions when designing tactile displays. Building on these results, design know-how regarding tactile feedback patterns is further developed to enable dynamic simulation of UI controls, in order to give users a sense of perceiving real controls on a highly natural touch interface. To study the value of adding tactile properties to touchscreens, haptically enhanced UI controls are then further investigated with the aim of mapping haptic signals to different usage scenarios for performing primary and secondary tasks with touchscreens. The findings of the study are intended for consideration and discussion as a guide to further development of tactile stimuli, haptically enhanced user interfaces and touchscreen applications.
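    As a purely illustrative sketch of the kind of event-to-effect mapping such an interaction concept implies, the snippet below pairs common touchscreen UI events with simple vibration waveforms; all names, frequencies and amplitudes are invented and not taken from the study.

```python
# Illustrative mapping of touchscreen UI events to simple tactile effects:
# a short "pulse" for a button press and a lighter "detent" tick for list
# scrolling. Waveforms are plain sinusoidal bursts an actuator driver could play.
import numpy as np

def pulse(duration_s=0.02, freq_hz=250, sample_rate=8000, amplitude=1.0):
    """Single sinusoidal burst intended to drive a vibration actuator."""
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

TACTILE_EFFECTS = {
    "button_press": pulse(duration_s=0.02, amplitude=1.0),   # crisp confirmation click
    "list_detent":  pulse(duration_s=0.008, amplitude=0.6),  # light tick per list item
    "slider_limit": pulse(duration_s=0.05, amplitude=1.0),   # stronger cue at an end stop
}

def on_ui_event(event_name):
    """Return the waveform to play for a given UI event, or None if unmapped."""
    return TACTILE_EFFECTS.get(event_name)
```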