
    An evaluation of asymmetric interfaces for bimanual virtual assembly with haptics

    Immersive computing technology provides a human–computer interface that supports natural human interaction with digital data and models. One application for this technology is product assembly methods planning and validation. This paper presents the results of a user study that explores the effectiveness of various bimanual interaction device configurations for virtual assembly tasks. Participants completed two assembly tasks with two device configurations in five randomized bimanual treatment conditions (within subjects). A Phantom Omni® with and without haptics enabled and a 5DT Data Glove were used. Participant performance, measured as time to assemble, was the evaluation metric. The results revealed no significant difference in performance between the five treatment conditions. However, half of the participants chose the 5DT Data Glove and the haptic-enabled Phantom Omni® as their preferred device configuration. In addition, qualitative comments support both a preference for haptics during the assembly process and Guiard’s kinematic chain model.

    Factors of Micromanipulation Accuracy and Learning

    Micromanipulation refers to manipulation under a microscope in order to perform delicate procedures. It is difficult for humans to manipulate objects accurately under a microscope due to tremor and imperfect perception, which limit performance. This project seeks to understand factors affecting accuracy in micromanipulation and to propose learning strategies that improve accuracy. Psychomotor experiments were conducted using computer-controlled setups to determine how various feedback modalities and learning methods influence micromanipulation performance. In a first experiment, the static and motion accuracy of surgeons, medical students, and non-medical students under different magnification levels and grip force settings were compared. A second experiment investigated whether the non-dominant hand, placed close to the target, can contribute to accurate pointing of the dominant hand. A third experiment tested a training strategy for micromanipulation using unstable dynamics to magnify motion error, a strategy previously shown to decrease deviation in large arm movements. Two virtual reality (VR) modules were then developed to train needle grasping and needle insertion, two primitive tasks in a microsurgery suturing procedure. The modules provided the trainee with a stereoscopic visual display and information on grip, tool position, and tool angles. Using the VR module, a study examining the effects of visual cues on training tool orientation was conducted. Results from these studies suggest that it is possible to learn and improve accuracy in micromanipulation using appropriate sensorimotor feedback and training.

    Bimanual Motor Strategies and Handedness Role During Human-Exoskeleton Haptic Interaction

    Bimanual object manipulation involves multiple visuo-haptic sensory feedback signals arising from interaction with the environment, which are managed by the central nervous system and translated into motor commands. The kinematic strategies that occur during bimanual coupled tasks are still debated despite modern advances in haptics and robotics. Current technologies have the potential to provide realistic scenarios involving the entire upper limb during multi-joint movements but are not yet exploited to their full potential. The present study explores how the hands dynamically interact when manipulating a shared object, using two impedance-controlled exoskeletons programmed to simulate bimanually coupled manipulation of virtual objects. We enrolled twenty-six participants (two groups: right-handed and left-handed) who were asked to use both hands to grab simulated objects across the robot workspace and place them in specific locations. The virtual objects were rendered with different dynamic properties and textures, influencing the manipulation strategies required to complete the tasks. Results revealed that the roles of the hands are related to movement direction, haptic features, and handedness. The outcomes suggest that haptic feedback affects bimanual strategies depending on movement direction. However, left-handers showed better control of the force applied between the two hands, probably due to environmental pressure toward right-handed manipulations.

    Two-Hand Virtual Object Manipulation Based on Networked Architecture

    A setup for bimanual virtual object manipulation is described in this paper. The index finger and thumb are inserted into corresponding thimbles in order to perform virtual object manipulations. A gimbal with three rotational degrees of freedom connects each thimble to a serial-parallel mechanical structure with three actuated DoF. As a result, each finger has 6 DoF, and movements and forces can be reflected in any direction without any torque component. Scenarios for virtual manipulation are based on a distributed architecture in which each finger device has its own real-time controller. A computer receives the status of each finger and runs a simulation of the virtual object manipulation. The scenario state is updated at a rate of 200 Hz, while the information from the haptic controller is processed at 1 kHz, providing good realism for object manipulation.
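    The dual-rate architecture described above (a 1 kHz haptic control loop feeding a 200 Hz scenario update) can be sketched as a simple decimation loop. This is an illustrative sketch only; all names are hypothetical and the per-tick force computation is a placeholder, not the paper's controller.

    ```python
    # Hypothetical sketch of a dual-rate haptic architecture: each finger
    # device runs a 1 kHz control loop, and the shared scenario receives a
    # status update at 200 Hz, i.e., once every 5 haptic ticks.

    HAPTIC_HZ = 1000
    SCENARIO_HZ = 200
    TICKS_PER_UPDATE = HAPTIC_HZ // SCENARIO_HZ  # 5 haptic ticks per update

    def compute_force(tick: int) -> float:
        # Placeholder for the per-tick haptic rendering computation
        # (contact detection, impedance control, force reflection).
        return 0.0

    def run(total_ticks: int) -> list[tuple[int, float]]:
        """Run the 1 kHz loop; collect the 200 Hz scenario updates."""
        scenario_updates = []
        for tick in range(total_ticks):
            force = compute_force(tick)          # 1 kHz work
            if tick % TICKS_PER_UPDATE == 0:     # 200 Hz decimation
                scenario_updates.append((tick, force))
        return scenario_updates

    # One second of simulated time: 1000 haptic ticks, 200 scenario updates.
    updates = run(1000)
    ```

    The design point this illustrates is that the stiff haptic loop must run much faster than the networked scenario update to remain stable, so only every fifth sample crosses the network boundary.
    
    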

    Phrasing Bimanual Interaction for Visual Design

    Architects and other visual thinkers create external representations of their ideas to support early-stage design. They compose visual imagery with sketching to form abstract diagrams as representations. When working with digital media, they apply various visual operations to transform representations, often engaging in complex sequences. This research investigates how to build interactive capabilities that support designers in putting together, that is phrasing, sequences of operations using both hands. In particular, we examine how phrasing interactions with pen and multi-touch input can support modal switching among different visual operations that, in many commercial design tools, require menus and tool palettes—techniques originally designed for the mouse, not pen and touch. We develop an interactive bimanual pen+touch diagramming environment and study its use in landscape architecture design studio education. We observe the interesting forms of interaction that emerge and how our bimanual interaction techniques support visual design processes. Based on the needs of architects, we develop LayerFish, a new bimanual technique for layering overlapping content, and conduct a controlled experiment to evaluate its efficacy. We explore the use of wearables to identify which user, and distinguish which hand, is touching, to support phrasing together direct-touch interactions on large displays. From the design and development of the environment and both field and controlled studies, we derive a set of methods, based on human bimanual specialization theory, for phrasing modal operations through bimanual interactions without menus or tool palettes.

    Relevance of grasp types to assess functionality for personal autonomy

    Study Design: Cross-sectional research design. Introduction: Current assessment of hand function is not focused on evaluating the real abilities required for autonomy. Purpose of the Study: To quantify the relevance of grasp types for autonomy to guide hand recovery and its assessment. Methods: Representative tasks of the International Classification of Functioning, Disability and Health activities in which the hands are directly involved were recorded. The videos were analyzed to identify the grasps used with each hand, and their relevance for autonomy was determined by weighting time with the frequency of appearance of each activity in disability and dependency scales. Relevance is provided globally and distinguished by hand (right-left) and bimanual function. Significant differences in relevance are also checked. Results: The most relevant grasps are pad-to-pad pinch (31.9%), lumbrical (15.4%), cylindrical (12%), and special pinch (7.3%), together with nonprehensile (18.6%) use of the hand. The lumbrical grasp has higher relevance for the left hand (19.9% vs 12%), while the cylindrical grasp is more relevant for the right hand (15.3% vs 7.7%). Relevancies also differ depending on bimanual function. Discussion: Different relative importance was obtained when considering dependency vs disability scales. Pad-to-pad pinch and nonprehensile grasp are the most relevant grasps for both hands, whereas the lumbrical grasp is more relevant for the left hand and the cylindrical grasp for the right. The most significant difference in bimanual function refers to pad-to-pad pinch (more relevant for unimanual actions of the left hand and bimanual actions of the right).
    Conclusions: The relative importance of each grasp type for autonomy and the differences observed between hand and bimanual action should be used in medical and physical decision-making. This research was funded by the Universitat Jaume I through projects P1·1B2013-33 and P1-1B2014-10, and by the Spanish Ministry of Research and Innovation and the European Union (European Regional Development Funds) through project DPI2014-52095-P.
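    The weighting scheme in the Methods section (time in each grasp weighted by the frequency of the activity in disability/dependency scales, then normalized) can be made concrete with a small sketch. The function name, the input format, and the sample numbers below are all hypothetical illustrations, not values from the study.

    ```python
    # Illustrative (hypothetical) grasp-relevance computation: accumulate
    # time-in-grasp weighted by the activity's scale frequency, then
    # normalize so the relevancies sum to 100%.

    def grasp_relevance(observations: list[tuple[str, float, float]]) -> dict[str, float]:
        """observations: (grasp_name, seconds_observed, activity_weight) tuples."""
        totals: dict[str, float] = {}
        for grasp, seconds, weight in observations:
            totals[grasp] = totals.get(grasp, 0.0) + seconds * weight
        grand_total = sum(totals.values())
        return {g: 100.0 * v / grand_total for g, v in totals.items()}

    # Hypothetical sample data, not figures from the paper.
    relevance = grasp_relevance([
        ("pad-to-pad pinch", 30.0, 2.0),
        ("lumbrical", 40.0, 1.0),
        ("cylindrical", 10.0, 3.0),
    ])
    ```

    With this scheme, a grasp used briefly in frequently weighted activities can outrank one used for longer in rarely weighted ones, which is the point of weighting time by scale frequency rather than using raw duration.
    
    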

    Prevalence of haptic feedback in robot-mediated surgery : a systematic review of literature

    © 2017 Springer-Verlag. This is a post-peer-review, pre-copyedit version of an article published in Journal of Robotic Surgery. The final authenticated version is available online at: https://doi.org/10.1007/s11701-017-0763-4. With the successful uptake and inclusion of robotic systems in minimally invasive surgery, and with the increasing application of robotic surgery (RS) in numerous surgical specialities worldwide, there is now a need to develop and enhance the technology further. One such improvement is the implementation of haptic feedback technology in RS, which would permit the operating surgeon at the console to receive haptic information about the type of tissue being operated on. The main advantage is to allow the operating surgeon to feel and control the amount of force applied to different tissues during surgery, thus minimising the risk of tissue damage due to both the direct and indirect effects of excessive tissue force or tension applied during RS. We performed a two-rater systematic review to identify the latest developments and potential avenues for improvement in the application and implementation of haptic feedback technology for the operating surgeon at the console during RS. This review provides a summary of technological enhancements in RS, considering different stages of work, from proof of concept to cadaver tissue testing, surgery in animals, and finally real implementation in surgical practice. We identify that, at the time of this review, while there is unanimous agreement regarding the need for haptic and tactile feedback, there are no solutions or products available that address this need. There is scope and need for new developments in haptic augmentation for robot-mediated surgery with the aim of further improving patient care and robotic surgical technology. Peer reviewed.