
    Unimanual and Bimanual Weight Perception of Virtual Objects with a new Multi-finger Haptic Interface

    Accurate weight perception is particularly important in tasks where the user has to apply vertical forces, for example to ensure the safe landing of a fragile object or the precise penetration of a surface with a probe. Moreover, depending on physical properties of objects such as weight and size, we may switch between unimanual and bimanual manipulation during a task. Research has shown that bimanual manipulation of real objects results in a misperception of their weight: they tend to feel lighter than similarly heavy objects handled with one hand only [8]. Effective simulation of bimanual manipulation with desktop haptic interfaces should replicate this effect on weight perception. Here, we present the MasterFinger-2, a new multi-finger haptic interface that allows bimanual manipulation of virtual objects with a precision grip, and we conduct weight discrimination experiments to evaluate its capacity to simulate unimanual and bimanual weight. We found that the bimanual ‘lighter’ bias is also observed with the MasterFinger-2, but sensitivity to changes of virtual weight deteriorated.
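    Sensitivity in such weight discrimination experiments is commonly summarized by the Weber fraction (the just-noticeable difference divided by the reference weight). The following is a minimal sketch of that standard analysis, not the paper's actual procedure; the trial data, weights, and interpolation method are illustrative assumptions.

```python
# Hedged sketch: estimating a Weber fraction from two-alternative
# weight-discrimination responses. All data below are illustrative.
from collections import defaultdict

def weber_fraction(trials, reference):
    """trials: list of (comparison_weight, judged_heavier: bool).
    Returns JND / reference, where the JND is half the distance between
    the 25% and 75% points of the psychometric function (linear interp)."""
    by_weight = defaultdict(list)
    for w, heavier in trials:
        by_weight[w].append(heavier)
    pts = sorted((w, sum(r) / len(r)) for w, r in by_weight.items())

    def crossing(p_target):
        # Linearly interpolate the weight at which the proportion of
        # "heavier" responses crosses p_target.
        for (w0, p0), (w1, p1) in zip(pts, pts[1:]):
            if p0 <= p_target <= p1:
                return w0 + (w1 - w0) * (p_target - p0) / (p1 - p0)
        raise ValueError("target proportion not spanned by data")

    jnd = (crossing(0.75) - crossing(0.25)) / 2
    return jnd / reference

# Illustrative data: 300 g reference, comparisons 240-360 g, 20 trials each.
trials = []
for w, p in {240: 0.05, 270: 0.2, 300: 0.5, 330: 0.8, 360: 0.95}.items():
    k = round(p * 20)
    trials += [(w, True)] * k + [(w, False)] * (20 - k)

print(round(weber_fraction(trials, 300), 3))  # prints 0.083
```

    A deteriorated sensitivity, as reported for the virtual condition, would show up here as a larger Weber fraction.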

    Multi-scale gestural interaction for augmented reality

    We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.
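    One way to combine the two scales is to route each sensed event to a coarse or fine handler depending on which sensor produced it. This is a hedged sketch of that idea only; the event fields, sensor names, and handlers are illustrative assumptions, not the prototype's API.

```python
# Hedged sketch: dispatching gesture events by scale. Arm/hand motions
# drive coarse direct manipulation (macro); finger gestures from the
# belt-worn sensor drive precise adjustment (micro). All names invented.

def route(event, handlers):
    """Pick a handler by gesture scale based on the event's source sensor."""
    scale = "micro" if event["source"] == "belt_sensor" else "macro"
    return handlers[scale](event)

handlers = {
    "macro": lambda e: f"grab object at {e['target']}",
    "micro": lambda e: f"nudge by {e['delta_mm']} mm",
}

print(route({"source": "hand_tracker", "target": "cube"}, handlers))
# prints: grab object at cube
print(route({"source": "belt_sensor", "delta_mm": 2}, handlers))
# prints: nudge by 2 mm
```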

    Dexterous manipulation of unknown objects using virtual contact points

    The manipulation of unknown objects is a problem of special interest in robotics, since it is not always possible to have exact models of the objects with which a robot interacts. This paper presents a simple strategy to manipulate unknown objects using a robotic hand equipped with tactile sensors. The hand configurations that allow the rotation of an unknown object are computed using only tactile and kinematic information obtained during the manipulation process, reasoning about the desired and real positions of the fingertips. The desired fingertip positions are not physically reachable, since they lie in the interior of the manipulated object; they are therefore virtual positions with associated virtual contact points. The proposed approach was validated using three fingers of an anthropomorphic robotic hand (Allegro Hand) whose original fingertips were replaced by tactile sensors (WTS-FT). In the experimental validation, several everyday objects with different shapes were successfully rotated without knowledge of their shape or any other physical property.
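    The core geometric idea can be sketched as placing each fingertip target slightly inside the object along the sensed contact normal, so that servoing toward the (unreachable) virtual point maintains contact pressure during rotation. This is a minimal illustration of that idea only; the function names and penetration depth are assumptions, not the paper's implementation.

```python
# Hedged sketch of a virtual contact point: the fingertip target is
# pushed a small depth past the sensed contact, along the inward
# (negated) contact normal. Names and the 5 mm depth are illustrative.
import math

def virtual_target(contact_point, contact_normal, depth=0.005):
    """Return a target `depth` metres inside the object surface,
    given a sensed contact point and its outward unit-or-unnormalized
    normal (both 3-tuples, metres)."""
    n = math.sqrt(sum(c * c for c in contact_normal))
    unit = tuple(c / n for c in contact_normal)  # normalize the normal
    return tuple(p - depth * u for p, u in zip(contact_point, unit))

# Contact at (0.1, 0, 0.2) with outward normal +x: target sits 5 mm inside.
print(virtual_target((0.1, 0.0, 0.2), (1.0, 0.0, 0.0)))
```

    Because the target is inside the object, the position error never reaches zero, which is what keeps the fingertip pressing on the surface while the grasp reconfigures.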

    MasterFinger: Multi-finger Haptic Interface for Collaborative Environments

    This paper introduces the development and application of the MasterFinger, a multi-finger haptic interface for virtual object manipulation. This haptic device, with a modular interface, is specially designed for collaborative tasks. Each module manages the haptic interaction with one finger. The mechanical structure of each module is based on a serial-parallel structure linked to the finger thimble by a gimbal, with its own controller. Cooperative applications based on the MasterFinger-2 (MF2) are also described in this study. Results from these applications show that the multi-finger interface is a significant advance in haptic devices, since precise object grasping and collaborative two-handed manipulation are performed successfully.

    Improving grasping forces during the manipulation of unknown objects

    Many of the solutions proposed for the object manipulation problem rely on knowledge of the object's features. The approach proposed in this paper provides a simple geometrical method to securely manipulate an unknown object based only on tactile and kinematic information. The tactile and kinematic data obtained during manipulation are used to recognize the object's shape (at least the local object curvature), and adding this information to the manipulation strategy improves the grasping forces. The approach has been fully implemented and tested using the Schunk Dexterous Hand (SDH2). Experimental results illustrate the efficiency of the approach.
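    A common way to use a local curvature estimate in a grasping strategy is to scale the commanded normal force with curvature, since sharper local geometry tends to mean a smaller contact patch. The sketch below illustrates that generic idea under stated assumptions; the gain, bounds, and linear law are invented for illustration and are not the paper's control law.

```python
# Hedged sketch: adapting grasp normal force to locally estimated
# object curvature. Gain and force bounds are illustrative assumptions.

def grasp_force(base_force, curvature, gain=0.5, f_min=1.0, f_max=10.0):
    """Scale the commanded normal force (N) with local surface curvature
    (1/m), clamped to actuator-safe bounds: flatter regions get the base
    force, sharper regions get proportionally more."""
    f = base_force * (1.0 + gain * curvature)
    return max(f_min, min(f_max, f))

print(grasp_force(2.0, 0.0))  # flat face: prints 2.0
print(grasp_force(2.0, 4.0))  # curved edge: 2.0 * (1 + 0.5 * 4) -> prints 6.0
```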

    Prop-Based Haptic Interaction with Co-location and Immersion: an Automotive Application

    Most research on 3D user interfaces provides only a single sensory modality. One challenge is to integrate several sensory modalities into a seamless system while preserving each modality's immersion and performance. This paper addresses manipulation tasks and proposes a visuo-haptic system integrating immersive visualization with co-located force and tactile feedback. An industrial automotive application is presented.