
    Electrostatic Friction Displays to Enhance Touchscreen Experience

    Touchscreens are versatile devices that can display visual content and receive touch input, but they lack the ability to provide programmable tactile feedback. This limitation has been addressed by a few approaches collectively called surface haptics technology. This technology modulates the friction between a user’s fingertip and a touchscreen surface to create different tactile sensations as the finger explores the touchscreen. This functionality enables the user to see and feel digital content simultaneously, leading to improved usability and user experiences. One major approach in surface haptics relies on the electrostatic force induced between the finger and an insulating surface on the touchscreen by supplying a high AC voltage. The use of AC also induces in the user a vibrational sensation called electrovibration. Electrostatic friction displays require only electrical components and provide uniform friction over the screen. This tactile feedback technology not only allows easy and lightweight integration into touchscreen devices but also provides dynamic, rich, and satisfying user interfaces. In this chapter, we review the fundamental operation of electrovibration technology as well as the applications that have been built upon it.
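
    A minimal numerical sketch of the friction modulation described above, assuming the common parallel-plate model of the finger-insulator interface; all parameter values (contact area, permittivity, gap, drive voltage) are illustrative, not taken from the chapter:

        # Parallel-plate model of electrovibration: the AC drive voltage
        # induces an extra electrostatic normal force, which raises the
        # sliding friction felt by the finger.
        import numpy as np

        EPS0 = 8.854e-12  # vacuum permittivity (F/m)

        def electrostatic_force(v, area=1e-4, eps_r=3.0, gap=50e-6):
            """Attractive force for applied voltage v (parallel-plate model)."""
            return EPS0 * eps_r * area * v**2 / (2.0 * gap**2)

        def friction_force(v, normal_force=0.5, mu=0.8):
            """Sliding friction felt by the finger: mu * (F_n + F_e)."""
            return mu * (normal_force + electrostatic_force(v))

        # 1 kHz AC excitation: because the force scales with v**2, the felt
        # vibration appears at twice the electrical frequency (2 kHz).
        t = np.linspace(0.0, 2e-3, 1000)
        v = 200.0 * np.sin(2 * np.pi * 1e3 * t)  # 200 V peak drive
        f = friction_force(v)
        print(f"friction range: {f.min():.3f} .. {f.max():.3f} N")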

    Reliable non-prehensile door opening through the combination of vision, tactile and force feedback

    Whereas vision and force feedback, either at the wrist or at the joint level, for robotic manipulation purposes has received considerable attention in the literature, the benefits that tactile sensors can provide when combined with vision and force have rarely been explored. In fact, there are some situations in which vision and force feedback cannot guarantee robust manipulation. Vision is frequently subject to calibration errors, occlusions and outliers, whereas force feedback can only provide useful information in those directions that are constrained by the environment. In tasks where the visual feedback contains errors and the contact configuration does not constrain all the Cartesian degrees of freedom, vision and force sensors are not sufficient to guarantee a successful execution. Many of the tasks performed in our daily life that do not require a firm grasp belong to this category. Therefore, it is important to develop strategies for robustly dealing with these situations. In this article, a new framework for combining tactile information with vision and force feedback is proposed and validated with the task of opening a sliding door. Results show how the vision-tactile-force approach outperforms the vision-force and force-alone approaches, in the sense that it corrects the vision errors while a suitable contact configuration is guaranteed.

    This research was partly supported by the Korea Science and Engineering Foundation under the WCU (World Class University) program funded by the Ministry of Education, Science and Technology, S. Korea (Grant No. R31-2008-000-10062-0), by the European Commission’s Seventh Framework Programme FP7/2007-2013 under grant agreements 217077 (EYESHOTS project) and 248497 (TRIDENT project), by Ministerio de Ciencia e Innovación (DPI-2008-06636 and DPI2008-06548-C03-01), by Fundació Caixa Castelló-Bancaixa (P1-1B2008-51 and P1-1B2009-50), and by Universitat Jaume I.
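
    A hedged sketch of how the three modalities might be combined in one velocity-control step for the sliding-door task. This is a generic hybrid scheme illustrating the idea (the tactile term corrects vision errors while the force term maintains contact); it is not the authors' actual controller, and all gains, the frame convention, and helper names are hypothetical:

        # One control cycle of a hypothetical vision-tactile-force scheme
        # for sliding a door along its rail (frame and gains illustrative).
        import numpy as np

        K_F = 0.002   # force gain (m/s per N)
        K_T = 0.5     # tactile-centering gain (1/s)
        F_DES = 5.0   # desired pushing force against the handle (N)
        PUSH = np.array([0.0, 0.0, -1.0])  # axis used to push on the handle

        def control_step(vision_dir, f_push, tactile_offset):
            """Return a Cartesian velocity command (3-vector).

            vision_dir     -- rail direction estimated visually (may be biased)
            f_push         -- measured force along the pushing axis (N)
            tactile_offset -- contact-centroid offset on the fingertip pad,
                              orthogonal to the rail (m)
            """
            vision_dir = vision_dir / np.linalg.norm(vision_dir)
            v_slide = 0.02 * vision_dir                # nominal sliding motion
            v_force = K_F * (F_DES - f_push) * PUSH    # keep contact force
            # Tactile term: steer sideways so the contact stays centered on
            # the pad, compensating a mis-estimated rail direction.
            lateral = np.cross(PUSH, vision_dir)
            v_tactile = -K_T * tactile_offset * lateral
            return v_slide + v_force + v_tactile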

    Inter-finger Small Object Manipulation with DenseTact Optical Tactile Sensor

    The ability to grasp and manipulate small objects in cluttered environments remains a significant challenge. This paper introduces a novel approach that utilizes a tactile sensor-equipped gripper with eight degrees of freedom to overcome these limitations. We employ DenseTact 2.0 for the gripper, enabling precise control and improved grasp success rates, particularly for small objects ranging from 5 mm to 25 mm. Our integrated strategy incorporates the robot arm, gripper, and sensor to effectively manipulate and orient small objects for subsequent classification. We contribute a specialized dataset designed for classifying these objects based on tactile sensor output and a new control algorithm for in-hand orientation tasks. Our system achieves an 88% grasp success rate and successfully classifies small objects in cluttered scenarios.
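
    Since DenseTact is an optical tactile sensor whose output is an image, classification from its output can be sketched as a small convolutional network; the architecture, input size, and class count below are illustrative stand-ins, not the paper's network:

        # Toy tactile-image classifier (PyTorch); input is an RGB tactile
        # frame, output is a score per object class.
        import torch
        import torch.nn as nn

        class TactileClassifier(nn.Module):
            def __init__(self, num_classes=10):  # hypothetical class count
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(64, num_classes)

            def forward(self, x):  # x: (B, 3, H, W) tactile image
                return self.head(self.features(x).flatten(1))

        model = TactileClassifier()
        logits = model(torch.randn(1, 3, 128, 128))  # dummy tactile frame
        print(logits.argmax(dim=1))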

    Dexterous In-Hand Manipulation of Slender Cylindrical Objects through Deep Reinforcement Learning with Tactile Sensing

    Continuous in-hand manipulation is an important physical interaction skill, where tactile sensing provides indispensable contact information for dexterous manipulation of small objects. This work proposes a framework for end-to-end policy learning with tactile feedback and sim-to-real transfer that achieves fine in-hand manipulation: it controls the pose of a thin cylindrical object, such as a long stick, to track various continuous trajectories through multiple contacts of the three fingertips of a dexterous robot hand equipped with tactile sensor arrays. We estimate the central contact position between the stick and each fingertip from the high-dimensional tactile information and show that the learned policies achieve effective manipulation performance with the processed tactile feedback. The policies were trained with deep reinforcement learning in simulation and successfully transferred to real-world experiments using coordinated model calibration and domain randomization. We evaluate the effectiveness of tactile information via comparative studies and validate the sim-to-real performance through real-world experiments.

    Comment: 10 pages, 12 figures, submitted to Transactions on Mechatronics
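
    The contact-position estimate mentioned above can be illustrated with a pressure-weighted centroid over the fingertip taxel array, one standard reduction of high-dimensional tactile data; the paper's exact processing may differ, and the threshold value is illustrative:

        # Estimate the central contact position on a tactile array as the
        # pressure-weighted centroid of the taxels above a noise threshold.
        import numpy as np

        def contact_centroid(pressures, threshold=0.05):
            """pressures: (rows, cols) array -> (row, col) centroid or None."""
            p = np.where(pressures > threshold, pressures, 0.0)
            total = p.sum()
            if total == 0.0:
                return None  # no contact detected
            rows, cols = np.indices(p.shape)
            return (float((rows * p).sum() / total),
                    float((cols * p).sum() / total))

        # Example: a synthetic contact blob near the array center.
        arr = np.zeros((8, 8))
        arr[3:5, 4:6] = [[0.2, 0.4], [0.3, 0.5]]
        print(contact_centroid(arr))  # approx (3.57, 4.64)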

    ThimbleSense: a fingertip-wearable tactile sensor for grasp analysis

    Accurate measurement of contact forces between the hand and grasped objects is crucial to study sensorimotor control during grasp and manipulation. In this work we introduce ThimbleSense, a prototype of an individual-digit wearable force/torque sensor based on the principle of intrinsic tactile sensing. By integrating this approach with an active marker-based motion capture system, the proposed device simultaneously measures the absolute position and orientation of the fingertip, which in turn yields measurements of contact points and force components expressed in a global reference frame. The main advantage of this approach with respect to more conventional solutions is its versatility. Specifically, ThimbleSense can be used to study grasping and manipulation of a wide variety of objects, while still retaining complete force/torque measurements. Nevertheless, validation of the proposed device is a necessary step before it can be used for experimental purposes. In this work we present the results of a series of experiments designed to validate the accuracy of ThimbleSense measurements and to evaluate the effects on grasp forces of the distortion of tactile afferent inputs caused by the device's rigid shells.
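
    The principle of intrinsic tactile sensing that ThimbleSense builds on can be sketched as follows: with a single point contact on a rigid shell, the measured torque satisfies tau = r x f, which pins down the contact point r once the shell geometry is known. The sketch assumes a hard-finger contact and a spherical shell centered at the sensor origin, which simplifies ThimbleSense's real fingertip geometry:

        # Recover the contact point on a rigid spherical shell from one
        # force/torque reading (hard-finger contact: tau = r x f).
        import numpy as np

        def contact_point_on_sphere(f, tau, radius):
            f = np.asarray(f, float)
            tau = np.asarray(tau, float)
            f2 = f @ f
            r0 = np.cross(f, tau) / f2  # particular solution, r0 . f = 0
            # Every r = r0 + t*f satisfies r x f = tau (since tau is
            # orthogonal to f for a pure point force); intersect that
            # line with the shell |r| = radius.
            disc = radius**2 - r0 @ r0
            if disc < 0:
                raise ValueError("reading inconsistent with shell geometry")
            t = np.sqrt(disc / f2)
            candidates = (r0 + t * f, r0 - t * f)
            # Keep the intersection where the force pushes into the shell
            # (f . outward_normal < 0).
            return min(candidates, key=lambda r: f @ r)

        # Example: contact at the top of a 1 cm shell, pushed obliquely.
        f = np.array([0.0, 1.0, -1.0])
        r_true = np.array([0.0, 0.0, 0.01])
        print(contact_point_on_sphere(f, np.cross(r_true, f), 0.01))
        # -> [0. 0. 0.01]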

    Data-Driven Grasp Synthesis - A Survey

    We review the work on data-driven grasp synthesis and the methodologies for sampling and ranking candidate grasps. We divide the approaches into three groups based on whether they synthesize grasps for known, familiar, or unknown objects. This structure allows us to identify common object representations and perceptual processes that facilitate the employed data-driven grasp synthesis technique. In the case of known objects, we concentrate on approaches that are based on object recognition and pose estimation. In the case of familiar objects, the techniques use some form of similarity matching to a set of previously encountered objects. Finally, for approaches dealing with unknown objects, the core part is the extraction of specific features that are indicative of good grasps. Our survey provides an overview of the different methodologies and discusses open problems in the area of robot grasping. We also draw a parallel to classical approaches that rely on analytic formulations.

    Comment: 20 pages, 30 figures, submitted to IEEE Transactions on Robotics
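
    The common skeleton behind the surveyed methods, sampling candidate grasps and ranking them with a quality function, can be sketched generically; the sampler and scorer below are deliberately trivial placeholders for the recognition-, similarity-, or feature-based components the survey categorizes:

        # Generic sample-then-rank skeleton of data-driven grasp synthesis.
        from dataclasses import dataclass
        import random

        @dataclass
        class Grasp:
            position: tuple  # grasp point near the object (x, y, z), m
            width: float     # gripper opening, m

        def sample_candidates(n=100):
            """Placeholder; real systems sample from object geometry."""
            return [Grasp(tuple(random.uniform(-0.1, 0.1) for _ in range(3)),
                          random.uniform(0.02, 0.08)) for _ in range(n)]

        def score(grasp):
            """Placeholder quality score standing in for a learned ranker."""
            return -abs(grasp.width - 0.05)  # e.g. prefer mid-range openings

        best = max(sample_candidates(), key=score)
        print(best)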