
    Tactile sensors for robotic applications

    In recent years, tactile sensing has become a key enabling technology for implementing complex tasks with robotic systems [...]

    Spatial patterns of cutaneous vibration during whole-hand haptic interactions

    We investigated the propagation patterns of cutaneous vibration in the hand during interactions with touched objects. Prior research has highlighted the importance of vibrotactile signals during haptic interactions, but little is known about how vibrations propagate throughout the hand. Furthermore, the extent to which the patterns of vibration reflect the nature of the objects that are touched, and how they are touched, is unknown. Using an apparatus comprising an array of accelerometers, we mapped and analyzed the spatial distributions of vibrations propagating in the skin of the dorsal region of the hand during active touch, grasping, and manipulation tasks. We found these spatial patterns of vibration to vary systematically with touch interactions and determined that it is possible to use these data to decode the mode of interaction with touched objects. The observed vibration patterns evolved rapidly in time, peaking in intensity within a few milliseconds, fading within 20–30 ms, and yielding interaction-dependent distributions of energy in frequency bands that span the range of vibrotactile sensitivity. These results are consistent with findings in perception research indicating that vibrotactile information distributed throughout the hand can transmit information about explored and manipulated objects. The results may further clarify the role of distributed sensory resources in the perceptual recovery of object attributes during active touch, may guide the development of approaches to robotic sensing, and could have implications for the rehabilitation of the upper extremity.
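
    The decoding result suggests a simple feature pipeline: band-limited energy per accelerometer channel, followed by a classifier over the resulting spatial pattern. Below is a minimal illustrative sketch of that idea, not the authors' code; the band edges and the nearest-centroid classifier are assumptions.

        # Illustrative sketch: decode the mode of interaction with a touched
        # object from band energies of skin vibration measured by an
        # accelerometer array. Band edges and classifier are assumptions.
        import numpy as np

        def band_energies(signals, fs, bands=((20, 100), (100, 400), (400, 1000))):
            """signals: (n_channels, n_samples) accelerations; returns (n_channels, n_bands)."""
            spectrum = np.abs(np.fft.rfft(signals, axis=1)) ** 2
            freqs = np.fft.rfftfreq(signals.shape[1], d=1.0 / fs)
            return np.stack(
                [spectrum[:, (freqs >= lo) & (freqs < hi)].sum(axis=1) for lo, hi in bands],
                axis=1)

        def decode_interaction(features, centroids):
            """Nearest-centroid guess of the interaction mode (tap, grasp, slide, ...).
            features: flattened (n_channels * n_bands,) vector; centroids: dict mode -> vector."""
            return min(centroids, key=lambda m: np.linalg.norm(features - centroids[m]))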

    Survey of Visual and Force/Tactile Control of Robots for Physical Interaction in Spain

    Sensors provide robotic systems with the information required to perceive changes in unstructured environments and modify their actions accordingly. The robotic controllers that process and analyze this sensory information are usually based on three types of sensors (visual, force/torque, and tactile), which correspond to the most widespread robotic control strategies: visual servoing control, force control, and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques, and applications developed by Spanish researchers to implement these controllers, both mono-sensor and multi-sensor designs that combine several sensors.
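
    For readers unfamiliar with the first of these strategies, the core of image-based visual servoing is a one-line control law: drive the camera with a velocity proportional to the feature error mapped through the pseudo-inverse of the interaction matrix. A textbook sketch follows, illustrative only and not code from any of the surveyed Spanish systems.

        # Textbook image-based visual servoing step: v = -gain * pinv(L) @ (s - s*),
        # where L is the interaction (image Jacobian) matrix for the current features.
        import numpy as np

        def ibvs_velocity(s, s_star, L, gain=0.5):
            """Camera velocity command from measured features s and desired features s_star."""
            return -gain * np.linalg.pinv(L) @ (s - s_star)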

    Instrumentation, Data, And Algorithms For Visually Understanding Haptic Surface Properties

    Autonomous robots need to efficiently walk over varied surfaces and grasp diverse objects. We hypothesize that the association between how such surfaces look and how they physically feel during contact can be learned from a database of matched haptic and visual data recorded from various end-effectors' interactions with hundreds of real-world surfaces. Testing this hypothesis required the creation of a new multimodal sensing apparatus, the collection of a large multimodal dataset, and the development of a machine-learning pipeline. This thesis begins by describing the design and construction of the Portable Robotic Optical/Tactile ObservatioN PACKage (PROTONPACK, or Proton for short), an untethered handheld sensing device that emulates the capabilities of the human senses of vision and touch. Its sensory modalities include RGBD vision, egomotion, contact force, and contact vibration. Three interchangeable end-effectors (a steel tooling ball, an OptoForce three-axis force sensor, and a SynTouch BioTac artificial fingertip) allow for different material properties at the contact point and provide additional tactile data. We then detail the calibration process for the motion and force sensing systems, as well as several proof-of-concept surface discrimination experiments that demonstrate the reliability of the device and the utility of the data it collects. This thesis then presents a large-scale dataset of multimodal surface interaction recordings, including 357 unique surfaces such as furniture, fabrics, outdoor fixtures, and items from several private and public material sample collections. Each surface was touched with one, two, or three end-effectors, comprising approximately one minute per end-effector of tapping and dragging at various forces and speeds. We hope that the larger community of robotics researchers will find broad applications for the published dataset. Lastly, we demonstrate an algorithm that learns to estimate haptic surface properties given visual input. Surfaces were rated on hardness, roughness, stickiness, and temperature by the human experimenter and by a pool of purely visual observers. We then trained an algorithm to perform the same task, as well as to infer quantitative properties calculated from the haptic data. Overall, the task of predicting haptic properties from vision alone proved difficult for both humans and computers, but a hybrid algorithm using a deep neural network and a support vector machine achieved correlations between predicted and actual regression outputs of approximately ρ = 0.3 to ρ = 0.5 on previously unseen surfaces.
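
    As a rough sketch of how such a hybrid vision-to-haptics regressor can be assembled (the thesis' exact architecture and features are not reproduced here; the pretrained-CNN embedding and per-property support vector regressor below are assumptions):

        # Minimal sketch: regress one haptic property (e.g., roughness) from
        # visual embeddings and score agreement as a rank correlation, in the
        # spirit of the reported rho = 0.3 to 0.5 on unseen surfaces.
        from scipy.stats import spearmanr
        from sklearn.svm import SVR

        def train_property_model(image_features, ratings):
            """image_features: (n_surfaces, d) CNN embeddings; ratings: (n_surfaces,) human scores."""
            return SVR(kernel="rbf", C=1.0).fit(image_features, ratings)

        def evaluate(model, test_features, test_ratings):
            rho, _ = spearmanr(model.predict(test_features), test_ratings)
            return rho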

    Sensors for Robotic Hands: A Survey of State of the Art

    Recent decades have seen significant progress in the field of artificial hands. Most surveys that try to capture the latest developments in this field have focused on the actuation and control systems of these devices. In this paper, our goal is to provide a comprehensive survey of the sensors for artificial hands. In order to present the evolution of the field, we cover five-year periods starting at the turn of the millennium. For each period, we present the robot hands with a focus on their sensor systems, dividing them into categories such as prosthetics, research devices, and industrial end-effectors. We also cover the sensors developed for robot hand usage in each era. Finally, the period between 2010 and 2015 introduces the reader to the state of the art and also hints at future directions in sensor development for artificial hands.

    Sense, Think, Grasp: A study on visual and tactile information processing for autonomous manipulation

    Interacting with the environment using hands is one of the distinctive abilities of humans with respect to other species. This aptitude is reflected in the crucial role played by object manipulation in the world that we have shaped for ourselves. With a view to bringing robots outside industries to support people in everyday life, the ability to manipulate objects autonomously and in unstructured environments is therefore one of the basic skills they need. Autonomous manipulation is characterized by great complexity, especially regarding the processing of sensor information to perceive the surrounding environment. Humans rely on vision for wide-ranging three-dimensional information, proprioception for awareness of the relative position of their own body in space, and the sense of touch for local information when physical interaction with objects happens. The study of autonomous manipulation in robotics aims at transferring similar perceptual skills to robots so that, combined with state-of-the-art control techniques, they can achieve similar performance in manipulating objects. The great complexity of this task makes autonomous manipulation one of the open problems in robotics, one that has drawn increasing research attention in recent years. In this thesis, we propose possible solutions to some key components of autonomous manipulation, focusing in particular on the perception problem and testing the developed approaches on the humanoid robotic platform iCub. When available, vision is the first source of information to be processed for inferring how to interact with objects. The object modeling and grasping pipeline we designed, based on superquadric functions, meets this need, since it reconstructs the object's 3D model from a partial point cloud and computes a suitable hand pose for grasping the object. Retrieving object information with touch sensors only is a relevant skill that becomes crucial when vision is occluded, as happens, for instance, during physical interaction with the object. We addressed this problem with the design of a novel tactile localization algorithm, named the Memory Unscented Particle Filter, capable of localizing and recognizing objects relying solely on 3D contact points collected on the object surface. Another key point of autonomous manipulation we report on in this thesis is bi-manual coordination. The execution of more advanced manipulation tasks may in fact require the use and coordination of two arms. Tool usage, for instance, often requires a proper in-hand object pose that can be obtained via dual-arm re-grasping. In pick-and-place tasks, the initial and target positions of the object sometimes do not belong to the same arm's workspace, requiring one hand to lift the object and the other to place it in the new position. In this regard, we implemented a pipeline for executing the handover task, i.e., the sequence of actions for autonomously passing an object from one robot hand to the other. The contributions described thus far address specific subproblems of the more complex task of autonomous manipulation. This actually differs from what humans do, in that humans develop their manipulation skills by learning through experience and trial-and-error strategies. A proper mathematical formulation for encoding this learning approach is given by Deep Reinforcement Learning, which has recently proved successful in many robotics applications. For this reason, in this thesis we also report on the six-month experience carried out at the Berkeley Artificial Intelligence Research laboratory with the goal of studying Deep Reinforcement Learning and its application to autonomous manipulation.
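
    To make the superquadric step concrete, a standard formulation (following Solina and Bajcsy's inside-outside function; the thesis' exact cost and the subsequent grasp-pose optimization are omitted) fits five shape parameters to a partial point cloud. A minimal sketch, assuming the points are already expressed in the object's reference frame:

        # Sketch of superquadric fitting to a partial point cloud. The initial
        # guess and bounds below are assumptions, not the thesis' values.
        import numpy as np
        from scipy.optimize import least_squares

        def inside_outside(points, a1, a2, a3, e1, e2):
            """Superquadric implicit function; F = 1 on the surface."""
            x, y, z = np.abs(points.T)
            return ((x / a1) ** (2 / e2) + (y / a2) ** (2 / e2)) ** (e2 / e1) \
                   + (z / a3) ** (2 / e1)

        def fit_superquadric(points):
            def residuals(p):
                a1, a2, a3, e1, e2 = p
                # Scale by sqrt(volume) so smaller superquadrics are not favored.
                return np.sqrt(a1 * a2 * a3) * (inside_outside(points, *p) ** e1 - 1.0)
            p0 = np.array([0.05, 0.05, 0.05, 1.0, 1.0])  # metres; assumed initial guess
            bounds = ([1e-3] * 3 + [0.1, 0.1], [0.5] * 3 + [2.0, 2.0])
            return least_squares(residuals, p0, bounds=bounds).x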

    Tactile Sensors for Friction Estimation and Incipient Slip Detection - Toward Dexterous Robotic Manipulation: A Review

    Humans can handle and manipulate objects with ease; however, human dexterity has yet to be matched by artificial systems. Receptors in our fingers and hands provide essential tactile information to the motor control system during dexterous manipulation, such that the grip force is scaled to the tangential forces according to the coefficient of friction. Likewise, tactile sensing will become essential for robotic and prosthetic gripping performance as applications move toward unstructured environments. However, most existing research ignores the need to sense the frictional properties of the sensor-object interface, which (along with contact forces and torques) is essential for finding the minimum grip force required to securely grasp an object. Here, we review this problem by surveying the field of tactile sensing from the perspective that sensors should: 1) detect gross slip (to adjust the grip force); 2) detect incipient slip (dependent on the frictional properties of the sensor-object interface and the geometries and mechanics of the sensor and the object) as an indication of grip security; or 3) measure friction on contact with an object and/or following a gross or incipient slip event while manipulating an object. Recommendations are made to help focus future sensor design efforts toward a generalizable and practical solution to sense, and hence control, grip security. Specifically, we propose that the sensor mechanics should encourage incipient slip by allowing parts of the sensor to slip while other parts remain stuck, and that instrumentation should measure displacement and deformation to complement conventional force, pressure, and vibration tactile sensing.
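
    The grip-force scaling that the review builds on reduces, for a simple two-finger pinch, to a one-line rule: each contact's normal force must exceed its tangential load divided by the coefficient of friction. A worked sketch follows; the 1.2 safety factor is an assumption, and real controllers must also handle torsion and noisy friction estimates.

        def min_grip_force(tangential_force, mu, safety_margin=1.2):
            """Minimum normal force per contact to avoid gross slip:
            F_n >= F_t / mu, inflated by an assumed safety factor."""
            return safety_margin * tangential_force / mu

        # Example: a 0.4 kg object held in a two-finger pinch with mu = 0.5.
        # Each finger carries F_t = 0.5 * 0.4 * 9.81 ~ 1.96 N, so it must
        # squeeze with at least 1.2 * 1.96 / 0.5 ~ 4.7 N of normal force.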

    Wearable Technology For Healthcare And Athletic Performance

    Wearable technology research has led to advancements in healthcare and athletic performance. Devices range from one-size-fits-all fitness trackers to custom-fitted devices with tailored algorithms. Because these devices are comfortable, discreet, and pervasive in everyday life, custom solutions can be created to fit an individual's specific needs. In this dissertation, we design wearable sensors, develop features and algorithms, and create intelligent feedback systems that promote the advancement of healthcare and athletic performance. First, we present Magneto: a body-mounted electromagnet-based sensing system for joint motion analysis. Joint motion analysis facilitates research into injury prevention, rehabilitation, and activity monitoring. Sensors used in such analysis must be unobtrusive, accurate, and capable of monitoring fast-paced dynamic motions. Our system is wireless, has a high sampling rate, and is unaffected by outside magnetic noise. Magnetic field readings are commonly affected by interference from the Earth's magnetic field, the environment, and nearby ferrous objects. Magneto uses the combination of an electromagnet and a magnetometer to remove environmental interference from a magnetic field reading. We evaluated this sensing method to show its performance when removing the interference in three movement dimensions, in six environments, and with six different sampling rates. Then, we localized the electromagnet with respect to the magnetic field reader, allowing us to apply Magneto in two pilot studies: measuring elbow angles and calculating shoulder positions. We calculated elbow angles to the nearest 15° with 93.8% accuracy, shoulder position in two degrees of freedom with 96.9% accuracy, and shoulder position in three degrees of freedom with 75.8% accuracy. Second, we present TracKnee: a sensing knee sleeve designed and fabricated to unobtrusively measure knee angles using conductive fabric sensors. We propose three models that can be used in succession to calculate knee angles from voltage. These models take an input of voltage, calculate the resistance of our conductive fabric sensor, then calculate the change in length across the front of the knee, and finally the angle of the knee. We evaluated our models and our device by conducting a user study with six participants, in which we collected 240 ground-truth angles along with sensor data from our TracKnee device. Our results show that our model is 94.86% accurate to the nearest 15° increment and that our average error per angle is 3.69 degrees. Third, we present ServesUp: a sensing shirt designed to monitor shoulder and elbow motion during the volleyball serve. In this project, we designed and fabricated a sensing shirt that is comfortable, unobtrusive, and washable, so an athlete can wear it during volleyball play without being impeded. To make the shirt comfortable, we used soft and flexible conductive fabric sensors to monitor the motion of the shoulder and the elbow. We conducted a user study with ten volleyball players for a total of 1000 volleyball serves. We classified serving motion using a k-nearest neighbors (KNN) classifier with a classification accuracy of 89.2%. We will use this data to provide actionable insights back to the player to help improve their serving skill. Fourth, we present BreathEZ, the first smartwatch application that provides both choking first aid instruction and real-time tactile and visual feedback on the quality of abdominal thrust compressions. We evaluated our application through two user studies involving 20 subjects and 200 abdominal thrust events. The results of our study show that BreathEZ achieves a classification accuracy of 90.9% for abdominal thrusts. All participants who used BreathEZ in our study were able to improve their performance of abdominal thrusts. Of these participants, 60% were able to perform within the recommended range with the use of BreathEZ; comparatively, no participants trained with a video alone reached that range. Finally, we present BBAid: the first smartwatch-based system that provides real-time feedback on the back-blow portion of choking first aid while instructing the user in the first aid procedure. We evaluated our application through two user studies involving 26 subjects and 260 back blow events. The results of our study show that BBAid achieves a classification accuracy of 93.75% for back blows. With the use of BBAid, participants in our study were able to perform back blows within the recommended range 75% of the time; comparatively, participants trained with a video alone reached that range only 12% of the time. All participants in the study, after receiving training, were much more willing to perform choking first aid.
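
    Magneto's interference rejection, as described above, amounts to a differential measurement: read the magnetometer with the electromagnet off to capture the Earth's field and any environmental interference, then subtract that background from a reading taken with the electromagnet on. A minimal sketch follows; the driver callbacks and sample counts are hypothetical stand-ins, not the dissertation's API.

        import numpy as np

        def electromagnet_field(read_magnetometer, set_coil, n_samples=8):
            """read_magnetometer() -> 3-vector in uT; set_coil(on) drives the coil.
            Both callbacks are hypothetical stand-ins for the device driver."""
            set_coil(False)  # background: Earth's field + environment + ferrous objects
            background = np.mean([read_magnetometer() for _ in range(n_samples)], axis=0)
            set_coil(True)   # coil on: background plus the electromagnet's own field
            with_coil = np.mean([read_magnetometer() for _ in range(n_samples)], axis=0)
            return with_coil - background  # interference-cancelled electromagnet field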

    Sensory Communication

    Contains table of contents for Section 2 and reports on five research projects. Funding: National Institutes of Health Contract 2 R01 DC00117; National Institutes of Health Contract 1 R01 DC02032; National Institutes of Health Contract 2 P01 DC00361; National Institutes of Health Contract N01 DC22402; National Institutes of Health Grant R01-DC001001; National Institutes of Health Grant R01-DC00270; National Institutes of Health Grant 5 R01 DC00126; National Institutes of Health Grant R29-DC00625; U.S. Navy - Office of Naval Research Grant N00014-88-K-0604; U.S. Navy - Office of Naval Research Grant N00014-91-J-1454; U.S. Navy - Office of Naval Research Grant N00014-92-J-1814; U.S. Navy - Naval Air Warfare Center Training Systems Division Contract N61339-94-C-0087; U.S. Navy - Naval Air Warfare Center Training Systems Division Contract N61339-93-C-0055; U.S. Navy - Office of Naval Research Grant N00014-93-1-1198; National Aeronautics and Space Administration/Ames Research Center Grant NCC 2-77.