    Methods and Sensors for Slip Detection in Robotics: A Survey

    The perception of slip is one of the distinctive abilities of human tactile sensing. The sense of touch allows a wide set of properties of a grasped object to be recognized, such as its shape, weight and dimensions. Based on such properties, the applied force can be regulated accordingly to avoid slip of the grasped object. Despite the great importance of tactile sensing for humans, mechatronic hands (robotic manipulators, prosthetic hands, etc.) are rarely endowed with tactile feedback. The need to grasp objects using robust slip-prevention algorithms is not yet met by existing artificial manipulators, which therefore remain relegated to structured environments. Numerous approaches to the problem of slip detection and correction have been developed, especially in the last decade, drawing on a number of sensor technologies. However, none has yet made an impact on the industrial market. This paper reviews the sensors and methods proposed so far for slip prevention in artificial tactile perception, from the more classical techniques to the latest solutions tested on robotic systems. The strengths and weaknesses of each technique are discussed, also in relation to the sensing technologies employed. The result is a summary covering the whole state of the art and providing a perspective on future research directions in the sector.
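
    The survey's starting point, regulating the applied force to prevent slip, can be made concrete with a minimal sketch of the classic friction-cone control loop. The sensor and gripper callbacks and all numeric constants below are hypothetical placeholders, not taken from any surveyed system.

```python
# Minimal sketch of a friction-cone slip-prevention loop: tighten the grip
# whenever the tangential load approaches the Coulomb friction limit.
# read_normal_force, read_tangential_force and set_grip_force are assumed
# hardware callbacks; MU, SAFETY and FORCE_STEP are placeholder values.

MU = 0.6          # assumed static friction coefficient of the grasped object
SAFETY = 1.2      # margin kept between the load and the friction-cone boundary
FORCE_STEP = 0.1  # N, grip increment applied when incipient slip is detected

def regulate_grip(read_normal_force, read_tangential_force, set_grip_force):
    f_n = read_normal_force()
    f_t = read_tangential_force()
    # Coulomb model: the object starts to slide once f_t exceeds MU * f_n,
    # so act as soon as the load enters the safety margin.
    if f_t > MU * f_n / SAFETY:
        set_grip_force(f_n + FORCE_STEP)
```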

    Distributed Sensing and Stimulation Systems Towards Sense of Touch Restoration in Prosthetics

    Modern prostheses aim at restoring the functional and aesthetic characteristics of the lost limb. To foster prosthesis embodiment and functionality, it is necessary to restore both volitional control and sensory feedback. Contemporary feedback interfaces presented in research use few sensors and stimulation units to feed back at most two discrete variables (e.g. grasping force and aperture), whereas the human sense of touch relies on a distributed network of mechanoreceptors providing high-fidelity spatial information. To provide this type of feedback in prosthetics, it is necessary to sense tactile information from artificial skin placed on the prosthesis and transmit tactile feedback above the amputation in order to map the interaction between the prosthesis and the environment. This thesis proposes the integration of distributed sensing systems (e-skin) to acquire tactile sensation, with non-invasive multichannel electrotactile feedback and virtual reality to deliver high-bandwidth information to the user. Its core focus is the development and testing of a closed-loop sensory feedback human-machine interface, based on the latest distributed sensing and stimulation techniques, for restoring the sense of touch in prosthetics. To this end, the thesis comprises two introductory chapters that describe the state of the art in the field, the objectives, the methodology used and the contributions, as well as three studies distributed over the stimulation-system and sensing-system levels. The first study presents the development of a closed-loop compensatory tracking system to evaluate the usability and effectiveness of electrotactile sensory feedback in enabling real-time closed-loop control in prosthetics. It examines and compares the subjects' adaptive performance and tolerance to random latencies while performing a dynamic control task (i.e. position control) and simultaneously receiving either visual or electrotactile feedback communicating the momentary tracking error. Moreover, it reports the minimum time delay needed for an abrupt impairment of users' performance. The experimental results show that electrotactile feedback performance is less prone to degradation with longer delays, whereas visual feedback performance drops faster as time delays increase. This is a good indication of the effectiveness of electrotactile feedback in enabling closed-loop control in prosthetics, since some delays are inevitable. The second study describes the development of a novel non-invasive compact multichannel interface for electrotactile feedback, containing a 24-pad electrode matrix with a fully programmable stimulation unit, and investigates the ability of able-bodied human subjects to localize electrotactile stimuli delivered through the electrode matrix. Furthermore, it designs a novel dual-parameter modulation (interleaved frequency and intensity) and compares it to conventional stimulation (the same frequency for all pads). In addition, and for the first time, it compares electrotactile stimulation to mechanical stimulation. It also presents the integration of a virtual prosthesis with the developed system in order to achieve a better user experience and object manipulation, by mapping the tactile data collected in real time and feeding it back simultaneously to the user. The experimental results demonstrated that the proposed interleaved coding substantially improved spatial localization compared to same-frequency stimulation. Furthermore, they showed that same-frequency stimulation was equivalent to mechanical stimulation, whereas performance with dual-parameter modulation was significantly better. The third study presents the realization of a novel, flexible, screen-printed e-skin based on P(VDF-TrFE) piezoelectric polymers, intended to cover the fingertips and the palm of a prosthetic hand (in particular the Michelangelo hand by Ottobock) and an assistive sensorized glove for stroke patients. Moreover, it develops a new validation methodology to examine the sensors' behavior while being solicited. The characterization results showed agreement between the expected (modeled) electrical response of each sensor and the measured mechanical (normal) force at the skin surface, which in turn proved that the combination of the fabrication and assembly processes was successful. This paves the way to defining a practical, simplified and reproducible characterization protocol for e-skin patches. In conclusion, by adopting innovative methodologies in sensing and stimulation systems, this thesis advances the overall development of closed-loop sensory feedback human-machine interfaces used for the restoration of the sense of touch in prosthetics. Moreover, this research could lead to high-bandwidth, high-fidelity transmission of tactile information for modern dexterous prostheses, which could improve the end-user experience and facilitate acceptance in daily life.
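
    As a reading aid for the second study's coding schemes, the sketch below contrasts same-frequency stimulation with interleaved dual-parameter modulation over a 24-pad matrix. It is a simplified reconstruction under assumed parameter values, not the thesis implementation.

```python
# Illustrative sketch (not the authors' implementation) of the two stimulation
# coding schemes compared above. All frequency/intensity values are
# hypothetical placeholders.

N_PADS = 24

def same_frequency_coding(freq_hz=50.0, amp_ma=2.0):
    """Conventional coding: every pad shares identical stimulation parameters."""
    return {pad: (freq_hz, amp_ma) for pad in range(N_PADS)}

def interleaved_coding():
    """Dual-parameter coding: cycle frequency and intensity with co-prime
    periods so that nearby pads never share both parameters, which should
    make simultaneous or adjacent stimuli easier to tell apart."""
    freqs = [30.0, 50.0, 70.0, 90.0]  # Hz, period-4 cycle over the matrix
    amps = [1.5, 2.0, 2.5]            # mA, period-3 cycle over the matrix
    return {pad: (freqs[pad % 4], amps[pad % 3]) for pad in range(N_PADS)}
```

    Because the two cycle lengths are co-prime, the combined pattern only repeats every 12 pads, so neighbouring pads always differ in at least one parameter.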

    Development of PVDF tactile dynamic sensing in a behaviour-based assembly robot

    The research presented in this thesis focuses on the development of tactile event signature sensors and their application, especially in reactive behaviour-based robotic assembly systems. In pursuit of practical and economical sensors for detecting part contact, the application of PVDF (polyvinylidene fluoride) film, a mechanical-vibration-sensitive piezo material, is investigated. A Clunk Sensor is developed which remotely detects impact vibrations, and a Push Sensor is developed which senses small changes in the deformation of a compliant finger surface. The Push Sensor is further developed to provide some force-direction and force-pattern sensing capability. By being able to detect changes of state in an assembly, such as a change of contact force, an assembly robot can be well informed of current conditions. The complex structure of assembly tasks provides a rich context within which to interpret changes of state, so simple binary sensors can conveniently supply a lot more information than in the domain of mobile robots. Guarded motions, for example, which require sensing a change of state, have long been recognised as very useful in part-mating tasks. Guarded motions are particularly well suited to be components of assembly behavioural modules. In behaviour-based robotic assembly systems, the high-level planner is endowed with as little complexity as possible, while the low-level planning execution agent deals with actual sensing and action. Highly reactive execution agents can provide advantages by encapsulating low-level sensing and action, hiding the details of sensori-motor complexity from the higher levels. Because behaviour-based assembly systems emphasise the utility of this kind of qualitative state-change sensor (as opposed to sensors which measure physical quantities), the robustness and utility of the Push Sensor was tested in an experimental behaviour-based system. An experimental task of pushing a ring along a convoluted stiff wire was chosen, in which the tactile sensors developed here are aided by vision. Three different methods of combining these different sensors within the general behaviour-based paradigm are implemented and compared. This exercise confirms the robustness and utility of the PVDF-based tactile sensors. We argue that the comparison suggests that, for behaviour-based assembly systems using multiple concurrent sensor systems, bottom-level motor control in terms of force or velocity would be more appropriate than positional control. Behaviour-based systems have traditionally tried to avoid symbolic knowledge. Considering this in the light of the above work, it was found useful to develop a taxonomy of types of knowledge and to refine the prohibition.
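
    A guarded motion of the kind discussed above reduces, in outline, to moving until the PVDF signal reports a qualitative change of state. The sketch below assumes a hypothetical robot/sensor API and collapses the Clunk and Push Sensors into a single thresholded voltage channel; the threshold value is a placeholder.

```python
# Sketch of a guarded motion terminated by a PVDF contact event.
# step_toward_goal and read_pvdf_voltage are assumed hardware callbacks.

import time

CONTACT_THRESHOLD = 0.5  # V, assumed PVDF output level indicating contact

def guarded_move(step_toward_goal, read_pvdf_voltage, timeout_s=5.0):
    """Step toward the goal until the PVDF sensor reports a contact event."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if abs(read_pvdf_voltage()) > CONTACT_THRESHOLD:
            return True   # change of state: contact detected, stop the motion
        step_toward_goal()
    return False          # no contact within the allotted time
```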

    Neuromorphic vision-based tactile sensor for robotic grasp

    Tactile sensors are developed to mimic the human sense of touch in robotics. The touch sense is essential for machines to interact with the environment. Several approaches have been studied to obtain rich information from the contact point, in order to correct the robot's actions and acquire further information about the objects. Vision-based tactile sensors aim to extract tactile information by observing the contact point between the robot's hand and the environment and applying computer vision algorithms. In this thesis, a novel class of vision-based tactile sensors, the "Neuromorphic Vision-Based Tactile Sensor", is proposed to estimate the contact force and classify materials in a grasp. This novel approach utilises a neuromorphic vision sensor to capture intensity changes (events) at the contact point. The triggered events represent changes in the contact force at each pixel at microsecond resolution. The proposed sensor has a high temporal resolution and dynamic range, which are suitable for high-speed robotic applications. Initially, a general framework is demonstrated to show the sensor's operation. Furthermore, the relationship between events and the contact force is presented. Afterwards, methods based on Time-Delay Neural Networks (TDNN), Gaussian Processes (GP) and Deep Neural Networks (DNN) are developed to estimate the contact force and classify object materials from the accumulation of events. The results indicate a low mean squared error of 0.17 N against a force sensor for force estimation using the TDNN. Moreover, object materials are classified with 79.12% accuracy, which is 30% higher than with piezoresistive force sensors. This is followed by an approach to preserve spatio-temporal information during the learning process: the triggered events are framed (event-frames) within a time window to preserve spatial information. Afterwards, multiple types of Long Short-Term Memory (LSTM) networks with convolutional layers are developed to estimate the contact force for objects of different sizes. The results are validated against a force sensor and achieve a mean squared error of less than 0.1 N. Finally, algorithmic augmentation techniques are investigated to improve the networks' accuracy over a wider range of forces. Image-based and time-series augmentation methods are developed to generate artificial samples for training the networks. A novel time-domain approach, Temporal Event Shifting (TES), is proposed to augment events while preserving their spatial information. The results are validated in real experiments, which indicate that the time-domain and hybrid augmentation methods significantly improve the networks' accuracy for objects of different sizes.
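
    Two of the processing steps above lend themselves to a short illustration: binning events into event-frames within a time window, and a TES-style time-domain augmentation that jitters timestamps while leaving pixel coordinates untouched. The event layout (x, y, timestamp in microseconds, polarity) and the sensor resolution are assumptions; this is a simplified reconstruction, not the thesis code.

```python
import numpy as np

def event_frames(events, sensor_hw=(260, 346), window_us=10_000):
    """Accumulate (x, y, t_us, polarity) events into signed count frames,
    one frame per time window, preserving spatial information."""
    t0 = events[:, 2].min()
    n_frames = int((events[:, 2].max() - t0) // window_us) + 1
    frames = np.zeros((n_frames, *sensor_hw), dtype=np.float32)
    for x, y, t, p in events:
        frames[int((t - t0) // window_us), int(y), int(x)] += 1.0 if p else -1.0
    return frames

def temporal_event_shift(events, max_shift_us=2_000, rng=np.random):
    """TES-style augmentation: shift timestamps by random offsets while
    keeping (x, y) intact, so the spatial structure is preserved."""
    shifted = events.copy()
    shifted[:, 2] += rng.randint(-max_shift_us, max_shift_us + 1, size=len(events))
    return shifted
```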

    Design and Development of Sensor Integrated Robotic Hand

    Most automated systems using robots as agents employ a few sensors according to need. However, there are situations where the tasks carried out by the end-effector, or for that matter by the robot hand, need multiple sensors. To make the best use of these sensors and behave autonomously, the hand requires a set of appropriate types of sensors integrated in a proper manner. The present research work aims at developing a sensor-integrated robot hand that can collect information related to the assigned tasks, assimilate it correctly and then act as appropriate. The process of development involves selecting sensors of the right types and specifications, locating them at proper places in the hand, checking their functionality individually and calibrating them for the envisaged process. Since the sensors need to be integrated so that they perform in the desired manner collectively, an integration platform is created using an NI PXIe-1082. A set of algorithms is developed for achieving the integrated model. The entire process is first modelled and simulated offline for possible modification, in order to ensure that all the sensors contribute towards the autonomy of the hand for the desired activity. This work also involves the design of a two-fingered gripper. The design is made in such a way that it is capable of carrying out the desired tasks and can accommodate all the sensors within its fold. The developed sensor-integrated hand has been put to work and its performance has been tested. This hand can be very useful for part-assembly work in industry, for parts of any shape within a given size limit. The broad aim is to design, model, simulate and develop an advanced robotic hand. Sensors for picking up contact pressure, force, torque, position and surface profile/shape, using suitable sensing elements, are to be introduced into the robot hand. The human hand is a complex structure with a large number of degrees of freedom and has multiple sensing capabilities, apart from the associated sensing assistance from other organs. The present work is envisaged to add multiple sensors to a two-fingered robotic hand having motion capabilities and constraints similar to the human hand. Although there has been a good amount of research and development in this field during the last two decades, a lot remains to be explored and achieved. The objective of the proposed work is to design, simulate and develop a sensor-integrated robotic hand. Its potential applications lie in industrial environments and in the healthcare field. The industrial applications include electronic assembly tasks, lighter inspection tasks, etc. Applications in healthcare could be in the areas of rehabilitation and assistive techniques. The work also aims to establish the requirements of the robotic hand for the target application areas, and to identify the suitable kinds and models of sensors that can be integrated into the hand control system. The functioning of the motors in the robotic hand and the integration of appropriate sensors for the desired motion are explained for the control of the various elements of the hand. Additional sensors, capable of collecting external information and information about the object to be manipulated, are explored. Processes are designed using various software and hardware tools, such as MATLAB for mathematical computation, the OpenCV library and the LabVIEW 2013 DAQ system as applicable, validated theoretically and finally implemented to develop an intelligent robotic hand. The multiple smart sensors are installed on a standard six-degree-of-freedom industrial robot, a KAWASAKI RS06L articulated manipulator, with either the two-finger pneumatic SCHUNK robotic hand or the designed prototype, and the robot control programs are integrated in such a manner as to allow easy application of grasping in an industrial pick-and-place operation where the characteristics of the object can vary or are unknown. The effectiveness of the recommended structure is proven by experiments involving calibration of the sensors and the manipulator. The dissertation concludes with a summary of the contributions and the scope for further work.
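
    Purely as an illustration of the integration logic described above, the sketch below replaces the NI PXIe/LabVIEW acquisition layer with hypothetical reader callbacks and fuses three channels into a single grasp-verification decision. The thresholds are uncalibrated placeholders, not values from the dissertation.

```python
# Illustrative-only sketch of multi-sensor grasp verification for a
# pick-and-place step. The readers dict maps channel names to hypothetical
# acquisition callbacks standing in for the real DAQ system.

def grasp_and_verify(readers, close_gripper, lift):
    """Close the gripper, then use fused sensor readings to verify the grasp."""
    close_gripper()
    force = readers["force"]()        # fingertip force sensor, N
    pressure = readers["pressure"]()  # contact pressure pad, kPa
    slip = readers["slip"]()          # binary flag from a dynamic slip sensor
    # Accept the grasp only if both contact channels agree and no slip occurs.
    if force > 1.0 and pressure > 5.0 and not slip:
        lift()
        return True
    return False
```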

    Towards observable haptics: Novel sensors for capturing tactile interaction patterns

    Kõiva R. Towards observable haptics: Novel sensors for capturing tactile interaction patterns. Bielefeld: Bielefeld University; 2014.
    Touch is one of the primary senses humans use when performing coordinated interaction, but the lack of a sense of touch in the majority of contemporary interactive technical systems, such as robots, which operate in non-deterministic environments, results in interactions that can at best be described as clumsy. Observing human haptics and extracting the salient information from the gathered data is not only relevant if we are to try to understand the involved underlying cognitive processes, but should also provide us with significant clues for designing future intelligent interactive systems. Such systems could one day help to take the burden of tedious tasks off our hands, in a similar fashion to how industrial robots revolutionized manufacturing. The aim of the work in this thesis was to provide significant advancements in tactile sensing technology, and thus move us a step closer to realizing this goal. The contributions contained herein can be broken into two major parts. The first part investigates capturing interaction patterns in humans, with the goals of better understanding manual intelligence and improving the lives of hand amputees, while the second part is focused on augmenting technical systems with a sense of touch. tacTiles, a wireless tactile sensitive surface element attached to a deformable textile, was developed to capture human full-body interactions with the large surfaces we come into contact with in our daily lives, such as floors, chairs, sofas or other furniture. The Tactile Dataglove, iObject and the Tactile Pen were developed especially to observe human manual intelligence. Whereas iObject allows motion sensing and a higher-definition tactile signal to be captured than the Tactile Dataglove (220 tactile cells in the first iObject prototype versus 54 cells in the glove), the wearable glove makes haptic interactions with arbitrary objects observable. The Tactile Pen was designed to measure grip force during handwriting in order to better facilitate therapeutic treatment assessments. These sensors have already been extensively used by various research groups, including our own, to gain a better understanding of human manual intelligence. The Finger-Force-Linear-Sensor and the Tactile Bracelet are two novel sensors that were developed to facilitate more natural control of dexterous multi-Degree-of-Freedom (DOF) hand prostheses. The Finger-Force-Linear-Sensor is a very accurate bidirectional single-finger force ground-truth measurement device that was designed to enable the testing and development of algorithms mapping muscle activations to single-finger forces. The Tactile Bracelet was designed with the goal of providing a more robust and intuitive means of control for multi-DOF hand prostheses by measuring the muscle bulging of the remnant muscles of lower-arm amputees. It is currently in development and will eventually cover the complete forearm circumference with high-spatial-resolution tactile sensitive surfaces. An experiment involving a large number of lower-arm amputees has already been planned. The Modular flat tactile sensor system, the Fabric-based touch sensitive artificial skin and the 3D shaped tactile sensor were developed to cover the surfaces of technical systems and add touch sensing capabilities to them. The rapid augmentation of systems with a sense of touch was the main goal of the modular flat tactile sensor system. The developed sensor modules can be used alone or in an array to form larger tactile sensitive surfaces, such as tactile sensitive tabletops. As many robots have curved surfaces, using flat rigid modules severely limits the areas that can be covered with tactile sensors. The Fabric-based tactile sensor, originally developed to form a tactile dataglove for human hands, can with minor modifications also function as an artificial skin for technical systems. Finally, the 3D shaped tactile sensor based on Laser-Direct-Structuring technology is a novel tactile sensor that has a true 3D shape and provides high sensitivity and a high spatial resolution. These sensors take us further along the path towards creating general-purpose technical systems that in time can be of great help to us in our daily lives. The desired tactile sensor characteristics differ significantly according to which haptic interaction patterns we wish to measure. Large tactile sensor arrays that are used to capture full-body haptic interactions with floors and upholstered furniture, or that are designed to cover large areas of technical system surfaces, need to be scalable, have low power consumption and should ideally have a low material cost. Two examples of such sensors are tacTiles and the Fabric-based sensor for curved surfaces. At the other end of the tactile sensor development spectrum, if we want to observe manual interactions, high spatial and temporal resolution are crucial to enable the measurement of fine grasping and manipulation actions. Our fingertips contain the highest-density area of mechanoreceptors, the organs that sense mechanical pressure and distortions. Thus, to construct biologically inspired anthropomorphic robotic hands, the artificial tactile sensors for the fingertips require similarly high-fidelity sensing, with surfaces that are curved with small bending radii in two dimensions and high spatial densities, while simultaneously providing high sensitivity. With the fingertip tactile sensor, designed to fit the Shadow Robot Hand's fingers, I show in the 3D-shaped high-spatial-resolution tactile sensor section of my thesis that such sensors can indeed be constructed. With my work I have made a significant contribution towards making haptics more observable. I achieved this by developing a number of novel tactile sensors that are usable, give a deeper insight into human haptic interactions, have great potential to help amputees, and make technical systems, such as robots, more capable.
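
    As a small worked example of turning one of these tactile arrays into observable interaction data, the sketch below reduces a calibrated pressure frame to two features: total load and contact centroid. The 2-D grid layout and calibration are assumptions rather than details of any specific sensor above.

```python
# Sketch: reduce one frame from a tactile array to interaction features.

import numpy as np

def contact_features(frame):
    """frame: 2-D array of calibrated cell pressures.
    Returns (total_load, centroid), with centroid as (row, col) or None."""
    load = frame.sum()
    if load <= 0:
        return 0.0, None               # no contact in this frame
    ys, xs = np.indices(frame.shape)   # per-cell row/column coordinates
    centroid = (float((ys * frame).sum() / load),
                float((xs * frame).sum() / load))
    return float(load), centroid
```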

    Tactile Perception And Visuotactile Integration For Robotic Exploration

    As the close perceptual sibling of vision, the sense of touch has historically received less attention than it deserves in both human psychology and robotics. In robotics, this may be attributed to at least two reasons. First, touch suffers from a vicious cycle of immature sensor technology, which keeps industry demand low, which in turn gives even less incentive to make the sensors existing in research labs easy to manufacture and marketable. Second, the situation stems from a fear of making contact with the environment, which is avoided in every way so that visually perceived states do not change before a carefully estimated and ballistically executed physical interaction. Fortunately, the latter viewpoint is starting to change. Work in interactive perception and contact-rich manipulation is on the rise. Good reasons are steering the manipulation and locomotion communities' attention towards deliberate physical interaction with the environment prior to, during, and after a task. We approach the problem of perception prior to manipulation, using the sense of touch, for the purpose of understanding the surroundings of an autonomous robot. The overwhelming majority of work in perception for manipulation is based on vision. While vision is a fast and global modality, it is insufficient as the sole modality, especially in environments where the ambient light or the objects therein do not lend themselves to vision, such as darkness, smoky or dusty rooms in search and rescue, underwater scenes, transparent and reflective objects, and retrieving items from inside a bag. Even in normal lighting conditions, during a manipulation task the target object and fingers are usually occluded from view by the gripper. Moreover, vision-based grasp planners, typically trained in simulation, often make errors that cannot be foreseen until contact. As a step towards addressing these problems, we first present a global shape-based feature descriptor for object recognition using non-prehensile tactile probing alone. Then, we investigate making the tactile modality, local and slow by nature, more efficient for the task by predicting the most cost-effective moves using active exploration. To combine the local and physical advantages of touch with the fast and global advantages of vision, we propose and evaluate a learning-based method for visuotactile integration for grasping.
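
    The active-exploration step above, predicting the most cost-effective next probe, can be sketched as a greedy rule: probe where the current belief about the object is most uncertain. The belief-map layout below is a hypothetical stand-in, and maximum posterior entropy is used here as a simple proxy for expected information gain, not as the thesis's exact criterion.

```python
# Sketch of greedy active tactile exploration over a discretized workspace.

import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy along the last axis of a probability array."""
    p = np.clip(p, eps, 1.0)
    return -(p * np.log(p)).sum(axis=-1)

def next_probe(belief_map):
    """belief_map: (H, W, n_classes) per-location class probabilities.
    Returns the (row, col) to probe next, chosen as the location where
    the posterior is currently most uncertain."""
    h = entropy(belief_map)
    return np.unravel_index(np.argmax(h), h.shape)
```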