Angled sensor configuration capable of measuring tri-axial forces for pHRI
© 2016 IEEE. This paper presents a new configuration for single-axis tactile sensor arrays molded in rubber to enable tri-axial force measurement. The configuration requires the sensing axis of each sensor in the array to be rotated out of alignment with respect to external forces. This angled sensor array measures shear forces along axes in a way that differs from a planar sensor array. Three sensors using the angled configuration (22.5°, 45° and 67.5°) and a fourth using the planar configuration (0°) were fabricated for experimental comparison. Artificial neural networks were trained to estimate the external force applied along each axis (X, Y and Z) from raw pressure sensor values. The results show that the angled sensor configuration is capable of measuring tri-axial external forces with a root mean squared error of 1.79 N, compared with 4.52 N for the equivalent sensor using the planar configuration. The sensors are then used to control a robotic arm. Preliminary findings show angled sensor arrays to be a viable alternative to planar sensor arrays for shear force measurement, with wide applications in physical Human Robot Interaction (pHRI).
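The pipeline in the abstract above — learn a mapping from raw pressure readings to tri-axial forces and score it with root mean squared error — can be sketched as follows. The paper trains artificial neural networks; this minimal, self-contained stand-in uses a linear least-squares fit on synthetic data, and the sensor count, force model, and noise level are invented for illustration.

```python
import numpy as np

# Toy stand-in for the paper's pipeline: map raw pressure readings from a
# 4-element sensor array to a tri-axial force estimate, then score with RMSE.
# (The paper uses ANNs; linear least squares is only an illustration.)
rng = np.random.default_rng(0)

n_samples, n_sensors = 200, 4
true_W = rng.normal(size=(n_sensors, 3))              # hypothetical sensor-to-force map
pressures = rng.uniform(0, 1, (n_samples, n_sensors))  # raw pressure values
forces = pressures @ true_W + rng.normal(scale=0.05, size=(n_samples, 3))  # (Fx, Fy, Fz)

# Fit the mapping from raw pressures to forces, then evaluate.
W, *_ = np.linalg.lstsq(pressures, forces, rcond=None)
pred = pressures @ W
rmse = np.sqrt(np.mean((pred - forces) ** 2))
print(f"RMSE: {rmse:.3f} N")
```

In the paper, replacing the linear fit with a trained neural network lets the model absorb the nonlinearity introduced by the angled sensing axes.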
Sensing and Control for Robust Grasping with Simple Hardware
Robots can move, see, and navigate in the real world outside carefully structured factories, but they cannot yet grasp and manipulate objects without human intervention. Two key barriers are the complexity of current approaches, which require complicated hardware or precise perception to function effectively, and the challenge of understanding system performance in a tractable manner given the wide range of factors that affect successful grasping. This thesis presents sensors and simple control algorithms that relax the requirements on robot hardware, and a framework for understanding the capabilities and limitations of grasping systems.
Comparing Piezoresistive Substrates for Tactile Sensing in Dexterous Hands
While tactile skins have been shown to be useful for detecting collisions between a robotic arm and its environment, they have not been extensively used for improving robotic grasping and in-hand manipulation. We propose a novel sensor design for use in covering existing multi-fingered robot hands. We analyze the performance of four different piezoresistive materials using both fabric and anti-static foam substrates in benchtop experiments. We find that although the piezoresistive foam was designed as packing material and not for use as a sensing substrate, it performs comparably with fabrics specifically designed for this purpose. While these results demonstrate the potential of piezoresistive foams for tactile sensing applications, they do not fully characterize the efficacy of these sensors for use in robot manipulation. As such, we use a high-density foam substrate to develop a scalable tactile skin that can be attached to the palm of a robotic hand. We demonstrate several robotic manipulation tasks using this sensor to show its ability to reliably detect and localize contact, as well as to analyze contact patterns during grasping and transport tasks.

Comment: 10 figures, 8 pages, submitted to ICRA 202
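Contact detection and localization on a tactile skin of the kind described above can be sketched with a simple baseline-subtraction scheme over a taxel grid. The grid size, resting readings, and detection threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch: detect and localize a press on an 8x8 taxel grid by
# subtracting each taxel's resting baseline and thresholding the result.
rng = np.random.default_rng(2)

baseline = rng.normal(100.0, 1.0, (8, 8))   # per-taxel resting readings
frame = baseline.copy()
frame[5, 2] += 40.0                          # simulated press at row 5, col 2

delta = frame - baseline                     # baseline subtraction
contact = delta.max() > 10.0                 # detection threshold (assumed)
loc = np.unravel_index(np.argmax(delta), delta.shape)

print("contact:", contact, "at taxel", loc)  # -> contact: True at taxel (5, 2)
```

In practice the baseline drifts with temperature and substrate creep, so a real readout loop would re-estimate it continuously rather than capture it once.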
An Embedded, Multi-Modal Sensor System for Scalable Robotic and Prosthetic Hand Fingers
Grasping and manipulation with anthropomorphic robotic and prosthetic hands present a scientific challenge regarding mechanical design, sensor systems, and control. Apart from the mechanical design of such hands, embedding the sensors needed for closed-loop control of grasping tasks remains a hard problem due to limited space and the required high level of integration of different components. In this paper we present a scalable design model of artificial fingers, which combines mechanical design and embedded electronics with a sophisticated multi-modal sensor system consisting of sensors for normal and shear force, distance, acceleration, temperature, and joint angles. The design is fully parametric, allowing automated scaling of the fingers to arbitrary dimensions in the human hand spectrum. To this end, the electronic parts are composed of interchangeable modules that facilitate the mechanical scaling of the fingers and are fully enclosed by the mechanical parts of the finger. The resulting design model allows deriving freely scalable, multimodally sensorised fingers for robotic and prosthetic hands. Four physical demonstrators are assembled and tested to evaluate the approach.
Data-driven Tactile Sensing using Spatially Overlapping Signals
Providing robots with distributed, robust and accurate tactile feedback is a fundamental problem in robotics because of the large number of tasks that require physical interaction with objects. Tactile sensors can provide robots with information about the location of each point of contact with the manipulated object, an estimate of the contact forces applied (normal and shear), and even slip detection. Despite significant advances in touch and force transduction, tactile sensing is still far from ubiquitous in robotic manipulation. Existing methods for building touch sensors have proven difficult to integrate into robot fingers due to multiple challenges, including difficulty in covering multicurved surfaces, high wire count, or packaging constraints preventing their use in dexterous hands.
In this dissertation, we focus on the development of soft tactile systems that can be deployed over complex, three-dimensional surfaces with a low wire count, using easily accessible manufacturing methods. To this end, we present a general methodology called spatially overlapping signals. The key idea behind our method is to embed multiple sensing terminals in a volume of soft material that can be deployed over arbitrary, non-developable surfaces. Unlike a traditional taxel, these sensing terminals are not capable of measuring strain on their own. Instead, we take measurements across pairs of sensing terminals: applying strain in the receptive field of a terminal pair should measurably affect the signal associated with it. As we embed multiple sensing terminals in this soft material, their receptive fields overlap significantly across the whole active sensing area, providing a very rich dataset characterizing the contact event. The use of an all-pairs approach, where all possible combinations of sensing-terminal pairs are used, maximizes the number of signals extracted while reducing the total number of wires for the overall sensor, which in turn facilitates its integration.
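The wiring advantage of the all-pairs scheme described above comes from simple combinatorics: with one wire per terminal, the wire count grows linearly in the number of terminals while the signal count grows quadratically, as n terminals yield n(n-1)/2 pairs. A short sketch:

```python
from itertools import combinations

# All-pairs signal count: every unordered pair of sensing terminals
# contributes one measurable signal, so n terminals (n wires) yield
# n*(n-1)/2 signals.
def all_pairs_signal_count(n_terminals: int) -> int:
    return len(list(combinations(range(n_terminals), 2)))

for n in (4, 8, 16):
    print(n, "wires ->", all_pairs_signal_count(n), "signals")
# 4 wires -> 6 signals, 8 -> 28, 16 -> 120
```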
Building an analytical model for how this rich signal set relates to various contact events can be very challenging. Further, any such model would depend on knowing the exact locations of the terminals in the sensor, thus requiring very precise manufacturing. Instead, we build forward models of our sensors from data. We collect training data using a dataset of controlled indentations of known characteristics, directly learning the mapping between our signals and the variables characterizing a contact event. This approach allows for accessible, cheap manufacturing while enabling extensive coverage of curved surfaces. The concept of spatially overlapping signals can be realized using various transduction methods; we demonstrate sensors using piezoresistance, pressure transducers, and optics. With piezoresistivity, we measure resistance values across various electrodes embedded in a carbon-nanotube-infused elastomer to determine the location of touch. Using commercially available pressure transducers embedded in various configurations inside a soft volume of rubber, we show it is possible to localize contacts across a curved surface. Finally, using optics, we measure light transport between LEDs and photodiodes inside the clear elastomer that makes up our sensor. Our optical sensors can detect both the location and depth of an indentation very accurately on both planar and multicurved surfaces.
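The data-driven forward model described above — learn the mapping from signal vectors to contact variables using controlled indentations — can be sketched with a nearest-neighbour lookup as a minimal stand-in for the dissertation's learned models. The terminal geometry and exponential signal model below are invented for illustration.

```python
import numpy as np

# Sketch: predict contact location from a signal vector by matching against
# a dataset of controlled indentations with known (x, y) locations.
rng = np.random.default_rng(1)

def signals(xy):
    """Hypothetical smooth response: one channel per terminal at a corner."""
    terminals = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    d = np.linalg.norm(xy[:, None, :] - terminals[None, :, :], axis=-1)
    return np.exp(-d)

# "Training" indentations: known contact locations and their signal vectors.
train_xy = rng.uniform(0, 1, (500, 2))
train_sig = signals(train_xy)

# Localize a new contact by nearest match in signal space (1-NN regression).
query_xy = np.array([[0.3, 0.7]])
idx = np.argmin(np.linalg.norm(train_sig - signals(query_xy), axis=1))
print("estimated contact:", train_xy[idx])
```

Because the model is fit to data rather than derived from terminal positions, sloppy manufacturing tolerances are absorbed into the training set, which is the core argument for the approach.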
Our Distributed Interleaved Signals for Contact via Optics, or D.I.S.C.O, Finger is the culmination of this methodology: a fully integrated, sensorized robot finger with a low wire count, designed for easy integration into dexterous manipulators. Our DISCO Finger can generally determine contact location with sub-millimeter accuracy, and contact force to within 10% (and often within 5%) of the true value, without the need for analytical models. While our data-driven method requires training data representative of the final operational conditions that the system will encounter, we show our finger can be robust to novel contact scenarios where the shape of the indenter has not been seen during training. Moreover, the forward model that predicts contact location and applied normal force can be transferred to new fingers with minimal loss of performance, eliminating the need to collect training data for each individual finger. We believe that rich tactile information, in a highly functional form with limited blind spots and a simple integration path into complete systems, as demonstrated in this dissertation, will prove to be an important enabler for data-driven complex robotic motor skills, such as dexterous manipulation.