    Novel Tactile-SIFT Descriptor for Object Shape Recognition

    Using a tactile array sensor to recognize an object often requires multiple touches at different positions. This process is prone to move or rotate the object, which inevitably increases the difficulty of object recognition. To cope with the unknown object movement, this paper proposes a new tactile-SIFT descriptor that extracts features from the gradients of the tactile image to represent objects, making the features invariant to object translation and rotation. Tactile-SIFT segments a tactile image into overlapping subpatches, each of which is represented by a dn-dimensional gradient vector, similar to the classic SIFT descriptor. The tactile-SIFT descriptors obtained from multiple touches form a dictionary of k words, and the bag-of-words method is then used to identify objects. The proposed method has been validated by classifying 18 real objects with data from an off-the-shelf tactile sensor. The parameters of the tactile-SIFT descriptor, including the dimension size dn and the number of subpatches sp, are studied. Taking both classification accuracy and time efficiency into consideration, the optimal performance is obtained with an 8-D descriptor and three subpatches. Using tactile-SIFT, a recognition rate of 91.33% has been achieved with a dictionary of 50 clusters using only 15 touches.
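    The pipeline this abstract describes — gradient histograms over overlapping subpatches, clustered into a k-word dictionary and pooled into a bag-of-words histogram — can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the subpatch layout, the plain k-means routine, and all function names are hypothetical, with dn = 8 orientation bins and sp = 3 subpatches chosen to match the reported optimum.

    ```python
    import numpy as np

    def gradient_descriptor(patch, n_bins=8):
        """Histogram of gradient orientations over one subpatch,
        weighted by gradient magnitude (SIFT-style), L2-normalized."""
        gy, gx = np.gradient(patch.astype(float))
        mag = np.hypot(gx, gy)
        ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)
        bins = np.floor(ang / (2 * np.pi) * n_bins).astype(int) % n_bins
        hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
        norm = np.linalg.norm(hist)
        return hist / norm if norm > 0 else hist

    def tactile_descriptors(image, n_subpatches=3, n_bins=8):
        """Split one tactile image into vertically overlapping subpatches
        (assumed layout) and describe each with a gradient histogram."""
        h, w = image.shape
        step = max(1, h // (n_subpatches + 1))
        size = 2 * step  # consecutive subpatches overlap by 50%
        return np.array([gradient_descriptor(image[top:top + size, :], n_bins)
                         for top in range(0, h - size + 1, step)])

    def kmeans(X, k, iters=20, seed=0):
        """Plain Lloyd's k-means to build the k-word dictionary."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return centers

    def bow_histogram(descs, centers):
        """Assign each descriptor to its nearest word; return the
        normalized word-frequency histogram used for classification."""
        labels = np.argmin(((descs[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        h = np.bincount(labels, minlength=len(centers)).astype(float)
        return h / h.sum()
    ```

    Descriptors from many touches would be pooled before clustering, and the resulting per-object histograms fed to any standard classifier.
    
    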

    Histogram based classification of tactile patterns on periodically distributed skin sensors for a humanoid robot

    The main target of this work is to improve human-robot interaction capabilities by adding a new sensory modality, touch, to KASPAR, a humanoid robot. Large-scale distributed skin-like sensors are designed and integrated on the robot, covering KASPAR at various locations. One of the challenges is to classify different types of touch. Unlike digital images, which are represented by regular grids of pixels, the geometrical structure of the sensor array limits the straightforward application of well-established image-pattern approaches. This paper introduces a novel histogram-based classification algorithm that transforms tactile data into histograms of local features, termed a codebook. Tactile patterns can be invariant at periodically repeated locations, allowing tactile pattern classification with a smaller amount of training data instead of training data collected from everywhere on the large-scale skin sensors. To generate the codebook, the method uses a two-layer approach: local neighbourhood structures, and encodings of the pressure distribution within each local neighbourhood. Classification is performed on the constructed features using a Support Vector Machine (SVM) with the intersection kernel. Real experimental data are used to classify different patterns and show promising accuracy. To evaluate the performance, the method is also compared with an SVM using the Radial Basis Function (RBF) kernel, and the results are discussed in terms of both accuracy and the location-invariance property.
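    The intersection kernel the abstract relies on is simple to state: K(x, y) = sum_i min(x_i, y_i) over histogram bins. A minimal numpy sketch follows; the paper uses the kernel inside an SVM, whereas here it is paired with a hypothetical 1-nearest-neighbour rule in the kernel-induced feature space purely to show the kernel in use (the same Gram matrix could be fed to a precomputed-kernel SVM instead). The function names are assumptions.

    ```python
    import numpy as np

    def intersection_kernel(A, B):
        """Histogram intersection kernel K(x, y) = sum_i min(x_i, y_i),
        computed for every pair of rows in A (m x d) and B (n x d)."""
        return np.minimum(A[:, None, :], B[None, :, :]).sum(axis=2)

    def kernel_nn_predict(K_test_train, k_test_diag, k_train_diag, y_train):
        """1-NN in the feature space induced by the kernel, using
        d^2(x, z) = K(x, x) + K(z, z) - 2 K(x, z)."""
        d2 = k_test_diag[:, None] + k_train_diag[None, :] - 2 * K_test_train
        return y_train[np.argmin(d2, axis=1)]
    ```

    For L1-normalized histograms, K(x, x) = 1, so the diagonal terms are constant and the rule reduces to picking the training histogram with the largest intersection.
    
    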
