Human and Object Recognition with a High-resolution tactile sensor
This paper describes the use of two artificial intelligence methods for object recognition via pressure images from a high-resolution tactile sensor. Both methods follow the same procedure of feature extraction and posterior classification based on a supervised Support Vector Machine (SVM). The two approaches differ in how features are extracted: while the first one uses the Speeded-Up Robust Features (SURF) descriptor, the other employs a pre-trained Deep Convolutional Neural Network (DCNN). In addition, this work shows the application to object recognition for rescue robotics by distinguishing between different body parts and inert objects. The performance analysis of the proposed methods is carried out with an experiment on 5-class non-human and 3-class human classification, providing a comparison in terms of accuracy and computational load. Finally, it is discussed how SURF-based feature extraction can be up to five times faster than DCNN-based extraction, while the accuracy achieved with DCNN-based feature extraction can be 11.67% higher than with SURF.
Proyecto DPI2015-65186-R. European Commission under grant agreement BES-2016-078237. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.
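The pipeline described above (extract a fixed-length feature vector per pressure image, then classify with a supervised SVM) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic feature vectors stand in for the SURF bag-of-words or DCNN embeddings the authors actually compute, and all sizes are arbitrary.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for extracted features: 200 tactile images,
# each reduced to a 64-dim descriptor (a SURF bag-of-words or DCNN
# embedding would go here), over 5 non-human classes.
n_samples, n_features, n_classes = 200, 64, 5
centers = rng.normal(0, 3, size=(n_classes, n_features))
labels = rng.integers(0, n_classes, size=n_samples)
features = centers[labels] + rng.normal(0, 1, size=(n_samples, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0, stratify=labels)

# Supervised SVM classifier on the extracted features.
clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"accuracy: {acc:.2f}")
```

Swapping the feature extractor while keeping the SVM fixed is what lets the paper compare SURF and DCNN features on equal footing.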
Pattern recognition in a multi-sensor environment
Journal Article
Current pattern recognition systems tend to operate on a single sensor, e.g., a camera. However, the need is now evident for pattern recognition systems which can operate in multi-sensor environments. For example, a robotics workstation may use range finders, cameras, tactile pads, etc. The Multi-sensor Kernel System (MKS) provides an efficient and coherent approach to the specification, recovery, and analysis of patterns in the data sensed by such a diverse set of sensors. We demonstrate how such a system can be used to support both feature-based object models and structural models. The problem solved is the localization of a three-dimensional object in 3-space. Moreover, MKS allows rapid reconfiguration of the available sensors and the high-level models.
Object Recognition and Localization : the Role of Tactile Sensors
Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This thesis presents two approaches for haptic object recognition and localization in ground and underwater environments. The first approach, called Batch Ransac and Iterative Closest Point augmented Sequential Filter (BRICPSF), is based on an innovative combination of a sequential filter, the Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in simulation and using actual hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired and provides a close integration between exploration and recognition. An edge-following exploration strategy is developed that receives feedback from the current state of recognition. A recognition-by-parts approach is developed which uses BRICPSF for object part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments.
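The Iterative-Closest-Point component at the heart of BRICPSF alternates two steps: match each measured point to its nearest model point, then solve for the rigid transform that best aligns the matched pairs. A single iteration can be sketched as below; this is a generic textbook ICP step (Kabsch/SVD alignment), not the thesis's filter-augmented version, and the point sets are toy data.

```python
import numpy as np

def icp_step(source, target):
    """One ICP iteration: match each source point to its nearest
    target point, then solve for the rigid transform (R, t) that
    minimizes squared error over the matches (Kabsch via SVD)."""
    # Nearest-neighbour correspondences (brute force, for clarity).
    d = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    matched = target[d.argmin(axis=1)]

    # Kabsch: centre both sets, SVD of the cross-covariance matrix.
    src_c, tgt_c = source.mean(axis=0), matched.mean(axis=0)
    H = (source - src_c).T @ (matched - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections: force det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy usage: align a point cloud against a slightly transformed copy.
rng = np.random.default_rng(1)
pts = rng.normal(size=(50, 3))
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
R, t = icp_step(pts, pts @ R_true.T + np.array([0.2, -0.1, 0.3]))
```

In practice the step is iterated until the correspondences stop changing; BRICPSF additionally wraps this in a sequential filter and RANSAC-based database matching for robustness to sparse, noisy haptic contacts.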
Whiskered texture classification with uncertain contact pose geometry
Tactile sensing can be an important source of information for robots, and texture discrimination in particular is useful in object recognition and terrain identification. Whisker-based tactile sensing has recently been shown to be a promising approach for mobile robots, using simple sensors and many classification approaches. However, these approaches have often been tested in limited environments, and have not been compared against one another in a controlled way. A wide range of whisker-object contact poses are possible on a mobile robot, and the effect such contact variability has on sensing has not been properly investigated. We present a novel, carefully controlled study of simple surface texture classifiers on a large set of varied pose conditions that mimic those encountered by mobile robots: single brief whisker contacts with textured surfaces at a range of surface orientations and contact speeds. Results show that different classifiers are appropriate for different settings, with spectral template and feature-based approaches performing best in surface texture classification and contact speed estimation, respectively. The results may be used to inform selection of classifiers in tasks such as tactile SLAM.
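A spectral-template classifier of the kind this study favours for texture discrimination can be illustrated as follows. This is a hedged sketch, not the paper's exact method: per-class templates are taken to be mean training spectra, classification is nearest-template in Euclidean distance, and the "whisker deflection" signals are synthetic sinusoids.

```python
import numpy as np

def power_spectrum(signal):
    """Normalized magnitude spectrum of a 1-D deflection signal."""
    spec = np.abs(np.fft.rfft(signal))
    return spec / (np.linalg.norm(spec) + 1e-12)

def build_templates(signals, labels):
    """Average the spectra of all training signals per texture class."""
    return {c: np.mean([power_spectrum(s)
                        for s, l in zip(signals, labels) if l == c], axis=0)
            for c in set(labels)}

def classify(signal, templates):
    """Assign the class whose spectral template is closest."""
    spec = power_spectrum(signal)
    return min(templates, key=lambda c: np.linalg.norm(spec - templates[c]))

# Toy textures: two surfaces exciting different dominant frequencies
# in the whisker, plus additive noise.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 256, endpoint=False)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.3 * rng.normal(size=t.size)
train = [make(8) for _ in range(10)] + [make(25) for _ in range(10)]
y = [0] * 10 + [1] * 10
templates = build_templates(train, y)
print(classify(make(8), templates), classify(make(25), templates))
```

The appeal of templates here is their tolerance to amplitude variation (the spectra are normalized), which matters when contact speed and pose vary as in the study's conditions.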
Tactile Mapping and Localization from High-Resolution Tactile Imprints
This work studies the problem of shape reconstruction and object localization
using a vision-based tactile sensor, GelSlim. The main contributions are the
recovery of local shapes from contact, an approach to reconstruct the tactile
shape of objects from tactile imprints, and an accurate method for object
localization of previously reconstructed objects. The algorithms can be applied
to a large variety of 3D objects and provide accurate tactile feedback for
in-hand manipulation. Results show that by exploiting the dense tactile
information we can reconstruct the shape of objects with high accuracy and do
on-line object identification and localization, opening the door to reactive
manipulation guided by tactile sensing. We provide videos and supplemental
information in the project's website
http://web.mit.edu/mcube/research/tactile_localization.html
Comment: ICRA 2019, 7 pages, 7 figures. Website: http://web.mit.edu/mcube/research/tactile_localization.html Video: https://youtu.be/uMkspjmDbq
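Identifying a previously reconstructed object from a new tactile point cloud, as described above, amounts to matching against a shape database. A minimal sketch of that matching step is shown below; it is not the GelSlim pipeline (which reconstructs local shape from imprints first), and the chamfer-distance matcher and toy sphere/cube shapes are illustrative assumptions.

```python
import numpy as np

def chamfer(a, b):
    """Symmetric mean nearest-neighbour distance between point sets."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def identify(imprint, database):
    """Return the key of the stored shape that best matches the imprint."""
    return min(database, key=lambda k: chamfer(imprint, database[k]))

rng = np.random.default_rng(3)
# Toy database: points sampled from a unit sphere and a cube surface.
sphere = rng.normal(size=(200, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
cube = rng.uniform(-1, 1, size=(200, 3))
cube[np.arange(200), rng.integers(0, 3, 200)] = rng.choice([-1.0, 1.0], 200)
db = {"sphere": sphere, "cube": cube}

# A noisy tactile point cloud of the sphere should match the sphere.
imprint = sphere + 0.02 * rng.normal(size=sphere.shape)
print(identify(imprint, db))
```

Once the object's identity is known, a registration step (e.g. ICP against the stored shape) yields the pose estimate that drives the reactive manipulation the abstract mentions.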