
    Mapping haptic exploratory procedures to multiple shape representations

    Research in human haptics has revealed a number of exploratory procedures (EPs) that are used in determining attributes of an object, particularly shape. This research has been used as a paradigm for building an intelligent robotic system that can perform shape recognition from touch sensing. In particular, a number of mappings between EPs and shape modeling primitives have been found. The choice of shape primitive for each EP is discussed, and results from experiments with a Utah-MIT dextrous hand system are presented. A vision algorithm to complement active touch sensing for the task of autonomous shape recovery is also presented.
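
    As a rough illustration of such an EP-to-primitive mapping, a simple lookup table could drive primitive selection. This is a minimal sketch, not the authors' actual pairing: the EP names loosely follow the haptics literature, and the primitive choices on the right are hypothetical placeholders.

    # Hypothetical mapping from exploratory procedures (EPs) to the shape
    # primitive a touch-driven recognizer might fit to the contact data.
    EP_TO_SHAPE_PRIMITIVE = {
        "enclosure":         "superquadric",    # coarse volumetric fit
        "lateral_extent":    "planar_face",     # flat-surface probing
        "contour_following": "surface_curve",   # fine boundary tracing
    }

    def pick_primitive(ep: str) -> str:
        """Return the shape primitive paired with an exploratory procedure."""
        return EP_TO_SHAPE_PRIMITIVE.get(ep, "unknown")

    print(pick_primitive("enclosure"))   # -> superquadric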

    Novel Tactile-SIFT Descriptor for Object Shape Recognition

    Using a tactile array sensor to recognize an object often requires multiple touches at different positions. This process tends to move or rotate the object, which inevitably increases the difficulty of object recognition. To cope with unknown object movement, this paper proposes a new tactile-SIFT descriptor that extracts features from the gradients of the tactile image to represent objects, making the features invariant to object translation and rotation. Tactile-SIFT segments a tactile image into overlapping subpatches, each of which is represented using a dn-dimensional gradient vector, similar to the classic SIFT descriptor. Tactile-SIFT descriptors obtained from multiple touches form a dictionary of k words, and the bag-of-words method is then used to identify objects. The proposed method has been validated by classifying 18 real objects with data from an off-the-shelf tactile sensor. The parameters of the tactile-SIFT descriptor, including the dimension size dn and the number of subpatches sp, are studied. It is found that the optimal performance is obtained using an 8-D descriptor with three subpatches, taking both classification accuracy and time efficiency into consideration. By employing tactile-SIFT, a recognition rate of 91.33% has been achieved with a dictionary size of 50 clusters using only 15 touches.
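
    The bag-of-words pipeline described above can be sketched as follows. This is a minimal illustration under assumed details (horizontal strips as subpatches, gradient-orientation histograms as the per-subpatch descriptor, k-means for the dictionary, a 1-nearest-neighbour classifier), not the authors' implementation; function names, array shapes, and the synthetic data are all illustrative.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neighbors import KNeighborsClassifier

    def subpatch_descriptors(patch, n_sub=3, n_bins=8):
        """One n_bins-D gradient-orientation histogram per overlapping subpatch
        (horizontal strips here), loosely following the tactile-SIFT idea."""
        gy, gx = np.gradient(patch.astype(float))
        mag, ang = np.hypot(gx, gy), np.arctan2(gy, gx)
        step = max(1, patch.shape[0] // n_sub)
        descs = []
        for i in range(n_sub):
            rows = slice(max(0, i * step - step // 2), (i + 1) * step + step // 2)
            hist, _ = np.histogram(ang[rows], bins=n_bins, range=(-np.pi, np.pi),
                                   weights=mag[rows])
            descs.append(hist / (hist.sum() + 1e-9))
        return np.array(descs)                      # shape: (n_sub, n_bins)

    def bow_encode(touches_per_object, k=50):
        """Cluster all subpatch descriptors into k 'tactile words', then encode
        each object as a normalized word-frequency histogram over its touches."""
        all_descs = np.vstack([subpatch_descriptors(t)
                               for touches in touches_per_object for t in touches])
        km = KMeans(n_clusters=k, n_init=10).fit(all_descs)
        hists = []
        for touches in touches_per_object:
            words = km.predict(np.vstack([subpatch_descriptors(t) for t in touches]))
            h = np.bincount(words, minlength=k).astype(float)
            hists.append(h / h.sum())
        return km, np.array(hists)

    # Toy usage: 18 "objects", 15 touches each, 16x16 synthetic tactile images.
    rng = np.random.default_rng(0)
    touch_sets = [[rng.random((16, 16)) for _ in range(15)] for _ in range(18)]
    km, X = bow_encode(touch_sets, k=50)
    clf = KNeighborsClassifier(n_neighbors=1).fit(X, np.arange(18))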

    iCLAP: Shape Recognition by Combining Proprioception and Touch Sensing

    The work presented in this paper was partially supported by the Engineering and Physical Sciences Research Council (EPSRC) Grant (Ref: EP/N020421/1) and the King’s-China Scholarship Council Ph.D. scholarship.

    Manipulation primitives: A paradigm for abstraction and execution of grasping and manipulation tasks

    Sensor-based reactive and hybrid approaches have proven a promising line of study for addressing imperfect knowledge in grasping and manipulation. However, reactive approaches are usually tightly coupled to a particular embodiment, making transfer of knowledge difficult. This paper proposes a paradigm for modeling and execution of reactive manipulation actions which makes knowledge transfer to different embodiments possible while retaining the reactive capabilities of the embodiments. The proposed approach extends the idea of control primitives coordinated by a state machine by introducing an embodiment-independent layer of abstraction. Abstract manipulation primitives constitute a vocabulary of atomic, embodiment-independent actions, which can be coordinated using state machines to describe complex actions. To obtain embodiment-specific models, the abstract state machines are automatically translated into embodiment-specific models, so that the full capabilities of each platform can be utilized. The strength of the manipulation-primitives paradigm is demonstrated by developing a set of corresponding embodiment-specific primitives for object transport, including a complex reactive grasping primitive. The robustness of the approach is studied experimentally by emptying a box filled with several unknown objects. The embodiment independence is studied by performing a manipulation task on two different platforms using the same abstract description.
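
    The layered abstraction can be sketched roughly as follows. This is an illustrative outline under assumed names (a linear plan standing in for the abstract state machine, a registry binding abstract primitives to per-platform controllers), not the authors' framework; all identifiers are hypothetical.

    from typing import Callable, Dict, List, Tuple

    AbstractPrimitive = str   # e.g. "approach", "reactive_grasp", "transport", "release"

    class EmbodimentRegistry:
        """Binds abstract primitive names to embodiment-specific implementations."""
        def __init__(self, name: str):
            self.name = name
            self._impl: Dict[AbstractPrimitive, Callable[[], str]] = {}

        def register(self, primitive: AbstractPrimitive, impl: Callable[[], str]) -> None:
            self._impl[primitive] = impl

        def translate(self, abstract_plan: List[Tuple[str, AbstractPrimitive]]):
            """Turn an embodiment-independent plan (state name, abstract primitive)
            into executable steps for this embodiment."""
            return [(state, self._impl[prim]) for state, prim in abstract_plan]

    # Abstract description shared across platforms (state name, abstract primitive).
    TRANSPORT_TASK = [
        ("approach_object", "approach"),
        ("grasp_object",    "reactive_grasp"),
        ("move_to_goal",    "transport"),
        ("release_object",  "release"),
    ]

    # One embodiment binds the shared vocabulary to its own controllers; a second
    # platform would register different implementations for the same names.
    arm_a = EmbodimentRegistry("parallel_gripper_arm")
    arm_a.register("approach",       lambda: "cartesian guarded move")
    arm_a.register("reactive_grasp", lambda: "close fingers until force threshold")
    arm_a.register("transport",      lambda: "joint-space trajectory to goal")
    arm_a.register("release",        lambda: "open gripper")

    for state, step in arm_a.translate(TRANSPORT_TASK):
        print(state, "->", step())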