
    Controlled Tactile Exploration and Haptic Object Recognition

    In this paper we propose a novel method for in-hand object recognition. The method is composed of a grasp stabilization controller and two exploratory behaviours that capture the shape and the softness of an object. Grasp stabilization plays an important role in recognizing objects. First, it prevents the object from slipping and facilitates its exploration. Second, reaching a stable and repeatable position adds robustness to the learning algorithm and increases invariance with respect to the way in which the robot grasps the object. The stable poses are estimated using a Gaussian mixture model (GMM). We present experimental results showing that, using our method, the classifier can successfully distinguish 30 objects. We also compare our method with a benchmark experiment in which grasp stabilization is disabled, and show, with statistical significance, that our method outperforms the benchmark.
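
    The GMM step lends itself to a compact illustration. Below is a minimal sketch, assuming stable grasps are logged as fixed-length joint-angle vectors; the data shapes, component count, and the `nearest_stable_pose` helper are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of GMM-based stable-pose estimation (illustrative only).
# Assumption: stable grasps are logged as 9-dimensional joint-angle vectors.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical training set: 200 recorded stable poses, 9 joint angles each.
stable_poses = rng.normal(loc=0.5, scale=0.1, size=(200, 9))

# Each mixture component models one family of stable grasp configurations.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(stable_poses)

def nearest_stable_pose(current_pose: np.ndarray) -> np.ndarray:
    """Return the mean of the component most responsible for the current
    hand pose, i.e. the stable pose to servo the fingers toward."""
    responsibilities = gmm.predict_proba(current_pose[None, :])[0]
    return gmm.means_[np.argmax(responsibilities)]

target = nearest_stable_pose(rng.normal(0.5, 0.1, size=9))
```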

    Active haptic shape recognition by intrinsic motivation with a robot hand

    In this paper, we present an intrinsic motivation approach applied to haptics in robotics for tactile object exploration and recognition. Here, touch is used as the sensation process for contact detection, whilst proprioceptive information is used for the perception process. First, a probabilistic method is employed to reduce the uncertainty present in tactile measurements. Second, the object exploration process is actively controlled by intelligently moving the robot hand towards interesting locations. This active behaviour is achieved by an intrinsic motivation approach, which improved object recognition accuracy over the results obtained with a fixed sequence of exploration movements. The proposed method was validated in a simulated environment using a Monte Carlo method, whilst in the real environment a three-fingered robotic hand and various object shapes were employed. The results demonstrate that our method is robust and suitable for haptic perception in autonomous robotics.
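
    The uncertainty-reduction step can be pictured as a recursive Bayes update over object hypotheses. The sketch below is a toy version under that assumption; the class list, likelihood values, and entropy-based uncertainty measure are illustrative, not the paper's actual model.

```python
# Toy recursive Bayesian update over object hypotheses (illustrative only).
# Assumption: P(reading | class) is available for each probed location.
import numpy as np

classes = ["cylinder", "box", "sphere"]             # hypothetical object set
belief = np.full(len(classes), 1.0 / len(classes))  # uniform prior

def update_belief(belief: np.ndarray, likelihoods: np.ndarray) -> np.ndarray:
    """One Bayes step: posterior is proportional to likelihood * prior."""
    posterior = belief * likelihoods
    return posterior / posterior.sum()

def entropy(p: np.ndarray) -> float:
    """Shannon entropy of the belief; lower means less uncertainty."""
    return float(-np.sum(p * np.log(p + 1e-12)))

# Hypothetical tactile reading that is most consistent with "cylinder".
belief = update_belief(belief, np.array([0.7, 0.2, 0.1]))
print(belief, entropy(belief))  # uncertainty drops as evidence accumulates
```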

    Mid-air haptic rendering of 2D geometric shapes with a dynamic tactile pointer

    An important challenge that affects ultrasonic mid-air haptics, in contrast to physical touch, is that we lose certain exploratory procedures such as contour following. This makes perceiving geometric properties and identifying shapes more difficult. Meanwhile, the growing interest in mid-air haptics and their application to various new areas requires an improved understanding of how we perceive specific haptic stimuli, such as icons and control dials in mid-air. We address this challenge by investigating static and dynamic methods of displaying 2D geometric shapes in mid-air. We display a circle, a square, and a triangle, in either a static or a dynamic condition, using ultrasonic mid-air haptics. In the static condition, the shapes are presented as a full outline in mid-air, while in the dynamic condition, a tactile pointer is moved around the perimeter of the shapes. We measure participants’ accuracy and confidence in identifying shapes in two controlled experiments (n1 = 34, n2 = 25). Results reveal that in the dynamic condition people recognise shapes significantly more accurately and with higher confidence. We also find that representing polygons as a set of individually drawn haptic strokes, with a short pause at the corners, drastically enhances shape recognition accuracy. Our research supports the design of mid-air haptic user interfaces in application scenarios such as in-car interactions or assistive technology in education.
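
    The dynamic condition can be approximated by sampling pointer positions along a polygon's edges and holding them briefly at each vertex. A minimal sketch follows, assuming a constant sweep speed and a fixed device update rate; all timing values, sizes, and the `pointer_path` helper are illustrative assumptions.

```python
# Sketch of a dynamic tactile pointer traced around a polygon, holding
# briefly at each corner. Speeds, rates, and sizes are illustrative.
import numpy as np

def pointer_path(vertices, speed=0.05, rate=60, corner_pause=0.2):
    """Return (x, y) pointer positions: a constant-speed sweep along each
    edge, holding position for `corner_pause` seconds at every vertex."""
    points = []
    n = len(vertices)
    for i in range(n):
        a = np.asarray(vertices[i], dtype=float)
        b = np.asarray(vertices[(i + 1) % n], dtype=float)
        steps = max(1, int(np.linalg.norm(b - a) / speed * rate))
        for t in np.linspace(0.0, 1.0, steps, endpoint=False):
            points.append(a + t * (b - a))
        points.extend([b] * int(corner_pause * rate))  # pause at the corner
    return np.array(points)

# A small triangle (metres) in the mid-air interaction plane.
triangle = [(0.0, 0.0), (0.08, 0.0), (0.04, 0.07)]
path = pointer_path(triangle)  # stream positions to the device at `rate` Hz
```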

    Tactile information improves visual object discrimination in kea, Nestor notabilis, and capuchin monkeys, Sapajus spp.

    In comparative visual cognition research, the influence of information acquired by nonvisual senses has received little attention. Systematic studies focusing on how the integration of information from sight and touch can affect animal perception are sparse. Here, we investigated whether tactile input improves the visual discrimination ability of a bird, the kea, and of capuchin monkeys, two species with acute vision that are known for their tendency to handle objects. To this end, we assessed whether, at the attainment of a criterion, accuracy and/or learning speed in the visual modality were enhanced by haptic (i.e. active tactile) exploration of an object. Subjects were trained to select the positive stimulus between two cylinders of the same shape and size but with different surface structures. In the Sight condition, one pair of cylinders was inserted into transparent Plexiglas tubes, which prevented the animals from haptically perceiving the objects' surfaces. In the Sight and Touch condition, one pair of cylinders was not inserted into tubes, which allowed the subjects to perceive the objects' surfaces both visually and haptically. We found that both kea and capuchins (1) showed comparable levels of accuracy at the attainment of the learning criterion in both conditions, but (2) required fewer trials to achieve the criterion in the Sight and Touch condition. Moreover, this study showed that both kea and capuchins can integrate information acquired by the visual and tactile modalities. To our knowledge, this represents the first evidence of visuotactile integration in a bird species. Overall, our findings demonstrate that the acquisition of tactile information while manipulating objects facilitates visual discrimination of objects in two phylogenetically distant species.

    Active haptic perception in robots: a review

    In the past few years a new scenario for robot-based applications has emerged. Service and mobile robots have opened new market niches, and new frameworks for shop-floor robot applications have been developed. In all these contexts, robots are requested to perform tasks under open-ended, possibly dynamically varying conditions. These new requirements also call for a change of paradigm in the design of robots: online and safe feedback motion control becomes the core of modern robot systems. Future robots will learn autonomously, interact safely and possess qualities like self-maintenance. Attaining these features would have been relatively easy if a complete model of the environment were available, and if the robot actuators could execute motion commands perfectly relative to this model. Unfortunately, a complete world model is not available, and robots have to plan and execute tasks in the presence of environmental uncertainties, which makes sensing an important component of new-generation robots. For this reason, today's new-generation robots are equipped with more and more sensing components, and consequently they are ready to actively deal with the high complexity of the real world. Complex sensorimotor tasks such as exploration require coordination between the motor system and the sensory feedback. For robot control purposes, sensory feedback should be adequately organized in terms of relevant features and the associated data representation. In this paper, we propose an overall functional picture linking sensing to action in closed-loop sensorimotor control of robots for touch (hands, fingers). Basic qualities of haptic perception in humans inspire the models and categories comprising the proposed classification. The objective is to provide a reasoned, principled perspective on the connections between the different taxonomies used in the robotics and human haptics literature. The specific case of active exploration is chosen to ground interesting use cases, for two reasons. First, in the haptics literature, exploration has been treated only to a limited extent compared to grasping and manipulation. Second, exploration involves specific robot behaviors that exploit distributed and heterogeneous sensory data.

    Feeling the Shape: Active Exploration Behaviors for Object Recognition With a Robotic Hand

    Autonomous exploration is a crucial feature for achieving robust and safe robotic systems capable of interacting with and recognizing their surrounding environment. In this paper, we present a method for object recognition using a three-fingered robotic hand that actively explores interesting object locations to reduce uncertainty. We present a novel probabilistic perception approach with a Bayesian formulation to iteratively accumulate evidence from robot touch. Exploration of better locations for perception is performed by familiarity and novelty exploration behaviors, which intelligently control the robot hand to move toward locations with low and high levels of interestingness, respectively. These are active behaviors that, similar to the exploratory procedures observed in humans, allow robots to autonomously explore locations they believe contain interesting information for recognition. The active behaviors are validated with object recognition experiments in both offline and real-time modes. Furthermore, the effects of inhibiting the active behaviors are analyzed with a passive exploration strategy. The results from the experiments demonstrate not only the accuracy of our proposed methods but also their benefits for active robot control to intelligently explore and interact with the environment.
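
    One way to picture the familiarity and novelty behaviors is as two selection rules over the same per-location "interestingness" score. The sketch below uses a toy visit-count score as a stand-in for the paper's Bayesian measure; the scoring function and candidate locations are assumptions for illustration only.

```python
# Toy version of the novelty/familiarity selection rules (illustrative only).
# Assumption: a scalar "interestingness" score exists per candidate location;
# here a simple visit-count score stands in for the paper's Bayesian measure.
import numpy as np

def interestingness(visit_counts: np.ndarray) -> np.ndarray:
    """Rarely touched locations score high (more uncertain, more novel)."""
    return 1.0 / (1.0 + visit_counts)

visit_counts = np.array([5, 0, 2, 1])   # contacts made at four candidates
scores = interestingness(visit_counts)

next_novelty = int(np.argmax(scores))      # novelty: probe the most uncertain
next_familiarity = int(np.argmin(scores))  # familiarity: revisit the best known
```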