
Tapping into Touch

By Eduardo Torres-Jara, Lorenzo Natale and Paul Fitzpatrick

Abstract

Humans use a set of exploratory procedures to examine object properties through grasping and touch. Our goal is to exploit similar methods with a humanoid robot to enable developmental learning about manipulation. We use a compliant robot hand to find objects without prior knowledge of their presence or location, and then tap those objects with a finger. This behavior lets the robot generate and collect samples of the contact sound produced by impact with each object. We demonstrate the feasibility of recognizing objects by their sound, and relate this to human performance in situations analogous to that of the robot.
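The idea of recognizing objects from tap-induced contact sounds can be illustrated with a minimal sketch: summarize each sound by coarse spectral band energies and match new taps to labeled prototypes by nearest neighbor. This is an assumption-laden toy, not the authors' actual pipeline; the `tap` signals below are synthetic decaying sinusoids standing in for recorded contact sounds, and the object labels are hypothetical.

```python
import numpy as np

def spectral_features(signal, n_bins=32):
    """Summarize a contact sound as normalized coarse band energies
    of its magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bins)
    feats = np.array([b.mean() for b in bands])
    return feats / (np.linalg.norm(feats) + 1e-12)

def classify(sample, prototypes):
    """Nearest-neighbor match of a sound against labeled prototype
    feature vectors."""
    f = spectral_features(sample)
    return min(prototypes, key=lambda label: np.linalg.norm(prototypes[label] - f))

# Hypothetical training data: synthetic "taps" modeled as decaying
# sinusoids at different resonant frequencies (stand-ins for real
# recordings from the robot's microphone).
t = np.linspace(0, 0.1, 4410)  # 100 ms at ~44.1 kHz

def tap(freq):
    return np.sin(2 * np.pi * freq * t) * np.exp(-40 * t)

prototypes = {
    "cup": spectral_features(tap(800)),
    "bottle": spectral_features(tap(2400)),
}
print(classify(tap(820), prototypes))
```

A tap near 820 Hz lands closest to the 800 Hz "cup" prototype, so even this crude spectral summary separates objects whose impact sounds differ in resonant frequency.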

Topics: Machine Learning, Robotics
Publisher: Lund University Cognitive Studies
Year: 2005
OAI identifier: oai:cogprints.org:4968

