A SOLID case for active Bayesian perception in robot touch

Abstract

In a series of papers, we have formalized a Bayesian perception approach for robotics based on recent progress in understanding animal perception. The main principle is to accumulate evidence for multiple perceptual alternatives until reaching a preset belief threshold, formally related to sequential analysis methods for optimal decision making. Here, we extend this approach to active perception by moving the sensor with a control strategy that depends on the posterior beliefs during decision making. This method can be used to solve problems involving Simultaneous Object Localization and IDentification (SOLID), or 'where and what'. Considering an example in robot touch, we find that active perception gives an efficient, accurate solution to the SOLID problem under uncertain object locations; in contrast, passive Bayesian perception, which lacks sensorimotor feedback, performs poorly. Thus, active perception can enable robust sensing in unstructured environments. © 2013 Springer-Verlag Berlin Heidelberg
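As a rough illustration of the decision rule described in the abstract, the sketch below implements a generic sequential Bayesian evidence-accumulation loop with an active repositioning step: evidence is accumulated over candidate (location, identity) hypotheses until one posterior crosses a preset belief threshold, and between observations the sensor is moved based on the current posterior. All names here (hypotheses, likelihood, observe, move_sensor_to) and the one-dimensional pose are hypothetical placeholders for illustration only, not the authors' implementation; the active control shown simply recenters the sensor on the most probable object location.

import numpy as np

def active_bayesian_perception(hypotheses, likelihood, observe, move_sensor_to,
                               belief_threshold=0.99, max_steps=100):
    """Accumulate evidence over hypotheses until one posterior crosses the threshold.

    hypotheses        : list of candidate (location, identity) pairs
    likelihood(z, h, x): probability of observation z under hypothesis h at sensor pose x
    observe(x)        : draw a (tactile) observation at sensor pose x
    move_sensor_to(x) : command the sensor to pose x (the 'active' part)
    """
    # Uniform prior over all perceptual alternatives
    posterior = np.full(len(hypotheses), 1.0 / len(hypotheses))
    sensor_pose = 0.0  # illustrative 1-D starting pose

    for _ in range(max_steps):
        z = observe(sensor_pose)

        # Bayes update: weight each hypothesis by the likelihood of z, then renormalize
        posterior *= np.array([likelihood(z, h, sensor_pose) for h in hypotheses])
        posterior /= posterior.sum()

        # Stop when the belief in one alternative reaches the preset threshold
        if posterior.max() >= belief_threshold:
            return hypotheses[int(posterior.argmax())], posterior

        # Active control: move the sensor toward the currently most probable
        # object location, so subsequent observations are more informative
        best_location, _ = hypotheses[int(posterior.argmax())]
        sensor_pose = best_location
        move_sensor_to(sensor_pose)

    # If the threshold is never crossed, return the best current guess
    return hypotheses[int(posterior.argmax())], posterior

A passive variant of the same loop would omit the repositioning step, taking every observation from the initial sensor pose; the abstract's contrast between active and passive perception corresponds to including or dropping that feedback from posterior beliefs to sensor motion.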