
Situational Awareness Using A Single Omnidirectional Camera

Abstract

To retrieve scene information using a single omnidirectional camera, we have based our work on a shape-from-texture method proposed by Lindeberg. To do so, we have adapted Lindeberg's method, originally developed for planar images, so that it can be used on the sphere S². The mathematical tools we use are the stereographic dilation, which implements the scale variations of the scale-space representation, and filter steerability on the sphere, which reduces the computational complexity. The texture distortions caused by the projection from the real world onto the image contain the information from which shape and orientation can be computed. A multi-scale texture descriptor, the windowed second moment matrix, which captures these distortions, is computed and analyzed under some assumptions on the surface texture in order to recover surface orientation. We have used synthetic signals to evaluate the performance of the adapted method. The results obtained for the distance and shape estimations are good when the surface textures satisfy the assumptions, generally around the equatorial plane of S²; however, as we move away from the equator, the precision of the estimations decreases significantly.
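As background, the windowed second moment matrix used as texture descriptor takes, in the planar setting of Lindeberg and Gårding, the form sketched below; the spherical adaptation described above replaces the Gaussian smoothing and window by their counterparts on S² (via stereographic dilation), which is not detailed in this abstract, so the expressions here are only the planar reference case.

\[
\mu_L(q) \;=\; \int_{\mathbb{R}^2} \big(\nabla L(x)\big)\big(\nabla L(x)\big)^{\top}\, g(x - q;\, s)\, dx ,
\qquad L = g(\cdot;\, t) * f ,
\]

where \(f\) is the image brightness, \(t\) the local scale of the scale-space representation \(L\), and \(s\) the integration scale of the Gaussian window \(g\). Under a weak-isotropy assumption on the surface texture, the foreshortening (the cosine of the slant) can be estimated from the eigenvalues \(\lambda_1 \ge \lambda_2\) of \(\mu_L\) through the normalized anisotropy \(\tilde{Q} = (\lambda_1 - \lambda_2)/(\lambda_1 + \lambda_2)\), e.g. \(\hat{\varepsilon} = \sqrt{(1 - \tilde{Q})/(1 + \tilde{Q})}\), while the tilt direction is given by the eigenvector associated with \(\lambda_1\).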
