Nowadays, several sensors and mechanisms are available to estimate a mobile robot's trajectory and location with respect to its surroundings. Absolute positioning mechanisms are usually the most accurate, but they are also the most expensive and require pre-installed equipment in the environment. Therefore, a system capable of measuring its own motion and location within the environment (relative positioning) has been a research goal since the beginning of autonomous vehicles. With the increase in computational performance, computer vision has become faster, and it has therefore become feasible to incorporate it into a mobile robot. In feature-based visual odometry approaches, model estimation requires the absence of feature association outliers to yield an accurate motion estimate. Outlier rejection is a delicate process, since there is always a trade-off between the speed and the reliability of the system.
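As a rough illustration of outlier rejection in feature-based visual odometry (not the method developed in this dissertation), the sketch below runs a classic RANSAC loop over 2D point correspondences, assuming the inter-frame motion can be modelled as a planar rotation and translation; the function names, thresholds, and iteration counts are illustrative only.

```python
import numpy as np

def estimate_rigid_2d(src, dst):
    """Least-squares 2D rotation + translation mapping src points onto dst (Kabsch)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def ransac_rigid_2d(src, dst, n_iters=200, inlier_thresh=2.0, seed=None):
    """Classic RANSAC: fit minimal samples, keep the hypothesis with the most inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    R, t = np.eye(2), np.zeros(2)
    for _ in range(n_iters):
        idx = rng.choice(len(src), size=2, replace=False)    # 2 matches fix a planar motion
        R_h, t_h = estimate_rigid_2d(src[idx], dst[idx])
        resid = np.linalg.norm(dst - (src @ R_h.T + t_h), axis=1)
        inliers = resid < inlier_thresh                       # residual threshold in pixels
        if inliers.sum() > best_inliers.sum():
            best_inliers, R, t = inliers, R_h, t_h
    if best_inliers.sum() >= 2:                               # refine on the consensus set
        R, t = estimate_rigid_2d(src[best_inliers], dst[best_inliers])
    return R, t, best_inliers
```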
This dissertation proposes an indoor 2D positioning system based on Visual Odometry. The mobile robot has a camera pointed at the ceiling for image analysis. As a requirement, the ceiling and the floor (where the robot moves) must be planar. In the literature, RANSAC is a widely used method for outlier rejection; however, it can be slow in critical circumstances. Therefore, a new algorithm is proposed that accelerates RANSAC while maintaining its reliability. The algorithm, called FMBF, consists of comparing image texture patterns between pictures and preserving the most similar ones. There are several types of comparisons, with different computational costs and reliabilities. FMBF manages those comparisons in order to optimize the trade-off between speed and reliability.
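FMBF itself is defined later in the dissertation; purely as an illustration of texture-pattern comparisons with different cost and reliability profiles, the hypothetical sketch below contrasts a cheap sum-of-absolute-differences measure with a costlier normalized cross-correlation, and a simple two-stage check that only pays for the expensive comparison when the cheap one passes. None of these names or thresholds come from the dissertation.

```python
import numpy as np

def sad(patch_a, patch_b):
    """Sum of absolute differences: cheap, but sensitive to illumination changes."""
    return float(np.abs(patch_a.astype(float) - patch_b.astype(float)).sum())

def ncc(patch_a, patch_b):
    """Normalized cross-correlation: costlier, but robust to brightness/contrast shifts."""
    a = patch_a.astype(float) - patch_a.mean()
    b = patch_b.astype(float) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def similar(patch_a, patch_b, sad_max=2000.0, ncc_min=0.8):
    """Hypothetical two-stage strategy: discard clearly dissimilar patches with the
    cheap measure first, and only run the expensive one on the survivors."""
    return sad(patch_a, patch_b) < sad_max and ncc(patch_a, patch_b) > ncc_min
```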