
Self localization and mapping using optical and thermal imagery

By Marina Magnabosco


Given a mobile robot starting from an unknown position in an unknown environment, tasked with exploring its surroundings, the robot must be able to build a map of the environment and localize itself within that map. Solving this problem enables the exploration of areas that are dangerous or inaccessible to humans. In our implementation we use two primary sensors for environment exploration: an optical camera and a thermal camera. Prior work on the combined use of optical and thermal sensors for the Simultaneous Localization And Mapping (SLAM) problem is limited. The main innovative aspect of this work is the use of a secondary thermal camera as an additional visual sensor for navigation under varying environmental conditions. A second innovative aspect is that we treat the two cameras as separate and independent sensors, combining their information only in the final stage of environmental mapping. During navigation the two cameras capture images of the environment, and SURF feature points are extracted and matched between successive scenes. Following a prior bearing-only SLAM approach as a reference, a feature initialization method is implemented that allows each good new candidate feature (optical or thermal) to be initialized with a sum of Gaussians representing the set of possible spatial positions of the detected feature. Using successive observations, the environment coordinates of the feature can be estimated and added to the dynamic state vector of an Extended Kalman Filter (EKF). The EKF state vector contains the six-degree-of-freedom pose of the mobile robot and the coordinates of the environmental landmarks. This information is maintained by the EKF algorithm, a statistical method that predicts the state vector and updates it from the available sensor measurements.
The final methodology is tested in indoor and outdoor environments, under several different lighting conditions and robot trajectories, and produces results that are robust to noise in the images and in the other sensor data (i.e. encoders and GPS). The use of the thermal camera increases the number of landmarks detected during navigation, adding useful information about the explored area.
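The bearing-only feature initialization and EKF predict/update cycle described in the abstract can be sketched as follows. This is a simplified planar (x, y, heading) illustration with assumed motion and observation models, not the thesis's full six-degree-of-freedom implementation; the function names (`ekf_predict`, `ekf_update_bearing`, `init_landmark_hypotheses`) and all noise parameters are illustrative.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Propagate robot pose (x, y, theta) with odometry u = (dist, dtheta).
    State x = [xr, yr, theta, lx1, ly1, ...]; Q is 3x3 process noise on the pose."""
    dist, dtheta = u
    theta = x[2]
    x = x.copy()
    x[0] += dist * np.cos(theta)
    x[1] += dist * np.sin(theta)
    x[2] += dtheta
    # Jacobian of the motion model: identity except for the pose/heading coupling
    F = np.eye(len(x))
    F[0, 2] = -dist * np.sin(theta)
    F[1, 2] = dist * np.cos(theta)
    P = F @ P @ F.T
    P[:3, :3] += Q          # process noise acts on the robot pose only
    return x, P

def ekf_update_bearing(x, P, z, lm_idx, R):
    """Update with a bearing-only observation z (rad) of landmark lm_idx."""
    lx, ly = x[3 + 2 * lm_idx], x[4 + 2 * lm_idx]
    dx, dy = lx - x[0], ly - x[1]
    q = dx * dx + dy * dy
    z_hat = np.arctan2(dy, dx) - x[2]      # predicted bearing, robot frame
    # Jacobian of the bearing w.r.t. robot pose and landmark position
    H = np.zeros((1, len(x)))
    H[0, 0] = dy / q
    H[0, 1] = -dx / q
    H[0, 2] = -1.0
    H[0, 3 + 2 * lm_idx] = -dy / q
    H[0, 4 + 2 * lm_idx] = dx / q
    # Innovation, wrapped to (-pi, pi]
    innov = np.array([np.arctan2(np.sin(z - z_hat), np.cos(z - z_hat))])
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ innov
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def init_landmark_hypotheses(robot_xy, theta, bearing, depths, sigma_frac=0.3):
    """Sum-of-Gaussians initialization for a bearing-only observation:
    one (mean, variance) hypothesis per candidate depth along the viewing ray.
    Successive observations would then prune hypotheses until one remains."""
    ray = np.array([np.cos(theta + bearing), np.sin(theta + bearing)])
    return [(robot_xy + d * ray, (sigma_frac * d) ** 2) for d in depths]
```

Because optical and thermal features are initialized the same way, landmarks from either camera can share one EKF state vector, which is what allows the two sensors to be fused at the mapping stage.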

Publisher: Cranfield University
Year: 2011
Provided by: Cranfield CERES

