7 research outputs found

    Exploiting photometric information for planning under uncertainty

    Vision-based localization systems rely on highly textured areas to achieve accurate pose estimation. However, most previous path-planning strategies select trajectories with minimum pose uncertainty by leveraging only the geometric structure of the scene, neglecting the photometric information (i.e., texture). Our planner exploits the scene's visual appearance (i.e., its photometric information) in combination with its 3D geometry. Furthermore, we assume no prior knowledge of the environment, meaning that no pre-computed map or 3D geometry is available. We introduce a novel approach to update the optimal plan on the fly as new visual information is gathered. We demonstrate our approach with real and simulated Micro Aerial Vehicles (MAVs) that perform perception-aware path planning in real time during exploration, and show significantly reduced pose uncertainty compared with trajectories planned without considering the robot's perception.
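
    The core idea in the abstract is to score candidate trajectories by how much photometric information (texture) their views are expected to provide, rather than by geometry alone. Below is a minimal sketch of that idea, assuming a hypothetical predict_view interface that returns per-pixel image gradients and the projected pixel locations of known 3D points for a waypoint; it is an illustration under these assumptions, not the paper's actual implementation.

        import numpy as np

        def photometric_information(image_gradients, visible_points_px):
            """Approximate the localization information a view provides by summing
            squared image-gradient magnitudes at the pixels where known 3D points
            project. Highly textured regions contribute more information
            (hypothetical scoring, not the paper's exact formulation)."""
            h, w = image_gradients.shape[:2]
            info = 0.0
            for u, v in visible_points_px:
                u, v = int(round(u)), int(round(v))
                if 0 <= u < w and 0 <= v < h:
                    gx, gy = image_gradients[v, u]
                    info += gx * gx + gy * gy
            return info

        def select_best_trajectory(candidate_trajectories, predict_view):
            """Pick the candidate trajectory whose accumulated photometric
            information is largest, i.e. the one expected to yield the lowest
            pose uncertainty. `predict_view` maps a waypoint to
            (image_gradients, visible_points_px) and stands in for the
            robot's mapping/rendering module (assumed interface)."""
            best, best_score = None, -np.inf
            for traj in candidate_trajectories:
                score = sum(photometric_information(*predict_view(wp)) for wp in traj)
                if score > best_score:
                    best, best_score = traj, score
            return best

    Because the score is recomputed from whatever views are currently predictable, re-running select_best_trajectory whenever new images arrive gives the on-the-fly replanning behaviour the abstract describes.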

    Long-Term Follow-Up
