Learning Social Navigation from Demonstrations with Conditional Neural Processes

Abstract

Sociability is essential for modern robots to increase their acceptability in human environments. Traditional techniques achieve social navigation using manually engineered utility functions inspired by observed pedestrian behaviors. However, the social aspects of navigation are diverse, changing across different types of environments, societies, and population densities, making it unrealistic to hand-craft such techniques for each domain. This paper presents a data-driven navigation architecture that uses a state-of-the-art neural architecture, namely Conditional Neural Processes, to learn global and local controllers for a mobile robot from observations. Additionally, we leverage a state-of-the-art deep prediction mechanism to detect situations dissimilar to those seen during training, where reactive controllers step in to ensure safe navigation. Our results demonstrate that the proposed framework can successfully carry out navigation tasks while respecting the social norms captured in the data. Further, we show that our system produces fewer personal-zone violations, causing less discomfort.
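To make the core learning component concrete, the sketch below shows a minimal Conditional Neural Process (CNP): an encoder maps demonstrated (input, output) pairs to latent vectors, a permutation-invariant aggregation pools them, and a decoder predicts a Gaussian over the output at a query input. This is an illustrative sketch only; the layer sizes, the use of PyTorch, and the state-to-velocity-command mapping are assumptions, not the authors' exact controller architecture.

```python
# Minimal CNP sketch (assumed PyTorch implementation; dimensions are illustrative).
import torch
import torch.nn as nn


class CNP(nn.Module):
    def __init__(self, obs_dim, target_dim, hidden=128, latent=128):
        super().__init__()
        # Encoder: each (input, output) observation pair -> latent representation.
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim + target_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, latent),
        )
        # Decoder: aggregated latent + query input -> Gaussian (mean, log-variance).
        self.decoder = nn.Sequential(
            nn.Linear(latent + obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * target_dim),
        )

    def forward(self, ctx_x, ctx_y, query_x):
        # ctx_x: (N, obs_dim), ctx_y: (N, target_dim), query_x: (M, obs_dim)
        r = self.encoder(torch.cat([ctx_x, ctx_y], dim=-1)).mean(dim=0)  # order-invariant pooling
        r = r.expand(query_x.size(0), -1)
        out = self.decoder(torch.cat([r, query_x], dim=-1))
        mean, log_var = out.chunk(2, dim=-1)
        return mean, log_var


# Usage: condition on a few demonstrated (state, command) pairs,
# then query the predicted command distribution at new states.
model = CNP(obs_dim=4, target_dim=2)
ctx_x, ctx_y = torch.randn(10, 4), torch.randn(10, 2)
mean, log_var = model(ctx_x, ctx_y, torch.randn(5, 4))
```

In a setup like this, the predicted variance could be thresholded to flag inputs unlike the demonstrations, analogous in spirit to the paper's fallback to reactive controllers in unfamiliar situations, though the paper's actual detection mechanism may differ.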
