Robot Navigation in Unseen Spaces using an Abstract Map
Human navigation in built environments depends on symbolic spatial
information which has unrealised potential to enhance robot navigation
capabilities. Information sources such as labels, signs, maps, planners, spoken
directions, and navigational gestures communicate a wealth of spatial
information to the navigators of built environments; a wealth of information
that robots typically ignore. We present a robot navigation system that uses
the same symbolic spatial information employed by humans to purposefully
navigate in unseen built environments with a level of performance comparable to
humans. The navigation system uses a novel data structure called the abstract
map to imagine malleable spatial models for unseen spaces from spatial symbols.
Sensorimotor perceptions from a robot are then employed to provide purposeful
navigation to symbolic goal locations in the unseen environment. We show how a
dynamic system can be used to create malleable spatial models for the abstract
map, and provide an open source implementation to encourage future work in the
area of symbolic navigation. Symbolic navigation performance of humans and a
robot is evaluated in a real-world built environment. The paper concludes with
a qualitative analysis of human navigation strategies, providing further
insights into how the symbolic navigation capabilities of robots in unseen
built environments can be improved in the future.
Comment: 15 pages, published in IEEE Transactions on Cognitive and Developmental Systems (http://doi.org/10.1109/TCDS.2020.2993855); see https://btalb.github.io/abstract_map/ for access to software.
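The open-source implementation linked above is the reference for the abstract map itself; purely as a hedged illustration of how a dynamic system can yield a "malleable" spatial model, the sketch below treats each symbolic place as a point mass and each piece of symbolic spatial information as a spring with a preferred length, then lets the layout settle by damped integration. Every class name, parameter, and the two-place example are invented for illustration and are not taken from the paper's code.

```python
import numpy as np

class MalleablePlace:
    """A place hypothesized from a symbol (label, sign, spoken direction)."""
    def __init__(self, name, position=None):
        self.name = name
        # Unseen places start at a guessed position; observations can pin them later.
        self.position = np.array(position if position is not None else
                                 np.random.randn(2), dtype=float)
        self.velocity = np.zeros(2)
        self.fixed = False  # set True once the robot actually observes the place

class SpringConstraint:
    """Encodes a spatial hint, e.g. 'Room 302 is about 10 m past the lobby',
    as a preferred distance between two places."""
    def __init__(self, a, b, rest_length, stiffness=1.0):
        self.a, self.b, self.rest_length, self.stiffness = a, b, rest_length, stiffness

def relax(places, constraints, dt=0.05, damping=0.8, steps=200):
    """Damped spring relaxation: the layout 'imagines' where unseen places may lie."""
    for _ in range(steps):
        forces = {p.name: np.zeros(2) for p in places}
        for c in constraints:
            delta = c.b.position - c.a.position
            dist = np.linalg.norm(delta) + 1e-9
            force = c.stiffness * (dist - c.rest_length) * (delta / dist)  # Hooke's law
            forces[c.a.name] += force
            forces[c.b.name] -= force
        for p in places:
            if not p.fixed:  # observed places stay anchored
                p.velocity = damping * (p.velocity + dt * forces[p.name])
                p.position += dt * p.velocity
    return {p.name: p.position.copy() for p in places}

# Tiny example: the lobby has been observed (anchored), 'Room 302' is only imagined.
lobby = MalleablePlace("Lobby", position=[0.0, 0.0]); lobby.fixed = True
room = MalleablePlace("Room 302")
layout = relax([lobby, room], [SpringConstraint(lobby, room, rest_length=10.0)])
```

New symbols simply add constraints, so the model stays malleable: the layout is re-relaxed as the robot reads more signs or observes places directly.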
Look, Listen, and Act: Towards Audio-Visual Embodied Navigation
A crucial ability of mobile intelligent agents is to integrate the evidence
from multiple sensory inputs in an environment and to make a sequence of
actions to reach their goals. In this paper, we attempt to approach the problem
of Audio-Visual Embodied Navigation, the task of planning the shortest path
from a random starting location in a scene to the sound source in an indoor
environment, given only raw egocentric visual and audio sensory data. To
accomplish this task, the agent is required to learn from various modalities,
i.e., to relate the audio signal to the visual environment. Here we describe an
approach to audio-visual embodied navigation that takes advantage of both
visual and audio evidence. Our solution is based on three key ideas:
a visual perception mapper module that constructs its spatial memory of the
environment, a sound perception module that infers the location of the sound
source relative to the agent, and a dynamic path planner that plans a sequence
of actions based on the audio-visual observations and the spatial memory of the
environment to navigate toward the goal. Experimental results on a newly
collected Visual-Audio-Room dataset using the simulated multi-modal environment
demonstrate the effectiveness of our approach over several competitive
baselines.
Comment: Accepted by ICRA 2020. Project page: http://avn.csail.mit.ed
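As a hedged sketch of how the three described modules might fit together, the toy code below keeps a grid-cell spatial memory (the visual mapper), wraps a bearing estimate to the sound source (the sound perception module), and greedily steps to the known-free neighbor most aligned with that bearing (the dynamic planner). The module interfaces, the `env` object, and the grid abstraction are all assumptions made for illustration and do not reflect the authors' architecture.

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SpatialMemory:
    """Visual perception mapper stand-in: accumulates grid cells seen to be free."""
    free_cells: set = field(default_factory=set)

    def update(self, pose: Tuple[int, int], visible_cells: List[Tuple[int, int]]):
        self.free_cells.add(pose)
        self.free_cells.update(visible_cells)

def localize_sound(audio_frame) -> float:
    """Sound perception stand-in: return an estimated bearing (radians) to the source.
    A real module would infer this from raw binaural audio, e.g. with a learned model."""
    return audio_frame["bearing_estimate"]

def plan_step(pose: Tuple[int, int], bearing: float, memory: SpatialMemory) -> Tuple[int, int]:
    """Dynamic planner stand-in: move to the known-free neighbor best aligned with the bearing."""
    best, best_score = pose, -math.inf
    for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        nxt = (pose[0] + dx, pose[1] + dy)
        if nxt not in memory.free_cells:
            continue
        score = math.cos(bearing - math.atan2(dy, dx))  # alignment with sound direction
        if score > best_score:
            best, best_score = nxt, score
    return best

def navigate(env, start: Tuple[int, int], max_steps: int = 200) -> Tuple[int, int]:
    """Agent loop tying the three modules together. `env` is a hypothetical simulator
    exposing observe(pose) -> (visible_cells, audio_frame) and at_goal(pose) -> bool."""
    memory, pose = SpatialMemory(), start
    for _ in range(max_steps):
        visible_cells, audio_frame = env.observe(pose)
        memory.update(pose, visible_cells)
        pose = plan_step(pose, localize_sound(audio_frame), memory)
        if env.at_goal(pose):
            break
    return pose
```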
Routing Diverse Evacuees with Cognitive Packets
This paper explores the idea of smart building evacuation when evacuees can
belong to different categories with respect to their ability to move and their
health conditions. This leads to new algorithms that use the Cognitive Packet
Network concept to tailor quality of service to the differing needs of
evacuees. These ideas are implemented in a simulated environment and evaluated
with regard to their effectiveness.
Comment: 7 pages, 7 figures
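The paper's routing is built on the Cognitive Packet Network, where smart packets explore the building graph and adapt paths online against a per-class quality-of-service goal function. As a hedged, much-simplified stand-in for that idea, the sketch below merely runs Dijkstra with edge costs weighted differently per evacuee category; the category names, edge attributes, and weights are invented for illustration, and the adaptive, packet-based search of a real CPN is not reproduced.

```python
import heapq

# Hypothetical per-category weights over edge attributes. A CPN goal function would
# combine such metrics online; here it is just a fixed weighted sum for illustration.
GOAL_WEIGHTS = {
    "able_bodied":       {"length": 1.0, "congestion": 0.5, "hazard": 2.0, "stairs": 0.2},
    "mobility_impaired": {"length": 1.0, "congestion": 1.5, "hazard": 4.0, "stairs": 50.0},
}

def edge_cost(attrs, category):
    """Category-specific 'goal function' value of one corridor or staircase segment."""
    weights = GOAL_WEIGHTS[category]
    return sum(w * attrs.get(key, 0.0) for key, w in weights.items())

def route(graph, start, exits, category):
    """Cheapest evacuation path under the category-specific goal function.
    graph: node -> {neighbor: attribute dict}; exits: set of exit nodes."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in exits:                      # reconstruct the path to this exit
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path)), d
        if d > dist.get(node, float("inf")):
            continue
        for nxt, attrs in graph.get(node, {}).items():
            nd = d + edge_cost(attrs, category)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    return None, float("inf")
```

With weights like these, a short route through a smoky stairwell can remain acceptable for an able-bodied evacuee while a mobility-impaired evacuee is routed toward a longer but stair-free corridor.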
A biologically inspired meta-control navigation system for the Psikharpax rat robot
A biologically inspired navigation system for the mobile rat-like robot named Psikharpax is presented, allowing for self-localization and autonomous navigation in an initially unknown environment. The ability of parts of the model (e.g. the strategy selection mechanism) to reproduce rat behavioral data in various maze tasks has previously been validated in simulation, but the capacity of the model to work on a real robot platform had not been tested. This paper presents our implementation on the Psikharpax robot of two independent navigation strategies (a place-based planning strategy and a cue-guided taxon strategy) and a strategy selection meta-controller. We show how the robot can memorize which strategy was optimal in each situation by means of a reinforcement learning algorithm. In addition, a context detector enables the controller to quickly adapt to changes in the environment, recognized as new contexts, and to restore previously acquired strategy preferences when an earlier context is recognized again. This produces adaptivity closer to rat behavioral performance and constitutes a computational proposition for the role of the rat prefrontal cortex in strategy shifting. Such a brain-inspired meta-controller may also provide an advancement for learning architectures in robotics.
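As a hedged sketch of the meta-control loop described above (not the authors' brain-inspired model, which is considerably more elaborate), the code below keeps a separate value table per context over the two strategies, selects a strategy greedily with occasional exploration, updates values from reward, and uses a trivial context detector that restores stored preferences whenever a known cue signature recurs. All names and parameters are assumptions for illustration.

```python
import random
from collections import defaultdict

STRATEGIES = ["planning", "taxon"]  # place-based planning vs. cue-guided taxon strategy

class StrategyMetaController:
    """Per-context strategy values learned with a simple running-average rule."""

    def __init__(self, alpha=0.1, epsilon=0.1):
        self.values = defaultdict(lambda: {s: 0.0 for s in STRATEGIES})  # context -> values
        self.known_contexts = {}   # cue signature -> context id
        self.context = None
        self.alpha, self.epsilon = alpha, epsilon

    def detect_context(self, cue_signature):
        """Context detector stand-in: cue_signature is any hashable summary of the
        current environment. A recurring signature restores the preferences learned
        for that context; an unseen signature opens a fresh one."""
        if cue_signature not in self.known_contexts:
            self.known_contexts[cue_signature] = len(self.known_contexts)
        self.context = self.known_contexts[cue_signature]
        return self.context

    def select_strategy(self):
        if random.random() < self.epsilon:
            return random.choice(STRATEGIES)          # occasional exploration
        values = self.values[self.context]
        return max(values, key=values.get)            # exploit the best-known strategy

    def update(self, strategy, reward):
        values = self.values[self.context]
        values[strategy] += self.alpha * (reward - values[strategy])
```

A trial would then call `detect_context(cues)`, pick a strategy with `select_strategy()`, run it on the robot, and feed the outcome back through `update(strategy, reward)`.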