
    Where is My Stuff? An Interactive System for Spatial Relations

    In this paper we present a system that detects and tracks objects and agents, computes spatial relations, and communicates those relations to the user using speech. Our system detects multiple objects and agents at 30 frames per second using an RGBD camera. It extracts the spatial relations in, on, next to, near, and belongs to, and communicates these relations using natural language. The notion of belonging is particularly important for Human-Robot Interaction, since it allows the robot to ground the language and reason about the right objects. Although our system is currently static and targeted at a fixed location in a room, we plan to port it to a mobile robot, thus allowing it to explore the environment and build a spatial knowledge base.
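
    As a rough illustration of how such relations could be derived from tracked objects, the sketch below classifies pairwise relations from axis-aligned 3D bounding boxes. The Box3D type, the thresholds, and the relation definitions are illustrative assumptions, not the paper's actual method.

    # Minimal sketch (assumptions): axis-aligned 3D boxes with center (x, y, z)
    # and extents (w, d, h) in meters; thresholds are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Box3D:
        x: float; y: float; z: float   # center
        w: float; d: float; h: float   # width, depth, height

    def _overlap_xy(a: Box3D, b: Box3D) -> bool:
        # True if the horizontal footprints of the two boxes overlap.
        return (abs(a.x - b.x) < (a.w + b.w) / 2 and
                abs(a.y - b.y) < (a.d + b.d) / 2)

    def relation(a: Box3D, b: Box3D, near_dist: float = 0.5) -> str:
        """Return a coarse spatial relation of object a with respect to b."""
        a_bottom, a_top = a.z - a.h / 2, a.z + a.h / 2
        b_bottom, b_top = b.z - b.h / 2, b.z + b.h / 2
        if _overlap_xy(a, b):
            # "in": a's vertical span is contained inside b's.
            if a_bottom >= b_bottom and a_top <= b_top:
                return "in"
            # "on": a rests just above b's top surface.
            if abs(a_bottom - b_top) < 0.05:
                return "on"
        # "next to" / "near": based on horizontal center distance.
        gap = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
        if gap < near_dist:
            return "next to"
        if gap < 2 * near_dist:
            return "near"
        return "far from"

    # Example: a cup whose bottom touches the table's top surface is "on" it.
    table = Box3D(0.0, 0.0, 0.40, 1.2, 0.8, 0.80)
    cup   = Box3D(0.1, 0.0, 0.85, 0.08, 0.08, 0.10)
    print(f"cup is {relation(cup, table)} the table")   # -> cup is on the table

    The string returned by such a classifier could then be slotted into a sentence template and passed to a speech synthesizer to produce the spoken relation.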