
    Natural User Interface for Roombots

    Roombots (RB) are self-reconfigurable modular robots designed to study robotic reconfiguration on a structured grid and adaptive locomotion off grid. One of the main goals of this platform is to create adaptive furniture inside living spaces such as homes or offices. To ease the control of RB modules in these environments, we propose a novel, more natural way of interacting with RB modules on an RB grid, called the Natural Roombots User Interface. In our method, the user commands the RB modules using pointing gestures. The user's body is tracked using multiple Kinects, and the user is given real-time visual feedback on their physical actions and the state of the system via LED illumination electronics installed on both the RB modules and the grid. We demonstrate how our interface can be used to efficiently control RB modules in simple point-to-point grid locomotion and conclude by discussing future extensions.
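    The core of such a pointing interface is resolving a tracked arm pose to a cell on the grid. A minimal sketch of that step, assuming Kinect-style shoulder and hand joint positions in metres and a flat floor grid at z = 0 (function names and the cell size are illustrative, not from the paper):

    ```python
    # Hypothetical sketch: cast a ray from the user's shoulder through their
    # hand and intersect it with the floor plane (z = 0) to find which
    # Roombots grid cell the user is pointing at.

    def pointed_cell(shoulder, hand, cell_size=0.5):
        """Return the (col, row) grid cell the user points at, or None."""
        sx, sy, sz = shoulder
        hx, hy, hz = hand
        dx, dy, dz = hx - sx, hy - sy, hz - sz   # pointing direction
        if dz >= 0:           # ray must descend toward the floor
            return None
        t = -sz / dz          # ray parameter where z reaches 0
        fx, fy = sx + t * dx, sy + t * dy        # floor intersection point
        return (int(fx // cell_size), int(fy // cell_size))

    # Shoulder at 1.4 m, hand lower and forward: the ray lands ahead of the user.
    print(pointed_cell((0.0, 0.0, 1.4), (0.3, 0.1, 1.1)))  # → (2, 0)
    ```

    In a real system the selected cell would then be confirmed back to the user via the LED feedback the abstract describes.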

    A Robust Integrated System for Selecting and Commanding Multiple Mobile Robots

    Abstract — We describe a system whereby multiple humans and mobile robots interact robustly using a combination of sensing and signalling modalities. Extending our previous work on selecting an individual robot from a population by face engagement, we show that reaching toward a robot (a specialization of pointing) can be used to designate a particular robot for subsequent one-on-one interaction. To achieve robust operation despite frequent sensing problems, the robots use three phases of human detection and tracking, and emit audio cues to solicit interaction and guide the behaviour of the human. A series of real-world trials demonstrates the practicality of our approach.
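    Reach-based designation reduces to finding which robot's bearing best matches the reach direction. A minimal sketch under that assumption, working in 2D with an angular tolerance (the function name and threshold are illustrative, not taken from the paper):

    ```python
    # Hypothetical sketch: among known robot positions, select the one whose
    # bearing from the user lies closest to the reaching-arm direction,
    # rejecting candidates outside an angular tolerance.
    import math

    def select_robot(user, reach_dir, robots, max_angle_deg=20.0):
        """Return the index of the robot being reached toward, or None."""
        best, best_angle = None, math.radians(max_angle_deg)
        rx, ry = reach_dir
        rnorm = math.hypot(rx, ry)
        for i, (x, y) in enumerate(robots):
            vx, vy = x - user[0], y - user[1]
            vnorm = math.hypot(vx, vy)
            if rnorm == 0 or vnorm == 0:
                continue
            # angle between the reach vector and the bearing to robot i
            cos_a = max(-1.0, min(1.0, (rx * vx + ry * vy) / (rnorm * vnorm)))
            angle = math.acos(cos_a)
            if angle < best_angle:   # smallest bearing error wins
                best, best_angle = i, angle
        return best

    # Reaching along +x: the robot almost directly ahead is designated.
    robots = [(0.0, 2.0), (3.0, 0.1), (-2.0, -1.0)]
    print(select_robot((0.0, 0.0), (1.0, 0.0), robots))  # → 1
    ```

    The tolerance plays the role of the robustness phases the abstract mentions: an ambiguous or off-axis reach selects no robot rather than the wrong one.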