12 research outputs found

    Human-guided Swarms: Impedance Control-inspired Influence in Virtual Reality Environments

    Prior work in human-swarm interaction (HSI) has sought to guide swarm behavior toward established objectives, but may be unable to handle scenarios that require finer human supervision, variable autonomy, or scaling to large swarms. In this paper, we present an approach that enables human supervisors to tune the level of swarm control and to guide a large swarm using an assistive control mechanism that does not significantly restrict emergent swarm behaviors. We develop this approach in a virtual reality (VR) environment, using the HTC Vive and Unreal Engine 4 with the AirSim plugin. The novel combination of an impedance control-inspired influence mechanism and a VR test bed enables rapid design and test iterations for examining trade-offs between swarming behavior and macroscopic-scale human influence, while circumventing the flight-duration limits of battery-powered small unmanned aerial systems (sUAS). The impedance control-inspired mechanism was tested by a human supervisor guiding a virtual swarm of 16 sUAS agents. Each test involved moving the swarm's center of mass through narrow canyons that the swarm could not traverse autonomously. Results demonstrate that the influence mechanism enabled successful manipulation of the macro-scale behavior of the swarm toward task completion while maintaining its innate swarming behavior.
    Comment: 11 pages, 5 figures, preprint
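    The impedance-inspired coupling described in this abstract can be illustrated with a minimal sketch. This is our own illustration, not the paper's code: it assumes a virtual second-order impedance M·a = K(x_d − x) − B·v acting on the swarm's center of mass, with an `influence` gain (our name) standing in for the tunable level of human control.

    ```python
    import numpy as np

    def impedance_force(x_com, v_com, x_target, M=1.0, B=2.0, K=4.0):
        """Virtual mass-spring-damper: accel = (K (x_target - x_com) - B v_com) / M."""
        return (K * (x_target - x_com) - B * v_com) / M

    def step(positions, velocities, x_target, dt=0.05, influence=0.5):
        """Nudge every agent with a shared impedance force on the swarm's
        center of mass, rather than overriding each agent's own motion."""
        x_com = positions.mean(axis=0)
        v_com = velocities.mean(axis=0)
        f = impedance_force(x_com, v_com, x_target)
        # 'influence' tunes how strongly the human input shapes the swarm;
        # the agents' innate swarming velocities would be added here too.
        velocities = velocities + influence * f * dt
        positions = positions + velocities * dt
        return positions, velocities
    ```

    With a damping term larger than zero, the center of mass converges smoothly toward the supervisor's target instead of being teleported there, which is what preserves the emergent behavior.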

    Multi-Operator Gesture Control of Robotic Swarms Using Wearable Devices

    The theory and design of effective interfaces for human interaction with multi-robot systems has recently gained significant interest. Robotic swarms are multi-robot systems in which local interactions between robots and their spatial neighbors generate emergent collective behaviors. Most prior work has studied interfaces for human interaction with remote swarms, but swarms also have great potential in applications alongside humans, motivating the need for interfaces for local interaction. Given the collective nature of swarms, human interaction may occur at many levels of abstraction, ranging from swarm behavior selection to teleoperation. Wearable gesture control is an intuitive interaction modality that can meet this requirement while largely keeping the operator's hands unencumbered. In this paper, we present an interaction method that uses a gesture-based wearable device with a limited number of gestures for robust control of a complex system: a robotic swarm. Experiments conducted with a real robot swarm compare performance in single-operator and two-operator conditions. Human operators using our interaction method completed the task in all trials, illustrating the method's effectiveness, with better performance in the two-operator condition, indicating that separation of function benefits our method. The primary contribution of our work is the development and demonstration of interaction methods that allow robust control of a difficult-to-understand multi-robot system using only the noisy inputs typical of smartphones and other on-body, sensor-driven devices.
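    The two ideas in this abstract — a small gesture vocabulary and separation of function between two operators — can be sketched as a simple dispatch. The gesture names and roles below are hypothetical (the paper does not publish this API): one operator selects swarm behaviors while the other coarsely steers.

    ```python
    # Hypothetical gesture vocabulary; the real device would emit these
    # labels from a noisy on-body gesture classifier.
    BEHAVIORS = {"fist": "aggregate", "wave": "disperse", "point": "follow"}

    def dispatch(role, gesture, state):
        """Route a recognized gesture by operator role, so behavior
        selection and steering can be split across two operators."""
        if role == "selector" and gesture in BEHAVIORS:
            state["behavior"] = BEHAVIORS[gesture]
        elif role == "pilot":
            # Steering gestures adjust a shared swarm heading in degrees.
            if gesture == "swipe_left":
                state["heading"] = (state.get("heading", 0.0) + 15.0) % 360
            elif gesture == "swipe_right":
                state["heading"] = (state.get("heading", 0.0) - 15.0) % 360
        return state
    ```

    Keeping the vocabulary small is what makes the scheme robust to misclassification: a gesture outside an operator's role is simply ignored.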

    Human Interaction with Robot Swarms: A Survey

    Recent advances in technology are delivering robots of reduced size and cost. A natural outgrowth of these advances are systems composed of large numbers of robots that collaborate autonomously in diverse applications. Research on effective autonomous control of such systems, commonly called swarms, has increased dramatically in recent years and received attention from many domains, such as bioinspired robotics and control theory. These kinds of distributed systems present novel challenges for the effective integration of human supervisors, operators, and teammates that are only beginning to be addressed. This paper is the first survey of human–swarm interaction (HSI) and identifies the core concepts needed to design a human–swarm system. We first present the basics of swarm robotics. Then, we introduce HSI from the perspective of a human operator by discussing the cognitive complexity of solving tasks with swarm systems. Next, we introduce the interface between swarm and operator and identify challenges and solutions relating to human–swarm communication, state estimation and visualization, and human control of swarms. For the latter, we develop a taxonomy of control methods that enable operators to control swarms effectively. Finally, we synthesize the results to highlight remaining challenges, unanswered questions, and open problems for HSI, as well as how to address them in future work.

    Design, Implementation, and Study of an Autonomous Robot Swarm Protecting a Group of People Equipped with Smart Insoles

    This project addresses the problem of protecting a convoy of people equipped with smart insoles using a swarm of robots. Today, many population flows need protection in high-risk areas (Syrian and Iraqi families, among others). A robot swarm is a solution that could protect them without exposing other people to danger. The swarm must follow the group of people and reject all external disturbances in order to reduce the robots' positioning errors. The smart insole worn by each person, built from several sensors, provides information on their orientation and walking speed. The robots can be equipped with distance sensors and an inertial measurement unit to detect surrounding obstacles and move around the group of people. A drone also provides visual information on the environment around the people. The whole system is based on a network of WiFi modules (WBAN: Wireless Body Area Network) that transmit all the collected data. A server collects all the data received from the robots and the insoles; these data are processed by different algorithms that steer the robots autonomously around the people.
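    The escort idea in this abstract — robots surrounding a walking group — can be sketched geometrically. This is our own illustration with assumed names, not the project's code: robots hold evenly spaced slots on a ring around the group's centroid, rotated to face the walking direction reported by the insoles.

    ```python
    import math

    def escort_slots(centroid, heading_rad, n_robots, radius):
        """Target position for each robot: evenly spaced slots on a circle
        around the group centroid, rotated by the group's walking heading."""
        slots = []
        for i in range(n_robots):
            a = heading_rad + 2 * math.pi * i / n_robots
            slots.append((centroid[0] + radius * math.cos(a),
                          centroid[1] + radius * math.sin(a)))
        return slots
    ```

    In the full system, each robot would servo toward its slot using its distance sensors and IMU while the server updates the centroid and heading from the insole data.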