
    Hand-worn Haptic Interface for Drone Teleoperation

    Drone teleoperation is usually accomplished with remote radio controllers, devices that can be hard to master for inexperienced users. Moreover, the limited information fed back to the user about the robot's state, often restricted to vision, can be a bottleneck for operation in several conditions. In this work, we present a wearable interface for drone teleoperation and its evaluation through a user study. The two main features of the proposed system are a data glove that allows the user to control the drone trajectory by hand motion and a haptic system that augments the user's awareness of the environment surrounding the robot. This interface can be employed for the operation of robotic systems in line of sight (LoS) by inexperienced operators and allows them to safely perform tasks common in inspection and search-and-rescue missions, such as approaching walls and crossing narrow passages under limited visibility. In addition to the design and implementation of the wearable interface, we performed a systematic assessment of the system through three user studies (n = 36) that evaluated the users' learning path and their ability to perform tasks with limited visibility. We validated our ideas in both simulated and real-world environments. Our results demonstrate that the proposed system can improve teleoperation performance in different cases compared to standard remote controllers, making it a viable alternative to standard Human-Robot Interfaces.
    Comment: Accepted at the IEEE International Conference on Robotics and Automation (ICRA) 202
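
    The abstract does not spell out the control mapping, so the sketch below is a minimal illustration, assuming an IMU-equipped glove that reports hand attitude: tilt angles become body-frame velocity setpoints with a dead zone against unintentional motion, and a second helper shows one plausible haptic law where vibration grows as obstacles get closer. All names, gains, and thresholds are assumptions, not the paper's implementation.

        # Minimal sketch: glove attitude (radians) -> drone velocity setpoints.
        # Gains, dead zone, and the haptic law are illustrative assumptions.
        def hand_to_velocity(roll, pitch, yaw, gain=1.5, dead_zone=0.1):
            """Map hand attitude (rad) to velocity commands (m/s, rad/s)."""
            def shaped(angle):
                # Suppress small unintentional tilts, then scale linearly.
                if abs(angle) < dead_zone:
                    return 0.0
                sign = 1.0 if angle > 0.0 else -1.0
                return gain * (angle - sign * dead_zone)

            vx = shaped(pitch)  # tilt forward/back -> fly forward/back
            vy = shaped(roll)   # tilt left/right   -> fly sideways
            wz = shaped(yaw)    # rotate hand       -> yaw rate
            return vx, vy, wz

        def haptic_intensity(obstacle_distance, max_range=2.0):
            """Vibration intensity in [0, 1]; stronger when obstacles are closer."""
            return max(0.0, min(1.0, 1.0 - obstacle_distance / max_range))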

    Body swarm interface (BOSI) : controlling robotic swarms using human bio-signals

    Traditionally, robots are controlled using devices like joysticks, keyboards, mice, and other similar human-computer interface (HCI) devices. Although this approach is effective and practical in some cases, it is restricted to healthy individuals without disabilities, and it requires the user to master the device before using it. Controlling multiple robots simultaneously with these traditional devices, as in Human Swarm Interfaces (HSI), becomes complicated and non-intuitive. This work presents a novel concept of using human bio-signals to control swarms of robots. This concept has two major advantages: first, it gives amputees and people with certain disabilities the ability to control robotic swarms, which has previously not been possible; second, it gives the user a more intuitive interface for controlling swarms of robots through gestures, thoughts, and eye movement. We measure different bio-signals from the human body, including Electroencephalography (EEG), Electromyography (EMG), and Electrooculography (EOG), using off-the-shelf products. After minimal signal processing, we decode the intended control action using machine learning techniques such as Hidden Markov Models (HMM) and K-Nearest Neighbors (K-NN). We employ formation controllers based on distance and displacement to control the shape and motion of the robotic swarm. The thought and gesture classifications are compared against ground truth, and the resulting pipelines are evaluated in both simulations and hardware experiments with swarms of ground robots and aerial vehicles.
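
    The abstract names K-NN as one decoder but not its features, so the following is a minimal sketch of that branch, assuming windowed multi-channel EMG reduced to per-channel RMS features; scikit-learn's KNeighborsClassifier stands in for the paper's classifier, and the gesture-to-swarm-command mapping in the closing comment is hypothetical.

        # Minimal sketch: classify gestures from windowed EMG via K-NN.
        # Window shape, features, and labels are illustrative assumptions.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def rms_features(window):
            """Per-channel RMS of one EMG window of shape (samples, channels)."""
            return np.sqrt(np.mean(np.square(window), axis=0))

        def train_gesture_classifier(labelled_windows, k=5):
            """labelled_windows: iterable of (window, label) calibration pairs."""
            X = np.array([rms_features(w) for w, _ in labelled_windows])
            y = np.array([label for _, label in labelled_windows])
            clf = KNeighborsClassifier(n_neighbors=k)
            clf.fit(X, y)
            return clf

        # At runtime each decoded label would map to a swarm-level command,
        # e.g. {"fist": "contract_formation", "open_hand": "expand_formation"}.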

    Towards Autonomous Firefighting UAVs: Online Planners for Obstacle Avoidance and Payload Delivery

    Drone technology is advancing rapidly and offers significant benefits for firefighting operations. This paper presents a novel approach to autonomous firefighting missions with Unmanned Aerial Vehicles (UAVs). The proposed UAV framework consists of a local planner module that finds an obstacle-free path to guide the vehicle toward a target zone. After detecting the target point, the UAV plans an optimal trajectory to perform a precision ballistic launch of an extinguishing ball, exploiting the ball's kinematics. The generated trajectory minimises the overall traversal time and the final state error while respecting the UAV's dynamic limits. The performance of the proposed system is evaluated both in simulations and in real tests with randomly positioned obstacles and target locations. The proposed framework was employed in the 2022 UAV Competition of the International Conference on Unmanned Aircraft Systems (ICUAS), where it successfully completed the mission in several runs of increasing difficulty, both in simulation and in real scenarios, achieving third place overall. A video attachment to this paper is available at https://www.youtube.com/watch?v=_hdxX2xXkVQ
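
    As a worked example of the launch kinematics the paper exploits, the sketch below computes the release point for the extinguishing ball under a drag-free projectile model; the actual planner optimises a full trajectory, so this captures only the underlying geometry.

        # Minimal sketch: drag-free ballistic drop from a moving UAV.
        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def release_distance(altitude_m, horizontal_speed_ms):
            """Horizontal distance before the target at which to release the ball."""
            fall_time = math.sqrt(2.0 * altitude_m / G)  # time to reach the ground
            return horizontal_speed_ms * fall_time       # distance covered while falling

        # Example: at 10 m altitude and 5 m/s forward speed, release ~7.1 m early.
        print(round(release_distance(10.0, 5.0), 2))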

    Control of a drone with body gestures

    Drones are becoming more popular in military applications and in civil aviation, among both hobbyists and businesses. Achieving natural Human-Drone Interaction (HDI) would enable unskilled pilots to take part in flying these devices and, more generally, ease the use of drones. The research in this paper focuses on the design and development of a Natural User Interface (NUI) that allows a user to pilot a drone with body gestures. A Microsoft Kinect was used to capture the user's body information, which was processed by a motion recognition algorithm and converted into commands for the drone. A Graphical User Interface (GUI) gives feedback to the user: visual feedback from the drone's onboard camera is provided on a screen, and an interactive menu, itself controlled by body gestures, offers functionalities such as photo and video capture or take-off and landing. This research resulted in an efficient and functional system that is more instinctive, natural, immersive, and fun than piloting with a physical controller, and adds innovative aspects such as extra piloting functionalities and control of the flight speed.
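
    The abstract does not list the gesture vocabulary, so here is a minimal sketch of one plausible mapping from Kinect skeleton joints to drone commands; the joint choice, thresholds, and command set are assumptions for illustration.

        # Minimal sketch: vertical hand positions (metres, Kinect skeleton
        # space, y pointing up) relative to the shoulders select a command.
        def classify_gesture(left_hand_y, right_hand_y,
                             left_shoulder_y, right_shoulder_y, margin=0.15):
            """Return a command string from vertical hand offsets."""
            left_up = left_hand_y > left_shoulder_y + margin
            right_up = right_hand_y > right_shoulder_y + margin
            if left_up and right_up:
                return "takeoff"
            if right_up:
                return "ascend"
            if left_up:
                return "descend"
            return "hover"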

    Advances in Human Robot Interaction for Cloud Robotics applications

    This thesis analyzes different and innovative techniques for Human-Robot Interaction, with a focus on interaction with flying robots. The first part is a preliminary survey of state-of-the-art interaction techniques. The first project, Fly4SmartCity, analyzes the interaction between humans (the citizen and the operator) and drones mediated by a cloud robotics platform. Next comes an application of the sliding-autonomy paradigm and an analysis of the different degrees of autonomy supported by a cloud robotics platform. The last part is dedicated to the most innovative technique for human-drone interaction, the User's Flying Organizer project (UFO project), which aims to develop a flying robot able to project information into the environment by exploiting concepts of Spatial Augmented Reality.

    Using Programmable Drone in Educational Projects and Competitions

    The mainstream of educational robotics platforms orbits around the various versions of versatile robotics sets and kits, while interesting outliers add new opportunities and extend the possible learning situations. Examples are reconfigurable robots, rolling-sphere robots, humanoids, and swimming or underwater robots. Another kind within this category are flying drones. While remotely controlled drones have been a very attractive target for hobby model makers for a long time, they have seldom been used in educational scenarios as robots that children program to perform various simple tasks. A milestone was reached with the introduction of the educational drone Tello, which can be programmed even in Scratch, or in general-purpose languages such as Node.js or Python. Programs can access the robot's sensors, which are used by the underlying layers of the controller. In addition, they can acquire images from the drone camera and act on the frames by applying computer vision algorithms. We have been using this drone in an educational robotics competition for three years without a camera, and after our students developed several successful projects that utilized a camera, we prepared a new competition challenge that requires its use. In this article, we summarize related efforts and our experiences with educational drones and their use in student projects and competitions.
    Comment: This work was co-funded by the Horizon-Widera-2021 European Twinning project TERAIS G.A. n. 101079338. Open Access code and data described in the paper are available at: https://doi.org/10.5281/zenodo.10715699 and https://github.com/RoboCup-Junior-Slovensko/softverova-podpora/tree/master/drone-robocup202
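
    As an illustration of the camera-based programming the article describes, here is a minimal sketch using the community djitellopy library (one common way to script a Tello from Python; the competition tooling may differ). A trivial brightness check stands in for a real computer-vision task.

        # Minimal sketch: Tello flight with one camera-driven decision.
        # Assumes the djitellopy package; the brightness test is a placeholder.
        import cv2
        from djitellopy import Tello

        tello = Tello()
        tello.connect()
        tello.streamon()                      # start the video stream
        tello.takeoff()

        frame = tello.get_frame_read().frame  # latest decoded frame (numpy array)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if gray.mean() > 100:                 # placeholder "vision" decision
            tello.rotate_clockwise(90)

        tello.land()
        tello.streamoff()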