169 research outputs found

    A Human-Embodied Drone for Dexterous Aerial Manipulation

    Current drones perform a wide variety of tasks in surveillance, photography, agriculture, package delivery, etc. However, these tasks are performed passively, without human interaction. Aerial manipulation shifts this paradigm by equipping drones with robotic arms that allow them to interact with the environment rather than simply sense it. For example, in construction, aerial manipulation in conjunction with human interaction could allow operators to perform several tasks via a drone, such as hosing decks, drilling into surfaces, and sealing cracks. This integration with drones will henceforth be known as dexterous aerial manipulation. Our recent work integrated the worker’s experience into aerial manipulation using haptic technology. The net effect was that such a system enabled the worker to leverage drones and complete tasks on the task site remotely while utilizing haptics. However, the tasks were completed within the operator’s line of sight. Until now, immersive AR/VR frameworks have rarely been integrated into aerial manipulation. Yet such a framework allows the drone to embody and transport the operator’s senses, actions, and presence to a remote location in real time. As a result, the operator can both physically interact with the environment and socially interact with actual workers on the worksite. This dissertation presents a human-embodied drone interface for dexterous aerial manipulation. Using VR/AR technology, the interface allows the operator to leverage their intelligence to collaboratively perform desired tasks anytime, anywhere with a drone that possesses great dexterity.

    Human-Machine Interfaces for Service Robotics

    The abstract is in the attachment.

    MorphoArms: Morphogenetic Teleoperation of Multimanual Robot

    Nowadays, there are few unmanned aerial vehicles (UAVs) capable of flying, walking, and grasping. A drone with all of these functionalities can significantly improve its performance in complex tasks such as monitoring and exploring different types of terrain, and in rescue operations. This paper presents MorphoArms, a novel system that consists of a morphogenetic chassis and a hand-gesture-recognition teleoperation system. The mechanics, electronics, control architecture, and walking behavior of the morphogenetic chassis are described. The robot is capable of walking and grasping objects using four robotic limbs. Robotic limbs with four degrees of freedom are used as pedipulators when walking and as manipulators when performing actions in the environment. The robot control system is implemented using teleoperation, where commands are given by hand gestures. A motion capture system is used to track the user's hands and to recognize their gestures. The method of controlling the robot was experimentally tested in a study involving 10 users. The evaluation included three questionnaires (NASA TLX, SUS, and UEQ). The results showed that the proposed system was rated as more user-friendly than 56% of benchmarked systems, and above average in terms of attractiveness, stimulation, and novelty. Comment: IEEE International Conference on Automation Science and Engineering (CASE 2023), Cordis, New Zealand, 26-30 August 2023, in print.

    Virtual reality simulation of a quadrotor to monitor dependent people at home

    Unmanned aerial vehicles (UAVs) represent an assistance solution for home care of dependent persons. These aircraft can cover the home, accompany the person, and position themselves to take photographs that can be analyzed to determine the person's mood and the assistance needed. In this context, this work principally aims to design a tool to aid in the development and validation of the navigation algorithms of an autonomous vision-based UAV for monitoring dependent people. To that end, a distributed architecture has been proposed based on the real-time communication of two modules, one in charge of the dynamics of the UAV, the trajectory planning, and the control algorithms, and the other devoted to visualizing the simulation in an immersive virtual environment. Thus, a system has been developed that allows the behavior of the assistant UAV to be evaluated from a technological point of view, as well as studies to be carried out from the assisted person's viewpoint. An initial validation of a quadrotor model monitoring a virtual character demonstrates the advantages of the proposed system, which is an effective, safe, and adaptable tool for the development of vision-based UAVs to help dependents at home. This work was partially supported by Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación / European Regional Development Fund under grants PID2019106084RB-I00 and DPI2016-80894-R, and by CIBERSAM of the Instituto de Salud Carlos III.

    Using a Combination of PID Control and Kalman Filter to Design of IoT-based Telepresence Self-balancing Robots during COVID-19 Pandemic

    COVID-19 is a very dangerous respiratory disease that can spread quickly through the air. Doctors, nurses, and medical personnel need protective clothing and must be very careful when treating COVID-19 patients to avoid becoming infected. Hence, a medical telepresence robot, which resembles a humanoid robot, is necessary to treat COVID-19 patients. The proposed self-balancing COVID-19 medical telepresence robot resembles a stand-alone humanoid soccer robot on two wheels that can maneuver freely in hospital hallways. The proposed design poses some control problems: it requires steady body positioning and is subject to disturbances. A control method that finds the stability value such that the system response can reach the set-point is required to keep the robot stable and reject disturbances; this is known as disturbance rejection control. This study aimed to control the robot using a combination of Proportional-Integral-Derivative (PID) control and a Kalman filter. Mathematical equations were required to obtain a model of the robot's characteristics. The state-space model was derived from the self-balancing robot's mathematical equation. Since a PID control technique was used to keep the robot balanced, this state-space model was converted into a transfer function model. The second Ziegler-Nichols rule (the oscillation method) was used to tune the PID parameters. The amplifier constants obtained were Kp=31.002, Ki=5.167, and Kd=125.992128. With constant tuning, the robot was able to maintain its balance for more than one hour, even when an external disturbance was applied to it. Doi: 10.28991/esj-2021-SP1-016
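    The PID loop described above can be sketched in a few lines. The gains are the values reported in the abstract; the plant model below is a hypothetical first-order simplification for illustration only, not the paper's state-space model of the self-balancing robot.

    ```python
    # Minimal discrete PID sketch for regulating a tilt angle to a set-point,
    # using the gains reported in the abstract. The plant dynamics are made up.

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Toy simulation: drive a hypothetical first-order "tilt" toward upright (0 rad).
    pid = PID(kp=31.002, ki=5.167, kd=125.992128, dt=0.01)
    tilt = 0.1  # initial tilt in radians
    for _ in range(2000):
        u = pid.update(0.0, tilt)
        tilt += (-0.5 * tilt + 0.001 * u) * 0.01  # illustrative plant dynamics

    print(abs(tilt))  # residual tilt after 20 s of simulated control
    ```

    In the paper the gains come from the Ziegler-Nichols oscillation method applied to the derived transfer function; here they are simply plugged in to show the control-law structure.
    
    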

    Large-scale environment mapping and immersive human-robot interaction for agricultural mobile robot teleoperation

    Remote operation is a crucial solution to problems encountered in agricultural machinery operations. However, traditional video-streaming control methods fall short in overcoming the challenges of single-perspective views and the inability to obtain 3D information. In light of these issues, our research proposes a large-scale digital map reconstruction and immersive human-machine remote control framework for agricultural scenarios. In our methodology, a DJI unmanned aerial vehicle (UAV) was utilized for data collection, and a novel video segmentation approach based on feature points was introduced. To tackle variability in texture richness, an enhanced Structure from Motion (SfM) pipeline using superpixel segmentation was implemented. This method integrates the open Multiple View Geometry (openMVG) framework along with Local Features from Transformers (LoFTR). The enhanced SfM produces a point cloud map, which is further processed through Multi-View Stereo (MVS) to generate a complete map model. For control, a closed-loop system utilizing TCP for VR control and positioning of agricultural machinery was introduced. Our system offers a fully visual, immersive control method: upon connection to the local area network, operators can use VR for immersive remote control. The proposed method enhances both the robustness and convenience of the reconstruction process, thereby significantly helping operators acquire more comprehensive on-site information and engage in immersive remote control operations. The code is available at: https://github.com/LiuTao1126/Enhance-SF
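    The closed-loop TCP link between the VR operator and the machinery can be sketched as a loopback round trip. The message format, field names, and one-step velocity integration below are illustrative assumptions, not details taken from the paper.

    ```python
    # Hypothetical sketch of a TCP closed loop: a VR-side client sends a
    # steering command; the machinery side applies it for one control step
    # and replies with its updated position. All message fields are made up.
    import json
    import socket
    import threading

    def machinery_server(sock):
        conn, _ = sock.accept()
        with conn:
            pos = [0.0, 0.0]
            cmd = json.loads(conn.recv(1024).decode())
            # Apply the commanded velocity for one control step (dt = 0.1 s).
            pos[0] += cmd["vx"] * 0.1
            pos[1] += cmd["vy"] * 0.1
            conn.sendall(json.dumps({"pos": pos}).encode())

    server = socket.socket()
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]
    threading.Thread(target=machinery_server, args=(server,), daemon=True).start()

    # VR-side client: send one command, then read back the new machinery state.
    client = socket.create_connection(("127.0.0.1", port))
    client.sendall(json.dumps({"vx": 1.0, "vy": -0.5}).encode())
    state = json.loads(client.recv(1024).decode())
    client.close()
    print(state["pos"])  # position after one integration step
    ```

    A real deployment would stream commands and state continuously and add message framing; the single request/reply above only shows the loop's shape.
    
    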

    Data-driven body–machine interface for the accurate control of drones

    The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator’s and the robot’s actions that are determined by the design and characteristics of the interface, and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones, which outperformed a standard joystick in terms of learning time and steering abilities. Implementing this procedure to identify body-machine patterns for specific applications could support the development of more intuitive and effective interfaces.
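    The core of such a data-driven mapping is fitting, from a short calibration recording, the relation between a body feature and the intended drone command. The sketch below fits a per-axis least-squares gain; the feature choice (torso lean) and the calibration data are made-up illustrations, not the paper's method or dataset.

    ```python
    # Illustrative body-to-command calibration: fit the least-squares gain
    # relating a hypothetical body feature (torso lean angle, rad) to the
    # intended drone command (roll rate, rad/s).

    def fit_gain(features, commands):
        """Least-squares slope through the origin: command ~ gain * feature."""
        num = sum(f * c for f, c in zip(features, commands))
        den = sum(f * f for f in features)
        return num / den

    # Made-up calibration pairs, generated here with a true gain of 2.0.
    lean = [0.05, 0.10, 0.20, -0.10, -0.25]
    roll_cmd = [0.1, 0.2, 0.4, -0.2, -0.5]

    gain = fit_gain(lean, roll_cmd)
    print(round(gain, 3))
    ```

    Per-axis gains like this are the simplest case; the paper's methodology additionally identifies which spontaneous motion patterns to use as features and where to place the sensors.
    
    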

    Dynamic virtual reality user interface for teleoperation of heterogeneous robot teams

    This research investigates the possibility of improving current teleoperation control for heterogeneous robot teams using modern Human-Computer Interaction (HCI) techniques such as Virtual Reality. It proposes a dynamic teleoperation Virtual Reality User Interface (VRUI) framework to improve the current approach to teleoperating heterogeneous robot teams.