
    Gesture Based Control of Semi-Autonomous Vehicles

    The objective of this investigation is to explore the use of hand gestures to control semi-autonomous vehicles, such as quadcopters, using realistic, physics-based simulations. This involves identifying natural gestures to control basic functions of a vehicle, such as maneuvering and onboard equipment operation, and building simulations in the Unity game engine to investigate preferred use of those gestures. In addition to creating a realistic operating experience, human factors associated with limitations on physical hand motion and information management are also considered in the simulation development process. Testing with external participants using a recreational quadcopter simulation built in Unity was conducted to assess the suitability of the simulation and preferences between a joystick-based approach and the gesture-based approach. Initial feedback indicated that the simulation represented the actual vehicle performance well and that the joystick was preferred over gestures. Improvements to the gesture-based control are documented as additional features are added to the simulation, such as basic maneuver training and additional vehicle positioning information to help the user learn the gesture-based interface, along with active control concepts that interpret gestures and apply vehicle forces and torques. Tests were also conducted with an actual ground vehicle to investigate whether knowledge and skill from the simulated environment transfer to a real-life scenario. To assess this, an immersive virtual reality (VR) simulation was built in Unity as a training environment for learning to control a remote-control car with gestures, followed by control of the actual ground vehicle. Observations and participant feedback indicated that the range of hand movement and hand positions transferred well to the actual demonstration. This illustrated that the VR simulation environment provides a suitable learning experience and an environment from which to assess human performance, thus also validating the observations from earlier tests. Overall, results indicate that the gesture-based approach holds promise given the emergence of new technology, but additional work is needed. This includes algorithms that process gesture data to provide more stable and precise vehicle commands, and training environments that familiarize users with this new interface concept.
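    A minimal sketch of the kind of gesture-to-force mapping this abstract describes, in Python. The HandPose fields, gains, and deadband are assumptions for illustration, not the study's actual implementation; the deadband addresses the hand-jitter stability issue the authors flag as future work.

        from dataclasses import dataclass

        @dataclass
        class HandPose:
            pitch: float   # forward/back hand tilt, radians
            roll: float    # left/right hand tilt, radians
            height: float  # hand height above a neutral reference, metres

        def gesture_to_commands(pose: HandPose,
                                thrust_gain: float = 4.0,
                                torque_gain: float = 0.8,
                                deadband: float = 0.05):
            """Convert a tracked hand pose into (thrust, pitch_torque, roll_torque)."""
            def shaped(x: float) -> float:
                # Suppress small unintended motions (hand tremor) before scaling.
                if abs(x) < deadband:
                    return 0.0
                return x - deadband if x > 0 else x + deadband

            return (thrust_gain * shaped(pose.height),
                    torque_gain * shaped(pose.pitch),
                    torque_gain * shaped(pose.roll))

        print(gesture_to_commands(HandPose(pitch=0.2, roll=-0.03, height=0.15)))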

    Control of a drone with body gestures

    Drones are becoming more popular in military applications and in civil aviation among hobbyists and businesses. Achieving natural Human-Drone Interaction (HDI) would enable unskilled pilots to take part in flying these devices and, more generally, ease the use of drones. The research in this paper focuses on the design and development of a Natural User Interface (NUI) that allows a user to pilot a drone with body gestures. A Microsoft Kinect was used to capture the user's body information, which was processed by a motion recognition algorithm and converted into commands for the drone. A Graphical User Interface (GUI) gives feedback to the user: visual feedback from the drone's onboard camera is provided on a screen, and an interactive menu, itself controlled by body gestures, allows the choice of functionalities such as photo and video capture or take-off and landing. This research resulted in an efficient and functional system that is more instinctive, natural, immersive, and fun than piloting with a physical controller, and that includes innovative aspects such as additional piloting functionalities and control of the flight speed.
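    One plausible shape for the motion recognition step, sketched in Python rather than taken from the paper: raise-hand gestures are classified from skeleton joint positions and looked up in a command table. The joint names, thresholds, and command tuples are assumptions.

        def classify_gesture(joints: dict) -> str:
            """joints maps names like 'left_hand' to (x, y, z), y increasing upward."""
            lh, rh, head = joints["left_hand"], joints["right_hand"], joints["head"]
            if lh[1] > head[1] and rh[1] > head[1]:
                return "takeoff"      # both hands above the head
            if lh[1] > head[1]:
                return "yaw_left"     # left hand raised
            if rh[1] > head[1]:
                return "yaw_right"    # right hand raised
            return "hover"            # default: hold position

        COMMANDS = {  # gesture -> (vx, vy, vz, yaw_rate)
            "hover":     (0.0, 0.0, 0.0, 0.0),
            "takeoff":   (0.0, 0.0, 0.5, 0.0),
            "yaw_left":  (0.0, 0.0, 0.0, -0.5),
            "yaw_right": (0.0, 0.0, 0.0, 0.5),
        }

        joints = {"head": (0.0, 1.7, 2.0),
                  "left_hand": (-0.4, 1.2, 2.0),
                  "right_hand": (0.4, 1.9, 2.0)}
        print(COMMANDS[classify_gesture(joints)])   # right hand raised -> yaw_right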

    Traditional vs Gesture Based UAV Control

    Abstract. The purpose of this investigation was to assess user preferences for controlling an autonomous system. A comparison was made in a virtual environment (VE) between a joystick-based game controller and a gesture-based system using the Leap Motion controller. Command functions included basic flight maneuvers and switching between the operator and drone views. Comparisons were made between the control approaches using a representative quadcopter drone. The VE, a physics-based flight simulator built in Unity3D, was designed to minimize cognitive loading and keep the focus on flight control. Participants first spent time familiarizing themselves with the basic controls and the vehicle's response to command inputs. They then engaged in search missions. Data were gathered on time spent performing tasks, and post-test interviews were conducted to uncover user preferences. Results indicate that while th
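    For a comparison like this, both devices plausibly feed the simulator through one normalized command type so that only the input modality differs; the Python sketch below illustrates that idea. The class, axis names, and scaling ranges are assumptions, not the study's code.

        from typing import NamedTuple

        class FlightCmd(NamedTuple):
            pitch: float     # -1..1, forward/back
            roll: float      # -1..1, left/right
            yaw: float       # -1..1, rotation rate
            throttle: float  # -1..1, climb/descend

        def from_joystick(axes: dict) -> FlightCmd:
            # Game-controller axes already arrive in -1..1.
            return FlightCmd(axes["ly"], axes["lx"], axes["rx"], axes["ry"])

        def from_hand(pitch_rad: float, roll_rad: float, yaw_rad: float,
                      height_m: float, max_tilt: float = 0.6) -> FlightCmd:
            # Scale hand angles (radians) and height offset into the same range.
            clamp = lambda v: max(-1.0, min(1.0, v))
            return FlightCmd(clamp(pitch_rad / max_tilt),
                             clamp(roll_rad / max_tilt),
                             clamp(yaw_rad / max_tilt),
                             clamp(height_m / 0.25))

        print(from_hand(0.3, -0.1, 0.0, 0.1))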

    Implementation of a Natural User Interface to Command a Drone

    In this work, we propose the use of a Natural User Interface (NUI) through body gestures using the open-source library OpenPose, looking for a more dynamic and intuitive way to control a drone. For the implementation, we use the Robot Operating System (ROS) to control and manage the different components of the project. Wrapped inside ROS, OpenPose (OP) processes the video obtained in real time by a commercial drone, allowing the user's pose to be obtained. Finally, the keypoints from OpenPose are extracted and translated, using geometric constraints, into high-level commands for the drone. Real-time experiments validate the full strategy.
    Comment: 2020 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 2020
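    A rough Python sketch of how a ROS node could turn one geometric constraint on OpenPose keypoints (the arm's angle above horizontal) into a velocity command. The topic name, keypoint layout, gain, and deadband are assumptions, not the authors' implementation.

        import math
        import rospy
        from geometry_msgs.msg import Twist

        def arm_angle(shoulder, wrist):
            # Angle of the arm above horizontal; image y grows downward.
            return math.atan2(shoulder[1] - wrist[1], abs(wrist[0] - shoulder[0]))

        def keypoints_to_twist(shoulder, wrist, gain=0.5) -> Twist:
            cmd = Twist()
            angle = arm_angle(shoulder, wrist)
            if abs(angle) > math.radians(15):   # deadband against pose noise
                cmd.linear.z = gain * angle     # raise arm -> climb, lower -> descend
            return cmd

        if __name__ == "__main__":
            rospy.init_node("openpose_teleop")
            pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
            rospy.sleep(1.0)  # let the connection establish before a one-shot publish
            # In the real system the keypoints would come from the OpenPose wrapper;
            # a fixed detection stands in here.
            pub.publish(keypoints_to_twist(shoulder=(320, 240), wrist=(400, 180)))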