    Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition

    Over the last few years, several researchers have been developing protocols and applications to autonomously land unmanned aerial vehicles (UAVs). However, most of the proposed protocols rely on expensive equipment or do not satisfy the high-precision needs of some UAV applications, such as package retrieval and delivery or the compact landing of UAV swarms. Therefore, in this work, a solution for high-precision landing based on the use of ArUco markers is presented. In the proposed solution, a UAV equipped with a low-cost camera is able to detect ArUco markers sized 56×56 cm from an altitude of up to 30 m. Once the marker is detected, the UAV changes its flight behavior in order to land on the exact position where the marker is located. The proposal was evaluated and validated using both the ArduSim simulation platform and real UAV flights. The results show an average offset of only 11 cm from the target position, which vastly improves the landing accuracy compared to traditional GPS-based landing, which typically deviates from the intended target by 1 to 3 m.

    This work was funded by the Ministerio de Ciencia, Innovación y Universidades, Programa Estatal de Investigación, Desarrollo e Innovación Orientada a los Retos de la Sociedad, Proyectos I+D+I 2018, Spain, under Grant RTI2018-096384-B-I00. Wubben, J.; Fabra Collado, F. J.; Tavares De Araujo Cesariny Calafate, C. M.; Krzeszowski, T.; Márquez Barja, J. M.; Cano, J.; Manzoni, P. (2019). Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition. Electronics, 8(12), 1-16. https://doi.org/10.3390/electronics8121532
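    The detection step at the core of this approach maps naturally onto OpenCV's aruco module. Below is a minimal sketch, assuming the legacy functional aruco API (pre-4.7, from opencv-contrib) and placeholder camera intrinsics; only the 56×56 cm marker size is taken from the abstract. It returns the lateral camera-to-marker offset that a landing controller would drive toward zero.

```python
# Minimal sketch of ArUco-based landing offset estimation, assuming OpenCV's
# legacy functional aruco API (pre-4.7). The camera intrinsics are placeholders;
# only the 56 cm marker size comes from the abstract.
import cv2
import numpy as np

MARKER_SIZE_M = 0.56                        # 56x56 cm marker, as in the paper
camera_matrix = np.array([[920.0, 0.0, 640.0],
                          [0.0, 920.0, 360.0],
                          [0.0, 0.0, 1.0]])  # placeholder intrinsics
dist_coeffs = np.zeros(5)                    # assume negligible lens distortion

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def landing_offset(frame):
    """Return the (x, y) offset to the marker centre in metres, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
    t = tvecs[0][0]                  # translation of first marker, camera frame
    return float(t[0]), float(t[1])  # lateral offsets; t[2] is the range/altitude
```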

    Safe2Ditch Steer-To-Clear Development and Flight Testing

    This paper describes a series of small unmanned aerial system (sUAS) flights performed at NASA Langley Research Center in April and May of 2019 to test a newly added Steer-to-Clear feature for the Safe2Ditch (S2D) prototype system. S2D is an autonomous crash management system for sUAS. Its function is to detect the onset of an emergency for an autonomous vehicle, and to enable that vehicle in distress to execute safe landings to avoid injuring people on the ground or damaging property. Flight tests were conducted at the City Environment Range for Testing Autonomous Integrated Navigation (CERTAIN) range at NASA Langley. Prior testing of S2D focused on rerouting to an alternate ditch site when an occupant was detected in the primary ditch site. For Steer-to-Clear testing, S2D was limited to a single ditch site option to force engagement of the Steer-to-Clear mode. The implementation of Steer-to-Clear for the flight prototype used a simple method to divide the target ditch site into four quadrants. An RC car was driven in circles in one quadrant to simulate an occupant in that ditch site. A simple implementation of Steer-to-Clear was programmed to land in the opposite quadrant to maximize distance from the occupant's quadrant. A successful mission was tallied when this occurred. Out of nineteen flights, thirteen resulted in successful missions. Data logs from the flight vehicle and the RC car indicated that unsuccessful missions were due to geolocation error between the actual location of the RC car and the location derived by the Vision Assisted Landing component of S2D on the flight vehicle. Video data indicated that while the Vision Assisted Landing component reliably identified the location of the ditch site occupant in the image frame, the conversion of the occupant's location to earth coordinates was sometimes adversely impacted by errors in the sensor data needed to perform the transformation. Logged sensor data was analyzed to attempt to identify the primary error sources and their impact on geolocation accuracy. Three trends were observed in the data evaluation phase. In the first trend, errors in geolocation were relatively large at the flight vehicle's cruise altitude but decreased as the vehicle descended. This was the expected behavior and was attributed to sensor errors of the inertial measurement unit (IMU). The second trend showed a distinct sinusoidal error for the entire descent that did not always decrease with altitude. The third trend showed high scatter in the data, which did not correlate well with altitude. Possible sources of the observed errors and compensation techniques are discussed.
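    The quadrant scheme described above lends itself to a very small piece of logic. The following sketch is hypothetical: the ENU-metres frame, names, and the aim-point choice are assumptions for illustration, not the S2D implementation. It classifies the occupant's quadrant about the ditch-site centre and returns an aim point in the diagonally opposite quadrant.

```python
# Hypothetical sketch of the "land in the opposite quadrant" logic: split the
# ditch site into four quadrants about its centre and aim for the quadrant
# diagonal to the detected occupant. Frame and names are assumed.
def occupant_quadrant(occ_xy, center_xy):
    """Quadrant index 0..3 of the occupant, encoded as 2*north + east."""
    east = occ_xy[0] >= center_xy[0]
    north = occ_xy[1] >= center_xy[1]
    return 2 * int(north) + int(east)

def opposite_quadrant_target(occ_xy, center_xy, half_size):
    """Aim point at the middle of the quadrant diagonal to the occupant's."""
    q = occupant_quadrant(occ_xy, center_xy)
    opp = 3 - q                                    # flip both bits: diagonal quadrant
    dx = half_size / 2 if opp % 2 else -half_size / 2   # east/west half
    dy = half_size / 2 if opp // 2 else -half_size / 2  # north/south half
    return center_xy[0] + dx, center_xy[1] + dy

# Example: occupant in the north-east quadrant of a 40 m site -> aim south-west.
# opposite_quadrant_target((5.0, 8.0), (0.0, 0.0), 20.0) == (-10.0, -10.0)
```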

    Aerial-Ground collaborative sensing: Third-Person view for teleoperation

    Rapid deployment and operation are key requirements in time-critical applications such as Search and Rescue (SaR). Efficiently teleoperated ground robots can support first responders in such situations. However, first-person-view teleoperation is sub-optimal in difficult terrain, while a third-person perspective can drastically increase teleoperation performance. Here, we propose a Micro Aerial Vehicle (MAV)-based system that can autonomously provide a third-person perspective to ground robots. While our approach is based on local visual servoing, it further leverages the global localization of several ground robots to seamlessly transfer between these ground robots in GPS-denied environments. In this way, one MAV can support multiple ground robots on demand. Furthermore, our system enables different visual detection regimes, enhanced operability, and return-home functionality. We evaluate our system in real-world SaR scenarios. Comment: Accepted for publication in 2018 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR).
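    The local visual-servoing idea can be illustrated with a simple proportional controller that keeps the tracked ground robot centred in the MAV's camera image. The sketch below is illustrative only; the gains, sign conventions, and velocity interface are assumptions rather than the authors' controller.

```python
# A minimal image-based visual-servoing sketch: command MAV lateral velocities
# proportional to the tracked robot's pixel error from the image centre.
# Gain, limits, and axis conventions are assumptions for illustration.
def servo_command(target_px, image_size, gain=0.002, max_v=1.0):
    """Map the robot's pixel position to a (vx, vy) velocity command in m/s."""
    cx, cy = image_size[0] / 2, image_size[1] / 2
    ex, ey = target_px[0] - cx, target_px[1] - cy   # pixel error from centre
    vx = max(-max_v, min(max_v, gain * ex))         # lateral, toward the robot
    vy = max(-max_v, min(max_v, gain * ey))         # longitudinal, toward the robot
    return vx, vy

# Example: robot detected at (800, 500) in a 1280x720 frame -> small right/down drift.
# servo_command((800, 500), (1280, 720)) == (0.32, 0.28)
```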

    Vehicle to Vehicle (V2V) Communication for Collision Avoidance for Multi-Copters Flying in UTM-TCL4

    NASA's UAS Traffic Management (UTM) research initiative is aimed at identifying requirements for safe autonomous operations of UAS in dense urban environments. For fully autonomous operations, vehicle-to-vehicle (V2V) communication has been identified as an essential tool. In this paper, we simulate complete urban operations in a high-fidelity simulation environment. We design a V2V communication protocol, and all participating vehicles communicate over this system. We show how V2V communication can be used to find feasible, collision-free paths for multi-agent systems. Different collision avoidance schemes are explored, and an end-to-end simulation study shows the use of V2V communication for UTM TCL4 deployment.
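    As a rough illustration of what such a V2V exchange might involve, each vehicle could broadcast a state message and run a pairwise closest-point-of-approach check on the states it receives. The message fields, look-ahead horizon, and separation threshold below are assumptions, not the paper's protocol.

```python
# Illustrative V2V state message and pairwise conflict check using the
# closest point of approach under constant velocity. Fields and thresholds
# are assumptions for illustration, not the paper's protocol.
from dataclasses import dataclass

@dataclass
class V2VState:
    vehicle_id: str
    x: float; y: float; z: float       # position, metres (local ENU)
    vx: float; vy: float; vz: float    # velocity, m/s

def conflict(a: V2VState, b: V2VState, horizon=30.0, min_sep=10.0):
    """True if a and b come within min_sep metres within the look-ahead horizon (s)."""
    px, py, pz = b.x - a.x, b.y - a.y, b.z - a.z      # relative position
    vx, vy, vz = b.vx - a.vx, b.vy - a.vy, b.vz - a.vz  # relative velocity
    vv = vx*vx + vy*vy + vz*vz
    # Time of closest approach, clamped to [0, horizon].
    t = 0.0 if vv == 0 else max(0.0, min(horizon, -(px*vx + py*vy + pz*vz) / vv))
    dx, dy, dz = px + vx*t, py + vy*t, pz + vz*t
    return (dx*dx + dy*dy + dz*dz) ** 0.5 < min_sep
```

    A receiving vehicle that detects a conflict would then trigger whichever avoidance scheme is in force, e.g. an altitude offset or a replanned path.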

    Learning Pose Estimation for UAV Autonomous Navigation and Landing Using Visual-Inertial Sensor Data

    In this work, we propose a robust network-in-the-loop control system for autonomous navigation and landing of an Unmanned Aerial Vehicle (UAV). To estimate the UAV's absolute pose, we develop a deep neural network (DNN) architecture for visual-inertial odometry, which provides a robust alternative to traditional methods. We first evaluate the accuracy of the estimation by comparing the predictions of our model to traditional visual-inertial approaches on the publicly available EuRoC MAV dataset. The results indicate a clear improvement in pose estimation accuracy of up to 25% over the baseline. Finally, we integrate the data-driven estimator into the closed-loop flight control system of AirSim, a simulator available as a plugin for Unreal Engine, and we provide simulation results for autonomous navigation and landing.
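    As a toy illustration of the kind of visual-inertial DNN the abstract describes (layer sizes, the IMU window, and the pose parameterization below are illustrative, not the authors' architecture), a small CNN can encode the camera image, an MLP a window of IMU samples, and a fusion head regress the 6-DoF pose:

```python
# Toy visual-inertial pose regressor: CNN image encoder + MLP IMU encoder,
# fused and mapped to a 7-vector pose (xyz translation + quaternion).
# All sizes are illustrative, not the authors' architecture.
import torch
import torch.nn as nn

class VIOPoseNet(nn.Module):
    def __init__(self, imu_window=10):
        super().__init__()
        self.cnn = nn.Sequential(                 # image encoder
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.imu = nn.Sequential(                 # IMU encoder (6 axes per sample)
            nn.Linear(imu_window * 6, 64), nn.ReLU())
        self.head = nn.Linear(32 + 64, 7)         # 3 for position, 4 for quaternion

    def forward(self, image, imu):
        z = torch.cat([self.cnn(image), self.imu(imu.flatten(1))], dim=1)
        return self.head(z)                       # (B, 7) pose estimate

# Example: pose = VIOPoseNet()(torch.rand(1, 3, 224, 224), torch.rand(1, 10, 6))
```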