
    Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition

    Over the last few years, several researchers have been developing protocols and applications to land unmanned aerial vehicles (UAVs) autonomously. However, most of the proposed protocols rely on expensive equipment or do not satisfy the high-precision needs of some UAV applications, such as package retrieval and delivery or the compact landing of UAV swarms. Therefore, in this work, a solution for high-precision landing based on the use of ArUco markers is presented. In the proposed solution, a UAV equipped with a low-cost camera is able to detect ArUco markers sized 56×56 cm from an altitude of up to 30 m. Once the marker is detected, the UAV changes its flight behavior in order to land on the exact position where the marker is located. The proposal was evaluated and validated using both the ArduSim simulation platform and real UAV flights. The results show an average offset of only 11 cm from the target position, which vastly improves the landing accuracy compared to traditional GPS-based landing, which typically deviates from the intended target by 1 to 3 m. This work was funded by the Ministerio de Ciencia, Innovación y Universidades, Programa Estatal de Investigación, Desarrollo e Innovación Orientada a los Retos de la Sociedad, Proyectos I+D+I 2018, Spain, under Grant RTI2018-096384-B-I00.
    Wubben, J.; Fabra Collado, F. J.; Tavares De Araujo Cesariny Calafate, C. M.; Krzeszowski, T.; Márquez Barja, J. M.; Cano, J.; Manzoni, P. (2019). Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition. Electronics, 8(12), 1-16. https://doi.org/10.3390/electronics8121532
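
    The core vision step described above can be prototyped with off-the-shelf tools. The sketch below is not the authors' implementation; it is a minimal OpenCV example, assuming the pre-4.7 cv2.aruco API, a downward-facing camera, and a known horizontal field of view, that detects a marker and converts its pixel offset from the image centre into an approximate metric offset on the ground using the current altitude.

```python
# Minimal sketch (not the authors' exact pipeline): detect an ArUco marker and
# convert its pixel offset from the image centre into a metric ground offset
# using a pinhole-camera approximation and the UAV altitude.
# Assumes OpenCV with the contrib aruco module (pre-4.7 function-style API).
import cv2
import numpy as np

HFOV_RAD = np.deg2rad(62.2)   # assumed horizontal field of view of the camera
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def ground_offset(frame_bgr, altitude_m):
    """Return (right_m, down_m) offset of the marker from the camera axis, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return None
    centre_px = corners[0][0].mean(axis=0)          # marker centre in pixels
    h, w = gray.shape
    dx_px = centre_px[0] - w / 2.0                  # offset from image centre
    dy_px = centre_px[1] - h / 2.0
    metres_per_px = (2.0 * altitude_m * np.tan(HFOV_RAD / 2.0)) / w
    return dx_px * metres_per_px, dy_px * metres_per_px
```

    In a landing loop, an offset like this would drive the horizontal position corrections until the UAV is centred over the marker.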

    A pan-tilt camera Fuzzy vision controller on an unmanned aerial vehicle

    This paper presents an implementation of two fuzzy logic controllers working in parallel for a pan-tilt camera platform on a UAV. The implementation uses a basic Lucas-Kanade tracker, which sends the error between the center of the tracked object and the center of the image to the fuzzy controller. This information is enough for the controller to follow the object by moving a two-axis servo platform, despite the vibrations and movements of the UAV. The two fuzzy controllers, one per axis, work with a rule base of 49 rules, two inputs, and one output, with a more significant sector defined to improve their behavior.
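
    As a rough illustration of the sensing side described above, the sketch below tracks a point with OpenCV's pyramidal Lucas-Kanade tracker and computes the pixel error with respect to the image centre; a simple proportional mapping (gains are arbitrary assumptions) stands in for the paper's two 49-rule fuzzy controllers.

```python
# Minimal sketch of the sensing side: Lucas-Kanade point tracking plus the
# pixel error between the tracked point and the image centre. A proportional
# mapping stands in for the paper's fuzzy controllers, purely for illustration.
import cv2
import numpy as np

K_PAN, K_TILT = 0.05, 0.05   # assumed stand-in gains (deg/s per pixel)

def track_and_command(prev_gray, gray, point):
    """point: (1,1,2) float32 array with the previously tracked pixel location."""
    new_point, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, point, None)
    if status[0][0] != 1:
        return point, (0.0, 0.0)                    # lost track: hold the platform
    h, w = gray.shape
    err_x = new_point[0, 0, 0] - w / 2.0            # horizontal pixel error
    err_y = new_point[0, 0, 1] - h / 2.0            # vertical pixel error
    return new_point, (-K_PAN * err_x, -K_TILT * err_y)   # pan/tilt rate commands
```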

    Vision-based self-guided Quadcopter landing on moving platform during fault detection

    Faults are common during quadcopter operation in the air. This paper presents a real-time implementation that detects a fault and then guarantees that the system lands safely, even on a moving landing platform. A PixHawk autopilot was used for real-time verification, with platform detection under various environmental conditions. The method ensures that the quadcopter reaches the landing-area zone with the help of GPS. The precise landing on the astable landing platform is then calibrated automatically using a vision-based learning feedback technique. The proposed system is built on a reconfigurable Raspberry Pi 3 with a Pi camera, and the complete decision logic of the landing algorithm is deployed on the quadcopter. The system is self-guided and automatically returns to the home base whenever a fault is detected. The study considers the case of low-battery operation, where the autopilot is triggered to land the device safely before any malfunction occurs. The system navigates to the home base at a predetermined speed and altitude, which improves the detection process. Finally, the experimental study provided successful trials in tracking a usable platform, landing in a restricted area, and disarming the motors autonomously.
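
    The return-to-home-on-fault behaviour described above can be approximated with standard MAVLink tooling. The following is an illustrative pymavlink sketch, not the authors' code; the connection string and the 20% battery threshold are assumptions.

```python
# Illustrative sketch only: monitor battery state over MAVLink with pymavlink
# and switch the autopilot to RTL when a low-battery fault is detected,
# mirroring the return-to-home behaviour described above.
from pymavlink import mavutil

master = mavutil.mavlink_connection('udp:127.0.0.1:14550')  # assumed endpoint
master.wait_heartbeat()                                      # learn vehicle type/ids

while True:
    msg = master.recv_match(type='SYS_STATUS', blocking=True)
    if msg.battery_remaining != -1 and msg.battery_remaining < 20:  # assumed threshold
        master.set_mode('RTL')        # trigger autopilot return-to-launch
        break
```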

    Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle

    Interest in Unmanned Aerial Vehicles (UAVs) has increased due to their versatility and variety of applications; however, their battery life limits their applications. Heterogeneous multi-robot systems can offer a solution to this limitation by allowing an Unmanned Ground Vehicle (UGV) to serve as a recharging station for the aerial one. Moreover, cooperation between aerial and terrestrial robots allows them to overcome other individual limitations, such as communication link coverage or accessibility, and to solve highly complex tasks, e.g., environment exploration, infrastructure inspection or search and rescue. This work proposes a vision-based approach that enables an aerial robot to autonomously detect, follow, and land on a mobile ground platform. For this purpose, ArUco fiducial markers are used to estimate the relative pose between the UAV and UGV by processing RGB images provided by a monocular camera on board the UAV. The pose estimate is fed to a trajectory planner and four decoupled controllers to generate speed set-points relative to the UAV. Using a cascade loop strategy, these set-points are then sent to the UAV autopilot for inner-loop control. The proposed solution has been tested both in simulation, with a digital twin of a solar farm using ROS, Gazebo and ArduPilot Software-in-the-Loop (SiL), and in the real world at IST Lisbon's outdoor facilities, with a UAV built on the basis of a DJI F550 hexacopter and a modified Jackal ground robot, from DJI and Clearpath Robotics, respectively. Pose estimation, trajectory planning and speed set-points are computed on board the UAV, using a Single Board Computer (SBC) running Ubuntu and ROS, without the need for external infrastructure. This research was funded by the ISR/LARSyS Strategic Funding through the FCT project UIDB/50009/2020, the DURABLE project, under the Interreg Atlantic Area Programme through the European Regional Development Fund (ERDF), the Andalusian project UMA18-FEDERJA-090 and the University of Málaga Research Plan. Partial funding for open access charge: Universidad de Málaga.
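
    A minimal sketch of the pose-to-set-point idea described above, under assumed camera intrinsics and marker size: the marker pose is recovered with cv2.solvePnP and the resulting position error is turned into decoupled speed set-points, with simple proportional terms standing in for the paper's four decoupled controllers and trajectory planner.

```python
# Sketch only: recover the marker pose relative to the camera with solvePnP
# and turn the position error into decoupled speed set-points.
import cv2
import numpy as np

MARKER_SIZE = 0.25                                               # assumed side length (m)
K = np.array([[800.0, 0, 320.0], [0, 800.0, 240.0], [0, 0, 1]])  # assumed intrinsics
DIST = np.zeros(5)                                               # assume negligible distortion
OBJ_PTS = 0.5 * MARKER_SIZE * np.array(                          # IPPE_SQUARE corner order
    [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)

def speed_setpoints(marker_corners_px, kp=0.4):
    """marker_corners_px: (4,2) pixel corners of a detected ArUco marker."""
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, marker_corners_px.astype(np.float32),
                                  K, DIST, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    x, y, z = tvec.ravel()               # marker position in the camera frame (m)
    return {'vx': kp * x, 'vy': kp * y, 'vz': kp * (z - 1.0)}   # hold an assumed 1 m standoff
```

    In a real cascade-loop setup, set-points like these would be forwarded to the autopilot's inner velocity loop rather than applied directly.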

    Precise Landing of VTOL UAVs Using a Tether

    Unmanned Aerial Vehicles (UAVs), also known as drones, are often considered the solution to complex robotics problems. The significant freedom to explore an environment is a major reason why UAVs are a popular choice for automated solutions. UAVs, however, have a very limited flight time due to the low capacity-to-weight ratio of current batteries. One way to extend a vehicle's flight time is to use a tether to provide power from external batteries, generators on the ground, or another vehicle. Attaching a tether to a vehicle may constrain its navigation, but it may also create opportunities to improve some tasks, such as landing. A tethered UAV can still explore an environment, but with some additional limitations: the tether can become wrapped around or bent by an obstacle, stopping the drone from traveling further and requiring backtracking to undo; the tether can fall loose and get caught while dragging on the ground; or the base of the tether could be mobile, and the UAV needs a way to return to it. Most issues like those listed above could be solved with a vision system and various kinds of markers, but this approach would not work in low-light situations, where cameras are no longer effective. In this project, a state machine was developed to land a tethered, vertical take-off and landing (VTOL) UAV using only the angles measured at both ends of the tether, the tension in the tether, and the height of the UAV. The main scenarios addressed were normal operation, obstacle interference, loose tether, and a moving base. Normal operation is essentially tether guidance, using the tether as a direction back to the base. The obstacle case has to determine the best action for untangling the tether. The loose tether case has to handle the loss of information from the angle sensors, as the tether direction is no longer available; it is performed as a last-ditch effort to find the landing pad with only a moderate chance of success. Lastly, the moving base case uses the change in the angles over time to determine the speed needed to reach the base. The software was not the only focus of this project: two hardware components, a landing platform and matching landing gear, were also designed to support the landing process. These components aid the precision of the landed location and ensure that the UAV is secured in position once landed. The landing platform was designed as a passive, funnel-type positioning mechanism with a depression in the center that the landing gear was designed to match, and the tension of the tether is used to further lock the UAV into place when in motion. While some of this project remained theoretical, particularly the moving base case, flight testing was performed to validate most states of the proposed state machine. The normal operation state was effective at guiding the UAV onto the landing pad. The loose tether case was also able to land within reasonable expectations, although it was not always successful at finding the landing pad; methods of increasing the likelihood of success are discussed in Future Work. The obstacle case could also be detected, but the response action has yet to be tested in full; prior testing of velocity following can serve as a proof of concept due to its simplicity. In conclusion, this project successfully developed a state machine for precisely landing a tethered UAV with no environmental knowledge or localization. Further development is needed to improve the likelihood of landing in problematic scenarios, and more testing is needed for the system as a whole. More landing scenarios could also be researched and added as cases to the state machine to increase the robustness of the landing process. However, each current subsystem achieved some level of validation and will be improved in future developments.
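
    To make the state-machine idea concrete, the sketch below selects a landing behaviour from the tether tension, the angles at both ends, and the rate of change of the base-side angle. The thresholds are illustrative assumptions, not the values used in the thesis.

```python
# Sketch of the kind of state selection described above; thresholds are
# illustrative assumptions, not the thesis values.
from enum import Enum, auto

class LandingState(Enum):
    NORMAL = auto()        # taut tether, static base: follow the tether home
    LOOSE_TETHER = auto()  # tension lost: angle data unusable, search for the pad
    OBSTACLE = auto()      # ends disagree strongly: tether bent around an obstacle
    MOVING_BASE = auto()   # base-side angle drifting: match the base velocity

def select_state(tension_n, uav_angle_deg, base_angle_deg, base_angle_rate_dps):
    if tension_n < 1.0:                               # tether has gone slack
        return LandingState.LOOSE_TETHER
    if abs(uav_angle_deg - base_angle_deg) > 25.0:    # angles inconsistent: bent tether
        return LandingState.OBSTACLE
    if abs(base_angle_rate_dps) > 5.0:                # base appears to be moving
        return LandingState.MOVING_BASE
    return LandingState.NORMAL
```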

    Accurate navigation applied to landing maneuvers on mobile platforms for unmanned aerial vehicles

    Drones are developing quickly worldwide, and in Europe in particular. They represent the future of a high percentage of operations that are currently carried out by manned aviation or satellites. Compared to fixed-wing UAVs, rotary-wing UAVs have the advantages of hovering, agile maneuvering, and vertical take-off and landing, so they are currently the most used aerial robotic platforms. In operations from ships and boats, the final approach and the landing maneuver are the phases that involve the highest risk and that require the highest precision in position and velocity estimation, along with a high level of robustness. In the framework of the EC-SAFEMOBIL and REAL projects, this thesis is devoted to the development of a guidance and navigation system that allows a rotary-wing UAV (RUAV) to complete an autonomous mission from take-off to landing. More specifically, this thesis focuses on new strategies and algorithms that provide sufficiently accurate motion estimation during autonomous landing on mobile platforms without using the GNSS constellations. On the one hand, for the flight phases where centimeter-level accuracy is not required, a new navigation approach is proposed that extends current estimation techniques by using the EGNOS integrity information in the sensor fusion filter. This approach improves the accuracy of the estimation solution and the safety of the overall system, and also helps the remote pilot maintain a more complete awareness of the operation status while flying the UAV. On the other hand, for those flight phases where accuracy is a critical factor in the safety of the operation, this thesis presents a precise navigation system that allows rotary-wing UAVs to approach and land safely on moving platforms, without using GNSS at any stage of the landing maneuver, with centimeter-level accuracy and a high level of robustness. This system implements a novel concept in which the relative position and velocity between the aerial vehicle and the landing platform can be calculated from a radio-beacon system installed on both the UAV and the landing platform, or from the angles of a cable that physically connects the UAV and the landing platform. The use of a cable also brings several extra benefits, such as increasing the precision of the UAV altitude control; it also facilitates centering the UAV right on top of the expected landing position and increases the stability of the UAV just after it contacts the landing platform. The proposed guidance and navigation systems have been implemented in an unmanned rotorcraft, and a large number of tests have been carried out under different conditions to measure the accuracy and robustness of the proposed solution. The results showed that the developed system allows landing with centimeter accuracy using only local sensors, and that the UAV is able to follow a mobile landing platform along multiple trajectories at different velocities.
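
    The cable-angle concept admits a simple geometric illustration: if the cable is taut and approximately straight, the azimuth and elevation measured at the platform attachment, together with the paid-out cable length, determine the relative position of the UAV. The sketch below shows only this geometry; the frame convention is an assumption, and the thesis fuses such measurements in a navigation filter rather than using them directly.

```python
# Geometric sketch only: relative UAV position from cable length and angles
# measured at the platform attachment point (taut, straight cable assumed).
import numpy as np

def uav_relative_position(cable_length_m, elevation_rad, azimuth_rad):
    """Return (x, y, z) of the UAV in an assumed platform-fixed frame (x east, y north, z up)."""
    horizontal = cable_length_m * np.cos(elevation_rad)   # projection onto the deck plane
    return np.array([horizontal * np.cos(azimuth_rad),
                     horizontal * np.sin(azimuth_rad),
                     cable_length_m * np.sin(elevation_rad)])
```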

    Vision-Based Autonomous Landing of a Quadrotor on the Perturbed Deck of an Unmanned Surface Vehicle

    Autonomous landing on the deck of an unmanned surface vehicle (USV) is still a major challenge for unmanned aerial vehicles (UAVs). In this paper, a fiducial marker is located on the platform to facilitate the task, since it allows the six-degrees-of-freedom relative pose to be retrieved in an easy way. To compensate for interruptions in the marker observations, an extended Kalman filter (EKF) estimates the current USV position with reference to the last known position. Validation experiments have been performed in a simulated environment under various marine conditions. The results confirm that the EKF provides estimates accurate enough to direct the UAV into the proximity of the autonomous vessel, such that the marker becomes visible again. Because the estimation relies only on odometry and inertial measurements, the method is applicable even under adverse weather conditions and in the absence of a global positioning system.
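
    The dropout-handling idea above can be illustrated with the prediction step of a constant-velocity Kalman filter, used here as a linear stand-in for the paper's EKF; the state layout and process-noise value are assumptions.

```python
# Sketch: when the marker drops out, propagate the USV position with a
# constant-velocity prediction step. State is [x, y, vx, vy].
import numpy as np

def predict(x, P, dt, q=0.5):
    """One prediction step: returns the propagated state and covariance."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    G = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]])
    Q = q * G @ G.T                          # acceleration-driven process noise (assumed q)
    return F @ x, F @ P @ F.T + Q
```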

    Towards an autonomous vision-based unmanned aerial system against wildlife poachers

    Poaching is an illegal activity that remains out of control in many countries. Based on the 2014 report of the United Nations and Interpol, the illegal trade of global wildlife and natural resources amounts to nearly $213 billion every year, which is even helping to fund armed conflicts. Poaching activities around the world are pushing many animal species to the brink of extinction. Unfortunately, the traditional methods of fighting poachers are not enough, hence the demand for more efficient approaches. In this context, the use of new technologies in sensors and algorithms, as well as aerial platforms, is crucial to face the sharp increase in poaching activities over the last few years. Our work focuses on the use of vision sensors on UAVs for the detection and tracking of animals and poachers, as well as the use of such sensors to control quadrotors during autonomous vehicle following and autonomous landing.


    3D pose estimation based on planar object tracking for UAV control

    This article presents a real-time 3D pose estimation method for unmanned aerial vehicles (UAVs) based on planar object tracking, intended for use in the control system of a UAV. The method exploits the rich information obtained from the projective transformation of planar objects viewed by a calibrated camera. The algorithm obtains the metric and projective components of a reference object (landmark or helipad) with respect to the UAV camera coordinate system, using robust real-time object tracking based on homographies. The algorithm is validated on real flights that compare the estimated data against that obtained from the inertial measurement unit (IMU), showing that the proposed method robustly estimates the helicopter's 3D position with respect to a reference landmark, with high-quality position and orientation estimates when the aircraft is flying at low altitude, a situation in which GPS information is often inaccurate. The obtained results indicate that the proposed algorithm is suitable for complex control tasks, such as autonomous landing, accurate low-altitude positioning, and dropping of payloads.
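
    An illustrative sketch, not the article's exact algorithm, of how a pose can be recovered from a tracked planar landmark with OpenCV: estimate the homography between matched reference and image points, then decompose it with assumed camera intrinsics into candidate rotations and translations.

```python
# Sketch only: homography estimation between a planar landmark and its image,
# followed by decomposition into candidate (R, t) pairs.
import cv2
import numpy as np

K = np.array([[700.0, 0, 320.0], [0, 700.0, 240.0], [0, 0, 1.0]])  # assumed intrinsics

def pose_candidates(ref_pts, img_pts):
    """ref_pts, img_pts: (N,2) arrays of matched points on the planar landmark."""
    H, inliers = cv2.findHomography(ref_pts, img_pts, cv2.RANSAC, 3.0)
    if H is None:
        return None
    n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # The ambiguity is normally resolved with visibility/cheirality checks;
    # here we simply return all candidate (R, t) pairs.
    return list(zip(rotations, translations))
```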