Design and Autonomous Stabilization of a Ballistically Launched Multirotor
Aircraft that can launch ballistically and convert to autonomous, free flying
drones have applications in many areas such as emergency response, defense, and
space exploration, where they can gather critical situational data using
onboard sensors. This paper presents a ballistically launched, autonomously
stabilizing multirotor prototype (SQUID, Streamlined Quick Unfolding
Investigation Drone) with an onboard sensor suite, autonomy pipeline, and
passive aerodynamic stability. We demonstrate autonomous transition from
passive to vision-based active stabilization, confirming the ability of the
multirotor to autonomously stabilize after a ballistic launch in a GPS-denied
environment.
Comment: Accepted to the 2020 International Conference on Robotics and Automation (ICRA).
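The transition from passive aerodynamic stabilization to active, vision-based control described above can be pictured as a small state machine. The sketch below is purely illustrative: the phase names, rate threshold, and function are hypothetical, not taken from the SQUID paper.

```python
from enum import Enum, auto

class Phase(Enum):
    BALLISTIC = auto()   # just launched, spinning, rotors folded
    PASSIVE = auto()     # aerodynamically stabilized, rotors unfolded
    ACTIVE = auto()      # vision-based active stabilization engaged

def next_phase(phase, angular_rate, vision_locked, rate_threshold=2.0):
    """Advance the (hypothetical) launch state machine.

    angular_rate  -- body angular-rate magnitude [rad/s]
    vision_locked -- True once the onboard vision pipeline has a pose fix
    """
    if phase is Phase.BALLISTIC and angular_rate < rate_threshold:
        return Phase.PASSIVE      # passive aerodynamics damped the spin
    if phase is Phase.PASSIVE and vision_locked:
        return Phase.ACTIVE       # hand over to active control
    return phase
```

The key point the abstract makes is that the hand-off to active control needs no GPS, only the onboard vision fix.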
Effective Target Aware Visual Navigation for UAVs
In this paper we propose an effective vision-based navigation method that
allows a multirotor vehicle to simultaneously reach a desired goal pose in the
environment while constantly facing a target object or landmark. Standard
techniques such as Position-Based Visual Servoing (PBVS) and Image-Based Visual
Servoing (IBVS) in some cases (e.g., while the multirotor is performing fast
maneuvers) cannot constantly maintain the line of sight with a target
of interest. Instead, we compute the optimal trajectory by solving a non-linear
optimization problem that minimizes the target re-projection error while
meeting the UAV's dynamic constraints. The desired trajectory is then tracked
by means of a real-time Non-linear Model Predictive Controller (NMPC), which
implicitly allows the multirotor to satisfy both the visibility and the dynamic constraints. We
successfully evaluate the proposed approach in many real and simulated
experiments, making an exhaustive comparison with a standard approach.
Comment: Conference paper at the European Conference on Mobile Robotics (ECMR).
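The cost term minimized in the abstract above is the target re-projection error. A minimal sketch of that term for a single camera pose, assuming a standard pinhole model (the symbols and the "distance from image centre" formulation are illustrative choices, not the paper's exact cost):

```python
import numpy as np

def reprojection_error(p_target_w, p_cam_w, R_wc, K):
    """Pixel distance between the projected target and the image centre.

    p_target_w -- target position in the world frame, shape (3,)
    p_cam_w    -- camera position in the world frame, shape (3,)
    R_wc       -- rotation from world to camera frame, shape (3, 3)
    K          -- camera intrinsic matrix, shape (3, 3)
    """
    p_c = R_wc @ (p_target_w - p_cam_w)   # target in camera coordinates
    u = K @ (p_c / p_c[2])                # pinhole projection to pixels
    centre = np.array([K[0, 2], K[1, 2]])
    return float(np.linalg.norm(u[:2] - centre))
```

A trajectory optimizer of the kind described would sum this term over the planned poses, subject to the vehicle's dynamic constraints.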
Dynamic Landing of an Autonomous Quadrotor on a Moving Platform in Turbulent Wind Conditions
Autonomous landing on a moving platform presents unique challenges for
multirotor vehicles, including the need to accurately localize the platform,
fast trajectory planning, and precise/robust control. Previous works studied
this problem but most lack explicit consideration of the wind disturbance,
which typically leads to slow descents onto the platform. This work presents a
fully autonomous vision-based system that addresses these limitations by
tightly coupling the localization, planning, and control, thereby enabling fast
and accurate landing on a moving platform. The platform's position,
orientation, and velocity are estimated by an extended Kalman filter using
simulated GPS measurements when the quadrotor-platform distance is large, and
by a visual fiducial system when the platform is nearby. The landing trajectory
is computed online using receding horizon control and is followed by a boundary
layer sliding controller that provides tracking performance guarantees in the
presence of unknown, but bounded, disturbances. To improve the performance, the
characteristics of the turbulent conditions are accounted for in the
controller. The landing trajectory is fast, direct, and does not require
hovering over the platform, as is typical of most state-of-the-art approaches.
Simulations and hardware experiments are presented to validate the robustness
of the approach.
Comment: 7 pages, 8 figures, ICRA 2020 accepted paper.
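The boundary-layer sliding controller mentioned above replaces the discontinuous sign term of a classical sliding-mode law with a saturation inside a thin boundary layer, trading a small tracking band for chatter-free control under bounded disturbances. A 1-D sketch of that idea, with purely illustrative gains (not the paper's tuned values):

```python
def boundary_layer_control(e, e_dot, lam=2.0, k=5.0, phi=0.1):
    """1-D boundary-layer sliding controller (illustrative gains).

    e, e_dot -- tracking error and its time derivative
    lam      -- sliding-surface slope
    k        -- switching gain; must dominate the disturbance bound
    phi      -- boundary-layer thickness: replaces sign(s) with a
                saturation to remove chattering
    """
    s = e_dot + lam * e                    # sliding variable
    sat = max(-1.0, min(1.0, s / phi))     # continuous approximation of sign(s)
    return -k * sat
```

Outside the boundary layer the law behaves like the discontinuous controller (full gain `k`); inside it the control varies linearly, which is what yields the guaranteed tracking band under bounded wind disturbances.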
Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition
Over the last few years, several researchers have been developing protocols and applications to autonomously land unmanned aerial vehicles (UAVs). However, most of the proposed protocols rely on expensive equipment or do not satisfy the high-precision needs of some UAV applications, such as package retrieval and delivery or the compact landing of UAV swarms. Therefore, in this work, a solution for high-precision landing based on the use of ArUco markers is presented. In the proposed solution, a UAV equipped with a low-cost camera is able to detect ArUco markers sized 56×56 cm from an altitude of up to 30 m. Once the marker is detected, the UAV changes its flight behavior in order to land on the exact position where the marker is located. The proposal was evaluated and validated using both the ArduSim simulation platform and real UAV flights. The results show an average offset of only 11 cm from the target position, which vastly improves the landing accuracy compared to traditional GPS-based landing, which typically deviates from the intended target by 1 to 3 m.
This work was funded by the Ministerio de Ciencia, Innovación y Universidades, Programa Estatal de Investigación, Desarrollo e Innovación Orientada a los Retos de la Sociedad, Proyectos I+D+I 2018, Spain, under Grant RTI2018-096384-B-I00.
Wubben, J.; Fabra Collado, F. J.; Tavares De Araujo Cesariny Calafate, C. M.; Krzeszowski, T.; Márquez Barja, J. M.; Cano, J.; Manzoni, P. (2019). Accurate Landing of Unmanned Aerial Vehicles Using Ground Pattern Recognition. Electronics, 8(12), 1-16. https://doi.org/10.3390/electronics8121532
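Once a marker like the one above is detected in the image, its pixel offset from the principal point can be converted into a metric correction on the ground using the pinhole model and the current altitude. The helper below is a toy geometric step under stated assumptions (downward-facing camera, flat ground, known intrinsics), not the paper's actual pipeline:

```python
def marker_ground_offset(u, v, altitude, fx, fy, cx, cy):
    """Metric offset of a ground marker from the point below the camera.

    Assumes a downward-facing pinhole camera at the given altitude (m)
    over a flat landing area; (u, v) is the marker centre in pixels and
    (fx, fy, cx, cy) are the camera intrinsics.
    """
    dx = (u - cx) * altitude / fx   # metres along the image x-axis
    dy = (v - cy) * altitude / fy   # metres along the image y-axis
    return dx, dy
```

At 30 m altitude even a 100-pixel detection offset maps to several metres on the ground, which is why the vehicle refines the correction continuously as it descends toward the marker.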
Autonomous aerial robot for high-speed search and intercept applications
In recent years, high-speed navigation and environment interaction in the context of
aerial robotics has become a field of interest for several academic and industrial research studies. In
particular, Search and Intercept (SaI) applications for aerial robots pose a compelling research
area due to their potential usability in several environments. Nevertheless, SaI tasks involve a
challenging development regarding sensory weight, onboard computation resources, actuation design,
and algorithms for perception and control, among others. In this work, a fully autonomous aerial
robot for high-speed object grasping is proposed. As an additional subtask, our system is able
to autonomously pierce balloons located on poles close to the surface. Our first contribution is the
design of the aerial robot at an actuation and sensory level consisting of a novel gripper design with
additional sensors enabling the robot to grasp objects at high speeds. The second contribution is
a complete software framework consisting of perception, state estimation, motion planning, motion
control, and mission control in order to rapidly and robustly perform the autonomous grasping
mission. Our approach has been validated in a challenging international competition and has shown
outstanding results, being able to autonomously search, follow, and grasp a moving object at 6 m/s
in an outdoor environment.
Funding: Agencia Estatal de Investigación; Khalifa University.
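Grasping a target moving at 6 m/s requires aiming at where the object will be, not where it is. A common way to compute such a lead point, assuming a constant-velocity target and a faster pursuer, is a small fixed-point iteration; the sketch below is an illustrative planning step, not the team's actual planner:

```python
import numpy as np

def intercept_point(p_t, v_t, p_r, speed, iters=10):
    """Fixed-point estimate of where to meet a constant-velocity target.

    p_t, v_t -- target position and velocity (2-D arrays)
    p_r      -- robot position
    speed    -- robot speed, assumed greater than the target speed
    """
    t = 0.0
    for _ in range(iters):          # t converges when speed > |v_t|
        p_hit = p_t + v_t * t       # predicted target position at time t
        t = np.linalg.norm(p_hit - p_r) / speed
    return p_t + v_t * t
```

Because the update is a contraction whenever the pursuer is faster than the target, a handful of iterations suffices, cheap enough to re-run at every control cycle as the target estimate is refreshed.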
The MRS UAV System: Pushing the Frontiers of Reproducible Research, Real-world Deployment, and Education with Autonomous Unmanned Aerial Vehicles
We present a multirotor Unmanned Aerial Vehicle (UAV) control and estimation
system for supporting replicable research through realistic simulations and
real-world experiments. We propose a unique multi-frame localization paradigm
for estimating the states of a UAV in various frames of reference using
multiple sensors simultaneously. The system enables complex missions in GNSS
and GNSS-denied environments, including outdoor-indoor transitions and the
execution of redundant estimators for backing up unreliable localization
sources. Two feedback control designs are presented: one for precise and
aggressive maneuvers, and the other for stable and smooth flight with a noisy
state estimate. The proposed control and estimation pipeline are constructed
without using the Euler/Tait-Bryan angle representation of orientation in 3D.
Instead, we rely on rotation matrices and a novel heading-based convention to
represent the one free rotational degree-of-freedom in 3D of a standard
multirotor helicopter. We provide an actively maintained and well-documented
open-source implementation, including realistic simulation of UAV, sensors, and
localization systems. The proposed system is the product of years of applied
research on multi-robot systems, aerial swarms, aerial manipulation, motion
planning, and remote sensing. All our results have been supported by real-world
system deployment that shaped the system into the form presented here. In
addition, the system was utilized during the participation of our team from the
CTU in Prague in the prestigious MBZIRC 2017 and 2020 robotics competitions,
and also in the DARPA SubT challenge. Each time, our team was able to secure
top places among the best competitors from all over the world. On each
occasion, the challenges have motivated the team to improve the system and to
gain a great amount of high-quality experience within tight deadlines.
Comment: 28 pages, 20 figures, submitted to Journal of Intelligent & Robotic
Systems (JINT), for the provided open-source software see
http://github.com/ctu-mr
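The heading-based convention mentioned in the abstract above can be sketched concretely: instead of an Euler yaw angle, the one free rotational degree of freedom is taken as the direction of the body x-axis projected onto the world horizontal plane. The function below is a minimal illustration of that definition (one common formulation, assumed here, not necessarily the system's exact code):

```python
import math
import numpy as np

def heading(R):
    """Heading of a multirotor from its world-frame rotation matrix R.

    Defined as the direction of the body x-axis projected onto the
    world x-y plane -- well defined for any non-degenerate tilt,
    unlike the Euler/Tait-Bryan yaw angle.
    """
    bx = R[:, 0]                      # body x-axis expressed in world axes
    return math.atan2(bx[1], bx[0])   # angle of its horizontal projection
```

The practical benefit is that the representation stays meaningful during aggressive tilts, where Euler-angle pipelines hit singularities or ambiguity.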
Autonomous High-Precision Landing on an Unmanned Surface Vehicle
The main goal of this thesis is the development of an autonomous high-precision system for landing a UAV on an autonomous boat.
In this dissertation, a collaborative method for the autonomous landing of Multi Rotor Vertical Takeoff and Landing (MR-VTOL) Unmanned Aerial Vehicles (UAVs) is presented. The majority of common UAV autonomous landing systems adopt an approach in which the UAV scans the landing zone for a predetermined pattern, establishes relative positions, and uses those positions to execute the landing. These techniques have some shortcomings, such as the extensive processing carried out by the UAV itself, which requires considerable computational power. The fact that most of these techniques only work while the UAV is already flying at a low altitude, since the pattern's elements must be plainly visible to the UAV's camera, creates an additional issue. An RGB camera positioned in the landing zone and pointed up at the sky is the foundation of the methodology described throughout this dissertation. Because the sky is a very static and homogeneous background, Convolutional Neural Networks and Inverse Kinematics approaches can be used to isolate and analyse the distinctive motion patterns the UAV presents. Following real-time visual analysis, a terrestrial or maritime robotic system can transmit orders to the UAV.
The ultimate result is a model-free technique, i.e., one that does not rely on established patterns, that can help the UAV perform its landing manoeuvre. The method is reliable enough to be used independently or in conjunction with more established techniques to create a more robust system. According to experimental simulation findings derived from a dataset comprising three different films, the object-detection neural network was able to detect the UAV in 91.57% of the assessed frames with a tracking error under 8%. A high-level relative position control system was also created that makes use of the idea of an approach zone to the helipad. Every potential three-dimensional point within the zone corresponds to a UAV velocity command with a certain orientation and magnitude. The control system worked flawlessly, conducting the UAV's landing
within 6 cm of the target during testing in a simulated setting.
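The approach-zone idea above, every 3-D point in the zone mapping to a velocity command with a definite orientation and magnitude, can be sketched as a simple proportional field pointing at the helipad. The gains and saturation below are illustrative placeholders, not the thesis's tuned values:

```python
import numpy as np

def approach_velocity(p_uav, p_pad, v_max=1.5, gain=0.5):
    """Map a UAV position inside the approach zone to a velocity command.

    The command points from the UAV towards the helipad, with magnitude
    proportional to the distance, saturated at v_max (illustrative gains).
    """
    d = p_pad - p_uav
    dist = np.linalg.norm(d)
    if dist < 1e-9:
        return np.zeros(3)              # already at the pad
    speed = min(v_max, gain * dist)     # slow down on final approach
    return d / dist * speed
```

Far from the pad the command saturates at `v_max`; near the pad it shrinks linearly, which is what produces a gentle terminal descent.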
Autonomous UAV System for Cleaning Insulators in Power Line Inspection and Maintenance
The inspection and maintenance tasks of electrical installations are very demanding.
Nowadays, insulator cleaning is carried out manually by operators using scaffolds, ropes, or even
helicopters. However, these operations involve potential risks for humans and the electrical structure.
The use of Unmanned Aerial Vehicles (UAV) to reduce the risk of these tasks is rising. This paper
presents a UAV to autonomously clean insulators on power lines. First, an insulator detection and
tracking algorithm has been implemented to control the UAV in operation. Second, a cleaning tool
has been designed consisting of a pump, a tank, and an arm to direct the flow of cleaning liquid.
Third, a vision system has been developed that is capable of detecting soiled areas using a semantic
segmentation neural network, calculating the trajectory for cleaning in the image plane, and
generating arm trajectories to efficiently clean the insulator. Fourth, an autonomous system has been
developed to land on a charging pad to charge the batteries and potentially fill the tank with cleaning
liquid. Finally, the autonomous system has been validated in a controlled outdoor environment.
Funding: Ministerio de Ciencia e Innovación (CDTI); AERIAL-CORE H2020 ICT-10-2019-2020; FEDER INTERCONECT
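The step of turning segmented soiled areas into an image-plane cleaning trajectory can be illustrated with a boustrophedon (back-and-forth) sweep over the rows of a binary mask. This is a toy stand-in for the planning step the abstract describes, not the paper's actual algorithm:

```python
import numpy as np

def cleaning_sweep(mask):
    """Boustrophedon sweep over the soiled pixels of a binary mask.

    Returns (row, col) way-points covering each image row that contains
    soiled pixels, alternating sweep direction row by row.
    """
    path = []
    reverse = False
    for r in range(mask.shape[0]):
        cols = np.flatnonzero(mask[r])
        if cols.size == 0:
            continue                      # nothing to clean in this row
        lo, hi = int(cols[0]), int(cols[-1])
        path.append((r, hi if reverse else lo))
        path.append((r, lo if reverse else hi))
        reverse = not reverse             # alternate direction
    return path
```

Such an image-plane path would then be mapped to arm way-points so the nozzle covers the soiled surface without retracing clean regions.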