A survey on uninhabited underwater vehicles (UUV)
ASME Early Career Technical Conference, ASME ECTC, October 2-3, 2009, Tuscaloosa, Alabama, USA
This work presents the initiation of our underwater robotics research, which will focus on underwater vehicle-manipulator systems. Our aim is to build an underwater vehicle with a robotic manipulator that is robust and can compensate for the hydrodynamic effects acting on it. In this paper, an overview of existing underwater vehicle systems, thruster designs, their dynamic models, and control architectures is given, and the purpose and results of existing methods in underwater robotics are investigated.
Visibility in underwater robotics: Benchmarking and single image dehazing
Dealing with underwater visibility is one of the most important challenges in autonomous underwater robotics. Light transmission in the water medium degrades images, making interpretation of the scene difficult and consequently compromising the whole intervention. This thesis contributes by analysing, through benchmarking, the impact of underwater image degradation on commonly used vision algorithms. An online framework for underwater research that makes it possible to analyse results under different conditions is presented. Finally, motivated by the results of experimentation with the developed framework, a deep learning solution is proposed that is capable of dehazing a degraded image in real time, restoring the original colors of the image.
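Dehazing methods, whether learned or classical, typically build on the standard haze image-formation model I = J·t + A·(1 − t), where J is the scene radiance, t the transmission map, and A the airlight. As a minimal illustrative sketch (the function name and parameters here are assumptions for illustration; the thesis's deep learning approach estimates the unknowns rather than assuming them), inverting that model looks like:

```python
import numpy as np

def dehaze(image, transmission, airlight, t_min=0.1):
    """Invert the simple haze model I = J*t + A*(1 - t):
    recover scene radiance J from observed image I (H x W x 3),
    given an estimated transmission map t (H x W) and airlight A (3,)."""
    t = np.maximum(transmission, t_min)[..., None]  # clamp t to avoid division blow-up
    return (image - airlight) / t + airlight
```

In practice the hard part is estimating `transmission` and `airlight` from a single image; that is exactly what the learned component replaces.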
Data-Driven Meets Navigation: Concepts, Models, and Experimental Validation
The purpose of navigation is to determine the position, velocity, and
orientation of manned and autonomous platforms, humans, and animals. Obtaining
accurate navigation commonly requires fusion between several sensors, such as
inertial sensors and global navigation satellite systems, in a model-based,
nonlinear estimation framework. Recently, data-driven approaches applied in
various fields show state-of-the-art performance compared to model-based
methods. In this paper we review multidisciplinary, data-driven
navigation algorithms developed and experimentally proven at the Autonomous
Navigation and Sensor Fusion Lab (ANSFL), including algorithms suitable for
human and animal applications, varied autonomous platforms, and multi-purpose
navigation and fusion approaches.
Comment: 22 pages, 13 figures
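The model-based, nonlinear estimation framework mentioned in the abstract is commonly a Kalman-type filter. As a minimal illustrative sketch (a 1D linear simplification with assumed noise values, not code from the paper), fusing inertial dead reckoning with intermittent position fixes might look like:

```python
import numpy as np

def fuse_ins_gnss(accels, pos_fixes, dt=0.1, q=0.05, r=2.0):
    """Toy 1D Kalman filter: predict with inertial acceleration,
    update with noisy position fixes (illustrative values only)."""
    x = np.zeros(2)                      # state: [position, velocity]
    P = np.eye(2)                        # state covariance
    F = np.array([[1, dt], [0, 1]])      # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])      # acceleration input mapping
    H = np.array([[1.0, 0.0]])           # we observe position only
    Q = q * np.eye(2)                    # process noise (assumed)
    R = np.array([[r]])                  # measurement noise (assumed)
    estimates = []
    for a, z in zip(accels, pos_fixes):
        # predict: propagate the state with the inertial measurement
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        if z is not None:                # update only when a fix arrives
            y = z - H @ x                # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates
```

Real GNSS/INS fusion is 3D and nonlinear (attitude enters the dynamics), which is why extended or unscented variants are used in practice.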
Inertial Navigation Meets Deep Learning: A Survey of Current Trends and Future Directions
Inertial sensing is used in many applications and platforms, ranging from
day-to-day devices such as smartphones to very complex ones such as autonomous
vehicles. In recent years, the development of machine learning and deep
learning techniques has increased significantly in the field of inertial
sensing and sensor fusion. This is due to the development of efficient
computing hardware and the accessibility of publicly available sensor data.
These data-driven approaches mainly aim to empower model-based inertial sensing
algorithms. To encourage further research in integrating deep learning with
inertial navigation and fusion and to leverage their capabilities, this paper
provides an in-depth review of deep learning methods for inertial sensing and
sensor fusion. We discuss learning methods for calibration and denoising as
well as approaches for improving pure inertial navigation and sensor fusion.
The latter is done by learning some of the fusion filter parameters. The
reviewed approaches are classified by the environment in which the vehicles
operate: land, air, and sea. In addition, we analyze trends and future
directions in deep learning-based navigation and provide statistical data on
commonly used approaches.
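The "learning some of the fusion filter parameters" idea above can be illustrated even without a neural network: treat a Kalman filter's process-noise scale as a parameter fit to recorded data by minimizing the innovation negative log-likelihood. A minimal 1D sketch (function names and values are assumptions for illustration, not from the survey):

```python
import numpy as np

def innovation_nll(zs, q, r=1.0):
    """Run a 1D random-walk Kalman filter over measurements zs and
    return the negative log-likelihood of its innovations."""
    x, p = zs[0], 1.0
    nll = 0.0
    for z in zs[1:]:
        p = p + q                      # predict: add process noise
        s = p + r                      # innovation covariance
        innov = z - x
        nll += 0.5 * (np.log(s) + innov ** 2 / s)
        k = p / s                      # Kalman gain
        x = x + k * innov
        p = (1 - k) * p
    return nll

def fit_process_noise(zs, candidates=np.logspace(-3, 1, 40)):
    """'Learn' the process-noise scale q by picking the candidate
    that minimizes the innovation negative log-likelihood."""
    scores = [innovation_nll(zs, q) for q in candidates]
    return float(candidates[int(np.argmin(scores))])
```

The learned approaches the survey covers replace this grid search with a network that regresses filter parameters (process or measurement noise, gains) directly from raw sensor windows.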