Routing Unmanned Vehicles in GPS-Denied Environments
Most routing algorithms for unmanned vehicles that arise in data
gathering and monitoring applications rely on Global
Positioning System (GPS) information for localization. However, intentional or
unintentional disruption of GPS signals can render
these algorithms inapplicable. In this article, we present a novel method to
address this difficulty by combining methods from cooperative localization and
routing. In particular, the article formulates a fundamental combinatorial
optimization problem to plan routes for an unmanned vehicle in a GPS-restricted
environment while enabling localization for the vehicle. We also develop
algorithms to compute optimal paths for the vehicle using the proposed
formulation. Extensive simulation results are also presented to corroborate the
effectiveness and performance of the proposed formulation and algorithms.Comment: Publised in International Conference on Umanned Aerial System
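The interplay between routing and localization can be sketched with a simplified stand-in for the paper's formulation (the waypoint graph, landmark positions, and ranging radius R below are illustrative assumptions, not the article's actual model): restrict the route to waypoints from which the vehicle can range to at least one known-position landmark, then compute the shortest path on that subgraph.

```python
import heapq
import math

def localizable(p, landmarks, R):
    """A waypoint is usable only if some landmark is within ranging radius R."""
    return any(math.dist(p, l) <= R for l in landmarks)

def plan_route(waypoints, edges, landmarks, R, start, goal):
    """Dijkstra over the subgraph of localizable waypoints."""
    ok = {i for i, p in enumerate(waypoints) if localizable(p, landmarks, R)}
    if start not in ok or goal not in ok:
        return None
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:                       # reconstruct the path
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, w in edges.get(u, []):
            if v in ok and d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(pq, (d + w, v))
    return None

# Hypothetical scenario: node 3 is out of ranging reach of every landmark.
waypoints = [(0, 0), (1, 0), (2, 0), (1, 5)]
edges = {0: [(1, 1.0), (3, 5.1)], 1: [(2, 1.0)], 3: [(2, 5.1)]}
landmarks = [(0.5, 0.0), (2.0, 0.5)]
print(plan_route(waypoints, edges, landmarks, R=1.5, start=0, goal=2))
# → [0, 1, 2]  (node 3 is excluded: the vehicle could not localize there)
```

The sketch only captures the feasibility side of the coupling; the paper's combinatorial formulation optimizes routing and localization jointly.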
PHALANX: Expendable Projectile Sensor Networks for Planetary Exploration
Technologies enabling long-term, wide-ranging measurement in hard-to-reach areas are a critical need for planetary science inquiry. Phenomena of interest include flows or variations in volatiles, gas composition or concentration, particulate density, or even simply temperature. Improved measurement of these processes enables understanding of exotic geologies and distributions or correlating indicators of trapped water or biological activity. However, such data is often needed in unsafe areas such as caves, lava tubes, or steep ravines not easily reached by current spacecraft and planetary robots. To address this capability gap, we have developed miniaturized, expendable sensors which can be ballistically lobbed from a robotic rover or static lander - or even dropped during a flyover. These projectiles can perform sensing during flight and after anchoring to terrain features. By augmenting exploration systems with these sensors, we can extend situational awareness, perform long-duration monitoring, and reduce utilization of primary mobility resources, all of which are crucial in surface missions. We call the integrated payload that includes a cold gas launcher, smart projectiles, planning software, network discovery, and science sensing: PHALANX. In this paper, we introduce the mission architecture for PHALANX and describe an exploration concept that pairs projectile sensors with a rover mothership. Science use cases explored include reconnaissance using ballistic cameras, volatiles detection, and building timelapse maps of temperature and illumination conditions. Strategies to autonomously coordinate constellations of deployed sensors to self-discover and localize with peer ranging (i.e. a local GPS) are summarized, thus providing communications infrastructure beyond-line-of-sight (BLOS) of the rover. Capabilities were demonstrated through both simulation and physical testing with a terrestrial prototype. 
The approach to developing a terrestrial prototype is discussed, including design of the launching mechanism, projectile optimization, micro-electronics fabrication, and sensor selection. Results from early testing and characterization of commercial-off-the-shelf (COTS) components are reported. Nodes passed burn-in tests over 48 hours at full logging duty cycle. Integrated field tests were conducted in the Roverscape, a half-acre planetary analog environment at NASA Ames, where we tested up to 10 sensor nodes simultaneously coordinating with an exploration rover. Ranging accuracy was demonstrated to be within +/-10 cm over 20 m using commodity radios, as compared against high-resolution laser scanner ground truth. Evolution of the design, including progressive miniaturization of the electronics and iterated modifications of the enclosure housing for streamlining and optimized radio performance, is described. Finally, lessons learned to date, gaps toward eventual flight mission implementation, and continuing future development plans are discussed.
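The peer-ranging "local GPS" summarized above can be illustrated, under simplifying assumptions, by planar least-squares multilateration: a node estimates its position from range measurements to peers with known positions. The anchor coordinates and noiseless ranges below are made-up values, not PHALANX data.

```python
import numpy as np

def multilaterate(anchors, ranges):
    """Estimate a 2-D position from ranges to known anchors (least squares)."""
    a = np.asarray(anchors, float)
    r = np.asarray(ranges, float)
    # Subtracting the first range equation from the rest removes the |x|^2 term:
    #   2 (a_i - a_0) . x = (r_0^2 - r_i^2) + (|a_i|^2 - |a_0|^2)
    A = 2.0 * (a[1:] - a[0])
    b = (r[0] ** 2 - r[1:] ** 2) + (np.sum(a[1:] ** 2, axis=1) - np.sum(a[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([4.0, 3.0])
ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(multilaterate(anchors, ranges))   # → approximately [4. 3.]
```

With noisy ranges (as in the reported +/-10 cm field accuracy), the same least-squares solve returns the maximum-likelihood position under Gaussian noise on the linearized system.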
OASIS: Optimal Arrangements for Sensing in SLAM
The number and arrangement of sensors on an autonomous mobile robot
dramatically influence its perception capabilities. Ensuring that sensors are
mounted in a manner that enables accurate detection, localization, and mapping
is essential for the success of downstream control tasks. However, when
designing a new robotic platform, researchers and practitioners alike usually
mimic standard configurations or maximize simple heuristics like field-of-view
(FOV) coverage to decide where to place exteroceptive sensors. In this work, we
conduct an information-theoretic investigation of this overlooked element of
mobile robotic perception in the context of simultaneous localization and
mapping (SLAM). We show how to formalize the sensor arrangement problem as a
form of subset selection under the E-optimality performance criterion. While
this formulation is NP-hard in general, we further show that a combination of
greedy sensor selection and fast convex relaxation-based post-hoc verification
enables the efficient recovery of certifiably optimal sensor designs in
practice. Results from synthetic experiments reveal that sensors placed with
OASIS outperform benchmarks in terms of mean squared error of visual SLAM
estimates.
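The greedy half of the approach can be sketched as follows; the certification via convex relaxation described in the abstract is omitted, and the random positive semidefinite matrices stand in for the true Fisher information each candidate sensor pose would contribute to SLAM.

```python
import numpy as np

def greedy_e_optimal(infos, k):
    """Greedily pick k candidates maximizing the minimum eigenvalue
    (E-optimality) of the accumulated information matrix."""
    chosen = []
    total = np.zeros_like(infos[0])
    for _ in range(k):
        best_i, best_val = None, -np.inf
        for i, M in enumerate(infos):
            if i in chosen:
                continue
            val = np.linalg.eigvalsh(total + M)[0]   # smallest eigenvalue
            if val > best_val:
                best_i, best_val = i, val
        chosen.append(best_i)
        total = total + infos[best_i]
    return chosen

rng = np.random.default_rng(0)
cands = []
for _ in range(8):
    A = rng.normal(size=(3, 3))
    cands.append(A @ A.T)                 # random PSD placeholder information
picked = greedy_e_optimal(cands, k=3)
print(picked, np.linalg.eigvalsh(sum(cands[i] for i in picked))[0])
```

Each greedy step costs one eigendecomposition per remaining candidate, which is what makes the selection fast enough to pair with a post-hoc optimality check.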
Reliable localization methods for intelligent vehicles based on environment perception
Mención Internacional en el título de doctor (International Mention in the doctoral degree).
Not long ago, autonomous vehicles and Intelligent Transport
Systems (ITS) were seen as a potential future of transportation. Today, thanks to all the
technological advances in recent years, the feasibility of such systems is no longer a
question. Some of these autonomous driving technologies are already sharing our
roads, and even commercial vehicles are including more Advanced Driver-Assistance
Systems (ADAS) over the years. As a result, transportation is becoming more efficient
and the roads are considerably safer.
One of the fundamental pillars of an autonomous system is self-localization. An
accurate and reliable estimation of the vehicle’s pose in the world is essential to
navigation. Within the context of outdoor vehicles, the Global Navigation Satellite
System (GNSS) is the predominant localization system. However, these systems are
far from perfect, and their performance is degraded in environments with limited
satellite visibility. Additionally, their dependence on the environment can make them
unreliable if the environment changes.
Accordingly, the goal of this thesis is to exploit the perception of the environment
to enhance localization systems in intelligent vehicles, with special attention to
their reliability. To this end, this thesis presents several contributions: First, a study
on exploiting 3D semantic information in LiDAR odometry is presented, providing
interesting insights regarding the contribution to the odometry output of each type
of element in the scene. The experimental results have been obtained using a public
dataset and validated on a real-world platform. Second, a method to estimate the
localization error using landmark detections is proposed, which is later exploited
by a landmark placement optimization algorithm. This method, validated in a
simulation environment, can determine a set of landmarks such that the
localization error never exceeds a predefined limit. Finally, a cooperative
localization algorithm based on a Genetic Particle Filter is proposed to utilize vehicle
detections in order to enhance the estimation provided by GNSS systems. Multiple
experiments are carried out in different simulation environments to validate the
proposed method.
Doctoral program: Programa de Doctorado en Ingeniería Eléctrica, Electrónica y Automática, Universidad Carlos III de Madrid. Committee: Joshué Manuel Pérez Rastelli (secretary), Jorge Villagrá Serrano (secretary), Enrique David Martí Muño (member)
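The cooperative-localization idea from the third contribution can be sketched, very loosely, as a particle filter that fuses a noisy GNSS fix with a range to a detected peer vehicle. This is not the thesis's Genetic Particle Filter: the "genetic" ingredient is reduced here to resampling plus Gaussian mutation, and all noise levels, poses, and iteration counts are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
true_pos = np.array([20.0, 5.0])
peer_pos = np.array([25.0, 0.0])                  # cooperating vehicle (known)
gnss = true_pos + rng.normal(0, 3.0, 2)           # degraded GNSS fix
rng_meas = np.linalg.norm(true_pos - peer_pos) + rng.normal(0, 0.3)

particles = gnss + rng.normal(0, 3.0, (500, 2))   # seed around the GNSS fix
for _ in range(10):
    # Weight: GNSS likelihood times range-to-peer likelihood.
    d = np.linalg.norm(particles - peer_pos, axis=1)
    w = (np.exp(-0.5 * np.sum((particles - gnss) ** 2, axis=1) / 3.0 ** 2)
         * np.exp(-0.5 * (d - rng_meas) ** 2 / 0.3 ** 2))
    w /= w.sum()
    idx = rng.choice(len(particles), len(particles), p=w)            # resample
    particles = particles[idx] + rng.normal(0, 0.2, particles.shape)  # mutate

est = particles.mean(axis=0)
print(est)   # posterior concentrates on the range circle near the GNSS fix
```

The accurate range measurement constrains the radial error that GNSS alone cannot resolve, which is the intuition behind using vehicle detections to enhance GNSS estimates.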
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map) and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper
simultaneously serves as a position paper and a tutorial for users of SLAM. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions during robotics conferences: Do robots need SLAM? And
is SLAM solved?
Sensor-Based Topological Coverage And Mapping Algorithms For Resource-Constrained Robot Swarms
Coverage is widely known in the field of sensor networks as the task of deploying sensors so that the union of the sensor footprints completely covers an environment. Related to coverage is the task of exploration, which includes guiding mobile robots, equipped with sensors, to map an unknown environment (mapping) or clear a known environment (searching and the pursuit-evasion problem). This is an essential task for robot swarms in many applications, including environmental monitoring, sensor deployment, mine clearing, search-and-rescue, and intrusion detection. Utilizing a large team of robots not only improves the completion time of such tasks but also improves the scalability of the applications while increasing robustness to system failures.
Despite extensive research on coverage, mapping, and exploration problems, many challenges remain to be solved, especially in swarms where robots have limited computational and sensing capabilities. The majority of approaches used to solve the coverage problem rely on metric information, such as the pose of the robots and the position of obstacles. These geometric approaches are not suitable for large-scale swarms due to high computational complexity and sensitivity to noise. This dissertation focuses on algorithms that, using tools from algebraic topology and bearing-based control, solve coverage-related problems with a swarm of resource-constrained robots.
First, this dissertation presents an algorithm for deploying mobile robots to attain a hole-free sensor coverage of an unknown environment, where each robot can only measure the bearing angles to the other robots within its sensing region and to the obstacles that it touches. Next, using the same sensing model, a topological map of an environment is obtained using graph-based search techniques, even when there are too few robots to attain full coverage of the environment. We then introduce the landmark complex representation and present an exploration algorithm that is complete when the landmarks are sufficiently dense and scales well with any swarm size. Finally, we derive a multi-pursuer, multi-evader planning algorithm that detects all possible evaders and clears complex environments.
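For contrast with the topological methods above, the underlying coverage condition itself can be stated geometrically (the dissertation's algorithms deliberately avoid this metric formulation); the robot positions, footprint radius, and workspace below are made-up values.

```python
import numpy as np

def covered(robots, radius, xs, ys):
    """True when every sampled workspace point lies in some sensor footprint."""
    pts = np.stack(np.meshgrid(xs, ys), axis=-1).reshape(-1, 2)
    d = np.linalg.norm(pts[:, None, :] - np.asarray(robots)[None, :, :], axis=2)
    return bool(np.all(d.min(axis=1) <= radius))

xs = np.linspace(0, 4, 41)
ys = np.linspace(0, 4, 41)
grid = [(x, y) for x in (1.0, 3.0) for y in (1.0, 3.0)]   # 2x2 robot grid
print(covered(grid, radius=1.6, xs=xs, ys=ys))   # → True: footprints overlap
print(covered(grid, radius=1.0, xs=xs, ys=ys))   # → False: corners and center uncovered
```

Note that this check needs every robot pose and a metric model of the workspace, which is exactly the information the bearing-only sensing model in the dissertation does without.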
Scalable underwater assembly with reconfigurable visual fiducials
We present a scalable combined localization infrastructure deployment and
task planning algorithm for underwater assembly. Infrastructure is autonomously
modified to suit the needs of manipulation tasks based on an uncertainty model
on the infrastructure's positional accuracy. Our uncertainty model can be
combined with the noise characteristics from multiple devices. For the task
planning problem, we propose a layer-based clustering approach that completes
the manipulation tasks one cluster at a time. We employ movable visual fiducial
markers as infrastructure and an autonomous underwater vehicle (AUV) for
manipulation tasks. The proposed task planning algorithm is computationally
simple, and we implement it on the AUV without any offline computation
requirements. Combined hardware experiments and simulations over large datasets
show that the proposed technique is scalable to large areas.
Comment: Submitted to ICRA 202
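The layer-based clustering idea can be sketched as grouping tasks into depth bands and serving one band at a time with a nearest-neighbor ordering; the layer thickness, start pose, and task coordinates below are illustrative assumptions rather than the paper's parameters.

```python
import math
from collections import defaultdict

def plan(tasks, layer_h, start=(0.0, 0.0, 0.0)):
    """Order (x, y, depth) tasks by depth layer, nearest-neighbor within each."""
    layers = defaultdict(list)
    for t in tasks:
        layers[int(t[2] // layer_h)].append(t)   # cluster by depth band
    order, pos = [], start
    for k in sorted(layers):                     # one cluster at a time
        todo = layers[k]
        while todo:
            nxt = min(todo, key=lambda t: math.dist(pos, t))
            todo.remove(nxt)
            order.append(nxt)
            pos = nxt
    return order

tasks = [(1, 0, 0.2), (0, 1, 1.4), (2, 2, 0.3), (1, 1, 1.1)]
print(plan(tasks, layer_h=1.0))
# → [(1, 0, 0.2), (2, 2, 0.3), (1, 1, 1.1), (0, 1, 1.4)]
```

Completing one shallow layer before descending keeps the per-cluster planning cheap, consistent with the abstract's claim that the algorithm runs on the AUV without offline computation.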