Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map), and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper simultaneously
serves as a position paper and tutorial for SLAM users. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions at robotics conferences: Do robots need SLAM? and
Is SLAM solved?
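The "de-facto standard formulation" the survey refers to can be illustrated with a toy sketch (this example is ours, not the paper's): with Gaussian noise assumptions, maximum a posteriori estimation over a factor graph reduces to least squares. Below, a 1-D robot has two unknown poses, two odometry factors, and one loop-closure factor that slightly disagrees with the odometry.

```python
# Hedged sketch of factor-graph SLAM as least squares (illustration only).
# Pose p0 is fixed at 0; p1 and p2 are unknown.
# Factors (residual = p_j - p_i - measurement):
#   odometry:     p1 - p0 = 1.0   and   p2 - p1 = 1.1
#   loop closure: p2 - p0 = 2.0   (disagrees slightly with odometry)

def solve_pose_graph():
    # Setting the gradient of the summed squared residuals to zero gives
    # the normal equations:
    #    2*p1 -   p2 = -0.1
    #   -  p1 + 2*p2 =  3.1
    a11, a12, b1 = 2.0, -1.0, -0.1
    a21, a22, b2 = -1.0, 2.0, 3.1
    det = a11 * a22 - a12 * a21          # solve the 2x2 system (Cramer's rule)
    p1 = (b1 * a22 - a12 * b2) / det
    p2 = (a11 * b2 - b1 * a21) / det
    return p1, p2

p1, p2 = solve_pose_graph()
# The optimum spreads the loop-closure disagreement across both edges,
# which is exactly the behavior a pose-graph optimizer exhibits at scale.
```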
Learning to automatically detect features for mobile robots using second-order Hidden Markov Models
In this paper, we propose a new method based on Hidden Markov Models to
interpret temporal sequences of sensor data from mobile robots to automatically
detect features. Hidden Markov Models have been used for a long time in pattern
recognition, especially in speech recognition. Their main advantage over other
methods (such as neural networks) is their ability to model noisy temporal
signals of variable length. We show in this paper that this approach is well
suited for interpretation of temporal sequences of mobile-robot sensor data. We
present two distinct experiments and results: the first one in an indoor
environment where a mobile robot learns to detect features like open doors or
T-intersections, the second one in an outdoor environment where a different
mobile robot has to identify situations like climbing a hill or crossing a
rock.
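As a simplified sketch of the idea (the paper uses second-order HMMs; for brevity this uses an ordinary first-order HMM, and all states, observations, and probabilities are invented), Viterbi decoding can label a discretized sequence of range readings with place features:

```python
# Hedged first-order HMM sketch: decode discretized range-sensor readings
# into place labels ("corridor" vs. "door") with the Viterbi algorithm.
# All probabilities below are made up for illustration.

STATES = ["corridor", "door"]
START = {"corridor": 0.8, "door": 0.2}
TRANS = {"corridor": {"corridor": 0.7, "door": 0.3},
         "door":     {"corridor": 0.4, "door": 0.6}}
EMIT = {"corridor": {"near": 0.9, "far": 0.1},   # walls read close by
        "door":     {"near": 0.2, "far": 0.8}}   # an opening reads far

def viterbi(obs):
    # trellis[t][s] = (best probability of a path ending in s, predecessor)
    trellis = [{s: (START[s] * EMIT[s][obs[0]], None) for s in STATES}]
    for o in obs[1:]:
        row = {}
        for s in STATES:
            prob, prev = max(
                (trellis[-1][p][0] * TRANS[p][s] * EMIT[s][o], p)
                for p in STATES)
            row[s] = (prob, prev)
        trellis.append(row)
    # Backtrack from the most probable final state.
    state = max(STATES, key=lambda s: trellis[-1][s][0])
    path = [state]
    for row in reversed(trellis[1:]):
        state = row[state][1]
        path.append(state)
    return path[::-1]

readings = ["near", "near", "far", "far", "near"]
labels = viterbi(readings)
# The decoder groups the two "far" readings into a single "door" feature.
```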
Event-based Vision: A Survey
Event cameras are bio-inspired sensors that differ from conventional frame
cameras: Instead of capturing images at a fixed rate, they asynchronously
measure per-pixel brightness changes, and output a stream of events that encode
the time, location and sign of the brightness changes. Event cameras offer
attractive properties compared to traditional cameras: high temporal resolution
(in the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low
power consumption, and high pixel bandwidth (on the order of kHz) resulting in
reduced motion blur. Hence, event cameras have a large potential for robotics
and computer vision in challenging scenarios for traditional cameras, such as
low latency, high speed, and high dynamic range. However, novel methods are
required to process the unconventional output of these sensors in order to
unlock their potential. This paper provides a comprehensive overview of the
emerging field of event-based vision, with a focus on the applications and the
algorithms developed to unlock the outstanding properties of event cameras. We
present event cameras from their working principle, the actual sensors that are
available and the tasks that they have been used for, from low-level vision
(feature detection and tracking, optic flow, etc.) to high-level vision
(reconstruction, segmentation, recognition). We also discuss the techniques
developed to process events, including learning-based techniques, as well as
specialized processors for these novel sensors, such as spiking neural
networks. Additionally, we highlight the challenges that remain to be tackled
and the opportunities that lie ahead in the search for a more efficient,
bio-inspired way for machines to perceive and interact with the world.
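The per-pixel operating principle the survey describes can be sketched in a few lines (the signal below is invented; real sensors work in analog at microsecond resolution): a pixel fires an event whenever its log brightness has changed by a contrast threshold C since the last event.

```python
# Hedged sketch of the standard event-generation model: per-pixel events
# fire when log brightness moves by a contrast threshold C.

def events_from_log_intensity(times, log_intensity, C=0.2):
    """Return (timestamp, polarity) events for a single pixel."""
    ref = log_intensity[0]            # log brightness at the last event
    events = []
    for t, L in zip(times[1:], log_intensity[1:]):
        while L - ref >= C:           # brightness rose by >= C: ON event(s)
            ref += C
            events.append((t, +1))
        while ref - L >= C:           # brightness fell by >= C: OFF event(s)
            ref -= C
            events.append((t, -1))
    return events

ts = [0.0, 1.0, 2.0, 3.0, 4.0]            # sample times (illustrative units)
logI = [0.0, 0.25, 0.55, 0.40, 0.00]      # invented log-brightness samples
stream = events_from_log_intensity(ts, logI)
# Rising brightness yields +1 ("ON") events, falling brightness -1 ("OFF");
# a static pixel (t = 3.0 here) produces no output at all, which is the
# source of the sensor's sparsity and low power draw.
```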
A genetic algorithm for mobile robot localization using ultrasonic sensors
A mobile robot requires the perception of its local environment for position estimation. Ultrasonic range data provide a robust description of the local environment for navigation. This article presents an ultrasonic sensor localization system for autonomous mobile robot navigation in an indoor semi-structured environment. The proposed algorithm is based upon an iterative non-linear filter, which utilizes matches between observed geometric beacons and an a priori map of beacon locations, to correct the position and orientation of the vehicle. A non-linear filter based on a genetic algorithm as an emerging optimization method to search for optimal positions is described. The resulting self-localization module has been integrated successfully in a more complex navigation system. Experiments demonstrate the effectiveness of the proposed method in real-world applications.
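A minimal sketch of the beacon-matching idea (names, parameters, and the noiseless ranges are ours, not the article's): a small genetic algorithm searches for the (x, y) position that minimizes the mismatch between observed ultrasonic ranges and the ranges predicted from an a priori map of beacon locations.

```python
# Hedged GA-localization sketch: evolve candidate (x, y) poses toward the
# one whose predicted beacon ranges best match the observed ranges.
import math
import random

BEACONS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]      # a priori map (assumed)
TRUE_POSE = (3.0, 4.0)                                 # ground truth, for demo
OBSERVED = [math.dist(TRUE_POSE, b) for b in BEACONS]  # noiseless ranges

def fitness(pose):
    # Sum of squared differences between predicted and observed ranges.
    return sum((math.dist(pose, b) - r) ** 2
               for b, r in zip(BEACONS, OBSERVED))

def localize(seed=0, pop_size=50, generations=100):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(pop_size)]
    for gen in range(generations):
        pop.sort(key=fitness)
        parents = pop[:10]                    # truncation selection
        sigma = 0.5 * 0.95 ** gen             # annealed mutation strength
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2 + rng.gauss(0, sigma),   # crossover
                     (a[1] + b[1]) / 2 + rng.gauss(0, sigma))   # + mutation
            children.append(child)
        pop = parents + children              # elitism keeps the best so far
    return min(pop, key=fitness)

x, y = localize()
# The evolved estimate should land near the true pose (3, 4).
```

A GA tolerates the multi-modal error surfaces that ultrasonic mismatches produce, which is the motivation the abstract gives for preferring it over a purely local filter update.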
Design of a Robotic Inspection Platform for Structural Health Monitoring
Actively monitoring infrastructure is key to detecting and correcting problems before they become costly. The vast scale of modern infrastructure poses a challenge to monitoring due to insufficient personnel. Certain structures, such as refineries, pose additional challenges and can be expensive, time-consuming, and hazardous to inspect.
This thesis outlines the development of an autonomous robot for structural health monitoring. The robot operates autonomously in level indoor environments and can be controlled manually to traverse difficult terrain. Both visual and lidar SLAM, together with a procedural mapping technique, allow the robot to capture colored point clouds.
The robot successfully automates point-cloud collection in straightforward environments such as hallways and empty rooms. While it performs well in these situations, its accuracy suffers in complex environments with variable lighting. More work is needed to create a robust system, but the potential time savings and upgrades make the concept promising.