Construction and Calibration of a Low-Cost 3D Laser Scanner with 360° Field of View for Mobile Robots
Navigation of many mobile robots relies on environmental information obtained from three-dimensional (3D) laser scanners. This paper presents a new 360° field-of-view 3D laser scanner for mobile robots that avoids the high cost of commercial devices. The 3D scanner is based on spinning a Hokuyo UTM-30LX-EX two-dimensional (2D) rangefinder around its optical center. The proposed design profits from lessons learned during the development of a previous 3D scanner with pitching motion. Intrinsic calibration of the new device has been performed to obtain both temporal and geometric parameters. The paper also shows the integration of the 3D device in the outdoor mobile robot Andabata.
Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
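The core geometric idea, turning each 2D scan into 3D points using the platform's spin angle, can be sketched as follows. This is a minimal illustration, not the paper's calibrated model: the rotation axis (here the sensor's x-axis through the optical center) and all function names are assumptions.

```python
import numpy as np

def scan_to_3d(ranges, bearings, spin_angle):
    """Convert one 2D scan into 3D points given the platform spin angle.

    Sketch only: the 2D rangefinder measures points in its own x-y
    plane, and the whole scanner is assumed to be rotated by
    `spin_angle` about the x-axis through the optical center.
    """
    # Points in the 2D scanner frame (z = 0).
    x = ranges * np.cos(bearings)
    y = ranges * np.sin(bearings)
    pts = np.stack([x, y, np.zeros_like(ranges)], axis=1)

    # Rotate the scan plane about the x-axis by the spin angle.
    c, s = np.cos(spin_angle), np.sin(spin_angle)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])
    return pts @ R.T
```

Accumulating such point sets over a full revolution of the platform yields the 360° point cloud; the paper's temporal calibration accounts for the scan being captured while the platform keeps moving.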
Evaluation of laser range-finder mapping for agricultural spraying vehicles
In this paper, we present a new application of laser range-finder sensing to agricultural spraying vehicles. The current generation of spraying vehicles uses automatic controllers to maintain the height of the sprayer booms above the crop.
However, these control systems are typically based on ultrasonic sensors mounted on the booms, which limits the accuracy of the measurements and the response of the controller to changes in the terrain, resulting in a sub-optimal spraying process. To overcome these limitations, we propose to use a laser scanner, attached to the front of the sprayer's cabin, to scan the ground surface in front of the vehicle and to build a scrolling 3D map of the terrain. We evaluate the proposed solution in a series of field tests, demonstrating that the approach provides a more detailed and accurate representation of the environment than the current sonar-based solution, which can lead to the development of more efficient boom control systems.
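One common way to realize a scrolling terrain map is a vehicle-aligned height grid that discards cells behind the vehicle as it advances. The sketch below is a hypothetical minimal design (class and method names are assumptions, not the paper's implementation):

```python
import numpy as np

class ScrollingHeightMap:
    """Minimal sketch of a scrolling 2.5D terrain map.

    The grid stores one height per cell in vehicle-aligned coordinates;
    as the vehicle advances, rows behind it are discarded and fresh
    rows ahead are initialized, so the map 'scrolls' with the vehicle.
    """

    def __init__(self, rows, cols, cell_size):
        self.cell_size = cell_size
        self.heights = np.full((rows, cols), np.nan)  # NaN = unobserved

    def scroll(self, n_rows):
        """Shift the map as the vehicle advances by n_rows cells."""
        self.heights = np.roll(self.heights, -n_rows, axis=0)
        self.heights[-n_rows:, :] = np.nan  # new, unobserved terrain ahead

    def insert(self, row, col, z):
        """Fuse a laser height measurement into one cell (keep the max)."""
        old = self.heights[row, col]
        self.heights[row, col] = z if np.isnan(old) else max(old, z)
```

Keeping the maximum height per cell is a conservative choice for boom control, since the boom must clear the tallest part of the crop in each cell.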
Fast, Autonomous Flight in GPS-Denied and Cluttered Environments
One of the most challenging tasks for a flying robot is to autonomously
navigate between target locations quickly and reliably while avoiding obstacles
in its path, and with little to no a-priori knowledge of the operating
environment. This challenge is addressed in the present paper. We describe the
system design and software architecture of our proposed solution, and showcase
how all the distinct components can be integrated to enable smooth robot
operation. We provide critical insight on hardware and software component
selection and development, and present results from extensive experimental
testing in real-world warehouse environments. Experimental testing reveals that
our proposed solution can deliver fast and robust aerial robot autonomous
navigation in cluttered, GPS-denied environments.
Comment: Pre-peer-reviewed version of the article accepted in the Journal of Field Robotics
Radar-on-Lidar: metric radar localization on prior lidar maps
Radar and lidar are two different range sensors, each with pros and
cons for various perception tasks on mobile robots or in autonomous driving. In
this paper, a Monte Carlo system is used to localize a robot carrying a rotating
radar sensor on 2D lidar maps. We first train a conditional generative
adversarial network to transfer raw radar data to lidar data and obtain
reliable radar points from the generator. Then an efficient radar odometry is
included in the Monte Carlo system. Combining the initial guess from odometry,
a measurement model is proposed to match the radar data and prior lidar maps
for final 2D positioning. We demonstrate the effectiveness of the proposed
localization framework on the public multi-session dataset. The experimental
results show that our system can achieve high accuracy for long-term
localization in outdoor scenes.
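A measurement model for matching range points against a prior lidar map is often implemented as a likelihood field: each particle projects the points into the map and scores how close they land to occupied cells. The sketch below illustrates that general pattern under stated assumptions (the distance field, `sigma`, and all names are hypothetical, not the paper's exact model):

```python
import numpy as np

def particle_likelihood(particle, radar_pts, dist_field, resolution, origin):
    """Score one particle by projecting radar points into a prior lidar map.

    `dist_field[i, j]` holds the distance from cell (i, j) to the
    nearest occupied lidar-map cell; points landing near occupied
    cells score high (likelihood-field sketch, assumptions noted above).
    """
    x, y, theta = particle
    c, s = np.cos(theta), np.sin(theta)
    # Transform radar points (sensor frame) into the map frame.
    wx = x + c * radar_pts[:, 0] - s * radar_pts[:, 1]
    wy = y + s * radar_pts[:, 0] + c * radar_pts[:, 1]
    # World coordinates -> grid indices.
    j = ((wx - origin[0]) / resolution).astype(int)
    i = ((wy - origin[1]) / resolution).astype(int)
    inside = ((i >= 0) & (i < dist_field.shape[0]) &
              (j >= 0) & (j < dist_field.shape[1]))
    d = dist_field[i[inside], j[inside]]
    # Gaussian likelihood of each point's distance to the nearest obstacle.
    sigma = 0.2
    return float(np.prod(np.exp(-0.5 * (d / sigma) ** 2))) if d.size else 0.0
```

In a Monte Carlo localization loop, each particle would be propagated by the radar odometry and then reweighted with such a score before resampling.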
Towards Odor-Sensitive Mobile Robots
J. Monroy, J. Gonzalez-Jimenez, "Towards Odor-Sensitive Mobile Robots", Electronic Nose Technologies and Advances in Machine Olfaction, IGI Global, pp. 244--263, 2018, doi:10.4018/978-1-5225-3862-2.ch012
Preprint version, with the publisher's permission.
Out of all the components of a mobile robot, its sensorial system is undoubtedly among the most critical
ones when operating in real environments. Until now, these sensorial systems mostly relied on range
sensors (laser scanners, sonar, active triangulation) and cameras. While electronic noses have barely
been employed, they can provide complementary sensory information that is vital for some applications,
as it is for humans. This chapter analyzes the motivation for providing a robot with gas-sensing capabilities
and also reviews some of the hurdles that are preventing smell from achieving the importance of other
sensing modalities in robotics. The achievements made so far are reviewed to illustrate the current status
on the three main fields within robotics olfaction: the classification of volatile substances, the spatial
estimation of the gas dispersion from sparse measurements, and the localization of the gas source within
a known environment.
CNN for Very Fast Ground Segmentation in Velodyne LiDAR Data
This paper presents a novel method for ground segmentation in Velodyne point
clouds. We propose an encoding of sparse 3D data from the Velodyne sensor
suitable for training a convolutional neural network (CNN). This general
purpose approach is used for segmentation of the sparse point cloud into ground
and non-ground points. The LiDAR data are represented as a multi-channel 2D
signal where the horizontal axis corresponds to the rotation angle and the
vertical axis indexes the channels (i.e., laser beams). Multiple topologies of
relatively shallow CNNs (i.e. 3-5 convolutional layers) are trained and
evaluated using a manually annotated dataset we prepared. The results show
significant improvement of performance over the state-of-the-art method by
Zhang et al. in terms of speed and also minor improvements in terms of
accuracy.
Comment: ICRA 2018 submission
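The 2D encoding described above (horizontal axis: rotation angle; vertical axis: laser-beam index) can be sketched as follows. The choice of channels here (range and height) is an assumption for illustration; the paper's exact channels may differ, and the function name is hypothetical.

```python
import numpy as np

def encode_scan(points, rings, n_rings=64, n_cols=360):
    """Encode a Velodyne sweep as a multi-channel 2D image (sketch).

    points: (N, 3) array of x, y, z coordinates.
    rings:  (N,) laser-beam index of each point.
    Horizontal axis bins the azimuth (rotation) angle; the vertical
    axis is the ring index, matching the representation in the text.
    """
    img = np.zeros((n_rings, n_cols, 2))
    azimuth = np.arctan2(points[:, 1], points[:, 0])               # [-pi, pi]
    cols = ((azimuth + np.pi) / (2 * np.pi) * n_cols).astype(int) % n_cols
    img[rings, cols, 0] = np.linalg.norm(points[:, :3], axis=1)    # range
    img[rings, cols, 1] = points[:, 2]                             # height
    return img
```

The resulting dense (n_rings, n_cols, channels) tensor is what makes a standard CNN applicable to otherwise sparse, unordered LiDAR returns.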
Past, Present, and Future of Simultaneous Localization And Mapping: Towards the Robust-Perception Age
Simultaneous Localization and Mapping (SLAM) consists of the concurrent
construction of a model of the environment (the map), and the estimation of the
state of the robot moving within it. The SLAM community has made astonishing
progress over the last 30 years, enabling large-scale real-world applications,
and witnessing a steady transition of this technology to industry. We survey
the current state of SLAM. We start by presenting what is now the de-facto
standard formulation for SLAM. We then review related work, covering a broad
set of topics including robustness and scalability in long-term mapping, metric
and semantic representations for mapping, theoretical performance guarantees,
active SLAM and exploration, and other new frontiers. This paper simultaneously
serves as a position paper and tutorial to those who are users of SLAM. By
looking at the published research with a critical eye, we delineate open
challenges and new research issues that still deserve careful scientific
investigation. The paper also contains the authors' take on two questions that
often animate discussions during robotics conferences: Do robots need SLAM? and
Is SLAM solved?
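The de-facto standard formulation the survey refers to is maximum-a-posteriori estimation over a factor graph, which under Gaussian noise reduces to nonlinear least squares (notation follows common usage):

```latex
\mathcal{X}^{\star}
  = \arg\max_{\mathcal{X}} \, p(\mathcal{X} \mid \mathcal{Z})
  = \arg\min_{\mathcal{X}} \sum_{k} \left\| h_{k}(\mathcal{X}_{k}) - z_{k} \right\|_{\Omega_{k}}^{2}
```

Here $\mathcal{X}$ collects the robot poses and landmark positions, $z_{k}$ are the measurements, $h_{k}$ the corresponding measurement models acting on the subset $\mathcal{X}_{k}$ of variables, and $\Omega_{k}$ the measurement information matrices defining the Mahalanobis norm.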