
    Probably Unknown: Deep Inverse Sensor Modelling In Radar

    Radar presents a promising alternative to lidar and vision in autonomous vehicle applications, able to detect objects at long range under a variety of weather conditions. However, distinguishing between occupied and free space from raw radar power returns is challenging due to complex interactions between sensor noise and occlusion. To counter this, we propose to learn an Inverse Sensor Model (ISM) converting a raw radar scan to a grid map of occupancy probabilities using a deep neural network. Our network is self-supervised using partial occupancy labels generated by lidar, allowing a robot to learn about world occupancy from past experience without human supervision. We evaluate our approach on five hours of data recorded in a dynamic urban environment. By accounting for the scene context of each grid cell, our model is able to successfully segment the world into occupied and free space, outperforming standard CFAR filtering approaches. Additionally, by incorporating heteroscedastic uncertainty into our model formulation, we are able to quantify the variance in the uncertainty throughout the sensor observation. Through this mechanism we are able to successfully identify regions of space that are likely to be occluded.
    Comment: 6 full pages, 1 page of references
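An inverse sensor model of this kind outputs a per-cell occupancy probability for each scan; scans are then typically fused into a persistent grid with the standard log-odds update. The sketch below illustrates that fusion step only (the `fuse_scans` helper and its parameters are illustrative, not the paper's code; the deep network that produces the per-scan probabilities is out of scope here):

```python
import numpy as np

def logit(p):
    """Convert probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_scans(prob_maps, prior=0.5):
    """Fuse per-scan occupancy probability grids into one map via the
    standard log-odds update, treating cells as independent."""
    log_odds = np.full(prob_maps[0].shape, logit(prior))
    for p in prob_maps:
        p = np.clip(p, 1e-6, 1.0 - 1e-6)      # avoid infinite log-odds
        log_odds += logit(p) - logit(prior)   # subtract the prior each update
    return 1.0 / (1.0 + np.exp(-log_odds))    # back to probability
```

With a uniform prior of 0.5, repeated observations of the same cell at 0.9 drive the fused probability well above 0.9, while repeated 0.1 observations drive it toward zero.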

    Frequency Modulated Continuous Waveform Radar for Collision Prevention in Large Vehicles

    The drivers of large vehicles can have very limited visibility, which contributes to poor situation awareness and an increased risk of collision with other agents. This thesis is focused on the development of reliable sensing for this close proximity problem in large vehicles operating in harsh environmental conditions. It emphasises the use of in-depth knowledge of a sensor’s physics and performance characteristics to develop effective mathematical models for use in different mapping algorithms. An analysis of the close proximity problem and the demands it poses on sensing technologies is presented. This guides the design and modelling process for a frequency modulated continuous waveform (FMCW) radar sensor for use in solving the close proximity problem. Radar offers better all-weather performance than other sensing modalities, but its measurement structure is more complex and often degraded by noise and clutter. The commonly used constant false alarm rate (CFAR) threshold approach performs poorly in applications with frequent extended targets and a short measurement vector, as is the case here. Therefore, a static detection threshold is calculated using measurements of clutter made using the radar, allowing clutter measurements to be filtered out in known environments. The detection threshold is used to develop a heuristic sensor model for occupancy grid mapping. This results in a more reliable representation of the environment than is achieved using the detection threshold alone. A Gaussian mixture extended Kalman probability hypothesis density filter (GM-EK-PHD) is implemented to allow mapping in dynamic environments using the FMCW radar. These methods are used to produce maps of the environment that can be displayed to the driver of a large vehicle to better avoid collisions. 
The concepts developed in this thesis are validated using simulated and real data from a low-cost 24 GHz FMCW radar developed at the Australian Centre for Field Robotics at the University of Sydney.
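The two detection strategies contrasted above can be sketched as follows: a cell-averaging CFAR (CA-CFAR) detector, which adapts its threshold to the local noise estimate, and a static threshold calibrated offline from clutter measurements. This is a minimal 1-D illustration; the function names and the `guard`/`train`/`scale` parameters are illustrative choices, not the thesis's implementation:

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=3.0):
    """Cell-averaging CFAR: flag a cell when its power exceeds `scale`
    times the mean of the surrounding training cells (guard cells
    around the cell under test are excluded from the average)."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(n):
        lo = max(0, i - guard - train)
        hi = min(n, i + guard + train + 1)
        window = np.r_[power[lo:max(0, i - guard)], power[i + guard + 1:hi]]
        if window.size and power[i] > scale * window.mean():
            detections[i] = True
    return detections

def static_threshold(power, threshold):
    """Static threshold, e.g. calibrated from clutter measurements
    made with the radar in a known environment."""
    return power > threshold
```

CA-CFAR degrades when targets are extended (they leak into the training cells and raise the threshold) or when the measurement vector is short, which is the failure mode the thesis cites as motivation for the static, clutter-calibrated threshold.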

    Radar-based Dynamic Occupancy Grid Mapping and Object Detection

    Environment modeling utilizing sensor data fusion and object tracking is crucial for safe automated driving. In recent years, the classical occupancy grid map approach, which assumes a static environment, has been extended to dynamic occupancy grid maps, which maintain the possibility of a low-level data fusion while also estimating the position and velocity distribution of the dynamic local environment. This paper presents the further development of a previous approach. To the best of the author's knowledge, there is no publication about dynamic occupancy grid mapping with subsequent analysis based only on radar data. Therefore, in this work, the data of multiple radar sensors are fused, and a grid-based object tracking and mapping method is applied. Subsequently, the clustering of dynamic areas provides high-level object information. For comparison, a lidar-based method is also developed. The approach is evaluated qualitatively and quantitatively with real-world data from a moving vehicle in urban environments. The evaluation illustrates the advantages of the radar-based dynamic occupancy grid map, considering different comparison metrics.
    Comment: Accepted to be published as part of the 23rd IEEE International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, September 20-23, 2020
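The "clustering of dynamic areas" step above can be illustrated with a simple connected-components pass: cells whose estimated speed exceeds a threshold are marked dynamic, and 8-connected groups of dynamic cells become object hypotheses. This sketch assumes a per-cell speed grid as input; the function name, threshold, and output fields are illustrative, and the paper's actual clustering method may differ:

```python
import numpy as np
from collections import deque

def extract_objects(speed_grid, speed_thresh=0.5):
    """Cluster dynamic grid cells (speed above `speed_thresh`) into
    object hypotheses via 8-connected flood fill; return an
    axis-aligned bounding box and mean speed per cluster."""
    dynamic = speed_grid > speed_thresh
    visited = np.zeros_like(dynamic, dtype=bool)
    rows, cols = dynamic.shape
    objects = []
    for r in range(rows):
        for c in range(cols):
            if not dynamic[r, c] or visited[r, c]:
                continue
            queue, cells = deque([(r, c)]), []
            visited[r, c] = True
            while queue:                      # flood fill one cluster
                y, x = queue.popleft()
                cells.append((y, x))
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and dynamic[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
            ys, xs = zip(*cells)
            objects.append({
                "bbox": (min(xs), min(ys), max(xs), max(ys)),
                "mean_speed": float(np.mean([speed_grid[y, x] for y, x in cells])),
            })
    return objects
```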

    Object Detection and Classification in Occupancy Grid Maps using Deep Convolutional Networks

    A detailed environment perception is a crucial component of automated vehicles. However, to deal with the amount of perceived information, we also require segmentation strategies. Based on a grid map environment representation, well-suited for sensor fusion, free-space estimation and machine learning, we detect and classify objects using deep convolutional neural networks. As input for our networks we use a multi-layer grid map efficiently encoding 3D range sensor information. The inference output consists of a list of rotated bounding boxes with associated semantic classes. We conduct extensive ablation studies, highlight important design considerations when using grid maps and evaluate our models on the KITTI Bird's Eye View benchmark. Qualitative and quantitative benchmark results show that we achieve robust detection and state-of-the-art accuracy solely using top-view grid maps from range sensor data.
    Comment: 6 pages, 4 tables, 4 figures
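A multi-layer grid map of the kind described above is built by projecting 3-D range points into top-view cells and storing one statistic per layer. The sketch below uses max height, max intensity, and point count as the layers; the layer set, resolution, and function name are assumptions for illustration, not the paper's exact encoding:

```python
import numpy as np

def points_to_grid(points, res=0.5, extent=40.0):
    """Encode 3-D range points (rows of x, y, z, intensity) into a
    multi-layer bird's-eye-view grid covering [-extent, extent] metres
    at `res` metres per cell."""
    size = int(2 * extent / res)
    height = np.full((size, size), -np.inf)   # max z per cell
    intensity = np.zeros((size, size))        # max intensity per cell
    density = np.zeros((size, size))          # point count per cell
    for x, y, z, i in points:
        u = int((x + extent) / res)
        v = int((y + extent) / res)
        if 0 <= u < size and 0 <= v < size:
            height[u, v] = max(height[u, v], z)
            intensity[u, v] = max(intensity[u, v], i)
            density[u, v] += 1
    height[np.isinf(height)] = 0.0            # empty cells get a neutral value
    return np.stack([height, intensity, density], axis=0)
```

The stacked layers form a fixed-size image-like tensor, which is what makes this representation convenient as input to a convolutional network.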