9,810 research outputs found

    A Study on Recent Developments and Issues with Obstacle Detection Systems for Automated Vehicles

    This paper reviews current developments and discusses some critical issues with obstacle detection systems for automated vehicles. The concept of autonomous driving is a key driver of future mobility, and obstacle detection systems play a crucial role in implementing and deploying autonomous driving on our roads and city streets. The review surveys the technology and existing systems for obstacle detection. Specifically, we examine the performance of LIDAR, RADAR, vision cameras, ultrasonic sensors, and infrared (IR) sensors, and review their capabilities and behaviour in a number of different situations: during daytime, at night, in extreme weather conditions, in urban areas, in the presence of smooth surfaces, in situations where emergency service vehicles need to be detected and recognised, and in situations where potholes need to be observed and measured. It is suggested that combining different technologies for obstacle detection gives a more accurate representation of the driving environment. In particular, for obstacle detection in extreme weather conditions (rain, snow, fog) and in some specific urban situations (shadows, reflections, potholes, insufficient illumination), the current developments, although already quite advanced, are not yet sophisticated enough to guarantee 100% precision and accuracy, and further substantial effort is needed.
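    The review's central suggestion, combining sensors, can be illustrated with a minimal fusion sketch. The Python snippet below merges per-sensor confidence scores with condition-dependent weights; the sensor weights, threshold, and function names are illustrative assumptions, not values or code from the paper.

    # Minimal sketch (assumptions only): confidence-weighted fusion of obstacle
    # scores from several sensors. Weights and threshold are made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        sensor: str        # e.g. "lidar", "radar", "camera", "ultrasonic", "ir"
        confidence: float  # detector confidence in [0, 1]

    # Per-sensor trust weights; in practice these would be tuned per condition,
    # e.g. down-weighting the camera at night or in fog.
    SENSOR_WEIGHTS = {"lidar": 0.9, "radar": 0.8, "camera": 0.6, "ultrasonic": 0.4, "ir": 0.5}

    def fuse_detections(detections, threshold=0.5):
        """Return True if the weighted mean confidence exceeds the threshold."""
        if not detections:
            return False
        num = sum(SENSOR_WEIGHTS.get(d.sensor, 0.3) * d.confidence for d in detections)
        den = sum(SENSOR_WEIGHTS.get(d.sensor, 0.3) for d in detections)
        return num / den >= threshold

    # Example: the camera is unsure in heavy rain, but radar and LIDAR agree.
    obstacle = fuse_detections([Detection("camera", 0.35),
                                Detection("radar", 0.85),
                                Detection("lidar", 0.80)])
    print("Obstacle detected:", obstacle)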

    Air pollution and fog detection through vehicular sensors

    We describe a method for the automatic recognition of air pollution and fog from a vehicle. Our system consists of sensors that acquire data from cameras as well as from Light Detection and Ranging (LIDAR) instruments. We discuss how this data can be collected, analyzed and merged to determine the degree of air pollution or fog. Such data is essential for the control systems of moving vehicles when making autonomous avoidance decisions, and backend systems need it for forecasting and for strategic traffic planning and control. Laboratory-based experimental results are presented for weather conditions such as air pollution and fog, showing that the recognition scenario works with better than adequate results. This paper demonstrates that LIDAR technology, already on board for the purpose of autonomous driving, can be used to improve weather condition recognition compared with a camera-only system. We conclude that the combination of a front camera and a LIDAR laser scanner is well suited as a sensor set for air pollution and fog recognition and can contribute accurate data to driving assistance and weather-alerting systems.
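    The abstract does not spell out the fusion rule, so the following sketch only illustrates how LIDAR return statistics and camera contrast might be merged into a coarse visibility class; the features, thresholds, and function name are assumptions made for this example.

    # Rough sketch (assumptions only): merge LIDAR and camera cues into a
    # visibility class. Feature choices and thresholds are illustrative.
    import numpy as np

    def classify_visibility(lidar_ranges, camera_gray):
        """lidar_ranges: 1-D array of return distances in metres (NaN = lost return).
        camera_gray: 2-D array of pixel intensities in [0, 255]."""
        ranges = np.asarray(lidar_ranges, dtype=float)
        # Fog and dense aerosols scatter the laser, so returns are lost or shortened.
        dropped = float(np.mean(np.isnan(ranges)))
        median_range = float(np.nanmedian(ranges)) if dropped < 1.0 else 0.0
        # Haze also flattens image contrast.
        contrast = float(np.std(np.asarray(camera_gray, dtype=float)))
        if dropped > 0.4 or median_range < 20.0:
            return "dense fog / heavy pollution"
        if contrast < 25.0:
            return "light haze"
        return "clear"

    # Example with synthetic data: half the beams lost, low-contrast image.
    rng = np.random.default_rng(0)
    ranges = np.where(rng.random(360) < 0.5, np.nan, rng.uniform(5.0, 60.0, 360))
    image = rng.normal(128.0, 10.0, size=(120, 160)).clip(0, 255)
    print(classify_visibility(ranges, image))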

    Model Adaptation with Synthetic and Real Data for Semantic Dense Foggy Scene Understanding

    This work addresses the problem of semantic scene understanding under dense fog. Although considerable progress has been made in semantic scene understanding, it is mainly related to clear-weather scenes. Extending recognition methods to adverse weather conditions such as fog is crucial for outdoor applications. In this paper, we propose a novel method, named Curriculum Model Adaptation (CMAda), which gradually adapts a semantic segmentation model from light synthetic fog to dense real fog in multiple steps, using both synthetic and real foggy data. In addition, we present three other main stand-alone contributions: 1) a novel method to add synthetic fog to real, clear-weather scenes using semantic input; 2) a new fog density estimator; 3) the Foggy Zurich dataset comprising 3808 real foggy images, with pixel-level semantic annotations for 16 images with dense fog. Our experiments show that 1) our fog simulation slightly outperforms a state-of-the-art competing simulation with respect to the task of semantic foggy scene understanding (SFSU); 2) CMAda significantly improves the performance of state-of-the-art models for SFSU by leveraging unlabeled real foggy data. The datasets and code are publicly available. Comment: final version, ECCV 2018
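    CMAda's actual training schedule is detailed in the paper; the sketch below only captures the curriculum idea in schematic form, fine-tuning in stages on progressively denser synthetic fog and on real foggy images pseudo-labelled by the current model. All function names and the stub usage are assumptions for illustration.

    # Schematic sketch (assumptions only) of a curriculum adaptation loop in the
    # spirit of CMAda: synthetic stages go from light to dense fog, and real
    # foggy images are reused with pseudo-labels from the current model.
    def curriculum_adapt(model, synthetic_stages, real_images, train_fn, pseudo_label_fn):
        """synthetic_stages: labelled synthetic-fog datasets ordered light -> dense.
        train_fn(model, images, labels) -> model; pseudo_label_fn(model, images) -> labels."""
        for stage, synthetic in enumerate(synthetic_stages):
            # 1) Fine-tune on synthetic fog of the current density (labels are known).
            model = train_fn(model, synthetic["images"], synthetic["labels"])
            # 2) Pseudo-label the real foggy images with the freshly adapted model.
            pseudo = pseudo_label_fn(model, real_images)
            # 3) Fine-tune again on the real images and their pseudo-labels.
            model = train_fn(model, real_images, pseudo)
            print(f"finished curriculum stage {stage}")
        return model

    # Trivial stand-in stubs, just to show the call pattern.
    train_stub = lambda m, x, y: m
    pseudo_stub = lambda m, x: ["pseudo"] * len(x)
    curriculum_adapt("toy-model",
                     [{"images": ["syn-light"], "labels": ["gt"]},
                      {"images": ["syn-dense"], "labels": ["gt"]}],
                     ["real-fog-1", "real-fog-2"], train_stub, pseudo_stub)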

    The Reliability and Effectiveness of a Radar-Based Animal Detection System

    This document contains data on the reliability and effectiveness of an animal detection system along U.S. Hwy 95 near Bonners Ferry, Idaho. The system uses a Doppler radar to detect large mammals (e.g., deer and elk) when they approach the highway. The system met most of the suggested minimum norms for reliability. The total time the warning signs were activated was at most 90 seconds per hour, and likely substantially less. Animal detection systems are designed to detect an approaching animal; once an animal has been detected, warning signs are activated, allowing drivers to respond. Results showed that 58.1–67.9% of deer were detected sufficiently early for northbound drivers, and 70.4–85% of deer were detected sufficiently early for southbound drivers. The effect of the activated warning signs on vehicle speed was greatest when road conditions were challenging (e.g., freezing temperatures and snow- and ice-covered road surfaces) and when visibility was low (at night). In summer there was no measurable benefit of the activated warning signs, at least as far as vehicle speed is concerned. Depending on the conditions in autumn and winter, the activated warning signs resulted in a speed reduction of 0.69 to 4.43 miles per hour. The report includes practical recommendations for the operation and maintenance of the system and suggestions for potential future research.
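    The report's criterion of a detection being "sufficiently early" can be made concrete with a simple stopping-distance check; the reaction time, deceleration, and distances in the sketch below are generic textbook assumptions, not values taken from the study.

    # Illustrative check (assumptions only) of whether an activated warning sign
    # gives a driver enough distance to stop before the crossing area.
    def stopping_distance_m(speed_mph, reaction_s=1.5, decel_ms2=3.4):
        """Distance in metres needed to react and brake to a stop from speed_mph."""
        v = speed_mph * 0.44704                    # mph -> m/s
        return v * reaction_s + v * v / (2.0 * decel_ms2)

    def detected_sufficiently_early(sign_to_crossing_m, speed_mph):
        """True if the driver can stop between the sign and the crossing point."""
        return sign_to_crossing_m >= stopping_distance_m(speed_mph)

    # Example: sign 150 m ahead of the typical crossing area, driver at 55 mph.
    print(detected_sufficiently_early(150.0, 55.0))   # True under these assumptions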

    Wide area detection system: Conceptual design study

    An integrated sensor for traffic surveillance on mainline sections of urban freeways is described. Applicable imaging and processor technology is surveyed, and the functional requirements for the sensors and the conceptual design of the breadboard sensors are given. Parameters measured by the sensors include lane density, speed, and volume. The freeway image is also used for incident diagnosis.
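    To make the measured parameters concrete, the sketch below derives per-lane volume, mean speed, and density from a hypothetical list of vehicle observations over one surveillance interval; the record format and interval length are assumptions, not part of the sensor design.

    # Illustrative computation (assumptions only) of volume, speed, and lane
    # density from hypothetical per-vehicle observations in one interval.
    from collections import defaultdict

    def traffic_parameters(observations, interval_s):
        """observations: (lane, speed_mph) tuples seen during the interval."""
        per_lane = defaultdict(list)
        for lane, speed in observations:
            per_lane[lane].append(speed)
        results = {}
        for lane, speeds in per_lane.items():
            volume_vph = len(speeds) * 3600.0 / interval_s            # vehicles per hour
            mean_speed = sum(speeds) / len(speeds)                    # mph
            density = volume_vph / mean_speed if mean_speed else 0.0  # vehicles per mile
            results[lane] = {"volume_vph": volume_vph,
                             "speed_mph": mean_speed,
                             "density_veh_per_mile": density}
        return results

    # Example: five vehicles observed in a 30-second interval.
    obs = [(1, 62.0), (1, 58.0), (2, 45.0), (2, 47.0), (2, 44.0)]
    print(traffic_parameters(obs, interval_s=30.0))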