
    Paving the Roadway for Safety of Automated Vehicles: An Empirical Study on Testing Challenges

    Technology in the area of automated vehicles is advancing rapidly and promises many advantages. However, with the recent introduction of conditionally automated driving, we have also seen accidents. Test protocols for both conditionally automated (e.g., on highways) and fully automated vehicles do not yet exist, leaving researchers and practitioners with a range of challenges. For instance, current test procedures do not suffice for fully automated vehicles, which are supposed to be completely in charge of the driving task, with no driver as a backup. This paper presents current challenges of testing the functionality and safety of automated vehicles, derived from focus groups and interviews with 26 participants from five countries with backgrounds related to testing automotive safety. We provide an overview of the state of practice of testing active safety features, as well as challenges that need to be addressed in the future to ensure the safety of automated vehicles. The major challenges identified through the interviews and focus groups, enriched by literature on the topic, relate to 1) virtual testing and simulation, 2) safety, reliability, and quality, 3) sensors and sensor models, 4) required scenario complexity and the number of test cases, and 5) handover of responsibility between the driver and the vehicle.

    Benchmarking LiDAR Sensors for Development and Evaluation of Automotive Perception

    Environment perception and representation are among the most critical tasks in automated driving. To meet the stringent requirements of safety standards such as ISO 26262, there is a need for efficient quantitative evaluation of the perceived information. However, typical evaluation methods, such as comparison against annotated data, do not scale because of the manual effort involved; the process of data annotation therefore needs to be automated. This paper focuses on the LiDAR sensor and aims to identify the sensor's limitations and to provide a methodology for generating annotated data of measurable quality. The limitations of the sensor are analysed in a systematic literature review (SLR) of available academic texts, refined by unstructured interviews with experts. The main contributions are 1) the SLR with related interviews identifying LiDAR sensor limitations and 2) the associated methodology, which allows us to generate world representations.

    Systematic Analysis of the Sensor Coverage of Automated Vehicles Using Phenomenological Sensor Models

    The objective of this paper is to propose a systematic analysis of the sensor coverage of automated vehicles. Because the number of possible traffic situations is unlimited, the safety assessment of automated vehicles must rely on a selection of scenarios to be tested. This paper describes how phenomenological sensor models can be used to identify system-specific relevant scenarios. In automated driving, the following sensors are predominantly used: camera, ultrasonic, radar, and lidar. Based on the literature, phenomenological models have been developed for these four sensor types that take into account phenomena such as environmental influences, sensor properties, and the type of object to be detected. These phenomenological models are significantly more reliable than simple ideal sensor models and require lower computing costs than realistic physical sensor models, which makes them an optimal compromise for systematic investigations of sensor coverage. The simulations showed significant differences between system configurations and thus support the system-specific selection of relevant scenarios for the safety assessment of automated vehicles. Published at the 2019 IEEE Intelligent Vehicles Symposium (IV 2019), June 2019.
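    To illustrate the kind of phenomenological model the abstract describes, the sketch below scales a sensor's detection probability by range, a weather factor, and object reflectivity. The function name, the quadratic fall-off, and all parameter values are illustrative assumptions, not the models developed in the paper.

```python
def detection_probability(r, max_range, weather=1.0, reflectivity=1.0):
    """Illustrative phenomenological detection model (hypothetical, not
    the paper's): probability falls off with range and is scaled by a
    weather factor (1.0 = clear) and an object-reflectivity factor."""
    effective_range = max_range * weather * reflectivity
    if r >= effective_range:
        return 0.0
    # smooth quadratic fall-off toward the effective maximum range
    return 1.0 - (r / effective_range) ** 2

# Example: at 60 m in rain, a camera (assumed 100 m clear-weather range,
# strongly degraded by weather) vs. a radar (150 m, mildly degraded).
p_cam = detection_probability(60.0, 100.0, weather=0.7)
p_rad = detection_probability(60.0, 150.0, weather=0.9)
```

    A model at this level of abstraction captures environment- and object-dependent coverage differences between sensor types without the cost of physical ray-tracing, which is the compromise the paper argues for.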

    A framework to analyze noise factors of automotive perception sensors

    Automated vehicles (AVs) are one of the breakthroughs of this century. The main argument for their development is increased safety and the reduction of human and economic losses; however, demonstrating that AVs are safer than human drivers would require billions of miles of testing. Realistic simulation and virtual testing of AV systems and sensors are therefore crucial to accelerate technological readiness. In particular, perception sensor measurements are affected by uncertainties due to noise factors, and these uncertainties need to be included in simulations. This letter presents a framework to exhaustively analyze and simulate the effect of combinations of noise factors on sensor data. We applied the framework to one sensor, the light detection and ranging (LiDAR) sensor, but it can easily be adapted to study other sensors. Results demonstrate that single-noise-factor analysis gives incomplete knowledge of measurement degradation and that perception is dramatically hindered when multiple noise factors are combined. The proposed framework is a powerful tool for predicting the degradation of AV sensor performance.
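    A minimal sketch of the idea of combining noise factors on LiDAR data, assuming hypothetical factor names (rain, dust) and magnitudes that are not taken from the letter: each factor both widens the range jitter and raises the chance of losing a return, so combined factors degrade the data more than either factor alone.

```python
import random

def apply_noise(ranges, rain=0.0, dust=0.0, sensor_sigma=0.02, seed=0):
    """Sketch of combined noise-factor simulation on LiDAR range readings
    (illustrative factors, not the letter's framework). Each factor adds
    range-proportional Gaussian jitter; rain/dust can also drop returns."""
    rng = random.Random(seed)
    noisy = []
    for r in ranges:
        # probability of losing the return grows with the combined factors
        if rng.random() < min(0.9, rain * 0.3 + dust * 0.3):
            noisy.append(None)  # dropped point
            continue
        sigma = sensor_sigma + rain * 0.05 + dust * 0.05
        noisy.append(r + rng.gauss(0.0, sigma * r))
    return noisy

clean = [5.0, 10.0, 20.0]
single = apply_noise(clean, rain=0.5)              # one noise factor
combined = apply_noise(clean, rain=0.5, dust=0.5)  # combined factors
```

    Sweeping such factor combinations over a full point cloud, rather than one factor at a time, is the kind of exhaustive analysis the letter advocates.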

    People tracking by cooperative fusion of RADAR and camera sensors

    Accurate 3D tracking of objects from a monocular camera is challenging because depth is lost during projection. Although ranging by RADAR has proven effective in highway environments, people tracking remains beyond the capability of single-sensor systems. In this paper, we propose a cooperative RADAR-camera fusion method for people tracking on the ground plane. Using average person height, a joint detection likelihood is calculated by back-projecting detections from the camera onto the RADAR range-azimuth data. Peaks in the joint likelihood, representing candidate targets, are fed into a particle filter tracker. Depending on the association outcome, particles are updated using the associated detections (tracking by detection) or by sampling the raw likelihood itself (tracking before detection). Using the raw likelihood data has the advantage that lost targets continue to be tracked even if the camera or RADAR signal is below the detection threshold. We show that in single-target, uncluttered environments the proposed method clearly outperforms camera-only tracking. Experiments in a real-world urban environment further confirm that the cooperative fusion tracker produces significantly better estimates, even in difficult and ambiguous situations.
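    The back-projection step can be sketched as follows, assuming a standard pinhole camera model; the function names, the fixed 1.7 m person height, and the multiplicative fusion rule are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import math

def camera_to_range_azimuth(u, v_bottom, v_top, fx, fy, cx,
                            person_height=1.7):
    """Back-project a camera person detection onto the ground plane
    (pinhole model; illustrative sketch). Range follows from the
    detection's pixel height and an assumed average person height;
    azimuth follows from the horizontal pixel offset."""
    pixel_height = v_bottom - v_top
    rng = fy * person_height / pixel_height  # similar-triangles range
    azimuth = math.atan2(u - cx, fx)         # bearing from image column
    return rng, azimuth

def joint_likelihood(cam_lik, radar_lik):
    """Cooperative fusion sketch: multiply per-sensor likelihoods at the
    same range-azimuth cell (assumed rule, not the paper's exact one)."""
    return cam_lik * radar_lik

# A 170 px tall detection centred at column 400 in a 800 px focal-length
# camera with principal point at (320, 240):
r, az = camera_to_range_azimuth(u=400, v_bottom=500, v_top=330,
                                fx=800, fy=800, cx=320)
```

    Evaluating the camera likelihood on the RADAR's native range-azimuth grid is what lets the two sensors be fused before thresholding, which underlies the tracking-before-detection behaviour described above.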

    A systematic review of perception system and simulators for autonomous vehicles research

    This paper presents a systematic review of the perception systems and simulators for autonomous vehicles (AVs). The work is divided into three parts. In the first part, perception systems are categorized as environment perception systems and positioning estimation systems. The paper presents the physical fundamentals, operating principles, and electromagnetic spectrum of the most common sensors used in perception systems (ultrasonic, RADAR, LiDAR, cameras, IMU, GNSS, RTK, etc.). Furthermore, their strengths and weaknesses are discussed, and the quantification of their features using spider charts allows proper selection of different sensors based on 11 features. In the second part, the main elements to be taken into account in simulating the perception system of an AV are presented. For this purpose, the paper describes simulators for model-based development, the main game engines that can be used for simulation, simulators from the robotics field, and, lastly, simulators used specifically for AVs. Finally, the current state of regulations being applied in different countries around the world concerning the implementation of autonomous vehicles is presented. This work was partially supported by the DGT (ref. SPIP2017-02286) and GenoVision (ref. BFU2017-88300-C2-2-R) Spanish Government projects, and the "Research Programme for Groups of Scientific Excellence in the Region of Murcia" of the Seneca Foundation (Agency for Science and Technology in the Region of Murcia, 19895/GERM/15).