1,201 research outputs found

    Right-Hook Crash Scenario: Effects of Environmental Factors on Driver's Visual Attention and Crash Risk

    Get PDF
    A right-hook (RH) crash is a common type of bicycle–motor vehicle crash that occurs between a right-turning vehicle and a through-moving bicycle at an intersection in right-hand driving countries. Despite the frequency and severity of this crash type, no significant driver-performance-based evidence of the causes of RH crashes at signalized intersections was found in the literature. This study examined drivers' visual attention during right-turning maneuvers at signalized intersections with bicycle lanes but no exclusive right-turn lanes, while interacting with a bicyclist, to develop an understanding of RH crash causality. Fifty-one participants in 21 simulated road scenarios performed a right-turning maneuver at a signalized intersection while conflicting with traffic, pedestrians, and bicyclists. In total, 820 (41 × 20) observable right-turn maneuvers with visual attention data were analyzed. The results show that in the presence of conflicting oncoming left-turning vehicular traffic, drivers allocated less visual attention to the approaching bicyclist, making the bicyclist less likely to be detected. The presence of oncoming left-turning traffic, the bicyclist's speed and relative position, and conflicting pedestrians were found to increase the risk of RH crashes. The results of the current study will help identify effective crash mitigation strategies, which may include improving the vehicle–human interface or implementing design treatments in the road environment to improve driver and bicyclist performance.
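    As a worked illustration of the attention comparison described above, the sketch below (with entirely hypothetical glance records, since the study's data are not reproduced here) computes the share of glance time spent on the bicyclist with and without oncoming left-turning traffic.
```python
# Minimal sketch (hypothetical data layout): summarize how much visual attention
# drivers allocate to the approaching bicyclist with vs. without oncoming
# left-turning traffic, from per-trial glance records.
from statistics import mean

# Each record: (participant_id, oncoming_left_turners_present, seconds_on_bicyclist, trial_seconds)
trials = [
    (1, True, 0.9, 12.0), (1, False, 2.1, 11.5),
    (2, True, 1.2, 13.2), (2, False, 2.6, 12.8),
]

def attention_share(records, present):
    shares = [on / total for _, p, on, total in records if p == present]
    return mean(shares)

print("share of glance time on bicyclist, oncoming traffic present:",
      round(attention_share(trials, True), 3))
print("share of glance time on bicyclist, oncoming traffic absent: ",
      round(attention_share(trials, False), 3))
```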

    Surrounding Vehicle Path Prediction and Motion Planning Algorithm for Autonomous Driving at Urban Intersections

    Get PDF
    Doctoral dissertation (Ph.D.), Seoul National University, College of Engineering, Department of Mechanical and Aerospace Engineering, February 2020. Advisor: Kyongsu Yi.
    The focus of automotive research has been expanding from passive safety systems to active safety systems with advances in sensing and processing technologies. Recently, major automotive manufacturers have commercialized active safety systems such as adaptive cruise control (ACC), lane keeping assistance (LKA), and autonomous emergency braking (AEB). Such advances have extended the research field beyond active safety systems to automated driving systems in pursuit of zero fatalities. In particular, automated driving on urban roads has become a key issue because urban roads contain numerous risk factors for traffic accidents, such as sidewalks, blind spots, on-street parking, motorcycles, and pedestrians, which cause higher accident and fatality rates than motorways. Several projects have been conducted, and many others are underway, to evaluate the environmental, demographic, social, and economic effects of automated driving. For example, the European project AdaptIVe develops various automated driving functions and defines specific evaluation methodologies. In addition, CityMobil2 successfully integrated driverless intelligent vehicles in nine different environments throughout Europe. In Japan, the Automated Driving System Research Project, launched in May 2014, focuses on the development and verification of automated driving systems and next-generation urban transportation. A careful review of the literature shows that automated driving systems can increase the safety of traffic users, reduce traffic congestion, and improve driver convenience. Various methodologies have been employed to develop the core technologies of automated vehicles on urban roads, such as perception, motion planning, and control. However, most state-of-the-art automated driving research has developed each technology separately; consequently, designing automated driving systems from an integrated perspective has not yet been sufficiently considered. Therefore, this dissertation develops a fully autonomous driving algorithm for complex urban road environments using LiDAR, vision, GPS, and a simple path map. The proposed autonomous driving algorithm covers urban road scenarios, including uncontrolled intersections, based on vehicle motion prediction and a model predictive control approach. Four research issues are mainly considered: dynamic and static environment representation, and longitudinal and lateral motion planning. The thesis presents an overview of the proposed motion planning algorithm for urban autonomous driving, and experimental results in real traffic show the effectiveness and human-like behavior of the proposed algorithm. The algorithm has been tested and evaluated using both simulation and vehicle tests; the results show robust performance in urban scenarios, including uncontrolled intersections.
    Table of contents:
    Chapter 1 Introduction: 1.1 Background and Motivation; 1.2 Previous Researches; 1.3 Thesis Objectives; 1.4 Thesis Outline
    Chapter 2 Overview of Motion Planning for Automated Driving System
    Chapter 3 Dynamic Environment Representation with Motion Prediction: 3.1 Moving Object Classification; 3.2 Vehicle State based Direct Motion Prediction (Data Collection Vehicle, Target Roads, Dataset Selection, Network Architecture, Input and Output Features, Encoder and Decoder, Sequence Length); 3.3 Road Structure based Interactive Motion Prediction (Maneuver Definition, Network Architecture, Path Following Model based State Predictor, Estimation of Predictor Uncertainty, Motion Parameter Estimation, Interactive Maneuver Prediction); 3.4 Intersection Approaching Vehicle Motion Prediction (Driver Behavior Model at Intersections, Intention Inference based State Prediction)
    Chapter 4 Static Environment Representation: 4.1 Static Obstacle Map Construction; 4.2 Free Space Boundary Decision; 4.3 Drivable Corridor Decision
    Chapter 5 Longitudinal Motion Planning: 5.1 In-Lane Target Following; 5.2 Proactive Motion Planning for Narrow Road Driving (Motivation for Collision Preventive Velocity Planning, Desired Acceleration Decision); 5.3 Uncontrolled Intersection (Driving Phase and Mode Definition, State Machine for Driving Mode Decision, Motion Planner for Approach Mode, Motion Planner for Risk Management Phase)
    Chapter 6 Lateral Motion Planning: 6.1 Vehicle Model; 6.2 Cost Function and Constraints
    Chapter 7 Performance Evaluation: 7.1 Motion Prediction (Prediction Accuracy Analysis of Vehicle State based Direct Motion Predictor, Prediction Accuracy and Effect Analysis of Road Structure based Interactive Motion Predictor); 7.2 Prediction based Distance Control at Urban Roads (Driving Data Analysis of Direct Motion Predictor Application at Urban Roads, Case Study of Vehicle Test at Urban Roads, Analysis of Vehicle Test Results on Urban Roads); 7.3 Complex Urban Roads (Case Study of Vehicle Test at Complex Urban Roads, Closed-loop Simulation based Safety Analysis); 7.4 Uncontrolled Intersections (Simulation based Algorithm Comparison of Motion Planner, Monte-Carlo Simulation based Safety Analysis, Vehicle Test Results in Real Traffic Conditions, Similarity Analysis between Human and Automated Vehicle); 7.5 Multi-Lane Turn Intersections (Case Study of a Multi-Lane Left Turn Scenario, Analysis of Motion Planning Application Results)
    Chapter 8 Conclusion and Future Works: 8.1 Conclusion; 8.2 Future Works
    Bibliography; Abstract in Korean
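    The abstract above describes longitudinal motion planning built on vehicle motion prediction and a model predictive control approach. As a rough illustration of that idea (not the dissertation's implementation; the horizon, acceleration set, and clearance are assumptions), the sketch below combines a constant-velocity prediction of a lead or crossing target with a brute-force search over candidate accelerations subject to a clearance constraint.
```python
# Minimal sketch: prediction-based longitudinal planning over a short horizon.
DT, HORIZON = 0.2, 20                      # 4 s horizon at 0.2 s steps (assumed)
A_CANDIDATES = [-3.0, -2.0, -1.0, 0.0, 1.0, 2.0]
MIN_CLEARANCE = 8.0                        # metres, assumed safety margin

def rollout(s0, v0, a):
    """Ego position/speed over the horizon under constant acceleration a."""
    s, v, traj = s0, v0, []
    for _ in range(HORIZON):
        v = max(0.0, v + a * DT)
        s += v * DT
        traj.append((s, v))
    return traj

def plan(ego_s, ego_v, target_s, target_v, v_des):
    best_a, best_cost = None, float("inf")
    for a in A_CANDIDATES:
        cost, feasible = 0.0, True
        for k, (s, v) in enumerate(rollout(ego_s, ego_v, a)):
            target_pred = target_s + target_v * DT * (k + 1)   # constant-velocity prediction
            if target_pred - s < MIN_CLEARANCE:                # clearance constraint
                feasible = False
                break
            cost += (v - v_des) ** 2 + 0.5 * a ** 2            # speed tracking + comfort
        if feasible and cost < best_cost:
            best_a, best_cost = a, cost
    return best_a if best_a is not None else min(A_CANDIDATES)  # brake hard if nothing is feasible

print(plan(ego_s=0.0, ego_v=12.0, target_s=40.0, target_v=5.0, v_des=14.0))
```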

    Vehicle Tracking and Motion Estimation Based on Stereo Vision Sequences

    Get PDF
    In this dissertation, a novel approach for estimating the trajectories of road vehicles such as cars, vans, or motorbikes from stereo image sequences is presented. Moving objects are detected and reliably tracked in real time from within a moving car. The resulting information on the pose and motion state of other moving objects with respect to the ego vehicle is an essential basis for future driver assistance and safety systems, e.g., for collision prediction. The focus of this contribution is on oncoming traffic, while most existing work in the literature addresses tracking the lead vehicle. The overall approach is generic and scalable to a variety of traffic scenes, including inner-city, country road, and highway scenarios. A considerable part of this thesis addresses oncoming traffic at urban intersections. The parameters to be estimated include the 3D position and orientation of an object relative to the ego vehicle, as well as the object's shape, dimensions, velocity, acceleration, and rotational velocity (yaw rate). The key idea is to derive these parameters from a set of tracked 3D points on the object's surface, which are registered to a time-consistent object coordinate system, by means of an extended Kalman filter. Combining the rigid 3D point cloud model with the dynamic model of a vehicle is one main contribution of this thesis. Vehicle tracking at intersections requires covering a wide range of object dynamics, since vehicles can turn quickly. Three different approaches for tracking objects during highly dynamic turn maneuvers, up to extreme maneuvers such as skidding, are presented and compared. These approaches allow for an online adaptation of the filter parameter values, avoiding manual parameter tuning that depends on the dynamics of the tracked object in the scene; this is the second main contribution. Further contributions include two initialization methods, robust outlier handling, a probabilistic approach for assigning new points to a tracked object, and mid-level fusion of the vision-based approach with a radar sensor. The overall system is systematically evaluated on both simulated and real-world data. The experimental results show that the proposed system is able to accurately estimate the object pose and motion parameters in a variety of challenging situations, including night scenes, quick turn maneuvers, and partial occlusions. The limits of the system are also carefully investigated.
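    The tracking approach above registers 3D points to an object frame and estimates pose and motion with an extended Kalman filter combined with a vehicle dynamics model. A heavily simplified stand-in is sketched below: an EKF with a constant turn-rate and velocity (CTRV) model updated with a 2D position measurement (for example, the centroid of the tracked point cloud); all noise values are assumptions and the Jacobian is taken numerically.
```python
# Minimal sketch of an EKF step with a CTRV motion model.
# State: [x, y, yaw, speed, yaw_rate].
import numpy as np

def ctrv(x, dt):
    px, py, yaw, v, wz = x
    if abs(wz) > 1e-4:
        px += v / wz * (np.sin(yaw + wz * dt) - np.sin(yaw))
        py += v / wz * (-np.cos(yaw + wz * dt) + np.cos(yaw))
    else:
        px += v * np.cos(yaw) * dt
        py += v * np.sin(yaw) * dt
    return np.array([px, py, yaw + wz * dt, v, wz])

def jacobian(f, x, dt, eps=1e-6):
    """Numerical Jacobian of the motion model, avoiding hand-derived expressions."""
    n = len(x)
    J = np.zeros((n, n))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        J[:, i] = (f(x + dx, dt) - f(x - dx, dt)) / (2 * eps)
    return J

def ekf_step(x, P, z, dt, Q, R):
    F = jacobian(ctrv, x, dt)
    x_pred = ctrv(x, dt)
    P_pred = F @ P @ F.T + Q
    H = np.array([[1, 0, 0, 0, 0], [0, 1, 0, 0, 0]])   # observe x, y only
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ y, (np.eye(5) - K @ H) @ P_pred

x = np.array([0.0, 0.0, 0.1, 10.0, 0.2])                # initial state (assumed)
P = np.eye(5)
Q = np.diag([0.1, 0.1, 0.01, 0.5, 0.05])                # process noise (assumed)
R = np.diag([0.3, 0.3])                                 # measurement noise (assumed)
x, P = ekf_step(x, P, z=np.array([1.0, 0.15]), dt=0.1, Q=Q, R=R)
print(np.round(x, 3))
```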

    Exploring Older Driver Lateral Head Rotations at Intersections Using Naturalistic Driving Data

    Get PDF
    This study was a meta-analysis across two naturalistic driving databases that were collected in the same geographic area but focused on distinct age groups. Differences in the range of lateral head rotation between older and middle-aged drivers traversing the same pathway through unprotected left-turn intersections were examined. These driving scenarios are known to be among the riskiest and most difficult for older drivers, who demonstrated an increased range of head rotation compared with their middle-aged counterparts. These results are interpreted in the context of possible compensation for reduced fields of view.
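    A minimal sketch of the head-rotation comparison (with hypothetical yaw samples, since the naturalistic data are not available here): the range of lateral head rotation is computed per intersection traversal and averaged per age group.
```python
# Minimal sketch: range of lateral head rotation (max yaw minus min yaw, degrees)
# per left-turn traversal, compared between the two age groups.
from statistics import mean

def rotation_range(yaw_series_deg):
    return max(yaw_series_deg) - min(yaw_series_deg)

older = [[-70, -10, 55, 80], [-65, 0, 60, 85]]           # per-traversal head-yaw samples (hypothetical)
middle_aged = [[-55, -5, 40, 60], [-50, 5, 45, 58]]

print("older drivers, mean range (deg):      ", mean(map(rotation_range, older)))
print("middle-aged drivers, mean range (deg):", mean(map(rotation_range, middle_aged)))
```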

    A car lane-changing model under bus priority-lane effects

    Get PDF
    Car lane-changing behaviour has been well investigated at merging locations and weaving sections, where lane changes are usually due to different origin-destination trip purposes. However, lane-changing behaviour under the effects of bus priority lanes in urban streets has not received much attention. This kind of behaviour is found to depend initially on the presence of oncoming buses in the priority lanes. In this paper, a car lane-changing model under bus priority-lane effects in urban streets is proposed. The model comprises three steps: a looking-back threshold determination, a gap acceptance model, and an execution model. The model's parameters are estimated jointly using the maximum likelihood method. The results show that car lane-changing behaviour under bus priority-lane effects in urban streets is compulsory in nature; it has specific characteristics, with smaller critical gaps than in other, normal lane-changing cases, and can be modelled by the proposed framework.
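    As an illustration of the gap-acceptance and maximum-likelihood estimation steps (not the paper's exact model specification; the data and functional form are assumptions), the sketch below fits a binary logit gap-acceptance model to hypothetical accept/reject observations and recovers a critical gap.
```python
# Minimal sketch: P(accept | gap) = sigmoid(b0 + b1 * gap), with b0, b1 estimated
# jointly by maximizing the likelihood of observed accept/reject decisions.
import numpy as np
from scipy.optimize import minimize

gaps    = np.array([1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])   # offered gaps (s), hypothetical
accepts = np.array([0,   0,   0,   1,   0,   1,   1,   1  ])   # 1 = lane change executed

def neg_log_likelihood(beta):
    b0, b1 = beta
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * gaps)))
    eps = 1e-9
    return -np.sum(accepts * np.log(p + eps) + (1 - accepts) * np.log(1 - p + eps))

result = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
b0, b1 = result.x
print("estimated coefficients:", np.round(result.x, 2))
print("critical gap (P = 0.5):", round(-b0 / b1, 2), "s")   # gap at which acceptance probability is 50%
```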

    ๊ต์ฐจ๋กœ์—์„œ ์ž์œจ์ฃผํ–‰ ์ฐจ๋Ÿ‰์˜ ์ œํ•œ๋œ ๊ฐ€์‹œ์„ฑ๊ณผ ๋ถˆํ™•์‹ค์„ฑ์„ ๊ณ ๋ คํ•œ ์ข…๋ฐฉํ–ฅ ๊ฑฐ๋™๊ณ„ํš

    Get PDF
    Doctoral dissertation (Ph.D.), Seoul National University, College of Engineering, Department of Mechanical Engineering, February 2023. Advisor: Kyongsu Yi.
    This dissertation presents a novel longitudinal motion planning method for autonomous vehicles at urban intersections that overcomes the limited visibility caused by complicated road structures and sensor specifications, guaranteeing safety against potential collisions with vehicles appearing from occluded regions. Intersection autonomous driving requires a high level of safety because of congested traffic and environmental complexity. Owing to complicated road structures and the detection range of the perception sensors, occluded regions arise in urban autonomous driving. The virtual target is one motion planning method for reacting to the sudden appearance of vehicles from such blind spots. Gaussian Process Regression (GPR) is used to train the virtual target model, which generates various future driving trajectories that interact with the motion of the ego vehicle. The GPR model provides not only the predicted trajectories of the virtual target but also the uncertainty of its future motion. The prediction results from the GPR can therefore be used as a position constraint for Model Predictive Control (MPC), and the uncertainties are taken into account as a chance constraint in the MPC. To comprehend the surrounding environment, including dynamic objects, a region of interest (ROI) is defined to determine the targets of interest. With the pre-determined driving route of the ego vehicle and the route information of the intersection, the driving lanes intersecting the ego lane can be determined and defined as the ROI, reducing the computational load by eliminating targets of no interest. The future motion of each selected target is then predicted by a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN). The driving data for training were obtained directly from two different autonomous vehicles, which provide their odometry information regardless of the limited field of view (FOV). In widely known autonomous driving datasets such as Waymo and nuScenes, the vehicle odometry information is collected from the perception sensors mounted on the test vehicle, so information on targets outside the FOV of the test vehicle cannot be obtained. The obtained training data are organized in target-centered coordinates for better input-domain adaptation and generalization. The mean squared error and negative log-likelihood loss functions are adopted for training and provide the uncertainty information of the target vehicle for the motion planning of the autonomous vehicle. An MPC with a chance constraint is formulated to optimize the longitudinal motion of the autonomous vehicle. Dynamic and actuator constraints are designed to provide ride comfort and safety to drivers. The position constraint, together with the chance constraint, guarantees safety and prevents potential collisions with target vehicles. The position constraint on the travel distance over the prediction horizon is determined from the clearance between the predicted trajectories of the target and the ego vehicle at every prediction sample time. The performance and feasibility of the proposed algorithm are evaluated via computer simulation and test-data-based simulation. The offline simulations validate the safety of the proposed algorithm, and the proposed motion planner has been implemented on an autonomous vehicle and tested on real roads, confirming that the algorithm is applicable to real-life autonomous driving.
    Table of contents:
    Chapter 1 Introduction: 1.1 Research Background and Motivation of Intersection Autonomous Driving; 1.2 Previous Researches on Intersection Autonomous Driving (Research on Trajectory Prediction and Intention Inference at Urban Intersection, Research on Intersection Motion Planning); 1.3 Thesis Objectives; 1.4 Thesis Outline
    Chapter 2 Overall Architecture of Intersection Autonomous Driving System: 2.1 Software Configuration of Intersection Autonomous Driving; 2.2 Hardware Configuration of Autonomous Driving and Test Vehicle; 2.3 Vehicle Test Environment for Intersection Autonomous Driving
    Chapter 3 Virtual Target Modelling for Intersection Motion Planning: 3.1 Limitation of Conventional Virtual Target Model for Intersection; 3.2 Virtual Target Generation for Intersection Occlusion; 3.3 Intersection Virtual Target Modeling (Gaussian Process Regression based Virtual Target Model at Intersection, Data Processing for Gaussian Process Regression based Virtual Target Model, Definition of Visibility Index of Virtual Target at Intersection, Long Short-Term Memory based Virtual Target Model at Intersection)
    Chapter 4 Surrounding Vehicle Motion Prediction at Intersection: 4.1 Intersection Surrounding Vehicle Classification; 4.2 Data-driven Vehicle State based Motion Prediction at Intersection (Network Architecture of Motion Predictor, Dataset Processing of the Network)
    Chapter 5 Intersection Longitudinal Motion Planning: 5.1 Outlines of Longitudinal Motion Planning with Model Predictive Control; 5.2 Stochastic Model Predictive Control of Intersection Motion Planner (Definition of System Dynamics Model, Ego Vehicle Prediction and Reference States Definition, Safety Clearance Decision for Intersection Collision Avoidance, Driving Mode Decision of Intersection Motion Planning, Formulation of Model Predictive Control with the Chance Constraint)
    Chapter 6 Performance Evaluation of Intersection Longitudinal Motion Planning: 6.1 Performance Evaluation of Virtual Target Prediction at Intersection (GPR based Virtual Target Model Prediction Results; Intersection Autonomous Driving Computer Simulation Environment, with simulation results for the effect of the virtual target and for right-turn-across-path, straight-across-path, left-turn-across-path, and crooked T-shaped intersection scenarios); 6.2 Performance Evaluation of Data-driven Vehicle State based Motion Prediction at Intersection (Data-driven Motion Prediction Accuracy Analysis, Prediction Trajectory Accuracy Analysis); 6.3 Vehicle Test for Intersection Autonomous Driving (Test Vehicle Configuration, Software Configuration for Autonomous Vehicle Operation, Vehicle Test Environment, Vehicle Test Result of Intersection Autonomous Driving)
    Chapter 7 Conclusion and Future Work: 7.1 Conclusion; 7.2 Future Work
    Bibliography; Abstract in Korean
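    The core idea above is that a GPR-based virtual target supplies both a predicted trajectory and its uncertainty, which enter the MPC as a chance constraint. A minimal sketch of that idea follows (not the dissertation's code; the data, kernel, distances, and confidence level are assumptions): the GP's predictive standard deviation is used to tighten the ego position bound near the occluded conflict zone.
```python
# Minimal sketch: GP prediction of an occluded "virtual target" with uncertainty,
# converted into a tightened position bound for a chance-constrained planner.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: elapsed time [s] vs. distance the virtual target
# has travelled towards the conflict point [m].
t_train = np.array([[0.0], [0.5], [1.0], [1.5], [2.0], [2.5], [3.0]])
d_train = np.array([0.0, 3.2, 6.1, 9.5, 12.4, 16.0, 19.1])

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-2))
gpr.fit(t_train, d_train)

horizon = np.arange(0.5, 3.5, 0.5).reshape(-1, 1)      # prediction horizon [s]
d_mean, d_std = gpr.predict(horizon, return_std=True)

target_dist_to_conflict = 15.0   # virtual target's start to the conflict zone [m] (assumed)
ego_dist_to_conflict = 25.0      # ego's current position to the conflict zone [m] (assumed)
k_sigma = 1.645                  # one-sided 95% confidence level

# If the target could already have reached the conflict zone (mean + k*sigma beyond its
# distance to the zone), cap the ego's travel so it stops 5 m short of the zone entry;
# otherwise leave that horizon step unconstrained.
target_may_occupy = d_mean + k_sigma * d_std >= target_dist_to_conflict
ego_position_bound = np.where(target_may_occupy, ego_dist_to_conflict - 5.0, np.inf)
print(np.round(ego_position_bound, 1))
```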

    HAZARD PERCEPTION TRAINING FOR ADOLESCENTS WITH AUTISM SPECTRUM DISORDER ON THE INTERACTIVE DRIVING SIMULATOR: USING EYE TRACKING TECHNOLOGY TO DETERMINE EFFECTIVENESS

    Get PDF
    Rationale: Driving is an important developmental milestone for all adolescents, as it increases their independence and ability to participate in vehicle-dependent activities. However, adolescents with high-functioning autism spectrum disorder (HFASD) are less likely to obtain licenses and drive independently due to characteristics related to their diagnosis. Although current research explores the efficacy of driving simulator training and eye tracking for adolescent drivers with HFASD, there is a gap in the literature on simulator training and its effects on overall driving performance and on hazard perception and response in this population. Purpose: This pilot study used a simulator training protocol that included hazard perception to determine its effect on overall driving performance. Eye tracking technology was used to determine whether there was a change in hazard perception and response to non-social and social hazards after training. Design: This study was a one-group, pretest-posttest intervention design. Methods: There were 17 participants between the ages of 15 and 22 with a self-reported diagnosis of ASD and a desire to learn to drive independently. Each participant completed a pre-test and a post-test on the driving simulator while wearing eye tracking technology, and completed a protocol of 30 learning modules with scenarios related to driving skills and hazard detection and response in one-to-one training. Analysis: Driving performance was measured by a quantitative score from a standardized observational tool for driving. Eye tracking measures, including fixation duration, fixation count, and time to first fixation, were analyzed using a Wilcoxon signed-rank test. Results: Participants significantly increased their overall driving performance scores from pre-test to post-test. Hazard perception results based on eye tracking tended towards improvement overall, but results for specific hazards were inconsistent and varied for both non-social and social hazards in terms of fixation duration, fixation count, and time to first fixation. Discussion: Findings from this study indicate that driving simulator training targeting hazard perception was effective in improving overall driving simulator performance in adolescents with HFASD. Additionally, findings indicate that hazard perception and response differ in this population after hazard perception training, but specific eye tracking measures may increase or decrease, and results may not be specific to non-social or social hazards.
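    The eye-tracking analysis above relies on the Wilcoxon signed-rank test for paired pre/post measures. A minimal sketch with hypothetical fixation-duration values:
```python
# Minimal sketch: Wilcoxon signed-rank test on a paired eye-tracking measure,
# one value per participant per phase (hypothetical data).
from scipy.stats import wilcoxon

pre_training  = [0.42, 0.35, 0.51, 0.48, 0.30, 0.44, 0.39, 0.55, 0.47, 0.33]
post_training = [0.58, 0.41, 0.49, 0.62, 0.37, 0.52, 0.45, 0.60, 0.51, 0.40]

stat, p_value = wilcoxon(pre_training, post_training)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.3f}")
```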

    iDriving: Toward Safe and Efficient Infrastructure-directed Autonomous Driving

    Full text link
    Autonomous driving will become pervasive in the coming decades. iDriving improves the safety of autonomous driving at intersections and increases efficiency by improving traffic throughput. In iDriving, roadside infrastructure remotely drives an autonomous vehicle through an intersection by offloading perception and planning from the vehicle to the roadside infrastructure. To achieve this, iDriving must be able to process voluminous sensor data at full frame rate with a tail latency of less than 100 ms, without sacrificing accuracy. We describe algorithms and optimizations that enable it to achieve this goal using an accurate and lightweight perception component that reasons on composite views derived from overlapping sensors, and a planner that jointly plans trajectories for multiple vehicles. In our evaluations, iDriving always ensures safe passage of vehicles, while autonomous driving alone can only do so 27% of the time. iDriving also results in 5x lower wait times than other approaches because it enables traffic-light-free intersections.
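    The abstract does not detail iDriving's joint planner; as a generic illustration of how infrastructure-side joint planning can replace a traffic light, the sketch below implements a simple reservation-style scheduler (an assumed baseline, not iDriving's algorithm) that assigns non-overlapping conflict-zone occupancy windows to approaching vehicles.
```python
# Minimal sketch of a reservation-style, signal-free intersection scheduler:
# later arrivals are delayed just enough to keep a safety headway in the shared zone.
HEADWAY = 1.5        # s between successive vehicles inside the conflict zone (assumed)
CROSSING_TIME = 2.0  # s a vehicle needs to clear the zone (assumed)

def schedule(vehicles):
    """vehicles: list of (vehicle_id, earliest_arrival_s); returns assigned entry times."""
    plan, zone_free_at = {}, 0.0
    for vid, eta in sorted(vehicles, key=lambda v: v[1]):
        entry = max(eta, zone_free_at)          # wait only if the zone is still reserved
        plan[vid] = entry
        zone_free_at = entry + CROSSING_TIME + HEADWAY
    return plan

print(schedule([("northbound_1", 4.0), ("eastbound_1", 4.5), ("northbound_2", 6.0)]))
```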