
    ๋„์‹ฌ ๊ต์ฐจ๋กœ์—์„œ์˜ ์ž์œจ์ฃผํ–‰์„ ์œ„ํ•œ ์ฃผ๋ณ€ ์ฐจ๋Ÿ‰ ๊ฒฝ๋กœ ์˜ˆ์ธก ๋ฐ ๊ฑฐ๋™ ๊ณ„ํš ์•Œ๊ณ ๋ฆฌ์ฆ˜

    Thesis (Ph.D.) -- Seoul National University Graduate School, College of Engineering, Department of Mechanical and Aerospace Engineering, February 2020. Advisor: ์ด๊ฒฝ์ˆ˜.

    The focus of automotive research has been expanding from passive safety systems to active safety systems with advances in sensing and processing technologies. Recently, most major automotive manufacturers have already commercialized active safety systems such as adaptive cruise control (ACC), lane keeping assistance (LKA), and autonomous emergency braking (AEB).
    Such advances have extended the research field beyond active safety systems to automated driving systems, with the goal of achieving zero fatalities. In particular, automated driving on urban roads has become a key issue because urban roads contain numerous risk factors for traffic accidents, such as sidewalks, blind spots, on-street parking, motorcycles, and pedestrians, which lead to higher accident and fatality rates than on motorways. Several projects have been conducted, and many others are still underway, to evaluate the environmental, demographic, social, and economic effects of automated driving. For example, the European project AdaptIVe developed various automated driving functions and defined specific evaluation methodologies. In addition, CityMobil2 successfully integrated driverless intelligent vehicles in nine different environments across Europe. In Japan, the Automated Driving System Research Project, begun in May 2014, focuses on the development and verification of automated driving systems and next-generation urban transportation. A careful review of the literature shows that automated driving systems have been shown to increase the safety of traffic participants, reduce traffic congestion, and improve driver convenience. Various methodologies have been employed to develop the core technologies of automated vehicles on urban roads, such as perception, motion planning, and control. However, most state-of-the-art automated driving research has developed each of these technologies separately; consequently, the design of automated driving systems from an integrated perspective has not yet been sufficiently considered. Therefore, this dissertation aims to develop a fully autonomous driving algorithm for complex urban road environments using LiDAR, vision, GPS, and a simple path map. The proposed autonomous driving algorithm covers urban road scenarios, including uncontrolled intersections, based on vehicle motion prediction and a model predictive control approach. Four main research issues are addressed: dynamic and static environment representation, and longitudinal and lateral motion planning. This thesis provides an overview of the proposed motion planning algorithm for urban autonomous driving together with experimental results in real traffic, which demonstrate the effectiveness and human-like behavior of the proposed algorithm. The algorithm has been tested and evaluated in both simulation and vehicle tests; the results show robust performance in urban scenarios, including uncontrolled intersections.

    Contents:
    Chapter 1 Introduction
        1.1. Background and Motivation
        1.2. Previous Researches
        1.3. Thesis Objectives
        1.4. Thesis Outline
    Chapter 2 Overview of Motion Planning for Automated Driving System
    Chapter 3 Dynamic Environment Representation with Motion Prediction
        3.1. Moving Object Classification
        3.2. Vehicle State based Direct Motion Prediction
            3.2.1. Data Collection Vehicle
            3.2.2. Target Roads
            3.2.3. Dataset Selection
            3.2.4. Network Architecture
            3.2.5. Input and Output Features
            3.2.6. Encoder and Decoder
            3.2.7. Sequence Length
        3.3. Road Structure based Interactive Motion Prediction
            3.3.1. Maneuver Definition
            3.3.2. Network Architecture
            3.3.3. Path Following Model based State Predictor
            3.3.4. Estimation of Predictor Uncertainty
            3.3.5. Motion Parameter Estimation
            3.3.6. Interactive Maneuver Prediction
        3.4. Intersection Approaching Vehicle Motion Prediction
            3.4.1. Driver Behavior Model at Intersections
            3.4.2. Intention Inference based State Prediction
    Chapter 4 Static Environment Representation
        4.1. Static Obstacle Map Construction
        4.2. Free Space Boundary Decision
        4.3. Drivable Corridor Decision
    Chapter 5 Longitudinal Motion Planning
        5.1. In-Lane Target Following
        5.2. Proactive Motion Planning for Narrow Road Driving
            5.2.1. Motivation for Collision Preventive Velocity Planning
            5.2.2. Desired Acceleration Decision
        5.3. Uncontrolled Intersection
            5.3.1. Driving Phase and Mode Definition
            5.3.2. State Machine for Driving Mode Decision
            5.3.3. Motion Planner for Approach Mode
            5.3.4. Motion Planner for Risk Management Phase
    Chapter 6 Lateral Motion Planning
        6.1. Vehicle Model
        6.2. Cost Function and Constraints
    Chapter 7 Performance Evaluation
        7.1. Motion Prediction
            7.1.1. Prediction Accuracy Analysis of Vehicle State based Direct Motion Predictor
            7.1.2. Prediction Accuracy and Effect Analysis of Road Structure based Interactive Motion Predictor
        7.2. Prediction based Distance Control at Urban Roads
            7.2.1. Driving Data Analysis of Direct Motion Predictor Application at Urban Roads
            7.2.2. Case Study of Vehicle Test at Urban Roads
            7.2.3. Analysis of Vehicle Test Results on Urban Roads
        7.3. Complex Urban Roads
            7.3.1. Case Study of Vehicle Test at Complex Urban Roads
            7.3.2. Closed-loop Simulation based Safety Analysis
        7.4. Uncontrolled Intersections
            7.4.1. Simulation based Algorithm Comparison of Motion Planner
            7.4.2. Monte-Carlo Simulation based Safety Analysis
            7.4.3. Vehicle Test Results in Real Traffic Conditions
            7.4.4. Similarity Analysis between Human and Automated Vehicle
        7.5. Multi-Lane Turn Intersections
            7.5.1. Case Study of a Multi-Lane Left Turn Scenario
            7.5.2. Analysis of Motion Planning Application Results
    Chapter 8 Conclusion & Future Works
        8.1. Conclusion
        8.2. Future Works
    Bibliography
    Abstract in Korean
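    The abstract and table of contents above describe motion prediction for intersection-approaching vehicles feeding a state machine for driving-mode decision and an MPC-based planner. As a rough, hypothetical illustration of that kind of pipeline only (not the thesis's actual predictors or planner), the Python sketch below pairs a constant-velocity arrival-time prediction for a crossing vehicle with a simple time-gap-based cross/yield decision; the mode names, thresholds, and prediction model are assumptions made for this example.

```python
# Illustrative sketch only: constant-velocity prediction of a vehicle
# approaching an uncontrolled intersection, plus a time-gap-based
# cross/yield decision. The thesis uses learned, interaction-aware
# predictors and model predictive control; this is a simplified stand-in.
from dataclasses import dataclass


@dataclass
class ApproachingVehicle:
    distance_to_conflict: float  # meters to the conflict point
    speed: float                 # m/s, assumed constant for the prediction


def predicted_arrival_time(vehicle: ApproachingVehicle) -> float:
    """Constant-velocity estimate of when the vehicle reaches the conflict point."""
    if vehicle.speed <= 0.1:
        return float("inf")  # effectively stopped, never arrives
    return vehicle.distance_to_conflict / vehicle.speed


def decide_driving_mode(ego_distance: float, ego_speed: float,
                        others: list[ApproachingVehicle],
                        safety_gap_s: float = 3.0) -> str:
    """Return 'cross' if the ego vehicle clears the conflict point with a
    comfortable margin before every approaching vehicle arrives, else 'yield'."""
    t_ego = ego_distance / max(ego_speed, 0.1)
    for other in others:
        if predicted_arrival_time(other) < t_ego + safety_gap_s:
            return "yield"
    return "cross"


if __name__ == "__main__":
    crossing_traffic = [ApproachingVehicle(distance_to_conflict=40.0, speed=8.0)]
    print(decide_driving_mode(ego_distance=15.0, ego_speed=5.0, others=crossing_traffic))
```

    A deployed planner would replace the constant-velocity assumption with the interactive, uncertainty-aware predictors listed in Chapter 3 and smooth the resulting decision through the longitudinal planner rather than switching modes instantaneously.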

    Perception Intelligence Integrated Vehicle-to-Vehicle Optical Camera Communication.

    Ubiquitous use of cameras and LEDs in modern road and aerial vehicles opens up endless opportunities for novel applications in intelligent machine navigation, communication, and networking. To this end, this thesis hypothesizes the benefit of dual-mode use of vehicles' built-in cameras, combining novel machine perception capabilities with optical camera communication (OCC). The current conception of understanding line-of-sight (LOS) scenery centers on detecting objects, events, and road situations. The idea of blending non-line-of-sight (NLOS) information with LOS information to achieve a virtual see-through vision, however, is new; it improves assistive driving performance by enabling a machine to see beyond occlusions. Another aspect of OCC in the vehicular setting is understanding the nature of mobility and its impact on optical communication channel quality. The research questions raised by both car-to-car mobility modelling and the evaluation of a working OCC channel setup also carry over to aerial vehicles, such as drone-to-drone OCC. The aim of this thesis is to answer the research questions along these new application domains, in particular: (i) how to enable a virtual see-through perception in a car-assistance system that alerts the human driver to both visible and invisible critical driving events to help drive more safely; (ii) how transmitter and receiver cars behave while in motion, and how the OCC channel performs overall under mobility; (iii) how to help rescue lost Unmanned Aerial Vehicles (UAVs) through coordinated localization based on a fusion of OCC and WiFi; and (iv) how to model and simulate an in-field drone swarm operation in order to design and validate coordinated UAV localization for a group of positioning-distressed drones. To this end, the thesis presents the end-to-end system design, novel algorithms proposed to solve the challenges of applying such a system, and evaluation results obtained through experimentation and/or simulation.
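    The dual-mode camera use described above decodes data modulated onto vehicle LEDs from camera frames. As a minimal, hypothetical sketch of that basic idea only (not this thesis's system), the Python snippet below demodulates an on-off-keyed LED signal from per-frame brightness samples of the LED's region of interest; the one-bit-per-frame assumption, the midpoint threshold, and all helper names are simplifications introduced for illustration.

```python
# Minimal illustrative sketch of optical camera communication (OCC) decoding:
# recover on-off-keyed (OOK) bits from the mean brightness of the transmitter
# LED's region of interest (ROI) across successive camera frames. Assumes one
# bit per frame and perfect synchronization; a real receiver must also handle
# rolling-shutter effects, exposure changes, tracking, and channel coding.
import numpy as np


def roi_brightness(frame: np.ndarray, roi: tuple[int, int, int, int]) -> float:
    """Mean intensity of the LED region of interest given as (x, y, width, height)."""
    x, y, w, h = roi
    return float(frame[y:y + h, x:x + w].mean())


def decode_ook(brightness_per_frame: np.ndarray) -> list[int]:
    """Threshold each frame's ROI brightness halfway between the observed
    'off' and 'on' levels to recover the transmitted bit sequence."""
    threshold = 0.5 * (brightness_per_frame.min() + brightness_per_frame.max())
    return [int(level > threshold) for level in brightness_per_frame]


if __name__ == "__main__":
    # Synthetic example: bright frames encode 1, dim frames encode 0, plus noise.
    rng = np.random.default_rng(0)
    bits = [1, 0, 1, 1, 0, 0, 1]
    samples = np.array([200.0 if b else 40.0 for b in bits]) + rng.normal(0.0, 5.0, len(bits))
    print(decode_ook(samples))  # expected: [1, 0, 1, 1, 0, 0, 1]
```

    In a perception-integrated receiver, the same ROI would be supplied by an object detector tracking the transmitting vehicle's lights rather than being fixed in advance.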

    Seamless Interactions Between Humans and Mobility Systems

    As mobility systems, including vehicles and roadside infrastructure, enter a period of rapid and profound change, it is important to enhance interactions between people and mobility systems. Seamless human-mobility system interactions can promote widespread deployment of engaging applications, which are crucial for driving safety and efficiency. The ever-increasing penetration rate of ubiquitous computing devices, such as smartphones and wearable devices, can facilitate realization of this goal. Although researchers and developers have attempted to adapt ubiquitous sensors for mobility applications (e.g., navigation apps), these solutions often suffer from limited usability and can be risk-prone. The root causes of these limitations include the low sensing modality and limited computational power available in ubiquitous computing devices. We address these challenges by developing novel sensing techniques and machine learning methods and demonstrating that they can extract essential, safety-critical information from drivers' natural driving behavior, even actions as subtle as steering maneuvers (e.g., left-/right-hand turns and lane changes). We first show how ubiquitous sensors can be used to detect steering maneuvers regardless of disturbances to the sensing devices. Next, by focusing on turning maneuvers, we characterize drivers' driving patterns using a quantifiable metric. Then, we demonstrate how microscopic analyses of crowdsourced ubiquitous sensory data can be used to infer critical macroscopic contextual information, such as risks present at road intersections. Finally, we use ubiquitous sensors to profile a driver's behavioral patterns on a large scale; such sensors prove essential to the analysis and improvement of drivers' driving behavior.

    PhD, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/163127/1/chendy_1.pd
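    The abstract above describes detecting steering maneuvers from ubiquitous sensors such as smartphone inertial units. As a hypothetical illustration of that general idea only (not the dissertation's method), the Python sketch below flags left and right turns by integrating gyroscope yaw rate over a sliding window; the sampling rate, window length, heading-change threshold, and function names are assumed values chosen for this example.

```python
# Illustrative sketch: detect left/right turns from smartphone gyroscope yaw
# rate by integrating heading change over a sliding window. The 45-degree
# threshold, 10 Hz sampling rate, and 4 s window are assumptions; a deployed
# system would also need device-to-vehicle coordinate alignment and filtering.
import numpy as np


def detect_turns(yaw_rate_rad_s: np.ndarray, fs_hz: float = 10.0,
                 window_s: float = 4.0, threshold_deg: float = 45.0) -> list[tuple[int, str]]:
    """Return (start_sample_index, 'left' | 'right') for windows whose
    accumulated heading change exceeds the threshold."""
    dt = 1.0 / fs_hz
    win = int(window_s * fs_hz)
    events = []
    for start in range(0, len(yaw_rate_rad_s) - win, win):
        heading_change = np.degrees(np.sum(yaw_rate_rad_s[start:start + win]) * dt)
        if heading_change > threshold_deg:
            events.append((start, "left"))    # positive yaw rate: counter-clockwise
        elif heading_change < -threshold_deg:
            events.append((start, "right"))
    return events


if __name__ == "__main__":
    straight = np.zeros(50)                    # 5 s of straight driving
    left_turn = np.full(40, np.radians(25.0))  # ~100 degrees of heading change over 4 s
    trace = np.concatenate([straight, left_turn, straight])
    print(detect_turns(trace, fs_hz=10.0))     # expected: one 'left' event
```

    As the abstract notes, such heading-based features are only meaningful once the sensor stream has been made robust to disturbances in how the device is oriented and placed.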