211,532 research outputs found

    ๋„์‹ฌ ๊ต์ฐจ๋กœ์—์„œ์˜ ์ž์œจ์ฃผํ–‰์„ ์œ„ํ•œ ์ฃผ๋ณ€ ์ฐจ๋Ÿ‰ ๊ฒฝ๋กœ ์˜ˆ์ธก ๋ฐ ๊ฑฐ๋™ ๊ณ„ํš ์•Œ๊ณ ๋ฆฌ์ฆ˜

    Get PDF
    Doctoral dissertation (Ph.D.) -- Seoul National University Graduate School, College of Engineering, Department of Mechanical and Aerospace Engineering, February 2020. Advisor: Kyongsu Yi.
    The focus of automotive research has been expanding from passive safety systems to active safety systems with advances in sensing and processing technologies. Recently, major automotive manufacturers have already commercialized active safety systems such as adaptive cruise control (ACC), lane keeping assistance (LKA), and autonomous emergency braking (AEB).
Such advances have extended the research field beyond active safety systems to automated driving systems in pursuit of zero fatalities. In particular, automated driving on urban roads has become a key issue because urban roads contain numerous risk factors for traffic accidents, such as sidewalks, blind spots, on-street parking, motorcycles, and pedestrians, which cause higher accident and fatality rates than motorways. Several projects have been conducted, and many others are still underway, to evaluate the effects of automated driving in environmental, demographic, social, and economic terms. For example, the European project AdaptIVe develops various automated driving functions and defines specific evaluation methodologies. In addition, CityMobil2 successfully integrated driverless intelligent vehicles in nine different environments throughout Europe. In Japan, the Automated Driving System Research Project, begun in May 2014, focuses on the development and verification of automated driving systems and next-generation urban transportation. A careful review of the literature shows that automated driving systems increase the safety of traffic participants, reduce traffic congestion, and improve driver convenience. Various methodologies have been employed to develop the core technologies of automated vehicles on urban roads, such as perception, motion planning, and control. However, current state-of-the-art automated driving research tends to develop each technology separately; consequently, designing automated driving systems from an integrated perspective has not yet been sufficiently considered. Therefore, this dissertation focuses on developing a fully autonomous driving algorithm for complex urban scenarios using LiDAR, vision, GPS, and a simple path map. The proposed autonomous driving algorithm covers urban road scenarios, including uncontrolled intersections, based on vehicle motion prediction and a model predictive control approach. Four research issues are mainly considered: dynamic and static environment representation, and longitudinal and lateral motion planning. The remainder of this thesis provides an overview of the proposed motion planning algorithm for urban autonomous driving and presents experimental results in real traffic, which show the effectiveness and human-like behavior of the proposed algorithm. The proposed algorithm has been tested and evaluated in both simulation and vehicle tests, and the results show robust performance in urban scenarios, including uncontrolled intersections.

Table of Contents:
Chapter 1 Introduction
    1.1. Background and Motivation
    1.2. Previous Researches
    1.3. Thesis Objectives
    1.4. Thesis Outline
Chapter 2 Overview of Motion Planning for Automated Driving System
Chapter 3 Dynamic Environment Representation with Motion Prediction
    3.1. Moving Object Classification
    3.2. Vehicle State based Direct Motion Prediction
        3.2.1. Data Collection Vehicle
        3.2.2. Target Roads
        3.2.3. Dataset Selection
        3.2.4. Network Architecture
        3.2.5. Input and Output Features
        3.2.6. Encoder and Decoder
        3.2.7. Sequence Length
    3.3. Road Structure based Interactive Motion Prediction
        3.3.1. Maneuver Definition
        3.3.2. Network Architecture
        3.3.3. Path Following Model based State Predictor
        3.3.4. Estimation of Predictor Uncertainty
        3.3.5. Motion Parameter Estimation
        3.3.6. Interactive Maneuver Prediction
    3.4. Intersection Approaching Vehicle Motion Prediction
        3.4.1. Driver Behavior Model at Intersections
        3.4.2. Intention Inference based State Prediction
Chapter 4 Static Environment Representation
    4.1. Static Obstacle Map Construction
    4.2. Free Space Boundary Decision
    4.3. Drivable Corridor Decision
Chapter 5 Longitudinal Motion Planning
    5.1. In-Lane Target Following
    5.2. Proactive Motion Planning for Narrow Road Driving
        5.2.1. Motivation for Collision Preventive Velocity Planning
        5.2.2. Desired Acceleration Decision
    5.3. Uncontrolled Intersection
        5.3.1. Driving Phase and Mode Definition
        5.3.2. State Machine for Driving Mode Decision
        5.3.3. Motion Planner for Approach Mode
        5.3.4. Motion Planner for Risk Management Phase
Chapter 6 Lateral Motion Planning
    6.1. Vehicle Model
    6.2. Cost Function and Constraints
Chapter 7 Performance Evaluation
    7.1. Motion Prediction
        7.1.1. Prediction Accuracy Analysis of Vehicle State based Direct Motion Predictor
        7.1.2. Prediction Accuracy and Effect Analysis of Road Structure based Interactive Motion Predictor
    7.2. Prediction based Distance Control at Urban Roads
        7.2.1. Driving Data Analysis of Direct Motion Predictor Application at Urban Roads
        7.2.2. Case Study of Vehicle Test at Urban Roads
        7.2.3. Analysis of Vehicle Test Results on Urban Roads
    7.3. Complex Urban Roads
        7.3.1. Case Study of Vehicle Test at Complex Urban Roads
        7.3.2. Closed-loop Simulation based Safety Analysis
    7.4. Uncontrolled Intersections
        7.4.1. Simulation based Algorithm Comparison of Motion Planner
        7.4.2. Monte-Carlo Simulation based Safety Analysis
        7.4.3. Vehicle Test Results in Real Traffic Conditions
        7.4.4. Similarity Analysis between Human and Automated Vehicle
    7.5. Multi-Lane Turn Intersections
        7.5.1. Case Study of a Multi-Lane Left Turn Scenario
        7.5.2. Analysis of Motion Planning Application Results
Chapter 8 Conclusion & Future Works
    8.1. Conclusion
    8.2. Future Works
Bibliography
Abstract in Korean
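
The abstract above combines surrounding-vehicle motion prediction with prediction-based longitudinal planning at uncontrolled intersections. The sketch below is only a rough illustration of that combination under simplifying assumptions (a constant-speed, constant-yaw-rate prediction model and a crude gap test at a single conflict point); it is not the dissertation's learned predictor or MPC planner, and all class names, gains, and thresholds are hypothetical.

```python
import math
from dataclasses import dataclass

# Hypothetical stand-in: constant-speed, constant-yaw-rate prediction of a
# surrounding vehicle, followed by a crude gap-based desired acceleration
# for approaching an uncontrolled intersection.

@dataclass
class VehicleState:
    x: float        # position [m]
    y: float        # position [m]
    yaw: float      # heading [rad]
    v: float        # speed [m/s]
    yaw_rate: float # [rad/s]

def predict_trajectory(state, horizon=3.0, dt=0.1):
    """Propagate a constant-speed, constant-yaw-rate motion model."""
    traj, s = [], state
    for _ in range(int(horizon / dt)):
        s = VehicleState(
            x=s.x + s.v * math.cos(s.yaw) * dt,
            y=s.y + s.v * math.sin(s.yaw) * dt,
            yaw=s.yaw + s.yaw_rate * dt,
            v=s.v,
            yaw_rate=s.yaw_rate,
        )
        traj.append(s)
    return traj

def desired_acceleration(ego_dist_to_conflict, ego_speed, target_traj,
                         conflict_point, safety_margin=5.0, dt=0.1):
    """Yield if the predicted target reaches the conflict point before the
    ego vehicle can clear it; otherwise proceed gently."""
    t_ego = ego_dist_to_conflict / max(ego_speed, 0.1)
    t_target = None
    for k, s in enumerate(target_traj):
        if math.hypot(s.x - conflict_point[0], s.y - conflict_point[1]) < safety_margin:
            t_target = k * dt
            break
    if t_target is not None and t_target < t_ego + 2.0:
        return -2.5   # yield: decelerate toward the stop line
    return 0.5        # proceed: gentle acceleration

# Example with made-up numbers
target = VehicleState(x=20.0, y=-30.0, yaw=math.pi / 2, v=8.0, yaw_rate=0.0)
traj = predict_trajectory(target)
a_des = desired_acceleration(ego_dist_to_conflict=25.0, ego_speed=7.0,
                             target_traj=traj, conflict_point=(20.0, 0.0))
print(f"desired acceleration: {a_des:.1f} m/s^2")
```

In the dissertation the prediction step is learned and interaction-aware and the planner is a model predictive controller, so this stand-in only conveys the data flow from predicted trajectories to a longitudinal decision.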

    Performance Evaluation of Vision-Based Algorithms for MAVs

    Get PDF
    An important focus of current research in the field of Micro Aerial Vehicles (MAVs) is to increase the safety of their operation in general unstructured environments. Especially indoors, where GPS cannot be used for localization, reliable algorithms for localization and mapping of the environment are necessary in order to keep an MAV airborne safely. In this paper, we compare vision-based real-time capable methods for localization and mapping and point out their strengths and weaknesses. Additionally, we describe algorithms for state estimation, control and navigation, which use the localization and mapping results of our vision-based algorithms as input. Comment: Presented at OAGM Workshop, 2015 (arXiv:1505.01065).
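
    The abstract above mentions state-estimation and control algorithms that consume the output of the vision-based localization. As a loose illustration of that coupling (not the authors' actual pipeline), the following sketch blends a low-rate visual position fix with high-rate accelerometer integration using a complementary filter; the class name, gains, and rates are assumptions.

```python
# Hypothetical sketch: blend a low-rate visual position estimate with
# high-rate accelerometer integration using a complementary filter.

class ComplementaryPositionFilter:
    def __init__(self, alpha=0.9):
        self.alpha = alpha      # trust in the IMU prediction between vision fixes
        self.position = 0.0     # 1-D position estimate [m]
        self.velocity = 0.0     # 1-D velocity estimate [m/s]

    def predict(self, accel, dt):
        """High-rate IMU update: integrate acceleration."""
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def correct(self, vision_position):
        """Low-rate vision update: pull the estimate toward the visual fix."""
        self.position = self.alpha * self.position + (1.0 - self.alpha) * vision_position

# Example: 100 Hz IMU, 10 Hz vision
f = ComplementaryPositionFilter()
for step in range(100):
    f.predict(accel=0.05, dt=0.01)
    if step % 10 == 0:
        f.correct(vision_position=0.0)  # pretend the camera says we have not moved
print(f"fused position estimate: {f.position:.3f} m")
```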

    Perception-aware Path Planning

    Full text link
    In this paper, we give a double twist to the problem of planning under uncertainty. State-of-the-art planners seek to minimize the localization uncertainty by only considering the geometric structure of the scene. In this paper, we argue that motion planning for vision-controlled robots should be perception aware, in that the robot should also favor texture-rich areas to minimize the localization uncertainty during a goal-reaching task. Thus, we describe how to optimally incorporate the photometric information (i.e., texture) of the scene, in addition to the geometric one, to compute the uncertainty of vision-based localization during path planning. To avoid the caveats of feature-based localization systems (i.e., dependence on feature type and user-defined thresholds), we use dense, direct methods. This allows us to compute the localization uncertainty directly from the intensity values of every pixel in the image. We also describe how to compute trajectories online, considering also scenarios with no prior knowledge about the map. The proposed framework is general and can easily be adapted to different robotic platforms and scenarios. The effectiveness of our approach is demonstrated with extensive experiments in both simulated and real-world environments using a vision-controlled micro aerial vehicle. Comment: 16 pages, 20 figures, revised version. Conditionally accepted for IEEE Transactions on Robotics.
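
    The key idea above is to let path planning trade geometric cost against localization uncertainty inferred from image texture. A toy version of that trade-off is sketched below: waypoints are scored by path length plus an inverse texture-richness term computed from image gradients. The gradient-based proxy and the weights are assumptions and only stand in for the paper's dense photometric uncertainty propagation.

```python
import numpy as np

# Toy perception-aware cost: prefer waypoints that see texture-rich image
# regions, since more photometric gradient generally means lower
# localization uncertainty.

def texture_richness(patch):
    """Mean gradient magnitude of a grayscale image patch (NumPy array)."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def waypoint_cost(path_length, patch, w_geometric=1.0, w_perception=5.0, eps=1e-3):
    """Lower cost = shorter path AND richer texture around the waypoint."""
    return w_geometric * path_length + w_perception / (texture_richness(patch) + eps)

# Example: a flat (texture-less) patch vs. a noisy (texture-rich) patch
flat = np.full((32, 32), 128.0)
rich = np.random.default_rng(0).integers(0, 255, size=(32, 32)).astype(float)
print("flat-texture cost:", round(waypoint_cost(10.0, flat), 2))
print("rich-texture cost:", round(waypoint_cost(10.0, rich), 2))
```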

    From Monocular SLAM to Autonomous Drone Exploration

    Full text link
    Micro aerial vehicles (MAVs) are strongly limited in their payload and power capacity. In order to implement autonomous navigation, algorithms are therefore desirable that use sensory equipment that is as small, lightweight, and low-power as possible. In this paper, we propose a method for autonomous MAV navigation and exploration using a low-cost consumer-grade quadrocopter equipped with a monocular camera. Our vision-based navigation system builds on LSD-SLAM, which estimates the MAV trajectory and a semi-dense reconstruction of the environment in real time. Since LSD-SLAM only determines depth at high-gradient pixels, texture-less areas are not directly observed, so previous exploration methods that assume dense map information cannot directly be applied. We propose an obstacle mapping and exploration approach that takes the properties of our semi-dense monocular SLAM system into account. In experiments, we demonstrate our vision-based autonomous navigation and exploration system with a Parrot Bebop MAV.
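
    The exploration approach above has to cope with semi-dense depth, where texture-less regions are never observed. The sketch below illustrates one simplified way to respect that property: a three-state occupancy grid in which only cells hit by semi-dense depth points are marked occupied, texture-less areas stay unknown, and unknown cells bordering free space are treated as exploration frontiers. Grid size, resolution, and the update rule are assumptions, not the paper's method.

```python
import numpy as np

# Simplified semi-dense mapping: cells are unknown until a measurement
# touches them, so texture-less regions never become free or occupied.

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

class SemiDenseGrid:
    def __init__(self, size=100, resolution=0.1):
        self.resolution = resolution
        self.grid = np.full((size, size), UNKNOWN, dtype=np.int8)

    def _to_cell(self, x, y):
        return int(round(x / self.resolution)), int(round(y / self.resolution))

    def insert_point(self, px, py):
        """Mark the cell containing a semi-dense depth point as occupied."""
        i, j = self._to_cell(px, py)
        if 0 <= i < self.grid.shape[0] and 0 <= j < self.grid.shape[1]:
            self.grid[i, j] = OCCUPIED

    def frontier_cells(self):
        """Unknown cells adjacent to free cells: candidate exploration targets."""
        frontiers = set()
        for i, j in np.argwhere(self.grid == FREE):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < self.grid.shape[0] and 0 <= nj < self.grid.shape[1]
                        and self.grid[ni, nj] == UNKNOWN):
                    frontiers.add((int(ni), int(nj)))
        return frontiers

grid = SemiDenseGrid()
grid.grid[40:60, 40:60] = FREE               # pretend this block was traced as free
for x, y in [(6.1, 4.0), (6.1, 4.5)]:        # two semi-dense depth points
    grid.insert_point(x, y)
print("occupied cells:", int(np.sum(grid.grid == OCCUPIED)))
print("frontier candidates:", len(grid.frontier_cells()))
```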

    Performance evaluation of a distributed integrative architecture for robotics

    Get PDF
    The field of robotics employs a vast number of coupled sub-systems that need to interact cooperatively and concurrently in order to yield the desired results. Some hybrid algorithms also require intensive cooperative interactions internally. The proposed architecture lends itself to problem domains that require rigorous calculations, which are usually impeded by the capacity of a single machine and by incompatibility issues between software computing elements. Implementations are abstracted away from the physical hardware for ease of development and for competition in simulation leagues. Monolithic developments are complex, and the desire for decoupled architectures arises; decoupling also lowers the threshold for using distributed and parallel resources. The ability to re-use and re-combine components on demand is therefore essential, while maintaining the necessary degree of interaction. For this reason we propose to build software components on top of a Service Oriented Architecture (SOA) using Web Services. An additional benefit is platform independence regarding both the operating system and the implementation language. The robot soccer platform and the associated simulation leagues are the target domain for the development; machine vision and remote process control related portions of the architecture are furthermore in development and testing for industrial environments. We provide numerical data based on the Python frameworks ZSI and SOAPpy supporting the suitability of this approach for the field of robotics. Response times of significantly less than 50 ms, even for fully interpreted, dynamic languages, provide hard evidence of the feasibility of Web Services based SOAs even in time-critical robotic applications.
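
    The numbers above come from timing SOAP calls made with the Python frameworks ZSI and SOAPpy. Since those stacks are dated, the sketch below reproduces the same kind of round-trip measurement with the standard-library XML-RPC modules standing in for a SOAP-exposed robot component; the service name and payload are made up, and XML-RPC is only an approximation of the Web Services setup evaluated in the paper.

```python
import threading
import time
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Stand-in for a service-exposed robot component: measure the round-trip
# response time of a remote call, mirroring the <50 ms figures quoted above.

def ball_position():
    """Hypothetical vision service: return the latest ball estimate."""
    return {"x": 1.25, "y": -0.40, "confidence": 0.9}

server = SimpleXMLRPCServer(("127.0.0.1", 8008), logRequests=False, allow_none=True)
server.register_function(ball_position)
threading.Thread(target=server.serve_forever, daemon=True).start()

proxy = ServerProxy("http://127.0.0.1:8008", allow_none=True)
latencies = []
for _ in range(100):
    t0 = time.perf_counter()
    proxy.ball_position()
    latencies.append((time.perf_counter() - t0) * 1000.0)
print(f"mean round-trip: {sum(latencies) / len(latencies):.2f} ms")

server.shutdown()
```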

    Fast, Autonomous Flight in GPS-Denied and Cluttered Environments

    Full text link
    One of the most challenging tasks for a flying robot is to autonomously navigate between target locations quickly and reliably while avoiding obstacles in its path, and with little to no a priori knowledge of the operating environment. This challenge is addressed in the present paper. We describe the system design and software architecture of our proposed solution and showcase how all the distinct components can be integrated to enable smooth robot operation. We provide critical insight on hardware and software component selection and development, and present results from extensive experimental testing in real-world warehouse environments. Experimental testing reveals that our proposed solution can deliver fast and robust aerial robot autonomous navigation in cluttered, GPS-denied environments. Comment: Pre-peer-reviewed version of the article accepted in the Journal of Field Robotics.

    Autonomy Infused Teleoperation with Application to BCI Manipulation

    Full text link
    Robot teleoperation systems face a common set of challenges including latency, low-dimensional user commands, and asymmetric control inputs. User control with Brain-Computer Interfaces (BCIs) exacerbates these problems through especially noisy and erratic low-dimensional motion commands due to the difficulty in decoding neural activity. We introduce a general framework to address these challenges through a combination of computer vision, user intent inference, and arbitration between the human input and autonomous control schemes. Adjustable levels of assistance allow the system to balance the operator's capabilities and feelings of comfort and control while compensating for a task's difficulty. We present experimental results demonstrating significant performance improvement using the shared-control assistance framework on adapted rehabilitation benchmarks with two subjects implanted with intracortical brain-computer interfaces controlling a seven degree-of-freedom robotic manipulator as a prosthetic. Our results further indicate that shared assistance mitigates perceived user difficulty and even enables successful performance on previously infeasible tasks. We showcase the extensibility of our architecture with applications to quality-of-life tasks such as opening a door, pouring liquids from containers, and manipulation with novel objects in densely cluttered environments.
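
    The framework above arbitrates between noisy, low-dimensional user commands and an autonomous policy, with an adjustable level of assistance. The sketch below shows the bare-bones linear blending that such arbitration can reduce to; the confidence model and assistance cap are invented for the example and omit the paper's intent-inference machinery.

```python
import numpy as np

# Minimal shared-control arbitration: blend the decoded user velocity command
# with the autonomous policy's command. The blending weight grows with the
# system's confidence in its inference of the user's intended goal.

def arbitrate(user_cmd, auto_cmd, intent_confidence, max_assistance=0.8):
    """Return the executed command as a confidence-weighted blend."""
    alpha = min(max_assistance, intent_confidence)   # assistance level in [0, max]
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(auto_cmd)

# Example: a noisy BCI command pointing vaguely left, autonomy steering toward
# the inferred goal. Low confidence defers to the user; high confidence assists.
user = [0.05, -0.30, 0.00]    # decoded end-effector velocity [m/s]
auto = [0.20, -0.10, 0.05]    # autonomous command toward the inferred object
print("low confidence :", arbitrate(user, auto, intent_confidence=0.2))
print("high confidence:", arbitrate(user, auto, intent_confidence=0.9))
```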