
    ์†Œํ˜• ๋ฌด์ธ๋น„ํ–‰์ฒด์˜ ์˜์ƒ๊ธฐ๋ฐ˜ ์ž๋™ํ•ญ๋ฒ•

    Doctoral dissertation -- Graduate School of Seoul National University, Department of Mechanical and Aerospace Engineering, February 2015. Advisor: ๊น€ํ˜„์ง„.

    In this thesis, an extensible framework for visual navigation of micro aerial vehicles (MAVs) is presented. Throughout the thesis, the design of a scalable visual navigation system and its application to MAVs are discussed. The components presented in this thesis form a MAV visual navigation system that allows fully autonomous control without external positioning sensors such as GPS or a motion capture system. The contributions of this thesis can be summarized in three parts.

    First, the problem of localizing a monocular camera from input video, called image-based localization, is addressed. Prior knowledge of a 3D scene model, constructed offline by structure from motion (SfM), is exploited. From live input video, the proposed method continuously computes the 6-DoF camera pose by efficiently tracking natural features and matching them to the 3D points reconstructed by SfM.

    Second, visual simultaneous localization and mapping (SLAM), in which a camera is localized within an unknown scene while a map is built, is considered. The proposed method is applied to three types of visual sensors: monocular, stereo, and RGB-D cameras. It continuously computes the current 6-DoF camera pose and 3D landmark positions from the input video, and successfully builds a consistent map from indoor and outdoor sequences using a camera as the only sensor. Large-scale loop closing is demonstrated based on a relative metric-topological representation of the pose graph, which effectively corrects pose drift. The proposed method runs in real time on an onboard computer and is used to control the position of a quadrotor MAV without an external positioning sensor.

    Third, an optical flow-based velocity estimation method is proposed. It uses optical flow to estimate the translational velocity of a MAV in order to control its translational motion. Autonomous hovering flight control of a MAV using an optical flow sensor is implemented on a low-cost microprocessor without external positioning sensors. Experimental results from flight tests are validated against ground-truth data provided by a high-accuracy motion capture system.

    I believe this work brings MAVs a step closer to autonomous operation in many useful areas, such as indoor environments, where global positioning is not available.
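    A rough, hedged sketch of the image-based localization step (not the thesis implementation): once tracked keypoints have been matched to 3D points of the offline SfM model, the 6-DoF camera pose can be recovered with a RANSAC-based PnP solver. The feature matching, the SfM model, and the intrinsic matrix K are assumed to be available here.

        # Minimal sketch: 6-DoF camera pose from 2D-3D matches via PnP + RANSAC.
        # pts_3d (Nx3, from the offline SfM model), pts_2d (Nx2, tracked keypoints
        # in the current frame) and the intrinsic matrix K are assumed given.
        import numpy as np
        import cv2

        def localize_frame(pts_3d, pts_2d, K, dist_coeffs=None):
            dist = np.zeros(5) if dist_coeffs is None else dist_coeffs
            ok, rvec, tvec, inliers = cv2.solvePnPRansac(
                pts_3d.astype(np.float64), pts_2d.astype(np.float64), K, dist,
                reprojectionError=3.0, iterationsCount=100)
            if not ok:
                return None                     # too few consistent matches
            R, _ = cv2.Rodrigues(rvec)          # world-to-camera rotation
            t = tvec.reshape(3)
            return R, t, -R.T @ t, inliers      # pose and camera centre in the world frame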
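    The drift correction performed by loop closing can be illustrated with a toy 2-D pose graph. This is only a sketch of the general idea, not the relative metric-topological formulation used in the thesis: keyframe poses are the variables, odometry and loop-closure edges contribute relative-pose residuals, and a nonlinear least-squares solver redistributes the accumulated drift around the loop.

        # Toy 2-D pose-graph sketch: poses (x, y, yaw); edges carry relative-pose
        # measurements; a loop-closure edge pulls the drifted trajectory back.
        import numpy as np
        from scipy.optimize import least_squares

        def wrap(a):                      # wrap angle to (-pi, pi]
            return (a + np.pi) % (2 * np.pi) - np.pi

        def relative_pose(xi, xj):        # pose of j expressed in frame i
            dx, dy = xj[:2] - xi[:2]
            c, s = np.cos(xi[2]), np.sin(xi[2])
            return np.array([c * dx + s * dy, -s * dx + c * dy, wrap(xj[2] - xi[2])])

        def residuals(flat, edges, n):
            poses = flat.reshape(n, 3)
            res = [poses[0]]              # anchor the first pose at the origin
            for i, j, z in edges:
                e = relative_pose(poses[i], poses[j]) - z
                e[2] = wrap(e[2])
                res.append(e)
            return np.concatenate(res)

        # Edges: (i, j, measured relative pose [dx, dy, dyaw] in frame i).
        # Four unit steps, each turning 90 degrees, should close a square.
        edges = [(0, 1, np.array([1.0, 0.0, np.pi / 2])),
                 (1, 2, np.array([1.0, 0.0, np.pi / 2])),
                 (2, 3, np.array([1.0, 0.0, np.pi / 2])),
                 (3, 0, np.array([1.0, 0.0, np.pi / 2]))]   # loop closure to the start

        init = np.array([[0, 0, 0], [1.1, 0.1, 1.5], [1.2, 1.2, 3.0], [0.2, 1.3, 4.5]],
                        dtype=float)                        # drifted initial guess
        sol = least_squares(residuals, init.ravel(), args=(edges, len(init)))
        print(sol.x.reshape(-1, 3))       # drift redistributed over the loop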
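    For the optical flow-based navigation, the idea of subtracting the rotational component of the flow and converting the remainder into a metric velocity can be sketched as follows. The downward-facing camera over a flat floor, the axis conventions (x right, y down, z toward the ground), and the sign choices are simplifications made for this sketch, not details taken from the thesis.

        # Minimal sketch: metric velocity from optical flow at the image centre.
        # Rotational flow induced by body rates (from a gyro) is subtracted first,
        # then the remaining flow is scaled by altitude / focal length.
        import numpy as np

        def flow_to_velocity(flow_px_s, gyro_rad_s, altitude_m, focal_px):
            """flow_px_s:  (du, dv) mean optical flow near the image centre [px/s]
            gyro_rad_s: (wx, wy) camera-frame angular rates [rad/s]
            altitude_m: height above the floor, e.g. from a sonar [m]
            focal_px:   focal length in pixels."""
            du, dv = flow_px_s
            wx, wy = gyro_rad_s
            # Rotational component of the flow at the principal point.
            du_rot = -focal_px * wy
            dv_rot = focal_px * wx
            # Remaining (translational) flow, converted to metres per second.
            vx = -(du - du_rot) * altitude_m / focal_px
            vy = -(dv - dv_rot) * altitude_m / focal_px
            return np.array([vx, vy])

        # Example: 40 px/s of flow with no rotation, at 1.5 m altitude, f = 300 px.
        print(flow_to_velocity((40.0, 0.0), (0.0, 0.0), 1.5, 300.0))  # ~[-0.2, 0.] m/s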
    Table of Contents

    Abstract
    Acknowledgements
    Chapter 1  Introduction
        1.1 Towards Visual Navigation of MAVs
            1.1.1 Image-based Localization
            1.1.2 Visual SLAM
            1.1.3 Optical Flow-based Navigation
        1.2 Overview
        1.3 Contributions
    Chapter 2  Related Work
        2.1 Image-based Localization
        2.2 Simultaneous Localization and Mapping (SLAM)
            2.2.1 Loop Detection and Closing
            2.2.2 Keyframe Selection
            2.2.3 Scene Representation
        2.3 Optical Flow
    Chapter 3  Image-based Localization
        3.1 Key Elements of the Proposed Approach
            3.1.1 Scene Representation
            3.1.2 Multi-scale Features
            3.1.3 Place Recognition
            3.1.4 Global Matching
            3.1.5 Guided Matching
            3.1.6 Offline Preprocessing
        3.2 Real-time Localization
            3.2.1 Keypoint Tracking
            3.2.2 Distributing Matching Computation
            3.2.3 Pose Estimation and Filtering
        3.3 Application: Semantic Localization
        3.4 Experimental Results
            3.4.1 Experimental Setup
            3.4.2 Experiments on Vicon-Lab Sequence
            3.4.3 Experiments on Indoor Datasets
            3.4.4 Experiments on Outdoor Datasets
            3.4.5 Timings
            3.4.6 Failure Cases
        3.5 Discussion
    Chapter 4  Visual SLAM
        4.1 Introduction
        4.2 Problem Description
        4.3 Contributions Overview
        4.4 Problem Formulation
            4.4.1 Metric-topological Map Representation
            4.4.2 Graph Representation of Topological Map
            4.4.3 Metric Embedding of Keyframes and Landmarks
            4.4.4 Optimization of Pose Graph
        4.5 Proposed Method
            4.5.1 Fisher Information Matrix for Uncertainty Measure
            4.5.2 Keyframe Selection Scheme
            4.5.3 Multi-level Loop Closing
        4.6 Visual SLAM System
            4.6.1 Monocular Visual SLAM Pipeline
            4.6.2 RGB-D and Stereo Visual SLAM Pipeline
            4.6.3 Keypoint Extraction and Tracking
            4.6.4 Pose Estimation
            4.6.5 Loop Closing
        4.7 Experimental Result: Monocular Visual SLAM
            4.7.1 Datasets Description
            4.7.2 Comparison with Ground Truth Data
        4.8 Experimental Result: RGB-D SLAM
            4.8.1 Benchmarking Result
            4.8.2 Keyframe Selection Scheme Comparison
        4.9 Discussion
    Chapter 5  Vision-based Control of MAVs
        5.1 Experimental System
        5.2 Experimental Results
    Chapter 6  Optical Flow-based Visual Navigation
        6.1 Optical Flow Formulation
            6.1.1 General Optical Flow Model and Feasibility Analysis
            6.1.2 Optical Flow on Planar Surface
            6.1.3 Subtraction of Rotational Component of Optical Flow
            6.1.4 Altitude Estimation
            6.1.5 Conversion to Metric Unit
        6.2 Controller Design
            6.2.1 Dynamic Model of a Quadrotor
            6.2.2 Controller
        6.3 Experimental Setup
            6.3.1 A Quadrotor Platform
            6.3.2 Optical Flow Sensor
            6.3.3 Flight Control Hardware
        6.4 Experimental Results
            6.4.1 Experiment Setup
            6.4.2 Experiment Environment
            6.4.3 Ground-truth Evaluation
        6.5 Discussion
    Chapter 7  Conclusion
    Chapter A  Publications
        A.1 Journals
        A.2 Conferences
        A.3 Workshops
    Chapter B  Multimedia Extensions
    Abstract in Korean (์ดˆ๋ก)