7 research outputs found

    A calibration method for MEMS inertial sensors based on optical techniques.

    Dong, Zhuxin. Thesis (M.Phil.), Chinese University of Hong Kong, 2008. Includes bibliographical references (leaves 77-80). Abstracts in English and Chinese.
    Table of contents:
        Abstract
        Abstract (Chinese)
        Acknowledgements
        Table of Contents
        List of Figures
        List of Tables
        Chapter 1  Introduction
            1.1  Architecture of UDWI
            1.2  Background of IMU Sensor Calibration
            1.3  Organization
        Chapter 2  2D Motion Calibration
            2.1  Experimental Platform
                2.1.1  Transparent Table
            2.2  Matching Algorithm
                2.2.1  Motion Analysis
                2.2.2  Core Algorithm and Matching Criterion
            2.3  Usage of High Speed Camera
            2.4  Functions Realized
        Chapter 3  Usage of Camera Calibration
            3.1  Introduction to Camera Calibration
                3.1.1  Related Coordinate Frames
                3.1.2  Pin-Hole Model
            3.2  Calibration for Nonlinear Model
            3.3  Implementation of Process to Calibrate Camera
                3.3.1  Image Capture
                3.3.2  Define World Frame and Extract Corners
                3.3.3  Main Calibration
            3.4  Calibration Results of High Speed Camera
                3.4.1  Lens Selection
                3.4.2  Property of High Speed Camera
        Chapter 4  3D Attitude Calibration
            4.1  The Necessity of Attitude Calibration
            4.2  Stereo Vision and 3D Reconstruction
                4.2.1  Physical Meaning and Mathematical Model Proof
                4.2.2  3D Point Reconstruction
            4.3  Example of 3D Point Reconstruction
            4.4  Idea of Attitude Calibration
        Chapter 5  Experimental Results
            5.1  Calculation of Proportional Parameter
            5.2  Accuracy Test of Stroke Reconstruction
            5.3  Writing Experiments of 26 Letters
                5.3.1  Experimental Results of Letter b
                5.3.2  Experimental Results of Letter n with ZVC
                5.3.3  Experimental Results of Letter u
            5.4  Writing of Single Letter s: Multiple Tests
            5.5  Analysis on Resolution Property of Current Vision Algorithm
                5.5.1  Resolution of Current Algorithm
                5.5.2  Tests with Various Filters
            5.6  Calculation of Static Attitude
        Chapter 6  Future Work
            6.1  Another Multiple Tests of Letter k
            6.2  Letter Recognition Based on Neural Networks Classification
        Chapter 7  Conclusion
            7.1  Calibration of MAG-μIMU Sensors
            7.2  Calibration of Accelerometers
            7.3  Calibration of Attitude
            7.4  Future Work
        Appendix A  The Experimental Results of Writing English Letters
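
    The table of contents above centres on camera calibration with the pin-hole model (Chapter 3) and stereo 3D point reconstruction for attitude calibration (Chapter 4). As a purely illustrative aside, not taken from the thesis, a minimal NumPy sketch of those two building blocks (projection through a 3x4 camera matrix and linear DLT triangulation of a point seen by two calibrated cameras) could look like the following; all camera matrices and points are assumed inputs produced by the calibration itself:

```python
import numpy as np

def project(P, X):
    """Pin-hole projection: map a 3D point X (world frame) to pixel
    coordinates through a 3x4 camera matrix P = K [R | t]."""
    Xh = np.append(X, 1.0)          # homogeneous 3D point
    x = P @ Xh                      # homogeneous image point
    return x[:2] / x[2]             # perspective division -> pixels

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover one 3D point from its pixel
    projections x1, x2 in two calibrated cameras P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # solution is the null vector of A
    Xh = Vt[-1]
    return Xh[:3] / Xh[3]           # back to inhomogeneous coordinates
```

    With two synthetic camera matrices and a known 3D point X, triangulate(P1, P2, project(P1, X), project(P2, X)) recovers X up to numerical precision, which is the basic consistency check a stereo reconstruction setup relies on.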

    Parallel Implementation of the Full Search Block Matching Algorithm for Motion Estimation

    No full text
    Motion estimation is a key technique in most algorithms for video compression, and particularly in the MPEG and H.261 standards. The most frequently used technique is based on a Full Search Block Matching Algorithm, which is highly computationally intensive and requires special-purpose architectures to achieve real-time performance. In this paper we propose an approach to the parallel implementation of the Full Search Block Matching Algorithm that is suitable for massively parallel architectures ranging from large-scale SIMD computers to dedicated processor arrays realized in ASICs. While the first alternative can be used to implement high-performance coders, the second is particularly attractive for low-cost video compression devices. The paper describes the proposed parallelization of the Full Search Block Matching Algorithm and its implementation in an ASIC. Keywords: Video Coding, Motion Estimation.
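
    The abstract above centres on exhaustive (full search) block matching: for each block of the current frame, every candidate displacement within a search window of the reference frame is scored, and the best-scoring displacement becomes the motion vector. A purely illustrative, sequential sketch follows; the paper's SIMD and ASIC mappings are not reproduced, and the block size, search range, and SAD criterion are common defaults rather than values taken from the paper:

```python
import numpy as np

def full_search(ref, cur, top, left, block=16, search=7):
    """Full-search block matching for the block of `cur` whose top-left
    corner is (top, left): test every displacement in a +/-`search`
    window of the reference frame `ref` and return the motion vector
    minimising the sum of absolute differences (SAD)."""
    target = cur[top:top + block, left:left + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > ref.shape[0] or x + block > ref.shape[1]:
                continue            # candidate block falls outside the frame
            cand = ref[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(target - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad
```

    Because the (2·search + 1)² candidate SADs, and the per-pixel absolute differences inside each, are mutually independent, they are natural units to distribute over processing elements, which is what makes the exhaustive search amenable to the massively parallel architectures the abstract mentions.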

    Perception and Motion: use of Computer Vision to solve Geometry Processing problems

    Computer vision and geometry processing are often seen as two different and, in a certain sense, distant fields: the first works on two-dimensional data, while the second needs three-dimensional information. But are 2D and 3D data really disconnected? Consider human vision: each eye captures patterns of light, which the brain then uses to reconstruct its perception of the observed scene. In a similar way, if the eye detects a variation in those patterns of light, we understand that the scene is not static, and we perceive the motion of one or more objects in it. In this work, we show how the perception of 2D motion can be used to solve two significant problems, both dealing with three-dimensional data. In the first part, we show how the so-called optical flow, which represents the observed motion, can be used to estimate the alignment error of a set of digital cameras looking at the same object. In the second part, we show how the detected 2D motion of an object can be used to better understand its underlying geometric structure by detecting its rigid parts and the way they are connected.
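
    The first application described in this abstract, estimating the alignment error of a set of cameras from the observed optical flow, can be illustrated with a deliberately simplified sketch: given a dense flow field computed elsewhere, fit a single global affine motion model by least squares and read the fitted parameters, and the unexplained residual, as a measure of (mis)alignment. The flow array, the affine parameterisation, and the function name fit_affine_motion are illustrative assumptions, not the thesis's actual formulation:

```python
import numpy as np

def fit_affine_motion(flow):
    """Fit one global affine motion model to a dense optical-flow field.

    flow: H x W x 2 array with flow[y, x] = (dx, dy) observed at pixel (x, y).
    Returns the 2x3 matrix A with [dx, dy] ~= A @ [x, y, 1] (the motion a
    single global misalignment would predict) and the per-pixel residual."""
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=1)  # N x 3
    # Two independent least-squares fits, one per flow component.
    ax, *_ = np.linalg.lstsq(coords, flow[..., 0].ravel(), rcond=None)
    ay, *_ = np.linalg.lstsq(coords, flow[..., 1].ravel(), rcond=None)
    A = np.vstack([ax, ay])                                              # 2 x 3
    predicted = (coords @ A.T).reshape(h, w, 2)  # flow explained by the global model
    residual = flow - predicted                  # motion left unexplained
    return A, residual
```

    Loosely, pixels where the residual remains large are evidence of motion that no single global alignment explains, which is the kind of cue the second part of the abstract uses to separate an object's rigid parts; the actual segmentation is not shown here.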