439 research outputs found

    Applying image processing techniques to pose estimation and view synthesis.

    Get PDF
    Fung Yiu-fai Phineas. Thesis (M.Phil.)--Chinese University of Hong Kong, 1999. Includes bibliographical references (leaves 142-148). Abstracts in English and Chinese. Contents:
    Chapter 1 Introduction: 1.1 Model-based Pose Estimation; 1.1.1 Application - 3D Motion Tracking; 1.2 Image-based View Synthesis; 1.3 Thesis Contribution; 1.4 Thesis Outline
    Chapter 2 General Background: 2.1 Notations; 2.2 Camera Models; 2.2.1 Generic Camera Model; 2.2.2 Full-perspective Camera Model; 2.2.3 Affine Camera Model; 2.2.4 Weak-perspective Camera Model; 2.2.5 Paraperspective Camera Model; 2.3 Model-based Motion Analysis; 2.3.1 Point Correspondences; 2.3.2 Line Correspondences; 2.3.3 Angle Correspondences; 2.4 Panoramic Representation; 2.4.1 Static Mosaic; 2.4.2 Dynamic Mosaic; 2.4.3 Temporal Pyramid; 2.4.4 Spatial Pyramid; 2.5 Image Pre-processing; 2.5.1 Feature Extraction; 2.5.2 Spatial Filtering; 2.5.3 Local Enhancement; 2.5.4 Dynamic Range Stretching or Compression; 2.5.5 YIQ Color Model
    Chapter 3 Model-based Pose Estimation: 3.1 Previous Work; 3.1.1 Estimation from Established Correspondences; 3.1.2 Direct Estimation from Image Intensities; 3.1.3 Perspective-3-Point Problem; 3.2 Our Iterative P3P Algorithm; 3.2.1 Gauss-Newton Method; 3.2.2 Dealing with Ambiguity; 3.2.3 3D-to-3D Motion Estimation; 3.3 Experimental Results; 3.3.1 Synthetic Data; 3.3.2 Real Images; 3.4 Discussions
    Chapter 4 Panoramic View Analysis: 4.1 Advanced Mosaic Representation; 4.1.1 Frame Alignment Policy; 4.1.2 Multi-resolution Representation; 4.1.3 Parallax-based Representation; 4.1.4 Multiple Moving Objects; 4.1.5 Layers and Tiles; 4.2 Panorama Construction; 4.2.1 Image Acquisition; 4.2.2 Image Alignment; 4.2.3 Image Integration; 4.2.4 Significant Residual Estimation; 4.3 Advanced Alignment Algorithms; 4.3.1 Patch-based Alignment; 4.3.2 Global Alignment (Block Adjustment); 4.3.3 Local Alignment (Deghosting); 4.4 Mosaic Application; 4.4.1 Visualization Tool; 4.4.2 Video Manipulation; 4.5 Experimental Results
    Chapter 5 Panoramic Walkthrough: 5.1 Problem Statement and Notations; 5.2 Previous Work; 5.2.1 3D Modeling and Rendering; 5.2.2 Branching Movies; 5.2.3 Texture Window Scaling; 5.2.4 Problems with Simple Texture Window Scaling; 5.3 Our Walkthrough Approach; 5.3.1 Cylindrical Projection onto Image Plane; 5.3.2 Generating Intermediate Frames; 5.3.3 Occlusion Handling; 5.4 Experimental Results; 5.5 Discussions
    Chapter 6 Conclusion
    Appendix A Formulation of Fischler and Bolles' Method for P3P Problems; Appendix B Derivation of z1 and z3 in terms of z2; Appendix C Derivation of e1 and e2; Appendix D Derivation of the Update Rule for Gauss-Newton Method; Appendix E Proof of (λ1λ2 - λ4) > 0; Appendix F Derivation of φ and hi; Appendix G Derivation of w1j to w4j; Appendix H More Experimental Results on Panoramic Stitching Algorithms
    Bibliography
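
    The thesis's central pose-estimation contribution is an iterative P3P algorithm refined with a Gauss-Newton step (Section 3.2, Appendix D). As a hedged illustration only, the Python sketch below applies a generic Gauss-Newton update to the three inter-point distance constraints of P3P, solving directly for the three point depths; the function name, the inputs, and this three-unknown parameterization are assumptions and differ from the thesis's own formulation, which expresses z1 and z3 in terms of z2 (Appendix B).

    # Hypothetical sketch: Gauss-Newton refinement of the three point depths in a
    # P3P problem (not the thesis's z2-parameterized update rule).
    import numpy as np

    def gauss_newton_p3p_depths(u, d, z0, iters=20):
        """u: (3,3) unit bearing vectors (one per row); d: distances between the
        model-point pairs (0,1), (1,2), (0,2); z0: initial depth guess (3,)."""
        pairs = [(0, 1), (1, 2), (0, 2)]
        z = np.asarray(z0, dtype=float).copy()
        for _ in range(iters):
            r = np.empty(3)
            J = np.zeros((3, 3))
            for k, (i, j) in enumerate(pairs):
                v = z[i] * u[i] - z[j] * u[j]        # vector between the two reconstructed points
                r[k] = v @ v - d[k] ** 2             # squared-distance residual
                J[k, i] = 2.0 * (u[i] @ v)           # d r_k / d z_i
                J[k, j] = -2.0 * (u[j] @ v)          # d r_k / d z_j
            step, *_ = np.linalg.lstsq(J, -r, rcond=None)  # Gauss-Newton update
            z += step
            if np.linalg.norm(step) < 1e-10:
                break
        return z

    Since P3P generally admits up to four solutions, a refinement of this kind converges to the solution nearest the initial guess; the thesis addresses this ambiguity separately (Section 3.2.2).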

    Aerial Vehicles

    Get PDF
    This book contains 35 chapters written by experts in developing techniques for making aerial vehicles more intelligent, more reliable, more flexible in use, and safer in operation. It will also serve as an inspiration for further improvement of the design and application of aerial vehicles. The advanced techniques and research described here may also be applicable to other high-tech areas such as robotics, avionics, vetronics, and space.

    Computational fluid dynamics modeling and in situ physics-based monitoring of aerosol jet printing toward functional assurance of additively-manufactured, flexible and hybrid electronics

    Get PDF
    Aerosol jet printing (AJP), a direct-write additive manufacturing technique, has emerged as the process of choice for the fabrication of flexible and hybrid electronics. AJP has paved the way for high-resolution device fabrication with high placement accuracy, edge definition, and adhesion. In addition, AJP accommodates a broad range of ink viscosities and allows for printing on non-planar surfaces. Despite these unique advantages and a host of strategic applications, AJP is a highly unstable and complex process, prone to gradual drifts in machine behavior and deposited material. Hence, real-time monitoring and control of the AJP process is a burgeoning need. In pursuit of this goal, the objectives of the work are as follows: (i) In situ image acquisition from the traces/lines of printed electronic devices right after deposition. To realize this objective, the AJP experimental setup was instrumented with a high-resolution charge-coupled device (CCD) camera mounted on a variable-magnification lens (in addition to the standard imaging system already installed on the AJ printer). (ii) In situ image processing and quantification of the trace morphology. In this regard, several customized image processing algorithms were devised to quantify/extract various aspects of the trace morphology from online images. In addition, based on the concept of shape-from-shading (SfS), several other algorithms were introduced, allowing not only reconstruction of the 3D profile of the AJ-printed electronic traces, but also quantification of 3D morphology traits such as thickness, cross-sectional area, and surface roughness, among others. (iii) Development of a supervised multiple-input, single-output (MISO) machine learning model, based on sparse representation for classification (SRC), with the aim of estimating the device functional properties (e.g., resistance) in near real-time with an accuracy of ≥ 90%. (iv) Development of a computational fluid dynamics (CFD) model to explain the underlying aerodynamic phenomena behind aerosol transport and deposition in the AJP process, observed experimentally. Overall, this doctoral dissertation paves the way for: (i) implementation of physics-based real-time monitoring and control of the AJP process toward conformal material deposition and device fabrication; and (ii) optimal design of direct-write components, such as nozzles, deposition heads, virtual impactors, atomizers, etc.
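
    As a purely illustrative sketch of the kind of in situ trace-morphology quantification described above (the dissertation's actual algorithms are not reproduced in this abstract), the following Python snippet estimates printed line width from a top-down grayscale image by thresholding and counting deposited-ink pixels per column; the function name, the crude global threshold, and the micrometre-per-pixel calibration argument are assumptions.

    # Hypothetical example: per-column line-width estimate for a roughly horizontal
    # printed trace that appears darker than the substrate.
    import numpy as np

    def trace_width_um(img, um_per_px, thresh=None):
        """img: 2-D grayscale array; um_per_px: assumed camera calibration factor.
        Returns the mean and standard deviation of the trace width in micrometres."""
        if thresh is None:
            thresh = img.mean()                # crude global threshold (assumption)
        mask = img < thresh                    # True where ink is deposited
        widths_px = mask.sum(axis=0)           # deposited pixels in each image column
        widths_px = widths_px[widths_px > 0]   # ignore columns without any trace
        widths_um = widths_px * um_per_px
        return widths_um.mean(), widths_um.std()

    Morphology statistics of this kind could then feed a downstream estimator such as the SRC-based functional-property model mentioned in objective (iii).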

    Indoor navigation systems for unmanned aerial vehicles

    Get PDF
    Ph.D. (Doctor of Philosophy)

    Proof-of-concept of a single-point Time-of-Flight LiDAR system and guidelines towards integrated high-accuracy timing, advanced polarization sensing and scanning with a MEMS micromirror

    Get PDF
    Integrated master's dissertation in Engineering Physics (specialization in Devices, Microsystems and Nanotechnologies). The core focus of the work reported herein is the fulfillment of a functional Light Detection and Ranging (LiDAR) sensor to validate the direct Time-of-Flight (ToF) ranging concept and the acquisition of critical knowledge regarding pivotal aspects jeopardizing the sensor's performance, for forthcoming improvements aiming at a realistic sensor targeted towards automotive applications. Hereupon, the ToF LiDAR system is implemented through an architecture encompassing both optical and electronic functions and is subsequently characterized under a sequence of test procedures usually applied in benchmarking of LiDAR sensors. The design employs a hybrid edge-emitting laser diode (pulsed at 6kHz, 46ns temporal FWHM, 7ns rise-time; 919nm wavelength with 5nm FWHM), a PIN photodiode to detect the back-reflected radiation, a transamplification stage and two Time-to-Digital Converters (TDCs), with leading-edge discrimination electronics to mark the transit time between emission and detection events. Furthermore, a flexible modular design is adopted, using two separate Printed Circuit Boards (PCBs) comprising the transmitter (TX) and the receiver (RX), i.e., detection and signal processing. The overall output beam divergence is 0.4°×1°, and an optical peak power of 60W (87% overall throughput) is realized. The sensor is tested indoors from 0.56 to 4.42 meters, and the distance is directly estimated from the pulse transit time. The precision within these working distances ranges from 4cm to 7cm, reflected in a Signal-to-Noise Ratio (SNR) between 12dB and 18dB. The design requires a calibration procedure to correct systematic errors in the range measurements, induced by two sources: the timing offset due to architecture-inherent differences in the optoelectronic paths, and a supplementary bias resulting from the design, which introduces an intensity dependence and is denoted time-walk. The calibrated system achieves a mean accuracy of 1cm. Two distinct target materials are used for characterization and performance evaluation: a metallic automotive paint and a diffuse material. This selection is representative of two extremes of actual LiDAR applications. The optical and electronic characterization is thoroughly detailed, including the recognition of good agreement between empirical observations and simulations in ZEMAX, for the optical design, and in a SPICE software, for the electrical subsystem. The foremost limitation of the implemented design is identified as an outcome of the leading-edge discrimination. A proposal for a Constant Fraction Discriminator addressing sub-millimetric accuracy is provided to replace the previous signal-processing element. This modification is mandatory to virtually eliminate the aforementioned systematic bias in range sensing due to the intensity dependence. A further crucial addition is a scanning mechanism to supply the required Field-of-View (FOV) for automotive usage. Opto-electromechanical guidelines for interfacing a MEMS micromirror scanner with the LiDAR sensor, achieving a 46°×17° FOV, are furnished. Ultimately, a proof of principle for the use of polarization in material classification for advanced processing is carried out, aiming to complement the ToF measurements.
The original design is modified to include a variable wave retarder, allowing the simultaneous detection of orthogonal linear polarization states using a single detector. The material classification with polarization sensing is tested with the aforementioned materials, culminating in 87% and 11% degree-of-linear-polarization retention for the metallic paint and the diffuse material, respectively, computed via the Stokes parameters. The procedure was independently validated under the same conditions with a micro-polarizer camera (92% and 13% polarization retention). The primary aim of the work reported in this document is the development of a functional LiDAR sensor that validates the concept of direct time-of-flight measurement of optical pulses for distance estimation, and the acquisition of critical knowledge concerning fundamental aspects that impair the sensor's performance, with a view to future improvements for a sensor targeted at automotive applications. Accordingly, the LiDAR system is implemented through an architecture encompassing both optical and electronic functions, and is subsequently characterized through a sequence of experimental tests commonly applied in benchmarking of LiDAR sensors. The design makes use of a hybrid laser diode (pulsed at 6kHz, 46ns temporal width; 919nm peak wavelength and 5nm spectral width), a PIN photodiode to detect the reflected radiation, a transamplification stage and two time-to-digital converters, with constant-threshold timing discrimination to mark the transit time between emission and reception. Furthermore, a flexible modular design is adopted through two independent PCBs comprising the transmitter and the receiver (detection and signal processing). The overall divergence of the beam emitted into the surroundings is 0.4°×1°, with an optical peak power of 60W (87% transmission efficiency). The sensor is tested indoors, between 0.56 and 4.42 meters. The precision within the working distances ranges from 4cm to 7cm, which is reflected in a signal-to-noise ratio between 12dB and 18dB. The design requires calibration to correct systematic errors in the acquired distances due to two sources: the ToF offset due to architecture-inherent differences in the optoelectronic paths, and an additional dependence on the intensity of the reflected signal, induced by the implemented discrimination technique and denoted time-walk. The post-calibration accuracy of the system amounts to a mean value of 1cm. Two distinct targets are used during the characterization and performance-evaluation phase: a metallic paint used in automotive coatings and a diffuse material. This selection is representative of two extreme scenarios in real LiDAR applications. The characterization of the optical and electronic subsystems is thoroughly detailed, including the observation of good agreement between empirical results and optical simulations in ZEMAX and electrical simulations in a SPICE software. The main limiting element of the implemented design is identified as the adopted discrimination technique. Consequently, it is proposed to replace that block with a constant-fraction discrimination technique applied to the return pulse, with accuracies of sub-millimetric order. This modification is imperative to eliminate the systematic offset in the distance measurements arising from the signal-intensity dependence. 
Another highly relevant addition is a scanning mechanism that ensures compliance with the field-of-view requirements for automotive applications. Guidelines for integrating a micromirror into the designed sensor are provided, enabling a field of view of 46°×17°. Finally, a proof of principle is carried out for the use of polarization as a complement to the time-of-flight measurements, in order to support material classification in advanced processing. The original architecture is modified to include a variable wave retarder, enabling the detection of orthogonal polarization states with a single photodetector. Material classification by assessing the polarization state of the reflected light is tested for the aforementioned materials, culminating in a polarization retention of 87% (metallic paint) and 11% (diffuser), computed via the Stokes parameters. The procedure is independently validated with a polarimetric camera under the same conditions (92% and 13% retention).
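
    As a hedged sketch of the two quantities reported in this abstract, the snippet below computes a direct time-of-flight range from the measured pulse transit time and a degree-of-linear-polarization estimate from two orthogonal linear intensity channels (equivalent to S1/S0 in that basis); the function names and the two-channel simplification are assumptions, and the dissertation's exact Stokes treatment may differ.

    # Illustrative only: direct ToF ranging and a two-channel DoLP estimate.
    C = 299_792_458.0  # speed of light in vacuum, m/s

    def tof_range_m(transit_time_s):
        # The pulse travels to the target and back, hence the factor 1/2.
        return 0.5 * C * transit_time_s

    def linear_polarization_degree(i_parallel, i_perpendicular):
        # With only two orthogonal linear channels, DoLP reduces to
        # S1/S0 = (I_par - I_perp) / (I_par + I_perp).
        s0 = i_parallel + i_perpendicular
        s1 = i_parallel - i_perpendicular
        return abs(s1) / s0

    For example, a transit time of 10 ns corresponds to a range of about 1.5 m, and the 87% and 11% retention figures quoted above correspond to strongly and weakly polarization-preserving returns, respectively.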

    Reliability index of optical flow that considers error margin of matches and stabilizes camera movement estimation

    No full text

    Bio-Inspired Motion Vision for Aerial Course Control

    No full text

    Pattern Recognition

    Get PDF
    Pattern recognition is a very wide research field. It involves elements as diverse as sensors, feature extraction, pattern classification, decision fusion, and applications. The signals processed are commonly one-, two- or three-dimensional; the processing is done in real time or takes hours and days; some systems look for one narrow object class, while others search huge databases for entries with at least a small amount of similarity. No single person can claim expertise across the whole field, which develops rapidly, updates its paradigms, and encompasses several philosophical approaches. This book reflects this diversity by presenting a selection of recent developments within the area of pattern recognition and related fields. It covers theoretical advances in classification and feature extraction as well as application-oriented works. The authors of these 25 works present and advocate recent achievements of their research in the field of pattern recognition.

    A Multimodal and Multi-Algorithmic Architecture for Data Fusion in Biometric Systems

    Get PDF
    Authentication software based on biometric traits.