12 research outputs found

    Characterization and digital aberration correction of a hyperspectral imaging system for plant disease detection

    Hyperspectral imaging is a key technology for monitoring agricultural crops and vegetation. It can be used for health estimation and the early detection of disease symptoms in plants. This can help to reduce the use of pesticides by allowing targeted and early intervention. Cost-efficient hyperspectral imaging systems are necessary to meet the increasing demand for monitoring techniques for agricultural products. These systems usually suffer from sub-optimal image quality. Here we present a digital aberration correction for hyperspectral image data.

    Subpixel point spread function estimation from two photographs at different distances

    In most digital cameras, and even in high-end digital single-lens reflex cameras, the acquired images are sampled at rates below the Nyquist critical rate, causing aliasing effects. This work introduces an algorithm for the subpixel estimation of the point spread function (PSF) of a digital camera from aliased photographs. The numerical procedure simply uses two fronto-parallel photographs of any planar textured scene taken at different distances. The mathematical theory developed herein proves that, under reasonable conditions, the camera PSF can be derived from these two images. Mathematical proofs supplemented by experimental evidence show the well-posedness of the problem and the convergence of the proposed algorithm to the camera's in-focus PSF. An experimental comparison of the resulting PSF estimates shows that the proposed algorithm reaches the accuracy levels of the best non-blind state-of-the-art methods.
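
    The paper's full method handles the scale change and sub-Nyquist sampling between the two photographs; as a rough illustration of the underlying linear-inverse-problem view, the following Python sketch estimates an inter-image blur kernel by regularized least squares, assuming the two images have already been registered to a common grid (the function name and the regularization choice are ours, not the authors').

    ```python
    # Hedged sketch, not the paper's algorithm: estimate a blur kernel k such
    # that  blurred ≈ sharp ⊛ k  (2-D correlation; a convolution up to a kernel
    # flip), assuming `sharp` and `blurred` are already registered. For large
    # images, run this on a modest crop to keep the linear system small.
    import numpy as np

    def estimate_kernel(sharp, blurred, ksize=11, reg=1e-3):
        r = ksize // 2
        H, W = sharp.shape
        rows, targets = [], []
        # Each valid pixel of `blurred` is a dot product between a
        # (ksize x ksize) patch of `sharp` and the unknown kernel.
        for y in range(r, H - r):
            for x in range(r, W - r):
                rows.append(sharp[y - r:y + r + 1, x - r:x + r + 1].ravel())
                targets.append(blurred[y, x])
        A = np.asarray(rows)
        b = np.asarray(targets)
        # Tikhonov-regularized normal equations for numerical stability.
        k = np.linalg.solve(A.T @ A + reg * np.eye(ksize * ksize), A.T @ b)
        k = k.reshape(ksize, ksize)
        return k / k.sum()  # normalize to unit DC gain
    ```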

    Modeling Non-Stationary Asymmetric Lens Blur By Normal Sinh-Arcsinh Model

    Images acquired by a camera show lens blur due to imperfections in the optical system. Lens blur is non-stationary in the sense that the amount of blur depends on the pixel location in the sensor. It is also asymmetric in the sense that the amount of blur differs between the radial and tangential directions, and between the inward and outward radial directions. This paper presents parametric blur kernel models based on the normal sinh-arcsinh distribution function. The proposed models can provide flexible blur kernel shapes with different symmetry and skewness to model complicated lens blur accurately. The blur of single-focal-length lenses is estimated, and the accuracy of the models is compared with that of existing parametric blur models. The advantage of the proposed models is demonstrated through deblurring experiments.
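
    As a rough illustration of how a sinh-arcsinh profile can produce skewed, direction-dependent blur, the Python sketch below builds a 2-D kernel from the Jones & Pewsey (2009) sinh-arcsinh density evaluated along rotated radial/tangential axes; the separable construction and all parameter names are our assumptions, not the paper's exact model.

    ```python
    # Hedged sketch: an asymmetric 2-D blur kernel built from the standard
    # sinh-arcsinh (SAS) density. The paper's parameterization and kernel
    # construction may differ; eps_r, delta_r, sigma_r, sigma_t are illustrative.
    import numpy as np

    def sas_pdf(x, mu=0.0, sigma=1.0, eps=0.0, delta=1.0):
        """Sinh-arcsinh density: skewness controlled by eps, tail weight by delta."""
        z = (x - mu) / sigma
        s = np.sinh(delta * np.arcsinh(z) - eps)
        c = np.sqrt(1.0 + s * s)
        return delta * c / (sigma * np.sqrt(2.0 * np.pi * (1.0 + z * z))) * np.exp(-0.5 * s * s)

    def sas_blur_kernel(ksize, theta, sigma_r, sigma_t, eps_r=0.0, delta_r=1.0):
        """Separable SAS kernel in a frame rotated by `theta` (radial x tangential).

        eps_r models the inward/outward radial asymmetry; the tangential
        profile is kept symmetric here for simplicity.
        """
        r = ksize // 2
        y, x = np.mgrid[-r:r + 1, -r:r + 1].astype(float)
        u = np.cos(theta) * x + np.sin(theta) * y      # radial axis
        v = -np.sin(theta) * x + np.cos(theta) * y     # tangential axis
        k = sas_pdf(u, sigma=sigma_r, eps=eps_r, delta=delta_r) * sas_pdf(v, sigma=sigma_t)
        return k / k.sum()

    # Example: a kernel skewed along the radial direction at 30 degrees.
    kernel = sas_blur_kernel(ksize=21, theta=np.deg2rad(30), sigma_r=2.5, sigma_t=1.2, eps_r=0.6)
    ```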

    Modeling nonstationary lens blur using eigen blur kernels for restoration

    Images acquired through a lens show nonstationary blur due to defocus and optical aberrations. This paper presents a method for accurately modeling nonstationary lens blur using eigen blur kernels obtained from samples of blur kernels through principal component analysis. Pixelwise-varying nonstationary lens blur is expressed as a linear combination of stationary blurs by the eigen blur kernels. The operations that represent nonstationary blur can be implemented efficiently using the discrete Fourier transform. The proposed method provides a more accurate and efficient approach to modeling nonstationary blur than the widely used efficient filter flow method, which assumes stationarity within image regions. The proposed eigen-blur-kernel-based modeling is applied to total variation restoration of nonstationary lens blur. Accurate and efficient modeling of blur leads to improved restoration performance. The proposed method can be applied to model various nonstationary degradations of image acquisition processes where degradation information is available only at some sparse pixel locations.
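
    A minimal Python sketch of the idea follows: sampled kernels are reduced to a few eigen blur kernels by PCA, and the spatially varying blur is applied as a per-pixel weighted sum of stationary FFT convolutions. The interpolation of dense coefficient maps from the sparse samples is omitted, and the function names are illustrative rather than the paper's implementation.

    ```python
    # Hedged sketch of the eigen-blur-kernel idea: PCA on a bank of sampled
    # kernels, then spatially varying blur as a weighted sum of a few
    # stationary circular convolutions computed with the FFT.
    import numpy as np
    from numpy.fft import fft2, ifft2

    def eigen_kernels(sample_kernels, n_components=5):
        """PCA on flattened blur kernels sampled at sparse pixel locations."""
        K = np.asarray([k.ravel() for k in sample_kernels])   # (N, ksize*ksize)
        mean = K.mean(axis=0)
        U, S, Vt = np.linalg.svd(K - mean, full_matrices=False)
        basis = Vt[:n_components]                             # eigen kernels
        coeffs = (K - mean) @ basis.T                         # per-sample coefficients
        return mean, basis, coeffs

    def apply_nonstationary_blur(img, mean, basis, coeff_maps, ksize):
        """Blur `img` with  k(x) = mean + sum_j coeff_maps[j](x) * basis[j]."""
        def conv(image, kernel):
            # Circular FFT convolution with the kernel centered at the origin.
            pad = np.zeros_like(image)
            r = ksize // 2
            pad[:ksize, :ksize] = kernel.reshape(ksize, ksize)
            pad = np.roll(pad, (-r, -r), axis=(0, 1))
            return np.real(ifft2(fft2(image) * fft2(pad)))
        out = conv(img, mean)
        for j in range(len(basis)):
            out += coeff_maps[j] * conv(img, basis[j])        # pixelwise weights
        return out
    ```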

    Computational Imaging Approach to Recovery of Target Coordinates Using Orbital Sensor Data

    This dissertation addresses the components necessary to simulate an image-based recovery of the position of a target using orbital image sensors. Each component is considered in detail, focusing on the effect that design choices and system parameters have on the accuracy of the position estimate. Changes in sensor resolution, varying amounts of blur, differences in image noise level, the selection of algorithms used for each component, and lag introduced by excessive processing time all affect the accuracy of the recovered target coordinates. Using physical targets and sensors in this scenario would be cost-prohibitive in the exploratory setting posed; therefore, a simulated target path is generated using Bezier curves that approximate representative paths followed by the targets of interest. Orbital trajectories for the sensors are designed on an elliptical model representative of the motion of physical orbital sensors. Images from each sensor are simulated based on the position and orientation of the sensor, the position of the target, and the imaging parameters selected for the experiment (resolution, noise level, blur level, etc.). Post-processing of the simulated imagery seeks to reduce noise and blur and to increase resolution. The only information available to a fully implemented system for calculating the target position is the sensor position and orientation vectors and the images from each sensor. From these data we develop a reliable method of recovering the target position and analyze the impact on near-real-time processing. We also discuss the influence of adjustments to system components on overall capabilities and address the potential system size, weight, and power requirements of realistic implementation approaches.
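
    One small, generic piece of such a pipeline is the final geometric step: given each sensor's position and a line-of-sight direction to the target extracted from its image, the target coordinates can be recovered by least-squares ray intersection. The Python sketch below shows only this triangulation step under that assumption; it is not the dissertation's full simulation and estimation chain, and all names are illustrative.

    ```python
    # Hedged sketch: least-squares intersection of rays  p_i + t * d_i,
    # one ray per sensor, as a generic triangulation step.
    import numpy as np

    def triangulate(sensor_positions, lines_of_sight):
        """Return the point minimizing the summed squared distance to all rays."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, d in zip(sensor_positions, lines_of_sight):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
            A += P
            b += P @ p
        return np.linalg.solve(A, b)

    # Example with two orbital sensors observing a target near the origin (km).
    p1, p2 = np.array([7000.0, 0.0, 0.0]), np.array([0.0, 7000.0, 500.0])
    target = np.array([100.0, 200.0, 50.0])
    d1, d2 = target - p1, target - p2
    print(triangulate([p1, p2], [d1, d2]))   # recovers ~[100, 200, 50]
    ```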

    Point spread function estimation of solar surface images with a cooperative particle swarm optimization on GPUs

    Advisor: Prof. Dr. Daniel Weingaertner. Master's thesis, Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Informática; defended in Curitiba on 21/02/2013. Abstract: We present a method for estimating the point spread function (PSF) of solar surface images acquired by ground telescopes and degraded by the atmosphere. The estimation is done by retrieving the wavefront phase using a set of short exposures, the speckle reconstruction of the observed object, and a PSF model parametrized by Zernike polynomials. Estimates of the wavefront phase and the PSF are computed by minimizing an error function with a cooperative particle swarm optimization (CPSO) method, implemented in OpenCL to take advantage of highly parallel graphics processing units (GPUs). A calibration method is presented to adjust the algorithm parameters for low-cost results, providing solid estimates for both low-frequency and high-frequency images. Results show that the method converges quickly and is robust to noise degradation. Experiments run on an NVidia Tesla C2050 computed 100 PSFs with 50 Zernike polynomials in approximately 36 minutes. Increasing the number of Zernike coefficients tenfold, from 50 to 500, increased the execution time by only 17%, showing that the proposed algorithm is only slightly affected by the number of Zernike polynomials used.
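
    As a rough illustration of the forward model behind such estimators, the Python sketch below generates a PSF from a circular pupil whose phase is a weighted sum of a few low-order Zernike modes; the mode set, normalization, and grid parameters are our simplifications, and the cooperative PSO itself is not reproduced.

    ```python
    # Hedged sketch: PSF = |FFT(pupil * exp(i*phase))|^2 with the phase expanded
    # in a handful of Zernike modes (Noll-style normalization, illustrative only).
    import numpy as np

    def zernike_psf(coeffs, n=256, aperture=0.5):
        y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
        r = np.hypot(x, y) / aperture
        theta = np.arctan2(y, x)
        pupil = (r <= 1.0).astype(float)
        modes = [
            2.0 * r * np.cos(theta),                              # tilt x
            2.0 * r * np.sin(theta),                              # tilt y
            np.sqrt(3.0) * (2.0 * r ** 2 - 1.0),                  # defocus
            np.sqrt(6.0) * r ** 2 * np.cos(2 * theta),            # astigmatism
            np.sqrt(8.0) * (3.0 * r ** 3 - 2.0 * r) * np.cos(theta),  # coma x
        ]
        phase = sum(c * z for c, z in zip(coeffs, modes))
        field = pupil * np.exp(1j * phase)
        psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
        return psf / psf.sum()

    psf = zernike_psf([0.0, 0.0, 1.5, 0.7, 0.3])   # mostly defocus, some astigmatism/coma
    ```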

    Terahertz computed tomography with time-domain spectroscopy systems

    This thesis deals with the optical, non-destructive technique of terahertz (THz) computed tomography (CT). Its goal is to overcome individual physical and technical challenges in order to lay the foundations for transferring this new technology to industrial use. This involved exploiting all the additional information provided by time-resolved spectroscopy systems, investigating the optical effects that arise from the long-wavelength radiation, finding corrective measures, and increasing the measurement speed of existing systems. Based on pulsed time-domain spectroscopy (TDS), the work captured, in addition to the usual absorption information, the additional timing and spectral information of short THz pulses and used it for the volumetric reconstruction of internal structures and properties. In this way, substances were both spatially localized and spectrally identified. THz CT was demonstrated for the first time with a new multi-channel TDS system using 15 LTG-InGaAs/InAlAs detection channels at the 1030 nm excitation wavelength of an ultrashort-pulse fiber laser. As part of the system development, new laser-generated anti-reflection structures for high-refractive-index THz optics were created, which can drastically improve signal quality in the future. Owing to the long wavelength of the radiation, two main optical effects, edge and refraction effects, were investigated and classified, and solution strategies for correcting and reducing their artifacts in reconstruction images were presented. This resulted in methods for detecting multiple pulses in the time signal, deconvolution operations for improving optical image quality, and a geometric-optics simulation tool for analysis procedures and further system development.

    Robust watermarking of printed images

    Invisible watermarking of ID images printed on plastic card supports is a challenging problem of interest to industry. In this study, we developed a watermarking algorithm robust to the various attacks present in this use case. These attacks are mainly related to the print/scan process on the plastic support and to the degradations an ID card can encounter during its lifetime. The watermarking scheme operates in the Fourier domain, as this transform has invariance properties against global geometrical transformations. A preventive method pre-processes the host image before embedding to reduce the variance of the vector carrying the mark. A curative method comprises two counter-attacks dealing with blurring and color variations. For a false-alarm probability of 10⁻⁴, we obtained an average improvement of 22% over the reference method when only the preventive method is used. The combination of the preventive and curative methods leads to a detection rate greater than 99%. The detection algorithm takes less than 1 second for a 512×512 image on a conventional computer, which is compatible with the targeted industrial application.
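
    As a generic illustration of magnitude-domain embedding in the Fourier transform (not the thesis's actual scheme, preventive pre-processing, or counter-attacks), the Python sketch below multiplies a mid-frequency ring of Fourier magnitudes by a secret bipolar sequence and detects it by normalized correlation; all names and parameters are illustrative.

    ```python
    # Hedged sketch: multiplicative watermark on mid-frequency Fourier magnitudes,
    # detection by correlating the log-magnitudes with the secret sequence.
    import numpy as np

    rng = np.random.default_rng(seed=7)

    def ring_mask(shape, r_lo=0.25, r_hi=0.35):
        """Boolean mask selecting a mid-frequency ring in the centered spectrum."""
        h, w = shape
        y, x = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
        rad = np.hypot(y / (h / 2), x / (w / 2))
        return (rad >= r_lo) & (rad <= r_hi)

    def embed(img, strength=0.08):
        F = np.fft.fftshift(np.fft.fft2(img))
        mask = ring_mask(img.shape)
        mark = rng.choice([-1.0, 1.0], size=int(mask.sum()))   # secret bipolar sequence
        mag, phase = np.abs(F), np.angle(F)
        mag[mask] *= (1.0 + strength * mark)                    # multiplicative embedding
        Fw = mag * np.exp(1j * phase)
        # Taking the real part discards the small residue from breaking
        # the Hermitian symmetry of the modified spectrum.
        return np.real(np.fft.ifft2(np.fft.ifftshift(Fw))), mark

    def detect(img, mark):
        F = np.fft.fftshift(np.fft.fft2(img))
        v = np.log(np.abs(F[ring_mask(img.shape)]) + 1e-9)
        v = v - v.mean()
        return float(v @ mark / (np.linalg.norm(v) * np.linalg.norm(mark) + 1e-12))
    ```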

    Direct PSF Estimation Using a Random Noise Target

    Conventional point spread function (PSF) measurement methods often use parametric models for the estimation of the PSF. This limits the shape of the PSF to a specific form provided by the model. However, there are unconventional imaging systems, such as multispectral cameras with optical bandpass filters, that produce, e.g., an asymmetric PSF. To estimate such PSFs we have developed a new measurement method utilizing a random noise test target with markers: after acquisition of this target, a synthetic prototype of the test target is geometrically transformed to match the acquired image with respect to its geometric alignment. This allows us to estimate the PSF by direct comparison between prototype and image. The noise target allows us to evaluate all frequencies thanks to the approximately “white” spectrum of the test target, so we are not limited to a specifically shaped PSF. The registration of the prototype pattern also lets us take the target's actual spectrum into account rather than assuming a “white” spectrum, which might be a weak assumption in small image regions. Based on the PSF measurement, we perform a deconvolution. We present comprehensive results for the PSF estimation using our multispectral camera and provide deconvolution results.
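
    Assuming the synthetic prototype has already been registered to the acquired image, the direct comparison can be written as a regularized spectral division (a Wiener-style estimate); the Python sketch below shows this step, with the regularization constant and support size chosen for illustration only.

    ```python
    # Hedged sketch: PSF ≈ IFFT( F(acquired) * conj(F(prototype)) / (|F(prototype)|^2 + reg) ),
    # then cropped to a small support and normalized. The noise target's flat
    # spectrum keeps the division well conditioned at all frequencies.
    import numpy as np

    def estimate_psf(acquired, prototype, support=31, reg=1e-2):
        A = np.fft.fft2(acquired)
        P = np.fft.fft2(prototype)
        H = A * np.conj(P) / (np.abs(P) ** 2 + reg)
        psf_full = np.real(np.fft.fftshift(np.fft.ifft2(H)))
        cy, cx = np.array(psf_full.shape) // 2
        r = support // 2
        psf = psf_full[cy - r:cy + r + 1, cx - r:cx + r + 1]
        psf = np.clip(psf, 0, None)
        return psf / psf.sum()
    ```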
