    Image restoration in digital photography

    This paper introduces novel image restoration algorithms for digital photography, one of the fastest-growing consumer electronics markets in recent years. Many attempts have been made to improve the quality of digital pictures in comparison with photographs taken on film. A lot of these methods have their roots in discrete signal and image processing developed over the last two decades, but the ever-increasing computational power of personal computers has made new designs and advanced techniques possible. The algorithms presented here take advantage of the programmability of the pixels and the availability of a compression codec commonly found inside digital cameras, and work in compliance with either the JPEG or the JPEG-2000 image compression standard.

    Analysis of the DCT coefficient distributions for document coding

    It is known that the discrete cosine transform (DCT) coefficients of most natural images follow a Laplacian distribution, and this knowledge has been employed to improve decoder design. However, such is not the case for text documents. In this letter, we present an analysis of their DCT coefficient distributions and show that a Gaussian distribution can be a realistic model. Furthermore, a generalized Gaussian model can be used to incorporate both this case and the Laplacian distribution found for natural images.
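
    As a rough illustration of the kind of analysis described above (not code from the letter), the following Python sketch collects one AC coefficient of the 8x8 block DCT across an image and compares zero-mean Gaussian and Laplacian fits by maximum likelihood; the block size, the chosen coefficient index, and the use of scipy are assumptions made for the example.

import numpy as np
from scipy.fftpack import dct

def block_dct_coefficient(img, u=0, v=1, block=8):
    """Collect the (u, v) DCT coefficient from every block x block tile of img."""
    h, w = img.shape
    coeffs = []
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            tile = img[i:i + block, j:j + block].astype(float)
            d = dct(dct(tile, axis=0, norm='ortho'), axis=1, norm='ortho')
            coeffs.append(d[u, v])
    return np.array(coeffs)

def compare_models(coeffs):
    """Maximum-likelihood fit of zero-mean Gaussian and Laplacian models."""
    n = coeffs.size
    sigma2 = np.mean(coeffs ** 2)        # Gaussian ML variance
    b = np.mean(np.abs(coeffs))          # Laplacian ML scale
    ll_gauss = -0.5 * n * np.log(2 * np.pi * sigma2) - 0.5 * n
    ll_laplace = -n * np.log(2 * b) - n
    return {'gaussian': ll_gauss, 'laplacian': ll_laplace}

    A higher log-likelihood indicates the better-fitting model for that coefficient; running this separately on natural images and on scanned text pages gives the kind of comparison discussed above.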

    A nonlinear image restoration framework using vector quantization

    Vector quantization (VQ) is a powerful method used primarily in signal and image compression. In recent years, it has also been applied to various other image processing tasks, including image classification, histogram modification, and restoration. In this paper, we focus our attention on image restoration using VQ. We present a general framework that incorporates two other methods in the literature, and discuss a method of our own that follows more naturally from this framework. With appropriate training data for the VQ codebook, this method can restore images beyond the diffraction limit. © 2004 IEEE.
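
    One hypothetical way such a VQ-based restoration framework could be realized (a sketch under assumed details, not the paper's method) is to train a codebook on degraded patches and pair each code cell with the mean of the corresponding clean patches, so that encoding a degraded patch directly yields a restored estimate:

import numpy as np
from scipy.cluster.vq import kmeans2, vq

def train_vq_restorer(degraded_patches, clean_patches, k=256, seed=0):
    """degraded_patches, clean_patches: (N, d) arrays of flattened patch pairs."""
    codebook, labels = kmeans2(degraded_patches.astype(float), k,
                               minit='++', seed=seed)
    clean_centroids = codebook.copy()        # fallback for empty cells
    for c in range(k):
        members = labels == c
        if members.any():
            clean_centroids[c] = clean_patches[members].mean(axis=0)
    return codebook, clean_centroids

def restore_patches(degraded_patches, codebook, clean_centroids):
    """Encode each degraded patch and return the paired clean centroid."""
    labels, _ = vq(degraded_patches.astype(float), codebook)
    return clean_centroids[labels]

    The pairing of degraded and clean patches is where the training data enters; richer variants could regress within each code cell rather than store a single centroid.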

    Compound document compression with model-based biased reconstruction

    The usefulness of electronic document delivery and archives rests in large part on advances in compression technology. Documents can contain complex layouts with different data types, such as text and images, that have different statistical characteristics. To achieve better image quality, it is important to make use of these characteristics in compression. We exploit the transform coefficient distributions of text and images, and show that the reconstruction scheme in baseline JPEG does not lead to the minimum mean-square error when models of these coefficients are available. Instead, we discuss an algorithm designed for this goal, which first classifies the blocks and then estimates the model parameters to enable a biased reconstruction of the decompressed values. Simulation results validate the advantages of this method. © 2004 SPIE and IS&T.
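
    As an assumed-form sketch (the exact estimator in the paper may differ), biased reconstruction can be pictured as replacing the bin-centre dequantization of baseline JPEG with the conditional mean of the coefficient prior within the quantization bin, using a Laplacian prior for image blocks and a Gaussian prior for text blocks:

import numpy as np

def conditional_mean(pdf, lo, hi, n=2001):
    """E[X | lo <= X < hi] under a prior density pdf, by discrete summation."""
    x = np.linspace(lo, hi, n)
    p = pdf(x)
    return np.sum(x * p) / np.sum(p) if np.sum(p) > 0 else 0.5 * (lo + hi)

def biased_dequantize(level, q, pdf):
    """level: integer quantization index; q: quantizer step size."""
    if level == 0:
        return 0.0                      # keep zeroed coefficients at zero
    lo, hi = (level - 0.5) * q, (level + 0.5) * q
    return conditional_mean(pdf, lo, hi)

# Illustrative priors for the two block classes (the scale values are made up).
laplacian = lambda x, b=10.0: np.exp(-np.abs(x) / b) / (2 * b)
gaussian = lambda x, s=15.0: np.exp(-x ** 2 / (2 * s ** 2)) / (np.sqrt(2 * np.pi) * s)

    Because these priors decay away from zero, the conditional mean sits closer to zero than the bin centre, which is the sense in which the reconstruction is "biased".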

    Combining gray world and retinex theory for automatic white balance in digital photography

    White balancing is an important step in the digital camera processing pipeline that adjusts the color of the pixels under different illuminations. Efficient automatic white balance is usually a required component of a consumer digital camera because most users prefer not to handle this task manually. The gray world assumption and Retinex theory are two commonly used methods, but their aims differ and their applicability depends on the nature of the images. In this paper, we present an effective technique that combines the two while preserving the strengths of both. Experimental results confirm that our approach is a viable alternative to the two existing methods. © IEEE.
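
    A hedged sketch of one way to combine the two ideas (the paper's exact formulation may differ): apply a quadratic mapping to the red and blue channels whose two coefficients are chosen so that the corrected channel matches the green channel both in average value (the gray world condition) and in maximum value (the Retinex, or white-patch, condition).

import numpy as np

def combined_white_balance(img, vmax=255.0):
    """img: RGB image as a float array of shape (H, W, 3); green is the reference."""
    out = img.astype(float).copy()
    g = out[..., 1]
    for c in (0, 2):                          # adjust the red and blue channels
        x = out[..., c]
        # Solve for (mu, nu) in x' = mu*x^2 + nu*x so that
        #   sum(x') = sum(g)   (gray world)   and   max(x') ~= max(g)   (Retinex).
        A = np.array([[np.sum(x ** 2), np.sum(x)],
                      [np.max(x) ** 2, np.max(x)]])
        b = np.array([np.sum(g), np.max(g)])
        mu, nu = np.linalg.solve(A, b)
        out[..., c] = mu * x ** 2 + nu * x
    return np.clip(out, 0.0, vmax)

    Approximating the maximum of the mapped channel by evaluating the mapping at max(x) assumes the mapping is increasing over the channel's range, which is one of the simplifications in this sketch.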

    Graylevel alignment between two images using linear programming

    A critical step in defect detection for semiconductor processes is to align a test image against a reference. This includes both spatial alignment and grayscale alignment. For the latter, a direct least-squares approach is not well suited because the presence of defects would skew the estimated parameters. Instead, we use a linear programming formulation, which admits a fast algorithm while producing better alignment of the test image to the reference. Furthermore, the formulation is flexible and capable of incorporating additional constraints, such as ensuring that the aligned pixel values stay within the allowable intensity range.
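
    The least-absolute-deviation flavour of this idea can be written as a small linear program; the sketch below (an illustration with an assumed affine model a*x + b, not the paper's exact formulation) uses scipy.optimize.linprog and includes the intensity-range constraint mentioned above. For clarity it builds dense constraint matrices, so in practice one would run it on a subsample of pixels.

import numpy as np
from scipy.optimize import linprog

def align_graylevels(test, ref, vmax=255.0):
    """Fit ref ~ a*test + b by minimizing the sum of absolute residuals."""
    x = test.ravel().astype(float)
    r = ref.ravel().astype(float)
    n = x.size
    # Variables: [a, b, t_1..t_n]; objective: minimize sum(t_i).
    c = np.concatenate([[0.0, 0.0], np.ones(n)])
    # |a*x_i + b - r_i| <= t_i  becomes two inequality rows per pixel.
    A1 = np.column_stack([x, np.ones(n), -np.eye(n)])           #  a*x + b - t <=  r
    A2 = np.column_stack([-x, -np.ones(n), -np.eye(n)])         # -a*x - b - t <= -r
    # Keep the aligned values inside the allowable intensity range [0, vmax].
    A3 = np.column_stack([x, np.ones(n), np.zeros((n, n))])     #  a*x + b <= vmax
    A4 = np.column_stack([-x, -np.ones(n), np.zeros((n, n))])   # -a*x - b <= 0
    A_ub = np.vstack([A1, A2, A3, A4])
    b_ub = np.concatenate([r, -r, np.full(n, vmax), np.zeros(n)])
    bounds = [(None, None), (None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method='highs')
    a, b = res.x[0], res.x[1]
    return np.clip(a * test.astype(float) + b, 0.0, vmax)

    The L1 objective is what gives robustness to defect pixels: a few large residuals influence the fit far less than they would in a least-squares formulation.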

    Edge-preserving sectional image reconstruction in optical scanning holography

    Optical scanning holography (OSH) enables us to capture the three-dimensional information of an object, and a post-processing step known as sectional image reconstruction allows us to view its two-dimensional cross-sections. Previous methods often produce reconstructed images with blurry edges. In this paper, we argue that the hologram's two-dimensional Fourier transform maps onto a semi-spherical surface in the three-dimensional frequency domain of the object, a relationship akin to the Fourier diffraction theorem used in diffraction tomography. The sectional image reconstruction task is thus an ill-posed inverse problem, and here we make use of total variation regularization with a nonnegativity constraint and solve it with a gradient projection algorithm. Both simulated and experimental holograms are used to verify that edge-preserving reconstruction is achieved, and the axial distance between sections is reduced compared with previous regularization methods. © 2010 Optical Society of America.
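
    A minimal sketch of the optimization ingredient, under assumptions: the forward operator A (and its adjoint At) modeling hologram formation is supplied by the caller, the total variation is smoothed so that a plain projected gradient step can be taken, and the step size is fixed; the actual reconstruction algorithm in the paper may be organized differently.

import numpy as np

def tv_grad(x, eps=1e-3):
    """Gradient of a smoothed (isotropic) total variation of a 2-D image."""
    dx = np.diff(x, axis=1, append=x[:, -1:])
    dy = np.diff(x, axis=0, append=x[-1:, :])
    mag = np.sqrt(dx ** 2 + dy ** 2 + eps ** 2)
    px, py = dx / mag, dy / mag
    # Negative divergence of the normalized gradient field (boundaries simplified).
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

def reconstruct(A, At, y, shape, lam=0.05, step=1e-3, iters=200):
    """Minimize ||A(x) - y||^2 + lam * TV(x) subject to x >= 0."""
    x = np.zeros(shape)
    for _ in range(iters):
        grad = 2.0 * At(A(x) - y) + lam * tv_grad(x)
        x = np.maximum(x - step * grad, 0.0)   # gradient step, then projection
    return x

    The projection onto the nonnegative orthant is just an elementwise maximum with zero, which keeps the sectional images physically meaningful, while the TV term discourages the blurry edges mentioned above.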

    Hyperspectral reconstruction in biomedical imaging using terahertz systems

    Terahertz time-domain spectroscopy (THz-TDS) is an emerging modality for biomedical imaging. It is non-ionizing and can detect differences in water content and tissue density, but the detectors are rather expensive and the scan time tends to be long. Recently, it has been shown that compressed sensing theory can lead to a radical redesign of the imaging system with lower detector cost and shorter scan time, in exchange for computation in the image reconstruction. We show in this paper that it is in fact possible to make use of the multi-frequency nature of the terahertz pulse to achieve hyperspectral reconstruction. Through effective use of spatial sparsity, spectroscopic phase information, and correlations across the hyperspectral bands, our method can significantly improve the reconstructed image quality. This is demonstrated using a set of experimental THz data captured with a single-pixel terahertz system. © 2010 IEEE.
    The IEEE International Symposium on Circuits and Systems: Nano-Bio Circuit Fabrics and Systems (ISCAS 2010), Paris, France, 30 May-2 June 2010. In Proceedings of ISCAS, 2010, p. 2079-208
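
    A rough sketch of the reconstruction idea (not the paper's algorithm; the measurement matrix Phi, the step size, and the regularization weight are assumptions): each band is measured through the same single-pixel patterns, and a proximal gradient loop with a group shrinkage across bands loosely models the cross-band correlation mentioned above.

import numpy as np

def group_soft_threshold(X, tau):
    """Shrink each pixel's spectral vector (a row of X) toward zero."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return X * scale

def joint_cs_reconstruct(Phi, Y, lam=0.1, iters=300):
    """Phi: (M, N) measurement matrix; Y: (M, K) measurements for K bands."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2     # 1/L with L the Lipschitz constant
    X = np.zeros((Phi.shape[1], Y.shape[1]))     # N pixels by K bands
    for _ in range(iters):
        grad = Phi.T @ (Phi @ X - Y)             # gradient of 0.5*||Phi X - Y||_F^2
        X = group_soft_threshold(X - step * grad, lam * step)
    return X

    Replacing the plain per-band l1 shrinkage of single-band compressed sensing with the group version above is a simple way to express that the bands tend to be sparse in the same places, one of the correlations the abstract alludes to.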

    Nebulous hotspot and algorithm variability in computation lithography

    Computation lithography relies on algorithms. However, these algorithms exhibit variability that can be as much as 5% (one standard deviation) of the critical dimension for the 65-nm technology. Using hotspot analysis and fixing as an example, we argue that such variability can be addressed at the algorithm level by controlling and eliminating its root causes, and at the application level by setting specifications that are commensurate with both the limitations of the algorithms and the goals of the application. © 2010 Society of Photo-Optical Instrumentation Engineers.
