
    Automatic Denoising and Unmixing in Hyperspectral Image Processing

    This thesis addresses two important aspects of hyperspectral image processing: automatic hyperspectral image denoising and unmixing. The first part of this thesis is devoted to a novel automatic optimized vector bilateral filter denoising algorithm, while the remainder concerns nonnegative matrix factorization with deterministic annealing for unsupervised unmixing of remote sensing hyperspectral images. The need for automatic hyperspectral image processing has been driven by the development of powerful hyperspectral systems, with hundreds of narrow contiguous bands spanning the visible to the long wave infrared range of the electromagnetic spectrum. Because of the large volume of raw data generated by such sensors, automatic processing is preferred throughout the hyperspectral processing chain to minimize human workload and achieve optimal results. Two of the most heavily researched steps in this chain are hyperspectral image denoising, an important preprocessing step for almost all remote sensing tasks, and unsupervised unmixing, which decomposes the pixel spectra into a collection of endmember spectral signatures and their corresponding abundance fractions. Two new methodologies are introduced in this thesis to tackle these automatic processing problems.

    Vector bilateral filtering has been shown to provide a good tradeoff between noise removal and edge degradation when applied to multispectral/hyperspectral image denoising. It has also been demonstrated to provide dynamic range enhancement of bands with impaired signal-to-noise ratios. Typical vector bilateral filtering usage does not employ parameters chosen to satisfy optimality criteria. This thesis introduces an approach for selecting the parameters of a vector bilateral filter through an optimization procedure rather than by ad hoc means. The approach poses the filtering problem as one of nonlinear estimation and minimizes Stein's unbiased risk estimate (SURE) of this nonlinear estimator. Along the way, the thesis provides a plausibility argument, with an analytical example, as to why vector bilateral filtering outperforms band-wise 2D bilateral filtering in enhancing SNR. Experimental results show that the optimized vector bilateral filter provides improved denoising performance on multispectral images compared to several other approaches.

    The non-negative matrix factorization (NMF) technique and its extensions were developed to find part-based, linear representations of non-negative multivariate data. They have been shown to provide more interpretable results, with realistic non-negativity constraints, in unsupervised learning applications such as hyperspectral imagery unmixing, image feature extraction, and data mining. This thesis extends the NMF method by incorporating a deterministic annealing optimization procedure, which helps address the non-convexity of NMF and provides a better choice of sparseness constraint. The approach replaces the difficult non-convex optimization problem of NMF with an easier one by adding an auxiliary convex entropy constraint term and solving this first. Experimental results on hyperspectral unmixing show that the proposed technique provides improved unmixing performance compared to other state-of-the-art methods.
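
    As a concrete illustration of the first ingredient, the sketch below implements a plain vector bilateral filter in NumPy with hand-picked parameters; it is a minimal sketch of the filter itself, not of the thesis's SURE-based parameter optimization, and the function name and default values are illustrative assumptions.

```python
# Minimal sketch of a vector bilateral filter for a hyperspectral cube,
# assuming fixed (non-optimized) parameters sigma_s and sigma_r; the thesis
# instead selects these by minimizing Stein's unbiased risk estimate (SURE).
import numpy as np

def vector_bilateral_filter(cube, sigma_s=2.0, sigma_r=0.1, radius=3):
    """cube: (H, W, B) array; returns a denoised (H, W, B) array."""
    H, W, B = cube.shape
    out = np.zeros_like(cube)
    # Precompute the spatial Gaussian kernel over the window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    padded = np.pad(cube, ((radius, radius), (radius, radius), (0, 0)),
                    mode='reflect')
    for i in range(H):
        for j in range(W):
            window = padded[i:i + 2*radius + 1, j:j + 2*radius + 1, :]
            # The range weight uses the full spectral vector distance, which
            # couples the bands -- the key difference from band-wise 2D
            # bilateral filtering that the thesis's plausibility argument uses.
            diff = window - cube[i, j, :]
            range_w = np.exp(-np.sum(diff**2, axis=2) / (2.0 * sigma_r**2))
            w = spatial * range_w
            out[i, j, :] = np.tensordot(w, window, axes=([0, 1], [0, 1])) / w.sum()
    return out
```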

    Denoising of 3D magnetic resonance images using non-local PCA and Transform-Domain Filter

    The Magnetic Resonance Imaging (MRI) technology used in clinical diagnosis demands a high Peak Signal-to-Noise Ratio (PSNR) and improved resolution for accurate analysis and treatment monitoring. However, MRI data are often corrupted by random noise which degrades the quality of Magnetic Resonance (MR) images. Denoising is a paramount challenge, as removing noise reduces the fine details of MRI images. We have developed a novel algorithm which employs Principal Component Analysis (PCA) decomposition and Wiener filtering. We propose a two-stage approach: in the first stage, non-local PCA thresholding is applied to the noisy image, and the second stage applies a Wiener filter to this filtered image. Our algorithm is implemented in MATLAB and performance is measured via PSNR. The proposed approach has also been compared with related state-of-the-art methods. Moreover, we present both qualitative and quantitative results which show that the proposed algorithm gives superior denoising performance.
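
    For orientation, here is a much-simplified Python sketch of the two-stage idea: stage 1 uses global patch-PCA rank truncation as a crude stand-in for the paper's non-local PCA thresholding, and stage 2 applies SciPy's Wiener filter. All function names and parameters are illustrative assumptions (the paper's own implementation is in MATLAB).

```python
# Simplified two-stage sketch: stage 1 applies PCA rank truncation over image
# patches (global, not the paper's non-local grouping), and stage 2 applies a
# Wiener filter to the stage-1 result.
import numpy as np
from scipy.signal import wiener

def pca_threshold_stage(img, patch=8, keep_ratio=0.25):
    H, W = img.shape
    H2, W2 = H - H % patch, W - W % patch
    # Collect non-overlapping patches as row vectors.
    blocks = img[:H2, :W2].reshape(H2 // patch, patch, W2 // patch, patch)
    X = blocks.transpose(0, 2, 1, 3).reshape(-1, patch * patch)
    mean = X.mean(axis=0)
    # PCA via SVD; discard low-variance components, which mostly carry noise.
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    k = max(1, int(keep_ratio * len(s)))
    Xd = (U[:, :k] * s[:k]) @ Vt[:k] + mean
    out = img.astype(float).copy()
    out[:H2, :W2] = (Xd.reshape(H2 // patch, W2 // patch, patch, patch)
                       .transpose(0, 2, 1, 3).reshape(H2, W2))
    return out

def two_stage_denoise(img):
    stage1 = pca_threshold_stage(img)
    return wiener(stage1, mysize=5)   # stage 2: local Wiener filtering
```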

    A nonlinear Stein based estimator for multichannel image denoising

    The use of multicomponent images has become widespread with the improvement of multisensor systems having increased spatial and spectral resolutions. However, the observed images are often corrupted by an additive Gaussian noise. In this paper, we are interested in multichannel image denoising based on a multiscale representation of the images. A multivariate statistical approach is adopted to take into account both the spatial and the inter-component correlations existing between the different wavelet subbands. More precisely, we propose a new parametric nonlinear estimator which generalizes many reported denoising methods. The derivation of the optimal parameters is achieved by applying Stein's principle in the multivariate case. Experiments performed on multispectral remote sensing images clearly indicate that our method outperforms conventional wavelet denoising techniques.
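
    Stein's principle is easiest to see in the scalar case. The sketch below selects a soft threshold for wavelet coefficients by minimizing the classical SURE expression for soft thresholding (the SureShrink rule), which the paper generalizes to a multivariate, multichannel estimator; the function itself is an illustrative assumption, not the paper's estimator.

```python
# Scalar illustration of Stein's principle: choose the soft threshold that
# minimizes Stein's unbiased risk estimate (SURE) for i.i.d. Gaussian noise.
import numpy as np

def sure_soft_threshold(coeffs, sigma):
    """Return the soft threshold minimizing SURE over the observed values."""
    x = np.abs(coeffs.ravel())
    n = x.size
    candidates = np.sort(x)  # the minimizer lies at one of the |x_i|
    risks = []
    for t in candidates:
        # SURE for soft thresholding:
        #   n*sigma^2 - 2*sigma^2 * #{i : |x_i| <= t} + sum_i min(x_i^2, t^2)
        risk = (n * sigma**2
                - 2 * sigma**2 * np.count_nonzero(x <= t)
                + np.sum(np.minimum(x, t)**2))
        risks.append(risk)
    return candidates[int(np.argmin(risks))]
```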

    InSPECtor: an end-to-end design framework for compressive pixelated hyperspectral instruments

    Classic designs of hyperspectral instrumentation densely sample the spatial and spectral information of the scene of interest, and the data may be compressed after acquisition. In this paper we introduce a framework for the design of an optimized, micro-patterned snapshot hyperspectral imager that acquires an optimized subset of the spatial and spectral information in the scene. The data are thereby compressed already at the sensor level, but can be restored to the full hyperspectral data cube by the jointly optimized reconstructor. The framework is implemented in TensorFlow and uses its automatic differentiation for the joint optimization of the layout of the micro-patterned filter array as well as the reconstructor. We explore the achievable compression ratio for different numbers of filter passbands, numbers of scanning frames, and filter layouts using data collected by the Hyperscout instrument. We show resulting instrument designs that take snapshot measurements without losing significant information while reducing the data volume, acquisition time, or detector space by a factor of 40 compared to classic, dense sampling. The joint optimization of a compressive hyperspectral imager design and the accompanying reconstructor provides an avenue to substantially reduce the data volume from hyperspectral imagers.
    Comment: 23 pages, 12 figures, published in Applied Optics
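
    The joint-optimization idea can be sketched in a few lines of TensorFlow: a trainable, tiled spectral filter layer plays the role of the micro-patterned filter array, a small CNN plays the role of the reconstructor, and both are trained end-to-end under a reconstruction loss via automatic differentiation. Band count, tile size, and network shapes below are assumptions for illustration, not the InSPECtor implementation.

```python
# Illustrative joint sensing/reconstruction sketch: a trainable per-pixel
# spectral filter compresses the cube to one measurement per pixel; a CNN
# reconstructor restores the cube; both are optimized together.
import tensorflow as tf

BANDS = 100   # assumed number of spectral bands
TILE = 4      # assumed micro-pattern tile size (repeats across the sensor)

class FilterArray(tf.keras.layers.Layer):
    def build(self, input_shape):
        # One trainable transmission spectrum per pixel of a TILE x TILE tile.
        self.logits = self.add_weight(shape=(TILE, TILE, BANDS),
                                      initializer="random_normal")
    def call(self, cube):  # cube: (batch, H, W, BANDS)
        h, w = cube.shape[1], cube.shape[2]
        t = tf.nn.softmax(self.logits, axis=-1)       # nonnegative filters
        t = tf.tile(t, [h // TILE, w // TILE, 1])     # repeat the micro-pattern
        return tf.reduce_sum(cube * t, axis=-1, keepdims=True)  # 1 value/pixel

def build_model(h=32, w=32):
    inp = tf.keras.Input((h, w, BANDS))
    meas = FilterArray()(inp)
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(meas)
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    rec = tf.keras.layers.Conv2D(BANDS, 3, padding="same")(x)  # reconstructor
    model = tf.keras.Model(inp, rec)
    model.compile(optimizer="adam", loss="mse")  # joint end-to-end training
    return model
```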

    Poisson noise reduction with non-local PCA

    Photon-limited imaging arises when the number of photons collected by a sensor array is small relative to the number of detector elements. Photon limitations are an important concern for many applications such as spectral imaging, night vision, nuclear medicine, and astronomy. Typically a Poisson distribution is used to model these observations, and the inherent heteroscedasticity of the data combined with standard noise removal methods yields significant artifacts. This paper introduces a novel denoising algorithm for photon-limited images which combines elements of dictionary learning and sparse patch-based representations of images. The method employs both an adaptation of Principal Component Analysis (PCA) for Poisson noise and recently developed sparsity-regularized convex optimization algorithms for photon-limited images. A comprehensive empirical evaluation of the proposed method helps characterize the performance of this approach relative to other state-of-the-art denoising methods. The results reveal that, despite its conceptual simplicity, Poisson PCA-based denoising appears to be highly competitive in very low light regimes.
    Comment: Erratum: the image "man" is wrongly named "pepper" in the journal version
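
    The paper adapts PCA itself to Poisson statistics; as a deliberately simpler, clearly swapped-in stand-in, the sketch below stabilizes the variance with the Anscombe transform, denoises patches with plain PCA truncation, and applies the naive algebraic inverse. Everything here, including the patch size and the number of retained components, is an illustrative assumption.

```python
# Anscombe + patch-PCA stand-in for Poisson denoising (not the paper's method).
import numpy as np

def anscombe(x):
    # Variance-stabilizing transform: Poisson counts -> approx. unit-variance Gaussian.
    return 2.0 * np.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    # Naive algebraic inverse; exact unbiased inverses exist but are omitted here.
    return (y / 2.0)**2 - 3.0 / 8.0

def denoise_poisson(img, patch=8, keep=16):
    y = anscombe(img.astype(float))
    H, W = y.shape
    H2, W2 = H - H % patch, W - W % patch
    X = (y[:H2, :W2].reshape(H2 // patch, patch, W2 // patch, patch)
          .transpose(0, 2, 1, 3).reshape(-1, patch * patch))
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    Xd = (U[:, :keep] * s[:keep]) @ Vt[:keep] + mean   # keep top components
    y[:H2, :W2] = (Xd.reshape(H2 // patch, W2 // patch, patch, patch)
                     .transpose(0, 2, 1, 3).reshape(H2, W2))
    return inverse_anscombe(y)
```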

    A Comprehensive Survey of Deep Learning in Remote Sensing: Theories, Tools and Challenges for the Community

    Full text link
    In recent years, deep learning (DL), a re-branding of neural networks (NNs), has risen to the top in numerous areas, such as computer vision (CV), speech recognition, and natural language processing. Whereas remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, RS inevitably draws from many of the same theories as CV, e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should be aware of, if not at the leading edge of, advancements like DL. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent new developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modelling physical phenomena, (iii) Big Data, (iv) non-traditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimizing the DL.
    Comment: 64 pages, 411 references. To appear in Journal of Applied Remote Sensing