
    Multisensor Concealed Weapon Detection Using the Image Fusion Approach

    Detection of concealed weapons is an increasingly important problem for both the military and law enforcement, as global terrorism and crime have grown as threats over the years. This work presents two image fusion algorithms, one at the pixel level and one at the feature level, for efficient concealed weapon detection. Both algorithms are based on the double-density dual-tree complex wavelet transform (DDDTCWT). In the pixel-level fusion scheme, the fusion of low-frequency band coefficients is determined by local contrast, while the high-frequency band fusion rule is developed with consideration of both the texture features of the human visual system (HVS) and local energy. In the feature-level fusion algorithm, features are extracted using a Gaussian mixture model (GMM)-based multiscale segmentation approach, and the fusion rules are developed based on region activity measurement. Experimental results demonstrate the robustness and efficiency of the proposed algorithms.
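    The abstract describes fusion rules applied in a DDDTCWT subband domain. As a rough illustration of that idea (not the authors' implementation), the sketch below applies analogous rules with an ordinary discrete wavelet transform from PyWavelets: approximation coefficients are selected by a simple local-contrast measure and detail coefficients by local energy. The DDDTCWT itself and the HVS texture weighting are omitted, and the window size and inputs are placeholders.

```python
# Pixel-level fusion sketch: local-contrast rule for the approximation band,
# local-energy rule for the detail bands. Uses a plain DWT (pywt) as a stand-in
# for the DDDTCWT described in the paper; parameters are illustrative only.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def local_energy(band, size=5):
    """Mean of squared coefficients over a size x size neighbourhood."""
    return uniform_filter(band ** 2, size=size)

def local_contrast(band, size=5):
    """Deviation of each coefficient from its local mean (a crude contrast proxy)."""
    mean = uniform_filter(band, size=size)
    return np.abs(band - mean)

def fuse_pixel_level(img_a, img_b, wavelet="db2", levels=3):
    ca = pywt.wavedec2(img_a.astype(float), wavelet, level=levels)
    cb = pywt.wavedec2(img_b.astype(float), wavelet, level=levels)

    # Approximation (low-frequency) band: keep the coefficient with higher local contrast.
    mask = local_contrast(ca[0]) >= local_contrast(cb[0])
    fused = [np.where(mask, ca[0], cb[0])]

    # Detail (high-frequency) bands: keep the coefficient with higher local energy.
    for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
        fused.append(tuple(
            np.where(local_energy(a) >= local_energy(b), a, b)
            for a, b in ((ha, hb), (va, vb), (da, db))
        ))
    return pywt.waverec2(fused, wavelet)
```

    In a concealed-weapon setting the two inputs would typically be a co-registered visual and millimetre-wave (or infrared) image of the same scene and size.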

    Remote Sensing Monitoring System of Land Coverage Change in Mining Area

    Remote sensing images make it convenient to obtain a panoramic view of land coverage distribution across a large geographic area. This article proposes a remote sensing monitoring system for land coverage change in mining areas: a complex information system built on a spatial database that manages multi-source heterogeneous data. The system structure, functions and development strategy are studied. Remote sensing image fusion and classification are the key technologies in the system; an image fusion method based on the multi-band wavelet transform is discussed, and a chaos immune algorithm is proposed to improve the accuracy of land coverage classification from remote sensing images. The results show that the system can integrate multi-source heterogeneous spatial data, including remote sensing images, vector data and related attribute data, into a unified whole, and that it also supports graphical visualization and composite analysis.
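    The fusion step mentioned here is commonly used to sharpen multispectral bands with a higher-resolution panchromatic image. A minimal, generic sketch of such substitution-style wavelet fusion (not the article's exact multi-band wavelet method) is shown below; pywt is used and the input names are placeholders.

```python
# Wavelet-based pan-sharpening sketch: for each multispectral band, keep its
# approximation coefficients and substitute the detail coefficients of the
# higher-resolution panchromatic image. A generic stand-in for the article's
# multi-band wavelet fusion, not its exact algorithm.
import numpy as np
import pywt

def wavelet_pan_sharpen(ms_band, pan, wavelet="haar", levels=2):
    """ms_band and pan must be co-registered 2-D arrays of the same shape."""
    ms_coeffs = pywt.wavedec2(ms_band.astype(float), wavelet, level=levels)
    pan_coeffs = pywt.wavedec2(pan.astype(float), wavelet, level=levels)

    # Low-frequency content (spectral information) comes from the MS band,
    # high-frequency detail (spatial structure) from the panchromatic image.
    fused = [ms_coeffs[0]] + pan_coeffs[1:]
    return pywt.waverec2(fused, wavelet)

def sharpen_stack(ms_stack, pan):
    """Sharpen every band of a (rows, cols, bands) multispectral stack."""
    return np.dstack([wavelet_pan_sharpen(ms_stack[..., b], pan)
                      for b in range(ms_stack.shape[-1])])
```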

    Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode-mixing and mode-misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or same-indexed IMFs from multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales across multiple channels, thus enabling their comparison at the pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including principal component analysis (PCA), the discrete wavelet transform (DWT) and the non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis-testing approach on our large image dataset to identify statistically significant performance differences.
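    As an illustration of the fusion stage only (not the paper's implementation), the sketch below assumes that a multivariate EMD routine has already decomposed each co-registered input image into the same number of scale-aligned IMFs; that hypothetical output is the `imfs_per_image` argument. Same-index IMFs are then combined pixel-wise with local-variance weights, and the fused IMFs are summed to reconstruct the result.

```python
# Fusion of aligned IMFs (e.g. from a MEMD decomposition performed beforehand):
# same-index IMFs from all inputs are combined pixel-wise, weighted by local
# variance, then summed across scales to reconstruct the fused image.
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(img, size=7):
    """Local variance over a size x size neighbourhood (clipped at zero)."""
    mean = uniform_filter(img, size=size)
    return np.maximum(uniform_filter(img ** 2, size=size) - mean ** 2, 0.0)

def fuse_aligned_imfs(imfs_per_image, size=7, eps=1e-12):
    """imfs_per_image: list of arrays, one per input image, each of shape
    (n_imfs, rows, cols), where IMF index k describes the same scale in every image."""
    n_imfs = imfs_per_image[0].shape[0]
    fused = np.zeros(imfs_per_image[0].shape[1:], dtype=float)
    for k in range(n_imfs):
        scales = [imfs[k].astype(float) for imfs in imfs_per_image]
        weights = [local_variance(s, size) + eps for s in scales]
        total = sum(weights)
        # Weighted average at this scale: pixels with more local activity dominate.
        fused += sum(w * s for w, s in zip(weights, scales)) / total
    return fused
```

    For multi-focus or multi-exposure inputs, this local-variance weighting favours, at each scale and pixel, whichever image carries the most detail there; the paper's own rules and quality evaluation are more elaborate.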