210 research outputs found

    A Research and Strategy of Remote Sensing Image Denoising Algorithms

    Most raw data downloaded from satellites is useless, resulting in wasted transmission. One solution is to process the data directly on the satellites and then transmit only the processed results to the ground. Image processing is the main form of on-board data processing, and in this paper we focus on image denoising, a basic image processing task. Many high-performance denoising approaches exist at present; however, most of them rely on advanced computing resources or an abundance of images available on the ground. Considering the limited computing resources of satellites and the characteristics of remote sensing images, we study these high-performance ground-based image denoising approaches and compare them in simulation experiments to analyze whether they are suitable for satellites. Based on the analysis results, we propose two feasible image denoising strategies for satellites, built around the TianZhi-1 satellite.
    Comment: 9 pages, 4 figures, ICNC-FSKD 201
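    The kind of suitability comparison described above can be illustrated with a minimal sketch: time a few representative ground denoisers on a synthetic noisy tile and report runtime alongside PSNR. The methods, noise level, and test image below are illustrative assumptions, not taken from the paper.

    import time

    import numpy as np
    from scipy.ndimage import median_filter
    from skimage import data, img_as_float
    from skimage.metrics import peak_signal_noise_ratio
    from skimage.restoration import denoise_nl_means, denoise_wavelet

    rng = np.random.default_rng(0)
    clean = img_as_float(data.camera())  # stand-in for a remote sensing tile
    noisy = np.clip(clean + rng.normal(0, 0.08, clean.shape), 0, 1)

    candidates = {
        "median_3x3": lambda x: median_filter(x, size=3),   # cheap
        "wavelet":    lambda x: denoise_wavelet(x),         # moderate
        "nl_means":   lambda x: denoise_nl_means(           # expensive
            x, patch_size=5, patch_distance=6, h=0.08),
    }

    for name, fn in candidates.items():
        t0 = time.perf_counter()
        restored = fn(noisy)
        elapsed = time.perf_counter() - t0
        psnr = peak_signal_noise_ratio(clean, restored, data_range=1.0)
        print(f"{name:10s}  {elapsed * 1e3:7.1f} ms  PSNR {psnr:.2f} dB")

    On a satellite, the runtime column (weighed against the available compute budget) matters as much as the PSNR column when judging suitability.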

    Patch-based anisotropic diffusion scheme for fluorescence diffuse optical tomography-part 1: technical principles

    Fluorescence diffuse optical tomography (fDOT) provides 3D images of fluorescence distributions in biological tissue, which represent molecular and cellular processes. The image reconstruction problem is highly ill-posed and requires regularisation techniques to stabilise and find meaningful solutions. Quadratic regularisation tends to either oversmooth or generate very noisy reconstructions, depending on the regularisation strength. Edge-preserving methods, such as anisotropic diffusion regularisation (AD), can preserve important features in the fluorescence image and smooth out noise. However, AD has limited ability to distinguish an edge from noise. In this two-part paper, we propose a patch-based anisotropic diffusion regularisation (PAD), where regularisation strength is determined by a weighted average according to the similarity between patches around voxels within a search window, instead of a simple local neighbourhood strategy. However, this method has higher computational complexity and, hence, we wavelet-compress the patches (PAD-WT) to speed it up, while simultaneously taking advantage of the denoising properties of wavelet thresholding. The proposed method thus combines the nonlocal means (NLM), AD, and wavelet shrinkage image processing methods. Therefore, in this first paper, we use a denoising test problem to analyse the performance of the new method. Our results show that the proposed PAD-WT method provides better results than the AD or NLM methods alone. The efficacy of the method for the fDOT image reconstruction problem is evaluated in part 2.
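    A rough 2D sketch of the core PAD-WT ingredient (the paper works on 3D fDOT volumes): compare wavelet-compressed patches inside a search window and turn the resulting NLM-style weights into a per-pixel regularisation strength. The patch and window sizes, the wavelet, and the threshold are assumptions for illustration.

    import numpy as np
    import pywt

    def compress_patch(patch, wavelet="haar", thresh=0.1):
        """Wavelet-compress a patch: soft-threshold its detail coefficients."""
        approx, details = pywt.wavedec2(patch, wavelet, level=1)
        details = tuple(pywt.threshold(d, thresh, mode="soft") for d in details)
        # Flatten the retained coefficients into a feature vector.
        return np.concatenate([approx.ravel()] + [d.ravel() for d in details])

    def pad_weights(img, patch=5, window=11, h=0.15):
        """NLM-style per-pixel regularisation strength from patch similarity."""
        r, w = patch // 2, window // 2
        padded = np.pad(img, r + w, mode="reflect")
        strength = np.zeros(img.shape, dtype=float)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                ci, cj = i + r + w, j + r + w
                ref = compress_patch(padded[ci - r:ci + r + 1, cj - r:cj + r + 1])
                weights = []
                for di in range(-w, w + 1):
                    for dj in range(-w, w + 1):
                        ni, nj = ci + di, cj + dj
                        cand = compress_patch(
                            padded[ni - r:ni + r + 1, nj - r:nj + r + 1])
                        weights.append(np.exp(-np.sum((ref - cand) ** 2) / h ** 2))
                # High mean similarity -> smooth region -> diffuse strongly;
                # low mean similarity -> probable edge -> diffuse weakly.
                strength[i, j] = np.mean(weights)
        return strength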

    Thoracic low-dose CT image processing using an artifact suppressed large-scale nonlocal means.

    The x-ray exposure of patients has become a major concern in computed tomography (CT), and minimizing the radiation exposure has been one of the major efforts in the CT field. Because of the many high-attenuation tissues in the human chest, under low-dose scan protocols thoracic low-dose CT (LDCT) images tend to be severely degraded by excessive mottled noise and non-stationary streak artifacts. Their removal is a challenging task because the streak artifacts, with their directional prominence, are often hard to discriminate from the attenuation information of normal tissues. This paper describes a two-step processing scheme called 'artifact suppressed large-scale nonlocal means' for suppressing both noise and artifacts in thoracic LDCT images. Specific scale and direction properties were exploited to discriminate the noise and artifacts from image structures. A parallel implementation was introduced to speed up the whole processing by more than 100 times. Phantom and patient CT images were both acquired for evaluation purposes. Comparative qualitative and quantitative analyses were performed, allowing conclusions on the efficacy of our method in improving thoracic LDCT data.
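    Not the authors' implementation, but a sketch of how a "large-scale" NLM (large search window) can be parallelised by denoising overlapping image tiles in a process pool; the tile size, overlap, and filter parameters are assumptions.

    from concurrent.futures import ProcessPoolExecutor

    import numpy as np
    from skimage.restoration import denoise_nl_means

    TILE, OVERLAP = 128, 24  # overlap exceeds patch_distance + patch radius

    def _denoise_tile(args):
        tile, sigma = args
        # Large search window ("large-scale" NLM): patch_distance=15.
        return denoise_nl_means(tile, patch_size=5, patch_distance=15,
                                h=1.2 * sigma, sigma=sigma, fast_mode=True)

    def parallel_nlm(img, sigma, workers=8):
        """Denoise a 2D float image by tiling it across a process pool."""
        h_img, w_img = img.shape
        jobs, coords = [], []
        for y in range(0, h_img, TILE):
            for x in range(0, w_img, TILE):
                y0, x0 = max(y - OVERLAP, 0), max(x - OVERLAP, 0)
                y1 = min(y + TILE + OVERLAP, h_img)
                x1 = min(x + TILE + OVERLAP, w_img)
                jobs.append((img[y0:y1, x0:x1], sigma))
                coords.append((y, x, y - y0, x - x0))
        out = np.empty_like(img)
        # Note: call from under `if __name__ == "__main__":` on platforms
        # that spawn worker processes.
        with ProcessPoolExecutor(max_workers=workers) as pool:
            for (y, x, oy, ox), tile in zip(coords, pool.map(_denoise_tile, jobs)):
                th, tw = min(TILE, h_img - y), min(TILE, w_img - x)
                out[y:y + th, x:x + tw] = tile[oy:oy + th, ox:ox + tw]
        return out

    The overlap must exceed the NLM influence radius so that pixels near tile seams see the same neighbourhoods they would in the full image.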

    Image Restoration for Remote Sensing: Overview and Toolbox

    Remote sensing provides valuable information about objects or areas from a distance in either active (e.g., RADAR and LiDAR) or passive (e.g., multispectral and hyperspectral) modes. The quality of data acquired by remotely sensed imaging sensors (both active and passive) is often degraded by a variety of noise types and artifacts. Image restoration, a vibrant field of research in the remote sensing community, is the task of recovering the true unknown image from the degraded observed image. Each imaging sensor induces unique noise types and artifacts into the observed image, which has led restoration techniques to expand along different paths according to each sensor type. This review paper brings together the advances of image restoration techniques, with a particular focus on synthetic aperture radar and hyperspectral images as the most active sub-fields of image restoration in the remote sensing community. We therefore provide a comprehensive, discipline-specific starting point for researchers at different levels (i.e., students, researchers, and senior researchers) who wish to investigate the vibrant topic of data restoration, supplying sufficient detail and references. Additionally, this review paper is accompanied by a toolbox that provides a platform to encourage interested students and researchers in the field to further explore the restoration techniques and fast-forward the community. The toolboxes are provided at https://github.com/ImageRestorationToolbox.
    Comment: This paper is under review in GRS
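    The restoration task described here is usually posed through the degradation model y = H(x) + n, where H is the sensor/blur operator and n is noise, and the goal is to recover x from the observed y. A minimal sketch (not from the paper's toolbox) simulating that model with a box-blur H and inverting it with an off-the-shelf Wiener-type restorer; the PSF and noise level are assumptions.

    import numpy as np
    from scipy.signal import convolve2d
    from skimage import data, img_as_float
    from skimage.restoration import unsupervised_wiener

    rng = np.random.default_rng(0)
    x = img_as_float(data.camera())      # true image

    psf = np.ones((5, 5)) / 25.0         # H: a simple box blur
    y = convolve2d(x, psf, mode="same", boundary="symm")
    y += rng.normal(0, 0.01, y.shape)    # n: additive sensor noise

    x_hat, _ = unsupervised_wiener(y, psf)   # estimate of the true image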

    A Comparison of Image Denoising Methods

    The advancement of imaging devices and the countless images generated every day pose an increasingly high demand on image denoising, which still remains a challenging task in terms of both effectiveness and efficiency. To improve denoising quality, numerous denoising techniques and approaches have been proposed in the past decades, including different transforms, regularization terms, algebraic representations and, especially, advanced deep neural network (DNN) architectures. Despite their sophistication, many methods may fail to achieve desirable results for simultaneous noise removal and fine detail preservation. In this paper, to investigate the applicability of existing denoising techniques, we compare a variety of denoising methods on both synthetic and real-world datasets for different applications. We also introduce a new dataset for benchmarking, and the evaluations are performed from four different perspectives, including quantitative metrics, visual effects, human ratings, and computational cost. Our experiments demonstrate: (i) the effectiveness and efficiency of representative traditional denoisers for various denoising tasks, (ii) that a simple matrix-based algorithm may produce results similar to those of its tensor counterparts, and (iii) the notable achievements of DNN models, which exhibit impressive generalization ability and show state-of-the-art performance on various datasets. In spite of the progress in recent years, we discuss shortcomings and possible extensions of existing techniques. Datasets, code and results are made publicly available and will be continuously updated at https://github.com/ZhaomingKong/Denoising-Comparison.
    Comment: In this paper, we intend to collect and compare various denoising methods to investigate their effectiveness, efficiency, applicability and generalization ability with both synthetic and real-world experiments
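    An illustrative skeleton (not the paper's released code) of the quantitative-metrics side of such a comparison: run several representative traditional denoisers over a range of synthetic noise levels and score them with PSNR and SSIM. The methods and noise levels are assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage import data, img_as_float
    from skimage.metrics import peak_signal_noise_ratio, structural_similarity
    from skimage.restoration import denoise_tv_chambolle, denoise_wavelet

    rng = np.random.default_rng(0)
    clean = img_as_float(data.camera())

    methods = {
        "gaussian": lambda y, s: gaussian_filter(y, sigma=1.0),
        "tv":       lambda y, s: denoise_tv_chambolle(y, weight=s),
        "wavelet":  lambda y, s: denoise_wavelet(y, sigma=s),
    }

    for sigma in (0.05, 0.10, 0.20):
        noisy = np.clip(clean + rng.normal(0, sigma, clean.shape), 0, 1)
        for name, fn in methods.items():
            out = fn(noisy, sigma)
            psnr = peak_signal_noise_ratio(clean, out, data_range=1.0)
            ssim = structural_similarity(clean, out, data_range=1.0)
            print(f"sigma={sigma:.2f}  {name:8s}  PSNR {psnr:5.2f}  SSIM {ssim:.3f}")

    Real-world benchmarks replace the synthetic noisy/clean pairs with captured noisy images and their pseudo ground truths, and the same loop additionally records runtime for the computational-cost perspective.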