
    Optimally Stabilized PET Image Denoising Using Trilateral Filtering

    Low resolution and signal-dependent noise in positron emission tomography (PET) images make denoising an inevitable step prior to qualitative and quantitative image analysis. Conventional PET denoising methods either over-smooth small structures due to resolution limitations or make incorrect assumptions about the noise characteristics; as a result, clinically important quantitative information may be corrupted. To address these challenges, we introduced a novel approach for removing signal-dependent noise in PET images, modeling the noise distribution as a Poisson-Gaussian mixture. The generalized Anscombe transformation (GAT) was used to stabilize the varying nature of the PET noise. Beyond noise stabilization, it is also desirable for the noise-removal filter to preserve structure boundaries while smoothing noisy regions, so that quantitative information such as standard uptake value (SUV)-based metrics and metabolic lesion volume is not significantly degraded. To satisfy all these properties, we extended the bilateral filtering method into trilateral filtering through a multiscaling and optimal Gaussianization process. The proposed method was tested on more than 50 PET-CT images from patients with different cancers and achieved superior performance compared with widely used denoising techniques in the literature.
    Comment: 8 pages, 3 figures; to appear in the Lecture Notes in Computer Science (MICCAI 2014)
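    The variance-stabilization step described in the abstract can be sketched as follows. This is a minimal illustration of the generalized Anscombe transformation for a Poisson-Gaussian noise model z = gain·Poisson(y) + N(mu, sigma²), not the paper's implementation; the parameter names are illustrative, and the inverse shown is the simple algebraic (biased) one.

```python
import numpy as np

def generalized_anscombe(z, gain=1.0, sigma=0.1, mu=0.0):
    """Generalized Anscombe transform: maps Poisson-Gaussian noise to
    approximately unit-variance Gaussian noise, so a Gaussian denoiser
    (e.g. the trilateral filter) can be applied in the transformed domain."""
    arg = gain * z + (3.0 / 8.0) * gain**2 + sigma**2 - gain * mu
    return (2.0 / gain) * np.sqrt(np.maximum(arg, 0.0))

def inverse_anscombe(t, gain=1.0, sigma=0.1, mu=0.0):
    """Algebraic inverse of the transform above. Exact unbiased inverses
    exist and perform better at low counts, but are more involved."""
    return ((t * gain / 2.0) ** 2
            - (3.0 / 8.0) * gain**2 - sigma**2 + gain * mu) / gain
```

    A typical pipeline would apply the forward transform, denoise in the stabilized domain, then invert the transform to recover quantitative intensities.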

    Hierarchical Fusion Based Deep Learning Framework for Lung Nodule Classification

    Lung cancer is the leading cause of cancer mortality in both men and women. Computer-aided detection (CAD) and diagnosis systems can play a very important role in helping physicians with cancer treatment. This dissertation proposes a CAD framework that utilizes a hierarchical fusion-based deep learning model for the detection of nodules from stacks of 2D images. In the proposed hierarchical approach, a decision is made at each level individually, employing the decisions from the previous level. Further, individual decisions are computed for several perspectives of a volume of interest (VOI). This study explores three different approaches to obtaining decisions in a hierarchical fashion: the first model utilizes the raw images; the second uses a single type of feature image capturing salient content; and the last employs multiple types of feature images. All models learn their parameters by supervised learning. In addition, this dissertation proposes a new Trilateral Filter to extract the salient content of 2D images; this filter adds a second, anisotropic Laplacian kernel alongside the bilateral filter's range kernel. The proposed CAD frameworks are tested on lung CT scans from the LIDC/IDRI database. The experimental results show that the proposed multi-perspective hierarchical fusion approach significantly improves classification performance.
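    The filter construction described above — a bilateral filter's spatial and range kernels plus a third kernel built on a Laplacian response — can be sketched roughly as below. This is a simplified illustration under stated assumptions, not the dissertation's filter: it uses an isotropic Laplacian (the dissertation's kernel is anisotropic), and the parameter names are hypothetical.

```python
import numpy as np
from scipy.ndimage import laplace

def trilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1, sigma_l=0.1):
    """Bilateral-style filter with a third kernel on the Laplacian
    response, so weights favor neighbors with similar position,
    intensity, AND local second-derivative structure."""
    lap = laplace(img)                       # isotropic Laplacian (simplification)
    pad = np.pad(img, radius, mode='reflect')
    lap_pad = np.pad(lap, radius, mode='reflect')
    out = np.zeros_like(img)
    H, W = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))  # fixed spatial kernel
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            lpatch = lap_pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            w = (spatial
                 * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))    # range kernel
                 * np.exp(-(lpatch - lap[i, j])**2 / (2 * sigma_l**2)))  # Laplacian kernel
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out
```

    The extra Laplacian kernel down-weights neighbors that sit on a different side of an edge or ridge than the center pixel, which is what lets the filter emphasize salient content rather than only intensity similarity.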

    Deep Graph Laplacian Regularization for Robust Denoising of Real Images

    Recent developments in deep learning have revolutionized the paradigm of image restoration. However, its application to real image denoising remains limited, due to its sensitivity to training data and the complex nature of real image noise. In this work, we combine the robustness of model-based approaches with the learning power of data-driven approaches for real image denoising. Specifically, by integrating graph Laplacian regularization as a trainable module into a deep learning framework, we are less susceptible to overfitting than pure CNN-based approaches, achieving higher robustness on small datasets and in cross-domain denoising. First, a sparse neighborhood graph is built from the output of a convolutional neural network (CNN). The image is then restored by solving an unconstrained quadratic programming problem, using the corresponding graph Laplacian regularizer as a prior term. The proposed restoration pipeline is fully differentiable and can therefore be trained end-to-end. Experimental results demonstrate that our method is less prone to overfitting given small training data. It also shows strong cross-domain generalization power, outperforming state-of-the-art approaches by a remarkable margin.
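    The restoration step described above has a simple closed form. With a noisy image y, graph Laplacian L, and regularization weight μ, the unconstrained quadratic program x* = argmin_x ‖x − y‖² + μ xᵀLx is solved by the linear system (I + μL)x = y. The sketch below illustrates this solve under the assumption that the graph is given directly; in the paper, the edge weights come from CNN features.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def graph_laplacian_denoise(y, edges, weights, mu=1.0):
    """Restore signal y by solving (I + mu*L) x = y, the closed-form
    minimizer of ||x - y||^2 + mu * x^T L x. `edges` is a list of
    (i, j) pixel-index pairs; `weights` are the similarity weights
    (produced by a CNN in the paper, supplied directly here)."""
    n = y.size
    rows, cols = zip(*edges)
    W = sp.coo_matrix((weights, (rows, cols)), shape=(n, n))
    W = W + W.T                                           # symmetrize adjacency
    deg = np.asarray(W.sum(axis=1)).ravel()
    L = sp.diags(deg) - W                                 # combinatorial Laplacian
    x = spsolve((sp.eye(n) + mu * L).tocsc(), y.ravel())  # quadratic program solve
    return x.reshape(y.shape)
```

    Because the solve is a linear system in y and in the graph weights, gradients can flow through it, which is what makes the pipeline end-to-end trainable.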