33 research outputs found

    Advanced imaging and data mining technologies for medical and food safety applications

    As one of the fastest-developing research areas, biological imaging and image analysis have received increasing attention and are already widely applied in many scientific fields, including medical diagnosis and food safety inspection. This research focuses on advanced imaging and pattern recognition technologies in both medical and food safety applications: 1) noise reduction of ultra-low-dose multi-slice helical CT imaging for early lung cancer screening, and 2) automated discrimination between walnut shell and meat under hyperspectral fluorescence imaging. In the medical imaging and diagnosis area, X-ray computed tomography (CT) has been used to screen large populations for early lung cancer detection over the last decade, and attention has increasingly turned to low-dose and even ultra-low-dose X-ray CT. However, reducing CT radiation exposure inevitably increases the noise level in the sinogram, thereby degrading the quality of reconstructed CT images. How to reduce the noise level in low-dose CT images is therefore a meaningful topic. In this research, a nonparametric smoothing method combining block-based thin-plate smoothing splines with a roughness penalty was introduced to restore ultra-low-dose helical CT raw data acquired under a 120 kVp / 10 mAs protocol. An objective thorax image quality evaluation was first conducted to assess the image quality and noise level of the proposed method. A web-based subjective evaluation system was also built for a total of 23 radiologists to compare the proposed approach with a traditional sinogram restoration method. Both the objective and subjective evaluation studies showed the effectiveness of the proposed thin-plate-based nonparametric regression method for sinogram restoration in multi-slice helical ultra-low-dose CT.
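The block-based thin-plate spline formulation is not reproduced here, but its core idea, fitting noisy raw data under a roughness penalty, can be sketched in one dimension. The snippet below is a hypothetical stand-in using a discrete second-difference penalty (a 1-D analogue, not the authors' 2-D thin-plate method):

```python
import numpy as np

def penalized_smooth(y, lam=10.0):
    """Minimize ||x - y||^2 + lam * ||D2 x||^2, where D2 is the
    second-difference operator (a discrete 1-D roughness penalty)."""
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)   # (n-2, n) second-difference matrix
    A = np.eye(n) + lam * D2.T @ D2        # normal equations of the penalized fit
    return np.linalg.solve(A, y)

# Noisy linear "detector row": the smoother should pull the data
# back toward the underlying clean signal.
rng = np.random.default_rng(0)
clean = np.linspace(0.0, 1.0, 50)
noisy = clean + rng.normal(0.0, 0.1, 50)
restored = penalized_smooth(noisy, lam=50.0)
```

Increasing `lam` trades fidelity to the raw measurements for smoothness, which is the same tradeoff the roughness penalty controls in the sinogram-restoration setting.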
In the food quality inspection area, automated discrimination between walnut shell and meat has become an imperative task in the U.S. walnut postharvest processing industry. This research developed two hyperspectral fluorescence imaging based approaches capable of differentiating small walnut shell fragments from meat. First, a principal component analysis (PCA) and Gaussian mixture model (PCA-GMM) based Bayesian classification method was introduced. PCA was used to extract features, and the optimal number of PCA components was selected by cross-validation. The PCA-GMM-based Bayesian classifier was then applied to differentiate walnut shell and meat according to the class-conditional probability and the prior estimated by the Gaussian mixture model. The experimental results showed the effectiveness of this PCA-GMM approach, achieving an overall 98.2% recognition rate. Second, a Gaussian-kernel Support Vector Machine (SVM) was presented for walnut shell and meat discrimination in the hyperspectral fluorescence imagery. The SVM seeks a mapping from the original input space to a high-dimensional feature space such that input data that are not linearly separable in the original space become separable in the mapped space, thereby fulfilling the classification between walnut shell and meat. An overall recognition rate of 98.7% was achieved by this method. Although hyperspectral fluorescence imaging is capable of differentiating between walnut shell and meat, one persistent problem is how to handle the huge amount of data acquired by the hyperspectral imaging system and thereby improve the efficiency of the application system. To address this problem, an Independent Component Analysis with k-Nearest Neighbor classifier (ICA-kNN) approach was presented in this research to reduce the data redundancy without sacrificing too much classification performance.
An overall 90.6% detection rate was achieved with 10 optimal wavelengths, which constituted only 13% of the total acquired hyperspectral image data. To further evaluate the proposed method, the classification results of the ICA-kNN approach were also compared to those of the kNN classifier alone. The experimental results showed that the ICA-kNN method with fewer wavelengths matched the performance of the kNN classifier alone using information from all 79 wavelengths. This demonstrates the effectiveness of the proposed ICA-kNN method for hyperspectral band selection in walnut shell and meat classification.
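The PCA-plus-Bayesian pipeline can be illustrated on synthetic data. The sketch below uses hypothetical 20-band "spectra" and a single Gaussian per class (a one-component simplification of the GMM in the study); classification is by maximum class-conditional log-likelihood with equal priors:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical synthetic "spectra" standing in for shell / meat pixels:
# the two classes differ in their mean spectral profile.
n, d = 200, 20
shell = rng.normal(0.0, 1.0, (n, d)) + np.linspace(0, 3, d)
meat = rng.normal(0.0, 1.0, (n, d)) - np.linspace(0, 3, d)
X = np.vstack([shell, meat])
y = np.array([0] * n + [1] * n)

# PCA feature extraction via SVD on the centered data; keep 3 components.
mu = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:3].T

def fit(Zc):
    """Fit one Gaussian per class (simplified stand-in for the GMM)."""
    m = Zc.mean(axis=0)
    C = np.cov(Zc.T) + 1e-6 * np.eye(Zc.shape[1])  # regularized covariance
    return m, np.linalg.inv(C), np.linalg.slogdet(C)[1]

def loglik(Z, params):
    """Gaussian class-conditional log-likelihood (constants dropped)."""
    m, Cinv, logdet = params
    r = Z - m
    return -0.5 * (r @ Cinv * r).sum(axis=1) - 0.5 * logdet

p0, p1 = fit(Z[y == 0]), fit(Z[y == 1])
pred = (loglik(Z, p1) > loglik(Z, p0)).astype(int)
accuracy = (pred == y).mean()
```

With well-separated classes the projected features are nearly linearly separable, so the Bayesian decision rule recovers almost all labels; on the real hyperspectral data the reported figure was 98.2%.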

    Comparison of Quadratic- and Median-Based Roughness Penalties for Penalized-Likelihood Sinogram Restoration in Computed Tomography

    We compared the performance of two different penalty choices for a penalized-likelihood sinogram-restoration strategy we have been developing. One is a quadratic penalty we have employed previously; the other is a new median-based penalty. We compared both approaches to a noniterative adaptive filter that loosely, but not explicitly, models data statistics. We found that the two penalties produced similar resolution-variance tradeoffs and that both outperformed the adaptive filter in the low-dose regime, which suggests that the particular choice of penalty in our approach may be less important than the fact that we explicitly model data statistics at all. Since the quadratic penalty allows derivation of an algorithm that is guaranteed to monotonically increase the penalized-likelihood objective function, we find it preferable to the median-based penalty.
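The two penalty styles can be contrasted on a toy 1-D "sinogram row". These are illustrative functionals, not the paper's exact definitions: a quadratic penalty sums squared neighbor differences, while a median-based penalty charges each bin for deviating from its local median, so a smooth ramp is free under the median penalty but not under the quadratic one:

```python
import numpy as np

def quadratic_penalty(x, beta=1.0):
    """Sum of squared differences between neighbouring bins."""
    return beta * np.sum(np.diff(x) ** 2)

def median_penalty(x, beta=1.0, w=3):
    """Squared deviation of each bin from the median of its local window
    (an illustrative median-based roughness measure)."""
    pad = w // 2
    xp = np.pad(x, pad, mode='edge')
    med = np.array([np.median(xp[i:i + w]) for i in range(len(x))])
    return beta * np.sum((x - med) ** 2)

# A smooth ramp is its own local median, so only the quadratic penalty
# charges it; an isolated spike is punished by both.
smooth = np.linspace(0.0, 1.0, 20)
spike = smooth.copy()
spike[10] += 5.0
```

This difference is why median-style penalties are often advocated for edge preservation, while the quadratic penalty's smooth objective is what enables the monotone algorithm mentioned above.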

    Sinogram Restoration for Low-Dosed X-Ray Computed Tomography Using Fractional-Order Perona-Malik Diffusion

    Existing integer-order Nonlinear Anisotropic Diffusion (NAD) used for noise suppression produces undesirable staircase or speckle effects. In this paper, we propose a new scheme, named Fractional-order Perona-Malik Diffusion (FPMD), which replaces the integer-order derivative of Perona-Malik (PM) diffusion with a Grünwald-Letnikov (G-L) fractional-order derivative. FPMD, which interpolates between integer-order NAD and fourth-order partial differential equations, provides a more flexible way to balance noise reduction against the preservation of anatomical detail. Smoothing results for phantoms and real sinograms show that FPMD with suitable parameters can suppress the staircase and speckle effects efficiently. In addition, FPMD also performs well in terms of visual quality and root mean square error (RMSE).
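The G-L coefficients and a diffusion step can be sketched in one dimension. This is a simplified illustration, not the paper's 2-D scheme: the usual G-L recursion generates the weights, and a Perona-Malik-style explicit step is driven by the resulting fractional gradient (for alpha = 1 it reduces to ordinary PM diffusion):

```python
import numpy as np

def gl_coeffs(alpha, K):
    """Grunwald-Letnikov weights: w_0 = 1, w_k = w_{k-1} * (k - 1 - alpha) / k.
    For alpha = 1 they reduce to the first-difference stencil [1, -1, 0, ...]."""
    w = np.empty(K)
    w[0] = 1.0
    for k in range(1, K):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def gl_derivative(x, alpha, K=8):
    """Truncated left-sided G-L fractional derivative of a 1-D signal
    (edge padding at the left boundary)."""
    w = gl_coeffs(alpha, K)
    xp = np.pad(x, (K - 1, 0), mode='edge')
    return np.convolve(xp, w)[K - 1:K - 1 + len(x)]

def fpmd_step(x, alpha=1.0, kappa=1.0, dt=0.2):
    """One explicit diffusion step where the Perona-Malik conductance is
    fed the fractional-order gradient instead of the integer-order one."""
    g = gl_derivative(x, alpha)
    c = 1.0 / (1.0 + (g / kappa) ** 2)           # PM edge-stopping function
    flux = c * g
    return x + dt * (np.roll(flux, -1) - flux)   # discrete divergence of the flux

# Diffusing a noisy flat signal should reduce its variance.
rng = np.random.default_rng(3)
noisy = 1.0 + rng.normal(0.0, 0.2, 100)
x = noisy.copy()
for _ in range(10):
    x = fpmd_step(x, alpha=1.0)
```

Varying `alpha` between 1 and 2 moves the operator between second-order PM diffusion and fourth-order-like behavior, which is the flexibility the abstract refers to.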

    Radiation dose reduction strategies for intraoperative guidance and navigation using CT

    The advent of 64-slice computed tomography (CT) with high-speed scanning makes CT a highly attractive and powerful tool for navigating image-guided procedures. Interactive navigation requires scanning over extended time periods, or even continuously. However, continuous CT is likely to expose the patient and the physician to potentially unsafe radiation levels. Before CT can be used appropriately for navigational purposes, the dose problem must be solved. Simple dose reduction is not adequate, because it degrades image quality. This study proposes two dose reduction strategies: the first is a statistical approach that represents the stochastic nature of noisy projection data at low doses to lessen image degradation, and the second is the modeling of local image deformations in a continuous scan. Taking advantage of modern CT scanners and specialized hardware, it may be possible to perform continuous CT scanning at acceptable radiation doses for intraoperative navigation.
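The stochastic nature of low-dose projection data can be illustrated with the standard Beer-Lambert plus Poisson counting model (a generic simulation, not this study's specific statistical method): lowering the incident photon fluence I0 directly inflates the noise in the log-transformed projections.

```python
import numpy as np

rng = np.random.default_rng(4)

# Beer-Lambert with Poisson counting noise: fewer incident photons (I0)
# means noisier log-transformed projection data.
line_integral = np.full(1000, 2.0)          # a constant attenuation path

def noisy_projection(I0):
    counts = rng.poisson(I0 * np.exp(-line_integral))
    counts = np.maximum(counts, 1)          # guard the log against zero counts
    return -np.log(counts / I0)

p_standard = noisy_projection(I0=100000.0)  # standard-dose photon fluence
p_low = noisy_projection(I0=1000.0)         # low-dose fluence
```

Both simulated sinograms are unbiased around the true line integral of 2.0, but the low-dose one has roughly ten times the standard deviation, which is the degradation a statistical model must counteract.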

    Streak Detection in Mottled and Noisy Images

    This thesis describes an algorithm for detecting streaks in printed images using adaptive window-based image projections and maximization of mutual information. To this end, projections are computed across the entire image at different window sizes. The traces collected from the projections are correlated by maximizing mutual information to pinpoint streak locations and widths with a peak detection algorithm. Finally, for a given peak, the window size is changed adaptively to identify the intensity and length of the corresponding streak while maximizing the signal-to-noise ratio. Results on synthetic and real-life images demonstrate the effectiveness of the proposed technique.
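The projection stage is the heart of the method: averaging down the columns suppresses mottle by roughly the square root of the image height, making a faint streak stand out in the 1-D trace. The sketch below covers only that projection-and-thresholding stage on a synthetic image (the mutual-information correlation and adaptive windowing of the full algorithm are omitted):

```python
import numpy as np

rng = np.random.default_rng(5)

# Mottled page with a faint dark vertical streak (synthetic example).
img = rng.normal(128.0, 10.0, (200, 300))
img[:, 140:143] -= 15.0                     # 3-pixel-wide streak

# Project down the columns; a streak appears as a dip in the 1-D trace.
trace = img.mean(axis=0)

# Averaging 200 rows shrinks the per-column noise sigma by sqrt(200),
# so even a 1.5-sigma streak becomes a many-sigma dip in the trace.
sigma = 10.0 / np.sqrt(img.shape[0])        # known here because data is synthetic
dips = np.where(trace < trace.mean() - 4.0 * sigma)[0]
```

On real images the noise level would have to be estimated from the trace itself, and the fixed 4-sigma threshold is replaced by the thesis's peak detection over multiple window sizes.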

    Statistical image reconstruction for quantitative computed tomography

    Statistical iterative reconstruction (SIR) algorithms for x-ray computed tomography (CT) have the potential to reconstruct images with less noise and systematic error than the conventional filtered backprojection (FBP) algorithm. More accurate reconstruction algorithms are important for reducing imaging dose and for a wide range of quantitative CT applications. The work presented herein investigates some potential advantages of one such statistically motivated algorithm called Alternating Minimization (AM). A simulation study is used to compare the tradeoff between noise and resolution in images reconstructed with the AM and FBP algorithms. The AM algorithm is employed with an edge-preserving penalty function, which is shown to result in images with contrast-dependent resolution. The AM algorithm always reconstructed images with less image noise than the FBP algorithm. Compared to previous studies in the literature, this is the first work to clearly illustrate that the reported noise advantage when using edge-preserving penalty functions can be highly dependent on the contrast of the object used for quantifying resolution. A polyenergetic version of the AM algorithm, which incorporates knowledge of the scanner’s x-ray spectrum, is then commissioned from data acquired on a commercially available CT scanner. Homogeneous cylinders are used to assess the absolute accuracy of the polyenergetic AM algorithm and to compare systematic errors to conventional FBP reconstruction. Methods to estimate the x-ray spectrum, model the bowtie filter and measure scattered radiation are outlined which support AM reconstruction to within 0.5% of the expected ground truth. The polyenergetic AM algorithm reconstructs the cylinders with less systematic error than FBP, in terms of better image uniformity and less object-size dependence. 
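The AM algorithm itself is not reproduced here, but the flavor of statistical iterative reconstruction can be shown with the classic MLEM update for a Poisson likelihood (a representative SIR algorithm, not the authors' AM, and with a toy system matrix):

```python
import numpy as np

# A toy 3-pixel object and a 4-ray system matrix (illustrative numbers).
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([2.0, 1.0, 3.0])
b = A @ x_true                               # noiseless projection data

# MLEM update for a Poisson likelihood: x <- x * A^T(b / Ax) / A^T 1.
# Each multiplicative iteration keeps the image nonnegative and never
# decreases the Poisson log-likelihood.
x = np.ones(3)
for _ in range(500):
    x *= (A.T @ (b / (A @ x))) / A.sum(axis=0)
```

Penalized variants such as AM add a regularization term to this likelihood, which is where the edge-preserving penalty and its contrast-dependent resolution enter.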
Finally, the accuracy of a post-processing dual-energy CT (pDECT) method for non-invasively measuring a material’s photon cross-section information is investigated. Data are acquired on a commercial scanner for materials of known composition. Since the pDECT method has been shown to be highly sensitive to reconstructed image errors, both FBP and polyenergetic AM reconstruction are employed. Linear attenuation coefficients are estimated with residual errors of around 1% for energies of 30 keV to 1 MeV, with errors rising to 3%-6% at lower energies down to 10 keV. In the ideal phantom geometry used here, the main advantage of AM reconstruction is less random cross-section uncertainty due to its improved noise performance.
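A standard building block behind dual-energy cross-section estimation is two-material basis decomposition: the attenuation coefficient is modeled as mu(E) = a1*f1(E) + a2*f2(E), so measurements at two energies determine the coefficients via a 2x2 linear solve, after which mu can be evaluated at any energy. The numbers below are illustrative, not tabulated cross sections, and this is a generic sketch rather than the thesis's pDECT procedure:

```python
import numpy as np

# Basis-function values at the two measured energies (illustrative numbers).
f = np.array([[0.30, 0.05],                 # [f1(E_low),  f2(E_low)]
              [0.20, 0.02]])                # [f1(E_high), f2(E_high)]
a_true = np.array([1.2, 3.5])               # unknown material coefficients
mu_meas = f @ a_true                        # "measured" attenuation at E_low, E_high

# Solve the 2x2 system for the basis coefficients.
a_est = np.linalg.solve(f, mu_meas)

# Extrapolate attenuation to a third, unmeasured energy.
f_new = np.array([0.25, 0.03])              # basis values at the new energy
mu_new = f_new @ a_est
```

Because the solve inverts a small, often ill-conditioned system, noise and bias in the reconstructed mu values are amplified, which is why the thesis emphasizes accurate reconstruction upstream of the decomposition.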