
    Monte Carlo Simulation for Polychromatic X-ray Fluorescence Computed Tomography with Sheet-Beam Geometry

    Sheet-beam X-ray fluorescence computed tomography (XFCT) can greatly reduce the time needed to acquire a full set of projections when a synchrotron source is used; however, synchrotron sources are clearly impractical for most biomedical research laboratories. In this paper, polychromatic XFCT with sheet-beam geometry is tested by Monte Carlo simulation. First, two phantoms (A and B) filled with PMMA are used to simulate the imaging process in GEANT4. Phantom A contains several GNP-loaded regions of the same size (10 mm in height and diameter) but different Au weight concentrations ranging from 0.3% to 1.8%. Phantom B contains twelve GNP-loaded regions with the same Au weight concentration (1.6%) but diameters ranging from 1 mm to 9 mm. Second, a discretized representation of the imaging model is established to reconstruct more accurate XFCT images. Third, XFCT images of phantoms A and B are reconstructed by filtered backprojection (FBP) and maximum likelihood expectation maximization (MLEM), with and without correction, respectively. The contrast-to-noise ratio (CNR) is calculated to evaluate all the reconstructed images. Our results show that a sheet-beam XFCT system based on a polychromatic X-ray source is feasible and that the discretized imaging model can be used to reconstruct more accurate images.
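    The abstract does not reproduce the discretized imaging model, the MLEM update, or the CNR definition; the block below is a minimal sketch using standard textbook forms, assuming the usual linear forward model. The symbols (a_ij, x, y, ROI, BKG) are generic placeholders, not notation taken from the paper.

```latex
% Generic discretized forward model: expected fluorescence counts \bar{y}_i in
% detector bin i as a linear function of the voxel-wise gold content x_j.
\begin{equation}
  \bar{y}_i = \sum_{j=1}^{N} a_{ij}\, x_j , \qquad i = 1,\dots,M .
\end{equation}

% Standard MLEM update for Poisson-distributed counts.
\begin{equation}
  x_j^{(k+1)} = \frac{x_j^{(k)}}{\sum_{i} a_{ij}}
  \sum_{i} a_{ij}\, \frac{y_i}{\sum_{j'} a_{ij'}\, x_{j'}^{(k)}} .
\end{equation}

% Commonly used contrast-to-noise ratio, with ROI a GNP-loaded region
% and BKG the surrounding background.
\begin{equation}
  \mathrm{CNR} = \frac{\left| \mu_{\mathrm{ROI}} - \mu_{\mathrm{BKG}} \right|}{\sigma_{\mathrm{BKG}}} .
\end{equation}
```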

    Optimization of duty cycles for LED-based indoor positioning system


    A Compressed Sensing Algorithm for Sparse-View Pinhole Single Photon Emission Computed Tomography

    Single Photon Emission Computed Tomography (SPECT) systems are being developed with multiple cameras and without gantry rotation to provide rapid dynamic acquisitions. However, the resulting data are angularly undersampled due to the limited number of views. We propose a novel reconstruction algorithm for sparse-view SPECT based on Compressed Sensing (CS) theory. The algorithm models Poisson noise by modifying the Iterative Hard Thresholding algorithm to minimize the Kullback-Leibler (KL) distance by gradient descent. Because the objects underlying SPECT images are expected to be smooth, a discrete wavelet transform (DWT) with an orthogonal spline wavelet kernel is used as the sparsifying transform. Preliminary feasibility of the algorithm was tested on simulated data of a phantom consisting of two Gaussian distributions. Single-pinhole projection data with Poisson noise were simulated at 128, 60, 15, 10, and 5 views over 360 degrees. Image quality was assessed using the coefficient of variation and the relative contrast between the two objects in the phantom. Overall, the results demonstrate preliminary feasibility of the proposed CS algorithm for sparse-view SPECT imaging.
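    The sketch below illustrates the kind of iteration the abstract describes: a gradient step on the Kullback-Leibler objective for Poisson data followed by hard thresholding of wavelet coefficients. The random toy projector, the Daubechies wavelet (standing in for the orthogonal spline kernel), and every parameter value are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np
import pywt  # PyWavelets, used here only as a generic sparsifying transform

def kl_iht_reconstruct(A, y, shape, k_keep=50, step=1e-4, n_iter=200,
                       wavelet="db2", level=2):
    """Sketch of an IHT-style solver: KL gradient step + wavelet hard thresholding."""
    x = np.ones(A.shape[1])                 # strictly positive initial estimate
    for _ in range(n_iter):
        Ax = A @ x + 1e-12                  # forward projection (avoid division by zero)
        grad = A.T @ (1.0 - y / Ax)         # gradient of the KL distance for Poisson data
        x = np.clip(x - step * grad, 0.0, None)

        # Hard-threshold in the wavelet domain: keep only the k_keep largest coefficients.
        coeffs = pywt.wavedec2(x.reshape(shape), wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        thr = np.sort(np.abs(arr).ravel())[-k_keep]
        arr[np.abs(arr) < thr] = 0.0
        coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
        x = np.clip(pywt.waverec2(coeffs, wavelet), 0.0, None)
        x = x[:shape[0], :shape[1]].ravel() # guard against off-by-one padding
    return x.reshape(shape)

# Toy example: a phantom with two Gaussian blobs and a random stand-in projector.
rng = np.random.default_rng(0)
shape = (32, 32)
yy, xx = np.meshgrid(np.arange(32), np.arange(32), indexing="ij")
phantom = np.exp(-((xx - 10) ** 2 + (yy - 10) ** 2) / 20.0) \
        + np.exp(-((xx - 22) ** 2 + (yy - 22) ** 2) / 20.0)
A = rng.uniform(size=(300, phantom.size))    # stand-in for sparse-view pinhole projections
y = rng.poisson(A @ phantom.ravel() * 10.0)  # Poisson-noisy measurements
recon = kl_iht_reconstruct(A, y, shape)
```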

    Novel muon imaging techniques

    Owing to the high penetrating power of high-energy cosmic-ray muons, muon imaging techniques can be used to image large, bulky objects, especially objects with heavy shielding. Muon imaging systems work much like CT scanners in medical imaging: they can reveal information about the interior of a target. There are two forms of muon imaging: muon absorption imaging and muon multiple-scattering imaging. The former is based on the flux attenuation of muons, and the latter is based on the multiple scattering of muons in matter. The muon absorption imaging technique is capable of imaging very large objects such as volcanoes and large buildings, as well as smaller objects like spent fuel casks; the muon multiple-scattering imaging technique is best suited to inspecting smaller objects such as nuclear waste containers. Muon imaging can be applied in a broad variety of fields, from measuring the magma thickness of volcanoes to searching for hidden cavities in pyramids, and from monitoring national borders for special nuclear materials to monitoring spent fuel casks for nuclear safeguards applications. In this paper, the principles of muon imaging are reviewed. Image reconstruction algorithms such as Filtered Back Projection and Maximum Likelihood Expectation Maximization are discussed. The capability of muon imaging techniques is demonstrated through a Geant4 simulation study of imaging a nuclear spent fuel cask.
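    As a concrete illustration of the Filtered Back Projection step mentioned above, the short sketch below reconstructs a toy attenuation map from simulated parallel-beam projections. The scikit-image radon/iradon operators, the 180-degree parallel-beam geometry, and the phantom itself are assumptions made for illustration; they are not the muon-transport simulation used in the paper.

```python
import numpy as np
from skimage.transform import radon, iradon  # parallel-beam forward projection and FBP

# Toy attenuation map: a dense insert (e.g. heavily shielded material) inside a lighter block.
image = np.zeros((128, 128))
image[40:90, 40:90] = 0.2    # surrounding material
image[55:75, 55:75] = 1.0    # dense insert

# Simulate absorption-style projections over 180 degrees, then reconstruct with
# filtered back projection using a ramp filter.
angles = np.linspace(0.0, 180.0, 90, endpoint=False)
sinogram = radon(image, theta=angles)
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
```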

    Image reconstruction in fluorescence molecular tomography with sparsity-initialized maximum-likelihood expectation maximization

    We present a reconstruction method for fluorescence molecular tomography (FMT) that uses maximum-likelihood expectation maximization (MLEM) to model Poisson noise. MLEM is initialized with the output of a sparse reconstruction-based approach, which performs truncated-singular-value-decomposition-based preconditioning followed by the fast iterative shrinkage-thresholding algorithm (FISTA) to enforce sparsity. The motivation for this approach is that sparsity information could be accounted for within the initialization, while MLEM would accurately model the Poisson noise in the FMT system. Simulation experiments show that the proposed method significantly improves images both qualitatively and quantitatively. The method converges more than 20 times faster than uniformly initialized MLEM and is more robust to noise than pure sparse reconstruction. We also theoretically justify the ability of the proposed approach to reduce noise in the background region compared to pure sparse reconstruction. Overall, these results provide strong evidence for modeling Poisson noise in FMT reconstruction and for applying the proposed reconstruction framework to FMT imaging.
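    A compact sketch of the two-stage idea described above: a few FISTA iterations on a least-squares-plus-L1 surrogate produce a sparse, nonnegative initial estimate, which then seeds standard Poisson MLEM iterations. The toy forward matrix, the plain (unpreconditioned) FISTA in place of the authors' TSVD-preconditioned variant, and all parameter values are assumptions made for illustration.

```python
import numpy as np

def fista_init(A, y, lam=0.1, n_iter=50):
    """Sparse initializer: FISTA on 0.5*||Ax - y||^2 + lam*||x||_1 (no preconditioning)."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1]); z = x.copy(); t = 1.0
    for _ in range(n_iter):
        v = z - A.T @ (A @ z - y) / L        # gradient step
        x_new = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # soft threshold
        x_new = np.clip(x_new, 0.0, None)    # keep the estimate physically nonnegative
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def mlem(A, y, x0, n_iter=100):
    """Standard MLEM for Poisson data, started from x0 rather than a uniform image."""
    x = np.clip(x0, 1e-8, None)              # MLEM requires a strictly positive start
    sens = A.sum(axis=0)                     # sensitivity (back-projection of ones)
    for _ in range(n_iter):
        x = x / sens * (A.T @ (y / (A @ x + 1e-12)))
    return x

# Toy FMT-like problem: a sparse fluorophore distribution and a dense sensitivity matrix.
rng = np.random.default_rng(1)
A = np.abs(rng.normal(size=(200, 400)))
x_true = np.zeros(400); x_true[[40, 150, 300]] = [5.0, 3.0, 4.0]
y = rng.poisson(A @ x_true).astype(float)

x_init = fista_init(A, y)   # sparsity-aware initialization
x_rec = mlem(A, y, x_init)  # Poisson-noise-aware refinement
```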

    Processing optimization with parallel computing for the J-PET tomography scanner

    The Jagiellonian-PET (J-PET) collaboration is developing a prototype TOF-PET detector based on long polymer scintillators. This novel approach exploits the excellent timing properties of plastic scintillators, which permit very precise time measurements. The very fast, FPGA-based front-end electronics and data acquisition system, as well as low- and high-level reconstruction algorithms, were developed specifically for the J-PET scanner. TOF-PET data processing and reconstruction are time- and resource-demanding operations, especially in the case of a large-acceptance detector operating in triggerless data acquisition mode. In this article, we discuss the parallel computing methods applied to optimize the data processing for the J-PET detector. We begin with general concepts of parallel computing and then discuss several applications of those techniques in the J-PET data processing.
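    The abstract does not spell out how the processing is parallelized, so the sketch below only illustrates the general pattern of splitting a large stream of recorded signals into chunks and handing them to a pool of worker processes. The toy hit format, the naive coincidence rule, and the use of Python's multiprocessing module are assumptions made for illustration; they do not describe the J-PET software.

```python
import numpy as np
from multiprocessing import Pool

def find_coincidences(hits, window_ns=3.0):
    """Pair consecutive, time-sorted hits on different strips within a coincidence window."""
    hits = sorted(hits)                       # each hit is assumed to be (time_ns, strip_id)
    pairs = []
    for (t1, s1), (t2, s2) in zip(hits, hits[1:]):
        if s1 != s2 and (t2 - t1) <= window_ns:
            pairs.append(((t1, s1), (t2, s2)))
    return pairs

def process_in_parallel(all_hits, n_chunks=8, n_workers=4):
    """Split the hit list into chunks and run the coincidence search on a worker pool.
    Pairs straddling chunk boundaries are ignored in this simplified sketch."""
    chunks = np.array_split(np.asarray(all_hits), n_chunks)
    with Pool(processes=n_workers) as pool:
        results = pool.map(find_coincidences, [chunk.tolist() for chunk in chunks])
    return [pair for chunk_pairs in results for pair in chunk_pairs]

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    times = np.sort(rng.uniform(0.0, 1e6, 100_000))   # toy triggerless hit times (ns)
    strips = rng.integers(0, 192, 100_000)            # toy scintillator strip ids
    hits = list(zip(times.tolist(), strips.tolist()))
    print(f"found {len(process_in_parallel(hits))} candidate coincidences")
```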