
    Pragmatic fully 3D image reconstruction for the MiCES mouse imaging PET scanner

    We present a pragmatic approach to image reconstruction for data from the micro crystal elements system (MiCES) fully 3D mouse imaging positron emission tomography (PET) scanner under construction at the University of Washington. Our approach is modelled on fully 3D image reconstruction used in clinical PET scanners, which is based on Fourier rebinning (FORE) followed by 2D iterative image reconstruction using ordered-subsets expectation-maximization (OSEM). The use of iterative methods allows modelling of physical effects (e.g., statistical noise, detector blurring, attenuation), while FORE accelerates the reconstruction process by reducing the fully 3D data to a stacked set of independent 2D sinograms. Previous investigations have indicated that non-stationary detector point-spread response effects, which are typically ignored for clinical imaging, significantly impact image quality for the MiCES scanner geometry. To model the effect of non-stationary detector blurring (DB) in the FORE+OSEM(DB) algorithm, we have added a factorized system matrix to the ASPIRE reconstruction library. Initial results indicate that the proposed approach produces an improvement in resolution without an undue increase in noise and without a significant increase in the computational burden. The impact on task performance, however, remains to be evaluated. (Peer reviewed; http://deepblue.lib.umich.edu/bitstream/2027.42/48978/2/pmb4_19_008.pd)
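    The FORE+OSEM(DB) pipeline reduces to a fairly compact update once the system matrix is factorized. Below is a minimal, hedged sketch of OSEM with a factorized matrix A = B·G (G a geometric projector, B a sinogram-domain detector blur); the toy dense matrices, subset scheme, and function names are illustrative assumptions, not the ASPIRE library's API.

```python
# Hedged sketch of OSEM with a factorized system matrix A = B @ G,
# loosely following the FORE+OSEM(DB) idea described above.
import numpy as np

def osem_db(y, G, B, n_iters=10, n_subsets=4, eps=1e-12):
    """y: measured 2D sinogram (flattened); G: geometric projector;
    B: sinogram-domain detector-blur matrix. All dense, for illustration."""
    A = B @ G                           # factorized system matrix
    x = np.ones(G.shape[1])             # uniform initial image estimate
    rows = np.arange(A.shape[0])
    for _ in range(n_iters):
        for s in range(n_subsets):
            idx = rows[s::n_subsets]        # interleaved subset of rays
            As = A[idx]
            sens = As.sum(axis=0) + eps     # subset sensitivity A_s^T 1
            ratio = y[idx] / (As @ x + eps) # measured / predicted counts
            x *= (As.T @ ratio) / sens      # multiplicative EM update
    return x
```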

    State of the art: iterative CT reconstruction techniques

    Owing to recent advances in computing power, iterative reconstruction (IR) algorithms have become a clinically viable option in computed tomographic (CT) imaging. Substantial evidence is accumulating about the advantages of IR algorithms over established analytical methods, such as filtered back projection. IR improves image quality through cyclic image processing. Although all available solutions share the common mechanism of artifact reduction and/or potential for radiation dose savings, chiefly due to image noise suppression, the magnitude of these effects depends on the specific IR algorithm. In the first section of this contribution, the technical bases of IR are briefly reviewed and the currently available algorithms released by the major CT manufacturers are described. In the second part, the current status of their clinical implementation is surveyed. Regardless of the applied IR algorithm, the available evidence attests to the substantial potential of IR algorithms for overcoming traditional limitations in CT imaging.
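    As a concrete illustration of the cyclic forward-project/compare/back-project loop that underlies IR, here is a minimal SIRT-style sketch; it stands in for the far more elaborate, proprietary vendor algorithms surveyed above, and the system matrix A and sinogram y are assumed given.

```python
# Minimal SIRT/Landweber-style iterative reconstruction loop:
# repeatedly forward-project, compare with the measurement, and
# back-project the weighted residual. Illustrative only.
import numpy as np

def sirt(A, y, n_iters=50):
    row_sums = A.sum(axis=1) + 1e-12    # per-ray normalization weights
    col_sums = A.sum(axis=0) + 1e-12    # per-pixel normalization weights
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        residual = (y - A @ x) / row_sums   # data mismatch per ray
        x += (A.T @ residual) / col_sums    # weighted back-projection
    return x
```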

    Technical Note: Phantom study to evaluate the dose and image quality effects of a computed tomography Organ-based Tube Current Modulation Technique

    Purpose: This technical note quantifies the dose and image quality performance of a clinically available organ-dose-based tube current modulation (ODM) technique, using experimental and simulation phantom studies. The investigated ODM implementation reduces the tube current for the anterior source positions, without increasing current for posterior positions, although such an approach was also evaluated for comparison. Methods: Axial CT scans at 120 kV were performed on head and chest phantoms on an ODM-equipped scanner (Optima CT660, GE Healthcare, Chalfont St. Giles, England). Dosimeters quantified dose to breast, lung, heart, spine, eye lens, and brain regions for ODM and 3D-modulation (SmartmA) settings. Monte Carlo simulations, validated with experimental data, were performed on 28 voxelized head phantoms and 10 chest phantoms to quantify organ dose and noise standard deviation. The dose and noise effects of increasing the posterior tube current were also investigated. Results: ODM reduced the dose for all experimental dosimeters with respect to SmartmA, with average dose reductions across dosimeters of 31% (breast), 21% (lung), 24% (heart), 6% (spine), 19% (eye lens), and 11% (brain), with similar results for the simulation validation study. In the phantom library study, the average dose reduction across all phantoms was 34% (breast), 20% (lung), 8% (spine), 20% (eye lens), and 8% (brain). ODM increased the noise standard deviation in reconstructed images by 6%–20%, with generally greater noise increases in anterior regions. Increasing the posterior tube current provided similar dose reduction as ODM for breast and eye lens, increased dose to the spine, with noise effects ranging from 2% noise reduction to 16% noise increase. At noise equal to SmartmA, ODM increased the estimated effective dose by 4% and 8% for chest and head scans, respectively. Increasing the posterior tube current further increased the effective dose by 15% (chest) and 18% (head) relative to SmartmA. Conclusions: ODM reduced dose in all experimental and simulation studies over a range of phantoms, while increasing noise. The results suggest a net dose/noise benefit for breast and eye lens for all studied phantoms, negligible lung dose effects for two phantoms, increased lung dose and/or noise for eight phantoms, and increased dose and/or noise for brain and spine for all studied phantoms compared to the reference protocol.
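    The modulation scheme itself is easy to picture: anterior source positions get a reduced current, posterior positions keep the nominal value. The sketch below is an illustrative toy model of that behaviour only; the angular window, reduction factor, and function name are assumptions, not GE's implementation.

```python
# Toy model of organ-dose-based tube current modulation (ODM):
# reduce current over the anterior arc, leave the posterior unchanged.
def odm_current(theta_deg, nominal_mA, reduction=0.4, anterior_half_angle=60.0):
    """theta_deg: source angle, 0 = directly anterior (over the chest)."""
    angle = abs((theta_deg + 180.0) % 360.0 - 180.0)  # distance from anterior, 0-180
    if angle <= anterior_half_angle:        # source over the anterior surface
        return nominal_mA * (1.0 - reduction)   # spare breast / eye lens
    return nominal_mA                        # posterior current unchanged
```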

    Investigation of iterative image reconstruction in three-dimensional optoacoustic tomography

    Iterative image reconstruction algorithms for optoacoustic tomography (OAT), also known as photoacoustic tomography, have the ability to improve image quality over analytic algorithms due to their ability to incorporate accurate models of the imaging physics, instrument response, and measurement noise. However, to date, there have been few reported attempts to employ advanced iterative image reconstruction algorithms for improving image quality in three-dimensional (3D) OAT. In this work, we implement and investigate two iterative image reconstruction methods for use with a 3D OAT small animal imager: namely, a penalized least-squares (PLS) method employing a quadratic smoothness penalty and a PLS method employing a total variation norm penalty. The reconstruction algorithms employ accurate models of the ultrasonic transducer impulse responses. Experimental data sets are employed to compare the performance of the iterative reconstruction algorithms to that of a 3D filtered backprojection (FBP) algorithm. By use of quantitative measures of image quality, we demonstrate that the iterative reconstruction algorithms can mitigate image artifacts and preserve spatial resolution more effectively than FBP algorithms. These features suggest that the use of advanced image reconstruction algorithms can improve the effectiveness of 3D OAT while reducing the amount of data required for biomedical applications.
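    To make the penalized approach concrete, here is a hedged one-dimensional sketch of penalized least-squares with a smoothed total-variation penalty for a linear model y ≈ Hx; plain gradient descent and the 1D difference operator are simplifications, and the transducer impulse-response modelling from the paper is omitted.

```python
# PLS with a smoothed TV penalty, minimized by gradient descent.
# H and y are assumed given; 1D differences keep the sketch short.
import numpy as np

def pls_tv(H, y, beta=0.01, step=1e-3, n_iters=200, eps=1e-6):
    x = np.zeros(H.shape[1])
    for _ in range(n_iters):
        grad_data = H.T @ (H @ x - y)       # least-squares gradient
        d = np.diff(x)                      # neighbour differences
        w = d / np.sqrt(d * d + eps)        # smoothed sign(d), TV derivative
        grad_tv = np.zeros_like(x)
        grad_tv[:-1] -= w                   # adjoint of the forward difference
        grad_tv[1:] += w
        x -= step * (grad_data + beta * grad_tv)
    return x
```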

    A Spectral CT Method to Directly Estimate Basis Material Maps From Experimental Photon-Counting Data

    The proposed spectral CT method solves the constrained one-step spectral CT reconstruction (cOSSCIR) optimization problem to estimate basis material maps while modeling the nonlinear X-ray detection process and enforcing convex constraints on the basis map images. In order to apply the optimization-based reconstruction approach to experimental data, the presented method empirically estimates the effective energy-window spectra using a calibration procedure. The amplitudes of the estimated spectra were further optimized as part of the reconstruction process to reduce ring artifacts. A validation approach was developed to select constraint parameters. The proposed spectral CT method was evaluated through simulations and experiments with a photon-counting detector. Basis material map images were successfully reconstructed using the presented empirical spectral modeling and cOSSCIR optimization approach. In simulations, the cOSSCIR approach accurately reconstructed the basis map images.
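    The nonlinear X-ray detection process that a one-step method such as cOSSCIR must invert can be written compactly: expected counts in each energy window are the effective window spectrum integrated against Beer-Lambert attenuation through the basis-material line integrals. The sketch below encodes that forward model; the spectra and attenuation curves are placeholders, not calibrated data.

```python
# Nonlinear photon-counting forward model:
# counts[r, w] = sum_e S[w, e] * exp(-sum_b M[e, b] * A[r, b])
import numpy as np

def expected_counts(A, S, M):
    """A: (n_rays, n_basis) basis-material line integrals,
       S: (n_windows, n_energies) effective window spectra,
       M: (n_energies, n_basis) basis attenuation vs. energy."""
    T = np.exp(-A @ M.T)      # transmission per ray and energy
    return T @ S.T            # expected counts per ray and window
```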

    Reducing distance errors for standard candles and standard sirens with weak-lensing shear and flexion maps

    Gravitational lensing induces significant errors in the measured distances to high-redshift standard candles and standard sirens such as type-Ia supernovae, gamma-ray bursts, and merging supermassive black hole binaries. There will therefore be a significant benefit from correcting for the lensing error by using independent and accurate estimates of the lensing magnification. We investigate how accurately the magnification can be inferred from convergence maps reconstructed from galaxy shear and flexion data. We employ ray-tracing through the Millennium Simulation to simulate lensing observations in large fields, and perform a weak-lensing reconstruction on these fields. We identify optimal ways to filter the reconstructed convergence maps and to convert them to magnification maps. We find that a shear survey with 100 galaxies/arcmin^2 can help to reduce the lensing-induced distance errors for standard candles/sirens at redshifts z=1.5 (z=5) on average by 20% (10%), whereas a futuristic survey with shear and flexion estimates from 500 galaxies/arcmin^2 yields much larger reductions of 50% (35%). For redshifts z>=3, a further improvement by 5% can be achieved if the individual redshifts of the galaxies are used in the reconstruction. Moreover, the reconstruction allows one to identify regions for which the convergence is low, and in which an error reduction by up to 75% can be achieved. (Comment: 16 pages, 18 figures, submitted to MNRAS, minor changes, references extended, comments welcome.)
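    Converting a reconstructed convergence map into a magnification map uses the standard lensing relation mu = 1/((1 - kappa)^2 - |gamma|^2), and a magnification mu brightens a standard candle by 2.5 log10(mu) magnitudes. A minimal sketch, leaving out the map filtering that the paper optimizes:

```python
# Convergence/shear maps -> magnification map, plus the implied
# distance-modulus correction for a lensed standard candle.
import numpy as np

def magnification(kappa, gamma):
    """mu = 1 / ((1 - kappa)^2 - |gamma|^2); valid outside critical curves.
    gamma may be complex (gamma1 + i*gamma2)."""
    return 1.0 / ((1.0 - kappa) ** 2 - np.abs(gamma) ** 2)

def distance_modulus_correction(mu):
    """A magnified source looks brighter by 2.5 log10(mu) mag, so the
    inferred luminosity distance shrinks by a factor sqrt(mu)."""
    return -2.5 * np.log10(mu)
```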

    Body MRI artifacts in clinical practice: a physicist's and radiologist's perspective.

    The high information content of MRI exams brings with it unintended effects, which we call artifacts. The purpose of this review is to promote understanding of these artifacts, so they can be prevented or properly interpreted to optimize diagnostic effectiveness. We begin by addressing static magnetic field uniformity, which is essential for many techniques, such as fat saturation. Eddy currents, resulting from imperfect gradient pulses, are especially problematic for new techniques that depend on high performance gradient switching. Nonuniformity of the transmit radiofrequency system constitutes another source of artifacts, which are increasingly important as magnetic field strength increases. Defects in the receive portion of the radiofrequency system have become a more complex source of problems as the number of radiofrequency coils, and the sophistication of the analysis of their received signals, has increased. Unwanted signals and noise spikes have many causes, often manifesting as zipper or banding artifacts. These image alterations become particularly severe and complex when they are combined with aliasing effects. Aliasing is one of several phenomena addressed in our final section, on artifacts that derive from encoding the MR signals to produce images, also including those related to parallel imaging, chemical shift, motion, and image subtraction.
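    One of the encoding artifacts above, chemical shift, lends itself to a quick worked example: fat resonates roughly 3.5 ppm below water, so fat signal is displaced along the frequency-encode axis by that offset divided by the receiver bandwidth per pixel. The numbers below are illustrative.

```python
# Back-of-the-envelope chemical-shift displacement, in pixels.
def chemical_shift_pixels(field_T, bw_per_pixel_hz, ppm=3.5):
    larmor_hz = 42.58e6 * field_T        # proton Larmor frequency
    delta_f = ppm * 1e-6 * larmor_hz     # fat-water frequency offset (Hz)
    return delta_f / bw_per_pixel_hz     # displacement in pixels

# At 3 T with 440 Hz/pixel: ~447 Hz offset -> about 1 pixel of shift.
print(chemical_shift_pixels(3.0, 440.0))
```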

    Fast Mojette Transform for Discrete Tomography

    A new algorithm for reconstructing a two-dimensional object from a set of one-dimensional projected views is presented that is both computationally exact and experimentally practical. The algorithm has a computational complexity of O(n log2 n) with n = N^2 for an N×N image, is robust in the presence of noise, and produces none of the artefacts that arise with conventional tomographic methods. The reconstruction process is approximation free because the object is assumed to be discrete and utilizes fully discrete Radon transforms. Noise in the projection data can be suppressed further by introducing redundancy in the reconstruction. The number of projections required for exact reconstruction and the response to noise can be controlled without compromising the digital nature of the algorithm. The digital projections are those of the Mojette Transform, a form of discrete linogram. A simple analytical mapping is developed that compacts these projections exactly into symmetric periodic slices within the Discrete Fourier Transform. A new digital angle set is constructed that allows the periodic slices to completely fill all of the object's Discrete Fourier space. Techniques are proposed to acquire these digital projections experimentally to enable fast and robust two-dimensional reconstructions. (Comment: 22 pages, 13 figures, submitted to Elsevier Signal Processing.)
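    A Mojette "dirac" projection for a direction (p, q) with gcd(p, q) = 1 simply sums image pixels (x, y) into bins indexed by b = -q·x + p·y. The sketch below computes one such projection; the fast Fourier mapping and angle-set construction that make the full algorithm O(n log2 n) are not reproduced.

```python
# One Mojette dirac projection along direction (p, q), gcd(p, q) = 1.
import numpy as np

def mojette_projection(image, p, q):
    rows, cols = image.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    bins = -q * xs + p * ys                  # Mojette bin index per pixel
    bins -= bins.min()                       # shift bins to start at zero
    proj = np.zeros(bins.max() + 1)
    np.add.at(proj, bins.ravel(), image.ravel())  # accumulate pixel sums
    return proj
```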

    Enhanced imaging of microcalcifications in digital breast tomosynthesis through improved image-reconstruction algorithms

    PURPOSE: We develop a practical, iterative algorithm for image reconstruction in under-sampled tomographic systems, such as digital breast tomosynthesis (DBT). METHOD: The algorithm controls image regularity by minimizing the image total p-variation (TpV), a function that reduces to the total variation when p = 1.0 or the image roughness when p = 2.0. Constraints on the image, such as image positivity and estimated projection-data tolerance, are enforced by projection onto convex sets (POCS). The fact that the tomographic system is under-sampled translates to the mathematical property that many widely varied resultant volumes may correspond to a given data tolerance. Thus the application of image regularity serves two purposes: (1) reduction of the number of resultant volumes out of those allowed by fixing the data tolerance, finding the minimum image TpV for fixed data tolerance, and (2) traditional regularization, sacrificing data fidelity for higher image regularity. The present algorithm allows for this dual role of image regularity in under-sampled tomography. RESULTS: The proposed image-reconstruction algorithm is applied to three clinical DBT data sets. The DBT cases include one with microcalcifications and two with masses. CONCLUSION: Results indicate that there may be a substantial advantage in using the present image-reconstruction algorithm for microcalcification imaging. (Comment: Submitted to Medical Physics.)
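    The alternation the abstract describes, descend on the TpV regularizer and project back onto the constraint sets, can be sketched in a few lines. The version below is a hedged 1D toy (gradient data step, positivity clip, normalized TpV descent); step sizes, adaptive scheduling, and the DBT geometry are all omitted or assumed.

```python
# Toy TpV + POCS alternation: data-consistency step, positivity
# projection, then a normalized descent step on the smoothed TpV.
import numpy as np

def tpv_pocs(A, y, p=1.0, step=1e-3, tv_step=0.05, n_iters=100, eps=1e-6):
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x -= step * (A.T @ (A @ x - y))    # move toward data consistency
        x = np.clip(x, 0.0, None)          # POCS: enforce image positivity
        d = np.diff(x)                     # neighbour differences (1D for brevity)
        w = p * d * (d * d + eps) ** (p / 2.0 - 1.0)  # d/dd of (d^2 + eps)^(p/2)
        g = np.zeros_like(x)
        g[:-1] -= w                        # adjoint of the forward difference
        g[1:] += w
        norm = np.linalg.norm(g)
        if norm > 0.0:
            x -= tv_step * g / norm        # normalized TpV descent step
    return x
```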

    Roche tomography of cataclysmic variables: I. artefacts and techniques

    Roche tomography is a technique used for imaging the Roche-lobe filling secondary stars in cataclysmic variables (CVs). In order to interpret Roche tomograms correctly, one must determine whether features in the reconstruction are real, or due to statistical or systematic errors. We explore the effects of systematic errors using reconstructions of simulated datasets and show that systematic errors result in characteristic distortions of the final reconstructions that can be identified and corrected. In addition, we present a new method of estimating statistical errors on tomographic reconstructions using a Monte-Carlo bootstrapping algorithm and show this method to be much more reliable than Monte-Carlo methods which 'jiggle' the data points in accordance with the size of their error bars. (Comment: 11 pages, 8 figures. Accepted for publication in MNRAS.)
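    The bootstrap error estimate is straightforward to sketch: resample the observed data points with replacement, reconstruct each resample, and take the per-pixel spread of the resulting tomograms. In the sketch below, reconstruct is a stand-in for the (unspecified) Roche tomography inversion.

```python
# Monte-Carlo bootstrap error map: resample -> reconstruct -> spread.
import numpy as np

def bootstrap_errors(data, reconstruct, n_boot=200, rng=None):
    """data: array of observations; reconstruct: callable returning a map."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(data)
    maps = []
    for _ in range(n_boot):
        sample = data[rng.integers(0, n, size=n)]  # resample with replacement
        maps.append(reconstruct(sample))           # tomogram of the resample
    return np.std(maps, axis=0)                    # per-pixel statistical error
```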