
    Segmentation-Free Statistical Image Reconstruction for Polyenergetic X-Ray Computed Tomography

    This paper describes a statistical iterative reconstruction method for X-ray CT based on a physical model that accounts for the polyenergetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. The algorithm accommodates mixtures of tissues with known mass attenuation coefficients but unknown densities. We formulate a penalized-likelihood approach for this polyenergetic model based on Poisson statistics.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85881/1/Fessler173.pd
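
    As a rough sketch of the kind of model this line of work builds on (the notation below is illustrative and is not taken verbatim from the paper), the expected measurement along ray i and the corresponding penalized-likelihood estimate can be written as

        \bar{y}_i(\rho) = \int I_i(E) \exp\Big( -\sum_j a_{ij} \sum_k m_k(E)\, \rho_{jk} \Big) \, dE + r_i

        \hat{\rho} = \arg\min_{\rho \ge 0} \sum_i \Big( \bar{y}_i(\rho) - y_i \log \bar{y}_i(\rho) \Big) + \beta R(\rho)

    where y_i are the measured counts (modelled as Poisson), I_i(E) is the source spectrum, a_{ij} the system matrix, m_k(E) the known mass attenuation coefficient of material k, \rho_{jk} the unknown density of material k in voxel j, r_i a background term, and R a roughness penalty with weight \beta.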

    Statistical X-Ray Computed Tomography Image Reconstruction with Beam-Hardening Correction

    This paper describes two statistical iterative reconstruction methods for X-ray CT. The first method assumes a mono-energetic model for X-ray attenuation. We approximate the transmission Poisson likelihood by a quadratic cost function and exploit its convexity to derive a separable quadratic surrogate function that is easily minimized using parallelizable algorithms. Ordered subsets are used to accelerate convergence. We apply this mono-energetic algorithm (with edge-preserving regularization) to simulated thorax X-ray CT scans. A few iterations produce reconstructed images with lower noise than conventional FBP images at equivalent resolutions. The second method generalizes the physical model and accounts for the poly-energetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. We assume the object consists of a given number of nonoverlapping tissue types. The attenuation coefficient of each tissue is the product of its unknown density and a known energy-dependent mass attenuation coefficient. We formulate a penalized-likelihood function for this polyenergetic model and develop an iterative algorithm for estimating the unknown densities in each voxel. Applying this method to simulated X-ray CT measurements of a phantom containing both bone and soft tissue yields images with significantly reduced beam hardening artifacts.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85939/1/Fessler165.pd
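
    For intuition only, here is a minimal numpy sketch of an ordered-subsets update driven by a separable quadratic surrogate, applied to a toy weighted least-squares cost; the random "system matrix" and every variable name are placeholders of mine, not the paper's implementation.

        # Toy ordered-subsets / separable-surrogate update for the weighted
        # least-squares cost 0.5 * sum_i w_i * (l_i - [A x]_i)^2.
        import numpy as np

        rng = np.random.default_rng(0)
        n_rays, n_pix, n_subsets = 120, 30, 4

        A = rng.uniform(0.0, 1.0, size=(n_rays, n_pix))   # stand-in for a CT projector
        x_true = rng.uniform(0.5, 1.5, size=n_pix)        # "densities" to recover
        l = A @ x_true                                    # noiseless line integrals
        w = np.ones(n_rays)                               # statistical weights

        # Separable surrogate curvatures: d_j = sum_i a_ij * w_i * sum_k a_ik
        d = A.T @ (w * A.sum(axis=1))

        x = np.zeros(n_pix)
        subsets = np.array_split(np.arange(n_rays), n_subsets)
        for _ in range(20):
            for S in subsets:
                grad_S = A[S].T @ (w[S] * (A[S] @ x - l[S]))      # subset gradient
                x = np.maximum(x - n_subsets * grad_S / d, 0.0)   # scaled step, nonnegativity

        print("max abs error:", np.abs(x - x_true).max())

    Each voxel is updated independently because the surrogate is separable, which is what makes the step easy to parallelize, and cycling through subsets approximates the full gradient at a fraction of the cost per update.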

    Efficient and Accurate Likelihood for Iterative Image Reconstruction in X-Ray Computed Tomography

    We report a novel approach for statistical image reconstruction in X-ray CT. Statistical image reconstruction depends on maximizing a likelihood derived from a statistical model for the measurements. Traditionally, the measurements are assumed to be statistically Poisson, but more recent work has argued that CT measurements actually follow a compound Poisson distribution due to the polyenergetic nature of the X-ray source. Unlike the Poisson distribution, compound Poisson statistics have a complicated likelihood that impedes direct use of statistical reconstruction. Using a generalization of the saddle-point integration method, we derive an approximate likelihood for use with iterative algorithms. In its most realistic form, the approximate likelihood we derive accounts for polyenergetic X-rays and Poisson light statistics in the detector scintillator, and can be extended to account for electronic additive noise. The approximate likelihood is closer to the exact likelihood than is the conventional Poisson likelihood, and carries the promise of more accurate reconstruction, especially in low X-ray dose situations.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85924/1/Fessler182.pd
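
    In loose, illustrative notation of my own (not the paper's), the compound-Poisson picture and the generic saddle-point approximation it motivates are

        Y = \sum_{n=1}^{N} X_n, \qquad N \sim \mathrm{Poisson}(\lambda), \qquad X_n \ \text{i.i.d.}

        K_Y(s) = \log \mathbb{E}\,[e^{sY}] = \lambda \big( M_X(s) - 1 \big)

        p_Y(y) \approx \frac{\exp\big( K_Y(\hat{s}) - \hat{s} y \big)}{\sqrt{2\pi K_Y''(\hat{s})}}, \qquad K_Y'(\hat{s}) = y

    where N is the number of detected photons, X_n the contribution of the n-th photon, M_X its moment generating function, and \hat{s} the saddle point. The more realistic variants described in the abstract fold the scintillator light statistics and, optionally, additive electronic noise into this construction.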

    Amoebiasis in the Tropics: Epidemiology and Pathogenesis


    Statistical Image Reconstruction for Polyenergetic X-Ray Computed Tomography

    This paper describes a statistical image reconstruction method for X-ray computed tomography (CT) that is based on a physical model that accounts for the polyenergetic X-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. We assume that the object consists of a given number of nonoverlapping materials, such as soft tissue and bone. The attenuation coefficient of each voxel is the product of its unknown density and a known energy-dependent mass attenuation coefficient. We formulate a penalized-likelihood function for this polyenergetic model and develop an ordered-subsets iterative algorithm for estimating the unknown densities in each voxel. The algorithm monotonically decreases the cost function at each iteration when one subset is used. Applying this method to simulated X-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artifacts.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85895/1/Fessler74.pd
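
    The physical model can be made concrete with a small numerical sketch. The spectrum, mass attenuation values, and system matrix below are made-up placeholders of mine; they only illustrate how the expected measurement for each ray mixes a discrete source spectrum with energy-dependent attenuation of the two material density maps.

        # Toy polyenergetic forward model with two materials (all numbers illustrative).
        import numpy as np

        rng = np.random.default_rng(1)
        n_rays, n_pix = 60, 25

        A = rng.uniform(0.0, 0.2, size=(n_rays, n_pix))          # toy system matrix (cm)
        rho_soft = rng.uniform(0.9, 1.1, size=n_pix)             # g/cm^3, placeholder
        rho_bone = np.where(rng.random(n_pix) < 0.2, 1.8, 0.0)   # sparse "bone" voxels

        energies = np.array([40.0, 60.0, 80.0, 100.0])           # keV bins (illustrative)
        spectrum = np.array([0.2, 0.4, 0.3, 0.1]) * 1e5          # photons per bin per ray
        m_soft = np.array([0.27, 0.21, 0.18, 0.17])              # cm^2/g, placeholder
        m_bone = np.array([0.66, 0.31, 0.22, 0.19])              # cm^2/g, placeholder

        # Line integrals of each material's density along every ray
        p_soft, p_bone = A @ rho_soft, A @ rho_bone

        # Expected counts: sum over the spectrum of attenuated photons per energy bin
        ybar = np.zeros(n_rays)
        for b, I_b in enumerate(spectrum):
            ybar += I_b * np.exp(-(m_soft[b] * p_soft + m_bone[b] * p_bone))

        y = rng.poisson(ybar)   # Poisson measurement model
        print(y[:5], ybar[:5].round(1))

    The reconstruction problem in the abstract is the inverse of this map: estimate the density images from y under the penalized Poisson likelihood, rather than from log-processed data.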

    Segmentation-Free Statistical Image Reconstruction for Polyenergetic X-Ray Computed Tomography with Experimental Validation

    This paper describes a statistical image reconstruction method for x-ray CT that is based on a physical model that accounts for the polyenergetic x-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. Unlike our earlier work, the proposed algorithm does not require pre-segmentation of the object into the various tissue classes (e.g., bone and soft tissue) and allows mixed pixels. The attenuation coefficient of each voxel is modelled as the product of its unknown density and a weighted sum of energy-dependent mass attenuation coefficients. We formulate a penalized-likelihood function for this polyenergetic model and develop an iterative algorithm for estimating the unknown density of each voxel. Applying this method to simulated x-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artefacts relative to conventional beam hardening correction methods. We also apply the method to real data acquired from a phantom containing various concentrations of potassium phosphate solution. The algorithm reconstructs an image with accurate density values for the different concentrations, demonstrating its potential for quantitative CT applications.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85911/1/Fessler66.pd
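
    One way to picture the "mixed pixel" idea is to let the material weights vary smoothly with the voxel density itself, so no hard pre-segmentation is needed. The linear ramp and every number below are arbitrary illustrations of mine, not the weighting actually used in the paper.

        # Illustrative density-dependent mixing of two known mass attenuation curves.
        import numpy as np

        m_soft = np.array([0.27, 0.21, 0.18, 0.17])   # cm^2/g at 4 energy bins (placeholder)
        m_bone = np.array([0.66, 0.31, 0.22, 0.19])   # cm^2/g (placeholder)

        def bone_fraction(rho, rho_soft=1.0, rho_bone=1.9):
            """Fraction of attenuation attributed to bone at density rho (g/cm^3)."""
            return np.clip((rho - rho_soft) / (rho_bone - rho_soft), 0.0, 1.0)

        def mu(rho, b):
            """Voxel attenuation coefficient at energy bin b: unknown density times a
            density-weighted sum of the known mass attenuation coefficients."""
            f = bone_fraction(rho)
            return rho * ((1.0 - f) * m_soft[b] + f * m_bone[b])

        print(mu(np.array([0.9, 1.2, 2.0]), b=1))   # soft, mixed, and bone-like voxels

    In this sketch the weights depend only on the single unknown density, which is what lets an algorithm estimate one density image per voxel without deciding in advance which tissue class each pixel belongs to.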

    Maximum-Likelihood Dual-Energy Tomographic Image Reconstruction

    Dual-energy (DE) X-ray computed tomography (CT) has shown promise for material characterization and for providing quantitatively accurate CT values in a variety of applications. However, DE-CT has not been used routinely in medicine to date, primarily due to dose considerations. Most methods for DE-CT have used the filtered backprojection method for image reconstruction, leading to suboptimal noise/dose properties. This paper describes a statistical (maximum-likelihood) method for dual-energy X-ray CT that accommodates a wide variety of potential system configurations and measurement noise models. Regularized methods (such as penalized-likelihood or Bayesian estimation) are straightforward extensions. One version of the algorithm monotonically decreases the negative log-likelihood cost function each iteration. An ordered-subsets variation of the algorithm provides a fast and practical version.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85934/1/Fessler172.pd
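
    In schematic notation of my own (the paper accommodates more general system configurations and noise models), a dual-energy Poisson negative log-likelihood couples two scans with different spectra through shared basis-material line integrals:

        \bar{y}_i^{(m)}(\rho) = \int I_i^{(m)}(E) \exp\Big( -\sum_k m_k(E)\, [A \rho_k]_i \Big) \, dE + r_i^{(m)}, \qquad m = 1, 2

        -L(\rho) = \sum_{m=1}^{2} \sum_i \Big( \bar{y}_i^{(m)}(\rho) - y_i^{(m)} \log \bar{y}_i^{(m)}(\rho) \Big)

    where I_i^{(m)}(E) is the spectrum of the m-th scan, \rho_k the density image of basis material k, and A the system matrix; regularized (penalized-likelihood) versions add a roughness penalty to this cost.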

    Statistical Reconstruction for Quantitative CT Applications

    This paper summarizes considerations in developing statistical reconstruction algorithms for polyenergetic X-ray CT. The algorithms are based on Poisson statistics and polyenergetic X-ray attenuation physics and object models. In single-kVp scans, object models enable estimates of the contributions of bone and soft tissue at every pixel, based on prior assumptions about the tissue properties. In dual-kVp scans, one can estimate water and bone images independently. Preliminary results with fan-beam data from two cone beam systems show better accuracy for iterative methods over FBP.
    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/85899/1/Fessler186.pd
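
    As a toy illustration of the dual-kVp idea only: if each scan is idealized as mono-energetic at an effective energy, the two log measurements for a ray are linear in the water and bone line integrals, and each ray reduces to a 2x2 solve. The statistical methods summarized above work with full spectra and proper noise models instead; all numbers below are made up.

        # Idealized two-material decomposition for one ray from dual-kVp log data.
        import numpy as np

        # Placeholder mass attenuation coefficients (cm^2/g) at two effective energies:
        # rows = low / high kVp, columns = water / bone.
        M = np.array([[0.21, 0.31],
                      [0.17, 0.19]])

        p_true = np.array([12.0, 2.5])      # water, bone line integrals (g/cm^2)
        l = M @ p_true                      # ideal log measurements
        l_noisy = l + np.random.default_rng(2).normal(0, 0.01, size=2)

        p_hat = np.linalg.solve(M, l_noisy)  # per-ray decomposition
        print("estimated water/bone line integrals:", p_hat.round(2))

    Part of the appeal of the statistical formulation is that it avoids this per-ray linearization and its noise amplification, handling the nonlinearity and the count statistics directly.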

    Intestinal parasitic infections among expatriate workers in various occupations in Sharjah, United Arab Emirates

    Intestinal parasitic infections are prevalent in many countries. This study aimed to determine the prevalence of intestinal parasite carriers among 21,347 expatriate workers, including food handlers and housemaids, attending the public health center laboratory in Sharjah, UAE. Stool samples were collected between January and December 2013 and examined microscopically; demographic data were also obtained and analyzed. Intestinal parasites were found in 3.3% (708/21,347) of the samples (single and multiple infections). Among positive samples, 683 (96.5%) carried a single parasite: protozoan mono-infections comprised Giardia lamblia (257; 36.3%) and Entamoeba histolytica/Entamoeba dispar (220; 31.1%), whereas helminth mono-infections accounted for 206 (29.1%) of the samples. Infection rates for single helminth species were: Ascaris lumbricoides (84; 11.9%), hookworm (34; 4.8%), Trichuris trichiura (33; 4.7%), Taenia spp. (27; 3.81%), Strongyloides stercoralis (13; 1.8%), Hymenolepis nana (13; 1.8%), and Enterobius vermicularis (2; 0.28%). Infections were significantly associated with gender (χ2 = 14.18; p = 0.002), with males the most commonly infected with both groups of intestinal parasites (protozoa and helminths). Strong statistical associations were also found between parasite occurrence and nationality (χ2 = 49.5; p < 0.001) and between parasite occurrence and occupation (χ2 = 15.60; p = 0.029). Multiple infections were uncommon (3.5% of positive samples), although one individual (0.14%) had four helminth species concurrently. These findings emphasize that food handlers carrying pathogenic parasites may pose a significant health risk to the public.

    Identification of Eps15 as Antigen Recognized by the Monoclonal Antibodies aa2 and ab52 of the Wuerzburg Hybridoma Library against Drosophila Brain

    The Wuerzburg Hybridoma Library against the Drosophila brain represents a collection of around 200 monoclonal antibodies that bind to specific structures in the Drosophila brain. Here we describe the immunohistochemical staining patterns, the Western blot signals after one- and two-dimensional electrophoretic separation, and the mass spectrometric characterization of the target protein candidates recognized by the monoclonal antibodies aa2 and ab52 from the library. Analysis of a mutant of a candidate gene identified the Drosophila homolog of the Epidermal growth factor receptor Pathway Substrate clone 15 (Eps15) as the antigen for these two antibodies.