
    Development of a Research Roadmap Related to Safe and Reliable Transportation of Ethanol in Pipelines

    Contents:
    Scope
    Phase 1 Results: Suggested Projects for Near Future
    Phase 1 Results: Other Recommendations
    Phase 2 Results: Roadmap of All Future Projects
    Phase 2 Results: Roadmap of All Future Projects (cont.)
    Recommended Project 1: Safety of Transporting Blends Containing More than 10 Percent Ethanol
    Recommended Project 3: Technical and Economic Feasibility of Preventing SCC Through Control of Oxygen
    Recommended Project 4: Feasibility of Preventing SCC by Using Inhibitors
    Recommended Project 5: Compatibility of Non-ferrous Metals with Ethanol
    Recommended Project 6: Phenomenological Understanding of Ethanol SCC
    Other Recommended Actions
    Full Project Plan

    Quadratic Regularization Design for 2-D CT

    Statistical methods for tomographic image reconstruction offer noise and spatial-resolution properties that may improve image quality in X-ray computed tomography (CT). Penalized weighted least squares (PWLS) methods using conventional quadratic regularization lead to nonuniform and anisotropic spatial resolution due to interactions between the weighting, which is necessary for good noise properties, and the regularizer. Previously, we addressed this problem for parallel-beam emission tomography using matrix algebra methods to design data-dependent, shift-variant regularizers that improve resolution uniformity. This paper develops a fast angular integral mostly analytical (AIMA) regularization design method for 2-D fan-beam X-ray CT imaging, for which parallel-beam tomography is a special case. Simulation results demonstrate that the new method for regularization design requires very modest computation and leads to nearly uniform and isotropic spatial resolution in transmission tomography when using quadratic regularization.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85858/1/Fessler18.pd
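
    The PWLS estimate described here has the generic form xhat = argmin_x 0.5*(y - Ax)' W (y - Ax) + beta*R(x). As a minimal illustration (not the paper's AIMA design), the sketch below evaluates that cost in Python, assuming a dense system matrix A, a weight vector w with W = diag(w), and a simple 1-D first-difference quadratic roughness penalty; all of these are illustrative stand-ins.

    import numpy as np

    def pwls_cost(x, A, y, w, beta):
        """0.5*(y - Ax)' W (y - Ax) + beta*R(x), with W = diag(w) and
        R(x) a quadratic first-difference roughness penalty."""
        r = y - A @ x                        # sinogram residual
        data_fit = 0.5 * np.sum(w * r ** 2)  # statistically weighted fit
        d = np.diff(x)                       # neighboring-pixel differences
        return data_fit + beta * 0.5 * np.sum(d ** 2)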

    Quadratic regularization design for fan beam transmission tomography

    Statistical methods for tomographic image reconstruction have shown considerable potential for improving image quality in X-ray CT. Penalized-likelihood (PL) image reconstruction methods require maximizing an objective function that is based on the log-likelihood of the sinogram measurements and on a roughness penalty function to control noise. In transmission tomography, PL methods (and MAP methods) based on conventional quadratic regularization functions lead to nonuniform and anisotropic spatial resolution, even for idealized shift-invariant imaging systems. We have previously addressed this problem for parallel-beam emission tomography by designing data-dependent, shift-variant regularizers that improve resolution uniformity. This paper extends those methods to the fan-beam geometry used in X-ray CT imaging. Simulation results demonstrate that the new method for regularization design requires very modest computation and leads to nearly uniform and isotropic spatial resolution in the fan-beam geometry when using quadratic regularization.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85936/1/Fessler208.pd
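
    For transmission data, the log-likelihood term in a PL objective typically follows the Poisson model ybar_i = b_i * exp(-[Ax]_i), where b_i are blank-scan counts. A minimal sketch of the resulting objective, again with a simple 1-D quadratic penalty as an assumed stand-in for the designed regularizer:

    import numpy as np

    def pl_transmission_cost(x, A, y, b, beta):
        """Negative Poisson log-likelihood for transmission data plus a
        quadratic roughness penalty; model: ybar_i = b_i * exp(-[Ax]_i)."""
        ybar = b * np.exp(-(A @ x))                 # expected detector counts
        neg_loglik = np.sum(ybar - y * np.log(ybar))
        d = np.diff(x)                              # first differences
        return neg_loglik + beta * 0.5 * np.sum(d ** 2)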

    Quadratic Regularization Design for Iterative Reconstruction in 3D multi-slice Axial CT

    In X-ray CT, statistical methods for tomographic image reconstruction create images with better noise properties than conventional filtered back projection (FBP) techniques. Penalized-likelihood (PL) image reconstruction methods maximize an objective function based on the log-likelihood of sinogram measurements and on a user-defined roughness penalty that controls noise. Penalized-likelihood methods (as well as penalized weighted least-squares methods) based on conventional quadratic regularizers result in nonuniform and anisotropic spatial resolution. We have previously addressed this problem for 2D emission tomography, 2D fan-beam transmission tomography, and 3D cylindrical emission tomography. This paper extends those methods to 3D multi-slice axial CT with small cone angles.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85860/1/Fessler222.pd

    Joint Reconstruction of Stokes Images from Polarimetric Measurements

    In the field of imaging polarimetry, Stokes parameters are sought and must be inferred from noisy and blurred intensity measurements. Using a penalized-likelihood estimation framework, we investigate reconstruction quality when estimating intensity images and then transforming to Stokes parameters, versus estimating Stokes parameters directly. We define our cost function for reconstruction by a weighted least-squares data-fit term and a regularization penalty. We show that for quadratic regularization the estimators of Stokes and intensity images can be made equal by appropriate choice of regularization parameters. It is empirically shown that, when using edge-preserving regularization, estimating the Stokes parameters directly leads to lower RMS error. Also, the addition of a cross-channel regularization term further lowers the RMS error for both methods, especially at low SNR.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85916/1/Fessler20.pd
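
    For context, the intensity measurements are linearly related to the Stokes images: an ideal linear polarizer at angle theta records I(theta) = 0.5*(s0 + s1*cos(2*theta) + s2*sin(2*theta)). The sketch below shows the "estimate intensities, then transform" direction for the common four-angle acquisition; the four-angle setup is an assumption for illustration, not necessarily the measurement scheme used in the paper.

    import numpy as np

    def stokes_from_intensities(i0, i45, i90, i135):
        """Linear Stokes images from four ideal polarizer measurements,
        one intensity image per polarizer angle (in degrees)."""
        s0 = i0 + i90        # total intensity
        s1 = i0 - i90        # 0 vs. 90 degree preference
        s2 = i45 - i135      # +45 vs. -45 degree preference
        return np.stack([s0, s1, s2])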

    Regularized Estimation of Stokes Images from Polarimetric Measurements

    In the remote sensing context, the goal of imaging polarimetry is to map the state of polarization of a scene of interest. The polarization state of a scene can be represented by the Stokes parameters. Since the Stokes parameters are not directly measurable, one must first make several individual intensity measurements and then infer the Stokes parameters from them. We approach this problem using penalized-likelihood estimation. Given the measured linearly polarized images, what is the optimal way to deblur, denoise, and construct the Stokes parameters? Traditional image restoration restores the blurred, noise-corrupted data directly; in imaging polarimetry, the question is whether to restore the measured data and then form the Stokes images, or to estimate the Stokes images directly. We define our cost function for reconstruction by a weighted least-squares data-fit term and a regularization penalty. We show that for quadratic regularization the estimators of Stokes and intensity images can be made equal by appropriate choice of regularization parameters. It is empirically shown that, when using edge-preserving regularization, estimating the Stokes parameters directly leads to somewhat lower error.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85937/1/Fessler231.pd

    Improved fMRI Time-Series Registration Using Joint Probability Density Priors

    Functional MRI (fMRI) time-series studies are plagued by varying degrees of subject head motion. Faithful head-motion correction is essential for accurately detecting brain activation in statistical analyses of these time-series. Mutual information (MI) based slice-to-volume (SV) registration is used for motion estimation when the rate of change of head position is large. SV registration accounts for head motion between slice acquisitions by estimating an independent rigid transformation for each slice in the time-series. Consequently, each MI optimization uses intensity counts from a single time-series slice, making the algorithm susceptible to noise for low-complexity end-slices (i.e., slices near the top of the head). This work focuses on improving the accuracy of MI-based SV registration of end-slices by using joint probability density priors derived from registered high-complexity center-slices (i.e., slices near the middle of the head). Results show that the use of such priors can significantly improve SV registration accuracy.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85928/1/Fessler236.pd
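
    One way to read the prior idea: MI is computed from a joint intensity histogram of the slice and the reference volume, and for a sparse end-slice that histogram is noisy, so it can be blended with a prior joint density learned from well-registered center-slices. The sketch below assumes a fixed blending weight alpha and a bins-by-bins prior_joint that sums to one; both are illustrative choices, not the paper's exact formulation.

    import numpy as np

    def mi_with_prior(slice_vals, vol_vals, prior_joint, alpha=0.5, bins=32):
        """Mutual information from a joint histogram blended with a prior
        joint density; alpha is the weight on the data histogram."""
        hist, _, _ = np.histogram2d(slice_vals, vol_vals, bins=bins)
        p = hist / hist.sum()
        p = alpha * p + (1 - alpha) * prior_joint   # blend in the prior
        px = p.sum(axis=1, keepdims=True)           # marginal of the slice
        py = p.sum(axis=0, keepdims=True)           # marginal of the volume
        nz = p > 0
        return np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz]))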

    Spectral Analysis Using Regularized Non-Negative Least-Squares Estimation

    Implementing spectral analysis techniques involves solving a highly underdetermined system of linear equations and is sensitive to measurement noise. The authors propose a regularized non-negative least-squares estimator to stabilize the technique. They introduce a penalty term into the objective function to discourage disparities in tracer kinetics between neighboring pixels and use an iterative method to impose positivity constraints. The authors show results from analysis of FDG thorax images of patients suspected of having cancer and summarize their findings.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85892/1/Fessler137.pd
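
    A minimal way to realize such a regularized non-negative least-squares estimator is projected gradient descent on the penalized cost, clipping negatives after each step. The first-difference penalty, step size, and iteration count below are illustrative assumptions rather than the authors' exact algorithm.

    import numpy as np

    def reg_nnls(A, y, beta, n_iter=200):
        """Minimize 0.5*||y - Ax||^2 + 0.5*beta*||Dx||^2 s.t. x >= 0 by
        projected gradient descent, where D takes first differences."""
        x = np.zeros(A.shape[1])
        step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 4.0 * beta)  # <= 1/Lipschitz
        for _ in range(n_iter):
            d = np.diff(x)
            grad = A.T @ (A @ x - y)
            grad[1:] += beta * d          # penalty gradient, left neighbor
            grad[:-1] -= beta * d         # penalty gradient, right neighbor
            x = np.maximum(x - step * grad, 0.0)  # project onto x >= 0
        return x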

    Gradient Based Image Registration Using Importance Sampling

    Analytical gradient-based non-rigid image registration methods using intensity-based similarity measures (e.g., mutual information) have proven capable of accurately handling many types of deformations. While their versatility stems largely from their many degrees of freedom, computing the gradient of the similarity measure with respect to the many warp parameters is very time consuming. Recently, a simple stochastic approximation method that uses a small random subset of image pixels to approximate this gradient has been shown to be effective. We propose to use importance sampling to improve the accuracy and reduce the variance of this approximation by preferentially selecting pixels near image edges. Initial empirical results show that combining stochastic approximation with importance sampling greatly improves the rate of convergence of the registration process while preserving accuracy.
    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/86019/1/Fessler217.pd
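
    The importance-sampling idea can be sketched as: draw a small pixel subset with probability proportional to an edge-strength map, then reweight each sampled term by 1/(N*p_i) so the stochastic gradient remains an unbiased estimate of the full sum. The edge-based proposal, the uniform floor, and the per_pixel_grad callback below are assumptions in the spirit of the abstract, not the authors' exact scheme.

    import numpy as np

    def sampled_gradient(per_pixel_grad, edge_strength, n_samples, rng):
        """Unbiased importance-sampled estimate of sum_i per_pixel_grad(i),
        drawing pixels with probability proportional to edge_strength."""
        p = edge_strength.ravel() + 1e-8   # keep every pixel reachable
        p /= p.sum()
        idx = rng.choice(p.size, size=n_samples, p=p)
        # the 1/(n * p_i) weights make the estimator unbiased
        return sum(per_pixel_grad(i) / (n_samples * p[i]) for i in idx)

    Called with rng = np.random.default_rng(), putting more sampling mass near edges reduces estimator variance when edge pixels dominate the true gradient, which is the rationale the abstract gives.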