    Image reconstruction/synthesis from nonuniform data and zero/threshold crossings

    We address the problem of reconstructing functions from their nonuniform data and zero/threshold crossings. We introduce a deterministic process, based on the Gram-Schmidt orthonormalization procedure, for this reconstruction. This is achieved by first introducing nonorthogonal basis functions in a chosen 2-D domain (e.g., for a band-limited signal, a possible choice is the 2-D Fourier domain of the image) that span the signal subspace of the nonuniform data. We then use the Gram-Schmidt procedure to construct a set of orthonormal basis functions that span the linear signal subspace defined by the nonorthogonal basis functions. Next, we project the N-dimensional measurement vector (N is the number of nonuniform data or threshold crossings) onto the newly constructed orthonormal basis functions. Finally, the function at any point can be reconstructed by projecting its representation with respect to the orthonormal basis functions onto the reconstruction basis functions that span the signal subspace of the evenly spaced sampled data. The reconstructed signal is the minimum mean square error estimate of the original signal. The procedure gives error-free reconstruction provided that the nonorthogonal basis functions that span the signal subspace of the nonuniform data form a complete set in the signal subspace of the original band-limited signal. We apply this algorithm to reconstruct functions from unevenly spaced sampled data and zero crossings, and also to the problem of synthesizing a 2-D band-limited function with prescribed level crossings.
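    As a concrete illustration of the pipeline described above, here is a minimal 1-D analogue in Python (a sketch, not the paper's implementation): the nonorthogonal basis functions are band-limited complex exponentials evaluated at the nonuniform sample locations, the Gram-Schmidt step is carried out stably via a QR factorization, and the measurement vector is projected onto the resulting orthonormal basis before being mapped back to a reconstruction. All sizes and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Band-limited test signal: frequencies -B..B with random coefficients.
B = 8
freqs = np.arange(-B, B + 1)
coeffs = rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size)

def signal(t):
    return np.exp(2j * np.pi * np.outer(t, freqs)) @ coeffs

# N-dimensional measurement vector of nonuniform samples (N >= 2B + 1,
# so the basis functions form a complete set for this band limit).
N = 40
t_samples = np.sort(rng.uniform(0.0, 1.0, N))
y = signal(t_samples)

# Nonorthogonal basis: each column is one band-limited exponential
# evaluated at the nonuniform sample points.
A = np.exp(2j * np.pi * np.outer(t_samples, freqs))

# Gram-Schmidt orthonormalization, done here via a QR factorization:
# the columns of Q are an orthonormal basis for the same signal subspace.
Q, R = np.linalg.qr(A)

# Project the measurement vector onto the orthonormal basis, then map
# the representation back to Fourier-domain coefficients.
rep = Q.conj().T @ y
c_hat = np.linalg.solve(R, rep)

# Reconstruct the function at arbitrary (e.g., evenly spaced) points.
t_dense = np.linspace(0.0, 1.0, 500, endpoint=False)
s_hat = np.exp(2j * np.pi * np.outer(t_dense, freqs)) @ c_hat

print("max reconstruction error:", np.max(np.abs(s_hat - signal(t_dense))))
```

    Because the samples here outnumber the basis functions, the exponentials form a complete set for the band limit and the projection reproduces the signal to machine precision, mirroring the abstract's error-free reconstruction condition.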

    Reconstruction of Binary Functions and Shapes from Incomplete Frequency Information

    The characterization of a binary function by partial frequency information is considered. We show that it is possible to reconstruct binary signals from incomplete frequency measurements via the solution of a simple linear optimization problem. We further prove that if a binary function is spatially structured (e.g., a general black-and-white image or the indicator function of a shape), then it can in general be recovered from very few low-frequency measurements. These results lead to efficient methods for sensing, characterizing, and recovering a binary signal or a shape, as well as to other applications such as deconvolution of binary functions blurred by a low-pass filter. Numerical results are provided to demonstrate the theoretical arguments. Comment: IEEE Transactions on Information Theory, 201
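    The abstract does not spell out the linear program, so the sketch below uses one natural LP of this kind: total-variation minimization over the box [0, 1]^n subject to the observed low-frequency DFT coefficients. The paper's exact formulation may differ, and the sizes, frequency set, and SciPy setup are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

n = 64
x_true = np.zeros(n)
x_true[20:45] = 1.0                 # indicator of an interval (two jumps)

# Incomplete frequency information: DFT coefficients 0..m-1 only.
m = 8
F = np.exp(-2j * np.pi * np.outer(np.arange(m), np.arange(n)) / n)
b = F @ x_true
A_meas = np.vstack([F.real, F.imag])            # real equality constraints
b_meas = np.concatenate([b.real, b.imag])

# Variables z = [x (n entries), u (n-1 entries)] with |x[i+1]-x[i]| <= u[i];
# minimizing sum(u) is total-variation minimization written as an LP.
D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]        # finite-difference matrix
A_eq = np.hstack([A_meas, np.zeros((2 * m, n - 1))])
A_ub = np.vstack([np.hstack([D, -np.eye(n - 1)]),
                  np.hstack([-D, -np.eye(n - 1)])])
b_ub = np.zeros(2 * (n - 1))
c = np.concatenate([np.zeros(n), np.ones(n - 1)])
bounds = [(0.0, 1.0)] * n + [(0.0, None)] * (n - 1)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_meas,
              bounds=bounds, method="highs")
x_hat = np.round(res.x[:n])                      # snap to {0, 1}
print("exact recovery:", np.array_equal(x_hat, x_true))
```

    The box constraints 0 <= x <= 1 are what make binary recovery from so few measurements plausible; for spatially structured signals such as this interval indicator, the binary point is typically the unique optimum.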

    Spherical deconvolution of multichannel diffusion MRI data with non-Gaussian noise models and spatial regularization

    Spherical deconvolution (SD) methods are widely used to estimate the intra-voxel white-matter fiber orientations from diffusion MRI data. However, while some of these methods assume a zero-mean Gaussian distribution for the underlying noise, its real distribution is known to be non-Gaussian and to depend on how multichannel signals are combined. Indeed, the two prevailing methods for multichannel signal combination lead to Rician and noncentral Chi noise distributions. Here we develop a Robust and Unbiased Model-BAsed Spherical Deconvolution (RUMBA-SD) technique, intended to deal with realistic MRI noise, based on a Richardson-Lucy (RL) algorithm adapted to Rician and noncentral Chi likelihood models. To quantify the benefits of using proper noise models, RUMBA-SD was compared with dRL-SD, a well-established method based on the RL algorithm for Gaussian noise. Another aim of the study was to quantify the impact of including a total variation (TV) spatial regularization term in the estimation framework; to do this, we developed TV spatially regularized versions of both the RUMBA-SD and dRL-SD algorithms. The evaluation was performed by comparing various quality metrics on 132 three-dimensional synthetic phantoms involving different inter-fiber angles and volume fractions, contaminated with noise mimicking patterns generated by data processing in multichannel scanners. The results demonstrate that the inclusion of proper likelihood models leads to an increased ability to resolve fiber crossings with smaller inter-fiber angles and to better detect non-dominant fibers. The inclusion of TV regularization dramatically improved the resolution power of both techniques. The above findings were also verified in brain data.
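    To make the Richardson-Lucy backbone concrete, here is a hedged toy sketch: a circular (1-D) analogue of spherical deconvolution using the multiplicative RL update, i.e., the kind of baseline that dRL-SD builds on. RUMBA-SD replaces the likelihood with Rician or noncentral-Chi models and adds the TV term; the response kernel, sizes, and names below are all illustrative, not the actual dMRI ones.

```python
import numpy as np

rng = np.random.default_rng(2)
n_meas, n_dirs = 60, 90
theta = np.linspace(0.0, np.pi, n_meas, endpoint=False)  # measurement angles
phi = np.linspace(0.0, np.pi, n_dirs, endpoint=False)    # fODF angles

# Axially symmetric toy fiber-response kernel (peaked for aligned angles).
K = np.exp(4.0 * (np.cos(2.0 * (theta[:, None] - phi[None, :])) - 1.0))

f_true = np.zeros(n_dirs)
f_true[[10, 55]] = [0.7, 0.3]                  # two crossing "fibers"
s = np.clip(K @ f_true + 0.01 * rng.standard_normal(n_meas), 1e-8, None)

f = np.full(n_dirs, s.mean() / K.sum())        # flat nonnegative start
kt_ones = K.T @ np.ones(n_meas)
for _ in range(300):
    pred = np.clip(K @ f, 1e-12, None)
    f *= (K.T @ (s / pred)) / kt_ones          # multiplicative RL update

print("fODF peaks at indices:", np.sort(np.argsort(f)[-2:]))
```

    The multiplicative form keeps the fiber ODF nonnegative at every iteration, which is one reason RL-type schemes are attractive for this problem.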

    Significant edges in the case of a non-stationary Gaussian noise

    In this paper, we propose an edge detection technique based on local smoothing of the image followed by a statistical hypothesis test on the gradient. An edge point is defined as a zero-crossing of the Laplacian; it is said to be a significant edge point if the gradient at this point is larger than a threshold $s(\varepsilon)$ defined by: if the image $I$ is pure noise, then $\mathbb{P}\big(\|\nabla I\| \geq s(\varepsilon) \,\big|\, \Delta I = 0\big) \leq \varepsilon$. In other words, a significant edge is an edge that has a very low probability of being caused by noise. We show that the threshold $s(\varepsilon)$ can be computed explicitly in the case of a stationary Gaussian noise. In the images we are interested in, which are obtained by tomographic reconstruction from a radiograph, this method fails because the Gaussian noise is no longer stationary. In this case too, however, we are able to give the law of the gradient conditional on the zero-crossing of the Laplacian, and thus to compute the threshold $s(\varepsilon)$. We end the paper with experiments and compare the results with those obtained by other edge detection methods.
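    A minimal sketch of the decision rule itself (the paper's contribution, deriving $s(\varepsilon)$ under stationary and non-stationary Gaussian noise, is not reproduced here; the threshold is simply passed in, and all names and parameter values are illustrative):

```python
import numpy as np
from scipy import ndimage

def significant_edges(image, sigma=2.0, s_eps=0.05):
    """Zero-crossings of the Laplacian whose gradient exceeds s_eps."""
    smoothed = ndimage.gaussian_filter(image, sigma)   # local smoothing
    lap = ndimage.laplace(smoothed)
    gy, gx = np.gradient(smoothed)
    grad_mag = np.hypot(gx, gy)

    # Discrete zero-crossings: the Laplacian changes sign against a
    # right or lower neighbour.
    sign = lap > 0
    zc = np.zeros_like(sign)
    zc[:-1, :] |= sign[:-1, :] != sign[1:, :]
    zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]

    # Significant edge points: zero-crossing AND large gradient.
    return zc & (grad_mag >= s_eps)

# Toy usage: a noisy vertical step edge.
rng = np.random.default_rng(3)
img = np.zeros((64, 64))
img[:, 32:] = 1.0
img += 0.1 * rng.standard_normal(img.shape)
print("edge pixels found:", int(significant_edges(img).sum()))
```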

    Cosmic-Ray Rejection by Laplacian Edge Detection

    Conventional algorithms for rejecting cosmic rays in single CCD exposures rely on the contrast between cosmic rays and their surroundings, and may produce erroneous results if the point spread function (PSF) is smaller than the largest cosmic rays. This paper describes a robust algorithm for cosmic-ray rejection, based on a variation of Laplacian edge detection. The algorithm identifies cosmic rays of arbitrary shapes and sizes by the sharpness of their edges, and reliably discriminates between poorly sampled point sources and cosmic rays. Examples of its performance are given for spectroscopic and imaging data, including HST WFPC2 images. Comment: Accepted for publication in the PASP (November 2001 issue). The algorithm is implemented in the program L.A.Cosmic, which can be obtained from http://www.astro.caltech.edu/~pgd/lacosmic
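    A much-simplified sketch of the core idea, assuming a basic CCD gain/read-noise model: flag pixels whose positive Laplacian is large relative to the expected noise. The real L.A.Cosmic adds 2x2 subsampling, a fine-structure test that protects poorly sampled point sources, and iteration; every parameter value below is made up.

```python
import numpy as np
from scipy import ndimage

def flag_cosmic_rays(image, gain=2.0, readnoise=5.0, sigclip=10.0):
    # Positive Laplacian: cosmic rays have far sharper edges than
    # PSF-convolved sources. (scipy's laplace kernel has a negative
    # centre, so negate it to make upward spikes positive.)
    lap = np.clip(-ndimage.laplace(image), 0.0, None)

    # Simple CCD noise model in ADU (Poisson shot noise + read noise).
    noise = np.sqrt(np.clip(image, 0.0, None) * gain + readnoise ** 2) / gain
    return lap / np.clip(noise, 1e-6, None) > sigclip

# Toy usage: a flat frame with one cosmic-ray-like spike.
rng = np.random.default_rng(4)
img = 100.0 + rng.normal(0.0, 3.0, (64, 64))
img[30, 30] += 500.0
print("flagged pixels:", int(flag_cosmic_rays(img).sum()))
```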