1,959 research outputs found

    A practical algorithm for Tanner graph based image interpolation

    This paper interprets image interpolation as a decoding problem on a Tanner graph and proposes a practical belief propagation algorithm based on a Gaussian autoregressive image model. The algorithm regards belief propagation as a way to generate and fuse predictions from the various check nodes. A low-complexity implementation of this algorithm measures and distributes the departure of the current interpolation result from the image model. The convergence speed of the proposed algorithm is discussed, and experimental results show that good interpolation results can be obtained with a very small number of iterations.
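    The fusion step lends itself to a small sketch. Assuming each check node supplies a Gaussian prediction (a mean and a variance) for a missing pixel, the product of Gaussian messages in belief propagation reduces to a precision-weighted average; the function name and toy numbers below are illustrative, not taken from the paper.

        import numpy as np

        def fuse_gaussian_predictions(means, variances):
            """Fuse independent Gaussian predictions of one missing pixel.

            The product of Gaussian densities is again Gaussian, with
            precision equal to the sum of the individual precisions and
            mean equal to the precision-weighted average of the means.
            """
            precisions = 1.0 / np.asarray(variances)
            fused_var = 1.0 / precisions.sum()
            fused_mean = fused_var * (precisions * np.asarray(means)).sum()
            return fused_mean, fused_var

        # Toy example: three check nodes predict the same pixel.
        mean, var = fuse_gaussian_predictions([120.0, 128.0, 125.0],
                                              [4.0, 9.0, 2.0])
        print(mean, var)  # result is dominated by the most confident node

    The fused variance is always smaller than any individual one, which is why a few iterations of generating and fusing such predictions can already stabilize the interpolation.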

    Tight bounds for LDPC and LDGM codes under MAP decoding

    A new method is introduced for analyzing low-density parity-check (LDPC) codes and low-density generator-matrix (LDGM) codes under bit maximum a posteriori probability (MAP) decoding. The method is based on a rigorous approach to spin glasses developed by Francesco Guerra. It allows the construction of lower bounds on the entropy of the transmitted message conditioned on the received one. Based on heuristic statistical-mechanics calculations, we conjecture these bounds to be tight. The result holds for standard irregular ensembles used over binary-input output-symmetric channels. The method is first developed for Tanner graph ensembles with Poisson left-degree distribution. It is then generalized to 'multi-Poisson' graphs and, by a completion procedure, to arbitrary degree distributions.
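    In symbols, the central object is the per-bit conditional entropy of the transmitted word given the channel output. A schematic statement of the bound, in notation assumed here rather than copied from the paper, is:

        % per-bit conditional entropy of the transmitted message X^n
        % given the received word Y^n, over a binary-input symmetric channel
        h_n = \frac{1}{n} H\!\left(X^n \mid Y^n\right)

        % the Guerra-style interpolation yields variational lower bounds
        \liminf_{n \to \infty} h_n \;\ge\; \sup_{q} \Phi(q)

    Here \Phi is a trial functional whose exact form depends on the ensemble's degree distributions; the conjectured tightness means the supremum is expected to coincide with the true limit.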

    Phenomenological model of diffuse global and regional atrophy using finite-element methods

    The main goal of this work is the generation of ground-truth data for the validation of atrophy measurement techniques, commonly used in the study of neurodegenerative diseases such as dementia. Several techniques have been used to measure atrophy in cross-sectional and longitudinal studies, but it is extremely difficult to compare their performance since they have been applied to different patient populations. Furthermore, assessment of performance based on phantom measurements or simple scaled images overestimates these techniques' ability to capture the complexity of neurodegeneration of the human brain. We propose a method for atrophy simulation in structural magnetic resonance (MR) images based on finite-element methods. The method produces cohorts of brain images with known change that is physically and clinically plausible, providing data for objective evaluation of atrophy measurement techniques. Atrophy is simulated in different tissue compartments or in different neuroanatomical structures with a phenomenological model. This model of diffuse global and regional atrophy is based on volumetric measurements of structures such as the brain or the hippocampus from patients with known disease, and is guided by clinical knowledge of the relative pathological involvement of regions and tissues. The consequent biomechanical readjustment of structures is modelled using conventional physics-based techniques based on biomechanical tissue properties, simulating plausible tissue deformations with finite-element methods. A thermoelastic model of tissue deformation is employed, controlling the rate of progression of atrophy by means of a set of thermal coefficients, each one corresponding to a different type of tissue. Tissue characterization is performed by meshing a labelled brain atlas, creating a reference volumetric mesh that is passed to a finite-element solver to create the simulated deformations. Preliminary work on the simulation of acquisition artefacts is also presented.
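    The thermoelastic analogy can be sketched numerically. In linear thermoelasticity a uniform 'temperature' drop delta_t in a tissue with expansion coefficient alpha produces a volumetric strain of roughly 3 * alpha * delta_t, so per-tissue coefficients translate directly into target volume changes. Everything below (names, coefficient values) is a hypothetical illustration, not the paper's mesh-based solver.

        # Map labelled tissue classes to target volume changes via the
        # thermoelastic analogy: volumetric strain ~= 3 * alpha * dT for
        # small strains in linear thermoelasticity.

        THERMAL_COEFF = {            # hypothetical per-tissue coefficients
            "grey_matter": 1.0e-2,
            "white_matter": 0.5e-2,
            "hippocampus": 2.0e-2,
            "csf": 0.0,              # held fixed here; in practice CSF fills
        }                            # the space vacated by shrinking tissue

        def target_volume_change(tissue: str, delta_t: float) -> float:
            """Fractional volume change induced by a 'temperature' drop delta_t.

            Negative values mean atrophy (shrinkage); a finite-element solver
            would then compute the consistent deformation of neighbouring
            structures.
            """
            alpha = THERMAL_COEFF[tissue]
            return -3.0 * alpha * delta_t

        for tissue in THERMAL_COEFF:
            print(tissue, f"{target_volume_change(tissue, 1.0):+.3%}")

    Tuning the coefficients then amounts to choosing how fast each labelled compartment shrinks relative to the others.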

    Tanner Graph Based Image Interpolation

    This paper interprets image interpolation as a channel decoding problem and proposes a Tanner graph based interpolation framework, which regards each pixel in an image as a variable node and the local image structure around each pixel as a check node. The pixels available from the low-resolution image are 'received' whereas the missing pixels of the high-resolution image are 'erased', through an imaginary channel. Local image structures exhibited by the low-resolution image provide information on the joint distribution of pixels in a small neighborhood, and thus play the same role as parity symbols in classic channel coding scenarios. We develop an efficient solution for the sum-product algorithm of belief propagation in this framework, based on a Gaussian autoregressive image model. Initial experiments show up to 3 dB gain over other methods with the same image model. The proposed framework is flexible in message processing at each node and provides much room for incorporating more sophisticated image modelling techniques. © 2010 IEEE.
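    A minimal sketch of the graph construction, under an assumed 2x upscaling layout: known low-resolution pixels become 'received' variable nodes, the remaining high-resolution sites are 'erased', and each check node covers a small window of neighbouring pixels. Function name and window size are illustrative choices, not the paper's.

        import numpy as np

        def build_tanner_graph(h, w, factor=2, window=2):
            """Variable nodes = high-res pixels; check nodes = local windows.

            Returns a boolean 'received' mask (True where a low-resolution
            sample is observed) and a list of check nodes, each holding the
            variable-node indices inside one window x window patch.
            """
            received = np.zeros((h, w), dtype=bool)
            received[::factor, ::factor] = True  # observed low-res samples

            checks = []
            for i in range(h - window + 1):
                for j in range(w - window + 1):
                    patch = [(i + di) * w + (j + dj)
                             for di in range(window) for dj in range(window)]
                    checks.append(patch)
            return received, checks

        mask, checks = build_tanner_graph(8, 8)
        print(mask.sum(), "received pixels,", len(checks), "check nodes")

    Sum-product messages would then pass between each patch's check node and its member pixels, with the Gaussian autoregressive model supplying the check-node constraint.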

    On the convergence of a linesearch based proximal-gradient method for nonconvex optimization

    We consider a variable-metric linesearch based proximal gradient method for the minimization of the sum of a smooth, possibly nonconvex function and a convex, possibly nonsmooth term. We prove convergence of this iterative algorithm to a critical point if the objective function satisfies the Kurdyka-Łojasiewicz property at each point of its domain, under the assumption that a limit point exists. The proposed method is applied to a wide collection of image processing problems, and our numerical tests show that the algorithm is flexible, robust and competitive when compared to recently proposed approaches for the optimization problems arising in these applications.
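    A minimal single-metric sketch of such a scheme, for f smooth (possibly nonconvex) plus g = lam * ||.||_1, using backtracking on the step size; the variable-metric scaling and the Kurdyka-Łojasiewicz analysis of the paper are omitted, and all names below are illustrative.

        import numpy as np

        def soft_threshold(x, t):
            """Proximal operator of t * ||.||_1 (soft-thresholding)."""
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        def prox_grad_linesearch(f, grad_f, x0, lam=0.1, step=1.0,
                                 beta=0.5, iters=200):
            """Proximal-gradient iteration with backtracking on the step size."""
            x = x0.copy()
            for _ in range(iters):
                g = grad_f(x)
                t = step
                while True:
                    z = soft_threshold(x - t * g, t * lam)
                    d = z - x
                    # sufficient-decrease test from the descent lemma;
                    # holds for any t <= 1/L, so backtracking terminates
                    if f(z) <= f(x) + g @ d + d @ d / (2 * t):
                        break
                    t *= beta
                x = z
            return x

        # Toy nonconvex smooth term: least squares plus a cosine ripple.
        A = np.random.default_rng(0).normal(size=(20, 10))
        b = np.full(20, 0.1)
        f = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + 0.01 * np.sum(np.cos(x))
        grad_f = lambda x: A.T @ (A @ x - b) - 0.01 * np.sin(x)
        x_star = prox_grad_linesearch(f, grad_f, np.zeros(10))
        print(f(x_star))

    Swapping soft_threshold for another proximal operator changes the nonsmooth term g without touching the rest of the loop, which is the flexibility the abstract alludes to.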

    Bayesian Computation with Intractable Likelihoods

    This article surveys computational methods for posterior inference with intractable likelihoods, that is, where the likelihood function is unavailable in closed form or its evaluation is infeasible. We review recent developments in pseudo-marginal methods, approximate Bayesian computation (ABC), the exchange algorithm, thermodynamic integration, and composite likelihood, paying particular attention to advances in scalability for large datasets. We also point to R and MATLAB source code for implementations of these algorithms, where available.
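    Of the surveyed families, ABC is the easiest to sketch: simulate data from the model, and keep the parameters whose synthetic summaries fall within a tolerance of the observed summary, so that no likelihood evaluation is ever needed. The toy model and tolerance below are illustrative.

        import numpy as np

        def abc_rejection(observed, simulate, prior_sample, summary,
                          eps=0.1, n_draws=10_000, rng=None):
            """Basic ABC rejection sampler.

            Draws theta from the prior, simulates data, and accepts theta
            when the distance between simulated and observed summaries is
            below eps. No likelihood evaluation is required.
            """
            rng = rng or np.random.default_rng()
            s_obs = summary(observed)
            accepted = []
            for _ in range(n_draws):
                theta = prior_sample(rng)
                s_sim = summary(simulate(theta, rng))
                if abs(s_sim - s_obs) < eps:
                    accepted.append(theta)
            return np.array(accepted)

        # Toy example: infer the mean of a Gaussian with known variance.
        obs = np.random.default_rng(1).normal(2.0, 1.0, size=100)
        post = abc_rejection(
            observed=obs,
            simulate=lambda th, rng: rng.normal(th, 1.0, size=100),
            prior_sample=lambda rng: rng.uniform(-5, 5),
            summary=np.mean,
        )
        print(len(post), "accepted; posterior mean ~", post.mean())

    Shrinking eps trades acceptance rate for accuracy, which is precisely the scalability tension the survey's large-dataset discussion addresses.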