
    Large Scale Variational Bayesian Inference for Structured Scale Mixture Models

    Natural image statistics exhibit hierarchical dependencies across multiple scales. Representing such prior knowledge in non-factorial latent tree models can boost performance of image denoising, inpainting, deconvolution or reconstruction substantially, beyond standard factorial "sparse" methodology. We derive a large scale approximate Bayesian inference algorithm for linear models with non-factorial (latent tree-structured) scale mixture priors. Experimental results on a range of denoising and inpainting problems demonstrate substantially improved performance compared to MAP estimation or to inference with factorial priors. Comment: Appears in Proceedings of the 29th International Conference on Machine Learning (ICML 2012).
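    To make the setting concrete, the following is a minimal sketch of mean-field variational Bayes for a linear model with a *factorial* Gaussian scale-mixture prior, i.e. the baseline the paper improves on by tying the per-coefficient scales together in a latent tree. The function name and the simple EM-style scale update are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def vb_scale_mixture(y, A, sigma2=0.1, n_iter=50):
    """Toy mean-field VB for y = A x + noise, with a factorial
    Gaussian scale-mixture prior x_i ~ N(0, gamma_i).
    The paper's non-factorial method instead couples the gamma_i
    through a latent tree; this sketch keeps them independent.
    """
    n = A.shape[1]
    gamma = np.ones(n)  # per-coefficient prior variances (the scales)
    for _ in range(n_iter):
        # q(x) is Gaussian: posterior covariance and mean for fixed scales
        S = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
        mu = (S @ A.T @ y) / sigma2
        # Update each scale from the expected second moment E[x_i^2]
        gamma = mu**2 + np.diag(S)
    return mu, gamma
```

    Replacing the independent `gamma` update with inference over a tree-structured prior on the scales is exactly where the paper's contribution lies.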

    Kernel Belief Propagation

    We propose a nonparametric generalization of belief propagation, Kernel Belief Propagation (KBP), for pairwise Markov random fields. Messages are represented as functions in a reproducing kernel Hilbert space (RKHS), and message updates are simple linear operations in the RKHS. KBP makes none of the assumptions commonly required in classical BP algorithms: the variables need not arise from a finite domain or a Gaussian distribution, nor must their relations take any particular parametric form. Rather, the relations between variables are represented implicitly, and are learned nonparametrically from training data. KBP has the advantage that it may be used on any domain where kernels are defined (ℝ^d, strings, groups), even where explicit parametric models are not known, or closed form expressions for the BP updates do not exist. The computational cost of message updates in KBP is polynomial in the training data size. We also propose a constant time approximate message update procedure by representing messages using a small number of basis functions. In experiments, we apply KBP to image denoising, depth prediction from still images, and protein configuration prediction: KBP is faster than competing classical and nonparametric approaches (by orders of magnitude, in some cases), while providing significantly more accurate results.
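    The key point of the abstract is that a message becomes a weight vector over training samples, and the update reduces to Gram-matrix algebra. The sketch below shows only that regression core; the full KBP update also involves cross-covariance operators between neighbouring nodes, which are omitted here, and all names are illustrative.

```python
import numpy as np

def rbf_gram(X, Y, bandwidth=1.0):
    """RBF kernel Gram matrix between sample sets X (n x d) and Y (m x d)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth**2))

def kbp_message_update(K_t, incoming, lam=1e-3):
    """One (heavily simplified) KBP-style message update at node t.

    K_t      : n x n Gram matrix of the training samples at node t,
               e.g. K_t = rbf_gram(X_t, X_t)
    incoming : list of length-n weight vectors, one per message from
               a neighbour of t other than the target node s
    Returns the new weight vector for the message t -> s.
    """
    n = K_t.shape[0]
    # Evaluate each incoming RKHS message at the training points:
    # m(x^j) = sum_i beta_i k(x^i, x^j) = (K_t @ beta)_j  -- a linear op
    prod = np.ones(n)
    for beta in incoming:
        prod *= K_t @ beta
    # Regularized least-squares projection back onto the RKHS span,
    # giving the polynomial (in n) cost the abstract mentions
    return np.linalg.solve(K_t + lam * n * np.eye(n), prod)
```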

    Convolutional Dictionary Regularizers for Tomographic Inversion

    There has been a growing interest in the use of data-driven regularizers to solve inverse problems associated with computational imaging systems. The convolutional sparse representation model has recently gained attention, driven by the development of fast algorithms for solving the dictionary learning and sparse coding problems for sufficiently large images and data sets. Nevertheless, this model has seen very limited application to tomographic reconstruction problems. In this paper, we present a model-based tomographic reconstruction algorithm using a learnt convolutional dictionary as a regularizer. The key contribution is the use of a data-dependent weighting scheme for the ℓ1 regularization to construct an effective denoising method that is integrated into the inversion using the Plug-and-Play reconstruction framework. Using simulated data sets, we demonstrate that our approach can improve performance over traditional regularizers based on a Markov random field model and a patch-based sparse representation model for sparse and limited-view tomographic data sets.
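    The Plug-and-Play framework the abstract refers to swaps the prior step of a splitting algorithm for an off-the-shelf denoiser. Below is a minimal PnP-ADMM sketch under assumed names: `denoise` stands in for the paper's weighted-ℓ1 convolutional sparse-coding denoiser, and any callable mapping an image estimate to a cleaned estimate would slot in the same way.

```python
import numpy as np

def pnp_admm(y, A, denoise, rho=1.0, n_iter=50):
    """Generic Plug-and-Play ADMM for the linear inverse problem y ≈ A x.
    `denoise` plays the role of the prior / regularizer; the paper uses
    a weighted-l1 convolutional dictionary denoiser in this slot.
    """
    n = A.shape[1]
    x = np.zeros(n); v = np.zeros(n); u = np.zeros(n)
    # Precompute the data-fit solve: (A^T A + rho I) x = A^T y + rho (v - u)
    H = A.T @ A + rho * np.eye(n)
    Aty = A.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(H, Aty + rho * (v - u))  # inversion (data) step
        v = denoise(x + u)                           # prior step (plugged in)
        u = u + x - v                                # dual variable update
    return x
```

    The design point worth noting is that the data model (the tomographic forward operator) and the learned prior stay decoupled, which is why a denoiser trained independently of the scanner geometry can still regularize the inversion.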