
    Inferring Sparsity: Compressed Sensing using Generalized Restricted Boltzmann Machines

    In this work, we consider compressed sensing reconstruction from M measurements of K-sparse structured signals which do not possess a writable correlation model. Assuming that a generative statistical model, such as a Boltzmann machine, can be trained in an unsupervised manner on example signals, we demonstrate how this signal model can be used within a Bayesian framework of signal reconstruction. By deriving a message-passing inference for general-distribution restricted Boltzmann machines, we are able to integrate these inferred signal models into approximate message passing for compressed sensing reconstruction. Finally, we show for the MNIST dataset that this approach can be very effective, even for M < K.
    Comment: IEEE Information Theory Workshop, 201
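
    The abstract does not spell out the reconstruction loop, so the following is a minimal sketch of the generic approximate message passing (AMP) iteration it builds on, with a simple soft-threshold denoiser standing in for the RBM-derived denoiser the paper actually trains; all names and parameter values here are illustrative.

    import numpy as np

    def soft_threshold(v, t):
        # Elementwise soft threshold: a placeholder for the paper's
        # Boltzmann-machine-based Bayesian denoiser.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def amp(y, A, n_iter=30, theta=0.1):
        m, n = A.shape
        x, z = np.zeros(n), y.copy()
        for _ in range(n_iter):
            v = x + A.T @ z                   # pseudo-data estimate
            x_new = soft_threshold(v, theta)  # prior-dependent denoising step
            b = np.count_nonzero(x_new) / m   # mean derivative of the denoiser
            z = y - A @ x_new + b * z         # residual plus Onsager correction
            x = x_new
        return x

    rng = np.random.default_rng(0)
    n, m, k = 200, 80, 10
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    print(np.linalg.norm(amp(A @ x_true, A) - x_true))

    Swapping soft_threshold for a posterior-mean denoiser derived from a trained Boltzmann machine is what turns this generic loop into the structured reconstruction the abstract describes.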

    Deeply Learning the Messages in Message Passing Inference

    Deep structured output learning shows great promise in tasks like semantic image segmentation. We propose a new, efficient deep structured model learning scheme, in which we show how deep Convolutional Neural Networks (CNNs) can be used to estimate the messages in message passing inference for structured prediction with Conditional Random Fields (CRFs). With such CNN message estimators, we obviate the need to learn or evaluate potential functions for message calculation. This yields significant efficiency gains for learning, since structured learning for a CRF with CNN potentials otherwise requires expensive inference at every stochastic gradient iteration. The network output dimension for message estimation is the same as the number of classes, in contrast to the network output for general CNN potential functions in CRFs, which is exponential in the order of the potentials. Hence CNN message learning has fewer network parameters and is more scalable in cases where a large number of classes are involved. We apply our method to semantic image segmentation on the PASCAL VOC 2012 dataset. We achieve an intersection-over-union score of 73.4 on its test set, which is the best reported result among methods using the VOC training images alone, demonstrating the effectiveness of our CNN message learning method.
    Comment: 11 pages. Appearing in Proc. The Twenty-ninth Annual Conference on Neural Information Processing Systems (NIPS), 2015, Montreal, Canada
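
    As a hypothetical sketch of the central idea (not the authors' code), the PyTorch model below maps local features directly to a C-dimensional message, so the output size grows linearly with the number of classes C rather than exponentially with the order of the potential.

    import torch
    import torch.nn as nn

    class MessageEstimator(nn.Module):
        # Estimates messages directly from features, skipping explicit
        # potential functions entirely.
        def __init__(self, in_channels, num_classes):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, 64, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(64, num_classes, kernel_size=3, padding=1),
            )

        def forward(self, feats):
            # feats: (batch, in_channels, H, W) local features;
            # returns (batch, num_classes, H, W), one message value per class.
            return self.net(feats)

    est = MessageEstimator(in_channels=16, num_classes=21)  # 21 PASCAL VOC classes
    print(est(torch.randn(1, 16, 64, 64)).shape)            # (1, 21, 64, 64)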

    Decentralized Generalized Approximate Message-Passing for Tree-Structured Networks

    Decentralized generalized approximate message-passing (GAMP) is proposed for compressed sensing from distributed generalized linear measurements in a tree-structured network. Consensus propagation is used to realize the average consensus required in GAMP via local communications between adjacent nodes. Decentralized GAMP is applicable to any tree-structured network, which need not have a central node connected to all other nodes. State evolution is used to analyze the asymptotic dynamics of decentralized GAMP for zero-mean independent and identically distributed Gaussian sensing matrices. The state evolution recursion for decentralized GAMP is proved to have the same fixed points as that for centralized GAMP when homogeneous measurements with an identical dimension in all nodes are considered. Furthermore, an existing long-memory proof strategy is used to prove that the state evolution recursion for decentralized GAMP with Bayes-optimal denoisers converges to a fixed point. These results imply that, for homogeneous measurements, the state evolution recursion for decentralized GAMP with Bayes-optimal denoisers converges to the Bayes-optimal fixed point when that fixed point is unique. Numerical results for decentralized GAMP are presented for the cases of linear measurements and clipping. As examples of tree-structured networks, a one-dimensional chain and a tree with no central nodes are considered.
    Comment: submitted to IEEE Trans. Inf. Theory
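
    As a toy illustration of the consensus primitive (our own simplification, not the paper's algorithm), average consensus on a one-dimensional chain can be computed exactly with one forward and one backward sweep of neighbour-to-neighbour messages; consensus propagation generalizes this idea to arbitrary trees.

    import numpy as np

    def chain_average(values):
        # Forward sweep: each node forwards the running sum to its right
        # neighbour; the backward sweep does the same right to left.
        fwd = np.cumsum(values)              # fwd[i] = sum(values[:i+1])
        bwd = np.cumsum(values[::-1])[::-1]  # bwd[i] = sum(values[i:])
        # Node i adds its two incoming sums; its own value is counted
        # twice, so it is subtracted once.
        total = fwd + bwd - values           # identical total at every node
        return total / len(values)

    x = np.array([1.0, 4.0, 2.0, 7.0])
    print(chain_average(x))  # every node agrees on the mean, 3.5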

    Graph Convolutional Matrix Completion

    We consider matrix completion for recommender systems from the point of view of link prediction on graphs. Interaction data such as movie ratings can be represented by a bipartite user-item graph with labeled edges denoting observed ratings. Building on recent progress in deep learning on graph-structured data, we propose a graph auto-encoder framework based on differentiable message passing on the bipartite interaction graph. Our model shows competitive performance on standard collaborative filtering benchmarks. In settings where complementary feature information or structured data such as a social network is available, our framework outperforms recent state-of-the-art methods.
    Comment: 9 pages, 3 figures, updated with additional experimental evaluation
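
    A rough sketch of one such message-passing step (our own simplification, not the released implementation): each user node aggregates the embeddings of the items it has rated, with a separate transform per rating level, followed by mean pooling and a nonlinearity.

    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_items, d, levels = 4, 5, 8, 5

    item_h = rng.normal(size=(n_items, d))       # item embeddings
    W = 0.1 * rng.normal(size=(levels, d, d))    # one transform per rating 1..5
    ratings = {(0, 1): 5, (0, 3): 3, (2, 4): 1}  # observed (user, item): rating

    user_h = np.zeros((n_users, d))
    deg = np.zeros(n_users)
    for (u, i), r in ratings.items():
        user_h[u] += item_h[i] @ W[r - 1]        # rating-specific message
        deg[u] += 1
    nz = deg > 0
    user_h[nz] /= deg[nz][:, None]               # mean-pool incoming messages
    user_h = np.maximum(user_h, 0.0)             # ReLU nonlinearity
    print(user_h.shape)                          # (4, 8) user embeddings

    A decoder would then score each user-item pair, for example with a bilinear form per rating level, to reconstruct the rating matrix.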

    Multilayer wave functions: A recursive coupling of local excitations

    Finding a succinct representation of the ground state of a disordered interacting system could be very helpful in understanding the interplay of interactions that is manifested in a quantum phase transition. In this work we use some elementary states to recursively construct an ansatz of multilayer wave functions, where in each step the higher-level wave function is represented by a superposition of the locally "excited states" obtained from the lower-level wave function. This allows us to write the Hamiltonian expectation in terms of local functions of the variational parameters and to employ an efficient message-passing algorithm to find the optimal parameters. We obtain good estimates of the ground-state energy and the phase transition point for the transverse Ising model with a few layers of mean-field and symmetric tree states. This work is a first step towards the application of local and distributed message-passing algorithms in the study of structured variational problems in finite dimensions.
    Comment: 23 pages, including 3 appendices and 6 figures. A shortened version published in EP
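
    The lowest layer of such a hierarchy is a product (mean-field) state. As a hedged illustration of the variational setup, the sketch below minimizes the product-state energy of a one-dimensional transverse-field Ising chain with a generic optimizer in place of the paper's message-passing scheme; the couplings and chain length are arbitrary choices.

    import numpy as np
    from scipy.optimize import minimize

    J, h, n = 1.0, 0.5, 16  # Ising coupling, transverse field, chain length

    def energy(t):
        # For the product state prod_i (cos t_i |up> + sin t_i |down>),
        # <s^z_i> = cos(2 t_i) and <s^x_i> = sin(2 t_i), so the expectation
        # of H = -J sum_i s^z_i s^z_{i+1} - h sum_i s^x_i factorizes:
        sz, sx = np.cos(2 * t), np.sin(2 * t)
        return -J * np.sum(sz[:-1] * sz[1:]) - h * np.sum(sx)

    res = minimize(energy, x0=np.full(n, 0.4))
    print(res.fun / n)  # variational ground-state energy per site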