
    Entropy Message Passing

    The paper proposes a new message passing algorithm for cycle-free factor graphs. The proposed "entropy message passing" (EMP) algorithm may be viewed as sum-product message passing over the entropy semiring, which has previously appeared in automata theory. The primary use of EMP is to compute the entropy of a model, but EMP can also be used to compute expressions that appear in expectation maximization and in gradient descent algorithms. Comment: 5 pages, 1 figure, to appear in IEEE Transactions on Information Theory.
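    A concrete way to see the entropy semiring (a hedged sketch, assuming the standard construction from automata theory, not code from the paper): elements are pairs (x, y) with componentwise addition and the product rule (x1, y1) * (x2, y2) = (x1*x2, x1*y2 + x2*y1); lifting each weight w to (w, -w log w) and summing over configurations recovers the entropy of the normalized model.

        # Minimal sketch of the entropy semiring; the flat toy "model" and
        # names are illustrative, not taken from the paper.
        import math

        def sr_add(a, b):
            # Semiring addition: componentwise sum.
            return (a[0] + b[0], a[1] + b[1])

        def sr_mul(a, b):
            # Semiring multiplication: product rule on the second slot.
            return (a[0] * b[0], a[0] * b[1] + a[1] * b[0])

        def lift(w):
            # Embed an ordinary non-negative weight into the semiring.
            return (w, -w * math.log(w)) if w > 0 else (0.0, 0.0)

        # Multiplication mirrors path concatenation: lift(a)*lift(b) == lift(a*b).
        a, b = 0.3, 0.7
        assert all(abs(u - v) < 1e-12
                   for u, v in zip(sr_mul(lift(a), lift(b)), lift(a * b)))

        # Entropy of p_i = w_i / Z, read off the semiring sum of lifted weights.
        weights = [0.2, 0.5, 1.3]
        Z, y = 0.0, 0.0
        for w in weights:
            Z, y = sr_add((Z, y), lift(w))
        H = math.log(Z) + y / Z                   # H = -sum_i p_i log p_i
        assert abs(H + sum((w / Z) * math.log(w / Z) for w in weights)) < 1e-12

    In EMP proper, sum-product messages on the factor graph would carry such pairs; the sketch only exercises the semiring on a flat, fully enumerated toy distribution.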

    Tree-AMP: Compositional Inference with Tree Approximate Message Passing

    We introduce Tree-AMP, standing for Tree Approximate Message Passing, a Python package for compositional inference in high-dimensional tree-structured models. The package provides a unifying framework to study several approximate message passing algorithms previously derived for a variety of machine learning tasks, such as generalized linear models, inference in multi-layer networks, matrix factorization, and reconstruction using non-separable penalties. For some models, the asymptotic performance of the algorithm can be theoretically predicted by the state evolution, and the measurement entropy can be estimated by the free entropy formalism. The implementation is modular by design: each module, which implements a factor, can be composed at will with other modules to solve complex inference tasks. The user only needs to declare the factor graph of the model: the inference algorithm, state evolution and entropy estimation are fully automated. Comment: Source code available at https://github.com/sphinxteam/tramp and documentation at https://sphinxteam.github.io/tramp.doc
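    To make the compositional idea concrete, here is a toy sketch of declaring a tree-structured model as a chain of factor modules. This is a hypothetical illustration only: the class names (GaussianPrior, LinearChannel, AbsActivation) and their methods are invented for this sketch and are not the tramp API.

        # Hypothetical sketch of composing factor modules (NOT the tramp API).
        import numpy as np

        rng = np.random.default_rng(0)

        class GaussianPrior:
            """Prior factor: x ~ N(0, I_n)."""
            def __init__(self, n):
                self.n = n
            def sample(self):
                return rng.standard_normal(self.n)

        class LinearChannel:
            """Deterministic factor: z = W x."""
            def __init__(self, W):
                self.W = W
            def apply(self, x):
                return self.W @ x

        class AbsActivation:
            """Non-linear factor: y = |z| (phase retrieval-style)."""
            def apply(self, z):
                return np.abs(z)

        # Declare a generalized linear model y = |W x| as a chain of modules;
        # an AMP-style engine would then pass messages along this same chain.
        n, m = 100, 60
        prior = GaussianPrior(n)
        channel = LinearChannel(rng.standard_normal((m, n)) / np.sqrt(n))
        activation = AbsActivation()
        x = prior.sample()
        y = activation.apply(channel.apply(x))

    In the package itself, declaring such a factor graph is what triggers the automated inference, state evolution and entropy estimation described above.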

    Multi Terminal Probabilistic Compressed Sensing

    In this paper, the "Approximate Message Passing" (AMP) algorithm, initially developed for compressed sensing of signals under i.i.d. Gaussian measurement matrices, is extended to a multi-terminal setting (the MAMP algorithm). It is shown that, similar to its single-terminal counterpart, the behavior of the MAMP algorithm is fully characterized by a "State Evolution" (SE) equation for large block-lengths. This equation is used to obtain the rate-distortion curve of a multi-terminal memoryless source. It is observed that by spatially coupling the measurement matrices, the rate-distortion curve of the MAMP algorithm undergoes a phase transition, where the measurement rate region corresponding to a low-distortion (approximately zero distortion) regime is fully characterized by the joint and conditional Rényi information dimension (RID) of the multi-terminal source. This measurement rate region is very similar to the rate region of the Slepian-Wolf distributed source coding problem, where the RID plays a role similar to the discrete entropy. Simulations investigate the empirical behavior of the MAMP algorithm; the results match the predictions of the SE equation very well for reasonably large block-lengths. Comment: 11 pages, 13 figures. arXiv admin note: text overlap with arXiv:1112.0708 by other authors.
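    For readers unfamiliar with state evolution, the following is a hedged Monte Carlo sketch of the standard single-terminal SE recursion for AMP with a soft-threshold denoiser, the setting the abstract builds on; the parameter names (delta, sigma2, alpha, eps) and the Bernoulli-Gaussian prior are illustrative choices, not taken from the paper.

        # Scalar SE recursion tau^2 <- sigma^2 + E[(eta(X + tau Z) - X)^2] / delta,
        # evaluated by Monte Carlo; illustrative single-terminal setting.
        import numpy as np

        rng = np.random.default_rng(0)

        def eta(u, t):
            # Soft-threshold denoiser.
            return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

        def state_evolution(delta=0.5, sigma2=0.01, alpha=1.5, eps=0.1,
                            iters=30, n=200_000):
            # Bernoulli-Gaussian signal prior with sparsity eps.
            X = rng.standard_normal(n) * (rng.random(n) < eps)
            tau2 = sigma2 + np.mean(X**2) / delta        # initialization
            for _ in range(iters):
                Z = rng.standard_normal(n)
                mse = np.mean((eta(X + np.sqrt(tau2) * Z,
                                   alpha * np.sqrt(tau2)) - X) ** 2)
                tau2 = sigma2 + mse / delta              # SE update
            return tau2

        print(state_evolution())   # fixed-point effective noise level

    The MAMP analysis tracks an analogous multi-terminal recursion, which is what the reported simulations are matched against.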

    Towards Optimal Moment Estimation in Streaming and Distributed Models

    One of the oldest problems in the data stream model is to approximate the p-th moment ||X||_p^p = sum_{i=1}^n X_i^p of an underlying non-negative vector X in R^n, which is presented as a sequence of poly(n) updates to its coordinates. Of particular interest is the case p in (0,2]. Although a tight space bound of Theta(epsilon^-2 log n) bits is known for this problem when both positive and negative updates are allowed, surprisingly there is still a gap in the space complexity when all updates are positive. Specifically, the upper bound is O(epsilon^-2 log n) bits, while the lower bound is only Omega(epsilon^-2 + log n) bits. Recently, an upper bound of O~(epsilon^-2 + log n) bits was obtained under the assumption that the updates arrive in a random order. We show that for p in (0,1], the random order assumption is not needed. Namely, we give an upper bound for worst-case streams of O~(epsilon^-2 + log n) bits for estimating ||X||_p^p. Our techniques also give new upper bounds for estimating the empirical entropy in a stream. On the other hand, we show that for p in (1,2], in the natural coordinator and blackboard distributed communication topologies, there is an O~(epsilon^-2) bit max-communication upper bound based on a randomized rounding scheme. Our protocols also give rise to protocols for heavy hitters and approximate matrix product. We generalize our results to arbitrary communication topologies G, obtaining an O~(epsilon^-2 log d) bit max-communication upper bound, where d is the diameter of G. Interestingly, our upper bound rules out natural communication complexity-based approaches for proving an Omega(epsilon^-2 log n) bit lower bound for p in (1,2] for streaming algorithms. In particular, any such lower bound must come from a topology with large diameter.
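    The randomized rounding primitive behind such protocols can be sketched as follows (a hedged illustration of the unbiased-rounding idea, not the paper's actual protocol; the parameter gamma and the function name are invented for this sketch): each positive coordinate is rounded to one of the two nearest powers of (1 + gamma), with probabilities chosen so that the rounded value is unbiased, letting sites communicate coarse, cheaply encodable values while preserving sums and moment estimates in expectation.

        # Unbiased randomized rounding to powers of (1 + gamma); illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)

        def round_unbiased(x, gamma=0.25):
            # Round x > 0 to (1+gamma)^k or (1+gamma)^(k+1) so that E[output] = x.
            if x <= 0:
                return 0.0
            k = np.floor(np.log(x) / np.log(1 + gamma))
            lo, hi = (1 + gamma) ** k, (1 + gamma) ** (k + 1)
            p = (x - lo) / (hi - lo)          # P[hi] = p  =>  p*hi + (1-p)*lo = x
            return hi if rng.random() < p else lo

        X = rng.random(10_000) * 5.0          # a site's non-negative coordinates
        Xr = np.array([round_unbiased(x) for x in X])
        print(X.sum(), Xr.sum())              # agree in expectation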