
    Structure Adaptive Lasso

    Lasso is of fundamental importance in high-dimensional statistics and has been routinely used to regress a response on a high-dimensional set of predictors. In many scientific applications, there exists external information that encodes the predictive power and sparsity structure of the predictors. In this article, we develop a new method, called the Structure Adaptive Lasso (SA-Lasso), to incorporate this potentially useful side information into a penalized regression. The basic idea is to translate the external information into different penalization strengths for the regression coefficients. We study the risk properties of the resulting estimator. In particular, we generalize the state evolution framework recently introduced for the analysis of the approximate message-passing algorithm to the SA-Lasso setting. We show that the finite-sample risk of the SA-Lasso estimator is consistent with the theoretical risk predicted by the state evolution equation. Our theory suggests that the SA-Lasso with an informative group or covariate structure can significantly outperform the Lasso, Adaptive Lasso, and Sparse Group Lasso. This evidence is further confirmed in our numerical studies. We also demonstrate the usefulness and superiority of our method in a real data application. Comment: 42 pages, 24 figures
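    The core mechanism described here, coefficient-specific penalty strengths derived from side information, can be illustrated with a standard weighted lasso. The sketch below is not the authors' SA-Lasso procedure (which also specifies how the weights are built from the external information); the particular weights, the alpha value, and the column-rescaling trick are illustrative assumptions. The rescaling works because penalizing |b_j| with weight w_j is equivalent to a plain lasso after dividing column j of X by w_j and mapping the solution back.

```python
import numpy as np
from sklearn.linear_model import Lasso

def weighted_lasso(X, y, weights, alpha=0.1):
    """Lasso with per-coefficient penalty weights, via column rescaling.

    Solves  min_b (1/2n)||y - Xb||^2 + alpha * sum_j weights[j] * |b_j|
    by fitting a standard Lasso to X with column j scaled by 1/weights[j].
    """
    w = np.asarray(weights, dtype=float)
    model = Lasso(alpha=alpha, fit_intercept=False)
    model.fit(X / w, y)       # broadcast: column j divided by w[j]
    return model.coef_ / w    # map the solution back to the original scale

# Hypothetical example: side information suggests the first 5 predictors
# are likely active, so they receive smaller penalty weights.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + 0.5 * rng.standard_normal(n)
weights = np.r_[0.2 * np.ones(5), np.ones(45)]
print(weighted_lasso(X, y, weights, alpha=0.1)[:8])
```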

    Approximate Message Passing with Restricted Boltzmann Machine Priors

    Approximate Message Passing (AMP) has been shown to be an excellent statistical approach to signal inference and compressed sensing problems. The AMP framework provides modularity in the choice of signal prior; here we propose a hierarchical form of the Gauss-Bernoulli prior which utilizes a Restricted Boltzmann Machine (RBM) trained on the signal support to push reconstruction performance beyond that of simple i.i.d. priors for signals whose support can be well represented by a trained binary RBM. We present and analyze two methods of RBM factorization and demonstrate how these affect signal reconstruction performance within our proposed algorithm. Finally, using the MNIST handwritten digit dataset, we show experimentally that using an RBM allows AMP to approach oracle-support performance.
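    The modularity mentioned above sits in the denoising step of the AMP iteration. The sketch below uses a scalar soft-threshold denoiser as a stand-in for a simple i.i.d. sparse prior; this paper's contribution is precisely to replace such a denoiser with an RBM-informed one. The threshold rule is a common heuristic, not this paper's choice.

```python
import numpy as np

def soft_threshold(u, tau):
    """Soft-threshold denoiser; an RBM-based prior would replace this step."""
    return np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)

def amp(A, y, n_iters=30, tau_scale=1.0):
    """Basic AMP for y = A x + noise with an i.i.d. sparse prior.

    Assumes A has roughly i.i.d. entries with variance 1/M. The denoiser
    is the modular component: swapping soft thresholding for a structured
    denoiser changes the effective prior.
    """
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iters):
        tau = tau_scale * np.sqrt(np.mean(z ** 2))  # estimated noise level
        u = x + A.T @ z                             # pseudo-data
        x_new = soft_threshold(u, tau)
        # Onsager correction: (N/M) * average derivative of the denoiser
        onsager = (N / M) * np.mean(np.abs(u) > tau)
        z = y - A @ x_new + onsager * z
        x = x_new
    return x
```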

    An Overview of Multi-Processor Approximate Message Passing

    Approximate message passing (AMP) is an algorithmic framework for solving linear inverse problems from noisy measurements, with exciting applications such as reconstructing images, audio, hyperspectral images, and various other signals, including those acquired in compressive signal acquisition systems. The growing prevalence of big data systems has increased interest in large-scale problems, which may involve huge measurement matrices that are unsuitable for conventional computing systems. To address the challenge of large-scale processing, multiprocessor (MP) versions of AMP have been developed. We provide an overview of two such MP-AMP variants. In row-MP-AMP, each computing node stores a subset of the rows of the matrix and processes the corresponding measurements. In column-MP-AMP, each node stores a subset of the columns and is solely responsible for reconstructing a portion of the signal. We discuss the pros and cons of both approaches, summarize recent research results for each, and explain when each one may be a viable approach. Highlights include recent results on state evolution for both MP-AMP algorithms and the use of data compression to reduce communication in the MP network.
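    To make the row-partitioned structure concrete, here is a serial simulation of one plausible row-MP-AMP arrangement: each "node" holds a row block and its residual, and a fusion step runs the shared denoiser. The soft-threshold denoiser, threshold rule, and uncompressed messages are illustrative assumptions, not the specific schemes surveyed in the paper.

```python
import numpy as np

def row_mp_amp(A_parts, y_parts, N, n_iters=30):
    """Row-partitioned MP-AMP, simulated serially.

    Each node p holds a row block (A_p, y_p), keeps its own residual z_p,
    and sends A_p^T z_p to a fusion step that runs the shared denoiser.
    Communication per iteration is one length-N vector per node (sent in
    the clear here; the literature compresses these messages).
    """
    M = sum(yp.size for yp in y_parts)
    x = np.zeros(N)
    z_parts = [yp.copy() for yp in y_parts]
    for _ in range(n_iters):
        # Each node correlates its row block with its local residual.
        corr = sum(Ap.T @ zp for Ap, zp in zip(A_parts, z_parts))
        u = x + corr
        tau = np.sqrt(sum(np.sum(zp ** 2) for zp in z_parts) / M)
        x_new = np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)
        onsager = (N / M) * np.mean(np.abs(u) > tau)
        # Nodes update local residuals after receiving the new estimate.
        z_parts = [yp - Ap @ x_new + onsager * zp
                   for Ap, yp, zp in zip(A_parts, y_parts, z_parts)]
        x = x_new
    return x

# Example: split a 200x400 problem across 4 nodes by rows.
rng = np.random.default_rng(1)
M, N = 200, 400
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, 20, replace=False)] = 1.0
y = A @ x_true
x_hat = row_mp_amp(np.array_split(A, 4), np.array_split(y, 4), N)
```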

    A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms

    The benefits of automating design cycles for Bayesian inference-based algorithms are becoming increasingly recognized by the machine learning community. As a result, interest in probabilistic programming frameworks has grown considerably over the past few years. This paper explores a specific probabilistic programming paradigm, namely message passing in Forney-style factor graphs (FFGs), in the context of automated design of efficient Bayesian signal processing algorithms. To this end, we developed "ForneyLab" (https://github.com/biaslab/ForneyLab.jl), a Julia toolbox for message passing-based inference in FFGs. We show by example how ForneyLab enables automatic derivation of Bayesian signal processing algorithms, including algorithms for parameter estimation and model comparison. Crucially, due to the modular makeup of the FFG framework, both the model specification and the inference methods are readily extensible in ForneyLab. To test this framework, we compared variational message passing as implemented by ForneyLab with automatic differentiation variational inference (ADVI) and Monte Carlo methods as implemented by the state-of-the-art tools "Edward" and "Stan". In terms of performance, extensibility, and stability, ForneyLab appears to hold an edge over its competitors for automated inference in state-space models. Comment: Accepted for publication in the International Journal of Approximate Reasoning
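    For readers unfamiliar with the paradigm, the following is a minimal numpy illustration of the sum-product message passing that a toolbox like ForneyLab automates, here on a hand-built two-variable discrete chain. This is not ForneyLab code, and the probability tables are made up for illustration.

```python
import numpy as np

# A tiny discrete factor graph:  prior -- x1 -- f(x1, x2) -- x2 -- likelihood
prior = np.array([0.6, 0.4])       # p(x1)
f = np.array([[0.9, 0.1],          # p(x2 | x1), rows indexed by x1
              [0.2, 0.8]])
likelihood = np.array([0.3, 0.7])  # p(observation | x2)

# Sum-product messages flow along the edges of the graph.
msg_x1_to_f = prior                      # forward message from the prior
msg_f_to_x2 = msg_x1_to_f @ f            # marginalize x1 out of the factor
posterior_x2 = msg_f_to_x2 * likelihood  # combine with the likelihood message
posterior_x2 /= posterior_x2.sum()       # normalize to get the marginal
print(posterior_x2)
```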