
    Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

    In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of probability densities referred to as the Rectified Gaussian Scale Mixture (R-GSM) to model the sparsity-enforcing prior distribution for the solution. The R-GSM prior encompasses a variety of heavy-tailed densities, such as the rectified Laplacian and rectified Student-t distributions, with a proper choice of the mixing density. We utilize the hierarchical representation induced by the R-GSM prior and develop an evidence maximization framework based on the Expectation-Maximization (EM) algorithm. Using the EM-based method, we estimate the hyper-parameters and obtain a point estimate for the solution. We refer to the proposed method as rectified sparse Bayesian learning (R-SBL). We provide four R-SBL variants that offer a range of options for computational complexity and the quality of the E-step computation. These methods include Markov chain Monte Carlo EM, linear minimum mean-square-error estimation, approximate message passing and a diagonal approximation. Using numerical experiments, we show that the proposed R-SBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery performance, and is also very robust against the structure of the design matrix.
    Comment: Under review by IEEE Transactions on Signal Processing
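
    As a point of reference for the problem being solved, the sketch below sets up a toy S-NNLS instance (a sparse non-negative vector observed through a random design matrix) and solves it with a plain non-negative least squares baseline via scipy.optimize.nnls. This is not the paper's R-SBL algorithm, and the problem sizes and noise level are illustrative assumptions.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n, m, k = 50, 200, 10                          # measurements, unknowns, non-zeros (assumed)
        A = rng.standard_normal((n, m))                # design matrix
        x_true = np.zeros(m)
        x_true[rng.choice(m, k, replace=False)] = rng.uniform(0.5, 2.0, size=k)
        y = A @ x_true + 0.01 * rng.standard_normal(n)

        # Baseline NNLS solve; unlike R-SBL it carries no sparsity-promoting prior.
        x_hat, residual_norm = nnls(A, y)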

    Variational Bayes with Intractable Likelihood

    Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian inference in statistical modeling. However, the existing VB algorithms are restricted to cases where the likelihood is tractable, which precludes the use of VB in many interesting situations such as in state space models and in approximate Bayesian computation (ABC), where application of VB methods was previously impossible. This paper extends the scope of application of VB to cases where the likelihood is intractable, but can be estimated unbiasedly. The proposed VB method therefore makes it possible to carry out Bayesian inference in many statistical applications, including state space models and ABC. The method is generic in the sense that it can be applied to almost all statistical models without requiring too much model-based derivation, which is a drawback of many existing VB algorithms. We also show how the proposed method can be used to obtain highly accurate VB approximations of marginal posterior distributions.
    Comment: 40 pages, 6 figures
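
    The key ingredient the method relies on is an unbiased estimator of an intractable likelihood. As a minimal sketch of that ingredient, the code below forms an importance-sampling estimate of the marginal likelihood in an assumed toy latent-variable model (x ~ N(theta, 1), y | x ~ N(x, 1)); the model and particle count are illustrative assumptions, not the paper's examples.

        import numpy as np

        def unbiased_likelihood(y, theta, n_particles=1000, rng=None):
            # Draw the latent x from its prior N(theta, 1) and average the
            # conditional density p(y | x); the average is unbiased for p(y | theta).
            rng = np.random.default_rng() if rng is None else rng
            x = rng.normal(theta, 1.0, size=n_particles)
            return np.mean(np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi))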

    Non-convex image reconstruction via Expectation Propagation

    Tomographic image reconstruction can be mapped to the problem of finding solutions to a large system of linear equations that maximize a function encoding a priori knowledge about features of typical images, such as smoothness or sharpness. This maximization can be performed with standard local optimization tools when the function is concave, but it is generally intractable for realistic priors, which are non-concave. We introduce a new method to reconstruct images obtained from Radon projections by using Expectation Propagation, which allows us to reframe the problem from a Bayesian inference perspective. We show, by means of extensive simulations, that, compared to state-of-the-art algorithms for this task, Expectation Propagation paired with very simple but non-log-concave priors is often able to reconstruct images with a smaller error while using less information per pixel. We provide estimates for the critical rate of information per pixel above which recovery is error-free, by means of simulations on ensembles of phantom and real images.
    Comment: 12 pages, 6 figures
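
    For orientation, the sketch below writes out one way to express the kind of objective described above: a Gaussian likelihood for the linear (Radon-type) measurements combined with a separable, non-log-concave pixel prior. The two-component Laplace mixture prior and the noise variance are assumptions chosen only for illustration; the Expectation Propagation machinery itself is not reproduced here.

        import numpy as np

        def log_prior(x, scale1=1.0, scale2=5.0):
            # A simple non-log-concave sparsity prior: an equal mixture of two Laplace densities.
            p1 = np.exp(-np.abs(x) / scale1) / (2.0 * scale1)
            p2 = np.exp(-np.abs(x) / scale2) / (2.0 * scale2)
            return np.log(0.5 * p1 + 0.5 * p2)

        def log_posterior(x, A, y, sigma2):
            # Gaussian likelihood for linear measurements y = A @ x + noise, plus the pixel prior.
            r = y - A @ x
            return -0.5 * (r @ r) / sigma2 + log_prior(x).sum()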

    Fourth Moments and Independent Component Analysis

    In independent component analysis it is assumed that the components of the observed random vector are linear combinations of latent independent random variables, and the aim is then to find an estimate for a transformation matrix back to these independent components. In the engineering literature, there are several traditional estimation procedures based on the use of fourth moments, such as FOBI (fourth order blind identification), JADE (joint approximate diagonalization of eigenmatrices), and FastICA, but the statistical properties of these estimates are not well known. In this paper various independent component functionals based on the fourth moments are discussed in detail, starting with the corresponding optimization problems, deriving the estimating equations and estimation algorithms, and finding asymptotic statistical properties of the estimates. Comparisons of the asymptotic variances of the estimates in wide independent component models show that in most cases JADE and the symmetric version of FastICA perform better than their competitors.
    Comment: Published at http://dx.doi.org/10.1214/15-STS520 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org)
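
    Of the fourth-moment procedures mentioned, FOBI is compact enough to sketch in full: whiten the data, form the fourth-moment matrix E[||z||^2 z z^T] of the whitened data, and take its eigenvectors as the unmixing directions. The implementation below is a minimal version of that recipe and assumes the independent components have distinct kurtoses (the condition under which FOBI separates them) and that the data covariance has full rank.

        import numpy as np

        def fobi(X):
            # X: (n_samples, n_features) data matrix. Returns an unmixing matrix W
            # such that (X - mean) @ W.T recovers the independent components up to
            # sign, scale and permutation.
            Xc = X - X.mean(axis=0)
            cov = np.cov(Xc, rowvar=False)
            vals, vecs = np.linalg.eigh(cov)
            W_white = vecs @ np.diag(vals ** -0.5) @ vecs.T   # symmetric whitening matrix
            Z = Xc @ W_white.T
            # Fourth-moment matrix E[||z||^2 z z^T] of the whitened data.
            B = (Z * (Z ** 2).sum(axis=1, keepdims=True)).T @ Z / len(Z)
            _, U = np.linalg.eigh(B)
            return U.T @ W_white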

    Multivariate Bernoulli distribution

    In this paper, we consider the multivariate Bernoulli distribution as a model for estimating the structure of graphs with binary nodes. The distribution is discussed in the framework of the exponential family, and its statistical properties regarding independence of the nodes are demonstrated. Importantly, the model can estimate not only the main effects and pairwise interactions among the nodes but also higher-order interactions, allowing for the existence of complex clique effects. We compare the multivariate Bernoulli model with existing graphical inference models, namely the Ising model and the multivariate Gaussian model, in which only pairwise interactions are considered. In addition, the multivariate Bernoulli distribution has the interesting property that independence and uncorrelatedness of the component random variables are equivalent. Both the marginal and conditional distributions of a subset of variables in the multivariate Bernoulli distribution still follow the multivariate Bernoulli distribution. Furthermore, the multivariate Bernoulli logistic model is developed under generalized linear model theory by utilizing the canonical link function, in order to include covariate information on the nodes, edges and cliques. We also consider variable selection techniques such as the LASSO in the logistic model to impose a sparsity structure on the graph. Finally, we discuss extending the smoothing spline ANOVA approach to the multivariate Bernoulli logistic model to enable estimation of non-linear effects of the predictor variables.
    Comment: Published at http://dx.doi.org/10.3150/12-BEJSP10 in the Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
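
    To make the exponential-family (log-linear) representation concrete, the sketch below builds the full probability table of a small multivariate Bernoulli model from natural parameters attached to nodes, edges and one higher-order clique. The parameter values in the example are arbitrary assumptions used only for illustration.

        import numpy as np
        from itertools import product

        def multivariate_bernoulli_pmf(f):
            # f maps a tuple of node indices (singletons, pairs, ..., cliques) to its
            # natural parameter; the log-probability of a configuration y is, up to a
            # normalizing constant, the sum of parameters whose clique is entirely "on" in y.
            K = max(max(idx) for idx in f) + 1
            configs = list(product([0, 1], repeat=K))
            logits = np.array([sum(param for idx, param in f.items() if all(y[i] for i in idx))
                               for y in configs])
            probs = np.exp(logits - logits.max())
            return dict(zip(configs, probs / probs.sum()))

        # Toy example: three main effects, one pairwise interaction, one third-order clique effect.
        pmf = multivariate_bernoulli_pmf({(0,): 0.5, (1,): -0.3, (2,): 0.1,
                                          (0, 1): 1.0, (0, 1, 2): -0.5})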

    Higher order sigma point filter: A new heuristic for nonlinear time series filtering

    In this paper we present some new results related to the higher order sigma point filter (HOSPoF), introduced in [1] for filtering nonlinear multivariate time series. This paper makes two distinct contributions. Firstly, we propose a new algorithm to generate a discrete statistical distribution that exactly matches a specified mean vector, a specified covariance matrix, the average of the specified marginal skewness and the average of the specified marginal kurtosis. Both the sigma points and the probability weights are given in closed form and no numerical optimization is required. Combined with HOSPoF, this random sigma point generation algorithm provides a new method for generating a proposal density which propagates information about higher order moments. A numerical example on a nonlinear, multivariate time series involving real financial market data demonstrates the utility of this new algorithm. Secondly, we show that HOSPoF achieves higher-order estimation accuracy than the unscented Kalman filter (UKF) for smooth scalar nonlinearities. We believe that this new filter provides a powerful alternative heuristic to existing filtering algorithms and is useful especially in econometrics and in engineering applications.
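
    The paper's closed-form higher-moment sigma points are not reproduced here. As a baseline for comparison, the sketch below generates the classical unscented (UKF) sigma points, which match only the mean and covariance; HOSPoF extends this construction by also matching the average marginal skewness and kurtosis. The scaling parameter kappa is a conventional choice, not a value taken from the paper.

        import numpy as np

        def unscented_sigma_points(mean, cov, kappa=0.0):
            # 2n+1 points and weights that exactly reproduce the given mean and covariance;
            # requires kappa > -n, and kappa = 3 - n is a common Gaussian-motivated choice.
            n = len(mean)
            L = np.linalg.cholesky((n + kappa) * np.asarray(cov))
            points = np.vstack([mean, mean + L.T, mean - L.T])
            weights = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
            weights[0] = kappa / (n + kappa)
            return points, weights

        # Example: a two-dimensional state with an assumed covariance.
        pts, w = unscented_sigma_points(np.array([0.0, 1.0]),
                                        np.array([[1.0, 0.3], [0.3, 2.0]]), kappa=1.0)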