
    Vector Approximate Message Passing for the Generalized Linear Model

    The generalized linear model (GLM), where a random vector $\boldsymbol{x}$ is observed through a noisy, possibly nonlinear, function of a linear transform output $\boldsymbol{z}=\boldsymbol{Ax}$, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When $\boldsymbol{A}$ is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general $\boldsymbol{A}$, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a "vector AMP" (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian $\boldsymbol{A}$ to the larger class of rotationally invariant $\boldsymbol{A}$. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in $\boldsymbol{A}$ than damped GAMP.
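
As an illustration of the observation model in the abstract, the sketch below (not the paper's code; the 1-bit channel and the condition number of 10^3 are assumed examples) simulates a GLM measurement $\boldsymbol{y} = \mathrm{sign}(\boldsymbol{Ax} + \boldsymbol{w})$, i.e. quantized compressed sensing, and also constructs a rotationally invariant, ill-conditioned $\boldsymbol{A}$ of the kind for which the abstract claims GLM-VAMP remains robust.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 512, 1024  # signal and measurement dimensions (assumed for the demo)

# GLM observation: x is seen through a noisy, componentwise nonlinear
# function of the linear transform output z = A x.
x = rng.standard_normal(N)                    # random signal vector
A = rng.standard_normal((M, N)) / np.sqrt(N)  # i.i.d. Gaussian A
z = A @ x                                     # linear transform output
w = 0.1 * rng.standard_normal(M)              # additive channel noise
y = np.sign(z + w)                            # 1-bit (quantized) observation

# A rotationally invariant A is U S V^T with Haar-distributed U, V and
# arbitrary singular values S. Fixing a wide singular-value spread gives
# the ill-conditioned regime where plain GAMP tends to misbehave.
U, _ = np.linalg.qr(rng.standard_normal((M, M)))
V, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = np.logspace(0, -3, N)                     # condition number 1e3 (assumed)
A_ill = U[:, :N] @ np.diag(s) @ V.T
```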