138 research outputs found

    Precise Phase Transition of Total Variation Minimization

    Characterizing the phase transitions of convex optimization methods for recovering structured signals or data is of central importance in compressed sensing, machine learning, and statistics. The phase transitions of many convex signal recovery methods, such as ℓ1 minimization and nuclear norm minimization, have become well understood through research in recent years. However, rigorously characterizing the phase transition of total variation (TV) minimization in recovering sparse-gradient signals has remained open. In this paper, we fully characterize the phase transition curve of TV minimization. Our proof builds on Donoho, Johnstone and Montanari's conjectured phase transition curve for the TV approximate message passing (AMP) algorithm, together with the linkage between the minimax mean squared error of a denoising problem and the high-dimensional convex geometry of TV minimization.
    Comment: 6 pages
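    As a rough illustration of the recovery problem this abstract studies (not code from the paper), TV minimization for a sparse-gradient signal x observed through y = Ax can be posed as minimizing ||Dx||_1 subject to Ax = y, where D is the first-difference operator. The sketch below uses NumPy and CVXPY; the sensing matrix, problem sizes, and test signal are all invented for illustration.

```python
# Minimal sketch (not the paper's code): recover a piecewise-constant
# signal x from compressed measurements y = A x via TV minimization,
#     minimize ||D x||_1  subject to  A x = y,
# where D is the first-difference operator. All sizes are arbitrary.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 200, 120                                 # signal length, number of measurements

# Piecewise-constant ground truth (sparse gradient)
x_true = np.zeros(n)
x_true[40:90] = 1.5
x_true[130:160] = -0.7

A = rng.standard_normal((m, n)) / np.sqrt(m)    # Gaussian sensing matrix
y = A @ x_true                                  # noiseless measurements

D = np.diff(np.eye(n), axis=0)                  # (n-1) x n first-difference matrix

x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(D @ x)), [A @ x == y])
problem.solve()

print("relative recovery error:",
      np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))
```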

    Empirical Bayes and Full Bayes for Signal Estimation

    We consider signals that follow a parametric distribution with unknown parameter values. To estimate such signals from noisy measurements in scalar channels, we study the empirical performance of an empirical Bayes (EB) approach and a full Bayes (FB) approach. We then apply EB and FB to compressed sensing (CS) signal estimation problems by successively denoising a scalar Gaussian channel within an approximate message passing (AMP) framework. Our numerical results show that FB achieves better performance than EB in scalar channel denoising problems when the signal dimension is small. In the CS setting, the signal dimension must be large enough for AMP to work well; for large signal dimensions, AMP performs similarly with FB and EB.
    Comment: This work was presented at the Information Theory and Applications (ITA) workshop, San Diego, CA, Feb. 201
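    A schematic of the AMP framework the abstract refers to (not the paper's code): each iteration forms a pseudo-observation that behaves like the true signal plus scalar Gaussian noise, applies a scalar denoiser, and corrects the residual with an Onsager term. The sketch below plugs in soft thresholding as a stand-in denoiser; the paper's EB and FB denoisers are not reproduced, and the problem sizes and threshold rule are assumptions for illustration.

```python
# Minimal AMP sketch for compressed sensing, y = A x + noise, with soft
# thresholding as the scalar denoiser. The paper would plug EB/FB
# denoisers into the same slot; soft thresholding is only a stand-in.
import numpy as np

def soft_threshold(v, t):
    """Scalar denoiser eta(v) = sign(v) * max(|v| - t, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp(A, y, n_iter=30, alpha=1.5):
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()                                   # residual
    for _ in range(n_iter):
        sigma = np.linalg.norm(z) / np.sqrt(m)     # effective noise level estimate
        r = x + A.T @ z                            # pseudo-data: scalar Gaussian channel
        x_new = soft_threshold(r, alpha * sigma)   # denoise the scalar channel
        # Onsager correction keeps the effective noise approximately Gaussian
        onsager = (z / m) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Tiny demo with a synthetic sparse signal (sizes chosen arbitrarily)
rng = np.random.default_rng(1)
n, m, k = 500, 250, 25
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = amp(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```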
    • …