55 research outputs found

    Sparse Inverse Problems Over Measures: Equivalence of the Conditional Gradient and Exchange Methods

    We study an optimization program over nonnegative Borel measures that encourages sparsity in its solution. Efficient solvers for this program are in increasing demand, as it arises when learning from data generated by a `continuum-of-subspaces' model, a recent trend with applications in signal processing, machine learning, and high-dimensional statistics. We prove that the conditional gradient method (CGM, also known as the Frank-Wolfe algorithm) applied to this infinite-dimensional program, as proposed recently in the literature, is equivalent to the exchange method (EM) applied to its Lagrangian dual, which is a semi-infinite program. In doing so, we formally connect such infinite-dimensional programs to the well-established field of semi-infinite programming. On the one hand, the equivalence established in this paper allows us to provide a rate of convergence for EM that is more general than those existing in the literature. On the other hand, this connection and the resulting geometric insights may in the future lead to the design of improved variants of CGM for infinite-dimensional programs, an active research topic.
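    The abstract's paper works over infinite-dimensional measure spaces, but the mechanics of CGM are easiest to see in its finite-dimensional analogue, the classical Frank-Wolfe algorithm: each iteration calls a linear minimization oracle to pick a single atom and takes a convex-combination step toward it, so iterates stay sparse. The sketch below is a minimal illustration on the probability simplex with made-up problem data; it is not the measure-space algorithm from the paper.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=100):
    """Minimize a smooth convex f over the probability simplex, given grad f.

    Each step, the linear minimization oracle over the simplex returns the
    vertex (standard basis "atom") with the most negative gradient entry,
    so the iterate is a sparse convex combination of the atoms found so far.
    """
    x = x0.copy()
    for t in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0           # LMO: best vertex of the simplex
        gamma = 2.0 / (t + 2.0)         # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative least-squares instance: f(x) = 0.5 * ||A x - b||^2,
# with b generated by a 1-sparse point of the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = A @ np.eye(50)[0]
x = frank_wolfe_simplex(lambda x: A.T @ (A @ x - b), np.full(50, 1 / 50))
```

    The equivalence result in the paper says that, in the infinite-dimensional setting, running this kind of atom-by-atom scheme on the primal is the same as running an exchange method on the semi-infinite dual.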

    Sparse non-negative super-resolution -- simplified and stabilised

    The convolution of a discrete measure, $x=\sum_{i=1}^k a_i\delta_{t_i}$, with a local window function, $\phi(s-t)$, is a common model for a measurement device whose resolution is substantially lower than that of the objects being observed. Super-resolution concerns localising the point sources $\{a_i,t_i\}_{i=1}^k$ with an accuracy beyond the essential support of $\phi(s-t)$, typically from $m$ samples $y(s_j)=\sum_{i=1}^k a_i\phi(s_j-t_i)+\eta_j$, where $\eta_j$ indicates an inexactness in the sample value. We consider the setting of $x$ being non-negative and seek to characterise all non-negative measures approximately consistent with the samples. We first show that $x$ is the unique non-negative measure consistent with the samples provided the samples are exact, i.e. $\eta_j=0$, $m\ge 2k+1$ samples are available, and $\phi(s-t)$ generates a Chebyshev system. This is independent of how close the sample locations are and {\em does not rely on any regulariser beyond non-negativity}; as such, it extends and clarifies the work by Schiebinger et al. and De Castro et al., who achieve the same results but require a total variation regulariser, which we show is unnecessary. Moreover, we characterise non-negative solutions $\hat{x}$ consistent with the samples within the bound $\sum_{j=1}^m\eta_j^2\le \delta^2$. Any such non-negative measure is within $\mathcal{O}(\delta^{1/7})$ of the discrete measure $x$ generating the samples in the generalised Wasserstein distance, converging to one another as $\delta$ approaches zero. We also show how to make these general results, for windows that form a Chebyshev system, precise for the case of $\phi(s-t)$ being a Gaussian window. The main innovation of these results is that non-negativity alone is sufficient to localise point sources beyond the essential sensor resolution. Comment: 59 pages, 7 figures
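    The sampling model $y(s_j)=\sum_i a_i\phi(s_j-t_i)$ and the role of non-negativity can be illustrated numerically. The sketch below simulates exact samples of $k=2$ point sources through a Gaussian window and recovers a non-negative measure by non-negative least squares on a fixed grid. Grid-based NNLS is a simple stand-in for the continuous non-negative recovery analysed in the paper, and all numbers (window width, source locations, grid) are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.optimize import nnls

def phi(s):
    # Gaussian window (assumed shape and width for illustration)
    return np.exp(-s**2 / (2 * 0.05**2))

t_true = np.array([0.3, 0.7])    # point-source locations t_i
a_true = np.array([1.0, 0.5])    # non-negative amplitudes a_i (k = 2)

s = np.linspace(0, 1, 11)        # m = 11 >= 2k + 1 sample locations s_j
y = phi(s[:, None] - t_true[None, :]) @ a_true   # exact samples (eta_j = 0)

grid = np.linspace(0, 1, 101)                    # candidate source locations
A = phi(s[:, None] - grid[None, :])              # m x N sampling matrix
x_hat, resid = nnls(A, y)                        # recovery with only x >= 0
```

    Because the true locations lie on the grid and the samples are exact, the non-negative fit reproduces the data essentially perfectly, without any total variation regulariser, mirroring the abstract's point that non-negativity alone suffices.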

    Computing second-order points under equality constraints: revisiting Fletcher's augmented Lagrangian

    We address the problem of minimizing a smooth function under smooth equality constraints. Under regularity assumptions on the feasible set, we consider a smooth exact penalty function known as Fletcher's augmented Lagrangian. We propose an algorithm to minimize the penalized cost function which reaches $\varepsilon$-approximate second-order critical points of the original optimization problem in at most $\mathcal{O}(\varepsilon^{-3})$ iterations. This improves on the current best theoretical bounds. Along the way, we show new properties of Fletcher's augmented Lagrangian, which may be of independent interest.
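    For readers unfamiliar with the penalty function named in the abstract: for $\min f(x)$ subject to $h(x)=0$, Fletcher's augmented Lagrangian is the smooth function $\phi_\beta(x) = f(x) - h(x)^\top \lambda(x) + \beta\|h(x)\|^2$, where $\lambda(x)$ is the least-squares multiplier estimate at $x$. The sketch below evaluates and minimizes it on a toy problem of our own choosing (not from the paper): $f(x)=\tfrac12\|x\|^2$ with a single affine constraint $a^\top x = 1$, whose constrained minimizer is $x^* = a/\|a\|^2$.

```python
import numpy as np
from scipy.optimize import minimize

a = np.array([1.0, 2.0, 2.0])   # constraint h(x) = a^T x - 1 = 0
beta = 1.0                      # penalty weight; exactness needs beta large enough

def f(x):
    return 0.5 * x @ x

def h(x):
    return a @ x - 1.0

def lam(x):
    # Least-squares multiplier estimate: argmin_l ||grad f(x) - grad h(x) * l||,
    # with grad f(x) = x and grad h(x) = a.
    return (a @ x) / (a @ a)

def fletcher_penalty(x):
    return f(x) - h(x) * lam(x) + beta * h(x) ** 2

res = minimize(fletcher_penalty, x0=np.zeros(3))
x_star = a / (a @ a)            # analytic constrained minimizer
```

    The point of the exact-penalty property is visible here: an *unconstrained* minimization of the smooth function $\phi_\beta$ lands on the *constrained* minimizer, which is what lets the paper's algorithm work with a single smooth objective.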