
    Regression on fixed-rank positive semidefinite matrices: a Riemannian approach

    The paper addresses the problem of learning a regression model parameterized by a fixed-rank positive semidefinite matrix. The focus is on the nonlinear nature of the search space and on scalability to high-dimensional problems. The mathematical developments rely on the theory of gradient descent algorithms adapted to the Riemannian geometry that underlies the set of fixed-rank positive semidefinite matrices. In contrast with previous contributions in the literature, no restrictions are imposed on the range space of the learned matrix. The resulting algorithms maintain linear complexity in the problem size and enjoy important invariance properties. We apply the proposed algorithms to the problem of learning a distance function parameterized by a positive semidefinite matrix. Good performance is observed on classical benchmarks.
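    To make the setting concrete, here is a minimal sketch of distance-function learning with a fixed-rank PSD parameter. It is not the paper's Riemannian algorithm: it uses plain gradient descent on a low-rank factor G, so that W = G Gᵀ stays positive semidefinite with rank at most r by construction, and it keeps the O(ndr) per-step cost the abstract alludes to. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def fit_psd_regression(X, Y, targets, r, lr=1e-3, iters=500, seed=0):
    """Learn W = G @ G.T (rank <= r) so that the squared distance
    d_W(x, y) = (x - y)^T W (x - y) regresses the given targets.
    Sketch only: the paper uses Riemannian gradient descent on the
    quotient geometry of fixed-rank PSD matrices, not this plain
    Euclidean descent on the factor G."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((d, r)) / np.sqrt(d)  # low-rank factor
    Z = X - Y                                     # pairwise differences, (n, d)
    for _ in range(iters):
        ZG = Z @ G                                # (n, r)
        pred = np.sum(ZG**2, axis=1)              # d_W for each pair
        resid = pred - targets                    # regression residuals
        # Gradient of 0.5 * mean(resid^2) w.r.t. G; each step costs O(n d r).
        grad = 2.0 * (Z.T * resid) @ ZG / n       # (d, r)
        # lr must be tuned to the data scale; the paper's Riemannian
        # step-size rules would be more robust than a fixed step.
        G -= lr * grad
    return G @ G.T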

    Geometry-Aware Neighborhood Search for Learning Local Models for Image Reconstruction

    Local learning of sparse image models has proven very effective for solving inverse problems in many computer vision applications. To learn such models, the data samples are often clustered using the K-means algorithm with the Euclidean distance as a dissimilarity metric. However, the Euclidean distance may not always be a good dissimilarity measure for comparing data samples lying on a manifold. In this paper, we propose two algorithms for determining a local subset of training samples from which a good local model can be computed for reconstructing a given input test sample, where we take into account the underlying geometry of the data. The first algorithm, called Adaptive Geometry-driven Nearest Neighbor search (AGNN), is an adaptive scheme which can be seen as an out-of-sample extension of the replicator graph clustering method for local model learning. The second method, called Geometry-driven Overlapping Clusters (GOC), is a less complex nonadaptive alternative for training subset selection. The proposed AGNN and GOC methods are evaluated in image super-resolution, deblurring, and denoising applications and shown to outperform spectral clustering, soft clustering, and geodesic-distance-based subset selection in most settings.

    Comment: 15 pages, 10 figures and 5 tables
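    The core idea, selecting a training subset by manifold geometry rather than raw Euclidean distance, can be illustrated with a much simpler baseline than AGNN or GOC: approximate geodesics by shortest paths on a k-nearest-neighbor graph and rank training samples by their graph distance to the test sample. This sketch is an assumption-laden illustration, not the paper's method; the function name and parameters are hypothetical.

```python
import numpy as np
from scipy.sparse.csgraph import dijkstra
from sklearn.neighbors import kneighbors_graph

def geodesic_neighbors(train, test, k=10, m=50):
    """Return indices of the m training samples nearest to `test`
    along shortest paths in a k-nearest-neighbor graph, as a proxy
    for geodesic distance on the data manifold."""
    data = np.vstack([train, test[None, :]])            # append test sample
    graph = kneighbors_graph(data, k, mode="distance")  # sparse kNN graph
    # Shortest-path distances from the test node (last row) to all others;
    # disconnected samples come back as inf and sort to the end.
    dist = dijkstra(graph, directed=False, indices=data.shape[0] - 1)
    return np.argsort(dist[:-1])[:m]                    # drop the test node
```

    A local model (e.g., a sparse dictionary) would then be trained on `train[geodesic_neighbors(train, test)]` instead of on the test sample's Euclidean neighbors.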

    Equivariant semidefinite lifts of regular polygons

    Given a polytope $P$ in $\mathbb{R}^n$, we say that $P$ has a positive semidefinite lift (psd lift) of size $d$ if one can express $P$ as the linear projection of an affine slice of the positive semidefinite cone $\mathbf{S}^d_+$. If a polytope $P$ has symmetry, we can consider equivariant psd lifts, i.e., those psd lifts that respect the symmetry of $P$. One of the simplest families of polytopes with interesting symmetries is that of regular polygons in the plane, which have played an important role in the study of linear programming lifts (or extended formulations). In this paper we study equivariant psd lifts of regular polygons. We first show that the standard Lasserre/sum-of-squares hierarchy for the regular $N$-gon requires exactly $\lceil N/4 \rceil$ iterations and thus yields an equivariant psd lift of size linear in $N$. In contrast, we show that one can construct an equivariant psd lift of the regular $2^n$-gon of size $2n-1$, which is exponentially smaller than the psd lift from the sum-of-squares hierarchy. Our construction relies on finding a sparse sum-of-squares certificate for the facet-defining inequalities of the regular $2^n$-gon, i.e., one that uses only a small (logarithmic) number of monomials. Since any equivariant LP lift of the regular $2^n$-gon must have size $2^n$, this gives the first example of a polytope with an exponential gap between the sizes of equivariant LP lifts and equivariant psd lifts. Finally, we prove that our construction is essentially optimal by showing that any equivariant psd lift of the regular $N$-gon must have size at least logarithmic in $N$.

    Comment: 29 pages
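    For readers new to the terminology, the lift definition and the size gap claimed in the abstract can be written out explicitly; this is a restatement of what the abstract says, not a new result.

```latex
% A psd lift of size d, as defined in the abstract: P is the image,
% under a linear map \pi, of an affine slice of the psd cone.
\[
  P \;=\; \pi\bigl(\mathbf{S}^d_+ \cap L\bigr),
  \qquad L \subseteq \mathbf{S}^d \ \text{an affine subspace.}
\]
% The exponential gap for the regular N-gon with N = 2^n:
\[
  \underbrace{2^n}_{\text{any equivariant LP lift}}
  \quad\text{vs.}\quad
  \underbrace{2n-1}_{\text{constructed equivariant psd lift}},
  \qquad 2n - 1 \;=\; 2\log_2 N - 1 .
\]
```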