
    An explicit construction for neighborly centrally symmetric polytopes

    We give an explicit construction, based on Hadamard matrices, for an infinite series of floor(sqrt(d)/2)-neighborly centrally symmetric d-dimensional polytopes with 4d vertices. This appears to be the best explicit version yet of a recent probabilistic result due to Linial and Novik, who proved the existence of such polytopes with neighborliness d/400. (Comment: 9 pages, no figures.)
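    The polytope construction itself is not spelled out in the abstract, but the Hadamard matrices it starts from are standard objects. Below is a minimal sketch, assuming Sylvester's doubling construction (scipy.linalg.hadamard produces the same matrices); it only generates the matrices, not the polytopes of the paper.

```python
import numpy as np

def sylvester_hadamard(order: int) -> np.ndarray:
    """Return a Hadamard matrix of the given power-of-two order via Sylvester's doubling."""
    if order < 1 or order & (order - 1):
        raise ValueError("Sylvester's construction needs a power-of-two order")
    H = np.array([[1]])
    while H.shape[0] < order:
        H = np.block([[H, H], [H, -H]])   # H_{2n} = [[H_n, H_n], [H_n, -H_n]]
    return H

H = sylvester_hadamard(8)
assert np.array_equal(H @ H.T, 8 * np.eye(8, dtype=int))   # rows are pairwise orthogonal
```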

    Centrally symmetric polytopes with many faces

    We present explicit constructions of centrally symmetric polytopes with many faces. First, we construct a d-dimensional centrally symmetric polytope P with about (1.316)^d vertices such that every pair of non-antipodal vertices of P spans an edge of P. Second, for an integer k>1, we construct a centrally symmetric polytope P of an arbitrarily high dimension d and with an arbitrarily large number N of vertices such that, for some 0 < delta_k < 1, at least (1-delta_k^d) {N choose k} of the k-subsets of the set of vertices span faces of P. Third, for an integer k>1 and a>0, we construct a centrally symmetric polytope Q with an arbitrarily large number N of vertices and of dimension d = k^{1+o(1)} such that at least (1 - k^{-a}) {N choose k} of the k-subsets of the set of vertices span faces of Q. (Comment: 14 pages, some minor improvements.)
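    For reference, the face-counting claim of the second construction can be written compactly in standard notation; this is only a restatement of what the abstract says, not additional content from the paper.

```latex
\[
  \#\bigl\{\, k\text{-subsets of vertices of } P \text{ that span a face of } P \,\bigr\}
  \;\ge\; \bigl(1-\delta_k^{\,d}\bigr)\binom{N}{k},
  \qquad 0 < \delta_k < 1 .
\]
```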

    Compressed Sensing over the Grassmann Manifold: A Unified Analytical Framework

    It is well known that compressed sensing problems reduce to finding the sparse solutions of large under-determined systems of equations. Although finding sparse solutions is in general computationally difficult, starting with the seminal work of [2] it has been shown that linear programming techniques, obtained from an l_1-norm relaxation of the original non-convex problem, can provably find the unknown vector in certain instances. In particular, using a certain restricted isometry property, [2] shows that for measurement matrices chosen from a random Gaussian ensemble, l_1 optimization can find the correct solution with overwhelming probability even when the support size of the unknown vector is proportional to its dimension. The paper [1] uses results on neighborly polytopes from [6] to give a "sharp" bound on what this proportionality should be in the Gaussian measurement ensemble. In this paper we focus on finding sharp bounds on the recovery of "approximately sparse" signals (possibly also under noisy measurements). While the restricted isometry property can be used to study the recovery of approximately sparse signals (and also in the presence of noisy measurements), the resulting bounds can be quite loose. On the other hand, the neighborly polytopes technique, which yields sharp bounds for ideally sparse signals, does not generalize to approximately sparse signals. In this paper, starting from a necessary and sufficient condition for achieving a certain signal recovery accuracy, and using high-dimensional geometry, we give a unified null-space Grassmannian angle-based analytical framework for compressive sensing. This new framework gives sharp quantitative tradeoffs between the signal sparsity and the recovery accuracy of l_1 optimization for approximately sparse signals. As it turns out, the neighborly polytopes result of [1] for ideally sparse signals can be viewed as a special case of ours. Our result concerns fundamental properties of linear subspaces and so may be of independent mathematical interest.
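    The Grassmann-angle analysis itself cannot be reconstructed from the abstract, but the l_1 optimization it studies is the standard basis-pursuit linear program. A minimal sketch, assuming a Gaussian measurement matrix and purely illustrative dimensions n, m, k (these are assumptions, not values from the paper):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, m, k = 200, 80, 10                            # signal length, measurements, sparsity (illustrative)
A = rng.standard_normal((m, n)) / np.sqrt(m)     # random Gaussian measurement matrix

# Ideally k-sparse signal and its compressed measurements
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

# l_1 minimization (basis pursuit) as a linear program in the variables z = [x, t]:
#   minimize sum(t)  subject to  -t <= x <= t  and  A x = y
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.block([[I, -I],      #  x - t <= 0
                 [-I, -I]])    # -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * (2 * n))
x_hat = res.x[:n]

print("l_2 recovery error:", np.linalg.norm(x_hat - x_true))
```

    Recasting min ||x||_1 subject to Ax = y as an LP with auxiliary variables t bounding |x| componentwise is the usual reformulation; any LP solver can be substituted for scipy's linprog.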

    Compressed sensing of approximately sparse signals

    It is well known that compressed sensing problems reduce to solving large under-determined systems of equations. If the compressed measurement matrix is chosen according to an appropriate distribution and the signal is sparse enough, then l_1 optimization can exactly recover the ideally sparse signal with overwhelming probability, as shown by Candes and Tao [2], [1]. In the current paper we consider the case of so-called approximately sparse signals, a generalization of ideally sparse signals: allowing the zero-valued components of an ideally sparse signal to take values of small magnitude yields an approximately sparse signal. Using a different but simple proof technique, we show that claims similar to those of [2] and [1], relating the number of large components of the signal to the number of measurements, hold for approximately sparse signals as well. Furthermore, using the same technique, we compute explicit values for this proportionality when the compressed measurement matrix A has a rotationally invariant null-space distribution. We also give a quantitative tradeoff between the signal sparsity and the recovery robustness of l_1 minimization. As it turns out, in the asymptotic regime of the number of measurements, the threshold result of [1] corresponds to a special case of our result.
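    As a hypothetical illustration of the signal model described above (the dimensions and the magnitude eps are assumptions, not values from the paper), an approximately sparse signal can be built from an ideally sparse one, and robustness statements of this kind are naturally expressed in terms of the l_1 norm of its small "tail":

```python
import numpy as np

rng = np.random.default_rng(1)

n, k, eps = 200, 10, 1e-3                      # illustrative signal length, sparsity, tail magnitude

# Ideally k-sparse signal
x_sparse = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_sparse[support] = rng.standard_normal(k)

# Approximately sparse version: off-support entries take small values of magnitude ~eps
x_approx = x_sparse.copy()
off_support = np.setdiff1d(np.arange(n), support)
x_approx[off_support] = eps * rng.standard_normal(n - k)

# l_1 norm of everything outside the k largest components; robust recovery
# guarantees typically bound the reconstruction error by a constant times this quantity
tail = np.sort(np.abs(x_approx))[: n - k].sum()
print("best k-term approximation error (l_1):", tail)
```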

    Random geometric complexes

    We study the expected topological properties of Čech and Vietoris-Rips complexes built on i.i.d. random points in R^d. We find higher-dimensional analogues of known results on connectivity and component counts for random geometric graphs. However, higher homology H_k is not monotone when k > 0. In particular, for every k > 0 we exhibit two thresholds: one where homology passes from vanishing to nonvanishing, and another where it passes back to vanishing. We give asymptotic formulas for the expectation of the Betti numbers in the sparser regimes, and bounds in the denser regimes. The main technical contribution of the article is the application of discrete Morse theory in geometric probability. (Comment: 26 pages, 3 figures, final revisions, to appear in Discrete & Computational Geometry.)
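    A small numerical sketch of the kind of object studied here, assuming i.i.d. uniform points in the unit square and an illustrative radius (both are assumptions, not the paper's regime): the number of connected components of the geometric graph at scale r equals the Betti number beta_0 of the Čech or Vietoris-Rips complex built at that scale.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(2)

n, d, r = 500, 2, 0.08                 # number of points, ambient dimension, connectivity radius

points = rng.random((n, d))            # i.i.d. uniform points in the unit cube
adj = squareform(pdist(points)) <= r   # random geometric graph: edge iff distance <= r
np.fill_diagonal(adj, False)

n_components, _ = connected_components(csr_matrix(adj), directed=False)
print("connected components (beta_0 at scale r):", n_components)
```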