160 research outputs found

    Sphere packing bounds in the Grassmann and Stiefel manifolds

    Applying the Riemannian geometric machinery of volume estimates in terms of curvature, bounds for the minimal distance of packings/codes in the Grassmann and Stiefel manifolds are derived and analyzed. In the context of space-time block codes this leads to a minimal-distance lower bound that increases monotonically with the block length, which advocates large block lengths for code design.
    Comment: Replaced with final version, 11 pages.

    Density of Spherically-Embedded Stiefel and Grassmann Codes

    The density of a code is the fraction of the coding space covered by packing balls centered around the codewords. This paper investigates the density of codes in the complex Stiefel and Grassmann manifolds equipped with the chordal distance. The choice of distance enables the treatment of the manifolds as subspaces of Euclidean hyperspheres. In this geometry, the densest packings are not necessarily equivalent to maximum-minimum-distance codes. Computing a code's density follows from computing: i) the normalized volume of a metric ball, and ii) the kissing radius, i.e., the radius of the largest balls one can pack around the codewords without overlapping. First, the normalized volume of a metric ball is evaluated by asymptotic approximations. The volume of a small ball can be well-approximated by the volume of a locally-equivalent tangential ball. In order to properly normalize this approximation, the precise volumes of the manifolds induced by their spherical embedding are computed. For larger balls, a hyperspherical cap approximation is used, which is justified by a volume comparison theorem showing that the normalized volume of a ball in the Stiefel or Grassmann manifold is asymptotically equal to the normalized volume of a ball in its embedding sphere as the dimension grows to infinity. Then, bounds on the kissing radius are derived alongside corresponding bounds on the density. Unlike spherical codes or codes in flat spaces, the kissing radius of a Grassmann or Stiefel code cannot be determined exactly from its minimum distance. It is nonetheless possible to derive bounds on density as functions of the minimum distance. Stiefel and Grassmann codes have larger density than their image spherical codes as the dimensions tend to infinity. Finally, the bounds on density lead to refinements of the standard Hamming bounds for Stiefel and Grassmann codes.
    Comment: Two-column version (24 pages, 6 figures, 4 tables). To appear in IEEE Transactions on Information Theory.
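
    The chordal distance central to this analysis can be computed from the principal angles between subspaces. The sketch below is a minimal illustration assuming NumPy; the function name is ours, not the paper's:

    ```python
    import numpy as np

    def grassmann_chordal_distance(A, B):
        """Chordal distance between the column spans of n x p matrices A and B."""
        # Orthonormalize the bases.
        Q1, _ = np.linalg.qr(A)
        Q2, _ = np.linalg.qr(B)
        # Singular values of Q1^T Q2 are the cosines of the principal angles.
        s = np.clip(np.linalg.svd(Q1.T @ Q2, compute_uv=False), 0.0, 1.0)
        # d_c = sqrt(sum_i sin^2 theta_i) = sqrt(p - sum_i cos^2 theta_i)
        return np.sqrt(max(A.shape[1] - np.sum(s**2), 0.0))
    ```

    For example, two orthogonal lines in the plane are at chordal distance 1; a code's minimum distance is the minimum of this quantity over pairs of codewords.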

    Metric Entropy of Homogeneous Spaces

    For a (compact) subset K of a metric space and ε > 0, the covering number N(K, ε) is defined as the smallest number of balls of radius ε whose union covers K. Knowledge of the metric entropy, i.e., the asymptotic behaviour of covering numbers for (families of) metric spaces, is important in many areas of mathematics (geometry, functional analysis, probability, and coding theory, to name a few). In this paper we give asymptotically correct estimates of covering numbers for a large class of homogeneous spaces of unitary (or orthogonal) groups with respect to some natural metrics, most notably the one induced by the operator norm. This generalizes the author's earlier results concerning covering numbers of Grassmann manifolds; the generalization is motivated by applications to noncommutative probability and operator algebras. In the process we give a characterization of geodesics in U(n) (or SO(m)) for a class of non-Riemannian metric structures.
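
    To make the definition of N(K, ε) concrete: for a finite sample of points, a greedy pass gives an upper bound on its covering number. This is only a toy illustration of the definition (the paper's estimates concern continuous homogeneous spaces, not finite samples):

    ```python
    def greedy_covering_number(points, eps, dist):
        """Greedy upper bound on the covering number N(K, eps) of a finite set.

        Repeatedly pick an uncovered point as a new ball center until every
        point lies within distance eps of some center.
        """
        centers = []
        uncovered = list(range(len(points)))
        while uncovered:
            c = uncovered[0]
            centers.append(c)
            uncovered = [i for i in uncovered if dist(points[i], points[c]) > eps]
        return len(centers)
    ```

    For instance, the points {0, 1, 2, 3} on the real line need two balls of radius 1 but four balls of radius 0.5 under this greedy scheme.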

    An Adaptation of K-means-type algorithms to the Grassmann manifold

    Spring 2019; includes bibliographical references. The Grassmann manifold provides a robust framework for analysis of high-dimensional data through the use of subspaces. Treating data as subspaces allows for separability between data classes that is not otherwise achieved in Euclidean space, particularly with the use of the smallest-principal-angle pseudometric. Clustering algorithms focus on identifying similarities within data and highlighting the underlying structure. To exploit the properties of the Grassmannian for unsupervised data analysis, two variations of the popular K-means algorithm are adapted to perform clustering directly on the manifold. We provide the theoretical foundations needed for computations on the Grassmann manifold and detailed derivations of the key equations. Both algorithms are then thoroughly tested on toy data and two benchmark data sets from machine learning: the MNIST handwritten digit database and the AVIRIS Indian Pines hyperspectral data. Performance of the algorithms is tested on manifolds of varying dimension. Unsupervised classification results on the benchmark data are compared to those currently found in the literature.
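
    One common way to run K-means directly on the Grassmannian is sketched below, under assumptions of our own rather than the thesis's exact variants: assignment by chordal distance, and a center update via the extrinsic (chordal) mean, i.e., the top eigenvectors of the cluster's average projection matrix. NumPy is assumed:

    ```python
    import numpy as np

    def chordal_dist(U, V):
        """Chordal distance between subspaces spanned by orthonormal U, V."""
        s = np.clip(np.linalg.svd(U.T @ V, compute_uv=False), 0.0, 1.0)
        return np.sqrt(max(U.shape[1] - np.sum(s**2), 0.0))

    def grassmann_kmeans(subspaces, k, iters=20, seed=0):
        """K-means on the Grassmannian; subspaces is a list of n x p orthonormal bases."""
        rng = np.random.default_rng(seed)
        centers = [subspaces[i] for i in rng.choice(len(subspaces), k, replace=False)]
        labels = np.zeros(len(subspaces), dtype=int)
        for _ in range(iters):
            # Assignment step: nearest center in chordal distance.
            labels = np.array([np.argmin([chordal_dist(U, C) for C in centers])
                               for U in subspaces])
            # Update step: extrinsic mean of each cluster.
            for j in range(k):
                members = [subspaces[i] for i in range(len(subspaces)) if labels[i] == j]
                if not members:
                    continue
                # Top-p eigenvectors of the average projector give the new center.
                P = sum(U @ U.T for U in members) / len(members)
                w, V = np.linalg.eigh(P)
                centers[j] = V[:, np.argsort(w)[::-1][:members[0].shape[1]]]
        return labels, centers
    ```

    The extrinsic mean is a natural choice with the chordal distance because it minimizes the sum of squared chordal distances to the cluster members; intrinsic (Karcher-mean) updates are another common option.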

    Linearization of Hyperbolic Finite-Time Processes

    We adapt the notion of processes to introduce an abstract framework for dynamics in finite time, i.e., on compact time sets. For linear finite-time processes a notion of hyperbolicity, namely exponential monotonicity dichotomy (EMD), is introduced, thereby generalizing and unifying several existing approaches. We present a spectral theory for linear processes in a coherent way, based only on a logarithmic difference quotient, prove robustness of EMD with respect to a suitable (semi-)metric, and provide exact perturbation bounds. Furthermore, we give a complete description of the local geometry around hyperbolic trajectories, including a direct and intrinsic proof of finite-time analogues of the local (un)stable manifold theorem and the theorem of linearized asymptotic stability. As an application, we discuss our results for ordinary differential equations on a compact time interval.
    Comment: 32 pages.