
    Conformal and Grassmann structures

    The main results on the theory of conformal and almost Grassmann structures are presented. The common properties of these structures, as well as the differences between them, are outlined. In particular, the structure groups of these structures and their differential prolongations are found. A complete system of geometric objects of the almost Grassmann structure, totally defining its geometric structure, is determined. The vanishing of these objects determines a locally Grassmann manifold. It is proved that integrable almost Grassmann structures are locally Grassmann. The criteria of semiintegrability of almost Grassmann structures are proved in invariant form. Comment: LaTeX, 25 pages
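
    As a rough orientation for the objects involved, an almost Grassmann structure can be phrased as a tensor-product splitting of the tangent bundle; the sketch below uses our own notation and conventions, which may differ from the paper's.

        % Illustrative sketch (our notation): an almost Grassmann structure AG(p,q)
        % on a manifold M of dimension pq is a smooth family of decompositions
        \[
          T_x M \;\cong\; E_x \otimes F_x ,
          \qquad \dim E_x = p, \quad \dim F_x = q ,
        \]
        % defined up to the rescaling (E_x, F_x) -> (\lambda E_x, \lambda^{-1} F_x).
        % The flat ("locally Grassmann") model is the Grassmannian G(p, p+q) itself,
        % whose tangent space at a subspace W of V is canonically
        \[
          T_W\, G(p, p+q) \;\cong\; W^{*} \otimes (V/W) .
        \]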

    Affine Grassmann Codes

    We consider a new class of linear codes, called affine Grassmann codes. These can be viewed as a variant of generalized Reed-Muller codes and are closely related to Grassmann codes. We determine the length, dimension, and minimum distance of any affine Grassmann code. Moreover, we show that affine Grassmann codes have a large automorphism group and determine the number of minimum weight codewords. Comment: Slightly revised version; 18 pages
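
    As a concrete, hedged illustration of the construction, the toy Python sketch below builds an evaluation matrix for an affine Grassmann code over a small prime field by evaluating all minors of a matrix of variables at every point of the corresponding affine space. Names and conventions here are ours; the sketch is only meant to match the usual definition for tiny parameters, not to reproduce the paper's computations.

        # Toy sketch (our code, illustrative only): evaluation matrix of an affine
        # Grassmann code over a prime field F_p. Codewords are evaluations of all
        # minors of an l x l' matrix of variables (including the "empty" 0x0 minor,
        # taken to be the constant 1) at every point of F_p^(l*l').
        from itertools import product, combinations

        def minor_mod(M, rows, cols, p):
            """Determinant (mod p) of the submatrix of M given by rows/cols."""
            sub = [[M[r][c] for c in cols] for r in rows]
            n = len(sub)
            if n == 0:
                return 1 % p            # empty minor = constant function 1
            if n == 1:
                return sub[0][0] % p
            det = 0
            for j in range(n):          # Laplace expansion; fine for tiny sizes
                cof = minor_mod(sub, range(1, n),
                                [c for c in range(n) if c != j], p)
                det += (-1) ** j * sub[0][j] * cof
            return det % p

        def affine_grassmann_generator(l, lp, p):
            """Rows: one per minor; columns: one per point of F_p^(l*lp)."""
            row_sets = [(r, c) for k in range(0, min(l, lp) + 1)
                        for r in combinations(range(l), k)
                        for c in combinations(range(lp), k)]
            points = list(product(range(p), repeat=l * lp))
            G = []
            for rows, cols in row_sets:
                row = []
                for pt in points:
                    M = [list(pt[i * lp:(i + 1) * lp]) for i in range(l)]
                    row.append(minor_mod(M, rows, cols, p))
                G.append(row)
            return G

        G = affine_grassmann_generator(2, 2, 2)   # small toy case over F_2
        print(len(G), len(G[0]))                  # 6 rows (= C(4,2)) of length 16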

    Minimum distance of Symplectic Grassmann codes

    We introduce the Symplectic Grassmann codes as projective codes defined by symplectic Grassmannians, in analogy with the orthogonal Grassmann codes introduced in [4]. Note that the Lagrangian-Grassmannian codes are a special class of Symplectic Grassmann codes. We describe the weight enumerator of the Lagrangian-Grassmannian codes of rank 2 and 3, and we determine the minimum distance of the line Symplectic Grassmann codes. Comment: Revised contents and bibliography
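
    For orientation (in our notation, not necessarily the paper's), the general recipe behind such projective codes is sketched below: the points of the symplectic Grassmannian, taken in the Pluecker embedding, serve as the columns of a generator matrix, and the minimum distance is governed by the largest hyperplane section of the variety.

        % Background sketch, our notation: let V be a 2n-dimensional symplectic space
        % over F_q with form \omega, and let
        %   S_{k,2n} = { W <= V : dim W = k, \omega|_W = 0 }
        % be the symplectic Grassmannian, viewed inside the Pluecker embedding of the
        % ordinary Grassmannian G_{k,2n}. Taking its N points (one coordinate vector
        % per point) as the columns of a generator matrix gives an [N, K]_q projective
        % code whose minimum distance is
        \[
          d \;=\; N \;-\; \max_{H\ \mathrm{hyperplane}} \#\bigl( S_{k,2n} \cap H \bigr),
        \]
        % so determining d amounts to bounding the largest hyperplane section of the
        % variety; the Lagrangian-Grassmannian codes are the case k = n.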

    A Grassmann integral equation

    The present study introduces and investigates a new type of equation which is called a Grassmann integral equation, in analogy to the integral equations studied in real analysis. A Grassmann integral equation is an equation which involves Grassmann integrations and which is to be obeyed by an unknown function over a (finite-dimensional) Grassmann algebra G_m. A particular type of Grassmann integral equation is explicitly studied for certain low-dimensional Grassmann algebras. The choice of the equation under investigation is motivated by the effective action formalism of (lattice) quantum field theory. In a very general setting, for the Grassmann algebras G_{2n}, n = 2, 3, 4, the finite-dimensional analogues of the generating functionals of the Green functions are worked out explicitly by solving a coupled system of nonlinear matrix equations. Finally, by imposing the condition G[\bar\Psi, \Psi] = G_0[\lambda\bar\Psi, \lambda\Psi] + const., 0 < \lambda \in R (where \bar\Psi_k, \Psi_k, k = 1, ..., n, are the generators of the Grassmann algebra G_{2n}), between the finite-dimensional analogues G_0 and G of the ("classical") action and effective action functionals, respectively, a special Grassmann integral equation is established and solved, which is also equivalent to a coupled system of nonlinear matrix equations. If \lambda \neq 1, solutions to this Grassmann integral equation exist for n = 2 (and consequently also for any even value of n, specifically n = 4) but not for n = 3. If \lambda = 1, the considered Grassmann integral equation always has a solution which corresponds to a Gaussian integral, but remarkably, in the case n = 4 a further solution is found which corresponds to a non-Gaussian integral. The investigation sheds light on the structures to be met for Grassmann algebras G_{2n} with arbitrarily chosen n. Comment: 58 pages, LaTeX (v2: mainly minor updates and corrections to the reference section; v3: references [4], [17]-[21], [39], [46], [49]-[54], [61], [64], [139] added)
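
    Since the abstract relies throughout on Grassmann (Berezin) integration, a minimal Python sketch of that operation may help. The representation, function names and sign conventions below are ours, chosen purely for illustration; conventions for nested integrals vary in the literature.

        # Minimal sketch (not the paper's code) of a finite-dimensional Grassmann
        # algebra G_n: an element is a dict mapping a sorted tuple of generator
        # indices to a real coefficient.
        def gmul(a, b):
            """Product in the Grassmann algebra; generators anticommute, theta_i^2 = 0."""
            out = {}
            for mono_a, ca in a.items():
                for mono_b, cb in b.items():
                    if set(mono_a) & set(mono_b):
                        continue                      # repeated generator -> zero
                    merged = list(mono_a) + list(mono_b)
                    # inversions needed to sort the concatenation give the sign
                    inv = sum(1 for i in range(len(merged))
                                for j in range(i + 1, len(merged))
                                if merged[i] > merged[j])
                    key = tuple(sorted(merged))
                    out[key] = out.get(key, 0) + (-1) ** inv * ca * cb
            return {k: v for k, v in out.items() if v != 0}

        def berezin(a, i):
            """Berezin integral over theta_i: keep monomials containing theta_i,
            move theta_i to the front (picking up a sign), then delete it."""
            out = {}
            for mono, c in a.items():
                if i not in mono:
                    continue                          # integral of a constant is 0
                pos = mono.index(i)
                rest = mono[:pos] + mono[pos + 1:]
                out[rest] = out.get(rest, 0) + (-1) ** pos * c
            return out

        # Example on G_2 with generators theta_0, theta_1:
        theta0 = {(0,): 1.0}
        theta1 = {(1,): 1.0}
        f = gmul(theta0, theta1)                      # the monomial theta_0 theta_1
        print(berezin(berezin(f, 0), 1))              # iterated integral -> {(): 1.0}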

    Linear Odd Poisson Bracket on Grassmann Variables

    A linear odd Poisson bracket (antibracket) realized solely in terms of Grassmann variables is suggested. It is revealed that this bracket, which corresponds to a semi-simple Lie group, has three Grassmann-odd nilpotent \Delta-like differential operators of the first, second and third orders with respect to the Grassmann derivatives, in contrast with the canonical odd Poisson bracket, which has only one Grassmann-odd nilpotent differential \Delta-operator, of the second order. It is shown that these \Delta-like operators, together with a Grassmann-odd nilpotent Casimir function of this bracket, form a finite-dimensional Lie superalgebra. Comment: 7 pages, LaTeX. Relation (34) is added and the rearrangement necessary for publication in Physics Letters B is made
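
    Schematically, and in our own notation and conventions rather than the paper's, a linear odd bracket on Grassmann generators has the following shape, which indicates how a semi-simple Lie group can enter the construction through its structure constants.

        % Illustrative form (our notation): for Grassmann generators \theta_a, an odd
        % bracket of two odd elements is again odd, hence linear in the generators,
        \[
          \{\theta_a, \theta_b\} \;=\; c_{ab}{}^{c}\,\theta_c ,
          \qquad c_{ab}{}^{c} \;=\; -\,c_{ba}{}^{c} ,
        \]
        % where the c_{ab}{}^{c} can be taken to be the structure constants of a Lie
        % algebra, so that the graded Jacobi identity of the bracket reduces to the
        % ordinary Jacobi identity for c_{ab}{}^{c}.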

    Locality Preserving Projections for Grassmann manifold

    Learning on the Grassmann manifold has become popular in many computer vision tasks, owing to its strong capability to extract discriminative information from image sets and videos. However, such learning algorithms, particularly on high-dimensional Grassmann manifolds, always involve significantly high computational cost, which seriously limits the applicability of Grassmann-manifold learning in wider areas. In this research, we propose an unsupervised dimensionality reduction algorithm on the Grassmann manifold based on the Locality Preserving Projections (LPP) criterion. LPP is a commonly used dimensionality reduction algorithm for vector-valued data, aiming to preserve the local structure of the data in the dimension-reduced space. The strategy is to construct a mapping from a higher-dimensional Grassmann manifold to a lower-dimensional one with more discriminative capability. The proposed method can be optimized as a basic eigenvalue problem. Its performance is assessed on several classification and clustering tasks, and the experimental results show clear advantages over other Grassmann-based algorithms. Comment: Accepted by IJCAI 201
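
    For readers unfamiliar with the Euclidean criterion being generalized, here is a hedged Python sketch of classical vector-valued LPP (not the paper's Grassmann extension): build a neighbourhood graph, form the graph Laplacian, and solve a generalized eigenvalue problem.

        # Sketch of classical LPP (our code, illustrative only). X has one sample per
        # column; we solve X L X^T a = lambda X D X^T a and keep the eigenvectors
        # associated with the smallest eigenvalues.
        import numpy as np
        from scipy.linalg import eigh

        def lpp(X, n_components=2, n_neighbors=5, t=1.0):
            n = X.shape[1]
            # pairwise squared distances and a kNN-restricted heat-kernel affinity
            sq = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)
            W = np.zeros((n, n))
            for i in range(n):
                idx = np.argsort(sq[i])[1:n_neighbors + 1]
                W[i, idx] = np.exp(-sq[i, idx] / t)
            W = np.maximum(W, W.T)                        # symmetrize the graph
            D = np.diag(W.sum(axis=1))
            L = D - W                                     # graph Laplacian
            A = X @ L @ X.T
            B = X @ D @ X.T + 1e-8 * np.eye(X.shape[0])   # small ridge for stability
            _, vecs = eigh(A, B)                          # generalized eigenproblem
            return vecs[:, :n_components]                 # projection matrix (d x k)

        # Usage: Y = lpp(X).T @ X gives the low-dimensional embedding of X's columns.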

    Building Deep Networks on Grassmann Manifolds

    Learning representations on Grassmann manifolds is popular in quite a few visual recognition tasks. In order to enable deep learning on Grassmann manifolds, this paper proposes a deep network architecture that generalizes the Euclidean network paradigm to Grassmann manifolds. In particular, we design full rank mapping layers to transform input Grassmannian data into more desirable forms, exploit re-orthonormalization layers to normalize the resulting matrices, study projection pooling layers to reduce the model complexity in the Grassmannian context, and devise projection mapping layers to respect Grassmannian geometry while achieving Euclidean forms for the regular output layers. To train the Grassmann networks, we exploit a stochastic gradient descent setting on manifolds of the connection weights, and study a matrix generalization of backpropagation to update the structured data. Evaluations on three visual recognition tasks show that our Grassmann networks have clear advantages over existing Grassmann learning methods and achieve results comparable with state-of-the-art approaches. Comment: AAAI'18 paper
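
    A rough, illustrative numpy sketch of forward passes for the layer types named above follows; shapes, names and initializations are ours, and projection pooling as well as the manifold-constrained backpropagation are omitted.

        # Illustrative forward-pass sketch (not the authors' implementation), assuming
        # the input is an orthonormal basis matrix X of shape (d, q) representing a
        # point on the Grassmann manifold G(q, d).
        import numpy as np

        def frmap(X, W):
            """Full rank mapping: left-multiply by a (full-rank) weight matrix W."""
            return W @ X                              # generally no longer orthonormal

        def reorth(X):
            """Re-orthonormalization: recover an orthonormal basis via thin QR."""
            Q, _ = np.linalg.qr(X)
            return Q

        def projmap(X):
            """Projection mapping: represent span(X) by the symmetric projector X X^T,
            a Euclidean-friendly form for the regular output layers."""
            return X @ X.T

        # toy forward pass
        rng = np.random.default_rng(0)
        X = np.linalg.qr(rng.standard_normal((20, 4)))[0]   # a point on G(4, 20)
        W = rng.standard_normal((10, 20))                   # assumed full-rank weights
        P = projmap(reorth(frmap(X, W)))
        print(P.shape)                                      # (10, 10) symmetric matrix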