
    Gauge covariant neural network for 4 dimensional non-abelian gauge theory

    We develop a gauge covariant neural network for four-dimensional non-abelian gauge theory, which realizes a map between rank-2 tensor-valued vector fields. We find that the conventional smearing procedure and the gradient flow for gauge fields can be regarded as known neural-network architectures, namely residual networks and neural ordinary differential equations, acting on rank-2 tensors with fixed parameters. In the machine-learning context, the projection or normalization functions in the smearing schemes correspond to activation functions in neural networks. Using the locality of the activation function, we derive backpropagation for the gauge covariant neural network. Consequently, the smeared force in hybrid Monte Carlo (HMC) follows naturally from backpropagation. As a demonstration, we develop self-learning HMC (SLHMC) with a covariant-neural-network-approximated action for non-abelian gauge theory with dynamical fermions, and we observe that SLHMC reproduces the results of HMC.
    Comment: 50 pages, 8 figures
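
    To make the smearing-as-residual-layer correspondence described above concrete, here is a minimal sketch (not the paper's code, and in JAX as an illustrative choice): one APE-like smearing step on a gauge link, where the projection back toward the unitary group plays the role of the activation function. The staple sum is assumed precomputed, since building it depends on the lattice layout, and the SVD-based projection is one common choice rather than necessarily the paper's.

```python
import jax.numpy as jnp

def project_unitary(m):
    # Polar-style projection onto the unitary group via SVD; an extra
    # determinant correction would be needed to land exactly in SU(N).
    u, _, vh = jnp.linalg.svd(m)
    return u @ vh

def smearing_layer(link, staple_sum, alpha=0.1):
    # Residual update followed by projection ("activation"), mirroring
    # x_{l+1} = sigma(x_l + F(x_l)) in a residual network with fixed
    # parameter alpha.
    return project_unitary(link + alpha * staple_sum)
```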

    Neural Conservation Laws: A Divergence-Free Perspective

    We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law. This is enabled by the observation that any solution of the continuity equation can be represented as a divergence-free vector field. We hence propose building divergence-free neural networks through the concept of differential forms and, with the aid of automatic differentiation, realize two practical constructions. As a result, we can parameterize pairs of densities and vector fields that always exactly satisfy the continuity equation, forgoing the need for extra penalty methods or expensive numerical simulation. Furthermore, we prove these models are universal and so can be used to represent any divergence-free vector field. Finally, we experimentally validate our approaches by computing neural network-based solutions to fluid equations, solving for the Hodge decomposition, and learning dynamical optimal transport maps.
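
    One way such a construction can look in practice (a sketch under the assumption that an antisymmetric matrix potential stands in for the paper's differential-forms machinery): parameterize an antisymmetric matrix field A(x) with a network and set v_i = sum_j dA_ij/dx_j. The divergence then vanishes identically, because sum_{i,j} d^2 A_ij/(dx_i dx_j) is symmetric under swapping i and j in the derivatives but antisymmetric in A. The tiny network below is purely illustrative.

```python
import jax
import jax.numpy as jnp

d = 3  # ambient dimension

def potential(params, x):
    # Toy network producing a d x d matrix, antisymmetrized so that
    # A(x)^T = -A(x) holds by construction.
    w1, b1, w2 = params
    h = jnp.tanh(w1 @ x + b1)
    m = (w2 @ h).reshape(d, d)
    return m - m.T

def divergence_free_field(params, x):
    # v_i(x) = sum_j dA_ij/dx_j, computed with automatic differentiation.
    jac = jax.jacfwd(potential, argnums=1)(params, x)  # shape (d, d, d)
    return jnp.einsum('ijj->i', jac)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (jax.random.normal(k1, (d, d)), jnp.zeros(d),
          jax.random.normal(k2, (d * d, d)))
x = jnp.ones(d)
v = divergence_free_field(params, x)

# Sanity check: the divergence is zero up to floating-point error.
div = jnp.trace(jax.jacfwd(divergence_free_field, argnums=1)(params, x))
```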

    Tangent Bundle Filters and Neural Networks: From Manifolds to Cellular Sheaves and Back

    In this work we introduce a convolution operation over the tangent bundle of Riemannian manifolds, exploiting the Connection Laplacian operator. We use this convolution operation to define tangent bundle filters and tangent bundle neural networks (TNNs), novel continuous architectures operating on tangent bundle signals, i.e., vector fields over manifolds. We discretize TNNs in both the space and time domains, showing that their discrete counterpart is a principled variant of the recently introduced Sheaf Neural Networks. We formally prove that this discrete architecture converges to the underlying continuous TNN. We numerically evaluate the effectiveness of the proposed architecture on a denoising task for a tangent vector field over the unit 2-sphere.
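
    In the discrete setting, a filter of this kind can be sketched as a matrix polynomial in a (connection or sheaf) Laplacian applied to a stacked vector-field signal. The Laplacian, which would come from discretizing the Connection Laplacian over a sampled manifold, is assumed given here, and the generic filter-plus-nonlinearity layer below is an illustration rather than the paper's exact architecture.

```python
import jax.numpy as jnp

def bundle_filter(L, x, coeffs):
    # h(L) x = sum_k coeffs[k] * L^k x, accumulating powers of L iteratively.
    out = jnp.zeros_like(x)
    power = x
    for c in coeffs:
        out = out + c * power
        power = L @ power
    return out

def tnn_layer(L, x, coeffs):
    # One layer: tangent-bundle filter followed by a pointwise nonlinearity.
    return jnp.tanh(bundle_filter(L, x, coeffs))
```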

    FineMorphs: Affine-diffeomorphic sequences for regression

    A multivariate regression model of affine and diffeomorphic transformation sequences - FineMorphs - is presented. Leveraging concepts from shape analysis, model states are optimally "reshaped" by diffeomorphisms generated by smooth vector fields during learning. Affine transformations and vector fields are optimized within an optimal control setting, and the model can naturally reduce (or increase) dimensionality and adapt to large datasets via suboptimal vector fields. An existence proof of solutions and necessary conditions for optimality are derived for the model. Experimental results on real datasets from the UCI repository are presented, comparing favorably with the state of the art in the literature and with densely connected neural networks in TensorFlow.
    Comment: 39 pages, 7 figures
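
    A minimal sketch of the affine-diffeomorphic idea, under the assumptions of a kernel-parameterized vector field and simple Euler integration (both illustrative choices, not the authors' implementation): data states are pushed through an affine map, flowed along a smooth vector field, then mapped to the output by a second affine layer.

```python
import jax.numpy as jnp

def rbf_field(x, centers, momenta, sigma=1.0):
    # Smooth vector field v(x) = sum_k K(x, c_k) p_k with a Gaussian kernel.
    k = jnp.exp(-jnp.sum((x[None, :] - centers) ** 2, axis=1)
                / (2 * sigma ** 2))
    return k @ momenta

def diffeo_flow(x, centers, momenta, steps=10, dt=0.1):
    # Forward Euler integration of dx/dt = v(x); for small steps the
    # resulting map stays close to a diffeomorphism.
    for _ in range(steps):
        x = x + dt * rbf_field(x, centers, momenta)
    return x

def finemorphs_forward(x, A1, b1, centers, momenta, A2, b2):
    # Affine -> diffeomorphic flow -> affine, as a regression function.
    x = A1 @ x + b1
    x = diffeo_flow(x, centers, momenta)
    return A2 @ x + b2
```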