14 research outputs found

    Enabling Efficient Equivariant Operations in the Fourier Basis via Gaunt Tensor Products

    Developing equivariant neural networks for the E(3) group plays an important role in modeling 3D data across real-world applications. Enforcing this equivariance primarily involves the tensor products of irreducible representations (irreps). However, the computational complexity of such operations increases significantly as higher-order tensors are used. In this work, we propose a systematic approach to substantially accelerate the computation of the tensor products of irreps. We mathematically connect the commonly used Clebsch-Gordan coefficients to the Gaunt coefficients, which are integrals of products of three spherical harmonics. Through Gaunt coefficients, the tensor product of irreps becomes equivalent to the multiplication between spherical functions represented by spherical harmonics. This perspective further allows us to change the basis for the equivariant operations from spherical harmonics to a 2D Fourier basis. Consequently, the multiplication between spherical functions represented by a 2D Fourier basis can be efficiently computed via the convolution theorem and Fast Fourier Transforms. This transformation reduces the complexity of full tensor products of irreps from O(L^6) to O(L^3), where L is the max degree of irreps. Leveraging this approach, we introduce the Gaunt Tensor Product, which serves as a new method to construct efficient equivariant operations across different model architectures. Our experiments on the Open Catalyst Project and 3BPA datasets demonstrate both the increased efficiency and improved performance of our approach.
    Comment: 36 pages; ICLR 2024 (Spotlight Presentation); Code: https://github.com/lsj2408/Gaunt-Tensor-Produc
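The core computational step, multiplying two functions expressed in a 2D Fourier basis by convolving their coefficient arrays via the convolution theorem and the FFT, can be sketched in isolation. Below is a minimal NumPy sketch, not the paper's implementation: the Gaunt-coefficient change of basis from spherical harmonics is omitted, and the function name is hypothetical.

```python
import numpy as np

def fourier_multiply(a, b):
    """Multiply two functions given as (n, n) arrays of 2D Fourier
    coefficients. The product's coefficients are the 2D linear convolution
    of the inputs, computed via the convolution theorem and the FFT."""
    n = a.shape[0]
    m = 2 * n - 1  # size of the full linear convolution; zero-pad to avoid wrap-around
    A = np.fft.fft2(a, s=(m, m))
    B = np.fft.fft2(b, s=(m, m))
    return np.fft.ifft2(A * B).real  # inputs here are real, so the result is too
```

Convolving the coefficient arrays term by term would cost O(n^4) for n-by-n inputs, while the FFT route costs O(n^2 log n); a padding-and-FFT argument of this kind underlies the paper's reduction from O(L^6) to O(L^3).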

    Learning differentiable solvers for systems with hard constraints

    We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs), with a high degree of accuracy and up to a desired tolerance. We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture. Our method leverages differentiable optimization and the implicit function theorem to effectively enforce physical constraints. Inspired by dictionary learning, our model learns a family of functions, each of which defines a mapping from PDE parameters to PDE solutions. At inference time, the model finds an optimal linear combination of the functions in the learned family by solving a PDE-constrained optimization problem. Our method provides continuous solutions over the domain of interest that accurately satisfy desired physical constraints. Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
    Comment: Paper accepted to the 11th International Conference on Learning Representations (ICLR 2023). 9 pages + references + appendix. 5 figures in main text.
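The inference-time step, choosing an optimal linear combination of learned functions by minimizing a PDE residual, can be sketched with toy stand-ins: a hand-built dictionary of sine functions instead of a learned family, a 1D Poisson problem, and a plain least-squares solve in place of the paper's differentiable constrained optimization. All names and choices here are illustrative.

```python
import numpy as np

# Toy stand-ins: a "dictionary" of k candidate functions sampled on a 1D grid
# (fixed sines rather than learned NN outputs), and a finite-difference
# Laplacian as the PDE operator for the Poisson problem u'' = f.
n, k = 64, 5
x = np.linspace(0.0, 1.0, n)
basis = np.stack([np.sin((i + 1) * np.pi * x) for i in range(k)], axis=1)  # (n, k)

h = x[1] - x[0]
lap = (np.diag(np.ones(n - 1), 1) - 2.0 * np.eye(n) + np.diag(np.ones(n - 1), -1)) / h**2
f = -(2.0 * np.pi) ** 2 * np.sin(2.0 * np.pi * x)  # chosen so the exact solution is sin(2*pi*x)

# Inference-time step: pick the combination w of dictionary functions that
# minimizes the PDE residual ||lap @ basis @ w - f||.
A = (lap @ basis)[1:-1]   # drop boundary rows, where the difference stencil is incomplete
w, *_ = np.linalg.lstsq(A, f[1:-1], rcond=None)
u = basis @ w             # approximate solution to u'' = f
```

Because the dictionary happens to contain the true solution, the residual is driven essentially to zero; with a learned family the same solve yields the best attainable combination.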

    Topological Regularization via Persistence-Sensitive Optimization

    Optimization, a key tool in machine learning and statistics, relies on regularization to reduce overfitting. Traditional regularization methods control a norm of the solution to ensure its smoothness. Recently, topological methods have emerged as a way to provide more precise and expressive control over the solution, relying on persistent homology to quantify and reduce its roughness. All such existing techniques back-propagate gradients through the persistence diagram, which is a summary of the topological features of a function. Their downside is that they provide information only at the critical points of the function. We propose a method that instead builds on persistence-sensitive simplification and translates the required changes to the persistence diagram into changes on large subsets of the domain, including both critical and regular points. This approach enables a faster and more precise topological regularization, the benefits of which we illustrate with experimental evidence.
    Comment: The first two authors contributed equally to this work.
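The persistence diagram that these methods differentiate through can be computed, for the 0-dimensional sublevel-set homology of a sampled 1D function, with a short union-find pass. A minimal sketch, with the simplification step that maps diagram changes back onto the domain omitted:

```python
def persistence_pairs(values):
    """0-dimensional sublevel-set persistence of a 1D sampled function.
    Returns sorted (birth, death) pairs; the component born at the global
    minimum never dies and gets death = inf."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    parent = [None] * n  # None = sample not yet swept in

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    pairs = []
    for i in order:  # sweep samples from lowest value to highest
        left = i - 1 if i > 0 and parent[i - 1] is not None else None
        right = i + 1 if i < n - 1 and parent[i + 1] is not None else None
        parent[i] = i
        if left is None and right is None:
            continue  # a local minimum gives birth to a new component
        roots = {find(j) for j in (left, right) if j is not None}
        if len(roots) == 1:
            parent[i] = roots.pop()  # extend an existing component
        else:
            # two components merge at a local maximum: the younger one
            # (the one with the higher birth value) dies here
            a, b = sorted(roots, key=lambda r: values[r])
            pairs.append((values[b], values[i]))
            parent[b] = a
            parent[i] = a
    root = find(order[0])
    pairs.append((values[root], float("inf")))
    return sorted(pairs)
```

For example, `persistence_pairs([0, 2, 1, 3])` pairs the shallow minimum at value 1 with the saddle at value 2, while the global minimum at 0 persists forever.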

    Investigating the Behavior of Diffusion Models for Accelerating Electronic Structure Calculations

    We present an investigation into diffusion models for molecular generation, with the aim of better understanding how their predictions compare to the results of physics-based calculations. The investigation into these models is driven by their potential to significantly accelerate electronic structure calculations using machine learning, without requiring expensive first-principles datasets for training interatomic potentials. We find that the inference process of a popular diffusion model for de novo molecular generation is divided into an exploration phase, where the model chooses the atomic species, and a relaxation phase, where it adjusts the atomic coordinates to find a low-energy geometry. As training proceeds, we show that the model initially learns about the first-order structure of the potential energy surface, and then later learns about higher-order structure. We also find that the relaxation phase of the diffusion model can be re-purposed to sample the Boltzmann distribution over conformations and to carry out structure relaxations. For structure relaxations, the model finds geometries with ~10x lower energy than those produced by a classical force field for small organic molecules. Initializing a density functional theory (DFT) relaxation at the diffusion-produced structures yields a >2x speedup in the DFT relaxation when compared to initializing at structures relaxed with a classical force field.
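The relaxation phase described above acts like a structure relaxation: repeated small steps along a learned score drive the coordinates toward a low-energy geometry. A toy sketch of that behavior, with the learned score replaced by the exact negative gradient of a harmonic bond potential (the diffusion model itself is not reproduced here, and all names are illustrative):

```python
import numpy as np

def relax(x0, grad_energy, step=0.1, n_steps=200):
    """Schematic 'relaxation phase': repeatedly step downhill on an energy
    surface, the way late denoising steps adjust atomic coordinates."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_energy(x)  # gradient descent on the energy
    return x

# Toy diatomic "molecule": E = (|r1 - r2| - 1)^2 with equilibrium bond length 1.
def bond_grad(x):
    r1, r2 = x[:3], x[3:]
    d = np.linalg.norm(r1 - r2)
    g = 2.0 * (d - 1.0) * (r1 - r2) / d
    return np.concatenate([g, -g])

# Start with the atoms stretched to twice the equilibrium bond length.
relaxed = relax(np.array([0.0, 0.0, 0.0, 2.0, 0.0, 0.0]), bond_grad)
```

In the paper's setting the analogous descent direction comes from the trained denoiser rather than an analytic gradient, which is what makes the re-purposing for relaxations and Boltzmann sampling possible.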

    SPIRITUAL ECOLOGY: A SOLUTION TO THE ENVIRONMENTAL CRISIS

    This article explains that the environmental crisis is caused by human beings. The damage stems from the belief that nature is offered by God to be exploited to the fullest extent by humans as khalifah (vicegerents) on earth. Through the approach of perennial philosophy, this paper explores the importance of spiritual values in human beings when dealing with ecology and the environment. It concludes that nature and humans are equally fitrah (holy). However, there is a fundamental difference between the two: humans are gifted with reason, whereas nature is not. Therefore: a) the central role of humans is to serve as stewards of the universe; b) there is an urgent need for Muslims to improve their behavior so as to live more harmoniously with nature; c) the moral and ethical dimensions of human beings are essential for treating nature in a friendly and courteous manner; d) the spiritual values in humans must always be applied in every sphere of life when dealing with nature; and e) the task of humankind, sent into the universe, is inseparable from the concepts of tawhid, khalifah, amanah, akhirah, adl, and mizan.

    An ecosystem for digital reticular chemistry

    The space of all plausible materials for a given application is so large that it cannot be explored using a brute-force approach. This is, in particular, the case for reticular chemistry, which provides materials designers with a practically infinite playground on different length scales. One promising approach to guide the design and discovery of materials is machine learning, which typically involves learning a mapping of structures onto properties from data. While there have been plenty of examples of the use of machine learning for reticular materials, progress in the field seems to have stagnated. From our perspective, an important reason is that digital reticular chemistry is still more an art than a science, in which many parts are only accessible to experienced groups. The lack of standardization across all steps of the machine learning pipeline makes it practically impossible to directly compare machine learning models and build on top of prior results. To confront these challenges, we present mofdscribe: a software ecosystem that accompanies digital reticular chemists, seasoned as well as novice, on all steps from ideation to model publication. Our package provides reference datasets (including a completely new one), more than 35 reported as well as completely novel featurization strategies, data splitters, and validation helpers that can be used to benchmark new modeling strategies on standard benchmark tasks and to report the results on a public leaderboard. We envision that this ecosystem allows for a more robust, comparable, and productive area of digital reticular chemistry.