8 research outputs found
M-ENIAC: A machine learning recreation of the first successful numerical weather forecasts
In 1950 the first successful numerical weather forecast was obtained by
solving the barotropic vorticity equation using the Electronic Numerical
Integrator and Computer (ENIAC), which marked the beginning of the age of
numerical weather prediction. Here we ask how these numerical forecasts would
have turned out if machine-learning-based solvers had been used instead of
standard numerical discretizations. Specifically, we recreate these numerical
forecasts using physics-informed neural networks. We show that physics-informed
neural networks provide an easier and more accurate methodology for solving
meteorological equations on the sphere compared to the ENIAC solver. Comment:
10 pages, 1 figure
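
A rough, hypothetical illustration of the approach described above (not the authors' code): the PyTorch sketch below assembles the residual loss of a physics-informed neural network for a planar beta-plane form of the barotropic vorticity equation, standing in for the spherical formulation used in the paper. The network architecture, the placeholder value of beta, and the normalized collocation box are all assumptions.

```python
# Hypothetical sketch (not the authors' code): a minimal physics-informed
# neural network for the barotropic vorticity equation on a beta-plane,
#   d(zeta)/dt + J(psi, zeta + f) = 0,   zeta = Laplacian(psi),
# standing in for the spherical formulation used in the paper.
import torch

torch.manual_seed(0)

# Small fully connected network mapping (t, x, y) -> streamfunction psi.
net = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

beta = 1.6e-11  # assumed beta-plane parameter; placeholder value only


def grad(u, x):
    """First derivatives of u with respect to the columns of x."""
    return torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                               create_graph=True)[0]


def pde_residual(txy):
    txy = txy.requires_grad_(True)
    psi = net(txy)
    d_psi = grad(psi, txy)                     # [psi_t, psi_x, psi_y]
    psi_x, psi_y = d_psi[:, 1:2], d_psi[:, 2:3]
    psi_xx = grad(psi_x, txy)[:, 1:2]
    psi_yy = grad(psi_y, txy)[:, 2:3]
    zeta = psi_xx + psi_yy                     # relative vorticity
    d_zeta = grad(zeta, txy)
    zeta_t, zeta_x, zeta_y = d_zeta[:, 0:1], d_zeta[:, 1:2], d_zeta[:, 2:3]
    # Jacobian J(psi, zeta + f) with f = f0 + beta * y (f0 drops out of J).
    jac = psi_x * (zeta_y + beta) - psi_y * zeta_x
    return zeta_t + jac


# Collocation points in a normalized (t, x, y) box; the real setup would use
# spherical coordinates and an observed initial height/vorticity field.
collocation = torch.rand(2048, 3)
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(500):
    optimizer.zero_grad()
    loss = pde_residual(collocation).pow(2).mean()
    # An initial-condition loss on psi(0, x, y) would be added here.
    loss.backward()
    optimizer.step()
```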
Invariant Physics-Informed Neural Networks for Ordinary Differential Equations
Physics-informed neural networks have emerged as a prominent new method for
solving differential equations. While conceptually straightforward, they often
suffer training difficulties that lead to relatively large discretization
errors or the failure to obtain correct solutions. In this paper we introduce
invariant physics-informed neural networks for ordinary differential equations
that admit a finite-dimensional group of Lie point symmetries. Using the method
of equivariant moving frames, a differential equation is invariantized to
obtain a generally simpler equation in the space of differential invariants.
A solution to the invariantized equation is then mapped back to a solution of
the original differential equation by solving the reconstruction equations for
the left moving frame. The invariantized differential equation together with
the reconstruction equations are solved using a physics-informed neural
network, and together they form what we call an invariant physics-informed
neural network. We illustrate the method with several examples, all of which
considerably outperform standard non-invariant physics-informed neural
networks. Comment: 20 pages, 6 figures
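
As a hedged illustration of the reduce-and-reconstruct structure sketched in the abstract (the notation and the specific translation-invariant example are assumptions, not taken from the paper), consider a second-order ODE invariant under the translations (x, u) -> (x + a, u + b):

```latex
% Assumed illustrative example, not from the paper: invariantization and
% reconstruction for an ODE invariant under translations in x and u.
\[
  u'' = f(u') \quad\Longrightarrow\quad
  \underbrace{\frac{dv}{dx} = f(v)}_{\text{invariantized equation}}, \qquad
  \underbrace{\frac{du}{dx} = v}_{\text{reconstruction equation}},
\]
% where v = u' is the differential invariant. An invariant PINN parametrizes
% both v_theta and u_theta and minimizes the joint residual loss
\[
  \mathcal{L}(\theta) = \sum_i \Big[ \big(v_\theta'(x_i) - f(v_\theta(x_i))\big)^2
  + \big(u_\theta'(x_i) - v_\theta(x_i)\big)^2 \Big]
  + \text{initial-condition terms}.
\]
```

Both residuals are minimized jointly, mirroring the abstract's description of solving the invariantized equation and the reconstruction equations with a single physics-informed network.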
Improving physics-informed DeepONets with hard constraints
Current physics-informed (standard or operator) neural networks still rely on
accurately learning the initial conditions of the system they are solving. In
contrast, standard numerical methods evolve such initial conditions without
needing to learn them. In this study, we propose to improve current
physics-informed deep learning strategies such that initial conditions do not
need to be learned and are represented exactly in the predicted solution.
Moreover, this method guarantees that when a DeepONet is applied multiple times
to time-step a solution, the resulting function is continuous. Comment: 15 pages, 5 figures, 4 tables; release version
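
One common way to realize such hard constraints, shown here only as an assumed sketch rather than the paper's construction, is to build the initial condition directly into the network ansatz, e.g. u_theta(t, x) = u0(x) + t * N_theta(t, x), which satisfies u_theta(0, x) = u0(x) exactly:

```python
# Hypothetical sketch (not the paper's code): enforcing an initial condition
# exactly in a physics-informed model by construction, so that it never has
# to be learned. Here u_theta(t, x) = u0(x) + t * N_theta(t, x).
import torch


class HardIC(torch.nn.Module):
    def __init__(self, u0):
        super().__init__()
        self.u0 = u0  # callable initial condition, e.g. lambda x: torch.sin(x)
        self.net = torch.nn.Sequential(
            torch.nn.Linear(2, 64), torch.nn.Tanh(),
            torch.nn.Linear(64, 64), torch.nn.Tanh(),
            torch.nn.Linear(64, 1),
        )

    def forward(self, t, x):
        correction = self.net(torch.cat([t, x], dim=-1))
        # The factor t vanishes at t = 0, so the initial condition holds
        # exactly and only the PDE residual needs to be trained.
        return self.u0(x) + t * correction


model = HardIC(u0=lambda x: torch.sin(torch.pi * x))
t0 = torch.zeros(8, 1)
x = torch.linspace(0.0, 1.0, 8).reshape(-1, 1)
assert torch.allclose(model(t0, x), torch.sin(torch.pi * x))
```

If the output of one such step is fed in as the u0 of the next, the composed trajectory is continuous at the step boundaries by construction, which is the continuity property the abstract refers to.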
M-ENIAC: A Physics-Informed Machine Learning Recreation of the First Successful Numerical Weather Forecasts
Abstract In 1950 the first successful numerical weather forecast was obtained by solving the barotropic vorticity equation using the Electronic Numerical Integrator and Computer (ENIAC), which marked the beginning of the age of numerical weather prediction. Here we ask how these numerical forecasts would have turned out if machine-learning-based solvers had been used instead of standard numerical discretizations. Specifically, we recreate these numerical forecasts using physics-informed neural networks. We show that physics-informed neural networks provide an easier and more accurate methodology for solving meteorological equations on the sphere compared to the ENIAC solver.
Computing the Ensemble Spread From Deterministic Weather Predictions Using Conditional Generative Adversarial Networks
Abstract Ensemble prediction systems are an invaluable tool for weather forecasting. Practically, ensemble predictions are obtained by running several perturbations of the deterministic control forecast. However, ensemble prediction is associated with a high computational cost and often involves statistical post-processing steps to improve its quality. Here we propose to use deep-learning-based algorithms to learn the statistical properties of an ensemble prediction system, the ensemble spread, given only the deterministic control forecast. Thus, once trained, the costly ensemble prediction system will no longer be needed to obtain future ensemble forecasts, and the statistical properties of the ensemble can be derived from a single deterministic forecast. We adapt the classical pix2pix architecture to a three-dimensional model and train it against several years of operational (ensemble) weather forecasts for the 500 hPa geopotential height. The results demonstrate that the trained models indeed allow obtaining a highly accurate ensemble spread from the control forecast only.
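
A minimal, assumed sketch of a pix2pix-style conditional GAN with 3-D convolutions that maps a control-forecast volume to an ensemble-spread estimate; the layer sizes, loss weighting, and random stand-in data below are illustrative and not taken from the paper.

```python
# Hypothetical sketch (not the authors' model): a pix2pix-style conditional
# GAN with 3D convolutions mapping a deterministic control forecast to an
# estimate of the ensemble spread. Shapes and weights are assumptions.
import torch
from torch import nn

# Generator: control forecast (1-channel volume) -> predicted spread.
generator = nn.Sequential(
    nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv3d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv3d(16, 1, kernel_size=3, padding=1),
)

# Discriminator: judges (control, spread) pairs, as in pix2pix.
discriminator = nn.Sequential(
    nn.Conv3d(2, 16, kernel_size=3, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv3d(16, 1, kernel_size=3, padding=1),  # patch-wise real/fake logits
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()
lambda_l1 = 100.0  # pix2pix-style weighting of the L1 term (assumed)


def training_step(control, true_spread):
    """One conditional-GAN update on a batch of (control, spread) volumes."""
    # Discriminator update.
    d_opt.zero_grad()
    fake_spread = generator(control).detach()
    real_logits = discriminator(torch.cat([control, true_spread], dim=1))
    fake_logits = discriminator(torch.cat([control, fake_spread], dim=1))
    d_loss = bce(real_logits, torch.ones_like(real_logits)) + \
             bce(fake_logits, torch.zeros_like(fake_logits))
    d_loss.backward()
    d_opt.step()

    # Generator update: fool the discriminator and stay close in L1.
    g_opt.zero_grad()
    fake_spread = generator(control)
    fake_logits = discriminator(torch.cat([control, fake_spread], dim=1))
    g_loss = bce(fake_logits, torch.ones_like(fake_logits)) + \
             lambda_l1 * l1(fake_spread, true_spread)
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()


# Random stand-in data: a batch of 2 volumes of assumed shape (4, 16, 16).
control = torch.randn(2, 1, 4, 16, 16)
true_spread = torch.rand(2, 1, 4, 16, 16)
print(training_step(control, true_spread))
```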
Invariant variational schemes for ordinary differential equations
We propose a novel algorithmic method for constructing invariant variational schemes of systems of ordinary differential equations that are the Euler-Lagrange equations of a variational principle. The method is based on the invariantization of standard, noninvariant discrete Lagrangian functionals using equivariant moving frames. The invariant variational schemes are given by the Euler-Lagrange equations of the corresponding invariantized discrete Lagrangian functionals. We showcase this general method by constructing invariant variational schemes of ordinary differential equations that preserve variational and divergence symmetries of the associated continuous Lagrangians. Noether's theorem automatically implies that the resulting schemes are exactly conservative. Numerical simulations are carried out and show that these invariant variational schemes outperform standard numerical discretizations.
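
As a hedged sketch of the general structure (with assumed notation, not reproduced from the paper), a discrete Lagrangian L_d approximates the action over one mesh interval, and the scheme is obtained from its discrete Euler-Lagrange equations; invariantization replaces L_d by its moving-frame invariantization before these equations are formed:

```latex
% Assumed notation: standard discrete action and discrete Euler-Lagrange
% equations; the variational scheme consists of the latter.
\[
  S_d[u] = \sum_{k} L_d(x_k, x_{k+1}, u_k, u_{k+1}), \qquad
  \frac{\partial}{\partial u_k}\Big[ L_d(x_{k-1}, x_k, u_{k-1}, u_k)
  + L_d(x_k, x_{k+1}, u_k, u_{k+1}) \Big] = 0 .
\]
% Invariantization replaces L_d by its moving-frame invariantization
% \iota(L_d), expressed in the joint invariants of the symmetry group; the
% discrete Euler-Lagrange equations of \iota(L_d) then yield a variational
% scheme that inherits the variational symmetries and, via a discrete
% Noether argument, the associated conservation laws mentioned in the abstract.
```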