We present a new data-driven reduced-order modeling approach to efficiently
solve parametrized partial differential equations (PDEs) for many-query
problems. This work is inspired by the concept of implicit neural
representation (INR), which models physical signals continuously, independently
of any spatial/temporal discretization. The proposed framework encodes the PDE
solution into a latent space and utilizes a parametrized neural ODE (PNODE) to
learn latent dynamics characterized by multiple PDE parameters. The PNODE can
be inferred by a hypernetwork, mitigating the difficulty of learning a PNODE
parametrized by a complex multilayer perceptron (MLP). The framework uses an INR to decode the
latent dynamics and reconstruct accurate PDE solutions. Further, a
physics-informed loss is introduced to correct predictions for unseen
parameter instances; incorporating this loss also enables the model to be
fine-tuned in an unsupervised manner on unseen PDE parameters. A
numerical experiment is performed on the two-dimensional Burgers' equation with a
large variation of PDE parameters. We evaluate the proposed method at a large
Reynolds number and obtain speedups of up to O(10^3) and ~1% relative error
with respect to the ground truth.

Comment: 9 pages, 5 figures, Machine Learning and the Physical Sciences
Workshop, NeurIPS 202
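The pipeline described in the abstract — a hypernetwork producing the weights of a parametrized neural ODE, whose latent state is then decoded by an INR-style network — can be sketched roughly as follows. This is a minimal, untrained toy with hypothetical shapes and names (`hypernetwork`, `latent_rhs`, `inr_decoder` are illustrative, not the authors' implementation), using forward-Euler integration in place of a proper ODE solver:

```python
import numpy as np

# Toy sketch (assumptions, not the paper's code): the PDE parameter mu modulates
# the weights of a small latent-dynamics MLP via a hypernetwork; the latent state
# is integrated in time and decoded continuously at arbitrary coordinates x.

rng = np.random.default_rng(0)
LATENT, HIDDEN = 4, 8

def hypernetwork(mu):
    """Map a scalar PDE parameter mu to flattened weights of the latent MLP."""
    n_w = HIDDEN * LATENT + HIDDEN + LATENT * HIDDEN + LATENT
    base = rng.standard_normal(n_w) * 0.1   # fixed random base weights
    return base + 0.01 * mu                 # toy parameter-dependent modulation

def latent_rhs(z, weights):
    """One-hidden-layer MLP f(z; theta(mu)) giving dz/dt (the PNODE RHS)."""
    i = 0
    W1 = weights[i:i + HIDDEN * LATENT].reshape(HIDDEN, LATENT); i += HIDDEN * LATENT
    b1 = weights[i:i + HIDDEN]; i += HIDDEN
    W2 = weights[i:i + LATENT * HIDDEN].reshape(LATENT, HIDDEN); i += LATENT * HIDDEN
    b2 = weights[i:i + LATENT]
    return W2 @ np.tanh(W1 @ z + b1) + b2

def integrate(z0, weights, dt=0.01, steps=100):
    """Forward-Euler integration of the latent ODE (a real solver would be used)."""
    z = z0.copy()
    for _ in range(steps):
        z = z + dt * latent_rhs(z, weights)
    return z

def inr_decoder(x, z):
    """INR-style decoder: continuous in coordinate x, conditioned on latent z."""
    fourier = np.sin(np.outer(x, 2.0 ** np.arange(3)))          # positional features
    feat = np.concatenate([fourier, np.tile(z, (len(x), 1))], axis=1)
    Wd = rng.standard_normal(feat.shape[1]) * 0.1
    return np.tanh(feat @ Wd)                                    # u(x; z)

mu = 0.5                                           # e.g., a viscosity parameter
z = integrate(rng.standard_normal(LATENT), hypernetwork(mu))
u = inr_decoder(np.linspace(0.0, 1.0, 50), z)      # solution queried at 50 points
print(u.shape)
```

Because the decoder takes continuous coordinates, the reconstructed solution can be queried at any resolution, which is the discretization-independence the abstract attributes to INRs.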