We present a scalable strategy for the development of mesh-free hybrid
neuro-symbolic partial differential equation solvers based on existing
mesh-based numerical discretization methods. In particular, this strategy can be
used to efficiently train neural network surrogate models of partial
differential equations by (i) leveraging the accuracy and convergence
properties of advanced numerical methods, solvers, and preconditioners, and
(ii) improving scalability to higher-order PDEs by strictly limiting
optimization to first-order automatic differentiation. The presented neural
bootstrapping method (hereafter dubbed NBM) is based on evaluating and
minimizing the finite-discretization residuals of the PDE system, obtained on
implicit Cartesian cells centered at a set of random collocation points, with
respect to the trainable parameters of the neural network. Importantly, the conservation laws and
symmetries present in the bootstrapped finite-discretization equations inform
the neural network about solution regularities within local neighborhoods of the
training points. We apply NBM to the important class of elliptic problems with
jump conditions across irregular interfaces in three spatial dimensions. We
show that the method is convergent: model accuracy improves with an increasing
number of collocation points in the domain and with preconditioning of the residuals.
We show that NBM is competitive with other PINN-type frameworks in terms of
memory and training speed. The algorithms presented here are implemented using
\texttt{JAX} in a software package named \texttt{JAX-DIPS}
(https://github.com/JAX-DIPS/JAX-DIPS), which stands for differentiable interfacial
PDE solver. We have open-sourced \texttt{JAX-DIPS} to facilitate research into the use
of differentiable algorithms for developing hybrid PDE solvers.
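
To make the training strategy concrete, the following is a minimal, self-contained \texttt{JAX} sketch of the idea rather than the \texttt{JAX-DIPS} implementation itself: a Poisson residual is assembled with a standard second-order finite-difference stencil on a small Cartesian cell centered at each random collocation point, and the network parameters are updated using first-order automatic differentiation only. The helper names (\texttt{u\_theta}, \texttt{cell\_residual}) and the single-domain Poisson setup are illustrative assumptions and omit the interfacial jump conditions treated in the paper.

\begin{verbatim}
import jax
import jax.numpy as jnp

# Hypothetical MLP surrogate u_theta(x); a stand-in for the solution network.
def init_params(key, sizes=(3, 64, 64, 1)):
    keys = jax.random.split(key, len(sizes) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, sizes[:-1], sizes[1:])]

def u_theta(params, x):
    h = x
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b).squeeze()

# Finite-discretization residual of -Laplace(u) = f on an implicit Cartesian
# cell of width dx centered at the collocation point x (7-point stencil in 3D).
# Only network evaluations are needed; no higher-order autodiff of u_theta.
def cell_residual(params, x, f, dx=1e-2):
    lap = 0.0
    for d in range(3):
        e = jnp.zeros(3).at[d].set(dx)
        lap += (u_theta(params, x + e) - 2.0 * u_theta(params, x)
                + u_theta(params, x - e)) / dx**2
    return -lap - f(x)

def loss(params, xs, f):
    res = jax.vmap(lambda x: cell_residual(params, x, f))(xs)
    return jnp.mean(res ** 2)

# One first-order gradient step over a batch of random collocation points.
key = jax.random.PRNGKey(0)
params = init_params(key)
xs = jax.random.uniform(jax.random.PRNGKey(1), (256, 3))
f = lambda x: 3.0 * jnp.pi ** 2 * jnp.prod(jnp.sin(jnp.pi * x))
grads = jax.grad(loss)(params, xs, f)
params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)
\end{verbatim}

In this sketch the stencil, rather than nested automatic differentiation, supplies the second-order spatial derivatives, which is the mechanism that keeps optimization restricted to first-order gradients with respect to the network parameters.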