This paper proposes a mesh-free computational framework and machine learning
theory for solving elliptic PDEs on unknown manifolds, identified with point
clouds, based on diffusion maps (DM) and deep learning. The PDE solver is formulated as a supervised learning task: a least-squares regression problem that imposes an algebraic equation approximating the PDE (and boundary conditions, if applicable). This algebraic equation involves a graph-Laplacian-type matrix obtained via the DM asymptotic expansion, which is a consistent estimator of second-order elliptic differential operators.
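For concreteness, the following is a minimal sketch (not the authors' code) of how such a graph-Laplacian-type matrix can be assembled from a point cloud, assuming a Gaussian kernel with bandwidth epsilon and the standard alpha = 1 diffusion-maps normalization; the kernel choice and the constant in the limiting operator are convention-dependent assumptions:

```python
import numpy as np

def dm_laplacian(X, epsilon):
    """Graph-Laplacian-type estimate of a Laplace-Beltrami-type operator
    on a point cloud X (N x d); a hypothetical minimal version using the
    alpha = 1 diffusion-maps normalization to remove sampling-density bias."""
    # Pairwise squared Euclidean distances and Gaussian affinity kernel
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / epsilon)
    # alpha = 1 normalization: divide out the kernel density estimate
    q = K.sum(axis=1)
    K1 = K / np.outer(q, q)
    # Row-normalize to a Markov matrix and form the generator
    P = K1 / K1.sum(axis=1, keepdims=True)
    # (P - I) / epsilon is, up to a kernel-dependent constant, a consistent
    # estimator of the operator as epsilon -> 0 and N -> infinity
    L = (P - np.eye(X.shape[0])) / epsilon
    return L
```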
The resulting numerical method solves a highly non-convex empirical risk minimization problem over a hypothesis space of neural-network-type functions. In a well-posed elliptic PDE setting, when the hypothesis space
consists of feedforward neural networks with either infinite width or depth, we
show that the global minimizer of the empirical loss function is a consistent
solution in the limit of large training data. When the hypothesis space is a
two-layer neural network, we show that for a sufficiently large width, the
gradient descent method can identify a global minimizer of the empirical loss
function. Supporting numerical examples demonstrate the convergence of the
solutions and the effectiveness of the proposed solver in avoiding numerical
issues that hamper the traditional approach when a large data set becomes available, e.g., the inversion of large matrices.
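As an illustration of the least-squares formulation described above, here is a hedged sketch of the supervised-learning solve with a two-layer network trained by full-batch gradient descent; the names L_np, X_np, f_np (the DM matrix, point cloud, and sampled right-hand side), the width, step count, and learning rate are all assumptions for the example, and boundary-condition penalty terms are omitted for brevity:

```python
import torch

def solve_on_point_cloud(L_np, X_np, f_np, width=512, steps=5000, lr=1e-2):
    """Hypothetical minimal version of the least-squares PDE solve:
    fit u_theta so that the algebraic equation L u = f holds on the
    point cloud in the least-squares sense."""
    X = torch.as_tensor(X_np, dtype=torch.float32)
    L = torch.as_tensor(L_np, dtype=torch.float32)
    f = torch.as_tensor(f_np, dtype=torch.float32)
    u_theta = torch.nn.Sequential(           # two-layer network: one hidden layer
        torch.nn.Linear(X.shape[1], width),
        torch.nn.Tanh(),
        torch.nn.Linear(width, 1),
    )
    opt = torch.optim.SGD(u_theta.parameters(), lr=lr)  # full-batch gradient descent
    for _ in range(steps):
        u = u_theta(X).squeeze(-1)            # network values on the point cloud
        loss = torch.mean((L @ u - f) ** 2)   # empirical least-squares risk
        opt.zero_grad()
        loss.backward()
        opt.step()
    return u_theta
```

Note that, in contrast with solving the algebraic equation directly, no large linear system is inverted here; each gradient step costs only a matrix-vector product with the graph-Laplacian-type matrix, which is where the advantage for large data sets enters.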