Time-dependent basis reduced order models (TDB ROMs) have successfully been
used for approximating the solution to nonlinear stochastic partial
differential equations (PDEs). For many practical problems of interest,
discretizing these PDEs results in massive matrix differential equations (MDEs)
that are too expensive to solve using conventional methods. While TDB ROMs have
the potential to significantly reduce this computational burden, they still
suffer from the following challenges: (i) inefficiency for general
nonlinearities, (ii) intrusive implementation, (iii) ill-conditioning in the
presence of small singular values, and (iv) error accumulation due to a fixed
rank. To address these challenges, we present a scalable method based on oblique projections
for solving TDB ROMs that is computationally efficient, minimally intrusive,
robust in the presence of small singular values, rank-adaptive, and highly
parallelizable. These favorable properties are achieved via low-rank
approximation of the time-discrete MDE. Using the discrete empirical
interpolation method (DEIM), a low-rank decomposition is computed at each
iteration of the time-stepping scheme, enabling a near-optimal approximation at
a fraction of the cost. We refer to the new approach as TDB-CUR, since it is equivalent
to a CUR decomposition based on sparse row and column samples of the MDE. We
also propose a rank-adaptive procedure to control the error on-the-fly.
Numerical results demonstrate the accuracy, efficiency, and robustness of the
new method for a diverse set of problems.
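
To make the sampling idea concrete, the sketch below (plain NumPy) illustrates a generic DEIM-based CUR approximation combined with a simple threshold-based rank selection. It is an illustration under stated assumptions, not the paper's TDB-CUR algorithm: in particular, it obtains the sampling bases from a full SVD purely to keep the toy self-contained, whereas the method summarized above is designed to avoid forming and factoring the full MDE solution. All function names and the test matrix are hypothetical.

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM selection of sampling indices from an orthonormal basis U (n x r)."""
    r = U.shape[1]
    idx = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, r):
        # Residual of interpolating the j-th basis vector at the points chosen so far
        c = np.linalg.solve(U[idx, :j], U[idx, j])
        res = U[:, j] - U[:, :j] @ c
        idx.append(int(np.argmax(np.abs(res))))
    return np.array(idx)

def choose_rank(s, tol=1e-6):
    """Smallest rank r with ||s[r:]||_2 <= tol * ||s||_2 (simple threshold-based adaptation)."""
    tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]   # tail[r] = norm of discarded singular values
    return int(np.count_nonzero(tail > tol * tail[0]))

def cur_approximation(A, tol=1e-6):
    """Adaptive-rank CUR approximation of A from DEIM-sampled rows and columns.

    Illustration only: the full SVD is used here to obtain the sampling bases;
    a practical scheme would work with already available low-rank factors
    instead of forming and factoring A.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    r = choose_rank(s, tol)
    rows = deim_indices(U[:, :r])                   # row samples from left singular vectors
    cols = deim_indices(Vt[:r, :].T)                # column samples from right singular vectors
    C = A[:, cols]                                  # sampled columns
    R = A[rows, :]                                  # sampled rows
    M = np.linalg.pinv(A[np.ix_(rows, cols)])       # pseudo-inverse of the intersection block
    return C @ M @ R, r

# Usage on a smooth, numerically low-rank test matrix (hypothetical example)
x = np.linspace(0.0, 1.0, 200)
y = np.linspace(0.0, 1.0, 100)
A = np.exp(-5.0 * np.subtract.outer(x, y) ** 2)
A_cur, r = cur_approximation(A, tol=1e-8)
print(r, np.linalg.norm(A - A_cur) / np.linalg.norm(A))
```

The intersection block is pseudo-inverted rather than inverted so that the sketch remains well defined when the sampled block is nearly rank-deficient, echoing the small-singular-value robustness concern noted above.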