Updating a truncated Singular Value Decomposition (SVD) is crucial in
representation learning, especially when dealing with large-scale data matrices
that continuously evolve in practical scenarios. Keeping SVD-based models
aligned with such fast-paced updates is therefore increasingly important.
Existing methods for updating truncated SVDs employ Rayleigh-Ritz projection
procedures, in which the projection matrices are augmented with the original
singular vectors. However, these methods are inefficient because the update
matrix becomes dense and the projection must be applied to all singular vectors. To
address these limitations, we introduce a novel method for dynamically
approximating the truncated SVD of a sparse and temporally evolving matrix. Our
approach leverages sparsity in the orthogonalization process of augmented
matrices and utilizes an extended decomposition to independently store
projections in the column space of singular vectors. Numerical experiments
demonstrate an order-of-magnitude efficiency improvement over previous
methods, achieved while maintaining precision comparable to existing
approaches.
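As a point of reference, the sketch below shows one standard instance of the Rayleigh-Ritz projection procedure described above, a Zha-Simon-style update of a rank-k truncated SVD after new rows are appended. It is not the method proposed here; the function name, the dense NumPy formulation, and the toy test setup are illustrative assumptions. It also makes the two costs mentioned above visible: orthogonalizing the new rows against the current right singular subspace produces a dense augmented factor, and the final rotation touches every stored singular vector.

```python
import numpy as np


def zha_simon_row_update(Uk, sk, Vk, E, k):
    """Rayleigh-Ritz style update of a rank-k truncated SVD (illustrative sketch).

    Given A ~= Uk @ diag(sk) @ Vk.T (Uk: m x k, sk: k, Vk: n x k) and a block of
    new rows E (s x n), return rank-k factors approximating [A; E].
    """
    # Component of the new rows outside the current right singular subspace;
    # this dense orthogonalization is the "densification" cost noted above.
    P = E.T - Vk @ (Vk.T @ E.T)                     # n x s
    Q_E, R_E = np.linalg.qr(P)                      # thin QR: Q_E is n x s, R_E is s x s

    # Small (k+s) x (k+s) projected matrix whose SVD drives the update.
    s_new = E.shape[0]
    F = np.block([
        [np.diag(sk),          np.zeros((k, s_new))],
        [E @ Vk,               R_E.T               ],
    ])
    Uf, sf, Vft = np.linalg.svd(F)

    # Rotate the augmented bases by the small SVD and re-truncate to rank k;
    # note the rotation is applied to all stored singular vectors.
    U_aug = np.block([
        [Uk,                    np.zeros((Uk.shape[0], s_new))],
        [np.zeros((s_new, k)),  np.eye(s_new)                 ],
    ])
    U_new = U_aug @ Uf[:, :k]
    V_new = np.hstack([Vk, Q_E]) @ Vft.T[:, :k]
    return U_new, sf[:k], V_new


if __name__ == "__main__":
    # Toy check on an approximately rank-10 matrix (hypothetical sizes).
    rng = np.random.default_rng(0)
    k = 10
    A = rng.standard_normal((200, k)) @ rng.standard_normal((k, 50))
    A += 0.01 * rng.standard_normal(A.shape)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk, sk, Vk = U[:, :k], s[:k], Vt[:k].T

    E = rng.standard_normal((5, 50))                # newly arriving rows
    Uu, su, Vu = zha_simon_row_update(Uk, sk, Vk, E, k)

    s_ref = np.linalg.svd(np.vstack([A, E]), compute_uv=False)[:k]
    print("max singular-value error:", np.max(np.abs(su - s_ref)))
```

The approach summarized in this abstract targets exactly these steps, exploiting sparsity during the orthogonalization and storing the projections in the column space of the singular vectors through an extended decomposition instead.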