The singular value decomposition (SVD) is a crucial tool in machine learning
and statistical data analysis. However, it is highly susceptible to outliers in
the data matrix. Existing robust SVD algorithms often sacrifice speed for
robustness or break down in the presence of even a few outliers. This study
introduces an efficient algorithm, called Spherically Normalized SVD, for
robust SVD approximation that is highly insensitive to outliers,
computationally scalable, and accurate in approximating the singular vectors.
The proposed algorithm achieves its speed by requiring only two applications
of a standard reduced-rank SVD algorithm to appropriately scaled data, and it
substantially outperforms competing algorithms in computation time. To
assess the robustness of the approximated singular vectors and their subspaces
against data contamination, we introduce new notions of breakdown points for
matrix-valued input, including row-wise, column-wise, and block-wise breakdown
points. Theoretical and empirical analyses demonstrate that our algorithm
exhibits higher breakdown points than standard SVD and its modifications. We
empirically validate the effectiveness of our approach in
applications such as robust low-rank approximation and robust principal
component analysis of high-dimensional microarray datasets. Overall, our study
presents a highly efficient and robust solution for SVD approximation that
overcomes the limitations of existing algorithms in the presence of outliers.
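
As a rough illustration only, and not the paper's exact algorithm, the Python sketch below assumes that "spherical normalization" means scaling each row (and then each column) of the data matrix to unit Euclidean norm before applying a standard truncated SVD; the function name `spherically_normalized_svd`, the projection-based singular-value estimate, and all parameters are illustrative assumptions.

```python
import numpy as np

def spherically_normalized_svd(X, rank, eps=1e-12):
    """Hedged sketch of a two-pass robust SVD approximation."""
    # Pass 1: scale each row to the unit sphere so that a few
    # outlying rows cannot dominate, then estimate the right
    # singular vectors with a standard reduced-rank SVD.
    row_norms = np.maximum(np.linalg.norm(X, axis=1, keepdims=True), eps)
    _, _, Vt = np.linalg.svd(X / row_norms, full_matrices=False)
    V = Vt[:rank].T

    # Pass 2: scale each column likewise and estimate the left
    # singular vectors.
    col_norms = np.maximum(np.linalg.norm(X, axis=0, keepdims=True), eps)
    U, _, _ = np.linalg.svd(X / col_norms, full_matrices=False)
    U = U[:, :rank]

    # Singular values: one simple (assumed) choice is to project
    # X onto the estimated singular subspaces; the paper's own
    # recovery step may differ.
    S = U.T @ X @ V
    return U, S, V

# Usage: a rank-2 matrix contaminated by a few gross outliers.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2)) @ rng.standard_normal((2, 50))
X[:3, :3] += 100.0  # gross contamination of a few cells
U, S, V = spherically_normalized_svd(X, rank=2)
print(np.round(np.diag(S), 2))  # approximate leading singular values
```

The design intuition in this sketch is that normalizing rows (or columns) to the unit sphere bounds the influence any single contaminated row (or column) can exert on the fitted subspace, which is why only two standard reduced-rank SVD calls are needed.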