
    Online Stochastic Principal Component Analysis

    This paper studies Principal Component Analysis (PCA) in an online setting. The problem is posed as a subspace optimization problem and solved using gradient-based algorithms. One such algorithm is Variance-Reduced PCA (VR-PCA), designed as an improvement to the classical online PCA algorithm known as Oja's method, which handles only one sample at a time. The paper develops Block VR-PCA as an improved version of VR-PCA: unlike standard VR-PCA, it optimizes more than one subspace dimension at a time, and it showed good performance. Block VR-PCA and the Block Oja method were compared experimentally in MATLAB on synthetic and real data sets; their convergence results showed that Block VR-PCA achieved a lower steady-state error than the Block Oja method. Keywords: Online Stochastic; Principal Component Analysis; Block Variance-Reduced; Block Oja
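    For orientation, here is a minimal sketch in Python/NumPy (the paper's experiments are in MATLAB and its code is not shown here) of the single-sample Oja step and a block variance-reduced epoch in the spirit of VR-PCA; the step size eta, epoch count, and QR re-orthonormalization are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def oja_step(w, x, eta):
        # classical Oja update: one sample, one stochastic gradient step,
        # then projection back onto the unit sphere
        w = w + eta * x * (x @ w)
        return w / np.linalg.norm(w)

    def block_vr_pca(X, k, eta=0.05, n_epochs=10, seed=0):
        # X: (n, d) data matrix; returns a (d, k) orthonormal basis estimate,
        # i.e. a block of k directions updated jointly rather than one at a time
        n, d = X.shape
        rng = np.random.default_rng(seed)
        W, _ = np.linalg.qr(rng.standard_normal((d, k)))
        for _ in range(n_epochs):
            W_anchor = W.copy()
            full_grad = X.T @ (X @ W_anchor) / n      # exact gradient at the anchor
            for i in rng.permutation(n):
                x = X[i]
                # variance-reduced stochastic gradient: per-sample gradient
                # at W, corrected by the same sample's gradient at the anchor
                g = np.outer(x, x @ W - x @ W_anchor) + full_grad
                W, _ = np.linalg.qr(W + eta * g)      # step, then re-orthonormalize
        return W

    The anchor-plus-correction structure is what reduces the variance of the stochastic updates: near convergence, W stays close to W_anchor and the noisy per-sample terms nearly cancel, which is consistent with the lower steady-state error reported above.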

    Diffusion Approximations for Online Principal Component Estimation and Global Convergence

    In this paper, we propose to adopt diffusion approximation tools to study the dynamics of Oja's iteration, an online stochastic gradient descent method for principal component analysis. Oja's iteration maintains a running estimate of the true principal component from streaming data and enjoys low time and space complexity. We show that Oja's iteration for the top eigenvector generates a continuous-state discrete-time Markov chain over the unit sphere. We characterize Oja's iteration in three phases using diffusion approximation and weak convergence tools. Our three-phase analysis further provides a finite-sample error bound for the running estimate, which matches the minimax information lower bound for principal component analysis under the additional assumption of bounded samples. Comment: Appeared in NIPS 2017
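    Stated concretely, Oja's iteration is the normalized update sketched below; each step depends only on the current state and one fresh sample, which is why the trajectory is a Markov chain on the unit sphere. A minimal Python/NumPy sketch; the Gaussian streaming source, step size beta, and step count are illustrative assumptions.

    import numpy as np

    def gaussian_stream(rng, cov):
        # toy streaming source; any i.i.d. sample stream works
        d = cov.shape[0]
        while True:
            yield rng.multivariate_normal(np.zeros(d), cov)

    def oja_top_eigenvector(stream, d, beta=0.01, n_steps=20_000, seed=0):
        # Oja's iteration for the top eigenvector:
        #   w_{t+1} = (w_t + beta x_t x_t^T w_t) / ||w_t + beta x_t x_t^T w_t||
        # The next state depends only on (w_t, x_t), so the trajectory is a
        # continuous-state discrete-time Markov chain on the unit sphere,
        # and only O(d) memory is needed for the running estimate.
        rng = np.random.default_rng(seed)
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        for _ in range(n_steps):
            x = next(stream)
            w = w + beta * x * (x @ w)
            w /= np.linalg.norm(w)
        return w

    # usage: covariance whose top eigenvector is e_1
    d = 20
    cov = np.diag([5.0] + [1.0] * (d - 1))
    w_hat = oja_top_eigenvector(gaussian_stream(np.random.default_rng(1), cov), d)
    print(abs(w_hat[0]))  # approaches 1 as the estimate aligns with e_1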

    From Oja's Algorithm to the Multiplicative Weights Update Method with Applications

    Oja's algorithm is a well-known online algorithm studied mainly in the context of stochastic principal component analysis. We make a simple observation, yet to the best of our knowledge a novel one, that when applied to any (not necessarily stochastic) sequence of symmetric matrices which share common eigenvectors, the regret of Oja's algorithm can be directly bounded in terms of the regret of the well-known multiplicative weights update method for the problem of prediction with expert advice. Several applications to optimization with quadratic forms over the unit sphere in $\mathbb{R}^n$ are discussed.
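    For reference, here is a minimal Python/NumPy sketch of the multiplicative weights update method for prediction with expert advice, the algorithm whose regret bound the paper transfers to Oja's; the loss matrix and learning rate eta are assumptions made for the example, not values from the paper.

    import numpy as np

    def multiplicative_weights(losses, eta=0.1):
        # losses: (T, n) array, losses[t, i] in [0, 1] is expert i's loss
        # at round t. Returns the algorithm's total loss and its regret
        # against the best single expert in hindsight.
        T, n = losses.shape
        w = np.ones(n)                     # one weight per expert
        alg_loss = 0.0
        for t in range(T):
            p = w / w.sum()                # play the normalized weights
            alg_loss += p @ losses[t]
            w *= np.exp(-eta * losses[t])  # exponentially down-weight lossy experts
        regret = alg_loss - losses.sum(axis=0).min()
        return alg_loss, regret

    # usage: random losses for 5 experts over 1000 rounds
    rng = np.random.default_rng(0)
    print(multiplicative_weights(rng.uniform(size=(1000, 5))))

    In the paper's setting, the shared eigenbasis supplies the correspondence: each common eigen-direction plays the role of one expert, which is how the regret of Oja's algorithm can be bounded by the regret of the update above.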