    Orthogonal Extended Infomax Algorithm

    The extended infomax algorithm for independent component analysis (ICA) can separate sub- and super-Gaussian signals but converges slowly because it uses stochastic gradient optimization. In this paper, an improved extended infomax algorithm is presented that converges much faster. Accelerated convergence is achieved by replacing the natural gradient learning rule of extended infomax with a fully multiplicative, orthogonal-group based update scheme for the unmixing matrix, leading to an orthogonal extended infomax algorithm (OgExtInf). The computational performance of OgExtInf is compared with two fast ICA algorithms: the popular FastICA and Picard, an L-BFGS algorithm belonging to the family of quasi-Newton methods. Our results demonstrate superior performance of the proposed method on small EEG data sets, as used for example in online EEG processing systems such as brain-computer interfaces or clinical systems for spike and seizure detection.

    Comment: 17 pages, 6 figures
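    To illustrate the idea of a multiplicative update that keeps the unmixing matrix on the orthogonal group, the following minimal Python sketch combines the standard extended-infomax relative gradient and sign-switching rule with a matrix-exponential step over its skew-symmetric part. It is an assumption-laden illustration, not the paper's exact OgExtInf scheme: the data are assumed pre-whitened, the step size mu is arbitrary, and the function name ogextinf_step is hypothetical.

    import numpy as np
    from scipy.linalg import expm

    def ogextinf_step(W, X, mu=0.1):
        """One illustrative multiplicative update on the orthogonal group.

        W : (n, n) orthogonal unmixing matrix (data X assumed whitened).
        X : (n, N) batch of observations.
        Returns an updated unmixing matrix that stays (near-)orthogonal.
        """
        n, N = X.shape
        Y = W @ X                                   # current source estimates
        # Extended-infomax switching: +1 for super-Gaussian, -1 for sub-Gaussian sources
        k = np.sign(np.mean(1.0 / np.cosh(Y) ** 2, axis=1) * np.mean(Y ** 2, axis=1)
                    - np.mean(np.tanh(Y) * Y, axis=1))
        K = np.diag(k)
        # Relative (natural) gradient of the extended-infomax contrast
        G = np.eye(n) - (K @ np.tanh(Y) @ Y.T + Y @ Y.T) / N
        # Restrict the step to the Lie algebra of the orthogonal group (skew-symmetric part)
        A = 0.5 * (G - G.T)
        # Fully multiplicative update: expm of a skew-symmetric matrix is orthogonal,
        # so the product with an orthogonal W remains orthogonal by construction
        return expm(mu * A) @ W

    Iterating this step on whitened data is one plausible way to realize a fully multiplicative orthogonal-group update; the paper's actual convergence behaviour and step-size control are not reproduced here.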