
    Regularized Gradient Algorithm for Non-Negative Independent Component Analysis

    Independent Component Analysis (ICA) is a well-known technique for solving the blind source separation (BSS) problem. However, "classical" ICA algorithms are not well suited to non-negative sources. This paper proposes a gradient descent approach to the Non-Negative Independent Component Analysis (NNICA) problem. The original NNICA separation criterion contains the discontinuous sign function, whose minimization may lead to poor convergence (local minima), especially for sparse sources. Replacing the discontinuous function with a continuous one, tanh, we propose a more accurate regularized gradient algorithm called "Exact" Regularized Gradient (ERG) for NNICA. Experiments on synthetic data with different sparsity degrees illustrate the efficiency of the proposed method, and a comparison shows that the proposed ERG outperforms existing methods.
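    The core idea of the abstract above, replacing the discontinuous sign function with tanh inside a gradient step, can be sketched as follows. This is a minimal illustration, not the authors' exact ERG algorithm: the negativity-penalty criterion, step size, and re-orthonormalization scheme are assumptions.

```python
import numpy as np

def smooth_sign(y, eps=0.01):
    """Continuous tanh surrogate for the discontinuous sign function."""
    return np.tanh(y / eps)

def nnica_step(W, X, lr=0.1, eps=0.01):
    """One regularized gradient step on an illustrative NNICA criterion.

    Criterion assumed here: J(W) = E[||min(Wx, 0)||^2], penalizing negative
    source estimates. Its gradient involves the indicator (y < 0), i.e.
    (1 - sign(y)) / 2; we substitute the smooth tanh surrogate for sign.
    """
    Y = W @ X                                           # current source estimates
    neg_part = 0.5 * (1.0 - smooth_sign(Y, eps)) * Y    # smooth version of min(Y, 0)
    grad = (neg_part @ X.T) / X.shape[1]
    W = W - lr * grad
    # NNICA assumes whitened data, so keep W orthonormal via an SVD projection
    U, _, Vt = np.linalg.svd(W)
    return U @ Vt
```

    The tanh smoothing removes the discontinuity that causes poor convergence on sparse sources, at the cost of a small bias controlled by `eps`.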

    Clustering via kernel decomposition

    Spectral clustering methods, which rely on the eigenvalue decomposition of an affinity matrix, have been proposed recently. In this letter, the affinity matrix is created from the elements of a nonparametric density estimator and then decomposed to obtain posterior probabilities of class membership. Hyperparameters are selected using standard cross-validation methods.
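    The pipeline described above, an affinity matrix built from Gaussian kernel (nonparametric density) terms, then eigendecomposed to read off cluster memberships, can be sketched roughly as below. The bandwidth choice and the crude argmax assignment are assumptions standing in for the letter's posterior-probability computation.

```python
import numpy as np

def kernel_affinity(X, bandwidth=1.0):
    """Affinity A_ij = exp(-||x_i - x_j||^2 / (2 h^2)), a Gaussian kernel term."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def spectral_assign(X, k, bandwidth=1.0):
    """Assign each point to one of k clusters via the top-k eigenvectors of A."""
    A = kernel_affinity(X, bandwidth)
    vals, vecs = np.linalg.eigh(A)       # eigenvalues in ascending order
    V = vecs[:, -k:]                     # keep the k leading eigenvectors
    # Normalize rows and assign by the largest-magnitude component -- a crude
    # stand-in for the posterior class memberships described in the abstract.
    V = V / np.linalg.norm(V, axis=1, keepdims=True)
    return np.argmax(np.abs(V), axis=1)
```

    For well-separated clusters the affinity matrix is nearly block diagonal, so its leading eigenvectors localize on the clusters, which is what makes the assignment step work.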

    A convergent blind deconvolution method for post-adaptive-optics astronomical imaging

    In this paper we propose a blind deconvolution method which applies to data perturbed by Poisson noise. The objective function is a generalized Kullback-Leibler divergence, depending on both the unknown object and the unknown point spread function (PSF), without the addition of regularization terms; constrained minimization, with suitable convex constraints on both unknowns, is considered. The problem is nonconvex and we propose to solve it by means of an inexact alternating minimization method, whose global convergence to stationary points of the objective function has been recently proved in a general setting. The method is iterative and each iteration, also called an outer iteration, consists of alternating an update of the object and the PSF by means of fixed numbers of iterations, also called inner iterations, of the scaled gradient projection (SGP) method. The use of SGP has two advantages: first, it allows us to prove global convergence of the blind method; secondly, it allows the introduction of different constraints on the object and the PSF. The specific constraint on the PSF, besides non-negativity and normalization, is an upper bound derived from the so-called Strehl ratio, which is the ratio between the peak values of an aberrated and a perfect wavefront. Therefore a typical application is the imaging of modern telescopes equipped with adaptive optics systems for partial correction of the aberrations due to atmospheric turbulence. In the paper we describe the algorithm and we recall the results leading to its convergence. Moreover, we illustrate its effectiveness by means of numerical experiments whose results indicate that the method, pushed to convergence, is very promising in the reconstruction of non-dense stellar clusters. The case of more complex astronomical targets is also considered, but in this case regularization by early stopping of the outer iterations is required.
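    The outer/inner iteration structure described above can be sketched schematically: each outer iteration updates the object, then the PSF, by a fixed number of inner projected-gradient steps on the generalized KL divergence for Poisson data. The 1-D circular convolution model, fixed step sizes, and simplified projections below are illustrative assumptions; the paper's SGP uses diagonal scaling and step-length selection, and its Strehl-ratio constraint is more refined than the simple peak bound here.

```python
import numpy as np

def conv(f, h):
    """Circular convolution via FFT (the assumed imaging model f * h)."""
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)))

def kl_grad(f, h, y):
    """Gradient of KL(y || f*h) w.r.t. f: the adjoint of h applied to (1 - y/(f*h))."""
    m = conv(f, h) + 1e-12                   # model image, kept strictly positive
    return conv(1.0 - y / m, np.roll(h[::-1], 1))   # circular correlation with h

def project_psf(h, smax):
    """Simplified PSF projection: non-negative, peak-bounded, unit sum."""
    h = np.clip(h, 0.0, smax)
    s = h.sum()
    return h / s if s > 0 else np.full(h.size, 1.0 / h.size)  # safeguard reset

def blind_deconv(y, n_outer=20, n_inner=5, step_f=1e-3, step_h=1e-4, smax=0.5):
    n = y.size
    f = np.full(n, y.mean())                 # flat object initialization
    h = np.full(n, 1.0 / n)                  # flat PSF initialization
    for _ in range(n_outer):                 # outer iterations
        for _ in range(n_inner):             # inner steps on the object (non-negativity)
            f = np.clip(f - step_f * kl_grad(f, h, y), 0.0, None)
        for _ in range(n_inner):             # inner steps on the PSF (same form by symmetry)
            h = project_psf(h - step_h * kl_grad(h, f, y), smax)
    return f, h
```

    The point of the sketch is the alternation itself: each block of inner iterations is an inexact minimization over one variable with the other held fixed, with constraints enforced by projection after every step.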

    Non-negative matrix factorization with sparseness constraints

    Non-negative matrix factorization (NMF) is a recently developed technique for finding parts-based, linear representations of non-negative data. Although it has been applied successfully in several settings, it does not always result in parts-based representations. In this paper, we show how explicitly incorporating the notion of 'sparseness' improves the decompositions found. Additionally, we provide complete MATLAB code both for standard NMF and for our extension. Our hope is that this will further the application of these methods to solving novel data-analysis problems.
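    As a point of reference for the abstract above, here is a minimal sketch of standard multiplicative-update NMF (the Euclidean-cost Lee-Seung updates) together with the sparseness measure the paper builds on. The paper's actual contribution, projecting W or H to a target sparseness level after each gradient step, involves a dedicated projection operator that is omitted here; only the baseline and the measure are shown.

```python
import numpy as np

def sparseness(x):
    """Hoyer-style measure: (sqrt(n) - ||x||_1/||x||_2) / (sqrt(n) - 1), in [0, 1]."""
    n = x.size
    return (np.sqrt(n) - np.abs(x).sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1)

def nmf(V, r, n_iter=500, eps=1e-9, seed=0):
    """Factorize V ~= W @ H with all factors non-negative (standard NMF baseline)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # multiplicative updates preserve
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # non-negativity automatically
    return W, H
```

    A fully active vector has sparseness 0 and a single-spike vector has sparseness 1, which is what makes the measure a usable knob for constraining how parts-based the factors are.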

    An LMS Based Blind Source Separation Algorithm Using A Fast Nonlinear Autocorrelation Method
