
    Non-negative mixtures

    This is the author's accepted pre-print of the article, first published as: M. D. Plumbley, A. Cichocki and R. Bro, "Non-negative mixtures". In P. Comon and C. Jutten (Eds.), Handbook of Blind Source Separation: Independent Component Analysis and Applications, chapter 13, pp. 515-547. Academic Press, Feb 2010. ISBN 978-0-12-374726-6. DOI: 10.1016/B978-0-12-374726-6.00018-7

    Non-Euclidean principal component analysis by Hebbian learning

    Principal component analysis based on Hebbian learning was originally designed for data processing in Euclidean spaces. In this contribution we present an extension of Oja's Hebbian learning approach to non-Euclidean spaces. We show that for Banach spaces the Hebbian learning can be carried out using the underlying semi-inner product. Prominent examples of such Banach spaces are the lp-spaces for p ≠ 2. For kernel spaces, as applied in support vector machines or kernelized vector quantization, this approach can be formulated as an online learning scheme based on the differentiable kernel. Hence, principal component analysis can be explicitly carried out in the respective data spaces, now equipped with a non-Euclidean metric. In the article we provide the theoretical framework and give illustrative examples.
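    As a rough illustration of the idea (not the authors' exact algorithm), one can take Oja's classical Hebbian rule and replace the Euclidean inner product w·x by the Lumer-Giles semi-inner product of lp, [x, w] = ||w||_p^(2-p) Σ_i x_i sign(w_i) |w_i|^(p-1), which reduces to the ordinary dot product at p = 2. The sketch below, including the function names and the particular update form used for p ≠ 2, is an assumption for illustration only.

        import numpy as np

        def lp_semi_inner_product(x, w, p):
            """Lumer-Giles semi-inner product [x, w] in l_p (equals x @ w when p = 2)."""
            norm_w = np.linalg.norm(w, ord=p)
            return norm_w ** (2 - p) * np.sum(x * np.sign(w) * np.abs(w) ** (p - 1))

        def oja_lp(X, p=1.5, eta=0.01, epochs=50, seed=0):
            """Hebbian principal-component learning with the l_p semi-inner product.

            For p = 2 this is exactly Oja's rule  w <- w + eta * y * (x - y * w)
            with y = <x, w>; for p != 2 the output y is computed via the
            semi-inner product instead (an illustrative choice, not necessarily
            the update analysed in the article).
            """
            rng = np.random.default_rng(seed)
            w = rng.standard_normal(X.shape[1])
            w /= np.linalg.norm(w, ord=p)
            for _ in range(epochs):
                for x in X:
                    y = lp_semi_inner_product(x, w, p)
                    w += eta * y * (x - y * w)   # Oja-style stabilised Hebbian step
            return w / np.linalg.norm(w, ord=p)

    For p = 2 this recovers Oja's rule, whose fixed points are the leading eigenvectors of the data covariance; for p ≠ 2 the learned direction instead adapts to the lp geometry.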

    Modelling and Contractivity of Neural-Synaptic Networks with Hebbian Learning

    This paper is concerned with the modelling and analysis of two of the most commonly used recurrent neural network models (the Hopfield neural network and the firing-rate neural network) with dynamic recurrent connections undergoing Hebbian learning rules. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize certain key dynamical properties. First, we give biologically-inspired forward invariance results. Then, we give sufficient conditions for the non-Euclidean contractivity of the models. Our contraction analysis leads to stability and robustness of time-varying trajectories -- for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based upon biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum in-degree, and the maximum synaptic strength. Then, we show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example.
    Comment: 24 pages, 4 figures
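    For intuition, here is a minimal forward-Euler simulation of a coupled neural-synaptic system of the kind studied in the paper: a firing-rate network dx/dt = -a x + W φ(x) + u together with a Hebbian synaptic law dW/dt = -b W + η φ(x) φ(x)ᵀ. The decay rates a and b, the activation φ = tanh, and all numerical values are illustrative stand-ins; this sketch enforces neither the paper's low-dimensional sparse formulation nor Dale's Principle.

        import numpy as np

        def simulate(n=10, steps=2000, dt=1e-3, a=1.0, b=0.5, eta=0.2, seed=0):
            """Forward-Euler integration of a firing-rate network with Hebbian synapses.

            Illustrative dynamics (not the paper's exact model):
                dx/dt = -a*x + W @ phi(x) + u                 (neural states)
                dW/dt = -b*W + eta * outer(phi(x), phi(x))    (Hebbian rule with decay)
            """
            rng = np.random.default_rng(seed)
            phi = np.tanh                          # saturating activation
            x = 0.1 * rng.standard_normal(n)       # neural states
            W = 0.1 * rng.standard_normal((n, n))  # synaptic weight matrix
            u = 0.5 * rng.standard_normal(n)       # constant external input
            for _ in range(steps):
                r = phi(x)
                x = x + dt * (-a * x + W @ r + u)
                W = W + dt * (-b * W + eta * np.outer(r, r))
            return x, W

    With sufficiently strong decay (a and b large relative to eta and the input), the coupled trajectories settle to an equilibrium; the paper's contractivity tests give rigorous, biologically interpretable conditions certifying exactly this kind of behaviour.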

    Neural Relax

    We present an algorithm for data preprocessing of an associative memory, inspired by an electrostatic problem that turns out to have intimate relations with information maximization.