842 research outputs found

    A Unifying Review of Linear Gaussian Models

    Factor analysis, principal component analysis, mixtures of Gaussian clusters, vector quantization, Kalman filter models, and hidden Markov models can all be unified as variations of unsupervised learning under a single basic generative model. This is achieved by collecting together disparate observations and derivations made by many previous authors and introducing a new way of linking discrete and continuous state models using a simple nonlinearity. Through the use of other nonlinearities, we show how independent component analysis is also a variation of the same basic generative model. We show that factor analysis and mixtures of Gaussians can be implemented in autoencoder neural networks and learned using squared error plus the same regularization term. We introduce a new model for static data, known as sensible principal component analysis, as well as a novel concept of spatially adaptive observation noise. We also review some of the literature involving global and local mixtures of the basic models and provide pseudocode for inference and learning for all the basic models.
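
    For orientation, the single basic generative model referred to above is the linear Gaussian state-space model; the sketch below is a hedged illustration in Python (not the paper's own pseudocode), with the matrix names A, C, Q, R, the shapes, and the factor-analysis special case chosen purely for illustration.

```python
# Minimal sketch (assumed notation, not the paper's pseudocode) of the basic
# linear Gaussian generative model underlying the methods listed above:
#     x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)   -- continuous hidden state
#     y_t = C x_t     + v_t,  v_t ~ N(0, R)   -- observed output
# With A = 0 the model is static (factor analysis / PCA-like); with nontrivial
# dynamics it becomes the Kalman filter model.
import numpy as np

rng = np.random.default_rng(0)

def sample_lgm(A, C, Q, R, T):
    """Draw T observations from the linear Gaussian generative model."""
    k, p = A.shape[0], C.shape[0]
    x = np.zeros(k)
    ys = []
    for _ in range(T):
        x = A @ x + rng.multivariate_normal(np.zeros(k), Q)         # state transition
        ys.append(C @ x + rng.multivariate_normal(np.zeros(p), R))  # noisy observation
    return np.array(ys)

# Factor-analysis-like special case: no dynamics, diagonal observation noise.
k, p = 2, 5
Y = sample_lgm(np.zeros((k, k)), rng.standard_normal((p, k)),
               np.eye(k), np.diag(rng.uniform(0.1, 1.0, size=p)), T=500)
```
    The discrete-state members of the family (vector quantization, hidden Markov models) arise in the review by passing the state through a simple nonlinearity, a step omitted in this sketch.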

    Nonlinear independent component analysis for discrete-time and continuous-time signals

    We study the classical problem of recovering a multidimensional source signal from observations of nonlinear mixtures of this signal. We show that this recovery is possible (up to a permutation and monotone scaling of the source's original component signals) if the mixture is due to a sufficiently differentiable and invertible but otherwise arbitrarily nonlinear function and the component signals of the source are statistically independent with 'non-degenerate' second-order statistics. The latter assumption requires the source signal to meet one of three regularity conditions which essentially ensure that the source is sufficiently far away from the non-recoverable extremes of being deterministic or constant in time. These assumptions, which cover many popular time series models and stochastic processes, allow us to reformulate the initial problem of nonlinear blind source separation as a simple-to-state problem of optimisation-based function approximation. We propose to solve this approximation problem by minimizing a novel type of objective function that efficiently quantifies the mutual statistical dependence between multiple stochastic processes via cumulant-like statistics. This yields a scalable and direct new method for nonlinear Independent Component Analysis with widely applicable theoretical guarantees and for which our experiments indicate good performance.
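
    As a rough illustration of the kind of objective described above (a hedged sketch; the paper's exact cumulant-like statistics are not reproduced here), one can penalise the off-diagonal entries of lagged cross-covariance matrices of the recovered components, which vanish when the components are independent; the lag set and normalisation below are assumptions made for illustration.

```python
# Hedged sketch of a second-order dependence contrast between estimated sources:
# for independent component processes, lagged cross-covariances between distinct
# coordinates vanish, so their squared sum can serve as an objective that a
# demixing function is trained to minimise.
import numpy as np

def dependence_contrast(S, lags=(0, 1, 2, 5)):
    """S: array of shape (T, d) holding d estimated component signals."""
    T, _ = S.shape
    S = S - S.mean(axis=0)                                # centre each component
    total = 0.0
    for lag in lags:
        C = S[:T - lag].T @ S[lag:] / (T - lag)           # lagged covariance matrix
        total += np.sum((C - np.diag(np.diag(C))) ** 2)   # off-diagonal terms only
    return total
```
    A flexible invertible demixing function applied to the observed mixtures would then be trained so that this contrast, evaluated on its outputs, is minimised.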

    Nonlinear Independent Component Analysis for Continuous-Time Signals

    We study the classical problem of recovering a multidimensional source process from observations of nonlinear mixtures of this process. Assuming statistical independence of the coordinate processes of the source, we show that this recovery is possible for many popular models of stochastic processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function. Key to our approach is the combination of tools from stochastic analysis and recent contrastive learning approaches to nonlinear ICA. This yields a scalable method with widely applicable theoretical guarantees for which our experiments indicate good performance. Comment: 68 pages, 8 figures. Added consistency results (Section 8), corrected typo.
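
    The contrastive ingredient mentioned above can be illustrated generically (this sketches the common real-versus-shuffled construction used in nonlinear ICA, not necessarily this paper's exact scheme): a classifier is trained to distinguish genuine windows of the observed process from windows whose time slices are drawn independently, which forces its features to reflect the temporal structure of the sources. The window length and sampling scheme below are assumptions.

```python
# Generic sketch of a contrastive dataset for nonlinear-ICA-style training
# (illustrative; not necessarily the construction used in this paper): label 1
# for genuine consecutive windows of the observations, label 0 for windows whose
# time slices are drawn at independent random times, destroying temporal
# dependence.
import numpy as np

rng = np.random.default_rng(0)

def contrastive_pairs(Y, n_pairs, window=2):
    """Y: observations of shape (T, d). Returns (features, labels) for a
    real-versus-shuffled binary discrimination task."""
    T, _ = Y.shape
    X, labels = [], []
    for _ in range(n_pairs):
        t = rng.integers(0, T - window)
        X.append(Y[t:t + window].ravel())                 # genuine window -> 1
        labels.append(1)
        ts = rng.integers(0, T, size=window)              # independent times -> 0
        X.append(np.concatenate([Y[s] for s in ts]))
        labels.append(0)
    return np.array(X), np.array(labels)
```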

    Adaptive signal processing algorithms for noncircular complex data

    The complex domain provides a natural processing framework for a large class of signals encountered in communications, radar, biomedical engineering and renewable energy. Statistical signal processing in C has traditionally been viewed as a straightforward extension of the corresponding algorithms in the real domain R; however, recent developments in augmented complex statistics show that, in general, this leads to under-modelling. This direct treatment of complex-valued signals has led to advances in so-called widely linear modelling and the introduction of a generalised framework for the differentiability of both analytic and non-analytic complex and quaternion functions. In this thesis, supervised and blind complex adaptive algorithms capable of processing the generality of complex and quaternion signals (both circular and noncircular) in both noise-free and noisy environments are developed; their usefulness in real-world applications is demonstrated through case studies. The focus of this thesis is on the use of augmented statistics and widely linear modelling. The standard complex least mean square (CLMS) algorithm is extended to perform optimally for the generality of complex-valued signals, and is shown to outperform the CLMS algorithm. Next, extraction of latent complex-valued signals from large mixtures is addressed. This is achieved by developing several classes of complex blind source extraction algorithms based on fundamental signal properties such as smoothness, predictability and degree of Gaussianity, with an analysis of the existence and uniqueness of the solutions also provided. These algorithms are shown to facilitate real-time applications, such as those in brain-computer interfacing (BCI). Due to their modified cost functions and the widely linear mixing model, this class of algorithms performs well in both noise-free and noisy environments. Next, based on a widely linear quaternion model, the FastICA algorithm is extended to the quaternion domain to provide separation of the generality of quaternion signals. The enhanced performance of the widely linear algorithms is illustrated in renewable energy and biomedical applications, in particular for the prediction of wind profiles and the extraction of artifacts from EEG recordings.
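
    As a concrete illustration of the widely linear modelling discussed above, one well-known form of the CLMS extension is the augmented CLMS (ACLMS), which filters both the input vector and its complex conjugate; the sketch below is illustrative only (filter order, step size and the prediction setup are assumptions), not the thesis' exact algorithm.

```python
# Illustrative sketch of the augmented (widely linear) complex LMS filter.
# The output uses both the input regressor and its conjugate, which is what
# allows noncircular (improper) complex signals to be modelled without the
# under-modelling mentioned in the abstract.
import numpy as np

def aclms(x, d, order=4, mu=0.01):
    """Adaptively filter complex input x towards the desired signal d."""
    x, d = np.asarray(x), np.asarray(d)
    N = len(x)
    h = np.zeros(order, dtype=complex)          # strictly linear weights
    g = np.zeros(order, dtype=complex)          # conjugate (augmented) weights
    y = np.zeros(N, dtype=complex)
    for n in range(order - 1, N):
        xn = x[n - order + 1:n + 1][::-1]       # x[n], x[n-1], ..., x[n-order+1]
        y[n] = h @ xn + g @ np.conj(xn)         # widely linear output
        e = d[n] - y[n]                         # instantaneous error
        h = h + mu * e * np.conj(xn)            # ACLMS weight updates
        g = g + mu * e * xn
    return y

# Example: one-step-ahead prediction of a noncircular (improper) AR(1) signal.
rng = np.random.default_rng(0)
s = np.zeros(2000, dtype=complex)
for t in range(1, len(s)):
    s[t] = 0.9 * s[t - 1] + rng.standard_normal() + 0.2j * rng.standard_normal()
prediction = aclms(s[:-1], s[1:])
```
    For second-order circular inputs the optimal conjugate weights g are zero and the filter reduces to standard CLMS, which is consistent with the under-modelling argument in the abstract.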

    Applied Harmonic Analysis and Data Science (hybrid meeting)

    Data science has nowadays become a field of major importance for science and technology and poses a large variety of challenging mathematical questions. The area of applied harmonic analysis has a significant impact on such problems by providing methodologies both for theoretical questions and for a wide range of applications in signal and image processing and machine learning. Building on the success of three previous workshops on applied harmonic analysis in 2012, 2015 and 2018, this workshop focused on several exciting novel directions, such as the mathematical theory of deep learning, but also reported progress on long-standing open problems in the field.

    Universal Science of Mind: Can Complexity-Based Artificial Intelligence Save the World in Crisis?

    While practical efforts in the field of artificial intelligence grow exponentially, a truly scientific and mathematically exact understanding of the underlying phenomena of intelligence and consciousness is still missing from the conventional science framework. The inevitably dominating empirical, trial-and-error approach has vanishing efficiency for such extremely complicated phenomena, ending up in fundamentally limited imitations of intelligent behaviour. We provide a first-principles analysis of the unreduced many-body interaction process in the brain, revealing its qualitatively new features, which give rise to rigorously defined chaotic, noncomputable, intelligent and conscious behaviour. Based on the obtained universal concepts of unreduced dynamic complexity, intelligence and consciousness, we derive universal laws of intelligence applicable to any kind of intelligent system interacting with its environment. We finally show why and how these fundamentally substantiated and therefore practically efficient laws of intelligent system dynamics are indispensable for correct AI design and training, which is urgently needed in this time of critical global change towards truly sustainable development.