
    Nonlinear independent component analysis for discrete-time and continuous-time signals

    We study the classical problem of recovering a multidimensional source signal from observations of nonlinear mixtures of this signal. We show that this recovery is possible (up to a permutation and monotone scaling of the source's original component signals) if the mixture is due to a sufficiently differentiable and invertible but otherwise arbitrarily nonlinear function, and the component signals of the source are statistically independent with 'non-degenerate' second-order statistics. The latter assumption requires the source signal to meet one of three regularity conditions, which essentially ensure that the source is sufficiently far from the non-recoverable extremes of being deterministic or constant in time. These assumptions, which cover many popular time series models and stochastic processes, allow us to reformulate the initial problem of nonlinear blind source separation as a simple-to-state problem of optimisation-based function approximation. We propose to solve this approximation problem by minimizing a novel type of objective function that efficiently quantifies the mutual statistical dependence between multiple stochastic processes via cumulant-like statistics. This yields a scalable and direct new method for nonlinear Independent Component Analysis with widely applicable theoretical guarantees, for which our experiments indicate good performance.
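The paper's actual objective function is not reproduced here; as a purely hypothetical illustration of a "cumulant-like" dependence penalty for stochastic processes, one can sum squared cross-correlations between component signals over several time lags. The AR(1) sources and the nonlinear mixture below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_dependence(s, lags=(0, 1, 2)):
    """Toy cumulant-like penalty: sum of squared cross-correlations between
    all pairs of component signals at several lags. Vanishes in expectation
    when the components are statistically independent."""
    T, d = s.shape
    s = (s - s.mean(0)) / s.std(0)
    total = 0.0
    for i in range(d):
        for j in range(i + 1, d):
            for lag in lags:
                total += np.mean(s[: T - lag, i] * s[lag:, j]) ** 2
    return total

# Two independent AR(1) sources versus a nonlinear invertible-style mixture.
T = 5000
e = rng.normal(size=(T, 2))
src = np.zeros((T, 2))
for t in range(1, T):
    src[t] = 0.9 * src[t - 1] + e[t]
mix = np.column_stack([np.tanh(src[:, 0] + src[:, 1]),
                       src[:, 1] + src[:, 0] ** 2])

# The mixture's components are dependent, so its penalty is markedly larger.
assert pairwise_dependence(mix) > pairwise_dependence(src)
```

Minimising such a penalty over a family of candidate unmixing functions is the flavour of the optimisation-based approach the abstract describes.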

    Large-Scale Kernel Methods for Independence Testing

    Representations of probability measures in reproducing kernel Hilbert spaces provide a flexible framework for fully nonparametric hypothesis tests of independence, which can capture any type of departure from independence, including nonlinear associations and multivariate interactions. However, these approaches come with an at least quadratic computational cost in the number of observations, which can be prohibitive in many applications. Arguably, it is exactly in such large-scale datasets that capturing any type of dependence is of interest, so striking a favourable tradeoff between computational efficiency and test performance for kernel independence tests would have a direct impact on their applicability in practice. In this contribution, we provide an extensive study of the use of large-scale kernel approximations in the context of independence testing, contrasting block-based, Nyström, and random Fourier feature approaches. Through a variety of synthetic data experiments, it is demonstrated that our novel large-scale methods give performance comparable with existing methods whilst using significantly less computation time and memory.
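As a hedged sketch of the random Fourier feature route (not the authors' exact statistic), one can approximate an HSIC-type dependence measure by the squared Frobenius norm of the cross-covariance between explicit feature maps; the feature dimension, bandwidth, and data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff(x, n_features=100, gamma=1.0, seed=1):
    """Random Fourier features approximating the Gaussian kernel
    k(x, y) = exp(-gamma * |x - y|^2), following Rahimi and Recht."""
    r = np.random.default_rng(seed)
    x = x[:, None] if x.ndim == 1 else x
    w = r.normal(scale=np.sqrt(2.0 * gamma), size=(x.shape[1], n_features))
    b = r.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(x @ w + b)

def rff_dependence(x, y):
    """Squared Frobenius norm of the empirical cross-covariance of the two
    feature maps: a linear-time surrogate for an HSIC-style statistic."""
    fx, fy = rff(x, seed=1), rff(y, seed=2)
    fx = fx - fx.mean(0)
    fy = fy - fy.mean(0)
    c = fx.T @ fy / len(x)
    return float(np.sum(c ** 2))

n = 2000
x = rng.normal(size=n)
y_indep = rng.normal(size=n)                      # independent of x
y_dep = np.sin(2 * x) + 0.1 * rng.normal(size=n)  # nonlinear association

s_dep, s_indep = rff_dependence(x, y_dep), rff_dependence(x, y_indep)
# The dependent pair should score clearly higher than the independent one.
```

The cost is O(n m) in sample size n and feature count m, instead of the O(n^2) kernel-matrix cost that motivates the paper.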

    Exploring the Components of the Universe Through Higher-Order Weak Lensing Statistics

    Our current cosmological model, backed by a large body of evidence from a variety of different cosmological probes (for example, see [1, 2]), describes a Universe comprised of around 5% normal baryonic matter, 22% cold dark matter and 73% dark energy. While many cosmologists accept this so-called concordance cosmology – the ΛCDM cosmological model – as accurate, very little is known about the nature and properties of these dark components of the Universe. Studies of the cosmic microwave background (CMB), combined with other observational evidence from big bang nucleosynthesis, indicate that dark matter is non-baryonic. This supports measurements on galaxy and cluster scales, which find evidence of a large proportion of dark matter. This dark matter appears to be cold and collisionless, apparent only through its gravitational effects.

    Numerical Methods for PDE Constrained Optimization with Uncertain Data

    Optimization problems governed by partial differential equations (PDEs) arise in many applications in the form of optimal control, optimal design, or parameter identification problems. In most applications, parameters in the governing PDEs are not deterministic, but rather have to be modeled as random variables or, more generally, as random fields. It is crucial to capture and quantify the uncertainty in such problems rather than simply replace the uncertain coefficients with their mean values. However, treating the uncertainty adequately and in a computationally tractable manner poses many mathematical challenges. The numerical solution of optimization problems governed by stochastic PDEs builds on mathematical subareas which have so far been investigated largely in separate communities: stochastic programming, numerical solution of stochastic PDEs, and PDE constrained optimization. The workshop provided an impulse towards cross-fertilization of these disciplines, which was also the subject of several scientific discussions. It is to be expected that future exchange of ideas between these areas will give rise to new insights and powerful new numerical methods.
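To illustrate why replacing an uncertain coefficient with its mean can be suboptimal, here is a minimal sample average approximation (SAA) sketch for a 1-D stochastic Poisson control problem. The discretisation, the lognormal coefficient, and the closed-form optimum are assumptions for this toy setting, not methods from the workshop.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_samples = 50, 400
x = np.linspace(0.0, 1.0, n + 2)[1:-1]
h = x[1] - x[0]
target = np.sin(np.pi * x)  # desired state

# Discrete negative Laplacian L ~ -d^2/dx^2 with zero Dirichlet boundaries.
L = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

def solve_state(f, a):
    """Solve the state equation -a u'' = f, i.e. (a L) u = f."""
    return np.linalg.solve(a * L, f)

def expected_misfit(f, a_samples):
    """Sample average approximation of E[ ||u(a; f) - target||_{L2}^2 ]."""
    return float(np.mean([h * np.sum((solve_state(f, a) - target) ** 2)
                          for a in a_samples]))

# Random diffusion coefficient: a single lognormal variable, for simplicity.
a_samples = rng.lognormal(mean=0.0, sigma=0.4, size=n_samples)

# Naive control: plug in the mean coefficient and match the target exactly.
f_mean = a_samples.mean() * (L @ target)

# SAA-optimal control: since u = (1/a) L^{-1} f, the quadratic objective has
# the closed-form minimiser L^{-1} f = (E[1/a] / E[1/a^2]) * target.
inv1 = np.mean(1.0 / a_samples)
inv2 = np.mean(1.0 / a_samples ** 2)
f_saa = (inv1 / inv2) * (L @ target)

# The SAA control never does worse in expected misfit than the mean plug-in.
assert expected_misfit(f_saa, a_samples) <= expected_misfit(f_mean, a_samples)
```

The gap between the two controls grows with the coefficient's variance, which is exactly the effect the abstract warns about.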

    Variations on the Theme of Conning in Mathematical Economics

    The mathematization of economics is almost exclusively in terms of the mathematics of real analysis which, in turn, is founded on set theory (and the axiom of choice) and orthodox mathematical logic. In this paper I try to point out that this kind of mathematization is replete with economic infelicities. The attempt to extract these infelicities proceeds via three main examples: dynamics, policy, and rational expectations and learning. The focus is on the role of, and reliance on, standard fixed point theorems in orthodox mathematical economics.

    A Nash Game Based Variational Model For Joint Image Intensity Correction And Registration To Deal With Varying Illumination

    Registration aligns features of two related images so that information can be compared and/or fused in order to highlight differences and complement information. In real-life images where a bias field is present, this undesirable artefact causes inhomogeneity of image intensities and hence leads to failure, or loss of accuracy, of registration models based on minimizing the differences between the two images' intensities. Here, we propose a non-linear variational model for joint image intensity correction (illumination and translation) and registration, and reformulate it in a game framework. While a non-potential game offers a flexible reformulation and can lead to better fitting errors, proving solution existence for a non-convex model is non-trivial. Here we establish an existence result using Schauder's fixed point theorem. To solve the model numerically, we use an alternating minimization algorithm in the discrete setting. Finally, numerical results show that the new model outperforms existing models.
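A toy 1-D analogue of joint intensity correction and registration, solved by alternating minimisation as the abstract describes; the signals, the additive-offset intensity model, and the grid search over shifts are illustrative assumptions, not the paper's variational model.

```python
import numpy as np

# Template z; the reference is z shifted and given an intensity offset.
x = np.linspace(0.0, 1.0, 400)
z = np.exp(-((x - 0.5) / 0.1) ** 2)          # template "image" (1-D)
true_shift, true_offset = 0.07, 0.3
ref = np.interp(x - true_shift, x, z) + true_offset

def warp(shift):
    """Translate the template by `shift` (linear interpolation)."""
    return np.interp(x - shift, x, z)

shift, offset = 0.0, 0.0
for _ in range(20):
    # (1) Intensity step: closed-form least squares given the current shift.
    offset = np.mean(ref - warp(shift))
    # (2) Registration step: grid search over the shift given the offset.
    grid = np.linspace(-0.2, 0.2, 401)
    errs = [np.sum((warp(s) + offset - ref) ** 2) for s in grid]
    shift = grid[int(np.argmin(errs))]

# Both the translation and the intensity offset are recovered jointly.
assert abs(shift - true_shift) < 0.01
assert abs(offset - true_offset) < 0.05
```

Each sub-problem is easy on its own; the coupling between intensity correction and alignment is what motivates the game-theoretic reformulation in the paper.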

    Bayesian spatio-temporal modelling for forecasting ground level ozone concentration levels

    Accurate, instantaneous and high-resolution spatial air-quality information can better inform the public and regulatory agencies of the air pollution levels that could cause adverse health effects. The most direct way to obtain accurate air-quality information is from measurements made at surface monitoring stations across a study region of interest. Typically, however, air monitoring sites are sparsely and irregularly spaced over large areas, which is why it is important to develop space-time models for air pollution that can produce accurate spatial predictions and temporal forecasts. This thesis focuses on developing spatio-temporal models for interpolating and forecasting ground-level ozone concentration levels over a vast study region in the eastern United States. These models incorporate output from a computer simulation model known as the Community Multi-scale Air Quality (Eta-CMAQ) forecast model, which can forecast up to 24 hours in advance. However, these forecasts are known to be biased. The models proposed here are shown to improve upon these forecasts for a two-week study period during August 2005. The forecasting problems in both hourly and daily time units are investigated in detail. A fast method, based on Gaussian models, is constructed for instantaneous interpolation and forecasting of hourly data. A more complex dynamic model, requiring the use of Markov chain Monte Carlo (MCMC) techniques, is developed for forecasting daily ozone concentration levels. A set of model validation analyses shows that the prediction maps generated by these models are more accurate than maps based solely on the Eta-CMAQ forecast data. A non-Gaussian measurement error model is also considered when forecasting extreme levels of ozone concentration. All of the methods presented are Bayesian, and MCMC sampling techniques are used to explore the posterior and predictive distributions.
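As an illustrative sketch of Bayesian bias correction of a numerical forecast (synthetic data and a fixed noise scale are assumptions, and this is far simpler than the thesis's spatio-temporal models), a random-walk Metropolis sampler can recover additive and multiplicative forecast bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: "observed" ozone is a biased version of a
# CMAQ-style numerical forecast, with additive and multiplicative bias.
n = 200
cmaq = rng.uniform(20.0, 80.0, n)                 # forecast ozone (ppb)
obs = 5.0 + 0.8 * cmaq + rng.normal(0.0, 4.0, n)  # synthetic observations
cmaq_c = cmaq - cmaq.mean()                        # centring decorrelates a, b

def log_post(theta, sigma=4.0):
    """Log posterior for obs ~ N(a + b * cmaq_c, sigma^2) with flat priors
    (the noise scale is held fixed for brevity)."""
    a, b = theta
    resid = obs - (a + b * cmaq_c)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis sampler for (a, b).
theta, lp = np.array([0.0, 1.0]), -np.inf
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.3, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                # discard burn-in

b_hat = post[:, 1].mean()
assert abs(b_hat - 0.8) < 0.1                  # multiplicative bias recovered
```

Once the posterior over the bias parameters is in hand, corrected forecasts and predictive intervals follow by pushing new forecast values through the posterior draws.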

    Tuning of the Dielectric Relaxation and Complex Susceptibility in a System of Polar Molecules: A Generalised Model Based on Rotational Diffusion with Resetting

    The application of fractional calculus in the mathematical modelling of relaxation processes in complex heterogeneous media has attracted considerable interest lately. The reason for this is the successful use of fractional stochastic and kinetic equations in studies of non-Debye relaxation. In this work, we consider the rotational diffusion equation with a generalised memory kernel in the context of dielectric relaxation processes in a medium composed of polar molecules. We give an overview of existing models of non-exponential relaxation and introduce exponential resetting dynamics into the corresponding process. The autocorrelation function and complex susceptibility are analysed in detail. We show that stochastic resetting leads to saturation of the autocorrelation function at a constant value, in contrast to the case without resetting, for which it decays to zero. The behaviour of the autocorrelation function, as well as the complex susceptibility in the presence of resetting, confirms that the dielectric relaxation dynamics can be tuned by an appropriate choice of the resetting rate. The presented results are general and flexible, and will be of interest for the theoretical description of non-trivial relaxation dynamics in heterogeneous systems composed of polar molecules.
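A minimal simulation under assumed parameters illustrates the saturation effect described above: for ordinary (memoryless) rotational diffusion of an angle started at zero, the orientational correlation decays like exp(-D t), while Poissonian resetting to the initial orientation at rate r makes it saturate near r / (r + D).

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_cos(n_traj=5000, n_steps=500, dt=0.01, D=1.0, r=0.0):
    """Simulate rotational diffusion of an angle theta (all trajectories
    started at theta = 0), with Poissonian resetting to theta = 0 at rate r,
    and return <cos theta(t)> at the final time t = n_steps * dt."""
    theta = np.zeros(n_traj)
    for _ in range(n_steps):
        theta += np.sqrt(2.0 * D * dt) * rng.normal(size=n_traj)
        if r > 0:
            theta[rng.uniform(size=n_traj) < r * dt] = 0.0  # reset events
    return np.cos(theta).mean()

c_free = mean_cos(r=0.0)   # decays like exp(-D t), ~0 at t = 5
c_reset = mean_cos(r=1.0)  # saturates near r / (r + D) = 0.5

assert c_reset > 0.3 > c_free
```

Varying r in this sketch reproduces the tunability the abstract emphasises: larger resetting rates push the plateau of the orientational correlation closer to one.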