    Adaptive Random Fourier Features Kernel LMS

    We propose the adaptive random Fourier features Gaussian kernel LMS (ARFF-GKLMS). Like most kernel adaptive filters based on stochastic gradient descent, this algorithm uses a preset number of random Fourier features to save computation cost. As an added flexibility, however, it can adapt the inherent kernel bandwidth of the random Fourier features in an online manner. This adaptation mechanism alleviates the problem of selecting the kernel bandwidth beforehand and improves tracking in non-stationary circumstances. Simulation results confirm that the proposed algorithm outperforms other kernel adaptive filters with a preset kernel bandwidth in terms of convergence rate, steady-state error, and tracking ability. (5 pages, 2 figures)
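    A minimal sketch of the idea, assuming a Gaussian-kernel random Fourier feature map and a plain stochastic-gradient update on the bandwidth; the paper's exact adaptation rule may differ, and all names and step sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

d, D = 5, 200            # input dimension, number of random features (preset)
mu_w, mu_s = 0.5, 0.01   # step sizes for the weights and the bandwidth

# Base frequencies; dividing by sigma gives W ~ N(0, I / sigma^2), the
# spectral distribution of the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2)).
W0 = rng.standard_normal((D, d))
b = rng.uniform(0.0, 2.0 * np.pi, D)

sigma = 1.0          # kernel bandwidth, adapted online
theta = np.zeros(D)  # linear filter weights in feature space

def features(x, sigma):
    """Random Fourier features of x for the current bandwidth."""
    return np.sqrt(2.0 / D) * np.cos((W0 @ x) / sigma + b)

def step(x, y_true):
    """One kernel-LMS weight update plus a gradient step on the bandwidth."""
    global theta, sigma
    z = features(x, sigma)
    e = y_true - theta @ z        # a-priori error
    theta = theta + mu_w * e * z  # standard LMS update in feature space
    # Hypothetical bandwidth rule: gradient descent on e^2 w.r.t. sigma,
    # using dz/dsigma of the cosine features (factor 2 absorbed in mu_s).
    dz = np.sqrt(2.0 / D) * np.sin((W0 @ x) / sigma + b) * (W0 @ x) / sigma**2
    sigma = sigma + mu_s * e * (theta @ dz)
    return e

# Toy usage: track y = sin(sum(x)) from streaming samples.
for _ in range(1000):
    x = rng.standard_normal(d)
    step(x, np.sin(x.sum()))
```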

    Breaking the waves: asymmetric random periodic features for low-bitrate kernel machines

    Many signal processing and machine learning applications are built from evaluating a kernel on pairs of signals, e.g. to assess the similarity of an incoming query to a database of known signals. This nonlinear evaluation can be simplified to a linear inner product of the random Fourier features of those signals: random projections followed by a periodic map, the complex exponential. It is known that a simple quantization of those features (corresponding to replacing the complex exponential by a different periodic map that takes binary values, which is appealing for transmission and storage) distorts the approximated kernel, which may be undesirable in practice. Our take-home message is that when the features of only one of the two signals are quantized, the original kernel is recovered without distortion; this is of practical interest in several cases where the kernel evaluations are asymmetric by nature, such as a client-server scheme. Concretely, we introduce the general framework of asymmetric random periodic features, where the two signals of interest are observed through random periodic features: random projections followed by a general periodic map, which is allowed to be different for the two signals. We derive the influence of those periodic maps on the approximated kernel and prove uniform probabilistic error bounds holding for all signal pairs from an infinite low-complexity set. Interestingly, our results allow the periodic maps to be discontinuous, thanks to a new mathematical tool, the mean Lipschitz smoothness. We then apply this generic framework to semi-quantized kernel machines (where only one signal has quantized features and the other has classical random Fourier features), for which we show theoretically that the approximated kernel remains unchanged (with the associated error bound), and confirm the power of the approach with numerical simulations.
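    A toy numerical check of the semi-quantized scheme, assuming real cosine features and the sign of the cosine as the binary periodic map; this is one concrete instance of the paper's general framework, and the pi/4 rescaling below comes from the first Fourier coefficient of that particular square wave (our own illustrative computation, not a quantity from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
d, D = 8, 100_000   # input dimension; many features for a tight estimate
sigma = 1.0         # Gaussian kernel bandwidth

W = rng.standard_normal((D, d)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, D)

def rff(x):
    """Classical (unquantized) real random Fourier features."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

def rff_1bit(x):
    """One-bit features: a binary square wave replaces the cosine."""
    return np.sqrt(2.0 / D) * np.sign(np.cos(W @ x + b))

x, y = rng.standard_normal(d), rng.standard_normal(d)
k_true = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma**2))

both_q = rff_1bit(x) @ rff_1bit(y)           # both sides quantized
semi_q = rff_1bit(x) @ rff(y) * np.pi / 4.0  # one side quantized, rescaled

print(f"true Gaussian kernel : {k_true:.4f}")
print(f"both sides quantized : {both_q:.4f}")  # systematically distorted
print(f"semi-quantized       : {semi_q:.4f}")  # matches up to sampling noise
```

    The fully quantized estimate converges to a different (distorted) kernel, which is exactly the distortion the abstract describes, while the semi-quantized inner product recovers the Gaussian kernel up to the known, correctable scale factor.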

    Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction

    We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, using two low-rank kernel approximations based on random Fourier features and truncation of Mercer expansions. In particular, we bound the Kullback-Leibler divergence between the idealized Gaussian process and the one resulting from a low-rank approximation to its kernel. Additionally, we present strong evidence that these two approximations, enhanced by an initial automatic feature extraction through deep neural networks, outperform a broad range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
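    A minimal sketch of the linear-time construction, assuming the random-Fourier-features branch: Bayesian linear regression on the feature map is equivalent to GP regression with the approximated kernel, at O(n D^2) rather than O(n^3) cost. The deep feature extractor and the Mercer-truncation variant are omitted, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D regression data: noisy sine.
n, d, D = 200, 1, 100
X = rng.uniform(-3.0, 3.0, (n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

sigma, noise = 1.0, 0.1  # kernel bandwidth, observation noise std

W = rng.standard_normal((D, d)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, D)

def phi(X):
    """Feature map with phi(x) @ phi(x') ~= Gaussian kernel k(x, x')."""
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

# Weight-space GP: posterior over w in y = phi(x) @ w + eps, prior w ~ N(0, I).
Z = phi(X)                          # (n, D)
A = Z.T @ Z + noise**2 * np.eye(D)  # (D, D): O(n D^2), linear in n
w_mean = np.linalg.solve(A, Z.T @ y)

# Predictive mean and variance at test points.
Xs = np.linspace(-3.0, 3.0, 7).reshape(-1, 1)
Zs = phi(Xs)
f_mean = Zs @ w_mean
f_var = noise**2 * np.sum(Zs * np.linalg.solve(A, Zs.T).T, axis=1)
print(np.c_[f_mean, f_var])
```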