18,346 research outputs found

    Kernel methods for short-term spatio-temporal wind prediction

    Two nonlinear methods for producing short-term spatio-temporal wind speed forecasts are presented. From the relatively new class of kernel methods, a kernel least mean squares algorithm and a kernel recursive least squares algorithm are introduced and used to produce 1- to 6-hour-ahead predictions of wind speed at six locations in the Netherlands. The performance of the proposed methods is compared to that of their linear equivalents, as well as to the autoregressive, vector autoregressive and persistence time series models. The kernel recursive least squares algorithm is shown to offer significant improvement over all benchmarks, particularly at longer forecast horizons. Both proposed algorithms exhibit desirable numerical properties and are ripe for further development.
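The kernel recursive least squares recursion at the heart of the stronger of the two methods can be sketched as follows. This is a minimal, unsparsified variant in which every sample joins the dictionary, with a small regulariser added for numerical stability; the Gaussian kernel width and regularisation constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gauss_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) kernel between one sample and an array of samples."""
    return np.exp(-np.sum((a - b) ** 2, axis=-1) / (2 * sigma ** 2))

class KRLS:
    """Naive kernel recursive least squares: every sample is kept in the
    dictionary (no sparsification); lam regularises the Gram matrix."""
    def __init__(self, lam=1e-6, sigma=1.0):
        self.lam, self.sigma = lam, sigma
        self.X, self.alpha, self.Q = None, None, None

    def predict(self, x):
        if self.X is None:
            return 0.0
        return float(gauss_kernel(x, self.X, self.sigma) @ self.alpha)

    def update(self, x, y):
        x = np.atleast_1d(x).astype(float)
        if self.X is None:                      # first sample initialises the state
            k = gauss_kernel(x, x[None, :], self.sigma)[0] + self.lam
            self.X = x[None, :]
            self.Q = np.array([[1.0 / k]])      # inverse of the 1x1 Gram matrix
            self.alpha = np.array([y / k])
            return
        k = gauss_kernel(x, self.X, self.sigma)   # kernels vs. current dictionary
        a = self.Q @ k
        delta = self.lam + gauss_kernel(x, x[None, :], self.sigma)[0] - k @ a
        e = y - k @ self.alpha                    # a-priori prediction error
        # grow the inverse Gram matrix by one row/column (block-inverse identity)
        self.Q = np.block([[delta * self.Q + np.outer(a, a), -a[:, None]],
                           [-a[None, :], np.ones((1, 1))]]) / delta
        self.alpha = np.append(self.alpha - a * e / delta, e / delta)
        self.X = np.vstack([self.X, x])
```

For 1- to 6-hour-ahead forecasting, the input x would be a vector of recent wind speed measurements (possibly from several sites) and y the speed at the target horizon.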

    Square Root Extended Kernel Recursive Least Squares Algorithm for Nonlinear Channel Equalization

    Abstract: This study presents a square root version of the extended kernel recursive least squares algorithm. The main idea is to overcome the divergence phenomenon that arises in the computation of the weights of the extended kernel recursive least squares algorithm. Numerically stable Givens orthogonal transformations are used to obtain the next iterate of the algorithm. The usefulness of the proposed algorithm is illustrated by its application to nonlinear multipath fading channel equalization based on a Rayleigh distribution. Experiments are performed on a slow-fading Rayleigh channel with scattered signals.
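The role of the Givens rotations can be illustrated with a generic square-root update: rather than propagating a covariance-like matrix directly, where rounding can destroy positive definiteness and cause the divergence mentioned above, one propagates an upper-triangular factor and absorbs each new data row with a sweep of orthogonal rotations. A minimal sketch, not the paper's algorithm:

```python
import numpy as np

def givens(a, b):
    """Rotation (c, s) zeroing b: [[c, s], [-s, c]] maps (a, b) to (r, 0)."""
    if b == 0.0:
        return 1.0, 0.0
    r = np.hypot(a, b)
    return a / r, b / r

def cholesky_update(R, x):
    """Rank-one update of an upper-triangular factor: returns R' with
    R'^T R' = R^T R + x x^T, via a sweep of Givens rotations."""
    R = R.copy()
    x = x.copy().astype(float)
    n = len(x)
    for i in range(n):
        c, s = givens(R[i, i], x[i])
        # rotate row i of R against the incoming row x, zeroing x[i]
        Ri, xi = R[i, i:].copy(), x[i:].copy()
        R[i, i:] = c * Ri + s * xi
        x[i:] = -s * Ri + c * xi
    return R
```

Because every rotation is orthogonal, the updated factor satisfies R'^T R' = R^T R + x x^T exactly (up to rounding), which is what keeps square-root recursions numerically stable.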

    Kernel recursive least squares dictionary learning algorithm

    An online dictionary learning algorithm for kernel sparse representation is developed in this paper. In this framework, the input signal, nonlinearly mapped into the feature space, is sparsely represented over a virtual dictionary in the same space. At each instant, the dictionary is updated in two steps. In the first step, the incoming signal samples are sparsely represented in the feature space, using the dictionary updated from the previous data. In the second step, the dictionary itself is updated. A novel recursive dictionary update algorithm is derived, based on the recursive least squares (RLS) approach; it gradually updates the dictionary upon receiving one or a mini-batch of training samples. An efficient implementation of the algorithm is also formulated. Experimental results over four datasets from different fields show the superior performance of the proposed algorithm in comparison with its counterparts. In particular, the classification accuracy obtained with dictionaries trained by the proposed algorithm gradually approaches that of dictionaries trained in batch mode. Moreover, despite its lower computational complexity, the proposed algorithm outperforms all existing online kernel dictionary learning algorithms.
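The recursive dictionary-update step (the second step above) can be sketched as a multi-output RLS recursion. This is an illustrative reconstruction under the assumption that the sparse codes w are supplied by a separate sparse-coding step; it is not the paper's kernel-space formulation.

```python
import numpy as np

class RLSDictUpdate:
    """RLS dictionary update: after a sample x has been sparsely coded as w,
    refit D to minimise the forgetting-weighted reconstruction error
    sum_k lam**(t-k) * ||x_k - D w_k||^2 over all samples seen so far."""
    def __init__(self, d, m, lam=1.0, delta=1e6):
        self.D = np.zeros((d, m))     # dictionary: d-dim signals, m atoms
        self.C = delta * np.eye(m)    # running inverse of the code Gram matrix
        self.lam = lam

    def update(self, x, w):
        Cw = self.C @ w
        g = Cw / (self.lam + w @ Cw)              # RLS gain vector
        self.D += np.outer(x - self.D @ w, g)     # correct D toward the residual
        self.C = (self.C - np.outer(g, Cw)) / self.lam
        return self.D
```

With lam = 1 this recursion reproduces the batch least-squares dictionary for the accumulated (x, w) pairs; lam < 1 lets the dictionary track drifting data, matching the gradual updates described in the abstract.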

    Sparse least squares support vector regression for nonstationary systems

    A new adaptive sparse least squares support vector regression algorithm, referred to as SLSSVR, is introduced for the adaptive modelling of nonstationary systems. Using a sliding window of the N most recent data points to track the nonstationary characteristics of the incoming data, the adaptive model is initially formulated as least squares support vector regression with a forgetting factor (without a bias term). To obtain a sparse model in which some parameters are exactly zero, an l1 penalty is applied to the parameter estimates in the dual problem. We further exploit the fact that, since the associated system/kernel matrix is positive definite, the dual solution of the least squares support vector machine without a bias term can be computed iteratively with guaranteed convergence. Moreover, since the models at two consecutive time steps share (N-1) kernels/parameters, the online solution can be obtained efficiently using a coordinate descent algorithm, in the form of a Gauss-Seidel sweep, with a minimal number of iterations. This yields a very sparse model per time step very efficiently, avoiding expensive matrix inversion. A real stock market dataset and simulated examples show that the proposed approach achieves superior performance compared with the linear recursive least squares algorithm and a number of online nonlinear approaches, in terms of both modelling performance and model size.
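The Gauss-Seidel coordinate-descent solver described above can be sketched for the l1-penalised dual, assuming the dual takes the standard quadratic form with a positive definite (ridge-augmented) kernel matrix K; the sliding-window bookkeeping and warm-starting across time steps are omitted.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator: shrinks z toward zero by t."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def sparse_lssvr_dual(K, y, mu, n_iter=500):
    """Gauss-Seidel coordinate descent for the assumed l1-penalised dual
    (no bias term): min_a 0.5*a'Ka - y'a + mu*||a||_1, K positive definite.
    Each coordinate update is exact, so the objective decreases monotonically."""
    n = len(y)
    a = np.zeros(n)
    for _ in range(n_iter):
        for i in range(n):
            r = y[i] - K[i] @ a + K[i, i] * a[i]   # residual excluding a_i
            a[i] = soft(r, mu) / K[i, i]
    return a
```

The soft-thresholding step sets coordinates exactly to zero, which is what produces the sparse per-time-step models; warm-starting from the previous window's solution is what keeps the number of sweeps minimal in the online setting.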

    A New Recursive Least-Squares Method with Multiple Forgetting Schemes

    We propose a recursive least-squares method with multiple forgetting schemes to track time-varying model parameters that change at different rates. Our approach hinges on reformulating the classic recursive least squares with a forgetting scheme as a regularized least squares problem. A simulation study shows the effectiveness of the proposed method.
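For reference, the classic single-forgetting-factor recursion that the paper reformulates can be sketched as below; the multiple-forgetting extension assigns a separate factor to each parameter (the constants here are illustrative):

```python
import numpy as np

class RLSForgetting:
    """Recursive least squares with a single forgetting factor lam:
    minimises sum_k lam**(t-k) * (y_k - phi_k @ theta)**2 over time."""
    def __init__(self, n, lam=0.98, delta=100.0):
        self.theta = np.zeros(n)
        self.P = delta * np.eye(n)   # inverse (weighted) information matrix
        self.lam = lam

    def update(self, phi, y):
        Pphi = self.P @ phi
        g = Pphi / (self.lam + phi @ Pphi)               # gain vector
        self.theta = self.theta + g * (y - phi @ self.theta)
        self.P = (self.P - np.outer(g, Pphi)) / self.lam
        return self.theta
```

A smaller lam discounts old data faster and so tracks quickly-varying parameters, at the cost of noisier estimates; using one lam per parameter, as the paper proposes, decouples that trade-off across parameters with different rates of change.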