
    Incremental Reduced Lagrangian Asymmetric ν-Twin Support Vector Regression

    Lagrangian asymmetric ν-twin support vector regression is a prediction algorithm with good generalization performance. However, it is unsuitable for scenarios where samples arrive incrementally. Therefore, an incremental reduced Lagrangian asymmetric ν-twin support vector regression (IRLAsy-ν-TSVR) algorithm is proposed. Firstly, the constrained optimization problems are transformed into unconstrained ones by introducing plus functions, and the semi-smooth Newton method is used to solve them directly in the primal space, accelerating convergence. Then, the matrix inversion lemma is adopted to update the inverse of the Hessian matrix incrementally in the semi-smooth Newton method, saving computation time. Next, to reduce the memory cost caused by sample accumulation, the column and row vectors of the augmented kernel matrix are filtered by a reduction technique to approximate the original augmented kernel matrix, which ensures the sparsity of the solution. Finally, the feasibility and efficacy of the proposed algorithm are validated on benchmark datasets. The results show that, compared with several state-of-the-art algorithms, the IRLAsy-ν-TSVR algorithm inherits the generalization performance of the offline algorithm and obtains a sparse solution, making it more suitable for online learning on large-scale datasets.
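The incremental update of the Hessian inverse mentioned above relies on the matrix inversion lemma. A minimal sketch of the rank-one (Sherman-Morrison) special case is shown below; the function name and the small test matrix are illustrative assumptions, not taken from the paper, which applies the lemma within its own semi-smooth Newton iteration.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Rank-one special case of the matrix inversion lemma:
    given A^{-1}, compute (A + u v^T)^{-1} in O(n^2) time
    instead of re-inverting from scratch in O(n^3)."""
    Au = A_inv @ u            # A^{-1} u
    vA = v @ A_inv            # v^T A^{-1}
    denom = 1.0 + v @ Au      # scalar 1 + v^T A^{-1} u
    return A_inv - np.outer(Au, vA) / denom

# Illustrative check against a direct inversion.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
u = np.array([0.5, -0.2])
v = np.array([1.0, 0.3])
updated = sherman_morrison_update(np.linalg.inv(A), u, v)
direct = np.linalg.inv(A + np.outer(u, v))
assert np.allclose(updated, direct)
```

The saving matters in the incremental setting: each newly arriving sample perturbs the Hessian by a low-rank term, so the stored inverse can be refreshed cheaply rather than recomputed.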