
    THE MODEL OF ANALOG COMPLEXING ALGORITHM BASED ON EMPIRICAL MODE DECOMPOSITION METHOD

    The Analog Complexing (AC) algorithm can be considered a sequential pattern recognition method for prediction. However, financial time-series data are often nonlinear and non-stationary, which causes difficulties when the AC algorithm is used for prediction. To address this problem, this paper uses Empirical Mode Decomposition (EMD) to preprocess the original data, yielding a series of stationary Intrinsic Mode Functions (IMFs); each IMF is then predicted dynamically by AC. Empirical studies on NYMEX Crude Oil Futures prices show that the AC algorithm based on the EMD method achieves high precision in 1-step and 3-step dynamic prediction. Key words: Analog Complexing algorithm, Empirical Mode Decomposition, Intrinsic Mode Function, Dynamic prediction
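The decompose-then-forecast pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a crude moving-average split stands in for true EMD sifting (which uses cubic-spline envelopes), and the analog step simply averages the successors of the k historical windows closest to the latest pattern. All function names and parameters here are assumptions for illustration.

```python
# Sketch of the EMD + Analog Complexing pipeline: decompose a series
# into components, forecast each component by analog matching, and sum.
# NOTE: crude_decompose is a moving-average stand-in for real EMD.

def moving_average(x, w):
    """Centered moving average; edges padded by repeating end values."""
    half = w // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    return [sum(padded[i:i + w]) / w for i in range(len(x))]

def crude_decompose(x, windows=(3, 7)):
    """Split x into oscillatory components plus a trend (EMD stand-in).
    The components sum back to x exactly by construction."""
    components, residual = [], list(x)
    for w in windows:
        smooth = moving_average(residual, w)
        components.append([r - s for r, s in zip(residual, smooth)])
        residual = smooth
    components.append(residual)  # final trend term
    return components

def analog_forecast(series, pattern_len=4, k=2):
    """Analog Complexing step: match the latest pattern against history
    and average the successors of the k closest historical windows."""
    pattern = series[-pattern_len:]
    candidates = []
    for start in range(len(series) - pattern_len):
        window = series[start:start + pattern_len]
        dist = sum((a - b) ** 2 for a, b in zip(window, pattern))
        candidates.append((dist, series[start + pattern_len]))
    candidates.sort(key=lambda t: t[0])
    best = candidates[:k]
    return sum(nxt for _, nxt in best) / len(best)

def emd_ac_forecast(x):
    """One-step-ahead forecast: decompose, predict each component, sum."""
    return sum(analog_forecast(c) for c in crude_decompose(x))

series = [float(i % 5) + 0.1 * i for i in range(40)]  # toy trending cycle
print(round(emd_ac_forecast(series), 3))
```

Multi-step ("dynamic") prediction would append each forecast to the series and repeat; a faithful version would also swap in a proper EMD routine.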

    Human vs. Algorithm

    We consider the roles of algorithm and human and their inter-relationships. As a vehicle for some of our ideas, we describe an empirical investigation of software professionals using analogy-based tools and unaided search to solve various prediction problems. We conclude that there exists a class of software engineering problems, which might be characterised as high value and low frequency, where the human-algorithm interaction must be considered carefully if such tools are to be successfully deployed in industry.

    Proximal boosting and its acceleration

    Gradient boosting is a prediction method that iteratively combines weak learners to produce a complex and accurate model. From an optimization point of view, the learning procedure of gradient boosting mimics gradient descent on a functional variable. This paper builds on the proximal point algorithm to introduce a novel boosting approach, called proximal boosting, for the case where the empirical risk to minimize is not differentiable. Besides being motivated by non-differentiable optimization, the proposed algorithm benefits from Nesterov's acceleration in the same way as gradient boosting [Biau et al., 2018], leading to a variant called accelerated proximal boosting. The advantages of leveraging proximal methods for boosting are illustrated by numerical experiments on simulated and real-world data. In particular, we exhibit a favorable comparison over gradient boosting regarding convergence rate and prediction accuracy.
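The idea of replacing the gradient step with a proximal step can be sketched on the non-differentiable absolute loss. In this hedged toy version (not the paper's algorithm), each round fits a regression stump to the proximal displacement of the current residual, clip(y_i - F(x_i), -lam, lam), which is what the proximal operator of |y - u| produces: predictions move toward their targets by at most `lam` per round. The stump learner, `lam`, and the learning rate are illustrative assumptions.

```python
# Toy proximal boosting on absolute (L1) loss: weak learners are fit to
# proximal displacements rather than (sub)gradients.

def fit_stump(x, targets):
    """1-D regression stump: pick the split minimizing squared error."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for cut in range(1, len(x)):
        left = [targets[order[i]] for i in range(cut)]
        right = [targets[order[i]] for i in range(cut, len(x))]
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = (sum((t - lmean) ** 2 for t in left)
               + sum((t - rmean) ** 2 for t in right))
        if best is None or err < best[0]:
            best = (err, x[order[cut - 1]], lmean, rmean)
    _, thr, lmean, rmean = best
    return lambda v: lmean if v <= thr else rmean

def proximal_boost(x, y, rounds=50, lam=0.5, lr=0.5):
    """Boost stumps; each round's target is the proximal displacement
    clip(y_i - F(x_i), -lam, lam) of the absolute loss."""
    learners = []
    pred = [0.0] * len(x)
    for _ in range(rounds):
        disp = [max(-lam, min(lam, yi - pi)) for yi, pi in zip(y, pred)]
        stump = fit_stump(x, disp)
        learners.append(stump)
        pred = [p + lr * stump(v) for p, v in zip(pred, x)]
    return lambda v: sum(lr * h(v) for h in learners)

x = [i / 10 for i in range(20)]
y = [1.0 if v > 1.0 else 0.0 for v in x]  # step function with a jump
model = proximal_boost(x, y)
print(round(model(0.2), 2), round(model(1.8), 2))
```

Nesterov-style acceleration, as in the accelerated variant above, would maintain a momentum sequence over the functional iterates; it is omitted here to keep the sketch short.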