
    The Seven-League scheme: Deep learning for large time step Monte Carlo simulations of stochastic differential equations

    We propose an accurate data-driven numerical scheme to solve stochastic differential equations (SDEs) by taking large time steps. The SDE discretization is built up by means of the polynomial chaos expansion method, on the basis of accurately determined stochastic collocation (SC) points. By employing an artificial neural network to learn these SC points, we can perform Monte Carlo simulations with large time steps. Basic error analysis indicates that this data-driven scheme results in accurate SDE solutions in the sense of strong convergence, provided the learning methodology is robust and accurate. With a method variant called the compression–decompression collocation and interpolation technique, we can drastically reduce the number of neural network functions that have to be learned, so that computational speed is enhanced. As a proof of concept, 1D numerical experiments confirm a small strong convergence error when using large time steps, and the novel scheme outperforms some classical numerical SDE discretizations. Applications to financial option valuation are also presented.
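    The core mechanic of the abstract above can be sketched in a few lines: at each large time step, conditional quantiles (SC points) of the next state are obtained, and samples are drawn by mapping standard normal draws through an interpolant of those points. The sketch below is illustrative only: it uses exact geometric Brownian motion quantiles as a stand-in for the trained neural network, piecewise-linear rather than polynomial interpolation, and hypothetical function names.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Illustrative sketch of collocation-based large-step sampling.
    # In the Seven-League scheme a neural network supplies the stochastic
    # collocation (SC) points; here, as a stand-in, we compute them exactly
    # for geometric Brownian motion, whose conditional quantiles are known.

    def sc_points_gbm(x0, mu, sigma, dt, z_nodes):
        # Conditional quantiles of X_{t+dt} | X_t = x0, evaluated at the
        # standard normal nodes z_nodes (this is what the network learns).
        return x0 * np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z_nodes)

    def large_step_sample(x0, n, rng, mu=0.05, sigma=0.2, dt=1.0, m=5):
        # Equally spaced probability levels 1/(m+1), ..., m/(m+1) -> normal nodes.
        z_nodes = norm.ppf(np.arange(1, m + 1) / (m + 1))
        y_nodes = sc_points_gbm(x0, mu, sigma, dt, z_nodes)  # the SC points
        z = rng.standard_normal(n)                           # kernel samples
        # Map the kernel samples through an interpolant of the SC points
        # (piecewise linear here; the paper uses polynomial interpolation).
        return np.interp(z, z_nodes, y_nodes)
    ```

    With an odd number of nodes the median node sits at z = 0, so the sample median lands near the exact conditional median even with this crude interpolant; the tails are clamped, which the paper's higher-order interpolation avoids.
    
    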

    GPU acceleration of the Seven-League Scheme for large time step simulations of stochastic differential equations

    Monte Carlo simulation is widely used to numerically solve stochastic differential equations. Although the method is flexible and easy to implement, it may be slow to converge. Moreover, an inaccurate solution will result when using large time steps. The Seven-League scheme, a deep learning-based numerical method, has been proposed to address these issues. This paper extends the scheme to parallel computing, particularly on Graphics Processing Units (GPUs), improving its computational speed.
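    The reason the scheme parallelizes well is that every Monte Carlo path applies the same sampling map independently, so one time step is a single batched array operation. A minimal sketch, using the exact GBM map as a stand-in for the learned collocation map (the paper's actual GPU implementation differs): written against NumPy, the same code runs on a GPU by swapping in the CuPy array library.

    ```python
    import numpy as np

    # One large time step for all paths at once. The per-path map here is
    # the exact GBM transition (a stand-in for the learned collocation map
    # of the Seven-League scheme); the point is the batched, data-parallel
    # structure, which maps directly onto GPU execution.

    def batched_step(x, rng, mu=0.05, sigma=0.2, dt=1.0):
        z = rng.standard_normal(x.shape)  # one normal draw per path
        return x * np.exp((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)

    def simulate(x0, n_paths, n_steps, seed=0):
        rng = np.random.default_rng(seed)
        x = np.full(n_paths, float(x0))
        for _ in range(n_steps):          # outer loop is short: few, large steps
            x = batched_step(x, rng)
        return x
    ```
    
    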

    Evaluation of integrals with fractional Brownian motion for different Hurst indices

    In this paper, we will evaluate integrals that define the conditional expectation, variance and characteristic function of stochastic processes with respect to fractional Brownian motion (fBm) for all relevant Hurst indices, i.e. H ∈ (0, 1). Particularly, the fractional Ornstein–Uhlenbeck (fOU) process gives rise to highly nontrivial integration formulas that need careful analysis when considering the whole range of Hurst indices. We will show that the classical technique of analytic continuation, from complex analysis, provides a way of extending the domain of validity of an integral in the Hurst index to the full range H ∈ (0, 1). Numerical experiments for different Hurst indices confirm the robustness and efficiency of the integral formulations presented. Moreover, we provide accurate and highly efficient financial option pricing results for processes that are related to the fOU process, with the help of Fourier cosine expansions.
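    For readers unfamiliar with the process family above: fBm is the Gaussian process with covariance E[B_H(s) B_H(t)] = ½(s^{2H} + t^{2H} − |t − s|^{2H}), defined for every Hurst index H ∈ (0, 1). The sketch below samples fBm on a grid via a Cholesky factorisation of that covariance and checks the marginal variance t^{2H}; it illustrates the processes studied in the paper, not its integral formulas or analytic continuation.

    ```python
    import numpy as np

    # Sample fractional Brownian motion B_H on a time grid by factorising
    # its covariance matrix,
    #   cov(s, t) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H}),
    # which is valid for any Hurst index H in (0, 1). The grid must
    # exclude t = 0, where the covariance matrix becomes singular.

    def fbm_paths(hurst, t_grid, n_paths, rng):
        s, t = np.meshgrid(t_grid, t_grid)
        cov = 0.5 * (s**(2 * hurst) + t**(2 * hurst) - np.abs(t - s)**(2 * hurst))
        chol = np.linalg.cholesky(cov)                   # requires positive definiteness
        z = rng.standard_normal((len(t_grid), n_paths))  # i.i.d. standard normals
        return chol @ z                                  # shape: (len(t_grid), n_paths)
    ```

    Cholesky sampling is exact but O(n³) in the grid size; it is only meant here as the simplest correct reference construction.
    
    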

    Linear feature extraction for ranking

    We address the feature extraction problem for document ranking in information retrieval. We propose LifeRank, a Linear feature extraction algorithm for Ranking. In LifeRank, we regard each document collection for ranking as a matrix, referred to as the original matrix. We try to optimize a transformation matrix, so that a new matrix (dataset) can be generated as the product of the original matrix and the transformation matrix. The transformation matrix projects high-dimensional document vectors into lower dimensions. In principle, the number of possible transformation matrices is very large, each leading to a different generated matrix. In LifeRank, we produce a transformation matrix so that the generated matrix suits the learning-to-rank problem. Extensive experiments on benchmark datasets show the performance gains of LifeRank in comparison with state-of-the-art feature selection algorithms.
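    The projection step described above is a single matrix product: an (n_docs × n_features) document matrix X times a learned (n_features × k) transformation W gives the reduced dataset X' = X W. LifeRank optimises W for ranking quality; the sketch below instead builds W from the top-k principal directions, a generic stand-in that only illustrates the mechanics of the projection, not the paper's objective. The function name is hypothetical.

    ```python
    import numpy as np

    # Linear feature extraction as a matrix product: project each document
    # vector (row of X) onto k directions given by the columns of W.
    # Here W comes from an SVD of the centred data (a PCA-style stand-in
    # for the ranking-optimised W that LifeRank learns).

    def project(X, k):
        Xc = X - X.mean(axis=0)                          # centre each feature
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        W = vt[:k].T                                     # (n_features, k) transformation
        return Xc @ W, W                                 # reduced matrix and W itself
    ```
    
    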