
    LDE-Net: Lévy Induced Stochastic Differential Equation Equipped with Neural Network for Time Series Forecasting

    With the fast development of modern deep learning techniques, the study of dynamical systems and neural networks increasingly benefit each other in many ways. Since uncertainties often arise in real-world observations, stochastic differential equations (SDEs) play an important role in scientific modeling. To this end, we employ a collection of SDEs, with drift and diffusion terms approximated by neural networks, to predict the trend of chaotic time series that exhibit large jumps. Our contributions are, first, we propose LDE-Net, which explores compounded SDEs with α-stable Lévy motion to model complex time series data and solves the problem through neural network approximation. Second, we theoretically prove the convergence of our algorithm with respect to the hyper-parameters of the neural network, and obtain an error bound free of the curse of dimensionality. Finally, we illustrate our method by applying it to real time series data and find that accuracy increases through the use of non-Gaussian Lévy processes. We also present detailed comparisons in terms of data patterns, various models, different shapes of Lévy motion, and prediction lengths. (Comment: 18 pages, 38 figures)
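    To make the modeling idea concrete, the sketch below simulates an Euler-type step of an SDE whose drift and diffusion are small neural networks and whose noise increments come from a symmetric α-stable distribution. It is purely illustrative and is not the authors' LDE-Net implementation: the class name, layer sizes, step size, and α = 1.5 are our assumptions, and the training of the networks (the core of the method) is omitted.

        # Minimal sketch (not the authors' code): one Euler-type step of
        #   dX_t = f_theta(X_t) dt + g_theta(X_t) dL^alpha_t,
        # where f_theta, g_theta are neural networks and L^alpha is a
        # symmetric alpha-stable Levy motion.
        import torch
        import torch.nn as nn
        from scipy.stats import levy_stable

        class LevySDE(nn.Module):
            def __init__(self, dim=1, hidden=32, alpha=1.5):
                super().__init__()
                self.alpha = alpha  # stability index, 0 < alpha <= 2 (assumed value)
                self.drift = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
                self.diffusion = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

            def step(self, x, dt):
                # An increment of standard symmetric alpha-stable motion over dt
                # is distributed as dt**(1/alpha) times a standard alpha-stable variable.
                dL = dt ** (1.0 / self.alpha) * torch.as_tensor(
                    levy_stable.rvs(self.alpha, 0.0, size=x.shape), dtype=x.dtype)
                return x + self.drift(x) * dt + self.diffusion(x) * dL

        model = LevySDE()
        x = torch.zeros(16, 1)      # batch of 16 one-dimensional paths
        path = [x]
        for _ in range(100):        # forward simulation with untrained networks
            x = model.step(x, dt=0.01)
            path.append(x)

    For α = 2 the stable increments become Gaussian (up to a constant scale), so the step reduces to an ordinary Euler-Maruyama step for a Brownian-driven neural SDE; the non-Gaussian case 0 < α < 2 is what produces the heavy-tailed jumps the abstract refers to.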

    Some time stepping methods for fractional diffusion problems with nonsmooth data

    We consider error estimates for some time stepping methods for solving fractional diffusion problems with nonsmooth data in both homogeneous and inhomogeneous cases. McLean and Mustapha (Time-stepping error bounds for fractional diffusion problems with non-smooth initial data, Journal of Computational Physics, 293 (2015), 201-217) established an O(k) convergence rate for the piecewise constant discontinuous Galerkin method with nonsmooth initial data for the homogeneous problem when the linear operator A is assumed to be self-adjoint, positive semidefinite and densely defined in a suitable Hilbert space, where k denotes the time step size. In this paper, we approximate the Riemann-Liouville fractional derivative by Diethelm's method (the L1 scheme) and obtain the same time discretisation scheme as in McLean and Mustapha. We first prove that this scheme also has convergence rate O(k) with nonsmooth initial data for the homogeneous problem when A is a closed, densely defined linear operator satisfying certain resolvent estimates. We then introduce a new time discretisation scheme for the homogeneous problem based on convolution quadrature and prove that its convergence rate is O(k^{1+α}), 0 < α < 1, with nonsmooth initial data. Using this new scheme for the homogeneous problem, we define a time stepping method for the inhomogeneous problem and prove that its convergence rate is O(k^{1+α}), 0 < α < 1, with nonsmooth data. Numerical examples are given to show that the numerical results are consistent with the theoretical results.
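    For a concrete sense of the L1 discretisation mentioned above, the sketch below implements the standard L1 weights b_j = (j+1)^(1-α) - j^(1-α) on a uniform grid and checks them against the exact fractional derivative of t^2 (for which the Riemann-Liouville and Caputo derivatives coincide, since the function vanishes at 0). It illustrates the building block only, not the paper's full time-stepping scheme for the diffusion problem; the function name and test problem are ours.

        # Minimal sketch of the L1 (Diethelm-type) approximation of a fractional
        # derivative of order 0 < alpha < 1 on a uniform grid t_n = n*k.
        import math
        import numpy as np

        def l1_fractional_derivative(u, k, alpha):
            """Approximate D^alpha u at t_1, ..., t_N using the L1 weights
            b_j = (j+1)^(1-alpha) - j^(1-alpha)."""
            n_steps = len(u) - 1
            j = np.arange(n_steps)
            b = (j + 1) ** (1 - alpha) - j ** (1 - alpha)
            du = np.diff(u)                      # u(t_{m+1}) - u(t_m)
            out = np.empty(n_steps)
            for n in range(1, n_steps + 1):
                # sum_{j=0}^{n-1} b_j * (u(t_{n-j}) - u(t_{n-j-1}))
                out[n - 1] = np.dot(b[:n], du[:n][::-1])
            return out / (math.gamma(2 - alpha) * k ** alpha)

        # Check against the exact derivative: D^alpha t^2 = Gamma(3)/Gamma(3-alpha) * t^(2-alpha).
        alpha, k, N = 0.5, 1e-3, 1000
        t = k * np.arange(N + 1)
        approx = l1_fractional_derivative(t ** 2, k, alpha)
        exact = math.gamma(3) / math.gamma(3 - alpha) * t[1:] ** (2 - alpha)
        print(np.max(np.abs(approx - exact)))    # error shrinks as k -> 0

    Applying these weights to the homogeneous fractional diffusion problem yields the kind of time discretisation analysed in the abstract; the O(k) and O(k^{1+α}) rates quoted there concern that full scheme with nonsmooth data, which this snippet does not reproduce.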