
    Forecasting and Granger Modelling with Non-linear Dynamical Dependencies

    Traditional linear methods for forecasting multivariate time series cannot satisfactorily model the non-linear dependencies that may exist in non-Gaussian series. We build on the theory of learning vector-valued functions in a reproducing kernel Hilbert space and develop a method for learning prediction functions that accommodate such non-linearities. The method learns not only the predictive function but also the matrix-valued kernel underlying the function search space directly from the data. Our approach is based on learning multiple matrix-valued kernels, each composed of a set of input kernels and a set of output kernels learned in the cone of positive semi-definite matrices. In addition to superior predictive performance in the presence of strong non-linearities, our method also recovers the hidden dynamic relationships between the series and is thus a new alternative to existing graphical Granger techniques. Comment: Accepted for ECML-PKDD 201
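    The sketch below is not the authors' algorithm; it only illustrates the basic building block the abstract refers to, kernel-based regression for vector-valued outputs. It uses a separable matrix-valued kernel K(x, x') = k(x, x') · B with a scalar Gaussian input kernel k and B fixed to the identity, whereas the paper learns the matrix-valued kernel from data. All parameter names and values (gamma, lam, the toy series) are illustrative assumptions.

    ```python
    # Minimal sketch: one-step-ahead forecasting of a multivariate series by kernel
    # ridge regression with a separable operator-valued kernel k(x, x') * B, B = I.
    import numpy as np

    def gaussian_gram(X, gamma=1.0):
        """Scalar Gaussian kernel Gram matrix for the rows of X."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def fit_forecaster(series, lam=1e-2, gamma=1.0):
        """Fit f: y_t -> y_{t+1}; with B = I the vector-valued problem
        decouples into independent ridge problems, one per output series."""
        X, Y = series[:-1], series[1:]                 # inputs y_t, targets y_{t+1}
        K = gaussian_gram(X, gamma)
        n = K.shape[0]
        alpha = np.linalg.solve(K + lam * n * np.eye(n), Y)
        return X, alpha, gamma

    def predict_next(model, x_new):
        """Predict the next vector observation from the last observed one."""
        X, alpha, gamma = model
        d2 = np.sum((X - x_new)**2, axis=1)
        k_vec = np.exp(-gamma * d2)
        return k_vec @ alpha

    # Toy usage: a 2-dimensional series with a non-linear cross-dependency.
    rng = np.random.default_rng(0)
    t = np.arange(300)
    series = np.column_stack([np.sin(0.1 * t), np.sin(0.1 * t)**2])
    series += 0.05 * rng.standard_normal(series.shape)
    model = fit_forecaster(series)
    print(predict_next(model, series[-1]))
    ```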

    Sparse Learning for Variable Selection with Structures and Nonlinearities

    In this thesis we discuss machine learning methods that perform automated variable selection for learning sparse predictive models. There are multiple reasons for promoting sparsity in predictive models. By relying on a limited set of input variables, the models naturally counteract the overfitting problem ubiquitous in learning from finite sets of training points. Sparse models are also cheaper to use for predictions: they usually require fewer computational resources and, by relying on smaller sets of inputs, can reduce the costs of data collection and storage. Sparse models can further contribute to a better understanding of the investigated phenomena, as they are easier to interpret than full models. Comment: PhD thesis
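    As a minimal sketch of the idea (not the thesis' own algorithms, which also handle structure and non-linearities), an L1-penalized linear model drives most coefficients to exactly zero, so the fitted predictor relies on only a small subset of the inputs. The synthetic data and the penalty value below are illustrative assumptions.

    ```python
    # Sparse variable selection via the Lasso: only the inputs with non-zero
    # coefficients are retained by the fitted model.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p = 200, 50
    X = rng.standard_normal((n, p))
    true_coef = np.zeros(p)
    true_coef[[3, 17, 42]] = [2.0, -1.5, 1.0]      # only 3 of 50 inputs matter
    y = X @ true_coef + 0.1 * rng.standard_normal(n)

    model = Lasso(alpha=0.1).fit(X, y)
    selected = np.flatnonzero(model.coef_)          # indices of retained variables
    print("selected variables:", selected)
    ```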

    Limit Theory under Network Dependence and Nonstationarity

    These lecture notes are supplementary material for a short course on time series econometrics and network econometrics. We emphasize limit theory for time series regression models as well as the use of the local-to-unity parametrization when modeling time series nonstationarity. Moreover, we present various non-asymptotic results on moderate deviation principles for the eigenvalues of covariance matrices, as well as asymptotics for unit root moderate deviations in nonstationary autoregressive processes. Although not all applications from the literature are covered, we also discuss some open problems in the time series and network econometrics literature. Comment: arXiv admin note: text overlap with arXiv:1705.08413 by other authors
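    A minimal sketch of the local-to-unity parametrization mentioned in the abstract: the autoregressive root is modeled as ρ_T = 1 + c/T, so the process is nearly nonstationary and approaches a unit root as the sample size T grows. The sample size, the choice of c, and the OLS estimator below are illustrative assumptions, not results from the notes.

    ```python
    # Simulate a near-unit-root AR(1) under the local-to-unity parametrization
    # y_t = rho_T * y_{t-1} + eps_t with rho_T = 1 + c / T.
    import numpy as np

    def simulate_local_to_unity(T=500, c=-5.0, seed=0):
        rng = np.random.default_rng(seed)
        rho = 1.0 + c / T
        eps = rng.standard_normal(T)
        y = np.zeros(T)
        for t in range(1, T):
            y[t] = rho * y[t - 1] + eps[t]
        return y, rho

    y, rho = simulate_local_to_unity()
    # OLS estimate of the autoregressive coefficient from the simulated path
    rho_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
    print(f"true rho = {rho:.4f}, OLS estimate = {rho_hat:.4f}")
    ```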