1,338 research outputs found

    Power of tests for unit roots in the presence of a linear trend

    Dickey and Fuller (1981) suggested unit root tests for an autoregressive model with a linear trend and a fixed initial value. This model has nuisance parameters, so later authors have often worked with a slightly different model with a random initial value, in which the nuisance parameters can be eliminated by an invariant reduction of the model. This facilitates computation of envelope power functions and comparison of the relative performance of different unit root tests. It is shown here that invariance arguments can also be used when comparing power within the model with a fixed initial value. Despite the apparently small difference between the two models, the relative performance of unit root tests turns out to be very different.
    Keywords: envelope power function, maximal invariant parameter, maximal invariant statistic, most stringent test, unit root tests.
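    The testing problem described above can be sketched with a minimal Dickey-Fuller regression including an intercept and a linear trend. This is only an illustrative simulation, not the paper's power computations; the sample size, seed, and data-generating process are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
y = np.cumsum(rng.standard_normal(T))  # random walk: the null of a unit root holds

# Dickey-Fuller regression with constant and linear trend:
#   dy_t = mu + beta*t + (rho - 1)*y_{t-1} + eps_t
dy = np.diff(y)
X = np.column_stack([np.ones(T - 1), np.arange(1, T), y[:-1]])
coef, *_ = np.linalg.lstsq(X, dy, rcond=None)

# t-statistic on (rho - 1); under the unit root null it follows the
# Dickey-Fuller "trend" distribution, not the normal distribution
resid = dy - X @ coef
s2 = resid @ resid / (len(dy) - X.shape[1])
se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
t_stat = coef[2] / se
```

    The statistic would be compared with Dickey-Fuller critical values rather than normal ones; the paper's point concerns how the power of such tests depends on the treatment of the initial value.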

    Strong consistency results for least squares estimators in general vector autoregressions with deterministic terms

    A vector autoregression with deterministic terms and with no restrictions on its characteristic roots is considered. Strong consistency results for the least squares statistics are presented. This extends earlier results in which deterministic terms were not considered. In addition, the convergence rates are improved compared with earlier results.
    Keywords: least squares estimator, strong consistency, vector autoregression.

    Asymptotic properties of least squares statistics in general vector autoregressive models

    A vector autoregression with deterministic terms and with no restrictions on its characteristic roots is considered. Strong consistency results, as well as some weak convergence results, are given for a number of least squares statistics. These statistics relate to the denominator matrix of the least squares estimator as well as to the least squares estimator itself. Applications of these results to the statistical analysis of non-stationary economic time series are briefly discussed.
    Keywords: asymptotic normality, cointegration, least squares, martingales, sample correlations, strong consistency, vector autoregressive model, weak consistency.
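    The least squares statistics considered here can be illustrated for a bivariate VAR(1) with a constant as the deterministic term. This is a minimal sketch under assumptions not taken from the paper: a stationary coefficient matrix, Gaussian errors, and an arbitrary seed.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[0.5, 0.1],
              [0.0, 0.9]])  # assumed coefficient matrix (stationary case)
T = 500
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.standard_normal(2)

# least squares: regress x_t on a constant and x_{t-1}
Y = x[1:]
Z = np.column_stack([np.ones(T - 1), x[:-1]])  # Z'Z is the "denominator matrix"
B = np.linalg.lstsq(Z, Y, rcond=None)[0]
A_hat = B[1:].T  # estimated autoregressive coefficient matrix
```

    The consistency results in the paper cover this estimator without the stationarity assumption made in this toy example, i.e. also for unit roots and explosive roots.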

    Analysis of co-explosive processes

    A vector autoregressive model allowing for unit roots as well as explosive characteristic roots is developed. The Granger-Johansen representation shows that this results in processes with two common features: a random walk and an explosively growing process. Cointegrating and co-explosive vectors can be found that eliminate these common factors. Likelihood ratio tests for linear restrictions on the co-explosive vectors are derived. As an empirical illustration, the method is applied to data from the extreme Yugoslavian hyper-inflation of the 1990s.
    Keywords: asymptotic normality, co-explosiveness, cointegration, explosive processes, hyper-inflation, likelihood ratio tests, vector autoregression.
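    The two common features in the Granger-Johansen representation can be sketched in a small simulation: one explosive factor and one random-walk factor drive a bivariate system, and a co-explosive vector removes the explosive one. The root 1.1, the loadings, and the co-explosive vector (2, -1) below are illustrative assumptions, not the paper's empirical values.

```python
import numpy as np

rng = np.random.default_rng(5)
T = 100
eps = rng.standard_normal((T, 3))

e = np.zeros(T)  # explosive common factor, characteristic root 1.1
w = np.zeros(T)  # random-walk common factor, unit root
for t in range(1, T):
    e[t] = 1.1 * e[t - 1] + eps[t, 0]
    w[t] = w[t - 1] + eps[t, 1]

x1 = e + w                         # both series load on the explosive factor
x2 = 2 * e + np.cumsum(eps[:, 2])

# the co-explosive vector (2, -1) eliminates the explosive factor:
z = 2 * x1 - x2                    # leaves only random-walk components
```

    The combination z no longer grows explosively; in the paper the analogous vectors are estimated and tested by likelihood ratio methods rather than assumed known.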

    Cointegration Analysis in the Presence of Structural Breaks in the Deterministic Trend

    When analysing macroeconomic data it is often relevant to allow for structural breaks in the statistical analysis. In particular, cointegration analysis in the presence of structural breaks may be of interest. To do this, a vector autoregressive model with known break points is proposed. Within this model it is possible to test the cointegration rank, restrictions on the cointegrating vectors, and restrictions on, for instance, the slopes of the broken linear trend.
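    A small simulation can illustrate the kind of restriction mentioned above: two series share a stochastic trend, and the deterministic part of their cointegrating relation is a linear trend whose slope changes at a known break point. The break date, slopes, and loadings below are invented for illustration and are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(6)
T, Tb = 200, 100                       # sample size and known break point
t = np.arange(T)

w = np.cumsum(rng.standard_normal(T))  # common stochastic trend
d = 0.1 * t + 0.2 * np.maximum(t - Tb, 0)  # linear trend whose slope breaks at Tb

x1 = w + rng.standard_normal(T)
x2 = 2 * w + d + rng.standard_normal(T)

# static regression including the broken-trend regressor; its coefficient
# picks up the slope change, the x1 coefficient the cointegration relation
X = np.column_stack([np.ones(T), x1, t, np.maximum(t - Tb, 0)])
b = np.linalg.lstsq(X, x2, rcond=None)[0]
# b[1] estimates the cointegration coefficient (2), b[3] the slope change (0.2)
```

    In the paper such restrictions are formulated and tested within a likelihood-based vector autoregressive framework rather than by a static regression.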

    Order determination in general vector autoregressions

    In applications of autoregressive models the order of the model is often estimated using either a sequence of likelihood ratio tests or a likelihood-based information criterion. The consistency of such procedures has been discussed extensively under the assumption that the characteristic roots of the autoregression lie inside the unit circle, so that the process is stationary. It is shown that these methods can be used regardless of any assumption on the characteristic roots.
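    The information-criterion route can be sketched as follows: fit AR(p) by least squares for p = 0, ..., pmax on a common sample and pick the order minimizing BIC. The example below uses a stationary AR(2) with arbitrary coefficients and seed; the point of the paper is that the same procedure remains valid without the stationarity assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 400
y = np.zeros(T)
for t in range(2, T):  # true model: AR(2)
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

def bic(y, p, pmax):
    """BIC of an AR(p) fitted by least squares on the common sample t = pmax,...,T-1."""
    Y = y[pmax:]
    n = len(Y)
    if p == 0:
        resid = Y
    else:
        X = np.column_stack([y[pmax - j:len(y) - j] for j in range(1, p + 1)])
        resid = Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]
    return n * np.log(resid @ resid / n) + p * np.log(n)

pmax = 6
p_hat = int(np.argmin([bic(y, p, pmax) for p in range(pmax + 1)]))
```

    Using a common sample t = pmax, ..., T-1 for all candidate orders keeps the criteria comparable across p.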

    Properties of estimated characteristic roots

    Estimated characteristic roots in stationary autoregressions are shown to give rather noisy information about their population equivalents. This is remarkable given the central role of the characteristic roots in the theory of autoregressive processes. In the asymptotic analysis, problems appear when multiple roots are present, as this implies a non-differentiability: the δ-method does not apply, convergence rates are slow, and the asymptotic distribution is non-normal. In finite samples this has a considerable influence on the distribution unless the roots are far apart. With increasing order of the autoregression it becomes increasingly difficult to place the roots far apart, giving a very noisy signal from the characteristic roots.
    Keywords: autoregression; characteristic root.
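    The noise from multiple roots can be seen in a small Monte Carlo sketch: an AR(2) with a double characteristic root at 0.7 is simulated repeatedly, and the estimated roots scatter widely, frequently into the complex plane. Sample size, number of replications, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
T, reps = 200, 200
roots = []
for _ in range(reps):
    y = np.zeros(T)
    for t in range(2, T):  # (1 - 0.7L)^2: double characteristic root at 0.7
        y[t] = 1.4 * y[t - 1] - 0.49 * y[t - 2] + rng.standard_normal()
    X = np.column_stack([y[1:-1], y[:-2]])
    b = np.linalg.lstsq(X, y[2:], rcond=None)[0]
    roots.append(np.roots([1, -b[0], -b[1]]))  # roots of z^2 - b1*z - b2

roots = np.concatenate(roots)
share_complex = np.mean(np.abs(roots.imag) > 1e-8)  # complex pairs are common
spread = np.abs(roots - 0.7)                        # distance from the true root
```

    The square-root map from coefficients to a double root is what slows the convergence rate and produces the complex pairs, matching the non-differentiability discussed in the abstract.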

    Discussion of The Forward Search: Theory and Data Analysis by Anthony C. Atkinson, Marco Riani, and Andrea Cerioli

    The Forward Search Algorithm is a statistical algorithm for obtaining robust estimators of regression coefficients in the presence of outliers. The algorithm selects a succession of subsets of observations from which the parameters are estimated. The present note shows how the theory of empirical processes can contribute to the understanding of how the subsets are chosen and how the sequence of estimators changes.
    Keywords: empirical processes; Huber's skip; least trimmed squares estimator; one-step estimator; outlier robustness.
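    A much simplified version of the search can be sketched for a linear regression with a few outliers: start from a small subset, fit by least squares, and repeatedly grow the subset with the observations having the smallest squared residuals. The starting rule here is a crude stand-in for the robust (e.g. least trimmed squares) start of the actual algorithm, and the data and seed are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + 0.3 * rng.standard_normal(n)
y[:5] += 8.0  # contaminate five observations with outliers

X = np.column_stack([np.ones(n), x])

def ls(idx):
    """Least squares fit on the subset of observations idx."""
    return np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]

m0 = 10
# crude start: the m0 points best fitted by a full-sample fit
idx = np.argsort((y - X @ ls(np.arange(n))) ** 2)[:m0]

slopes = []  # monitor the slope estimate along the search
for m in range(m0, n):
    beta = ls(idx)
    slopes.append(beta[1])
    idx = np.argsort((y - X @ beta) ** 2)[:m + 1]  # grow the subset by one
```

    Monitoring the estimates along the search reveals the step at which the outliers enter the subset; the empirical-process arguments in the note concern exactly this sequence of subset choices and estimators.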