
    Pitfalls in testing for long run relationships

    This paper analyzes the robustness of the two most commonly used cointegration tests: the single-equation-based test of Engle and Granger (EG) and the system-based test of Johansen. We show analytically and numerically several important situations where the Johansen LR tests tend to find spurious cointegration with probability approaching one asymptotically. The situations investigated are of two types. The first corresponds to variables that have long-memory properties and a trending behavior but are not pure I(1) processes, although they are difficult to distinguish from I(1) with standard unit root tests. The second corresponds to I(1) variables whose VAR representation has a singular or near-singular error covariance matrix. In most of the situations investigated in this paper, the EG test is more robust than the Johansen LR tests. This paper shows that a proper use of the LR test in applied cointegration analysis requires a deeper data analysis than standard unit root testing. We conclude by recommending the use of both tests (EG and Johansen) to test for cointegration, in order to avoid or to discover a pitfall.
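    As a hedged illustration of the closing recommendation (run both tests rather than relying on one), the sketch below applies the Engle-Granger and Johansen tests to the same pair of series with statsmodels; the simulated series, lag order and deterministic-term settings are placeholder assumptions, not the paper's experimental design:

        import numpy as np
        from statsmodels.tsa.stattools import coint
        from statsmodels.tsa.vector_ar.vecm import coint_johansen

        # Placeholder data: a random walk and a noisy linear combination of it,
        # so the pair is cointegrated by construction.
        rng = np.random.default_rng(0)
        x = np.cumsum(rng.standard_normal(500))
        y = x + rng.standard_normal(500)

        # Engle-Granger: single-equation, residual-based test.
        eg_stat, eg_pvalue, _ = coint(y, x, trend="c")

        # Johansen: system-based LR (trace) test on the joint VAR.
        joh = coint_johansen(np.column_stack([y, x]), det_order=0, k_ar_diff=1)

        print("EG statistic / p-value:", eg_stat, eg_pvalue)
        print("Johansen trace statistics:", joh.lr1)  # compare against joh.cvt critical values

    Disagreement between the two outcomes is exactly the kind of warning sign the abstract points to.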

    An analytic comparison of regularization methods for Gaussian Processes

    Gaussian Processes (GPs) are a popular approach to predict the output of a parameterized experiment. They have many applications in the field of Computer Experiments, in particular to perform sensitivity analysis, adaptive design of experiments and global optimization. Nearly all applications of GPs require the inversion of a covariance matrix that, in practice, is often ill-conditioned. Regularization methodologies are then employed, with consequences for the GPs that need to be better understood. The two principal methods to deal with ill-conditioned covariance matrices are i) pseudoinverse (PI) and ii) adding a positive constant to the diagonal (the so-called nugget regularization). The first part of this paper provides an algebraic comparison of PI and nugget regularizations. Redundant points, responsible for covariance matrix singularity, are defined. It is proven that pseudoinverse regularization, unlike nugget regularization, averages the output values and makes the variance zero at redundant points. However, pseudoinverse and nugget regularizations become equivalent as the nugget value vanishes. A measure of data-model discrepancy is proposed which serves to choose a regularization technique. In the second part of the paper, a distribution-wise GP is introduced that interpolates Gaussian distributions instead of data points. The distribution-wise GP can be seen as an improved regularization method for GPs.
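    A minimal sketch, assuming an RBF kernel, a duplicated training input and an arbitrary nugget value (none of which come from the paper), of the two regularization routes compared in the first part of the abstract:

        import numpy as np

        def rbf_kernel(X, Y, lengthscale=1.0):
            # Squared-exponential kernel; the length-scale is an arbitrary choice here.
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-0.5 * d2 / lengthscale**2)

        # Training inputs with a duplicated point -> singular covariance matrix.
        X = np.array([[0.0], [0.5], [0.5], [1.0]])
        y = np.array([0.0, 1.0, 1.2, 0.5])  # conflicting outputs at the redundant point
        K = rbf_kernel(X, X)

        # i) Pseudoinverse regularization: Moore-Penrose inverse of the singular K.
        alpha_pi = np.linalg.pinv(K) @ y

        # ii) Nugget regularization: add a small positive constant to the diagonal.
        nugget = 1e-6
        alpha_nugget = np.linalg.solve(K + nugget * np.eye(len(X)), y)

        # Posterior mean at the redundant location (x = 0.5) under each regularization.
        k_star = rbf_kernel(np.array([[0.5]]), X)
        print("PI mean at redundant point:    ", (k_star @ alpha_pi).item())      # near the average of 1.0 and 1.2
        print("Nugget mean at redundant point:", (k_star @ alpha_nugget).item())

    The pseudoinverse prediction at the duplicated input lands near the average of the two conflicting outputs, in line with the averaging property described in the abstract.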

    Principal component analysis and perturbation theory–based robust damage detection of multifunctional aircraft structure

    A fundamental problem in structural damage detection is to define an efficient feature from which to calculate a damage index. Furthermore, due to perturbations from various sources, we also need to define a rigorous threshold whose exceedance indicates the presence of damage. In this article, we develop a robust damage detection methodology based on principal component analysis. We first present an original damage index based on the projection of the separation matrix, and then we derive a novel adaptive threshold that does not rely on statistical assumptions. This threshold is analytic and is based on matrix perturbation theory. The efficiency of the method is illustrated using simulations of a composite smart structure and experimental results obtained on a conformal load-bearing antenna structure laboratory test.
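    As a hedged illustration of the general PCA-based detection workflow (not the paper's specific separation-matrix index or its perturbation-theory threshold), the sketch below fits a principal subspace on healthy-state data and flags new measurements whose residual energy exceeds an assumed baseline-derived threshold:

        import numpy as np

        def fit_pca_subspace(X_healthy, n_components):
            # Principal subspace of centered healthy-state measurements (rows = samples).
            mean = X_healthy.mean(axis=0)
            _, _, Vt = np.linalg.svd(X_healthy - mean, full_matrices=False)
            return mean, Vt[:n_components].T  # columns span the retained subspace

        def damage_index(x, mean, P):
            # Residual energy outside the principal subspace (a generic index, not the paper's).
            r = (x - mean) - P @ (P.T @ (x - mean))
            return float(r @ r)

        # Placeholder data standing in for sensor features of the monitored structure.
        rng = np.random.default_rng(1)
        X_healthy = rng.standard_normal((200, 8)) @ rng.standard_normal((8, 8))
        mean, P = fit_pca_subspace(X_healthy, n_components=3)

        # Assumed empirical threshold from baseline indices; the paper instead derives
        # an analytic, adaptive threshold via matrix perturbation theory.
        baseline = [damage_index(x, mean, P) for x in X_healthy]
        threshold = np.percentile(baseline, 99)

        x_new = rng.standard_normal(8)
        print("damage flagged:", damage_index(x_new, mean, P) > threshold)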

    On some limitations of probabilistic models for dimension-reduction: illustration in the case of one particular probabilistic formulation of PLS

    Partial Least Squares (PLS) refers to a class of dimension-reduction techniques aiming at the identification of two sets of components with maximal covariance, in order to model the relationship between two sets of observed variables $x \in \mathbb{R}^p$ and $y \in \mathbb{R}^q$, with $p \geq 1$, $q \geq 1$. El Bouhaddani et al. (2017) have recently proposed a probabilistic formulation of PLS. Under the constraints they consider for the parameters of their model, the latter can be seen as a probabilistic formulation of one version of PLS, namely PLS-SVD. However, we establish that these constraints are too restrictive, as they define a very particular subset of distributions for $(x, y)$ under which, roughly speaking, the components with maximal covariance (solutions of PLS-SVD) are also necessarily of respective maximal variances (solutions of the principal component analyses of $x$ and $y$, respectively). We then propose a simple extension of El Bouhaddani et al.'s model, which corresponds to a more general probabilistic formulation of PLS-SVD and which is no longer restricted to these particular distributions. We present numerical examples to illustrate the limitations of the original model of El Bouhaddani et al. (2017).
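    As a hedged numerical sketch of the non-probabilistic PLS-SVD step referred to above (not El Bouhaddani et al.'s probabilistic model or the authors' extension), the snippet below extracts paired directions of maximal covariance from the SVD of the empirical cross-covariance matrix; the simulated data and component count are arbitrary assumptions:

        import numpy as np

        def pls_svd(X, Y, n_components):
            # PLS-SVD: paired weight vectors maximizing covariance between projections of X and Y.
            Xc = X - X.mean(axis=0)
            Yc = Y - Y.mean(axis=0)
            C = Xc.T @ Yc / (len(X) - 1)  # empirical cross-covariance (p x q)
            U, s, Vt = np.linalg.svd(C)
            return U[:, :n_components], Vt[:n_components].T, s[:n_components]

        # Placeholder data for x in R^p and y in R^q.
        rng = np.random.default_rng(2)
        X = rng.standard_normal((300, 5))
        Y = X[:, :3] @ rng.standard_normal((3, 4)) + 0.1 * rng.standard_normal((300, 4))

        Wx, Wy, sv = pls_svd(X, Y, n_components=2)
        # Covariance between the first pair of scores equals the leading singular value of C.
        print(np.cov((X - X.mean(0)) @ Wx[:, 0], (Y - Y.mean(0)) @ Wy[:, 0])[0, 1], sv[0])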