
    The MVGC multivariate Granger causality toolbox: a new approach to Granger-causal inference

    Background: Wiener-Granger causality (“G-causality”) is a statistical notion of causality applicable to time series data, whereby cause precedes, and helps predict, effect. It is defined in both time and frequency domains, and allows for the conditioning out of common causal influences. Originally developed in the context of econometric theory, it has since achieved broad application in the neurosciences and beyond. Prediction in the G-causality formalism is based on VAR (Vector AutoRegressive) modelling.
    New Method: The MVGC Matlab Toolbox approach to G-causal inference is based on multiple equivalent representations of a VAR model by (i) regression parameters, (ii) the autocovariance sequence and (iii) the cross-power spectral density of the underlying process. It features a variety of algorithms for moving between these representations, enabling selection of the most suitable algorithms with regard to computational efficiency and numerical accuracy.
    Results: In this paper we explain the theoretical basis, computational strategy and application to empirical G-causal inference of the MVGC Toolbox. We also show via numerical simulations the advantages of our Toolbox over previous methods in terms of computational accuracy and statistical inference.
    Comparison with Existing Method(s): The standard method of computing G-causality involves estimation of parameters for both a full and a nested (reduced) VAR model. The MVGC approach, by contrast, avoids explicit estimation of the reduced model, thus eliminating a source of estimation error and improving statistical power; in addition, it facilitates fast and accurate estimation of the computationally awkward case of conditional G-causality in the frequency domain.
    Conclusions: The MVGC Toolbox implements a flexible, powerful and efficient approach to G-causal inference.
    Keywords: Granger causality, vector autoregressive modelling, time series analysis
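    For orientation, here is a minimal NumPy sketch (not the MVGC Toolbox itself, which is a Matlab package) of the standard dual-regression estimate that the abstract uses as its point of comparison: the time-domain G-causality from y to x is the log ratio of the residual variances of the reduced and full VAR regressions.

        import numpy as np

        def gc_time_domain(x, y, p):
            """Standard two-regression Granger causality from y to x, model order p.

            Returns F = ln(var_reduced / var_full); the MVGC approach avoids the
            explicit reduced regression that this sketch performs.
            """
            n = len(x)
            # Lagged regressors for t = p .. n-1 (lag k+1 of each series)
            X_lags = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
            Y_lags = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
            target = x[p:]

            # Full model: predict x_t from the past of both x and y
            A_full = np.column_stack([X_lags, Y_lags])
            res_full = target - A_full @ np.linalg.lstsq(A_full, target, rcond=None)[0]

            # Reduced model: predict x_t from its own past only
            res_red = target - X_lags @ np.linalg.lstsq(X_lags, target, rcond=None)[0]

            return np.log(res_red.var() / res_full.var())

        # Toy check: x is driven by lagged y, but not vice versa
        rng = np.random.default_rng(0)
        n = 2000
        y = rng.standard_normal(n)
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.standard_normal()
        print(gc_time_domain(x, y, p=2))   # clearly positive: y G-causes x
        print(gc_time_domain(y, x, p=2))   # close to zero: x does not G-cause y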

    Representation in Econometrics: A Historical Perspective

    Measurement forms the substance of econometrics. This chapter outlines the history of econometrics from a measurement perspective: how have measurement errors been dealt with, and how, from a methodological standpoint, did econometrics evolve so as to represent theory more adequately in relation to data? The evolution is organized in terms of four phases: 'theory and measurement', 'measurement and theory', 'measurement with theory' and 'measurement without theory'. The question of how measurement research has helped in the advancement of knowledge is discussed in the light of this history.
    Keywords: Econometrics, History, Measurement error

    Bayesian interpretation of periodograms

    The usual nonparametric approach to spectral analysis is revisited within the regularization framework. Both the usual and windowed periodograms are obtained as the squared modulus of the minimizer of a regularized least squares criterion. Then, particular attention is paid to their interpretation within the Bayesian statistical framework. Finally, the question of unsupervised hyperparameter and window selection is addressed. It is shown that the maximum likelihood solution is both formally achievable and practically useful.
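    As a point of reference, the objects discussed in this abstract can be written down in a few lines. The sketch below uses the standard definition of the (windowed) periodogram as the squared modulus of the windowed DFT; it is not the paper's regularized least squares derivation, only the quantity that derivation reinterprets.

        import numpy as np

        def periodogram(x, window=None):
            """Ordinary (window=None) or windowed periodogram of a real signal x:
            P(f_k) = |DFT(w * x)(k)|^2 / sum(w^2)."""
            x = np.asarray(x, dtype=float)
            w = np.ones_like(x) if window is None else np.asarray(window, dtype=float)
            X = np.fft.rfft(w * x)
            return np.abs(X) ** 2 / np.sum(w ** 2)

        # Example: sinusoid in noise, with and without a Hann window
        rng = np.random.default_rng(0)
        n = 1024
        t = np.arange(n)
        x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(n)
        P_plain = periodogram(x)
        P_hann = periodogram(x, np.hanning(n))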

    Regularization and Bayesian Learning in Dynamical Systems: Past, Present and Future

    Regularization and Bayesian methods for system identification have been repopularized in recent years, and have proved to be competitive with classical parametric approaches. In this paper we attempt to illustrate how the use of regularization in system identification has evolved over the years, starting from the early contributions in the Automatic Control, Econometrics and Statistics literature. In particular we discuss some fundamental issues, such as compound estimation problems and exchangeability, which play an important role in regularization and Bayesian approaches, as also illustrated in early publications in Statistics. The historical and foundational issues are given more emphasis (and space), at the expense of the more recent developments, which are only briefly discussed. The main reason for this choice is that, while the recent literature is readily available and surveys have already been published on the subject, in the author's opinion a clear link with past work had not been completely clarified.
    Comment: Plenary presentation at IFAC SYSID 2015. Submitted to Annual Reviews in Control.
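    As a concrete, deliberately minimal instance of the regularized estimators this survey discusses, the sketch below fits an FIR impulse response by ridge-regularized least squares, which is also the posterior mean under an i.i.d. Gaussian prior. The FIR model and identity penalty are illustrative assumptions, not taken from the paper; the kernel-based methods in the literature replace the identity with a structured prior covariance.

        import numpy as np

        def fir_ridge(u, y, n_taps, lam):
            """Ridge-regularized FIR estimate: argmin_g ||y - Phi g||^2 + lam ||g||^2,
            i.e. the MAP / posterior-mean estimate under an i.i.d. Gaussian prior on g."""
            N = len(y)
            # Column k holds the input delayed by k samples (zero initial conditions)
            Phi = np.column_stack(
                [np.concatenate([np.zeros(k), u[: N - k]]) for k in range(n_taps)]
            )
            return np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_taps), Phi.T @ y)

        # Toy example: identify a short exponentially decaying impulse response
        rng = np.random.default_rng(0)
        g_true = 0.8 ** np.arange(10)
        u = rng.standard_normal(500)
        y = np.convolve(u, g_true)[: len(u)] + 0.1 * rng.standard_normal(len(u))
        g_hat = fir_ridge(u, y, n_taps=10, lam=1.0)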

    Advanced methods in automatic modulation classification for emerging technologies

    Modulation classification (MC) is of great importance in both military and commercial communication applications. It is a challenging problem, especially in non-cooperative wireless environments, where channel fading and the lack of prior knowledge about the incoming signal are major factors that degrade reception performance. Although the average likelihood ratio test method can provide an optimal solution to the MC problem with unknown parameters, it suffers from high computational complexity and, in some cases, mathematical intractability. Instead, in this research, an array-based quasi-hybrid likelihood ratio test (qHLRT) algorithm is proposed, which has two major advantages. First, it provides simple yet sufficiently accurate parameter estimation with reduced complexity. Second, the incorporation of antenna arrays offers an effective ability to combat fading.
    Furthermore, a practical array-based qHLRT classifier scheme is implemented, which applies maximal ratio combining (MRC) to increase the accuracy of both carrier frequency offset (CFO) estimation and likelihood function calculation under channel fading. Two CFO estimations are executed in this classifier. In the first, the unknown CFO, phase offsets and amplitudes are estimated as prerequisites for the MRC operation. MRC is then performed using these estimates, followed by a second CFO estimator. Since the input of the second CFO estimator is the output of the MRC, fading effects on the incoming signals are largely removed and the signal-to-noise ratio (SNR) is increased. As a result, a more accurate CFO estimate is obtained and the overall classification performance is improved, especially in low-SNR environments.
    Recently, many state-of-the-art communication technologies, such as orthogonal frequency division multiplexing (OFDM) modulations, have been emerging. The need to distinguish OFDM signals from single-carrier ones has become obvious, and some vital parameters of OFDM signals should be extracted for further processing. In comparison with the research on MC for single-carrier, single-antenna transmission, much less attention has been paid to MC for these emerging modulation methods. A comprehensive classification system is proposed for recognizing the OFDM signal and extracting its parameters. An automatic OFDM modulation classifier is proposed, based on a goodness-of-fit test: since the OFDM signal is Gaussian, the Cramér-von Mises technique, working on the empirical distribution function, is applied to test for normality. Numerical results show that this approach can successfully identify OFDM signals from single-carrier modulations over a wide SNR range, and that the proposed scheme provides acceptable performance when frequency-selective fading is present. A correlation test is then applied to estimate the OFDM cyclic prefix duration. A two-phase searching scheme, based on the Fast Fourier Transform (FFT) as well as the Gaussianity test, is devised to detect the number of subcarriers: a coarse search is carried out iteratively in the first phase, and the exact number of subcarriers is determined by a fine-tuning search in the second phase. Both analytical work and numerical results are presented to verify the efficiency of the proposed scheme.
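    The Gaussianity-based discrimination step can be illustrated with a short, self-contained sketch. This is a toy illustration under simplifying assumptions, not the thesis's classifier: an OFDM baseband signal, being a superposition of many independent subcarriers, is approximately Gaussian, while a single-carrier QPSK sequence takes only a few discrete values, and SciPy's Cramér-von Mises test separates the two.

        import numpy as np
        from scipy.stats import cramervonmises

        rng = np.random.default_rng(1)

        def qpsk(n):
            """Unit-energy QPSK symbols."""
            return (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)

        n_sym = 4096
        # OFDM baseband signal: IFFT of i.i.d. symbols -> approximately Gaussian samples
        ofdm = np.fft.ifft(qpsk(n_sym)) * np.sqrt(n_sym)
        # Single-carrier QPSK: samples concentrate on a small discrete alphabet
        single = qpsk(n_sym)

        for name, sig in [("OFDM", ofdm), ("single-carrier", single)]:
            s = sig.real
            # Standardise, then test against N(0, 1); p-values are only indicative
            # because the mean and variance are estimated from the same data.
            s = (s - s.mean()) / s.std()
            res = cramervonmises(s, "norm")
            print(name, res.statistic, res.pvalue)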