
    Kernel-based system identification from noisy and incomplete input-output data

    In this contribution, we propose a kernel-based method for the identification of linear systems from noisy and incomplete input-output datasets. We model the impulse response of the system as a Gaussian process whose covariance matrix is given by the recently introduced stable spline kernel. We adopt an empirical Bayes approach to estimate the posterior distribution of the impulse response given the data. The noiseless and missing data samples, together with the kernel hyperparameters, are estimated maximizing the joint marginal likelihood of the input and output measurements. To compute the marginal-likelihood maximizer, we build a solution scheme based on the Expectation-Maximization method. Simulations on a benchmark dataset show the effectiveness of the method.
    Comment: 16 pages, submitted to IEEE Conference on Decision and Control 201
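The stable spline prior mentioned in the abstract can be sketched numerically. The following is a minimal illustration of the first-order stable spline (TC) kernel, K(s, t) = alpha^max(s, t); the function name, discrete indexing, and hyperparameter value are assumptions for illustration, not taken from the paper's code.

```python
import numpy as np

def stable_spline_kernel(n, alpha):
    """First-order stable spline (TC) kernel on n impulse-response taps:
    K[s, t] = alpha**max(s, t), which enforces exponential decay (BIBO stability)."""
    idx = np.arange(1, n + 1)
    return alpha ** np.maximum.outer(idx, idx)

# Gram matrix used as the Gaussian-process prior covariance of the impulse response
K = stable_spline_kernel(50, 0.8)
```

In the empirical Bayes scheme, alpha would itself be tuned by maximizing the marginal likelihood rather than fixed as here.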

    Three Dimensional Pseudo-Spectral Compressible Magnetohydrodynamic GPU Code for Astrophysical Plasma Simulation

    This paper presents benchmarking and scaling studies of a GPU-accelerated three-dimensional compressible magnetohydrodynamic code. The code is developed with a view to explaining large- and intermediate-scale magnetic field generation in the cosmos as well as in nuclear fusion reactors, in light of the theory given by Eugene Newman Parker. The spatial derivatives are computed with a pseudo-spectral method and the time solvers are explicit. GPU acceleration is achieved with minimal code changes through OpenACC parallelization and use of the NVIDIA CUDA Fast Fourier Transform library (cuFFT). NVIDIA's unified memory is leveraged to enable over-subscription of the GPU device memory for seamless out-of-core processing of large grids. Our experimental results indicate that the GPU-accelerated code achieves up to two orders of magnitude speedup over a corresponding OpenMP-parallel, FFTW-based code on an NVIDIA Tesla P100 GPU. For large grids that require out-of-core processing on the GPU, we see a 7x speedup over the OpenMP, FFTW-based code on the Tesla P100 GPU. We also present performance analysis of the GPU-accelerated code on different GPU architectures: Kepler, Pascal and Volta.
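The pseudo-spectral evaluation of spatial derivatives that underlies such codes can be illustrated in a few lines. This sketch differentiates a single periodic 1-D field with NumPy's FFT; the function name and grid are illustrative assumptions, whereas the actual code operates on 3-D grids with cuFFT.

```python
import numpy as np

def spectral_derivative(f, L):
    """Derivative of a periodic field sampled on [0, L):
    transform, multiply by i*k in Fourier space, transform back."""
    n = f.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(f)))

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
df = spectral_derivative(np.sin(x), 2 * np.pi)   # should approximate cos(x)
```

For smooth periodic fields this is spectrally accurate, which is why FFT throughput (cuFFT vs. FFTW) dominates the runtime of such solvers.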

    VAR Modelling Approach and Cowles Commission Heritage

    This paper examines the rise of the VAR approach from a historical perspective. It shows that the VAR approach arose as a systematic solution to the issue of 'model choice' bypassed by Cowles Commission (CC) researchers, and that the approach essentially inherits and enhances the CC legacy rather than abandoning or opposing it. It argues that the approach is not as atheoretical as widely believed and that it helps reform econometrics by shifting the research focus from measurement of given theories to identification/verification of data-coherent theories, and hence from confirmatory analysis to a mixture of confirmatory and exploratory analysis.
    VAR, Macroeconometrics, Methodology, Rational expectations, Structural model
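For readers unfamiliar with the mechanics behind the methodology discussed, a VAR(1) can be estimated equation by equation with ordinary least squares. This is a minimal sketch under an assumed zero-mean bivariate system; the function name and simulated coefficients are illustrative only.

```python
import numpy as np

def fit_var1(Y):
    """OLS estimate of A in the VAR(1) model Y_t = A @ Y_{t-1} + e_t (zero mean)."""
    X, Z = Y[1:], Y[:-1]                      # regress each Y_t on Y_{t-1}
    B, *_ = np.linalg.lstsq(Z, X, rcond=None)  # solves Z @ B ≈ X, so B = A.T
    return B.T

# simulate a stable bivariate VAR(1) and recover its coefficient matrix
rng = np.random.default_rng(1)
A_true = np.array([[0.5, 0.1], [0.0, 0.3]])
Y = np.zeros((2000, 2))
for t in range(1, 2000):
    Y[t] = A_true @ Y[t - 1] + 0.1 * rng.normal(size=2)
A_hat = fit_var1(Y)
```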

    Econometric Studies of Business Cycles in the History of Econometrics

    This study examines the evolution of econometric research in business cycle analysis during the 1960-90 period. It shows how the research was dominated by an assimilation of the tradition of NBER business cycle analysis by the Haavelmo-Cowles Commission approach, catalysed by time-series statistical methods. Methodological consequences of the assimilation are critically evaluated in light of the meagre achievement of the research in predicting the current global recession.
    Business cycles, NBER, Forecasting

    Representation in Econometrics: A Historical Perspective

    Measurement forms the substance of econometrics. This chapter outlines the history of econometrics from a measurement perspective - how have measurement errors been dealt with and how, from a methodological standpoint, did econometrics evolve so as to represent theory more adequately in relation to data? The evolution is organized in terms of four phases: 'theory and measurement', 'measurement and theory', 'measurement with theory' and 'measurement without theory'. The question of how measurement research has helped in the advancement of knowledge is discussed in the light of this history.
    Econometrics, History, Measurement error

    System Identification Based on Errors-In-Variables System Models

    We study the identification problem for errors-in-variables (EIV) systems. Such an EIV model assumes that the measurement data at both the input and output of the system involve corrupting noises. The least squares (LS) algorithm has been widely used in this area. However, it results in biased estimates for EIV-based system identification. In contrast, the total least squares (TLS) algorithm is now well known to be unbiased and has been effective for estimating the system parameters in EIV system identification. In this dissertation, we first show that the TLS algorithm computes the approximate maximum likelihood estimate (MLE) of the system parameters and that the approximation error converges to zero asymptotically as the number of measurement data approaches infinity. Then we propose a graph subspace approach (GSA) to tackle the same EIV-based system identification problem and derive a new estimation algorithm that is more general than the TLS algorithm. Several numerical examples are worked out to illustrate our proposed estimation algorithm for EIV-based system identification. We also study the problem of EIV system identification without assuming equal noise variances at the system input and output. First, we review the Frisch scheme, a well-known method for estimating the noise variances. Then we propose a new method, the GSA-Frisch algorithm, which combines GSA with the Frisch scheme by estimating the ratio of the noise variances and the system parameters iteratively. Finally, a new identification algorithm is proposed to estimate the system parameters based on the subspace interpretation, without estimating the noise variances or their ratio. This new algorithm is unbiased, achieves consistency of the parameter estimates, and is low in complexity.
    The performance of the identification algorithm is examined in several numerical examples and compared to the N4SID algorithm, whose Matlab code is available in the Matlab toolboxes, and to the GSA-Frisch algorithm.
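The TLS estimate discussed above has a standard SVD-based construction. The sketch below is the textbook form (right singular vector of the augmented data matrix), not the dissertation's code; the names and the simulated noisy example are assumptions for illustration.

```python
import numpy as np

def tls(A, b):
    """Total least squares: find x minimizing the perturbation of [A b]
    needed to make (A + dA) x = b + db consistent."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]               # right singular vector of the smallest singular value
    return -v[:n] / v[n]     # normalize so the last component equals -1

# EIV-style example: noise corrupts both the regressors and the output
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 2))
x_true = np.array([1.0, -2.0])
b = A @ x_true
x_tls = tls(A + 0.01 * rng.normal(size=A.shape),
            b + 0.01 * rng.normal(size=b.shape))
```

Unlike ordinary least squares, this treats errors in A and b symmetrically, which is why it avoids the bias of LS in the EIV setting (assuming equal noise variances).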

    TOWARDS A MORE GENERAL APPROACH TO TESTING THE TIME ADDITIVITY HYPOTHESIS

    A new procedure is proposed for re-examining the assumption of additivity of preferences over time which, although untenable, is usually maintained in intertemporal analyses of consumption and labour supply. The method is an extension of a famous work by Browning (1991). However, it is more general in permitting the estimation of Frisch demands, which are explicit in an unobservable variable (the price of utility) but may lack a closed-form representation in terms of observable variables such as prices and total outlay. It also makes extensive use of duality theory to solve the endogeneity problem encountered in Browning's study. Applying this method with an appropriate estimator to Australian disaggregate data, we find that the intertemporal additivity hypothesis is decisively rejected, which is consistent with Browning's conclusion. Results also indicate that the effects of lagged and future prices in determining current consumption decisions are insubstantial.
    Frisch Demands; The SNAP Structure; Intertemporal Additivity Hypothesis

    Robust control examples applied to a wind turbine simulated model

    Wind turbine plants are complex dynamic and uncertain processes driven by stochastic inputs and disturbances, as well as different loads represented by gyroscopic, centrifugal and gravitational forces. Moreover, as their aerodynamic models are nonlinear, both modeling and control become challenging problems. On the one hand, high-fidelity simulators should contain different parameters and variables in order to accurately describe the main dynamic system behavior. Therefore, the development of modeling and control for wind turbine systems should consider these complexity aspects. On the other hand, these control solutions have to include the main wind turbine dynamic characteristics without becoming too complicated. The main point of this paper is thus to provide two practical examples of the development of robust control strategies when applied to a simulated wind turbine plant. Extended simulations with the wind turbine benchmark model and the Monte Carlo tool represent the instruments for assessing the robustness and reliability aspects of the developed control methodologies when the model-reality mismatch and measurement errors are also considered. Advantages and drawbacks of these regulation methods are also highlighted with respect to different control strategies via proper performance metrics.
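Monte Carlo robustness assessment of the kind described can be sketched in a few lines: sample plant-parameter perturbations around the nominal model and report the fraction of stable closed loops. The first-order plant, controller gain, and stability test here are hypothetical illustrations, not the wind turbine benchmark model of the paper.

```python
import numpy as np

def monte_carlo_robustness(closed_loop_stable, nominal, rel_err=0.2, n=1000, seed=0):
    """Fraction of randomly perturbed plants (uniform +/- rel_err per parameter)
    for which the fixed controller keeps the closed loop stable."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(n):
        p = nominal * (1 + rel_err * rng.uniform(-1, 1, size=nominal.shape))
        ok += bool(closed_loop_stable(p))
    return ok / n

# hypothetical plant dx/dt = -a*x + b*u with proportional feedback u = -k*x:
# the closed loop is stable iff a + b*k > 0
k = 2.0
stable = lambda p: p[0] + p[1] * k > 0
rate = monte_carlo_robustness(stable, np.array([1.0, 0.5]))
```

The same pattern extends to the benchmark setting by replacing the stability test with a full closed-loop simulation and a performance metric.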