16 research outputs found

    Compressive system identification of LTI and LTV ARX models: The limited data set case

    In this paper, we consider identifying Auto-Regressive with eXternal input (ARX) models for both Linear Time-Invariant (LTI) and Linear Time-Variant (LTV) systems. We aim to perform the identification from the smallest possible number of observations. This is inspired by the field of Compressive Sensing (CS), and for this reason we call this problem Compressive System Identification (CSI). In the case of LTI ARX systems, a system with a large number of inputs and unknown input delays on each channel can require a model structure with a large number of parameters, unless input delay estimation is performed. Since the complexity of input delay estimation grows exponentially with the number of inputs, this can be difficult for high-dimensional systems. We show that in cases where the LTI system has possibly many inputs with different unknown delays, simultaneous ARX identification and input delay estimation is possible from few observations, even though this leads to an apparently ill-conditioned identification problem. We discuss identification guarantees and support our proposed method with simulations. We also consider identifying LTV ARX models. In particular, we consider systems with parameters that change only at a few time instants in a piecewise-constant manner, where neither the change moments nor the number of changes is known a priori. The main technical novelty of our approach lies in casting the identification problem as the recovery of a block-sparse signal from an underdetermined set of linear equations. We suggest a random sampling approach for LTV identification, address the issue of identifiability, and again support our approach with illustrative simulations.
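The core idea of recovering a sparse parameter vector from fewer observations than parameters can be illustrated with a small ℓ1-minimization toy problem. This is a generic sparse-recovery sketch, not the authors' algorithm: the Gaussian dictionary, the dimensions, and the ISTA solver are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for CSI: an overparameterized regression in which only a few
# candidate parameters (e.g. input-delay taps) are truly active, recovered
# from fewer observations than parameters via l1-regularized least squares.
n_obs, n_par, n_active = 40, 120, 4
Phi = rng.standard_normal((n_obs, n_par))          # regressor matrix
theta_true = np.zeros(n_par)
theta_true[rng.choice(n_par, n_active, replace=False)] = rng.standard_normal(n_active)
y = Phi @ theta_true + 0.01 * rng.standard_normal(n_obs)

def ista(Phi, y, lam, iters=2000):
    """Iterative soft-thresholding for 0.5*||Phi t - y||^2 + lam*||t||_1."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2       # 1 / Lipschitz constant
    theta = np.zeros(Phi.shape[1])
    for _ in range(iters):
        z = theta - step * Phi.T @ (Phi @ theta - y)
        theta = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return theta

lam = 0.05 * np.max(np.abs(Phi.T @ y))             # heuristic regularization level
theta_hat = ista(Phi, y, lam)
```

The soft-thresholding step is what produces exact zeros in the estimate; the paper's block-sparse formulation replaces the scalar threshold with a group-wise one.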

    Model structure learning: A support vector machine approach for LPV linear-regression models

    Accurate parametric identification of Linear Parameter-Varying (LPV) systems requires an optimal prior selection of a set of functional dependencies for the parametrization of the model coefficients. Inaccurate selection leads to structural bias, while over-parametrization results in a variance increase of the estimates. This corresponds to the classical bias-variance trade-off, but with a significantly larger degree of freedom and sensitivity in the LPV case. Hence, it is attractive to estimate the underlying model structure of LPV systems from measured data, i.e., to learn the underlying dependencies of the model coefficients together with the model orders. In this paper, a Least-Squares Support Vector Machine (LS-SVM) approach is introduced which is capable of reconstructing the dependency structure of linear-regression-based LPV models even in the case of rational dynamic dependency. The properties of the approach are analyzed in the prediction-error setting, and its performance is evaluated on representative examples.
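The LS-SVM machinery referred to here can be sketched on a toy dependency-learning problem: recover an unknown (rational) coefficient dependency a(p) on the scheduling variable p from noisy samples by solving the standard LS-SVM dual system. The data, kernel width, and regularization constant below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Learn an unknown coefficient dependency a(p) via LS-SVM regression
# with an RBF kernel (hypothetical toy data).
p = np.linspace(-1.0, 1.0, 50)                     # scheduling-variable samples
a_true = 1.0 / (1.0 + p**2)                        # rational dependency (unknown)
y = a_true + 0.02 * rng.standard_normal(p.size)

def rbf_kernel(u, v, sigma=0.3):
    return np.exp(-((u[:, None] - v[None, :]) ** 2) / (2.0 * sigma**2))

# LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
gamma = 100.0
K = rbf_kernel(p, p)
n = p.size
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
b, alpha = sol[0], sol[1:]

a_hat = K @ alpha + b                              # fitted dependency a(p)
```

The attraction of the LS-SVM route is visible even in this sketch: the dependency is recovered nonparametrically, without committing a priori to a basis for a(p).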

    Model structure selection using an integrated forward orthogonal search algorithm assisted by squared correlation and mutual information

    Model structure selection plays a key role in nonlinear system identification. The first step in nonlinear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can be applied to select a suitable model subset. The well-known Orthogonal Least Squares (OLS) type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that OLS-type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A very efficient Integrated Forward Orthogonal Search (IFOS) algorithm, which is assisted by the squared correlation and mutual information, and which incorporates a Generalised Cross-Validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.
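The squared-correlation part of a forward orthogonal search can be sketched as a greedy loop: at each step, pick the candidate term most correlated with the current residual, orthogonalize it against the terms already chosen, and deflate the residual. This is a minimal sketch of that one ingredient; the mutual-information scores, GCV criterion, and hypothesis tests of IFOS are omitted, and the dictionary below is a hypothetical toy.

```python
import numpy as np

rng = np.random.default_rng(2)

# Greedy forward selection by squared correlation with the residual,
# with Gram-Schmidt orthogonalization of the selected terms.
N, M = 200, 10
D = rng.standard_normal((N, M))                    # candidate model terms
y = D[:, [1, 4, 7]] @ np.array([1.0, -2.0, 0.5]) + 0.05 * rng.standard_normal(N)

selected = []
Q = np.empty((N, 0))                               # orthonormal basis of chosen terms
residual = y.copy()
for _ in range(3):
    # squared correlation of each unselected candidate with the residual
    scores = [(residual @ D[:, j]) ** 2 / (D[:, j] @ D[:, j])
              if j not in selected else -np.inf
              for j in range(M)]
    j = int(np.argmax(scores))
    q = D[:, j] - Q @ (Q.T @ D[:, j])              # orthogonalize against chosen terms
    q /= np.linalg.norm(q)
    residual -= q * (q @ residual)                 # deflate the residual
    Q = np.column_stack([Q, q])
    selected.append(j)
```

On this toy problem the loop recovers the three truly active terms; the failure modes mentioned in the abstract (correlated noise, poorly exciting inputs) are exactly the cases where such a bare squared-correlation score needs the additional safeguards.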

    Sparse Iterative Learning Control with Application to a Wafer Stage: Achieving Performance, Resource Efficiency, and Task Flexibility

    Trial-varying disturbances are a key concern in Iterative Learning Control (ILC) and may lead to inefficient and expensive implementations and severe performance deterioration. The aim of this paper is to develop a general framework for optimization-based ILC that allows for enforcing additional structure, including sparsity. The proposed method enforces sparsity in a generalized setting through convex relaxations using ℓ1 norms. The proposed ILC framework is applied to the optimization of sampling sequences for resource-efficient implementation, trial-varying disturbance attenuation, and basis function selection. The framework has a large potential in control applications such as mechatronics, as is confirmed through an application on a wafer stage.
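An ℓ1-regularized, optimization-based ILC update can be sketched on a lifted SISO system: each trial applies a gradient step on the tracking error followed by soft-thresholding of the input, which promotes sparse (resource-efficient) command signals. The plant, reference, and penalty weight below are illustrative assumptions; the paper's framework covers more general structure-enforcing penalties.

```python
import numpy as np

# Sketch of l1-regularized optimization-based ILC on a lifted SISO system.
N = 60
h = 0.5 * 0.8 ** np.arange(N)                      # hypothetical impulse response
G = np.zeros((N, N))                               # lifted (lower-Toeplitz) plant
for i in range(N):
    G[i, : i + 1] = h[: i + 1][::-1]
r = np.sin(2 * np.pi * np.arange(N) / N)           # reference trajectory

lam = 1e-3                                         # sparsity-promoting weight
step = 1.0 / np.linalg.norm(G, 2) ** 2
u = np.zeros(N)
err_norms = []
for trial in range(500):
    e = r - G @ u                                  # trial-k tracking error
    z = u + step * G.T @ e                         # gradient step on 0.5*||e||^2
    u = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)   # sparsify input
    err_norms.append(np.linalg.norm(e))
```

The soft-threshold is the convex relaxation at work: with lam = 0 this reduces to a plain gradient-type ILC update, while larger lam trades tracking accuracy for sparser actuation.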

    Identification of input-output LPV models

    This chapter presents an overview of the available methods for identifying input-output LPV models, both in discrete time and continuous time, with the main focus on noise modeling issues. First, a least-squares approach and an instrumental variable method are presented for dealing with LPV-ARX models. Then, a refined instrumental variable approach is discussed to address more sophisticated noise models, such as Box-Jenkins, in the LPV context. This latter approach is also introduced in continuous time, and efficient solutions are proposed both for the problem of time-derivative approximation and for the issue of continuous-time modeling of the noise.
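The instrumental variable idea the chapter builds on can be sketched in the LTI ARX(1) special case: when the noise is colored, plain least squares is biased because the lagged output is correlated with the noise, while instruments built from past inputs restore consistency. All numbers below are illustrative; the chapter's LPV and refined-IV methods extend this basic construction with scheduling-dependent regressors.

```python
import numpy as np

rng = np.random.default_rng(3)

# Basic IV sketch: ARX(1) with colored (MA(1)) noise, y_t = a*y_{t-1} + b*u_{t-1} + v_t.
N = 5000
a_true, b_true = 0.7, 1.0
u = rng.standard_normal(N)                         # exciting input
e = 0.3 * rng.standard_normal(N)
v = e + 0.8 * np.concatenate(([0.0], e[:-1]))      # colored noise -> LS is biased
y = np.zeros(N)
for t in range(1, N):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1] + v[t]

Phi = np.column_stack([y[1:-1], u[1:-1]])          # regressors [y_{t-1}, u_{t-1}]
Y = y[2:]
Z = np.column_stack([u[:-2], u[1:-1]])             # instruments: past inputs only

theta_ls = np.linalg.lstsq(Phi, Y, rcond=None)[0]  # biased under colored noise
theta_iv = np.linalg.solve(Z.T @ Phi, Z.T @ Y)     # consistent IV estimate
```

The instruments are correlated with the regressors but not with the noise, which is what makes the IV normal equations consistent; refined IV methods additionally iterate on a noise-model-based prefilter.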

    Model structure selection using an integrated forward orthogonal search algorithm assisted by squared correlation and mutual information

    Model structure selection plays a key role in non-linear system identification. The first step in non-linear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can be applied to select a suitable model subset. The well-known Orthogonal Least Squares (OLS) type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that OLS-type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A very efficient Integrated Forward Orthogonal Search (IFOS) algorithm, which is assisted by the squared correlation and mutual information, and which incorporates a Generalised Cross-Validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.

    LPV system identification using series expansion models
