    An adaptive orthogonal search algorithm for model subset selection and non-linear system identification

    A new adaptive orthogonal search (AOS) algorithm is proposed for model subset selection and non-linear system identification. Model structure detection is a key step in any system identification problem. It consists of selecting significant model terms from a redundant dictionary of candidate model terms and determining the model complexity (model length or model size). The final objective is to produce a parsimonious model that can capture the inherent dynamics of the underlying system well. In the new AOS algorithm, a modified generalized cross-validation criterion, called the adjustable prediction error sum of squares (APRESS), is introduced and incorporated into a forward orthogonal search procedure. The main advantages of the new AOS algorithm are that its mechanism is simple, its implementation is direct and easy, and, more importantly, it can produce efficient model subsets for most non-linear identification problems.
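
    The abstract describes the AOS procedure only at a high level. The sketch below is a minimal, illustrative forward orthogonal (Gram-Schmidt) search in which candidate terms are ranked by their error reduction ratio and the subset size is chosen by a PRESS-like criterion with an adjustable penalty; the penalty form and the tuning parameter alpha are assumptions standing in for the published APRESS criterion, not its exact formula.

        import numpy as np

        def forward_orthogonal_search(D, y, max_terms=10, alpha=2.0):
            """Greedy forward orthogonal search over a dictionary D (N x M).

            Terms are added one at a time by largest error reduction ratio (ERR);
            the subset size minimises an adjustable, PRESS/GCV-like criterion
            J(k) = RSS(k) / (1 - alpha * k / N)**2  (an assumed form).
            """
            N, M = D.shape
            selected, Q = [], []          # chosen column indices, orthogonalised regressors
            residual = y.copy()
            rss_path = []
            for _ in range(min(max_terms, M)):
                best_idx, best_err, best_q = None, 0.0, None
                for j in range(M):
                    if j in selected:
                        continue
                    q = D[:, j].copy()
                    for qk in Q:                        # Gram-Schmidt against selected terms
                        q -= (qk @ D[:, j]) / (qk @ qk) * qk
                    qq = q @ q
                    if qq < 1e-12:
                        continue                        # numerically dependent candidate
                    err = (q @ y) ** 2 / (qq * (y @ y))  # error reduction ratio
                    if err > best_err:
                        best_idx, best_err, best_q = j, err, q
                if best_idx is None:
                    break
                selected.append(best_idx)
                Q.append(best_q)
                residual -= (best_q @ residual) / (best_q @ best_q) * best_q
                rss_path.append(residual @ residual)

            # adjustable PRESS-like model-size selection (assumed penalty form)
            J = [rss / (1.0 - alpha * (k + 1) / N) ** 2 for k, rss in enumerate(rss_path)]
            n_opt = int(np.argmin(J)) + 1
            return selected[:n_opt], J

        # toy usage: y depends on two of five candidate terms plus noise
        rng = np.random.default_rng(0)
        D = rng.standard_normal((200, 5))
        y = 1.5 * D[:, 1] - 0.8 * D[:, 3] + 0.05 * rng.standard_normal(200)
        terms, J = forward_orthogonal_search(D, y)
        print("selected term indices:", terms)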

    Model structure selection using an integrated forward orthogonal search algorithm assisted by squared correlation and mutual information

    Model structure selection plays a key role in non-linear system identification. The first step in non-linear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can then be applied to select a suitable model subset. The well-known Orthogonal Least Squares (OLS)-type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that OLS-type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A highly efficient Integrated Forward Orthogonal Search (IFOS) algorithm, which is assisted by the squared correlation and mutual information, and which incorporates a Generalised Cross-Validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.
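
    The abstract states that IFOS is assisted by both the squared correlation and mutual information, but not how the two measures are combined. The sketch below is an illustrative term-ranking step only: it scores every candidate term with both measures and averages the two rankings, which is an assumed combination rule, not the published IFOS procedure.

        import numpy as np

        def squared_correlation(x, y):
            """Squared (linear) correlation between a candidate term and the signal."""
            x, y = x - x.mean(), y - y.mean()
            return (x @ y) ** 2 / ((x @ x) * (y @ y) + 1e-12)

        def mutual_information(x, y, bins=16):
            """Crude histogram estimate of mutual information I(x; y) in nats."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy /= pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

        def rank_terms(D, y):
            """Rank dictionary columns by squared correlation and by mutual information.

            Using both measures guards against terms that look significant under one
            measure only; averaging the two rankings is an illustrative assumption.
            """
            scores_sc = [squared_correlation(D[:, j], y) for j in range(D.shape[1])]
            scores_mi = [mutual_information(D[:, j], y) for j in range(D.shape[1])]
            rank_sc = np.argsort(np.argsort(scores_sc)[::-1])
            rank_mi = np.argsort(np.argsort(scores_mi)[::-1])
            return np.argsort((rank_sc + rank_mi) / 2.0)     # best (lowest combined rank) first

        # toy usage: one term enters quadratically, so linear correlation alone misses it
        rng = np.random.default_rng(1)
        D = rng.standard_normal((500, 6))
        y = D[:, 2] ** 2 + 0.5 * D[:, 4] + 0.1 * rng.standard_normal(500)
        print("term ranking (best first):", rank_terms(D, y))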

    Improved model identification for nonlinear systems using a random subsampling and multifold modelling (RSMM) approach

    In nonlinear system identification, the available observed data are conventionally partitioned into two parts: the training data, which are used for model identification, and the test data, which are used for model performance testing. This sort of ‘hold-out’ or ‘split-sample’ data partitioning method is convenient, and the associated model identification procedure is in general easy to implement. The resultant model obtained from such a once-partitioned single training dataset, however, may occasionally lack the robustness and generalisation needed to represent future unseen data, because the performance of the identified model may be highly dependent on how the data partition is made. To overcome this drawback of the hold-out data partitioning method, this study presents a new random subsampling and multifold modelling (RSMM) approach to produce less biased or preferably unbiased models. The basic idea and the associated procedure are as follows. First, generate K training datasets (and also K validation datasets) using a K-fold random subsampling method. Second, detect significant model terms and identify a common model structure that fits all the K datasets, using a newly proposed common model selection approach called the multiple orthogonal search algorithm. Finally, estimate and refine the model parameters for the identified common-structured model using a multifold parameter estimation method. The proposed method can produce robust models with better generalisation performance.
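
    As a rough illustration of the first and last steps of the procedure described above, the sketch below generates K random train/validation splits and then averages per-fold least-squares estimates for a fixed common model structure. The averaging rule, split fraction, and helper names are assumptions for illustration; the multiple orthogonal search step used to find the common structure is not reproduced here.

        import numpy as np

        def random_subsampling_splits(n_samples, K=5, train_fraction=0.7, seed=0):
            """Generate K random train/validation index splits (repeated hold-out),
            in the spirit of the first step of the RSMM procedure."""
            rng = np.random.default_rng(seed)
            n_train = int(round(train_fraction * n_samples))
            splits = []
            for _ in range(K):
                perm = rng.permutation(n_samples)
                splits.append((perm[:n_train], perm[n_train:]))
            return splits

        def multifold_estimate(D, y, common_terms, splits):
            """Multifold parameter estimation for a fixed common model structure:
            fit the same terms on every training subset and average the estimates
            (a simple averaging rule is assumed here for illustration)."""
            thetas = []
            for train_idx, _ in splits:
                X = D[np.ix_(train_idx, common_terms)]
                thetas.append(np.linalg.lstsq(X, y[train_idx], rcond=None)[0])
            return np.mean(thetas, axis=0)

        # toy usage with a hypothetical dictionary D and a known common structure
        rng = np.random.default_rng(2)
        D = rng.standard_normal((300, 8))
        y = 2.0 * D[:, 0] - 1.0 * D[:, 5] + 0.1 * rng.standard_normal(300)
        splits = random_subsampling_splits(len(y), K=5)
        theta = multifold_estimate(D, y, common_terms=[0, 5], splits=splits)
        print("averaged parameter estimates:", theta)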

    Constructing an overall dynamical model for a system with changing design parameter properties

    This study considers the identification problem for a class of non-linear parameter-varying systems associated with the following scenario: the system behaviour depends on some specifically prescribed parameter properties, which are adjustable. To understand the effect of the varying parameters, several different experiments, corresponding to different parameter properties, are carried out and different data sets are collected. The objective is to find, from the available data sets, a common parameter-dependent model structure that best fits the adjustable parameter properties for the underlying system. An efficient Common Model Structure Selection (CMSS) algorithm, called the Extended Forward Orthogonal Regression (EFOR) algorithm, is proposed to select such a common model structure. Two examples are presented to illustrate the application and the effectiveness of the new identification approach.
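
    The EFOR algorithm itself is not detailed in the abstract. The sketch below shows one plausible common-structure selection step: a greedy forward search that, at each stage, adds the term with the largest average error reduction ratio over all data sets. This aggregate rule and the simple residual deflation are illustrative assumptions standing in for EFOR, not the published algorithm.

        import numpy as np

        def common_structure_selection(datasets, n_terms=3):
            """Greedy common model structure selection across several data sets.

            `datasets` is a list of (D, y) pairs that share the same candidate-term
            dictionary layout.  At each step the term with the largest *average*
            error reduction ratio over all data sets is added (assumed rule).
            """
            M = datasets[0][0].shape[1]
            residuals = [y.astype(float).copy() for _, y in datasets]
            selected = []
            for _ in range(n_terms):
                best_j, best_score = None, -np.inf
                for j in range(M):
                    if j in selected:
                        continue
                    score = 0.0
                    for (D, _), r in zip(datasets, residuals):
                        x = D[:, j]
                        score += (x @ r) ** 2 / ((x @ x) * (r @ r) + 1e-12)
                    score /= len(datasets)
                    if score > best_score:
                        best_j, best_score = j, score
                selected.append(best_j)
                # deflate every residual by the contribution of the chosen term
                for k, (D, _) in enumerate(datasets):
                    x = D[:, best_j]
                    residuals[k] = residuals[k] - (x @ residuals[k]) / (x @ x) * x
            return selected

        # toy usage: three data sets share the same structure but have different parameters
        rng = np.random.default_rng(3)
        datasets = []
        for a, b in [(1.0, -2.0), (0.5, -1.0), (2.0, -0.5)]:
            D = rng.standard_normal((200, 6))
            datasets.append((D, a * D[:, 1] + b * D[:, 4] + 0.1 * rng.standard_normal(200)))
        print("common structure (term indices):", common_structure_selection(datasets, n_terms=2))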

    Identification of nonlinear parameter-dependent common-structured models to accommodate varying experimental conditions and design parameter properties

    This study considers the identification problem for a class of nonlinear parameter-varying systems associated with the following scenario: the system behaviour depends on some specifically prescribed parameter properties, which are adjustable. To understand the effect of the varying parameters, several different experiments, corresponding to different parameter properties, are carried out and different data sets are collected. The objective is to find, from the available data sets, a common parameter-dependent model structure that best fits the adjustable parameter properties for the underlying system. An efficient common model structure selection (CMSS) algorithm, called the extended forward orthogonal regression (EFOR) algorithm, is proposed to select such a common model structure. Several examples are presented to illustrate the application and the effectiveness of the new identification approach.
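
    To make "parameter-dependent common-structured model" concrete, the sketch below shows one possible way to express the dependence once the common structure is fixed: each shared coefficient is modelled as a function of the design parameter. The polynomial form of that dependence, the function names, and the synthetic data are all assumptions for illustration, not the model form used in the paper.

        import numpy as np

        def fit_parameter_dependent_coefficients(design_params, coef_sets, degree=2):
            """Fit each coefficient of a common-structured model as a polynomial
            function of the design parameter (polynomial dependence is assumed
            purely for illustration).

            design_params : shape (E,)   - design-parameter value of each experiment
            coef_sets     : shape (E, p) - coefficients identified in each experiment
            Returns an array of shape (p, degree + 1) of polynomial coefficients.
            """
            return np.array([np.polyfit(design_params, coef_sets[:, i], degree)
                             for i in range(coef_sets.shape[1])])

        def predict_coefficients(poly, param):
            """Evaluate the parameter-dependent coefficients at a new design value."""
            return np.array([np.polyval(p, param) for p in poly])

        # toy usage: 4 experiments, 2 shared model terms whose coefficients vary with the parameter
        params = np.array([0.1, 0.4, 0.7, 1.0])
        coefs = np.column_stack([1.0 + 2.0 * params, -0.5 * params ** 2])   # synthetic trend
        poly = fit_parameter_dependent_coefficients(params, coefs, degree=2)
        print("coefficients predicted at parameter 0.55:", predict_coefficients(poly, 0.55))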

    Model estimation of cerebral hemodynamics between blood flow and volume changes: a data-based modeling approach

    It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where blood-oxygen-level-dependent signals are recorded, the understanding and accurate modeling of the hemodynamic relationship between CBF and CBV become increasingly important. This study presents an empirical, data-based modeling framework for model identification from CBF and CBV experimental data. It is shown that the relationship between the changes in CBF and CBV can be described using a parsimonious autoregressive with exogenous input model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is thus introduced and extended to solve this errors-in-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, a combination of RTLS with a filtering method can lead to a parsimonious but very effective model that characterizes the relationship between the changes in CBF and CBV.
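
    The RTLS formulation used in the paper is not given in the abstract. The sketch below contrasts ordinary LS with classical TLS and with a simple Tikhonov-stabilised TLS estimate on synthetic errors-in-variables data; the stabilisation rule (adding lam*I to the TLS normal equations) is an illustrative stand-in for the paper's RTLS method, and all data are synthetic, not CBF/CBV measurements.

        import numpy as np

        def tls(A, b):
            """Classical total least squares via the SVD of the augmented matrix [A b]."""
            n = A.shape[1]
            v = np.linalg.svd(np.column_stack([A, b]))[2][-1]   # right singular vector of smallest s.v.
            return -v[:n] / v[n]

        def regularised_tls(A, b, lam=1e-2):
            """A simple Tikhonov-stabilised TLS estimate (illustrative stand-in for RTLS).

            The classical TLS solution satisfies (A^T A - s^2 I) x = A^T b, where s is
            the smallest singular value of [A b]; adding lam*I stabilises this system
            when both A and b are noisy.
            """
            s = np.linalg.svd(np.column_stack([A, b]), compute_uv=False)[-1]
            n = A.shape[1]
            return np.linalg.solve(A.T @ A - s**2 * np.eye(n) + lam * np.eye(n), A.T @ b)

        # toy errors-in-variables example: both regressors and output are noisy
        rng = np.random.default_rng(4)
        x_true = np.array([1.0, -0.5])
        A_clean = rng.standard_normal((400, 2))
        A = A_clean + 0.2 * rng.standard_normal(A_clean.shape)      # noisy regressors
        b = A_clean @ x_true + 0.2 * rng.standard_normal(400)       # noisy output
        print("LS  :", np.linalg.lstsq(A, b, rcond=None)[0])
        print("TLS :", tls(A, b))
        print("RTLS:", regularised_tls(A, b))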

    Identification of nonlinear time-varying systems using an online sliding-window and common model structure selection (CMSS) approach with applications to EEG

    The identification of nonlinear time-varying systems using linear-in-the-parameters models is investigated. A new efficient Common Model Structure Selection (CMSS) algorithm is proposed to select a common model structure. The main idea and key procedure are as follows. First, generate K+1 data sets (the first K data sets are used for training, and the (K+1)th one is used for testing) using an online sliding window method; then detect significant model terms to form a common model structure that fits all the K training data sets using the newly proposed CMSS approach. Finally, estimate and refine the time-varying parameters for the identified common-structured model using a Recursive Least Squares (RLS) parameter estimation method. The new method can effectively detect and adaptively track the transient variation of nonstationary signals. Two examples are presented to illustrate the effectiveness of the new approach, including an application to an EEG data set.
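
    The final step above relies on standard recursive least squares. The sketch below shows an RLS update with a forgetting factor tracking the time-varying parameters of a fixed (already selected) common model structure; the forgetting factor, initialisation, and synthetic parameter trajectories are illustrative choices, not values from the paper, and no EEG data are used.

        import numpy as np

        def rls_track(Phi, y, forgetting=0.98, delta=100.0):
            """Recursive least squares with a forgetting factor, tracking time-varying
            parameters of a fixed linear-in-the-parameters model.

            Phi : (N, p) regressor matrix built from the selected common model terms.
            y   : (N,)   measured output.
            """
            N, p = Phi.shape
            theta = np.zeros(p)
            P = delta * np.eye(p)                 # large initial covariance
            history = np.zeros((N, p))
            for t in range(N):
                phi = Phi[t]
                k = P @ phi / (forgetting + phi @ P @ phi)      # gain vector
                theta = theta + k * (y[t] - phi @ theta)        # prediction-error update
                P = (P - np.outer(k, phi @ P)) / forgetting     # covariance update
                history[t] = theta
            return history

        # toy usage: one parameter drifts slowly, the other switches halfway through
        rng = np.random.default_rng(5)
        N = 400
        Phi = rng.standard_normal((N, 2))
        a = np.linspace(1.0, 2.0, N)                             # slow drift
        b = np.where(np.arange(N) < N // 2, -1.0, 0.5)           # abrupt change
        y = a * Phi[:, 0] + b * Phi[:, 1] + 0.05 * rng.standard_normal(N)
        est = rls_track(Phi, y)
        print("final parameter estimates:", est[-1])             # should approach (2.0, 0.5)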

    NARX-based nonlinear system identification using orthogonal least squares basis hunting

    An orthogonal least squares technique for basis hunting (OLS-BH) is proposed to construct sparse radial basis function (RBF) models for NARX-type nonlinear systems. Unlike most existing RBF or kernel modelling methods, which place the RBF or kernel centers at the training input data points and use a fixed common variance for all the regressors, the proposed OLS-BH technique tunes the RBF center and diagonal covariance matrix of each individual regressor by minimizing the training mean square error. An efficient optimization method is adopted for this basis hunting to select regressors in an orthogonal forward selection procedure. Experimental results obtained using this OLS-BH technique demonstrate that it offers a state-of-the-art method for constructing parsimonious RBF models with excellent generalization performance.
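
    To illustrate the basis-hunting idea, the sketch below runs an orthogonal forward selection loop in which each new regressor is a Gaussian RBF whose centre and width are tuned to explain the most residual variance. A simple random candidate search stands in for the efficient optimization method mentioned in the abstract, and an isotropic width replaces the per-regressor diagonal covariance; the lagged-input construction and all names are illustrative assumptions.

        import numpy as np

        def gaussian_rbf(X, center, width):
            """Gaussian radial basis function regressor evaluated on all inputs."""
            return np.exp(-np.sum((X - center) ** 2, axis=1) / (2.0 * width ** 2))

        def ols_basis_hunting(X, y, n_basis=5, n_candidates=200, seed=0):
            """Orthogonal forward selection of RBF regressors with per-regressor
            centre/width tuning ('basis hunting'); random candidate search is an
            illustrative stand-in for the optimisation method of the paper."""
            rng = np.random.default_rng(seed)
            residual = y.astype(float).copy()
            Q, chosen = [], []
            for _ in range(n_basis):
                best = None
                for _ in range(n_candidates):
                    center = X[rng.integers(len(X))]               # candidate centre near a data point
                    center = center + 0.1 * rng.standard_normal(X.shape[1])
                    width = rng.uniform(0.2, 2.0)                  # candidate width
                    q = gaussian_rbf(X, center, width)
                    for qk in Q:                                   # orthogonalise against chosen regressors
                        q = q - (qk @ q) / (qk @ qk) * qk
                    qq = q @ q
                    if qq < 1e-10:
                        continue
                    err = (q @ residual) ** 2 / qq                 # residual variance explained
                    if best is None or err > best[0]:
                        best = (err, center, width, q)
                _, center, width, q = best
                chosen.append((center, width))
                Q.append(q)
                residual = residual - (q @ residual) / (q @ q) * q
            return chosen

        # toy NARX-style usage: the regression vector is built from lagged input values
        rng = np.random.default_rng(6)
        u = rng.uniform(-1, 1, 303)
        y_sig = np.sin(2 * u[2:301]) + 0.5 * u[1:300] ** 2 + 0.05 * rng.standard_normal(299)
        X = np.column_stack([u[2:301], u[1:300], u[0:299]])
        centres = ols_basis_hunting(X, y_sig, n_basis=4)
        print("selected RBF centres/widths:", [(c.round(2).tolist(), round(w, 2)) for c, w in centres])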