11 research outputs found

    Generalised cellular neural networks (GCNNs) constructed using particle swarm optimisation for spatio-temporal evolutionary pattern identification

    Particle swarm optimization (PSO) is introduced to implement a new constructive learning algorithm for training generalized cellular neural networks (GCNNs) for the identification of spatio-temporal evolutionary (STE) systems. The basic idea of the new PSO-based learning algorithm is to successively approximate the desired signal by progressively pursuing relevant orthogonal projections. This new algorithm will thus be referred to as the orthogonal projection pursuit (OPP) algorithm, which is similar in mechanism to the conventional projection pursuit approach. A novel two-stage hybrid training scheme is proposed for constructing a parsimonious GCNN model. In the first stage, the orthogonal projection pursuit algorithm is applied to adaptively and successively augment the network, where the adjustable parameters of the associated units are optimized using a particle swarm optimizer. The resultant network model produced in the first stage may be redundant. In the second stage, a forward orthogonal regression (FOR) algorithm, aided by mutual information estimation, is applied to refine and improve the initially trained network. The effectiveness and performance of the proposed method are validated by applying the new modeling framework to a spatio-temporal evolutionary system identification problem.
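
    For a concrete sense of the constructive first stage, the following minimal sketch (not code from the paper) recruits generic Gaussian units one at a time: a bare-bones particle swarm tunes each unit's centre and width to maximise the squared correlation with the current residual, and the residual is then reduced by its orthogonal projection onto the new unit. All function names and parameter values (gaussian_unit, opp_grow, swarm settings) are illustrative assumptions, and a simple Gaussian stands in for an actual GCNN processing unit.

        import numpy as np

        def gaussian_unit(x, centre, width):
            # Toy stand-in for a GCNN processing unit (illustrative only).
            return np.exp(-((x - centre) / (abs(width) + 1e-6)) ** 2)

        def pso_fit_unit(x, residual, n_particles=20, n_iter=50, seed=0):
            # Bare-bones particle swarm: tune (centre, width) so that the unit
            # output has maximum squared correlation with the current residual.
            rng = np.random.default_rng(seed)
            lo, hi = float(x.min()), float(x.max())
            pos = rng.uniform([lo, 0.05], [hi, hi - lo + 0.1], size=(n_particles, 2))
            vel = np.zeros_like(pos)
            pbest, pbest_val = pos.copy(), np.full(n_particles, -np.inf)
            gbest, gbest_val = pos[0].copy(), -np.inf

            def score(p):
                c = np.corrcoef(gaussian_unit(x, p[0], p[1]), residual)[0, 1]
                return 0.0 if np.isnan(c) else c ** 2  # squared correlation

            for _ in range(n_iter):
                for i in range(n_particles):
                    v = score(pos[i])
                    if v > pbest_val[i]:
                        pbest_val[i], pbest[i] = v, pos[i].copy()
                    if v > gbest_val:
                        gbest_val, gbest = v, pos[i].copy()
                r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
                vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
                pos = pos + vel
            return gbest

        def opp_grow(x, y, n_units=5):
            # Constructive first stage: fit each new unit to the residual, then
            # remove the residual's orthogonal projection onto that unit.
            residual, units = y.astype(float).copy(), []
            for _ in range(n_units):
                centre, width = pso_fit_unit(x, residual)
                phi = gaussian_unit(x, centre, width)
                w = (phi @ residual) / (phi @ phi)  # least-squares weight
                residual = residual - w * phi
                units.append((centre, width, w))
            return units, residual

    A second, pruning stage in the spirit of the forward orthogonal regression step described above would then re-rank the recruited units and discard redundant ones.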

    Model structure selection using an integrated forward orthogonal search algorithm interfered with squared correlation and mutual information

    Model structure selection plays a key role in nonlinear system identification. The first step in nonlinear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can then be applied to select a suitable model subset. The well-known orthogonal least squares type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that the orthogonal least squares type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A very efficient integrated forward orthogonal search (IFOS) algorithm, which is interfered with squared correlation and mutual information, and which incorporates a generalised cross-validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.
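
    The squared-correlation (error reduction ratio) scoring that drives the forward orthogonal search can be sketched compactly. The code below is the standard forward orthogonal regression skeleton under simplifying assumptions; it omits the mutual information term, the GCV criterion and the hypothesis tests of the full IFOS procedure, and names such as forward_orthogonal_search and n_select are illustrative.

        import numpy as np

        def forward_orthogonal_search(P, y, n_select):
            # Standard forward orthogonal regression skeleton: at each step the
            # candidate term whose orthogonalised component has the largest
            # squared correlation with the output (error reduction ratio) is chosen.
            # P: (N, M) matrix of candidate model terms; y: (N,) output signal.
            P, y = np.asarray(P, dtype=float), np.asarray(y, dtype=float)
            selected, Q = [], []  # chosen term indices, orthogonalised basis
            for _ in range(n_select):
                best_idx, best_err = None, -1.0
                for j in range(P.shape[1]):
                    if j in selected:
                        continue
                    q = P[:, j].copy()
                    for qk in Q:  # Gram-Schmidt against already selected terms
                        q -= (qk @ q) / (qk @ qk) * qk
                    denom = (q @ q) * (y @ y)
                    if denom < 1e-12:
                        continue
                    err = (q @ y) ** 2 / denom  # squared correlation (ERR)
                    if err > best_err:
                        best_idx, best_err = j, err
                if best_idx is None:
                    break
                q = P[:, best_idx].copy()
                for qk in Q:
                    q -= (qk @ q) / (qk @ qk) * qk
                Q.append(q)
                selected.append(best_idx)
            return selected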

    Lattice dynamical wavelet neural networks implemented using particle swarm optimisation for spatio-temporal system identification

    Starting from the basic concept of coupled map lattices, a new family of adaptive wavelet neural networks, called lattice dynamical wavelet neural networks (LDWNN), is introduced for spatio-temporal system identification by combining an efficient wavelet representation with a coupled map lattice model. A new orthogonal projection pursuit (OPP) method, coupled with a particle swarm optimisation (PSO) algorithm, is proposed for augmenting the proposed network. A novel two-stage hybrid training scheme is developed for constructing a parsimonious network model. In the first stage, by applying the orthogonal projection pursuit algorithm, significant wavelet-neurons are adaptively and successively recruited into the network, where the adjustable parameters of the associated wavelet-neurons are optimised using a particle swarm optimiser. The network model obtained in the first stage may, however, be redundant. In the second stage, an orthogonal least squares (OLS) algorithm is then applied to refine and improve the initially trained network by removing redundant wavelet-neurons. The proposed two-stage hybrid training procedure generally produces a parsimonious network model, together with a ranked list of wavelet-neurons ordered according to the capability of each neuron to represent the total variance in the system output signal. Two spatio-temporal system identification examples are presented to demonstrate the performance of the proposed new modelling framework.
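
    As background for the coupled map lattice concept the network builds on, the sketch below generates spatio-temporal data from a standard diffusively coupled logistic-map lattice on a ring. This is not a model from the paper; the parameter values (coupling eps, logistic parameter r) and the function name are arbitrary illustrative choices.

        import numpy as np

        def coupled_map_lattice(n_sites=50, n_steps=200, eps=0.3, r=3.9, seed=0):
            # Diffusively coupled logistic maps on a ring: a standard toy generator
            # of spatio-temporal dynamics (rows of the result: time; columns: sites).
            rng = np.random.default_rng(seed)
            x = rng.random(n_sites)
            history = np.empty((n_steps, n_sites))
            for t in range(n_steps):
                fx = r * x * (1.0 - x)  # local logistic map
                x = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
                history[t] = x
            return history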

    Model structure selection using an integrated forward orthogonal search algorithm assisted by squared correlation and mutual information

    Model structure selection plays a key role in non-linear system identification. The first step in non-linear system identification is to determine which model terms should be included in the model. Once significant model terms have been determined, a model selection criterion can then be applied to select a suitable model subset. The well-known Orthogonal Least Squares (OLS) type algorithms are among the most efficient and commonly used techniques for model structure selection. However, it has been observed that the OLS type algorithms may occasionally select incorrect model terms or yield a redundant model subset in the presence of particular noise structures or input signals. A very efficient Integrated Forward Orthogonal Search (IFOS) algorithm, which is assisted by the squared correlation and mutual information, and which incorporates a Generalised Cross-Validation (GCV) criterion and hypothesis tests, is introduced to overcome these limitations in model structure selection.
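
    The Generalised Cross-Validation criterion mentioned above has a compact standard form, GCV(k) = N * RSS_k / (N - k)^2, where RSS_k is the residual sum of squares of the k-term model. The sketch below applies it to a nested sequence of already orthogonalised, already ranked terms; it is a hedged illustration of the criterion only, not the full IFOS stopping rule, and the names (gcv_model_size, Q) are assumptions.

        import numpy as np

        def gcv_model_size(Q, y):
            # Generalised cross-validation over nested models built from already
            # orthogonalised, already ranked terms (the columns of Q, in order):
            # GCV(k) = N * RSS_k / (N - k)**2; the minimiser suggests the model size.
            Q, y = np.asarray(Q, dtype=float), np.asarray(y, dtype=float)
            N, M = Q.shape
            residual, scores = y.copy(), []
            for k in range(1, M + 1):
                q = Q[:, k - 1]
                residual = residual - (q @ residual) / (q @ q) * q
                rss = residual @ residual
                scores.append(N * rss / (N - k) ** 2)
            return int(np.argmin(scores)) + 1, scores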

    A randomised approach for NARX model identification based on a multivariate Bernoulli distribution

    The identification of polynomial NARX models is typically performed by incremental model building techniques. These methods assess the importance of each regressor based on the evaluation of partial individual models, which may ultimately lead to erroneous model selections. A more robust assessment of the significance of a specific model term can be obtained by considering ensembles of models, as done by the RaMSS algorithm. In that context, the identification task is formulated in a probabilistic fashion and a Bernoulli distribution is employed to represent the probability that a regressor belongs to the target model. Samples of the model distribution are then collected to gather reliable information for updating it, until convergence to a specific model. The basic RaMSS algorithm employs multiple independent univariate Bernoulli distributions associated with the different candidate model terms, thus overlooking the correlations between terms, which are typically important in the selection process. Here, a multivariate Bernoulli distribution is employed, in which the sampling of a given term is conditioned on the sampling of the others. The added complexity inherent in considering the regressor correlation properties is more than compensated by the achievable improvements in the accuracy of the model selection process.
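
    The effect of conditioning each term's sampling on the others is easiest to see in a toy form. In the sketch below, each regressor's inclusion probability is shifted by a pairwise coupling matrix according to the regressors already drawn; this is only an illustration of conditional sampling, not the actual parameterisation of the multivariate Bernoulli distribution used by the algorithm, and the names (sample_model_structure, W) are hypothetical.

        import numpy as np

        def sample_model_structure(p, W, rng):
            # Draw one 0/1 inclusion vector so that the sampling of each regressor
            # is conditioned on the regressors already drawn.  p: marginal inclusion
            # probabilities; W: pairwise coupling matrix (a toy surrogate for the
            # correlation structure a multivariate Bernoulli distribution captures).
            m = len(p)
            x = np.zeros(m, dtype=int)
            drawn = []
            for i in rng.permutation(m):  # visit the terms in random order
                shift = sum(W[i, j] * (x[j] - p[j]) for j in drawn)
                prob = float(np.clip(p[i] + shift, 0.01, 0.99))
                x[i] = rng.random() < prob
                drawn.append(i)
            return x

        # Example usage with arbitrary toy values:
        rng = np.random.default_rng(0)
        p = np.full(6, 0.3)            # marginal inclusion probabilities
        W = 0.2 * np.eye(6)[:, ::-1]   # couple term i to term 5 - i (toy choice)
        structure = sample_model_structure(p, W, rng)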

    A randomized algorithm for nonlinear model structure selection

    The identification of polynomial Nonlinear Autoregressive [Moving Average] models with eXogenous variables (NAR[MA]X) is typically carried out with incremental model building techniques that progressively select the terms to include in the model. Model Structure Selection (MSS) turns out to be the hardest task of the identification process, owing to the difficulty of correctly evaluating the importance of a generic term. As a result, classical MSS methods sometimes yield unsatisfactory models that are unreliable over long-range prediction horizons. The MSS problem is here recast into a probabilistic framework, based on which a randomized algorithm for MSS, denoted RaMSS, is derived. The method introduces a tentative probability distribution over models and progressively updates it by extracting useful information on the importance of each term from sampled model structures. The proposed method is validated on models with different characteristics by means of Monte Carlo simulations, which show its advantages over classical and competing probabilistic MSS methods in terms of both reliability and computational efficiency.
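
    The probabilistic reformulation admits a compact illustration: one iteration of a RaMSS-style update samples model structures from independent Bernoulli inclusion probabilities, fits and scores each sampled model, and nudges each term's probability according to how models containing it performed. The sketch below is a simplified stand-in under assumptions (mean-squared-error score, additive probability update), not the exact RaMSS rule, and all names are illustrative.

        import numpy as np

        def ramss_like_update(P, y, p, n_samples=50, step=0.1, rng=None):
            # One iteration of a simplified RaMSS-style update: sample structures
            # from independent Bernoulli inclusion probabilities p, fit each sampled
            # model by least squares, score it, and push each term's probability
            # towards structures that scored well.  P: (N, M) candidate regressors.
            rng = rng or np.random.default_rng()
            P, y, p = np.asarray(P, float), np.asarray(y, float), np.asarray(p, float)
            structures, scores = [], []
            for _ in range(n_samples):
                x = rng.random(P.shape[1]) < p  # sampled inclusion vector
                if not x.any():
                    continue
                theta, *_ = np.linalg.lstsq(P[:, x], y, rcond=None)
                mse = np.mean((y - P[:, x] @ theta) ** 2)
                structures.append(x)
                scores.append(1.0 / (1.0 + mse))  # higher is better
            if not structures:
                return p
            structures, scores = np.array(structures), np.array(scores)
            for j in range(P.shape[1]):
                in_j, out_j = scores[structures[:, j]], scores[~structures[:, j]]
                if len(in_j) and len(out_j):
                    p[j] = np.clip(p[j] + step * (in_j.mean() - out_j.mean()), 0.01, 0.99)
            return p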