Estimating Random Delays in Modbus Network Using Experiments and General Linear Regression Neural Networks with Genetic Algorithm Smoothing
Time-varying delays adversely affect the performance of networked control
systems (NCS) and, in the worst case, can destabilize the entire system.
Therefore, modelling network delays is important for designing NCS. However,
modelling time-varying delays is challenging because of their dependence on
multiple parameters such as length, contention, connected devices, protocol
employed, and channel loading. Further, these parameters are inherently
random, and delays vary non-linearly with time, which makes estimating
random delays challenging. This investigation
presents a methodology for modelling delays in NCS using experiments and a
general regression neural network (GRNN), chosen for its ability to capture
non-linear relationships. A genetic algorithm is used to compute the optimal
smoothing parameter, i.e., the one that minimizes the mean absolute
percentage error (MAPE) of the estimates. Our results illustrate that the
resulting GRNN predicts the delays with less than 3% error. The proposed
delay model provides a framework for designing compensation schemes for NCS
subject to time-varying delays.

Comment: 11 pages, 7 figures
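The GRNN-plus-GA scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the GRNN is the standard Nadaraya-Watson kernel form, and the GA (population size, mutation scale, selection rule) and the synthetic "delay" data are assumptions for demonstration only.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma):
    """GRNN (Nadaraya-Watson kernel regression): each prediction is a
    Gaussian-weighted average of the training targets."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2  # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))             # pattern-layer weights
    return (w @ y_train) / w.sum(axis=1)             # summation / output layer

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def ga_smoothing(x_tr, y_tr, x_val, y_val, pop_size=20, gens=30, seed=0):
    """Toy genetic algorithm over the scalar smoothing parameter sigma:
    rank candidates by validation MAPE, keep the best half (selection),
    and refill the population with mutated copies (mutation)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.01, 1.0, pop_size)  # initial population of sigmas
    for _ in range(gens):
        fitness = np.array(
            [mape(y_val, grnn_predict(x_tr, y_tr, x_val, s)) for s in pop])
        parents = pop[np.argsort(fitness)][: pop_size // 2]
        children = parents + rng.normal(0.0, 0.02, parents.size)
        pop = np.clip(np.concatenate([parents, children]), 1e-3, None)
    fitness = np.array(
        [mape(y_val, grnn_predict(x_tr, y_tr, x_val, s)) for s in pop])
    return pop[np.argmin(fitness)]

# Hypothetical smooth delay profile (seconds) over a normalized load axis.
x = np.linspace(0.0, 1.0, 60)
y = 0.05 + 0.02 * np.sin(2.0 * np.pi * x)
sigma = ga_smoothing(x, y, x, y)
print(sigma, mape(y, grnn_predict(x, y, x, sigma)))
```

In practice the validation split would be held out from training, and the GA would typically use crossover as well; the sketch keeps only selection and mutation to show how the fitness (MAPE) drives the choice of the smoothing parameter.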