Parameter reduction in nonlinear state-space identification of hysteresis
Hysteresis is a highly nonlinear phenomenon, showing up in a wide variety of
science and engineering problems. The identification of hysteretic systems from
input-output data is a challenging task. Recent work on black-box polynomial
nonlinear state-space modeling for hysteresis identification has provided
promising results, but struggles with a large number of parameters due to the
use of multivariate polynomials. This drawback is tackled in the current paper
by applying a decoupling approach that results in a more parsimonious
representation involving univariate polynomials. This work is carried out
numerically on input-output data generated by a Bouc-Wen hysteretic model and
follows up on earlier work of the authors. The current article discusses the polynomial decoupling approach and explores how to select the number of univariate polynomials together with the polynomial degree, as well as the connections with neural network modeling. We found that the presented decoupling approach can reduce the number of parameters of the full nonlinear model by up to about 50%, while maintaining a comparable output error level.
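To make the parameter savings concrete, here is a rough counting sketch in Python (not the authors' implementation; the dimensions n and m, the degree d, and the number of branches r are illustrative assumptions). A decoupled representation f(x) ≈ W g(V^T x), with univariate polynomials g_i applied branch-wise, replaces the coupled form f(x) = E phi(x), where phi(x) stacks all multivariate monomials:

    from math import comb

    # Illustrative parameter counts: coupled multivariate polynomial
    # nonlinearity versus a decoupled form built from univariate
    # polynomials. All dimensions below are hypothetical.
    n = 5   # inputs to the nonlinearity (states and input)
    m = 3   # outputs of the nonlinearity
    d = 3   # polynomial degree
    r = 4   # number of univariate branches

    # Coupled form f(x) = E phi(x): phi stacks all monomials of
    # degree 2..d in n variables; E is m by len(phi).
    n_monomials = sum(comb(n + k - 1, k) for k in range(2, d + 1))
    coupled = m * n_monomials

    # Decoupled form f(x) ~ W g(V^T x): V is n by r, W is m by r,
    # and each branch g_i is a univariate polynomial with
    # coefficients for degrees 2..d.
    decoupled = n * r + m * r + r * (d - 1)

    print(f"coupled:   {coupled} parameters")    # 150
    print(f"decoupled: {decoupled} parameters")  # 40

In this toy setting the nonlinear part shrinks by much more than half; the roughly 50% figure quoted above concerns the full state-space model, which also contains linear terms that the decoupling does not compress.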
Improved PAC-Bayesian Bounds for Linear Regression
In this paper, we improve the PAC-Bayesian error bound for linear regression derived in Germain et al. [10]. The improvements are twofold. First, the proposed error bound is tighter and converges to the generalization loss with a well-chosen temperature parameter. Second, the error bound also holds for training data that are not independently sampled. In particular, the error bound applies to certain time series generated by well-known classes of dynamical models, such as ARX models.
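For orientation, a generic PAC-Bayesian template (standard in this literature, but not the paper's exact statement) bounds the posterior-averaged generalization loss by the empirical loss plus a complexity term: with probability at least 1 - delta over the m training samples,

    \mathbb{E}_{h \sim \rho}[L(h)] \;\le\; \mathbb{E}_{h \sim \rho}[\hat{L}_m(h)]
      + \frac{1}{\lambda}\Big(\mathrm{KL}(\rho \,\|\, \pi) + \ln\tfrac{1}{\delta} + \Psi(\lambda, m)\Big),

where pi is a prior and rho any posterior over predictors, lambda > 0 is the temperature parameter mentioned in the abstract, and Psi is a moment term depending on the loss and the data distribution. Loosely, the two improvements correspond to sharpening the bound so that it converges for a suitable lambda, and to controlling Psi without assuming independently sampled data.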
PAC-Bayesian theory for stochastic LTI systems
In this paper we derive a PAC-Bayesian error bound for autonomous stochastic LTI state-space models. The motivation is that such bounds open the way to similar error bounds for more general dynamical systems, including recurrent neural networks. In turn, PAC-Bayesian error bounds are known to be useful for analyzing machine learning algorithms and for deriving new ones.
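For concreteness, here is a minimal simulation sketch in Python of the model class studied (the matrices A and C and the noise scales are arbitrary assumptions, not taken from the paper): an autonomous stochastic LTI state-space model generates data as x[t+1] = A x[t] + w[t], y[t] = C x[t] + v[t].

    import numpy as np

    # Simulate an autonomous stochastic LTI state-space model:
    #   x[t+1] = A x[t] + w[t]   (process noise)
    #   y[t]   = C x[t] + v[t]   (measurement noise)
    # Matrices and noise levels are illustrative only.
    rng = np.random.default_rng(0)

    A = np.array([[0.8, 0.1],
                  [0.0, 0.5]])   # spectral radius < 1, so stable
    C = np.array([[1.0, 0.0]])

    T = 200
    x = np.zeros(2)
    y = np.empty(T)
    for t in range(T):
        y[t] = (C @ x).item() + rng.normal(scale=0.05)
        x = A @ x + rng.normal(scale=0.1, size=2)

    print(y[:5])

A PAC-Bayesian bound for this class would then relate, for instance, a learned model's empirical prediction error on one such trajectory to its expected error over fresh realizations of the noise.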