Robust State Space Filtering under Incremental Model Perturbations Subject to a Relative Entropy Tolerance
This paper considers robust filtering for a nominal Gaussian state-space
model, where a relative entropy tolerance is applied to each time increment of
the dynamical model. The problem is formulated as a dynamic minimax game where the
maximizer adopts a myopic strategy. This game is shown to admit a saddle point
whose structure is characterized by applying and extending results presented
earlier in [1] for static least-squares estimation. The resulting minimax
filter takes the form of a risk-sensitive filter with a time-varying risk-sensitivity
parameter that depends on the tolerance bound applied to the
model dynamics and observations at the corresponding time index. The
least-favorable model is constructed and used to evaluate the performance of
alternative filters. Simulations comparing the proposed risk-sensitive filter
to a standard Kalman filter show a significant performance advantage when
applied to the least-favorable model, and only a small performance loss under the
nominal model.
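The recursion described above can be sketched numerically. The following is a minimal numpy illustration, not the paper's exact algorithm: the risk-adjusted covariance (P^{-1} - theta I)^{-1} is one common form of the risk-sensitive update, all model matrices are hypothetical, and theta = 0 recovers the standard Kalman filter.

```python
import numpy as np

def risk_sensitive_step(x, P, y, A, C, Q, R, theta):
    """One predict/update step of a risk-sensitive Kalman filter with a
    (possibly time-varying) risk-sensitivity parameter theta.

    theta = 0 recovers the standard Kalman filter. Requires
    P^{-1} - theta*I to be positive definite.
    """
    n = P.shape[0]
    # Risk-adjusted prior covariance: (P^{-1} - theta I)^{-1}.
    P_adj = np.linalg.inv(np.linalg.inv(P) - theta * np.eye(n))
    # Standard measurement update, applied to the adjusted covariance.
    S = C @ P_adj @ C.T + R
    K = P_adj @ C.T @ np.linalg.inv(S)
    x_new = A @ (x + K @ (y - C @ x))
    # Time update of the filtered covariance.
    P_new = A @ (P_adj - K @ C @ P_adj) @ A.T + Q
    return x_new, P_new
```

A larger theta inflates the propagated covariance, which is how the filter hedges against model mismatch at that time step.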
Robust Kalman Filtering under Model Perturbations
We consider a family of divergence-based minimax approaches to perform robust
filtering. The mismodeling budget, or tolerance, is specified at each time
increment of the model. More precisely, all possible model increments belong to
a ball which is formed by placing a bound on the Tau-divergence family between
the actual and the nominal model increment. Then, the robust filter is obtained
by minimizing the mean square error according to the least favorable model in
that ball. It turns out that the solution is a family of Kalman-like filters.
Their gain matrices are updated through a risk-sensitive-like iteration in which
the risk-sensitivity parameter is now time-varying. As a consequence, we also
extend the risk-sensitive filter to a family of risk-sensitive-like filters
based on the Tau-divergence family.
Model Predictive Control meets robust Kalman filtering
Model Predictive Control (MPC) is the principal control technique used in
industrial applications. Although it offers distinctive qualities that make
it ideal for such applications, its robustness to model uncertainties and
external noise can be questioned. In this paper we propose a
robust MPC controller that merges the simplicity in the design of MPC with
added robustness. In particular, our control system stems from the idea of
adding robustness in the prediction phase of the algorithm through a recently
introduced robust Kalman filter. Notably, the overall result is an algorithm
very similar to classic MPC, but one that also lets the user tune the
robustness of the control. To test the controller's ability to deal with
modeling errors, we consider a servomechanism system characterized by
nonlinear dynamics.
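The receding-horizon structure behind MPC can be illustrated with a minimal unconstrained sketch (plain finite-horizon LQ, not the robust controller of the paper); the system matrices are hypothetical, and the state fed back would be the robust filter's estimate.

```python
import numpy as np

def mpc_gain(A, B, Qc, Rc, horizon):
    """First-step feedback gain of an unconstrained finite-horizon MPC
    (receding-horizon LQ), computed by a backward Riccati recursion.

    The control applied at each time is u_t = -K @ x_hat_t, where x_hat_t
    comes from the state estimator (in the paper, a robust Kalman filter).
    Requires horizon >= 1.
    """
    P = Qc.copy()
    K = None
    for _ in range(horizon):
        # Backward LQ recursion: gain and cost-to-go update.
        K = np.linalg.inv(Rc + B.T @ P @ B) @ (B.T @ P @ A)
        P = Qc + A.T @ P @ (A - B @ K)
    return K
```

In the robust variant sketched in the abstract, only the prediction (the estimate x_hat_t) changes; the optimization layer stays the same, which is why the overall algorithm remains close to classic MPC.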
Robust Kalman Filtering: Asymptotic Analysis of the Least Favorable Model
We consider a robust filtering problem where the robust filter is designed
according to the least favorable model belonging to a ball about the nominal
model. In this approach, the ball radius specifies the modeling error tolerance
and the least favorable model is computed by performing a Riccati-like backward
recursion. We show that this recursion converges provided that the tolerance is
sufficiently small.
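The convergence of a Riccati-like recursion can be checked numerically. The sketch below uses the standard forward filter Riccati map as a stand-in (the paper's backward, least-favorable-model recursion is not reproduced here), with hypothetical model matrices.

```python
import numpy as np

def riccati_map(P, A, C, Q, R):
    """One step of the standard filter Riccati recursion."""
    S = C @ P @ C.T + R
    K = A @ P @ C.T @ np.linalg.inv(S)
    return A @ P @ A.T - K @ C @ P @ A.T + Q

def iterate_to_fixed_point(P0, A, C, Q, R, tol=1e-10, max_iter=10_000):
    """Iterate the Riccati map until successive iterates agree to tol."""
    P = P0
    for _ in range(max_iter):
        P_next = riccati_map(P, A, C, Q, R)
        if np.max(np.abs(P_next - P)) < tol:
            return P_next
        P = P_next
    raise RuntimeError("Riccati recursion did not converge")
```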
Distributionally Robust LQG control under Distributed Uncertainty
A new paradigm is proposed for the robustification of the LQG controller
against distributional uncertainties in the noise process. Our controller
optimizes the closed-loop performance in the worst possible scenario, under the
constraint that the noise distributional aberrance does not exceed a certain
threshold limiting the relative entropy pseudo-distance between the actual
noise distribution and the nominal one. The main novelty is that the bounds on the
distributional aberrance can be arbitrarily distributed along the whole
disturbance trajectory. We discuss why this can, in principle, be a substantial
advantage and provide simulation results that substantiate this principle.
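The relative entropy between two Gaussian noise distributions has a closed form, and a per-step budget can be allocated along the trajectory. The sketch below is illustrative: `split_budget` is a hypothetical allocation rule, not the paper's method.

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Relative entropy D( N(mu0, S0) || N(mu1, S1) ) in nats."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def split_budget(total, weights):
    """Distribute a total relative-entropy budget over the disturbance
    trajectory proportionally to nonnegative weights (hypothetical rule)."""
    w = np.asarray(weights, dtype=float)
    return total * w / w.sum()
```

The point of the abstract is that the per-step tolerances need not be uniform: any nonnegative allocation summing to the total budget is admissible.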
On the Robustness of the Bayes and Wiener Estimators under Model Uncertainty
This paper deals with the robust estimation problem of a signal given noisy
observations. We assume that the actual statistics of the signal and
observations belong to a ball about the nominal statistics. This ball is formed
by placing a bound on the Tau-divergence family between the actual and the
nominal statistics. Then, the robust estimator is obtained by minimizing the
mean square error according to the least favorable statistics in that ball.
Therefore, we obtain a divergence family-based minimax approach to robust
estimation. We show that, in the case where the signal and observations have no
dynamics, the Bayes estimator is the optimal solution. Moreover, in the dynamic
case, the optimal offline estimator is the noncausal Wiener filter.
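In the static (no-dynamics) Gaussian case, the Bayes estimator mentioned above is the standard conditional-mean formula. A minimal sketch, with illustrative variable names:

```python
import numpy as np

def bayes_estimate(y, mx, my, Sxy, Syy):
    """MMSE (Bayes) estimate of x given y for jointly Gaussian (x, y):
    x_hat = mx + Sxy @ Syy^{-1} @ (y - my)."""
    return mx + Sxy @ np.linalg.inv(Syy) @ (y - my)
```

Under the least-favorable statistics in the divergence ball, this same formula (evaluated at those statistics) yields the robust estimator in the static case.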
Factor analysis with finite data
Factor analysis aims to describe high-dimensional random vectors by means of
a small number of unknown common factors. In mathematical terms, it is required
to decompose the covariance matrix Sigma of the random vector as the sum of
a diagonal matrix D, accounting for the idiosyncratic noise in the data,
and a low-rank matrix L, accounting for the variance of the common factors,
in such a way that the rank of L is as small as possible, so that the number
of common factors is minimal. In practice, however, the matrix Sigma is
unknown and must be replaced by its estimate, i.e. the sample covariance, which
comes from a finite amount of data. This paper provides a strategy to account
for the uncertainty in the estimation of Sigma in the factor analysis problem.
Comment: Draft; the final version will appear in the 56th IEEE Conference on
Decision and Control, Melbourne, Australia, 201
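The decomposition Sigma = D + L can be approximated by a classical alternating scheme (a principal-factor style sketch, not the robust procedure of the paper); the initialization and iteration count are illustrative.

```python
import numpy as np

def principal_factors(Sigma, rank, n_iter=500):
    """Alternating sketch of Sigma ≈ D + L with D diagonal (idiosyncratic
    noise) and L positive semidefinite with rank(L) <= rank (common factors).
    """
    D = np.diag(np.diag(Sigma)) * 0.5  # initial idiosyncratic variances
    for _ in range(n_iter):
        # Best PSD approximation of Sigma - D with the given rank.
        w, V = np.linalg.eigh(Sigma - D)
        w = np.clip(w, 0.0, None)
        idx = np.argsort(w)[::-1][:rank]
        L = (V[:, idx] * w[idx]) @ V[:, idx].T
        # Refit the diagonal part to the residual.
        D = np.diag(np.clip(np.diag(Sigma - L), 1e-12, None))
    return D, L
```

With finite data, Sigma would be the sample covariance, which is exactly where the uncertainty discussed in the abstract enters.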
Convergence analysis of a family of robust Kalman filters based on the contraction principle
In this paper we analyze the convergence of a family of robust Kalman
filters. For each filter of this family the model uncertainty is tuned
according to the so-called tolerance parameter. Assuming that the underlying
state-space model is reachable and observable, we show that the corresponding
Riccati-like mapping is strictly contractive provided that the tolerance is
sufficiently small; accordingly, the filter converges.
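Contractivity can be probed numerically on a pair of test points. The sketch below uses one common form of the robust (risk-adjusted) Riccati map and measures the ratio of distances in the Frobenius norm; the paper's contraction argument uses its own metric, so this is only an illustration with hypothetical matrices.

```python
import numpy as np

def robust_riccati_map(P, A, C, Q, R, theta):
    """Riccati-like map of the robust filter family: the standard filter
    update applied to the risk-adjusted covariance (P^{-1} - theta I)^{-1}.
    Requires P^{-1} - theta*I to be positive definite."""
    n = P.shape[0]
    P_adj = np.linalg.inv(np.linalg.inv(P) - theta * np.eye(n))
    S = C @ P_adj @ C.T + R
    K = A @ P_adj @ C.T @ np.linalg.inv(S)
    return A @ P_adj @ A.T - K @ C @ P_adj @ A.T + Q

def contraction_ratio(P1, P2, A, C, Q, R, theta):
    """Frobenius-norm distance ratio ||f(P1)-f(P2)|| / ||P1-P2||."""
    num = np.linalg.norm(robust_riccati_map(P1, A, C, Q, R, theta)
                         - robust_riccati_map(P2, A, C, Q, R, theta))
    return num / np.linalg.norm(P1 - P2)
```

A ratio below one on all pairs of interest is the numerical signature of the contraction that guarantees convergence for small tolerances.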
Factor Models with Real Data: a Robust Estimation of the Number of Factors
Factor models are a very efficient way to describe high dimensional vectors
of data in terms of a small number of common relevant factors. This problem,
which is of fundamental importance in many disciplines, is usually reformulated
in mathematical terms as follows. We are given the covariance matrix Sigma of
the available data. Sigma must be additively decomposed as the sum of two
positive semidefinite matrices D and L: D, which accounts for the idiosyncratic
noise affecting the knowledge of each component of the available vector of data,
must be diagonal, and L must have the smallest possible rank in order to
describe the available data in terms of the smallest possible number of
independent factors.
In practice, however, the matrix Sigma is never known and therefore it must
be estimated from the data so that only an approximation of Sigma is actually
available. This paper discusses the issues that arise from this uncertainty and
provides a strategy to deal with the problem of robustly estimating the number
of factors.
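A naive baseline for the number of factors is to count the eigenvalues of the sample covariance that exceed the presumed noise level by some robustness margin. The sketch below is purely illustrative; the paper's robust criterion is different.

```python
import numpy as np

def estimate_num_factors(Sigma_hat, noise_floor, margin=0.0):
    """Count eigenvalues of the sample covariance Sigma_hat exceeding the
    presumed idiosyncratic-noise level by a robustness margin.

    Naive baseline: with finite data the small eigenvalues fluctuate, so
    the margin hedges against overestimating the number of factors."""
    w = np.linalg.eigvalsh(Sigma_hat)
    return int(np.sum(w > noise_floor + margin))
```

Choosing the margin is exactly the point where the finite-data uncertainty discussed in the abstract matters: too small a margin counts noise fluctuations as factors.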