12 research outputs found

    A Contraction Analysis of the Convergence of Risk-Sensitive Filters

    A contraction analysis of risk-sensitive Riccati equations is proposed. When the state-space model is reachable and observable, a block-update implementation of the risk-sensitive filter is used to show that the N-fold composition of the Riccati map is strictly contractive with respect to the Riemannian metric of positive definite matrices when N is larger than the number of states. The range of values of the risk-sensitivity parameter for which the map remains contractive can be estimated a priori. It is also found that a second condition must be imposed on the risk-sensitivity parameter and on the initial error variance to ensure that the solution of the risk-sensitive Riccati equation remains positive definite at all times. The two conditions obtained can be viewed as extending to the multivariable case an earlier analysis of Whittle for the scalar case. (Comment: 22 pages, 6 figures)
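
    As a rough illustration of the kind of contraction statement above, the following Python sketch composes a simplified risk-sensitive Riccati map N times and compares the Riemannian distance between two trajectories of error covariances. The update P -> A(P^-1 + C'R^-1 C - theta*I)^-1 A' + Q, the system matrices, and the value of theta are illustrative assumptions, not values taken from the paper.

        import numpy as np
        from scipy.linalg import eigvalsh

        def riemannian_distance(P1, P2):
            # d(P1, P2) = ||log(P1^{-1/2} P2 P1^{-1/2})||_F, computed from the
            # generalized eigenvalues of the pencil (P2, P1).
            lam = eigvalsh(P2, P1)
            return np.sqrt(np.sum(np.log(lam) ** 2))

        def risk_sensitive_riccati_step(P, A, C, Q, R, theta):
            # Simplified risk-sensitive Riccati update (identity cost weighting):
            # P+ = A (P^{-1} + C' R^{-1} C - theta*I)^{-1} A' + Q
            n = P.shape[0]
            M = np.linalg.inv(P) + C.T @ np.linalg.solve(R, C) - theta * np.eye(n)
            return A @ np.linalg.inv(M) @ A.T + Q

        # Illustrative reachable/observable 2-state model and risk parameter.
        A = np.array([[0.9, 0.1], [0.0, 0.8]])
        C = np.array([[1.0, 0.0]])
        Q = 0.1 * np.eye(2)
        R = np.array([[1.0]])
        theta = 0.05

        P1, P2 = np.eye(2), 3.0 * np.eye(2)
        d0 = riemannian_distance(P1, P2)
        for _ in range(3):  # N = 3 > number of states (2)
            P1 = risk_sensitive_riccati_step(P1, A, C, Q, R, theta)
            P2 = risk_sensitive_riccati_step(P2, A, C, Q, R, theta)
        print("distance before:", d0, "after N steps:", riemannian_distance(P1, P2))

    For a value of theta that is too large, the inner matrix M can lose positive definiteness, which is the second condition the paper places on the risk-sensitivity parameter and the initial error variance.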

    Model Predictive Control meets robust Kalman filtering

    Model Predictive Control (MPC) is the principal control technique used in industrial applications. Although it offers distinctive qualities that make it ideal for such applications, its robustness to model uncertainties and external noises can be questioned. In this paper we propose a robust MPC controller that merges the design simplicity of MPC with added robustness. In particular, our control system stems from the idea of adding robustness in the prediction phase of the algorithm through a recently introduced robust Kalman filter. Notably, the overall result is an algorithm very similar to classic MPC that also lets the user tune the robustness of the control. To test the ability of the controller to deal with modeling errors, we consider a servomechanism system characterized by nonlinear dynamics.
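
    The closed-loop structure described here, a robust filter feeding a receding-horizon controller, can be pictured with the following schematic Python sketch. It assumes a linear model, uses an unconstrained finite-horizon LQ controller in place of full MPC, and replaces the paper's robust Kalman filter by a crude covariance-inflation step governed by a tolerance c; none of the matrices or the inflation rule come from the paper.

        import numpy as np

        def receding_horizon_gain(A, B, Qc, Rc, horizon):
            # Unconstrained finite-horizon LQ controller via backward Riccati recursion;
            # applying only the first gain at each step mimics the MPC receding horizon.
            P = Qc
            for _ in range(horizon):
                K = np.linalg.solve(Rc + B.T @ P @ B, B.T @ P @ A)
                P = Qc + A.T @ P @ (A - B @ K)
            return K

        def robust_predictor_step(xh, V, u, y, A, B, C, Qn, Rn, c):
            # One predictor step: standard Kalman prediction/update followed by an
            # inflation of the error covariance by the tolerance c (a crude stand-in
            # for the least-favorable-model correction of the robust filter).
            S = C @ V @ C.T + Rn
            G = A @ V @ C.T @ np.linalg.inv(S)
            xh = A @ xh + B @ u + G @ (y - C @ xh)
            V = A @ V @ A.T - G @ S @ G.T + Qn
            return xh, V + c * np.eye(V.shape[0])

        # Illustrative 2-state linear model standing in for the servomechanism.
        A = np.array([[1.0, 0.1], [0.0, 0.9]]); B = np.array([[0.0], [0.1]])
        C = np.array([[1.0, 0.0]]); Qn = 0.01 * np.eye(2); Rn = np.array([[0.05]])
        Qc, Rc, c = np.eye(2), np.array([[0.1]]), 0.02

        K = receding_horizon_gain(A, B, Qc, Rc, horizon=20)
        rng = np.random.default_rng(0)
        x, xh, V = np.array([1.0, 0.0]), np.zeros(2), np.eye(2)
        for t in range(50):
            u = -K @ xh                     # control computed from the robust estimate
            y = C @ x + rng.normal(scale=0.05, size=1)
            xh, V = robust_predictor_step(xh, V, u, y, A, B, C, Qn, Rn, c)
            x = A @ x + (B @ u).ravel() + rng.multivariate_normal(np.zeros(2), Qn)
        print("final state:", x, "final estimate:", xh)

    The point of the sketch is only the architecture: the controller itself is unchanged, and all of the added robustness enters through the state predictor.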

    Robust Kalman Filtering: Asymptotic Analysis of the Least Favorable Model

    We consider a robust filtering problem in which the robust filter is designed according to the least favorable model belonging to a ball about the nominal model. In this approach, the ball radius specifies the modeling error tolerance, and the least favorable model is computed by performing a Riccati-like backward recursion. We show that this recursion converges provided that the tolerance is sufficiently small.
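
    Convergence of a recursion of this type can be probed numerically by iterating the map until successive covariance iterates stop changing. The Python sketch below is generic: the map used for the demonstration is the ordinary Kalman predictor Riccati map, standing in for the paper's least favorable (tolerance-dependent) recursion, and the matrices are illustrative.

        import numpy as np

        def iterate_to_fixed_point(step, P0, tol=1e-10, max_iter=10_000):
            # Iterate P <- step(P) and report whether the iterates settle down.
            P = P0
            for k in range(max_iter):
                P_next = step(P)
                if np.linalg.norm(P_next - P, ord="fro") < tol:
                    return P_next, k + 1
                P = P_next
            raise RuntimeError("recursion did not converge within max_iter steps")

        # Stand-in map: ordinary Kalman predictor Riccati update for an illustrative model.
        A = np.array([[0.95, 0.1], [0.0, 0.9]])
        C = np.array([[1.0, 0.0]])
        Q = 0.05 * np.eye(2)
        R = np.array([[0.5]])

        def kalman_riccati(P):
            S = C @ P @ C.T + R
            G = A @ P @ C.T @ np.linalg.inv(S)
            return A @ P @ A.T - G @ S @ G.T + Q

        P_fix, iters = iterate_to_fixed_point(kalman_riccati, np.eye(2))
        print("converged in", iters, "iterations to\n", P_fix)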

    Robust Kalman Filtering under Model Perturbations

    We consider a family of divergence-based minimax approaches to robust filtering. The mismodeling budget, or tolerance, is specified at each time increment of the model. More precisely, all possible model increments belong to a ball formed by placing a bound on the Tau-divergence family between the actual and the nominal model increment. The robust filter is then obtained by minimizing the mean square error according to the least favorable model in that ball. It turns out that the solution is a family of Kalman-like filters. Their gain matrix is updated according to a risk-sensitive-like iteration in which the risk-sensitivity parameter is now time-varying. As a consequence, we also extend the risk-sensitive filter to a family of risk-sensitive-like filters according to the Tau-divergence family.
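
    The time-varying risk-sensitivity parameter can be pictured as follows: at each step the predicted error covariance is inflated in risk-sensitive fashion, with the inflation parameter chosen by bisection so that a divergence budget equals the tolerance. The Python sketch below is only schematic; it uses the Gaussian Kullback-Leibler divergence as a stand-in for the Tau-divergence family, and the model matrices and tolerance are illustrative.

        import numpy as np

        def gauss_kl(V, P):
            # KL divergence between N(0, V) and N(0, P); used here as a stand-in budget.
            n = P.shape[0]
            Pinv_V = np.linalg.solve(P, V)
            return 0.5 * (np.trace(Pinv_V) - n - np.log(np.linalg.det(Pinv_V)))

        def inflate(P, theta):
            # Risk-sensitive-like inflation V = (P^{-1} - theta*I)^{-1},
            # valid for theta < 1 / lambda_max(P).
            return np.linalg.inv(np.linalg.inv(P) - theta * np.eye(P.shape[0]))

        def theta_for_tolerance(P, c, iters=60):
            # Bisection for the time-varying parameter theta_t such that the
            # divergence budget of the inflated covariance equals the tolerance c.
            lo, hi = 0.0, 0.999 / np.max(np.linalg.eigvalsh(P))
            for _ in range(iters):
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if gauss_kl(inflate(P, mid), P) < c else (lo, mid)
            return 0.5 * (lo + hi)

        def robust_step(xh, V, y, A, C, Q, R, c):
            # Kalman-like predictor update followed by the tolerance-dependent inflation.
            S = C @ V @ C.T + R
            G = A @ V @ C.T @ np.linalg.inv(S)
            xh = A @ xh + G @ (y - C @ xh)
            P = A @ V @ A.T - G @ S @ G.T + Q
            return xh, inflate(P, theta_for_tolerance(P, c))

        # Illustrative model and tolerance.
        A = np.array([[0.9, 0.2], [0.0, 0.8]]); C = np.array([[1.0, 0.0]])
        Q = 0.05 * np.eye(2); R = np.array([[0.2]]); c = 0.05
        xh, V = np.zeros(2), np.eye(2)
        for y in ([0.3], [0.1], [-0.2]):
            xh, V = robust_step(xh, V, np.array(y), A, C, Q, R, c)
        print("estimate:", xh, "\nrobust covariance:\n", V)

    Setting the tolerance c to zero removes the inflation and recovers the ordinary Kalman predictor, which is the sense in which the robust filters form a family indexed by the tolerance.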

    Convergence analysis of a family of robust Kalman filters based on the contraction principle

    In this paper we analyze the convergence of a family of robust Kalman filters. For each filter of this family, the model uncertainty is tuned according to a so-called tolerance parameter. Assuming that the corresponding state-space model is reachable and observable, we show that the corresponding Riccati-like mapping is strictly contractive provided that the tolerance is sufficiently small; accordingly, the filter converges.

    On the Robustness of the Bayes and Wiener Estimators under Model Uncertainty

    This paper deals with the problem of robustly estimating a signal from noisy observations. We assume that the actual statistics of the signal and observations belong to a ball about the nominal statistics. This ball is formed by placing a bound on the Tau-divergence family between the actual and the nominal statistics. The robust estimator is then obtained by minimizing the mean square error according to the least favorable statistics in that ball. We therefore obtain a divergence-family-based minimax approach to robust estimation. We show that, in the case where the signal and observations have no dynamics, the Bayes estimator is the optimal solution. Moreover, in the dynamic case, the optimal offline estimator is the noncausal Wiener filter.
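
    In the static (no-dynamics) case, the Bayes estimator referred to above reduces, for jointly Gaussian variables, to the familiar conditional-mean / linear MMSE formula. The short Python sketch below shows only that nominal estimator with made-up covariances; the least-favorable-statistics construction of the paper is not reproduced.

        import numpy as np

        # Illustrative jointly Gaussian (x, y) with zero mean: y = H x + v.
        H = np.array([[1.0, 0.0], [1.0, 1.0]])
        Sigma_x = np.array([[1.0, 0.3], [0.3, 0.5]])   # signal covariance
        Sigma_v = 0.1 * np.eye(2)                      # observation-noise covariance

        Sigma_xy = Sigma_x @ H.T
        Sigma_y = H @ Sigma_x @ H.T + Sigma_v

        # Bayes (conditional-mean) estimator in the Gaussian case:
        # x_hat = Sigma_xy Sigma_y^{-1} y, with posterior error covariance
        # Sigma_x - Sigma_xy Sigma_y^{-1} Sigma_yx.
        K = Sigma_xy @ np.linalg.inv(Sigma_y)
        P_post = Sigma_x - K @ Sigma_xy.T

        y = np.array([0.8, 1.1])          # a hypothetical observation
        x_hat = K @ y
        print("estimate:", x_hat)
        print("posterior error covariance:\n", P_post)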