Probabilistic Methods for Model Validation
This dissertation develops a probabilistic method for validation and verification (V&V) of uncertain nonlinear systems. The existing systems-control literature on model and controller V&V either deals with linear systems with norm-bounded uncertainties, or considers nonlinear systems in set-based and moment-based frameworks. These existing methods address model invalidation or falsification, rather than assessing the quality of a model with respect to measured data. In this dissertation, an axiomatic framework for model validation is proposed in a probabilistically relaxed sense that,
instead of simply invalidating a model, seeks to quantify the "degree of validation".
To develop this framework, novel algorithms for uncertainty propagation are proposed for both deterministic and stochastic nonlinear systems in continuous time. For the deterministic flow, we compute the time-varying joint probability density functions over the state space by solving the Liouville equation via the method of characteristics. For the stochastic flow, we propose an approximation algorithm that combines the method-of-characteristics solution of the Liouville equation with the Karhunen-Loève expansion of the process noise, thus enabling an indirect solution of the
Fokker-Planck equation governing the evolution of the joint probability density functions. The efficacy of these algorithms is demonstrated for risk assessment in Mars entry-descent-landing, and for nonlinear estimation. Next, the V&V problem is formulated in terms of Monge-Kantorovich optimal transport, naturally giving rise to a metric on the space of probability densities, called the Wasserstein metric. It is shown that the resulting computation reduces to solving a linear program at each time of measurement availability, and computational complexity results for the same are derived. Probabilistic guarantees, in the average and worst-case sense, are given for the validation oracle resulting from the proposed method. The framework is demonstrated for nonlinear robustness verification of F-16 flight controllers, subject to probabilistic uncertainties.
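The method-of-characteristics solution of the Liouville equation mentioned above can be sketched in a few lines: along each trajectory of the deterministic flow, the log-density evolves by the negative divergence of the vector field. The following is a minimal illustration for a hypothetical scalar system; the dynamics `f`, the grid of start points, and the initial Gaussian density are assumptions for illustration, not taken from the dissertation:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical scalar nonlinear flow x' = f(x); div f is just f'(x) here.
def f(x):
    return -x**3 + x

def div_f(x):
    return -3.0 * x**2 + 1.0

def characteristic(t_span, x0, rho0):
    """Propagate one Liouville characteristic:
    d/dt [x, log rho] = [f(x), -div f(x)]."""
    def rhs(t, y):
        x, _ = y
        return [f(x), -div_f(x)]
    sol = solve_ivp(rhs, t_span, [x0, np.log(rho0)], rtol=1e-8, atol=1e-10)
    x_T, log_rho_T = sol.y[:, -1]
    return x_T, np.exp(log_rho_T)

# Standard-normal initial density, sampled at a handful of start points.
xs0 = np.linspace(-2.0, 2.0, 9)
rho0 = np.exp(-xs0**2 / 2.0) / np.sqrt(2.0 * np.pi)
endpoints = [characteristic((0.0, 1.0), x, r) for x, r in zip(xs0, rho0)]
```

Because the density is transported along trajectories, no spatial grid for the PDE is needed; each characteristic is an independent ODE solve, which is what makes the approach attractive for high-dimensional flows.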
Frequency domain interpretations for the proposed framework are derived for
linear systems, and its connections with existing nonlinear model validation methods
are pointed out. In particular, we show that the asymptotic Wasserstein gap between
two single-output linear time invariant systems excited by Gaussian white noise,
is the difference between their average gains, up to a scaling by the strength of
the input noise. A geometric interpretation of this result allows us to propose an
intrinsic normalization of the Wasserstein gap, which in turn allows us to compare it
with classical systems-theoretic metrics like the ν-gap. Next, it is shown that the optimal
transport map can be used to automatically refine the model. This model refinement
formulation leads to solving a non-smooth convex optimization problem. Examples
are given to demonstrate how proximal operator splitting based computation enables
numerically solving the same. This method is applied for finite-time feedback control
of probability density functions, and for data-driven modeling of dynamical systems.
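For discrete densities, the Monge-Kantorovich optimal transport problem underlying the Wasserstein metric is exactly a linear program. A minimal sketch using `scipy.optimize.linprog` follows; the discrete supports and weights are illustrative assumptions, and the dissertation's actual formulation at measurement times may differ:

```python
import numpy as np
from scipy.optimize import linprog

def wasserstein_lp(x, a, y, b, p=1):
    """Order-p Wasserstein distance between discrete densities a (on
    points x) and b (on points y), posed as a transport linear program:
    minimize <C, P> subject to row sums a, column sums b, P >= 0."""
    C = np.abs(x[:, None] - y[None, :]) ** p   # ground cost |x_i - y_j|^p
    n, m = C.shape
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):                         # row-sum (marginal a) constraints
        A_eq[i, i * m:(i + 1) * m] = 1.0
    for j in range(m):                         # column-sum (marginal b) constraints
        A_eq[n + j, j::m] = 1.0
    b_eq = np.concatenate([a, b])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.fun ** (1.0 / p)

# Unit point masses at 0 and 1: the transport cost is exactly 1.
d = wasserstein_lp(np.array([0.0]), np.array([1.0]),
                   np.array([1.0]), np.array([1.0]))
```

The LP has nm variables and n + m equality constraints, which is consistent with the abstract's remark that a linear program must be solved at each measurement time.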
Closed-Loop Statistical Verification of Stochastic Nonlinear Systems Subject to Parametric Uncertainties
This paper proposes a statistical verification framework using Gaussian
processes (GPs) for simulation-based verification of stochastic nonlinear
systems with parametric uncertainties. Given a small number of stochastic
simulations, the proposed framework constructs a GP regression model and
predicts the system's performance over the entire set of possible
uncertainties. Included in the framework is a new metric to estimate the
confidence in those predictions based on the variance of the GP's cumulative
distribution function. This variance-based metric forms the basis of active
sampling algorithms that aim to minimize prediction error through careful
selection of simulations. In three case studies, the new active sampling
algorithms demonstrate up to a 35% improvement in prediction error over other
approaches and are able to correctly identify regions with low prediction
confidence through the variance metric.

Comment: 8 pages, submitted to ACC 201
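A stripped-down version of this GP-based verification loop can be sketched with `scikit-learn`: fit a GP to a few simulated performance samples, then repeatedly run the simulation at the candidate uncertainty where the predictive standard deviation is largest. The toy `simulate` function, kernel, and noise level below are assumptions for illustration, and the selection criterion here is plain predictive variance rather than the paper's CDF-variance metric:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical noisy performance measure over a 1-D uncertain parameter.
def simulate(theta):
    return np.sin(3.0 * theta) + 0.05 * rng.standard_normal(theta.shape)

theta_pool = np.linspace(0.0, 2.0, 200)[:, None]   # candidate uncertainties
train = theta_pool[rng.choice(200, size=5, replace=False)]

for _ in range(10):                                # active-sampling loop
    gp = GaussianProcessRegressor(kernel=RBF(0.3), alpha=1e-3).fit(
        train, simulate(train).ravel())
    mean, std = gp.predict(theta_pool, return_std=True)
    next_pt = theta_pool[np.argmax(std)]           # most uncertain prediction
    train = np.vstack([train, next_pt])            # run one more simulation there
```

Each iteration costs one additional simulation, and the GP's posterior `mean`/`std` provide the performance prediction and confidence estimate over the whole uncertainty set.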
A Statistical Learning Theory Approach for Uncertain Linear and Bilinear Matrix Inequalities
In this paper, we consider the problem of minimizing a linear functional
subject to uncertain linear and bilinear matrix inequalities, which depend in a
possibly nonlinear way on a vector of uncertain parameters. Motivated by recent
results in statistical learning theory, we show that probabilistic guaranteed
solutions can be obtained by means of randomized algorithms. In particular, we
show that the Vapnik-Chervonenkis dimension (VC-dimension) of the two problems
is finite, and we compute upper bounds on it. In turn, these bounds allow us to
derive explicitly the sample complexity of these problems. Using these bounds,
in the second part of the paper, we derive a sequential scheme, based on a
sequence of optimization and validation steps. The algorithm is on the same
lines of recent schemes proposed for similar problems, but improves both in
terms of complexity and generality. The effectiveness of this approach is shown
using a linear model of a robot manipulator subject to uncertain parameters.

Comment: 19 pages, 2 figures, Accepted for Publication in Automatic
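The randomized optimization-and-validation idea can be illustrated as follows: a VC-type bound fixes how many random uncertainty samples to draw, and a validation step checks the matrix inequality at each sample. The constants in the bound below are generic textbook ones, not the paper's derived expression, and the Hurwitz-type inequality and uncertainty set are made-up examples:

```python
import numpy as np

def vc_sample_bound(eps, delta, d):
    """Generic VC-type sample-complexity bound (illustrative constants):
    number of random samples sufficient for accuracy eps and confidence
    1 - delta when the problem class has VC-dimension d."""
    return int(np.ceil((4.0 / eps) * (d * np.log(12.0 / eps)
                                      + np.log(2.0 / delta))))

def randomized_lmi_check(A_nominal, n_samples, rng):
    """Randomized validation step: draw uncertain parameters and check
    the (here: Hurwitz-type) matrix inequality A(q) + A(q)^T < 0."""
    for _ in range(n_samples):
        q = rng.uniform(-0.1, 0.1)                    # uncertain parameter
        A = A_nominal + q * np.eye(A_nominal.shape[0])
        if np.max(np.linalg.eigvalsh(A + A.T)) >= 0:  # violated sample
            return False
    return True

rng = np.random.default_rng(1)
N = vc_sample_bound(eps=0.05, delta=1e-3, d=10)
ok = randomized_lmi_check(np.diag([-1.0, -2.0]), N, rng)
```

Note that the bound grows only logarithmically in 1/delta, which is what makes high-confidence randomized validation cheap relative to worst-case analysis.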
Active Sampling-based Binary Verification of Dynamical Systems
Nonlinear, adaptive, or otherwise complex control techniques are increasingly
relied upon to ensure the safety of systems operating in uncertain
environments. However, the nonlinearity of the resulting closed-loop system
complicates verification that the system does in fact satisfy those
requirements at all possible operating conditions. While analytical proof-based
techniques and finite abstractions can be used to provably verify the
closed-loop system's response at different operating conditions, they often
produce conservative approximations due to restrictive assumptions and are
difficult to construct in many applications. In contrast, popular statistical
verification techniques relax the restrictions and instead rely upon
simulations to construct statistical or probabilistic guarantees. This work
presents a data-driven statistical verification procedure that instead
constructs statistical learning models from simulated training data to separate
the set of possible perturbations into "safe" and "unsafe" subsets. Binary
evaluations of closed-loop system requirement satisfaction at various
realizations of the uncertainties are obtained through temporal logic
robustness metrics, which are then used to construct predictive models of
requirement satisfaction over the full set of possible uncertainties. As the
accuracy of these predictive statistical models is inherently coupled to the
quality of the training data, an active learning algorithm selects additional
sample points in order to maximize the expected change in the data-driven model
and thus, indirectly, minimize the prediction error. Various case studies
demonstrate the closed-loop verification procedure and highlight improvements
in prediction error over both existing analytical and statistical verification
techniques.

Comment: 23 page
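A minimal version of this data-driven procedure can be sketched with a support-vector classifier standing in for the statistical learning model: label sampled perturbations by the sign of a robustness measure, then actively query the unlabeled point nearest the current decision boundary. The toy robustness function, the SVM in place of the paper's model, and all sampling parameters are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Toy robustness measure: positive means the requirement is satisfied
# ("safe" inside the unit disk of perturbations).
def robustness(theta):
    return 1.0 - np.linalg.norm(theta, axis=1)

# Candidate perturbations; the first two guarantee one label of each class.
pool = np.vstack([[0.0, 0.0], [2.0, 2.0],
                  rng.uniform(-2.0, 2.0, size=(498, 2))])
idx = [0, 1] + [int(i) for i in rng.choice(np.arange(2, 500), 18,
                                           replace=False)]

for _ in range(30):                               # active-learning loop
    X = pool[idx]
    y = (robustness(X) > 0).astype(int)           # binary safe/unsafe label
    clf = SVC(kernel="rbf", gamma=2.0).fit(X, y)
    margin = np.abs(clf.decision_function(pool))  # distance-like score
    margin[idx] = np.inf                          # skip labeled points
    idx.append(int(np.argmin(margin)))            # query nearest the boundary

accuracy = np.mean(clf.predict(pool) == (robustness(pool) > 0))
```

Concentrating queries near the predicted safe/unsafe boundary is what drives down the prediction error per simulation, compared with sampling the uncertainty set uniformly.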
Probabilistic Inference from Arbitrary Uncertainty using Mixtures of Factorized Generalized Gaussians
This paper presents a general and efficient framework for probabilistic
inference and learning from arbitrary uncertain information. It exploits the
calculation properties of finite mixture models, conjugate families and
factorization. Both the joint probability density of the variables and the
likelihood function of the (objective or subjective) observation are
approximated by a special mixture model, in such a way that any desired
conditional distribution can be directly obtained without numerical
integration. We have developed an extended version of the expectation
maximization (EM) algorithm to estimate the parameters of mixture models from
uncertain training examples (indirect observations). As a consequence, any
piece of exact or uncertain information about both input and output values is
consistently handled in the inference and learning stages. This ability,
extremely useful in certain situations, is not found in most alternative
methods. The proposed framework is formally justified from standard
probabilistic principles and illustrative examples are provided in the fields
of nonparametric pattern classification, nonlinear regression and pattern
completion. Finally, experiments on a real application and comparative results
over standard databases provide empirical evidence of the utility of the method
in a wide range of applications.
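The key calculation property, obtaining conditional distributions without numerical integration, can be illustrated with an ordinary Gaussian mixture fitted by EM (here via `scikit-learn`, on exact rather than uncertain observations, so this sketches only the closed-form conditioning step, not the paper's extended EM for indirect observations). The data-generating relation and component count are assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Joint samples of (x, y) from a nonlinear relation y = x^2 + noise.
x = rng.uniform(-2.0, 2.0, 1000)
y = x**2 + 0.1 * rng.standard_normal(1000)
gmm = GaussianMixture(n_components=5, random_state=0).fit(
    np.column_stack([x, y]))

def conditional_mean(x0):
    """E[y | x = x0] in closed form from the fitted mixture: a
    responsibility-weighted sum of per-component Gaussian conditional
    means, with no numerical integration anywhere."""
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    # Responsibility of each component given x0, from its x-marginal.
    px = w * np.exp(-0.5 * (x0 - means[:, 0])**2 / covs[:, 0, 0]) \
           / np.sqrt(2.0 * np.pi * covs[:, 0, 0])
    r = px / px.sum()
    # Per-component conditional mean of y given x0 (Gaussian formula).
    cond = means[:, 1] + covs[:, 0, 1] / covs[:, 0, 0] * (x0 - means[:, 0])
    return float(r @ cond)
```

Because conjugacy makes every conditional of a Gaussian mixture another Gaussian mixture, the same mechanics cover pattern completion and regression in either direction, which is the flexibility the abstract emphasizes.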