Low-Complexity Quantized Switching Controllers using Approximate Bisimulation
In this paper, we consider the problem of synthesizing low-complexity
controllers for incrementally stable switched systems. For that purpose, we
establish a new approximation result for the computation of symbolic models
that are approximately bisimilar to a given switched system. The main advantage
over existing results is that it allows us to design naturally quantized
switching controllers for safety or reachability specifications; these can be
pre-computed offline, reducing online execution time. Then,
we present a technique to reduce the memory needed to store the control law by
borrowing ideas from algebraic decision diagrams for compact function
representation and by exploiting the non-determinism of the synthesized
controllers. We show the merits of our approach by applying it to a simple
model of temperature regulation in a building.
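The memory-reduction idea, picking one mode per state from the set that the non-deterministic controller allows so that the stored law collapses into long constant runs, can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: the greedy tie-break, the interval encoding, and the example control law are all assumptions made for the sketch.

```python
# Hypothetical sketch: a quantized controller maps each grid cell to a SET of
# admissible switching modes (non-determinism). We greedily keep the previous
# cell's mode whenever it is still admissible, so the resolved law compresses
# into a short list of (start_index, mode) intervals instead of one entry per
# state. This stands in for the decision-diagram sharing used in the paper.

def compress(law):
    """law: list of sets of admissible modes, indexed by quantized state.
    Returns a list of (start_index, mode) intervals covering all states."""
    intervals = []
    prev = None
    for i, modes in enumerate(law):
        if prev in modes:
            continue  # previous mode still admissible: current interval extends
        prev = min(modes)  # deterministic tie-break among admissible modes
        intervals.append((i, prev))
    return intervals

# Toy control law over six quantized states; three intervals suffice.
law = [{0, 1}, {1}, {1, 2}, {2}, {1, 2}, {0, 2}]
print(compress(law))  # -> [(0, 0), (1, 1), (3, 2)]
```

The more non-determinism the synthesized controller leaves, the more freedom the compression has to merge neighbouring states into one interval.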
Robust Control for Dynamical Systems With Non-Gaussian Noise via Formal Abstractions
Controllers for dynamical systems that operate in safety-critical settings
must account for stochastic disturbances. Such disturbances are often modeled
as process noise in a dynamical system, and common assumptions are that the
underlying distributions are known and/or Gaussian. In practice, however, these
assumptions may be unrealistic and can lead to poor approximations of the true
noise distribution. We present a novel controller synthesis method that does
not rely on any explicit representation of the noise distributions. In
particular, we address the problem of computing a controller that provides
probabilistic guarantees on safely reaching a target, while also avoiding
unsafe regions of the state space. First, we abstract the continuous control
system into a finite-state model that captures noise by probabilistic
transitions between discrete states. As a key contribution, we adapt tools from
the scenario approach to compute probably approximately correct (PAC) bounds on
these transition probabilities, based on a finite number of samples of the
noise. We capture these bounds in the transition probability intervals of a
so-called interval Markov decision process (iMDP). This iMDP is, with a
user-specified confidence probability, robust against uncertainty in the
transition probabilities, and the tightness of the probability intervals can be
controlled through the number of samples. We use state-of-the-art verification
techniques to provide guarantees on the iMDP and compute a controller for which
these guarantees carry over to the original control system. In addition, we
develop a tailored computational scheme that reduces the complexity of the
synthesis of these guarantees on the iMDP. Benchmarks on realistic control
systems show the practical applicability of our method, even when the iMDP has
hundreds of millions of transitions.

Comment: To appear in the Journal of Artificial Intelligence Research (JAIR).
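The core step of bounding transition probabilities from finitely many noise samples can be sketched as below. Note the hedge: the paper derives its PAC intervals from the scenario approach, whereas this sketch substitutes a simple Hoeffding bound; the region layout and sample values are invented for illustration.

```python
import math

def transition_intervals(samples, regions, delta=0.01):
    """samples: sampled successor states (floats) under the process noise;
    regions: list of (low, high) discrete cells of the abstraction.
    Returns probability intervals [lower, upper] per region that contain the
    true transition probability with confidence 1 - delta each, via
    Hoeffding's inequality (a stand-in for the scenario-approach bounds)."""
    n = len(samples)
    eps = math.sqrt(math.log(2.0 / delta) / (2.0 * n))  # interval half-width
    out = []
    for lo, hi in regions:
        freq = sum(lo <= x < hi for x in samples) / n  # empirical frequency
        out.append((max(0.0, freq - eps), min(1.0, freq + eps)))
    return out

# Toy usage: five sampled successors, two discrete cells.
print(transition_intervals([0.08, 0.12, 0.55, 0.61, 0.73],
                           [(0.0, 0.5), (0.5, 1.0)]))
```

As in the abstract, the intervals tighten as the number of samples grows (here eps shrinks like 1/sqrt(n)), and the resulting bounds are exactly what an interval MDP's transition ranges would be populated with.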