
    Safety guarantees for iterative predictions with Gaussian Processes

    Gaussian Processes (GPs) are widely employed in control and learning because of their principled treatment of uncertainty. However, tracking uncertainty for iterative, multi-step predictions in general leads to an analytically intractable problem. While approximation methods exist, they do not come with guarantees, making it difficult to estimate their reliability and to trust their predictions. In this work, we derive formal probability error bounds for iterative predictions with GPs. Building on GP properties, we bound the probability that random trajectories lie in specific regions around the predicted values. Namely, given a tolerance ε > 0, we compute regions around the predicted trajectory values such that GP trajectories are guaranteed to lie inside them with probability at least 1 − ε. We verify experimentally that our method tracks the predictive uncertainty correctly, even when current approximation techniques fail. Furthermore, we show how the proposed bounds can incorporate a given control law, and effectively bound the trajectories of the closed-loop system.
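    The core idea of a "region containing the trajectory with probability at least 1 − ε" can be sketched generically. The snippet below is not the paper's method; it is a minimal illustration using a simple union (Bonferroni) bound over the T prediction steps, assuming the per-step predictive marginals are Gaussian with known means `mu[t]` and standard deviations `sigma[t]` (names chosen for illustration):

    ```python
    import numpy as np
    from statistics import NormalDist

    def confidence_regions(mu, sigma, eps):
        """Per-step intervals that jointly contain the trajectory with
        probability >= 1 - eps, via a union bound over the T steps.

        Assumes each step's predictive marginal is Gaussian with mean
        mu[t] and standard deviation sigma[t] (a common approximation;
        the paper's bounds do not rely on this simplification).
        """
        mu = np.asarray(mu, dtype=float)
        sigma = np.asarray(sigma, dtype=float)
        T = len(mu)
        # Split the tolerance evenly: each step gets miscoverage eps / T,
        # so the two-sided Gaussian quantile is at level 1 - eps / (2T).
        z = NormalDist().inv_cdf(1.0 - eps / (2.0 * T))
        return mu - z * sigma, mu + z * sigma

    # Toy usage: a 3-step predicted trajectory with growing uncertainty.
    lo, hi = confidence_regions([0.0, 0.5, 0.9], [0.1, 0.2, 0.3], eps=0.05)
    ```

    Note that splitting ε across steps makes each interval wider than a pointwise (1 − ε) interval; tighter joint bounds, such as those derived in the paper, avoid this conservatism.
    
    
    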