Weakest Preexpectation Semantics for Bayesian Inference
We present a semantics of a probabilistic while-language with soft
conditioning and continuous distributions that handles programs which diverge
with positive probability. To this end, we extend the probabilistic guarded command
language (pGCL) with draws from continuous distributions and a score operator.
The main contribution is an extension of the standard weakest preexpectation
semantics to support these constructs. As a sanity check of our semantics, we
define an alternative trace-based semantics of the language, and show that the
two semantics are equivalent. Various examples illustrate the applicability of
the semantics.
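To give a flavour of the trace-based view, here is a minimal illustrative sketch (not taken from the paper): a program draws from a continuous distribution and uses a score statement for soft conditioning; each run's accumulated score weights its trace, and posterior expectations are normalized weighted averages over traces. The program, the function `run_program`, and the estimator `weighted_expectation` are hypothetical names introduced for illustration only.

```python
import random

# Sketch program: x ~ Uniform(0, 1); score(x)
# score(x) multiplies the weight of the current trace by x,
# softly conditioning the prior toward larger values of x.

def run_program(rng):
    x = rng.random()   # continuous draw from Uniform(0, 1)
    weight = x         # score(x): accumulate trace weight
    return x, weight

def weighted_expectation(f, n=200_000, seed=0):
    """Estimate the posterior expectation of f as a
    score-weighted Monte Carlo average over traces."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        x, w = run_program(rng)
        num += w * f(x)
        den += w
    return num / den

# For f(x) = x the exact answer is E[x*x] / E[x] = (1/3) / (1/2) = 2/3.
est = weighted_expectation(lambda x: x)
```

This importance-sampling-style reading is only a sanity check for terminating programs; the paper's contribution is precisely to extend the weakest preexpectation semantics so that scores and continuous draws are handled even when the program diverges with positive probability.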