Probabilistic user modeling is essential for building machine learning
systems in the ubiquitous settings where humans are in the loop. However,
advanced user models, often designed as cognitive behavior simulators, are
incompatible with modern machine learning pipelines and computationally
prohibitive for most practical applications. We address this problem by
introducing widely-applicable differentiable surrogates for bypassing this
computational bottleneck; the surrogates enable computationally efficient
inference with modern cognitive models. We show experimentally that modeling
capabilities comparable to those of the only existing alternative,
likelihood-free inference methods, are achievable at a computational cost
suitable for online applications. Finally, we demonstrate how AI assistants can
now use cognitive models for online interaction in a menu-search task, which
has so far required hours of computation during interaction.

Comment: Accepted for publication in The 39th Conference on Uncertainty in
Artificial Intelligence (UAI) 202