In this work, we demonstrate how to reliably estimate epistemic uncertainty
while maintaining the flexibility needed to capture complicated aleatoric
distributions. To this end, we propose an ensemble of Normalizing Flows (NF),
which are state-of-the-art in modeling aleatoric uncertainty. The ensembles are
created via sets of fixed dropout masks, making them less expensive than
creating separate NF models. We demonstrate how to leverage the unique
structure of NFs, their base distributions, to estimate aleatoric uncertainty without
relying on samples, provide a comprehensive set of baselines, and derive
unbiased estimates for differential entropy. The methods were applied to a
variety of experiments, commonly used to benchmark aleatoric and epistemic
uncertainty estimation: 1D sinusoidal data, 2D windy grid-world (WetChicken), Pendulum, and Hopper. In these experiments, we set up
an active learning framework and evaluate each model's capability at measuring
aleatoric and epistemic uncertainty. The results show the advantages of using
NF ensembles to capture complicated aleatoric distributions while maintaining accurate
epistemic uncertainty estimates.
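The fixed-dropout-mask ensembling idea can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the tiny network, the mask count `K`, and the keep probability `p_keep` are all hypothetical. Each frozen binary mask over a shared set of weights defines one ensemble member, and disagreement across members' predictions serves as an epistemic-uncertainty signal, at a fraction of the cost of training separate models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny one-hidden-layer network (illustrative weights only).
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))

# Sample K dropout masks over the hidden layer ONCE and freeze them.
# Each fixed mask defines one ensemble member sharing the same weights.
K, p_keep = 5, 0.8
masks = rng.random((K, 32)) < p_keep

def member_predict(x, mask):
    """Forward pass with one fixed dropout mask (one ensemble member)."""
    h = np.tanh(x @ W1 + b1) * mask / p_keep  # inverted-dropout scaling
    return h @ W2

x = np.array([[0.5]])
preds = np.stack([member_predict(x, m) for m in masks])  # shape (K, 1, 1)

# Variance across members is a cheap epistemic-uncertainty proxy.
epistemic_var = preds.var(axis=0)
print(epistemic_var.shape)
```

In the paper's setting the shared weights parameterize a normalizing flow rather than this toy regressor, so each masked member yields a full predictive distribution instead of a point estimate.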