Uncertainty quantification remains a critical open challenge for deep learning, especially for time-series imputation with irregularly sampled measurements. To tackle this problem, we propose a novel framework that combines recurrent neural networks with neural stochastic differential equations to reconcile irregularly sampled measurements. Our framework imputes measurements at arbitrary timescales and quantifies the uncertainty in the imputations in a principled manner. Specifically, we derive analytical expressions for quantifying and propagating the epistemic and aleatoric uncertainty across time instants. Our experiments on the IEEE 37-bus test distribution system show that our framework can outperform state-of-the-art uncertainty quantification approaches for time-series imputation.

Comment: 7 pages, 3 figures
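As background for the epistemic/aleatoric split the abstract refers to, the standard decomposition follows the law of total variance: total predictive variance = mean of the per-sample noise variances (aleatoric) + variance of the per-sample means (epistemic). The sketch below is a generic Monte Carlo illustration with synthetic numbers, not the paper's analytical derivation; the ensemble of (mean, variance) predictions is a stand-in assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: S stochastic forward passes (e.g. sampled SDE paths),
# each producing a predictive mean and a predictive noise variance.
S = 200
mu = rng.normal(1.0, 0.1, size=S)         # per-sample predictive means
sigma2 = rng.uniform(0.04, 0.06, size=S)  # per-sample noise variances

# Law of total variance over the model samples:
aleatoric = sigma2.mean()   # E[sigma^2]: irreducible data noise
epistemic = mu.var()        # Var[mu]: disagreement among model samples
total = aleatoric + epistemic
```

The same decomposition applies at every imputation time instant, giving a per-timestamp uncertainty estimate rather than a single global one.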