Inspired by the success of the Transformer model in natural language
processing and computer vision, this paper introduces BERT-PIN, a Bidirectional
Encoder Representations from Transformers (BERT) powered Profile Inpainting
Network. BERT-PIN recovers multiple missing data segments (MDSs) using load and
temperature time-series profiles as inputs. To adopt a standard Transformer
model structure for profile inpainting, we segment the load and temperature
profiles into line segments, treating each segment as a word and the entire
profile as a sentence. We incorporate a top-candidate selection process in
BERT-PIN, enabling it to produce a sequence of probability distributions from
which users can generate multiple plausible imputed data sets, each
reflecting a different confidence level. We develop and evaluate BERT-PIN using
a real-world dataset for two applications: multiple-MDS recovery and demand
response baseline estimation. Simulation results show that BERT-PIN outperforms
existing methods in accuracy while being capable of restoring multiple MDSs
within a longer window. BERT-PIN, serving as a pre-trained model, can be
fine-tuned for many downstream tasks, such as classification and
super-resolution.
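
The "segment as word, profile as sentence" idea can be illustrated with a minimal sketch. The segment length, function name, and example data below are our own assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def profile_to_tokens(profile, segment_len):
    """Split a 1-D time-series profile into consecutive segments.

    Each segment plays the role of a "word" and the full sequence of
    segments the role of a "sentence" for a Transformer encoder.
    profile: 1-D array of load (or temperature) readings.
    segment_len: readings per segment (a hypothetical choice here).
    Returns an array of shape (num_segments, segment_len).
    """
    usable = len(profile) - len(profile) % segment_len  # drop any remainder
    return profile[:usable].reshape(-1, segment_len)

# Example: a 96-point daily profile (15-minute readings), 8 readings per segment
load = np.sin(np.linspace(0, 2 * np.pi, 96))
tokens = profile_to_tokens(load, segment_len=8)
print(tokens.shape)  # (12, 8): twelve "words" forming one "sentence"
```

In practice each segment would then be linearly embedded before entering the encoder, analogous to word embeddings in BERT.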