Astrophysical light curves are particularly challenging data objects due to the intensity and variety of noise contaminating them. Yet, despite
the astronomical volumes of light curves available, the majority of algorithms used to process them still operate on a per-sample
basis. To remedy this, we propose a simple
Transformer model, called the Denoising Time Series Transformer (DTST), and show that it excels
at removing noise and outliers from datasets of
time series when trained with a masked objective, even when no clean targets are available.
Moreover, the use of self-attention enables rich
and illustrative queries into the learned representations. We present experiments on real stellar light
curves from the Transiting Exoplanet Survey Satellite (TESS), showing the advantages of our approach
over traditional denoising techniques.
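
To make the masked-objective idea concrete, the following is a minimal sketch assuming a PyTorch implementation; the module, layer sizes, masking fraction, and training step are illustrative assumptions rather than the authors' configuration. It hides a random subset of samples in each noisy light curve and trains a small Transformer encoder to reconstruct them from context, so no clean targets are required.

```python
# Illustrative sketch of masked training for a Transformer time-series denoiser.
# Architecture and hyperparameters are assumptions, not the DTST reference code.
import torch
import torch.nn as nn


class MaskedSeriesTransformer(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=4, max_len=1024):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                 # per-sample flux embedding
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)                  # reconstruct flux values

    def forward(self, x):                                  # x: (batch, length)
        h = self.embed(x.unsqueeze(-1)) + self.pos[:, :x.size(1)]
        return self.head(self.encoder(h)).squeeze(-1)


def masked_step(model, x, mask_frac=0.15):
    """One self-supervised step: hide random samples, reconstruct them from
    context, and score only on the hidden positions, using the noisy
    observations themselves as targets (no clean labels required)."""
    mask = torch.rand_like(x) < mask_frac
    x_in = x.masked_fill(mask, 0.0)                        # masked samples are zeroed out
    pred = model(x_in)
    return ((pred - x)[mask] ** 2).mean()                  # loss on masked positions only


model = MaskedSeriesTransformer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
batch = torch.randn(8, 512)                                # stand-in for noisy light curves
opt.zero_grad()
loss = masked_step(model, batch)
loss.backward()
opt.step()
```

At inference, running the trained model on an unmasked series yields a reconstruction that attenuates noise and outliers, since the model has learned to predict samples from their surrounding context.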