Multioutput regression of noisy time series using convolutional neural networks with applications to gravitational waves

Abstract

In this thesis I implement a deep learning algorithm to perform multioutput regression. The dataset is a collection of one-dimensional time series arrays, corresponding to simulated gravitational waveforms emitted by a black hole binary and labelled by the masses of the two black holes. White Gaussian noise is added to the arrays to simulate signal detection in the presence of noise. A convolutional neural network is trained to infer the output labels in the presence of noise, and the resulting model generalizes over many orders of magnitude in the noise level. From the results I argue that the hidden layers of the model successfully denoise the signals before the inference step. The entire code is implemented as a Python module, and the neural network is written in PyTorch. Training of the network is sped up using a single GPU, and I report on efforts to improve the scaling of the training time with respect to the size of the training sample.
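To illustrate the kind of setup described above, the following is a minimal sketch of a 1D convolutional network for two-output regression in PyTorch, with white Gaussian noise added to the waveforms during training. The class name, layer sizes, signal length, and noise level are illustrative assumptions, not the architecture or hyperparameters used in the thesis.

    # Hypothetical sketch: 1D CNN regressing two labels (the two masses)
    # from noisy time series. All names and sizes are assumptions.
    import torch
    import torch.nn as nn

    class WaveformCNN(nn.Module):
        def __init__(self, signal_length=1024):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
                nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
            )
            # Infer the flattened feature size with a dummy forward pass
            with torch.no_grad():
                n_feat = self.features(torch.zeros(1, 1, signal_length)).numel()
            self.regressor = nn.Sequential(
                nn.Flatten(), nn.Linear(n_feat, 64), nn.ReLU(),
                nn.Linear(64, 2),  # two outputs: the two black hole masses
            )

        def forward(self, x):
            return self.regressor(self.features(x))

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = WaveformCNN().to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    def train_step(clean_batch, masses, noise_sigma=0.1):
        # Add white Gaussian noise to the clean waveforms on the fly
        noisy = clean_batch + noise_sigma * torch.randn_like(clean_batch)
        pred = model(noisy.to(device))
        loss = loss_fn(pred, masses.to(device))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

In a sketch like this, varying noise_sigma across batches is one simple way to expose the model to a range of noise levels, in the spirit of the generalization over noise levels reported in the abstract.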
