Optimizing Advanced LIGO's Scientific Output with Fast, Accurate, Clean Calibration

Abstract

Since 2015, the direct observation of gravitational waves has opened a new window to observe the universe and made strong-field tests of Einstein's general theory of relativity possible for the first time. During the first two observing runs of the advanced gravitational-wave detector network, the Laser Interferometer Gravitational-wave Observatory (LIGO) and the Virgo detector have made 10 detections of binary black hole mergers and one detection of a binary neutron star merger with a coincident gamma-ray burst. This dissertation discusses methods used in low and high latency to produce Advanced LIGO's calibrated strain data, highlighting improvements to accuracy, latency, and noise reduction that have been made since the beginning of the second observing run (O2). Systematic errors in the calibration during O2 varied by frequency, but were generally no greater than 5% in amplitude and 3 deg in phase from 20 Hz to 1 kHz. Due in part to this work, it is now possible to achieve calibration accuracy at the level of ~1% in amplitude and ~1 deg in phase, offering improvements to downstream astrophysical analyses. Since the beginning of O2, latency intrinsic to the calibration procedure has decreased from ~12 s to ~3 s. As the latency of data distribution and of automated alerts to astronomers is minimized, reducing calibration latency will become important for follow-up of events like the binary neutron star merger GW170817. A method of removing spectral lines and broadband noise in the calibration procedure has been developed since O2, offering increases in total detectable volume during future observing runs. High-latency subtraction of lines and broadband noise had a large impact on astrophysical analyses during O2. A similar data product can now be made available in low latency for the first time.
