Photon noise correlations in millimeter-wave telescopes

Abstract

Many modern millimeter and submillimeter (``mm-wave'') telescopes for astronomy are deploying more detectors by increasing detector pixel density, and with the rise of lithographed detector architectures and high-throughput readout techniques, it is becoming increasingly practical to overfill the focal plane. However, when the pixel pitch $p_{\mathrm{pix}}$ is small compared to the product of the wavelength $\lambda$ and the focal ratio $F$, or $p_{\mathrm{pix}} \lesssim 1.2 F \lambda$, the Bose term of the photon noise is correlated between neighboring detector pixels due to the Hanbury Brown & Twiss (HBT) effect. When this HBT effect is non-negligible, the array-averaged sensitivity scales with detector count $N_{\mathrm{det}}$ less favorably than the uncorrelated limit of $N_{\mathrm{det}}^{-1/2}$. In this paper, we present a general prescription to calculate this HBT correlation based on a quantum optics formalism and extend it to polarization-sensitive detectors. We then estimate the impact of HBT correlations on the sensitivity of a model mm-wave telescope and discuss the implications for focal-plane design.

Comment: 24 pages, 16 figures
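The pitch criterion quoted in the abstract can be checked directly. The sketch below is illustrative only: the function name, the default threshold factor of 1.2, and the example numbers (a 150 GHz band and $F = 1.5$ optics) are our assumptions for demonstration, not values taken from the paper.

```python
def hbt_correlated(pitch_mm: float, wavelength_mm: float,
                   f_number: float, factor: float = 1.2) -> bool:
    """Return True when the pixel pitch falls below ~1.2 F*lambda,
    i.e. the regime where the Bose (HBT) term of the photon noise
    is expected to correlate between neighboring pixels."""
    return pitch_mm < factor * f_number * wavelength_mm

# Hypothetical example: lambda = 2 mm (150 GHz), F = 1.5
# Threshold is 1.2 * 1.5 * 2.0 = 3.6 mm.
print(hbt_correlated(2.0, 2.0, 1.5))  # 2.0 mm pitch -> True (correlated regime)
print(hbt_correlated(5.0, 2.0, 1.5))  # 5.0 mm pitch -> False (uncorrelated limit)
```

In the uncorrelated limit the array-averaged sensitivity improves as $N_{\mathrm{det}}^{-1/2}$; the point of the criterion is that packing pixels below the threshold buys detector count at the cost of partially correlated noise, so the net gain is smaller than that scaling suggests.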
