Many modern millimeter and submillimeter (``mm-wave'') telescopes for
astronomy are deploying more detectors by increasing detector pixel density,
and with the rise of lithographed detector architectures and high-throughput
readout techniques, it is becoming increasingly practical to overfill the focal
plane. However, when the pixel pitch $p_\mathrm{pix}$ is small compared to the
product of the wavelength $\lambda$ and the focal ratio $F$, or
$p_\mathrm{pix} \lesssim 1.2F\lambda$, the Bose term of the photon noise
correlates between neighboring detector pixels due to the Hanbury Brown & Twiss
(HBT) effect. When this HBT effect is non-negligible, the array-averaged
sensitivity scales with detector count $N_\mathrm{det}$ less favorably than
the uncorrelated limit of $N_\mathrm{det}^{-1/2}$. In this paper, we present
a general prescription to calculate this HBT correlation based on a quantum
optics formalism and extend it to polarization-sensitive detectors. We then
estimate the impact of HBT correlations on the sensitivity of a model mm-wave
telescope and discuss the implications for focal-plane design.

Comment: 24 pages, 16 figures
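The correlation criterion quoted above can be sketched numerically. The following is an illustrative check only, with assumed example values (a 150 GHz band with $\lambda \approx 2$ mm, $F/1.6$ optics, and a 2.6 mm pixel pitch); these numbers are not taken from the paper.

```python
# Hedged sketch (not the paper's code): evaluate the HBT correlation
# criterion p_pix <~ 1.2 * F * lambda for an example mm-wave configuration.
# All numerical values below are illustrative assumptions.

def hbt_correlated(pixel_pitch_mm: float, focal_ratio: float,
                   wavelength_mm: float) -> bool:
    """Return True when the pixel pitch falls below ~1.2 F*lambda,
    i.e. when neighboring pixels are expected to share correlated
    Bose noise via the Hanbury Brown & Twiss effect."""
    return pixel_pitch_mm < 1.2 * focal_ratio * wavelength_mm

# Example: lambda ~ 2.0 mm (150 GHz), F/1.6 optics, 2.6 mm pitch.
# Threshold is 1.2 * 1.6 * 2.0 = 3.84 mm, so 2.6 mm is correlated.
print(hbt_correlated(2.6, 1.6, 2.0))  # -> True
print(hbt_correlated(5.0, 1.6, 2.0))  # -> False (pitch above threshold)
```

In this regime the example pitch overfills the focal plane, so the array-averaged sensitivity would scale less favorably than $N_\mathrm{det}^{-1/2}$, as the abstract describes.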