Atomic clocks provide a reproducible basis for our understanding of time and
frequency. Recent demonstrations of compact optical clocks, employing thermal
atomic beams, have achieved short-term fractional frequency instabilities at the
10^{-16} level, competitive with the best international frequency standards
available. However, a serious challenge inherent in compact clocks is the
necessarily smaller optical beams, which result in rapid variation of the
interrogating wavefronts. This can cause inhomogeneous excitation of the
thermal beam, leading to long-term drifts in the output frequency. Here we
develop a model for Ramsey-Bordé interferometry using optical fields with
curved wavefronts and simulate the 40Ca beam clock experiment described in
[Olson et al., Phys. Rev. Lett. 123, 073202 (2019)]. Olson et al.'s results
showed surprising and unexplained behaviour in the response of the atoms during
interrogation. Our model predicts signals consistent with experimental data and
can account for the significant sensitivity to laser geometry that was
reported. We find the signal-to-noise ratio is maximised when the laser is
uncollimated at the interrogation zones to minimise inhomogeneity, and also
identify an optimal waist size determined by both laser inhomogeneity and the
velocity distribution of the atomic beam. We investigate the shifts and
stability of the clock frequency, showing that the Gouy phase is the primary
source of frequency variations arising from laser geometry.
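
As a brief illustrative aside (not part of the abstract itself), the Gouy-phase mechanism invoked above can be summarised by the standard result for a fundamental Gaussian beam,

    \zeta(z) = \arctan\!\left(\frac{z}{z_R}\right), \qquad z_R = \frac{\pi w_0^{2}}{\lambda},

where w_0 is the beam waist and \lambda the laser wavelength. Because \zeta(z) varies most rapidly near the focus, the optical phase sampled by the atoms at each interrogation zone depends on where those zones sit relative to the waist, which is consistent with the sensitivity to laser geometry described above.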