We theoretically study the effect of loss on the phase sensitivity of an
SU(1,1) interferometer with parity detection for various input states. We show
that although the phase sensitivity degrades in the presence of loss, it can
still beat the shot-noise limit when the loss is small. To evaluate the
performance of parity detection, we compare it with homodyne detection and
intensity detection. In the absence of loss, parity detection achieves a
slightly better optimal phase sensitivity than homodyne and intensity
detection, but with a significant amount of loss it yields a worse optimal
phase sensitivity for a single coherent-state input or a coherent ⊗ squeezed
state input.

Comment: 13 pages, 8 figures
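For context, the benchmark referred to above can be sketched as follows, with $N$ the mean photon number probing the phase (these are the standard textbook definitions, not expressions taken from this paper):

```latex
% Shot-noise (standard quantum) limit for phase estimation:
\Delta\phi_{\mathrm{SNL}} = \frac{1}{\sqrt{N}}
% Heisenberg limit, the scaling that sub-shot-noise schemes
% such as SU(1,1) interferometers aim to approach:
\Delta\phi_{\mathrm{HL}} = \frac{1}{N}
```

A scheme "beats the shot-noise limit" when its achievable $\Delta\phi$ falls below $1/\sqrt{N}$ for the same photon resources.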