We analyze systematic (classical) and fundamental (quantum) limitations of
the sensitivity of optical magnetometers resulting from ac-Stark shifts. We
show that, in contrast to absorption-based techniques, the signal reduction
associated with classical broadening can be compensated in magnetometers based
on phase measurements using electromagnetically induced transparency (EIT).
However, due to ac-Stark-associated quantum noise, the signal-to-noise ratio of
EIT-based magnetometers attains a maximum value at a certain laser intensity.
This value is independent of the quantum statistics of the light and defines a
standard quantum limit of sensitivity. We demonstrate that an EIT-based optical
magnetometer in the Faraday configuration is the best candidate to achieve the
highest sensitivity of magnetic field detection and give a detailed analysis of
such a device.