The CUSUM procedure is known to be optimal for detecting a change in
distribution under a minimax scenario, whereas the Shiryaev-Roberts procedure
is optimal for detecting a change that occurs at a distant time horizon. As a
simpler alternative to the conventional Monte Carlo approach, we propose a
numerical method for the systematic comparison of the two detection schemes in
both settings, i.e., the minimax setting and that of detecting changes that
occur in the distant future. Our goal is accomplished by deriving a set of exact integral
equations for the performance metrics, which are then solved numerically. We
present detailed numerical results for the problem of detecting a change in the
mean of a Gaussian sequence; these results show that the difference between the
two procedures is significant only when detecting small changes.
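For intuition about the integral-equation approach (the notation below is illustrative; the paper derives its own exact equations), performance metrics such as the average run length of a Markov detection statistic S_n stopped at a threshold A satisfy a renewal-type integral equation of the second kind,

    \ell(s) = 1 + \int_0^A k(s, y)\, \ell(y)\, dy,
    \qquad
    k(s, y) = \frac{\partial}{\partial y} \Pr(S_1 \le y \mid S_0 = s),

where \ell(s) is the expected number of observations until S_n first exceeds A given S_0 = s. Discretizing the integral with a quadrature rule reduces the equation to a finite linear system, which is the sense in which such equations are solved numerically.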
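For comparison, the conventional Monte Carlo route that the proposed method is meant to replace simulates the two detection statistics directly. Below is a minimal sketch for a unit-variance Gaussian sequence whose mean shifts from 0 to theta; the function name, arguments, and thresholds are hypothetical, and this is a simulation baseline, not the paper's method.

    import math

    def first_alarm_times(x, theta, a_cusum, a_sr):
        """Return the first time the CUSUM and Shiryaev-Roberts statistics
        cross their thresholds on the observation sequence x (None if never)."""
        w = 0.0  # CUSUM: W_n = max(0, W_{n-1} + Z_n), Z_n the log-likelihood ratio
        r = 0.0  # Shiryaev-Roberts: R_n = (1 + R_{n-1}) * exp(Z_n)
        t_cusum = t_sr = None
        for n, xn in enumerate(x, start=1):
            z = theta * xn - theta ** 2 / 2.0  # Gaussian log-likelihood ratio of x_n
            w = max(0.0, w + z)
            r = (1.0 + r) * math.exp(z)
            if t_cusum is None and w >= a_cusum:
                t_cusum = n
            if t_sr is None and r >= a_sr:
                t_sr = n
            if t_cusum is not None and t_sr is not None:
                break
        return t_cusum, t_sr

Averaging such stopping times over many simulated sequences estimates the run-length metrics, but with a simulation error that the integral-equation approach avoids.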