Network data analysis methods are the only way to properly separate real
gravitational wave (GW) transient events from detector noise. They can be
divided into two generic classes: the coincidence method and the coherent
analysis. The former uses lists of selected events provided by each
interferometer belonging to the network and tries to correlate them in time to
identify a physical signal. Instead of this binary treatment of the detector
outputs (signal present or absent), the latter method first merges the
interferometer data and then searches for a common pattern, consistent with an
assumed GW waveform and a given source location in the sky. Thresholds are
applied only later, to validate or reject this hypothesis.
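As an illustration of the coincidence approach, a time-coincidence test between two event lists can be sketched as below; this is a minimal toy example, and the trigger times, the 10 ms window, and the function name are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a two-detector time-coincidence test (toy example;
# the window and trigger times are illustrative, not from the paper).

def coincidences(triggers_a, triggers_b, window):
    """Return pairs of triggers whose arrival times differ by at most
    `window` seconds."""
    triggers_a = sorted(triggers_a)
    triggers_b = sorted(triggers_b)
    pairs = []
    j = 0
    for t_a in triggers_a:
        # Skip triggers in detector B that are too early to match t_a.
        while j < len(triggers_b) and triggers_b[j] < t_a - window:
            j += 1
        # Collect every trigger in B falling inside the window around t_a.
        k = j
        while k < len(triggers_b) and triggers_b[k] <= t_a + window:
            pairs.append((t_a, triggers_b[k]))
            k += 1
    return pairs

# Two event lists (times in seconds); only pairs closer than 10 ms survive.
list_a = [12.000, 57.341, 104.820]
list_b = [12.004, 33.500, 104.829]
print(coincidences(list_a, list_b, window=0.010))
# -> [(12.0, 12.004), (104.82, 104.829)]
```

The sorted two-pointer scan is one simple way to pair triggers; it reflects the binary nature of the method, since only the event times survive the per-detector thresholding.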
As coherent algorithms use more complete information than coincidence methods,
they are expected to provide better detection performance, but at a higher
computational cost. An efficient filter must achieve a good compromise between a
low false alarm rate (hence triggering on data at a manageable rate) and a high
detection efficiency. Therefore, the comparison of the two approaches is
achieved using so-called Receiver Operating Characteristic (ROC) curves, giving the
relationship between the false alarm rate and the detection efficiency for a
given method. This paper investigates this question via Monte-Carlo
simulations, using the network model developed in a previous article.
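As a hedged illustration of how such an ROC curve is produced, the sketch below builds one from a toy Monte-Carlo run: Gaussian detection statistics stand in for the actual network filters, so the means, trial counts, and thresholds are assumptions for demonstration only, not results from the paper.

```python
# Toy Monte-Carlo sketch of how an ROC curve is built for one method:
# sweep a threshold over a detection statistic measured on noise-only
# and on signal-plus-noise trials. The Gaussian statistics are stand-ins
# for the actual network filters, not the model used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

noise_stat = rng.normal(0.0, 1.0, n_trials)    # statistic on pure noise
signal_stat = rng.normal(3.0, 1.0, n_trials)   # statistic with a signal (toy SNR)

thresholds = np.linspace(-2.0, 8.0, 200)
false_alarm = np.array([(noise_stat > t).mean() for t in thresholds])
efficiency = np.array([(signal_stat > t).mean() for t in thresholds])

# Each (false_alarm, efficiency) pair is one point on the ROC curve; a
# better method reaches higher efficiency at the same false alarm rate.
idx = np.argmax(false_alarm <= 1e-3)           # first threshold meeting the rate
print(f"false alarm {false_alarm[idx]:.1e} -> efficiency {efficiency[idx]:.2f}")
```

Sweeping the threshold over the full range, rather than fixing it in advance, is what lets the two methods be compared at equal false alarm rates, which is the comparison the ROC curves encode.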