The detection and estimation of gravitational wave burst signals, with {\em a
priori} unknown polarization waveforms, requires the use of data from a network
of detectors. Approaches based on the maximum likelihood principle have proven useful for determining how the data from such a network should be combined. The most straightforward among these uses the global maximum of the
likelihood over the space of all waveforms as both the detection statistic and
signal estimator. However, in the case of burst signals, a physically
counterintuitive situation results: for two aligned detectors the statistic
includes the cross-correlation of the detector outputs, as expected, but this
term disappears even for an infinitesimal misalignment. This {\em two detector paradox} arises from the inclusion of improbable waveforms in the solution space of the maximization: such waveforms produce widely different responses in
detectors that are closely aligned. We show that by penalizing waveforms that exhibit large signal-to-noise ratio (SNR) variability as the corresponding source is moved across the sky, a physically motivated restriction is obtained that (i) resolves the two detector paradox and (ii) leads to a better-performing statistic than the global maximum of the likelihood. Waveforms with high SNR variability turn out to be precisely the ones that are improbable in the sense
mentioned above. The coherent network analysis method thus obtained can be
applied to any network, irrespective of the number or the mutual alignment of
detectors.
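
For concreteness, the paradox can be sketched as follows; the notation is ours and assumes whitened data streams with unit-variance Gaussian noise and a discrete-time inner product, so it is an illustration rather than the paper's exact formulation. For two detectors with data $x_1[k]$, $x_2[k]$ and noise-free responses $s_1[k]$, $s_2[k]$, the log-likelihood ratio is
\[
  \ln\Lambda = \sum_k \Big[ x_1[k]\,s_1[k] + x_2[k]\,s_2[k]
  - \tfrac{1}{2}\big(s_1^2[k] + s_2^2[k]\big) \Big].
\]
If the detectors are exactly co-aligned, then $s_1 = s_2 = s$ and the unrestricted maximum over $s$ is attained at $\hat{s} = (x_1 + x_2)/2$, giving
\[
  \max_{s}\,\ln\Lambda = \tfrac{1}{4}\sum_k \big(x_1[k] + x_2[k]\big)^2
  = \tfrac{1}{4}\sum_k \big(x_1^2[k] + x_2^2[k]\big)
  + \tfrac{1}{2}\sum_k x_1[k]\,x_2[k],
\]
which contains the expected cross-correlation term. For any misalignment, however small, $s_1$ and $s_2$ can be varied independently (the antenna-pattern matrix is invertible), so the maximum is attained at $\hat{s}_i = x_i$ and
\[
  \max_{s_1,s_2}\,\ln\Lambda = \tfrac{1}{2}\sum_k \big(x_1^2[k] + x_2^2[k]\big),
\]
from which the cross-correlation term has vanished. This discontinuity between the two limits is the paradox that the restricted, SNR-variability-penalized maximization is designed to remove.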