We analyze the signal processing required for the optimal detection of a
stochastic background of gravitational radiation using laser interferometric
detectors. Starting with basic assumptions about the statistical properties of
a stochastic gravity-wave background, we derive expressions for the optimal
filter function and signal-to-noise ratio for the cross-correlation of the
outputs of two gravity-wave detectors. Sensitivity levels required for
detection are then calculated. We then discuss issues related to (i)
calculating the signal-to-noise ratio for arbitrarily large stochastic
backgrounds; (ii) performing the data analysis in the presence of
nonstationary detector noise; (iii) combining data from multiple detector
pairs to increase the sensitivity of a stochastic background search; (iv)
correlating the outputs of four or more detectors; and (v) allowing for the
possibility of correlated noise in the outputs of two detectors. We also
briefly describe a computer simulation that mimics the generation and
detection of a stochastic gravity-wave signal in the presence of simulated
detector noise. Numerous
graphs and tables of numerical data for the five major interferometers
(LIGO-WA, LIGO-LA, VIRGO, GEO-600, and TAMA-300) are also given. The treatment
given in this paper should be accessible to both theorists involved in data
analysis and experimentalists involved in detector design and data
acquisition.

Comment: 81 pages, 30 postscript figures, REVTeX
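
For reference, a sketch of the form the expressions mentioned above take in
the standard treatment (weak-signal limit); this is not a substitute for the
derivations in the paper. With gamma(f) the overlap reduction function of the
detector pair, Omega_gw(f) the energy-density spectrum of the background in
units of the critical density, P_1(f) and P_2(f) the one-sided noise power
spectra of the two detectors, H_0 the Hubble constant, and T the observation
time, the optimal filter and the squared signal-to-noise ratio of the
cross-correlation statistic are

    \tilde{Q}(f) \;\propto\;
    \frac{\gamma(|f|)\,\Omega_{\rm gw}(|f|)}{f^{3}\,P_{1}(|f|)\,P_{2}(|f|)},
    \qquad
    \mathrm{SNR}^{2} \;=\; \frac{9 H_{0}^{4}}{50\pi^{4}}\, T
    \int_{0}^{\infty} df\,
    \frac{\gamma^{2}(f)\,\Omega_{\rm gw}^{2}(f)}{f^{6}\,P_{1}(f)\,P_{2}(f)}.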
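As an illustration of the cross-correlation idea behind the simulation
described in the abstract, the following minimal toy sketch (not the paper's
code; all names and amplitudes are illustrative assumptions) builds two
detector outputs that share a common stochastic signal buried in independent
noise, then cross-correlates them: the uncorrelated noise averages away while
the common signal power survives.

    import numpy as np

    # Toy sketch: two co-located, co-aligned detectors see a common
    # stochastic signal h(t) buried in independent instrument noise.
    rng = np.random.default_rng(0)
    N = 2 ** 20                          # number of time samples
    h = 0.1 * rng.standard_normal(N)     # common stochastic "signal" (white)
    s1 = h + rng.standard_normal(N)      # detector 1 output: signal + noise
    s2 = h + rng.standard_normal(N)      # detector 2 output: signal + noise

    # Cross-correlation statistic with a trivial (flat) filter. For white
    # spectra a flat filter is already optimal up to normalization; for
    # colored spectra one would instead weight frequency bins by
    # gamma(f) * Omega_gw(f) / (f^3 * P1(f) * P2(f)).
    cross = np.mean(s1 * s2)             # expectation = Var(h) = 0.01

    # Estimate the statistic's spread from signal-free (null) realizations.
    null = np.array([np.mean(rng.standard_normal(N) * rng.standard_normal(N))
                     for _ in range(100)])
    print(f"cross = {cross:.5f}, toy SNR ~ {cross / null.std():.1f}")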