
    On false alarm rate of matched filter under distribution mismatch

    The generalized likelihood ratio test (GLRT) is a widely used technique for detecting signals of interest in noise when some of the parameters describing the signal (and possibly the noise) are unknown. The threshold of such a test is set from a desired probability of false alarm P_fa, and hence this threshold depends on the statistical assumptions made about the noise. In practice, however, the noise statistics are seldom known, and it becomes crucial to characterize P_fa under a mismatched distribution. In this letter, we address this problem for a simple binary composite hypothesis testing problem (the matched filter) when the threshold is designed under a Gaussian assumption while the noise actually follows an elliptically contoured distribution. We also consider the inverse situation. Generic expressions for the assumed and actual P_fa are derived and illustrated on the particular case of Student distributions, for which simple closed-form expressions are obtained. The latter show that the GLRT based on the Gaussian assumption is not robust, whereas the one based on the Student assumption is.
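    The mismatch effect described in this abstract can be reproduced with a small Monte Carlo sketch (toy parameters, not the paper's closed-form expressions): a fixed matched-filter threshold set for a Gaussian design P_fa is applied to elliptical Student-t noise, whose heavier tails inflate the actual false alarm rate.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 16              # snapshot length (toy value)
    pfa_design = 1e-2   # desired false alarm probability
    trials = 200_000

    # Known unit-norm steering vector; white noise with known unit variance.
    s = np.ones(n) / np.sqrt(n)

    # Matched filter statistic (s^T x)^2 is chi-squared with 1 dof under
    # real white Gaussian noise, so the Gaussian-design threshold is:
    thr = stats.chi2.ppf(1 - pfa_design, df=1)

    def empirical_pfa(noise):
        return np.mean((noise @ s) ** 2 > thr)

    # Gaussian noise: the threshold should deliver the designed Pfa.
    gauss = rng.standard_normal((trials, n))

    # Elliptical Student-t noise: a Gaussian vector divided by a common
    # chi-squared "texture", rescaled to unit variance (requires nu > 2).
    nu = 3
    tau = rng.chisquare(nu, size=(trials, 1)) / nu
    student = rng.standard_normal((trials, n)) / np.sqrt(tau)
    student *= np.sqrt((nu - 2) / nu)

    pfa_gauss = empirical_pfa(gauss)      # close to pfa_design
    pfa_student = empirical_pfa(student)  # inflated by the heavy tails
    ```

    With these toy values the Student-t false alarm rate comes out roughly twice the designed 1%, which is the non-robustness the abstract refers to.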

    Improved Stack-Slide Searches for Gravitational-Wave Pulsars

    We formulate and optimize a computational search strategy for detecting gravitational waves from isolated, previously unknown neutron stars (that is, neutron stars with unknown sky positions, spin frequencies, and spin-down parameters). It is well known that fully coherent searches over the relevant parameter-space volumes are not computationally feasible, so more computationally efficient methods are called for. The first step in this direction was taken by Brady & Creighton (2000), who proposed and optimized a two-stage, stack-slide search algorithm. We generalize and otherwise improve upon the Brady-Creighton scheme in several ways. Like Brady & Creighton, we consider a stack-slide scheme, but here with an arbitrary number of semi-coherent stages and with a coherent follow-up stage at the end. We find that searches with three semi-coherent stages are significantly more efficient than two-stage searches (requiring about 2-5 times less computational power for the same sensitivity) and are only slightly less efficient than searches with four or more stages. We calculate the signal-to-noise ratio required for detection, as a function of computing power and neutron-star spin-down age, using our optimized searches.
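    The stack-slide idea, short coherent FFTs ("stacks") whose frequency bins are shifted ("slid") along a spin-down template before being summed incoherently, can be sketched in a few lines. The sample rate, amplitude, and drift below are toy values, not the paper's search parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    fs = 256.0      # sample rate in Hz (toy value)
    seg_len = 256   # samples per coherent segment: 1 s stacks, 1 Hz bins
    n_seg = 32
    f0, fdot = 60.0, 0.05   # start frequency (Hz) and spin-down (Hz/s), toy

    t = np.arange(n_seg * seg_len) / fs
    x = 0.5 * np.sin(2 * np.pi * (f0 * t + 0.5 * fdot * t**2))
    x += rng.standard_normal(t.size)

    # Coherent stage: power spectrum of each 1 s segment ("stack").
    power = np.abs(np.fft.rfft(x.reshape(n_seg, seg_len), axis=1)) ** 2
    df = fs / seg_len

    def stack_slide(fdot_template):
        # Semi-coherent stage: shift each stack so bins following the
        # template's frequency drift line up, then sum the powers.
        # np.roll wraps at the band edge, harmless for this toy band.
        total = np.zeros(power.shape[1])
        for k in range(n_seg):
            shift = int(round(fdot_template * k * seg_len / fs / df))
            total += np.roll(power[k], -shift)
        return total

    aligned = stack_slide(fdot)   # template matches the true spin-down
    unslid = stack_slide(0.0)     # ignoring the drift smears the power
    ```

    The drift here (about 1.6 Hz over the run) exceeds a bin width, so summing without sliding spreads the signal power across bins, while the matching template concentrates it near the starting-frequency bin.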

    An improved adaptive sidelobe blanker

    We propose a two-stage detector consisting of a subspace detector followed by the whitened adaptive beamformer orthogonal rejection test. The performance analysis shows that it possesses the constant false alarm rate property with respect to the unknown covariance matrix of the noise and that it can guarantee a wider range of directivity values than previously proposed two-stage detectors. The probability of false alarm and the probability of detection (for both matched and mismatched signals) have been evaluated by means of numerical integration techniques.
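    The two-stage logic, a first detector that screens the data followed by a second, more selective statistic that vets mismatched arrivals, can be sketched with the adaptive matched filter (AMF) as a stand-in first stage and the adaptive coherence estimator (ACE) as a stand-in second stage. These are illustrative choices, not the subspace detector and W-ABORT pair analyzed in the paper, and all parameters are toy values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, K, trials = 8, 64, 2000
    s = np.ones(n) / np.sqrt(n)            # presumed steering vector
    u = np.zeros(n); u[0] = 1.0
    u -= (u @ s) * s; u /= np.linalg.norm(u)   # direction orthogonal to s

    def two_stage(x, X, eta1, eta2):
        # Stage 1: AMF statistic; stage 2: ACE, which measures the whitened
        # cosine-squared angle and rejects badly mismatched signals.
        Si = np.linalg.inv(X @ X.T / X.shape[1])   # inverse sample covariance
        num = (s @ Si @ x) ** 2
        amf = num / (s @ Si @ s)
        ace = num / ((s @ Si @ s) * (x @ Si @ x))
        return amf > eta1 and ace > eta2           # both stages must fire

    # Matched vs. strongly mismatched targets at the same amplitude.
    hits_matched = hits_mismatched = 0
    for _ in range(trials):
        X = rng.standard_normal((n, K))            # signal-free training data
        noise = rng.standard_normal(n)
        hits_matched += two_stage(5 * s + noise, X, 10.0, 0.5)
        hits_mismatched += two_stage(5 * u + noise, X, 10.0, 0.5)
    ```

    The matched target passes both stages most of the time, while the orthogonal arrival is rejected, which is the sidelobe-blanking behavior the cascade is designed for.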

    Coincidence and coherent data analysis methods for gravitational wave bursts in a network of interferometric detectors

    Network data analysis methods are the only way to properly separate real gravitational wave (GW) transient events from detector noise. They can be divided into two generic classes: coincidence methods and coherent analyses. The former uses lists of selected events provided by each interferometer in the network and tries to correlate them in time to identify a physical signal. Instead of this binary treatment of the detector outputs (signal present or absent), the latter first merges the interferometer data and looks for a common pattern, consistent with an assumed GW waveform and a given source location in the sky; thresholds are only applied later, to validate or reject the hypothesis. As coherent algorithms use more complete information than coincidence methods, they are expected to provide better detection performance, but at a higher computational cost. An efficient filter must strike a good compromise between a low false alarm rate (hence triggering on data at a manageable rate) and a high detection efficiency. Therefore, the two approaches are compared using Receiver Operating Characteristic (ROC) curves, which give the relationship between the false alarm rate and the detection efficiency for a given method. This paper investigates this question via Monte Carlo simulations, using the network model developed in a previous article.
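    The ROC comparison can be illustrated with a deliberately minimal model (toy amplitudes, one Gaussian sample per detector, not the paper's network simulation): coincidence thresholds each detector's output separately, which is equivalent to thresholding the smaller of the two magnitudes, while the coherent statistic merges the raw data before a single threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    trials, A = 200_000, 1.5        # toy amplitude seen by both detectors

    h0 = rng.standard_normal((trials, 2))   # noise only
    h1 = h0 + A                             # signal present in both streams

    # Coincidence: an event passes iff both detectors exceed the threshold,
    # so the effective statistic is the smaller of the two magnitudes.
    coin0, coin1 = (np.min(np.abs(d), axis=1) for d in (h0, h1))
    # Coherent: merge the data first, then apply a single threshold.
    coh0, coh1 = (np.abs(d.sum(axis=1)) / np.sqrt(2) for d in (h0, h1))

    def roc(s0, s1, grid):
        # false alarm rate and detection efficiency along a threshold grid
        return (np.array([(s0 > t).mean() for t in grid]),
                np.array([(s1 > t).mean() for t in grid]))

    grid = np.linspace(0.0, 5.0, 201)
    fa_coin, det_coin = roc(coin0, coin1, grid)
    fa_coh, det_coh = roc(coh0, coh1, grid)

    # Compare detection efficiencies at a common false alarm rate of ~1%.
    i = int(np.argmin(np.abs(fa_coin - 0.01)))
    det_coh_same_fa = float(np.interp(fa_coin[i], fa_coh[::-1], det_coh[::-1]))
    ```

    At the same false alarm rate the coherent statistic detects the common signal noticeably more often, matching the expectation that coherent analysis outperforms coincidence at the price of processing all the data jointly.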

    Detection of a signal in linear subspace with bounded mismatch

    We consider the problem of detecting a signal of interest in a background of noise with unknown covariance matrix, taking into account a possible mismatch between the actual steering vector and the presumed one. We assume that the former belongs to a known linear subspace, up to a fraction of its energy. When the subspace of interest consists of the presumed steering vector only, this amounts to assuming that the angle between the actual and presumed steering vectors is upper bounded. Within this framework, we derive the generalized likelihood ratio test (GLRT). We show that it involves solving a minimization problem with the constraint that the signal of interest lies inside a cone. We present a computationally efficient algorithm to find the maximum likelihood estimator (MLE) based on the Lagrange multiplier technique. Numerical simulations illustrate the performance and the robustness of this new detector, and compare it with the adaptive coherence estimator, which assumes that the steering vector lies entirely in a subspace.
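    The cone constraint has a simple geometry once the data are expressed in subspace coordinates: the constrained MLE is the Euclidean projection of the subspace component onto a cone around the presumed steering vector. The sketch below uses a rank-2 subspace, white noise of known unit variance, and a closed-form 2-D projection rather than the paper's Lagrange-multiplier algorithm; the angle and amplitudes are toy values.

    ```python
    import numpy as np

    def project_cone(p, theta_max):
        # Euclidean projection of a 2-D point onto the cone of vectors whose
        # angle to the first axis (the presumed steering vector) is <= theta_max.
        theta = np.arctan2(abs(p[1]), p[0])
        if theta <= theta_max:
            return np.asarray(p, dtype=float)     # already inside the cone
        if theta >= theta_max + np.pi / 2:
            return np.zeros(2)                    # projects onto the apex
        edge = np.array([np.cos(theta_max), np.sign(p[1]) * np.sin(theta_max)])
        return (p @ edge) * edge                  # onto the nearer cone edge

    def cone_stat(x, H, theta_max):
        # GLRT-style energy of the cone-constrained fit of the subspace
        # component of the data (white noise, known unit variance).
        q = project_cone(H.T @ x, theta_max)
        return float(q @ q)

    rng = np.random.default_rng(4)
    n = 8
    s = np.ones(n) / np.sqrt(n)                   # presumed steering vector
    u = rng.standard_normal(n)
    u -= (u @ s) * s; u /= np.linalg.norm(u)      # second subspace direction
    H = np.column_stack([s, u])                   # orthonormal subspace basis
    theta_max = np.pi / 8                         # toy mismatch bound

    t_matched = cone_stat(5 * s, H, theta_max)    # full energy retained
    t_sideways = cone_stat(5 * u, H, theta_max)   # attenuated by the cone
    ```

    A signal along the presumed steering vector keeps all of its energy, while a signal at the far edge of the subspace is clipped to the cone boundary, which is the robustness-versus-selectivity trade-off the detector is built around.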