
Detection of a signal in linear subspace with bounded mismatch

Abstract

We consider the problem of detecting a signal of interest in a background of noise with unknown covariance matrix, taking into account a possible mismatch between the actual steering vector and the presumed one. We assume that the former belongs to a known linear subspace, up to a fraction of its energy. When the subspace of interest is spanned by the presumed steering vector alone, this amounts to assuming that the angle between the actual steering vector and the presumed one is upper bounded. Within this framework, we derive the generalized likelihood ratio test (GLRT). We show that it involves solving a minimization problem with the constraint that the signal of interest lies inside a cone. We present a computationally efficient algorithm, based on the Lagrange multiplier technique, for finding the maximum likelihood estimator (MLE). Numerical simulations illustrate the performance and robustness of this new detector and compare it with the adaptive coherence estimator, which assumes that the steering vector lies entirely within a known subspace.
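The core computational step the abstract describes is a minimization constrained to a cone, handled with a Lagrange multiplier. The sketch below is a minimal illustration of that technique, not the detector derived in the paper: it projects a vector onto the cone of vectors whose energy inside a known subspace H exceeds a fixed fraction cos2_theta of their total energy. The function name, variable names, and the closed-form multiplier are assumptions made for this example.

```python
import numpy as np

def project_onto_cone(x, H, cos2_theta):
    """Euclidean projection of x onto {s : ||P_H s||^2 >= cos2_theta * ||s||^2}.

    H: (n, r) matrix with orthonormal columns spanning the subspace,
    so P_H = H H^T. Assumes 0 < cos2_theta < 1 and x not orthogonal to H.
    """
    x_par = H @ (H.T @ x)   # component of x inside the subspace
    x_perp = x - x_par      # component orthogonal to the subspace
    c = cos2_theta
    a = np.sqrt(1.0 - c) * np.linalg.norm(x_par)
    b = np.sqrt(c) * np.linalg.norm(x_perp)
    if a >= b:              # x already satisfies the cone constraint
        return x
    # Stationarity of the Lagrangian ||x - s||^2 - lam * (||P_H s||^2 - c ||s||^2)
    # scales the two components separately:
    #   s_par  = x_par  / (1 - lam * (1 - c)),   s_perp = x_perp / (1 + lam * c).
    # Enforcing the constraint with equality, (1 - c)||s_par||^2 = c ||s_perp||^2,
    # yields the multiplier in closed form.
    lam = (b - a) / (a * c + b * (1.0 - c))
    s_par = x_par / (1.0 - lam * (1.0 - c))
    s_perp = x_perp / (1.0 + lam * c)
    return s_par + s_perp

# Usage example with an assumed 2-D subspace of R^4.
rng = np.random.default_rng(0)
H, _ = np.linalg.qr(rng.standard_normal((4, 2)))
x = rng.standard_normal(4)
s = project_onto_cone(x, H, cos2_theta=0.9)
ratio = np.linalg.norm(H @ (H.T @ s))**2 / np.linalg.norm(s)**2
print(ratio)  # ~0.9 when the constraint is active
```

The multiplier admits a closed form here because the orthogonal projector decouples the problem into two scalar scalings; in the paper's full setting, with an estimated noise covariance in the metric, a one-dimensional root search over the multiplier would play the analogous role.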
