Adaptive detection using randomly reduced dimension generalized likelihood ratio test
We address the problem of detecting a signal of interest in the presence of Gaussian noise with unknown statistics when the number of training samples available to learn the noise covariance matrix is less than the size of the observation space. Following an idea by Marzetta, a series of K random semi-unitary matrices is applied to the data to achieve dimensionality reduction. Then, the K corresponding generalized likelihood ratios are computed and their median value provides the final detector. We show that the semi-unitary matrices can be replaced by random Gaussian matrices without affecting the final test statistic. The new detector avoids eigenvalue decomposition and is easily amenable to parallel implementation. It is compared to conventional techniques based on diagonal loading of the sample covariance matrix or on rank reduction through eigenvalue decomposition, and is shown to perform well.
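The detector described above can be sketched as follows. This is an illustrative implementation only: the reduced dimension `r`, the number of projections `K`, and the use of Kelly's GLR statistic in the reduced space are assumptions not fixed by the abstract.

```python
import numpy as np

def reduced_glr(x, X_train, v, r, K, rng):
    """Median of K GLR statistics, each computed after a random
    Gaussian projection to an r-dimensional subspace (per the
    abstract, this is equivalent to using semi-unitary matrices)."""
    N = x.shape[0]
    stats = []
    for _ in range(K):
        Phi = rng.standard_normal((r, N))     # random projection
        xr, vr = Phi @ x, Phi @ v             # reduced data and steering vector
        Yr = Phi @ X_train                    # reduced training data
        Si = np.linalg.inv(Yr @ Yr.conj().T)  # inverse of reduced (unnormalized) SCM
        # Kelly-style GLR statistic in the reduced space (assumed choice)
        num = np.abs(vr.conj() @ Si @ xr) ** 2
        den = (vr.conj() @ Si @ vr) * (1.0 + xr.conj() @ Si @ xr)
        stats.append((num / den).real.item())
    return np.median(stats)

rng = np.random.default_rng(0)
N, r, K, T = 20, 5, 11, 15                    # note T < N: limited sample support
x = rng.standard_normal(N)                    # test snapshot
X_train = rng.standard_normal((N, T))         # training snapshots (columns)
v = rng.standard_normal(N)                    # steering vector
t = reduced_glr(x, X_train, v, r, K, rng)     # statistic lies in [0, 1)
```

The K projections are independent, so the loop parallelizes trivially, which is the amenability to parallel implementation the abstract notes.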
Quantum Tomography
This is the draft version of a review paper which is going to appear in "Advances in Imaging and Electron Physics". Comment: To appear in "Advances in Imaging and Electron Physics". Some figures are in low resolution.
Models for discriminating image blur from loss of contrast
Observers can discriminate between blurry and low-contrast images (Morgan, 2017). Wang and Simoncelli (2004) demonstrated that a code for blur is inherent to the phase relationships between localized pattern detectors of different scale. To test whether human observers actually use local phase coherence when discriminating between image blur and loss of contrast, we compared phase-scrambled chessboards with unscrambled chessboards. Although both stimuli had identical amplitude spectra, local phase coherence was disrupted by phase-scrambling. Human observers were required to concurrently detect and identify (as contrast or blur) image manipulations in the 2x2 forced-choice paradigm (Nachmias & Weber, 1975; Watson & Robson, 1981) traditionally considered to be a litmus test for "labelled lines" (i.e. detection mechanisms that can be distinguished on the basis of their preferred stimuli). Phase scrambling reduced some observers' ability to discriminate between blur and a reduction in contrast. However, none of our observers produced data consistent with Watson & Robson's most stringent test for labelled lines, regardless of whether phases were scrambled or not. Models of performance fit significantly better when either a) the blur detector also responded to contrast modulations, b) the contrast detector also responded to blur modulations, or c) noise in the two detectors was anticorrelated.
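The stimulus construction hinges on one property: phase scrambling must disrupt local phase coherence while leaving the amplitude spectrum exactly intact. A standard way to build such stimuli (the specific construction is an assumption; the paper's actual stimulus code is not given) is to recombine the image's amplitude spectrum with the phase spectrum of a white-noise field:

```python
import numpy as np

def phase_scramble(img, rng):
    """Replace an image's phases with those of a white-noise field,
    preserving the amplitude spectrum exactly."""
    noise = rng.standard_normal(img.shape)
    # Phases of a real-valued noise field are Hermitian-symmetric,
    # so the inverse FFT below is real up to numerical error.
    rand_phase = np.angle(np.fft.fft2(noise))
    amp = np.abs(np.fft.fft2(img))
    return np.fft.ifft2(amp * np.exp(1j * rand_phase)).real
```

Because the amplitude spectra of the scrambled and unscrambled images match term by term, any difference in discrimination performance can be attributed to the disrupted phase structure.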
Feature selection when there are many influential features
Recent discussion of the success of feature selection methods has argued that
focusing on a relatively small number of features has been counterproductive.
Instead, it is suggested, the number of significant features can be in the
thousands or tens of thousands, rather than (as is commonly supposed at
present) approximately in the range from five to fifty. This change, in orders
of magnitude, in the number of influential features, necessitates alterations
to the way in which we choose features and to the manner in which the success
of feature selection is assessed. In this paper, we suggest a general approach
that is suited to cases where the number of relevant features is very large,
and we consider particular versions of the approach in detail. We propose ways
of measuring performance, and we study both theoretical and numerical
properties of the proposed methodology. Comment: Published at http://dx.doi.org/10.3150/13-BEJ536 in Bernoulli
(http://isi.cbs.nl/bernoulli/) by the International Statistical
Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm)
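When thousands of features may be influential, the assessment shifts from recovering a short exact list to ranking features so that a large top segment is enriched for truly relevant ones. A generic marginal-ranking sketch (this is an illustration of the setting, not the paper's specific procedure) ranks features by absolute two-sample t-statistics:

```python
import numpy as np

def rank_features(X, y):
    """Rank columns of X by absolute two-sample t-statistics between
    classes y == 0 and y == 1; most influential first."""
    X0, X1 = X[y == 0], X[y == 1]
    n0, n1 = len(X0), len(X1)
    se = np.sqrt(X0.var(axis=0, ddof=1) / n0 + X1.var(axis=0, ddof=1) / n1)
    t = (X1.mean(axis=0) - X0.mean(axis=0)) / se
    return np.argsort(-np.abs(t))

# Simulated setting: 1000 features, the first 200 all mildly influential
rng = np.random.default_rng(0)
n, p, k = 200, 1000, 200
y = np.repeat([0, 1], n // 2)
X = rng.standard_normal((n, p))
X[y == 1, :k] += 1.0                  # shift the influential features
order = rank_features(X, y)
```

Performance here would be measured by how many of the true `k` features appear in the top `k` ranks, the kind of aggregate criterion the abstract argues for, rather than exact support recovery.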
Space-time reduced rank methods and CFAR signal detection algorithms with applications to HPRF radar
In radar applications, the statistical properties (covariance matrix) of the interference are typically unknown a priori and are estimated from a dataset with limited sample support. Often, the limited sample support leads to numerically ill-conditioned radar detectors. Under such circumstances, classical interference cancellation methods such as sample matrix inversion (SMI) do not perform satisfactorily. In these cases, innovative reduced-rank space-time adaptive processing (STAP) techniques outperform full-rank techniques. The high pulse repetition frequency (HPRF) radar problem is analyzed, and it is shown to be in the class of adaptive radar with limited sample support. Reduced-rank methods are studied for the HPRF radar problem. In particular, the method known as diagonally loaded covariance matrix SMI (L-SMI) is closely investigated. Diagonal loading improves the numerical conditioning of the estimated covariance matrix and hence is well suited to a limited sample support environment. The performance of L-SMI is obtained through a theoretical distribution of the output conditioned signal-to-noise ratio of the space-time array. Reduced-rank techniques are extended to constant false alarm rate (CFAR) detectors based on the generalized likelihood ratio test (GLRT). Two new modified CFAR GLRT detectors are considered and analyzed. The first is a subspace-based GLRT detector, where subspace transformations are applied to the data prior to detection; the transformation adds statistical stability, which tends to improve performance at the expense of an additional SNR loss. The second is a modified GLRT detector that incorporates a diagonally loaded covariance matrix. Both detectors show improved performance over the traditional GLRT.
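The diagonal-loading step at the heart of L-SMI can be sketched in a few lines. This is a minimal illustration of why loading helps under limited sample support (the loading level and the MVDR-style normalization are assumed choices, not details from the abstract):

```python
import numpy as np

def lsmi_weights(X_train, v, loading):
    """Diagonally loaded SMI weights: regularize the sample covariance
    with loading * I so it is invertible even when the number of
    training snapshots T is smaller than the dimension N."""
    N, T = X_train.shape
    R_hat = X_train @ X_train.conj().T / T   # sample covariance (rank <= T)
    R_loaded = R_hat + loading * np.eye(N)   # diagonal loading
    w = np.linalg.solve(R_loaded, v)
    return w / (v.conj() @ w)                # distortionless: v^H w = 1

rng = np.random.default_rng(2)
N, T = 16, 8                                 # limited sample support: T < N
X = rng.standard_normal((N, T))
v = np.ones(N)                               # steering vector (illustrative)
w = lsmi_weights(X, v, loading=1.0)
```

Without the loading term, `R_hat` is rank-deficient for T < N and the solve fails or is numerically unstable, which is precisely the ill-conditioning the abstract describes.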