Performance of Statistical Tests for Single Source Detection using Random Matrix Theory
This paper introduces a unified framework for the detection of a source with
a sensor array in the context where the noise variance and the channel between
the source and the sensors are unknown at the receiver. The Generalized Maximum
Likelihood Test is studied and yields the analysis of the ratio between the
maximum eigenvalue of the sampled covariance matrix and its normalized trace.
Using recent results of random matrix theory, a practical way to evaluate the
threshold and the p-value of the test is provided in the asymptotic regime
where the number of sensors and the number of observations per sensor
are large but have the same order of magnitude. The theoretical performance of
the test is then analyzed in terms of Receiver Operating Characteristic (ROC)
curve. It is in particular proved that both Type I and Type II error
probabilities converge to zero exponentially as the dimensions increase at the
same rate, and closed-form expressions are provided for the error exponents.
These theoretical results rely on a precise description of the large deviations
of the largest eigenvalue of spiked random matrix models, and establish that
the presented test asymptotically outperforms the popular test based on the
condition number of the sampled covariance matrix.
Comment: 45 pages; improved presentation; more proofs provided.
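The test statistic described above, the ratio of the largest eigenvalue of the sample covariance matrix to its normalized trace, can be sketched in a few lines of NumPy. The sensor count, snapshot count, and rank-one signal model below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def glrt_statistic(Y):
    """Ratio of the largest eigenvalue of the sample covariance
    matrix to its normalized trace (the test statistic above)."""
    K, N = Y.shape                     # K sensors, N snapshots
    R = (Y @ Y.conj().T) / N           # sample covariance matrix
    eigvals = np.linalg.eigvalsh(R)    # real eigenvalues, ascending
    return eigvals[-1] / (np.trace(R).real / K)

rng = np.random.default_rng(0)
K, N = 10, 200

# Noise-only observations: the statistic stays near its null value
noise = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)
t0 = glrt_statistic(noise)

# Rank-one "spike": a single source inflates the largest eigenvalue
h = rng.normal(size=(K, 1))            # unknown channel (illustrative)
s = rng.normal(size=(1, N))            # source samples (illustrative)
t1 = glrt_statistic(noise + h @ s)
print(t0, t1)                          # t1 should clearly exceed t0
```

Under the noise-only hypothesis the statistic concentrates near the right edge of the Marchenko-Pastur support, which is what makes a practical threshold computable.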
Signal Processing in Large Systems: a New Paradigm
For a long time, detection and parameter estimation methods for signal
processing have relied on asymptotic statistics as the number N of
observations of a population of size n grows large compared to n, i.e.
N/n → ∞. Modern technological and societal advances now demand the study
of sometimes extremely large populations and simultaneously require fast
signal processing due to accelerated system dynamics. This results in
not-so-large practical ratios N/n, sometimes even smaller than one. A
disruptive change in classical signal processing methods has therefore been
initiated in the past ten years, mostly spurred by the field of large
dimensional random matrix theory. The early works in random matrix theory for
signal processing applications are however scarce and highly technical. This
tutorial provides an accessible methodological introduction to the modern tools
of random matrix theory and to the signal processing methods derived from them,
with an emphasis on simple illustrative examples.
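A minimal simulation, assuming white Gaussian data, illustrates the regime in question: with n-dimensional observations and N samples, when the ratio c = n/N does not vanish the eigenvalues of the sample covariance matrix no longer concentrate at the true value 1 but spread over the Marchenko-Pastur support [(1−√c)², (1+√c)²]:

```python
import numpy as np

# White noise with identity covariance: the classical estimate would
# place all eigenvalues near 1, but for a non-vanishing ratio c = n/N
# they spread over [(1 - sqrt(c))^2, (1 + sqrt(c))^2].
rng = np.random.default_rng(1)
n, N = 200, 800                    # c = 0.25: a "not-so-large" ratio
X = rng.normal(size=(n, N))
R = X @ X.T / N                    # sample covariance matrix
eig = np.linalg.eigvalsh(R)
c = n / N
print(eig.min(), eig.max())        # near (1 - 0.5)^2 = 0.25 and (1 + 0.5)^2 = 2.25
```

This spread is exactly what invalidates classical, fixed-n asymptotics and motivates the random-matrix tools the tutorial covers.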
A Bayesian Framework for Collaborative Multi-Source Signal Detection
This paper introduces a Bayesian framework to detect multiple signals
embedded in noisy observations from a sensor array. For various states of
knowledge on the communication channel and the noise at the receiving sensors,
a marginalization procedure based on recent tools of finite random matrix
theory, in conjunction with the maximum entropy principle, is used to compute
the hypothesis selection criterion. Quite remarkably, explicit expressions for
the Bayesian detector are derived, which make it possible to decide on the presence of
signal sources in a noisy wireless environment. The proposed Bayesian detector
is shown to outperform the classical power detector when the noise power is
known and provides very good performance for limited knowledge on the noise
power. Simulations corroborate the theoretical results and quantify the gain
achieved using the proposed Bayesian framework.
Comment: 15 pages, 9 figures. Submitted to IEEE Trans. on Signal Processing.
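The classical power detector used as the baseline in this comparison can be sketched as follows, assuming the noise power is known; the threshold and signal model below are illustrative choices, not the paper's:

```python
import numpy as np

def power_detector(Y, sigma2, threshold):
    """Classical power (energy) detector: declare a source present when
    the average received power exceeds threshold * sigma2, with the
    noise power sigma2 assumed known."""
    return (np.abs(Y) ** 2).mean() > threshold * sigma2

rng = np.random.default_rng(2)
K, N = 4, 100
noise = rng.normal(size=(K, N))           # unit noise power, sigma2 = 1
signal = 0.8 * rng.normal(size=(K, N))    # source adds power on top of the noise

h0 = power_detector(noise, sigma2=1.0, threshold=1.3)           # noise only
h1 = power_detector(noise + signal, sigma2=1.0, threshold=1.3)  # source present
print(h0, h1)
```

When sigma2 is only imprecisely known, the threshold cannot be set reliably, which is the weakness the Bayesian detector is designed to address.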
High speed self-testing quantum random number generation without detection loophole
Quantum mechanics provides means of generating genuine randomness that is
impossible with deterministic classical processes. Remarkably, the
unpredictability of randomness can be certified in a self-testing manner that
is independent of implementation devices. Here, we present an experimental
demonstration of self-testing quantum random number generation based on a
detection-loophole-free Bell test with entangled photons. In the randomness
analysis, without assuming independently and identically distributed samples,
we consider the worst-case scenario in which the adversary launches the most
powerful quantum attacks. After accounting for statistical fluctuations
and applying an 80 Gb × 45.6 Mb Toeplitz-matrix hashing, we achieve a
final random bit rate of 114 bits/s with a small failure probability.
Such self-testing random number generators mark a critical step
towards realistic applications in cryptography and fundamental physics tests.
Comment: 34 pages, 10 figures.
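Toeplitz-matrix hashing, the randomness-extraction step mentioned above, multiplies the raw bit string by a binary Toeplitz matrix over GF(2); an m × n Toeplitz matrix is fixed by n + m − 1 seed bits. A toy-scale sketch (the paper's matrix is 80 Gb × 45.6 Mb; the sizes below are purely illustrative):

```python
import numpy as np

def toeplitz_hash(raw_bits, seed_bits, m):
    """Extract m output bits from raw_bits by multiplying with a binary
    Toeplitz matrix over GF(2). The matrix entry T[i, j] depends only on
    the diagonal index i - j, so n + m - 1 seed bits define it fully."""
    n = len(raw_bits)
    assert len(seed_bits) == n + m - 1
    i = np.arange(m)[:, None]
    j = np.arange(n)[None, :]
    T = seed_bits[i - j + (n - 1)]     # constant along each diagonal
    return T.dot(raw_bits) % 2         # matrix-vector product mod 2

rng = np.random.default_rng(3)
raw = rng.integers(0, 2, size=64)            # toy raw string
seed = rng.integers(0, 2, size=64 + 32 - 1)  # seed defining the matrix
out = toeplitz_hash(raw, seed, m=32)
print(out)
```

Because Toeplitz hashing is a two-universal hash family, compressing the raw string this way yields near-uniform output bits with a quantifiable failure probability.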
On Time-Reversal Imaging by Statistical Testing
This letter is focused on the design and analysis of computational wideband
time-reversal imaging algorithms, designed to be adaptive with respect to the
noise levels pertaining to the frequencies being employed for scene probing.
These algorithms are based on the concept of cell-by-cell processing and are
obtained as theoretically-founded decision statistics for testing the
hypothesis of single-scatterer presence (absence) at a specific location. These
statistics are also validated in comparison with the maximal invariant
statistic for the proposed problem.
Comment: Reduced form accepted in IEEE Signal Processing Letters.