Likelihood-Ratio-Based Biometric Verification
The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal. Second, we show that, under some general conditions, decisions based on posterior probabilities and likelihood ratios are equivalent and result in the same receiver operating curve. However, in a multi-user situation, these two methods lead to different average error rates. As a third result, we prove theoretically that, for multi-user verification, the use of the likelihood ratio is optimal in terms of average error rates. The superiority of this method is illustrated by experiments in fingerprint verification. It is shown that error rates below 10^-3 can be achieved when using multiple fingerprints for template construction.
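The decision rule the abstract describes can be sketched in a few lines. The sketch below is illustrative only: it assumes univariate Gaussian user and background models (the paper works with fixed-length feature vectors), and all names and parameter values are hypothetical.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate Gaussian density N(mu, sigma^2) evaluated at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(x, user_model, background_model):
    """Ratio p(x | user) / p(x | background) for a scalar feature x.

    Each model is a (mean, std) pair for an assumed Gaussian density.
    """
    mu_u, s_u = user_model
    mu_b, s_b = background_model
    return gaussian_pdf(x, mu_u, s_u) / gaussian_pdf(x, mu_b, s_b)

def verify(x, user_model, background_model, threshold=1.0):
    """Accept the identity claim when the likelihood ratio exceeds the threshold.

    Moving the threshold trades false accepts against false rejects,
    tracing out the receiver operating curve mentioned in the abstract.
    """
    return likelihood_ratio(x, user_model, background_model) >= threshold
```

A feature close to the claimed user's model yields a large ratio and is accepted; a feature typical of the background population is rejected.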
Likelihood Ratio-Based Detection of Facial Features
One of the first steps in face recognition, after image acquisition, is registration. A simple but effective technique of registration is to align facial features, such as the eyes, nose and mouth, as well as possible to a standard face. This requires an accurate automatic estimate of the locations of those features. This contribution proposes a method for estimating the locations of facial features based on likelihood ratio-based detection. A post-processing step that evaluates the topology of the facial features is added to reduce the number of false detections. Although the individual detectors only have a reasonable performance (equal error rates range from 3.3% for the eyes to 1.0% for the nose), the positions of the facial features are estimated correctly in 95% of the face images.
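The topology post-processing step can be illustrated with a minimal sketch. The checks below are hypothetical stand-ins for the paper's actual rules: candidates are assumed to be (x, y, score) tuples in image coordinates (y grows downward), and the layout constraints are simple plausibility tests, not the published method.

```python
def best_candidate(candidates):
    """Pick the detection with the highest likelihood-ratio score.

    candidates: list of (x, y, score) tuples for one facial feature.
    """
    return max(candidates, key=lambda c: c[2])

def topology_ok(left_eye, right_eye, nose, mouth):
    """Reject detections whose spatial layout is implausible for a face.

    Assumed constraints (illustrative): the left eye lies to the left of
    the right eye, and the eyes sit above the nose, which sits above the
    mouth (image y-axis points downward).
    """
    eye_y = (left_eye[1] + right_eye[1]) / 2
    return (left_eye[0] < right_eye[0]
            and eye_y < nose[1] < mouth[1])
```

A detection set that passes the topology check is kept; otherwise lower-scoring candidates can be tried, which is how such a step reduces false detections.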
On likelihood ratio tests
Likelihood ratio tests are intuitively appealing. Nevertheless, a number of examples are known in which they perform very poorly. The present paper discusses a large class of situations in which this is the case, and analyzes just how intuition misleads us; it also presents an alternative approach which in these situations is optimal.
Comment: Published at http://dx.doi.org/10.1214/074921706000000356 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org)
Transfer Entropy as a Log-likelihood Ratio
Transfer entropy, an information-theoretic measure of time-directed
information transfer between joint processes, has steadily gained popularity in
the analysis of complex stochastic dynamics in diverse fields, including the
neurosciences, ecology, climatology and econometrics. We show that for a broad
class of predictive models, the log-likelihood ratio test statistic for the
null hypothesis of zero transfer entropy is a consistent estimator for the
transfer entropy itself. For finite Markov chains, furthermore, no explicit
model is required. In the general case, an asymptotic chi-squared distribution
is established for the transfer entropy estimator. The result generalises the
equivalence in the Gaussian case of transfer entropy and Granger causality, a
statistical notion of causal influence based on prediction via vector
autoregression, and establishes a fundamental connection between directed
information transfer and causality in the Wiener-Granger sense.
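For finite Markov chains, where the abstract notes that no explicit model is required, the connection can be sketched with a plug-in estimator: the empirical transfer entropy TE(Y→X) is the average log-likelihood ratio between a predictive model of X that uses Y's past and one that does not. The sketch below assumes first-order histories and finite alphabets; the function name and interface are illustrative.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(Y -> X) with first-order histories.

    Estimates sum_{x',x,y} p(x',x,y) * log( p(x'|x,y) / p(x'|x) ) from
    empirical counts. Multiplied by 2N, this is the log-likelihood ratio
    statistic for the null hypothesis of zero transfer entropy, which is
    asymptotically chi-squared distributed under that null.
    """
    n = len(x) - 1
    c_xyz = Counter()  # counts of (x_{t+1}, x_t, y_t)
    c_xy = Counter()   # counts of (x_t, y_t)
    c_xz = Counter()   # counts of (x_{t+1}, x_t)
    c_x = Counter()    # counts of x_t
    for t in range(n):
        c_xyz[(x[t + 1], x[t], y[t])] += 1
        c_xy[(x[t], y[t])] += 1
        c_xz[(x[t + 1], x[t])] += 1
        c_x[x[t]] += 1
    te = 0.0
    for (xn, xt, yt), cnt in c_xyz.items():
        p_with_y = cnt / c_xy[(xt, yt)]        # p(x_{t+1} | x_t, y_t)
        p_without_y = c_xz[(xn, xt)] / c_x[xt]  # p(x_{t+1} | x_t)
        te += (cnt / n) * math.log(p_with_y / p_without_y)
    return te
```

When x is driven by y (e.g. x copies y with a one-step lag), the estimate approaches the entropy rate of the driving signal; for independent sequences it is near zero, up to the small positive bias of the plug-in estimator.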
