
    Information Theoretical Analysis of the Uniqueness of Iris Biometrics

    With the rapid globalization of technology, a more reliable and secure method of online authentication is needed. This can be achieved by using each individual's distinctive biometric identifiers, such as the face, iris, fingerprint, or palmprint; however, there is a bound to the uniqueness of each identifier and, consequently, a limit to the number of classes a biometric recognition system can sustain before false matches occur. Knowing the maximum population that a biometric modality can uniquely represent is therefore more important than ever. To address this general problem, we measure the uniqueness of iris biometrics. The measure of iris uniqueness was first introduced by John Daugman in 2003, and its analysis has remained an open research problem since. Daugman defines uniqueness as the ability to enroll more and more classes into a recognition system while the probability of collision among the classes remains fixed and near zero. Because of errors introduced while collecting these datasets (such as occlusions, illumination conditions, camera noise, motion, and out-of-focus blur) and quality degradation from any signal processing of the iris data, even the highest-quality datasets will not reach a perfectly zero probability of collision. We therefore appeal to techniques from information theory to find the maximum population the system can support while also measuring the quality of the iris data in the datasets themselves. This work develops two new techniques for finding the maximum population of an iris database: one that finds the limitations of Daugman's widely accepted IrisCode, and a new methodology that leverages the raw iris data. First, Daugman's IrisCode is defined as a binary template representing each independent class in the database. Assuming a one-to-one encoding is available that maps the IrisCode of each class to a new binary codeword, with length determined by the degrees of freedom inferred from the distribution of distances between each pair of independent class IrisCodes, we appeal to rate-distortion theory (the limits of error-correcting codes) to establish bounds on the maximum population the IrisCode algorithm can sustain, using the minimum Hamming distance (HD) between codewords as a quality metric. Our second approach uses an autoregressive (AR) model to estimate each iris class's distinctive power spectral density and then assumes a similar one-to-one mapping of each iris class to a unique Gaussian codeword. A Gaussian sphere-packing bound is invoked to obtain the maximum population of the dataset and to measure iris quality as a function of the noise present in the data. A further bound, the Daugman-like bound, uses the relative entropy between class models as a distance metric, analogous to Hamming distance, to find the maximum population for a fixed recognition error. With these two approaches, we hope to help researchers understand the limitations of their recognition systems given the quality of their iris databases.
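    A minimal numeric sketch of the first idea (not this work's exact procedure): estimate the binomial degrees of freedom from impostor Hamming-distance statistics as in Daugman's 2003 analysis, then apply a sphere-packing (Hamming) bound as a crude ceiling on how many classes a binary code of that length can keep apart at a given minimum normalized distance. The impostor statistics below are illustrative placeholders, not values from this work.

        # Degrees of freedom from impostor Hamming-distance statistics (Daugman, 2003),
        # followed by a sphere-packing bound on the supportable population.
        from math import comb

        def degrees_of_freedom(mean_hd: float, std_hd: float) -> int:
            """Daugman's estimate N = p(1 - p) / sigma^2 from impostor scores."""
            return round(mean_hd * (1.0 - mean_hd) / std_hd ** 2)

        def sphere_packing_population(n_bits: int, min_normalized_hd: float) -> int:
            """Hamming bound on the number of length-n_bits binary codewords whose
            pairwise Hamming distance is at least min_normalized_hd * n_bits."""
            d_min = max(1, round(min_normalized_hd * n_bits))
            t = (d_min - 1) // 2                       # packing radius
            ball = sum(comb(n_bits, k) for k in range(t + 1))
            return (2 ** n_bits) // ball

        n = degrees_of_freedom(mean_hd=0.5, std_hd=0.0317)    # roughly 249 degrees of freedom
        print(n, sphere_packing_population(n, min_normalized_hd=0.32))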

    Characterization of Information Channels for Asymptotic Mean Stationarity and Stochastic Stability of Non-stationary/Unstable Linear Systems

    Stabilization of non-stationary linear systems over noisy communication channels is considered. Stochastically stable sources, and unstable but noise-free or bounded-noise systems, have been studied extensively in the information theory and control theory literature since the 1970s, with renewed interest in the past decade. There have also been studies on non-causal and causal coding of unstable/non-stationary linear Gaussian sources. In this paper, tight necessary and sufficient conditions for stochastic stabilizability of unstable (non-stationary), possibly multi-dimensional, linear systems driven by Gaussian noise over discrete channels (possibly with memory and feedback) are presented. The stochastic stability notions include recurrence, asymptotic mean stationarity and sample path ergodicity, and the existence of finite second moments. Our constructive proof uses random-time state-dependent stochastic drift criteria for the stabilization of Markov chains. For asymptotic mean stationarity (and thus sample path ergodicity), it is sufficient that the capacity of the channel be (strictly) greater than the sum of the logarithms of the unstable pole magnitudes, for memoryless channels and a class of channels with memory. This condition is also necessary under a mild technical condition. Sufficient conditions for the existence of finite average second moments for such systems driven by unbounded noise are also provided. Comment: To appear in IEEE Transactions on Information Theory
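    A small numeric sketch of the stabilizability condition quoted above: for asymptotic mean stationarity it suffices (and, under a mild technical condition, is necessary) that channel capacity strictly exceed the sum of the base-2 logarithms of the unstable pole magnitudes. The system matrix and capacity below are placeholders, not examples from the paper.

        import numpy as np

        def unstable_log_sum(A: np.ndarray) -> float:
            """Sum of log2|lambda_i| over eigenvalues of A with |lambda_i| > 1 (bits/sample)."""
            return float(sum(np.log2(abs(l)) for l in np.linalg.eigvals(A) if abs(l) > 1.0))

        def capacity_condition_holds(A: np.ndarray, capacity_bits: float) -> bool:
            """True iff the channel capacity strictly exceeds the unstable-pole log sum."""
            return capacity_bits > unstable_log_sum(A)

        A = np.array([[2.0, 1.0],
                      [0.0, 0.5]])               # one unstable mode (2.0), one stable (0.5)
        print(unstable_log_sum(A))               # 1.0 bit/sample
        print(capacity_condition_holds(A, capacity_bits=1.5))   # True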

    Information Nonanticipative Rate Distortion Function and Its Applications

    This paper investigates applications of the nonanticipative Rate Distortion Function (RDF) in (a) zero-delay Joint Source-Channel Coding (JSCC) design based on average and excess distortion probability, (b) bounding the Optimal Performance Theoretically Attainable (OPTA) by noncausal and causal codes, and (c) computing the Rate Loss (RL) of zero-delay and causal codes with respect to noncausal codes. These applications are described using two running examples: the Binary Symmetric Markov Source with parameter p, BSMS(p), and the multidimensional partially observed Gaussian-Markov source. For the multidimensional Gaussian-Markov source with square-error distortion, the solution of the nonanticipative RDF is derived, its operational meaning is demonstrated via a noisy JSCC coding theorem by providing the optimal encoding-decoding scheme over a vector Gaussian channel, and the RL of causal and zero-delay codes with respect to noncausal codes is computed. For the BSMS(p) with Hamming distortion, the solution of the nonanticipative RDF is derived, the RL of causal codes with respect to noncausal codes is computed, and an uncoded noisy coding theorem based on excess distortion probability is shown. The information nonanticipative RDF is shown to be equivalent to the nonanticipatory epsilon-entropy, which corresponds to the classical RDF with an additional causality or nonanticipative condition imposed on the optimal reproduction conditional distribution. Comment: 34 pages, 12 figures; part of this paper was accepted for publication in the IEEE International Symposium on Information Theory (ISIT) 2014 and in the book Coordination Control of Distributed Systems in the series Lecture Notes in Control and Information Sciences, 201
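    As a pointer to what the nonanticipative condition adds, the following is a sketch of the finite-horizon information nonanticipative RDF written as the classical RDF with a causality constraint on the reproduction kernel, following the abstract's description; the notation is illustrative and need not match the paper's.

        % Nonanticipative RDF: the classical RDF with the reproduction kernel restricted
        % to depend only on past reproductions and past-and-present source symbols.
        \[
          R^{\mathrm{na}}_{0,n}(D) \;=\;
          \inf_{\substack{P(dy^n \mid x^n)\,:\; P(dy_t \mid y^{t-1}, x^n) \,=\, P(dy_t \mid y^{t-1}, x^t)\ \forall t,\\[2pt]
                \mathbf{E}\bigl[\tfrac{1}{n+1}\sum_{t=0}^{n} d(x_t, y_t)\bigr] \,\le\, D}}
          \;\frac{1}{n+1}\, I(X^n ; Y^n).
        \]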