19 research outputs found

    Binary Biometrics: An Analytic Framework to Estimate the Bit Error Probability under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as helper data systems, fuzzy extractors, the fuzzy vault, and cancellable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives and require a binary representation extracted from the real-valued biometric data. Hence, the similarity of biometric samples is measured in terms of the Hamming distance between the binary vectors obtained in the enrolment and verification phases. The number of errors depends on the expected error probability Pe of each bit between two biometric samples of the same subject. In this paper, we introduce a framework for analytically estimating Pe under the assumption that the within-class and between-class distributions can be modeled as Gaussian. We present the analytic expression of Pe as a function of the number of samples used in the enrolment (Ne) and verification (Nv) phases. The analytic expressions are validated using the FRGC v2 and FVC2000 biometric databases.
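The bit-error mechanism described in this abstract can be sketched numerically. The following is a minimal illustration, not the paper's actual framework: it assumes a single feature binarized by its sign, within-class noise N(mu, sigma_w^2) around a subject mean mu drawn from a between-class distribution N(0, sigma_b^2), and compares a numerically integrated Pe against a Monte Carlo estimate. All parameter names here are invented for the sketch.

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def pe_analytic(sigma_b, sigma_w, Ne, Nv, grid=2001, span=6.0):
    """Average bit-error probability over subjects mu ~ N(0, sigma_b^2).

    Each bit is the sign of the averaged feature; enrolment uses Ne
    samples and verification Nv, so the averages have std sigma_w/sqrt(N).
    The expectation over mu is taken by simple numerical integration."""
    se = sigma_w / math.sqrt(Ne)
    sv = sigma_w / math.sqrt(Nv)
    total, weight = 0.0, 0.0
    for i in range(grid):
        mu = -span * sigma_b + 2.0 * span * sigma_b * i / (grid - 1)
        w = math.exp(-0.5 * (mu / sigma_b) ** 2)  # unnormalised Gaussian weight
        p_neg_e = phi(-mu / se)                   # P(enrolment bit = 0 | mu)
        p_neg_v = phi(-mu / sv)                   # P(verification bit = 0 | mu)
        pe = p_neg_e * (1 - p_neg_v) + (1 - p_neg_e) * p_neg_v
        total += w * pe
        weight += w
    return total / weight

def pe_monte_carlo(sigma_b, sigma_w, Ne, Nv, trials=200_000, seed=1):
    """Monte Carlo check of the same quantity."""
    rng = random.Random(seed)
    errs = 0
    for _ in range(trials):
        mu = rng.gauss(0.0, sigma_b)
        e = sum(rng.gauss(mu, sigma_w) for _ in range(Ne)) / Ne
        v = sum(rng.gauss(mu, sigma_w) for _ in range(Nv)) / Nv
        errs += (e < 0) != (v < 0)
    return errs / trials
```

As the abstract predicts, increasing Ne or Nv suppresses noise in the averaged feature, so the estimated Pe drops.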

    Multi-bits biometric string generation based on the likelihood ratio

    Preserving the privacy of biometric information stored in biometric systems is becoming a key issue. An important element in privacy-protecting biometric systems is the quantizer, which transforms a normal biometric template into a binary string. In this paper, we present a user-specific quantization method based on a likelihood-ratio approach (LQ). The bits generated from every feature are concatenated to form a fixed-length binary string that can be hashed to protect its privacy. Experiments are carried out on both fingerprint data (FVC2000) and face data (FRGC). Results show that the proposed quantization method achieves reasonably good performance in terms of FAR/FRR (at an FAR of 10−4, the corresponding FRRs are 16.7% and 5.77% for FVC2000 and FRGC, respectively).
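A likelihood-ratio quantizer of this kind can be sketched for one Gaussian feature. This is an illustrative single-bit version under stated assumptions, not the paper's LQ method: the genuine distribution is N(mu, sigma_w^2), the impostor (background) distribution is N(0, sigma_b^2) with sigma_w < sigma_b, and the user-specific quantization cell is the interval where the likelihood ratio exceeds a threshold t, which reduces to solving a quadratic inequality.

```python
import math

def genuine_interval(mu, sigma_w, sigma_b, t):
    """Interval where LR(x) = p_genuine(x) / p_impostor(x) >= t.

    log LR(x) = log(sigma_b/sigma_w) - (x-mu)^2/(2 sigma_w^2) + x^2/(2 sigma_b^2),
    so LR(x) >= t is a quadratic a*x^2 + b*x + c >= 0 with a < 0 when
    sigma_w < sigma_b, i.e. the solution is the interval between the roots."""
    a = 1.0 / (2.0 * sigma_b ** 2) - 1.0 / (2.0 * sigma_w ** 2)
    b = mu / sigma_w ** 2
    c = -mu ** 2 / (2.0 * sigma_w ** 2) + math.log(sigma_b / sigma_w) - math.log(t)
    disc = b * b - 4.0 * a * c
    if disc <= 0:
        raise ValueError("no interval at this threshold")
    r1 = (-b + math.sqrt(disc)) / (2.0 * a)
    r2 = (-b - math.sqrt(disc)) / (2.0 * a)
    return min(r1, r2), max(r1, r2)

def quantize_bit(x, interval):
    """1 if the verification feature falls in the user's genuine cell."""
    lo, hi = interval
    return 1 if lo <= x <= hi else 0
```

Per-feature bits obtained this way would then be concatenated into the fixed-length string that the abstract describes; a multi-bit version would further partition the remaining mass into additional cells.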

    Key Generation from Multibiometric System Using Meerkat Algorithm

    Biometric traits lack revocability and privacy, while cryptography alone cannot confirm the user’s identity. By deriving cryptographic keys from biometrics, one obtains revocability, assurance of the user’s identity, and privacy. A multi-biometric cryptographic key generation approach is proposed: since a person’s eye and ear are uncorrelated with each other, they are treated as two independent biometrics and combined in the system. The encryption keys are produced using a swarm-intelligence approach; swarm intelligence is the emergent collective intelligence of groups of simple autonomous agents. The Meerkat Clan Key Generation Algorithm (MCKGA) generates a key stream for encrypting the plaintext, reducing and distributing the number of keys. In testing the system, it was found that keys produced from eye features are better than keys produced from ear features. The advantages of the approach are the generation of strong, unique keys from users’ biometric data using MCKGA, together with fast and accurate key generation.

    Pitfall of the Detection Rate Optimized Bit Allocation within template protection and a remedy

    One of the requirements of a biometric template protection system is that the protected template ideally should not leak any information about the biometric sample or its derivatives. In the literature, several proposed template protection techniques are based on binary vectors; hence, they require the extraction of a binary representation from the real-valued biometric sample. In this work we focus on the Detection Rate Optimized Bit Allocation (DROBA) quantization scheme, which extracts multiple bits per feature component while maximizing the overall detection rate. The allocation strategy has to be stored as auxiliary data for reuse in the verification phase and is considered public. This implies that the auxiliary data should not leak any information about the extracted binary representation. Our experiments show that the original DROBA algorithm, as known in the literature, creates auxiliary data that leaks a significant amount of information. We show how an adversary is able to exploit this information and significantly increase its success rate of obtaining a false accept. Fortunately, the information leakage can be mitigated by restricting the allocation freedom of the DROBA algorithm. We propose a method based on population statistics and empirically illustrate its effectiveness. All experiments are based on the MCYT fingerprint database using two different texture-based feature extraction algorithms.
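The bit-allocation idea behind DROBA can be sketched with a greedy search. This is a simplified illustration, not the published DROBA algorithm or the paper's remedy: it assumes each feature is cut into 2^bits equal-width cells on a fixed range [-r, r], that the enrolled mean sits at its cell centre, and it hands out a fixed bit budget one bit at a time to whichever feature's detection rate shrinks least.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def detection_rate(sigma, bits, r=3.0):
    """P(a genuine sample with noise std sigma stays in its 2**bits
    equal-width cell on [-r, r]), idealised: mean at the cell centre."""
    if bits == 0:
        return 1.0
    half = r / (2 ** bits)            # half of the cell width
    return 2.0 * phi(half / sigma) - 1.0

def droba_allocate(sigmas, budget, max_bits=3):
    """Greedy allocation: give each next bit to the feature whose overall
    detection rate (a product over features) shrinks the least."""
    alloc = [0] * len(sigmas)
    for _ in range(budget):
        best, best_ratio = None, -1.0
        for i, s in enumerate(sigmas):
            if alloc[i] >= max_bits:
                continue
            ratio = detection_rate(s, alloc[i] + 1) / detection_rate(s, alloc[i])
            if ratio > best_ratio:
                best, best_ratio = i, ratio
        alloc[best] += 1
    return alloc
```

Reliable (low-noise) features absorb most of the budget, which is exactly why the resulting allocation table, stored publicly as auxiliary data, can reveal which bits of the binary representation are the most predictable — the leakage this paper analyses.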

    Binary Biometric Representation through Pairwise Adaptive Phase Quantization

    Extracting binary strings from real-valued biometric templates is a fundamental step in template compression and protection systems, such as fuzzy commitment, fuzzy extractors, secure sketches, and helper data systems. Quantization and coding are the straightforward way to extract binary representations from arbitrary real-valued biometric modalities. In this paper, we propose a pairwise adaptive phase quantization (APQ) method, together with a long-short (LS) pairing strategy, which aims to maximize the overall detection rate. Experimental results on the FVC2000 fingerprint and the FRGC face database show reasonably good verification performance.
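Phase quantization of a feature pair can be sketched as follows. This is an illustrative reading of the idea, not the paper's APQ/LS method: two features are treated as a 2-D vector whose angle is quantized into 2^bits equal sectors, and the pairing step here simply pairs the largest-magnitude feature with the smallest, the second largest with the second smallest, and so on.

```python
import math

def phase_bits(f1, f2, bits=2):
    """Quantize the phase of the pair (f1, f2) into 2**bits equal sectors."""
    angle = math.atan2(f2, f1) % (2.0 * math.pi)
    sector = int(angle / (2.0 * math.pi / (1 << bits)))
    return min(sector, (1 << bits) - 1)   # guard the angle == 2*pi edge case

def long_short_pairs(features):
    """Pair features by magnitude: longest with shortest, and so on."""
    order = sorted(range(len(features)), key=lambda i: abs(features[i]))
    return [(features[order[-1 - k]], features[order[k]])
            for k in range(len(order) // 2)]

def binary_string(features, bits=2):
    """Concatenate the per-pair sector indices into one binary string."""
    return "".join(format(phase_bits(f1, f2, bits), "0{}b".format(bits))
                   for f1, f2 in long_short_pairs(features))
```

Pairing a long (reliable) component with a short one stabilises the resulting angle, which is the intuition the LS strategy exploits to raise the detection rate.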

    BUILD CRYPTOGRAPHIC SYSTEM FROM MULTI-BIOMETRICS USING MEERKAT ALGORITHM

    This paper presents a proposal for investigating ways to use features extracted from a user’s biometrics, rather than a memorable password or passphrase, to produce new, random cipher keys. Human users find it difficult to remember long cipher keys. Therefore, the proposed work takes the eye and ear as multi-biometric feature sources for generating cryptographic keys. The Meerkat Clan Key Generation Algorithm (MCKGA) is used for key generation: first, 128-bit keys are generated; the method is then extended to 256 bits; and finally the keys produced from the eye and ear are mixed to obtain a robust 512-bit key. These keys are checked with the NIST statistical test suite to ensure the randomness of the keys used in the encryption process. The approach generates unique keys for a cryptographic system using the Advanced Encryption Standard (AES) algorithm.
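The mixing-and-derivation step described above (eye and ear features combined into a 512-bit key) can be sketched generically. This is a hash-based stand-in under invented names, not the MCKGA swarm search itself: features are binarized by thresholding, the two modality strings are interleaved so both influence every region of the key, and the result is stretched to the target length with SHA-256.

```python
import hashlib

def features_to_bits(features, threshold=0.0):
    """Binarize real-valued features by thresholding (an illustrative
    stand-in for the paper's feature extraction)."""
    return "".join("1" if f > threshold else "0" for f in features)

def derive_key(eye_features, ear_features, bits=512):
    """Mix the two biometric bit strings and hash to a fixed-length key.
    Only the mixing/derivation step is sketched; the MCKGA swarm
    optimisation is not reproduced here."""
    eye = features_to_bits(eye_features)
    ear = features_to_bits(ear_features)
    # Interleave so both modalities influence every region of the key.
    mixed = "".join(a + b for a, b in zip(eye, ear))
    digest = b""
    counter = 0
    while len(digest) * 8 < bits:
        digest += hashlib.sha256(mixed.encode() + bytes([counter])).digest()
        counter += 1
    return digest[: bits // 8]
```

The 512-bit output could then seed an AES key schedule (e.g. two 256-bit AES keys), matching the key lengths the abstract reports.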

    Modified shielding function for multi-biometric authentication and template protection / Abayomi Jegede... [et al.]

    Biometrics provides a secure means of authentication because it is difficult to copy, forge, or steal biometric modalities. However, unprotected biometric data can be used to violate the security of the authentication system and the privacy of legitimate users. This paper proposes and implements a modified shielding function which provides multi-biometric authentication, template security, and user privacy simultaneously. Experimental results based on face and iris datasets obtained from the CASIA Near Infra-Red face database and CASIA Iris database version 2, respectively, show that the approach has good recognition accuracy (false rejection rate of 0.65% and false acceptance rate of 0.035%). Security analysis shows that the method provides better security (key length of 120 bits) and user privacy compared to previous approaches based on the generic shielding function.
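Shielding functions belong to the helper-data family of template protection. The sketch below shows a generic scheme in that family under a toy repetition code, not the paper's modified shielding function: at enrolment, a secret key is encoded and XOR-masked with the binary template, and only the mask plus a hash of the key are stored; at verification, a noisy template recovers the key if the bit errors stay within the code's correction capacity.

```python
import hashlib

REP = 5  # repetition factor: each secret bit spans REP template bits

def encode(secret_bits):
    """Repetition-encode the secret key."""
    return [b for b in secret_bits for _ in range(REP)]

def decode(bits):
    """Majority-vote decode back to the secret key."""
    return [1 if sum(bits[i * REP:(i + 1) * REP]) > REP // 2 else 0
            for i in range(len(bits) // REP)]

def enroll(template, secret_bits):
    """Store helper data and a hash of the key; neither reveals the template."""
    code = encode(secret_bits)
    helper = [t ^ c for t, c in zip(template, code)]
    pseudo_id = hashlib.sha256(bytes(secret_bits)).hexdigest()
    return helper, pseudo_id

def verify(noisy_template, helper, pseudo_id):
    """Accept iff the key recovered from the noisy template hashes correctly."""
    recovered = decode([t ^ h for t, h in zip(noisy_template, helper)])
    return hashlib.sha256(bytes(recovered)).hexdigest() == pseudo_id
```

Up to two flipped bits per 5-bit group are corrected, so a moderately noisy sample still authenticates while the stored data never exposes the raw template.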

    Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and false acceptance rate. A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase. The noise is suppressed, hence reducing the number of bit errors and decreasing the HD. In practice, the number of samples is empirically chosen without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples being used in the enrollment and the verification phase. The detection error tradeoff (DET) curve that combines the false acceptance and false rejection rates is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
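The link between per-bit error probability and the DET curve can be sketched with a simple binomial model. This is an illustration under idealised assumptions, not the paper's framework: the L bits are taken as independent, a genuine comparison flips each bit with probability pe_genuine, an impostor comparison with probability 0.5, and a sample is accepted when the Hamming distance does not exceed the threshold.

```python
from math import comb

def binom_cdf(n, p, k):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1.0 - p) ** (n - i) for i in range(k + 1))

def far_frr(n_bits, pe_genuine, threshold, pe_impostor=0.5):
    """FAR/FRR when accepting iff the Hamming distance <= threshold."""
    frr = 1.0 - binom_cdf(n_bits, pe_genuine, threshold)  # genuine rejected
    far = binom_cdf(n_bits, pe_impostor, threshold)       # impostor accepted
    return far, frr

def det_curve(n_bits, pe_genuine):
    """(FAR, FRR) pairs obtained by sweeping the decision threshold."""
    return [far_frr(n_bits, pe_genuine, t) for t in range(n_bits + 1)]
```

Sweeping the threshold trades FAR against FRR exactly as the abstract describes; capping the threshold at an ECC's correction capacity truncates the reachable part of this curve.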

    A Blind Signature Scheme using Biometric Feature Value

    Blind signatures form one of the most appealing research fields of public-key cryptography, through which authenticity, data integrity, and non-repudiation can be verified. Our research is based on blind signature schemes built on two hard problems: integer factorization and the discrete logarithm problem. Here, biometric information such as fingerprints, iris, retina, DNA, tissue, or any other feature unique to an individual is embedded into the private key, generating a cryptographic key pair consisting of the private and public keys of the public-key cryptosystem. Since biometric information is personal identification data, it should be treated as a personal secret key for the system. In this scheme, an attacker who intends to reveal the private key from the public key has to solve both hard problems, i.e., recover the private key, which is one part of the cryptographic key, and the biometric information incorporated in it. We generate a cryptographic key from biometric data, called a biometric cryptographic key, use that key to sign a document, and then use the signature to verify the authenticity and integrity of the original message. The security of the scheme also rests on its mathematical underpinnings, such as modular arithmetic and quadratic residues.
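The blinding mechanics underlying such schemes can be sketched with Chaum's classic RSA blind signature. This shows only the blind/sign/unblind/verify flow, not the paper's two-hard-problem construction or its biometric key embedding, and the toy modulus below is far too small for real use.

```python
import hashlib

# Toy RSA parameters, for illustration only (10007 and 10009 are prime).
P, Q = 10007, 10009
N = P * Q
E = 65537
D = pow(E, -1, (P - 1) * (Q - 1))   # private exponent

def blind(message, r):
    """Requester blinds the hashed message with a random factor r (gcd(r, N) = 1)."""
    m = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return (m * pow(r, E, N)) % N

def sign_blinded(blinded):
    """Signer signs without ever seeing the message digest m."""
    return pow(blinded, D, N)

def unblind(blind_sig, r):
    """Requester strips the blinding factor: (m^d * r) * r^-1 = m^d mod N."""
    return (blind_sig * pow(r, -1, N)) % N

def verify(message, signature):
    """Anyone checks the signature against the public key (N, E)."""
    m = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == m
```

In the scheme this abstract describes, the private exponent would additionally be derived from the user's biometric cryptographic key, so forging a signature requires both solving the hard problems and possessing the biometric data.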