236 research outputs found

    Pitfall of the Detection Rate Optimized Bit Allocation within template protection and a remedy

    One of the requirements of a biometric template protection system is that the protected template should ideally not leak any information about the biometric sample or its derivatives. In the literature, several proposed template protection techniques are based on binary vectors and hence require the extraction of a binary representation from the real-valued biometric sample. In this work we focus on the Detection Rate Optimized Bit Allocation (DROBA) quantization scheme, which extracts multiple bits per feature component while maximizing the overall detection rate. The allocation strategy has to be stored as auxiliary data for reuse in the verification phase and is considered public. This implies that the auxiliary data should not leak any information about the extracted binary representation. Our experiments show that the original DROBA algorithm, as known in the literature, creates auxiliary data that leaks a significant amount of information. We show how an adversary can exploit this information to significantly increase its success rate in obtaining a false accept. Fortunately, the information leakage can be mitigated by restricting the allocation freedom of the DROBA algorithm. We propose a method based on population statistics and empirically illustrate its effectiveness. All experiments are based on the MCYT fingerprint database using two different texture-based feature extraction algorithms.
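
    For readers unfamiliar with DROBA, the following is a minimal sketch (not the authors' code) of the core bit-allocation step the abstract refers to: a dynamic program that spends a total bit budget over feature components so that the product of per-component detection probabilities is maximized. The function name `droba_allocate` and the `delta[i][b]` table of pre-estimated detection probabilities are assumptions for illustration; the abstract's remedy of restricting the allocation freedom (e.g. via population statistics) is not implemented here.

```python
import math

def droba_allocate(delta, total_bits):
    """Dynamic-programming sketch of DROBA-style bit allocation (illustrative only).

    delta[i][b] is the estimated detection probability of feature component i when b bits
    are extracted from it (delta[i][0] == 1.0). The returned allocation sums to total_bits
    and maximizes the overall detection rate prod_i delta[i][b[i]].
    """
    n = len(delta)
    NEG = float("-inf")
    dp = [0.0] + [NEG] * total_bits               # best log detection rate per bits spent
    choice = [[0] * (total_bits + 1) for _ in range(n)]
    for i in range(n):
        new_dp = [NEG] * (total_bits + 1)
        for spent in range(total_bits + 1):
            if dp[spent] == NEG:
                continue
            for b in range(min(len(delta[i]) - 1, total_bits - spent) + 1):
                if delta[i][b] <= 0.0:
                    continue
                val = dp[spent] + math.log(delta[i][b])
                if val > new_dp[spent + b]:
                    new_dp[spent + b] = val
                    choice[i][spent + b] = b
        dp = new_dp
    alloc, spent = [0] * n, total_bits            # backtrack the winning allocation
    for i in range(n - 1, -1, -1):
        alloc[i] = choice[i][spent]
        spent -= alloc[i]
    return alloc

# toy usage: three features, at most 2 bits each, 4 bits in total -> prints [1, 2, 1]
print(droba_allocate([[1.0, 0.9, 0.5], [1.0, 0.8, 0.7], [1.0, 0.6, 0.1]], 4))
```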

    Constructing practical Fuzzy Extractors using QIM

    Fuzzy extractors are a powerful tool to extract randomness from noisy data. A fuzzy extractor can extract randomness only if the source data is discrete, while in practice source data is often continuous. Using quantizers to transform continuous data into discrete data is a commonly used solution. However, as far as we know, no study has been made of the effect of the quantization strategy on the performance of fuzzy extractors. We construct the encoding and decoding functions of a fuzzy extractor using quantization index modulation (QIM) and express the properties of this fuzzy extractor in terms of the parameters of the underlying QIM. We present and analyze a two-dimensional construction that is optimal in the sense of embedding rate. Our 6-hexagonal tiling construction offers (log2 6)/2 - 1 ≈ 0.3 extra bits per dimension of the space compared to the known square-quantization-based fuzzy extractor.
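
    As a point of reference, here is a minimal sketch of the square-quantizer (one bit per dimension) QIM baseline that the hexagonal construction is compared against; it is an assumed illustration, not the paper's construction, and the hexagonal tiling would instead pack (log2 6)/2 bits per dimension. The helper data stores only the offset to the nearest point of the secret-indexed coset.

```python
import random

DELTA = 1.0  # quantization step; the secret bit is recovered exactly while |noise| < DELTA/4

def qim_embed(x, s, delta=DELTA):
    """Enrollment: quantize the continuous feature x onto the coset {k*delta + s*delta/2}
    selected by the secret bit s, and publish only the offset as helper data."""
    q = round((x - s * delta / 2) / delta) * delta + s * delta / 2
    return q - x  # helper data (public)

def qim_extract(x_noisy, helper, delta=DELTA):
    """Verification: shift the noisy feature by the helper and read off the closest coset."""
    y = x_noisy + helper
    d0 = abs(y - round(y / delta) * delta)                               # nearest coset-0 point
    d1 = abs(y - (round((y - delta / 2) / delta) * delta + delta / 2))   # nearest coset-1 point
    return 0 if d0 <= d1 else 1

# toy usage: one feature dimension, small additive verification noise
x = random.gauss(0.0, 3.0)
s = random.randint(0, 1)
w = qim_embed(x, s)
print(qim_extract(x + random.gauss(0.0, 0.05), w) == s)  # True while the noise stays below DELTA/4
```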

    Biometric security on body sensor networks


    THRIVE: Threshold Homomorphic encryption based secure and privacy preserving bIometric VErification system

    In this paper, we propose a new biometric verification and template protection system, which we call the THRIVE system. The system includes novel enrollment and authentication protocols based on a threshold homomorphic cryptosystem in which the private key is shared between the user and the verifier. In the THRIVE system, only encrypted binary biometric templates are stored in the database, and verification is performed via homomorphically randomized templates, so the original templates are never revealed during authentication. The THRIVE system is designed for the malicious model, where a cheating party may arbitrarily deviate from the protocol specification. Since a threshold homomorphic encryption scheme is used, a malicious database owner cannot decrypt the encrypted templates of the users in the database. Therefore, the security of the THRIVE system is enhanced by a two-factor authentication scheme involving the user's private key and the biometric data. We prove the security and privacy preservation capability of the proposed system in the simulation-based model with no assumptions. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her physical presence by using biometrics. The system can be used with any biometric modality and any biometric feature extraction scheme whose output templates can be binarized. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biohash vectors on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link. Consequently, the proposed system can be efficiently used in real-life applications.
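
    To make the "verification on encrypted templates" idea concrete, below is a minimal sketch, assuming a toy non-threshold Paillier cryptosystem as a stand-in, of how an additively homomorphic scheme lets a verifier compute the Hamming distance between a stored encrypted binary template and a fresh plaintext probe without ever decrypting individual template bits. The class and function names are illustrative; the actual THRIVE protocol additionally threshold-shares the decryption key between user and verifier, randomizes the templates homomorphically, and defends against malicious behaviour, none of which is shown here.

```python
import math, random

class ToyPaillier:
    """Tiny additively homomorphic Paillier scheme (demo-sized primes, no real security)."""
    def __init__(self, p=293, q=433):
        self.n, self.n2 = p * q, (p * q) ** 2
        self.g = self.n + 1
        self.lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)          # lcm(p-1, q-1)
        self.mu = pow((pow(self.g, self.lam, self.n2) - 1) // self.n, -1, self.n)

    def enc(self, m):
        r = random.randrange(2, self.n)
        while math.gcd(r, self.n) != 1:
            r = random.randrange(2, self.n)
        return pow(self.g, m, self.n2) * pow(r, self.n, self.n2) % self.n2

    def dec(self, c):
        return (pow(c, self.lam, self.n2) - 1) // self.n * self.mu % self.n

def encrypted_hamming(enc_template, probe_bits, he):
    """Verifier side: Enc(Hamming distance) between an encrypted enrolled template and a
    plaintext probe, using Enc(t XOR p) = Enc(t) if p = 0, else Enc(1) * Enc(t)^(-1)."""
    acc = he.enc(0)
    for c_t, p in zip(enc_template, probe_bits):
        term = c_t if p == 0 else he.enc(1) * pow(c_t, -1, he.n2) % he.n2
        acc = acc * term % he.n2          # additively accumulates the XORed bits
    return acc

# toy usage with short binary templates (real biohash vectors are e.g. 256 bits)
he = ToyPaillier()
template = [random.randint(0, 1) for _ in range(16)]
probe = [b if random.random() > 0.2 else 1 - b for b in template]   # noisy re-measurement
enc_template = [he.enc(b) for b in template]
dist = he.dec(encrypted_hamming(enc_template, probe, he))
print(dist, sum(t != p for t, p in zip(template, probe)))           # the two values match
```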

    On the performance of helper data template protection schemes

    The use of biometrics looks promising, as it is already being applied in electronic passports (ePassports) on a global scale. Because the biometric data has to be stored as a reference template on either a central or personal storage device, its widespread use introduces new security and privacy risks such as (i) identity fraud, (ii) cross-matching, (iii) irrevocability and (iv) leaking sensitive medical information. Mitigating these risks is essential to obtain acceptance from the subjects of biometric systems and thereby facilitate successful implementation on a large scale. A solution to mitigate these risks is to use template protection techniques. The required protection properties of the stored reference template according to ISO guidelines are (i) irreversibility, (ii) renewability and (iii) unlinkability. A known template protection scheme is the helper data system (HDS). The fundamental principle of the HDS is to bind a key to the biometric sample with use of helper data and cryptography, such that the key can be reproduced or released given another biometric sample of the same subject. The identity check is then performed in a secure way by comparing the hash of the key. Hence, the size of the key determines the amount of protection. This thesis extensively investigates the HDS, namely (i) the theoretical classification performance, (ii) the maximum key size, (iii) the irreversibility and unlinkability properties, and (iv) the optimal multi-sample and multi-algorithm fusion method. The theoretical classification performance of the biometric system is determined by assuming that the features extracted from the biometric sample are Gaussian distributed. With this assumption we investigate the influence of the bit extraction scheme on the classification performance. Using the theoretical framework, the maximum size of the key is determined by assuming the error-correcting code to operate on Shannon's bound. We also show three vulnerabilities of the HDS that affect the irreversibility and unlinkability properties and propose solutions. Finally, we study the optimal level of applying multi-sample and multi-algorithm fusion with the HDS at either feature, score, or decision level.
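
    The key-binding principle described above can be sketched in a few lines. The following is an assumed, fuzzy-commitment-style illustration of an HDS round trip, using a simple repetition code with majority voting in place of the near-Shannon-bound error-correcting codes the thesis analyzes; the function names and parameters are hypothetical.

```python
import hashlib, random

REP = 7  # repetition-code factor: each key bit is spread over REP template bits

def enroll(key_bits, bio_bits):
    """Bind a freshly drawn key to the enrollment sample: helper = ECC(key) XOR biometric.
    Only the helper data and the hash of the key are stored as the protected template."""
    codeword = [b for b in key_bits for _ in range(REP)]        # repetition encoding
    helper = [c ^ x for c, x in zip(codeword, bio_bits)]
    return helper, hashlib.sha256(bytes(key_bits)).hexdigest()

def verify(helper, key_hash, bio_bits):
    """Release the key from a fresh, noisy sample and compare hashes."""
    noisy_code = [h ^ x for h, x in zip(helper, bio_bits)]
    key = [int(2 * sum(noisy_code[i:i + REP]) > REP)            # majority decoding
           for i in range(0, len(noisy_code), REP)]
    return hashlib.sha256(bytes(key)).hexdigest() == key_hash

# toy usage: a 64-bit key bound to a 448-bit binary template, 5% verification bit errors
key = [random.randint(0, 1) for _ in range(64)]
enrollment_bits = [random.randint(0, 1) for _ in range(64 * REP)]
probe_bits = [b if random.random() > 0.05 else 1 - b for b in enrollment_bits]
helper, key_hash = enroll(key, enrollment_bits)
print(verify(helper, key_hash, probe_bits))  # True when every 7-bit block has at most 3 flipped bits
```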

    Polar Coding for Secret-Key Generation

    Practical implementations of secret-key generation are often based on sequential strategies, which handle reliability and secrecy in two successive steps, called reconciliation and privacy amplification. In this paper, we propose an alternative approach based on polar codes that jointly deals with reliability and secrecy. Specifically, we propose secret-key capacity-achieving polar coding schemes for the following models: (i) the degraded binary memoryless source (DBMS) model with rate-unlimited public communication, (ii) the DBMS model with one-way rate-limited public communication, (iii) the 1-to-m broadcast model and (iv) the Markov tree model with uniform marginals. For models (i) and (ii), our coding schemes remain valid for non-degraded sources, although they may not achieve the secret-key capacity. For models (i), (ii) and (iii), our schemes rely on a pre-shared secret seed of negligible rate; however, we provide special cases of these models for which no seed is required. Finally, we show an application of our results to secrecy and privacy for biometric systems. We thus provide the first examples of low-complexity secret-key capacity-achieving schemes that are able to handle vector quantization for model (ii), or multiterminal communication for models (iii) and (iv).
    Comment: 26 pages, 9 figures, accepted to IEEE Transactions on Information Theory; parts of the results were presented at the 2013 IEEE Information Theory Workshop.
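
    For context, the sketch below illustrates only the sequential baseline the abstract contrasts against, not the paper's joint polar-code construction: syndrome-based reconciliation followed by privacy amplification. A Hamming(7,4) code and a truncated SHA-256 hash are assumed stand-ins for the error-correcting code and the 2-universal hash family a real scheme would use.

```python
import hashlib, random

# columns of the (7,4) Hamming parity-check matrix: the 3-bit binary index 1..7
H_COLS = [(i >> 2 & 1, i >> 1 & 1, i & 1) for i in range(1, 8)]

def syndrome(block):
    """3-bit syndrome of a 7-bit block under the Hamming(7,4) parity-check matrix."""
    return tuple(sum(b * c[j] for b, c in zip(block, H_COLS)) % 2 for j in range(3))

def reconcile(alice_bits, bob_bits):
    """Reconciliation: Alice publicly sends one syndrome per 7-bit block; Bob corrects
    his block whenever it differs from Alice's in at most one position."""
    corrected = list(bob_bits)
    for i in range(0, len(alice_bits) - 6, 7):
        s_a = syndrome(alice_bits[i:i + 7])
        s_b = syndrome(corrected[i:i + 7])
        diff = tuple(a ^ b for a, b in zip(s_a, s_b))
        if diff != (0, 0, 0):
            pos = (diff[0] << 2) | (diff[1] << 1) | diff[2]   # error position 1..7 in the block
            corrected[i + pos - 1] ^= 1
    return corrected

def privacy_amplify(bits, out_bytes=16):
    """Privacy amplification stand-in: hash the reconciled string and truncate, discarding
    the entropy leaked by the public syndromes."""
    return hashlib.sha256(bytes(bits)).digest()[:out_bytes]

# toy usage: correlated binary observations with ~1% disagreement
alice = [random.randint(0, 1) for _ in range(7 * 32)]
bob = [b if random.random() > 0.01 else 1 - b for b in alice]
bob_reconciled = reconcile(alice, bob)
print(privacy_amplify(alice) == privacy_amplify(bob_reconciled))  # True when no block has more than one disagreement
```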