
    Polar Coding for Secret-Key Generation

    Practical implementations of secret-key generation are often based on sequential strategies, which handle reliability and secrecy in two successive steps, called reconciliation and privacy amplification. In this paper, we propose an alternative approach based on polar codes that jointly deals with reliability and secrecy. Specifically, we propose secret-key capacity-achieving polar coding schemes for the following models: (i) the degraded binary memoryless source (DBMS) model with rate-unlimited public communication, (ii) the DBMS model with one-way rate-limited public communication, (iii) the 1-to-m broadcast model, and (iv) the Markov tree model with uniform marginals. For models (i) and (ii) our coding schemes remain valid for non-degraded sources, although they may not achieve the secret-key capacity. For models (i), (ii) and (iii), our schemes rely on a pre-shared secret seed of negligible rate; however, we provide special cases of these models for which no seed is required. Finally, we show an application of our results to secrecy and privacy for biometric systems. We thus provide the first examples of low-complexity secret-key capacity-achieving schemes that are able to handle vector quantization for model (ii), or multiterminal communication for models (iii) and (iv). Comment: 26 pages, 9 figures, accepted to IEEE Transactions on Information Theory; parts of the results were presented at the 2013 IEEE Information Theory Workshop.
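    For context, the sequential baseline that the abstract contrasts with can be sketched in a few lines. This is a minimal toy, not the paper's scheme: a rate-1/3 repetition code stands in for a real error-correcting code in the reconciliation step, and truncated SHA-256 stands in for a 2-universal hash in the privacy-amplification step; all parameters are illustrative assumptions.

```python
import hashlib
import random

# Toy sequential pipeline: (1) reconciliation via a code-offset helper
# message, (2) privacy amplification by hashing. Illustrative only.

R = 3  # repetition factor of the toy code

def majority_decode(bits):
    return [int(sum(bits[i:i + R]) > R // 2) for i in range(0, len(bits), R)]

def reconcile(x, y):
    # Alice masks her source bits with a random codeword and publishes the
    # offset; Bob unmasks with his correlated bits and decodes away the noise.
    message = [random.randint(0, 1) for _ in range(len(x) // R)]
    codeword = [b for b in message for _ in range(R)]
    helper = [a ^ c for a, c in zip(x, codeword)]        # public message
    noisy_codeword = [b ^ h for b, h in zip(y, helper)]  # codeword + noise
    return message, majority_decode(noisy_codeword)

def privacy_amplify(bits, out_bytes=16):
    # Compress the reconciled string to remove what the helper leaked.
    return hashlib.sha256(bytes(bits)).digest()[:out_bytes]

random.seed(0)
x = [random.randint(0, 1) for _ in range(300)]           # Alice's observation
y = [b ^ (random.random() < 0.01) for b in x]            # Bob's noisy copy
k_alice, k_bob = reconcile(x, y)
print(privacy_amplify(k_alice) == privacy_amplify(k_bob))  # True with high probability
```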

    Biometric security on body sensor networks


    Secret-key rates and privacy leakage in biometric systems

    In this thesis both the generation of secret keys from biometric data and the binding of secret keys to biometric data are investigated. These secret keys can be used to regulate access to sensitive data, services, and environments. In a biometric secrecy system a secret key is generated or chosen during an enrollment procedure in which biometric data are observed for the first time. This key is to be reconstructed after these biometric data are observed for the second time, when authentication is required. Since biometric measurements are typically noisy, reliable biometric secrecy systems also extract so-called helper data from the biometric observation at the time of enrollment. These helper data facilitate reliable reconstruction of the secret key in the authentication process. Since the helper data are assumed to be public, they should not contain information about the secret key; we say that the secrecy leakage should be negligible. Important parameters of biometric key-generation and key-binding systems include the size of the generated or chosen secret key and the information that the helper data contain (leak) about the biometric observation. This latter parameter is called privacy leakage. Ideally the privacy leakage should be small, to prevent the biometric data of an individual from being compromised. Moreover, the secret-key length (also characterized by the secret-key rate) should be large to minimize the probability that the secret key is guessed and unauthorized access is granted.

    The first part of this thesis mainly focuses on the fundamental trade-off between the secret-key rate and the privacy-leakage rate in biometric secret-generation and secret-binding systems. This trade-off is studied from an information-theoretical perspective for four biometric settings. The first setting is the classical secret-generation setting as proposed by Maurer [1993] and Ahlswede and Csiszár [1993]. For this setting the achievable secret-key vs. privacy-leakage rate region is determined in this thesis. In the second setting the secret key is not generated by the terminals, but independently chosen during enrollment (key binding). Also for this setting the region of achievable secret-key vs. privacy-leakage rate pairs is determined. In settings three and four zero-leakage systems are considered, in which the public message should contain only a negligible amount of information about both the secret key and the biometric enrollment sequence. To achieve this, a private key is needed, which can be observed only by the two terminals. Again both the secret-generation setting and the chosen-secret setting are considered, and for these two cases the regions of achievable secret-key vs. private-key rate pairs are determined. For all four settings two notions of leakage are considered: depending on whether one looks at secrecy and privacy leakage separately or in combination, unconditional or conditional privacy leakage is considered. Here unconditional leakage corresponds to the mutual information between the helper data and the biometric enrollment sequence, while the conditional leakage relates to the conditional version of this mutual information, given the secret.

    The second part of the thesis focuses on the privacy- and secrecy-leakage analysis of the fuzzy commitment scheme. Fuzzy commitment, proposed by Juels and Wattenberg [1999], is in fact a particular realization of a binary biometric secrecy system with a chosen secret key. In this scheme the helper data are constructed as a codeword from an error-correcting code, used to encode a chosen secret, masked with the biometric sequence that has been observed during enrollment. Since this scheme is not privacy preserving in the conditional privacy-leakage sense, the unconditional privacy-leakage case is investigated. Four cases of biometric sources are considered, i.e. memoryless and totally-symmetric biometric sources, memoryless and input-symmetric biometric sources, memoryless biometric sources, and stationary and ergodic biometric sources. For the first two cases the achievable rate-leakage regions are determined; in these cases the secrecy-leakage rate need not be positive. For the other two cases only outer bounds on the achievable rate-leakage regions are found. These bounds, moreover, are sharpened for fuzzy commitment based on systematic parity-check codes. Using the fundamental trade-offs found in the first part of this thesis, it is shown that fuzzy commitment is only optimal for memoryless totally-symmetric biometric sources, and only at the maximum secret-key rate. Moreover, it is demonstrated that for memoryless and stationary ergodic biometric sources which are not input-symmetric, the fuzzy commitment scheme leaks information on both the secret key and the biometric data.

    Biometric sequences have an often unknown statistical structure (model) that can be quite complex. The last part of this dissertation addresses the problem of finding the maximum a posteriori (MAP) model for a pair of observed biometric sequences and the problem of estimating the maximum secret-key rate from these sequences. A universal source coding procedure called the Context-Tree Weighting (CTW) method [1995] can be used to find this MAP model. In this thesis a procedure that determines the MAP model, based on the so-called beta-implementation of the CTW method, is proposed. Moreover, CTW methods are used to compress the biometric sequences and sequence pairs in order to estimate the mutual information between the sequences. However, CTW methods were primarily developed for compressing one-dimensional sources, while biometric data are often modeled as two-dimensional processes. Therefore it is proved here that the entropy of a stationary two-dimensional source can be expressed as a limit of a series of conditional entropies. This result is also extended to the conditional entropy of one two-dimensional source given another one. As a consequence, entropy and mutual information estimates can be obtained from CTW methods using properly chosen templates. Using such techniques, estimates of the maximum secret-key rate for physical unclonable functions (PUFs) are determined from a data set of observed sequences. PUFs can be regarded as inanimate analogues of biometrics.
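    A minimal sketch of the fuzzy commitment scheme described above may help make the construction concrete. It assumes a (3,1) repetition code purely for illustration (the scheme works with any error-correcting code), and all parameters are hypothetical.

```python
import random

# Fuzzy commitment (Juels-Wattenberg), minimal sketch: the helper data are a
# codeword encoding a chosen secret, masked (XOR) with the enrollment
# biometric. A (3,1) repetition code stands in for a real code.

R = 3  # repetition factor

def encode(secret):
    return [b for b in secret for _ in range(R)]

def decode(bits):
    return [int(sum(bits[i:i + R]) > R // 2) for i in range(0, len(bits), R)]

def enroll(secret, x):
    # Public helper data: codeword masked with the enrollment sequence x.
    return [c ^ b for c, b in zip(encode(secret), x)]

def authenticate(helper, y):
    # Unmasking with the noisy re-measurement y leaves codeword + noise;
    # decoding recovers the secret if y is close enough to x.
    return decode([h ^ b for h, b in zip(helper, y)])

random.seed(1)
secret = [random.randint(0, 1) for _ in range(32)]
x = [random.randint(0, 1) for _ in range(32 * R)]    # enrollment biometric
y = [b ^ (random.random() < 0.02) for b in x]        # noisy re-measurement
print(authenticate(enroll(secret, x), y) == secret)  # True with high probability
```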

    Low-complexity and Reliable Transforms for Physical Unclonable Functions

    Noisy measurements of a physical unclonable function (PUF) are used to store secret keys under reliability, security, privacy, and complexity constraints. A new set of low-complexity orthogonal transforms requiring no multiplications is proposed, achieving bit-error probabilities significantly better than all methods previously proposed for key binding with PUFs. The uniqueness and security performance of a transform selected from the proposed set is shown to be close to optimal. An error-correction code with a low-complexity decoder and a high code rate is shown to provide a block-error probability significantly smaller than that of previously proposed codes with the same or smaller code rates. Comment: To appear in IEEE International Conference on Acoustics, Speech, and Signal Processing 2020.
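    The abstract does not name the proposed transforms. Purely as a point of reference, the textbook example of a low-complexity orthogonal transform computable without multiplications is the fast Walsh-Hadamard transform; the sketch below is illustrative and not taken from the paper.

```python
def fwht(x):
    # Fast Walsh-Hadamard transform: orthogonal up to a scaling factor,
    # computed with additions and subtractions only in O(n log n) operations.
    x = list(x)
    h = 1
    while h < len(x):  # len(x) must be a power of two
        for i in range(0, len(x), 2 * h):
            for j in range(i, i + h):
                x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
        h *= 2
    return x

print(fwht([1, 0, 1, 0, 0, 1, 1, 0]))  # first coefficient is the sum, 4
```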

    Quantization effects in biometric systems

    The fundamental secret-key rate vs. privacy-leakage rate trade-offs for secret-key generation and transmission for i.i.d. Gaussian biometric sources are determined. These results are the Gaussian equivalents of the results that were obtained for the discrete case by the authors and, independently, by Lai et al. in 2008. Also the effect that binary quantization of the biometric sequences has on the ratio of the secret-key rate and privacy-leakage rate is considered. It is shown that the squared correlation coefficient must be increased by a factor of π²/4 to compensate for such a quantization action, for values of the privacy-leakage rate that approach zero, when the correlation coefficient is close to zero.
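    The π²/4 factor can be traced to the arcsine law for sign-quantized jointly Gaussian variables: the quantized bits have correlation (2/π)·arcsin(ρ) ≈ (2/π)ρ for small ρ, so the squared correlation shrinks by 4/π². The quick empirical check below is my own sketch, not the paper's derivation.

```python
import math
import random

# Empirical check of the arcsine law behind the pi^2/4 compensation factor.

random.seed(0)
rho, n = 0.2, 200_000
agree = 0
for _ in range(n):
    u = random.gauss(0.0, 1.0)
    v = rho * u + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    agree += (u >= 0.0) == (v >= 0.0)  # sign bits coincide?

emp = 2.0 * agree / n - 1.0                   # empirical bit correlation
print(emp, (2.0 / math.pi) * math.asin(rho))  # both close to 0.128
print(4.0 / math.pi ** 2)                     # squared-correlation loss, ~0.405
```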

    Variable-Length Bit Mapping and Error-Correcting Codes for Higher-Order Alphabet PUFs

    Device-specific physical characteristics provide the foundation for PUFs, a hardware primitive for the secure storage of cryptographic keys. So far, they have been implemented either by directly evaluating a binary output or by mapping outputs from a higher-order alphabet to a fixed-length bit sequence. However, the latter causes a significant bias in the derived key when combined with an equidistant quantization. To overcome this limitation, we propose a variable-length bit mapping that reflects the properties of a Gray code in a different metric, namely the Levenshtein metric instead of the classical Hamming metric. Subsequent error correction is therefore based on a custom insertion/deletion correcting code. This new approach effectively counteracts the bias in the derived key already at the input side. We present the concept of our scheme and demonstrate its feasibility based on an empirical PUF distribution. As a result, we increase the effective output bit length of the secret by over 40% compared to state-of-the-art approaches, while at the same time obtaining additional advantages, e.g., an improved tamper-sensitivity. This opens up a new direction of Error-Correcting Codes (ECCs) for PUFs that output responses with symbols from higher-order output alphabets.
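    As a toy illustration of the bias the abstract refers to (with hypothetical numbers, not the paper's construction): equidistant quantization of a bell-shaped response makes inner symbols more likely than outer ones, so a fixed-length Gray map yields biased key bits, whereas a variable-length map with shorter words for likely symbols can balance them.

```python
# Hypothetical 4-level symbol distribution from equidistant quantization of a
# bell-shaped PUF response: inner levels are more likely than outer levels.
probs = {0: 0.1, 1: 0.4, 2: 0.4, 3: 0.1}
fixed = {0: "00", 1: "01", 2: "11", 3: "10"}  # fixed-length 2-bit Gray map
varlen = {0: "00", 1: "0", 2: "1", 3: "11"}   # hand-picked variable-length map

def ones_fraction(mapping):
    # Expected fraction of '1's in the concatenated key bits.
    ones = sum(p * mapping[s].count("1") for s, p in probs.items())
    bits = sum(p * len(mapping[s]) for s, p in probs.items())
    return ones / bits

print(ones_fraction(fixed))   # 0.65 -> biased key material
print(ones_fraction(varlen))  # 0.50 -> balanced key material
```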