
    Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption

    In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the tradeoff between the false rejection rate (FRR) and the false acceptance rate (FAR). A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase: averaging suppresses the noise, reducing the number of bit errors and hence the HD. In practice, the number of samples is chosen empirically, without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
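
    The decision rule described above reduces to a Hamming-distance comparison against a fixed threshold. A minimal sketch in Python, with a hypothetical template length, noise level, and threshold (the ECC decoding that bounds the threshold in practice is not reproduced here):

```python
import numpy as np

def hamming_distance(enrolled: np.ndarray, probe: np.ndarray) -> int:
    """Number of bit positions in which the two binary vectors differ."""
    return int(np.sum(enrolled != probe))

def accept(enrolled: np.ndarray, probe: np.ndarray, threshold: int) -> bool:
    """Accept as genuine if the HD does not exceed the decision threshold
    (in an ECC-based scheme the threshold is bounded by the code's
    error-correcting capacity)."""
    return hamming_distance(enrolled, probe) <= threshold

# Illustrative 255-bit templates with 20 bits of verification noise.
rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, size=255)
probe = enrolled.copy()
probe[rng.choice(255, size=20, replace=False)] ^= 1
print(accept(enrolled, probe, threshold=32))  # True: HD = 20 <= 32
```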

    Binary Biometric Representation through Pairwise Adaptive Phase Quantization

    Extracting binary strings from real-valued biometric templates is a fundamental step in template compression and protection systems, such as fuzzy commitment, fuzzy extractor, secure sketch, and helper data systems. Quantization and coding are the straightforward way to extract binary representations from arbitrary real-valued biometric modalities. In this paper, we propose a pairwise adaptive phase quantization (APQ) method, together with a long-short (LS) pairing strategy, which aims to maximize the overall detection rate. Experimental results on the FVC2000 fingerprint and the FRGC face databases show reasonably good verification performance.
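
    To make the phase-quantization idea concrete, the sketch below pairs feature components into 2D vectors and quantizes the phase angle of each pair into a few bits. It is only illustrative: the feature values, the fixed two bits per pair, and the sequential pairing are assumptions, and the adaptive bit allocation and long-short pairing of the actual APQ/LS method are omitted.

```python
import numpy as np

def phase_quantize(x: float, y: float, bits: int = 2) -> str:
    """Quantize the phase angle of the feature pair (x, y) into 2**bits
    equal sectors and return the sector index as a bit string."""
    sectors = 2 ** bits
    angle = np.arctan2(y, x) % (2 * np.pi)      # phase in [0, 2*pi)
    index = int(angle / (2 * np.pi) * sectors) % sectors
    return format(index, f"0{bits}b")

# Illustrative 6-dimensional real-valued template, paired sequentially.
template = np.array([0.8, -0.3, 1.2, 0.4, -0.5, -0.9])
pairs = template.reshape(-1, 2)
binary_string = "".join(phase_quantize(x, y) for x, y in pairs)
print(binary_string)  # a 6-bit string, 2 bits per feature pair
```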

    Pitfall of the Detection Rate Optimized Bit Allocation within template protection and a remedy

    One of the requirements of a biometric template protection system is that the protected template ideally should not leak any information about the biometric sample or its derivatives. In the literature, several proposed template protection techniques are based on binary vectors. Hence, they require the extraction of a binary representation from the real-valued biometric sample. In this work we focus on the Detection Rate Optimized Bit Allocation (DROBA) quantization scheme, which extracts multiple bits per feature component while maximizing the overall detection rate. The allocation strategy has to be stored as auxiliary data for reuse in the verification phase and is considered public. This implies that the auxiliary data should not leak any information about the extracted binary representation. Experiments in our work show that the original DROBA algorithm, as known in the literature, creates auxiliary data that leaks a significant amount of information. We show how an adversary is able to exploit this information and significantly increase its success rate in obtaining a false accept. Fortunately, the information leakage can be mitigated by restricting the allocation freedom of the DROBA algorithm. We propose a method based on population statistics and empirically illustrate its effectiveness. All the experiments are based on the MCYT fingerprint database using two different texture-based feature extraction algorithms.
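
    The following sketch illustrates the detection-rate-driven bit allocation idea in Python. It is a simplified greedy variant under a Gaussian assumption, not the dynamic-programming allocation of the published DROBA scheme, and the standard-normal background model, the feature statistics, and the bit budget are all assumptions.

```python
import numpy as np
from scipy.stats import norm

def detection_rate(mu: float, sigma: float, bits: int) -> float:
    """Probability that a genuine sample N(mu, sigma) falls into the
    background-equal-probability interval containing the user's mean,
    assuming a standard-normal background distribution."""
    n = 2 ** bits
    edges = norm.ppf(np.linspace(0.0, 1.0, n + 1))   # interval edges
    k = int(np.clip(np.searchsorted(edges, mu) - 1, 0, n - 1))
    return norm.cdf(edges[k + 1], mu, sigma) - norm.cdf(edges[k], mu, sigma)

def allocate_bits(mus, sigmas, budget: int, max_bits: int = 3):
    """Greedy allocation: repeatedly give one extra bit to the feature
    whose detection rate stays highest after the increase."""
    alloc = [0] * len(mus)
    for _ in range(budget):
        rates = [detection_rate(mu, s, b + 1) if b < max_bits else -1.0
                 for mu, s, b in zip(mus, sigmas, alloc)]
        best = int(np.argmax(rates))
        if rates[best] < 0:
            break
        alloc[best] += 1
    return alloc

# Illustrative per-feature user means and within-user standard deviations.
print(allocate_bits(mus=[1.2, -0.1, 0.6], sigmas=[0.2, 0.5, 0.3], budget=5))
```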

    Fingerprint Verification Using Spectral Minutiae Representations

    Most fingerprint recognition systems are based on the use of a minutiae set, which is an unordered collection of minutiae locations and orientations suffering from various deformations such as translation, rotation, and scaling. The spectral minutiae representation introduced in this paper is a novel method to represent a minutiae set as a fixed-length feature vector that is invariant to translation, and in which rotation and scaling become translations, so that they can be easily compensated for. These characteristics enable the combination of fingerprint recognition systems with template protection schemes that require a fixed-length feature vector. This paper introduces algorithms for two representation methods: the location-based spectral minutiae representation and the orientation-based spectral minutiae representation. Both are evaluated using two correlation-based spectral minutiae matching algorithms. We present the performance of our algorithms on three fingerprint databases. We also show how the performance can be improved by using a fusion scheme and singular points.
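
    A minimal sketch of the location-based idea: taking the magnitude of the Fourier transform of the minutiae point set removes translation, and sampling it on a polar frequency grid turns rotation into a circular shift. The grid size, frequency range, and minutiae data below are illustrative assumptions, and the orientation information and smoothing used by the published method are omitted.

```python
import numpy as np

def spectral_minutiae(minutiae, n_radii=16, n_angles=32, f_low=0.02, f_high=0.6):
    """Location-based sketch: magnitude of the 2D Fourier transform of the
    minutiae point set sampled on a polar frequency grid.  The magnitude
    removes translation; rotation becomes a circular shift along the
    angular axis of the grid."""
    minutiae = np.asarray(minutiae, dtype=float)          # shape (M, 2): x, y
    radii = np.linspace(f_low, f_high, n_radii)
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    spectrum = np.zeros((n_radii, n_angles))
    for i, r in enumerate(radii):
        for j, a in enumerate(angles):
            wx, wy = r * np.cos(a), r * np.sin(a)
            phases = np.exp(-2j * np.pi * (minutiae[:, 0] * wx + minutiae[:, 1] * wy))
            spectrum[i, j] = abs(phases.sum())
    return spectrum.ravel()                               # fixed-length vector

# Two hypothetical captures of the same finger, differing only by translation.
rng = np.random.default_rng(1)
capture_a = rng.uniform(0, 300, size=(30, 2))             # minutiae locations (px)
capture_b = capture_a + np.array([12.0, -7.0])
print(np.allclose(spectral_minutiae(capture_a), spectral_minutiae(capture_b)))  # True
```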

    Multi-bits biometric string generation based on the likelihood ratio

    Preserving the privacy of biometric information stored in biometric systems is becoming a key issue. An important element in privacy-protecting biometric systems is the quantizer, which transforms a normal biometric template into a binary string. In this paper, we present a user-specific quantization method based on a likelihood ratio approach (LQ). The bits generated from every feature are concatenated to form a fixed-length binary string that can be hashed to protect its privacy. Experiments are carried out on both fingerprint data (FVC2000) and face data (FRGC). Results show that our proposed quantization method achieves reasonably good performance in terms of FAR/FRR (at a FAR of 10^-4, the corresponding FRRs are 16.7% and 5.77% for FVC2000 and FRGC, respectively).
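
    A simplified sketch of user-specific multi-bit quantization in this spirit: the feature axis is cut into intervals of equal background probability mass, aligned so that the user's mean sits in the middle of one interval, and each feature contributes the index of the interval its verification value falls into. The standard-normal background, two bits per feature, and the example values are assumptions; the paper's likelihood-ratio interval construction is only approximated.

```python
import numpy as np
from scipy.stats import norm

def lq_bits(sample: float, user_mu: float, bits: int = 2) -> str:
    """Encode one feature: cut the axis into 2**bits intervals of equal
    background mass (background assumed standard normal), shifted so the
    user's mean sits at the centre of one interval, and return the index
    of the interval containing the sample."""
    n = 2 ** bits
    p_mu = norm.cdf(user_mu)
    k_user = int(np.clip(np.floor(p_mu * n), 0, n - 1))
    shift = p_mu - (k_user + 0.5) / n            # align an interval centre with mu
    p_sample = norm.cdf(sample) - shift
    return format(int(np.floor(p_sample * n)) % n, f"0{bits}b")

# Illustrative user-specific means (from enrollment) and a verification sample.
user_means = [1.1, -0.4, 0.2]
verification = [1.0, -0.5, 0.3]
enrolled_string = "".join(lq_bits(mu, mu) for mu in user_means)
probe_string = "".join(lq_bits(s, mu) for s, mu in zip(verification, user_means))
print(enrolled_string, probe_string)  # small within-user noise keeps them equal here
```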

    New perspectives on realism, tractability, and complexity in economics

    Fuzzy logic and genetic algorithms are used to rework more realistic (and more complex) models of competitive markets. The resulting equilibria are significantly different from the ones predicted by the usual static analysis; the methodology solves the Walrasian problem of how markets can reach equilibrium, starting with firms trading at disparate prices. The modified equilibria found in these complex market models involve some mutual self-restraint on the part of the agents involved, relative to economically rational behaviour. Research (using similar techniques) into the evolution of collaborative behaviours in economics, and of altruism generally, is summarized; and the joint significance of these two bodies of work for public policy is reviewed. The possible extension of the fuzzy/genetic methodology to other technical aspects of economics (including international trade theory, and development) is also discussed, as are the limitations to the usefulness of any type of theory in political domains. For the latter purpose, a more differentiated concept of rationality, appropriate to ill-structured choices, is developed. The philosophical case for laissez-faire policies is considered briefly; and the prospects for change in the way we ‘do economics’ are analysed.

    Optimal Iris Fuzzy Sketches

    Fuzzy sketches, introduced as a link between biometry and cryptography, are a way of handling biometric data matching as an error-correction issue. We focus here on iris biometrics and look for the best error-correcting code in that respect. We show that two-dimensional iterative min-sum decoding leads to results near the theoretical limits. In particular, we test our techniques on the Iris Challenge Evaluation (ICE) database and validate our findings. (Comment: 9 pages. Submitted to the IEEE Conference on Biometrics: Theory, Applications and Systems, 2007, Washington DC.)
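
    The error-correction framing can be illustrated with a fuzzy-commitment-style construction: the stored sketch is the XOR of the iris code with a codeword, so bit errors between the enrollment and probe codes appear as channel errors that the code corrects. The toy repetition code, code length, and noise level below are assumptions; the paper's two-dimensional iterative min-sum decoder is not reproduced.

```python
import numpy as np

def encode(secret: np.ndarray, reps: int) -> np.ndarray:
    """Toy ECC: repeat every secret bit `reps` times."""
    return np.repeat(secret, reps)

def decode(codeword: np.ndarray, reps: int) -> np.ndarray:
    """Majority-vote decoding of the toy repetition code."""
    return (codeword.reshape(-1, reps).sum(axis=1) > reps // 2).astype(int)

def commit(iris_code: np.ndarray, secret: np.ndarray, reps: int) -> np.ndarray:
    """Fuzzy-commitment-style sketch: codeword XORed with the iris code."""
    return encode(secret, reps) ^ iris_code

def open_sketch(sketch: np.ndarray, probe_code: np.ndarray, reps: int) -> np.ndarray:
    """Recover the secret: bit errors between enrollment and probe iris codes
    appear as channel errors on the codeword, which the ECC corrects."""
    return decode(sketch ^ probe_code, reps)

rng = np.random.default_rng(2)
reps, k = 7, 32
iris = rng.integers(0, 2, size=reps * k)
secret = rng.integers(0, 2, size=k)
sketch = commit(iris, secret, reps)
probe = iris ^ (rng.random(reps * k) < 0.05).astype(iris.dtype)  # ~5% bit errors
print(np.array_equal(open_sketch(sketch, probe, reps), secret))  # typically True
```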

    A MPC Strategy for the Optimal Management of Microgrids Based on Evolutionary Optimization

    In this paper, a novel model predictive control strategy, with a 24-h prediction horizon, is proposed to reduce the operational cost of microgrids. To overcome the complexity of the optimization problems arising from the operation of the microgrid at each step, an adaptive evolutionary strategy with a satisfactory trade-off between exploration and exploitation capabilities was added to the model predictive control. The proposed strategy was evaluated using a representative microgrid that includes a wind turbine, a photovoltaic plant, a microturbine, a diesel engine, and an energy storage system. The achieved results demonstrate the validity of the proposed approach, outperforming a global scheduling planner based on a genetic algorithm by 14.2% in terms of operational cost. In addition, the proposed approach also better manages the use of the energy storage system. (Funding: Ministerio de Economía y Competitividad DPI2016-75294-C2-2-R; Unión Europea, Programa Horizonte 2020, 76409.)
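
    A minimal sketch of the receding-horizon idea combined with an evolutionary optimizer: at each step a population-based search minimizes the operating cost over the next 24 hours, only the first control action is applied, and the horizon rolls forward. The load, renewable, and price profiles, the grid-cost-only objective, and the absence of storage constraints are all simplifying assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)
HORIZON = 24  # hours

def operating_cost(batt_power, load, renewables, price):
    """Cost of buying the residual demand (load - renewables - battery
    discharge) from the grid over the horizon; exports earn nothing."""
    residual = load - renewables - batt_power
    return float(np.sum(price * np.maximum(residual, 0.0)))

def evolve(load, renewables, price, pop=40, gens=60, p_max=50.0):
    """(mu + lambda)-style evolutionary search for a battery schedule that
    minimizes the operating cost over the prediction horizon."""
    parents = rng.uniform(-p_max, p_max, size=(pop, HORIZON))
    for _ in range(gens):
        children = np.clip(parents + rng.normal(0.0, 5.0, size=parents.shape),
                           -p_max, p_max)
        union = np.vstack([parents, children])
        costs = np.array([operating_cost(x, load, renewables, price) for x in union])
        parents = union[np.argsort(costs)[:pop]]
    return parents[0]

# Hypothetical 24-h load, renewable, and price profiles (kW and EUR/kWh).
load = 80 + 20 * np.sin(np.linspace(0, 2 * np.pi, HORIZON))
renewables = np.maximum(0, 60 * np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, HORIZON)))
price = np.where(np.arange(HORIZON) < 8, 0.08, 0.15)

for hour in range(3):  # a few receding-horizon steps for illustration
    schedule = evolve(load, renewables, price)
    print(f"hour {hour}: battery set-point {schedule[0]:+.1f} kW")
    # Apply only the first action, then shift the horizon and re-optimize.
    load, renewables, price = (np.roll(x, -1) for x in (load, renewables, price))
```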