15,597 research outputs found
Binary Biometrics: An Analytic Framework to Estimate the Performance Curves Under Gaussian Assumption
In recent years, the protection of biometric data has gained increased interest from the scientific community. Methods such as the fuzzy commitment scheme, helper-data system, fuzzy extractors, fuzzy vault, and cancelable biometrics have been proposed for protecting biometric data. Most of these methods use cryptographic primitives or error-correcting codes (ECCs) and use a binary representation of the real-valued biometric data. Hence, the difference between two biometric samples is given by the Hamming distance (HD), or number of bit errors, between the binary vectors obtained from the enrollment and verification phases, respectively. If the HD is smaller (larger) than the decision threshold, then the subject is accepted (rejected) as genuine. Because of the use of ECCs, this decision threshold is limited to the maximum error-correcting capacity of the code, consequently limiting the false rejection rate (FRR) and false acceptance rate (FAR) tradeoff. A method to improve the FRR consists of using multiple biometric samples in either the enrollment or verification phase; this suppresses the noise, reducing the number of bit errors and hence the HD. In practice, the number of samples is empirically chosen without fully considering its fundamental impact. In this paper, we present a Gaussian analytical framework for estimating the performance of a binary biometric system given the number of samples used in the enrollment and verification phases. The detection error tradeoff (DET) curve, which combines the false acceptance and false rejection rates, is estimated to assess the system performance. The analytic expressions are validated using the Face Recognition Grand Challenge v2 and Fingerprint Verification Competition 2000 biometric databases.
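The accept/reject rule described in this abstract, thresholding the Hamming distance at the code's error-correcting capacity, can be sketched in a few lines (the function names and the capacity `t` below are illustrative, not taken from the paper):

```python
def hamming_distance(a: list[int], b: list[int]) -> int:
    """Number of bit positions in which two binary templates differ."""
    return sum(x != y for x, y in zip(a, b))

def verify(enrolled: list[int], probe: list[int], t: int) -> bool:
    """Accept iff the bit-error count is within the ECC's correction capacity t."""
    return hamming_distance(enrolled, probe) <= t

# Example: 3 bit errors against a code that corrects up to t = 4 -> accepted.
enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
probe    = [1, 1, 1, 0, 0, 0, 1, 1]
print(verify(enrolled, probe, t=4))  # True
```

The point the abstract makes is that `t` is fixed by the code, so the FAR/FRR operating point cannot be tuned freely.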
How can Francis Bacon help forensic science? The four idols of human biases
Much debate has focused on whether forensic science is indeed a science. This paper is not aimed at answering, or even trying to contribute to, this question. Rather, in this paper I try to find ways to improve forensic science by identifying potential vulnerabilities. To this end I use Francis Bacon's doctrine of idols, which distinguishes between different types of human biases that may prevent scientific and objective inquiry. Bacon's doctrine contains four sources for such biases: Idols Tribus (of the 'tribe'), Idols Specus (of the 'den'/'cave'), Idols Fori (of the 'market'), and Idols Theatre (of the 'theatre'). While his 400-year-old doctrine does not, of course, perfectly match up with our current world view, it still provides a productive framework for examining and cataloguing some of the potential weaknesses and limitations in our current approach to forensic science.
Modeling the growth of fingerprints improves matching for adolescents
We study the effect of growth on the fingerprints of adolescents, based on which we suggest a simple method to adjust for growth when trying to recover a juvenile's fingerprint in a database years later. Based on longitudinal data sets in juveniles' criminal records, we show that growth essentially leads to an isotropic rescaling, so that we can use the strong correlation between growth in stature and limbs to model the growth of fingerprints proportional to stature growth as documented in growth charts. The proposed rescaling leads to a 72% reduction of the distances between corresponding minutiae for the data set analyzed. These findings were corroborated by several verification tests. In an identification test on a database containing 3.25 million right index fingers at the Federal Criminal Police Office of Germany, the identification error rate of 20.8% was reduced to 2.1% by rescaling. The presented method is of striking simplicity and can easily be integrated into existing automated fingerprint identification systems.
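The isotropic rescaling this abstract describes amounts to scaling minutia coordinates by the ratio of statures. A minimal sketch, assuming rescaling about a fixed center point; the paper's exact procedure and its growth-chart lookup may differ:

```python
def rescale_minutiae(minutiae, stature_then, stature_now, center=(0.0, 0.0)):
    """Isotropically rescale (x, y) minutia coordinates about `center`
    by the stature ratio stature_now / stature_then."""
    s = stature_now / stature_then
    cx, cy = center
    return [(cx + s * (x - cx), cy + s * (y - cy)) for x, y in minutiae]

# Illustrative: a juvenile print taken at 150 cm stature, matched years
# later against the same person at 175 cm.
points = [(10.0, 20.0), (-5.0, 8.0)]
print(rescale_minutiae(points, 150.0, 175.0))
```

Because the rescaling is isotropic, a single scalar factor per record suffices, which is what makes the method easy to bolt onto an existing matcher.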
Improved fuzzy vault scheme for fingerprint verification
Fuzzy vault is a well-known technique to address the privacy concerns in biometric identification applications. We revisit the fuzzy vault scheme to address implementation, efficiency, and security issues encountered in its realization, using fingerprint data as a case study. We compare the performances of two different methods used in the implementation of fuzzy vault, namely brute force and Reed-Solomon decoding. We show that the locations of fake (chaff) points in the vault leak information on the genuine points, and propose a new chaff point placement technique that makes distinguishing genuine points impossible. We also propose a novel method for the creation of chaff points that decreases the success rate of the brute force attack from 100% to less than 3.5%. While this paper lays out a complete guideline as to how the fuzzy vault can be implemented in an efficient and secure way, it also points out that more research is needed to thwart the proposed attacks, and presents ideas for future research.
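A minimum-distance chaff placement in the spirit of what this abstract discusses can be sketched as rejection sampling: candidate points are kept only if they are not too close to genuine minutiae or to previously accepted chaff, so chaff and genuine points share the same spatial statistics. The parameters, bounds, and seed below are assumptions for illustration, not the paper's technique:

```python
import random

def place_chaff(genuine, n_chaff, d_min, bounds=(0.0, 256.0), seed=0):
    """Generate n_chaff points, each at least d_min away from every
    genuine minutia and every previously placed chaff point."""
    rng = random.Random(seed)
    placed = list(genuine)
    chaff = []
    while len(chaff) < n_chaff:
        p = (rng.uniform(*bounds), rng.uniform(*bounds))
        # Keep the candidate only if it clears the minimum-distance test.
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= d_min ** 2
               for q in placed):
            chaff.append(p)
            placed.append(p)
    return chaff

genuine = [(30.0, 40.0), (120.0, 200.0)]
chaff = place_chaff(genuine, n_chaff=50, d_min=10.0)
print(len(chaff))  # 50
```

The design intent is that an attacker inspecting pairwise distances cannot single out the genuine points, since every point in the vault obeys the same spacing constraint.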
Multi-biometric templates using fingerprint and voice
As biometrics gains popularity, there is an increasing concern about privacy and misuse of biometric data held in central repositories. Furthermore, biometric verification systems face challenges arising from noise and intra-class variations. To tackle both problems, a multimodal biometric verification system combining fingerprint and voice modalities is proposed. The system combines the two modalities at the template level, using multibiometric templates. The fusion of fingerprint and voice data successfully diminishes privacy concerns by hiding the minutiae points from the fingerprint among the artificial points generated by the features obtained from the spoken utterance of the speaker. Equal error rates are observed to be under 2% for the system, where 600 utterances from 30 people have been processed and fused with a database of 400 fingerprints from 200 individuals. Accuracy is increased compared to the previous results for voice verification over the same speaker database.
Confirmation Bias: The Pitfall of Forensic Science
As it stands, forensic science and its practitioners are held in high regard in criminal court proceedings due to their ability to discover irrefutable facts that would otherwise go unnoticed. Nevertheless, forensic scientists can fall victim to natural logical fallacies. More specifically, confirmation bias is 'a proclivity to search for or interpret additional information to confirm beliefs and to steer clear of information that may disagree with those prior beliefs' (Budowle et al., 2009, p. 803). To restore the integrity of the forensic sciences, the sources of confirmation bias need to be identified and eliminated. Accordingly, empirical studies have given substance to a subject that is intangible and thus difficult to recognize. Inherent and external sources of confirmation bias include the dependence and association of crime labs upon police agencies and the amount of extraneous information made available to verifying examiners. Potentially effective solutions offered to minimize its influence upon the conclusions made by forensic scientists include the privatization of crime labs, the establishment of educational requirements for forensic examiners, the separation of testing and interpretation, and the institution of double-blind testing. This effort must be undertaken, as the justice system relies on the forensic sciences to provide meaningful evidence that can play a prominent role in the fate of those who stand trial.
Multi-bits biometric string generation based on the likelihood ratio
Preserving the privacy of biometric information stored in biometric systems is becoming a key issue. An important element in privacy-protecting biometric systems is the quantizer, which transforms a normal biometric template into a binary string. In this paper, we present a user-specific quantization method based on a likelihood ratio approach (LQ). The bits generated from every feature are concatenated to form a fixed-length binary string that can be hashed to protect its privacy. Experiments are carried out on both fingerprint data (FVC2000) and face data (FRGC). Results show that our proposed quantization method achieves a reasonably good performance in terms of FAR/FRR (at an FAR of 10^-4, the corresponding FRRs are 16.7% and 5.77% for FVC2000 and FRGC, respectively).
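The concatenation step described here, turning per-feature bits into one fixed-length string, can be sketched with a simple equal-width quantizer standing in for the paper's user-specific likelihood-ratio intervals, which are not reproduced here. All names and parameters are illustrative:

```python
def quantize_feature(value, lo, hi, bits=2):
    """Map a real-valued feature into one of 2**bits equal-width intervals
    over [lo, hi] and return the interval index as a bit pattern."""
    n = 2 ** bits
    idx = min(n - 1, max(0, int((value - lo) / (hi - lo) * n)))
    return format(idx, f"0{bits}b")

def feature_vector_to_bitstring(values, ranges, bits=2):
    """Concatenate the per-feature bit patterns into one fixed-length string."""
    return "".join(quantize_feature(v, lo, hi, bits)
                   for v, (lo, hi) in zip(values, ranges))

features = [0.1, 0.7, 0.45]
ranges = [(0.0, 1.0)] * 3
print(feature_vector_to_bitstring(features, ranges))  # "001001"
```

In the paper's scheme the interval boundaries, and how many bits each feature contributes, are chosen per user from the likelihood ratio rather than fixed equal widths; the resulting string is then hashed for privacy.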
Conceivable security risks and authentication techniques for smart devices
With the rapidly escalating use of smart devices and fraudulent transactions of users' data from their devices, efficient and reliable techniques for authentication on smart devices have become an obligatory issue. This paper reviews the security risks for mobile devices and studies several authentication techniques available for smart devices. The results from field studies enable a comparative evaluation of user-preferred authentication mechanisms and their opinions about reliability, biometric authentication and visual authentication techniques.