24 research outputs found

    Biohashing: two factor authentication featuring fingerprint data and tokenised random number

    Human authentication is the security task of limiting access to physical locations or computer networks to those with authorisation. This is done by equipping authorised users with passwords, tokens, or by using their biometrics. Unfortunately, the first two suffer from a lack of security, as they are easily forgotten or stolen; even biometrics suffers from inherent limitations and specific security threats. A more practical approach is to combine two or more authentication factors to reap benefits in security, convenience, or both. This paper proposes a novel two-factor authenticator based on iterated inner products between a tokenised pseudo-random number and a user-specific fingerprint feature, generated from an integrated wavelet and Fourier-Mellin transform, producing a set of user-specific compact codes coined BioHashing. BioHashing is highly tolerant of data capture offsets, with the same user's fingerprint data resulting in highly correlated bitstrings. Moreover, there is no deterministic way to obtain the user-specific code without having both the token with random data and the user's fingerprint feature. This protects against, for instance, biometric fabrication: changing the user-specific credential is as simple as changing the token containing the random data. BioHashing has significant functional advantages over biometrics alone, i.e. a zero equal error rate point and clean separation of the genuine and imposter populations, thereby allowing elimination of false accept rates without an increased occurrence of false reject rates. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved
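    As a concrete illustration of the construction described above, the following is a minimal sketch of the inner-product-and-threshold step, assuming a generic 1-D fingerprint feature vector; the seed-based token, code length and zero threshold are placeholders for illustration, not the authors' exact implementation.

```python
import numpy as np

def biohash(feature: np.ndarray, token_seed: int, n_bits: int = 80) -> np.ndarray:
    """Derive a compact bitstring from a fingerprint feature vector and a
    tokenised pseudo-random number (represented here by a seed)."""
    rng = np.random.default_rng(token_seed)        # token-specific randomness
    # Random basis, orthonormalised via QR (assumes feature.size >= n_bits).
    R = rng.standard_normal((feature.size, n_bits))
    Q, _ = np.linalg.qr(R)                         # orthonormal columns
    projections = feature @ Q                      # iterated inner products
    # Threshold each inner product to obtain the user-specific binary code;
    # the threshold (here 0) is a preset parameter of the scheme.
    return (projections > 0).astype(np.uint8)
```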

    Face recognition using wavelet transform and non-negative matrix factorization

    This paper demonstrates a novel subspace projection technique via Non-Negative Matrix Factorization (NMF) to represent human facial images in a low-frequency subband, realised through the wavelet transform. The wavelet transform (WT) is used to reduce noise and produce a representation in the low-frequency domain, making the facial images insensitive to facial expression and small occlusion. After wavelet decomposition, NMF is performed to produce region- or part-based representations of the images. Non-negativity is a useful constraint for generating expressiveness in the reconstruction of faces. Simulation results on the Essex and ORL databases show that the hybrid of NMF and the best wavelet filter yields a better verification rate and shorter training time. Optimum results of 98.5% and 95.5% are obtained on the Essex and ORL databases, respectively. These results are compared with our baseline method, Principal Component Analysis (PCA)
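    A rough sketch of this pipeline (low-frequency wavelet subband followed by NMF), assuming pywt and scikit-learn are available; the db2 filter, component count and normalisation below are placeholder choices, not the paper's tuned settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import NMF

def wavelet_nmf_features(images: np.ndarray, n_components: int = 49):
    """images: (n_samples, h, w) grayscale face images."""
    # Keep only the approximation (low-frequency) subband of a 2-D DWT,
    # which suppresses noise, expression and small-occlusion detail.
    lowpass = np.stack([pywt.dwt2(img, 'db2')[0] for img in images])
    X = lowpass.reshape(len(images), -1)
    X -= X.min()                                   # NMF requires non-negativity
    nmf = NMF(n_components=n_components, init='nndsvda', max_iter=400)
    W = nmf.fit_transform(X)                       # part-based encodings
    return W, nmf.components_                      # features and basis images
```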

    Bio-discretization: Biometrics authentication featuring face data and tokenised random number

    With the wonders of the Internet and the promises of the worldwide information infrastructure, a highly secure authentication system is desirable. Biometrics has been deployed for this purpose as it is a unique identifier. However, it also suffers from inherent limitations and specific security threats such as biometric fabrication. To alleviate the liabilities of biometrics, a combination of token and biometric for user authentication and verification is introduced. All user data is kept in the token, and users are freed from the task of remembering passwords. The proposed framework is named Bio-Discretization. Bio-Discretization is performed on face image features, generated from Non-Negative Matrix Factorization (NMF) in the wavelet domain, to produce a set of unique compact bitstrings by iterated inner products between a set of pseudo-random numbers and the face image features. Bio-Discretization possesses high data capture offset tolerance, with highly correlated bitstrings for intraclass data. This approach is highly desirable in a secure environment and outperforms the classic authentication scheme
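    To illustrate how such token-plus-biometric bitstrings might be compared and revoked, here is a hedged sketch building on the biohash() illustration given earlier; the normalised Hamming-distance threshold is an assumed parameter, not taken from the paper.

```python
import numpy as np

def verify(code_enrolled: np.ndarray, code_query: np.ndarray,
           max_hamming: float = 0.25) -> bool:
    """Accept when the normalised Hamming distance between enrolled and query
    bitstrings is small (intraclass codes are highly correlated)."""
    distance = np.mean(code_enrolled != code_query)
    return distance <= max_hamming

# Revocation: if a protected template leaks, re-enrol with a new token seed;
# the old bitstring becomes useless because the pseudo-random projection changes.
```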

    Integration of wavelet packet decomposition and random secret in iris verification

    This paper presents a novel method of integrating a user-specific secret pseudo-random number with iris features extracted via the Wavelet Packet Transform into a feature vector for iris verification. The proposed method improves on the traditional iris verification system, which depends solely on the iris feature. Security risks such as iris fabrication can be avoided through token replacement. Experimental results show that the proposed method performs encouragingly. The optimum result is obtained for the mean-scheme Wavelet Packet Transform with a secret pseudo-random number using two levels of decomposition, which produces an excellent verification rate
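    One plausible reading of the "mean scheme" at two levels of decomposition is sketched below: the mean of each level-2 wavelet packet subband forms the iris feature, which is then mixed with a token-derived secret random number. The Haar wavelet and the multiplicative mixing step are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np
import pywt

def iris_feature(iris_img: np.ndarray, token_seed: int) -> np.ndarray:
    # Two-level 2-D wavelet packet decomposition of the iris region.
    wp = pywt.WaveletPacket2D(data=iris_img, wavelet='haar', maxlevel=2)
    # Mean of each level-2 subband -> compact iris feature vector.
    feats = np.array([wp[node.path].data.mean()
                      for node in wp.get_level(2, order='natural')])
    rng = np.random.default_rng(token_seed)
    secret = rng.standard_normal(feats.size)       # user-specific secret number
    return feats * secret                          # simple token/feature mixing
```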

    Anchored kernel hashing for cancelable template protection for cross-spectral periocular data

    Periocular characteristics are gaining prominence in biometric and surveillance systems that operate in either the Near-Infra-Red (NIR) or the visible (VIS) spectrum. While the ocular information can be well utilised, comparing images across different spectra, such as NIR versus VIS, remains a challenge. In addition, the ocular biometric templates from both the NIR and VIS domains need to be protected after feature extraction to avoid leakage or linkability of biometric data. In this work, we explore a new approach based on anchored kernel hashing to obtain a cancelable biometric template that is discriminative for recognition purposes while preserving privacy. The key benefit is that the proposed approach not only works for both the NIR and visible spectra, but can also be used with good accuracy for cross-spectral protected template comparison. Through a set of experiments on a cross-spectral periocular database, we demonstrate performance with EER = 1.39% and EER = 1.61% for NIR and VIS protected templates, respectively. We further present a set of cross-spectral template comparisons, comparing the protected templates from one spectrum against the other, to demonstrate the applicability of the proposed approach
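    The general shape of anchor-based kernel hashing can be sketched as follows: kernel similarities to a set of shared anchor points, followed by a subject-specific seeded sign projection that makes the template cancelable. The RBF kernel, anchor set and random projection below are generic ingredients of such schemes, assumed for illustration rather than taken from the paper.

```python
import numpy as np

def kernel_hash(feat: np.ndarray, anchors: np.ndarray,
                subject_seed: int, n_bits: int = 256,
                gamma: float = 0.1) -> np.ndarray:
    """feat: periocular feature vector; anchors: (m, d) reference vectors
    shared across NIR and VIS so protected templates stay comparable."""
    # Kernelised representation: RBF similarity to each anchor point.
    k = np.exp(-gamma * np.sum((anchors - feat) ** 2, axis=1))
    # Cancelable part: a subject-specific random projection, renewable by
    # replacing the seed if the protected template is ever compromised.
    rng = np.random.default_rng(subject_seed)
    P = rng.standard_normal((anchors.shape[0], n_bits))
    return (k @ P > 0).astype(np.uint8)            # binary protected template
```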

    On the Recognition Performance of BioHash-Protected Finger Vein Templates

    This chapter contributes towards advancing finger vein template protection research by presenting the first analysis of the suitability of the BioHashing template protection scheme for finger vein verification systems, in terms of its effect on recognition performance. Our results show the best performance when BioHashing is applied to finger vein patterns extracted using the Wide Line Detector (WLD) and Repeated Line Tracking (RLT) feature extractors, and the worst performance when the Maximum Curvature (MC) extractor is used. The low recognition performance in the Stolen Token scenario is shown to be improvable by increasing the BioHash length; however, we demonstrate that the BioHash length is constrained in practice by the amount of memory required for the projection matrix. Thus, WLD finger vein patterns are found to be the most promising for BioHashing purposes due to their relatively small feature vector size, which allows us to generate larger BioHashes than is possible for RLT or MC feature vectors. In addition, we provide an open-source implementation of a BioHash-protected finger vein verification system based on the WLD, RLT and MC extractors, so that other researchers can verify our findings and build upon our work
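    The memory constraint on the BioHash length can be made concrete with a back-of-the-envelope calculation: the projection matrix needs one row per feature dimension and one column per BioHash bit, so a smaller feature vector (as with WLD) admits a longer BioHash for the same memory budget. The dimensions below are illustrative, not the chapter's measured sizes.

```python
# Memory footprint of a dense float64 projection matrix, in MiB.
def projection_matrix_mib(feature_dim: int, biohash_bits: int,
                          bytes_per_value: int = 8) -> float:
    return feature_dim * biohash_bits * bytes_per_value / 2 ** 20

for name, dim in [('WLD-like', 20_000), ('MC-like', 150_000)]:
    print(name, round(projection_matrix_mib(dim, 1024), 1), 'MiB')
```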

    Enhanced CT Images by the Wavelet Transform Improving Diagnostic Accuracy of Chest Nodules

    The objective of this study was to compare diagnostic accuracy in the interpretation of chest nodules using original CT images versus CT images enhanced by the wavelet transform. CT images of 118 patients with cancers and 60 with benign nodules were used. All images were enhanced using an algorithm based on the wavelet transform. Two experienced radiologists interpreted all the images in two reading sessions, separated by a minimum of one month in order to minimise the effect of observer recall. The Mann–Whitney U nonparametric test was used to analyse the interpretation results for original versus enhanced images. The Kruskal–Wallis H nonparametric test of K independent samples was used to investigate the factors that could affect observers' diagnostic accuracy. The area under the ROC curve for the original and enhanced images was 0.681 and 0.736, respectively. There was a significant difference in diagnosing the malignant nodules between the original and enhanced images (z = 7.122, P < 0.001), whereas there was no significant difference in diagnosing the benign nodules (z = 0.894, P = 0.371). There was also a significant difference between original and enhanced images when the nodules were larger than 2 cm (z = −2.509, P = 0.012), indicating that nodule size is a critical factor in observers' diagnostic accuracy. This study indicates that image enhancement based on the wavelet transform can improve the diagnostic accuracy of radiologists for malignant chest nodules
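    A simple wavelet-domain enhancement in the spirit described above amplifies the detail subbands and reconstructs the image; the wavelet, decomposition level and gain below are assumed values, as the study's exact enhancement algorithm is not detailed in the abstract.

```python
import numpy as np
import pywt

def enhance_ct(img: np.ndarray, gain: float = 1.5) -> np.ndarray:
    # Two-level 2-D wavelet decomposition of the CT slice.
    coeffs = pywt.wavedec2(img, 'db4', level=2)
    approx, details = coeffs[0], coeffs[1:]
    # Amplify horizontal/vertical/diagonal detail coefficients at every level
    # to sharpen edges such as nodule boundaries, then reconstruct.
    boosted = [tuple(gain * d for d in lvl) for lvl in details]
    return pywt.waverec2([approx] + boosted, 'db4')
```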