Combining multiple Iris matchers using advanced fusion techniques to enhance Iris matching performance

Abstract

M.Phil. (Electrical and Electronic Engineering)

The enormous advancement of technology and the need to secure information effectively have led to the development and implementation of iris image acquisition technologies for automated iris recognition systems. The iris biometric is gaining popularity and is becoming a reliable and robust modality for future biometric security. Its applications extend to biometric security areas such as national ID cards, banking systems such as ATMs, e-commerce, and biometric passports, though it is not applicable in forensic investigations. Iris recognition has gained valuable attention in biometric research due to the uniqueness of its textures and its high recognition rates in high-security biometric applications. Identity verification for individuals becomes a challenging task when it must be automated with high accuracy and robustness against spoofing attacks and repudiation. Current recognition systems are highly affected by noise resulting from segmentation failure, and these noise factors increase biometric error rates such as the FAR and the FRR. This dissertation reports an investigation of score-level fusion methods that can be used to enhance iris matching performance. The fusion methods implemented in this project include the simple sum rule, weighted sum rule, minimum score, and an adaptive weighted sum rule. The proposed approach uses an adaptive fusion which maps feature quality scores with the matcher. The fused scores were generated from four iris matchers, namely the NHD matcher, the WED matcher, the WHD matcher, and the POC matcher. To ensure homogeneity of matching scores before fusion, raw scores were normalized using the tanh-estimators method, because it is efficient and robust against outliers. The results were tested against two publicly available databases, CASIA and UBIRIS, using two statistical and biometric system measurements, namely the AUC and the EER.
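The normalization-then-fusion pipeline described above can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the tanh-estimators method properly uses Hampel robust estimators of location and scale, which are simplified here to the plain mean and standard deviation, and the matcher scores and equal weights are hypothetical.

```python
import numpy as np

def tanh_normalize(scores, mu=None, sigma=None):
    """Tanh-estimators score normalization.

    The original method computes mu and sigma with Hampel robust
    estimators; plain mean/std is used here as a simplification.
    Maps raw scores into (0, 1).
    """
    scores = np.asarray(scores, dtype=float)
    mu = scores.mean() if mu is None else mu
    sigma = scores.std() if sigma is None else sigma
    return 0.5 * (np.tanh(0.01 * (scores - mu) / sigma) + 1.0)

def weighted_sum_fusion(score_matrix, weights):
    """Fuse normalized per-matcher scores with a weighted sum rule."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # force weights to sum to 1
    return np.asarray(score_matrix) @ w

# Hypothetical similarity scores from four matchers (NHD, WED, WHD, POC)
# for three probe comparisons; rows = probes, columns = matchers.
raw = np.array([[0.30, 0.42, 0.35, 0.50],
                [0.80, 0.75, 0.70, 0.90],
                [0.55, 0.60, 0.58, 0.65]])

# Normalize each matcher's scores independently (column-wise),
# then fuse; equal weights reduce this to the simple sum rule.
norm = np.apply_along_axis(tanh_normalize, 0, raw)
fused = weighted_sum_fusion(norm, [0.25, 0.25, 0.25, 0.25])
```

The adaptive weighted sum rule mentioned in the abstract would replace the fixed weights with weights derived from per-sample feature quality scores, so that matchers operating on higher-quality features contribute more to the fused score.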
These two measures give an AUC of 99.36% for CASIA left images, 99.18% for CASIA right images, and 99.59% for the UBIRIS database, with an EER of 0.041 for CASIA left images, 0.087 for CASIA right images, and 0.038 for UBIRIS images.
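The EER reported above is the operating point where the false reject rate (FRR) equals the false accept rate (FAR). A minimal sketch of how it can be estimated from genuine and impostor score sets is shown below; the score values are illustrative, not taken from the CASIA or UBIRIS experiments.

```python
import numpy as np

def compute_eer(genuine, impostor):
    """Estimate the Equal Error Rate by scanning candidate thresholds.

    Assumes higher scores indicate better matches. At each threshold,
    FRR = fraction of genuine scores rejected (below threshold) and
    FAR = fraction of impostor scores accepted (at or above threshold);
    the EER is taken where the two rates are closest.
    """
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        frr = np.mean(genuine < t)
        far = np.mean(impostor >= t)
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

# Hypothetical well-separated score distributions.
gen = np.array([0.90, 0.85, 0.80, 0.75, 0.70])
imp = np.array([0.30, 0.35, 0.20, 0.40, 0.10])
eer = compute_eer(gen, imp)   # perfectly separated scores give EER = 0.0
```

With overlapping genuine and impostor distributions, as in real iris matching, the EER rises above zero; the fused scores in the dissertation reduce this overlap relative to the individual matchers.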
