
    Information Theoretic Methods For Biometrics, Clustering, And Stemmatology

    This thesis consists of four parts, three of which study issues related to theories and applications of biometric systems, and one which focuses on clustering. We establish an information theoretic framework and the fundamental trade-off between the utility and the security of biometric systems. Utility includes person identification and secret binding, while template protection, privacy, and secrecy leakage are the security issues addressed. A general model of biometric systems is proposed, in which secret binding and the use of passwords are incorporated. The system model captures major biometric system designs including biometric cryptosystems, cancelable biometrics, secret binding and secret generating systems, and salt biometric systems. In addition to attacks at the database, information leakage from communication links between sensor modules and databases is considered. A general information theoretic rate outer bound is derived for characterizing and comparing the fundamental capacity, and the security risks and benefits, of different system designs. We establish connections between linear codes and biometric systems, so that one can directly use the vast literature of coding theory for various noise and source random processes to achieve good performance in biometric systems. We develop two biometrics based on laser Doppler vibrometry (LDV) signals and electrocardiogram (ECG) signals. In both cases, changes in the statistics of biometric traits of the same individual are the major challenge that prevents many methods from producing satisfactory results. We propose a robust feature selection method that specifically accounts for changes in statistics. The method yields the best results in both LDV and ECG biometrics in terms of equal error rates in authentication scenarios. Finally, we address a different kind of learning problem from data, called clustering. Instead of having a set of training data with true labels known, as in identification problems, we study the problem of grouping data points without given labels, and its application to computational stemmatology. Since the problem itself has no true answer, it is in general ill-posed unless some regularization or norm is set to define the quality of a partition. We propose the use of the minimum description length (MDL) principle for graph-based clustering. In the MDL framework, each data partitioning is viewed as a description of the data points, and the description that minimizes the total number of bits to describe the data points and the model itself is considered the best model. We show that on synthesized data MDL clustering works well and fits the natural intuition of how data should be clustered. Furthermore, we develop a computational stemmatology method based on MDL, which achieves the best performance level on a large dataset.
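
    The two-part MDL scoring just described can be illustrated with a minimal sketch. This is not the graph-based formulation developed in the thesis; it simply scores a partition of hypothetical discrete data as the bits needed to state each point's cluster label plus the bits needed to encode each point under its own cluster's empirical distribution:

```python
import math
from collections import Counter

def entropy_bits(values):
    """Empirical entropy (bits per symbol) of a sequence of discrete values."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def description_length(data, labels):
    """Two-part MDL score of a partition: bits to name each point's
    cluster plus bits to encode each point within its cluster."""
    n = len(data)
    label_bits = n * entropy_bits(labels)  # cost of describing the partition
    data_bits = 0.0
    for k in set(labels):
        cluster = [x for x, l in zip(data, labels) if l == k]
        # cost of describing the data given the partition
        data_bits += len(cluster) * entropy_bits(cluster)
    return label_bits + data_bits
```

A partition that matches the data's natural grouping yields a shorter description: splitting twenty points of two distinct symbols into their two pure groups costs fewer bits than an interleaved assignment of the same points.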

    Secret-key rates and privacy leakage in biometric systems

    In this thesis both the generation of secret keys from biometric data and the binding of secret keys to biometric data are investigated. These secret keys can be used to regulate access to sensitive data, services, and environments. In a biometric secrecy system a secret key is generated or chosen during an enrollment procedure in which biometric data are observed for the first time. This key is to be reconstructed after these biometric data are observed for the second time when authentication is required. Since biometric measurements are typically noisy, reliable biometric secrecy systems also extract so-called helper data from the biometric observation at the time of enrollment. These helper data facilitate reliable reconstruction of the secret key in the authentication process. Since the helper data are assumed to be public, they should not contain information about the secret key. We say that the secrecy leakage should be negligible. Important parameters of biometric key-generation and key-binding systems include the size of the generated or chosen secret key and the information that the helper data contain (leak) about the biometric observation. This latter parameter is called privacy leakage. Ideally the privacy leakage should be small, to prevent the biometric data of an individual from being compromised. Moreover, the secret-key length (also characterized by the secret-key rate) should be large to minimize the probability that the secret key is guessed and unauthorized access is granted. The first part of this thesis mainly focuses on the fundamental trade-off between the secret-key rate and the privacy-leakage rate in biometric secret-generation and secret-binding systems. This trade-off is studied from an information-theoretical perspective for four biometric settings. The first setting is the classical secret-generation setting as proposed by Maurer [1993] and Ahlswede and Csiszár [1993]. For this setting the achievable secret-key vs. 
privacy-leakage rate region is determined in this thesis. In the second setting the secret key is not generated by the terminals, but independently chosen during enrollment (key binding). Also for this setting the region of achievable secret-key vs. privacy-leakage rate pairs is determined. In settings three and four zero-leakage systems are considered. In these systems the public message should contain only a negligible amount of information about both the secret key and the biometric enrollment sequence. To achieve this, a private key is needed, which can be observed only by the two terminals. Again both the secret generation setting and chosen secret setting are considered. For these two cases the regions of achievable secret-key vs. private-key rate pairs are determined. For all four settings two notions of leakage are considered. Depending on whether one looks at secrecy and privacy leakage separately or in combination, unconditional or conditional privacy leakage is considered. Here unconditional leakage corresponds to the mutual information between the helper data and the biometric enrollment sequence, while the conditional leakage relates to the conditional version of this mutual information, given the secret. The second part of the thesis focuses on the privacy- and secrecy-leakage analysis of the fuzzy commitment scheme. Fuzzy commitment, proposed by Juels and Wattenberg [1999], is, in fact, a particular realization of a binary biometric secrecy system with a chosen secret key. In this scheme the helper data are constructed as a codeword from an error-correcting code, used to encode a chosen secret, masked with the biometric sequence that has been observed during enrollment. Since this scheme is not privacy preserving in the conditional privacy-leakage sense, the unconditional privacy-leakage case is investigated. Four cases of biometric sources are considered, i.e. 
memoryless and totally-symmetric biometric sources, memoryless and input-symmetric biometric sources, memoryless biometric sources, and stationary and ergodic biometric sources. For the first two cases the achievable rate-leakage regions are determined. In these cases the secrecy leakage rate need not be positive. For the other two cases only outer bounds on achievable rate-leakage regions are found. These bounds, moreover, are sharpened for fuzzy commitment based on systematic parity-check codes. Using the fundamental trade-offs found in the first part of this thesis, it is shown that fuzzy commitment is only optimal for memoryless totally-symmetric biometric sources and only at the maximum secret-key rate. Moreover, it is demonstrated that for memoryless and stationary ergodic biometric sources, which are not input-symmetric, the fuzzy commitment scheme leaks information on both the secret key and the biometric data. Biometric sequences have an often unknown statistical structure (model) that can be quite complex. The last part of this dissertation addresses the problem of finding the maximum a posteriori (MAP) model for a pair of observed biometric sequences and the problem of estimating the maximum secret-key rate from these sequences. A universal source coding procedure called the Context-Tree Weighting (CTW) method [1995] can be used to find this MAP model. In this thesis a procedure that determines the MAP model, based on the so-called beta-implementation of the CTW method, is proposed. Moreover, CTW methods are used to compress the biometric sequences and sequence pairs in order to estimate the mutual information between the sequences. However, CTW methods were primarily developed for compressing one-dimensional sources, while biometric data are often modeled as two-dimensional processes. Therefore it is proved here that the entropy of a stationary two-dimensional source can be expressed as a limit of a series of conditional entropies. 
This result is also extended to the conditional entropy of one two-dimensional source given another one. As a consequence, entropy and mutual information estimates can be obtained from CTW methods using properly-chosen templates. Using such techniques, estimates of the maximum secret-key rate for physical unclonable functions (PUFs) are determined from a dataset of observed sequences. PUFs can be regarded as inanimate analogues of biometrics.
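
    The fuzzy commitment construction analysed above (helper data formed by masking a codeword encoding the chosen secret with the enrollment sequence) can be sketched with a toy repetition code. The repetition factor, sequence lengths, and the SHA-256 verification check below are illustrative assumptions, not parameters from the thesis:

```python
import hashlib

R = 5  # repetition factor of the toy error-correcting code (assumed)

def encode(secret_bits):
    """Repetition code: each secret bit becomes R identical codeword bits."""
    return [b for b in secret_bits for _ in range(R)]

def decode(noisy_bits):
    """Majority vote over each block of R bits."""
    return [int(sum(noisy_bits[i * R:(i + 1) * R]) > R // 2)
            for i in range(len(noisy_bits) // R)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def enroll(secret_bits, biometric_bits):
    """Store helper = codeword XOR biometric, plus a hash of the secret."""
    helper = xor(encode(secret_bits), biometric_bits)
    digest = hashlib.sha256(bytes(secret_bits)).hexdigest()
    return helper, digest

def authenticate(helper, digest, biometric_bits):
    """Unmask with the fresh (noisy) measurement, decode, compare hashes."""
    recovered = decode(xor(helper, biometric_bits))
    return hashlib.sha256(bytes(recovered)).hexdigest() == digest
```

A fresh measurement with a bit error or two per code block still recovers the secret, while an unrelated sequence does not; the helper data itself is what carries the privacy leakage studied in the thesis.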

    Privacy and Security Assessment of Biometric Template Protection


    Security issues in helper data systems


    On the performance of helper data template protection schemes

    The use of biometrics looks promising, as it is already being applied in electronic passports (ePassports) on a global scale. Because the biometric data has to be stored as a reference template on either a central or personal storage device, its wide-spread use introduces new security and privacy risks such as (i) identity fraud, (ii) cross-matching, (iii) irrevocability and (iv) leaking sensitive medical information. Mitigating these risks is essential to obtain acceptance from the subjects of the biometric systems and thereby to facilitate successful implementation on a large-scale basis. A solution to mitigate these risks is to use template protection techniques. The required protection properties of the stored reference template according to ISO guidelines are (i) irreversibility, (ii) renewability and (iii) unlinkability. A known template protection scheme is the helper data system (HDS). The fundamental principle of the HDS is to bind a key with the biometric sample with use of helper data and cryptography, such that the key can be reproduced or released given another biometric sample of the same subject. The identity check is then performed in a secure way by comparing the hash of the key. Hence, the size of the key determines the amount of protection. This thesis extensively investigates the HDS system, namely (i) the theoretical classification performance, (ii) the maximum key size, (iii) the irreversibility and unlinkability properties, and (iv) the optimal multi-sample and multi-algorithm fusion method. The theoretical classification performance of the biometric system is determined by assuming that the features extracted from the biometric sample are Gaussian distributed. With this assumption we investigate the influence of the bit extraction scheme on the classification performance. With use of the theoretical framework, the maximum size of the key is determined by assuming the error-correcting code to operate on Shannon's bound. We also show three vulnerabilities of the HDS that affect the irreversibility and unlinkability properties, and propose solutions. Finally, we study the optimal level of applying multi-sample and multi-algorithm fusion with the HDS at either feature-, score-, or decision-level.
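
    The Shannon-bound argument mentioned above admits a one-line estimate: if the enrollment and verification versions of an n-bit template differ independently with probability p, a code operating at the capacity of a binary symmetric channel leaves at most n(1 - h(p)) extractable key bits, where h is the binary entropy function. A minimal sketch (the template length and bit-error rate in the example are made up for illustration):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def max_key_bits(n, p):
    """Upper bound on the key size extractable from an n-bit template
    whose bits flip with probability p between enrollment and
    verification, assuming a code operating at BSC(p) capacity."""
    return math.floor(n * (1 - h2(p)))
```

For example, a hypothetical 1000-bit template with a 10% bit-error rate supports at most 531 key bits, and the bound collapses to zero as p approaches one half.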

    Recent Application in Biometrics

    In recent years, a number of recognition and authentication systems based on biometric measurements have been proposed. Algorithms and sensors have been developed to acquire and process many different biometric traits. Moreover, biometric technology is being used in novel ways, with potential commercial and practical implications for our daily activities. The key objective of the book is to provide a collection of comprehensive references on some recent theoretical developments as well as novel applications in biometrics. The topics covered in this book reflect both aspects of development well. They include biometric sample quality, privacy preserving and cancellable biometrics, contactless biometrics, novel and unconventional biometrics, and the technical challenges in implementing the technology in portable devices. The book consists of 15 chapters. It is divided into four sections, namely, biometric applications on mobile platforms, cancelable biometrics, biometric encryption, and other applications. The book was reviewed by editors Dr. Jucheng Yang and Dr. Norman Poh. We deeply appreciate the efforts of our guest editors: Dr. Girija Chetty, Dr. Loris Nanni, Dr. Jianjiang Feng, Dr. Dongsun Park and Dr. Sook Yoon, as well as a number of anonymous reviewers.

    Protecting the infrastructure: 3rd Australian information warfare & security conference 2002

    The conference is hosted by the We-B Centre (working with a-business) in the School of Management Information Systems, the School of Computer & Information Sciences at Edith Cowan University. This year's conference is being held at the Sheraton Perth Hotel in Adelaide Terrace, Perth. Papers for this conference have been written by a wide range of academics and industry specialists. We have attracted participation from both national and international authors and organisations. The papers cover many topics, all within the field of information warfare and its applications, now and into the future. The papers have been grouped into six streams: • Networks • IWAR Strategy • Security • Risk Management • Social/Education • Infrastructure

    Towards Usable End-user Authentication

    Authentication is the process of validating the identity of an entity, e.g., a person, a machine, etc.; the entity usually provides a proof of identity in order to be authenticated. When the entity to be authenticated is a human, the authentication process is called end-user authentication. Making end-user authentication usable entails making it easy for a human to obtain, manage, and input the proof of identity in a secure manner. In machine-to-machine authentication, both ends have comparable memory and computational power to securely carry out the authentication process using cryptographic primitives and protocols. On the contrary, as a human has limited memory and computational power, in end-user authentication cryptography is of little use. Although password-based end-user authentication has many well-known security and usability problems, it is the de facto standard. Almost half a century of research effort has produced a multitude of end-user authentication methods more sophisticated than passwords; yet, none has come close to replacing passwords. In this dissertation, taking advantage of the built-in sensing capability of smartphones, we propose an end-user authentication framework for smartphones, called ePet, which does not require any active participation from the user most of the time; thus the proposed framework is highly usable. Using data collected from subjects, we validate a part of the authentication framework for the Android platform. For web authentication, in this dissertation, we propose a novel password creation interface, which helps a user remember a newly created password with more confidence by allowing her to perform various memory tasks built upon her new password. Declarative and motor memory help the user remember and efficiently input a password. From a within-subjects study we show that declarative memory is sufficient for passwords; motor memory mostly facilitates the input process, and thus the memory tasks have been designed to help cement the declarative memory for a newly created password. This dissertation concludes with an evaluation of the increased usability of the proposed interface through a between-subjects study.

    Icarus, or the idea toward efficient, economical, and ethical acquirement of critical governmental information systems

    Critical governmental information systems are governmental information systems used by the administration to retain information and provide services that enable and safeguard the lives, well-being and security of citizens. These include public healthcare information systems, electronic voting systems, and border control information systems with biometric registers. As the ongoing digitalisation of our society advances, our dependence on information systems, their functionality and their security continues to grow, and the quality of work done with them is increasingly intertwined with the daily lives of citizens. However, it has long been evident that IT projects in public administration are failing. Governmental information system projects often end up being astronomically expensive, unreasonably ineffective and ethically unsustainable. Therefore, much more attention should be paid to the procurement process for these systems. The reasons for these problems are manifold. Often, simple political motivations and appeals to, among other things, national pride are an easy way for politicians to attain support. The creation of large-scale system entities favours large private-sector actors specialised in governmental information systems. These companies' impact on society's decision-making and operations should not be underestimated. However, the lack of understanding of technology, of socio-technical problems, and of the structures of a digitalising society is likely the biggest problem in procuring these systems. It must be understood that an organisation's behaviour changes when its information systems are changed, including its ethical basis. To develop efficient, economical and ethical governmental information systems, a holistic approach is required: the ability to make decisions that take into account the needs and requirements of citizens, employees, system vendors, and society at large in an ethically sustainable way. 
It should be noted that the decision makers, payers, suppliers, and the targets of use are often different parties with their own desires, goals, and objectives. It is necessary to understand the functioning of technology and its potential, but also to grasp the limitations of technology and the potential threats it creates. Nor should the security required for these systems be underestimated. This dissertation examines, in the light of Ovid's telling of the myth of Daedalus and Icarus and through the framework of Aristotle's virtue ethics, the responsibility involved in governmental information system procurement, the role of responsible actors in society, and the capabilities required of them. The dissertation demonstrates that the approach prevailing in many situations, the Icarian method, whereby persons who have committed their lives to the virtue of scientific knowledge are dismissed from discussion and decision-making, is economically, operationally and ethically unsustainable. Its antithesis, the Daedalus effect, represents a desirable state in which decision-making is guided by caution as well as by scientific and ethical inspiration. The dissertation proposes that the procurement of critical governmental information systems should be guided by a prominent actor who has cultivated the virtue of holistic scientific wisdom.

    An enhanced fuzzy commitment scheme in biometric template protection

    Biometric template protection consists of two approaches: Feature Transformation (FT) and Biometric Cryptography (BC). This research focuses on the key-binding technique based on the Fuzzy Commitment Scheme (FCS) under the BC approach. In the FCS, the helper data should not disclose any information about the biometric data. However, the literature shows a dependency issue in the helper data which jeopardizes security and privacy. Moreover, this dependency increases the probability of privacy leakage, which leads to attacks such as brute-force and cross-matching attacks. Thus, the aim of this research is to reduce the dependency of the helper data that can cause privacy leakage. Three objectives were set: (1) to identify the factors that cause dependency on biometric features, (2) to enhance the FCS by proposing an approach that reduces this dependency, and (3) to evaluate the proposed approach based on parameters such as security, privacy, and biometric performance. This research involved four phases. Phase one involved research review and analysis, followed by conceptual-model design and algorithm development in phases two and three respectively. Phase four involved the evaluation of the proposed approach. The security and privacy analysis shows that, with the additional hash function, it is difficult for an adversary to perform a brute-force attack on information stored in the database. Furthermore, the proposed approach enhances unlinkability and prevents cross-matching attacks. The proposed approach achieves a high accuracy of 95.31% with an Equal Error Rate (EER) of 1.54%, performing slightly better, by 1.42%, than the existing approach. This research has contributed to the key-binding technique of biometric fingerprint template protection based on the FCS. In particular, this research was designed to create a secret binary feature that can be used in other state-of-the-art cryptographic systems by using an appropriate error-correcting approach that meets security standards.
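
    The cross-matching risk discussed above has a concrete form when the FCS is built on a linear code: the XOR of two codewords is itself a codeword, so XORing two helper strings derived from the same (or a very similar) biometric cancels the biometric and lands near the code, letting an adversary link records across databases. A minimal sketch with a toy repetition code (all parameters hypothetical):

```python
R = 5  # repetition factor of the toy linear code (assumed)

def encode(bits):
    """Repetition code: each bit becomes R identical codeword bits."""
    return [b for b in bits for _ in range(R)]

def decode(bits):
    """Majority vote over each block of R bits."""
    return [int(sum(bits[i * R:(i + 1) * R]) > R // 2)
            for i in range(len(bits) // R)]

def xor(a, b):
    return [x ^ y for x, y in zip(a, b)]

def linked(helper1, helper2, threshold=2):
    """Cross-matching test: if both helpers protect the same biometric,
    helper1 XOR helper2 equals the XOR of two codewords, which for a
    linear code is itself a codeword, so its distance to the nearest
    codeword is small."""
    diff = xor(helper1, helper2)
    nearest = encode(decode(diff))
    return sum(d != c for d, c in zip(diff, nearest)) <= threshold
```

This helper-data dependency is exactly the kind of linkability the proposed approach aims to reduce.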