
    Homomorphic Encryption for Speaker Recognition: Protection of Biometric Templates and Vendor Model Parameters

    Data privacy is crucial when dealing with biometric data. In light of the latest European data privacy regulation and payment services directive, biometric template protection is essential for any commercial application. By ensuring unlinkability across biometric service operators, irreversibility of leaked encrypted templates, and renewability of, e.g., voice models following the i-vector paradigm, biometric voice-based systems are prepared for the latest EU data privacy legislation. Employing Paillier cryptosystems, Euclidean and cosine comparators are known to meet these data privacy demands without loss of discrimination or calibration performance. Bridging the gap from template protection to speaker recognition, two architectures are proposed for the two-covariance comparator, which serves as a generative model in this study. The first architecture preserves the privacy of biometric data capture subjects. In the second architecture, the model parameters of the comparator are encrypted as well, so that biometric service providers can supply the same comparison modules, employing different key pairs, to multiple biometric service operators. An experimental proof of concept and a complexity analysis are carried out on data from the 2013-2014 NIST i-vector machine learning challenge.
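    As a hedged illustration of the kind of comparator this abstract describes, the sketch below computes a squared Euclidean distance between an encrypted probe i-vector and a plaintext reference model under the Paillier cryptosystem, using only the additive homomorphism and ciphertext-by-scalar multiplication. It assumes the third-party `phe` (python-paillier) package and made-up toy vectors; it is not the authors' two-covariance implementation.

```python
# Sketch: encrypted squared Euclidean distance with Paillier (additively homomorphic).
# Assumes the `phe` (python-paillier) package; toy 4-dimensional "i-vectors" for illustration.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

probe = [0.12, -0.40, 0.33, 0.05]      # client-side probe template (kept private)
reference = [0.10, -0.35, 0.30, 0.00]  # server-side plaintext reference model

# Client encrypts each component and the squared norm of the probe.
enc_probe = [public_key.encrypt(x) for x in probe]
enc_probe_norm = public_key.encrypt(sum(x * x for x in probe))

# Server computes Enc(||x||^2 - 2*x.y + ||y||^2) without seeing the probe:
# only ciphertext + ciphertext and ciphertext * plaintext-scalar are used.
enc_dist = enc_probe_norm
for enc_x, y in zip(enc_probe, reference):
    enc_dist = enc_dist + enc_x * (-2.0 * y)
enc_dist = enc_dist + sum(y * y for y in reference)

# The key holder (client or trusted party) decrypts the score and applies a threshold.
distance = private_key.decrypt(enc_dist)
print("squared Euclidean distance:", round(distance, 6))
```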

    Protection of privacy in biometric data

    Biometrics is commonly used in many automated verification systems, offering several advantages over traditional verification methods. Since biometric features are associated with individuals, their leakage will violate individuals' privacy, which can cause serious and continued problems as the biometric data from a person are irreplaceable. To protect the biometric data containing privacy information, a number of privacy-preserving biometric schemes (PPBSs) have been developed over the last decade, but they have various drawbacks. The aim of this paper is to provide a comprehensive overview of the existing PPBSs and give guidance for future privacy-preserving biometric research. In particular, we explain the functional mechanisms of popular PPBSs and present the state-of-the-art privacy-preserving biometric methods based on these mechanisms. Furthermore, we discuss the drawbacks of the existing PPBSs and point out the challenges and future research directions in PPBSs.

    The Gatekeepers of Research: Why a Data Protection Authority Holds the Key to Research in the New York Privacy Acts

    Biometric data is among the most sensitive of personal data because it is biologically tied and unique to the individual. Nonetheless, biometric data is an invaluable facet of the research that enables progressive scientific, technological, and medical innovation. Because a comprehensive federal data privacy act does not appear to be on the horizon, the torch has been passed to the states to create their own personal data protection regimes. New Yorkers’ personal biometric data is not aptly protected, partially because neither the New York Privacy Act nor the Biometric Privacy Act (collectively, the NY Privacy Acts) has matured to the point of becoming a legislative reality. This note seeks to establish that the NY Privacy Acts, while necessarily restricting data processing practices by businesses that endanger the consumer, fail to clearly define research and the boundaries of a sufficient research exemption from mandated erasure. To protect New Yorkers’ biometric data while simultaneously maximizing the benefits of biometric data to research, this note proposes that the New York legislature should amend the NY Privacy Acts to include a tripartite definition of “research,” inspired by the definitions of the General Data Protection Regulation, the California Consumer Privacy Act, and the California Privacy Rights Act, with a reasonable degree of added reverence for the “open science” concept. Finally, the New York legislature should mandate the establishment of both a data protection agency and a biometric data subcommittee that would ensure compliance with the elevated privacy standards required for biometric data while determining appropriate exemptions for research, thereby serving as the gatekeepers for research in the Empire State. Without these gatekeepers, New York would be locking up research and throwing away the key.

    PABAU: Privacy Analysis of Biometric API Usage

    Biometric data privacy is becoming a major concern for many organizations in the age of big data, particularly in the ICT sector, because it may be easily exploited in apps. Most apps utilize biometrics by accessing common application programming interfaces (APIs); hence, we aim to categorize their usage. The categorization based on behavior may be closely correlated with the sensitive processing of a user's biometric data, hence highlighting crucial biometric data privacy assessment concerns. We propose PABAU, Privacy Analysis of Biometric API Usage. PABAU learns semantic features of methods in biometric APIs and uses them to detect and categorize the usage of biometric API implementations in the software according to their privacy-related behaviors. This technique bridges the communication and background knowledge gap between technical and non-technical individuals in organizations by providing an automated method for both parties to acquire a rapid understanding of the essential behaviors of biometric APIs in apps, as well as future support to data protection officers (DPOs) with legal documentation, such as conducting a Data Protection Impact Assessment (DPIA). Comment: Accepted by the 8th IEEE International Conference on Privacy Computing (PriComp 2022).
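    The abstract does not spell out PABAU's feature pipeline, but a minimal, hypothetical illustration of the underlying idea (mapping calls to well-known biometric APIs onto privacy-relevant behavior categories) might look like the rule-based sketch below. The Android API identifiers are real; the behavior labels and lookup rules are invented for illustration and are not PABAU's learned semantic features.

```python
# Hypothetical sketch: rule-based categorization of biometric API usage in app source.
# PABAU itself learns semantic features of API methods; this keyword lookup only
# illustrates the idea of mapping calls to privacy-relevant behavior categories.
import re

# Real Android/AndroidX biometric APIs; the behavior labels are illustrative assumptions.
API_BEHAVIOR = {
    "androidx.biometric.BiometricPrompt": "user authentication (template stays on device)",
    "android.hardware.fingerprint.FingerprintManager": "legacy fingerprint authentication",
    "android.hardware.biometrics.BiometricManager": "capability check (no biometric capture)",
}

def categorize_biometric_usage(source_code: str) -> dict:
    """Return {API class: behavior category} for every known biometric API referenced."""
    found = {}
    for api, behavior in API_BEHAVIOR.items():
        if re.search(re.escape(api), source_code):
            found[api] = behavior
    return found

if __name__ == "__main__":
    snippet = "import androidx.biometric.BiometricPrompt;  // login screen"
    for api, behavior in categorize_biometric_usage(snippet).items():
        print(f"{api} -> {behavior}")
```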

    Avoiding terminological confusion between the notions of 'biometrics' and 'biometric data': An investigation into the meanings of the terms from a European data protection and a scientific perspective

    This article has been motivated by an observation: the lack of rigor by European bodies when they use scientific terms to address data protection and privacy issues raised by biometric technologies and biometric data. In particular, they improperly use the term ‘biometrics’ to mean at the same time ‘biometric data’, ‘identification method’, or ‘biometric technologies’. Based on this observation, there is a need to clarify what ‘biometrics’ means for the biometric community and whether and how the legal community should use the term in a data protection and privacy context. In parallel to that exercise of clarification, there is also a need to investigate the current legal definition of ‘biometric data’ as framed by European bodies at the level of the European Union and the Council of Europe. The comparison of the regulatory and scientific definitions of the term ‘biometric data’ reveals that the term is used in two different contexts. However, it is legitimate to question the influence that the scientific definition could exercise on the regulatory definition. More precisely, the question is whether the technical process through which biometric information is extracted and transformed into a biometric template should be reflected in the regulatory definition of the term.

    Anonymous subject identification and privacy information management in video surveillance

    The widespread deployment of surveillance cameras has raised serious privacy concerns, and many privacy-enhancing schemes have been recently proposed to automatically redact images of selected individuals in the surveillance video for protection. Of equal importance are the privacy and efficiency of techniques to, first, identify those individuals for privacy protection and, second, provide access to original surveillance video contents for security analysis. In this paper, we propose an anonymous subject identification and privacy data management system to be used in privacy-aware video surveillance. The anonymous subject identification system uses iris patterns to identify individuals for privacy protection. Anonymity of the iris-matching process is guaranteed through the use of a garbled-circuit (GC)-based iris matching protocol. A novel GC complexity reduction scheme is proposed by simplifying the iris masking process in the protocol. A user-centric privacy information management system is also proposed that allows subjects to anonymously access their privacy information via their iris patterns. The system is composed of two encrypted-domain protocols: the privacy information encryption protocol encrypts the original video records using the iris pattern acquired during the subject identification phase, while the privacy information retrieval protocol allows the video records to be anonymously retrieved through a GC-based iris pattern matching process. Experimental results on a public iris biometric database demonstrate the validity of our framework.
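    As a hedged aside, the comparison that such a garbled-circuit protocol evaluates is, in the clear, the standard masked fractional Hamming distance between two iris codes. The NumPy sketch below shows that plaintext functionality only; the toy codes, noise level, and 0.32 threshold are invented, and neither the GC protocol nor the paper's complexity-reduction scheme is reproduced.

```python
# Sketch of the plaintext functionality behind masked iris matching:
# fractional Hamming distance over the bits that both masks mark as valid.
# The garbled-circuit protocol evaluates this jointly on private inputs;
# the random codes and the 0.32 threshold here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
code_a = rng.integers(0, 2, 2048, dtype=np.uint8)   # probe iris code (bits)
code_b = code_a.copy()
code_b[rng.random(2048) < 0.05] ^= 1                 # reference code with ~5% bit noise
mask_a = rng.random(2048) > 0.1                      # valid-bit masks (eyelid/noise removal)
mask_b = rng.random(2048) > 0.1

valid = mask_a & mask_b
fhd = np.count_nonzero((code_a ^ code_b) & valid) / np.count_nonzero(valid)
print(f"fractional Hamming distance: {fhd:.3f}", "-> match" if fhd < 0.32 else "-> non-match")
```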

    Privacy-Preserving Authentication: A Homomorphic Encryption Approach

    The importance of privacy for individuals has become increasingly evident in recent years as the amount of personal data being collected, stored, and used by both private companies and government institutions has grown exponentially. The potential for this data to be misused or mishandled has led to widespread concern among individuals about the protection of their personal information. In response to these concerns, there has been a rise in the development of privacy-preserving technologies, which aim to protect personal data while still allowing it to be used for legitimate purposes. These technologies are necessary not only to address the concerns of individuals, but also to meet the legal requirements of institutions that handle personal information. Many applications using personal information as a commodity can benefit from privacy-preserving technologies. The research presented in this thesis targets a commonly used Internet application in which privacy-enhancing technologies can play a key role: biometric-based authentication. Authentication is the establishment of one party’s identity to the other. Biometric data, such as faces, fingerprints, or irises, are used more and more commonly as a means of providing personal identification and authentication. However, authentication protocols using biometric data face serious privacy concerns, as the data involved is sensitive or personally identifiable, which makes it necessary for data holders to protect its privacy. The widespread use of this application, and the need to protect user privacy, motivated us to examine how homomorphic encryption, a privacy-preserving technology, can be used and deployed to enhance privacy in such an application. Homomorphic encryption is a form of encryption that allows arbitrary computations to be performed on encrypted data, resulting in an encrypted result that, when decrypted, is the same as if the computation had been performed on the corresponding cleartext data. This means that entire computational processes can be executed on encrypted data without requiring the decryption key, thereby maintaining the privacy of the data involved. This can address both concerns from individuals regarding the protection of their personal and sensitive data, and legal requirements that institutions must meet. Homomorphic encryption can be used in an authentication protocol to allow a server to verify the authenticity of a client’s credentials without having access to the cleartext values of the credentials. In this thesis, we describe and prove secure two novel biometric-based authentication protocols that use homomorphic encryption to preserve the confidentiality of the biometric data both in storage and during use. These protocols ensure the privacy of the biometric information while still allowing it to be used for authentication purposes. Users of the protocols encrypt their own biometric data and send it to a remote server that performs computations, including the biometric matching, solely on encrypted data. One of the protocols is designed to protect biometric data privacy against an honest-but-curious server, and the other against a malicious server. Additionally, in both cases the user is securely authenticated by the server. For both protocols, implementation and performance results using public homomorphic encryption libraries are presented along with a security and usability assessment, including an evaluation against industry-standard biometric-based authentication schemes. In the most efficient implementation, the active authentication phase takes no more than three seconds to complete.
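    A minimal sketch of the general pattern this abstract describes (matching performed entirely on encrypted biometric vectors) is given below, assuming the TenSEAL CKKS library, unit-normalized toy vectors, and an invented acceptance threshold; the thesis's own protocols, security proofs, and choice of library are not reproduced here.

```python
# Sketch: biometric matching on encrypted data with CKKS (via TenSEAL).
# Both the enrolled template and the fresh probe stay encrypted at the server;
# only the client, who holds the secret key, can decrypt the similarity score.
# Library choice, vectors, and the 0.9 threshold are illustrative assumptions.
import tenseal as ts
import numpy as np

ctx = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()  # needed for the rotations inside the encrypted dot product

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

template = normalize([0.2, 0.7, 0.1, 0.6])   # enrolled biometric feature vector (toy)
probe = normalize([0.21, 0.69, 0.12, 0.58])  # fresh capture at authentication time (toy)

enc_template = ts.ckks_vector(ctx, template.tolist())  # stored encrypted at enrollment
enc_probe = ts.ckks_vector(ctx, probe.tolist())        # sent encrypted at login

enc_score = enc_probe.dot(enc_template)  # cosine similarity, computed on ciphertexts
score = enc_score.decrypt()[0]           # decryption requires the client's secret key
print("similarity:", round(score, 4), "-> accept" if score > 0.9 else "-> reject")
```

    CKKS is a natural fit here because it operates on packed vectors of approximate real numbers, so one ciphertext dot product (plus internal rotations) yields the whole cosine score; other schemes or protocols may of course have been used in the thesis itself.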

    Biometric privacy protection: guidelines and technologies

    Compared with traditional techniques used to establish the identity of a person, biometric systems offer a greater confidence level that the authenticated individual is not being impersonated by someone else. However, it is necessary to consider different privacy and security aspects in order to prevent possible thefts and misuses of biometric data. The effective protection of privacy must encompass different aspects, such as the perceived and real risks pertaining to the users, the specificity of the application, the adoption of correct policies, and the data protection methods employed. This chapter focuses on the most important privacy issues related to the use of biometrics, presents current guidelines for the implementation of privacy-protective biometric systems, and discusses methods for the protection of biometric data.

    Biometrics and the United Kingdom National Identity Register: Exploring the privacy dilemmas of proportionality and secondary use of biometric information

    Despite the obvious importance of privacy concerns in the information age, “privacy” remains a messy concept in the academic literature. Scholars are thus attempting to clarify and systematize the privacy concept. They have proposed two important dimensions of privacy concerns: 1) proportionality, or the adequate, relevant and non-excessive collection of personal data, and 2) secondary usage, or the prohibition of subsequent, unspecified uses of personal information. This paper takes the measure of the proportionality and potential secondary uses of biometric data in the proposed United Kingdom (UK) National Identity Register (NIR). It argues that the UK Identity Cards Act 2006 fails to guard against violations of the principles of proportionality and secondary usage of biometric data. After reviewing the modern literature on informational privacy protection, I analyze biometrics and their privacy implications. I then discuss these implications in the context of the UK government’s NIR plans. The analysis yields insights into how biometrics on the proposed NIR interplay with purpose specifications, architectural concerns, knowledge asymmetries, and public anxieties. I also explore potential secondary uses of the types of biometric data that could be stored in the NIR. Last, a brief note is offered about the possible means of regulating against privacy infringements.

    Finger Vein Template Protection with Directional Bloom Filter

    Biometrics has become a widely accepted solution for secure user authentication. However, the use of biometric traits raises serious concerns about the protection of personal data and privacy. Traditional biometric systems are vulnerable to attacks due to the storage of original biometric data in the system. Because biometric data cannot be changed once it has been compromised, the use of a biometric system is limited by the security of its template. To protect biometric templates, this paper proposes the use of directional Bloom filters as a cancellable biometric approach that transforms the biometric data into a non-invertible template for user authentication purposes. Recently, the Bloom filter has been used for template protection due to its efficiency with small template size, alignment invariance, and irreversibility. The directional Bloom filter improves on the original Bloom filter by generating hash vectors from directional sub-blocks rather than only the single-column sub-blocks of the original. In addition, we make use of multiple fingers to generate a biometric template, which is termed multi-instance biometrics; this improves performance by providing more information from the additional fingers. The proposed method is tested on three public datasets and achieves an equal error rate (EER) as low as 5.28% in the stolen or constant key scenario. Analysis shows that the proposed method meets the four properties of biometric template protection. DOI: 10.28991/HIJ-2023-04-02-013
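    For orientation, a hedged sketch of the basic column-wise Bloom-filter template protection that this directional variant builds on is shown below; the block sizes, the XOR key used for cancellability, and the toy binary feature map are illustrative assumptions, and the paper's directional sub-block hashing is not reproduced.

```python
# Sketch: basic column-wise Bloom-filter template protection (the baseline the
# directional variant extends). Block sizes, the XOR key, and the toy binary
# feature map are illustrative assumptions, not the paper's parameters.
import numpy as np

ROWS, COLS, BLOCK_COLS = 8, 64, 16   # binary feature map split into column blocks

def bloom_template(feature_map: np.ndarray, key: np.ndarray) -> np.ndarray:
    """Map each column of each block to one bit of a 2**ROWS-sized Bloom filter."""
    filters = np.zeros((COLS // BLOCK_COLS, 2 ** ROWS), dtype=np.uint8)
    for block in range(COLS // BLOCK_COLS):
        cols = feature_map[:, block * BLOCK_COLS:(block + 1) * BLOCK_COLS]
        for col in cols.T:
            col = col ^ key                      # application-specific key -> cancellability
            index = int("".join(map(str, col)), 2)
            filters[block, index] = 1            # non-invertible: column positions are lost
    return filters

def dissimilarity(t1: np.ndarray, t2: np.ndarray) -> float:
    """Normalized Hamming-style distance between two Bloom-filter templates."""
    return np.count_nonzero(t1 ^ t2) / (np.count_nonzero(t1) + np.count_nonzero(t2))

rng = np.random.default_rng(1)
key = rng.integers(0, 2, ROWS, dtype=np.uint8)
enrol = rng.integers(0, 2, (ROWS, COLS), dtype=np.uint8)   # toy binarized vein features
probe = enrol.copy()
probe[rng.random((ROWS, COLS)) < 0.03] ^= 1                # small capture noise

print("genuine score:", round(dissimilarity(bloom_template(enrol, key),
                                            bloom_template(probe, key)), 3))
```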