
    Hierarchical mixture models for assessing fingerprint individuality

    The study of fingerprint individuality aims to determine to what extent a fingerprint uniquely identifies an individual. Recent court cases have highlighted the need for measures of fingerprint individuality when a person is identified based on fingerprint evidence. The main challenge in studies of fingerprint individuality is to adequately capture the variability of fingerprint features in a population. In this paper, hierarchical mixture models are introduced to infer the extent of individualization. Hierarchical mixtures utilize complementary aspects of mixtures at different levels of the hierarchy. At the first (top) level, a mixture is used to represent homogeneous groups of fingerprints in the population, whereas at the second level, nested mixtures are used as flexible representations of distributions of features from each fingerprint. Inference for hierarchical mixtures is more challenging since unknown numbers of mixture components arise at both the first and second levels of the hierarchy. A Bayesian approach based on reversible jump Markov chain Monte Carlo methodology is developed for the inference of all unknown parameters of hierarchical mixtures. The methodology is illustrated on fingerprint images from the NIST database and is used to make inference on fingerprint individuality estimates from this population. (Published in the Annals of Applied Statistics, http://dx.doi.org/10.1214/09-AOAS266, by the Institute of Mathematical Statistics, http://www.imstat.org/aoas/.)
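As a toy illustration of the two-level structure described above, the sketch below draws features from a hierarchical mixture: a top-level mixture picks a homogeneous fingerprint group, and a nested per-group mixture generates that fingerprint's features. All weights, means, and component counts here are invented for illustration; they are not the paper's fitted values, and the reversible jump inference itself is not shown.

```python
import random

random.seed(0)

# Assumed toy parameters (not from the paper): two top-level groups,
# each with its own nested mixture of Gaussian feature components.
GROUP_WEIGHTS = [0.6, 0.4]          # top-level mixture weights
NESTED = [
    [(0.0, 1.0), (5.0, 1.0)],       # group 0: two nested components (mean, sd)
    [(10.0, 2.0)],                  # group 1: one nested component
]

def sample_fingerprint(n_features):
    """Draw one fingerprint: pick a group at the top level, then draw
    each feature from that group's nested mixture (uniform nested weights
    for simplicity)."""
    g = random.choices(range(len(GROUP_WEIGHTS)), weights=GROUP_WEIGHTS)[0]
    feats = []
    for _ in range(n_features):
        mu, sd = random.choice(NESTED[g])
        feats.append(random.gauss(mu, sd))
    return g, feats

g, feats = sample_fingerprint(50)
```

A reversible jump sampler would additionally propose adding or deleting mixture components at both levels, accepting each move with a Metropolis-Hastings ratio, which is how the unknown component counts are inferred.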

    Method for estimating potential recognition capacity of texture-based biometrics

    When adopting an image-based biometric system, an important factor for consideration is its potential recognition capacity, since it not only defines the potential number of individuals likely to be identifiable, but also serves as a useful figure-of-merit for performance. Based on block transform coding commonly used for image compression, this study presents a method to enable coarse estimation of potential recognition capacity for texture-based biometrics. Essentially, each image block is treated as a constituent biometric component, and the image texture contained in each block is binary coded to represent the corresponding texture class. The statistical variability among the binary values assigned to corresponding blocks is then exploited for estimation of potential recognition capacity. In particular, methodologies are proposed to determine appropriate image partition based on separation between texture classes, and the informativeness of an image block based on statistical randomness. By applying the proposed method to a commercial fingerprint system and a bespoke hand vein system, the potential recognition capacity is estimated to be around 10^36 for a fingerprint area of 25 mm^2, which is in good agreement with previously reported estimates, and around 10^15 for a hand vein area of 2268 mm^2, which has not been reported before.
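The block-coding idea can be sketched as follows: each block's texture is reduced to a bit, and only blocks whose bit varies sufficiently across the population count toward the capacity estimate. The function name, entropy threshold, and toy data below are assumptions for illustration, not the study's actual procedure.

```python
import math

def recognition_capacity(block_codes_per_subject, entropy_floor=0.5):
    """Coarse capacity sketch (assumed procedure, not the paper's exact one).

    block_codes_per_subject: list of equal-length bit lists, one per subject,
    where each bit is a binary texture code for one image block.
    A block is 'informative' if its bit has binary entropy >= entropy_floor
    across subjects; capacity is estimated as 2**(informative blocks).
    """
    n_blocks = len(block_codes_per_subject[0])
    n_subj = len(block_codes_per_subject)
    informative = 0
    for b in range(n_blocks):
        p = sum(subj[b] for subj in block_codes_per_subject) / n_subj
        if 0 < p < 1:  # constant blocks carry no identity information
            h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
            if h >= entropy_floor:
                informative += 1
    return 2 ** informative

# Toy population of four subjects, four blocks each; block 1 is constant
# across subjects, so only blocks 0, 2 and 3 are informative.
codes = [[0, 1, 1, 0], [1, 1, 0, 0], [0, 1, 1, 1], [1, 1, 0, 1]]
cap = recognition_capacity(codes)  # -> 2**3 = 8
```

Scaling this up, a fingerprint area partitioned into many informative blocks yields estimates of the order 10^36, as the abstract reports.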

    Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach: The Report of the Expert Working Group on Human Factors in Latent Print Analysis

    Fingerprints have provided a valuable method of personal identification in forensic science and criminal investigations for more than 100 years. Fingerprints left at crime scenes generally are latent prints: unintentional reproductions of the arrangement of ridges on the skin made by the transfer of materials (such as amino acids, proteins, polypeptides, and salts) to a surface. Palms and the soles of feet also have friction ridge skin that can leave latent prints. The examination of a latent print consists of a series of steps involving a comparison of the latent print to a known (or exemplar) print. Courts have accepted latent print evidence for the past century. However, several high-profile cases in the United States and abroad have highlighted the fact that human errors can occur, and litigation and expressions of concern over the evidentiary reliability of latent print examinations and other forensic identification procedures have increased in the last decade. "Human factors" issues can arise in any experience- and judgment-based analytical process such as latent print examination. Inadequate training, extraneous knowledge about the suspects in the case or other matters, poor judgment, health problems, limitations of vision, complex technology, and stress are but a few factors that can contribute to errors. A lack of standards or quality control, poor management, insufficient resources, and substandard working conditions constitute other potentially contributing factors.

    Probability, Individualization, and Uniqueness in Forensic Science Evidence: Listening to the Academies

    Day in and day out, criminalists testify to positive, uniquely specific identifications of fingerprints, bullets, handwriting, and other trace evidence. A committee of the National Academy of Sciences, building on the writing of academic commentators, has called for sweeping changes in the presentation and production of evidence of identification. These include some form of circumscribed and standardized testimony. But the Academy report is short on the specifics of the testimony that would be legally and professionally allowable. This essay outlines possible types of testimony that might harmonize the testimony of criminalists with the actual state of forensic science. It does so through a critical analysis of the arguments and proposals of two critics of "individualization" testimony in forensic science. By clarifying the relationship between uniqueness and individualization, the essay advances a slightly less skeptical view of individualization than that expounded by Professors Michael Saks and Jay Koehler. Among other things, the essay argues that there is no rule of probability, logic, or ontology that prevents individualization, and that testimony of uniqueness or individualization is scientifically acceptable in some situations. Recognizing that these situations are unusual, however, it also surveys some evidentiary rules and practices that could curb the excesses of the current form of testimony.

    A critical review of the current state of forensic science knowledge and its integration in legal systems

    Forensic science has a significant historical and contemporary relationship with the criminal justice system. It is a relationship between two disciplines whose origins stem from different backgrounds. It is trite that effective communication assists in resolving underlying problems in any given context. However, a lack of communication continues to characterise the intersection between law and science. As recently as 2019, a six-part symposium on the use of forensic science in the criminal justice system again posed the question of how the justice system could ensure the reliability of forensic science evidence presented during trials. While the law demands finality, science is always evolving and can never be considered finite or final. Legal systems do not always adapt to the nature of scientific knowledge, and are not willing to abandon finality when that scientific knowledge shifts. Advocacy plays an important role in the promotion of forensic science, particularly advocacy to the broader scientific community for financial support, much-needed research and more testing. However, despite its important function, advocacy should not be conflated with science. The foundation of advocacy is a cause, whereas the foundation of science is fact. The objective of this research was to conduct a qualitative literature review of the field of forensic science, and to identify gaps in the knowledge of forensic science and its integration in the criminal justice system. The literature review will provide researchers within the field of forensic science with suggested research topics requiring further examination and research. To achieve its objective, the study critically analysed the historical development of, and evaluated the use of, forensic science evidence in legal systems generally, including its role regarding the admissibility or inadmissibility of the evidence in the courtroom.
In conclusion, it was determined that the breadth of forensic scientific knowledge is comprehensive but scattered. The foundational underpinning of the four disciplines discussed in this dissertation has been put to the legal test on countless occasions. Some gaps remain that require further research in order to strengthen the foundation of the disciplines. Human influence will always be present in examinations and interpretations and will lean towards subjective decision making.

    How Jurors Evaluate Fingerprint Evidence: The Relative Importance of Match Language, Method Information, and Error Acknowledgment

    Fingerprint examiners use a variety of terms and phrases to describe a finding of a match between a defendant's fingerprints and fingerprint impressions collected from a crime scene. Despite the importance and ubiquity of fingerprint evidence in criminal cases, no prior studies examine how jurors evaluate such evidence. We present two studies examining the impact of different match phrases, method descriptions, and statements about possible examiner error on the weight given to fingerprint identification evidence by laypersons. In both studies, the particular phrase chosen to describe the finding of a match, whether simple and imprecise or detailed and claiming near certainty, had little effect on participants' judgments about the guilt of a suspect. In contrast, the examiner admitting the possibility of error reduced the weight given to the fingerprint evidence, regardless of whether the admission was made during direct or cross-examination. In addition, the examiner providing information about the method used to make fingerprint comparisons reduced the impact of admitting the possibility of error. We found few individual differences in reactions to the fingerprint evidence across a wide range of participant variables, and we found widespread agreement regarding the uniqueness of fingerprints and the reliability of fingerprint identifications. Our results suggest that information about the reliability of fingerprint identifications will have a greater impact on lay interpretations of fingerprint evidence than the specific qualitative or quantitative terms chosen to describe a fingerprint match.

    Characteristic and necessary minutiae in fingerprints

    Get PDF
    Fingerprints feature a ridge pattern with moderately varying ridge frequency (RF), following an orientation field (OF), which usually features some singularities. Additionally, at some points, called minutiae, ridge lines end or fork, and this point pattern is usually used for fingerprint identification and authentication. Whenever the OF features divergent ridge lines (e.g., near singularities), a nearly constant RF necessitates the generation of more ridge lines, originating at minutiae. We call these the necessary minutiae. It turns out that fingerprints feature additional minutiae which occur at rather arbitrary locations. We call these the random minutiae or, since they may convey fingerprint individuality beyond the OF, the characteristic minutiae. In consequence, the minutiae point pattern is assumed to be a realization of the superposition of two stochastic point processes: a Strauss point process (whose activity function is given by the divergence field) with an additional hard core, and a homogeneous Poisson point process, modelling the necessary and the characteristic minutiae, respectively. We perform Bayesian inference using a Markov chain Monte Carlo (MCMC)-based minutiae separating algorithm (MiSeal). In simulations, it provides good mixing and good estimation of underlying parameters. In application to fingerprints, we can separate the two minutiae patterns and verify, using the example of two different prints with similar OFs, that characteristic minutiae convey fingerprint individuality.
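The superposition model can be illustrated with a simplified simulation. Simulating a true Strauss process with hard core requires MCMC (e.g., a birth-death sampler), so as a stand-in the sketch below uses sequential hard-core thinning of a Poisson proposal for the necessary minutiae, superposed with a homogeneous Poisson process for the characteristic minutiae. The rates, the hard-core radius, and the unit-square window are invented for illustration and are not the paper's fitted model.

```python
import math
import random

random.seed(1)

def poisson_rv(lam):
    """Poisson count via Knuth's multiplication method (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def poisson_points(lam):
    """Homogeneous Poisson process on the unit square."""
    return [(random.random(), random.random()) for _ in range(poisson_rv(lam))]

def hard_core_thin(points, r):
    """Sequential hard-core thinning with random marks: a point is kept only
    if no already-kept point lies within distance r (a crude stand-in for a
    Strauss-with-hard-core pattern)."""
    kept = []
    for _, p in sorted((random.random(), p) for p in points):
        if all(math.dist(p, q) >= r for q in kept):
            kept.append(p)
    return kept

necessary = hard_core_thin(poisson_points(80), r=0.05)  # regular, hard-core pattern
characteristic = poisson_points(20)                     # completely random pattern
minutiae = necessary + characteristic                   # observed superposition
```

The inference task MiSeal addresses is the reverse of this simulation: given only the superposed pattern `minutiae`, assign each point to one of the two component processes.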