174 research outputs found

    A Class of Regression Models for Pairwise Comparisons of Forensic Handwriting Comparison Systems

    Handwriting analysis is a complex field rooted largely in forensic science and the legal realm. One task of a forensic document examiner (FDE) may be to determine the writer(s) of handwritten documents. Automated identification systems (AIS) were built to aid FDEs in their examinations. These AIS (such as FISH [5] [7], WANDA [6], CEDAR-FOX [17], and FLASHID®) are used, in part, to measure features of a handwriting sample and to provide the user with a numeric value of the evidence. These systems use their own algorithms and definitions of features to quantify the writing and can be considered black boxes. The outputs of two AIS are compared to the results of a survey of FDE writership opinions. In this dissertation I focus on the development of a response surface that characterizes the feature outputs of AIS. Using a set of handwriting samples, a pairwise metric, or scoring method, is applied to each of the individual features provided by the AIS to produce sets of pairwise scores. The pairwise scores lead to a degenerate U-statistic. We use a generalized least squares method to test the null hypothesis that there is no relationship between two metrics (β₁ = 0). Monte Carlo simulations are developed and run to ensure that the results, given the structure of the pairwise metric, behave as expected under the null hypothesis, and that the modeling will detect a relationship under the alternative hypothesis. The outcome of the significance tests helps to determine which of the metrics are related to each other.
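    The pairwise-scoring idea can be sketched with a toy example. Everything here is an illustrative assumption — the absolute-difference metric, the simulated features, and the ordinary least squares fit stand in for the dissertation's actual metrics and its generalized least squares model:

```python
import itertools
import numpy as np

def pairwise_scores(values):
    # Toy pairwise metric: absolute difference between every unordered pair.
    return np.array([abs(a - b) for a, b in itertools.combinations(values, 2)])

rng = np.random.default_rng(0)
n = 30
feat_a = rng.normal(size=n)                      # a feature from one AIS (simulated)
feat_b = feat_a + rng.normal(scale=0.1, size=n)  # a related feature from another AIS

s_a, s_b = pairwise_scores(feat_a), pairwise_scores(feat_b)

# Ordinary least squares slope of one score set on the other; the dissertation
# instead uses generalized least squares, because scores that share a sample
# are dependent (the U-statistic structure).
X = np.column_stack([np.ones_like(s_a), s_a])
beta = np.linalg.lstsq(X, s_b, rcond=None)[0]
print(f"estimated slope = {beta[1]:.3f}")
```

    When the two features are related, as here, the estimated slope sits near 1; under the null hypothesis of no relationship it would scatter around 0.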

    Latent Print Examination and Human Factors: Improving the Practice Through a Systems Approach: The Report of the Expert Working Group on Human Factors in Latent Print Analysis

    Fingerprints have provided a valuable method of personal identification in forensic science and criminal investigations for more than 100 years. Fingerprints left at crime scenes generally are latent prints—unintentional reproductions of the arrangement of ridges on the skin made by the transfer of materials (such as amino acids, proteins, polypeptides, and salts) to a surface. Palms and the soles of feet also have friction ridge skin that can leave latent prints. The examination of a latent print consists of a series of steps involving a comparison of the latent print to a known (or exemplar) print. Courts have accepted latent print evidence for the past century. However, several high-profile cases in the United States and abroad have highlighted the fact that human errors can occur, and litigation and expressions of concern over the evidentiary reliability of latent print examinations and other forensic identification procedures have increased in the last decade. “Human factors” issues can arise in any experience- and judgment-based analytical process such as latent print examination. Inadequate training, extraneous knowledge about the suspects in the case or other matters, poor judgment, health problems, limitations of vision, complex technology, and stress are but a few factors that can contribute to errors. A lack of standards or quality control, poor management, insufficient resources, and substandard working conditions constitute other potentially contributing factors.

    Development and Properties of Kernel-based Methods for the Interpretation and Presentation of Forensic Evidence

    The inference of the source of forensic evidence is related to model selection. Many forms of evidence can only be represented by complex, high-dimensional random vectors and cannot be assigned a likelihood structure. A common approach to circumvent this is to measure the similarity between pairs of objects composing the evidence. Such methods are ad hoc and unstable approaches to the judicial inference process. While these methods address the dimensionality issue, they also engender dependencies between scores when two scores have one object in common, dependencies that are not taken into account in these models. The model developed in this research captures the dependencies between pairwise scores from a hierarchical sample and models them in the kernel space using a linear model. Our model is flexible enough to accommodate any kernel satisfying basic conditions and, as a result, is applicable to any type of complex high-dimensional data. An important result of this work is the asymptotic multivariate normality of the scores as the data dimension increases. As a result, we can: 1) model very high-dimensional data when other methods fail; and 2) determine the source of multiple samples from a single trace in one calculation. Our model can be used to address high-dimensional model selection problems in different situations, and we show how to use it to assign Bayes factors to forensic evidence. We provide examples of real-life problems using data from very small particles and dust analyzed by SEM/EDX, and colors of fibers quantified by microspectrophotometry.
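    A minimal sketch of the pairwise-score setting, under assumptions of my own: a generic Gaussian (RBF) kernel as the similarity score and simulated 50-dimensional "trace" vectors, not the abstract's actual kernel or data:

```python
import numpy as np

def rbf_score(x, y, gamma=0.05):
    # Gaussian (RBF) kernel similarity between two high-dimensional objects.
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

rng = np.random.default_rng(1)
source = rng.normal(size=50)                               # one source's mean profile
replicates = source + rng.normal(scale=0.3, size=(5, 50))  # samples from that source
others = rng.normal(size=(5, 50))                          # unrelated objects

within = [rbf_score(replicates[i], replicates[j])
          for i in range(5) for j in range(i + 1, 5)]
between = [rbf_score(r, o) for r in replicates for o in others]

# Two within-source scores that share a replicate are dependent -- exactly the
# dependency the hierarchical linear model in kernel space is built to capture.
print(np.mean(within), np.mean(between))
```

    Within-source scores are markedly larger than between-source scores, which is what makes the scores useful for source inference once their dependencies are modeled.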

    Forensic Science: Daubert’s Failure


    The Rise of iWar: Identity, Information, and the Individualization of Modern Warfare

    During a decade of global counterterrorism operations and two extended counterinsurgency campaigns, the United States was confronted with a new kind of adversary. Without uniforms, flags, and formations, the task of identifying and targeting these combatants represented an unprecedented operational challenge for which Cold War era doctrinal methods were largely unsuited. This monograph examines the doctrinal, technical, and bureaucratic innovations that evolved in response to these new operational challenges. It discusses the transition from a conventionally focused, Cold War-era targeting process to one optimized for combating networks and conducting identity-based targeting. It analyzes the policy decisions and strategic choices that were the catalysts of this change and concludes with an in-depth examination of emerging technologies that are likely to shape how this mode of warfare will be waged in the future.

    A critical review of the current state of forensic science knowledge and its integration in legal systems

    Forensic science has a significant historical and contemporary relationship with the criminal justice system. It is a relationship between two disciplines whose origins stem from different backgrounds. It is trite that effective communication assists in resolving underlying problems in any given context. However, a lack of communication continues to characterise the intersection between law and science. As recently as 2019, a six-part symposium on the use of forensic science in the criminal justice system again posed the question of how the justice system could ensure the reliability of forensic science evidence presented during trials. While the law demands finality, science is always evolving and can never be considered finite or final. Legal systems do not always adapt to the nature of scientific knowledge, and are not willing to abandon finality when that scientific knowledge shifts. Advocacy plays an important role in the promotion of forensic science, particularly advocacy to the broader scientific community for financial support, much-needed research, and more testing. However, despite its important function, advocacy should not be conflated with science. The foundation of advocacy is a cause, whereas the foundation of science is fact. The objective of this research was to conduct a qualitative literature review of the field of forensic science and to identify gaps in the knowledge of forensic science and its integration in the criminal justice system. The literature review will provide researchers within the field of forensic science with suggested research topics requiring further examination and research. To achieve its objective, the study critically analysed the historical development of, and evaluated the use of, forensic science evidence in legal systems generally, including its role regarding the admissibility or inadmissibility of the evidence in the courtroom.
In conclusion, it was determined that the breadth of forensic scientific knowledge is comprehensive but scattered. The foundational underpinning of the four disciplines discussed in this dissertation has been put to the legal test on countless occasions. Some gaps remain that require further research in order to strengthen the foundation of the disciplines. Human influence will always be present in examinations and interpretations and will lean towards subjective decision making. (Jurisprudence, D. Phil.)

    What fingermarks reveal about activities

    Fingermarks play an important role in forensic science. Based on the ridge detail present in a fingermark, individualization or exclusion of a donor is possible by comparing a fingermark obtained from a crime scene to a reference fingerprint. In this process, the intrinsic features of a fingermark are used to determine its source. However, in some cases it is not the source of a fingermark that is disputed but the activity that led to its deposition. The question changes from ‘Who left the fingermark?’ to ‘How did the fingermark end up on the surface?’, which requires a different assessment of the findings. The aim of this dissertation is to determine how fingermarks can reliably provide information about activities, so that this information can be used in the forensic evidence process. To answer this main research question, several studies were conducted, described in Chapters 2 to 5 of this dissertation. Chapter 2 describes the development of a general framework to evaluate fingermarks given activity level propositions. Relevant variables that function as sources of information when evaluating fingermarks given activity level propositions were identified. Based on these variables, three Bayesian networks were presented for different evaluations of the fingermarks given activity level propositions in a case example. The presented networks function as a general framework for the evaluation of fingermarks given activity level propositions, which can be adapted to specific case circumstances. Chapter 3 shows how the framework proposed in Chapter 2 can be used in casework through a case example. In order to use a Bayesian network, probabilities need to be assigned to it. In this study, a case-specific experiment with the use of knives was conducted, and the resulting data were used to assign probabilities to two Bayesian networks, each focusing on a different use of the experimental data.
    This study has shown how different uses of the data resulting from a case-specific experiment on fingermarks can be used to assign probabilities to Bayesian networks for the evaluation of fingermarks given activity level propositions. In Chapter 4, we focus on the location of fingermarks on an item. In this study, we developed a classification model to evaluate the location of fingermarks given activity level propositions, based on an experiment with pillowcases. The results showed that fingermark patterns left on a pillowcase by smothering with a pillow can be well distinguished from fingermark patterns left by changing the pillowcase of a pillow. The result of this study is a model that can be used to study the location of fingermarks on two-dimensional items in general, for which it is expected that different activities will lead to different trace locations. Chapter 5 investigates the application of the location model presented in Chapter 4 to a dataset of letters, to study whether the model could also distinguish between fingermark patterns left when writing a letter and those left when reading a letter. Based on the results of this study, we conclude that the model proposed in Chapter 4 is indeed applicable to other objects for which it is expected that different activities lead to different fingermark locations, given the condition that the training set is representative of the object to be tested with regard to the size of the object and the activity carried out with it. This dissertation supports the view that fingermarks contain valuable information about the activity that caused their deposition and provides the forensic community with reliable methods that can be used when evaluating fingermarks given activity level propositions.
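    The core of an activity-level evaluation can be reduced to a likelihood ratio between the two propositions. The sketch below uses invented probabilities, not values from the pillowcase experiment or the dissertation's Bayesian networks:

```python
# Toy activity-level evaluation: the likelihood ratio for observing a given
# fingermark pattern under two activity propositions. Both conditional
# probabilities are invented for illustration only.
p_pattern_given_smothering = 0.70  # Hp: pattern left while smothering with the pillow
p_pattern_given_changing = 0.05    # Hd: pattern left while changing the pillowcase

likelihood_ratio = p_pattern_given_smothering / p_pattern_given_changing
print(f"LR = {likelihood_ratio:.0f}")  # prints "LR = 14"
```

    In practice, a Bayesian network computes these conditional probabilities from many interacting variables (transfer, persistence, background marks) rather than from two fixed numbers.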

    The Dialogue Between Forensic Scientists, Statisticians and Lawyers about Complex Scientific Issues for Court

    Since DNA analysis became part of forensic science in the mid-1980s, its impact on the investigation of crime and at court has been immense. In a few years the technique became the gold standard for evaluative evidence, overtaking some other evidence types and replacing others completely. Part of this impact was due to formal statistical calculations, replacing subjective opinions, on the weight of evidence provided for the prosecution and defence views. The technology has improved quickly, with ever more sensitive tests being introduced; the statistical interpretation of increasingly complex DNA results has not been as swift. The absence of a forum for inter-disciplinary discussion between developers and end users has led to methods being developed by statisticians with little input from working forensic scientists. It is forensic scientists who will be using the software, and they typically have little opportunity to discuss with lawyers how the data should be presented for non-scientific audiences such as judges, magistrates, and juries. There is a danger for courts in these interpretations, produced by a black box, where the reporting forensic scientist has little input and less understanding. It is time for a dialogue between the scientists producing the DNA results, the statisticians developing the calculation methods and software, and the lawyers who present the findings to the court.
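    The "formal statistical calculation" for a simple single-source DNA profile is worth seeing in miniature. The sketch below applies the textbook product rule; the genotype frequencies are invented for illustration and are not drawn from any real population database:

```python
# Minimal weight-of-evidence sketch for a single-source DNA profile: under
# the product rule, the likelihood ratio is 1 / (profile frequency in the
# relevant population). Frequencies below are hypothetical.
genotype_freqs = [0.08, 0.11, 0.05]  # three independent loci (assumed values)

profile_freq = 1.0
for f in genotype_freqs:
    profile_freq *= f  # product rule across independent loci

likelihood_ratio = 1.0 / profile_freq
print(f"LR = {likelihood_ratio:,.0f}")
```

    The hard cases the abstract describes — mixtures, low-template samples, degraded DNA — are precisely where this simple calculation no longer applies and probabilistic genotyping software takes over.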

    An examination of quantitative methods for Forensic Signature Analysis and the admissibility of signature verification system as legal evidence.

    The experiments described in this thesis deal with the handwriting characteristics involved in the production of forged and genuine signatures and with the complexity of signatures. The objectives of this study were (1) to provide sufficient detail on which signature characteristics are easier to forge, and (2) to investigate the capabilities of the signature complexity formula given by Found et al. on a different signature database provided by the University of Kent. This database includes the writing movements of 10 writers producing their genuine signatures and of 140 writers forging these sample signatures. Using the 150 genuine signatures of the Kent database, written without constraints, the complexity formula suggested by Found et al. was evaluated, dividing the signatures into three categories: low, medium, and high graphical complexity. The results of the formula implementation were compared with the opinions of three leading professional forensic document examiners employed by Key Forensics in the UK. The analysis of data for Study I reveals that there is not ample evidence that high-quality forgeries are possible after training. In addition, a closer view of the kinematics of the forging writers supports our main conclusion that forged signatures differ widely from genuine ones, especially in the kinematic domain. Of the 15 parameters used in this study, 11 showed significant changes when the two groups (genuine versus forged signatures) were compared, giving a clear picture of which parameters can assist forensic document examiners in examining signature forgeries. The movements of the majority of forgers are significantly slower than those of authentic writers. It is also clearly recognizable that the majority of forgers apply higher levels of pressure when trying to forge the genuine signature.
    The results of Study II, although limited and not entirely consistent with the study by Found that proposed this model, indicate that the model can provide valuable objective evidence (regarding complex signatures) in the forensic environment and justify its further investigation, but more work needs to be done before this type of model can be used in a court of law. The model was able to predict correctly only 53% of the FDEs’ opinions regarding the complexity of the signatures. Apart from the above investigations, this study also addresses the debate that has started in recent years challenging the validity of forensic handwriting experts’ skills, and the effort begun by interested parties in this sector to validate and standardise the field of forensic handwriting examination. This effort reveals that the forensic document analysis field meets the factors set by the Daubert ruling in terms of proven theory, education, training, certification, falsifiability, error rate, peer review and publication, and general acceptance. However, innovative methods are needed for the development of the forensic document analysis discipline. The most modern and effective solution to prevent observational and emotional bias would be the development of an automated handwriting or signature analysis system. Such a system would have many advantages in real-case scenarios. In addition, the significant role of computer-assisted handwriting analysis in the daily work of forensic document examiners (FDEs) and the judicial system is in agreement with the assessment of the National Research Council of the United States that “the scientific basis for handwriting comparison needs to be strengthened”; however, further research is required before these systems can reach that objective and overcome the legal obstacles presented in this study.
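    The kinematic finding — forgers write more slowly — is the kind of comparison an automated system would quantify. A minimal sketch under my own assumptions (the pen-speed means, spreads, and sample sizes below are invented, and Welch's t test stands in for whatever statistics the thesis actually used):

```python
import numpy as np

# Hypothetical pen-speed samples (mm/s) illustrating the reported pattern:
# forged signatures are written more slowly than genuine ones.
rng = np.random.default_rng(2)
genuine = rng.normal(loc=120, scale=15, size=30)  # assumed genuine-writer speeds
forged = rng.normal(loc=70, scale=20, size=30)    # assumed forger speeds

# Welch's two-sample t statistic, computed by hand.
mean_diff = genuine.mean() - forged.mean()
se = np.sqrt(genuine.var(ddof=1) / genuine.size + forged.var(ddof=1) / forged.size)
t_stat = mean_diff / se
print(f"t = {t_stat:.1f}")  # a large positive t: genuine writing is faster
```

    Repeating such a test over each of the 15 kinematic parameters is one way an automated system could reproduce the study's "11 of 15 significant" style of result objectively.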
