
    The Analytical, Technical Processes behind the Evaluation of Forensic Evidence through Questioned Document Examination

    A will is found next to a dead body. A suicide letter is left at a crime scene. A ransom note is found near a kidnapped individual. Questioned document examiners play a vital role within the forensic science community, helping to identify perpetrators and bring closure to cases. Handwriting, paper and ink analysis are the principal areas forensic scientists study to determine authorship, reliability and authenticity. Each person is unique, and specific features are showcased within an individual’s writing. Since the emergence of written language, people have developed and immersed themselves in the practice of writing, and individual features evolve from childhood learning styles into adult routines. Questioned document examiners locate minute characteristics through various techniques, methods and instrumental analyses. Handwriting features, paper inspection and ink examination together permit the identification of unique qualities. The Electrostatic Detection Apparatus and the Video Spectral Comparator are two of the machines most widely used by questioned document examiners to establish validity. After conducting tests, examiners are called to court as expert witnesses to testify regarding the evidence, following preparatory procedures and giving both verbal and visual demonstrations of samples. From the crime scene, through transportation to the crime laboratory, to the courtroom, evidence is handled with care under standard operating procedures. Appropriate handling of samples is always required to ensure precision, accuracy and the protection of evidence. Proper collection, analysis and demonstration of samples is essential in assisting the trier of fact, judge and jury, in reaching an ultimate decision.

    Artificial Intelligence and Pattern Evidence: A Legal Application for AI

    Artificial intelligence changes everything, and almost no jobs will be immune. The application of AI to the practice of law is well known and well understood. In this paper, we present some aspects of the related discipline of forensic science, specifically the development and analysis of “pattern [and impression] evidence.” We show that pattern evidence has a great need for AI. We discuss several applications in detail but focus mostly on the application of AI-based text analysis technology to forensic linguistics. Sociedad Argentina de Informática e Investigación Operativa
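A common baseline in computational forensic linguistics, though not necessarily the method used in this paper, is to compare character n-gram frequency profiles of a known and a questioned text. A minimal sketch, with invented sample texts:

```python
from collections import Counter
import math

def ngram_profile(text, n=3):
    """Relative frequencies of character n-grams in a text."""
    text = text.lower()
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values())
    return {g: c / total for g, c in grams.items()}

def cosine_similarity(p, q):
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(p[g] * q.get(g, 0.0) for g in p)
    norm = (math.sqrt(sum(v * v for v in p.values())) *
            math.sqrt(sum(v * v for v in q.values())))
    return dot / norm

# Invented texts, for illustration only:
known = "the suspect wrote this ransom note in a hurried hand"
questioned = "this ransom note was written in a hurried hand"
score = cosine_similarity(ngram_profile(known), ngram_profile(questioned))
# a score near 1.0 suggests similar character-level style;
# in practice thresholds must be calibrated on reference corpora
```

Real forensic systems add many more feature types (function words, syntax, spelling variants) and calibrated decision procedures; this only illustrates the profile-comparison idea.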

    An examination of quantitative methods for Forensic Signature Analysis and the admissibility of signature verification system as legal evidence.

    The experiments described in this thesis deal with the handwriting characteristics involved in the production of forged and genuine signatures, and with the complexity of signatures. The objectives of this study were (1) to provide sufficient detail on which signature characteristics are easier to forge, and (2) to investigate the capabilities of the signature complexity formula given by Found et al. on a different signature database, provided by the University of Kent. This database includes the writing movements of 10 writers producing their genuine signatures and of 140 writers forging these sample signatures. Using the 150 genuine signatures of the Kent database, produced without constraints, the complexity formula suggested by Found et al. was evaluated, dividing the signatures into three categories: low, medium and high graphical complexity. The results of the formula’s implementation were compared with the opinions of three leading professional forensic document examiners employed by Key Forensics in the UK. The analysis of the data for Study I reveals that there is not ample evidence that high-quality forgeries are possible after training. In addition, a closer view of the kinematics of the forging writers underpins our main conclusion: forged signatures differ widely from genuine ones, especially in the kinematic domain. Of the 15 parameters used in this study, 11 showed significant changes when the two groups (genuine versus forged signatures) were compared, giving a clear picture of which parameters can assist forensic document examiners in examining signature forgeries. The movements of the majority of forgers are significantly slower than those of authentic writers. It is also clearly recognizable that the majority of forgers apply higher levels of pressure when trying to forge a genuine signature.
The results of Study II, although limited and not entirely consistent with the study by Found et al. that proposed this model, indicate that the model can provide valuable objective evidence (regarding complex signatures) in the forensic environment and justify its further investigation; more work needs to be done, however, before models of this type can be used in a court of law. The model was able to predict correctly only 53% of the FDEs’ opinions regarding the complexity of the signatures. Apart from the above investigations, this study also addresses the debate that has arisen in recent years challenging the validity of forensic handwriting experts’ skills, and the effort begun by interested parties in this sector to validate and standardise the field of forensic handwriting examination. This effort reveals that the forensic document analysis field meets the factors set by the Daubert ruling in terms of proven theory, education, training, certification, falsifiability, error rate, peer review and publication, and general acceptance. However, innovative methods are needed for the development of the forensic document analysis discipline. The most modern and effective solution for preventing observational and emotional bias would be the development of an automated handwriting or signature analysis system, which would have many advantages in real case scenarios. In addition, the significant role of computer-assisted handwriting analysis in the daily work of forensic document examiners (FDEs) and the judicial system is in agreement with the assessment of the United States National Research Council that “the scientific basis for handwriting comparison needs to be strengthened”; it seems, however, that further research is required before these systems can accomplish this objective and overcome the legal obstacles presented in this study.
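The kinematic comparison described above (forgers moving more slowly and applying more pressure) can be sketched as a simple group comparison. The data and feature names below are invented for illustration and are not the thesis's actual parameter set:

```python
import math

def mean_speed(samples):
    """Average pen-tip speed over a stroke given (x, y, t) samples."""
    dist = sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:]))
    duration = samples[-1][2] - samples[0][2]
    return dist / duration

def cohens_d(a, b):
    """Effect size between two groups of per-signature measurements."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return (ma - mb) / pooled

# Invented per-signature mean pen pressures (arbitrary units):
genuine_pressure = [0.42, 0.45, 0.40, 0.44, 0.43]
forged_pressure  = [0.55, 0.58, 0.52, 0.57, 0.54]
# A large positive d would indicate forgers pressing harder, mirroring
# the direction of the finding reported in Study I.
d = cohens_d(forged_pressure, genuine_pressure)
```

The thesis's actual analysis uses 15 kinematic parameters and formal significance testing; this sketch shows only the shape of such a comparison.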

    Drawing Elena Ferrante's Profile. Workshop Proceedings, Padova, 7 September 2017

    Elena Ferrante is an internationally acclaimed Italian novelist whose real identity has been kept secret by E/O publishing house for more than 25 years. Owing to her popularity, major Italian and foreign newspapers have long tried to discover her real identity. However, only a few attempts have been made to foster a scientific debate on her work. In 2016, Arjuna Tuzzi and Michele Cortelazzo led an Italian research team that conducted a preliminary study and collected a well-founded, large corpus of Italian novels comprising 150 works published in the last 30 years by 40 different authors. Moreover, they shared their data with a select group of international experts on authorship attribution, profiling, and analysis of textual data: Maciej Eder and Jan Rybicki (Poland), Patrick Juola (United States), Vittorio Loreto and his research team, Margherita Lalli and Francesca Tria (Italy), George Mikros (Greece), Pierre Ratinaud (France), and Jacques Savoy (Switzerland). The chapters of this volume report the results of this endeavour that were first presented during the international workshop Drawing Elena Ferrante's Profile in Padua on 7 September 2017 as part of the 3rd IQLA-GIAT Summer School in Quantitative Analysis of Textual Data. The fascinating research findings suggest that Elena Ferrante’s work definitely deserves “many hands” as well as an extensive effort to understand her distinct writing style and the reasons for her worldwide success.
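Authorship-attribution studies of the kind applied to the Ferrante corpus often start from Burrows's Delta: z-score the relative frequencies of the most frequent (function) words across the corpus, then average the absolute z-score differences between two texts. A minimal sketch on invented toy texts, not the actual corpus or any specific contributor's method:

```python
import math

def word_freqs(text):
    """Relative word frequencies of a text."""
    words = text.lower().split()
    return {w: words.count(w) / len(words) for w in set(words)}

def burrows_delta(freqs_a, freqs_b, corpus_freqs, top_words):
    """Mean absolute difference of z-scored word frequencies (Burrows's Delta)."""
    deltas = []
    for w in top_words:
        vals = [f.get(w, 0.0) for f in corpus_freqs]
        mean = sum(vals) / len(vals)
        sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
        if sd == 0:
            continue  # word is uninformative across this corpus
        za = (freqs_a.get(w, 0.0) - mean) / sd
        zb = (freqs_b.get(w, 0.0) - mean) / sd
        deltas.append(abs(za - zb))
    return sum(deltas) / len(deltas)

# Invented three-text "corpus":
texts = ["the sea and the city and the friend",
         "a city of a sea of a story",
         "the friend and the story and the sea"]
corpus = [word_freqs(t) for t in texts]
top = ["the", "and", "a", "of"]  # function words, the usual Delta features
d01 = burrows_delta(corpus[0], corpus[1], corpus, top)
d02 = burrows_delta(corpus[0], corpus[2], corpus, top)
# lower Delta = more similar usage of the chosen function words
```

Real attribution studies use hundreds of most-frequent words over book-length texts; the toy corpus here only demonstrates the computation.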

    A Study in Authenticity : Admissible Concealed Indicators of Authority and Other Features of Forgeries : A Case Study on Clement of Alexandria, Letter to Theodore, and the Longer Gospel of Mark

    A standard approach in historically minded disciplines to documents and other artefacts that have become suspect is to concentrate on their dissimilarities with known genuine artefacts. While such an approach works reasonably well with relatively poor forgeries, more skilfully done counterfeits have tended to divide expert opinions, demanding protracted scholarly attention. As there has not been a widespread scholarly consensus on a constrained set of criteria for detecting forgeries, a pragmatic maximum for such dissimilarities—as there are potentially an infinite number of differences that can be enumerated between any two artefacts—has been impossible to set. Thus, rather than relying on a philosophically robust critical framework, scholars have been accustomed to approaching the matter on a largely case-by-case basis, with a handful of loosely formulated rules for guidance. In response to these shortcomings, this dissertation argues that a key characteristic of inquiry in historically minded disciplines should be the ability to distinguish between knowledge-claims that are epistemically warranted—i.e., that can be asserted post hoc from the material reality they have become embedded in with reference to some sort of rigorous methodological framework—and knowledge-claims that are not. An ancient letter by Clement of Alexandria (ca. 150–215 CE) to Theodore, in which two passages from the Longer Gospel of Mark (also known as the Secret Gospel of Mark) are quoted, has long been suspected of having been forged by Morton Smith (1915–1991), its putative discoverer. The bulk of this dissertation consists of four different articles that each use different methodological approaches. The first, a discourse analysis on scholarly debate over the letter’s authenticity, illuminates the reasons behind its odd character and troubled history. 
Second, archival research unearths how data points have become corrupted through unintended additions in digital-image processing (a phenomenon labelled line screen distortion here). Third, a quantitative study of the handwriting in Clement’s Letter to Theodore shows the inadequacy of unwittingly applying palaeographic standards in cases of suspected deceptions compared to the standards adhered to in forensic studies. Additionally, Smith’s conduct as an academic manuscript hunter is found to have been consistent with the standard practices of that profession. Finally, a study of the conceptual distinctions and framing of historical explanations in contemporary forgery discourse reveals the power of the methodologic approach of WWFD (What Would a Forger Do?), which has recently been used in three varieties (unconcealed, concealed, and hyperactive) to construe suspected documents as potential forgeries—despite its disregard of justificatory grounding in favour of coming up with free-form, first-person narratives in which the conceivable functions as its own justification. Together, the four articles illustrate the pitfalls of scholarly discourse on forgeries, especially that surrounding Clement’s Letter to Theodore. The solution to the poor argumentation that has characterized the scholarly study of forgeries is suggested to be an exercise in demarcation: to decide (in the abstract) which features should be acceptable as evidence either for or against the ascription of the status of forgery to an historical artefact. Implied within this suggestion is the notion of constraint, i.e., such that a constrained criterion would be one that cannot be employed to back up both an argument and its counter-argument. 
A topical case study—a first step on the road to creating a rigorous standard for constrained criteria in determining counterfeits—is the alternative narrative of an imagined creation of Clement’s Letter to Theodore by Smith around the time of its reported discovery (1958). Concealed indicators of authority, or the deliberate concealment of authorial details within the forged artefact by the forger, is established as a staple of the literary strategy of mystification, and their post hoc construction as acceptable evidence of authorship is argued to follow according to three criteria: 1) that the act of deciphering a concealed indicator of authority must have been preceded by a literary primer that is unambiguous to a high degree; 2) that, following the prompting of the literary primer, the act of decipherment must have adhered to a technique or method that is unambiguous to a high degree; and 3) that, the primer and the act of decipherment both having been practiced in a highly unambiguous manner, the plain-text solution to the concealed indicator of authority must likewise be unambiguous to a high degree.
This dissertation examines the letter of Clement of Alexandria (ca. 150–215 CE) to Theodore, which contains passages known as the Secret Gospel of Mark. In these passages, which are not part of the canonical New Testament, Jesus, among other things, raises a young man from the dead and teaches him the mystery of the kingdom of God. Clement’s letter testifies more broadly to the diversity of early Christianity, but it has also been suspected of being a forgery. No generally accepted method for identifying historical forgeries has been found.
Historians have had to assess suspected forgeries case by case, and skilfully executed forgeries often lead to long and heated debate. The core of the dissertation consists of four articles that examine Clement’s letter from different perspectives and also describe, more generally, the pitfalls of exposing historical forgeries. The first article uses discourse analysis to describe the exchange over the forgery claims, an exchange marked by scholars talking past and over one another. The second and third articles analyse the handwriting of Clement’s letter. They reveal that digital image processing has unintentionally altered details of the handwriting, and comparison material shows that the handwriting of Clement’s letter contains no “forger’s tremor” or other common hallmarks of forgery. The fourth article examines and problematises the way scholars justify forgery claims by creating imaginary stories to explain how the details of the alleged forgeries came about. The dissertation proposes that a robust scientific framework must be developed for exposing historical forgeries. Its summary chapter takes a first step in this direction by examining how the question of authenticity has been approached in literary studies, among other fields, and analyses the practice, typical of mystification as a literary genre, of hiding “concealed indicators of authorship” in forgeries. On the basis of this analysis it is noted that scholars have previously been tempted to develop wild forgery theories from various imagined clues and ciphers. To avoid this kind of “cryptanalytic hyperactivity”, criteria are needed for the use of “concealed indicators of authorship”:
according to the proposed criteria, only those “concealed indicators of authorship” can be accepted as real 1) whose existence is pointed to unambiguously, 2) whose decipherment proceeds by an unambiguous method, and 3) which name the author in an unambiguous manner.

    Investigating the Common Authorship of Signatures by Off-line Automatic Signature Verification without the Use of Reference Signatures

    In automatic signature verification, questioned specimens are usually compared with reference signatures. In writer-dependent schemes, a number of reference signatures are required to build up the individual signer model, while a writer-independent system requires a set of reference signatures from several signers to develop the model of the system. This paper addresses the problem of automatic signature verification when no reference signatures are available. The scenario we explore consists of a set of signatures which could be signed by the same author or by multiple signers. Accordingly, we discuss three methods which automatically estimate the common authorship of a set of off-line signatures. The first method develops a score similarity matrix, worked out with the assistance of duplicated signatures; the second uses a feature-distance matrix for each pair of signatures; and the last introduces a pre-classification based on the complexity of each signature. Publicly available signatures were used in the experiments, which gave encouraging results. As a baseline for the performance of our approaches, we carried out a visual Turing Test in which forensic and non-forensic human volunteers, carrying out the same task, performed less well than the automatic schemes.
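The feature-distance idea behind the second method can be sketched as follows. The four-dimensional feature vectors and the decision threshold below are invented for illustration and are not taken from the paper:

```python
import math
from itertools import combinations

def euclidean(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def common_authorship(feature_vectors, threshold):
    """Decide common authorship from the mean pairwise feature distance.

    A tight cluster (small mean distance) suggests a single signer; a
    spread-out set suggests multiple signers. The threshold would have to
    be calibrated on labelled data in any real system.
    """
    pairs = list(combinations(feature_vectors, 2))
    mean_dist = sum(euclidean(u, v) for u, v in pairs) / len(pairs)
    return mean_dist <= threshold, mean_dist

# Invented features (e.g. normalized height, slant, stroke width, density):
one_signer = [(0.50, 0.30, 0.20, 0.70),
              (0.52, 0.29, 0.21, 0.68),
              (0.49, 0.31, 0.19, 0.71)]
mixed      = [(0.50, 0.30, 0.20, 0.70),
              (0.90, 0.10, 0.60, 0.20),
              (0.15, 0.80, 0.40, 0.95)]
same, _ = common_authorship(one_signer, threshold=0.2)
diff, _ = common_authorship(mixed, threshold=0.2)
```

The paper's actual methods are richer (duplicated-signature score matrices, complexity-based pre-classification); this only illustrates the pairwise-distance decision structure.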

    Drawing, Handwriting Processing Analysis: New Advances and Challenges

    Drawing and handwriting are communicational skills that have been fundamental to geopolitical, ideological and technological evolutions throughout history, and they remain useful in defining innovative applications in numerous fields. In this regard, researchers have to solve new problems, such as those related to the manner in which drawing and handwriting can become an efficient way to command various connected objects, or to validating graphomotor skills as evident and objective sources of data in the study of human beings, their capabilities and their limits from birth to decline.

    Clinicopathological correlation in erythema induratum

    Background – Erythema induratum (EI) is a reactive disorder to Mycobacterium tuberculosis infection and a diagnosis not to be missed. Erythema nodosum (EN) is the main clinical differential of EI, but a distinctly different pathological condition that can be difficult to distinguish from EI. Methods – In this retrospective review we assessed the clinical and histological features of 40 EI cases and 16 EN cases. Six experienced dermatologists blindly diagnosed these cases based on clinical images; thereafter the histology was revealed and they adjusted their diagnoses accordingly. Fleiss kappa statistics were applied to determine inter-rater variability. A multivariate logistic regression model determined the clinical and histological features that contribute most to an accurate diagnosis. Results – After assessing the clinical picture, 48.8% of the EI cases and 74% of the EN cases were correctly diagnosed. With the added histology results, 67.1% of EI and 81.2% of EN cases were correct. EI cases showed inter-rater variability of 0.478 (p-value < 0.01) before and 0.469 (p-value < 0.01) after the histology was revealed. The most informative features, combined in a logistic regression model, gave a higher diagnostic accuracy than the assessors with regard to EI cases: the model was accurate in 100% and 80% of EI and EN cases respectively. Conclusions – While the study was limited by its retrospective nature and small sample size, valuable features (ulceration, vasculitis, and lobular or septal panniculitis) were identified. A biopsy of the lower leg markedly increased the diagnostic accuracy, but there was less concordance between assessors; more research is needed to confirm these results.
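The inter-rater agreement figures above use Fleiss's kappa, which can be computed directly from a subjects-by-categories count table. A minimal sketch; the rating table below is invented, not the study's data:

```python
def fleiss_kappa(table):
    """Fleiss's kappa for a table of shape (subjects x categories),
    where table[i][j] counts raters assigning subject i to category j.
    Assumes every subject was rated by the same number of raters."""
    n_subjects = len(table)
    n_raters = sum(table[0])
    # Per-subject observed agreement, averaged: P-bar
    p_bar = sum((sum(c * c for c in row) - n_raters) /
                (n_raters * (n_raters - 1)) for row in table) / n_subjects
    # Chance agreement from overall category proportions: P_e
    totals = [sum(row[j] for row in table) for j in range(len(table[0]))]
    grand = n_subjects * n_raters
    p_e = sum((t / grand) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Invented ratings: 6 raters classifying 3 cases as EI or EN
ratings = [[5, 1],   # case 1: 5 raters say EI, 1 says EN
           [2, 4],   # case 2: mostly EN
           [6, 0]]   # case 3: unanimous EI
kappa = fleiss_kappa(ratings)
```

Values near 0.47, as reported in the study, indicate moderate agreement on the conventional interpretation scale.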

    The Screenplay Business: Managing creativity in script development in the contemporary British independent film industry

    A screenplay is sometimes said to be a blueprint for a film, and its genesis and development are therefore important to our understanding of how films are created. Film business studies has traditionally avoided close study of the screenplay development process, perhaps as a result of the film studies emphasis on analysing the text of the completed film and the auteur theory emphasis on the importance of the director, both of which may have marginalised the study of development and the creativity of development practitioners. Professional screenplay development is a team activity, with creative collaboration between screenwriters, producers, development executives, financiers, and directors. So how do power and creative control shift between members of this team, especially as people arrive or leave? And how does this multiple authorship affect the auteur theory idea that the director is the creative author of the film? This research sets out to open debates around the process and nature of the business of script development, and to consider how development practitioners experience, collaborate and participate in the process of screenplay development in the UK today. It uses original interviews, observation and hermeneutic reflection, and asks how cross-disciplinary ideas around creativity, managing creative people, motivation, organisational culture, and team theory could be used to consider how the creative team of writer, producer, director and development executive can work effectively together. It proposes new theories, including defining the independent film value chain and the commitment matrix, analysing changing power relationships during development, and establishing new typologies around film categories and their relationship to funding. The core of this PhD by Prior Publication is the book The Screenplay Business: managing creativity and script development in the film industry.
The supporting paper explores the contexts of film industry studies, the film value chain, auteurship, and screenplay studies.