
    The long arm of the algorithm? Automated Facial Recognition as evidence and trigger for police intervention

    Criminal law's efficient and accurate administration depends to a considerable extent on the ability of decision-makers to identify unique individuals, circumstances and events as instances of abstract terms (such as events raising ‘reasonable suspicion’) laid out in the legal framework. Automated Facial Recognition (AFR) has the potential to revolutionise the identification process, facilitate crime detection, and eliminate misidentification of suspects. This paper commences from the recent decision regarding the deployment of AFR by South Wales Police in order to discuss the lack of an underpinning conceptual framework pertinent to a broader consideration of AFR in other contexts. We conclude that the judgment does not give the green light to other fact-sensitive deployments of AFR. We consider two of these: a) use of AFR as a trigger for intervention short of arrest; b) use of AFR in an evidential context in criminal proceedings. AFR may on the face of it appear objective and sufficient, but this is belied by the probabilistic nature of the output and the building of certain values into the tool, raising questions as to the justifiability of regarding the tool's output as an ‘objective’ ground for reasonable suspicion. The means by which the identification took place must be disclosed to the defence, if the Article 6 right to a fair trial is to be upheld, together with information regarding disregarded ‘matches’ and the error rates and uncertainties of the system itself. Furthermore, AFR raises the risk that scientific or algorithmic findings could usurp the role of the legitimate decision-maker, necessitating the development of a framework to protect the position of the human with decision-making prerogative.

    Proof beyond a context-relevant doubt. A structural analysis of the standard of proof in criminal adjudication

    The present article proceeds from the mainstream view that the conceptual framework underpinning adversarial systems of criminal adjudication, i.e. a mixture of common-sense philosophy and probabilistic analysis, is unsustainable. In order to provide fact-finders with an operable structure of justification, we need to turn to epistemology once again. The article proceeds in three parts. First, I examine the structural features of justification and how various theories have attempted to overcome Agrippa’s trilemma. Second, I put Inferential Contextualism to the test and show that a defeasible structure of justification, allocating epistemic rights and duties to all participants in an inquiry, manages to dissolve the problem of scepticism. Third, I show that our epistemic practice already embodies a contextualist mechanism. Our problem was not that our Standard of Proof is inoperable but that it had not been adequately conceptualized. Contextualism provides the framework to articulate the abovementioned practice and to treat ‘reasonable doubts’ as a mechanism which we can now describe in detail. The seemingly insurmountable problem with our efforts to define the concept of “reasonable doubts” was the fact that we have been conflating the surface features of this mechanism with its internal structure, i.e. the rules for its use.

    The Unbearable Lightness of Ignoring Axiomatic Principles – A Response to: “On Coping in a Non-Binary World: Rejoinder to Biedermann and Kotsoglou” (by Nicholas Scurich and Richard S. John, in: Statistics and Public Policy, 2024)

    This paper supplements our Commentary on “Three-Way ROCs for Forensic Decision Making” by Nicholas Scurich and Richard S. John (in: Statistics and Public Policy). We reply to allegations made by Scurich and John in their rejoinder “On Coping in a Non-Binary World: Rejoinder to Biedermann and Kotsoglou”. These allegations do not carry sufficient intellectual weight to warrant discussion in a published response. However, they are serious insofar as they misquote and misinterpret our Commentary, contain demonstrably false assertions, and thus amount to an attempt to mislead the scientific and legal communities. In this paper, we provide the context and background at the origin of this debate and examine each of Scurich and John’s allegations.

    Not ‘very English’: On the Use of the Polygraph by the Penal System in England and Wales

    One of the most striking developments in the penal system in England and Wales is the increasing use of the polygraph by probation services. Despite severe criticism from scientific institutions and in academic discourse, the legal order increasingly deploys the long-discredited polygraph in order to extract adverse statements from released offenders. Our article is structured as follows: First, we summarise the statutory and regulatory framework for the current use of the polygraph in the monitoring of sex offenders released on licence, and the proposed expansion of the polygraph testing regime as set out in the Domestic Abuse Bill and the Counter-Terrorism and Sentencing Bill respectively. We then review our findings in respect of governing policies and procedures uncovered by our FOI-based research, highlighting the concerning lack of consistency in respect of both practice and procedure. In the subsequent sections we set out the main arguments deployed by polygraph proponents and posit our view that none of these arguments can withstand scrutiny. We conclude by proposing a moratorium on any further use of the polygraph by the State, in order to thoroughly evaluate its effect on the integrity of the legal order, human rights and, more generally, the Rationalist aspirations of the penal system. In addition, and given already existing law, we propose a process of independent oversight and scrutiny of the use of the polygraph in licence recall decisions and other situations impacting individual rights, especially police investigations triggered by polygraph test results.

    Inroads into the Ultimate Issue Rule? Structural Elements of Communication between Experts and Fact-Finders

    One of the most persistent questions in criminal evidence relates to the use of (unchallenged) expert evidence. What does it mean to accept or reject (unchallenged) expert evidence? To what extent can, and should, an expert enter jurisprudential territory? Is the traditional model of trial by jury viable in our complex world? In order to clarify these pressing questions, we will examine the evidential structure underpinning expert witness testimony. We will show that what we usually and, at the cost of oversimplification, call ‘evidence’ comprises three distinct questions: (i) What does the data show? (ii) What should we believe? (iii) What should we do? From this insight, a number of corollaries fall into place. First, although decisions have to be informed through reasoned inferential procedures, they cannot be reduced to scientific propositions. As a result, fact-finders do not need to cede their decision-making prerogative, as some proponents of expert-driven decision-making suggest. Secondly, criminal liability is not a scientific conclusion. Rather, we argue, it is an individualistic normative construction that involves an inferential leap which is not warranted by any scientific (i.e. general) proposition. For the rectitude of the criminal verdict (or indeed any legal decision) does not map logically onto the possible treatment of scientific findings, that is, acceptance/rejection. Thirdly, our clarification of this evidential structure, which we call coherent decisionalism, provides a conceptual framework to understand and stabilise case law on expert witness testimony.

    Forensic science and the principle of excluded middle: “Inconclusive” decisions and the structure of error rate studies

    In a paper published recently in this journal, Dror and Scurich (2020) [20] critically discuss the notions of “inconclusive evidence” (i.e., test items for which it is difficult to render a categorical response) and “inconclusive decisions” (i.e., experts’ conclusions or responses) in the context of forensic science error rate studies. They expose several ways in which the understanding and use of “inconclusives” in current forensic science research and practice can adversely affect the outcomes of error rate studies. A main cause of distortion, according to Dror and Scurich, is what they call “erroneous inconclusive” decisions, in particular the lack of acknowledgment of this type of erroneous conclusion in the computation of error rates. To overcome this complication, Dror and Scurich call for a more explicit monitoring of “inconclusives” using a modified error rate study design. Whilst we agree with several well-argued points raised by the authors, we disagree with their framing of “inconclusive decisions” as potential errors. In this paper, we argue that referring to an “inconclusive decision” as an error is a contradiction in terms, runs counter to an analysis based on decision logic and, hence, is questionable as a concept. We also reiterate that the very term “inconclusive decision” disregards the procedural architecture of the criminal justice system across modern jurisdictions, especially the fact that forensic experts have no decisional rights in the criminal process. These positions do not ignore the possibility that “inconclusives” - if used excessively - do raise problems in forensic expert reporting, in particular limited assertiveness (or overcautiousness). However, these drawbacks derive from inherent limitations of experts rather than from a seemingly erroneous nature of “inconclusives” that needs to be fixed.
More fundamentally, we argue that attempts to score “inconclusives” as errors amount to philosophical claims disguised as forensic methodology. Specifically, these attempts interfere with the metaphysical substrate underpinning empirical research. We point this out on the basis of the law of the excluded middle, i.e. the principle of “no third possibility being given” (tertium non datur).

    Αρχές μιας δομικής θεώρησης των διαπραγματεύσεων στην ποινική δίκη. Ψυχολογική βία – Ομολογία – Απόδειξη [In Greek: A Structural Analysis of Guilty Pleas. Psychological Violence - Confession - Proof.]

    [Translated from Greek:] This study undertakes a structural analysis of the phenomenon of plea negotiations in the criminal trial. It first examines whether the doctrine of evidence law can smoothly accommodate plea negotiations within the conceptual and institutional framework of the evidentiary process. The incompatibility of plea negotiations with the continental procedural system then allows their unconstitutional character to be established. Next, the procedural conduct of the accused is examined conceptually, in order to determine whether the offer of a reduced sentence can inflict criminally relevant psychological suffering on him. The ‘sentencing gap’ created by the numerical difference between the ostensibly reduced sentence and the sentence that would be imposed if the case went to trial creates a dilemmatic situation in which the accused has no reasonable alternative but to enter a guilty plea. Finally, emphasis is placed on a recent attempt to provide a theoretical foundation for plea negotiations on the basis of the economic analysis of law. This attempt is assessed as contrary to the autonomy of law. Overall, the study demonstrates that plea negotiations constitute an existential threat to the continental system of criminal justice. Their incorporation into the criminal trial presupposes the suspension of the legal order in question.

    Digital evidence exceptionalism? A review and discussion of conceptual hurdles in digital evidence transformation

    Forensic science is currently undergoing a transformation and expansion to include modern types of evidence, such as evidence generated by digital investigations. This development is said to raise a series of challenges, in both operational and conceptual dimensions. This paper reviews and discusses a series of convoluted conceptual hurdles that are encountered in connection with the use of digital evidence as part of evidence and proof processes at trial, in contradistinction to investigative uses of such types of evidence. As a recent example raising such hurdles, we analyse and discuss assertions and proposals made in the article “Digital Evidence Certainty Descriptors (DECDs)” by Graeme Horsman (32 Forensic Science International: Digital Investigation (2020) 200896).

    Polygraph-based deception detection and machine learning. Combining the worst of both worlds?

    At a time when developments in computational approaches, often associated with the now much-vaunted terms Machine Learning (ML) and Artificial Intelligence (AI), face increasing challenges in terms of fairness, transparency and accountability, the temptation for researchers to apply mainstream ML methods to virtually any type of data seems to remain irresistible. In this paper we critically examine a recent proposal to apply ML to polygraph screening results (where human interviewers have made a conclusion about deception), which raises several questions about the purpose and the design of the research, particularly given the vacuous scientific status of polygraph-based procedures themselves. We argue that in high-stakes environments such as criminal justice and employment practice, where fundamental rights and principles of justice are at stake, the legal and ethical considerations for scientific research are heightened. Specifically, we argue that the combination of ambiguously labelled data and ad hoc ML models does not meet this requirement. Worse, such research can inappropriately legitimise otherwise scientifically invalid, indeed pseudo-scientific, methods such as polygraph-based deception detection, especially when presented in a reputable scientific journal. We conclude that methodological concerns, such as those highlighted in this paper, should be addressed before research can be said to contribute to resolving any of the fundamental validity issues that underlie methods and techniques used in legal proceedings.