    Burdens of Persuasion in Civil Cases: Algorithms v. Explanations

    The conjunction paradox has fascinated generations of scholars, primarily because it brings into focus the apparent incompatibility of two equally well accepted conventions. On the one hand, trials should be structured to reduce the total number, or optimize the allocation, of errors. On the other hand, the standard jury instruction allocates burdens of persuasion to the individual elements of a claim rather than to the case as a whole. Because an erroneous finding that any element of the plaintiff's cause of action is true will produce an error if liability is found, errors on the overall case accumulate with errors on the discrete issues. This, in turn, means that errors will be neither minimized nor optimized (except possibly at random). Thus, the conventional view of the purpose of trial is inconsistent with the conventional view of the allocation of burdens of persuasion. This article examines two recent efforts to resolve the conflict. Dean Saul Levmore has argued that the paradox is eliminated, or considerably reduced, by the implications either of the Condorcet Jury Theorem or of supermajority voting rules. Professor Alex Stein has constructed a micro-economic explanation of negligence that is also offered as resolving the paradox. Neither succeeds, and both fail for analogous reasons. First, each makes a series of ad hoc adjustments to supposedly formal arguments that are out of place in formal reasoning. The result is that neither argument is, in fact, formal; both thus implicitly reject the very formalisms they purport to employ. Second, both articles mismodel the system of litigation they are trying to explain in an effort to close the gap between their supposedly formal models and the reality of the legal system; once the necessary corrections are made to their respective models, neither formal argument maps onto the reality of trials, leaving the original problem untouched and unexplained. These two efforts thus closely resemble the earlier attempt to give a Bayesian explanation of trials and juridical proof, which foundered on the inability to align the formal requirements of subjective Bayesianism with the reality of modern trials. We also explore the reasons for this consistent misuse of formal arguments in the evidentiary context. Rationality requires, at a minimum, sensitivity to the intellectual tools brought to a task, of which algorithmic theoretical accounts are only one among many. Another, somewhat neglected in legal scholarship, is substantive explanation of legal questions that takes into account the surrounding legal landscape. As we show, although the theoretical efforts to domesticate the conjunction paradox fail, a substantive explanation can be given that demonstrates the small likelihood of perverse consequences flowing from it. The article thus adds to the growing literature on the nature of legal theorizing by demonstrating yet another area where legal theorizing in one of its modern conventional manifestations (the search for an algorithmic argument that purportedly explains or justifies an area of law) has been ineffectual, whereas explanations informed by the substantive contours of the relevant legal field hold considerable promise.
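    To make the arithmetic behind the paradox concrete, here is a minimal sketch in Python; the element names, the 0.6 figures, and the independence assumption are hypothetical illustrations, not taken from the article itself:

    # Conjunction paradox, illustrated with hypothetical numbers and an
    # assumed independence across elements (a deliberate simplification).
    # Under element-wise burdens, the plaintiff wins if each element
    # separately clears the 0.5 preponderance threshold.
    elements = {"duty": 0.6, "breach": 0.6, "causation": 0.6}

    joint = 1.0
    for p in elements.values():
        assert p > 0.5      # each element individually satisfies the burden
        joint *= p          # but errors compound across elements

    print(f"Probability the whole case is true: {joint:.3f}")  # prints 0.216

    Each element clears the preponderance standard, so the standard instruction directs a plaintiff's verdict, yet the probability that the case as a whole is true is only 0.216, well below 0.5; this is the error accumulation the abstract describes.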

    Further Reflections on the Guillotine

    The authors criticize the tone and substance of the current death penalty debate. They demonstrate that, as uncomfortable as it may sound, death is a common feature of all social planning: every social policy decision, including whether to have capital punishment, determines who will live and who will die. That we may execute some innocent people is an important consideration, but in light of the fact that without the death penalty other innocent people will be killed, it is not necessarily a reason to abandon it. If capital punishment deters crime, the point is obvious; and even absent deterrence, because the guilty will sometimes kill again if not executed, abolition would not clearly save innocent lives rather than merely displace death. Just like any other form of social planning, the authors argue, such an allocation of death is the rightful province of a democratic society.

    Naturalized Epistemology and the Law of Evidence

    This paper surveys important developments in epistemology and defends a theoretical framework for evidence scholarship from the perspective of naturalized epistemology. It demonstrates that naturalized epistemology provides a firm conceptual foundation for much research into the law of evidence. These developments have gone largely unnoticed in legal scholarship, despite their importance in philosophy and their coincidence with some widely shared approaches to evidence scholarship. The article is a partial antidote to the unproductive fascination in some quarters of the legal academy with postmodern conceptions of knowledge and truth, and to the even more common search by the legal professoriat for algorithms that provide answers to important legal questions. In the field of evidence, there is some interest in postmodern epistemology, much searching for the appropriate algorithm (such as Bayesian decision theory or micro-economics), and, in places, simple neglect of epistemological matters. The article argues that the naturalistic turn in epistemology over the past thirty years (especially the branch of naturalized epistemology known as social epistemology) provides the appropriate theoretical framework for the study of evidence, as it does for virtually any enterprise concerned with the empirical adequacy of its theories and the truth-generating capacity of its methodologies. Evidence scholarship and evidence law are concerned with both, and thus naturalized epistemology provides a fruitful way of understanding the limitations of some existing efforts to supply theoretical and philosophical foundations for evidence law. It also provides a way to conceptualize and evaluate specific rules of evidence, and concomitantly explains what most evidence scholars do, regardless of their explicit philosophical commitments. For the great bulk of evidence scholars, this article should solidify the ground beneath their feet.

    Legal Phenomena, Knowledge, and Theory: A Cautionary Tale of Hedgehogs and Foxes

    This article analyzes the susceptibility of areas of legal regulation to being organized or explained by top-down deductive theories of general applicability. It hypothesizes that at least three variables partly determine the likely relevance of general theories to a set of legal phenomena: ambiguity (gaps in the law), unpredictability (computational intractability), and the comparative need for specialized versus common-sense reasoning. We hypothesize that as ambiguity, unpredictability, and the utility of common-sense reasoning increase, the amenability of a set of legal phenomena to general theoretical approaches decreases. We thus predict that the meaning of negligence will resist theoretical approaches, whether economic or grounded in corrective justice, and that antitrust law will embrace the microeconomic approach. We test these predictions in various ways and find support for both.

    The Juridical Management of Factual Uncertainty

    Civil presumption doctrine in the United States is unnecessarily complex and essentially unnecessary. Evidence law affords a number of evidentiary devices for managing uncertainty, which civil presumptions at best merely replicate in a different vocabulary, with attendant and needless complexity. We survey the critical similarities among these evidentiary devices, which can save time and expense but seldom affect the final outcome of litigation, and demonstrate the manner in which civil presumptions are mere substitutes for other well-known evidentiary devices. We further show the unnecessary complexity introduced by jury instructions on presumptions. The potential of presumption instructions to have harmful effects on jurors, and the effort required to master the intricate formalities of presumptions, suggest that the main reasons for their continued existence are distrust of jurors, and perhaps appellate-court distrust of trial courts, and that an appreciation of the extent to which presumptions duplicate other evidentiary devices can be the key to sorely needed reform.

    FCIC memo of staff interview with Ronald Hauben, Ernst & Young


    FCIC memo of staff interview with Bill Schlich, Ernst & Young


    Small-Molecule Probes Targeting the Viral PPxY-Host Nedd4 Interface Block Egress of a Broad Range of RNA Viruses.

    Budding of filoviruses, arenaviruses, and rhabdoviruses is facilitated by subversion of host proteins, such as the Nedd4 E3 ubiquitin ligase, by viral PPxY late (L) budding domains expressed within the matrix proteins of these RNA viruses. Because L domains are important for budding and are highly conserved across a wide array of RNA viruses, they represent potential broad-spectrum targets for the development of antiviral drugs. To identify potential competitive blockers, we used the known Nedd4 WW domain-PPxY interaction interface as the basis of an in silico screen. Using PPxY-dependent budding of Marburg virus (MARV) VP40 virus-like particles (VLPs) as our model system, we identified small-molecule hit 1, which inhibited the Nedd4-PPxY interaction and PPxY-dependent budding. This lead candidate was subsequently improved through structure-activity relationship (SAR) analog testing, which enhanced antibudding activity into the nanomolar range. Current lead compounds 4 and 5 exhibit on-target effects, specifically blocking the MARV VP40 PPxY-host Nedd4 interaction and the subsequent PPxY-dependent egress of MARV VP40 VLPs. In addition, lead compounds 4 and 5 exhibited antibudding activity against Ebola and Lassa fever VLPs, as well as against vesicular stomatitis and rabies viruses (VSV and RABV, respectively). These data provide target validation and suggest that inhibition of the PPxY-Nedd4 interaction can serve as the basis for the development of a novel class of broad-spectrum, host-oriented antivirals targeting viruses that depend on a functional PPxY L domain for efficient egress.

    IMPORTANCE: There is an urgent and unmet need for safe and effective therapeutics against biodefense and high-priority pathogens, including filoviruses (Ebola and Marburg) and arenaviruses (e.g., Lassa and Junin), which cause severe hemorrhagic fever syndromes with high mortality rates. We, along with others, have established that efficient budding of filoviruses, arenaviruses, and other viruses is critically dependent on the subversion of host proteins. Because disrupting virus budding would prevent virus dissemination, identifying small-molecule compounds that block these critical virus-host interactions should effectively block disease progression and transmission. Our findings validate these virus-host interactions as targets, as we have identified lead inhibitors with broad-spectrum antiviral activity. Such inhibitors might also prove useful against newly emerging RNA viruses for which no therapeutics are available.