    Modelling network travel time reliability under stochastic demand

    A technique is proposed for estimating the probability distribution of total network travel time, in the light of normal day-to-day variations in the travel demand matrix over a road traffic network. The solution method, based on a single run of a standard traffic assignment model, operates in two stages. In stage one, moments of the total travel time distribution are computed by an analytic method based on the multivariate moments of the link flow vector. In stage two, a flexible family of density functions is fitted to these moments. How the resulting distribution may be used in practice to characterise unreliability is discussed. Illustrative numerical tests on a simple network show that the method provides a means of identifying sensitive or vulnerable links and of examining the impact of changes to link capacities on network reliability. Computational considerations for large networks, and directions for further research, are discussed.
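    The two-stage structure described above lends itself to a compact illustration. The following Python sketch is a minimal toy version, not the paper's method: it assumes BPR-style link cost functions, a hypothetical two-link network with made-up flow moments, first-order (delta-method) moment propagation in place of the paper's analytic multivariate-moment computation, and a moment-matched gamma density standing in for the paper's flexible family of densities.

```python
# A minimal sketch of the two-stage idea; all numbers and modelling
# choices here are illustrative assumptions, not the paper's.
import numpy as np
from scipy import stats

ALPHA, BETA = 0.15, 4.0  # standard BPR parameters (assumed)

def bpr_time(x, t0, cap):
    """Link travel time t(x) = t0 * (1 + ALPHA * (x / cap)**BETA)."""
    return t0 * (1.0 + ALPHA * (x / cap) ** BETA)

def bpr_time_prime(x, t0, cap):
    """Derivative t'(x) of the BPR function with respect to flow."""
    return t0 * ALPHA * BETA * x ** (BETA - 1.0) / cap ** BETA

def total_time_moments(mu, cov, t0, cap):
    """Stage one (sketch): approximate the mean and variance of total
    travel time T = sum_a x_a * t_a(x_a) from the link-flow mean vector
    `mu` and covariance matrix `cov`, by linearising T around `mu`."""
    mean_T = np.sum(mu * bpr_time(mu, t0, cap))
    # d/dx [x * t(x)] = t(x) + x * t'(x): gradient of T w.r.t. link flows.
    grad = bpr_time(mu, t0, cap) + mu * bpr_time_prime(mu, t0, cap)
    var_T = grad @ cov @ grad  # first-order variance propagation
    return mean_T, var_T

# Toy two-link network; all values are illustrative only.
mu = np.array([1200.0, 800.0])          # mean link flows (veh/h)
cov = np.array([[90000.0, 20000.0],     # day-to-day flow covariance
                [20000.0, 40000.0]])
t0 = np.array([10.0, 15.0])             # free-flow travel times (min)
cap = np.array([1500.0, 1000.0])        # link capacities (veh/h)

m, v = total_time_moments(mu, cov, t0, cap)

# Stage two (sketch): fit a density to the moments; a gamma fitted by
# moment matching is one convenient two-parameter choice.
dist = stats.gamma(a=m ** 2 / v, scale=v / m)
threshold = m + 2.0 * v ** 0.5
print(f"mean={m:.0f}, sd={v ** 0.5:.0f}, "
      f"P(T > mean + 2 sd) = {1.0 - dist.cdf(threshold):.3f}")
```

    The final printed probability, the chance that total travel time exceeds its mean by more than two standard deviations, is one simple way such a fitted distribution could be used to characterise unreliability.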

    The Proficiency of Experts

    Expert evidence plays a crucial role in civil and criminal litigation. Changes in the rules concerning expert admissibility, following the Supreme Court's Daubert ruling, strengthened judicial review of the reliability and validity of an expert's methods. Judges and scholars, however, have neglected the threshold question for expert evidence: whether a person should be qualified as an expert in the first place. Judges traditionally focus on credentials or experience when qualifying experts, without regard to whether those criteria are good proxies for true expertise. We argue that credentials and experience are often poor proxies for proficiency. Qualification of an expert presumes that the witness can perform in a particular domain with a proficiency that non-experts cannot achieve, yet many experts cannot provide empirical evidence that they do in fact perform at high levels of proficiency. To demonstrate the importance of proficiency data, we collect and analyze two decades of proficiency testing of latent fingerprint examiners. In this important domain, we found surprisingly high rates of false positive identifications for the period 1995 to 2016. These data would qualify the claims of many fingerprint examiners regarding their near infallibility; unfortunately, judges do not seek out such information. We survey the federal and state case law and show how judges typically accept expert credentials as a proxy for proficiency in lieu of direct proof of proficiency. Indeed, judges often reject parties' attempts to obtain and introduce at trial empirical data on an expert's actual proficiency. We argue that any expert who purports to give falsifiable opinions can be subjected to proficiency testing, and that proficiency testing is the only objective means of assessing the accuracy and reliability of experts who rely on subjective judgments to formulate their opinions (so-called black-box experts). Judges should use proficiency data to make expert qualification decisions when the data are available, should demand proof of proficiency before qualifying black-box experts, and should admit at trial proficiency data for any qualified expert. We seek to revitalize the standard for qualifying experts: expertise should equal proficiency.
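    To make the key metric concrete, the short sketch below shows how a false positive rate is computed from proficiency-test outcomes. The tallies are hypothetical illustrations, not the article's data.

```python
# Minimal illustration (hypothetical numbers, not the article's data):
# a "false positive" here is declaring a match between prints that in
# fact come from different sources.
def false_positive_rate(false_positives: int, different_source_pairs: int) -> float:
    """Share of different-source comparisons wrongly declared a match."""
    return false_positives / different_source_pairs

fp = 42            # hypothetical: different-source pairs declared a match
non_matches = 900  # hypothetical: total different-source pairs presented
print(f"False positive rate: {false_positive_rate(fp, non_matches):.1%}")
```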

    ICT and E-Commerce: Challenges and Opportunities for the Nigerian Judiciary and Legal System

    Nigeria, like many other countries, is aspiring to develop legislation to facilitate electronic commerce specifically and the use of information and communications technology generally. This article addresses some of the challenges that the legal system and the judiciary will have to tackle in this process and highlights some of the opportunities that arise from properly addressing the issues raised by the information and communications technology revolution.

    The \u27New\u27 Exclusionary Rule Debate: From \u27Still Preoccupied with 1985\u27 to \u27Virtual Deterrence\u27

    The justices of the Supreme Court have drawn new battle lines over the exclusionary rule. In Hudson v. Michigan, 547 U.S. 586 (2006), a five-justice majority, over a strong dissent, went out of its way to renew familiar criticisms of the rule. Just this January, in Herring v. United States, 129 S. Ct. 695 (2009), the justices again divided five to four. This time the dissenters raised the ante by arguing that the Court's cost-benefit approach to applying the rule is misguided. For the first time since Justice Brennan left the Court, some of the justices appealed to broader justifications for exclusion, including concerns for judicial integrity, judicial review, and long-run and indirect influences on official behavior. This article challenges the majority positions in Hudson and Herring as both normatively mistaken and empirically unsupported. Normatively, the escape of the guilty is a cost of the Fourth Amendment rather than of whatever remedies enforce it. The only legitimate cost of exclusion is possible overdeterrence, defined in a careful way: discouraging lawful behavior in a pool of cases in which legality is uncertain. The article then tests the overdeterrence hypothesis against empirical evidence reporting hit rates for different types of searches and seizures. The current mix of Fourth Amendment remedies does not appear to be overdeterring and indeed appears to underdeter certain types of low-cost Fourth Amendment violations. The article also criticizes the Herring dissent's more majestic view of the exclusionary rule, because the dissent's approach (1) cannot account for the law's response to innocent victims of illegal searches and seizures; (2) fails to account for alternative remedies, including a deterrence-based exclusionary rule; (3) conflicts with the good-faith immunity defense to tort actions against the police, thus threatening overdeterrence; and, most fundamentally, (4) mistakes the nature of Fourth Amendment rights as trumps over the application of otherwise valid criminal laws to private behavior, i.e., as a right to commit crimes in secret. Finally, the article presents a proposed improvement on current exclusionary rule practice: the virtual deterrence approach. Under this approach, before suppressing evidence (or admitting tainted evidence under an exception), the court should demand an account of what specific remedial steps, by way of training, discipline, or record-keeping, the department has taken to prevent recurrence of the violation. In typical cases the proposal may not be worth the additional layer of procedural complexity. When, however, the charged offense is exceptionally serious, or when the government exploits an exception to exclusion for fruits of conduct found unconstitutional by the court, virtual deterrence probably would increase police compliance with constitutional requirements and reduce both the chances of the guilty escaping and the temptation to distort fact and law to avoid such miscarriages of justice. The government's option to refuse to undertake remedial measures, and thereby acquiesce in a suppression order, provides a strong safeguard against overdeterrence.

    Advances in Urban Traffic Network Equilibrium Models and Algorithms

    Contaminated Confessions Revisited

    A second wave of false confessions is cresting. In the first twenty-one years of post-conviction DNA testing, 250 innocent people were exonerated, forty of whom had falsely confessed. Those false confessions attracted sustained public attention from judges, law enforcement, policymakers, and the media. The exonerations not only showed that false confessions can happen; they also shed light on the problem of confession contamination, in which details of the crime are disclosed to suspects during the interrogation process. As a result, false confessions can appear deceptively rich, detailed, and accurate. In just the last five years there has been a new surge: twenty-six more false confessions among DNA exonerations. All but two of these most recent confessions included details corroborated by crime scene information. Illustrating the power of contaminated false confessions, in nine of the cases defendants were convicted despite DNA tests that excluded them at the time. This second wave of false confessions should therefore cause even more alarm than the first. In the vast majority of cases there is no evidence to test using DNA. Unless a scientific framework is adopted to regulate interrogations, including requiring the recording of entire interrogations, overhauling interrogation methods, providing for judicial review of reliability at trial, and informing jurors with expert testimony, the insidious problems of confession contamination will persist.
