
    On Logic in the Law: Something, but not All

    In 1880, when Oliver Wendell Holmes (later to be a Justice of the U.S. Supreme Court) criticized the logical theology of law articulated by Christopher Columbus Langdell (the first Dean of Harvard Law School), neither Holmes nor Langdell was aware of the revolution in logic that had begun, the year before, with Frege's Begriffsschrift. But there is an important element of truth in Holmes's insistence that a legal system cannot be adequately understood as a system of axioms and corollaries; and this element of truth is not obviated by the more powerful logical techniques that are now available.

    On the Suitability of the Bandler–Kohout Subproduct as an Inference Mechanism

    Fuzzy relational inference (FRI) systems form an important part of approximate reasoning schemes using fuzzy sets. The compositional rule of inference (CRI), which was introduced by Zadeh, has attracted the most attention so far. In this paper, we show that the FRI scheme that is based on the Bandler–Kohout (BK) subproduct, along with a suitable realization of the fuzzy rules, possesses all the important properties that are cited in favor of using CRI, viz., equivalent and reasonable conditions for their solvability, their interpolative properties, and the preservation of the indistinguishability that may be inherent in the input fuzzy sets. Moreover, we show that under certain conditions, the equivalence of first-infer-then-aggregate (FITA) and first-aggregate-then-infer (FATI) inference strategies can be shown for the BK subproduct, much like in the case of CRI. Finally, by addressing the computational complexity that may exist in the BK subproduct, we suggest a hierarchical inferencing scheme. Thus, this paper shows that the BK-subproduct-based FRI is as effective and efficient as the CRI itself.
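
    A minimal numerical sketch of the two inference mechanisms discussed above, assuming discretized universes, the minimum t-norm for CRI, and the Gödel implication for the BK subproduct; the function names and the toy rule are illustrative choices, not taken from the paper.

```python
import numpy as np

# Sketch of the two fuzzy relational inference mechanisms on
# discretized universes X and Y. a_prime is the input fuzzy set on X,
# R a fuzzy relation on X x Y; names and the toy rule are illustrative.

def goedel_implication(a, b):
    # Goedel implication: I(a, b) = 1 if a <= b, else b
    return np.where(a <= b, 1.0, b)

def cri(a_prime, R):
    # Zadeh's compositional rule of inference (sup-min composition):
    # B'(y) = sup_x min(A'(x), R(x, y))
    return np.max(np.minimum(a_prime[:, None], R), axis=0)

def bk_subproduct(a_prime, R):
    # Bandler-Kohout subproduct (inf-implication composition):
    # B'(y) = inf_x I(A'(x), R(x, y))
    return np.min(goedel_implication(a_prime[:, None], R), axis=0)

# One rule "if x is about 2 then y is about 1", realized as an
# implicative relation R(x, y) = I(A(x), B(y)).
x = np.linspace(0.0, 4.0, 5)
y = np.linspace(0.0, 2.0, 5)
A = np.maximum(0.0, 1.0 - np.abs(x - 2.0))   # antecedent fuzzy set
B = np.maximum(0.0, 1.0 - np.abs(y - 1.0))   # consequent fuzzy set
R = goedel_implication(A[:, None], B[None, :])

# Feeding the antecedent back in recovers the consequent under both
# mechanisms (the interpolation property mentioned in the abstract).
print(cri(A, R))            # [0.  0.5 1.  0.5 0. ]
print(bk_subproduct(A, R))  # [0.  0.5 1.  0.5 0. ]
```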

    A short note on fuzzy relational inference systems

    This paper is a short note contribution to the topic of fuzzy relational inference systems and the preservation of their desirable properties. It addresses the two main fuzzy relational inferences – compositional rule of inference (CRI) and the Bandler–Kohout subproduct (BK-subproduct) – and their combination with two fundamental fuzzy relational models of fuzzy rule bases, namely, the Mamdani–Assilian and the implicative models. The goal of this short note is twofold. Firstly, we show that the robustness related to the combination of BK-subproduct and implicative fuzzy rule base model was not proven correctly in [24]. However, we will show that the result itself is still valid and a valid proof will be provided. Secondly, we shortly discuss the preservation of desirable properties of fuzzy inference systems and conclude that neither the above-mentioned robustness nor any other computational advantages should automatically lead to a preference of the combinations of CRI with Mamdani–Assilian models or of the BK-subproduct with the implicative models.
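
    For concreteness, a sketch of the two rule-base models named above, under the same illustrative assumptions as in the previous sketch (minimum t-norm, Gödel implication); neither choice is prescribed by the note.

```python
import numpy as np

# The two rule-base models for a set of rules "if x is A_i then y is
# B_i" on discretized universes. The min t-norm and Goedel implication
# are illustrative choices only.

def mamdani_relation(antecedents, consequents):
    # Mamdani-Assilian model: R(x, y) = max_i min(A_i(x), B_i(y));
    # individual rules are aggregated by union (max).
    rules = [np.minimum(A[:, None], B[None, :])
             for A, B in zip(antecedents, consequents)]
    return np.max(rules, axis=0)

def implicative_relation(antecedents, consequents):
    # Implicative model: R(x, y) = min_i I(A_i(x), B_i(y));
    # individual rules are aggregated by intersection (min).
    impl = lambda a, b: np.where(a <= b, 1.0, b)
    rules = [impl(A[:, None], B[None, :])
             for A, B in zip(antecedents, consequents)]
    return np.min(rules, axis=0)
```

    Either relation can then be composed with CRI or the BK-subproduct; the note's point is precisely that neither conventional pairing (CRI with Mamdani–Assilian, BK-subproduct with implicative) should be preferred automatically.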

    Techniques to Understand Computer Simulations: Markov Chain Analysis

    The aim of this paper is to assist researchers in understanding the dynamics of simulation models that have been implemented and can be run in a computer, i.e. computer models. To do that, we start by explaining (a) that computer models are just input-output functions, (b) that every computer model can be re-implemented in many different formalisms (in particular in most programming languages), leading to alternative representations of the same input-output relation, and (c) that many computer models in the social simulation literature can be usefully represented as time-homogeneous Markov chains. Then we argue that analysing a computer model as a Markov chain can make apparent many features of the model that were not so evident before conducting such analysis. To prove this point, we present the main concepts needed to conduct a formal analysis of any time-homogeneous Markov chain, and we illustrate the usefulness of these concepts by analysing 10 well-known models in the social simulation literature as Markov chains. These models are:
    • Schelling's (1971) model of spatial segregation
    • Epstein and Axtell's (1996) Sugarscape
    • Miller and Page's (2004) standing ovation model
    • Arthur's (1989) model of competing technologies
    • Axelrod's (1986) metanorms models
    • Takahashi's (2000) model of generalized exchange
    • Axelrod's (1997) model of dissemination of culture
    • Kinnaird's (1946) truels
    • Axelrod and Bennett's (1993) model of competing bimodal coalitions
    • Joyce et al.'s (2006) model of conditional association
    In particular, we explain how to characterise the transient and the asymptotic dynamics of these computer models and, where appropriate, how to assess the stochastic stability of their absorbing states. In all cases, the analysis conducted using the theory of Markov chains has yielded useful insights about the dynamics of the computer model under study.
    Keywords: Computer Modelling, Simulation, Markov, Stochastic Processes, Analysis, Re-Implementation
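
    As a minimal sketch of the kind of analysis described (a toy chain, not one of the ten models above), consider a three-state time-homogeneous Markov chain with one absorbing state; the transition matrix is invented for illustration.

```python
import numpy as np

# Toy 3-state time-homogeneous Markov chain; the transition matrix is
# invented for illustration and corresponds to no particular model.
P = np.array([
    [0.50, 0.50, 0.00],   # state 0
    [0.25, 0.50, 0.25],   # state 1: can reach the absorbing state 2
    [0.00, 0.00, 1.00],   # state 2: absorbing, since P[2, 2] == 1
])

# Transient dynamics: the distribution after t steps is mu_0 @ P^t.
mu0 = np.array([1.0, 0.0, 0.0])            # start in state 0
print(mu0 @ np.linalg.matrix_power(P, 10))

# Absorbing states can be read off the diagonal.
print([i for i in range(len(P)) if P[i, i] == 1.0])

# Asymptotic dynamics: every run is eventually absorbed, so the
# limiting distribution puts all probability mass on state 2.
print(mu0 @ np.linalg.matrix_power(P, 1000))
```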

    JURI SAYS: An Automatic Judgement Prediction System for the European Court of Human Rights

    In this paper we present the web platform JURI SAYS that automatically predicts decisions of the European Court of Human Rights based on communicated cases, which are published by the court early in the proceedings and are often available many years before the final decision is made. Our system therefore predicts future judgements of the court. The platform is available at jurisays.com and shows the predictions compared to the actual decisions of the court. It is automatically updated every month with predictions for the new cases. Additionally, the system highlights the sentences and paragraphs that are most important for the prediction (i.e. violation vs. no violation of human rights).
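
    The abstract describes the platform rather than its internals, but the general pattern (binary text classification plus per-passage importance scores) can be sketched; everything below, including the use of TF-IDF and logistic regression, is an assumption for illustration, not the system's actual model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy stand-ins for communicated-case texts and known outcomes.
train_texts = [
    "the applicant was detained without judicial review",
    "the complaint was thoroughly examined by the domestic courts",
]
train_labels = [1, 0]   # 1 = violation found, 0 = no violation

vec = TfidfVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(train_texts), train_labels)

# Score the sentences of a new case and rank them, mimicking the
# highlighting of passages most important for the prediction.
sentences = ["the applicant was detained", "the courts examined the case"]
scores = clf.predict_proba(vec.transform(sentences))[:, 1]
for prob, sent in sorted(zip(scores, sentences), reverse=True):
    print(f"{prob:.2f}  {sent}")
```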

    Research Perspectives: The Anatomy of a Design Principle

    This essay derives a schema for specifying design principles for information technology-based artifacts in sociotechnical systems. Design principles are used to specify design knowledge in an accessible form, but there is wide variation and lack of precision across views regarding their formulation. This variation is a sign of important issues that should be addressed, including a lack of attention to human actors and levels of complexity as well as differing views on causality, on the nature of the mechanisms used to achieve goals, and on the need for justificatory knowledge. The new schema includes the well-recognized elements of design principles, including goals in a specific context and the mechanisms to achieve the goal. In addition, the schema allows: (1) consideration of the varying roles of the human actors involved and the utility of design principles, (2) attending to the complexity of IT-based artifacts through decomposition, (3) distinction of the types of causation (i.e., deterministic versus probabilistic), (4) a variety of mechanisms in achieving aims, and (5) the optional definition of justificatory knowledge underlying the design principles. We illustrate the utility of the proposed schema by applying it to examples of published research.
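
    To make the schema's elements concrete, here is one possible encoding as a data structure; the field names paraphrase the components listed above and are my own labels, not the essay's formal notation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative encoding of the schema; field names are paraphrases.
@dataclass
class DesignPrinciple:
    aim: str                          # goal to be achieved
    context: str                      # boundary conditions for the aim
    mechanisms: List[str]             # means of achieving the aim
    implementers: str                 # human actors building the artifact
    users: str                        # human actors using the artifact
    causation: str                    # "deterministic" or "probabilistic"
    decomposition: List["DesignPrinciple"] = field(default_factory=list)
    rationale: Optional[str] = None   # optional justificatory knowledge
```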

    Applications of micro-CT in the Criminal Justice System of England and Wales: an impact assessment

    The Criminal Justice System of England and Wales is currently facing major challenges. One is the financial pressure of government funding cuts; the other is the increasing need for professionalisation and rigour within the system. This thesis presents the use of micro-computed tomography (micro-CT), additive manufacturing, and 3D visualisation to address both challenges. By drawing on data from live murder investigations the project examines how these digital technologies can be used to improve the investigation of strangulation deaths, sharp force injuries, and fractures. Each of these categories was treated as a separate case in the overall multiple-case study research design. The increased detail enabled by micro-CT assisted pathologists in the diagnosis of strangulation, as previously undetected injuries of the larynx could be identified. A validation study comparing injured to uninjured samples was conducted to increase the strength of the interpretations. For sharp force injury analysis, micro-CT proved useful for providing the necessary injury characteristics and highly accurate measurements to allow weapon identification. The high resolution of micro-CT scanning also enabled the visualisation of trauma on the smallest of skeletal elements, often encountered in non-accidental injuries in children. The cross-case synthesis revealed the main themes of clarity, objectivity, and visualisation, which were improved by using micro-CT irrespective of type of homicide. The significance of these themes further crystallised in semi-structured interviews conducted with various stakeholders of the Criminal Justice System. Management concepts proved suitable to assess the project's success, as the themes used in operations management such as quality, delivery, and cost apply to the delivery of justice as well. A good working relationship with West Midlands Police's homicide investigators and researchers at WMG was crucial to providing the technology and expertise to address real-life problems whilst ultimately saving taxpayers' money.

    Proceedings of the Second Joint Technology Workshop on Neural Networks and Fuzzy Logic, volume 2

    Documented here are papers presented at the Neural Networks and Fuzzy Logic Workshop sponsored by NASA and the University of Texas, Houston. Topics addressed included adaptive systems, learning algorithms, network architectures, vision, robotics, neurobiological connections, speech recognition and synthesis, fuzzy set theory and application, control and dynamics processing, space applications, fuzzy logic and neural network computers, approximate reasoning, and multiobject decision making.

    Death of Paradox: The Killer Logic beneath the Standards of Proof

    The prevailing but contested view of proof standards is that factfinders should determine facts by probabilistic reasoning. Given imperfect evidence, they should ask themselves what they think the chances are that the burdened party would be right if the truth were to become known; they then compare those chances to the applicable standard of proof. I contend that for understanding the standards of proof, the modern versions of logic — in particular, fuzzy logic and belief functions — work better than classical probability. This modern logic suggests that factfinders view evidence of an imprecisely perceived and described reality to form a fuzzy degree of belief in a fact’s existence; they then apply the standard of proof in accordance with the theory of belief functions, by comparing their belief in a fact’s existence to their belief in its negation. This understanding explains how the standard of proof actually works in the law world. It gives a superior mental image of the factfinders’ task, conforms more closely to what we know of people’s cognition, and captures better what the law says its standards are and how it manipulates them. One virtue of this conceptualization is that it is not a radically new view. Another virtue is that it nevertheless manages to resolve some stubborn problems of proof, including the infamous conjunction paradox.
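
    A toy worked example (mine, not the article's) of the two ideas in play: the fuzzy minimum avoids the conjunction paradox that the classical product rule creates, and a belief-function comparison pits belief in a fact against belief in its negation without forcing the two to sum to one.

```python
# Two elements of a claim, each proved to degree 0.6; all numbers are
# invented for illustration.
b1, b2 = 0.6, 0.6

# Classical product rule: the conjunction falls below 0.5 even though
# each element clears the preponderance standard -- the paradox.
print(b1 * b2)        # 0.36

# Fuzzy conjunction (minimum): the conjunction is as strong as its
# weakest element, so no paradox arises.
print(min(b1, b2))    # 0.6

# Belief functions: compare Bel(fact) with Bel(not fact); the remaining
# mass (here 0.2) stays uncommitted rather than being split.
bel_fact, bel_negation = 0.5, 0.3
print(bel_fact > bel_negation)   # find for the burdened party
```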