27 research outputs found

    A Variable-Free Logic for Mass Terms

    This paper presents a logic appropriate for mass terms, that is, a logic that does not presuppose interpretation in discrete models. Models may range from atomistic to atomless. This logic is a generalization of the author's work on natural language reasoning. The following claims are made for this logic. First, the absence of variables makes it simpler than more conventional formalizations based on predicate logic. Second, the capability to deal effectively with discrete terms, and in particular with singular terms, can be added to the logic, making it possible to reason about discrete entities and mass entities in a uniform manner. Third, this logic is similar to surface English, in that the formal language and English are readily intertranslatable, making it particularly suitable for natural language applications. Fourth, deduction performed in this logic is similar to the syllogistic, and therefore captures an essential characteristic of human reasoning.
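    The syllogistic character of deduction claimed above can be pictured with a small semantic check, a sketch not taken from the paper: the Barbara syllogism (All A are B, All B are C, therefore All A are C) is validated over every interpretation of A, B, C as subsets of a small finite universe.

```python
from itertools import chain, combinations

# Illustration only: validate the Barbara syllogism
#   All A are B, All B are C  |-  All A are C
# over every interpretation in a 3-element universe.

def powerset(u):
    """All subsets of u, as frozensets."""
    return [frozenset(s)
            for s in chain.from_iterable(combinations(u, r) for r in range(len(u) + 1))]

def all_are(x, y):
    """'All X are Y' under set semantics: X is a subset of Y."""
    return x <= y

universe = {0, 1, 2}
subsets = powerset(universe)
valid = all(
    all_are(a, c)
    for a in subsets for b in subsets for c in subsets
    if all_are(a, b) and all_are(b, c)
)
print(valid)  # True: the premises force the conclusion in every model
```

A finite universe of three elements is of course no proof for arbitrary models, but it conveys how syllogistic validity is a matter of set inclusion.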

    Multiplicative-Additive Focusing for Parsing as Deduction

    Spurious ambiguity is the phenomenon whereby distinct derivations in grammar may assign the same structural reading, resulting in redundancy in the parse search space and inefficiency in parsing. Understanding the problem depends on identifying the essential mathematical structure of derivations. This is trivial in the case of context-free grammar, where the parse structures are ordered trees; in the case of categorial grammar, the parse structures are proof nets. However, with respect to multiplicatives, intrinsic proof nets have not yet been given for the displacement calculus, and proof nets for additives, which have applications to polymorphism, are involved. Here we approach multiplicative-additive spurious ambiguity by means of the proof-theoretic technique of focalisation.
    Comment: In Proceedings WoF'15, arXiv:1511.0252

    First- and second-order logic of mass terms


    ProMoVer: Modular Verification of Temporal Safety Properties

    This paper describes ProMoVer, a tool for fully automated procedure-modular verification of Java programs equipped with method-local and global assertions that specify safety properties of sequences of method invocations. Modularity at the procedure level is a natural instantiation of the modular verification paradigm, where correctness of global properties is relativized on the local properties of the methods rather than on their implementations, and is based here on the construction of maximal models for a program model that abstracts away from program data. This approach allows global properties to be verified in the presence of code evolution, multiple method implementations (as arising from software product lines), or even unknown method implementations (as in mobile code for open platforms). ProMoVer automates a typical verification scenario for a previously developed tool set for compositional verification of control-flow safety properties, and provides appropriate pre- and post-processing. Modularity is exploited by a mechanism for proof reuse that detects and minimizes the verification tasks resulting from changes in the code and the specifications. The verification task is relatively lightweight due to support for abstraction from private methods and automatic extraction of candidate specifications from method implementations. We evaluate the tool on a number of applications from the smart card domain.
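    The kind of property ProMoVer targets, safety of method-invocation sequences, can be pictured as a finite automaton over method names. The following is a toy sketch; the property, method names, and encoding are invented for illustration and are not ProMoVer's actual specification language.

```python
# Toy safety property over method names: "open must precede read, and
# nothing may follow close".  Missing transitions mean a violation.
TRANSITIONS = {
    ("start", "open"): "opened",
    ("opened", "read"): "opened",
    ("opened", "close"): "closed",
}

def satisfies(calls):
    """Return True iff the call sequence stays inside the automaton."""
    state = "start"
    for method in calls:
        if (state, method) not in TRANSITIONS:
            return False
        state = TRANSITIONS[(state, method)]
    return True

print(satisfies(["open", "read", "read", "close"]))  # True
print(satisfies(["read"]))                           # False: read before open
```

Checking a program against such a property modularly means checking each method's local call behaviour against its local specification, then composing the specifications rather than the code.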

    Multiplicative-additive focusing for parsing as deduction

    Spurious ambiguity is the phenomenon whereby distinct derivations in grammar may assign the same structural reading, resulting in redundancy in the parse search space and inefficiency in parsing. Understanding the problem depends on identifying the essential mathematical structure of derivations. This is trivial in the case of context-free grammar, where the parse structures are ordered trees; in the case of type logical categorial grammar, the parse structures are proof nets. However, with respect to multiplicatives, intrinsic proof nets have not yet been given for the displacement calculus, and proof nets for additives, which have applications to polymorphism, are not easy to characterise. Here we approach multiplicative-additive spurious ambiguity by means of the proof-theoretic technique of focalisation.
    Peer reviewed. Postprint (published version).
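    Spurious ambiguity is easy to exhibit with a toy grammar, a sketch unrelated to the displacement calculus itself: under the rules S → S S | 'a', the string aaaa has five distinct parse trees (the Catalan number C₃), yet every tree denotes the same flat reading.

```python
# Toy illustration of spurious ambiguity: grammar S -> S S | 'a'.
# Distinct derivations, one structural reading.

def trees(s):
    """All binary parse trees of a string of 'a's under S -> S S | 'a'."""
    if s == "a":
        return ["a"]
    out = []
    for i in range(1, len(s)):
        out.extend((l, r) for l in trees(s[:i]) for r in trees(s[i:]))
    return out

def reading(t):
    """The 'semantics' of a tree: flat concatenation of its leaves."""
    return t if isinstance(t, str) else reading(t[0]) + reading(t[1])

ts = trees("aaaa")
print(len(ts))                   # 5 trees (Catalan number C_3)
print({reading(t) for t in ts})  # {'aaaa'}: every tree means the same thing
```

A parser that enumerates all five trees does four times the necessary work; focalisation-style techniques aim to restrict derivations so that, ideally, one derivation survives per reading.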

    Inference Rules and the Meaning of the Logical Constants

    The dissertation provides an analysis and elaboration of Michael Dummett's proof-theoretic notions of validity. These notions are contrasted with standard proof-theoretic notions and formally evaluated with respect to their adequacy to propositional intuitionistic logic.

    Boundary Algebra: A Simpler Approach to Boolean Algebra and the Sentential Connectives

    Boundary algebra [BA] is an algebra and a simplified notation for Spencer-Brown's (1969) primary algebra. The syntax of the primary arithmetic [PA] consists of two atoms, () and the blank page, concatenation, and enclosure between '(' and ')', denoting the primitive notion of distinction. Inserting letters denoting, indifferently, the presence or absence of () into a PA formula yields a BA formula. The BA axioms are A1: ()()=(), and A2: "(()) [abbreviated '⊥'] may be written or erased at will," implying (⊥)=(). The repeated application of A1 and A2 simplifies any PA formula to either () or ⊥. The basis for BA is B1: abc=bca (concatenation commutes and associates); B2: ⊥a=a (BA has a lower bound, ⊥); B3: (a)a=() (BA is a complemented lattice); and B4: (ba)a=(b)a (implying that BA is a distributive lattice). BA has two intended models: (1) the Boolean algebra 2 with base set B={(),⊥}, such that () ⇔ 1 [dually 0], (a) ⇔ a′, and ab ⇔ a∪b [a∩b]; and (2) sentential logic, such that () ⇔ true [false], (a) ⇔ ~a, and ab ⇔ a∨b [a∧b]. BA is a self-dual notation, facilitates a calculational style of proof, and simplifies clausal reasoning and Quine's truth value analysis. BA resembles C.S. Peirce's graphical logic, the symbolic logics of Leibniz and W.E. Johnson, the 2 notation of Byrne (1946), and the Boolean term schemata of Quine (1982).
    Keywords: boundary algebra; boundary logic; primary algebra; primary arithmetic; Boolean algebra; calculation proof; G. Spencer-Brown; C.S. Peirce; existential graphs
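    The reduction of PA formulas by A1 and A2 is mechanical enough to sketch in code. This is a rough illustration, not from the paper: a well-formed parenthesis string stands for a PA formula, and the empty string plays the role of the blank page (equivalently ⊥, since (()) may be erased).

```python
# Sketch of PA simplification by repeated application of the axioms:
#   A1: ()() = ()          (adjacent marks merge)
#   A2: (()) may be erased (the bottom element vanishes)
# Assumes a well-formed parenthesis string; "" represents the blank page.
def simplify(formula):
    s = formula.replace(" ", "")
    while True:
        if "()()" in s:
            s = s.replace("()()", "()", 1)   # A1
        elif "(())" in s:
            s = s.replace("(())", "", 1)     # A2
        else:
            return s                         # normal form: "()" or ""

print(simplify("((())())"))  # "" -> the blank page, i.e. bottom
print(simplify("((()))"))    # "()"
```

Each rewrite strictly shortens the string, so the loop terminates, and for well-formed formulas the only strings with neither pattern are "()" and the blank page, matching the claim that A1 and A2 suffice.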

    The theory of inconsistency: inconsistent mathematics and paraconsistent logic

    Each volume includes the author's previously published papers. Bibliography: leaves 147-151 (v. 1). 3 v. Thesis (D.Sc.)--University of Adelaide, School of Mathematical Sciences, 200

    A holistic approach to assessment of value of information (VOI) with fuzzy data and decision criteria.

    The research presented in this thesis integrates theories and techniques from statistical analysis and artificial intelligence, to develop a more coherent, robust and complete methodology for assessing the value of acquiring new information in the context of the oil and gas industry. The classical methodology for value of information assessment has been used in the oil and gas industry since the 1960s, even though it is only recently that more applications have been published. It is commonly acknowledged that, due to the large number of data acquisition actions and the capital investment associated with it, the oil and gas industry is an ideal domain for developing and applying value of information assessments. In this research, three main gaps in the classical methodology for value of information are identified and addressed by integrating three existing techniques from other domains. Firstly, the research identifies that the technique design of experiments can be used in value of information for providing a holistic assessment of the complete set of uncertain parameters, selecting the ones that have the most impact on the value of the project and supporting the selection of the data acquisition actions for evaluation. Secondly, the fuzziness of the data is captured through membership functions and the expected utility value of each financial parameter is estimated using the probability of the states conditioned to the membership functions - in the classical methodology, this is conditioned to crisp values of the data. Thirdly, a fuzzy inference system is developed for making the value of information assessment, capturing the decision-making human logic into the assessment process and integrating several financial parameters into one. The proposed methodology is applied to a case study describing a value of information assessment in an oil field, where two alternatives for data acquisition are discussed. 
The case study shows how the three techniques can be integrated within the previous methodology, resulting in a more complete theory. It is observed that the technique of design of experiments provides a full identification of the input parameters affecting the value of the project, and allows a proper selection of the data acquisition actions. In the case study, it is concluded that, when the fuzziness of the data is included in the assessment, the value of the data decreases in comparison with the case where the data are assumed to be crisp. This result means that the decision concerning the value of acquiring new data depends on whether the fuzzy nature of the data is included in the assessment, and on the difference between the project value with and without data acquisition. The fuzzy inference system developed for this case study successfully follows the logic of the decision maker and results in a straightforward system for aggregating decision criteria. A sensitivity analysis of the parameters of two different membership functions is performed, reaching consistent results in both cases.
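    The second gap, conditioning on membership degrees rather than on crisp readings, can be sketched via the probability of a fuzzy event (Zadeh): weight each reading's likelihood by its membership in the fuzzy datum before applying Bayes' rule. All numbers, state names, and the membership function below are invented for illustration and are not taken from the thesis.

```python
# Hypothetical sketch: posterior over geological states given a fuzzy
# datum A, using  P(s | A) ∝ P(s) · Σ_x μ_A(x) · P(x | s)
# instead of conditioning on a single crisp reading x.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def posterior_given_fuzzy(prior, likelihood, membership, readings):
    """Bayes' rule with likelihoods weighted by membership degrees."""
    joint = {s: prior[s] * sum(membership(x) * likelihood[s][x] for x in readings)
             for s in prior}
    z = sum(joint.values())
    return {s: joint[s] / z for s in joint}

prior = {"oil": 0.3, "dry": 0.7}            # invented priors
likelihood = {"oil": {0: 0.2, 1: 0.8},      # invented P(reading | state)
              "dry": {0: 0.7, 1: 0.3}}
strong = lambda x: triangular(x, 0.5, 1.0, 1.5)  # fuzzy set "strong signal"
post = posterior_given_fuzzy(prior, likelihood, strong, [0, 1])
print(post["oil"])  # ≈ 0.533, i.e. 0.24 / (0.24 + 0.21)
```

With a crisp membership (an indicator function) this reduces to ordinary Bayesian updating, which is why the fuzzy treatment can only soften, never sharpen, the information a datum carries, consistent with the case-study observation that fuzziness lowers the assessed value of the data.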