    Algebraic Approach to Logical Inference Implementation

    The paper examines the potential of n-tuple algebra (NTA), developed by the authors as a theoretical generalization of the structures and methods used in intelligent systems. NTA supports the formalization of a wide range of logical problems (abductive and modified inference, modelling of graphs, semantic networks, expert rules, etc.). This article chiefly describes the implementation of logical inference by means of NTA. Besides the known methods of logical calculi, inference procedures in NTA can employ new algebraic methods for checking the correctness of a consequence or for finding corollaries of a given axiom system. These inference methods take into account (beyond the feasibility of certain substitutions) the inner structure of the knowledge being processed, and thus solve standard logical analysis tasks faster. The matrix properties of NTA objects reduce the laboriousness of intellectual procedures and allow logical inference algorithms to be parallelized efficiently. In NTA, we discovered new structural and statistical classes of conjunctive normal forms whose satisfiability can be decided in polynomial time. Consequently, many problems whose theoretical complexity is high, e.g. exponential, can in practice be solved in polynomial time on average. With regard to making databases more intelligent, NTA can be considered an extension of relational algebra to knowledge processing. In the authors' opinion, NTA can become a methodological basis for creating knowledge processing languages.
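The abstract does not specify the polynomially decidable CNF classes found in NTA. As a familiar, independent illustration of the general idea (a structural CNF class whose satisfiability is decidable in polynomial time), the sketch below decides 2-SAT via the classical implication-graph construction; it is not the authors' method, only a standard example of such a class.

```python
def solve_2sat(n, clauses):
    """Decide satisfiability of a 2-CNF over variables 1..n in linear time.

    Literals: +v means x_v, -v means not x_v; each clause is a pair (a, b).
    A clause (a or b) yields implications (not a -> b) and (not b -> a).
    The formula is satisfiable iff no variable shares a strongly connected
    component of the implication graph with its own negation.
    """
    def node(lit):
        # x_v -> 2(v-1), not x_v -> 2(v-1)+1
        return 2 * (abs(lit) - 1) + (1 if lit < 0 else 0)

    N = 2 * n
    adj = [[] for _ in range(N)]   # implication graph
    radj = [[] for _ in range(N)]  # reversed graph, for Kosaraju's pass 2
    for a, b in clauses:
        adj[node(-a)].append(node(b))
        adj[node(-b)].append(node(a))
        radj[node(b)].append(node(-a))
        radj[node(a)].append(node(-b))

    # Pass 1: iterative DFS recording nodes in order of finish time.
    seen = [False] * N
    order = []
    for s in range(N):
        if seen[s]:
            continue
        seen[s] = True
        stack = [(s, iter(adj[s]))]
        while stack:
            u, it = stack[-1]
            advanced = False
            for w in it:
                if not seen[w]:
                    seen[w] = True
                    stack.append((w, iter(adj[w])))
                    advanced = True
                    break
            if not advanced:
                order.append(u)
                stack.pop()

    # Pass 2: collect SCCs on the reversed graph in reverse finish order.
    comp = [-1] * N
    c = 0
    for s in reversed(order):
        if comp[s] != -1:
            continue
        comp[s] = c
        stack = [s]
        while stack:
            u = stack.pop()
            for w in radj[u]:
                if comp[w] == -1:
                    comp[w] = c
                    stack.append(w)
        c += 1

    # Satisfiable iff x_v and not x_v never fall into the same SCC.
    return all(comp[2 * v] != comp[2 * v + 1] for v in range(n))
```

For example, `solve_2sat(2, [(1, 2), (-1, 2), (-2, -1)])` is satisfiable (take x1 false, x2 true), while `solve_2sat(1, [(1, 1), (-1, -1)])` is not, since the unit clauses force both x1 and its negation.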

    Normalisation Control in Deep Inference via Atomic Flows

    We introduce `atomic flows': graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because (1) the technique they support is largely independent of syntax; (2) indeed, it is largely independent of logical inference rules; and (3) they constitute a powerful geometric formalism that is more intuitive than syntax.