Dual-Context Calculi for Modal Logic
We present natural deduction systems and associated modal lambda calculi for
the necessity fragments of the normal modal logics K, T, K4, GL and S4. These
systems are in the dual-context style: they feature two distinct zones of
assumptions, one of which can be thought of as modal, and the other as
intuitionistic. We show that these calculi have their roots in sequent
calculi. We then investigate their metatheory, equip them with a confluent and
strongly normalizing notion of reduction, and show that they coincide with the
usual Hilbert systems up to provability. Finally, we investigate a categorical
semantics which interprets the modality as a product-preserving functor.
Comment: Full version of article previously presented at LICS 2017 (see
arXiv:1602.04860v4 or doi: 10.1109/LICS.2017.8005089).
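To make the dual-context style concrete, here is a minimal LaTeX sketch of the characteristic box rules in the Pfenning–Davies style for the S4 case, where the zone Δ holds modal assumptions and Γ intuitionistic ones; the notation is our reconstruction, not quoted from the paper:

```latex
% Dual-context judgement \Delta; \Gamma \vdash M : A --- the zone \Delta holds
% modal (boxed) assumptions, \Gamma holds ordinary intuitionistic ones.
\[
\frac{\Delta;\ \cdot \vdash M : A}
     {\Delta;\ \Gamma \vdash \mathsf{box}\,M : \Box A}\ (\Box\mathrm{I})
\qquad
\frac{\Delta;\ \Gamma \vdash M : \Box A \qquad \Delta, u{:}A;\ \Gamma \vdash N : C}
     {\Delta;\ \Gamma \vdash \mathsf{let\ box}\ u = M\ \mathsf{in}\ N : C}\ (\Box\mathrm{E})
\]
```

The empty intuitionistic zone in the premise of (□I) is what forces boxed terms to depend only on modal assumptions; the weaker logics K, T, K4 and GL restrict or vary this rule.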
Advances in Proof-Theoretic Semantics
Logic; Mathematical Logic and Foundations; Mathematical Logic and Formal Languages
Efficient learning of large sets of locally optimal classification rules
Conventional rule learning algorithms aim at finding a set of simple rules,
where each rule covers as many examples as possible. In this paper, we argue
that the rules found in this way may not be the optimal explanations for each
of the examples they cover. Instead, we propose an efficient algorithm that
aims at finding the best rule covering each training example in a greedy
optimization consisting of one specialization and one generalization loop.
These locally optimal rules are collected and then filtered for a final rule
set, which is much larger than the sets learned by conventional rule learning
algorithms. A new example is classified by selecting the best among the rules
that cover this example. In our experiments on small to very large datasets,
the approach's average classification accuracy is higher than that of
state-of-the-art rule learning algorithms. Moreover, the algorithm is highly
efficient and naturally parallelizable without affecting the learned rule set,
and hence the classification accuracy. We thus believe that it closes an
important gap for large-scale classification rule induction.
Comment: article, 40 pages, Machine Learning journal (2023).
Normalisation Control in Deep Inference via Atomic Flows
We introduce 'atomic flows': they are graphs obtained from derivations by
tracing atom occurrences and forgetting the logical structure. We study simple
manipulations of atomic flows that correspond to complex reductions on
derivations. This allows us to prove, for propositional logic, a new and very
general normalisation theorem, which contains cut elimination as a special
case. We operate in deep inference, which is more general than other syntactic
paradigms, and where normalisation is more difficult to control. We argue that
atomic flows are a significant technical advance for normalisation theory,
because 1) the technique they support is largely independent of syntax; 2)
indeed, it is largely independent of logical inference rules; 3) they
constitute a powerful geometric formalism, which is more intuitive than syntax
- …
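For readers unfamiliar with deep inference, the atomic rules that flows trace can be illustrated, in LaTeX, by the standard propositional system SKS (our example of such a system, not quoted from the abstract): atomic identity, cut, contraction and weakening.

```latex
% Atomic rules of SKS; an atomic flow records only how each atom occurrence
% is created (ai-down, aw-down), duplicated (ac-down) or destroyed (ai-up),
% forgetting all other logical structure of the derivation.
\[
\frac{\mathrm{t}}{a \vee \bar{a}}\ \mathsf{ai}{\downarrow}
\qquad
\frac{a \wedge \bar{a}}{\mathrm{f}}\ \mathsf{ai}{\uparrow}
\qquad
\frac{a \vee a}{a}\ \mathsf{ac}{\downarrow}
\qquad
\frac{\mathrm{f}}{a}\ \mathsf{aw}{\downarrow}
\]
```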