On the relative proof complexity of deep inference via atomic flows
We consider the proof complexity of the minimal complete fragment, KS, of
standard deep inference systems for propositional logic. To examine the size of
proofs we employ atomic flows, diagrams that trace structural changes through a
proof but ignore logical information. As results we obtain a polynomial
simulation of versions of Resolution, along with some extensions. We also show
that these systems, as well as bounded-depth Frege systems, cannot polynomially
simulate KS, by giving polynomial-size proofs of certain variants of the
propositional pigeonhole principle in KS.
Comment: 27 pages, 2 figures, full version of conference paper.
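For orientation, a standard propositional encoding of the pigeonhole principle with n+1 pigeons and n holes (the generic formulation; the paper proves certain variants of it) reads:

    \[
    \mathrm{PHP}^{n+1}_{n} \;:=\; \bigwedge_{i=1}^{n+1} \bigvee_{j=1}^{n} p_{ij}
    \;\rightarrow\; \bigvee_{1 \le i < i' \le n+1} \; \bigvee_{j=1}^{n} \big( p_{ij} \wedge p_{i'j} \big)
    \]

Here p_{ij} asserts that pigeon i sits in hole j; the formula is a tautology because any placement of n+1 pigeons into n holes must put two pigeons in the same hole.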
Normalisation Control in Deep Inference via Atomic Flows
We introduce `atomic flows': they are graphs obtained from derivations by
tracing atom occurrences and forgetting the logical structure. We study simple
manipulations of atomic flows that correspond to complex reductions on
derivations. This allows us to prove, for propositional logic, a new and very
general normalisation theorem, which contains cut elimination as a special
case. We operate in deep inference, which is more general than other syntactic
paradigms, and where normalisation is more difficult to control. We argue that
atomic flows are a significant technical advance for normalisation theory,
because 1) the technique they support is largely independent of syntax; 2)
indeed, it is largely independent of logical inference rules; 3) they
constitute a powerful geometric formalism, which is more intuitive than syntax.
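As a rough illustration of the data such diagrams carry (a toy encoding of our own devising, with class and rule names of our choosing, not the formalism of the paper), an atomic flow can be viewed as a directed graph whose vertices are instances of structural rules and whose edges trace occurrences of single atoms:

    from dataclasses import dataclass, field

    # Only structural rules leave vertices in the flow; logical rules are
    # forgotten entirely, which is what makes the diagrams so compact.
    KINDS = {"identity", "cut", "contraction", "cocontraction",
             "weakening", "coweakening"}

    @dataclass
    class FlowVertex:
        kind: str   # one of KINDS
        atom: str   # the atom whose occurrences this vertex acts on

    @dataclass
    class AtomicFlow:
        vertices: list = field(default_factory=list)
        edges: list = field(default_factory=list)  # (source, target) indices

        def add_vertex(self, kind, atom):
            assert kind in KINDS
            self.vertices.append(FlowVertex(kind, atom))
            return len(self.vertices) - 1

        def add_edge(self, src, tgt):
            # An edge follows one atom occurrence down through the derivation.
            self.edges.append((src, tgt))

    # Example: a contraction merging two occurrences of atom 'a', one of
    # them introduced by an identity vertex above it.
    flow = AtomicFlow()
    i = flow.add_vertex("identity", "a")
    c = flow.add_vertex("contraction", "a")
    flow.add_edge(i, c)

In this picture, the "simple manipulations" of the abstract become local graph rewrites, performed without reference to the surrounding logical syntax.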
On the pigeonhole and related principles in deep inference and monotone systems
We construct quasipolynomial-size proofs of the propositional pigeonhole principle in the deep inference system KS, addressing an open problem raised in previous works and matching the best known upper bound for the more general class of monotone proofs. We make significant use of monotone formulae computing Boolean threshold functions, an idea previously considered in works of Atserias et al. The main construction, monotone proofs witnessing the symmetry of such functions, involves an implementation of merge-sort in the design of proofs in order to tame the structural behaviour of atoms, and so the complexity of normalisation. Proof transformations from previous work on atomic flows are then employed to yield appropriate KS proofs. As further results we show that our constructions can be applied to provide quasipolynomial-size KS proofs of the parity principle and the generalised pigeonhole principle. These bounds are inherited for the class of monotone proofs, and we are further able to construct n^O(log log n)-size monotone proofs of the weak pigeonhole principle with (1 + ε)n pigeons and n holes for ε = 1/polylog(n), thereby also improving the best known bounds for monotone proofs.
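To give a flavour of the threshold-function idea (a minimal sketch under assumptions of our own; the function name, the string encoding, and the constants 0/1 for trivial cases are ours, and this is not the construction as implemented in the paper), a monotone formula for the Boolean threshold function can be built divide-and-conquer style; this recursion is what yields quasipolynomial, n^O(log n), formula size:

    def threshold(xs, k):
        """Return a monotone formula (a string over & and |) asserting
        that at least k of the variables in xs are true."""
        n = len(xs)
        if k <= 0:
            return "1"    # the empty conjunction: trivially true
        if k > n:
            return "0"    # impossible: trivially false
        if n == 1:
            return xs[0]  # k == 1 at this point
        left, right = xs[:n // 2], xs[n // 2:]
        disjuncts = []
        # At least k hold overall iff, for some split i, at least i hold
        # on the left and at least k - i hold on the right.
        for i in range(k + 1):
            l, r = threshold(left, i), threshold(right, k - i)
            if l != "0" and r != "0":
                disjuncts.append("(%s & %s)" % (l, r))
        return "(" + " | ".join(disjuncts) + ")"

    print(threshold(["x1", "x2", "x3", "x4"], 2))

The divide-and-conquer shape of the recursion mirrors the merge-sort structure mentioned in the abstract: sorted halves are combined in a balanced fashion rather than atom by atom.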
Combinatorial Flows and Their Normalisation
This paper introduces combinatorial flows, a generalization of combinatorial proofs that also includes cut and substitution as methods of proof compression. We give a normalization procedure for combinatorial flows, and show how syntactic proofs are translated into combinatorial flows and vice versa.
On linear rewriting systems for Boolean logic and some applications to proof theory
Linear rules have played an increasing role in structural proof theory in
recent years. It has been observed that the set of all sound linear inference
rules in Boolean logic is already coNP-complete, i.e. that every Boolean
tautology can be written as a (left- and right-)linear rewrite rule. In this
paper we study properties of systems consisting only of linear inferences. Our
main result is that the length of any 'nontrivial' derivation in such a system
is bounded by a polynomial. As a consequence, there is no polynomial-time
decidable sound and complete system of linear inferences, unless coNP=NP. We
draw tools and concepts from term rewriting, Boolean function theory and graph
theory in order to establish some required intermediate results. At the same
time we make several connections between these areas that, to our knowledge,
have not previously been presented and that constitute a rich theoretical
framework for reasoning about linear TRSs for Boolean logic.
Comment: 27 pages, 3 figures, special issue of RTA 2015.
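Two standard examples of sound linear inferences, well known from deep inference and not specific to this paper's results, are switch and medial; each variable occurs exactly once on each side of the rule:

    \[
    \text{switch:}\quad x \wedge (y \vee z) \;\rightarrow\; (x \wedge y) \vee z
    \qquad
    \text{medial:}\quad (w \wedge x) \vee (y \wedge z) \;\rightarrow\; (w \vee y) \wedge (x \vee z)
    \]

Both are Boolean tautologies when read as implications, yet neither duplicates nor discards a variable, which is exactly the linearity constraint the paper studies.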
Complete Additivity and Modal Incompleteness
In this paper, we tell a story about incompleteness in modal logic. The story
weaves together a paper of van Benthem, `Syntactic aspects of modal
incompleteness theorems,' and a longstanding open question: whether every
normal modal logic can be characterized by a class of completely additive modal
algebras, or as we call them, V-BAOs. Using a first-order reformulation of the
property of complete additivity, we prove that the modal logic that starred in
van Benthem's paper resolves the open question in the negative. In addition,
for the case of bimodal logic, we show that there is a naturally occurring
logic that is incomplete with respect to V-BAOs, namely the provability logic
GLB. We also show that even logics that are unsound with respect to such
algebras do not have to be more complex than the classical propositional
calculus. On the other hand, we observe that it is undecidable whether a
syntactically defined logic is V-complete. After these results, we generalize
the Blok Dichotomy to degrees of V-incompleteness. In the end, we return to van
Benthem's theme of syntactic aspects of modal incompleteness.
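For orientation (this is the standard definition, not a contribution of the paper): a Boolean algebra with operators is completely additive, i.e. a V-BAO, when its operator distributes over arbitrary existing joins:

    \[
    \Diamond \bigvee_{i \in I} a_i \;=\; \bigvee_{i \in I} \Diamond a_i
    \qquad \text{whenever } \bigvee_{i \in I} a_i \text{ exists.}
    \]

Ordinary additivity is the binary case of this equation and holds in every BAO for a normal modal logic; complete additivity is strictly stronger, which is what leaves room for the incompleteness results above.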
DeepMatching: Hierarchical Deformable Dense Matching
We introduce a novel matching algorithm, called DeepMatching, to compute
dense correspondences between images. DeepMatching relies on a hierarchical,
multi-layer, correlational architecture designed for matching images and was
inspired by deep convolutional approaches. The proposed matching algorithm can
handle non-rigid deformations and repetitive textures and efficiently
determines dense correspondences in the presence of significant changes between
images. We evaluate the performance of DeepMatching, in comparison with
state-of-the-art matching algorithms, on the Mikolajczyk (Mikolajczyk et al
2005), the MPI-Sintel (Butler et al 2012) and the Kitti (Geiger et al 2013)
datasets. DeepMatching outperforms the state-of-the-art algorithms and shows
excellent results, in particular for repetitive textures. We also propose a
method for estimating optical flow, called DeepFlow, by integrating
DeepMatching in the large displacement optical flow (LDOF) approach of Brox and
Malik (2011). Compared to existing matching algorithms, additional robustness
to large displacements and complex motion is obtained thanks to our matching
approach. DeepFlow obtains competitive performance on public benchmarks for
optical flow estimation.
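As a loose illustration of the lowest level of such a correlational architecture (a minimal NumPy sketch of our own; the function name correlation_map and the patch size are our choices, not the authors' implementation), dense matching can begin from normalized cross-correlation of small patches against every location of the target image:

    import numpy as np

    def correlation_map(patch, image):
        """Score one grayscale patch against every location of an image
        using normalized cross-correlation; higher means a better match."""
        ph, pw = patch.shape
        p = (patch - patch.mean()) / (patch.std() + 1e-8)
        out_h = image.shape[0] - ph + 1
        out_w = image.shape[1] - pw + 1
        scores = np.empty((out_h, out_w))
        for y in range(out_h):
            for x in range(out_w):
                w = image[y:y + ph, x:x + pw]
                w = (w - w.mean()) / (w.std() + 1e-8)
                scores[y, x] = (p * w).mean()
        return scores

    rng = np.random.default_rng(0)
    img = rng.random((32, 32))
    patch = img[10:14, 10:14]          # a 4x4 patch cut from the image
    scores = correlation_map(patch, img)
    print(np.unravel_index(scores.argmax(), scores.shape))  # -> (10, 10)

Roughly speaking, a hierarchy then aggregates such local scores over progressively larger regions, which is what lets matching tolerate non-rigid deformation between the two images.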
Open-ended Learning in Symmetric Zero-sum Games
Zero-sum games such as chess and poker are, abstractly, functions that
evaluate pairs of agents, for example labeling them `winner' and `loser'. If
the game is approximately transitive, then self-play generates sequences of
agents of increasing strength. However, nontransitive games, such as
rock-paper-scissors, can exhibit strategic cycles, and there is no longer a
clear objective -- we want agents to increase in strength, but against whom is
unclear. In this paper, we introduce a geometric framework for formulating
agent objectives in zero-sum games, in order to construct adaptive sequences of
objectives that yield open-ended learning. The framework allows us to reason
about population performance in nontransitive games, and enables the
development of a new algorithm (rectified Nash response, PSRO_rN) that uses
game-theoretic niching to construct diverse populations of effective agents,
producing a stronger set of agents than existing algorithms. We apply PSRO_rN
to two highly nontransitive resource allocation games and find that PSRO_rN
consistently outperforms the existing alternatives.
Comment: ICML 2019, final version.
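To make the nontransitivity concrete (a textbook example, not drawn from the paper): rock-paper-scissors has the antisymmetric payoff matrix

    \[
    A \;=\; \begin{pmatrix} 0 & -1 & 1 \\ 1 & 0 & -1 \\ -1 & 1 & 0 \end{pmatrix}
    \]

with rows and columns ordered rock, paper, scissors. Each pure strategy beats one opponent and loses to another, so no ranking by strength exists, and, roughly speaking, improving against one column of the matrix worsens performance against another; this is why a single scalar objective for self-play is unavailable in such games.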