Learning to Reason: Leveraging Neural Networks for Approximate DNF Counting
Weighted model counting (WMC) has emerged as a prevalent approach for
probabilistic inference. In its most general form, WMC is #P-hard. Weighted DNF
counting (weighted #DNF) is a special case, where approximations with
probabilistic guarantees can be obtained in O(nm) time, where n denotes the
number of variables and m the number of clauses of the input DNF, but this is
not scalable in practice. In this paper, we propose a neural model counting
approach for weighted #DNF that combines approximate model counting with deep
learning, and accurately approximates model counts in linear time when width is
bounded. We conduct experiments to validate our method, and show that our model
learns and generalizes very well to large-scale #DNF instances.
Comment: To appear in Proceedings of the Thirty-Fourth AAAI Conference on
Artificial Intelligence (AAAI-20). Code and data available at:
https://github.com/ralphabb/NeuralDNF
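The O(nm)-time guarantee the abstract refers to is in the spirit of the
Karp–Luby–Madras Monte Carlo estimator for #DNF. Below is a minimal unweighted
sketch of that classical scheme; the function name and clause encoding are
illustrative and not taken from the paper:

```python
import random

def karp_luby_dnf_count(clauses, n_vars, samples=50_000, rng=None):
    """Monte Carlo estimate of the model count of a DNF formula.

    clauses: list of dicts {var_index: required_bool}; a clause is
             satisfied when all of its literals hold.
    Karp-Luby idea: sample uniformly from the disjoint union of the
    clauses' satisfying sets, and count a sample only if the sampled
    clause is the FIRST clause it satisfies, so that every model is
    counted exactly once across clauses.
    """
    rng = rng or random.Random()
    # Clause i has 2^(n_vars - |C_i|) satisfying assignments:
    # the variables outside the clause are free.
    sizes = [2 ** (n_vars - len(c)) for c in clauses]
    union_upper = sum(sizes)  # size of the multiset union

    hits = 0
    for _ in range(samples):
        # Pick a clause with probability proportional to its size.
        i = rng.choices(range(len(clauses)), weights=sizes)[0]
        # Sample a uniform satisfying assignment of that clause.
        x = dict(clauses[i])
        for v in range(n_vars):
            if v not in x:
                x[v] = rng.random() < 0.5
        # Index of the first clause this assignment satisfies.
        first = next(j for j, c in enumerate(clauses)
                     if all(x[v] == val for v, val in c.items()))
        hits += (first == i)

    return union_upper * hits / samples
```

For example, (x0) OR (x1) over two variables has exactly 3 models; the
estimate concentrates around that value as the sample count grows.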
Probabilistic Model Counting with Short XORs
The idea of counting the number of satisfying truth assignments (models) of a
formula by adding random parity constraints can be traced back to the seminal
work of Valiant and Vazirani, showing that NP is as easy as detecting unique
solutions. While theoretically sound, the random parity constraints in that
construction have the following drawback: each constraint, on average, involves
half of all variables. As a result, the branching factor associated with
searching for models that also satisfy the parity constraints quickly gets out
of hand. In this work we prove that one can work with much shorter parity
constraints and still get rigorous mathematical guarantees, especially when the
number of models is large so that many constraints need to be added. Our work
is based on the realization that the essential feature for random systems of
parity constraints to be useful in probabilistic model counting is that the
geometry of their set of solutions resembles an error-correcting code.
Comment: To appear in SAT 1
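The counting-by-halving idea the abstract builds on can be illustrated with a
brute-force sketch on tiny instances; the encoding and names here are
hypothetical, not from the paper. Each random parity constraint is expected to
cut the set of surviving models roughly in half, which is what makes the
construction useful for estimating model counts:

```python
import itertools, random

def count_with_xors(models, n_vars, k, rng=None):
    """Count models that also satisfy k random parity (XOR) constraints.

    models: set of satisfying assignments, each a tuple of 0/1 values.
    Each constraint draws a random subset S of variables (each variable
    included with probability 1/2, i.e. the "long" constraints of the
    Valiant-Vazirani construction) and a parity bit b, and keeps a model
    x iff the XOR of x over S equals b.  The paper's point is that much
    sparser subsets can still give rigorous guarantees when the model
    count is large.
    """
    rng = rng or random.Random()
    kept = models
    for _ in range(k):
        subset = [v for v in range(n_vars) if rng.random() < 0.5]
        b = rng.randrange(2)
        kept = {x for x in kept if sum(x[v] for v in subset) % 2 == b}
    return len(kept)
```

With the full cube of 2^6 assignments as the model set, two constraints leave
about 2^4 = 16 models on average, so tracking the largest k at which the
restricted instance typically stays satisfiable estimates log2 of the count.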
Counting connected hypergraphs via the probabilistic method
In 1990 Bender, Canfield and McKay gave an asymptotic formula for the number
of connected graphs on $[n]$ with $m$ edges, whenever $n$ and the nullity
$m-n+1$ tend to infinity. Asymptotic formulae for the number of connected
$r$-uniform hypergraphs on $[n]$ with $m$ edges and so nullity
$t=(r-1)m-n+1$ were proved by Karoński and Łuczak for the case
$t=o(\log n/\log\log n)$, and Behrisch, Coja-Oghlan and Kang for
$t=\Theta(n)$. Here we prove such a formula for any $r\ge 3$ fixed, and any
$t=t(n)$ satisfying $t=o(n)$ and $t\to\infty$ as $n\to\infty$. This leaves
open only the (much simpler) case $t/n\to\infty$, which we will consider in
future work. (arXiv:1511.04739)
Our approach is probabilistic. Let $H^r_{n,p}$ denote the random $r$-uniform
hypergraph on $[n]$ in which each edge is present independently with
probability $p$. Let $L_1$ and $M_1$ be the numbers of vertices and edges in
the largest component of $H^r_{n,p}$. We prove a local limit theorem giving an
asymptotic formula for the probability that $L_1$ and $M_1$ take any given pair
of values within the `typical' range, for any $p=p(n)$ in the supercritical
regime, i.e., when $p=(1+\epsilon)(r-2)!\,n^{-(r-1)}$ where
$\epsilon=\epsilon(n)\to 0$ and $\epsilon^3 n\to\infty$; our enumerative result
then follows easily.
Taking as a starting point the recent joint central limit theorem for $L_1$
and $M_1$, we use smoothing techniques to show that `nearby' pairs of values
arise with about the same probability, leading to the local limit theorem.
Behrisch et al. used similar ideas in a very different way, that does not seem
to work in our setting.
Independently, Sato and Wormald have recently proved the special case $r=3$,
with an additional restriction on $t$. They use complementary, more enumerative
methods, which seem to have a more limited scope, but to give additional
information when they do work.
Comment: Expanded; asymptotics clarified - no significant mathematical
changes. 67 pages (including appendix)
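The random $r$-uniform hypergraph in the abstract is straightforward to
simulate, which gives a concrete feel for the quantities $L_1$ and $M_1$ (the
vertex and edge counts of the largest component) that the local limit theorem
concerns. A small sketch, with hypothetical names and a union-find over
vertices standing in for component tracking:

```python
import itertools, random

def largest_component(n, r, p, rng=None):
    """Sample the random r-uniform hypergraph on n vertices, where each
    of the C(n, r) possible edges is present independently with
    probability p, and return (L1, M1): the numbers of vertices and
    edges in its largest connected component."""
    rng = rng or random.Random()
    edges = [e for e in itertools.combinations(range(n), r)
             if rng.random() < p]

    # Union-find with path halving; an edge merges all of its vertices.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for e in edges:
        for v in e[1:]:
            parent[find(v)] = find(e[0])

    # Group vertices and edges by component representative.
    comp_v, comp_e = {}, {}
    for v in range(n):
        comp_v.setdefault(find(v), []).append(v)
    for e in edges:
        comp_e.setdefault(find(e[0]), []).append(e)
    giant = max(comp_v, key=lambda c: len(comp_v[c]))
    return len(comp_v[giant]), len(comp_e.get(giant, []))
```

Sampling this repeatedly in the supercritical regime the abstract describes
shows the pair $(L_1, M_1)$ concentrating, which is the empirical counterpart
of the local limit theorem.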