Understanding the Complexity of Lifted Inference and Asymmetric Weighted Model Counting
In this paper we study lifted inference for the Weighted First-Order Model
Counting problem (WFOMC), which counts the assignments that satisfy a given
sentence in first-order logic (FOL); it has applications in Statistical
Relational Learning (SRL) and Probabilistic Databases (PDB). We present several
results. First, we describe a lifted inference algorithm that generalizes prior
approaches in SRL and PDB. Second, we provide a novel dichotomy result for a
non-trivial fragment of FO CNF sentences, showing that for each sentence the
WFOMC problem is either in PTIME or #P-hard in the size of the input domain;
we prove that, in the first case, our algorithm solves the WFOMC problem in
PTIME and that, in the second case, it fails. Third, we present several
properties of the
algorithm. Finally, we discuss limitations of lifted inference for symmetric
probabilistic databases (where the weights of ground literals depend only on
the relation name, and not on the constants of the domain), and prove the
impossibility of a dichotomy result for the complexity of probabilistic
inference for the entire language FOL.
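As a concrete (if exponential-time) illustration of what WFOMC computes, not the lifted algorithm of the paper, the sketch below brute-forces the weighted model count of the classic friends-and-smokers sentence over a tiny domain, with symmetric per-predicate weights. All names and the weight encoding are assumptions of this sketch.

```python
from itertools import product

def wfomc_friends_smokers(domain, w, wbar):
    """Brute-force WFOMC for the sentence
    forall x, y: Smokes(x) and Friends(x, y) -> Smokes(y),
    with symmetric weights: w[P] for true ground atoms of predicate P,
    wbar[P] for false ones. Exponential in the number of ground atoms."""
    smokes_atoms = list(domain)
    friends_atoms = [(a, b) for a in domain for b in domain]
    total = 0.0
    for s_bits in product([False, True], repeat=len(smokes_atoms)):
        smokes = dict(zip(smokes_atoms, s_bits))
        for f_bits in product([False, True], repeat=len(friends_atoms)):
            friends = dict(zip(friends_atoms, f_bits))
            # skip interpretations that violate the sentence
            if any(smokes[a] and friends[(a, b)] and not smokes[b]
                   for a in domain for b in domain):
                continue
            # multiply the weights of all ground literals
            weight = 1.0
            for v in smokes.values():
                weight *= w["Smokes"] if v else wbar["Smokes"]
            for v in friends.values():
                weight *= w["Friends"] if v else wbar["Friends"]
            total += weight
    return total
```

With all weights set to 1, the result is the plain first-order model count for the given domain size; the point of lifted inference is to obtain this number in time polynomial in the domain size, without the enumeration above.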
Algorithmic Analysis of Qualitative and Quantitative Termination Problems for Affine Probabilistic Programs
In this paper, we consider termination of probabilistic programs with
real-valued variables. The questions concerned are:
1. qualitative ones that ask (i) whether the program terminates with
probability 1 (almost-sure termination) and (ii) whether the expected
termination time is finite (finite termination);
2. quantitative ones that ask (i) to approximate the expected termination
time (expectation problem) and (ii) to compute a bound B such that the
probability to terminate after B steps decreases exponentially
(concentration problem).
To solve these questions, we utilize the notion of ranking supermartingales
which is a powerful approach for proving termination of probabilistic programs.
In detail, we focus on algorithmic synthesis of linear ranking-supermartingales
over affine probabilistic programs (APP's) with both angelic and demonic
non-determinism. An important subclass of APP's is LRAPP which is defined as
the class of all APP's over which a linear ranking-supermartingale exists.
Our main contributions are as follows. Firstly, we show that the membership
problem of LRAPP (i) can be decided in polynomial time for APP's with at most
demonic non-determinism, and (ii) is NP-hard and in PSPACE for APP's with
angelic non-determinism; moreover, the NP-hardness result holds already for
APP's without probability and demonic non-determinism. Secondly, we show that
the concentration problem over LRAPP can be solved in the same complexity as
for the membership problem of LRAPP. Finally, we show that the expectation
problem over LRAPP can be solved in 2EXPTIME and is PSPACE-hard even for APP's
without probability and non-determinism (i.e., deterministic programs). Our
experimental results demonstrate the effectiveness of our approach to answer
the qualitative and quantitative questions over APP's with at most demonic
non-determinism.
Comment: 24 pages, full version of the conference paper at POPL 201
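To make the notion concrete, the sketch below (illustrative only, not the paper's synthesis procedure) checks the linear ranking-supermartingale condition E[eta(x')] <= eta(x) - eps on the guard, for a toy probabilistic loop whose transition probabilities are assumptions of this sketch.

```python
import random

def step(x):
    """One iteration of the toy loop 'while x > 0':
    x := x - 1 with probability 3/4, else x := x + 1."""
    return x - 1 if random.random() < 0.75 else x + 1

def eta(x):
    # candidate linear ranking supermartingale
    return 2.0 * x

def check_rsm(eps=1.0, guard_range=range(1, 100)):
    """Check the RSM conditions exactly, using the known transition
    probabilities of `step`: on every guard state x, eta must be
    nonnegative and its expected next value must drop by at least eps."""
    for x in guard_range:
        expected_next = 0.75 * eta(x - 1) + 0.25 * eta(x + 1)
        if expected_next > eta(x) - eps or eta(x) < 0:
            return False
    return True
```

Here E[eta(x')] = 2x - 1 = eta(x) - 1, so the check passes with eps = 1, certifying almost-sure termination and finite expected termination time for this loop; the paper's contribution is synthesizing such an eta automatically for affine probabilistic programs.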
Credimus
We believe that economic design and computational complexity---while already
important to each other---should become even more important to each other with
each passing year. But for that to happen, experts in areas such as social
choice, economics, and political science on the one hand, and in
computational complexity on the other, will have to better understand each
other's worldviews.
This article, written by two complexity theorists who also work in
computational social choice theory, focuses on one direction of that process by
presenting a brief overview of how most computational complexity theorists view
the world. Although our immediate motivation is to make the lens through which
complexity theorists see the world be better understood by those in the social
sciences, we also feel that even within computer science it is very important
for nontheoreticians to understand how theoreticians think, just as it is
equally important within computer science for theoreticians to understand how
nontheoreticians think.
Algorithms and Hardness Results for Computing Cores of Markov Chains
Given a Markov chain M = (V, v_0, δ), with state space V and a starting state v_0, and a probability threshold ε, an ε-core is a subset C of states that is left with probability at most ε. More formally, C ⊆ V is an ε-core iff ℙ[reach (V\C)] ≤ ε. Cores have been applied in a wide variety of verification problems over Markov chains, Markov decision processes, and probabilistic programs, as a means of discarding uninteresting and low-probability parts of a probabilistic system and instead being able to focus on the states that are likely to be encountered in a real-world run. In this work, we focus on the problem of computing a minimal ε-core in a Markov chain. Our contributions include both negative and positive results: (i) We show that the decision problem on the existence of an ε-core of a given size is NP-complete. This solves an open problem posed in [Jan Křetínský and Tobias Meggendorfer, 2020]. We additionally show that the problem remains NP-complete even when limited to acyclic Markov chains with bounded maximal vertex degree; (ii) We provide a polynomial-time algorithm for computing a minimal ε-core on Markov chains over control-flow graphs of structured programs. A straightforward combination of our algorithm with standard branch prediction techniques allows one to apply the idea of cores to find a subset of program lines that are left with low probability and then focus any desired static analysis on this core subset.
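As a small illustration of the definition (not of the paper's algorithms), the sketch below approximates ℙ[reach (V\C)] by value iteration and checks the ε-core condition; the dictionary encoding of the transition function is an assumption of this sketch.

```python
def reach_prob(delta, v0, targets, iters=10000):
    """Probability of ever reaching `targets` from v0 in the Markov chain
    with transition function delta: state -> {successor: probability}.
    Value iteration on p(s) = P[reach targets from s]; it converges to the
    true value from below, and is exact here after few iterations."""
    states = list(delta)
    p = {s: (1.0 if s in targets else 0.0) for s in states}
    for _ in range(iters):
        for s in states:
            if s not in targets:
                p[s] = sum(pr * p[t] for t, pr in delta[s].items())
    return p[v0]

def is_eps_core(delta, v0, core, eps):
    # C is an eps-core iff the complement is reached with prob. <= eps
    outside = set(delta) - set(core)
    return reach_prob(delta, v0, outside) <= eps
```

For instance, in a chain where v_0 moves to an absorbing state a with probability 0.9 and to an absorbing state b with probability 0.1, the set {v_0, a} is a 0.1-core but not a 0.05-core.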
Strong Invariants Are Hard: On the Hardness of Strongest Polynomial Invariants for (Probabilistic) Programs
We show that computing the strongest polynomial invariant for single-path
loops with polynomial assignments is at least as hard as the Skolem problem, a
famous problem whose decidability has been open for almost a century. While the
strongest polynomial invariants are computable for affine loops, for polynomial
loops the problem remained wide open. As an intermediate result of independent
interest, we prove that reachability for discrete polynomial dynamical systems
is Skolem-hard as well. Furthermore, we generalize the notion of invariant
ideals and introduce moment invariant ideals for probabilistic programs. With
this tool, we further show that the strongest polynomial moment invariant is
(i) uncomputable, for probabilistic loops with branching statements, and (ii)
Skolem-hard to compute for polynomial probabilistic loops without branching
statements. Finally, we identify a class of probabilistic loops for which the
strongest polynomial moment invariant is computable and provide an algorithm
for it.
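Note that checking a candidate polynomial invariant is easy; it is computing the strongest one that is Skolem-hard. The sketch below (illustrative only, not from the paper) confirms numerically that y - x^2 = 0 is preserved by the single-path polynomial loop (x, y) := (2x, 4y) started at (1, 1).

```python
def check_polynomial_invariant(steps=25):
    """Confirm that p(x, y) = y - x**2 is an invariant of the loop
    (x, y) := (2*x, 4*y) from (1, 1): the update maps p to
    4*y - (2*x)**2 = 4*(y - x**2), so p = 0 is preserved."""
    x, y = 1, 1
    for _ in range(steps):
        assert y - x * x == 0  # the invariant holds at every iteration
        x, y = 2 * x, 4 * y
    return True
```

The hardness results concern the converse direction: given the loop alone, producing a finite description of all polynomial relations that hold at every iteration (the invariant ideal), rather than verifying one supplied candidate.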
Computing the partition function of the Sherrington-Kirkpatrick model is hard on average
We establish the average-case hardness of the algorithmic problem of exact
computation of the partition function associated with the
Sherrington-Kirkpatrick model of spin glasses with Gaussian couplings and
random external field. In particular, we establish that unless P = #P, there
does not exist a polynomial-time algorithm to exactly compute the partition
function on average. This is done by showing that if there exists a
polynomial-time algorithm which exactly computes the partition function for
an inverse polynomial fraction of all inputs, then there is a polynomial-time
algorithm which exactly computes the partition function for all inputs, with
high probability, yielding P = #P. The computational model that we adopt is
finite-precision arithmetic, where the algorithmic inputs are first truncated
to a certain level of digital precision. The ingredients of
our proof include the random and downward self-reducibility of the partition
function with random external field; an argument of Cai et al.
\cite{cai1999hardness} for establishing the average-case hardness of computing
the permanent of a matrix; a list-decoding algorithm of Sudan
\cite{sudan1996maximum}, for reconstructing polynomials intersecting a given
list of numbers at sufficiently many points; and near-uniformity of the
log-normal distribution, modulo a large prime. To the best of our
knowledge, our result is the first one establishing a provable hardness of a
model arising in the field of spin glasses.
Furthermore, we extend our result to the same problem under a different
real-valued computational model, e.g. using a Blum-Shub-Smale machine
\cite{blum1988theory} operating over real-valued inputs.
Comment: 31 pages
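For scale, the sketch below computes an SK-style partition function by brute-force enumeration of all 2^n spin configurations; the (absent) normalization of the couplings is a convention of this sketch, not the paper's. Exact computation of this quantity in polynomial time is what the result rules out, even on average.

```python
import math
from itertools import product

def sk_partition_function(J, h, beta=1.0):
    """Brute-force partition function
    Z = sum over sigma in {-1,+1}^n of exp(beta * H(sigma)),
    with H(sigma) = sum_{i<j} J[i][j] sigma_i sigma_j + sum_i h[i] sigma_i.
    Exponential in n; only a sanity check for tiny instances."""
    n = len(h)
    total = 0.0
    for sigma in product([-1, 1], repeat=n):
        energy = sum(J[i][j] * sigma[i] * sigma[j]
                     for i in range(n) for j in range(i + 1, n))
        energy += sum(h[i] * sigma[i] for i in range(n))
        total += math.exp(beta * energy)
    return total
```

For n = 2 with a single coupling J_{01} = 1 and zero field, the four configurations contribute 2e + 2e^{-1}; the hardness result concerns instances where J and h are drawn as Gaussians and n grows.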