On bounded query machines
Simple proofs are given for each of the following results: (a) P = Pspace if and only if, for every set A, P(A) = Pquery(A) (Selman et al., 1983); (b) NP = Pspace if and only if, for every set A, NP(A) = NPquery(A) (Book, 1981); (c) PH = Pspace if and only if, for every set A, PH(A) = PQH(A) (Book and Wrathall, 1981); (d) PH = Pspace if and only if, for every set S, PH(S) = PQH(S) = Pspace(S) (Balcázar et al., 1986; Long and Selman, 1986).
Complexity of Nested Circumscription and Nested Abnormality Theories
The need for a circumscriptive formalism that allows for simple yet elegant
modular problem representation has led Lifschitz (AIJ, 1995) to introduce
nested abnormality theories (NATs) as a tool for modular knowledge
representation, tailored for applying circumscription to minimize exceptional
circumstances. Abstracting from this particular objective, we propose L_{CIRC},
which is an extension of generic propositional circumscription by allowing
propositional combinations and nesting of circumscriptive theories. As shown,
NATs are naturally embedded into this language, and are in fact of equal
expressive capability. We then analyze the complexity of L_{CIRC} and NATs, and
in particular the effect of nesting. The latter is found to be a source of
complexity, which climbs the Polynomial Hierarchy as the nesting depth
increases and reaches PSPACE-completeness in the general case. We also identify
meaningful syntactic fragments of NATs which have lower complexity. In
particular, we show that the generalization of Horn circumscription in the NAT
framework remains coNP-complete, and that Horn NATs without fixed letters can
be efficiently transformed into an equivalent Horn CNF, which implies
polynomial solvability of principal reasoning tasks. Finally, we also study
extensions of NATs and briefly address the complexity in the first-order case.
Our results give insight into the ``cost'' of using L_{CIRC} (resp. NATs) as a
host language for expressing other formalisms such as action theories,
narratives, or spatial theories.

Comment: A preliminary abstract of this paper appeared in Proc. Seventeenth International Joint Conference on Artificial Intelligence (IJCAI-01), pages 169--174. Morgan Kaufmann, 200
NP-complete Problems and Physical Reality
Can NP-complete problems be solved efficiently in the physical universe? I
survey proposals including soap bubbles, protein folding, quantum computing,
quantum advice, quantum adiabatic algorithms, quantum-mechanical
nonlinearities, hidden variables, relativistic time dilation, analog computing,
Malament-Hogarth spacetimes, quantum gravity, closed timelike curves, and
"anthropic computing." The section on soap bubbles even includes some
"experimental" results. While I do not believe that any of the proposals will
let us solve NP-complete problems efficiently, I argue that by studying them,
we can learn something not only about computation but also about physics.

Comment: 23 pages, minor correction
The quantum measurement problem and physical reality: a computation theoretic perspective
Is the universe computable? If yes, is it computationally a polynomial place?
In standard quantum mechanics, which permits infinite parallelism and the
infinitely precise specification of states, a negative answer to both questions
is not ruled out. On the other hand, empirical evidence suggests that
NP-complete problems are intractable in the physical world. Likewise,
computational problems known to be algorithmically uncomputable do not seem to
be computable by any physical means. We suggest that this close correspondence
between the efficiency and power of abstract algorithms on the one hand, and
physical computers on the other, finds a natural explanation if the universe is
assumed to be algorithmic; that is, that physical reality is the product of
discrete sub-physical information processing equivalent to the actions of a
probabilistic Turing machine. This assumption can be reconciled with the
observed exponentiality of quantum systems at microscopic scales, and the
consequent possibility of implementing Shor's quantum polynomial time algorithm
at that scale, provided the degree of superposition is intrinsically, finitely
upper-bounded. If this bound is associated with the quantum-classical divide
(the Heisenberg cut), a natural resolution to the quantum measurement problem
arises. From this viewpoint, macroscopic classicality is evidence that the
universe is in BPP, and both questions raised above receive affirmative
answers. A recently proposed computational model of quantum measurement, which
relates the Heisenberg cut to the discreteness of Hilbert space, is briefly
discussed. A connection to quantum gravity is noted. Our results are compatible
with the philosophy that mathematical truths are independent of the laws of
physics.

Comment: Talk presented at "Quantum Computing: Back Action 2006", IIT Kanpur, India, March 200
Computational Complexity of Smooth Differential Equations
The computational complexity of the solution h to the ordinary
differential equation h(0) = 0, h'(t) = g(t, h(t)), under various assumptions
on the function g, has been investigated. Kawamura showed in 2010 that the
solution h can be PSPACE-hard even if g is assumed to be Lipschitz
continuous and polynomial-time computable. We place further requirements on the
smoothness of g and obtain the following results: the solution h can still
be PSPACE-hard if g is assumed to be of class C^1; for each k >= 2, the
solution h can be hard for the counting hierarchy even if g is of class
C^k.

Comment: 15 pages, 3 figures
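For concreteness, the standard setting in this literature is an initial value problem h(0) = 0, h'(t) = g(t, h(t)), and the question is how hard it is to compute h to a given precision. A minimal forward-Euler sketch, with a hypothetical smooth right-hand side g unrelated to the hardness constructions, shows what "computing the solution" means operationally:

```python
import math

# Forward Euler for h'(t) = g(t, h(t)), h(0) = 0. This only illustrates
# the problem setup; the hardness results concern exact high-precision
# computation, not this naive numerical scheme.
def euler(g, t_end, steps):
    t, h = 0.0, 0.0
    dt = t_end / steps
    for _ in range(steps):
        h += dt * g(t, h)
        t += dt
    return h

# Hypothetical example: g(t, h) = h + 1 has exact solution h(t) = e^t - 1.
approx = euler(lambda t, h: h + 1, 1.0, 100_000)
assert abs(approx - (math.e - 1)) < 1e-4
```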
On the Structure and Complexity of Rational Sets of Regular Languages
In a recent thread of papers, we have introduced FQL, a precise specification
language for test coverage, and developed the test case generation engine
FShell for ANSI C. In essence, an FQL test specification amounts to a set of
regular languages, each of which has to be matched by at least one test
execution. To describe such sets of regular languages, the FQL semantics uses
an automata-theoretic concept known as rational sets of regular languages
(RSRLs). RSRLs are automata whose alphabet consists of regular expressions.
Thus, the language accepted by the automaton is a set of regular expressions.
In this paper, we study RSRLs from a theoretical point of view. More
specifically, we analyze RSRL closure properties under common set theoretic
operations, and the complexity of membership checking, i.e., whether a regular
language is an element of an RSRL. For all questions, we investigate both the
general case and the case of finite sets of regular languages. Although a few
properties are left as open problems, the paper provides a systematic semantic
foundation for the test specification language FQL.
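To make the definition concrete, here is a toy RSRL presented as an automaton whose edge labels are regular-expression fragments: each accepting path concatenates into one regular expression, and the RSRL is the set of all such expressions. Membership of a regular language then amounts to equivalence with the language of one of these expressions. The encoding and names below are illustrative, not FQL's actual syntax:

```python
# Acyclic toy RSRL: state -> [(regex fragment, next state)].
transitions = {
    0: [("a*", 1), ("b", 1)],
    1: [("(ab)+", 2)],
}
accepting = {2}

def expressions(state, prefix=""):
    """Enumerate the regular expressions accepted from `state` (acyclic case)."""
    if state in accepting:
        yield prefix
    for frag, nxt in transitions.get(state, []):
        yield from expressions(nxt, prefix + frag)

print(sorted(expressions(0)))  # ['a*(ab)+', 'b(ab)+']
```

With cycles in the automaton the accepted set of expressions is infinite, which is where the nontrivial closure and membership questions studied in the paper arise.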
A PSPACE Construction of a Hitting Set for the Closure of Small Algebraic Circuits
In this paper we study the complexity of constructing a hitting set for the
closure of VP, the class of polynomials that can be infinitesimally
approximated by polynomials that are computed by polynomial sized algebraic
circuits, over the real or complex numbers. Specifically, we show that there is
a PSPACE algorithm that, given n, s, r in unary, outputs a set of n-tuples over the
rationals of size poly(n,s,r), with poly(n,s,r) bit complexity, that hits all
n-variate polynomials of degree-r that are the limit of size-s algebraic
circuits. Previously it was known that a random set of this size is a hitting
set, but a construction that is certified to work was only known in EXPSPACE
(or EXPH assuming the generalized Riemann hypothesis). As a corollary we get
that a host of other algebraic problems, such as the Noether Normalization Lemma,
can also be solved deterministically in PSPACE, where earlier only randomized
algorithms and EXPSPACE algorithms (or EXPH assuming the generalized Riemann
hypothesis) were known.
The proof relies on the new notion of a robust hitting set, which is a set of
inputs such that any nonzero polynomial computable by a polynomial-size
algebraic circuit evaluates to a not-too-small value on at least one
element of the set. Proving the existence of such a robust hitting set is the
main technical difficulty in the proof.
Our proof uses anti-concentration results for polynomials, basic tools from
algebraic geometry, and the existential theory of the reals.
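To unpack the central definition: a hitting set for a class of polynomials is a finite set of points on which no nonzero polynomial from the class vanishes everywhere, and the robust variant asks for an evaluation that is not merely nonzero but bounded away from zero. A minimal sketch over a toy class — bivariate polynomials of individual degree at most 2, where a small grid suffices by a standard fact, not by the paper's PSPACE construction:

```python
import itertools

def hits(poly, points, eps=0.0):
    """True if |poly(p)| > eps for some point p in `points`."""
    return any(abs(poly(*p)) > eps for p in points)

# The grid {0,1,2}^2 hits every nonzero bivariate polynomial of degree
# at most 2 in each variable (standard fact; illustrative only).
grid = list(itertools.product(range(3), repeat=2))

p = lambda x, y: x * y - 2 * x   # nonzero: p(1, 0) = -2
z = lambda x, y: 0               # identically zero
print(hits(p, grid), hits(z, grid))  # True False
```

The paper's difficulty is that for limits of small algebraic circuits no such simple grid argument applies, which is why certifying a hitting set previously required EXPSPACE.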