Public projects, Boolean functions and the borders of Border's theorem
Border's theorem gives an intuitive linear characterization of the feasible
interim allocation rules of a Bayesian single-item environment, and it has
several applications in economic and algorithmic mechanism design. All known
generalizations of Border's theorem either restrict attention to relatively
simple settings, or resort to approximation. This paper identifies a
complexity-theoretic barrier that indicates, assuming standard complexity class
separations, that Border's theorem cannot be extended significantly beyond the
state-of-the-art. We also identify a surprisingly tight connection between
Myerson's optimal auction theory, when applied to public project settings, and
some fundamental results in the analysis of Boolean functions.
Comment: Accepted to ACM EC 201
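To make the characterization concrete: in the commonly cited symmetric form (n i.i.d. bidders with a finite type distribution f), an interim allocation rule q is feasible exactly when, for every set S of types, the probability that the winner's type lies in S is at most the probability that some bidder's type lies in S. The following brute-force checker is a sketch of that condition only; the function name and the restriction to the symmetric i.i.d. case are assumptions, not the paper's setting.

```python
from itertools import chain, combinations

def is_border_feasible(f, q, n):
    """Check Border's condition in the symmetric i.i.d. single-item case:
    for every subset S of types,
        n * sum_{t in S} f[t] * q[t]  <=  1 - (1 - f(S))**n,
    i.e. Pr[winner's type is in S] <= Pr[some bidder's type is in S].
    f: type distribution, q: interim winning probabilities, n: #bidders."""
    types = list(f)
    subsets = chain.from_iterable(
        combinations(types, r) for r in range(len(types) + 1))
    for S in subsets:
        mass = sum(f[t] for t in S)
        lhs = n * sum(f[t] * q[t] for t in S)
        rhs = 1 - (1 - mass) ** n
        if lhs > rhs + 1e-12:  # tolerance for floating-point error
            return False
    return True

f = {"low": 0.5, "high": 0.5}
# Allocating uniformly at random gives q(t) = 1/n, which is feasible;
# promising the item to every type (q = 1) is not.
print(is_border_feasible(f, {"low": 0.5, "high": 0.5}, n=2))  # True
print(is_border_feasible(f, {"low": 1.0, "high": 1.0}, n=2))  # False
```

The point of the theorem is that this exponential subset check collapses to a polynomially structured linear system in well-behaved settings; the paper's barrier result concerns how far that collapse can go.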
P-Selectivity, Immunity, and the Power of One Bit
We prove that P-sel, the class of all P-selective sets, is EXP-immune, but is
not EXP/1-immune. That is, we prove that some infinite P-selective set has no
infinite EXP-time subset, but we also prove that every infinite P-selective set
has some infinite subset in EXP/1. Informally put, the immunity of P-sel is so
fragile that it is pierced by a single bit of information.
The above claims follow from broader results that we obtain about the
immunity of the P-selective sets. In particular, we prove that for every
recursive function f, P-sel is DTIME(f)-immune. Yet we also prove that P-sel is
not \Pi_2^p/1-immune.
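For readers unfamiliar with P-selectivity: a set is P-selective if a polynomial-time selector, given two strings, outputs one of them that belongs to the set whenever either does. The standard examples are left cuts of real numbers, which are downward closed under numeric value, so the selector can simply return the smaller input. The toy set below (a fixed threshold rather than a genuine left cut) is only a stand-in to illustrate why downward-closedness makes selection work:

```python
from itertools import product

THRESHOLD = 13  # stand-in for the cut point of an arbitrary left cut

def in_set(x):
    """Toy 'left cut': binary strings whose numeric value is below the cut."""
    return int(x, 2) < THRESHOLD

def selector(x, y):
    """A P-selector: the set is downward closed in numeric value, so the
    numerically smaller input belongs to the set whenever either input does."""
    return x if int(x, 2) <= int(y, 2) else y

# Verify the selector property on all binary strings up to length 4.
strings = ["".join(bits) for n in range(1, 5)
           for bits in product("01", repeat=n)]
for x in strings:
    for y in strings:
        if in_set(x) or in_set(y):
            assert in_set(selector(x, y))
print("selector property holds on all strings up to length 4")
```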
The Complexity of Computing the Size of an Interval
Given a p-order A over a universe of strings (i.e., a transitive, reflexive,
antisymmetric relation such that if (x, y) is an element of A then |x| is
polynomially bounded by |y|), an interval size function of A returns, for each
string x in the universe, the number of strings in the interval between strings
b(x) and t(x) (with respect to A), where b(x) and t(x) are functions that are
polynomial-time computable in the length of x.
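As a concrete instance of this definition, one can take the shortlex order on binary strings as the p-order and count the interval by brute force. The endpoint functions b and t below are hypothetical choices made purely for illustration; the enumeration is exponential and only meant to show what an interval size function computes:

```python
from itertools import product

def shortlex_key(s):
    """Shortlex p-order on binary strings: compare by length, then
    lexicographically. If (x, y) is in the order then |x| <= |y|, so the
    polynomial length bound in the definition is trivially satisfied."""
    return (len(s), s)

def b(x):  # hypothetical polynomial-time bottom endpoint
    return x

def t(x):  # hypothetical polynomial-time top endpoint
    return x + "11"

def interval_size(x):
    """Number of strings y with b(x) <= y <= t(x) in the shortlex order,
    counted by exhaustive enumeration (for illustration only)."""
    lo, hi = shortlex_key(b(x)), shortlex_key(t(x))
    count = 0
    for n in range(len(t(x)) + 1):
        for bits in product("01", repeat=n):
            if lo <= shortlex_key("".join(bits)) <= hi:
                count += 1
    return count

# For x = "1": the interval runs from "1" to "111" and contains "1",
# all 4 strings of length 2, and all 8 strings of length 3.
print(interval_size("1"))  # 13
```

Since the shortlex order is polynomial-time decidable, a function like this falls under the paper's first characterization: interval size functions over polynomial-time decidable p-orders are exactly #P.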
By choosing sets of interval size functions based on feasibility requirements
for their underlying p-orders, we obtain new characterizations of complexity
classes. We prove that the set of all interval size functions whose underlying
p-orders are polynomial-time decidable is exactly #P. We show that the interval
size functions for orders with polynomial-time adjacency checks are closely
related to the class FPSPACE(poly). Indeed, FPSPACE(poly) is exactly the class
of all nonnegative functions that are an interval size function minus a
polynomial-time computable function.
We study two important functions in relation to interval size functions. The
function #DIV maps each natural number n to the number of nontrivial divisors
of n. We show that #DIV is an interval size function of a polynomial-time
decidable partial p-order with polynomial-time adjacency checks. The function
#MONSAT maps each monotone Boolean formula F to the number of satisfying
assignments of F. We show that #MONSAT is an interval size function of a
polynomial-time decidable total p-order with polynomial-time adjacency checks.
Finally, we explore the related notion of cluster computation.
Comment: This revision fixes a problem in the proof of Theorem 9.
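Both counting functions described above are easy to state directly. The brute-force reference implementations below (exponential in the case of #MONSAT, where the monotone formula is represented as a Python predicate) are sketches of the definitions only, not of the paper's constructions:

```python
from itertools import product

def num_nontrivial_divisors(n):
    """#DIV: the number of divisors of n other than 1 and n itself."""
    return sum(1 for d in range(2, n) if n % d == 0)

def num_monotone_sat(formula, num_vars):
    """#MONSAT: the number of satisfying assignments of a monotone Boolean
    formula, given here as a predicate over a tuple of booleans."""
    return sum(1 for a in product([False, True], repeat=num_vars)
               if formula(a))

print(num_nontrivial_divisors(12))  # divisors 2, 3, 4, 6 -> 4
# (x1 OR x2) AND x3: x3 must be true and (x1, x2) != (false, false) -> 3
print(num_monotone_sat(lambda a: (a[0] or a[1]) and a[2], 3))  # 3
```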
On the Lattice Isomorphism Problem
We study the Lattice Isomorphism Problem (LIP), in which given two lattices
L_1 and L_2 the goal is to decide whether there exists an orthogonal linear
transformation mapping L_1 to L_2. Our main result is an algorithm for this
problem running in time n^{O(n)} times a polynomial in the input size, where n
is the rank of the input lattices. A crucial component is a new generalized
isolation lemma, which can isolate n linearly independent vectors in a given
subset of Z^n and might be useful elsewhere. We also prove that LIP lies in the
complexity class SZK.
Comment: 23 pages, SODA 201
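While deciding LIP is the hard direction, verifying a candidate isomorphism is easy: an orthogonal map O sends the lattice spanned by the columns of B_1 onto the lattice spanned by B_2 exactly when U = B_2^{-1} O B_1 is unimodular (integer entries, determinant +-1). A small exact-arithmetic check of that verification step, for 2x2 matrices only (the helper names are ours, not the paper's):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def inv2(M):
    d = det2(M)
    return [[M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d, M[0][0] / d]]

def maps_lattice(O, B1, B2):
    """True iff O sends the lattice spanned by B1 (columns) onto the one
    spanned by B2: U = B2^{-1} O B1 must be integer with det +-1."""
    U = matmul(inv2(B2), matmul(O, B1))
    integral = all(u.denominator == 1 for row in U for u in row)
    return integral and abs(det2(U)) == 1

F = Fraction
B1 = [[F(1), F(0)], [F(0), F(2)]]   # the lattice Z x 2Z
O = [[F(0), F(-1)], [F(1), F(0)]]   # rotation by 90 degrees (orthogonal)
B2 = matmul(O, B1)                  # the rotated copy of the lattice
print(maps_lattice(O, B1, B2))      # True
print(maps_lattice(O, B1, [[F(1), F(0)], [F(0), F(1)]]))  # False: Z^2 differs
```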
Hierarchies of Inefficient Kernelizability
The framework of Bodlaender et al. (ICALP 2008) and Fortnow and Santhanam
(STOC 2008) allows us to exclude the existence of polynomial kernels for a
range of problems under reasonable complexity-theoretical assumptions. However,
there are also some issues that are not addressed by this framework, including
the existence of Turing kernels such as the "kernelization" of Leaf Out
Branching(k) into a disjunction over n instances of size poly(k). Observing
that Turing kernels are preserved by polynomial parametric transformations, we
define a kernelization hardness hierarchy, akin to the M- and W-hierarchies of
ordinary parameterized complexity, by the PPT-closure of problems that seem
likely to be fundamentally hard for efficient Turing kernelization. We find
that several previously considered problems are complete for our fundamental
hardness class, including Min Ones d-SAT(k), Binary NDTM Halting(k), Connected
Vertex Cover(k), and Clique(k log n), the clique problem parameterized by k log
n.
Boolean Operations, Joins, and the Extended Low Hierarchy
We prove that the join of two sets may actually fall into a lower level of
the extended low hierarchy than either of the sets. In particular, there exist
sets that are not in the second level of the extended low hierarchy, EL_2, yet
their join is in EL_2. That is, in terms of extended lowness, the join operator
can lower complexity. Since in a strong intuitive sense the join does not lower
complexity, our result suggests that the extended low hierarchy is unnatural as
a complexity measure. We also study the closure properties of EL_2 and prove
that EL_2 is not closed under certain Boolean operations. To this end, we
establish the first known (and optimal) EL_2 lower bounds for certain notions
generalizing Selman's P-selectivity, which may be regarded as an interesting
result in its own right.
Comment: 12 pages
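The join in question is the standard marked union, A join B = {0x : x in A} union {1y : y in B}: the leading bit routes a membership query to the appropriate set, which is why the join intuitively cannot lower complexity. A minimal encoding of the operator:

```python
def join(A, B):
    """Join (marked union) of two sets of strings:
    {'0' + x for x in A} together with {'1' + y for y in B}."""
    return {"0" + x for x in A} | {"1" + y for y in B}

def in_join(z, A, B):
    """Membership in the join: the first bit routes the query to A or B."""
    return z[1:] in A if z[0] == "0" else z[1:] in B

A, B = {"", "1"}, {"00"}
print(sorted(join(A, B)))     # ['0', '01', '100']
print(in_join("100", A, B))   # True: strip the '1', ask B about '00'
```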