Almost Settling the Hardness of Noncommutative Determinant
In this paper, we study the complexity of computing the determinant of a
matrix over a non-commutative algebra. In particular, we ask: over which
algebras is the determinant easier to compute than the permanent? Towards
resolving this question, we show the following hardness and easiness results
for noncommutative determinant computation.
* [Hardness] Computing the determinant of an n \times n matrix whose entries
are themselves 2 \times 2 matrices over a field is as hard as computing the
permanent over the field. This extends the recent result of Arvind and
Srinivasan, who proved a similar result but required the entries to be
matrices of dimension linear in n.
* [Easiness] The determinant of an n \times n matrix whose entries are themselves
d \times d upper triangular matrices can be computed in poly(n^d) time.
Combining the above with the decomposition theorem of finite dimensional
algebras (in particular exploiting the simple structure of 2 \times 2 matrix
algebras), we can extend the above hardness and easiness statements to more
general algebras as follows. Let A be a finite dimensional algebra over a
finite field with radical R(A).
* [Hardness] If the quotient A/R(A) is non-commutative, then computing the
determinant over the algebra A is as hard as computing the permanent.
* [Easiness] If the quotient A/R(A) is commutative and furthermore, R(A) has
nilpotency index d (i.e., the smallest d such that R(A)^d = 0), then there
exists a poly(n^d)-time algorithm that computes determinants over the algebra
A.
In particular, for any constant-dimensional algebra A over a finite field,
since the nilpotency index of R(A) is at most a constant, we obtain the
following dichotomy theorem: if A/R(A) is commutative, then the determinant
can be computed efficiently; otherwise, it is as hard as the permanent.
Comment: 20 pages, 3 figures
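For context, the determinant over a noncommutative algebra is commonly taken to be the Cayley determinant, where each permutation's product is formed in a fixed row order (the order matters once entries stop commuting). The minimal, exponential-time Python sketch below only illustrates the object whose complexity the abstract discusses; the helper names (`cayley_det`, `perm_sign`, `mat_mul`) are ours, not the paper's, and this is not the paper's algorithm.

```python
import itertools

def mat_mul(A, B):
    # product of two square matrices given as lists of lists
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def perm_sign(perm):
    # sign of a permutation via its cycle decomposition
    sign, seen = 1, [False] * len(perm)
    for i in range(len(perm)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j, length = perm[j], length + 1
            if length % 2 == 0:
                sign = -sign
    return sign

def cayley_det(M, k=2):
    """Cayley determinant of an n x n matrix M whose entries are
    k x k matrices: sum over permutations sigma of
    sgn(sigma) * M[0][sigma(0)] * ... * M[n-1][sigma(n-1)],
    with each product taken in fixed row order."""
    n = len(M)
    total = [[0] * k for _ in range(k)]
    for perm in itertools.permutations(range(n)):
        prod = [[int(i == j) for j in range(k)] for i in range(k)]  # identity
        for i in range(n):
            prod = mat_mul(prod, M[i][perm[i]])
        signed = [[perm_sign(perm) * x for x in row] for row in prod]
        total = mat_add(total, signed)
    return total
```

For a 2 \times 2 matrix with entries A, B, C, I this computes A*I - B*C, where the row-ordered products generally differ from, say, I*A - C*B.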
Circuit complexity, proof complexity, and polynomial identity testing
We introduce a new algebraic proof system, which has tight connections to
(algebraic) circuit complexity. In particular, we show that any
super-polynomial lower bound on any Boolean tautology in our proof system
implies that the permanent does not have polynomial-size algebraic circuits
(VNP is not equal to VP). As a corollary to the proof, we also show that
super-polynomial lower bounds on the number of lines in Polynomial Calculus
proofs (as opposed to the usual measure of number of monomials) imply the
Permanent versus Determinant Conjecture. Note that, prior to our work, there
was no proof system for which lower bounds on an arbitrary tautology implied
any computational lower bound.
Our proof system helps clarify the relationships between previous algebraic
proof systems, and begins to shed light on why proof complexity lower bounds
for various proof systems have been so much harder than lower bounds on the
corresponding circuit classes. In doing so, we highlight the importance of
polynomial identity testing (PIT) for understanding proof complexity.
More specifically, we introduce certain propositional axioms satisfied by any
Boolean circuit computing PIT. We use these PIT axioms to shed light on
AC^0[p]-Frege lower bounds, which have been open for nearly 30 years, with no
satisfactory explanation as to their apparent difficulty. We show that either:
a) Proving super-polynomial lower bounds on AC^0[p]-Frege implies VNP does not
have polynomial-size circuits of depth d - a notoriously open question for d at
least 4 - thus explaining the difficulty of lower bounds on AC^0[p]-Frege, or
b) AC^0[p]-Frege cannot efficiently prove the depth d PIT axioms, and hence we
have a lower bound on AC^0[p]-Frege.
Using the algebraic structure of our proof system, we propose a novel way to
extend techniques from algebraic circuit complexity to prove lower bounds in
proof complexity.
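For context, the polynomial identity testing (PIT) problem referred to above asks whether a given algebraic circuit computes the identically-zero polynomial. The standard randomized test, due to Schwartz and Zippel, is sketched below purely as background; the function name, field choice, and trial count are illustrative assumptions, not details from the paper (whose axioms concern Boolean circuits deciding PIT, not this particular test).

```python
import random

def probably_identically_zero(poly, num_vars, field_size=10**9 + 7, trials=20):
    """Schwartz-Zippel randomized identity test for a black-box
    polynomial over the prime field F_p: a nonzero polynomial of
    total degree d vanishes at a uniformly random point with
    probability at most d / p, so a few random evaluations suffice."""
    for _ in range(trials):
        point = [random.randrange(field_size) for _ in range(num_vars)]
        if poly(point) % field_size != 0:
            return False  # found a nonzero evaluation: not an identity
    return True  # every evaluation was zero: identically zero w.h.p.
```

For example, the polynomial (x + y)^2 - x^2 - 2xy - y^2 is reported as identically zero, while xy is (with overwhelming probability) reported as nonzero. Derandomizing such tests, i.e., deterministic PIT, is exactly the kind of question the PIT axioms above are built around.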
- …