
    A Measure of Space for Computing over the Reals

    We propose a new complexity measure of space for the BSS model of computation. We define the LOGSPACE_W and PSPACE_W complexity classes over the reals. We prove that LOGSPACE_W is included in NC^2_R and in P_W, i.e., it is small enough to be relevant. We prove that the Real Circuit Decision Problem is P_R-complete under LOGSPACE_W reductions, i.e., LOGSPACE_W is large enough to contain natural algorithms. We also prove that PSPACE_W is included in PAR_R.
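
    A compact restatement of the inclusions claimed in this abstract, written as a LaTeX display for readability (the subscript W marks the new space-bounded classes; notation follows the abstract):

    \[
    \mathrm{LOGSPACE}_W \subseteq \mathrm{NC}^2_{\mathbb{R}} \cap \mathrm{P}_W,
    \qquad
    \mathrm{PSPACE}_W \subseteq \mathrm{PAR}_{\mathbb{R}}.
    \]

    Together with the $\mathrm{P}_{\mathbb{R}}$-completeness of the Real Circuit Decision Problem under $\mathrm{LOGSPACE}_W$ reductions, these inclusions bound $\mathrm{LOGSPACE}_W$ from above and below in expressive power.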

    VPSPACE and a transfer theorem over the complex field

    We extend the transfer theorem of [KP2007] to the complex field. That is, we investigate the links between the class VPSPACE of families of polynomials and the Blum-Shub-Smale model of computation over $\mathbb C$. Roughly speaking, a family of polynomials is in VPSPACE if its coefficients can be computed in polynomial space. Our main result is that if (uniform, constant-free) VPSPACE families can be evaluated efficiently, then the class PAR of decision problems that can be solved in parallel polynomial time over the complex field collapses to P. Consequently, in order to separate P from NP over $\mathbb C$, or even from PAR, one must first show that there are VPSPACE families which are hard to evaluate.
    Comment: 14 pages.
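
    Schematically, the transfer theorem asserts the following implication (a paraphrase of the abstract, not a verbatim statement from the paper; the subscripts $\mathbb C$ are added here to emphasize the field):

    \[
    \text{(uniform, constant-free) VPSPACE families are easy to evaluate}
    \;\Longrightarrow\;
    \mathrm{PAR}_{\mathbb{C}} = \mathrm{P}_{\mathbb{C}}.
    \]

    By contraposition, proving $\mathrm{P}_{\mathbb{C}} \ne \mathrm{PAR}_{\mathbb{C}}$ (and a fortiori $\mathrm{P}_{\mathbb{C}} \ne \mathrm{NP}_{\mathbb{C}}$, since $\mathrm{NP}_{\mathbb{C}} \subseteq \mathrm{PAR}_{\mathbb{C}}$) would require exhibiting VPSPACE families that are hard to evaluate.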

    A complex analogue of Toda's Theorem

    Toda \cite{Toda} proved in 1989 that the (discrete) polynomial time hierarchy, $\mathbf{PH}$, is contained in the class $\mathbf{P}^{\#\mathbf{P}}$, namely the class of languages that can be decided by a Turing machine in polynomial time given access to an oracle with the power to compute a function in the counting complexity class $\#\mathbf{P}$. This result, which illustrates the power of counting, is considered a seminal result in computational complexity theory. An analogous result (with a compactness hypothesis) in complexity theory over the reals (in the sense of Blum-Shub-Smale real machines \cite{BSS89}) was proved in \cite{BZ09}. Unlike Toda's proof in the discrete case, which relied on sophisticated combinatorial arguments, the proof in \cite{BZ09} is topological in nature: the properties of the topological join are used in a fundamental way. However, the constructions used in \cite{BZ09} were semi-algebraic -- they used real inequalities in an essential way and as such do not extend to the complex case. In this paper, we extend the techniques developed in \cite{BZ09} to the complex projective case. A key role is played by the complex join of quasi-projective complex varieties. As a consequence we obtain a complex analogue of Toda's theorem. The results contained in this paper, taken together with those contained in \cite{BZ09}, illustrate the central role of the Poincar\'e polynomial in algorithmic algebraic geometry, as well as in computational complexity theory over the complex and real numbers: the ability to compute it efficiently enables one to decide in polynomial time all languages in the (compact) polynomial hierarchy over the appropriate field.
    Comment: 31 pages. Final version to appear in Foundations of Computational Mathematics.
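
    In symbols, Toda's theorem and the analogue described above can be summarized as follows; the second line is a schematic rendering in which $\mathbf{PH}^{c}_{k}$ denotes the compact polynomial hierarchy over the field $k$ and the oracle computes the Poincar\'e polynomial (the shorthand is introduced here for this summary; the precise statements are in the paper and in \cite{BZ09}):

    \[
    \mathbf{PH} \subseteq \mathbf{P}^{\#\mathbf{P}} \qquad \text{(Toda, 1989)},
    \]
    \[
    \mathbf{PH}^{c}_{k} \subseteq \mathbf{P}_{k}^{\text{Poincar\'e}} \qquad \text{(schematic, } k \in \{\mathbb R, \mathbb C\}\text{)}.
    \]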

    Ultimate Polynomial Time

    The class $\mathcal{UP}$ of `ultimate polynomial time' problems over $\mathbb C$ is introduced; it contains the class $\mathcal P$ of polynomial time problems over $\mathbb C$. The $\tau$-Conjecture for polynomials implies that $\mathcal{UP}$ does not contain the class of non-deterministic polynomial time problems definable without constants over $\mathbb C$. The latter statement implies that $\mathcal P \ne \mathcal{NP}$ over $\mathbb C$. A notion of `ultimate complexity' of a problem is suggested; it provides lower bounds for the complexity of structured problems.
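
    The chain of implications in this abstract, written out schematically (here $\mathcal{NP}^0$ is shorthand, introduced for this summary, for the class of non-deterministic polynomial time problems definable without constants over $\mathbb C$):

    \[
    \tau\text{-Conjecture}
    \;\Longrightarrow\;
    \mathcal{NP}^0 \not\subseteq \mathcal{UP}
    \;\Longrightarrow\;
    \mathcal P \ne \mathcal{NP} \text{ over } \mathbb C.
    \]

    The last step uses $\mathcal P \subseteq \mathcal{UP}$: if $\mathcal P = \mathcal{NP}$, then $\mathcal{NP}^0 \subseteq \mathcal{NP} = \mathcal P \subseteq \mathcal{UP}$, contradicting $\mathcal{NP}^0 \not\subseteq \mathcal{UP}$.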

    Computing the homology of basic semialgebraic sets in weak exponential time

    We describe and analyze an algorithm for computing the homology (Betti numbers and torsion coefficients) of basic semialgebraic sets which works in weak exponential time. That is, outside a set of exponentially small measure in the space of data, the cost of the algorithm is exponential in the size of the data. All algorithms previously proposed for this problem have doubly exponential complexity (and this is so for almost all data).
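
    "Weak exponential time" can be phrased schematically as follows (a reading of the abstract, not the paper's exact statement; $n$ is the size of the data and $c_1, c_2 > 0$ are unspecified constants): there is an exceptional set $\mathcal E$ of data such that

    \[
    \mu(\mathcal E) \le 2^{-n^{c_1}},
    \qquad
    \mathrm{cost}(x) \le 2^{n^{c_2}} \ \text{ for all data } x \notin \mathcal E.
    \]

    By contrast, the previously known algorithms have cost doubly exponential in $n$ on almost all data.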

    On sparseness and Turing reducibility over the reals

    We prove some results about the existence of NP-complete and NP-hard (for Turing reductions) sparse sets in different settings over the real numbers.