
    Computational Complexity and Phase Transitions

    Phase transitions in combinatorial problems have recently been shown to be useful in locating "hard" instances of such problems. The connection between computational complexity and the existence of phase transitions has been addressed in statistical mechanics and artificial intelligence, but not studied rigorously. We take a step in this direction by investigating the existence of sharp thresholds for the class of generalized satisfiability problems defined by Schaefer. In the case when all constraints are clauses we give a complete characterization of the problems of this kind that have a sharp threshold. While NP-completeness does not imply (even in this restricted case) the existence of a sharp threshold, it "almost implies" it: clausal generalized satisfiability problems that lack a sharp threshold are either 1. solvable in polynomial time, or 2. predicted, with success probability bounded below by a positive constant across the whole probability range, by a single, trivial procedure. Comment: a (slightly) revised version of the paper submitted to the 15th IEEE Conference on Computational Complexity.
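
    As a concrete illustration of the kind of sharp threshold discussed above, the sketch below estimates the probability that a random 3-SAT formula (a clausal problem covered by Schaefer's framework) is satisfiable as the clause-to-variable ratio grows. It is a minimal experiment, not the paper's characterization: it relies on a brute-force satisfiability check chosen here for simplicity, so it is only feasible for small variable counts, and the crossover it shows near a ratio of roughly 4.3 is smeared at this size.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_3sat(n_vars, n_clauses):
    """Random 3-SAT instance: each clause picks 3 distinct variables and random signs."""
    clauses = []
    for _ in range(n_clauses):
        variables = rng.choice(n_vars, size=3, replace=False)
        signs = rng.integers(0, 2, size=3).astype(bool)
        clauses.append(list(zip(variables, signs)))
    return clauses

def is_satisfiable(n_vars, clauses):
    """Brute-force check over all 2^n assignments (only sensible for small n)."""
    assignments = ((np.arange(2**n_vars)[:, None] >> np.arange(n_vars)) & 1).astype(bool)
    alive = np.ones(2**n_vars, dtype=bool)
    for clause in clauses:
        clause_sat = np.zeros(2**n_vars, dtype=bool)
        for var, sign in clause:
            clause_sat |= assignments[:, var] == sign
        alive &= clause_sat
        if not alive.any():
            return False
    return True

n_vars, samples = 14, 30
for ratio in np.arange(2.0, 7.1, 0.5):
    n_clauses = int(round(ratio * n_vars))
    sat_fraction = np.mean([is_satisfiable(n_vars, random_3sat(n_vars, n_clauses))
                            for _ in range(samples)])
    print(f"clauses/variables = {ratio:.1f}   P(satisfiable) ~ {sat_fraction:.2f}")
```

    Empirically, the hardest instances for complete solvers tend to cluster around the ratio at which this satisfiability probability drops, which is why such thresholds are used to locate "hard" instances.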

    Quantum sampling algorithms, phase transitions, and computational complexity


    Phase transitions in project scheduling.

    The analysis of the complexity of combinatorial optimization problems has led to the distinction between problems that are solvable in a polynomially bounded amount of time (the class P) and problems for which no polynomial-time algorithm is known, the NP-hard problems. However, this distinction is based on worst-case analysis: showing that a decision problem is NP-complete, or that an optimization problem is NP-hard, only implies that solving it is very hard in the worst case. Recent computational results obtained with a well-known NP-hard problem, the resource-constrained project scheduling problem, indicate that many instances are actually easy to solve. These results are in line with those recently obtained by researchers in artificial intelligence, which show that many NP-complete problems exhibit so-called phase transitions: a sudden and dramatic change in computational difficulty driven by one or more order parameters that are characteristic of the system as a whole. In this paper we provide evidence for the existence of phase transitions in various resource-constrained project scheduling problems, and we discuss the use of network complexity measures and resource parameters as potential order parameters. We show that while the network complexity measures seem to reveal continuous easy-hard or hard-easy phase transitions, the resource parameters exhibit an easy-hard-easy transition. Keywords: Networks; Problems; Scheduling; Algorithms.
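
    The two families of candidate order parameters mentioned above can be made concrete on a toy instance. The sketch below uses a hypothetical four-activity project with a single renewable resource (not an instance from the paper) and computes a simple network complexity measure, arcs per activity, together with a resource-strength-style parameter in the spirit of the RS measure from the project scheduling literature; sweeping such parameters over generated instances is how an easy-hard-easy pattern would be probed.

```python
# Toy single-resource project: activity -> duration and resource demand, plus precedence arcs.
durations = {1: 3, 2: 2, 3: 4, 4: 2}
demands   = {1: 2, 2: 3, 3: 1, 4: 2}
arcs = [(1, 2), (1, 3), (2, 4), (3, 4)]   # precedence constraints i -> j
capacity = 4                              # renewable resource available per period

# Network complexity measure: average number of precedence arcs per activity.
network_complexity = len(arcs) / len(durations)

# Earliest start times from a forward pass (activity ids are topologically ordered here).
predecessors = {j: [i for (i, k) in arcs if k == j] for j in durations}
earliest = {}
for j in sorted(durations):
    earliest[j] = max((earliest[i] + durations[i] for i in predecessors[j]), default=0)

# Peak resource demand of the earliest-start schedule.
horizon = max(earliest[j] + durations[j] for j in durations)
profile = [0] * horizon
for j in durations:
    for t in range(earliest[j], earliest[j] + durations[j]):
        profile[t] += demands[j]
peak = max(profile)

# Resource strength: 0 means capacity is at the minimum feasible level,
# 1 means it matches the peak demand of the earliest-start schedule.
minimum_feasible = max(demands.values())
resource_strength = ((capacity - minimum_feasible) / (peak - minimum_feasible)
                     if peak > minimum_feasible else 1.0)

print(f"network complexity (arcs per activity): {network_complexity:.2f}")
print(f"resource strength: {resource_strength:.2f}")
```

    In a phase-transition experiment one would generate many instances at each value of such a parameter and record the effort of an exact solver, looking for the region where the effort peaks.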

    Adiabatic Computation - A Toy Model

    We discuss a toy model for adiabatic quantum computation which displays some phenomenological properties expected in more realistic implementations. The model has two free parameters: the adiabatic evolution parameter s and a parameter α that emulates many-variable constraints in the classical computational problem. In the s-α plane the model presents a line of first-order quantum phase transitions that ends at a second-order point. The relation between computational complexity and the occurrence of quantum phase transitions is discussed. We analyze the behavior of the ground and first excited states near the quantum phase transition, the gap, and the entanglement content of the ground state. Comment: 7 pages, 8 figures.
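
    The connection between computational cost and the spectral gap can be illustrated numerically. The sketch below is not the two-parameter model of the paper: it interpolates a generic adiabatic Hamiltonian H(s) = (1-s) H_B + s H_P between a transverse-field driver and a hypothetical diagonal cost function with a unique minimum, and locates the minimum gap between the ground and first excited states by exact diagonalization.

```python
import numpy as np
from functools import reduce

# Single-qubit operators.
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
identity = np.eye(2)

def embed(op, site, n):
    """Embed a single-qubit operator acting on `site` into an n-qubit Hilbert space."""
    return reduce(np.kron, [op if i == site else identity for i in range(n)])

n = 6
rng = np.random.default_rng(7)

# Driver Hamiltonian: transverse field on every qubit.
H_driver = -sum(embed(sx, i, n) for i in range(n))

# Hypothetical problem Hamiltonian: a random diagonal cost with a unique minimum,
# rescaled so its spread is comparable to the driver's energy scale.
costs = rng.permutation(2**n).astype(float)
H_problem = np.diag(costs * n / costs.max())

gaps = []
for s in np.linspace(0.0, 1.0, 101):
    H = (1.0 - s) * H_driver + s * H_problem
    eigenvalues = np.linalg.eigvalsh(H)
    gaps.append((s, eigenvalues[1] - eigenvalues[0]))

s_min, gap_min = min(gaps, key=lambda pair: pair[1])
print(f"minimum spectral gap {gap_min:.4f} at s = {s_min:.2f}")
```

    The minimum gap is the quantity of interest because the adiabatic theorem ties the required run time to roughly the inverse square of this gap, so a gap closing at a quantum phase transition signals a computational bottleneck.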

    Quantum Entanglement Phase Transitions and Computational Complexity: Insights from Ising Models

    In this paper, we construct 2-dimensional bipartite cluster states and perform single-qubit measurements on the bulk qubits. We explore the entanglement scaling of the unmeasured 1-dimensional boundary state and show that, under certain conditions, the boundary state can undergo a volume-law to area-law entanglement transition driven by variations in the measurement angle. We bridge this boundary-state entanglement transition and the measurement-induced phase transition in the non-unitary 1+1-dimensional circuit via the transfer matrix method. We also explore the application of this entanglement transition to computational complexity problems. Specifically, we establish a relation between the boundary-state entanglement transition and the sampling complexity of the bipartite 2d cluster state, which is directly related to the computational complexity of the corresponding Ising partition function with complex parameters. By examining the boundary-state entanglement scaling, we numerically identify the parameter regime in which the 2d quantum state can be efficiently sampled, which indicates that the Ising partition function can be evaluated efficiently in that region.
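
    The boundary-state construction can be prototyped directly with dense state vectors, though only at sizes far too small to resolve the actual transition. The sketch below is an illustrative toy, with the lattice size, measurement convention, and post-selection choice made here rather than taken from the paper: it builds a small 2-dimensional cluster state, measures the bulk qubits in the X-Y plane at angle theta with post-selection on the "+" outcome, and records the half-chain entanglement entropy of the unmeasured boundary row as theta varies.

```python
import numpy as np

def apply_cz(psi, n, q1, q2):
    """Controlled-Z between qubits q1 and q2 (qubit 0 is the most significant bit)."""
    phase = np.ones(len(psi))
    for basis_state in range(len(psi)):
        if (basis_state >> (n - 1 - q1)) & 1 and (basis_state >> (n - 1 - q2)) & 1:
            phase[basis_state] = -1.0
    return psi * phase

def cluster_state(rows, cols):
    """2d cluster state: |+> on every site, CZ on every nearest-neighbour edge."""
    n = rows * cols
    psi = np.ones(2**n, dtype=complex) / np.sqrt(2**n)
    index = lambda r, c: r * cols + c
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                psi = apply_cz(psi, n, index(r, c), index(r, c + 1))
            if r + 1 < rows:
                psi = apply_cz(psi, n, index(r, c), index(r + 1, c))
    return psi

def measure_xy(psi, n, q, theta):
    """Project qubit q onto (|0> + e^{i*theta}|1>)/sqrt(2), drop it, and renormalize."""
    bra = np.array([1.0, np.exp(-1j * theta)]) / np.sqrt(2)
    reduced = np.tensordot(bra, psi.reshape([2] * n), axes=([0], [q])).reshape(-1)
    return reduced / np.linalg.norm(reduced)

def half_chain_entropy(psi, n):
    """Von Neumann entropy (in bits) of the left half of an n-qubit chain."""
    cut = n // 2
    singular = np.linalg.svd(psi.reshape(2**cut, 2**(n - cut)), compute_uv=False)
    probs = singular[singular > 1e-12] ** 2
    return float(-(probs * np.log2(probs)).sum())

rows, cols = 3, 4              # rows 0-1 form the measured bulk, row 2 is the boundary chain
n_bulk = (rows - 1) * cols
for theta in np.linspace(0.0, np.pi / 2, 7):
    psi, n = cluster_state(rows, cols), rows * cols
    # Measure bulk qubits from the highest index downwards so remaining indices stay valid.
    for q in reversed(range(n_bulk)):
        psi = measure_xy(psi, n, q, theta)
        n -= 1
    print(f"theta = {theta:.3f}   boundary half-chain entropy = {half_chain_entropy(psi, cols):.3f} bits")
```

    At scale, distinguishing volume-law from area-law behaviour requires tracking how this entropy grows with the boundary length, which is where tensor-network or stabilizer techniques replace the dense simulation used here.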

    Physical consequences of P ≠ NP and the DMRG-annealing conjecture

    Computational complexity theory contains a corpus of theorems and conjectures regarding the time a Turing machine will need to solve certain types of problems as a function of the input size. Nature need not be a Turing machine and, thus, these theorems do not apply directly to it. But classical simulations of physical processes are programs running on Turing machines and, as such, are subject to them. In this work, computational complexity theory is applied to classical simulations of systems performing an adiabatic quantum computation (AQC), based on an annealed extension of the density matrix renormalization group (DMRG). We conjecture that the computational time required for those classical simulations is controlled solely by the maximal entanglement found during the process. Thus, lower bounds on the growth of entanglement with the system size can be provided. In some cases, quantum phase transitions can be predicted to take place in certain inhomogeneous systems. Concretely, physical conclusions are drawn from the assumption that the complexity classes P and NP differ. As a by-product, an alternative measure of entanglement is proposed which, via Chebyshev's inequality, allows one to establish strict bounds on the required computational time. Comment: accepted for publication in JSTAT.
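
    The link between maximal entanglement and classical simulation cost can be made tangible with a small numerical experiment. In matrix-product-state methods such as DMRG, the cost is polynomial in the bond dimension, and faithfully representing a cut with entanglement entropy S requires a bond dimension of order 2^S. The sketch below illustrates that generic relation only; it is not the paper's annealed DMRG or its proposed entanglement measure. It truncates the Schmidt spectrum of a nearly maximally entangled random state and reports how much probability weight is discarded for several bond dimensions chi.

```python
import numpy as np

def half_cut_entropy(psi, dim_left, dim_right):
    """Entanglement entropy (bits) across the bipartition dim_left x dim_right."""
    singular = np.linalg.svd(psi.reshape(dim_left, dim_right), compute_uv=False)
    probs = singular[singular > 1e-12] ** 2
    return float(-(probs * np.log2(probs)).sum())

def discarded_weight(psi, dim_left, dim_right, chi):
    """Probability weight lost when only the chi largest Schmidt values are kept."""
    singular = np.linalg.svd(psi.reshape(dim_left, dim_right), compute_uv=False)
    return float(1.0 - np.sum(singular[:chi] ** 2))

rng = np.random.default_rng(0)
n = 12                       # qubits, split into two halves of 6
dim = 2 ** (n // 2)

# A Haar-random state is close to maximally entangled across the half cut.
psi = rng.normal(size=dim * dim) + 1j * rng.normal(size=dim * dim)
psi /= np.linalg.norm(psi)

entropy = half_cut_entropy(psi, dim, dim)
print(f"half-cut entanglement entropy: {entropy:.2f} bits")
for chi in (2, 8, 32, dim):
    print(f"chi = {chi:4d}   discarded weight = {discarded_weight(psi, dim, dim, chi):.3e}")
```

    Only when chi approaches 2^S does the discarded weight become negligible, which is the sense in which the growth of entanglement during the simulated sweep bounds the classical simulation time from below.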