Why Philosophers Should Care About Computational Complexity
One might think that, once we know something is computable, how efficiently
it can be computed is a practical question with little further philosophical
importance. In this essay, I offer a detailed case that one would be wrong. In
particular, I argue that computational complexity theory---the field that
studies the resources (such as time, space, and randomness) needed to solve
computational problems---leads to new perspectives on the nature of
mathematical knowledge, the strong AI debate, computationalism, the problem of
logical omniscience, Hume's problem of induction, Goodman's grue riddle, the
foundations of quantum mechanics, economic rationality, closed timelike curves,
and several other topics of philosophical interest. I end by discussing aspects
of complexity theory itself that could benefit from philosophical analysis.
Comment: 58 pages, to appear in "Computability: Gödel, Turing, Church, and beyond," MIT Press, 2012. Some minor clarifications and corrections; new references added.
The quantum measurement problem and physical reality: a computation theoretic perspective
Is the universe computable? If yes, is it computationally a polynomial place?
In standard quantum mechanics, which permits infinite parallelism and the
infinitely precise specification of states, a negative answer to both questions
is not ruled out. On the other hand, empirical evidence suggests that
NP-complete problems are intractable in the physical world. Likewise,
computational problems known to be algorithmically uncomputable do not seem to
be computable by any physical means. We suggest that this close correspondence
between the efficiency and power of abstract algorithms on the one hand, and
physical computers on the other, finds a natural explanation if the universe is
assumed to be algorithmic; that is, that physical reality is the product of
discrete sub-physical information processing equivalent to the actions of a
probabilistic Turing machine. This assumption can be reconciled with the
observed exponentiality of quantum systems at microscopic scales, and the
consequent possibility of implementing Shor's quantum polynomial time algorithm
at that scale, provided the degree of superposition is intrinsically, finitely
upper-bounded. If this bound is associated with the quantum-classical divide
(the Heisenberg cut), a natural resolution to the quantum measurement problem
arises. From this viewpoint, macroscopic classicality is evidence that the
universe is in BPP, and both questions raised above receive affirmative
answers. A recently proposed computational model of quantum measurement, which
relates the Heisenberg cut to the discreteness of Hilbert space, is briefly
discussed. A connection to quantum gravity is noted. Our results are compatible
with the philosophy that mathematical truths are independent of the laws of
physics.
Comment: Talk presented at "Quantum Computing: Back Action 2006", IIT Kanpur, India, March 2006.
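The claim that macroscopic classicality places the universe in BPP rests on a defining feature of that class: a probabilistic machine's constant error can be driven arbitrarily low by repetition and majority vote. A minimal sketch in Python (the decider and its fixed error pattern are illustrative stand-ins, not anything from the paper; a deterministic error schedule replaces true randomness so the behavior is reproducible):

```python
def noisy_decider(x, trial):
    """Stand-in for a probabilistic Turing machine: answers whether x is
    even, but is deliberately wrong on every third run, so it is correct
    with probability 2/3 over a uniformly chosen run -- the BPP guarantee."""
    correct = (x % 2 == 0)
    return correct if trial % 3 != 0 else not correct

def amplified(x, trials=101):
    """Majority vote over independent runs: with per-run error below 1/2,
    the majority is wrong with probability exponentially small in the
    number of runs (Chernoff bound) -- the standard BPP amplification."""
    votes = sum(noisy_decider(x, t) for t in range(trials))
    return votes > trials / 2

print(amplified(4))  # True  (4 is even; 67 of 101 runs answer correctly)
print(amplified(7))  # False (7 is odd)
```

The point is only that bounded error is cheap to suppress, which is why "in BPP" is a robust statement about feasibility rather than an artifact of the error constant.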
Epistemic virtues, metavirtues, and computational complexity
I argue that considerations about computational complexity show that all finite agents need characteristics like those that have been called epistemic virtues. The necessity of these virtues follows in part from the nonexistence of shortcuts, or efficient ways of finding shortcuts, to cognitively expensive routines. It follows that agents must possess the capacities – metavirtues – of developing in advance the cognitive virtues they will need when time and memory are at a premium.
NP-complete Problems and Physical Reality
Can NP-complete problems be solved efficiently in the physical universe? I
survey proposals including soap bubbles, protein folding, quantum computing,
quantum advice, quantum adiabatic algorithms, quantum-mechanical
nonlinearities, hidden variables, relativistic time dilation, analog computing,
Malament-Hogarth spacetimes, quantum gravity, closed timelike curves, and
"anthropic computing." The section on soap bubbles even includes some
"experimental" results. While I do not believe that any of the proposals will
let us solve NP-complete problems efficiently, I argue that by studying them,
we can learn something not only about computation but also about physics.
Comment: 23 pages, minor corrections.
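The intractability the survey takes as its baseline can be made concrete: the only known general strategy for an NP-complete problem such as subset sum is exhaustive search over exponentially many candidates, while verifying a proposed solution stays cheap. A minimal Python sketch (function name and example instance are illustrative, not from the paper):

```python
from itertools import combinations

def subset_sum(nums, target):
    """Brute-force search: tries every subset, so worst-case work grows
    as 2**len(nums) -- exponential in the input size. Any found subset
    is a witness that can be checked in polynomial time."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# Finding a witness is the hard part; checking one is a single sum.
print(subset_sum([3, 34, 4, 12, 5, 2], 9))  # -> (4, 5)
```

Each physical proposal in the survey can be read as an attempt to shortcut exactly this exhaustive search.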
An Introduction to Quantum Complexity Theory
We give a basic overview of computational complexity, query complexity, and
communication complexity, with quantum information incorporated into each of
these scenarios. The aim is to provide simple but clear definitions, and to
highlight the interplay between the three scenarios and currently-known quantum
algorithms.
Comment: 28 pages, LaTeX, 11 figures within the text, to appear in "Collected Papers on Quantum Computation and Quantum Information Theory", edited by C. Macchiavello, G.M. Palma, and A. Zeilinger (World Scientific).
Complexity, parallel computation and statistical physics
The intuition that a long history is required for the emergence of complexity
in natural systems is formalized using the notion of depth. The depth of a
system is defined in terms of the number of parallel computational steps needed
to simulate it. Depth provides an objective, irreducible measure of history
applicable to systems of the kind studied in statistical physics. It is argued
that physical complexity cannot occur in the absence of substantial depth and
that depth is a useful proxy for physical complexity. The ideas are illustrated
for a variety of systems in statistical physics.
Comment: 21 pages, 7 figures.
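The definition of depth as the number of parallel steps needed to simulate a system can be illustrated with a toy one-dimensional automaton: every cell updates simultaneously, and the count of synchronous sweeps until nothing changes stands in for depth. This is a crude illustration under assumed dynamics (a simple spreading rule), not the paper's formal construction:

```python
def step(cells):
    """One parallel sweep: a cell becomes 1 if it or a neighbor is 1.
    All cells update simultaneously, as in a parallel computation."""
    n = len(cells)
    return [int(any(cells[max(0, i - 1):i + 2])) for i in range(n)]

def depth(cells):
    """Number of synchronous parallel steps until the configuration stops
    changing -- a toy stand-in for the depth measure in the abstract."""
    t = 0
    while (nxt := step(cells)) != cells:
        cells, t = nxt, t + 1
    return t

# A single seed in the middle of 9 cells needs 4 sweeps to fill the line;
# depth grows with system size no matter how many processors work at once.
print(depth([0, 0, 0, 0, 1, 0, 0, 0, 0]))  # -> 4
```

The key property is that the sweep count cannot be reduced by adding more parallel hardware, which is what makes depth an irreducible measure of history.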
Three Puzzles on Mathematics, Computation, and Games
In this lecture I will talk about three mathematical puzzles involving
mathematics and computation that have preoccupied me over the years. The first
puzzle is to understand the amazing success of the simplex algorithm for linear
programming. The second puzzle is about errors made when votes are counted
during elections. The third puzzle is: are quantum computers possible?
Comment: ICM 2018 plenary lecture, Rio de Janeiro, 36 pages, 7 figures.
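The first puzzle can be framed concretely: a linear program's optimum lies at a vertex of the feasible polytope, and the simplex algorithm walks from vertex to neighboring vertex. The puzzle is why that walk is so short in practice, since polytopes can have exponentially many vertices. The naive alternative below checks every vertex of a small two-variable instance (a sketch for illustration; it is vertex enumeration, not the simplex method):

```python
from itertools import combinations

def lp_by_vertex_enumeration(c, A, b):
    """Maximize c.x subject to A x <= b in two variables by evaluating
    the objective at every vertex (intersection of two constraint
    boundaries) that satisfies all constraints."""
    best, best_val = None, float("-inf")
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if det == 0:
            continue  # parallel boundaries: no intersection vertex
        x = (b1 * a2[1] - a1[1] * b2) / det   # Cramer's rule
        y = (a1[0] * b2 - b1 * a2[0]) / det
        if all(ai[0] * x + ai[1] * y <= bi + 1e-9 for ai, bi in zip(A, b)):
            val = c[0] * x + c[1] * y
            if val > best_val:
                best, best_val = (x, y), val
    return best, best_val

# maximize 3x + 2y  s.t.  x + y <= 4,  x <= 3,  y <= 3,  x >= 0,  y >= 0
A = [[1, 1], [1, 0], [0, 1], [-1, 0], [0, -1]]
b = [4, 3, 3, 0, 0]
print(lp_by_vertex_enumeration([3, 2], A, b))  # -> ((3.0, 1.0), 11.0)
```

Enumerating vertices is hopeless in high dimension; simplex's observed ability to reach the optimum after visiting only a few of them is exactly what the first puzzle asks us to explain.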