The Computational Power of Minkowski Spacetime
The Lorentzian length of a timelike curve connecting both endpoints of a
classical computation is a function of the path taken through Minkowski
spacetime. The associated runtime difference is due to time dilation: the
phenomenon whereby an observer finds that another's physically identical ideal
clock has ticked at a different rate than their own clock. Using ideas
from the framework of computational complexity theory, time dilation is
quantified as an algorithmic resource by relating relativistic energy to an
$n$th-order polynomial-time reduction at the completion of an observer's
journey. These results enable a comparison between the optimal quadratic
\emph{Grover speedup} from quantum computing and an $n$th-order speedup using
classical computers and relativistic effects. The goal is not to propose a
practical model of computation, but to probe the ultimate limits physics places
on computation.
Comment: 6 pages, LaTeX, feedback welcome
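The basic tradeoff can be illustrated with a minimal sketch (an illustration of special-relativistic time dilation in the constant-velocity case, not the paper's construction): a computation that runs for coordinate time $T$ in the lab frame elapses only $T/\gamma$ of proper time on a clock moving at speed $v$.

```python
import math

def lorentz_gamma(v: float) -> float:
    """Lorentz factor for speed v, given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v * v)

def proper_runtime(coordinate_time: float, v: float) -> float:
    """Proper time elapsed on a clock moving at constant speed v*c
    while the lab frame records the given coordinate-time interval."""
    return coordinate_time / lorentz_gamma(v)

# A computation taking T seconds in the lab frame corresponds to only
# T/gamma seconds of proper time for the moving observer.
T = 1000.0   # lab-frame runtime in seconds (illustrative value)
v = 0.99     # speed as a fraction of c (illustrative value)
print(proper_runtime(T, v))  # ~141 s of proper time
```

Pushing $v \to 1$ makes the ratio arbitrarily large, which is why the energy cost of the journey, rather than the kinematics alone, is what bounds the achievable speedup.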
Zeno machines and hypercomputation
This paper reviews the Church-Turing Thesis (or rather, theses) with
reference to their origin and application, and considers some models of
"hypercomputation", concentrating on perhaps the most straightforward option:
Zeno machines (Turing machines with an accelerating clock). The halting problem
is briefly discussed in a general context, and the suggestion that it is an
inevitable companion of any reasonable computational model is emphasised. It is
hinted that claims to have "broken the Turing barrier" could be toned down and
that the important and well-founded role of Turing computability in the
mathematical sciences stands unchallenged.
Comment: 11 pages. First submitted in December 2004, substantially revised in
July and in November 2005. To appear in Theoretical Computer Science
The Road to Quantum Computational Supremacy
We present an idiosyncratic view of the race for quantum computational
supremacy. Google's approach and IBM's challenge are examined. An unexpected
side-effect of the race is the significant progress in designing fast classical
algorithms. Quantum supremacy, if achieved, won't make classical computing
obsolete.
Comment: 15 pages, 1 figure
Average-Case Polynomial-Time Computability of Hamiltonian Dynamics
We apply average-case complexity theory to physical problems modeled by continuous-time dynamical systems. The computational complexity of simulating such systems over a bounded time frame mainly stems from trajectories coming close to complex singularities of the system. We show that if, for most initial values, the trajectories do not come close to singularities, the simulation can be done in polynomial time on average. For Hamiltonian systems we relate this to the volume of "almost singularities" in phase space and give some general criteria showing that a Hamiltonian system can be simulated efficiently on average. As an application we show that the planar circular-restricted three-body problem is average-case polynomial-time computable.
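The link between singularities and cost can be seen in a toy adaptive-step integrator (illustrative, not the paper's algorithm): for $\dot{x} = -1/x$, which hits a singularity at $x = 0$ in finite time, a stepper that shrinks its step where the vector field is large needs far more steps for near-singular trajectories over the same time window.

```python
# Toy adaptive-step integrator for dx/dt = -1/x (singular at x = 0).
# The step size shrinks near the singularity, so the step count -- the
# simulation cost -- blows up for trajectories that approach it.
def steps_to_simulate(x0: float, t_end: float, x_min: float = 1e-6) -> int:
    x, t, steps = x0, 0.0, 0
    while t < t_end and x > x_min:
        h = 0.01 * x * x          # smaller steps where the field 1/x is large
        x += h * (-1.0 / x)       # explicit Euler update
        t += h
        steps += 1
    return steps

# A trajectory that stays away from the singularity is cheap to simulate;
# one that approaches it needs many more steps in the same time window.
print(steps_to_simulate(x0=2.0, t_end=0.4))   # far from the singularity
print(steps_to_simulate(x0=1.0, t_end=0.4))   # near-singular: many more steps
```

Averaging this cost over initial values is, roughly, what connects the "volume of almost singularities" to average-case polynomial time.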
SIMPL Systems: On a Public Key Variant of Physical Unclonable Functions
This paper theoretically discusses a novel security tool termed {\it SIMPL system}, which can be regarded as a public-key version of physical unclonable functions (PUFs). Like the latter, a SIMPL system is physically unique and non-reproducible, and implements an individual function. In opposition to a PUF, however, a SIMPL system possesses a publicly known numerical description, which allows its digital simulation and prediction. At the same time, it is required that any digital simulation of a SIMPL system must work at a detectably lower speed than its real-time behavior.
In other words, the holder of a SIMPL system can evaluate a publicly known, publicly computable function faster than anyone else. This feature, so we argue in this paper, enables a number of improved practicality and security features. Once implemented successfully, SIMPL systems would have specific advantages over PUFs, certificates of authenticity, physically obfuscated keys, and also over standard mathematical cryptotechniques.
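The timing asymmetry suggests a simple challenge-response check, sketched below as a toy (all names, bounds, and the stand-in functions are hypothetical, not the paper's protocol): a verifier accepts a response only if it is both correct and returned faster than the best known digital simulation could manage.

```python
import time

# Toy sketch: accept a response to a challenge only if it is correct AND
# arrives within a time bound chosen below the fastest known simulation.
def verify(respond, challenge, reference_answer, time_bound_s):
    start = time.monotonic()
    answer = respond(challenge)
    elapsed = time.monotonic() - start
    return answer == reference_answer and elapsed < time_bound_s

# Stand-ins: a genuine SIMPL system evaluates quickly; any digital
# simulation is detectably slower (modeled here by a sleep).
fast = lambda c: c * c
def slow(c):
    time.sleep(0.05)   # stand-in for the detectably slower simulation
    return c * c

print(verify(fast, 7, 49, 0.01))  # True: correct and within the bound
print(verify(slow, 7, 49, 0.01))  # False: correct but too slow
```

The security of such a scheme rests entirely on the speed gap being physically enforced, which is what distinguishes a SIMPL system from an ordinary keyed function.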
Computational complexity of the landscape I
We study the computational complexity of the physical problem of finding
vacua of string theory which agree with data, such as the cosmological
constant, and show that such problems are typically NP-hard. In particular, we
prove that in the Bousso-Polchinski model, the problem is NP-complete. We
discuss the issues this raises and the possibility that, even if we were to
find compelling evidence that some vacuum of string theory describes our
universe, we might never be able to find that vacuum explicitly.
In a companion paper, we apply this point of view to the question of how
early cosmology might select a vacuum.
Comment: JHEP3 LaTeX, 53 pp, 2 .eps figures
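The combinatorial flavor of the Bousso-Polchinski problem can be sketched in a few lines (illustrative charges and window, not the paper's parameters): choose integer fluxes $n_i$ so that $\Lambda = -\Lambda_0 + \tfrac{1}{2}\sum_i (n_i q_i)^2$ lands in a narrow window, a subset-sum-like search whose brute-force cost is exponential in the number of flux quanta.

```python
import itertools

# Toy Bousso-Polchinski-style search: find integer fluxes n_i with
#     Lambda = -Lambda0 + 0.5 * sum((n_i * q_i)**2)
# inside the window [0, eps].  Brute force scans (2*n_max + 1)**J flux
# vectors -- exponential in the number of charges J, which is the source
# of the NP-hardness discussed above.
def find_vacua(charges, lambda0, eps, n_max):
    hits = []
    for n in itertools.product(range(-n_max, n_max + 1), repeat=len(charges)):
        lam = -lambda0 + 0.5 * sum((ni * qi) ** 2 for ni, qi in zip(n, charges))
        if 0.0 <= lam <= eps:
            hits.append(n)
    return hits

charges = [0.7, 1.1, 1.3, 0.4]   # illustrative flux charges
print(find_vacua(charges, lambda0=3.0, eps=0.1, n_max=3))
# finds the four sign variants of the flux vector (3, 0, 1, 0)
```

Even this tiny instance scans $7^4 = 2401$ flux vectors; realistic landscapes with hundreds of fluxes make exhaustive search hopeless, which is the intuition behind the NP-completeness result.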