Verifying Time Complexity of Deterministic Turing Machines
We show that, for all reasonable functions T(n) = o(n log n), we can
algorithmically verify whether a given one-tape Turing machine runs in time at
most T(n). This is a tight bound on the order of growth for the function T,
because we prove that, for T(n) >= n + 1 and T(n) = Omega(n log n), there
exists no algorithm that would verify whether a given one-tape Turing machine
runs in time at most T(n).
We give results also for the case of multi-tape Turing machines. We show that
we can verify whether a given multi-tape Turing machine runs in time at most
T(n) iff T(n0) < n0 + 1 for some n0.
We prove a very general undecidability result stating that, for any class of
functions F that contains arbitrarily large constants, we cannot
verify whether a given Turing machine runs in time T(n) for some
T in F. In particular, we cannot verify whether a Turing machine
runs in constant, polynomial, or exponential time.
Comment: 18 pages, 1 figure
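The decidable regimes rest on simulating the candidate machine while counting its steps against the bound. Below is a minimal sketch of such a step-counting one-tape simulator; the transition-table encoding and function names are illustrative assumptions of this note, not the paper's construction:

```python
# Minimal one-tape Turing machine simulator that counts steps.
# Encoding (illustrative assumption): delta maps (state, symbol) to
# (new_state, written_symbol, move), where move is "L" or "R".

def runs_within(delta, start, accept, tape, bound, blank="_"):
    """Simulate the machine; return (halted within bound, steps used)."""
    tape = dict(enumerate(tape))           # sparse tape, default blank
    state, head, steps = start, 0, 0
    while state != accept and steps < bound:
        sym = tape.get(head, blank)
        if (state, sym) not in delta:      # no applicable rule: halt
            break
        state, out, move = delta[(state, sym)]
        tape[head] = out
        head += 1 if move == "R" else -1
        steps += 1
    halted = state == accept or (state, tape.get(head, blank)) not in delta
    return halted and steps <= bound, steps

# Example machine: scan right over "a"s, accept at the first blank.
delta = {("q0", "a"): ("q0", "a", "R"),
         ("q0", "_"): ("qa", "_", "R")}
print(runs_within(delta, "q0", "qa", "aaa", bound=10))  # (True, 4)
```

Note that simulation alone only checks one input at a time; the paper's point is that deciding the bound for *all* inputs is possible only in the stated regimes.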
Thermodynamics of stochastic Turing machines
In analogy to Brownian computers we explicitly show how to construct
stochastic models, which mimic the behaviour of a general purpose computer (a
Turing machine). Our models are discrete state systems obeying a Markovian
master equation, which are logically reversible and have a well-defined and
consistent thermodynamic interpretation. The resulting master equation, which
describes a simple one-step process on an enormously large state space, allows
us to thoroughly investigate the thermodynamics of computation for this
situation. In particular, in the stationary regime we can closely approximate the
master equation by a simple one-dimensional Fokker-Planck equation. We then
show that the entropy production rate at steady state can be made arbitrarily
small, but the total (integrated) entropy production is finite and grows
logarithmically with the number of computational steps.
Comment: 13 pages incl. appendix, 3 figures and 1 table; slightly changed version as published in PR
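As a toy analogue (not the paper's construction): a biased one-step random walk with forward rate kp and backward rate km has a steady-state entropy production rate of (kp - km) ln(kp/km) in units of k_B, a standard stochastic-thermodynamics formula, and this rate vanishes as the bias is removed while each completed net step still dissipates ln(kp/km):

```python
import math

def entropy_production_rate(kp, km):
    """Steady-state entropy production rate (k_B per unit time) of a
    biased one-step random walk: forward rate kp, backward rate km."""
    return (kp - km) * math.log(kp / km)

def total_entropy(kp, km, n_steps):
    """Entropy produced over n_steps net forward steps: each net step
    dissipates ln(kp/km)."""
    return n_steps * math.log(kp / km)

# Shrinking the bias drives the production *rate* toward zero:
for eps in (1.0, 0.1, 0.01):
    print(eps, entropy_production_rate(1.0 + eps, 1.0))
```

This only illustrates the "arbitrarily small rate" half of the result; the logarithmic growth of the total entropy with the number of computational steps depends on the specific construction in the paper.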
Quantifying Resource Use in Computations
It is currently not possible to quantify the resources needed to perform a
computation. As a consequence, it is not possible to reliably evaluate the
hardware resources needed for the application of algorithms or the running of
programs. This is apparent both in computer science, for instance in
cryptanalysis, and in neuroscience, for instance in comparative neuro-anatomy. A
System-versus-Environment game formalism based on Computability
Logic is proposed that allows one to define a computational work function describing the
theoretical and physical resources needed to perform any purely algorithmic
computation. Within this formalism, the cost of a computation is defined as the
sum of information storage over the steps of the computation. The size of the
computational device, e.g., the action table of a Universal Turing Machine, the
number of transistors in silicon, or the number and complexity of synapses in a
neural net, is explicitly included in the computational cost. The proposed cost
function leads in a natural way to known computational trade-offs and can be
used to estimate the computational capacity of real silicon hardware and neural
nets. The theory is applied to a historical case of 56-bit DES key recovery as
an example of application to cryptanalysis. Furthermore, the relative
computational capacities of human brain neurons and the C. elegans nervous
system are estimated as an example of application to neural nets.
Comment: 26 pages, no figure
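A deliberately simplified sketch of such a cost function, charging the device description in every step on top of the live working data (the bit-level encoding and the helper names are illustrative assumptions, not the paper's formalism):

```python
# Toy cost function in the spirit of "cost = sum of information storage
# over the steps of the computation", with the device description
# (e.g. an action table) counted at every step.

def computation_cost(device_bits, tape_snapshots):
    """Sum of (device size + live data size), in bits, over all steps."""
    return sum(device_bits + len(tape) for tape in tape_snapshots)

# A 3-step computation on a device whose action table takes 64 bits,
# with the working tape holding 8, 10 and 12 bits at successive steps:
print(computation_cost(64, ["0" * 8, "0" * 10, "0" * 12]))  # 222
```

Charging the device size at every step is what makes the trade-off visible: a larger action table (or more synapses) can shorten the computation, but only pays off when the step savings outweigh the per-step storage charge.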