Descriptional complexity of cellular automata and decidability questions
We study the descriptional complexity of cellular automata (CA), a parallel model of computation. We show that between one of the simplest cellular models, the realtime-OCA, and "classical" models like deterministic finite automata (DFA) or pushdown automata (PDA), there are savings concerning the size of description not bounded by any recursive function, a so-called nonrecursive trade-off. Furthermore, nonrecursive trade-offs are shown between some restricted classes of cellular automata. The set of valid computations of a Turing machine can be recognized by a realtime-OCA. This implies that many decidability questions are not even semi-decidable for cellular automata. There is no pumping lemma and no minimization algorithm for cellular automata.
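The realtime-OCA model mentioned above can be sketched in a few lines: each cell updates synchronously from its own state and its right neighbour's, and after n-1 parallel steps the leftmost cell decides acceptance. The transition rule below is a hypothetical toy example (it accepts exactly the words a^n), not a construction from the paper:

```python
# Minimal sketch of a one-way cellular automaton (OCA) in realtime.
# Each cell reads (own state, right neighbour); the rightmost cell
# sees the border symbol '#'.

def oca_step(config, rule):
    """One synchronous update of all cells."""
    return [rule(config[i], config[i + 1] if i + 1 < len(config) else '#')
            for i in range(len(config))]

def realtime_oca_accepts(word, rule, accepting):
    config = list(word)
    for _ in range(len(word) - 1):  # realtime: n-1 parallel steps
        config = oca_step(config, rule)
    return config[0] in accepting   # leftmost cell decides

# Toy rule: a cell keeps its state while its right neighbour agrees
# (or is the border) and falls into the absorbing failure state 'F'
# otherwise, so the automaton accepts exactly the words a^n.
def rule(left, right):
    return left if right in (left, '#') else 'F'

print(realtime_oca_accepts("aaaa", rule, {'a'}))  # True
print(realtime_oca_accepts("aaba", rule, {'a'}))  # False
```

The one-way information flow (cells never see their left neighbour) is what makes the realtime-OCA one of the weakest cellular models, which is why the nonrecursive trade-off against DFAs and PDAs is striking.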
On the descriptional complexity of iterative arrays
The descriptional complexity of iterative arrays (IAs) is studied. Iterative arrays are a parallel computational model with sequential processing of the input. It is shown that IAs, when compared to deterministic finite automata or pushdown automata, may provide savings in size which are not bounded by any recursive function, so-called non-recursive trade-offs. Additional non-recursive trade-offs are proven to exist between IAs working in linear time and IAs working in real time. Furthermore, the descriptional complexity of IAs is compared with that of cellular automata (CAs), and non-recursive trade-offs are proven between two restricted classes. Finally, it is shown that many decidability questions for IAs are undecidable and not semidecidable.
Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness
We describe an alternative method (to compression) that combines several theoretical and experimental results to numerically approximate the algorithmic (Kolmogorov-Chaitin) complexity of all bit strings up to 8 bits long, and of some strings between 9 and 16 bits long. This is done by an exhaustive execution of all deterministic 2-symbol Turing machines with up to 4 states for which the halting times are known thanks to the Busy Beaver problem, that is, 11,019,960,576 machines. An output frequency distribution is then computed, from which the algorithmic probability is calculated and the algorithmic complexity evaluated by way of the (Levin-Zvonkin-Chaitin) coding theorem.
Comment: 29 pages, 5 figures. Version as accepted by the journal Applied Mathematics and Computation.
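The pipeline in this abstract (output frequency distribution, then algorithmic probability, then complexity via the coding theorem) can be sketched as follows. The output counts are made-up placeholder numbers, not the paper's actual distribution:

```python
import math
from collections import Counter

# Hypothetical output counts from exhaustively running a space of small
# Turing machines (placeholder data; the actual study enumerates
# 11,019,960,576 machines).
output_counts = Counter({'0': 500, '1': 500,
                         '00': 120, '11': 120, '01': 60, '10': 60})
total = sum(output_counts.values())

def algorithmic_probability(s):
    """m(s): fraction of halting machines whose output is s."""
    return output_counts[s] / total

def coding_theorem_complexity(s):
    """K(s) ~ -log2 m(s), via the Levin-Zvonkin-Chaitin coding theorem."""
    return -math.log2(algorithmic_probability(s))

# Strings produced more often get lower complexity estimates:
print(coding_theorem_complexity('0') < coding_theorem_complexity('01'))  # True
```

The point of the construction is that frequency of production stands in for compressibility: a string output by many machines has high algorithmic probability and hence low estimated complexity, which is what makes the method applicable to strings too short for compression-based estimates.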
Complexity of equivalence relations and preorders from computability theory
We study the relative complexity of equivalence relations and preorders from computability theory and complexity theory. Given binary relations $R$ and $S$, a componentwise reducibility is defined by $R \le S \iff \exists f \, \forall x, y \, [x\,R\,y \leftrightarrow f(x)\,S\,f(y)]$. Here $f$ is taken from a suitable class of effective functions. For us the relations will be on natural numbers, and $f$ must be computable.
We show that there is a $\Pi_1$-complete equivalence relation, but no $\Pi_k$-complete one for $k \ge 2$. We show that $\Sigma_k$ preorders arising naturally in the above-mentioned areas are $\Sigma_k$-complete. This includes polynomial time $m$-reducibility on exponential time sets, which is $\Sigma_2$, almost inclusion on r.e.\ sets, which is $\Sigma_3$, and Turing reducibility on r.e.\ sets, which is $\Sigma_4$.
Comment: To appear in J. Symb. Logic.
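The componentwise reducibility defined in the abstract can be illustrated with a tiny, entirely hypothetical example on an initial segment of the natural numbers (congruence mod 2 reducing to congruence mod 4; these are not relations studied in the paper):

```python
# Check x R y <=> f(x) S f(y) exhaustively on {0, ..., n-1}.
# This only tests the definition on a finite fragment; it is an
# illustration of the definition, not a proof of reducibility.
def witnesses_reduction(R, S, f, n=50):
    return all(R(x, y) == S(f(x), f(y))
               for x in range(n) for y in range(n))

R = lambda x, y: x % 2 == y % 2   # congruence mod 2
S = lambda x, y: x % 4 == y % 4   # congruence mod 4
f = lambda x: x % 2               # a computable reduction function

print(witnesses_reduction(R, S, f))            # True: f witnesses R <= S here
print(witnesses_reduction(R, S, lambda x: x))  # False: identity fails (0 R 2 but not 0 S 2)
```

The reduction collapses each equivalence class of $R$ onto a single representative, so $R$-equivalent inputs land on $S$-equivalent (indeed identical) outputs, while the identity map does not respect the coarser relation.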