Smoothed Complexity Theory
Smoothed analysis is a new way of analyzing algorithms introduced by Spielman
and Teng (J. ACM, 2004). Classical methods like worst-case or average-case
analysis have accompanying complexity classes, like P and AvgP, respectively.
While worst-case or average-case analysis gives us a means to talk about the
running time of a particular algorithm, complexity classes allow us to talk
about the inherent difficulty of problems.
Smoothed analysis is a hybrid of worst-case and average-case analysis and
compensates for some of their drawbacks. Despite its success in the analysis of
single algorithms and problems, there is no embedding of smoothed analysis into
computational complexity theory, which is necessary to classify problems
according to their intrinsic difficulty.
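Concretely, the hybrid notion fits in one formula. The following is an
illustrative LaTeX rendering of the Spielman-Teng definition with Gaussian
perturbations; the notation (T_A for the running time of algorithm A, sigma
for the perturbation magnitude) is ours, not necessarily the paper's:

    % Smoothed running time of algorithm A: worst case over inputs x,
    % expectation over a random Gaussian perturbation g of magnitude sigma.
    \[
      T_A^{\mathrm{smooth}}(n, \sigma)
        \;=\; \max_{x \in [-1,1]^n}
              \; \mathbb{E}_{g \sim \mathcal{N}(0,\, \sigma^2 I_n)}
              \bigl[\, T_A(x + g) \,\bigr]
    \]
    % sigma = 0 recovers worst-case analysis, while large sigma drowns
    % out x and approaches average-case analysis over Gaussian inputs.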
We propose a framework for smoothed complexity theory, define the relevant
classes, and prove some first hardness results (for bounded halting and tiling)
and tractability results (for binary optimization problems, graph coloring, and
satisfiability). Furthermore, we discuss extensions and shortcomings of our
model and relate it to semi-random models.
Comment: to be presented at MFCS 2012
One-Tape Turing Machine Variants and Language Recognition
We present two restricted versions of one-tape Turing machines. Both
characterize the class of context-free languages. In the first version,
proposed by Hibbard in 1967 and called limited automata, each tape cell can be
rewritten only in the first d visits, for a fixed constant d.
Furthermore, for d = 2, deterministic limited automata are equivalent to
deterministic pushdown automata, namely they characterize deterministic
context-free languages. Further restricting the possible operations, we
consider strongly limited automata. These models still characterize
context-free languages. However, the deterministic version is less powerful
than the deterministic version of limited automata. In fact, there exist
deterministic context-free languages that are not accepted by any deterministic
strongly limited automaton.
Comment: 20 pages. This article will appear in the Complexity Theory Column of
the September 2015 issue of SIGACT News
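To make the rewriting restriction concrete, here is a minimal Python sketch
(ours, not taken from the article) of the folklore 2-limited-automaton strategy
for the language of balanced parentheses. An assertion checks the defining
discipline: a cell is rewritten only during its first two visits, while later
sweeps may cross marked cells but never rewrite them. The head's return trip
after a match is elided for brevity.

    def accepts_balanced(s: str) -> bool:
        # 2-limited-automaton-style recognizer for balanced parentheses.
        tape = list(s)
        visits = [0] * len(tape)  # how often each cell has been entered

        def rewrite(i: int, symbol: str) -> None:
            # Defining restriction of a 2-limited automaton: a cell may
            # only be rewritten while being visited for the first or
            # second time.
            assert visits[i] <= 2, "cell rewritten after its second visit"
            tape[i] = symbol

        i = 0
        while i < len(tape):
            visits[i] += 1
            if tape[i] == ')':
                rewrite(i, 'X')        # first visit of this cell
                j = i - 1              # sweep left for the matching '('
                while j >= 0 and tape[j] != '(':
                    visits[j] += 1     # marked cells are crossed, not rewritten
                    j -= 1
                if j < 0:
                    return False       # unmatched ')'
                visits[j] += 1
                rewrite(j, 'X')        # second visit of the '(' cell
            i += 1
        return all(c == 'X' for c in tape)  # accept iff everything matched

    assert accepts_balanced("(()())")
    assert accepts_balanced("")
    assert not accepts_balanced("(()")
    assert not accepts_balanced(")(")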
Different Approaches to Proof Systems
The classical approach to proof complexity perceives proof systems as deterministic, uniform, surjective, polynomial-time computable functions that map strings to (propositional) tautologies. This approach has been studied intensively since the late 1970s, and much progress has been made. In recent years, research has begun to investigate alternative notions of proof systems. Interesting results stem from dropping the uniformity requirement, allowing oracle access, using quantum computations, or employing probabilism. These lead to different notions of proof systems, for which we survey recent results in this paper.
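As a point of reference for these alternatives, the classical (Cook-Reckhow)
notion fits in one line; the following LaTeX rendering uses standard notation
(TAUT for the set of propositional tautologies, FP for the polynomial-time
computable functions):

    % A proof system is a polynomial-time computable surjection onto TAUT.
    \[
      f \colon \{0,1\}^* \to \mathrm{TAUT},
      \qquad f \in \mathrm{FP}, \qquad f \text{ surjective}.
    \]
    % Soundness: every value f(w) is a tautology. Completeness: by
    % surjectivity, every tautology phi has some proof w with f(w) = phi.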