Complexity of Computing Topological Degree of Lipschitz Functions in n Dimensions
Journal Article
We find lower and upper bounds on the complexity, comp(deg), of computing the topological degree of functions defined on the n-dimensional unit cube C_n, f : ∂C_n → R^n, n ≥ 2, which satisfy a Lipschitz condition with constant K and whose infinity norm at each point on the boundary of C_n is at least d, d > 0, and such that K/(8d) ≥ 1. A lower bound, comp_low ≈ 2n(K/(8d))^{n−1}(c + n), is obtained for comp(deg), assuming that each function evaluation costs c and that elementary arithmetic operations and comparisons cost unity. We prove that the topological degree can be computed using A = (⌈K/(2d) + 1⌉ + 1)^n − (⌈K/(2d) + 1⌉ − 1)^n function evaluations. This can be done by an algorithm φ* due to Kearfott, with cost given by comp(φ*) ≤ A(c + 2n²(n + 1)). Thus for small n, say n ≤ 5, and small K/(2d), say K/(2d) ≤ 9, the degree can be computed in time at most 10^5(c + 300). For large n and/or large K/(2d) the problem is intractable.
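The bracketed expressions in this abstract are garbled in the source; reading the evaluation-count bound as A = (⌈K/(2d) + 1⌉ + 1)^n − (⌈K/(2d) + 1⌉ − 1)^n and the lower bound as 2n(K/(8d))^{n−1}(c + n) is a reconstruction, not a verified quote. Under that reading, the "small case" numbers quoted above can be checked with a short sketch:

```python
import math

def evaluation_bound(n, K, d):
    # Evaluations A sufficient to compute the degree, reading the bound as
    #   A = (ceil(K/(2d) + 1) + 1)^n - (ceil(K/(2d) + 1) - 1)^n
    # (a reconstruction of the garbled formula -- an assumption).
    m = math.ceil(K / (2 * d) + 1)
    return (m + 1) ** n - (m - 1) ** n

def lower_bound(n, K, d, c):
    # Reconstructed lower bound comp_low ~ 2n * (K/(8d))^(n-1) * (c + n).
    return 2 * n * (K / (8 * d)) ** (n - 1) * (c + n)

# The abstract's small case: n = 5 and K/(2d) = 9 (e.g. K = 18, d = 1)
# gives A = 11^5 - 9^5 = 102002, close to the quoted 10^5 evaluations.
print(evaluation_bound(5, 18, 1))
```

For larger n the bound grows like (K/(2d))^n, which is the intractability the abstract refers to.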
A Framework for Algorithm Stability
We say that an algorithm is stable if small changes in the input result in
small changes in the output. This kind of algorithm stability is particularly
relevant when analyzing and visualizing time-varying data. Stability in general
plays an important role in a wide variety of areas, such as numerical analysis,
machine learning, and topology, but is poorly understood in the context of
(combinatorial) algorithms. In this paper we present a framework for analyzing
the stability of algorithms. We focus in particular on the tradeoff between the
stability of an algorithm and the quality of the solution it computes. Our
framework allows for three types of stability analysis with increasing degrees
of complexity: event stability, topological stability, and Lipschitz stability.
We demonstrate the use of our stability framework by applying it to kinetic
Euclidean minimum spanning trees.
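The instability this framework targets can be seen already for a static Euclidean minimum spanning tree: an arbitrarily small perturbation of one point can discretely change the edge set. A minimal sketch (the coordinates are illustrative and the Prim implementation is ours, not the paper's):

```python
import math

def emst_edges(points):
    # Prim's algorithm on the complete Euclidean graph; returns the
    # EMST edge set as frozensets of point indices.
    n = len(points)
    in_tree = {0}
    edges = set()
    while len(in_tree) < n:
        best = None  # (distance, tree vertex, outside vertex)
        for i in in_tree:
            for j in range(n):
                if j not in in_tree:
                    d = math.dist(points[i], points[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        edges.add(frozenset((best[1], best[2])))
        in_tree.add(best[2])
    return edges

# A near-equilateral triangle: nudging the apex by 0.02 changes which
# side is longest, so the EMST drops a different edge -- a discrete
# output change for a small input change.
before = emst_edges([(0.0, 0.0), (1.0, 0.0), (0.51, 0.9)])
after = emst_edges([(0.0, 0.0), (1.0, 0.0), (0.49, 0.9)])
print(before != after)
```

In the kinetic setting of the paper the apex moves continuously, so an exact EMST algorithm must make such a discrete swap at some moment; that is exactly the stability-versus-quality tradeoff the framework quantifies.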
Optimal solution of nonlinear equations
Journal Article
We survey recent worst-case complexity results for the solution of nonlinear equations. Notes on worst- and average-case analysis of iterative algorithms and a bibliography of the subject are also included.
Connected Choice and the Brouwer Fixed Point Theorem
We study the computational content of the Brouwer Fixed Point Theorem in the
Weihrauch lattice. Connected choice is the operation that finds a point in a
non-empty connected closed set given by negative information. One of our main
results is that for any fixed dimension the Brouwer Fixed Point Theorem of that
dimension is computably equivalent to connected choice of the Euclidean unit
cube of the same dimension. Another main result is that connected choice is
complete for dimension greater than or equal to two in the sense that it is
computably equivalent to Weak Kőnig's Lemma. While we can present two
independent proofs for dimension three and upwards that are either based on a
simple geometric construction or a combinatorial argument, the proof for
dimension two is based on a more involved inverse limit construction. The
connected choice operation in dimension one is known to be equivalent to the
Intermediate Value Theorem; we prove that this problem is not idempotent in
contrast to the case of dimension two and upwards. We also prove that Lipschitz
continuity with Lipschitz constants strictly larger than one does not simplify
finding fixed points. Finally, we prove that finding a connectedness component
of a closed subset of the Euclidean unit cube of any dimension greater or equal
to one is equivalent to Weak Kőnig's Lemma. In order to describe these
results, we introduce a representation of closed subsets of the unit cube by
trees of rational complexes.
Comment: 36 pages.
An Optimal Complexity Algorithm for Computing Topological Degree in Two Dimensions
An algorithm is presented to compute the topological degree for any function from a class F. The class F consists of functions defined on the boundary of the two-dimensional unit square C, f : ∂C → R², which satisfy a Lipschitz condition with constant K > 0 and whose infinity norm on the boundary of C is at least d > 0. A worst-case lower bound, m* = 4⌈K/(4d)⌉, is established on the number of function evaluations necessary to compute the topological degree for any function f from the class F. The parallel information used by our algorithm permits the computation of the degree for every f in F with m* function evaluations. The cost of our algorithm is shown to be almost equal to the complexity of the problem.
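In two dimensions the topological degree of f on ∂C is the winding number of f around the origin as the boundary is traversed once. The sketch below estimates it by summing angle increments at uniformly sampled boundary points; it illustrates the quantity being computed, not the optimal algorithm of the abstract (which places its m* evaluations based on K and d):

```python
import math

def degree_2d(f, steps=400):
    # Winding number of f around the origin along the boundary of the
    # unit square, traversed counterclockwise with `steps` samples per side.
    # Assumes f is nonzero on the boundary and steps is fine enough that
    # consecutive angle increments stay below pi.
    ts = [i / steps for i in range(steps)]
    pts = (
        [(t, 0.0) for t in ts]          # bottom: left to right
        + [(1.0, t) for t in ts]        # right: bottom to top
        + [(1.0 - t, 1.0) for t in ts]  # top: right to left
        + [(0.0, 1.0 - t) for t in ts]  # left: top to bottom
    )
    u0, v0 = f(*pts[0])
    prev = math.atan2(v0, u0)
    total = 0.0
    for p in pts[1:] + pts[:1]:  # close the loop back to the start
        u, v = f(*p)
        ang = math.atan2(v, u)
        d = (ang - prev + math.pi) % (2 * math.pi) - math.pi  # wrap to (-pi, pi]
        total += d
        prev = ang
    return round(total / (2 * math.pi))

# f(x, y) = (x - 0.5, y - 0.5) winds once around the origin: degree 1.
print(degree_2d(lambda x, y: (x - 0.5, y - 0.5)))
```

The Lipschitz constant K and the boundary norm bound d in the abstract are exactly what lets an algorithm replace this dense uniform sampling with finitely many well-placed evaluations while still tracking the winding correctly.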
Mutual Dimension
We define the lower and upper mutual dimensions mdim(x : y) and Mdim(x : y) between any two points x and y in Euclidean space. Intuitively, these are the lower and upper densities of the algorithmic information shared by x and y. We show that these quantities satisfy the main desiderata for a satisfactory measure of mutual algorithmic information. Our main theorem, the data processing inequality for mutual dimension, says that, if f is computable and Lipschitz, then the inequalities mdim(f(x) : y) ≤ mdim(x : y) and Mdim(f(x) : y) ≤ Mdim(x : y) hold for all x and y. We use this inequality and related inequalities that we prove in like fashion to establish conditions under which various classes of computable functions on Euclidean space preserve or otherwise transform mutual dimensions between points.
Comment: This article is 29 pages and has been submitted to ACM Transactions on Computation Theory. A preliminary version of part of this material was reported at the 2013 Symposium on Theoretical Aspects of Computer Science in Kiel, Germany.
Computing the homology of basic semialgebraic sets in weak exponential time
We describe and analyze an algorithm for computing the homology (Betti
numbers and torsion coefficients) of basic semialgebraic sets which works in
weak exponential time. That is, out of a set of exponentially small measure in
the space of data the cost of the algorithm is exponential in the size of the
data. All algorithms previously proposed for this problem have a complexity
which is doubly exponential (and this is so for almost all data).