
    Complexity of computing topological degree of Lipschitz functions in n dimensions

    We find lower and upper bounds on the complexity, comp(deg), of computing the topological degree of functions defined on the n-dimensional unit cube C_n, f : ∂C_n → R^n, n ≥ 2, which satisfy a Lipschitz condition with constant K and whose infinity norm at each point on the boundary of C_n is at least d, d > 0, and such that K/(8d) ≥ 1. A lower bound, comp_low ≈ 2n (K/(8d))^{n-1} (c + n), is obtained for comp(deg), assuming that each function evaluation costs c and elementary arithmetic operations and comparisons cost unity. We prove that the topological degree can be computed using A = ([K/(2d) + 1] + 1)^n − ([K/(2d) + 1] − 1)^n function evaluations, where [x] denotes the integer part of x. It can be done by an algorithm Φ*_d due to Kearfott, with cost given by comp(Φ*_d) ≤ A(c + n^2 · 2(n − 1)!). Thus for small n, say n ≤ 5, and small K/(2d), say K/(2d) ≤ 9, the degree can be computed in time at most 10^5 (c + 300). For large n and/or large K/(2d) the problem is intractable.
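
    As a quick numerical illustration of these bounds (a minimal sketch assuming the formulas as reconstructed above; the helper names are ours, not the paper's), the following Python computes the evaluation count A and the lower bound for the small-parameter case n = 5, K/(2d) = 9 mentioned in the abstract:

```python
import math

def num_evaluations(K_over_2d: float, n: int) -> int:
    """A = ([K/(2d) + 1] + 1)^n - ([K/(2d) + 1] - 1)^n, with [.] the integer part."""
    m = math.floor(K_over_2d + 1)
    return (m + 1) ** n - (m - 1) ** n

def lower_bound(K_over_8d: float, n: int, c: float) -> float:
    """comp_low ~ 2n (K/(8d))^(n-1) (c + n), with c the cost of one function evaluation."""
    return 2 * n * K_over_8d ** (n - 1) * (c + n)

n, c = 5, 1.0                        # small-parameter case quoted in the abstract
A = num_evaluations(9.0, n)          # K/(2d) = 9: A = 11**5 - 9**5 = 102002, roughly 10**5
low = lower_bound(9.0 / 4.0, n, c)   # K/(8d) = 9/4: about 2*5*2.25**4*(1 + 5) ≈ 1.5e3
print(A, low)
```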

    A Framework for Algorithm Stability

    We say that an algorithm is stable if small changes in the input result in small changes in the output. This kind of algorithm stability is particularly relevant when analyzing and visualizing time-varying data. Stability in general plays an important role in a wide variety of areas, such as numerical analysis, machine learning, and topology, but is poorly understood in the context of (combinatorial) algorithms. In this paper we present a framework for analyzing the stability of algorithms. We focus in particular on the tradeoff between the stability of an algorithm and the quality of the solution it computes. Our framework allows for three types of stability analysis with increasing degrees of complexity: event stability, topological stability, and Lipschitz stability. We demonstrate the use of our stability framework by applying it to kinetic Euclidean minimum spanning trees.
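
    As a toy illustration of the input-output stability notion described above (this is not the paper's framework; the point set, the perturbation scheme, and the edge-difference measure are all assumptions made only for the example), the sketch below perturbs a planar point set slightly and counts how many edges of its Euclidean minimum spanning tree change:

```python
import math, random

def emst_edges(points):
    """Edges of a Euclidean minimum spanning tree, via Prim's algorithm (O(n^2), stdlib only)."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    best = {i: dist(0, i) for i in range(1, n)}    # cheapest connection of i to the tree
    parent = {i: 0 for i in range(1, n)}
    edges = set()
    while best:
        v = min(best, key=best.get)
        edges.add(frozenset((v, parent[v])))
        del best[v]
        for u in best:
            if dist(v, u) < best[u]:
                best[u], parent[u] = dist(v, u), v
    return edges

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(50)]
eps = 1e-3  # perturb every input point by at most eps in each coordinate
moved = [(x + random.uniform(-eps, eps), y + random.uniform(-eps, eps)) for x, y in pts]

before, after = emst_edges(pts), emst_edges(moved)
# Crude "stability" measure: how many tree edges change under a tiny change of the input.
print(len(before ^ after), "of", len(before), "edges changed")
```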

    Optimal solution of nonlinear equations

    We survey recent worst-case complexity results for the solution of nonlinear equations. Notes on worst- and average-case analysis of iterative algorithms and a bibliography of the subject are also included.
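
    A classical instance of the worst-case results this line of work studies (given here purely as an illustrative example, not as a quotation from the survey): for continuous functions with a sign change on [a, b], bisection delivers an ε-approximation of a zero using ⌈log2((b − a)/ε)⌉ function evaluations, and this count cannot be improved in the worst case over that class. A minimal Python sketch:

```python
import math

def bisect(f, a, b, eps):
    """Bisection for a continuous f with f(a)*f(b) <= 0 on [a, b].
    Uses exactly ceil(log2((b - a) / eps)) midpoint evaluations of f,
    after which the returned midpoint is within eps of a zero."""
    fa = f(a)
    assert fa * f(b) <= 0.0, "need a sign change on [a, b]"
    for _ in range(math.ceil(math.log2((b - a) / eps))):
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0.0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: the zero of x^3 - 2 on [0, 2] to within 1e-8 takes
# ceil(log2(2 / 1e-8)) = 28 midpoint evaluations, whatever f looks like.
print(bisect(lambda x: x**3 - 2.0, 0.0, 2.0, 1e-8))  # ~ 2**(1/3) ≈ 1.259921
```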

    Connected Choice and the Brouwer Fixed Point Theorem

    We study the computational content of the Brouwer Fixed Point Theorem in the Weihrauch lattice. Connected choice is the operation that finds a point in a non-empty connected closed set given by negative information. One of our main results is that for any fixed dimension the Brouwer Fixed Point Theorem of that dimension is computably equivalent to connected choice of the Euclidean unit cube of the same dimension. Another main result is that connected choice is complete for dimension greater than or equal to two in the sense that it is computably equivalent to Weak Kőnig's Lemma. While we can present two independent proofs for dimension three and upwards that are based either on a simple geometric construction or on a combinatorial argument, the proof for dimension two is based on a more involved inverse limit construction. The connected choice operation in dimension one is known to be equivalent to the Intermediate Value Theorem; we prove that this problem is not idempotent, in contrast to the case of dimension two and upwards. We also prove that Lipschitz continuity with Lipschitz constants strictly larger than one does not simplify finding fixed points. Finally, we prove that finding a connectedness component of a closed subset of the Euclidean unit cube of any dimension greater than or equal to one is equivalent to Weak Kőnig's Lemma. In order to describe these results, we introduce a representation of closed subsets of the unit cube by trees of rational complexes. Comment: 36 pages.
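
    The equivalences stated above can be collected in one place; the shorthand CC_n, BFT_n, WKL, and IVT and the symbol ≡_W (standing for the computable equivalence referred to in the abstract) follow the usual conventions of this literature and are introduced here only for readability:

```latex
% Shorthand: CC_n = connected choice on the unit cube [0,1]^n, BFT_n = the
% n-dimensional Brouwer Fixed Point Theorem, WKL = Weak K\H{o}nig's Lemma,
% IVT = the Intermediate Value Theorem.
\begin{align*}
  \mathrm{BFT}_n &\equiv_{\mathrm{W}} \mathrm{CC}_n  && \text{for every fixed dimension } n \ge 1,\\
  \mathrm{CC}_n  &\equiv_{\mathrm{W}} \mathrm{WKL}   && \text{for } n \ge 2,\\
  \mathrm{CC}_1  &\equiv_{\mathrm{W}} \mathrm{IVT}   && \text{(and } \mathrm{CC}_1 \text{ is not idempotent).}
\end{align*}
```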

    Mutual Dimension

    We define the lower and upper mutual dimensions mdim(x:y) and Mdim(x:y) between any two points x and y in Euclidean space. Intuitively, these are the lower and upper densities of the algorithmic information shared by x and y. We show that these quantities satisfy the main desiderata for a satisfactory measure of mutual algorithmic information. Our main theorem, the data processing inequality for mutual dimension, says that, if f: R^m → R^n is computable and Lipschitz, then the inequalities mdim(f(x):y) ≤ mdim(x:y) and Mdim(f(x):y) ≤ Mdim(x:y) hold for all x ∈ R^m and y ∈ R^t. We use this inequality and related inequalities that we prove in like fashion to establish conditions under which various classes of computable functions on Euclidean space preserve or otherwise transform mutual dimensions between points. Comment: This article is 29 pages and has been submitted to ACM Transactions on Computation Theory. A preliminary version of part of this material was reported at the 2013 Symposium on Theoretical Aspects of Computer Science in Kiel, Germany.
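
    As one illustration of how the data processing inequality is used (this corollary is spelled out here by us, under the extra assumption, not made in the abstract, that f is a bijection whose inverse is also computable and Lipschitz), applying the inequality to f and then to f^{-1} shows that such maps preserve mutual dimensions:

```latex
% Assume f : R^m -> R^m is a computable, Lipschitz bijection whose inverse
% f^{-1} is also computable and Lipschitz.  The data processing inequality gives
%   mdim(f(x):y) <= mdim(x:y)                                  (applied to f)
%   mdim(x:y) = mdim(f^{-1}(f(x)):y) <= mdim(f(x):y)           (applied to f^{-1})
% and likewise for Mdim, hence:
\[
  \mathrm{mdim}(f(x):y) = \mathrm{mdim}(x:y)
  \quad\text{and}\quad
  \mathrm{Mdim}(f(x):y) = \mathrm{Mdim}(x:y)
  \qquad\text{for all } x \in \mathbb{R}^m,\ y \in \mathbb{R}^t .
\]
```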

    Computing the homology of basic semialgebraic sets in weak exponential time

    We describe and analyze an algorithm for computing the homology (Betti numbers and torsion coefficients) of basic semialgebraic sets which works in weak exponential time. That is, outside a set of exponentially small measure in the space of data, the cost of the algorithm is exponential in the size of the data. All algorithms previously proposed for this problem have a complexity which is doubly exponential (and this is so for almost all data).