6 research outputs found
Plantinga-Vegter algorithm takes average polynomial time
We exhibit a condition-based analysis of the adaptive subdivision algorithm
due to Plantinga and Vegter. The first complexity analysis of the PV Algorithm
is due to Burr, Gao and Tsigaridas, who proved a worst-case cost bound, in terms
of the degree and the maximum coefficient bit-size, for plane curves. This
exponential bound stands in stark contrast with the good performance of the
algorithm observed in practice. More in line with this performance, we show
that, with respect to a broad family of measures, the expected time complexity
of the PV Algorithm is polynomial in the degree for real plane curves. We also
exhibit a smoothed analysis of the PV Algorithm that yields similar complexity
estimates. To obtain these results we combine robust probabilistic techniques
coming from geometric functional analysis with condition numbers and the
continuous amortization paradigm introduced by Burr, Krahmer and Yap. We hope
this will motivate a fruitful exchange of ideas between the different
approaches to numerical computation.
Comment: 8 pages, correction of typo
Condition Numbers for the Cube. I: Univariate Polynomials and Hypersurfaces
The condition-based complexity analysis framework is one of the gems of
modern numerical algebraic geometry and theoretical computer science. One of
the challenges that it poses is to expand the currently limited range of random
polynomials that we can handle. Despite important recent progress, the
available tools cannot handle random sparse polynomials and Gaussian
polynomials, that is polynomials whose coefficients are i.i.d. Gaussian random
variables.
We initiate a condition-based complexity framework based on the norm of the
cube that is a step in this direction. We present this framework for real
hypersurfaces and univariate polynomials. We demonstrate its capabilities in
two problems, under very mild probabilistic assumptions. On the one hand, we
show that the average run-time of the Plantinga-Vegter algorithm is polynomial
in the degree for random sparse (albeit with a restricted sparseness structure)
polynomials and random Gaussian polynomials. On the other hand, we study the
size of the subdivision tree for Descartes' solver and run-time of the solver
by Jindal and Sagraloff (arXiv:1704.06979). In both cases, we provide a bound
that is polynomial in the size of the input (size of the support plus logarithm
of the degree), not only on average but for all higher moments.
Comment: 34 pages. Version 1 is the conference version; from version 2, the
journal version.
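The Descartes solver whose subdivision tree the abstract studies can be sketched with the classical bisection scheme; the following is a textbook version (assumed here for illustration, not the exact solver analysed). To bound the roots of p in an interval, the interval is mapped to (0, 1), the Moebius substitution x -> 1/(1+x) is applied, and coefficient sign variations are counted: 0 variations certify no root, 1 certifies exactly one, and otherwise the interval is bisected.

```python
# Hedged sketch of classical Descartes bisection for real-root isolation.

def sign_variations(coeffs):
    """Number of sign changes in a coefficient sequence (zeros skipped)."""
    signs = [c for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

def taylor_shift(coeffs, t):
    """Coefficients of p(x + t), by repeated synthetic division (Horner-style)."""
    c = list(coeffs)
    n = len(c) - 1
    for i in range(n):
        for j in range(n - 1, i - 1, -1):
            c[j] += t * c[j + 1]
    return c

def variations_on(coeffs, a, b):
    """Descartes bound on the number of roots of p in (a, b)."""
    # Map (a, b) to (0, 1): r(x) = p(a + (b - a) x) ...
    r = [ck * (b - a) ** k for k, ck in enumerate(taylor_shift(coeffs, a))]
    # ... then count variations of (1 + x)^n r(1/(1 + x)): reverse, shift by 1.
    return sign_variations(taylor_shift(list(reversed(r)), 1.0))

def descartes_isolate(coeffs, a=0.0, b=1.0, depth=0, max_depth=50):
    """Isolating intervals for the real roots of p in (a, b)."""
    v = variations_on(coeffs, a, b)
    if v == 0:
        return []                      # no root in (a, b)
    if v == 1:
        return [(a, b)]                # exactly one root (bound has v's parity)
    if depth >= max_depth:
        return [(a, b)]                # give up: clustered or boundary roots
    m = (a + b) / 2                    # sketch ignores a root landing exactly at m
    return (descartes_isolate(coeffs, a, m, depth + 1, max_depth)
            + descartes_isolate(coeffs, m, b, depth + 1, max_depth))

# Example: p(x) = (x - 1/4)(x - 3/4) = x^2 - x + 3/16 on (0, 1).
intervals = descartes_isolate([3.0 / 16.0, -1.0, 1.0])
```

The size of the recursion tree built by `descartes_isolate` is the quantity the abstract bounds: polynomial in the size of the support plus the logarithm of the degree, on average and for all higher moments.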
Condition Numbers for the Cube. I: Univariate Polynomials and Hypersurfaces
The condition-based complexity analysis framework is one of the gems of modern numerical algebraic geometry and theoretical computer science. One of the challenges that it poses is to expand the currently limited range of random polynomials that we can handle. Despite important recent progress, the available tools cannot handle random sparse polynomials and Gaussian polynomials, that is, polynomials whose coefficients are i.i.d. Gaussian random variables. We initiate a condition-based complexity framework based on the norm of the cube, which is a step in this direction. We present this framework for real hypersurfaces. We demonstrate its capabilities by providing a new probabilistic complexity analysis for the Plantinga-Vegter algorithm, which covers both random sparse (albeit with a restricted sparseness structure) polynomials and random Gaussian polynomials. We present explicit results with structured random polynomials for problems in two or more dimensions. Additionally, we provide some estimates of the separation bound of a univariate polynomial in our current framework.
Certified numerical algorithm for isolating the singularities of the plane projection of generic smooth space curves
Isolating the singularities of a plane curve is the first step towards computing its topology. For this, numerical methods are efficient but not certified in general. We are interested in developing certified numerical algorithms for isolating the singularities. In order to do so, we restrict our attention to the special case of plane curves that are projections of smooth curves in higher dimensions. This type of curve appears naturally in robotics applications and scientific visualization. In this setting, we show that the singularities can be encoded by a regular square system whose solutions can be isolated with certified numerical methods. Our analysis is conditioned by assumptions that we prove to be generic using transversality theory. We also provide a semi-algorithm to check their validity. Finally, we present experiments, some of which are not reachable by other methods, and discuss the efficiency of our method.
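The key idea, encoding the singularities of the projection as a regular square system, can be illustrated on a toy example. The construction below is our own (hypothetical surfaces P and Q, not taken from the paper): a node of the (x, y)-projection has two preimages (x, y, z1) and (x, y, z2) with z1 != z2, and dividing P(z1) - P(z2) and Q(z1) - Q(z2) by (z1 - z2) removes the diagonal z1 = z2, leaving a square 4x4 system. The paper isolates its solutions with certified numerical methods; plain floating-point Newton stands in for the certified solver here.

```python
# Hedged toy illustration (our own construction, not the paper's exact one):
# a space curve cut out by P = x^2 + (y - 1/2)^2 + z^2 - 5/4 (a sphere) and
# Q = x - z^3 + z.  The node of its (x, y)-projection is encoded by the
# square system (P(z1), Q(z1), and the two divided differences in z).

def F(v):
    x, y, z1, z2 = v
    return [
        x * x + (y - 0.5) ** 2 + z1 * z1 - 1.25,  # P(x, y, z1)
        x - z1 ** 3 + z1,                         # Q(x, y, z1)
        z1 + z2,                                  # (P(z1) - P(z2)) / (z1 - z2)
        1.0 - (z1 * z1 + z1 * z2 + z2 * z2),      # (Q(z1) - Q(z2)) / (z1 - z2)
    ]

def J(v):
    """Jacobian of F with respect to (x, y, z1, z2)."""
    x, y, z1, z2 = v
    return [
        [2 * x, 2 * (y - 0.5), 2 * z1, 0.0],
        [1.0, 0.0, -3 * z1 * z1 + 1, 0.0],
        [0.0, 0.0, 1.0, 1.0],
        [0.0, 0.0, -2 * z1 - z2, -z1 - 2 * z2],
    ]

def solve(A, b):
    """Gaussian elimination with partial pivoting (4x4 is enough here)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

v = [0.2, 0.1, 0.9, -0.9]            # start near the expected node
for _ in range(30):
    d = solve(J(v), [-fi for fi in F(v)])
    v = [vi + di for vi, di in zip(v, d)]
# v approaches the node (x, y) = (0, 0), with preimages z1 = 1 and z2 = -1
```

Because the system is square and the Jacobian is invertible at the solution (the regularity the paper establishes generically via transversality), Newton converges quadratically; a certified method such as interval Newton would additionally prove that the computed box contains exactly one solution.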