
    New Acceleration of Nearly Optimal Univariate Polynomial Root-finders

    Univariate polynomial root-finding has been studied for four millennia and is still the subject of intensive research. Hundreds of efficient algorithms have been proposed for this task. Two of them are nearly optimal. The first, proposed in 1995, relies on recursive factorization of a polynomial, is quite involved, and has never been implemented. The second, proposed in 2016, relies on subdivision iterations, was implemented in 2018, and promises to be practically competitive, although users' current choice for univariate polynomial root-finding is the package MPSolve, proposed in 2000, revised in 2014, and based on Ehrlich's functional iterations. By proposing and incorporating some novel techniques we significantly accelerate both the subdivision and Ehrlich's iterations. Moreover, our acceleration of the known subdivision root-finders is dramatic in the case of sparse input polynomials. Our techniques can be of independent interest for the design and analysis of polynomial root-finders. Comment: 89 pages, 5 figures, 2 tables.
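    To make the Ehrlich iterations mentioned in the abstract concrete, below is a minimal textbook-style sketch of Ehrlich's (Aberth-Ehrlich) simultaneous root-finding iteration in Python. It is not MPSolve's implementation nor the accelerated variant proposed in the paper; the function name, the starting points on a Cauchy-bound circle, and the stopping rule are illustrative choices.

```python
import numpy as np

def ehrlich_iteration(coeffs, num_iters=100, tol=1e-12):
    """Approximate all roots of a polynomial simultaneously by Ehrlich's
    (Aberth-Ehrlich) functional iteration; `coeffs` lists the coefficients
    from the highest degree down to the constant term."""
    coeffs = np.asarray(coeffs, dtype=complex)
    n = len(coeffs) - 1
    deriv = np.polyder(coeffs)
    # Initial approximations on a circle enclosing all roots (Cauchy bound),
    # with a phase offset that reduces the chance of symmetric degeneracies.
    radius = 1.0 + np.max(np.abs(coeffs[1:] / coeffs[0]))
    z = radius * np.exp(2j * np.pi * np.arange(n) / n + 0.4j)
    for _ in range(num_iters):
        newton = np.polyval(coeffs, z) / np.polyval(deriv, z)  # p(z_j) / p'(z_j)
        diffs = z[:, None] - z[None, :]
        np.fill_diagonal(diffs, 1.0)     # placeholder to avoid dividing 0 by 0
        recip = 1.0 / diffs
        np.fill_diagonal(recip, 0.0)     # drop the j = k terms
        repulsion = recip.sum(axis=1)    # sum_{k != j} 1 / (z_j - z_k)
        correction = newton / (1.0 - newton * repulsion)
        z -= correction
        if np.max(np.abs(correction)) < tol:
            break
    return z
```

    For example, ehrlich_iteration([1, 0, 0, -1]) should return approximations of the three cube roots of unity.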

    Accelerated Approximation of the Complex Roots and Factors of a Univariate Polynomial

    The known algorithms approximate the roots of a complex univariate polynomial in nearly optimal arithmetic and Boolean time. They are, however, quite involved and require high precision of computing when the degree of the input polynomial is large, which causes numerical stability problems. We observe that these difficulties do not appear at the initial stages of the algorithms, and in the present paper we extend one of these stages, analyze it, and avoid the cited problems, still achieving the solution within nearly optimal complexity estimates, provided that some mild initial isolation of the roots of the input polynomial has been ensured. The resulting algorithms promise to be of practical value for root-finding and can be extended to the problem of polynomial factorization, which is of interest in its own right. We conclude by outlining such an extension, which enables us to cover the cases of isolated multiple roots and root clusters.
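    As a rough illustration of the role played by mild initial isolation, here is a small Python sketch, not the paper's algorithm, that refines already-isolated initial approximations by Newton's method and assembles the corresponding approximate factorization into linear factors. It assumes simple, well-separated roots, so it does not cover the multiple roots and root clusters addressed at the end of the abstract; the function name and parameters are illustrative.

```python
import numpy as np

def refine_and_factor(coeffs, initial_roots, num_iters=50, tol=1e-12):
    """Refine mildly isolated initial root approximations with Newton's method,
    then return the refined roots together with the linear factors of the
    approximate factorization p(x) ~ leading * prod_j (x - r_j)."""
    coeffs = np.asarray(coeffs, dtype=complex)
    deriv = np.polyder(coeffs)
    roots = np.asarray(initial_roots, dtype=complex)
    for _ in range(num_iters):
        step = np.polyval(coeffs, roots) / np.polyval(deriv, roots)  # Newton step
        roots -= step
        if np.max(np.abs(step)) < tol:
            break
    leading = coeffs[0]
    factors = [np.array([1.0, -r]) for r in roots]  # each factor is x - r_j
    return roots, leading, factors
```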

    Nearly optimal computations with structured matrices

    We estimate the Boolean complexity of multiplying structured matrices by a vector and of solving nonsingular linear systems of equations with these matrices. We study four basic and most popular classes, namely Toeplitz, Hankel, Cauchy, and Vandermonde matrices, for which the cited computational problems are equivalent to the tasks of polynomial multiplication and division and polynomial and rational multipoint evaluation and interpolation. The Boolean cost estimates for the latter problems have been obtained by Kirrinnis in [10], except for rational interpolation. We supply them now, as well as the Boolean complexity estimates for the important problems of multiplying a transposed Vandermonde matrix and its inverse by a vector. All known Boolean cost estimates from [10] for such problems rely on using the Kronecker product. This implies a d-fold precision increase for the d-th degree output, but we avoid such an increase by relying on distinct techniques based on employing FFT. Furthermore, we simplify the analysis and make it more transparent by combining the representations of our tasks and algorithms both via structured matrices and via polynomials and rational functions. This also enables further extensions of our estimates to cover Trummer's important problem and computations with the popular classes of structured matrices that generalize the four cited basic matrix classes, as well as the transposed Vandermonde matrices. It is known that the solution of Toeplitz, Hankel, Cauchy, Vandermonde, and transposed Vandermonde linear systems of equations is generally prone to numerical stability problems, and numerical problems arise even in multiplication of Cauchy, Vandermonde, and transposed Vandermonde matrices by a vector. Thus our FFT-based results on the Boolean complexity of these important computations could be quite interesting, because our estimates are reasonable even for more general classes of structured matrices, showing rather moderate growth of the complexity as the input size increases.
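    The equivalence invoked in the abstract between structured matrix-by-vector products and polynomial computations can be illustrated with a short Python sketch (a toy under stated assumptions, not the paper's Boolean-complexity algorithms): a Vandermonde product is a multipoint polynomial evaluation, and a Toeplitz product reduces to polynomial multiplication, computed here with the FFT after embedding the Toeplitz matrix into a circulant. The function names and the low-degree-first coefficient convention are illustrative choices.

```python
import numpy as np

def vandermonde_times_vector(nodes, v):
    """V @ v for V[i, j] = nodes[i] ** j: the same as evaluating the polynomial
    with coefficient vector v (constant term first) at every node."""
    v = np.asarray(v, dtype=complex)
    return np.polyval(v[::-1], np.asarray(nodes, dtype=complex))

def toeplitz_times_vector(col, row, v):
    """T @ v for the Toeplitz matrix with first column `col` and first row `row`
    (with col[0] == row[0]), computed via FFT-based polynomial multiplication
    after embedding T into a circulant of order 2n."""
    col = np.asarray(col, dtype=complex)
    row = np.asarray(row, dtype=complex)
    v = np.asarray(v, dtype=complex)
    n = len(v)
    # First column of a circulant whose top-left n-by-n block equals T.
    c = np.concatenate([col, [0.0], row[:0:-1]])
    padded = np.concatenate([v, np.zeros(len(c) - n, dtype=complex)])
    product = np.fft.ifft(np.fft.fft(c) * np.fft.fft(padded))  # circular convolution
    return product[:n]
```

    On small inputs these should agree with the dense products formed from np.vander(nodes, increasing=True) and scipy.linalg.toeplitz(col, row).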

    Fast approximate computations with Cauchy matrices and polynomials
