    On largest volume simplices and sub-determinants

    We show that the problem of finding the simplex of largest volume in the convex hull of $n$ points in $\mathbb{Q}^d$ can be approximated with a factor of $O(\log d)^{d/2}$ in polynomial time. This improves upon the previously best known approximation guarantee of $d^{(d-1)/2}$ by Khachiyan. On the other hand, we show that there exists a constant $c > 1$ such that this problem cannot be approximated with a factor of $c^d$, unless $P = NP$. This improves over the $1.09$ inapproximability that was previously known. Our hardness result holds even if $n = O(d)$, in which case there exists a $\bar{c}^{\,d}$-approximation algorithm that relies on recent sampling techniques, where $\bar{c}$ is again a constant. We show that similar results hold for the problem of finding the largest absolute value of a subdeterminant of a $d \times n$ matrix.
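
    For reference, the volume of the simplex spanned by points $p_0, \ldots, p_d \in \mathbb{R}^d$ equals $|\det(p_1 - p_0, \ldots, p_d - p_0)| / d!$, which is how the problem connects to subdeterminants. The brute-force sketch below only makes that objective concrete; it enumerates all subsets, is exponential in the input size, and is not the approximation algorithm of the paper.

        # Illustrative baseline, NOT the O(log d)^{d/2}-approximation from the
        # paper: enumerate all (d+1)-point subsets and return the largest
        # simplex volume.
        from itertools import combinations
        from math import factorial
        import numpy as np

        def largest_simplex_volume(points):
            """points: (n, d) array; returns max |det(p_1-p_0, ..., p_d-p_0)| / d!."""
            points = np.asarray(points, dtype=float)
            n, d = points.shape
            best = 0.0
            for subset in combinations(range(n), d + 1):
                p = points[list(subset)]
                best = max(best, abs(np.linalg.det(p[1:] - p[0])) / factorial(d))
            return best

        # Example: vertices of the unit square in R^2; the largest inscribed
        # triangle has area 1/2.
        print(largest_simplex_volume([[0, 0], [1, 0], [0, 1], [1, 1]]))  # 0.5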

    On maximum volume submatrices and cross approximation for symmetric semidefinite and diagonally dominant matrices

    The problem of finding a $k \times k$ submatrix of maximum volume of a matrix $A$ is of interest in a variety of applications. For example, it yields a quasi-best low-rank approximation constructed from the rows and columns of $A$. We show that such a submatrix can always be chosen to be a principal submatrix if $A$ is symmetric semidefinite or diagonally dominant. Then we analyze the low-rank approximation error returned by a greedy method for volume maximization, cross approximation with complete pivoting. Our bound for general matrices extends an existing result for symmetric semidefinite matrices and yields new error estimates for diagonally dominant matrices. In particular, for doubly diagonally dominant matrices the error is shown to remain within a modest factor of the best approximation error. We also illustrate how the application of our results to cross approximation for functions leads to new and better convergence results.
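
    Cross approximation with complete pivoting, the greedy volume-maximization method analyzed here, repeatedly subtracts the rank-one cross through the entry of largest absolute value in the current residual. Below is a dense, self-contained sketch; the function name and interface are illustrative and not taken from the paper.

        import numpy as np

        def cross_approximation_complete_pivoting(A, k):
            """Greedy rank-k cross approximation with complete pivoting.
            Dense sketch (the full residual is formed), so only suitable for
            small matrices. Returns the chosen row/column indices and the
            rank-k approximation accumulated from the rank-one crosses."""
            R = np.array(A, dtype=float)            # residual, updated in place
            rows, cols = [], []
            approx = np.zeros_like(R)
            for _ in range(k):
                i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
                if R[i, j] == 0.0:                  # residual vanished: rank < k
                    break
                cross = np.outer(R[:, j], R[i, :]) / R[i, j]
                approx += cross
                R -= cross
                rows.append(i)
                cols.append(j)
            return rows, cols, approx

        # Example on a symmetric positive semidefinite matrix of rank 5:
        rng = np.random.default_rng(0)
        B = rng.standard_normal((40, 5))
        A = B @ B.T
        rows, cols, Ak = cross_approximation_complete_pivoting(A, 5)
        print(np.linalg.norm(A - Ak))               # close to machine precision

    For symmetric semidefinite $A$ the residual remains semidefinite and its largest absolute entry lies on the diagonal, so the pivots above satisfy $i = j$ and the selected submatrix is principal, which is consistent with the structural result stated in the abstract.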

    Trivariate polynomial approximation on Lissajous curves

    We study Lissajous curves in the 3-cube that generate algebraic cubature formulas on a special family of rank-1 Chebyshev lattices. These formulas are used to construct trivariate hyperinterpolation polynomials via a single 1-d Fast Chebyshev Transform (by the Chebfun package), and to compute discrete extremal sets of Fekete and Leja type for trivariate polynomial interpolation. Applications could arise in the framework of Lissajous sampling for MPI (Magnetic Particle Imaging).
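
    The hyperinterpolation step mentioned above reduces, along the curve, to a single one-dimensional fast Chebyshev transform. As a self-contained illustration of that 1-d transform (standard FFT-based Chebyshev analysis on Chebyshev-Lobatto points, not the paper's or Chebfun's code):

        import numpy as np

        def chebyshev_coeffs(values):
            """Chebyshev coefficients of the degree-n interpolant of `values`
            sampled at the Chebyshev-Lobatto points x_j = cos(j*pi/n),
            j = 0, ..., n, computed with a single FFT."""
            n = len(values) - 1
            v = np.concatenate([values, values[-2:0:-1]])  # even extension
            c = np.real(np.fft.fft(v)) / n
            coeffs = c[: n + 1]
            coeffs[0] /= 2.0                               # halve the endpoint modes
            coeffs[-1] /= 2.0
            return coeffs

        # Example: x^3 = 0.75*T_1(x) + 0.25*T_3(x), sampled at 5 points.
        x = np.cos(np.pi * np.arange(5) / 4)
        print(np.round(chebyshev_coeffs(x**3), 10))        # ~[0, 0.75, 0, 0.25, 0]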