    Efficient sum-of-exponentials approximations for the heat kernel and their applications

    In this paper, we show that efficient separated sum-of-exponentials approximations can be constructed for the heat kernel in any dimension. In one space dimension, the heat kernel admits an approximation involving a number of terms of the order $O(\log(\frac{T}{\delta})(\log(\frac{1}{\epsilon})+\log\log(\frac{T}{\delta})))$ for any $x \in \mathbb{R}$ and $\delta \leq t \leq T$, where $\epsilon$ is the desired precision. In all higher dimensions, the corresponding heat kernel admits an approximation involving only $O(\log^2(\frac{T}{\delta}))$ terms for fixed accuracy $\epsilon$. These approximations can be used to accelerate integral-equation-based methods for boundary value problems governed by the heat equation in complex geometry. The resulting algorithms are nearly optimal: for $N_S$ points in the spatial discretization and $N_T$ time steps, the cost is $O(N_S N_T \log^2 \frac{T}{\delta})$ in terms of both memory and CPU time for fixed accuracy $\epsilon$. The algorithms can be parallelized in a straightforward manner. Several numerical examples are presented to illustrate the accuracy and stability of these approximations.
    Comment: 23 pages, 5 figures, 3 tables
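    To illustrate what a separated sum-of-exponentials approximation of the 1D heat kernel looks like, the sketch below discretizes the Fourier representation $G(x,t) = \frac{1}{2\pi}\int e^{-k^2 t} e^{ikx}\,dk$ with the trapezoidal rule, yielding a sum of terms $c_j\, e^{-k_j^2 t}\, e^{ik_j x}$ that are separated in $x$ and $t$. This is a simplified stand-in for the paper's construction (which uses Laplace-transform contours to keep the term count small uniformly over $t \in [\delta, T]$); the truncation parameter `K` and node count `n` here are illustrative choices, not the paper's quadrature.

    ```python
    import numpy as np

    def heat_kernel_exact(x, t):
        """Exact 1D heat kernel G(x, t) = exp(-x^2/4t) / sqrt(4 pi t)."""
        return np.exp(-x**2 / (4 * t)) / np.sqrt(4 * np.pi * t)

    def heat_kernel_soe(x, t, K=20.0, n=200):
        """Separated sum-of-exponentials approximation of G(x, t),
        obtained by trapezoidal discretization of the Fourier integral
        G(x,t) = (1/2pi) * int_{-K}^{K} exp(-k^2 t) exp(i k x) dk.
        Each term factors as c_j * exp(-k_j^2 t) * exp(i k_j x)."""
        k = np.linspace(-K, K, n)
        h = k[1] - k[0]
        w = np.full(n, h)
        w[0] = w[-1] = h / 2  # trapezoidal end-point weights
        terms = (w / (2 * np.pi)) * np.exp(-k**2 * t) * np.exp(1j * k * x)
        return np.real(np.sum(terms))
    ```

    For moderate `t` the trapezoidal rule converges spectrally here, so even this naive discretization matches the exact kernel to high accuracy; the paper's contribution is achieving a near-logarithmic number of terms uniformly over the whole interval $[\delta, T]$.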

    Implicitization of curves and (hyper)surfaces using predicted support

    We reduce the implicitization of rational planar parametric curves and (hyper)surfaces to linear algebra by interpolating the coefficients of the implicit equation. To predict the implicit support, we focus on methods that exploit input and output structure in the sense of sparse (or toric) elimination theory, namely by computing the Newton polytope of the implicit polynomial via sparse resultant theory. Our algorithm works even in the presence of base points, but in this case the implicit equation is obtained as a factor of the produced polynomial. We implement our methods in Maple, and some in Matlab as well, and study their numerical stability and efficiency on several classes of curves and surfaces. We apply our approach to approximate implicitization and quantify the accuracy of the approximate output, which turns out to be satisfactory on all tested examples; we also relate our measures to the Hausdorff distance. In building a square or rectangular matrix, an important issue is (over)sampling the given curve or surface: we conclude that unitary complexes offer the best tradeoff between speed and accuracy when numerical methods, namely SVD, are employed, whereas for exact kernel computation random integers are the method of choice. We compare our prototype to existing software and find that it is quite competitive.
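    The interpolation idea above can be sketched on a toy example. Below, the rational circle $x(t) = \frac{1-t^2}{1+t^2}$, $y(t) = \frac{2t}{1+t^2}$ is implicitized by sampling it, assembling a matrix whose columns are candidate monomials (here simply all monomials of total degree at most 2, rather than a Newton polytope predicted by sparse resultants), and extracting a kernel vector via SVD, as the paper does in its numerical variant. The function name and the degree-2 support are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    def implicitize_conic(xs, ys):
        """Interpolate the implicit equation of a conic through sampled
        points (xs, ys): build the matrix of candidate monomials
        [1, x, y, x^2, x*y, y^2] evaluated at the samples, and return
        a kernel vector computed via SVD (the right singular vector
        for the smallest singular value)."""
        M = np.column_stack([np.ones_like(xs), xs, ys,
                             xs**2, xs * ys, ys**2])
        _, _, Vt = np.linalg.svd(M)
        return Vt[-1]

    # Sample the rational parametrization of the unit circle.
    t = np.linspace(-2.0, 2.0, 20)
    x = (1 - t**2) / (1 + t**2)
    y = 2 * t / (1 + t**2)

    c = implicitize_conic(x, y)
    c = c / c[3]  # normalize so the x^2 coefficient is 1
    # Coefficients in the order [1, x, y, x^2, x*y, y^2] recover
    # x^2 + y^2 - 1 = 0, i.e. approximately [-1, 0, 0, 1, 0, 1].
    ```

    Oversampling (20 points for a 6-coefficient support) makes the least-squares kernel problem well conditioned, which is exactly the (over)sampling tradeoff the abstract discusses; with exact rational samples one would instead compute the kernel exactly, e.g. over the integers.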