
    Gradient-type subspace iteration methods for the symmetric eigenvalue problem

    This paper explores variants of the subspace iteration algorithm for computing approximate invariant subspaces. The standard subspace iteration approach is revisited, and new variants that exploit gradient-type techniques combined with a Grassmann manifold viewpoint are developed. A gradient method as well as a conjugate gradient technique are described. Convergence of the gradient-based algorithm is analyzed, and a few numerical experiments are reported, indicating that the proposed algorithms are sometimes superior to a standard Chebyshev-based subspace iteration in terms of the number of matrix-vector products, while not requiring the estimation of optimal parameters. An important contribution to achieving this performance is an accurate and efficient implementation of an exact line search. In addition, new convergence proofs are presented for the non-accelerated gradient method, including locally exponential convergence when started in an $\mathcal{O}(\sqrt{\delta})$ neighbourhood of the dominant subspace with spectral gap $\delta$. Comment: 29 pages
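As a point of reference, the standard (non-gradient) subspace iteration that the paper revisits can be sketched in a few lines. The test matrix, its spectrum, and the gap after the dominant eigenvalues are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3

# Illustrative symmetric matrix with a known spectrum: three dominant
# eigenvalues (10, 9, 8) separated by a clear gap from the rest.
d = np.concatenate(([10.0, 9.0, 8.0], rng.uniform(0.0, 1.0, n - 3)))
V = np.linalg.qr(rng.standard_normal((n, n)))[0]
A = (V * d) @ V.T

# Standard subspace iteration: multiply by A, re-orthonormalize, repeat.
Q = np.linalg.qr(rng.standard_normal((n, p)))[0]
for _ in range(50):
    Q = np.linalg.qr(A @ Q)[0]

# Rayleigh-Ritz on the p-dimensional subspace recovers the dominant eigenvalues.
theta = np.sort(np.linalg.eigvalsh(Q.T @ A @ Q))
```

The convergence rate of each Ritz value is governed by the ratio of neighbouring eigenvalues across the gap, which is what gradient-type accelerations aim to improve without Chebyshev parameter estimates.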

    Criterion for polynomial solutions to a class of linear differential equations of second order

    We consider the differential equations $y'' = \lambda_0(x) y' + s_0(x) y$, where $\lambda_0(x)$ and $s_0(x)$ are $C^{\infty}$-functions. We prove: (i) if the differential equation has a polynomial solution of degree $n > 0$, then $\delta_n = \lambda_n s_{n-1} - \lambda_{n-1} s_n = 0$, where $\lambda_n = \lambda_{n-1}' + s_{n-1} + \lambda_0 \lambda_{n-1}$ and $s_n = s_{n-1}' + s_0 \lambda_{n-1}$, $n = 1, 2, \ldots$. Conversely, (ii) if $\lambda_n \lambda_{n-1} \ne 0$ and $\delta_n = 0$, then the differential equation has a polynomial solution of degree at most $n$. We show that the classical differential equations of Laguerre, Hermite, Legendre, Jacobi, Chebyshev (first and second kind), Gegenbauer, and hypergeometric type obey this criterion. Further, we find the polynomial solutions for the generalized Hermite, Laguerre, Legendre and Chebyshev differential equations. Comment: 12 pages
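The recurrence in (i) is easy to check mechanically. A minimal sketch, representing polynomials as coefficient lists (lowest degree first) and using the Hermite equation y'' = 2x y' − 2n y as an illustrative example not prescribed by the abstract:

```python
from itertools import zip_longest

# Polynomials as coefficient lists, lowest degree first (e.g. 2x -> [0, 2]).
def padd(p, q):
    return [a + b for a, b in zip_longest(p, q, fillvalue=0)]

def pmul(p, q):
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def pder(p):
    return [i * a for i, a in enumerate(p)][1:] or [0]

def delta_n(lam0, s0, n):
    """Iterate lam_k = lam_{k-1}' + s_{k-1} + lam0*lam_{k-1} and
    s_k = s_{k-1}' + s0*lam_{k-1}; return delta_n = lam_n*s_{n-1} - lam_{n-1}*s_n."""
    lam, s = lam0, s0
    for _ in range(n):
        lam_prev, s_prev = lam, s
        lam = padd(padd(pder(lam_prev), s_prev), pmul(lam0, lam_prev))
        s = padd(pder(s_prev), pmul(s0, lam_prev))
    return padd(pmul(lam, s_prev), [-c for c in pmul(lam_prev, s)])

# Hermite equation y'' = 2x y' - 2n y with n = 2: lam0 = 2x, s0 = -4.
print(delta_n([0, 2], [-4], 2))   # -> [0, 0, 0, 0, 0]
```

With a non-admissible coefficient, e.g. s0 = −3 for n = 1, the same routine returns a nonzero δ, consistent with the converse direction (ii).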

    Parallel algorithm with spectral convergence for nonlinear integro-differential equations

    We discuss a numerical algorithm for solving nonlinear integro-differential equations and illustrate our findings for the particular case of Volterra-type equations. The algorithm combines a perturbation approach, meant to render a linearized version of the problem, with a spectral method in which the unknown functions are expanded in terms of Chebyshev polynomials (El-gendi's method). This approach is shown to be suitable for the calculation of two-point Green functions required in next-to-leading-order studies of time-dependent quantum field theory. Comment: 15 pages, 9 figures
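A minimal sketch of the two ingredients combined, though not the authors' exact discretization: an iterative linearization (here plain Picard iteration) carried out on Chebyshev series for an illustrative Volterra equation of the second kind, y(t) = 1 + ∫₀ᵗ y(s) ds, whose exact solution is eᵗ:

```python
import math
from numpy.polynomial.chebyshev import Chebyshev

# Illustrative Volterra equation of the second kind on [0, 1]:
#     y(t) = 1 + \int_0^t y(s) ds,   exact solution y(t) = exp(t).
# Each Picard step integrates the current Chebyshev representation exactly.
y = Chebyshev([1.0], domain=[0.0, 1.0])
for _ in range(30):
    y = 1.0 + y.integ(lbnd=0.0)

err = abs(y(1.0) - math.e)   # spectral accuracy: error at rounding level
```

Because the iterates are smooth, the Chebyshev coefficients decay geometrically, which is the spectral convergence the abstract refers to.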

    Optimization via Chebyshev Polynomials

    This paper presents for the first time a robust exact line-search method based on a full pseudospectral (PS) numerical scheme employing orthogonal polynomials. The proposed method adopts an adaptive search procedure and combines the superior accuracy of Chebyshev PS approximations with the high-order derivative approximations obtained through Chebyshev PS differentiation matrices (CPSDMs). In addition, the method exhibits a quadratic convergence rate by enforcing an adaptive Newton search iterative scheme. A rigorous error analysis of the proposed method is presented, along with a detailed set of pseudocodes for the established computational algorithms. Several numerical experiments are conducted on one- and multi-dimensional optimization test problems to illustrate the advantages of the proposed strategy. Comment: 26 pages, 6 figures, 2 tables
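The basic idea, sketched here under illustrative assumptions rather than as the paper's CPSDM algorithm: interpolate the line-search function by a Chebyshev series, differentiate the series exactly, and run a Newton iteration on the differentiated series. The test function φ(t) = eᵗ − 2t on [0, 2] (minimizer t* = ln 2) is an assumed example:

```python
import math
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Illustrative 1-D exact line search on phi(t) = exp(t) - 2t over [0, 2].
phi = lambda t: np.exp(t) - 2.0 * t
p = Chebyshev.interpolate(phi, 20, domain=[0.0, 2.0])
dp, d2p = p.deriv(), p.deriv(2)   # exact derivatives of the Chebyshev series

ts = np.linspace(0.0, 2.0, 201)
t_star = ts[np.argmin(p(ts))]     # coarse bracketing of the minimizer
for _ in range(8):
    t_star -= dp(t_star) / d2p(t_star)   # Newton iteration on phi'
```

The Newton step on the interpolated derivative is what yields the quadratic convergence rate mentioned in the abstract.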

    All-at-once preconditioning in PDE-constrained optimization

    The optimization of functions subject to partial differential equations (PDEs) plays an important role in many areas of science and industry. In this paper we introduce the basic concepts of PDE-constrained optimization and show how the all-at-once approach leads to linear systems in saddle-point form. We discuss implementation details and different boundary conditions, then show how these systems can be solved efficiently, and discuss methods and preconditioners also for the case when bound constraints on the control are introduced. Numerical results illustrate the competitiveness of our techniques
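The saddle-point structure can already be seen on a generic equality-constrained quadratic program; here the discretized PDE constraint is replaced by a small random constraint matrix, and all sizes and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 2                       # illustrative sizes: n unknowns, m constraints

Q = rng.standard_normal((n, n))
Q = Q @ Q.T + n * np.eye(n)       # SPD Hessian of the quadratic objective
B = rng.standard_normal((m, n))   # stand-in for the discretized PDE constraint
b = rng.standard_normal(m)
c = rng.standard_normal(n)

# All-at-once: solve for primal variables and Lagrange multipliers together.
K = np.block([[Q, B.T],
              [B, np.zeros((m, m))]])   # saddle-point (KKT) matrix
sol = np.linalg.solve(K, np.concatenate([c, b]))
x, lam = sol[:n], sol[n:]
```

For discretized PDE constraints the (2,2) block stays zero but K becomes large and sparse, which is why the preconditioned iterative solvers discussed in the paper are needed in place of a dense solve.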

    Numerical Approximations Using Chebyshev Polynomial Expansions

    We present numerical solutions of differential equations obtained by expanding the unknown function in terms of Chebyshev polynomials and solving a system of linear equations directly for the values of the function at the extrema (or zeros) of the Chebyshev polynomial of order N (El-gendi's method). The solutions are exact at these points, apart from round-off computer errors and the convergence of other numerical methods used in connection with solving the linear system of equations. Applications to initial value problems in time-dependent quantum field theory and second-order boundary value problems in fluid dynamics are presented. Comment: minor wording changes; some typos have been eliminated
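A minimal sketch of collocation at the Chebyshev extrema: the differentiation matrix follows the standard construction, and the test problem u'' = 2 on [−1, 1] with u(±1) = 0, exact solution u(x) = x² − 1, is an illustrative choice rather than one of the paper's applications:

```python
import numpy as np

def cheb(N):
    """Chebyshev extreme points and the first-derivative collocation matrix
    (standard construction for spectral collocation)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.ones(N + 1); c[0] = c[-1] = 2.0
    c *= (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))
    return D - np.diag(D.sum(axis=1)), x

# Illustrative boundary value problem: u'' = 2 on [-1, 1], u(+-1) = 0.
N = 16
D, x = cheb(N)
D2 = (D @ D)[1:-1, 1:-1]   # impose u(+-1) = 0 by dropping boundary rows/cols
u = np.zeros(N + 1)
u[1:-1] = np.linalg.solve(D2, 2.0 * np.ones(N - 1))
```

Since the exact solution is a polynomial of degree 2 and the differentiation matrix is exact on polynomials up to degree N, the computed values at the collocation points agree with x² − 1 up to round-off, matching the "exact at these points" claim above.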