
    A Shift Selection Strategy for Parallel Shift-invert Spectrum Slicing in Symmetric Self-consistent Eigenvalue Computation

    © 2020 ACM. The central importance of large-scale eigenvalue problems in scientific computation necessitates the development of massively parallel algorithms for their solution. Recent advances in dense numerical linear algebra have enabled the routine treatment of eigenvalue problems with dimensions on the order of hundreds of thousands on the world's largest supercomputers. Where dense treatments are not feasible, Krylov subspace methods offer an attractive alternative because they do not require storage of the problem matrices. However, demonstrating the scalability of either class of eigenvalue algorithm on architectures capable of expressing massive parallelism is non-trivial, due to communication requirements and serial bottlenecks, respectively. In this work, we introduce the SISLICE method: a parallel shift-invert algorithm for the solution of the symmetric self-consistent field (SCF) eigenvalue problem. The SISLICE method drastically reduces the communication requirements of current parallel shift-invert eigenvalue algorithms through shift selection and shift migration techniques based on density of states estimation and k-means clustering, respectively. This work demonstrates the robustness and parallel performance of the SISLICE method on a representative set of SCF eigenvalue problems and outlines research directions to be explored in future work.
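    The abstract gives no implementation details, but the shift-selection idea it names can be illustrated briefly: cluster approximate eigenvalues (standing in for a density of states estimate, e.g. from a previous SCF iteration) with k-means so that shifts concentrate where the spectrum is dense. The function name select_shifts and the use of scipy.cluster.vq.kmeans2 are illustrative assumptions for this sketch, not the SISLICE code.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def select_shifts(approx_eigvals, n_shifts):
    # Cluster the 1-D approximate spectrum; each centroid becomes a shift,
    # so shifts land preferentially where the density of states is high.
    # Hypothetical helper illustrating the idea, not the SISLICE routine.
    data = np.asarray(approx_eigvals, dtype=float).reshape(-1, 1)
    centroids, _ = kmeans2(data, n_shifts, minit="++", seed=0)
    return np.sort(centroids.ravel())

# Synthetic spectrum with two dense regions
rng = np.random.default_rng(0)
spectrum = np.concatenate([rng.normal(-2.0, 0.3, 400),
                           rng.normal(1.5, 0.5, 200)])
print(select_shifts(spectrum, n_shifts=8))
```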

    Parallel symmetric eigenvalue problem solvers

    Sparse symmetric eigenvalue problems arise in many computational science and engineering applications: in structural mechanics, nanoelectronics, and spectral reordering, for example. The large size of these problems often requires the development of eigensolvers that scale well on parallel computing platforms. In this dissertation, we describe two such eigensolvers, TraceMin and TraceMin-Davidson. Unlike many other eigensolvers, these methods do not require accurate linear solves at each iteration in order to find the smallest eigenvalues and their associated eigenvectors. After introducing these closely related eigensolvers, we discuss alternative methods for solving the saddle point problems arising at each iteration, which can improve the overall running time. Additionally, we present TraceMin-Multisectioning, a new TraceMin implementation geared towards finding large numbers of eigenpairs in any given interval of the spectrum. We conclude with numerical experiments comparing our trace-minimization solvers to other popular eigensolvers (such as Krylov-Schur, LOBPCG, Jacobi-Davidson, and FEAST), establishing the competitiveness of our methods.
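    The dissertation itself is not reproduced here, but the multisectioning idea admits a short illustration: count the eigenvalues in each candidate subinterval from matrix inertia (Sylvester's law of inertia applied to an LDL^T factorization of A - sigma*I), so that each slice can be handed to an independent solver instance. The helpers count_eigs_below and multisection, and the equally spaced slice edges, are simplifying assumptions for this sketch, not the TraceMin-Multisectioning implementation.

```python
import numpy as np
from scipy.linalg import ldl

def count_eigs_below(A, sigma):
    # Inertia of A - sigma*I via LDL^T: by Sylvester's law of inertia, the
    # number of negative eigenvalues of the block-diagonal factor D equals
    # the number of eigenvalues of A below sigma.
    _, d, _ = ldl(A - sigma * np.eye(A.shape[0]))
    return int(np.sum(np.linalg.eigvalsh(d) < 0))

def multisection(A, a, b, n_slices):
    # Split [a, b] into subintervals and report how many eigenvalues fall in
    # each, so every slice can be treated independently.  Hypothetical helper
    # with equally spaced edges; a production code would adapt the edges to
    # balance the per-slice eigenvalue counts.
    edges = np.linspace(a, b, n_slices + 1)
    counts = [count_eigs_below(A, e) for e in edges]
    return [(edges[i], edges[i + 1], counts[i + 1] - counts[i])
            for i in range(n_slices)]

# Small symmetric test matrix
rng = np.random.default_rng(1)
M = rng.standard_normal((200, 200))
A = (M + M.T) / 2
for lo, hi, k in multisection(A, -5.0, 5.0, 4):
    print(f"[{lo:+.1f}, {hi:+.1f}) contains {k} eigenvalues")
```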