64 research outputs found
Parallel Methods and Higher Dimensional NLS Equations
Alternating direction implicit (ADI) schemes are proposed for the solution of the two-dimensional coupled nonlinear Schrödinger
equation. These schemes are of second- and fourth-order accuracy in space
and second order in time. The resulting schemes in each ADI computation step correspond to a block tridiagonal system which can be solved
by using a one-dimensional block tridiagonal algorithm with a considerable
saving in computational time. These schemes are very well suited for parallel implementation on a high performance system with many processors
due to the nature of the computation that involves solving the same block
tridiagonal systems with many right hand sides. Numerical experiments
on a single-processor system are conducted to demonstrate the efficiency and
accuracy of these schemes by comparing them with the analytic solutions.
The results show that the proposed schemes give highly accurate results.
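The structure the abstract exploits, one block tridiagonal factorization reused across many right-hand sides, can be sketched with a block Thomas algorithm. The following NumPy version is an illustration only, not the authors' code; the function name `block_thomas` and the `(n, b, b)` block layout are assumptions made for the sketch:

```python
import numpy as np

def block_thomas(sub, diag, sup, rhs):
    """Solve a block tridiagonal system for many right-hand sides at once.

    sub, diag, sup: (n, b, b) arrays of sub-, main-, and super-diagonal
    blocks (sub[0] and sup[n-1] are unused).
    rhs: (n, b, k) array -- k right-hand sides solved in one sweep.
    Returns x of shape (n, b, k).
    """
    n = diag.shape[0]
    d = diag.copy()
    r = rhs.copy()
    # Forward elimination: remove the sub-diagonal blocks row by row.
    for i in range(1, n):
        m = sub[i] @ np.linalg.inv(d[i - 1])
        d[i] = d[i] - m @ sup[i - 1]
        r[i] = r[i] - m @ r[i - 1]
    # Back substitution, applied to all k right-hand sides simultaneously.
    x = np.empty_like(r)
    x[-1] = np.linalg.solve(d[-1], r[-1])
    for i in range(n - 2, -1, -1):
        x[i] = np.linalg.solve(d[i], r[i] - sup[i] @ x[i + 1])
    return x

# Demo: a random diagonally dominant system with k right-hand sides.
n, b, k = 6, 2, 3
rng = np.random.default_rng(0)
sub = rng.standard_normal((n, b, b))
sup = rng.standard_normal((n, b, b))
diag = rng.standard_normal((n, b, b)) + 10.0 * np.eye(b)
rhs = rng.standard_normal((n, b, k))
x = block_thomas(sub, diag, sup, rhs)
```

Because the elimination coefficients depend only on the matrix, the factorization cost is shared by all k right-hand sides, which is the property that makes the per-step work of an ADI sweep cheap and easy to parallelize across independent systems.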
Minimizing Communication for Eigenproblems and the Singular Value Decomposition
Algorithms have two costs: arithmetic and communication. The latter
represents the cost of moving data, either between levels of a memory
hierarchy, or between processors over a network. Communication often dominates
arithmetic and represents a rapidly increasing proportion of the total cost, so
we seek algorithms that minimize communication. In \cite{BDHS10} lower bounds
were presented on the amount of communication required for essentially all
$O(n^3)$-like algorithms for linear algebra, including eigenvalue problems and
the SVD. Conventional algorithms, including those currently implemented in
(Sca)LAPACK, perform asymptotically more communication than these lower bounds
require. In this paper we present parallel and sequential eigenvalue algorithms
(for pencils, nonsymmetric matrices, and symmetric matrices) and SVD algorithms
that do attain these lower bounds, and analyze their convergence and
communication costs. (43 pages, 11 figures.)
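For context (the abstract does not restate it), the bandwidth lower bound of \cite{BDHS10} for such algorithms on a machine with fast memory of size $M$ takes the general form

\[
\#\,\text{words moved} \;=\; \Omega\!\left(\frac{\#\,\text{flops}}{\sqrt{M}}\right),
\]

so a dense algorithm performing $\Theta(n^3)$ flops must move $\Omega\!\left(n^3/\sqrt{M}\right)$ words between memory levels; the algorithms in the paper are designed to attain this bound asymptotically.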
Introduction to Linear Algebra: Models, Methods, and Theory
This book develops linear algebra around matrices. Vector spaces in the abstract are not considered, only vector spaces associated with matrices. This book puts problem solving and an intuitive treatment of theory first, with a proof-oriented approach intended to come in a second course, the same way that calculus is taught. The book's organization is straightforward: Chapter 1 has introductory linear models; Chapter 2 has the basics of matrix algebra; Chapter 3 develops different ways to solve a system of equations; Chapter 4 has applications; and Chapter 5 has vector-space theory associated with matrices and related topics such as pseudoinverses and orthogonalization. Many linear algebra textbooks start immediately with Gaussian elimination, before any matrix algebra. Here we first pose problems in Chapter 1, then develop a mathematical language for representing and recasting the problems in Chapter 2, and then look at ways to solve the problems in Chapter 3, where four different solution methods are presented with an analysis of the strengths and weaknesses of each.