A multi-level preconditioned Krylov method for the efficient solution of algebraic tomographic reconstruction problems
Classical iterative methods for tomographic reconstruction include the class
of Algebraic Reconstruction Techniques (ART). Convergence of these stationary
linear iterative methods is, however, notably slow. In this paper we propose the
use of Krylov solvers for tomographic linear inversion problems. These advanced
iterative methods feature fast convergence at the expense of a higher
computational cost per iteration, causing them to be generally uncompetitive
without the inclusion of a suitable preconditioner. Combining elements from
standard multigrid (MG) solvers and the theory of wavelets, a novel
wavelet-based multi-level (WMG) preconditioner is introduced, which is shown to
significantly speed up Krylov convergence. The performance of the
WMG-preconditioned Krylov method is analyzed through a spectral analysis, and
the approach is compared to existing methods like the classical Simultaneous
Iterative Reconstruction Technique (SIRT) and unpreconditioned Krylov methods
on a 2D tomographic benchmark problem. Numerical experiments are promising,
showing the method to be competitive with the classical Algebraic
Reconstruction Techniques in terms of convergence speed and overall performance
(CPU time), as well as precision of the reconstruction.

Comment: Journal of Computational and Applied Mathematics (2014), 26 pages, 13 figures, 3 tables
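Since the abstract turns on the trade-off between per-iteration cost and convergence speed, a minimal sketch of a preconditioned Krylov iteration may help fix ideas. The code below is generic preconditioned conjugate gradients with a Jacobi placeholder preconditioner, not the WMG preconditioner or the tomographic system from the paper; the model problem and all names are illustrative.

```python
import numpy as np

def pcg(A, b, apply_prec, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients for SPD A; apply_prec(r)
    applies an approximation of the inverse preconditioner to r."""
    x = np.zeros_like(b)
    r = b.copy()
    z = apply_prec(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, maxit + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        z = apply_prec(r)
        rz, rz_old = r @ z, rz
        p = z + (rz / rz_old) * p
    return x, maxit

# Hypothetical SPD model problem (shifted 1D Laplacian); the Jacobi
# preconditioner here is only a placeholder for a true multilevel one.
n = 200
A = (np.diag(3.0 * np.ones(n))
     + np.diag(-1.0 * np.ones(n - 1), 1)
     + np.diag(-1.0 * np.ones(n - 1), -1))
b = np.ones(n)
d = np.diag(A)
x, iters = pcg(A, b, lambda r: r / d)
```

A better preconditioner enters only through `apply_prec`, which is exactly where a multilevel scheme would be plugged in.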
Restarted Hessenberg method for solving shifted nonsymmetric linear systems
It is known that the restarted full orthogonalization method (FOM)
outperforms the restarted generalized minimum residual (GMRES) method in
several circumstances for solving shifted linear systems when the shifts are
handled simultaneously. Many variants of them have been proposed to enhance
their performance. We show that another restarted method, the restarted
Hessenberg method [M. Heyouni, M\'ethode de Hessenberg G\'en\'eralis\'ee et
Applications, Ph.D. Thesis, Universit\'e des Sciences et Technologies de Lille,
France, 1996], based on the Hessenberg procedure, can also be employed
effectively and can accelerate the convergence rate with respect to the number
of restarts. Theoretical analysis shows that the residuals of the shifted
restarted Hessenberg method remain collinear with each other. Extensive
numerical experiments, including the recently popular application of solving
time fractional differential equations, show that the proposed algorithm often
requires noticeably less CPU time to converge than the earlier established
restarted shifted FOM, the weighted restarted shifted FOM, and other popular
shifted iterative solvers based on short-term vector recurrences.

Comment: 19 pages, 7 tables. Some corrections and updated references
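The property these shifted solvers exploit is that Krylov subspaces are shift-invariant, K_k(A + sigma*I, b) = K_k(A, b), so one decomposition serves every shift, and the FOM-type residuals for all shifts stay collinear. The sketch below uses the orthogonal Arnoldi process in place of the paper's non-orthogonal Hessenberg procedure; all names and the test matrix are illustrative.

```python
import numpy as np

def arnoldi(A, b, k):
    """Arnoldi process: builds orthonormal V with A V[:, :k] = V[:, :k+1] H."""
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H, beta

def fom_shifted(A, b, shifts, k):
    """FOM-type approximations to (A + sigma I) x = b for every shift, all
    extracted from ONE decomposition: the shift only moves the diagonal of
    the small projected matrix H."""
    V, H, beta = arnoldi(A, b, k)
    rhs = np.zeros(k)
    rhs[0] = beta
    return {s: V[:, :k] @ np.linalg.solve(H[:k, :k] + s * np.eye(k), rhs)
            for s in shifts}

# Hypothetical test matrix with well-separated positive eigenvalues.
rng = np.random.default_rng(0)
n = 60
A = np.diag(np.arange(1.0, n + 1)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
shifts = [0.5, 1.0, 4.0]
sols = fom_shifted(A, b, shifts, k=10)
# FOM residuals for all shifts are multiples of the same vector v_{k+1},
# i.e. collinear across shifts.
res = {s: b - (A + s * np.eye(n)) @ sols[s] for s in shifts}
```

Collinearity is what lets a restarted method restart all shifted systems from a single direction, which is the mechanism behind the analysis in the abstract.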
Implicit ODE solvers with good local error control for the transient analysis of Markov models
Obtaining the transient probability distribution vector of a continuous-time Markov chain (CTMC) using an implicit ordinary differential equation (ODE) solver tends to be advantageous in terms of run-time computational cost when the product of the maximum output rate of the CTMC and the largest time of interest is large. In this paper, we show that when applied to the transient analysis of CTMCs, many implicit ODE solvers are such that the linear systems involved in their steps can be solved by using iterative methods with strict control of the 1-norm of the error. This allows the development of implementations of those ODE solvers for the transient analysis of CTMCs that can be more efficient and more accurate than more standard implementations.

Peer reviewed. Postprint (published version).
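As a concrete illustration of the setting, here is a sketch of backward Euler (the simplest implicit ODE solver) applied to the Kolmogorov equation p'(t) = p(t) Q, with each linear system solved by Jacobi iteration stopped on the 1-norm of the residual. This is only a stand-in for the solvers and the strict 1-norm error control developed in the paper; the generator matrix and all names are hypothetical.

```python
import numpy as np

def backward_euler_ctmc(Q, p0, t_end, steps, tol=1e-12, maxit=10_000):
    """Transient CTMC analysis of p'(t) = p(t) Q with backward Euler.
    Each step solves (I - h Q^T) p_new^T = p_old^T; the matrix is strictly
    column diagonally dominant, so Jacobi iteration converges, and we stop
    on the 1-norm of the residual."""
    h = t_end / steps
    M = np.eye(len(p0)) - h * Q.T
    d = np.diag(M).copy()
    R = M - np.diag(d)                       # off-diagonal part
    p = p0.astype(float)
    for _ in range(steps):
        x = p.copy()                         # warm start from previous step
        for _ in range(maxit):
            x = (p - R @ x) / d              # Jacobi sweep
            if np.abs(M @ x - p).sum() <= tol:   # 1-norm residual control
                break
        p = x
    return p

# Hypothetical 3-state birth-death generator (rows sum to zero).
Q = np.array([[-2.0, 2.0, 0.0],
              [1.0, -3.0, 2.0],
              [0.0, 1.0, -1.0]])
p0 = np.array([1.0, 0.0, 0.0])
p = backward_euler_ctmc(Q, p0, t_end=5.0, steps=200)
```

Because the rows of Q sum to zero, each backward Euler step preserves total probability mass exactly (up to the solve tolerance), and the L-stable step drives the iterate toward the stationary distribution.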
A Bramble-Pasciak conjugate gradient method for discrete Stokes equations with random viscosity
We study the iterative solution of linear systems of equations arising from
stochastic Galerkin finite element discretizations of saddle point problems. We
focus on the Stokes model with random data parametrized by uniformly
distributed random variables and discuss well-posedness of the variational
formulations. We introduce a Bramble-Pasciak conjugate gradient method as a
linear solver. It builds on a non-standard inner product associated with a
block triangular preconditioner. The block triangular structure enables more
sophisticated preconditioners than the block diagonal structure usually applied
in MINRES methods. We show how the existence requirements of a conjugate
gradient method can be met in our setting. We analyze the performance of the
solvers depending on relevant physical and numerical parameters by means of
eigenvalue estimates. For this purpose, we derive bounds for the eigenvalues of
the relevant preconditioned sub-matrices. We illustrate our findings using the
flow in a driven cavity as a numerical test case, where the viscosity is given
by a truncated Karhunen-Lo\`eve expansion of a random field. In this example, a
Bramble-Pasciak conjugate gradient method with block triangular preconditioner
outperforms a MINRES method with block diagonal preconditioner in terms of
iteration numbers.

Comment: 19 pages, 1 figure, submitted to SIAM JU
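One reason block preconditioners for saddle-point systems are analyzed through eigenvalue bounds: with the ideal block diagonal preconditioner diag(A, S), where S = B A^{-1} B^T is the Schur complement, the preconditioned matrix has exactly the three eigenvalues 1 and (1 ± sqrt(5))/2. The toy system below reproduces this classical fact; it is a generic illustration, not the stochastic Galerkin Stokes system from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 12, 5
X = rng.standard_normal((n, n))
A = X @ X.T + n * np.eye(n)                       # SPD (1,1) block
B = rng.standard_normal((m, n))                   # full-rank constraint block

K = np.block([[A, B.T], [B, np.zeros((m, m))]])   # saddle-point matrix
S = B @ np.linalg.solve(A, B.T)                   # Schur complement B A^{-1} B^T
P = np.block([[A, np.zeros((n, m))],
              [np.zeros((m, n)), S]])             # ideal block diagonal preconditioner

# With the ideal preconditioner the spectrum consists of exactly
# 1, (1 + sqrt(5))/2 and (1 - sqrt(5))/2.
eigs = np.linalg.eigvals(np.linalg.solve(P, K)).real
```

In practice S must itself be approximated, which perturbs these clusters; bounds on that perturbation are what eigenvalue analyses like the one in the abstract provide.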
Multilevel iterative solvers for the edge finite element solution of the 3D Maxwell equation
In the edge vector finite element solution of the frequency domain Maxwell equations, the presence of a large kernel of the discrete rotor (curl) operator is known to ruin the convergence of standard iterative solvers. We extend the approach of [1] and, using domain decomposition ideas, construct a multilevel iterative solver in which the projection with respect to the kernel is combined with a hierarchical representation of the vector finite elements.
The new iterative scheme appears to be an efficient solver for the edge finite element solution of the frequency domain Maxwell equations. The solver can be seen as a variable preconditioner and, thus, accelerated by Krylov subspace techniques (e.g. GCR or FGMRES). We demonstrate the efficiency of our approach on a test problem with strong jumps in the conductivity.
[1] R. Hiptmair. Multigrid method for Maxwell's equations. SIAM J. Numer. Anal., 36(1):204-225, 1999
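Since the multilevel scheme acts as a variable preconditioner, the natural accelerators are flexible Krylov methods such as FGMRES, which keep the preconditioned directions explicitly because the preconditioner changes between iterations. A generic FGMRES sketch (not the authors' solver; the iteration-dependent Jacobi-like preconditioner is an artificial stand-in):

```python
import numpy as np

def fgmres(A, b, apply_prec, maxit=80, tol=1e-10):
    """Flexible GMRES: the preconditioner may differ at every iteration,
    so the preconditioned directions Z are kept alongside the basis V."""
    n = len(b)
    V = np.zeros((n, maxit + 1))
    Z = np.zeros((n, maxit))
    H = np.zeros((maxit + 1, maxit))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(maxit):
        Z[:, j] = apply_prec(V[:, j], j)   # j-th (possibly different) preconditioner
        w = A @ Z[:, j]
        for i in range(j + 1):             # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        e1 = np.zeros(j + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        if (np.linalg.norm(H[:j + 2, :j + 1] @ y - e1) <= tol * beta
                or H[j + 1, j] < 1e-14):
            return Z[:, :j + 1] @ y, j + 1
        V[:, j + 1] = w / H[j + 1, j]
    return Z[:, :maxit] @ y, maxit

# Hypothetical nonsymmetric, diagonally dominant test matrix; the varying
# Jacobi-like preconditioner below only illustrates the "flexible" mechanism.
n = 80
A = (np.diag(3.0 * np.ones(n))
     + np.diag(-1.0 * np.ones(n - 1), 1)
     + np.diag(-1.5 * np.ones(n - 1), -1))
b = np.ones(n)
d = np.diag(A)
x, its = fgmres(A, b, lambda v, j: v / (d + 0.05 * j))
```

Standard GMRES would reconstruct the solution from V alone, which is only valid for a fixed preconditioner; storing Z is what makes the variable preconditioner admissible.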