The conjugate gradient (CG) method is commonly used for the rapid solution of least squares problems. In image reconstruction, the problem can be ill-posed and the data contaminated by noise; consequently, regularization is typically required. Total variation (TV) is a useful regularization penalty, frequently used in image reconstruction to produce images with sharp edges. When a non-quadratic penalty such as TV is selected for regularization, it is no longer possible to use CG directly. Non-linear CG is an alternative, but it does not share the efficiency that CG exhibits on least squares problems, and methods such as the fast iterative shrinkage-thresholding algorithm (FISTA) are therefore preferred for problems involving the TV norm. A different approach to
including prior information is superiorization. In this paper it is shown that
the conjugate gradient method can be superiorized. Five different CG variants
are proposed, including preconditioned CG. The CG methods superiorized by the
total variation norm are presented and their performance in image
reconstruction is demonstrated. It is illustrated that some of the proposed variants of the superiorized CG method can produce reconstructions of superior quality to those produced by FISTA, and in less computational time, owing to the speed of the original CG method for least squares problems.
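For reference, a minimal sketch of the problem setting discussed above, assuming the standard notation $x$ for the image, $A$ for the system matrix, $b$ for the measured data, and $\lambda$ for a regularization weight (these symbols are illustrative and not taken from the paper):
\[
\min_{x} \; \tfrac{1}{2}\,\lVert A x - b \rVert_2^2 \;+\; \lambda \,\mathrm{TV}(x).
\]
Plain CG addresses only the quadratic data-fidelity term; the non-quadratic TV term is what motivates non-linear CG, FISTA, or, as in this paper, superiorization.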