Abstract. A variant of the preconditioned conjugate gradient method for solving generalized least squares problems is presented. For the problem min_x (Ax − b)^T W^{-1} (Ax − b), with A ∈ R^{m×n} and W ∈ R^{m×m} symmetric and positive definite, the method requires only a preconditioner A_1 ∈ R^{n×n}; it needs neither the inverse of W nor the inverse of any of its submatrices. Freund's comparison result for ordinary least squares problems is extended to generalized least squares problems. An error bound is also given.
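To make the problem statement concrete, the sketch below sets up a small generalized least squares instance and checks that the minimizer of (Ax − b)^T W^{-1} (Ax − b) can be obtained from the standard augmented (saddle-point) system, which avoids forming W^{-1} explicitly. This is only an illustration of the problem and of one well-known W^{-1}-free formulation; it is not the preconditioned conjugate gradient variant the abstract refers to, and the dimensions and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3  # hypothetical problem sizes with m > n
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Build a symmetric positive definite weight matrix W.
M = rng.standard_normal((m, m))
W = M @ M.T + m * np.eye(m)

# Reference solution via the weighted normal equations,
# which do require W^{-1}:  A^T W^{-1} A x = A^T W^{-1} b.
Winv = np.linalg.inv(W)
x_ref = np.linalg.solve(A.T @ Winv @ A, A.T @ Winv @ b)

# Augmented-system formulation: introducing v = W^{-1}(b - Ax),
# the optimality conditions become
#   [ W    A ] [v]   [b]
#   [ A^T  0 ] [x] = [0],
# which involves W itself but never W^{-1}.
K = np.block([[W, A], [A.T, np.zeros((n, n))]])
rhs = np.concatenate([b, np.zeros(n)])
sol = np.linalg.solve(K, rhs)
x_aug = sol[m:]

assert np.allclose(x_ref, x_aug)
```

The augmented system is one of several ways to sidestep W^{-1}; the abstract's method instead works within a preconditioned conjugate gradient iteration using only an n × n preconditioner A_1.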