When an m×n matrix is premultiplied by a product of n
Householder matrices the worst-case normwise rounding error bound
is proportional to mnu, where u is the unit roundoff. We
prove that this bound can be replaced by one proportional to
√(mn)u that holds with high probability if the rounding
errors are mean independent and of mean zero, under the
assumption that a certain bound holds with probability 1. The
proof makes use of a matrix concentration inequality. In
particular, this result applies to Householder QR factorization.
The same square rooting of the error constant applies to
two-sided transformations by Householder matrices and hence to
standard QR-type algorithms for computing eigenvalues and
singular values. It also applies to Givens QR factorization.
These results complement recent probabilistic rounding error
analysis results for inner-product based algorithms and show that
the square rooting effect is widespread in numerical linear
algebra. Our numerical experiments, which make use of a new
backward error formula for QR factorization, show that the
probabilistic bounds give a much better indicator of the actual
backward errors and their rate of growth than the worst-case
bounds.
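The contrast between the two bound shapes can be illustrated with a small numerical sketch (not the paper's experiment, and ignoring the constants in both bounds): compute a Householder QR factorization in double precision and compare the normwise backward error ||A − QR|| / ||A|| with the growth rates mnu and √(mn)u.

```python
import numpy as np

# Unit roundoff u for IEEE double precision (half the machine epsilon).
u = np.finfo(np.float64).eps / 2
rng = np.random.default_rng(0)

for m, n in [(100, 50), (500, 250), (2000, 1000)]:
    A = rng.standard_normal((m, n))
    # np.linalg.qr calls LAPACK's Householder-based QR factorization.
    Q, R = np.linalg.qr(A, mode="reduced")
    # Normwise backward error of the computed factorization.
    err = np.linalg.norm(A - Q @ R) / np.linalg.norm(A)
    print(f"m={m:5d} n={n:5d}  err={err:.2e}  "
          f"mn*u={m * n * u:.2e}  sqrt(mn)*u={np.sqrt(m * n) * u:.2e}")
```

For random matrices of these sizes the observed backward error sits far below mnu and tracks the √(mn)u shape much more closely, consistent with the abstract's claim that the probabilistic bounds better reflect the actual errors and their growth rate.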