The standard worst-case normwise backward error bound for Householder QR factorization of an m×n matrix is proportional to mnu, where u is the unit roundoff. We prove that the bound can be replaced by one proportional to √(mn)u that holds with high probability if the rounding errors are mean independent and of mean zero and if the normwise backward errors in applying a sequence of m×m Householder matrices to a vector satisfy bounds proportional to √m u with probability 1. The proof makes use of a matrix concentration inequality. The same square rooting of the error constant applies to two-sided transformations by Householder matrices and hence to standard QR-type algorithms for computing eigenvalues and singular values. It also applies to Givens QR factorization. These results complement recent probabilistic rounding error analysis results for inner-product-based algorithms and show that the square rooting effect is widespread in numerical linear algebra. Our numerical experiments, which make use of a new backward error formula for QR factorization, show that the probabilistic bounds give a much better indicator of the actual backward errors and their rate of growth than the worst-case bounds.
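The gap between the worst-case mnu bound and the probabilistic √(mn)u bound can be illustrated with a small experiment. The sketch below is not from the paper: it uses the Frobenius-norm residual ‖A − Q̂R̂‖/‖A‖ as a simple proxy for the normwise backward error (the paper's new backward error formula is not reproduced here), drops constant factors, and fixes an arbitrary aspect ratio m = 2n. It factorizes random single-precision matrices with Householder QR and prints the residual alongside the two bound shapes.

```python
import numpy as np

u = np.finfo(np.float32).eps / 2        # unit roundoff for single precision (2^-24)

rng = np.random.default_rng(0)
for n in (100, 400, 1600):
    m = 2 * n                            # illustrative aspect ratio (assumption)
    A = rng.standard_normal((m, n)).astype(np.float32)
    Q, R = np.linalg.qr(A)               # Householder QR, computed in single precision
    # Evaluate the residual in double precision so it reflects the factorization
    # error rather than the rounding error of forming Q @ R.
    Ad, Qd, Rd = (X.astype(np.float64) for X in (A, Q, R))
    res = np.linalg.norm(Ad - Qd @ Rd) / np.linalg.norm(Ad)
    print(f"m={m:5d} n={n:5d}  residual {res:.2e}  "
          f"mnu {m * n * u:.2e}  sqrt(mn)u {np.sqrt(m * n) * u:.2e}")
```

As m and n grow, the measured residual tends to track the √(mn)u column far more closely than the mnu column, which is the qualitative behaviour the abstract describes.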