In this paper we present a novel randomized block coordinate descent method
for the minimization of a convex composite objective function. The method uses
(approximate) partial second-order (curvature) information, so that the
algorithm's performance is more robust when applied to highly nonseparable or
ill-conditioned problems. We call the method Robust Coordinate Descent (RCD). At
each iteration of RCD, a block of coordinates is sampled randomly, a quadratic
model is formed about that block, and the model is minimized inexactly to
determine the search direction. An inexpensive line
search is then employed to ensure a monotonic decrease in the objective
function while allowing the acceptance of large step sizes. We prove global convergence of the
RCD algorithm, and we also present several results on the local convergence of
RCD for strongly convex functions. Finally, we present numerical results on
large-scale problems to demonstrate the practical performance of the method.
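For illustration, the following is a minimal sketch of the iteration described above: a block of coordinates is sampled at random, a quadratic model is built from partial curvature information, the model is minimized to obtain a search direction, and a backtracking line search enforces monotone decrease. It assumes a smooth convex objective with hypothetical helper functions grad and block_hess, solves the block model directly as a stand-in for an inexact inner solver, and omits the nonsmooth part of the composite objective; it is not the authors' implementation.

```python
import numpy as np

def rcd(f, grad, block_hess, x0, n_blocks, max_iter=1000, tol=1e-8,
        c1=1e-4, backtrack=0.5):
    """Illustrative randomized block coordinate descent with partial
    second-order information and a backtracking line search.

    f          -- objective function (smooth convex part only)
    grad       -- full gradient of f
    block_hess -- block_hess(x, idx): approximate Hessian of f restricted
                  to the coordinates in idx (hypothetical helper)
    """
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(max_iter):
        idx = blocks[np.random.randint(n_blocks)]   # sample a block at random
        g = grad(x)[idx]                            # partial gradient
        if np.linalg.norm(g) < tol:
            break
        H = block_hess(x, idx)                      # partial curvature
        # Minimize the block model m(d) = g'd + 0.5 d'Hd; a direct solve
        # stands in for an approximate/inexact inner minimization.
        d = -np.linalg.solve(H + 1e-10 * np.eye(idx.size), g)
        # Backtracking (Armijo) line search: monotone decrease, and the
        # unit step is tried first so large steps can be accepted.
        t, fx = 1.0, f(x)
        for _ in range(50):                         # cap the backtracking
            x_trial = x.copy()
            x_trial[idx] += t * d
            if f(x_trial) <= fx + c1 * t * g.dot(d):
                break
            t *= backtrack
        x = x_trial
    return x

# Usage on a strongly convex quadratic 0.5 x'Ax - b'x (synthetic data).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M.T @ M + np.eye(50)
b = rng.standard_normal(50)
x_star = rcd(lambda x: 0.5 * x @ A @ x - b @ x,
             lambda x: A @ x - b,
             lambda x, idx: A[np.ix_(idx, idx)],
             np.zeros(50), n_blocks=5)
```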