We present a novel black box optimization algorithm called Hessian Estimation
Evolution Strategy. The algorithm updates the covariance matrix of its sampling
distribution by directly estimating the curvature of the objective function.
The algorithm design targets twice continuously differentiable problems. To
support this, we extend the cumulative step-size adaptation algorithm of
the CMA-ES to mirrored sampling. We demonstrate that our approach to covariance
matrix adaptation is efficient by evaluating it on the BBOB/COCO testbed. We
also show that the algorithm is surprisingly robust when its core assumption of
a twice continuously differentiable objective function is violated. The
approach yields a new evolution strategy with competitive performance, and at
the same time offers an interesting alternative to the usual covariance matrix
update mechanism.
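
The core idea of estimating curvature from mirrored samples can be sketched as
follows. This is a minimal illustration, not the paper's exact update rule: it
uses the standard symmetric finite-difference formula to estimate the second
directional derivative of the objective from a pair of mirrored evaluations
around the mean; the function names and test problem are chosen for
illustration only.

```python
import numpy as np

def curvature_estimate(f, m, d):
    """Estimate the second directional derivative of f at mean m along
    direction d, using the mirrored pair (m + d, m - d) together with f(m).
    For a quadratic objective this recovers the curvature d^T A d / ||d||^2
    exactly; for general twice-differentiable f it is accurate up to O(||d||^2)."""
    return (f(m + d) + f(m - d) - 2.0 * f(m)) / np.dot(d, d)

# Illustrative quadratic objective f(x) = 0.5 * x^T A x with known Hessian A.
A = np.diag([1.0, 9.0])
f = lambda x: 0.5 * x @ A @ x

m = np.zeros(2)                 # sampling mean
d = np.array([0.0, 0.5])        # mirrored offset along the second axis
print(curvature_estimate(f, m, d))  # exact for quadratics: 9.0
```

A covariance update along these lines would use such curvature estimates to
rescale the sampling distribution in the measured directions, which is the
mechanism the abstract refers to.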