Gradient Estimation Schemes for Noisy Functions

Abstract

In this paper we analyze different schemes for obtaining gradient estimates when the underlying function is noisy. Good gradient estimation is important, for example, for nonlinear programming solvers. As an error criterion we take the norm of the difference between the true and estimated gradients. This error can be split into a deterministic error and a stochastic error. For three finite-difference schemes and two Design of Experiments (DoE) schemes we analyze both the deterministic and the stochastic errors. We also derive optimal step sizes for each scheme, such that the total error is minimized. Some of the schemes have the nice property that this step size also minimizes the variance of the error. Based on these results we show that, to obtain good gradient estimates for noisy functions, it is worthwhile to use DoE schemes. We recommend implementing such schemes in NLP solvers.
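As a rough illustration of the ideas summarized above (a sketch, not code from the paper), the following Python compares a forward-difference, a central-difference, and a regression-based DoE gradient estimate on a noisy quadratic. The step-size rules h ~ sigma^(1/2) for forward differences and h ~ sigma^(1/3) for central differences are the standard heuristics for balancing the deterministic (truncation) and stochastic (noise) errors; the omitted constants and the choice of a two-level factorial design are illustrative assumptions, not the optima derived in the paper.

    import itertools
    import numpy as np

    def forward_diff_grad(f, x, h):
        """Forward-difference estimate: (f(x + h*e_i) - f(x)) / h."""
        n = len(x)
        g = np.zeros(n)
        fx = f(x)  # one (noisy) base evaluation, reused for all coordinates
        for i in range(n):
            e = np.zeros(n); e[i] = h
            g[i] = (f(x + e) - fx) / h
        return g

    def central_diff_grad(f, x, h):
        """Central-difference estimate: (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
        n = len(x)
        g = np.zeros(n)
        for i in range(n):
            e = np.zeros(n); e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g

    def doe_grad(f, x, h):
        """Least-squares gradient estimate from a two-level factorial design.
        Fitting a linear model over all design points averages out part of
        the stochastic error, which is the advantage of DoE schemes."""
        n = len(x)
        D = np.array(list(itertools.product([-1.0, 1.0], repeat=n)))  # 2^n sign vectors
        y = np.array([f(x + h * d) for d in D])
        A = np.hstack([np.ones((len(D), 1)), D])  # intercept + linear terms
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return beta[1:] / h

    # Noisy test function f(x) = x'x + noise, with true gradient 2x.
    rng = np.random.default_rng(0)
    sigma = 1e-4  # noise standard deviation (assumed known here)
    f = lambda x: x @ x + sigma * rng.standard_normal()

    x = np.array([1.0, -2.0])
    true_grad = 2 * x
    h_fd = np.sqrt(sigma)       # heuristic step size for forward differences
    h_cd = sigma ** (1 / 3)     # heuristic step size for central differences

    print("forward error:", np.linalg.norm(forward_diff_grad(f, x, h_fd) - true_grad))
    print("central error:", np.linalg.norm(central_diff_grad(f, x, h_cd) - true_grad))
    print("DoE     error:", np.linalg.norm(doe_grad(f, x, h_cd) - true_grad))

Running the script typically shows the DoE estimate with the smallest error norm, consistent with the abstract's recommendation, since its least-squares fit pools information from several noisy evaluations.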
