Curvature Aligned Simplex Gradient: Principled Sample Set Construction For Numerical Differentiation
The simplex gradient, a numerical differentiation method popular for its
flexibility, lacks a principled way to construct the sample set, i.e., the
locations at which the function is evaluated. Such evaluations, especially
from real-world systems, are often noisy and expensive to obtain, making it
essential that each one is chosen carefully to reduce cost and increase
accuracy. This paper introduces the curvature-aligned simplex gradient (CASG),
which provably selects the optimal sample set under a mean-squared-error
objective. Because CASG requires function-dependent information that is often
unavailable in practice, we additionally introduce a framework that exploits
the history of function evaluations typically present in practical
applications. Our numerical results, focusing on applications in sensitivity
analysis and derivative-free optimization, show that our methodology matches
or significantly outperforms the benchmark forward-difference (FD) estimator,
even when FD is supplied with exact function-dependent information that is
unavailable in practice. Furthermore, our methodology is comparable in
performance to central differences (CD), which requires twice the number of
function evaluations.

Comment: 31 pages, 5 figures. Submitted to IMA Numerical Analysis.