In this paper we consider objective functions that are perturbations of simple, smooth functions. The surface on the left in Figure 1, taken from [24], and the graph on the right illustrate this type of problem.

[Figure 1: Optimization Landscapes]

The perturbations may be the result of discontinuities or nonsmooth effects in the underlying models, randomness in the function evaluation, or experimental or measurement errors. Conventional gradient-based methods will be trapped in local minima even if the noise is smooth. Many classes of methods for noisy optimization problems are based on function information computed on sequences of simplices. The Nelder-Mead [18], multidirectional search [8], [21], and implicit filtering [12] methods are three examples. The performance of such methods can be explained in terms of the difference approximation of the gradient that is implicit in the function evaluations they perform.
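To make that implicit difference approximation concrete, the sketch below computes a forward simplex gradient: for a simplex with vertices $x_0, \dots, x_n$, it solves $V^T D f = \delta f$, where the columns of $V$ are the edges $x_j - x_0$ and $\delta f$ collects the differences $f(x_j) - f(x_0)$. This is a minimal sketch assuming the standard forward simplex gradient definition; the function names and the test objective are illustrative and are not taken from the paper.

```python
import numpy as np

def simplex_gradient(f, vertices):
    """Forward simplex gradient: solve V^T D = delta(f), where the
    columns of V are the edges x_j - x_0 and delta(f) holds the
    differences f(x_j) - f(x_0).  (Illustrative sketch.)"""
    x0 = vertices[0]
    V = np.column_stack([x - x0 for x in vertices[1:]])    # n x n edge matrix
    delta = np.array([f(x) - f(x0) for x in vertices[1:]])
    return np.linalg.solve(V.T, delta)                     # D = V^{-T} delta(f)

# A smooth function plus a small high-frequency perturbation,
# in the spirit of the landscapes in Figure 1.
def f(x):
    return float(np.dot(x, x) + 0.01 * np.sin(50.0 * np.linalg.norm(x)))

vertices = [np.zeros(2), np.array([0.1, 0.0]), np.array([0.0, 0.1])]
print(simplex_gradient(f, vertices))   # approximates grad f(x0) = 0
```

Loosely speaking, shrinking the simplex reduces the truncation error of this difference approximation but amplifies the contribution of the perturbation, and the methods cited above can be viewed as managing that trade-off through how they evolve their simplices.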