The q-gradient method for global optimization
The q-gradient is an extension of the classical gradient vector based on the
concept of Jackson's derivative. Here we introduce a preliminary version of the
q-gradient method for unconstrained global optimization. The main idea behind
our approach is the use of the negative of the q-gradient of the objective
function as the search direction. In this sense, the proposed method is a
generalization of the well-known steepest descent method. The use of Jackson's
derivative has proven to be an effective mechanism for escaping from local
minima. The q-gradient method is complemented with strategies for generating
the parameter q and computing the step length so that the search gradually
shifts from global exploration at the beginning to almost local search at the
end. To test this new approach, we considered six commonly used test
functions and compared our results with three Genetic Algorithms (GAs)
considered effective in optimizing multidimensional unimodal and multimodal
functions. For the multimodal test functions, the q-gradient method
outperformed the GAs, reaching the minimum with better accuracy and fewer
function evaluations.

Comment: 12 pages, 1 figure
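The core idea above (a steepest-descent-like iteration that follows the negative of the q-gradient built from Jackson's derivative) can be sketched as follows. This is a minimal illustrative version, not the authors' algorithm: it uses a fixed q and a fixed step length, whereas the paper pairs the method with strategies for varying q and the step length; the fallback to a classical finite difference near zero is also an assumption for numerical convenience.

```python
import numpy as np

def q_gradient(f, x, q=0.9, eps=1e-12):
    """Approximate the q-gradient of f at x via Jackson's derivative.

    Coordinate-wise, D_q f(x)_i = (f(..., q*x_i, ...) - f(x)) / ((q - 1) * x_i).
    Near x_i = 0 (or q = 1) the Jackson derivative reduces to the classical
    derivative, so we fall back to an ordinary finite difference there.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        xq = x.copy()
        if abs(x[i]) > eps and q != 1.0:
            xq[i] = q * x[i]
            g[i] = (f(xq) - fx) / ((q - 1.0) * x[i])
        else:
            h = 1e-6  # classical finite-difference fallback
            xq[i] = x[i] + h
            g[i] = (f(xq) - fx) / h
    return g

def q_gradient_descent(f, x0, q=0.9, step=0.1, iters=200):
    """Fixed-step iteration along the negative q-gradient (illustrative)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * q_gradient(f, x, q)
    return x

# Example: on the sphere function f(x) = sum(x_i^2), the Jackson derivative of
# x^2 is ((q*x)^2 - x^2) / ((q - 1) * x) = (q + 1) * x, so the iteration
# contracts toward the minimum at the origin.
x_min = q_gradient_descent(lambda v: float(np.sum(v**2)), [2.0, -3.0])
```

With q = 1 the q-gradient coincides with the classical gradient and the iteration becomes ordinary steepest descent; choosing q away from 1 perturbs the search direction, which is the mechanism the abstract credits for escaping local minima.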