This paper focuses on the convergence problem of the emerging fractional order gradient descent method and proposes three solutions to overcome it. In fact, the general fractional gradient method cannot converge to the true extreme point of the objective function, which critically hampers its application. Because of the long memory characteristic of the fractional derivative, the fixed memory principle is a natural first choice. Apart from truncating the memory length, two new methods are developed to achieve convergence: one truncates the infinite series, and the other modifies the constant fractional order. Finally, six illustrative examples are presented to demonstrate the effectiveness and practicability of the proposed methods.
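
As a minimal numerical sketch of the kind of scheme discussed above, the snippet below implements a one-term truncation of the Caputo-series representation of the fractional derivative, with the previous iterate taken as the lower terminal. The toy objective, step size, order, and initial points are illustrative assumptions, not the paper's exact formulation.

```python
import math

def fractional_step(grad, x, x_prev, alpha, lr):
    """One fractional gradient descent update.

    Keeps only the first term of the Caputo series,
    proportional to f'(x) * |x - x_prev|^(1 - alpha) / Gamma(2 - alpha),
    with the previous iterate as the lower terminal -- an illustrative
    reading of the series-truncation idea, not the paper's exact scheme.
    """
    scale = abs(x - x_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return x - lr * grad(x) * scale

# Toy objective f(x) = (x - 3)^2 with true extreme point x* = 3 (assumed).
grad = lambda x: 2.0 * (x - 3.0)

x_prev, x = 0.0, 0.1  # assumed initial points
for _ in range(500):
    x_prev, x = x, fractional_step(grad, x, x_prev, alpha=0.7, lr=0.05)

print(x)  # expected to approach 3.0 under these assumptions
```

Because the step is scaled by |x - x_prev|^(1 - alpha), the update shrinks as the iterates settle, which is what allows this truncated variant to approach the true extreme point rather than a biased one.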