The asymptotic behavior of stochastic gradient algorithms is studied. Relying
on results from differential geometry (the Lojasiewicz gradient inequality), it is demonstrated that the algorithm iterates converge to a single limit point, and
relatively tight bounds on the convergence rate are derived. In sharp contrast
relatively tight bounds on the convergence rate are derived. In sharp contrast
to the existing asymptotic results, the new results presented here allow the
objective function to have multiple and non-isolated minima. The results
also offer new insight into the asymptotic properties of several classes of
recursive algorithms that are routinely used in engineering, statistics,
machine learning, and operations research.
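
For concreteness, the recursions covered by results of this kind are typically of the generic stochastic gradient form sketched below; the notation ($\theta_n$ for the iterates, $\gamma_n$ for the step sizes, $f$ for the objective function, $\xi_n$ for zero-mean noise) is illustrative and not restated from the paper's own assumptions:
\begin{equation*}
  \theta_{n+1} = \theta_n - \gamma_n \bigl( \nabla f(\theta_n) + \xi_n \bigr),
  \qquad n \ge 0 .
\end{equation*}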