We propose a descent subgradient algorithm for minimizing a real-valued
function, assumed to be locally Lipschitz but not necessarily smooth or convex. To find
an effective descent direction, the Goldstein subdifferential is approximated
through an iterative process. The method employs a new two-point variant of
the Mifflin line search in which the subgradients can be arbitrary. Thus, the line
search procedure is easy to implement. Moreover, in comparison to bundle
methods, the quadratic subproblems have a simple structure, and the proposed
method requires no algorithmic modification to handle nonconvexity. We study
the global convergence of the method and prove that any accumulation point of
the generated sequence is Clarke stationary, assuming that the objective f is
weakly upper semismooth. We illustrate the efficiency and effectiveness of the
proposed algorithm on a collection of academic and semi-academic test problems.
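To make the general scheme concrete, the following is a minimal, hypothetical sketch of one method in this spirit: the Goldstein ε-subdifferential is approximated by gradients sampled in an ε-ball around the current iterate, the shortest element of the convex hull of these gradients supplies a descent direction, and a plain backtracking decrease test stands in for the two-point line search. All function names, parameters, and the test objective are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def f(x):
    # Nonsmooth convex test objective: f(x) = |x1| + 2|x2|, minimized at the origin.
    return abs(x[0]) + 2.0 * abs(x[1])

def grad_f(x):
    # A gradient of f where it is differentiable; an arbitrary subgradient on the kinks.
    s = np.sign(x)
    s[s == 0] = 1.0
    return s * np.array([1.0, 2.0])

def min_norm_in_hull(G, iters=500):
    # Frank-Wolfe iteration for the QP  min_{lam in simplex} 0.5 * ||G.T @ lam||^2,
    # i.e. the shortest element of the convex hull of the rows of G.
    m = G.shape[0]
    lam = np.full(m, 1.0 / m)
    for k in range(iters):
        g = G @ (G.T @ lam)          # gradient of the objective w.r.t. lam
        i = int(np.argmin(g))
        gamma = 2.0 / (k + 2.0)      # standard Frank-Wolfe step size
        lam *= 1.0 - gamma
        lam[i] += gamma
    return G.T @ lam

def descent_step(x, eps=0.1, m=20, c=0.1, rng=None):
    # One step: approximate the Goldstein eps-subdifferential by sampled gradients,
    # take the negative of its min-norm element as search direction, then backtrack.
    rng = rng if rng is not None else np.random.default_rng(0)
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.vstack([grad_f(p) for p in pts] + [grad_f(x)])
    v = min_norm_in_hull(G)
    nv = np.linalg.norm(v)
    if nv < 1e-8:
        return x                      # approximately Clarke stationary
    d = -v / nv
    t = 1.0
    while f(x + t * d) > f(x) - c * t * nv:   # sufficient-decrease backtracking
        t *= 0.5
        if t < 1e-12:
            return x                  # no acceptable step found
    return x + t * d

rng = np.random.default_rng(0)
x = np.array([1.0, 1.5])
for _ in range(30):
    x = descent_step(x, rng=rng)
```

Each accepted step enforces a strict decrease of `f`, so the iterates drift toward the origin until the sampled-gradient hull contains points of nearly opposite sign and the min-norm element collapses, signaling approximate stationarity at the scale ε.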