Backtracking linesearch is the de facto approach for minimizing continuously
differentiable functions with locally Lipschitz gradient. In recent years, it
has been shown that in the convex setting it is possible to avoid linesearch
altogether, and to allow the stepsize to adapt based on a local smoothness
estimate without any backtracks or evaluations of the function value. In this
work we propose an adaptive proximal gradient method, adaPG, that uses novel
estimates of the local smoothness modulus, leading to less conservative
stepsize updates, and that can additionally cope with nonsmooth terms. This idea
is extended to the primal-dual setting, where an adaptive three-term primal-dual
algorithm, adaPD, is proposed that can be viewed as an extension of the PDHG
method. Moreover, in this setting the ``essentially'' fully adaptive variant
adaPD+ is proposed that avoids evaluating the linear operator norm by
invoking a backtracking procedure that, remarkably, does not require extra
gradient evaluations. Numerical simulations demonstrate the effectiveness of
the proposed algorithms compared to the state of the art.
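
To illustrate the linesearch-free adaptation described above, a minimal sketch of a generic adaptive proximal gradient iteration is given below. The curvature estimate $\ell_k$ and the simple rule $\gamma_{k+1} \approx 1/\ell_k$ are illustrative assumptions only; the actual update rules of adaPG are different and less conservative.

\begin{align*}
  \ell_k &= \frac{\|\nabla f(x^k) - \nabla f(x^{k-1})\|}{\|x^k - x^{k-1}\|}
  && \text{(local estimate of the smoothness modulus)} \\
  \gamma_{k+1} &\approx 1/\ell_k \quad \text{(with safeguards on its growth; illustrative rule, not the adaPG update)} \\
  x^{k+1} &= \operatorname{prox}_{\gamma_{k+1} g}\!\bigl(x^k - \gamma_{k+1}\nabla f(x^k)\bigr)
  && \text{(proximal gradient step handling the nonsmooth term } g\text{)}
\end{align*}

Note that the iteration uses only quantities already computed in the gradient step, so the stepsize adapts without backtracks or evaluations of the function value.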