First-order optimization algorithms, often preferred for large problems,
require the gradient of the differentiable terms in the objective function.
These gradients often involve linear operators and their adjoints, which must
be applied rapidly. We consider two example problems and derive methods for
quickly evaluating the required adjoint operator. The first example is an image
deblurring problem, where we must efficiently compute the adjoint of
multi-stage wavelet reconstruction. Our formulation of the adjoint works for a
variety of boundary conditions, allowing it to generalize to a larger class of
problems. The second example is a blind channel estimation
problem taken from the optimization literature where we must compute the
adjoint of the convolution of two signals. In each example, we show how the
adjoint operator can be applied efficiently while leveraging existing software.

Comment: This manuscript is published in the IEEE Signal Processing Magazine, Volume 33, Issue 6, November 201
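To illustrate the kind of computation the second example involves, here is a minimal sketch (an illustration under our own assumptions, not the paper's derivation): for a fixed filter h, the forward operator A x = h * x (full linear convolution) has as its adjoint the cross-correlation of the input with h. The symbols h, x, and y below are illustrative. A standard way to check such a derivation is the dot-product (adjoint) test, <A x, y> = <x, A^T y>:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 3
x = rng.standard_normal(n)           # input signal
h = rng.standard_normal(m)           # fixed filter (illustrative)
y = rng.standard_normal(n + m - 1)   # test vector in the output space

# Forward operator: full linear convolution, A x = h * x
Ax = np.convolve(h, x)                   # length n + m - 1

# Adjoint operator: cross-correlation of y with h,
# i.e. (A^T y)[k] = sum_j h[j] * y[k + j]
Aty = np.correlate(y, h, mode='valid')   # length n

# Dot-product (adjoint) test: <A x, y> should equal <x, A^T y>
assert np.isclose(np.dot(Ax, y), np.dot(x, Aty))
```

The same dot-product test applies to any candidate adjoint, including the wavelet-reconstruction adjoint of the first example, and is a useful sanity check before plugging an operator into a first-order solver.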