Accelerated graph-based nonlinear denoising filters
Denoising filters, such as bilateral, guided, and total variation filters,
applied to images on general graphs may require repeated application if noise
is not small enough. We formulate two techniques for accelerating the resulting
iterations: the conjugate gradient method and Nesterov's acceleration. We
numerically demonstrate the efficiency of the accelerated nonlinear filters for
image denoising, with a 2-12x speed-up: the acceleration techniques reduce the
number of iterations required to reach a given peak signal-to-noise ratio
(PSNR) by a factor of 2 to 12.
Comment: 10 pages, 6 figures, to appear in Procedia Computer Science, vol. 80,
2016, International Conference on Computational Science, San Diego, CA, USA,
June 6-8, 2016
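Nesterov-style momentum wrapped around a fixed-point filter iteration can be sketched as follows. This is a generic sketch, not the authors' implementation: `filter_step` is a hypothetical stand-in for one pass of any denoising filter (bilateral, guided, ...), and the momentum schedule is the standard one.

```python
import numpy as np

def nesterov_filter(x0, filter_step, n_iter=50):
    """Nesterov-accelerated fixed-point iteration x <- F(x).

    `filter_step` is a placeholder for one application of a
    (graph-based) denoising filter; any contraction-like map works.
    """
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x = filter_step(y)  # one filter pass on the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        # momentum extrapolation between successive filter outputs
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev
```

As a toy check, with the linear contraction `F(x) = 0.5*x + 1` (fixed point 2), the accelerated iteration converges to the fixed point.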
Composing Scalable Nonlinear Algebraic Solvers
Most efficient linear solvers use composable algorithmic components, with the
most common model being the combination of a Krylov accelerator and one or more
preconditioners. A similar set of concepts may be used for nonlinear algebraic
systems, where nonlinear composition of different nonlinear solvers may
significantly improve the time to solution. We describe the basic concepts of
nonlinear composition and preconditioning and present a number of solvers
applicable to nonlinear partial differential equations. We have developed a
software framework in order to easily explore the possible combinations of
solvers. We show that the performance gains from using composed solvers can be
substantial compared with those from standard Newton-Krylov methods.
Comment: 29 pages, 14 figures, 13 tables
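A minimal sketch of multiplicative nonlinear composition, in the spirit described above: a cheap Picard (fixed-point) sweep is interleaved with Newton steps, so the inner solver "preconditions" the outer one. The function names and the toy problem below are illustrative assumptions, not the paper's software framework.

```python
import numpy as np

def newton_step(F, J, x):
    """One Newton step for F(x) = 0 with explicit Jacobian J."""
    return x - np.linalg.solve(J(x), F(x))

def picard_step(G, x):
    """One Picard (fixed-point) step x <- G(x)."""
    return G(x)

def composed_solve(F, J, G, x, n_iter=15):
    """Multiplicative composition: each outer Newton step is preceded
    by a cheap inner Picard sweep sharing the same fixed point."""
    for _ in range(n_iter):
        x = picard_step(G, x)    # inner: inexpensive global progress
        x = newton_step(F, J, x)  # outer: fast local convergence
    return x
```

For example, with `F(x) = x^2 - 2` and the damped fixed-point map `G(x) = x - 0.1*(x^2 - 2)` (both vanish at sqrt(2)), the composed iteration converges to sqrt(2).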
Objective acceleration for unconstrained optimization
Acceleration schemes can dramatically improve existing optimization
procedures. In most of the work on these schemes, such as nonlinear Generalized
Minimal Residual (N-GMRES), acceleration is based on minimizing the $\ell_2$
norm of some target on subspaces of $\mathbb{R}^n$. There are many numerical
examples that show how accelerating general purpose and domain-specific
optimizers with N-GMRES results in large improvements. We propose a natural
modification to N-GMRES, which significantly improves the performance in a
testing environment originally used to advocate N-GMRES. Our proposed approach,
which we refer to as O-ACCEL (Objective Acceleration), is novel in that it
minimizes an approximation to the \emph{objective function} on subspaces of
$\mathbb{R}^n$. We prove that O-ACCEL reduces to the Full Orthogonalization
Method for linear systems when the objective is quadratic, which differentiates
our proposed approach from existing acceleration methods. Comparisons with
L-BFGS and N-CG indicate the competitiveness of O-ACCEL. As it can be combined
with domain-specific optimizers, it may also be beneficial in areas where
L-BFGS or N-CG are not suitable.
Comment: 18 pages, 6 figures, 5 tables
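A least-squares subspace acceleration step in the spirit of N-GMRES can be sketched as follows; O-ACCEL instead minimizes a model of the objective itself over the same kind of subspace. All names here are illustrative, and the gradient-difference linear model is an assumption of the sketch (it is exact for quadratics).

```python
import numpy as np

def ngmres_accelerate(xs, gs):
    """One N-GMRES-style acceleration step.

    xs, gs: lists of recent iterates and their gradients (newest last).
    Picks coefficients a so that x = x_k + sum_i a_i (x_k - x_i)
    minimizes the norm of the linearized gradient.
    """
    x_k, g_k = xs[-1], gs[-1]
    # Gradient differences model how the gradient changes along each
    # stored direction; exact when the objective is quadratic.
    D = np.stack([g_k - g for g in gs[:-1]], axis=1)
    a, *_ = np.linalg.lstsq(D, -g_k, rcond=None)
    P = np.stack([x_k - x for x in xs[:-1]], axis=1)
    return x_k + P @ a
```

On a quadratic `f(x) = 0.5 x^T A x - b^T x`, once the stored directions span the space, a single acceleration step recovers the exact minimizer `A^{-1} b` — consistent with the reduction to a Krylov method for quadratics noted above.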
Regularized Nonlinear Acceleration
We describe a convergence acceleration technique for unconstrained
optimization problems. Our scheme computes estimates of the optimum from a
nonlinear average of the iterates produced by any optimization method. The
weights in this average are computed via a simple linear system, whose solution
can be updated online. This acceleration scheme runs in parallel to the base
algorithm, providing improved estimates of the solution on the fly, while the
original optimization method is running. Numerical experiments are detailed on
classical classification problems.
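The nonlinear averaging step can be sketched as follows, assuming the standard regularized-extrapolation form: the weights come from one small regularized linear system built from differences of recent iterates, so the scheme only needs the iterates the base method already produces.

```python
import numpy as np

def rna(xs, lam=1e-8):
    """Regularized nonlinear acceleration (sketch).

    xs: iterates x_0..x_k produced by any base method (newest last).
    Returns a weighted average sum_i c_i x_i whose weights solve a
    small regularized linear system and sum to one.
    """
    X = np.stack(xs, axis=1)       # d x (k+1) matrix of iterates
    R = X[:, 1:] - X[:, :-1]       # residuals r_i = x_{i+1} - x_i
    RR = R.T @ R
    # normalize, then regularize so the singular system is solvable
    RR = RR / np.linalg.norm(RR) + lam * np.eye(RR.shape[0])
    z = np.linalg.solve(RR, np.ones(RR.shape[0]))
    c = z / z.sum()                # weights constrained to sum to one
    return X[:, :-1] @ c
```

As a toy check, for a linear fixed-point iteration `x <- M x + b` the extrapolation lands near the fixed point well before the plain iteration does.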