Preconditioning for active set and projected gradient methods as semi-smooth Newton methods for PDE-constrained optimization with control constraints
Optimal control problems with partial differential equations play an important role in many applications. The inclusion of bound constraints for the control poses a significant additional challenge for optimization methods. In this paper we propose preconditioners for the saddle point problems that arise when a primal-dual active set method is used. We also show that the same saddle point system can be derived when the method is considered as a semi-smooth Newton method. In addition, the projected gradient method can be employed to solve optimization problems with simple bounds, and we discuss the efficient solution of the linear systems in question. When an acceleration technique is employed for the projected gradient method, this again yields a semi-smooth Newton method that is equivalent to the primal-dual active set method. Numerical results illustrate the competitiveness of this approach.
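A minimal sketch may make the active-set/semi-smooth-Newton equivalence concrete: the primal-dual active set iteration applied to a box-constrained quadratic program. Everything here (the matrix Q, the data f, the bounds lo and hi, the parameter c) is an illustrative stand-in for the discretized PDE-constrained problem, not the paper's implementation:

```python
# Primal-dual active set method for
#     min_u  0.5 * u^T Q u - f^T u   subject to  lo <= u <= hi,
# which coincides with a semi-smooth Newton method applied to the
# complementarity conditions (Hintermueller/Ito/Kunisch).
import numpy as np

def primal_dual_active_set(Q, f, lo, hi, c=1.0, max_iter=50, tol=1e-10):
    n = len(f)
    u = np.clip(np.linalg.solve(Q, f), lo, hi)   # feasible starting point
    lam = np.zeros(n)                            # multiplier for the bounds
    for _ in range(max_iter):
        # Predict the active sets from the current primal-dual pair.
        act_lo = lam + c * (u - lo) < 0
        act_hi = lam + c * (u - hi) > 0
        inact = ~(act_lo | act_hi)
        u_new = np.empty(n)
        u_new[act_lo] = lo[act_lo]
        u_new[act_hi] = hi[act_hi]
        # On the inactive set, solve the reduced (unconstrained) system.
        I = np.where(inact)[0]
        rhs = f[I] - Q[np.ix_(I, ~inact)] @ u_new[~inact]
        u_new[I] = np.linalg.solve(Q[np.ix_(I, I)], rhs)
        lam_new = f - Q @ u_new                  # stationarity gives the multiplier
        lam_new[inact] = 0.0
        if np.linalg.norm(u_new - u) < tol:
            return u_new, lam_new
        u, lam = u_new, lam_new
    return u, lam

# Tiny usage example on a random SPD system.
rng = np.random.default_rng(1)
n = 30
R = rng.standard_normal((n, n))
Q = R @ R.T + n * np.eye(n)
u, lam = primal_dual_active_set(Q, f=n * rng.standard_normal(n),
                                lo=-0.5 * np.ones(n), hi=0.5 * np.ones(n))
print("bounds respected:", np.all(u >= -0.5 - 1e-12) and np.all(u <= 0.5 + 1e-12))
```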
Stability Estimates and Structural Spectral Properties of Saddle Point Problems
For a general class of saddle point problems, sharp estimates for Babuška's inf-sup stability constants are derived in terms of the constants in Brezzi's theory. In the finite-dimensional Hermitian case, more detailed spectral properties of preconditioned saddle point matrices are presented, which are helpful for the convergence analysis of common Krylov subspace methods. The theoretical results are applied to two model problems from optimal control with time-periodic state equations. Numerical experiments with the preconditioned minimal residual method are reported.
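To make the setting concrete, here is a small sketch of preconditioned MINRES on a saddle point system with the classical block-diagonal preconditioner built from the (1,1) block and the Schur complement; the matrices are random placeholders, not the time-periodic control problems of the paper:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n, m = 40, 15
R = sp.random(n, n, density=0.2, random_state=0)
A = (R @ R.T + sp.identity(n)).tocsc()            # SPD (1,1) block
B = sp.hstack([sp.identity(m),                    # identity block ensures full row rank
               sp.random(m, n - m, density=0.3, random_state=1)]).tocsc()

K = sp.bmat([[A, B.T], [B, None]]).tocsc()        # symmetric saddle point matrix
rhs = rng.standard_normal(n + m)

# "Ideal" block-diagonal preconditioner diag(A, S) with the exact Schur
# complement S = B A^{-1} B^T. A classical spectral result (Murphy, Golub,
# Wathen) says the preconditioned matrix then has only the three eigenvalues
# 1 and (1 +- sqrt(5))/2, so MINRES converges in at most three steps; in
# practice both blocks are replaced by cheap spectrally equivalent
# approximations, and estimates as in the paper control the convergence.
A_solve = spla.factorized(A)
S_inv = np.linalg.inv((B @ spla.inv(A) @ B.T).toarray())

def apply_prec(r):
    return np.concatenate([A_solve(r[:n]), S_inv @ r[n:]])

M = spla.LinearOperator((n + m, n + m), matvec=apply_prec)
x, info = spla.minres(K, rhs, M=M)
print("residual norm:", np.linalg.norm(K @ x - rhs))
```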
Preconditioning of weighted H(div)-norm and applications to numerical simulation of highly heterogeneous media
In this paper we propose and analyze a preconditioner for a system arising from a finite element approximation of second order elliptic problems describing processes in highly heterogeneous media. Our approach uses the technique of multilevel methods and the recently proposed preconditioner based on additive Schur complement approximation by J. Kraus (see [8]). The main results are the design and a theoretical and numerical justification of an iterative method for such problems that is robust with respect to the contrast of the media, defined as the ratio between the maximum and minimum values of the coefficient (related to the permeability/conductivity).
Steklov Spectral Geometry for Extrinsic Shape Analysis
We propose using the Dirichlet-to-Neumann operator as an extrinsic alternative to the Laplacian for spectral geometry processing and shape analysis. Intrinsic approaches, usually based on the Laplace-Beltrami operator, cannot capture the spatial embedding of a shape up to rigid motion, and many previous extrinsic methods lack theoretical justification. Instead, we consider the Steklov eigenvalue problem, computing the spectrum of the Dirichlet-to-Neumann operator of a surface bounding a volume. A remarkable property of this operator is that it completely encodes volumetric geometry. We use the boundary element method (BEM) to discretize the operator, accelerated by hierarchical numerical schemes and preconditioning; this pipeline allows us to solve eigenvalue and linear problems on large-scale meshes despite the density of the Dirichlet-to-Neumann discretization. We further demonstrate that our operators naturally fit into existing frameworks for geometry processing, making a shift from intrinsic to extrinsic geometry as simple as substituting the Laplace-Beltrami operator with the Dirichlet-to-Neumann operator.
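The following toy sketch builds a discrete Dirichlet-to-Neumann operator as the Schur complement of a grid Laplacian onto the boundary nodes and computes its (Steklov) spectrum. The paper discretizes the operator with the boundary element method; this finite-difference construction is only a stand-in to make the operator concrete:

```python
import numpy as np
import scipy.sparse as sp

n = 30                                         # grid points per side
# 1D Neumann Laplacian stencil; its Kronecker sum is the graph Laplacian
# of the n-by-n grid, whose harmonic extensions drive the DtN map.
L1 = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)).tolil()
L1[0, 0] = L1[n - 1, n - 1] = 1.0
A = sp.kronsum(L1, L1).toarray()

# Classify grid nodes as boundary or interior.
idx = np.arange(n * n).reshape(n, n)
bdry = np.zeros((n, n), dtype=bool)
bdry[0, :] = bdry[-1, :] = bdry[:, 0] = bdry[:, -1] = True
b, i = idx[bdry], idx[~bdry]

# DtN as the Schur complement onto the boundary nodes: boundary data is
# harmonically extended into the interior and the normal flux is read off.
dtn = A[np.ix_(b, b)] - A[np.ix_(b, i)] @ np.linalg.solve(A[np.ix_(i, i)],
                                                          A[np.ix_(i, b)])
steklov = np.linalg.eigvalsh(dtn)              # dense, as in the BEM setting
print("smallest Steklov eigenvalues:", np.round(steklov[:5], 4))
```

Note that the smallest eigenvalue is (numerically) zero, corresponding to constant boundary data, and that the DtN matrix is dense even though the underlying Laplacian is sparse; this density is what motivates the hierarchical acceleration and preconditioning in the abstract.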
Fast solution of Cahn-Hilliard variational inequalities using implicit time discretization and finite elements
We consider the efficient solution of the Cahn-Hilliard variational inequality using an implicit time discretization, which is formulated as an optimal control problem with pointwise constraints on the control. By applying a semi-smooth Newton method combined with a Moreau-Yosida regularization technique for handling the control constraints, we show superlinear convergence in function space. At the heart of this method lies the solution of large and sparse linear systems, for which we propose the use of preconditioned Krylov subspace solvers with an effective Schur complement approximation. Numerical results illustrate the competitiveness of this approach.
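A compact sketch of the optimization kernel, assuming a generic quadratic model problem in place of the discretized Cahn-Hilliard system: a semi-smooth Newton iteration on the Moreau-Yosida regularized first-order conditions for a pointwise upper bound. All names (Q, f, psi, gamma) are illustrative:

```python
import numpy as np

def moreau_yosida_newton(Q, f, psi, gamma=1e6, max_iter=30, tol=1e-10):
    # Semi-smooth Newton for  Q u - f + gamma * max(0, u - psi) = 0,
    # the regularized optimality condition for the bound u <= psi.
    n = len(f)
    u = np.zeros(n)
    for it in range(max_iter):
        viol = u - psi
        F = Q @ u - f + gamma * np.maximum(viol, 0.0)
        if np.linalg.norm(F) < tol:
            return u, it
        # Generalized derivative of max(0, .): the indicator of violation.
        J = Q + gamma * np.diag((viol > 0).astype(float))
        # In the paper this Newton system is solved by a preconditioned
        # Krylov method with a Schur complement approximation.
        u = u - np.linalg.solve(J, F)
    return u, max_iter

# Tiny usage example with an SPD tridiagonal Q.
n = 50
Q = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))
u, iters = moreau_yosida_newton(Q, f=np.ones(n), psi=0.4 * np.ones(n))
print(iters, "Newton steps, max(u) =", u.max())
```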
Preconditioned Stochastic Gradient Langevin Dynamics for Deep Neural Networks
Effective training of deep neural networks suffers from two main issues. The first is that the parameter spaces of these models exhibit pathological curvature. Recent methods address this problem by using adaptive preconditioning for Stochastic Gradient Descent (SGD). These methods improve convergence by adapting to the local geometry of parameter space. A second issue is overfitting, which is typically addressed by early stopping. However, recent work has demonstrated that Bayesian model averaging mitigates this problem. The posterior can be sampled by using Stochastic Gradient Langevin Dynamics (SGLD). However, the rapidly changing curvature renders default SGLD methods inefficient. Here, we propose combining adaptive preconditioners with SGLD. In support of this idea, we give theoretical properties on asymptotic convergence and predictive risk. We also provide empirical results for Logistic Regression, Feedforward Neural Nets, and Convolutional Neural Nets, demonstrating that our preconditioned SGLD method gives state-of-the-art performance on these models.
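A compact NumPy sketch of the idea, assuming Bayesian logistic regression on synthetic data as the target: an RMSprop-style diagonal preconditioner inside the SGLD update. The hyperparameters are illustrative, and the correction term involving the derivative of the preconditioner is dropped, as is common in practice:

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

rng = np.random.default_rng(0)
N, d, batch = 2000, 10, 100
w_true = rng.standard_normal(d)
X = rng.standard_normal((N, d))
y = (rng.random(N) < expit(X @ w_true)).astype(float)

def grad_log_post(w, Xb, yb):
    # Minibatch estimate of the log-posterior gradient (standard normal
    # prior), rescaled to the full dataset size.
    return (N / len(yb)) * Xb.T @ (yb - expit(Xb @ w)) - w

w, V = np.zeros(d), np.zeros(d)
eps, alpha, lam = 5e-3, 0.99, 1e-5
for step in range(5000):
    idx = rng.choice(N, batch, replace=False)
    g = grad_log_post(w, X[idx], y[idx])
    V = alpha * V + (1 - alpha) * g * g          # running second-moment estimate
    G = 1.0 / (np.sqrt(V) + lam)                 # diagonal preconditioner
    # Preconditioned Langevin step: drift and noise share the metric G.
    w += 0.5 * eps * G * g + rng.standard_normal(d) * np.sqrt(eps * G)
print("sample minus true weights:", np.round(w - w_true, 2))
```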