Structure preserving Stochastic Impulse Methods for stiff Langevin systems with a uniform global error of order 1 or 1/2 on position
Impulse methods are generalized to a family of integrators for Langevin systems with quadratic stiff potentials and arbitrary soft potentials. Uniform error bounds (independent of the stiff parameters) are obtained on integrated positions, allowing for coarse integration steps. The resulting integrators are explicit and structure preserving (quasi-symplectic for Langevin systems).
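The splitting idea behind such integrators can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact scheme: it alternates a half impulse from the soft force, an exact rotation for the stiff harmonic part, and an Ornstein-Uhlenbeck update for friction and noise. All function and parameter names here are assumptions for illustration.

```python
import numpy as np

def impulse_langevin_step(q, p, dt, omega, soft_force, gamma, beta, rng):
    """One step of an impulse-type Langevin integrator (illustrative sketch):
    half impulse from the soft force, exact evolution of the stiff harmonic
    part with frequency omega, an Ornstein-Uhlenbeck friction/noise update
    at inverse temperature beta, then the remaining half impulse."""
    p = p + 0.5 * dt * soft_force(q)                 # soft-force half impulse
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    q, p = c * q + (s / omega) * p, -omega * s * q + c * p  # exact stiff rotation
    a = np.exp(-gamma * dt)                          # OU friction/noise update
    p = a * p + np.sqrt((1.0 - a**2) / beta) * rng.standard_normal(p.shape)
    p = p + 0.5 * dt * soft_force(q)                 # second half impulse
    return q, p
```

Because the stiff harmonic part is propagated exactly rather than resolved by small substeps, the step size dt is limited by the soft potential alone, which is what permits the coarse integration steps the abstract refers to.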
Averaging Schemes for Solving Fixed Point and Variational Inequality Problems
We develop and study averaging schemes for solving fixed point and variational inequality problems. Typically, researchers have established convergence results for solution methods for these problems by establishing contractive estimates for their algorithmic maps. In this paper, we establish global convergence results using nonexpansive estimates. After first establishing convergence for a general iterative scheme for computing fixed points, we consider applications to projection and relaxation algorithms for solving variational inequality problems and to a generalized steepest descent method for solving systems of equations. As part of our development, we also establish a new interpretation of a norm condition typically used for establishing convergence of linearization schemes, by associating it with a strong-f-monotonicity condition. We conclude by applying our results to transportation networks.
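The core mechanism, averaging an iterative map so that nonexpansiveness (rather than contraction) suffices for convergence, can be illustrated with a Krasnoselskii-Mann style iteration. The sketch below is an assumption-laden toy, not the paper's scheme: a plain rotation map is nonexpansive but its fixed-point iteration circles forever, while the averaged iteration converges to the fixed point.

```python
import numpy as np

def averaged_fixed_point(T, x0, steps=500, alpha=0.5):
    """Krasnoselskii-Mann averaging: x <- (1 - alpha) x + alpha T(x).
    For a nonexpansive map T with a fixed point, the averaged iterates
    converge, even when plain iteration x <- T(x) does not."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = (1.0 - alpha) * x + alpha * T(x)
    return x

# Rotation by 90 degrees: nonexpansive, unique fixed point at the origin.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
```

Plain iteration of R keeps the iterate on the unit circle indefinitely; the averaged map has spectral radius sqrt((1-alpha)^2 + alpha^2) < 1, so the averaged iterates spiral into the fixed point.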
Non-intrusive and structure preserving multiscale integration of stiff ODEs, SDEs and Hamiltonian systems with hidden slow dynamics via flow averaging
We introduce a new class of integrators for stiff ODEs as well as SDEs. These integrators are (i) {\it Multiscale}: they are based on flow averaging, so they do not fully resolve the fast variables and have a computational cost determined by the slow variables; (ii) {\it Versatile}: the method is based on averaging the flows of the given dynamical system (which may have hidden slow and fast processes) instead of averaging the instantaneous drift of assumed separated slow and fast processes, bypassing the need to identify the slow or fast variables explicitly (or numerically); (iii) {\it Non-intrusive}: a pre-existing numerical scheme resolving the microscopic time scale can be used as a black box and easily turned into one of the integrators in this paper by turning the large coefficients on over a microscopic timescale and off during a mesoscopic timescale; (iv) {\it Convergent over two scales}: strongly over slow processes and in the sense of measures over fast ones; we introduce the related notion of two-scale flow convergence and analyze the convergence of these integrators under the induced topology; (v) {\it Structure preserving}: for stiff Hamiltonian systems (possibly on manifolds), they can be made symplectic, time-reversible, and symmetry preserving (symmetries are group actions that leave the system invariant) in all variables. They are explicit and applicable to arbitrary stiff potentials (which need not be quadratic). Their application to the Fermi-Pasta-Ulam problem shows accuracy and stability over four orders of magnitude of time scales. For stiff Langevin equations, they are symmetry preserving, time-reversible, Boltzmann-Gibbs reversible, quasi-symplectic on all variables, and conformally symplectic with isotropic friction.
Comment: 69 pages, 21 figures
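The non-intrusive on/off mechanism described in item (iii) can be sketched directly: run a black-box micro-integrator with the stiff force switched on for a microscopic time tau, then with it switched off for the remainder of a mesoscopic step h. The code below is a schematic sketch under these assumptions; the micro-integrator shown (velocity Verlet) and all names are illustrative, not the paper's implementation.

```python
import numpy as np

def verlet(q, p, dt, force, substeps=20):
    """Black-box micro-integrator: velocity Verlet over dt in small substeps."""
    h = dt / substeps
    for _ in range(substeps):
        p = p + 0.5 * h * force(q)
        q = q + h * p
        p = p + 0.5 * h * force(q)
    return q, p

def flavor_step(q, p, h, tau, soft_force, stiff_force, micro_step):
    """One mesostep of a flow-averaging-style integrator (sketch):
    stiff force ON for a microscopic time tau, OFF for the rest of h.
    The micro_step scheme is used unmodified, as a black box."""
    q, p = micro_step(q, p, tau, lambda x: soft_force(x) + stiff_force(x))
    q, p = micro_step(q, p, h - tau, soft_force)
    return q, p
```

Only the force passed to the black box changes between the two phases, which is what makes the construction non-intrusive: any pre-existing scheme resolving the microscopic scale can play the role of micro_step.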
On the convergence of mirror descent beyond stochastic convex programming
In this paper, we examine the convergence of mirror descent in a class of
stochastic optimization problems that are not necessarily convex (or even
quasi-convex), and which we call variationally coherent. Since the standard
technique of "ergodic averaging" offers no tangible benefits beyond convex
programming, we focus directly on the algorithm's last generated sample (its
"last iterate"), and we show that it converges with probability 1 if the
underlying problem is coherent. We further consider a localized version of
variational coherence which ensures local convergence of stochastic mirror
descent (SMD) with high probability. These results contribute to the landscape
of non-convex stochastic optimization by showing that (quasi-)convexity is not
essential for convergence to a global minimum: rather, variational coherence, a
much weaker requirement, suffices. Finally, building on the above, we reveal an
interesting insight regarding the convergence speed of SMD: in problems with
sharp minima (such as generic linear programs or concave minimization
problems), SMD reaches a minimum point in a finite number of steps (a.s.), even
in the presence of persistent gradient noise. This result is to be contrasted
with existing black-box convergence rate estimates that are only asymptotic.
Comment: 30 pages, 5 figures
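A common concrete instance of stochastic mirror descent is the entropic mirror map on the probability simplex (exponentiated gradient). The sketch below uses that instance on a linear program over the simplex, whose minimum is sharp, so the last iterate concentrates on the minimizing vertex despite persistent gradient noise. Step size, noise level, and names are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def smd_simplex(grad, x0, steps=2000, eta=0.1, noise=0.01, rng=None):
    """Stochastic mirror descent with the entropic mirror map on the
    simplex: x <- x * exp(-eta * g) / sum(...), with g a noisy gradient.
    Returns the last iterate, not an ergodic average."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x) + noise * rng.standard_normal(x.shape)  # noisy gradient
        x = x * np.exp(-eta * g)                             # mirror step
        x = x / x.sum()                                      # back to simplex
    return x
```

For the linear objective f(x) = c . x with c = (1, 2, 3), the minimum over the simplex is the vertex e1 and is sharp, so the last iterate ends up essentially at e1, illustrating the finite-time convergence behavior the abstract describes for sharp minima.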