
    Recursive Trust-Region Methods for Multiscale Nonlinear Optimization


    Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training

    A class of multi-level algorithms for unconstrained nonlinear optimization is presented that does not require the evaluation of the objective function. The class contains the momentum-less AdaGrad method as a particular (single-level) instance. The choice to avoid evaluating the objective function is intended to make the algorithms of the class less sensitive to noise, while the multi-level feature aims at reducing their computational cost. The evaluation complexity of these algorithms is analyzed, and their behaviour in the presence of noise is illustrated in the context of training deep neural networks for supervised learning applications. (Comment: 29 pages, 4 figures)
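    To make the single-level base case concrete, here is a minimal sketch of the momentum-less AdaGrad update mentioned in the abstract. The function name, step size alpha, and stabilizer eps are illustrative assumptions, not the paper's notation; the key property shown is that the iteration uses only gradient evaluations, never objective-function values.

```python
import numpy as np

def adagrad_momentumless(grad, x0, alpha=0.1, eps=1e-8, iters=100):
    """Momentum-less AdaGrad: scale each gradient coordinate by the
    inverse square root of the accumulated squared gradients.
    No objective-function value is ever evaluated."""
    x = x0.astype(float).copy()
    g_accum = np.zeros_like(x)           # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)
        g_accum += g * g                 # per-coordinate accumulation
        x -= alpha * g / (np.sqrt(g_accum) + eps)
    return x

# Usage: minimize f(x) = ||x||^2 from its gradient alone.
x_min = adagrad_momentumless(lambda x: 2 * x, np.array([3.0, -4.0]))
```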

    Multilevel Minimization for Deep Residual Networks

    We present a new multilevel minimization framework for the training of deep residual networks (ResNets), which has the potential to significantly reduce training time and effort. Our framework is based on the dynamical systems viewpoint, which formulates a ResNet as the discretization of an initial value problem. The training process is then formulated as a time-dependent optimal control problem, which we discretize using different time-discretization parameters, eventually generating a multilevel hierarchy of auxiliary networks with different resolutions. The training of the original ResNet is then enhanced by training the auxiliary networks at reduced resolutions. By design, our framework is independent of the training strategy chosen on each level of the multilevel hierarchy. By means of numerical examples, we analyze the convergence behavior of the proposed method and demonstrate its robustness. For our examples we employ multilevel gradient-based methods. Comparisons with standard single-level methods show a speedup of more than a factor of three while achieving the same validation accuracy.
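    A hedged sketch of the dynamical-systems viewpoint described above: a ResNet forward pass written as forward Euler steps on a fixed time interval, where dropping every other layer yields a coarser auxiliary network with a correspondingly larger step size. The function names, tanh residual map, and two-level coarsening-by-skipping scheme are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def resnet_forward(x, weights, T=1.0):
    """Forward pass viewed as forward Euler on [0, T]:
    x_{k+1} = x_k + h * f(x_k, W_k), with h = T / (number of layers)."""
    h = T / len(weights)
    for W in weights:
        x = x + h * np.tanh(W @ x)       # residual update = one Euler step
    return x

def coarsen(weights):
    """Coarser auxiliary network: keep every other layer; since T is
    fixed, the step size h doubles automatically."""
    return weights[::2]

rng = np.random.default_rng(0)
fine = [rng.standard_normal((4, 4)) * 0.1 for _ in range(8)]  # 8-layer net
coarse = coarsen(fine)                                        # 4-layer net
x0 = rng.standard_normal(4)
print(resnet_forward(x0, fine), resnet_forward(x0, coarse))
```

    Because both networks discretize the same underlying initial value problem, training updates computed cheaply on the coarse network can be transferred to the fine one, which is the source of the reported speedup.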

    A Multi-Grid Iterative Method for Photoacoustic Tomography

    Inspired by recent advances in minimizing nonsmooth or bound-constrained convex functions using models of varying degrees of fidelity, we propose a line-search multigrid (MG) method for full-wave iterative image reconstruction in photoacoustic tomography (PAT) in heterogeneous media. To compute the search direction at each iteration, we choose between the gradient at the target level and an approximate error correction at a coarser level, based on predefined criteria. To incorporate absorption and dispersion, we derive the analytical adjoint directly from the first-order acoustic wave system. The effectiveness of the proposed method is tested on a total-variation-penalized iterative shrinkage-thresholding algorithm (ISTA) and its accelerated variant (FISTA), both of which have been used in many studies of image reconstruction in PAT. The results show the great potential of the proposed method for improving the speed of iterative image reconstruction.
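    A minimal sketch of the search-direction decision described above, under assumptions the abstract does not spell out: the threshold kappa, the first-order criterion ||R g|| >= kappa ||g|| commonly used in multilevel optimization, and the averaging restriction and piecewise-constant prolongation are all illustrative stand-ins for the paper's predefined criteria and transfer operators.

```python
import numpy as np

def mg_search_direction(g_fine, restrict, prolong, coarse_solve, kappa=0.5):
    """Choose between the fine-level negative gradient and a coarse-grid
    error correction, via the criterion ||R g|| >= kappa * ||g||."""
    g_coarse = restrict(g_fine)
    if np.linalg.norm(g_coarse) >= kappa * np.linalg.norm(g_fine):
        # Coarse model is informative: solve (approximately) on the
        # coarse level and prolongate the correction back up.
        return prolong(coarse_solve(g_coarse))
    # Otherwise fall back to plain fine-level steepest descent.
    return -g_fine

# Toy transfer operators on a 1-D grid of 8 points.
restrict = lambda g: 0.5 * (g[0::2] + g[1::2])   # averaging restriction
prolong = lambda e: np.repeat(e, 2)              # piecewise-constant prolongation
coarse_solve = lambda gc: -gc                    # one coarse gradient step
d = mg_search_direction(np.ones(8), restrict, prolong, coarse_solve)
```

    The returned direction d is then used inside a line search at the target level, whether it came from the fine gradient or from the cheaper coarse-level correction.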