33 research outputs found

    Suboptimal solutions to network team optimization problems

    Smoothness of the solutions to network team optimization problems with statistical information structure is investigated. Suboptimal solutions expressed as linear combinations of elements from sets of basis functions containing adjustable parameters are considered. Estimates of their accuracy are derived for basis functions represented by sinusoids with variable frequencies and phases, and by Gaussians with variable centers and widths.
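
    As a concrete illustration of such a variable-basis scheme (not the paper's own construction), the sketch below fits a linear combination of Gaussian units with adjustable centers and widths plus sinusoidal units with adjustable frequencies and phases to a sample target; the target function, unit count, and optimizer are assumptions made for the example.

```python
# A minimal sketch of a variable-basis approximant: a linear combination of
# Gaussians with adjustable centers/widths and sinusoids with adjustable
# frequencies/phases, fitted to a sample target. All names and the target
# function are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
target = np.exp(-x**2) * np.cos(3.0 * x)           # hypothetical function to approximate

K = 4  # number of Gaussian units and of sinusoidal units

def model(theta, x):
    a, c, s, b, w, p = np.split(theta, 6)          # K parameters per group
    gauss = (a[:, None] * np.exp(-(x - c[:, None])**2 / (2 * s[:, None]**2))).sum(0)
    sines = (b[:, None] * np.sin(w[:, None] * x + p[:, None])).sum(0)
    return gauss + sines

def residuals(theta):
    return model(theta, x) - target

theta0 = rng.standard_normal(6 * K) * 0.5
theta0[2*K:3*K] = 1.0                              # start the widths away from zero
fit = least_squares(residuals, theta0)
print("max abs error:", np.abs(model(fit.x, x) - target).max())
```

    Because the inner parameters (centers, widths, frequencies, phases) enter nonlinearly, the fit is a nonlinear least-squares problem rather than a linear one; this is precisely what distinguishes variable-basis from fixed-basis schemes.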

    Proximal boosting and its acceleration

    Gradient boosting is a prediction method that iteratively combines weak learners to produce a complex and accurate model. From an optimization point of view, the learning procedure of gradient boosting mimics a gradient descent on a functional variable. This paper builds upon the proximal point algorithm, applicable when the empirical risk to minimize is not differentiable, to introduce a novel boosting approach called proximal boosting. Besides being motivated by non-differentiable optimization, the proposed algorithm benefits from Nesterov's acceleration in the same way as gradient boosting [Biau et al., 2018], leading to a variant called accelerated proximal boosting. The advantages of leveraging proximal methods for boosting are illustrated by numerical experiments on simulated and real-world data. In particular, we exhibit a favorable comparison over gradient boosting regarding convergence rate and prediction accuracy.
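
    To make the proximal idea concrete, here is a toy rendering (not the authors' exact algorithm) for the non-differentiable absolute loss, whose pointwise proximal map is a soft-thresholding toward the label. Each round fits a regression stump to the proximal displacement instead of a gradient; the step sizes, tree depth, round count, and synthetic data are assumptions, and Nesterov's acceleration is omitted for brevity.

```python
# A toy sketch of the proximal-boosting idea for the (non-differentiable)
# absolute loss. At each round, the current prediction F(x_i) is moved by a
# proximal step of the loss (soft-thresholding toward y_i), and a weak learner
# is fitted to that displacement. Hyperparameters are arbitrary choices.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(300, 1))
y = np.sin(2 * X[:, 0]) + 0.3 * rng.standard_normal(300)   # synthetic data

def prox_abs(v, y, gamma):
    """Prox of f -> |f - y| with step gamma: y + soft-threshold(v - y, gamma)."""
    d = v - y
    return y + np.sign(d) * np.maximum(np.abs(d) - gamma, 0.0)

F = np.zeros_like(y)                # current ensemble prediction
learners, nu, gamma = [], 0.5, 0.3  # shrinkage and proximal step size
for _ in range(100):
    z = prox_abs(F, y, gamma)                        # proximal target
    h = DecisionTreeRegressor(max_depth=2).fit(X, z - F)
    F += nu * h.predict(X)                           # shrunken boosting update
    learners.append(h)

print("mean absolute error:", np.abs(F - y).mean())
```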

    Schwarz Iterative Methods: Infinite Space Splittings

    We prove the convergence of greedy and randomized versions of Schwarz iterative methods for solving linear elliptic variational problems based on infinite space splittings of a Hilbert space. For the greedy case, we show a squared error decay rate of $O((m+1)^{-1})$ for elements of an approximation space $\mathcal{A}_1$ related to the underlying splitting. For the randomized case, we show an expected squared error decay rate of $O((m+1)^{-1})$ on a class $\mathcal{A}_{\infty}^{\pi} \subset \mathcal{A}_1$ depending on the probability distribution. Comment: Revised version, accepted in Constr. Approx.
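
    The paper's setting is an infinite-dimensional Hilbert space, but the flavor of the method can be shown on a finite toy problem: splitting $\mathbb{R}^n$ into its $n$ coordinate lines turns one Schwarz correction into a single coordinate update that exactly minimizes the energy $\tfrac{1}{2}x^{\top}Ax - b^{\top}x$ along that line. The matrix, iteration count, and the uniform sampling distribution below are arbitrary choices for illustration.

```python
# A finite-dimensional toy: Schwarz iteration for an SPD system A x = b with
# the space split into the n coordinate lines of R^n. The "greedy" rule picks
# the subspace with the largest residual component; the "randomized" rule
# samples one from a distribution pi. Purely illustrative of the splitting
# idea, not the paper's Hilbert-space setting.
import numpy as np

rng = np.random.default_rng(2)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)         # symmetric positive definite
b = rng.standard_normal(n)

def schwarz(rule="greedy", iters=2000):
    x = np.zeros(n)
    r = b.copy()                    # residual b - A x
    pi = np.full(n, 1.0 / n)        # uniform sampling distribution
    for _ in range(iters):
        i = int(np.argmax(np.abs(r))) if rule == "greedy" else rng.choice(n, p=pi)
        alpha = r[i] / A[i, i]      # exact energy minimization along coordinate i
        x[i] += alpha
        r -= alpha * A[:, i]        # rank-one residual update
    return x

for rule in ("greedy", "randomized"):
    x = schwarz(rule)
    print(rule, "residual norm:", np.linalg.norm(b - A @ x))
```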

    A Comparison between Fixed-Basis and Variable-Basis Schemes for Function Approximation and Functional Optimization

    Fixed-basis and variable-basis approximation schemes are compared for the problems of function approximation and functional optimization (also known as infinite programming). Classes of problems are investigated for which variable-basis schemes with sigmoidal computational units perform better than fixed-basis ones, in terms of the minimum number of computational units needed to achieve a desired error in function approximation or approximate optimization. Previously known bounds on the accuracy are extended, with better rates, to families of …
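
    A hypothetical side-by-side experiment in the spirit of this comparison (the target function, unit count, and training schedule are invented for illustration): a fixed basis of Gaussian bumps on a grid, where only the outer weights are fitted by linear least squares, against a variable basis of the same number of sigmoidal (tanh) units whose inner weights and biases are also trained.

```python
# Fixed basis: n Gaussian bumps with frozen centers/width, outer weights fitted
# by linear least squares. Variable basis: n tanh units with trainable inner
# parameters (a one-hidden-layer network trained by plain gradient descent).
# Target, unit count, and schedule are arbitrary choices for the sketch.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 400)
target = np.sign(x) * np.abs(x) ** 0.5            # illustrative target
n = 8                                             # computational units in both schemes

# Fixed-basis scheme: linear problem over the outer weights only.
centers = np.linspace(-1, 1, n)
Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * 0.2 ** 2))
w_fixed, *_ = np.linalg.lstsq(Phi, target, rcond=None)
err_fixed = np.abs(Phi @ w_fixed - target).max()

# Variable-basis scheme: inner weights W and biases b_h are also trained.
W = rng.standard_normal(n); b_h = rng.standard_normal(n); a = np.zeros(n)
lr = 0.05
for _ in range(5000):
    H = np.tanh(np.outer(x, W) + b_h)             # hidden activations, (400, n)
    pred = H @ a
    g = 2 * (pred - target) / x.size              # d(mse)/d(pred)
    a -= lr * (H.T @ g)
    dH = (g[:, None] * a[None, :]) * (1 - H ** 2) # backprop through tanh
    W -= lr * (dH.T @ x)
    b_h -= lr * dH.sum(0)
err_var = np.abs(np.tanh(np.outer(x, W) + b_h) @ a - target).max()

print(f"max error, fixed basis:    {err_fixed:.3f}")
print(f"max error, variable basis: {err_var:.3f}")
```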