10,595 research outputs found

    Computation of sum of squares polynomials from data points

    We propose an iterative algorithm for the numerical computation of sums of squares of polynomials approximating given data at prescribed interpolation points. The method is based on the definition of a convex functional $G$ arising from the dualization of a quadratic regression over the Cholesky factors of the sum of squares decomposition. In order to justify the construction, the domain of $G$, the boundary of the domain and the behavior at infinity are analyzed in detail. When the data interpolate a positive univariate polynomial, we show that in the context of the Lukacs sum of squares representation, $G$ is coercive and strictly convex, which yields a unique critical point and a corresponding decomposition in sum of squares. For multivariate polynomials which admit a decomposition in sum of squares up to a small perturbation of size $\varepsilon$, $G^\varepsilon$ is always coercive, and so its minimum yields an approximate decomposition in sum of squares. Various unconstrained descent algorithms are proposed to minimize $G$. Numerical examples are provided for univariate and bivariate polynomials.
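    A minimal sketch of the underlying idea (not the paper's functional $G$ itself): parametrize a polynomial through a lower-triangular factor $L$, so that $p(x) = m(x)^\top L L^\top m(x)$ is a sum of squares by construction, and fit $L$ to the data by unconstrained least squares. The degree, points, and optimizer below are illustrative choices.

```python
# Toy sketch (not the paper's dual functional G): fit a sum-of-squares
# polynomial p(x) = m(x)^T L L^T m(x) to data by unconstrained least
# squares over the Cholesky-like factor L, so p is SOS by construction.
import numpy as np
from scipy.optimize import minimize

deg = 2                       # p has degree 2*deg
xs = np.linspace(-1, 1, 20)   # regression points (assumed)
ys = 1 + xs**2 + 0.5 * xs**4  # data sampled from a positive polynomial

def monomials(x):
    # m(x) = (1, x, ..., x^deg)
    return np.array([x**k for k in range(deg + 1)])

tril = np.tril_indices(deg + 1)

def sos_eval(Lflat, x):
    L = np.zeros((deg + 1, deg + 1))
    L[tril] = Lflat
    v = L.T @ monomials(x)
    return v @ v              # m(x)^T L L^T m(x) >= 0

def objective(Lflat):
    r = np.array([sos_eval(Lflat, x) for x in xs]) - ys
    return 0.5 * np.sum(r**2) # quadratic regression misfit

res = minimize(objective, np.ones(len(tril[0])), method="BFGS")
print("residual:", objective(res.x))
```

    Because every evaluation is a squared norm, the fitted polynomial is nonnegative no matter where the descent stops.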

    Efficient Algorithms for Optimal Control of Quantum Dynamics: The "Krotov" Method unencumbered

    Efficient algorithms for the discovery of optimal control designs for coherent control of quantum processes are of fundamental importance. One important class of algorithms is the family of sequential update algorithms generally attributed to Krotov. Although widely and often successfully used, the associated theory is often involved and leaves many crucial questions unanswered, from the monotonicity and convergence of the algorithm to discretization effects, leading to the introduction of ad-hoc penalty terms and suboptimal update schemes detrimental to the performance of the algorithm. We present a general framework for sequential update algorithms, including specific prescriptions for efficient update rules with inexpensive dynamic search length control, taking into account discretization effects and eliminating the need for ad-hoc penalty terms. The latter, while necessary to regularize the problem in the limit of infinite time resolution, i.e., the continuum limit, are shown to be undesirable and unnecessary in the practically relevant case of finite time resolution. Numerical examples show that the ideas underlying many of these results extend even beyond what can be rigorously proved. Comment: 19 pages, many figures
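    To make the notion of a sequential update concrete, here is a minimal sketch in the spirit of Krotov-type schemes (a simplified first-order gradient update, not the specific rules derived in the paper): a single qubit with Hamiltonian $H(u) = \sigma_z + u\,\sigma_x$ and piecewise-constant control is steered from $|0\rangle$ to $|1\rangle$, and the controls are updated one time slice at a time while the forward state is re-propagated with the new values. The step size, initial guess, and gradient approximation are assumptions of the sketch.

```python
# Sketch of a sequential-update scheme in the spirit of Krotov-type
# methods (simplified gradient form): piecewise-constant control u(t)
# on H(u) = sz + u*sx, maximizing the |0> -> |1> transfer fidelity.
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
psi0 = np.array([1, 0], dtype=complex)
target = np.array([0, 1], dtype=complex)

N, dt, step = 50, 0.1, 0.5    # time slices, slice length, update step (assumed)
u = 0.3 * np.ones(N)          # nonzero initial guess (assumed)

def propagators(u):
    return [expm(-1j * dt * (sz + uk * sx)) for uk in u]

for it in range(200):
    U = propagators(u)
    # forward states under the current controls
    fwd = [psi0]
    for Uk in U:
        fwd.append(Uk @ fwd[-1])
    overlap = np.vdot(target, fwd[-1])
    # backward costates seeded by the target, scaled by the final overlap
    bwd = [target * overlap]
    for Uk in reversed(U):
        bwd.insert(0, Uk.conj().T @ bwd[0])
    # sequential sweep: update one slice at a time, re-propagating forward
    psi = psi0
    for k in range(N):
        grad = 2 * np.real(np.vdot(bwd[k + 1], (-1j * dt) * (sx @ (U[k] @ psi))))
        u[k] += step * grad
        U[k] = expm(-1j * dt * (sz + u[k] * sx))   # refresh propagator
        psi = U[k] @ psi

print("fidelity:", abs(np.vdot(target, psi))**2)
```

    The characteristic feature of the sequential scheme is visible in the inner sweep: each slice is updated using costates from the old controls but a forward state that already reflects the updates made earlier in the sweep.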

    The EM Algorithm

    The Expectation-Maximization (EM) algorithm is a broadly applicable approach to the iterative computation of maximum likelihood (ML) estimates, useful in a variety of incomplete-data problems. Maximum likelihood estimation and likelihood-based inference are of central importance in statistical theory and data analysis. Maximum likelihood estimation is a general-purpose method with attractive properties. It is the most often used estimation technique in the frequentist framework; it is also relevant in the Bayesian framework (Chapter III.11). Often Bayesian solutions are justified with the help of likelihoods and maximum likelihood estimates (MLE), and Bayesian solutions are similar to penalized likelihood estimates. Maximum likelihood estimation is a ubiquitous technique and is used extensively in every area where statistical techniques are used.
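    As a concrete illustration (a standard textbook instance, not anything specific to this chapter), the E- and M-steps for a two-component Gaussian mixture, where the unobserved component labels play the role of the missing data:

```python
# Standard EM illustration: ML fitting of a two-component Gaussian
# mixture, treating the component labels as missing data.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 1.5, 200)])

# initial parameter guesses (assumed)
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(100):
    # E-step: posterior responsibilities under the current parameters
    dens = np.stack([p * normal_pdf(x, m, s) for p, m, s in zip(pi, mu, sigma)])
    resp = dens / dens.sum(axis=0)
    # M-step: weighted ML updates of mixing weights, means, and sds
    nk = resp.sum(axis=1)
    pi = nk / len(x)
    mu = (resp * x).sum(axis=1) / nk
    sigma = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)

print(pi, mu, sigma)
```

    Each iteration increases the observed-data likelihood, which is the property that makes EM attractive for incomplete-data problems.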

    On nonparametric estimation of a mixing density via the predictive recursion algorithm

    Nonparametric estimation of a mixing density based on observations from the corresponding mixture is a challenging statistical problem. This paper surveys the literature on a fast, recursive estimator based on the predictive recursion algorithm. After introducing the algorithm and giving a few examples, I summarize the available asymptotic convergence theory, describe an important semiparametric extension, and highlight two interesting applications. I conclude with a discussion of several recent developments in this area and some open problems. Comment: 22 pages, 5 figures. Comments welcome at https://www.researchers.one/article/2018-12-
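    The recursion itself is simple to state: with weights $w_i$, the estimate is updated as $f_i(\theta) = (1 - w_i)\,f_{i-1}(\theta) + w_i\,k(X_i \mid \theta)\,f_{i-1}(\theta) \big/ \int k(X_i \mid u)\,f_{i-1}(u)\,du$. The sketch below runs this update on a fixed grid with a normal kernel; the grid, initial guess $f_0$, and weight sequence are illustrative choices, not prescriptions from the paper.

```python
# Sketch of the predictive recursion (PR) update on a grid: each new
# observation X_i tilts the current mixing-density estimate by the
# kernel likelihood and mixes it back in with weight w_i.
import numpy as np

rng = np.random.default_rng(1)
theta_true = rng.choice([-2.0, 1.5], size=500, p=[0.4, 0.6])
x = theta_true + rng.normal(0, 1, 500)    # data from a N(theta, 1) mixture

grid = np.linspace(-6, 6, 400)            # support grid for f (assumed)
dtheta = grid[1] - grid[0]
f = np.full_like(grid, 1 / 12)            # uniform initial guess f0

def kernel(xi, theta):
    return np.exp(-0.5 * (xi - theta) ** 2) / np.sqrt(2 * np.pi)

for i, xi in enumerate(x, start=1):
    w = 1 / (i + 1) ** 0.67               # decaying weights (illustrative)
    num = kernel(xi, grid) * f
    denom = np.sum(num) * dtheta          # predictive density m_{i-1}(X_i)
    f = (1 - w) * f + w * num / denom     # PR update

print("estimated mass below 0:", np.sum(f[grid < 0]) * dtheta)
```

    A single pass over the data costs only one kernel evaluation per grid point per observation, which is why the estimator is fast compared with likelihood-based alternatives.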
