
    New bounds on the Lieb-Thirring constants

    Improved estimates on the constants $L_{\gamma,d}$, for $1/2<\gamma<3/2$, $d\in\mathbb{N}$, in the inequalities for the eigenvalue moments of Schr\"{o}dinger operators are established.

    Rayleigh-Ritz majorization error bounds of the mixed type

    The absolute change in the Rayleigh quotient (RQ) for a Hermitian matrix with respect to vectors is bounded in terms of the norms of the residual vectors and the angle between vectors in [\doi{10.1137/120884468}]. We substitute multidimensional subspaces for the vectors and, using majorization, derive new bounds on the absolute changes of eigenvalues of the matrix RQ in terms of singular values of residual matrices and principal angles between subspaces. We show how our results relate to bounds for eigenvalues after discarding off-diagonal blocks or additive perturbations. Comment: 20 pages, 1 figure. Accepted to SIAM Journal on Matrix Analysis and Applications.
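    The quantities the abstract works with can be illustrated numerically. The sketch below is not the paper's majorization bounds; it only checks a classical related fact (Kahan's residual bound) for the matrix Rayleigh quotient $U^H A U$ and residual $R = AU - U(U^H A U)$: every Ritz value lies within $\|R\|_2$ of some eigenvalue of $A$. The random test matrix and dimensions are illustrative choices.

    ```python
    import numpy as np

    # Illustration (not the paper's bounds): for Hermitian A and an orthonormal
    # basis U of a trial subspace, the "matrix Rayleigh quotient" is H = U^T A U
    # and the residual matrix is R = A U - U H. Kahan's classical result:
    # every eigenvalue of H (a Ritz value) is within ||R||_2 of an eigenvalue of A.
    rng = np.random.default_rng(1)
    n, k = 8, 3
    A = rng.standard_normal((n, n))
    A = (A + A.T) / 2                    # symmetrize: A is now Hermitian

    U, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal subspace basis
    H = U.T @ A @ U                      # matrix Rayleigh quotient (k x k)
    R = A @ U - U @ H                    # residual matrix (n x k)

    ritz = np.linalg.eigvalsh(H)         # Ritz values
    eigs = np.linalg.eigvalsh(A)         # true eigenvalues
    res_norm = np.linalg.norm(R, 2)      # spectral norm of the residual

    for theta in ritz:
        # each Ritz value is within ||R||_2 of some eigenvalue of A
        assert np.min(np.abs(eigs - theta)) <= res_norm + 1e-10
    ```

    The paper's contribution is sharper than this: it bounds the change between the Ritz values of two different subspaces using singular values of residuals and principal angles, via majorization.
    
    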

    The MM Alternative to EM

    The EM algorithm is a special case of a more general algorithm called the MM algorithm. Specific MM algorithms often have nothing to do with missing data. The first M step of an MM algorithm creates a surrogate function that is optimized in the second M step. In minimization, MM stands for majorize--minimize; in maximization, it stands for minorize--maximize. This two-step process always drives the objective function in the right direction. Construction of MM algorithms relies on recognizing and manipulating inequalities rather than calculating conditional expectations. This survey walks the reader through the construction of several specific MM algorithms. The potential of the MM algorithm in solving high-dimensional optimization and estimation problems is its most attractive feature. Our applications to random graph models, discriminant analysis and image restoration showcase this ability. Comment: Published at http://dx.doi.org/10.1214/08-STS264 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
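    The two-step majorize--minimize process described above can be sketched on a toy problem. This is an illustrative example, not one of the survey's applications: to minimize $f(x) = |x| + (x-3)^2$, the non-smooth term $|x|$ is majorized at the current iterate $x_k$ by the quadratic $x^2/(2|x_k|) + |x_k|/2$, which touches $|x|$ at $x_k$ and lies above it elsewhere; minimizing the smooth surrogate gives a closed-form update.

    ```python
    # MM (majorize-minimize) sketch on a toy objective: f(x) = |x| + (x - 3)^2.
    # Assumption for illustration: majorize |x| at x_k by the quadratic
    # x^2 / (2|x_k|) + |x_k| / 2 (equal at x_k, above |x| elsewhere).
    # Minimizing the surrogate x^2/(2c) + (x - 3)^2 with c = |x_k| gives the
    # closed-form update x_{k+1} = 6c / (1 + 2c).

    def objective(x):
        return abs(x) + (x - 3.0) ** 2

    def mm_minimize(x0=1.0, iters=50):
        x = x0
        history = [objective(x)]
        for _ in range(iters):
            c = abs(x)                       # curvature of the majorizer at x
            x = 6.0 * c / (1.0 + 2.0 * c)    # exact minimizer of the surrogate
            history.append(objective(x))
        return x, history

    x_star, hist = mm_minimize()
    # MM's key guarantee: each step drives the objective in the right direction
    assert all(a >= b - 1e-12 for a, b in zip(hist, hist[1:]))
    ```

    The iterates converge to $x = 2.5$, the true minimizer, and the recorded objective values decrease monotonically, exactly the descent property the abstract highlights.
    
    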

    A Simple Iterative Algorithm for Parsimonious Binary Kernel Fisher Discrimination

    By applying recent results in optimization theory, variously known as optimization transfer or majorize/minimize algorithms, an algorithm for binary, kernel, Fisher discriminant analysis is introduced that makes use of a non-smooth penalty on the coefficients to provide a parsimonious solution. The problem is converted into a smooth optimization that can be solved iteratively with no greater overhead than iteratively re-weighted least squares. The result is simple, easily programmed and is shown to perform, in terms of both accuracy and parsimony, as well as or better than a number of leading machine learning algorithms on two well-studied and substantial benchmarks.
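    The device the abstract describes — majorizing a non-smooth penalty so each iteration becomes a weighted least-squares solve — can be sketched on a simpler problem than kernel Fisher discrimination. The example below is a hypothetical illustration, not the paper's algorithm: it minimizes an L1-penalized least-squares objective by majorizing $\lambda|w_j|$ with $\lambda w_j^2/(2|w_j^{(k)}|)$ plus a constant, so each step is a ridge solve with per-coordinate weights (classic IRLS).

    ```python
    import numpy as np

    # Hypothetical IRLS sketch (not the paper's method): minimize
    #   (1/2) ||y - X w||^2 + lam * ||w||_1
    # by majorizing lam*|w_j| at w_j^(k) with lam*w_j^2 / (2|w_j^(k)|) + const,
    # so each iteration is a weighted ridge solve: (X^T X + lam*D) w = X^T y,
    # with D = diag(1 / (|w_j^(k)| + eps)). eps guards division by zero.
    def irls_l1(X, y, lam=1.0, iters=100, eps=1e-8):
        w = np.linalg.lstsq(X, y, rcond=None)[0]   # unpenalized start
        for _ in range(iters):
            D = np.diag(lam / (np.abs(w) + eps))   # reweighting from the majorizer
            w = np.linalg.solve(X.T @ X + D, X.T @ y)
        return w

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 5))
    true_w = np.array([2.0, 0.0, -1.5, 0.0, 0.0])  # sparse ground truth
    y = X @ true_w + 0.01 * rng.standard_normal(50)

    w = irls_l1(X, y, lam=1.0)
    # coefficients of the inactive features are driven toward zero
    ```

    Each iteration costs one weighted least-squares solve, which is the "no greater overhead than iteratively re-weighted least squares" point in the abstract; the reweighting shrinks small coefficients toward zero, producing the parsimonious solution.
    
    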