
    The MM Alternative to EM

    The EM algorithm is a special case of a more general algorithm called the MM algorithm. Specific MM algorithms often have nothing to do with missing data. The first M step of an MM algorithm creates a surrogate function that is optimized in the second M step. In minimization, MM stands for majorize-minimize; in maximization, it stands for minorize-maximize. This two-step process always drives the objective function in the right direction. Construction of MM algorithms relies on recognizing and manipulating inequalities rather than calculating conditional expectations. This survey walks the reader through the construction of several specific MM algorithms. The potential of the MM algorithm in solving high-dimensional optimization and estimation problems is its most attractive feature. Our applications to random graph models, discriminant analysis and image restoration showcase this ability.
    Comment: Published at http://dx.doi.org/10.1214/08-STS264 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
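
    As a concrete illustration of the two-step pattern described above, the sketch below applies majorize-minimize to the sample median, i.e. minimizing the sum of absolute residuals. The quadratic majorizer |r| <= r^2/(2|r_k|) + |r_k|/2 is a standard textbook choice rather than one taken from this survey, and the function and variable names (mm_median, eps) are illustrative.

        import numpy as np

        def mm_median(x, n_iter=100, eps=1e-8):
            # Majorize-minimize for min_m sum_i |x_i - m| (the sample median).
            # M step 1 (majorize): at the current iterate m, bound each |x_i - m'|
            #   by (x_i - m')^2 / (2|x_i - m|) + |x_i - m| / 2, which touches at m' = m.
            # M step 2 (minimize): the surrogate is a weighted least-squares problem,
            #   so the update is a weighted mean with weights 1 / |x_i - m|.
            m = np.mean(x)                                   # any starting value works
            for _ in range(n_iter):
                w = 1.0 / np.maximum(np.abs(x - m), eps)     # eps guards zero residuals
                m_new = np.sum(w * x) / np.sum(w)
                if abs(m_new - m) < eps:
                    break
                m = m_new
            return m

        x = np.array([1.0, 2.0, 3.0, 10.0, 100.0])
        print(mm_median(x))                                  # close to np.median(x) == 3.0

    Because the surrogate lies above the objective and touches it at the current iterate, each update can only decrease (or leave unchanged) the sum of absolute residuals, which is the "drives the objective in the right direction" guarantee mentioned in the abstract.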

    Majorization algorithms for inspecting circles, ellipses, squares, rectangles, and rhombi

    In several disciplines, as diverse as shape analysis, location theory, quality control, archaeology, and psychometrics, it can be of interest to fit a circle through a set of points. We use the result that it suffices to locate a center for which the variance of the distances from the center to a set of given points is minimal. In this paper, we propose a new algorithm based on iterative majorization to locate the center. This algorithm is guaranteed to yield a series of nonincreasing variances until a stationary point is obtained. In all practical cases, the stationary point turns out to be a local minimum. Numerical experiments show that the majorizing algorithm is stable and fast. In addition, we extend the method to fit other shapes, such as a square, an ellipse, a rectangle, and a rhombus, by making use of the class of l_p distances and dimension weighting. We also allow for rotations for shapes that might be rotated in the plane. We illustrate how this extended algorithm can be used as a tool for shape recognition.
    Keywords: iterative majorization; location; optimization; shape analysis
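
    Setting the gradient of that variance-of-distances criterion to zero yields a simple fixed-point update for the center, c = x_bar + d_bar * u_bar, where x_bar is the centroid of the points, d_bar is the mean distance to the current center, and u_bar is the mean of the unit vectors pointing from the points to the center. The sketch below iterates this update on the same objective; it is not necessarily the exact majorization scheme of the paper, and the names are illustrative.

        import numpy as np

        def fit_circle_center(X, n_iter=200, tol=1e-10):
            # Fit a circle to the rows of X (n x 2) by minimizing the variance of the
            # distances from the center to the points, via the fixed-point update
            #   c <- mean(X) + mean(d) * mean((c - x_i) / d_i).
            c = X.mean(axis=0) + 1e-3                        # small offset avoids a zero distance at the start
            for _ in range(n_iter):
                diff = c - X                                 # vectors from the points to the center
                d = np.maximum(np.linalg.norm(diff, axis=1), 1e-12)
                u = diff / d[:, None]                        # unit vectors
                c_new = X.mean(axis=0) + d.mean() * u.mean(axis=0)
                if np.linalg.norm(c_new - c) < tol:
                    break
                c = c_new
            radius = np.linalg.norm(c - X, axis=1).mean()
            return c, radius

        # noisy points on a circle of radius 2 centered at (1, -1)
        rng = np.random.default_rng(42)
        t = rng.uniform(0, 2 * np.pi, 50)
        X = np.column_stack([1 + 2 * np.cos(t), -1 + 2 * np.sin(t)])
        X += 0.05 * rng.standard_normal(X.shape)
        print(fit_circle_center(X))                          # center near (1, -1), radius near 2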

    Rank Reduction of Correlation Matrices by Majorization

    A novel algorithm is developed for the problem of finding a low-rank correlation matrix nearest to a given correlation matrix. The algorithm is based on majorization and, therefore, it is globally convergent. The algorithm is computationally efficient, is straightforward to implement, and can handle arbitrary weights on the entries of the correlation matrix. A simulation study suggests that majorization compares favourably with competing approaches in terms of the quality of the solution within a fixed computational time. The problem of rank reduction of correlation matrices occurs when pricing a derivative dependent on a large number of assets, where the asset prices are modelled as correlated log-normal processes. Mainly, such an application concerns interest rates.
    Keywords: rank; correlation matrix; majorization; lognormal price processes
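
    A common way to attack this problem is to parameterize the low-rank candidate as C = V V' with the rows of V constrained to the unit sphere, so the unit diagonal of a correlation matrix holds automatically, and then to update one row at a time with a quadratic majorizer. The sketch below does this for the unweighted Frobenius loss; it illustrates the majorization mechanics rather than reproducing the authors' algorithm verbatim, and the function name is made up.

        import numpy as np

        def nearest_lowrank_corr(R, rank, n_iter=200, tol=1e-10):
            # Minimize ||R - V V'||_F^2 over V (n x rank) with unit-norm rows,
            # so C = V V' is a correlation matrix of rank at most `rank`.
            n = R.shape[0]
            rng = np.random.default_rng(0)
            V = rng.standard_normal((n, rank))
            V /= np.linalg.norm(V, axis=1, keepdims=True)    # unit-norm rows
            prev = np.inf
            for _ in range(n_iter):
                for i in range(n):
                    mask = np.arange(n) != i
                    Vo = V[mask]                             # the other rows, held fixed
                    b = Vo.T @ R[mask, i]                    # linear part of the row-i objective
                    A = Vo.T @ Vo                            # quadratic part of the row-i objective
                    lam = np.linalg.eigvalsh(A)[-1]          # majorization constant >= lambda_max(A)
                    z = b - A @ V[i] + lam * V[i]            # direction minimizing the majorizer
                    nz = np.linalg.norm(z)
                    if nz > 0:
                        V[i] = z / nz                        # project back onto the unit sphere
                C = V @ V.T
                loss = np.sum((R - C) ** 2)
                if prev - loss < tol:
                    break
                prev = loss
            return C

        R = 0.9 ** np.abs(np.subtract.outer(np.arange(6), np.arange(6)))   # a valid 6 x 6 correlation matrix
        C = nearest_lowrank_corr(R, rank=2)
        print(np.linalg.matrix_rank(C), np.max(np.abs(np.diag(C) - 1.0)))  # rank 2, unit diagonal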

    Weighted Majorization Algorithms for Weighted Least Squares Decomposition Models

    For many least-squares decomposition models, efficient algorithms are well known. A more difficult problem arises in decomposition models where each residual is weighted by a nonnegative value. A special case is principal components analysis with missing data. Kiers (1997) discusses an algorithm for minimizing weighted decomposition models by iterative majorization. In this paper, we propose a weighted majorization algorithm for computing a solution. We will show that the algorithm by Kiers is a special case of our algorithm. Here, we will apply weighted majorization to weighted principal components analysis, robust Procrustes analysis, and logistic bi-additive models, of which the two-parameter logistic model in item response theory is a special case. Simulation studies show that weighted majorization is generally faster than the method by Kiers by a factor of one to four and obtains the same or better quality solutions. For logistic bi-additive models, we propose a new iterative majorization algorithm called logistic majorization.
    Keywords: iterative majorization; IRT; logistic bi-additive model; robust Procrustes analysis; weighted principal component analysis; two parameter logistic model
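
    To make the problem concrete, the sketch below implements the Kiers (1997)-style majorization that this paper generalizes, applied to weighted low-rank matrix approximation (weighted principal components analysis). With weights scaled into [0, 1], the weighted loss is majorized, up to a constant, by the unweighted loss on the completed matrix W*X + (1 - W)*M, which a truncated SVD then minimizes. This is the baseline method, not the paper's own weighted majorization algorithm, and the names and the missing-data example are illustrative.

        import numpy as np

        def weighted_pca(X, W, rank, n_iter=500, tol=1e-10):
            # Minimize sum_ij W_ij * (X_ij - M_ij)^2 over matrices M of the given rank.
            # Majorization step: with weights in [0, 1], the weighted loss is bounded
            #   (up to a constant) by the unweighted loss on W*X + (1 - W)*M_old.
            # Minimization step: the unweighted problem is solved by a truncated SVD.
            W = W / W.max()                                  # scale weights into [0, 1]
            M = np.zeros_like(X)
            prev = np.inf
            for _ in range(n_iter):
                X_tilde = W * X + (1.0 - W) * M              # impute the current fit where the weight is low
                U, s, Vt = np.linalg.svd(X_tilde, full_matrices=False)
                M = (U[:, :rank] * s[:rank]) @ Vt[:rank]     # best unweighted low-rank fit
                loss = np.sum(W * (X - M) ** 2)
                if prev - loss < tol:
                    break
                prev = loss
            return M

        # principal components analysis with missing data: weight 0 marks a missing entry
        rng = np.random.default_rng(1)
        X = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 8))     # a rank-3 data matrix
        W = (rng.uniform(size=X.shape) > 0.2).astype(float)                # roughly 20% missing
        M = weighted_pca(X, W, rank=3)
        print(np.max(np.abs(W * (X - M))))                                 # small residual on observed entries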