
    A Riemannian ADMM

    We consider a class of Riemannian optimization problems in which the objective is the sum of a smooth function and a nonsmooth function considered in the ambient space. This class of problems has important applications in machine learning and statistics, such as sparse principal component analysis, sparse spectral clustering, and orthogonal dictionary learning. We propose a Riemannian alternating direction method of multipliers (ADMM) to solve this class of problems. Each iteration of our algorithm uses easily computable steps. The iteration complexity of the proposed algorithm for obtaining an ε-stationary point is analyzed under mild assumptions. To the best of our knowledge, this is the first Riemannian ADMM with a provable convergence guarantee for solving Riemannian optimization problems with nonsmooth objectives. Numerical experiments demonstrate the advantages of the proposed method.
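The splitting pattern behind such a Riemannian ADMM can be sketched on a toy problem. The sketch below is a hypothetical minimal example, not the paper's algorithm: it applies an ADMM splitting to a sphere-constrained least-squares problem with an l1 term (a toy stand-in for sparse PCA), where the manifold-constrained x-subproblem reduces to a normalization and the nonsmooth z-subproblem to soft-thresholding. The problem data `b` and parameters `lam`, `rho` are illustrative choices.

```python
import numpy as np

# Toy splitting:  min_{x in S^{n-1}}  0.5*||x - b||^2 + lam*||z||_1
#                 s.t.  x = z
# x handles the manifold constraint, z handles the nonsmooth l1 term.

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sphere_admm(b, lam=0.5, rho=5.0, iters=500):
    z = np.zeros_like(b)
    y = np.zeros_like(b)          # dual variable for x = z
    x = b / np.linalg.norm(b)
    for _ in range(iters):
        # x-update: the quadratic subproblem restricted to the unit
        # sphere is maximized by normalizing the linear coefficient.
        c = b + rho * z - y
        x = c / np.linalg.norm(c)
        # z-update: prox of the l1 term (soft-thresholding)
        z = soft_threshold(x + y / rho, lam / rho)
        # dual ascent on the consensus constraint x = z
        y = y + rho * (x - z)
    return x, z

b = np.array([3.0, -2.0, 0.1, 0.0, 0.0])
x, z = sphere_admm(b)
```

The iterate `x` stays exactly on the sphere at every step, while `z` carries the sparsity; at a consensus point the two agree, which is the sense in which the splitting resolves the conflict between the manifold constraint and the nonsmooth term.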

    Estudo do método de Newton em variedades (Study of Newton's method on manifolds)

    Advisor: Prof. Dr. Jinyun Yuan. Co-advisor: Prof. Dr. Orizon P. Ferreira. Doctoral thesis, Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Matemática. Defense: Curitiba, 16/03/2018. Includes references: pp. 52-57.
Abstract: In this thesis, we study Newton's method for finding a singularity of a differentiable vector field defined on a Riemannian manifold.
We obtain an important property of the parallel transport of a vector which allows us to establish, under a mild assumption, super-linear convergence of the sequence generated by the classical Newton's method for finding a zero of a vector field. Moreover, we propose a damped Newton's method and present its global convergence analysis using a line search together with a merit function. We show that, after a finite number of iterations, the sequence generated by the proposed damped Newton's method reduces to a sequence generated by the classical Newton iteration. Thus, the convergence rate of the proposed method is super-linear/quadratic. We implement both methods to find global minimizers of a family of functions defined on the cone of symmetric positive definite matrices. Our experiments show that the performance of the damped Newton's method is superior to that of the classical Newton's method, indicating that the behavior of the methods in Euclidean space persists in this new setting. To carry out the experiments, we first endow the cone of symmetric positive definite matrices with a Riemannian manifold structure. We then define the iterates of the sequences generated by Newton's method and the damped Newton's method using the geodesic curves of this manifold. However, computing geodesics efficiently involves significant numerical challenges. Therefore, we propose two new algorithms, namely Newton's method with retraction and the damped Newton's method with retraction, to find a singularity of a vector field defined on a Riemannian manifold. We present the convergence analysis of these new methods by extending the results obtained for Newton's method and the damped Newton's method. Finally, we implement the damped Newton's method with retraction to find global minimizers of the aforementioned family of functions.
In this case, our experiments show that the damped Newton's method with retraction performs similarly to the damped Newton's method. The main contributions of this thesis are as follows. 1) Under a mild assumption, namely invertibility of the covariant derivative of the vector field at its singularity, we show that Newton's method is well defined in a suitable neighborhood of this singularity and that the sequence generated by the method converges to the solution at a super-linear rate (see (27)). 2) We propose the damped Newton's method in the Riemannian manifold context and establish its global convergence to a singularity of the vector field, preserving the super-linear/quadratic convergence rates of Newton's method (see (16)). 3) We propose the damped Newton's method with retraction and study its convergence properties, obtaining the same results as those for Newton's method and the damped Newton's method. Keywords: Riemannian manifold · Newton's method · damped Newton's method · local convergence · global convergence · super-linear rate · quadratic rate · line search · exponential mapping · retraction
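The damping scheme the thesis globalizes can be illustrated in the simplest Euclidean setting (the thesis develops the Riemannian analogue with parallel transport or a retraction). The sketch below, an illustrative example and not the thesis's algorithm, seeks a zero of F(x) = arctan(x), a classic case where the pure Newton step overshoots for |x| larger than about 1.39; Armijo backtracking on the merit function phi(x) = 0.5*F(x)^2 restores global convergence, and near the zero the unit step is accepted, recovering the fast local rate.

```python
import numpy as np

def damped_newton(x, iters=30, c=1e-4):
    """Damped Newton for a zero of F(x) = arctan(x), scalar case."""
    steps = []                                   # accepted step sizes
    for _ in range(iters):
        F = np.arctan(x)
        J = 1.0 / (1.0 + x * x)                  # derivative of arctan
        d = -F / J                               # Newton direction
        phi = 0.5 * F * F                        # merit function value
        t = 1.0
        # Armijo backtracking: the directional derivative of phi along
        # d is -2*phi here, so the acceptance test simplifies as below.
        while 0.5 * np.arctan(x + t * d) ** 2 > (1.0 - 2.0 * c * t) * phi:
            t *= 0.5
        x = x + t * d
        steps.append(t)
    return x, steps

x_star, steps = damped_newton(3.0)
```

Starting from x = 3, the first iterations are damped (t < 1), while the final iterations take the full Newton step (t = 1), mirroring the thesis's result that the damped sequence eventually coincides with the classical Newton sequence.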