
    Non-smooth Non-convex Bregman Minimization: Unification and new Algorithms

    We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search strategy, we obtain a flexible algorithm for which we prove (subsequential) convergence to a stationary point under weak assumptions on the growth of the model function error. With a Euclidean distance function, special instances of the algorithm include, for example, Gradient Descent, Forward-Backward Splitting, and ProxDescent, without the common requirement of a "Lipschitz continuous gradient". In addition, we consider a broad class of Bregman distance functions (generated by Legendre functions) that replace the Euclidean distance. The algorithm has a wide range of applications, including many linear and non-linear inverse problems in signal/image processing and machine learning.
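
    To make the scheme above concrete, here is a minimal NumPy sketch of one special instance: the forward-backward/ProxDescent-type case with a Euclidean distance, where the convex model linearizes the smooth part g and keeps the non-smooth part h, and the model's proximal point supplies the descent direction used in an Armijo-like backtracking search. The function names (model_prox_descent, prox_h), the parameters t and delta, and the LASSO-type usage example are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def model_prox_descent(g, grad_g, h, prox_h, x0, t=1.0, delta=0.5,
                       max_iter=200, tol=1e-9):
    """Forward-backward instance of the model-function scheme:
    model_x(y) = g(x) + <grad g(x), y - x> + h(y)."""
    f = lambda z: g(z) + h(z)
    x = x0.copy()
    for _ in range(max_iter):
        # (1) proximal point of the convex model w.r.t. the Euclidean distance
        y = prox_h(x - t * grad_g(x), t)
        d = y - x                      # descent direction from the model minimizer
        if np.linalg.norm(d) < tol:
            break
        # model decrease f_x(y) - f(x); by the prox step it is <= -||d||^2/(2t) < 0
        decrease = grad_g(x) @ d + h(y) - h(x)
        # (2) Armijo-like backtracking line search along d
        gamma = 1.0
        while f(x + gamma * d) > f(x) + delta * gamma * decrease and gamma > 1e-12:
            gamma *= 0.5
        x = x + gamma * d
    return x

# Usage sketch: g(x) = 0.5*||Ax - b||^2 (smooth), h(x) = lam*||x||_1 (non-smooth)
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
g = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_g = lambda x: A.T @ (A @ x - b)
h = lambda x: lam * np.sum(np.abs(x))
prox_h = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
x_hat = model_prox_descent(g, grad_g, h, prox_h, np.zeros(50))
```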

    A Splitting Augmented Lagrangian Method for Low Multilinear-Rank Tensor Recovery

    This paper studies the recovery task of finding a low multilinear-rank tensor that fulfills a set of linear constraints in a general setting, a problem with many applications in computer vision and graphics. We refer to it as the low multilinear-rank tensor recovery problem. Variable splitting and convex relaxation are used to transform it into a tractable constrained optimization problem. Exploiting the favorable structure of this problem, we develop a splitting augmented Lagrangian method to solve it; the proposed algorithm is easy to implement and its convergence can be proved under some conditions. Preliminary numerical results on randomly generated and real-world completion problems show that the proposed algorithm is effective and robust for tackling the low multilinear-rank tensor completion problem.
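
    The abstract above describes the method only at a high level; the sketch below illustrates the splitting-augmented-Lagrangian idea in NumPy for the completion special case used in the experiments. Variable splitting introduces one auxiliary tensor per mode, each updated by singular value thresholding of its unfolding; the main variable is updated by averaging the splits and re-imposing the observed entries, followed by a multiplier update. All names, the penalty parameter beta, and the fixed iteration count are assumptions for illustration, not the authors' exact scheme.

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def svt(M, tau):
    """Singular value thresholding: prox of tau * (matrix nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def salm_completion(T_obs, mask, beta=1.0, n_iter=100):
    """Minimize the sum of nuclear norms of the mode unfoldings, subject to
    matching the observed entries, via splitting Y_i = X and multipliers."""
    shape, N = T_obs.shape, T_obs.ndim
    X = T_obs * mask
    Y = [X.copy() for _ in range(N)]
    Lam = [np.zeros(shape) for _ in range(N)]     # Lagrange multipliers
    for _ in range(n_iter):
        # Y_i-update: nuclear-norm prox on each mode-i unfolding
        for i in range(N):
            Y[i] = fold(svt(unfold(X + Lam[i] / beta, i), 1.0 / beta), i, shape)
        # X-update: average the splits, then enforce the observed entries
        X = sum(Y[i] - Lam[i] / beta for i in range(N)) / N
        X[mask] = T_obs[mask]
        # dual ascent on the splitting constraints Y_i = X
        for i in range(N):
            Lam[i] = Lam[i] + beta * (X - Y[i])
    return X

# Usage sketch: complete a random low multilinear-rank tensor from 40% of entries
rng = np.random.default_rng(0)
core = rng.standard_normal((3, 3, 3))
U = [rng.standard_normal((20, 3)) for _ in range(3)]
T = np.einsum('abc,ia,jb,kc->ijk', core, *U)
mask = rng.random(T.shape) < 0.4
X_hat = salm_completion(T * mask, mask)
```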