11 research outputs found

    An Approximate Projection onto the Tangent Cone to the Variety of Third-Order Tensors of Bounded Tensor-Train Rank

    Full text link
    An approximate projection onto the tangent cone to the variety of third-order tensors of bounded tensor-train rank is proposed and proven to satisfy a better angle condition than the one proposed by Kutschan (2019). Such an approximate projection makes it possible, e.g., to compute gradient-related directions in the tangent cone, as required by algorithms aiming at minimizing a continuously differentiable function on the variety, a problem appearing notably in tensor completion. A numerical experiment is presented which indicates that, in practice, the angle condition satisfied by the proposed approximate projection is better than both the one satisfied by the approximate projection introduced by Kutschan and the proven theoretical bound.
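    For intuition, a direction d obtained by (approximately) projecting the negative gradient onto the tangent cone is useful for descent when its angle with -grad f(x) is bounded away from 90 degrees. The NumPy sketch below only illustrates this generic form of an angle condition; the constant omega is a placeholder, and the test is not the specific bound proved in the paper or by Kutschan.
```python
import numpy as np

def satisfies_angle_condition(d, neg_grad, omega=0.1):
    """Generic angle test: <d, -grad f(x)> >= omega * ||d|| * ||-grad f(x)||.

    Illustration only: omega is a placeholder constant, and this is the
    generic notion of a gradient-related direction, not the specific
    angle condition established in the paper.
    """
    nd = np.linalg.norm(d)
    ng = np.linalg.norm(neg_grad)
    if nd == 0.0 or ng == 0.0:
        return True  # nothing to check in the degenerate case
    return float(np.vdot(d, neg_grad)) >= omega * nd * ng
```
    A projected-gradient-type method on the variety would apply such a test to the output of the approximate projection before taking a step along it.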

    Tensor approximation by block term decomposition

    No full text
    Higher-order tensors have become a powerful tool in many areas of applied mathematics such as statistics or scientific computing. They have also found many applications in signal processing and machine learning. As suggested by the literature, higher-order tensors draw their power notably from the many tensor decompositions, among which the block term decomposition holds a central place. The purpose of this master's thesis is to focus on the computation of the best approximation, in the least-squares sense, of a given third-order tensor by a block term decomposition. Using variable projection, the tensor approximation problem is expressed as the minimization of a cost function on a product of Stiefel manifolds. In this master's thesis, I apply two first-order algorithms from the framework of optimization on matrix manifolds to minimize this cost function. I investigate the performance of these two new methods and compare them with the already available ones.
    Master [120]: civil engineering in applied mathematics, Université catholique de Louvain, 201
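    To make the variable-projection idea concrete, the sketch below treats the simplest case of a single (L, M, N) block with factor matrices that have orthonormal columns: for fixed factors, the optimal core is a multilinear projection of the tensor, so it can be eliminated and the cost depends on the Stiefel variables only. The function name and the single-block restriction are mine; the thesis handles sums of blocks, which lead to a joint linear least-squares problem for the cores.
```python
import numpy as np

def vp_cost_single_block(T, A, B, C):
    """Variable-projection cost for approximating T by one (L, M, N) block.

    T has shape (I, J, K); A, B, C have shapes (I, L), (J, M), (K, N) and
    are assumed to have orthonormal columns (points on Stiefel manifolds).
    For fixed A, B, C the least-squares-optimal core is the multilinear
    projection G = T x1 A^T x2 B^T x3 C^T, so the core is eliminated.
    """
    G = np.einsum('ijk,ia,jb,kc->abc', T, A, B, C)      # optimal core
    T_hat = np.einsum('abc,ia,jb,kc->ijk', G, A, B, C)  # reconstruction
    return 0.5 * np.linalg.norm(T - T_hat) ** 2
```
    Orthonormal factors can be obtained, e.g., from numpy.linalg.qr applied to random matrices, and the resulting cost can be handed to any optimizer on the product of Stiefel manifolds.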

    An Apocalypse-Free First-Order Low-Rank Optimization Algorithm with at Most One Rank Reduction Attempt per Iteration

    No full text
    We consider the problem of minimizing a differentiable function with locally Lipschitz continuous gradient over the real determinantal variety, and present a first-order algorithm designed to find stationary points of that problem. This algorithm applies steps of a retraction-free descent method proposed by Schneider and Uschmajew (2015), while taking the numerical rank into account to attempt rank reductions. We prove that this algorithm produces a sequence of iterates whose accumulation points are stationary, and is therefore free of the so-called apocalypses described by Levin, Kileel, and Boumal (2022). Moreover, the rank reduction mechanism of this algorithm requires at most one rank reduction attempt per iteration, in contrast with that of the P2GDR algorithm introduced by Olikier, Gallivan, and Absil (2022), which can require a number of rank reduction attempts equal to the rank of the iterate in the worst-case scenario.
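    The snippet below only illustrates the two standard ingredients such a rank reduction mechanism relies on: a numerical rank obtained by thresholding singular values, and a truncated-SVD rank reduction. The threshold delta is a placeholder, and the acceptance rule that decides whether a reduction attempt is kept is not reproduced here.
```python
import numpy as np

def numerical_rank(X, delta=1e-8):
    """Number of singular values exceeding delta times the largest one.

    delta is a placeholder tolerance, not a constant from the paper.
    """
    s = np.linalg.svd(X, compute_uv=False)
    if s.size == 0 or s[0] == 0.0:
        return 0
    return int(np.sum(s > delta * s[0]))

def truncate_to_rank(X, r):
    """Best rank-r approximation of X in the Frobenius norm (truncated SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]
```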

    On the Continuity of the Tangent Cone to the Determinantal Variety

    No full text
    Tangent and normal cones play an important role in constrained optimization to describe admissible search directions and, in particular, to formulate optimality conditions. They notably appear in various recent algorithms for both smooth and nonsmooth low-rank optimization where the feasible set is the set R^{m×n}_{≤r} of all m × n real matrices of rank at most r. In this paper, motivated by the convergence analysis of such algorithms, we study, by computing inner and outer limits, the continuity of the correspondence that maps each X ∈ R^{m×n}_{≤r} to the tangent cone to R^{m×n}_{≤r} at X. We also deduce results about the continuity of the corresponding normal cone correspondence. Finally, we show that our results include as a particular case the a-regularity of the Whitney stratification of R^{m×n}_{≤r} following from the fact that this set is a real algebraic variety, called the real determinantal variety.
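    For context, the tangent cone in question admits an explicit description that goes back to Schneider and Uschmajew (2015) and is recalled here for convenience (the notation is mine, not taken from the paper): for X of rank s ≤ r, with U and V orthonormal bases of its column and row spaces, the cone collects the matrices whose component in the complementary spaces has rank at most r - s.
```latex
% Tangent cone to the real determinantal variety at X with rank(X) = s <= r.
% P_{U^\perp} = I - UU^T and P_{V^\perp} = I - VV^T are the orthogonal
% projections onto the orthogonal complements of the column and row spaces.
\[
  T_{\mathbb{R}^{m \times n}_{\le r}}(X)
  = \bigl\{ \xi \in \mathbb{R}^{m \times n} :
      \operatorname{rank}\bigl( P_{U^\perp}\, \xi\, P_{V^\perp} \bigr) \le r - s \bigr\}.
\]
```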

    Variable Projection Applied to Block Term Decomposition of Higher-Order Tensors

    No full text
    Higher-order tensors have become popular in many areas of applied mathematics such as statistics, scientific computing, signal processing or machine learning, notably thanks to the many possible ways of decomposing a tensor. In this paper, we focus on the best approximation in the least-squares sense of a higher-order tensor by a block term decomposition. Using variable projection, we express the tensor approximation problem as a minimization of a cost function on a Cartesian product of Stiefel manifolds. The effect of variable projection on the Riemannian gradient algorithm is studied through numerical experiments.
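    As background for the Riemannian gradient algorithm mentioned above, one gradient step on a single Stiefel manifold, with the metric inherited from the embedding space and a QR-based retraction, can be sketched as follows; the step size is a placeholder, and the cost function and product-manifold structure of the actual problem are not reproduced.
```python
import numpy as np

def stiefel_tangent_projection(X, G):
    """Project a Euclidean gradient G onto the tangent space at X,
    where X has orthonormal columns (X^T X = I), for the metric
    inherited from the embedding space."""
    sym = 0.5 * (X.T @ G + G.T @ X)
    return G - X @ sym

def qr_retraction(X, xi):
    """Map the tangent vector xi back to the Stiefel manifold via the
    Q factor of a thin QR decomposition of X + xi."""
    Q, R = np.linalg.qr(X + xi)
    # Choose the Q factor whose R has nonnegative diagonal, so the
    # retraction is well defined (sign fix maps 0 to +1).
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def riemannian_gradient_step(X, euclidean_grad, step_size=1e-2):
    """One step of a Riemannian gradient method on the Stiefel manifold.

    step_size is a placeholder; a line search would normally be used.
    """
    grad = stiefel_tangent_projection(X, euclidean_grad)
    return qr_retraction(X, -step_size * grad)
```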

    Rank Estimation for Third-Order Tensor Completion in the Tensor-Train Format

    No full text
    We propose a numerical method to obtain an adequate value for the upper bound on the rank for the tensor completion problem on the variety of third-order tensors of bounded tensor-train rank. The method is inspired by the parametrization of the tangent cone derived by Kutschan (2018). A proof of the adequacy of the upper bound for a related low-rank tensor approximation problem is given, and an estimated rank is defined to extend the result to the low-rank tensor completion problem. Some experiments on synthetic data illustrate the approach and show that the method is very robust, e.g., to noise on the data.
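    As background, the tensor-train ranks of a third-order tensor are the ranks of its first two unfoldings, so a generic way to estimate an upper bound in the presence of noise is to threshold their singular values, as sketched below. The tolerance tol is a placeholder, and this generic estimate is not the method proposed in the paper, which is based on a parametrization of the tangent cone.
```python
import numpy as np

def tt_rank_estimate(T, tol=1e-8):
    """Estimate the TT ranks (r1, r2) of a third-order tensor T of shape
    (n1, n2, n3) by thresholding the singular values of its unfoldings
    T<1> of shape (n1, n2*n3) and T<2> of shape (n1*n2, n3)."""
    n1, n2, n3 = T.shape
    ranks = []
    for unfolding in (T.reshape(n1, n2 * n3), T.reshape(n1 * n2, n3)):
        s = np.linalg.svd(unfolding, compute_uv=False)
        ranks.append(int(np.sum(s > tol * s[0])) if s[0] > 0 else 0)
    return tuple(ranks)
```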
