
    A Non-monotone Alternating Updating Method for A Class of Matrix Factorization Problems

    In this paper we consider a general matrix factorization model which covers a large class of existing models with many applications in areas such as machine learning and imaging sciences. To solve this possibly nonconvex, nonsmooth and non-Lipschitz problem, we develop a non-monotone alternating updating method based on a potential function. Our method essentially updates two blocks of variables in turn by inexactly minimizing this potential function, and updates another auxiliary block of variables using an explicit formula. The special structure of our potential function allows us to take advantage of efficient computational strategies for non-negative matrix factorization to perform the alternating minimization over the two blocks of variables. A suitable line search criterion is also incorporated to improve the numerical performance. Under some mild conditions, we show that the line search criterion is well defined, and establish that the sequence generated is bounded and any cluster point of the sequence is a stationary point. Finally, we conduct numerical experiments on real datasets to compare our method with existing efficient methods for non-negative matrix factorization and matrix completion. The numerical results show that our method can outperform these methods on these specific applications.
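    The abstract describes the method only at a high level. As a rough illustration of the general idea of non-monotone alternating updates, the sketch below alternates projected-gradient steps on the two factors of a plain nonnegative least-squares model, accepting a step whenever it improves on the maximum of the last few objective values rather than only the latest one. The function name and all details are hypothetical stand-ins, not the paper's potential-function method.

```python
import numpy as np

def nonmonotone_alt_nmf(X, r, iters=100, memory=5, shrink=0.5, c=1e-4, seed=0):
    """Illustrative alternating updates for X ~= W @ H with W, H >= 0.

    Each block is updated by a projected-gradient step whose step size is
    accepted by a non-monotone Armijo-type rule: the new objective only has
    to beat the maximum of the last `memory` objective values.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W, H = rng.random((m, r)), rng.random((r, n))
    f = lambda W, H: 0.5 * np.linalg.norm(X - W @ H) ** 2
    hist = [f(W, H)]

    def block_step(V, G, obj):
        # Backtracking projected-gradient step with a non-monotone acceptance test.
        t, ref = 1.0, max(hist[-memory:])
        while True:
            V_new = np.maximum(V - t * G, 0.0)
            if obj(V_new) <= ref - c * np.sum(G * (V - V_new)) or t < 1e-12:
                return V_new
            t *= shrink

    for _ in range(iters):
        W = block_step(W, (W @ H - X) @ H.T, lambda V: f(V, H))   # update W with H fixed
        H = block_step(H, W.T @ (W @ H - X), lambda V: f(W, V))   # update H with W fixed
        hist.append(f(W, H))
    return W, H
```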

    Towards Question-based Recommender Systems

    Conversational and question-based recommender systems have gained increasing attention in recent years, as they enable users to converse with the system and better control recommendations. Nevertheless, research in the field is still limited compared to traditional recommender systems. In this work, we propose a novel question-based recommendation method, Qrec, which assists users in finding items interactively by answering automatically constructed and algorithmically chosen questions. Previous conversational recommender systems ask users to express their preferences over items or item facets. Our model, instead, asks users to express their preferences over descriptive item features. The model is first trained offline by a novel matrix factorization algorithm, and then iteratively updates the user and item latent factors online via a closed-form solution based on the user's answers. Meanwhile, our model infers the underlying user belief and preferences over items to learn an optimal question-asking strategy using Generalized Binary Search, so as to ask a sequence of questions to the user. Our experimental results demonstrate that our proposed matrix factorization model outperforms the traditional Probabilistic Matrix Factorization model. Further, our proposed Qrec model greatly improves the performance of state-of-the-art baselines, and it is also effective in the case of cold-start user and item recommendations. Comment: accepted by SIGIR 2020.
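    The abstract mentions choosing questions via Generalized Binary Search over descriptive item features. Purely as an illustration of that selection rule, the hypothetical snippet below greedily picks the binary feature whose yes/no answer splits the current belief over candidate items most evenly, then renormalizes the belief from the answer; the paper's actual belief and latent-factor updates are not reproduced here.

```python
import numpy as np

def select_question(belief, features):
    """Greedy GBS rule: pick the feature whose probability mass under the
    current belief is closest to 0.5, i.e. the most bisecting question.

    belief:   (n_items,) probability vector over candidate items
    features: (n_items, n_features) binary matrix; features[i, f] = 1 if item i has feature f
    """
    mass = belief @ features                     # P(target item has feature f)
    return int(np.argmin(np.abs(mass - 0.5)))

def update_belief(belief, features, f, answer, eps=0.05):
    """Noisy Bayes update after the user answers 1 (yes) or 0 (no) about feature f."""
    like = np.where(features[:, f] == answer, 1.0 - eps, eps)
    post = belief * like
    return post / post.sum()

# Toy usage: four items, three binary features, uniform prior over items.
features = np.array([[1, 0, 1],
                     [1, 1, 0],
                     [0, 1, 1],
                     [0, 0, 0]])
belief = np.full(4, 0.25)
q = select_question(belief, features)
belief = update_belief(belief, features, q, answer=1)
```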

    Ward identities and combinatorics of rainbow tensor models

    We discuss the notion of renormalization group (RG) completion of non-Gaussian Lagrangians and its treatment within the framework of Bogoliubov-Zimmermann theory, in application to matrix and tensor models. With the example of the simplest non-trivial RGB tensor theory (the Aristotelian rainbow), we introduce a few methods that allow one to connect calculations in tensor models to those in matrix models. As a byproduct, we obtain some new factorization formulas and sum rules for the Gaussian correlators in the Hermitian and complex matrix theories, square and rectangular. These sum rules describe correlators as solutions to finite linear systems, which are much simpler than the bilinear Hirota equations and the infinite Virasoro recursion. The search for such relations can be a way toward solving the tensor models, where explicit integrability is still obscure. Comment: 48 pages.
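    For orientation only: the "Gaussian correlators" mentioned above are averages computed with Wick's theorem. A textbook example in the Gaussian Hermitian one-matrix model (weight e^{-\frac{1}{2}\operatorname{Tr}M^2}, N x N matrices) is shown below; it is standard background, not one of the paper's new factorization formulas or sum rules.

```latex
\langle M_{ij} M_{kl} \rangle = \delta_{il}\,\delta_{jk}
\quad\Longrightarrow\quad
\langle \operatorname{Tr} M^2 \rangle
  = \sum_{i,j} \langle M_{ij} M_{ji} \rangle = N^2,
\qquad
\langle \operatorname{Tr} M^4 \rangle = 2N^3 + N .
```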

    Two Algorithms for Orthogonal Nonnegative Matrix Factorization with Application to Clustering

    Approximate matrix factorization techniques with both nonnegativity and orthogonality constraints, referred to as orthogonal nonnegative matrix factorization (ONMF), have recently been introduced and shown to work remarkably well for clustering tasks such as document classification. In this paper, we introduce two new methods to solve ONMF. First, we show a mathematical equivalence between ONMF and a weighted variant of spherical k-means, from which we derive our first method, a simple EM-like algorithm. This also allows us to determine when ONMF should be preferred to k-means and spherical k-means. Our second method is based on an augmented Lagrangian approach. Standard ONMF algorithms typically enforce nonnegativity for their iterates while trying to achieve orthogonality at the limit (e.g., using a proper penalization term or a suitably chosen search direction). Our method works the opposite way: orthogonality is strictly imposed at each step while nonnegativity is obtained asymptotically, using a quadratic penalty. Finally, we show that the two proposed approaches compare favorably with standard ONMF algorithms on synthetic, text and image data sets. Comment: 17 pages, 8 figures. New numerical experiments (document and synthetic data sets).
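    As a sketch of the clustering interpretation behind the first (EM-like) method, the hypothetical code below alternates a spherical-k-means-style assignment step with a refit of each unit-norm basis column to its assigned data columns; each data column gets exactly one nonzero coefficient, which keeps the rows of H orthogonal. It illustrates the idea only, not the paper's exact algorithm or its augmented Lagrangian variant.

```python
import numpy as np

def onmf_em_sketch(X, r, iters=50, seed=0):
    """Illustrative EM-like alternation for orthogonal NMF, X ~= W @ H,
    where each column of X is assigned to exactly one column of W."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = np.abs(rng.standard_normal((m, r)))
    W /= np.linalg.norm(W, axis=0, keepdims=True)
    H = np.zeros((r, n))

    for _ in range(iters):
        scores = W.T @ X                            # (r, n) similarity scores
        labels = np.argmax(scores, axis=0)          # one cluster per data column
        H = np.zeros((r, n))
        H[labels, np.arange(n)] = scores[labels, np.arange(n)]   # orthogonal rows of H

        for k in range(r):
            cols = X[:, labels == k]
            if cols.size == 0:
                continue                            # empty cluster: keep previous basis vector
            # One power-iteration step toward the dominant direction of the
            # assigned columns; for nonnegative data this stays nonnegative.
            w = cols @ (cols.T @ W[:, k])
            norm = np.linalg.norm(w)
            if norm > 0:
                W[:, k] = w / norm
    return W, H
```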