Bayesian Matrix Completion via Adaptive Relaxed Spectral Regularization
Bayesian matrix completion has been studied based on a low-rank matrix
factorization formulation with promising results. However, little work has been
done on Bayesian matrix completion based on the more direct spectral
regularization formulation. We fill this gap by presenting a novel Bayesian
matrix completion method based on spectral regularization. In order to
circumvent the difficulties of dealing with the orthonormality constraints of
singular vectors, we derive a new equivalent form with relaxed constraints,
which then leads us to design an adaptive version of spectral regularization
feasible for Bayesian inference. Our Bayesian method requires no parameter
tuning and can infer the number of latent factors automatically. Experiments on
synthetic and real datasets demonstrate encouraging results on rank recovery
and collaborative filtering, with notably good results for very sparse
matrices.
Comment: Accepted to AAAI 201
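To make the spectral regularization formulation concrete, here is a minimal soft-impute-style sketch: unobserved entries are filled from the current estimate, and the singular values are soft-thresholded, which is the proximal step for the nuclear-norm (spectral) penalty. This illustrates the non-Bayesian baseline formulation only, not the paper's adaptive Bayesian method; `svt_step` and all parameter names are my own.

```python
import numpy as np

def svt_step(X, mask, Z, lam):
    """One soft-impute-style update: fill unobserved entries of X from the
    current estimate Z, then soft-threshold the singular values by lam
    (the proximal operator of the nuclear norm)."""
    filled = np.where(mask, X, Z)
    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)   # spectral (nuclear-norm) shrinkage
    return (U * s_shrunk) @ Vt

# tiny demo: iterate on a rank-1 matrix with some entries hidden
rng = np.random.default_rng(0)
true = np.outer(rng.normal(size=6), rng.normal(size=5))
mask = rng.random(true.shape) < 0.7       # ~70% of entries observed
Z = np.zeros_like(true)
for _ in range(200):
    Z = svt_step(true, mask, Z, lam=0.1)
```

The shrinkage amount `lam` is exactly the kind of parameter the Bayesian treatment described above aims to avoid tuning by hand.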
A Batch Learning Framework for Scalable Personalized Ranking
In designing personalized ranking algorithms, it is desirable to encourage a
high precision at the top of the ranked list. Existing methods either seek a
smooth convex surrogate for a non-smooth ranking metric or directly modify
updating procedures to encourage top accuracy. In this work we point out that
these methods do not scale well to large-scale settings, partly due to
inaccurate pointwise or pairwise rank estimation. We propose a new
framework for personalized ranking. It uses batch-based rank estimators and
smooth rank-sensitive loss functions. This new batch learning framework leads
to more stable and accurate rank approximations compared to previous work.
Moreover, it enables explicit use of parallel computation to speed up training.
We conduct empirical evaluation on three item recommendation tasks. Our method
shows consistent accuracy improvements over state-of-the-art methods.
Additionally, we observe time-efficiency advantages as the data scale increases.
Comment: AAAI 2018, Feb 2-7, New Orleans, US
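The batch-based rank estimation idea above can be sketched as follows: the rank of a positive item is estimated by counting how many items in a sampled batch score above it, and a smooth rank-sensitive loss replaces the hard count with a sigmoid so it is differentiable. This is a hypothetical illustration of the general idea, not the paper's exact estimator or loss; both function names are mine.

```python
import numpy as np

def batch_rank_estimate(pos_score, neg_scores):
    """Estimate the rank of a positive item from one sampled batch by
    counting batch negatives that score above it."""
    return np.sum(neg_scores > pos_score)

def smooth_rank_loss(pos_score, neg_scores):
    """Smooth, rank-sensitive surrogate: log(1 + soft rank), where the
    hard indicator is replaced by a logistic sigmoid so the loss is
    differentiable and emphasizes the top of the ranked list."""
    soft_rank = np.sum(1.0 / (1.0 + np.exp(pos_score - neg_scores)))
    return np.log1p(soft_rank)
```

Because the estimate is computed over a whole batch of scores at once, it vectorizes naturally, which is consistent with the parallel-computation advantage the abstract mentions.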
Neural Collaborative Ranking
Recommender systems are aimed at generating a personalized ranked list of
items that an end user might be interested in. With the unprecedented success
of deep learning in computer vision and speech recognition, bridging the gap
between recommender systems and deep neural networks has recently become a hot
topic, and deep learning methods have been shown to achieve state-of-the-art
results on many recommendation tasks. For example, a recent model, NeuMF, first
projects users and items into some shared low-dimensional latent feature space,
and then employs neural nets to model the interaction between the user and item
latent features to obtain state-of-the-art performance on the recommendation
tasks. NeuMF assumes that the non-interacted items are inherently negative and
uses negative sampling to relax this assumption. In this paper, we examine an
alternative approach which does not assume that the non-interacted items are
necessarily negative, just that they are less preferred than interacted items.
Specifically, we develop a new classification strategy based on the widely used
pairwise ranking assumption. We combine our classification strategy with the
recently proposed neural collaborative filtering framework, and propose a
general collaborative ranking framework called Neural Network based
Collaborative Ranking (NCR). We resort to a neural network architecture to
model a user's pairwise preference between items, with the belief that a neural
network will effectively capture the latent structure of the latent factors. The
experimental results on two real-world datasets show the superior performance
of our models in comparison with several state-of-the-art approaches.
Comment: Proceedings of the 2018 ACM on Conference on Information and
Knowledge Management
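The pairwise assumption above (interacted items are preferred over non-interacted ones, not absolutely positive vs. negative) can be sketched with a toy scorer: a small network maps user and item latent features to a probability that the user prefers item i over item j. This is a one-layer illustration of the pairwise idea, not NCR's actual architecture; all names here are assumptions.

```python
import numpy as np

def pairwise_preference_prob(u, vi, vj, W):
    """Toy pairwise scorer: map (user, item) latent features through one
    tanh layer, take the difference of the two pair scores, and squash it
    to a probability that item i is preferred over item j. The difference
    makes the score antisymmetric: P(i > j) + P(j > i) = 1."""
    zi = np.tanh(W @ np.concatenate([u, vi]))
    zj = np.tanh(W @ np.concatenate([u, vj]))
    score = zi.sum() - zj.sum()
    return 1.0 / (1.0 + np.exp(-score))

# tiny demo with random latent features
rng = np.random.default_rng(1)
u, vi, vj = rng.normal(size=4), rng.normal(size=4), rng.normal(size=4)
W = rng.normal(size=(3, 8)) * 0.1
p_ij = pairwise_preference_prob(u, vi, vj, W)
```

Training such a scorer only needs relative labels (i was interacted with, j was not), which is exactly the weaker assumption the abstract argues for.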
Latitude: A Model for Mixed Linear-Tropical Matrix Factorization
Nonnegative matrix factorization (NMF) is one of the most frequently-used
matrix factorization models in data analysis. A significant reason for the
popularity of NMF is its interpretability and the `parts of whole'
interpretation of its components. Recently, max-times, or subtropical, matrix
factorization (SMF) has been introduced as an alternative model with equally
interpretable `winner takes it all' interpretation. In this paper we propose a
new mixed linear--tropical model, and a new algorithm, called Latitude, that
combines NMF and SMF and is able to alternate smoothly between the two. In our
model, the data are modeled using latent factors and latent parameters that
control whether the factors are interpreted as NMF or SMF features, or their
mixtures. We present an algorithm for our novel matrix factorization. Our
experiments show that our algorithm improves over both baselines, and can yield
interpretable results that reveal more of the latent structure than either NMF
or SMF alone.
Comment: 14 pages, 6 figures. To appear in 2018 SIAM International Conference
on Data Mining (SDM '18). For the source code, see
https://people.mpi-inf.mpg.de/~pmiettin/linear-tropical
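The two products being mixed above are easy to state side by side: NMF reconstructs data with the standard sum-product, while the subtropical (max-times) product replaces the sum over latent factors with a max. A simple hedged sketch with a single global mixing weight follows; Latitude's actual model attaches mixing parameters to the latent factors, so this scalar `alpha` is a simplification of mine.

```python
import numpy as np

def max_times(A, B):
    """Subtropical (max-times) matrix product:
    out[i, j] = max over k of A[i, k] * B[k, j]."""
    return np.max(A[:, :, None] * B[None, :, :], axis=1)

def mixed_reconstruction(A, B, alpha):
    """Illustrative mixed linear-tropical reconstruction: alpha = 1 gives
    the standard (NMF-style) product, alpha = 0 the subtropical one, and
    values in between blend the two. A simplified sketch, not Latitude's
    per-factor parameterization."""
    return alpha * (A @ B) + (1.0 - alpha) * max_times(A, B)

# small nonnegative example
A = np.array([[1.0, 2.0], [3.0, 0.0]])
B = np.array([[0.0, 1.0], [2.0, 1.0]])
```

Under the max-times product a single dominant factor explains each entry, which is the `winner takes it all' interpretation the abstract contrasts with NMF's additive `parts of whole' view.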