
    A Harmonic Extension Approach for Collaborative Ranking

    We present a new perspective on graph-based methods for collaborative ranking in recommender systems. Unlike user-based or item-based methods that compute a weighted average of ratings given by the nearest neighbors, or low-rank approximation methods using convex optimization and the nuclear norm, we formulate matrix completion as a series of semi-supervised learning problems and propagate the known ratings to the missing ones globally on the user-user or item-item graph. The semi-supervised learning problems are expressed as Laplace-Beltrami equations on a manifold, namely harmonic extension, and can be discretized by a point integral method. We show that our approach does not impose a low-rank Euclidean subspace on the data points, but instead minimizes the dimension of the underlying manifold. Our method, named LDM (low dimensional manifold), turns out to be particularly effective at generating rankings of items, showing decent computational efficiency and robust ranking quality compared to state-of-the-art methods.
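
    To make the propagation step concrete, here is a minimal sketch of classical harmonic extension on a similarity graph: known ratings are held fixed and the remaining values are obtained by requiring the graph Laplacian to vanish at the unlabeled nodes. The function name and the toy weight matrix are illustrative, and this is not the paper's point integral discretization or the full LDM method.

```python
import numpy as np

def harmonic_extension(W, y, labeled):
    """Propagate known ratings to unlabeled nodes by a harmonic solve.

    W       : (n, n) symmetric nonnegative similarity matrix
    y       : (n,) ratings; entries at unlabeled nodes are ignored
    labeled : (n,) boolean mask, True where the rating is known
    """
    L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian
    u = ~labeled
    f = y.astype(float).copy()
    # Harmonic condition (L f)_u = 0  =>  L_uu f_u = -L_ul f_l
    f[u] = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled)] @ y[labeled])
    return f

# Toy usage: four items on a chain graph, ratings known only at the endpoints.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
y = np.array([5.0, 0.0, 0.0, 1.0])
print(harmonic_extension(W, y, np.array([True, False, False, True])))
# -> [5.0, 3.67, 2.33, 1.0]: the known ratings are interpolated along the graph.
```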

    Manifold regularization for structured outputs via the joint kernel

    By exploiting the label dependencies among both the labeled and unlabeled data, semi-supervised learning often achieves better generalization performance than supervised learning. In this paper, we extend a popular graph-based semi-supervised learning method, namely manifold regularization, to structured outputs. This is performed directly via the joint kernel and allows a unified manifold regularization framework for both unstructured and structured data. Experimental results on various data sets with inter-dependent outputs demonstrate the usefulness of manifold information in improving prediction performance.
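
    For readers unfamiliar with the regularizer, here is a minimal sketch of the scalar (unstructured) case, Laplacian-regularized least squares: an ambient RKHS penalty plus a graph-Laplacian penalty over labeled and unlabeled points. The names lap_rls, gamma_a, and gamma_i are illustrative, and the paper's joint-kernel construction for structured outputs is not reproduced here.

```python
import numpy as np

def lap_rls(K, L, y, labeled, gamma_a=1e-2, gamma_i=1e-1):
    """Laplacian-regularized least squares for a scalar output.

    Fits f(x) = sum_j alpha_j K(x_j, x) by minimizing
        ||J (y - K alpha)||^2 + gamma_a * alpha' K alpha + gamma_i * alpha' K L K alpha,
    where the diagonal matrix J selects the labeled points; setting the
    gradient to zero and cancelling K gives the linear system solved below.

    K       : (n, n) kernel matrix over labeled + unlabeled points
    L       : (n, n) graph Laplacian built from all n points
    y       : (n,) targets; entries at unlabeled points are ignored
    labeled : (n,) boolean mask of labeled points
    """
    n = K.shape[0]
    J = np.diag(labeled.astype(float))
    A = J @ K + gamma_a * np.eye(n) + gamma_i * L @ K
    return np.linalg.solve(A, J @ y)
```

    The unlabeled points enter only through the Laplacian term, which pulls the predictions of nearby points on the manifold toward each other; with gamma_i set to zero this reduces to ordinary kernel ridge regression on the labeled points.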