Projective Quadratic Regression for Online Learning
This paper considers online convex optimization (OCO) problems, the paramount
framework for online learning algorithm design. In the OCO setting, the loss
function of the learning task is based on streaming data, which makes OCO a
powerful tool for modeling large-scale applications such as online recommender
systems. Meanwhile, real-world data are usually extremely high-dimensional due
to modern feature engineering techniques, so quadratic regression is
impractical. Factorization Machines and their variants are efficient models for
capturing feature interactions with low-rank matrix models, but they cannot
fulfill the OCO setting due to their non-convexity. In this paper, we propose a
projective quadratic regression (PQR) model. First, it can capture the
important second-order feature information. Second, it is a convex model, so
the requirements of OCO are fulfilled and the globally optimal solution can be
achieved. Moreover, existing modern online optimization methods such as Online
Gradient Descent (OGD) and Follow-The-Regularized-Leader (FTRL) can be applied
directly. In addition, we show that, by choosing a proper hyper-parameter, the
model has the same order of space and time complexity as the linear model and
can thus handle high-dimensional data. Experimental results demonstrate the
accuracy and efficiency of the proposed PQR model in comparison with
state-of-the-art methods.

Comment: AAAI 202
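To make the OCO setting concrete, here is a minimal sketch of Online Gradient Descent, one of the standard online optimizers the abstract says can be applied to a convex model such as PQR. This is not the paper's PQR formulation; it runs OGD on a plain linear model with squared loss (the function name `ogd_linear` and the step-size schedule `eta0 / sqrt(t)` are illustrative assumptions, though the schedule is a common default in OCO). Note that the per-round cost is O(d) in time and space, which is the complexity order the abstract claims PQR matches under a suitable hyper-parameter choice.

```python
import numpy as np

def ogd_linear(stream, d, eta0=0.1):
    """Online Gradient Descent on squared loss with a linear model.

    Each round costs O(d) time and O(d) space. `stream` yields
    (x, y) pairs one at a time, as in the streaming-data setting;
    eta_t = eta0 / sqrt(t) is a standard OCO step-size schedule.
    """
    w = np.zeros(d)
    for t, (x, y) in enumerate(stream, start=1):
        pred = w @ x                       # O(d) prediction
        grad = (pred - y) * x              # gradient of 0.5 * (pred - y)**2
        w -= (eta0 / np.sqrt(t)) * grad    # OGD update with decaying step
    return w

# Toy usage: recover a planted weight vector from a noiseless stream.
rng = np.random.default_rng(0)
w_true = rng.standard_normal(5)
data = [(x, w_true @ x) for x in rng.standard_normal((2000, 5))]
w_hat = ogd_linear(iter(data), d=5)
```

Because the loss is convex in `w`, OGD enjoys the usual sublinear-regret guarantees; the paper's point is that PQR adds second-order feature information while preserving exactly this convexity, which Factorization Machines do not.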