
Implicit online learning

By Brian Kulis and Peter L. Bartlett

Abstract

Online learning algorithms have recently risen to prominence due to their strong theoretical guarantees and an increasing number of practical applications for large-scale data analysis problems. In this paper, we analyze a class of online learning algorithms based on fixed potentials and nonlinearized losses, which yields algorithms with implicit update rules. We show how to efficiently compute these updates, and we prove regret bounds for the algorithms. We apply our formulation to several special cases where our approach has benefits over existing online learning methods. In particular, we provide improved algorithms and bounds for the online metric learning problem, and show improved robustness for online linear prediction problems. Results over a variety of data sets demonstrate the advantages of our framework.
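As a rough illustration of the implicit updates the abstract refers to: in the simplest special case, squared loss with a fixed squared-Euclidean potential, the implicit (proximal) step solves a small optimization at each round rather than linearizing the loss, and it admits a closed form. This is only a sketch of that one special case, not the paper's general algorithm; the function and parameter names (`implicit_sgd_step`, `eta`) are illustrative, not taken from the paper.

```python
import numpy as np

def implicit_sgd_step(w, x, y, eta):
    """One implicit (proximal) update for squared loss with a fixed
    squared-Euclidean potential:

        w_new = argmin_v  (eta/2) * (v . x - y)^2 + (1/2) * ||v - w||^2

    For squared loss this has the closed form below (set the gradient to
    zero and note the solution moves only along x); for general losses
    the implicit update typically requires a small one-dimensional solve.
    """
    residual = w @ x - y
    return w - (eta * residual / (1.0 + eta * (x @ x))) * x

# Usage sketch: a few implicit steps on a noiseless linear target.
rng = np.random.default_rng(0)
w_star = np.array([1.0, -2.0])   # ground-truth weights (illustrative)
w = np.zeros(2)
for _ in range(200):
    x = rng.normal(size=2)
    y = w_star @ x
    w = implicit_sgd_step(w, x, y, eta=1.0)
```

Note the denominator `1 + eta * ||x||^2`: because the loss is not linearized, the step size is automatically damped on large inputs, which is one source of the robustness the abstract claims for implicit updates.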

Topics: 080100 ARTIFICIAL INTELLIGENCE AND IMAGE PROCESSING
Year: 2010
OAI identifier: oai:eprints.qut.edu.au:44001
Sorry, we are unable to provide the full text but you may find it at the following location(s):
  • http://www.icml2010.org/papers...
  • https://eprints.qut.edu.au/440...