
A universal machine learning optimization framework for arbitrary outputs

By Sandor Szedmak and Zakria Hussain


We derive a general Convex Linearly Constrained Program (CLCP) parameterized by a matrix G, constructed from the information given by the input-output pairs. The CLCP then chooses a set of regularization and loss functions in order to impose constraints for the learning task. We show that several algorithms, including the SVM, LPBoost, and Ridge Regression, can be solved in the same optimization framework when the appropriate choice of G, regularization, and loss function is made. Due to this unification we show that if G is constructed from more complex input-output paired information, then we can solve more difficult problems, such as structured output learning, with the same complexity as a regression/classification problem. We discuss various forms of G and then show on real-world enzyme prediction tasks, requiring structured outputs, that our method performs as well as the state of the art.
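To make the unification claim concrete, one of the special cases the abstract names, ridge regression, can be written down directly. The sketch below is illustrative only and does not use the paper's G-matrix construction or its CLCP solver; the function name, data, and regularization value are assumptions for the example.

```python
import numpy as np

def ridge_regression(X, y, lam=1.0):
    """Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y.

    Ridge regression is the squared loss with an L2 regularizer, one
    instance of the loss/regularizer choices the abstract describes.
    """
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Synthetic data (illustrative, not from the paper's experiments).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=50)

w = ridge_regression(X, y, lam=0.1)
```

Swapping the squared loss for a hinge loss and adding the corresponding linear constraints would recover an SVM in the same template, which is the kind of substitution the framework formalizes.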

Year: 2009
OAI identifier: oai:CiteSeerX.psu:
Provided by: CiteSeerX
