Global convergence of decomposition learning methods for support vector machines

Norikazu Takahashi, Tetsuo Nishi

Research output: Contribution to journal › Article

24 Citations (Scopus)

Abstract

Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we present a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how variables are selected.
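As a concrete illustration of the kind of method the abstract describes, the sketch below implements a minimal two-variable (SMO-style) decomposition loop for the standard soft-margin SVM dual in Python. It is not the general scheme analyzed in the paper, which covers arbitrary working-set sizes and selection rules; here the maximal-violating-pair selection and the tolerance tau stand in for one particular variable-selection rule and a relaxed optimality (stopping) condition, and all names (smo_decomposition, K, y, C, tau) are illustrative assumptions.

```python
import numpy as np

def smo_decomposition(K, y, C=1.0, tau=1e-3, max_iter=100000):
    """Minimal two-variable decomposition (SMO-style) for the SVM dual
        min 0.5 * a' Q a - e' a   s.t.  y' a = 0,  0 <= a_i <= C,
    with Q[i, j] = y[i] * y[j] * K[i, j].  Iteration stops once the
    maximal KKT violation falls below tau, i.e. a relaxed optimality
    condition is met rather than exact optimality."""
    y = np.asarray(y, dtype=float)
    n = y.size
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(n)
    grad = -np.ones(n)            # gradient Q*alpha - e at alpha = 0

    for _ in range(max_iter):
        yg = -y * grad
        # Index sets for the maximal-violating-pair selection rule.
        up = ((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0))
        low = ((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0))
        m = np.where(up, yg, -np.inf)
        M = np.where(low, yg, np.inf)
        i, j = int(m.argmax()), int(M.argmin())

        # Relaxed optimality condition: violation within tolerance tau.
        if m[i] - M[j] <= tau:
            break

        # Analytic solution of the two-variable subproblem, clipped to [0, C].
        a_ij = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], 1e-12)
        t = (yg[i] - yg[j]) / a_ij
        t = min(t, C - alpha[i] if y[i] > 0 else alpha[i])
        t = min(t, alpha[j] if y[j] > 0 else C - alpha[j])

        alpha[i] += y[i] * t
        alpha[j] -= y[j] * t
        grad += t * (y[i] * Q[:, i] - y[j] * Q[:, j])

    return alpha

# Toy usage (hypothetical data): linear kernel on four points.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha = smo_decomposition(X @ X.T, y, C=10.0)
```

Only two variables change per iteration and only two columns of Q are touched, which reflects why decomposition methods avoid large matrix computations and scale to large QP problems.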

Original language: English
Pages (from-to): 1362-1369
Number of pages: 8
Journal: IEEE Transactions on Neural Networks
Volume: 17
Issue number: 6
Publication status: Published - Nov 2006
Externally published: Yes

Keywords

  • Decomposition method
  • Global convergence
  • Quadratic programming (QP)
  • Support vector machines (SVMs)
  • Termination

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture
