Global convergence of SMO algorithm for support vector regression

Norikazu Takahashi, Jun Guo, Tetsuo Nishi

Research output: Contribution to journal › Article › peer-review

35 Citations (Scopus)

Abstract

Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given ℓ training samples, SVR is formulated as a convex quadratic programming (QP) problem with ℓ pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each step and the subproblems are solved in a certain way, then the SMO algorithm always stops after a finite number of iterations, having found an optimal solution. Efficient implementation techniques for the SMO algorithm are also presented and compared experimentally with other SMO algorithms.
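The abstract describes SMO for the SVR dual: a QP over pairs of variables (αᵢ, αᵢ*), solved by repeatedly selecting variables that violate the optimality condition and optimizing an analytic two-variable subproblem. The sketch below is not the paper's specific two-pairs-per-step variant; it is a minimal, assumed illustration of the standard maximal-violating-pair SMO applied to the ε-insensitive SVR dual, with all names (`smo_svr`, `beta`, `score`) invented for this example.

```python
import numpy as np

def smo_svr(K, y, C=1.0, eps=0.1, tol=1e-3, max_iter=10000):
    """Illustrative SMO sketch for the SVR dual (not the paper's exact variant).

    The 2l dual variables are stored as v = [alpha; alpha_star], with
    constraint sign s = +1 for alpha and -1 for alpha_star, subject to
    sum(s * v) = 0 and 0 <= v <= C.
    """
    l = len(y)
    v = np.zeros(2 * l)                       # [alpha_1..alpha_l, alpha*_1..alpha*_l]
    s = np.concatenate([np.ones(l), -np.ones(l)])
    idx = np.concatenate([np.arange(l), np.arange(l)])  # sample index of each variable
    E = -y.astype(float)                      # E_p = (K beta)_p - y_p, beta = alpha - alpha*

    for _ in range(max_iter):
        grad = s * E[idx] + eps               # dual gradient for each of the 2l variables
        score = -s * grad
        up = ((s > 0) & (v < C)) | ((s < 0) & (v > 0))
        low = ((s > 0) & (v > 0)) | ((s < 0) & (v < C))
        i = np.where(up)[0][np.argmax(score[up])]
        j = np.where(low)[0][np.argmin(score[low])]
        if score[i] - score[j] < tol:         # optimality condition (up to tolerance)
            break
        p, q = idx[i], idx[j]
        eta = max(K[p, p] + K[q, q] - 2.0 * K[p, q], 1e-12)
        delta = (score[i] - score[j]) / eta   # unconstrained step for the 2-variable QP
        room_i = C - v[i] if s[i] > 0 else v[i]
        room_j = v[j] if s[j] > 0 else C - v[j]
        delta = min(delta, room_i, room_j)    # clip to the box [0, C]
        v[i] += s[i] * delta
        v[j] -= s[j] * delta
        E += delta * (K[:, p] - K[:, q])      # beta_p += delta, beta_q -= delta

    beta = v[:l] - v[l:]
    # bias from the KKT conditions: b lies between the extreme scores
    grad = s * E[idx] + eps
    score = -s * grad
    up = ((s > 0) & (v < C)) | ((s < 0) & (v > 0))
    low = ((s > 0) & (v > 0)) | ((s < 0) & (v < C))
    b = 0.5 * (score[up].max() + score[low].min())
    return beta, b

# Usage on a small linear-regression toy problem (linear kernel).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * X + 1.0
K = np.outer(X, X)
beta, b = smo_svr(K, y, C=100.0, eps=0.1)
pred = K @ beta + b                           # fit lies within the eps-tube of y
```

Each iteration touches only two of the 2ℓ variables, which is what makes SMO attractive: the two-variable subproblem has a closed-form solution, so no generic QP solver is needed.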

Original language: English
Pages (from-to): 971-982
Number of pages: 12
Journal: IEEE Transactions on Neural Networks
Volume: 19
Issue number: 6
DOIs
Publication status: Published - Dec 1 2008
Externally published: Yes

Keywords

  • Convergence
  • Quadratic programming (QP)
  • Sequential minimal optimization (SMO)
  • Support vector regression (SVR)

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

