Convergence proof of a sequential minimal optimization algorithm for support vector regression

Jun Guo, Norikazu Takahashi, Tetsuo Nishi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

A sequential minimal optimization (SMO) algorithm for support vector regression (SVR) has recently been proposed by Flake and Lawrence. However, the convergence of their algorithm has not yet been proved. In this paper, we consider an SMO algorithm that deals with the same optimization problem as Flake and Lawrence's SMO, and give a rigorous proof that it always stops within a finite number of iterations.
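The SMO approach the abstract refers to decomposes the SVR dual problem into two-variable subproblems, each solved while keeping the equality and box constraints satisfied. The sketch below is only an illustration of that decomposition idea, not the authors' algorithm: it uses the common β_i = α_i − α_i* reformulation of the ε-insensitive SVR dual (box constraints |β_i| ≤ C, equality constraint Σβ_i = 0) and replaces the analytic per-pair minimization with a simple grid search over the feasible segment. All data and parameter values are made up for illustration.

```python
import numpy as np

def svr_dual_objective(beta, K, y, eps):
    """SVR dual in beta form: W(beta) = 0.5 b^T K b - y^T b + eps * ||b||_1."""
    return 0.5 * beta @ K @ beta - y @ beta + eps * np.abs(beta).sum()

def pairwise_update(beta, i, j, K, y, eps, C, n_grid=201):
    """One SMO-style step: move (beta_i, beta_j) along beta_i += t, beta_j -= t,
    which preserves sum(beta) = 0; t is restricted so both stay in [-C, C].
    A grid search stands in for the analytic two-variable minimization."""
    lo = max(-C - beta[i], beta[j] - C)
    hi = min(C - beta[i], beta[j] + C)
    best, best_val = beta, svr_dual_objective(beta, K, y, eps)
    for t in np.linspace(lo, hi, n_grid):
        trial = beta.copy()
        trial[i] += t
        trial[j] -= t
        val = svr_dual_objective(trial, K, y, eps)
        if val < best_val:
            best, best_val = trial, val
    return best

# Tiny illustrative regression problem with a linear kernel.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))
y = X @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=6)
K = X @ X.T
C, eps = 1.0, 0.1

beta = np.zeros(6)  # feasible start: sum(beta) = 0 and |beta_i| <= C
start = svr_dual_objective(beta, K, y, eps)
for _ in range(20):  # sweep over all coefficient pairs
    for i in range(6):
        for j in range(i + 1, 6):
            beta = pairwise_update(beta, i, j, K, y, eps, C)
final = svr_dual_objective(beta, K, y, eps)
```

Because each pairwise step only accepts a strictly lower objective value and every step preserves both constraints, the dual objective decreases monotonically over the sweeps; the paper's contribution is proving that the (analytic) version of such an iteration terminates after finitely many steps.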

Original language: English
Title of host publication: International Joint Conference on Neural Networks 2006, IJCNN '06
Pages: 355-362
Number of pages: 8
Publication status: Published - Dec 1 2006
Externally published: Yes
Event: International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada
Duration: Jul 16 2006 - Jul 21 2006

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
ISSN (Print): 1098-7576

Other

Other: International Joint Conference on Neural Networks 2006, IJCNN '06
Country: Canada
City: Vancouver, BC
Period: 7/16/06 - 7/21/06

ASJC Scopus subject areas

  • Software

Cite this

Guo, J., Takahashi, N., & Nishi, T. (2006). Convergence proof of a sequential minimal optimization algorithm for support vector regression. In International Joint Conference on Neural Networks 2006, IJCNN '06 (pp. 355-362). [1716114] (IEEE International Conference on Neural Networks - Conference Proceedings).