Abstract
Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given ℓ training samples, SVR is formulated as a convex quadratic programming (QP) problem with ℓ pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each step and the subproblems are solved in a certain way, then the SMO algorithm terminates after a finite number of iterations, having found an optimal solution. Efficient implementation techniques for the SMO algorithm are also presented and compared experimentally with other SMO algorithms.
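The working-set strategy the abstract describes (pick a pair of variables that violates the optimality condition, then solve the resulting two-variable subproblem analytically) can be illustrated on a simplified SVR-style dual. This is a minimal sketch, not the paper's exact formulation: the QP below uses a single variable per sample (`x_i` standing in for `α_i − α_i*`), a box `[-C, C]`, the equality constraint `sum(x) = 0`, and an assumed tolerance-based stopping rule.

```python
import numpy as np

def smo_sketch(Q, p, C, tol=1e-6, max_iter=1000):
    """Minimize 0.5 x^T Q x - p^T x  s.t.  sum(x) = 0, -C <= x_i <= C,
    by SMO-style two-variable updates (illustrative, not the paper's algorithm)."""
    n = len(p)
    x = np.zeros(n)  # feasible start: sum(x) = 0, inside the box
    for _ in range(max_iter):
        g = Q @ x - p  # gradient of the objective
        # Working-set selection: the most violating pair.
        # i may increase (x_i < C), j may decrease (x_j > -C).
        up = np.where(x < C - 1e-12)[0]
        dn = np.where(x > -C + 1e-12)[0]
        i = up[np.argmin(g[up])]
        j = dn[np.argmax(g[dn])]
        if g[j] - g[i] < tol:  # optimality condition holds up to tol
            break
        # Analytic two-variable subproblem: x_i += t, x_j -= t
        # preserves sum(x) = 0; minimize over t, then clip to the box.
        quad = Q[i, i] + Q[j, j] - 2.0 * Q[i, j]
        t = (g[j] - g[i]) / max(quad, 1e-12)
        t = min(t, C - x[i], x[j] + C)
        x[i] += t
        x[j] -= t
    return x

# Tiny example: Q = I, p = [1, -1], C = 1 has optimum x = [1, -1].
x = smo_sketch(np.array([[1.0, 0.0], [0.0, 1.0]]), np.array([1.0, -1.0]), C=1.0)
```

Each iteration only touches two coordinates, which is what makes SMO cheap per step; the paper's contribution is proving that this kind of loop, with a suitable selection rule and subproblem solver, cannot cycle and must stop at an optimum.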
Original language | English |
---|---|
Pages (from-to) | 971-982 |
Number of pages | 12 |
Journal | IEEE Transactions on Neural Networks |
Volume | 19 |
Issue number | 6 |
DOIs | |
Publication status | Published - Dec 1 2008 |
Externally published | Yes |
Keywords
- Convergence
- Quadratic programming (QP)
- Sequential minimal optimization (SMO)
- Support vector regression (SVR)
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence