TY - GEN
T1 - Convergence proof of a sequential minimal optimization algorithm for support vector regression
AU - Guo, Jun
AU - Takahashi, Norikazu
AU - Nishi, Tetsuo
PY - 2006/12/1
Y1 - 2006/12/1
N2 - A sequential minimal optimization (SMO) algorithm for support vector regression (SVR) has recently been proposed by Flake and Lawrence. However, the convergence of their algorithm has not yet been proved. In this paper, we consider an SMO algorithm that deals with the same optimization problem as Flake and Lawrence's SMO, and give a rigorous proof that it always stops within a finite number of iterations.
AB - A sequential minimal optimization (SMO) algorithm for support vector regression (SVR) has recently been proposed by Flake and Lawrence. However, the convergence of their algorithm has not yet been proved. In this paper, we consider an SMO algorithm that deals with the same optimization problem as Flake and Lawrence's SMO, and give a rigorous proof that it always stops within a finite number of iterations.
UR - http://www.scopus.com/inward/record.url?scp=40649090457&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=40649090457&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:40649090457
SN - 0780394909
SN - 9780780394902
T3 - IEEE International Conference on Neural Networks - Conference Proceedings
SP - 355
EP - 362
BT - International Joint Conference on Neural Networks 2006, IJCNN '06
T2 - International Joint Conference on Neural Networks 2006, IJCNN '06
Y2 - 16 July 2006 through 21 July 2006
ER -