### Abstract

A sequential minimal optimization (SMO) algorithm for support vector regression (SVR) was recently proposed by Flake and Lawrence. However, the convergence of their algorithm has not yet been proved. In this paper, we consider an SMO algorithm that deals with the same optimization problem as Flake and Lawrence's SMO, and give a rigorous proof that it always stops within a finite number of iterations.
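To make the abstract concrete: the core SMO idea is to analytically optimize one pair of dual variables per iteration while keeping the equality constraint satisfied. The sketch below is illustrative only, not the authors' algorithm — it uses the β = α − α* parameterization of the SVR dual with the tube width ε set to 0 (so the ε-insensitive case analysis is omitted), a linear kernel, and random pair selection instead of a principled working-set heuristic.

```python
import numpy as np

def smo_svr_step(beta, K, y, i, j, C):
    """One pairwise SMO update for a simplified SVR dual (eps = 0).

    Updates beta[i] and beta[j] jointly so that sum(beta) == 0 is
    preserved and both variables stay in the box [-C, C].
    """
    f = K @ beta                              # current model outputs
    eta = K[i, i] + K[j, j] - 2 * K[i, j]     # curvature along the pair direction
    if eta <= 1e-12:                          # degenerate pair: skip
        return beta
    # Unconstrained optimum of the dual objective along beta[i] += d, beta[j] -= d
    delta = ((y[i] - f[i]) - (y[j] - f[j])) / eta
    # Clip delta so both updated variables remain inside [-C, C]
    delta = min(delta, C - beta[i], beta[j] + C)
    delta = max(delta, -C - beta[i], beta[j] - C)
    beta = beta.copy()
    beta[i] += delta
    beta[j] -= delta
    return beta

def smo_svr(X, y, C=10.0, n_iter=200, seed=0):
    """Toy driver: repeated random-pair SMO steps with a linear kernel."""
    rng = np.random.default_rng(seed)
    K = X @ X.T
    beta = np.zeros(len(y))
    for _ in range(n_iter):
        i, j = rng.choice(len(y), size=2, replace=False)
        beta = smo_svr_step(beta, K, y, i, j, C)
    return beta
```

Each pairwise step is a clipped one-dimensional Newton step, which never decreases the dual objective — the finite-termination question the paper settles is precisely whether such steps, under the algorithm's working-set selection rule, must reach a stopping condition in finitely many iterations.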

Original language | English
---|---
Title of host publication | IEEE International Conference on Neural Networks - Conference Proceedings
Pages | 355-362
Number of pages | 8
Publication status | Published - 2006
Externally published | Yes
Event | International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada (Jul 16 2006 → Jul 21 2006)

### Other

Other | International Joint Conference on Neural Networks 2006, IJCNN '06
---|---
Country | Canada
City | Vancouver, BC
Period | 7/16/06 → 7/21/06

### ASJC Scopus subject areas

- Software

### Cite this

Guo, J., Takahashi, N., & Nishi, T. (2006). Convergence proof of a sequential minimal optimization algorithm for support vector regression. In *IEEE International Conference on Neural Networks - Conference Proceedings* (pp. 355-362). Article 1716114. ISBN 0780394909 (9780780394902).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Scopus record: http://www.scopus.com/inward/record.url?scp=40649090457&partnerID=8YFLogxK

Cited by (Scopus): http://www.scopus.com/inward/citedby.url?scp=40649090457&partnerID=8YFLogxK