A novel sequential minimal optimization algorithm for support vector regression

Jun Guo, Norikazu Takahashi, Tetsuo Nishi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

A novel sequential minimal optimization (SMO) algorithm for support vector regression is proposed. The algorithm is based on Flake and Lawrence's SMO, in which convex optimization problems with l variables are solved instead of standard quadratic programming problems with 2l variables, where l is the number of training samples; however, the strategy for working set selection is quite different. Experimental results show that the proposed algorithm is much faster than Flake and Lawrence's SMO and comparable in speed to the fastest conventional SMO.
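The reduction from 2l to l variables mentioned above comes from a well-known reparameterization of the SVR dual: each sample contributes a pair (alpha_i, alpha_i*) in [0, C], but at any solution at most one of the pair is nonzero, so the single variable beta_i = alpha_i - alpha_i* in [-C, C] suffices, with alpha_i + alpha_i* replaced by |beta_i|. The toy sketch below (not the paper's algorithm; the kernel, data, and feasible point are made up for illustration) checks numerically that the two objective forms agree under that condition:

```python
import numpy as np

rng = np.random.default_rng(0)
l, C, eps = 5, 1.0, 0.1          # toy sizes; C bounds the betas below
X = rng.normal(size=(l, 2))
y = rng.normal(size=l)
K = X @ X.T                      # linear kernel matrix, for illustration

# A feasible point with alpha_i * alpha_star_i = 0 for every i,
# as holds at any optimum of the SVR dual.
alpha      = np.array([0.3, 0.0, 0.5, 0.0, 0.2])
alpha_star = np.array([0.0, 0.4, 0.0, 0.6, 0.0])

def dual_2l(a, a_s):
    """Standard SVR dual objective in 2l variables (alpha, alpha*)."""
    d = a - a_s
    return -0.5 * d @ K @ d - eps * np.sum(a + a_s) + y @ d

def dual_l(beta):
    """Equivalent objective in l variables, beta_i in [-C, C]."""
    return -0.5 * beta @ K @ beta - eps * np.sum(np.abs(beta)) + y @ beta

beta = alpha - alpha_star
print(dual_2l(alpha, alpha_star), dual_l(beta))  # identical values
```

This l-variable form is the one exploited by Flake and Lawrence's SMO; the paper's contribution, per the abstract, is a different working-set selection strategy for optimizing it.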

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer Verlag
Pages: 827-836
Number of pages: 10
Volume: 4232 LNCS
ISBN (Print): 3540464794, 9783540464792
Publication status: Published - 2006
Externally published: Yes
Event: 13th International Conference on Neural Information Processing, ICONIP 2006 - Hong Kong, China
Duration: Oct 3, 2006 - Oct 6, 2006

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4232 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 13th International Conference on Neural Information Processing, ICONIP 2006
Country: China
City: Hong Kong
Period: 10/3/06 - 10/6/06

Fingerprint

  • Support Vector Regression
  • Optimization Algorithm
  • Convex Optimization
  • Quadratic Programming
  • Training Samples
  • Optimization Problem
  • Experimental Results

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Guo, J., Takahashi, N., & Nishi, T. (2006). A novel sequential minimal optimization algorithm for support vector regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4232 LNCS, pp. 827-836). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 4232 LNCS). Springer Verlag.

@inproceedings{c57df179c4204a368d304a034d60e3f5,
  title     = "A novel sequential minimal optimization algorithm for support vector regression",
  abstract  = "A novel sequential minimal optimization (SMO) algorithm for support vector regression is proposed. This algorithm is based on Flake and Lawrence's SMO in which convex optimization problems with l variables are solved instead of standard quadratic programming problems with 2l variables where l is the number of training samples, but the strategy for working set selection is quite different. Experimental results show that the proposed algorithm is much faster than Flake and Lawrence's SMO and comparable to the fastest conventional SMO.",
  author    = "Jun Guo and Norikazu Takahashi and Tetsuo Nishi",
  year      = "2006",
  language  = "English",
  isbn      = "3540464794",
  volume    = "4232 LNCS",
  series    = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
  publisher = "Springer Verlag",
  pages     = "827--836",
  booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
}