Global convergence analysis of decomposition methods for support vector regression

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Decomposition methods have been widely used to efficiently solve the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, each of which can be solved much faster than the original problem. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that the decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.
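
The abstract describes the decomposition strategy only in general terms. The sketch below is a minimal, illustrative working-set loop for the standard SVR dual in the (alpha, alpha*) variables; it is not the Flake-Lawrence formulation analyzed in the paper, and the toy data, RBF kernel, random working-set selection, SLSQP subproblem solver, and crude stopping rule are all assumptions introduced here purely for illustration.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Toy 1-D regression data and an RBF kernel (illustrative choices only).
    n = 40
    X = np.sort(rng.uniform(-3, 3, size=(n, 1)), axis=0)
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

    def rbf_kernel(A, Z, gamma=0.5):
        d2 = ((A[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    K = rbf_kernel(X, X)
    C, eps = 1.0, 0.1   # box constraint and epsilon-insensitive tube width
    q = 4               # number of training points re-optimized per subproblem

    def dual_objective(a, a_star):
        # Standard SVR dual objective (to be minimized) in the (alpha, alpha*) variables.
        beta = a - a_star
        return 0.5 * beta @ K @ beta + eps * (a + a_star).sum() - y @ beta

    def solve_subproblem(a, a_star, B):
        # Re-optimize only the 2*q variables indexed by the working set B,
        # keeping all other variables fixed at their current values.
        N = np.setdiff1d(np.arange(n), B)
        rhs = -(a[N] - a_star[N]).sum()          # keeps sum(alpha - alpha*) = 0
        x0 = np.concatenate([a[B], a_star[B]])   # current values give a feasible start

        def obj(x):
            aa, aa_s = a.copy(), a_star.copy()
            aa[B], aa_s[B] = x[:q], x[q:]
            return dual_objective(aa, aa_s)

        cons = {"type": "eq", "fun": lambda x: (x[:q] - x[q:]).sum() - rhs}
        res = minimize(obj, x0, method="SLSQP",
                       bounds=[(0.0, C)] * (2 * q), constraints=[cons])
        a[B], a_star[B] = res.x[:q], res.x[q:]
        return a, a_star

    # Decomposition loop: the full QP is replaced by a sequence of small subproblems.
    a, a_star = np.zeros(n), np.zeros(n)
    prev = dual_objective(a, a_star)
    for it in range(300):
        B = rng.choice(n, size=q, replace=False)   # simplistic random working set
        a, a_star = solve_subproblem(a, a_star, B)
        cur = dual_objective(a, a_star)
        if it > 50 and prev - cur < 1e-9:          # crude stopping rule, not a KKT test
            break
        prev = cur

    print("final dual objective:", dual_objective(a, a_star))

A dedicated solver would instead select the working set from the most violated optimality conditions and solve each small subproblem analytically; the paper's result is that decomposition methods for the Flake-Lawrence convex programming formulation always stop within a finite number of iterations.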

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 663-673
Number of pages: 11
Volume: 5263 LNCS
Edition: PART 1
DOIs: https://doi.org/10.1007/978-3-540-87732-5-74
Publication status: Published - 2008
Externally published: Yes
Event: 5th International Symposium on Neural Networks, ISNN 2008 - Beijing, China
Duration: Sep 24, 2008 - Sep 28, 2008

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 1
Volume: 5263 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 5th International Symposium on Neural Networks, ISNN 2008
Country: China
City: Beijing
Period: 9/24/08 - 9/28/08

Keywords

  • Decomposition method
  • Global convergence
  • Support vector regression

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Guo, J., & Takahashi, N. (2008). Global convergence analysis of decomposition methods for support vector regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (PART 1 ed., Vol. 5263 LNCS, pp. 663-673). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5263 LNCS, No. PART 1). https://doi.org/10.1007/978-3-540-87732-5-74

Global convergence analysis of decomposition methods for support vector regression. / Guo, Jun; Takahashi, Norikazu.

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Vol. 5263 LNCS PART 1. ed. 2008. p. 663-673 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5263 LNCS, No. PART 1).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Guo, J & Takahashi, N 2008, Global convergence analysis of decomposition methods for support vector regression. in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). PART 1 edn, vol. 5263 LNCS, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), no. PART 1, vol. 5263 LNCS, pp. 663-673, 5th International Symposium on Neural Networks, ISNN 2008, Beijing, China, 9/24/08. https://doi.org/10.1007/978-3-540-87732-5-74
Guo J, Takahashi N. Global convergence analysis of decomposition methods for support vector regression. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). PART 1 ed. Vol. 5263 LNCS. 2008. p. 663-673. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); PART 1). https://doi.org/10.1007/978-3-540-87732-5-74
Guo, Jun ; Takahashi, Norikazu. / Global convergence analysis of decomposition methods for support vector regression. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Vol. 5263 LNCS PART 1. ed. 2008. pp. 663-673 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); PART 1).
@inproceedings{7852cb5fd6b14efb9d35274a82d1abaf,
title = "Global convergence analysis of decomposition methods for support vector regression",
abstract = "Decomposition method has been widely used to efficiently solve the large size quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, which can be solved much faster than the original one. In this paper, we analyze the global convergence of decomposition methods for SVR. We will show the decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.",
keywords = "Decomposition method, Global convergence, Support vector regression",
author = "Jun Guo and Norikazu Takahashi",
year = "2008",
doi = "10.1007/978-3-540-87732-5-74",
language = "English",
isbn = "3540877312",
volume = "5263 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 1",
pages = "663--673",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
edition = "PART 1",

}

TY - GEN

T1 - Global convergence analysis of decomposition methods for support vector regression

AU - Guo, Jun

AU - Takahashi, Norikazu

PY - 2008

Y1 - 2008

N2 - Decomposition methods have been widely used to efficiently solve the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, each of which can be solved much faster than the original problem. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that the decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.

AB - Decomposition methods have been widely used to efficiently solve the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, each of which can be solved much faster than the original problem. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that the decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.

KW - Decomposition method

KW - Global convergence

KW - Support vector regression

UR - http://www.scopus.com/inward/record.url?scp=59149088377&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=59149088377&partnerID=8YFLogxK

U2 - 10.1007/978-3-540-87732-5-74

DO - 10.1007/978-3-540-87732-5-74

M3 - Conference contribution

AN - SCOPUS:59149088377

SN - 3540877312

SN - 9783540877318

VL - 5263 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 663

EP - 673

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -