Global convergence of decomposition learning methods for support vector machines

Norikazu Takahashi, Tetsuo Nishi

Research output: Contribution to journal › Article

23 Citations (Scopus)

Abstract

Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we present a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations, under a very mild condition on how the variables are selected.
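
The working-set idea the abstract describes can be illustrated with Platt's SMO, the best-known decomposition method for the SVM dual QP: each iteration selects two multipliers that violate the KKT (optimality) conditions beyond a tolerance and solves the resulting two-variable QP in closed form. The sketch below is a simplified SMO on a toy linearly separable data set; it is illustrative only (the function names and the random second-variable selection are this sketch's choices, not the paper's selection rule).

```python
import numpy as np

def smo_train(X, y, C=1.0, tol=1e-3, max_passes=20, seed=0):
    """Simplified SMO: each iteration picks a KKT-violating multiplier i,
    a random partner j, and solves the 2-variable QP analytically."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                      # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]   # prediction error on i
            # KKT violation check (relaxed by tolerance tol)
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                if j >= i:
                    j += 1                          # any j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box bounds keeping the equality constraint sum(alpha*y)=0
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]   # second derivative along the line
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update the bias from whichever multiplier is strictly interior
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    w = (alpha * y) @ X
    return alpha, w, b

# Toy linearly separable problem
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, w, b = smo_train(X, y, C=10.0)
pred = np.sign(X @ w + b)
```

Note how the method terminates once no multiplier violates the KKT conditions by more than `tol`; this tolerance plays the same role as the paper's relaxed optimality condition, and the finite-termination result concerns exactly this kind of stopping rule.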

Original language: English
Pages (from-to): 1362-1369
Number of pages: 8
Journal: IEEE Transactions on Neural Networks
Volume: 17
Issue number: 6
DOIs: 10.1109/TNN.2006.880584
Publication status: Published - Nov 2006
Externally published: Yes

Keywords

  • Decomposition method
  • Global convergence
  • Quadratic programming (QP)
  • Support vector machines (SVMs)
  • Termination

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Theoretical Computer Science
  • Electrical and Electronic Engineering
  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Hardware and Architecture

Cite this

Global convergence of decomposition learning methods for support vector machines. / Takahashi, Norikazu; Nishi, Tetsuo.

In: IEEE Transactions on Neural Networks, Vol. 17, No. 6, 11.2006, p. 1362-1369.

@article{dd8158d5c6f74a7896f79d6a1852eb5f,
title = "Global convergence of decomposition learning methods for support vector machines",
abstract = "Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we will make a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how to select variables.",
keywords = "Decomposition method, Global convergence, Quadratic programming (QP), Support vector machines (SVMs), Termination",
author = "Norikazu Takahashi and Tetsuo Nishi",
year = "2006",
month = "11",
doi = "10.1109/TNN.2006.880584",
language = "English",
volume = "17",
pages = "1362--1369",
journal = "IEEE Transactions on Neural Networks",
issn = "1045-9227",
publisher = "IEEE Computational Intelligence Society",
number = "6",

}

TY - JOUR

T1 - Global convergence of decomposition learning methods for support vector machines

AU - Takahashi, Norikazu

AU - Nishi, Tetsuo

PY - 2006/11

Y1 - 2006/11

N2 - Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we will make a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how to select variables.

AB - Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem with only the selected variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we will make a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how to select variables.

KW - Decomposition method

KW - Global convergence

KW - Quadratic programming (QP)

KW - Support vector machines (SVMs)

KW - Termination

UR - http://www.scopus.com/inward/record.url?scp=34248677013&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34248677013&partnerID=8YFLogxK

U2 - 10.1109/TNN.2006.880584

DO - 10.1109/TNN.2006.880584

M3 - Article

C2 - 17131653

AN - SCOPUS:34248677013

VL - 17

SP - 1362

EP - 1369

JO - IEEE Transactions on Neural Networks

JF - IEEE Transactions on Neural Networks

SN - 1045-9227

IS - 6

ER -