The LCCP for optimizing kernel parameters for SVM

Sabri Boughorbel, Jean Philippe Tarel, Nozha Boujemaa

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

3 Citations (Scopus)

Abstract

Tuning hyper-parameters is a necessary step for improving the performance of learning algorithms. For Support Vector Machine classifiers, adjusting the kernel parameters drastically increases recognition accuracy. Classically, cross-validation is performed by sweeping the parameter space exhaustively, and the complexity of such a grid search is exponential in the number of optimized parameters. Recently, a gradient descent approach was introduced in [1] that drastically reduces the number of search steps required to reach the optimal parameters. In this paper, we define the LCCP (Log Convex Concave Procedure), an optimization scheme derived from the CCCP (Convex ConCave Procedure), for optimizing kernel parameters by minimizing the radius-margin bound. To apply the LCCP, we prove, for a particular choice of kernel, that the radius is log-convex and the margin is log-concave. The LCCP is more efficient than the gradient descent technique because it ensures that the radius-margin bound decreases monotonically and converges to a local minimum without any step-size search. Experiments on standard data sets are presented and discussed.
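
For context, the radius-margin bound referenced in the abstract is T = R^2 * ||w||^2, where 1/||w|| is the SVM margin and R is the radius of the smallest ball enclosing the training data in feature space. The following sketch merely evaluates this bound for one fixed RBF kernel parameter; it is not the paper's LCCP, and the synthetic data set, C value, and gamma value are illustrative assumptions.

    # Hedged sketch: evaluate the radius-margin bound R^2 * ||w||^2 for a
    # trained SVM with a precomputed RBF kernel. A large C approximates the
    # hard-margin setting the bound is usually stated for.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.datasets import make_classification
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=80, n_features=5, random_state=0)
    y = 2 * y - 1                      # labels in {-1, +1}
    gamma = 0.5                        # illustrative kernel parameter
    K = rbf_kernel(X, X, gamma=gamma)

    # ||w||^2 from the SVM dual: sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j).
    # sklearn's dual_coef_ already stores y_i * alpha_i for the support vectors.
    svm = SVC(kernel="precomputed", C=1e3).fit(K, y)
    sv = svm.support_
    dc = svm.dual_coef_.ravel()
    w2 = dc @ K[np.ix_(sv, sv)] @ dc

    # R^2 (squared radius of the smallest enclosing ball in feature space)
    # via its standard dual: max_beta sum_i beta_i K_ii - beta^T K beta,
    # subject to beta >= 0 and sum_i beta_i = 1.
    n = len(X)
    diag = np.diag(K)                  # all ones for an RBF kernel
    obj = lambda b: -(b @ diag - b @ K @ b)
    res = minimize(obj, np.full(n, 1.0 / n),
                   method="SLSQP",
                   bounds=[(0.0, None)] * n,
                   constraints=({"type": "eq",
                                 "fun": lambda b: b.sum() - 1.0},))
    R2 = -res.fun

    print("radius-margin bound R^2 * ||w||^2 =", R2 * w2)

The LCCP is derived from the CCCP, whose mechanism explains the abstract's efficiency claim: split the objective into a convex part plus a concave part, linearize the concave part at the current iterate, and minimize the resulting convex surrogate. Each surrogate upper-bounds the objective and touches it at the iterate, so the true objective decreases monotonically and no step-size search is needed. Below is a minimal CCCP sketch on a toy one-dimensional difference-of-convex function; the functions u, v and the starting point are illustrative assumptions, not the paper's kernel objective.

    # Hedged sketch of the CCCP update rule on a toy objective
    # f(x) = u(x) + v(x) with u convex and v concave.
    from scipy.optimize import minimize_scalar

    def u(x):          # convex part (toy choice)
        return x**4

    def v(x):          # concave part (toy choice), v''(x) = -4 < 0 everywhere
        return -2.0 * x**2

    def v_prime(x):    # derivative of the concave part
        return -4.0 * x

    def cccp(x0, n_iter=50, tol=1e-10):
        """Iteratively minimize u(x) + v(x) by linearizing the concave part."""
        x = x0
        for _ in range(n_iter):
            g = v_prime(x)
            # Convex surrogate: u(z) + v(x) + g * (z - x). By concavity of v
            # it upper-bounds u + v and touches it at z = x, so the objective
            # can only decrease from one iterate to the next.
            x_new = minimize_scalar(lambda z: u(z) + v(x) + g * (z - x)).x
            if abs(x_new - x) < tol:
                break
            x = x_new
        return x

    x_star = cccp(3.0)
    print(x_star, u(x_star) + v(x_star))   # converges to 1.0, objective -1.0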

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 589-594
Number of pages: 6
Volume: 3697 LNCS
ISBN (Print): 3540287558, 9783540287551
Publication status: Published - 2005
Externally published: Yes
Event: 15th International Conference on Artificial Neural Networks: Biological Inspirations - ICANN 2005, Warsaw, Poland
Duration: 11 Sep 2005 - 15 Sep 2005

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3697 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 15th International Conference on Artificial Neural Networks: Biological Inspirations - ICANN 2005
Country: Poland
City: Warsaw
Period: 11/9/05 - 15/9/05

ASJC Scopus subject areas

  • Biochemistry, Genetics and Molecular Biology (all)
  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Boughorbel, S., Tarel, J. P., & Boujemaa, N. (2005). The LCCP for optimizing kernel parameters for SVM. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3697 LNCS, pp. 589-594). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 3697 LNCS).
