A step loss function based SVM classifier for binary classification

Fethi Jarray, Sabri Boughorbel, Mahmud Mansour, Ghassen Tlig

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

In this paper, we propose a new cost function, the step loss, for support vector machine classifiers, based on a finer distinction between instances. It takes into account the position of the samples with respect to the margin. More precisely, we divide the instances into four categories: i) instances correctly classified that lie outside the margin, ii) instances correctly classified that lie within the margin, iii) instances misclassified that lie within the margin, and iv) instances misclassified that lie outside the margin. The step loss assigns a constant cost to each group of instances. It is therefore more general than the hard-margin cost, which divides the instances into only two categories. It is also more robust to outliers than the soft margin, because the instances of the fourth group incur a constant cost, in contrast to the hinge cost, where misclassified instances incur a linearly growing cost. It is more accurate than the ramp loss, because the ramp loss hardly distinguishes between instances correctly classified within the margin and instances misclassified within the margin. Theoretically, we prove that the SVM model integrated with the step loss function has the nice property of kernelization.
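The four-category idea in the abstract can be sketched as a piecewise-constant function of the functional margin t = y·f(x). This is a minimal illustration, not the paper's formulation: the thresholds at 0 and ±1 follow the standard SVM margin convention, and the per-group cost constants are arbitrary placeholders.

```python
def step_loss(t, costs=(0.0, 1.0, 2.0, 3.0)):
    """Illustrative piecewise-constant step loss on the margin value t = y * f(x).

    Categories (cost constants are placeholders, not the paper's values):
      t >= 1       i)   correctly classified, outside the margin
      0 <= t < 1   ii)  correctly classified, inside the margin
      -1 <= t < 0  iii) misclassified, inside the margin
      t < -1       iv)  misclassified, outside the margin (bounded cost)
    """
    if t >= 1.0:
        return costs[0]
    elif t >= 0.0:
        return costs[1]
    elif t >= -1.0:
        return costs[2]
    else:
        return costs[3]


def hinge_loss(t):
    """Standard soft-margin hinge loss, for comparison: grows linearly
    with the violation, so a far-away outlier dominates the objective."""
    return max(0.0, 1.0 - t)
```

Note how a gross outlier with t = -10 contributes hinge_loss(-10) = 11 to a soft-margin objective, while the step loss caps its contribution at the fourth group's constant, which is the robustness argument made in the abstract.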

Original language: English
Pages (from-to): 9-15
Number of pages: 7
Journal: Procedia Computer Science
Volume: 141
DOIs: 10.1016/j.procs.2018.10.123
Publication status: Published - 1 Jan 2018
Event: 9th International Conference on Emerging Ubiquitous Systems and Pervasive Networks, EUSPN 2018 - Leuven, Belgium
Duration: 5 Nov 2018 - 8 Nov 2018

Keywords

  • Classification
  • Integer programming
  • Loss function
  • Machine learning
  • SVM

ASJC Scopus subject areas

  • Computer Science (all)

Cite this

@article{556ccd09b49c4c7e96d66cca207c05fd,
title = "A step loss function based SVM classifier for binary classification",
abstract = "In this paper, we propose a new cost function, the step loss, for support vector machine classifiers, based on a finer distinction between instances. It takes into account the position of the samples with respect to the margin. More precisely, we divide the instances into four categories: i) instances correctly classified that lie outside the margin, ii) instances correctly classified that lie within the margin, iii) instances misclassified that lie within the margin, and iv) instances misclassified that lie outside the margin. The step loss assigns a constant cost to each group of instances. It is therefore more general than the hard-margin cost, which divides the instances into only two categories. It is also more robust to outliers than the soft margin, because the instances of the fourth group incur a constant cost, in contrast to the hinge cost, where misclassified instances incur a linearly growing cost. It is more accurate than the ramp loss, because the ramp loss hardly distinguishes between instances correctly classified within the margin and instances misclassified within the margin. Theoretically, we prove that the SVM model integrated with the step loss function has the nice property of kernelization.",
keywords = "Classification, Integer programming, Loss function, Machine learning, SVM",
author = "Fethi Jarray and Sabri Boughorbel and Mahmud Mansour and Ghassen Tlig",
year = "2018",
month = "1",
day = "1",
doi = "10.1016/j.procs.2018.10.123",
language = "English",
volume = "141",
pages = "9--15",
journal = "Procedia Computer Science",
issn = "1877-0509",
publisher = "Elsevier BV",

}
