A step loss function based SVM classifier for binary classification

Fethi Jarray, Sabri Boughorbel, Mahmud Mansour, Ghassen Tlig

Research output: Contribution to journal › Conference article

1 Citation (Scopus)


In this paper, we propose a new cost function, the step loss, for support vector machine classifiers based on a finer distinction between the instances. It takes into account the position of each sample with respect to the margin. More precisely, we divide the instances into four categories: i) instances correctly classified that lie outside the margin, ii) instances correctly classified that lie within the margin, iii) instances misclassified that lie within the margin, and iv) instances misclassified that lie outside the margin. The step loss assigns a constant cost to each group of instances. In this sense it is more general than the hard margin cost, which divides the instances into only two categories. It is also more robust to outliers than the soft margin, because the instances of the fourth group incur a constant cost, in contrast to the hinge loss, where misclassified instances incur a cost that grows linearly. It is more accurate than the ramp loss, which hardly distinguishes between instances correctly classified within the margin and instances misclassified within the margin. Theoretically, we prove that the SVM model integrated with the step loss function has the nice property of kernelization.
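The four-category loss described in the abstract can be sketched as a piecewise-constant function of the functional margin t = y · f(x). This is a minimal illustrative sketch, not the authors' implementation; the four cost constants below are hypothetical placeholders, since the paper's actual values are not given here.

```python
def step_loss(margin, costs=(0.0, 0.5, 1.0, 2.0)):
    """Piecewise-constant step loss on the functional margin t = y * f(x).

    Illustrative cost constants (placeholders, not from the paper):
      costs[0]: correctly classified, outside the margin (t >= 1)
      costs[1]: correctly classified, within the margin  (0 <= t < 1)
      costs[2]: misclassified, within the margin         (-1 <= t < 0)
      costs[3]: misclassified, outside the margin        (t < -1)
    """
    if margin >= 1:
        return costs[0]
    if margin >= 0:
        return costs[1]
    if margin >= -1:
        return costs[2]
    # Constant (bounded) cost for far-away outliers, unlike the
    # hinge loss, whose cost keeps growing linearly with -margin.
    return costs[3]
```

Note how an extreme outlier (e.g. t = -10) incurs the same constant cost as a mildly misclassified point outside the margin, which is the robustness property the abstract contrasts with the hinge loss.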

Original language: English
Pages (from-to): 9-15
Number of pages: 7
Journal: Procedia Computer Science
Publication status: Published - 1 Jan 2018
Event: 9th International Conference on Emerging Ubiquitous Systems and Pervasive Networks, EUSPN 2018 - Leuven, Belgium
Duration: 5 Nov 2018 - 8 Nov 2018


Keywords

  • Classification
  • Integer programming
  • Loss function
  • Machine learning
  • SVM

ASJC Scopus subject areas

  • Computer Science(all)
