Exploiting diversity of margin-based classifiers

Enrique Romero, Xavier Carreras, Lluis Marques

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

An experimental comparison among Support Vector Machines, AdaBoost and a recently proposed model for maximizing the margin with Feed-forward Neural Networks has been made on a real-world classification problem, namely Text Categorization. The results obtained when comparing their agreement on the predictions show that similar performance does not imply similar predictions, suggesting that different models can be combined to obtain better performance. As a consequence of the study, we derived a very simple confidence measure for the predictions of the tested margin-based classifiers, based on the margin curve. Combining the margin-based classifiers with this confidence measure, through several well-known combination schemes, led to a marked improvement in the performance of the system.
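The abstract describes the approach only at a high level. As a minimal illustrative sketch (not the authors' method), the Python snippet below shows one plausible way to turn a classifier's signed margin into a confidence value and to combine several margin-based classifiers by confidence-weighted voting. The logistic mapping, function names and parameters are assumptions for illustration; the paper derives its confidence measure from the empirical margin curve, which is not reproduced here.

import numpy as np

def margin_confidence(margin, scale=1.0):
    # Map a signed margin to a confidence in (0.5, 1): larger |margin| -> higher confidence.
    # NOTE: this logistic squashing is an assumption, not the paper's margin-curve measure.
    return 1.0 / (1.0 + np.exp(-scale * np.abs(margin)))

def combine_predictions(margins):
    # Confidence-weighted vote over the signed margins of several margin-based
    # classifiers (e.g. an SVM, AdaBoost and a margin-maximizing feed-forward net).
    # margins: array of shape (n_classifiers, n_examples) with signed margins.
    # Returns +1/-1 predictions for each example.
    margins = np.asarray(margins, dtype=float)
    signs = np.sign(margins)              # each classifier's individual vote
    weights = margin_confidence(margins)  # per-prediction confidence
    scores = (weights * signs).sum(axis=0)
    return np.where(scores >= 0, 1, -1)

if __name__ == "__main__":
    # Three hypothetical classifiers, four binary test examples.
    m = np.array([[ 1.2, -0.1,  0.4, -2.0],
                  [ 0.8,  0.3, -0.2, -1.5],
                  [-0.1,  0.9,  0.6, -0.7]])
    print(combine_predictions(m))  # -> [ 1  1  1 -1]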

Original language: English
Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
Pages: 419-424
Number of pages: 6
Volume: 1
Publication status: Published - 2004
Externally published: Yes
Event: 2004 IEEE International Joint Conference on Neural Networks - Proceedings - Budapest, Hungary
Duration: 25 Jul 2004 - 29 Jul 2004

Other

Other: 2004 IEEE International Joint Conference on Neural Networks - Proceedings
Country: Hungary
City: Budapest
Period: 25/7/04 - 29/7/04

Fingerprint

  • Classifiers
  • Adaptive boosting
  • Feedforward neural networks
  • Support vector machines

ASJC Scopus subject areas

  • Software

Cite this

Romero, E., Carreras, X., & Marques, L. (2004). Exploiting diversity of margin-based classifiers. In IEEE International Conference on Neural Networks - Conference Proceedings (Vol. 1, pp. 419-424).

Exploiting diversity of margin-based classifiers. / Romero, Enrique; Carreras, Xavier; Marques, Lluis.

IEEE International Conference on Neural Networks - Conference Proceedings. Vol. 1 2004. p. 419-424.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Romero, E, Carreras, X & Marques, L 2004, Exploiting diversity of margin-based classifiers. in IEEE International Conference on Neural Networks - Conference Proceedings. vol. 1, pp. 419-424, 2004 IEEE International Joint Conference on Neural Networks - Proceedings, Budapest, Hungary, 25/7/04.
Romero E, Carreras X, Marques L. Exploiting diversity of margin-based classifiers. In IEEE International Conference on Neural Networks - Conference Proceedings. Vol. 1. 2004. p. 419-424
Romero, Enrique ; Carreras, Xavier ; Marques, Lluis. / Exploiting diversity of margin-based classifiers. IEEE International Conference on Neural Networks - Conference Proceedings. Vol. 1 2004. pp. 419-424
@inproceedings{ae51fedd99764b58a3eef8c13c2c2f35,
title = "Exploiting diversity of margin-based classifiers",
abstract = "An experimental comparison among Support Vector Machines, AdaBoost and a recently proposed model for maximizing the margin with Feed-forward Neural Networks has been made on a real-world classification problem, namely Text Categorization. The results obtained when comparing their agreement on the predictions show that similar performance does not imply similar predictions, suggesting that different models can be combined to obtain better performance. As a consequence of the study, we derived a very simple confidence measure for the predictions of the tested margin-based classifiers, based on the margin curve. Combining the margin-based classifiers with this confidence measure, through several well-known combination schemes, led to a marked improvement in the performance of the system.",
author = "Enrique Romero and Xavier Carreras and Lluis Marques",
year = "2004",
language = "English",
volume = "1",
pages = "419--424",
booktitle = "IEEE International Conference on Neural Networks - Conference Proceedings",

}

TY - GEN

T1 - Exploiting diversity of margin-based classifiers

AU - Romero, Enrique

AU - Carreras, Xavier

AU - Marques, Lluis

PY - 2004

Y1 - 2004

N2 - An experimental comparison among Support Vector Machines, AdaBoost and a recently proposed model for maximizing the margin with Feed-forward Neural Networks has been made on a real-world classification problem, namely Text Categorization. The results obtained when comparing their agreement on the predictions show that similar performance does not imply similar predictions, suggesting that different models can be combined to obtain better performance. As a consequence of the study, we derived a very simple confidence measure for the predictions of the tested margin-based classifiers, based on the margin curve. Combining the margin-based classifiers with this confidence measure, through several well-known combination schemes, led to a marked improvement in the performance of the system.

AB - An experimental comparison among Support Vector Machines, AdaBoost and a recently proposed model for maximizing the margin with Feed-forward Neural Networks has been made on a real-world classification problem, namely Text Categorization. The results obtained when comparing their agreement on the predictions show that similar performance does not imply similar predictions, suggesting that different models can be combined to obtain better performance. As a consequence of the study, we derived a very simple confidence measure for the predictions of the tested margin-based classifiers, based on the margin curve. Combining the margin-based classifiers with this confidence measure, through several well-known combination schemes, led to a marked improvement in the performance of the system.

UR - http://www.scopus.com/inward/record.url?scp=10944228640&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=10944228640&partnerID=8YFLogxK

M3 - Conference contribution

VL - 1

SP - 419

EP - 424

BT - IEEE International Conference on Neural Networks - Conference Proceedings

ER -