Margin maximization with feed-forward neural networks

A comparative study with SVM and AdaBoost

Enrique Romero, Lluís Màrquez, Xavier Carreras

Research output: Contribution to journal › Article

27 Citations (Scopus)

Abstract

Feed-forward Neural Networks (FNN) and Support Vector Machines (SVM) are two machine learning frameworks developed from very different starting points. In this work, a new learning model for FNN is proposed that, in the linearly separable case, tends to obtain the same solution as SVM. The key idea of the model is a weighting of the sum-of-squares error function, inspired by the AdaBoost algorithm. As in SVM, the hardness of the margin can be controlled, so the model can also be used in the non-linearly separable case. In addition, it is not restricted to the use of kernel functions, and it can deal with multiclass and multilabel problems, as FNN usually do. Finally, it is independent of the particular algorithm used to minimize the error function. Theoretical and experimental results on synthetic and real-world problems confirm these claims. Several empirical comparisons among the new model, SVM, and AdaBoost were made to study the agreement between the predictions of the respective classifiers. The results also show that similar performance does not imply similar predictions, suggesting that different models can be combined to achieve better performance.
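
The record gives only the key idea, not the exact error function. As a rough illustration, a minimal Python sketch of an AdaBoost-inspired, margin-weighted sum-of-squares error might look as follows; the parameter beta is a hypothetical knob for the hardness of the margin, and the paper's actual formulation may differ.

    import numpy as np

    def margin_weighted_sse(f_x, y, beta=1.0):
        # f_x:  network outputs for each pattern, shape (n,)
        # y:    class targets in {-1, +1}, shape (n,)
        # beta: assumed margin-hardness parameter (larger beta -> harder margin)
        margins = y * f_x                  # per-pattern margin y_i * f(x_i)
        weights = np.exp(-beta * margins)  # exponential weighting in the spirit of
                                           # AdaBoost: low-margin patterns weigh more
        return np.sum(weights * (y - f_x) ** 2)

Because such a weighting depends only on the outputs and targets, any gradient-based optimizer could minimize it, consistent with the abstract's claim that the model is independent of the particular minimization algorithm.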

Original language: English
Pages (from-to): 313-344
Number of pages: 32
Journal: Neurocomputing
Volume: 57
Issue number: 1-4
DOI: 10.1016/j.neucom.2003.10.011
Publication status: Published - 1 Mar 2004
Externally published: Yes

Keywords

  • AdaBoost
  • Feed-forward Neural Networks
  • Margin maximization
  • NLP classification problems
  • Support Vector Machines

ASJC Scopus subject areas

  • Artificial Intelligence
  • Cellular and Molecular Neuroscience

Cite this

Margin maximization with feed-forward neural networks: A comparative study with SVM and AdaBoost. / Romero, Enrique; Màrquez, Lluís; Carreras, Xavier.

In: Neurocomputing, Vol. 57, No. 1-4, 01.03.2004, p. 313-344.

Research output: Contribution to journal › Article

@article{0ec025c3d8fd4e12932a950099144edd,
title = "Margin maximization with feed-forward neural networks: A comparative study with SVM and AdaBoost",
abstract = "Feed-forward Neural Networks (FNN) and Support Vector Machines (SVM) are two machine learning frameworks developed from very different starting points. In this work, a new learning model for FNN is proposed that, in the linearly separable case, tends to obtain the same solution as SVM. The key idea of the model is a weighting of the sum-of-squares error function, inspired by the AdaBoost algorithm. As in SVM, the hardness of the margin can be controlled, so the model can also be used in the non-linearly separable case. In addition, it is not restricted to the use of kernel functions, and it can deal with multiclass and multilabel problems, as FNN usually do. Finally, it is independent of the particular algorithm used to minimize the error function. Theoretical and experimental results on synthetic and real-world problems confirm these claims. Several empirical comparisons among the new model, SVM, and AdaBoost were made to study the agreement between the predictions of the respective classifiers. The results also show that similar performance does not imply similar predictions, suggesting that different models can be combined to achieve better performance.",
keywords = "AdaBoost, Feed-forward Neural Networks, Margin maximization, NLP classification problems, Support Vector Machines",
author = "Enrique Romero and Lluís Màrquez and Xavier Carreras",
year = "2004",
month = "3",
day = "1",
doi = "10.1016/j.neucom.2003.10.011",
language = "English",
volume = "57",
pages = "313--344",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",
number = "1-4",
}
