Margin maximization with feed-forward neural networks: A comparative study with SVM and AdaBoost

Enrique Romero, Lluís Màrquez, Xavier Carreras

Research output: Contribution to journal › Article

28 Citations (Scopus)


Feed-forward Neural Networks (FNN) and Support Vector Machines (SVM) are two machine learning frameworks developed from very different points of view. In this work, a new learning model for FNN is proposed such that, in the linearly separable case, it tends to obtain the same solution as SVM. The key idea of the model is a weighting of the sum-of-squares error function, inspired by the AdaBoost algorithm. As in SVM, the hardness of the margin can be controlled, so the model can also be used in the non-linearly separable case. In addition, it is not restricted to the use of kernel functions, and it allows dealing with multiclass and multilabel problems, as FNN usually do. Finally, it is independent of the particular algorithm used to minimize the error function. Theoretical and experimental results on synthetic and real-world problems confirm these claims. Several empirical comparisons among the new model, SVM, and AdaBoost were made in order to study the agreement between the predictions of the respective classifiers. The results show that similar performance does not imply similar predictions, suggesting that different models can be combined to achieve better performance.
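The key idea above, a margin-dependent weighting of the sum-of-squares error, can be illustrated with a minimal sketch. The exact weighting function used in the paper is not given in this abstract; the exponential, AdaBoost-style weighting below and the `beta` margin-hardness parameter are assumptions for illustration only.

```python
import numpy as np

def weighted_sse(outputs, targets, beta=1.0):
    """Sketch of a margin-weighted sum-of-squares error.

    outputs: network outputs f(x) for each example
    targets: labels y in {-1, +1}
    beta:    hypothetical hardness parameter (larger beta penalizes
             small margins more aggressively), by analogy with the
             margin-hardness control mentioned in the abstract.
    """
    # Margin of each example: y * f(x); negative means misclassified.
    margins = targets * outputs
    # Hypothetical AdaBoost-style exponential weighting: examples with
    # small or negative margins dominate the error, pushing the
    # decision boundary toward a large-margin (SVM-like) solution.
    weights = np.exp(-beta * margins)
    errors = (outputs - targets) ** 2
    return np.sum(weights * errors)
```

Under this weighting, a misclassified example contributes far more to the error than a correctly classified one with the same squared residual, which is the mechanism by which the weighted loss emphasizes margin maximization.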

Original language: English
Pages (from-to): 313-344
Number of pages: 32
Issue number: 1-4
Publication status: Published - 1 Mar 2004
Externally published: Yes


Keywords

  • AdaBoost
  • Feed-forward Neural Networks
  • Margin maximization
  • NLP classification problems
  • Support Vector Machines

ASJC Scopus subject areas

  • Artificial Intelligence
  • Cellular and Molecular Neuroscience
