Variations to incremental growing neural gas algorithm based on label maximization

Jean Charles Lamirel, Raghvendra Mall, Pascal Cuxac, Ghada Safi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

22 Citations (Scopus)

Abstract

Neural clustering algorithms show high performance in the general context of the analysis of homogeneous textual datasets. This is especially true for the recent adaptive versions of these algorithms, such as the incremental growing neural gas algorithm (IGNG) and the labeling-maximization-based incremental growing neural gas algorithm (IGNG-F). In this paper we highlight a drastic decrease in the performance of these algorithms, as well as that of more classical algorithms, when a heterogeneous textual dataset is used as input. Specific quality measures and cluster labeling techniques that are independent of the clustering method are used for precise performance evaluation. We propose new variations of the incremental growing neural gas algorithm that exploit, in an incremental way, knowledge from clusters about their current labeling, along with cluster distance measure data. This solution leads to significant performance gains for all types of datasets, especially for the clustering of complex heterogeneous textual data.
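The core mechanism behind the incremental growing neural gas family — prototypes that adapt to inputs and a network that grows when an input is too far from every existing prototype — can be illustrated with a deliberately simplified sketch. This is not the paper's IGNG or IGNG-F: it omits edge ages, the embryo/mature neuron distinction, and the label-maximization criterion, and the names `igng_sketch`, `sigma`, and `eps_w` are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def igng_sketch(data, sigma, eps_w=0.05):
    """Simplified incremental-growing-neural-gas-style clustering.

    For each input vector, the nearest prototype is adapted toward it;
    if no prototype lies within the vigilance radius `sigma`, a new
    prototype is created at the input. The full IGNG additionally
    maintains edges with ages and embryo/mature neuron states, and
    IGNG-F replaces the distance test with a label-maximization one.
    """
    prototypes = []                       # list of 1-D numpy arrays
    for x in data:
        if not prototypes:
            prototypes.append(x.copy())   # first input seeds the network
            continue
        d = [np.linalg.norm(x - p) for p in prototypes]
        win = int(np.argmin(d))
        if d[win] > sigma:
            prototypes.append(x.copy())   # grow: input is far from all prototypes
        else:
            prototypes[win] += eps_w * (x - prototypes[win])  # adapt winner
    return np.array(prototypes)
```

With a smaller `sigma`, prototypes proliferate and clusters fragment, which is exactly the kind of sensitivity the paper's labeling-based growth criteria aim to reduce by combining label knowledge with distance information.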

Original language: English
Title of host publication: 2011 International Joint Conference on Neural Networks, IJCNN 2011 - Final Program
Pages: 956-965
Number of pages: 10
DOI: 10.1109/IJCNN.2011.6033326
Publication status: Published - 24 Oct 2011
Externally published: Yes
Event: 2011 International Joint Conference on Neural Networks, IJCNN 2011 - San Jose, CA, United States
Duration: 31 Jul 2011 - 5 Aug 2011



ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Lamirel, J. C., Mall, R., Cuxac, P., & Safi, G. (2011). Variations to incremental growing neural gas algorithm based on label maximization. In 2011 International Joint Conference on Neural Networks, IJCNN 2011 - Final Program (pp. 956-965). [6033326] https://doi.org/10.1109/IJCNN.2011.6033326

@inproceedings{43273c1ce7b24d07a77585448a328bcc,
title = "Variations to incremental growing neural gas algorithm based on label maximization",
abstract = "Neural clustering algorithms show high performance in the general context of the analysis of homogeneous textual datasets. This is especially true for the recent adaptive versions of these algorithms, such as the incremental growing neural gas algorithm (IGNG) and the labeling-maximization-based incremental growing neural gas algorithm (IGNG-F). In this paper we highlight a drastic decrease in the performance of these algorithms, as well as that of more classical algorithms, when a heterogeneous textual dataset is used as input. Specific quality measures and cluster labeling techniques that are independent of the clustering method are used for precise performance evaluation. We propose new variations of the incremental growing neural gas algorithm that exploit, in an incremental way, knowledge from clusters about their current labeling, along with cluster distance measure data. This solution leads to significant performance gains for all types of datasets, especially for the clustering of complex heterogeneous textual data.",
author = "Lamirel, {Jean Charles} and Raghvendra Mall and Pascal Cuxac and Ghada Safi",
year = "2011",
month = "10",
day = "24",
doi = "10.1109/IJCNN.2011.6033326",
language = "English",
isbn = "9781457710865",
pages = "956--965",
booktitle = "2011 International Joint Conference on Neural Networks, IJCNN 2011 - Final Program",

}
