Optimal reduced sets for sparse kernel spectral clustering

Raghvendra Mall, Siamak Mehrkanoon, Rocco Langone, Johan A.K. Suykens

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Kernel spectral clustering (KSC) solves a weighted kernel principal component analysis problem in a primal-dual optimization framework and yields a clustering model from the dual solution of the problem. It has a powerful out-of-sample extension property leading to good clustering generalization on unseen data points. This out-of-sample extension property allows one to build a sparse model on a small training set and introduces the first level of sparsity. The clustering dual model, however, is expressed in terms of non-sparse kernel expansions to which every point in the training set contributes. The goal is to find a reduced set of training points that best approximates the original solution. In this paper a second level of sparsity is introduced in order to reduce the time complexity of the computationally expensive out-of-sample extension. We investigate various penalty-based reduced-set techniques, including Group Lasso, L0, and L1 + L0 penalization, and compare the amount of sparsity gained with respect to a previous L1 penalization technique. We observe that in the majority of cases the optimal results in terms of sparsity correspond to the Group Lasso penalization technique. We showcase the effectiveness of the proposed approaches on several real-world datasets and an image segmentation dataset.
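The reduced-set idea described in the abstract, i.e. approximating a full kernel expansion with a row-sparse coefficient matrix via a Group Lasso penalty, can be sketched as follows. This is an illustrative proximal-gradient implementation under assumed conventions, not the authors' code; the function name `group_lasso_reduced_set`, its parameters, and the synthetic data are assumptions.

```python
import numpy as np

def group_lasso_reduced_set(Omega, A, lam=0.1, lr=None, n_iter=500):
    """Approximate dual coefficients A (n x k) with a row-sparse B by
    proximal gradient on  0.5 * ||Omega @ (B - A)||_F^2 + lam * sum_i ||B_i||_2.

    Each row of B groups one training point's coefficients across the k
    score variables; rows shrunk to exactly zero drop that point from the
    kernel expansion, leaving the 'reduced set'."""
    n, k = A.shape
    B = A.copy()
    if lr is None:
        # step size from a Lipschitz bound on the quadratic fit term
        lr = 1.0 / (np.linalg.norm(Omega, 2) ** 2)
    T = Omega @ A
    for _ in range(n_iter):
        grad = Omega.T @ (Omega @ B - T)   # gradient of the fit term
        Z = B - lr * grad                  # gradient step
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        # row-wise group soft-threshold (proximal operator of the penalty)
        shrink = np.maximum(0.0, 1.0 - lr * lam / np.maximum(norms, 1e-12))
        B = shrink * Z
    support = np.where(np.linalg.norm(B, axis=1) > 1e-8)[0]
    return B, support
```

With `lam = 0` no row is shrunk and the full expansion is recovered; as `lam` grows, whole rows vanish and only the surviving support points are needed at out-of-sample evaluation time, which is what reduces the cost of the extension.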

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2436-2443
Number of pages: 8
ISBN (Electronic): 9781479914845
DOIs: 10.1109/IJCNN.2014.6889474
Publication status: Published - 1 Jan 2014
Externally published: Yes
Event: 2014 International Joint Conference on Neural Networks, IJCNN 2014 - Beijing, China
Duration: 6 Jul 2014 - 11 Jul 2014

Other

Other: 2014 International Joint Conference on Neural Networks, IJCNN 2014
Country: China
City: Beijing
Period: 6/7/14 - 11/7/14

Fingerprint

Image segmentation
Principal component analysis

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Mall, R., Mehrkanoon, S., Langone, R., & Suykens, J. A. K. (2014). Optimal reduced sets for sparse kernel spectral clustering. In Proceedings of the International Joint Conference on Neural Networks (pp. 2436-2443). [6889474] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2014.6889474

Optimal reduced sets for sparse kernel spectral clustering. / Mall, Raghvendra; Mehrkanoon, Siamak; Langone, Rocco; Suykens, Johan A.K.

Proceedings of the International Joint Conference on Neural Networks. Institute of Electrical and Electronics Engineers Inc., 2014. p. 2436-2443 6889474.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Mall, R, Mehrkanoon, S, Langone, R & Suykens, JAK 2014, Optimal reduced sets for sparse kernel spectral clustering. in Proceedings of the International Joint Conference on Neural Networks., 6889474, Institute of Electrical and Electronics Engineers Inc., pp. 2436-2443, 2014 International Joint Conference on Neural Networks, IJCNN 2014, Beijing, China, 6/7/14. https://doi.org/10.1109/IJCNN.2014.6889474
Mall R, Mehrkanoon S, Langone R, Suykens JAK. Optimal reduced sets for sparse kernel spectral clustering. In Proceedings of the International Joint Conference on Neural Networks. Institute of Electrical and Electronics Engineers Inc. 2014. p. 2436-2443. 6889474 https://doi.org/10.1109/IJCNN.2014.6889474
Mall, Raghvendra ; Mehrkanoon, Siamak ; Langone, Rocco ; Suykens, Johan A.K. / Optimal reduced sets for sparse kernel spectral clustering. Proceedings of the International Joint Conference on Neural Networks. Institute of Electrical and Electronics Engineers Inc., 2014. pp. 2436-2443
@inproceedings{bc5d349f9fc6498fa00356a76b986d32,
title = "Optimal reduced sets for sparse kernel spectral clustering",
abstract = "Kernel spectral clustering (KSC) solves a weighted kernel principal component analysis problem in a primal-dual optimization framework and yields a clustering model from the dual solution of the problem. It has a powerful out-of-sample extension property leading to good clustering generalization on unseen data points. This out-of-sample extension property allows one to build a sparse model on a small training set and introduces the first level of sparsity. The clustering dual model, however, is expressed in terms of non-sparse kernel expansions to which every point in the training set contributes. The goal is to find a reduced set of training points that best approximates the original solution. In this paper a second level of sparsity is introduced in order to reduce the time complexity of the computationally expensive out-of-sample extension. We investigate various penalty-based reduced-set techniques, including Group Lasso, L0, and L1 + L0 penalization, and compare the amount of sparsity gained with respect to a previous L1 penalization technique. We observe that in the majority of cases the optimal results in terms of sparsity correspond to the Group Lasso penalization technique. We showcase the effectiveness of the proposed approaches on several real-world datasets and an image segmentation dataset.",
author = "Raghvendra Mall and Siamak Mehrkanoon and Rocco Langone and Suykens, {Johan A.K.}",
year = "2014",
month = "1",
day = "1",
doi = "10.1109/IJCNN.2014.6889474",
language = "English",
pages = "2436--2443",
booktitle = "Proceedings of the International Joint Conference on Neural Networks",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - GEN

T1 - Optimal reduced sets for sparse kernel spectral clustering

AU - Mall, Raghvendra

AU - Mehrkanoon, Siamak

AU - Langone, Rocco

AU - Suykens, Johan A.K.

PY - 2014/1/1

Y1 - 2014/1/1

N2 - Kernel spectral clustering (KSC) solves a weighted kernel principal component analysis problem in a primal-dual optimization framework and yields a clustering model from the dual solution of the problem. It has a powerful out-of-sample extension property leading to good clustering generalization on unseen data points. This out-of-sample extension property allows one to build a sparse model on a small training set and introduces the first level of sparsity. The clustering dual model, however, is expressed in terms of non-sparse kernel expansions to which every point in the training set contributes. The goal is to find a reduced set of training points that best approximates the original solution. In this paper a second level of sparsity is introduced in order to reduce the time complexity of the computationally expensive out-of-sample extension. We investigate various penalty-based reduced-set techniques, including Group Lasso, L0, and L1 + L0 penalization, and compare the amount of sparsity gained with respect to a previous L1 penalization technique. We observe that in the majority of cases the optimal results in terms of sparsity correspond to the Group Lasso penalization technique. We showcase the effectiveness of the proposed approaches on several real-world datasets and an image segmentation dataset.

AB - Kernel spectral clustering (KSC) solves a weighted kernel principal component analysis problem in a primal-dual optimization framework and yields a clustering model from the dual solution of the problem. It has a powerful out-of-sample extension property leading to good clustering generalization on unseen data points. This out-of-sample extension property allows one to build a sparse model on a small training set and introduces the first level of sparsity. The clustering dual model, however, is expressed in terms of non-sparse kernel expansions to which every point in the training set contributes. The goal is to find a reduced set of training points that best approximates the original solution. In this paper a second level of sparsity is introduced in order to reduce the time complexity of the computationally expensive out-of-sample extension. We investigate various penalty-based reduced-set techniques, including Group Lasso, L0, and L1 + L0 penalization, and compare the amount of sparsity gained with respect to a previous L1 penalization technique. We observe that in the majority of cases the optimal results in terms of sparsity correspond to the Group Lasso penalization technique. We showcase the effectiveness of the proposed approaches on several real-world datasets and an image segmentation dataset.

UR - http://www.scopus.com/inward/record.url?scp=84908474270&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84908474270&partnerID=8YFLogxK

U2 - 10.1109/IJCNN.2014.6889474

DO - 10.1109/IJCNN.2014.6889474

M3 - Conference contribution

AN - SCOPUS:84908474270

SP - 2436

EP - 2443

BT - Proceedings of the International Joint Conference on Neural Networks

PB - Institute of Electrical and Electronics Engineers Inc.

ER -