Concerning the differentiability of the energy function in vector quantization algorithms

Dominique Lepetz, Max Némoz-Gaillard, Michael Aupetit

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

The adaptation rule of Vector Quantization algorithms, and consequently the convergence of the generated sequence, depends on the existence and properties of a function called the energy function, defined on a topological manifold. Our aim is to investigate the conditions of existence of such a function for a class of algorithms including the well-known 'K-means' and 'Self-Organizing Map' algorithms. The results presented here extend several previous studies and show that the energy function is not always a potential but at least the uniform limit of a series of potential functions which we call a pseudo-potential. It also shows that a large number of existing vector quantization algorithms developed by the Artificial Neural Networks community fall into this class. The framework we define opens the way to studying the convergence of all the corresponding adaptation rules at once, and a theorem gives promising insights in that direction.
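The abstract's central objects can be illustrated with the simplest member of the class it studies, K-means: there the energy function is the mean squared distance from each sample to its nearest prototype, and the online adaptation rule is a stochastic gradient step on that energy. A minimal sketch of this well-known special case (illustrative only; function names and parameters are not from the paper):

```python
import numpy as np

def kmeans_energy(X, W):
    """K-means energy (distortion): mean squared distance from each
    sample in X to its nearest prototype in W. In this special case
    the energy function exists as an ordinary potential."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)  # (n, k)
    return d2.min(axis=1).mean()

def online_update(x, W, lr=0.1):
    """One step of the online K-means adaptation rule: move the
    winning (nearest) prototype toward the sample x. This step is a
    stochastic gradient descent step on the energy above."""
    winner = np.argmin(((W - x) ** 2).sum(axis=1))
    W = W.copy()
    W[winner] += lr * (x - W[winner])
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))   # toy data
W = rng.normal(size=(3, 2))     # 3 random initial prototypes
e_before = kmeans_energy(X, W)
for x in X:                     # one online pass over the data
    W = online_update(x, W)
e_after = kmeans_energy(X, W)
```

For the Self-Organizing Map and Neural-gas, the winner-take-all assignment above is replaced by a neighborhood-weighted update, and the paper's point is that the corresponding energy need not be a potential in this ordinary sense, but only a pseudo-potential (a uniform limit of potentials).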

Original language: English
Pages (from-to): 621-630
Number of pages: 10
Journal: Neural Networks
Volume: 20
Issue number: 5
DOI: 10.1016/j.neunet.2006.11.006
Publication status: Published - Jul 2007
Externally published: Yes

Keywords

  • Energy function
  • K-means
  • Neural-gas
  • Potential function
  • Pseudo-potential
  • Self-organizing maps
  • Vector quantization

ASJC Scopus subject areas

  • Artificial Intelligence
  • Neuroscience (all)

Cite this

Concerning the differentiability of the energy function in vector quantization algorithms. / Lepetz, Dominique; Némoz-Gaillard, Max; Aupetit, Michael.

In: Neural Networks, Vol. 20, No. 5, 07.2007, p. 621-630.


Lepetz, Dominique ; Némoz-Gaillard, Max ; Aupetit, Michael. / Concerning the differentiability of the energy function in vector quantization algorithms. In: Neural Networks. 2007 ; Vol. 20, No. 5. pp. 621-630.
@article{37ecd0ec8518479da3f3cf358c88ae7e,
title = "Concerning the differentiability of the energy function in vector quantization algorithms",
abstract = "The adaptation rule of Vector Quantization algorithms, and consequently the convergence of the generated sequence, depends on the existence and properties of a function called the energy function, defined on a topological manifold. Our aim is to investigate the conditions of existence of such a function for a class of algorithms including the well-known 'K-means' and 'Self-Organizing Map' algorithms. The results presented here extend several previous studies and show that the energy function is not always a potential but at least the uniform limit of a series of potential functions which we call a pseudo-potential. It also shows that a large number of existing vector quantization algorithms developed by the Artificial Neural Networks community fall into this class. The framework we define opens the way to studying the convergence of all the corresponding adaptation rules at once, and a theorem gives promising insights in that direction.",
keywords = "Energy function, K-means, Neural-gas, Potential function, Pseudo-potential, Self-organizing maps, Vector quantization",
author = "Dominique Lepetz and Max N{\'e}moz-Gaillard and Michael Aupetit",
year = "2007",
month = "7",
doi = "10.1016/j.neunet.2006.11.006",
language = "English",
volume = "20",
pages = "621--630",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier Limited",
number = "5",

}

TY - JOUR

T1 - Concerning the differentiability of the energy function in vector quantization algorithms

AU - Lepetz, Dominique

AU - Némoz-Gaillard, Max

AU - Aupetit, Michael

PY - 2007/7

Y1 - 2007/7

N2 - The adaptation rule of Vector Quantization algorithms, and consequently the convergence of the generated sequence, depends on the existence and properties of a function called the energy function, defined on a topological manifold. Our aim is to investigate the conditions of existence of such a function for a class of algorithms including the well-known 'K-means' and 'Self-Organizing Map' algorithms. The results presented here extend several previous studies and show that the energy function is not always a potential but at least the uniform limit of a series of potential functions which we call a pseudo-potential. It also shows that a large number of existing vector quantization algorithms developed by the Artificial Neural Networks community fall into this class. The framework we define opens the way to studying the convergence of all the corresponding adaptation rules at once, and a theorem gives promising insights in that direction.

AB - The adaptation rule of Vector Quantization algorithms, and consequently the convergence of the generated sequence, depends on the existence and properties of a function called the energy function, defined on a topological manifold. Our aim is to investigate the conditions of existence of such a function for a class of algorithms including the well-known 'K-means' and 'Self-Organizing Map' algorithms. The results presented here extend several previous studies and show that the energy function is not always a potential but at least the uniform limit of a series of potential functions which we call a pseudo-potential. It also shows that a large number of existing vector quantization algorithms developed by the Artificial Neural Networks community fall into this class. The framework we define opens the way to studying the convergence of all the corresponding adaptation rules at once, and a theorem gives promising insights in that direction.

KW - Energy function

KW - K-means

KW - Neural-gas

KW - Potential function

KW - Pseudo-potential

KW - Self-organizing maps

KW - Vector quantization

UR - http://www.scopus.com/inward/record.url?scp=34447638892&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34447638892&partnerID=8YFLogxK

U2 - 10.1016/j.neunet.2006.11.006

DO - 10.1016/j.neunet.2006.11.006

M3 - Article

VL - 20

SP - 621

EP - 630

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

IS - 5

ER -