Regularized Gaussian Discriminant Analysis through Eigenvalue Decomposition

Halima Bensmail, Gilles Celeux

Research output: Contribution to journal › Article

112 Citations (Scopus)

Abstract

Friedman proposed a regularization technique (RDA) for discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article we propose an alternative approach, called EDDA, that is based on the reparameterization of the covariance matrix Σk of a group Gk in terms of its eigenvalue decomposition Σk = λkDkAkDk′, where λk specifies the volume of the density contours of Gk, the diagonal matrix of eigenvalues Ak specifies its shape, and the eigenvectors Dk specify its orientation. Variations on the constraints concerning the volumes, shapes, and orientations λk, Ak, and Dk lead to 14 discrimination models of interest. For each model, we derived the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of future misclassification risk by cross-validation. Numerical experiments on simulated and real data show favorable behavior of this approach compared to RDA.
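The decomposition described in the abstract can be illustrated numerically. The sketch below (an assumption for illustration, not code from the paper; the function name `decompose_covariance` is hypothetical) splits a covariance matrix into its volume λ = |Σ|^(1/p), a diagonal shape matrix A with det(A) = 1, and an orthogonal orientation matrix D of eigenvectors:

```python
import numpy as np

def decompose_covariance(sigma):
    """Split a covariance matrix Sigma into (lambda, A, D) with
    Sigma = lambda * D @ A @ D.T, following the lambda*D*A*D'
    parameterization described in the abstract.

    lambda = |Sigma|^(1/p) is the volume, A is diagonal with
    det(A) = 1 (shape), and D is orthogonal (orientation).
    """
    eigvals, eigvecs = np.linalg.eigh(sigma)
    # Order eigenvalues (and their eigenvectors) in decreasing order.
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    p = sigma.shape[0]
    lam = np.prod(eigvals) ** (1.0 / p)  # volume: |Sigma|^(1/p)
    A = np.diag(eigvals / lam)           # shape: normalized so det(A) = 1
    D = eigvecs                          # orientation: eigenvector matrix
    return lam, A, D

# Example: a 2x2 covariance matrix.
sigma = np.array([[4.0, 1.0],
                  [1.0, 2.0]])
lam, A, D = decompose_covariance(sigma)
# lambda * D @ A @ D.T reconstructs Sigma exactly.
assert np.allclose(lam * D @ A @ D.T, sigma)
```

Constraining λk, Ak, or Dk to be equal across groups (or A to the identity, etc.) yields the family of 14 models the abstract mentions; the decomposition above is the common building block.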

Original language: English
Pages (from-to): 1743-1748
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 91
Issue number: 436
DOIs
Publication status: Published - 1 Dec 1996

Keywords

  • Covariance matrix
  • Maximum likelihood
  • Normal-based classification
  • Spectral decomposition

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

