Regularized Gaussian discriminant analysis through eigenvalue decomposition

Halima Bensmail, Gilles Celeux

Research output: Contribution to journal › Article

112 Citations (Scopus)

Abstract

Friedman proposed a regularization technique (RDA) for discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article, we propose an alternative approach, called EDDA, based on the reparameterization of the covariance matrix Σk of a group Gk in terms of its eigenvalue decomposition Σk = λkDkAkD′k, where λk specifies the volume of the density contours of Gk, the diagonal matrix of eigenvalues Ak specifies its shape, and the matrix of eigenvectors Dk specifies its orientation. Variations on the constraints concerning the volumes λk, shapes Ak, and orientations Dk lead to 14 discrimination models of interest. For each model, we derive the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of future misclassification risk by cross-validation. Numerical experiments on simulated and real data show favorable behavior of this approach compared to RDA.
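The reparameterization described in the abstract splits each group covariance into a scalar volume, a unit-determinant diagonal shape matrix, and an orthogonal orientation matrix. The following NumPy sketch illustrates that decomposition for a single covariance matrix; the function name and the example matrix are illustrative assumptions, not code from the paper.

```python
import numpy as np

def decompose_covariance(sigma):
    """Split a covariance matrix Sigma into (volume, shape, orientation)
    so that Sigma = lam * D @ A @ D.T, with det(A) = 1 and D orthogonal.
    Illustrative sketch of the lambda*D*A*D' parameterization."""
    p = sigma.shape[0]
    eigvals, eigvecs = np.linalg.eigh(sigma)      # eigvals in ascending order
    order = np.argsort(eigvals)[::-1]             # sort eigenvalues descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    lam = np.prod(eigvals) ** (1.0 / p)           # volume: |Sigma|^(1/p)
    A = np.diag(eigvals / lam)                    # shape: diagonal, det(A) = 1
    D = eigvecs                                   # orientation: eigenvectors
    return lam, A, D

# Hypothetical 2x2 covariance matrix for demonstration
sigma = np.array([[4.0, 1.0],
                  [1.0, 2.0]])
lam, A, D = decompose_covariance(sigma)
recon = lam * D @ A @ D.T                         # reconstructs sigma
```

Constraining λk, Ak, or Dk to be common across groups (or fixing Ak to the identity, for example) yields the family of models the abstract refers to.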

Original language: English
Pages (from-to): 1743-1748
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 91
Issue number: 436
Publication status: Published - 1 Dec 1996
Externally published: Yes

Keywords

  • Covariance matrix
  • Maximum likelihood
  • Normal-based classification
  • Spectral decomposition

ASJC Scopus subject areas

  • Mathematics(all)
  • Statistics and Probability