Regularized Gaussian discriminant analysis through eigenvalue decomposition

Halima Bensmail, Gilles Celeux

Research output: Contribution to journal › Article

111 Citations (Scopus)

Abstract

Friedman proposed a regularization technique (RDA) for discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article we propose an alternative approach, called EDDA, based on the reparameterization of the covariance matrix Σ_k of a group G_k in terms of its eigenvalue decomposition Σ_k = λ_k D_k A_k D_k′, where λ_k specifies the volume of the density contours of G_k, the diagonal matrix A_k of eigenvalues specifies its shape, and the matrix D_k of eigenvectors specifies its orientation. Variations on the constraints concerning the volumes λ_k, the shapes A_k, and the orientations D_k lead to 14 discrimination models of interest. For each model, we derive the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of the future misclassification risk by cross-validation. Numerical experiments on simulated and real data show the favorable behavior of this approach compared to RDA.
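The reparameterization at the heart of EDDA is easy to make concrete. The following minimal NumPy sketch (our own illustration, not the authors' code; the function name `volume_shape_orientation` is made up here) factors a sample covariance matrix into the volume λ, the shape A (diagonal, with det A = 1), and the orientation D, and checks that the factors reconstruct Σ:

```python
import numpy as np

def volume_shape_orientation(sigma):
    """Factor a covariance matrix as sigma = lam * D @ A @ D.T, where lam
    (volume) is a scalar, A (shape) is diagonal with det(A) = 1, and D
    (orientation) is the orthogonal matrix of eigenvectors."""
    eigvals, D = np.linalg.eigh(sigma)      # eigendecomposition of a symmetric matrix
    eigvals, D = eigvals[::-1], D[:, ::-1]  # reorder eigenpairs, largest eigenvalue first
    p = sigma.shape[0]
    lam = np.prod(eigvals) ** (1.0 / p)     # volume: det(sigma)^(1/p)
    A = np.diag(eigvals / lam)              # shape: scaled so that det(A) = 1
    return lam, A, D

# Round-trip check on the sample covariance of arbitrary data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
sigma = np.cov(X, rowvar=False)
lam, A, D = volume_shape_orientation(sigma)
assert np.allclose(lam * D @ A @ D.T, sigma)
assert np.isclose(np.linalg.det(A), 1.0)
```

Constraining some of the three components to be equal across groups (a common volume, a common shape, a common orientation, or combinations thereof) is what generates the family of 14 models. The model-selection step can be sketched in the same spirit: the 14 EDDA models are not implemented in scikit-learn, so LDA (one covariance matrix common to all groups) and QDA (each Σ_k unrestricted) below merely stand in for two extreme members of the family, with the candidate attaining the smallest cross-validated misclassification rate retained:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# Stand-ins for two of the 14 models: LDA assumes one common covariance
# matrix; QDA leaves each group's covariance matrix unrestricted.
candidates = {
    "common covariance (LDA)": LinearDiscriminantAnalysis(),
    "unrestricted covariances (QDA)": QuadraticDiscriminantAnalysis(),
}

# Select the model minimizing the 10-fold cross-validated misclassification
# rate (equivalently, maximizing cross-validated accuracy).
scores = {name: cross_val_score(model, X, y, cv=10).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(f"selected: {best} (CV accuracy {scores[best]:.3f})")
```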

Original language: English
Pages (from-to): 1743-1748
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 91
Issue number: 436
Publication status: Published - 1 Dec 1996
Externally published: Yes

Keywords

  • Covariance matrix
  • Maximum likelihood
  • Normal-based classification
  • Spectral decomposition

ASJC Scopus subject areas

  • Mathematics (all)
  • Statistics and Probability

Cite this

Regularized Gaussian discriminant analysis through eigenvalue decomposition. / Bensmail, Halima; Celeux, Gilles.

In: Journal of the American Statistical Association, Vol. 91, No. 436, 01.12.1996, p. 1743-1748.

@article{77e47dbc853c4452b44872f022c70a3d,
title = "Regularized Gaussian discriminant analysis through eigenvalue decomposition",
abstract = "Friedman proposed a regularization technique (RDA) for discriminant analysis in the Gaussian framework. RDA uses two regularization parameters to design an intermediate classifier between the linear, the quadratic, and the nearest-means classifiers. In this article we propose an alternative approach, called EDDA, based on the reparameterization of the covariance matrix Σ_k of a group G_k in terms of its eigenvalue decomposition Σ_k = λ_k D_k A_k D_k′, where λ_k specifies the volume of the density contours of G_k, the diagonal matrix A_k of eigenvalues specifies its shape, and the matrix D_k of eigenvectors specifies its orientation. Variations on the constraints concerning the volumes λ_k, the shapes A_k, and the orientations D_k lead to 14 discrimination models of interest. For each model, we derive the normal theory maximum likelihood parameter estimates. Our approach consists of selecting a model by minimizing the sample-based estimate of the future misclassification risk by cross-validation. Numerical experiments on simulated and real data show the favorable behavior of this approach compared to RDA.",
keywords = "Covariance matrix, Maximum likelihood, Normal-based classification, Spectral decomposition",
author = "Halima Bensmail and Gilles Celeux",
year = "1996",
month = "12",
day = "1",
language = "English",
volume = "91",
pages = "1743--1748",
journal = "Journal of the American Statistical Association",
issn = "0162-1459",
publisher = "Taylor and Francis Ltd.",
number = "436",

}
