Regularization in regression: Comparing Bayesian and frequentist methods in a poorly informative situation

Gilles Celeux, Mohammed El Anbari, Jean Michel Marin, Christian P. Robert

Research output: Contribution to journal › Article


Abstract

Using a collection of simulated and real benchmarks, we compare Bayesian and frequentist regularization approaches in a poorly informative setting, where the number of variables is almost equal to the number of observations. The comparison includes new global noninformative approaches to Bayesian variable selection built on Zellner's g-priors, similar to those of Liang et al. (2008), and we discuss the interest of these calibration-free proposals. The numerical experiments highlight the appeal of Bayesian regularization methods over non-Bayesian alternatives: they dominate frequentist methods in the sense that they provide smaller prediction errors while selecting the most relevant variables in a parsimonious way.
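For reference, a minimal sketch of the prior structure the abstract alludes to: Zellner's g-prior for the Gaussian linear model y = Xβ + ε, ε ~ N(0, σ²I), together with the hyper-g prior on g of Liang et al. (2008). The zero prior mean, the dimension p, and the hyperparameter a are standard conventions assumed here; the calibration-free proposals studied in the paper are similar in spirit but need not take exactly this form.

\[
\beta \mid \sigma^2, g \;\sim\; \mathcal{N}_p\!\bigl(0,\; g\,\sigma^2 (X^\top X)^{-1}\bigr),
\qquad \pi(\sigma^2) \propto \sigma^{-2},
\]
\[
\pi(g) \;=\; \frac{a-2}{2}\,(1+g)^{-a/2}, \qquad g>0,\; a>2.
\]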

Original language: English
Pages (from-to): 477-502
Number of pages: 26
Journal: Bayesian Analysis
Volume: 7
Issue number: 2
DOIs:
Publication status: Published - 2012
Externally published: Yes


Keywords

  • Calibration
  • Dantzig selector
  • Elastic net
  • Lasso
  • Model choice
  • Noninformative priors
  • Regularization methods
  • Zellner's g-prior

ASJC Scopus subject areas

  • Statistics and Probability
  • Applied Mathematics
