Regularization in regression: Comparing Bayesian and frequentist methods in a poorly informative situation

Gilles Celeux, Mohammed El Anbari, Jean Michel Marin, Christian P. Robert

Research output: Contribution to journal › Article

17 Citations (Scopus)

Abstract

Using a collection of simulated and real benchmarks, we compare Bayesian and frequentist regularization approaches under a low informative constraint, namely when the number of variables is almost equal to the number of observations. This comparison includes new global noninformative approaches for Bayesian variable selection built on Zellner's g-priors, similar to those of Liang et al. (2008), and the interest of these calibration-free proposals is discussed. The numerical experiments we present highlight the appeal of Bayesian regularization methods compared with non-Bayesian alternatives: they dominate the frequentist methods in the sense that they provide smaller prediction errors while selecting the most relevant variables in a parsimonious way.
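
For context, the display below is a minimal sketch, not drawn from the article itself, of the two families of procedures named in the keywords: frequentist penalized criteria (lasso, elastic net, Dantzig selector) and Bayesian variable selection under Zellner's g-prior, all in the standard Gaussian linear model. The tuning constants λ, λ₁, λ₂, g and a are generic calibration quantities introduced here only for illustration.

% Sketch in LaTeX; standard Gaussian linear model assumed:
%   y = X\beta + \varepsilon, \quad \varepsilon \sim N(0, \sigma^2 I_n), \quad X \in \mathbb{R}^{n \times p}, \; p \approx n.

% Frequentist regularization: penalized least-squares criteria.
\hat{\beta}^{\mathrm{lasso}}   = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_1
\hat{\beta}^{\mathrm{enet}}    = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda_1 \|\beta\|_1 + \lambda_2 \|\beta\|_2^2
\hat{\beta}^{\mathrm{Dantzig}} = \arg\min_{\beta} \; \|\beta\|_1 \quad \text{s.t.} \quad \|X^{\top}(y - X\beta)\|_{\infty} \le \lambda

% Bayesian variable selection: for a submodel \gamma with design matrix X_\gamma,
% Zellner's g-prior on the regression coefficients is
\beta_\gamma \mid \sigma^2, g \;\sim\; N\bigl(0,\; g\,\sigma^2 (X_\gamma^{\top} X_\gamma)^{-1}\bigr),
\qquad \pi(\sigma^2) \propto \sigma^{-2};
% calibration-free variants replace a fixed g by a hyperprior, e.g. the hyper-g prior
% of Liang et al. (2008): \pi(g) = \tfrac{a-2}{2}\,(1+g)^{-a/2}, \; g > 0, \; a > 2.

For a fixed g, the marginal likelihood of each submodel under the g-prior is available in closed form, which is what makes posterior model probabilities, and hence variable selection, directly computable without tuning a penalty level by cross-validation.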

Original language: English
Pages (from-to): 477-502
Number of pages: 26
Journal: Bayesian Analysis
Volume: 7
Issue number: 2
DOIs: 10.1214/12-BA716
Publication status: Published - 2012
Externally published: Yes

Keywords

  • Calibration
  • Dantzig selector
  • Elastic net
  • Lasso
  • Model choice
  • Noninformative priors
  • Regularization methods
  • Zellner's g-prior

ASJC Scopus subject areas

  • Statistics and Probability
  • Applied Mathematics

Cite this

Regularization in regression: Comparing Bayesian and frequentist methods in a poorly informative situation. / Celeux, Gilles; El Anbari, Mohammed; Marin, Jean Michel; Robert, Christian P.

In: Bayesian Analysis, Vol. 7, No. 2, 2012, pp. 477-502.

@article{8b32c1db0ff245c8880cb54f318ad876,
  title = "Regularization in regression: Comparing Bayesian and frequentist methods in a poorly informative situation",
  abstract = "Using a collection of simulated and real benchmarks, we compare Bayesian and frequentist regularization approaches under a low informative constraint when the number of variables is almost equal to the number of observations on simulated and real datasets. This comparison includes new global noninformative approaches for Bayesian variable selection built on Zellner's g-priors that are similar to Liang et al. (2008). The interest of those calibration-free proposals is discussed. The numerical experiments we present highlight the appeal of Bayesian regularization methods, when compared with non-Bayesian alternatives. They dominate frequentist methods in the sense that they provide smaller prediction errors while selecting the most relevant variables in a parsimonious way.",
  keywords = "Calibration, Dantzig selector, Elastic net, Lasso, Model choice, Noninformative priors, Regularization methods, Zellner's g-prior",
  author = "Gilles Celeux and {El Anbari}, Mohammed and Marin, {Jean Michel} and Robert, {Christian P.}",
  year = "2012",
  doi = "10.1214/12-BA716",
  language = "English",
  volume = "7",
  pages = "477--502",
  journal = "Bayesian Analysis",
  issn = "1936-0975",
  publisher = "Carnegie Mellon University",
  number = "2",
}
