The adaptive Gril estimator with a diverging number of parameters

Mohammed El Anbari, Abdallah Mkhadri

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

We consider the problem of variable selection and estimation in linear regression models in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation, combining the strengths of quadratic regularization and adaptively weighted Lasso shrinkage. In this article, we highlight the grouped selection property of the AdaCnet method (one type of AdaGril) in the equal correlation case. Under weak conditions, we establish the oracle property of AdaGril, which ensures optimal large-sample performance when the dimension is high. AdaGril thus achieves both goals: it handles the problem of collinearity in high dimensions and enjoys the oracle property. Moreover, we show that the AdaGril estimator achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the "true" regression coefficient. This bound is obtained under a weak Restricted Eigenvalue (RE) condition similar to the one used for the Lasso. Simulation studies show that some particular cases of AdaGril outperform their competitors.
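The abstract describes AdaGril as combining a quadratic (generalized ridge) penalty with adaptively weighted Lasso shrinkage. The paper itself gives the exact criterion and weight construction; the sketch below only illustrates that general recipe in its simplest special case, an identity quadratic-penalty matrix (i.e., an adaptive Elastic Net), with an assumed ridge pilot fit for the weights. The function names (`ada_enet`, `soft_threshold`) and tuning choices are hypothetical, not the authors' implementation.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ada_enet(X, y, lam1, lam2, gamma=1.0, n_iter=200):
    """Coordinate descent for an assumed adaptive Elastic-Net criterion
        (1/2n) ||y - X b||^2 + (lam2/2) ||b||^2 + lam1 * sum_j w_j |b_j|,
    with adaptive weights w_j = 1 / |b_pilot_j|^gamma from a ridge pilot fit.
    """
    n, p = X.shape
    # Ridge pilot estimate, used only to build the adaptive weights.
    b_pilot = np.linalg.solve(X.T @ X + lam2 * np.eye(p), X.T @ y)
    w = 1.0 / (np.abs(b_pilot) ** gamma + 1e-8)

    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n    # (1/n) ||x_j||^2 per column
    r = y - X @ b                          # current residual
    for _ in range(n_iter):
        for j in range(p):
            r = r + X[:, j] * b[j]         # residual with coordinate j removed
            rho = X[:, j] @ r / n          # (1/n) x_j' r_{-j}
            # Weighted soft-thresholding step; lam2 enters the denominator
            # because the ridge term is smooth in b_j.
            b[j] = soft_threshold(rho, lam1 * w[j]) / (col_norm[j] + lam2)
            r = r - X[:, j] * b[j]         # restore the full residual
    return b

if __name__ == "__main__":
    # Small synthetic check: a sparse coefficient vector with noise.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    beta = np.zeros(20)
    beta[:3] = [3.0, -2.0, 1.5]
    y = X @ beta + 0.5 * rng.standard_normal(100)
    print(ada_enet(X, y, lam1=0.1, lam2=0.1).round(2))
```

The adaptive weights are what drive the oracle property discussed in the abstract: large pilot coefficients receive small weights and are barely shrunk, while coefficients near zero are penalized heavily and set exactly to zero. The full AdaGril estimator would replace the identity matrix in the quadratic term with a general positive semidefinite matrix (e.g., a correlation-based one, as in AdaCnet).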

Original language: English
Pages (from-to): 2634-2660
Number of pages: 27
Journal: Communications in Statistics - Theory and Methods
Volume: 42
Issue number: 14
DOIs: 10.1080/03610926.2011.615438
Publication status: Published - 18 Jul 2013
Externally published: Yes

Fingerprint

  • Adaptive Estimator
  • Lasso
  • Oracle Property
  • Selection of Variables
  • Collinearity
  • Ridge
  • Regression Coefficient
  • Shrinkage
  • Linear Regression Model
  • Diverge
  • Sparsity
  • Model Selection
  • Higher Dimensions
  • Redundancy
  • Regularization
  • Sample Size
  • Simulation Study
  • Eigenvalue
  • Estimator

Keywords

  • Adaptive regularization
  • High dimension
  • Oracle property
  • Sparsity inequality
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

The adaptive Gril estimator with a diverging number of parameters. / El Anbari, Mohammed; Mkhadri, Abdallah.

In: Communications in Statistics - Theory and Methods, Vol. 42, No. 14, 18.07.2013, p. 2634-2660.

Research output: Contribution to journal › Article

@article{ccc7f214f13447589e11c07d0e37faea,
title = "The adaptive Gril estimator with a diverging number of parameters",
abstract = "We consider the problem of variable selection and estimation in linear regression models in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation, combining the strengths of quadratic regularization and adaptively weighted Lasso shrinkage. In this article, we highlight the grouped selection property of the AdaCnet method (one type of AdaGril) in the equal correlation case. Under weak conditions, we establish the oracle property of AdaGril, which ensures optimal large-sample performance when the dimension is high. AdaGril thus achieves both goals: it handles the problem of collinearity in high dimensions and enjoys the oracle property. Moreover, we show that the AdaGril estimator achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the {"}true{"} regression coefficient. This bound is obtained under a weak Restricted Eigenvalue (RE) condition similar to the one used for the Lasso. Simulation studies show that some particular cases of AdaGril outperform their competitors.",
keywords = "Adaptive regularization, High dimension, Oracle property, Sparsity inequality, Variable selection",
author = "{El Anbari}, Mohammed and Abdallah Mkhadri",
year = "2013",
month = "7",
day = "18",
doi = "10.1080/03610926.2011.615438",
language = "English",
volume = "42",
pages = "2634--2660",
journal = "Communications in Statistics - Theory and Methods",
issn = "0361-0926",
publisher = "Taylor and Francis Ltd.",
number = "14",
}

TY - JOUR

T1 - The adaptive Gril estimator with a diverging number of parameters

AU - El Anbari, Mohammed

AU - Mkhadri, Abdallah

PY - 2013/7/18

Y1 - 2013/7/18

N2 - We consider the problem of variable selection and estimation in linear regression models in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation, combining the strengths of quadratic regularization and adaptively weighted Lasso shrinkage. In this article, we highlight the grouped selection property of the AdaCnet method (one type of AdaGril) in the equal correlation case. Under weak conditions, we establish the oracle property of AdaGril, which ensures optimal large-sample performance when the dimension is high. AdaGril thus achieves both goals: it handles the problem of collinearity in high dimensions and enjoys the oracle property. Moreover, we show that the AdaGril estimator achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the "true" regression coefficient. This bound is obtained under a weak Restricted Eigenvalue (RE) condition similar to the one used for the Lasso. Simulation studies show that some particular cases of AdaGril outperform their competitors.

AB - We consider the problem of variable selection and estimation in linear regression models in situations where the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation, combining the strengths of quadratic regularization and adaptively weighted Lasso shrinkage. In this article, we highlight the grouped selection property of the AdaCnet method (one type of AdaGril) in the equal correlation case. Under weak conditions, we establish the oracle property of AdaGril, which ensures optimal large-sample performance when the dimension is high. AdaGril thus achieves both goals: it handles the problem of collinearity in high dimensions and enjoys the oracle property. Moreover, we show that the AdaGril estimator achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the "true" regression coefficient. This bound is obtained under a weak Restricted Eigenvalue (RE) condition similar to the one used for the Lasso. Simulation studies show that some particular cases of AdaGril outperform their competitors.

KW - Adaptive regularization

KW - High dimension

KW - Oracle property

KW - Sparsity inequality

KW - Variable selection

UR - http://www.scopus.com/inward/record.url?scp=84886067839&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84886067839&partnerID=8YFLogxK

U2 - 10.1080/03610926.2011.615438

DO - 10.1080/03610926.2011.615438

M3 - Article

VL - 42

SP - 2634

EP - 2660

JO - Communications in Statistics - Theory and Methods

JF - Communications in Statistics - Theory and Methods

SN - 0361-0926

IS - 14

ER -