The adaptive Gril estimator with a diverging number of parameters

Mohammed El Anbari, Abdallah Mkhadri

Research output: Contribution to journal › Article

2 Citations (Scopus)


We consider the problem of variable selection and estimation in the linear regression model when the number of parameters diverges with the sample size. We propose the adaptive Generalized Ridge-Lasso (AdaGril), an extension of the adaptive Elastic Net. AdaGril incorporates information redundancy among correlated variables for model selection and estimation, combining the strengths of quadratic regularization and adaptively weighted Lasso shrinkage. In this article, we highlight the grouped selection property of the AdaCnet method (one type of AdaGril) in the equal correlation case. Under weak conditions, we establish the oracle property of AdaGril, which ensures optimal large-sample performance when the dimension is high. AdaGril thus achieves both goals: it handles collinearity in high dimensions and enjoys the oracle property. Moreover, we show that the AdaGril estimator satisfies a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the "true" regression coefficient. This bound is obtained under a weak Restricted Eigenvalue (RE) condition similar to the one used for the Lasso. Simulation studies show that some particular cases of AdaGril outperform its competitors.
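Since AdaGril is described as combining a quadratic (ridge) penalty with adaptively weighted Lasso shrinkage, its criterion plausibly takes the standard adaptive elastic-net form sketched below. This is an illustrative reconstruction, not the authors' exact formulation: the tuning parameters $\lambda_1, \lambda_2$, the weight exponent $\gamma$, and the initial estimator $\hat{\beta}^{\mathrm{init}}$ are assumptions here.

```latex
% Hypothetical sketch of an adaptive ridge-lasso criterion,
% in the spirit of the adaptive Elastic Net:
\hat{\beta}_{\mathrm{AdaGril}}
  = \arg\min_{\beta \in \mathbb{R}^{p_n}}
    \Big\{ \, \| y - X\beta \|_2^2
      + \lambda_2 \, \| \beta \|_2^2
      + \lambda_1 \sum_{j=1}^{p_n} \hat{w}_j \, |\beta_j| \, \Big\},
\qquad
\hat{w}_j = \big| \hat{\beta}^{\mathrm{init}}_j \big|^{-\gamma},
\quad \gamma > 0 .
```

The $\ell_2$ term stabilizes estimation under collinearity, while the data-driven weights $\hat{w}_j$ penalize small initial coefficients more heavily, which is what typically yields the oracle property for adaptive penalties.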

Original language: English
Pages (from-to): 2634-2660
Number of pages: 27
Journal: Communications in Statistics - Theory and Methods
Issue number: 14
Publication status: Published - 18 Jul 2013
Externally published: Yes



Keywords
  • Adaptive regularization
  • High dimension
  • Oracle property
  • Sparsity inequality
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
