
On Improvability of Model Selection by Model Averaging

2020.12.17

Jingfu Peng, Yuhong Yang


Publication Time: 2020.12.17

Lead Author: Jingfu Peng

Corresponding Author: Jingfu Peng, Yuhong Yang

Journal: Journal of Econometrics

Abstract

In regression, model averaging (MA) provides an alternative to model selection (MS), and asymptotic efficiency theories have been derived for both MS and MA. Basically, under sensible conditions, MS asymptotically achieves the smallest estimation loss/risk among the candidate models, and MA does so among averaged estimators from the models with convex weights. Clearly, MA can beat MS to an arbitrary extent in rate of convergence when all the candidate models have large biases that can be canceled out by an MA scheme. To our knowledge, however, a foundational issue has not been addressed in the literature: when there is no advantage to be gained from reducing approximation error, does MA offer any significant improvement over MS in regression estimation? In this paper, we answer this question in a nested model setting that has often been used in the frequentist MA research area. A remarkable implication is that the much-celebrated asymptotic efficiency of MS (e.g., by AIC) does not necessarily justify the common interpretation that MS achieves the best possible performance. In a nutshell, the oracle model (i.e., the unknowable best model among all the candidates) can be significantly improved upon by MA under certain conditions. A simulation study supports the theoretical findings.
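The MS-versus-MA contrast in the nested setting can be made concrete with a small Monte Carlo experiment. The sketch below is illustrative only and is not the paper's simulation study: the decaying-coefficient linear model, the sample size, AIC-based selection, and smoothed-AIC convex weights are all assumptions chosen for the example.

# Toy comparison of model selection (MS) vs model averaging (MA) over nested
# linear models. Assumed design: slowly decaying true coefficients, nested
# candidates of increasing size, AIC for MS, smoothed-AIC convex weights for MA.
import numpy as np

rng = np.random.default_rng(0)
n, p_max, n_rep = 200, 20, 500                               # sample size, largest candidate, replications
beta = np.array([j ** -1.0 for j in range(1, p_max + 1)])    # decaying coefficients (assumption)

ms_loss, ma_loss = [], []
for _ in range(n_rep):
    X = rng.standard_normal((n, p_max))
    mu = X @ beta                                            # true regression mean
    y = mu + rng.standard_normal(n)

    fits, aic = [], []
    for k in range(1, p_max + 1):                            # nested candidates: first k regressors
        Xk = X[:, :k]
        coef, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        fit = Xk @ coef
        rss = np.sum((y - fit) ** 2)
        fits.append(fit)
        aic.append(n * np.log(rss / n) + 2 * k)
    fits, aic = np.array(fits), np.array(aic)

    ms_fit = fits[np.argmin(aic)]                            # MS: single AIC-best model
    w = np.exp(-(aic - aic.min()) / 2)                       # MA: convex (smoothed-AIC) weights
    w /= w.sum()
    ma_fit = w @ fits

    ms_loss.append(np.mean((ms_fit - mu) ** 2))              # squared estimation loss of the mean
    ma_loss.append(np.mean((ma_fit - mu) ** 2))

print(f"average squared loss  MS: {np.mean(ms_loss):.4f}   MA: {np.mean(ma_loss):.4f}")

In this toy setup the averaged estimator typically attains a smaller average estimation loss than the selected model, which is the kind of improvement over the oracle model that the paper characterizes theoretically; the specific weighting rule used here is only one convenient choice of convex weights.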

Keywords

Model selection, Model averaging, Asymptotic efficiency