
Model Averaging Prediction for Time Series Models with a Diverging Number of Parameters

2020.11.07

Jun Liao, Guohua Zou, Yan Gao, Xinyu Zhang


【Publication Time】2020.11.07

【Lead Author】Jun Liao

【Corresponding Author】Guohua Zou

【Journal】JOURNAL OF ECONOMETRICS

Abstract

An important problem with the model averaging approach is the choice of weights. In this paper, a generalized Mallows model averaging (GMMA) criterion for choosing weights is developed in the context of an infinite-order autoregressive (AR(∞)) process. The GMMA method adapts to circumstances in which the dimensions of the candidate models can be large and increase with the sample size. The GMMA method is shown to be asymptotically optimal in the sense of achieving the lowest out-of-sample mean squared prediction error (MSPE) for both independent-realization and same-realization predictions, which, as a byproduct, solves a conjecture put forward by Hansen (2008) that the well-known Mallows model averaging criterion of Hansen (2007) is asymptotically optimal for predicting the future of a time series. The rate at which the GMMA-based weight estimator converges to the optimal weight vector minimizing the independent-realization MSPE is also derived. Both simulation experiments and real data analysis illustrate the merits of the GMMA method for the prediction of an AR(∞) process.
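
To make the weight-selection idea concrete, the sketch below implements a plain Mallows-type model averaging step over nested AR(k) candidates, in the spirit of Hansen (2007), the criterion that GMMA generalizes. It is an illustrative sketch under stated assumptions, not the authors' GMMA implementation: the function names (ar_design, mallows_average), the use of the largest candidate model for the error-variance estimate, and the SLSQP solver are all choices made here for illustration.

```python
import numpy as np
from scipy.optimize import minimize


def ar_design(y, k):
    """Build the lagged design matrix (lags 1..k) and response for an AR(k) fit."""
    T = len(y)
    X = np.column_stack([y[k - j:T - j] for j in range(1, k + 1)])
    return X, y[k:]


def mallows_average(y, K):
    """Mallows-type model averaging over nested AR(1), ..., AR(K) candidates.

    Returns the estimated weight vector and the averaged fitted values
    on the common sample (observations t = K, ..., T-1).
    """
    X_full, resp = ar_design(y, K)   # common sample so all candidates are comparable
    n = len(resp)

    # Fitted values of each nested candidate AR(k), k = 1..K, by least squares
    fits = np.empty((n, K))
    for k in range(1, K + 1):
        Xk = X_full[:, :k]
        beta_k, *_ = np.linalg.lstsq(Xk, resp, rcond=None)
        fits[:, k - 1] = Xk @ beta_k

    # Error-variance estimate taken from the largest candidate model
    sigma2 = np.sum((resp - fits[:, -1]) ** 2) / (n - K)

    ks = np.arange(1, K + 1)

    def criterion(w):
        # Mallows-type criterion: residual sum of squares + 2 * sigma^2 * (weighted model size)
        resid = resp - fits @ w
        return np.sum(resid ** 2) + 2.0 * sigma2 * (ks @ w)

    # Minimize over the weight simplex {w >= 0, sum(w) = 1}
    w0 = np.full(K, 1.0 / K)
    res = minimize(criterion, w0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * K,
                   constraints={"type": "eq", "fun": lambda w: np.sum(w) - 1.0})
    return res.x, fits @ res.x


# Usage example: an invertible ARMA(1,1), which admits an AR(infinity) representation
rng = np.random.default_rng(0)
T = 500
e = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + e[t] + 0.4 * e[t - 1]

w_hat, y_hat = mallows_average(y, K=10)
print("estimated weights:", np.round(w_hat, 3))
```

The weight search is a quadratic program over the simplex, so a dedicated QP solver could be used instead; SLSQP is chosen here only to keep the sketch dependency-light.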

Keywords

Asymptotic optimality, Autoregressive process, Consistency, Mallows criterion, Model averaging