Talk abstract: This paper proposes a new class of GMM estimators that improves the efficiency of the coefficient estimate relative to the ordinary least squares (OLS) estimator when both the error term and the regressors exhibit nonparametric autocorrelation. This class of GMM estimators is built on the moment conditions generated by the long-difference (LD) operator of Griliches and Hausman (1986) and by the multiple-difference (MD) operator of Tsay (2007). Most importantly, the GMM estimator is designed to outperform both the OLS and first-differenced (FD) estimators when neither attains the Gauss-Markov bound, because the proposed method merges the information inherent in the moments of the OLS estimator with that in the moments of the FD estimator. The GMM estimator thus also resolves the dilemma of `to difference or not to difference' in the time series literature, since both level and differenced data enter the GMM estimation. Monte Carlo experiments confirm the theoretical findings, showing that the GMM method has very good finite-sample power relative to both the OLS and FD estimators.
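The core idea of combining level (OLS) and first-differenced (FD) moment conditions in a single GMM criterion can be illustrated with a minimal sketch. The example below is a hypothetical two-step GMM for a scalar regression y_t = beta*x_t + u_t with AR(1)-autocorrelated regressor and error; it is not the paper's LD/MD construction, and the simple (non-HAC) moment covariance and the AR(1) data-generating process are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DGP: y_t = beta*x_t + u_t with AR(1) regressor and AR(1) error
# (a stand-in for the paper's nonparametric-autocorrelation setting).
T, beta_true = 5000, 2.0

def ar1(rho, T, rng):
    """Simulate a stationary AR(1) series with unit innovation variance."""
    e = rng.standard_normal(T)
    x = np.empty(T)
    x[0] = e[0] / np.sqrt(1 - rho**2)  # draw from the stationary distribution
    for t in range(1, T):
        x[t] = rho * x[t - 1] + e[t]
    return x

x = ar1(0.7, T, rng)
u = ar1(0.5, T, rng)
y = beta_true * x + u

# Stack the level and first-difference moment conditions:
#   E[x_t (y_t - beta x_t)] = 0   (the OLS moment, level data)
#   E[dx_t (dy_t - beta dx_t)] = 0  (the FD moment, differenced data)
dx, dy = np.diff(x), np.diff(y)
m = np.array([np.mean(x * y), np.mean(dx * dy)])  # sample E[x y], E[dx dy]
a = np.array([np.mean(x * x), np.mean(dx * dx)])  # sample E[x^2], E[dx^2]

# The stacked moment vector is linear in beta: g(beta) = m - beta * a,
# so each GMM step has a closed form.
# Step 1: identity weight matrix.
beta1 = (a @ m) / (a @ a)

# Step 2: re-weight by the inverse covariance of the stacked moment series
# (naive covariance here; a HAC estimator would be used in practice).
n = T - 1  # align level and differenced samples
g_series = np.column_stack([(x * (y - beta1 * x))[:n],
                            dx * (dy - beta1 * dx)])
W = np.linalg.inv(np.cov(g_series, rowvar=False))
beta_gmm = (a @ W @ m) / (a @ W @ a)

beta_ols = m[0] / a[0]  # uses only the level moment
beta_fd = m[1] / a[1]   # uses only the differenced moment
print(beta_ols, beta_fd, beta_gmm)
```

Because the optimal weight matrix puts more weight on whichever moment is more informative, the two-step estimator uses both the level and the differenced data, which is the sense in which the "to difference or not to difference" choice is sidestepped.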