A Comparison of Mixed and Ridge Estimators of Linear Models


Güler H., Kaçıranlar S.

COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, vol.38, no.2, pp.368-401, 2009 (Journal Indexed in SCI)

  • Publication Type: Article
  • Volume: 38 Issue: 2
  • Publication Date: 2009
  • DOI Number: 10.1080/03610910802506630
  • Title of Journal: COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION
  • Page Numbers: pp.368-401

Abstract

The presence of autocorrelation in the errors and multicollinearity among the regressors has undesirable effects on least squares regression. A wide range of methods, such as the mixed estimator or the ridge estimator, is available for estimating regression equations; these methods aim to overcome the shortcomings of the ordinary least squares estimator or the generalized least squares estimator. The purpose of this article is to examine the multicollinearity and autocorrelation problems simultaneously and to compare the mixed estimator with the ridge regression estimator (RRE) by the dispersion and mean squared error (MSE) matrix criteria in the linear regression model with correlated or heteroscedastic errors.

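As a rough illustration of the kind of comparison described in the abstract (this is a minimal sketch, not the article's derivation), the Python code below computes the MSE matrix of a ridge estimator and of a mixed (Theil-Goldberger) estimator in a linear model with AR(1)-correlated errors, and checks whether their difference is positive semidefinite. The simulated design matrix, the prior restriction matrices R and V, the ridge parameter k, and the AR(1) parameter rho are all hypothetical choices made only for illustration.

import numpy as np

# Sketch: compare ridge and mixed estimators by their MSE matrices in a
# linear model y = X beta + e with Cov(e) = sigma^2 * Omega (AR(1) errors).
# All quantities below are hypothetical and chosen only for illustration.

rng = np.random.default_rng(0)
n, p, sigma2, rho, k = 50, 3, 1.0, 0.7, 0.5

# Nearly collinear design matrix (multicollinearity)
z = rng.normal(size=(n, 1))
X = np.hstack([z + 0.01 * rng.normal(size=(n, 1)) for _ in range(p)])
beta = np.array([1.0, 2.0, -1.0])

# AR(1) error covariance: Omega[i, j] = rho**|i - j|
idx = np.arange(n)
Omega = rho ** np.abs(idx[:, None] - idx[None, :])
Omega_inv = np.linalg.inv(Omega)

S = X.T @ Omega_inv @ X          # GLS cross-product matrix X' Omega^{-1} X

# Ridge estimator based on the GLS transform: biased, lower variance
A = np.linalg.inv(S + k * np.eye(p))
ridge_cov = sigma2 * A @ S @ A
ridge_bias = -k * A @ beta
ridge_mse = ridge_cov + np.outer(ridge_bias, ridge_bias)

# Mixed estimator with (hypothetical) unbiased stochastic restrictions,
# r = R beta + v with Cov(v) = sigma^2 * V; unbiased, so MSE = Cov
R = np.eye(p)
V = 4.0 * np.eye(p)
mixed_mse = sigma2 * np.linalg.inv(S + R.T @ np.linalg.inv(V) @ R)

# The mixed estimator dominates the ridge estimator in the MSE matrix sense
# iff ridge_mse - mixed_mse is positive semidefinite.
diff_eigs = np.linalg.eigvalsh(ridge_mse - mixed_mse)
print("smallest eigenvalue of MSE difference:", diff_eigs.min())
print("mixed dominates ridge for this setup:", diff_eigs.min() >= -1e-10)

Whether one estimator dominates the other depends on the true coefficients, the ridge parameter, and the quality of the prior restrictions; the sketch only shows how such a check can be carried out numerically for one configuration.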