Superiority of the r-k Class Estimator over Some Estimators in a Linear Model


Siray G., Sakallıoğlu S.

COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, vol.41, no.15, pp.2819-2832, 2012 (Peer-Reviewed Journal)

  • Publication Type: Article
  • Volume: 41 Issue: 15
  • Publication Date: 2012
  • Doi Number: 10.1080/03610926.2011.648786
  • Journal Name: COMMUNICATIONS IN STATISTICS-THEORY AND METHODS
  • Journal Indexes: Science Citation Index Expanded, Scopus
  • Page Numbers: pp.2819-2832
  • Keywords: Average loss criterion, r - k Class estimator, Mahalanobis loss function, Principal components regression estimator, Ridge regression estimator, PRINCIPAL COMPONENT REGRESSION, RIDGE-REGRESSION, COMBINING RIDGE

Abstract

In regression analysis, the r-k class estimator is proposed as an alternative to the ordinary least squares estimator to overcome the problem of multicollinearity; it is a general estimator that includes the ordinary ridge regression estimator, the principal components regression estimator, and the ordinary least squares estimator as special cases. In this article, we derive the necessary and sufficient conditions for the superiority of the r-k class estimator over each of these estimators under the Mahalanobis loss function, using the average loss criterion. We then compare these estimators with one another under the same criterion. We also suggest a test to verify whether these conditions are indeed satisfied. Finally, a numerical example and a Monte Carlo simulation are presented to illustrate the theoretical results.
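
For concreteness, the following is a minimal numerical sketch of the estimator discussed in the abstract. It assumes the standard form of the r-k class estimator, beta = T_r (T_r' X'X T_r + k I_r)^{-1} T_r' X'y, where T_r holds the eigenvectors of X'X corresponding to its r largest eigenvalues; the function name and the simulated data are illustrative only and are not taken from the article.

```python
import numpy as np

def rk_class_estimator(X, y, r, k):
    """Sketch of the r-k class estimator:
    beta = T_r (T_r' X'X T_r + k I_r)^{-1} T_r' X'y,
    with T_r the eigenvectors of X'X for the r largest eigenvalues."""
    XtX = X.T @ X
    eigvals, eigvecs = np.linalg.eigh(XtX)            # eigenvalues in ascending order
    T_r = eigvecs[:, np.argsort(eigvals)[::-1][:r]]   # keep the r leading eigenvectors
    A = T_r.T @ XtX @ T_r + k * np.eye(r)             # = Lambda_r + k I_r
    return T_r @ np.linalg.solve(A, T_r.T @ X.T @ y)

# Illustrative use on simulated, nearly collinear data (not the article's example).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=50)        # induce multicollinearity
y = X @ np.array([1.0, 2.0, 0.5, -1.0]) + rng.normal(size=50)
beta_rk = rk_class_estimator(X, y, r=3, k=0.1)
```

Setting k = 0 reduces the sketch to the principal components regression estimator with r components; setting r equal to the number of regressors gives the ordinary ridge regression estimator; doing both gives ordinary least squares, which is the sense in which the abstract describes the r-k class estimator as a general estimator.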