Volume No.: 8, Issue No.: 3A, Page No.: 726-733, Jan-Mar 2014

 

RIDGE REGRESSION AS AN APPLICATION TO REGRESSION PROBLEMS

 

Saikia B. and Singh R.*

Department of Statistics, North Eastern Hill University, Shillong (INDIA)

 

Received on: September 19, 2013

 

ABSTRACT

 

Paweł Ciompa first used the term multicollinearity around 1910, but it was Frisch who, in his 1934 work on confluence analysis, coined the term as it is known today. The problem of multicollinearity in regression analysis is essentially a lack of sufficient information in the sample to permit accurate estimation of the individual parameters. Hoerl and Kennard (1970) proposed Ridge Regression (RR), now a popular tool for estimating the regression coefficients when the explanatory variables are correlated. They proposed $\hat{\beta}_R = (X'X + kI)^{-1}X'Y$ in lieu of $\hat{\beta} = (X'X)^{-1}X'Y$ as the estimate of the parameter vector $\beta$. In the presence of multicollinearity, the RR technique provides better estimates of the regression coefficients: it yields biased estimates with a smaller Mean Square Error (MSE) than the Ordinary Least Squares (OLS) estimator. This paper presents and applies the RR technique for the estimation of the regression coefficients when the explanatory variables are correlated.
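To make the two estimators concrete, the following is a minimal sketch in Python (not part of the original paper) that computes the OLS estimate $(X'X)^{-1}X'Y$ and the ridge estimate $(X'X + kI)^{-1}X'Y$ on simulated near-collinear data; the simulated data and the biasing parameter k = 0.1 are illustrative assumptions, not values taken from the study.

# Sketch of the ridge estimator (X'X + kI)^{-1} X'Y versus the OLS
# estimator (X'X)^{-1} X'Y; data and k are illustrative assumptions.
import numpy as np

def ols_estimate(X, Y):
    # Ordinary Least Squares: solve (X'X) beta = X'Y
    return np.linalg.solve(X.T @ X, X.T @ Y)

def ridge_estimate(X, Y, k):
    # Ridge estimate: solve (X'X + kI) beta = X'Y for biasing parameter k >= 0
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ Y)

# Simulated example with two highly correlated explanatory variables
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)   # nearly collinear with x1
X = np.column_stack([x1, x2])
Y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=100)

print("OLS  :", ols_estimate(X, Y))
print("Ridge:", ridge_estimate(X, Y, k=0.1))  # k = 0.1 is an assumed value

On such near-collinear data the OLS coefficients are typically unstable with large variance, while the ridge coefficients are shrunk toward more moderate values, illustrating the bias-variance trade-off described in the abstract.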

 

Keywords: Biasing parameter, Multicollinearity, Ridge estimate, Tikhonov regularization, Tolerance, Variance inflation factor

 

 
