Shrinkage and absolute penalty estimation in linear regression models


In predicting a response variable with a multiple linear regression model, several candidate models may be available as subsets of the full model. Shrinkage estimators borrow information from the full model and provide a hybrid estimate of the regression parameters by shrinking the full-model estimates toward the candidate submodel. The process introduces bias into the estimation, but the resulting reduction in overall prediction error can more than offset that bias. In this article, we give an overview of shrinkage estimators and their asymptotic properties. A real data example is given, and a Monte Carlo simulation study is carried out to evaluate the performance of shrinkage estimators relative to absolute penalty estimators, such as the least absolute shrinkage and selection operator (LASSO), the adaptive LASSO, and the smoothly clipped absolute deviation (SCAD) penalty, based on a prediction error criterion in a multiple linear regression setup. WIREs Comput Stat 2012, 4:541–553. DOI: 10.1002/wics.1232
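The shrinkage idea described above can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' exact estimator: it simulates data in which only a few predictors are active, fits the full model and a candidate submodel by ordinary least squares, and combines them with a positive-part Stein-type shrinkage rule driven by a Wald-type statistic for the restriction. The specific formula and variable names (`T_n`, `beta_shrink`) are one standard form of such estimators; details vary across the literature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: only the first 3 of 10 predictors are active.
n, p, p1 = 100, 10, 3
X = rng.standard_normal((n, p))
beta_true = np.concatenate([np.array([2.0, -1.5, 1.0]), np.zeros(p - p1)])
y = X @ beta_true + rng.standard_normal(n)

# Full-model OLS estimate.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Candidate submodel: the last p - p1 coefficients are restricted to zero.
beta_sub = np.zeros(p)
beta_sub[:p1], *_ = np.linalg.lstsq(X[:, :p1], y, rcond=None)

# Wald-type statistic for the restriction, using the full-model
# residual variance estimate.
resid = y - X @ beta_full
sigma2 = resid @ resid / (n - p)
diff = beta_full - beta_sub
T_n = diff @ (X.T @ X) @ diff / sigma2

# Positive-part Stein-type shrinkage: pull the full-model estimate
# toward the submodel by a data-driven amount (one standard form).
p2 = p - p1  # number of restricted coefficients
beta_shrink = beta_sub + max(0.0, 1.0 - (p2 - 2) / T_n) * diff

for name, b in [("full", beta_full), ("submodel", beta_sub),
                ("shrinkage", beta_shrink)]:
    print(f"{name:9s} estimation MSE: {np.mean((b - beta_true) ** 2):.4f}")
```

When the candidate submodel is nearly correct, `T_n` is small and the estimate is pulled strongly toward the submodel; when the restriction is badly wrong, `T_n` is large and the estimate stays close to the full-model fit.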

shrinkage estimation; absolute penalty estimation; LASSO; adaptive LASSO; SCAD

How to cite

S. Ejaz Ahmed and S. E. Raheem (2012). Shrinkage and absolute penalty estimation in linear models. WIREs Computational Statistics, Volume 4, Issue 6, pages 541–553, November/December 2012.