A Geometric Approach to Subset Selection and Sparse Sufficient Dimension Reduction
SCRUCCA, Luca
2011
Abstract
Sufficient dimension reduction methods make it possible to estimate lower-dimensional subspaces while retaining most of the information about the regression of a response variable on a set of predictors. However, it may happen that only a subset of the predictors is needed. We propose a geometric approach to subset selection that imposes sparsity constraints on some coefficients. The proposed method can be applied to most existing dimension reduction methods, such as sliced inverse regression (SIR) and sliced average variance estimation (SAVE), and may improve estimation accuracy and facilitate interpretation. Simulation studies show the effectiveness of the proposed method when applied to SIR and SAVE, two popular dimension reduction methods, and a comparison is made with the LASSO and stepwise OLS regression.
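For readers unfamiliar with the base estimators mentioned in the abstract, the following is a minimal sketch of plain sliced inverse regression (the unpenalized SIR estimator, not the sparse, subset-selecting variant proposed in the paper) written in Python with NumPy. The function name sir_directions and all parameter defaults are illustrative choices, not part of the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Minimal sliced inverse regression (SIR) sketch: standardize X,
    average the standardized predictors within slices of y, and take the
    leading eigenvectors of the weighted covariance of the slice means."""
    n, p = X.shape

    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response into roughly equal-sized groups
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:n_directions]]
    B = Sigma_inv_sqrt @ top  # columns span the estimated reduction subspace
    return B / np.linalg.norm(B, axis=0)
```

In this sketch the columns of the returned matrix span the estimated dimension reduction subspace; the sparse approach described in the abstract would additionally constrain some rows of this matrix toward zero so that the corresponding predictors drop out of the reduction.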