Variable Selection in Robust Linear Models

Open Access
Kai, Bo
Graduate Program:
Master of Science
Document Type:
Master's Thesis
Date of Defense:
Committee Members:
  • Runze Li, Thesis Advisor
Keywords:
  • robust regression
  • linear models
  • variable selection
  • SS penalty
  • oracle property
Variable selection plays a very important role in statistical learning. Traditional stepwise subset selection methods are widely used in practice, but they become difficult to implement when the number of predictors is large. Modern penalized likelihood estimation methods were introduced to overcome this deficiency, selecting variables and estimating their coefficients simultaneously. A number of authors have systematically studied the theoretical properties of various penalized least squares estimators. However, it is well known that least squares estimators are badly affected by outliers in the data. Robust penalized estimators have been proposed to resist the influence of outliers. In this thesis, we consider penalized M-estimation in linear models to achieve robustness and variable selection simultaneously. We establish theoretical results for robust penalized estimators under general settings, i.e., a general robust loss function and a general penalty function. We show that the oracle property still holds for penalized M-estimators. Our finite-sample simulation studies demonstrate the satisfactory performance of the penalized M-estimators.
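To make the idea of penalized M-estimation concrete, the following is a minimal illustrative sketch, not the thesis's own method: it pairs the Huber loss (one common robust loss) with an L1 penalty (one common penalty) and minimizes the penalized objective numerically. The function name `penalized_m_fit` and all tuning-parameter values are hypothetical choices for this example; the thesis covers general loss and penalty functions.

```python
import numpy as np
from scipy.optimize import minimize

def huber_loss(r, delta=1.345):
    """Huber rho function: quadratic for small residuals, linear for large ones."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def penalized_m_fit(X, y, lam=0.1, delta=1.345):
    """Penalized M-estimator: average Huber loss plus an L1 penalty on beta.

    This is an illustrative sketch; the general theory allows other robust
    losses and other (e.g. nonconvex) penalty functions.
    """
    n, p = X.shape

    def objective(beta):
        fit = huber_loss(y - X @ beta, delta).sum() / n
        return fit + lam * np.abs(beta).sum()

    # Powell is derivative-free, so the non-smooth L1 term poses no problem.
    res = minimize(objective, np.zeros(p), method="Powell")
    return res.x

# Simulated data with heavy-tailed (outlier-prone) errors.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)

beta_hat = penalized_m_fit(X, y)
```

Despite the heavy-tailed noise, the robust fit recovers the large coefficients while shrinking the irrelevant ones toward zero; a least squares lasso fit on the same data would be more strongly distorted by the outliers.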