Although originally formulated for least squares, lasso regularization extends straightforwardly to a wide variety of statistical models, including generalized linear models, generalized estimating equations, proportional hazards models, and M-estimators. Because the lasso was first defined for least squares, this simple case reveals a substantial amount about the behavior of the estimator, including its relationship to ridge regression and best subset selection, and the connection between lasso coefficient estimates and so-called soft thresholding.
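The soft-thresholding connection can be made concrete: with an orthonormal design, each lasso coefficient is the ordinary least squares estimate shrunk toward zero by the penalty and set exactly to zero when its magnitude falls below it. A minimal sketch (the function name `soft_threshold` is ours, for illustration):

```
# Soft-thresholding operator: shrink each value toward zero by lambda,
# snapping anything within lambda of zero to exactly zero
soft_threshold <- function(beta, lambda) {
  sign(beta) * pmax(abs(beta) - lambda, 0)
}

soft_threshold(c(-3, -0.5, 0.2, 2), lambda = 1)
# -2  0  0  1  -- small coefficients are zeroed, large ones are shrunk
```

This is why lasso produces sparse models, in contrast to ridge regression, which shrinks every coefficient but never sets one exactly to zero.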

```
# Load packages: ggplot2 for the diamonds data, caret for model tuning,
# dplyr for data manipulation (glmnet must also be installed for caret to use)
library(ggplot2)
library(caret)
library(dplyr)

data(diamonds)

# One-hot encode the factor columns and combine them with the numeric columns
dia.trans <- bind_cols(diamonds %>% select_if(is.numeric),
                       model.matrix(~ cut - 1, diamonds) %>% as_tibble(),
                       model.matrix(~ color - 1, diamonds) %>% as_tibble(),
                       model.matrix(~ clarity - 1, diamonds) %>% as_tibble())

# Tuning grid: alpha = 1 selects the lasso penalty; search over lambda values
lasso_expand <- expand.grid(alpha = 1, lambda = seq(0.001, 0.1, by = 0.0005))

lasso_mod <- train(x = dia.trans %>% select(-price), y = dia.trans$price,
                   method = "glmnet", tuneGrid = lasso_expand)

# Best tuning parameters and the corresponding resampled RMSE
lasso_mod$bestTune
lasso_mod$results$RMSE

# Variable importance: features with zero importance have been shrunk away
# by the lasso and are candidates for elimination
lasso_imp <- varImp(lasso_mod)
lasso_imp$importance
```
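For comparison, the same lasso path can be fit directly with glmnet, letting `cv.glmnet` choose lambda by cross-validation instead of caret's grid search. A self-contained sketch (assumes the ggplot2 and glmnet packages are installed; `cv_fit` is our name):

```
library(ggplot2)  # for the diamonds data
library(glmnet)

# glmnet expects a numeric matrix; model.matrix expands the factor columns
x <- model.matrix(price ~ . - 1, data = diamonds)
y <- diamonds$price

# alpha = 1 gives the lasso penalty; cv.glmnet selects lambda by CV
cv_fit <- cv.glmnet(x, y, alpha = 1)

cv_fit$lambda.min               # lambda with the lowest cross-validated error
coef(cv_fit, s = "lambda.min")  # sparse coefficient vector at that lambda
```

Coefficients reported as `.` in the sparse output have been set exactly to zero, which is the soft-thresholding behavior described above doing feature selection for free.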