How can I develop a scorecard that uses lasso or ridge for variable screening, so that the model generalizes better than one fitted on the full set of variables?
Any class supporting .fit(), .predict() and .predict_proba() can be used as an estimator. scikit-learn's LogisticRegression supports lasso (L1) and ridge (L2) regularization via the penalty parameter. The problem is that after fitting, there is no function to filter out the variables with abs(coefficient) < threshold.
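As a workaround, the screening step can be done by hand: fit an L1-penalized LogisticRegression, build a boolean mask from the coefficients, and refit on the surviving columns. This is only a minimal sketch on synthetic data; the threshold value and the choice of C are illustrative, not a recommendation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10,
                           n_informative=4, random_state=0)

# L1 (lasso) penalty drives coefficients of uninformative variables toward zero.
lasso_lr = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
lasso_lr.fit(X, y)

# Keep only variables whose absolute coefficient exceeds a chosen threshold.
threshold = 1e-4
support = np.abs(lasso_lr.coef_.ravel()) >= threshold
selected = np.flatnonzero(support)

# Refit the final estimator on the screened subset.
final_lr = LogisticRegression()
final_lr.fit(X[:, selected], y)
```

Note that sklearn.feature_selection.SelectFromModel wraps this same pattern and exposes the mask via get_support(), so it may be a cleaner way to plug the screening step into a pipeline.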
If I understood correctly, what you are proposing would require a function to retrieve the estimator's support, which is currently missing.