
ENH add support for L1 + L2 regularization in SparseLogisticRegression #231

Closed
mathurinm opened this issue Mar 25, 2024 · 1 comment · Fixed by #278 · May be fixed by #244
Comments

@mathurinm (Collaborator)

Currently we support only L1 regularization in logistic regression: https://contrib.scikit-learn.org/skglm/generated/skglm.SparseLogisticRegression.html

We could introduce a second regularization parameter corresponding to a squared L2 penalty, so that SparseLogisticRegression relates to the L1-only version the way ElasticNet relates to Lasso.

@PascalCarrivain, would you give it a try?
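As a rough sketch of what the combined penalty would involve: skglm's coordinate-descent solvers work through the penalty's proximal operator, and for an elastic-net-style penalty `alpha * (l1_ratio * |w| + (1 - l1_ratio)/2 * w**2)` the prox has a simple closed form (soft-thresholding followed by a multiplicative shrinkage). The function name, parametrization, and `l1_ratio` convention below are assumptions borrowed from scikit-learn's ElasticNet, not skglm's actual API:

```python
import math

def prox_l1_plus_l2(x, stepsize, alpha, l1_ratio):
    """Proximal operator of w -> alpha * (l1_ratio * |w| + (1 - l1_ratio)/2 * w**2),
    evaluated at scalar x with a given stepsize.

    Hypothetical helper for illustration; skglm's real penalty objects
    define value/prox/subdifferential methods instead.
    """
    # L1 part: soft-thresholding at level stepsize * alpha * l1_ratio
    thresh = stepsize * alpha * l1_ratio
    st = math.copysign(max(abs(x) - thresh, 0.0), x)
    # Squared-L2 part: multiplicative shrinkage
    return st / (1.0 + stepsize * alpha * (1.0 - l1_ratio))
```

With `l1_ratio=1` this reduces to the pure soft-thresholding used by the current L1-only SparseLogisticRegression, and with `l1_ratio=0` it becomes plain ridge shrinkage, so the existing solver loop should need only the penalty object swapped out.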
