
feat: Expand adaptive lasso support and fix warm start#11

Merged
DataboyUsen merged 1 commit into main from AdaptiveLASSO-mixin
May 5, 2026

Conversation

@DataboyUsen
Owner

Description

  • Support an `omega` weight parameter for adaptive lasso in the ElasticNet classifier and regressor
  • Allow entries of `omega` to be zero (no L1 penalty on the corresponding features)
  • Fix a warm-start bug in `plqERM_ElasticNet` so that hyperparameter changes between fits are handled correctly
  • Add detailed CI tests covering these updates
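To make the `omega` semantics concrete, here is a minimal NumPy sketch of the per-feature weighted soft-thresholding step at the core of adaptive lasso. The function name and the solver internals are illustrative assumptions, not the actual `plqERM_ElasticNet` code; the point is only that `omega[j] = 0` exempts feature `j` from L1 shrinkage.

```python
import numpy as np

def weighted_soft_threshold(z, lam, omega):
    # Adaptive-lasso proximal step: each coordinate j is shrunk by
    # lam * omega[j]. A zero weight means no L1 penalty for that feature.
    return np.sign(z) * np.maximum(np.abs(z) - lam * omega, 0.0)

z = np.array([1.5, -0.3, 0.8])
omega = np.array([1.0, 1.0, 0.0])  # last feature exempt from the L1 penalty
print(weighted_soft_threshold(z, lam=0.5, omega=omega))
# first entry shrunk to 1.0, second zeroed out, third left untouched
```

With a uniform `omega` of ones this reduces to the ordinary lasso penalty, so the plain ElasticNet behavior is a special case.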

Related Issue

  • warm start on plqERM_ElasticNet()
  • adaptive ElasticNet on plq_ElasticNet_Classifier & plq_ElasticNet_Regressor
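The warm-start issue above can be illustrated with a hedged, self-contained sketch. This is a plain-NumPy ISTA solver, not the actual `plqERM_ElasticNet` implementation; the class name and internals are assumptions. The behavior it demonstrates is the intended contract: after a hyperparameter change, a refit should reuse the previous coefficients only as an initial point, never solver state tied to the old hyperparameters.

```python
import numpy as np

class TinyElasticNet:
    """Minimal ISTA sketch of an elastic-net solver with warm start.

    Objective: (1/2n)||Xw - y||^2 + alpha*(1-l1_ratio)/2 * ||w||^2
               + alpha*l1_ratio * ||w||_1
    """

    def __init__(self, alpha=1.0, l1_ratio=0.5, n_iter=500):
        self.alpha, self.l1_ratio, self.n_iter = alpha, l1_ratio, n_iter
        self.coef_ = None

    def fit(self, X, y):
        n, p = X.shape
        # Warm start: reuse previous coefficients as the starting point,
        # but only when the problem shape still matches; everything that
        # depends on alpha / l1_ratio is recomputed below, so changing a
        # hyperparameter between fits is safe.
        if self.coef_ is not None and self.coef_.shape == (p,):
            w = self.coef_.copy()
        else:
            w = np.zeros(p)
        ridge = self.alpha * (1.0 - self.l1_ratio)
        lam = self.alpha * self.l1_ratio
        L = np.linalg.norm(X, 2) ** 2 / n + ridge  # Lipschitz constant
        for _ in range(self.n_iter):
            grad = X.T @ (X @ w - y) / n + ridge * w
            z = w - grad / L
            w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
        self.coef_ = w
        return self

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = X @ np.array([2.0, 0.0, -1.0])

model = TinyElasticNet(alpha=1.0).fit(X, y)   # heavily penalized fit
model.alpha = 0.001                           # change hyperparameter...
model.fit(X, y)                               # ...then warm-start refit
print(model.coef_)                            # close to [2, 0, -1]
```

The second `fit` converges from the previous solution rather than from zeros; the bug the PR fixes is, per the description, that `plqERM_ElasticNet` did not handle such hyperparameter changes correctly on warm-started refits.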

Type of Change

  • Bug fix (non-breaking change that fixes an issue)
  • New feature (non-breaking change that adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Refactoring
  • Other:

Checklist

  • I have tested my changes locally
  • Tests pass: `pytest tests/ -v`
  • Code follows the style guidelines (PEP 8)
  • Documentation has been updated (if applicable)
  • Commits are properly formatted

@DataboyUsen DataboyUsen merged commit aa2a5c8 into main May 5, 2026
17 checks passed
