
ENH: ReduceLROnPlateau records the learning rate and works on batches #1075

Merged

Conversation

BenjaminBossan
Collaborator

Previously, when using ReduceLROnPlateau, we would not record the learning rates in history. The code comment says that's because this class does not expose the get_last_lr method. I checked it again and it's now present, so let's use it.

Furthermore, I made a change to enable ReduceLROnPlateau to step on each batch (so far, only each epoch was supported). This is consistent with other learning rate schedulers.

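The plateau-based policy this PR builds on can be illustrated with a small, self-contained sketch (a simplified re-implementation for illustration, not skorch's or PyTorch's actual code): the learning rate is multiplied by `factor` once the monitored metric has failed to improve for more than `patience` consecutive steps, and `get_last_lr` reports the current value, which is what now allows the learning rate to be recorded in history.

```python
class TinyReduceLROnPlateau:
    """Minimal sketch of plateau-based LR reduction (illustrative only)."""

    def __init__(self, lr=0.1, factor=0.5, patience=1):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.best = float("inf")
        self.num_bad_steps = 0

    def step(self, metric):
        # Reduce the LR once the metric stops improving for `patience` steps.
        if metric < self.best:
            self.best = metric
            self.num_bad_steps = 0
        else:
            self.num_bad_steps += 1
            if self.num_bad_steps > self.patience:
                self.lr *= self.factor
                self.num_bad_steps = 0

    def get_last_lr(self):
        # Mirrors the method the PR now relies on for recording the LR.
        return [self.lr]


# Each call to step() can happen per epoch or per batch; the scheduler
# only looks at the metric values it is given.
sched = TinyReduceLROnPlateau(lr=0.1, factor=0.5, patience=1)
history = []
for loss in [1.0, 0.9, 0.9, 0.9, 0.8]:
    sched.step(loss)
    history.append(sched.get_last_lr()[0])
# history now shows the LR halving after the loss stalls for 2 steps
```

Because stepping only depends on being handed a metric, nothing in this logic is tied to epoch boundaries, which is why per-batch stepping is a natural extension.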
@BenjaminBossan
Collaborator Author

The CI is failing because of a change in sklearn 1.6 and requires PR #1076.

@BenjaminBossan BenjaminBossan merged commit 5bd84bd into master Dec 19, 2024
16 checks passed
@BenjaminBossan BenjaminBossan deleted the enh-reduce-lr-on-plateau-recording-and-per-batch branch December 19, 2024 13:11
githubnemo added a commit that referenced this pull request Jan 9, 2025
Please welcome skorch 1.1.0, a smaller release with a few fixes, a new notebook showcasing learning rate schedulers, and, most notably, support for scikit-learn 1.6.0.

Full list of changes:

### Added

- Added a [notebook](https://github.com/skorch-dev/skorch/blob/master/notebooks/Learning_Rate_Scheduler.ipynb) that shows how to use learning rate schedulers in skorch. (#1074)

### Changed

- All neural net classes now inherit from sklearn's [`BaseEstimator`](https://scikit-learn.org/stable/modules/generated/sklearn.base.BaseEstimator.html). This is to support compatibility with sklearn 1.6.0 and above. Classification models additionally inherit from [`ClassifierMixin`](https://scikit-learn.org/stable/modules/generated/sklearn.base.ClassifierMixin.html) and regressors from [`RegressorMixin`](https://scikit-learn.org/stable/modules/generated/sklearn.base.RegressorMixin.html). (#1078)
- When using the `ReduceLROnPlateau` learning rate scheduler, we now record the learning rate in the net history (`net.history[:, 'event_lr']` by default). It is now also possible to step per batch, not only per epoch. (#1075)
- The learning rate scheduler `.simulate()` method now supports passing additional step arguments, which is useful when simulating policies such as `ReduceLROnPlateau` that expect metrics on which to base their schedule. (#1077)
- Removed deprecated `skorch.callbacks.scoring.cache_net_infer` (#1088)
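To make the `.simulate()` entry above concrete, here is a hedged sketch of the underlying idea: simulating a schedule means stepping the scheduler a number of times without any training, and metric-based policies need a metric supplied at each step. The helper and stub class below are hypothetical illustrations, not skorch's actual implementation; in real use the scheduler would be something like `torch.optim.lr_scheduler.ReduceLROnPlateau`.

```python
def simulate(scheduler, step_args):
    """Sketch of simulating a schedule: step once per entry in step_args
    (e.g. validation losses for a plateau policy) and collect the
    resulting learning rates, without running any training."""
    lrs = []
    for arg in step_args:
        scheduler.step(arg)
        lrs.append(scheduler.get_last_lr()[0])
    return lrs


class _ConstantScheduler:
    # Minimal stand-in for any scheduler exposing step()/get_last_lr().
    def __init__(self, lr):
        self._lr = lr

    def step(self, metric=None):
        pass  # a real scheduler would adjust self._lr here

    def get_last_lr(self):
        return [self._lr]


# Pass the metric values as step args, one per simulated step.
lrs = simulate(_ConstantScheduler(0.01), step_args=[1.0, 0.9, 0.8])
```

The key design point is that the step arguments are threaded through to each `step()` call, which is exactly what a metric-driven policy needs and what a fixed schedule simply ignores.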

### Fixed

- Fixed an issue with using `NeuralNetBinaryClassifier` with `torch.compile` (#1058)