Learning rate not reset to last value after resume #428
Unanswered
AlaaKhaddaj asked this question in Q&A
Hello,

I was originally training a model until the learning rate became too small. After increasing the number of epochs and resuming, the learning rate at the first epoch of the resumed run does not match the learning rate at the last epoch of the original run.

Is this intended behavior?

Replies: 1 comment

@AlaaKhaddaj Yes, the LR is based on the schedule over the training duration. With the default cosine schedule, if you change the training duration on resume, the LR at the point you resume is different because you have changed the duration: you are no longer at the same percentage of the way through the schedule.
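For intuition, here is a minimal sketch of the effect with a generic cosine schedule (not this repository's actual scheduler code; `base_lr`, `min_lr`, and the epoch counts are illustrative): the LR depends on the fraction of the total duration completed, so extending the duration changes that fraction at the resume point and the LR jumps back up.

```python
import math

def cosine_lr(epoch, total_epochs, base_lr=0.1, min_lr=0.0):
    """Standard cosine decay from base_lr to min_lr over total_epochs."""
    t = min(epoch / total_epochs, 1.0)  # fraction of the schedule completed
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * t))

# Original run: 100 epochs. LR at the last epoch has decayed to ~min_lr.
print(cosine_lr(100, total_epochs=100))  # ~0.0

# Resume at epoch 100 after extending training to 300 epochs:
# epoch 100 is now only 1/3 of the way through the schedule,
# so the LR is much higher than where the original run left off.
print(cosine_lr(100, total_epochs=300))  # ~0.075
```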