Description
In the train.py file, there is a section for initializing schedulers. Have you tried replacing the ExponentialLR class with a different scheduler, such as CosineAnnealingLR or ReduceLROnPlateau, or any other one? I believe it would be beneficial to experiment with different scheduler classes. There might be a better option than ExponentialLR that could improve the training process.
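A minimal sketch of what this could look like, assuming a standard PyTorch setup; the build_scheduler helper and the scheduler names below are hypothetical illustrations, not code taken from train.py:

```python
from torch.optim import lr_scheduler

def build_scheduler(name, optimizer, epochs):
    # Hypothetical helper: pick a scheduler by name, mirroring the existing optimizer option.
    if name == "exponential":
        # current behaviour: the learning rate is multiplied by gamma after every epoch
        return lr_scheduler.ExponentialLR(optimizer, gamma=0.95)
    if name == "cosine":
        # the learning rate follows a cosine curve from its initial value over the run
        return lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)
    if name == "plateau":
        # the learning rate is cut by `factor` when the monitored metric stops improving
        return lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)
    raise ValueError(f"unknown scheduler: {name}")
```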
Problem
—
Proposed Solution
—
Alternatives Considered
—
Unfortunately, I haven't conducted any tests and I don't have any results. However, while reviewing the code I noticed that you had set up the option to choose different optimizers, which gave me the idea to try changing the learning rate scheduler as well.
In the lr_scheduler.py file, I found a variety of schedulers. While reading through their descriptions, two in particular caught my attention: CosineAnnealingLR and ReduceLROnPlateau.
Of course, I didn't review all the available options; there might be something better. But out of the ones I read, these two stood out to me. Yes, the descriptions might be a bit exaggerated, but why not give them a try? :)
Short descriptions (unofficial), with a usage sketch after them:
ExponentialLR
Gradually decreases the learning rate following an exponential function with each epoch.
CosineAnnealingLR
Smoothly decreases the learning rate following a cosine curve, which helps stabilize training.
ReduceLROnPlateau
Reduces the learning rate when the model's performance metric stops improving, helping to avoid getting stuck in local minima.
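One practical difference worth noting: ExponentialLR and CosineAnnealingLR are stepped once per epoch with no arguments, while ReduceLROnPlateau has to be given the metric it monitors. A rough sketch of the epoch loop, assuming a plain PyTorch setup (the model, optimizer, and validation value here are placeholders, not code from this repository):

```python
from torch import nn, optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 1)                            # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Pick one of the three schedulers discussed above:
# scheduler = lr_scheduler.ExponentialLR(optimizer, gamma=0.95)
# scheduler = lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)
scheduler = lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(100):
    # ... forward/backward passes and optimizer.step() would run here ...
    val_loss = 1.0 / (epoch + 1)                    # placeholder validation metric

    if isinstance(scheduler, lr_scheduler.ReduceLROnPlateau):
        scheduler.step(val_loss)   # ReduceLROnPlateau needs the monitored metric
    else:
        scheduler.step()           # ExponentialLR / CosineAnnealingLR take no arguments
```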