This repository has been archived by the owner on Jan 3, 2023. It is now read-only.

Implement Adagrad optimizer #19

Open · wants to merge 2 commits into master

Conversation

kaushikkanetkar (Contributor)

No description provided.

@kaushikkanetkar (Contributor, Author)

Added a test in tests/test_optimizer.py.
Tested manually by running the MNIST and IMDB RNN examples with this optimizer; the reduction in cost becomes smaller after a few iterations.
Ran the style checks.
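
The manual test described above exercises the standard Adagrad update; a plain NumPy reference for a single step is sketched below for comparison. This is not taken from this pull request's diff, and the function name, default learning rate, and epsilon value are assumptions.

    import numpy as np

    def adagrad_step(param, grad, accum, learning_rate=0.01, epsilon=1e-8):
        # Accumulate the elementwise sum of squared gradients; parameters that
        # have seen large gradients get a smaller effective step size.
        accum += grad ** 2
        param -= learning_rate * grad / (np.sqrt(accum) + epsilon)
        return param, accum

    # Example: a unit test could compare an optimizer's output against this reference.
    w = np.array([1.0, -2.0])
    g = np.array([0.5, 0.1])
    acc = np.zeros_like(w)
    w, acc = adagrad_step(w, g, acc)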

@kaushikkanetkar (Contributor, Author)

Adding @zach-nervana and @tyler-nervana for review.

@tyler-nervana (Contributor) left a comment

If you can add a docstring for Adagrad, that'd be awesome! Regardless, this looks good to me, so I'll go ahead and get it merged internally.



class Adagrad(LearningRateOptimizer):
    """
@tyler-nervana (Contributor)

Could you add a docstring? You can use this one as a good reference: https://github.com/NervanaSystems/neon/blob/master/neon/optimizers/optimizer.py#L637

@kaushikkanetkar (Contributor, Author)

Sure, I'll do that

@kaushikkanetkar (Contributor, Author)

Done. Updated. Thanks.
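
For reference, a docstring in the spirit of the neon example linked above might read roughly as follows. This is only a sketch of the requested documentation, not the text that was actually merged; the argument names and defaults are assumptions, and LearningRateOptimizer is the base class shown in the diff context above.

    class Adagrad(LearningRateOptimizer):
        """
        Adagrad optimizer.

        Keeps a per-parameter running sum of squared gradients and divides the
        base learning rate by its square root, so frequently updated parameters
        take progressively smaller steps:

            accum += grad ** 2
            param -= learning_rate * grad / (sqrt(accum) + epsilon)

        Arguments:
            learning_rate (float): base step size (assumed default, e.g. 0.01).
            epsilon (float): small constant for numerical stability (e.g. 1e-8).
        """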
