
Implement Adam optimizer #301

Closed
TH3CHARLie wants to merge 5 commits from the add-adam branch

Conversation

@TH3CHARLie (Contributor)

fixes #288

Summary:
Implement the Adam optimizer, following a code style similar to SGD and RMSProp.
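
For context, a minimal sketch of what such a wrapper might look like, delegating the update rule to torch.optim.Adam the same way the SGD and RMSProp wrappers delegate to their PyTorch counterparts. The class shape, the init_pytorch_optimizer name, and the defaults below are illustrative assumptions, not ClassyVision's actual API:

import torch.optim

# Hypothetical sketch only: the class structure and method name below are
# assumptions for illustration, not the repo's actual base-class API.
class Adam:
    def __init__(self, lr=0.001, betas=(0.9, 0.999), eps=1e-8, weight_decay=0.0):
        self.lr = lr
        self.betas = betas
        self.eps = eps
        self.weight_decay = weight_decay
        self.optimizer = None

    def init_pytorch_optimizer(self, parameters):
        # Delegate the actual update rule to PyTorch's torch.optim.Adam,
        # mirroring how the SGD and RMSProp wrappers delegate to theirs.
        self.optimizer = torch.optim.Adam(
            parameters,
            lr=self.lr,
            betas=self.betas,
            eps=self.eps,
            weight_decay=self.weight_decay,
        )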

@facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Dec 6, 2019.
@TH3CHARLie (Contributor, Author)

The current test fails at

self.assertTrue(
    torch.allclose(
        optim1["state"][id1]["momentum_buffer"],
        optim2["state"][id2]["momentum_buffer"],
    )
)

because accessing "momentum_buffer" raises a KeyError.

I'm not entirely sure about this, but I think it's because Adam has no momentum buffer; if so, the test itself may need modification.
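
For reference, a quick standalone check (a sketch, not part of the PR) confirming which per-parameter state keys each PyTorch optimizer actually populates:

import torch

# A single parameter is enough to populate each optimizer's state.
param = torch.nn.Parameter(torch.randn(3))

sgd = torch.optim.SGD([param], lr=0.1, momentum=0.9)
adam = torch.optim.Adam([param], lr=0.1)

# Take one step so per-parameter state gets created.
param.grad = torch.ones_like(param)
sgd.step()
adam.step()

print(sgd.state[param].keys())   # includes 'momentum_buffer'
print(adam.state[param].keys())  # 'step', 'exp_avg', 'exp_avg_sq' -- no 'momentum_buffer'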

@aadcock commented on Dec 6, 2019

@TH3CHARLie, thanks for submitting the PR! Yeah, I think you are correct; the test likely needs to be updated.

@mannatsingh (Contributor) left a comment:

Thanks for working on this, TH3CHARLie!

You're right, Adam doesn't have a momentum buffer. Can you add a _check_momentum_buffer to TestOptimizer and return True in the base class? TestAdamOptimizer can return False instead, and we can use this to determine whether the check should happen or not.
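
A minimal sketch of that suggestion; _check_momentum_buffer, TestOptimizer, and TestAdamOptimizer come from the comment above, while the _compare_momentum_buffers helper is a hypothetical name showing how the flag would gate the assertion:

import unittest
import torch

class TestOptimizer(unittest.TestCase):
    def _check_momentum_buffer(self):
        # Base-class default: most optimizers tested here (e.g. SGD with
        # momentum) keep a per-parameter momentum buffer.
        return True

    # Hypothetical helper illustrating how the flag gates the check.
    def _compare_momentum_buffers(self, optim1, optim2, id1, id2):
        if not self._check_momentum_buffer():
            return  # optimizer has no momentum buffer; skip the check
        self.assertTrue(
            torch.allclose(
                optim1["state"][id1]["momentum_buffer"],
                optim2["state"][id2]["momentum_buffer"],
            )
        )

class TestAdamOptimizer(TestOptimizer):
    def _check_momentum_buffer(self):
        # Adam stores exp_avg / exp_avg_sq instead of a momentum buffer.
        return False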

@facebook-github-bot (Contributor) left a comment:

@vreis has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) left a comment:

@mannatsingh has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor)

@mannatsingh merged this pull request in 3d3fc1d.

@TH3CHARLie deleted the add-adam branch on December 10, 2019 at 02:54.
kleistern8z added a commit to kleistern8z/ClassyVision-2 that referenced this pull request on Aug 14, 2024:
Summary:
fixes facebookresearch/ClassyVision#288
Implement the Adam optimizer, following a code style similar to SGD and RMSProp.
Pull Request resolved: facebookresearch/ClassyVision#301

Reviewed By: vreis

Differential Revision: D18875000

Pulled By: mannatsingh

fbshipit-source-id: 6fee087264a71bc826a00e89a5557c6894979d84
Labels: CLA Signed, Merged