
Add additional optimizers #25


Open · wants to merge 3 commits into main

Conversation

tzuhanchang (Owner)
Add the torch.optim.AdamW optimizer.

@tzuhanchang tzuhanchang added Priority Model Model related issue labels Mar 31, 2024
@tzuhanchang tzuhanchang requested a review from els285 March 31, 2024 18:39
@tzuhanchang tzuhanchang self-assigned this Mar 31, 2024
@tzuhanchang tzuhanchang changed the title Add torch.nn.AdamW optimizer Add additional optimizers Apr 2, 2024
@tzuhanchang (Owner, Author)

Add torch.optim.AdamW and torch.optim.SGD. Also add weight_decay and momentum options, with their default values set to 0.
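To illustrate what the two new options do, here is a minimal pure-Python sketch of the SGD update rule with weight_decay and momentum (a hypothetical re-implementation for clarity, not the project's code; torch.optim.SGD applies the same rule per parameter tensor). With the defaults weight_decay=0 and momentum=0, the update reduces to plain gradient descent, which is why 0 is a safe default:

```python
def sgd_step(param, grad, velocity, lr, weight_decay=0.0, momentum=0.0):
    """One SGD update on a scalar parameter (illustrative sketch)."""
    # L2 weight decay adds weight_decay * param to the gradient.
    grad = grad + weight_decay * param
    # Momentum accumulates an exponentially weighted velocity of gradients.
    velocity = momentum * velocity + grad
    # Step against the (velocity-smoothed) gradient direction.
    return param - lr * velocity, velocity

# Minimize f(x) = (x - 3)^2 with the defaults (weight_decay=0, momentum=0),
# i.e. plain gradient descent; x converges toward the minimum at 3.
x, v = 0.0, 0.0
for _ in range(200):
    grad = 2 * (x - 3)
    x, v = sgd_step(x, grad, v, lr=0.1)
print(round(x, 4))  # → 3.0
```

AdamW differs from SGD-with-weight-decay in that it decouples the decay term from the adaptive gradient statistics, which is the main reason to expose it as a separate optimizer choice.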

Labels: Model (Model related issue)