
Add AdamW optimizer support for Word Language Model example #1380


Merged 1 commit into pytorch:main on Aug 23, 2025

Conversation

likejazz (Contributor)

Previously, model parameters had to be updated manually, one gradient step at a time. With this change, training can be performed directly with the AdamW optimizer, making the workflow simpler and more flexible.

$ python main.py --accel --epochs 6 --model Transformer --use-optimizer --lr 0.001
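
For context, a minimal sketch of the difference (using a toy stand-in model for illustration; the real example builds an RNN or Transformer language model in main.py, and the --use-optimizer flag shown above switches between the two code paths):

```python
import torch

# Toy stand-in for illustration; the real example builds an RNN or
# Transformer language model in main.py.
model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss()
data = torch.randn(4, 10)
targets = torch.tensor([0, 1, 0, 1])
lr = 0.001

# Before: backward() followed by a manual SGD-style parameter update.
model.zero_grad()
loss = criterion(model(data), targets)
loss.backward()
with torch.no_grad():
    for p in model.parameters():
        p.add_(p.grad, alpha=-lr)

# After: torch.optim.AdamW handles the update (what --use-optimizer enables).
optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
optimizer.zero_grad()
loss = criterion(model(data), targets)
loss.backward()
optimizer.step()
```

Beyond convenience, AdamW applies decoupled weight decay and per-parameter adaptive learning rates, which typically suits Transformer training better than plain SGD steps.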


@meta-cla meta-cla bot added the cla signed label Aug 19, 2025
@msaroufim msaroufim merged commit 28d16ff into pytorch:main Aug 23, 2025
5 checks passed