Fix the use of the initial learning rate in the OneCycleLR example (#130306)

Fixes #127649. The docstring example constructed the optimizer with lr=0.1, which is larger than the scheduler's max_lr=0.01 and is overwritten by OneCycleLR on construction anyway; the example now uses the small placeholder value 1e-4.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/130306
Approved by: https://github.com/janeyx99
Tianyi Tao
2024-07-09 18:58:05 +00:00
committed by PyTorch MergeBot
parent 3689471ea4
commit 3477ee38e4


@@ -1939,7 +1939,7 @@ class OneCycleLR(LRScheduler):
 Example:
     >>> # xdoctest: +SKIP
     >>> data_loader = torch.utils.data.DataLoader(...)
-    >>> optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
+    >>> optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
     >>> scheduler = torch.optim.lr_scheduler.OneCycleLR(optimizer, max_lr=0.01, steps_per_epoch=len(data_loader), epochs=10)
     >>> for epoch in range(10):
     >>>     for batch in data_loader:
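
Why this matters: OneCycleLR does not use the lr passed to the optimizer's constructor at all. On construction it overwrites each parameter group's lr with the start of the cycle, max_lr / div_factor (div_factor defaults to 25), so the old example's lr=0.1 was misleading: it exceeds max_lr=0.01 and readers could assume it influenced training. Below is a minimal sketch demonstrating the overwrite; the throwaway torch.nn.Linear model and total_steps=100 are hypothetical stand-ins for illustration, not part of this PR.

    import torch

    # Hypothetical model; only its parameters matter here.
    model = torch.nn.Linear(4, 2)

    # Pass the old docstring value on purpose.
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

    # Constructing the scheduler immediately replaces each group's lr
    # with max_lr / div_factor = 0.01 / 25 = 4e-4.
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, max_lr=0.01, total_steps=100
    )

    print(optimizer.param_groups[0]["lr"])  # 0.0004, not 0.1

The printed value is 4e-4 regardless of what lr SGD was given, so the docstring's lr only needs to be a harmless placeholder such as 1e-4.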