Mirror of https://github.com/pytorch/pytorch.git (synced 2025-10-20 21:14:14 +08:00)
fix the use of initial learning rate in the OneCycleLR example (#130306)
Fixes #127649
Pull Request resolved: https://github.com/pytorch/pytorch/pull/130306
Approved by: https://github.com/janeyx99
commit 3477ee38e4
parent 3689471ea4
committed by PyTorch MergeBot
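Why the old example was misleading: OneCycleLR does not use the learning rate the optimizer was constructed with. On construction it overwrites each param group's lr with max_lr / div_factor, so an example pairing lr=0.1 with max_lr=0.01 suggested a relationship that does not exist. A minimal sketch illustrating this, assuming a toy nn.Linear module and total_steps=100 purely for illustration (neither is part of the commit):

import torch
from torch import nn

# Hypothetical toy parameters; any module would do.
optimizer = torch.optim.SGD(nn.Linear(2, 2).parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(optimizer, max_lr=0.01, total_steps=100)

# With the default div_factor=25, the starting lr is max_lr / 25 = 4e-4,
# not the 0.1 passed to the optimizer above.
print(optimizer.param_groups[0]["lr"])  # 0.0004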
@@ -1939,7 +1939,7 @@ class OneCycleLR(LRScheduler):
     Example:
         >>> # xdoctest: +SKIP
         >>> data_loader = torch.utils.data.DataLoader(...)
-        >>> optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
+        >>> optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
         >>> scheduler = torch.optim.lr_scheduler.OneCycleLR(optimizer, max_lr=0.01, steps_per_epoch=len(data_loader), epochs=10)
         >>> for epoch in range(10):
         >>>     for batch in data_loader:
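For reference, a runnable version of the corrected example. The toy model, synthetic dataset, and loss function are assumptions filled in around the docstring's elided "..."; the optimizer and scheduler lines match the fixed example:

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical model and synthetic data standing in for the docstring's "...".
model = nn.Linear(10, 2)
data_loader = DataLoader(
    TensorDataset(torch.randn(40, 10), torch.randint(0, 2, (40,))),
    batch_size=8,
)

# As in the fixed example: lr=1e-4 is effectively a placeholder, since
# OneCycleLR replaces it with max_lr / div_factor on construction.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.01, steps_per_epoch=len(data_loader), epochs=10
)

loss_fn = nn.CrossEntropyLoss()
for epoch in range(10):
    for inputs, targets in data_loader:
        optimizer.zero_grad()
        loss_fn(model(inputs), targets).backward()
        optimizer.step()
        scheduler.step()  # OneCycleLR is stepped once per batch, not per epoch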