lr_scheduler

A numpy implementation of several learning rate schedulers

Learning rate decay strategies

  • WarmupLR
    This scheduler increases the lr at each iteration during warmup, so it should be stepped per iteration rather than per epoch while warming up.

  • StepLR
    This scheduler updates the lr after each epoch during training. The lr decays by gamma only when the current epoch is in the milestones; otherwise it stays unchanged (a minimal sketch of this and a cosine variant follows the list).

  • LinearLR
    This scheduler decays the lr linearly after each epoch. Once a milestone is reached, the lr decays by gamma.

  • ExponentialLR
    This scheduler decays the lr exponentially after each epoch. Once a milestone is reached, the lr decays by gamma.

  • CosineLR
    This scheduler decays the lr after each epoch along a cosine curve. Once a milestone is reached, the lr decays by gamma. Note that the cosine segment over which the lr decays by gamma is shorter than a full period.

  • CosineLRv2
    This scheduler decays the lr after each epoch along a cosine curve. Once a milestone is reached, the lr decays by gamma. Note that the cosine segment over which the lr decays by gamma is exactly half a period.

  • DampeningLR
    This scheduler decays the lr after each epoch in a dampening manner. Once a milestone is reached, the lr decays by gamma. Currently it decays linearly and dampens after each epoch, but the method can be extended to other decay curves.
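
As a rough illustration, here is a minimal numpy-only sketch of two of the strategies above: StepLR-style milestone decay and CosineLRv2-style decay over half a cosine period (simplified, without the milestone/gamma step). The class names mirror the list, but the constructor arguments and the get_lr method are assumptions, not necessarily this repository's actual API.

```python
import numpy as np


class StepLR:
    """Keep lr constant between milestones; multiply by gamma at each milestone."""

    def __init__(self, base_lr, milestones, gamma=0.1):
        self.base_lr = base_lr
        self.milestones = sorted(milestones)
        self.gamma = gamma

    def get_lr(self, epoch):
        # the number of milestones already passed determines how many gamma decays apply
        n_decays = int(np.searchsorted(self.milestones, epoch, side="right"))
        return self.base_lr * self.gamma ** n_decays


class CosineLRv2:
    """Decay lr from base_lr to min_lr along half a cosine period (no milestone step here)."""

    def __init__(self, base_lr, max_epochs, min_lr=0.0):
        self.base_lr = base_lr
        self.max_epochs = max_epochs
        self.min_lr = min_lr

    def get_lr(self, epoch):
        # progress through training, mapped onto half a cosine period
        t = min(epoch, self.max_epochs) / self.max_epochs
        return self.min_lr + 0.5 * (self.base_lr - self.min_lr) * (1.0 + np.cos(np.pi * t))


if __name__ == "__main__":
    step = StepLR(base_lr=0.1, milestones=[30, 60], gamma=0.1)
    cosine = CosineLRv2(base_lr=0.1, max_epochs=90)
    for epoch in (0, 30, 60, 89):
        print(epoch, step.get_lr(epoch), float(cosine.get_lr(epoch)))
```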

Learning rate resume

All classes except WarmupLR provide a resume function to recover the learning rate from a given epoch.
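
A minimal usage sketch, reusing the hypothetical StepLR class from the snippet above: because the lr there is a pure function of the epoch, recovering the lr after a restart only takes one evaluation at the resume epoch. The interface shown is an assumption, not necessarily this repository's actual resume signature.

```python
# Assumes the StepLR sketch from the previous section is in scope.
scheduler = StepLR(base_lr=0.1, milestones=[30, 60], gamma=0.1)

resume_epoch = 45                    # training restarts at epoch 45
lr = scheduler.get_lr(resume_epoch)  # one milestone (epoch 30) already passed -> 0.1 * 0.1 = 0.01
print(lr)
```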

Sample

(Plot of the learning rate curves produced by the schedulers above.)
