scheduler.py


pycmtensor.scheduler

PyCMTensor scheduler module

This module defines a base class, Scheduler, for learning rate schedulers, together with subclasses that implement specific scheduling strategies: ConstantLR, StepLR, PolynomialLR, CyclicLR, Triangular2CLR, and ExpRangeCLR.

Example usage:

- Create a `Scheduler` object: `scheduler = Scheduler(lr=0.01)`
- Get the learning rate for a specific epoch: `lr = scheduler(epoch=5)`
- Create a `ConstantLR` object: `constant_lr = ConstantLR(lr=0.01)`
- Get the learning rate for a specific epoch: `lr = constant_lr(epoch=10)`
- Create a `StepLR` object: `step_lr = StepLR(lr=0.01, factor=0.95, drop_every=5)`
- Get the learning rate for a specific epoch: `lr = step_lr(epoch=15)`
- Create a `PolynomialLR` object: `poly_lr = PolynomialLR(lr=0.01, max_epochs=20, power=0.5)`
- Get the learning rate for a specific epoch: `lr = poly_lr(epoch=8)`

Scheduler(lr)

Initializes the Scheduler object with a base learning rate.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | The base learning rate. | required |

Attributes:

| Name | Type | Description |
|------|------|-------------|
| `name` | `str` | Name of the scheduler. |
| `_base_lr` | `float` | Base learning rate. |
| `_history` | `list` | List storing the learning rate history. |

lr property

Property that returns the base learning rate.

Returns:

| Type | Description |
|------|-------------|
| `float` | The base learning rate. |

history property

Property that returns the learning rate history.

Returns:

| Type | Description |
|------|-------------|
| `list` | The learning rate history. |

__str__()

Returns a string representation of the Scheduler object.

Returns:

| Type | Description |
|------|-------------|
| `str` | String representation of the Scheduler object. |

__repr__()

Returns a string representation of the Scheduler object with its attributes.

Returns:

| Type | Description |
|------|-------------|
| `str` | String representation of the Scheduler object with its attributes. |

__call__(epoch)

Records the learning rate and returns the current learning rate for the given epoch.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `epoch` | `int` | The epoch number. | required |

Returns:

| Type | Description |
|------|-------------|
| `float` | The current learning rate. |

record(lr)

Saves the learning rate to the history and returns the current learning rate.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | The learning rate. | required |

Returns:

| Type | Description |
|------|-------------|
| `float` | The current learning rate. |
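The base class above can be sketched from its documented attributes and methods. This is an illustrative reconstruction from this page, not the actual pycmtensor source, so details may differ:

```python
# Illustrative sketch of the Scheduler base class documented above;
# reconstructed from this page, not taken from the pycmtensor source.
class Scheduler:
    def __init__(self, lr):
        self.name = "Scheduler"
        self._base_lr = lr   # base learning rate
        self._history = []   # learning rate history

    @property
    def lr(self):
        return self._base_lr

    @property
    def history(self):
        return self._history

    def __str__(self):
        return self.name

    def __repr__(self):
        return f"{self.name}(lr={self._base_lr})"

    def __call__(self, epoch):
        # record and return the learning rate for this epoch;
        # the base class simply returns the constant base rate
        return self.record(self._base_lr)

    def record(self, lr):
        # append to history, then return the current learning rate
        self._history.append(lr)
        return lr
```

Subclasses would override `__call__` to compute an epoch-dependent rate before calling `record`.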

ConstantLR(lr=0.01, **kwargs)

Bases: Scheduler

Subclass of Scheduler for a constant learning rate scheduler.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate | `0.01` |

StepLR(lr=0.01, factor=0.95, drop_every=10, **kwargs)

Bases: Scheduler

Subclass of Scheduler for a step decay learning rate scheduler.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate | `0.01` |
| `factor` | `float` | factor by which the learning rate is reduced at each drop | `0.95` |
| `drop_every` | `int` | step down the learning rate after every n epochs | `10` |
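A plausible reading of this schedule, multiplying the base rate by `factor` once every `drop_every` epochs, can be sketched as a standalone function. The function name and exact formula are assumptions for illustration, not the library's code:

```python
def step_lr(epoch, lr=0.01, factor=0.95, drop_every=10):
    # multiply the base rate by `factor` once every `drop_every` epochs
    return lr * factor ** (epoch // drop_every)
```

With the defaults, epochs 0-9 use 0.01, epochs 10-19 use 0.0095, and so on.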

PolynomialLR(max_epochs, lr=0.01, power=1.0, **kwargs)

Bases: Scheduler

Subclass of Scheduler for a polynomial decay learning rate scheduler.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate value | `0.01` |
| `max_epochs` | `int` | the maximum number of training epochs | required |
| `power` | `float` | the power of the polynomial decay | `1.0` |
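The standard polynomial decay, where the rate falls from `lr` to zero as training approaches `max_epochs`, can be sketched as below; whether pycmtensor uses exactly this formula is an assumption:

```python
def polynomial_lr(epoch, max_epochs, lr=0.01, power=1.0):
    # decay factor falls from 1 toward 0 as epoch approaches max_epochs
    return lr * (1.0 - epoch / max_epochs) ** power
```

With `power=1.0` the decay is linear; `power=0.5` decays slowly at first, then faster near the end.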

CyclicLR(lr=0.01, max_lr=0.1, cycle_steps=16, scale_fn=None, **kwargs)

Bases: Scheduler

Subclass of Scheduler for a cyclic learning rate scheduler.

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate value | `0.01` |
| `max_lr` | `float` | peak learning rate value | `0.1` |
| `cycle_steps` | `int` | the number of steps to complete a cycle | `16` |
| `scale_fn` | `function` | scaling function for the learning rate | `None` |

Raises:

| Type | Description |
|------|-------------|
| `ValueError` | description |
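A common formulation of a cyclic schedule oscillates the rate between `lr` and `max_lr` over `cycle_steps` epochs, with an optional per-cycle scaling function. The sketch below follows the widely used triangular formulation; whether pycmtensor computes it exactly this way is an assumption:

```python
import math

def cyclic_lr(epoch, lr=0.01, max_lr=0.1, cycle_steps=16, scale_fn=None):
    # index of the current cycle (1-based)
    cycle = math.floor(1 + epoch / cycle_steps)
    # x sweeps 1 -> 0 -> 1 over the course of one cycle
    x = abs(2.0 * epoch / cycle_steps - 2 * cycle + 1)
    # optional scaling of the cycle amplitude (1.0 = plain triangular)
    scale = 1.0 if scale_fn is None else scale_fn(cycle)
    return lr + (max_lr - lr) * max(0.0, 1.0 - x) * scale
```

With the defaults, the rate climbs from 0.01 to 0.1 over the first 8 epochs, falls back to 0.01 by epoch 16, and repeats.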

Triangular2CLR(lr=0.01, max_lr=0.1, cycle_steps=16, **kwargs)

Bases: CyclicLR

Subclass of CyclicLR for the triangular cyclic learning rate scheduler.

The scaling of the triangular cyclic function is:

\[ scale = \frac{1}{2^{step-1}} \]

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate value | `0.01` |
| `max_lr` | `float` | peak learning rate value | `0.1` |
| `cycle_steps` | `int` | the number of steps to complete a cycle | `16` |
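Reading \(step\) in the scaling formula as the cycle index, the amplitude halves on each successive cycle. A sketch under that assumption (not the library's code):

```python
import math

def triangular2_lr(epoch, lr=0.01, max_lr=0.1, cycle_steps=16):
    cycle = math.floor(1 + epoch / cycle_steps)
    x = abs(2.0 * epoch / cycle_steps - 2 * cycle + 1)
    # amplitude halves each cycle: scale = 1 / 2**(cycle - 1)
    scale = 1.0 / (2 ** (cycle - 1))
    return lr + (max_lr - lr) * max(0.0, 1.0 - x) * scale
```

The first peak reaches `max_lr`; the second peak reaches only half the distance from `lr` to `max_lr`, and so on.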

ExpRangeCLR(lr=0.01, max_lr=0.1, cycle_steps=16, gamma=0.5, **kwargs)

Bases: CyclicLR

Subclass of CyclicLR for the exponential range cyclic learning rate scheduler.

The scaling is:

\[ scale = \gamma^{step} \]

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `lr` | `float` | initial learning rate value | `0.01` |
| `max_lr` | `float` | peak learning rate value | `0.1` |
| `cycle_steps` | `int` | the number of steps to complete a cycle | `16` |
| `gamma` | `float` | exponential decay parameter | `0.5` |
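The page does not state whether \(step\) in the scaling formula counts epochs or cycles; the sketch below assumes it counts epochs, so the cycle amplitude decays by \(\gamma\) every epoch. Both the interpretation and the formula are assumptions, not the library's code:

```python
import math

def exp_range_lr(epoch, lr=0.01, max_lr=0.1, cycle_steps=16, gamma=0.5):
    cycle = math.floor(1 + epoch / cycle_steps)
    x = abs(2.0 * epoch / cycle_steps - 2 * cycle + 1)
    # amplitude decays exponentially with the epoch: scale = gamma ** step
    scale = gamma ** epoch
    return lr + (max_lr - lr) * max(0.0, 1.0 - x) * scale
```

Each peak therefore sits strictly between `lr` and `max_lr`, shrinking toward `lr` as training progresses.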