Write a Python class ExponentialLRScheduler that implements a learning rate scheduler based on the ExponentialLR strategy. Your class should have an __init__ method that initializes it with an initial_lr (float) and a gamma (float) parameter. It should also have a get_lr(self, epoch) method that returns the current learning rate for a given epoch (int). The learning rate should be multiplied by gamma every epoch, and the returned learning rate should be rounded to 4 decimal places. Use only standard Python.
Examples
Example 1:
Input:
scheduler = ExponentialLRScheduler(initial_lr=0.1, gamma=0.9)
print(f"{scheduler.get_lr(epoch=0):.4f}")
print(f"{scheduler.get_lr(epoch=1):.4f}")
print(f"{scheduler.get_lr(epoch=2):.4f}")
print(f"{scheduler.get_lr(epoch=3):.4f}")
Output:
0.1000
0.0900
0.0810
0.0729
Explanation: The initial learning rate is 0.1. At epoch 1, it decays by a factor of 0.9 to 0.09. At epoch 2, it decays again to 0.081, and so on, decaying by gamma every single epoch. All results are rounded to 4 decimal places.
Starter Code
class ExponentialLRScheduler:
    def __init__(self, initial_lr, gamma):
        # Initialize initial_lr and gamma
        pass

    def get_lr(self, epoch):
        # Calculate and return the learning rate for the given epoch
        pass
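One way to fill in the starter code is a minimal sketch that applies the closed form lr = initial_lr * gamma^epoch, which is equivalent to multiplying by gamma once per elapsed epoch:

```python
class ExponentialLRScheduler:
    def __init__(self, initial_lr, gamma):
        # Store the starting learning rate and the per-epoch decay factor
        self.initial_lr = initial_lr
        self.gamma = gamma

    def get_lr(self, epoch):
        # Decay the initial rate by gamma once per elapsed epoch;
        # epoch 0 returns initial_lr unchanged. Round to 4 decimal
        # places as the problem requires.
        return round(self.initial_lr * self.gamma ** epoch, 4)


scheduler = ExponentialLRScheduler(initial_lr=0.1, gamma=0.9)
print(f"{scheduler.get_lr(epoch=0):.4f}")  # 0.1000
print(f"{scheduler.get_lr(epoch=3):.4f}")  # 0.0729
```

Rounding inside get_lr (rather than only at print time) matches the stated requirement that the returned value itself is rounded to 4 decimal places.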