Implement the momentum optimizer update step function. Your function should take the current parameter value, gradient, and velocity as inputs, and return the updated parameter value and the new velocity. The function must handle both scalar and NumPy array inputs.
Examples
Example 1:
Input:
parameter = 1.0, grad = 0.1, velocity = 0.1
Output:
(0.909, 0.091)
Explanation: With the default learning_rate=0.01 and momentum=0.9, the new velocity is 0.9 * 0.1 + 0.01 * 0.1 = 0.091, and the new parameter is 1.0 - 0.091 = 0.909.
Starter Code
import numpy as np

def momentum_optimizer(parameter, grad, velocity, learning_rate=0.01, momentum=0.9):
    """
    Update parameters using the momentum optimizer.

    Uses momentum to accelerate learning in relevant directions and dampen oscillations.

    Args:
        parameter: Current parameter value
        grad: Current gradient
        velocity: Current velocity/momentum term
        learning_rate: Learning rate (default=0.01)
        momentum: Momentum coefficient (default=0.9)

    Returns:
        tuple: (updated_parameter, updated_velocity)
    """
    # Your code here
    return np.round(parameter, 5), np.round(velocity, 5)
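A minimal sketch of one possible solution, assuming the classical momentum formulation consistent with the example above: `v_new = momentum * v + learning_rate * grad`, then `theta_new = theta - v_new`. Converting inputs with `np.asarray` lets the same code serve both scalars and arrays:

```python
import numpy as np

def momentum_optimizer(parameter, grad, velocity, learning_rate=0.01, momentum=0.9):
    # Accept scalars or arrays uniformly by converting to NumPy arrays.
    parameter = np.asarray(parameter, dtype=float)
    grad = np.asarray(grad, dtype=float)
    velocity = np.asarray(velocity, dtype=float)

    # Classical momentum: accumulate a decayed running velocity,
    # then step the parameter against it.
    new_velocity = momentum * velocity + learning_rate * grad
    new_parameter = parameter - new_velocity

    return np.round(new_parameter, 5), np.round(new_velocity, 5)

# Reproduces the worked example:
# momentum_optimizer(1.0, 0.1, 0.1) -> (0.909, 0.091)
```

Because NumPy broadcasting handles element-wise arithmetic, the array case needs no extra code path.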