Implement Adam Optimization Algorithm

Medium
Machine Learning

Implement the Adam (Adaptive Moment Estimation) optimization algorithm in Python. Adam adapts the step size for each parameter individually by maintaining exponentially decaying estimates of the first and second moments of the gradient. Your task is to write a function adam_optimizer that updates the parameters of a given function using the Adam algorithm.
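Concretely, at each step t Adam computes the gradient g_t and applies the bias-corrected update from the original formulation (Kingma & Ba), with α the learning rate:

```
m_t = beta1 * m_{t-1} + (1 - beta1) * g_t          (first moment estimate)
v_t = beta2 * v_{t-1} + (1 - beta2) * g_t**2       (second moment estimate)
m_hat = m_t / (1 - beta1**t)                       (bias correction)
v_hat = v_t / (1 - beta2**t)
x_t = x_{t-1} - alpha * m_hat / (sqrt(v_hat) + epsilon)
```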

The function should take the following parameters:

  • f: The objective function to be optimized
  • grad: A function that computes the gradient of f
  • x0: Initial parameter values
  • learning_rate: The step size (default: 0.001)
  • beta1: Exponential decay rate for the first moment estimates (default: 0.9)
  • beta2: Exponential decay rate for the second moment estimates (default: 0.999)
  • epsilon: A small constant for numerical stability (default: 1e-8)
  • num_iterations: Number of iterations to run the optimizer (default: 10)

The function should return the optimized parameters.
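As a reference, a minimal sketch of the standard Adam update (assuming the defaults in the starter code, including num_iterations=10) could look like this:

```python
import numpy as np

def adam_optimizer(f, grad, x0, learning_rate=0.001, beta1=0.9,
                   beta2=0.999, epsilon=1e-8, num_iterations=10):
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment estimate
    v = np.zeros_like(x)  # second moment estimate
    for t in range(1, num_iterations + 1):  # t starts at 1 for bias correction
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g           # decayed mean of gradients
        v = beta2 * v + (1 - beta2) * g**2        # decayed mean of squared gradients
        m_hat = m / (1 - beta1**t)                # bias-corrected first moment
        v_hat = v / (1 - beta2**t)                # bias-corrected second moment
        x = x - learning_rate * m_hat / (np.sqrt(v_hat) + epsilon)
    return x
```

Note that because of the bias correction, early steps move each parameter by roughly the learning rate, regardless of the gradient's magnitude.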

Examples

Example 1:
Input:

    import numpy as np

    def objective_function(x):
        return x[0]**2 + x[1]**2

    def gradient(x):
        return np.array([2*x[0], 2*x[1]])

    x0 = np.array([1.0, 1.0])
    x_opt = adam_optimizer(objective_function, gradient, x0)
    print("Optimized parameters:", x_opt)

Output:

    Optimized parameters: [0.99000325 0.99000325]
Explanation: The Adam optimizer updates the parameters to minimize the objective function, here the sum of squares, whose minimum is at [0, 0]. Thanks to bias correction, each of the 10 iterations moves the parameters by roughly the learning rate (0.001), so they go from 1.0 to about 0.99.

Starter Code

import numpy as np

def adam_optimizer(f, grad, x0, learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8, num_iterations=10):
	# Your code here
	pass
The AI Interview - Master AI/ML Interviews