Implement a function to calculate the Mean Absolute Error (MAE) between two arrays of actual and predicted values. The MAE is a metric used to measure the average magnitude of errors in a set of predictions without considering their direction.
Your function should return the MAE as a float value.
Examples
Example 1:
Input:
y_true = np.array([3, -0.5, 2, 7]), y_pred = np.array([2.5, 0.0, 2, 8])
Output:
0.5
Explanation: The MAE is the mean of absolute differences: (|3-2.5| + |-0.5-0| + |2-2| + |7-8|) / 4 = (0.5 + 0.5 + 0 + 1) / 4 = 0.5
Starter Code
import numpy as np

def mae(y_true, y_pred):
    """
    Calculate Mean Absolute Error between two arrays.

    Parameters:
    y_true (numpy.ndarray): Array of true values
    y_pred (numpy.ndarray): Array of predicted values

    Returns:
    float: Mean Absolute Error
    """
    # Your code here
    pass
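One possible solution, sketched with NumPy's vectorized operations: subtract the arrays element-wise, take absolute values, and average. Wrapping the result in `float()` converts NumPy's scalar type to a plain Python float, as the problem requests.

```python
import numpy as np

def mae(y_true, y_pred):
    """Calculate Mean Absolute Error between two arrays."""
    # Element-wise absolute differences, then their mean
    return float(np.mean(np.abs(y_true - y_pred)))

# Example from the problem statement
y_true = np.array([3, -0.5, 2, 7])
y_pred = np.array([2.5, 0.0, 2, 8])
print(mae(y_true, y_pred))  # 0.5
```

Because the operations broadcast over whole arrays, this works for any pair of equal-shaped arrays without an explicit loop.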