Task: Compute Root Mean Square Error (RMSE)
In this task, you are required to implement a function rmse(y_true, y_pred) that calculates the Root Mean Square Error (RMSE) between the actual values and the predicted values. RMSE is a commonly used metric for evaluating the accuracy of regression models, providing insight into the standard deviation of residuals.
Your Task:
Implement the function rmse(y_true, y_pred) to:
- Calculate the RMSE between the arrays y_true and y_pred.
- Return the RMSE value rounded to three decimal places.
- Ensure the function handles edge cases such as:
- Mismatched array shapes.
- Empty arrays.
- Invalid input types.
The RMSE is defined as:
RMSE = sqrt( (1/n) * Σ_{i=1}^{n} (y_true,i − y_pred,i)^2 )

Where:
- n is the number of observations.
- y_true,i and y_pred,i are the actual and predicted values for the i-th observation.
Examples
Example 1:
Input:
y_true = np.array([3, -0.5, 2, 7])
y_pred = np.array([2.5, 0.0, 2, 8])
print(rmse(y_true, y_pred))

Output:
0.612

Explanation: The RMSE is calculated as sqrt((0.5^2 + 0.5^2 + 0^2 + 1^2) / 4) = 0.612
Starter Code
import numpy as np

def rmse(y_true, y_pred):
    # Write your code here
    return round(rmse_res, 3)
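One possible solution is sketched below. The starter code does not prescribe how the edge cases should be handled; raising ValueError for mismatched shapes, empty arrays, and non-numeric inputs is an assumption made here.

```python
import numpy as np

def rmse(y_true, y_pred):
    # Coerce inputs to float arrays; non-numeric inputs fail here.
    try:
        y_true = np.asarray(y_true, dtype=float)
        y_pred = np.asarray(y_pred, dtype=float)
    except (TypeError, ValueError):
        raise ValueError("Inputs must be array-like of numbers")
    # Edge cases: mismatched shapes and empty arrays.
    if y_true.shape != y_pred.shape:
        raise ValueError("y_true and y_pred must have the same shape")
    if y_true.size == 0:
        raise ValueError("Input arrays must not be empty")
    # RMSE = sqrt(mean of squared residuals).
    rmse_res = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return round(float(rmse_res), 3)

y_true = np.array([3, -0.5, 2, 7])
y_pred = np.array([2.5, 0.0, 2, 8])
print(rmse(y_true, y_pred))  # 0.612
```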