Calculate Root Mean Square Error (RMSE)

Easy
Machine Learning

Task: Compute Root Mean Square Error (RMSE)

In this task, you are required to implement a function rmse(y_true, y_pred) that calculates the Root Mean Square Error (RMSE) between the actual values and the predicted values. RMSE is a commonly used metric for evaluating the accuracy of regression models, providing insight into the standard deviation of residuals.

Your Task:

Implement the function rmse(y_true, y_pred) to:

  1. Calculate the RMSE between the arrays y_true and y_pred.
  2. Return the RMSE value rounded to three decimal places.
  3. Ensure the function handles edge cases such as:
    • Mismatched array shapes.
    • Empty arrays.
    • Invalid input types.

The RMSE is defined as:

$$\text{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left(y_{\text{true}, i} - y_{\text{pred}, i}\right)^2}$$

Where:

  • $n$ is the number of observations.
  • $y_{\text{true}, i}$ and $y_{\text{pred}, i}$ are the actual and predicted values for the $i$-th observation.
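As a quick sanity check, the definition above maps directly onto NumPy operations: square the residuals, take their mean, then the square root. This minimal sketch assumes two equal-length numeric 1-D arrays and performs no validation:

```python
import numpy as np

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Mean of squared residuals, then the square root
rmse_value = np.sqrt(np.mean((y_true - y_pred) ** 2))
```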

Examples

Example 1:
Input:
y_true = np.array([3, -0.5, 2, 7])
y_pred = np.array([2.5, 0.0, 2, 8])
print(rmse(y_true, y_pred))
Output: 0.612
Explanation: The RMSE is calculated as sqrt((0.5^2 + 0.5^2 + 0^2 + 1^2) / 4) ≈ 0.612.

Starter Code


import numpy as np

def rmse(y_true, y_pred):
    # Write your code here and store the result in rmse_res
    rmse_res = 0.0  # placeholder
    return round(rmse_res, 3)
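One possible reference implementation is sketched below. The task does not specify how edge cases should be reported, so this sketch assumes the convention of raising `ValueError` for mismatched shapes, empty arrays, and non-numeric inputs:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Square Error, rounded to three decimal places."""
    # Coerce inputs to float arrays; non-numeric input fails here
    try:
        y_true = np.asarray(y_true, dtype=float)
        y_pred = np.asarray(y_pred, dtype=float)
    except (TypeError, ValueError) as exc:
        raise ValueError("inputs must be numeric arrays") from exc
    # Edge cases: mismatched shapes and empty arrays
    if y_true.shape != y_pred.shape:
        raise ValueError("y_true and y_pred must have the same shape")
    if y_true.size == 0:
        raise ValueError("inputs must be non-empty")
    # sqrt of the mean squared residual, per the formula above
    rmse_res = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return round(float(rmse_res), 3)
```

With the inputs from Example 1, `rmse(np.array([3, -0.5, 2, 7]), np.array([2.5, 0.0, 2, 8]))` returns 0.612.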
The AI Interview - Master AI/ML Interviews