RMSE stands for Root Mean Square Error, a commonly used metric in machine learning for measuring the accuracy of a regression model. It quantifies the typical difference between the actual and predicted values of the target variable, and it is expressed in the same units as the target variable.
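Formally, for n observations with actual values y_i and predicted values ŷ_i, the standard definition (which the worked example below follows) is:

```latex
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}
```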
Here is an example of how to calculate RMSE:
Suppose we have a dataset that contains the weight and height of 10 individuals. We want to build a regression model that predicts the weight based on the height. Here is a sample dataset:
| Sr. No. | Height (inches) | Actual Weight (kg) | Predicted Weight (kg) | Difference (Actual − Predicted) | Squared Difference |
|---|---|---|---|---|---|
| 1 | 67 | 57 | 55 | 2 | 4 |
| 2 | 70 | 59 | 59 | 0 | 0 |
| 3 | 54 | 73 | 85 | -12 | 144 |
| 4 | 60 | 82 | 92 | -10 | 100 |
| 5 | 72 | 48 | 50 | -2 | 4 |
| 6 | 56 | 50 | 50 | 0 | 0 |
| 7 | 71 | 59 | 58 | 1 | 1 |
| 8 | 56 | 64 | 63 | 1 | 1 |
| 9 | 65 | 70 | 70 | 0 | 0 |
| 10 | 57 | 79 | 78 | 1 | 1 |
| | | | | Sum | 255 |
Average = Sum / Count = 255 / 10 = 25.5
RMSE = √Average = √25.5 ≈ 5.049752
This is the RMSE for our regression model. It means that, on average, the model's weight predictions deviate from the actual weights by roughly 5.05 kg.
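As a quick sanity check, the same calculation can be reproduced in a few lines of Python (a minimal sketch using NumPy; the two arrays are the Actual and Predicted columns from the table above):

```python
import numpy as np

# Actual and predicted weights (kg) from the table above
actual = np.array([57, 59, 73, 82, 48, 50, 59, 64, 70, 79])
predicted = np.array([55, 59, 85, 92, 50, 50, 58, 63, 70, 78])

squared_diff = (actual - predicted) ** 2  # per-row squared differences
mse = squared_diff.mean()                 # 255 / 10 = 25.5
rmse = np.sqrt(mse)                       # sqrt(25.5)

print(f"MSE  = {mse}")        # 25.5
print(f"RMSE = {rmse:.6f}")   # 5.049752
```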
Thus, RMSE measures the accuracy of a regression model by squaring the differences between the actual and predicted values of the target variable, averaging those squares, and taking the square root of that average. The lower the RMSE, the more accurate the model.
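In practice, you would rarely compute RMSE by hand. Here is a minimal sketch assuming scikit-learn is installed, using its mean_squared_error followed by a square root:

```python
import numpy as np
from sklearn.metrics import mean_squared_error

actual = [57, 59, 73, 82, 48, 50, 59, 64, 70, 79]
predicted = [55, 59, 85, 92, 50, 50, 58, 63, 70, 78]

rmse = np.sqrt(mean_squared_error(actual, predicted))
print(rmse)  # ~5.049752, matching the hand calculation
```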