RSS (Residual Sum of Squares)
<aside>
💡
RSS measures how far off a model’s predictions are from the actual values: it is the sum of the squared differences between the observed (actual) values and the predicted values.
</aside>
- It adds up the squares of all the differences between actual and predicted values.
- A smaller RSS means the model fits the data better.
$$
\begin{align*}
\text{RSS} &= \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
\end{align*}
$$
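The formula above can be sketched in a few lines of Python. This is a minimal illustration on a tiny made-up dataset (the values are hypothetical, not from the source):

```python
# Observed values y_i and model predictions ŷ_i (made-up example data)
y_actual = [3.0, 5.0, 7.0]
y_pred = [2.5, 5.5, 7.0]

# RSS: sum of squared differences between actual and predicted values
rss = sum((y - y_hat) ** 2 for y, y_hat in zip(y_actual, y_pred))
print(rss)  # (0.5)^2 + (-0.5)^2 + 0.0^2 = 0.5
```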
MSE (Mean Squared Error)
<aside>
💡
MSE is the average of the squared errors, which is the Residual Sum of Squares (RSS) divided by the number of observations.
</aside>
- RSS grows with the number of data points, so it’s not always the best way to compare models fit on datasets of different sizes.
- Instead, we take the average of RSS by dividing it by the number of observations.
- This gives the Mean Squared Error (MSE), which is easier to interpret and compare.
- A smaller MSE means predictions are closer to actual values.
$$
\begin{align*}
\text{MSE} &= \frac{1}{n} \sum_{i=1}^{n} \bigl(y_i - (w_0 + w_1x_i)\bigr)^2 \\
\text{MSE} &= \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2
\end{align*}
$$
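Continuing the same made-up dataset as a minimal sketch, MSE is just the RSS divided by the number of observations:

```python
# Made-up example data: observed values y_i and predictions ŷ_i
y_actual = [3.0, 5.0, 7.0]
y_pred = [2.5, 5.5, 7.0]

n = len(y_actual)
rss = sum((y - y_hat) ** 2 for y, y_hat in zip(y_actual, y_pred))
mse = rss / n  # average squared error: RSS / number of observations
print(mse)  # 0.5 / 3 ≈ 0.1667
```

Because MSE is an average, it stays comparable even when the datasets being compared have different numbers of points.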
RMSE (Root Mean Squared Error)
<aside>
💡
RMSE is simply the square root of the Mean Squared Error (MSE).
</aside>
- Taking the square root of the MSE brings the error back to the same units as the target variable, so it’s easier to interpret.
$$
\begin{align*}
\text{RMSE} &= \sqrt{\frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2}
\end{align*}
$$
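As a minimal sketch on the same made-up dataset, RMSE is one `math.sqrt` away from MSE:

```python
import math

# Made-up example data: observed values y_i and predictions ŷ_i
y_actual = [3.0, 5.0, 7.0]
y_pred = [2.5, 5.5, 7.0]

n = len(y_actual)
mse = sum((y - y_hat) ** 2 for y, y_hat in zip(y_actual, y_pred)) / n
rmse = math.sqrt(mse)  # error expressed in the same units as y
```

For example, if `y` is a house price in dollars, MSE is in squared dollars while RMSE is back in dollars, which is why RMSE is often the metric reported.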