RMSE vs. standard error

RMSE is the square root of the mean squared error.

Standard Error of Estimate (SEE) = square root of the sum of squared errors divided by n - k - 1.

So does RMSE = SEE?
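
To make the question concrete, here is a minimal Python sketch that computes both quantities for a single-regressor fit, assuming the common convention RMSE = sqrt(SSE/n). The data are invented purely for illustration.

    import numpy as np

    # Made-up data for a one-regressor (k = 1) example
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])

    # Fit y = b0 + b1*x by ordinary least squares
    b1, b0 = np.polyfit(x, y, 1)
    residuals = y - (b0 + b1 * x)

    n, k = len(y), 1
    sse = np.sum(residuals ** 2)      # sum of squared errors

    rmse = np.sqrt(sse / n)           # divides by n (assumed convention)
    see = np.sqrt(sse / (n - k - 1))  # divides by n - k - 1
    print(rmse, see)                  # close, but not equal, for small n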

A good way to confuse people: throw in a quant question like this and watch the blank faces of the candidates.

By the way, I'd think the answer to your question is NO.

SEE = standard deviation of the error terms
SEE = sqrt(variance of the errors)
SEE = sqrt(SSE / (n - k - 1))

whereas MSE = SSE / (n - k - 1) <-- there is no square root here.
SSE = sum of squared errors, also called the residual sum of squares.

SSE / (n - k - 1) is not equal to SEE; you still need the square root.
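
To see the chain SSE -> MSE -> SEE in one place, here is a short Python sketch. The residuals are hypothetical numbers, used only to show the arithmetic.

    import numpy as np

    residuals = np.array([0.3, -0.1, 0.2, -0.4, 0.1, -0.2])  # made-up errors
    n, k = len(residuals), 1

    sse = np.sum(residuals ** 2)  # sum of squared errors
    mse = sse / (n - k - 1)       # mean squared error -- no square root yet
    see = np.sqrt(mse)            # SEE is the square root of MSE

    print(mse, see)               # 0.0875 vs. ~0.2958: MSE alone is not SEE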

By the way, what is RMSE? I'm seeing it for the first time.

They are not the same thing, but they are closely related. RMSE is based on the MEAN of the squared errors, not their total; it is a measure of the average error.

RMSE is sqrt(MSE). Same thing as far as I can tell.

It's a tool used to gauge in-sample and out-of-sample forecasting accuracy. A low RMSE relative to another model = better forecasting.
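
For instance, here is a minimal sketch of that comparison in Python, assuming RMSE = sqrt(mean of squared forecast errors). The two models and all numbers are invented for illustration.

    import numpy as np

    actual  = np.array([2.0, 2.5, 3.1, 2.8])  # held-out observations
    model_a = np.array([2.1, 2.4, 3.0, 2.9])  # hypothetical model A forecasts
    model_b = np.array([2.5, 2.0, 3.6, 2.2])  # hypothetical model B forecasts

    def rmse(forecast, observed):
        return np.sqrt(np.mean((forecast - observed) ** 2))

    # The lower RMSE wins: model A forecasts better on this sample
    print("A:", rmse(model_a, actual))  # ~0.10
    print("B:", rmse(model_b, actual))  # ~0.53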

Same goes for SEE.

So it boils down to whether MSE = SSE / n or MSE = SSE / (n - k - 1). In an ANOVA table you will find the mean square for the error term, and its associated degrees of freedom is n - k - 1.

I think the denominator for MSE is n, the denominator in the SEE is n - k - 1, and that's my story.
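
A last sketch of the two conventions being debated, with made-up regression numbers: if MSE = SSE / (n - k - 1), as on an ANOVA table, then sqrt(MSE) equals SEE; if MSE = SSE / n, it does not.

    import numpy as np

    sse, n, k = 12.0, 30, 2           # hypothetical regression output

    mse_anova = sse / (n - k - 1)     # ANOVA-table mean square error
    mse_plain = sse / n               # "plain average" convention

    see = np.sqrt(sse / (n - k - 1))

    print(np.isclose(np.sqrt(mse_anova), see))  # True
    print(np.isclose(np.sqrt(mse_plain), see))  # False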
