Revisiting Who's The Best. Should we use RMSE, MAE or Accuracy?

A few years back, I authored a blog post in which I deftly presented the case for the superiority of MAE over Accuracy for identifying the most talented forecasters.

In that blog, I blithely batted away the RMSE metric, on the basis that it was likely too susceptible to blowout results in a relatively short season of 200 games or so to be usefully discriminating.
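To make that concern concrete, here's a tiny illustration of the effect (the error values are invented purely for the example, not drawn from any real season): a single blowout-sized miss inflates RMSE proportionally far more than it inflates MAE, because the error is squared before averaging.

```python
import numpy as np

# Illustrative only: hypothetical margin-prediction errors (in points).
errors = np.array([5, 8, 12, 6, 9, 7, 10, 4])       # typical-sized misses
errors_with_blowout = np.append(errors, 60)          # add one blowout-sized miss

def mae(e):
    return np.mean(np.abs(e))

def rmse(e):
    return np.sqrt(np.mean(np.square(e)))

print(f"MAE:  {mae(errors):5.2f} -> {mae(errors_with_blowout):5.2f}")
print(f"RMSE: {rmse(errors):5.2f} -> {rmse(errors_with_blowout):5.2f}")
```

In this toy example the single outlier lifts MAE by a modest amount but more than doubles RMSE, which is exactly the sensitivity I was worried about in a short season.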

But, we all grow older and, hopefully, most of us also grow wiser, so it’s only reasonable that I, in hindsight, test that hypothesis.

So, using the same methodology from the linked blog post, what do we find if we include RMSE as a metric in a “Typical” season?
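For readers who want the flavour of that approach, the sketch below is my own minimal reconstruction of the idea rather than the original code: a field of forecasters whose per-game margin errors are drawn from Normal distributions with slightly different spreads, where the forecaster with the smallest spread is the genuinely best one. For each simulated 200-game season we check which metric puts that forecaster in 1st place. All of the parameters (number of forecasters, error spreads, number of simulated seasons) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters only; not the values used in the linked post.
N_SEASONS = 10_000
N_GAMES = 200                                    # "a relatively short season of 200 games or so"
SIGMAS = np.array([28.0, 29.0, 30.0, 31.0])      # forecaster error spreads; index 0 is the best

best_wins_mae = 0
best_wins_rmse = 0

for _ in range(N_SEASONS):
    # Each row holds one forecaster's margin-prediction errors for the season.
    errors = rng.normal(0.0, SIGMAS[:, None], size=(len(SIGMAS), N_GAMES))

    season_mae = np.mean(np.abs(errors), axis=1)
    season_rmse = np.sqrt(np.mean(errors ** 2, axis=1))

    best_wins_mae += (np.argmin(season_mae) == 0)
    best_wins_rmse += (np.argmin(season_rmse) == 0)

print(f"Best forecaster tops MAE  in {100 * best_wins_mae / N_SEASONS:.1f}% of seasons")
print(f"Best forecaster tops RMSE in {100 * best_wins_rmse / N_SEASONS:.1f}% of seasons")
```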

We find that RMSE is clearly superior to MAE: the best forecaster finishes 1st about 87% of the time when ranked on RMSE, compared with about 81% of the time when ranked on MAE.

And, what about if we, instead, explore a “Surprising” season?

Again we find that the best forecaster prevails more often under RMSE, finishing 1st about 82% of the time versus 75% of the time if we use MAE.

So, the conclusion? RMSE does a slightly but consistently better job of identifying the better forecasters but, I would contend, at the price of being a far less intuitive metric.

On balance, I’d probably still go with MAE over RMSE, but recognise that there is a price paid in doing so.