2016 - Team Ratings After Round 10

This week's results provide some interesting examples of the contrasting philosophies of the MoS Team Rating Systems and how they manifest in the interpretation of results.

In particular, the treatment of Geelong after their unexpected loss to Carlton is instructive. Both MARS and ChiPS punished the Cats quite heavily, knocking, respectively, 3.6 and 3.5 Rating Points (RPs) off their Rating, slipping them down one place into 5th on MARS and 4th on ChiPS. Both RP deductions were among the largest of the round.

On MARS, that move for the Cats was one of nine ranking changes, only three of which were by multiple spots, the Dogs and Adelaide rising two places into 6th and 4th respectively, and the Roos falling two places into 7th.

That leaves MARS with a Top 5 of West Coast, Sydney, Hawthorn, Adelaide, and Geelong, all of which it rates over 1,020.

ChiPS, in total, moved only four teams this week and none by more than a single spot. Its Top 5 is now Sydney, West Coast, GWS, Geelong, and the Kangaroos, and these are all rated 1,014.5 or higher. In ChiPS' case we can interpret this as implying that these five teams are about 2.5 goals or more better than an average team playing on a neutral venue.
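That goals interpretation can be sketched in a couple of lines. The conversion assumed here is my own reading, not something defined in the post: that ChiPS Rating Points sit on a points-of-score scale centred on 1,000, with six points to a goal.

```python
# Hypothetical helper: expected margin, in goals, of a rated team over an
# average (1,000-Rated) team at a neutral venue, assuming ChiPS Rating
# Points are on a points-of-score scale and a goal is worth 6 points.
def rating_margin_in_goals(rating, average=1000.0, points_per_goal=6.0):
    return (rating - average) / points_per_goal

print(round(rating_margin_in_goals(1014.5), 1))  # 2.4, i.e. about 2.5 goals
```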

Curiously, GWS fell one spot on MARS this week into 8th yet rose one spot on ChiPS into 3rd, leaving the Giants now ranked 5 places differently by the two Systems, easily the largest discrepancy for any of the teams. The only other teams for which the difference exceeds one spot are:

  • Western Bulldogs: 6th on MARS and 8th on ChiPS
  • Adelaide: 4th on MARS and 7th on ChiPS
  • Hawthorn: 3rd on MARS and 6th on ChiPS
  • Kangaroos: 7th on MARS and 5th on ChiPS
  • Brisbane Lions: 18th on MARS and 16th on ChiPS
  • Gold Coast: 16th on MARS and 18th on ChiPS

The correlation between raw MARS and ChiPS Ratings now stands at +0.976, the highest it's been since Round 3.

As first noted last week, a clear demarcation exists in both Systems' Ratings. For ChiPS it comes between the Dogs in 8th, Rated 1,012.6, and the Tigers in 9th, Rated 999.6 - a difference of 13 RPs.

For MARS it comes between Port Adelaide in 9th, Rated 1,009.3, and the Tigers in 10th, Rated 998.5 - a difference of almost 11 RPs. Still, I suppose we can't be too concerned about the competition forming into the hopefuls and the hopeless given the Carlton and Collingwood wins over Geelong in recent weeks.

MoSSBODS RATINGS

MoSSBODS, you'll recall, takes notice only of the Scoring Shots recorded in a contest, on which basis Geelong narrowly prevailed over Carlton 25-24 on Sunday (though it did score fewer and concede more Scoring Shots than MoSSBODS expected, so its Ratings did decline).

The Cats were the only team to lose this week while generating more Scoring Shots than their opponents, though a few of the other results are interpreted quite differently if you apply a Scoring Shot lens:

  • Sydney defeats the Kangaroos by 26 points, but only by 1 Scoring Shot
  • The Western Bulldogs defeat Collingwood by 21 points, but only by 1 Scoring Shot
  • Adelaide defeat GWS by 22 points, but by 12 Scoring Shots
  • West Coast defeat Gold Coast by 72 points, but also by 12 Scoring Shots
  • Hawthorn defeat Brisbane Lions by 48 points, but only by 8 Scoring Shots
  • Port Adelaide defeat Melbourne by 45 points, but only by 5 Scoring Shots
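For readers unfamiliar with the scoring system: a goal is worth six points and a behind one, and each goal or behind counts as a single Scoring Shot. A small sketch with made-up scorelines (not the actual Round 10 results) shows how the two margins can diverge:

```python
# AFL scoring: score = 6 * goals + behinds; scoring shots = goals + behinds
def score(goals, behinds):
    return 6 * goals + behinds

def scoring_shots(goals, behinds):
    return goals + behinds

# Hypothetical scoreline: an accurate team kicking 17.8 (110) beats a
# wasteful team kicking 12.12 (84) by 26 points, yet by only 1 Scoring Shot
print(score(17, 8) - score(12, 12))                  # 26 points
print(scoring_shots(17, 8) - scoring_shots(12, 12))  # 1 scoring shot
```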

MoSSBODS' conceit is that, because Conversion Rates are broadly random, Scoring Shots provide a better indicator of a team's ability than does its Score in points. That claim is supported empirically, though I am currently developing an Offence-Defence Team Rating System that will use Scores rather than Scoring Shots, with a view to having a direct competitor to MoSSBODS. The working name for this Rating System is MoSScoreBODS, for obvious reasons.

One significant implication of MoSSBODS' philosophy is that the Cats suffer only a little for their slightly-worse-than-expected performance against the Blues, and therefore retain top Combined Rating. Their actual Rating does, however, fall by 0.4 SS on Offence, 0.9 SS on Defence, and 1.3 SS on Combined Rating.
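Those three movements are consistent with the Combined Rating being simply the sum of the Offensive and Defensive Ratings - my reading of how MoSSBODS combines them, and one that West Coast's figures further down also fit.

```python
# Geelong's quoted Rating changes this week, in Scoring Shots (SS)
offence_change = -0.4
defence_change = -0.9

# If Combined Rating = Offensive Rating + Defensive Rating, the Combined
# movement is just the sum of the two component movements
combined_change = offence_change + defence_change
print(round(combined_change, 1))  # -1.3
```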

GWS' loss, in comparison, is exacerbated by MoSSBODS' taking a Scoring Shot-based view, and their Combined Rating falls slightly further than does the Cats' - by 1.4 SS. The Giants nonetheless hold onto 2nd place overall.

A little further down the table, the Dogs, despite dropping 0.6 SS on Offence and 0.4 SS Combined, jump one place into 3rd, relegating a more-rapidly declining West Coast, who shed 0.4 SS on Offence, 0.4 SS on Defence, and 0.8 SS Combined, into 4th.

Two other swaps occurred on Combined Ratings, Adelaide and Hawthorn swapping 6th and 7th, and Carlton and Fremantle swapping 14th and 15th.

Those changes leave MoSSBODS disagreeing with both ChiPS and MARS most about:

  • Geelong: ranked 4th on ChiPS, 5th on MARS and 1st on MoSSBODS
  • Western Bulldogs: ranked 8th on ChiPS, 6th on MARS and 3rd on MoSSBODS
  • Sydney: ranked 1st on ChiPS, 2nd on MARS and 5th on MoSSBODS

We can put the new team Ratings in an historical context in the now-usual fashion, by plotting each team's Ratings against the backdrop of history - the Ratings of every team, from all previous years, after Round 10 of their respective season.

After Round 10s across history, the lowest ever Combined Rating for a team that subsequently finished in the Top 2 was the -1.84 of the 1913 St Kilda team, who finished as Runners Up that year despite, curiously, having defeated the eventual Premiers, Fitzroy, in the previous week's Preliminary Final. Under the Finals System that prevailed from 1907 to 1930, Fitzroy, as Minor Premiers, were entitled to a "challenge" final in Week 4 because they had not secured the Premiership in the previous week. St Kilda, though very much the crowd favourites in the Grand Final, were unable to replicate their efforts of the previous week.

The lowest Combined Rating after Round 10 for a subsequent Premier was -0.46, recorded by Carlton in 1945. At the end of Round 10 their record was 4 wins and 6 losses with a percentage of 88. They went on to finish the home and away season in 4th place, the last ladder position to qualify for the Finals, and in successive weeks defeated the teams finishing 3rd (North Melbourne), 2nd (Collingwood) and 1st (South Melbourne).

As I mentioned last week, the cutoffs on the chart above reflect the proportion of ultimate Grand Finalists attaining a particular Combined Rating or higher at the current point in the season. They can't be interpreted as assessments of the likelihood of a team's making the Grand Final given a particular Combined Rating because they ignore those teams with the same or higher Combined Ratings that did not subsequently finish in the top 2 at season's end.

To remedy that, for this week I have attempted to very simply estimate the likelihood of a Grand Final appearance for each of the 18 teams by:

  1. Fitting a binary logit model to the Round 10 Combined Rating of all 1,440 teams that played in a Round 10 in any of the seasons between 1897 and 2015. The target variable is whether or not they finished as Premier or Runner Up.
  2. Using this model to provide estimated probabilities of finishing in the Top 2 for all 18 of the current teams.
  3. Normalising these estimated probabilities (by dividing each by half their sum) to ensure that they sum to 2.
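The three steps above can be sketched as follows. This is a minimal stand-alone illustration rather than the actual MoS code: the historical Ratings and Top 2 outcomes are randomly generated stand-ins for the real 1,440 team-seasons, and the logit is fit by plain gradient ascent on the log-likelihood rather than by a statistics package.

```python
import math
import random

random.seed(1)

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

# Step 1: fit a binary logit of "finished Top 2" on Round 10 Combined
# Rating. The real inputs are the 1,440 historical team-seasons; the
# ratings and outcomes below are random stand-ins so the sketch runs.
hist_ratings = [random.gauss(0, 3) for _ in range(1440)]
finished_top2 = [1 if random.random() < logistic(x - 4) else 0 for x in hist_ratings]

a, b = 0.0, 0.0  # intercept and slope, fit by plain gradient ascent
n = len(hist_ratings)
for _ in range(500):
    grad_a = grad_b = 0.0
    for x, y in zip(hist_ratings, finished_top2):
        p = logistic(a + b * x)
        grad_a += y - p
        grad_b += (y - p) * x
    a += 0.5 * grad_a / n
    b += 0.5 * grad_b / n

# Step 2: estimated Top 2 probabilities for the 18 current teams
# (again using made-up current Ratings)
current_ratings = [random.gauss(0, 3) for _ in range(18)]
raw_probs = [logistic(a + b * x) for x in current_ratings]

# Step 3: normalise by half the sum so the probabilities total exactly 2,
# reflecting that exactly two teams contest the Grand Final
norm_probs = [p / (sum(raw_probs) / 2) for p in raw_probs]
print(round(sum(norm_probs), 6))  # 2.0
```

The normalisation in Step 3 is what ties the estimates to the structure of the problem: whatever the raw model says, exactly two Grand Final berths are on offer across the eighteen teams.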

Here's the output of that process:

It's important to note that these estimates do not account for the remaining schedules of any of the teams, but I think they provide an interesting, historically-based MoSSBODS view of relative chances. Note that, with 2 of 18 teams to make the Grand Final, every team's naive expectation of a GF appearance is 1/9, or about 11%. Currently, eight teams are above that threshold and ten are below it.

Compared to current TAB prices for making the Grand Final, these estimates are most notably different for teams with more remote chances, especially Port Adelaide ($34), Collingwood ($67), Richmond ($67), Carlton ($126), Melbourne ($251) and St Kilda ($501). They also, however, suggest that there's value in the prices for GWS ($3.75), the Western Bulldogs ($5.50), and Adelaide ($7.50). I'd be extremely cautious before acting on these very simplistic estimates, however.

This next chart, an animated GIF as it happens, allows us to track the Rating changes of all 18 teams across the season so far, here too against the backdrop of history.

 

I find it especially interesting to watch the march of the dots denoting eventual Top 2 finishers up and to the right of the screen. It's as if some of the current and past teams are getting washed off the back of a wave that's carrying other teams to the Finals. Too much caffeine ...

Lastly, here's a static chart showing each team's before and after Round 10 Ratings.
 

This week sees the Offensive and Defensive Ratings of GWS, Brisbane Lions, Fremantle, Geelong and West Coast declining, and those of their opponents, Adelaide, Hawthorn, St Kilda, Carlton and the Gold Coast, increasing.