The Quality of the 2014 Finalists

Based on end of home-and-away season MARS Ratings, the 2014 Finalists are, as a group, slightly inferior to their 2013 counterparts (though, oddly enough, the opposite would have been true had the Dons not been demoted to 9th on the 2013 ladder).

The Top 4 teams on the competition ladder are Rated on average about 3 Rating Points (RPs) lower this year compared to last, while the teams in ladder positions 5 to 8 are Rated on average almost 2 RPs higher. Across all 8 teams that translates into this year's crop Rating about 0.6 RPs lower per team.
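
As a quick check on that arithmetic, here's a minimal sketch using the approximate changes quoted above (the exact MARS figures would differ slightly):

```python
# Approximate year-on-year Rating changes quoted above (not exact MARS values)
top4_change = -3.0    # Top 4 teams: about 3 RPs lower per team
next4_change = +1.8   # ladder positions 5 to 8: almost 2 RPs higher per team

# Average change across all 8 Finalists (4 teams in each group)
overall_change = (4 * top4_change + 4 * next4_change) / 8
print(round(overall_change, 1))  # -0.6 RPs per team
```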

The 2014 crop (and that of 2013) is more substantially inferior to the Finalists from 2012 and 2011, having an average Rating about 6 RPs lower than the former and about 2.5 RPs lower than the latter. What both 2012 and 2011 had that this season lacks is a truly standout team with a 1,060+ Rating.

(Note that I've shown Essendon in 7th place in 2013 in the table that follows and also treated them as finishing 7th in the later analyses. Also, I've pro-rated Ratings for the years 2012 to 2014 to lift the all-team average Rating to 1,000. Because formal MARS Ratings have GWS entering the competition with a sub-1,000 Rating in 2012, failing to make that adjustment results in a small amount of Rating deflation and disadvantages teams in the most recent three seasons.)
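
The pro-rating step isn't spelled out in detail above; one natural reading is a multiplicative rescaling that lifts the all-team average to 1,000, as in this sketch (the function name and the illustrative Ratings are mine, not published MARS values):

```python
def prorate_ratings(ratings, target_mean=1000.0):
    """Rescale Ratings multiplicatively so the all-team average equals target_mean.

    A plausible reading of the pro-rating described above; the published
    adjustment may differ in detail.
    """
    factor = target_mean * len(ratings) / sum(ratings)
    return [r * factor for r in ratings]

# Illustrative only: a sub-1,000 GWS Rating drags the all-team average below 1,000
ratings = [1045, 1030, 1010, 1000, 985, 900]
adjusted = prorate_ratings(ratings)
print(round(sum(adjusted) / len(adjusted), 6))  # 1000.0
```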

At the top of this year's ladder, the Sydney of 2014 is about a 7 RP better side than the 2013 version was at the same point of the season, while the Hawks are about 2 RPs weaker, the Cats about 21 RPs weaker, and the Dockers about 4.5 RPs weaker. As a rough guide, 1 RP is worth about 0.75 points on the field.
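
Using that rough conversion, those year-on-year Rating changes translate into on-field margins roughly as follows (a sketch; the changes are the approximate figures just quoted):

```python
POINTS_PER_RP = 0.75  # rough on-field value of 1 Rating Point, as quoted above

rating_changes = {"Sydney": +7.0, "Hawthorn": -2.0, "Geelong": -21.0, "Fremantle": -4.5}

for team, change in rating_changes.items():
    print(f"{team}: about {change * POINTS_PER_RP:+.1f} points per game")
# e.g. Sydney about +5.2 points, Geelong about -15.8 points
```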

West Coast and Adelaide this year join the Roos and Crows of 2012, and the Saints and Blues of 2011, as teams finishing with a Rating better than 1,010 but failing to make the Finals. This phenomenon has emerged only since the competition expanded to 17 and then 18 teams, and might legitimately be attributed in part to the dilution in team quality that expansion brought, which effectively made more Rating Points available to the stronger teams.

One positive sign from this year, however, is the reduction in the overall spread of Ratings, with this year's weakest team, the Saints, almost 40 RPs superior to the weakest team of 2013, GWS. The stripe chart that follows makes this point quite well visually and also depicts the overall compression of Ratings that has taken place over the past couple of seasons.

The spread we saw this year is more similar to the spreads we enjoyed in seasons prior to 2011, but it still has a less egalitarian flavour than those earlier years: the average Rating of the Top 8 teams remains more than 50 RPs above that of the remaining 8, 9 or 10 teams.
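
Both quantities, the overall spread and the gap between the Top 8's average Rating and the rest's, are straightforward to compute from a season's end-of-season Ratings; here's a minimal sketch (the function and list representation are my own):

```python
def spread_and_top8_gap(ratings, finals_spots=8):
    """Return the overall Rating spread and the difference between the
    average Rating of the Top 8 and that of the remaining teams."""
    ordered = sorted(ratings, reverse=True)
    spread = ordered[0] - ordered[-1]
    top8 = ordered[:finals_spots]
    rest = ordered[finals_spots:]  # the remaining 8, 9 or 10 teams, by season
    return spread, sum(top8) / len(top8) - sum(rest) / len(rest)
```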

Looking at the MARS Rating necessary to secure different ladder positions reveals some natural breaks in what's required to finish where.

Those natural thresholds are as follows (a lookup sketch follows the list):

  • For 1st place, a Rating of around 1,045 is required
  • For 2nd, around 1,035 to 1,040
  • For 3rd or 4th, around 1,025
  • For 5th or 6th, around 1,015
  • For 7th or 8th, around 1,005
  • For 9th or 10th, around 1,000
  • For 11th to 13th, 985 to 995
  • For any lower place, Ratings below about 975 prevail
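
For reference, those thresholds collapse into a simple lookup, sketched below (the list above leaves a small gap between about 975 and 985, which this sketch assigns to the lowest band):

```python
def likely_ladder_band(rating):
    """Map an end of home-and-away season MARS Rating to the ladder
    positions it has typically secured, per the thresholds above."""
    bands = [
        (1045, "1st"),
        (1035, "2nd"),           # around 1,035 to 1,040
        (1025, "3rd to 4th"),
        (1015, "5th to 6th"),
        (1005, "7th to 8th"),
        (1000, "9th to 10th"),
        (985,  "11th to 13th"),  # 985 to 995
    ]
    for cutoff, band in bands:
        if rating >= cutoff:
            return band
    return "14th or lower"       # Ratings below about 975 prevail here

print(likely_ladder_band(1030))  # 3rd to 4th
```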

That said, there has been considerable variability in the MARS Ratings of teams finishing in any given ladder position, with differences of 25 to 45 RPs observed for all but four ladder positions (ie 13th and 17th, where the range is smaller, and 12th and 18th, where the range is larger).

To finish, here's a chart with twin timelines for each team reflecting its end of home-and-away season MARS Rating and its ladder position.

The broad similarity of each team's Rating trajectory to its ladder position trajectory is striking, though the Ratings trajectories are somewhat squashed by the need to accommodate the lowly GWS and Gold Coast Ratings in recent seasons.