An Analysis of Strength of Schedule for the 2018 AFL Season

For the past four years now I've used essentially the same methodology to analyse the AFL Fixture for the upcoming season (see this post from two years ago, and, probably more relevantly, this one from the year before if you'd like the details).

This year, the only change I'll be making is to use the MoSHBODS Team Rating System (rather than MoSSBODS) to provide the estimates of team ability and venue effects. Last season, MoSHBODS performed slightly better than MoSSBODS in predicting game margins, and it has a simpler interpretation in that Rating Points map to expected points scored on a one-to-one basis.

The 2018 AFL Fixture, released in late October (in its customary impossible-to-import-directly-into-Excel format), has all 18 teams playing 22 of a possible 34 games, each missing 6 of the home and 6 of the away clashes that an all-plays-all full schedule would entail.

There is, again, a bye week for every team and, as was the case last year, these have been accommodated by playing only 6 games in each of two rounds (Rounds 13 and 14), 7 games in one round (Round 12), and 8 games in another (Round 10).

In determining the 108 games to be excluded, the League has again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 198 home-and-away games, using ladder positions after the 2017 Finals series as the measure of that ability.

This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 42 of the 72 (or about 58%) possible pairings are included in the schedule. That's the same number as we had in the 2017 Fixture.

By contrast, 44 of the 60 (or about 73%) possible pairings pitting Top 6 against Top 6 or Middle 6 against Middle 6 teams are included, while 46 of the 72 (or about 64%) possible pairings between the Top 6 and the Middle 6 are included. That's two fewer and two more, respectively, than last year.

As a consequence, one aspect of the 2018 Fixture that is different from last year is that there are more Top 6 v Middle 6 clashes than there are Top 6 v Top 6 and Middle 6 v Middle 6 clashes this year.

You can see the results for the 2017 Fixture in the table at right where you'll note that, within each tier, at least as many games (and usually more) were played between teams from the same tier as opposed to teams from another tier.

Barring an exceptionally fortuitous and/or unlikely set of circumstances, excluding games, however it's done, imbalances the schedule in that the combined strength of the opponents faced by any one team across the entire home-and-away season will differ, possibly materially, from that of every other team. At face value, the AFL's methodology for trimming the draw seems likely to exacerbate that imbalance (deliberately so) especially for the strongest and the weakest teams because it is designed to have teams playing more games against others of roughly similar ability.

The practical effect though is quite small, especially for the 2018 Fixture. Richmond, for example, play 7 games against Top 6 opponents this season, 8 against Middle 6 opponents, and 7 against Bottom 6 opponents. By comparison, the numbers for the Brisbane Lions are 7, 7 and 8 respectively.

Regardless, in reality, the actual effect of the AFL's schedule truncation on the variability of team schedule strength depends on the degree to which last year's final ladder positions reflect the true underlying abilities of the teams, the spread of ability within each "third" of the competition, and the relative magnitude of venue effects in enhancing or depressing these abilities. Those are precisely the things, of course, that the MoSHBODS Team Rating System is designed to estimate.

This year we'll use MoSHBODS' opinions to answer the following questions about the schedule:

  1. How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?

  2. How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-round competition?

STRENGTH OF SCHEDULE

The first thing we need in order to estimate a team's schedule strength is a measure of their opponents' underlying abilities. For this purpose we'll use MoSHBODS 2018 Round 1 Team Ratings, which are set by taking 70% of the final 2017 Ratings, the regression towards zero reflecting the average historical shrinkage in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below.
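As a minimal sketch of that Round 1 calculation before we get to the table (the final Rating used here is made up for illustration, not any team's actual figure):

    # Round 1 Ratings are 70% of the previous season's final Ratings, which
    # shrinks every team 30% of the way back towards the all-team average of zero
    SHRINKAGE = 0.7

    def round1_rating(final_rating_previous_season):
        return SHRINKAGE * final_rating_previous_season

    print(round1_rating(12.0))  # a hypothetical +12.0 finisher would start the new season at +8.4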

This year sees a few teams ranked more than a couple of spots differently by MoSHBODS compared to their official final ladder position. Port Adelaide, for example, will start the season as the 4th-highest rated MoSHBODS team despite finishing 7th on the final ladder, while West Coast find themselves ranked only 11th by MoSHBODS despite finishing 6th on the final ladder.

In the context of the AFL's competition "thirds", four teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:

  • Port Adelaide: Middle 6 based on Ladder / Top 6 based on MoSHBODS

  • Collingwood: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS

  • West Coast: Top 6 based on Ladder / Middle 6 based on MoSHBODS

  • Melbourne: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS

The average and range of the Combined Ratings of teams from each of the AFL thirds is as follows:

  • Top 6: Ave +10.8 Points / Range 19.3 Points

  • Middle 6: Ave +2.8 Points / Range 11.7 Points

  • Bottom 6: Ave -13.6 Points / Range 27.5 Points

So, ignoring Venue Effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is just over 24 points (+10.8 less -13.6, or about 24.4 points). Also, the spread of Ratings is much greater in the Bottom 6 than in either the Top 6 or the Middle 6. It's more important, then, exactly who you play from the Bottom 6 than who you play from the other 6s. Not all thirds are created equal.

MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for a given team.

The current Venue Performance Values are summarised in the table below for all of the venues being used sometime during 2018. Note that teams need to have played a minimum number of games at a venue (four in the MoSHBODS System) before their Venue Performance Value is altered from zero (shown as dashes in the table below to improve readability).

Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a +2.8 Points better team than their underlying +8.3 Points Rating when playing at Docklands.
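In code terms the adjustment is just an addition; the Geelong figures below are those quoted above, and the function name is mine rather than anything from MoSHBODS:

    # A team's effective Rating at a venue is its underlying Combined Rating
    # plus its Venue Performance Value for that venue
    def effective_rating(combined_rating, venue_performance_value):
        return combined_rating + venue_performance_value

    print(effective_rating(8.3, 2.8))  # Geelong at Docklands: effectively an 11.1 Point team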

Because of the manner in which they are calculated, these Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, St Kilda are about a 14 point worse team, Hawthorn a 10 point worse team, and the Western Bulldogs also a 10 point worse team.

After performing this calculation for all 22 games for every team, we arrive at the Strength of Schedule calculations below, within which larger positive values represent more difficult schedules.
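A minimal sketch of that aggregation, assuming each fixture is recorded as a pair of the opponent faced and the venue used (the data structures and function name here are mine, not MoSHBODS'):

    # Strength of Schedule for one team: sum, across its 22 fixtures, of each
    # opponent's Combined Rating plus that opponent's Venue Performance Value at
    # the scheduled venue. Venues where an opponent has no history default to zero.
    def strength_of_schedule(fixtures, ratings, venue_values):
        total = 0.0
        for opponent, venue in fixtures:
            total += ratings[opponent] + venue_values.get((opponent, venue), 0.0)
        return total

    # Illustrative call: the Geelong figures are those quoted above, the rest are made up
    ratings = {"Geelong": 8.3, "Essendon": 1.5}
    venue_values = {("Geelong", "Docklands"): 2.8}
    fixtures = [("Geelong", "Docklands"), ("Essendon", "MCG")]
    print(strength_of_schedule(fixtures, ratings, venue_values))  # 12.6

    # The 'ability only' variant discussed later simply drops the venue term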

In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from the venue (from the opponents' perspective only). We would generally expect the Aggregate Opponent Venue Performance figure to be negative for a team, since their opponents are likely to perform less well away from home than at home.

We don't see that, however, for West Coast and Fremantle, because they are both playing their home games this year at the previously unused Perth Stadium. Because no team has a record at that venue, all Venue Performance Values are still at zero. To the extent that travel is generally detrimental to teams' performances, this might overstate the effective strength of the opponents faced by these two teams.

That said, even a notional negative 40 point allocation to West Coast - roughly the figures for Adelaide, Gold Coast and Sydney - would still have them with a home draw in the top half in terms of difficulty. 

Unadjusted, though, and based on their home draws, the five teams with the most difficult draws are:

  1. Essendon (+46.9)

  2. West Coast (+25.4)

  3. St Kilda (+13.9)

  4. Melbourne (+11.3)

  5. Hawthorn (+3.5)

Those with the easiest home draws are:

  1. Brisbane Lions (-86.3)

  2. Geelong (-85.1)

  3. Gold Coast (-61.2)

  4. Sydney (-55.7)

  5. Western Bulldogs (-42.6)

All five of these face significantly weaker schedules in large part because of venue effects. The Western Bulldogs (and, to an extent, Geelong) aside, they're not playing inherently weaker teams, just teams that are weaker when playing at these teams' home venues.

The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and from the venue (from the opponents' perspective only). Here we would expect the Aggregate Opponent Venue Performance figures to be positive for a team, since their opponents are likely to perform better at home than their underlying ability would suggest.

The figures here are, on average, fractionally higher for non-Victorian teams (+27.6) than for Victorian teams (+25.1) because non-Victorian teams play more games, on average, interstate. 

Based on their away draws, the teams with the five most difficult draws are:

  1. Geelong (+64.3)

  2. GWS (+50.2)

  3. Fremantle (+42.8)

  4. Richmond (+42.7)

  5. Western Bulldogs (+40.9)


Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results we have:

  • Tough Schedules: West Coast*, GWS, Fremantle*, Essendon, Richmond, St Kilda

  • Slightly Harder Schedules: Hawthorn, Carlton

  • Average Schedules: Melbourne, North Melbourne, Western Bulldogs, Adelaide

  • Slightly Easier Schedules: Geelong, Gold Coast, Sydney, Collingwood

  • Easy Schedules: Port Adelaide, Brisbane Lions

* Given the treatment of the new home ground noted above. With a notional -40 point adjustment, West Coast would move to the 'Slightly Harder' category, and Fremantle to 'Average'.

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out: 

  • Fremantle and, to a lesser extent, Carlton have seemingly more difficult schedules than might be expected for teams in the Bottom 6

  • Sydney, Geelong, and Adelaide all have easier schedules than might be expected for a Top 6 team

  • Port Adelaide has a slightly easier schedule than might be expected for a Middle 6 team

To investigate whether some of these disparities might be attributable mainly to venue effects, I've included a couple of new columns on the extreme right of the table this year, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the Ratings of the teams played).

Looked at through this lens, we see that GWS's, Carlton's, North Melbourne's and the Western Bulldogs' schedules appear a little easier, and Brisbane's, Melbourne's, Adelaide's and Port Adelaide's a little harder. It does nothing of significance to the rankings of West Coast's, Fremantle's, Sydney's and Geelong's schedule strengths.

So, going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 112 points across the season, which is a tick over 5 points per game.

A 5-point advantage turns a game with an otherwise 50% victory probability into one with about a 56% probability, which converts to about 1.2 extra expected wins across a 22-game season. If, instead, we assume a 25% (or 75%) average probability without the advantage, then the 5-point advantage is worth about 0.9 extra wins a season.
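As a rough sketch of that conversion - assuming, purely for illustration, that game margins are roughly Normally distributed around the expected margin with a standard deviation of about 36 points (a round figure of mine, not one quoted above, so the outputs only roughly match the numbers in the text):

    from statistics import NormalDist

    # Margins assumed Normal around the expected margin with an sd of about 36 points
    MARGIN_DIST = NormalDist(mu=0, sigma=36)

    def win_prob(expected_margin):
        # P(actual margin > 0) given the expected margin
        return 1 - MARGIN_DIST.cdf(-expected_margin)

    def extra_wins(points_advantage, baseline_prob=0.5, games=22):
        # Back out the expected margin implied by the baseline probability, add
        # the advantage, then convert the per-game probability gain into wins
        baseline_margin = MARGIN_DIST.inv_cdf(baseline_prob)
        return games * (win_prob(baseline_margin + points_advantage) - baseline_prob)

    print(round(extra_wins(5), 1))                      # about 1.2 extra wins from a 50% base
    print(round(extra_wins(5, baseline_prob=0.25), 1))  # about 1.0 extra wins from a 25% base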

If we exclude the teams with the two easiest and two hardest schedules, the difference shrinks to about 3 points per game. That represents about 0.5 to 0.7 extra wins a season.

STRENGTH OF MISSING SCHEDULE

We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 22 of the possible 34 games. 

The table below summarises the missing games in the 2018 Fixture, denoting with H's those games missed that would have been home games for a team, and with A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2017 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

West Coast, for example, fail to play three of the other Top 6 teams twice during the season, missing out on Adelaide at home, and Richmond and Geelong away. Sydney misses only Richmond at home, and Adelaide away, but plays Geelong, GWS, and West Coast both at home and away.

Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.

The column headed Total shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. As we'd expect, their magnitudes are broadly related to the teams' final ladder positions, though with a few exceptions such as Geelong, who have the 9th lowest value but who finished 3rd on the ladder, and St Kilda, who have the 6th lowest value but who finished 11th on the ladder.

By adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the earlier Strength of Schedule Actual table. 
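To see why that equality holds, note that MoSHBODS Ratings effectively sum to zero across the 18 teams (the three tier averages above cancel out, for example). In a full 34-game, all-plays-all schedule, a team would therefore face an aggregate opponent Rating equal to twice the negative of its own Rating. Its actual Aggregate Opponent Ability is that figure less the Total for its missed games, so adding twice its own Rating back onto that Total - the effect of 'not playing itself twice' - necessarily yields the negative of the Aggregate Opponent Ability figure.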

CONCLUSION

Accounting for both the quality of the opposition faced and the venuing of fixtures, I'd summarise the teams' relative Strengths of Schedule as follows:

  • Tough Schedules: West Coast, GWS, Fremantle, Essendon, Richmond, St Kilda

  • Slightly Harder Schedules: Hawthorn, Carlton

  • Average Schedules: Melbourne, North Melbourne, Western Bulldogs, Adelaide

  • Slightly Easier Schedules: Geelong, Gold Coast, Sydney, Collingwood

  • Easy Schedules: Port Adelaide, Brisbane Lions

Were we to ignore venuing (which I don't think we should, but I recognise is harder for the AFL to account for), relative to those assessments:

  • Easier than shown above: GWS, Carlton, North Melbourne and Western Bulldogs

  • Harder than shown above: Brisbane Lions, Melbourne, Port Adelaide and Adelaide

These differences come about for these teams because, at their home grounds, opponents tend to do relatively better or worse than away teams do on average at other venues. If we look at GWS, for example, we see that a number of teams do quite well at GWS's two home grounds for 2018: Sydney Showground and Manuka Oval.

Make of all that what you will, but the fundamental issue is that anything short of an all-plays-all home-and-away fixture will always lead to winners and losers in the fixturing.

From a glass half-full perspective, while a team's schedule will certainly, at the margin, nudge the probability of it playing finals up or down by a little, there will be myriad other factors before and during the season that will have much more profound effects.

Still, it'd be nice to make the schedule as even as we could ...