An Analysis of Strength of Schedule for the 2020 AFL Season
This year we’ll use the same methodology as we did last year to analyse the AFL Fixture for the upcoming season. Details of that methodology are in this post from 2015 and this one from the year before. We’ll again use the MoSHBODS Team Rating System to provide the estimates of relative team ability and venue effects.
The 2020 AFL Fixture, released 31 October, has all 18 teams playing 22 of a possible 34 games, each missing 6 of the home and 6 of the away clashes that an all-plays-all full schedule would entail. There is, again, a bye week for every team, these having been accommodated in 2020 by playing only 6 games in Rounds 12 through 14.
THE RULE OF THIRDS
In determining the 108 games to be excluded, the League has, once again in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 198 home-and-away games, using the ladder positions of 2019 after the Finals series as the measure of that ability.
This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 42 of the 72 (or about 58%) of the possible pairings are included in the schedule. That's the same number as we had in the 2018 and 2019 Fixtures.
By contrast, 46 of the 60 (or about 77%) of the possible pairings between the Top 6 teams are included, while 44 of the 72 (or about 61%) of the possible pairings between the Top 6 and the Middle 6 teams are included. Those are also the same proportions as we had in the 2019 Fixture.
This year then, 136 of the 198 contests (or about 69%) involve teams from the same one-third based on final official ladder positions last season.
That’s the same number as we had last year, the data for which appears at left.
MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE
Next we’ll use MoSHBODS' opinions about team strengths and venue effects to provide some answers to the following questions about the 2020 schedule:
How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?
How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-round competition?
The first thing we need to estimate a team's schedule strength is a measure of their opponents' underlying abilities. For this purpose we'll use MoSHBODS’ 2020 pre-Round 1 Team Ratings, which are set by taking 70% of their final 2019 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below.
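That regression towards zero is simple to express. As a minimal sketch (the 70% carryover is the figure quoted above; the example rating is invented for illustration):

```python
def preseason_rating(final_rating, carryover=0.7):
    """Pre-Round 1 Rating: keep 70% of last season's final Combined
    Rating, regressing the remainder towards zero."""
    return carryover * final_rating

# e.g. a team that finished 2019 rated +10.0 starts 2020 at +7.0
print(preseason_rating(10.0))
```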
Again this year there are a number of teams ranked quite differently by MoSHBODS than by their ladder finish.
Five teams are ranked four or more spots differently by MoSHBODS compared to their official final ladder position, and four more are ranked two or three spots differently.
GWS, for example, will start the season as the 8th-highest rated MoSHBODS team despite finishing 2nd on the final ladder, while the Western Bulldogs will start ranked 4th despite finishing 7th on the ladder.
In the context of the AFL's competition "thirds", six teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:
Brisbane Lions and GWS: Top 6 based on Ladder / Middle 6 based on MoSHBODS
Western Bulldogs and Hawthorn: Middle 6 based on Ladder / Top 6 based on MoSHBODS
Essendon: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS
Sydney: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS
Last year, only two teams would have moved into a different third using MoSHBODS rather than the ladder. Note, however, the small difference in Combined Rating between West Coast in 6th and Brisbane Lions in 7th (0.23 points), which means that the Lions could easily have finished in the Top 6 based on Ratings but for a goal or two here and there.
The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:
Top 6: Ave +7.7 Points / Range 7.8 Points (2019: +11.7 / 6.2)
Middle 6: Ave +1.4 Points / Range 9.3 Points (2019: +3.9 / 10.8)
Bottom 6: Ave -9.1 Points / Range 22.3 Points (2019: -15.6 / 19.3)
We can see that the Top and Middle 6 teams are, on average, weaker than those from last year, and the Bottom 6 teams are, on average, stronger than those from last year. Put another way, this year's teams are far more evenly matched than those from the season before.
Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 17 points (+7.7 less -9.1). That’s about 10 points less than last season. Also, the spread of Ratings is, again and even more so than was the case last season, greater in the Bottom 6 than in either the Top 6 or the Middle 6. So, once again this year, it matters more exactly who you play from the Bottom 6 than who you play from either of the other thirds.
MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.
The current Venue Performance Values are summarised in the table below for all of the venues at which a team appears at least once sometime during the 2020 home-and-away season. Note that teams need to have played a minimum number of games at a venue (four in the MoSHBODS System) before their Venue Performance Value is altered from zero (shown as 0.0 in the table below). Values shown as dashes are for team-venue pairs that don’t arise in the 2020 schedule.
(Interestingly, Gold Coast play at 12 different venues in 2020, which is three more venues than any other team, and twice as many as Carlton and Collingwood.)
Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a 2.5-point better team than their underlying +9.0 Rating would suggest when playing at Docklands.
Because of the manner in which they are calculated, these Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, St Kilda are about a 14 point worse team, Hawthorn an 11 point worse team, and North Melbourne a 12 point worse team. (Gold Coast are about a 19 point worse team at the Gabba, but you can’t really attribute that to the travel.)
After performing the necessary calculations for all 22 games for every team, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.
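As a rough sketch of that per-team calculation: for each game, add the opponent's Combined Rating to the opponent's Venue Performance Value at that venue, defaulting to zero where no value has been established. Geelong's +9.0 Rating and +2.5 Docklands value are the figures quoted above; Carlton's rating here is invented purely for illustration.

```python
def strength_of_schedule(fixture, ratings, venue_values):
    """Effective Strength of Schedule for one team: the sum, over its
    fixtures, of each opponent's Combined Rating plus that opponent's
    Venue Performance Value at the game's venue (zero if unestablished)."""
    return sum(
        ratings[opp] + venue_values.get((opp, venue), 0.0)
        for opp, venue in fixture
    )

# Illustrative two-game fixture (Carlton's -8.0 is not a MoSHBODS figure)
ratings = {"Geelong": 9.0, "Carlton": -8.0}
venue_values = {("Geelong", "Docklands"): 2.5}  # Carlton: no value yet, so 0.0
fixture = [("Geelong", "Docklands"), ("Carlton", "MCG")]
print(strength_of_schedule(fixture, ratings, venue_values))  # 3.5
```

The `.get(..., 0.0)` default mirrors the way teams with fewer than four games at a venue carry a Venue Performance Value of zero.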
(See the STRENGTH OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)
In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from the venue (from the opponents' perspective only). We would generally expect the Aggregate Opponent Venue Performance figure to be negative for a team, since their opponents are likely to perform less well away from home than at home.
We don't see that, however, for GWS because of:
the surprisingly good performances that a number of this season’s visitors have recorded at Sydney Showground (namely, Adelaide, Sydney, Collingwood, and Richmond, all of whom have positive Venue Performance Values for that venue)
the fact that three visitors (Geelong, St Kilda, and Fremantle) have played an insufficient number of games at Sydney Showground to have registered a Venue Performance Value, and so carry the default Value of zero.
In fact, Essendon will be the only visitor to Sydney Showground in 2020 who will suffer from a negative Venue Performance value at the ground.
West Coast and Fremantle also suffer a little because only three teams (other than themselves) have played four or more games at Perth Stadium and so still have the default Venue Performance Value of zero. To the extent that travel, in particular, is generally detrimental to teams' performances, this might overstate the effective strength of the opponents faced by these two teams at home.
That said, even a notional negative 42 point allocation to Fremantle and West Coast - roughly the average figure for Adelaide, Port Adelaide, Gold Coast, Brisbane Lions, GWS and Sydney - would still have West Coast with a Total Effective Strength of Schedule in the top 8 in terms of difficulty, and Fremantle in the top 14.
Unadjusted though, based on their home fixtures, the teams with the five most difficult schedules (including venue effects) are:
Carlton (+37.4)
Richmond (+11.3)
GWS (+10.5)
St Kilda (+10.1)
West Coast (+6.8)
Such is the underlying strength of the teams they face that Carlton would still rank 1st, Richmond 2nd, St Kilda 3rd, and West Coast 5th if we ignored venue effects. GWS would fall to 11th, however.
Those with the easiest home schedules (including venue effects) are:
Geelong (-91.3)
Brisbane Lions (-90.7)
Port Adelaide (-76.9)
Adelaide (-69.5)
Melbourne (-42.6)
Were we to ignore venue effects, Geelong, Melbourne, and Port Adelaide would remain in the bottom five in terms of home schedule difficulty, while Brisbane Lions and Adelaide would move into 8th and 9th respectively, such is the benefit for these two of playing at their home grounds.
The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and from the venue (from the opponents' perspective only). Here we would expect the Aggregate Opponent Venue Performance figures to be positive for a team, since their opponents are likely to perform better at home than their underlying ability would suggest. That is, indeed, the case for every team, and to a relatively similar aggregate extent.
Based on their away fixtures, the teams with the five most difficult schedules (including venue effects) are:
North Melbourne (+72.7)
Essendon (+64.1)
Geelong (+55.5)
GWS (+54.9)
Gold Coast (+52.7)
Ignoring venue effects would yield exactly the same top five but with GWS and Geelong switching places.
Those with the easiest away schedules (including venue effects) are:
Richmond (-11.6)
St Kilda (-2.3)
Sydney (+6.5)
Fremantle (+7.4)
Carlton (+10.8)
Ignoring venue effects would give the same bottom five, but see Sydney 15th, St Kilda 16th, and Fremantle 17th.
Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including venue effects):
Tough Schedules: GWS, West Coast*, Carlton, Essendon, Gold Coast, North Melbourne
Slightly Harder Schedules: Collingwood
Average Schedules: Hawthorn, Fremantle*, Western Bulldogs, St Kilda, Richmond
Slightly Easier Schedules: Melbourne, Adelaide
Easy Schedules: Sydney, Geelong, Port Adelaide, Brisbane Lions
* given zero venue performance values for most opponents at Perth Stadium. Fremantle would move to the 'Slightly Easier' category with a -34 point adjustment, West Coast to 'Average'.
Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out:
Gold Coast and Carlton have more difficult schedules than might be expected for a team in the Bottom 6
Brisbane Lions, Geelong and Richmond have easier schedules than might be expected for Top 6 teams
Essendon has a slightly harder, and Port Adelaide a slightly easier schedule than might be expected for Middle 6 teams
To investigate whether some of these disparities might be attributable mainly to venue effects, I've included a couple of columns on the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the ratings of the teams played, ignoring venue effects).
Looked at through this lens, we see that Fremantle’s and Melbourne’s schedules appear somewhat easier, while Adelaide's, Geelong's, and Brisbane Lions’ appear harder. No other team sees the ranking of the difficulty of its schedule move by more than four places when we switch from including to excluding venue effects.
Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 132 points across the season, which is about 6 points per game, down a point compared to last year.
A 6-point advantage turns a game with an otherwise 50% victory probability into one with about a 57% probability, which converts to about 1.5 extra expected wins across a 22-game season. If, instead, we assume a 25% (or 75%) average probability without the advantage, then the 6-point advantage is worth about 1.2 extra expected wins a season.
If we exclude the teams with the two easiest and two hardest schedules the difference shrinks to about 4 points per game. That represents about 0.7 to 1.0 expected extra wins a season.
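Those figures can be checked with a back-of-envelope conversion, assuming game margins are roughly normally distributed about the expected result with a standard deviation of around 36 points (that standard deviation is my assumption here, not a published MoSHBODS parameter):

```python
from statistics import NormalDist

def extra_wins(advantage, baseline_prob=0.5, sigma=36.0, games=22):
    """Expected extra wins across a season from a per-game points
    advantage, assuming margins are Normal(0, sigma) about the
    expected margin implied by baseline_prob."""
    margins = NormalDist(0.0, sigma)
    base_margin = margins.inv_cdf(baseline_prob)  # margin implied by the baseline win probability
    return (margins.cdf(base_margin + advantage) - baseline_prob) * games

print(round(extra_wins(6), 1))        # 1.5 extra wins from a 50% baseline
print(round(extra_wins(6, 0.25), 1))  # 1.2 extra wins from a 25% baseline
print(round(extra_wins(4), 1))        # 1.0 extra win from a 50% baseline
```

With those assumptions the sketch reproduces the roughly 1.5, 1.2, and 0.7 to 1.0 extra-win figures above.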
STRENGTH OF MISSING SCHEDULE
We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 22 of a possible 34 games.
The table below summarises the missing games in the 2020 Fixture, denoting with H's those games missed that would have been home games for a team, and as A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2019 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.
Richmond, for example, fail to play two of the other Top 6 teams twice during the season, missing out on Brisbane Lions at home, and Geelong away. GWS misses West Coast at home, and Collingwood and Brisbane Lions away.
Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.
The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.
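A minimal sketch of that 'Total' figure (the team names and ratings below are invented for illustration, not MoSHBODS output):

```python
def missed_schedule_total(missed_opponents, ratings):
    """Aggregate the Combined Ratings of the opponents a team does not
    meet a second time: more negative means the missed double-ups were,
    in aggregate, against weaker teams."""
    return sum(ratings[opp] for opp in missed_opponents)

# Illustrative only: missing second games against two weak opponents
# yields a negative Total.
print(missed_schedule_total(["Team A", "Team B"], {"Team A": -5.0, "Team B": -2.5}))  # -7.5
```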
On this measure, West Coast’s schedule was furthest away (in a detrimental sense) from what it would have enjoyed in an all-plays-all home-and-away fixture, GWS’s was second-furthest, and Geelong’s third-furthest (though Richmond’s, Collingwood’s, and Hawthorn’s were almost equally as distant). Conversely, Melbourne’s schedule was furthest away in a beneficial sense, Gold Coast second-furthest, and Fremantle third-furthest (though Sydney’s was almost as far).
As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.
By adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the earlier Strength of Schedule Actual table.
CONCLUSION
Accounting for both the quality of the opposition faced and the venuing of fixtures (and using a venue performance value of zero at Perth Stadium for most teams except West Coast and Fremantle), I'd summarise the teams' relative Strength of Schedule as follows (with teams’ final ladder positions shown in brackets):
Tough Schedules: GWS (2nd), West Coast (6th), Carlton (16th), Essendon (8th), Gold Coast (18th), North Melbourne (12th)
Slightly Harder Schedules: Collingwood (4th)
Average Schedules: Hawthorn (9th), Fremantle (13th), Western Bulldogs (7th), St Kilda (14th), Richmond (1st)
Slightly Easier Schedules: Melbourne (17th), Adelaide (11th)
Easy Schedules: Sydney (15th), Geelong (3rd), Port Adelaide (10th), Brisbane Lions (5th)
Relative to the AFL’s intentions, you could make a case based on this listing that:
Carlton, Gold Coast and, maybe, North Melbourne were hard done by
Richmond, Geelong and Brisbane Lions did better than they might have expected (and you might add West Coast to that list if you were quibbling about how Perth Stadium has been treated).
Were we to ignore venuing (which I don't think we should, but which I also recognise is harder for the AFL to account for), relative to those assessments:
Easier than shown above: Fremantle and Melbourne
Harder than shown above: Adelaide, Geelong and Brisbane Lions
Contributing to these differences is the fact that:
Many teams are still assessed as having a zero venue performance value at Perth Stadium
Melbourne doesn’t do particularly well at the MCG compared to many of the other Victorian teams
Many interstate teams do relatively poorly at Adelaide Oval, and many teams interstate or otherwise do relatively poorly at Kardinia Park and the Gabba.
Strength of Schedule is quite a challenging metric to define and measure, and my final comment for readers is that I think you’ll get a better sense of it and be more able to make your own assessment by reading a variety of approaches.
Here are a few that I know about that you should check out:
(I’ll add to this list as more are published in the following days. Let me know if you see any.)