An Analysis of Strength of Schedule for the Men's 2023 AFL Season

The men’s AFL fixture for 2023 was released earlier this week, and tradition requires that the MoS website publishes its assessment of which teams fared best and which worst in that fixture given what the MoS models think about relative team strengths and venue effects.

For the analysis, we’ll use the same methodology as we used last year, details of which appear in this post from 2015 and this one from the year before. We’ll again use the latest MoSHBODS Team Rating System to provide the required estimates of relative team ability and venue effects. Note, however, that we’ll use the Venue Performance Values as at the end of the 2022 Grand Final, and team ratings as they will be for the first game of the 2023 season (which are 58% of what they were at the end of the 2022 season).

The 2023 AFL Fixture has all 18 teams now playing 23 of a possible 34 games, each missing either 6 home and 5 away, or 5 home and 6 away of the clashes that an all-plays-all full schedule would entail. There is, again, a bye week for every team, these having been accommodated in 2023 by playing only 7 games in Round 12, 8 in Round 13, and 6 in each of Rounds 14 and 15, meaning that it will be four weeks before the annual “is the bye a disadvantage?” discussion can be ended for another season.

THE RULE OF THIRDS

In determining the 99 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 207 home-and-away games, using the ladder positions of 2022 after the Finals series as the measure of that ability.

This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 45 of the 72 possible pairings (or about 63%) are included in the schedule. That's up by 3 on last year’s figure of 42.

By contrast, 23 of the 30 possible pairings (or about 77%) amongst the Top 6 teams are included, as are 47 of the 72 possible pairings (or about 65%) between the Top 6 and the Middle 6 teams.

There are also 22 of a possible 30 pairings pitting teams from the Middle 6 against one another, 23 of a possible 30 pairings pitting teams from the Bottom 6 against one another, and 47 of a possible 72 pairings pitting a team from the Middle 6 against a team from the Bottom 6.

In total, then, 68 of the 207 contests (or about 33%) involve teams from the same one-third based on final official ladder positions last season (the 23 + 22 + 23 within-third games plus the 45 + 47 + 47 cross-third games account for all 207). That’s just fractionally less than the percentage for the 2022 season.

MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE

Next we’ll use MoSHBODS' opinions about team strengths and venue effects to provide some answers to the following questions about the 2023 schedule:

  1. How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?

  2. How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-round competition?

The first thing we need in order to estimate a team's schedule strength is a measure of its opponents' underlying abilities. For this purpose, as noted earlier, we'll use MoSHBODS’ 2023 pre-Round 1 Team Ratings, which are set by taking 58% of the final 2022 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below.
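As a minimal sketch of that pre-season adjustment (only the 58% factor comes from the methodology described above; the Rating used below is hypothetical):

```python
# Pre-season regression of a MoSHBODS Rating towards zero (the all-team average).
SHRINKAGE = 0.58  # the historical spread-shrinkage factor quoted above

def pre_season_rating(final_rating: float) -> float:
    """Regress a team's final-season Rating towards zero for the new season."""
    return SHRINKAGE * final_rating

# A team that ended 2022 rated +20.0 would start 2023 rated +11.6
print(pre_season_rating(20.0))  # 11.6
```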

This year, again, most team rankings on MoSHBODS are similar to the ordering based on ladder finish, the main exceptions being Richmond, who finished 7th on the ladder but are ranked 2nd by MoSHBODS, and Port Adelaide, who finished 11th on the ladder but are ranked 5th. Just three teams (Sydney, Brisbane Lions, and Fremantle) were ranked four places apart by the two methods, and 11 were ranked the same or within two places.

In the context of the AFL's competition "thirds", only four teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:

  • Brisbane Lions and Fremantle: Top 6 based on Ladder / Middle 6 based on MoSHBODS

  • Richmond and Port Adelaide: Middle 6 based on Ladder / Top 6 based on MoSHBODS

The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:

  • Top 6: Average +8.5 Points / Range 21.9 Points (2022: +8.4 / 23.6)

  • Middle 6: Average +3.7 Points / Range 12.7 Points (2022: -0.2 / 8.2)

  • Bottom 6: Average -12.2 Points / Range 18.1 Points (2022: -8.3 / 4.6)

We can see that the Top 6 teams from the final 2022 ladder are, on average, slightly stronger than those from last year, the Middle 6 significantly stronger, and the Bottom 6 teams significantly weaker.

Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore just under 21 points (+8.5 less -12.2). That’s about 4 points more than last season.

With relatively large spreads in the ratings across the teams in each third - the equivalent of about 2 goals in the Middle 6, 3 goals in the Bottom 6, and 3.5 goals in the Top 6 - it's quite important which of the teams from each third a team plays.

VENUE PERFORMANCE VALUES

MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.

The Venue Performance Values, calculated as at the end of the 2022 season, are summarised in the table below for all of the venues at which a team appears at least once sometime during the 2023 home-and-away season. For details on how these have been calculated, refer to this blog.

(Interestingly, GWS play at 12 different venues in 2023, and Gold Coast at 11. In contrast, Collingwood and Richmond play at only 6 different venues.)

Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a 3.3 Points better team than their underlying +23.7 Points Rating would suggest when playing at Kardinia.
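In code, that adjustment is a simple addition. A minimal sketch using the Geelong figures just quoted (the function name is mine):

```python
def effective_rating(rating: float, vpv: float) -> float:
    """A team's venue-adjusted strength: its Combined Rating plus its VPV there."""
    return rating + vpv

# Geelong at Kardinia: +23.7 Rating plus +3.3 VPV
print(effective_rating(23.7, 3.3))  # 27.0
```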

As I noted last year, my methodology now includes a team’s own VPV as well as its opponents’ in the Strength calculations, because I think this better encapsulates the full venue effect. Prior to 2021, only the opponents’ VPVs were included.

To reiterate the rationale for this by way of a concrete example, imagine moving a Brisbane Lions v Melbourne game from the Gabba to Carrara, assuming that the VPV numbers in the table above apply. Under the old methodology, that move would have virtually no effect on Brisbane’s estimated Strength of Schedule, because Melbourne’s VPV is about the same at both venues (around -9.3 to -10.2 points), and we would ignore Brisbane’s VPVs at those venues. But Brisbane are estimated to be almost a 6-point better side at the Gabba than at Carrara - a fact which, arguably, seems worthy of inclusion in the Strength calculation. The fixture with that game at the Gabba is surely an easier one for Brisbane than the fixture with the same game at Carrara.
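Here's a sketch of that comparison. Only Melbourne's approximate VPVs are quoted in the text above, so the Brisbane figures below are illustrative assumptions, chosen to be consistent with the roughly 6-point Gabba-versus-Carrara gap; it's the shape of the calculation that matters:

```python
# VPVs in points. Melbourne's are roughly as quoted; Brisbane's are assumed.
VPV = {
    ("Melbourne", "Gabba"): -10.2,
    ("Melbourne", "Carrara"): -9.3,
    ("Brisbane Lions", "Gabba"): 9.0,    # assumed
    ("Brisbane Lions", "Carrara"): 3.1,  # assumed: ~6 points lower
}

def net_vpv_old(opponent: str, venue: str) -> float:
    """Pre-2021 approach: the opponent's VPV only."""
    return VPV[(opponent, venue)]

def net_vpv_new(team: str, opponent: str, venue: str) -> float:
    """Current approach: the opponent's VPV less the team's own VPV."""
    return VPV[(opponent, venue)] - VPV[(team, venue)]

# Moving the game from the Gabba to Carrara shifts Brisbane's figure by
# less than a point under the old approach, but by almost 7 points
# (towards "harder") under the new one.
for venue in ("Gabba", "Carrara"):
    print(venue,
          net_vpv_old("Melbourne", venue),
          net_vpv_new("Brisbane Lions", "Melbourne", venue))
```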

The main drawback that I can see from this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, Western Bulldogs), and to decrease the estimated schedule strength for teams that have relatively high VPVs at all venues. If a team seems to enjoy no home ground advantage anywhere, is it reasonable to therefore assess them as having a more difficult schedule and, conversely, if a team seems to play relatively well everywhere, home or away, is it reasonable to therefore assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, again for this year, I’m answering “yes” to both those questions.

One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows, and in much of the analysis, I’ll provide the data to allow you to do exactly that.

Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, all interstate teams are about 9.5 to 10.5 points worse. (Gold Coast are just under a 1-point worse team at the Gabba, but you can’t really attribute that to travel.)

Generally speaking, the interstate travel component of VPVs is more uniform now across teams because:

  • only the last 13 years of data are included in VPV calculations

  • a team needs to have played 18 games at a venue in that 13-year window before the regularisation towards the default of -10.5 points is completely discontinued and the calculation becomes based solely on the team’s actual performance relative to expectation
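The post linked earlier has the full details of that regularisation; as a rough sketch of the general idea only (this linear blend is my simplification, not necessarily MoSHBODS' exact scheme):

```python
DEFAULT_VPV = -10.5  # the default value quoted above
FULL_SAMPLE = 18     # games at a venue before regularisation ceases

def regularised_vpv(observed_vpv: float, games_at_venue: int) -> float:
    """Blend the observed venue effect with the default, in proportion to
    how much of the 18-game sample a team has actually accrued at a venue.
    A simplified stand-in for MoSHBODS' actual regularisation."""
    weight = min(games_at_venue, FULL_SAMPLE) / FULL_SAMPLE
    return weight * observed_vpv + (1 - weight) * DEFAULT_VPV

print(regularised_vpv(4.0, 6))   # mostly the default: ~-5.7
print(regularised_vpv(4.0, 18))  # sample complete: 4.0
```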

STRENGTH OF SCHEDULE

After performing the necessary calculations for all 23 games for every team, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.

(See the STRENGTH OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)

In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here.
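For concreteness, here's a minimal sketch of the per-game arithmetic being aggregated in these tables (the data structures are mine; the Ratings and VPVs shown are the Geelong and Sydney figures used in the worked example further below):

```python
from dataclasses import dataclass

@dataclass
class Game:
    opponent: str
    venue: str

# Combined Ratings and per-venue VPVs, as per the tables above
RATINGS = {"Geelong": 23.7, "Sydney": 6.9}
VPV = {("Geelong", "Kardinia"): 3.3, ("Sydney", "Kardinia"): -8.8}

def game_strength(team: str, game: Game) -> float:
    """One game's contribution to a team's Strength of Schedule: the
    opponent's Rating, plus the opponent's VPV at the venue, less the
    team's own VPV there. More positive means a harder assignment."""
    return (RATINGS[game.opponent]
            + VPV[(game.opponent, game.venue)]
            - VPV[(team, game.venue)])

def strength_of_schedule(team: str, fixture: list[Game]) -> float:
    """Total (ability plus net venue) strength across a fixture."""
    return sum(game_strength(team, g) for g in fixture)

# Geelong hosting Sydney at Kardinia: 6.9 + (-8.8) - 3.3 = -5.2
print(round(game_strength("Geelong", Game("Sydney", "Kardinia")), 1))
```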

Based solely on each team’s home fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. St Kilda (-9.3)

  2. Western Bulldogs (-15.3)

  3. Melbourne (-20.7)

  4. Richmond (-32.6)

  5. Carlton (-37.8)

If we ignore venue effects, such is the underlying strength of the teams they face at home that Sydney rank 1st, Brisbane Lions 2nd, St Kilda 3rd, Carlton 4th, and Gold Coast 5th. On this metric, Western Bulldogs would slide to 6th (partly because they have small negative VPVs at Docklands and the MCG, which make their fixture look more difficult when venue effects are included), Melbourne to 13th, and Richmond to 11th.

Those with the easiest home schedules (including net venue effects) are:

  1. Brisbane Lions (-134.0)

  2. Adelaide (-121.4)

  3. Geelong (-117.0)

  4. Port Adelaide (-103.6)

  5. Fremantle (-93.2)

Here, the disproportionate impacts of interstate travel on teams’ home schedules are apparent. Brisbane Lions, for example, face interstate teams in every home game except when they play Gold Coast, Adelaide likewise except when they play Port Adelaide, and Fremantle likewise except when they play West Coast.

Were we to ignore venue effects, only Adelaide and Geelong would remain among the five easiest home schedules: Brisbane Lions would move to 2nd on home schedule difficulty, Port Adelaide to 9th, and Fremantle to 12th, while North Melbourne, West Coast, and Hawthorn would move into the easiest five.

The middle section of the table looks at the combined strength of the teams played away from home, again split into the contributions from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds, while the team itself is likely to have negative VPVs there. That is, indeed, the case for all teams, though least of all for Carlton.

Based on their away fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. West Coast (+113.3)

  2. Fremantle (+110.3)

  3. Sydney (+101.6)

  4. Port Adelaide (+97.9)

  5. Adelaide (+95.1)

Ignoring venue effects would leave West Coast, Fremantle, and Port Adelaide in the Top 5, with Sydney falling to 8th, and Adelaide to 6th. Hawthorn would come up to 1st, and Geelong to 2nd.

Those with the easiest away schedules (including net venue effects) are:

  1. Carlton (+20.8)

  2. St Kilda (+28.2)

  3. Collingwood (+47.4)

  4. North Melbourne (+48.6)

  5. Gold Coast (+48.9)

Ignoring venue effects would leave Carlton, St Kilda, and Gold Coast in the Top 5, with Collingwood falling to 9th, and North Melbourne to 10th. Brisbane Lions would come up to 4th, and Essendon to 5th.

Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including net venue effects):

  • Tough Schedules: Western Bulldogs, West Coast, Melbourne, and Richmond

  • Slightly Harder Schedules: St Kilda, Fremantle, Sydney, and Essendon

  • Average Schedule: Hawthorn, North Melbourne, Port Adelaide, and Collingwood

  • Slightly Easier Schedules: GWS, Carlton, Gold Coast, and Adelaide

  • Easy Schedules: Geelong and Brisbane Lions

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out: 

  • West Coast, Essendon, and, perhaps, Hawthorn have more difficult schedules than might be expected for teams in the Bottom 6

  • Brisbane Lions, Geelong, and Collingwood have significantly easier schedules than might be expected for Top 6 teams

  • Gold Coast has an easier, and Western Bulldogs a significantly harder, schedule than might be expected for Middle 6 teams

To investigate the possibility that some of these disparities might be attributable mainly to net venue effects, I have, as mentioned, included a couple of columns at the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the Ratings of the teams played, ignoring venue effects).

Looked at through this lens, we see that:

  • Melbourne’s, St Kilda’s, and West Coast’s fixtures appear much easier

  • Brisbane Lions’ fixture appears much harder

  • Port Adelaide’s, Collingwood’s, Adelaide’s, Geelong’s, and Sydney’s fixtures appear somewhat harder

Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 105 points across the season, which is just under 5 points per game.

A 5-point advantage turns a game with an otherwise 50% victory probability into one with about a 56% probability, which converts to about 1.2 extra expected wins across a 23-game season. If, instead, we assume a 25% (or 75%) average probability without the advantage, then the 5-point advantage is worth about 0.9 extra expected wins a season.
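If you want to reproduce that conversion approximately, here's a minimal sketch assuming game margins are normally distributed around the expected margin with a standard deviation of about 37 points. That sigma is my assumption rather than a published MoSHBODS parameter, so the outputs only roughly match the figures quoted above:

```python
from statistics import NormalDist

SIGMA = 37.0  # assumed standard deviation of AFL game margins, in points
margins = NormalDist(0.0, SIGMA)

def win_prob(expected_margin: float) -> float:
    """P(win) when the margin is Normal(expected_margin, SIGMA)."""
    return 1.0 - margins.cdf(-expected_margin)

def extra_wins(base_prob: float, advantage: float, games: int = 23) -> float:
    """Extra expected wins from a constant per-game points advantage."""
    base_margin = margins.inv_cdf(base_prob)
    return (win_prob(base_margin + advantage) - base_prob) * games

print(round(win_prob(5.0), 3))        # ~0.554
print(round(extra_wins(0.50, 5), 2))  # ~1.2
print(round(extra_wins(0.25, 5), 2))  # ~1.0, near the 0.9 quoted above
```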

DETAILED GAME-BY-GAME NET VPVs

The table below provides a full breakdown of the Strength of Schedule calculation for every fixtured game. You can click on it to access a larger version. The top row of figures records the Combined Rating of the relevant team, and the numbers in the body of the table the Net VPVs for each contest.

So, for example, when Geelong face Sydney at home, the Opponent component of the SoS calculation for the Cats is the +6.9 Rating of the Swans, and the Venue component is -12.1, which is Sydney’s VPV at Kardinia of -8.8 points less Geelong’s VPV at Kardinia of 3.3 points.

STRENGTH OF MISSING SCHEDULE

We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 23 of a possible 34 games. 

The table below summarises the missing games in the 2023 Fixture, denoting with H's those games missed that would have been home games for a team, and as A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2022 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

Richmond, for example, fail to play four of the Top 6 teams twice during the season, missing out on Collingwood, Brisbane Lions, and Fremantle at home, and Geelong away.

Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.

The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.

On this measure, Sydney’s schedule was furthest away (in a detrimental sense) from what they would have enjoyed in an all-plays-all home-and-away fixture, Geelong’s second-furthest, and Port Adelaide’s third-furthest. Port Adelaide are also the only team from outside the Top 6 to miss out on playing each of GWS, West Coast, and North Melbourne twice.

Conversely, North Melbourne’s schedule was furthest away in a beneficial sense, West Coast’s second-furthest, and Hawthorn’s third-furthest.

As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.

By adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the Strength of Actual Schedule table shown earlier.
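To see why that identity holds, note that MoSHBODS Ratings are expressed relative to an average team and so (as I understand the calibration) sum to zero across the 18 clubs. For team i with Rating R_i, the ability-only strengths of the actual and the missed schedules must together make up the all-plays-all total:

Actual_i + Missed_i = 2 × Σ_{j≠i} R_j = 2 × (0 − R_i) = −2 × R_i

so adding the 2 × R_i "not playing itself twice" term to the missed side gives Missed_i + 2 × R_i = −Actual_i, which is exactly the negative of the Aggregate Opponent Ability Only figure.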

CONCLUSION

If we weight the two components of schedule strength - say, average quality of the opposition faced at 75%, and net venue effects at 25% - we might summarise the teams’ relative Strengths of Schedule as follows (with teams’ final 2022 ladder positions shown in brackets):

  • Tough Schedules: Western Bulldogs (8th) and Sydney (2nd)

  • Harder Schedules: Fremantle (6th) and Port Adelaide (11th)

  • Slightly Harder Schedules: Richmond (7th), West Coast (17th), Essendon (15th), and Collingwood (3rd)

  • Roughly Average Schedules: Melbourne (5th) and Hawthorn (13th)

  • Slightly Easier Schedules: North Melbourne (18th), St Kilda (10th), GWS (16th), and Adelaide (14th)

  • Easier Schedules: Carlton (9th), Geelong (1st), Brisbane Lions (4th), and Gold Coast (12th)

Relative to the AFL’s intentions, you could make a case based on this listing that:

  • Port Adelaide, West Coast, and Essendon did worse than they might have expected

  • Melbourne, Geelong, and Brisbane Lions did better than they might have expected

As I say most years, Strength of Schedule is quite a challenging metric to define and measure, so you’ll get a better sense of it, and be better able to make your own assessment, by reading a variety of approaches.