An Analysis of Strength of Schedule for the Men's 2024 AFL Season

The men’s AFL fixture for 2024 was recently released and, once again this year, we’ll analyse it to see what it means for all 18 teams.

Specifically, we’ll look for answers to three questions:

  1. Which teams fared best and which worst in terms of the overall difficulty of the teams they face, given what the MoS models think about relative team strengths and venue effects (the Strength of Schedule analysis)

  2. Which teams fared best and which worst in terms of the matchups they missed out on, given that only 23 of the 34 games an all-plays-all fixture would entail are played. Again we’ll use the MoS models’ opinions about relative team strengths and venue effects in what we’ll call a Strength of Missing Schedule analysis

  3. How much more or less likely would each team be to play Finals were the missing parts of the fixture actually played

For the analyses in 1 and 2, we’ll use the same methodology as we used last year, details of which appear in this post from 2015 and this one from the year before. We’ll again use the latest MoSHBODS Team Rating System to provide the required estimates of relative team ability and venue effects. Note, however, that we’ll use the Venue Performance Values as at the start of the 2024 season, and team ratings as they will be for the first game of the 2024 season (which are 58% of what they were at the end of the 2023 season).

THE FIXTURE

The 2024 AFL Fixture has all 18 teams now playing 23 of a possible 34 games, each missing 6 of the home and 5 of the away clashes, or 5 of the home and 6 of the away clashes, that an all-plays-all full schedule would entail. There is, again, a bye week for every team, these having been accommodated in 2024 by playing only 4 games in Round 0 (sic), 6 games in Rounds 14 and 15, 7 games in Round 12, and 8 games in Rounds 2, 3, 5, 6, and 13, meaning that there will now be eight opportunities for the perennial “is the bye a disadvantage?” discussion.

THE RULE OF THIRDS

In determining the 99 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 207 home-and-away games, using the ladder positions of 2023 after the Finals series as the measure of that ability.

This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 46 of the 72 (or about 64%) of the possible pairings are included in the schedule. That's up by 1 on last year’s figure of 45.

By contrast, 23 of the 30 (or about 77%) of the possible pairings between the Top 6 teams are included, while 46 of the 72 (or about 64%) of the possible pairings between the Top 6 and the Middle 6 teams are included.

There are also 23 of a possible 30 pairings pitting teams from the Middle 6 against one another, 24 of a possible 30 pairings pitting teams from the Bottom 6 against one another, and 45 of a possible 72 pairings pitting a team from the Middle 6 against a team from the Bottom 6.

In total, 70 of the 207 contests (or about 34%) involve teams from the same one-third based on final official ladder positions last season. That’s just fractionally more than the percentage for the 2023 season.

MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE

The metric that we’ll use to estimate Strength of Schedule is the combined strength of the teams met across the 24 rounds of the season, adjusted for venue effects (so meeting Team A at home will tend to make a lower contribution to the overall combined strength of the teams met than will meeting Team A away, where they’ll likely enjoy a Home Ground Advantage). Recall that the MoSHBODS Rating System calculates a Venue Performance Value for every team at every venue; these will tend to be positive for venues that are the team’s home grounds, and negative for venues that are the team’s away grounds, especially if they are interstate.

The effective strength of an opponent, Team A, faced by Team B at Venue V is given by Team A’s Combined MoSHBODS Rating + Team A’s VPV at Venue V - Team B’s VPV at Venue V.

For this measure, a higher combined strength is interpreted as a more difficult schedule.
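In code, the calculation amounts to something like the following minimal Python sketch, where the ratings and VPVs are hypothetical placeholders rather than actual MoSHBODS output:

```python
# A minimal sketch of the Strength of Schedule calculation described above.
# All numbers below are hypothetical placeholders, not MoSHBODS output.

def preseason_rating(final_2023_rating: float) -> float:
    """Pre-Round 0 Ratings are 58% of final 2023 Ratings (regressed to zero)."""
    return 0.58 * final_2023_rating

def opponent_strength(opp_rating: float, opp_vpv: float, own_vpv: float) -> float:
    """Effective strength of the opponent faced at a given venue: the
    opponent's Combined Rating, plus their VPV there, less our own VPV there."""
    return opp_rating + opp_vpv - own_vpv

def strength_of_schedule(games: list[tuple[float, float, float]]) -> float:
    """Total SoS: the sum of effective opponent strengths over all games.
    Each game is (opp_rating, opp_vpv_at_venue, own_vpv_at_venue)."""
    return sum(opponent_strength(*game) for game in games)

# A home game against a +5.0-rated side whose VPV at our ground is -8.0,
# while ours there is +3.0, contributes 5.0 - 8.0 - 3.0 = -6.0 to our SoS
print(opponent_strength(5.0, -8.0, 3.0))  # -6.0
```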

The first thing we need for this metric is a measure of each team’s underlying abilities. For this purpose, as noted earlier, we'll use MoSHBODS’ 2024 pre-Round 0 Team Ratings, which are set by taking 58% of their final 2023 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below. 

This year there are a number of teams who are ordered very differently based on MoSHBODS versus their ladder finish, with the biggest differences being for:

  • Adelaide: 3rd on MoSHBODS and 10th on the Ladder

  • Carlton: 8th on MoSHBODS and 3rd on the Ladder

  • Fremantle: 9th on MoSHBODS and 14th on the Ladder

  • St Kilda: 12th on MoSHBODS and 7th on the Ladder

  • Essendon: 16th on MoSHBODS and 11th on the Ladder

In the context of the AFL's competition "thirds", only five teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:

  • Port Adelaide and Carlton: Top 6 based on Ladder / Middle 6 based on MoSHBODS

  • Adelaide and Sydney: Middle 6 based on Ladder / Top 6 based on MoSHBODS

  • Fremantle: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS

The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:

  • Top 6: Average +7.5 Points / Range 10.3 Points (2023: +8.5 / 21.9)

  • Middle 6: Average +0.6 Points / Range 22.5 Points (2023: +3.7 / 12.7)

  • Bottom 6: Average -8.1 Points / Range 27.2 Points (2023: -12.2 / 18.1)

We can see that the Top 6 teams from the final 2023 ladder are, on average, slightly weaker than those from last year, the Middle 6 significantly weaker, and the Bottom 6 teams somewhat stronger.

Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 15.6 points (+7.5 less -8.1). That’s about 5 points less than last season and almost identical to 2022.

With relatively large spreads in the ratings across the Middle and Bottom thirds - the equivalent of about 3.5 goals in the Middle 6, and 4.5 goals in the Bottom 6 - it's quite important which of the teams from these thirds a team plays.

VENUE PERFORMANCE VALUES

MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.

The Venue Performance Values, calculated as they would be on day 1 of the 2024 season, are summarised in the table below for all of the venues at which a team appears at least once sometime during the 2024 home-and-away season. For details on how these have been calculated, refer to this blog.

(Interestingly, GWS again play at 12 different venues in 2024, and Gold Coast and North Melbourne at 10 different venues. In contrast, Carlton and Essendon play at only 7 different venues)

Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong is, on average, a 1.8-point better team than their underlying +0.9 Points Rating suggests when playing at Kardinia.

As noted earlier, the methodology includes a team’s own VPV as well as its opponents’ in the Strength calculations, because I think this better encapsulates the full venue effect. Prior to 2021, only the opponents’ VPVs were included.

To reiterate the rationale for this by way of a concrete example, imagine moving a Brisbane Lions v Melbourne game from the Gabba to Carrara, assuming that the VPV numbers in the table above apply. Under the old methodology, that would have virtually no effect on Brisbane’s estimated Strength of Schedule, because Melbourne’s VPV at both venues is about the same, at around minus 9.9 to minus 10.4 points, and we would ignore Brisbane’s VPV at those venues. But Brisbane is estimated to be more than a 5-point better side at the Gabba compared to Carrara - a fact which, arguably, seems worthy of inclusion in the Strength calculation. The fixture with that game at the Gabba is surely an easier fixture for Brisbane than the one with that same game at Carrara.

The main drawback that I can see from this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, Western Bulldogs), and decreases the estimated schedule strength for teams that have relatively high VPVs at all venues. If a team seems to enjoy no home ground advantage anywhere, is it reasonable to therefore assess them as having a more difficult schedule and, conversely, if a team seems to play relatively equally well at all venues, is it reasonable to therefore assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, again for this year, I’m answering “yes” to both those questions.

One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows, and in much of the analysis, I’ll provide the data to allow you to do exactly that.

Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, all interstate teams are about 9.7 to 10.7 points worse. (Gold Coast are just under half a point worse at the Gabba, but you can’t really attribute that to travel.)

Generally speaking, the interstate travel component of VPVs is fairly uniform across teams because:

  • only the last 13 years of data are included in VPV calculations

  • a team needs to have played 18 games at a venue in that window before the regularisation towards the default of -10.4 points is completely discontinued and the calculation becomes based solely on the team’s actual performance relative to expectation (a simple illustrative version of this blending appears in the sketch below)
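MoSHBODS’ exact regularisation scheme isn’t spelled out here, so purely by way of illustration, the sketch below assumes a simple linear blend between a team’s observed venue performance and the default value:

```python
# Illustrative only: a linear blend between observed venue performance and
# the default, with the default's weight falling to zero at 18 games.
# (The actual MoSHBODS weighting scheme may well differ.)

DEFAULT_VPV = -10.4   # default value quoted above
FULL_SAMPLE = 18      # games at a venue before regularisation is discontinued

def regularised_vpv(observed_vpv: float, games_at_venue: int) -> float:
    w = min(games_at_venue / FULL_SAMPLE, 1.0)   # weight on the observed data
    return w * observed_vpv + (1 - w) * DEFAULT_VPV

print(round(regularised_vpv(2.0, 6), 1))    # -6.3: still pulled to the default
print(round(regularised_vpv(2.0, 18), 1))   # 2.0: based solely on performance
```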

STRENGTH OF SCHEDULE

After performing the necessary calculations for all 23 games for every team, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.

(See the STRENGTH OF MISSING SCHEDULE section below for each team’s actual and missing schedule.)

In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here, with Hawthorn the obvious outlier because of their negative VPVs at both the MCG and York Park, which between them account for 9 of their 10 home games in the fixture.

Based solely on each team’s home fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. Hawthorn (+32.4)

  2. Carlton (-15.4)

  3. North Melbourne (-17.6)

  4. Richmond (-29.0)

  5. Western Bulldogs (-51.6)

If we ignore venue effects, such is the underlying strength of the teams they face at home that Hawthorn still rank 1st, Richmond 2nd, Carlton 3rd, Fremantle 4th, and Sydney 5th. On this metric, North Melbourne would slide into 13th, and Western Bulldogs into 16th (partly because of their small negative VPVs at Docklands and the MCG, which make their fixture seem more difficult relative to other teams with larger Home Ground Advantages when venue effects are included).

Those with the easiest home schedules (including net venue effects) are:

  1. Adelaide (-169.4)

  2. Brisbane Lions (-140.6)

  3. Port Adelaide (-135.1)

  4. Sydney (-104.6)

  5. Fremantle (-93.9)

Here, the disproportionate impacts of interstate travel on teams’ home schedules are apparent. Brisbane Lions, for example, face interstate teams in every home game except when they play Gold Coast, Adelaide likewise except when they play Port Adelaide, and Fremantle likewise except when they play West Coast.

Were we to ignore venue effects, none of those five teams just mentioned would remain in the bottom five in terms of home schedule difficulty, with St Kilda taking 1st, Geelong 2nd, Western Bulldogs 3rd, Melbourne 4th, and Gold Coast 5th.

The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds. That is, indeed, the case for all teams, though least of all for North Melbourne.

Based on their away fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. Port Adelaide (+129.0)

  2. Brisbane Lions (+115.7)

  3. Sydney (+112.1)

  4. Western Bulldogs (+104.0)

  5. West Coast (+90.2)

Ignoring venue effects would leave Brisbane Lions, Port Adelaide and Western Bulldogs in the Top 5, with Sydney falling to 9th, and West Coast to 8th. Essendon would come up to 5th, and Geelong to 3rd.

Those with the easiest away schedules (including net venue effects) are:

  1. North Melbourne (+25.5)

  2. Hawthorn (+27.6)

  3. St Kilda (+33.9)

  4. Fremantle (+43.5)

  5. Carlton (+45.3)

Ignoring venue effects would leave Hawthorn, Carlton, and Fremantle in the Top 5, with North Melbourne moving to 8th, and St Kilda to 7th. Gold Coast would come up to 3rd, and Richmond to 4th.

Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including net venue effects):

  • Tough Schedules: Hawthorn, Western Bulldogs, Richmond, Carlton, and Essendon

  • Slightly Harder Schedules: West Coast, Melbourne, North Melbourne, and Sydney

  • Average Schedule: GWS and Collingwood

  • Slightly Easier Schedules: Port Adelaide, Geelong, Gold Coast, and Brisbane Lions

  • Easy Schedules: Fremantle, St Kilda, and Adelaide

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out: 

  • Richmond and Hawthorn have more difficult schedules than might be expected for teams in the Bottom 6

  • Brisbane Lions, and possibly Port Adelaide have easier schedules than might be expected for Top 6 teams

  • Western Bulldogs has a harder schedule, and St Kilda and Adelaide an easier schedule than might be expected for Middle 6 teams

To investigate the possibility that some of these disparities might be attributable mainly to net venue effects, I have, as mentioned, included a couple of columns on the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the ratings of the teams played, ignoring venue effects).

Looked at through this lens, we see that:

  • Melbourne’s and North Melbourne’s fixtures appear much easier

  • Brisbane Lions’ and Port Adelaide’s fixtures appear much harder

Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 147 points across the season, which is just over 6 points per game and up about 1 point per game on last year’s figure.

A 6-point advantage turns a game with an otherwise 50% victory probability into one with about a 57% probability, which converts to about 1.7 extra expected wins across a 23-game season.
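If you want to reproduce that arithmetic, here’s one way, assuming game margins are roughly normally distributed around the expected margin; the 34-point standard deviation is my assumption for illustration rather than a MoSHBODS parameter:

```python
# Converting a points advantage into a win probability and extra expected
# wins, assuming normally distributed game margins (sigma is an assumption)
from statistics import NormalDist

SIGMA = 34.0  # assumed standard deviation of AFL game margins, in points

def win_probability(points_advantage: float) -> float:
    """P(win) for a team whose expected margin is points_advantage."""
    return NormalDist(0.0, SIGMA).cdf(points_advantage)

p = win_probability(6.0)
print(f"P(win) with a 6-point edge: {p:.1%}")                      # about 57%
print(f"Extra expected wins over 23 games: {(p - 0.5) * 23:.1f}")  # about 1.6
```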

DETAILED GAME-BY-GAME NET VPVs

The table below provides a full breakdown of the Strength of Schedule calculation for every fixtured game. You can click on it to access a larger version. The top row of figures records the Combined Rating of the relevant team, and the numbers in the body of the table the Net VPVs for each contest.

So, for example, when Collingwood play Brisbane Lions at home at the MCG, the Opponent component of the SoS calculation for the Pies is the +7.4 Rating of the Lions, and the Venue component is -11.9, which is Brisbane’s VPV at the MCG of -10.0 points less the Pies’ VPV there of +1.9 points. So, the Pies would start as 4.5-point favourites (11.9 less 7.4), and the contribution of this game to the Strength of Schedule is the negative of that value.
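That worked example, in a few lines of Python using the numbers quoted above:

```python
# Reproducing the Collingwood v Brisbane Lions example from the text
lions_rating = 7.4       # Brisbane's Combined Rating
lions_vpv_mcg = -10.0    # Brisbane's VPV at the MCG
pies_vpv_mcg = 1.9       # Collingwood's VPV at the MCG

net_vpv = lions_vpv_mcg - pies_vpv_mcg     # -11.9: the Venue component
sos_contribution = lions_rating + net_vpv  # -4.5: this game's SoS contribution
print(f"Pies start as {-sos_contribution:.1f}-point favourites")  # 4.5
```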

STRENGTH OF MISSING SCHEDULE

We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 23 of a possible 34 games. 

We use the same metric for this as we did for the Strength of Schedule assessment, here the combined strength of the teams not met across the 24 rounds of the season, also adjusted for venue effects. We assume that all missing games will be played at the home ground most commonly used for the home team in the season proper. So, for example, all of the missing Geelong home games are played at Kardinia. Carlton, who play 5 home games at Docklands and 5 at the MCG in the season proper, are assumed to play all missing home games at Docklands (where their VPV is slightly higher).
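As a sketch of that venue assumption (the fixture counts follow the Carlton example above, but the VPV figures are hypothetical):

```python
# Assign each missing home game to the home team's most-used venue in the
# season proper, breaking ties in favour of the venue with the higher VPV
# (as in the Carlton example). The VPV numbers here are hypothetical.
from collections import Counter

def assumed_home_venue(fixtured_home_venues: list[str],
                       vpv: dict[str, float]) -> str:
    counts = Counter(fixtured_home_venues)
    most = max(counts.values())
    tied = [venue for venue, n in counts.items() if n == most]
    return max(tied, key=lambda venue: vpv[venue])  # tie-break on VPV

carlton_homes = ["Docklands"] * 5 + ["MCG"] * 5   # from the text
print(assumed_home_venue(carlton_homes, {"Docklands": 2.0, "MCG": 1.5}))
# -> Docklands (the venue with the slightly higher VPV)
```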

For this measure, a lower combined strength is interpreted as a less disadvantageous schedule.

The table below summarises the missing games in the 2024 Fixture, denoting with H's those games missed that would have been home games for a team, and as A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2023 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

(Note that Richmond plays St Kilda notionally twice at home in 2024, including during the Gather Round played exclusively in Adelaide.)

Essendon, for example, fail to play five of the Top 6 teams twice during the season, missing out on Brisbane Lions, Port Adelaide, and Melbourne at home, and Carlton and GWS away.

Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.

The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.

On this measure, Brisbane Lions’ schedule was furthest away (in a detrimental sense) from what it would have enjoyed in an all-plays-all home-and-away fixture, Sydney’s was second-furthest, GWS’ third-furthest, and Western Bulldogs’ fourth-furthest.

Sydney is unique amongst the teams outside the Top 6 in missing out on playing each of West Coast, North Melbourne, Hawthorn, and Gold Coast twice.

Conversely, West Coast’s schedule was furthest away in a beneficial sense, followed by North Melbourne, Gold Coast, and St Kilda.

As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.

Because team ratings are constrained to sum to zero, by adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the Strength of Actual Schedule table shown earlier. In this sense we can see that there is some relationship between a team’s Strength of Schedule and Strength of Missing Schedule.
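The identity is quick to verify. Writing r_i for team i's rating, and Played_i and Missing_i for the aggregate ratings of the opponents met and missed, an all-plays-all fixture has team i meeting every other team twice, so:

```latex
\text{Played}_i + \text{Missing}_i
  = \sum_{j \ne i} 2 r_j
  = 2\Big(\sum_{j} r_j - r_i\Big)
  = -2 r_i
\quad\Longrightarrow\quad
\underbrace{\text{Missing}_i + 2 r_i}_{\text{Net Impact}} = -\,\text{Played}_i
```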

We can also account for Venue Effects in our calculations relating to Missing Games, which we do in the table below.

It produces a broadly similar ranking of the teams, with Western Bulldogs a little higher and Melbourne and Geelong a little lower.

IMPACT OF THE MISSING SCHEDULE

Although the numbers in the previous section provide a way to rank and broadly quantify the effects of the truncated draw on each team, it’s hard to know what that means in practical terms.

To that end, for this analysis we will estimate the difference in the simulated probability of making the Finals between that obtained using the actual 24-round fixture and that using a 35-round fixture in which all teams meet all other teams home and away. What we’re measuring here are the differences between an idealised all-plays-all season and a 24-round season modelled on the actual fixture.

For this measure, a larger difference in Finals probability (35- vs 24-round) is interpreted as a more disadvantageous schedule.
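Mechanically, the comparison looks something like the Python sketch below. The real simulations are considerably richer (percentage tie-breaks, within-season rating updates, and so on), and the team list and fixture inputs are assumed to be supplied, so treat this as an outline of the idea rather than the actual machinery:

```python
# Outline: simulate many seasons under a given fixture, count how often each
# team finishes in the top 8 on wins, then difference the all-plays-all and
# actual-fixture probabilities team by team.
import random

SIGMA = 34.0  # assumed margin standard deviation, as before

def finals_probability(teams, fixture, n_sims=10_000, finals_spots=8):
    """fixture is a list of (home, away, expected_home_margin) tuples."""
    finals_counts = {team: 0 for team in teams}
    for _ in range(n_sims):
        wins = {team: 0 for team in teams}
        for home, away, expected_margin in fixture:
            margin = random.gauss(expected_margin, SIGMA)
            wins[home if margin >= 0 else away] += 1
        ladder = sorted(teams, key=lambda team: wins[team], reverse=True)
        for team in ladder[:finals_spots]:
            finals_counts[team] += 1
    return {team: n / n_sims for team, n in finals_counts.items()}

# impact[team] = finals_probability(teams, all_plays_all_fixture)[team] \
#              - finals_probability(teams, actual_fixture)[team]
```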

The table below provides information about how each team fared in the all-plays-all simulations versus the simulations based on the actual fixture.

Focussing solely on the difference in the probability of playing Finals, we see that Sydney fares worst by being over 3% less likely to play Finals given the actual fixture. Brisbane Lions are next most disadvantaged, and then Collingwood.

Conversely, Gold Coast are most advantaged by the fixture (compared to an all-plays-all alternative), followed by Adelaide and then St Kilda.

SUMMARY

The table below summarises the team rankings on five of the metrics so far considered.

CONCLUSION

One way of finally summarising the schedule strength discussion is to bring together the answers to two questions: how strong is the schedule that each team faces, and how much does it affect their Finals chances?

We provide this summary in the final table for this blog.

We conclude, therefore, that:

  • Essendon has a tough schedule, but it doesn’t much disadvantage it in terms of playing Finals

  • Hawthorn, Richmond, and Western Bulldogs all have tough schedules, but they only somewhat disadvantage them in terms of playing Finals

  • Carlton has a tough schedule, and it very much disadvantages it in terms of playing Finals

  • Sydney and Collingwood both have only average schedules, but the matches they miss very much disadvantage them in terms of playing Finals

  • Brisbane Lions has a relatively weak schedule, but the matches they miss very much disadvantage them in terms of playing Finals, partly because of their relatively high VPV at the Gabba