
An Analysis of Strength of Schedule for the Men's 2022 AFL Season

I thought that this year I might not have time to perform the traditional Strength of Schedule analysis, having spent far too long during the off-season re-optimising the MoS twins (more on which in a future blog), but here I am on a rainy night in Sydney with a toothache that a masochist would label ‘unpleasantly painful’ and the prospect of sleep before tomorrow’s 11:30am dental appointment fairly remote. So, let’s do it …

For the analysis, we’ll use the same methodology as we used last year, details of which appear in this post from 2015 and this one from the year before. We’ll again use the latest MoSHBODS Team Rating System to provide the required estimates of relative team ability and venue effects.

The 2022 AFL Fixture, as usual, has all 18 teams playing 22 of a possible 34 games, each missing 6 of the home and 6 of the away clashes that an all-plays-all full schedule would entail. There is again a bye week for every team, accommodated in 2022 by scheduling only 6 games in each of Rounds 12 through 14.

THE RULE OF THIRDS

In determining the 108 games to be excluded from the schedule, the League has once again, in the interests of what it calls "on-field equity", applied a 'weighted rule', which is a mechanism for reducing the average disparity in ability between opponents across the 198 home-and-away games, using the ladder positions of 2021 after the Finals series as the measure of that ability.

This year, of the contests that would pit a team from last year's Top 6 against a team from the Bottom 6, only 42 of the 72 possible pairings (or about 58%) are included in the schedule. That's the same number as we had in the 2018, 2019, (original) 2020, and 2021 Fixtures.

By contrast, 22 of the 30 possible pairings (or about 73%) between Top 6 teams are included, while 46 of the 72 possible pairings (or about 64%) between Top 6 and Middle 6 teams are included.

There are also 21 of a possible 30 pairings pitting teams from the Middle 6 against one another, 23 of a possible 30 pairings pitting teams from the Bottom 6 against one another, and 44 of a possible 72 pairings pitting a team from the Middle 6 against a team from the Bottom 6. (Together with the pairings above, that accounts for all 42 + 22 + 46 + 21 + 23 + 44 = 198 games.)

In total, 132 of the 198 contests (or about 67%) involve teams from different thirds, based on final official ladder positions last season. That's the same number that was fixtured for the 2021 season.
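If you'd like to verify tallies like these yourself, the counting is straightforward. Below is a minimal sketch in Python, assuming the fixture is held as a list of (home, away) pairs and the 2021 final ladder as a dict of positions; both structures are hypothetical stand-ins rather than any official data feed.

```python
from collections import Counter

def third_of(ladder_position: int) -> str:
    """Map a 1-18 ladder position to its competition third."""
    if ladder_position <= 6:
        return "Top 6"
    if ladder_position <= 12:
        return "Middle 6"
    return "Bottom 6"

def tally_by_thirds(fixture: list[tuple[str, str]],
                    ladder: dict[str, int]) -> Counter:
    """Count fixtured games for each unordered pair of thirds."""
    return Counter(
        tuple(sorted((third_of(ladder[home]), third_of(ladder[away]))))
        for home, away in fixture
    )
```

As a sanity check on the denominators: a within-third category offers 15 pairings played home and away, or 30 possible games, while a cross-third category offers 6 x 6 x 2 = 72.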

MoSHBODS’ VIEWS ON STRENGTH OF SCHEDULE

Next we’ll use MoSHBODS' opinions about team strengths and venue effects to provide some answers to the following questions about the 2022 schedule:

  1. How difficult is the schedule that each team faces, taking into account the teams faced and the venues at which they are faced?

  2. How much has the use of the 'weighted rule' in truncating the draw helped or hindered a team's chances relative to a complete 34-game, all-plays-all competition?

The first thing we need to estimate a team's schedule strength is a measure of their opponents' underlying abilities. For this purpose we'll use MoSHBODS’ 2022 pre-Round 1 Team Ratings, which are set by taking 58% of their final 2021 Ratings, the regression towards zero reflecting the average historical shrinking in the spread of team abilities from the end of one season to the start of the next. These Ratings appear in the table below. 
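In code, that regression step is a one-liner (the function name is mine, purely for illustration):

```python
REGRESSION_FACTOR = 0.58  # average historical shrinkage from season's end to season's start

def pre_round1_rating(final_2021_rating: float) -> float:
    """Regress a final 2021 MoSHBODS Rating towards zero for the start of 2022."""
    return REGRESSION_FACTOR * final_2021_rating

# For example, a team that finished 2021 rated +10.0 starts 2022 at +5.8.
```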

This year, again, most team rankings on MoSHBODS are similar to the ordering based on ladder finish: no team is ranked more than four places differently by the two methods, and 11 teams are ranked the same or within two places.

In the context of the AFL's competition "thirds", only six teams would be placed in a different third were MoSHBODS to be used rather than the final ladder in defining the boundaries:

  • GWS: Top 6 based on Ladder / Middle 6 based on MoSHBODS

  • Sydney: Middle 6 based on Ladder / Top 6 based on MoSHBODS

  • Fremantle and West Coast: Middle 6 based on Ladder / Bottom 6 based on MoSHBODS

  • Hawthorn and Adelaide: Bottom 6 based on Ladder / Middle 6 based on MoSHBODS

The average and range of the Combined MoSHBODS Ratings of teams from each of the AFL thirds is as follows:

  • Top 6: Average +8.4 Points / Range 23.6 Points (2021: +10.5 / 9.3)

  • Middle 6: Average -0.2 Points / Range 8.2 Points (2021: -0.1 / 11.1)

  • Bottom 6: Average -8.3 Points / Range 4.6 Points (2021: -10.4 / 12.0)

We can see that the Top 6 teams from the final 2021 ladder are, on average, slightly weaker than those from last year, the Middle 6 about the same in terms of average strength, and the Bottom 6 teams slightly stronger.

Ignoring venue effects, which we'll come to in a moment, the difference between playing an average Top 6 team and an average Bottom 6 team is therefore about 17 points (+8.4 less -8.3). That's about 4 points less than last season. This year, unlike other years, the spread of Ratings is far greater in the Top 6 than in either the Bottom 6 or the Middle 6, so exactly who you play from the Top 6 matters relatively more than who you play from the other two thirds.
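For anyone reproducing those per-third summaries, here is a minimal sketch (the team-keyed dicts are assumed inputs for illustration, not MoSHBODS internals):

```python
def summarise_thirds(ratings: dict[str, float],
                     third: dict[str, str]) -> dict[str, tuple[float, float]]:
    """Return the (average, range) of Combined Ratings for each third."""
    summary = {}
    for label in ("Top 6", "Middle 6", "Bottom 6"):
        values = [r for team, r in ratings.items() if third[team] == label]
        summary[label] = (sum(values) / len(values), max(values) - min(values))
    return summary
```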

VENUE PERFORMANCE VALUES

MoSHBODS also provides estimates of how much better or worse teams, on average, play at each venue relative to their own and their opponents’ underlying ability. These estimates are known as Venue Performance Values, and are a logical extension of the notion of a "home ground advantage" to account for the fact that not all away venues are the same for every team.

The current Venue Performance Values are summarised in the table below for all of the venues at which a team appears at least once during the 2022 home-and-away season. For details on how these have been calculated, refer to this blog post.

(Interestingly, Gold Coast play at 11 different venues in 2022, while Port Adelaide, Brisbane, North Melbourne, and Western Bulldogs each play at 10. Richmond play at only 6.)

Venue Performance Values are, like Ratings, measured in Points, and are added to a team's underlying MoSHBODS Combined Rating when used in the Strength of Schedule calculation. So, for example, we can say that Geelong are, on average, a 2.1 Points better team than their underlying +3.1 Points Rating when playing at Kardinia - effectively a +5.2 Points team there.

As I noted last year, my methodology now includes a team's own VPV as well as its opponents' in the Strength calculations, because I think this better encapsulates the full venue effect. Prior to 2021, only the opponents' VPVs were included.

To reiterate the rationale for this by way of a concrete example, imagine moving a Brisbane Lions v Melbourne game from the Gabba to Carrara, assuming that the VPV numbers in the table above apply. Under the old methodology, that move would have virtually no effect on Brisbane's estimated Strength of Schedule, because Melbourne's VPV is about the same at both venues - around minus 8.6 to minus 8.9 points - and we would ignore Brisbane's VPV at those venues. But Brisbane are estimated to be about a 7-point better side at the Gabba than at Carrara - a fact which, arguably, is worthy of inclusion in the Strength calculation. The fixture with that game at the Gabba is surely an easier one for Brisbane than the fixture with that same game at Carrara.

The main drawback that I can see with this approach is that it tends to increase the estimated schedule strength for teams that have relatively low VPVs at all venues (for example, Western Bulldogs), and to decrease it for teams that have relatively high VPVs at all venues. If a team seems to enjoy no venue advantage anywhere, is it reasonable to assess them as having a more difficult schedule and, conversely, if a team seems to enjoy an advantage at every venue, is it reasonable to assess them as having a less difficult schedule? Ultimately, this is probably a question of personal preference but, again for this year, I'm answering "yes" to both those questions.

One way of avoiding this issue is, of course, to solely consider the underlying abilities of the teams faced and ignore venues altogether. In the table that follows, I’ll provide the data to allow you to do exactly that.
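Pulling those threads together, here is a sketch of the per-game Strength contribution as I've described it: the opponent's Combined Rating plus the net venue effect (their VPV at the venue less your own). The names are mine, and the ability-only variant simply drops the venue terms.

```python
def game_strength(opp_rating: float, opp_vpv: float, own_vpv: float,
                  include_venue: bool = True) -> float:
    """Estimated difficulty, in points, of a single fixtured game."""
    return opp_rating + ((opp_vpv - own_vpv) if include_venue else 0.0)

def schedule_strength(games, include_venue: bool = True) -> float:
    """Total difficulty across a team's 22 games, where `games` holds
    (opp_rating, opp_vpv, own_vpv) tuples."""
    return sum(game_strength(*g, include_venue=include_venue) for g in games)
```

On this accounting, moving the Brisbane Lions v Melbourne example from the Gabba to Carrara reduces Brisbane's own VPV by about 7 points and so adds about 7 points to their estimated schedule strength, which is exactly the effect argued for above.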

Anyway, because of the manner in which they are calculated, the Venue Performance Values incorporate the effects, if any, of interstate travel, which you can see, for example, if you run your eye along the row for the Gabba in the table above. At that ground, all interstate teams are about 6 to 10 points worse. (Gold Coast are about a 1-point worse team at the Gabba, but you can't really attribute that to travel.)

STRENGTH OF SCHEDULE

After performing the necessary calculations for all 22 games for every team, we arrive at the Strength of Schedule estimates below, within which larger positive values represent more difficult schedules.

(See the STRENGTH OF MISSING SCHEDULE section below for each team's actual and missing schedule.)

In the left portion of the table we have the combined strength of the opponents faced by a team at home, split into the contribution from underlying ability and from venue effects. We would generally expect the Aggregate Net Venue Performance figure to be negative for a team in this part of the table, since their opponents are likely to have negative VPVs and they themselves are likely to have positive VPVs. That is, indeed, the case here.

Based on each team’s home fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. Richmond (+9.4)

  2. North Melbourne (+0.4)

  3. Western Bulldogs (-28.5)

  4. Melbourne (-31.7)

  5. Collingwood (-40.0)

If we ignore venue effects and consider only the underlying strength of the teams faced at home, North Melbourne rank 1st, Port Adelaide 2nd, Richmond 3rd, West Coast 4th, and Geelong 5th. On this metric, Western Bulldogs would slide to 8th (partly because they have only a small positive VPV at Docklands and a negative VPV at the MCG, which makes their fixture seem more difficult when venue effects are included), Melbourne to 6th, and Collingwood to 9th.

Those with the easiest home schedules (including net venue effects) are:

  1. Brisbane Lions (-156.2)

  2. Adelaide (-117.9)

  3. West Coast (-117.9)

  4. Essendon (-116.0)

  5. Gold Coast (-100.6)

Here, the disproportionate impacts of interstate travel on teams’ home schedules are apparent. Brisbane Lions and Gold Coast, for example, face interstate teams in every home game except when they play each other, Adelaide likewise except when they play Port Adelaide, and West Coast likewise except when they play Fremantle.

Were we to ignore venue effects, only Essendon and Gold Coast would remain in the bottom five in terms of home schedule difficulty, while Brisbane Lions would move into 9th, Adelaide into 8th, and West Coast into 15th. St Kilda and Hawthorn would move into the bottom five.

The middle section of the table looks at the combined strength of the teams played away from home, again split into the contribution from underlying ability and venue effects. Here we would expect the Aggregate Net Venue Performance figures to be positive for a team, since their opponents are likely to have positive VPVs at their home grounds. That is, indeed, the case for all teams, though least of all for Hawthorn.

Based on their away fixtures, the teams with the five most difficult schedules (including net venue effects) are:

  1. Fremantle (+107.7)

  2. GWS (+99.5)

  3. Gold Coast (+95.8)

  4. Essendon (+95.1)

  5. Brisbane Lions (+94.2)

Ignoring venue effects would leave Fremantle, Essendon, and Brisbane Lions in the Top 5, with GWS falling to 6th, and Gold Coast to 10th. St Kilda would come up to 2nd, and Western Bulldogs to 3rd.

Those with the easiest away schedules (including net venue effects) are:

  1. Hawthorn (+20.5)

  2. North Melbourne (+23.4)

  3. Melbourne (+23.9)

  4. Geelong (+26.4)

  5. Richmond (+34.6)

Ignoring venue effects would leave Melbourne, Geelong, and Richmond in the Top 5, with Hawthorn falling to 8th, and North Melbourne to 6th. West Coast would come up to 1st, and Adelaide to 5th.

Combining the home and the away pictures to estimate a Total Effective Strength of Schedule (SoS) figure and summarising the results, we have (including net venue effects):

  • Tough Schedules: Western Bulldogs, Richmond, St Kilda, Fremantle, and GWS

  • Slightly Harder Schedules: Collingwood, North Melbourne, and Carlton

  • Average Schedule: Sydney, Gold Coast, and Melbourne

  • Slightly Easier Schedules: Port Adelaide, Essendon, and Geelong

  • Easy Schedules: Hawthorn, West Coast, Adelaide, and Brisbane Lions

Comparing each team's ranking on Strength of Schedule with the ladder positions used for weighting the draw, a few teams stand out: 

  • Collingwood, Carlton and North Melbourne have more difficult schedules than might be expected for Bottom 6 teams

  • Brisbane Lions and Geelong have easier schedules than might be expected for Top 6 teams

  • West Coast have a slightly easier schedule, and Richmond, St Kilda and Fremantle slightly harder schedules, than might be expected for Middle 6 teams

To investigate whether some of these disparities might be attributable mainly to net venue effects, I have, as mentioned, included a couple of columns at the extreme right of the table, which calculate total Strength of Schedule using only the estimated underlying abilities of the opponents faced (ie the sum of the Ratings of the teams played, ignoring venue effects).

Looked at through this lens, we see that:

  • Richmond’s fixture appears much easier

  • St Kilda’s and Melbourne’s fixtures appear somewhat easier

  • Port Adelaide’s, Essendon’s and Brisbane Lions’ fixtures appear much harder

Going back to the Total Effective SoS numbers, we find that the difference between the hardest and easiest schedules this year amounts to about 107 points across the season, which is about 5 points per game.

A 5-point advantage turns a game with an otherwise 50% victory probability into one with about a 56% probability, which converts to about 1.2 extra expected wins across a 22-game season. If, instead, we assume a 25% (or 75%) average probability without the advantage, then the 5-point advantage is worth about 0.9 extra expected wins a season.
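For the curious, that conversion can be roughly reconstructed by assuming final margins are Normally distributed around the expected margin. The 37-point standard deviation below is my assumption for illustration; whatever spread MoSHBODS implies may differ slightly, and with it the exact win figures.

```python
from statistics import NormalDist

SD_MARGIN = 37.0  # assumed spread of final margins, in points

def win_prob(expected_margin: float) -> float:
    """P(win) if final margins are Normal(expected_margin, SD_MARGIN)."""
    return 1.0 - NormalDist(mu=expected_margin, sigma=SD_MARGIN).cdf(0.0)

def extra_wins(base_prob: float, advantage: float, games: int = 22) -> float:
    """Expected extra wins from a constant per-game points advantage."""
    base_margin = NormalDist().inv_cdf(base_prob) * SD_MARGIN
    return games * (win_prob(base_margin + advantage) - base_prob)

# extra_wins(0.50, 5) comes out at about 1.2; extra_wins(0.25, 5) at about 1.0
```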

STRENGTH OF MISSING SCHEDULE

We can also view the schedule on the basis of the opportunities missed by a team as a consequence of playing only 22 of a possible 34 games. 

The table below summarises the missing games in the 2022 Fixture, denoting with H's those missed games that would have been home games for a team, and with A's those that would have been away games. Note that I've ordered the teams on the basis of their final 2021 ladder positions, the same ordering that was used for implementing the AFL's 'weighted rule'.

Port Adelaide, for example, fail to play three of the other Top 6 teams twice during the season, missing out on Brisbane Lions at home, and Western Bulldogs and GWS away. Melbourne and Western Bulldogs fare worst amongst the Top 6 teams, both playing 8 of a possible 10 games against other teams from the Top 6.

Ignoring venue effects, we can overlay MoSHBODS Ratings on this table to calculate a simplistic Strength of the Missed Schedule figure.

The column headed ‘Total’ shows the aggregate MoSHBODS Ratings of the opponents not played twice during the home-and-away season. The more negative it is, the weaker in aggregate are the teams not played twice; the more positive it is, the stronger in aggregate are the teams not played twice.

On this measure, Western Bulldogs’ schedule was furthest away (in a detrimental sense) from what it would have enjoyed in an all-plays-all home-and-away fixture, Melbourne’s was second-furthest, and Port Adelaide’s third-furthest. This seems reasonable, given that these were the Top 3 teams on the ladder at the end of 2021. Conversely, Adelaide’s schedule was furthest away in a beneficial sense, Gold Coast’s second-furthest, and West Coast’s somewhat surprisingly third-furthest.

As we'd expect, the magnitude of the number in the Total column for a team is broadly related to that team’s final ladder position, reflecting the AFL’s desire to have stronger teams play fewer games against weaker opponents and more games against similarly stronger opponents, and to have weaker teams play fewer games against stronger opponents and more games against similarly weaker opponents.

By adding back the effect on a team of not playing itself twice, we get a Net Impact of Missed Games figure, which is exactly equal to the negative of the Aggregate Opponent Ability Only column in the earlier Strength of Schedule Actual table. 
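In code, and assuming (as is roughly true of MoSHBODS Ratings) that the 18 teams' Ratings sum to zero, the relationship looks like this (names are mine):

```python
def net_impact_of_missed_games(own_rating: float,
                               missed_opponent_ratings: list[float]) -> float:
    """Missed-schedule total plus an 'add back' of 2 x own Rating for the two
    games a team can never play against itself. With zero-sum Ratings, this
    equals the negative of the Aggregate Opponent Ability Only figure."""
    return sum(missed_opponent_ratings) + 2.0 * own_rating
```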

CONCLUSION

If we weight the two components of schedule strength - say average quality of the opposition faced at 75%, and net venue effects at 25% - we might summarise the teams' relative Strength of Schedule as follows (with teams' final 2021 ladder positions shown in brackets, and a code sketch of the weighting after the list):

  • Tough Schedules: Western Bulldogs (2nd) and Fremantle (11th)

  • Harder Schedules: St Kilda (10th), GWS (6th), and Collingwood (17th)

  • Slightly Harder Schedules: North Melbourne (18th), Port Adelaide (3rd), and Richmond (12th)

  • Roughly Average Schedules: Carlton (13th), Sydney (7th), and Essendon (8th)

  • Slightly Easier Schedules: Brisbane Lions (5th), Gold Coast (16th), Geelong (4th), and Melbourne (1st)

  • Easier Schedules: Hawthorn (14th), West Coast (9th), and Adelaide (15th)
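One plausible implementation of that weighting, assuming the 75/25 split is applied directly to the two points-denominated aggregates from the Strength of Schedule table (a sketch of the idea only, not a claim about the exact calculation behind the groupings above):

```python
def weighted_sos(opponent_ability_total: float,
                 net_venue_total: float,
                 ability_weight: float = 0.75) -> float:
    """Blend the opponent-ability and net-venue components of schedule
    strength, both measured in points, using the stated 75/25 weights."""
    return (ability_weight * opponent_ability_total
            + (1.0 - ability_weight) * net_venue_total)
```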

Relative to the AFL’s intentions, you could make a case based on this listing that:

  • St Kilda, Fremantle, and Collingwood (and maybe North Melbourne) were hard done by

  • Melbourne, Geelong, and Brisbane Lions did better than they might have expected

As I say most years, Strength of Schedule is quite a challenging metric to define and measure, so you’ll get a better sense of it and be more able to make your own assessment by reading a variety of approaches.