2019 Strength of Schedule: A Post-Season Review

Back in November of 2018 we looked at the then-upcoming season of the AFL and estimated the strength of schedule for all 18 teams based on the MoSHBODS Ratings and Venue Performance Values (VPVs) that prevailed at the time.

In this blog post we'll use the same methodology but replace the static, start-of-season MoSHBODS data with the dynamic Ratings and VPVs that each team's opponents carried into the relevant games, to assess who, in hindsight, had an easier or more difficult schedule than we initially estimated.
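As a rough illustration of that calculation, the sketch below recomputes a team's schedule strength two ways: once with the static pre-season opponent Ratings and VPVs, and once with the values the opponents actually carried into each game. The column names and numbers are purely hypothetical placeholders, not the actual MoSHBODS data.

```python
import pandas as pd

# Hypothetical fixture data, one row per game from the rated team's perspective.
# Column names and values are placeholders for illustration; they are not the
# actual MoSHBODS outputs.
fixture = pd.DataFrame({
    "team":                      ["Western Bulldogs"] * 3,
    "opponent":                  ["Brisbane Lions", "Fremantle", "Collingwood"],
    "opponent_rating_pregame":   [8.1, 2.3, 11.7],   # rating carried into the game
    "opponent_rating_preseason": [1.0, -3.5, 12.4],  # static start-of-season rating
    "opponent_vpv_pregame":      [4.2, 6.0, 1.1],    # venue performance value at the time
    "opponent_vpv_preseason":    [3.8, 5.5, 1.5],
})

def schedule_strength(df, rating_col, vpv_col, include_venue=True):
    """Average effective opponent strength per game for each team."""
    strength = df[rating_col] + (df[vpv_col] if include_venue else 0)
    return strength.groupby(df["team"]).mean()

# "What We Think Now": opponents' Ratings and VPVs as at each game.
actual = schedule_strength(fixture, "opponent_rating_pregame", "opponent_vpv_pregame")

# "What We Thought Pre-Season": the static, start-of-season values.
expected = schedule_strength(fixture, "opponent_rating_preseason", "opponent_vpv_preseason")

# Positive values mean the schedule turned out tougher than we thought pre-season.
print((actual - expected).rename("points_per_game_difference"))
```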

The summary results of that analysis appear in the table below, with the new estimated schedule strengths recorded in the block of data headed "What We Think Now", and the schedule strengths we reported in that earlier blog shown in the block of data headed "What We Thought Pre-Season".

What we find is that:

  • If we include venue effects, in retrospect West Coast had the toughest home and away schedule, though it was only about 0.4 points per game tougher than we estimated pre-season.

  • Excluding venue effects it was Port Adelaide who had the toughest home and away schedule, about 1.3 points per game tougher than we estimated pre-season.

  • The teams whose schedules, including venue effects, turned out most significantly harder than we estimated pre-season were the Western Bulldogs (1.5 points per game harder), Port Adelaide (1.3), and Adelaide (0.8).

  • Teams that, including venue effects, had significantly easier schedules than we estimated pre-season were Collingwood (2.1 points per game easier) and GWS (1.6).

Overall, there was some difference between how we ranked the teams' schedules pre-season and how we'd rank them now, knowing the actual pre-game Ratings and VPVs, though less difference than there was in 2018. The rank correlation coefficient for the Total Effective SOS data is +0.80, and for the Aggregate Opponent Ability Only data it is +0.50.
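For anyone wanting to reproduce that sort of comparison, a Spearman rank correlation is a one-liner. The sketch below uses made-up SOS values purely to show the mechanics, not the figures from the table above.

```python
from scipy.stats import spearmanr

# Placeholder SOS values for a handful of teams (pre-season vs retrospective).
# These are illustrative numbers only, not the actual table values.
preseason_sos     = [10.2,  7.5,  3.1, -2.4, -6.8]
retrospective_sos = [10.6,  9.0,  1.8, -0.9, -7.3]

# Spearman's rho compares the two rankings rather than the raw values.
rho, _ = spearmanr(preseason_sos, retrospective_sos)
print(f"Rank correlation: {rho:+.2f}")
```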

We can decompose the differences between the "actual" (knowing the true pre-game ratings) and "expected" (using the pre-season ratings) estimates of schedule strength by looking at the team-by-team data for each opponent.

(Note that, in this analysis, we'll look only at opponent ability and ignore VPVs. The differences we're investigating, then, are those shown in the rightmost column of the table above.)
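One way to perform that decomposition in code: for each game, take the gap between the opponent's actual pre-game rating and their pre-season rating, then sum those gaps by opponent. Again, the data frame below is a made-up example with assumed column names, not the real MoSHBODS data.

```python
import pandas as pd

# Made-up per-game data with assumed column names, for illustration only.
games = pd.DataFrame({
    "team":     ["Western Bulldogs"] * 4,
    "opponent": ["Brisbane Lions", "Brisbane Lions", "Fremantle", "Melbourne"],
    "opponent_rating_pregame":   [6.5, 9.8, 3.1, -4.0],
    "opponent_rating_preseason": [1.0, 1.0, -3.5, 2.2],
})

# How much stronger (or weaker) each opponent was at game time than we
# expected pre-season, ignoring VPVs.
games["rating_surprise"] = (games["opponent_rating_pregame"]
                            - games["opponent_rating_preseason"])

# Summing by opponent shows which teams drove the gap between the "actual"
# and "expected" schedule strength for each side.
decomposition = (games
                 .groupby(["team", "opponent"])["rating_surprise"]
                 .sum()
                 .sort_values(ascending=False))
print(decomposition)
```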

Here we can see that, for example, the difference between the Western Bulldogs' actual and expected strength of schedule is largely attributable to the unexpected strength of the Brisbane Lions and Fremantle when the Dogs played them (both of which they played twice).

Port Adelaide's unexpectedly strong schedule is due largely to the same two teams.

Collingwood's unexpectedly easier schedule is due almost entirely to Melbourne's lower rating when the two sides met, while GWS benefits from meeting Melbourne, Richmond and Sydney when they were weaker than expected.

CONCLUSION

Though some teams did end up with harder (Dogs, Crows and Power) or easier (Pies and Giants) schedules than we assessed pre-season, overall the estimates made before Round 1 proved to be reasonably indicative of the strength of schedule most teams actually faced.