2020 : Simulating the Final Ladder After Round 14

The latest simulation results for the 2020 home-and-away season appear below.

For details about the methodologies being used, see this post from earlier in the season.

LADDER FINISHES

Using the Standard Methodology, Port Adelaide, Brisbane Lions, Geelong, Richmond, and West Coast are all now assessed as having 96% or better chances of playing Finals, and Collingwood, St Kilda, and Melbourne as all being about 75% to 80% chances.

Collingwood, Melbourne, and GWS all saw their final 8 chances rise by more than 10% points this week, while St Kilda, the Western Bulldogs, and Carlton saw theirs fall by more than 10% points.

Port Adelaide, Brisbane Lions, and Geelong are all now rated 9 in 10 or better chances for a spot in the Top 4, Richmond about 7 in 10, and West Coast about 2 in 5. St Kilda’s and West Coast’s Top 4 chances fell by over 10% points this week, while Geelong’s and Richmond’s rose by over 10% points.

For the Minor Premiership, we have Port Adelaide still at about a 1 in 2 chance, Brisbane Lions 1 in 4, and Geelong 1 in 5.

Almost exactly 12 Expected Wins now separate 1st from last.

The results for the Heretical Methodology (in which, instead of increasing the uncertainty about team ratings the further away the contest is, we update team ratings within a simulation based on the simulated outcomes) appear below.
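To make that distinction a little more concrete, here's a small Python sketch of how a single simulation replicate might differ under the two approaches. The rating scale, home-ground advantage, noise growth, and Elo-style update size are all illustrative assumptions, not the actual models used for these simulations.

```python
import random

def win_prob(rating_home, rating_away, hga=6.0):
    # Logistic win probability from a rating difference; the scale and
    # home-ground advantage figure are illustrative only.
    return 1.0 / (1.0 + 10 ** (-(rating_home - rating_away + hga) / 40.0))

def simulate_remaining_season(fixture, ratings, method="standard",
                              sigma_per_round=0.5, k=4.0, seed=None):
    """Simulate one replicate of the remaining games and return win counts.

    fixture : list of (rounds_ahead, home_team, away_team) tuples
    ratings : dict mapping team name to current rating
    method  : "standard"  - widen rating uncertainty the further ahead the game;
              "heretical" - keep uncertainty fixed but update ratings within the
                            replicate based on each simulated result (an
                            Elo-style update is assumed here).
    """
    rng = random.Random(seed)
    r = dict(ratings)
    wins = {team: 0 for team in ratings}
    for rounds_ahead, home, away in fixture:
        if method == "standard":
            # Rating noise grows the further ahead the contest is.
            sigma = sigma_per_round * rounds_ahead
            p_home = win_prob(r[home] + rng.gauss(0, sigma),
                              r[away] + rng.gauss(0, sigma))
        else:
            p_home = win_prob(r[home], r[away])
        home_wins = rng.random() < p_home
        wins[home if home_wins else away] += 1
        if method == "heretical":
            # Ratings drift within the replicate according to simulated outcomes.
            delta = k * ((1.0 if home_wins else 0.0) - p_home)
            r[home] += delta
            r[away] -= delta
    return wins
```

Under the Standard approach the noise around a team's rating grows with how far ahead the game is; under the Heretical approach the ratings themselves wander within each replicate as simulated results come in.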

They, again, tell a fairly similar story.

TEAM AND POSITION CONCENTRATION

Based on the results from the Standard Methodology, most teams are now effectively competing for between 4.5 and 7 ladder positions, down about 1 to 1.5 spots on last week.

Every team except Brisbane Lions is now effectively competing for fewer ladder positions, but most of all Fremantle (down 2.2 spots), Hawthorn (1.9), Carlton (1.6), and Melbourne (1.4). The Lions are now effectively competing for 4.4 ladder spots, up from 4.2 last week.

Similarly, there are effectively 4.5 to 7 teams competing for most of the ladder positions, the exceptions being the Top 2 and Bottom 3 places.

Every ladder position now has fewer teams effectively competing for it, but the largest declines came for 11th (1.8 fewer teams), 6th (1.7), 12th (1.6), and 10th and 15th (1.5).

Interestingly, the ladder position with the most teams effectively vying for it is 8th, the last spot in the Finals.
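For those curious about the mechanics, the "effectively competing for" figures can be thought of as an effective-number measure applied to the simulated distribution of finishes. The exact definition isn't spelled out in this post, so the sketch below assumes an exponential-of-entropy (perplexity) measure; an inverse-Simpson measure would give broadly similar, though not identical, numbers.

```python
import numpy as np

def effective_count(probs):
    """Effective number of outcomes for a probability vector, assumed here
    to be exp(Shannon entropy); other definitions are possible."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))

# position_probs[i, j] = share of simulations in which team i finished in
# ladder position j + 1 (an 18 x 18 matrix; placeholder values shown here).
position_probs = np.full((18, 18), 1 / 18)

positions_per_team = [effective_count(row) for row in position_probs]    # per team
teams_per_position = [effective_count(col) for col in position_probs.T]  # per ladder spot
```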

DEPENDENCE

This week, I’m reintroducing the dependence charts, which show how the probability of one team making (say) the eight depends on that of another team.

I’ll explain how it works by way of example. The first arrow in the chart below shows that Carlton makes the Finals about 8% of the time when the Western Bulldogs do (the foot of the arrow stem), and about 14% of the time when the Dogs don’t (the tip of the arrow head). The black line shows that, across all simulations - those where the Dogs do and those where they don’t make the Finals - the Blues play Finals about 12.5% of the time.

That tells us that the fate of the Blues is somewhat, but not much, tied to the fate of the Dogs.

Whenever the black line for a team is close to the head of the arrow, that tells us that the team we’re conditioning on isn’t particularly likely to play finals (and, when it’s close to the foot, that the team is very likely to). So here, for example, it turns out that the Dogs only play Finals in 25% of simulations (see the bottom right panel), so the unconditional probability of Carlton playing in the Finals (the black line) is close to the conditional probability assuming that the Dogs miss out.

The biggest effects then are those where the arrow is long, and extends a reasonable distance on both sides of the relevant black line.
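If you'd like to reproduce the arithmetic behind any arrow, the conditional and unconditional probabilities can be read straight off the simulation replicates. The sketch below assumes the finals outcomes are stored as one boolean array per team; the probabilities in the toy data are invented for illustration, not taken from the actual simulations.

```python
import numpy as np

def finals_dependence(made_finals, team_a, team_b):
    """Return (P(A makes finals | B does), P(A | B doesn't), P(A)),
    estimated from simulation replicates.

    made_finals : dict mapping team name to a boolean array with one
                  entry per replicate, True where that team made the eight.
    """
    a = np.asarray(made_finals[team_a])
    b = np.asarray(made_finals[team_b])
    return a[b].mean(), a[~b].mean(), a.mean()

# Toy data for 50,000 replicates (independent by construction, so the
# conditional and unconditional figures will come out nearly equal).
rng = np.random.default_rng(0)
sims = {"Carlton": rng.random(50_000) < 0.125,
        "Western Bulldogs": rng.random(50_000) < 0.25}
p_if_dogs_in, p_if_dogs_out, p_overall = finals_dependence(sims, "Carlton", "Western Bulldogs")
```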

Take, for example, the GWS line on the St Kilda panel, which extends from about 50% (if GWS make the Finals) to about 85% (if GWS miss), a swing of about -25% points to +10% points for St Kilda relative to its unconditional probability. Similarly, if we look at GWS’s panel, we see that St Kilda’s fate has a sizeable effect on the Giants’.

We also see a relatively large dependence between the fates of Melbourne and GWS.

We can do the same thing, this time looking at spots in the Top 4.

Here, the big dependency is clearly between Richmond and West Coast.

WINS AND FINAL RANK

Also included this week, for the first time this season, is a look at what the Standard Methodology simulations reveal about the relationship between the number of wins a team records in the home-and-away season and where it finishes on the ladder.

In this chart, the darkness of a tile for a given team is based on the number of times across the 50,000 simulations that this team finished with a particular number of wins (the y-axis) and in a particular ladder position (the x-axis).
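If it helps, here's a hedged sketch of how the underlying tallies could be assembled for one team; the array names and shapes are assumptions about the simulation output, not its actual format.

```python
import numpy as np

def wins_by_position_counts(sim_wins, sim_positions, max_wins=17, n_positions=18):
    """Count how often each (wins, final ladder position) pair occurred
    for one team across simulation replicates.

    sim_wins      : integer array of home-and-away wins, one per replicate
    sim_positions : integer array of final ladder positions (1 to 18)
    Tile darkness in the chart would be proportional to these counts.
    """
    counts = np.zeros((max_wins + 1, n_positions), dtype=int)
    for w, pos in zip(sim_wins, sim_positions):
        counts[w, pos - 1] += 1
    return counts
```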

We see, for example, that Carlton often misses out when it records 9 wins, as does GWS, but that St Kilda and Melbourne less often miss out with 9 wins. Carlton and GWS both have considerably worse percentages than St Kilda and Melbourne.

We also see that West Coast never plays Finals with only 9 wins. That’s because it would require them to lose to Essendon, the Western Bulldogs, and St Kilda, all of which are teams with a reasonable shot at playing Finals (though Essendon has by far the smallest chance). It would also entail losing to North Melbourne, which would further depress their percentage. This is, of course, an unlikely scenario.

GAME IMPORTANCE

Lastly, let’s look at the games that are assessed as being most likely to influence the composition of the Finalists.

According to the latest simulations using the Standard Methodology, it’s now the Round 18 St Kilda v GWS game that is expected to have the most impact on the final Top 8 (consistent with our earlier dependency analysis).
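One simple way to score a game's importance, consistent with the dependency idea used above, is to sum across all teams the absolute swing in finals probability between the replicates where the home side wins and those where it loses; whether that's precisely the measure used for these rankings is an assumption on my part.

```python
import numpy as np

def game_importance(home_won, made_finals):
    """Sum over teams of the absolute change in finals probability between
    replicates where the home team won and those where it lost.

    home_won    : boolean array, one entry per simulation replicate
    made_finals : dict mapping team name to a boolean array of the same length
    """
    home_won = np.asarray(home_won)
    return sum(abs(np.asarray(f)[home_won].mean() - np.asarray(f)[~home_won].mean())
               for f in made_finals.values())
```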

Among the Top 10 games, GWS and St Kilda appear in 3 each.

Turning next, and lastly, to impacts on the Top 4, we see that it’s two games from Round 17 that are expected to have the most impact.

Among the Top 10 games, the Eagles appear in 4, the Tigers in 3, and the Swans, Cats, and Lions in 2 each.