A Decade of Finals

This year is the 10th season under the current system of finals, a system I think has much to recommend it. It certainly seems - justifiably, I'd argue - to favour those teams that have proven their credentials across the entire season.

The table below shows how the finals have played out over the 10 years:

Finals_2009_W1.png

This next table summarises, on a one-week-of-the-finals-at-a-time basis, how teams from each ladder position have fared:

Finals_Summary_Wk1.png

Of particular note in relation to Week 1 of the finals is the performance of teams finishing 3rd and of those finishing 7th. Only two such teams - one from 3rd and one from 7th - have been successful in their respective Qualifying and Elimination Finals.

In the matchups of 1st v 4th and 5th v 8th the outcomes have been far more balanced. In the 1st v 4th clashes, it's been the higher ranked team that has prevailed on 6 of 10 occasions, whereas in the 5th v 8th clashes, it's been the lower ranked team that's won 60% of the time.

Turning our attention next to Week 2 of the finals, we find that the news isn't great for Adelaide or Lions fans. On both those occasions when 4th has met 5th in Week 2, the team from 4th on the ladder has emerged victorious, and on the 7 occasions that 3rd has faced 6th in Week 2, the team from 3rd on the ladder has won 5 and lost only 2.

Looking more generally at the finals, it's interesting to note that no team from ladder positions 5, 7 or 8 has made it through to the Preliminary Finals and that, on the only two occasions when the team from position 6 has made it that far, neither has progressed into the Grand Final.

So, only teams from positions 1 to 4 have so far contested Grand Finals: teams from 1st on 6 occasions, teams from 2nd on 7 occasions, teams from 3rd on 3 occasions, and teams from 4th only twice.

No team finishing lower than 3rd has yet won a Flag.

The Decline of the Humble Behind

Last year, you might recall, a spate of deliberately rushed behinds prompted the AFL to review and ultimately change the laws relating to this form of scoring.

Has the change led to a reduction in the number of behinds recorded in each game? The evidence is fairly strong:

Goals and Behinds.png

So far this season we've seen 22.3 behinds per game, which is 2.6 per game fewer than we saw in 2008 and puts us on track to record the lowest average number of behinds per game since 1915. Back then, though, goals came as much more of a surprise, so a spectator at an average game in 1915 could expect to witness only 16 goals to go along with the 22 behinds. Happy days.

This year's behind decline continues a trend during which the number of behinds per game has dropped from a high of 27.3 per game in 1991 to its current level, a full 5 behinds fewer, interrupted only by occasional upticks such as the 25.1 behinds per game recorded in 2007 and the 24.9 recorded in 2008.

While behind numbers have been falling recently, goals per game have also trended down - from 29.6 in 1991 to this season's current average of 26.8. Still, AFL followers can expect to witness more goals than behinds in most games they watch. This wasn't always the case. Not until 1969 was there a single season with more goals than behinds, and not until 1976 did such an outcome become a regular occurrence. In only one season since then, 1981, have fans endured more behinds than goals across the entire season.

On a game-by-game basis, 90 of 128 games this season, or a smidge over 70%, have produced more goals than behinds. Four more games have produced an equal number of each.

As a logical consequence of all these trends, behinds have had a significantly smaller impact on the result of games. This is evidenced by the chart below, which shows the percentage of scoring attributable to behinds falling from above 20% in the very early seasons, to around 15% across the period 1930 to 1980, to this season's 12.2% - the second-lowest percentage of all time, surpassed only by the 11.9% of season 2000.

Behinds PC.png
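For anyone who'd like to reproduce that percentage, it's just behinds as a share of total points, with a goal worth 6 points and a behind worth 1. A quick check in Python using this season's per-game averages quoted in this post:

```python
# Behinds' share of scoring = behinds / (6 * goals + behinds).
# Figures are this season's per-game averages from the text above.
goals_per_game = 26.8
behinds_per_game = 22.3

behind_share = behinds_per_game / (6 * goals_per_game + behinds_per_game)
print(f"{behind_share:.1%}")  # 12.2%
```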

(There are more statistical analyses of the AFL on MAFL Online's sister site at MAFL Stats.)

Is the Competition Getting More Competitive?

We've talked before about the importance of competitiveness in the AFL and the role that this plays in retaining fans' interest because they can legitimately believe that their team might win this weekend (Melbourne supporters aside).

Last year we looked at a relatively complex measure of competitiveness that was based on the notion that competitive balance should produce competition ladders in which the points are spread across teams rather than accruing disproportionately to just a few. Today I want to look at some much simpler diagnostics based on margins of victory.

Firstly, let's take a look at the average victory margin per game across every season of the VFL/AFL.

Average_Victory_Margin.png

The trend since about the mid 1950s has been increasing average victory margins, though this seems to have been reversed at least a little over the last decade or so. Notwithstanding this reversal, in historical terms, we saw quite high average victory margins in 2008. Indeed, last year's average margin of 35.9 points was the 21st highest of all time.

Looking across the last decade, the lowest average victory margin came in 2002 when it was only 31.7 points, a massive 4 points lower than we saw last year. Post WWII, the lowest average victory margin was 23.2 points in 1957, the season in which Melbourne took the minor premiership with a 12-1-5 record.

Averages can, of course, be heavily influenced by outliers, in particular by large victories. One alternative measure of the closeness of games that avoids these outliers is the proportion of games that are decided by less than a goal or two. The following chart provides information about such measures. (The purple line shows the percentage of games won by 11 points or fewer and the green line shows the percentage of games won by 5 points or fewer. Both include draws.)

Close_Games.png
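If you want to compute measures like these yourself, all you need is a list of final margins. A minimal sketch in Python, using made-up margins rather than the real data behind the chart (draws have a margin of 0, so both measures pick them up automatically):

```python
# Proportion of games decided by 5 or by 11 points or fewer.
margins = [3, 44, 0, 12, 7, 28, 5, 61, 11, 2, 90, 17]  # illustrative only

pct_within_5 = sum(m <= 5 for m in margins) / len(margins)    # green line
pct_within_11 = sum(m <= 11 for m in margins) / len(margins)  # purple line

print(f"5 points or fewer:  {pct_within_5:.1%}")
print(f"11 points or fewer: {pct_within_11:.1%}")
```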

Consistent with what we found in the chart of average victory margins, we can see here a general trend towards fewer close games since about the mid 1950s. We can also see an increase in the proportion of close games in the last decade.

Again we also find that, in historical terms, the proportion of close games that we're seeing is relatively low. The proportion of games that finished with a margin of 5 points or fewer in 2008 was just 10.8%, which ranks equal 66th (from 112 seasons). The proportion that finished with a margin of 11 points or fewer was just 21.1%, which ranks an even lowlier 83rd.

On balance then, I think you'd have to conclude that the AFL competition is not generally getting closer, though there are some signs that the situation has been improving in the last decade or so.

Winners' Share of Scoring

You might recall from seasons past my commenting on what I've claimed to be a startling regularity in AFL scoring, specifically, the proportion of scoring shots recorded by winning teams.

In 2008, winning teams racked up 57.3% of all scoring shots, while in 2007 the figure was 56.6%, and in 2006 it was 56.7%. Across the period 1999 to 2008 this percentage bounced around in a range between 56.4% and 57.8%. By any standard that's remarkable regularity.
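The calculation itself is simple: tally the scoring shots - goals plus behinds - registered by the winning team in each game, then divide by the scoring shots registered in all games. A Python sketch with fabricated scorelines (note that I've assumed drawn games count in the denominator but have no winner; the published figures may treat draws differently):

```python
# Winners' share of scoring shots across a set of games.
# Each game is ((goals, behinds), (goals, behinds)) - fabricated data.
games = [
    ((14, 10), (9, 12)),
    ((11, 15), (12, 8)),
    ((16, 9), (10, 14)),
]

winner_shots, total_shots = 0, 0
for (g1, b1), (g2, b2) in games:
    pts1, pts2 = 6 * g1 + b1, 6 * g2 + b2   # points decide the winner
    shots1, shots2 = g1 + b1, g2 + b2       # scoring shots
    total_shots += shots1 + shots2
    if pts1 != pts2:                        # a draw has no winner
        winner_shots += shots1 if pts1 > pts2 else shots2

print(f"Winners' share of scoring shots: {winner_shots / total_shots:.1%}")
```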

I've recently come into possession of the scores for the entire history of the VFL/AFL competition in a readily analysable form - and by now you surely know how dangerous that's gotta be - so it seemed only natural to see if this regularity persisted into earlier seasons (assuming that it makes sense for something to persist into the past).

Below is a chart showing (in purple) the percentage of scoring shots registered by winning teams in each of the seasons 1897 through 2008. (The red line shows the proportion of goals that they scored, and the green line shows the proportion of behinds.)

Winners_Scoring.png

So, apart from the more extreme dominance of winning teams in the first decade or so of the competition, and a few other aberrant seasons over the next two decades, we have certainly seen remarkable stability in the percentage we've been discussing. Indeed, in the period 1927 to 2008, the percentage of scoring shots registered by winning teams has never been outside the range 55.0% to 59.6%. That surely almost establishes this phenomenon as a Law of Footy.

For those of you who prefer to digest your data in tabular form (preferably taken with meals), here's a decade-by-decade summary of the data.

Winners_Scoring_Table.png

The recent peak in winning teams' share of scoring was witnessed in 1995 and it came not as a consequence of a spike in 6-pointer dominance but instead from a spike in winning teams' share of behinds. In 1995 winning teams scored 57% of all behinds, which is about 2 to 4 percentage points higher than anything we've witnessed since. 1995 was the year that Carlton won the minor premiership kicking 317 behinds, Geelong finished runners-up kicking 338, and Richmond and Essendon, finishing in 3rd and 4th, added another 600 between them. By way of context, that's almost 75 more behinds than the top 4 of Geelong, Hawthorn, Western Bulldogs and St Kilda managed in 2008.

Regularity also aptly describes the history of the percentage of goals kicked by winning teams across the seasons (the red line in the chart). Again looking at the entire period since 1927, this percentage has never strayed from the righteous range of 57.0% to 61.8%.

Winning teams' share of behinds (the green line) has been, relatively speaking, quite variable, ranging from 51.9% to 58.2% in the period 1927 to the present, which once again demonstrates that it's goals and not behinds that win footy games.

How Important is Pre-Season Success?

With the pre-season now underway it's only right that I revisit the topic of the extent to which pre-season performance predicts regular season success.

Here's the table with the relevant data:

Pre-Season.png

The macro view tells us that, of the 21 pre-season winners, only 14 have gone on to participate in the regular season finals in the same year and, of the 21 pre-season runners-up, only 12 have managed the same feat. When you consider that roughly one-half of the teams have made the regular season finals in each year - slightly fewer from 1988 to 1993, and slightly more from 1994 - those stats look fairly unimpressive.

But a closer, team-by-team view shows that Carlton alone can be blamed for 3 of the 7 occasions on which the pre-season winner has missed the regular season finals, and Adelaide and Richmond can be blamed for 4 of the 9 occasions on which the pre-season runner-up has missed the regular season finals.

Pre-Season_Team.png

So, unless you're a Crows, Blues or Tigers supporter, you should be at least a smidge joyous if your team makes the pre-season final; if history's any guide, the chances are good that your team will get a ticket to the ball in September.

It's one thing to get a ticket but another thing entirely to steal the show. Pre-season finalists can, collectively, lay claim to five flags but, as a closer inspection of the previous table will reveal, four of these flags have come from just two teams, Essendon and Hawthorn. What's more, no flag has come to a pre-season finalist since the Lions managed it in 2001.

On balance then, I reckon I'd rather the team that I supported remembered that there's a "pre" in pre-season.

A Little AFL/VFL History

Every so often this year I'll be diving into the history of the VFL/AFL to come up with obscure and conversation-stopping facts for you to use at the next social event you attend.

For example, do you know the most common score in AFL history? It's 12.12 (84) and has been a team's final score about 0.88% of the time (counting two scores for each game in the denominator for that percentage). What if we restrict our attention to more recent seasons, say 1980 to 2008? It's 12.12 again (84), only now its prevalence is 0.98%. Last year though we managed only a single 12.12 (84) score, courtesy of St Kilda in Round 14.
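If you've a mind to run this sort of tally yourself, it takes only a few lines: collect every final score - two per game - and pick the most common (goals, behinds) pair. A sketch with made-up scores standing in for the full history:

```python
# Most common final score across a sample of games (two scores per game).
from collections import Counter

final_scores = [(12, 12), (14, 10), (12, 12), (9, 15), (12, 12), (11, 13)]  # made up

(goals, behinds), count = Counter(final_scores).most_common(1)[0]
prevalence = count / len(final_scores)  # two scores per game in the denominator

print(f"Most common: {goals}.{behinds} ({6 * goals + behinds}) "
      f"- {prevalence:.2%} of all scores")
```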

While we're on the topic of scores, which season do you think produced the highest average score per team? It was 1982 and the average was 112.07 points. The trend since that season has been steadily downwards with the nadir being in 1997 when the average was 90.37 points.

Average_Team_Scores.png

From season averages to individual game scores, here are a couple of doozies. In May of 1919, Geelong took on St Kilda in a Round 5 clash at Corio Oval. The first quarter failed to produce a goal from either team and saw Geelong lead 0.6 to 0.2. St Kilda found their range - relatively speaking - in the second quarter to lead 3.4 to 0.9 at the main break. One need speculate only briefly about the thrust of the coach's half-time address to the Cats.

The speech clearly didn't help, however, as Geelong continued to accumulate only singles for the remaining two quarters, finally emerging goal-less and defeated, 0.18 to 6.10.

Just over two years later, in July of 1921, St Kilda swapped roles and matched the Cats' ineptitude, eventually going down 0.18 to Fitzroy's 6.8 in front of around 6,000 startled fans.

If you're looking for more sustained inaccuracy you'd be after the South Melbourne team of 1900. They managed 59.127 for the entire season, a 31.7% accuracy rate.

In contrast, in 1949 the Hawks put on a spectacular display of straight kicking at Glenferrie Oval, finishing with 7.0 for the game. Regrettably, their opponents, Essendon, clearly with no sense of aesthetics, repeatedly sprayed the ball at goal, finishing 70-point victors by bagging a woefully inaccurate 16.16.

Again, turning from the single game to an entire season, plaudits must go to the St Kilda team of 2004, who registered 409.253 or 61.8% for the season. But, as the Hawks discovered, accuracy does not preordain success: St Kilda went out in the Preliminary Final to Port by 6 points.
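For the record, the accuracy rate used in both cases is simply goals as a proportion of scoring shots:

```python
# Accuracy = goals / (goals + behinds), for the two season tallies above.
for team, goals, behinds in [("South Melbourne 1900", 59, 127),
                             ("St Kilda 2004", 409, 253)]:
    print(f"{team}: {goals / (goals + behinds):.1%}")
# South Melbourne 1900: 31.7%
# St Kilda 2004: 61.8%
```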

The Team of the Decade

Over the break I came across what must surely be amongst the simplest, most practical team rating systems.

It's based on the general premise that a team's rating should be proportional to the sum of the ratings of the teams that it has defeated. In the variant that I've used, each team's rating is proportional to the sum of the ratings of the teams it defeated on every occasion they met during the season, plus one-half of the rating of any team with which it drew (where they met just once) or with which it split the results, winning once and losing once (where they met twice).

(Note that I've used only regular home-and-away season games for these ratings and that I've made no allowance for home team advantage.)

This method produces relative, not absolute, ratings so we can arbitrarily set any one team's rating - say the strongest team's - to be 1, and then define every other team's rating relative to this. All ratings are non-negative.

Using the system requires some knowledge of matrix algebra, but that's about it. (For the curious, the ratings involve solving the equation Ax = kx, where A is a non-negative matrix with 0s on the diagonal in which Aij is the proportion of games between teams i and j that were won by i, so that Aji = 1 - Aij; x is the ratings vector; and k is a constant. The solution for x that we want is the principal eigenvector of A - the one associated with A's largest eigenvalue - which, because A is non-negative, has entries that can all be taken to be non-negative. We normalise x by dividing each element by the maximum element in x.)
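For the practically minded, here's a small Python sketch of the whole procedure. The teams and results are invented purely for illustration, and numpy's eigendecomposition does the heavy lifting:

```python
# Eigenvector team ratings: build A from head-to-head results, take the
# principal eigenvector, and normalise so the strongest team rates 1.
import numpy as np

teams = ["Alpha", "Bravo", "Charlie", "Delta"]   # made-up teams
idx = {t: i for i, t in enumerate(teams)}

# (team1, team2, points1, points2) - fabricated season results
results = [
    ("Alpha", "Bravo", 110, 80),
    ("Bravo", "Alpha", 90, 95),
    ("Alpha", "Charlie", 100, 70),
    ("Charlie", "Delta", 85, 85),   # a draw counts as half a win each
    ("Delta", "Bravo", 75, 88),
    ("Charlie", "Bravo", 60, 99),
]

n = len(teams)
wins = np.zeros((n, n))    # wins[i, j]: i's "wins" over j (draws add 0.5)
games = np.zeros((n, n))   # games[i, j]: meetings between i and j

for t1, t2, p1, p2 in results:
    i, j = idx[t1], idx[t2]
    games[i, j] += 1
    games[j, i] += 1
    if p1 > p2:
        wins[i, j] += 1
    elif p2 > p1:
        wins[j, i] += 1
    else:
        wins[i, j] += 0.5
        wins[j, i] += 0.5

# A[i, j] = proportion of the i-v-j meetings won by i (0 if they never met)
A = np.divide(wins, games, out=np.zeros_like(wins), where=games > 0)

# Principal eigenvector of the non-negative matrix A
eigvals, eigvecs = np.linalg.eig(A)
ratings = np.abs(np.real(eigvecs[:, np.argmax(np.real(eigvals))]))
ratings /= ratings.max()   # strongest team gets a rating of 1

for team, rating in sorted(zip(teams, ratings), key=lambda p: -p[1]):
    print(f"{team:8s} {rating:.3f}")
```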

Applying this technique to the home-and-away games of the previous 10 seasons, we obtain the following ratings:

Team_of_the_Decade_Ratings.png

Now bear in mind that it makes little sense to directly compare ratings across seasons, so a rating of, say, 0.8 this year means only that the team was in some sense 80% as good as the best team this year; it doesn't mean that the team was any better or worse than a team rating 0.6 last year unless you're willing to make some quantitative assumption about the relative merits of this year's and last year's best teams.

What we can say with some justification, however, is that Geelong was stronger relative to Port in 2007 than Geelong was relative to the Hawks in 2008. The respective Grand Finals would seem to support this assertion.

So, looking across the 10 seasons, we find that:

  • 2003 produced the greatest ratings difference between the best (Port) and second-best (Lions) teams
  • 2001 produced the smallest ratings difference between the best (Essendon) and second-best (Lions) teams
  • Carlton's drop from 4th in 2001 to 16th in 2002 is the most dramatic decline
  • Sydney's rise from 14th in 2002 to 3rd in 2003 is the most dramatic rise

Perhaps most important of all we can say that the Brisbane Lions are the Team of the Decade.

Here is the ratings table above in ranking form:

Team_of_the_Decade_Rankings.png

What's interesting about these rankings from a Brisbane Lions point of view is that only twice has the team been ranked 10th or worse. Of particular note is that, in seasons 2005 and 2008, Brisbane rated in the top 8 but did not make the finals. In 2008 the Lions won all their encounters against 3 of the finalists and shared the honours with 2 more, so there seems to be some justification for their lofty 2008 rating at least.

Put another way, based on the ratings, Brisbane should have participated in all but 2 of the past 10 finals series. No other team can make that claim.

Second-best Team of the Decade is Port Adelaide, who were the highest-rated team in 3 consecutive seasons: 2002, 2003 and 2004. Third-best is Geelong, largely on the strength of their recent performances, which have placed them amongst the top 5 teams in all but 1 of the previous 5 seasons.

The title of Worst Team of the Decade goes to Carlton, who've finished ranked 10th or below in each of the previous 7 seasons. Next worst is Richmond, who have a similar record, relieved only by a 9th-placed finish in 2006.