Teams' All-Time Records

At this time of year, before we fixate on the week-to-week triumphs and travesties of yet another AFL season, it's interesting to look at the varying fortunes of all the teams that have ever competed in the VFL/AFL.

The table below provides the Win, Draw and Loss records of every team.

All_Time_WDL.png

As you can see, Collingwood has the best record of all the teams, having won almost 61% of the games in which it has played, a full percentage point better than second-placed Carlton. Collingwood has also played more games than any other team and will be the first team to have played 2,300 games when Round 5 rolls around this year.

Amongst the relative newcomers to the competition, West Coast and Port Adelaide - and to a lesser extent, Adelaide - have all performed well having won considerably more than half of their matches.

Sticking with newcomers but dipping down to the other end of the table we find Fremantle with a particularly poor record. They've won just under 40% of their games and, remarkably, have yet to register a draw. (Amongst current teams, Essendon have recorded the highest proportion of drawn games at 1.43%, narrowly ahead of Port Adelaide with 1.42%. After Fremantle, the team with the next lowest proportion of drawn games is Adelaide at 0.24%. In all, 1.05% of games have finished with scores tied.)

Lower still we find the Saints, a further 1.3 percentage points behind Fremantle. It took St Kilda 48 games before it registered its first win in the competition, which should surely have been some sort of a hint to fans of the pain that was to follow across two world wars and a depression (maybe two). Amongst those 112 seasons of pain there's been just the sole anaesthetising flag, in 1966.

Here then are a couple of milestones that we might witness this year that will almost certainly go unnoticed elsewhere:

  • Collingwood's 2,300th game (and 1,400th win or, if the season's a bad one for them, 900th loss)
  • Carlton's 900th loss
  • West Coast's 300th win
  • Port Adelaide's 300th game
  • Geelong's and Sydney's 2,200th game
  • Adelaide's 200th loss
  • Richmond's 1,000th loss (if they fail to win more than one match all season)
  • Fremantle's 200th loss

Granted, few of those are truly banner events, but if AFL commentators were as well supported by statisticians as, say, those in Major League Baseball, you can bet they'd get a mention, much as equally arcane statistics are sprinkled liberally through the three hours of dead time between pitches.

Which Quarter Do Winners Win?

Today we'll revisit yet another chestnut and we'll analyse a completely new statistic.

First, the chestnut: which quarter do winning teams win most often? You might recall that for the previous four seasons the answer has been the 3rd quarter, although it was a very close-run thing last season, when the results for the 3rd and 4th quarters were nearly identical.

How then does the picture look if we go back across the entire history of the VFL/AFL?

Qtrs_Won_By_Winners.png

It turns out that the most recent epoch, spanning the seasons 1993 to 2008, has been one in which winning teams have tended to win more 3rd quarters than any other quarter. In fact, it was the quarter won most often in nine of those 16 seasons.

This, however, has not at all been the norm. In four of the other six epochs it has been the 4th quarter that winning teams have tended to win most often; in the remaining two epochs the 4th quarter has been the second most commonly won quarter.

But the 3rd quarter has rarely been far behind the 4th, and its resurgence in the most recent epoch has left it narrowly in second place in the all-time statistics.

A couple of other points are worth making about the table above. Firstly, it's interesting to note how much more frequently winning teams now win the 1st quarter than they did in epochs past. Successful teams nowadays must perform from the first bounce.

Secondly, there's a clear trend over the past 4 epochs for winning teams to win a larger proportion of all quarters, from about 66% in the 1945 to 1960 epoch to almost 71% in the 1993 to 2008 epoch.

Now on to something a little different. While I was conducting the previous analysis, I got to wondering if there'd ever been a team that had won a match in which it had scored more points than its opponent in just a solitary quarter. Incredibly, I found that it's a far more common occurrence than I'd have estimated.

Number_Of_Qtrs_Won_By_Winners.png

The red line shows, for every season, the percentage of games in which the winner won just a solitary quarter (they might or might not have drawn any of the others). The average percentage across all 112 seasons is 3.8%. There were five such games last season, in four of which the winner didn't even manage to draw any of the other three quarters. One of these games was the Round 19 clash between Sydney and Fremantle in which Sydney lost the 1st, 2nd and 4th quarters but still got home by 2 points on the strength of a 6.2 to 2.5 3rd term.
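For concreteness, here's a minimal sketch of the quarter-counting logic behind this statistic. The function name and the quarter-by-quarter scores are my own invention; the real analysis, of course, runs over every game in the archive.

```python
def quarters_won(team_qtrs, opp_qtrs):
    """Count quarters in which a team outscored or tied its opponent.

    Each argument is a list of four per-quarter point totals (not
    cumulative scores). Returns (quarters_won, quarters_drawn).
    """
    won = sum(t > o for t, o in zip(team_qtrs, opp_qtrs))
    drawn = sum(t == o for t, o in zip(team_qtrs, opp_qtrs))
    return won, drawn

# A hypothetical game: the winner takes only the 3rd quarter, but by
# enough to offset three narrow quarter losses (totals: 89 v 87).
winner = [20, 18, 38, 13]
loser = [22, 20, 17, 28]
assert sum(winner) > sum(loser)
print(quarters_won(winner, loser))  # (1, 0): a "solitary quarter" win
```

Tallying the proportion of games per season for which this function returns a first element of 1 gives the red line in the chart.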

You can also see from the chart the upward trend since about the mid 1930s in the percentage of games in which the winner wins all four quarters, which is consistent with the general rise, albeit much less steadily, in average victory margins over that same period that we saw in an earlier blog.

To finish, here's the same data from the chart above summarised by epoch:

Number_Of_Qtrs_Won_By_Winners_Table.png

Is the Competition Getting More Competitive?

We've talked before about the importance of competitiveness in the AFL and the role that this plays in retaining fans' interest because they can legitimately believe that their team might win this weekend (Melbourne supporters aside).

Last year we looked at a relatively complex measure of competitiveness that was based on the notion that competitive balance should produce competition ladders in which the points are spread across teams rather than accruing disproportionately to just a few. Today I want to look at some much simpler diagnostics based on margins of victory.

Firstly, let's take a look at the average victory margin per game across every season of the VFL/AFL.

Average_Victory_Margin.png

The trend since about the mid 1950s has been increasing average victory margins, though this seems to have been reversed at least a little over the last decade or so. Notwithstanding this reversal, in historical terms, we saw quite high average victory margins in 2008. Indeed, last year's average margin of 35.9 points was the 21st highest of all time.

Looking across the last decade, the lowest average victory margin came in 2002 when it was only 31.7 points, a massive 4 points lower than we saw last year. Post WWII, the lowest average victory margin was 23.2 points in 1957, which was the season in which Melbourne took the minor premiership with a 12-1-5 record.

Averages can, of course, be heavily influenced by outliers, in particular by large victories. One alternative measure of the closeness of games that avoids these outliers is the proportion of games that are decided by no more than a goal or two. The following chart provides information about such measures. (The purple line shows the percentage of games won by 11 points or fewer and the green line shows the percentage of games won by 5 points or fewer. Both include draws.)

Close_Games.png

Consistent with what we found in the chart of average victory margins we can see here a general trend towards fewer close games since about the mid 1950s. We can also see an increase in the proportion of close games in the last decade.

Again we also find that, in historical terms, the proportion of close games that we're seeing is relatively low. The proportion of games that finished with a margin of 5 points or fewer in 2008 was just 10.8%, which ranks equal 66th (from 112 seasons). The proportion that finished with a margin of 11 points or fewer was just 21.1%, which ranks an even lowlier 83rd.
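Proportions like these are simple to compute from a list of final margins. Here's a minimal sketch; the function name and the ten margins are invented for illustration.

```python
def close_game_rates(margins, thresholds=(5, 11)):
    """Proportion of games with a final margin at or under each threshold.

    `margins` are absolute victory margins in points (0 for a draw), so
    draws count as close games, matching the chart's convention.
    """
    n = len(margins)
    return {t: sum(m <= t for m in margins) / n for t in thresholds}

# A hypothetical season of ten final margins:
margins = [2, 0, 9, 33, 51, 4, 12, 76, 11, 27]
print(close_game_rates(margins))  # {5: 0.3, 11: 0.5}
```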

On balance then I think you'd have to conclude that the AFL competition is not generally getting closer though there are some signs that the situation has been improving in the last decade or so.

Winners' Share of Scoring

You might recall from seasons past my commenting on what I've claimed to be a startling regularity in AFL scoring, specifically, the proportion of scoring shots recorded by winning teams.

In 2008, winning teams racked up 57.3% of all scoring shots, while in 2007 the figure was 56.6%, and in 2006 it was 56.7%. Across the period 1999 to 2008 this percentage bounced around in a range between 56.4% and 57.8%. By any standard that's remarkable regularity.
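A percentage like that is easy to compute once you have game-by-game scores. Here's a minimal sketch; the function name and the two games are invented, and drawn games are ignored for simplicity.

```python
def winners_scoring_shot_share(games):
    """Winning teams' share of all scoring shots across a set of games.

    Each game is ((goals, behinds), (goals, behinds)), winner first;
    a scoring shot is a goal or a behind.
    """
    winner_shots = sum(g + b for (g, b), _ in games)
    total_shots = winner_shots + sum(g + b for _, (g, b) in games)
    return winner_shots / total_shots

# Two hypothetical games: winner 15.10 v 10.12, winner 12.12 v 11.9
games = [((15, 10), (10, 12)), ((12, 12), (11, 9))]
print(round(winners_scoring_shot_share(games), 3))  # 49 of 91 shots
```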

I've recently come into possession of the scores for the entire history of the VFL/AFL competition in a readily analysable form - and by now you surely know how dangerous that's gotta be - so it seemed only natural to see if this regularity persisted into earlier seasons (assuming that it makes sense for something to persist into the past).

Below is a chart showing (in purple) the percentage of scoring shots registered by winning teams in each of the seasons 1897 through 2008. (The red line shows the proportion of goals that they scored, and the green line shows the proportion of behinds.)

Winners_Scoring.png

So, apart from the more extreme dominance of winning teams in the first decade or so of the competition, and a few other aberrant seasons over the next two decades, we have certainly seen remarkable stability in the percentage we've been discussing. Indeed, in the period 1927 to 2008, the percentage of scoring shots registered by winning teams has never been outside the range 55.0% to 59.6%. That surely almost establishes this phenomenon as a Law of Footy.

For those of you who prefer to digest your data in tabular form (preferably taken with meals), here's a decade-by-decade summary of the data.

Winners_Scoring_Table.png

The recent peak in winning teams' share of scoring was witnessed in 1995 and it came not as a consequence of a spike in 6-pointer dominance but instead from a spike in winning teams' share of behinds. In 1995 winning teams scored 57% of all behinds, which is about 2 to 4 percentage points higher than anything we've witnessed since. 1995 was the year that Carlton won the minor premiership kicking 317 behinds, Geelong finished runners-up kicking 338, and Richmond and Essendon, finishing 3rd and 4th, kicked another 600 between them. By way of context, that's almost 75 more behinds than the top 4 of Geelong, Hawthorn, Western Bulldogs and St Kilda managed in 2008.

Regularity also aptly describes the history of the percentage of goals kicked by winning teams across the seasons (the red line in the chart). Again looking at the entire period since 1927, this percentage has never strayed from the righteous range of 57.0% to 61.8%.

Winning teams' share of behinds (the green line) has been, relatively speaking, quite variable, ranging from 51.9% to 58.2% in the period 1927 to the present, which once again demonstrates that it's goals and not behinds that win footy games.

How Important is Pre-Season Success?

With the pre-season now underway it's only right that I revisit the topic of the extent to which pre-season performance predicts regular season success.

Here's the table with the relevant data:

Pre-Season.png

The macro view tells us that, of the 21 pre-season winners, only 14 of them have gone on to participate in the regular season finals in the same year and, of the 21 pre-season runners-up, only 12 of them have made that same category. When you consider that roughly one-half of the teams have made the regular season finals in each year - slightly fewer from 1988 to 1993, and slightly more from 1994 onwards - those stats look fairly unimpressive.

But a closer, team-by-team view shows that Carlton alone can be blamed for 3 of the 7 occasions on which the pre-season winner has missed the regular season finals, and Adelaide and Richmond can be blamed for 4 of the 9 occasions on which the pre-season runner-up has missed the regular season finals.

Pre-Season_Team.png

So, unless you're a Crows, Blues or Tigers supporter, you should be at least a smidge joyous if your team makes the pre-season final; if history's any guide, the chances are good that your team will get a ticket to the ball in September.

It's one thing to get a ticket but another thing entirely to steal the show. Pre-season finalists can, collectively, lay claim to five flags but, as a closer inspection of the previous table will reveal, four of these flags have come from just two teams, Essendon and Hawthorn. What's more, no flag has come to a pre-season finalist since the Lions managed it in 2001.

On balance then, I reckon I'd rather the team that I supported remembered that there's a "pre" in pre-season.

Who Fares Best In The Draw?

Well I guess it's about time we had a look at the AFL draw for 2009.

I've summarised it in the following schematic:

Draw_Summary.png

The numbers show how many times a particular matchup occurs at home, away or at a neutral venue in terms of the team shown in the leftmost column. So, for example, looking at the first row, Adelaide play the Lions only once during 2009 and it's an away game for Adelaide.

For the purpose of assessing the relative difficulty of each team's schedule, I'll use the final MARS Ratings for 2008, which were as follows:

Ratings.png

Given those, the table below shows the average MARS Rating of the opponents that each team faces at home, away and at neutral venues.

Strength_of_Schedule.png
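As a sketch of how the strength-of-schedule numbers in the table above can be produced, here's the average-opponent-rating calculation split by venue type. The function name, team names, ratings and fixtures below are all invented for illustration; they are not the real 2008 MARS Ratings or the real 2009 draw.

```python
from collections import defaultdict

def schedule_strength(fixtures, ratings):
    """Average rating of each team's opponents, split by venue type.

    fixtures: list of (team, opponent, venue) tuples, with venue one of
    'home', 'away' or 'neutral'; ratings: dict mapping team -> rating.
    """
    opp_ratings = defaultdict(lambda: defaultdict(list))
    for team, opp, venue in fixtures:
        opp_ratings[team][venue].append(ratings[opp])
    return {team: {venue: sum(rs) / len(rs) for venue, rs in by_venue.items()}
            for team, by_venue in opp_ratings.items()}

# Invented ratings and a tiny fixture list:
ratings = {'A': 1020.0, 'B': 1000.0, 'C': 980.0}
fixtures = [('A', 'B', 'home'), ('A', 'C', 'away'), ('A', 'B', 'away')]
print(schedule_strength(fixtures, ratings)['A'])  # {'home': 1000.0, 'away': 990.0}
```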

So, based solely on average opponent rating, regardless of venue, the Crows have the worst of the 2009 draw. The teams they play only once include five of the bottom six MARS-ranked teams: Brisbane (11th), Richmond (12th), Essendon (14th), West Coast (15th) and Melbourne (16th). One mitigating factor for the Crows is that they tend to play stronger teams at home: they have the 2nd toughest home schedule and only the 6th toughest away and neutral-venue schedules.

Melbourne fare next worst in the draw, meeting four of the bottom five teams (excluding themselves) just once. They too, however, tend to face stronger teams at home and relatively weaker teams away, though their neutral-venue schedule is also quite tough (St Kilda and Sydney).

Richmond, in contrast, get the best of the draw, avoiding a second contest with six of the top eight teams and playing each of the bottom four teams twice.

St Kilda's draw is the next best and sees them play four of the top-eight teams only once and each of the bottom three teams twice.

Looking a little more closely and differentiating home games from away games, we find that the Bulldogs have the toughest home schedule but also the easiest away schedule. Port Adelaide have the easiest home schedule and Sydney have the toughest away schedule.

Generally speaking, last year's finalists have fared well in the draw, with five of them having schedules ranked 10th or lower. Adelaide, Sydney and, to a lesser extent, the Bulldogs are the exceptions. It should be noted that higher-ranked teams always have a relative advantage over other teams in that their schedules exclude games against themselves. 

A Little AFL/VFL History

Every so often this year I'll be diving into the history of the VFL/AFL to come up with obscure and conversation-stopping facts for you to use at the next social event you attend.

For example, do you know the most common score in AFL history? It's 12.12 (84), which has been a team's final score about 0.88% of the time (counting two scores for each game in the denominator for that percentage). What if we restrict our attention to more recent seasons, say 1980 to 2008? It's 12.12 (84) again, only now its prevalence is 0.98%. Last year though we managed only a single 12.12 (84) score, courtesy of St Kilda in Round 14.
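Tallies like these are straightforward once the scores are in analysable form. Here's a minimal sketch; the function name and the handful of (goals, behinds) team scores are invented, and the real calculation runs over every team score in every game.

```python
from collections import Counter

def most_common_score(scores):
    """Most frequent (goals, behinds) final score and its prevalence.

    `scores` holds one (goals, behinds) entry per team per game, so a
    season of N games contributes 2N entries to the denominator.
    """
    counts = Counter(scores)
    (score, n), = counts.most_common(1)
    return score, n / len(scores)

# An invented mini-sample of team scores:
scores = [(12, 12), (10, 8), (12, 12), (15, 10),
          (9, 14), (12, 12), (11, 7), (13, 9)]
score, prevalence = most_common_score(scores)
print(score, prevalence)  # (12, 12) 0.375
```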

While we're on the topic of scores, which season do you think produced the highest average score per team? It was 1982 and the average was 112.07 points. The trend since that season has been steadily downwards with the nadir being in 1997 when the average was 90.37 points.

Average_Team_Scores.png

From season averages to individual game scores, here are a couple of doozies. In May of 1919, Geelong took on St Kilda in a Round 5 clash at Corio Oval. The first quarter failed to produce a goal from either team and saw Geelong lead 0.6 to 0.2. St Kilda found their range - relatively speaking - in the second quarter to lead 3.4 to 0.9 at the main break. One need speculate only briefly about the thrust of the Cats' half-time speech from the coach.

The speech clearly didn't help, however, as Geelong continued to accumulate only singles for the remaining two quarters, finally emerging goal-less and defeated, 0.18 to 6.10.

Just over two years later, in July of 1921, St Kilda swapped roles and matched the Cats' ineptitude, eventually going down 0.18 to Fitzroy's 6.8 in front of around 6,000 startled fans.

If you're looking for more sustained inaccuracy you'd be after the South Melbourne team of 1900. They managed 59.127 for the entire season, a 31.7% accuracy rate.

In contrast, in 1949 the Hawks put on a spectacular display of straight kicking at Glenferrie Oval, finishing with 7.0 for the game. Regrettably, their opponents, Essendon, clearly with no sense of aesthetics, repeatedly sprayed the ball at goal, finishing 70-point victors by bagging a woefully inaccurate 16.16.

Again, turning from the single game to an entire season, plaudits must go to the St Kilda team of 2004, who registered 409.253 or 61.8% for the season. But, as the Hawks discovered, accuracy does not preordain success: St Kilda went out in the Preliminary Final to Port by 6 points.

The Team of the Decade

Over the break I came across what must surely be amongst the simplest, most practical team rating systems.

It's based on the general premise that a team's rating should be proportional to the sum of the ratings of the teams that it has defeated. In the variant that I've used, each team's rating is proportional to the sum, over the opponents it faced in a given season, of each opponent's rating multiplied by the proportion of their meetings that the team won, with a draw counting as half a win. So a team receives an opponent's full rating if it won every meeting, half of it if they drew their only game or split a two-game series, and nothing if it lost every meeting.

(Note that I've used only regular home-and-away season games for these ratings and that I've made no allowance for home team advantage.)

This method produces relative, not absolute, ratings so we can arbitrarily set any one team's rating - say the strongest team's - to be 1, and then define every other team's rating relative to this. All ratings are non-negative.

Using the system requires some knowledge of matrix algebra, but that's about it. (For the curious, the ratings involve solving the equation Ax = kx, where A is a matrix with 0s on the diagonal in which Aij is the proportion of games between teams i and j that were won by i, so that Aji = 1 - Aij; x is the ratings vector; and k is a constant. The solution for x that we want is the principal eigenvector of A - the one associated with its largest eigenvalue. We normalise x by dividing each element by the maximum element in x.)
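For anyone who'd like to tinker, here's a minimal sketch of the calculation using power iteration in Python with numpy. The function name and the three-team "season" are invented for illustration.

```python
import numpy as np

def eigen_ratings(A, iters=1000):
    """Approximate the principal eigenvector of A by power iteration.

    A[i, j] is the proportion of games between teams i and j won by
    team i (so A[j, i] = 1 - A[i, j] for teams that met, and the
    diagonal is 0). Ratings are scaled so the top team rates exactly 1.
    """
    x = np.ones(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= x.max()  # rescale each step so the iteration stays bounded
    return x

# A purely hypothetical three-team season: team 0 beat team 1 in both
# meetings, team 1 beat team 2, and teams 0 and 2 split their two games.
A = np.array([[0.0, 1.0, 0.5],
              [0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0]])

print(eigen_ratings(A))  # roughly [1.0, 0.62, 0.56]
```

Note how team 2 rates almost as highly as team 1 despite never beating it: splitting a series with the top team is worth nearly as much here as defeating the bottom team, which is exactly the opponent-quality weighting the method is designed to capture.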

Applying this technique to the home-and-away games of the previous 10 seasons, we obtain the following ratings:

Team_of_the_Decade_Ratings.png

Now bear in mind that it makes little sense to directly compare ratings across seasons, so a rating of, say, 0.8 this year means only that the team was in some sense 80% as good as the best team this year; it doesn't mean that the team was any better or worse than a team rating 0.6 last year unless you're willing to make some quantitative assumption about the relative merits of this year's and last year's best teams.

What we can say with some justification, however, is that Geelong was stronger relative to Port in 2007 than Geelong was relative to the Hawks in 2008. The respective GFs would seem to support this assertion.

So, looking across the 10 seasons, we find that:

  • 2003 produced the greatest ratings difference between the best (Port) and second-best (Lions) teams
  • 2001 produced the smallest ratings difference between the best (Essendon) and second-best (Lions) teams
  • Carlton's drop from 4th in 2001 to 16th in 2002 is the most dramatic decline
  • Sydney's rise from 14th in 2002 to 3rd in 2003 is the most dramatic rise

Perhaps most important of all we can say that the Brisbane Lions are the Team of the Decade.

Here is the ratings table above in ranking form:

Team_of_the_Decade_Rankings.png

What's interesting about these rankings from a Brisbane Lions point of view is that only twice has their ranking been 10th or worse. Of particular note is that, in seasons 2005 and 2008, Brisbane rated in the top 8 but did not make the finals. In 2008 the Lions won all their encounters against 3 of the finalists and shared the honours with 2 more, so there seems to be some justification for their lofty 2008 rating at least.

Put another way, based on the ratings, Brisbane should have participated in all but 2 of the past 10 finals series. No other team can make that claim.

Second-best Team of the Decade is Port Adelaide, who were the highest-rated team in three consecutive seasons: 2002, 2003 and 2004. Third-best is Geelong, largely due to their more recent performances, which have seen them amongst the top 5 teams in all but 1 of the previous 5 seasons.

The Worst Team of the Decade goes to Carlton, who've finished ranked 10th or below in each of the previous 7 seasons. Next worst is Richmond who have a similar record blemished only by a 9th-placed finish in 2006.

Surprisals in 2009

This year we'll once again be using surprisals as a way of quantifying how unpredictable the results of each round and each team have been.

In addition to measuring the surprisal of head-to-head results, which is what we did last year, we'll also look at how surprising each result has been from a line betting point of view - a measure of how accurately the bookies have been able to predict not just the winner but the margin too.
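As a refresher, the surprisal of a result is -log2 of the probability that was assigned to it, measured in bits, so low-probability results carry more surprisal. Here's a minimal sketch, assuming head-to-head probabilities are derived from bookmaker prices with the overround removed proportionally; the function names and the prices are my own invention.

```python
import math

def surprisal(p):
    """Surprisal in bits: -log2 of the probability assigned to the result."""
    return -math.log2(p)

def prob_from_prices(price_team, price_opp):
    """Implied head-to-head probability, overround removed proportionally."""
    return (1 / price_team) / (1 / price_team + 1 / price_opp)

p = prob_from_prices(1.40, 3.10)    # hypothetical head-to-head prices
print(round(surprisal(p), 2))       # 0.54 bits if the favourite wins
print(round(surprisal(1 - p), 2))   # 1.68 bits if the underdog salutes
```

Summing these surprisals across a round, or across a team's season, gives the unpredictability measures we'll be tracking.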

If you're interested in the details, please download this document.