2011 Round 28 Results: Cats, and the Bookmaker, Just Too Good This Year

Most of the cats I've known have treated water as something profoundly repugnant, while many of the birds of my acquaintance have been quite fond of it, but on a dampened and at times sodden MCG on Saturday it was the Geelong Cats who revelled in the conditions and eventually accounted for a dispirited Collingwood by more than six goals.

That result had the TAB bookmaker trousering Investor money yet again, an all-too-frequent occurrence this season, and meant that the Head-to-Head Fund finished the season at a discouraging 39.4c and the Line Fund narrowly in profit at $1.094.

The Head-to-Head Fund's 60.6c loss was produced after investing the totality of its bankroll 6.89 times over the course of the season, which means that the ROI for the Fund was about -8.8%. That loss rate is higher than the typical overround in the head-to-head market of about 7%, which suggests that the Fund was perhaps a bit unlucky to lose as much as it did.

One way of thinking about the bookmaker's overround is that it's the bookmaker's expected return from wagering or, put another way, it's roughly the proportion of Investors' money that he would expect to take in the long run if the probabilities implicit in his head-to-head prices represented an accurate assessment of each team's chances. Based on this, had the Fund lost only at the rate it "should" have, it would have finished down by about 48c (ie 6.89 x 7c) rather than down by almost 61c.

Similarly, with a typical overround in the line market of about 5.25%, the Line Fund might have been expected to lose 5.25c x 3.663, or about 19c, given its level of activity. Seen in that light its 9.4c gain is all the more meritorious, being almost 30c better (from an Investor point of view) than the TAB bookmaker would have expected.
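For anyone who'd like to check the arithmetic in those last few paragraphs, here's a quick sketch of the calculation. The overround percentages and bankroll-turnover multiples are the figures quoted above; the helper function names are mine, purely for illustration, and this isn't code that MAFL itself runs:

```python
# Overround-implied expected loss vs actual ROI for each Fund.
# Figures are those quoted in the post; results are in cents per $1 of bankroll.

def expected_loss_cents(overround_pct, bankroll_turns):
    """Cents per $1 of bankroll the bookmaker would expect to keep if the
    probabilities implicit in his prices were accurate."""
    return overround_pct * bankroll_turns

def roi_pct(result_cents, bankroll_turns):
    """Return on investment: net result divided by total amount staked."""
    return result_cents / (bankroll_turns * 100) * 100

# Head-to-Head Fund: lost 60.6c after turning its bankroll over 6.89 times
print(round(expected_loss_cents(7.0, 6.89), 1))    # expected loss ~48c
print(round(roi_pct(-60.6, 6.89), 1))              # actual ROI ~ -8.8%

# Line Fund: gained 9.4c after turning its bankroll over 3.663 times
print(round(expected_loss_cents(5.25, 3.663), 1))  # expected loss ~19c
print(round(roi_pct(9.4, 3.663), 1))               # actual ROI ~ +2.6%
```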

In short, the year's results have shown yet again that the line market remains exploitable and the head-to-head market less so, if at all. The vulnerability in the line market was evidenced by the fact that, again this season, the starts for Home teams proved to be too generous, with Home teams winning 51.5% of all line contests and finishing the season with an average handicap-adjusted margin of +1.8 points per game. Of course, opportunities for exploitation are far easier to spot in the rear-view mirror ...
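To make the handicap-adjusted margin notion concrete: it's the home team's winning margin plus the start it receives (starts given away show as negative), and the home team wins the line contest whenever it's positive. A minimal sketch, with game results and starts that are entirely invented for illustration:

```python
# Handicap-adjusted margin (HAM): home margin plus the start the home team
# receives on line betting. The home team covers whenever HAM > 0.
# All the results below are hypothetical, purely to show the calculation.

games = [
    # (home score, away score, home team's start in points)
    (95, 80, -14.5),  # home favourite giving 14.5 points: HAM = +0.5
    (70, 88, +20.5),  # home underdog receiving 20.5 points: HAM = +2.5
    (60, 90, +9.5),   # home underdog beaten by more than its start: HAM = -20.5
]

hams = [home - away + start for home, away, start in games]
home_cover_rate = sum(h > 0 for h in hams) / len(hams)
avg_ham = sum(hams) / len(hams)

print(hams)             # [0.5, 2.5, -20.5]
print(home_cover_rate)  # 2 of the 3 home teams covered
print(round(avg_ham, 1))
```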

It's only been a few weeks since I looked at the wagering results on a team-by-team basis, so I'll not dwell on them too long here other than to note that the Head-to-Head Fund made money, net, wagering on just four teams this season - Collingwood, Hawthorn, the Roos and the Eagles - and lost money on the 13 other teams. In contrast, the Line Fund made money wagering on 8 teams, abstained from wagering on 1 other (Sydney), and lost money on the remaining 8.

Looked at from the opposite point of view - that is, considering those occasions on which we bet against particular teams - the Head-to-Head losses are more concentrated. Just 10 teams caused net losses for the Fund and only 6 of them produced losses exceeding 10c.

Viewed in this way, the Line Fund's profits again come from a broad set of teams, with 8 of them creating net profits across the season by dutifully failing to cover the line whenever the Fund wagered against them.

If we average each team's contribution to each Fund when wagered on and when wagered against we find that the Dees can be blamed for the largest portion of Head-to-Head losses (about 13.5c), followed by the Blues and the Swans (about 10c each), and that the Roos are the only team to make a significant positive contribution to this Fund's value (about 10c).

The Dees, along with Richmond, are also the greatest source of losses to the Line Fund (about 7c each), while the Dogs (about 7.5c) and the Pies (about 6c) are the greatest sources of profit. More generally, the gains and losses are far more evenly spread across the teams in the Line Fund's results than they are in the Head-to-Head Fund's.

Before moving on to the tipping and margin prediction results, I'll complete the autopsy by assessing the consequences of a number of the more significant decisions I made this year about the wagering conduct of the Funds: 

  • Commencing wagering in Round 1 instead of Round 6. This was clearly a mistake, notwithstanding the sensational returns we enjoyed in Round 1. Over the course of Rounds 1 to 5 the Head-to-Head Fund dropped 6.7c and the Line Fund shed 26.6c. Combined, those losses knocked over 16c off the final Portfolio price; adjusting for this, the final loss would have been far more palatable.
  • Kelly Staking instead of Level Staking. It's only as a result of some of the analysis I've done during the season that I've given this aspect of wagering any real attention. Had both the Head-to-Head and Line Funds wagered a flat 5% of the Fund whenever they wagered this year, rather than an amount dependent on the edge they assessed they had, the Head-to-Head Fund would have finished down by "only" about 48.5c and the Line Fund would have finished up by 15c. These are both superior outcomes. Looked at from an ROI viewpoint, a level-staking Line Fund would have recorded a +3.3% result, slightly better than the +2.6% it actually recorded, while the Head-to-Head Fund's ROI would have been -12.8% rather than -8.8%, its smaller loss having been produced from a much smaller turnover. From an ROI point of view then, Kelly-staking the Head-to-Head Fund was actually superior to level-staking it - the problem was the multiplier used to convert the Kelly fraction to a wager. I used 0.2 this season; a smaller multiplier would have reduced the loss (but would also have reduced the upside, had it turned out there was some).
  • Preventing the Head-to-Head Fund from wagering on teams priced at over $5. With Gold Coast in particular registering a few wins at home at long odds, this cap proved to be a net drag on profitability, costing the Fund over 26c. Even knowing that now, I'm not sure I'd ever again be willing to allow a Fund to wager large amounts on long-shot home teams. The scars from the Heritage Fund are still too raw.
  • Wagering on Home Teams only. For the first time that I can remember, a MAFL head-to-head fund would have made money over the course of a season had it been allowed to wager on away teams. That would have been true of the Head-to-Head Fund this year, though only if it had also not been subject to the $5 price cap on wagering; under those conditions it would have produced a 5.8% ROI and added 7c to the Fund's price. The Line Fund, however, would have fared dreadfully had it been allowed to bet on away teams: such wagering would have produced an ROI of almost -19% and would have knocked 51c from the final Fund price.
  • Adjusting the estimated probability of a home team victory if the Head-to-Head Fund's assessment is more than 25% higher than the bookmaker's. Relaxing this condition would have altered the Head-to-Head Fund's wager in only two games, converting the 6.2c loss from the wager on the Dees playing Carlton in Round 10 into a 7c loss, and converting the 12.7c gain from wagering on the Roos playing the Dogs in Round 17 into a 13.8c gain. Combined, the difference is immaterial - it's a net 0.3c gain.
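The Kelly versus level staking comparison in the bullets above can be sketched in a few lines. I'm assuming the standard Kelly fraction for a head-to-head wager at decimal price o with assessed win probability p, scaled by the 0.2 multiplier mentioned earlier; the particular price and probability below are invented for illustration:

```python
# Kelly staking vs level staking for a single head-to-head wager.
# p: the Fund's assessed win probability; o: the bookmaker's decimal price.
# The price and probabilities here are hypothetical.

def kelly_fraction(p, o):
    """Full-Kelly fraction of bankroll: edge divided by odds, floored at zero."""
    return max(0.0, (p * o - 1) / (o - 1))

def kelly_stake(bankroll, p, o, multiplier=0.2):
    """Fractional-Kelly stake, using the 0.2 multiplier mentioned above."""
    return bankroll * multiplier * kelly_fraction(p, o)

def level_stake(bankroll, fraction=0.05):
    """Level staking: a flat 5% of the bankroll on every wager."""
    return bankroll * fraction

bankroll = 100.0
p, o = 0.60, 1.90   # an assessed 60% chance at $1.90: a 14% edge

print(round(kelly_fraction(p, o), 3))        # ~0.156 of bankroll at full Kelly
print(round(kelly_stake(bankroll, p, o), 2)) # ~3.11 at 0.2x Kelly
print(level_stake(bankroll))                 # 5.0 under level staking
```

The point of the multiplier is apparent even in this toy case: full Kelly would stake over 15% of the bankroll on a single game, which is the sort of volatility that produces the scars referred to above.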

On balance I think you'd have to conclude that the conditions I imposed on the wagering of the Funds this year, especially those imposed on the Head-to-Head Fund, were in aggregate, and with the always-flattering glow of hindsight, detrimental to profit. The decision I feel most culpable about is allowing the Funds to wager in the early parts of the season when, most obviously, bookmaker knowledge will exceed that of any statistical model. It's just so frustrating to watch almost a quarter of the season go by uncontested. Next season we return to more disciplined ways.

A last look at the Tipsters and Predictors then before we draw a curtain on the season (though a sheet might be considered more apt given the way that our wagering went).

The Cats' victory in the Granny was enough to allow BKB to slip into top spot, alone, on the MAFL Head-to-Head Tipsters ladder and complete the season as the only tipster with a success rate above 76%. Bookie_9 and Combo_NN_1 finished joint second, just one tip adrift, on 75.8%, while Combo_NN_2, Bookie_3 and Combo_7 all finished one more tip further back in joint fourth.

Bookie_3's prediction for the GF result was close enough to the final margin to see it finish the season with an MAPE of 29.53 points per game, comfortably below my arbitrary benchmark for excellence of 30 points per game. Combo_7 only just missed out on achieving this result too, finishing with an MAPE of 30.12 points per game. Bookie_9 finished third, followed by the two neural network-based algorithms which, though they fell away a little in the later parts of the season, still managed to record creditable MAPEs in the vicinity of 31 points per game.
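For anyone new to these pages, the MAPE is the mean absolute prediction error in points per game: the average absolute difference between a Predictor's forecast margin and the actual margin. A quick sketch, with margins that are invented purely for illustration:

```python
# Mean absolute prediction error (MAPE) in points per game: the average of
# |predicted margin - actual margin|. The margins below are hypothetical.

def mape(predicted, actual):
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

predicted = [12, -5, 30, 8]    # forecast home-team margins
actual    = [40, -2, 1, 10]    # actual home-team margins

print(mape(predicted, actual))  # (28 + 3 + 29 + 2) / 4 = 15.5
```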

Combo_NN_2 also ended the year as the Margin Predictor with the best record for predicting margins within 6 points of the actual final margin. It achieved this outcome in about 1 game in 6. It was also equal-best, with Combo_7, at predicting margins within 2 goals of the final result, a feat they achieved in just over 29% of games.

Overall, the performance of the neural network-based models has surprised me this season as I expected them to display signs of overfitting, a phenomenon often associated with models of this type. Combo_NN_2 did suffer a little towards the end of the season in estimating margins for games involving Collingwood, as the Pies' MARS Rating was outside the range of any rating that Combo_NN_2 was exposed to during its training period. Even so, a tipping performance just two games behind the bookmaker and an MAPE within a couple of points per game of his is an outstanding achievement when you consider that the algorithm uses just 5 pieces of data (albeit 5 information-packed pieces in the shape of the participating teams' MARS Ratings and bookmaker prices, and the interstate status of the contest).

On line betting it turned out that 5 Predictors produced winning rates sufficient to turn a profit over the entirety of the season: ProPred_3, ProPred_7 and Bookie_3 especially, but also, narrowly, H2H_Unadj_10 and H2H_Adj_3.

The extraordinarily high winning rate of favourites this season, reflected in the strong head-to-head tipping performance by BKB and the other bookie-derived tipsters, was also apparent in the final probability score for the TAB bookmaker, which was a remarkable 0.278 bits per game. H2H's and ProPred's average scores of around 0.22 to 0.23 bits per game, which would have been outstanding in any other season, were made to look decidedly sub-par in comparison.

The Line Fund algorithm completed the season with an average probability score of -0.05 bits per game which, though slightly below chance, is a level that simulations have shown will generally be sufficient for profitable wagering, as it has been this year.
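The probability scores quoted above can be sketched as follows. I'm assuming the log probability score convention used in these pages, namely 1 + log2 of the probability attached to the eventual winner, under which an uninformative 50% prediction scores exactly 0 bits; the game probabilities below are invented for illustration:

```python
import math

# Log probability score in bits: 1 + log2(probability assigned to the team
# that won). A coin-flip 0.5 prediction scores 0 bits; confident, correct
# predictions approach 1 bit; confident, wrong ones go sharply negative.
# The probabilities below are hypothetical.

def prob_score(p_winner):
    return 1 + math.log2(p_winner)

# probability the predictor gave to the team that went on to win, per game
winner_probs = [0.80, 0.65, 0.30, 0.55]

scores = [prob_score(p) for p in winner_probs]
print(round(sum(scores) / len(scores), 3))  # average bits per game
```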

So, that's it then for another season. Thank you to all who have passed by these pages and especially to those who have invested in MAFL in this and in previous seasons. I'm sorry I couldn't produce a profit this year and I hope you'll come back again next season, if not with your money, at least with your interest.

(Investors please let me know what you'd like me to do with your remaining funds, as diminished as they are.)