2014 - Round 27 Results: A Small Swan Dive

Obviously not the result that anyone expected nor the one that Investors wanted this week - and, frankly, not quite the Grand Final that I think the season deserved. Congrats to the Hawks regardless - they were maybe the best and certainly one of the two best teams all season.

The Swans' loss on Saturday continues the mixed performances recorded by Minor Premiers in Grand Finals since 2000. Of the 12 Minor Premiers that have made it to the final Saturday, only six have won, which is the same number of Flags that Runners Up have accumulated, though from two fewer attempts. The three other Flags have gone to 3rd-placed teams, who collectively now have the same success rate in Grand Finals as Minor Premiers.


Minor Premiers' misfortune has been most prevalent when they've faced Runners Up in Grand Finals, their record now standing at 2 and 5, and including losses in their four most recent encounters (2008, 2009, 2011 and 2014). The last time that a Minor Premier defeated a Runner Up in a Grand Final was Geelong's emphatic victory over Port Adelaide in 2007.

Since that year, Minor Premiers have made every Grand Final but won only two of them.

Looking at the entirety of the Finals campaigns and not just the GF we see that, over the 15 years, Minor Premiers have now won 72% of the Finals they've appeared in while Runners Up have won 70%. Teams from no other final ladder position have a better than 50% record in Finals, though those finishing 3rd or 6th are only a single switched result away from having done so.

The Swans' loss not only added to the lacklustre recent record of Minor Premiers in Grand Finals but also handed another 2% of the Recommended Portfolio back to the TAB Bookmaker, making it the third consecutive losing week.

The Head-to-Head Fund's loss was enough to flip it from profit into loss for the year, that outcome largely a result of two particularly poor weeks, one in Round 17 and the other in the Semi Finals. Regardless, I think it did enough this year to earn it another run in 2015.

That's not so, however, for the Margin Fund, which finished with a result reminiscent of many from earlier in the season, making hopeful and sometimes tantalisingly near-correct predictions (not a hallmark of this week's effort) before ultimately waving goodbye to most of what it wagered on our behalf. During the year it made 218 bets for only 22 collects, turning the entire Fund 2.3 times and surrendering about two-thirds of it to the Bookmaker.

Ultimately then the Recommended Portfolio's net profitability for the year was solely attributable to the Line Fund, which landed 64 of its 112 bets, turning the Fund 3.4 times and increasing its size by over 44%.
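As a rough sanity check, the turnover and fund-change figures quoted for the two Funds imply their returns per dollar staked. This is my own back-of-envelope framing, not the official Fund accounting:

```python
def roi_per_unit_staked(fund_change, turnover):
    """Return on amounts staked: the change in fund size divided by the
    total amount wagered, both expressed as multiples of the starting fund."""
    return fund_change / turnover

# Margin Fund: surrendered about two-thirds of the Fund on 2.3x turnover
print(round(roi_per_unit_staked(-2/3, 2.3), 3))  # -0.29, i.e. about -29% on turnover

# Line Fund: grew by 44% on 3.4x turnover
print(round(roi_per_unit_staked(0.44, 3.4), 3))  # 0.129, i.e. about +13% on turnover
```

The contrast makes plain why the Line Fund alone carried the Portfolio into profit.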

All told, the Recommended Portfolio finished the year up 8.4%, which means that, for only the second time in MAFL/MoS history, it has recorded back-to-back profits.

Eight teams, net, made money for the Portfolio when they were wagered on, most notably Hawthorn (+14c), the Kangaroos (+8c), Richmond (+6c) and GWS (+5c). Conversely, 10 teams cost Investors money, especially Melbourne (-6c), Collingwood (-4.5c), West Coast (-4c), Fremantle (-4c) and Adelaide (-4c).

The profits from Hawthorn came especially from Line betting where they ended up as collects seven times from the eight occasions on which we supported them. The losses from Melbourne wagering were more spread out across wager types, but Investors were particularly unfortunate when wagering on the Dees in the Head-to-Head market where they finished with an 0 and 6 record.

Adelaide were the team on which most Portfolio money was wagered (almost 25% of the Portfolio at some stage during the season), and Carlton the team on which least was wagered (only just over 4%).

Viewed instead from the perspective of the teams against which we were wagering, Geelong (+7c), Fremantle (+6c), Gold Coast (+5c) and Collingwood (+5c) were kindest to Investors when in this situation, while Melbourne (-8c), Richmond (-5c), Hawthorn (-3c) and the Western Bulldogs (-3c) were least kind.

It's interesting to note that we never wagered against GWS, Melbourne or St Kilda in the Head-to-Head market, and wagered against the Brisbane Lions only once. In the Line market we wagered on whoever was playing at home against the Cats on 10 occasions, and whoever was playing at home against Port Adelaide or Sydney nine times. That strategy was profitable in relation to the Cats and Port, but unprofitable in relation to the Swans.

TIPS AND PREDICTIONS

As I noted in the blog post before the round, most of the higher ladder positions on the MoS Tipster and Predictor Leaderboards were locked away before the Grand Final was played, though some interest did still remain for positions lower down.

C_Marg, as the only Tipster predicting a Hawks victory, was therefore the only Tipster to record a non-zero result on the Head-to-Head Tipster Leaderboard. It also, as a Margin Predictor, recorded the smallest absolute margin error for the round, which allowed it to put one more spot between it and the wooden spoon before the season ended.

Bookie_9 finishes the season as the best-performed Head-to-Head Tipster with a 148.5-from-207 (72%) record. The average accuracy of all remaining (ie non-heuristic-based) Head-to-Head Tipsters was just over 69%, including a quite credible 69.3% from C_Marg, which matched the TAB Bookmaker's performance this year.

Combo_7 heads the final Margin Predictor Leaderboard with a Mean Absolute Prediction Error (MAPE) of 28.6 points per game, just 0.05 points per game better than the 2nd-placed Bookie_LPSO. RSMP_Weighted was unable to reproduce the margin predicting form that it displayed in taking out the title last year and finished only 6th, two places behind last season's Runners Up, RSMP_Simple.
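For readers new to the metric, the MAPE quoted here is simply the average of the absolute differences between predicted and actual game margins. A minimal sketch, with margins that are illustrative only:

```python
def mape(predicted, actual):
    """Mean Absolute Prediction Error: the average absolute difference
    between predicted and actual margins, in points per game."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

# Hypothetical home-team margin predictions and results for four games
predicted = [12, -5, 30, 8]
actual = [20, -11, 25, -4]
print(mape(predicted, actual))  # (8 + 6 + 5 + 12) / 4 = 7.75
```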

Combo_7 held down its Number 1 ranking for the last 11 rounds of the season, with Bookie_LPSO and Win_3 the only challengers for the position during that period, and neither got closer, in MAPE terms, than Bookie_LPSO did in the last two rounds. Win_3 fell away dramatically from about Round 21 onwards, eventually finishing 7th.

The all-Predictor average MAPE for the season was 29.6 points per game and all but the bottom five Predictors turned in sub-30 final results. Thirteen of the Predictors tipped Line winners at a rate sufficient to be profitable in this market at $1.90 prices though, to be fair, Bookie_LPSO and ProPred_7 were about as close to breakeven as it's possible to be without actually reaching it. Combined, the Margin Predictors tipped Line results - for home teams and away teams alike - with 54.7% accuracy, reflecting, I think, the relatively predictable nature of line results this year for anyone pursuing a data-based approach.
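The $1.90 profitability hurdle follows directly from the price: at level stakes you need to win more than 1/1.90 of your bets to come out ahead. A quick check:

```python
def breakeven_accuracy(price):
    """Minimum strike rate for level-stakes betting at a fixed
    decimal price to break even."""
    return 1.0 / price

threshold = breakeven_accuracy(1.90)
print(round(threshold, 4))  # 0.5263
print(0.547 > threshold)    # True: the Predictors' combined 54.7% clears the bar
```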

Seven Predictors also finished with profitable SuperMargin wagering performances if we allow them to wager only when they predict a Home team win or draw. H2H_Adj_7 did best, betting on 57% of contests and selecting the correct bucket on 22 occasions, three more than any other Predictor, registering a +41% ROI as a result.

Combo_NN2, one of the two Predictors powering the Recommended Portfolio's Margin Fund, finished the season with an ROI of -11%. What dragged the Margin Fund's performance down even further was the -40% ROI registered by Bookie_9, the other advisor to the Fund.

Bookie_OE took out the prize for MoS' best Head-to-Head Probability Predictor, finishing some 0.0032 bits per game better than last year's winner, Bookie_LPSO. Bookie_RE finished 3rd, 0.0019 bits further back, slipping one place below its 2013 finish. 
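For context, these probability scores are log probability scores measured in bits. A minimal sketch, assuming a score of 1 + log2 of the probability assigned to the actual result (an assumption about the scale being used, chosen so that an uninformative 50% forecast scores exactly zero):

```python
import math

def log_prob_score(p_home, home_won):
    """Per-game log probability score in bits, assumed here to be
    1 + log2(probability assigned to the actual result)."""
    p = p_home if home_won else 1.0 - p_home
    return 1.0 + math.log2(p)

print(log_prob_score(0.5, True))             # 0.0: a coin-flip forecast scores zero
print(round(log_prob_score(0.75, True), 3))  # 0.585: a confident, correct forecast
```

On this scale, small per-game differences such as the 0.0032 bits separating Bookie_OE and Bookie_LPSO accumulate meaningfully over a full season of games.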

C_Prob finished a creditable 4th, but will need some supplementary support if it's to power a profitable Fund of the future.

The Line Fund algorithm, though negatively impacted by the Swans' failure to cover the spread this weekend - and, boy, is that an understatement - still finished the season with a significantly better (though negative) Log Probability Score than last season.