Monday, May 6, 2013

NHL Playoff Predictions

The NHL playoffs are upon us, and for the third time I'm dusting off my Excel playoff model to see if I can predict who's going to win.

As it stands (as of May 6th, 2013), my model predicts that the most likely final will be between the Ottawa Senators and the Chicago Blackhawks. Altogether, though, the top four teams are the Senators, Blackhawks, Bruins, and Sharks (collectively these account for a 77% chance of winning the whole thing).

One of the ways that I've been presenting the daily updates from the model is as follows:

As time progresses (along the bottom), the height of each colored segment represents the relative probability of that team winning. For instance, when the Senators lost on May 3rd, their bar shrank noticeably, and grew again after they won on the 5th. Again, the Bruins, Senators, Blackhawks, and Sharks account for most of the graph (and hopefully don't lose in the first round... that would be awkward).
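For anyone wondering how a chart like that gets built, here's a minimal sketch in Python (not the actual Excel workbook; the team names are real, but the dates and probabilities below are invented for illustration):

```python
# A minimal sketch of the stacked win-probability chart described above.
# The actual model lives in an Excel workbook; the probabilities here are
# invented for illustration, and only four of the sixteen teams are shown.
import matplotlib.pyplot as plt

dates = ["May 1", "May 2", "May 3", "May 4", "May 5", "May 6"]

# Hypothetical daily Cup-winning probabilities for each team. Across *all*
# teams, each day's column should sum to 1.0.
probs = {
    "Senators":   [0.20, 0.21, 0.16, 0.16, 0.22, 0.23],
    "Blackhawks": [0.22, 0.22, 0.23, 0.23, 0.22, 0.22],
    "Bruins":     [0.16, 0.16, 0.17, 0.17, 0.16, 0.16],
    "Sharks":     [0.15, 0.15, 0.16, 0.16, 0.16, 0.16],
}

# stackplot piles each team's probability on top of the previous one, so the
# height of each colored band is that team's share of the total on that day.
plt.stackplot(range(len(dates)), list(probs.values()), labels=list(probs.keys()))
plt.xticks(range(len(dates)), dates)
plt.ylabel("Probability of winning the Cup")
plt.legend(loc="upper left")
plt.show()
```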

So what makes me think I'm anywhere near accurate? If you asked me in person, I'd scratch my head and shrug a little. Particularly concerning are the long odds that sites like SportsClubStats and Bet365 offer on some of the teams I predict to have a good chance of winning.

There are a couple of signs that I'm not totally inaccurate, though. Here are some of the results from previous years:
2010: Only correctly predicted the Blackhawks about halfway through, after they had started leading in the semi-finals. Maybe not the best prediction...

2012: Predicted the Kings six weeks before they won, once the Blues started to slide a little bit. I was more surprised about the Eastern Conference, though, where the Devils admittedly were not predicted to do all that well.

Of course, the toughest part of checking how accurate a model is comes in finding an objective way to measure that accuracy. Sure, the Kings won last year, but they only had a 13.5% chance starting out. That 13.5% is high relative to the other teams, but not really all that great overall. Can I really call it a win that a team starting out at 13.5% beat a bunch of teams at 5-10%?

One way to evaluate accuracy is to use a Brier score for each team, and take an average of them all over time. A slightly modified Brier score gives a score of 1.0 to a 100% prediction that comes true and 0.0 to one that fails, with values in between depending on the probability that was assigned beforehand. If we compare the results from last year's model to what we would expect from pure chance, we get this:

So that's cool. Almost the whole way through the playoffs last year, my model gave more accurate estimates of who was going to win than chance (assuming every game has a 50-50 chance of going either way). Part of the reason the score is so high near the end is that some teams have already been eliminated, and therefore get a "perfect" prediction score (even though that's a bit silly). If we remove these teams, we get something more like this:

There are three distinct dips in the graph, corresponding to the end of each round of the playoffs. The scores dip because the predictions become more open-ended again as a new round begins (a fresh best-of-seven series is harder to call). Even accounting for all this, my model last year was still significantly and consistently above chance. Fancy!
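For reference, here's a minimal sketch of how a modified Brier score like this could be computed and averaged. It assumes the modification is simply one minus the usual squared error (which matches the 1.0-for-success, 0.0-for-failure behavior described above, but is my interpretation rather than the exact formula in the workbook), and the teams and probabilities below are placeholders:

```python
# A rough sketch of the modified Brier score described above. It assumes the
# modification is 1 - (forecast - outcome)^2, which gives 1.0 to a 100%
# prediction that comes true and 0.0 to one that fails -- my reading of the
# description, not necessarily the exact formula used in the Excel model.

def modified_brier(forecast, outcome):
    """forecast: predicted probability (0..1) that a team wins the Cup.
    outcome: 1 if the team eventually won, 0 otherwise."""
    return 1.0 - (forecast - outcome) ** 2

def daily_score(predictions, winner, exclude_eliminated=True):
    """Average the modified Brier score over all teams for a single day.

    predictions: dict mapping team -> predicted probability on that day
    winner: the team that eventually won the Cup
    exclude_eliminated: drop teams already at 0% so their trivially
    "perfect" predictions don't inflate the average (the second graph above).
    Eliminated teams are approximated here as those sitting at exactly 0.0.
    """
    scores = [
        modified_brier(prob, 1 if team == winner else 0)
        for team, prob in predictions.items()
        if not (exclude_eliminated and prob == 0.0)
    ]
    return sum(scores) / len(scores)

# Hypothetical late-playoff day from 2012 (numbers invented for illustration):
day = {"Kings": 0.55, "Devils": 0.25, "Rangers": 0.20, "Blues": 0.0}
print(daily_score(day, winner="Kings"))                            # ~0.90
print(daily_score(day, winner="Kings", exclude_eliminated=False))  # ~0.92

# The "chance" baseline scores the same way, but with every team's probability
# set by assuming each remaining game is a 50-50 coin flip.
```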

So who knows if the Senators will actually win. It'd be pretty cool if they did, though...
