Eurovision 2014: Final predictions

A learning experience

It was probably reasonable to think of Tuesday’s model performance as mixed: the overall score was bad, but only because of some quite surprising qualifiers. On the other hand, last night’s performance was terrible. The particular set of qualifiers didn’t show up even once in 10,000 simulations. That’s substantially worse than you’d expect if the model was picking qualifiers at random.
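To put that in rough numbers, here's a quick back-of-the-envelope check (assuming the second semi had 15 entrants with 10 qualifying; adjust if I've misremembered the field size):

```python
from math import comb

# Assumed field for the second semi-final: 15 entrants, 10 qualifiers.
entrants, qualifiers, n_sims = 15, 10, 10_000

# Chance that one uniform-random pick of 10 qualifiers matches the actual set.
p_random = 1 / comb(entrants, qualifiers)   # ~1 in 3003

# Expected number of exact matches across 10,000 purely random simulations.
print(n_sims * p_random)                    # ~3.3, versus the model's 0
```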

The most egregious mistake was the model’s near-absolute certainty (93%) that Israel was going to qualify. This, as it turns out, was not the case. Looking into it, I think I now understand why this happened, and hopefully can avoid it next year.

The problem arose partially because of how weak the field was. One of the consequences of this was that the Betfair odds for all of the entries were extremely long. The “favourite” was Norway, with a price of 26.5 (~4% win probability). Israel had the third shortest odds, at a price of 65.0 (~1.5% win probability).
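(For reference, those win probabilities are just the reciprocals of the Betfair decimal prices, ignoring the overround.)

```python
# Decimal price -> naive implied win probability: p = 1 / price.
for country, price in {"Norway": 26.5, "Israel": 65.0}.items():
    print(f"{country}: {1 / price:.1%}")   # Norway: 3.8%, Israel: 1.5%
```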

Now, as I described previously, the song quality is estimated from a linear fit to the logarithm of the odds. This means that the difference in quality between a song at odds of 2.0 and a song at odds of 4.0 is the same as between a song at 20.0 and a song at 40.0, or a song at 200.0 and a song at 400.0. This almost certainly overemphasises the differences between songs with long odds. In reality, there’s not a huge amount of difference between a song which trades at 100.0 and a song which trades at 1000.0, while a song at 2.0 and a song at 20.0 are worlds apart.

A fix for this would be to introduce more variability in quality at longer odds. This is definitely something I’ll look into for next year’s model. For this year though, I’m going to stick with things as they are, and console myself that most of the countries with truly long odds have already been eliminated.
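To make the issue concrete, here's a minimal sketch of the log-odds quality mapping and one possible shape for the fix. This isn't the model's actual code; the function and the form of the extra noise are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2014)

def song_quality(price, noise_scale=0.0):
    """Toy quality score: linear in log(odds), optionally with noise
    that grows as the odds lengthen (the hypothetical fix)."""
    base = -np.log(price)
    sigma = noise_scale * np.log(price)   # more uncertainty for long shots
    return base + rng.normal(0.0, sigma)

# With noise_scale=0 (the current behaviour), the quality gap between
# prices of 2.0 and 4.0 equals the gap between 200.0 and 400.0:
print(song_quality(2.0) - song_quality(4.0))       # 0.693
print(song_quality(200.0) - song_quality(400.0))   # 0.693
```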

While looking into this mistake, I also noticed a rather strange bug in the model, which resulted in some songs getting huge scores. One of the simulations showed San Marino getting a perfect 12 from every country. I've fixed this bug [1], which affected about 5% of simulations, but it does mean that the final predictions are made using a slightly different model from the one behind the earlier predictions, so there may be some slight changes.
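For the curious, the footnote below has the details; the essence of the fix is just making sure the burn-in draws never reach the scoring code. A generic sketch (not the actual sampler) of what correct discarding looks like:

```python
import numpy as np

def run_gibbs(step, init, n_samples, n_burn):
    """Run a Gibbs sampler for n_burn + n_samples iterations, keeping only
    the post-burn-in draws. `step` maps the current state to the next one."""
    state, kept = init, []
    for i in range(n_burn + n_samples):
        state = step(state)
        if i >= n_burn:        # everything before this point is discarded
            kept.append(state)
    return np.array(kept)
```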

For the win

Moving on to the final then, the field is looking fairly balanced. After a Yugoslav-free final last year, and with the departure of Serbia and Croatia this year, it seemed that we were entering a new era for Eurovision. However, Slovenia and Montenegro have both qualified, for the first time in Montenegro’s case. I’ve seen it suggested that the former Yugoslav diaspora may be concentrating its votes on these two. If so, it bodes well for them.

Also interesting is that we have a full complement of Scandinavians present. With most voting blocs this would serve to weaken the strongest member, but the Scandinavians are usually fairly good at avoiding this. Expect plenty of vote-swapping, but the 12s should go to Sweden.

Winning probabilities

Not a lot has changed in the model prediction since the second semi-final, mostly because we already knew that nobody from the second semi was going to win. The main change is that, because I fixed the bug that gave them superhuman powers of singing, San Marino et al don’t really win very much any more.

From the Betfair side, the odds on a Dutch win have shortened dramatically, to the point where they're now second favourite. One of the main reasons for this is that the running-order draw has now been made. In general, over the past ten years, songs in the second half of the draw have performed much better than songs in the first. This year, Ukraine have drawn the first position, and many of the other highly rated songs are in the first half. The Netherlands perform 24th, two slots from the end, while the UK take the last position.

I had intended to incorporate running order into the model for this year, but it slipped my mind. This means that the model is missing out on some information. To try to combat this, I’ve re-run the model, but instead of using Betfair data from before the contest began, I’ve used the latest data.
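For what it's worth, the simplest way I can see to fold running order in next year is a position term added to each song's quality. The function below and its coefficient are purely illustrative; the coefficient would have to be estimated from past contests rather than picked by hand.

```python
def adjusted_quality(base_quality, draw_position, n_songs=26, beta=0.5):
    """Hypothetical running-order adjustment: a bonus growing linearly
    with how late in the show a song performs."""
    position_frac = (draw_position - 1) / (n_songs - 1)   # 0 = opens, 1 = closes
    return base_quality + beta * position_frac

# Ukraine open the show, the Netherlands perform 24th, the UK close it:
for country, pos in [("Ukraine", 1), ("Netherlands", 24), ("United Kingdom", 26)]:
    print(country, adjusted_quality(0.0, pos))
```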

Winning probabilities redux

As you can see, this causes the model to focus much more heavily on Sweden. Sweden have a fairly middling slot in the running order: they perform 13th, at the end of the first half. However, this is probably much better than the slots of their main rivals Armenia and Ukraine (7th and 1st respectively). The Netherlands and the UK do move up the rankings in this new set of simulations, but not enough to make them really competitive.

Bellwethers, likely twelves, etc

Usually I try to identify “bellwether” countries: countries whose 12 points are most likely to go to the winner. In this case, because the model is so certain about Sweden winning, this has largely boiled down to the other Scandinavian countries. So, while Denmark, Norway and Finland all have around a 40% chance of picking the winner, this is largely because they’re probably going to pick Sweden, and Sweden’s probably going to win.
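In case it's useful, the bellwether numbers are just counting, per voting country, how often its simulated 12 points go to the simulated winner. A sketch, with the array layout assumed rather than taken from the real code:

```python
import numpy as np

def bellwether_probs(twelves, winners, countries):
    """twelves[s, v]: index of the country that voter v gives 12 points to in
    simulation s; winners[s]: index of the winner in simulation s."""
    hits = (twelves == winners[:, None]).mean(axis=0)   # fraction per voter
    return dict(zip(countries, hits))
```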

A few of the lower-ranked bellwethers may be more reliable in practice. Poland, Spain and Israel are only slightly less likely to pick the winner, and are probably more varied in their tastes. Toss in Hungary and Macedonia, and you’ve got yourself some predictions. As always, be aware that the voting order is chosen for dramatic purposes, once the televotes are known. It can be a fun game to try to think yourself into the mindset of a Eurovision vote order planner, assuming you’re sober enough by that point in the evening.

We don’t have Cyprus voting this year, so Eurovision’s favourite pairing (Gryprus? Cypreece?) won’t be in evidence. The model is also fairly cool on the Moldova/Romania relationship, although I’m a bit skeptical about that. There are, however, some vote patterns that we can rely on.

  • Belarus → Ukraine (48%)
  • Greece → Armenia (56%)
  • Russia → Armenia (61%)
  • Georgia → Armenia (63%)
  • Finland → Sweden (64%)
  • Netherlands → Armenia (67%)
  • Azerbaijan → Ukraine (71%)
  • Norway → Sweden (75%)
  • France → Armenia (76%)
  • Denmark → Sweden (82%)
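(These pair probabilities are computed the same way as the bellwethers: count, across simulations, how often one country's 12 points land on a particular other country. Array names as in the sketch above.)

```python
def twelve_point_prob(twelves, countries, giver, receiver):
    """Fraction of simulations in which `giver` awards its 12 points to `receiver`."""
    return (twelves[:, countries.index(giver)] == countries.index(receiver)).mean()

# e.g. twelve_point_prob(twelves, countries, "Denmark", "Sweden") -> ~0.82
```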

Summing up, Sweden are the likely winners—I hope they kept all the decorations from last year. Armenia has a posse, but probably not enough to give it victory. Don’t buy the UK/Netherlands hype, unless you enjoy that sort of thing, in which case go have fun, what are you listening to me for?


P.S. I’ve made some extra predictions, which you should read if you want to know what position you can expect each country to finish in.


  [1] If you’re interested, the problem was that the burn-in period for the Gibbs sampler wasn’t being discarded correctly, and so some parameters occasionally had rather stratospheric values.