The opinion polls in the run-up to the General Election were a long-running and important news story. There were a lot of them - seven or eight companies - and they reported very regularly. Hardly a day passed without one or two new polls being published. The most notable characteristics of the polls were their consistency with one another and how little they varied over time. Over the four months or so up to May 7th the trend line showed the two main parties neck and neck, with few polls deviating from it. This meant that if a poll did show one party or the other with a (say) three-point lead it tended to be dismissed as an "outlier" - and there weren't many of these.
The polling companies who translated vote-share numbers into seats all pointed to a hung Parliament - neither the Conservatives nor Labour would come close to the roughly 323 seats needed for a majority in the House of Commons. The numbers were finessed to take account of the exceptional situation in Scotland, where it was becoming clear that the Scottish National Party was going to come close to a clean sweep of the seats. And there was also a common view (which, for the record, I shared) that the Liberal Democrats would hold onto far more seats than their woeful national poll numbers suggested - around 30 "holds" was a typical figure.
In the table above we see the seat "forecasts" made on or just before Election Day, based on the latest polls, alongside the exit poll and the final result. If we take The Guardian as an example (the other forecasts were only marginally different), on 7th May, Election Day, it had Labour and the Conservatives on an equal number of seats and the LibDems on 27. In fact the Tories won 99 seats more than Labour, and the LibDems only 8. It was an almost unbelievable and unprecedented polling failure.
Over the course of the election campaign the opinion polls dominated political commentary and the media. It was almost impossible to find anyone who expected anything but a hung parliament, so the commentaries were dictated by this. Possible coalitions, arrangements and the like were explored ad nauseam. And this played back into the party campaigns - not least with the Conservatives warning of a possible Labour/SNP deal.
With the benefit of hindsight we can now see that much of the debate, predicated as it was on a hung parliament, was specious nonsense. The polls were believed - and that was the starting point for everything.
The polling companies have started to analyse and explain what went wrong. Maybe there was a "late swing" (unlikely). Maybe some Conservative voters were too "shy" to admit that they were going to vote Conservative (unconvincing). Maybe there was differential turnout, with potential Labour voters staying at home more than Tory ones (some evidence for this). And maybe quantitative polls no longer really work on their own - which is my view.
A poll is not a forecast; it is a snapshot. On the day it comes down to the floating voter deciding whether or not to vote and then, if he does go to the polling station, pausing with his pencil over the ballot paper before making his choice. The factors influencing that choice are many and varied, but they can be explored - and that is where qualitative research comes in. There was little or no analysis or presentation of focus group results during this election, or of any other qualitative research. Maybe this was because all the "Qual" research was private and not intended for public consumption.
My theory is this. The outcome of the election was heavily influenced by Conservative focus group (and similar) research in the final month or so, and by its translation into communications messages. With sympathetic newspaper proprietors on their side (The Times, the Telegraph, The Sun, the Mail...), the Conservatives could communicate those qualitative-research-based messages widely. So when, on the day before the election, The Sun put on its front page a large photograph of Ed Miliband eating a sandwich awkwardly, there was no randomness to it at all. It was carefully calculated. My guess is that Conservative focus groups showed that some voters found Miliband "weird" and that his "sarnie" struggle was illustrative of this. Daft, offensive, dim-witted - yes. Effective? Probably.
The anti-Miliband position had been created successfully over months and years; the "Red Ed" sobriquet was all part of this. Again I'm guessing here, but I think the Conservative campaign against Miliband was firmly based on research telling them that he was a weak link for Labour. In fact Miliband ran a good campaign in the main and raised his profile - but probably not enough among the crucial floating voters. So, like it or not, Ed with his sarnie may well have lost Labour the election. (I'm being metaphorical here, of course, but the sarnie is a symbol of a discomfort felt by enough voters to tip the balance.)
Back to the pollsters. Asking respondents HOW they would vote at any moment in time is still important - but asking WHY they made that choice is more so. There has been too much reliance on the How this year and insufficient attention to the Why. Except, it's my guess, in Conservative campaign headquarters. Don't confuse voters with too much detail - give them a small number of reasons to prefer you and a couple of powerful reasons not to choose your opponent. Reduce that to a slogan or two - "Red Ed", for example. Provide a powerful visual image to back it up and get your friends in the media to give it prominence. All is fair in love, war - and politics!
This blog seems internally inconsistent: if, as you say, all is fair in love and war, then 'effective' is not 'dim-witted'.
Perhaps I should have said intellectually and morally bereft, and tasteless?