Thursday, April 3, 2014

Concerns raised over Ipsos-Mori's methodology

An anonymous poster left this rather troubling comment on last night's post about polling accuracy -

"James this is an excellent article and has prompted me to share with you my own personal experience.

I had never been polled before but in September I was contacted by Ipsos Mori. On paper I should be a No voter, retired professional, own my house, good company pension etc.

After being asked if I would take part in the poll, I was then asked if they could contact me on a regular basis as they wanted to follow trends.

After two or three very leading questions, one of which started "since Scotland would not be admitted to the EU", I was then asked "if the referendum was tomorrow how would you vote" answer Yes. "How likely are you to vote" answer definitely.

After a few other questions the interviewer thanked me for my time then finished the call.

I have never heard back from Ipsos Mori, I wonder why?"


Now, to be fair, it's possible that this was an unpublished internal poll for the No campaign or for one of the anti-independence parties. If so, there's no problem at all - if Better Together are daft enough to use leading questions to convince themselves that they're doing better than they really are, then by all means let them get on with it. But there was in fact a published Ipsos-Mori referendum poll that was conducted between the 9th and 15th of September last year. There was no suggestion at the time that any questions (let alone leading questions) were asked before the main referendum question, and if there had been it would have completely transformed our perception of the results - look at the way Professor John Curtice castigated the Panelbase poll just one month earlier that had asked the referendum question third.

Because this was a telephone poll, one theoretical possibility is that one or more rogue interviewers were asking the questions in the wrong sequence, and thus potentially distorting the results. It's not entirely paranoid of us to raise these doubts, because as things stand Ipsos-Mori are the extreme outliers in this campaign, and that must be happening for a reason (or for a variety of reasons). The other slightly mysterious thing about them is that they did actually produce one poll that was very good for the Yes side, way back in early 2012. That was followed by a huge slump for Yes over the course of the rest of the year, from which there hasn't really been a proper recovery in any subsequent Ipsos-Mori poll. No other pollster has replicated that trend. The only one that has even come close is TNS-BMRB, but unlike Ipsos-Mori they've shown Yes making a very telling recovery from the low point of the initial slump.

It's very difficult to explain this divergence from the general pattern, given that as far as we know Ipsos-Mori haven't altered their methodology since that early 2012 poll. So if anyone else has been interviewed by them, feel free to let us know your own experience. Were you contacted on your landline phone, or on your mobile? Was the referendum question asked first? If not, could the earlier questions be construed as in any way leading? Was there a preamble to the referendum question, and if so, what was it?

UPDATE : I've been contacted by someone else who was interviewed by Ipsos-Mori more recently. He asked to be anonymised, so I've excluded his postcode and a few other details that might conceivably be personally identifying. In some ways it's quite a similar account to the one above, but crucially there's no indication this time of any leading questions being asked prior to the referendum question. There's still no sign of anyone having been contacted by Ipsos-Mori on their mobile phone. That's important, because if they aren't calling mobiles it's hard to see how they can be confident that their sampling isn't significantly skewed.

"Firstly, thank you for your excellent blog, which has helped me to come to a dim appreciation of some of the complexities of opinion polls. I'm stimulated to contact you as a result of reading your post this morning, with details of a reader who contacted you with details of his experience with IPSOS-MORI.

I have to say that I can echo that experience. I was contacted by them by telephone (my landline) earlier this year - I would guess around 6 to 8 weeks ago, but add a significant plus or minus on either side of that. I was asked a similar set of questions (no leading comments re the EU, but I was asked my voting intention, which is Yes, and my likelihood of voting, which is certain). The final question I was asked related to age - when I said that I was **, I was told that their quota...was filled, and that they could go no further with questions. I was asked if I would be happy to be contacted for future polls, and answered yes, but no further contact has been made to date.

Interestingly, I fall into the same demographic as your other correspondent...I live in a fairly affluent area...and guess I should be a right of centre No voter (neither of which tags describes me!). I don't know if these two anecdotal reports have any significance, but it seemed worth giving you this information."


If you scroll down to the comments section, you can also find an interesting comment from an ex-Mori interviewer.
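
To make the mobile phone point a little more concrete, here's a minimal back-of-the-envelope sketch in Python. Every number in it is invented for the sake of illustration - none of the group sizes or Yes shares come from any real poll or survey of phone ownership. It simply shows that if the minority of voters who can only be reached by mobile hold different views from landline households, even a perfectly conducted landline-only sample will produce a skewed headline figure.

```python
# Entirely illustrative, assumed numbers - not figures from any real poll.

landline_share = 0.85      # assumed share of voters reachable on a landline
mobile_only_share = 0.15   # assumed share reachable only by mobile

yes_landline = 0.34        # assumed Yes support among landline households
yes_mobile_only = 0.45     # assumed Yes support among mobile-only households

# The "true" figure across the whole electorate
true_yes = landline_share * yes_landline + mobile_only_share * yes_mobile_only

# What a landline-only poll reports, even with a perfect sample of
# landline households
landline_only_poll = yes_landline

print(f"True Yes share:     {true_yes:.1%}")
print(f"Landline-only poll: {landline_only_poll:.1%}")
print(f"Coverage bias:      {landline_only_poll - true_yes:+.1%}")
```

On those made-up numbers the landline-only poll understates Yes by a point or two, but the real-world effect could just as easily point the other way - the sketch only shows why coverage matters.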

* * *

I did actually manage to make it through the whole of Clegg v Farage this time, but as someone with pro-European views it was a profoundly depressing experience to have as my supposed "champion" a politician as unappealing and insufferably condescending as Nick Clegg. I'll be honest - he was so dreadful that it got to the point where I was more or less cheering on Farage. In retrospect, it's hard to understand why the public didn't see straight through Clegg in the 2010 leaders' debates, because his style hasn't changed one iota - it must have been a very weird kind of novelty value.

Instead of treating us like adult human beings and saying "that's because of the Lisbon Treaty", he'll say "that's because of something called the Lisbon Treaty, which is...", before proceeding with a patient explanation in words of no more than one syllable for his very favourite little girl Hannah, who asked such a clever question. In fact, I almost expected him to say : "That's a really important question, Hannah, and I'm going to answer it for you with what we call 'words'. Those are really cool things that we use to make up a sentence..."

You won't be surprised to hear that the bit that made the most steam come out of my ears was when Clegg abused the platform he'd been given by embarking on an anti-independence rant. Dimbleby did eventually close down the topic, but he waited far, far too long - it felt like a good thirty seconds went by. And even then, the way he did it was totally unsatisfactory - he seemed to imply he was only shutting Clegg up because independence was a rather tiresome subject. Instead, he should have said this -

"Nick, stop. This isn't on. You know perfectly well that there is no representative here from the pro-independence campaign to put the alternative point of view. The reason they aren't here is that they weren't invited. The reason they weren't invited is that independence isn't on the agenda for discussion tonight. So it's inappropriate for you to raise it, and I'd ask you not to do it again."

The point being of course that Clegg knew full well that he was chancing his arm, and if other anti-independence politicians are to be deterred from doing the same on network TV they need to know they'll be clearly 'punished' by having their transgression flagged up for viewers. A moderator half-heartedly changing the subject after the damage has already been done just isn't good enough.

Interestingly, the BBC's own guidelines call on presenters to effectively slap down politicians who try to slip in a sly anti-independence dig in interviews related to other topics. Those guidelines won't officially come into force until late May, but Dimbleby's performance tonight doesn't fill me with confidence that they'll ever be properly implemented.

22 comments:

  1. If the pollsters are manipulating/being manipulated, aren't they pushing it the wrong way?

    Wouldn't the better manipulation be to have Yes miles ahead, so that people think they don't have to put in the campaigning effort, because it's already won?

    Rather than to have the polls show No farther ahead than it is, which will motivate the Yes campaign to campaign harder, and demotivate the No campaigners, because they think it's already won.

    I guess this is a carrot or stick thing, with the two options being "normalise but demotivate" or "try to convince them it's hopeless".

  2. It is impossible for Ipsos Mori interviewers to ask the questions in the wrong order. The order in which they appear is determined by each particular survey: some surveys rotate the questions to remove any bias that the order of asking might introduce; otherwise they follow a set order. Each question appears on the screen individually in front of the interviewer, and the next will not appear until the interviewer has typed in the response to the current question.

    ex Mori Interviewer

  3. Anon : Thanks for the clarification. So based on your experience, should we be in any way worried by the description of the call given above, or does it sound to you more like a private poll?

  4. Just in case anyone misses it, I've now updated the post above with someone else's experience of being interviewed by Ipsos-Mori.

  5. I can't understand why or how weighting is applied in a one-off referendum.
    Surely past voting history isn't so important?

  6. Past voting history certainly correlates strongly with referendum voting intention, so that's the logic of weighting by it (although of course Ipsos-Mori are unusual in not doing that).

  7. James, Do you have any evidence to support:
    Past voting history certainly correlates strongly with referendum voting intention

  8. Even some Tories are voting Yes, so I still can't understand how weighting works for the Indy vote?

  9. Not particularly worried by the description, just a bit mystified that the survey didn't "screen out" right at the start. As to a private poll versus one for publication, I see no reason why the methodology would change depending on the use the results are to be put to - best practice is best practice - why change just because the client is going to keep the results to themselves?

    When I worked there these types of surveys were carried out using random digit dialling (RDD). A computer would randomly generate numbers; when it generated one with a suitable area code for the area being surveyed, it would try to dial out. If it turned out to be an actual telephone number, someone's phone would ring, and when it was picked up the call was automatically connected to a waiting interviewer, who would introduce themselves and ask the respondent if they were willing to take part in the interview.

    It is at this point, the start, that the demographic questions are asked - gender, age, ethnicity; why waste the interviewer's time (£££) conducting the interview only to find out the interviewee is in a demographic group whose quota is already full? Sometimes surveys did "screen out" halfway through when a particular response was given; you would then have to apologise to a usually mystified person on the other end of the phone. I'm not discounting that they might be asking the age question at a different point in the interview - just that, because it costs Mori to have interviewers sit asking people questions they might not be able to complete due to quotas being filled, you would expect that to be done right at the start.

    [Demographic questions are asked to ensure a representative sample is included in the survey. Census data provides the quotas for each demographic group.]

    I included the description of RDD as I think this excludes mobile phones from these types of surveys. I may be wrong - I was just a humble interviewer, and it has been a couple of years since I worked there. Hope that helps.

    ex Mori Interviewer

  10. Anon : The concern about public v private polling is the question sequence. Clearly if Ipsos-Mori ask a leading question (indeed a downright inaccurate one) about Scotland being forced to leave the EU, and only afterwards ask the main referendum question, then they're not likely to produce an accurate result - the No vote will be too high and the Yes vote will be too low. That doesn't matter so much if it's a private poll - the client will be misled but that's the client's own fault. But if it's a public poll and Ipsos-Mori haven't disclosed what they've done, then that's a very serious matter, because it means we've all been led up the garden path. I find it hard to believe that they would do that, though, which is why I wondered if the description of the call sounded more like a private poll to you.

    Thank you for explaining the whole process, though - it's really interesting to hear the details of how it works. (For the curious, I've added a rough sketch of that dialling-and-quota process below the comments.)

  11. Alasdair : Presumably you've seen the various datasets, so if you don't regard that as sufficient evidence of a strong correlation, I'm not quite sure what you'd be looking for?

    Juteman : Look at it this way. Not all over-65s will vote No, but there is a strong correlation between being over 65 and voting No. So if a poll doesn't have enough over-65s in its sample, the Yes vote it reports is likely to be too high. By the same token, if you have too many Tory voters and too few SNP voters in the sample, the Yes vote is likely to be too low, regardless of the fact that not all SNP voters are planning to vote Yes and not all Tory voters are planning to vote No. (There's a rough worked example of this, with invented numbers, below the comments.)

  12. As another ex-Mori interviewer, I can say that my time working there was an eye-opening experience, and I will never trust a single result they put out again.

    If you have an understanding of demographics, you can see even down at the lowly level of phone-monkey that the quotas are often designed to steer the poll towards a certain outcome by focusing on age groups or socio-economic backgrounds that, statistically, will hold certain opinions. In one political poll, the preamble for the headline question was -five paragraphs long- and amounted to "X hates you, your family, your dog, and everything you like, love, and stand for; would you vote for X?".

    Now, Mori themselves aren't coming up with most of that kind of rubbish - the clients set the brief, after all - but the very fact that they're willing to take on clients with such narrow, biased briefs, clearly designed to generate specific results, makes Mori as a company suspect in all regards, IMO.

  13. "Not all over-65s will No, but there is a strong correlation between being over 65 and voting No"

    How can that be? They have never voted 'No' in an indy referendum before, so there's nothing to base that assumption on.

  14. A correlation between being over 65 and an intention to vote No, to be pedantic.

  15. I'm not trying to be picky, James,
    I just feel that something is wrong in the methods used by the polling companies for this referendum. I think the weighting might be at fault, as there is no past vote to weight. Asking folk what party they voted for last is only of limited use.
    The actual result of this referendum may cause a few red faces amongst polling companies.

  16. James,

    Re: ‘Past voting history certainly correlates strongly with referendum voting intention’

    We cannot rely on the datasets arising from referendum polling as evidence of a correlation between voting history and referendum voting intention. Remember that previous voting history is built into (most) referendum polling methodology (at the very least by means of political weighting, but arguably also by sampling by electoral constituency, which has no bearing on the referendum). This dependence on historic voting patterns gives rise to a ‘fallacy of presumption’! That is: the proof of the conclusion (in this case that there is a correlation between voting history and referendum voting intention) rests upon the presumption of its truth (as evidenced by the pollsters including previous voting intention as a sampling/weighting criterion).

    Even six months out from polling day, we already have a raft of evidence (voter registration, diverse campaign groups, social media activity and crowdfunding initiatives, not to mention the amazing attendance at public meetings) that suggests the electorate is taking this referendum into uncharted territory, and we may fairly ask searching questions of any presumptions vis-à-vis past voting intention built into a polling sample.

    In short, and to paraphrase Tony Blair: ‘the electorate is shaking the kaleidoscope and the pieces are in flux’, and we may/must hold as highly suspect the reliability of any headline forecast of a Yes/No result reported by any poll relying on previous electoral patterns. That said, we do well to remember that this uncertainty cuts many ways. Perhaps the polls are correct and the No campaign is heading for a respectable/narrow victory. Perhaps the Ian Smart view/hope of the world is correct and the cause of independence is heading for a massive hammering. Maybe the Yes campaign is heading for a massive victory on September 18th. The plain and scary truth is that we just don’t know!

    You expressed my thoughts better, Alasdair.
    I hope it is a massive Yes of course! :-)

  18. Alasdair : To some extent I agree that we don't know, for the reasons set out in the previous post. But I'm struggling to understand how you can credibly argue that there may not be a strong correlation between voting history and referendum voting intention - the evidence that there is such a link is absolutely overwhelming. (Which is in no sense bad news - pro-independence parties outpolled anti-independence parties on the 2011 list vote, after all.)

    James, My point is that with the referendum already delivering greatly increased electoral registration and promising the prospect of much higher turnout (about which I now agree with you), the voting pattern assumptions imported from past elections must be questionable (unless we assume that those who have self-excluded from the electoral process did so confident that the result determined by those who participated would faithfully reflect their wishes). To be clear, I am not saying that the assumptions are necessarily wrong, merely that they hang upon a rather shoogly peg.

    I think we're talking at cross-purposes. Obviously higher levels of registration and turnout significantly reduce the importance of the correlation between past vote recall and referendum voting intention, but that doesn't mean the correlation isn't there.

    RIP Margo MacDonald.

  21. @ILLY

    The one thing that the BT/Unionist campaign fears most is gathering momentum for the Yes campaign.

    This is why they would be happy to put out dodgy press releases suggesting Yes is languishing in the polls: for all the possible downsides of this strategy, it is more than made up for by stopping Yes from gathering momentum.

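
* * *

A footnote for the methodology-curious, following up two points flagged in the thread above. First, here's a rough illustrative sketch of the dialling-and-quota process the ex-Mori interviewer describes. To be clear, this is my own guess at the general shape of such a system, not Ipsos-Mori's actual software; the area codes, quota targets and age bands are all invented for the example.

```python
import random

# Invented area codes, quota targets and age bands - purely illustrative,
# not Ipsos-Mori's real setup.
AREA_CODES = ["0131", "0141", "01224"]
QUOTAS = {"16-34": 300, "35-64": 450, "65+": 250}   # assumed census-style targets
completed = {group: 0 for group in QUOTAS}

def generate_number():
    """Random digit dialling: a random area code plus random local digits."""
    return random.choice(AREA_CODES) + f"{random.randrange(10**7):07d}"

def age_group(age):
    if age < 35:
        return "16-34"
    return "35-64" if age < 65 else "65+"

def start_interview(respondent_age):
    """Ask the demographic question first; screen out if that quota is full."""
    group = age_group(respondent_age)
    if completed[group] >= QUOTAS[group]:
        return False    # "the quota was filled... they could go no further"
    completed[group] += 1
    return True         # otherwise carry on with the rest of the survey

# Numbers built from geographic area codes are landline numbers, which is
# why this style of RDD would never reach a mobile-only voter.
print(generate_number())
print(start_interview(respondent_age=67))
```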
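Second, here's the rough worked example referred to in comment 11 - again with invented numbers - showing why a sample that is short of over-65s will tend to report too high a Yes figure unless it is weighted back to the population's age profile.

```python
# Invented age profile, sample counts and Yes shares - used purely to show
# the mechanics of demographic weighting, not taken from any real poll.

population_share = {"16-64": 0.77, "65+": 0.23}   # assumed census-style profile
sample_size = {"16-64": 850, "65+": 150}          # assumed raw sample: too few over-65s
yes_share = {"16-64": 0.42, "65+": 0.25}          # assumed Yes support in each group

total = sum(sample_size.values())

# Unweighted estimate: every respondent counts equally, so the over-represented
# group drags the headline figure towards its own view
unweighted = sum(sample_size[g] / total * yes_share[g] for g in sample_size)

# Weighted estimate: each group is scaled back to its population share
weighted = sum(population_share[g] * yes_share[g] for g in population_share)

print(f"Unweighted Yes: {unweighted:.1%}")   # higher, because over-65s are under-sampled
print(f"Weighted Yes:   {weighted:.1%}")
```

The same mechanics apply to weighting by recalled past vote, which is where Alasdair's circularity concern comes in - the correction is only as good as the assumptions behind the targets.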