Saturday, August 16, 2014

Reading the runes for tonight's polls

I've just arrived in Brussels, and I've barely had a chance to catch my breath, but this is how I see tonight's polling situation.  If the embargoed Panelbase poll turns out to have the figures that were prematurely tweeted earlier, it could be - pound for pound - the best Panelbase poll ever for Yes.  That's because the firm's methodology has changed radically since the last time they produced a 48/52 split.  At the very least, it's one of the two best Panelbase polls ever (excluding the famous one from a year ago, which is generally set aside because of the question sequence).  That's problematic for the No trolls who are trying to play this down, because we're not supposed to be in "Yes have failed to make even further progress from their all-time high" territory, but instead in "Darling bounce helps No to decisive lead" territory.  So much for all that, eh?

It may seem bizarre that Blair McDougall, the No camp's embarrassment of a campaign chief, has apparently been going out of his way to remind people tonight that Panelbase actually showed a Yes lead in a poll last year.  The reason he's doing it is presumably to make the new one seem relatively unimpressive by comparison, but in reality it's highly unlikely that the two polls are directly comparable.  Neither the SNP nor Yes Scotland have insisted upon the unusual question sequence that generated that Yes lead in any subsequent poll.

Judging from Twitter Kremlinology, it's probable that the ICM poll shows some kind of reduction in the No lead, although it's difficult to know what to read into that because the last poll from the firm was on the good side for No.

Kenny Farquharson has once again been rather catty tonight, suggesting that the ICM poll for his publication is the only "independent" poll of this evening, as compared to a Yes Scotland poll that was passed on to a "pet" paper.  Does he have a point?  The short answer is "no", and the longer answer is "mostly no".  As long as there was no jiggery-pokery with the question sequence, it makes no difference whether the paying client is "independent" or not - the results of the headline question are equally credible.  The supplementary questions may be leading, but because they're asked later they can't affect whether respondents say 'Yes' or 'No' to independence.  The only problem with having a partisan client is that they may only publish polls that are particularly favourable, so over a period of time we would only get a partial picture.  But if the Yes campaign have indeed been withholding less favourable Panelbase results, that would just make tonight's numbers look even better.

And if there has been a Yes bounce, does that mean that once you cut away all the spin from the unionist media, Alex Salmond actually defeated Alistair Darling in the debate?  Not necessarily.  It seems to me that Salmond was in a no-lose situation, because the debate will have made people think - and we all know which way voters tend to swing in this campaign once they actively start seeking out information for themselves.

"Darling win" proves to be myth as Yes vote surges to 48% in sensational Panelbase poll

New Panelbase poll :

Yes 48%
No 52%

More to follow...

Testing, testing, one, two, three, four...

I'm sitting in Dover Pier, waiting to catch a ferry (you probably would have guessed that), and as I feared my mobile is playing up horribly, so I thought I'd try a very quick post as a test.  The first casualty of this trip is that I've just been asked to take part in Referendum TV tomorrow at the Edinburgh Fringe to talk about the polls that are apparently coming overnight, but obviously I'll be several hundred miles away.

Talking of the Fringe, I didn't get round to doing my annual reviews, but if they're still on I can highly recommend The Tulip Tree (about a young Enoch Powell and unrequited love) and Agamemnon, part of The Bunker Trilogy.

As usual, I also saw the open air play in Duddingston, which I had mixed feelings about this time (it's 'Mary Stewart') but it's still worth a look.  Oh, and there was a barking mad and borderline-tasteless musical about Jack the Ripper.  Bliss!

Friday, August 15, 2014

Pro-independence campaign stand on brink of victory in new study after 'undecided leaners' are taken into account

A new study (as opposed to a conventional poll) conducted by the Economic and Social Research Council has been reported by the Scotsman. Supposedly the results should be available in full from today, but I've been unable to track them down, so I can only go on the very limited information the newspaper has made available. The headline results are...

Should Scotland be an independent country?

Yes 38%
No 51%


Which would probably work out as a 43/57 split after Don't Knows are excluded. But in fact when those Don't Knows were pressed further on how they are leaning, most gave an answer, and of those that did there was a 2-1 split in favour of Yes. When those 'undecided leaners' are taken into account, the result is extraordinarily close -

Yes 47.5%
No 52.5%
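For anyone who wants to check the arithmetic, here's a rough Python sketch of the two calculations - the exclusion of Don't Knows is straightforward, but the study's full dataset isn't public, so the leaner figures in the second step are hypothetical placeholders rather than the real numbers:

```python
def exclude_dks(yes, no):
    """Re-express raw Yes/No percentages with Don't Knows stripped out."""
    return 100 * yes / (yes + no), 100 * no / (yes + no)

print(exclude_dks(38, 51))  # roughly (42.7, 57.3) - hence the 43/57 split

def add_leaners(yes, no, lean_yes, lean_no):
    """Fold undecided leaners back in, then exclude the residual Don't Knows."""
    return exclude_dks(yes + lean_yes, no + lean_no)

# Hypothetical: if 12 points of undecideds gave a leaning, splitting 2-1 for Yes...
print(add_leaners(38, 51, lean_yes=8, lean_no=4))
```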


No word on the fieldwork dates, and we probably have to assume that they're likely to be a little bit out of date, and that they took place over a longer period of time than would be normal for a poll (even a TNS poll!). But encouraging stuff all the same.

* * *

A Housekeeping Note

Barring unexpected hitches, I expect to be on the move over the next ten days or so. Normally in these circumstances I would just pre-schedule a couple of posts and leave it at that, but obviously that's not realistic with polling day just over a month away. So I'm going to attempt to keep blogging (emphasis on the word attempt) via my useless mobile phone. Updates will be shorter and less frequent, and if they disappear altogether you'll probably be able to surmise what has happened!

Can we believe the polls (or the pollsters)?

A guest post by Scott Hamilton

In the referendum campaign we are being presented with polling data on an almost weekly basis. When a poll is released (often in the wee small hours) there is an almost immediate scrum to understand what the numbers mean for each side, often with a Twitter race to churn out a pleasing graphic triumphantly cherry-picking the most striking result.

Whilst the evidence that people are directly affected by polling data - in terms of how they vote, at least - is scarce at best, it is easier to conclude that voting intention could be influenced by the media, who are demonstrably affected by polling data. Often it is the media outlets that commission the polls who are most vocal about the results (understandable, given that their money often pays for the analysis). Polling generates fairly cheap copy and makes for good, dramatic headlines where each side of the campaign is said to be “winning” or “losing”, though more often than not the narrative is much more dramatic than that - “blow for Salmond” appears to be something of a favourite.

Error, errors and more errors

Question - when have you ever seen a newspaper headline that expressed a poll result with ANY discussion of error front and centre? Never, right? The “Blow for Salmond” headline with the “60% No” strapline isn’t quite as sexy when you add “this value is subject to at best plus or minus 3% error, maybe much more - please interpret these results with caution”.

We (I’m looking at you, MSM!) should remember that, like any observationally driven procedure, polling is subject to error. Most people with a passing interest in polls will be aware of the oft-quoted “plus or minus 3%” figure that the polling companies and the media use as something of a “quality guarantee”. But this is far too simplistic a metric, because in truth the error associated with any single political poll is actually unknowable - and here’s why.

The “plus or minus 3%” error value is the absolute best case a polling company can achieve, because that figure represents only the sampling error, not the total error in the poll. Sampling error is the potential variation from the true value that comes from trying to represent a large population with a much smaller sample. For example, if you wanted to know how many left-handed people there were in Scotland, you could ask 1,000 people and be pretty sure you were within about 3% of the correct answer when all’s said and done.
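That back-of-the-envelope figure can be sanity-checked with the standard formula for the sampling margin of error, z × √(p(1−p)/n) - a purely illustrative Python sketch, not any pollster's actual calculation:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% sampling margin of error for a proportion p estimated
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The worst case is p = 0.5: for 1,000 respondents that's roughly
# +/- 3.1%, which is where the familiar "plus or minus 3%" comes from.
print(round(margin_of_error(0.5, 1000) * 100, 1))
```

Note that this covers sampling error only; none of the other error sources discussed below appear anywhere in it.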

Now, leaving sampling error to one side, there are other potential sources of error in a polling survey. Commonly these are called coverage error, measurement error and non-response error.

Coverage error arises from not being able to contact portions of the population - perhaps the pollster only uses people with an internet connection, or a landline telephone number. This can introduce bias into the sample for a whole host of reasons.

Measurement error comes about when the survey itself is flawed - perhaps in terms of question wording, question order, interviewer error, poorly trained researchers, and so on. This is perhaps the most difficult source of error to understand: it is unlikely that even the polling companies could put a % figure on how much error these methodological aspects contribute to the total for the survey! Taking the left-handed/right-handed example above, that is a completely non-partisan question that people should have no qualms about answering honestly. They also won’t have forgotten which hand they use - unlike, say, how they voted in an election three years ago, which is sometimes used to adjust poll results. I’m also confident there’s little potential for vested interest in such a question and how it’s worded and framed - perhaps not the case for an issue like the upcoming independence referendum!

Non-response error results from the surveyor not being able to access the required demographic for simple reasons like people not answering the phone, or ignoring the door when knocked - unavoidable really.

Polling companies try to account for all of this uncertainty by using weighting procedures whereby the sample (or sub groups in the sample) is adjusted to make it align more closely with the demographics of the population being surveyed. For instance, the sample might have 10% too many women compared with the population, so therefore women’s voting preference would be weighted by 0.9 in the final analysis to account for this.
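As a purely illustrative sketch of that kind of adjustment (the respondent counts and target shares below are invented, and real pollsters weight on many variables at once, not just gender):

```python
# Hypothetical raw sample and target population shares - illustrative only.
sample = {"women": 550, "men": 450}        # respondents interviewed
population = {"women": 0.51, "men": 0.49}  # shares in the real population

n = sum(sample.values())
# Each group's weight = its population share / its share of the sample.
weights = {g: population[g] / (sample[g] / n) for g in sample}
print(weights)  # women weighted down to ~0.93, men weighted up to ~1.09
```

The weighted sample then matches the population on gender, but weighting can only correct for imbalances the pollster knows about and measures.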

But bear in mind, if we accept the premise that the absolute best case for a question as straightforward as “do you use your left or right hand to write?” is plus or minus 3% error after weighting, how much error do you think may still exist in a survey on something as societally complex as the Scottish indyref? We simply do not know, and no polling company can tell you either. There’s simply no way for them to fully account for the total error in their results - not that they or their sponsors tell you that. And perhaps just as bad, when they get it right, they can’t say with confidence how that came to be so - good sample? Good coverage? Nice researcher getting honest answers?

Still feel confident about those results in the Daily Record?

Some numbers

OK, so we know a bit about error - but what difference does all this make? Well, quite a bit actually! Thankfully there are some quite recent Scottish elections we can use as test cases for further analysis. Bear in mind that in these cases the data is the final, adjusted, polished, weighted, dressed-up-for-the-school-disco data - the polling companies’ best estimates, which we can use to see how well they reflected the outcome of a real vote.

Let’s pause here, however - polling companies will always say “you can’t compare the poll done a month before the election with the final result! Opinion must have changed”. Perhaps uniquely in what is, after all, supposed to be a scientifically driven pursuit, there is no penalty for a polling company being entirely wrong all of the time! They always have the fallback position of “we were right at the time”. But then, I’m sure the polling company would say, “how do you know we’re wrong?”, and we’re left going round in circles in the context of a compliant media blindly accepting and promoting results which it itself commissioned. Anything wrong with this picture? Any space for vested interest? Media commissions poll, media shouts about poll it commissioned and perhaps even designed...

My central problem with all of this is that the media uses the error strewn polling from at least several weeks (and months!) before a major voting event to strongly suggest or even predict the outcome of that event. Even if they don’t come out and say it, my theory is that some of what they are trying to achieve is an acceptance in the voting community of a preordained outcome, backed up conveniently by their numbers. Not exactly playing fair.

As I write we’re about five weeks from the referendum, so I thought this a good time to look at how accurate a few of the polling companies were in the run-up to the 2011 Scottish Parliament election - but not in % terms, as that can be kinda obtuse. Let’s turn it into votes!

To establish the predictive power of the polling companies at various points in the weeks leading up to the vote, I’ve turned the difference between the outcome and their poll on a given date into actual votes cast. After all, we know how many people voted (thank you, Wikipedia), and we know how many people voted for each party, so we can see how many people the pollsters thought would vote for each party on a given date.
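For what it's worth, that conversion only takes a few lines of Python. This is a sketch rather than the exact calculation used for the figures below - the electorate figure is simply derived from the ~1,990,000 votes and 50.4% turnout quoted in the next section, and the example uses the YouGov SNP shares discussed there:

```python
# ~1,990,000 votes on a 50.4% turnout implies an electorate of
# roughly 3,950,000 (derived, not an official figure).
electorate = 3_950_000
votes_cast = electorate * 0.504

def misplaced_votes(poll_share, actual_share, votes=votes_cast):
    """The gap between a polled share and the actual result,
    expressed as people rather than percentage points."""
    return abs(poll_share - actual_share) * votes

# SNP example: YouGov's 40% against the actual 45.39%.
print(round(misplaced_votes(0.40, 0.4539)))  # ~107,000 for the SNP alone

# The same percentage error scaled to an assumed 80% turnout:
print(round(misplaced_votes(0.40, 0.4539, electorate * 0.80)))
```

Adding the errors for the other parties on top of the SNP figure is how totals in the 140,000-plus range arise.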

2011 Scottish Parliament Election

In 2011 the total votes cast amounted to about 1,990,000 on a turnout of about 50.4%. The SNP ended up getting 902,915 votes in total (45.39% of the total). Labour got 630,461 - 31.69% of the total. The other parties were less significant so I’ll stick to these two.

YouGov : on the 15th of April 2011 this polling company put the SNP vote share at 40% and Labour’s at 37%. These values don’t sound too dramatic compared with the outcome, but I estimate this represents (with errors for other parties included) about 140,000 Scots who didn’t vote as per the polling percentages just three weeks before the election. Most of the error is in overestimating Labour’s share, and underestimating the SNP’s. This, after all the weighting procedures that are supposed to reduce error... the dressed-for-the-school-disco data.

Remember, this is for a turnout of about 50%, so if we scale the same error to 70% and 80% turnouts (both plausible for the referendum) we end up with quite staggering numbers - 189,000 and 216,000 voters in the “wrong box” less than a month from the vote. Repeat after me, “plus or minus 3%”. Could over 200,000 people influence the outcome of the indyref?

Granted, by the day before the vote YouGov’s polls better reflected the outcome - but they still missed it by some 100,000 voters (or 134,000 and 153,000 when scaled for reasonably expected indyref turnouts). “Plus or minus 3%...”

Just so it doesn’t look like I’m picking on selected polls - YouGov’s polls, on average from February to May 2011, differed from the eventual outcome by something like 140,000 people (an amount of error that could mean as much as 230,000 people assuming a high indyref turnout). Could 230,000 people swing an indyref?

TNS : this polling company conducted fewer polls in the run-up to the 2011 election, but their polling at five weeks out (27th March) represented about 166,000 voters in the “wrong box” - that is to say, they did not vote as polled. Scaled to a 70%/80% turnout, that is about a quarter of a million people.

By a few days before the vote, TNS’ numbers better reflected what happened on the day, but there were still 82,000 people who didn’t vote as expected. If the turnout had been 80%, this would mean 153,000 voters. Could that many people swing an indyref?

Conclusion

Hopefully this piece has helped shine a light on how uncertain polls are, how they can carry quite serious errors corresponding to hundreds of thousands of voters (sometimes even the day before the vote!), and why you should be utterly sceptical about any news outlet’s representation of them.

So, when you’re reading the paper on Sunday and there’s a poll in it - remember that the results could have an error amounting to a couple of hundred thousand Scottish voters. How they’ll swing on the day no-one knows, least of all the pollsters. The only 100% certainty is that 100% of the polls are wrong 100% of the time, worth bearing in mind as we enter the final weeks!

Thursday, August 14, 2014

BREAKING : Have the BBC decided to involve No-friendly pollster Ipsos-Mori in their leaders' debate?

I've just received this very disturbing email -

"Was just phoned by Ipsos Mori asking for the views of 16 to 34 year olds ahead of the referendum. At the end of the interview, I asked when the poll was going to be published, and was told that the poll's findings would be published before the debate between Alex Salmond and Alistair Darling, with some of the findings used as questions. Off the top of my head I was asked which way I intend to vote, how likely I am to vote, which currency Scotland would use, how much of a risk independence would be, how would independence affect public spending, healthcare etc, how likely it is that more powers will be given, how more powers would affect my voting position and how much better off I will be due to independence. That isn't an all inclusive list nor are the questions all worded correctly- however I do find it strange that the majority of the questions were those that could be spun against Alex Salmond as opposed to Alistair Darling. I'm not sure if this poll's use in the debate is yet common knowledge - maybe I'm slow."

I'm hoping against hope that this is just a misunderstanding. As I recall, the BBC have a policy that explicitly forbids the commissioning of voting intention polls. If they've lost their marbles and ignored that policy, it would obviously severely call into question the corporation's neutrality if they had also selected one of the most No-friendly firms to conduct the poll. At least STV had the (thin) excuse that they'd used Ipsos-Mori before.

There's just one thing I don't understand, Professor

See if you can reconcile two statements from Professor John Curtice, made within the same blogpost about the new female-only Survation poll. I must say I'm struggling.

Statement 1 : "In the company’s last regular monthly omnibus for the Daily Record in July, Yes support amongst women stood at 40%, exactly the same as today’s poll. Meanwhile the equivalent figures for the three months before that were June, 41%, May 38%, and April just 36%."

What he's pointing out here is that support for Yes among women in this poll is roughly the same as it was in two previous Survation polls which had Yes at 47% among the entire sample, and is significantly higher than in two earlier polls that had Yes at 44-45%. In other words, although it's impossible to draw firm conclusions, the new poll is perfectly consistent with Yes being above 45% across the population as a whole.

Statement 2 : "Of course what the poll does do is to add to the substantial weight of evidence that the Yes side is well behind in the polls."

Er, come again? How does a poll which could easily be consistent with Yes being as high as 47% add to the weight of evidence that Yes is "well" behind? Or does a 47/53 split suddenly fit the definition of a "commanding" lead, to use Survation's recent word of choice?

In all honesty, though, I was shaking my head in disbelief looking through Survation's datasets. Respondents in the South of Scotland have been upweighted from 55 to 128, and over-65s have been upweighted from 95 to 232. The upweighting of young voters isn't quite so extreme this time, but it's still substantial. All of this introduces a level of randomness (and likely volatility) into Survation's results which just isn't satisfactory. I get the slight impression that the various upweightings may have slightly helped Yes on this occasion, but it's difficult to be sure given that it's a female-only poll. Yet again, the No lead is fractionally lower in the raw unweighted data than it is in the weighted numbers, but that's probably due in large part to the extreme shortage of older voters in the raw sample.

To the extent that we can take the results seriously, though, the good news is that the gap is slightly smaller on the unrounded numbers...

Should Scotland be an independent country? (Women only, Don't Knows excluded)

Yes 40.4% (+1.3)
No 59.6% (-1.3)


Undecided female voters were pressed on how they would be most likely to vote, and they split 48.7% Yes, 51.3% No (excluding respondents who still couldn't give a view). Because that's a much narrower gap than in the headline numbers, it means that the No lead would be slightly lower if 'undecided leaners' were added in. For technical reasons, it's not possible to make the calculation, but it would probably work out as either a 41/59 or 42/58 split (and based on Curtice's observation, that could easily be consistent with a Yes vote of as high as 48% once men are taken into account!).

One thing I should have pointed out last night is that Survation still haven't joined the new orthodoxy of weighting by country of birth, so for that reason they may well be understating the Yes vote slightly in all of their polls.

Wednesday, August 13, 2014

No lead slumps by 3% among women in new female-only Survation poll

A rather odd new referendum poll from Survation is out, conducted among women only, who of course tend to be somewhat less likely to be Yes voters than men. However, it's still possible to work out the trend from the last Survation poll which was conducted immediately after the leaders' debate, and that trend is favourable for Yes. So if there ever was a significant post-debate boost for No (it may well have been an illusion caused by normal sampling variation), it appears to be fading fast.

Should Scotland be an independent country? (Women only)

Yes 34% (+1)
No 50% (-2)


With Don't Knows excluded, that works out as...

Yes 40% (+1)
No 60% (-1)


Those changes would be consistent with a No lead of about 10% or so among the population as a whole, which was fairly standard for Survation during the spring. With undecideds excluded, that would equate to roughly Yes 44% or 45%, and No 56% or 55%.

This poll was commissioned by the Daily Record, and it appears to have completely replaced their normal monthly referendum poll, which on the face of it rather defeats the purpose of having an ongoing monthly series. The logic may be that there have been two very recent full-scale Survation polls in the Daily Mail, which would render a third one redundant, thus offering an opportunity to do something a little different. But why a female-only poll in particular? I know there are many people in both the Yes and No camps who will welcome a focus on women's views, but given that this is an anti-independence newspaper we're talking about, I find it hard not to be cynical about the motivations for this. I suspect it may be a cunning wheeze to hoodwink the Record's readers into thinking that the No lead is wider than it really is.

Certainly every time we see one of these 'special' polls, it always conveniently seems to be among a No-friendly demographic. I believe this is the second female-only poll, and there was also a Populus poll of over-50s a few months ago. And let's not forget those two bizarre full-scale ComRes telephone polls that were confined to the Borders and Dumfries & Galloway - a particularly No-friendly part of Scotland containing just 5% of the national population. I've got a suggestion for next time - how about a poll of low-income male Glaswegians, aged between 25 and 44?

* * *

As Scottish Skier pointed out a few hours ago, TNS-BMRB have followed in the footsteps of Survation and YouGov by producing a poll that has a slightly lower No lead in the raw unweighted data than it does in the weighted results for publication. It really is extraordinarily unusual for three polls in a row to show this pattern, because Yes are generally weighted up rather than down. It may of course just be pure coincidence, but there could also be something interesting going on beneath the surface which might point to the Yes vote being underestimated in these recent polls. In the case of the TNS poll, part of the explanation seems to be that the No lead is much bigger among people who say they didn't vote in 2011 than it is among the whole sample, which doesn't usually happen. The responses of those non-voters have been significantly upweighted in the overall results, in line with TNS-BMRB's highly questionable weighting procedure. Also, people who recall voting for minor parties in 2011 have broken heavily for Yes this month, and they have been subject to an extreme downweighting, from 33 real respondents to only 6 'virtual' respondents.

I predicted last night that simply stripping out the small number of respondents who say they are certain not to vote in the referendum would be sufficient to increase the Yes vote by an appreciable amount, and so it has proved - it actually trims the No lead by a full 2.6%. Here are the figures for each level of likelihood to vote, with Don't Knows excluded...

Whole sample (equivalent to 100% turnout) :

Yes 41.6% (-2.0)
No 58.4% (+2.0)

Whole sample excluding only definite non-voters (equivalent to 93% turnout) :

Yes 42.9% (-1.4)
No 57.1% (+1.4)

Respondents who say they are certain or very likely to vote (equivalent to 79% turnout) :

Yes 44.6% (-0.4)
No 55.4% (+0.4)

Respondents who say they are certain to vote (equivalent to 71% turnout) :

Yes 45.2% (+0.6)
No 54.8% (-0.6)


The last set of figures represents the narrowest gap of the campaign so far in a TNS poll.

You probably don't need me to point out that the Yes vote gets higher as the assumed turnout gets lower. And I don't want to alarm anyone in the No campaign (much), but a 71% or 79% turnout does sound a hell of a lot more plausible than a 93% or a 100% turnout.
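To make the turnout-filter calculation concrete, here's a toy sketch in Python - the respondents and the five-point likelihood scale are invented for illustration, not TNS's actual data or scale:

```python
# Toy respondents: (referendum vote, likelihood to vote on a 1-5 scale,
# where 5 = certain to vote and 1 = certain not to vote). Invented data.
respondents = [
    ("Yes", 5), ("Yes", 5), ("No", 5), ("No", 5),
    ("No", 4), ("No", 3), ("DK", 2), ("No", 1),
]

def yes_share(data, min_likelihood):
    """Yes % among respondents at or above the likelihood threshold,
    with Don't Knows excluded."""
    votes = [v for v, l in data if l >= min_likelihood and v != "DK"]
    return 100 * votes.count("Yes") / len(votes)

# Tightening the filter (i.e. assuming a lower turnout) shifts the split:
for threshold in (1, 4, 5):
    print(threshold, round(yes_share(respondents, threshold), 1))
```

In this made-up sample the Yes share rises as the filter tightens, mirroring the pattern in the real TNS tables above.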

Wisdom on Wednesday : Calling a spade a spade

"As an imperialist class-based state the UK is poorly equipped to meet the divergent needs of its constituent nations."

Author Irvine Welsh.

Yes vote soars to yet another record high among definite voters in new poll from traditionally No-friendly firm TNS-BMRB

It's been a bruising few days for Blair McDougall and the other Abominable No-Men, as their hopes of a post-debate polling bounce have receded.  We now have a second poll that was partly conducted after the debate, and the pattern is uncannily similar to the first one - the No lead has actually fallen among definite voters.  And in this case, it's fallen to its lowest level of the campaign so far in any poll from the traditionally No-friendly firm TNS-BMRB.

Should Scotland be an independent country? (Definite voters only)

Yes 38% (+1)
No 46% (n/c)

Although that's only a 1% increase in the Yes vote share, it's worth remembering that the 37% vote recorded in last month's poll was 2% higher than it had been in any previous TNS poll - or to put it another way, tonight's poll has the Yes vote 3% higher than in any TNS poll prior to July.  A rough calculation suggests that with Don't Knows excluded the position is likely to be Yes 45% (n/c), No 55% (n/c), although there's an outside chance that it might be Yes 46% (+1), No 54% (-1), which would also be a new record.  We'll find out when the datasets are published, presumably tomorrow.

On the headline figures that cover the whole sample regardless of likelihood to vote (they even include people who say they are absolutely certain not to vote!) there has been a predictable reversion to the mean since the last poll which saw the No lead slump to a record low by a huge margin.  The bad news for the No campaign is that while tonight's figures may not be the worst for them in a TNS poll, they're the second worst.  Last month's poll was the first time in the campaign that Yes had been higher than 41% in a TNS poll after Don't Knows were stripped out, and tonight is the second.  That makes it very difficult to argue that Yes haven't made further progress since the spring, unless there have been some highly coincidental margin of error effects in both of the last two polls.

Should Scotland be an independent country? (Whole sample with Don't Knows excluded)

Yes 42% (-2)
No 58% (+2)

With Don't Knows taken into account, Yes remain on 32%, which is their record high for the campaign - prior to last month's poll they had never been higher than 30% (or at least not since TNS introduced an enormous methodological change).

When the datasets are released, the first calculation I'll be making is voting intentions with the small number of definite non-voters stripped out - in both of the last two TNS polls that in itself was sufficient to increase the Yes vote by the best part of 1%.  And that gives a clue as to why the figures for definite voters only (which TNS are increasingly giving parity of esteem to) may be the more meaningful ones - they equate to a turnout in the low 70s, which is perfectly plausible.  If you think that the turnout is more likely to be around the 80% mark (or a bit higher), then the relevant figures will be for those who say they are very likely or certain to vote, and we won't find out what those are until tomorrow - although it's reasonable to assume that they'll be more favourable for Yes than the numbers from the whole sample.

As I always point out, good results for Yes in polls from TNS and Ipsos-Mori are worth more than good results from other pollsters, because those two are the only firms that don't rely on volunteer online panels for their referendum polling, and that actually seek out a sample in the 'real world'.  It's hugely encouraging that both are now showing Yes at a record high for the campaign among definite voters (42.5% in the case of Ipsos-Mori, 45% or 46% in the case of TNS) as we enter the crucial last few weeks.

And what does the TNS poll tell us about the post-debate polling landscape more generally?  It slightly increases the likelihood that the Survation poll showing an unusually high No lead was an outlier caused by random sampling variation - but the emphasis is on the word 'slightly', because what we really need to be sure is another poll that was entirely conducted after the debate, rather than only partly.

Final thought - as far as we know, TNS haven't yet joined the new orthodoxy of weighting by country of birth.  That may not be such a huge issue for them, because a face-to-face pollster ought to find it easier than an online pollster to put together a representative sample - but it would still be a very wise idea to weight by country of birth to be on the safe side, because it's such a strong predictor of referendum vote.

* * *

SCOT GOES POP POLL OF POLLS

To state the bleedin' obvious, the only reason this update of the Poll of Polls shows a slight increase in the No lead is that TNS are represented in the sample by their headline figures - if they were represented by the numbers for definite voters (as is the case for Ipsos-Mori), the No lead would have fallen back again.

MEAN AVERAGE (excluding Don't Knows) :

Yes 42.4% (-0.4)
No 57.6% (+0.4)

MEAN AVERAGE (not excluding Don't Knows) :

Yes 36.5% (n/c)
No 49.5% (+0.7)

MEDIAN AVERAGE (excluding Don't Knows) :

Yes 42.5% (-0.3)
No 57.5% (+0.3)


(The Poll of Polls is based on a rolling average of the most recent poll from each of the pollsters that have been active in the referendum campaign since September 2013, and that adhere to British Polling Council rules. At present, there are six - YouGov, TNS-BMRB, Survation, Panelbase, Ipsos-Mori and ICM. Whenever a new poll is published, it replaces the last poll from the same company in the sample. Changes in the Poll of Polls are generally glacial in nature due to the fact that only a small portion of the sample is updated each time.)
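For the avoidance of doubt about the mechanics, here's a small Python sketch of that rolling-average procedure - the firm names are the real six, but the dates and Yes shares below are placeholders, not the actual Poll of Polls inputs:

```python
from statistics import mean, median

latest = {}  # firm -> (fieldwork date, Yes share with Don't Knows excluded)

def update(firm, date, yes_share):
    """A new poll replaces the previous poll from the same firm."""
    if firm not in latest or date > latest[firm][0]:
        latest[firm] = (date, yes_share)

update("YouGov", "2014-08-11", 39.0)
update("TNS-BMRB", "2014-08-13", 42.0)
update("Survation", "2014-08-13", 43.0)
update("Panelbase", "2014-08-04", 42.0)
update("Panelbase", "2014-08-16", 48.0)   # replaces the older Panelbase entry
update("Ipsos-Mori", "2014-08-01", 42.5)
update("ICM", "2014-08-16", 45.0)

shares = [share for _, share in latest.values()]
# One new poll only moves one of six sample members - hence the glacial changes.
print(round(mean(shares), 2), median(shares))
```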

Tuesday, August 12, 2014

Mystery of "new TNS poll" showing another boost for Yes

A number of media outlets briefly reported a new poll a few hours ago, seemingly from TNS-BMRB.  The figures given were -

Yes 38%
No 46%

- which if true would be the narrowest gap of the campaign so far from TNS, regardless of whether those are the headline figures or the turnout-filtered figures.

Mysteriously, the reports disappeared soon afterwards, which may indicate that it is an embargoed poll that was accidentally released too early.  Or it could just be an old poll from another firm - time will tell.

Monday, August 11, 2014

Boost for Yes as the gap narrows by 2% among definite voters in new poll from No-friendly firm YouGov

As I pointed out in last night's post, YouGov are unusual in that they don't filter or weight their headline results by likelihood to vote.  However, the datasets for the new poll have been released, so we now know what the figures would have been if YouGov used the same method as Ipsos-Mori, and headlined the results for respondents who say they are absolutely certain to vote in the referendum.  The logic for doing so is that people typically overestimate their own likelihood to vote, and actual turnout figures tend to correlate quite closely with the number of people who tell pollsters they are absolutely certain to vote.  The percentage changes given below are from the last YouGov poll which was conducted in late June.

Should Scotland be an independent country?  (Definite voters only)

Yes 40% (+1)
No 60% (-1)

As always, it has to be borne in mind that YouGov are now by far the most No-friendly of the six BPC pollsters, almost certainly due to the artificial and highly secretive "Kellner Correction" which is used to suppress the reported Yes vote, so the size of the No lead has to be seen in that light.  

Among voters who say they have at least an 8 out of 10 chance of turning out to vote, the No lead is completely unchanged since the last YouGov poll at 61-39.  So there's really very little comfort here for the "No have been given a boost by the leaders' debate" narrative.  Admittedly there's no absolute proof that there hasn't been a post-debate bounce for No, because only half of the sample for this poll was interviewed after the debate.  But if by any chance there was a substantial bounce, that must mean that the pre-debate half of the sample was extremely favourable for Yes, possibly with the Yes vote approaching an all-time high for YouGov, because there's no other way that the overall figures would average out as showing a small swing to Yes among definite voters.  That in itself would be troubling for the No campaign, because if any post-debate bounce proves to be superficial and transitory, you'd expect the state of play to return to roughly where it was prior to Tuesday evening.

It may also be worth making the point that, due to YouGov's clear sympathies with the arguments of the No campaign, they'd have been likely to point it out if there were any significant differences between the two halves of the sample.

There's one uncanny similarity between the YouGov and Survation polls of recent days, which is that the No lead is actually slightly lower on the raw unweighted data than it is in the weighted figures.  That's reasonably unusual - generally the weighting lifts up the Yes vote, because groups that favour Yes (such as lower-income people) are under-represented in sampling and have to be upweighted.  This should perhaps set a few alarm bells ringing, because it might mean there is something strange about both polls that is suppressing the Yes vote.  In the case of Survation it was fairly obvious what was going on - there were implausibly huge swings to No among the small samples of young people and residents of the South of Scotland electoral region, who had been upweighted massively in the overall results.  But I'm struggling to spot such an obvious explanation in the YouGov datasets.

We've got used to the highly unusual disparity between different polling firms in this campaign, but we're also going to have to start facing up to the fact that this looks likely to prove to be a factor in the next Holyrood campaign as well - unless of course the firms that prove to be the most inaccurate in September subsequently put their house in order.  YouGov are currently showing a modest Labour lead for Holyrood (although that lead has actually narrowed since late June, which again flatly contradicts the "post-debate blow for Salmond" claim).  That contrasts with the other traditionally No-friendly pollster Ipsos-Mori, who are continuing to show a decent SNP lead.  The more Yes-friendly pollsters tend to show much bigger SNP leads.  So although it would be an over-simplification to say it's "YouGov versus the field" in terms of Holyrood polling, there's certainly an element of truth in that, and the explanation is most likely to be the Kellner Correction.  The fact that this disparity isn't just happening in referendum polling must surely place a big question mark over the credibility of YouGov's approach.

The most interesting of the supplementary questions in the poll asks whether there should be another independence referendum in the future if there is a No vote this year, and what the timescale for that should be.  A full 53% of respondents say there should be another referendum at some point in the next 30 years.  I must say that really surprises me - although I personally think people's tolerance for another referendum would be quite high in the long-run (after all we had a second devolution referendum just 15 years after the first one), I wouldn't expect them to realise that right now.  Most startlingly, 31% of No voters say there should be a second referendum.  What's going on here?  It's hard not to conclude that at least some of these people are soft No voters who are not at all sure they are doing the right thing, and would like the safety-net of thinking they might have a second bite of the cherry one day.  If so, there's every reason to think that many of them will be open to persuasion between now and September.

* * *

REQUIRED SWINGS

Swing required for 1 out of 6 pollsters to show Yes in the lead or level : 3.5%

Swing required for 2 out of 6 pollsters to show Yes in the lead or level : 4.5%

Swing required for 3 out of 6 pollsters to show Yes in the lead or level : 5.5%

Swing required for 4 out of 6 pollsters to show Yes in the lead or level : 6.5%
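For the avoidance of doubt, the arithmetic behind these figures is straightforward: a swing of s points moves s points directly from the No column to the Yes column, so a No lead of L (with Don't Knows excluded) is wiped out by a swing of exactly half of L.  A quick sketch, using an illustrative 7-point lead:

```python
# A swing of s moves s points from No to Yes, so a No lead of L
# (Don't Knows excluded) is erased by a swing of L / 2.

def swing_for_level(no_lead):
    """Swing to Yes needed for the race to be level."""
    return no_lead / 2

# e.g. a pollster showing No ahead 53.5 v 46.5 (a 7-point lead)
# needs a 3.5% swing for Yes to draw level.
assert swing_for_level(53.5 - 46.5) == 3.5
```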

* * *

SCOT GOES POP POLL OF POLLS

This update of the Poll of Polls takes account of both the Survation and YouGov polls, and therefore the slight increase in the No lead is almost entirely caused by Survation - the YouGov figures make an absolutely negligible difference.

MEAN AVERAGE (excluding Don't Knows) :

Yes 42.8% (-0.7)
No 57.2% (+0.7)

MEAN AVERAGE (not excluding Don't Knows) :

Yes 36.5% (-0.5)
No 48.8% (+0.8)

MEDIAN AVERAGE (excluding Don't Knows) :

Yes 42.8% (-0.6)
No 57.2% (+0.6)


*  *  *

Those of you who read the comments section of this blog can't really have failed to notice that there has been an infestation of anonymous No-supporting trolls recently - some of them abusive, but none of them remotely interested in constructive debate.  A number of people have contacted me to suggest that I ban the trolls or mass-delete their posts.  I'm not going to do that.  One thing that makes the Yes campaign different (with a very few unfortunate exceptions like James Mackenzie) is that we believe in open debate, and we don't go around censoring our opponents in the way that Labour Hame, Vote No Borders or the Better Together Facebook page do.  I fully appreciate how irritating it is, though.  Heaven only knows where all these people have suddenly sprung from - they might be from the dark hordes at Political Betting, they might be "risk assessors" and risk assessor groupies who have followed me over from Twitter, or something more organised than that might be going on.

To answer a specific question that a couple of people have asked me, there's no need for anyone to post anonymously if they don't want to - simply select the "Name/URL" option when you comment.  If you don't have a website or profile of some description, just leave the URL section blank.

Setback for the No campaign as new YouGov poll fails to back up their complacent boasts of a post-debate polling boost

It's been a night of frustration for Blair McDougall, the No camp's embarrassment of a campaign chief.  He must have hoped that the new YouGov poll would replicate the post-debate increase in the No lead reported by Survation a couple of days ago, but instead it has shown no change whatsoever since the last poll from the firm several weeks ago.  Bear in mind that YouGov have in recent times overtaken Ipsos-Mori to become the outright most No-friendly pollster, largely due to the notorious "Kellner Correction" which artificially lowers the Yes vote, so the following figures should be seen in that light.

Should Scotland be an independent country?

Yes 39% (n/c)
No 61% (n/c)

On the figures that take account of Don't Knows there has been a 1% increase in the No vote, which is of no statistical significance at all -

Yes 35% (n/c)
No 55% (+1)
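The relationship between those two sets of figures is just a renormalisation - drop the Don't Knows and rescale Yes and No so they sum to 100.  In Python terms:

```python
# How the headline split relates to the figures including Don't Knows:
# discard the undecideds and rescale Yes and No to sum to 100.

def exclude_dks(yes, no):
    decided = yes + no
    return round(100 * yes / decided), round(100 * no / decided)

# YouGov's 35/55 (leaving 10% Don't Knows) renormalises to the
# 39/61 headline split quoted above.
print(exclude_dks(35, 55))  # (39, 61)
```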

Now, to be fair, there are two ways in which the No campaign could argue that this poll is theoretically consistent with a substantial post-debate bounce for No, even if that's not showing up on the headline figures - but both are problematical for them.  The first way is to point out that, unlike the Survation poll, only half of the fieldwork actually took place after the debate, so it's possible that the post-debate part of the sample was much better for No.  The problem is that this would imply there must have been a swing to Yes in the pre-debate part of the sample, otherwise the overall Yes vote would be lower.  That's surely not something that McDougall Central would be comfortable with, because we know that debate bounces can recede very quickly.

The second way is to point out that the last two YouGov polls were probably understating the Yes vote, because they both showed swings to No at a time when other pollsters were detecting no such thing.  So it could be suggested that the Yes vote has actually fallen from 41-42% (with Don't Knows excluded), which is the high watermark YouGov showed in the spring.  But that would also be a problematical claim, because the No campaign were all too eager to insist that the changes shown in the last two polls were real, and they'd have to implicitly accept they were leading us up the garden path.

It's important to stress that YouGov are one of only two BPC pollsters (the other is TNS) who don't filter or weight their headline numbers by likelihood to vote.  However, like TNS, they do make additional turnout-adjusted figures available.  As far as I know, those figures haven't been released yet for this poll, so that'll be the first thing to look out for when the datasets appear, presumably tomorrow.  It's impossible to say whether the filtered results will be any different - they weren't in the last YouGov poll, but in the last-but-one poll the No lead was a full 4% lower among respondents who say they will definitely vote.

It's been well-rehearsed that YouGov have acted in a thoroughly reprehensible manner since this long referendum campaign began.  When I or others raise question marks about the methodologies of various pollsters, it's sometimes asked: "Are you seriously saying that the pollsters are consciously biased?"  And generally the answer to that question is "no", but YouGov is a partial exception.  I'd say they now occupy a no-man's-land somewhere between neutrality and outright pro-No bias.  Both Peter Kellner and Laurence Janta-Lipinski have published commentary on the referendum that has utilised familiar No campaign attack lines, and it's very hard to trust a firm capable of doing that to be scrupulously neutral in devising its methodology.

Nevertheless, just for once I can give a very small piece of credit to YouGov, because in this poll they have made two very sensible methodological changes, albeit arguably of the "too little, too late" variety.  They've followed in Panelbase's footsteps by introducing weighting by country of birth, which will have the effect of slightly reducing the No lead.  In this case it has only reduced what would otherwise have been a 62/38 split to 61/39, but once the hoo-ha over the debate fades away or goes into reverse, we may soon be back into a situation where a 2% or 4% decrease in the No lead will look a hell of a lot more significant.  Incidentally, if it's gradually becoming the new orthodoxy that weighting by country of birth is a wise idea, then it's highly likely that the No lead in the Survation poll should have been a bit lower.

YouGov's other change is to start interviewing 16 and 17 year olds, which is a ridiculously long-overdue step, because other pollsters have been doing it throughout the campaign.  It's very hard to understand how Kellner can try to pull rank on other pollsters when his own firm has been guilty of such an amateurish omission for so long.

Welcome though these changes are, neither of them can even begin to make up for the fact that the "Kellner Correction" remains in place, and the obsessive secrecy over the effect it is having continues.  Anthony Wells has a semi-detached relationship with YouGov that I don't fully understand, but his inside knowledge was sufficient to let us know instantly what the figures in this poll would have been if the new methodological changes hadn't been made.  And yet still no-one will tell us what the numbers would be if the "Kellner Correction" was removed.  When I asked Janta-Lipinski a few weeks ago, he initially tried to change the subject, before boasting about the firm's secretiveness and the fact that we weren't entitled to that kind of information.

More details on this poll, and a Poll of Polls update, can be found HERE.

Sunday, August 10, 2014

Reasons to be sceptical about the changes in the new Survation poll

I've been away all day, but I've finally had a chance to catch up with the way Survation are reporting the findings of their own poll, and quite frankly it's laughable.  It's obviously a matter of perception whether the No campaign can be reasonably said to have a "commanding" lead when Yes are on 43% - just 7% from victory.  But one thing that is undeniable is that it's a much lower lead than in countless polls we've seen in this campaign, especially up until a few months ago.

There's one particular line that contains a downright inaccuracy (I'm tempted to be a good bit more blunt than that) -

"This is the highest ‘No’ vote – and the biggest lead over ‘Yes’ – seen in a Survation poll since our first independence referendum poll in February."

The "first" poll they are talking about there was in fact their second - they are attempting to erase from history their first referendum poll in January which showed a much, much bigger No lead.  It's hard not to suspect that they are doing so to please their clients the Daily Mail, who were plainly keen to make the false claim that the leaders' debate had led to a record lead for No.  What possible justification can there be to pretend the January poll didn't happen?  Well, Survation have radically altered their methodology since then, and it's impossible to know for sure what the outcome of that poll would have been under the current procedures.  But it is possible to hazard an educated guess based on the raw unweighted data, which as it happens suggests that the No lead would have been slightly higher in January than it was in last night's poll.

The breathless way in which Survation report their finding that Darling "won" the debate verges on the idiotic.  The only credible way to gauge the winner of a debate is to conduct an instant poll immediately afterwards - if you wait until a day or two later (as Survation did), public perceptions will be hopelessly tainted by the media spin on who "won".  But hats off to the impressively high 28.3% of Survation's respondents who know their own mind, and who defied the media narrative by saying Salmond won the debate.

To turn to the headline voting intention results, there are two specific reasons to be sceptical about the supposed swing to No (over and above the possibility that it's largely an illusion caused by the standard 3% margin of error), and both concern groups of respondents who showed particularly big swings to No, and who had to be upweighted massively because Survation couldn't find enough people in those demographic categories.  Whenever that happens, it effectively increases the margin of error for the overall sample, and makes volatility from one poll to the next much more probable.  Indeed, I seem to recall saying in a post not too long ago that Survation's constant need to drastically upweight the youngest group of respondents really ought to be causing volatility, and it was surprising that it wasn't.

That has finally happened now, because since last week's poll the No lead among 16-24 year olds has increased from 5.1% to 34.6%.  The effect of that wildly implausible swing has been magnified in the overall results, because the 61 real respondents in that age group have been upweighted more than two-fold to count as 129 'virtual' respondents.  Self-evidently, there hasn't really been a 15% swing to No among young people in the space of one week, so it's likely that the weighting procedure has artificially generated at least part of the increase in the overall No lead.  Indeed, if we look at the other age groups (who haven't needed to be upweighted significantly), it's striking how similar this week's poll is to last week's - No have gained a little ground among three age groups, and Yes have gained a little ground among two.  But nothing of any great statistical significance.
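The statistical point about upweighting can be made precise with Kish's "effective sample size" formula: the more unequal the weights, the smaller the effective sample and the bigger the true margin of error.  A stylised sketch - assuming for illustration a nominal sample of 1,000, with the 61 young respondents upweighted to count as 129 as described above (the remaining weights are simplified to 1, which won't match Survation's actual weighting scheme):

```python
# Why heavy upweighting inflates the real margin of error: Kish's
# effective sample size is (sum of weights)^2 / (sum of squared weights),
# which always falls as the weights become more unequal.
import math

def effective_n(weights):
    return sum(weights) ** 2 / sum(w * w for w in weights)

def moe(n):
    """95% margin of error at p = 0.5, in percentage points."""
    return 100 * 1.96 * math.sqrt(0.25 / n)

# Stylised version of the situation described above: 939 respondents at
# weight 1, plus 61 respondents upweighted to count as 129 (129/61 each).
weights = [1.0] * 939 + [129 / 61] * 61

n_eff = effective_n(weights)  # falls below the nominal 1,000
print(round(n_eff), round(moe(1000), 2), round(moe(n_eff), 2))
```

And that understates the problem, because the real damage is concentrated in the upweighted subgroup itself: the margin of error on a cell of 61 real respondents is enormous, and the weighting then magnifies whatever noise that cell contains.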

The other part of the datasets where alarm bells start ringing is the regional breakdown for the South of Scotland.  Last week, Yes had an 11% lead in that region, whereas now they are behind by 24%.  Again, the impact of that turnaround has been magnified, because in both polls respondents from the South had to be upweighted roughly two-fold.  Now, to be fair, this week's results are actually more plausible than last week's - you'd expect the South to be one of the most No-friendly regions.  But the point is that the extreme upweighting of two such different figures has distorted the trend.

The best clue that No have been flattered in this poll lies in the unweighted data.  In the vast majority of polls, the weighting helps Yes, but in this one the No lead is slightly smaller on the unweighted figures.  Indeed, the No lead on the unweighted figures of this poll is smaller than the No lead was on the unweighted figures of the last-but-one poll from Survation.

What I draw from all of this is a) it's far from clear at this stage that there has been ANY post-debate bounce for No at all, and b) if there has been a bounce, the balance of probability is that it's smaller than Survation's headline numbers would suggest.

*  *  *

This supplementary question made me laugh -

"Do you believe the Scottish Government should draw up alternative options to a ‘currency union’ ahead of the referendum on September 18, 2014?"

What, like they did several months ago, in rather a lot of detail?  In next week's Survation poll -

"Do you think Clement Attlee would have been a better Prime Minister if he'd introduced a National Health Service?  Why oh why didn't he do it?"