"I know. And like many of the other respectable polls, within the margin of error of 18th September 2014. That's not news, unless you're Scot Goes Pop."
I presume that can be reasonably interpreted as criticism and/or mockery of my blogpost about the Panelbase poll. If so, I think it's worth taking a moment just to defend that post, because quite honestly, the idea that this particular poll was not worthy of note is a bit batty.
Let's start with the obvious: Scot Goes Pop is a polling blog. (Not exclusively, but to a large extent.) Pretty much any full-scale Scottish poll is worth reporting here, even if it shows no change at all. Scottish polls aren't exactly ten-a-penny outside election periods, so they always tell us something interesting.
Secondly, Scott is quite wrong to imply that I thought the significance of the poll was a 2% increase in the Yes vote since the 2014 referendum. In reality, I was much more interested in the fact that 47% is a two-year high for Yes in Panelbase polls, and is significantly better than the recent 'normal range' for Yes reported by Panelbase, which has been around 43-45%. Here is the sequence of Yes votes in the last ten Panelbase polls -
44 - 45 - 45 - 44 - 43 - 44 - 44 - 44 - 45 - 47
If you don't think the 47 at the end sticks out like a sore thumb, you must be pretty determined not to see it. Now, of course, it's perfectly possible that support for independence has remained steady at around 44%, in which case the standard margin of error could just about produce a freakish 47% result now and again. That's one possible explanation, and if it's the correct one it'll become obvious soon enough, because the next couple of Panelbase polls will in all likelihood show a reversion to the 43-45% norm. But there is another very obvious possible explanation - that the jump in support for Yes is wholly or partly real. If an unexpected poll result comes along and raises the possibility that Yes has significantly narrowed the gap, are we really supposed to look away in a state of total indifference? Come now.
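To put a rough number on the "freakish result" scenario: assuming Yes support genuinely flat at 44% and a sample of around 1,000 respondents (a typical size for a full-scale Scottish poll, though the exact figure here is my assumption rather than anything taken from Panelbase), the chance of a single unweighted poll coming out at 47% or higher can be worked out exactly from the binomial distribution. A minimal sketch:

```python
from math import comb

def binomial_tail(n, p, k):
    """Exact probability of k or more successes in n trials,
    each succeeding with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical scenario: true Yes support is 44%, sample of 1,000.
# A poll reports 47% or more when at least 470 respondents say Yes.
# (This ignores weighting and rounding, which real pollsters apply.)
p_freak = binomial_tail(1000, 0.44, 470)
print(f"P(single poll shows 47%+ when truth is 44%): {p_freak:.1%}")
```

On these assumptions the probability comes out at a few per cent - rare for any individual poll, but not so rare that it could never happen across a long run of polls, which is exactly why the next couple of Panelbase results matter.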
The third and more general point is that Scott is making a schoolboy error (albeit a very common one) by assuming that because a large number of polls are putting the Yes vote within the margin of error of the 45% vote in 2014, no conclusions at all can possibly be reached about changes in public opinion since the indyref. This is exactly the same mistake people made when they said that it didn't actually matter that Hillary Clinton was ahead of Donald Trump in the vast majority of polls, because in a lot of those polls her lead was within the margin of error. (As you'll recall, Clinton went on to win the popular vote by some three million votes.)
Take a glance at the recent run of Yes results in polls from Survation, which unlike Panelbase is not one of the more No-friendly firms...
46 - 47 - 46 - 46 - 47 - 47 - 46 - 45
Looked at individually, then yes, all of those polls are within the margin of error of the 45% vote in 2014. None of those polls on its own would constitute proof of an increase in the Yes vote since the indyref. And yet if you look at them collectively, it's entirely right and proper to draw the opposite conclusion. Seven of the eight polls have Yes above 45%, and not a single one has Yes below 45%. That's an extremely improbable pattern if Yes really has been flatlining at exactly 45%. If that were the case, and the margin of error were the explanation for Yes sometimes getting as high as 47%, we'd expect to have seen a rather more even spread of results above and below 45%. So, if by any chance Survation have their methodology exactly right (and admittedly that's a big if), it can be said with a bit of confidence that the Yes vote has generally been a little higher over the last year or so than it was on referendum day in 2014.
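Just how improbable that pattern is can be quantified with a simple sign test: set aside the one poll sitting exactly on 45, and ask how likely it is that every one of the remaining seven would land above 45 if the true figure really were flat at 45% and each poll were an independent coin flip between "above" and "below". (The independence assumption is a simplification, but it illustrates the point.)

```python
from math import comb

polls = [46, 47, 46, 46, 47, 47, 46, 45]  # recent Survation Yes figures
baseline = 45  # the 2014 referendum result

above = sum(1 for p in polls if p > baseline)   # polls above 45
below = sum(1 for p in polls if p < baseline)   # polls below 45
ties  = sum(1 for p in polls if p == baseline)  # polls exactly on 45

# Sign test: under flat 45% support, each non-tied poll is equally
# likely to fall above or below. Probability of a split at least this
# lopsided in favour of 'above':
n = above + below
p_value = sum(comb(n, k) for k in range(above, n + 1)) / 2**n
print(f"{above} above, {below} below, {ties} tie; p = {p_value:.4f}")
```

With seven non-tied polls all landing on the same side, the probability of that happening by chance is 1 in 128 - a little under 1% - which is what makes the "pure margin of error" explanation so hard to sustain across the whole run of polls.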
* * *