You may have seen that Professor John Robertson has a blogpost about Saturday's Survation poll, which he regards as less reliable than a recent YouGov Scottish subsample, because he thinks "landline telephone" sampling is inferior to online sampling. The post is actually based on a false premise, because Survation have confirmed today that their poll was conducted online - and it goes without saying that a full-scale online poll should be taken more seriously than an online subsample. The confusion probably came about because of an ambiguously worded tweet from Survation on Saturday evening.
Even if the Survation poll had been a phone poll, though, there would still have been a number of problems with John's argument. First of all, although YouGov subsamples can probably be regarded as more credible than subsamples from other firms (because they appear to be correctly structured and weighted), they obviously have a bigger margin of error than full-scale polls because of the smaller sample size. So to get a meaningful picture you have to look at the pattern over a number of YouGov subsamples, and it's pretty obvious that the SNP's 47% showing in the subsample John is talking about is an outlier. The high 30s is much more typical - in other words pretty similar to what the Survation poll found.
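To put rough numbers on that point about sample size: under the standard textbook assumption of simple random sampling, the 95% margin of error on a reported percentage is 1.96 × √(p(1−p)/n). The sample sizes below are purely illustrative (a Scottish subsample might be somewhere around 150 respondents, against roughly 1,000 for a full-scale poll - neither figure comes from the post itself):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion, assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative (assumed) sample sizes: ~150 for a subsample, ~1,000 for a full poll,
# using the subsample's 47% SNP figure as the proportion.
print(round(margin_of_error(0.47, 150) * 100, 1))   # subsample: roughly 8 points
print(round(margin_of_error(0.47, 1000) * 100, 1))  # full poll: roughly 3 points
```

On those assumptions a 47% subsample reading is statistically compatible with a true figure in the high 30s, which is why a single subsample tells you very little on its own.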
Secondly, it's highly unlikely that Survation would conduct a landline-only phone poll, so the concern John raises about certain demographic groups being less contactable by landline doesn't really apply. It may be that response rates to phone polls are unacceptably low because people these days are unlikely to answer an unexpected phone call, regardless of whether they're on a mobile or landline. But that's a somewhat different point.
Thirdly, there's the standard Mandy Rice-Davies objection to the quote John provides from YouGov about the supposed greater accuracy of online polling. YouGov are, and always have been, an online-only pollster, so "they would say that, wouldn't they?"
Fourthly, John points to the fact that online polls were much more Yes-friendly during the indyref. But in fact there was a dramatic convergence between the online and phone polls as the campaign drew to a close, and by polling day they were more or less showing the same thing - a very, very slender No lead. So it's impossible to know for sure who was getting it right earlier on. Anecdotally, a lot of campaigners did detect a large swing to Yes in the closing weeks, which would lend more support to the theory that the telephone polls were more accurate. (YouGov were the only online firm to report a big swing, and they only did so because of their notoriously convoluted "Kellner Correction".)
Lastly, John mentions a ScotPulse online poll showing a handsome Yes vote. Unfortunately ScotPulse polls can't be taken seriously because they're not properly weighted. The (allegedly) best data collection method doesn't really help much if the other basics aren't being done correctly.
* * *