Over the last three years the British Population Survey has been monitoring people who respond to online surveys and comparing them with the population as a whole, in terms of detailed demographics and attitudinal variables. It is a massive survey, involving 6,000-8,000 face-to-face, in-home interviews per month.
Comments
This means pollsters are not innocent observers of public opinion, but active participants in the political process; not only reporting public opinion but helping to shape it.
Yes, because the majority of people in the general population are the non-hysterical moron types who won't ever even notice the frenzied self-excitement of the online anoraks.
The UK population is around 64 million. I am surprised if as many as 6.4 million people in the UK participate in online polls.
Perhaps the LibLabCon should ban opinion polls as they're clearly anti-democratic.
An online poll with a self-selecting sample might, for example, put UKIP at say 21%, whereas the actual population figure might be say 12%, because the sample is badly skewed by being composed largely of wonkish political addicts and activists. The BBC picks up the poll and makes a splash with it. Non-political viewers take note of the BBC headline, accord UKIP more credibility than it is due, and give more consideration to voting for it than they otherwise would.
Obviously the opposite is also possible: the wonks and activists are net anti-UKIP and the sample suppresses its apparent vote, leading to people giving it less credibility than it deserves. I don't know which is more likely, and neither does anyone else, but the very fact that the sample is self-selecting, and is probably made up of people with political axes to grind, makes it almost useless.
No amount of correction and weighting can compensate for a biased sample, but that does not mean that online polls are useless. Phone polls have not dissimilar problems, and when looking at polls we want a sample that is an accurate representation of the 65% who will turn out to vote and not the entirety of the population.
Online polls are more like focus groups in some ways, and do show trends and movements. Ultimately the proof of the pudding is in the eating, so do they forecast the result accurately? It seems that they often do, but it is difficult to be sure, in that a late poll movement (such as in the indy ref) may not be a true movement but really just an improvement in accuracy.
I expect that I am not the only PBer who signed up with an online poll in order to deliberately influence the narrative. I think that Ashcroft's marginal constituency polls are likely to be a better predictor, but they are still untested apart from a few by-elections.
Happy Winterval everybody!
The other problem I have with online surveys, compared particularly to phone surveys, is that it's far too easy to pack the panel. Party A could tell its activists to all go and sign up to a given panel over the next month. It could even have a small group of activists generate hundreds of survey accounts and pretend to be supporters of an opposing party, or even of their own party, and sway the results. Polls are not given under oath, and people should not assume that respondents remotely tell the truth in them. I don't: I respond in a way that is likely, if repeated by other like-minded voters, to push my party in a given direction.
Regarding the Scottish referendum: a face-to-face poll for TNS, conducted in the same week as the YouGov 51-49 poll but before it was published, showed a dead heat.
http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/ywzyqmrf2u/Scotland_Final_140905_Sunday_Times_FINAL.pdf
http://www.tns-bmrb.co.uk/uploads/files/TNSUK_SOM2014Sep9_DataTables.pdf
In respect of Westminster VI surveys since the referendum, the poll that has given the biggest vote share and lead to the SNP was a telephone poll conducted by Ipsos MORI.
https://www.ipsos-mori.com/researchpublications/researcharchive/3469/SNP-open-up-significant-lead-ahead-of-General-Election-vote.aspx
Phone polls show much less support for UKIP than online polls, even with the same polling company, and similarly they show much less evidence of LD->CON switching. Both are the sort of moves which people might feel would raise eyebrows with their peers, and hence would be reticent about admitting. This applies doubly to face-to-face polls. When the nice young lady (probably a student) with a clipboard approaches you in the high street and asks about your politics, you can be almost certain she might disapprove of your Tory or UKIP views. If those views are not deeply held, or you are wavering and possibly embarrassed about your wavering, there is going to be a tendency to tell her what you think she wants to hear.
Q.E.D.
http://m.bbc.co.uk/news/health-30500372
Are telephone polls to be taken too seriously...?
http://newstonoone.blogspot.co.uk/2014/12/testing-boundaries-3-lib-dems-vs-all.html
Most of these polls probably accept that the sample is biased, and prone to stuffing with wonks and activists, but probably get close to the result by assuming that the relative amount of stuffing is equal across all parties, so you get just as many Labour wonks ramping their side as Conservative wonks ramping theirs. I am not sure how this stands up in a five-party system.
In a phone poll, if the number isn't answered, do they call back, or do they substitute the next number in the sequence? In a door-to-door stratified random sample like we used when doing psychological research, if the selected person didn't open the door the usual instruction was to go to the neighbour, because within a small geographical area with similar housing, health, school facilities etc, there was no reason to suspect that the neighbour wasn't as representative as the selected candidate, and because sample sizes were quite large any minor effect would be mitigated. If you do the same with a phone number you have no reason to suspect the next number has anything to do with the selected number; they could be in a completely different area of town, and you could get a noticeably biased sample (i.e. those prepared to answer political opinion surveys, i.e. wonks and activists ;-) )
Thanks to Mr. Sparrow for this piece.
I very much agree with the sentiment at the end. Last election the constant flow of YouGov polls gave them a disproportionate impact as they were continually reported by the media, shaping opinion rather than merely reflecting it.
I do think we need more stringent restrictions on polling during electoral periods to stop this sort of thing. [But it's even more important we ban the worm].
Perhaps you could arrange with your polling chums for anyone who uses the politically paranoid term LibLabCon to suffer a North Korean-style Denial Of Service?
Also, "turnip".
The big two typically poll 60-62% in phone polls. They tend to poll 65-70% in online polls.
The Lib Dems average 11% in phone polls, 7% online.
There's no significant difference in levels of UKIP support. 15.5% in phone polls, 17% online.
http://www.bbc.co.uk/news/world-middle-east-30596474
Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.
All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.
ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.
That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?
Fermi problems are typified by the question "How many piano tuners are there in Chicago?", where you make a set of assumptions that lead you to an answer:
There are approximately 9,000,000 people living in Chicago.
On average, there are two persons in each household in Chicago.
Roughly one household in twenty has a piano that is tuned regularly.
etc
Some of those will be overestimates, some underestimates, and they tend to cancel each other out, leading to a more accurate answer than you might otherwise expect.
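The piano-tuner estimate can be sketched in a few lines. Every number below is an illustrative assumption, not data; the tuning rates in particular are not in the list above and are invented to complete the chain:

```python
# Fermi estimate of piano tuners in Chicago.
# All inputs are assumptions; over- and under-estimates tend to cancel.
population = 9_000_000          # people in Chicago (assumed)
persons_per_household = 2       # average household size (assumed)
piano_share = 1 / 20            # households with a regularly tuned piano (assumed)
tunings_per_piano_year = 1      # each such piano tuned once a year (assumed)
tunings_per_day = 2             # pianos one tuner can service per day (assumed)
working_days = 250              # working days per tuner per year (assumed)

households = population / persons_per_household        # 4,500,000
pianos = households * piano_share                      # 225,000
tunings_needed = pianos * tunings_per_piano_year       # 225,000 per year
capacity_per_tuner = tunings_per_day * working_days    # 500 per year

tuners = tunings_needed / capacity_per_tuner
print(round(tuners))  # → 450
```

Each assumption could easily be off by a factor of two, but because the errors are independent and multiply together, they partially cancel rather than compound.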
The parallel I was drawing was that the sample from online polls is undoubtedly biased, but the nature of those biases is such that they will tend to cancel each other out, at least when there are only two leading parties. The number of people joining the sample to ramp Labour will probably be similar to the number joining to ramp the Tories. The number of activists and wonks joining the sample from each side probably broadly parallels the motivation of that side's supporters, which is usually a function of how well that side is doing.
With a large enough sample you can mitigate the self-selection to some extent, by saying that it will apply equally to all sides, and that might get you so far with things like raw VI, but since people choosing to reply are the politically engaged, numbers like "likelihood to vote" are possibly way out, and so using that as a factor to modify VI sounds dangerous.
When, as someone noticed yesterday, one organisation was scaling the results of young Green voters by 245% to balance its sample, it's not surprising that we see the Green vote fluctuating wildly from poll to poll.
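Heavy up-weighting of a small subgroup also shrinks the effective sample size, which is one reason an up-weighted vote share bounces around from poll to poll. A sketch using the standard Kish formula; the respondent counts here are invented for illustration:

```python
# Kish effective sample size: n_eff = (sum of weights)^2 / sum of squared weights.
# Large weights on a small subgroup inflate the variance of the estimate,
# so the weighted poll behaves like a smaller one.

def effective_sample_size(weights):
    return sum(weights) ** 2 / sum(w * w for w in weights)

# Invented example: 1,000 respondents, 50 of them up-weighted by 2.45x
# (the 245% scaling mentioned above), the rest left at 1.0.
weights = [2.45] * 50 + [1.0] * 950
print(round(effective_sample_size(weights)))  # → 920
```

In other words, that one weighting step costs roughly 80 respondents' worth of precision, and all of the lost precision is concentrated in the up-weighted subgroup's estimate.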
The article is very interesting, but I'd question two points. First, as EiT and fox have said, phone samples also will have biases, e.g. more willingness to respond among people not in employment. I once got called by Populus - annoyingly, I was just about to go into a meeting so I couldn't bias their sample :-). Second, the "bandwagon effect" isn't well-documented, and some research by Nate Silver suggests it doesn't exist. I think it probably only has one effect - supporters of new parties like UKIP will be less likely to vote for them if polls suggest they're failing to break through.
ICM was out by 2%
TNS by 4%
Opinium by 5%
Survation by 5%
ComRes online by 6%
YouGov was spot on.
Src.: http://www.militaryphotos.net/forums/showthread.php?243346-ISIS-claims-capture-of-Jordanian-pilot-in-Syria
The points Nick makes seem extremely valid to me but I think that phone polls suffer very similar problems. Whenever I get someone on the phone for a survey I politely hang up and I am interested in politics. I am sure the vast majority of potential contributors do the same so once again you are left with the more enthusiastic and committed.
Mr Sox points out that you can never really compensate for a distorted sample and I can see how in epidemiological terms that must be true. But polling companies, particularly those like ICM (and my new heroes in Ipsos Mori of course) have got pretty good at doing so over time which is why they get pretty close most of the time. If the committed head off in a particular direction the majority of us sheep generally follow.
As for polls making the story rather than simply reporting it I am sure there is some truth in that too. But is it maybe still better than the story being made by foreign media moguls talking their own book? Elections and public opinion have never been pure or paragons of virtue. Biased polls are probably still an improvement.
Still, lots of good excuses packed away for the next tory disappointment.
All methods are imperfect, and we never know how accurate polls are unless there is an election at the same time.
Example: you want your sample to have 35% of people who voted Conservative last time. You find to your horror that you only have 17.5% in your sample. So you count them all double (as if they were two respondents) and reduce the weight of everyone else so the total is still 100%. Now if the Tories you didn't find think just like the ones you did find, that's not an indication of anything at all.
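That doubling is just weighting each group by its target share divided by its observed share. A minimal sketch of the mechanics; the sample data is invented to match the 35% / 17.5% example:

```python
# Weight respondents so past-vote shares match target shares.

def past_vote_weights(sample, target_shares):
    """One weight per respondent: target share / observed share
    for that respondent's past-vote group."""
    n = len(sample)
    counts = {}
    for vote in sample:
        counts[vote] = counts.get(vote, 0) + 1
    return [target_shares[v] / (counts[v] / n) for v in sample]

# Invented sample of 1,000: only 175 (17.5%) admit voting Con last time,
# against a 35% target; everyone else is lumped together.
sample = ["Con"] * 175 + ["Other"] * 825
weights = past_vote_weights(sample, {"Con": 0.35, "Other": 0.65})

print(weights[0])           # 2.0: each past-Con respondent found counts double
print(round(sum(weights)))  # 1000: the rest down-weighted so totals still balance
```

The arithmetic is trivial; the judgement call, as the example shows, is whether the people you found really do think like the people you didn't.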
But if yesterday's papers reported that in 2010 Cameron was a secret worshipper of Satan, your respondents might be temporarily embarrassed to admit they voted for him (unless they didn't have consciences, of course :-)). In that case, actually your sample already has lots MORE 2010 Tories than are admitting it, and you are mistaken in doubling the ones you found - perversely, the apparent shortage of 2010 Tories is making you create a sample that is too far to the right. The Satanism story may mean they're now voting UKIP or LibDem but it's still a biased sample.
Bottom line: the raw results of polls should be treated with caution. However, trends from the same institute are usually illuminating, since the same biases will tend to apply each time.
Mr. Thoughts, whilst relieved it's not a British plane, that's concerning. Reminds me of a few months ago when three (I think) pilots and planes defected from Iraq to ISIS.
Mathematicians arguing about and/or over observations versus samples. It is like Einstein and that Danish bloke who no-one remembers....*
* Well, until an Englishman (oops - he was an Ulster-Scot) proved the latter right. Red-or-black; you play the game....
Correct me if I'm wrong, but I don't think there were any online pollsters in operation at that time.
"Some online nonprobability polls have compiled a good record of predicting election outcomes, actually outperforming traditional polls in some elections. But skeptics point to the absence of a theory, such as the one underlying probability sampling, that provides a basis for expecting traditional polls to work well under different conditions."
http://magazine.amstat.org/blog/2014/10/01/prescolumnoct14/
Regardless of sampling errors and all the other sophisticated and less sophisticated methods of screening respondents, it is simply not possible for Labour to be 5-7% ahead with some pollsters, the Tories to be 3% ahead with Ipsos-MORI, or the parties to be level pegging as Populus suggested on Monday, and for them all to be correct.
On here we give a great deal of credence to YouGov simply because they produce 5 polls a week and for the addicts on here, endless subsets and polls within polls. We have treated ICM as the Gold Standard and speaking personally when I see it give Labour a decent lead as it did last week, I take a sharp intake of breath.
We also have ComRes and Survation, neither of which I have personally ever treated as serious pollsters given their lack of accuracy compared to real results e.g. euro elections, by-elections etc.
We have Michael Ashcroft's herculean efforts both with his Monday polls and his individual constituency polls. We used to anxiously look forward to his mega-polls.
Finally we have the others like TNS, Opinium and Stephen Fisher. TNS seems to get credence for its Scottish polls and Stephen Fisher's weekly summary predictions are as anticipated by many of us as the Michael Ashcroft marginal polls.
We all have our favourites. There are those who tend to give results we like and those we don't. It would be good if Mike or TSE give us a thread or two over the holiday period reminding us of pollsters predictions before recent elections and the relative accuracy of them compared to the actual results.
Personally my gut reaction is that basically the Tories and Labour are neck and neck with Labour possibly being 1% in the lead, around 33-32, the LibDems are on around 10% and UKIP on around 12%. In Scotland the SNP are at around 40%, Labour around 25%, the Tories around 17% (where they polled in 2010) and the LibDems around 5% but most of that 5% is stacked in the seats they are defending and those where they were 2nd in 2010.
Over the next few months I see Labour dropping into the 20s, the Tories edging up towards 35% and the LibDems and UKIP fighting it out in the 10-15% band. Over six months we have seen the Labour lead in the marginals drop from over 6% to around 3%. The next set of Ashcroft marginals will be interesting. Christmas and New Year give MPs lots of time to spend in their constituencies and sample the mood music. Danny Alexander's faux attack on George Osborne this morning may be part of his reaction to that mood music.
Nick Sparrow was the man who learnt the lessons first from that debacle and created the broad weighting structures we have today.
- No poll is perfect but I think we in the industry try to make them as good as we can
- Both main methodologies can have issues reaching the whole population - online panels tend to have fewer older people, while phone finds it hard to reach the increasing number of mobile-only households (which tend to be younger)
- As mentioned, online panels are self-selecting to sign up, but bear in mind that generally most surveys are not political and ask about consumer products. I think more people are motivated by interest in surveys or by making a little bit of money (some sign up to multiple panels) than by political reasons
- While membership of a panel is self-selecting, the panel company chooses who to invite to each survey. A general rule of thumb is that each panellist should not be invited to more than 1 survey a month
- For phone surveys, most companies should make calls in the evening as well as the daytime to catch people who are out at work.
- It is a well-known effect in research that if you have, say, a 1-7 scale where 7 is very good and 1 is very poor, people in online research tend to give more answers at the extremes of the scale than phone respondents. This may be due to phone respondents feeling the need to justify extreme views to the interviewer, and to the anonymity of being online
As I have mentioned before, I don't do political polling myself but think it must be some of the hardest polling to do:
- Often deals with controversial topics. People are more likely to be untruthful when it comes to politics rather than something like choice of cereal.
- The difficulties of weighting based on past voting and making sure past recall is accurate nearly 5 years from the last election (and when we have had local and Euro elections since)
- The lack of frequent elections to validate results against
- The fact that you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth
One good thing is that we have lots of poll companies and more frequent polls so that it should hopefully be easier to spot any outliers
Looking at the next election I think most pollsters will get the Lab vs Con lead right. The big challenge will be getting the UKIP and SNP numbers right. Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.
I think the point about older people not answering the phone to people they don’t know is a good one. An increasing number of my friends are so fed up with unblockable nuisance calls .... from abroad for example ...... that they don’t answer, see who called, then ring back. More expensive, of course, and it could cause serious problems when what the "Microsoft Technical Department" is calling them about actually affects them! (Yes, I know it’s not!)
Well, smell the coffee, Mr. Sparrow........ and is this why there is a concerted effort by all the pollsters, in the two weeks before Christmas, to downgrade UKIP's poll levels?
In other words to be - active participants in the political process.
Mike Smithson has always said that could never happen, has he changed his mind?
The 'self-select' bias problem for online polls is of course very well known, and reputable pollsters such as YouGov will try to correct for it. This is easier if, like YouGov, you start with an enormous panel, set up well before the particular poll you are conducting. So, in general, it doesn't seem to be the case that online polls are systematically more wrong than telephone or face-to-face polls, which (as several posters have pointed out) have their own problems.
All the same, as political punters we should definitely be aware of the possibility of self-select bias, especially in circumstances where for some reason one segment of the respondent base is likely to be particularly fired-up. An example of this was the apparent surge in support for Mitt Romney after he did better than expected in the first TV debate against Obama. As always in political betting, you need to use polls as guides to be interpreted intelligently, not as exact and literal measurements of voting intention. In particular, if phone and online polls start to diverge systematically, ask yourself why - self-select bias might be one possible explanation.
In passing, it does seem that if one asked if a respondent suffered or had ever suffered from mild paranoia, then a yes answer means they could probably be put down as a Kipper!
Oh, it's just intuitively clear. :-) But I'll do a random sample of my friends to check if you like.
Really good post by Gareth (and apols to Ishmael for the patronising bit, didn't word my post well). We should probably be surprised how well the polls do, given the difficulties.
MPs and would-be MPs get regularly polled too, incidentally - typically you're offered £25 (to you or to charity) to spend 10 minutes or so expressing views on issues of interest to the sponsors, e.g. attitudes to Heathrow expansion. Occasionally the sponsors release the results ("64% of X's MPs favour our cause") but usually not. I once in a quiet moment agreed to do an hour-long face to face survey on a range of issues (for the CBI, I think) - that definitely introduced bias, since the interviewer couldn't stop herself from raising her eyebrows or chuckling at some of my responses. It tempted me to be more outrageous - others might have tried to be more mainstream.
Nick P - thanks. One of the things that winds market researchers up is how many people think research is easy and you can just do a few questions in Survey Monkey in 5 minutes. Anyone can do bad research but getting it right and getting good quality data is a lot harder. Certainly I have had quite a few clients come to me asking me to field their questions they have written themselves and have had to tactfully rewrite them substantially...
In April ICM had Labour on 36 and UKIP on 20
http://en.m.wikipedia.org/wiki/European_Parliament_election,_2014_(United_Kingdom)
Also, pre-Euros, you made a big point of saying AIFE would hit the UKIP vote
You said about 2% would be knocked off, and after the result you said it was as you expected
Therefore it is wholly inconsistent of you to use the final UKIP score as a measure of pollster accuracy, as people accidentally voting AIFE would have told a pollster UKIP
Merry Christmas PBers, and here's to a prosperous political New Year!
Bell was an Ulsterman - the entanglement man of EPR paradox fame
And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Andersen; the others were actors and other such nonentities. No Bohr.
A travesty.
"No Tycho Brahe either?"
Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.
But plenty of Danish soap stars.
I agree. I find with long polls my attention wanders and I tend to answer quickly just to finish. Some of the answers are often - err - rubbish...
YouGov's long multiple-choice ones (not on politics but often on products or supermarkets) are an utter turnoff...
Form a consortium of large mainstream websites which require log-ons (banks, shopping, betting sites). Randomly ask customers a VI question which they have to answer (even if "prefer not to answer") to get further on in the log-in process.
Obviously there are selection biases here. Again, it's the least-worst we are looking for.
And Carl Linnaeus was Swedish. Although, to be fair Brahe was born in what was then Denmark, but is now Sweden.
But maybe that's why I don't get YouGov surveys often!
I have it in my mind that when we were doing questionnaire design we were advised to keep them to one side of A4, if at all possible.
Was quite a long time ago though!
"Copernicus was earlier, I think."
True, but he was a bit dull compared to Brahe.
"Has someone pinned a "please patronize me" notice to my back this morning?"
I didn't realise we were being that serious, but there is a serious point here.
At this time of year we get what are laughingly called "Celebrity Specials" where old has-beens appear to promote their latest panto, and are eulogised by the presenters. A real "pass the sick bag" feast.
I love reading their stuff post WW2.
If he was a sandwich-board nutcase - he'd have one saying The End Is Nigh, and another saying We're All Saved.
1) How would you rate your stay at Hotel X? Poor, Average, Good, Very Good, Excellent
2) What is your favourite type of meat? Lamb, Beef, Pork, Chicken, Turkey
3) How many car journeys did you go on in the last year?
4) How many hours of TV will you watch over Christmas this year?
While we're on the subject, who are your three most famous scientists?
I'd go for Newton, Einstein and Bohr ... in that order. But as always they stood on others' shoulders, even though Newton didn't really mean it.
Sorry, but I had to google him. Looks to be a passable commander and certainly more worthy than the woman in the funny jumpers. Don't let Messrs Eagles and Dancer see or they'll have the Hannibal/wotsisname argument again.