
politicalbetting.com » Blog Archive » Online Polls, Big Stories, Shaky Foundations

System Posts: 12,214
edited December 2014 in General


Over the last 3 years the British Population Survey has been monitoring people who respond to online surveys and comparing them to the population as a whole, in terms of detailed demographics and attitudinal variables. It is a massive survey involving 6,000-8,000 face-to-face in-home interviews per month.

Read the full story here



Comments

  • JohnLoony Posts: 1,790
    Happy Birthday to Ed Miliband. The first leader of a main political party in the UK to be younger than me.
  • So online poll respondents are only 10% of the population, a group that may not be representative of the rest, but what proportion still responds to random calls to old-fashioned landlines from people claiming to be conducting opinion polls?
  • JohnLoony Posts: 1,790
    Can we exclude the possibility that the drip feed of such polls helps to create a bandwagon effect, influencing the outcome of elections and referenda....

    This means pollsters are not innocent observers of public opinion, but active participants in the political process; not only reporting public opinion but helping to shape it.


    Yes, because the majority of people in the general population are the non-hysterical moron types who won't ever even notice the frenzied self-excitement of the online anoraks.
  • Eddie Posts: 34
    "But all online responders account for no more than 10% of the population."

    The UK population is around 64 million. I am surprised if as many as 6.4 million people in the UK participate in online polls.
  • tlg86 Posts: 26,221
    Oh I see. Ukip's rise is the fault of opinion polls and nothing to do with the alienation of a small, but not inconsiderable, part of the electorate.

    Perhaps the LibLabCon should ban opinion polls as they're clearly anti-democratic.
  • Indigo Posts: 9,966
    JohnLoony said:

    Can we exclude the possibility that the drip feed of such polls helps to create a bandwagon effect, influencing the outcome of elections and referenda....

    This means pollsters are not innocent observers of public opinion, but active participants in the political process; not only reporting public opinion but helping to shape it.


    Yes, because the majority of people in the general population are the non-hysterical moron types who won't ever even notice the frenzied self-excitement of the online anoraks.

    But it's not the online anoraks that are the issue. It's mass-market media like the BBC noticing the results of polls that are potentially very misleading owing to their self-selecting samples, and using them to paint an unrealistic picture (possibly quite innocently) to which voters then react.

    An online poll with a self-selecting sample might, for example, put UKIP at say 21%, whereas the actual population figure might be say 12%, because the sample is badly skewed by being composed largely of wonkish political addicts and activists. The BBC picks up the poll and makes a splash with it. Non-political viewers take note of the BBC headline, accord UKIP more credibility than it is due, and give more consideration to voting for it than they otherwise would.

    Obviously the opposite is also possible: the wonks and activists are net anti-UKIP and the sample suppresses its apparent vote, leading people to give it less credibility than it deserves. I don't know which is more likely, and neither does anyone else, but the very fact that the sample is self-selecting, and is probably made up of people with political axes to grind, makes it almost useless.
  • foxinsoxuk Posts: 23,548
    An interesting article. My first posts on here were expressing similar sentiments as to whether online polls were effective samples. Some years ago I was doing some epidemiological and public health work and we spent a lot of time doing random sampling to ensure that our study was representative of the population at large. I like statistics and the study of bias.

    No amount of correction and weighting can compensate for a biased sample, but that does not mean that online polls are useless. Phone polls have not dissimilar problems, and when looking at polls we want a sample that is an accurate representation of the 65% who will turn out to vote and not the entirety of the population.

    Online polls are more like focus groups in some ways, and do show trends and movements. Ultimately the proof of the pudding is in the eating, so do they forecast the result accurately? It seems that they often do, but it is difficult to be sure, in that a late poll movement (such as in the indy ref) may not be a true movement but really just an improvement in accuracy.

    I expect that I am not the only PBer who signed up with an online panel in order to deliberately influence the narrative. I think that Ashcroft's marginal constituency polls are likely to be a better predictor, but they are still untested apart from a few by-elections.

    Happy Winterval everybody!
  • foxinsoxuk Posts: 23,548
    tlg86 said:

    Oh I see. Ukip's rise is the fault of opinion polls and nothing to do with the alienation of a small, but not inconsiderable, part of the electorate.

    Perhaps the LibLabCon should ban opinion polls as they're clearly anti-democratic.

    As UKIP has done well in some real elections, it does seem to an extent to be a genuine phenomenon. I have more than a little scepticism that relying on DNV voters to turn out is a valid strategy, but we shall see.
  • Indigo Posts: 9,966
    edited December 2014

    No amount of correction and weighting can compensate for a biased sample, but that does not mean that online polls are useless. Phone polls have not dissimilar problems, and when looking at polls we want a sample that is an accurate representation of the 65% who will turn out to vote and not the entirety of the population.

    The two flaws in this are, first, that if we have a self-selected sample that greatly increases the number of wonks and activists compared to the population, then a number of measures, not least "likelihood to vote", will be totally unrepresentative. Your highly engaged sample set will be much more engaged than the general population, and using their LTV figure to balance the sample seems like doubling down on the sampling error. The second issue I have is that it will almost by definition undersample anti-politics party voters: if they are voting for their party because they don't like the current politics, they may be considerably less likely than average to be involved in the political process and hence to be on survey panels. One wonders, for example, how many white van men are on the panels.

    The other problem I have with online surveys, compared particularly to phone surveys, is that it's far too easy to pack the panel. Party A could tell its activists to all go and sign up over the next month to a given panel. It could even have a small group of activists generate hundreds of survey accounts and pretend to be supporters of an opposing party, or even their own party, and sway the results. Polls are not given under oath, and people should not assume that respondents remotely tell the truth in them. I don't: I respond in a way that is likely, if repeated by other like-minded voters, to push my party in a given direction.
  • I'm sorry, but the facts do not support this argument.

    Regarding the Scottish referendum: a face-to-face poll for TNS, conducted in the same week as the YouGov 51-49 poll but before it was published, showed a dead heat.

    http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/ywzyqmrf2u/Scotland_Final_140905_Sunday_Times_FINAL.pdf

    http://www.tns-bmrb.co.uk/uploads/files/TNSUK_SOM2014Sep9_DataTables.pdf

    In respect of Westminster VI surveys since the referendum, the poll that has given the biggest vote share and lead to the SNP was a telephone poll conducted by Ipsos MORI.

    https://www.ipsos-mori.com/researchpublications/researcharchive/3469/SNP-open-up-significant-lead-ahead-of-General-Election-vote.aspx
  • Indigo Posts: 9,966
    edited December 2014
    tlg86 said:

    Oh I see. Ukip's rise is the fault of opinion polls and nothing to do with the alienation of a small, but not inconsiderable, part of the electorate.

    Perhaps the LibLabCon should ban opinion polls as they're clearly anti-democratic.

    Too paranoid tbh. The opposite is completely possible as well. Self-selecting polls might be full of Guardian-reading types and Flightpath clones (shudder!) who hate UKIP, and might well under-report the actual level of support UKIP has. That's the problem with this sort of biased sample: you don't know how it is biased, in which direction, or by how much.
  • Indigo Posts: 9,966
    JamesMo said:

    I'm sorry, but the facts do not support this argument.

    Regarding the Scottish referendum: a face-to-face poll for TNS, conducted in the same week as the YouGov 51-49 poll but before it was published, showed a dead heat.

    http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/ywzyqmrf2u/Scotland_Final_140905_Sunday_Times_FINAL.pdf

    http://www.tns-bmrb.co.uk/uploads/files/TNSUK_SOM2014Sep9_DataTables.pdf

    In respect of Westminster VI surveys since the referendum, the poll that has given the biggest vote share and lead to the SNP was a telephone poll conducted by Ipsos MORI.

    https://www.ipsos-mori.com/researchpublications/researcharchive/3469/SNP-open-up-significant-lead-ahead-of-General-Election-vote.aspx

    I have a different problem with telephone polls. You are talking to a person then, and all sorts of social factors come into play, such as not wanting to look an idiot, not wanting to appear inconsistent, and perhaps not wanting to be seen to support an out of fashion or politically incorrect party.

    Phone polls show much less support for UKIP than online polls, even with the same polling company; similarly, they show much less evidence of LD->CON switching. Both are the sort of moves which people might feel would raise eyebrows with their peers, and hence be reticent about admitting. This applies doubly to face-to-face polls: a nice young lady (probably a student) with a clipboard approaches you in the high street and asks about your politics, and you are almost certain she might disapprove of your Tory or UKIP views. If those views are not deeply held, or you are wavering and possibly embarrassed about your wavering, there is going to be a tendency to tell her what you think she wants to hear.
  • foxinsoxuk Posts: 23,548
    Indigo said:

    JamesMo said:

    I'm sorry, but the facts do not support this argument.

    Regarding the Scottish referendum: a face-to-face poll for TNS, conducted in the same week as the YouGov 51-49 poll but before it was published, showed a dead heat.

    http://d25d2506sfb94s.cloudfront.net/cumulus_uploads/document/ywzyqmrf2u/Scotland_Final_140905_Sunday_Times_FINAL.pdf

    http://www.tns-bmrb.co.uk/uploads/files/TNSUK_SOM2014Sep9_DataTables.pdf

    In respect of Westminster VI surveys since the referendum, the poll that has given the biggest vote share and lead to the SNP was a telephone poll conducted by Ipsos MORI.

    https://www.ipsos-mori.com/researchpublications/researcharchive/3469/SNP-open-up-significant-lead-ahead-of-General-Election-vote.aspx

    I have a different problem with telephone polls. You are talking to a person then, and all sorts of social factors come into play, such as not wanting to look an idiot, not wanting to appear inconsistent, and perhaps not wanting to be seen to support an out of fashion or politically incorrect party.

    Phone polls show much less support for UKIP than online polls, even with the same polling company; similarly, they show much less evidence of LD->CON switching. Both are the sort of moves which people might feel would raise eyebrows with their peers, and hence be reticent about admitting. This applies doubly to face-to-face polls: a nice young lady (probably a student) with a clipboard approaches you in the high street and asks about your politics, and you are almost certain she might disapprove of your Tory or UKIP views. If those views are not deeply held, or you are wavering and possibly embarrassed about your wavering, there is going to be a tendency to tell her what you think she wants to hear.
    Phone polls do have some advantages: with the rise in mobiles they go to an individual rather than a household. There are many (including myself) who hang up on these calls, as they are so often sales pitches or scams rather than true research. I also get annoyed about getting these calls at work. Not everyone works 9-5.

  • Charles Posts: 35,758
    Indigo said:

    the nice young lady (probably a student) with a clipboard approaches you in... if those views are not deeply held, or you are wavering and possibly embarrassed about your wavering, there is going to be a tendency to tell her what you think she wants to hear.

    http://www.telegraph.co.uk/news/politics/liberaldemocrats/8215481/Vince-Cable-I-have-the-nuclear-option-its-like-fighting-a-war.html

    Q.E.D.
  • foxinsoxuk Posts: 23,548
    OT: I drink like a Paraguayan!

    http://m.bbc.co.uk/news/health-30500372

  • Is it more a case that the rise of online polls has just meant more polling generally, so more references to polls and, therefore, more of the news agenda referencing, or being set by, polls?
  • Indigo Posts: 9,966

    Is it more a case that the rise of online polls has just meant more polling generally, so more references to polls and, therefore, more of the news agenda referencing, or being set by, polls?

    Add the words "based on a biased sample" to the end of this and, yes, pretty much.

  • Are online polls not to be taken too seriously? Yes, online polls are not to be taken too seriously.

    Are telephone polls to be taken too seriously...?
  • Thanks for a very interesting article, Mr Sparrow.
  • For those that missed it yesterday, my latest piece, this on the Lib Dems' various battlegrounds:

    http://newstonoone.blogspot.co.uk/2014/12/testing-boundaries-3-lib-dems-vs-all.html
  • Indigo Posts: 9,966
    My feeling is that most polls, online, telephone or face-to-face, are in effect attempts at a large http://en.wikipedia.org/wiki/Fermi_problem. Fermi problems give surprisingly accurate results, given their broad-brush estimates and scope for error, because overestimates and underestimates tend to cancel each other out.

    Most of these polls probably accept that the sample is biased, and prone to stuffing with wonks and activists, but probably get close to the result by assuming that the relative amount of stuffing was equal from all parties, so you got just as many Labour wonks ramping their side as you did Conservative wonks ramping their party. I am not sure how this stands up in a five party system.

    In a phone poll, if the number isn't answered, do they call back, or do they substitute the next number in the sequence? In a door-to-door stratified random sample like we used when doing psychological research, if the selected person didn't open the door the usual instruction was to go to the neighbour, because within a small geographical area with similar housing, health, school facilities etc., there was no reason to suspect that the neighbour wasn't as representative as the selected candidate, and because sample sizes were quite large any minor effect would be mitigated. If you do the same with a phone number you have no reason to suspect the next number has anything to do with the selected number; they could be in a completely different area of town, and you could get a noticeably biased sample (i.e. those prepared to answer political opinion surveys, i.e. wonks and activists ;-) )
  • Good morning, everyone.

    Thanks to Mr. Sparrow for this piece.

    I very much agree with the sentiment at the end. Last election the constant flow of YouGov polls gave them a disproportionate impact as they were continually reported by the media, shaping opinion rather than merely reflecting it.

    I do think we need more stringent restrictions on polling during electoral periods to stop this sort of thing. [But it's even more important we ban the worm].
  • MarqueeMark Posts: 52,937
    Excellent piece, Mr Sparrow.

    Perhaps you could arrange with your polling chums for anyone who uses the politically paranoid term LibLabCon to suffer a North Korean-style Denial Of Service?

    Also, "turnip".
  • Ishmael_X Posts: 3,664
    Indigo said:

    My feeling is that most polls, online, telephone or face-to-face, are in effect attempts at a large http://en.wikipedia.org/wiki/Fermi_problem. Fermi problems give surprisingly accurate results, given their broad-brush estimates and scope for error, because overestimates and underestimates tend to cancel each other out.

    Most of these polls probably accept that the sample is biased, and prone to stuffing with wonks and activists, but probably get close to the result by assuming that the relative amount of stuffing was equal from all parties, so you got just as many Labour wonks ramping their side as you did Conservative wonks ramping their party. I am not sure how this stands up in a five party system.

    In a phone poll, if the number isn't answered, do they call back, or do they substitute the next number in the sequence? In a door-to-door stratified random sample like we used when doing psychological research, if the selected person didn't open the door the usual instruction was to go to the neighbour, because within a small geographical area with similar housing, health, school facilities etc., there was no reason to suspect that the neighbour wasn't as representative as the selected candidate, and because sample sizes were quite large any minor effect would be mitigated. If you do the same with a phone number you have no reason to suspect the next number has anything to do with the selected number; they could be in a completely different area of town, and you could get a noticeably biased sample (i.e. those prepared to answer political opinion surveys, i.e. wonks and activists ;-) )

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

  • Wow! I logged on this morning expecting very little, it being Xmas Eve and all that. What a piece. This bombshell is too important to be missed by the many people who will be out fighting in the aisles for the last brussels sprouts. Can we run it again in the new year?
  • Indigo Posts: 9,966
    edited December 2014
    Ishmael_X said:

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

    That would be the ultimate in self-selection. Vast sample, tiny response, only the really interested or committed would reply.

  • Indigo said:

    My feeling is that most polls, online, telephone or face-to-face, are in effect attempts at a large http://en.wikipedia.org/wiki/Fermi_problem. Fermi problems give surprisingly accurate results, given their broad-brush estimates and scope for error, because overestimates and underestimates tend to cancel each other out.

    Most of these polls probably accept that the sample is biased, and prone to stuffing with wonks and activists, but probably get close to the result by assuming that the relative amount of stuffing was equal from all parties, so you got just as many Labour wonks ramping their side as you did Conservative wonks ramping their party. I am not sure how this stands up in a five party system.

    In a phone poll, if the number isn't answered, do they call back, or do they substitute the next number in the sequence? In a door-to-door stratified random sample like we used when doing psychological research, if the selected person didn't open the door the usual instruction was to go to the neighbour, because within a small geographical area with similar housing, health, school facilities etc., there was no reason to suspect that the neighbour wasn't as representative as the selected candidate, and because sample sizes were quite large any minor effect would be mitigated. If you do the same with a phone number you have no reason to suspect the next number has anything to do with the selected number; they could be in a completely different area of town, and you could get a noticeably biased sample (i.e. those prepared to answer political opinion surveys, i.e. wonks and activists ;-) )

    Isn't the Fermi method, and the ideas around the related wisdom of crowds, based around obtaining a single answer though? Polling is about finding the range of opinions.
  • Sean_F Posts: 37,536
    Thanks for a very interesting article. But do online polls, in general, come up with different results from telephone polls? Currently, there are two significant differences:-

    The big two typically poll 60-62% in phone polls. They tend to poll 65-70% in online polls.

    The Lib Dems average 11% in phone polls, 7% online.

    There's no significant difference in levels of UKIP support. 15.5% in phone polls, 17% online.

  • A war plane has crashed within ISIS held territory. No details as yet:
    http://www.bbc.co.uk/news/world-middle-east-30596474
  • Moving off polls and on to 'gut reactions': Peter Oborne in the DT says this morning that we will see a big return to two-party politics in the election. Lab and Con will win a higher proportion of seats between them than at any time since 1992. UKIP will win one or two seats, maybe not even that.
  • Mr. Borough, I think Oborne will be wrong by a large margin. I can't see the SNP collapsing (support may decline but they could make substantial gains even then), nor UKIP.
  • Ishmael_X Posts: 3,664
    edited December 2014
    Indigo said:

    Ishmael_X said:

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

    That would be the ultimate in self-selection. Vast sample, tiny response, only the really interested or committed would reply.

    Not true. "Vast sample, tiny response" has nothing to do with self-selection and doesn't matter if the spamming is vast enough (because 0.001% of a feck of a lot is a feck of a lot). It takes no more commitment or interest to reply to a question in an email than to one on the phone, and much less than the rigmarole of signing up to an online polling panel. All polling samples are self-selecting in your sense in that they consist solely of those who choose to answer the question.
  • chestnut Posts: 7,341
    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

  • Indigo Posts: 9,966

    Isn't the Fermi method, and the ideas around the related wisdom of crowds, based around obtaining a single answer though? Polling is about finding the range of opinions.

    We are trying to find a single answer here: the percentage of voters that will vote for each party.

    Fermi problems are typified by the question "how many piano tuners are there in Chicago", where you make a set of assumptions that lead you to an answer:

    There are approximately 9,000,000 people living in Chicago.
    On average, there are two persons in each household in Chicago.
    Roughly one household in twenty has a piano that is tuned regularly.
    etc

    Some of those will be overestimates, some underestimates; they tend to cancel each other out, leading to a more accurate answer than you might otherwise expect.
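    As a rough sketch of how a piano-tuner estimate like this gets closed (the first three figures are the ones listed above; the tuning frequency and tuner capacity below are invented assumptions added purely to finish the arithmetic):

```python
# Rough Fermi sketch: piano tuners in Chicago.
# The first three figures come from the assumptions listed above; the last
# two (tunings per piano per year, tunings one tuner can do in a year) are
# invented here purely to close the calculation.
population = 9_000_000
people_per_household = 2
share_with_tuned_piano = 1 / 20
tunings_per_piano_per_year = 1            # assumption
tunings_per_tuner_per_year = 2 * 5 * 50   # assumption: 2 a day, 5 days, 50 weeks

households = population / people_per_household
tunings_needed = households * share_with_tuned_piano * tunings_per_piano_per_year
estimated_tuners = tunings_needed / tunings_per_tuner_per_year
print(round(estimated_tuners))  # ~450; each input is rough, but the errors partly cancel
```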

    The parallel I was drawing was that the sample from online polls is undoubtedly biased, but the nature of those biases is such that they will tend to cancel each other out, at least when there are only two leading parties. The number of people joining the sample to ramp Labour will probably be similar to the number joining to ramp the Tories, etc. The number of activists and wonks joining the sample from each side probably broadly parallels the motivation of that side's supporters, which is usually a function of how well that side is doing.
  • Sean_F Posts: 37,536

    Mr. Borough, I think Oborne will be wrong by a large margin. I can't see the SNP collapsing (support may decline but they could make substantial gains even then), nor UKIP.

    The big two may increase their share of seats, but probably not their share of the vote.

  • Indigo Posts: 9,966
    Ishmael_X said:

    All polling samples are self-selecting in your sense in that they consist solely of those who choose to answer the question.

    Absolutely. I am on record here as being highly dubious of the value of polls, because the people that answer them are self-selecting, and because it is assumed that the people polled will tell the truth when they certainly don't, especially this far out from an election when people are more interested in "sending messages".

    With a large enough sample you can mitigate the self-selection to some extent by assuming that it applies equally to all sides, and that might get you so far with things like raw VI; but since the people choosing to reply are the politically engaged, numbers like "likelihood to vote" are possibly way out, and so using that as a factor to modify VI sounds dangerous.

    When, as someone noticed yesterday, one organisation was scaling the results of young Green voters by 245% to balance their sample, it's not surprising that we see the Green vote fluctuating wildly from poll to poll.
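    As a toy illustration of why such heavy scaling makes a subgroup's contribution jumpy (all the numbers below are invented, and the 2.45 weight is just one reading of "scaling by 245%"):

```python
# Invented numbers: how a heavily up-weighted cell moves the headline figure.
raw_young_greens = 20      # respondents actually found in that cell (hypothetical)
weight = 2.45              # one reading of "scaled by 245%"
sample_size = 1_000

print(f"{raw_young_greens * weight / sample_size:.1%}")        # 4.9% of the headline figure
# Two extra such respondents in the next poll shift it by ~0.5 points, not 0.2:
print(f"{(raw_young_greens + 2) * weight / sample_size:.1%}")  # 5.4%
```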
  • NickPalmer Posts: 21,566
    Indigo said:

    Ishmael_X said:

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

    That would be the ultimate in self-selection. Vast sample, tiny response, only the really interested or committed would reply.

    Yes. Ishmael's suggestion is based on a common mathematical misunderstanding - that the important thing in a sample is that it's big. If you imagine taking coloured balls from a barrel with millions of balls, if you are sampling them accurately then 1000 is plenty to get a fairly accurate picture. If you sample 100,000, you won't get a much better picture. But if your sample is biased by any kind of self-selection (or anything else) then you're stuffed, no matter how big the sample unless it's the whole population.
    Indigo said:


    The other problem I have with online surveys, compared particularly to phone surveys, is that it's far too easy to pack the panel. Party A could tell its activists to all go and sign up over the next month to a given panel. It could even have a small group of activists generate hundreds of survey accounts and pretend to be supporters of an opposing party, or even their own party, and sway the results. Polls are not given under oath, and people should not assume that respondents remotely tell the truth in them. I don't: I respond in a way that is likely, if repeated by other like-minded voters, to push my party in a given direction.

    The panels are big enough to defeat this sort of effort. I'm on the YouGov panel - I've been asked my political opinion once in the last three years.

    The article is very interesting, but I'd question two points. First, as EiT and fox have said, phone samples also will have biases, e.g. more willingness to respond among people not in employment. I once got called by Populus - annoyingly, I was just about to go into a meeting so I couldn't bias their sample :-). Second, the "bandwagon effect" isn't well-documented, and some research by Nate Silver suggests it doesn't exist. I think it probably only has one effect - supporters of new parties like UKIP will be less likely to vote for them if polls suggest they're failing to break through.
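    A quick simulation of the coloured-balls point above, purely for illustration (the 48% true share and the 1.5x response rate for one group are invented): sheer size does not rescue a self-selected sample, while a modest unbiased one lands close to the truth.

```python
# Invented figures: 48% of the population are "red"; in the biased sample,
# reds are 1.5x as likely to respond (a stand-in for self-selection).
import random

random.seed(1)
TRUE_RED_SHARE = 0.48

def unbiased_share(n):
    return sum(random.random() < TRUE_RED_SHARE for _ in range(n)) / n

def biased_share(n, response_ratio=1.5):
    p = TRUE_RED_SHARE * response_ratio / (TRUE_RED_SHARE * response_ratio + 1 - TRUE_RED_SHARE)
    return sum(random.random() < p for _ in range(n)) / n

print(f"1,000 drawn without bias: {unbiased_share(1_000):.3f}")   # close to 0.48
print(f"100,000 drawn with bias:  {biased_share(100_000):.3f}")   # stuck near 0.58
```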

  • MikeSmithson Posts: 7,382
    edited December 2014
    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on.



  • A war plane has crashed within ISIS held territory. No details as yet:
    http://www.bbc.co.uk/news/world-middle-east-30596474

    Royal Jordanian Air Force: interesting to see how this pans out....

    Src.: http://www.militaryphotos.net/forums/showthread.php?243346-ISIS-claims-capture-of-Jordanian-pilot-in-Syria
  • DavidL Posts: 54,016
    edited December 2014
    Excellent piece and some good additional contributions from @foxinsoxuk.

    The points Nick makes seem extremely valid to me but I think that phone polls suffer very similar problems. Whenever I get someone on the phone for a survey I politely hang up and I am interested in politics. I am sure the vast majority of potential contributors do the same so once again you are left with the more enthusiastic and committed.

    Mr Sox points out that you can never really compensate for a distorted sample and I can see how in epidemiological terms that must be true. But polling companies, particularly those like ICM (and my new heroes in Ipsos Mori of course) have got pretty good at doing so over time which is why they get pretty close most of the time. If the committed head off in a particular direction the majority of us sheep generally follow.

    As for polls making the story rather than simply reporting it I am sure there is some truth in that too. But is it maybe still better than the story being made by foreign media moguls talking their own book? Elections and public opinion have never been pure or paragons of virtue. Biased polls are probably still an improvement.

    Still, lots of good excuses packed away for the next Tory disappointment.
  • foxinsoxuk Posts: 23,548

    Indigo said:

    Ishmael_X said:

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

    That would be the ultimate in self-selection. Vast sample, tiny response, only the really interested or committed would reply.

    Yes. Ishmael's suggestion is based on a common mathematical misunderstanding - that the important thing in a sample is that it's big. If you imagine taking coloured balls from a barrel with millions of balls, if you are sampling them accurately then 1000 is plenty to get a fairly accurate picture. If you sample 100,000, you won't get a much better picture. But if your sample is biased by any kind of self-selection (or anything else) then you're stuffed, no matter how big the sample unless it's the whole population.
    Indigo said:


    The other problem I have with online surveys, compared particularly to phone surveys, is that it's far too easy to pack the panel. Party A could tell its activists to all go and sign up over the next month to a given panel. It could even have a small group of activists generate hundreds of survey accounts and pretend to be supporters of an opposing party, or even their own party, and sway the results. Polls are not given under oath, and people should not assume that respondents remotely tell the truth in them. I don't: I respond in a way that is likely, if repeated by other like-minded voters, to push my party in a given direction.

    The panels are big enough to defeat this sort of effort. I'm on the YouGov panel - I've been asked my political opinion once in the last three years.

    The article is very interesting, but I'd question two points. First, as EiT and fox have said, phone samples also will have biases, e.g. more willingness to respond among people not in employment. I once got called by Populus - annoyingly, I was just about to go into a meeting so I couldn't bias their sample :-). Second, the "bandwagon effect" isn't well-documented, and some research by Nate Silver suggests it doesn't exist. I think it probably only has one effect - supporters of new parties like UKIP will be less likely to vote for them if polls suggest they're failing to break through.

    I am on the YouGov panel and get polled on politics most weeks.

    All methods are imperfect, and we never know how accurate polls are unless there is an election at the same time.
  • NickPalmer Posts: 21,566
    edited December 2014
    In reply to the technical question, pollsters have quotas to fill (gender, region, usually former political leanings). If someone's out, they will try another number, but if they fill their quota of male Scottish 2010 Tories, they won't just add some more to replace female Midlands 2010 LibDems. If they get too many of one group and too few of another, they assume (a bit dangerously) that the ones they have are representative of the missing ones, and give them more or less weight to balance the sample. It's often suggested here that a poll that has e.g. few 2010 Tory voters is overestimating the true size of the Tory vote, masking it by weighting. Not so, unless the ones they found are atypical of the ones they didn't find.

    Example: you want your sample to have 35% of people who voted Conservative last time. You find to your horror that you only have 17.5% in your sample. So you count them all double (as if they were two respondents) and reduce the weight of everyone else so the total is still 100%. Now if the Tories you didn't find think just like the ones you did find, that's not an indication of anything at all.
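    The doubling-up described above, in miniature (the 35% target and 17.5% raw figure come from the example; everything else is invented):

```python
# Invented sketch of past-vote weighting: target 35% 2010 Conservatives,
# but the raw sample only contains 17.5%.
sample = {"Con2010": 175, "Other": 825}   # respondents out of 1,000
target = {"Con2010": 0.35, "Other": 0.65}

total = sum(sample.values())
weights = {g: target[g] / (sample[g] / total) for g in sample}
print(weights)  # Con2010 weighted x2.0, everyone else scaled down to ~x0.79

# Each respondent's answers then count with their group's weight, so the 175
# Conservatives found stand in for the 350 the target share implies.
```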

    But if yesterday's papers reported that in 2010 Cameron was a secret worshipper of Satan, your respondents might be temporarily embarrassed to admit they voted for him (unless they didn't have consciences, of course :-)). In that case, actually your sample already has lots MORE 2010 Tories than are admitting it, and you are mistaken in doubling the ones you found - perversely, the apparent shortage of 2010 Tories is making you create a sample that is too far to the right. The Satanism story may mean they're now voting UKIP or LibDem but it's still a biased sample.

    Bottom line: the raw results of polls should be treated with caution. However, trends from the same institute are usually illuminating, since the same biases will tend to apply each time.
  • Indigo Posts: 9,966
    edited December 2014

    ... lots of good stuff snipped ...

    Bottom line: the raw results of polls should be treated with caution. However, trends from the same institute are usually illuminating, since the same biases will tend to apply each time.

    Many thanks for that. I think those last two lines are the most important thing for people to remember. By extension, if a particular organisation has the Tories in the lead, they might well not be, especially if the margin is relatively close; but if it shows the Tory vote falling over time, that is much more likely to be the case.
  • Ishmael_X Posts: 3,664

    Indigo said:

    Ishmael_X said:

    The best way of polling would be mass email spamming - ultra-cheap, huge sample even if responses only a tiny percentage, elimination of self-selection and mobile vs-landline issues. A great pity it's illegal.

    That would be the ultimate in self-selection. Vast sample, tiny response, only the really interested or committed would reply.

    Yes. Ishmael's suggestion is based on a common mathematical misunderstanding - that the important thing in a sample is that it's big. If you imagine taking coloured balls from a barrel with millions of balls, if you are sampling them accurately then 1000 is plenty to get a fairly accurate picture. If you sample 100,000, you won't get a much better picture. But if your sample is biased by any kind of self-selection (or anything else) then you're stuffed, no matter how big the sample unless it's the whole population.
    Indigo said:


    The other problem I have with online surveys, compared particularly to phone surveys, is that it's far too easy to pack the panel. Party A could tell its activists to all go and sign up over the next month to a given panel. It could even have a small group of activists generate hundreds of survey accounts and pretend to be supporters of an opposing party, or even their own party, and sway the results. Polls are not given under oath, and people should not assume that respondents remotely tell the truth in them. I don't: I respond in a way that is likely, if repeated by other like-minded voters, to push my party in a given direction.

    The panels are big enough to defeat this sort of effort. I'm on the YouGov panel - I've been asked my political opinion once in the last three years.

    The article is very interesting, but I'd question two points. First, as EiT and fox have said, phone samples also will have biases, e.g. more willingness to respond among people not in employment. I once got called by Populus - annoyingly, I was just about to go into a meeting so I couldn't bias their sample :-). Second, the "bandwagon effect" isn't well-documented, and some research by Nate Silver suggests it doesn't exist. I think it probably only has one effect - supporters of new parties like UKIP will be less likely to vote for them if polls suggest they're failing to break through.

    Don't be patronising. That is a common mathematical mistake, but not a mistake I am making, though I should have said "adequately sized sample" rather than huge sample. Actually I think you are making one: your sample is what you measure. You are not measuring non-responders here. So your sample, statistically speaking is only the responders. Secondly as already pointed out, in this context all samples self-select by answering the question, so you have to live with it and try to find the least-worst kind of self-selection.
  • Mr. Borough, I think Oborne will be wrong by a large margin. I can't see the SNP collapsing (support may decline but they could make substantial gains even then), nor UKIP.

    Who knows. But I wonder whether commentators like Oborne ever put their money where their gut is.
  • DavidL Posts: 54,016

    In reply to the technical question, pollsters have quotas to fill (gender, region, usually former political leanings). If someone's out, they will try another number, but if they fill their quota of male Scottish 2010 Tories, they won't just add some more to replace female Midlands 2010 LibDems. If they get too many of one group and too few of another, they assume (a bit dangerously) that the ones they have are representative of the missing ones, and give them more or less weight to balance the sample. It's often suggested here that a poll that has e.g. few 2010 Tory voters is overestimating the true size of the Tory vote, masking it by weighting. Not so, unless the ones they found are atypical of the ones they didn't find.

    Example: you want your sample to have 35% of people who voted Conservative last time. You find to your horror that you only have 17.5% in your sample. So you count them all double (as if they were two respondents) and reduce the weight of everyone else so the total is still 100%. Now if the Tories you didn't find think just like the ones you did find, that's not an indication of anything at all.

    But if yesterday's papers reported that in 2010 Cameron was a secret worshipper of Satan, your respondents might be temporarily embarrassed to admit they voted for him (unless they didn't have consciences, of course :-)). In that case, actually your sample already has lots MORE 2010 Tories than are admitting it, and you are mistaken in doubling the ones you found - perversely, the apparent shortage of 2010 Tories is making you create a sample that is too far to the right. The Satanism story may mean they're now voting UKIP or LibDem but it's still a biased sample.

    Bottom line: the raw results of polls should be treated with caution. However, trends from the same institute are usually illuminating, since the same biases will tend to apply each time.

    Do you have a source for this Satanism allegation Nick? ;-)
  • Mr. Borough, that's the thing. A commentator doesn't get paid by results.

    Mr. Thoughts, whilst relieved it's not a British plane, that's concerning. Reminds me of a few months ago when three (I think) pilots and planes defected from Iraq to ISIS.
  • FluffyThoughts Posts: 2,420
    edited December 2014
    Interesting bee-hatch fight:

    Mathematicians arguing about and/or over observations versus samples. It is like Einstein and that Danish bloke who no-one remembers....*

    * Well, until an Englishman (oops - he was an Ulster-Scot) proved the latter right. Red-or-black; you play the game....
  • Indigo Posts: 9,966
    edited December 2014

    Mr. Borough, that's the thing. A commentator doesn't get paid by results.

    Indeed. When you are paid for, in effect, generating clicks, creating heat is the name of the game, providing light is more an optional extra ;-)

  • The biggest failure in UK polling history was the 1992 general election.

    Correct me if I'm wrong, but I don't think there were any online pollsters in operation at that time.
  • JamesMo said:

    The biggest failure in UK polling history was the 1992 general election.

    Correct me if I'm wrong, but I don't think there were any online pollsters in operation at that time.

    And very few mobile-phones. MS-DOS Windows was struggling with WFWG too. How the fashionable always get things wrong....
  • Interesting article from a US stats organization on the debate around this topic:

    "Some online nonprobability polls have compiled a good record of predicting election outcomes, actually outperforming traditional polls in some elections. But skeptics point to the absence of a theory, such as the one underlying probability sampling, that provides a basis for expecting traditional polls to work well under different conditions."

    http://magazine.amstat.org/blog/2014/10/01/prescolumnoct14/
  • Easterross Posts: 1,915
    An excellent piece and a good reason why the pollsters may get as much egg on their faces in 2015 as they did in 1992.

    Regardless of sampling errors and all the other sophisticated and less sophisticated methods of screening respondents, it is simply not possible for Labour to be 5-7% ahead with some pollsters, the Tories to be 3% ahead with Ipsos-MORI, or the parties to be level pegging as Populus suggested on Monday, and for them all to be correct.

    On here we give a great deal of credence to YouGov simply because they produce 5 polls a week and for the addicts on here, endless subsets and polls within polls. We have treated ICM as the Gold Standard and speaking personally when I see it give Labour a decent lead as it did last week, I take a sharp intake of breath.

    We also have ComRes and Survation, neither of which I have personally ever treated as serious pollsters given their lack of accuracy compared to real results e.g. euro elections, by-elections etc.

    We have Michael Ashcroft's herculean efforts both with his Monday polls and his individual constituency polls. We used to anxiously look forward to his mega-polls.

    Finally we have the others like TNS, Opinium and Stephen Fisher. TNS seems to get credence for its Scottish polls and Stephen Fisher's weekly summary predictions are as anticipated by many of us as the Michael Ashcroft marginal polls.

    We all have our favourites. There are those who tend to give results we like and those we don't. It would be good if Mike or TSE gave us a thread or two over the holiday period reminding us of pollsters' predictions before recent elections and their relative accuracy compared to the actual results.

    Personally my gut reaction is that basically the Tories and Labour are neck and neck with Labour possibly being 1% in the lead, around 33-32, the LibDems are on around 10% and UKIP on around 12%. In Scotland the SNP are at around 40%, Labour around 25%, the Tories around 17% (where they polled in 2010) and the LibDems around 5% but most of that 5% is stacked in the seats they are defending and those where they were 2nd in 2010.

    Over the next few months I see Labour dropping into the 20s, the Tories edging up towards 35% and the LibDems and UKIP fighting it out in the 10-15% band. Over 6 months we have seen the Labour lead in the marginals drop from over 6% to around 3%. The next set of Ashcroft marginals will be interesting. Christmas and New Year give MPs lots of time to spend in their constituencies and sample the mood music. Danny Alexander's faux attack on George Osborne this morning may be part of his reaction to that mood music.
  • Sean_F Posts: 37,536

    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on
    ICM online did put Labour in first place. Few people remember how accurate you were if you put the wrong party first.
  • Interesting bee-hatch fight:

    Mathematicians arguing about and/or over observations versus samples. It is like Einstein and that Danish bloke who no-one remembers....*

    * Well, until an Englishman (oops - he was an Ulster-Scot) proved the latter right. Red-or-black; you play the game....

    Eddington was English, surely?
  • Is it not further complicated by subcontracting phone poll fieldwork (ie making the calls) to just one or two companies? I'm not sure who does the sampling in those cases.
  • Re 1992.

    Nick Sparrow was the man who learnt the lessons first from that debacle and created the broad weighting structures we have today.
  • Re 1992.

    Nick Sparrow was the man who learnt the lessons first from that debacle and created the broad weighting structures we have today.

    He should have fixed the phone randomisation system so as not to oversample Labour supporters.
  • Is it not further complicated by subcontracting phone poll fieldwork (ie making the calls) to just one or two companies? I'm not sure who does the sampling in those cases.

    Several firms carry out fieldwork but the structures are designed by the pollsters. Those phoned are determined by a computer-generated system that takes known landline numbers and then randomises the last digit.
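    A rough sketch of that last-digit randomisation (the seed numbers below are invented placeholders, not real directory entries):

```python
# Take a known landline number and randomise its final digit; the result can
# also land on unlisted numbers in the same local block.
import random

def randomise_last_digit(seed_number: str) -> str:
    return seed_number[:-1] + str(random.randint(0, 9))

seed_numbers = ["02079460123", "01314960456"]   # hypothetical listed numbers
dialling_frame = [randomise_last_digit(n) for n in seed_numbers]
print(dialling_frame)
```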
  • Indigo Posts: 9,966
    edited December 2014

    Re 1992.

    Nick Sparrow was the man who learnt the lessons first from that debacle and created the broad weighting structures we have today.

    He should have fixed the phone randomisation system so as not to oversample Labour supporters.
    If there is an oversampling of Labour supporters it's probably due to them being more likely to be available, especially during the day; most people on benefits, or with part-time jobs, probably tend to vote Labour. Old people are more likely to be Conservative voters, and tend to be at home as well, but old people often don't answer the telephone to unlisted numbers or people they don't know.
  • As a market researcher, I'll make a few comments on some of these points:

    - No poll is perfect but I think we in the industry try to make them as good as we can
    - Both main methodologies can have issues reaching the whole population - online panels tend to have fewer older people, while phone finds it hard to reach the increasing number of mobile-only households (which tend to be younger)
    - As mentioned online panels are self-selecting to sign up but bear in mind that generally most surveys are not political and asking about consumer products. I think more people are motivated by interest in surveys or making a little bit of money (some sign up to multiple panels) rather than political reasons
    - While membership of a panel is self-selecting, the panel company chooses who to invite to each survey. A general rule of thumb is that each panellist should not be invited to more than 1 survey a month
    - For phone surveys, most companies should make calls in the evening as well as the daytime to catch people who are out at work.
    - It is a well known variable in research that if you have say a 1-7 scale where 7 is very good and 1 is very poor, that people in online research tend to give more answers at the extremes of the scale than phone respondents. This may be due to phone respondents feeling the need to justify extreme views to the interviewer and to the anonymity of online

    As I have mentioned before, I don't do political polling myself but think it must be some of the hardest polling to do:

    - Often deals with controversial topics. People are more likely to be untruthful when it comes to politics rather than something like choice of cereal.
    - The difficulties of weighting based on past voting and making sure past recall is accurate nearly 5 years from the last election (and when we have had local and Euro elections since)
    - The lack of frequent elections to validate results against
    - The fact that you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth

    One good thing is that we have lots of poll companies and more frequent polls so that it should hopefully be easier to spot any outliers

    Looking at the next election I think most pollsters will get the Lab vs Con lead right. The big challenge will be getting the UKIP and SNP numbers right. Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.
  • OldKingCole Posts: 33,709
    edited December 2014
    Indigo said:

    Re 1992.

    Nick Sparrow was the man who learnt the lessons first from that debacle and created the broad weighting structures we have today.

    He should have fixed the phone randomisation system so as not to oversample Labour supporters.
    If there is an oversampling of Labour supporters it's probably due to them being more likely to be available, especially during the day; most people on benefits, or with part-time jobs, probably tend to vote Labour. Old people are more likely to be Conservative voters, and tend to be at home as well, but old people often don't answer the telephone to unlisted numbers or people they don't know.
    Do phone pollsters only work 9-5, then?

    I think the point about older people not answering the phone to people they don’t know is a good one. An increasing number of my friends are so fed up with unblockable nuisance calls .... from abroad for example ...... that they don’t answer, see who called, then ring back. More expensive, of course, and could cause serious problems when whatever the Microsoft Technical Department is calling them about actually affects them! (Yes, I know it’s not!)
  • Is it not further complicated by subcontracting phone poll fieldwork (ie making the calls) to just one or two companies? I'm not sure who does the sampling in those cases.

    Several firms carry out fieldwork but the structures are designed by the pollsters. Those phoned are determined by a computer-generated system that takes known landline numbers and then randomises the last digit.
    Randomising the last digit on landlines is not a clever idea because numbers are assigned in blocks to areas. Because people in posher areas are more likely to be ex-directory, they are under-sampled. Mobile phones are a different matter.
  • MikeK Posts: 9,053
    "This means pollsters are not innocent observers of public opinion, but active participants in the political process; not only reporting public opinion but helping to shape it."

    Well smell the coffee, Mr. Sparrow........ and is this why there is a concerted effort by all the pollsters, in the two weeks before Christmas, to downgrade UKIP's level of support in the polls?
    In other words to be - active participants in the political process.


    Mike Smithson has always said that could never happen, has he changed his mind?

  • As a market researcher, I'll make a few comments on some of these points:

    - No poll is perfect but I think we in the industry try to make them as good as we can
    - Both main methodologies can have issues reaching the whole population - online panels tend to have fewer older people, while phone finds it hard to reach the increasing number of mobile-only households (which tend to be younger)
    - As mentioned online panels are self-selecting to sign up but bear in mind that generally most surveys are not political and asking about consumer products. I think more people are motivated by interest in surveys or making a little bit of money (some sign up to multiple panels) rather than political reasons
    - While membership of a panel is self-selecting, the panel company chooses who to invite to each survey. A general rule of thumb is that each panellist should not be invited to more than 1 survey a month
    - For phone surveys, most companies should make calls in the evening as well as the daytime to catch people who are out at work.
    - It is a well known variable in research that if you have say a 1-7 scale where 7 is very good and 1 is very poor, that people in online research tend to give more answers at the extremes of the scale than phone respondents. This may be due to phone respondents feeling the need to justify extreme views to the interviewer and to the anonymity of online

    As I have mentioned before, I don't do political polling myself but think it must be some of the hardest polling to do:

    - Often deals with controversial topics. People are more likely to be untruthful when it comes to politics rather than something like choice of cereal.
    - The difficulties of weighting based on past voting and making sure past recall is accurate nearly 5 years from the last election (and when we have had local and Euro elections since)
    - The lack of frequent elections to validate results against
    - The fact that you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth

    One good thing is that we have lots of poll companies and more frequent polls so that it should hopefully be easier to spot any outliers

    Looking at the next election I think most pollsters will get the Lab vs Con lead right. The big challenge will be getting the UKIP and SNP numbers right. Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.

    As someone who often starts but does not finish online polls, my suggestion would be to ask fewer questions.
  • Richard_NabaviRichard_Nabavi Posts: 30,821
    edited December 2014
    Very interesting article, as one would expect from Nick Sparrow.

    The 'self-select' bias problem for online polls is of course very well known, and reputable pollsters such as YouGov will try to correct for it. This is easier if, like YouGov, you start with an enormous panel, set up well before the particular poll you are conducting. So, in general, it doesn't seem to be the case that online polls are systematically more wrong than telephone or face-to-face polls, which (as several posters have pointed out) have their own problems.

    All the same, as political punters we should definitely be aware of the possibility of self-select bias, especially in circumstances where for some reason one segment of the respondent base is likely to be particularly fired-up. An example of this was the apparent surge in support for Mitt Romney after he did better than expected in the first TV debate against Obama. As always in political betting, you need to use polls as guides to be interpreted intelligently, not as exact and literal measurements of voting intention. In particular, if phone and online polls start to diverge systematically, ask yourself why - self-select bias might be one possible explanation.
  • OldKingColeOldKingCole Posts: 33,709
    edited December 2014
    MikeK said:

    "This means pollsters are not innocent observers of public opinion, but active participants in the political process; not only reporting public opinion but helping to shape it."

    Well smell the Coffee, Mr. Sparrow........ and is this why there is a concerted effort by all the pollsters, in the two weeks before Christmas, to downgrade the level of UKIP poll levels?
    In other words to be - active participants in the political process.


    Mike Smithson has always said that could never happen, has he changed his mind?

    Unlikely that they are trying to affect the process, Mr K. It’s “simply” that however one frames a question it is extremely difficult to avoid some sort of bias. At one stage of my life I was writing multiple choice questions and writing credible confounders was especially difficult.

    In passing, it does seem that if one asked if a respondent suffered or had ever suffered from mild paranoia, then a yes answer means they could probably be put down as a Kipper!
  • NickPalmerNickPalmer Posts: 21,566
    DavidL said:


    Do you have a source for this Satanism allegation Nick? ;-)

    Oh, it's just intuitively clear. :-) But I'll do a random sample of my friends to check if you like.

    Really good post by Gareth (and apols to Ishmael for the patronising bit, didn't word my post well). We should probably be surprised how well the polls do, given the difficulties.

    MPs and would-be MPs get regularly polled too, incidentally - typically you're offered £25 (to you or to charity) to spend 10 minutes or so expressing views on issues of interest to the sponsors, e.g. attitudes to Heathrow expansion. Occasionally the sponsors release the results ("64% of X's MPs favour our cause") but usually not. I once in a quiet moment agreed to do an hour-long face to face survey on a range of issues (for the CBI, I think) - that definitely introduced bias, since the interviewer couldn't stop herself from raising her eyebrows or chuckling at some of my responses. It tempted me to be more outrageous - others might have tried to be more mainstream.





  • VerulamiusVerulamius Posts: 1,550

    Interesting bee-hatch fight:

    Mathematicians arguing about and/or over observations versus samples. It is like Einstein and that Danish bloke who no-one remembers....*

    * Well, until an Englishman (oops - he was an Ulster-Scot) proved the latter right. Red-or-black; you play the game....

    But the Danish bloke was right, though. After all, how many people have a radius named after them?
  • SandyRentoolSandyRentool Posts: 22,181
    I've been thinking that the change in voting pattern next year versus 2010 could pose a major challenge to the BBC exit poll. After becoming the de facto election result at 10:01, it would be quite a surprise to many if, for example, it had the wrong party winning most seats or predicted UKIP on 2 when they actually win 22, or whatever. There might even be some betting opportunities for those with a good insight who can spot the errors in the exit poll.
  • VerulamiusVerulamius Posts: 1,550
    Can we apply a quantum approach to polls? That there is a probability function over people's voting pattern that only crystallises in the voting booth?
  • Decrepit John L - I mainly do research with doctors, nurses and pharmacists and we pro rata the incentive to the length of interview. In consumer research, the incentive tends to be quite low (say 50p), however long the survey is, as there is not a shortage of people who will do it for that amount

    Nick P - thanks. One of the things that winds market researchers up is how many people think research is easy and that you can just knock up a few questions in Survey Monkey in 5 minutes. Anyone can do bad research, but getting it right and getting good quality data is a lot harder. Certainly I have had quite a few clients come to me asking me to field questions they have written themselves, and I have had to tactfully rewrite them substantially...

  • isamisam Posts: 41,118

    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on.



    This time last year there were very few polls, if any, that put Ukip ahead in the euros

    In April ICM had labour on 36 and Ukip on 20

    http://en.m.wikipedia.org/wiki/European_Parliament_election,_2014_(United_Kingdom)

  • isamisam Posts: 41,118

    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on.



    You are referring to the final polls... Thus far out no one was calling it right (well no pollster)

    Also pre euro you made a big point of saying AIFE would hit the Ukip vote

    You said about 2% would be knocked off, and after the result said it was as you expected

    Therefore it is wholly inconsistent of you to use the final Ukip score as a measure of pollster accuracy, as people accidentally voting AIFE would have told a pollster Ukip

  • isam said:

    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on.



    This time last year there were very few polls, if any, that put Ukip ahead in the euros

    In April ICM had labour on 36 and Ukip on 20

    http://en.m.wikipedia.org/wiki/European_Parliament_election,_2014_(United_Kingdom)

    This doesn't prove the polls were wrong. People turned to UKIP later in the campaign. Polling is a snapshot of opinion at the time of the poll. Unless the pollsters changed their methods.
  • isamisam Posts: 41,118
    edited December 2014

    isam said:

    chestnut said:

    Just to add my voice to those who equally question the validity of 'polling by human'.

    Stigmatised opinions are more likely to be withheld, with the soft, middle of the road response offered in these situations.

    All polling needs to be balanced by the actual outcomes of real elections where comparisons to polling can be made.

    ICM had a particularly poor Euros, and were the worst pollster for understating the Kippers and over-estimating both Lab and Con.

    That's worth bearing in mind when viewing their polling outcomes. Are they really geared up for the change that has happened since 2010?

    At the Euros ICM did better with the UKIP share than any of the online firms apart from YouGov. Please get your facts right before making statements.

    ICM was out by 2%
    TNS by 4%
    Opinium by 5%
    Survation by 5%
    ComRes online by 6%

    YouGov was spot on.



    This time last year there were very few polls, if any, that put Ukip ahead in the euros

    In April ICM had labour on 36 and Ukip on 20

    http://en.m.wikipedia.org/wiki/European_Parliament_election,_2014_(United_Kingdom)

    This doesn't prove the polls were wrong. People turned to UKIP later in the campaign. Polling is a snapshot of opinion at the time of the poll. Unless the pollsters changed their methods.
    ICM had a 16 point labour lead over Ukip in April... How wrong do you want it to be?






  • Very interesting article, as one would expect from Nick Sparrow.

    The 'self-select' bias problem for online polls is of course very well known, and reputable pollsters such as YouGov will try to correct for it. This is easier if, like YouGov, you start with an enormous panel, set up well before the particular poll you are conducting. So, in general, it doesn't seem to be the case that online polls are systematically more wrong than telephone or face-to-face polls, which (as several posters have pointed out) have their own problems.

    All the same, as political punters we should definitely be aware of the possibility of self-select bias, especially in circumstances where for some reason one segment of the respondent base is likely to be particularly fired-up. An example of this was the apparent surge in support for Mitt Romney after he did better than expected in the first TV debate against Obama. As always in political betting, you need to use polls as guides to be interpreted intelligently, not as exact and literal measurements of voting intention. In particular, if phone and online polls start to diverge systematically, ask yourself why - self-select bias might be one possible explanation.

    Divergence could be due to self-selection but might just as easily be due to differential changes between subgroups *after* weights have been established.
  • Right that's me done for this year - off for a final shop and then time to break out the xmas beers.

    Merry Christmas PBers, and here's to a prosperous political New Year!
  • CD13CD13 Posts: 6,366
    edited December 2014
    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.
  • OldKingColeOldKingCole Posts: 33,709
    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
  • Incidentally, I agree this should be reposted a little way into 2015, so it gets a bit more of an airing.
  • Ishmael_XIshmael_X Posts: 3,664

    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
    Linnaeus?

  • CD13CD13 Posts: 6,366
    Mr Cole,

    "No Tycho Brahe either?"

    Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.

    But plenty of Danish soap stars.
  • As a market researcher, I'll make a few comments on some of these points:

    - No poll is perfect but I think we in the industry try to make them as good as we can
    - [...] you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth

    ......... Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.

    As someone who often starts but does not finish online polls, my suggestion would be to ask fewer questions.


    I agree. I find with long polls my attention wanders and I tend to answer quickly just to finish. Some of the answers are often - err - rubbish...

    YouGov's long multiple choice ones (not on politics but often on products or supermarkets) are an utter turnoff...
  • Ishmael_XIshmael_X Posts: 3,664
    Here's a plan:

    Form a consortium of large mainstream websites which require log-ons (banks, shopping, betting sites). Randomly ask customers a VI question which they have to answer (even if "prefer not to answer") to get further on in the log-in process.

    Obviously there are selection biases here. Again, it's the least-worst we are looking for.
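    A toy sketch of how such an intercept might select log-ins, assuming a made-up 2% prompt rate and randomly faked answers standing in for real users (none of this is any site's actual mechanism):

```python
import random
from typing import Optional

# Hypothetical log-in intercept: a made-up 2% of log-ins get prompted with
# one voting-intention question; answers are faked at random for the sketch.
PROMPT_RATE = 0.02
OPTIONS = ["Con", "Lab", "LD", "UKIP", "Other", "Prefer not to answer"]

def maybe_prompt(login_id: int) -> Optional[str]:
    """Return a recorded answer if this log-in was selected, else None."""
    if random.random() > PROMPT_RATE:
        return None
    return random.choice(OPTIONS)  # a real system would ask the user here

responses = [a for a in (maybe_prompt(i) for i in range(100_000)) if a]
print(len(responses), "intercept responses collected")
```

    The selection bias mentioned above is still there, of course: only people who use those sites, and who log in, can ever be asked.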
  • OldKingColeOldKingCole Posts: 33,709
    Ishmael_X said:

    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
    Linnaeus?

    CD13 said:

    Mr Cole,

    "No Tycho Brahe either?"

    Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.

    But plenty of Danish soap stars.

    Copernicus was earlier, I think.
    And Carl Linnaeus was Swedish. Although, to be fair, Brahe was born in what was then Denmark but is now Sweden.
  • As a market researcher, I'll make a few comments on some of these points:

    - No poll is perfect but I think we in the industry try to make them as good as we can
    - [...] you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth

    ......... Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.

    As someone who often starts but does not finish online polls, my suggestion would be to ask fewer questions.


    I agree. I find with long polls my attention wanders and I tend to answer quickly just to finish. Some of the answers are often - err - rubbish...

    YouGov's long multiple choice ones (not on politics but often on products or supermarkets) are an utter turnoff...
    Respondent fatigue is a well known issue. Ideally surveys shouldn't be longer than 20 mins but that is not always possible.
  • NickPalmerNickPalmer Posts: 21,566




    I agree. I find with long polls my attention wanders and I tend to answer quickly just to finish. Some of the answers are often - err - rubbish...

    YouGov's long multiple choice ones (not on politics but often on products or supermarkets) are an utter turnoff...

    So true - asking almost identical questions (Do you like... Would you be proud to work for... Do you think it represents quality...) on every brand you claim to recognise gives a very strong incentive to claim to recognise almost nothing, as you will be literally punished for every brand you say you know. BT? Wassat?

    But maybe that's why I don't get YouGov surveys often!

  • OldKingColeOldKingCole Posts: 33,709

    As a market researcher, I'll make a few comments on some of these points:

    - No poll is perfect but I think we in the industry try to make them as good as we can
    - [...] you can have a perfectly accurate poll taken say 4 days before the election but late developments mean people change their mind in the polling booth

    ......... Both parties are seeing 2010 non-voters now saying they will vote for them. How many will actually turn out on the day? It is hard to say.

    As someone who often starts but does not finish online polls, my suggestion would be to ask fewer questions.


    I agree. I find with long polls my attention wanders and I tend to answer quickly just to finish. Some of the answers are often - err - rubbish...

    YouGov's long multiple choice ones (not on politics but often on products or supermarkets) are an utter turnoff...
    Respondent fatigue is a well known issue. Ideally surveys shouldn't be longer than 20 mins but that is not always possible.
    Not keen on the design of those Yougov polls at all. Less and less inclined to complete them. One the other day asked me which bank I’d be proud to work for!

    I have it in my mind that when we were doing questionnaire design we were advised to keep them to one side of A4, if at all possible.
    Was quite a long time ago though!
  • Ishmael_XIshmael_X Posts: 3,664

    Ishmael_X said:

    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
    Linnaeus?

    CD13 said:

    Mr Cole,

    "No Tycho Brahe either?"

    Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.

    But plenty of Danish soap stars.

    Copernicus was earlier, I think.
    And Carl Linnaeus was Swedish. Although, to be fair Brahe was born in what was then Denmark, but is now Sweden.
    Has someone pinned a "please patronize me" notice to my back this morning? The question was about famous Scandinavians.

  • CD13CD13 Posts: 6,366
    edited December 2014
    Mr Cole,

    "Copernicus was earlier, I think."

    True, but he was a bit dull compared to Brahe.
  • OldKingColeOldKingCole Posts: 33,709
    Ishmael_X said:

    Ishmael_X said:

    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
    Linnaeus?

    CD13 said:

    Mr Cole,

    "No Tycho Brahe either?"

    Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.

    But plenty of Danish soap stars.

    Copernicus was earlier, I think.
    And Carl Linnaeus was Swedish. Although, to be fair Brahe was born in what was then Denmark, but is now Sweden.
    Has someone pinned a "please patronize me" notice to my back this morning? The question was about famous Scandinavians.

    Sorry. Thought it was about famous Danes. (Nearly put great Danes but thought it would compound my felony!)
  • PlatoPlato Posts: 15,724
    LOL
    Charles said:

    Indigo said:

    the nice young lady (probably a student) with a clipboard approaches you in... if those views are not deeply held, or you are wavering and possibly embarrassed about your wavering, there is going to be a tendency to tell her what you think she wants to hear.

    http://www.telegraph.co.uk/news/politics/liberaldemocrats/8215481/Vince-Cable-I-have-the-nuclear-option-its-like-fighting-a-war.html

    Q.E.D.
  • CD13CD13 Posts: 6,366
    MrX,

    "Has someone pinned a "please patronize me" notice to my back this morning?"

    I didn't realise we were being that serious, but there is a serious point here.

    At this time of year we get what are laughingly called "Celebrity Specials" where old has-beens appear to promote their latest panto, and are eulogised by the presenters. A real "pass the sick bag" feast.
  • PlatoPlato Posts: 15,724
    Were you a fan of Mass Observation?

    I love reading their stuff post WW2.
    Indigo said:

    My feeling is that most polls - online, telephone or face-to-face - are in effect attempts at a large http://en.wikipedia.org/wiki/Fermi_problem. Fermi problems give surprisingly accurate results, given their broad-brush estimates and scope for error, because overestimates and underestimates tend to cancel each other out.

    Most of these polls probably accept that the sample is biased, and prone to stuffing with wonks and activists, but probably get close to the result by assuming that the relative amount of stuffing was equal from all parties, so you got just as many Labour wonks ramping their side as you did Conservative wonks ramping their party. I am not sure how this stands up in a five party system.

    In a phone poll, if the number isn't answered, do they call back, or do they substitute the next number in the sequence? In a door-to-door stratified random sample like we used when doing psychological research, if the selected person didn't open the door the usual instruction was to go to the neighbour: within a small geographical area with similar housing, health and school facilities there was no reason to suspect the neighbour was any less representative than the selected candidate, and because sample sizes were quite large any minor effect would be mitigated. If you do the same with a phone number you have no reason to suspect the next number has anything to do with the selected one (it could be in a completely different area of town), and you could get a noticeably biased sample (ie those prepared to answer political opinion surveys, ie wonks and activists ;-) )
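    A toy simulation of the Fermi point, assuming the quantity being estimated is a product of five guessed factors, each off by up to a factor of two either way (all numbers invented):

```python
import random
import statistics

# Toy Fermi estimate: the true quantity is a product of five factors, and
# each factor is guessed with a log-symmetric error of up to 2x either way.
TRUE_FACTORS = [30.0, 0.2, 12.0, 0.5, 400.0]
TRUE_VALUE = 1.0
for f in TRUE_FACTORS:
    TRUE_VALUE *= f

def one_estimate() -> float:
    est = 1.0
    for f in TRUE_FACTORS:
        est *= f * 2 ** random.uniform(-1.0, 1.0)  # guess off by up to 2x
    return est

ratios = [one_estimate() / TRUE_VALUE for _ in range(10_000)]
within_4x = sum(1 for r in ratios if 0.25 <= r <= 4.0) / len(ratios)
# Worst case is 32x out (2**5), but because over- and under-estimates
# largely cancel, most estimates land much closer to the truth than that.
print("median ratio to truth:", round(statistics.median(ratios), 2))
print("share within 4x of truth:", round(within_4x, 2))
```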

  • OldKingColeOldKingCole Posts: 33,709

    Ishmael_X said:

    Ishmael_X said:

    CD13 said:

    DecrepitJohn,

    Bell was an Ulsterman - the entanglement man of EPR paradox fame

    And Bohr was a real legend. I saw a Scandinavian airplane recently bearing images of famous Scandinavians. The only one I'd heard of was Hans Christian Anderson, the others were actors and other such nonentities. No Bohr.

    A travesty.

    No Tycho Brahe either?
    Linnaeus?

    CD13 said:

    Mr Cole,

    "No Tycho Brahe either?"

    Nope, but with Copernicus, Kepler and Galileo stealing his work, I didn't expect him to be there.

    But plenty of Danish soap stars.

    Copernicus was earlier, I think.
    And Carl Linnaeus was Swedish. Although, to be fair Brahe was born in what was then Denmark, but is now Sweden.
    Has someone pinned a "please patronize me" notice to my back this morning? The question was about famous Scandinavians.

    Sorry. Thought it was about famous Danes. (Nearly put great Danes but thought it would compound my felony!)
    I’ll add the Lion of the North.
  • PlatoPlato Posts: 15,724
    Mr Oborne is like a one-armed bandit - pull the handle and out comes a random polemic opinion piece.

    If he was a sandwich-board nutcase, he'd have one saying The End Is Nigh and another saying We're All Saved.

    Mr. Borough, I think Oborne will be wrong by a large margin. I can't see the SNP collapsing (support may decline but they could make substantial gains even then), nor UKIP.

  • As we are talking about questionnaire design and as I mentioned about DIY surveys earlier, I have come up with a challenge for PBers. What is wrong with these questions and how would you improve them?

    1) How would you rate your stay at Hotel X? Poor, Average, Good, Very Good, Excellent
    2) What is your favourite type of meat? Lamb, Beef, Pork, Chicken, Turkey
    3) How many car journeys did you go on in the last year?
    4) How many hours of TV will you watch over Christmas this year?
  • CD13CD13 Posts: 6,366
    Mr Cole,

    While we're on the subject, who are your three most famous scientists?

    I'd go for Newton, Einstein and Bohr ... in that order. But as always they stood on others' shoulders, even though Newton didn't really mean it.
  • CD13CD13 Posts: 6,366
    "I’ll add the Lion of the North."

    Sorry, but I had to google him. Looks to be a passable commander and certainly more worthy than the woman in the funny jumpers. Don't let Messrs Eagles and Dancer see or they'll have the Hannibal/wotsisname argument again.
This discussion has been closed.