Last week I tried to give some reasons why serious polling firms could give discrepant results. A couple of days ago, Datanálisis president Luis Vicente Leon (LVL) addressed the issue in a radio interview with Vladimir Villegas. LVL argued that the large differences currently seen between the results of the reputable, long-term Venezuela pollsters (Datanálisis, Consultores 21, IVAD, and Datos) are largely the result of how they deal with “undecided” respondents.
In my post I mentioned this issue in combination with the (unlikely) possibility of a “fear factor” being at work, i.e. that a significant sector of the population (about 17%) tell Datanálisis they are undecided but for some reason are comfortable enough with C21 to reveal that they will vote for HCR.
LVL took a different direction, starting with the idea that all serious pollsters are showing HCF to be ahead but differ in how big the gap is. Pollsters that are showing a very small or a very large gap tend to have a small percentage of “undecided” respondents in their sample. When the gap is small (for example, with Consultores 21), the pollster has used some sort of indirect question or other inferential technique to decide that a good part of the “undecided” votes are actually pro-HCR.
In contrast, pollsters that are showing HCF with an enormous lead (for example, Datos or IVAD) assume that undecided respondents will distribute the same way as decided respondents, and simply withdraw them from the sample. This increases the gap. For example, if you take Datanálisis’s June Omnibus and get rid of the 17.1% that are undecided, the sixteen-point gap of 43.6% (HCF) to 27.7% (HCR) becomes an almost twenty-point gap of 52.6% (HCF) versus 33.4% (HCR).
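The adjustment described above is just a rescaling: if undecideds are assumed to split the same way as decided voters, removing them and renormalizing the remaining shares reproduces the numbers quoted. A minimal sketch in Python, using the June Omnibus figures from the post (the function name `renormalize` is mine, not any pollster's terminology):

```python
def renormalize(decided, undecided):
    """Rescale decided vote shares as if the undecided share
    split in the same proportions as decided respondents."""
    scale = 1.0 / (1.0 - undecided)
    return {name: share * scale for name, share in decided.items()}

# June Omnibus: 43.6% HCF, 27.7% HCR, with 17.1% undecided
raw = {"HCF": 0.436, "HCR": 0.277}
adjusted = renormalize(raw, undecided=0.171)
# adjusted["HCF"] ≈ 0.526, adjusted["HCR"] ≈ 0.334
# so the gap widens from about 15.9 points to about 19.2 points
```

Note that the rescaling leaves the ratio between the candidates unchanged; only the absolute point gap grows, which is exactly why this choice makes the lead look larger.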
Either way of adjusting undecided votes amounts to a projection, rather than an actual reflection of the current tendency in the population. LVL says that Datanálisis does not adjust undecided responses until shortly before the election. I think that is the right decision because whether undecided voters are going to break towards one candidate or mirror the decided voters cannot be predicted this far ahead of time.
Note that this still does not explain what I mentioned in my previous post: that across the board you can see a higher percentage of typically opposition answers in Consultores 21’s poll than in Datanálisis’s Omnibus. This suggests there is also a difference in sampling (unless C21 throws the cases of “undecided” respondents completely out of the data, thereby affecting all questions in the same direction).
One final note. In response to my previous post, one reader suggested that I was too generous in my assessment of C21’s track record, pointing out that they were way off with respect to the 2009 referendum. Looking at the numbers in this article, C21 was indeed off by about 15 points. However, the fieldwork that poll was based on was done in December 2008, probably almost two months ahead of the February 15 referendum. Datanálisis also showed the government starting out way behind in December 2008, closing the gap by early January, and surging ahead by early February. So C21’s numbers don’t actually look too far off to me.