

Why Pollsters Manipulate Public Opinion

Two recent polls, one by Gallup and the other by CNN, illustrate how easy it is for pollsters to make public opinion appear to be something different from what it really is.


The Gallup poll, Jan. 6-7, 2009, attempted to measure the public's reaction to a federal government stimulus package, with the question phrased as follows:


Do you favor or oppose Congress passing a new 775 billion dollar economic stimulus program as soon as possible after Barack Obama takes office?


In almost the same time period, the NBC News/Wall Street Journal poll also attempted to measure public opinion about the stimulus package, with a question that provided for a "don't know" option:


Do you think that the recently proposed economic stimulus legislation is a good idea or a bad idea? If you do not have an opinion either way, please just say so.


The results are shown below:


[Table: poll results on the economic stimulus package - Gallup vs. NBC News/Wall Street Journal]

The major difference, of course, is in the percentage of people who don't have an opinion - Gallup says just 11 percent, while NBC/WSJ says almost three times that number.


The margin in favor of the stimulus package is virtually identical in the two polls, 16 and 17 percentage points, but instead of being able to report a majority of Americans in favor, NBC and the Journal had to report that a "plurality" of Americans were in favor, with a substantial portion of the public ambivalent or unengaged. Gallup, by contrast, could report (although erroneously) that a majority was in favor.


The margin in favor of a proposition is not always the same under both ways of measuring public opinion, as the following case illustrates. When CNN wanted to discover whether the public was comfortable with Democratic control of the presidency and both houses of Congress, it asked a forced-choice question (Nov. 6-9, 2008):


As you may know, the Democrats will control both the Senate and the House of Representatives, as well as the presidency. Do you think this will be good for the country or bad for the country?


Again, coincidentally, another polling organization, Associated Press/GfK Roper Public Affairs and Media asked a similar question at virtually the same time (Nov. 6-10, 2008), though this question allowed for a middle position:


As you may know, the Democrats will now control the House of Representatives, the Senate and the presidency. Do you think it good for the country, bad for the country, or does it not really make a difference that the Democrats now control the House, the Senate and the presidency?


The results of the two polls show two very different publics:


[Table: poll results on Democratic control of the presidency and Congress - CNN vs. AP/GfK]

While CNN reports a large majority of Americans in favor of Democratic control, by a 21-point margin, the Associated Press finds a small plurality in favor (just an 8-point margin) and about a quarter of the public either saying the situation doesn't matter or not expressing an opinion.


In this case, both polling organizations deliberately prodded their respondents to come up with an opinion (even if, in the case of the AP/GfK poll, to say the issue didn't matter) by giving them information up front. Why did they need to tell respondents that the Democrats controlled the presidency and both houses of Congress? Why not find out how many people knew that, and then - among those who knew it - ask whether it was good or bad, whether it made a difference, or whether they had no opinion?


But the major media pollsters are generally not interested in realistic measures of public opinion. On the matters discussed here, Gallup and CNN clearly do not want to report how many people don't have an opinion or might want to take a valid middle position on the issue. Instead, these pollsters believe it's more interesting to create a "public opinion" that reflects a highly engaged and decisive public.


For CNN to say that 97 percent of Americans believe Democratic control of the government is either "good" or "bad," and for Gallup to claim that nine out of ten Americans have an opinion about the stimulus package, may fit their journalistic needs - but they know, and we know, it's simply not true.



I think your complaint is only valid with regard to the questions about unified government, where the question is not based around an actual decision or vote. With the stimulus package, however, the poll is acting as a referendum - at least in the way Gallup phrased it.

Just as a ballot referendum offers no box to check "unsure" or "I don't have an opinion," lawmakers voting on legislation must vote yes or no (barring abstention, which is unlikely for a vote on $800b of taxpayer dollars).

You will counter, of course, that pollsters are manipulating the data and deceiving consumers by presenting such a decided electorate. I say that if pollsters want to measure decisiveness, they can ask how strongly respondents feel about their opinion, whether they could change their mind, or whether a particular piece of new information would sway them. Each question provides insight into a different aspect of public opinion on the issue.



To me, this seems totally backwards.

I don't like the assumption that there is any significant percentage of people out there with "no" opinion. My mental model here is that, if "yes"/"no" is forced, everyone has some probability of answering "yes" - almost everyone near zero or one, but those with very weak or snap opinions near 0.5. The only point that corresponds precisely to "no opinion" is 0.5 exactly. But the integral of any probability density function from 0.5 to 0.5 is always zero.

Obviously, if you ask for "no opinion", the question isn't capturing p=0.5 exactly. You're integrating from 0.5-epsilon to 0.5+epsilon via the question wording. But now you're forcing your readers to try to figure out or guess what the heck epsilon is (at least intuitively), based on the wording of the question. What is the border between "no opinion" and "weak/slight/changeable opinion", or between "doesn't really matter" and "really matters"? We don't have any intuitive sense of it at all! Lumping those with very weak or leaning opinions, or who would give opinions as snap reactions, under "no opinion" is, I think, much more deceptive. A better way to measure opinion strength would be to ask strong/weak, or the likelihood of changing one's mind. We have more of an intuitive sense of these.
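The epsilon point can be made concrete with a small simulation. This is only a sketch of the latent-propensity model described above: the Beta-mixture distribution, its weights, and the epsilon values are all illustrative assumptions, not estimates from any real poll.

```python
import random

random.seed(0)

def latent_propensity():
    """Hypothetical latent probability of answering 'yes': most people
    near 0 or 1, a minority with weak opinions clustered near 0.5.
    (The mixture weights and Beta parameters are invented.)"""
    r = random.random()
    if r < 0.45:
        return random.betavariate(8, 2)   # firm leaners toward 'yes'
    elif r < 0.8:
        return random.betavariate(2, 8)   # firm leaners toward 'no'
    else:
        return random.betavariate(5, 5)   # weak opinions near 0.5

N = 100_000
sample = [latent_propensity() for _ in range(N)]

# Forced choice: everyone answers; a person with propensity p says
# 'yes' with probability p, so no one lands in a "no opinion" bucket.
forced_yes = sum(random.random() < p for p in sample) / N
print(f"forced-choice 'yes' share = {forced_yes:.1%}")

# Offering "no opinion" captures respondents within epsilon of p = 0.5;
# the reported non-opinion share depends entirely on this epsilon,
# which the question wording sets only implicitly.
for eps in (0.05, 0.15, 0.25):
    no_opinion = sum(abs(p - 0.5) < eps for p in sample) / N
    print(f"epsilon={eps:.2f}: no-opinion share = {no_opinion:.1%}")
```

Under this toy distribution the forced-choice question reports zero "no opinion" by construction, while the share captured by a "no opinion" option is whatever the wording's implicit epsilon makes it.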

I don't really like the first example's NBC/WSJ wording - it not only offers the "no opinion" answer but seems to be actively soliciting it with the oddly strong "please just say so". In other words, this probably has an artificially big epsilon.

In the second case, the results are consistent with each other. The wording "doesn't really matter" would logically capture everyone with a weak opinion or lean, as well as non-partisans who just happen to be supporting the Democrats for reasons unrelated to the fact they are Democrats. (Note how it's perfectly reasonable for them to answer "Good thing" to the first -- "hey, I happen to support them" -- but "Doesn't really matter" to the second -- "that's not why I voted for them".) So it's certainly reasonable for there to be 17ish% weak good vs. 4ish% weak bad.



(On second thought, that model's a little lacking. It's probably more accurately a convolution with "yes", "no", and "no opinion" filters, with the area under each filter taking the role of epsilon. (The filters would need to be normalized so that the sum of their areas is 1.) The difference is that this allows for the fact that the wording's influence is not guaranteed to be symmetric, centered, or bounded by clean vertical edges.)
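That filter idea can also be sketched numerically. The logistic filter shapes, the sharpness constant, and the latent density below are all invented for illustration, and the normalization differs slightly from the comment's: instead of filter areas summing to 1, the three filters here sum to 1 at every propensity p, so each simulated respondent gives exactly one answer.

```python
import math

def yes_filter(p, sharpness=10):
    """Soft 'yes' filter: chance of answering yes given latent
    propensity p, with a smooth rather than vertical boundary."""
    return 1 / (1 + math.exp(-sharpness * (p - 0.6)))

def no_filter(p, sharpness=10):
    return 1 / (1 + math.exp(-sharpness * (0.4 - p)))

def no_opinion_filter(p):
    # Whatever response probability the yes/no filters leave behind;
    # by construction the three filters sum to 1 at every p.
    return max(0.0, 1 - yes_filter(p) - no_filter(p))

# Hypothetical latent density peaked at the extremes:
# f(p) = 3(2p - 1)^2, which integrates to 1 over [0, 1].
def density(p):
    return 3 * (2 * p - 1) ** 2

# Expected answer shares via a midpoint Riemann sum.
M = 10_000
shares = {"yes": 0.0, "no": 0.0, "no opinion": 0.0}
for i in range(M):
    p = (i + 0.5) / M
    w = density(p) / M          # density * dp
    shares["yes"] += w * yes_filter(p)
    shares["no"] += w * no_filter(p)
    shares["no opinion"] += w * no_opinion_filter(p)

print({k: round(v, 3) for k, v in shares.items()})
```

The soft filter edges produce a sizable "no opinion" share even though no respondent sits at exactly p = 0.5 - the filter's width, not a point mass, is what the question wording controls.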


Professor M:

As I have commented before, it makes little sense to get bogged down in arguments over the correct form a question should take. When responses change as questions are worded differently, our response should be to try to learn something about public opinion from the differences we observe.

