

Questions on the Daily Kos Poll of Republicans

Topics: Daily Kos, Interpreting polls, Poll Accuracy, Sampling

Yesterday, Daily Kos released a poll of self-identified Republicans that showed surprisingly high agreement on a series of questions asking whether Barack Obama should be impeached, whether he is a socialist or a racist, and whether he was born in the United States, among other things. The poll has created much debate in the blogosphere about the extremism of Republicans in the U.S.

While the results of the survey may be troubling, some have asked whether they are really representative of all Republicans or only the most extreme. Newsweek's Katie Connolly speculates, for example, that the potentially loaded questions that follow two more conventional probes of vote likelihood and 2012 vote preference may have caused some moderate respondents to discontinue the survey:

It's worth noting that those who completed the poll are also a self-selecting group. It's pretty commonplace in phone polls for respondents to simply hang up when asked loopy questions, and many of these questions qualify as loaded. As this poll progressed and questions like "do you think your state should secede from the union" came up, I'm willing to bet that the more rational Republicans just hung up, leaving the poll to be completed by those more divorced from reality.

Markos ("Kos") Moulitsas responded to such concerns via Twitter:

Cons[ervatives] arguing R2K poll might be skewed by "sane" Republicans hanging up b/c of crazy questions. R2K says no diff in response rates.

This may be true, but straight response rates might not tell the whole story here. A more important question is whether those respondents who started but failed to complete the survey were any different from those who did complete it.

Typically, the vast majority of those who do not respond are either not available when the pollster calls or refuse to participate immediately after answering the phone. Those who hang up during the interview are a small component. For example, a report on the response rates for Marist's final 2008 New Hampshire primary poll includes only 70 incomplete interviews, compared to 2,990 dialed numbers that resulted in refusals or where the household could not be contacted. The cases that pollsters call "mid-interview terminates" could increase significantly without making a big dent in the overall response rate.
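A rough calculation makes the point concrete. The 70 terminates and 2,990 refusals/non-contacts come from the Marist report cited above; the number of completed interviews below is a hypothetical figure chosen purely for illustration, and the response-rate formula is a simplified sketch rather than a formal AAPOR definition.

```python
# Sketch: mid-interview terminates barely move the overall response rate.
completes = 600   # hypothetical completed interviews (not from the report)
terminates = 70   # mid-interview terminates, from the Marist report
refusals = 2990   # refusals + non-contacts, from the Marist report

def response_rate(completes, terminates, refusals):
    """Completed interviews as a share of all eligible contacts (simplified)."""
    return completes / (completes + terminates + refusals)

base = response_rate(completes, terminates, refusals)
tripled = response_rate(completes, terminates * 3, refusals)
print(f"base rate:     {base:.1%}")    # ~16.4%
print(f"3x terminates: {tripled:.1%}")  # ~15.8%
```

Even tripling the number of mid-interview terminates shaves well under a point off the overall rate, which is why "no difference in response rates" cannot by itself rule out differential breakoff.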

Moulitsas attempts to address that question with the following tweet:

In fact, R2K says that more than usual, "most of respondents enjoyed answering the questions once they agreed to participate."

Having worked as a survey interviewer, I know that interviewers get a feel for respondents' reactions, which might provide at least anecdotal evidence of whether it was difficult to keep some respondents on the line or whether respondents were offended by any of the questions. However, Daily Kos and Research 2000 could provide harder evidence that their results were not skewed toward the most extreme Republicans by providing the following information from their data on respondents who began but did not complete the survey:

  1. What were their responses to the second question, about the 2012 presidential primary? This question comes before any of the potentially tainted questions and could therefore give some indication as to whether those who hung up the phone mid-survey were any different from those who did not. For example, were the respondents who completed the survey more likely to support Sarah Palin or Ron Paul than those who did not?

  2. How did they respond to the third question, about Obama's impeachment? If some respondents answered the first of the more loaded questions before breaking off, this is the question for which Daily Kos should have the most responses from those who quit later in the survey. If those who broke off after this point answered similarly to those who completed the survey, it would lend credence to the rest of the data and suggest that the final sample was indeed representative of Republicans generally. If fewer of those who refused to continue agreed that Obama should be impeached, that would suggest that the final sample was skewed toward those who agreed with the questions posed.

  3. How many incomplete interviews were there? If the incomplete interviews were different from the interviews included in the final sample, but there were very few of them, their inclusion might not significantly alter the final results.
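The check proposed in point 2 can be sketched as a simple two-by-two comparison of completers versus breakoffs on the impeachment question. All counts below are hypothetical stand-ins; Research 2000 would substitute its actual tallies. A Pearson chi-square statistic on the table indicates whether the two groups answered detectably differently.

```python
# Sketch of the breakoff-vs-completer comparison; all counts hypothetical.
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# rows: completers, breakoffs; columns: "yes, impeach", "no / not sure"
stat = chi_square_2x2(780, 1220, 20, 80)  # hypothetical counts
print(f"chi-square = {stat:.2f}")  # compare to 3.84 (p = .05, 1 df)
```

A statistic above the 3.84 critical value would suggest the breakoffs really did differ from those who finished, i.e., the kind of skew Connolly worries about; a small statistic would support the claim that the final sample was not distorted by mid-survey hangups.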

By answering these questions, Daily Kos and Research 2000 would provide important context and help to address valid questions about whether their final sample was skewed in any way.

Thanks to Mark for contributing to this post.


William Ockham:

Perhaps there is an element of social desirability bias going on here. Take a question where we have a lot of comparable data, gay marriage. The respondents in this survey were significantly more opposed to gay marriage than even "conservative Republicans" in a fairly recent Pew survey (and Pew's numbers on that question are pretty much the same as everybody else's).

Maybe the responses skewed to the more extreme end because the questions conditioned the respondents to think in terms of what their Republican friends would expect them to say.



There are two serious problems with the Daily Kos poll. First, this is a self-selected group (and the published poll does not show how party ID was determined), so there could be a significant number of people who responded just to spoof the poll. Calling actual Republicans from voter registration rolls would give a more accurate result.

The second problem is that the Daily Kos poll questions are nothing more than a liberal caricature of what conservatives are supposed to believe. Furthermore, these are yes-or-no questions; there is no range of agreement, nor are there any alternatives (such as "Should Obama be voted out of office or impeached?").

This is like a push poll in reverse; instead of delivering negative messages, it tries to obtain negative responses.



How do you think self-described Democrats would answer a poll beginning with the following questions?

Do you think President Obama is the saviour of the world?

Do you agree that anyone who disagrees with President Obama is a radical redneck racist and hates that there is an African American in the White House?

How many would stay on the line for the next set of leading questions?



Thank you for the post. It will serve as a useful example for students as a partial fisking of the survey's failings.

Sampling and questionnaire structure are, of course, key to collecting objective data. I fear the survey represented here is one created more for the sake of ratings and pageviews than for data worthy of earnest consideration.

Here's hoping that Daily Kos and Research 2000 will come by to respond to the questions above.

