

Undersized Undecideds

Two days ago, Nick Panagakis reopened our debate about the "true" size of the undecided vote in his post on pollster.com, entitled Supersized Undecideds. Oddly, his post tends to support my argument rather than contradict it.


First I should note that Nick has misstated my position somewhat, which was explained here and here. In brief, my argument is that pollsters should measure the undecided vote by including in their vote choice question a tag line, "or haven't you made up your mind yet?" I also argue that pollsters should not insist on asking whom voters would choose "if the election were held today," but whom they would support on Election Day. I contend that this way of asking voters their candidate preferences produces a more realistic and accurate picture of the electorate than the way pollsters currently report the results of their hypothetical, forced-choice vote question.


Nick disagrees, because he thinks that this approach would exaggerate the number of undecided voters. He makes the novel argument that any indecision measured as I suggest would be "calendar-induced" indecision but not "candidate-induced" indecision. I don't know of any evidence for the validity of this distinction, but it's crucial to his argument.


To illustrate this point, he presents recent data from the ABC/Washington Post tracking polls, which suggest that currently only 9 percent of voters say they could change their mind before election day, including 3 percent who say it's a "good" chance they could do so, and 6 percent who say it's "pretty unlikely" they would do so. The latter term Nick interprets in his own mental framework as "no chance in h*ll."


Then, as though it's an obvious problem, Nick says, "Imagine if polls up until last week were showing undecideds 10 to 20 points higher - or still showing 9 points greater this week." Yes, let's imagine the 9 percentage point increase in the undecided voter group over what is reported these days.


It's important to note that most polls have been showing just a couple of percentage points of undecided voters, including ABC and the Post. These news organizations did not highlight the 9 percent undecided in their news stories, but instead focused on Obama's lead over McCain by 52 percent to 45 percent - leaving 3 percent unaccounted for (1 percent "other" and 2 percent "undecided"). If you want to know how many voters might "change their minds," you have to look hard for the data. Of course, ABC and the Post are no different from most other polling organizations that regularly suppress the undecided vote.


So, if the polls were to show "9 points greater undecided this week," as Nick feared, that would still be only 10 to 11 percent. That hardly seems excessive, given that the 2004 exit poll found 9 percent of voters saying they had made up their minds in the three days just prior to the election. And just today, the AP reported that about 14 percent of voters were "persuadable" - a news story that, unlike most poll stories, emphasized the size of the undecided vote rather than suppressing it.


Just before the New Hampshire Democratic Primary, the UNH Survey Center found 21 percent of voters who said they had not made up their minds (when asked directly, without the hypothetical, forced-choice version that is standard), and the exit poll showed that 17 percent of voters said they had made up their minds on election day.


These numbers suggest that measuring and reporting the size of the undecided voters is an important part of describing the state of the electorate. Not to do so is one of the continuing failures of most media polls.



I think inserting that many undecideds into a poll does two things:

1) It allows the pollster to cover their a** on election day.

2) It allows, in the event of a McCain upset (stolen), for an argument as to why McCain won (stole) the election.

No way that in this year's election, given its importance and intensity, there are that many undecideds. Just no way. This is not a traditional election.



I like David's idea for two reasons.

1) It's more accurate. For those of us who obsess over the race, this may seem odd, but a lot of people don't give serious thought to whom to vote for until late. The 9% exit poll value noted above supports this.

2) It's important to show a lot of uncertainty if it's there. It's good for democracy. Even from a press or horse race perspective it's better. I support Obama and would like to see his high numbers as solid, but if that's not the way it is, I'd rather know that.



I'm dubious of a forced-choice/election day distinction. I doubt many voters actually expect to change their mind.

If the argument is that the forced choice risks understating the number of undecideds, that's fair. However, not asking the question would create the opposite problem of understating voters who are strongly leaning.

A poll with artificially high numbers of undecideds strikes me as less predictive than a poll which captures voters who are leaning.

The "or haven't you decided yet?" question seems unnecessarily suggestive, but then again so does listing choices of candidates.

What's wrong with just asking "which candidate, if any, do you support?"


Dave Barnes:

@Adam, who wrote: "What's wrong with just asking 'which candidate, if any, do you support?'"

How do you answer that if you are Stephen Colbert, who
1. endorsed Barack
2. but is voting for John?



David, I think your last paragraph pretty much sums up the issue. You're trying to describe the current state of the electorate, and the number of "true" undecided voters is important to doing that. If you're trying to preserve or disrupt the current state of the electorate between day X and the actual election, then I imagine the "true" number of undecideds is even more important.

Others seem to confuse that with predicting the result of the election. At poll-closing on election day, there are precisely zero undecided voters. If I were interested in predictions (and I'm not), I'd say forcing a choice is quite likely to be better than not forcing a choice. (I'm assuming that the forced-choice information is better than no information - and that it's possible to learn how that information can be helpful.)

Of course, whether or not we can actually measure the "true" number of undecideds is an entirely different matter.

Perhaps, a longitudinal study to measure the incidence of "mind-changing" per decisiveness state?



The issue is to get an accurate read. I tend to agree that the "forced" approach to get preference will minimize undecideds to an unrealistic level.

Candidly, the undecided answer is often also an "I don't want to tell you" answer. The forced approach tends to create the "liars" in surveys.

While not perfect, the approach suggested would mitigate to some degree the "nonparticipants" issue. While much has been said about "cell phones," the issue of who doesn't answer the phone or declines to participate is just as significant, if not more so. The assumption that those people have a demographic profile reflective of the general population is just as flawed as the assumption that cell-phone-only users reflect the general population. Actually, the decliners are probably an even more skewed demographic.

It would be interesting to compare the phone exchanges for DNPs against census data to see if there is a demographic profile for those people.



David Moore wrote :

"That hardly seems excessive, given that the 2004 exit poll found 9 percent of voters saying they had made up their minds in the three days just prior to the election."

That is true, but in the 2004 election we had a tape from al Qaeda (which I happen to think was manufactured) released a few days before the election, besides the "religious" and "pro-life" factors that were used. I believe that the fear created among the electorate pushed them into making their decision. That is the reason the Republican and McCain campaigns are focusing (once again) on the "politics of fear" - not necessarily to move the undecideds to their side, but to move people to the undecided column.

However, I think this year is a completely different election. There is a very high level of excitement, and the focus is not on national security or religion, as much as McCain and the Republicans would like it to be. But if there were such an event (a tape or an international incident), I think the electorate has become more comfortable with Obama handling such a situation (besides the fact that it would be obvious that the government was trying to manipulate these elections).



@PlayingItStraight - I know it's off-topic, but it's one of my pet peeves re: sample balancing - Even if the demos are the same, that doesn't rule out differences that matter. The key is that some people simply don't answer the phone or participate in polls. Whether they're demographically different or not, we know they're behaviorally different from the people who do participate - namely, they behave differently as regards participating in polls.



I was just wondering how many of the polls are factoring in people who have already voted, and if so, what is their methodology for including these in their predictions? Do they get counted more? Do they not get counted at all?



I'm late to this site and to this topic and I hate to sound impossibly naive but could somebody explain to me:

1) What credible journalistic reason could there be for NOT including the category and the word "undecided" in the reporting of any and all media polls?

2) What credible journalistic reason could there be for "forcing" a choice or "pushing" for a preference or whatever else is done to minimize the number of undecideds?

3) What possible use to anybody would a poll conducted and reported like that be?




I think we basically agree...I think...

The entire point of MOE is that you have a representative random sample of the population in question.

When you balance because your sample is unrepresentative, you no longer have a truly random sample. Admittedly there is some art involved, and balancing appears to work much of the time, but statistically I question its validity, and I believe - but can't prove - that it flattens the distribution.
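The textbook MOE the comment alludes to is defined only under that simple-random-sample assumption; weighting an unrepresentative sample changes the effective error in ways the formula doesn't capture. As a rough sketch of what the reported number actually computes (the 52%/1,000 figures below are hypothetical, not from any poll cited here):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a proportion,
    assuming a simple random sample of size n with no weighting."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical example: 52% support in a sample of 1,000 respondents.
moe = margin_of_error(0.52, 1000)
print(round(moe * 100, 1))  # prints 3.1, i.e. roughly +/- 3.1 points
```

When the sample is weighted to match population demographics, pollsters sometimes report a design-effect-adjusted MOE instead, which is larger than this naive figure.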



Wouldn't you think that anybody who hasn't made up their mind by now is extremely unlikely to vote at all? Voting isn't required. If you are so un-connected to the process that you haven't decided between two vastly different candidates after nearly 2 years of campaigning, chances are you aren't going to the polls.



@PlayingItStraight - yes, I was agreeing...

@Bonzi77 - If you can imagine that so many people are so loyal to one candidate or the other, then why can't you imagine that a small proportion could be enthused about both, or neither?

To further the case, people who have voted in the past (the best predictor of future voting behavior) say that they have not yet made up their minds. Maybe they're leaning but uncommitted. Maybe they're miles away from a final decision. But we know that people who say they're undecided (self-defined) do actually go to the polls and make a decision.



How can you have one ounce of respect for McCain/Palin after this ugly and disgusting smear campaign?!

By the way, how is it Palin is so comfortable in this negative trash-the-other-guy campaign?

Maybe because she's been doing this her whole professional life?! Maybe she's like Henry the 8th, with his "off with their heads" methods?

Want to know more about Sarah Palin from a non-political source? Visit Wikipedia and look up Sarah Palin. Shocking! Then look up Barack Obama. Stunning difference.

This is my first year NOT voting Republican.

