

Poll of Pollsters: Rating the IA Polls

Topics: 2008, Iowa, Pollsters, The 2008 Race

The main impetus for our "Poll of Pollsters" (from which I posted the first results on Thursday) is the desire to find better objective criteria to help sort out pollsters. As both Charles and I review today in complementary posts (here and here), the handful of past Iowa polls do not lend themselves to easy assessments of accuracy. The lack of transparency by many public pollsters makes it difficult to fairly assess differences in their methods, although I would propose the degree of their disclosure as a surrogate measure of quality. A third possibility is to measure pollster reputation, especially among their peers. We decided to start with a survey of pollsters about the public polls now in the spotlight in Iowa and New Hampshire.

I included all the details in the previous post, but here are the highlights: We sent out invitations to just over a hundred pollsters and had 46 complete the entire survey, although a few more (49) completed the questions about the reliability of the polls in Iowa. Of those, 22 are media pollsters and 27 campaign pollsters (16 Democrats and 11 Republicans). There is no margin of error because the results represent nothing more or less than the views of the pollsters that participated. As with any survey respondents, we promised to keep their identities confidential.

We started with a simple question asked about each of the 16 pollsters that have released public polls in Iowa: "How reliable do you consider surveys of IOWA CAUCUS goers done by each of the following organizations, very reliable, somewhat reliable, not very reliable or not reliable at all?" We also provided an option to say they "do not know enough to rate" each organization.

We left "reliability" in the eye of the beholder, but it is fair to assume that few are in a position to evaluate the performance of each organization in past Iowa caucuses. As one pollster put it (in a space provided for comments at the end of the survey), there is "no way we can know who's most reliable until we can compare their final estimates with actual vote." Instead, it is safe to assume that most based their judgments on the reputation of each organization and its methods. As you will see, the pollsters had little trouble making such judgments.

As the following table shows, the Des Moines Register "Iowa Poll" conducted by Selzer and Company easily earns the highest marks, with virtually all rating it either very (36%) or somewhat (50%) reliable. The other pollsters with the highest scores are nationally known media surveys: ABC/Washington Post, the Pew Research Center and CBS/New York Times.

[Image: 12-28 PP All Pollsters.png, reliability ratings among all pollsters]

The pollsters receiving the lowest scores are Zogby International, the American Research Group and Rasmussen Reports. In the case of Zogby, four out of five pollsters rated their surveys as not very (28%) or not at all reliable (52%).

Not surprisingly, the media pollsters are generally more positive than their campaign consultant colleagues (keep in mind that we invited principals at all 13 organizations that are polling in Iowa to participate). Their rankings are generally similar, although the University of Iowa and Strategic Vision rank higher among the campaign than the media pollsters.

[Image: 12-28 PP camp v media.png, ratings by campaign vs. media pollsters]

We also asked our pollster-respondents to select from the same list the "pollsters you consider MOST and LEAST reliable in Iowa." As the table below shows, the Des Moines Register/Selzer survey easily stands out as the favorite, especially among the campaign pollsters:

[Image: 12-28 PP most.png, pollsters rated most reliable in Iowa]

As for the least reliable pollster in Iowa, one choice easily led the pack. One third (33%) of the campaign pollsters and just less than half (45%) of the media pollsters picked Zogby International:

[Image: 12-28 PP least.png, pollsters rated least reliable in Iowa]

Again, reputation cannot tell us everything we need to know about the quality of the numbers a pollster produces. Pollsters with poor reputations may conduct quality polls and even the best pollsters are fallible. However, when a pollster earns the respect of their colleagues, it should tell us something.

Up next, similar ratings of the pollsters in New Hampshire. Meanwhile, Pollster.com readers, what do these ratings tell you?

Note: I accidentally omitted InsiderAdvantage which released their Iowa poll as we were in the process of drafting this survey. I apologize for the oversight.


Mark Lindeman:

I wonder whether these results correlate with anything in the survey results. I agree that the judgments are at least primarily based on reputation, but it would be interesting to know whether the most highly regarded pollsters are getting different answers.

Just at first glance, Zogby seemed pretty much in line with the averages. My impression is that Zogby has damaged its reputation in lots of ways that have little to do with its capacity to field a decent pre-election poll. But I have to admit, what I remember from 2004 is not what Zogby's pre-election polls did or didn't show, but the way Zogby came up with a late-afternoon Election Day "call" that gave Kerry a comfortable EV win. Zogby's last Ohio poll apparently gave Bush a 6-point lead, yet on Election Day he confidently assigned the state to Kerry. Amazing. (Not that the poll numbers look all that great either.)



Funny. Everybody hates Zogby, but his final poll in the 2004 Iowa caucuses was nearly identical to the Des Moines Register/Selzer poll, which everyone loves.

Zogby, 1/16-18/2004
n = 502 LVs

Dean 22%
Edwards 21%
Gephardt 18%
Kerry 25%

Des Moines Register/Selzer
1/14-1/16/2004, n = 606 LVs

Dean 20%
Edwards 23%
Gephardt 18%
Kerry 26%

In addition to those two, Democracy Corps, Pew, Research 2000, the LA Times, the Quad-City Times (not sure if those were Research 2000 polls too or not) and Survey USA all polled the caucus race during the 2004 season, and they all generally agreed with one another fairly well except for Survey USA, which was completely out to lunch. The Edwards and Gephardt campaigns also made polls public that were well off the mean in some respect. But there was really nothing particularly wrong with Zogby last time.



You did not leave InsiderAdvantage out by oversight. You hate them and want to damage their corporate reputation. Mark, figure out who owns this company. They are on to you and they have the resources to put you in real trouble. Rumor down here is that you were factually incorrect in the story you wrote on them, and I happen to know that they are waiting (with a sign-off from the newspaper chain that owns them) to demand a retraction from you.


Bill Mitchell:

You have GOT to be kidding me!

Polls by uber-left-wing MSM with HUGE agendas more reliable than polls done by independent pollsters with reputations to protect?

What utter unmitigated BS.

EVERYONE considers Rasmussen to be dead-on accurate yet you rate him close to a zero here.

Idiotic article. Your liberal bias is showing.



Interesting that Zogby still rates so low. This election cycle, the blog Political Arithmetik shows that his numbers have stayed pretty true to aggregate polling data. The scatter is much greater for ARG, Research 2000 and Strategic Vision.

His data seems to be quite good, but his interpretation, well... that's another story.



The reason people in the business don't like Zogby is simple: he cooks his numbers. The perfect example is his 2004 New Hampshire numbers. Every poll showed Kerry pulling away in the final days, but Zogby had Dean and Kerry neck and neck. Then, all of a sudden, on the final day of polling, Zogby had Kerry with an incredible night, one that moved a rolling three-day average a full 6 points (I'm not sure what math is needed to achieve such a massive move). In the end, just as in Iowa, Zogby was right in line with the others. My conclusion: his polling is flawed, so he looks around at the end and pads the numbers so he can claim accuracy. That is what your findings tell me: people know that Zogby isn't a trustworthy source, that he plays games. Here is the link for 2004; it looks way too convenient to me.
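The arithmetic behind the commenter's skepticism is easy to check: in an equal-weight three-day rolling average, replacing the oldest day with a new one moves the average by one third of the difference between those two days, so a 6-point jump in the average requires the newest day's result to beat the dropped day's by 18 points. A minimal sketch with hypothetical daily numbers (illustration only, not Zogby's actual daily data):

```python
# Equal-weight rolling average: dropping one day and adding another moves the
# average by (new_day - dropped_day) / window. A 6-point jump in a 3-day
# average therefore needs an 18-point gap between the new and dropped days.

def rolling_average_shift(dropped_day: float, new_day: float, window: int = 3) -> float:
    """Change in an equal-weight rolling average when one day is replaced."""
    return (new_day - dropped_day) / window

# Hypothetical candidate numbers: dropped day at 20%, new day at 38%.
shift = rolling_average_shift(dropped_day=20.0, new_day=38.0)
print(shift)  # 6.0
```

With equal daily sample sizes, that is an implausibly large single-night swing, which is the commenter's point; unequal daily samples or reweighting would change the exact arithmetic.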


