
Rasmussen Profile in Washington Post

Topics: AAPOR Transparency Initiative, Automated polls, Disclosure, IVR Polls, Jason Horowitz, Rasmussen, Scott Rasmussen, Washington Post

Today's Washington Post Style Section features a lengthy Jason Horowitz profile of Scott Rasmussen, the pollster whose automated surveys have "become a driving force in American politics." Horowitz visited Rasmussen's New Jersey office -- he leads with the "fun fact" that Rasmussen "works above a paranormal bookstore crowded with Ouija boards and psychics on the Jersey Shore" -- and talked to a wide array of pollsters about Rasmussen including Scott Keeter, Jay Leve, Doug Rivers, Mark Penn, Ed Goeas and yours truly. It's today's must read for polling junkies.

It's also apparent from the piece that Rasmussen won't be joining AAPOR's Transparency Initiative any time soon:

Rasmussen said he didn't take the criticism personally, but he grew visibly annoyed when asked why he didn't make his data -- especially the percentage of people who responded to his firm's calls -- more transparent.

"If I really believed for a moment that if we played by the rules of AAPOR or somebody else they would embrace us as part of the club, we would probably do that," he said, his voice taking on an edge. "But, number one, we don't care about being part of the club."

With due respect, AAPOR's goal in promoting transparency is not to get anyone to join a club (and yes, interests disclosed, I'm an AAPOR member), or even to enforce certain methodological "rules." It's about whether your work can "stand by the light of day," as ABC's Gary Langer put it recently.

And speaking of methodological rules, I want to add a little context to Horowitz's quote from me:

"The firm manages to violate nearly everything I was taught what a good survey should do," said Mark Blumenthal, a pollster at the National Journal and a founder of Pollster.com. He put Rasmussen in the category of pollsters whose aim, first and foremost, is "to get their results talked about on cable news."

The quotation is consistent with an argument I made last summer in a piece titled "Can I Trust This Poll," which explained how pollsters like Rasmussen are challenging the rules I was taught:

A new breed of pollsters has come to the fore, however, that routinely breaks some or all of these rules. None exemplifies the trend better than Scott Rasmussen and the surveys he publishes at RasmussenReports.com. Now I want to be clear: I single out Rasmussen Reports here not to condemn their methods but to make a point about the current state of "best practices" of the polling profession, especially as perceived by those who follow and depend on survey data.


If you had described Rasmussen's methods to me at the dawn of my career, I probably would have dismissed them the way my friend Michael Traugott, a University of Michigan professor and former AAPOR president, did nine years ago. "Until there is more information about their methods and a longer track record to evaluate their results," he wrote, "we shouldn't confuse the work they do with scientific surveys, and it shouldn't be called polling."

But that was then.

In the piece, I go on to review the findings of Traugott and AAPOR's report on primary polling in 2008, as well as Nate Silver's work in 2008, both of which found automated polling to be at least as accurate as more conventional surveys in predicting the outcome in 2008.

The spirit of "that was then" is also evident in quotations at the end of the Horowitz profile that remind us that automated polling depends on people's willingness to answer landline telephones and is barred by federal law from calling respondents on their cell phones:

"When you were growing up, you screamed, 'I got it, I got it,' and raced your sister to the telephone," said Jay Leve, who runs SurveyUSA, a Rasmussen competitor that uses similar automated technology. "Today, nobody wants to get the phone."

Leve thinks telephone polling, and the whole concept of "barging in" on a voter, is kaput. Instead, polls will soon appear in small windows on computer or television screens, and respondents will reply at their leisure. Doug Rivers, the U.S. chief executive of YouGov, a U.K.-based online polling company that is building a vast panel of online survey takers, found debating the merits of Rasmussen's method "a little odd given we're in 2010."

Again, I'm doing the full profile little justice -- please go read it all.



The obsession with Rasmussen is getting tiring. We get it. Liberals don't like Rasmussen.

I have been trying to find any evidence whatsoever that belonging to AAPOR affects the accuracy of a pollster. I have found none. The raw data actually points to just the opposite.



HG: I'm not sure what "raw data" you are referring to. If you are only talking about the accuracy of immediately pre-election polls, then clearly the trend is for AAPOR organizations to do better than average (raw scores in Silver's ratings). If you look at other polls and ask who is most likely to be within the expected range of group trend lines, then again the AAPOR organizations do better.

Rasmussen is interesting because they clearly make reasonable predictions in the former situation, but do not do so well in the latter.




It seems that a key to examining differences between pollsters is to be able to tease out the adults/registered voters/likely voters side of the story. I think it would be nice if the Chart system on Pollster.com allowed for a population filter. Hope you consider that in the near future.



No, if you average the raw scores out for those that belong to AAPOR, those that Silver ranked above Rasmussen are at -.93. Rasmussen is at -.32. Using raw scores alone, Rasmussen would rise to 4th place in the rankings from 15th where Silver now has them.

Rasmussen's raw score at -.32 is better than the average of all pollsters who belong to AAPOR.

So, based on Silver's numbers you can't draw any conclusion at all from membership in AAPOR. They aren't any better than Rasmussen unless you apply Silver's adjustment factor that says that they are somehow better.

The point is that using both raw scores and even Silver's controversial rating system, Rasmussen ranks as one of the best pollsters out there and yet liberals are obsessed with talking about their bias. There is no evidence of any bias in their polls.



HG: I see the mistake you are making now. For the raw scores, the more negative the number, the better. For example, having a value of -0.93 indicates you have 0.93 points LESS error than the average pollster, while a value of -0.32 indicates that you are 0.32 points better than average. A value of 0 is average, and positive scores indicate MORE error than average.
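The sign convention can be sketched in a few lines of code. This is only an illustration of the interpretation described above, using the two values quoted in this thread (not Silver's actual data or methodology):

```python
# Sketch of the sign convention for the "raw scores" discussed above:
# a score is error relative to the average pollster, so MORE NEGATIVE
# means LESS error, i.e. better. Values are the ones quoted in this thread.
scores = {
    "AAPOR members ranked above Rasmussen (avg)": -0.93,
    "Rasmussen": -0.32,
}

# Sorting ascending puts the best (most negative) score first.
ranked = sorted(scores.items(), key=lambda kv: kv[1])

for name, score in ranked:
    relation = "less" if score < 0 else "more"
    print(f"{name}: {score:+.2f} "
          f"({abs(score):.2f} points {relation} error than average)")
```

Read this way, -0.93 beats -0.32, which is the opposite of the ordering the previous comment assumed.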


Mark Sanford:

Wow, another day, another post about Scott Rasmussen. Who could have predicted that? I will pay a little more attention to criticisms of Rasmussen when the people who run this site and those at the Washington Post take time to criticize media polls that use a sample with a double-digit partisan advantage for the Democrats or polls that show Obama with a 52% or 54% approval rating when every other poll shows him at 46-48% (with Gallup, Pew, Rasmussen and Quinnipiac all showing him lower than that now).

It has become routine to see eight to ten polls showing Obama with an approval rating in the 46-48% range, only to have a Washington Post or ABC News poll come out with a 54% approval rating, or some similar number that doesn't even remotely pass the smell test, unless of course you believe a poll that samples 13% more Democrats is somehow fair. Companies that produce polls with such ridiculously lopsided partisan samples are the ones that should be accused of producing polls to sway public opinion rather than gauge it, not Rasmussen.



The last point is interesting, in that communication is changing and telephone polling may become more and more difficult.

There is a huge generation gap when it comes to communication technology and that's something that pollsters haven't gotten their heads around.

How are you going to poll people when their primary method of communication is text messaging or Facebook?



Why do right-wingers think if they lie about numbers it will buttress their case? Here's what is relevant: Rasmussen refuses to be transparent, does robocalls, and his approval ratings are 14-15 points different from EVERY OTHER POLL!!! Get it...remember Sesame Street? One of these things is not like the other? I believe that Rasmussen faked the Coakley poll and that started the Fox News narrative that energized the teabaggers in a low-turnout special election with an admittedly awful Dem candidate. As Nate Silver points out, other than Brown, Rasmussen is notorious for not polling within 10-14 days of the primary, so they have plausible deniability. I would hope that, just as Markos M did the right and honorable thing by dropping Research 2000 for dubious data, the same would happen to Rasmussen by the right-wing news media, but that will not happen.



I do not think lying is the problem for anyone. You illustrate it yourself by dumping on Rasmussen for "robocalls," when the great Nate Silver found their results to be superior to personal questioning when compared for recent elections.

Am I remembering this correctly, in your opinion? If not, am I lying or just mistaken?

If you are incorrect, are you lying or just mistaken?

In any case, how does vitriol move the ball here? It just obliterates what you have to say and replaces it with an impression of you, just like it does when you speak to someone in person.

