In Defense of Polling on Public Policy

The team of writers behind "The Daily Show," led by Jon Stewart, released America (The Book): A Citizen's Guide to Democracy Inaction in 2004. On page 112 (for those of you following along at home), Stewart and company lampoon the traditional roles found in an American campaign. The pollster is not spared.

"Pollster: No one will meet him. He does not exist because you, as a candidate, pay no attention to polls. You do not do anything until you talk to him first."

Well, not anymore.

The Monday morning column in National Journal from Pollster.com editor Mark Blumenthal cataloged the remarkable number of references to polls made by those on both sides of the aisle in last week's Health Care Summit at Blair House. The conventional wisdom that politicians are supposed to pretend they don't care about polls has been turned on its head.

I, for one, am glad. In a post a year ago during the forum on Stan Greenberg's Dispatches from the War Room, I applauded his defense of polling as an important part of modern democratic government, and suggested that pollsters can serve as a "reality check" against policymakers who remain hopelessly out of touch with the experience of real Americans.

Polling is an often-maligned discipline, seen by some as a crutch that unprincipled politicians lean on to become successful political chameleons. And in fairness, I don't doubt that some politicians look to polls to "tell them what to believe." For decades, politicians have often claimed to ignore polls and govern from their principles.

Which makes it somewhat refreshing, as a pollster, to see polling serve as an integral part of a major public policy debate.

The challenges of public policy polling are numerous. Field a survey on a topic the public knows little about, and the wrong type of question can yield a worthless answer. Almost any framing of a policy will strike one or both sides of the political spectrum as objectionable. (For a great example, see the Pollster.com rundown of a somewhat recent duel over card check polling.) Polls don't happen for free, either, and the source of a poll's funding can certainly affect the study's credibility.

But a well-executed survey that tests basic beliefs and attitudes can tell an important story to elected officials and policymakers. It can highlight fears and concerns that might not otherwise be heard absent a wave of letters and phone calls to congressional offices. It can help identify clear, simple ways to engage the public in policy discussions.

A great example comes from a missed opportunity on my own side of the aisle. In 2006 and 2007, Republicans were often seen touting economic growth, proclaiming that the Bush tax cuts had improved the economy. Yet their approval ratings weren't budging. No amount of messaging about a rosy economic outlook was convincing Americans, and folks inside Washington couldn't figure out why. When pollsters and focus group experts took to the field to unravel the mystery, the answer was simple.

From the linked WSJ piece: "The reality, of course, is that the investment tax cuts did help create seven million jobs and did steer the economy out of recession. That doesn't matter to these 'stressed out' voters, as Mr. Thau calls them."

The reality was that cost-of-living increases were putting real pressure on most Americans, so the creation of millions of jobs made little difference in the day-to-day experience of many. Maybe folks weren't calling their Senators en masse, but the anxiety was out there. Research provided a window into the concerns of Americans.

Republicans finally came around with a somewhat successful moment in the summer of 2008 focused on lowering gas prices, but by then it was too little, too late.

All of which is to say that yes, there is an important place for quality survey research in a public policy debate. The "inside the beltway" distortion field is difficult to escape even for the most earnest policymakers. So long as polling is used appropriately, it can provide helpful clarity and direction to those whose decisions have a major impact on the lives of Americans.




I appreciate your positive outlook on our profession's importance, but I think the caveat in your last sentence might be more important than you make it seem. As Mark's compilation of the polling references during the Blair House meeting demonstrates, the numbers bandied about on health care do not provide a clear, consistent impression of the attitudes of the American public. (If they did, we wouldn't see both sides pointing to them as definitively supporting their positions.)

And I don't think this is just a case of "hired gun" polling run amok. Even earnest attempts to accurately gauge opinion on health care have shown widely divergent levels of support. There are incredibly complicated bills in both the House and Senate with an even greater set of interpretations and implications--both honest and spun--floating around. And despite the year-long media frenzy, the vast majority of the public's understanding of the current proposals remains limited. Furthermore, as you know, subtle differences in question wording and order or sampling and weighting techniques can have a major impact even on deceptively simple topline support/oppose numbers.

Of course, we can't blame the public for this lack of understanding when the White House and Hill leaders don't appear to be entirely sure what the final health care bill will end up looking like. (Although Obama's "closing argument" might yet help pin things down.) I'm curious: is there a single poll you'd point to as an indicator of "true" public support?

As policy--like many fields today--becomes more technical and audiences become more fragmented in their news exposure, it will continue to be challenging to accurately gauge public opinion on the issues.

Tellingly, the example from Rich Thau that you cite was a finding from a focus group, not a survey. Perhaps the better conclusion is that on confused, emotionally charged issues not neatly captured in a telephone survey question, our best way to inform the debate is through qualitative or exploratory research that doesn't claim to do too much.

I'm not saying that issue polling isn't useful or important, just that we shouldn't overstate our case--especially when we pollsters aren't the only ones in Washington having trouble figuring out what's going on with health care. Polling can provide clarity, to be sure, but I don't think we've got a great story in this case.

