David Moore | August 28, 2008
According to Pew's Andrew Kohut, the American electorate is suffering from "Obama fatigue." A close examination of the polling data suggests this conclusion is more a personal opinion than one the data actually support.
Kohut came to his conclusion after first noting that the latest Pew polls showed what he characterized as a "tightening race."
There are a couple of problems of data interpretation. First is the assertion of what Kohut calls a "tightening race." Pew conducted three polls - one each in June, July and August - and in those polls found Obama's lead going from eight points in June (48 percent to 40 percent), to five points in July (47 percent to 42 percent) and to just three points in early August (46 percent to 43 percent). Thus, overall, Obama's support dropped two percentage points over the summer, while McCain's increased by three. That such minor differences in the polls should be treated as a definitive trend is stunning. Even with larger-than-average sample sizes, those differences in the polls are within the polls' margins of error. In other words, even according to these polls, it's quite possible that there was no decline in Obama's lead, and perhaps even an increase. We just can't know for sure (using the 95 percent confidence level).
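The margin-of-error point can be made concrete. Below is a minimal sketch of the standard 95 percent margin of error on the lead itself (the difference between two proportions measured in the same sample). The sample size of 2,500 is a hypothetical stand-in, since the actual n's of the Pew polls are not given here:

```python
import math

def lead_moe(p1, p2, n, z=1.96):
    """95% margin of error for the lead (p1 - p2) between two
    candidates measured in the same multinomial sample."""
    se = math.sqrt((p1 + p2 - (p1 - p2) ** 2) / n)
    return z * se

# Hypothetical sample size -- a stand-in, not Pew's actual n.
n = 2500
for month, p_obama, p_mccain in [("June", 0.48, 0.40),
                                 ("July", 0.47, 0.42),
                                 ("August", 0.46, 0.43)]:
    lead = p_obama - p_mccain
    moe = lead_moe(p_obama, p_mccain, n)
    print(f"{month}: lead {lead * 100:+.0f} pts, "
          f"margin of error +/-{moe * 100:.1f} pts")
```

Under this assumption, each month's lead carries a margin of error of roughly four points, so the June interval (about 4 to 12 points) and the August interval (about -1 to 7 points) overlap substantially, which is exactly the point: the apparent narrowing could be nothing more than sampling noise.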
There are many other polls besides Pew that are measuring the candidates' support, but only one major media organization has conducted polls on a daily basis over this same time period.
On June 10,
A second problem of data interpretation is the almost indecipherable meaning of the question used to suggest Obama fatigue. The poll question Kohut cited asked whether people felt they had been hearing "too much, too little, or the right amount" about each of the campaigns. Forty-eight percent said too much about Obama's campaign, 26 percent said too much about McCain's. To be sure, that's a major gap, but what does it mean? If it means people are unhappy with hearing about Obama, and that unhappiness is related to their "declining" support for him, how could Pew have found Obama's support dropping by only two percentage points, given the 22-point gap in the "fatigue" question? If that sentiment truly affected voters' support of Obama, one would expect a much greater drop.
More important, we know that the crucial question to explain change in support is whether the explanatory variable also shows change over the same time period. Did people become more dissatisfied from June to August with media coverage of Obama's campaign and, if so, did that increased dissatisfaction in turn cause their support to "wither"? As it turns out, Pew didn't ask that question back in June, so we don't know. Thus, statistically, we can't link dissatisfaction in the August poll with the change in support from June to August. The assertion of "Obama fatigue" is not a statistical conclusion, but an intuitive one.
An alternative intuitive explanation of what this question measured is that many voters may well be tired of a presidential campaign that goes on for 18 months or more - in other words, not "Obama fatigue" so much as "campaign fatigue." Dissatisfaction may have appeared to be more focused on Obama in this particular poll because the question was asked at a time when there was more media coverage of Obama, owing to his overseas trip. Had the question been asked at a different time, or had the pollsters tried to probe beneath the surface of this superficial question, we might have obtained a better insight into what the public was thinking.
Instead, we are treated to the fiction of "Obama fatigue" as a cause of a "tightening race" - a spurious explanation of a non-event.
(A slightly different version of this critique was posted at HuffingtonPost.)