January 20, 2008 - January 26, 2008
Some thoughts about tonight's South Carolina primary, including some follow-up on yesterday's post on the variation in pre-election poll results:
1) Reverse Bradley/Wilder? Noam Scheiber sees evidence of a potential "reverse Bradley/Wilder effect" in South Carolina. His theory is that live-interviewer surveys may be understating Barack Obama's support among African-Americans. Scheiber's post is worth reading in full, but here is the gist:
If Obama consistently did better among black voters in automated polls, which eliminate the "social discomfort" that might discourage them from telling (presumably white) interviewers they support him, we'd have evidence for this hypothesis.
So what do the polls say? They say I might be onto something:
In the three most recent automated polls in South Carolina (PPP, SurveyUSA, and Rasmussen), Obama takes 67, 73, and 68 percent of the black vote, while Hillary takes 13, 18, and 16. In the three most recent live-interviewer polls (Zogby, Mason-Dixon, and ARG), Obama takes 55, 59, and 61 percent of black voters, while Hillary takes 18, 25, and 25.
So, among black voters, that's an average lead of 69-16 for Obama in automated polls, but only 58-23 in live-interviewer polls--a huge difference (53-point lead in the former; 35-point lead in the latter). It's not exactly definitive--I'm only using three data points in each case, and there are other methodological differences between the polls--but it does strongly suggest that some black voters are reluctant to tell human pollsters they support Obama, but feel comfortable saying it to a machine.
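Scheiber's averages can be reproduced directly from the six polls he cites. A minimal sketch, following his convention of rounding each average before taking the difference:

```python
# Reproduce the black-voter averages from the six polls cited above,
# rounding each average before taking the difference, as the post does.
def avg(vals):
    return round(sum(vals) / len(vals))

obama_ivr, clinton_ivr = avg([67, 73, 68]), avg([13, 18, 16])    # PPP, SurveyUSA, Rasmussen
obama_live, clinton_live = avg([55, 59, 61]), avg([18, 25, 25])  # Zogby, Mason-Dixon, ARG

print(obama_ivr, clinton_ivr, obama_ivr - clinton_ivr)      # 69 16 53
print(obama_live, clinton_live, obama_live - clinton_live)  # 58 23 35
```

As Scheiber concedes, three data points per mode is thin evidence, but the arithmetic of the 53-point versus 35-point leads checks out.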
2) Lying to Robots? Mickey Kaus floats a whole new theory, that voters are just as likely to lie to "robots," perhaps even more so:
I used to think talking to a robotic phone answerer was pretty close to a "secret ballot"--what was the robot going to do to me, anyway? But machines do a whole lot these days--they track your musical tastes, follow your movements, raise or lower your credit ratings. Now a robot can conceivably do a lot to me, at least in the paranoid part of my imagination activated when I get an unsolicited call. At best, it's probably generating a list to sell someone! I don't want it to know my real innermost thoughts, including my political thoughts, especially my un-PC political thoughts. These days, I'd be much more paranoid about pushing a button that says "I'm voting against beloved minority candidate X" than telling a live operator the same thing. Sorry, Rasmussen! The traditional truth-revealing advantage of robo-calling may be the artifact of a transitional era in info-technology.
This is an interesting theory that, at least for the moment, lacks supporting evidence. Survey methodologists have been studying "interviewer effects" for decades, and have found consistent evidence that "self-administered" surveys (those that use paper or a computer rather than an interviewer) produce more reports of "sensitive" behaviors (sexual activity, drinking, drug use). If the growing presence of computers in our lives has made respondents less truthful when responding to self-administered surveys, no one has proven it so far.
3) Will The SC Exit Poll Resolve These Questions? Noam Scheiber and my colleague Charles Franklin (and many others) will be looking at vote-by-race tabulations in the South Carolina exit poll. But readers of both wonder if exit polls are susceptible to the same effects as telephone polls. After quoting Franklin, Kaus writes:
Of course, people can lie to exit pollsters too! If you're a black South Carolinian and want to help Hillary as much as you can, you'll walk into the booth, vote for her, then walk out and tell the exit poll person you voted for Obama. There may also be non-Machiavellian peer pressure in black precincts to tell the exit pollsters the same thing (which, perversely, might hurt Obama in tomorrow night's press spin by making it look as if he received an ethnic bloc vote). In white areas similar pressure might encourage voters to falsely tell exit pollsters they voted for Edwards or Clinton.
One of Scheiber's commenters reaches a similar conclusion:
But aren't the exit polls all done by human beings, not machines? How will you know how African-Americans really voted if they tell you on the way out that they voted for Clinton?
The problem with both arguments is that voters don't "tell" exit pollsters anything. Interviewers hand respondents a paper form, which they fill out privately and drop into a "ballot box."
On the other hand, the characteristics of exit poll interviewers (race, gender and age) may have some influence on whether voters agree to participate in the survey. Historically, exit pollsters have depended mostly on younger interviewers and have, as a result, had the hardest time gaining cooperation from older voters. While exit pollsters attempt to correct for such "non-response bias" by weighting, some distortions may remain.
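The weighting correction works roughly like post-stratification: each group's responses are reweighted from its share of completed interviews to its (assumed) share of the electorate. A minimal sketch with entirely hypothetical numbers:

```python
# Toy post-stratification: a hypothetical exit-poll sample that
# over-represents younger voters, reweighted to assumed electorate shares.
sample_share = {"18-44": 0.50, "45-64": 0.35, "65+": 0.15}  # completed interviews
true_share = {"18-44": 0.40, "45-64": 0.38, "65+": 0.22}    # assumed electorate

weights = {g: true_share[g] / sample_share[g] for g in sample_share}

# Hypothetical support for one candidate within each age group:
support = {"18-44": 0.55, "45-64": 0.45, "65+": 0.35}

raw = sum(sample_share[g] * support[g] for g in support)
weighted = sum(true_share[g] * support[g] for g in support)
print(round(raw, 3), round(weighted, 3))  # 0.485 0.468
```

When the under-sampled groups differ in candidate preference, as here, weighting shifts the estimate; it cannot, however, fix bias within a group, which is why some distortion can remain.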
So on the question of racial polarization in today's vote, it would be helpful to attempt to verify the exit poll findings with actual results in heavily African-American precincts.
4) More on Race and the Exit Poll - Speaking of validating the exit polls with hard numbers, NBC's Chuck Todd shares a helpful email he received from Mason-Dixon's Brad Coker after Todd noted that the 2004 exit poll estimated the South Carolina Democratic primary electorate as 47% black, compared to 55% on the most recent Mason-Dixon poll. Coker is skeptical of the exit polls as an "overall demographic indicator." He writes:
To get a real handle on what the African-American vote is likely to be, one only needs to look at real numbers. The South Carolina Secretary of State's office published the following statistics on South Carolina's 2004 and 2006 state Democratic primary elections. These are based on real voters, not a survey sampling.
According to the state's statistics, the '04 Dem primary for president drew an electorate that was 58% non-white and 42% white; in the '06 Dem primary for governor, the ratio was 60-40 in favor of black voters.
These hard numbers show a much higher percentage of African-American voters in South Carolina's state primary races for Governor and U.S. Senate, so I don't think it is a stretch to expect a similar turnout in a presidential primary that features a major African-American contender. If anything, 55% black might actually end up being a bit on the low side. I will be very surprised if a clear majority of today's Democratic primary voters are not African-American.
5) "Fraudstorm Advisory" - Our friend Mark Lindeman points out in a DailyKos diary that South Carolina "votes on ES&S iVotronics, paperless Direct Recording Electronic (DRE) voting machines" that are "exquisitely, ridiculously vulnerable to attack." As such, he has some helpful information on the shortcomings of those machines and advice on how to interpret the results in light of the inevitable speculation of vote fraud. Most relevant to our topic is his discussion of exit polls:
The exit poll "results" are not what they seem. If form holds, very shortly after the polls close at 7 PM, several networks will post preliminary tabulations based on exit poll estimates. Even before that, once the quarantine is broken, rumors may fly about what the exit polls show. Please be advised: even if you unaccountably believe that exit poll interviews are practically foolproof, these tabulations (or rumors) will not tell you the interview results! The early projections will be based on a combination of interview data and prior expectations. Given the variability in the pre-election polls, who knows what "prior expectations" will be?
Also be advised that in the 2004 general election, the estimated margin of error for the South Carolina exit poll -- assuming that the poll was otherwise unbiased -- was about 8 points on the margin between Kerry and Bush. (This margin of error cannot be figured in advance, and it can't be figured based on the number of respondents alone. It depends on the variability across the precincts in the exit poll sample.)
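Lindeman's caveat reflects what samplers call the design effect: clustering interviews in a limited number of precincts inflates the margin of error beyond the simple-random-sample formula. A sketch with assumed values (the respondent count and design effect below are illustrative, not the actual 2004 figures):

```python
import math

def clustered_moe(p, n, deff):
    """95% margin of error on a single candidate's share, inflated by a
    design effect (deff = 1 for a simple random sample; precinct-clustered
    samples typically run well above 1)."""
    return 1.96 * math.sqrt(deff * p * (1 - p) / n)

# Assumed: 1,400 respondents, an even split, and a design effect of 2.4.
srs = clustered_moe(0.5, 1400, 1.0)
clustered = clustered_moe(0.5, 1400, 2.4)

# The margin *between* two candidates moves twice as fast as one share.
print(round(2 * srs * 100, 1))        # 5.2 points on the margin, ignoring clustering
print(round(2 * clustered * 100, 1))  # 8.1 points -- roughly Lindeman's figure
```

The design effect cannot be known in advance because it depends on how much the vote varies across the sampled precincts, which is exactly Lindeman's point.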
For those interested, the exit poll tabulations should be available just after the polls close at the following links on MSNBC, CNN and CBS.
The last in a series of Reuters/C-SPAN/Zogby tracking surveys in South Carolina (conducted 1/23 through 1/25) finds:
- 816 Likely Democratic Primary Voters (± 3.4%)
4 Someone else
10 Not sure
- And their poll among 814 likely Republican primary voters in Florida (± 3.4%) finds:
2 Someone else
9 Not sure
A new American Research Group South Carolina survey (conducted 1/23 through 1/24) finds:
- 600 Likely Democratic Primary Voters (± 4%)
A new American Research Group Florida survey (conducted 1/23 through 1/24) finds:
- 600 Likely Republican Primary Voters (± 4%)
- 600 Likely Democratic Primary Voters (± 4%)
A new SurveyUSA automated survey of South Carolina (conducted 1/23 through 1/24) finds:
- 606 Likely Democratic Primary Voters (± 4.1%)
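The ± figures attached to the samples above follow, to within pollsters' varying rounding conventions, from the standard worst-case 95% margin-of-error formula:

```python
import math

def moe(n):
    """Worst-case (p = 0.5) 95% margin of error, in points, for n respondents."""
    return 100 * 1.96 * math.sqrt(0.25 / n)

print(round(moe(816), 1))  # 3.4 -- matches the Zogby Democratic sample
print(round(moe(600), 1))  # 4.0 -- matches the ARG samples
```

Some firms round differently or use a slightly different multiplier (SurveyUSA reports ± 4.1% for 606 respondents), so small discrepancies from this formula are normal.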
Gallup introduces a daily national tracking survey (in case you didn't follow the links on Eric's earlier post).
SurveyUSA blogs the final South Carolina polls.
Jay Cost delves into the SurveyUSA crosstabs to ponder the state of the GOP race.
Jon Cohen considers the racial divide in the Democratic contest and mines past exit polls for the demographic profiles of upcoming primaries.
Frank Newport looks at the class gap in support for Obama, Mike Huckabee's challenges and the perceived impact of coming tax rebates.
David Hill takes issue with the way pundits define the Republican base.
Kathy Frankovic looks at perceptions of how the media has treated Clinton and Obama.
Gary Langer warns that stock market drops seldom impact consumer confidence.
Chris Cillizza talks to campaign pollsters about whether national polls matter.
The Examiner profiles pollster Andrew Kohut.
Douglas Burns interviews Ann Selzer.
We've had quite a bit of discussion today in the comments section about the wide variation in results from the South Carolina polls. Reader Ciccina noticed some "fascinating" differences in the percentages reported as undecided, differences that lead reader Joshua Bradshaw to ask, "how is it possible to have so widely different poll numbers from the same time period?" There are many important technical reasons for the variation, but they all stem from the same underlying cause: Many South Carolina voters are still uncertain, both about their choices and about whether they will vote (my colleague Charles Franklin has a separate post up this afternoon looking at South Carolina's "endgame" trends).
Take a look at the results of eight different polls released in the last few days. As Ciccina noticed, the biggest differences are in the "undecided" percentage, which varies from 1% to 36%:
1) "Undecided" voters -- Obviously, the differences in the undecided percentage are about much more than the random sampling variation that gives us the so-called "margin of error," but they are surprisingly common. Differences in question wording, context, survey mode and interviewer technique can explain much of the difference. In fact, variations in the undecided percentage are usually the main sources of "house effect" differences among pollsters.
The key issue is that many voters are less than completely certain about how they will vote and will hesitate when confronted by a pollster's trial heat question. How the pollster handles that hesitation determines the percentage that ultimately get recorded as undecided.
At one extreme is the approach taken by the Clemson University Palmetto Poll. First, their trial-heat question, as reproduced in an online report, appears to prompt for "undecided" as one of the four choices. And just before the vote question, they asked another question that probably suggests to respondents that "undecided" is a common response:
Q1. Thinking about the 2008 presidential election, which of the following best describes your thoughts on this contest?
1. You have a good idea about who you will support
2. You are following the news, but have not decided
3. You are not paying much attention to the news about it
4. Don’t know, no answer
So two of the categories prime respondents with the idea that other South Carolina voters either "have not decided" or are "not paying much attention."
Most pollsters take the opposite approach. They try to word their questions, train their interviewers or structure their automated calls in a way to push voters toward expressing a preference. Most pollsters include an explicit follow-up to those who say they are uncertain, asking which way they "lean." The pollsters that typically report the lowest undecided percentages have probably trained their interviewers to push especially hard for an answer. And SurveyUSA, the pollster with the smallest undecided in South Carolina (1%), typically inserts a pause in their automated script, so that respondents have to wait several seconds before hearing they can "press 9 for undecided."
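The impact of pushing hesitant respondents can be illustrated with a toy simulation. The 75% firm-answer rate and 85% push-success rate below are invented for illustration, not estimates from any of these polls:

```python
import random

random.seed(42)  # illustrative; the qualitative result does not depend on the seed

def poll(n, push_rate):
    """Share of n simulated respondents recorded as undecided.
    75% answer firmly on the first ask; a 'lean' follow-up recovers an
    answer from push_rate of the rest. All rates are invented."""
    undecided = 0
    for _ in range(n):
        firm = random.random() < 0.75
        if not firm and random.random() >= push_rate:
            undecided += 1
    return undecided / n

offered = poll(10_000, push_rate=0.0)   # script offers "undecided" up front
pushed = poll(10_000, push_rate=0.85)   # interviewers press for a lean
print(offered, pushed)                  # roughly 0.25 vs. 0.04
```

The same underlying electorate yields wildly different "undecided" percentages depending solely on how hesitation is handled, which is the point of this section.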
But it is probably best to focus on the underlying cause of all this variation: South Carolina voters feel a lot of uncertainty about their choice. Four of the pollsters followed up with a question about whether voters might still change their minds, and 18% to 26% said that they might. So many South Carolina Democrats -- like those in Iowa and New Hampshire before them -- are feeling uncertain about their decision. Thus, as reader Russ points out, "the last 24 hours" may count as much in South Carolina as elsewhere.
2) Interviewer or automated? - A related issue is what pollsters call the survey "mode." Do they conduct interviews with live interviewers or with an automated methodology (usually called "interactive voice response" or IVR) that uses a recording and asks respondents to answer by pressing keys on their touch-tone phones?
Three of the pollsters that released surveys over the last week (SurveyUSA, Rasmussen and PPP) use the IVR method (as does InsiderAdvantage), while the others use live interviewers. One thing to note is that the so-called "Bradley/Wilder effect" (or the "reverse" Bradley/Wilder effect -- via Kaus) assumes that respondents alter or hide their preferences to avoid a sense of "social discomfort" with the interviewer. Without an interviewer, there should be little or no effect.
In this case the difference seems to be mostly about the undecided percentage, which is lower for the IVR surveys. In the most recent surveys, the three IVR pollsters report a smaller undecided percentage (7%) than the live interviewer pollsters (17%). That pattern is typical, although pollsters disagree about the reasons. Some say voters are more willing to cast a "secret ballot" without an interviewer involved, while others argue that those willing to participate in IVR polls tend to be more opinionated.
If the Bradley/Wilder effect is operating, we would expect to see it on surveys that use live interviewers, but in this case, the lack of an interviewer seems to work in Obama's favor. He leads Clinton by an average of 17 points on the IVR polls (44% to 27%, with 19% for Edwards), but by only 9 points on the interviewer surveys (37% to 28%, with 17% for Edwards).
3) What Percentage of Adults? -- Four years ago, the turnout of 573,101 South Carolina Democrats amounted to roughly 20% of the eligible adults in the state.* Turnout tomorrow will likely be higher, but how much higher is anyone's guess. Thus, selecting "likely voters" in South Carolina may not be as challenging as in the Iowa or Nevada caucuses, but it comes close.
For Iowa, I spent several months requesting the information necessary to try to calculate the percentage of adults represented by each pollster. With the exception of SurveyUSA (who tell us their unweighted Democratic likely voter sample amounted to 33% of the adults they interviewed), none of the pollsters have reported incidence data.
So some of the variation in results may come from the tightness of the screen, but we have no way to know for certain.
4) List or RDD? One important related issue is the "sample frame." Three of the South Carolina pollsters (SurveyUSA, ARG and Rasmussen) typically use a random-digit dial (RDD) technique that samples from all landline phones. They have to use screen questions to select likely Democratic primary voters.
At least two (PPP and Clemson) drew samples from lists of registered voters and used the records on the lists to narrow their sampled universe to those they knew had a past history of participating in primaries.
These two methods may also contribute to different results, and pollsters debate the merits of each approach.
5) Demographics? Differences in the likely voter selection methods mean that the South Carolina polls have differences in the kinds of people sampled for each poll. One of the most important characteristics is the percentage of African-Americans. It varies from 42% to 55% among the five pollsters that reported it (I extrapolated an approximate value for Rasmussen from their results by race crosstab).
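Extrapolating a sample's racial composition from a results-by-race crosstab, as I did for Rasmussen, amounts to solving a simple mixture equation. The percentages below are hypothetical, chosen only to show the arithmetic:

```python
# If overall = w * black_pct + (1 - w) * white_pct, the African-American
# share w of the sample falls out of the crosstab. Figures are hypothetical.
def infer_share(overall, black_pct, white_pct):
    return (overall - white_pct) / (black_pct - white_pct)

w = infer_share(overall=43, black_pct=65, white_pct=18)
print(round(w, 2))  # 0.53 -- an implied sample about 53% black
```

The method is only approximate when other racial groups are present or when the published percentages are rounded, which is why I call the Rasmussen figure an extrapolation.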
Another important difference largely hidden from view is the age composition of each sample. Only three pollsters reported an age breakdown. SurveyUSA reports 50% under the age of 50, compared to 43% on the McClatchy/MSNBC/Mason-Dixon survey. PPP had an older sample, with only 23% under the age of 45.
So the bottom line? All of these surveys indicate quite a bit of uncertainty, both about who will vote and about the preferences that their "likely voters" express. Obama appears to have an advantage, but we will not know how large until the votes are counted.
*Kevin: thank you for the edit.
A new Gallup national survey (conducted 1/22 through 1/24) finds:
- 1,045 Republican adults and those who lean Republican (± 3%)
8 No opinion
- 1,201 Democratic adults and those who lean Democratic (± 3%)
7 No opinion
The South Carolina polling continues to show a substantial lead for Obama, while Edwards' rise hints that he could challenge Clinton for second place.
At the moment, Clinton continues to hold a six-point advantage over Edwards, but Edwards has been rising while Clinton has been moving down. Obama, meanwhile, has been fairly steady at around 40-44% support, though with some hint of a small decline in the sensitive estimator. Note however that the Clemson University poll included here had an amazing undecided rate of 36%. That makes every candidate look lower in their poll than in the other polls, which have much smaller undecided rates. The level of undecided is quite sensitive to how the poll is conducted, including whether respondents are pushed as to whether they "lean" towards a candidate. The Clemson poll apparently didn't push undecided voters at all. We'd be making a mistake to read their data as indicating a decline of support for anyone.
A second-place finish for Edwards would, of course, be good news for his campaign, while Clinton would no doubt argue she had conceded the state to explain a third-place finish. But Edwards still has some ground to make up, and late-deciding voters remain an unknown -- if they are unhappy with either Clinton or Obama, Edwards can benefit simply by not being one of them. This may be especially true among independents who vote in the Democratic race, and the expected handful of Republicans who show up. (Republicans and independents can vote in the Democratic primary only if they did NOT vote in last week's Republican primary.)
I think the more compelling story of South Carolina will be the exit poll results. Obama has appealed to white voters in previous primaries and caucuses. The pre-election polls have found him getting as low as 10% of the white vote in South Carolina. The potential for racial polarization in this Southern state could damage his ability to transcend race as a basis of voting. Paradoxically, there has been speculation that Clinton can win the votes of black women, a result that could reduce polarization in the exit poll. We'll know much more about how voters decided by Saturday night.
Cross posted at Political Arithmetik.
A new Public Policy Polling (D) automated survey in South Carolina (conducted 1/24) finds:
- 595 Likely Democratic Primary Voters (± 4%)
Among a subsample of African-Americans (51% of the sample), Obama leads Clinton 67% to 13%.
A new Ebony/Jet South Carolina survey (story, results; conducted 1/19 through 1/22 by Ron Lester & Associates) finds:
- 600 Likely Democratic Primary Voters (± 2.9%)
22 Don't Know/Not Sure
Among a subsample of African-Americans, Obama leads Clinton (53% to 21%).
A new SurveyUSA automated survey in Florida (conducted 1/23 through 1/24) finds:
Another in the series of Reuters/C-SPAN/Zogby tracking surveys of South Carolina (conducted 1/22 through 1/24) finds:
- 811 Likely Democratic Primary Voters (± 3.4%)
5 Someone else
7 Not sure
Since the beginning of their tracking survey (conducted 1/20 through 1/22), Obama has moved from 43% to 38%, Edwards from 15% to 21%, while Clinton remains at 25%.
A new NBC News/Wall Street Journal national survey (NBC story, results; WSJ story, results) (conducted 1/20 through 1/22) finds:
- Likely Democratic Primary Voters (± 4.4%)
5 Not sure
- Likely Republican Primary Voters (± 5.1%)
8 Not sure
8 Not sure
A new McClatchy/MSNBC/Mason-Dixon South Carolina survey (results PDF, conducted 1/22 through 1/23) finds:
- 400 Likely Democratic Primary Voters (± 5%)
A new Clemson University South Carolina survey (conducted 1/15 through 1/23) finds:
- Among Likely Democratic Primary Voters (± 4.6%)
Note: "Respondents were chosen for the sample if they voted in at least on of the past four Democratic primaries."
A new Mason-Dixon Florida survey (conducted 1/22 through 1/23) finds:
- 400 Likely Democratic Primary Voters (± 5%)
- 400 Likely Republican Primary Voters (± 5%)
My National Journal column, in which I revisit the issue of the primary voting screens used in national surveys, is now online.
On Tuesday, New York political consultant Eric Schmelzer posted a DailyKos diary (via Smith) about an aspect of Monday night's debate that he believes may "pivot the campaign" in Barack Obama's favor:
Surprisingly (at least to me), it seemed like Clinton, as well, made the decision to fight Obama on the trust issue, intimating that Obama was slick ("...it’s just very difficult to get a straight answer"), and that he worked for a slum lord.
I can't think of a worse issue for Clinton to try to match Obama over than trust. So far, trust hasn't entered the debate. It's surprising Obama didn't raise it before, because it tops the list of things people concern themselves with when voting (not surprising since we're coming out of W's administration). It's also an issue where Clinton is a clear loser (see questions 43, 44, 69). While voters fundamentally love what the Clintons did for the country in the 1990s, I think without reservation, most voters didn't see them as the most honest couple in the world.
In the battle to change the dialogue from experience to trust, Obama clearly won. It was definitely viewed as a victory by the Obama team, which released a post-debate statement dripping with references to 'trust.'
He's still waging that battle today, and Clinton is still taking the bait on it.
Today, Senator Clinton's team is still hitting on trust/truth, with Clinton saying this morning, "He has a hard time responding to questions about his record..." and "[Obama's answers] were so rehearsed that he kept on insisting that I had mentioned President Reagan in what I had said when I didn't mention President Reagan..."
Schmelzer is certainly right about "trust" being dangerous ground for the Clinton campaign, as is evident in results from today's new LA Times/Bloomberg survey (PDF):
Clinton still holds enormous advantages on foreign policy, health care and the economy and for having the "right experience." "Honesty/integrity," however, is easily her weakest dimension, and one of Obama's strongest, as other surveys have shown in recent months.
How important are perceptions of integrity and trust? Very. Drawing on decades of opinion poll data, political scientists identify two central traits -- competence and integrity -- that drive judgments about presidents and presidential candidates. "Presidents are judged," wrote Professor Donald Kinder (with whom I once studied at the University of Michigan), "by their intelligence, knowledge and experience on the one hand, and by their honesty, decency and ability to set a good moral example on the other" (p. 840). Candidates who are perceived to be otherwise qualified and competent lose when voters find them lacking in terms of honesty and trust. And keep in mind that the bulk of the research driving these conclusions comes from general election surveys in which perceptions of competence and integrity were sometimes strong enough to overcome partisan leanings in driving voter choices.
The Clinton campaign's emphasis on experience throughout the campaign is entirely consistent with the perceived importance of the competence/experience brand. Emphasizing areas of perceived strength in a campaign's final days is a basic element of Political Strategy 101, which makes the sudden, blistering shift to issues of personal character by both Clintons so curious.
Of course, Schmelzer was right to hedge. "Only time will tell," he wrote on Tuesday, whether the "trust/truth narrative" would overtake the "'experience' narrative of the past few weeks," or whether such a shift might benefit Obama "at the polls."
The last 24 hours, however, brings signs that the coverage may be changing: A front page story in today's Washington Post takes sides in the Clinton-Obama dispute, hitting a Clinton radio ad for repeating a "discredited charge" against Obama and "juxtapos[ing] it with GOP policies that Obama has never advocated." A companion editorial goes further, concluding that this "episode does not speak well" for Clinton's "character and judgment."
Of course, that is just one newspaper on one day. It is still too soon to know how far the media curve will turn or how much such a development might affect voter preferences, but a shift to the "trust/truth" narrative is not something the Clinton campaign should welcome.
A new Los Angeles Times/Bloomberg national survey (Times story, results; Bloomberg story; conducted 1/18 through 1/22) finds:
- 532 Likely Democratic Primary Voters (± 4%)
- 337 Likely Republican Primary Voters (± 5%)
With his column in The Hill this week, Democratic pollster Mark Mellman becomes the latest pollster to weigh in on the various theories behind the polling kerfuffle in New Hampshire two weeks ago. Since I neglected to link to some of these analyses and mentioned others only in passing, I thought it would be worthwhile to post a collection of links to everything I have collected on the subject. Going forward, this entry will serve as a "frequently asked questions" (FAQ) page for the New Hampshire polling controversy.
For now, here are the links. If you know of an analysis worth including that I have overlooked (or if one of these links is broken), please email us (questions at pollster dot com).
My Blog Posts
New Hampshire: So What Happened?
A Lesson from 1948
What About Monday Night?
More Clues: The CBS Panel Survey
The New Hampshire Recount
Polling Errors in New Hampshire
See also all posts on Pollster.com tagged "New Hampshire 2008"
Analysis by Pollsters and Other Notables
Marc Ambinder, The Atlantic
What Happened To The Polls? The Zbornak Effect?
Jon Cohen, polling director, The Washington Post
About Those Democratic Pre-election Polls
What if the Polls Were Right?
Robert Erikson and Chris Wlezian
Likely Voter Screens and the Clinton Surprise in New Hampshire
Kathy Frankovic, director of surveys, CBS News
NH Polls: What Went Wrong
Gender and Race in the Democratic Primary
John Judis, senior editor, The New Republic
Response to Kohut
Mickey Kaus, Slate
Hillary Stuns -- Four Theories
Andrew Kohut, president, Pew Research Center
Getting it Wrong
Response to Judis
Gary Langer, polling director, ABC News
New Hampshire's Polling Fiasco
The New Hampshire Polls: What We Know
Why Pollsters Got it Wrong (Video)
Joe Lenski, Edison Media Research
More on the New Hampshire Turnout, And Its Implications
Nancy Mathiowetz, president, American Association for Public Opinion Research (AAPOR)
Pre-Election Polling in New Hampshire: What Went Wrong?
Who Were New Hampshire's Likely Democratic Primary Voters
Mark Mellman, president, The Mellman Group (D)
N.H. Many Theories, Little Data
John Nichols, The Nation
Did "The Bradley Effect" Beat Obama in New Hampshire?
Frank Newport, editor-in-chief, Gallup
Putting the New Hampshire Polls Under the Microscope
More on New Hampshire
Scott Rasmussen, president, Rasmussen Reports
What Happened to the Polls in New Hampshire
Should We Blame Secretly Prejudiced New Hampshire Voters for Obama's Loss?
The New Hampshire Polls, One More Time
John Zogby, president, Zogby International
Polling the New Hampshire Primaries: What Happened?
Prior Research on Bradley/Wilder and Interviewer Effects
Finkel, Guterbock and Borg, Race-of-Interviewer Effects in a Preelection Poll: Virginia 1989 (via AAPOR)
Hugick, Polling in Biracial Elections
Hugick and Zeglarski, Polls During the Past Decade in Biracial Election Contests
Keeter and Samaranayake (Pew Research Center), Can You Trust What Polls Say about Obama's Electoral Prospects?
Pew Research Center, Race and Reluctant Respondents
Traugott and Price, A Review: Exit Polls in the 1989 Virginia Gubernatorial Race: Where Did They Go Wrong? (via AAPOR)
Streb et al., Social Desirability Effects and Support for a Female American President (via AAPOR)
AAPOR Ad-Hoc Committee
AAPOR FAQ on New Hampshire Polling: What Went Wrong
AAPOR Announces Ad-Hoc Committee to Evaluate New Hampshire Polls
Mark Blumenthal - The New Hampshire Recount
Jennifer Agiesta and Jon Cohen, Washington Post - The Method or the Map
DailyKos Diarist DHinMI - Enough with the "Diebold Hacked the NH Primary" Lunacy
Brad Friedman, BradBlog
NH Primary: Pre-Election Polls Wildly Different Than Announced Results for Clinton/Obama
New Hampshire's Chain of Custody
Farhad Manjoo, Salon - Was the New Hampshire Vote Stolen?
Josh Marshall - Enough
New Hampshire Secretary of State - Recount Results
Sharon Brogan - I Have This to Say About That
Don't those pollsters know
that married women
lie in the presence
of their husbands?
A new Public Policy Polling (D) automated survey in South Carolina (conducted 1/21) finds:
- 580 Likely Democratic Primary Voters (± 4.1%)
Among a subsample of African-Americans, Obama leads Clinton 70% to 15%.
A new American Research Group Florida survey (conducted 1/20 through 1/21) finds:
- 600 Likely Democratic Primary Voters (± 4%)
- 600 Likely Republican Primary Voters (± 4%)
A new Quinnipiac University statewide survey of New York (conducted 1/14 through 1/21) finds:
- 544 Likely Democratic Primary Voters (± 4.2%)
10 Don't know
- 331 Likely Republican Primary Voters (± 5.4%)
1 Someone else
9 Don't know
A new Rasmussen Reports automated survey in Florida (conducted 1/20) finds:
- 754 Likely Republican Primary Voters (± 4%)
6 Not Sure
A new SurveyUSA automated survey in Florida (conducted 1/20 through 1/21) finds:
A new WNBC/Marist statewide survey of New York (conducted 1/15 through 1/17) finds:
- 426 Likely Democratic Primary Voters (± 5%)
- 175 Likely Republican Primary Voters (± 7.5%)
A new Zogby statewide survey of New York (conducted 1/19 through 1/20) finds:
- 280 Likely Republican Primary Voters (± 6%)
5 Someone else
20 Not sure
- 425 Likely Democratic Primary Voters (± 4.9%)
2 Someone else
14 Not sure
Polling for the South Carolina Republican primary mostly got the winner right, but large undecided percentages prior to the election accounted for a general underestimate of the final vote for both McCain and Huckabee. Late polling also found the race closer than the eventual outcome.
Polling for third and fourth place was less successful in detecting Fred Thompson's final strength and the drop in support for Mitt Romney, who abandoned South Carolina to spend time in Nevada, where he scored a strong first-place finish. While almost all the polls finished inside the "5-ring," correctly seeing a close fight for the 3-4 spots, all but one poll got the order of Thompson-Romney wrong (and the one that got the order right substantially missed the magnitude of both votes.)
In the Nevada Republican caucuses the polls wildly underestimated Romney's final strength of 51% of the vote. (Note that the rings here have to be rescaled to include the very large errors.) Even the best of the three Nevada polls was more than 15 points off on Romney. The earliest of the three polls, taken before Romney's win in Michigan, was over 30 points low.
McCain and Huckabee ignored Nevada, essentially conceding the state to Romney, but the polling still failed to pick up the magnitude of his support there.
The polling likewise failed to capture Ron Paul's second-place strength. Pre-election polls put Paul at about 7%, compared to his finish of 13.7%.
In terms of erroneous expectations, the polling also put McCain well ahead of Paul, uniformly getting the 2nd and 3rd place finishers wrong.
On the Democratic side, the final poll was inside the "10-ring," and the polling improved as the caucus approached. Here the surprisingly poor showing of John Edwards and the Democrats' caucus reallocation rules for non-viable candidates helped boost the final percentages away from the polls. The three most recent polls all got the order of finish correct.
A new Siena College statewide survey of New York (conducted 1/14 through 1/17) finds:
- 311 Registered Democrats (± 5.6%)
19 Don't know/No opinion
- 174 Registered Republicans (± 7.4%)
17 Don't know/No opinion