
AAPOR 2010: Thoughts from a First-Time Attendee


Last week, I had the opportunity to attend the annual conference of the American Association for Public Opinion Research. The conference featured presentations from private sector, government, and academic researchers about their methods and findings, in addition to the release of reports by two AAPOR task forces - one on online panels, one on cell-phone surveying.

On the whole, I had a phenomenal experience. I truly enjoyed the spirit of collaboration as attendees and presenters shared best practices and supported each other's research.  I had an opportunity to meet an impressive group of established public opinion researchers, and also got to meet many young students and professionals who are doing fascinating work. (I now believe that pollsters are excellent conversationalists precisely because they're so good at asking questions.)

I think that one of the great benefits of attending AAPOR came in seeing how research is conducted by those in other industries.  For example, political pollsters deal with a variety of pressures that are lessened in academic research: the speed of data production, the need to insert your findings into the conversation quickly, as well as client demands and cost pressures.  While a major academic study may consume years of a doctoral candidate's life, a campaign poll typically needs rapid turnaround and subsequent immediate release in order to remain "fresh."  A campaign rarely, if ever, has time to improve its coverage and conduct face-to-face interviews of populations missed by land-line and cell-phone surveys, for instance. Weeks or months of post-stratification are a luxury not afforded to those in the world of campaign polling.
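For readers who haven't worked with it, here is a minimal sketch of the post-stratification idea, using invented numbers rather than any particular pollster's procedure: respondents in underrepresented demographic cells receive weights above 1 so the weighted sample matches known population shares.

    # Post-stratification in miniature: weight each demographic cell so the
    # sample's mix matches the population's. All shares here are invented.
    population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}
    sample_share = {"18-34": 0.15, "35-64": 0.55, "65+": 0.30}  # young adults underrepresented

    weights = {group: population_share[group] / sample_share[group]
               for group in population_share}
    print(weights)  # {'18-34': 2.0, '35-64': 0.909..., '65+': 0.667...}

In practice the cells are crossed (age by region by education, and so on), which is part of why careful post-stratification takes the time described above.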

At AAPOR, you get exposure to "the ideal" - projects refined by the most advanced and rigorous techniques, exploring the toughest challenges of sampling, processing and analysis that the survey research field faces.  It highlights ways to improve your methods, regardless of field, and helps a researcher facing time and cost pressures make informed decisions about what is critical to producing useful data.  And for a political pollster, AAPOR is a great time to focus on these issues exclusively, away from discussion about whose clients won more races or who got which race predictions closest.

There was one thing that surprised me a bit about the AAPOR conference, and I'd love to hear comments on this from those who have been to the conference before or who have been involved in the organization more deeply.  Essentially, if AAPOR is the "American Association for Public Opinion Research," one might logically assume the conference would devote a substantial portion of time to the findings of public opinion research in addition to the methods of collecting data.  

A great example of a panel that balanced these two was the Gary Langer/Matthew Warshaw presentation about ABC News' "Where Things Stand" research in Afghanistan.  I walked away with a greater understanding of how to conduct research in the most incredibly challenging circumstances, but I also learned what the people of Afghanistan think about the future of their nation.

However, the vast majority of content from the conference was about the process of social science research. In some cases, it was not necessarily even about opinion research in the strictest sense of the word "opinion", but rather the collection of demographics.  This is understandable, given that at a professional conference, everyone is trying to figure out how to do what they do better, but I felt there was a very narrow focus on the methods of research and less attention paid to what we're finding.  

Why do we conduct opinion research in the first place? We do it to learn about certain groups of people and audiences.  Developing a research methodology that perfectly captures cell-only populations is only as useful as the research findings it generates.  So what are we finding? Opinion research conducted by another organization about, say, shifting attitudes in America about the media has a great deal of application to my work as a political pollster, even if that research presentation does not impact the methods of how I do my work in the future.

With the wealth of knowledge possessed by the various professional and academic organizations in AAPOR, it would be great to see more panels highlighting the findings of public opinion professionals.

In the end, I think it is critical that more political pollsters take the opportunity to focus on their methods in order to create the highest quality data. We often measure political pollsters by the accuracy of their results and how often their numbers are "on the money" when final ballot counts are in. A conference like AAPOR gives researchers the tools to make sure they are right rather than lucky. There is a great deal that political polling professionals can learn from their counterparts in other industries and I feel very thankful that I had the opportunity to attend this conference and learn from their experiences.


Column: AAPOR's Transparency Initiative


My new column reviews AAPOR's Transparency Initiative as described in detail this past weekend by outgoing AAPOR President Peter Miller. I hope you'll click through and read it all. It may not be quite as consequential as health care reform, but in the polling world it has the potential to be a very big deal (to paraphrase our Vice President).

This column follows up on two items posted last week: The first reviews the rationale for the initiative, and the second features a video interview with Miller and includes the full list of participants "so far." Regular readers will know that AAPOR's initiative jibes neatly with my own ongoing interest in improving disclosure of polling methodology (discussed most completely here).


AAPOR2010 Interviews Wrap-Up


I'm back in DC and just filed a column that should appear on NationalJournal.com later this afternoon about the Transparency Initiative announced over the weekend at the conference of the American Association for Public Opinion Research (AAPOR). Kristen also tells me she's preparing a post on her experiences as a first-time conference attendee. So while this post will not truly "wrap up" our coverage of the AAPOR conference, I do want to provide a collection of links to all of the interviews we conducted this past weekend at this year's conference (with some final thoughts below):

A few things to remember about AAPOR's annual conference (largely cribbed from my comments last year): First, our interviews barely scratch the surface of the breadth and depth of subjects covered at the conference and the findings presented. We tend to focus on topics related to pre-election polling, but the conference covers a much wider array of methodological issues that are often highly technical and not easily distilled into a quick video interview.

Second, remember also that many of the findings presented at the conference -- including some that made their way into our interviews -- are very preliminary. The quality of the "papers" presented at the AAPOR conference varies widely (and I put that word in quotation marks because most are just PowerPoint presentations). Only a handful will eventually appear in academic journals, and those that do are at the beginning of a long peer review process that may ultimately lead their authors to different conclusions.

Finally, some words of thanks: First, thanks to all of those we interviewed for making themselves available. Second, to all of the folks at AAPOR for their logistical support. Third, to Lisa Mathias at the Winston Group for creating the animation that appears at the beginning of each video. And last but definitely not least, a huge personal thank you to the Winston Group's Kristen Soltis for sharing the interviewing duties. I got considerably more sleep this weekend than I would have otherwise, and our interviews are made that much better by having a newcomer's perspective. Thank you Kristen and thank you all!


AAPOR 2010: Pew's Jocelyn Kiley


One of the things I loved most about the AAPOR conference was the opportunity to learn from pollsters of different disciplines. The lessons one organization learns about how to reach a unique population are often useful to researchers of all varieties. In this case, Pew presented its findings about how best to reach Hispanics in general public opinion surveys. From issues in language and translation to interviewer hand-offs to the prevalence of cell phone use, Pew's findings highlighted the challenges in ensuring Hispanics are properly represented in survey research. For campaign pollsters, particularly those operating in states with a high proportion of Hispanic voters, knowing how to get a representative snapshot is becoming more and more critical to monitoring political attitudes. I had a chance to chat with Jocelyn Kiley about the research and its importance to political polling.


AAPOR 2010: Chris Wilson and Bryon Allen


While AAPOR's panels are predominantly composed of academics and professional non-partisan researchers, it was nice to run into a handful of political pollsters who had presentations as well. Chris Wilson and Bryon Allen from Wilson Research Strategies dug into the ANES data to answer a basic question: what matters more, persuasion or turnout? Is it more critical to move the middle or to energize your base? Their research points to persuasion as key. I pulled them aside for a moment before their panel to find out about their research.


AAPOR 2010: Washington Post's Jennifer Agiesta


On the second day of the AAPOR Conference in Chicago, I had a chance to catch up with Jennifer Agiesta of the Washington Post who chaired a panel session on candidate preferences and election outcomes. The panel featured presentations on a number of topics that impact how public opinion measures elections, including third parties and whether or not having your name at the top of the ballot gives you a distinct advantage. I caught up with Jennifer after the panel to chat with her about the presentations and to find out what challenges she's anticipating as a media pollster heading into the 2010 elections.


AAPOR2010: George Bishop


One last AAPOR interview (from me) for tonight: On Saturday, I talked to University of Cincinnati Political Science Professor George Bishop, who presented a paper on interpreting and misinterpreting public opinion trends, about recent polling on Arizona's immigration law.

I wrote about Bishop's well-known work on survey questions about fictitious issues in a National Journal column this past November.

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.


AAPOR2010: Robert Erikson


This latest interview from the AAPOR Conference in Chicago features Columbia University Political Science Professor Robert Erikson discussing his paper on individual level stability and change in presidential elections from 1952 to 2008.

Long-time Pollster readers may remember the guest contributions from Erikson, including an article he co-authored in 2006 with Joseph Bafumi and Chris Wlezien on forecasting House seats from generic congressional polls and another in 2008, co-authored with Karl Sigman, on SurveyUSA's 50 State Polls and the Electoral College.

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.


AAPOR2010: David Rothschild


This latest interview from the AAPOR Conference in Chicago features David Rothschild, a PhD candidate at the Wharton School of the University of Pennsylvania, discussing his paper on using voter expectations to forecast election outcomes.

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.


AAPOR2010: Peter Miller & AAPOR's Transparency Initiative


Today at the AAPOR Conference in Chicago, I interviewed Peter V. Miller, Northwestern University professor and the organization's outgoing president, just after he addressed the full conference on AAPOR's transparency initiative.

Miller made news today when he announced the organizations that pledged to support the initiative by routinely depositing methodological information to a public archive that AAPOR is creating (a subject I wrote about at more length earlier this week). According to Miller, the news media organizations and media pollsters that have pledged to support the initiative so far include: ABC News, the Associated Press, CBS News, Gallup, GfK, Knowledge Networks, the New York Times, the Pew Research Center, SurveyUSA and the Washington Post. Miller was careful to emphasize that "we are not cutting off the membership now, this is just the beginning." [Update: Peter Miller shared via email the full list of organizations that have committed, so far, to joining the transparency initiative, and it appears below, after the jump].

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.



AAPOR2010: Peter Woolley


At the AAPOR Conference yesterday, I interviewed Peter J. Woolley, director of Fairleigh Dickinson University's Public Mind Poll, on an experiment he conducted on how to measure support for an independent or third party candidate. Woolley and his Fairleigh Dickinson colleague Daniel Cassino are presenting a paper on this experiment today, but they originally published the survey data back in October.

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.


AAPOR2010: Doug Rivers


Yesterday, I interviewed Douglas Rivers, President and CEO of YouGov Polimetrix and professor of political science at Stanford University, on a paper he presented at the AAPOR Conference on the statistical properties of internet and phone polls in the 2008 elections.

Rivers authored a guest contribution on Pollster responding to a study on the accuracy of web surveys. I wrote a two-part column on the controversy last October. Interests disclosed: YouGov Polimetrix is the owner and primary sponsor of Pollster.com.

We are conducting interviews this week with participants at the annual conference of the American Association for Public Opinion Research (AAPOR). The following links take you to all the videos and occasional Twitter updates from me, Kristen Soltis and others at the conference.


AAPOR 2010: Public Opinion in Afghanistan


If you think pollsters in the US have it rough - difficulty getting folks to agree to participate, difficulty finding good samples given the rise of cell phone only households, etc. - try conducting public opinion research in Afghanistan. Over the last few years, ABC News has worked with research firm D3 Systems and the Afghan Center for Socio-Economic and Opinion Research to pull together some unique and fascinating research on shifting public opinion in the country. Their Emmy award-winning work was presented in the first session here at AAPOR and I was there to hear what they'd done and what they'd learned.

Research of this nature is of interest not just for its novelty but also for its impact. For instance, their research found that beliefs about civilian casualties were linked to optimism about the country's situation and a variety of other indicators. With "winning the hearts and minds" so integral to the conflict in Afghanistan, research like that conducted by ABC/D3 highlights key links between public opinion and things like the conduct and outcome of a war effort.

I had the opportunity to chat with ABC News' Gary Langer and D3's Matthew Warshaw about their research - take a look to hear more about the findings of their research and the challenges they encountered.


Greetings from the AAPOR Conference


This post is the first of a series of video interviews we'll be doing over the next few days at this year's conference of the American Association for Public Opinion Research (AAPOR) in Chicago, Illinois. Over the last few years, I've found that video-blogging is a better way to provide you with a window into the substance of what happens at AAPOR while still finding time to attend the sessions...and sleep.

Today, for example, was a whirlwind of planes, trains and automobiles. So we taped a quick interview with, well, me to introduce this series.

And did you notice I said "we"? Well, if you play the video above, you should hear me say that the person behind the camera asking the questions is regular Pollster contributor Kristen Soltis, who will be sharing interviewing duties with me this year. Thank you Kristen!

You can see all the videos, including those from last year's conference, here. Once again, credit for the animated graphic title that introduces each video goes to Kristen's colleague Lisa Mathias at the Winston Group. Thank you (again) Lisa!

And stay tuned, we'll have much, much more on Friday and over the weekend.


AAPOR Keeps 2011 Conference in Arizona


The American Association for Public Opinion Research (AAPOR) alerted its membership this afternoon that it will "continue with long-standing plans" to hold its annual conference in Phoenix, Arizona in 2011.

Over the last two weeks, AAPOR's members-only email listserv has been a hotbed of debate -- more than 100 messages -- over what to do about the scheduled conference in light of Arizona's stringent new immigration law. This afternoon, AAPOR President Peter Miller sent an email to the group's membership explaining the dilemma and their decision:

Some AAPOR members have recommended that we move the conference site from Arizona because they view the new Arizona immigration law as a moral affront. Others worry that some conference attendees might be harassed under the new law. Others point out that conference attendance is likely to fall if we do not move the site in reaction to the law. Still others argue that we should hold the meeting in Arizona despite the law because our reputation as a professional body dedicated to the dispassionate, nonpartisan study of public opinion would suffer if we take a position against the measure. The fact that we will certainly suffer severe financial consequences (a charge of at least $200,000) by canceling the Phoenix contract weighs on many. And, finally, all of us, regardless of our views, are operating in an environment of uncertainty about whether the just-passed law will be in effect at the time of the meeting, or whether it will be modified, delayed or even rescinded. There is also the possibility that a similar law could be enacted wherever we move the conference.

After an "extensive discussion," Miller reports, they have opted to keep the meeting in Arizona. "We feel that our responsibility to the mission of the organization and our fiduciary responsibility to AAPOR and its members make this the best choice in a very difficult situation" (Miller's full letter is posted after the jump).

Interests disclosed: I'm an active AAPOR member and a former member of the organization's Executive Council.

Incidentally, AAPOR's 2010 conference will be held next week in Chicago. I will be on hand and will once again post video interviews with notable presenters, with an assist this year from Pollster contributor Kristen Soltis.

Update - AAPOR also blasted out an email to its membership this afternoon with a letter from 2011 Conference Chair Rob Santos. An excerpt:

I can personally attest to a deep feeling of ambivalence - I detest SB1070 and what I believe it communicates to immigrant minorities (I being the 3rd generation descendent of an undocumented Mexican immigrant). But I also know how widespread profiling is. It has been an unfortunate reality for generations in the Southwest. SB1070 is a marginal step backwards in an environment that was already in great need of attention. Putting aside the content of these views, my point is that, I - like you - have my own unique set of experiences, knowledge, and perspectives that I bring to this issue. AAPOR members share common beliefs and values, and we have our differences too. Both can be used to strengthen our association and our industry.



POQ: Understanding the 2008 Election


The events of the week delayed my pointing to a terrific resource made available to pollsters and polling junkies over the weekend by Public Opinion Quarterly, the academic journal published by the American Association for Public Opinion Research (AAPOR). Their special issue on the 2008 Presidential Election is now posted online and access is totally free.

It leads with the article previously teased on Pollster.com by co-author Mike Mokrzycki on exit poll measurements of cell-phone only voters and implications for future polling. It also includes many more articles of interest to Pollster readers, including a review of poll performance in 2008 by Mike Traugott and Chris Wlezien, a look at how prediction markets compare to polls in forecasting outcomes by David Rothschild and much, much more.

Again, POQ has made access open and free to all. Definitely worth a click.


So Why Isn't AAPOR More Transparent?


The crux of the reprimand issued last week by the American Association for Public Opinion Research (AAPOR) against Strategic Vision, LLC is that the pollster failed to provide "any information about response rates, weighting, or estimating procedures." But if you look closely at the materials posted online in connection with AAPOR's "Ad Hoc" investigation of the primary polling mishaps of 2008, you will see several other pollsters for whom no response rate or weighting information is available. So why did AAPOR single out Strategic Vision?  And why isn't AAPOR itself more transparent about the identity of the person who filed the complaint or about their communication with Strategic Vision's CEO, David Johnson? Let's take a closer look.

As of this writing, AAPOR has published information disclosed by pollsters in response to the requests of its primary polling investigation in two ways. Their final report, released this past April, summarizes the information that had been disclosed at the time (see especially Tables 4, 5, 7, 9 and 18). In partnership with the Roper Center, AAPOR has also created an online archive that includes responses and data received from pollsters, as well as many of their initial public reports. Some of the responses on the Roper site were received after the report was written.

If you take the time to sift through the various documents, you will still find (as of this writing) no responses on weighting procedures from three organizations: Strategic Vision, Clemson University and Ebony/Jet. Response rate information is still missing for those three plus two more: the LA Times and Rasmussen Reports. Both response rates and weighting information are among the "minimal disclosure" items that the AAPOR code mandates that all pollsters disclose. So why did AAPOR single out Strategic Vision for public condemnation and not any of the others?
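(For readers unfamiliar with the response rate figures at issue: AAPOR's Standard Definitions spell out several formulas, of which Response Rate 1 is the most conservative. The sketch below is my own illustration with invented case counts, not anything from the investigation's files.)

    # AAPOR Standard Definitions Response Rate 1 (RR1): complete interviews
    # divided by all eligible and possibly-eligible cases. Counts are invented.
    def rr1(I, P, R, NC, O, UH, UO):
        """I=completes, P=partials, R=refusals, NC=non-contacts,
        O=other non-interviews, UH/UO=cases of unknown eligibility."""
        return I / ((I + P) + (R + NC + O) + (UH + UO))

    print(f"RR1 = {rr1(I=800, P=50, R=900, NC=1200, O=150, UH=400, UO=100):.1%}")  # RR1 = 22.2%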

I put that question to AAPOR and received a two-part answer from standards chair Stephen Blumberg. First, the Roper/AAPOR archive does not include all of the latest information:

We recognize that there may be discrepancies between the ad hoc committee report, the information on the Roper Center site, and the information available to the ad hoc committee. Some information that was received after the ad hoc committee report was finalized has not yet been posted. More information will be posted soon to update the Roper Center site.

Second, while some organizations were apparently unable to provide all the information requested, they apparently convinced AAPOR that they had made a good faith effort to disclose whatever information they had retained or otherwise had available:

Several organizations provided responses indicating that they did not produce, obtain, or retain sufficient information to provide the methodological information listed in the AAPOR Code and requested by the ad hoc Committee. Hence, it was not always possible for each organization to provide equally detailed information.

So why was Strategic Vision singled out for public reprimand?

Strategic Vision LLC, however, was the only polling firm that explicitly refused to provide such information in response to multiple requests. Strategic Vision LLC never indicated that such information was not produced, obtained, or retained.

Blumberg also expanded on why AAPOR is not commenting on the actions of other pollsters or disclosing the identity of the person who filed the initial complaint against Strategic Vision:

Regarding any judgments that may have been made during an AAPOR Standards Investigation of the adequacy of disclosure for any organization, you are aware (as an active AAPOR member and former Council member) that the confidentiality provisions in our Procedures do not permit AAPOR to comment. We cannot reveal whether complaints were filed, evaluation committees were formed, judgments were made, or actions other than public censure were taken.

[And yes, interests disclosed once again: I am an active AAPOR member and served as a member of its Executive Council from 2006 to 2008].   

Blumberg's reference to confidentiality raises an objection voiced frequently by Strategic Vision CEO David Johnson in response to the AAPOR action. "We've asked for a copy of the complaint that was filed against us, and who filed it," Johnson told Jim Galloway of the Atlanta Journal-Constitution. "How can you respond to something when you don't know who filed the complaint." He also told the website Research that he "find[s] it unusual that an organisation that says they are all about transparency won't supply us with details of the complaint. What they were asking for were trade secrets."

AAPOR's refusal to name the person who filed the complaint is, as Blumberg says, consistent with its extensive "Schedule of Procedures for Code Violations" that includes numerous safeguards to "maintain confidentiality of the subject(s), information sources, and methods of investigation." Why the lack of transparency?

A good clue to the answer can be found in Sidney Hollander's chapter of the official AAPOR history that is posted on the organization's website. Ironically, emphasis on anonymity and confidentiality was partly a reaction to the concern about potential lawsuits and legal liability of the sort that David Johnson is now threatening. Hollander writes (p. 76):

In early 1974, some Council members began exploring what legal liability the organization might incur if it were to adopt stronger measures. The legal advice obtained recommended explicit procedures that could be applied uniformly as a means of minimizing the possibility of retaliation by liability suits.

Hollander does not address the issue, but it seems likely that the authors of those procedures wanted to protect against those who might try to use their process to promote frivolous or unfounded complaints. So they set up a procedure to carefully evaluate and investigate reported violations of their code before making any comment.

The chapter also reports that complaints about unidentified complainants are not new. He cites a 1973 complaint against a polling organization that (p. 76),

declined to respond to the complaint without knowing the identity of the complainants. Anonymity of the complaint's source was an issue that has been continually debated as the Code developed. Although Council member Cisin said that the concealing complainants' identities make the Standards Committee party to a 'security action,' the Standards Committee took the position that once a claim is accepted, the committee itself becomes the plaintiff in criminal law. (p. 76)

And what about another of David Johnson's complaints: Why would AAPOR expect a non-member to conform to its rules? That issue, Hollander writes, was also the subject of internal debate from the very beginning. He writes that in 1964, an argument in favor of acting on complaints against non-members was that AAPOR members had shown "overwhelming support for action against pseudo-surveys, for instance, and other violations by non-members that threatened to impair interviewer access to respondents" (p. 74). As a professional organization, AAPOR has always been concerned about unethical actions that threaten the image of their profession.

They resolved the debate, Hollander writes, by setting up rules that would hold practitioners "responsible for their work" through disclosure requirements, but not prescribe specific standards or best practices for how survey research should be conducted. And that brings us back to the Strategic Vision reprimand.

What is striking about AAPOR's action last week, especially in light of the responses of other organizations, is that other pollsters that fell short of full disclosure were not the subject of public reprimand. For example, the AAPOR Code says researchers should release response rates with their public reports, but as far as I know, only 1 of 21 pollsters disclosed a response rate at the time their surveys were released in 2008.  However, the other organizations either released response rate information on request or responded in good faith about the information they could and could not "produce, obtain, or retain." AAPOR singled out Strategic Vision because it was, they say, the only organization that flat out refused to answer even cursory questions about its response rates and weighting procedures. It was the only organization, in effect, that refused to take responsibility for its work.


AAPOR 2009 Interview Wrapup


I'm back in DC and rested, or at least, better rested than I was this time yesterday. As a wrap up to my coverage of this year's conference of the American Association for Public Opinion Research (AAPOR), here are links to all of the video interviews (with some final thoughts below):

  • Michael Link, AAPOR's conference chair, provides an introduction.
  • Masahiko Aida on a likely voter validation study conducted by Greenberg Quinlan Rosner & Democracy Corps.
  • Reg Baker on AAPOR's online panel survey task force.
  • Chris Borick on whether the "incumbent rule" made a comeback in 2008.
  • Elizabeth Dean on RTI International's efforts to recruit and interview respondents in Second Life.
  • Don Dillman on Address Based Sampling (ABS) and its applications for Internet surveys.
  • Paul Donato on the role of traditional survey research in a world of changing electronic measurement.
  • Tom Guterbock on his research on improving political message testing.
  • Lou Harris recounts experiences working for candidate John F. Kennedy in 1960.
  • Sunshine Hillygus on her study of how primary supporters of Hillary Clinton (and those of other unsuccessful presidential candidates) ended up voting in the general election.
  • Scott Keeter on the perils of pre-election polling in 2008.
  • Jon Krosnick on the challenges in assessing the quality of new survey methods.
  • Jennie Lai of the Nielsen Company on their research into the use of mobile devices for time use surveys.
  • Christopher Wlezien on the work he and Bob Erikson have done on the comparative accuracy of polls and political prediction markets.

A few things to remember about AAPOR's annual conference: First, my interviews barely scratched the surface of the breadth and depth of subjects covered and findings presented. I tend to focus more closely on topics related to pre-election polling, but the conference covered a much wider array of methodological issues. For example, there were by my count eight panels on survey non-response, six on web surveys, and five each on address-based sampling and cell phone interviewing. These tend to be highly technical and not easily conveyed via a quick video interview.

Remember also that a lot of the findings presented at the conference -- including some that made their way into these interviews -- are very preliminary. The quality of the "papers" presented at the AAPOR conference varies widely (and I put that word in quotation marks because most are just PowerPoint presentations). Only a handful will eventually make their way into academic journals, and those that do are still at the beginning of a long peer review process that may ultimately lead their authors to different conclusions. Some of that review comes in the form of tough questions at the conference that my interviews cannot capture.

Finally, some words of thanks: First, thanks to all of those I interviewed for making themselves available. Second, a big thanks to Michael Link, the conference chair, and Monica Evans-Lombe and her colleagues at AMP (AAPOR's management company), who helped provide important logistical support. Finally, another big thank you to Lisa Mathias at the Winston Group for creating the animation that appears at the beginning of each video (and her colleague and Pollster regular Kristen Soltis for recommending her). Thank you to all!


AAPOR09: Sunshine Hillygus


On Saturday, I also interviewed Sunshine Hillygus, director of the Program on Survey Research at Harvard University, on her study of how primary supporters of Hillary Clinton (and those of other unsuccessful presidential candidates) ended up voting in the general election, as presented at the annual conference of the American Association for Public Opinion Research (AAPOR).

The following links take you to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Jennie Lai


On Saturday, I also interviewed Jennie Lai, research methodologist for The Nielsen Company, on her research into the use of mobile devices for time use surveys as presented at the annual conference of the American Association for Public Opinion Research (AAPOR).

The following links take you to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Don Dillman


On Saturday, I interviewed Washington State University Professor Don Dillman, author of Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, on his work on the use of Address Based Sampling (ABS) and applications for internet surveys, as presented at the annual conference of the American Association for Public Opinion Research (AAPOR).

The following links take you to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Elizabeth Dean - Second Life


I interviewed Elizabeth Dean, a survey research methodologist at RTI International, about their efforts to recruit and interview survey respondents in Second Life at the annual conference of the American Association for Public Opinion Research (AAPOR).

RTI has more information about their research facility in Second Life's online virtual world.



The following links take you to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Jon Krosnick


I interviewed Jon Krosnick, Stanford professor of communication, political science and psychology, on challenges in assessing the quality of newer survey methodologies, including online panel surveys, at the annual conference of the American Association for Public Opinion Research (AAPOR).

Follow the links to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Scott Keeter


I interviewed Scott Keeter, director of survey research at the Pew Research Center, on a paper he presented on the perils of pre-election polling in 2008 at the annual conference of the American Association for Public Opinion Research (AAPOR).

Follow the links to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Chase Harrison


I interviewed Chase Harrison, preceptor at the Program on Survey Research at Harvard University, on his research into the accuracy of pre-election polling forecasts in 2008 at the annual conference of the American Association for Public Opinion Research (AAPOR).

I also interviewed Harrison last year on the first stage of this research.

Follow the links to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Tom Guterbock


I interviewed Tom Guterbock, director of the Survey Research Center at the University of Virginia, on a paper he presented on tests he conducted over the last year to improve message testing polls, at the annual conference of the American Association for Public Opinion Research (AAPOR).

Follow the links to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Masahiko Aida


I interviewed Masahiko Aida, director of analytics for Democratic polling firm Greenberg Quinlan Rosner (which conducts surveys for Democracy Corps), on a paper he presented on their likely voter validation study at the annual conference of the American Association for Public Opinion Research (AAPOR).

Follow the links to all the videos, the conference program (pdf and online searchable) and occasional Twitter updates from me and others at the conference.


AAPOR09: Reg Baker


Up next in our series of interviews at the AAPOR 2009 conference is Reg Baker, Chief Operating Officer of Market Strategies International, discussing his work as chair of AAPOR's Online Panel Task Force.

You can see all the videos from this year's AAPOR conference here (and last year's too).  I may also post occasional updates via Twitter.


AAPOR09: Chris Borick


The next in our series of interviews at the AAPOR 2009 conference is with Chris Borick, director of the Institute of Public Policy at Muhlenberg College, about the paper he presented on the "incumbent rule" in 2008.

You can see all the videos from this year's AAPOR conference here (and last year's too).  I may also post occasional updates via Twitter.


AAPOR09: Paul Donato


The latest interview from our series of interviews at the AAPOR 2009 conference is with Paul Donato, Executive Vice President and Chief Research Officer for The Nielsen Company. Donato was one of two speakers at last night's Plenary Session, "The Role of Traditional Survey Research in a World of Electronic Measurement and Changing Information Needs."

Unfortunately, a combination of logistical and technical challenges prevented me from completing an interview with Columbia University Professor Kenneth Prewitt, who shared the stage with Donato.

You can see all the videos from this year's AAPOR conference here (and last year's too).  I may also post occasional updates via Twitter.


AAPOR2009: Christopher Wlezien


Yesterday, I also interviewed Temple University Professor Christopher Wlezien, whom long-time Pollster readers may remember from the guest contribution he co-authored with Joseph Bafumi and Bob Erikson. We talked about the work Wlezien and Erikson have done on the comparative accuracy of polls and political prediction markets:

Wlezien sent a copy of the paper he co-authored with Erikson that he presented yesterday. Their earlier article, "Are Political Markets Really Superior to Public Opinion Polls," appeared in Public Opinion Quarterly in 2008.

You can see all the videos from this year's AAPOR conference here (and last year's too).  I may also post occasional updates via Twitter.


AAPOR09: Lou Harris


The second interview in the series I am conducting at this week's AAPOR conference has a bit of a back story. I had the real honor of interviewing 88-year-old Lou Harris, the founder of Harris and Associates in the mid-1950s and creator of the Harris Poll. Harris was also one of the first pollsters to serve as a senior advisor to a presidential candidate -- John F. Kennedy in 1960.

The unfortunate back story is that about 30 minutes after we finished our interview and just before he was scheduled to speak at the AAPOR conference, Harris fell and hurt himself badly enough that he had to be taken to a nearby hospital. I am told, however, that the injury is not serious although he is spending the night for observation. We hope he is feeling better soon.

We talked for a little over twelve minutes. In this first segment, Harris recounts the work he did for the Kennedy campaign prior to the West Virginia Democratic primary in 1960:

In this second segment, I asked Harris about the apparently apocryphal story that had candidate Kennedy snapping, "Just give me the numbers, Lou, I know what they mean." Apparently, that story is something of a myth, although he goes on to describe Kennedy as a perceptive but demanding client:


You can see all the videos, including those from last year's conference, here.

Update:  Twenty-five years ago, the Washington Post's Louis Romano reported the following about the mythical "I know what they mean" story (via Nexis, 1/15/1984):

THE STORY IS SAID to be apocryphal, but political pollsters still love to tell it:

John F. Kennedy was soaking in a tub during the final days of the 1960 presidential campaign.

Lou Harris, the first in a new breed of specialized presidential pollsters, was on the edge of the tub discussing his latest survey results, complete with a complicated package of political advice and in-depth analysis for Kennedy.

"Just give me the numbers, Lou," Kennedy is said to have snapped. "I know what they mean."

"Oh, we used to always meet in the bathroom," says Lou Harris today, one of the deans of political polling. "But the story is part of the mythology . . . I was certainly the first polltaker who . . . served on a super-strategy committee."


AAPOR09: Conference Chair Michael Link


This post is the first of a series of video interviews I'll be doing over the next few days at the annual conference of the American Association for Public Opinion Research (AAPOR). We begin with Michael Link, Chief Methodologist/VP for Methodological Research at The Nielsen Company, who is serving as this year's AAPOR conference chair. He explains what this conference is about and what we can expect over the next few days.

You can see all the videos, including those from last year's conference, here.

Credit for the animated graphic title that introduces each video goes to Lisa Mathias at the Winston Group (and her colleague Kristen Soltis for recommending her). Thank you Lisa and Kristen!


Twitter Updates


I am headed off to the airport in a few minutes for Hollywood, Florida and this year's conference of the American Association for Public Opinion Research (AAPOR).  I will be video-blogging for the rest of the week (like last year), but will otherwise try to keep you updated via Twitter.

The box below displays my last five Twitter updates. You can also follow posts by other AAPORites on Twitter using the hashtag #AAPOR09.



Coming Soon: AAPOR Conference 2009 Interviews


Next Thursday, I will fly to Hollywood, Florida to cover the annual conference of AAPOR -- the American Association for Public Opinion Research. That coverage will include a reprise of last year's popular video interviews of those presenting on topics of interest to Pollster.com readers, as well as comments via Twitter as time permits. I will have more details on our coverage next week, but the videos will all appear right here on Pollster.com.

Meanwhile, for the true polling geeks out there, we would love your input on potential interviewees. The full program is available for download, either as a full pdf file or (if you're looking for something specific) through a search form. If you are a survey geek, are there any particular topics, papers or speakers you really hope I interview? If so, please email me or leave a comment below.


Robert Groves Nominated to Head Census


In the survey world, this is very big news (from AP via The Page):

President Barack Obama is tapping Robert M. Groves, a University of Michigan professor who has pushed the use of statistical sampling, to be the next census director.

A Commerce Department official who demanded anonymity said the White House will make the announcement later Thursday.

Groves is an expert in survey methodology and statistics who served as an associate director of the Census Bureau from 1990 to 1992. He and others recommended that the 1990 census be statistically adjusted to make up for an undercount, only to be overruled by then Republican Commerce Secretary Robert Mosbacher, who called it political tampering.

It is something of an understatement to describe Robert Groves as "an expert in survey methodology." He is one of our nation's most respected survey methodologists and arguably the leading authority on the subject of non-response in surveys. He has served as the president of the American Association for Public Opinion Research (AAPOR), and won many of its awards, including the career award for exceptionally distinguished achievement. Interests disclosed: I had the good fortune to study under Groves in classes I took at the University of Maryland's Joint Program in Survey Methodology (JPSM), a program that Groves helped found.

As the first three paragraphs of the AP story make clear, some see nothing but "political tampering" in any reference to "statistical sampling" regarding the census. For those tempted to label Groves as the pawn of partisans in the White House or the Democratic party, I have a warning: The notion of Bob Groves yielding to partisanship is laughable. As in rolling on the floor laughing out loud laughable. Groves is well known and universally respected among survey researchers and Census Bureau professionals alike. He is an ideal choice for this appointment.

I conducted the interview with Groves below, on the topic of non-response, at last year's AAPOR conference. The Bob Groves in this interview is the scientist and professional his students and peers know well:


Update:  AP has updated their story with initial reactions to the appointment (via @AAPOR).  This didn't take long:

House Republicans quickly expressed dismay Thursday over the selection of Groves, saying Obama's choice raised serious questions about an "ulterior political agenda."

"The fight to protect the accuracy and independence of the 2010 census has just begun," said Rep. Patrick McHenry of North Carolina, the top Republican on a House subcommittee overseeing the census. "President Obama has made clear that he intends to employ the political manipulation of census data for partisan gain."

Also, the Washington Post has more background on Groves and the sampling controversy, including this:

Groves served as the bureau's associate director from 1990 to 1992 and currently is director of the university's Survey Research Center. He has researched why people participate in statistical surveys, worked to develop surveys with lower non-response errors and studied how data is collected for surveys.

A congressional aide familiar with Census matters said Groves has "bulletproof scientific credentials" and is "really highly regarded by his peers as a low-key, determined guy who's been really focused on reducing error in survey research for his whole career."

Update 2: Time's Amy Sullivan, who kindly linked to this post, reviewed the census sampling controversy when it came up in connection with Judd Gregg's withdrawal as Commerce Secretary nominee.


AAPOR Releases More Details on Burnham Censure


In early February, the American Association for Public Opinion Research (AAPOR) censured Dr. Gilbert Burnham, a faculty member at the Johns Hopkins Bloomberg School of Public Health, for violating the AAPOR ethical code by failing to disclose "essential facts about his research," a study (pdf) of civilian deaths in Iraq originally published in the journal Lancet.

When I blogged the story, one reader asked for more specifics about what exactly Burnham had failed to disclose. The response from AAPOR's standards chair, Mary Losch, was a somewhat vague summary: "Included in our request were full sampling information, full protocols regarding household selection, and full case dispositions -- Dr. Burnham explicitly refused to provide that information for review."

On Tuesday, AAPOR's executive committee issued a statement (pdf) with more specifics on what they requested and how Burnham responded:

As part of the investigation, the AAPOR Standards Chair requested information from Dr. Burnham. The specific requests related to AAPOR’s finding of violation of minimum disclosure were as follows:

1. The survey sponsor(s) and sources of funding for the survey.

2. A copy of the original questionnaire or survey script used in the 2006 survey, in all languages into which it was translated.

3. The consent statement or explanation of the survey purpose.   

4. A full description of the sample selection process, including any written instructions or materials from interviewer training about sample selection procedures.

5. A summary of the disposition of all sample cases.

6. How were streets selected? How were the starting street, and the starting household, selected? Once the starting point was selected, how were interviewers instructed to proceed (e.g., when they came to an intersection)? How were houses and respondents chosen at housing units?

7. The survey description says that, “The interview team were given the responsibility and authority to change to an alternate location if they perceived the level of insecurity or risk to be unacceptable.” In how many clusters did the team change location, and what were the reasons for the changes?

8. The survey description says that, “Empty houses or those that refused to participate were passed over until 40 households had been interviewed in all locations.” Were such cases included in the number of not-at-home and refusal cases counted in each cluster?

Dr. Burnham responded with the following information related to the detailed request:

• “This study was carried out using standard demographic and household survey methods.”

• “The methods we employed for this study were set out in the Lancet paper reporting our findings (Lancet, 2006;368:1421-28). The dataset from the study was released some time ago.”

Despite repeated requests from the AAPOR Standards Chair for the information detailed above, Dr. Burnham refused to provide any additional information. He did not indicate that the information was unavailable, nor did he suggest that disclosure of this information would risk revealing the identities of survey participants.

Keep in mind that AAPOR asked Burnham to disclose these details to their standards committee as a part of a confidential inquiry. They were not asking him to make these details public, at least not at that stage of their investigation. They have not provided information on the nature of the original complaint made by an AAPOR member, which may have involved aspects of the research other than disclosure. Either way, the AAPOR code is very clear about a researcher's obligation to disclose such details on request. Failure to disclose is grounds for censure.

Mary Losch will present an overview of AAPOR's code and the Burnham case and will be available for questions today at 3:00 p.m. at an event sponsored by AAPOR's DC Chapter (more details at DC-aapor.org).

Interests disclosed: I am an AAPOR member and served on AAPOR's Executive Council for two years, from May 2006 to May 2008, but was not involved in the Standards Committee's investigation of the Lancet study.


New from AAPOR: Survey Practice and a Polling Webinar


From my colleagues at the American Association for Public Opinion Research (AAPOR) come two new projects of interest:

AAPOR is again teaming up with The Poynter Institute's News University to sponsor a "webinar" on Understanding and Interpreting Polls in the 2008 Election.  The live webcast, which will be held on Thursday, September 18 at 2 p.m. Eastern time, will feature live audio and a slideshow and will allow participants to post questions.  Access to the live or archived version will cost $24.95.  Details are available here.

AAPOR also just launched Survey Practice, a new online publication that has been in the works for two years (including my stint as AAPOR's communications chair).  The publication aims to be a forum for pollsters and survey researchers to share "advances in practical survey methods, current information on conditions affecting survey research, and interesting features about surveys and people who work in survey research."  Links to articles in the first issue are available here.


AAPOR 2008: Brian McCabe


And finally, the last of my series of brief interviews conducted at last week's AAPOR Conference, this one with Brian McCabe discussing a paper he co-authored with Jennifer Heerwig that won AAPOR's award for the best student paper presented at the conference. McCabe and Heerwig are both doctoral candidates in Sociology at New York University. The paper is entitled, "Social Desirability Bias in Estimated Support for a Black Presidential Candidate."

For those intrigued by McCabe's summary, I strongly recommend the full paper, a "terrific read," as conference discussant Murray Edelman noted after Heerwig and McCabe presented their results on Saturday. The Heerwig-McCabe paper is easily the most intriguing and currently relevant paper I saw presented at the AAPOR conference, and worth reading alongside the cover story ("The Big Race: Obama and the psychology of the color barrier") by John Judis in the current issue of The New Republic.

Having said that, I want to pass along two cautions. First, keep in mind that the data were collected in June 2007, when Hillary Clinton, not Barack Obama, was the clear front-runner for the Democratic nomination.

Second, consider the caution that Edelman voiced on Saturday (a point that may not be clear until you read the full paper): A "mode effect" may confound the estimates of social desirability bias that Heerwig and McCabe calculate by asking the same question in different ways (explained on pp. 9-12 of their paper and shown in the comparisons of "true" to "overt" support in Table 3). Edelman cited a 2006 article in Public Opinion Quarterly by Smyth et al. that found that respondents tend to answer more questions more completely when asked one at a time rather than when presented in a "check all that apply" list format.
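To make the "true" versus "overt" comparison concrete, here is a sketch of a standard list-experiment (item count) estimator, a common unobtrusive way to measure covert support for a sensitive position; whether it matches the exact Heerwig-McCabe design is my assumption, and the data below are invented.

    # List experiment sketch: respondents report only HOW MANY items on a
    # list apply to them; the treatment list adds one sensitive item.
    import statistics

    control_counts = [2, 1, 3, 2, 1, 2, 3, 1]    # baseline list of items
    treatment_counts = [3, 2, 3, 2, 2, 3, 3, 2]  # baseline plus sensitive item

    # The difference in mean counts estimates covert support for the item.
    covert = statistics.mean(treatment_counts) - statistics.mean(control_counts)

    overt = 0.80  # hypothetical share saying "yes" when asked directly
    print(f"covert={covert:.2f}, estimated bias={overt - covert:.2f}")

Because no individual ever directly admits or denies the sensitive item, the count-based estimate is less vulnerable to social desirability pressure, which is what makes the comparison to direct ("overt") answers informative.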


AAPOR 2008: Sunshine Hillygus


Next to last in my series of brief interviews conducted at this week's AAPOR Conference, this one with Sunshine Hillygus, associate professor of government and director of the Program on Survey Research at Harvard University. She discussed findings on microtargeting and direct mail appeals in the 2004 presidential campaign and her new book, The Persuadable Voter: Wedge Issues in Presidential Campaigns:


AAPOR 2008: Gary Langer


Yet another in my series of brief interviews conducted at this week's AAPOR Conference, this one with Gary Langer, director of polling at ABC News:

For more information, see ABC News' Polling Methodology and Standards.


AAPOR 2008: Andrew Smith


Continuing with our series of brief interviews conducted at this week's AAPOR Conference, this one with Andrew Smith, director of the University of New Hampshire (UNH) Survey Center. He discusses his AAPOR presentation on undecided voters in the New Hampshire primary, which drew on a question I wrote about often:

For more on the pollster debate over RDD vs RBS sampling see our posts here and here and the AAPOR conference interview with Patrick Murray.


AAPOR 2008: Mark DiCamillo


Continuing with our series of brief interviews conducted at this week's AAPOR Conference, this one with Mark DiCamillo, director of The Field Poll in California. A year ago, we reported on Field's use of RBS sampling. Here, DiCamillo discusses details of Field's ability to interview voters over their cell phones in California using Registration Based Sampling (RBS):


AAPOR 2008: Chase Harrison


One more in my series of brief interviews conducted at this week's AAPOR Conference, this one with Chase Harrison, preceptor in survey research at Harvard University. Harrison describes his analysis of the accuracy of pre-election polls during the 2008 primaries.


AAPOR 2008: Jeff Jones on Gallup's Cell Phone Interviews


Yet another in my series of brief interviews conducted at this week's AAPOR Conference, this one with Jeff Jones, managing editor of the Gallup Poll. Jones presented findings on the interviews Gallup has conducted by cell phone with Americans living in households with no landline phone service to supplement national surveys and polls conducted in four primary states.

Starting just after 2:00 on the video, Jones discusses the impact of those additional interviews on the general election matchups between the two Democratic contenders and John McCain. While the inclusion of cell phone only households makes little difference in the Clinton-McCain contest, it benefits Obama by a net four points: Without cell phone interviews, and weighted using Gallup's usual likely voter model, McCain would get 49% to Obama's 46% (clarification: this result combines six Gallup/USAToday surveys conducted so far during 2008). With the cell-phone interviews included, the result is Obama 48%, McCain 47%.
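To be clear about the arithmetic, the "net four points" refers to the swing in the Obama-minus-McCain margin, not to either candidate's individual share. A quick check using the toplines quoted above:

    # Net effect of including cell-phone-only interviews, per the Gallup
    # toplines quoted above (six combined Gallup/USA Today surveys in 2008).
    margin_without = 46 - 49  # Obama minus McCain, landline-only: -3
    margin_with = 48 - 47     # Obama minus McCain, with cell phones: +1
    print(margin_with - margin_without)  # 4-point net shift toward Obama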

And don't miss the cameo appearance by ABC polling director Gary Langer.


AAPOR 2008: Patrick Murray on RBS vs RDD


Another in my series of brief interviews conducted at this week's AAPOR Conference, this one from Patrick Murray, director of the Polling Institute at Monmouth University. His paper, co-authored by Monmouth University colleague Timothy MacKinnon, concerned an experiment with random digit dial (RDD) and registration based list sampling (RBS):

More on the pollster debate over RDD vs RBS sampling here and here.


AAPOR 2008: SurveyUSA's Jay Leve


Another in my series of brief interviews conducted at this week's AAPOR Conference, this one with SurveyUSA's Jay Leve on why he values AAPOR:


AAPOR 2008: Dutton/Race and Gender of Interviewer Effects


Another in my series of brief interviews conducted at this week's AAPOR Conference, this one with Sarah Dutton, deputy director of surveys for CBS News. Dutton discusses findings presented at the conference (and co-authored by CBS colleagues Jennifer De Pinto and Fred Backus) regarding the effects of race and gender of interviewers on primary polling on the 2008 Democratic presidential nomination.

Dutton discusses these findings in the context of theories about the so-called Bradley-Wilder effect. I've written about that subject here and here. Politico's Daniel Libit's story today also reviews the debate over this issue during the 2008 primaries.


AAPOR 2008: Nancy Mathiowetz


Another in my series of brief interviews conducted at this week's AAPOR Conference, this one with outgoing AAPOR President Nancy Mathiowetz discussing her advice to pollsters and journalists on pre-election poll coverage and thoughts on pollster cooperation with the request for data from AAPOR's Special Committee on Primary Polls:


AAPOR 2008: Michael Traugott - Primary Polling Committee


Another in my series of interviews conducted at this week's AAPOR Conference, this one with Michael Traugott, chair of AAPOR's Special Committee on 2008 Primary Polling:

Related - Politico's Daniel Libit also filed a story today that touches on AAPOR's special committee:

In the wake of New Hampshire’s polling glitch, AAPOR convened a task force to study what had gone awry. Though its report has not been finalized, the task force’s head, University of Michigan professor Michael Traugott, says the evidence has yet to point to any “smoking gun.” [...]

AAPOR had hoped to publish the task force findings before this weekend’s convention, but Traugott says the work has been stalled by the hesitancy of pollsters to submit their methods and practices for peer review. He expects to report by mid-to-late summer, providing enough time before the general election for pollsters to tweak their methodology or improve their voter-screening questions.

“It does make for a challenge when you have public pollsters who won’t share their methods appropriately with others,” says Rob Daves, a past AAPOR president. “Science is based on transparency, and I’m not just talking about social science.”


 
