
Cell Phones and Political Surveys: Part II

Topics: Cell Phones

Two weeks ago, I took a long look at the cell-phone-only problem and whether the absence of those without landline phones is affecting survey results. Today, I want to conclude with a look at how pollsters conduct surveys via cell phone. Like Part I, this article is long, even by Pollster.com standards, so it continues after the jump.

This year's annual conference of the American Association for Public Opinion Research (AAPOR) featured a special "track" of presentations on lessons learned from a variety of pilot tests of surveys that interviewed respondents on their cell phones. Just to provide a sense of their depth and breadth, here is a list of some of the most recent pilot tests presented (full details on offline references appear at the end of this article):

  • Studies conducted in three states (GA, NM and PA) this year as part of the Centers for Disease Control and Prevention's 50-state Behavioral Risk Factor Surveillance System (BRFSS) surveys (Link et al., 2007).
  • Studies in six states (MA, FL, NJ, FL, MT & TX) conducted by the research firm Macro International on behalf of clients including the BRFSS and the Adult Tobacco Survey (ATS) (Dayton et al., 2007; ZuWallack, 2007).
  • A study conducted in 2006 for the California Health Interview Survey (CHIS) (Brick et al., 2007).
  • A study conducted this February by the National Highway Traffic Safety Administration as part of the Motor Vehicle Occupant Safety Survey (MVOSS) (Boyle et al., 2007).
  • Studies conducted since 2002 by Arbitron, including a pilot survey of more than 9,000 respondents interviewed on their cell phones in 2005 (Fleeman, 2007).
  • Pilot studies conducted in 2005 by the British firm GfK NOP (Moon, 2007).

The point is that the most scientifically rigorous survey methodologists are doing considerable research and development on how to conduct interviews over mobile phones. Their presentations at the AAPOR conference provide insights on the technical challenges they are grappling with. What follows is a summary of some of those issues.

Sampling -- The process of drawing a random sample of cell phone numbers is relatively easy and essentially the same as the "random digit dial" (RDD) technique used to sample landline telephone numbers. Here's the oversimplified version of how it works: Companies like Survey Sampling and Marketing Systems Group obtain listings of the special telephone exchanges (the first three digits of a seven-digit telephone number) assigned for cell phone usage, then randomly generate numbers from the pool of all possible numbers within those exchanges.
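
To make those mechanics concrete, here is a minimal sketch of the generation step in Python. The exchange list and function name are hypothetical stand-ins; in practice the exchange listings come from sampling vendors like the ones named above, and real designs sample exchanges in proportion to their assigned number banks.

    import random

    # Hypothetical (area code, exchange) pairs assigned to cellular carriers.
    # In practice these listings come from the sampling vendors named above.
    CELL_EXCHANGES = [("404", "555"), ("505", "556"), ("717", "557")]

    def draw_cell_rdd_sample(n, exchanges=CELL_EXCHANGES, seed=None):
        """Draw n distinct random numbers from the pool of all
        possible numbers in the given cell exchanges."""
        rng = random.Random(seed)
        sample = set()
        while len(sample) < n:
            area, exchange = rng.choice(exchanges)
            suffix = rng.randint(0, 9999)  # last four digits: 0000-9999
            sample.add(f"{area}-{exchange}-{suffix:04d}")
        return sorted(sample)

    print(draw_cell_rdd_sample(5, seed=1))

Running this prints five formatted numbers drawn at random from the three toy exchanges; a production sample would, of course, be far larger.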

One key difference is that cell phone samples cannot use "list-assisted sampling," a process that makes landline RDD samples more efficient by using computerized telephone directory listings (think "White Pages") to count the listed numbers in each exchange. Without such data, and the cost efficiencies that result, cell phone surveys are more expensive to conduct.

Dialing -- The federal Telephone Consumer Protection Act (TCPA) bans all unsolicited calls to cell phones made using "automated dialing devices," and most pollsters use computerized systems to dial the numbers and administer the questionnaires. The obvious solution -- having interviewers hand-dial each number -- slows down the process and increases costs (it also effectively precludes doing cell phone surveys with an automated, interviewer-free IVR methodology).

Cooperation -- As most of us know, cell phone users pay for most of the "minutes" they consume and carry their phones in the midst of other activities, so pollsters have long assumed that respondents would be more reluctant to cooperate with a cell phone survey than with a landline survey. As such, most of the pilot studies offered each respondent a cash incentive ranging from $5 to $20, depending on the project and the length of the survey. A few of the government-sponsored pilot studies tested the impact of such incentives and found mixed evidence; in some cases the incentive did not seem to boost participation.

Nonetheless, in virtually all of the studies, response rates were slightly lower for the cell phone surveys than for their landline counterparts, even with financial incentives. For example, the Pew Research Center reported an average response rate of 23% on their three cell phone surveys of adults (which offered respondents a $20 incentive), compared to 29% on the landline surveys (which offered no incentive). Interestingly, the pattern was reversed on their "GenNext" survey of 18-24 year olds, where the response rate was higher on the cell phone survey (29%) than on the landline survey (25%).

Driving and Liability -- What if the respondent is driving a car when the pollster calls? Could the pollster be subject to legal liability for potentially distracting the driver? Most of the pilot studies immediately ask if the respondent is "currently driving a car or doing any activity that requires your full attention," and if so, offer to call back at a later time. The ORC pilot studies found that roughly a third (32%) of their initial contacts were driving at the time (Dayton et al., 2007). Needless to say, that extra screening and rescheduling also adds to the cost of interviewing.

Costs -- All of the challenges listed above make for higher costs for cell phone surveys. Another cost factor is the added per-interview cost of screening out those with landline service (if the purpose of the cell phone study is to interview only those out of reach of landline surveys).

Pew's Keeter reports that with the incentives included, the per-interview cost of their cell phone studies was "approximately 2.4 times" the cost of comparable landline surveys. If the cell phone sample screens to identify cell-phone-only respondents, "the per interview costs are four to five times as large." A paper on the BRFSS pilot studies reported very similar cost differences (Link et al., 2007).
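
To see how the screening math compounds, consider a back-of-the-envelope sketch in Python. Every dollar figure and the cell-only incidence rate below are invented assumptions, chosen only so the arithmetic lands near the ratios Keeter describes:

    # Toy cost model; every number here is a hypothetical assumption.
    landline_cost = 50.0              # assumed cost per completed landline interview
    cell_cost = landline_cost * 2.4   # Pew's reported ~2.4x ratio for cell interviews

    screener_cost = 30.0              # assumed cost of one short screening contact
    cell_only_rate = 0.25             # assumed share of cell respondents with no landline

    # On average, 1 / cell_only_rate screeners must be completed to find
    # each cell-phone-only respondent.
    per_cell_only = cell_cost + screener_cost / cell_only_rate

    print(f"Unscreened cell interview:     ${cell_cost:.0f} ({cell_cost / landline_cost:.1f}x landline)")
    print(f"Completed cell-only interview: ${per_cell_only:.0f} ({per_cell_only / landline_cost:.1f}x landline)")

With these made-up inputs, an unscreened cell interview runs 2.4 times its landline counterpart, while a completed cell-phone-only interview runs nearly five times as much, because most screening contacts reach dual users who are then discarded.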

Geography -- One of the potential pitfalls of a cell phone sample is the weaker association between telephone numbers and geography, a particular problem for surveys of states or smaller areas. Cell phone subscribers may buy a phone in a location far from their home, or may keep their cell phone number even when moving from one state to another. Arbitron found, for example, that 32% of the numbers in their cell phone study "were not in the county associated with the cell phone number," as compared to 6% in their comparable landline sample (Fleeman, 2007).

Screening & Weighting -- Aside from cost, one of the biggest challenges is figuring out how best to combine the cell phone and landline samples. The root problem is that survey researchers do not have much experience with, or data on, the probabilities of reaching various kinds of cell phone users. Some users carry their phones everywhere and answer calls at all hours; others may use them only during daytime hours or turn them on only for emergencies.

The pilot studies conducted by the Pew Research Center interviewed everyone they reached via cell phone, thus allowing for a comparison of the responses of "dual users" (those with both kinds of telephone service) reached by either cell phone or landline survey. Theoretically, we should see no differences because the target population (dual users) should be identical. In practice, however, significant differences emerge. A paper presented at the AAPOR conference by former Pew analyst Courtney Kennedy showed that dual users interviewed by cell phone tended to be younger, "reflect[ing] heavier usage of cellular phones among young people."

But what if pollsters screen out dual users and interview only those in "cell phone only" households? Even here, they may over-represent heavier cell phone users. Kennedy reported that she could reduce bias (as measured against in-person studies) in a combined sample by weighting respondents according to self-reported cell phone and landline service, in addition to standard demographics.
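
As a rough illustration of that idea, here is a minimal post-stratification sketch in Python. It is a simplified stand-in for Kennedy's approach, not a reproduction of it: the population and sample shares are invented, the phone-service benchmarks would in practice come from an in-person survey, and the adjustment would be raked together with standard demographic weighting.

    # Post-stratify a combined landline + cell sample to external benchmarks
    # for phone-service status. All shares below are invented for illustration.
    population_share = {"landline_only": 0.15, "dual": 0.70, "cell_only": 0.15}
    sample_share     = {"landline_only": 0.20, "dual": 0.72, "cell_only": 0.08}

    # Each respondent's weight is the ratio of the population share to the
    # sample share for his or her self-reported service category.
    weights = {cat: population_share[cat] / sample_share[cat]
               for cat in population_share}

    for cat, w in sorted(weights.items()):
        print(f"{cat:13s} weight = {w:.2f}")

Under these toy numbers, under-represented cell-only respondents get weighted up (about 1.9) while over-represented landline-only respondents get weighted down (about 0.75).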

* * *

So what is the bottom line? Surveys via cell phone are feasible, but they are much more expensive than landline surveys, and some methodological kinks (like weighting) have yet to be worked out. Supplemental cell-phone interviewing is going to be important for the multi-million-dollar government surveys that track health and health-related behavior (including some measures that currently show statistically significant bias when the cell-phone-only population is missed). However, it is not yet clear that very expensive supplemental RDD cell-phone samples will make a noticeable improvement in routine political studies over the next year or two (see Part I). The cost alone puts this approach out of reach for most media and internal campaign surveys.

I asked some of the national political pollsters about their plans in this regard. Not surprisingly, the Pew Research Center's Scott Keeter was the most open about their plans:

We are currently discussing our plans for cell phone research in the coming year. It's possible that we will do some type of major study, but I am pretty sure that we will include cell phone samples -- or cell-only samples -- in a few surveys conducted between now and the November elections. We do not intend to make cell samples a standard feature of our sample designs. It is too costly, takes longer, and the weighting must be regarded as experimental for now.

California's Field Poll has also been using a registered voter list sample (RBS) to interview some respondents via cell phone (possible because many Californians provide a cell phone number when they register to vote). Most of the others, as per standard practice, prefer not to comment (either on or off the record) about their future plans. However, it is worth noting that organizations such as Gallup and ABC News conducted post-Katrina cell phone surveys in the Gulf Coast states, and thus have gained experience with the methodology.

So what do we do about the cell-phone-only problem? Those of us who obsess over political polls need to keep a close eye on the special cell phone surveys conducted by the Pew Research Center and, perhaps, by other pollsters. These will provide invaluable clues as to whether the cell-phone-only problem is creating any sort of consistent error in political surveys.

References

All of the following papers were presented at the annual meeting of the American Association for Public Opinion Research, Anaheim, California, in May 2007:

Brick, J. Michael, Sherman Edwards, Sunghee Lee. "Sampling and Interviewing Methods for Cell Phone Surveys"

Boyle, John M., Alan Block, Eunyoung Lim. "Cell Phone Augmentation of a National RDD Survey."

Dayton, James J., Cristine Delnevo, Susan J. Cummings, Zi Zhang, Lori Westphal, Michelle Cook, Diane Aye, Randal S. ZuWallack, Naomi Freedner, Ruth Burnstein. "Cell Phones and Public-Sector Survey Research: Are Incentive and/or Compensation Offers Really Needed?"

Fleeman, Anna. "Survey Research Using Cell Phone Samples: Important Operational and Methodological Considerations."

Keeter, Scott, Courtney Kennedy, Trevor Tompson, Mike Mokrzycki, April Clark. "What's Missing from National RDD Surveys? The Impact of the Growing Cell-Only Population."

Kennedy, Courtney. "Constructing Weights for Landline and Cell Phone RDD Surveys."

Link, Michael W., Michael P. Battaglia, Martin R. Frankel, Larry Osborn, Ali H. Mokdad. "Conducting Public Health Surveys over Cell Phones: The Behavioral Risk Factor Surveillance System Experience."

Moon, Nick. "Broken Voices on Broken Phones: Can Interviewing Be Done by Cell Phone, and at What Cost?"

ZuWallack, Randall S. "Combining Cell Phone and Landline Samples: Dual Frame Approach."

 

Comments
Dan:

Since I would have to pay for a minute of airtime before I hung up on the pollster if they ever called my cellphone, I hope they plan to write checks to cover the airtime anyway.

If a pollster ever was so rude as to call my cellphone (possibly while I was driving, etc), I would

a) Hang up immediately
b) File a complaint with the FCC if I didn't receive a check within a week

I'd only be ok with this if they got PRIOR permission...

____________________

George B:

I discontinued land line phone service specifically because of having to deal with unsolicited calls to voice mail. If someone I don't know cold called my cell phone and offered me money I would assume it was a scam and hang up.

The ultimate poll would be a blended on-line survey of Wal-Mart and Starbucks customers. These retailers could offer discounts on future purchases in exchange for the customer's time and use of the discounts at local stores would allow verification of identity and location.

____________________

Scott :

The root problem is that key demographics (18-30) may be underrepresented if pollsters do not adapt their methods. Blending online results with phone samples may work, but the result is no longer random. Perhaps there is some way for a database to be created, such as a poll authorization-to-call list (signups for incentives may help this).

I am a single-phone user, cell only, who resides on the opposite coast from my number, which would really mess any survey demographics up! =)

____________________

Gary:

Start with a random sample of residential addresses, contact them initially with a postcard, make it worth their time to participate, and interview them by phone (cell or landline), in person, or by mail -- whatever is necessary. Do enough follow-up to get acceptably high response rates.

____________________



