Articles and Analysis


Cell Phones and Political Surveys: Part I

Topics: Cell Phones

Since the AAPOR conference back in May, I have promised several times to take a closer look at the challenge that the growth of "cell phone only" households is presenting to political polling. Today, finally, I am posting the first of a two-part review of some of the latest research.

As this item is a bit long even by Pollster.com standards, it continues in full after the jump.

Why don't pollsters call cell phones?

Pollsters routinely omit cell or mobile phone numbers from their samples. The mechanics of this are fairly easy, since mobile phone numbers typically use specific "exchanges" (the first three digits of a local telephone number).

Why exclude them? Just a few years ago, pollsters gained virtually nothing by dialing mobile phone numbers, since virtually all mobile phone users also had "wired" home telephone service. At the same time, survey researchers also faced both legal and logistical obstacles to interviewing respondents on their mobile phones: Most users pay for airtime, and so pollsters feel obligated to offer financial reimbursement or incentives, either as a matter of law (in some states) or simply as a practical means of obtaining a reasonable response rate. The federal Telephone Consumer Protection Act (TCPA) also bans any sort of unsolicited call to a cell phone using "automated dialing devices." Since virtually all pollsters now conduct their surveys using computerized systems, surveys with a cell phone component require that interviewers dial the telephone the old-fashioned way.

None of these obstacles has been an absolute barrier, but collectively they have limited virtually all telephone surveys to landline telephones.

How many "cell phone only" Americans are out of reach?

We know how many Americans are without wired telephone service because of a massive, ongoing, in-person survey conducted regularly by the Centers for Disease Control (CDC), called the National Health Interview Survey (NHIS), that now regularly asks about household telephone usage. The CDC cares about the growth of cell phone only households because they spend many millions of dollars on various telephone surveys (especially one known as BRFSS) that measure the health of Americans. And their NHIS survey is a great source of data because they sample more than 43,000 households annually with a response rate "greater than 90 percent of the eligible households in the sample."

The CDC's National Center for Health Statistics puts out semi-annual estimates on cell phone only households based on the NHIS survey. The most recent (also discussed here) estimated that 12.8% of American households (and 11.8% of all adults) had "only wireless telephones" during the second half of 2006. Including those without any telephone service at all (1.7% of adults), the survey estimated that 13.5% of adults were essentially out of reach of telephone surveys late last year.


Unfortunately, the CDC numbers released just weeks ago are already out of date because, as the chart above makes clear, the percentage of cell-phone-only households has been rising rapidly, with an upward turn since late 2005. I asked Stephen J. Blumberg, the primary author of the CDC report, about that apparent increase in the rate of growth. He reports (via email) that the "compound growth rate" in cell-phone only households and adults has remained "remarkably stable over time," growing by "22-23% every six months" and not slowing down:

If the compound growth rate in 2007 and 2008 remains as it was in 2003-2006, then more than 25% of U.S. households will only have cell phones during the second half of 2008 (that is, when pre-election polling will be at its peak).
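Blumberg's projection is straightforward compound growth, which can be sketched in a few lines of Python. The starting share and growth rates below are the figures quoted above; the extrapolation itself is just a back-of-the-envelope reading of his arithmetic, not the CDC's actual model:

```python
# Project the cell-phone-only share forward, assuming the ~22-23% compound
# growth per six-month period reported by Blumberg continues unchanged.
# (A rough extrapolation of the quoted figures, not the CDC's method.)

def project_share(start_share, growth_per_period, periods):
    """Compound a starting percentage forward by a fixed per-period growth rate."""
    return start_share * (1 + growth_per_period) ** periods

# Second half of 2006: 12.8% of households were cell-phone only.
# Four six-month periods take us to the second half of 2008.
start = 12.8
for rate in (0.22, 0.23):
    projected = project_share(start, rate, periods=4)
    print(f"growth {rate:.0%} per half-year -> {projected:.1f}% by H2 2008")
```

Either growth rate puts the projected share comfortably above the 25% threshold Blumberg describes.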

Of course, we may also understate the problem by focusing only on those with a cell phone but no wired telephone service. Many continue to pay for wired telephone service yet use it far less often in favor of their cell phones or Internet phone service. A study of adults in South Carolina presented at the recent AAPOR conference, for example, found that slightly less than a third (30.4%) of those with both landline and cellular service use the cell phone for "most" or "almost all" of their calls (Lambries, et al., 2007 -- though bear in mind that all were interviewed via landline phone for the study). The rising use of Internet phone service (services like Vonage and Skype, known technically as voice over internet protocol, or VoIP) threatens to exacerbate that trend. The key issue -- hard to determine from these data -- is how many Americans have simply stopped answering their landline phones altogether, thus putting themselves effectively out of reach of conventional telephone surveys.

So how much do the missing "cell-phone-only" affect political survey results?

The missing "cell-phone-only" (CPO) households create the potential for what pollsters call coverage error. That is, respondents who are not covered by the sample can cause an error (or a statistical bias) in the results, but the size of any such error depends on two factors:

  • The size of the uncovered population
  • How much the interviewed respondents differ from those that were missed

If all respondents are covered then, obviously, the potential for coverage error is zero. But even if a decent chunk of the population is not covered -- as is the case now with cell-phone-only households -- the error on any given question will still be zero if the uncovered population would have answered that question exactly the same as those interviewed. Error creeps into the results only to the degree that the covered and uncovered respondents differ. So in addition to the size of the cell-phone-only population, we also want to know something about the degree of differences between cell-phone-only and wired-phone respondents.
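The two factors combine in a simple way: the bias of a landline-only estimate equals the uncovered share multiplied by the difference between covered and uncovered respondents. A minimal sketch, with purely illustrative numbers:

```python
# Coverage-error logic from the paragraph above: a covered-only estimate is
# biased by (uncovered share) x (covered value - uncovered value).
# All numbers below are illustrative, not from any actual survey.

def coverage_bias(uncovered_share, covered_value, uncovered_value):
    """Bias of a covered-only estimate relative to the full population."""
    population_value = ((1 - uncovered_share) * covered_value
                        + uncovered_share * uncovered_value)
    return covered_value - population_value

# If 13% of adults are unreachable but would answer identically,
# the bias is zero no matter how large the uncovered group is.
print(coverage_bias(0.13, 50.0, 50.0))

# If the uncovered group differs by 10 points, the bias is 0.13 * 10 = 1.3 points.
print(coverage_bias(0.13, 50.0, 40.0))
```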

The NHIS report indicates some large and important differences as of the latter half of 2006. They tell us that the percentage of adults with only wireless telephone service (11.8% overall) tended to be highest among:

  • Adults living with unrelated roommates (54.0%)
  • Adults age 18-24 (25.2%) or age 25-29 (29.1%)
  • Adults renting a home (26.4%)
  • Adults living in poverty (22.4%)

The Pew Research Center graphic below provides vivid evidence of how the rise of cell-phone only households has dramatically affected their ability to reach younger respondents. The percentage of 18-34 year olds in their un-weighted samples has declined by roughly ten percentage points over the last four years or so -- the same time period in which the percentage of adults living in wireless only households has grown from 3% to 12%.


The good news is that this big age skew also provides pollsters with a tool to limit the damage, so I need to duck my own question for a moment to consider a different one...

What do pollsters do now to correct for missing "cell phone only" respondents?

Legend has it that after seeing a presentation of some of the data on the cell-phone-only problem last year, the late Warren Mitofsky suggested that the answer was spelled "w-e-i-g-h-t." That is, statistical weighting is the procedure pollsters use to correct any demographic bias in their results as compared to U.S. Census estimates for the population. For example, the Pew Research Center labeled the black line in the above chart as the "weighted parameter" because they weight each survey by age (and other demographic variables) so that it matches the estimates produced by the Current Population Survey (CPS) of the U.S. Census.

Let's use those Pew numbers above for a quick (and grossly oversimplified) explanation of how weighting works: The most recent values from the chart suggest that 18-34 year olds make up roughly 20% of the unweighted adult samples, yet the Census estimates tell us that they are 30% of all adults. So to correct the age distribution, pollsters would multiply (or weight) every respondent between the ages of 18 and 34 by roughly 1.5 (30% divided by 20%), while weighting down older respondents by roughly 0.9 (70% divided by 80%). Pew and other pollsters weight by many different demographic variables, so the actual weighting procedure gets a lot more complicated, but that's the general idea.
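That toy calculation can be written out directly. The shares below are the rough figures from the paragraph above; real weighting schemes (raking across several variables at once) are considerably more involved:

```python
# Toy demographic weighting: each respondent's weight is the census share of
# their age group divided by that group's share of the unweighted sample.
# Shares are the rough illustrative figures from the text, not Pew's actuals.

sample_shares = {"18-34": 0.20, "35+": 0.80}   # unweighted sample composition
census_shares = {"18-34": 0.30, "35+": 0.70}   # population target

weights = {group: census_shares[group] / sample_shares[group]
           for group in sample_shares}
print(weights)  # 18-34 up-weighted by roughly 1.5, 35+ down-weighted to ~0.875

# After weighting, the sample matches the census age distribution:
weighted = {group: sample_shares[group] * weights[group]
            for group in sample_shares}
print(weighted)
```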

OK, so taking weighting into account, how much do the missing "cell-phone-only" respondents affect political survey results?

Here is where this topic gets a bit more complicated. Weighting by age means that within each age group we are, in effect, replacing those living in cell-phone-only households with others of the same age with landline phone service. So if, for example, 18-34 year olds in cell phone only households have the same attitudes on a given question as those 18-34 year olds with landline service, then weighting by age will largely "correct" a skew that might result in questions about those attitudes. If, however, differences remain between those living in cell-phone-only and landline households within age groups, an error will remain in the results even after weighting by age.

A snapshot of the problem from the 2004 exit polls, as described by the Pew Center's Scott Keeter, provides a great example. On a question included on the national exit poll, 7% of voters said they lived in cell-phone only households, and those voters preferred John Kerry to George Bush by a nine-point margin (53.7% to 44.7%). But because the cell-phone only population was still relatively small, Bush's lead among those with landline phones "was only 1.3 percentage points larger than in the electorate as a whole: 3.7 points compared with the 2.4 points by which he actually won the election."
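The arithmetic behind that comparison is a weighted blend of the two groups' margins. A quick sketch using the rounded figures above (the blend lands near, though not exactly on, the official 2.4-point margin, because the inputs are rounded):

```python
# Blend the exit-poll margins from the paragraph above: the overall margin is
# a weighted average of the cell-phone-only and landline margins.
# Figures are the rounded numbers Keeter reports, so the result is approximate.

cpo_share = 0.07          # cell-phone-only share of 2004 voters
cpo_margin = 44.7 - 53.7  # Bush minus Kerry among cell-only voters: -9.0 points
landline_margin = 3.7     # Bush's margin among landline voters

blended = cpo_share * cpo_margin + (1 - cpo_share) * landline_margin
print(f"blended Bush margin: {blended:.1f} points")  # close to the actual 2.4
```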

However, the data also showed that cell-phone-only voters "were more similar to all voters within their age cohorts, especially among the younger groups where the prevalence of [cell-phone-only] voters was the greatest" (emphasis added). As the table below indicates, the 1.3 percentage point difference between all voters and those with landline phones narrowed to just 0.5 percentage points within age groups. If they simply weighted by age, Keeter wrote, pollsters would have "eliminated most of the bias that could occur from the underrepresentation of younger, more liberal voters."


Keeter and his colleagues at the Pew Research Center and the Associated Press greatly extended this analysis last year using four telephone studies that included samples of both landline and cell-phone numbers (as summarized here and in more detail here). The four studies covered "a very wide range of topics, including use of technology, media consumption, political and social attitudes, and electoral engagement."

While they found statistically significant differences between landline and cell-phone-only respondents, once they weighted the landline sample to Census estimates for age and other demographics, they found that "none of the measures would change by more than 2 percentage points when the cell-only respondents were blended into the landline sample."

In fact, 41 of the 46 measures changed either not at all or by only a single percentage point once the cell-phone-only samples were "blended" in (these included party identification and generic vote preference for the U.S. House). The five measures that changed by 2 percentage points are noteworthy:


Notice that three of the five involved voting or political awareness. Cell-phone-only respondents were much less likely to report being registered to vote (49%) than those with landline service (76%), with differences nearly as high for having ever voted "in your precinct" or awareness of the Republican majority (just before the 2006 elections). Since these differences are also highly correlated with age, weighting by age reduced the difference to just two percentage points. Also -- a bit of good news for the accuracy of pre-election surveys -- the missed cell-phone-only adults are less likely to be voters.

Even owning a cell phone -- something obviously and directly related to being in a cell-phone-only household -- produces only a two point change.

But notice that one item, the percentage reporting having been "online yesterday," produces a 2 percentage point difference between the landline and blended samples, even though the difference between landline and cell-phone-only samples is much smaller than for the other items (just five percentage points). Why? Because in this case, the difference occurs within age groups -- particularly among the youngest cohort -- so weighting by age has less effect. Pew conducted a separate study among 18-24 year olds and found the biggest differences between younger landline and cell-only respondents on technology usage (see table 5).

So what have we learned? For the moment, the Pew Center studies tell us that including samples of "cell-phone-only" Americans produces survey results that "are nearly identical to those from the landline sample alone." Why? Scott Keeter writes:

[T]he cell-only population remains a relatively small part of the total population and the fact that they are not dramatically different on most measures from landline respondents, especially those with the same age cohort. At 11.8% of all adults, they are significant but still a small minority. While distinct, they are not so different that their presence or absence can shift the total estimates by a noticeable amount.

Of course, two big cautions remain. First, these differences can be greater when we are looking only at surveys or subgroups of younger Americans, although Keeter et al. found in their 2006 study that "explicitly political attitudes" remain "generally unrelated to telephone status" even among 18-24 year olds.

Second, the size of the cell-phone-only population continues to grow rapidly, and may well nearly double in size between the fall of 2006 and the fall of 2008. While we can only speculate about the future, it is likely that the "cell-phone-only" population will become less distinct as it grows in size. "As more people ditch land lines," Michelle Tsai wrote recently in Slate, "the cell-only subset could start to resemble the general population." Either way, this phenomenon is something we will need to continue to watch as we get closer to the 2008 election.

Keeping tabs on cell-phone-only households brings me to the topic of how pollsters conduct surveys via cell phone, which I take up in Part II.




Sounds like me YIKES !



Well, why don't the pollsters simply ask landline respondents if they have cell phones too? This way, they could weight their samples better.


Jeff Baker:

I am one of those individuals who only has a cell phone, although I do not fit the norm (being 43). For about four years prior to losing my landline, the only use I had for the landline was as an internet connection. There was no phone hooked up to the phone line, only my computer. When I switched to cable for internet service I dropped the landline. Basically, for four years, I would have been considered as having a landline, even though I never placed a phone call on it.

I suspect though, that as cell phone only use has increased, the percentage of people using dial-up to access the internet has decreased.



Have there been any similar studies to determine the demographics of those who actually participate in phone surveys? As a dedicated caller-ID phone call screener who seldom answers if I don't recognize the caller, I would tend to believe that there are quite a number of folks (particularly in the younger generation) who simply don't answer their land-line phones when pollsters call. Even those who do pick up may not be inclined to sit through a poll, but rather decline or just hang up. There are so many stealth marketing surveys out there that dress themselves up as legitimate polls that I don't bother trying to determine who the pollster really is anymore - I simply say no thanks. I don't think I am alone. Any data out there?


Ted Gerard:

...if we can thus reliably assess the views of primary 'Cell-Phone-Users' without actually sampling them -- then why are 'random' samples of a target population pursued at all in professional opinion polling ?


Chris G:

"it is likely that the "cell-phone-only" population will become less distinct as it grows in size"

I'd argue the opposite is true if cell-only in fact is linked with important political characteristics. Just look at that plot that shows the decline in % of 18-34 in survey samples. It suggests that cell-only is rising at a much faster rate among younger voters, and even in this group the % of cell-only is still at a mere 25%. As this rises it could produce *more* bias as the trend continues, not less. And younger voters are only the low hanging fruit. Maybe over the next 10 years the trend will begin to rise more quickly among older voters that, for example, live in the city (which is another important political factor).

Let's put it this way--suppose 10-15 years from now 75% of the electorate is cell-only. Then we'll be focusing on those left behind and who *they* are. Maybe Evangelicals and older voters will be disproportionately represented then. No one knows, but I think something like this is a distinct possibility.



Hi Mark,

Long time no write (since I dropped out of the blogging world). I still read you, though.

This jumped out at me:

'with a response rate "greater than 90 percent of the eligible households in the sample."'

Remarkable, wouldn't you say? If this sort of response rate is attainable, would you not say that figuring out why they can attain it whilst political pollsters cannot would be a valuable exercise?

I have to think it is a matter of people thinking there is no agenda (beyond their own health) being chased, that the survey itself has intrinsic value, etc. But I *think* that, rather than *know* that. Knowing the why strikes me as extremely important.

Any ideas for figuring out the "why"?


Re: Computerized dialing. When I get a call that is dialed by a computer (it's easy to tell by the gap after answering) I do not take that call. If a human being can't be bothered to be on the line when I answer, I don't feel any obligation whatsoever to speak. I'm sure I'm not the only one.



Rick: It depends on how the predictive dialing system is set up. We use a couple at work, but ours is set up to not place calls unless there is an agent available. It all depends on costs vs. the number of abandoned calls you're willing to tolerate, and how much you're willing to upset the people you're trying to reach.

Personally, when I get one of those calls, I count how long it takes for a live person to get on the phone, and I refuse to respond to them until I count off that many again. My time is valuable, and if they're willing to waste my time, I'm willing to waste their time, and it ends up costing them more than the abandoned call.


michelle :

Okay yeah I am in 7th grade looking up facts for an assignment no offense but this whole site is soo confusing. Can't you use more graphs????




Randy Iowa:

Is there a Do Not Call list that I can get on? I have received a survey call every day this week and at least one candidate has called every day as well.



