Summary of Findings

Faced with a growing number of unsolicited telephone calls and armed with increasingly sophisticated technology for screening their calls, more Americans are refusing to participate in telephone polls than was the case just six years ago. Yet a survey research experiment to gauge the effects of respondent cooperation on survey quality indicates that carefully conducted polls continue to obtain representative samples of the public and provide accurate data about the views and experiences of Americans.

A typical five-day survey conducted by the Pew Research Center, employing standard techniques used by most opinion polling organizations, now obtains interviews with people in fewer than three-in-ten sampled households (27%). That represents a decrease of about nine percentage points (on average) from the late 1990s.1 The decline results from increased reluctance to participate in surveys and not from an inability by survey organizations to contact someone in a household.

The growing use of answering machines, voice mail, caller ID, and call blocking is not preventing survey organizations from reaching an adult in most of the households sampled. Across five days of interviewing, surveys today are able to make some kind of contact with the vast majority of households (76%), and this contact rate has not declined over the past seven years. But because of busy schedules, skepticism, and outright refusals, interviews are completed in just 38% of households that are reached using standard polling procedures. In 1997, a majority of those who were reached (58%) cooperated with the survey. The decline in cooperation was also seen in a separate survey, which had a much longer field period and used more rigorous survey techniques. In that poll, which was in the field for five months, 59% of contacted respondents cooperated, compared with 74% in 1997.
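The relationship between these figures can be seen as a simple product: the overall response rate is roughly the contact rate times the cooperation rate among contacted households. A minimal sketch using the rates reported above (this is a simplification; formal AAPOR response-rate definitions also account for dispositions such as numbers of unknown eligibility, which is why the product only approximates the reported 27%):

```python
# Approximate decomposition of a survey response rate:
#   response_rate ≈ contact_rate × cooperation_rate
# Simplified for illustration; AAPOR's formal rate definitions
# handle additional case dispositions, so the product only roughly
# matches the 27% response rate reported for the standard survey.

def approx_response_rate(contact_rate: float, cooperation_rate: float) -> float:
    """Share of sampled households yielding a completed interview."""
    return contact_rate * cooperation_rate

# Figures reported for the 2003 standard survey:
contact = 0.76       # some contact made within the five-day field period
cooperation = 0.38   # contacted households that completed the interview

print(f"{approx_response_rate(contact, cooperation):.0%}")  # ≈ 29%, near the reported 27%
```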

But the decline in participation has not undermined the validity of most surveys conducted by reputable polling organizations. When compared with benchmarks obtained from the U.S. Census and other government surveys with response rates that exceed 90%, the demographic and social composition of the samples in the average poll today is remarkably accurate.

Judged by their accuracy in forecasting voter behavior on Election Day, properly designed election surveys conducted just before voting continue to be highly valid.2

And even though a typical survey interviews only around one-in-four or one-in-three people it attempts to reach, there is little to suggest that those who do not participate hold substantially different views on policy and political issues.

As in its 1997 survey research study, the Pew Research Center experiment found little difference between a standard survey, conducted with commonly used polling techniques over a five-day period, and a survey conducted over a much longer period that employed more rigorous techniques aimed at obtaining a high rate of response. The rigorous survey obtained a response rate of 51%, compared with 27% for the standard survey. Yet a comparison of more than 90 separate measures covering a wide range of attitudes and behaviors found relatively small differences between the two surveys. The median difference was less than two percentage points, well within the margin of sampling error, and there was no clear pattern to the differences.
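For context on why a median difference under two points falls within sampling error: for a proportion near 50% in a sample of roughly 1,000, the margin of sampling error is about ±3 percentage points at 95% confidence. A sketch of that standard calculation (the sample sizes are those reported above; the formula assumes simple random sampling and ignores design effects from weighting, which would make real-world margins somewhat larger):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of sampling error, in percentage points, for a proportion p
    in a simple random sample of size n. Ignores design effects from weighting."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# Sample sizes reported for the two 2003 surveys:
print(f"standard (n=1000): ±{margin_of_error(1000):.1f} points")  # ±3.1
print(f"rigorous (n=1089): ±{margin_of_error(1089):.1f} points")  # ±3.0
# A median item-by-item difference under 2 points sits well inside either margin.
```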

Nonetheless, there are notable differences between typical survey respondents and the people who are hardest to reach in such surveys: those who were successfully interviewed only after multiple attempts, or who declined to participate on at least two occasions before ultimately complying. Some of these differences reflect the practical difficulties of polling. For example, the hardest to reach are less likely to be at home in the evening, when survey organizations conduct most of their telephone interviews. People who are reluctant to participate also are less engaged by politics and report voting at lower rates. Yet here again, there were no consistent attitudinal differences between typical survey respondents and those who are more difficult to interview.

About the Survey

The basic approach of the experiment was to compare the responses from a sample of people obtained through Pew’s usual methodology with a sample obtained with a more rigorous survey effort over a much longer field period. To do this, an identical questionnaire was inserted into two separate surveys. The “standard” survey was conducted among 1,000 adults from June 4-8, 2003, using the same amount of effort that would be applied to any Pew survey project. The rigorous survey was conducted from June 4 to October 30, 2003, and completed interviews with 1,089 people. To maximize its response rate, a number of procedures were implemented, as described on page 13 of this report. Because the two questionnaires are identical, personal attitudes, behaviors, and characteristics can be compared directly, though questions on which opinions were subject to change over the long field period of the rigorous survey (e.g., President Bush’s job approval or interest in news stories) are excluded from the analysis presented here.

In addition, we compared the opinions of people who were especially difficult to interview (494 respondents) with those who were more easily available and who readily cooperated. “Hardest to reach” cases had refused the interview at least twice before complying and/or required 21 or more calls to complete.3
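The “hardest to reach” rule stated above amounts to a simple disposition filter over each respondent's call history. A sketch of that rule, with hypothetical field names (`refusals_before_complete` and `call_attempts` are illustrative labels, not the study's actual variable names):

```python
from dataclasses import dataclass

@dataclass
class CaseDisposition:
    # Hypothetical per-respondent call-history fields, for illustration only.
    refusals_before_complete: int  # refusals converted before the interview
    call_attempts: int             # total dials needed to complete

def is_hardest_to_reach(case: CaseDisposition) -> bool:
    """Apply the study's stated rule: at least two prior refusals
    and/or 21 or more calls to complete the interview."""
    return case.refusals_before_complete >= 2 or case.call_attempts >= 21

cases = [
    CaseDisposition(refusals_before_complete=0, call_attempts=3),   # easy respondent
    CaseDisposition(refusals_before_complete=2, call_attempts=5),   # refusal conversion
    CaseDisposition(refusals_before_complete=0, call_attempts=25),  # many attempts
]
print([is_hardest_to_reach(c) for c in cases])  # [False, True, True]
```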

The methodology mirrors that of a 1997 study conducted by the Pew Research Center.

Acknowledgments

Several people provided valuable advice in the design of the study. An advisory committee chaired by Diane Colasanto included Richard Kulka, Warren Mitofsky, Richard Morin, Linda Piekarski, Mark Schulman, Evans Witt, and Cliff Zukin. Jonathan Best, Jon Rochkind, and Mary McIntosh of Princeton Survey Research Associates International provided methodological guidance for the project. The contribution of Survey Sampling International, which donated the telephone sample and demographic data for the project, is gratefully acknowledged.