Polling: a look inside the machinery of public opinion surveys

Polling: Behind the scenes at Gallup, interviewers and editors try to find out how Americans will vote on election day. With the media's dependence on public opinion statistics, news consumers must educate themselves about which surveys provide valuable data and why.

Art by Laura Smith/Special to The Christian Science Monitor

This is the cover story of the July 9 and 16 issue of The Christian Science Monitor Weekly magazine.

It's dinner time in America.

Somewhere parents and children sit down together to eat, to talk through their respective days, to enjoy precious shared time. The phone rings.

"Hi, I'm Michael Jablonski," says a man whose name is no doubt unfamiliar to the woman who answers, "and I'm calling about a Gallup poll.... We'd like to include your opinions. Can you help us out?"

His inquiry – from a Gallup interviewer sitting in a phone center in a city office park here in Omaha, Neb. – is one of many thousands that citizens will field from interested survey companies as the 2012 White House contest ramps up.

Their answers will mirror the mood of the country, but the mood of the country can very well be affected by the poll itself.

Public opinion surveys have become a ubiquitous element of American political culture. The numbers – some reliable, others less so – are pawed over and interpreted for headlines, insight, and horse-race drama by newspaper reporters, cable news talking heads, and bloggers of all party persuasions.

With each successive election of the modern age, there are more organizations looking to mine what the adult voter believes about the incumbent president and his rival, the state of the nation, and a host of issues from the economy to gay marriage.

With the media's dependence on poll numbers – and the sheer frequency with which those numbers are collected – news consumers must educate themselves about which surveys provide valuable data and why.

"Polling is important because it gives every voter and every nonvoter an equal chance of having their voice represented," says Scott Keeter, the director of survey research at the Pew Research Center. "When properly done, without bias and malice, polls can give you a view of what the public is experiencing or wanting, which you don't get from interest groups or the candidates or even elections, which are very blunt instruments."

Gallup – by the estimation of most in the industry – consistently generates quality data. Editors with the 77-year-old privately held survey company agreed to let the Monitor take an inside look at the process of putting together its June political poll, which assessed the role of religion in the presidential race as well as a range of other matters, including public views on the economy and the two men who want to lead this nation.

The 10-day endeavor of creating and conducting the poll involved Gallup staff in four states and Washington, D.C. Their work shows how important the human element is in shaping and thoughtfully interpreting a survey. Their efforts also illuminate the many junctures at which polls can be manipulated by those more interested in spinning numbers.

For example, question wording and question order are vital to a valid survey. So are well-trained callers, like Mr. Jablonski, and the quality controls that guard their practices. A proper balance of land lines and cellphones matters in collecting a representative sample. How a data set is weighted on the back end – made, via statistical calculations, to resemble the adult population of the nation – is critical, too.

Good polling is, in its own way, as intricately detailed as "successful heart surgery," says Frank Newport, the editor in chief of Gallup and the immediate past president of the American Association for Public Opinion Research: "Failures, wrong decisions, or low quality in any of these phases of the process can negatively affect the objective of using carefully selected samples of respondents to accurately represent the attitudes and self-reported behavior of an entire population of citizens."

So with this mission in mind, if Jablonski calls, should you steal away from your family to take the time to answer his questions? And if you don't, should you care how neighbors, friends, or faraway strangers responded?

Are polls a civic duty?

"Being called to do a survey is a privilege, not a burden," says Jon Krosnick, a Stanford University professor and expert in the psychology of political behavior and survey research methods. "Polls are taken so seriously that being given the opportunity to express your point of view is meaningful. It is a valuable opportunity, and it will make a difference."

George Gallup, a grandfather of the contemporary polling movement, believed as much. A veteran of the advertising world who held a PhD in psychology, Mr. Gallup founded the precursor to the company, the American Institute of Public Opinion, in 1935.

In that era, polls were conducted in-home, face to face by interviewers across the nation. The emergence of telephones in homes and the decrease in cost of calls changed that practice by the 1970s, making data more easily and cheaply obtainable and more timely, wrote Pew Research Center president Andrew Kohut in a 2009 history of public opinion surveying.

Over the past few presidential election cycles, cellphone use – which varies among ages and other demographics – has challenged pollsters to find a new way to reach, and effectively sample the opinions of, Americans.

This year, as President Obama faces a challenge from Republican Mitt Romney, almost a third of American homes are cellphone only. During the 2004 cycle, that number was around 5 percent, according to the Centers for Disease Control and Prevention's National Health Interview Survey.

One common – and some say questionable – polling technique in this and recent election years is automated calls. Robopolls, as they are known, can be answered by anyone; a computer that records respondents' views can't determine if those interviewed are who they say they are. These surveys are dramatically cheaper and are used by campaigns, local TV stations, and other polling outfits. They are generally viewed by experts as less reliable, though as people make up their minds closer to the election, they can be relatively accurate.

The public can seek out a reputable poll's methodology – easiest to do when browsing results online – to determine if a poll was automated.

Andy Smith, director of the University of New Hampshire Survey Center, says with a laugh that he puts his 11-year-old on the phone to answer robopolling questions. Obviously, Mr. Smith's son is not of voting age.

"I'd just keep away from them," Smith says of the surveys, suggesting they do a poor job of communicating the public's position on issues.

Gallup uses humans to craft its polls and conduct them. It's the people at Gallup who, at key junctures, bring critical human judgment to the enterprise.

Gallup's Mr. Newport, based in Washington, and his colleagues – managing editor Jeff Jones in Princeton, N.J., and senior editor Lydia Saad in Connecticut – were responsible for crafting the June survey.

During a 90-minute conference call – conducted with the efficiency, shorthand, and wonkish tone of longtime colleagues – they reviewed a core set of questions asked monthly since 2001. Unlike others in the field who might seek quick headlines with data, Gallup's team is looking to explore what Americans are feeling now and to establish a long-term trend of opinion (which itself can produce headlines, too).

The three wended their way through pages of possible questions, including those about Mr. Obama's approval ratings, perception of the job market, support for third-party candidates, and top reasons the respondent is supporting a particular presidential candidate.

With five months until voters go to the polls, they thought June marked a good time to assess the role of religion in the presidential contest – and Newport noted that he had gotten calls from journalists, academics, and regular citizens asking about the issue.

Mr. Romney is, as election watchers know, a Mormon. Newport, Mr. Jones, and Ms. Saad thought it wise to update the trend on Romney by asking voters if they would support a generally well-qualified candidate who happened to be Mormon. Much as Obama's race could have been a roadblock for him last cycle, there's interest in determining if Romney's religion might be a sleeper factor this year.

"I think religion is going to be at least subliminally an issue in this election," Newport said on the conference call.

While the public has shown growing acceptance over time for black, female, Roman Catholic, and gay candidates, the numbers for Mormons have remained "flat historically," Jones said. When Romney's father, George Romney, ran for the Republican nomination for president, an April 1967 Gallup poll indicated that 17 percent of Americans would not support a Mormon's candidacy.

While question substance is one goal of the call – and a follow-up conversation a day later – another is to make sure the questions asked will prompt a phone conversation of 18 minutes or less. Gallup research indicates that respondents tend to tune out or jump off a call at that point.

Mr. Krosnick of Stanford notes that this preliminary part of the poll process is key because poor question construction can create bias and ultimately influence the data. Surveys that pose agree-or-disagree, true-or-false, or yes-or-no questions typically push people in the affirmative direction, he says.

Other questions are leading, such as this example Krosnick suggests: "Some people believe that Barack Obama was born outside the United States. How much do you agree or disagree?"

The question sends a signal that the interviewer – or survey sponsor – might have an agenda. A more neutral way to ask would be to also include a mention that others believe Obama was born in the US.

Yet another way to skew results, says Krosnick, is to target respondents who favor a candidate or issue. "If you prefer to slant a poll in a particular direction, then you can make decisions about who to go after and how to go after them," he says. "If a candidate does better in higher socioeconomic status [areas], use more respondents in [those areas]."

Science of probability

Once Newport and his colleagues formulate poll questions, the survey is sent to Tara McGhee, a survey design editor in Omaha. Ms. McGhee takes every path through the survey to make sure it's programmed correctly; in other words, depending on a respondent's reply – if he or she offers support for Romney or Obama, for example – that person is branched through the survey questions differently. Follow-up questions are prompted by their answers. She also checks spelling and grammar.

"We are expected to be perfect," says McGhee, who has been with Gallup for 10 years. "We are the last set of eyes on it. My personal mission is quality."

Stephanie Morrow, in the firm's nearby Lincoln, Neb., call center, is in charge of getting the teams of individuals in five offices – including Houston, where the Spanish-language outreach is done – ready to make their calls. About 100 interviewers worked the June political poll. As the survey got started – it ran for four nights, from Thursday, June 7, to Sunday, June 10 – Ms. Morrow and her team did random quality checks, listening to recordings of the calls.

As it has done for years, Gallup outsources a critical part of the polling process – sampling – to a firm called Survey Sampling International, which has an office in Connecticut.

This is an element of the undertaking that most respondents and citizens are blind to – and it's vital. Sampling is the process of finding a random group of Americans – 1,004 in the case of the June Gallup poll – whose responses represent the same views that would be obtained if every American adult were interviewed. (Sample sizes do vary among reputable outfits and can be smaller, though experts believe the number used by Gallup, and others, is a good one.)
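The article doesn't state the margin of error Gallup reported for this sample, but the standard 95 percent confidence formula for a proportion shows why roughly 1,000 respondents is the industry's workhorse number. A minimal sketch (the worst-case assumption p = 0.5 gives the widest interval):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion p from a
    simple random sample of size n. p=0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# For the June poll's sample of 1,004 adults:
moe = margin_of_error(1004)
print(f"+/- {moe * 100:.1f} percentage points")  # roughly +/- 3.1 points
```

Quadrupling the sample only halves the interval, which is why pollsters rarely go far beyond 1,000: the added cost buys little extra precision.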

Survey Sampling International gives Gallup a set of listed and unlisted phone numbers. And those numbers are entered into Gallup's sampling and interviewing software and then called automatically by the system. Gallup interviewers take over from there.

The process involved in choosing those numbers is called random-digit dialing. The scientific premise – the equal probability of selection principle – is that if all possible homes, in other words every number in America, are available to be randomly called, the survey will produce a collection of answers that can be generalized to reflect the entire population.
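The principle above can be sketched in a few lines of code. This is a toy illustration only: real sampling firms draw from frames of known working number "banks" and apply many more screens, and the area-code strings here are hypothetical placeholders.

```python
import random

def random_digit_dial(working_banks, k):
    """Toy random-digit-dialing sketch: pick a working 'bank'
    (area code + exchange + first digits of the line number),
    then append two random digits. Because the last digits are
    random, unlisted numbers have the same chance of being
    dialed as listed ones - the heart of equal-probability
    selection."""
    numbers = []
    for _ in range(k):
        bank = random.choice(working_banks)
        numbers.append(bank + f"{random.randrange(100):02d}")
    return numbers

# Hypothetical Nebraska-style banks, for illustration:
sample = random_digit_dial(["402-555-01", "308-555-23"], 5)
```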

Note, too, that businesses must be weeded out of any call list.

"The power of random sample is fairly miraculous," Newport says.

Sampling has been complicated by the rise of cellphones. In Gallup's June survey, 40 percent of respondents were reached on a cellphone. Young people are more likely to have only a cellphone, while older Americans tend to still have land lines.

Michael Traugott, a professor of political science and communication studies at the University of Michigan, says that though it can cost twice as much to reach people on cellphones – usually calls have to be made several times – it's imperative to include that population.

"If there is a greater proportion or probability of young people and minorities adopting cellphones early, earlier than the rest of the population, and you have a candidate who is strongly favored among both, like Barack Obama, you could underestimate his support if you exclude cellphone-only or cellphone-primarily people from your sample," Mr. Traugott says.

Experts caution consumers to be wary of the results if a poll doesn't contact respondents via both means. This isn't always easy to determine, however. Disclosure of poll methodology is not uniform, though some organizations offer online links to information.

Krosnick believes that media outlets should require pollsters to disclose their methodologies in a standard format to get news coverage.

The call-cubicle-to-voter connection

Gallup's Omaha call center is similar to many cubicle-lined work spaces. Workers post personal keepsakes at their desks – pet pictures, a teddy bear, even a book of Psalms. They stash snacks for breaks – sodas, Red Bull, a box of Nips hard candies.

Unlike many offices, though, there's a steady buzz of chatter. Everyone in the room – approximately 140 callers – is talking at once. Most are not working on the political poll. Even one of America's premier pollsters needs to make money – and that largely comes from conducting satisfaction or employee surveys for corporate clients. (Gallup wouldn't discuss the cost of its June political poll, but Pew's Mr. Keeter estimates it can cost between $30 and $50 per respondent – or about $30,000 to $50,000 for a survey of 1,000 people. As for pay, Gallup interviewers are given an hourly quota of calls and receive bonuses if they go over that, says a veteran Gallup caller.)

Jablonski has been with Gallup since 1991. He interviewed for the June political poll with alacrity. He snagged a willing respondent on the phone and dived right in. (For legal reasons, the Monitor was not permitted to listen to the respondents' side of the calls.) This female interviewee, age 19, Jablonski said later, is supporting Obama because she likes his views on gay marriage and because he led the effort to kill Osama bin Laden. She said she doesn't trust anyone else running for president.

Dressed casually in shorts and flip-flops, Jablonski encourages her participation.

"I want you to know you did a really good job, and I really appreciate your time," he says.

Jablonski said after the call that he gets excited when he sees Gallup's surveys mentioned online or on cable TV. "We don't represent Barack Obama. We don't represent Mitt Romney. We represent the American people," he said.

"There's a real sense of pride in that [work]," he added. "To know that's data I collected."

One element of the polling process that's outside the express control of Gallup and its interviewers is the truthfulness of respondents. Pollsters say honesty is less of a problem in political surveys than with those exploring personal habits like alcohol use, sexual activity, church attendance, or compliance with the Internal Revenue Service.

"It's more of a problem on questions about behaviors that are strongly socially desirable or undesirable," Keeter, of Pew, says.

Truthfulness could be a factor in answers to that Gallup question about supporting an otherwise qualified individual who happens to be Mormon. There's a danger that respondents won't be forthcoming: admitting they would reject a Mormon candidate because of his faith might reveal an unacceptable prejudice.

But Keeter says that so far this election cycle, any reluctance among voters to communicate that view hasn't panned out in polling: "We certainly get plenty of people who are willing to express that feeling."

Meanwhile, a respondent's predisposition to oppose a candidate based on that person's race or religion usually lines up with a host of other reasons the individual wouldn't otherwise support the candidate in question.

How 1,000 can mimic millions

When the June poll calling was completed, the process got more complicated – hardly a simple tally.

Anna Chan, a Princeton-based Gallup methodologist, took over. She "weighted" the responses mathematically to bring the percentages in the sample closer to known data about the demographic distribution of characteristics – age, gender, region, education, race, etc. – in the general population.

In other words, each respondent is assigned a weight within the sampled group to match his or her representation in the general population.

If Gallup is short of respondents in certain demographic categories, those answers are essentially recalculated so that they are reflected appropriately in the final results.

"One of the realities of survey research is you're not going to get a perfect representation of the population based on who you're able to call and get a hold of and interview," Jones said. "We take our sample and match it. If we're short on 18-to-29-year-olds, which we routinely are, we'll give them a higher weight in the survey." Likewise, older Americans in the sample would have to be weighted down in influence to compensate mathematically.
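The adjustment Jones describes can be sketched as simple post-stratification on a single variable. This is a minimal illustration with invented numbers; Gallup's actual procedure weights on several characteristics jointly, which is more involved.

```python
from collections import Counter

def poststratify(respondents, population_shares):
    """Give each respondent a weight = population share / sample share
    for his or her demographic cell. Underrepresented groups (such as
    18-to-29-year-olds) get weights above 1; overrepresented groups
    get weights below 1, so the weighted sample mirrors the population."""
    n = len(respondents)
    sample_shares = {g: c / n
                     for g, c in Counter(r["age"] for r in respondents).items()}
    return [{**r, "weight": population_shares[r["age"]] / sample_shares[r["age"]]}
            for r in respondents]

# Invented illustration: 18-29s are 10% of the sample but 22% of adults.
sample = [{"age": "18-29"}] * 10 + [{"age": "30+"}] * 90
weighted = poststratify(sample, {"18-29": 0.22, "30+": 0.78})
# Each 18-29 respondent now counts 2.2x; each 30+ respondent about 0.87x.
```

Note that the weights sum back to the original sample size, so overall percentages stay on the same scale.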

The weighted results of the June poll were sent to a data analyst in Princeton – Dave Banas – who produced cross tabulations, a tool used by researchers to analyze the raw data across a range of demographics. This provides Gallup with a more specific sense of what trends or news they've gleaned from the poll. What issues are paramount, for example, to younger voters and which candidate they prefer. Or whom Hispanics are supporting in the White House contest.

From there, Newport and his team determine what's valuable, and several related stories are written and posted to Gallup's website over the course of weeks.

One of those stories explained the results of that question about voting for a well-qualified candidate who happens to be a Mormon. It turns out not much has changed since the father of this year's Republican nominee made his run for the White House; the June poll showed 18 percent of adults would not back a Mormon.

So what should voters read into that number? It's interesting, Newport notes, that numbers for other demographics have moved dramatically over time. In 2007, for example, 43 percent of adults wouldn't back an otherwise qualified gay or lesbian for the nation's top job. That number was down to 30 percent in this poll.

Last summer, when the GOP nomination contest got rolling, a Gallup poll also showed that 18 percent would not vote for a Mormon for president. So Romney's selection as the nominee hasn't moved the needle.

On the other hand, perhaps that's because the candidate hasn't talked about his faith much on the trail, Newport says. The flip side of this calculation is that the June survey showed approximately 4 in 10 of those polled don't know that Romney is Mormon.

The results of the religion question produced a spate of stories that looked at the issue from different angles. USA Today wrote: "Gallup poll: 44% don't know Obama's religion." ABC News took another view: "Americans least likely to vote for atheist, Muslim presidential candidates, polls finds." UPI homed in on Romney's faith: "18% would not vote Mormon," and CNN looked at the historic trend: "Bias against Mormon presidential candidate unchanged since 1967, poll finds."

One other tidbit to consider when evaluating these numbers: In 1960, 21 percent of Americans said they would not vote for a Catholic in response to a similarly framed question. And then the nation elected one, John F. Kennedy.

Nuance and framing – both historic and current – are meaningful, then, in determining what polling communicates. And, Newport says, this poll suggests that in terms of the Mormon question: "In context, it's not a deal killer."

The truth test: election results

Pollsters of varying methods ultimately have one very public test: Election Day. Who is right – and how the electorate shapes up – will be indisputably clear.

Reputable outlets, like Gallup, have a track record of results. In its final preelection polls in the 19 presidential contests since 1936, Gallup has incorrectly predicted just three: 1948, 1976, and 2004, in which the final Gallup survey showed a tie between George W. Bush, the winner, and the Democratic nominee John F. Kerry. It's worth noting that Gallup's preelection poll in 2000 gave Mr. Bush the advantage over Democrat Al Gore, and while Bush ultimately won the election, Gore won the popular vote.

For Americans hoping to follow this season's polls on their own, experts say the source of a poll is the best place to start when determining the veracity of its numbers. Traugott suggests that when evaluating data, citizens seek out not just who conducted the survey, but also its field dates, who was sampled, and what questions were asked, and review the full questionnaire.

And as November nears, perhaps those who answer their phones and let a pollster into their dinner hour can take heart that there are many people – like interviewer Jablonski – with high intentions for accuracy on the other end of the phone.

"If the quality isn't good," he says of his work and Gallup's, "the data isn't going to be good."
