How biased are the polls?

Rick Moran
An excellent and timely article by Katrina Trinko at NRO yesterday about oversampling of Democrats in opinion polls.

The answers coming from pollsters may surprise you:

It has become a familiar pattern this election cycle: A poll shows President Obama several points ahead of Mitt Romney, conservatives look at the poll's sample and discover that a significantly higher percentage of those polled are Democrats than Republicans, and, finally, the poll is dismissed as biased.

But here is another possibility: At this particular point in the race, a higher percentage of voters may be identifying themselves as Democrats. That doesn't mean they will still see themselves as Democrats on Election Day, but for whatever reason (such as excitement over the convention) they do now. Exit polls, after all, are taken only on Election Day, and they don't register the shifts in party ID that may have occurred in the weeks before (when some people are already sending in absentee ballots or participating in early voting).

Lydia Saad, a senior editor at Gallup, argues that if a pollster tries to control what percentage of those polled belong to the respective parties, the pollster might well be predetermining the results. "In a wave election, let's say," Saad explains, "there is a huge shift of voters toward a certain candidate. In that case, you're going to see that party's ID go up in the polls. Does that mean you push it back down? Well, no, because you might as well just decide what percentage of the vote each candidate is going to get and don't even bother polling."

In August, the Pew Research Center, facing criticism for a poll where the sample was skewed Democratic, made a similar argument: "Party identification is one of the aspects of public opinion that our surveys are trying to measure, not something that we know ahead of time like the share of adults who are African American, female, or who live in the South."

By randomizing their samples, pollsters aim to measure party affiliation rather than to seek out a set percentage of Democrats, Republicans, and independents. Even Rasmussen agrees that pollsters don't have their thumb on the scale:

Pollster Scott Rasmussen concurs. "It does not mean it's a bad poll," he says, "if it has an oversampling of Democrats or an oversampling of Republicans or something else that conflicts with a perception that a person might have, but it is something that you should take into consideration in evaluating, especially two polls from the same company. If a company comes out with a poll one day that shows the president leading by five points, and the next day it has a poll that shows the race tied, and you look and see the only thing that changed was that the partisan sample changed, that's probably just noise as opposed to real significance."
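Rasmussen's "noise" point is easy to see with a bit of arithmetic. The numbers below are entirely hypothetical (made-up party shares and support levels, not any real poll): if support within each partisan group stays fixed, merely changing the partisan mix of the sample moves the topline by about three points.

```python
# Hypothetical illustration (invented numbers, not real poll data):
# how a shift in a sample's partisan mix can move the topline
# even when no individual voter changes his mind.

def topline(party_share, support):
    """Weighted candidate support: sum over groups of (share of sample) x (support in group)."""
    return sum(party_share[p] * support[p] for p in party_share)

# Assumed (fixed) support for the incumbent within each group.
support = {"D": 0.92, "R": 0.06, "I": 0.45}

# Day 1 sample: a D+7 partisan mix.
day1 = {"D": 0.37, "R": 0.30, "I": 0.33}
# Day 2 sample: an even D/R mix.
day2 = {"D": 0.33, "R": 0.33, "I": 0.34}

print(round(topline(day1, support) * 100, 1))  # 50.7 - incumbent's share, day 1
print(round(topline(day2, support) * 100, 1))  # 47.6 - day 2
```

Nothing about voter opinion changed between the two "days" above; only the sample composition did. That is exactly the kind of swing Rasmussen says should be read as noise rather than movement.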

I've always understood polls to be a snapshot of a certain time frame - usually 48-72 hours. By the time a poll is published, it may already be obsolete as an accurate gauge of the state of the race. Its usefulness, then, is in assessing the state of the race over a period of time - weeks, months - and watching the ebb and flow of support for each candidate.

The Pew poll from yesterday that shows the president leading Romney by 8 points among likely voters is obviously an anomaly. The daily tracking polls from Gallup and Rasmussen show no such spurt for Obama. Plus, the Pew poll was taken over a four-day period, including a weekend, and probably oversampled cell-phone users.

The bottom line is that each polling company has its own proprietary methods. They wouldn't be in business long if they intentionally "cooked the books" to satisfy one partisan or another.



