Why Democrat Poll Numbers Are Worse Than You Think
Before the 2020 presidential election, I became curious about political polls claiming that Joe Biden had a ten-percentage-point lead over Donald Trump. At that time, Trump was addressing crowds in the thousands. When he was not barricaded in his basement, Biden was lucky to draw a crowd of a hundred. That did not seem right.
So I did some research and concluded that the polls were undercounting Republicans. In one of my articles, which you can read here, I predicted that the silent Trump vote would be north of two percent of the electorate. I was not the first to consider this, but I was one of the first to put a number on it.
Polling organizations would not admit their polls were biased against Republicans. As it turned out, my estimate was only half the actual error: the polling error in the 2020 election was roughly 4% nationwide, the largest in the last 40 years.
Fast-forward to today. Inflation is 8+ percent, the price of food and gasoline is way up, crime is up, there is a nationwide shortage of baby formula, and don't get me started on the border crisis. Yet Joe Biden's job approval is close to 40% positive. That means almost four out of every ten Americans think Joe is doing a good job if you believe the RealClearPolitics average. And I don't.
It is possible that Biden's job approval is being helped by positive coverage from the news and social media. But I am not buying that, either. Spin can go only so far, and even rank-and-file Democrats have to fill their gas tanks and buy groceries.
The big difference between today and two years ago is that pollsters will now admit that their results are systematically biased against conservatives. For example, in an article published in Vox, pollster David Shor said:
For three cycles in a row, there's been this consistent pattern of pollsters overestimating Democratic support in some states and underestimating support in other states. It happened in 2018. It happened in 2020. And the reason that's happening is because the way that [pollsters] are doing polling right now just doesn't work.
Pollsters face two fundamental problems. One is developing an accurate voter turnout model that predicts who is likely to vote. The other is getting an unbiased measurement of what voters think, known as a random sample.
The turnout model is usually based on demographic distributions and historical voting records. If pollsters get the model wrong, it can bias their results. For example, in the 2020 election, most turnout models did not anticipate that Republicans who rarely vote would turn out in larger numbers than predicted.
The second problem is getting a random sample of the electorate. Unfortunately, in recent elections, this has become increasingly difficult to do. Although there are several theories as to why this is happening, it boils down to two issues. One is technology, and the other is a lack of trust in political polls.
As recently as the 1990s, pollsters could count on getting a random sample of responses to telephone surveys, but not anymore. Although most Americans have a cell phone, prohibitions on auto-dialing cell phones mean that pollsters continue to rely on landlines. This is problematic because landline users have a different demographic profile from the general population. And many landline households have Caller ID, allowing voters to see who is calling before they answer.
According to Fairleigh Dickinson associate professor Dan Cassino:
Caller ID, more than any other single factor, means that fewer Americans pick up the phone when a pollster calls. That means it takes more calls for a poll to reach enough respondents to make a valid sample, but it also means that Americans are screening themselves before they pick up the phone.
The trust issue is a societal problem that has been building for many years. Due to partisan infighting, some voters have lost faith in our national institutions, in politics, and, by association, in political polls. This issue affects conservatives more than liberals, producing a polling effect called partisan nonresponse, or nonresponse bias.
Partisan nonresponse is a phenomenon where low-trust conservatives opt out of participation in the polls and are replaced by higher-trust liberals. So why are conservative voters opting out? Pollster Nate Silver has two theories:
First, Republicans are becoming more distrustful of institutions and society, and that may be extending to how they feel about pollsters. Second, suburban Republican college graduates are more likely to fear professional sanction for their views and are therefore self-censoring more, including in surveys.
High-trust voters are basically the opposite. They tend to be highly educated, liberal, and more enthusiastic about talking to pollsters.
Independent pollster Richard Baris believes the Democrat bias comes from where pollsters are polling.
You have to look not just at who[m] you poll, but where you poll. The way they're polling, they are reaching voters that skew too urban. In that case, your Republican sample will be stacked with the John Kasich ... and Bill Kristol Republicans[.] ... [T]hat's not the Republican Party that gave the presidency to Trump.
Pollsters say they are open to new methods of contacting voters besides landline telephones. And they intend to research which voter groups may be missing from their samples. But will that correct the polls for nonresponse bias?
According to Shor, the answer may be no.
Qualitative research doesn't solve the problem of one group of people being really, really excited to share their opinions, while another group isn't. As long as that bias exists, it'll percolate down to whatever you do.
Let's see if anything has changed since 2020. If you average the two October 2020 polls from New Jersey, Biden led by 22 points. According to the Cook Political Report, he won by 16 points, a miss of 6 points. In the 2021 New Jersey state election, polls overestimated Governor Murphy's margin of victory by 5 points. Apparently, the Democrat bias did not change in New Jersey.
If you average the final four 2020 polls from Virginia, Biden led by 11.5 points. He won by 9.4 points, a miss of 2.1 points. In 2021, the polls seemed to have gotten it right, predicting a Youngkin margin of 1.7 percent versus the actual result of 1.9.
However, we see a different picture if we focus on the one outlier poll included in the Virginia average. The outlier is Fox News at Youngkin +8. If we remove the Fox News poll, the average changes to Youngkin +0.4. Therefore, the adjusted polling average underestimated Youngkin's support by 1.5 percentage points. So the Democrat bias is still alive and well — just masked by one flawed poll.
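The effect of dropping one outlier from a polling average can be sketched in a few lines. The poll names and margins below are hypothetical placeholders chosen only to mirror the magnitudes discussed above; they are not the actual 2021 Virginia polls.

```python
# Sketch: how one outlier poll can mask the bias in a polling average.
# All margins are hypothetical placeholders (positive = Youngkin lead),
# NOT the actual 2021 Virginia polls.

def average(margins):
    """Simple mean of a list of poll margins, in percentage points."""
    return sum(margins) / len(margins)

polls = {
    "Outlier Poll": 8.0,   # stands in for the one outlier poll
    "Poll B": 1.0,
    "Poll C": 0.5,
    "Poll D": -0.3,
}

full_average = average(list(polls.values()))
trimmed_average = average([m for name, m in polls.items() if name != "Outlier Poll"])

actual_margin = 1.9  # the reported final margin
print(f"with outlier:    +{full_average:.1f}")
print(f"without outlier: +{trimmed_average:.1f}")
print(f"polls understated the winner by {actual_margin - trimmed_average:.1f} points")
```

With these placeholder numbers, one 8-point outlier drags three near-even polls up to a comfortable-looking average, while the trimmed average understates the final margin by 1.5 points.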
On top of nonresponse bias, another problem with Democrat approval numbers is that most polls currently sample registered voters rather than likely voters. Nate Silver believes that midterm polls of registered voters tend to lean toward Democrats.
We estimate that on average in midterm years since 1990, registered voter polls have had a 2.6 percentage-point Democratic bias — compared against likely voter polls, which have been unbiased.
If the polls are overestimating approval numbers for Biden and other Democrats, how bad is it? The political climate today is different from 2020, but the Democrat poll bias, roughly 4% nationwide in that election, seems intact. Since the nonresponse bias (4%) and the registered voter bias (2.6%) should be separate, non-overlapping effects, we can add them together. This gives us a total Democrat bias of roughly 6.5%.
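The back-of-the-envelope addition looks like this. The two inputs are the estimates cited above; treating them as separate, non-overlapping effects that simply add is the assumption being made, not an established result.

```python
# Back-of-the-envelope combination of the two bias estimates discussed above.
# Assumption: the two effects are separate and non-overlapping, so they add.
nonresponse_bias = 4.0       # pct points, from the 2020 national polling error
registered_voter_bias = 2.6  # pct points, Silver's registered-voter estimate

total_bias = nonresponse_bias + registered_voter_bias
print(f"total Democrat bias: about {total_bias:.1f} points")  # ~6.6, "roughly 6.5"

reported_approval = 40.0     # Biden approval per the RealClearPolitics average
adjusted_approval = reported_approval - total_bias
print(f"bias-adjusted approval: about {adjusted_approval:.1f}%")
```

Under that assumption, a reported 40% approval shrinks to the low thirties once both biases are subtracted.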
What does this mean? Until pollsters switch to sampling likely voters right before the election, you can subtract a solid 6 percent from Joe Biden's approval numbers. And if nothing changes before the election, any Democrat who leads by 3 percent or less is likely to lose.
Democrats had better pray I am not underestimating the number of hidden Republican voters, as I did in 2020.
Image via Pxhere.