How Campaigns Can Spot Bad Polling Numbers

Choices. Life is full of them. One we currently face in our industry is whether we can trust the polling numbers that come our way. It could be a breaking story citing numbers from a recently released poll in a race you're working on. Another time, maybe it's a poll you get your hands on from someone in your network. Or perhaps you're shopping for a pollster and wondering how to tell whether the work will be good.

The easiest way is to compare a poll against an actual election result, but that's not always available. You may be weeks, even months away from Election Day, and a lot is riding on what happens in the meantime. Here are three ways to tell when a poll may not be up to snuff.

No Crosstabs

If no crosstabs are available, it's most likely because no real methodology was applied to get the results. Crosstabs generally show how the sample breaks down across demographics like gender, age, race, and party, as well as across geography. Those distributions are how you judge whether a sample is representative. If you can't evaluate them, you have good reason to doubt that the results are accurate.
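One way to put this check into practice is to compare a poll's crosstab shares against known population benchmarks (census figures or voter-file targets). Here is a minimal sketch; all the numbers, group names, and the `flag_skews` helper are hypothetical, purely for illustration.

```python
# Hypothetical crosstab check: flag demographic groups whose share of the
# sample misses a known benchmark by more than some tolerance.
# All figures below are made up for illustration.

poll_crosstabs = {           # share of poll respondents in each group
    "gender": {"women": 0.55, "men": 0.45},
    "party":  {"dem": 0.40, "rep": 0.35, "ind": 0.25},
}
benchmarks = {               # e.g., census or voter-file targets
    "gender": {"women": 0.52, "men": 0.48},
    "party":  {"dem": 0.37, "rep": 0.36, "ind": 0.27},
}

def flag_skews(poll, targets, tolerance=0.025):
    """Return (demo, group, diff) for groups that miss their target
    share by more than `tolerance` percentage points (as a fraction)."""
    flags = []
    for demo, groups in targets.items():
        for group, target in groups.items():
            diff = poll[demo][group] - target
            if abs(diff) > tolerance:
                flags.append((demo, group, round(diff, 3)))
    return flags

print(flag_skews(poll_crosstabs, benchmarks))
# Flags women (+3 pts), men (-3 pts), and Democrats (+3 pts) as skewed.
```

A real workflow would also compare geographic distributions and, ideally, check the pollster's stated weighting targets, but the idea is the same: if the crosstabs don't exist, this comparison is impossible.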

No Cell Phones

In today's environment, it's very tough to get a representative sample without including cell phones. Phone response rates keep falling, and no single mode reaches 100% of any given area, so why cut your coverage by another 25% or more (and even higher in more diverse areas) by skipping cell phones entirely? Including the right ratio of landline to cell phone completes is another big key to making sure the results are representative.
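The arithmetic behind this is simple coverage bias: a landline-only poll cannot reach cell-only voters at all, so if those voters differ from landline-reachable voters, the topline is skewed no matter how large the sample is. A small sketch, using the 25% cell-only share mentioned above and made-up support figures:

```python
# Illustrative coverage-bias arithmetic. The support numbers are
# hypothetical; the 25% cell-only share echoes the figure in the text.

cell_only_share = 0.25    # assumed share of voters unreachable by landline
landline_support = 0.54   # candidate support among landline-reachable voters (made up)
cell_only_support = 0.42  # candidate support among cell-only voters (made up)

# True population support is the coverage-weighted average of both groups.
true_support = ((1 - cell_only_share) * landline_support
                + cell_only_share * cell_only_support)

# A landline-only poll only ever measures the first group.
landline_only_estimate = landline_support

print(f"true support:           {true_support:.3f}")                       # 0.510
print(f"landline-only estimate: {landline_only_estimate:.3f}")             # 0.540
print(f"coverage bias:          {landline_only_estimate - true_support:+.3f}")  # +0.030
```

In this toy case the landline-only poll overstates the candidate by 3 points, and no amount of additional landline interviews fixes it; only sampling the missing group (or weighting toward a correct mix) does.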

Too Much Bias Created

Some polls are not fielded to get an accurate measurement of public opinion. Their purpose is something else entirely, like influencing an outcome or spreading a message. Typically, these polls have thousands of responses and rely on robocalls to keep costs low. The questions are leading, double-barreled, or asked in an order designed to bias the answers to the questions that follow.

There are more keys to knowing how trustworthy a poll is, but these three cover most of the issues you will find out there. Want to discuss more? Have some other examples of polling red flags? Ping me anytime. Find me here.