How SMS Survey Responses Differ from Phone Responses

In hundreds of conversations with consultants, committees, caucuses, and even voters, not a single person has said, “Oh, yeah, telephone calls are the future of polling.”

Cygnal has spent the last 19 months scientifically testing the use of peer-to-peer text messaging as a tool for inviting voters to participate in a political survey. Heck, we built our own P2P SMS platform as an outgrowth of this research, since none of the existing providers had the reliability or deliverability to get the response rates needed for text polling.

In more than 420 multi-mode surveys (phones + texting + email), we’ve learned a heckuva lot about what works – and what doesn’t – when it comes to gathering survey responses via SMS.

  1. Conducting a poll through multiple communication channels is significantly more work than simply handing off a script and some quota groups to a call center.
  2. You’re essentially running three polls simultaneously in a single district under a unified quota structure, even though different types of people respond differently on each channel (see the sketch after this list).
  3. Not every demographic group responds to a given channel the same way from district to district. We’ve seen multiple instances where high-income voters respond masterfully via SMS in one metro area, then on the next survey in a nearly identical metro area, that same group’s SMS completion rate tanks.
  4. “Spray-and-pray” is an ineffective strategy. Loading a list of voters into a texting platform and sending them all the same link multiple times results in a badly unbalanced sample and a diminishing response rate.
  5. Relying on an outside SMS vendor – even if your call center is the provider – means you lose control over the timing of messages and the frequency with which each quota group receives a request to participate.
  6. Multiple attempts at communicating with a voter over several different channels produce the most representative result. For example, a voter receives a live call from an agent and gets a voicemail requesting survey participation. Then she gets a text message asking her to answer a survey. And lastly, she receives an email invitation and finally takes the survey.
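To make point 2 (and the spray-and-pray problem in point 4) concrete, here is a minimal sketch of what a unified quota structure shared across channels might look like. The class names, cell definitions, and the expected_rr response-rate assumption are hypothetical illustrations, not our production system; the point is simply that every complete counts against the same cell no matter which channel produced it, and SMS invitations are throttled to the slots still open.

```python
from dataclasses import dataclass, field


@dataclass
class QuotaCell:
    """One demographic cell (e.g., 'female / 45-64 / suburban') shared by all modes."""
    target: int
    completes_by_mode: dict = field(
        default_factory=lambda: {"phone": 0, "sms": 0, "email": 0}
    )

    @property
    def completes(self) -> int:
        return sum(self.completes_by_mode.values())

    @property
    def open_slots(self) -> int:
        return max(self.target - self.completes, 0)


class UnifiedQuotaFrame:
    """A single quota structure tracked across phone, SMS, and email completes."""

    def __init__(self, targets: dict):
        self.cells = {name: QuotaCell(t) for name, t in targets.items()}

    def record_complete(self, cell_name: str, mode: str) -> None:
        # A completed interview counts against the same cell regardless of channel.
        self.cells[cell_name].completes_by_mode[mode] += 1

    def sms_invites_allowed(self, cell_name: str, expected_rr: float = 0.02) -> int:
        # Throttle SMS sends to what is needed to fill the remaining slots,
        # instead of re-blasting the whole cell ("spray-and-pray").
        open_slots = self.cells[cell_name].open_slots
        return int(open_slots / expected_rr) if open_slots else 0


# A 600-complete survey with three age cells.
frame = UnifiedQuotaFrame({"18-34": 150, "35-64": 300, "65+": 150})
frame.record_complete("65+", "phone")
frame.record_complete("18-34", "sms")
print(frame.sms_invites_allowed("18-34"))  # invites scaled to open slots, not the full list
```

Whatever the actual implementation looks like, this kind of bookkeeping is what keeps any one channel from quietly over-filling the cells it happens to be good at.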

If you’re looking for some hard data, here’s what an internal machine learning project using Classification and Regression Trees (CART) showed us about SMS (and email) response rates on our multi-mode survey projects (a simplified modeling sketch follows the findings below).

  • Age is a significant factor in response mode preference. Voters age 65+ overwhelmingly prefer phone calls followed by email.
  • Among older voters, higher-educated individuals prefer email; lower-propensity older voters also prefer email.
  • Gen X, Millennials, and Gen Z overwhelmingly prefer SMS.
  • Boomers are split between email and SMS; men in this group prefer email, but women prefer SMS, with the exception of Hispanic women who prefer email.
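For readers who want to see the shape of that analysis, here is a toy CART-style model built with scikit-learn’s DecisionTreeClassifier. The data is synthetic and the features (age, education, gender, turnout propensity) are placeholder stand-ins for voter-file fields, not our actual training set; it only illustrates how a classification tree surfaces splits like the ones listed above.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2_000

# Hypothetical voter-file features: age, education level (0-3),
# gender (0 = male, 1 = female), turnout propensity (0-1).
X = np.column_stack([
    rng.integers(18, 95, n),   # age
    rng.integers(0, 4, n),     # education
    rng.integers(0, 2, n),     # gender
    rng.random(n),             # turnout propensity
])

# Fake completion-mode labels that loosely mimic the pattern above:
# older voters lean phone/email, younger voters lean SMS.
y = np.where(
    X[:, 0] >= 65,
    rng.choice(["phone", "email"], n, p=[0.7, 0.3]),
    rng.choice(["sms", "email"], n, p=[0.8, 0.2]),
)

# Shallow tree so the splits stay readable, like a CART summary report.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(X, y)

# Print the learned splits: with real multi-mode completes, this is where the
# age/education/gender patterns would show up.
print(export_text(tree, feature_names=["age", "education", "gender", "propensity"]))
```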

As you can see, if you want a representative sample in your political poll nowadays, you must take a representative approach to communicating with voters and inviting them to take the survey.

We have found that peer-to-peer text messaging is a great way to increase that representativeness, but only if your pollster is experienced in properly mitigating the pitfalls of SMS.