Our co-founder and CEO Mike Greenfield explains how Change Research sees the role of a pollster.
“Wow, we’re ahead by four points?” the candidate said. “That’s great news. So what do we do now?”
This was the first poll Change Research had ever run for a candidate for office, and it was a doozy. Our poll showed the Virginia state legislative candidate ahead of a longtime Republican legislator, and he was shocked and thrilled. This was the first time he’d ever commissioned a poll, and he and his team weren’t sure what to do with the results.
I explained that:
- The Democrat running for governor was doing significantly better with female voters than he was, presenting an opportunity to gain ground.
- Medicaid expansion, as he had surmised, was a winning issue in the district, but the gas pipeline mattered to fewer voters than he’d imagined.
- Voters did not like the fact that his opponent had spent over a decade in the state legislature.
The candidate was excited, and we were too. But the state house caucus was unpersuaded: they thought the candidate had little chance to win, they were skeptical of our poll results, and they opted not to support the candidate in a material way.
Still, armed with numbers and actions, the candidate adjusted his strategy and got to work. A few months later, he shocked almost everyone and won by nine points.
In that first project, we told our candidate the right story, and he won.
What Is the “Right Story”?
We told the right story in that Virginia race.
The right story explained which voters weren’t currently planning to vote for him but might be persuaded.
The right story detailed what voters cared about and what mattered less to them.
The right story outlined how the candidate could best talk about his opponent and his record.
The right story was his path to victory.
As it turned out, our poll also accurately forecast the outcome of the race. Election analysts said our guy would lose. Other pollsters said our guy was behind. We said our guy was ahead. And he won!
Getting the outcome right is one part of telling the right story—but it’s just one part.
Pollster Ratings
In 2013, I wrote a blog post with the provocative headline “Your Metrics are Bad.”
In that post, I argued that:
- Well-calibrated metrics are quite useful.
- Poorly calibrated metrics are often worse than having no metrics at all.
If you’re looking for how “good” a pollster is, you can find several ratings systems online. Sadly, these ratings systems fall into the “poorly calibrated” category, for two reasons:
1. Pollster ratings systems try to measure how predictive a pollster’s polls are.
Without a large amount of data—on the order of hundreds of polls per pollster—this is likely to yield noisy results, along the lines of evaluating basketball players by watching each player take ten shots from different places on the floor. These ratings systems are trying to rate pollsters with a tiny number of data points—only their public polls—many of which feature situations (especially in primaries) where a race is shifting quickly in the final days, making it virtually impossible to know whether the poll was “wrong” or the race simply moved. The simulation sketch after this list makes the small-sample problem concrete.
2. Pollster ratings systems generally do not measure what’s most important to you as a poll buyer: whether a pollster delivers actionable, valuable strategic recommendations.
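To see how noisy that is, consider a toy simulation in Python (illustrative only: the pollster names, the 5-point error scale, and the poll counts are all invented; this is the basketball analogy in code, not an analysis of any real ratings system). Ten pollsters have identical true skill, and each is rated on just fifteen public polls.

```python
import random

# Illustrative simulation: ten pollsters with IDENTICAL true skill,
# each judged on only fifteen public polls.
random.seed(1)

TRUE_ERROR_SD = 5.0        # every pollster misses by the same typical margin
POLLS_PER_POLLSTER = 15    # roughly the sample a ratings system might have

def observed_avg_error(n_polls):
    """Mean absolute error across one pollster's small sample of public polls."""
    return sum(abs(random.gauss(0, TRUE_ERROR_SD)) for _ in range(n_polls)) / n_polls

ratings = {f"Pollster {i}": observed_avg_error(POLLS_PER_POLLSTER) for i in range(1, 11)}
for name, err in sorted(ratings.items(), key=lambda kv: kv[1]):
    print(f"{name}: {err:.1f} pt average error")
```

Even though every simulated pollster is equally good, the “best” and “worst” typically differ by a couple of points of average error, and the ranking reshuffles with every new seed. That’s the trap of rating real pollsters on a handful of public polls.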
We of course want our polls to be accurate, and our results in our first three years have been solid. Our public general election polls in the three weeks before an election had average errors between 5 and 6 points (5.2 points for Congressional races and 5.7 for statewide races), virtually identical to long-term polling averages from other pollsters (5.6 and 5.1 points, respectively).
Most of our polls are not public, and our overall accuracy numbers have been even better—including a better accuracy record than the NY Times in races where we both polled in 2018.
Accuracy is hard to measure. Accuracy is important. But if you’re choosing a pollster, accuracy alone is not enough.
Impact Over Forecasting
I founded my first company, TeamRankings, when I was in college. TeamRankings is a sports analytics company whose business centers on predicting the outcomes of sporting events. In 2017, for instance, TeamRankings produced far more accurate predictions for March Madness than other data geeks, including Ken Pomeroy, ESPN’s analytics team, and FiveThirtyEight.
When I co-founded Change Research in 2017, I was pointedly not trying to create the TeamRankings of politics. TeamRankings seeks to predict sporting events but not influence them; for Change Research, influence matters far more than prediction.
In order to have an impact, Change Research’s pollsters seek to tell the right story—uncovering insights that candidates and organizations can use to improve their messaging and their targeting—with a form of polling that is fast, accurate, and affordable.
How to Tell the Right Story
By doing three things well, we put ourselves in a good position to tell the right story:
- Ask the right questions
- Gather good data
- Draw the right conclusions from the data
Asking the Right Questions
Asking the right questions is something Change Research does at both a data level and a client level.
What’s the best way to assess people’s response to a candidate bio? How do we gauge the likelihood that someone will vote? What kinds of questions will yield high response rates from typical citizens and not just highly engaged semi-professional survey takers? How can we elicit more honest responses? We test our assumptions and make choices based on empirical measurements.
On a client level, we spend time with our clients to deeply understand their needs. What are you looking to learn? What decisions will you make as a result of the poll results we present to you? Are you looking to understand sentiment? Are you looking to test messages? And in each case, how?
Gathering Good Data
Good sampling is fundamental to good polling.
Change Research introduced a brand-new approach to polling in 2017: Dynamic Online Sampling, which solicits respondents predominantly via online ads, adjusting ad targeting based on who is taking the survey.
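The engineering behind that is substantial (as the anecdote below suggests), but the core feedback loop is simple to sketch. Here’s a toy version in Python, with invented age quotas and a naive budget rule; it illustrates the general idea, not our actual system.

```python
from collections import Counter

# Toy sketch of a dynamic-sampling feedback loop (invented quotas and
# budget rule, not Change Research's production system): compare who has
# responded so far against target quotas, then shift ad spend toward
# underrepresented groups.
TARGETS = {"18-34": 0.28, "35-49": 0.24, "50-64": 0.26, "65+": 0.22}

def reallocate_ad_budget(respondents, total_budget):
    """Give each demographic cell ad spend proportional to its quota shortfall."""
    counts = Counter(respondents)
    n = max(len(respondents), 1)
    shortfall = {cell: max(quota - counts[cell] / n, 0.0)
                 for cell, quota in TARGETS.items()}
    total_short = sum(shortfall.values()) or 1.0  # avoid division by zero
    return {cell: round(total_budget * s / total_short, 2)
            for cell, s in shortfall.items()}

# Early respondents skew older, so the next ad dollars shift younger:
so_far = ["65+"] * 40 + ["50-64"] * 35 + ["35-49"] * 20 + ["18-34"] * 5
print(reallocate_ad_budget(so_far, total_budget=1000.0))
```

A real system would balance many more dimensions than age, but the loop is the same: measure who you have, then buy the audience you’re missing.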
One political consultant told me she was approached by a partner at a top-flight polling firm inquiring about trying to replicate our methodology. “How can we do that?” the partner asked her.
She explained at a high level the approach our engineering team has taken over several years to build our technology. “Oh. That’s way too hard. It’s not something we’d be able to pull off.” The partner had no choice but to go back to the old approach.
Meanwhile, because of the technology we’ve built, Change Research clients can affordably:
- Find hundreds or thousands of respondents who are not regularly being compensated to take surveys.
- Field surveys even in tiny geographic areas. Change has done hundreds of polls in small state house districts.
- Quickly field a survey without relying on slow, expensive call centers.
- Test virtually any sort of content — text, images, videos — online.
Gathering that data and weighting it optimally—using approaches honed by our data science team over three years and millions of surveys taken—yields good answers to meaningful questions.
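As one concrete illustration of the weighting step: raking (iterative proportional fitting) is a standard technique for aligning survey respondents with known population margins. The sketch below uses invented respondents and target margins; our production weighting is considerably more involved.

```python
# Minimal raking (iterative proportional fitting) sketch with invented
# respondents and population margins; illustrative only.
respondents = [
    {"gender": "F", "age": "18-49"}, {"gender": "F", "age": "50+"},
    {"gender": "M", "age": "18-49"}, {"gender": "M", "age": "50+"},
    {"gender": "M", "age": "50+"},
]
margins = {"gender": {"F": 0.52, "M": 0.48},
           "age": {"18-49": 0.55, "50+": 0.45}}

weights = [1.0] * len(respondents)
for _ in range(50):  # alternate margin adjustments until they converge
    for var, targets in margins.items():
        for level, target in targets.items():
            members = [i for i, r in enumerate(respondents) if r[var] == level]
            current = sum(weights[i] for i in members) / sum(weights)
            for i in members:
                weights[i] *= target / current  # nudge this cell toward its target

print([round(w, 2) for w in weights])  # one weight per respondent
```

The output is a per-respondent weight: in this example, women and younger respondents end up with weights above 1, so the weighted sample matches both margins at once.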
Drawing the Right Conclusions from the Data
Good questions and good survey data are necessary but not sufficient for telling the right story to our clients.
Helping Yes on 802 Deliver a Win in a Tight Race:
Back in June, Change Research worked with the Yes on 802 team to track support for Medicaid expansion in Oklahoma in the final few weeks before Election Day, while an aggressive push by opponents depressed support for the measure. We conducted daily polling in the final days of the race, and Yes on 802 used that continuous stream of fresh data to adjust their targeting strategy. Because we were able to affordably monitor the race as it was shifting, we were able to give much more precise and timely advice than we could have with a one-time poll. On Election Day, results came in exactly where our final estimates showed the race: close, but a win.
Helping Color Of Change PAC Advance Forward-Thinking Candidates:
We worked with Color Of Change PAC to help guide their engagement in several State Attorney primary races. Specifically, Color Of Change PAC was looking to understand:
- What types of criminal justice reforms did Black voters prioritize?
- What were the most persuasive messages around reforms?
- How could Color Of Change PAC best speak to Black voters, who are often left out of the political conversation?
The results showed that the public outcry over racism in policing had become a breakthrough issue even among white voters, and one that was extremely salient to them when deciding whom to support for State Attorney.
Our polling allowed us to detect changes in voters’ perceptions of policing and criminal legal issues. We also saw that specific, measured proposals were more popular than vague generalities or slogans. Those insights helped Color Of Change PAC act on the opportunity, pivot to more effective issue messaging, and successfully advance Democratic candidates Kim Foxx in Cook County and Monique Worrell in Orange-Osceola counties.
Asking for More
We’re not aware of any organizations that hire a pollster because their mean error is half a point better than some other pollster’s. It’s a distinction that doesn’t make a meaningful difference to your campaign.
When choosing a pollster, you should be asking about more than predictive accuracy.
We take pride in our ability to use data to tell the right story, in a way that’s fast and affordable. By telling our clients the right story, we help them get to their goals, understand what people care about, determine which messages are effective, and find the right voters to approach in the right ways.
A poll that tells a candidate, accurately, that they’re behind 52-41 is rarely useful on its own.