Commentary

The Pollsters Were Better In 2016 Than They Were In 2012

Polling in 2016 was a news driver throughout the election cycle. Donald Trump often touted the polls when they were in his favor and attacked them as rigged when they were not.

In campaign emails, Hillary Clinton told supporters not to feel overconfident or extra worried because of polls, pleading for grassroots support, despite often leading among various poll aggregators.

Outlets like FiveThirtyEight, which live and die by data and polling, saw their traffic shoot through the roof: the site climbed from roughly the 2,600th most-visited in the world at the beginning of 2016 to around the 500th.

Major pollsters had Hillary Clinton winning the election, and in one sense they got it right -- Clinton won the popular vote by more than 2.5 million votes. In another and vastly more consequential sense, they got it wrong -- Donald Trump decisively won the Electoral College, even though many crucial states were extremely close.


“We didn’t do national polls, we did polling only in the states,” Tony Fabrizio, the Trump campaign’s top pollster, told “The Pollsters” podcast last week at the Harvard post-mortem that brought together top Clinton and Trump aides.

“We did aggregates of the battleground states, and we did state-specific [polling],” Fabrizio continued.

Joel Benenson, the Clinton campaign’s chief pollster, who previously polled for Obama and was also on the podcast, explained what he saw as the biggest issue with polling: “The problem is we have an epidemic of polling, the media is covering them all like tracking polls.”

Taking a shot at the many polling aggregators out there, he added that if each news organization reported only on its own polls, it would be “infinitely better off.”

“[Public] pollsters on the national level were more right now in this election, as it turns out, than they were in 2012, when they had Mitt Romney winning,” said Benenson.

How can pollsters measure the outcome of elections more accurately when the raw numbers can produce such varied results and certain demographic shifts are poorly modeled?

Magid, a research-based consulting firm, suggests that instead of taking an up-and-down vote on the candidates, pollsters “take more of a survey approach.”

“We often ask a series of questions about what respondents believe, hoping to better understand response biases,” Brent Magid told Red, White & Blog. “From the responses, we get a deeper understanding of voters’ emotional constructs, from that we can index against polling data and get a multidimensional and more accurate understanding of the state of a race.”

A hybrid approach that combines survey and raw polling data within one model, while more expensive and time-consuming, could more accurately predict the outcome of elections going forward.

The pollsters were wrong in 2016, but they weren’t so wrong that we should now discount polling going forward. Many pollsters may have underestimated the groundswell of support for Trump by inaccurately weighting groups, such as white working-class voters, in key states.
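That kind of weighting error is easy to see with a toy post-stratification calculation. The numbers below are invented purely for illustration, and `weighted_topline` is a hypothetical helper, not any pollster's actual model:

```python
def weighted_topline(support_by_group, electorate_shares):
    """Post-stratified topline: per-group candidate support, averaged
    by each group's assumed share of the electorate."""
    return sum(support_by_group[g] * electorate_shares[g]
               for g in support_by_group)

# Candidate support within two (made-up) demographic groups.
support = {"white_working_class": 0.30, "everyone_else": 0.55}

# If the model assumes white working-class voters are 35% of the electorate...
assumed = weighted_topline(support, {"white_working_class": 0.35,
                                     "everyone_else": 0.65})

# ...but that group actually turns out at 45%...
actual = weighted_topline(support, {"white_working_class": 0.45,
                                    "everyone_else": 0.55})

print(round(assumed, 4), round(actual, 4))  # 0.4625 0.4375
```

In this made-up scenario, a ten-point misjudgment of one group's turnout share moves the topline by 2.5 points -- more than enough to flip a call in a close state.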

After all, polls are only as reliable as the information people feed them.
