Friday, March 7, 2014
When conservative pollster Scott Rasmussen spoke in Maine last week, he didn’t just present his polling numbers; he also offered his analysis of everything from voters’ opinions about health care reform to the long-term futures of Sens. Olympia Snowe and Susan Collins.
For Rasmussen, a professional speaker who has written books about the Tea Party and privatizing Social Security, polling is just the first step. He uses the numbers from his surveys to support or criticize specific political ideas or policy positions as a pundit.
These kinds of statistics-backed political arguments have become common in our country’s public discussion of elections and government policy. Unfortunately, the public’s understanding about where these numbers come from and their potential biases hasn’t kept pace.
Polling expert Nate Silver recently found Rasmussen’s pre-election polls to be some of the most inaccurate and biased of any pollster. He also cited significant issues with Rasmussen’s polling methods and an unwillingness to fully disclose how his firm conducts its surveys. Maine’s in-state pollsters are more forthcoming than Rasmussen.
In fact, every one of the pollsters that released public polls during the recent election has agreed to follow the American Association for Public Opinion Research’s or the National Council on Public Polls’ standards for disclosure.
Most were happy to go beyond those standards. Principals at Critical Insights, Pan Atlantic SMS and the Maine Center for Public Opinion all were willing to provide detailed explanations of their methods and practices. Pan Atlantic and MECPO both also indicated that they would be willing to provide raw data (the actual recorded responses from participants) for future polls.
(The Maine People’s Resource Center, an organization I work for, also conducted a pre-election poll and already has published the raw data from that survey along with a complete accounting of its methodology.)
It makes sense for pollsters to release raw data, according to Curtis Mildner, president of Market Decisions, a Maine-based research company that does public opinion research but doesn’t conduct political polls.
“I don’t see why it isn’t done that way. Releasing both weighted and unweighted data is simple to do and it makes things much more clear.”
Why is this agreement to pursue transparency important?
Because it means that Maine journalists, political professionals and others who have questions about where numbers come from will get meaningful answers. It also means that we can better assess the accuracy of Maine pollsters.
An open letter released recently by a group of national pollsters urged the media to judge the effectiveness of pollsters not just by how close the last poll came to the results of the election, but also by the professionalism and transparency of the entire operation.
A good example of this in Maine is the case of Pan Atlantic SMS, the pollster that was closest to predicting the actual results of the governor’s race and the 1st District congressional race.
Pan Atlantic can certainly brag about its numbers, but perhaps not its methods. It chose to shape its sample (by choosing whom to call) and weight its results (by adjusting them after the fact) based on the party registration of respondents until they matched the current registration of Maine voters, as reported by the Bureau of Elections.
There are two problems with this. First, stated party affiliation changes over time. As the editor-in-chief of the Gallup Poll put it, “There is no reliable measure of the distribution of party identification within the population.”
Second, even among pollsters that do weight by party, the goal is always to predict who will vote. It’s very unlikely that the same proportions of people will turn out to vote as registered for each of Maine’s political parties. This year, for instance, it’s likely that Republicans were more enthused and turned out at a higher rate than Democrats.
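The party-registration weighting described above can be sketched in a few lines of Python. All party names and numbers below are invented for illustration, and real pollsters use more elaborate procedures, often raking across several demographic variables at once; the sketch simply shows why weighting to registration figures can miss the mark when turnout differs from registration.

```python
def party_weights(sample_counts, target_shares):
    """Weight for each party = target population share / observed sample share.

    Responses from over-represented groups get weights below 1,
    under-represented groups get weights above 1.
    """
    total = sum(sample_counts.values())
    return {party: target_shares[party] / (sample_counts[party] / total)
            for party in sample_counts}

# Hypothetical 1,000-person sample that over-represents Democrats
# relative to (also hypothetical) registration figures:
sample = {"Dem": 400, "Rep": 300, "Unenrolled": 300}
registration = {"Dem": 0.32, "Rep": 0.28, "Unenrolled": 0.40}

weights = party_weights(sample, registration)
# Each Democratic response now counts as 0.8 of a response, and each
# unenrolled response as about 1.33. But note the catch the column
# describes: these weights force the sample to match *registration*,
# not the (unknown) party makeup of people who actually turn out.
```

If, say, Republicans turn out at a higher rate than their registration share, a poll weighted this way will understate their influence no matter how carefully the weights are computed.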
One can easily find similar potential problems with the methods of other pollsters, but without transparency and disclosure, we’ll never know about them.
In the future, here are the basic questions that journalists and political junkies should expect Maine pollsters to answer when they release a poll:
• Who sponsored the survey and who made the calls.
• The wording and order of the questions.
• The population studied and how it was sampled.
• The sample size and margin of error.
• A detailed description of any weighting procedures used.
• The method used and the dates the survey was conducted.
If you don’t see these, at a minimum, then don’t trust the poll, or the pollster.
Mike Tipping is a political junkie. He writes the Tipping Point blog on Maine politics at DownEast.com, his own blog at MainePolitics.net and works for the Maine People’s Alliance and the Maine People’s Resource Center. He’s @miketipping on Twitter.