Some new polls look so wild, so shocking, that they go viral.
You see them shared on Facebook and tweeted about. Maybe they’re covered on cable TV, as the talking heads jaw away.
If it’s a political poll, what should you do?
Well, first I’ll tell you what not to do, based on an incident this week — the report that 71% of Obama voters regretted voting for him.
As I discussed the other day, that turned out to be completely and utterly wrong.
Why? Because the poll actually found that 10% of Obama voters said they would not vote for him if the election were held today, and that of those 10%, 71% said they regretted voting for him.
Quite obviously, 71% of 10% is very, very different from 71%.
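The arithmetic is simple enough to sketch in a couple of lines (the percentages are from the poll as described above):

```python
# 71% of the 10% who would switch, expressed as a share of ALL Obama voters
switchers = 0.10               # Obama voters who would not vote for him today
regret_among_switchers = 0.71  # of those switchers, the share expressing regret

regret_overall = switchers * regret_among_switchers
print(f"{regret_overall:.1%}")  # 7.1% of Obama voters regret their vote, not 71%
```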
What shouldn’t you do?
Don’t take a poll report on face value. Don’t just read what some blogger or journalist said and pass it along.
It’s essential to actually look at the polling data and, most importantly, the wording of the actual questions.
You should be able to find the question wording, because disclosing it is part of the American Association for Public Opinion Research’s standards for best polling practices.
Do what I did when I saw that reporting on that poll:
Follow the links back to the actual data showing the question wording and percentages. Check it yourself.
Practice critical thinking.
If 71% of Obama voters actually regretted voting for him, consider what his approval rating would look like.
Obama won 53% of the vote. The other 47% probably disapprove of him in very large numbers.
And if 71% of the 53% who voted for Obama regret doing so, likely many of them also disapprove of him in very large numbers.
Thus you’d expect Obama’s job approval to be very, very low, perhaps around 20%. That’s low but not impossible. George W. Bush’s low point in Gallup polling was 25%. Nixon’s low point was 24% and Truman’s was 22%.
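That back-of-envelope check can be written out explicitly. The vote share and regret figure come from the discussion above; the 10% approval rate among non-Obama voters is purely an illustrative assumption, as is treating every regretful voter as a disapprover:

```python
# Sanity check: what WOULD approval look like if 71% of Obama voters regretted it?
obama_share = 0.53        # share of the vote Obama won
regret = 0.71             # the (misreported) regret rate among his voters
nonvoter_approval = 0.10  # assumed for illustration: few of the other 47% approve

approving_voters = obama_share * (1 - regret)                # ~15.4% of electorate
approving_nonvoters = (1 - obama_share) * nonvoter_approval  # ~4.7% of electorate

approval = approving_voters + approving_nonvoters
print(f"{approval:.0%}")  # roughly 20%, far below what the polls actually showed
```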
Yet you find that the Rasmussen poll, with a history of being a Republican-leaning polling operation, had Obama’s job approval around 50% when the “71% poll” was taken. The Gallup poll, which had a Republican skew in the 2012 election, had a job approval number several percentage points lower, but nowhere near where it would have to be if 71% of Obama voters actually regretted voting for him.
What to consider with any poll
We’ll have many, many polls coming this election year. Here’s what everyone should remember:
1. One poll is pretty meaningless. Always look at multiple polls and focus on trends.
2. Question wording is important, nay, critical.
3. Don’t forget that each poll is a sample with a margin of error — and there’s a one-in-twenty chance that a poll result will be off by even more than the margin of error.
4. Some polling operations, organizations or reporters get it wrong a lot when presenting poll data. (For an example of one of these, see “The Maine Heritage Policy Center is always wrong about polling” by my fellow blogger Mike Tipping, and note that this group’s online reporting adjunct, like much of the conservative blogosphere, misreported the “71%” poll).
You should treat those polling operations, organizations and reporters as less credible in the future.
They could regain credibility, but not until they have a better track record.
5. Never judge a result by whether it fits with what people you know say. People tend to hang out with people who agree with them. In social science speak, that group is a “self-selected sample” and it is pretty much guaranteed to be unrepresentative of the overall population.
6. Keep in mind that campaigns won’t release polling data unless it makes their candidate look like he or she is doing well.
7. Most of all, when a poll result is shocking, sharpen up your critical lens — whether or not you like what’s reported.