It's all the rage with news organizations. They pay someone to conduct a poll about the presidential race, and then they trumpet the results. If the poll finds some cool new angle on the race -- and these polls are often designed to do just that -- the punditry will pick it up and add their two cents to it for days, earning the news outlet that did the poll millions of dollars of free publicity.
A lot of what the media sells to the public as legitimate polling is complete garbage that political candidates' own campaigns would never pay for because it isn't worth the paper it is printed on. That's why candidates do their own internal polling rather than just use what appears in the news.
Unfortunately, this trash polling influences what people think is actually going on every bit as much as legitimate news coverage does.
The average person couldn't hope to find all the flaws in the way a given poll was conducted, because polls are statistically complex. But there are a couple of obvious red flags that anyone can watch for in these media-flavor-of-the-minute polls and the news coverage about them. When you see them, you might as well disregard the results.
Take the two big polls that dominated the news last week, the ones that claimed the Rev. Jeremiah Wright scandal hasn't hurt Sen. Barack Obama.
A three-day Pew Research Center poll claimed that Obama maintained a 10-point lead over Sen. Hillary Clinton after a serious battering over Wright's racially inflammatory sermons. Pew determined this by polling 1,503 adults who had heard "a lot" or "a little" about Obama's famous race speech and Wright's sermons.
Did you catch that? Pew polled adults. Not registered voters, not likely voters, but "adults." Will these people be voting? Have they voted in recent years? Ever? These are questions of huge significance if your aim is to find out what the electorate is actually thinking. If you don't poll the people who will show up on election day or those just like them, you are wasting your time.
Part of the reason pollsters who poll for media consumption survey "adults" without regard to their voting status and history is to save money by not prequalifying respondents as actual voters who will cast a ballot. Lists of actual voters, the kind who vote in every contest (A voters) or most contests (B voters), must either be obtained from boards of election across the country, or pollsters' staff must prequalify respondents by determining their voting history and then eliminating those who aren't likely to vote. To do that, pollsters often have to make double, triple or quadruple the calls, which costs a lot of money.
This is a huge problem, because adults who aren't registered or don't vote often fit a very different demographic profile than those who are registered and vote regularly. They tend to have lower incomes, be less educated and be less likely to follow the news, and they throw polls wildly off kilter.
Last week, the Wall Street Journal and NBC News published what a Journal article called a "myth-buster" poll that also claimed the Wright fiasco hadn't hurt Obama. Unlike the flawed Pew poll, which had Obama up by 10 points over Clinton, the Journal/NBC News poll showed Clinton and Obama in a tie nationally. It polled 700 "registered voters" over three days. That's better, but only slightly so, because of the Motor Voter Act, which allows people to register easily, in some cases by merely checking a box when they renew their driver's licenses. Tens of millions of people who have never seen the inside of a voting booth have been registered that way. The voter rolls are crammed with them. So polling registered voters is only a slight improvement over polling "adults." Again, no campaign manager in her right mind would pay for a poll of "registered voters" for the campaign's own internal use. If you can't ascertain whether poll respondents will be voting, what they think is irrelevant.
The widely publicized (unfortunately) Journal/NBC poll had other serious flaws. While a sample of 700 might work for a poll in a congressional district, nationwide it adds up to the equivalent of a mere 14 people per state, which is frighteningly thin. Even more bizarre is the fact that 25 percent of the respondents in the Journal/NBC poll were black, while African-Americans make up just over 12 percent of the American population. Since African-American voters overwhelmingly favor Obama, doubling their share of the sample makes the poll nearly useless.
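The arithmetic behind those two complaints is easy to check. Here is a back-of-the-envelope sketch using the textbook worst-case margin-of-error formula for a simple random sample; this is standard sampling math, not any pollster's actual (and usually more complicated) weighting methodology:

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n.
    Uses p = 0.5, which maximizes the variance p * (1 - p)."""
    return z * math.sqrt(0.25 / n)

# The Journal/NBC sample spread evenly across 50 states:
print(700 / 50)                                # 14.0 respondents per state

# Nationwide margins of error, small sample vs. a Gallup-sized one:
print(round(margin_of_error(700) * 100, 1))    # 3.7 points
print(round(margin_of_error(6600) * 100, 1))   # 1.2 points
```

A 3.7-point margin each way means a nominal "tie" at 700 respondents is consistent with anything from a dead heat to a solid lead, which is part of why the larger tracking samples are taken more seriously.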
Compare that with Gallup's big tracking polls -- not the dinky ones the organization sometimes does for media outlets -- which have a history of being fairly accurate and use samples in the 6,600-voter range.
So did the Rev. Wright fiasco hurt Obama? Last week the media pushed a story line saying it didn't. But since no one has yet conducted a decent public poll on the issue, your guess is as good as mine. Trusting polls with the flaws described above is akin to throwing darts in the dark.