I'll admit, there is very little about Sunshine State News' spin on the news that I agree with, but a few weeks back, I found a small place of common ground---and tomorrow, when they release a poll they have commissioned, they have a chance to walk the walk.
For those of you who don't know about Sunshine State News (SSN), it is a web-based news service, located in Tallahassee, that covers state politics, often from a perspective that makes Fox News look objective. Basically, it is a news wire that covers the blocking and tackling of Republican politics, with plenty of opinions on Democrats tossed in. That's fine---the more voices, the merrier.
In late May, Kenric Ward, who serves as SSN's chief political correspondent, as well as its top Miami Hurricanes fan, wrote a piece that outlined the, well, let's just say interesting, model used by Quinnipiac in its most recent poll. As Ward points out, that particular poll substantially over-represented independents, to the detriment of both Republicans and Democrats. Because the Democrats polled well with independents, that imbalance drove the margins to levels arguably higher than reality.
While surely Ward was trying to spin a poll that was bad for his point of view (something we are all guilty of doing), I nonetheless believe the point he was making stands: there is often far more to public polling than meets the eye. From my view, too often our friends in the media, including some of Ward's colleagues at SSN, print every public poll and the conclusions it reaches as though they were pure fact, even though the models used by public pollsters sometimes bear little resemblance to reality.
Renowned pundit Charlie Cook said in 2010 that "most academic polling, as well as the polling sponsored by local television stations and newspapers, is dime store junk." Now, in fairness, I am not sure I would go that far, but I do believe a lot of it is suspect from a methodology standpoint, particularly in a state like Florida. I also agree with Republican pollster Whit Ayres, who in the same piece said he trusts partisan and candidate polling more, since those of us in the business of politics have a real incentive to get the numbers right: numbers drive resource decisions.
To get it right in Florida, not only do you have to get your partisan balance right, but you have to land at the right place regionally. For example, the Tampa and Miami media markets are similar in size, but Tampa has a lot more voters. In 2010, I had it out with a public pollster who released results based on a survey that had 30% of the statewide vote in the Miami market and only 20% in Tampa, even though there is no scenario where Election Day would ever look like that. Ironically, this was a poll that showed the candidate I was helping with a significant lead, even though that wasn't the case at the time.
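To see how a regional skew like that plays out, here is a toy post-stratification sketch. All the shares below are hypothetical numbers for illustration, not figures from that poll or any real electorate model.

```python
# Toy illustration of why regional balance matters: if a sample
# over-represents one media market, a post-stratification weight
# (target share / sample share) shows how far off it is.
# All numbers below are hypothetical.
sample_share = {"Miami": 0.30, "Tampa": 0.20}   # shares in the survey
target_share = {"Miami": 0.22, "Tampa": 0.25}   # illustrative turnout shares

weights = {m: target_share[m] / sample_share[m] for m in sample_share}
for market, w in weights.items():
    print(f"{market}: each respondent counts as {w:.2f} voters")
# Miami respondents get down-weighted (~0.73) and Tampa respondents
# up-weighted (1.25); left unweighted, the topline leans toward
# whatever the Miami market happens to prefer.
```

The point of the exercise: a pollster who publishes the market breakdown makes it trivial to spot this kind of skew; one who doesn't leaves readers guessing.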
You also have to get the demographics right, which, in a state with diverse Hispanic and Black populations, isn't easy. Margin of error accounts for some of this, though as we all know, most people don't read polls with a real understanding of how margin of error works.
Now, why does any of this matter? In today's news environment, polling drives news. One day a poll shows you up three, and you have Big Mo. The next day you are down one, and you are losing, even though, given sampling/modeling differences and margin of error, those surveys may be saying the same exact thing. These swings drive opinion leader views, donor energy, the "enthusiasm" gap, as well as news coverage. I remember this all too well from 2008, when 2-3 bad polls in a row in early September led a couple of reporters to publicly ask, in news stories, whether Obama was done. Of course, a few days later the polls changed, and so did the narrative.
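To make the "up three, down one" point concrete, here is a rough sketch of the standard margin-of-error arithmetic. The 600-person sample size is my assumption for illustration, not a figure from any particular poll.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 600  # hypothetical sample size
moe = margin_of_error(n)
print(f"MOE at n={n}: +/- {moe * 100:.1f} points")  # about +/- 4 points
# Poll A: candidate up 3 (51.5 vs 48.5). Poll B: candidate down 1
# (49.5 vs 50.5). With each share carrying roughly +/- 4 points, both
# polls are consistent with anything from a dead heat to a modest
# lead either way -- statistically, they are the same result.
```

And since the margin on the *gap* between two candidates is roughly double the margin on a single share, the "swing" between those two polls is even less meaningful than it looks.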
To point out the absurdity of this, last summer, when I was helping my good friend Dan Gelber, a poll came out that showed he was losing to his primary opponent, Dave Aronberg, by a margin of something like 22-20, with nearly 60% undecided. This was before either candidate had done any real television, mail or other form of communication. The poll was reported in the news, and within minutes, my email was blowing up with reporters, donors and other activists wondering what we were going to do to close this deficit. Within literally an hour or two, a second poll came out which showed us ahead, something in the neighborhood of 21-19, with the same 60% undecided. This was similarly heralded by the same crowd as proof Dan was moving ahead. When the final bell rang in the primary, neither poll bore any resemblance to the final outcome. Interestingly for this blog, the only one who was close was Sunshine State News, and they weren't that close.
I get the allure of publishing these polls. One of the hard things about covering politics and elections is that there is only one scoreboard, and it doesn't come until Election Day. Therefore, reporters and pundits alike look for ways to create a score, and poll numbers, like fundraising numbers, provide a vehicle for doing that. But poll numbers, especially this far out, aren't worth anything---just ask President Giuliani, who, in many polls in 2007, was on pace to beat Hillary Clinton.
So here is my proposal to my friends in the media: publish all the public polling you want, but demand, as a condition of publication, that the pollster or the company commissioning the poll release its full methodology and model. Since most polls are published on blogs, not in the paper, it's not as though space constraints stand in the way of this level of reporting. If Mason-Dixon has numbers in Florida on the presidential race or the Governor's approval rating, publish them, but report the model (Republicans, Democrats, independents) and ask the firm to put together a memo that details the breakdown by media market and demographics.
When I've asked this of the press before, the usual response is something like "well, I am only publishing this online, or in a blog," as if that makes it OK. I also get that when a poll is bad for your side, it is always a "bad poll." But none of that diminishes the responsibility to ensure standards are met before publishing polls. More than that, what is on a blog these days is news, and as I discussed earlier, it has consequences for campaigns.
Some public pollsters do this well. PPP, for example, which does robo-polls (robo-polling may be an entirely different conversation), releases its entire cross-tabs, with the exact survey questions. So does SurveyUSA, another robo-firm. Mason-Dixon, Quinnipiac and Rasmussen won't, unless, in Rasmussen's case, you pay for them.
To publish a political science paper in a journal, you have to make your data available to others and explain how you reached your findings; heck, I had to do that even for papers I wrote in grad school. No good journalist would ever publish a candidate's internal poll without broad disclosure, so why is such great deference given to public polling? Given the stakes in politics, especially when you are playing in the statewide or presidential world, we should expect the same from those who publish public polling data. All it takes from the press is one more email, and a few more lines on their blogs.
And starting tomorrow, when Sunshine State News releases its first poll of 2011, they have an opportunity to set an example. So to my friend Kenric Ward, I look forward to seeing what the SSN poll looks like tomorrow---as well as how SSN got to the conclusions it will inevitably make about the numbers.