Nothing's better than getting PolitiFact editors on the record about PolitiFact. Their statements probably do more than anything else to show that the PolitiFact system encourages bias and the people who created it either don't realize it or couldn't care less.
Statements from
editors at PolitiFact's associated newspapers come in a close second.
Ted Diadiun, reader representative for the
Cleveland Plain Dealer (host of PolitiFact Ohio), answering a reader's question:
In July you printed a chart with two years of PolitiFact Ohio results. It showed Democrats with 42 ratings of Mostly False, False or Pants on Fire, while the Republicans had a total of 88 in those categories. Doesn't that prove you guys are biased?
Well, it doesn't necessarily prove that. It might prove instead that in the statements our PolitiFact team chose to check out, Republicans tended to be more reckless with the truth than Democrats.
Diadiun apparently doesn't realize that if PolitiFact Ohio chooses more Republican statements to treat harshly, that is a likely sign of institutional selection bias, unless PolitiFact Ohio either randomizes its story selection (highly unlikely) or happened by coincidence to choose a representative sample. How would we ever know the sample is representative unless somebody ran a controlled study? Great question. It's such a good question that, absent such a study, it is reasonable to presume that a disparity in PolitiFact's treatment of the respective parties results from ideologically influenced selection bias. That was the point of Eric Ostermeier's study of PolitiFact's 2011 "Truth-O-Meter" results.
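To make the statistical point concrete, here's a minimal sketch of how selection bias works, as a hypothetical simulation (the falsehood rate and selection weights are invented for illustration, not drawn from PolitiFact data). Both parties make false statements at exactly the same rate; the only thing that differs is which statements "interest" the editor:

```python
import random

random.seed(42)

# Invented assumption: both parties make false statements at the same
# underlying rate (30%). Any gap in the final tally therefore comes
# entirely from which statements the "fact checker" selects.
FALSEHOOD_RATE = 0.30

def make_statements(party, n=1000):
    return [{"party": party, "false": random.random() < FALSEHOOD_RATE}
            for _ in range(n)]

pool = make_statements("D") + make_statements("R")

def interests_the_editor(statement):
    # Modeled selection bias: a dubious-sounding claim from party R is
    # three times as likely to get checked as anything else.
    weight = 0.6 if (statement["party"] == "R" and statement["false"]) else 0.2
    return random.random() < weight

checked = [s for s in pool if interests_the_editor(s)]

for party in ("D", "R"):
    rated = [s for s in checked if s["party"] == party]
    print(f'{party}: {len(rated)} checked, {sum(s["false"] for s in rated)} rated False')
```

The published tally shows party R racking up far more False ratings even though, by construction, both parties lie at identical rates. The selection step, not the fact-checking, drives the totals.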
Diadiun, continuing his answer to the same question:
Or, it might prove only that there are a lot more Republicans who have been elected to the major offices that provide most of the fodder for fact-checking.
It would prove nothing of the kind. PolitiFact has one state operation in a state firmly controlled by Democrats: PolitiFact Oregon. Despite that Democrat-dominated political climate, PolitiFact Oregon rates the parties about evenly in its bottom three "Truth-O-Meter" categories (Republicans fare slightly worse).
Diadiun:
It is also a fact that Republicans had a few more statements rated True than the Democrats did, but the Truth-O-Meter was indeed a bit tougher overall on Republicans. You can find the report here.
Does that show bias? I've said it before, and I'll say it again here: The PolitiFact Truth-O-Meter is an arbitrary rating that has the often impossible job of summing up an arduously reported, complicated and nuanced issue in one or two words.
Diadiun goes on to tell his readers to "ignore" the Truth-O-Meter.
That's quite the recommendation for PolitiFact's signature gimmick.
He's partly right. PolitiFact ratings cram complicated issues into narrow and ill-defined categories. The ratings almost inevitably distort whatever truth ends up in the reporting. So shouldn't we ask why a fact checker steadfastly continues to use a device that distorts the truth?
The answer is pretty plain: The "Truth-O-Meter" gimmick is about money. PolitiFact's creators think it helps market the fact-checking service. And doubtless they're right about that.
There is a drawback to selling out accuracy for 30 pieces of silver: Contrary to Diadiun's half-hearted reassurances, the "Truth-O-Meter" numbers do tell a story of selection and
ideological bias. Readers should not ignore that story.
Jeff adds:
The hubris on display from Diadiun could fill gallon-sized buckets. Notice that he completely absolves PolitiFact of the role it plays in selecting which statements to rate, and immediately implies that "Republicans tended to be more reckless with the truth than Democrats." Incompetence or deceit are the only reasonable explanations for such an empty claim.
For the benefit of our new readers, I'd like to provide an exaggerated example of selection bias: Let's say I'm going to call myself an unbiased fact checker. Let's say I'm going to check four statements that interest me (as opposed to a random sample of claims). I'll check Obama's claim that he would close Guantanamo Bay, and his claim that he "didn't raise taxes once." I find he's telling falsehoods on both counts.
Next, I'll check Rush Limbaugh's statement that he's happy to be back in the studio after a long weekend. I'll also check his claim that he hosts one of the most popular radio shows in the nation. Of course, both claims are true.
What can we learn from this? According to PolitiFact's metrics, Rush Limbaugh is a bastion of honesty while Barack Obama is a serial liar. I'll even put out "report cards" that "reveal patterns and trends about their truth-telling." I'll admit the "tallies are not scientific, but they provide interesting insights into a candidate's overall record for accuracy." It's unbiased because, as the popular defense goes, I checked both sides. The fact that I checked only statements that interested me supposedly has no bearing on the credibility of the overall ratings. If you don't like the results, it's because you're biased!
See how that trick works?
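For anyone who wants the trick spelled out, here's a sketch of the "report card" arithmetic using Jeff's four hypothetical checks (the speakers, claims, and ratings are his invented toy example, not real PolitiFact data):

```python
# Four cherry-picked checks, tallied the way a "report card" would be.
ratings = [
    ("Obama", "would close Guantanamo Bay", False),
    ("Obama", "didn't raise taxes once", False),
    ("Limbaugh", "happy to be back in the studio", True),
    ("Limbaugh", "hosts one of the most popular radio shows", True),
]

report_card = {}
for speaker, claim, accurate in ratings:
    card = report_card.setdefault(speaker, {"True": 0, "False": 0})
    card["True" if accurate else "False"] += 1

print(report_card)
# {'Obama': {'True': 0, 'False': 2}, 'Limbaugh': {'True': 2, 'False': 0}}
```

A 100-percent versus zero-percent "accuracy record," manufactured entirely by the choice of which four statements to check.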
It's something to keep in mind the next time you see Obama earning a
Promise Kept for getting his daughters a puppy or a True for his claim that the
White Sox can still make the playoffs. A cumulative total of PolitiFact's ratings tells readers about the biases of PolitiFact's editors and which claims interest them. It's a worthless measure of anything else.