Wednesday, January 11, 2012

Mark Hemingway and Glenn Kessler on NPR

Mark Hemingway, who wrote a key critique of modern fact-checking operations back in December, appeared with the Washington Post's fact checker, Glenn Kessler, for a radio interview on NPR.  It's worth either listening to it or reading the transcript, but one particular section deserves special attention:
CONAN: Here's an email from Noreen(ph). I don't understand that the - that since - excuse me. I don't understand the idea that since PolitiFact demonstrates that Republicans lie three times as often as Democrats mean it's biased. Maybe Republicans actually do lie that much more. The idea that you have to have an even number of lies reported for Democrats and Republicans in order to be considered not biased is ridiculous. One side could lie way more than the other. And by trying to make them even, you are distorting fact. Is simple numerical balance an indication of nonpartisanship?

KESSLER: No. I don't look at them that way, and, as I said, I don't really keep track of, you know, how many Democrats or how many Republicans I'm looking at until, you know, at the end of the year, I count it up. My own experience from 30 years covering Washington and international diplomacy and that sort of thing is there's - both Democrats and Republicans will twist the truth as they wish if it somehow will further their aims. I mean, no one is pure as a driven snow here. And I've often joked that if I ever write an autobiography, I'm going to title it "Waiting for People to Lie to Me."

(SOUNDBITE OF LAUGHTER)

CONAN: That's something reporters do a lot. Mark Hemingway?

HEMINGWAY: Why - I think I said when I even brought this up. I mean, you know, I don't think that, you know, that, you know, numerical selection is indicative of, you know, bias per se. I just think that it's highly suspicious. When it's three to one, you know, if it were 60-40, you know, whatever, yeah, sure, you know? But when it's three to one, you start getting things where, you know, you start wondering about, you know, why the selection bias.

Hemingway's December article was quite valuable, but he missed an opportunity to explain an important aspect of Eric Ostermeier's examination of PolitiFact's story selection.

PolitiFact rated about the same number of politicians from each party, yet one party received significantly worse "Truth-O-Meter" ratings. The key inference behind Ostermeier's study rests on the expectation that a party-blind editorial selection process would choose the same types of stories for both parties. If Republicans really did lie more, the results should show approximately the same distribution of ratings for each party, with the more dishonest party simply accounting for a larger share of the total number of stories. The approximately even number of stories for each group is what throws a monkey wrench into Noreen's reasoning.
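To illustrate the reasoning (this is not Ostermeier's actual method or data, and the counts below are invented for the example), here is a minimal sketch in Python. It compares two hypothetical rating distributions with a chi-square test. Under party-blind selection, similar rating distributions with different totals would be unremarkable; sharply different distributions over roughly equal totals are the pattern that invites the selection-bias question.

```python
# Hypothetical illustration only -- the counts below are invented, not Ostermeier's data.
# Under party-blind story selection, the *shape* of each party's rating distribution
# should look similar even if one party's total count is larger.
from scipy.stats import chi2_contingency

ratings = ["True", "Mostly True", "Half True", "Mostly False", "False", "Pants on Fire"]

# Rows: Democrats, Republicans. Roughly equal totals, very different shapes.
observed = [
    [20, 25, 25, 15, 10, 5],   # hypothetical Democratic ratings
    [10, 15, 20, 25, 20, 10],  # hypothetical Republican ratings
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p-value says the two distributions differ; it cannot say whether the
# cause is one party lying more or the editors selecting different kinds of
# claims for each party -- which is exactly the ambiguity raised above.
```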

It would have been good if Hemingway had explained that during the broadcast.

As a side note, it's interesting that Kessler likewise ends up writing approximately as many stories about Democrats as about Republicans. Run the numbers for Kessler as Ostermeier did for PolitiFact and perhaps the same tendencies would appear. The obvious reason for focusing on PolitiFact instead of Kessler is PolitiFact's far greater volume of material.



(1/12/12) Jeff adds: There's a flaw that is often overlooked when discussing the "add 'em up" style of interpreting PolitiFact's ratings, and that's the quality of the fact checks themselves. Even if PolitiFact adhered to an objective formula for avoiding selection bias and rated 50 statements from the left and 50 from the right, that still wouldn't disprove an ideological lean.

Take, for example, the different standards PolitiFact used when rating similar statements from Herman Cain and Barack Obama. Both included the employer's portion of payroll taxes in their respective calculations, but in Cain's case PolitiFact downgraded him for it, while in Obama's case it pushed him higher up the rating scale. And that still doesn't take into account the dishonest tactic of inventing statements out of thin air.

It may be interesting to review the tallies of who gets what ratings and discuss the merits of the numbers. Ultimately, though, it's the alternating standards that offer the best evidence of PolitiFact's liberal bias.


(1/19/2012) Jeff adds: An additional flaw with adding up PolitiFact's ratings is that PolitiFact chooses who receives the rating in the first place.

When Obama claimed that preventative health care "saves money" and David Brooks said he was wrong, PolitiFact gave a "True" to Brooks. That serves a dual purpose: it spares Obama a "False" on the "report card" PolitiFact likes to shill so often, and it provides cover in the "we give Republicans Trues too!" sense.

When PolitiFact rated the oft-repeated, and false, claim about $16 muffins at a DOJ event, PolitiFact could have given the rating to NPR, the New York Times, or even (gasp!) PolitiFact partner ABC News. Instead, it chose to burden Bill O'Reilly with the falsehood, even though the original claim came from a government report.

It's these types of shenanigans that will always distort a ratings tally.

Update/clarification (1/14/2012):

Added "for a radio interview on NPR" to the first sentence.
