Wednesday, November 7, 2018

PolitiFact: "PolitiFact is not biased--here's why" Pt. 1

In an article titled "PolitiFact is not biased--here's why," PolitiFact Editor Angie Drobnic Holan offers four points as evidence that PolitiFact is not biased. This series deals with each of the four.


1. We fact-check inaccurate statements, not political parties.

We are always on the lookout for bad information that needs correcting. We don’t have any concern about which party it comes from or who says it. If someone makes an inaccurate statement, it gets a negative rating on our Truth-O-Meter: Mostly False, False or Pants on Fire.
If we at PolitiFact Bias were to publish a story making an assertion, we would certainly try to produce some type of palpable evidence in its support. We find PolitiFact's article striking for its lack of evidence supporting the claim in its title.

Let's assume for the sake of argument that it's true PolitiFact fact checks inaccurate statements and not political parties. We find both assertions questionable, but we can set that aside for the moment.

What stops a biased fact checker from allowing factors like confirmation bias to guide its selection of fact checks to reflect an ideological bias? This is an obvious objection to the first part of Holan's argument, but her article completely fails to acknowledge it. If Holan assumes that PolitiFact has no bias and therefore no confirmation bias can result, then her argument begs the question (circular reasoning: PolitiFact is not biased because PolitiFact is not biased).

If Holan isn't using circular reasoning then she's simply not addressing the issue in any relevant way. Fact-checking inaccurate statements and not political parties does nothing to show a lack of bias.

The Elephant in the Room (a foreshadowing pun)

In early 2011 Eric Ostermeier of the University of Minnesota did a study of PolitiFact's ratings. Ostermeier found Republicans were receiving worse treatment in PolitiFact's ratings. Ostermeier noted that PolitiFact's descriptions of its methodology offered no assurance at all that the skew in its ratings was unaffected by selection bias. In other words, was unrepresentative sampling responsible for making it appear that Republicans lie more?

Ostermeier posed an important question that PolitiFact has never satisfactorily addressed:
The question is not whether PolitiFact will ultimately convert skeptics on the right that they do not have ulterior motives in the selection of what statements are rated, but whether the organization can give a convincing argument that either a) Republicans in fact do lie much more than Democrats, or b) if they do not, that it is immaterial that PolitiFact covers political discourse with a frame that suggests this is the case.

The evidence says PolitiFact's story selection is biased

While developing our own research approaches to PolitiFact's ratings, we arrived at an observation that, we contend, strongly suggests PolitiFact is guilty of selection bias.

Imagine PolitiFact used only its editorial judgment of whether a statement seemed so false that it was worthy of a fact check and was completely blind to political party and ideology.

We say that, regardless of whether one party lies more, the results should come out roughly proportional. If 40 percent of PolitiFact's ratings of Republicans come out "Pants on Fire" or "False," then roughly the same should hold true of Democrats. If Republicans lie more, that should show up in the number of ratings, not in the proportions.

PolitiFact all but admitted to selection bias in its early days. PolitiFact founding editor Bill Adair said PolitiFact tried to do a roughly equal number of fact checks for Republicans and Democrats. That makes no fewer than two criteria for selecting a story, and one of them is not simply whether the statement appeared false. Trying to fact-check Republicans and Democrats equally will skew the proportions (unless the parties lie equally and PolitiFact's sample is effectively random).
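The quota effect described above can be illustrated with a toy simulation. Every number below is our own invented assumption for illustration (the falsity scores, pool sizes, the 0.8 "harsh rating" cutoff, the 200-check quota), not PolitiFact data. The sketch only demonstrates the mechanism: an equal per-party quota combined with unequal lying rates skews the rating proportions even when every individual selection is honest.

```python
import random

random.seed(0)

# Each statement gets a "falsity" score in [0, 1]; we treat a score
# above 0.8 as a harsh rating ("False" or "Pants on Fire").
def pool(n_clearly_false, n_other):
    clearly_false = [random.uniform(0.8, 1.0) for _ in range(n_clearly_false)]
    other = [random.uniform(0.0, 0.8) for _ in range(n_other)]
    return clearly_false + other

# Assumed, hypothetical numbers: Republicans produce more clearly
# false statements than Democrats in equal-sized statement pools.
rep = pool(300, 700)   # 30% clearly false
dem = pool(120, 880)   # 12% clearly false

QUOTA = 200  # equal number of fact checks per party

def top_quota(scores, quota):
    """Honestly pick the most false-looking statements, up to the quota."""
    return sorted(scores, reverse=True)[:quota]

def harsh_share(checked):
    """Fraction of checked statements that earn a harsh rating."""
    return sum(s > 0.8 for s in checked) / len(checked)

# The Republican quota fills entirely with clearly false statements,
# while the Democratic quota must reach down into less-false material:
print(f"GOP harsh share: {harsh_share(top_quota(rep, QUOTA)):.0%}")   # 100%
print(f"Dem harsh share: {harsh_share(top_quota(dem, QUOTA)):.0%}")   # 60%
```

Note that the skew emerges from the quota alone: within each party the selector picks the most false-looking statements with no dishonesty at all, yet the two parties' harsh-rating proportions diverge sharply.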

In Ostermeier's research, Republicans' statements were 39 percent "Pants on Fire" or "False" while Democrats' statements were 12 percent "Pants on Fire" or "False." That's strong evidence of selection bias.

Note: We have not tracked these numbers through the present. Perhaps PolitiFact is closer to rating claims proportionally now than it was in Adair's time. If it is, then PolitiFact could present that as evidence it is blind to ideology when it chooses which claims to check.

Until PolitiFact answers Eric Ostermeier's question it is unsafe to conclude that PolitiFact lacks bias.
