Wednesday, January 20, 2016

PolitiFact's "Pants on Fire" bias, 2015 update

It's time again for our annual update to our research on PolitiFact's bias in applying its "Pants on Fire" rating. And the results for 2015 show a surprisingly low bias against Republicans. Read on.

PolitiFact's "Pants on Fire" bias

As we have noted, PolitiFact has never provided any objective means of distinguishing between its "False" and "Pants on Fire" ratings on its trademarked "Truth-O-Meter." By PolitiFact's telling, the only difference between the two ratings is that "Pants on Fire" statements are ridiculous as well as false.

We did an extensive survey of the reasons PolitiFact has given in its stories for its ratings and failed to find even an informal criterion that might pass as objective.

This research project does not focus on whether Republicans simply receive more "Pants on Fire" ratings than Democrats. We look at proportions, not raw numbers. For each party, we take the total number of false ratings ("False" plus "Pants on Fire") and divide the party's number of "Pants on Fire" ratings by that total, giving the percentage of each party's false statements rated "Pants on Fire." Dividing one party's percentage by the other's gives us a number we call the "PoF bias number." And what is the significance of that number?

The PoF bias number shows which political party's false statements are more likely to receive a "Pants on Fire" rating. If the difference between "False" and "Pants on Fire" is subjective, as PolitiFact's definitions and our research appear to indicate, then the number measures which party is more likely to receive the subjective "Pants on Fire" rating.
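For readers who want the arithmetic spelled out, here is a minimal sketch in Python. The counts are hypothetical, chosen only to illustrate the computation; they are not PolitiFact data.

```python
# Minimal sketch of the PoF bias number computation.
# All counts are hypothetical, for illustration only.

def pof_rate(pants_on_fire, false_only):
    """Share of a party's total false ratings ("False" plus
    "Pants on Fire") that received the "Pants on Fire" rating."""
    return pants_on_fire / (pants_on_fire + false_only)

# Hypothetical counts: 30 of 100 GOP false ratings are "Pants on Fire,"
# versus 10 of 50 for Democrats.
gop_rate = pof_rate(pants_on_fire=30, false_only=70)  # 0.30
dem_rate = pof_rate(pants_on_fire=10, false_only=40)  # 0.20

# The PoF bias number is the ratio of the two percentages.
pof_bias = gop_rate / dem_rate
print(round(pof_bias, 2))  # 1.5 -- GOP false statements 50 percent more
                           # likely to draw the "Pants on Fire" rating
```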

In 2011, the first year we did the study, Republican statements were 57 percent more likely to receive a "Pants on Fire" rating than statements from Democrats. As the chart below shows, 1.57 serves as the corresponding PoF bias number.

In PolitiFact's very first year, Democrats received the higher percentage of "Pants on Fire" ratings. As it turns out, PolitiFact's founding editor, Bill Adair, has said the "Pants on Fire" rating started out as a device for "light-hearted" fact checks. Ever since "Pants on Fire" stopped being a joke at PolitiFact, Republicans have been more likely to have their false statements ruled "Pants on Fire."

Republicans' false statements were far more likely than Democrats' to receive a "Pants on Fire" rating from 2009 through 2012.



YEAR    PoF Bias            Sel. Bias
        GOP     Democrat    GOP     Democrat
2007     --      2.50       1.00     1.00
2008    1.31      --        1.53      --
2009    3.14      --        1.48      --
2010    2.75      --        3.15      --
2011    1.57      --        3.83      --
2012    2.25      --        2.52      --
2013    1.24      --        1.81      --
2014    1.95      --        2.56      --
2015    1.10      --        4.83      --
Chart notes: Columns 2 and 3 show the PoF bias number. Columns 4 and 5 show the "selection bias number," which compares the total number of false claims from Republicans with the total from Democrats.
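Continuing the hypothetical counts from the sketch above, the selection bias number works out as follows.

```python
# Selection bias number, continuing the hypothetical counts above:
# the ratio of the parties' total false ratings ("False" plus "Pants on Fire").
gop_total_false = 30 + 70   # 100 total false ratings for the GOP
dem_total_false = 10 + 40   # 50 total false ratings for Democrats
selection_bias = gop_total_false / dem_total_false
print(selection_bias)  # 2.0 -- PolitiFact rated twice as many GOP claims false
```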


Our results for 2015 proved intriguing.

Republicans' false statements were only 10 percent more likely than Democrats' false statements to receive the "Pants on Fire" rating. In terms of historical patterns, the bias against Republicans in 2015 was minimal.

But given the current mainstream media narrative that Republicans lie wantonly, perhaps more than any party in history, how should we interpret the data?

The Matter of Interpretation

We've had critics insist that a high PoF bias number against Republicans is best interpreted as a sign that Republicans simply lie more. For those critics, this year's findings present a problem. Is the supposed trend of Republican lying tailing off?

We'd love to see our critics try to make that argument.

We would suggest two hypotheses to help explain the numbers.

First, we would suggest the numbers do not necessarily need much explanation. Simple regression to the mean may account for the surprisingly low measurement of anti-Republican bias in applying the "Pants on Fire" rating. The PoF bias number for the past three years, after all, falls very close to the average for the preceding years. However, we looked for an alternative explanation anyway, because if PolitiFact journalists allow ideological bias to affect their ratings, we should predict that the "Republicans lie more" narrative would make the recent numbers look worse than usual for Republicans.

The Secondary Hypothesis

Our secondary hypothesis came from our past analysis of the PoF bias number for PolitiFact's state operations.

Some of the PolitiFact state operations had PoF bias numbers favoring Republicans instead of Democrats despite paying relatively little attention to statements from Democrats. We hypothesized that some PolitiFact states might be applying a compensatory bias: if a state operation rated many Republican statements "False" while rating very few statements from Democrats, it might rate those few Democrat statements more harshly to appear more fair.

So, we hypothesized that PolitiFact states might go harder on false statements from Democrats to make the large number of false statements from Republicans look less like a sign of journalistic bias.

As a corollary to the secondary hypothesis, we noted with our first publication of our research that PolitiFact staffers could read our research and act to change future outcomes. That possibility does not concern us much, for if the bias in applying the "Pants on Fire" rating stabilizes near a more equitable level, it tends to support our interpretation of the data for PolitiFact's earlier years.

Are Democrats Telling the Truth More and More?

Though our research project focuses on the percentages of "Pants on Fire" ratings compared to the total number of statements PolitiFact views as false (making the number totals unimportant to the PoF bias research), we end up collecting data on the total number of false statements as a matter of course.

Those data do appear to tell an interesting story: PolitiFact National seems to have increasing difficulty in rating Democrats' statements false.

Note that we sift the PolitiFact ratings to obtain the data most likely to measure bias accurately. We're talking about the "Group A" data described in our first research paper, which consists of ratings of party candidates, elected officeholders, partisan appointees or party organizations. We exclude statements obviously attacking members of one's own party. Still, the shrinking number of false statements from Democrats stands out:

[Chart: Democrats' false statements ("False" plus "Pants on Fire") by year. Numbers exclude party-on-party claims, as where a Democrat attacks another Democrat.]
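As a rough illustration of that sifting, a filter along these lines captures the idea. The record format and field names are our own invention; PolitiFact does not publish its ratings in this form.

```python
# Hypothetical sketch of the "Group A" sifting described above.
# The record fields are assumptions, not PolitiFact's data format.

GROUP_A_SOURCES = {"candidate", "officeholder", "appointee", "party organization"}

def is_group_a(rating):
    """Keep ratings of partisan figures and groups, excluding statements
    that obviously attack members of the speaker's own party."""
    partisan = rating["speaker_role"] in GROUP_A_SOURCES
    party_on_party = rating.get("target_party") == rating["speaker_party"]
    return partisan and not party_on_party

# Example (hypothetical record): a Democrat officeholder attacking a
# Republican stays in Group A; a Democrat attacking a Democrat does not.
print(is_group_a({"speaker_role": "officeholder",
                  "speaker_party": "Democrat",
                  "target_party": "Republican"}))  # True
```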

PolitiFact had little difficulty finding false statements from Democrats over its first three years. The low mark during those first three years was 18, set in 2007, when PolitiFact operated for only about half the year. Since that first year the number has declined or stayed even every year save for one. The number of false statements from Democrats skyrocketed from 20 all the way up to 23 in 2011 before resuming its decline.

Why is PolitiFact now finding only half the false statements per year that it found over its first three years? What explains that decline? Are Democrats edging closer to the truth on the whole? Was PolitiFact able to catch a full-fledged partisan Democrat telling a falsehood just once per month, on average, in 2015?

We doubt Democrats have significantly changed their approach to political speech. We think the explanation lies with the fact checkers buying the idea that refusing to let their bias lead them (they don't see it as bias) results in a "false equivalence" in fact checking. PolitiFact produces results consistent with what we should expect from left-leaning fact checkers.
