
Monday, January 13, 2025

PolitiFact's "Pants on Fire" bias in 2024

For years, PolitiFact Bias has tracked the proportion of false ("False" plus "Pants on Fire") statements PolitiFact rated "Pants on Fire." As PolitiFact has never established an objective distinction between the two ratings, we infer that the difference between the two is substantially or wholly subjective. That makes this dividing line perhaps the best means of using PolitiFact's own ratings to measure its political bias.

As we are looking at proportions and not raw numbers for the bias measurement, the results cannot be dismissed on the basis that Republicans supposedly lie more.

The Tale of the Tape in 2024

Graphs-a-plenty this year!

We'll start with the dual graph of the PoF Bias number along with the story selection proportion number. The PoF Bias number could be expressed in either of two ways. As the numbers have pretty consistently shown an anti-Republican/pro-Democrat bias, we express the number so that a value greater than 1 shows anti-Republican bias. Under that convention, a PoF Bias number less than 1 shows the PoF bias harmed Democrats. Our chart shows that occurring in four different years (2007, 2011, 2013, 2015). But it's important to point out that the state franchises accounted for the apparent relative evenhandedness in the latter three years. We tracked PolitiFact National separately, and only 2007 and 2017 showed the anti-Democrat bias. The year 2007 counts as a statistical anomaly, we would say, as PolitiFact treated the "Pants on Fire" rating as a joke at first.


The chart shows that after 2007 Republicans consistently had more false ratings than Democrats. In 2024 that preference for GOP falsehoods fell just short of the record for 2020. For both years, PolitiFact gave the GOP more than five times the number of false ratings it gave Democrats.

Because Republicans lie more?

Not so fast! Here's where the PoF Bias number shows its value. The PoF Bias number compares the percentages of false statements rated "Pants on Fire" for each party. PolitiFact has never offered an objective means of separating ridiculously false statements from those that are merely false. As the number represents a proportion, it is immune from influence by the sheer number of false ratings. Put another way, it's entirely independent of the Selection Proportion number.

In 2024, PolitiFact was over six times more likely to (subjectively) rate a Republican false claim "Pants on Fire" than a false claim from a Democrat. That figure easily eclipsed the old record of 4.58 times more likely, set in 2020.

Democrats Lie Less?

Interestingly, the recent-year spikes in the PoF Bias number are not driven by increases in "Pants on Fire" ratings given to Republicans. Those have actually moderated. The higher bias number stems instead from PolitiFact having increasing difficulty bringing itself to rate a Democrat "Pants on Fire." In 2010, PolitiFact meted out 31 "Pants on Fire" ratings to Democrats. That number has shrunk pretty steadily over time, with the Democrats setting a new record for PoF avoidance in 2024. Only one false Democrat claim received a "Pants on Fire" rating, just 4 percent of the total false ratings.




PolitiFact encountered a similar distaste for giving Democrats false ratings of any kind. "False" and "Pants on Fire" combined fell from a peak of 135 in 2012 to 25 in 2024. That figure was the lowest for any presidential election year over PolitiFact's entire history.
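
Those two figures cross-check, by the way. A trivial verification, using only the counts from the paragraphs above:

```python
dem_pof_2024 = 1     # "Pants on Fire" ratings given to Democrats in 2024
dem_false_2024 = 25  # "False" plus "Pants on Fire" ratings given to Democrats in 2024
print(dem_pof_2024 / dem_false_2024 * 100)  # 4.0 (percent), matching the figure above
```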

Republicans Lie More and Less?

Probably thanks to social media dollars drawing fact checkers away from politicians and toward fake news and social media hijinks, PolitiFact is finding fewer false claims from Republicans. No! We're not kidding.



Check the presidential election year peaks.

2008: 55 (PolitiFact's infancy)
2012: 247 (Good ol' Romney)
2016: 237 (Dawn of the Trump)
2020: 253 (Day of the Trump)
2024: 132 (Return of the Trump)

PolitiFact gave the GOP barely half the false ratings it did in 2020. When former PolitiFact editor Bill Adair runs around in support of his new book decrying an explosion in political falsehoods, what is he talking about? PolitiFact has apparently cut Democrat falsehoods down to almost nothing and cut Republican falsehoods nearly in half.

No, of course we don't believe that. A fool would believe that. We say the "Truth-O-Meter" numbers give us information about PolitiFact, not about the figures on whom they stand in judgment.

Thursday, December 19, 2024

The PolitiFact Wisconsin story

This article is a companion to Bryan's forthcoming review of former PolitiFact editor Bill Adair's book, "Beyond the Big Lie."

In my review of Bill Adair's book I refer to the way PolitiFact's state operations like PolitiFact Wisconsin tended to favor Republicans during years Adair excluded from his dataset. Readers of that Substack article may find this explanation helpful.

Research published here at PolitiFact Bias has examined the bias PolitiFact applies in the use of its "Pants on Fire" ratings. The difference between "False" and "Pants on Fire" appears entirely subjective, resting squarely on the term "ridiculous." Until PolitiFact defines "ridiculous" in a reasonably objective way, its descriptions to date strongly encourage the view that the term is subjective.

Until 2020, a "Wisconsin" tag on a PolitiFact story dependably indicated that staffers from PolitiFact's affiliate performed the fact checks. We stopped tracking state data after 2020 because the stories could as easily come from PolitiFact National staffers. We also had reason to believe the state affiliates were no longer in charge of determining the "Truth-O-Meter" ratings.

"Pants on Fire" Bias at PolitiFact Wisconsin

Wisconsin was unusually tough on its Democrats compared to most other PolitiFact operations. Whereas PolitiFact National gave Democrats a "Pants on Fire" for about 17 percent of their false statements from 2007 through 2019, PolitiFact Wisconsin gave them over 27 percent, slightly higher than the 27 percent average Republicans received from PolitiFact National.

Raw Numbers at PolitiFact Wisconsin

Adair's claim that Republicans lie more doesn't rest on percentages, though. Adair sticks with raw numbers of disparaging ratings.

There, too, PolitiFact Wisconsin moderated the bias of the larger organization.

Republicans "earned" about 40 percent more "False" plus "Pants on Fire" ratings than did Democrats from PolitiFact Wisconsin. In contrast, PolitiFact National gave Republicans over 300 percent (3x) more such ratings than Democrats.

The tendency in Wisconsin, as this graph helps show, matches that for PolitiFact as a whole. It isn't that Republicans lie more. It's that Democrats lie less and less.


Where did the Democrat lies go? Did PolitiFact and other fact checkers force them to clean up their act? Did fact checkers at long last realize that they had been too tough on Democrats early on?

Did narrative increasingly conquer objectivity?

Thursday, February 8, 2024

The "Pants on Fire" bias study updated through 2023

 We have updated our "Pants on Fire" bias study with data from 2023.

What is it? We use a spreadsheet to track all "False" and "Pants on Fire" ratings given to partisan Republicans or Democrats, whether candidates, officeholders or appointed administration officials, plus party officials and organizations. We then calculate the percentage of false ("False" plus "Pants on Fire") ratings given the "Pants on Fire" rating.

Why do we do it? Because PolitiFact has never offered an objective means of distinguishing its "False" rating from its "Pants on Fire" rating, we infer that the difference is either substantially or wholly subjective. Assuming the substantial subjectivity of the ratings, we expect that differences in the percentages will help identify PolitiFact's partisan bias, if any.
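
We keep the data in a spreadsheet, but the arithmetic is simple enough to show in a few lines of Python. This is an illustrative sketch, not our actual tooling, and the sample counts at the bottom are hypothetical:

```python
def pof_share(pants_on_fire: int, false_only: int) -> float:
    """Share of all false ratings ("False" plus "Pants on Fire")
    that received the harsher "Pants on Fire" rating."""
    return pants_on_fire / (pants_on_fire + false_only)

def pof_bias_number(gop_pof: int, gop_false: int,
                    dem_pof: int, dem_false: int) -> float:
    """Ratio of the GOP share to the Democrat share.
    1.0 = perfect balance; above 1.0 = harsher on Republicans."""
    return pof_share(gop_pof, gop_false) / pof_share(dem_pof, dem_false)

# Hypothetical counts for illustration:
# GOP: 30 "Pants on Fire" + 70 "False" -> 30 percent share
# Dem:  6 "Pants on Fire" + 34 "False" -> 15 percent share
print(pof_bias_number(30, 70, 6, 34))  # 2.0: GOP false claims twice as likely to get "Pants on Fire"
```

Because each party's share is a proportion of its own false ratings, the bias number stays independent of how many false ratings each party receives in the first place.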

Here's the updated chart:


What have we learned so far?

We've learned that national PolitiFact after 2007 shows a consistent bias for Democrats/against Republicans. That trend shows poorly on the graph above because the graph includes ratings from PolitiFact's various state operations. Before PolitiFact changed its website, making it far less clear which franchise was responsible for what, we kept track of each part of the organization separately. The years from 2010 through 2015 show a moderation of bias thanks to state operations that sometimes were legitimately tough on Democrats. PolitiFact Wisconsin was notably tough on Democrats during that period, for example.

By looking at the total number of various ratings given to the political parties, we've also noted that Republicans (after 2007) receive far more of PolitiFact's bottom two ratings. That effect may stem from Republicans lying more, or simply from bias in story selection and ratings. We've documented enough story selection bias and ratings bias to reasonably prefer the second option. That's where the evidence leads.

If, as the available evidence suggests, PolitiFact's "Pants on Fire" rating has no objective basis, "Republicans lie more" carries no objective explanatory value respecting the percentages on our graph.

We've also learned that harsh ratings for both parties are on the decline, in terms of raw numbers. The most obvious explanation for that trend stems from PolitiFact's social media partnerships. When PolitiFact fact checks a politician, the revenue comes from donations, grants and advertising. But when PolitiFact fact checks something for its social media partners, there's a payday for that. PolitiFact discloses that more than 5 percent of its revenue comes from the social media company Meta. The Chinese social media company TikTok likewise accounts for over 5 percent of PolitiFact's revenue.

Why doesn't PolitiFact offer more transparency than that regarding its income? Good question, but we don't have an answer free of conjecture.

As for our study of PolitiFact's numbers in 2023, the Republican average fell well below its historic norm, establishing an all-time low for the GOP. PolitiFact's ratings of Democrats pulled their historic average down for the eighth straight year.

A potential weird Trump effect?

The percentages for Republicans haven't really changed much over the years, defying the existence of any Trump effect in terms of increasing Republican dishonesty (in PolitiFact's data, anyway). But the percentages for Democrats have declined noticeably since around 2016 as Trump ascended politically.

Could Trump help explain an increase in Democratic Party honesty?

More likely those changes happen because the makeup of PolitiFact's franchises has shifted over time. State franchises no longer take the edge off the pro-Democrat bias of national PolitiFact. 

Thursday, June 8, 2023

Have politicians discovered asbestos pants? (Update/Correction)

We think PolitiFact's "Truth-O-Meter" rating system offers ample evidence of PolitiFact's bias.

Why?

1) It's an admittedly subjective rating system.

2) Rating patterns differ widely at different franchises.

3) Fundamentally similar fact checks may conclude with very different ratings.

And to those three reasons we add a fourth, newly ascendant in the data we collect:

4) "Pants on Fire" seems to be going extinct/extinguished.

Have politicians discovered asbestos pants?

Through June 8, 2023, the number of "Pants on Fire" ratings given to politicians stands at five. Five.

Correction 6/22/2023: We apparently made a careless error in transcribing the number of Pants on Fire ratings given to party politicians during the first half (or so) of 2023. The correct number was two, not five. The corrected number only strengthens our point that "Pants on Fire" numbers have fallen off a cliff. Yes, the chart is wrong as well in reporting five in 2023.


From 2007 through 2009, PolitiFact was just starting out, which helps explain the low numbers during that period. In 2010 state franchises such as PolitiFact Texas and PolitiFact Florida started to contribute heavily to the number of ratings, including "Pants on Fire" ratings.

The era of Bill Adair's directorship was in full flower through 2013. We see the three-year spike of GOP "Pants on Fire" ratings and a rise followed by a slow decline in Democratic Party "Pants on Fire" ratings.

Current Editor-in-Chief Angie Drobnic Holan took over from Adair, and under Holan we observe a decline in "Pants on Fire" ratings for Democrats. We see the same for Republicans, notwithstanding notable election-year spikes in 2016 and 2020.

So far, the year 2023 stands out for its exceptionally low numbers.

"Republicans Lie More!"

Oh, please!

As a catchall excuse for weird PolitiFact data, that just won't cut it. It does not work as an excuse for PolitiFact's selection bias problem. It does not explain PolitiFact's biased application of "Pants on Fire" ratings. And it cannot ever explain lower numbers of "Pants on Fire" ratings given over time to both political parties.

So, what's the explanation?

The simplest explanation boils down to money. PolitiFact gets paid for its role as the falsehood-sniffing dog for social media censors. The most recent page of "Pants on Fire" ratings on PolitiFact's website is filled with ratings given for social media claims, with not one given to a party officeholder, candidate, appointee or the like. Not one. On the previous page there's one, for Donald Trump, given back in May.

That suggests PolitiFact now takes a greater interest in its social media role than in holding politicians accountable. To be fair, however, PolitiFact can still manipulate political messaging effectively by giving poor ratings to messages Republicans are likely to share. Rating one social media claim, no matter who it's from, can justify stuffing a sock in the social media mouth that would repeat it.

An alternative explanation? Politicians, both Democrat and Republican, are lying less.

It will be fun to see whether fact checkers try to take credit for making politicians more truthful without any sound basis for that claim.


Monday, January 2, 2023

PolitiFact's "Pants on Fire" bias in 2022

Back in 2012, we started an ongoing study of PolitiFact's bias particular to its decisions on "False" versus "Pants on Fire" ratings, given that the difference counts to all appearances as "entirely subjective."

This post updates that research with observations about 2022.

The "Pants on Fire" Bias

PolitiFact has never described an objective basis for deciding between a "False" rating and a "Pants on Fire" rating on its "Truth-O-Meter." Our research approach predicts that PolitiFact's bias will drive a preference for one party over the other in making those decisions. That bias we express as the "PoF Bias number," where 1.0 shows perfect balance between the two parties. A figure below 1.0 shows PolitiFact favoring Republicans and a figure over 1.0 shows PolitiFact favoring Democrats.

 

For 2022, PolitiFact scored its third-highest PoF Bias number since it started in 2007. Of note, these figures include all of PolitiFact's state franchises. Tracking PolitiFact National by itself, as we once did, shows a sharp lean to the left during the 2010-2015 period. State franchises provided the balance shown during that time. The more recent spikes in bias likely stem from a combination of new, more left-leaning franchises and fact checker zeal over President Trump.

The cumulative PoF Bias number stands at a relatively modest 1.33 even with the recent left-leaning spikes. So, over its history, PolitiFact is 33 percent more likely to give a claim it deems false a "Pants on Fire" rating when the claim comes from a Republican rather than a Democrat. That figure for PolitiFact National from 2007 through 2019 was 56 percent.

That variation, by the way, supports the hypothesis that different PolitiFact fact checkers display differing trends in their fact-checking. If, for example, PolitiFact National leans more left than state franchises then increased control over those franchises by National should show increased left-leaning bias.

Trends

We observe a fascinating trend at PolitiFact toward lower numbers of fact checks for politicians. Probably thanks to the lure of social media dollars, PolitiFact's timeline shows increased checking of social media claims over time. That's understandable, as social media companies reimburse fact-checking partners like PolitiFact for their work. We noted that trend in a separate dataset including PolitiFact's fact checks of all U.S. politicians. We also see it reflected in this study focused only on "False" and "Pants on Fire" ratings of politicians, though with a notable spike during the 2020 election year.

Why was PolitiFact able to find over 100 false claims from Democrats each year from 2010 through 2012 but unable to crack the peak of 68 in the years since? Though we've noticed PolitiFact claiming politicians have started communicating with greater care, we do not find that explanation plausible without specific supporting evidence. Selection bias likely serves as an adequate explanation. And PolitiFact's numbers show an increased reluctance to rate false statements from Democrats as "Pants on Fire."

In 2022 PolitiFact established a new low for Democrats, rating only 6.06 percent of Democratic Party false claims as "Pants on Fire." That edged marks of 6.12 percent in 2018 and 6.38 percent in 2020.

PolitiFact also gave Republicans "Pants on Fire" ratings at the lowest annual rate ever, 21.7 percent. Of course, that rate for Republicans was still over three times greater than the rate for Democrats.
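
A quick check of that "over three times" figure from the two rates just cited:

```python
gop_rate = 21.7  # percent of GOP false claims rated "Pants on Fire" in 2022
dem_rate = 6.06  # percent of Democrat false claims rated "Pants on Fire" in 2022
print(round(gop_rate / dem_rate, 2))  # 3.58: over three times greater
```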

Monthly Tracking

We experimented with creating a monthly chart of 2022's numbers. We expected the chart to show reasonably stable trends despite the relatively low number of ratings in 2022. Small randomized samples should show relatively greater variation. PolitiFact, of course, makes no apparent attempt to randomize its dataset. Selection bias may explain the relatively stable numbers we see on the chart.



Thursday, March 17, 2022

PolitiFact's "Pants on Fire" bias in 2021

As we noted in our post about the "Pants on Fire" research for 2020, we have changed the way we do the research.

PolitiFact revamped its website in 2020, and the update made it next to impossible to reliably identify which of PolitiFact's various franchises were responsible for a fact check. Instead of focusing on PolitiFact National, it makes more sense to lump all of PolitiFact together. But the new approach has a drawback. The new evaluations represent an apples-to-oranges comparison to the old evaluations.

To deal with that problem, we went back and did PolitiFact's entire history since 2007 using the new method.

With the research updated, we can now compare results from the new method against results from the old.

Spoiler: Using the new method, PolitiFact was 2.66 times more likely to rate a claim it viewed as false "Pants on Fire" when the claim came from a Republican rather than a Democrat. That's PolitiFact's third-highest bias figure of all time, though PolitiFact National, considered separately, has exceeded that figure at least three times.

 

Method Comparison: New vs. Old 

Our new graph shows the old method, running from 2007 through 2019, along with the new method graphed from 2007 through 2021.


The black line represents the old method. The red line represents the new.

The numbers represent what we term the "PoF bias number," an expression of how much more likely it is that PolitiFact will give a claim it regards as false a "Pants on Fire" rating for a Republican than for a Democrat. So, for 2009 under the old method (black line), the GOP was 3.14 times more likely to have one of its supposedly false statements rated "Pants on Fire."

As our research has documented, PolitiFact has never offered an objective means of determining the ridiculousness of a claim viewed as false. The "Pants on Fire" rating, to all appearance, has to qualify as a subjective judgment. In other words, the rating represents PolitiFact's opinion.

In 2017, under the old method, the bias number dropped to 0.89, showing a bias against Democrats for that year at PolitiFact National. On average over time, of course, Republicans were significantly more likely to have their false claims regarded as "ridiculous" by PolitiFact.

Notably, the new method (red line) shows a moderating effect on PolitiFact's "Pants on Fire" bias from 2008 through 2014. The red line hovers near 1.00 for much of that stretch. After 2015 the red line tends to run higher than the black line, with the notable exception of 2019.

Explaining the Numbers?

We found two correlations that might help explain the patterns we see in the graphs.

PolitiFact changes over time. From 2007 through 2009, PolitiFact National did nearly every rating. Accordingly, the red and black lines track very closely for those years. But in 2010 PolitiFact added several franchises in addition to PolitiFact Florida. Those franchises served to moderate the PoF bias number until 2015, when we measured hardly any bias at all in the application of PolitiFact's harshest rating.

After 2015, a number of franchises cut way back on their contributions to the PolitiFact "database" and a number ceased operations altogether, such as PolitiFact New Jersey and PolitiFact Tennessee. And in 2016 PolitiFact added eight new state franchises (in alphabetical order): Arizona, Colorado, Illinois, Nevada, New York, North Carolina, Vermont and West Virginia.

The Franchise Shift

We made graphs to help illustrate the franchise shift. PolitiFact has had over 20 franchises over its history, so we'll divide the graph into two time segments to aid the visualization.

First, the franchises from 2010 through 2015 (click for larger view):

We see Florida, Texas, Rhode Island and Wisconsin established as consistent contributors. Tennessee lasts one year. Ohio drops after four years. Oregon drops after five and New Jersey after three.

Next, the franchises from 2016 through 2022 (click for larger view):


I omitted minor contributions from PolitiFact Georgia in 2016 (12) and 2017 (2). The orange bar near the top of 2016 is six states combined (hard to make out in the columns after 2016).

Note that the contributions are skinny, except for the one from Wisconsin. But even Wisconsin cut its output compared to the previous graph. We have a correlation suggesting that the participation of different state franchises impacted the bias measure.

But there's another correlation.

Republicans Lie More! Democrats Lie Less!

Liberals like to explain PolitiFact ratings that look bad for Republicans by saying that Republicans lie more. Seriously, they do that. But we found that spikes--especially recent ones--in the "Pants on Fire" bias measure were influenced by PolitiFact's spiking reluctance to give Democrats a "Pants on Fire" rating.

That correlation popped out when we created a graph showing the percentage of false statements given the "Pants on Fire" rating, by party. The graph for Republicans stays pretty steady between 20 and 30 percent. The graph for Democrats fluctuates wildly, and the recent spikes in the bias measure correlate with very low percentages of "Pants on Fire" ratings for Democrats.


As is always the case, our findings support the hypothesis that PolitiFact applies its "Pants on Fire" rating subjectively, with Republicans receiving the bulk of the unfair harm. In this case the harm comes through PolitiFact's avoidance of rating Democrat claims "Pants on Fire."

Do Democrats lie less? We don't really know. We suspect not, given the number of Democrat whoppers PolitiFact allows to escape its notice (such as this recent gem--transcript). We think PolitiFact's bias explains the numbers better than the idea Democrats lie less.



Notes on the PolitiFact franchise numbers: As we noted from the outset, PolitiFact's revamped website made it all but impossible to identify which franchise was responsible for which fact check. So how did we get our numbers?

We mostly ignored tags such as "Texas" or "Wisconsin" and looked for the names of staffers connected to the partnered newsroom. This was a fallible method because the new-look website departs from PolitiFact's old practice of listing any staffers who helped write or research an article. The new site only lists the first one mentioned from the old lists. And it has long been the case that staffers from PolitiFact National would publish fact checks under franchise banners. So our franchise fact check numbers are best taken as estimates.

Monday, December 14, 2020

Changes to the "Pants on Fire Bias" study

Back in 2011 PolitiFact Bias started a study of one measure of PolitiFact's bias. Our study took PolitiFact's "Truth-O-Meter" ratings and looked at the differential between its application of the "False" rating and the "Pants on Fire" rating.

We had noted that the only difference PolitiFact advertised between a "False" rating and a "Pants on Fire" rating is that PolitiFact considers the latter "ridiculous" but not the former. So a "False" statement is a false statement and a "Pants on Fire" statement is a statement that is both false and ridiculous.

Save us your comments like "Well, to me a 'Pants on Fire' means PolitiFact believes it was an intentional lie." That's not how PolitiFact has ever defined it.

We kept the study updated for every year from 2007 through 2019. For PolitiFact National, claims from Republicans viewed as false by PolitiFact (rated either "False" or "Pants on Fire") were over 50 percent more likely to receive a "Pants on Fire" rating than claims from Democrats.

Because "ridiculous" seems like a subjective measure and we could find no unspoken or hidden objective rationale justifying a "Pants on Fire" rating, we judged it was likely a subjective measure. So PolitiFact's preference for doling out the "Pants on Fire" rating to Repubicans instead of the "False" rating, compared to the same measure for Democrats, we argue counts as a legitimate measure of political bias.

PolitiFact's various state operations have varied considerably from PolitiFact National in terms of their "Pants on Fire" bias measure. We say the variations between them support the hypothesis that the ratings are subjective.

Change Time

In 2020, PolitiFact revamped its website. Instead of publishing material from the state franchises to a special domain for each state, PolitiFact changed to a tagging system.

That was not good for our study.

PolitiFact did not, for example, reserve the tag "Wisconsin" for fact checks stemming from the staff at PolitiFact Wisconsin. Any post or article with content dealing with Wisconsin might receive that tag. That means that categorizing the data to match what we did from 2007 through 2019 would mean a giant headache.

To keep things simple, from 2020 onward we'll be lumping all of PolitiFact into one hopper. We will no longer track the state operations separately. That makes things easy. We just have to verify that the claimant counts as a Republican or Democrat and log the rating accordingly.

We don't think the trends will change much for all of PolitiFact compared to what we measured for PolitiFact National over the years. The operations that were kinder to Republicans have either shifted left (bubble effect?) or dropped out. We expect a Republican to be about 50 percent more likely than a Democrat to receive a "Pants on Fire" rating for a statement deemed false. But we'll see.

Future PoF bias charts cannot appropriately be compared directly to those from the past, like this one:

 

That will be the final series graph from the first run of the study.

Afters:

It's worth noting again, we suppose, that "Republicans lie more!" does not help explain the disparity in the graph if our hypothesis about the subjectivity of the ratings is correct. We have no good reason for thinking it is incorrect.

The graph measures percentages, not raw numbers of ratings. Democrats making one "Pants on Fire" statement to go with nine "False" statements end up with the same percentage as Republicans making 100 "Pants on Fire" statements to go with 900 "False" statements.
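
Working the example through (the counts are the hypothetical ones from the paragraph above):

```python
dem_share = 1 / (1 + 9)        # one "Pants on Fire" among ten false statements
gop_share = 100 / (100 + 900)  # 100 "Pants on Fire" among 1,000 false statements
print(dem_share, gop_share)    # 0.1 0.1 -- identical shares, so a PoF bias number of 1.0
```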

Monday, January 7, 2019

Research shows PolitiFact leans left: The "Pants on Fire" bias

In 2011 PolitiFact Bias started a study of the way PolitiFact employs its "Pants on Fire" rating.

We noted that PolitiFact's definitions for "False" and "Pants on Fire" ratings appeared to differ only in that the latter rating represents a "ridiculous" claim. We had trouble imagining how one would objectively measure ridiculousness. PolitiFact's leading lights appeared to state in interviews that the difference in the ratings was subjective. And our own attempt to survey PolitiFact's reasoning turned up nothing akin to an empirically measurable difference.

We concluded that the "Pants on Fire" rating was likely just as subjective as PolitiFact editors described it. And we reasoned that if a Republican statement PolitiFact considered false was more likely than the same type of statement from a Democrat to receive a "Pants on Fire" rating we would have a reasonable measure of ideological bias at PolitiFact.

Every year we've updated the study for PolitiFact National. In 2017, PolitiFact was 17 percent more likely to give a Democrat a "Pants on Fire" rating for a false statement. But the number of Democrats given false ratings was so small that it hardly affected the historical trend. Over PolitiFact's history, Republicans are over 50 percent more likely to receive a "Pants on Fire" rating for a false claim than a Democrat.

2017


After Angie Drobnic Holan replaced Bill Adair as PolitiFact editor, we saw a tendency for PolitiFact to give Republicans many more false ("False" plus "Pants on Fire") ratings than Democrats. In 2013, 2015, 2016 and 2017 the Democrats' "Pants on Fire" percentage was exactly 25 percent each year. Except for 2007, which we count as an anomaly, that percentage marked the record high for Democrats. It appeared likely that Holan was aware of our research and leading PolitiFact toward more careful exercise of its subjective ratings.

Of course, if PolitiFact fixes its approach to the point where the percentages are roughly even, this powerfully shows that the disparities from 2009 through 2014 represent ideological bias. If one fixes a problem it serves to acknowledge there was a problem in need of fixing.


In 2018, however, the "Pants on Fire" bias fell pretty much right in line with PolitiFact's overall history. Republicans in 2018 were about 50 percent more likely to receive a "Pants on Fire" rating for a claim PolitiFact considered false.

The "Republicans Lie More!" defense doesn't work

Over the years we've had a hard time explaining to people why simply claiming that Republicans lie more does not explain away our data.

That's because of two factors.

First, we're not basing our bias measure on the number of "Pants on Fire" ratings PolitiFact doles out. We're just looking at the percentage of false claims given the "Pants on Fire" rating.

Second, our research provides no reason to believe that the "Pants on Fire" rating has empirical justification. PolitiFact could invent a definition for what makes a claim "Pants on Fire" false. PolitiFact might even invent a definition based on some objective measurement. And in that case the "Republicans lie more!" excuse could work. But we have no evidence that PolitiFact's editors are lying when they tell the public that the difference between the two ratings is subjective.

If the difference is subjective, as it appears, then PolitiFact's tendency to more likely give a Republican's false statement a "Pants on Fire" rating counts as a very clear indicator of ideological bias.

To our knowledge, PolitiFact has never addressed this research with public comment.

Tuesday, December 19, 2017

PolitiFact's "Pants on Fire" bias--2017 update (Updated)

What tale does the "Truth-O-Meter" tell?

For years, we at PolitiFact Bias have argued that PolitiFact's "Truth-O-Meter" ratings serve poorly to tell us about the people and organizations PolitiFact rates on the meter. But the ratings may tell us quite a bit about the people who run PolitiFact.

To put this notion into practice, we devised a simple examination of the line of demarcation between two ratings, "False" and "Pants on Fire." PolitiFact offers no objective means of distinguishing between a "False" rating and a "Pants on Fire" rating. In fact, PolitiFact's founding editor, Bill Adair (now on staff at Duke University) described the decision about the ratings as "entirely subjective."

Angie Drobnic Holan, who took over for Adair in 2013 after Adair took the position at Duke, said "the line between 'False' and 'Pants on Fire' is just, you know, sometimes we decide one way and sometimes decide the other."

After searching in vain for dependable objective markers distinguishing the "Pants on Fire" rating from the "False" rating, we took PolitiFact at its word and assumed the difference between the two is subjective. We researched the way PolitiFact applied the two ratings as an expression of PolitiFact's opinion, reasoning that we could use the opinions to potentially detect PolitiFact's bias (details of how we sorted the data here).

Our earliest research showed that, after PolitiFact's first year, Republicans were much more likely than Democrats to have a false claim rated "Pants on Fire" instead of merely "False." Adair has said that the "Pants on Fire" rating was treated as a lighthearted joke at first--see this rating of a claim by Democrat Joe Biden as an example--and that probably accounts for the unusual results from 2007.

In 2007, the lighthearted joke year, Democrats were 150 percent more likely to receive a "Pants on Fire" rating for a false statement.

In 2008, Republicans were 31 percent more likely to receive a "Pants on Fire" rating for a false statement.

In 2009, Republicans were 214 percent more likely to receive a "Pants on Fire" rating for a false statement (not a typo).

In 2010, Republicans were 175 percent more likely to receive a "Pants on Fire" rating for a false statement (again, not a typo).

We published our first version of this research in August 2011, based on PolitiFact's first four years of operation.

In 2011, Republicans were 57 percent more likely to receive a "Pants on Fire" rating for a false statement.

In 2012, Republicans were 125 percent more likely to receive a "Pants on Fire" rating for a false statement.

Early in 2013, PolitiFact announced Adair would leave the project that summer to take on his new job at Duke. Deputy editor Angie Drobnic Holan was named as Adair's replacement on Oct. 2, 2013.

In 2013, the transition year, Republicans were 24 percent more likely to receive a "Pants on Fire" rating for a false statement.

Had Republicans started to curb their appetite for telling outrageous falsehoods?

In 2014, Republicans were 95 percent more likely to receive a "Pants on Fire" rating for a false statement.

In 2015, Republicans were 2 percent (not a typo) more likely to receive a "Pants on Fire" rating for a false statement.

In 2016, Republicans were 17 percent more likely to receive a "Pants on Fire" rating for a false statement.

In 2017, Democrats were 13 percent more likely to receive a "Pants on Fire" rating for a false statement.

Had Republicans gotten better than Democrats at reining in their impulse to utter their false statements in a ridiculous form?

We suggest that our data through 2017 help confirm our hypothesis that the ratings tell us more about PolitiFact than they do about the politicians and organizations receiving the ratings.






Do the data give us trends in political lying, or separate journalistic trends for Adair and Holan?

We never made any attempt to keep our research secret from PolitiFact. From the first, we recognized that PolitiFact might encounter our work and change its practices to decrease or eliminate the appearance of bias in its application of the "Pants on Fire" rating. We did not worry about it, knowing that if PolitiFact corrected the problem it would help confirm the problem existed, regardless of what fixed it.

Has PolitiFact moderated or fixed the problem? Let's look at more numbers.

The "Pants on Fire" bias

From 2007 through 2012, PolitiFact under Adair graded 29.2 percent of its false claims from the GOP "Pants on Fire." For Democrats the percentage was 16.1 percent.

From 2014 through 2017, PolitiFact under Holan graded 26 percent of its false claims from the GOP "Pants on Fire" and 21.9 percent for Democrats.

It follows that under Adair PolitiFact was 81.4 percent more likely to give a "Pants on Fire" rating to a false GOP statement than one for a Democrat. That includes the anomalous 2007 data showing a strong "Pants on Fire" bias against Democrats.

Under Holan, PolitiFact was just 18.7 percent more likely to give a "Pants on Fire" rating to a false GOP statement than one from a Democrat.
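
Those two "more likely" figures follow directly from the era percentages above; here's the arithmetic spelled out:

```python
adair = 29.2 / 16.1  # GOP vs. Democrat "Pants on Fire" shares, 2007-2012
holan = 26.0 / 21.9  # the same shares, 2014-2017
print(round((adair - 1) * 100, 1))  # 81.4: percent more likely under Adair
print(round((holan - 1) * 100, 1))  # 18.7: percent more likely under Holan
```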

Story selection bias

While tracking the percentage of false ratings given a "Pants on Fire" rating, we naturally tracked the sheer number of times PolitiFact issued false ratings (either "False" or "Pants on Fire"). That figure speaks to PolitiFact's story selection.

From 2007 through 2012, PolitiFact under Adair found an average of 55.3 false claims per year from Republicans and 25.8 false claims per year from Democrats. That includes 2007, when PolitiFact was only active for part of the year.

From 2014 through 2017, PolitiFact under Holan found an average of 81 false claims per year from Republicans and 16 false claims per year from Democrats.

Under Holan, the annual finding of false claims by Republicans increased by about 46 percent. At the same time, PolitiFact's annual finding of false claims by Democrats fell by 38 percent.
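
The same spelled-out check works for the changes in the per-year averages (the GOP figure uses the corrected 81-per-year average; see the update notes at the end of this post):

```python
gop_change = (81 - 55.3) / 55.3 * 100   # Adair-era to Holan-era average, Republicans
dem_change = (16 - 25.8) / 25.8 * 100   # Adair-era to Holan-era average, Democrats
print(round(gop_change))  # 46: percent increase for Republicans
print(round(dem_change))  # -38: percent decrease for Democrats
```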

Update Jan. 1, 2018: GOP false claims reached 90 by year's end.


One might excuse the increase for the GOP by pointing to staff increases. But the same reasoning serves poorly to explain the decrease for the Democrats. Likewise, increased lying by Republicans does not automatically mean Democrats decreased their lying.

Did the Democrats as a party tend strongly toward greater truth-telling, with the notable blemish of a greater tendency to go "Pants on Fire" when relating a falsehood?

Conclusion

We suggest that changes in PolitiFact's practices more easily make sense of these data than do substantial changes in the truth-telling patterns of the two major U.S. political parties. When Adair stepped down as PolitiFact's editor, a different person started running the "star chamber" meetings that decide the "Truth-O-Meter" ratings and a different set of editors voted on the outcomes.

Changing the group of people who decide subjective ratings will obviously have a substantial potential effect on the ratings.

We suggest that these results support the hypothesis that subjectivity plays a large role in PolitiFact's rating process. That conclusion should not surprise anyone who has paid attention to the way PolitiFact describes its rating process.

Has Holan cured PolitiFact of liberal bias?

We recognized from the first that the "Pants on Fire" bias served as only one measure of PolitiFact's ideological bias, and one that PolitiFact might address. Under Holan, the "Pants on Fire" bias serves poorly to demonstrate a clear ideological bias at PolitiFact.

On the other hand, PolitiFact continues to churn out anecdotal examples of biased work, and the difficulty Holan's PolitiFact has in finding false statements from Democrats compared to Adair's PolitiFact suggests our data simply show something of a trade-off.

When we started evaluating PolitiFact's state operations, such as PolitiFact Georgia, we noticed that lopsided numbers of false statements were often accompanied by a higher percentage of "Pants on Fire" statements from the party receiving many fewer false ratings. We hypothesized a compensatory bias might produce that effect when the fact checkers, consciously or unconsciously, encourage the appearance of fairness.

PolitiFact, after all, hardly needs to grade false Republican statements more harshly to support the narrative that Republicans lie more when it is finding, on average, five times more false statements from Republicans than Democrats.


We doubt not that defenders of PolitiFact can dream up some manner of excusing PolitiFact based on the "fact" that Republicans lie more. But we deeply doubt that any such approach can find a basis in empirical evidence. Subjective rating systems do not count as empirical evidence of the rate of lying.


In addition to empirically justifying the increase in GOP falsehoods, defenders will need to explain the decrease in Democratic Party falsehoods implied in PolitiFact's ratings. Why, with a bigger staff, is PolitiFact having a more difficult time finding false statements from Democrats than it did when Adair was steering the ship?

If the "Truth-O-Meter" data were taken as objective, the differing trends we see for PolitiFact under Adair and Holan would give reason to question the data's reliability.

Given PolitiFact's admissions that its story selection and ratings are substantially subjective, it makes sense for the objective researcher to first look to the most obvious explanation: PolitiFact bias. 

 

Notes on the research method

Our research on the "Pants on Fire" bias looks at partisan elected officials or officeholders as well as candidates and campaign officials (including family members participating in the campaign). We exclude PolitiFact ratings where a Republican attacked a Republican or a Democrat attacked a Democrat, reasoning that such cases may muddy the water in terms of ideological preference. The party-on-party exclusions occur rarely, however, and do not likely affect the overall picture much at all.

In the research, we use the term "false claims" to refer to claims PolitiFact rated either "False" or "Pants on Fire." We do not assume PolitiFact correctly judged the claims false.

Find the data spreadsheet here.


Afters

We have completed alternative versions of our charts with the data for Donald Trump removed, and we'll publish those separately from this article at a later time. The number of false claims from Republicans went down from 2015-2017 but with PolitiFact still issuing far more false ratings to Republicans. The "Pants on Fire" percentages were almost identical except for 2016. With Trump removed from the data the Republicans would have set an all-time record for either party for lowest percentage of "Pants on Fire" claims.

These results remain consistent with our hypothesis that PolitiFact's "False" and "Pants on Fire" ratings reflect a high degree of subjectivity (with the former perhaps largely influenced by story selection bias).



Update Dec. 19, 2017: Added intended hyperlink to explanations of the research and the notable Biden "Pants on Fire."
Update Dec. 21, 2017: Corrected date of previous update (incorrectly said Dec. 12), and updated some numbers to reflect new PolitiFact ratings of Donald Trump through Dec. 21, 2017: "13 percent"=>"10 percent", "87.3 claims per year"=>"80.5 claims per year", "23.8"=>"26.1" and "8.7"=>"19.2." The original 87.3 and 23.8 figures were wrong for reasons apart from the new data. We will update the charts once the calendar year finishes out. Likewise the 8.7 figure derived in part from the incorrect 23.8.

Update Jan 1, 2018: Changed "10 percent" back to "13 percent" to reflect updated data for the whole year. "80.5 claims per year" updated to "81 claims per year." We also changed "26.1" to "26" and "8.7" to "18.7." The latter change shows that we neglected to make the "8.7" to "19.2" change we announced in the description of the Dec. 21, 2017 update, for which we apologize.

Wednesday, September 28, 2016

PolitiFact's presidential "Pants on Fire" bias

PolitiFact Bias has tracked for years a measure of PolitiFact's bias called the "Pants on Fire" bias. The presidential election gives us a fine opportunity to apply this research approach in a new and timely way.

This measure, based on PolitiFact's data, shows PolitiFact's strong preference for Democrat Hillary Clinton over the Republican candidate Donald Trump. When PolitiFact ruled claims from the candidates as false (either "False" or "Pants on Fire"), Trump was 82 percent more likely than Clinton to receive a "Pants on Fire" rating.

Why does this show a bias at PolitiFact? Because PolitiFact offers no objective means of distinguishing between the two ratings. That suggests the difference between the two ratings is subjective. "Pants on Fire" is an opinion, not a finding of fact.

When journalists call Trump's falsehoods "ridiculous" at a higher rate than Clinton's, with no objective principle guiding their opinions, it serves as an expression of bias.

 

How does the "Pants on Fire" bias measure work?


Friday, January 22, 2016

The 2015 "Pants on Fire" bias for PunditFact and the PolitiFact states

Earlier this week we published our 2015 update to our study of PolitiFact's bias in applying its "Pants on Fire" rating.

The premise of the research, briefly, is that no objective criterion distinguishes between a "False" rating and a "Pants on Fire" rating. If the ratings are subjective then a "Pants on Fire" rating provides a measure of opinion and nothing more.

In 2015 the states provided comparatively little data. State franchises, with a few exceptions, seem to have a tough time giving false ratings. The state PolitiFact operations also tend to vary widely in the measurement of the "Pants on Fire" bias. PolitiFact Wisconsin's "Pants on Fire" ratings proportionally treat Democrats more harshly than Republicans, for example.


PolitiFact Florida: PolitiFact Florida's data roughly matched those from PolitiFact National and PunditFact. Those three franchises are the most closely associated with one another, since all are based in Florida and tend to share writing and editorial staff. In 2015 the PoF bias number fell within a range of five hundredths for the three. That's so close it's suspicious on its face. All three gave Republicans more false ratings than Democrats (selection ratios of 4.83, 3.50 and 2.43, respectively).

PolitiFact Georgia: Though PolitiFact Georgia has operated for a good number of years and has in the past provided us with useful data, that wasn't the case in 2015. PolitiFact Georgia's false ratings went to apparently non-partisan claims.

PolitiFact New Hampshire: PolitiFact New Hampshire historically provides virtually nothing helpful in terms of the PoF bias number. But false ratings for Democrats outnumbered false ratings for Republicans (blue numerals indicate that bias).

PolitiFact Rhode Island: PF Rhode Island rated two statements from Democrats "False."

PolitiFact Texas: PF Texas gave Republicans false ratings an astonishing eight times as often as Democrats. But at the same time, PF Texas produced a PoF bias number harming Democrats. The key to both figures? PF Texas doled out only two false ratings to partisan Democrats, and both were "Pants on Fire" ratings. An entire year with no "False" ratings for Democrats? Texas' previous low for "False" ratings was four (reached twice).

PolitiFact Virginia: PF Virginia achieved perfect neutrality in terms of our PoF bias number. That's the meaning of a 1.00 score. The "Pants on Fire" claims as a percentage of all false claims was equal for Republicans and Democrats.

PolitiFact Wisconsin: PF Wisconsin continued its trend of giving Democrats the short end of the PoF bias measure, despite giving Republicans a bigger share than usual of the total false ratings. The 5.00 selection bias number was easily the all-time high for PF Wisconsin, besting the old mark of 1.57 set in 2011.

PunditFact: PunditFact, we should note, produces data we class in our "Group B." PunditFact tends not to rate partisan candidates, officeholders, partisan appointees or party organizations. It focuses more on pundits, as the name implies. We consider Group B data less reliable as a measure of partisan bias than the Group A data. But we do find it interesting that PunditFact's data profile lines up pretty closely with those of the most closely associated PolitiFact entities, as noted above. That finding is consistent with the idea that PolitiFact ratings say something about the viewpoint of the ones giving the ratings.