
Thursday, June 8, 2023

Have politicians discovered asbestos pants? (Update/Correction)

We think PolitiFact's "Truth-O-Meter" rating system offers ample evidence of PolitiFact's bias.

Why?

1) It's an admittedly subjective rating system.

2) Rating patterns differ widely at different franchises.

3) Fundamentally similar fact checks may conclude with very different ratings.

And to those three reasons we add a fourth, newly ascendant in the data we collect:

4) "Pants on Fire" seems to be going extinct/extinguished.

Have politicians discovered asbestos pants?

Through June 8, 2023, the number of "Pants on Fire" ratings given to politicians stands at five. Five.

Correction 6/22/2023: We apparently made a careless error in transcribing the number of Pants on Fire ratings given to party politicians during the first half (or so) of 2023. The correct number was two, not five. The corrected number only strengthens our point that "Pants on Fire" numbers have fallen off a cliff. Yes, the chart is wrong as well in reporting five in 2023.


From 2007 through 2009, PolitiFact was just starting out, which helps explain the low numbers during that period. In 2010 state franchises such as PolitiFact Texas and PolitiFact Florida started to contribute heavily to the number of ratings, including "Pants on Fire" ratings.

The era of Bill Adair's directorship was in full flower through 2013. We see the three-year spike of GOP "Pants on Fire" ratings and a rise followed by a slow decline in Democratic Party "Pants on Fire" ratings.

Current Editor-in-Chief Angie Drobnic Holan took over from Adair, and under Holan we observe a decline in "Pants on Fire" ratings for Democrats. We see the same for Republicans, notwithstanding notable election-year spikes in 2016 and 2020.

So far, the year 2023 stands out for its exceptionally low numbers.

"Republicans Lie More!"

Oh, please!

As a catchall excuse for weird PolitiFact data, that just won't cut it. It doesn't work as an excuse for PolitiFact's selection bias problem. It doesn't explain PolitiFact's biased application of "Pants on Fire" ratings, and it can never explain declining numbers of "Pants on Fire" ratings over time for both political parties.

So, what's the explanation?

The simplest explanation boils down to money. PolitiFact gets paid for its role as the falsehood-sniffing dog for social media censors. The most recent page of "Pants on Fire" ratings on PolitiFact's website is filled with "Pants on Fire" ratings given for social media claims, with not one given to a party officeholder, candidate, appointee or the like. Not one. The previous page has one for Donald Trump, given back in May.

That suggests PolitiFact now takes a greater interest in its social media role than in holding politicians accountable. To be fair, however, PolitiFact can still manipulate political messaging effectively by giving poor ratings to messages Republicans are likely to share. Rating one social media claim, no matter who it's from, can justify stuffing a sock in the social media mouth that would repeat it.

An alternative explanation? Politicians, both Democrat and Republican, are lying less.

It will be fun to see whether fact checkers try to take credit for making politicians more truthful without any sound basis for that claim.


Thursday, March 17, 2022

PolitiFact's "Pants on Fire" bias in 2021

As we noted in our post about the "Pants on Fire" research for 2020, we have changed the way we do the research.

PolitiFact revamped its website in 2020, and the update made it next to impossible to reliably identify which of PolitiFact's various franchises was responsible for a fact check. Instead of focusing on PolitiFact National, it makes more sense to lump all of PolitiFact together. But the new approach has a drawback: the new evaluations represent an apples-to-oranges comparison with the old ones.

To deal with that problem, we went back and did PolitiFact's entire history since 2007 using the new method.

With the research redone under the new method, we can now compare its results with those from the old method.

Spoiler: Using the new method, PolitiFact was 2.66 times more likely to rate a claim it viewed as false "Pants on Fire" when the claim came from a Republican than from a Democrat. That's PolitiFact's third-highest bias figure of all time, though PolitiFact National, considered separately, has exceeded that figure at least three times.

 

Method Comparison: New vs. Old 

Our new graph shows the old method, running from 2007 through 2019, along with the new method graphed from 2007 through 2021.


The black line represents the old method. The red line represents the new.

The numbers represent what we term the "PoF bias number," an expression of how much more likely it is that PolitiFact will give a claim it regards as false a "Pants on Fire" rating for a Republican than for a Democrat. So, for 2009 under the old method (black line), the GOP was 3.14 times more likely to have one of its supposedly false statements rated "Pants on Fire."
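For concreteness, here is a minimal sketch of the arithmetic as we understand it. The function and variable names are ours, and the counts in the example are hypothetical, not real PolitiFact data: the bias number divides the GOP's share of supposedly false claims rated "Pants on Fire" by the corresponding Democratic share.

```python
def pof_bias(gop_pof, gop_false, dem_pof, dem_false):
    """PoF bias number: how much more likely a claim PolitiFact
    regards as false draws "Pants on Fire" for a Republican than
    for a Democrat. The *_false arguments count claims rated
    "False" (i.e., false but not "Pants on Fire")."""
    gop_share = gop_pof / (gop_pof + gop_false)  # GOP share of false claims rated PoF
    dem_share = dem_pof / (dem_pof + dem_false)  # Dem share of false claims rated PoF
    return gop_share / dem_share

# Hypothetical yearly counts: 22 of 88 GOP false claims rated PoF (25%)
# versus 5 of 55 Democratic false claims rated PoF (about 9.1%).
print(pof_bias(gop_pof=22, gop_false=66, dem_pof=5, dem_false=50))  # ~2.75
```

The two intermediate shares in the sketch are the same per-party percentages graphed later in this post.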

As our research has documented, PolitiFact has never offered an objective means of determining the ridiculousness of a claim viewed as false. The "Pants on Fire" rating, to all appearances, qualifies as a subjective judgment. In other words, the rating represents PolitiFact's opinion.

In 2017, under the old method, the bias number dropped to 0.89, showing a bias against Democrats for that year at PolitiFact National. On average over time, of course, Republicans were significantly more likely to have their false claims regarded as "ridiculous" by PolitiFact.

Notably, the new method (red line) shows a moderating effect on PolitiFact's "Pants on Fire" bias from 2008 through 2014. The red line hovers near 1.00 for much of that stretch. After 2015 the red line tends to run higher than the black line, with the notable exception of 2019.

Explaining the Numbers?

We found two correlations that might help explain the patterns we see in the graphs.

PolitiFact changes over time. From 2007 through 2009, PolitiFact National did nearly every rating. Accordingly, the red and black lines track very closely for those years. But in 2010 PolitiFact added several franchises in addition to PolitiFact Florida. Those franchises served to moderate the PoF bias number through 2015, when we measured hardly any bias at all in the application of PolitiFact's harshest rating.

After 2015, a number of franchises cut way back on their contributions to the PolitiFact "database" and a number ceased operations altogether, such as PolitiFact New Jersey and PolitiFact Tennessee. And in 2016 PolitiFact added eight new state franchises (in alphabetical order): Arizona, Colorado, Illinois, Nevada, New York, North Carolina, Vermont and West Virginia.

The Franchise Shift

We made graphs to help illustrate the franchise shift. PolitiFact has had more than 20 franchises over its history, so we divide the graph into two time segments to aid visualization.

First, the franchises from 2010 through 2015 (click for larger view):

We see Florida, Texas, Rhode Island and Wisconsin established as consistent contributors. Tennessee lasts one year. Ohio drops after four years. Oregon drops after five and New Jersey after three.

Next, the franchises from 2016 through 2022 (click for larger view):


We omitted minor contributions from PolitiFact Georgia in 2016 (12) and 2017 (2). The orange bar near the top of 2016 represents six states combined (hard to make out in the columns after 2016).

Note that the contributions are skinny, except for the one from Wisconsin. But even Wisconsin cut its output compared to the earlier period. We have a correlation suggesting that the participation of different state franchises affected the bias measure.

But there's another correlation.

Republicans Lie More! Democrats Lie Less!

Liberals like to explain PolitiFact ratings that look bad for Republicans by saying that Republicans lie more. Seriously, they do that. But we found that spikes--especially recent ones--in the "Pants on Fire" bias measure were influenced by PolitiFact's growing reluctance to give Democrats a "Pants on Fire" rating.

That correlation popped out when we created a graph showing the percentage of false statements given the "Pants on Fire" rating, by party. The graph for Republicans stays pretty steady between 20 and 30 percent. The graph for Democrats fluctuates wildly, and the recent spikes in the bias measure correlate with very low percentages of "Pants on Fire" ratings for Democrats.


As is always the case, our findings support the hypothesis that PolitiFact applies its "Pants on Fire" rating subjectively, with Republicans receiving the bulk of the unfair harm. In this case, that harm flows from PolitiFact's avoidance of rating Democrat claims "Pants on Fire."

Do Democrats lie less? We don't really know. We suspect not, given the number of Democrat whoppers PolitiFact allows to escape its notice (such as this recent gem--transcript). We think PolitiFact's bias explains the numbers better than the idea that Democrats lie less.



Notes on the PolitiFact franchise numbers: As we noted from the outset, PolitiFact's revamped website made it all but impossible to identify which franchise was responsible for which fact check. So how did we get our numbers?

We mostly ignored tags such as "Texas" or "Wisconsin" and instead looked for the names of staffers connected to the partnered newsroom. This method is fallible because the new-look website departs from PolitiFact's old practice of listing every staffer who helped write or research an article; the new site lists only the first staffer from the old credits. And it has long been the case that staffers from PolitiFact National would publish fact checks under franchise banners. So our franchise fact check numbers are best taken as estimates.
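As a rough illustration of that byline approach (not our actual script; the roster below is invented for the example), attribution amounts to matching the single credited author against known franchise staff lists and falling back to "unknown" when no match exists:

```python
# Hypothetical roster mapping staffers to franchises. A real mapping
# would come from the partnered newsrooms' published staff lists.
ROSTER = {
    "Jane Smith": "PolitiFact Wisconsin",
    "John Doe": "PolitiFact Texas",
}

def attribute_franchise(byline: str) -> str:
    """Guess the franchise from the single credited author. Returns
    "unknown" when the byline matches no roster entry, since the new
    site drops co-authors and National staffers publish under
    franchise banners."""
    return ROSTER.get(byline, "unknown")

print(attribute_franchise("Jane Smith"))  # PolitiFact Wisconsin
print(attribute_franchise("Pat Roe"))     # unknown
```

The fallback case is exactly why we call the resulting franchise counts estimates rather than exact figures.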