Saturday, December 28, 2024

Why does PolitiFact struggle with simple stuff?

PolitiFact Bias started out and continues as an effort to improve PolitiFact.

We understand PolitiFact's liberal bloggers disliking criticism. But c'mon, it's for your own good. And why the struggle with simple stuff?

Moments ago, I was dipping into some search results relating to a potential research project. While reviewing a PolitiFact story I noticed it had an update notice.




"An update," I think. "I wonder what was updated?"

So I looked through the story for the update. Then I looked for it again.

Then I cheated and tried for an Internet Archive comparison. The oldest archived page was already updated, so that was initially a dead end.

I read through the story yet again without finding the update.

What does (did) PolitiFact's statement of principles say about updates?

Updates – From time to time, we add additional information to stories and fact-checks after they’ve published, not as a correction but as a service to readers. Examples include a response from the speaker we received after publication (that did not change the conclusion of the report), or breaking news after publication that is relevant to the check. Updates can be made parenthetically within the text with a date, or at the end of the report. Updated fact-checks receive a tag of "Corrections and updates."

The update announcement at the top should have featured the date it was added. And, as PolitiFact's supposed principles state, the story should have received a "Corrections and updates" tag. There's no such tag.

I was reminded by my attempt to access PolitiFact's archived statement of principles that PolitiFact's update to its website might hide older pages from ordinary Internet Archive searches. I went to PolitiFact's main page, archived on the date of the article. The article was highlighted on the main page, and clicking on it took me to the page as archived on March 29, 2019.

The page had no update announcement. 

Now we're cookin'.

Comparing March 29 to April 1 revealed five added paragraphs from (liberal blogger) Jon Greenberg.

So, PolitiFact updated the story and did not inform its readers about the specifics of the update. This has the effect of a stealth edit, which counts as a no-no in journalism.

This Is So Minor! Who Cares?

Yes, this case is fairly minor. But following published principles, in journalism as in anything else, should count as standard practice. Inconsistent application of principles empties the term "principles" of the meaning it ought to have.

As to who cares, the public ought to care because journalism organizations have principles to establish their trustworthiness. Following the principles provides evidence of trustworthiness. Failing to follow principles offers evidence of untrustworthiness.

The International Fact-Checking Network, in its supposed role of holding fact-checking organizations accountable, also ought to care. But I could send a correction request to PolitiFact and, based on past experience, PolitiFact probably would not bother to fix the problem. Moreover, after PolitiFact failed to fix the error for weeks, I could send this example to the International Fact-Checking Network as evidence of PolitiFact failing to scrupulously follow its corrections policy, and the IFCN would ignore it (see here).

Meanwhile, the IFCN (owned, like PolitiFact, by the Poynter Institute) will continue to assure the public that fact-checking orgs like PolitiFact that are "verified" by the IFCN scrupulously follow their corrections policies.

These journalists who want our trust are telling us falsehoods.

Why wouldn't it be better to fix stories so that they live up to published principles? If they don't have time to follow principles on corrections and updates (among other things), should we expect them to have time to live up to their principles in reporting and fact-checking?

We believe we haven't been able to help PolitiFact or the IFCN much because they don't want any help.

Thursday, December 19, 2024

The PolitiFact Wisconsin story

This article is a companion to Bryan's forthcoming review of former PolitiFact editor Bill Adair's book, "Beyond the Big Lie."

In my review of Bill Adair's book I refer to the way PolitiFact's state operations like PolitiFact Wisconsin tended to favor Republicans during years Adair excluded from his dataset. Readers of that Substack article may find this explanation helpful.

Research published here at PolitiFact Bias has examined the bias PolitiFact applies in its use of "Pants on Fire" ratings. The difference between "False" and "Pants on Fire" appears entirely subjective, resting squarely on the term "ridiculous." Until PolitiFact defines "ridiculous" in a reasonably objective way, its descriptions to date strongly encourage the view that the distinction is subjective.

Until 2020, a "Wisconsin" tag on a PolitiFact story dependably indicated that staffers from PolitiFact's affiliate performed the fact checks. We stopped tracking state data after 2020 because the stories could as easily come from PolitiFact National staffers. We also had reason to believe the state affiliates were no longer in charge of determining the "Truth-O-Meter" ratings.

"Pants on Fire" Bias at PolitiFact Wisconsin

Wisconsin was unusually tough on its Democrats compared to most other PolitiFact operations. Whereas PolitiFact National gave Democrats a "Pants on Fire" for about 17 percent of their false statements from 2007 through 2019, PolitiFact Wisconsin gave them over 27 percent, slightly higher than the roughly 27 percent average Republicans received from PolitiFact National.

Raw Numbers at PolitiFact Wisconsin

Adair's claim that Republicans lie more doesn't rest on percentages, though. Adair sticks with raw numbers of disparaging ratings.

There, too, PolitiFact Wisconsin moderated the bias of the larger organization.

Republicans "earned" about 40 percent more "False" plus "Pants on Fire" ratings than did Democrats from PolitiFact Wisconsin. In contrast, PolitiFact National gave Republicans roughly three times as many such ratings as Democrats.

The tendency in Wisconsin, as this graph helps show, matches that for PolitiFact as a whole. It isn't that Republicans lie more. It's that Democrats lie less and less.


Where did the Democrats' lies go? Did PolitiFact and other fact-checkers force them to clean up their act? Did fact-checkers at long last realize that they had been too tough on Democrats early on?

Did narrative increasingly conquer objectivity?