Monday, January 2, 2023

PolitiFact's "Pants on Fire" bias in 2022

Back in 2012, we started an ongoing study of PolitiFact's bias in its decisions between "False" and "Pants on Fire" ratings, given that the difference between the two counts, to all appearances, as entirely subjective.

This post updates that research with observations about 2022.

The "Pants on Fire" Bias

PolitiFact has never described objective grounds for deciding between a "False" rating and a "Pants on Fire" rating on its "Truth-O-Meter." Our research approach predicts that PolitiFact's bias will drive a preference for one party over the other in making those decisions. We express that bias as the "PoF Bias number," where 1.0 shows perfect balance between the two parties. A figure below 1.0 shows PolitiFact favoring Republicans, and a figure over 1.0 shows PolitiFact favoring Democrats.
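The arithmetic behind the PoF Bias number can be sketched in a few lines of Python. The counts below are illustrative placeholders, not actual PolitiFact tallies:

```python
def pof_bias(rep_pof, rep_false, dem_pof, dem_false):
    """Ratio of the Republican "Pants on Fire" share to the Democratic share.

    Each party's share is its "Pants on Fire" count divided by all of its
    false-or-worse ratings ("False" plus "Pants on Fire").
    """
    rep_share = rep_pof / (rep_pof + rep_false)
    dem_share = dem_pof / (dem_pof + dem_false)
    return rep_share / dem_share

# A value above 1.0 means claims PolitiFact deems false draw "Pants on Fire"
# more often when they come from Republicans than from Democrats.
print(round(pof_bias(rep_pof=24, rep_false=76, dem_pof=18, dem_false=82), 2))  # prints 1.33
```

With these made-up counts, Republicans' false claims draw "Pants on Fire" 24 percent of the time versus 18 percent for Democrats, yielding the 1.33 ratio discussed below.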


For 2022, PolitiFact scored its third-highest PoF Bias number since it started in 2007. Of note, these figures include all of PolitiFact's state franchises. Tracking PolitiFact National by itself, as we once did, shows a sharp lean to the left during the 2010-2015 period. State franchises provided the balance shown during that time. The more recent spikes in bias likely stem from a combination of new, more left-leaning franchises and fact checker zeal over President Trump.

The cumulative PoF Bias number stands at a relatively modest 1.33 even with the recent left-leaning spikes. So, over its history, PolitiFact has been 33 percent more likely to give a claim it deems false a "Pants on Fire" rating when the claim comes from a Republican rather than a Democrat. That figure for PolitiFact National alone, from 2007 through 2019, was 56 percent.

That variation, by the way, supports the hypothesis that different PolitiFact fact checkers display differing trends in their fact-checking. If, for example, PolitiFact National leans more left than state franchises then increased control over those franchises by National should show increased left-leaning bias.

Trends

We observe a fascinating trend at PolitiFact toward lower numbers of fact checks of politicians. Probably thanks to the lure of social media dollars, PolitiFact's output over time shows increased checking of social media claims. That's understandable, as social media companies reimburse fact-checking partners like PolitiFact for their work. We noted that trend in a separate dataset covering PolitiFact's fact checks of all U.S. politicians. We also see it reflected in this study, which focuses only on "False" and "Pants on Fire" ratings of politicians, though with a notable spike during the 2020 election year.

Why was PolitiFact able to find over 100 false claims from Democrats each year from 2010 through 2012, yet unable to top 68 in any year since? Though we've noticed PolitiFact claiming politicians have started communicating with greater care, we do not find that explanation plausible without specific supporting evidence. Selection bias likely serves as an adequate explanation. And PolitiFact's numbers show an increased reluctance to rate false statements from Democrats as "Pants on Fire."

In 2022 PolitiFact set a new low for Democrats, rating only 6.06 percent of Democratic Party false claims as "Pants on Fire." That edged the previous lows of 6.12 percent in 2018 and 6.38 percent in 2020.

PolitiFact also gave Republicans "Pants on Fire" ratings at its lowest annual rate ever, 21.7 percent. Of course, Republicans still received that subjective and severe rating at over three times the rate for Democrats.

Monthly Tracking

We experimented with creating a monthly chart for 2022's numbers. We expected the chart to show reasonably stable trends despite the relatively low number of ratings in 2022. Small randomized samples should show relatively greater variation. PolitiFact, of course, makes no apparent attempt to randomize its dataset. Selection bias may explain the relatively stable numbers we see in the chart.



Monday, December 19, 2022

PolitiFact's gender pay gap shenanigans continue (2022 edition)

PolitiFact has completed quite a few fact checks touching on the gender wage gap. And a Dec. 16, 2022 item from PolitiFact Wisconsin carries on PolitiFact's rich tradition of left-leaning inconsistency.


As we have noted here and at Zebra Fact Check, PolitiFact wanders all over the map on gender pay gap claims. Here's the central thing to remember, for those seeking consistency: The raw gender wage gap, such as the one cited by Sen. Tammy Baldwin in the claim pictured above, serves as no measure of gender (or racial) discrimination. PolitiFact Wisconsin both acknowledges that and ignores it for the sake of Baldwin's "Truth-O-Meter" rating.

PolitiFact Wisconsin:

As PolitiFact National, which has reviewed numerous pay-gap claims over the years, has noted: "a speaker’s choice of words can significantly affect whether their point about the gender pay gap is right or wrong." 

...

... (T)he government data isn’t based on men and women doing the same jobs. Rather, it’s an average that widens or closes by factors such as race, job type and age. Research suggests women are overrepresented in jobs that tend to pay less, for a variety of reasons.

PolitiFact Wisconsin explains why Baldwin does not deserve a "True" rating and proceeds to award Baldwin a "True" rating.

Review what Baldwin said. Again, from PolitiFact Wisconsin:

"On Latina Equal Pay Day, we bring attention to the fact that Latina workers make 54 cents for every dollar earned by white, non-Hispanic men. It’s past time that Latina workers are given equal pay for equal work."

Using the raw wage gap figure while appealing for equal pay for equal work implies that the raw wage gap represents the gap between groups doing equal work.

That's exactly what Baldwin did.  It's flatly deceptive, but "earns" a "True" from PolitiFact.

PolitiFact simply ignores the problem with Baldwin's implied argument, except for purposes of amplifying it. 

It works like this: Explicitly say that the raw wage gap occurs between groups doing the same job and get a "Mostly False" (unless you're extremely lucky!). Merely imply the same thing, as Baldwin did, and get a "True" (unless you're unlucky!).

PolitiFact's inconsistency on the gender wage gap all by itself should dispel the notion that PolitiFact does its job in a non-partisan or objective manner.

It's a journalistic disgrace.


Afters:

PolitiFact's fact check is marvelously horrible. The deck reads "Yes, wage gap does have big impact on Latina workers." But the wage gap itself is an effect, stemming primarily from Latina women's choices of low-paying, unskilled jobs. It's not the wage gap driving them into those jobs; the job choices create the impact of the wage gap, not vice-versa.

PolitiFact repeatedly mentions that the wage gap represents the difference between what the average white man makes and what Latina women make. But it's a median figure, not an average. Past PolitiFact gender-gap stories likewise tend to ignore the distinction. PolitiFact writes "The averages were based on median earnings for full- and part-time workers." We can think of no solid justification for averaging averages or averaging medians. It's the kind of math people invent to mislead others.

Thursday, December 1, 2022

More PedantiFact: PolitiFact vs. Kevin McCarthy

Fact checkers supposedly don't fact check opinions.

PolitiFact fact checks opinions. Real Clear Politics maintains an ongoing study of how often a set of top fact checkers rate opinions or predictions (among other things). PolitiFact has paced the group.

We expect Real Clear Politics will get around to adding this Nov. 30, 2022 PolitiFact fact check to the list:


Why do we think McCarthy was expressing an opinion?

In other words, why do we have the opinion that McCarthy was expressing an opinion?

We're intentionally giving away the answer, of course. "I think" counts as one of the classic ways of marking one's statement as an opinion.

Why does PolitiFact ignore such an obvious clue?

We think it's likely PolitiFact was looking to build a narrative. By overlooking that McCarthy was expressing opinion and focusing on one part of his statement to the exclusion of another, PolitiFact was able to support that narrative under the guise of fact-checking.

PolitiFact supports the narrative that Donald Trump counts as a racist. Facts don't matter in pursuit of that narrative.

PolitiFact quotes McCarthy correctly, and we'll highlight the part that PolitiFact decided to omit from its fact-checking focus even though it's the only part that McCarthy stated as fact:

"I think President Trump came out four times and condemned him and didn't know who he was," McCarthy said.

That drew real-time pushback from a reporter, who said, "He didn't condemn him or his ideology." McCarthy responded, "The president didn't know who he was."

For PolitiFact, it isn't important whether Trump knew who Nick Fuentes was. It's important that Fuentes is a white nationalist, and important to link Fuentes to Trump in a way that reinforces the narrative that Trump is a racist. Toward that end, PolitiFact ignores the claim Trump did not know who Fuentes was and focuses on the supposed lack of condemnation.

We would argue that Trump saying he did not know Fuentes counts as a condemnation, when we consider the context.

PolitiFact argues the opposite, albeit without any real argument in support:

A look at Trump’s statements during the week between the Nov. 22 dinner and McCarthy’s press availability Nov. 29 show that McCarthy was wrong. Specifically, Trump did not condemn Fuentes on four occasions; instead, Trump said in four statements that he did not know who Fuentes was.

PolitiFact implicitly says that it does not count as a condemnation to profess ignorance of Fuentes' identity.

Here's why that's wrong.

Trump was implying that if he had known who Fuentes was, he would not be welcome at dinner. Hardly anything could be more obvious, particularly given the context that Trump went on record condemning neo-Nazis and white nationalism.

We can even source Trump's quotation through PolitiFact, though the fact checkers do an excellent job of not drawing attention to it:

"And you had people -- and I’m not talking about the neo-Nazis and the white nationalists -- because they should be condemned totally. But you had many people in that group other than neo-Nazis and white nationalists. Okay?"

So the fact checkers, though they have reason to know Trump condemned white nationalism, leave that out of a fact check focusing on whether Trump condemned white nationalism. That's context fit for suppression.

The facts don't matter when liberal bloggers posing as unbiased fact checkers want to promote a narrative.

Friday, November 11, 2022

Glenn Youngkin and PolitiPedant Virginia

PolitiFact's penchant for pedantry justly earns it the derisive nickname "PolitiPedant."

Ready for another example? Let's go!

Youngkin claimed he won cities no Republican had won before in Virginia. Obviously a fact checker needs to find a reasonable definition of "city" to fact check Youngkin's claim.

PolitiFact's method credits the state of Virginia with 38 cities. Does that seem low? There's justification for it, even if it's the wrong justification for this fact check. We charge PolitiFact with failing to give readers anything approaching an adequate explanation.

The Independent City

Virginia has an unusual feature regarding its cities. PolitiFact mentions it casually, without explanation, as it sets up its fact-finding:

GOP results in Virginia cities

Youngkin won 14 of Virginia’s 38 independent cities: Bristol; Buena Vista; Chesapeake; Colonial Heights; Covington; Galax; Hopewell; Lynchburg; Norton; Poquoson; Radford; Salem; Virginia Beach and Waynesboro.

In its summary section, PolitiFact said "Youngkin won 14 of Virginia's 35 cities in that [2021--ed.] election." Did Virginia have 38 or 35 independent cities in 2021? We expect PolitiFact will fix that inconsistency as a typographical error. But we'll focus on the key term "independent cities," which PolitiFact does not explain to its readers.

There are 41 independent cities in the United States. Virginia counts as home to 38 of them.

So, what is an independent city? It's a city independent of the county (or counties?) in which it is located:

Virginia’s thirty-eight incorporated cities are politically and administratively independent of the counties with which they share borders, just as counties are politically and administratively independent of each other.

In Virginia, unlike in the other 49 states, any incorporated municipality that is not an independent city wears the official designation "town," regardless of its size. In Virginia, then, a town may be larger than an independent city. That runs counter to the typical understanding of the respective words "city" and "town." Cities, according to the typical definitions, exceed towns in size.

Youngkin in Context 

Was Youngkin using Virginia's understanding of "city" when he addressed his New York audience? We judge there's little reason to think so. Yes, Youngkin himself, as governor of Virginia, must possess some awareness of Virginia's unusual technical standard for cities. But should Youngkin expect this audience to share that understanding? That seems like a stretch.

In the end, Youngkin isn't precluded from using the more common definition of "city" when he mentions cities in Virginia, especially to an outside audience. Many towns in Virginia fit the definition of "city" understood in other states, New York included.

PolitiFact doesn't waste any words at all on that possibility. Why? 

For the journalistic team at PolitiFact Virginia, Virginia's particular approach to defining cities may count as second nature. That may have blinded the team to alternate possibilities. Or, PolitiFact may have stuck with Virginia's narrow definition of "city" to simplify its fact-finding. It's easier to check the list of independent cities to test Youngkin's claim than to check the list of independent cities plus towns-that-may-reasonably-fit-the-usual-definition-of-cities.

But a fact checker that checks facts using methods of convenience over methods of accuracy does not count as much of a fact checker. Maybe Youngkin won no cities that hadn't been won before by a Republican. We do not plan to fact check that. We'll simply point out that PolitiFact Virginia's fact check uses an unacceptable approach to the problem.

PolitiFact should have explained Virginia's unusual approach to designating cities, at minimum, if it failed to properly fact check Youngkin's claim according to the typical definition of "city."

Oversights such as these are what we should expect of biased fact checkers. And that's what we see from PolitiFact on a regular basis.

Friday, November 4, 2022

Narrative-shepherding instead of fact-checking for America as stolen land

One has to hand it to PolitiFact's Yacob Reyes. Earlier this year, Reyes turned a mostly unoccupied barrier island into the whole of Lee County. Now, Reyes turns bad-faith treaties affecting a limited number of American lands into support for the narrative of the United States (plus other modern American nations north and south) consisting of stolen land.

Behold:

PolitiFact's left-wing editorial masquerading as a fact check published on Nov. 3, 2022.

It takes DeSantis out of context, and instead of devoting an "In Context" article to DeSantis' statement, akin to the cover PolitiFact provided for President Obama when he informed business owners "You didn't build that," DeSantis received a "Pants on Fire" rating.

Strikingly, PolitiFact apparently draws a complete blank in trying to figure out why DeSantis would say the United States is not built on stolen land. That comes through in two ways. First, PolitiFact opines in print that it "wondered what DeSantis was referring to and whether he was right in his assessment of whether the U.S. was built on 'stolen land.'" Second, PolitiFact offered absolutely nothing to represent DeSantis' position other than tweets from DeSantis associate Christina Pushaw.

Pushaw tweeted an image promoted by Democratic Party candidate for Lieutenant Governor Karla Hernández-Mats saying "No one is illegal on stolen land." After reporting that Hernández-Mats offered no response to its questions about the image, PolitiFact drops that subject for the remainder of the article.

That's how PolitiFact treated the context.

With the context safely ignored, PolitiFact documents some of the admittedly raw deals the Native Americans got and declares DeSantis as wrong as can be:

It's well-documented that the U.S. repeatedly made treaties with Native Americans and then violated them using force and other means to accommodate non-Native settlement. Courts, including the U.S. Supreme Court, have time and again affirmed that as fact.

Government-endorsed actions to remove Native Americans from their ancestral lands included the 1830 passage of a federal law that led to war and resulted in thousands of Native deaths and more than 3,000 Seminoles being removed from Florida.

DeSantis' claim is wildly historically inaccurate. We rate it Pants on Fire!

How does that push back against a DeSantis objection to the immigration statement Hernández-Mats promoted? It doesn't. Instead, it blandly moves in step with liberal-progressive orthodoxy. PolitiFact can't be bothered to dig up articles that explain the objection to singling out Western nations as occupying stolen land.

The Spectator/Historian Jeff Fynn-Paul:

The narrative of the ‘stolen country’ or ‘Native American genocide’ does not stand up to scrutiny by any honest and clear-sighted historian. It is a dangerously myopic and one-sided interpretation of history. It has only gained currency because most practising historians and history teachers are either susceptible to groupthink, or else have been cowed into silence by fear of losing their jobs. Reduced to its puerile form of ‘statement of guilt’, this myth puts 100 per cent of the burden on Europeans who are held responsible for all historical evil, while the First Nations people are mere victims; martyrs even, whose saintlike innocence presumes that their civilisation and society were practically perfect in every way.

All Land is Stolen/Anthony Galli:

The only reason you can claim to “own” land is because of the implicit threat of military/police force against anyone who might try to take it from you. In the good ol’ prehistoric days, man would have to defend his own cave, but now our self-defense is largely done by our respective governments so that we can worry about other things like what’s on Netflix. In other words, a country is one big cave where the current occupant claims to own the cave by threatening force if you try to “steal” it.

In fact, what makes the United States of America so special is how well we treated the former inhabitants of the land we purchased… relative to how every other nation on Earth had treated conquered people up-till that point, which granted still isn’t saying much because the Trail of Tears definitely wasn’t a walk in the park.

Authors such as these help point out that viewing one's own nation as a thief will tend to erode society. And given that every nation qualifies as a thief in trivial "stolen land" terms, there is no real solution to the problem that doesn't involve destroying society. Further, in terms of permitting free immigration on the southern border, why should the descendants of land-stealing conquistadors have title to land stolen in an arguably more civilized way north of the Mexican border?

It doesn't make sense. But PolitiFact will not present those views.

PolitiFact has a narrative to nurture. And excluding competing narratives serves as one means toward that end.

Research note: PolitiFact's 'Pants on Fire' bias through October 2022

We continue to update our research project examining PolitiFact's bias in applying its "Pants on Fire" rating.

The project notes that the difference between "False" and "Pants on Fire" ratings on PolitiFact's "Truth-O-Meter" counts as subjective. "False" statements are untrue, while "Pants on Fire" statements are untrue and "ridiculous." "Ridiculous" is a subjective judgment on its face, and PolitiFact has never published a description of the distinction suggesting otherwise.

Through the end of October 2022, PolitiFact had given a Democratic Party officeholder/candidate/organization/appointee just one "Pants on Fire" rating that counts toward our statistics. Our count omits claims that attack the claimant's own political party; one such omitted rating occurred, published through PolitiFact New York.

As a result, we found PolitiFact nearly six times more likely to subjectively rate (what it views as) false claims from Republicans as "Pants on Fire" compared to such claims from Democrats.

Interestingly, PolitiFact National published its first "Pants on Fire" rating for a Democrat in 2022 not long after we tagged PolitiFact's Editor-in-Chief Angie Drobnic Holan in our tweet about this graph. A graph including the November stats as of today would put the 2022 PoF Bias number near 3.2--not the all-time record but instead solidly among the four highest bias measures taken since 2007.

Of course new ratings may move the numbers greatly by the end of the year.

Readers curious about the trends and details of the chart may wish to look at our explanation for the full chart at the end of 2021.

Thursday, September 22, 2022

PolitiSpin: Biden says he cut the debt by $1.5 trillion? Half True!

We kiddeth not when we call PolitiFact a collection of liberal bloggers posing as non-partisan fact checkers.

U.S. debt hasn't gone down at all under Biden, as PolitiFact admits. PolitiFact cited a September 2022 estimate saying the deficit, not the debt, would decrease by about $1.7 trillion compared to FY2021, but that leaves a deficit of almost $1 trillion that will increase the debt by that same amount.

So, how does a left-leaning fact checker go about making a false statement seem like a partially true statement that leaves out important details or takes things out of context?

Watch and learn, wannabe liberal bloggers who covet the "fact checker" label:

"We’ve also reduced the debt and reduced the debt by $350 billion my first year," Biden said. "This year, it's going to be over $1.5 trillion (that we’ve) reduced the debt."

Biden has a point that his administration has presided over smaller deficits than were seen under the Trump administration, based on Congressional Budget Office estimates. But Biden’s remark leaves out important context. The debt had risen because of a temporary phase of unusual federal spending.

No Reduction of U.S. Debt

It's simple. Declare that when Biden says he reduced the debt by $1.5 trillion he's actually making a valid point about reducing the deficit and therefore reducing the growth of the debt. Then imply that the problem with Biden's claim isn't using "debt" instead of "deficit" but that he has left out the fact that most of the deficit reduction happened as old COVID programs stopped shelling out so much federal money.

We're probably not supposed to point out that PolitiFact omits all mention of Mr. Biden's student loan forgiveness program. The CBO said, on the very page PolitiFact cited for its deficit figure, that loan forgiveness actions in September 2022 could substantially affect deficit figures for FY2022.

That's what a liberal blogger will leave out that a nonpartisan fact checker will mention.

What About Biden's Underlying Point?

PolitiFact has reliably (?) informed us that the most important aspect of a numbers claim comes from the speaker's underlying point. If the numbers are off but the main point stands, a favorable "Truth-O-Meter" rating may result.

PolitiFact founder Bill Adair:

(W)e realized we were ducking the underlying point of blame or credit, which was the crucial message. So we began rating those types of claims as compound statements. We not only checked whether the numbers were accurate, we checked whether economists believed an office holder's policies were much of a factor in the increase or decrease.

It turns out in the Biden fact check PolitiFact found Mr. Biden was taking credit for the non-existent debt reduction:

During a Sept. 18 interview with CBS’ "60 Minutes," President Joe Biden touted his administration’s efforts to rein in federal debt.

We judge that if PolitiFact believed Biden was touting "his administration's efforts to rein in federal debt," then it regarded his debt reduction claim as an effort to take credit for that supposed reduction.

So, was it a Biden administration effort that reduced the deficit (not the debt) by $1.5 trillion compared to FY2021?

PolitiFact (bold emphasis added):

Spending programs passed earlier in the pandemic began expiring this year, meaning federal outlays have declined. The Committee for a Responsible Federal Budget, a nonprofit public policy group, has estimated that more than 80% of the $1.7 trillion reduction in the deficit can be explained by expiring or shrinking COVID-19 relief.

We calculate that as $1.35 trillion out of the $1.7 trillion, leaving Biden with the potential to claim credit for as much as $350 billion of the deficit reduction. Giving the president credit for the entire amount results in an estimated exaggeration (minimum) of 329 percent (($1.5 trillion-$.35 trillion)/$.35 trillion).
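For readers who want to check our arithmetic, here is the calculation as a quick Python sketch (figures in trillions of dollars, using PolitiFact's cited numbers and our rounding of the COVID-relief share to $1.35 trillion):

```python
# Estimated deficit reduction versus FY2021, per the CBO figure PolitiFact cited.
total_reduction = 1.7

# PolitiFact's source attributes more than 80% of that to expiring COVID-19
# relief; we round that share to 1.35, leaving the remainder as the most the
# administration could plausibly claim credit for.
covid_share = 1.35
admin_share = total_reduction - covid_share  # 0.35

# Biden claimed credit for $1.5 trillion; measure the overstatement
# relative to the administration's potential share.
claimed = 1.5
exaggeration = (claimed - admin_share) / admin_share
print(f"{exaggeration:.0%}")  # prints 329%
```

The result matches the minimum 329 percent exaggeration figure above.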

So Biden claimed debt reduction that was not debt reduction and exaggerated his administration's share of the deficit reduction by over three times its actual amount. Therefore, according to PolitiFact, what he said was half true.

The 'Slowing the Rate of Growth' Excuse

PolitiFact cleverly, or perhaps stupidly, excuses Biden's use of "debt" instead of "deficit" by interpreting the claim to mean slowing the growth of the debt. PolitiFact could argue precedent for that approach, for claims about "cutting Medicare" or "cutting Medicaid" tend to receive "Half True" ratings or worse (worse tends to happen when the claimant is a Republican).

The problem? Biden got the "Half True" while exaggerating the numbers in his favor for purposes of claiming credit. And that's with PolitiFact helping out by not mentioning the potential cost of his student loan bailout proposal. The Penn Wharton budget model (University of Pennsylvania) estimated costs of over $500 billion for 2022.

That figure would wipe out the administration's potential share of $350 billion of deficit reduction.

The "slowing the rate of growth" excuse doesn't come close to justifying a "Half True" rating.

We have here another strong entry from PolitiFact for the Worst Fact Check of 2022.

Friday, September 9, 2022

PolitiFact vs Your Lyin' Eyes on the immigration invasion

On Sept. 6, 2022 PolitiFact published an item titled "A surprising number of Americans believe these false claims about immigrants. Here are the facts."

We have a favorite among the supposedly false claims believed by a surprising number of Americans.

"There is no invasion at the southern border"

No invasion at the southern border? Tell us more, PolitiFact.

More than half of Americans surveyed by NPR/Ipsos believe it is completely or somewhat true that the "U.S. is experiencing an invasion at the southern border."

But many immigrants crossing the border illegally turn themselves into Border Patrol agents on purpose, to ask for asylum, Brown said. 

"That is not behavior that you would really attribute to an invader," Brown said. She said that usually, the term invasion is used to describe a concerted effort by a country to forcibly enter another country to take it over.

Such reasoning does not belong in fact-checking. On the contrary, the logic PolitiFact accepts belongs as far from fact-checking as possible.

Here's PolitiFact's supposed logic: If the expert says "invasion" usually means one country making a concerted effort to forcibly enter another to take it over, then "invasion" means one country making a concerted effort to forcibly enter another country to take it over. For PolitiFact, it then follows that Americans viewing a tide of illegal immigration at the southern border incorrectly see it as an invasion.

At least PolitiFact declined to follow the lead of its source, NPR/Ipsos, in calling the term "invasion" racist.

Hopefully the definition from Webster's New World College Dictionary, supposedly a standard for U.S. journalists (we're using the fourth edition) can help clear things up:

in-va-sion ... n. ... an invading or being invaded; specif., a) an entering or being entered by an attacking military force b) an intrusion or infringement

Is there any valid reason to suppose that illegal entry to the United States does not count as intrusion? Of course not. And any fact-checker unable to figure that out cannot be worthy of the name.

Regarding the invasion at the southern border, you can believe PolitiFact or your lyin' eyes.