Saturday, August 11, 2018

Did an Independent Study Find PolitiFact Is Not Biased?

An email alert from August 10, 2018 led us to a blaring headline from the International Fact-Checking Network:

Is PolitiFact biased? This content analysis says no

Though "content analysis" could mean the researchers looked at pretty much anything having to do with PolitiFact's content, we suspected the article was talking about an inventory of PolitiFact's word choices, looking for words associated with a political point of view. For example, "anti-abortion" and "pro-life" signal political points of view. Using those and similar terms may tip off readers regarding the politics of those who produce the news.

PolitiFact Bias has never used the presence of such terms to support our argument that PolitiFact is biased. In fact, I (Bryan) tweeted out a brief judgment of the study on Twitter back on July 16, 2018:
We have two major problems with the IFCN article at Poynter.org (by Daniel Funke).

First, it implies that the word-use inventory somehow negates the evidence of bias that PolitiFact's critics cite, evidence that does not involve the types of word choices the study was designed to detect:
It’s a critique that PolitiFact has long been accustomed to hearing.

“PolitiFact is engaging in a great deal of selection bias,” The Weekly Standard wrote in 2011. “'Fact Checkers' Overwhelmingly Target Right-Wing Pols and Pundits” reads an April 2017 headline from NewsBusters, a site whose goal is to expose and combat “liberal media bias.” There’s even an entire blog dedicated to showing the ways in which PolitiFact is biased.

The fact-checking project, which Poynter owns, has rebuffed those accusations, pointing to its transparent methodology and funding (as well as its membership in the International Fact-Checking Network) as proof that it doesn’t have a political persuasion. And now, PolitiFact has an academic study to back it up.
The second paragraph mentions selection bias (taking the Weekly Standard quotation out of context) and other types of bias noted by PolitiFact Bias ("an entire blog dedicated to showing the ways in which PolitiFact is biased"--close enough, we suppose, thanks for linking us).

The third paragraph says PolitiFact has "rebuffed those accusations." We think "ignores those accusations" describes the situation more accurately.

The third paragraph goes on to mention PolitiFact's "transparent methodology" (true if you ignore the ambiguity and inconsistency) and transparent funding (yes, funded by some left-wing sources, but PolitiFact Bias does not use that as evidence of PolitiFact's bias) before claiming that PolitiFact "has an academic study to back it up."

"It" = PolitiFact's rebuffing of accusations that it is biased?

That does not follow logically. To support PolitiFact's denials of the bias of which it is accused, the study would have to offer evidence countering the specific accusations. It doesn't do that.

Second, Funke's article suggests that the study shows a lack of bias. We see that idea in the title of Funke's piece as well as in the material from the third paragraph.

But that's not how science works. Even for the paper's specific area of study, it does not show that PolitiFact has no bias. At best it could show the word choices it tested offer no significant indication of bias.

The difference is not small, and Funke's article even includes a quotation from one of the study's authors emphasizing the point:
But in a follow-up email to Poynter, Noah Smith, one of the report’s co-authors, added a caveat to the findings.

“This could be because there's really nothing to find, or because our tools aren't powerful enough to find what's there,” he said.
So the co-author says maybe the study's tools were not powerful enough to find the bias that exists. Yet Funke sticks with the title "Is PolitiFact biased? This content analysis says no."

Is it too much to ask for the title to agree with a co-author's description of the meaning of the study?

The content analysis did not say "no." It said (we summarize) "not in terms of these biased language indicators."

Funke's article paints a very misleading picture of the content and meaning of the study. The study refutes none of the major critiques of PolitiFact of which we are aware.


Afters

PolitiFact's methodology, funding and verified IFCN signatory status are supposed to assure us it has no political point of view?

We'd be more impressed if PolitiFact staffers revealed their votes in presidential elections and more than a tiny percentage voted Republican more than once in the past 25 years.

It's anybody's guess why fact checkers do not reveal their voting records, right?


Correction Aug. 11, 2018: Altered headline to read "an Independent Study" instead of "a Peer-Reviewed Study"

The Weekly Standard Notes PolitiFact's "Amazing" Fact Check

The Weekly Standard took note of PolitiFact's audacity in fact-checking Donald Trump's claim that the economy grew at the amazing rate of 4.1 percent in the second quarter.
The Trumpian assertion that moved the PolitiFact’s scrutineers to action? This one: “In the second quarter of this year, the United States economy grew at the amazing rate of 4.1 percent.” PolitiFact’s objection wasn’t to the data—the economy really did grow at 4.1 percent in the second quarter—but to the adjective: amazing.
That's amazing!

PolitiFact did not rate the statement on its "Truth-O-Meter" but published its "Share The Facts" box featuring the judgment "Strong, but not amazing."

PolitiFact claims it does not rate opinions and grants license for hyperbole.

As we have noted before, it must be the fault of Republicans who keep trying to use hyperbole without a license.

Friday, August 10, 2018

PolitiFact Editor: It's Frustrating When Others Do Not Follow Their Own Policies Consistently

PolitiFact Editor Angie Drobnic Holan says she finds it frustrating that Twitter does not follow its own policies (bold emphasis added):
The fracas over Jones illustrates a lot, including how good reporting and peer pressure can actually force the platforms to act. And while the reasons that Facebook, Apple and others banned Jones and InfoWars have to do with hate speech, Twitter’s inaction also confirms what fact-checkers have long thought about the company’s approach to fighting misinformation.

“They’re not doing anything, and I’m frustrated that they don’t enforce their own policies,” said Angie Holan, editor of (Poynter-owned) PolitiFact.
Tell us about it.

We started our "(Annotated) Principles of PolitiFact" page years ago to expose examples of the way PolitiFact selectively applies its principles. It's a shame we haven't had the time to keep that page updated, but our research indicates PolitiFact has failed to correct the problem to any noticeable degree.

Tuesday, August 7, 2018

The Phantom Cherry-pick

Would Sen. Bernie Sanders' Medicare For All plan save $2 trillion over 10 years on U.S. health care expenses?

Sanders and the left were on fire this week trying to co-opt a Mercatus Center paper by Charles Blahous. Sanders and others claimed Blahous' paper confirmed the M4A plan would save $2 trillion over 10 years.

PolitiFact checked in on the question and found Sanders' claim "Half True":


PolitiFact's summary encapsulates its reasoning:
The $2 trillion figure can be traced back to the Mercatus report. But it is one of two scenarios the report offers, so Sanders’ use of the term "would" is too strong. The alternative figure, which assumes that a Medicare for All plan isn’t as successful in controlling costs as its sponsors hope it will be, would lead to an increase of almost $3.3 trillion in national health care expenditures, not a decline. Independent experts say the alternative scenario of weaker cost control is at least as plausible.

We rate the statement Half True.
Throughout its report, as pointed out at Zebra Fact Check, PolitiFact treats the $2 trillion in savings as a serious attempt to project the true effects of the M4A bill.

In fact, the Mercatus report uses what its author sees as overly rosy assumptions about the bill's effects to estimate a lower bound for the bill's very high costs, and then proceeds to offer reasons why the costs will likely greatly exceed that estimate.

In other words, the cherry Sanders tries to pick is a faux cherry. And a fact checker ought to recognize that fact. It's one thing to pick a cherry that's a cherry. It's another thing to pick a cherry that's a fake.

Making Matters Worse

PolitiFact makes matters worse by overlooking Sanders' central error: circular reasoning.

Sanders takes a projection based on favorable assumptions as evidence that the favorable assumptions are reasonable. But a conclusion reached from assumptions does not make those assumptions any more true. Sanders' claim suggests the opposite: that when the Blahous paper says it is using unrealistic assumptions, the conclusions it reaches using those assumptions make the assumptions reasonable.

A fact checker ought to point out when a politician peddles such nonsensical ideas.

PolitiFact made itself guilty of bad reporting while overlooking Sanders' central error.

Reader: "PolitiFact is not biased. Republicans just lie more."

Every few years or so we recognize a Comment of the Week.

Jehosephat Smith dropped by on Facebook to inform us that PolitiFact is not biased:
Politifact is not biased, Republicans just lie more. That is objectively obvious by this point and if your mind isn't moved by current realities then you're willfully ignorant.
As we have prided ourselves on trying to communicate clearly exactly why we find PolitiFact biased, we find such comments fascinating on two levels.


First, how can one claim that PolitiFact is not biased? On what evidence would one rely to support such a claim?

Second, how can one contemplate claiming PolitiFact isn't biased without making some effort to address the arguments we've made showing PolitiFact is biased?

We invited Mr. Smith to make his case either here on the website or on Facebook. But rather than simply heaping Smith's burden of proof on his head, we figured his comment would serve us well as an excuse to again summarize the evidence showing PolitiFact's bias to the left.


Journalists lean left
Journalists as a group lean left. And they lean markedly left of the general U.S. population. Without knowing anything else at all about PolitiFact we have reason to expect that it is made up mostly of left-leaning journalists. If PolitiFact journalists lean left as a group then right out of the box we have reason to look for evidence that their political leaning affects their fact-checking.

PolitiFact's errors lean left I
When PolitiFact makes an egregious reporting error, the error tends to harm the right or fit with left-leaning thinking. For example, when PolitiFact's Louis Jacobson reported that Hobby Lobby's policy on health insurance "barred" women from using certain types of birth control, we noted that pretty much anybody with any rightward lean would have spotted the mistake and prevented its publication. Instead, PolitiFact published it and later changed it without posting a correction notice. We have no trouble finding such examples.

PolitiFact's errors lean left II
We performed a study of PolitiFact's calculations of percentage error. PolitiFact often performs the calculation incorrectly, and the errors tend to benefit Democrats (caveat: small data set).
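The percentage-error calculation at issue is simple but easy to botch by dividing by the wrong base. A minimal sketch of the correct arithmetic (the function name and the figures are ours, for illustration only, not taken from any PolitiFact item):

```python
def percentage_error(claimed, actual):
    """Return how far a claimed figure is off, as a percentage.

    The denominator must be the actual (reference) value; a common
    mistake is to divide by the claimed value instead, which shrinks
    or inflates the apparent error.
    """
    return (claimed - actual) / actual * 100

# Illustrative figures only: a claim of 5 against an actual value of 4
# overstates by 25 percent; dividing by the claim would wrongly yield 20.
print(percentage_error(5, 4))  # 25.0
```

The choice of denominator is the whole game: the same pair of numbers yields a different percentage depending on which value is treated as the baseline.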

PolitiFact's ratings lean left I
When PolitiFact rates Republicans and Democrats on closely parallel claims Democrats often fare better. For example, when PolitiFact investigated a Democratic Party charge that Rep. Bill McCollum raised his own pay while in Congress PolitiFact said it was true. But when PolitiFact investigated a Republican charge that Sherrod Brown had raised his own pay PolitiFact discovered that members of Congress cannot raise their own pay and rated the claim "False." We have no trouble finding such examples.

PolitiFact's ratings lean left II
We have done an ongoing and detailed study of partisan differences in PolitiFact's application of its "Pants on Fire" rating. PolitiFact describes no objective criterion distinguishing "False" from "Pants on Fire" ratings, so we hypothesize that the difference between the two is subjective. Republicans are over 50 percent more likely than Democrats to have a false claim deemed "Pants on Fire" false, for apparently subjective reasons.

PolitiFact's explanations lean left
When PolitiFact explains topics its explanations tend to lean left. For example, when Democrats and liberals say Social Security has never contributed a dime to the deficit PolitiFact gives it a rating such as "Half True," apparently unable to discover the fact that Social Security has run a deficit during years when the program was on-budget (and therefore unquestionably contributed directly to the deficit those years). PolitiFact resisted Republican claims that the ACA cut Medicare, explaining that the so-called Medicare cuts were not truly cuts because the Medicare budget continued to increase. Yet PolitiFact discovered when the Trump administration slowed the growth of Medicaid it was okay to refer to the slowed growth as a program cut. Again, we have no trouble finding such examples.

How can a visitor to our site (including Facebook) contemplate declaring PolitiFact isn't biased without coming prepared to answer our argument?