Friday, November 10, 2017

'Not a lot of reader confusion' VI

PolitiFact editor Angie Drobnic Holan has claimed she does not notice much reader confusion regarding the interpretation of PolitiFact's "report card" charts and graphs.

This series of posts is designed to call shenanigans on that frankly unbelievable claim.

Rem Rieder, a journalist of some repute, showed himself a member of PolitiFact's confused readership with a Nov. 10, 2017 article:
While most politicians are wrong some of the time, the fact-checking website PolitiFact has found that that [sic] Trump's assertions are inaccurate much more frequently than those of other pols.
When we say Rieder showed himself a member of PolitiFact's confused readership, that means we're giving Rieder the benefit of the doubt by assuming he's not simply lying to his readers.

As we have stressed repeatedly here at PolitiFact Bias, PolitiFact's collected "Truth-O-Meter" ratings cannot be assumed to reliably reflect the truth-telling patterns of politicians, pundits or networks. PolitiFact uses non-random methods of choosing stories (selection bias) and uses an admittedly subjective rating system (personal bias).

PolitiFact then reinforces the sovereignty of the left-leaning point of view--most journalists lean left of the American public--by deciding its ratings by a majority vote of its "star chamber" board of editors.

We have called on PolitiFact to attach disclaimers to each of its graphs, charts or stories related to its graphs and charts to keep such material from misleading unfortunate readers like Rieder.

So far, our roughly five years of lobbying have fallen on deaf ears.

Monday, November 6, 2017

PolitiFact gives the 8 in 10 lie a "Half True."

We can trust PolitiFact to lean left.

Sometimes we bait PolitiFact into giving us examples of its left-leaning tendencies. On November 1, 2017, we noticed a false tweet from President Barack Obama. So we drew PolitiFact's attention to it via the #PolitiFactThis hashtag.

We didn't need to have PolitiFact look into it to know that what Obama said was false. He presented a circular argument, in effect, using the statistics for people who had chosen an ACA exchange plan to mislead the wider public about their chances of receiving subsidized and inexpensive health insurance.

PolitiFact identified the deceit in its fact check, but used biased supposition to soften it (bold emphasis added):
"It only takes a few minutes and the vast majority of people qualify for financial assistance," Obama says. "Eight in 10 people this year can find plans for $75 a month or less."

Can 8 in 10 people get health coverage for $75 a month or less? It depends on who those 10 people are.

The statistic only refers to people currently enrolled in HealthCare.gov.
The video ad appeals to people who are uninsured or who might save money by shopping for health insurance on the government exchange. PolitiFact's wording fudges the truth. It might have accurately said "The statistic is correct for people currently enrolled in HealthCare.gov but not for the population targeted by the ad."

In the ad, the statistic refers to the ad's target population, not merely to those currently enrolled in HealthCare.gov.

And PolitiFact makes thin and misleading excuses for Obama's deception:
(I)n the absence of statistics on HealthCare.gov visitors, the 8-in-10 figure is the only data point available to those wondering about their eligibility for low-cost plans within the marketplace. What’s more, the website also helps enroll people who might not have otherwise known they were eligible for other government programs.
The nonpartisan fact-checker implies that the lack of data helps excuse using data in a misleading way. We reject that type of excuse-making. If Obama does not provide his audience the context allowing it to understand the data point without being misled, then he deserves full blame for the resulting deception.

PolitiFact might as well be saying "Yes, he misled people, but for a noble purpose!"

PolitiFact, in fact, provided other data points in its preceding paragraph that helped contextualize Obama's misleading data point.

We think PolitiFact's excuse-making influences the reasoning it uses when deciding its subjective "Truth-O-Meter" ratings.
HALF TRUE – The statement is partially accurate but leaves out important details or takes things out of context.
MOSTLY FALSE – The statement contains an element of truth but ignores critical facts that would give a different impression.
FALSE – The statement is not accurate.
In objective terms, what keeps Obama's statement from deserving a "Mostly False" or "False" rating?
His statement was literally false when taken in context, and his underlying message was likewise false.

About 10 to 12 million are enrolled in HealthCare.Gov ("Obamacare") plans. About 80 percent of those receive the subsidies Obama lauds. About 6 million persons buying insurance outside the exchange fail to qualify for subsidies, according to PolitiFact. Millions among the uninsured likewise fail to qualify for subsidies.

Surely a fact-checker can develop a data point out of numbers like those.
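A back-of-the-envelope calculation using the numbers above shows how far the ad's target audience is from "8 in 10." The enrollee, subsidy and off-exchange figures come from the preceding paragraph; the uninsured count is a hypothetical placeholder we chose purely for illustration:

```python
# Rough check of the "8 in 10" claim against the ad's broader audience,
# not just current enrollees. Figures marked "per the post" come from the
# paragraph above; the uninsured figure is a HYPOTHETICAL placeholder.

exchange_enrollees = 11_000_000              # per the post: ~10-12 million in HealthCare.gov plans
subsidized = int(0.80 * exchange_enrollees)  # per the post: ~80% of enrollees receive subsidies
off_exchange_buyers = 6_000_000              # per the post: buy outside the exchange, no subsidy
uninsured = 28_000_000                       # hypothetical: "millions" uninsured who don't qualify

# The ad targets everyone shopping for coverage, not just current enrollees.
audience = exchange_enrollees + off_exchange_buyers + uninsured
share = subsidized / audience
print(f"Subsidized share of the ad's audience: {share:.0%}")  # roughly 20%, not 8 in 10
```

Under these assumed inputs, only about one in five of the ad's plausible audience receives the subsidies Obama touted, which is the kind of data point a fact-checker could have developed.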

But this is what happens when non-partisan fact checkers lean left.

Correction Nov. 6, 2017: Removed "About 6 million uninsured do not qualify for Medicaid or subsidies" as it was superseded by reporting later in the post.

Monday, October 23, 2017

PolitiFact's Evangelism & Revival Tour III

PolitiFact's Katie Sanders PolitiSplains why conservatives should trust PolitiFact

PolitiFact reached out to residents of three red states, Alabama, Oklahoma and West Virginia, thanks to a grant from the Knight Foundation. We're calling it PolitiFact's Evangelism and Revival Tour because of its resemblance to religious "love-bombing."

In our post from this series published on Oct. 22, 2017, we wondered what specific reasons PolitiFact was offering conservatives to convince them they should trust PolitiFact.

We're supposing the red state unwashed are hearing little more than the spiel PolitiFact's Katie Sanders gave in West Virginia.

MetroNews and Alex Thomas reported:
Organization deputy editor Katie Sanders said following the 2016 presidential campaign, they noticed a trend among conservatives regarding a distrust of news organizations.

“We are concerned about that because we are independent, we’re nonpartisan, we call out both sides, yet there’s still this skepticism,” she said on MetroNews’ “Talkline.”
PolitiFact is neutral and trustworthy because it is "independent"?

We like the response of the University of Miami's Joe Uscinski to that one.
We believe by "independent" PolitiFact means it does not allow outside entities to guide its process. The same is true of PolitiFact Bias. Does that make us unbiased?

PolitiFact is neutral and trustworthy because it is "nonpartisan"? 

Think tanks nearly all call themselves "nonpartisan." Yet news reports routinely describe think tanks as "right-leaning" or "left-leaning." "Nonpartisan" does not automatically equate with "unbiased," let alone neutral and trustworthy.

We might as well mention that PolitiFact Bias is "nonpartisan" by the same definition think-tanks (and likely PolitiFact) use (everything but "unbiased"). Does that make us unbiased?

PolitiFact is neutral and trustworthy because it calls out both sides?

Bush made mistakes. Obama made mistakes. Look, Ma, I'm neutral!

Calling out both sides does nothing to guarantee neutrality or trustworthiness. It's perfectly possible to call out one side with kid gloves and the other with a hammer.

At PolitiFact Bias, we think PolitiFact is often guilty of applying unequal standards, and we created this site in part to highlight such cases. We point out that PolitiFact sometimes unfairly harms Democrats as well as Republicans. Does that make us unbiased?

The argument for trust that Sanders used counts as flim-flam.

If PolitiFact wants trust from conservatives and moderates it will need a better sales pitch. That is, a sales pitch with specifics that actually address the issues that lead to the lack of trust.

Get to it, PolitiFact.

Sunday, October 22, 2017

The PolitiFact Evangelism & Revival Tour II

Thanks to a generous and wasteful grant from the Knight Foundation, PolitiFact is reaching out to red state voters!

These outreaches suspiciously correlate with new PolitiFact state franchises, in turn making it look like the Knight Foundation wants to help PolitiFact advertise itself.

Daniel Funke of the Poynter Institute posted a story about the Oklahoma leg of PolitiFact's dog & pony show. We reviewed that in our first part in this series. This installment concerns a Washington Post story about the third and final stage of the evangelism and revival tour, ending up in West Virginia.

What's the Purpose of This Tour, Again?

The Post article leads with a section that more-or-less paints PolitiFact's outreach as a failure.

PolitiFact planned to go out and tell people PolitiFact is nonpartisan and fair and let them see, at least to some degree, how PolitiFact works. That was supposed to lead to greater trust. But when given the opportunity to make that case, PolitiFact editor Amy Hollyfield comes across like Eeyore.
“I have discussions with people about the news all the time on Facebook, and I show them what I consider to be credible sources of information,” a man named Paul Epstein says from a middle row. “And they say, ‘Oh, that’s all biased.’ So how can you, or how can we, convince people to trust any mainstream media?”

Amy Hollyfield of PolitiFact, the Pulitzer Prize-winning fact-checking organization, considers the question. She hesitates a beat before telling Epstein and about 65 others in the audience that maybe you can’t. Not all the time.
Well, that's encouraging! What else does Hollyfield have?
“We have a lot of things on our website” that attest to PolitiFact’s impartiality and credibility, Hollyfield says. “But I don’t think that seeps in when you’re having that kind of conversation. That’s why we’re trying to tell our story.”
Specifics? Aren't specifics always foremost in the minds of journalists? Okay, maybe Hollyfield gave the specifics. Maybe the Post's Paul Farhi left them out. But it seems to us beyond question that if the idea of the evangelism tour is to build trust of PolitiFact in red states then PolitiFact should focus on those specifics, whatever they are.
The fact-checkers keep steering the conversation back to PolitiFact and its 10-year track record of rating political speech, including how it assigns its most damning rating, “Pants on Fire.”
What? It would be great to have some specifics on that. Pretty much the best description we have of the difference between PolitiFact's "False" and "Pants on Fire" ratings is PolitiFact Editor Angie Drobnic Holan's immortal "Sometimes we decide one way and sometimes decide the other." We'd like to know even more about this occult-yet-objective (?) process. But there's nothing new in the Post article. So not today.

Sharockman has the Evidence of Neutral Nonpartisanship (not)!

Just a few days ago we published a chart showing that PolitiFact fact-checked President Trump more between his inauguration and Oct. 18 than it did President Obama over the same period in 2009 and 2013 combined. We did it to show the utter ridiculousness of Executive Director Aaron Sharockman's argument that fact-checking Obama frequently serves as evidence of PolitiFact's neutrality.

Lo and behold, the Post captured Sharockman making that same argument again. Christmas in October (bold emphasis added):
(Sharockman) bristles a bit at the conservative critique ["The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind."--bww]. “People say, ‘Why didn’t you fact-check Hillary Clinton’s claim about coming under fire [as first lady] in Bosnia?’ Well, we did. The person we fact-checked more than anyone else is Barack Obama. . . . The person we fact-check the most is the president. We’re going to hold the president accountable.”
As we pointed out in our earlier article accompanying the graph, yes, of course national fact checkers check the president the most. That will be true regardless of party and therefore serves as no evidence whatsoever of impartiality, particularly when a Republican president draws greater scrutiny than Obama did. Sharockman's argument is flim-flam.

This article about PolitiFact trying to convince conservatives it is neutral and non-partisan gives conservatives no evidence of PolitiFact's neutrality or non-partisanship. These people could use some talking points that have greater strength than wet toilet paper.

Hey, the article mentions "PolitiFact Bias"!

Plus: How PolitiFact could build trust across the board

At the risk of humeral fracture from patting ourselves on the back, the best section of the Post article is the one that mentions PolitiFact Bias. That's not because it mentions PolitiFact Bias, though that's part of it (bold emphasis added)
(Sharockman)’s fully aware of the free-floating cynicism about fact-checking, a form that has enjoyed a boomlet in the past few years with such outfits as PolitiFact, FactCheck.org, Snopes and The Washington Post’s Fact Checker on the scene. In one poll last year, 88 percent of people who supported Trump during the 2016 campaign said they didn’t trust media fact-checking. (Overall, just 29 percent of likely voters in the survey said they did.) PolitiFact itself has come in for particularly intense criticism; a blog called PolitiFact Bias is devoted to “exposing [its] bias, mistakes and flimflammery.”

The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind.
The fact is that the polls show that moderates and independents are more skeptical about mainstream media fact-checking than are Democrats. The corollary? The political group that most trusts political fact-checking is Democrats.

Shouldn't we expect moderates more than Democrats or Republicans to favor PolitiFact if it treats Democrats and Republicans with equal skepticism? Indeed, for years PolitiFact tried to argue for its neutrality by saying it gets attacked from both sides. Left unsaid was the fact that most of the attacking came from one side.

PolitiFact needs to hear the message in the numbers. Likely voters don't trust fact checkers (71 percent!). PolitiFact can't do meet-and-greets with 71 percent of likely voters. To earn trust, PolitiFact needs to severely ramp up its transparency and address the criticism it receives. If the criticism is valid, make changes. If the criticism is invalid, then crush entities like PolitiFact Bias by publicly discrediting their arguments with better arguments.

Establish trust by modeling transparently trustworthy behavior, in other words.

Or PolitiFact can just keep doing what it's doing and see if that 30 percent or so that trusts it just happens to grow.

Good luck with that.


Is this true?
The fact of the matter is that both sides are becoming less moored to the truth, Sharockman says. The number of untrustworthy statements by Republicans and Democrats alike has grown over the past three presidential cycles, he noted.
Our numbers show that the number of false ("False" plus "Pants on Fire") statements from Democrats, as rated by PolitiFact, dropped from PolitiFact's early years, though with a minor spike during the 2016 election cycle.

What data would support Sharockman's claim, we wonder?

Friday, October 20, 2017

PolitiFact and the principle of inconsistency

In October, six days apart, PolitiFact did fact checks on two parallel claims, each asserting the existence of a particular law. One, by U.S. Senate candidate Roy Moore, was found "False." The other, by a Saturday Night Live cast member, was found "Mostly True."

Moore asserted that an act of Congress made it "against the law" to fail to stand for the playing of the national anthem. PolitiFact confirmed the existence of the law Moore referenced, but noted that it merely offered guidance on proper etiquette. It did not provide any punishment for improper etiquette.

SNL's Colin Jost said a Texas law made it illegal to own more than six dildos. PolitiFact confirmed a Texas law made owning more than six "obscene devices" illegal. PolitiFact found that a federal court had ruled that law unconstitutional in 2008.

Both laws exist. The one Moore cited carries no teeth because it describes proper etiquette, not a legal requirement backed by government police power. The one Jost cited lacks teeth because a federal court voided it.

How did PolitiFact and PolitiFact Texas justify their respective rulings?

PolitiFact (bold emphasis added):
Moore said NFL players taking a knee during the national anthem is "against the law."

Moore's basis is that a law on the books describes patriotic etiquette during the national anthem. But his statement gives the false impression the law is binding, when in fact it’s merely guidance that carries no penalty. Additionally, legal experts told us the First Amendment protects the right to kneel during the national anthem.

We rate this False.
PolitiFact Texas (bold emphasis added):
Jost said: "There is a real law in Texas that says it’s illegal to own more than six dildos."

Such a cap on "obscene devices" has been state law since the 1970s though it’s worth clarifying that the law mostly hasn’t been enforced since federal appeals judges found it unconstitutional in 2008.

We rate the claim Mostly True.
From where we're sitting, the thing PolitiFact Texas found "worth clarifying" in its "Mostly True" rating of Jost closely resembles in principle one of the reasons PolitiFact gave for rating Moore's statement "False" (neither law is binding, but for different reasons). As for the other rationale backing the "False" rating, from where we're sitting Jost equaled Moore in giving the impression that the Texas law is binding today. But PolitiFact Texas did not penalize Jost for offering a misleading impression.

We call these rulings inconsistent.

Inconsistency is a bad look for fact checkers.

Update Oct. 23, 2017: We appreciate Tim Graham highlighting this post at Newsbusters.

Wednesday, October 18, 2017

Fact-checking the president

When accused of focusing its fact checks on conservatives more than liberals, PolitiFact has been known to defend itself by pointing out that it has fact checked Barack Obama more than any other political figure.

We properly ridiculed that claim because it is natural for a national political fact checker to place special importance on the statements of a president. We should only be surprised if the fact checker fails to fact check the president most frequently. And now that President Donald Trump has succeeded President Obama in office, we can do some comparisons that help illustrate the point.

Please note that this comparison does have an apples-to-oranges aspect to it. PolitiFact started out with the aim of fact-checking the election campaign. Therefore, we should allow for PolitiFact to get a slow start on President Obama's first term.

We based the comparisons on the number of fact checks PolitiFact performed on the presidents between their inaugurations (two, in Obama's case) and Oct. 18. In fact, PolitiFact fact checked Obama more frequently in 2009 than it did when he launched his second term in 2013.

As the graph shows, through Oct. 18 PolitiFact has fact checked Trump more in 2017 than it did Obama in 2009 and 2013 combined.

Trump has an excellent shot at supplanting Obama as the figure most fact checked by PolitiFact within just four years of taking office.

And perhaps we'll never again hear PolitiFact's balance defended on the basis of its fact-checking Obama more often than other political figures.

Surprise! Another way PolitiFact rates claims inconsistently

When we saw PolitiFact give a "Mostly False" rating to the claim state spending in Oklahoma had reached an all-time high, it piqued our curiosity.

PolitiFact issued the "Mostly False" rating because the Oklahoma Council of Public Affairs used nominal dollars instead of inflation-adjusted dollars in making its claim.
The Oklahoma Council of Public Affairs said that spending this year is on track to be the highest ever. While the raw numbers show that, the statement ignores the impact of inflation, a standard practice when comparing dollars over time. Factoring in inflation shows that real spending was higher in 2009 to 2011.

When population and economic growth are added in, spending has been higher over most of the past decade.

The statement contains an element of truth but it ignores critical facts that would give a different impression. We rate this claim Mostly False.
Considering the claim was arguably "Half True" based on nominal dollars, we wondered if PolitiFact's ruling was consistent with similar cases involving the claims of Democrats.

Given our past experience with PolitiFact, we were not surprised at all to find PolitiFact giving a "Half True" to a Democratic National Committee claim that U.S. security funding for Israel had hit an all-time high. There was one main difference between the DNC's claim and the one from the Oklahoma Council of Public Affairs: The one from the DNC was false for either nominal dollars or inflation-adjusted dollars (bold emphasis added).
The ad says "U.S. security funding for Israel is at an all-time high." Actually, it was higher in one or two years, depending whether you use inflation-adjusted dollars. In addition, the ad oversells the credit Obama can take for this year’s number. The amount was outlined by a memorandum signed in 2007 under President George W. Bush. On balance, we rate the claim Half True.

That's not just inconsistent, it's PolitiFinconsistent!


The fact check that drew our attention was technically from PolitiFact Oklahoma, but was perpetrated by Jon Greenberg and Angie Drobnic Holan, both veterans of PolitiFact National.