Monday, October 23, 2017

PolitiFact's Evangelism & Revival Tour III

PolitiFact's Katie Sanders PolitiSplains why conservatives should trust PolitiFact

PolitiFact reached out to residents of three red states, Alabama, Oklahoma and West Virginia, thanks to a grant from the Knight Foundation. We're calling it PolitiFact's Evangelism and Revival Tour thanks to its resemblance to religious "love-bombing."

In our post from this series published on Oct. 22, 2017, we wondered what specific reasons PolitiFact was offering conservatives to convince them they should trust PolitiFact.

We're supposing the red state unwashed are hearing little more than the spiel PolitiFact's Katie Sanders gave in West Virginia.

MetroNews and Alex Thomas reported:
Organization deputy editor Katie Sanders said following the 2016 presidential campaign, they noticed a trend among conservatives regarding a distrust of news organizations.

“We are concerned about that because we are independent, we’re nonpartisan, we call out both sides, yet there’s still this skepticism,” she said on MetroNews’ “Talkline.”
PolitiFact is neutral and trustworthy because it is "independent"?

We like the response of the University of Miami's Joe Uscinski to that one:
We believe by "independent" PolitiFact means it does not allow outside entities to guide its process. The same is true of PolitiFact Bias. Does that make us unbiased?

PolitiFact is neutral and trustworthy because it is "nonpartisan"? 

Think tanks nearly all call themselves "nonpartisan." Yet news reports routinely describe a think tank as "right-leaning" or "left-leaning." "Nonpartisan" does not automatically equate with "unbiased," let alone neutral and trustworthy.

We might as well mention that PolitiFact Bias is "nonpartisan" by the same definition think tanks (and likely PolitiFact) use (everything but "unbiased"). Does that make us unbiased?

PolitiFact is neutral and trustworthy because it calls out both sides?

Bush made mistakes. Obama made mistakes. Look, Ma, I'm neutral!

Calling out both sides does nothing to guarantee neutrality or trustworthiness. It's perfectly possible to call out one side with kid gloves and the other with a hammer.

At PolitiFact Bias, we think PolitiFact is often guilty of applying unequal standards, and we created this site in part to highlight such cases. We point out that PolitiFact sometimes unfairly harms Democrats as well as Republicans. Does that make us unbiased?

The argument for trust that Sanders used counts as flim-flam.

If PolitiFact wants trust from conservatives and moderates it will need a better sales pitch. That is, a sales pitch with specifics that actually address the issues that lead to the lack of trust.

Get to it, PolitiFact.

Sunday, October 22, 2017

The PolitiFact Evangelism & Revival Tour II

Thanks to a generous and wasteful grant from the Knight Foundation, PolitiFact is reaching out to red state voters!

These outreaches suspiciously correlate with new PolitiFact state franchises, in turn making it look like the Knight Foundation wants to help PolitiFact advertise itself.

Daniel Funke of the Poynter Institute posted a story about the Oklahoma leg of PolitiFact's dog & pony show. We reviewed that in our first part in this series. This installment concerns a Washington Post story about the third and final stage of the evangelism and revival tour, ending up in West Virginia.

What's the Purpose of This Tour, Again?

The Post article leads with a section that more-or-less paints PolitiFact's outreach as a failure.

PolitiFact planned to go out and tell people PolitiFact is nonpartisan and fair and let them see, at least to some degree, how PolitiFact works. That was supposed to lead to greater trust. But when given the opportunity to make that case, PolitiFact editor Amy Hollyfield comes across like Eeyore.
“I have discussions with people about the news all the time on Facebook, and I show them what I consider to be credible sources of information,” a man named Paul Epstein says from a middle row. “And they say, ‘Oh, that’s all biased.’ So how can you, or how can we, convince people to trust any mainstream media?”

Amy Hollyfield of PolitiFact, the Pulitzer Prize-winning fact-checking organization, considers the question. She hesitates a beat before telling Epstein and about 65 others in the audience that maybe you can’t. Not all the time.
Well, that's encouraging! What else does Hollyfield have?
“We have a lot of things on our website” that attest to PolitiFact’s impartiality and credibility, Holly­field says. “But I don’t think that seeps in when you’re having that kind of conversation. That’s why we’re trying to tell our story.”
Specifics? Aren't specifics always foremost in the minds of journalists? Okay, maybe Hollyfield gave the specifics. Maybe the Post's Paul Farhi left them out. But it seems to us beyond question that if the idea of the evangelism tour is to build trust of PolitiFact in red states then PolitiFact should focus on those specifics, whatever they are.
The fact-checkers keep steering the conversation back to Politi­Fact and its 10-year track record of rating political speech, including how it assigns its most damning rating, “Pants on Fire.”
What? It would be great to have some specifics on that. Pretty much the best description we have of the difference between PolitiFact's "False" and "Pants on Fire" ratings is PolitiFact Editor Angie Drobnic Holan's immortal "Sometimes we decide one way and sometimes decide the other." We'd like to know even more about this occult-yet-objective (?) process. But there's nothing new in the Post article. So not today.

Sharockman has the Evidence of Neutral Nonpartisanship (not)!

Just a few days ago we published a chart showing PolitiFact has published more fact checks of President Trump between his inauguration and Oct. 18 than it did of President Obama over the same period in 2009 and 2013 combined. We did it to show the utter ridiculousness of Executive Director Aaron Sharockman's argument that fact-checking Obama frequently serves as evidence of PolitiFact's neutrality.

Lo and behold, the Post captured Sharockman making that same argument again. Christmas in October (bold emphasis added):
(Sharockman) bristles a bit at the conservative critique [The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind --bww]. “People say, ‘Why didn’t you fact-check Hillary Clinton’s claim about coming under fire [as first lady] in Bosnia?’ Well, we did. The person we fact-checked more than anyone else is Barack Obama. . . . The person we fact-check the most is the president. We’re going to hold the president accountable.”
As we pointed out in our earlier article accompanying the graph, yes of course national fact checkers check the president the most. That will be true regardless of party and therefore serves as no evidence whatsoever of impartiality, particularly if a Republican president may have drawn greater scrutiny than Obama. Sharockman's argument is flim-flam.

This article about PolitiFact trying to convince conservatives it is neutral and non-partisan gives conservatives no evidence of PolitiFact's neutrality or non-partisanship. These people could use some talking points that have greater strength than wet toilet paper.

Hey, the article mentions "PolitiFact Bias"!

Plus: How PolitiFact could build trust across the board

At the risk of humeral fracture from patting ourselves on the back, the best section of the Post article is the one that mentions PolitiFact Bias. That's not because it mentions PolitiFact Bias, though that's part of it (bold emphasis added):
(Sharockman)’s fully aware of the free-floating cynicism about fact-checking, a form that has enjoyed a boomlet in the past few years with such outfits as PolitiFact, Snopes and The Washington Post’s Fact Checker on the scene. In one poll last year, 88 percent of people who supported Trump during the 2016 campaign said they didn’t trust media fact-checking. (Overall, just 29 percent of likely voters in the survey said they did.) PolitiFact itself has come in for particularly intense criticism; a blog called PolitiFact Bias is devoted to “exposing [its] bias, mistakes and flimflammery.”

The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind.
The fact is that the polls show that moderates and independents are more skeptical about mainstream media fact-checking than are Democrats. The corollary? The political group that most trusts political fact-checking is Democrats.

Shouldn't we expect moderates more than Democrats or Republicans to favor PolitiFact if it treats Democrats and Republicans with equal skepticism? Indeed, for years PolitiFact tried to argue for its neutrality by saying it gets attacked from both sides. Left unsaid was the fact that most of the attacking came from one side.

PolitiFact needs to hear the message in the numbers. Likely voters don't trust fact checkers (71 percent!). PolitiFact can't do meet-and-greets with 71 percent of likely voters. To earn trust, PolitiFact needs to severely ramp up its transparency and address the criticism it receives. If the criticism is valid, make changes. If the criticism is invalid, then crush entities like PolitiFact Bias by publicly discrediting their arguments with better arguments.

Establish trust by modeling transparently trustworthy behavior, in other words.

Or PolitiFact can just keep doing what it's doing and see if that 30 percent or so that trusts it just happens to grow.

Good luck with that.


Is this true?
The fact of the matter is that both sides are becoming less moored to the truth, Sharockman says. The number of untrustworthy statements by Republicans and Democrats alike has grown over the past three presidential cycles, he noted.
Our numbers show that the number of false ("False" plus "Pants on Fire") statements from Democrats, as rated by PolitiFact, dropped after PolitiFact's early years, though with a minor spike during the 2016 election cycle.

What data would support Sharockman's claim, we wonder?

Friday, October 20, 2017

PolitiFact and the principle of inconsistency

In October, six days apart, PolitiFact did fact checks on two parallel claims, each asserting the existence of a particular law. One, by U.S. Senate candidate Roy Moore, was found "False." The other, by a Saturday Night Live cast member, was found "Mostly True."

Moore asserted that an act of Congress made it "against the law" to fail to stand for the playing of the national anthem. PolitiFact confirmed the existence of the law Moore referenced, but noted that it merely offered guidance on proper etiquette. It did not provide any punishment for improper etiquette.

SNL's Colin Jost said a Texas law made it illegal to own more than six dildos. PolitiFact confirmed a Texas law made owning more than six "obscene devices" illegal. PolitiFact found that a federal court had ruled that law unconstitutional in 2008.

Both laws exist. The one Moore cited carries no teeth because it describes proper etiquette, not a legal requirement backed by government police power. The one Jost cited lacks teeth because a federal court voided it.

How did PolitiFact and PolitiFact Texas justify their respective rulings?

PolitiFact (bold emphasis added):
Moore said NFL players taking a knee during the national anthem is "against the law."

Moore's basis is that a law on the books describes patriotic etiquette during the national anthem. But his statement gives the false impression the law is binding, when in fact it’s merely guidance that carries no penalty. Additionally, legal experts told us the First Amendment protects the right to kneel during the national anthem.

We rate this False.
PolitiFact Texas (bold emphasis added):
Jost said: "There is a real law in Texas that says it’s illegal to own more than six dildos."

Such a cap on "obscene devices" has been state law since the 1970s though it’s worth clarifying that the law mostly hasn’t been enforced since federal appeals judges found it unconstitutional in 2008.

We rate the claim Mostly True.
From where we're sitting, the thing PolitiFact Texas found "worth clarifying" in its "Mostly True" rating of Jost closely resembles in principle one of the reasons PolitiFact gave for rating Moore's statement "False" (neither law is binding, but for different reasons). As for the other rationale backing the "False" rating, from where we're sitting Jost equaled Moore in giving the impression that the Texas law is binding today. But PolitiFact Texas did not penalize Jost for offering a misleading impression.

We call these rulings inconsistent.

Inconsistency is a bad look for fact checkers.

Update Oct. 23, 2017: We appreciate Tim Graham highlighting this post at Newsbusters.

Wednesday, October 18, 2017

Fact-checking the president

When accused of focusing its fact checks on conservatives more than liberals, PolitiFact has been known to defend itself by pointing out that it has fact checked Barack Obama more than any other political figure.

We properly ridiculed that claim because it is natural for a national political fact checker to place special importance on the statements of a president. We should only be surprised if the fact checker fails to fact check the president most frequently. And now that President Donald Trump has succeeded President Obama in office, we can do some comparisons that help illustrate the point.

Please note that this comparison does have an apples-to-oranges aspect to it. PolitiFact started out with the aim of fact-checking the election campaign. Therefore, we should allow for PolitiFact to get a slow start on President Obama's first term.

We based the comparisons on the number of fact checks PolitiFact performed on the presidents between their inauguration (two of those for Obama) and Oct. 18. In fact, PolitiFact fact checked Obama more frequently in 2009 than it did when he launched his second term in 2013.

As the graph shows, through Oct. 18 PolitiFact has fact checked Trump more in 2017 than it did Obama in 2009 and 2013 combined.

Trump has an excellent shot at supplanting Obama as the figure most fact checked by PolitiFact within just four years of taking office.

And perhaps we'll never again hear PolitiFact's balance defended on the basis of its fact-checking Obama more often than other political figures.

Surprise! Another way PolitiFact rates claims inconsistently

When we saw PolitiFact give a "Mostly False" rating to the claim state spending in Oklahoma had reached an all-time high, it piqued our curiosity.

PolitiFact issued the "Mostly False" rating because the Oklahoma Council of Public Affairs used nominal dollars instead of inflation-adjusted dollars in making its claim.
The Oklahoma Council of Public Affairs said that spending this year is on track to be the highest ever. While the raw numbers show that, the statement ignores the impact of inflation, a standard practice when comparing dollars over time. Factoring in inflation shows that real spending was higher in 2009 to 2011.

When population and economic growth are added in, spending has been higher over most of the past decade.

The statement contains an element of truth but it ignores critical facts that would give a different impression. We rate this claim Mostly False.
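PolitiFact's inflation point is easy to reproduce with a quick calculation. Here is a minimal Python sketch of the standard adjustment, using hypothetical spending figures and price-index values (not Oklahoma's actual numbers) to show how a nominal "all-time high" can fail to be a high in real terms:

```python
# Convert past nominal dollars into today's dollars using a price index,
# then compare. All figures below are hypothetical illustrations.

def to_real(nominal, cpi_then, cpi_now):
    """Express a past nominal amount in today's dollars."""
    return nominal * cpi_now / cpi_then

cpi_now = 245.0                      # hypothetical current price index
spending = {                         # year: (nominal $bn, price index that year)
    2009: (6.9, 214.5),
    2017: (7.0, 245.0),
}

real = {yr: to_real(amt, cpi, cpi_now) for yr, (amt, cpi) in spending.items()}

# 2017 is the nominal high, but 2009 is higher once adjusted for inflation,
# mirroring the distinction PolitiFact drew.
assert spending[2017][0] > spending[2009][0]
assert real[2009] > real[2017]
```

The same mechanics apply whichever index one picks; the dispute in these fact checks is not over the arithmetic but over whether skipping the adjustment merits "Mostly False" for one side and "Half True" for the other.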
Considering the claim was arguably "Half True" based on nominal dollars, we wondered if PolitiFact's ruling was consistent with similar cases involving the claims of Democrats.

Given our past experience with PolitiFact, we were not surprised at all to find PolitiFact giving a "Half True" to a Democratic National Committee claim that U.S. security funding for Israel had hit an all-time high. There was one main difference between the DNC's claim and the one from the Oklahoma Council of Public Affairs: The one from the DNC was false for either nominal dollars or inflation-adjusted dollars (bold emphasis added).
The ad says "U.S. security funding for Israel is at an all-time high." Actually, it was higher in one or two years, depending whether you use inflation-adjusted dollars. In addition, the ad oversells the credit Obama can take for this year’s number. The amount was outlined by a memorandum signed in 2007 under President George W. Bush. On balance, we rate the claim Half True.

That's not just inconsistent, it's PolitiFinconsistent!


The fact check that drew our attention was technically from PolitiFact Oklahoma, but was perpetrated by Jon Greenberg and Angie Drobnic Holan, both veterans of PolitiFact National.

Tuesday, October 17, 2017

Can you trust what "Media Bias/Fact Check" says about PolitiFact? (Updated x2)

(See update at the end)

Somehow we got to the point where it makes sense to talk about Media Bias/Fact Check.

Media Bias/Fact Check bills itself as "The most comprehensive media bias resource." It's run by Dave Van Zandt, making it fair to say it's run by "some guy" ("Dave studied Communications in college" is his main claim to expertise).

We have nothing against "some guy" possessing expertise despite a lack of qualifications, of course. One doesn't need a degree or awards (or audience) to be right about stuff. But are Van Zandt and his Media Bias/Fact Check right about PolitiFact?

Media Bias/Fact Check rates PolitiFact as a "Least-biased" source of information. How does MB/FC reach that conclusion? The website has a "Methodology" page describing its methods:
The method for (rating bias) is determined by ranking bias in four different categories. In each category the source is rated on a 0-10 scale, with 0 meaning without bias and 10 being the maximum bias(worst). These four numbers are then added up and divided by 4. This 0-10 number is then placed on the line according to their Left or Right bias.
This system makes PolitiFact's "Truth-O-Meter" almost look objective by comparison. An 11-point scale? To obtain objectivity with an 11-point scale would require a very finely-grained system of objective bias measures--something that probably nobody on the planet has even dreamt of achieving.

It comes as no surprise that Van Zandt lacks those objective measures:

The categories are as follows (bold emphasis added):
  1. Biased Wording/Headlines- Does the source use loaded words to convey emotion to sway the reader. Do headlines match the story.
  2. Factual/Sourcing- Does the source report factually and back up claims with well sourced evidence.
  3. Story Choices: Does the source report news from both sides or do they only publish one side.
  4. Political Affiliation: How strongly does the source endorse a particular political ideology? In other words how extreme are their views. (This can be rather subjective)
Likely Van Zandt regards only the fourth category as subjective. All four are subjective unless Van Zandt has kept secret additional criteria he uses to judge bias. Think about it. Take the "biased wording" category, for example. Rate the headline bias for "PolitiFact Bias" on a scale of 0-10. Do it. What objective criteria guided the decision?

There is nothing to go on except for one's own subjective notion of where any observed bias falls on the 0-10 scale.

If the scale were worth something, researchers could put the rating system in the hands of any reasonable person and obtain comparable results. Systems with robust objective markers attached to each level of the scale can achieve that. Those lacking such markers will not.

Based on our experience with PolitiFact, we applied Van Zandt's system to PolitiFact ourselves. Please remember that our experience does not render Van Zandt's system anything other than subjective.

Biased Wording/Headlines: 4
Factual/Sourcing: 3
Story Choices: 4
Political Affiliation: 3

Van Zandt's formula calls for adding the four scores and dividing by 4: (4+3+4+3)/4 = 3.5, which lands in the "Left-Center Bias" range.
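Van Zandt's arithmetic is trivial to reproduce. A minimal Python sketch of the scoring step follows; the category labels and our 0-10 scores come from this post, while the band cutoffs are our assumption, since MB/FC publishes no exact boundaries (which is rather the point):

```python
# Average four 0-10 category scores, per MB/FC's stated method
# ("added up and divided by 4"). Scores are ours, from this post.
scores = {
    "Biased Wording/Headlines": 4,
    "Factual/Sourcing": 3,
    "Story Choices": 4,
    "Political Affiliation": 3,
}

overall = sum(scores.values()) / len(scores)  # (4+3+4+3)/4 = 3.5

# Hypothetical band boundaries; MB/FC does not publish exact cutoffs,
# which is precisely why any mapping like this one is subjective.
def band(score):
    if score < 2:
        return "Least Biased"
    if score < 5:
        return "Left-Center / Right-Center"
    return "Left / Right"
```

With our inputs, `overall` comes out to 3.5. The formula itself is objective; the problem is that nothing objective constrains the four numbers fed into it.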

Why is Van Zandt's rating objectively more valid than ours? Or yours?

Here's more of Van Zandt's rating of PolitiFact.
Factual Reporting: VERY HIGH
World Press Freedom Rank: USA 43/180

Notes: [PolitiFact] is a project operated by the Tampa Bay Times, in which reporters and editors from the Times and affiliated media outlets “fact-check statements by members of Congress, the White House, lobbyists and interest groups”. They publish original statements and their evaluations on the website, and assign each a “Truth-O-Meter” rating. The ratings range from “True” for completely accurate statements to “Pants on Fire” (from the taunt “Liar, liar, pants on fire”) for false and ridiculous claims. Politifact has been called left biased by Extreme right wing and questionable sources. Our research indicates that Poltifact [sic] is an accurate fact checker and is considered the gold standard for political fact checking. (7/10/2016)


Notice the biased language from Van Zandt? Van Zandt only allows that PolitiFact has been called left-leaning by "Extreme right wing and questionable sources." In fact, PolitiFact has been called left-biased by many sources, including the non-partisan Allsides Project.

Van Zandt even has an opt-in poll on his PolitiFact page asking visitors how they rate PolitiFact's bias. Most of the respondents disagree with the site's rating of PolitiFact.

Over 50 percent of Van Zandt's respondents rated PolitiFact biased to the left. Does that mean that all those 2,000+ people were "Extreme right wing" or "questionable sources"?

Note: I voted "Left-Center."

Why is PolitiFact called the "gold standard" for fact checking instead of some other fact checker, or even Zebra Fact Check? That's a mystery.

The crux of the matter

The temptation of subjective rating scales is obvious, but such scales misinform readers and probably tend to mislead their creators as well.

A rating scale that fails to base its ratings on quantifiable data is worthless. Van Zandt's ratings are worthless except to tell you his opinion.

Opinions about PolitiFact's bias start to have value when backed by specific, quantifiable findings. We've taken that approach for years here at PolitiFact Bias. When we see the biased headline, we write a post about it if it's of sufficient note. When we see the bad reporting, we write a post about it and document PolitiFact's failure with reliable sourcing. When we see PolitiFact skewing its story choices in a way that unfairly harms conservatives (or liberals), we write an article about it. When we see systematic signs of bias in PolitiFact's ratings, we do objective research on it.

We do that because specific examples trump subjective rating scales.

Until Dave Van Zandt adds objective markers to the MB/FC rating scales and justifies every rating with real objective data, take the ratings with a boulder of salt. They're worthless without specific backing data.


On its PolitiFact page, Media Bias/Fact Check links the flawed PolitiFact article we fisked here.

"VERY HIGH" factual reporting.


Update August 11, 2018:

Dave Van Zandt contacted us on Aug. 9, 2018 to say MB/FC has changed its rating of PolitiFact to "Left-Center." But we can't find any evidence the change occurred, so we have no response yet to the supposed change (perhaps Van Zandt's message was simply in error, intending to inform us of a more subtle shift in the rating).

The Internet Archive pretends to have plenty of saves but the only one it shows (when we checked) seems to be from January 2018.

The latest live version contains the following, which seems short of moving PolitiFact to the "Left-Center" category:
Overall, this update reveals a slight leftward shift in Politifact’s fact checking selection, but not enough to move them from the least biased category. (7/10/2016) Updated (D. Van Zandt 7/15/2018)

That seems a bit wishy-washy. Subjectivity can have that effect.

Update 2, June 6, 2021:

MB/FC updated its rating of PolitiFact to "Left-Center Bias" with an update time-stamped 04/28/2021:

Overall, we rate Politifact Left-Center Biased based on fact checks that tend to be more favorable for the left. We also rate them High for factual reporting and a credible fact-checker that is not without bias. (7/10/2016) Updated (D. Van Zandt (4/28/2021)
Van Zandt's 2021 update vanishes the 2018 update. It would be better to keep the publication and update dates all intact and link each to an appropriate URL at the Internet Archive.

The fact that Van Zandt has come around to our position does not mean that either one of us has rendered an objective judgment of PolitiFact's bias.

In our opinion, if PolitiFact's bias has shifted left since 2018 it was by a fraction. And we would say Van Zandt continues to overestimate PolitiFact's reliability.

Sunday, October 15, 2017

PolitiFact: LeBron James is Colin Kaepernick

Yes, we confess to using a strange title for this post.

Yet as far as we can tell, that is what PolitiFact is saying with a Facebook post from earlier today:

A fact check on Colin Kaepernick's shirt? We followed the link. We did find a rating of Kaepernick's misattribution of a quotation to Winston Churchill. But there was nothing about his shirt.

The linked page was titled "All Sports statements."

But we remembered seeing a fact check related to a sports figure that wasn't on that page.

It was a fact check of a Photoshop that changed the text on LeBron James' shirt.

So ... LeBron James is Colin Kaepernick?

We wonder how PolitiFact handles Facebook corrections transparently.

Friday, October 13, 2017

Yet more sweet PolitiLies

Today, with a fresh executive order from President Donald Trump ending the subsidies insurance companies had received under the Obama and Trump administrations, PolitiFact recycled a PolitiSplainer it published on July 31, 2017.

The story claimed to explain what it meant when Trump threatened to end an insurance company bailout:

We have no problem with the bulk of the story*, except for one glaring omission. PolitiFact writer John Kruzel somehow left out the fact that a court ruling found the payments to insurance companies were not authorized by the Affordable Care Act. The Court suspended its injunction to leave time for an appeal, but time ran out on the Obama administration and now any such appeal is up to the Trump administration.

Anyone want to hold their breath waiting for that to happen?

That makes the lead of Kruzel's story false (bold emphasis added):
President Donald Trump warned lawmakers he would cut off billions in federal funding that insurance companies receive through Obamacare if Congress fails to pass new health care legislation.
If the ACA legislation does not authorize the spending the insurance companies received, then the insurance companies do not receive their funding "through Obamacare."

How does a "nonpartisan" fact checker miss out on a key fact that is relatively common knowledge? And go beyond even that to misstate the fact of the matter?

Maybe PolitiFact is a liberal bubble?

*At the same time, we do not vouch for its accuracy

Thursday, October 12, 2017

PolitiFact defends itself for money

PolitiFact has a terrible record when it comes to defending itself from public criticism. So it surprised us to see PolitiFact's Jon Greenberg take to Twitter highlighting an article he wrote defending against criticism he received from Mark Hyman.

The destination article by Greenberg failed to link to the attacking video. So persons wishing to see for themselves would just have to take Greenberg's word or else try to find the video through their own effort.

We found the video with a little effort. We could not find it through the association with Sinclair Broadcasting that Greenberg advertised. We found it by connecting the "Mark Hyman" mentioned in Greenberg's self-defense to

We found a number of things striking about Greenberg's article.

Sinclair, which has faced criticism for a clear conservative point of view, published a video commentary last week saying we fabricated data related to a fact-check we published on Sen. Ted Cruz, R-Texas. Cruz claimed, "Two-thirds of the (Sandy disaster relief) bill had nothing to do with Sandy."
First, we were overpoweringly bemused by PolitiFact, which has faced criticism for a clear liberal point of view, mentioning that Sinclair has received criticism for leaning right. If the accusations against Sinclair are worth mentioning, then what of those against PolitiFact?

More importantly, Greenberg made a logical leap with his claim the article says "we fabricated data related to a fact-check we published on Sen. Ted Cruz, R-Texas." That simply isn't in Hyman's video or the transcript. The closest to that occurs at the end of the video, when Hyman refers to his two other criticisms of PolitiFact:
On our website are two other segments [here, here] that show PolitiFact fabricating info and presenting false claims.
While it is possible to read the statement as a suggestion PolitiFact fabricated information in its fact check of Cruz, it may also be read to simply say the other two segments show PolitiFact fabricating information and(/or) presenting false claims. In his tweet Greenberg was more specific, fabricating the claim that Hyman accused him of making up "numbers."

Back to Greenberg's article:
We found that the bulk of the federal money went to states hit hardest by Sandy.

Sinclair executive Mark Hyman countered, saying that "billions were not for emergency relief. Or for Sandy."
Is this a titanic battle of straw men or what?

Hyman's beef with PolitiFact was its supposed suggestion that virtually all of the Sandy relief bill went to pay for relief from Sandy's impact. The quotation Hyman used occurred in the Washington Post version of the Cruz fact check but not in the one PolitiFact published. But it isn't hard to see the same idea presented in PolitiFact's fact check. If the money went to states hardest hit by Sandy, PolitiFact apparently reasoned, then it was for relief from superstorm Sandy.

That's bad reasoning, and worth exposing.

Is Mark Hyman a "Sinclair executive"? We think Greenberg botched the reporting on this one [Pre-publication update: PolitiFact fixed this after we did some Twitter needling]. Hyman's biography (dated October 2017) says he stepped down from an executive position in 2005 and mentions no resumption of a similar post.

That straw man again

That could be true, but that isn’t what Cruz claimed. He said the lion’s share of the money had no connection to Sandy.

That’s a bold assertion, and nothing Sinclair presented actually supports it.
We must forgive Jon Greenberg for focusing on Hyman's failure to show Cruz was right. A fact checker cannot be expected to notice that Hyman did not try to defend Cruz and did not mention Cruz in his critique of PolitiFact.

In debate terms, Greenberg conceded that Hyman may have a sound premise.

Pictorial interlude/foreshadowing

How about a different straw man?

It’s a simple question of math and scale. Sinclair’s report said that $16 billion went to the Housing and Urban Development Department. It then gave two examples of that money going to Chicago (to upgrade sewer and water systems) and Springfield, Mass. (to boost development in tornado-damaged low-income neighborhoods). Together, the two grants add up to $85 million.

Those dollars amount to one half of 1 percent of the money HUD got after the storm.
Greenberg omits that Hyman explicitly said he was merely giving two examples among many. It was disingenuous, and a straw man fallacy, for Greenberg to total the amount from Hyman's two examples and use the total to amplify PolitiFact's point that the bulk of the spending went to disaster relief for damages wrought by Sandy. In a way, Greenberg actually proves Hyman's point after the fact.

A point unresponsive to Hyman's charge

This is what happens with straw man arguments. We see arguments advanced that have nothing to do with what the other person was arguing.

As we reported, HUD granted $12.8 billion to the places hit hardest by Sandy, namely New Jersey, New York and New York City. That represents about 80 percent of the HUD total, the opposite of Cruz’s claim that two-thirds had nothing to do with Sandy.
As Hyman wasn't defending Cruz, Greenberg wastes his words.

A paragraph hinting at what might have been ...

We nominate this next segment as Greenberg's best paragraph:
There are valid reasons to debate what qualifies as emergency relief and what is non-emergency spending. We noted that distinction in our report, as well as that the Sandy appropriation bill was a leaky bucket. The money, for example, could be spent on disasters in 2011, 2012 and 2013. We also highlighted that it takes years to spend many of those billions of dollars, especially when they go to roads, bridges, tunnels and other infrastructure.
The above is the sensible person's response to Hyman's editorial. Hyman charged that PolitiFact made it look like nearly all the money from the Sandy relief bill went for emergency relief. Greenberg's right that PolitiFact made these points in its fact check. Hyman's case, then, is certainly not a slam-dunk.

... and then back into the weeds of falsehood and obfuscation

The Sinclair report concluded by saying that PolitiFact "is fabricating info and presenting false claims." That is simply not true. Our reporting is accurate, and we list all of our sources.
Greenberg leaves out the context of Hyman's conclusion, as we pointed out above. As a result, Greenberg leaves his readers the misleading impression that his article refutes Hyman's concluding claim. That claim most obviously refers to two other segments about PolitiFact that Greenberg does not address in his article. On what basis does he call those charges false?

Making matters worse for PolitiFact, Greenberg's article contains inaccurate reporting and fails to list all its sources (documented above).

The Ulterior Motive

Why did PolitiFact defend itself from Hyman's video attack when we've been enthusiastically targeting PolitiFact for years while receiving a fairly dedicated silence in response?

The image we inserted above foreshadowed the answer, specifically the blue hotlink in the middle of Greenberg's article encouraging readers to "STAND UP FOR FACTS AND SUPPORT POLITIFACT!" (all caps in the original).

The link leads to a page where readers can sign up to join PolitiFact's "Truth Squad," which helps financially support PolitiFact.

Greenberg's story is PolitiFact's version of those ubiquitous emails politicians send out to spur their constituents to give them money. My opponent is forming a Super PAC! Send $8 to show you support Candidate X and want money out of politics!

PolitiFact chose the attack from Sinclair because it could attach the attack to a company with deep pockets, scaring its supporters into giving it money.

Check out the email we got from Executive Director Aaron Sharockman:

The Sinclair Broadcasting Group, the nation's largest owner of television stations, is attacking PolitiFact for a recent fact-check we published about federal funding related to superstorm Sandy.
Sinclair, which has faced criticism for a clear conservative point of view, published a video commentary last week saying we fabricated data related to a fact-check we published on Sen. Ted Cruz, R-Texas. Cruz claimed, "Two-thirds of the (Sandy disaster relief) bill had nothing to do with Sandy."
We found that the bulk of the federal money went to states hit hardest by Sandy.
Sinclair executive Mark Hyman countered, saying that "billions were not for emergency relief. Or for Sandy."
That could be true, but that isn’t what Cruz claimed. He said the lion’s share of the money had no connection to Sandy.
That’s a bold assertion, and nothing Sinclair presented actually supports it.
Will you help PolitiFact fight for the truth?
It’s a simple question of math and scale. Sinclair’s report said that $16 billion went to the Housing and Urban Development Department. It then gave two examples of that money going to Chicago (to upgrade sewer and water systems) and Springfield, Mass. (to boost development in tornado-damaged low-income neighborhoods). Together, the two grants add up to $85 million.
Those dollars amount to one half of 1 percent of the money HUD got after the storm.
As we reported, HUD granted $12.8 billion to the places hit hardest by Sandy, namely New Jersey, New York and New York City. That represents about 80 percent of the HUD total, the opposite of Cruz’s claim that two-thirds had nothing to do with Sandy.
There are valid reasons to debate what qualifies as emergency relief and what is non-emergency spending. We noted that distinction in our report, as well as that the Sandy appropriation bill was a leaky bucket. The money, for example, could be spent on disasters in 2011, 2012 and 2013. We also highlighted that it takes years to spend many of those billions of dollars, especially when they go to roads, bridges, tunnels and other infrastructure.
The Sinclair report concluded by saying that PolitiFact "is fabricating info and presenting false claims." That is simply not true. Our reporting is accurate, and we list all of our sources.
Whenever we can, we let the numbers do the talking and in the case of Cruz’s statement, the numbers spoke loud and clear. He said two-thirds had nothing to do with Sandy. The dollars show that the bulk of the money went to the places hit hardest by Sandy.
Yours truly,

Aaron Sharockman
Executive Director
Isn't that precious? Sharockman sends out Greenberg's article under his own name! You'd think Sharockman could give Greenberg the credit for writing the thing, right?

Cheap Stunt, Poorly Executed

Anyway, PolitiFact makes it clear that it didn't answer Hyman as part of its supposedly firm commitment to transparency. PolitiFact answered Hyman to propel an appeal for money.

It only compounds our amusement that PolitiFact left out the fact that gives that appeal whatever urgency it might have. Maybe PolitiFact didn't want to credit HBO's John Oliver for it. Who knows? Regardless, Oliver reported (our link goes via The Hill) that Sinclair Broadcasting Group makes Hyman's commentaries, among others, must-run segments on its local news programs (whether on all of its stations or only some, we do not know). So Hyman's videos reach a far larger audience than the modest Alexa rankings we see for his site (a little ahead of PFB, in the 3.1 millions) would suggest.

All in all, we call this a cheap stunt poorly executed.

Nice work, PolitiFact. It makes a good bookend with your PolitiFact Evangelism and Revival Tour.

Monday, October 9, 2017

Was Obamacare a government takeover of health care?

An open letter to PolitiFact

Dear PolitiFact,

In 2010 you named your second "Lie of the Year." It was the GOP talking point that the Affordable Care Act, also known as Obamacare, represented a government takeover of health care.

The claim received a number of "Pants on Fire" ratings that year. According to PolitiFact's statement of principles, the "Pants on Fire" rating denotes a claim that is not merely false but ridiculous.

The ACA now has the force of law, and some have expressed concern that spiking premium rates may trigger health insurance death spirals, prompting insurers to abandon some markets and leave them without any insurer at all.

The right blames the ACA. The left blames the Trump administration for not administering the law in a way aimed at helping it succeed.

And all of this leads up to my questions for you, PolitiFact.

If the ACA did not result in a government takeover of health care, then why do health insurance markets across the nation now depend on federal executive action for stability?

Would that have proved the case if the ACA (or something like it) had never passed?

Do you see why it's hard to take you seriously?

PolitiFact Bias

P.S. Yes, we know your commitment to transparency generally doesn't include responding to criticism.

PolitiFact does racial profiling (Updated)

On Oct. 6, 2017, PolitiFact confirmed Newsweek's report that white men commit the majority of mass shootings (bold emphasis added):
As details about the Las Vegas shooter’s identity emerged, media outlets noted some of the characteristics fit neatly within a familiar profile of prior mass shooting perpetrators.

Newsweek, for instance, ran a story with the headline, "White men have committed more mass shootings than any other group." The article builds on this claim, stating that 54 percent of mass shootings carried out since 1982 were done so by white males.
The "Mostly True" rating awarded to this claim counts as extremely dunderheaded. PolitiFact even explains why in the text of its fact check, yet skips its common practice of harshly rating "meaningless statistics" on the "Truth-O-Meter" even when they are reported accurately.

That, for example, is why Donald Trump received a "Half True" rating for correctly stating that the number of Hispanics in poverty went up under the Obama administration. PolitiFact said the statistic meant little because the Hispanic poverty rate went down during the same span.

In this case also, the rate counts as the key statistic. But PolitiFact's numbers showed that whites were no more than proportionally represented in the statistics (bold emphasis added):
Newsweek's claim is literally accurate. But it's worth noting the imprecision of this data, and the percentage of mass shootings by white men is lower than their share of the male population, according to Mother Jones.
Newsweek, for its part, allows a liberal expert to expound on the racial resentment factors that might explain the white male penchant for shooting up the town. And Newsweek follows that with the admission that maybe the sheer abundance of white people might help explain the data (bold emphasis added):
The high number of white men committing mass shootings is also explained, at least in part, by the fact white people make up a majority of the U.S. population (63 percent) and men are more likely to commit violent crime in general: In the U.S., 98 percent of mass shootings and 90 percent of all murders are committed by men.
Newsflash: There's no need to look for special explanations for the high number of whites committing mass shootings unless they are committing more than their share. And they aren't, according to the numbers PolitiFact used.
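The proportionality point reduces to comparing the two figures already quoted (the 63 percent figure is the white share of the whole population; the more precise baseline, white men's share of the male population, is comparable according to the Mother Jones data PolitiFact cited). A minimal sketch:

```python
# Figures quoted above in the Newsweek/PolitiFact exchange.
share_of_shootings = 0.54   # mass shootings since 1982 committed by white males
share_of_population = 0.63  # white share of the U.S. population

# A group is overrepresented only if its share of incidents
# exceeds its share of the population.
overrepresented = share_of_shootings > share_of_population
print(overrepresented)  # False: 54 percent falls below 63 percent
```

By the very numbers in the fact check, the group in question commits less than its proportional share.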

How did a statement just as flawed as Trump's garner a "Mostly True" rating?

That one's not hard. We know the ratings are substantially (if not entirely) subjective and that PolitiFact staffers are like everybody else: They're biased. And their bias trends left.

But we found this fact check particularly egregious because it helps inflame racial conflict, albeit illogically. And we find it hard to imagine that the folks at PolitiFact did not realize, before publishing, that the fact check would feed that illogical thinking.

Here's a smattering of commentary from the comments at PolitiFact's Facebook page. We won't offer up the names because our purpose is to shame PolitiFact, not the people PolitiFact helped mislead.
"Because they listen to news media outlets that tell them they will be minorities in 20 years and that immigrants are taking away their jobs and women are threatening to take away their masculinity through economic means via education and most of them are stupid enough to believe it."
"White males, directly or indirectly, are responsible for far more than we realize!! We need to own what we've done and are currently doing!!"
"When you factor in the native genocide attacks and racially driven attacks, of course. Leave those out and they still are."
"I'm white so I'm not proud of it and I'd rather not admit it, but yes, probably. We probably lead in serial killers too."
"Add in serial killers and it gets worse."
"It shows 54% of mass shooting are done by whites, that there is a great chance of possibility that the next one can follow the same pattern."
"The point is that people like to pass laws based on statistics. For instance, there is, in effect, a Muslim ban and that was based on the assumption that it would make America safer because, supposedly and erroneously, Arab Muslims are a danger. Facts like the ones posted here negate that logic and show the racist intentions and biases behind actions like the travel ban."
"testosterone.....white priviledge [sic].....dangerous outcome?"
"Yes. And most of them are the ones with all of the guns... The REPUBLICANS."
"The real terrorist threat to the U.S.: extreme right wing white males."
Thanks, PolitiFact, for helping to bring us the truth in politics. Or something.

Update Oct. 9, 2017: PolitiFact reposted its fact check to Facebook. We'll take the opportunity to supplement our selection of comments from people buying into the deception.

"it seems white men are more motivated to commit mass shootings the [sic] people of color so there you have it."
"The focus is the right-wing NRA narrative that implies either minorities or radical Islamic terrorists are the biggest threat to safety rather than angry/crazy white guys with guns."
"well thankfully if the trend continues [white majority shrinking?--ed.] that won't be the case and we can stop sending worthless thoughts and prayers all the time."
"the facts say that even after you adjust for the per capita rates white males still do far more then their fair share of the mass shootings. The "Mostly True" rating is only because there's some debate over what really qualifies as a mass shooting for statistical purposes. RTFA."
"Ban white men."

Maybe PolitiFact will post the article again soon so we can update with even more comments.

Sunday, October 8, 2017

A mainstream media fact-checking scandal continues

Somewhere along the line, mainstream fact-checkers like PolitiFact had an epiphany about cutting funds from future baseline projections.

In the days of yore, when it was trendy for Republicans to decry the Affordable Care Act for cutting Medicare, PolitiFact said a cut from a budget baseline wasn't really a cut.

PolitiFact Virginia, June 2012 (bold emphasis added):
American Crossroads says (Sen. Tim) Kaine promoted a $500 billion cut to Medicare.
The Affordable Care Act contains about $564 billion in cost-savings measures for Medicare over 10 years. But the definition of a cut means there would be a reduction in spending. That’s not the case here. Medicare spending will continue to expand. The law will slow the projected rate of growth.
Now in the age of science and science-y fact-checking, PolitiFact has discovered that cutting funds from a future spending baseline is, in fact, a cut.

PolitiFact, October 2017 (bold emphasis added):
The Senate Budget Committee has a point that Medicare spending will be going up, just not as fast as it would under the status quo. It also has a point that more modest cuts sooner could stave off bigger cuts later. (Experts have often told us that it’s presumptuous to assume significant economic growth impacts before they materialize.)
But we don’t find it unreasonable for Schumer to call cumulative reductions to Medicare and Medicaid spending in the hundreds of billions of dollars "cuts."
That, friends and neighbors, is a major-league flip-flop. Zebra Fact Check documented it more extensively in a post on July 20, 2017. I pointed out the discrepancy on Twitter to the guilty parties, PolitiFact and the Washington Post Fact Checker. If they took note of the criticism, each has apparently decided there is nothing amiss with the inconsistency.
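The underlying dispute is easy to make concrete with a toy example (our invented numbers, not figures from either fact check): spending that rises every single year can still land far below its projected baseline, and whether that shortfall counts as a "cut" is exactly the question the fact checkers answered both ways.

```python
# Hypothetical illustration (invented numbers): a program's spending keeps
# growing, but a law slows the projected growth rate.
start = 500.0            # year-0 spending, in billions of dollars
baseline_growth = 0.07   # projected annual growth under current law
slowed_growth = 0.04     # annual growth after the change
years = 10

baseline = [start * (1 + baseline_growth) ** t for t in range(years + 1)]
actual = [start * (1 + slowed_growth) ** t for t in range(years + 1)]

# Spending still rises every single year under the change ...
assert all(later > earlier for earlier, later in zip(actual, actual[1:]))

# ... yet cumulative spending falls far below the baseline projection,
# which is the sense in which critics call the change a "cut."
shortfall = sum(baseline) - sum(actual)
print(f"cumulative shortfall: about {shortfall:.0f} billion")
```

Call the shortfall a "cut" or deny it is one; the scandal is doing both, depending on who made the claim.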

Don't miss this tree on account of the forest

We would draw attention back to one detail in PolitiFact's rating of Sen. Charles Schumer (D-NY):
(W)e don’t find it unreasonable for Schumer to call cumulative reductions to Medicare and Medicaid spending in the hundreds of billions of dollars "cuts."
Why did PolitiFact put "cuts" in quotation marks? If "cuts" were a word Schumer had used, fine, no problem. But PolitiFact says it has no problem with Schumer using "cuts" to describe decreases in projected spending when Schumer actually used the term "guts," not "cuts."

When Wisconsin's Tommy Thompson said the Affordable Care Act gutted Medicare, PolitiFact Wisconsin had a big problem with his use of "guts" instead of "cuts":
The health care law slows Medicare’s growth but spending would still rise significantly, and some new services are added.

The changes do not promise to hold seniors harmless, but Medicare is not being gutted.

We rate the claim False.
See PolitiFact Wisconsin's fact check to appreciate the degree to which it emphasized a distinction between "guts" and "cuts."

For fact-checking Schumer in 2017, the words are merely synonyms.

The inconsistency on cuts from a baseline occurs routinely among the mainstream fact checkers.
It's a scandal. And we shouldn't be the only ones emphasizing that point.

Correction Oct. 9, 2017: Changed "put 'cuts' in parentheses" to "put 'cuts' in quotes."

Saturday, October 7, 2017

Miami New Times: "How PolitiFact Got Its 'Fake News' Tag Wrong on Occupy Democrats"

This is not a post highlighting PolitiFact's left-leaning bias.

This is a post serving to remind us that PolitiFact often operates in a slipshod manner.

The Miami New Times came out with a story on Oct. 2, 2017 rightly panning PolitiFact for miscategorizing "Occupy Democrats" as fake news.

Sure, Occupy Democrats publishes false stories. But the folks at PolitiFact (and others in the fact-checking clique) bang the drum reminding everybody that fake news is the publication of deliberately false stories. That's made-up stuff, not just mistakenly or stupidly wrong.

Despite that, PolitiFact listed Occupy Democrats on its page of fake news sources.
(W)hen New Times asked why PolitiFact had classified the liberal Facebook-based news empire Occupy Democrats as "Fake News" in its Fake News Almanac, the site admitted Occupy Democrats should never have been on the list in the first place. The misclassification highlights the difficulty of judging fake news, even for the pros, and raises questions about the reliability of the almanac.
The New Times obtained a predictable excuse from PolitiFact's Joshua Gillin:
(Gillin said) the site should not have been included in the almanac because the majority of its posts reviewed by PolitiFact were not designated as fake news, and the two that were deemed fake news date to 2016. For a whole site to be classified as fake news, he said, it must regularly make a "deliberate attempt to mislead."
PolitiFact took Occupy Democrats off the list after getting challenged on its inclusion. But PolitiFact gains one advantage by publishing its fake news almanac as an embed rather than as an article: the embed relieves it of the responsibility to issue a correction notice.

Obviously there's neither a need nor a responsibility to note corrections made to embedded content, right?

Just call it the embedded content loophole.

Monday, October 2, 2017

The PolitiFact Evangelism & Revival Tour I

PolitiFact has embarked on what it is calling an "outreach" tour aimed at breaking down barriers that keep conservatives from trusting the fact checkers at PolitiFact.

To us here at PolitiFact Bias, this outreach tour bears all the earmarks one would expect of a naked publicity stunt.

Why would we say that?

First, consider the target audience. PolitiFact isn't just reaching out to any old run-of-the-mill conservatives, and the outreach certainly doesn't target any of PolitiFact's long-time critics. Rather, PolitiFact has targeted conservatives near the locations of its three newest state franchises in Alabama, Oklahoma and West Virginia.

What a coincidence. When I first heard that detail of the outreach plan, all I could think of was the cult evangelism tool of "love-bombing."

We're keeping tabs on PolitiFact's effort to wash the unwashed, and the latest update from Poynter's Daniel Funke compelled us to publish this first in what will likely prove a series of observations on the project. The Poynter Institute, by the way, owns the Tampa Bay Times, the newspaper that runs PolitiFact.

We found this paragraph striking:
(PolitiFact's Executive Director Aaron) Sharockman told Poynter they once did some consulting with business students at the University of Missouri to learn more about how to build trust and create new audiences. The students recommended that PolitiFact double down on growing liberal readership by visiting communities that skew blue — the exact opposite of its current project in middle America.
A-ha. So PolitiFact apparently recognizes that its audience tilts left. We figured the PolitiFact folks had to know that, but this serves as the best confirmation of that so far.

It's great (seriously!) that PolitiFact chose the hard route of reaching out to the tougher audience. If PolitiFact ends up only interested in outreach to conservatives who accept PolitiFact as an objective referee of political claims, then we would consider that outcome consistent with the project's resemblance to love-bombing.

But who knows? Maybe PolitiFact is more interested in finding out why conservatives withhold their trust and changing its approach to address those concerns.

We will eagerly look for evidence that would falsify our working hypothesis.

Correction Oct. 16, 2017: Changed "Virginia" to "West Virginia" in the fourth paragraph. Our congratulations to Virginia. Condolences to West Virginia.  We apologize to our readers for the mistake.