Monday, February 27, 2017

Daily Caller: "Politifact Says Trump Is Right, But Rates His Remark ‘Mostly False'"

The Daily Caller notes an item from PolitiFact where President Trump tweeted something PolitiFact found true, after which the fact checkers proceeded to rate the claim "Mostly False."

The Daily Caller's Alex Pfeiffer has the skinny:
The tweet from Trump came after Gateway Pundit reported on the change in the national debt under the two respective presidents and after former Godfather Pizza CEO Herman Cain brought up the figures on Fox News.

Politifact wrote: “The numbers check out. And in fact, the total public debt has dropped another $22 billion since the Gateway Pundit article published, according to data from the U.S. Department of Treasury.”

Despite this, Politifact still gave Trump a rating of “mostly false” and titled its article, “Why Donald Trump’s tweet about national debt decrease in his first month is highly misleading.”
We saw this item and considered writing it up. It seemed to us the type of thing that liberal (or even moderate) readers might excuse, judging that PolitiFact did enough to justify the "Mostly False" rating it gave to Trump's tweet.

The case needs additional context, however, to show why it does not represent a fair fact check.

The definition of "Mostly False"

Did PolitiFact show that Trump's tweet met its definition of "Mostly False"? Here is the definition:
MOSTLY FALSE – The statement contains an element of truth but ignores critical facts that would give a different impression.
Trump's tweet did not simply contain "an element of truth." It was true (and misleading). PolitiFact's "Truth-O-Meter" definitions mean little, because PolitiFact does not use objective criteria to decide the rating. If objective criteria decided the rating, then PolitiFact's creator would not declare that "Truth-O-Meter" ratings are "entirely subjective."

Sauce for the gander?


If PolitiFact applied its judgments consistently, then the Daily Caller and sites like ours would have little to complain about. But vague definitions that ultimately fail to guide the final rating make it virtually impossible even for well-meaning left-leaning journalists to keep the scales balanced.

Consider an example from the PolitiFact Oregon franchise. PolitiFact Oregon rated Democrat Brad Avakian "Mostly True" for a false statement:
Avakian, citing Census data and echoing claims by Obama and others, said women in Oregon "earn an average of 79 cents for every dollar that men earn for doing the same job." The report he relied on noted that the 79-cent figure applies to full-time, year-round work, although Avakian didn’t include those stipulations.

For starters, the commissioner loses points for cherry-picking the 79-cent figure. Other means of measuring pay gaps between men and women put it considerably less.

The same can be said of the "for doing the same job" piece. As PolitiFact has found previously, the existence of a pay gap doesn’t necessarily mean that all of the gap is caused by individual employer-level discrimination, as Avakian’s claim implies. Some of the gap is at least partially explained by the predominance of women in lower-paying fields, rather than women necessarily being paid less for the same job than men are.

Finally, Avakian used the term "average" when the report he relied on said "median." He could have avoided that by simply saying women "make 79 cents for every dollar a man earns," but since the information he cited contains only median incomes, we find the difference to be inconsequential.

Those caveats aside, he still is well inside the ballpark and the ratio he cited is a credible figure from a credible agency. We rate the claim Mostly True.
That's an inexcusably tilted playing field. If Avakian had described the raw pay gap without saying it compared men and women doing the same job, then his claim would have paralleled Trump's: a true but misleading statement. But Avakian's statement was not true and misleading. It was false and misleading at the same time.

Yet it received a "Mostly True" rating compared to Trump's "Mostly False" rating.

Doesn't fact-checking need better standards than that?



Jeff Adds (1922PST 2/27/17):
We'd love to see PolitiFact reconcile their Mostly False rating of Trump's claim with the rationale behind this gem:



Was there anything misleading about Clinton's statement?
Clinton’s figures check out, and they also mirror the broader results we came up with two years ago. Partisans are free to interpret these findings as they wish, but on the numbers, Clinton’s right. We rate his claim True.
Ha! Silly factseekers. When Trump makes an accurate claim, PolitiFact conjures its magical powers of objectivity to decide what is misleading. When lovable ol' Bill makes a claim, heck, PolitiFact is just checkin' the numbers and all you partisans can figure out what it means.

Note that PolitiFact gave Bill Clinton a True rating, which they define as "The statement is accurate and there’s nothing significant missing." Must be nice to be in the club.

We've pointed out how PolitiFact's application of standards is akin to the game of Plinko. With ratings like this it's difficult to view PolitiFact's staff as serious journalists rather than carnival barkers.

Tuesday, February 21, 2017

Another nugget from the Hollyfield interview

In an earlier post we pointed out how PolitiFact managing editor Amy Hollyfield described its "Truth-O-Meter" in terms hard to reconcile with those used by PolitiFact's creator, Bill Adair.

The Hollyfield interview published at The Politic (Yale University) contains other amusing nuggets, such as this howler (bold emphasis added):
We take accuracy very seriously. Transparency is one of the key things we focus on, which is why we publish all the sources for our fact checks. We flag every correction and have a subject tag called “correction,” so you can see every fact check we’ve put a correction on.
We find Hollyfield's assertion offensive, especially as it occurs in response to a question about this website, PolitiFact Bias.

PolitiFact does a poor job of consistently adding the subject tags to corrected articles.

We pointed out an example in December 2016. PolitiFact California changed the rating of a fact check from "True" to "Half True," publishing a new version of the months-old fact check. Weeks later, PolitiFact California still had not added the tag that would make the article appear on PolitiFact's "Corrections and Updates" page.

Maybe PolitiFact California does not regard rewriting an article as a correction or update?

How about PolitiFact Pennsylvania from January 2017? Lawyers pointed out that the Pennsylvania PolitiFact franchise incorrectly described the standard of evidence courts use for criminal cases. PolitiFact Pennsylvania ran a correction (the correction made the fact check incoherent, but that's another story), but added no tag to the story.


So, contrary to what Hollyfield claims, the corrected story is not transparently presented on PolitiFact's "Corrections and Updates" page.

PolitiFact's spotty compliance with its statement of principles is not new. We even complained about the problem to Paul Tash, the president of the Tampa Bay Times (Nov. 18, 2016). But we've noticed no improvement.

PolitiFact does not have a page that transparently informs readers of all of its corrections.

Will you believe Amy Hollyfield or your own lyin' eyes?

Monday, February 20, 2017

PolitiFact's "Truth-O-Meter": Floor wax, or dessert topping?

The different messages coming from PolitiFact founder Bill Adair and current PolitiFact managing editor Amy Hollyfield in recent interviews reminded me of a classic Saturday Night Live sketch.

In one interview (Pacific Standard), Adair said deciding PolitiFact's "Truth-O-Meter" ratings was "entirely subjective."

In the other interview (The Politic), Hollyfield gave a different impression:
There are six gradations on our [Truth-O-Meter] scale, and I think someone who’s not familiar with it might think it’s hard to sort out, but for people who’ve been at it for so long, we’ve done over 13,000 fact checks. To have participated in thousands of those, we all have a pretty good understanding of what the lines are between “true” and “mostly true,” or “false” and “pants on fire.”
If PolitiFact's "star chamber" of editors has a good understand of the lines of demarcation between each of the ratings, that suggests objectivity, right?

Reconciling these statements about the "Truth-O-Meter" seems about as easy as reconciling New Shimmer's dual purposes as a floor wax and a dessert topping. Subjective and objective are polar opposites, perhaps even more so than floor wax and dessert topping.

If, as Hollyfield appears to claim, PolitiFact editors have objective criteria to rely on in deciding on "Truth-O-Meter" ratings, then what business does Adair have claiming the ratings are subjective?

Can both Adair and Hollyfield be right? Does New Shimmer's exclusive formula prevent yellowing and taste great on pumpkin pie?

Sorry, we're not buying it. We consider PolitiFact's messaging about its rating system another example of PolitiFact's flimflammery.

We think Adair must be right that the Truth-O-Meter is primarily subjective. The line between "False" and "Pants on Fire" as described by Hollyfield appears to support Adair's position:
“False” is simply inaccurate—it’s not true. The difference between that and “pants on fire” is that “pants on fire” is something that is utterly, ridiculously false. So it’s not just wrong, but almost like it’s egregiously wrong. It’s purposely wrong. Sometimes people just make mistakes, but sometimes they’re just off the deep end. That’s sort of where we are with “pants on fire.”
Got it? It's "almost like" and "sort of where we are" with the rating. Or, as another PolitiFact editor from the "star chamber" (Angie Drobnic Holan) memorably put it: "Sometimes we decide one way and sometimes decide the other."


Afters

Though PolitiFact has over the years routinely denied that it accuses people of lying, Hollyfield appears to have wandered off the reservation with her statement that "Pants on Fire" falsehoods on the "Truth-O-Meter" are "purposely wrong." A purposely wrong falsehood would count as a lie in its strong traditional sense: A falsehood intended to deceive the audience. But if that truly is part of the line of demarcation between "False" and "Pants on Fire," then why has it never appeared that way in PolitiFact's statement of principles?

Perhaps that criterion exists only (subjectively) in Hollyfield's mind?


Update Feb. 20, 2017: Removed an unneeded "the" from the second paragraph

Sunday, February 19, 2017

Power Line: "Trump 4, PolitiFact 1"

John Hinderaker, writing for the Power Line blog, does a quick rundown of five PolitiFact fact checks of President Donald Trump. Hinderaker scores the series 4-1 for Trump.

Read it through for the specifics.

Our favorite part occurs at the end:
We could go through this exercise multiple times every day. Correcting the Democratic Party “fact checkers” would be a full-time job that I don’t plan to undertake. Suffice it to say that Trump is more often right than are the press’s purported fact checkers who pretend to correct him.
We continue to marvel at PolitiFact's supernatural ability to ignore substantive criticism. How often does it answer charges that it has done its job poorly?

If PolitiFact is an honest and transparent attempt at objective fact-checking, then we think PolitiFact should aggressively defend itself against such charges, or else change its articles accordingly.

On the other hand, if PolitiFact is a sham attempt at objective fact-checking, maybe it's smart to ignore criticism, trusting that its readers will conclude the criticisms did not deserve an answer.

Maybe there's an explanation that splits the difference?

Friday, February 17, 2017

PolitiFact: That was then, this is now

Now (2017)

PolitiFact is independent! That means nobody chooses for PolitiFact what stories PolitiFact will cover. PolitiFact made that clear with its recent appeal for financial support through its "Truth Squad" members--persons who contribute financially to PolitiFact (bold emphasis added):
Our independence is incredibly valuable to us, and we don't let anyone — not politicians, not grant-making groups, not anyone — tell us what to fact-check or what our Truth-O-Meter rulings should be. At PolitiFact, those decisions are made solely by journalists. With your help, they always will be.
Got it? Story selection is done solely by PolitiFact journalists. That's independence.

Then (2015)

In early 2015, PolitiFact started its exploration of public funding with a Kickstarter program geared toward funding its live fact checks of the 2015 State of the Union address.

Supporters donating $100 or more got to choose what PolitiFact would fact check. Seriously. That's what PolitiFact offered:

Pledge $100 or more

Pick the fact-check. We’ll send you a list of four fact-checks we’re thinking of working on. You decide which one we do. Plus the coffee mug, the shout out and the mail.
We at PolitiFact Bias saw this scam for what it was back then: either a breach of journalistic ethics in selling editorial discretion, or else a misleading offer that led donors to believe they were choosing the fact check when the PolitiFact editors in fact kept that discretion in-house.

Either way, PolitiFact acted unethically. And if Angie Drobnic Holan is telling the truth that PolitiFact always has its editorial decisions made by journalists, then we can rest assured that PolitiFact brazenly misled people in advertising its 2015 Kickstarter campaign.


Clarification Feb. 18, 2017: Belatedly added the promised bold emphasis in the first quotation of PolitiFact.

Tuesday, February 14, 2017

PolitiFact California "fact": Undocumented immigrants count as Americans

The secret formula for finding PolitiFact mistakes: Just look at what fact PolitiFact is checking, try to imagine how a biased liberal would flub the fact check, then look to see if that mistake occurred.

PolitiFact California makes this technique work like magic. Case in point:

We wondered if PolitiFact California and Gov. Brown count undocumented immigrants as "Californians." We wondered if PolitiFact California would even concern itself over who counts as a "Californian."

The answer? No. And PolitiFact California made its mistake even more fundamental by putting a twist on what Gov. Brown claimed. This was the statement Brown made in his 2017 State of the State address:
This is California, the sixth most powerful economy in the world. One out of every eight Americans lives right here and 27 percent – almost eleven million – were born in a foreign land.
Brown did not say 27 percent of "Californians" are foreign-born. In context, he said 27 percent of the Americans (U.S. citizens) in California are foreign-born. If Brown had referred to "Californians," the dictionary would have given him some cover. A resident of California can qualify as a "Californian."

But Merriam-Webster provides no such cover for the definition of "American":


Only one of the four definitions fits the context of Brown's claim: definition No. 3, which limits "American" to citizens of the United States.

The problem for Brown and PolitiFact California? Both relied on Census Bureau data, and the Census Bureau counts citizens and non-citizens alike in its population survey. About 3 million of California's residents (the Kaiser Family Foundation estimates about 5 million) do not hold American citizenship and therefore do not count as "American" under definition No. 3. Subtract 3 million from the figure PolitiFact California used for the number of Californians, subtract the same 3 million from the number of foreign-born California residents, and the percentage of foreign-born Americans in California (definition No. 3) comes to roughly 22 percent, not 27 percent.

If the true number of undocumented Californians is 5 million then the percentage drops below 18 percent.
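For anyone who wants to follow the arithmetic, here is a minimal sketch in Python. The inputs are our own rounded assumptions, in line with the 2015 American Community Survey (roughly 39.1 million California residents, about 10.7 million of them foreign-born), not PolitiFact's published figures, so the outputs land near, rather than exactly on, the 22 percent and below-18-percent numbers above.

# Rough check of the adjustment described above. The inputs are assumed,
# rounded figures (approximate 2015 American Community Survey values),
# not PolitiFact's published numbers.
total_residents = 39.1e6   # all California residents counted in the survey
foreign_born = 10.7e6      # foreign-born residents, about 27 percent of the total

# Try both non-citizen estimates discussed above (~3 million and ~5 million),
# treating non-citizens as foreign-born, as the argument above does.
for non_citizens in (3.0e6, 5.0e6):
    americans_in_ca = total_residents - non_citizens       # residents who are U.S. citizens
    foreign_born_americans = foreign_born - non_citizens   # foreign-born residents who are citizens
    share = foreign_born_americans / americans_in_ca
    print(f"Excluding {non_citizens / 1e6:.0f} million non-citizens: "
          f"{share:.1%} of Americans in California are foreign-born")

# Approximate output with these assumed inputs:
#   Excluding 3 million non-citizens: 21.3% of Americans in California are foreign-born
#   Excluding 5 million non-citizens: 16.7% of Americans in California are foreign-born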

Gov. Brown's figure is off by at least 5 percentage points, representing a percentage error of almost 23 percent. And PolitiFact California found it completely true:
Gov. Jerry Brown claimed in his State of the State Address that 27 percent of Californians, almost 11 million, "were born in a foreign land."

A 2015 American Community Survey by the U.S. Census Bureau verifies that statistic. Additionally, a researcher at the Public Policy Institute of California, which studies the state’s immigration and demographic patterns, confirmed the census report is the best authority on California’s foreign born population.

We rate Brown's claim True.

TRUE – The statement is accurate and there’s nothing significant missing.
To us, this looks like a classic case of a journalist's liberal bias damping proper skepticism. This type of mistake was predictable. We predicted it. And PolitiFact California delivered it.

Wednesday, February 8, 2017

PolitiFact misleads its "Truth Squad"

PolitiFact is in the midst of a successful campaign to raise financial support from its readers.

We found it interesting and ironic that PolitiFact is using a misleading appeal toward that end:
As readers have cheered us on, plenty of politicians have actively rooted against us. At the 2012 Republican National Convention, journalists challenged Mitt Romney’s campaign team about an ad that falsely claimed Barack Obama was ending work requirements for welfare. Romney pollster Neil Newhouse responded by saying, "We're not going to let our campaign be dictated by fact-checkers."
Problem one: So far as we can tell, the fact checkers never responded to vigorous criticisms of their ruling on President Barack Obama's welfare work requirement tweak. That's despite basing the ruling essentially on the Obama administration's claims about what it was trying to accomplish with its welfare work requirement waiver provision.

Problem two: PolitiFact is taking the statement from Neil Newhouse out of context. And all the mainstream (left-leaning) fact checkers seem to enjoy doing that to enhance the popular view of their work.

I exposed that deception with an article at Zebra Fact Check:
What was Newhouse saying? We think the context makes clear Newhouse was not expressing a disdain for facts but instead expressing his distrust of fact checkers. The ABC News report makes that clear with its paraphrase of Newhouse: “Newhouse suggested the problem was with the fact-checkers, not the facts themselves.”

We’ll see that all three of the major fact checkers ignored the meaning ABC News identified for Newhouse’s statement and replaced it with a meaning that better served their purposes.
The fact checkers, including PolitiFact, misleadingly use Newhouse's statement as evidence campaigns do not care about the truth, and that, in turn, helps justify their own existence. And apparently the fact checkers themselves are perfectly willing to twist the truth to achieve that noble (selfish) end.

PolitiFact's "Truth Squad" is likely to end up as a left-leaning mob interested primarily in supporting journalism that attacks Republicans, conservatives and President Trump in particular.

Edit: A draft version of a Jeff Adds section was published today in error. We have since removed the section. Prior to removal we saved a version of this page that included the section. That version can be found at the Internet Archive.
-Jeff 1619PST 2/10/2017

Update Aug. 22, 2018: Added the URL for the PolitiFact article encouraging readers to join its "Truth Squad." Also removed a redundant URL from the last paragraph.