Tuesday, February 21, 2017

Another nugget from the Hollyfield interview

In an earlier post we pointed out how PolitiFact managing editor Amy Hollyfield described the "Truth-O-Meter" in terms hard to reconcile with those used by PolitiFact's creator, Bill Adair.

The Hollyfield interview published at The Politic (Yale University) contains other amusing nuggets, such as this howler (bold emphasis added):
We take accuracy very seriously. Transparency is one of the key things we focus on, which is why we publish all the sources for our fact checks. We flag every correction and have a subject tag called “correction,” so you can see every fact check we’ve put a correction on.
We find Hollyfield's assertion offensive, especially as it occurs in response to a question about this website, PolitiFact Bias.

PolitiFact does a poor job of consistently adding the subject tags to corrected articles.

We pointed out an example in December 2016. PolitiFact California changed the rating of a fact check from "True" to "Half True," publishing a new version of a fact check it had originally run months earlier. Weeks later, PolitiFact California still had not added the tag that would make the article appear on PolitiFact's "Corrections and Updates" page.

Maybe PolitiFact California does not regard rewriting an article as a correction or update?

How about PolitiFact Pennsylvania in January 2017? Lawyers pointed out that the franchise incorrectly described the standard of proof courts use in criminal cases. PolitiFact Pennsylvania ran a correction (one that made the fact check incoherent, but that's another story) but added no tag to the story.


So, contrary to what Hollyfield claims, the corrected story is not transparently presented on PolitiFact's "Corrections and Updates" page.

PolitiFact's spotty compliance with its statement of principles is not new. We even complained about the problem to Paul Tash, the president of the Tampa Bay Times (Nov. 18, 2016). But we've noticed no improvement.

PolitiFact does not have a page that transparently informs readers of all of its corrections.

Will you believe Amy Hollyfield or your own lyin' eyes?

Monday, February 20, 2017

PolitiFact's "Truth-O-Meter": Floor wax, or dessert topping?

The different messages coming from PolitiFact founder Bill Adair and current PolitiFact managing editor Amy Hollyfield in recent interviews reminded me of a classic Saturday Night Live sketch.

In one interview (Pacific Standard), Adair said deciding PolitiFact's "Truth-O-Meter" ratings was "entirely subjective."

In the other interview (The Politic), Hollyfield gave a different impression:
There are six gradations on our [Truth-O-Meter] scale, and I think someone who’s not familiar with it might think it’s hard to sort out, but for people who’ve been at it for so long, we’ve done over 13,000 fact checks. To have participated in thousands of those, we all have a pretty good understanding of what the lines are between “true” and “mostly true,” or “false” and “pants on fire.”
If PolitiFact's "star chamber" of editors has a good understanding of the lines of demarcation between the ratings, that suggests objectivity, right?

Reconciling these statements about the "Truth-O-Meter" seems about as easy as reconciling New Shimmer's dual purposes as a floor wax and a dessert topping. Subjective and objective are polar opposites, perhaps even more so than floor wax and dessert topping.

If, as Hollyfield appears to claim, PolitiFact editors have objective criteria to rely on in deciding on "Truth-O-Meter" ratings, then what business does Adair have claiming the ratings are subjective?

Can both Adair and Hollyfield be right? Does New Shimmer's exclusive formula prevent yellowing and taste great on pumpkin pie?

Sorry, we're not buying it. We consider PolitiFact's messaging about its rating system another example of PolitiFact's flimflammery.

We think Adair must be right that the Truth-O-Meter is primarily subjective. The line between "False" and "Pants on Fire" as described by Hollyfield appears to support Adair's position:
“False” is simply inaccurate—it’s not true. The difference between that and “pants on fire” is that “pants on fire” is something that is utterly, ridiculously false. So it’s not just wrong, but almost like it’s egregiously wrong. It’s purposely wrong. Sometimes people just make mistakes, but sometimes they’re just off the deep end. That’s sort of where we are with “pants on fire.”
Got it? It's "almost like" and "sort of where we are" with the rating. Or, as another PolitiFact editor from the "star chamber" (Angie Drobnic Holan) memorably put it: "Sometimes we decide one way and sometimes decide the other."


Afters

Though PolitiFact has over the years routinely denied that it accuses people of lying, Hollyfield appears to have wandered off the reservation with her statement that "Pants on Fire" falsehoods on the "Truth-O-Meter" are "purposely wrong." A purposely wrong falsehood would count as a lie in its strong traditional sense: a falsehood intended to deceive the audience. But if that truly is part of the line of demarcation between "False" and "Pants on Fire," why has it never appeared that way in PolitiFact's statement of principles?

Perhaps that criterion exists only (subjectively) in Hollyfield's mind?


Update Feb. 20, 2017: Removed an unneeded "the" from the second paragraph

Sunday, February 19, 2017

Power Line: "Trump 4, PolitiFact 1"

John Hinderaker, writing for the Power Line blog, does a quick rundown of five PolitiFact fact checks of President Donald Trump. Hinderaker scores the series 4-1 for Trump.

Read it through for the specifics.

Our favorite part occurs at the end:
We could go through this exercise multiple times every day. Correcting the Democratic Party “fact checkers” would be a full-time job that I don’t plan to undertake. Suffice it to say that Trump is more often right than are the press’s purported fact checkers who pretend to correct him.
We continue to marvel at PolitiFact's supernatural ability to ignore substantive criticism. How often does it answer charges that it has done its job poorly?

If PolitiFact is an honest and transparent attempt at objective fact-checking, then we think PolitiFact should aggressively defend itself against such charges, or else change its articles accordingly.

On the other hand, if PolitiFact is a sham attempt at objective fact-checking, maybe it's smart to ignore criticism, trusting that its readers will conclude the criticisms did not deserve an answer.

Maybe there's an explanation that splits the difference?

Friday, February 17, 2017

PolitiFact: That was then, this is now

Now (2017)

PolitiFact is independent! That means nobody chooses for PolitiFact what stories it will cover. PolitiFact made that clear with its recent appeal for financial support through its "Truth Squad" members--persons who contribute financially to PolitiFact (bold emphasis added):
Our independence is incredibly valuable to us, and we don't let anyone — not politicians, not grant-making groups, not anyone — tell us what to fact-check or what our Truth-O-Meter rulings should be. At PolitiFact, those decisions are made solely by journalists. With your help, they always will be.
Got it? Story selection is done solely by PolitiFact journalists. That's independence.

Then (2015)

In early 2015, PolitiFact started its exploration of public funding with a Kickstarter program geared toward funding its live fact checks of the 2015 State of the Union address.

Supporters donating $100 or more got to choose what PolitiFact would fact check. Seriously. That's what PolitiFact offered:

Pledge $100 or more

Pick the fact-check. We’ll send you a list of four fact-checks we’re thinking of working on. You decide which one we do. Plus the coffee mug, the shout out and the mail.
We at PolitiFact Bias saw this scam for what it was back then: either a breach of journalistic ethics, with PolitiFact selling its editorial discretion, or else a misleading offer that let donors believe they were choosing the fact check while the PolitiFact editors kept editorial discretion in-house.

Either way, PolitiFact acted unethically. And if Angie Drobnic Holan is telling the truth that PolitiFact always has its editorial decisions made by journalists, then we can rest assured that PolitiFact brazenly misled people in advertising its 2015 Kickstarter campaign.


Clarification Feb. 18, 2017: Belatedly added the promised bold emphasis in the first quotation of PolitiFact.

Tuesday, February 14, 2017

PolitiFact California "fact": Undocumented immigrants count as Americans

The secret formula for finding PolitiFact mistakes: Just look at what fact PolitiFact is checking, try to imagine how a biased liberal would flub the fact check, then look to see if that mistake occurred.

PolitiFact California makes this technique work like magic. Case in point:

We wondered if PolitiFact California and Gov. Brown count undocumented immigrants as "Californians." We wondered if PolitiFact California would even concern itself over who counts as a "Californian."

The answer? No. And PolitiFact California made its mistake even more fundamental by putting a twist on what Gov. Brown claimed. This was Brown's statement from his 2017 State of the State address:
This is California, the sixth most powerful economy in the world. One out of every eight Americans lives right here and 27 percent – almost eleven million – were born in a foreign land.
Brown did not say 27 percent of "Californians" are foreign-born. In context, he said 27 percent of the Americans (U.S. citizens) in California are foreign-born. If Brown had referred to "Californians," the dictionary would have given him some cover: a resident of California can qualify as a "Californian."

But Merriam-Webster provides no such cover for the definition of "American":

[Image: Merriam-Webster's entry for "American," listing four definitions]

Only one of the four definitions fits the context of Brown's claim. That is definition No. 3.

The problem for Brown and PolitiFact California? Both relied on Census Bureau data, and the Census Bureau counts citizens and non-citizens alike in its population survey. About 3 million of California's residents (the Kaiser Family Foundation estimates about 5 million) do not hold American citizenship and do not count as "American" by definition No. 3. Subtract 3 million from the number PolitiFact California used as the number of Californians, and subtract 3 million from the number of foreign-born California residents, and the percentage of foreign-born Americans in California (definition No. 3) comes out to 22 percent, not 27 percent.

If the true number of undocumented Californians is 5 million, then the percentage drops below 18 percent.
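The arithmetic is easy to check. Here is a minimal Python sketch, using our own approximations of the 2015 American Community Survey figures (roughly 39 million Californians, roughly 10.7 million of them foreign-born); the exact ACS numbers may differ slightly, so treat the output as a ballpark confirmation of the percentages above:

```python
# Rough check of the percentages discussed above.
# ASSUMED inputs: our approximations of 2015 American Community Survey
# figures for California; the exact ACS values may differ slightly.
CA_POPULATION = 39.0e6   # total California residents (approx.)
FOREIGN_BORN = 10.7e6    # foreign-born California residents (approx.)

def foreign_born_american_share(non_citizens):
    """Share of foreign-born among Americans (definition No. 3) in
    California, assuming all non-citizens are foreign-born."""
    americans = CA_POPULATION - non_citizens
    foreign_born_americans = FOREIGN_BORN - non_citizens
    return 100.0 * foreign_born_americans / americans

print(f"All residents:   {100.0 * FOREIGN_BORN / CA_POPULATION:.1f}%")  # ~27
print(f"3M non-citizens: {foreign_born_american_share(3.0e6):.1f}%")    # ~21-22
print(f"5M non-citizens: {foreign_born_american_share(5.0e6):.1f}%")    # under 18

# Relative error if the true share is about 22 percent but the claim says 27:
print(f"Percentage error: {100.0 * (27 - 22) / 22:.0f}%")               # ~23
```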

Gov. Brown's figure is off by at least 5 percentage points, representing a percentage error of almost 23 percent. And PolitiFact California found it completely true:
Gov. Jerry Brown claimed in his State of the State Address that 27 percent of Californians, almost 11 million, "were born in a foreign land."

A 2015 American Community Survey by the U.S. Census Bureau verifies that statistic. Additionally, a researcher at the Public Policy Institute of California, which studies the state’s immigration and demographic patterns, confirmed the census report is the best authority on California’s foreign born population.

We rate Brown's claim True.

TRUE – The statement is accurate and there’s nothing significant missing.
To us, this looks like a classic case of a journalist's liberal bias damping proper skepticism. This type of mistake was predictable. We predicted it. And PolitiFact California delivered it.

Wednesday, February 8, 2017

PolitiFact misleads its "Truth Squad"

PolitiFact is in the midst of a successful campaign to raise financial support from its readers.

We found it interesting and ironic that PolitiFact is using a misleading appeal toward that end:
As readers have cheered us on, plenty of politicians have actively rooted against us. At the 2012 Republican National Convention, journalists challenged Mitt Romney’s campaign team about an ad that falsely claimed Barack Obama was ending work requirements for welfare. Romney pollster Neil Newhouse responded by saying, "We're not going to let our campaign be dictated by fact-checkers."
Problem one: So far as we can tell, the fact checkers never responded to vigorous criticisms of their ruling on President Barack Obama's welfare work requirement tweak. That's despite basing the ruling essentially on the Obama administration's claims about what it was trying to accomplish with its welfare requirement waiver provision.

Problem two: PolitiFact is taking the statement from Neil Newhouse out of context. And all the mainstream (left-leaning) fact checkers seem to enjoy doing that to enhance the popular view of their work.

I exposed that deception with an article at Zebra Fact Check:
What was Newhouse saying? We think the context makes clear Newhouse was not expressing a disdain for facts but instead expressing his distrust of fact checkers. The ABC News report makes that clear with its paraphrase of Newhouse: “Newhouse suggested the problem was with the fact-checkers, not the facts themselves.”

We’ll see that all three of the major fact checkers ignored the meaning ABC News identified for Newhouse’s statement and replaced it with a meaning that better served their purposes.
The fact checkers, including PolitiFact, misleadingly use Newhouse's statement as evidence campaigns do not care about the truth, and that, in turn, helps justify their own existence. And apparently the fact checkers themselves are perfectly willing to twist the truth to achieve that noble (selfish) end.

PolitiFact's "Truth Squad" is likely to end up as a left-leaning mob interested primarily in supporting journalism that attacks Republicans, conservatives and President Trump in particular.

Edit: A draft version of a Jeff Adds section was published today in error. We have since removed the section. Prior to removal we saved a version of this page that included the section. That can be found at the Internet Archive.
-Jeff 1619PST 2/10/2017

Sunday, January 29, 2017

PolitiFact continues its campaign of misinformation on waterboarding

Amazing. Simply amazing.

PolitiFact Bias co-editor Jeff D. caught PolitiFact continuing its tendency to misinform its readers about waterboarding in a Jan. 29, 2017 tweet.
PolitiFact's claim was untrue, as I demonstrated in a May 30, 2016 article at Zebra Fact Check, "Torture narrative trumps facts at PolitiFact."

Though PolitiFact claims scientific research shows waterboarding doesn't work, the only "scientific evidence" in the linked article concerns the related conditions of hypoxia (low oxygen) and hypercapnia (excess carbon dioxide). PolitiFact reasoned that because science shows hypoxia and hypercapnia inhibit memory, waterboarding would not work as a means of gaining meaningful intelligence.

The obvious problem with that line of evidence?

Waterboarding as practiced by the CIA takes mere seconds. Journalist Christopher Hitchens had himself waterboarded and broke after about 18 seconds, saying he would tell whatever he knew. Memos released by the Obama administration revealed that a continuous waterboarding application could last a maximum of 40 seconds. The memos set out the following limits:

Prisoners could be subjected to waterboarding during one 30-day period
Maximum five treatment days per 30 days
Maximum two waterboarding sessions per treatment day
Maximum 2 hours per session (the length of time the prisoner is strapped down)
Maximum 40 seconds of continuous water application
Maximum six water applications over 10 seconds long per session
Maximum 240 seconds (four minutes) of waterboarding per session from applications over 10 seconds long
Maximum total of 12 minutes of treatment with water over any 24-hour period
Applications under 10 seconds long could make up a maximum of 8 minutes on top of the four minutes mentioned above
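
As a sanity check on the internal arithmetic of that list, here is a minimal Python sketch. The variable names and structure are ours, not from the memos; the numbers come straight from the limits quoted above:

```python
# Sanity check of the arithmetic in the memo limits listed above.
# The variable names and structure are ours; the numbers come from the list.

MAX_CONTINUOUS_SEC = 40        # longest single water application (seconds)
MAX_LONG_APPS_PER_SESSION = 6  # applications over 10 seconds, per session
MAX_DAILY_TOTAL_SEC = 12 * 60  # total water application time per 24 hours

# Six long applications at the 40-second maximum:
long_app_budget = MAX_LONG_APPS_PER_SESSION * MAX_CONTINUOUS_SEC
assert long_app_budget == 240  # 240 seconds = four minutes, as the list says

# Per the list's own arithmetic, short (<10 s) applications may fill the
# remainder of the 12-minute daily cap beyond those four minutes:
short_app_budget = MAX_DAILY_TOTAL_SEC - long_app_budget
assert short_app_budget == 8 * 60  # the "8 minutes on top of the four"

print(f"Long applications per session: {long_app_budget} s (4 min)")
print(f"Room left for short applications: {short_app_budget // 60} min")
```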

Reports indicate the CIA exceeded these guidelines in the case of al Qaeda mastermind Khalid Sheikh Mohammed, but even so, these limits are not conducive to creating significant conditions of hypoxia or hypercapnia.

The typical person can hold their breath for 40 seconds without much difficulty or distress. The CIA's waterboarding was designed to bring about the sensation of drowning, not the literal effects of drowning (hypoxia, hypercapnia, aspiration and swallowing of water). That is why the technique often broke prisoners in about 10 seconds.

And the other problem?

The CIA did not interrogate prisoners while waterboarding them. Nor did the CIA use the technique to obtain confessions under duress. Waterboarding was used to make prisoners more amenable to conventional forms of interrogation.

None of this information is difficult to find.

Why do the fact checkers at PolitiFact (not to mention elsewhere) have such a tough time figuring this stuff out?

There likely isn't any significant scientific evidence either for or against the effectiveness of waterboarding. PolitiFact pretending there is does not make it so.