Tuesday, April 26, 2016

NTSH: 95 percent of Clinton's claims "Mostly True" or better?

We tip our hats to Power Line blog for making it easy to add a "Nothing To See Here" item.

With "Nothing To See Here" we take note of political statements deserving of a fact check — though we tend to doubt one will occur. Power Line blog noted a problem with a Nicholas Kristof column in The New York Times. Kristof, a liberal columnist, wrote a column highlighting Clinton's position head and shoulders above the competition when it comes to PolitiFact report cards. But there was a problem: Kristof got the key numbers wrong.

Power Line's Steven Hayward compared the original version of Kristof's column with the Times' later correction of the article.
At the bottom of the column is this short correction:
Correction: April 23, 2016: An earlier version of this column misstated some of the percentages of true statements as judged by PolitiFact.
So how did the original version of Kristof’s column read? Here:
PolitiFact, the Pulitzer Prize winning fact checking site, calculates that of the Clinton statements it has examined, 95 percent are either true or mostly true.

That’s more than twice as high as the percentages for any of the other candidates, with 46 percent for Bernie Sanders’s, 12 percent for Trump’s, 23 percent for Ted Cruz’s and 33 percent for John Kasich’s. Here we have a rare metric of integrity among candidates, and it suggests that contrary to popular impressions, Clinton is far more honest and trustworthy than her peers.
So we go from 95 percent true to 50 percent true and switch out “far more honest and trustworthy than her peers” for “relatively honest by politician standards,” with the blink of a mere correction.
We've repeatedly noted PolitiFact's weak-to-nonexistent efforts to police the misuse of its "report card" data. If PunditFact and PolitiFact let Kristof slide on this one, what else are they willing to overlook?

Wednesday, April 13, 2016

A reader's take on our "Pants on Fire" research

Elizabeth MacInnis wrote:
Your premise here is that if there are an equal amount of fact checks and Republicans lie more, then there must be fact-checker bias. It couldn't possibly be that Republicans actually do lie more. That's an unobjective analysis. In my opinion, watching both sides closely in each election, Republicans do lie more, but not for the reason that you think. Democrats typically run on a platform of hope and ideas. It's much more subjective. Republicans tend to run on fear and attacks against other candidates (whether you like this or not, it's true). An attack on someone's record is more likely to be proven true or false. Watching all the debates, I consistently hear Republicans say things like "we've had a job killing president," when in reality (as of now) we've had 72 months of private-sector job growth, a record. With consistent, clearly false statements like this, Republicans are doing it to themselves. I believe we need a balanced system with both parties, but in many Americans' opinions (including mine), Republicans have become more and more outrageous in recent years (and no, that is not to say that Democrats are clear of any wrongdoing). The only way, in my opinion, to tame this is to hold them accountable - in fact-checking and votes. I hope both parties learn this lesson before their groups become too fractured.
Point by point:

"Your premise here is that if there are an equal amount of fact checks and Republicans lie more, then there must be fact-checker bias."

No, that's not our premise. Our premise (the one you seem to be talking about) is that if PolitiFact chose its stories based only on its editorial sense of whether the claim is true, then the results would be proportional: Republicans could have five times more "False" ratings than Democrats, but the distribution curve for both parties should appear similar. We don't think PolitiFact uses only its editorial sense in choosing stories.

The true central premise of the "Pants on Fire" research is that PolitiFact offers no objective means of distinguishing between its ratings of "False" and "Pants on Fire."

"It couldn't possibly be that Republicans actually do lie more. That's an unobjective analysis."

Can anyone explain to me how the "Pants on Fire" rating, with no apparent objective measure undergirding its use, contributes any empirical data toward the notion that Republicans lie more? Isn't that notion a sham?

"In my opinion, watching both sides closely in each election, Republicans do lie more, but not for the reason that you think. Democrats typically run on a platform of hope and ideas. It's much more subjective. Republicans tend to run on fear and attacks against other candidates (whether you like this or not, it's true)."

Summing up, then, MacInnis' opinion is true whether I like it or not? Is there any solid evidence supporting that opinion? Any at all?



(skipping some opinion that doesn't interest me so much)

"Watching all the debates, I consistently hear Republicans say things like "we've had a job killing president," when in reality (as of now) we've had 72 months of private-sector job growth, a record."

Remember when the Obama administration was lauding the helpful effects of the $900 billion stimulus bill? Employment was going down, but PolitiFact accepted arguments that unemployment would be even worse without the stimulus bill. Why is it that PolitiFact gives no consideration at all to a parallel principle with respect to claims of job-killing? That's not consistent, is it? We shouldn't make the mistake of thinking that inconsistent methods lead to good fact-checking, should we?

Elizabeth MacInnis, PolitiFact does fact-checking poorly. Don't be fooled.

PolitiFact's subjective "Pants on Fire" ratings tell us about PolitiFact, not about the entities receiving the subjective ratings.

Friday, April 8, 2016

PolitiFact: Lightning strikes still make a better comparison than alligator attacks

If you're tempted to illustrate the rarity of something by comparing it to something from real life, PolitiFact has a message for you: Take lightning strikes over alligator attacks.

On the issue of voter fraud, PolitiFact has given a number of "True" ratings to people who say lightning strikes outnumber cases of in-person voter impersonation (that's cases considered for prosecution, mind you, not "cases" in the sense of "instances").

PolitiFact's comparison is rigged by its narrow count of "cases," of course, but that's another story.

PolitiFact Wisconsin recycled the fact check with an April 7, 2016 item. The item offers no hint of criticism of the comparison of voter fraud to lightning strikes.

PolitiFact Wisconsin's "True" rating perpetuates the inconsistency we noted in a PolitiFact Florida fact check from 2015. There, the claim was that a Floridian is more likely to suffer an alligator attack than a criminal attack by someone with a concealed-carry gun permit. PolitiFact found the evidence broadly supported the claim but ruled it "Mostly False," reasoning that comparing alligator attacks to attacks with a firearm doesn't make sense:
(T)hese statistics, imperfect as they are, do support the notion that both kinds of attacks are uncommon. Whether this is a valid argument in favor of the bill is in the eye of the beholder. We find the statement has an element of truth but ignores other information that would give a different impression. So we rate it Mostly False.
There's an item at Zebra Fact Check criticizing PolitiFact Florida's ruling in detail.

It's worth noting that PolitiFact Wisconsin's evidence on voter fraud shared essentially the same weakness (the lack of a dependable count creates doubt):
It’s fair to say, however, that impersonation cases can be hard to count in that they are hard to prove -- particularly when no photo ID requirement is in place and a voter can cast a ballot simply by stating the name of a registered voter.

So the number of cases of in-person fraud by impersonation may be higher than that cited by Levitt, but no independent source suggests it is higher than the number of lightning strikes.

(...)

We rate Pocan’s statement True.
In both cases, then, PolitiFact doesn't really have the facts to fit the claim. But the liberal gets a "True" and the conservative gets a "Mostly False."

In other words, PolitiFact is objective and nonpartisan. Or something.

And if something happens rarely, compare it to lightning strikes instead of alligator attacks. Your PolitiFact report card may suffer otherwise.