Thursday, April 23, 2015

PolitiFact Georgia, PolitiMath and the gender pay gap

On April 22, 2015, PolitiFact Georgia found it "Mostly False" that women make only 78 percent of what men make for doing the same work.

PolitiFact Georgia reported that the Buzzfeed video based its claim on statistics that, among other faults, did not ensure the men and women were doing the same work.

At Zebra Fact Check I've published an in-depth treatment of the way mainstream fact checkers mishandle the gender pay gap. But here we'll look narrowly at how PolitiFact Georgia applies a "Mostly False" rating to a gross exaggeration. Our "PolitiMath" stories explore the relationship between percentage error and PolitiFact's ratings, so PolitiFact Georgia's story makes a good subject.

PolitiFact's highest estimate of the wage gap after controlling for the type of job and some other factors was about 7 percent:
(T)he American Association of University Women that controlled for college major, occupation, age, geographical region, hours worked and more, and found there was still a 7 percent wage gap between male and female college grads a year after graduation.
Using that high-end estimate, the Buzzfeed video exaggerated by no less than 214 percent. There's precedent for liberals receiving ratings of "Mostly False" or better for exaggerations that large and larger. On the other hand, PolitiFact Wisconsin gave a state Democrat a "False" rating for an exaggeration of 114 percent.
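The arithmetic behind that 214 percent figure is straightforward. Here is a quick sketch (the 78-cent claim and the 7 percent controlled estimate both come from the fact check; the calculation method is our standard PolitiMath approach of comparing the claimed gap to the best supported gap):

```python
# Buzzfeed's claim: women earn 78 cents per dollar a man earns,
# implying a 22-percentage-point pay gap.
claimed_gap = 100 - 78      # 22 points

# PolitiFact's highest estimate after controlling for job type etc.
controlled_gap = 7          # 7 points

# Percentage by which the claimed gap overstates the supported gap.
exaggeration = (claimed_gap - controlled_gap) / controlled_gap * 100
print(round(exaggeration))  # 214
```

Using the lower controlled estimates PolitiFact cited would make the exaggeration even larger, which is why we call 214 percent the floor.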

At least we know Buzzfeed's exaggeration is not the largest to receive a rating of "Mostly False" or higher.

If anyone can find a statement from a Republican or conservative where a figure exaggerated by more than 100 percent received a rating of "Mostly False" or higher from PolitiFact, we'd love to hear about it. We haven't turned up anything like that yet.

Tuesday, April 21, 2015

No 2015 Pulitzer for PolitiFact

Though PolitiFact uses its 2009 Pulitzer Prize to burnish its credibility, we've argued that PolitiFact won in 2009 thanks to technical innovation. With that innovation now old news, we anticipated that PolitiFact would be unlikely to keep winning Pulitzer Prizes.

With the 2015 winners announced, PolitiFact was shut out for the sixth straight year. Nor did PolitiFact make the list of finalists in any category.

No matter! PolitiFact will run "Winner of the Pulitzer Prize" forever at the top of its website.

We'll celebrate PolitiFact's failure with a rerun of the "Hitler finds out" video we put together for last year's Pulitzer Prize announcements.

We predict a similar outcome in 2016.

Saturday, April 18, 2015

Breitbart News and Eric Wemple on fact checkers

The April 17 installment of Breitbart News' interview with the Washington Post's Eric Wemple features a section on fact checking. John Nolte, the Breitbart News interviewer, let Wemple turn the tables on who was conducting the interview, but on the bright side Nolte makes points Wemple probably wouldn't touch.

No. 1 highlight:
EW: Here are my thoughts on that specifically, and I told Glenn the same thing: You are saying that the Washington Post disproportionately targeted Republicans, and that’s fine. My only point is that I don’t think anyone can expect politicians from any party to lie at an even rate.

BNN: That wasn’t my approach with the Washington Post, though. My argument wasn’t that Kessler was calling more Republicans liars, my issue was that, by 2-to-1, Republican statements were chosen for the fact check treatment.
Credit Nolte with correcting Wemple's straw man. But the problem with the Fact Checker and PolitiFact comes at the point where story selection and truth ratings intersect. Neither one tells the whole story by itself.

Nolte follows up by emphasizing the subjectivity of the ratings. Wemple offers a counter of sorts: That criticism also comes from the left.

From this point in the interview, Wemple stops giving his take and the rest of the fact checking section has Wemple prompting Nolte for his take. There's no admission from Wemple that the ratings are subjective. Wemple's abandonment of the issue leaves an implicit "the fact checkers are criticized from both sides so they must be doing something right" argument.
BNN: When PolitiFact fact checks a quip from Ted Cruz about Iran celebrating “Hate America Day” — everyone knows what he means, but they still call him a liar. It just goes too far. A subjective decision is made to get literal so Cruz can be called a liar. Kessler did this one once where Romney said Obama had never gone to Israel — and that was a fact. But Romney got Pinocchios because Kessler made a subjective decision to make certain context relevant. That’s an opinion column, not a fact check.

EW: PolitiFact is big on context too. Like the time Rachel Maddow went crazy on them. In a State of the Union speech, Obama took credit for creating so many jobs and PolitiFact said that wasn’t entirely true because those jobs were not created as a result of his policies.

BNN: Exactly!

EW: You think she’s right about that.

BNN: I think she’s dead right about that. That’s a subjective decision to bring in subjective context. Put it on the opinion pages.

EW: And you think that cudgel is used more often against Republicans.

BNN: Much more often.
Do you agree that the ratings are subjective, Eric Wemple? If so, then what does that say about claims that harsher ratings of Republicans show that Republicans simply lie more?

The meat of this issue comes from the fact checkers' framing of political truth-telling: Republicans lie more. But the fact checkers' methods remove the foundation for the frame.

Wednesday, April 15, 2015

Joshua Gillin's bad fact check: 'Rubio's bad analogy'

PolitiFact Florida and writer/researcher Joshua Gillin yesterday offered another example of PolitiFact's unworthiness.

PolitiFact Florida published Gillin's story about a portion of Marco Rubio's speech announcing the latter's run for president. In the story, Gillin judges Rubio used a bad analogy in his speech:
Sen. Marco Rubio confirmed his 2016 presidential campaign Monday, but an apparent musical analogy in his announcement speech was a bit off key. During a speech in which he implied his opponents were too old, Rubio accused the competition of wanting to recycle ideas "stuck in the 20th century."

"They’re busy looking backwards, so they do not see how jobs and prosperity today depend on our ability to compete in a global economy," Rubio said. "And so our leaders put us at a disadvantage by taxing and borrowing and regulating like it’s 1999."
Gillin surmises Rubio was referencing Prince's "1999," a top-40 hit back in 1983. Gillin likewise surmises Rubio was making a joke and so does not rate his claim on the "Truth-O-Meter." We think Gillin judged correctly on both points. Unfortunately for Gillin, that largely exhausts the good news.

Having determined Rubio was joking, Gillin proceeds to fact check Rubio's joke as though Rubio was trying to use 1999 as a serious example of liberal tax-and-spend policy.

How can we sufficiently emphasize the wrongheadedness of that approach? Why isn't it obvious that a joke is not intended as a literal comparison? If it's obvious that the joke isn't intended as a literal comparison then why proceed with a literal comparison at all?

It doesn't make a lick of sense.

Gillin even claims it's "ironic" that Rubio references a 1983 tune while touting his youth as a campaign positive. We can only count this as evidence that Gillin doesn't understand irony. Seriously, using a reference to an enduringly popular song is supposed to undercut Rubio's youthful appeal? Seriously? Did the relatively youthful Gillin have to Google "1999" to understand Rubio's reference?

As soon as PolitiFact understood it was a joke, the justifiable rationale for fact-checking Rubio disappeared. What PolitiFact Florida published was not a fact-check but an op-ed extolling the fiscal responsibility of President Bill Clinton. Gillin deftly avoids sharing credit for Clinton's accomplishments with the Republican-controlled Congress.

Near the end of his story Gillin confirms what we're saying:
So while Rubio likely isn’t making a literal comparison to 1999, his talking points would be off if he did: Taxes were higher back then, but the budget was balanced, while the opposite is true today.
So Rubio receives more-or-less the full effect of an unfavorable article premised on an admittedly unlikely view of his words.

It makes for a pretty good example of PolitiFact's supposed nonpartisanship.

Update 4/16/2015:

Added a link to the PolitiFact article. And, as the article was hosted at PolitiFact national and not PolitiFact Florida, struck the word "Florida" where it followed "PolitiFact" in our post.

Sunday, April 12, 2015

Are PolitiFact's "report cards" misleading?

One of our recurrent themes at PolitiFact Bias concerns the misleading nature of PolitiFact's "report cards." PolitiFact admits it does not perform its fact checks on a scientific basis, particularly in that story choices do not occur randomly. Despite that, it's utterly common for PolitiFact to publish a "report card" story encouraging readers to consider a candidate's "report card."

The latest such asks readers to consider the record of just-announced presidential candidate and former Secretary of State Hillary Clinton.

The reader comments from PolitiFact's Facebook page offer us a window into the degree of deception PolitiFact achieves with its "report card" stories.

We'll omit the names to save these individuals unnecessary embarrassment.

"When you compare the overall honesty of Democrats vs. Republicans, it's no wonder that some Republicans believe that fact checking web sites are liberally biased. It's an easier explanation then the reality that Republicans tend to lie more."

"Better than Ted Cruz."

"Comparison: Clinton, true or mostly true = 48%; Rand Paul, true or mostly true = 15%; Ted Cruz, true or mostly true = 8% (from PolitiFact archives)."

"I would love to see a chart comparison against the other candidates. Paul's record would be a joke, his pants have been on fire so much the fire department had to move into a spare room"

"Still not a Hillary fan, but at least she's more honest that the right wing."

"Not a bad record."

Many more after the break! 

Thursday, April 9, 2015

It's tweezers for Senator Paul

Tweezers or tongs?

Will PolitiFact take just part of a statement into account (tweezers), or will it focus on the whole of the statement (tongs)?

PolitiFact fact checked a statement from Sen. Rand Paul (R-Ky.). Here's Paul announcing his bid for the presidency:
It seems to me that both parties and the entire political system are to blame.

Big government and debt doubled under a Republican administration.

And it’s now tripling under Barack Obama’s watch. President Obama is on course to add more debt than all of the previous presidents combined.
Paul's first sentence in the above quotation looks like opinion. That leaves three great potential fact checks. First, that the debt doubled under "a Republican administration" (George W. Bush). Second, the debt is now tripling under President Barack Obama. Third, Obama is on course to add more debt than all of the previous presidents combined.

PolitiFact feints as though it will cover the two claims about multiplying the debt. But the "Half True" rating suggests that PolitiFact only rated the claim about Obama tripling the debt. PolitiFact concludes:
Paul said, "Debt doubled" under Bush "and now it’s tripling under Barack Obama’s watch."

This statement is confusing. A person could easily interpret it to mean that debt has tripled since Obama took office -- which would be incorrect. Paul, on the other hand, said that it means debt today, under Obama, is triple what it was when Bush’s term started.

From one not-so-obvious angle, Paul's numbers are correct. But because the statement could so easily be interpreted in another, less accurate way, we rate it Half True.
PolitiFact found Paul was accurate about the doubling of the debt under a Republican administration. So if his statement about the debt tripling under Obama was completely false, combining the true and false statements averages out to "Half True." But PolitiFact doesn't say Paul was wrong about the tripling of the debt, only that it was wrong if taken in the supposedly obvious way, that the debt tripled starting from the time Obama took office. So why isn't Paul's claim "Mostly True"?

Spoiler: PolitiFact rigs the game.

The "Half True" rating doesn't fit. The context of Paul's statement makes clear he's criticizing Democrats and Republicans. But the clincher is Paul's claim that Obama is on course to add more debt than all previous presidents combined.

Looked at in the simplest way, the way people are likely to understand it, the debt from year to year represents the debt of all previous presidents combined. Most added debt, but a few, like Calvin Coolidge, produced a surplus.

If Obama had nearly tripled the debt since he took office then he's not "on course" to add more debt than all previous presidents combined. He'd have done it once already with a good shot at doing it a second time.

Taken properly in context, the only sensible meaning of Paul's statement is the one he gave: He was talking about Obama tripling the debt in the sense of taking the next step past Bush's doubling of the debt.
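Rough Treasury figures illustrate why Paul's reading is the only one that coheres. This is a sketch using approximate gross federal debt totals (our ballpark numbers, not PolitiFact's):

```python
# Approximate gross federal debt, in trillions of dollars.
bush_start = 5.7    # January 2001, start of George W. Bush's term
obama_start = 10.6  # January 2009, start of Obama's term
april_2015 = 18.1   # roughly the total when Paul spoke

# "Debt doubled under a Republican administration."
print(round(obama_start / bush_start, 2))  # 1.86, close to double

# "Now it's tripling" -- measured from Bush's starting point.
print(round(april_2015 / bush_start, 2))   # 3.18, roughly triple

# The reading PolitiFact dinged Paul for: tripling since Obama took office.
print(round(april_2015 / obama_start, 2))  # 1.71, nowhere near triple
```

On the third reading, Paul's separate claim that Obama is "on course" to add more debt than all previous presidents combined would make no sense, which is the contextual clincher PolitiFact left out.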

PolitiFact, unsurprisingly, did not quote Paul's statement about Obama adding more debt than all the presidents preceding him combined. Leaving out important context helps PolitiFact apply its tweezers treatment.

Fact checkers shouldn't blame politicians when people interpret their statements incorrectly or stupidly. Fact checkers should explain the correct or most sensible interpretation to help those people understand it correctly.

Tuesday, April 7, 2015

But-but-but ... Experts!

I can imagine the PolitiFact apologist noticing that in our comparison between PolitiFact's alligator attack and shark attack stories, experts complained only about the silliness of the alligator attack comparison.

So, logically, it's not PolitiFact Florida's fault that the alligator attack comparison was silly while the shark attack comparison wasn't silly.

So there! Reality just has a liberal bias, and stuff.

Well, I covered this angle when I wrote up the alligator attack story at Zebra Fact Check.

Interpreting the comparison is not rightly the job of the criminology expert, nor the alligator attack expert. Those experts properly inform as to the number of attacks by concealed-carry permit holders, or the number of alligator attacks. It is the expert on English communications (if needed) that rightly evaluates the comparison. And in this case I challenge any expert on English to draw a principled distinction between PolitiFact Florida's stories on alligator and shark attacks. Both compare rare but dramatically different things. Both involve a number for which we don't have reliable statistics.

On the issue of experts pointing out the silliness of the comparison, the real question is why experts didn't call both comparisons silly.

Perhaps it was random variation.

Perhaps PolitiFact Florida didn't feel the need to interview a panel of experts about voter fraud.

Perhaps the experts carry their own political bias.

We think a fact checker should be able to make the call on a literary comparison that demands no particular English or logical expertise. And with respect to the alligator attack and shark attack stories, the call should be the same for both.

I'll reiterate what I wrote at Zebra Fact Check: Fact checkers should not allow experts to decide an issue outside their area of expertise.

Sharks & Alligators

I spent time at Zebra Fact Check last week dissecting PolitiFact Florida's effort to fact check whether the first 10 years of Florida's concealed-carry gun law produced twice as many alligator attacks as attacks by concealed-carry permit holders.

One aspect of that study deserves special attention here at PolitiFact Bias.

PolitiFact Florida couldn't find the information it needed to check the alligator claim. But it found a couple of experts who thought it was silly to compare gun attacks to alligator attacks, and the supposed silliness of the comparison found its way into the "Mostly False" rating. Health News Florida reviewed the rating with PolitiFact's Amy Hollyfield:
Since there's no source of comprehensive data for attacks by gun license holders, experts told PolitiFact Florida that it’s not very meaningful to compare alligator bites to the misuse of firearms.

“One of them told us it’s more than silly to compare bites to bullets,” Hollyfield said.
We couldn't avoid thinking about PolitiFact Florida's 2012 fact check of whether shark attacks outnumber cases of voter fraud. PolitiFact chose the measure: "cases" considered for prosecution by the state of Florida instead of "cases" as individual instances of fraud.

PolitiFact Florida admitted its measure was imperfect:
While the shark attack figures are cut and dry (sorry!), the voter fraud numbers are not. There could be more cases than we know about, involving more people. The numbers may not represent total voter fraud cases, as those could be handled by local supervisors and state attorneys.
Without considering the silliness of comparing voter fraud to shark attack and undeterred by the lack of good data on voter fraud, PolitiFact Florida ruled it "Mostly True" that shark attacks occur more frequently than voter fraud.

Obviously inconsistent? Yeah. Coincidentally, the point liberals like, that voter fraud occurs rarely, gets a pass. The claim conservatives like, that concealed-carry permit holders very rarely use their guns to attack others, gets the harsh rating.

That's PolitiFact. That's bias.

Monday, April 6, 2015

Tweezers: PolitiFact and the Indiana boycott

Sometimes PolitiFact focuses on one part of a statement. Sometimes PolitiFact spreads its focus to cover the whole of a statement. We use the "Tweezers or Tongs" tag for posts where we draw a contrast involving PolitiFact's choice of focus.

On April 2, 2015, PolitiFact published a story looking at a statement from Red State's Erick Erickson. Erickson wrote about Indiana's version of the Religious Freedom Restoration Act and the willingness of some on the left to punish Indiana economically for passing the legislation.

PolitiFact lays out the basics, we provide the bold emphasis:
In a column for conservative grassroots site, editor-in-chief Erick Erickson criticized business owners and people on the left who say the law will allow anyone to cite religious belief in refusing to serve gays and lesbians. Erickson’s opening sentence hones in on Apple chief executive Tim Cook for what he sees as hypocritical business practices.

"To recap: Tim Cook (please, please click this link) and the left are happy to do business in countries that stone to death or otherwise jail gay people, but will not do business with Indiana," Erickson wrote, "which merely passed a law insisting that the ‘free exercise’ clause of the first amendment be on the same legal footing in courts as the ‘free speech’ clause of the first amendment."
Obviously Erickson wasn't talking only about Apple CEO Tim Cook. He mentioned "the left." And Erickson has a point that some on the left took action to cut back business dealings with Indiana:
Companies, celebrities and even local and state governments have come out in opposition to Indiana's controversial "religious freedom" bill. Several have even cancelled plans to do business in the state, citing the potential for discrimination against gays and lesbians.
PolitiFact's story contains mention of only one boycott: the one PolitiFact says Erickson said was coming from Apple. Tweezers.

PolitiFact claimed Erickson was making the point that Cook was acting hypocritically by boycotting Indiana while continuing to do business with nations like Iran. But if that was really Erickson's point then why did he dedicate only one line in his entire column to that point? Tweezers.

Erickson was making a broader point about the reaction by some on the left. He accurately characterized reaction of some on the left in threatening Indiana with economic sanctions. And in the process he made it sound like Apple had committed to a concrete set of such sanctions against Indiana. That's where PolitiFact's tweezers came in, for Cook had simply written a column criticizing Indiana's RFRA law.

But here's the hole in PolitiFact's fact check: Did Apple have any sponsored events occurring in Indiana that it might have cancelled, like some other companies had done? If not, is it safe to assume Apple would not have joined some other companies in canceling such events?

The Apple convention "MacWorld/iWorld" took place in the middle of March earlier this year. In San Francisco. A lost opportunity to teach Indiana a lesson?

What did PolitiFact do wrong, if anything?

As we pointed out, if Erickson was making a point about a real Apple boycott of Indiana, he had plenty of opportunity to mention Apple specifically. But he did not. He lumped Cook in with the boycott, which was somewhat misleading, but Erickson was setting the stage for a general criticism of the left's intolerance of resistance to its mainstreaming of homosexuality. And Cook's a fair example to match with the point of Erickson's column. PolitiFact missed the point of the column and the reference to Cook, using its tweezers to ding Erickson while not even acknowledging the reality of the boycott threatened by companies aside from Apple.


Impartial tweezers?

Nah. PolitiFact's trick is to often treat parallel statements from liberals or Democrats with tongs. Sure, part of the statement is false, but part of it is true! So, "Half True" or something!


The Erickson response: Erickson defended his column by saying sometimes a tweet is just a tweet. PolitiFact gleefully made light of that excuse, noting that Erickson's column was not a tweet. Note to Erickson: What PolitiFact did was ridiculous, but you need to do better than that.

Correction: Struck "Florida" from the title, as PolitiFact National was responsible for the Red State fact check, not PolitiFact Florida.