Monday, March 23, 2015

Fact checking the PolitiFact way

Okay, kids, today we're going to learn how to fact check the PolitiFact way. We'll use the method to tell whether President Obama's signature health care reform bill, known as the ACA or "Obamacare," was a job killer.

It's okay to leak your findings at the beginning of the fact check:
Predictions about the health care law were a dime a dozen back in 2010. Supporters contended that virtually everyone around the country would soon have access to affordable insurance. Opponents said the law would cost a fortune by adding to the national debt and killing jobs.

Actually, none of those things have happened.
Next, provide evidence supporting your findings:
One of the warnings that the law’s opponents issued repeatedly in the months leading up to passage was that the health care law would kill jobs. In 2011, Republicans titled the repeal legislation they were pursuing the "Repealing the Job-Killing Health Care Law Act." But independent studies didn’t back up the claims that the law would end up reducing employment, so PolitiFact has rated such statements False.
Some might question whether it's appropriate to judge the ACA's effect on jobs by simply looking at employment numbers. Just assume that such people are biased and that their criticism will be drowned out by those who agree with your judgment.

Even though pointing to your own past work without getting into specifics ought to be plenty to convince people, it's okay to repeat your evidence for emphasis:
In the years since passage, employment in both the United States and Florida has been on an upward trajectory as the economy has recovered from a recession.
What more proof could anyone need? There we have it: the facts. The total number of jobs increased and the unemployment rate decreased; therefore, the ACA had no negative effect on the number of jobs.
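For anyone who needs the problem spelled out, here's a toy counterfactual with invented numbers (not real employment data) showing why overall job growth doesn't settle whether a policy killed jobs; what matters is the comparison to the jobs that would have existed without the policy:

    # Toy illustration with invented numbers -- NOT real employment data.
    # The point: total employment can rise while a policy still costs jobs
    # relative to what employment would have been without the policy.

    jobs_before = 130_000_000        # hypothetical baseline employment
    recovery_growth = 8_000_000      # hypothetical jobs added by the recovery alone
    policy_effect = -500_000         # hypothetical drag attributed to the policy

    jobs_with_policy = jobs_before + recovery_growth + policy_effect
    jobs_without_policy = jobs_before + recovery_growth

    print(f"Employment with the policy:    {jobs_with_policy:,}")     # higher than before the policy
    print(f"Employment without the policy: {jobs_without_policy:,}")  # higher still
    print(f"Jobs lost vs. the counterfactual: {jobs_without_policy - jobs_with_policy:,}")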


Seriously, how can PolitiFact take itself seriously?

Sunday, March 15, 2015

More PolitiMath, featuring PunditFact and Jalen Ross

In the past, we've pointed out the tendency at PolitiFact to make mistakes on basic math (and the tendency of Democrats to benefit from those mistakes).

Today PunditFact gave us another example, though this one it fixed pretty quickly after learning something was amiss. So our focus falls not on the fact that a liberal, student Jalen Ross, benefited from PunditFact's temporary misstatement of fact, but on the PolitiMath (the relationship, or lack thereof, between percentage error and PolitiFact's ratings) involved in keeping a "Mostly True" rating for Ross after the mistake was fixed.

Ross attempted to use evidence from a scientific experiment to support the idea that racism remains a prominent problem in the United States. But he did not properly use the study's findings, as PunditFact belatedly pointed out:
Ross erred slightly in his exact wording. While white-sounding names spurred 50 percent more callbacks than the ones with black-sounding names, black-sounding names were 33 percent less likely to get responded to.
The slight error in Ross' exact wording ended up exaggerating the discrimination against so-called "black-sounding names" by about 50 percent ((50-33)/33).

So Ross used bad math and ended up exaggerating the key figure by 50 percent.
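For the record, here's the arithmetic behind that "about 50 percent" figure, using the callback numbers as PunditFact reported them:

    # Checking the arithmetic behind "about 50 percent," using the figures in
    # PunditFact's correction (33 percent is the correct framing, 50 percent is Ross' wording).
    correct_figure = 33   # black-sounding names were 33 percent less likely to get a response
    ross_figure = 50      # Ross' wording implied a 50 percent disparity against black-sounding names

    exaggeration = (ross_figure - correct_figure) / correct_figure
    print(f"Exaggeration: {exaggeration:.0%}")  # prints "Exaggeration: 52%" -- roughly 50 percent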

Let's compare what happened to Republican Mitch McConnell when he used questionable math and exaggerated his key figure by 37 percent or more (bold emphasis added):
In any event, the lowest estimate of Bush's war spending through 2008 that is even remotely defensible is $808 billion. Tack that onto the $132 billion cost of Katrina and you get $940 billion for the wars and Katrina.

That's well over the expenditures expected from the Democrats' stimulus and children's health insurance bills, which total $686 billion once tax cuts are subtracted. Even if we included the cost of the tax cuts (for a total of $818 billion), he would still be wrong because that's less than the $940 billion that uses a more accurate cost of the war spending.
PolitiFact's ruling on McConnell's claim? "False."
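And here's the parallel arithmetic for McConnell, using the dollar figures quoted above from the fact check; the "or more" reflects that $808 billion is only the lowest defensible estimate of the war spending:

    # Checking McConnell's percentage error, using the dollar figures quoted above.
    wars_and_katrina = 940     # $940 billion: lowest defensible war spending plus Katrina
    stimulus_and_chip = 686    # $686 billion: stimulus plus children's health insurance, minus tax cuts

    error = (wars_and_katrina - stimulus_and_chip) / stimulus_and_chip
    print(f"McConnell's error: {error:.0%}")  # prints "McConnell's error: 37%"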

Hmmm. Apparently McConnell had no valid underlying point that the stimulus bill cost a great deal of money.

Inconsistent.


Tuesday, March 3, 2015

PolitiFact & Israel, thousand words edition

The left-wing bloggers at PolitiFact used Israeli Prime Minister Netanyahu's address to Congress to throw out more clickbait: Come see all our fact checks involving Israel!

We found their choice of photographs intriguing. After all, nothing says "objective and nonpartisan" like people with bloody hands wearing Netanyahu masks, right?



The image comes from PolitiFact's Facebook page.

Friday, February 20, 2015

Hot Air: "PolitiFact’s Lie of the Year of 2014 falls apart only two months later"

Noah Rothman of the conservative site Hot Air offers a reminder that PolitiFact's 2014 "Lie of the Year" was a train wreck:
Just about two months later, PolitiFact’s LOTY imploded.

“A team of prominent researchers suggested Thursday that limited airborne transmission of the Ebola virus is ‘very likely,’” The Washington Post reported on Thursday, “a hypothesis that could reignite the debate that started last fall after one of the scientists offered the same opinion.”
PolitiFact, remember, bundled all the supposed misinformation about Ebola into one giant and ambiguous "Lie of the Year." George Will's claim that some scientists believe Ebola may pass via a sneeze or a cough was the centerpiece of PolitiFact's award.

In truth, Will's statement was never worthy of a bad rating, let alone inclusion in a group "Lie of the Year" award. We noted at the time that PolitiFact's rating relied on playing games with Will's choice of words.

Read the whole of Rothman's latest evaluation, including mentions of blogger Ace and the clip at the end of Will reminding everyone back in October that "settled science" is rarely settled.

Thursday, February 19, 2015

PolitiFact's latest survey on Rush Limbaugh

We've long criticized PolitiFact's habit of presenting its "report cards" summing up its findings on political personalities and the like. Of course the report cards are non-scientific and should not be used to generalize about those personalities.

Yet PolitiFact continues to publish them. Apparently they can't resist throwing out this type of clickbait.

Screen capture cropped from PolitiFact.com's Facebook page

Of Rush Limbaugh, PolitiFact says "He has yet to receive a rating of True."

So what?

If the scorecard featured a randomized set of fact checks, then it might mean something about Limbaugh that he hasn't received a "True" rating from PolitiFact. But lacking any such randomization, the results say something about PolitiFact, not Limbaugh. And it's the predictable results of publishing these silly scorecard stories that make the practice particularly wrong:


Screen capture cropped from DailyKos.com

It's a survey! You know, like a scientific survey using a randomized population of fact checks. Except it's not.
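For readers who want to see why the lack of randomization matters, here's a minimal, hypothetical simulation (invented numbers, not PolitiFact data) of how story selection alone can drive a "scorecard," regardless of how often the speaker is actually right:

    import random

    # Hypothetical simulation -- invented numbers, not PolitiFact data.
    # A speaker makes many claims, most of them true. A fact checker that
    # preferentially selects dubious-sounding claims produces a "scorecard"
    # that says more about the selection than about the speaker.

    random.seed(0)

    claims = [random.random() < 0.8 for _ in range(1000)]   # 80% of the speaker's claims are true

    def scorecard(claims, pick_probability_true, pick_probability_false, n=20):
        """Rate n claims chosen with selection bias; return the share rated True."""
        selected = []
        while len(selected) < n:
            claim = random.choice(claims)
            p = pick_probability_true if claim else pick_probability_false
            if random.random() < p:
                selected.append(claim)
        return sum(selected) / n

    # Random selection roughly recovers the speaker's actual accuracy.
    print(f"Random selection: {scorecard(claims, 1.0, 1.0):.0%} rated True")
    # Selecting mostly claims that look wrong makes the same speaker look terrible.
    print(f"Biased selection: {scorecard(claims, 0.05, 1.0):.0%} rated True")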

Allen Clifton at "Forward Progressives" foreshadowed PolitiFact's highlighting of Limbaugh's record. More than a coincidence?

The folks at PolitiFact have to know that people get misled by these scorecards. Yet they keep highlighting them anyway, often with no warning about the unscientific nature of the [ahem] survey.

What does that say about PolitiFact?

Fifty shades of "Half True"

PolitiFact's founding editor, Bill Adair, has said the truth is often not black and white, but gray:
Our Truth-O-Meter is based on the concept that the truth in politics is often not black and white, but shades of gray.
With this post we'll look at two examples of PolitiFact shading the truth with its middle-ground "Half True" rating.

Justice Roy Moore, conservative: "Half True"


The first example comes from Feb. 13, 2015. Alabama Supreme Court Chief Justice Roy Moore said Alabama hadn't changed its mind about gay marriage since passing a law in 2006 defining marriage in heterosexual terms. Moore was answering a claim from CNN host Chris Cuomo that people in Alabama had changed their views on gay marriage. PolitiFact reported the key exchange:
"Times have changed as they did with slavery," Cuomo said Feb. 12 on New Day. "The population no longer feels the same way. And even in your state, people no longer feel the same way."

Moore held firm that marriage was defined as between a man and a woman, and said, "81 percent as recently as 2006 said it was the definition. They haven’t changed their opinion."
PolitiFact framed its fact check in terms of a contest between the statements from Cuomo and Moore. If support for gay marriage had changed in Alabama, then Moore's claim was not plainly true.

PolitiFact flubbed its interpretation of Moore's response. Moore was not arguing that no change had occurred in opinion polls. Moore referred to the percentage of Alabama voters who approved the heterosexual marriage definition in 2006. The voters had not changed their minds in the sense that the people of Alabama had not moved to change the law they had overwhelmingly approved. PolitiFact noted that Moore was referring to that vote but somehow failed to put the pieces of the puzzle together. Moore's Truth-O-Meter rating: "Half True."

Even granting PolitiFact's mistaken interpretation, Moore would be off by a scant 14 percent. Democrat Sen. Sheldon Whitehouse once received a "Mostly True" rating for a claim that was off by 27 percent.


Rep. Pete DeFazio (D-Ore.): "Half True"


Our second example comes from a PolitiFact fact check published on Feb. 17, 2015. Rep. Peter DeFazio (D-Ore.) blamed genetically modified crops for the impending extinction of the monarch butterfly.

PolitiFact quotes DeFazio:
"We certainly know there is going to be secondary harm to the environment," he said. "In fact, monarch butterflies are becoming extinct because of this sort of dumping, (the) huge increase in pesticides’ use because of these modified organisms."
DeFazio got a thing or two wrong. Monarch butterflies aren't going extinct. The causal connection between the increased use of herbicides and the decreased wintering population of monarch butterflies has not yet been scientifically established. And, though PolitiFact kindly ignored this mistake, DeFazio referred to "pesticides" instead of "herbicides." The expert PolitiFact cited mentioned the effects of herbicides on the monarch caterpillar's favored food, milkweed. PolitiFact apparently didn't investigate the effect of pesticide dumping on monarch butterfly populations.

So DeFazio got nothing right, but PolitiFact accepted his extinction claim as a mere exaggeration of the declining wintering population of monarch butterflies. The final ruling: "Half True."


It almost takes a masochist to read PolitiFact's fifty shades of gray.

Thursday, February 12, 2015

Courting the journo-lobbyist

PolitiFact, the hapless fact checkers/liberal bloggers with whom we find fault almost daily, has a history going back to 2007 of inviting readers to steer its "independent" fact checking.
What should we check? Our story about the Obama chain e-mail was suggested by a PolitiFact reader. If you have a suggestion for facts or chain e-mails we should check, click here to email us.
We noted long ago that such practices encourage people, including political activists, to try to influence PolitiFact's choice of stories.

PolitiFact created a Twitter hashtag intended to encourage story ideas from readers: #politifactthis.

We're not surprised the hashtag doesn't get used much. Political activists don't want broad publicity for their efforts to drive the news. It's best done behind the scenes and anonymously.

This week we found out PolitiFact is going that extra mile for its journo-lobbyists by creating a PolitiFact browser plug-in. Users will be able to suggest fact check material to PolitiFact through the plug-in. They will reportedly even have the privilege of voting for and commenting on specific story ideas:
Designing a fact-checking plug-in for Web browsers that will allow people to request a fact-check of Internet content from PolitiFact staff; users will be able to vote on fact-check requests and make comments on flagged content
You're doing a really fine job of maintaining your independence, PolitiFact.

Seriously, why do they not see that they're encouraging the practice of journo-lobbying? Or do they see it and just not care?

If PolitiFact views this as a problem, expect to see the plug-in disclose the identity of people who suggest, vote for or comment on fact check ideas.

We're betting it'll be anonymous. It wouldn't do to waste that $35,000 in Knight Foundation grant money on a browser plug-in that hardly anybody uses.