Sunday, December 25, 2011

The Weekly Standard: "Damned Lies and ‘Fact Checking’ (cont.)"

The Weekly Standard has a follow-up to Mark Hemingway's story earlier this month on the foibles of fact checking (link to our review).

The update, under the title "Damned Lies and 'Fact Checking' (cont.)," is mostly subscriber-only content, though the whole of it is currently available for preview on the Standard's "The Scrapbook" main page. 

Without giving too much away, this nugget serves as both an appropriate tease and a colorful summary of the story:
It’s high time liberal pundits figured out that there’s more going on in this fact-checking bordello than raucous piano music. If they’d been paying attention, they would have long ago stopped patronizing these journalistic houses of ill repute.

Slate: "PolitiFact Weirdly Unable to Discuss Facts"

PolitiFact's recent spat with its liberal readership base has led to the publication of quite a few stories that echo criticisms recurrent in the posts we publish and link at PolitiFact Bias.

Slate's Dave Weigel, famously/formerly of the Journolist, has another such:
After this week, plenty of pundits are well and done with the national version of PolitiFact. The local versions? They're great. I was actually pretty fond of how one of them debunked an ad that misued [sic] one of my quotes, attributing it to a candidate, in 2010. Alas, PolitiFact Editor Bill Adair has committed the main site to a factually dubious "Lie of the Year" claim. PolitiFact claims that it's a "lie" to say that the Path to Prosperity ends Medicare. ActualFacts tell us that this is not a lie.

Adair responds to the critics in the worst possible way.
At a Republican campaign rally a few years ago, I asked one of the attendees how he got his news.

"I listen to Rush and read NewsMax," he said. "And to make sure I'm getting a balanced view, I watch Fox."
We're starting with an anoymous [sic] quote from a straw man that Adair met once?
Weigel continues to expand on Adair's defense, noting that it does nothing to address substantive criticisms.

Adair's response matches the customary pattern at PolitiFact, with the possible exception of the explanation PolitiFact offered after one of its criticisms of Rachel Maddow likewise offended liberal sensibilities.  The sad thing is that it took so long for so many liberals to see it.  Apparently it's easy to overlook the problem so long as conservatives have to deal with the bulk of the harm.

Though we hardly agree with Weigel about the quality of PolitiFact's state franchises (the jury's still out on most of them), his main point is well taken and the post is worth reading.

PolitiFact would gain credibility if it answered substantive criticisms with well-reasoned rebuttals. 

Claiming the critics suffer from some type of echo-chamber syndrome that prevents them from understanding PolitiFact's greatness is not a well-reasoned rebuttal.  Rather, it is an ad hominem fallacy.  Readers are not well served by that type of response.

Jeff adds: Weigel continues a curious new pattern we've noticed among liberal writers describing PolitiFact. What used to be a ubiquitous reference to PolitiFact's Pulitzer (which served to inform the reader of PolitiFact's unquestionable credibility and authority) is no longer worth the extra space to mention. 

Saturday, December 24, 2011

Forbes: "PolitiFact and the Traditional Journalism Trap"

After a focused effort to publish my own response to Bill Adair's thin defense of PolitiFact's choice for its "Lie of the Year," I ran across a similar item in Forbes by John McQuaid.

McQuaid and I found two key areas of agreement.

First, the response story from PolitiFact editor Bill Adair was born of conceit:
The whole PolitiFact ruckus has the feel of traditional newspaper journalism (despite the new-ish fact-checking approach) whipsawed by forces it cannot grasp. Traditional newspaper journalism wades boldly into the public square wielding its post-Watergate, “objective” approach and finds itself besieged. And so it concludes: from nasty anonymous comments to partisan sniping to political debates that are never resolved, the public square sucks.
Second, PolitiFact is unable or unwilling to adequately explain and defend its argument for "Lie of the Year":
Today, if you make a Big Statement, people will come after you. Yes, some (most?) will be hacks and fools. But some will be smart, and they will demolish you. Your Pulitzer Prize will not protect you. So you should be prepared to defend yourself and your statement. That means wading into the public square not only with facts, but with arguments and a grasp of the subtleties of the issue at hand. This is, on the whole, a good thing. Readers can affirm or object. Commentators can comment. And fact-checkers can defend and elaborate on their decisions.

That Politifact is apparently unable to understand the necessity of this, and may not even possess the vocabulary or self-awareness needed to do it, suggests it has big problems ahead.
McQuaid's article is worth a read.  It's short and comes in two parts (1, 2).

McQuaid does not touch on the issue of PolitiFact's bias problem.  But his comments touch on one of the key sources of media bias.  Superficial knowledge of the subject tends to increase the role of ideological bias in reporting.  That's why citing expert sources may lead to problems where the experts disagree.  The journalist isn't likely to settle an issue debated by the experts on the topic.  Yet bias may lead the journalist to prefer one expert assessment over another.

Bill Adair: You who criticize us are in an echo chamber chamber chamber chamber

PolitiFact editor Bill Adair served up some delicious irony with his recent defense of PolitiFact's 2011 "Lie of the Year" selection.

That selection was Democrats' claim that Republicans voted to end Medicare.  Liberals and progressives far and wide have condemned the selection, and we at PolitiFact Bias share a degree of sympathy with offended liberals since there is some (not much) truth in the claim.

The deluge of port side criticism has prompted yet another one of PolitiFact's nearly content-free rebuttals under the headline "Fact-checking in the Echo Chamber Nation."

Adair seems blissfully unaware that he's inside the echo chamber.

At a Republican campaign rally a few years ago, I asked one of the attendees how he got his news.

"I listen to Rush and read NewsMax," he said. "And to make sure I'm getting a balanced view, I watch Fox."

My liberal friends get their information from distinctly different sources — Huffington Post, Daily Kos and Rachel Maddow. To make sure they get a balanced view, they click Facebook links — from their liberal friends.
Adair just told us that he's positioned within an echo chamber oriented left.  He hears opinions from the right when he's out reporting. But to hear what the left is saying he can just hang out with his friends.  A truly centrist Bill Adair might be expected to have a conservative friend or two to draw on in writing his story.

This is life in our echo chamber nation. We protect ourselves from opinions we don't like and seek reinforcement from like-minded allies.
Bear in mind Adair just finished hinting that his list of friends is predominantly (if not exclusively) liberal.

If Adair isn't in the echo chamber shoulder-to-shoulder with those he criticizes, then his situation is more akin to a liberal echo-chamber duplex with one common living area.

The paradox of the Internet age is that never before have we had access to more ideas and different thoughts. And yet, many of us retreat into comfy parlors where everyone agrees and the other side is always wrong. Each side can manufacture its truths and get the chorus to sing along.

PolitiFact had its latest brush with the Echo Chamber Nation this week. We gave our Lie of the Year to the Democrats' claim that the Republicans "voted to end Medicare." That set off a firestorm in the liberal blogosphere, with many saying that claim was not actually wrong. We've received about 1,500 e-mails about our choice and only a few agreed with us.
Adair borrows a page from President Obama's book of rhetorical tricks.  Sure, "many of us" insist on surrounding ourselves with like-minded opinions.  But Adair's problematic audience response probably comes more from those who expose themselves to contrary opinion yet do not have the ability and/or inclination to sift through the clash of ideas to figure out what's wrong or right from either side.

And blame falls on PolitiFact on this point.  PolitiFact often fails to make a clear case in favor of its decisions, and its 2011 "Lie of the Year" is another good example. Observe Adair's method of treating substantial criticisms in response to the "Lie of the Year" selection:
Some of the response has been substantive and thoughtful. The critics said we ignored the long-term effects of Rep. Paul Ryan's plan and that we were wrong to consider his privatized approach to be Medicare. In their view, that is an end to Medicare.

We've read the critiques and see nothing that changes our findings. We stand by our story and our conclusion that the claim was the most significant falsehood of 2011. We made no judgments on the merits of the Ryan plan; we just said that the characterization by the Democrats was false.
You just can't blame the outraged liberals for finding this type of response unsatisfactory.  Adair appears to admit that they have a point, then tells them, with no reason why (unless it's sufficient to claim non-specific support from Annenberg Fact Check or the Washington Post fact checker), that there's no reason to change the ruling.

We got other silly comments from readers who declared we were "a tool" of the Republicans, Fox News and the Koch brothers. Their reaction is typical these days. To paraphrase George W. Bush, you're either with us, or against us.

In reality, fact-checking is growing and thriving because people who live outside the partisan bubbles want help sorting out the truth. PolitiFact now has nine state sites run by news organizations around the country that employ more than 30 full-time journalists for fact-checking. We've inspired many copycat sites around the nation and roughly a dozen in other countries.
Adair says the extremist reactions are "typical."  And in almost the next breath he claims that fact checking is thriving because of the people living outside the partisan bubbles.  The atypical ones account for PolitiFact's success?   Why, if that's the case, did PolitiFact not receive greater email support for its "Lie of the Year" selection?  Is it that hard for Adair to see the writing on the wall from within his echo chamber?

On the whole, Adair's response is elitist and defensive. The PolitiFact staff is enlightened, thank you very much.  If you don't like their "Lie of the Year" selection then there are plenty of potential readers who live outside the echo chamber.  And it would be nice if a few of those readers would send in some supportive emails (hint, hint).

It seems Adair doesn't know his audience.


One more area where PolitiFact needs to clean up its act:

Some of our critics wrongly attributed our choice to our readers' poll and said we were swayed by a lobbying campaign by Ryan. But our editors made the choice and the poll was not a factor.
Um--how do we know the poll was not a factor?  Because Adair says so?

Free advice for Adair:  If you want to be able to claim with confidence that the poll plays no role in the editors' selection then keep the editors ignorant of the poll numbers until they're finished making their choice.  And if you do it that way then you can write your defense like this:
Some of our critics wrongly attributed our choice to our readers' poll and said we were swayed by a lobbying campaign by Ryan. But we shield the editors from the poll data to ensure that it will not affect our decision.
Doesn't that sound a lot better?  More convincing?

Jeff adds: For us PolitiFoes, Adair's airing of grievances was a Festivus Miracle. Adair implies (as does the entire premise of the PolitiFact operation) that he is somehow immune from the echo chamber, and if you don't trust him, it's because you're in too deep to notice. You're not objective enough to see the emperor's non-partisan clothes. I also have issues with his you-a-culpa over the examples he provides. His GOP rally attendees and liberal friends are straw men all day long. Whatever the political inclinations of his acquaintances may be, they are irrelevant to whether or not PolitiFact gives it to us straight.

It's also interesting to note the tone of the article when compared to the dispassionate text of Adair's response to 2010's Lie of the Year criticism. We've noticed PolitiFact responds much more aggressively to anger from the left than from the right and this article adheres to that theme.

Perhaps the injuries inflicted by friendly fire were deep, but it's doubtful Adair corralled any sheep back into the flock with this ill-advised tantrum.

Apoplectic Now: The Aneurysm of the Year

That massive popping sound you heard on Tuesday was the collective hearts and minds of liberals across America bursting as they witnessed their favorite source of smug validation betray them. PolitiFact editors played their pre-selected card and announced the Democrats' claim that Republicans voted to "end Medicare" as the Lie of the Year for 2011.

What could go wrong?

The wrath unleashed on PolitiFact went far and wide as hysterical condemnations and inordinate smiting piled up on the left side of the Internet. The High Priest of Haute Liberals himself, Paul Krugman, sounded the death knell in his subtly titled article "Politifact, R.I.P." in which he described PolitiFact as "useless and irrelevant." Talking Points Memo called the decision a "sham," and Steve Benen at Washington Monthly called the decision "indefensible" in his article "PolitiFact ought to be ashamed of itself." The list goes on and on and on (and on).  The formerly ubiquitous mention of PolitiFact's Pulitzer, previously brandished as a badge of credibility, is suspiciously absent from these articles.

But for long-time PolitiFact critics like us, few things in life have been as entertaining as the epidemic hysteria witnessed over at PolitiFact's Facebook page. Check out this sample of outbursts posted on various Facebook threads throughout the week. (Names have been removed to protect the aggrieved):
"I've awarded Politifact the Steaming, Festering Turd of The Year Award for this one. Your credibility has been flushed."

"Politifact, you're either being bought off by the right wing echo machine or you're scared of them."

[The Pauline Kael Trophy goes to:] "This has been voted by everyone I know,including myself as the stinkiest,lamest,most cowardly decision of the year!"

"You let Fox News choose your Lie of the Year, didn't you."


[This guy may be on to something:]"Maybe, we just gave a group of idiots too much credit to begin with simply because the bore the name 'Politifact.'"

"PolitiFact's "Lie of the Year" is pretty good.......... for me to poop on."

"Noting your selective ignorance of objective facts, I am now forced to ignore you as a reference source. Unfortunate, but I am only interested in objective, FULL, analysis of facts." [Which is what I considered you when you were confirming my opinions.]

"So now we know Politifact is as bought as the politicians they scrutinize."

[Murderers!:] "Presumably, politifact also believes that if someone kills another person that it is not murder if they kill them slowly with a slow acting poison. Such lame and disreputable analysis and logic is incomprehensible for an organization wishing to claim some skill and reputation at factchecking."

[From the 'Paul Ryan stuffed the ballot box' conspiracy:] "The mere product of lobbying. Hey politifact way to bend over and take it. Hope you had on lipstick so atleast you looked good doing it."

[The Jews!:] "How many shekels did you guys get for that choice?"

"Did you guys get purchased by Newscorp?"

"Another election stolen. Dislike."

"What a bummer, I trusted Politifact implicitly until this." [Spencer Pratt responds]

[Baby, Don't Go Award:] "If you guys can do something to win back your credibility after this outrageous and outlandish ruling, then I may be back. Right now, though, I'm unliking this page and deleting the bookmarks I have to your website."

"Either you fire your editorial board and give yourself a pants on fire or just close up shop."

"God you guys are stupid."
Hell hath no fury like a liberal scorned.

PolitiFans fell into one of a few groups. Some accused PolitiFact of being a tool of the GOP.  Others claimed Paul Ryan sabotaged the vote with his email campaign (unaware that the readers' poll is not the same as the editors' pick). Most simply said the claim was true, and that determining what constitutes the "end of Medicare" is an issue of semantics that falls outside the scope of objective values. That's a fair point, and it's one we've chronicled a number of times, including last year's Lie of the Year. So where have all the indignant liberals been since PolitiFact's inception? Affixing varying degrees of "fact" to obvious hyperbole and opinion has been PolitiFact's shtick all along. For the left to become unhinged now betrays their own selective bias. In short: PolitiFact served its purpose as a neutral, objective arbiter of fact as long as it was validating liberal axioms.

To illustrate this point, check out this Jonathan Chait article (with some, uh, minor edits in bold):

The Patient Protection and Affordable Care Act would very dramatically change health care.


Is that “a government takeover?” Well, it’s a matter of opinion. At some point, a change is dramatic enough that it is clearly a government takeover. If you proposed to replace a voluntary, free market system with a plan that mandated everyone purchase health insurance and the government dictated what patients and ailments insurance companies had to cover and what to charge, I would hope Politifact would concede that this would be “a government takeover,” even if you call the new mandates “a free market solution.” On the other hand, small tweaks could not accurately be called “a government takeover.” Between those two extremes, you have gray areas where you can’t really say with certainty whether a change is radical enough to constitute a takeover.

Does ObamaCare indeed establish a government takeover? I would argue no. But it’s obviously a question of interpretation, not fact. And the whole problem with Politifact’s “Lie of the Year” is that it doesn’t grasp this distinction. Politifact doesn’t even seem to understand the criteria for judging whether a claim is a question of opinion or a question of fact, let alone whether it is true.

Obviously, Chait's unedited piece argued that whether or not Ryan's plan did in fact end Medicare was a matter of interpretation (and ironically it mirrors the Wall Street Journal's op-ed about last year's LOTY). We tend to agree with this criticism. And to be fair to Chait, he's called PolitiFact out for being harsh toward the GOP before. But the mountain of new criticism of the Lie of the Year, and of PolitiFact's operation in general, seems a few years late. As with all of PolitiFact's betrayed lovers this week, the sudden realization that PolitiFact operates as a biased actor with motivations less noble than honest determination of facts looks comical and disingenuous to everyone who's seen it for years. The irony for us is that it took PolitiFact's calculated attempt to appear even-handed for the liberals to rise up in revolt.

The Medicare claim was the winner from the outset. Just take a look at its competitors. The reality is that Jon Kyl's abortion claim, Michele Bachmann's vaccine statement, and Debbie Wasserman Schultz's rant about Jim Crow laws were hardly repeated outside of PolitiFact's circles. They were minor blurbs that barely lasted a news cycle and had no place being in the running for statements "that played the biggest role in the national discourse." For all the gnashing of teeth about the winner, somehow PolitiFact managed to protect Team Democrat from any unflattering press about legitimate, nationally popular issues like Solyndra or "Fast and Furious." The ten finalists were carefully selected, with an eye on the Medicare claim as the winner. And anyone who thought PolitiFact would select a GOP claim for the third year in a row ignored the reality that PolitiFact is a political animal with a brand to protect and an impartial image to uphold.

In the end it's hard to estimate the final damage PolitiFact has done to its standing with its overwhelmingly liberal readership. We've seen smaller-scale exoduses, with only short-term effects, whenever PolitiFact has gone after Jon Stewart. Whatever the case, conservatives would be wise to avoid finding anything redeeming in this temporary respite from the partisans at PolitiFact. As we've explained before, the shoddy standards PolitiFact employs will inevitably hit both sides of the aisle, but the liberal fishbowl of the newsroom will ultimately cause them to come down against the right much more often.

The 2011 Lie of the Year selection does little to diminish PolitiFact's aura of liberal bias. If anything, it exemplifies the selection bias and inherent flaws of their operation that have made it so unreliable in the first place. Whether this is PolitiFact's demise as a tool of liberal validation, or whether it bolsters their claim that "upsetting both sides proves they're doing it right," for us at least it's been a fun week to watch.

Bryan adds:

Count me among those naive enough to believe that PolitiFact would pick three consecutive Republican claims as "Lie of the Year" depending on the material under consideration.

Jeff notes: I was correct in predicting the winner would go against the left, but my final pick (Obama hasn't raised taxes) was wrong. I suspect that had PolitiFact followed my advice there would be much less turmoil among the ranks. It's hard to imagine liberals being too upset about PF confirming Obama raised taxes.

Wednesday, December 21, 2011

The meaning of PolitiFact's "Lie of the Year" for 2011

I wonder whether this award will have those conservatives blasting politically-motivated “fact check” operations rethinking that criticism?
--Ed Morrissey, Hot Air blog
Fact check critics who base their criticism on a completely consistent pattern of wronging only one party or ideological position should take Morrissey's argument to heart.

As I have written repeatedly, a significant ideological bias does not require all the harm to hit one side and all the benefit to accrue to the opposite side.  In statistical terms, a simple majority of cases favoring one ideology over another indicates an ideological bias (after taking the margin of error into account).  Two out of three "Lie of the Year" awards going to conservatives, for example, fits well with the hypothesis of liberal bias.  Granted, three out of three makes an even better case.
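As a rough illustration of the margin-of-error point (our own sketch, not anything PolitiFact publishes), the question can be framed as a simple binomial calculation: if the two sides were equally likely to be hit, how often would a split at least this lopsided occur by chance?

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Probability of k or more hits against one side in n ratings,
    under the null hypothesis that each side is equally likely (p=0.5)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Two of three "Lie of the Year" awards hitting one side is fully
# consistent with a fair coin: it happens half the time by chance.
print(p_at_least(2, 3))    # 0.5

# A lopsided split over many more ratings would be another story:
# roughly 1 in 100,000 by chance, far outside any margin of error.
print(p_at_least(40, 50))
```

The numbers illustrate why a handful of awards says little either way, while a consistent skew across a large body of ratings would be hard to attribute to chance.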

What, if anything, does the 2011 "Lie of the Year" mean with respect to the issue of media bias?

Answer:  probably not much.

One liberal media hypothesis, as expressed by economist/political hack Paul Krugman:
(T)he people at Politifact are terrified of being considered partisan if they acknowledge the clear fact that there’s a lot more lying on one side of the political divide than on the other. So they’ve bent over backwards to appear “balanced” — and in the process made themselves useless and irrelevant.
Krugman's charge is plausible if we simply take him to mean that PolitiFact carries a consciousness of the effect on its brand of, for example, choosing a Republican claim as its "Lie of the Year" for 85 years straight.  We'll table discussion of Krugman's evidence supporting a "clear fact that there's a lot more lying on one side of the political divide than the other."

At the bottom line, the criticisms of the 2011 "Lie of the Year" from the left are no better than the right's criticisms of the 2009 and 2010 "Lie of the Year" winners.  The latter linked story helped earn Joseph Rago a Pulitzer Prize.  This year's award is no different than those in the past except that the left got hit instead of the right.  And, of course, the apoplectic response from the left creates such a contrast to the right's past reactions that Karl of Patterico's Pontifications offers the following:
PolitiFact’s most useful function may be in triggering an analysis of the overwrought reactions of these progressive crybabies. 
The left is largely content with PolitiFact so long as conservatives take the worst of it.  If not, well, the sky is falling and PolitiFact loses all credibility.  Or something like that.

Krugman's hypothesis is an unlikely explanation for this year's "Lie of the Year" selection.  The pressure to pick a lie of the left was probably subtle and semiconscious.  Why?  Because PolitiFact already carries very little credibility with conservatives, Ed Morrissey notwithstanding.  PolitiFact has angered its main demographic without much hope of building trust in a potential audience of largely suspicious conservatives.

If PolitiFact gains nonpartisan credibility with this move, the effect is primarily in-house:  The journalists reinforce their own belief in their fairness and objectivity with moves like this one.

PolitiFact probably misjudges its audience.  The net effect will be decreased overall trust in the brand.  Sure, the staff can take solace in the absurd notion that criticism from partisans on either side shows their even handedness.

It doesn't work that way.

Stay tuned, because PolitiFact Bias will soon roll out objective research supporting our position that PolitiFact manifests a significant bend to the left.

Correction Sept. 5, 2017: Very belatedly effected the change from "James Rago" to "Joseph Rago" in the seventh paragraph. RIP Joseph Rago.

Friday, December 16, 2011

New feature: The (annotated) Principles of PolitiFact and the Truth-O-Meter

Jeff and I continually run across cases where PolitiFact applies its standards unevenly, contradicts them or simply ignores them.  But simply mentioning it on a case-by-case basis doesn't quite carry the impact we might hope for.  After all, most readers don't happen to look at every case we mention.

To help communicate the degree to which PolitiFact fails to keep to its principles (and, to be sure, to give us an added opportunity to express our snarky sides), we added a new page:  The (annotated) Principles of PolitiFact.  The page takes the text of PolitiFact's "Principles of PolitiFact and the Truth-O-Meter" and adds our commentary regarding the application--or lack thereof--of those principles.

We'll be adding scads of links to provide examples of the failures we point out.

The page will always be a work in progress, reflecting our growing body of work examining PolitiFact.

Forbes: "How to Fix Fact-Checking"

It's gratifying to see journalists and pundits piling on PolitiFact for all the reasons PFB preaches.  The latest to jump on is Forbes magazine with a story by John McQuaid titled "How to Fix Fact-Checking."

McQuaid uses the recent Weekly Standard story by Mark Hemingway as his jumping-off point:
The Weekly Standard deplores fact-checking – the journalistic efforts, by PolitiFact and others, to vet what politicians and others in the public eye say and call out lies and half-truths. So much that Standard editor Mark Hemingway is trying to knock down the whole fact-checking enterprise, arguing it’s a liberal media scam.
McQuaid doesn't buy into Hemingway's suggestion that fact checkers intentionally skew to the left.  But he grants that the fact checking biz does have problems (bold emphasis added):
Here’s the thing, though. The Standard piece offers up some genuine examples of faulty fact-checking in service of its tendentious argument. The problem with fact-checking is not that it’s a liberal media plot. The problem is that fact-checking – like everything – is sometimes a lazy, half-assed business. If fact-checking is as important as it claims, its practitioners need to acknowledge its problems and fix them.
We argue that the tendency of journalists to lean left translates into a tendency for the preponderance of "half-assed" fact checking to harm conservatives and benefit liberals.  McQuaid is probably right that the bias isn't intentional--but often it's so bad that one is hard-pressed to detect the difference between intentional and unintentional bias.

Read it all.  The snippets above come from Page 1, while Page 2 contains McQuaid's prescriptions for the tainted industry.

Wednesday, December 14, 2011

Engineering Thinking: "PolitiFact’s Analysis of Cain’s 9-9-9 Plan is Fatally Flawed"

We were slow to notice a fresh PolitiFact item by Ed Walker at his blog "Engineering Thinking" from October.

Walker swiftly skewers PolitiFact's treatment of a Herman Cain claim about his 9-9-9 tax plan:
1. The first major problem with PolitiFact’s analysis is that it was not shown to be objective. PolitiFact selected three tax accountants to provide an opinion, but since Cain’s 9-9-9 plan — if implemented — will substantially reduce the need for tax accountants, they are the last folks that should be asked for an assessment.
Indeed, it seems odd that PolitiFact would solicit volunteers* from the ranks of tax accountants to test Cain's claim rather than going to tax experts at a think tank.  Not that the latter route is totally unproblematic.

And Walker's second point:
2. Politifact states in the online version, “For this fact-check, we’ll only be talking about the personal income tax and the sales tax since the business tax directly affects only business owners and corporations.” This assertion is nonsense, however, since everyone’s effective income is directly impacted by the prices that business owners and corporations charge their customers, and those prices are greatly affected by federal corporate and payroll taxes.

PolitiFact completely ignores such taxes, which are often hidden taxes that the Cain plan eliminates.
Walker is deadly accurate with his second point.  PolitiFact seems completely fooled by embedded taxes, having previously neglected their existence in a fact check of Warren Buffett's claims about effective tax rates for the very rich.  I've coined the term "the Buffett fallacy" for that mistake.

A good fact check does not simply ignore important aspects of the issue it examines.

Walker's post is short, but it's worth a visit to read the entire thing.  So please do.

* I have a very clear recollection of PolitiFact posting a request for readers with tax expertise to help evaluate Cain's plan.  Unfortunately, the Web page is either a bit hard to find or that item was scrubbed from PolitiFact's Web territory.

Sunday, December 11, 2011

The Weekly Standard: "Lies, Damned Lies, and 'Fact Checking'"

The Weekly Standard and Mark Hemingway add yet another effective critique of PolitiFact to the growing set:
They call themselves “fact checkers,” and with the name comes a veneer of objectivity doubling as a license to go after any remark by a public figure they find disagreeable for any reason. Just look at the Associated Press to understand how the scheme works.
Yes, Hemingway first uses the Associated Press as his example.  But PolitiFact isn't far behind:
(I)n 2009 the St. Petersburg Times won a Pulitzer Prize for PolitiFact, endowing the innovation with a great deal of credibility. “According to the Pulitzer Prize-winning PolitiFact .  .  . ” has now become a kind of Beltway Tourette syndrome, a phrase sputtered by journalists and politicians alike in an attempt to buttress their arguments.

If the stated goal seems simple enough​—​providing an impartial referee to help readers sort out acrimonious and hyperbolic political disputes​—​in practice PolitiFact does nothing of the sort.

Hemingway backs his assessment with the same example he used in his 2010 critique of PolitiFact in the Washington Examiner:  Rand Paul's statement about the gulf between average private sector pay and that received by federal workers.  Hemingway again explains the preposterousness of that rating and calls it "non-atypical" of PolitiFact.

What's PolitiFact's problem?  Hemingway's rundown sounds themes familiar to regular readers of PFB:
The media establishment has largely rallied round the self-satisfied consensus that fact checking is a noble pursuit. Nonetheless there are signs of an impending crack-up. In their rush to hop on the fact-checking bandwagon, the media appear to have given little thought to what their new obsession says about how well or poorly they perform their jobs.

It’s impossible for the media to fact check without rendering judgment on their own failures. Seeing the words “fact check” in a headline plants the idea in the reader’s mind that it’s something out of the ordinary for journalists to check facts. Shouldn’t that be an everyday part of their jobs that goes without saying? And if they aren’t normally checking facts, what exactly is it that they’re doing?
In a nutshell, the fact checkers are biased and not particularly good at fact checking.

Remember to read Hemingway's every word.  This review doesn't do it full justice.

Patterico's Pontifications: "Handicapping PolitiFact's 2011 Lie of the Year"

Karl, blogging at Patterico's Pontifications, published some comments about PolitiFact's upcoming "Lie of the Year" award.  Though Karl's post isn't exactly evidence of PolitiFact's left-leaning bias, his opinion of PolitiFact is neatly phrased:
I think this year’s merely “False” claims have to be discounted.  Interestingly, of the five ”Pants On Fire” claims, three are by Democrats.  Only one of those is from Pres. Obama; the remaining two are from the DCCC and “Facebook posts.”  The DCCC claim that House Republicans voted to “end Medicare” ought to be Lie of the Year, as it had the most impact on the national discourse.  But PolitiFact is about helping the center-left, not hurting it, which leaves the two GOP “Pants On Fire” entries.
If Karl's prediction pans out, it serves as another piece of circumstantial evidence of PolitiFact's liberal bias.

Karl was a bit more daring with his predictions than I was in a similarly titled post at Sublime Bloviations.  The main differences are that Karl picks a lone likely winner where I picked two, and I gave some space to considering the possibility that PolitiFact would choose a claim from the left in order to push back against the public perception that their operation is biased to the left.  Perhaps announcing the finalists helps inoculate PolitiFact on that count.  Simply having five statements from liberals to choose from among the finalists has liberals and progressives crying foul.

Jeff adds: I'll stick with the comments I left (both on Sublime Bloviations and on Karl's Patterico piece) that the award will go to Obama for his statement that he "didn't raise taxes once." Granting the Lie of the Year to a right-leaning statement a third year in a row might raise too many eyebrows when PolitiFact is already accused of a liberal bias. The statement itself isn't offensive to PolitiFact's liberal readers who already complain that Obama hasn't raised taxes enough. Picking this statement also serves the dual purpose of providing cover for their bias in the upcoming election cycle. One can imagine the arguments we'd hear for the next 11 months: "PolitiFact goes after both sides! They even picked Obama for the Lie of the Year!"

The final 10 statements they selected are also a bit curious. Whatever one may think of Bachmann's vaccine remark, or even Debbie Wasserman Schultz's Jim Crow claim, it's a stretch to consider them even in the running for comments that "played the biggest role in the national discourse." What kind of debate transpired in the PolitiFact editors' meeting that granted a top-ten spot to Jon Kyl's obscure and barely repeated abortion claim in the year of Anthony Weiner, Fast and Furious, and Solyndra?

Thursday, December 8, 2011

WaPo Fact Checker: "Revisiting Romney’s ‘deceitful, dishonest’ ad about Obama"

Back in late October, PolitiFact was publicly wringing its hands over a story it published that was out of step with fact checks of the same material by Annenberg Fact Check and the Washington Post's "The Fact Checker" column by Glenn Kessler.

It's hand-wringing time again as Kessler writes about a Mitt Romney ad that PolitiFact found outrageous ("Pants on Fire") while Kessler and the Annenberg folks found the ad more middle-of-the-road misleading:
(T)here are three reasons why we have trouble being outraged.

 First, the ad makes clear that Obama is speaking in 2008.
 Second, Obama’s statement was actually a misleading quote itself.
 Finally, the Romney campaign made it very clear that it had truncated the quote.
Two out of three of Kessler's points appeared in our own analysis of Romney's claim in our review of the PolitiFact fact check.

Though Kessler doesn't mention our central point about the ad, that its point doesn't change significantly regardless of whether the context was included or not, Kessler does note PolitiFact's out-of-step fact check response:
(Fact Checkers can disagree: PolitiFact labeled it “Pants on Fire.” But reached a conclusion similar to ours, saying the health-care line actually posed a “more serious problem.”)
Kessler treats PolitiFact very kindly.  The fact is that PolitiFact failed to make any mention of Kessler's three points.  In baseball terms, they whiffed on all three.

And Annenberg Fact Check?  The quotation issue was a sideshow so far as they were concerned:
What the Obama campaign chose to take issue with was how the then-candidate’s words were edited in a section where he is heard to say, “If we keep talking about the economy, we’re going to lose.” Obama was actually quoting his Republican opponent. The full quote is: “Senator McCain’s campaign actually said, and I quote, if we keep talking about the economy, we’re going to lose.”

Is that “deceitful and dishonest,” as Obama campaign spokesman Ben LaBolt quickly claimed? Or “blatantly dishonest,” as the liberal group ThinkProgress described it? It is possible that a viewer might be misled into thinking that Obama said this about his own campaign in 2011, since the quote comes 23 seconds after a graphic cites Obama’s comments as being uttered in 2008. But we’ll leave that for our readers to determine.
PolitiFact is, uh, bolder than that.  That's why PolitiFact is closer to Media Matters than the other major fact check services.  They have the chutzpah to let their subjective judgments determine the position of the misnamed "Truth-O-Meter" and serve it up to their readers as though it is objective journalism.

Jeff adds: When I first read the original PolitiFact piece I was reminded of a rating they gave former congressman Alan Grayson (D-FL). Grayson ran an ad that referred to his opponent, Daniel Webster, as "Taliban Dan." In the ad, Grayson edited a video of Webster to distort Webster's words into the opposite of what he said. Check out PolitiFact's summary in that ruling (bold emphasis added):
The Grayson ad clearly suggests that Webster thinks wives should submit to their husbands, and the repeated refrain of "Submit to me," is an effort to scare off potential female voters. But the lines in the video are clearly taken out of context thanks to some heavy-handed editing. The actual point of Webster's 2009 speech was that husbands should love their wives.

We rate Grayson's claim False.
Now read PolitiFact's treatment of Romney's ad (emphasis added):
We certainly think it’s fair for Romney to attack Obama for his response to the economy. And the Romney camp can argue that Obama’s situation in 2011 is ironic considering the comments he made in 2008. But those points could have been made without distorting Obama’s words, which have been taken out of context in a ridiculously misleading way. We rate the Romney ad’s portrayal of Obama’s 2008 comments Pants on Fire.
As Bryan noted, including the context wouldn't have changed the point of Romney's ad. Yet in Grayson's ad he not only took Webster out of context, he distorted (removed) Webster's words in order to make it appear Webster said something contrary to what he actually said (to say nothing of associating his opponent with a terrorist group). What exactly is more ridiculous about Romney's editing than Grayson's? What standard is PolitiFact using to make these determinations?

Until PolitiFact comes up with a way to objectively quantify a statement's ridiculousness, the ratings will continue to be plagued by the editors' personal biases.

Edit 12/11/11: Added link to the original WaPo article. -Jeff

Wednesday, December 7, 2011

Anchor Rising: "Do They Even Read What They Write?"

"Anchor Rising" contributor Patrick Laverty gives us yet another anecdote illustrative of PolitiFact's bias, thanks to PolitiFact Rhode Island:
This one was just too easy. First Politifact accuses Terry Gorman of RIILE of issuing a "Mostly False" statement, and then they actually explain how their own ruling is wrong!
RIILE is Rhode Islanders for Immigration Law Enforcement, and the issue is the decision by the Rhode Island Board of Governors for Higher Education to provide in-state tuition rates to at least some illegal immigrants.

Laverty makes a condensed but essentially accurate case in finding PolitiFact "pants on fire" for its ruling on Gorman.  The federal law, Laverty points out, allows a state legislature to provide postsecondary education benefits so long as the method complies with the rest of the federal statute.  But the federal law does not make that same exception for the Rhode Island Board of Governors for Higher Education.

PolitiFact makes an effort to legitimize the Rhode Island policy by playing up a key court decision in California:
In their decision, the California judges concluded that the basis upon which California granted the in-state tuition exemption -- which includes having attended a California high school for at least three years and obtaining a high school diploma or GED from California -- constituted criteria other than residency. Therefore, the judges wrote, "it does not violate section 1623."

The U.S. Supreme Court declined to hear the case on appeal.

The California court did not, however, rule on whether granting in-state tuition for undocumented students amounted to a "benefit" as defined in the federal law. That remains an open question.
There are two things of note in this portion of PolitiFact's analysis.

The first is the journalist offering a piece of legal analysis without directly sourcing it to an expert.  Journalists reporting in the objective style rarely put themselves forward as a definitive source of information.

Second, does it remain an open question?

On the face of it, the question doesn't seem so open.  The court's decision was the result of an appeal, and the lower court had ruled against the California law, finding it unconstitutional.  That court, it seems safe to say, operated on the premise that granting in-state tuition for undocumented students was a benefit under the applicable federal law.

It seems counterintuitive for the higher court to leave that issue unaddressed if it objected to that facet of the lower court's ruling.

One wonders why PolitiFact presented it as an open question, though it's clear enough in the context of the story that it serves as one of the keys to the unfavorable ruling Gorman received.

Thursday, December 1, 2011

Pete Sepp: "I don't know who the experts you consulted are or whatever policy agendas they may have"

Pete Sepp, vice president for communications and policy for the National Taxpayers Union, usually interacts with PolitiFact as an expert source.  This month, however, the NTU ran an ad that received the PolitiFact treatment, and Sepp ended up as NTU's spokesperson in defending it.  The ad called the federal government's proposed rebate program for drug purchases a "tax."

Sepp did not publish a public rebuttal to PolitiFact.  Rather, we find his arguments hosted by PolitiFact's Texas affiliate.   PolitiFact combined the bodies of three email messages from Sepp on a single reference page.

Sepp's initial email (bold emphasis added):
Based on our experience, calling this rebate plan anything less than a tax fails to capture all of its effects:

1) With a few exceptions that the Secretary of HHS would be able to approve (an uncertain proposition), drug manufacturers would be required to rebate 23 percent of the average manufacturer price (more if the drug price rose quicker than inflation) for a brand-name pharmaceutical that was distributed to lower-income Part D beneficiaries. Otherwise, the company could not participate in providing drugs to Medicaid, Medicare, or other government beneficiaries. Considering there are already genuine rebates (i.e., negotiated discounts) under several such programs, this latest demand from the government for being able to sell to a huge segment of the entire consumer drug market in the U.S. seems more like a mandatory extraction than a voluntary refund.

2) The money collected from these "rebates" don't wind up in the actual consumers' pockets or the various Part D plans; instead they go to a fund that will defray certain government Medicare program costs. A "rebate" as is commonly understood is something that the consumer of product receives after purchase. This "rebate" is nothing of the kind, and represents deceptive terminology.

3) The "rebate" is based on a percentage of price per unit, a lot like the way some excise taxes on products such as some tobacco items work.

4) This "rebate" will in essence squeeze the price bubble somewhere else. Either other Part D beneficiaries get stuck with higher premiums, people in private, non-Medicare plans pay higher prices for their drugs, or drug development and access gets scaled back, or even voluntary discounts start to dry up.

For a good summary of how this could happen, as well as some previous CBO work on this topic. I'd suggest the following link at American Action Forum, which former CBO Director Douglas Holtz-Eakin serves at:
Sepp from his first followup:
I didn't see a feature yet on your site so I thought I'd send you a couple other good links to commentaries that discuss the rebate scheme: 

Yes, there are several groups like ours (AEI, Galen Institute, American Action Forum) who share concern that this proposal amounts to a tax.
And from Sepp's second followup, registering his apparent incredulity at PolitiFact's ruling:
1) "There's nothing in the proposal that calls this a tax and experts we visited say rebates like the one in Medicaid never have been called taxes." I don't know who the experts you consulted are or whatever policy agendas they may have, but here are people in the health policy field who agree with the ad's contention that the rebate proposal is best described as a tax.
Sepp gave four examples then moved to his second point:
2) In another email you had asked, "There's nothing in the proposal that calls this a tax." My answer: well, of course not! Supporters call this a rebate so they can raise revenues for the federal government without branding their scheme a tax and having to answer a lot of inconvenient questions about it. Just because they don't want to call it a tax doesn't mean it won't function like one (see above). That's exactly the point of our ad, and our mission for the past 42 years -- exposing attempts by the political class to cover up a proposal that walks, talks, and hurts like a tax by calling it something else.
Contrast Sepp's argument with PolitiFact's conclusion:
We see how the Obama proposal could be judged a nearly mandatory give-back in that drug companies that decline to give rebates would do so at risk to their bottom lines. It also makes sense that drug companies wouldn’t swallow the costs of the rebates; they’re not free.

Then again, contrary to the ad's statement, there’s no evidence low-income Medicare beneficiaries would pay a 23 percent "tax." And all told, Obama's urged rebate remains that--money paid in return for a purchase or action/opportunity. One would have to connect more dots to make it a tax. We rate the group’s statement False.
What dots require connecting, other than having the term "tax" appear in the text of the bill to describe the rebate?  Good luck finding the answer in the story.  I couldn't.  It's hard to know when one has met a secret standard, and apart from the absurd standard of requiring the bill itself to describe the rebate as a "tax," it's hard to see what would satisfy PolitiFact.

One additional brickbat for PolitiFact:  Where is the full context of the ad?  If there's some reason for not giving readers a copy of the ad to look at, the readers deserve to know what it is.

Wednesday, November 23, 2011

Bewz Newz 'n' Vewz: "Total Clusterfact: Sorting out Solyndra"

PFB associate Jeff Dyberg has posted a magnum opus questioning how PolitiFact Florida could reach its finding of "Mostly False" that President Obama's administration extended half a billion in loans to its friends at Solyndra.

No, really:


A snippet of Jeff's take from his blog Bewz Newz 'n' Vewz:
PolitiFact reviews an Americans for Prosperity ad and helpfully specifies what they're going to sort out the truth of:
We decided to fact-check the ad, focusing on whether the president gave "half a billion in taxpayer money to help his friends at Solyndra, a business the White House knew was on the path to bankruptcy." 
They can't screw this one up, can they? Multiple media reports have shown beyond dispute that Obama donors are closely tied to Solyndra, and also that the White House was aware of Solyndra's problems prior to the loan. So just how bad did PolitiFact flub this rating?
Jeff provides plenty of evidence showing the PolitiFact bloodhounds all over the trail without picking up the scent.   Apparently, it's plenty of correlation without any hint of causation.

It's recommended reading.

Monday, November 21, 2011

Hope 'n' change at PolitiFact

Crossposted from Sublime Bloviations

I keep hoping that criticism will influence positive change at PolitiFact, the fact checking arm of the St. Petersburg Times (soon changing its name to the Tampa Bay Times).

Well, a positive change occurred at PolitiFact recently.

Unfortunately, it was of the "one step forward, two steps back" variety.

For some time I've carped about PolitiFact's inconsistent standards, and in particular its publishing of two different standards for its "Half True" position on the "Truth-O-Meter."

The recent change probably stemmed from a message I sent to an editor at the paper's city desk (sent Nov. 9):
PolitiFact has created a problem for itself through inconsistency.  During the site's earlier years a page called "About PolitiFact" gave information about how the "Flip-O-Meter" and the "Truth-O-Meter" supposedly operate.  The page includes a description of each of the "Truth-O-Meter" rating categories.

More recently, editor Bill Adair posted an item called "Principles of PolitiFact and the Truth-O-Meter."  The problem?  The definition for "Half True" is different than the one PolitiFact posted for well over a year prior.  Compounding the problem, PolitiFact has kept both versions online through now.

1)  The statement is accurate but leaves out important details or takes things out of context.
2)  The statement is partially accurate but leaves out important details or takes things out of context.

I'll be interested to see the eventual remedy.  Which items over PolitiFact's history went by which definition? Was a change made in Feb. 2011 or before without any announcement?  How can PolitiFact legitimately offer report cards and "Truth Index" ratings if the grading system isn't consistent?  Those are questions I'd imagine readers would have if they realized PolitiFact is using two different definitions for the same rating.  I don't expect you to answer them for my sake (not that I would mind if you did). 

Good luck to all sorting this one out.
The eventual remedy is apparently to simply change the longstanding definition at "About PolitiFact" to match the newer one at "Principles of PolitiFact and the Truth-O-Meter" without any fanfare--indeed, without any apparent notice whatsoever.  I detect no admission of error at all and no acknowledgment that PolitiFact changed its standard.

The move seems consistent with the desire of the mainstream press to avoid doing things that "undermine the ability of readers, viewers or listeners to believe what they print or broadcast."

Sadly, I'm not at all surprised.

On the positive side, the definitions are now consistent with one another.

On the negative side, PolitiFact either created the illusion that past "Truth-O-Meter" ratings used the old definition or the fresh illusion that past ratings follow the new one.  And it went about it in about the least transparent way possible.


Good luck to PolitiFact retroactively changing the dozens (perhaps hundreds) of places on the Web that republished the original definition of "Half True."


Update:

Contact PolitiFact Wisconsin.  They haven't gotten the memo yet.  And PolitiFact Texas has the same problem.

It's not the crime, it's the coverup.

Update 2:

It's also worth remembering PolitiFact's agonizing decision to change "Barely True" to "Mostly False."

"It is a change we don't make lightly," wrote Bill Adair.

How do you like that?  A change in the wording of a rating gets a reader survey prior to the change and an article announcing the change.  A change in the definition of a rating--a much more substantial change--gets the swept-under-the-rug treatment.

11/22/11: Added PFB link in Update 2. -Jeff

Wednesday, November 16, 2011

Media Trackers' PolitiFact series

Recently the media watchdog Media Trackers published a five-part series on PolitiFact.

Intro: Media Trackers Announces Series on PolitiFact
Part 1: PolitiFact and the Political Parties
Part 2: PolitiFact and Third-Party Organizations
Part 3: PolitiFact and Talk Radio
Part 4: PolitiFact and Governor Scott Walker
Part 5: Conclusion on PolitiFact 

We were unimpressed with the start of the series, but by the conclusion Media Trackers reached solid ground.

Part 1

Concern over the direction of the series started early:
On the whole, PolitiFact can’t be called completely biased towards conservatives or liberals. By Media Trackers count, PolitiFact has devoted nearly equal ink to conservative/Republican statements as to liberal/Democrat.
Comparing the number of stories devoted to each party tells nothing of ideological slant.  PolitiFact, if it was so inclined, could set a quota of 50 Republican stories and 50 Democrat stories and then proceed to write every single one of them with a liberal bias.

The remainder of Part 1 built a comparison between PolitiFact Wisconsin's treatment of state Republican Party statements and those of its Democratic Party counterpart.  The sample was very small (11 statements combined), but it suggested that PolitiFact fixed its editorial focus more on the Democratic Party and doled out harsher ratings.

Part 2

The second installment focused on the treatment of what Media Trackers calls "third party" organizations.  That is, political action groups not directly associated with the political parties.

Media Trackers noted a trend opposite the one from Part 1, though the two mini-studies share the problem of small sample size.  The conclusion of the second part found Media Trackers on top of a live spoor:
(D)oes PolitiFact lead readers to believe that conservative third-party organizations are less likely to tell the truth? How come the organization that spent the most on negative advertising in the recall elections had just one statement reviewed? Why more scrutiny to Pro-Life groups than Pro-Choice? Why were One Wisconsin Now’s statements reviewed four times more than the MacIver Institute? And what about statements on critical stories such as the denial by Citizen Action of Wisconsin of a connection to Wisconsin Jobs Now!? Why did PolitiFact choose not to tackle that statement?

No one expects PolitiFact to be the “be all end all” of watchdog journalism. But when they set themselves up as the judge and jury for all political statements in the state, one has to question how they select stories and why certain groups receive far and away more scrutiny than others.
In other words, the selection bias problem at PolitiFact is pretty obvious.

Part 3

Part three looked at PolitiFact Wisconsin's treatment of local radio personalities and established Media Trackers' modern-day record for small sample size.  Conservative Charlie Sykes received two ratings while fellow conservative Mark Belling received one.  All three ratings were of the "Pants on Fire" variety.  Again, it smells like selection bias.

Part 4

The fourth installment examined PolitiFact's treatment of Republican governor Scott Walker.

Media Trackers forgave PolitiFact for rating a high number of Walker's statements because of his position of power.  Time will reveal the reliability of that measure.

The Media Trackers analysis noted that PolitiFact appeared to go a bit hard on Walker:
It seems that PolitiFact’s burden for truth is a bit higher for Governor Walker than it is for others. Given the “lightening [sic] rod” status of Walker, it certainly seems a bit disingenuous to call the Governor’s claim that Wisconsin is “broke” a false claim because he could just layoff workers and raise taxes to fix the deficit. And to say that Walker did not campaign on the reforms found in the Budget Repair Bill is also disingenuous given that the Governor spoke on a number of the reforms he sought, even though he did not spell out the eventual changes to collective bargaining.
The anecdotes can add up.

Part 5

Media Trackers seized on the common thread in its conclusion:
As Media Trackers has shown with this series, PolitiFact arbitrarily applies its scrutiny. Statements from the Democratic Party of Wisconsin have been evaluated seven times to the Republican’s two. Conservative Club For Growth have been examined seven times (three during the recall elections) while We Are Wisconsin was examined just once. Pro-Life groups have been scrutinized twice and never a Pro-Choice group.

Each of these political groups and officials are putting out an equal number of statements on a myriad of issues every day. If PolitiFact intends to claim the mantle of watchdog journalism by “calling balls and strikes” in the name of “public service,” PolitiFact needs more transparency about how they select their stories and a review of why certain groups and individuals receive more scrutiny than others.
Sample sizes aside, Media Trackers settles on a conclusion well supported by a huge mass of anecdotal material collected by others.  The final installment also refers to Eric Ostermeier's study pointing out PolitiFact's selection bias problem (highlighted at PolitiFact Bias here).

Though the Media Trackers conclusion about PolitiFact isn't exactly groundbreaking, the outfit deserves credit for overcoming its initial stumble and conducting an independent examination of its local PolitiFact franchise, with a conclusion supported by its own data.

Jeff adds: It's worth mentioning that PolitiFact Wisconsin is by far the most frequent target of accusations of right-wing bias. We've never found anything that sufficiently corroborates those claims and Media Trackers seems to do a capable job of dispelling that myth.

Wednesday, November 9, 2011

Sublime Bloviations: "Grading PolitiFact: Alan Hays, proof of citizenship and voting"

Could the cure for world hunger be as simple as picking the low-hanging fruit from PolitiFact? Sometimes it seems that way.

PFB editor Bryan White was quick to spot the latest gaffe from our facticious friends. Check out PolitiFact Florida's rating of state Senator Alan Hays (R-Umatilla):


Now check out what Hays actually said:
"...I'm not aware of any proof of citizenship necessary before you register to vote."
Bryan notes:
If words matter then we should expect PolitiFact to note the difference between saying one does not know of a requirement and saying that no requirement exists.
If PolitiFact was just your average bucket of hackery, there wouldn't be much more to say other than they distorted Hays' quote. But our site wasn't created because PolitiFact is average. They take distortion to new heights.

Bryan goes on to expose the flim-flammery of how they eventually found Hays Mostly False for something he didn't say (which seems to be a common theme for them). It's impressive to witness the amount of work it takes to get something so wrong.

And for those of you keeping track, it includes yet another example of PolitiFact citing non-partisan, objective Democrats as experts.

So make sure to head over to Sublime Bloviations, because this one is a must read.

Bryan adds:  Not only did PolitiFact rate Hays on a statement he did not make, the rating of what he didn't say is also wrong.  PolitiFact continues to amaze.

Thursday, November 3, 2011

Reason: "PolitiFact Gets High-Speed Rail Facts in Florida Wrong"

Given the recent news about California's impressive high-speed rail cost overruns, it seems like a good time to call attention to Reason's pushback against PolitiFact's defense of the high-speed rail system proposed for Florida.

The chief evidence of bias comes from PolitiFact's attempt to discredit Reason on ideological grounds--an intriguing move for an organization known to uncritically cite Guttmacher Institute studies when fact checking claims by abortion opponents.  The Guttmacher Institute, of course, is ideologically attached to Planned Parenthood.

Most of PolitiFact's criticisms of the study promoted by Reason were quite weak, such as pointing out that the study's cost-overrun data did not come exclusively from rail projects.  While that's true, the cost overruns were greater for rail projects, so the supposed problem actually made rail look perhaps better than it deserved.

The key point of dispute concerns the responsibility for costs if the project stays in the red.  PolitiFact argued that Florida's project provided adequate protections.  Reason argues the reverse:
When Gov. Scott was making his rail decision, he knew that if Florida had taken federal money for the Tampa-to-Orlando high-speed rail system, one of the federal government’s rules clearly says that a state government can’t take the construction money and then stop operating the project it has accepted the money for. Under long-standing federal rules, the state would have to repay the federal grant money—in this case, $2.4 billion. If it didn’t repay the $2.4 billion, Florida’s taxpayers would be forced to keep the train running —at a loss— and be on the hook for the future operating subsidies. The U.S. Department of Transportation did send notice that it would negotiate over its repayment rule, but only after Gov. Scott had already announced his decision to turn down the federal money.
I'll admit I'm not familiar with the cited rule, but it's easy in principle to imagine it exists.  It could have helped Reason's case to include more information about it.

On the whole, Reason makes a pretty good case that PolitiFact failed to settle the issue.

Matthew Hoy: "You guys screwed up"

Ordinarily we highlight Matthew Hoy's criticisms of PolitiFact via the posts at his blog, Hoystory.  But this time we catch Hoy at his pithy best, blasting PolitiFact over at Facebook for its "Pants on Fire" rating of Herman Cain's supposed claim that China is trying to develop nuclear weapons.  PolitiFact took Cain to mean China was developing nuclear weapons for the first time, you see.

You guys screwed up. Congratulations. Read the whole context (which you provide) and it's ambiguous -- he very well may be referring to nuclear-powered AIRCRAFT CARRIERS -- which they don't have yet. Also, during Vietnam, Cain was working ballistics for the Navy, studying the range and capabilities of China's missiles. He knew they had nukes. It was inartfully said. Not a mistake. According to your own rules, you don't fact check things like this: "Is the statement significant? We avoid minor "gotchas"’ on claims that obviously represent a slip of the tongue."
That about says it all, but I'll just add one helpful informational link.

Given the ambiguity of Cain's statement, it speaks volumes about PolitiFact's ideological predisposition that no attempt was made to interpret Cain charitably.

Wednesday, November 2, 2011

Grading PolitiFact: Joe Biden and the Flint crime rate

(crossposted from Sublime Bloviations with minor reformatting)

To assess the truth for a numbers claim, the biggest factor is the underlying message.
--PolitiFact editor Bill Adair

The issue:
(image of the PolitiFact item clipped from the original; not shown)

The fact checkers:

Angie Drobnic Holan:  writer, researcher
Sue Owen:  researcher
Martha Hamilton:  editor


This PolitiFact item very quickly blew up in their faces.  The story was published at about 6 p.m. on Oct. 20.  The CYA was published at about 2:30 p.m. on Oct. 21, after the Washington Post and others published parallel items very critical of Biden.  PolitiFact rated Biden "Mostly True."

First, the context:

(my portion of transcript in italics, portion of transcript used by PolitiFact highlighted in yellow):

If anyone listening doubts whether there is a direct correlation between the reduction of cops and firefighters and the rise in concerns of public safety, they need look no further than your city, Mr. Mayor.  

In 2008--you know, Pat Moynihan said everyone's entitled to their own opinion, they're not entitled to their own facts.  Let's look at the facts.  In 2008 when Flint had 265 sworn officers on their police force, there were 35 murders and 91 rapes in this city.  In 2010, when Flint had only 144 police officers the murder rate climbed to 65 and rapes, just to pick two categories, climbed to 229.  In 2011 you now only have 125 shields.  

God only knows what the numbers will be this year for Flint if we don't rectify it.  And God only knows what the number would have been if we had not been able to get a little bit of help to you.

As we note from the standard Bill Adair epigraph, the most important thing about a numbers claim is the underlying message.  Writer Angie Drobnic Holan apparently has no trouble identifying Biden's underlying message (bold emphasis added):
If Congress doesn’t pass President Barack Obama’s jobs plan, crimes like rape and murder will go up as cops are laid off, says Vice President Joe Biden.

It’s a stark talking point. But Biden hasn’t backed down in the face of challenges during the past week, citing crime statistics and saying, "Look at the facts." In a confrontation with a conservative blogger on Oct. 19, Biden snapped, "Don’t screw around with me."
No doubt the Joe Biden of the good "Truth-O-Meter" rating is very admirable in refusing to back down.  The "conservative blogger" is Jason Mattera, editor of the long-running conservative periodical "Human Events."  You're a blogger, Mattera.  PolitiFact says so.

But back to shooting the bigger fish in this barrel.

We looked at Biden’s crime numbers and turned to the Federal Bureau of Investigation's uniform crime statistics to confirm them. But the federal numbers aren’t the same as the numbers Biden cited. (Several of our readers did the same thing; we received several requests to check Biden’s numbers.)

When we looked at the FBI’s crime statistics, we found that Flint reported 32 murders in 2008 and 53 murders in 2010. Biden said 35 and 65 -- not exactly the same but in the same ballpark.
Drobnic Holan initially emphasizes a fact check of the numbers.  Compared to the FBI numbers, Biden inflated the murder count for both 2008 and 2010, and his inflated figures in turn inflate the percentage increase by about 20 percentage points (from roughly 66 percent to roughly 86 percent--an overstatement of about 30 percent).  So it's a decent-sized ballpark.
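For readers who want to check the arithmetic themselves, here is a quick sketch (in Python, purely for illustration) using the murder counts quoted above:

```python
# Percentage increase in Flint murders: FBI (UCR) figures vs. the figures Biden cited.
fbi_2008, fbi_2010 = 32, 53        # FBI uniform crime statistics
biden_2008, biden_2010 = 35, 65    # numbers Biden used

fbi_pct = (fbi_2010 - fbi_2008) / fbi_2008 * 100          # FBI-based increase
biden_pct = (biden_2010 - biden_2008) / biden_2008 * 100  # Biden-based increase

print(f"FBI: +{fbi_pct:.1f}%  Biden: +{biden_pct:.1f}%  gap: {biden_pct - fbi_pct:.1f} points")
# → FBI: +65.6%  Biden: +85.7%  gap: 20.1 points
```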

For rapes, though, the numbers seemed seriously off. The FBI showed 103 rapes in 2008 and 92 rapes in 2010 -- a small decline. The numbers Biden cited were 91 rapes in 2008 and 229 in 2010 -- a dramatic increase.
If inflating the percentage increase in murders by 27 percentage points is not a problem for Biden then this at least sounds like a problem.

After going over some other reports on the numbers, and a surprising acknowledgment that little evidence suggests Obama's jobs bill would address the number of police officers in Flint, PolitiFact returns to the discrepancy between the numbers:
(W)e found that discrepancies between the FBI and local agencies are not uncommon, and they happen for a number of reasons. Local numbers are usually more current and complete, and local police departments may have crime definitions that are more expansive than those of the FBI.
All this is very nice, but we're talking about the city of Flint, here.  We don't really need current stats for 2008 and 2010 because they're well past.  Perhaps that affects the completeness aspect of crime statistics also; PolitiFact's description is too thin to permit a judgment.  As for "expansive" definitions, well, there's a problem with that.  Biden's number of rapes in 2008 is lower than the number reported in the UCR (FBI) data.  That is a counterintuitive result for a more expansive definition of rape and ought to attract a journalist's attention.

In short, even with these proposed explanations it seems as though something isn't right.

Flint provided us with a statement from Police Chief Alvern Lock when we asked about the differences in the crime statistics, particularly the rape statistics.

"The City of Flint stands behind the crime statistics provided to the Office of The Vice President.  These numbers are an actual portrayal of the level of violent crime in our city and are the same numbers we have provided to our own community. This information is the most accurate data and demonstrates the rise in crime associated with the economic crisis and the reduced staffing levels.

"The discrepancies with the FBI and other sources reveal the differences in how crimes can be counted and categorized, based on different criteria." (Read the entire statement)
This is a city that's submitting clerical errors to the FBI, and we still have the odd problem with the rape statistics.  If the city can provide numbers to Joe Biden then why can't PolitiFact have the same set of numbers?   And maybe the city can include stats for crimes other than the ones Biden may have cherry-picked?  Not that PolitiFact cares about cherry-picked stats, of course.

Bottom line, why are we trusting the local Flint data sight unseen?

PolitiFact caps Biden's reward with a statement from criminologist and Obama campaign donor James Alan Fox of Northeastern University to the effect that Biden makes a legitimate point that "fewer police can translate to more violent crime" (PolitiFact's phrasing).  Fox affirms that point, by PolitiFact's account, though it's worth noting that on the record Biden asserted a "direct correlation" between crime and the size of a police force.  The change in wording seems strange for a fact check outfit that maintains that "words matter."

The conclusion gives us nothing new other than the "Mostly True" rating.  Biden was supposedly "largely in line" with the UCR murder data for Flint.  His claim about rape apparently did not drag down his rating much even though PolitiFact admittedly could not "fully" explain the discrepancies.  PolitiFact apparently gave Biden credit for the underlying argument that reductions in a police force "could result in increases in violent crime" despite Biden's rhetoric about a "direct correlation."

The grades:

Angie Drobnic Holan:  F
Sue Owen: N/A
Martha Hamilton:  F

This fact check was notable for its reliance on sources apparently predisposed toward the Obama administration and its relatively unquestioning acceptance of information from those sources.  The Washington Post version of this fact check, for comparison, contacted three experts to PolitiFact's one and none of the three had an FEC filing indicating a campaign contribution to Obama.

And no investigation of whether Biden cherry-picked Flint?  Seriously?  See the "Afters" section for more on that as well as commentary on PolitiFact's CYA attempt.