Wednesday, July 27, 2011

"It is a change we don't make lightly"

(crossposted from Sublime Bloviations)

Big news came down from PolitiFact today.  The fact-checking organization will change its "Barely True" rating on its "Truth-O-Meter" to "Mostly False."

PolitiFact editor Bill Adair explains:
Today, Barely True becomes Mostly False.

It is a change we don't make lightly. The Truth-O-Meter has been the heart of PolitiFact since we launched the site four years ago, and we were reluctant to tinker with it.
If the powers that be at PolitiFact show such reluctance over this minor cosmetic change, it sends a strong signal that PolitiFact will remain unwilling to make the substantial changes needed to shed its ongoing branding as the mainstream cousin of the partisans at Media Matters.

And I can't help but react with a wry smile when PolitiFact makes this change while leaving intact a long-running discrepancy in the descriptions of its ratings.  The "Principles of PolitiFact and the Truth-O-Meter" page describes "Half True" as  "The statement is partially accurate but leaves out important details or takes things out of context."  At "About PolitiFact" another version of "Half True" reads "The statement is accurate but leaves out important details or takes things out of context."

Between the conflicting definitions of "Half True" and a name change from "Barely True" to "Mostly False" that leaves the definition untouched, which is the more significant issue?

But don't look for PolitiFact to reconcile its differing definitions with any fanfare.  There is still a reputation to protect.  Look for PolitiFact to (again) break the pledge it made about what it would do when it makes a mistake:
We strive to make our work completely accurate. When we make a mistake, we correct it and note it on the original item. If the mistake is so significant that it requires us to change the ruling, we will do so.

Sunday, July 24, 2011

Relevant: "Left Turn: How Liberal Media Bias Distorts the American Mind"

Power Line blog has published excerpts from a new book, "Left Turn: How Liberal Media Bias Distorts the American Mind."  We find the material helpful in explaining the liberal bias manifest at PolitiFact.

Find the excerpts here:

Part 1 (the Introduction)
Part 2 (Foreword)
Part 3
Part 4
Part 5

The book's author, Tim Groseclose, co-authored a study of media bias back in 2004, perhaps the first study to succeed at objectively measuring media bias in practice.

From the report:
We measure media bias by estimating ideological scores for several major media outlets. To compute this, we count the times that a particular media outlet cites various think tanks and policy groups, and then compare this with the times that members of Congress cite the same groups. Our results show a strong liberal bias: all of the news outlets we examine, except Fox News’ Special Report and the Washington Times, received scores to the left of the average member of Congress.
It's worth noting that Republicans held a majority in Congress for much of the period the study covers.
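The citation-matching idea described in the excerpt can be sketched in a few lines. This is a deliberately simplified toy, not the actual Groseclose-Milyo estimator (which fits a maximum-likelihood model), and every number below is made up for illustration:

```python
# Toy sketch of citation-based ideology scoring: score each think tank by the
# average ideology score of the legislators who cite it, then score a media
# outlet by the citation-weighted average of the tanks it cites.
# All names and numbers are hypothetical.

# Hypothetical ideology scores (higher = more liberal) attached to each
# think tank, derived from the members of Congress who cite it.
tank_scores = {
    "Tank A": 80.0,   # cited mostly by liberal members
    "Tank B": 20.0,   # cited mostly by conservative members
    "Tank C": 50.0,   # cited evenly by both sides
}

# Hypothetical citation counts for one media outlet.
outlet_citations = {"Tank A": 30, "Tank B": 5, "Tank C": 15}

def outlet_score(citations, scores):
    """Citation-weighted average of think-tank ideology scores."""
    total = sum(citations.values())
    return sum(scores[t] * n for t, n in citations.items()) / total

score = outlet_score(outlet_citations, tank_scores)
print(score)  # 65.0 -- left of a hypothetical congressional average of 50
```

An outlet scoring above the average member of Congress on such a scale is what the study describes as "to the left of the average member of Congress."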

The Chapter 8 emphasis on Katherine Kersten at the Minneapolis Star-Tribune reinforces one of the repeated emphases here at PFB:  Regardless of any conscious attempt to bias the news, journalists tend to lean left.  The left tilt skews story selection (selection bias) and the factual emphasis in the story.  People who think like Kersten, when their presence is known, make up a type of alien minority population in newsrooms.

Groseclose's book is very likely to make reading lists at PFB.

Saturday, July 23, 2011

Jim Lakely: "The PolitiFact ‘Lie of the Year’ Is the Lie of the Year About Obamacare"

It's always a good time to review the perfidious PolitiFact treatment of the Democrats' attempted government takeover of healthcare.

This Dec. 2010 item from a Heartland Institute blog ("Somewhat Reasonable") by Jim Lakely makes its point largely through items we have already highlighted, but it's worth a read on its own:
PolitiFact holds itself up as an objective arbiter of “truth” and “lies” in America’s political discourse. But, like any organ played by the MSM, this project of The St. Petersburg Times is inclined to succumb to institutional liberal bias. PolitiFact’s “Lie of the Year” award for 2010 is a great example.
Lakely's title may rate as the finest in its specific genre.

Wednesday, July 20, 2011

RedState: "Politifact’s Review of Josh Trevino: Mostly Hackery"

Red State, thanks to Leon H. Wolf, has another excellent criticism of a PolitiFact fact check.

Wolf takes PolitiFact Texas to task over its rating of RedState Co-Founder Josh Treviño. Treviño cited a poll on an MSNBC program. PolitiFact subjected (pun intended) the statement to its Truth-O-Meter.

And that gets Wolf to wondering:
Politifact was forced to concede that Trevino’s characterization of the poll showing a plurality opposed to raising the debt ceiling was 100% correct and accurate. So what caused them to rate Trevino’s remarks as “mostly true” instead of “completely and entirely true”?
I'll supply a bit more context than did Wolf in summing up PolitiFact's complaint (he quotes only the latter paragraph):
Treviño’s other point — that Americans favor mostly budget cuts to deal with the deficit — didn’t poll as neatly as his recap suggests.

Asked how they’d prefer members of Congress to address the deficit, 20 percent said only by cutting spending and another 30 percent said mostly with spending cuts. Four percent favored solely tax increases, while 7 percent said they’d prefer to tackle the deficit mostly by tax hikes.
Wolf notes that the 20 percent and 30 percent figures add up to exactly the plurality Treviño describes.  So Treviño's numbers and underlying argument both stand as "True."  Yet contrast the rating of Treviño with a "Mostly True" PolitiFact rating of President Obama using a similar set of figures:
Getting back to Obama's statement, he said, "You have 80 percent of the American people who support a balanced approach. Eighty percent of the American people support an approach that includes revenues and includes cuts." Even the best poll doesn't show support quite that high -- he would more accurately have accounted for the small numbers that support only tax increases or were unsure, putting the number at 70 percent. But his overall point is correct that polls show most Americans support a balanced approach when given a choice between cutting spending or raising taxes. So we rate his statement Mostly True.
The president, using the most favorable numbers, therefore inflates his figure by 14 percent (10 percentage points).  And the president leaves at least as much context unstated as did Treviño.  Treviño arguably left out nothing of importance.
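The arithmetic behind the two comparisons above checks out in a few lines (a quick sketch using the poll figures as quoted in the PolitiFact excerpts):

```python
# Poll breakdown as quoted in the PolitiFact excerpt above.
only_cuts = 20     # percent favoring only spending cuts
mostly_cuts = 30   # percent favoring mostly spending cuts

# Trevino's plurality: respondents preferring all or mostly cuts.
plurality = only_cuts + mostly_cuts  # 50 percent

# Obama's claim vs. the best figure the poll supports.
claimed = 80
supported = 70

point_inflation = claimed - supported             # 10 percentage points
relative_inflation = point_inflation / supported  # ~0.143

print(plurality, point_inflation, round(relative_inflation * 100))
# prints: 50 10 14
```

Ten points on a base of 70 is roughly a 14 percent overstatement, which is the figure used above.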

Wolf (bold emphasis added):
Memo to Politifact: the fact that a poll contains additional information that Trevino did not discuss does not make his statement less than entirely truthful. For example: if Trevino had been discussing the latest poll of the Republican caucus in Iowa and had claimed (correctly) that “Bachmann leads Romney 32%-29%,” his statement would not be rated merely “mostly true” because he did not disclose that Pawlenty was at 7%, Santorum at 6%, etc. Trevino by his own statement wasnt’ (sic) discussing the people who wanted the deficit solution split roughly down the middle, he was discussing people who favored “mostly cuts” versus “mostly taxes,” and his statement was (and should have been scored) completely correct.
Treviño used the poll data responsibly and accurately.  The president didn't.  If Treviño is at fault for failing to point out that a plurality are open to additional revenue/tax increases then isn't the president at fault for failing to mention the plurality who favor more reliance on budget cuts than on tax increases?  Yet PolitiFact mentions only Treviño's supposed omission.  The president gets a pass.

Do both men deserve the same grade, PolitiFact?  Seriously?

Sunday, July 17, 2011

Sublime Bloviations: "PolitiFlub: 'Effortless promise-keeping by the president'"

Sometimes, PFB editor Bryan White laments, he can't believe his eyes when he reads various PolitiFact stories. Thus begins his latest review of PolitiFact's most recent "Obameter" rating.

PolitiFact's latest gift to our 44th president came in the form of the coveted "Promise Kept" rating regarding Obama's campaign promise to "...establish a 10 percent federal Renewable Portfolio Standard (RPS) to require that 10 percent of electricity consumed in the U.S. is derived from clean, sustainable energy sources, like solar, wind and geothermal by 2012."

With Cap and Trade legislation bogged down and unlikely to pass, how did the Commander in Chief manage to fulfill his promise to require greater reliance on green energy sources? PolitiFact intern David G. Taylor lays out how the campaign commitment was committed:
We spoke with the [sic] Christina Kielich of the U.S. Department of Energy press office. She told us that the United States receives approximately 11 percent of its electricity from renewable sources. This breaks down to about 6 percent from hydroelectricity, 3 percent from wind, and approximately 1% each from solar, biomass, and geothermal. Thus, in 2011 - one year head [sic] of Obama's promise, the United States has already reached more than the 10 percent renewable level.
Holy Pulitzer Prize, Batman! Not only is Obama actually exceeding the 10% he campaigned on, he's a full year ahead of schedule!

But faster than you can say "non-partisan", Bryan points out the flaw:
In case it isn't clear what is going on here, Taylor is substituting a new promise for the old promise. The old promise was that the president would require 10 percent of U.S. energy to come from renewable sources. The new promise is that the U.S. will produce at least 10 percent of its energy from renewable sources. The latter promise is a tad like my personal promise that the sun will come up tomorrow. When the sun appears, my promise is kept. Did I do anything to help it along? Not at all.

As bad as it would be to credit the president with keeping a promise which required nothing of him, the real problem stems from the fact that Obama's promise was one of action. He would establish a requirement. Taylor's story provides no evidence of the establishment of any sort of requirement.
In other words, the amount of energy the U.S. is currently producing, or acquiring, from renewable sources is irrelevant to and independent of Obama's promise to create a renewable energy standard.

Longtime PolitiFact followers might note that this is hardly the first time they've granted Obama a positive grade for something he never actually did. Back in 2009 they rated a chain email "Half True" for the claim that Obama "closed off shore tax safe havens." In that rating, Obama had merely proposed legislation that would close offshore tax havens. As the rating noted:
Although the legislation enjoys the support of the White House, it is likely to face strong opposition from corporations that do considerable business overseas....In other words, it's premature to put this one in the "Obama Accomplishments" column.
But why let a little technicality like actually being enacted stop you from putting it in the "Obama Accomplishments" column anyway? Besides, it's not like he's a Republican:
Just as we rate Obama"s [sic] promises kept only when they were passed by Congress and signed into law, we will rate Republican promises not just on whether they pass the House, but whether they are ultimately enacted.
Had Taylor simply followed PolitiFact's own guidelines, he would have realized his Promise Kept rating was dubious just by looking up the date Obama's 10% renewable energy source requirement was "passed by Congress" and "ultimately enacted". Instead, we're left with a gross double standard, and another solid example of PolitiFact's war on objectivity.

Bryan has more issues with the rating that I didn't get into here. As always, read the whole article.

Saturday, July 16, 2011

Red State: "PolitiFact or PolitiSpin?"

Red State blogger "Flagstaff" published a survey of PolitiFact's fact checking in early July.  Though the survey was limited in scope and lacked any apparent scientific construction, the conclusion is solid:
In the end, we can’t trust a newspaper service to grade the truthfulness of politicians for us.  The grades turn on the political bias of the paper, and you can imagine where that is.  We can’t simply believe claims that they’re non-partisan; we must make them prove it by what they write, then do our own evaluation anyway, based on whether what they say makes sense or not.
Flagstaff made a valuable addition in the subsequent commentary thread:
The bias mostly seems to present by nit-picking at petty mistakes of the right, insisting on strict definitional usage of words, finding fault with what is NOT said, and glossing over major errors from the left, supplying exculpatory explanations for obvious mistakes, allowing broad interpretation of leftist words and their intent, basically behaving exactly as the MSM does every day
Yeah.  That.

Visit Red State to read it all.

Monday, July 11, 2011

PJ Tatler: "PolitiFact or PolitiFAIL? MSM “fact checker” refutes itself on Romney’s Obama debt claim" (Updated)

Patrick Poole of Pajamas Media's PJ Tatler feature shows what happens when GOP figures turn PolitiFact-approved factoids into GOP talking points.

Mitt Romney's campaign did exactly that with a fact-checked claim originally from Eric Cantor (R-Va.).

When PolitiFact went to fact check Romney, it ended up fact checking its own work and finding it wanting.

Poole summarizes:
So we find that “PolitiFact” changed the basis for their assessment once Republicans started using their own assessment to beat up their boyfriend Barack. So will they start giving out ratings of their own posts?
Visit PJ Tatler for all the gory details.

The evidence of ideological bias would improve if the comparison wasn't between two Republicans, but Poole makes a good point about PolitiFact fact checking itself.  PolitiFact is building a record of wild inconsistency, and it appears to break predominantly against conservatives.


Update: Changed Tatler links to put the focus on the Tatler story rather than my added commentary.

Saturday, July 9, 2011

Michael F. Cannon: "PolitiFact Just Called, Again. I Declined to Help, Again."

Short and to the point, the Cato Institute's Michael F. Cannon reminds us that his boycott of PolitiFact continues.  As he notes in his title, PolitiFact sometimes seeks his services as an expert source.  Follow the links to see why Cannon no longer cooperates with PolitiFact.

Friday, July 8, 2011

Media Trackers: "Politifact’s Rating of ‘Half-True’ Only Tells Half The Story"

Conservative website Media Trackers recently took a look at a PolitiFact Wisconsin rating dealing with Gov. Scott Walker's budget. A liberal group, One Wisconsin Now, made the claim that Walker’s budget "includes tax breaks for corporations and the rich that will cost the state of Wisconsin taxpayers $2.3 billion over the next decade."

Media Trackers explains:
In typical fashion, Politifact rates the claim “half-true,” beating up One Wisconsin Now on some technicalities while just grazing the surface of the larger issues.
The larger issue is PolitiFact simply accepts a standard liberal talking point as an obvious fact. See if you can spot it in PolitiFact's summary:
[One Wisconsin Now's] larger point -- that the tax breaks benefit corporations and wealthier residents, rather than average taxpayers -- is generally on target, given that the largest amounts of the tax breaks go to businesses and to individuals with higher incomes.
Fret not if you missed the subtle sleight. Media Trackers tracked it down:
No benefit to the average taxpayer?

The average taxpayer would not benefit from corporations having more capital to innovate? The average taxpayer would not benefit from entrepreneurs who might relocate to Wisconsin? The average taxpayer would not benefit from the potential for more jobs in Wisconsin?

One doesn’t expect One Wisconsin Now to grasp the reasons why tax breaks for businesses and corporations are beneficial to Wisconsin as a whole. But the writers at the Milwaukee Journal Sentinel ought to recognize that it is beneficial to average taxpayers if Wisconsin’s business climate improves, that we shun the policies that gave Wisconsin the 4th highest tax burden while ranking 40th in business climate.
This is an excellent point, and exposes a flaw in PolitiFact's methods. The Wall Street Journal made this exact assertion in an article we've previously reviewed:
PolitiFact's decree is part of a larger journalistic trend that seeks to recast all political debates as matters of lies, misinformation and "facts," rather than differences of world view or principles. PolitiFact wants to define for everyone else what qualifies as a "fact," though in political debates the facts are often legitimately in dispute.
Reasonable people can disagree on how specific economic policies will affect various citizens. Those effects are not confined within the realm of quantifiable facts, and it would be nearly impossible to objectively measure them. But PolitiFact did exactly that when it determined the average taxpayer wouldn't benefit from the tax cuts in Walker's budget. This acceptance of liberal axioms blemishes PolitiFact's implicit claim to non-partisan objectivity.

Read the entire piece here. Readers can find another review of a Media Trackers critique here.

Thursday, July 7, 2011

The New Republic: "'Politifact' Unfairly Attacks The GOP"

Jonathan Chait of The New Republic blasts PolitiFact for a story unfair to Republicans. Chait's story is notable because he and TNR lean decidedly left.

Politifact slams the Republicans:
The important point in each examination is that $500 billion -- the figure confirmed by the NRSC's citations -- are not taken out of the current Medicare budget and are not actual cuts. Nowhere in the bill are benefits actually eliminated, experts said.

The $500 billion are reductions to future spending. The health care law attempts to slow the projected growth in Medicare spending by that amount over 10 years.

Medicare spending will still increase. The Congressional Budget Office estimated it will reach $929 billion in 2020, up from $499 billion in actual spending in 2009....

The NRSC’s claim cites a real figure -- $500 billion -- that is part of the health reform debate. But it incorrectly describes it as $500 billion in Medicare cuts, rather than as decreases in the rate of growth of future spending.
Sorry, this is just wrong. Indeed, it's ridiculous, and nobody should listen to Politifact on this topic.
Chait argues that for a program like Medicare, where costs are expected to spike dramatically, it is ridiculous to deny any truth to equating a flat cap with a cut.  Chait likewise argues that cutting Medicare Advantage clearly removes the Medicare Advantage benefit (a subsidy).

The only part of the TNR criticism that I don't quite get is Chait's suggestion that this error accords with a PolitiFact bias that favors cutting entitlement programs.

Why then should PolitiFact object to calling the elimination of the Medicare Advantage subsidy a cut?  Does Chait suppose that ObamaCare represents an attempt to cut entitlement programs?  On that point his argument seems strained.  The rest of it appears pretty solid.

Wednesday, July 6, 2011

When PolitiFact makes a mistake

When we make a mistake, we correct it and note it on the original item.
--Principles of PolitiFact and the Truth-O-Meter

This past Sunday, PolitiFact published yet another one of those mailbag stories--the kind where readers claim PolitiFact has made a mistake for thus-and-such a reason, and PolitiFact notes the complaint without comment and then we all move on.

But this latest one had something unusual near the start of the latter third:
(PolitiFact Editor Bill Adair responds: You're right that we have not always been consistent on our ratings for these types of claims. We've developed a new principle that is reflected in the Axelrod ruling and should be our policy from now on. The principle is that statistical claims that include blame or credit like this one will be treated as compound statements, so our rating will reflect 1) the relative accuracy of the numbers and 2) whether the person is truly responsible for the statistic.)
If an admission of inconsistently applied standards is not an admission of error, then what is it?  Yet something tells us that Adair and company will not venture into the PolitiFact archives in an effort to apply this "new principle."

The supposed "new principle" is actually an old principle inconsistently applied.  Adair himself described it in a story called "Numbers game" back in 2008:
To assess the truth for a numbers claim, the biggest factor is the underlying message.
When an ad says "Governor X was in charge while the state lost 10,000 jobs," it is sending the message that the governor was responsible.  It's not rocket science.  But now we get a "new principle."  And you folks who got burned by the failure to apply that principle?  Tough luck, most likely.  It would not look good for PolitiFact to put correction notices on any substantial number of fact check items.  Plus somebody might notice that one party received more harm than the other based on which stories received a correction.

Let's not go there.

In truth, admitting the need for a "new standard" by itself serves as an admission of one of the problems of subjectivity PFB was created to expose. If PolitiFact improves its performance then we accomplish part of our mission. Unfortunately we have little reason to expect any improvement at PolitiFact. After all, the "new standard" is really a restatement of a standard Adair proclaimed in 2008.

While PolitiFact and Adair tease us with the promise of new appropriate rating standards, we get another fact check like this one.  Try to find any mention of the underlying point.

Hurray for the new standard.