Wednesday, February 20, 2013

Reminder: PolitiFact's standards shift

While doing a triple-decker fact check of PolitiFact and the Violence Against Women Act over at Zebra Fact Check, I stumbled over yet another great example of PolitiFact's shifting standards.

Remember how PolitiFact had to rate Mitt Romney "Pants on Fire" for his claim that Jeep would build Jeeps in China because the claim supposedly implied Jeep would move all its manufacturing to China?  Of course.  We all remember that one.

But how about 2008, when then-candidate for president Barack Obama said his newly chosen running mate, Joe Biden, wrote the VAWA and that domestic violence went down dramatically?

PolitiFact found that Obama was accurate about the amount of the decrease, but the story expressed doubts about the implied cause.
Okay, so both of Obama's statements are true. Biden wrote an anti-domestic violence law, and domestic violence rates dropped dramatically.

But did one cause the other? Although Obama doesn't say it directly, that's the clear implication in his statement, and that's where things get a bit murkier. As we discussed back in 2007, when Biden was gunning for the top spot on the Democratic ticket, figuring out the reasons for changes in crime statistics can be tricky.  Multiple factors, including economic prosperity and demographic changes, contributed to an overall decline in violent crime throughout the 1990s and into this decade.
Romney, like Obama, made a true statement.  In both cases, PolitiFact saved its objections for the implied argument.

After finding a shred of evidence that the VAWA helped decrease domestic violence, PolitiFact offered its ruling on Obama's statement:
A study by the University of Arkansas, for instance, concluded in 2000 that the law's increased funding for civil legal assistance for victims contributed to the decline. Though that study also said economic and demographic factors mattered.

Obama never said the law caused all of the decrease, but he implied it, so we will rate this statement Mostly True.
"Pants on Fire" for Romney.  "Mostly True" for Obama.

Totally fair and unbiased.  Or something.

We do find one mitigating factor in PolitiFact's defense.  This story was from 2008, before PolitiFact announced in 2011 that it would give greater weight to implied claims of responsibility.  Still, the two rulings make for an impressive contrast.  Romney's "Pants on Fire" ruling helped him capture PolitiFact's ultra-subjective "Lie of the Year" award for 2012, while Obama's "Mostly True" simply helped cement the Beltway media impression that Democrats are much more truthful than Republicans.  And Obama gets to keep his "Mostly True" rating in his PolitiFact file to this day.

PolitiFact and fact checking.  Meh.

Friday, February 15, 2013

PolitiFact Oregon: Making pretzels out of PolitiFact's principles

Remember PolitiFact's principles?

No worries.  PolitiFact doesn't either.  At least not enough to update its statement of principles when editor Bill Adair adds to them.

PolitiFact originally published its statement of principles on Feb. 21, 2011.

On Jan. 25 last year, in "Tuning the Truth-O-Meter," Adair wrote:
About a year ago, we realized we were ducking the underlying point of blame or credit, which was the crucial message. So we began rating those types of claims as compound statements. We not only checked whether the numbers were accurate, we checked whether economists believed an office holder's policies were much of a factor in the increase or decrease.

We give a lot of Half True ratings because the numbers are often right, but experts repeatedly tell us that the policies of a single executive have a relatively small impact in a big and complex economy.
Going back one year from Jan. 25, 2012, we arrive at approximately January 2011, not at all far from the time PolitiFact published its statement of principles.

With the credit/blame issue missing from its statement of principles, who can blame PolitiFact Oregon for ignoring it?

We will.  Fact checkers ought to avoid inconsistency in their rulings.

PolitiFact Oregon uncritically accepts the underlying argument

PolitiFact Oregon fact checked a claim by Jeff Merkley (D-Ore.) in support of the Violence Against Women Act.
(S)upporters like Sen. Jeff Merkley, D-Ore.,  have been talking about the law’s benefits.

Here’s what Merkley said Feb. 7 during a conference call with reporters: "Since 1994 when VAWA was first passed, incidents of domestic violence have dropped more than 50 percent."

That seems to be a pretty strong selling point and as the bill moves toward a final vote in the Senate it’s something that will be repeated and emphasized during debate.
The statistic is a strong selling point for the VAWA only if the VAWA had a substantial effect on the decrease in domestic violence.  That's Merkley's underlying argument.  PolitiFact Oregon fact checks only the statistic and implicitly accepts the underlying argument without any critique at all, giving Merkley a "True" rating for his statement.

There's no question Merkley was crediting the VAWA for the change.  PolitiFact notes Merkley was "talking about the law's benefits" before breathlessly reporting that it "seems to be a pretty strong selling point."

That's the way you do the fact check if you're biased toward Merkley's point of view.  And unwilling to let your standards for fact checking get in the way.

Sunday, February 10, 2013

PolitiFact's defense spending shenanigans

Remember when PolitiFact believed that the best way to measure defense spending was as a percentage of GDP among OECD nations or world powers?

No worries. PolitiFact doesn't remember it either.

Today PolitiFact New Jersey ruled "True" a claim from Newark's Mayor Cory Booker that U.S. defense spending exceeds defense spending for "the next 10, 11, 12 countries combined."

Back in July of 2010, PolitiFact ruled "Mostly False" a claim from conservative Sarah Palin that in terms of defense spending as a percentage of GDP, the U.S. ranks No. 25 in the world.

The two rulings make up yet another classic comparison illustrating PolitiFact's liberal bias.

In Booker's case, the facts are simple.  If the U.S. spends more, as Booker said, then his statement is true.  It doesn't even matter that nobody really knows what China and Russia spend on defense, and it doesn't matter that the various nations use no standard method of defining defense spending.  We might spend more on salaries and veterans' benefits than Canada spends on guns, tanks, ships and planes.  Can we defend ourselves effectively with veterans' benefits?  Probably not.  But none of that matters to PolitiFact New Jersey.  Booker is right, so just deal with it.

In Palin's case, facts are complex.  Palin is technically right about U.S. defense spending as a percentage of GDP.
We quickly tracked down the chart from which we suspect she pulled her factoid. (Her staff didn't return our e-mail query.) It's a credible source -- the CIA World Factbook -- and, as Palin said, the U.S. does rank 25th in the world, spending an estimated 4.06 percent of GDP on defense in 2005.

Case closed? Not really.
Palin was mostly wrong because it's not fair to compare the U.S. to tiny non-industrialized nations.  In Booker's case, of course, it's fair to compare our defense spending to that of nations that deliberately under-report their defense spending.

Stepping back from agenda journalism toward reality, both Booker and Palin have a legitimate point.  Palin's is probably the better grounded in the facts, since her claim uses a figure for the U.S. that is inflated relative to nations that do not fully reveal their defense spending.  In effect, the failure of certain nations to fully report their defense spending understates Palin's point while overstating Booker's.

There's no objective justification for this disparity in ratings about defense spending.  PolitiFact New Jersey liked the point Booker was making and subjected it to less scrutiny than Palin received for her claim.  That's how mainstream media fact checking works.  It's not objective.  And PolitiFact is the worst of the lot when it comes to this type of thing.

Note:
I fisked the whole of the Palin ruling way back when at Sublime Bloviations.

Friday, February 1, 2013

Research Update: PolitiFact in 2012

We started a research project a couple of years ago, focusing on the proportion of all false statements PolitiFact places in its "Pants on Fire" category.  PolitiFact has never provided any objective criterion for categorizing a statement as "Pants on Fire," so our research hypothesizes that PolitiFact's use of the rating serves as a reasonable measure of partisan bias.

In 2012, PolitiFact gave 32.1 percent of the GOP's false statements the "Pants on Fire" designation compared to 14.3 percent for the Democratic Party's false statements.  We give that type of proportional disparity a "PoF Bias number," in this case 2.25.  The number indicates that PolitiFact was about 2.3 times more likely to give a false statement from a Republican a "Pants on Fire" rating than a false statement from a Democrat.
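For readers who want the arithmetic spelled out, here's a minimal sketch in Python of how a PoF Bias number is computed.  The rating counts below are hypothetical stand-ins chosen to reproduce the published 2012 percentages, not PolitiFact's actual tallies:

```python
# Minimal sketch of the "PoF Bias number" arithmetic.
# The counts below are hypothetical stand-ins, not PolitiFact's actual tallies.

def pof_share(pants_on_fire: int, false_only: int) -> float:
    """Fraction of all false statements ("False" plus "Pants on Fire")
    that landed in the "Pants on Fire" category."""
    return pants_on_fire / (pants_on_fire + false_only)

# Hypothetical 2012 counts chosen to reproduce the published percentages.
gop_share = pof_share(pants_on_fire=43, false_only=91)  # 43/134 ~= 32.1%
dem_share = pof_share(pants_on_fire=9, false_only=54)   # 9/63   ~= 14.3%

# The PoF Bias number: how much likelier a Republican false statement
# was to draw "Pants on Fire" than a Democratic one.
pof_bias_number = gop_share / dem_share

print(f"GOP PoF share:   {gop_share:.1%}")        # 32.1%
print(f"Dem PoF share:   {dem_share:.1%}")        # 14.3%
print(f"PoF Bias number: {pof_bias_number:.2f}")  # 2.25
```

Dividing the rounded percentages directly (32.1 over 14.3) gives about 2.24; the 2.25 above presumably reflects the unrounded shares.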

Remember that PolitiFact offers no objective means for distinguishing between the two ratings.

FALSE – The statement is not accurate.

PANTS ON FIRE – The statement is not accurate and makes a ridiculous claim.


Only during its first year did PolitiFact's use of the "Pants on Fire" rating fail to favor Democrats.  It's well worth noting that PolitiFact editor Bill Adair has since said the "Pants on Fire" rating started out as a lighthearted joke.  That explains why Joe Biden received a "Pants on Fire" rating for saying President Bush was "brain-dead."  And it may also explain the anomaly in PolitiFact's partisanship for 2007.

This year's numbers carry a couple of implications for the popular theories PolitiFact's defenders bring to bear against criticisms of the fact check organization.

The ratings reflect reality, which is biased to the left


Figuring the numbers for state operations into the total obliterates the PoF Bias number trend for PolitiFact national.  So Democrats from Tennessee don't lie more, but when they lie, look out, 'cause it's likely to be a gigantic whopper?  Who buys it?

The state franchises produce numbers all over the map.  On its face, at least with respect to the application of the "Pants on Fire" rating, that pattern is more consistent with the numbers reflecting something about the individual PolitiFact operations than with differences in political truth-telling from state to state.


The Republican primary explains the focus on Republican statements, and its contentiousness explains the harsh ratings


The PoF Bias number came down slightly in 2011 and 2012 compared to the previous two years.  The primary season offers no obvious explanatory value at all with respect to "Pants on Fire" differentials.

The simple, parsimonious explanation for the data comes from the ideology of the PolitiFact staff combined with the flow of news.

Given the wide variation among the state franchises, it is astonishing, absent the theory of journalistic bias, that PolitiFact has achieved such consistency over the past three years in applying the "False" and "Pants on Fire" ratings to claims from Democrats.  The recent three-year trend resembles a quota.

PolitiFact is biased.  The bias in the application of the "Pants on Fire" rating is just one example of it.