Wednesday, November 23, 2011

Bewz Newz 'n' Vewz: "Total Clusterfact: Sorting out Solyndra"

PFB associate Jeff Dyberg has posted a magnum opus questioning how PolitiFact Florida could reach its "Mostly False" finding on the claim that President Obama's administration extended half a billion in loans to its friends at Solyndra.

No, really:

(clipped from PolitiFact.com)

A snippet of Jeff's take from his blog Bewz Newz 'n' Vewz:
PolitiFact reviews an Americans for Prosperity ad and helpfully specifies what they're going to sort out the truth of:
We decided to fact-check the ad, focusing on whether the president gave "half a billion in taxpayer money to help his friends at Solyndra, a business the White House knew was on the path to bankruptcy." 
They can't screw this one up, can they? Multiple media reports have shown beyond dispute that Obama donors are closely tied to Solyndra, and also that the White House was aware of Solyndra's problems prior to the loan. So just how bad did PolitiFact flub this rating?
Jeff provides plenty of evidence showing the PolitiFact bloodhounds all over the trail without picking up the scent. Apparently, it's plenty of correlation without any hint of causation.

It's recommended reading.

Monday, November 21, 2011

Hope 'n' change at PolitiFact

Crossposted from Sublime Bloviations


I keep hoping that criticism will influence positive change at PolitiFact, the fact-checking arm of the St. Petersburg Times (soon changing its name to the Tampa Bay Times).

Well, a positive change occurred at PolitiFact recently.

Unfortunately, it was of the "one step forward, two steps back" variety.

For some time I've carped about PolitiFact's inconsistent standards, and in particular its publishing of two different standards for its "Half True" position on the "Truth-O-Meter."

The recent change probably stemmed from a message I sent to an editor at the paper's city desk (sent Nov. 9):
PolitiFact has created a problem for itself through inconsistency.  During the site's earlier years a page called "About PolitiFact" gave information about how the "Flip-O-Meter" and the "Truth-O-Meter" supposedly operate.  The page included a description of each of the "Truth-O-Meter" rating categories.

More recently, editor Bill Adair posted an item called "Principles of PolitiFact and the Truth-O-Meter."  The problem?  The definition for "Half True" is different than the one PolitiFact posted for well over a year prior.  Compounding the problem, PolitiFact has kept both versions online through now.

1)  The statement is accurate but leaves out important details or takes things out of context.
2)  The statement is partially accurate but leaves out important details or takes things out of context.

I'll be interested to see the eventual remedy.  Which items over PolitiFact's history went by which definition? Was a change made in Feb. 2011 or before without any announcement?  How can PolitiFact legitimately offer report cards and "Truth Index" ratings if the grading system isn't consistent?  Those are questions I'd imagine readers would have if they realized PolitiFact is using two different definitions for the same rating.  I don't expect you to answer them for my sake (not that I would mind if you did). 

Good luck to all sorting this one out.
The eventual remedy is apparently to simply change the longstanding definition at "About PolitiFact" to match the newer one at "Principles of PolitiFact and the Truth-O-Meter" without any fanfare--indeed, without any apparent notice whatsoever.  I detect no admission of error at all and no acknowledgment that PolitiFact changed its standard.

The move seems consistent with the desire of the mainstream press to avoid doing things that "undermine the ability of readers, viewers or listeners to believe what they print or broadcast."

Sadly, I'm not at all surprised.

On the positive side, the definitions are now consistent with one another.

On the negative side, PolitiFact either created a past illusion where Truth-O-Meter ratings used the old system or else created a fresh illusion that past ratings follow the new system.  And it went about the change in just about the least transparent way possible.


Update:

Good luck to PolitiFact retroactively changing the dozens (perhaps hundreds) of places on the Web that republished the original definition of "Half True."


(Clipped from PolitiFact.com; click image for enlarged view)

Contact PolitiFact Wisconsin.  They didn't get the memo yet.  And PolitiFact Texas has the same problem.


It's not the crime, it's the coverup.


Update 2:


It's also worth remembering PolitiFact's agonizing decision to change "Barely True" to "Mostly False."

"It is a change we don't make lightly," wrote Bill Adair.

How do you like that?  A change in the wording of a rating gets a reader survey prior to the change and an article announcing the change.  A change in the definition of a rating--a much more substantial change--gets the swept-under-the-rug treatment.



11/22/11-Added PFB link in update 2-Jeff

Wednesday, November 16, 2011

Media Trackers' PolitiFact series

Recently the media watchdog Media Trackers published a five-part series on PolitiFact.

Intro: Media Trackers Announces Series on PolitiFact
Part 1: PolitiFact and the Political Parties
Part 2: PolitiFact and Third-Party Organizations
Part 3: PolitiFact and Talk Radio
Part 4: PolitiFact and Governor Scott Walker
Part 5: Conclusion on PolitiFact 

We were unimpressed with the start of the series, but by the conclusion Media Trackers reached solid ground.


Part 1

Concern over the direction of the series started early:
On the whole, PolitiFact can’t be called completely biased towards conservatives or liberals. By Media Trackers count, PolitiFact has devoted nearly equal ink to conservative/Republican statements as to liberal/Democrat.
Comparing the number of stories devoted to each party tells nothing of ideological slant.  PolitiFact, if it were so inclined, could set a quota of 50 Republican stories and 50 Democrat stories and then proceed to write every single one of them with a liberal bias.

The remainder of Part 1 compared PolitiFact Wisconsin's treatment of state Republican Party statements with its treatment of the Democratic Party's statements.  The number of statements involved was very small (11 combined), but it suggested that PolitiFact fixed its editorial focus more on the Democratic Party, which also received harsher ratings.

Part 2

The second installment focused on the treatment of what Media Trackers calls "third party" organizations.  That is, political action groups not directly associated with the political parties.

Media Trackers noted a trend opposite the one from part one, though the two mini-studies share the problem of small sample size.  The conclusion of the second part found Media Trackers on top of a live spoor:
(D)oes PolitiFact lead readers to believe that conservative third-party organizations are less likely to tell the truth? How come the organization that spent the most on negative advertising in the recall elections had just one statement reviewed? Why more scrutiny to Pro-Life groups than Pro-Choice? Why were One Wisconsin Now’s statements reviewed four times more than the MacIver Institute? And what about statements on critical stories such as the denial by Citizen Action of Wisconsin of a connection to Wisconsin Jobs Now!? Why did PolitiFact choose not to tackle that statement?

No one expects PolitiFact to be the “be all end all” of watchdog journalism. But when they set themselves up as the judge and jury for all political statements in the state, one has to question how they select stories and why certain groups receive far and away more scrutiny than others.
In other words, the selection bias problem at PolitiFact is pretty obvious.

Part 3

Part three looked at PolitiFact Wisconsin's treatment of local radio personalities and set Media Trackers' modern-day record for small sample size.  Conservative Charlie Sykes received two ratings while fellow conservative Mark Belling received one.  All three ratings were of the "Pants on Fire" variety.  Again, it smells like selection bias.

Part 4

The fourth installment examined PolitiFact's treatment of Republican governor Scott Walker.

Media Trackers forgave PolitiFact for rating a high number of Walker's statements because of his position of power.  Time will reveal the reliability of that measure.

The Media Trackers analysis noted that PolitiFact appeared to go a bit hard on Walker:
It seems that PolitiFact’s burden for truth is a bit higher for Governor Walker than it is for others. Given the “lightening rod” status of Walker, it certainly seems a bit disingenuous to call the Governor’s claim that Wisconsin is “broke” a false claim because he could just layoff workers and raise taxes to fix the deficit. And to say that Walker did not campaign on the reforms found in the Budget Repair Bill is also disingenuous given that the Governor spoke on a number of the reforms he sought, even though he did not spell out the eventual changes to collective bargaining.
The anecdotes can add up.

Part 5

Media Trackers seized on the common thread in its conclusion:
As Media Trackers has shown with this series, PolitiFact arbitrarily applies its scrutiny. Statements from the Democratic Party of Wisconsin have been evaluated seven times to the Republican’s two. Conservative Club For Growth have been examined seven times (three during the recall elections) while We Are Wisconsin was examined just once. Pro-Life groups have been scrutinized twice and never a Pro-Choice group.

Each of these political groups and officials are putting out an equal number of statements on a myriad of issues every day. If PolitiFact intends to claim the mantle of watchdog journalism by “calling balls and strikes” in the name of “public service,” PolitiFact needs more transparency about how they select their stories and a review of why certain groups and individuals receive more scrutiny than others.
Sample sizes aside, Media Trackers settles on a conclusion well supported by a huge mass of anecdotal material collected by others.  The final installment also refers to Eric Ostermeier's study pointing out PolitiFact's selection bias problem (highlighted at PolitiFact Bias here).

Though the Media Trackers conclusion about PolitiFact isn't exactly groundbreaking, the outfit deserves credit for overcoming its initial stumble and conducting an independent examination of its local version of PolitiFact, with a conclusion supported by its own data.



Jeff adds: It's worth mentioning that PolitiFact Wisconsin is by far the most frequent target of accusations of right-wing bias. We've never found anything that sufficiently corroborates those claims and Media Trackers seems to do a capable job of dispelling that myth.

Wednesday, November 9, 2011

Sublime Bloviations: "Grading PolitiFact: Alan Hays, proof of citizenship and voting"

Could the cure for world hunger be as simple as picking the low-hanging fruit from PolitiFact? Sometimes it seems that way.

PFB editor Bryan White was quick to spot the latest gaffe from our facticious friends. Check out PolitiFact Florida's rating of state Senator Alan Hays (R-Umatilla):

Image from PolitiFact.com

Now check out what Hays actually said:
"...I'm not aware of any proof of citizenship necessary before you register to vote."
Bryan notes:
If words matter then we should expect PolitiFact to note the difference between saying one does not know of a requirement and saying that no requirement exists.
If PolitiFact was just your average bucket of hackery, there wouldn't be much more to say other than they distorted Hays' quote. But our site wasn't created because PolitiFact is average. They take distortion to new heights.

Bryan goes on to expose the flim-flammery of how they eventually found Hays Mostly False for something he didn't say (which seems to be a common theme for them). It's impressive to witness the amount of work it takes to get something so wrong.

And for those of you keeping track, it includes yet another example of PolitiFact citing non-partisan, objective Democrats as experts.

So make sure to head over to Sublime Bloviations, because this one is a must read.


Bryan adds:  Not only did PolitiFact rate Hays on a statement he did not make, the rating of what he didn't say is also wrong.  PolitiFact continues to amaze.

Thursday, November 3, 2011

Reason: "PolitiFact Gets High-Speed Rail Facts in Florida Wrong"

Given the recent news about California's impressive high speed rail cost overruns, it seems like a good time to call attention to Reason.com's pushback against PolitiFact's defense of the high speed rail system proposed for Florida.

The chief evidence of bias comes from PolitiFact's attempt to discredit Reason.com on ideological grounds--an intriguing move for an organization known to uncritically cite Guttmacher Institute studies when fact checking claims by abortion opponents.  The Guttmacher Institute, of course, is ideologically attached to Planned Parenthood.

Most of PolitiFact's criticisms of the study promoted by Reason.com were quite weak, such as pointing out that the cost-overrun data in the study did not come exclusively from rail projects.  While that's true, the cost overruns were greater for rail projects, so the supposed problem may actually have made rail look better than it deserved.

The key point of dispute concerns the responsibility for costs if the project stays in the red.  PolitiFact argued that Florida's project provided adequate protections.  Reason.com argues the reverse:
When Gov. Scott was making his rail decision, he knew that if Florida had taken federal money for the Tampa-to-Orlando high-speed rail system, one of the federal government’s rules clearly says that a state government can’t take the construction money and then stop operating the project it has accepted the money for. Under long-standing federal rules, the state would have to repay the federal grant money—in this case, $2.4 billion. If it didn’t repay the $2.4 billion, Florida’s taxpayers would be forced to keep the train running —at a loss— and be on the hook for the future operating subsidies. The U.S. Department of Transportation did send notice that it would negotiate over its repayment rule, but only after Gov. Scott had already announced his decision to turn down the federal money.
I'll admit I'm not familiar with the cited rule, but it's easy on principle to imagine it exists.  It could have helped Reason.com's case to include more information about it.

On the whole, Reason.com makes a pretty good case that PolitiFact failed to settle the issue.

Matthew Hoy: "You guys screwed up"

Ordinarily we highlight Matthew Hoy's criticisms of PolitiFact via the posts at his blog, Hoystory.  But this time we catch Hoy at his pithy best while blasting PolitiFact over at Facebook for its "Pants on Fire" rating of Herman Cain's supposed claim that China is trying to develop nuclear weapons.  PolitiFact took Cain to mean China was developing nuclear weapons for the first time, you see.

Hoy:
You guys screwed up. Congratulations. Read the whole context (which you provide) and it's ambiguous -- he very well may be referring to nuclear-powered AIRCRAFT CARRIERS -- which they don't have yet. Also, during Vietnam, Cain was working ballistics for the Navy, studying the range and capabilities of China's missiles. He knew they had nukes. It was inartfully said. Not a mistake. According to your own rules, you don't fact check things like this: "Is the statement significant? We avoid minor "gotchas"’ on claims that obviously represent a slip of the tongue."
That about says it all, but I'll just add one helpful informational link.

Given the ambiguity of Cain's statement, it speaks volumes about PolitiFact's ideological predisposition that no attempt was made to interpret Cain charitably.

Wednesday, November 2, 2011

Grading PolitiFact: Joe Biden and the Flint crime rate

(crossposted from Sublime Bloviations with minor reformatting)


To assess the truth for a numbers claim, the biggest factor is the underlying message.
--PolitiFact editor Bill Adair


The issue:
(clipped from PolitiFact.com)


The fact checkers:

Angie Drobnic Holan:  writer, researcher
Sue Owen:  researcher
Martha Hamilton:  editor


Analysis:

This PolitiFact item very quickly blew up in their faces.  The story was published at about 6 p.m. on Oct. 20.  The CYA was published at about 2:30 p.m. on Oct. 21, after FactCheck.org and the Washington Post published parallel items very critical of Biden.  PolitiFact rated Biden "Mostly True."

First, the context:



(my portion of transcript in italics, portion of transcript used by PolitiFact highlighted in yellow):

BIDEN:
If anyone listening doubts whether there is a direct correlation between the reduction of cops and firefighters and the rise in concerns of public safety, they need look no further than your city, Mr. Mayor.  

In 2008--you know, Pat Moynihan said everyone's entitled to their own opinion, they're not entitled to their own facts.  Let's look at the facts.  In 2008 when Flint had 265 sworn officers on their police force, there were 35 murders and 91 rapes in this city.  In 2010, when Flint had only 144 police officers the murder rate climbed to 65 and rapes, just to pick two categories, climbed to 229.  In 2011 you now only have 125 shields.  

God only knows what the numbers will be this year for Flint if we don't rectify it.  And God only knows what the number would have been if we had not been able to get a little bit of help to you.

As we note from the standard Bill Adair epigraph, the most important thing about a numbers claim is the underlying message.  Writer Angie Drobnic Holan apparently has no trouble identifying Biden's underlying message (bold emphasis added):
If Congress doesn’t pass President Barack Obama’s jobs plan, crimes like rape and murder will go up as cops are laid off, says Vice President Joe Biden.

It’s a stark talking point. But Biden hasn’t backed down in the face of challenges during the past week, citing crime statistics and saying, "Look at the facts." In a confrontation with a conservative blogger on Oct. 19, Biden snapped, "Don’t screw around with me."
No doubt the Joe Biden of the good "Truth-O-Meter" rating is very admirable in refusing to back down.  The "conservative blogger" is Jason Mattera, editor of the long-running conservative periodical "Human Events."  You're a blogger, Mattera.  PolitiFact says so.

But back to shooting the bigger fish in this barrel.

PolitiFact:
We looked at Biden’s crime numbers and turned to the Federal Bureau of Investigation's uniform crime statistics to confirm them. But the federal numbers aren’t the same as the numbers Biden cited. (Several of our readers did the same thing; we received several requests to check Biden’s numbers.)

When we looked at the FBI’s crime statistics, we found that Flint reported 32 murders in 2008 and 53 murders in 2010. Biden said 35 and 65 -- not exactly the same but in the same ballpark.
Drobnic Holan initially emphasizes a fact check of the numbers.  Compared to the FBI numbers, Biden inflated the murder count for both 2008 and 2010, and his inflated figures in turn inflate the percentage increase by roughly 30 percent (about 20 percentage points, going from roughly 66 percent to 86 percent).  So it's a decent-sized ballpark.
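For readers who want to check the arithmetic themselves, here's a quick sketch computing the percentage increases implied by each set of murder counts quoted above (FBI: 32 to 53; Biden: 35 to 65):

```python
# Flint murder counts as quoted in the fact check
fbi_2008, fbi_2010 = 32, 53      # FBI uniform crime statistics
biden_2008, biden_2010 = 35, 65  # figures Biden cited

def pct_increase(before, after):
    """Percentage increase going from 'before' to 'after'."""
    return (after - before) / before * 100

fbi_rise = pct_increase(fbi_2008, fbi_2010)      # roughly 66 percent
biden_rise = pct_increase(biden_2008, biden_2010)  # roughly 86 percent

print(round(fbi_rise, 1), round(biden_rise, 1), round(biden_rise - fbi_rise, 1))
```

The gap of roughly 20 percentage points shows how numbers "in the same ballpark" can still materially change the trend a politician describes.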

PolitiFact:
For rapes, though, the numbers seemed seriously off. The FBI showed 103 rapes in 2008 and 92 rapes in 2010 -- a small decline. The numbers Biden cited were 91 rapes in 2008 and 229 in 2010 -- a dramatic increase.
If inflating the percentage increase in murders by 27 percentage points is not a problem for Biden then this at least sounds like a problem.

After going over some other reports on the numbers and a surprising discussion of how not much evidence suggests that Obama's jobs bill would address the number of police officers in Flint, PolitiFact returns to the discrepancy between the numbers:
(W)e found that discrepancies between the FBI and local agencies are not uncommon, and they happen for a number of reasons. Local numbers are usually more current and complete, and local police departments may have crime definitions that are more expansive than those of the FBI.
All this is very nice, but we're talking about the city of Flint, here.  We don't really need current stats for 2008 and 2010 because they're well past.  Perhaps that affects the completeness aspect of crime statistics also; PolitiFact's description is too thin to permit a judgment.  As for "expansive" definitions, well, there's a problem with that.  Biden's number of rapes in 2008 is lower than the number reported in the UCR (FBI) data.  That is a counterintuitive result for a more expansive definition of rape and ought to attract a journalist's attention.

In short, even with these proposed explanations it seems as though something isn't right.

PolitiFact:
Flint provided us with a statement from Police Chief Alvern Lock when we asked about the differences in the crime statistics, particularly the rape statistics.

"The City of Flint stands behind the crime statistics provided to the Office of The Vice President.  These numbers are an actual portrayal of the level of violent crime in our city and are the same numbers we have provided to our own community. This information is the most accurate data and demonstrates the rise in crime associated with the economic crisis and the reduced staffing levels.

"The discrepancies with the FBI and other sources reveal the differences in how crimes can be counted and categorized, based on different criteria." (Read the entire statement)
This is a city that's submitting clerical errors to the FBI, and we still have the odd problem with the rape statistics.  If the city can provide numbers to Joe Biden then why can't PolitiFact have the same set of numbers?   And maybe the city can include stats for crimes other than the ones Biden may have cherry-picked?  Not that PolitiFact cares about cherry-picked stats, of course.

Bottom line, why are we trusting the local Flint data sight unseen?

PolitiFact caps Biden's reward with a statement from criminologist and Obama campaign donor James Alan Fox of Northeastern University to the effect that Biden makes a legitimate point that "few police can translate to more violent crime" (PolitiFact's phrasing).  Fox affirms that point, by PolitiFact's account, though it's worth noting that on the record Biden asserted a "direct correlation" between crime and the size of a police force.  The change in wording seems strange for a fact check outfit that maintains that "words matter."

The conclusion gives us nothing new other than the "Mostly True" rating.  Biden was supposedly "largely in line" with the UCR murder data for Flint.  His claim about rape apparently did not drag down his rating much even though PolitiFact admittedly could not "fully" explain the discrepancies.  PolitiFact apparently gave Biden credit for the underlying argument that reductions in a police force "could result in increases in violent crime" despite Biden's rhetoric about a "direct correlation."


The grades:

Angie Drobnic Holan:  F
Sue Owen: N/A
Martha Hamilton:  F

This fact check was notable for its reliance on sources apparently predisposed toward the Obama administration and its relatively unquestioning acceptance of information from those sources.  The Washington Post version of this fact check, for comparison, contacted three experts to PolitiFact's one and none of the three had an FEC filing indicating a campaign contribution to Obama.

And no investigation of whether Biden cherry-picked Flint?  Seriously?  See the "Afters" section for more on that as well as commentary on PolitiFact's CYA attempt.