Thursday, March 31, 2011

Sublime Bloviations: "Grading PolitiFact (Florida): Grover Norquist and half of something"

I try to resist highlighting posts from my second blog here at PolitiFact Bias, but JD's busy and, all humility aside, this item looks like a classic example of the way the leftward tilt of the uniform newsroom culture can stifle the type of thinking needed to do a fair job of fact-checking:
This item immediately caught my attention because of the ambiguity.  What does Grover Norquist's statement mean?  What was the original context?  PolitiFact seemed to figure it out easily and quickly:
"FYI," he wrote. "Withheld union dues fund half of Dem (Democratic) campaigns in Florida."

That's an awfully big number. So, FYI, we decided to check it out.
Is it "an awfully big number"?

On its face, Norquist's statement appears to refer to half the total number of political campaigns of Democrats in Florida.  For some as-yet-unknown reason, PolitiFact takes it to mean that withheld union dues provide half the funding for all Democratic campaigns in Florida.

The difference in those two understandings is very substantial.  Use the wrong understanding and the wrong fact gets checked.
As subsequently noted in the post, PolitiFact evidently gave no consideration at all to a completely justifiable interpretation of Norquist's words.  Under the alternate interpretation, numbers from PolitiFact's key source for its errant fact check show that no less than 46.7 percent of Democratic Party campaigns in Florida received dues via paycheck deductions from public employee unions.  That's a number that reasonably passes for "half" in normal human communications.

This type of anecdote is stronger evidence of bias than the usual error favoring one party over another, as I went on to explain:
This is yet another flub that is extremely difficult to understand apart from echo-chamber institutional bias at PolitiFact.  Nobody thought the statement might be talking about something other than half of all the funds received by Democratic Party campaigns?

It's kind of hard to believe.

Wednesday, March 23, 2011

A tale of two fact checks

(crossposted from Sublime Bloviations)

PolitiFact recently checked a claim by Rep. H. Morgan Griffith (R-Va.) that EPA regulations on mining wastewater are so strict that even some popular bottled waters would not pass.

The fact check is notable because it is one for which the claim is technically accurate but leaves a misleading impression.  Those types of claims seem like they should typically result in a "Mostly True" or "Half True" rating, since PolitiFact's descriptions of those ratings suggest as much:
True – The statement is accurate and there’s nothing significant missing.
Mostly True – The statement is accurate but needs clarification or additional information.
Half True – The statement is accurate but leaves out important details or takes things out of context.
Barely True – The statement contains some element of truth but ignores critical facts that would give a different impression.
False – The statement is not accurate.
Pants on Fire – The statement is not accurate and makes a ridiculous claim.
Compare a claim from President Barack Obama back in 2009:
"The problem is that, for decades, we have avoided doing what must be done as a nation to turn challenge into opportunity," Obama said. "As a consequence, we import more oil today than we did on 9/11. The 1908 Model T earned better gas mileage than a typical SUV sold in 2008. And even as our economy has been transformed by new forms of technology, our electric grid looks largely the same as it did half a century ago."
As with Griffith, the fact check found that Obama created an impression that was the reverse of the truth, though it was deferentially described as "a reach" rather than flatly false:
But his implication is that we haven't gotten more fuel efficient in 100 years. And that's a reach.
It's more than a reach. It's simply untrue, as was highlighted in an NPR broadcast I referenced in my review of the Obama fact check.

The stories are parallel except for the fact that Griffith ends with a "Barely True" rating and the president ends with a "Mostly True" rating from the "Truth-O-Meter."  That's if we overlook the fact that Obama's statement may well have been technically false, contrary to PolitiFact's finding.

The story comparison suggests that those who work for PolitiFact subjectively interpret PolitiFact's grading structure.

Tuesday, March 15, 2011

PFB Smackdown: The Providence Phoenix rises to aid PolitiFact--sort of

The Providence Phoenix provided a would-be defense of PolitiFact Rhode Island--but in the end the defense hurts the mainstream media's version of Media Matters.

The story, by David Scharfenberg, opens with the misleading title "PolitiFact Faces Its Conservative Critics."  Even with Scharfenberg's story included, I have yet to see PolitiFact meaningfully engage its conservative critics.  The closest we get to that is stuff like Bill Adair waving off criticisms from the right by pointing out that PolitiFact is also criticized from the left.  No doubt the critics from the left are eligible for the same meaningless defense--except that PolitiFact seems concerned enough about offending its left-tilted base to defend itself from the criticisms of Arianna Huffington and Rachel Maddow.  By contrast, a major Wall Street Journal editorial had its existence noted while its content went unaddressed.

Scharfenberg moves on to a critique of PolitiFact written by William A. Jacobson (and linked at PolitiFact Bias).

Scharfenberg's conclusion is curious:

Any of the ratings from half-true to mostly true to true would have been in order. For the ProJo to find "pants on fire" itself deserves a "pants on fire" rating.

PolitiFact, you have a problem.

I find Jacobson's critique less than persuasive. The ProJo may be guilty of examining McKay's statement a bit too literally. But for McKay to claim that Whitehouse labeled any Rhode Islander who didn't agree with him on health care reform a "white supremacist" is a pretty serious distortion.
While Scharfenberg apparently disagrees with Jacobson that McKay may have warranted a "Half True" or higher on the "Truth-O-Meter," he concurs with Jacobson that the "Pants on Fire" rating was unjustified.

Scharfenberg subsequently sharpens the point:
Still, when I read the PolitiFact entry, I couldn't help but wonder: is parsing this sort of political theater, this sort of obvious hyperbole, the best use of PolitiFact's time?
PolitiFact claims it grants license for hyperbole:
In deciding which statements to check, we ask ourselves these questions:
  • Is the statement rooted in a fact that is verifiable? We don’t check opinions, and we recognize that in the world of speechmaking and political rhetoric, there is license for hyperbole.
Fact check that?

Anchor Rising: "More Bias on Display"

JD and I only got started with PolitiFact Bias in early 2011.  That puts us a few years behind in highlighting excellent examples others have found that help show PolitiFact's left-leaning bias.

Thanks to the new "PolitiFarce" tag at Anchor Rising, I ran across this stellar example from Justin Katz:
The statement being addressed is that "over half of the foreign-born population in Rhode Island is white," and the findings were as follows:
Brown directed us to the U.S. Census Bureau's American Community Survey, 2006-2008, which includes three-year estimates of foreign-born populations in the United States. Specifically, he said he was citing the figures showing that 45.2 percent of foreign-born Rhode Islanders are white. That's not more than half. ...
Drawing from data in the 2006-2008 survey, the census said that 32 percent of foreign-born people, about one third, are white alone, not Hispanic or Latino. ...
A one-year report from 2009 showed that 30 percent of Rhode Island respondents identified themselves as "white alone, not Hispanic or Latino."
So, judged by the statistic that Brown incorrectly thought he should be using, his statement was only false by a little; judged by the appropriate statistic, Brown's statement was false by a lot. On what grounds did PolitiFact give him a "half true"?
Indeed, upon examination of PolitiFact's argument it is difficult to see what portion of Steve Brown's statement, if any, was true.

It's worth noting that this story by PolitiFact did attempt to address Brown's underlying point.  PolitiFact's standards (using the term advisedly) call for giving the underlying point the greatest emphasis in a numbers claim.

But trying to understand PolitiFact's approach on that basis simply leads to more trouble.

PolitiFact:
In the end, Brown's underlying claim that the state police investigate Hispanics more often than non-Hispanics for immigration violations is supported by the department's own numbers. Of the 92 people investigated, 71 were from Latin American countries.
The most obvious problem is the small sample size.  But the bigger problem is PolitiFact's supposed identification of Brown's "underlying claim" that Hispanics were investigated more often than non-Hispanics.  If that was Brown's underlying claim, then there was no reason to look at race percentages among Rhode Island's foreign-born population.  PolitiFact could have just used the numbers 71 and 92 and been done with it, handing out a glowing "True" rating.  But clearly Brown's point was that Hispanics are investigated disproportionately by race, implying racism in the department's methods.  That argument is specious on its face given the aforementioned small sample size and the strong possibility that factors other than race (proximity of the nation of origin, for example) play a role in prompting an investigation.

The "true" in Brown's statement, then, appears to come from an "underlying claim" that wasn't really Brown's point.  PolitiFact used a superficial factoid to justify bumping Brown up a notch or two (or three).

Sunday, March 13, 2011

Introducing PFB's Anecdote-O-Meter

Just kidding with the title.  The o-meter stuff is too trite to use as more than a one-off joke.

But the subject is anecdotes and their role in helping to show bias in a body of work.

Most of us think we can perceive bias in an individual piece of journalism--a piece of anecdotal evidence.  And we may often be right, but the appearance of bias in reporting or fact checking may simply be the result of writer error.

So how does one tell the difference between mere errors and ideological bias?

One method, bolstered by the methods of science, involves counting the number of errors and tracking any association with political parties or ideas.  Mere errors ought to occur roughly equally in stories about Republicans and stories about Democrats.  Where the errors harm one party more than the other beyond the threshold of statistical significance, evidence of a political bias has come to light.  The degree of deviation from best practices may also figure in a scientific study of journalistic errors.
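To make the idea concrete, here is a minimal sketch (not from the original post; the error counts and the choice of a binomial test are purely illustrative assumptions) of how one might ask whether an observed split of errors between the parties could plausibly be chance:

# Hypothetical counts for illustration only: suppose 40 rating errors
# were identified, 31 harming Republicans and 9 harming Democrats.
# Under the "mere error" hypothesis, each error is equally likely to
# fall against either party (p = 0.5).
from scipy.stats import binomtest

errors_harming_republicans = 31
total_errors = 40

result = binomtest(errors_harming_republicans, n=total_errors, p=0.5,
                   alternative="two-sided")
# A small p-value means a split this lopsided is unlikely to arise from
# evenly distributed random error.
print(f"p-value: {result.pvalue:.4f}")

A lopsided result of that kind would not by itself prove ideological bias, but it would show that the asymmetry is hard to attribute to random writer error alone.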

In March of 2008, at my blog Sublime Bloviations, I started tagging relevant posts with "grading PolitiFact."  On occasion I have criticized PolitiFact for harsh grading of Democrats.  But the vast majority of the posts, in accordance with my selection bias, consists of criticisms of faulty grades given to Republicans or conservatives, or of ridiculously gentle treatment of Democrats or progressives.

I work under no illusion that the list represents definitive evidence of a systemic bias at PolitiFact.  But the number of times PolitiFact's grades go easy on Democrats and tough on Republicans does count as important and legitimate evidence supporting (though not definitively) the charge of bias.

If the ideological bias at PolitiFact is not significant, then it should be possible to compile a list of comparable size and quality containing criticisms of PolitiFact where PolitiFact favors Republicans and deals harshly with Democrats.

I don't foresee that occurring.

The list from Sublime Bloviations, in chronological order (and do pardon the more polemical bent of the earlier entries; I was shocked by the amateurish fact checks I was reading):

Saturday, March 12, 2011

Le·gal In·sur·rec·tion: "PolitiFact Has A Serious Problem, But I Repeat Myself"

Our friend William A. Jacobson over at Legal Insurrection takes on two PolitiFact ratings.

First, Jacobson reviews the Milwaukee Journal Sentinel's PolitiFact team's distortion of a statement by Republican state senator Scott Fitzgerald that "a mob showed up and busted down the door and took over the Capitol."  PolitiFact rated the statement "Half True."

Jacobson notes:
Notice how PolitiFact took a correct statement by Fitzgerald, but then added in a political factor, what caused the mob to act as it did, to find the statement only half-true. The statement by Fitzgerald had nothing to do with policy, it was a simple statement of what happened that night, so PolitiFact injected an irrelevant factor to find the statement only half-true. PolitiFact did not even [give it a] "mostly true" rating, which is defined as "The statement is accurate but needs clarification or additional information."
The second fact check that drew Jacobson's scrutiny was a piece done by The Providence Journal of Rhode Island. That story rates Kenneth McKay, a candidate for chairman of the Rhode Island Republican Party, "Pants on Fire" for claiming Senator Sheldon Whitehouse said "Everybody in Rhode Island who disagrees with me about Obamacare is an Aryan, is a white supremacist."

PolitiFact engages in its all-too-typical word mincing to come up with its rating. Jacobson exposes the sophistry with ease:
Whitehouse was not just attacking Senators. It may have been hyperbole for McKay to say "everyone" was attacked, but not much of a hyperbole. Additionally, while Whitehouse did not mention Rhode Islanders by name, he also did not excuse Rhode Islanders from his smear of health care protesters. Any of the ratings from half-true to mostly true to true would have been in order. For the ProJo to find "pants on fire" itself deserves a "pants on fire" rating.
Jacobson ends the article with the obvious conclusion:
"PolitiFact, you have a problem."
We couldn't agree more.

Anchor Rising introduces "PolitiFarce" tag

The Rhode Island bloggers at Anchor Rising have produced a number of high-quality critiques of PolitiFact, and the good folks there have added a new post category to make it easier to link to their content from PolitiFact Bias.  Look for a new link in the "PolitiFact's Detractors" section of the sidebar.

Anchor Rising's Justin Katz included the following tantalizing tidbit in a post introducing the new category:
I've actually been offered a bit of inside description of the PolitiFact process: Apparently, we can't attribute all of the blame to the journalists who pen the pieces, because at least at the Providence Journal, there's a PolitiFact board that rules on the statement and tasks the writers with explaining it.

I couldn't get details on the makeup of the board, but the process sounds exactly as the skeptical public already suspected: The analyses back-fill to the conclusions.
Katz's description appears at odds with what PolitiFact claims as its process.
A PolitiFact writer researches the claim and writes the Truth-O-Meter article with a recommended ruling. After the article is edited, it is reviewed by a panel of at least three editors that determines the Truth-O-Meter ruling.
Note that the first sentence in PolitiFact's description of its process may be taken in either of two ways.  The "recommended ruling" may precede the work of the author, coming from an editor or editors, or it may be the work of the writer.

Certainly the description suggests that the writers produce the recommended ruling based on their research.  But perhaps that isn't the case.



March 14, 2011:  Edited to achieve subject-verb agreement in final paragraph

Monday, March 7, 2011

"Stinky Journalism" highlights PFB, Wisconsin's "Media Trackers"

Count me among those who had no idea that StinkyJournalism.org existed.

Stinky Journalism was started to help hold journalists to a higher standard.  That goal resonates here at PolitiFact Bias, and we appreciate the nod we received in a recent media brief.

Congratulations also to Media Trackers for the apparently sympathetic account of its review of a PolitiFact Wisconsin story.


I can't wait to make use of one of those Stink-O-Grams.

Sunday, March 6, 2011

Media Trackers: "Is Wisconsin Broke?"

In the same vein as the previous post, Media Trackers finds PolitiFact Wisconsin in nitpick mode for ruling Gov. Scott Walker "False" over his repeated claim that Wisconsin is "broke":
PolitiFact recently concluded that Governor Scott Walker’s statement that Wisconsin is broke is false. The state is $3.6 billion dollars in debt, but is it broke?

Media Trackers believes this battle is one of semantics. In trying to paint a picture of the fiscal crisis in Wisconsin Gov. Walker is using terms and ideas that the average person understands. When we hear someone or something is broke we immediately understand that there is no money to pay the bills. And that is the case in Wisconsin. There is no money to pay the bills, if no actions are taken.

Saturday, March 5, 2011

Steve Bukosky: "JS PolitiFact Distorts Reality"

Steve Bukosky blogs at Waukesha Now, a Wisconsin community site.  Bukosky offers a short-yet-definitive critique of a recent PolitiFact Wisconsin ruling on Gov. Scott Walker:
You may not be aware that there are people who listen to speakers, radio and TV shows only for the purpose of playing gotcha. Gotcha is a tactic using a statement out of context and dressing it up as menacingly as possible. Secretary of State Clinton calls it politics of personal destruction.

Ok, so let us do some nit picking here. No, we are not broke. We have a cash flow deficiency. But to label the Governor's statement as false is to mislead the readers. Too many people are of the mentality that if we have checks left in the checkbook, we still have money. Obviously, there are people who really believe that the state can make ends meet. It is a mentality that was displayed during the OJ Simpson murder trial. Never mind the evidence, the facts, we are going to stick it to the man, the taxpayers in this case.
I had considered highlighting the same PolitiFact Wisconsin story at Sublime Bloviations, but Bukosky's simple judo takedown of the PolitiFact piece saves me the trouble.


June 17, 2011:  Fixed misspelling of "Bukosky" in the second sentence.  Sorry for that extra "w," Steve.

Friday, March 4, 2011

San Diego Rostra: "When Fact Checks are False"

Back in July of last year, Bradley J. Fikes unloaded a broadside against PolitiFact at San Diego Rostra.  Fikes used a fact check of Michele Bachmann as an example of fact checks gone awry:
Last year during the health care debates, PolitiFact rated this statement by Rep. Michele Bachmann, R-Minn, false: “Ezekiel Emanuel, one of President Obama’s key health care advisers, “says medical care should be reserved for the nondisabled. So watch out if you’re disabled.”

I think Bachmann was mostly right, after examining the evidence she presented. PolitiFact misrepresented that evidence, following guidance from an Obama Administration spokesperson.
Fikes offers a detailed takedown of his chosen example along with some trenchant observations about PolitiFact's approach to fact checking.

Fikes' review overlaps somewhat with a criticism I wrote some months after his about the same PolitiFact fact check.

I recommend his longer and more thorough version.