Wednesday, February 23, 2011

Michelle Malkin: PolitiFact to “debunk” my Gwen Moore/abortion post

Conservative pundit Michelle Malkin blasts PolitiFact for its plan to fact check, among other things, the headline of one of her blog posts.

As usual, I have a favorite part:
Matthew Hoy points to this statement from PolitiFact editor Bill Adair posted yesterday:
“We don’t check opinions, and we recognize that in the world of speechmaking and political rhetoric, there is license for hyperbole.”

Um, Bill Adair, meet your employee, Lou Jacobson.
Malkin posted her e-mail exchange with Jacobson.  Good stuff, well worth the read, and good to see Matthew Hoy get a shout.

Tuesday, February 22, 2011

Keeping up appearances at PolitiFact

(crossposted from Sublime Bloviations)

Yesterday PolitiFact published a piece by editor Bill Adair apparently intended to reassure readers that PolitiFact is, well, politifair in the way it does business.

Given Adair's recent record of expressing indifference to the public's perception of bias at PolitiFact, this is a significant development.  Eric Ostermeier probably deserves a great deal of the credit for putting PolitiFact on the defensive.  Ostermeier published a study of PolitiFact's results suggesting the strong possibility of selection bias and called for PolitiFact to make its selection process transparent.

Though Ostermeier's name might as well have been "Voldemort" for the purposes of Adair's article (Adair never mentions him), the piece probably serves as Adair's response to Ostermeier's call.

How does the answer measure up?
Editor's Note: We've had some inquiries lately about how we select claims to check and make our rulings. So here's an overview of our procedures and the principles for Truth-O-Meter rulings.
The editor's note is about half true.  PolitiFact didn't just have inquiries.  It found itself criticized by a serious researcher who made a good case that PolitiFact ought to be viewed as having a selection bias problem unless it could allay the concern by making its methods transparent.  The editor's note isn't exactly transparent.

Adair's off to a great start!

Wednesday, February 16, 2011

MinnPost: "Politifact responds to U researcher's anti-GOP bias claim"

David Brauer of MinnPost.com enterprisingly asked PolitiFact creator/editor Bill Adair for his take on the recent study by Eric Ostermeier suggesting a partisan selection bias.

Adair played it a bit like a politician.  Read the whole thing here, but here's an excerpt:
"Eric Ostermeier's study is particularly timely because we've heard a lot of charges this week that we are biased — from liberals. They are unhappy with our False rulings on President Obama from his interview with Bill O'Reilly. So we're accustomed to hearing strong reactions from people on both ends of the political spectrum."
I liked Brauer's response to that portion of Adair's statement:
I've never been a fan of the "both sides hate us so we must be doing something right" argument; that can enable false balance.
Brauer's right.

After the diversionary pooh-pooh, Adair goes on to justify PolitiFact's legitimacy by describing the story selection process.  It's editorial judgment and giving the readers what they want ("We check claims that we believe readers are curious about").  In other words, it's pretty much a recipe for selection bias.

Adair then says that PolitiFact's practice of listing its sources permits readers to decide things for themselves.  That's only partly true.  PolitiFact hardly ever shares the context of its expert source interviews.  And if readers are supposed to decide for themselves, then what is the point of the "Truth-O-Meter"?

Let's face it:  Most readers don't pay much attention to PolitiFact beyond the flashy graphics.  If PolitiFact adds 28 and 19 to reach a sum of 49 (it's 47), probably hardly anybody notices.  Nor do most pay attention to selection bias in PolitiFact's application of standards.

But that's why JD and I started PolitiFact Bias.  We're here to help alert people to the fact that PolitiFact is second-rate in its fact checking, inconsistent in its application of standards, obviously non-objective and ideologically slanted left.

Adair's statement does not address Ostermeier's hypothesis except to convey the message that PolitiFact doesn't care.  If their editorial judgment isn't good enough for you, then go back to reading Media Matters.

Tuesday, February 15, 2011

Speaking of Selection Bias...

Alan Colmes has now jumped into the discussion (by repeating a Political Wire post) about the Smart Politics review of PolitiFact's selection bias.

In a truly amazing feat, Colmes (and Political Wire) managed to completely ignore the point of Eric Ostermeier's article, presenting the finding that PolitiFact more frequently rates Republicans negatively as proof that Republicans lie more often.

Check out the Colmes headline:
Republicans’ Statements Untrue Three Times As Often As Those Of Democrats
Colmes made absolutely no mention of the fact that Ostermeier was suggesting that the lopsided figures point to selection bias at PolitiFact. This is how Colmes treated the study:
"An analysis by Smart Politics of more than 500 stories during the past year show that statements made by Republican politicians are false three times as often as those made by Democrats."
Representing Ostermeier's piece as the polar opposite of what it is takes either an act of denial or, more likely, a shoddy job of research.

Compounding the error, Colmes et al. ignore the focus of the study. Ostermeier's article dealt specifically with the issue of selection bias, i.e., the notion that PolitiFact's editors are more inclined to select outlandish or questionable statements from the GOP. This still leaves the issue of the quality of the ratings themselves untouched. As Ostermeier noted:
"Assuming for the purposes of this report that the grades assigned by PolitiFact are fair (though some would challenge this assumption), there has nonetheless been a great discrepancy regarding which political parties' officials and officeholders receive the top ratings and those that are accused of not telling the truth."
(bold emphasis added)
Colmes' careless posting of misleading information exemplifies how PolitiFact's opinionated brand of journalism can be misrepresented as objective fact, and repeated throughout the echo chamber.



Feb. 16, 2011:  Corrected misspelled version of Eric Ostermeier's last name in one of the middle paragraphs.  Also added some new hyperlinks and a definite article in the first sentence.

Monday, February 14, 2011

PFB Smackdown: Kos clueless on Ostermeier study

The reality-based community hasn't taken kindly to Eric Ostermeier's attempts to rescue reality from its liberal bias.

Steve Singiser for the Daily Kos is the latest to take a swipe at the study, writing under the title "Documented proof that Republicans are the biggest liars in politics":
(T)he study (which can be found here) focuses on Politifact itself, charging the nonpartisan analysis done by the St. Petersburg Times fact-checking unit with a systemic bias against Republicans.
Singiser misstates the charge made by Ostermeier.  Ostermeier argues that PolitiFact's presentation suggests Republicans lie more than Democrats, but in scientific terms that conclusion follows from the data only if selection bias has been ruled out.  And the evidence we have of PolitiFact's selection process offers no assurance against ideological bias in the results.  The stark contrast in the results, Ostermeier says, places a burden of proof on PolitiFact to assure readers that the implicit conclusion is not an artifact of selection bias; in other words, to make the selection process transparent.  Ostermeier is not charging PolitiFact with "a systemic bias against Republicans."  He is pointing out that it may well be the case.

After constructing a straw man position for Ostermeier, Singiser proceeds to attack it:
(T)here are other explanations that are equally, if not more, plausible than charging Politifact with grading in bad faith.
True.  Among them: an unconscious bias (good faith, bad methods) among PolitiFact journalists, resulting in the type of selection bias Ostermeier hypothesizes may account for the disparity in PolitiFact's cumulative grades.  That explanation is extraordinarily plausible (and parsimonious).  Singiser would be hard-pressed to match it, and that's why Ostermeier argues that PolitiFact should do something to provide a sure foundation for the conclusions it implicitly encourages.

Singiser continues:
For one thing, the study was conducted during a time when the GOP was out of power. The party out of power, it could reasonably be assumed, is going to take more chances with their rhetoric, in an effort to turn the electorate against the party in office.
That's all well and good, but it does nothing to mitigate Ostermeier's real argument.  Remember, PolitiFact grades approximately the same number of statements from figures in both parties.  If PolitiFact simply graded chancy rhetoric, then under the conditions Singiser describes one would expect it to rate far more statements from one party than from the other.  And if PolitiFact instead pads the Democrats' numbers with statements chosen by a different criterion, then we have selection bias by definition.
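To make that concrete, here is a minimal simulation sketch. Everything in it is hypothetical: the six-point scale loosely mirrors the "Truth-O-Meter," but the pool sizes, sample size, and selection rule are invented for illustration, not taken from PolitiFact's data. Two equally honest parties, rated in equal numbers, still end up with lopsided report cards once the selection rule differs:

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical six-point scale, loosely mirroring the Truth-O-Meter.
RATINGS = ["True", "Mostly True", "Half True",
           "Barely True", "False", "Pants on Fire"]

def statement_pool(n):
    # Both parties draw statements from the same uniform distribution
    # over the scale, so by construction they are equally honest.
    return [random.randint(0, 5) for _ in range(n)]

gop_pool = statement_pool(5000)
dem_pool = statement_pool(5000)

SAMPLE = 250  # the fact checker rates the same number from each party

# Skewed selection rule: cherry-pick the most dubious GOP statements,
# but take a random sample of the Democratic ones.
gop_sample = sorted(gop_pool, reverse=True)[:SAMPLE]
dem_sample = random.sample(dem_pool, SAMPLE)

def false_share(sample):
    # Fraction of a sample rated "False" or "Pants on Fire" (4 or 5).
    return sum(1 for rating in sample if rating >= 4) / len(sample)

print(f"GOP share rated False or worse: {false_share(gop_sample):.0%}")
print(f"Dem share rated False or worse: {false_share(dem_sample):.0%}")
# Equal honesty and equal sample sizes still yield a wildly unequal
# report card; the entire disparity comes from the selection rule.
```

The sketch doesn't claim PolitiFact follows that exact rule.  It shows only that equal sample counts, the fact Singiser leans on, cannot by themselves rule out selection bias.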

In short, Singiser's suggested explanation doesn't help.

He doesn't get it.



Feb 17, 2011:  Corrected a recurrent misspelling of Markos Moulitsas' last name (no "z" as it turns out).  Sincere apologies to Mr. Moulitsas.
Jan. 24, 2012:  Double apologies to Markos Moulitsas.  While going back to reference the Kos blog entry I found the link broken.  Shortly after I found an identical version of the post clearly credited to Steve Singiser.

Saturday, February 12, 2011

PFB Smackdown: Why grant PolitiFact the presumption of neutrality?

The smackdown is actually courtesy of Eric Ostermeier, author of the study highlighted recently here at PolitiFact Bias and originally published at the University of Minnesota's Smart Politics blog.

Ostermeier sniffed out a criticism of his work at mnpACT! by progressive blogger Dave Mindeman.  After noting two of the central facts in Ostermeier's study, that PolitiFact rates about the same number of statements from Democrats as from Republicans and that Republicans receive the worse ratings of the two, Mindeman offers his two cents:
Occam's razor. The GOP get the worst ratings because they make the worst statements.

Ostermeier concentrates on making Politifact defend their selection process, but overlooks the facts about the statements themselves. Could it be possible that the Republicans make more outrageous and indefensible assertions?

Politifact is certainly going to be drawn to statements that get the most attention and the more outrageous the statement, the more attention it gets.
Ostermeier promptly addressed the second paragraph by posting in the commentary section:
FYI: this very possibility was in fact addressed in my report:

"One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site."
Mindeman (as DaveM) answered back:
Then why ask title your post with "selection bias"? It would seem that most of the data says there is none.
And Ostermeier responded again:
What data is that? You can't use the data I published from coding PolitiFact's stories itself (noting the site attributes more false statements to the GOP) as proof that the GOP lies more and thus there is de facto no selection bias. That's circular reasoning. Note: My report did not definitively prove there is such bias, but the data published shifts the burden, I would argue, to PolitiFact. And greater transparency in their selection methodology would shine a light on this very question.
Ostermeier is exactly right.  Mindeman apparently wants to grant PolitiFact a presumption of neutrality on the issue of selection bias.  But there simply isn't any basis for that presumption.  The presumption would follow if PolitiFact chose its stories at random.  Lacking that, the reader has no good reason to take PolitiFact as a neutral party absent the transparent methodology Ostermeier mentions at the conclusion of his second comment.

Mindeman has the last word on Ostermeier for the moment:
I fail to understand why Politifact has to "prove" anything. They examined political statements that interested them. Most reporters choose their own stories. If they do a good job of reporting the story, do they still have to prove that they have no inherent bias? Maybe I should assume you have an inherent conservative bias because your analysis deals with GOP favorable data?? But I don't, because I think you use the data in a broad enough sense that it tells something regardless of the outcome.
By this time Mindeman is offering some clues that he doesn't understand selection bias.  But the biggest problem in his analysis is actually his attempt to employ Occam's razor (aka the principle of parsimony) to disfavor Ostermeier's hypothesis that PolitiFact displays a selection bias favorable to Democrats.

Occam's razor favors simple explanations, and Mindeman seems to understand it that far.  But it isn't at all clear why he regards a pattern of lying among a large set of entities as a simpler explanation than political bias from a much smaller group of journalists.  Rather than using Occam's razor to legitimately favor a simpler explanation, Mindeman wields it more like a magic wand that produces a supernatural sphere of protection around the ideas he favors.

Thursday, February 10, 2011

Smart Politics on selection bias at PolitiFact

Research associate Eric Ostermeier of "Smart Politics" at the University of Minnesota has published a study of PolitiFact's content and finds a disturbing vein of partisanship in PolitiFact's story selection:
Assuming for the purposes of this report that the grades assigned by PolitiFact are fair (though some would challenge this assumption), there has nonetheless been a great discrepancy regarding which political parties' officials and officeholders receive the top ratings and those that are accused of not telling the truth.
Ostermeier goes on to note PolitiFact's focus on untrue statements by the party out of power--perhaps unusual, given that the press considers itself the watchdog of government.

Ostermeier's approach is exactly right in taking PolitiFact's grade groupings as an indication of PolitiFact's selection bias rather than as a measure of candidate truthfulness.

A key chart from the report: [chart not reproduced; it shows the distribution of PolitiFact's ratings by party]

As Hot Air's Ed Morrissey points out, it remains possible to interpret the data to mean that Republicans simply utter more false and "Pants on Fire" statements than Democrats.

But is that the best explanation in terms of the empirical data?

I would again call attention to the divide between the "False" and "Pants on Fire" ratings.  PolitiFact to date has offered no metric for objectively distinguishing one from the other.  This in turn strongly suggests that subjective judgment serves as the ultimate criterion for grading a statement as "Pants on Fire."

 ***

Importantly, Ostermeier also points out that PolitiFact's presentation encourages readers to use the ratings to draw conclusions about the subjects receiving the ratings:
The question is not whether PolitiFact will ultimately convert skeptics on the right that they do not have ulterior motives in the selection of what statements are rated, but whether the organization can give a convincing argument that either a) Republicans in fact do lie much more than Democrats, or b) if they do not, that it is immaterial that PolitiFact covers political discourse with a frame that suggests this is the case.



Feb 11, 2011: Corrected typo affecting the word "Minnesota." My apologies to that fine state.
June 14, 2018: Clarified the third paragraph (first after the quotation) by removing a "that" and an -ed suffix.

Monday, February 7, 2011

Logicology: "'Non-partisan'"

Scare quotes!  Run away!

Sean W. Malone shared some thoughts about PolitiFact's nonpartisanship at his blog, Logicology:
Sometimes, I feel compelled to compile a list of my own "Iron Lawz" of politics and maybe of reality in general. If I did, one of them would be as follows:
Any time an organization must be cited as "non-partisan", it probably isn't.
I bring this up because of a post I read recently calling the website PolitiFact a "non-partisan" source for analysis about political decisions. But this is, itself, pretty misleading.
Malone sees PolitiFact's 2010 "Lie of the Year" as a great example of partisan fact-checking.  By all means, follow the link and read it all.

Friday, February 4, 2011

PFB Smackdown: Ron Paul proves that PolitiFact isn't biased?

Ken Winters (on Facebook) gave voice to one common type of defense of PolitiFact's objectivity:
Of the politicians checked by Politifact for more than 10 statements, the most honest one is Ron Paul and the most dishonest one is Michele Bachman (both are GOP). So anyone calling Politifact biased, needs to fact-check their own thinking.
I'll stick with the reply I offered on Facebook:
You need to consider PF ratings in terms of selection bias (PF does not randomly choose its topics). It is completely consistent with a liberal bias to rate an oddball Republican (Paul) highly against mainstream Republican competition. Anyone who thinks that Paul's average rating on the "Truth-O-Meter" somehow gets PolitiFact off the hook from the charge of bias needs to fact check their own thinking.
Evaluations such as Winters' rely on dubious assumptions like the following:
  • selection bias as to subject doesn't matter
  • selection bias as to charitable/uncharitable interpretation doesn't exist
  • selection bias as to charitable interpretation doesn't matter
  • PolitiFact doesn't make a significant number of mistakes in its ratings
Winters went on to make a valid point in a subsequent post:  It's hard to back up the charge of bias empirically.

But it's just as hard to back up the claim of fairness empirically.

Thursday, February 3, 2011

James Taranto: "'Death Panels' Revisited"

James Taranto's Wall Street Journal column goes back in time a bit to PolitiFact's biased selection of Sarah Palin's "death panel" post as its "Lie of the Year" for 2009.  Read it all, but the gist is here:
In truth, PolitiFact was more vulnerable to the charge of lying than Palin was, for its highly literal, out-of-context interpretation of her words was at best extremely tendentious.

I couldn't agree more.