Wednesday, May 29, 2013

About that George Mason University study showing PolitiFact rates Republicans as less truthful ...

Just about every media outlet has flubbed the reporting on that study from George Mason University that says PolitiFact finds Republicans less truthful.

Most media outlets lean (or fall) toward the view that the study is saying something about the veracity of Republicans.  That's not the point of the study.  It's a media study.  It's studying PolitiFact, not politicians.   Conclusions from the study apply to PolitiFact, not to politicians.

What's the value of this study?  Not much at all.  It proves nothing, as John Sides points out, because too many competing explanations fit the same facts.  This study simply records what PolitiFact did with its ratings over a given time period.  So as much as we might like to see a study that quantifies PolitiFact's selection bias or outright spin in writing stories, this isn't it.  Our study probably remains the best of the lot when it comes to showing PolitiFact's bias.
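To make the arithmetic behind our "Pants on Fire" research concrete, here's a minimal sketch (in Python, using invented counts, not the study's actual numbers) of the kind of comparison it draws: of each party's statements landing in the false pool ("False" plus "Pants on Fire"), what share gets the subjective bump to "Pants on Fire"?

```python
# Hypothetical illustration only: these counts are invented, not the study's data.
ratings = {
    "Republicans": {"False": 80, "Pants on Fire": 40},
    "Democrats": {"False": 60, "Pants on Fire": 12},
}

for party, counts in ratings.items():
    # The "false pool" is every statement rated either "False" or "Pants on Fire."
    false_pool = counts["False"] + counts["Pants on Fire"]
    # The share of the false pool bumped to "Pants on Fire" is the quantity of interest.
    pof_share = counts["Pants on Fire"] / false_pool
    print(f"{party}: {pof_share:.0%} of false-pool ratings are 'Pants on Fire'")
```

On the study's premise that "Pants on Fire" differs from "False" only in a subjective judgment of ridiculousness, a persistent gap between those shares reflects the rater, not the rated.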

We've run across a couple of media reports that get things mostly right:  Peter Roff at U.S. News & World Report and John Sides of Washington Monthly and "The Monkey Cage."

Roff doesn't clearly describe the point of the study except in terms of his own view (bold emphasis added):
The fact that, as the Lichter study shows, "A majority of Democratic statements (54 percent) were rated as mostly or entirely true, compared to only 18 percent of Republican statements," probably has more to do with how the statements were picked and the subjective bias of the fact checker involved than anything remotely empirical. Likewise, the fact that "a majority of Republican statements (52 percent) were rated as mostly or entirely false, compared to only 24 percent of Democratic statements" probably has more to do with spinning stories than it does with evaluating statements.
Roff is likely describing the very purpose of the study; he's not telling the researchers (nor Sides at The Monkey Cage) anything they don't already know.

But, hilariously, the media have largely interpreted the GMU press release in terms of liberal orthodoxy.

The Poynter Institute, owner of the Tampa Bay Times and PolitiFact, ran the ambiguous headline "Study: PolitiFact finds Republicans ‘less trustworthy than Democrats’" and published comments from long-time PolitiFact editor Bill Adair to the effect that PolitiFact doesn't try to measure "which party tells more falsehoods."  Newsflash, Bill Adair:  That's not the point of the study.

Typically the media published semi-accurate accounts like the one at Poynter.  But a few others flatly interpreted the study as saying Republicans tell more falsehoods.

Ambiguous

The Huffington Post
Mediaite

Evidence Republicans tell more falsehoods

The Raw Story
Talking Points Memo
Salon
PoliticsUSA

The two in the "ambiguous" category should write clarifications.  The four in the latter category should write corrections.

4 comments:

  1. Conservatives always claim that fact checkers (or the media, or educators, or historians, or scientists, or whoever) are "biased against them" when they get called on their lies, disinformation, and myth spreading. What else do you expect serial deceivers to do? Just accept the truth from an impartial perspective? What do you think they are, liberals?

    1. Yes, Ole, the fact checkers are liberals. They're to the political left of the average American, as numerous polls show.

      You're right that it's easy to claim "media bias!" whenever reporting goes against conservatives. Arguably it's done too often. We try to set a sterling example by picking meaningful case studies, and by the weight of our examples (combined with other evidence mentioned on our About/FAQ page) we make a strong case that PolitiFact is biased.

      It's also easy, by the way, to claim that conservatives are just crying media bias, without looking at the evidence. I hope you'll take some time to poke around the site and decide for yourself based on the evidence. Thanks for commenting.

  2. I feel like the statement that "Our study probably remains the best of the lot when it comes to showing PolitiFact's bias" is, itself, a biased claim. I've looked over both now and I feel qualified to draw my own conclusion on that.

    Your study looks into the subjectivity of the Pants on Fire (PoF) rating vs normal lies and concludes that since the ratio of PoF ratings to regular lies is higher for Republicans, it shows bias on PolitiFacts as a whole.

    The issue with that is that the issue isn't that simple. Subjectivity muddles the whole thing and a conclusion can't really be drawn from this data, even though the presentation of said data is helpful. That is to say that Republicans and Democrats aren't saying the same things and as evidenced by the fact that Republicans are "caught" in lies more often, the two groups are not the same.

    If both parties were the same and said the same things, that would eliminate the variable and, yes, the data would then conclude that PolitiFacts is biased against conservatives. However, the groups are not saying the same things and so, even though the PoF rating is subjective, there is still not enough information to draw that conclusion because of the variable of what's being said.

    The only real conclusion that can be drawn there (since your study doesn't contend whether or not Politifacts conclusions are correct, excluding PoF) is that when reading PolitiFacts, it's probably best to read the article on a particular lie and decide for one's self whether the statement being judged is ridiculous or not according to one's own subjectivity.

    As for the George Mason University study, it seems as though it may be the more useful of the two as it suggests that normal PolitiFact ratings (not just the PoF ratings) may be subjective. Ultimately, neither discount PolitiFact as a source of information, but only support the conclusion that one should read into the statement being scrutinized. It's still best to think for one's self than to blindly follow conclusions drawn by others.

    On that note, however, I do notice that most, if not all, of the "Folks Who Link To Us" are conservative in nature. It makes me wonder if one should do a study of Politifactbias.com in order to determine if any of the observations made about PolitiFact are biased, which seems likely.

    Further down the rabbit-hole we go.

    1. "Unknown," (Ari Asulin?) wrote:

      **I feel ... and I feel qualified to draw my own conclusion on that.**

      Go for it. Use the Force, Luke. Let your feelings be your guide.

      **Your study looks into the subjectivity of the Pants on Fire (PoF) rating vs normal lies and concludes that since the ratio of PoF ratings to regular lies is higher for Republicans, it shows bias on PolitiFacts as a whole.**

      If you had read the "Pants on Fire" research, you'd know the conclusion is restricted to bias in the application of the "Pants on Fire" rating. There is no conclusion regarding PolitiFact on the whole. If you're not Ari Asulin, you at least share his habit of getting things wrong.

      **The issue with that is that the issue isn't that simple. (...) Republicans and Democrats aren't saying the same things and as evidenced by the fact that Republicans are "caught" in lies more often, the two groups are not the same.**

      Now you're ignoring your own observation. You're right that it's not simple and it *is* subjective. The subjectivity lends itself, ironically, to drawing reasonable conclusions about the ones who are putting the subjectivity on the record (PolitiFact). It does not, unfortunately for you and PolitiFact, lend itself to conclusions about differences between the Republicans and Democrats receiving the subjective ratings.

      **If both parties were the same and said the same things, that would eliminate the variable and, yes, the data would then conclude that PolitiFacts is biased against conservatives.**

      That condition would allow us to expand the study's evaluation across every rating that PolitiFact offers. But that condition isn't required to support the conclusion from the PoF study. Your conclusion is based on faulty evaluation.

      **However, the groups are not saying the same things and so, even though the PoF rating is subjective, there is still not enough information to draw that conclusion because of the variable of what's being said.**

      The conclusion that PolitiFact applies a bias in the application of its "Pants on Fire" rating, which is the conclusion of the study, is fully supported by the study results. It is reasonable, in addition, to take the results of the study as one sturdy piece of evidence supporting the hypothesis that PolitiFact leans left.

      **The only real conclusion that can be drawn there (since your study doesn't contend whether or not Politifacts conclusions are correct, excluding PoF) is that when reading PolitiFacts, it's probably best to read the article on a particular lie and decide for one's self whether the statement being judged is ridiculous or not according to one's own subjectivity.**

      I'm missing the part of your explanation that would show why clear evidence of bias in the application of "Pants on Fire" ratings doesn't count as evidence of bias in the application of "Pants on Fire" ratings.

      **As for the George Mason University study, it seems as though it may be the more useful of the two as it suggests that normal PolitiFact ratings (not just the PoF ratings) may be subjective.**

      Whatever the strengths of the GMU study (we differ as to those), it's not reproducible as published. Moreover, its results are enslaved to the selection bias problem. How do you separate selection bias from the subjectivity of the overall rating system? The PoF bias study does not suffer from that problem.

      **Ultimately, neither discount PolitiFact as a source of information, but only support the conclusion that one should read into the statement being scrutinized. It's still best to think for one's self than to blindly follow conclusions drawn by others.**

      I heartily recommend that view when reading your conclusions.

      **It makes me wonder if one should do a study of Politifactbias.com in order to determine if any of the observations made about PolitiFact are biased, which seems likely.**

      Keep us posted, Mad Hatter.


Thanks to commenters who refuse to honor various requests from the blog administrators, all comments are now moderated. Pseudonymous commenters who do not choose distinctive pseudonyms will not be published, period. No "Anonymous." No "Unknown." Etc.