Showing posts with label Mark Hemingway. Show all posts
Wednesday, October 30, 2013
The Weekly Standard: 'Will PolitiFact Ever Correct Its Biggest Obamacare Error?'
Just a few weeks ago, I was lamenting the decreased volume of criticism directed at PolitiFact. But as Obamacare promises continue to crash and burn, picking on PolitiFact is back in style. And few do it better than The Weekly Standard's Mark Hemingway:
PolitiFact has a pretty terrible and rather partisan history of Obamacare fact checks. However, there's one, in particular, about Obamacare that remains especially puzzling. It's the "half-true" rating the organization gave when President Obama promised that, "If you like your health insurance, you can keep your health insurance" under Obamacare. This was not a casually tossed-off statement by the president, either. It was made repeatedly and quite deliberately in an attempt to sell America on Obamacare.
Treat yourself to reading every word. Hemingway nails it, and his conclusion is not to be missed.
Sunday, June 2, 2013
PolitiFact's Paradoxical Positions: The Conflicting Claims of Bill Adair
A press release for a study by George Mason University's Center for Media and Public Affairs (CMPA) last week garnered wide attention, though, hilariously, for the wrong reasons. In a case of comically misinterpreted data, the supposedly fact-based left took to the Twitterverse and other media outlets to trumpet what they thought the study said about Republican honesty, when in reality the study was attempting to quantify PolitiFact's liberal tilt. (We don't think it was successful, but more on that later.)
While distorted headlines may comfort some liberals in the warm blanket of confirmation bias, a study highlighting PolitiFact's harsher treatment of the GOP doesn't fit with PolitiFact's self-proclaimed non-partisanship. With headlines like "Study: Republicans are 'the less credible party'" and tweets claiming "Republicans lie 3x more often than Democrats, says @politifact" quickly being repeated, PolitiFact founder Bill Adair felt compelled to issue a response:
Actually, PolitiFact rates the factual accuracy of specific claims; we do not seek to measure which party tells more falsehoods.
We actually agree with Adair on a few points in his response, but there are still problems with it. Mark Hemingway was quick to spot one of them:
The authors of the press release seem to have counted up a small number of our Truth-O-Meter ratings over a few months, and then drew their own conclusions.
Adair's statement is lawyerly and bordering on dishonest. CMPA did not draw their own conclusions—they simply tallied up all of PolitiFact's ratings during a specific time period to get a representative sample. All the CMPA did was present relevant data, they most certainly did not "draw their own conclusions."
Hemingway is right. Apparently all CMPA did was tally up PolitiFact's ratings and report the results. On the one hand, that's a much less dubious version of fact-checking than anything Adair's ever done. But that's also one of the reasons we don't think it's valuable as a tool to measure PolitiFact's bias.
We've written before about why that method is so flawed. Simply adding up the ratings provides data that has too many plausible explanations to reach a conclusion. The most obvious problem is selection bias, which, by the way, Adair openly admits to in his response:
We are journalists, not social scientists. We select statements to fact-check based on our news judgment -- whether a statement is timely, provocative, whether it's been repeated and whether readers would wonder if it is true.
It's good to see Adair finally admit to selection bias, but we've spoken at length about the numerous other problems with this methodology. The bottom line is that a tally of PolitiFact's ratings falls well short of proving ideological bias. A lopsided tally may be consistent with a political bias, but in and of itself it isn't unassailable evidence.
So we agree with Adair that a tally of ratings is a poor measure of which party tells more falsehoods. Of course, that means we also disagree with him.
What Adair forgot to mention in his response to the CMPA study is Adair himself uses the exact same method to promote PolitiFact's ratings:
Instead of traditional articles, our Truth-O-Meter fact-checks are a new form that allows you to see a politician’s report card, to see all fact-checks on a subject or see all the Pants on Fire ratings. We can make larger journalistic points through the automatic tallies and summaries of our work.
You see, when PolitiFact tallies up its ratings, it provides "larger journalistic points" about their work. When academics do it in order to highlight PolitiFact's harsher treatment of the GOP, hey, PolitiFact is just checking specific claims; a tally doesn't tell you anything.
Granted, it's possible Adair misspoke that one time. It's also possible that in 2008 he wrote an article giving readers "tips and tricks on how to find what you want on PolitiFact":
• Check a candidate's report card — Our candidate pages...provide a helpful overview. Each one includes a running tally of their Truth-O-Meter ratings, the most recent claims and attacks they've made, attacks against them and their Flip-O-Meter ratings.
Helpful overview? OK. Well, at least Adair knows that a tally of PolitiFact's ratings can't reveal patterns and trends about a candidate's truth-telling. Right?
Collectively, those fact-checks have formed report cards for each candidate that reveal patterns and trends about their truth-telling.
Wait. What?
The PolitiFact report cards represent the candidates' career statistics, like the back of their baseball cards.
But, but, I thought PolitiFact only checked specific claims and tallies offered no insight into an overall record?
The tallies are not scientific, but they provide interesting insights into a candidate's overall record for accuracy.
Adair is pulling a sneaky trick. In one instance he claims a tally of PolitiFact's ratings reveals nothing, yet on multiple occasions he has implicitly told readers the opposite, claiming that tallying up the ratings is a valuable tool for determining a politician's honesty. Adair cannot simultaneously use that method as both a shield and a sword.
Adair's response to the CMPA study doesn't pass the sniff test. His dismissal of CMPA's method contradicts years' worth of his own lauding of that exact practice. When Adair stops pimping PolitiFact's report cards as if they're some kind of objective referendum on a politician's honesty, we'll be happy to give him kudos. Until then, he's little more than a disingenuous hack trying to have it both ways.
Afters:
At the very least I'll give Adair points for hypocritical chutzpah. The very same day he published his response whining that the study seemed "to have counted up a small number of our Truth-O-Meter ratings" and then "drew their own conclusions", PolitiFact published a recap of Michele Bachmann's record with this admonition:
"At this point, Bachmann's record on the Truth-O-Meter skews toward the red [False]."
If 100 ratings is a "small number" to draw from for your own conclusions about PolitiFact, then how can 59 ratings tell you anything about Bachmann?
Additional Reading:
-Check out Bryan's piece over at Zebra Fact Check, PolitiFact's Artful Dodger, for a more straightforward analysis of both the CMPA press release and Adair's response.
-We always think Hemingway is worth reading and his take on Adair's response to the study is no exception. He has several good points we didn't mention here.
-You can review our empirical evidence highlighting PolitiFact's liberal bias. Our method avoids the flaws we've seen in so many other studies. You can find it here.
Sunday, January 20, 2013
Another Black Knight for PolitiFact
The comedy film "Monty Python and the Holy Grail" is justly famous for its fight scene between King Arthur and the mysterious Black Knight who attempts to block his path.
Arthur defeats the Black Knight, first chopping off an arm, then another arm, then a leg and then the other leg. As the Black Knight suffers each stage of defeat he defiantly continues to challenge Arthur to continue the fight.
PolitiFact's efforts to defend itself from criticism often run parallel to the Black Knight's fighting prowess against Arthur.
The latest duel pits PolitiFact editor Bill Adair against critics who say Fiat's confirmation that it will produce over 100,000 Jeep vehicles annually at a Chinese manufacturing plant undercuts PolitiFact's 2012 choice for "Lie of the Year." The Romney campaign produced an ad saying Obama sold Jeep to Italians who will build Jeeps in China. PolitiFact ruled the ad "Pants on Fire" in October before selecting it as the "Lie of the Year" in December.
The original ruling drew plenty of criticism, and the recent confirmation of the deal to produce Jeeps in China produced a renewal of that criticism, perhaps best expressed by Mark Hemingway of The Weekly Standard.
"It's just a flesh wound."
On Jan. 18 Adair responded to the latest round of criticism:
A number of readers emailed us this week about news reports that Chrysler is moving forward with a partnership in China to produce Jeeps. They wondered: Doesn’t that disprove our Lie of the Year -- that Mitt Romney said Barack Obama "sold Chrysler to Italians who are going to build Jeeps in China" at the cost of American jobs?
No, it doesn’t.
It bears emphasis that Jeep sold about 50,000 American-made Jeeps in China in 2012. Somehow no mention of Jeep exports to China crept into any of PolitiFact's fact checking of the Romney ad.
Adair's right about one thing, at least. All of PolitiFact's "Lie of the Year" selections contain a significant element of truth, so of course it doesn't matter to PolitiFact if the ad is true. It can still qualify as "Lie of the Year." The tough thing for Adair to explain, which he doesn't attempt, is how the ad can be technically true yet receive a "Pants on Fire" rating as election day approached.
It's just another dismal defense of a PolitiFact blunder.
Mark Hemingway, by the way, responded with Arthurian effectiveness to Adair's post the same day it was published.
We'll give away the ending:
PolitiFact has a reputation for alternately being unresponsive or inadequately responding to criticisms. And they haven't done anything to remedy that today.
Exactly.
(The video contains language some may find offensive. Oh, and there's lots of obviously fake blood.)
Jeff adds (1/30/13):
Adair's most recent CYA/non-response to Hemingway is a typically awful specimen of the genre, and PolitiFact has had some stinkers. Chock full of evasions and denials, it suggests Adair is completely unable to confront the facts that lurk in front of his face. Take a look at the opening paragraph of his nada culpa, and pay special attention to the quotation marks:
[Readers] wondered: Doesn’t that disprove our Lie of the Year -- that Mitt Romney said Barack Obama "sold Chrysler to Italians who are going to build Jeeps in China" at the cost of American jobs?
No, it doesn’t.
The entire basis for the Pants on Fire rating is something the Romney ad never claimed. If it did, why didn't PolitiFact quote the relevant portion? The portion that Adair quotes is entirely accurate, even by PolitiFact's own admission. The only falsehood here is PolitiFact's invention that the Romney ad claimed it would cost American jobs.
Another comically dishonest diversion from Adair is his assertion that PolitiFact isn't making a value judgement on Obama's policy. He writes:
We should be clear, we are not defending President Obama’s auto policy. As independent fact-checkers, we don’t take positions on policy issues. So whether it was advisable to bail out the auto companies, and/or whether the bailout was done with proper safeguards, was beyond the scope of our fact-check.
As I pointed out in our original review of this claim back in November, PolitiFact was much more smitten with the President's performance back then (emphasis added):
With Ohio’s 18 electoral votes very much in play, the Mitt Romney campaign aims to blunt one of Barack Obama’s key advantages in that state -- his rescue of the auto industry.
Let me be clear: PolitiFact has determined that Barack Obama single-handedly rescued the entire auto industry...they're just not taking a position on it.
Sunday, September 2, 2012
The Weekly Standard: "PolitiFact's Credulous Romney-Ryan Health Care Attacks"
The Weekly Standard's Mark Hemingway once again exemplifies what I'm talking about when I say the criticisms of PolitiFact from the right sustain a higher standard than those from the left. Hemingway methodically dismantles PolitiFact's facade of misstatements surrounding health care claims by Mitt Romney and Paul Ryan.
Behold as Hemingway warms to his topic:
Perhaps if we all ignore PolitiFact, they'll go away. But for the time being, the supposedly independent organization continues to crank out skewed and partisan work. There's no better example of this than the current jihad the "fact checking" organization is waging against the Romney-Ryan health care plan.
Hemingway goes on to point out PolitiFact's failure to acknowledge the power of the Independent Payment Advisory Board to implement policies that reduce services for Medicare beneficiaries by decreasing the supply of providers.
He then highlights the mendacity of the Obama campaign and its fact-checking lackey in promoting the claim that Medicare beneficiaries may bear an increased cost of $6,400 per year for Medicare insurance. The Obama campaign attacked an obsolete version of a Ryan reform, and PolitiFact evidently granted the Democrats the benefit of the doubt that attacking the old plan was not an attempt to mislead the audience. The claim in the ad, says PolitiFact, is "Half True."
Hemingway:
PolitiFact presents no evidence that the current Romney-Ryan Medicare plan will costs [sic] seniors anywhere close to $6,000. So how the heck, in the total absen(c)e of evidence, does that statement rate even "half true"?
May I suggest to Hemingway that employing inconsistent standards for judgment can easily assist the opinion journalists at PolitiFact in reaching their apparently partisan conclusions?
Hemingway makes the complex easy to digest, so make the time to read the whole thing.
Wednesday, July 25, 2012
The Weekly Standard: "PolitiFact Mucks Up the Contraception Debate"
This year has sped by at a breathtaking pace so far, and we've neglected to review some worthy stories about PolitiFact simply because we placed a higher priority on some stories than others.
But it's not too late.
In February, The Weekly Standard's Mark Hemingway weighed in with yet another damning assessment of PolitiFact's talent for fact checking:
Before I explain why PolitiFact is once again being deliberately misleading, grossly incompetent, or some hellbroth of these distinguishing characteristics, you'll have to bear with me. Part of the reason PolitiFact gets away with being so shoddy is that it counts on its readers believing that it can be trusted to explain any necessary context to justify its status as judge, jury, and factual executioner.
Obviously the right thing to do now is click the link and read the whole thing for yourself.
For those who don't have the time, I'll sum up:
Hemingway's latest example of PolitiFactian perfidy concerns its use of a Guttmacher Institute publication to support an Obama administration claim that 98 percent of sexually active women use birth control.
The Obama administration was trying to justify its insurance mandate requiring birth control as a basic coverage requiring no copay.
Hemingway noted the Guttmacher Institute's lack of neutrality, a number of the arguments marshaled against its findings and PolitiFact's selective use of the evidence.
At the end of the day, a study drawn from a group of women aged 15-44 does not justify extrapolating the data to the set of all women of any age. PolitiFact went soft again on an administration claim.
Wednesday, June 20, 2012
The Weekly Standard: "Romney to PolitiFact: There You Go Again"
The Weekly Standard's Mark Hemingway was back in PolitiFact's grille back in April.
PolitiFact ruled "Mostly False" a claim from the Mitt Romney campaign that women as a group have suffered 92.3 percent of the net job losses under Obama's presidency. That ruling brought a swift and stern response from the Romney campaign.
Hemingway filed the battle report:
Given that PolitiFact says Romney's numbers check out, how the heck did PolitiFact then conclude Romney's statement is "mostly false"? Well, they did what fact checkers habitually do whenever they find something factually correct but politically disagreeable—kick up a bunch of irrelevant contextual dirt and lean on some biased sources. Which is why PolitiFact's own language here is absurd: "We found that though the numbers are accurate, their reading of them isn’t" and "The numbers are accurate but quite misleading." I would also note that my friend Glenn Kessler, the fact checker at the Washington Post, evaluated the same claim and deemed it "TRUE BUT FALSE." I do hope that if media fact checkers expect to retain any credibility to evaluate basic empirical claims, they're aware that this kind of Orwellian doublespeak is going to make them a laughingstock.
Read the whole thing, because Hemingway's just warming up with the above.
The above point, that PolitiFact appears absurd for ruling a true statement "Mostly False," probably can't receive enough emphasis. PolitiFact's rating system provides no description fitting this type of rating. If the results make it look like PolitiFact isn't categorizing claims according to some type of established objective criteria, it's probably because that's the way it is.
Addendum:
PolitiFact's response to the complaint from the Romney campaign deserves a closer look:
We considered the complaint and interviewed four other economists, none of whom have formal or financial ties to any campaigns. Our additional reporting found no reason to change our ruling, which remains at Mostly False.
Two words: Fig leaf.
The point is that the original reporting didn't justify the ruling. If PolitiFact can't see that then it's no surprise that additional reporting fails to sway its made-up mind.
Monday, February 20, 2012
The Weekly Standard: "Liberal Pundits Shocked to Discover PolitiFact Not Always Factual"
Mark Hemingway of the Weekly Standard has earned a reputation as perhaps PolitiFact's top critic. As evidence of that, Hemingway beat me to the "late to the party" theme by about a month in the wake of the progressive outrage over PolitiFact's "Lie of the Year" selection for 2011.
I'm sorry I missed his article before now.
Hemingway:
So the liberal punditry woke up today to find that PolitiFact has declared the "Lie of the Year" to be Democrats' claim that Paul Ryan's budget will "end Medicare" or "end Medicare as we know it." They're having quite the collective freakout—see Paul Krugman, Jonathan Chait, Matt Yglesias, Brian Beutler, Steve Benen, et al.
Hemingway concedes the "end Medicare" claim has some truth to it:
Accusing Republicans of trying to end Medicare as we know it is also a stupid criticism because the implementation of the Independent Payment Advisory Board (IPAB) in the Patient Protection and Affordable Care Act will also "end Medicare as we know it." And unlike Ryan's plan, Democrats already made IPAB the law of the land. Under IPAB, unelected federal bureaucrats chosen by the president will bypass Congress and set the Medicare budget, and this will likely have pretty dramatic consequences for the program, such as severely restricting doctor access and rationing. It might well prove unconstitutional to boot.
So why all the outrage if Medicare as we know it is already dead and gone? Hemingway has a hypothesis:
Liberals are freaking out over this because they're so used to PolitiFact and other fact checkers breaking things their way.
Ouch!
But he's probably right. And, as usual, it's well worth reading the whole article.
Correction 2/21/2012: Fixed spelling of "Pundits" in the title.
Wednesday, January 11, 2012
Mark Hemingway and Glenn Kessler on NPR
Mark Hemingway, who wrote a key critique of modern fact-checking operations back in December, appeared with the Washington Post's fact checker, Glenn Kessler, for a radio interview on NPR. It's worth either listening to it or reading the transcript, but one particular section, transcribed below, deserves special attention.
Hemingway's December article was quite valuable, but he missed an opportunity to explain an important aspect of Eric Ostermeier's examination of PolitiFact's story selection.
PolitiFact rated about the same number of politicians from each party. Yet one party received significantly worse "Truth-O-Meter" ratings. The key inference behind Ostermeier's study was that a party-blind editorial selection process should be expected to choose the same types of stories for both parties. If Republicans really do lie more, the results would then show approximately the same distribution of ratings but with one party more heavily represented in the total number of stories. The approximately even number of stories for each group throws a monkey wrench into Noreen's reasoning.
It would have been good if Hemingway had explained that during the broadcast.
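Ostermeier's inference can be illustrated with a toy calculation. This is only a sketch with hypothetical numbers (the counts and sampling rate are mine, not the study's): if one party really did produce three times as many false claims and the fact checker sampled claims without regard to party, the lopsidedness should show up in the story counts, not just in the ratings.

```python
# Hypothetical illustration of party-blind story selection.
# Assumed numbers: Democrats make 100 checkable false claims,
# Republicans make 300, and a party-blind fact checker samples
# 25% of all false claims regardless of who said them.

def party_blind_sample(false_claims_by_party, sample_rate):
    """Expected number of fact checks per party under party-blind selection."""
    return {party: n * sample_rate for party, n in false_claims_by_party.items()}

claims = {"Democrats": 100, "Republicans": 300}
expected = party_blind_sample(claims, 0.25)

# Under these assumptions the GOP draws ~3x as many stories:
print(expected)  # {'Democrats': 25.0, 'Republicans': 75.0}
```

On these toy numbers, "Republicans lie 3x more" plus blind selection predicts a roughly 3-to-1 story count. PolitiFact's observed pattern, near-equal story counts with sharply unequal ratings, is the opposite shape, which is the point Ostermeier's analysis turns on.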
As a side note, it's interesting that Kessler likewise ends up writing approximately as many stories about Democrats as about Republicans. Run the numbers for Kessler as Ostermeier did for PolitiFact and perhaps the tendencies look alike. The obvious reason for focusing on PolitiFact instead of Kessler is PolitiFact's far greater volume of material.
(1/12/12) Jeff adds: There's a flaw that is often overlooked when discussing the "add 'em up" style of interpreting PolitiFact's ratings, and that's the issue of the quality of the fact checks themselves. Assuming PolitiFact actually adheres to an objective formula for avoiding selection bias, and then rates 50 statements from the left and 50 from the right, it still wouldn't disprove an ideological lean.
Take for example the different standards used when PolitiFact rated similar statements from Herman Cain and Barack Obama. Both included the employer's portion of payroll taxes in their respective calculations, but in Cain's case PolitiFact downgraded him for it, while in Obama's case it pushed him higher up the rating scale. And this still doesn't take into account the dishonest tactic of inventing statements out of thin air.
It may be interesting to review the tallies of who gets what ratings and discuss the merits of the numbers. Ultimately though it's the alternating standards that will offer the best evidence of PolitiFact's liberal bias.
(1/19/2012) Jeff adds: An additional flaw with adding up PolitiFact's ratings is that PolitiFact chooses who to give the rating to.
When Obama claimed that preventative health care "saves money", and David Brooks said he's wrong, PolitiFact gave a True to Brooks. This serves the dual purpose of sparing Obama a False on his "report card" that PolitiFact likes to shill so often, while also providing cover in the "we give Republicans True's too!" sense.
When PolitiFact rated the oft-repeated, and false, claim about $16 muffins at a DOJ event, PolitiFact could have given the rating to NPR, the New York Times, or even (gasp!) PolitiFact partner ABC News. Instead, they chose to burden Bill O'Reilly with the falsehood, despite the original claim coming from a government report.
It's these types of shenanigans that will always distort a ratings tally.
Update/clarification (1/14/2012):
Added "for a radio interview on NPR" to the first sentence.
CONAN: Here's an email from Noreen(ph). I don't understand that the - that since - excuse me. I don't understand the idea that since PolitiFact demonstrates that Republicans lie three times as often as Democrats mean it's biased. Maybe Republicans actually do lie that much more. The idea that you have to have an even number of lies reported for Democrats and Republicans in order to be considered not biased is ridiculous. One side could lie way more than the other. And by trying to make them even, you are distorting fact. Is simple numerical balance an indication of nonpartisanship?
KESSLER: No. I don't look at them that way, and, as I said, I don't really keep track of, you know, how many Democrats or how many Republicans I'm looking at until, you know, at the end of the year, I count it up. My own experience from 30 years covering Washington and international diplomacy and that sort of thing is there's - both Democrats and Republicans will twist the truth as they wish if it somehow will further their aims. I mean, no one is pure as a driven snow here. And I've often joked that if I ever write an autobiography, I'm going to title it "Waiting for People to Lie to Me."
(SOUNDBITE OF LAUGHTER)
CONAN: That's something reporters do a lot. Mark Hemingway?
HEMINGWAY: Why - I think I said when I even brought this up. I mean, you know, I don't think that, you know, that, you know, numerical selection is indicative of, you know, bias per se. I just think that it's highly suspicious. When it's three to one, you know, if it were 60-40, you know, whatever, yeah, sure, you know? But when it's three to one, you start getting things where, you know, you start wondering about, you know, why the selection bias.
Hemingway's December article was quite valuable, but he missed an opportunity to explain an important aspect of Eric Ostermeier's examination of PolitiFact's story selection.
PolitiFact rated about the same number of politicians from each party, yet one party received significantly worse "Truth-O-Meter" ratings. The key premise behind Ostermeier's study is that a party-blind editorial selection process should choose the same types of stories for both parties. If Republicans really did lie more, the results should show approximately the same distribution of ratings for each party, with one party simply more heavily represented in the total number of stories. The roughly even number of stories for each party throws a monkey wrench into Noreen's reasoning.
It would have been good if Hemingway had explained that during the broadcast.
As a side note, it's interesting that Kessler likewise ends up writing approximately as many stories about Democrats as about Republicans. Run the numbers for Kessler as Ostermeier did for PolitiFact and perhaps the tendencies look alike. The obvious reason for focusing on PolitiFact instead of Kessler is PolitiFact's far greater volume of material.
(1/12/12) Jeff adds: There's a flaw that is often overlooked when discussing the "add 'em up" style of interpreting PolitiFact's ratings, and that's the issue of the quality of the fact checks themselves. Assuming PolitiFact actually adheres to an objective formula for avoiding selection bias, and then rates 50 statements from the left and 50 from the right, it still wouldn't disprove an ideological lean.
Take for example the different standards used when PolitiFact rated similar statements from Herman Cain and Barack Obama. Both included the employer's portion of payroll taxes in their respective calculations, but in Cain's case PolitiFact downgraded him for it, while in Obama's case it pushed him higher up the rating scale. And this still doesn't take into account the dishonest tactic of inventing statements out of thin air.
It may be interesting to review the tallies of who gets what ratings and discuss the merits of the numbers. Ultimately, though, it's the inconsistent standards that offer the best evidence of PolitiFact's liberal bias.
(1/19/2012) Jeff adds: An additional flaw with adding up PolitiFact's ratings is the fact that PolitiFact chooses who to give the rating to.
When Obama claimed that preventative health care "saves money", and David Brooks said he's wrong, PolitiFact gave a True to Brooks. This serves the dual purpose of sparing Obama a False on his "report card" that PolitiFact likes to shill so often, while also providing cover in the "we give Republicans True's too!" sense.
When PolitiFact rated the oft-repeated, and false, claim about $16 muffins at DOJ event, PolitiFact could have given the rating to NPR, the New York Times, or even (gasp!) PolitiFact partner ABC News. Instead, they chose to burden Bill O'Reilly with the falsehood, despite the original claim coming from a government report.
It's these types of shenanigans that will always distort a ratings tally.
Update/clarification (1/14/2012):
Added "for a radio interview on NPR" to the first sentence.
Sunday, December 25, 2011
The Weekly Standard: "Damned Lies and ‘Fact Checking’ (cont.)"
The Weekly Standard has a follow-up to Mark Hemingway's story earlier this month on the foibles of fact checking (link to our review).
The update, under the title "Damned Lies and 'Fact Checking' (cont.)," is mostly subscriber-only content, though the whole of it is available for preview at present on the Standard's "The Scrapbook" main page.
Without giving too much away, this nugget both serves as an appropriate tease and a colorful summary of the story:
It’s high time liberal pundits figured out that there’s more going on in this fact-checking bordello than raucous piano music. If they’d been paying attention, they would have long ago stopped patronizing these journalistic houses of ill repute.
Sunday, December 11, 2011
The Weekly Standard: "Lies, Damned Lies, and 'Fact Checking'"
The Weekly Standard and Mark Hemingway add yet another effective critique of PolitiFact to the growing set:
They call themselves “fact checkers,” and with the name comes a veneer of objectivity doubling as a license to go after any remark by a public figure they find disagreeable for any reason. Just look at the Associated Press to understand how the scheme works.
Yes, Hemingway first uses the Associated Press as his example. But PolitiFact isn't far behind:
(I)n 2009 the St. Petersburg Times won a Pulitzer Prize for PolitiFact, endowing the innovation with a great deal of credibility. “According to the Pulitzer Prize-winning PolitiFact . . . ” has now become a kind of Beltway Tourette syndrome, a phrase sputtered by journalists and politicians alike in an attempt to buttress their arguments.
Ouch!
If the stated goal seems simple enough—providing an impartial referee to help readers sort out acrimonious and hyperbolic political disputes—in practice PolitiFact does nothing of the sort.
Hemingway backs his assessment with the same example he used in his 2010 critique of PolitiFact in the Washington Examiner: Rand Paul's statement about the gulf between average private sector pay and that received by federal workers. Hemingway again explains the preposterousness of that rating and calls it "non-atypical" of PolitiFact.
What's PolitiFact's problem? Hemingway's rundown sounds themes familiar to regular readers of PFB:
The media establishment has largely rallied round the self-satisfied consensus that fact checking is a noble pursuit. Nonetheless there are signs of an impending crack-up. In their rush to hop on the fact-checking bandwagon, the media appear to have given little thought to what their new obsession says about how well or poorly they perform their jobs.
In a nutshell, the fact checkers are biased and not particularly good at fact checking.
It’s impossible for the media to fact check without rendering judgment on their own failures. Seeing the words “fact check” in a headline plants the idea in the reader’s mind that it’s something out of the ordinary for journalists to check facts. Shouldn’t that be an everyday part of their jobs that goes without saying? And if they aren’t normally checking facts, what exactly is it that they’re doing?
Remember to read Hemingway's every word. This review doesn't do it full justice.
Monday, January 3, 2011
Washington Examiner: "Politifact Is Often More Politics Than Facts"
Mark Hemingway of the Washington Examiner exposes flaws in Politifact's rating of Rand Paul. The Kentuckian had pointed out a disparity between private and public worker compensation:
“The average federal employee makes $120,000 a year. The average private employee makes $60,000 a year.”
Politifact rated him False. They explained that Paul might confuse his audience:
Since most people usually think about how much they, their spouses and their colleagues get paid in salary alone — not salary plus benefits — we think most people hearing this statement would assume that Paul means that the average federal employee gets paid a salary of $120,000. That’s simply not true.
Politifact offered no evidence that "most people" would think Paul was talking about salary alone. And Hemingway was quick to point this out:
"So what they’re saying is not that what Paul said was literally false, but that according to how they think people will understand what he said, it’s not true. Come again?"Hemingway concludes that despite Politifact framing the fact-check to their own ambiguous standards, they still missed the mark-
"Politifact does make one relevant point about the average private sector worker not being an apples-to-apples comparison to the average federal worker, but that has no bearing on what Paul actually said and hardly justifies the exorbitant compensation federal workers get."You can read the entire article here.
You can also read a companion critique at Sublime Bloviations that points out another flaw with the Politifact piece: three months prior to the Paul rating, Politifact came to a different conclusion when it rated Mike Keown.
That makes three separate fact checks on essentially the same issue, with two different conclusions.