
Friday, March 15, 2019

Remember Back When PolitiFact was Fair & Balanced?

PolitiFact has leaned left from the outset (2007).

It's not uncommon to see people lament PolitiFact's left-leaning bias while claiming that once upon a time PolitiFact did its fact-checking even-handedly.

But we've never believed the fairy tale that PolitiFact started out well. It's always been notably biased to the left. And we just stumbled across a PolitiFact fact check from 2008 that does a marvelous job illustrating the point.


It's a well-known fact that nearly half of U.S. households pay no net federal income tax, right?

Yet note how the fact checker, in this case PolitiFact's founding editor Bill Adair, frames President Obama's claim:
In a speech on March 20, 2008, Obama took a different approach and emphasized the personal cost of the war.

"When Iraq is costing each household about $100 a month, you're paying a price for this war," he said in the speech in Charleston, W.Va.
Hold on there, PolitiFact.

How can the cost of the war, divided up per family, rightly get categorized as a "personal cost" when about half of the families aren't paying any net federal income tax?

If the fact check were serious about the personal cost, it would look at the differences in tax burdens. Families paying a high amount of federal income tax would pay far more than the price of their cable bill. And families paying either a small amount of income tax or no net income tax would pay much less than the cost of their cable service for the Iraq War (usually $0).
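
To make the arithmetic concrete, here is a minimal sketch in Python. The tax-share multiples are invented for illustration and are not drawn from the fact check:

    # A $100/month *average* war cost per household says nothing about what
    # any particular household pays, because the cost is financed through
    # federal income taxes, which are paid very unevenly.

    AVG_MONTHLY_COST = 100  # Obama's figure: average Iraq War cost per household

    # Invented multiples of the average tax share, for illustration only.
    households = {
        "high-income household": 5.0,        # pays five times the average share
        "median household": 1.0,             # pays roughly the average share
        "no-net-income-tax household": 0.0,  # pays nothing toward the war
    }

    for label, share in households.items():
        print(f"{label}: ${AVG_MONTHLY_COST * share:,.2f}/month")

    # Prints $500.00, $100.00 and $0.00 per month: the "personal cost" ranges
    # from several cable bills to nothing at all.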

PolitiFact stuffs the information it should have used to pan Obama's claim into paragraph No. 8, where it is effectively quarantined with parentheses (parentheses in the original):
(Of course, Obama's simplified analysis does not reflect the variations in income tax levels. And you don't have to write a check for the war each month. The war costs are included in government spending that is paid for by taxes.)
President Obama's statement was literally false and highly misleading as a means of expressing the personal cost of the war.

But PolitiFact couldn't or wouldn't see it and rated Mr. Obama's claim "True."

Not that much has changed, really.


Afters (for fun)

The author of that laughable fact check is the same Bill Adair later elevated to the Knight Chair in Duke University's journalism program.

We imagine Adair earned his academic throne in recognition of his years of neutral and unbiased fact-checking, even knowing President Obama was watching him from behind his desk.

Sunday, October 22, 2017

The PolitiFact Evangelism & Revival Tour II

Thanks to a generous and wasteful grant from the Knight Foundation, PolitiFact is reaching out to red state voters!

These outreaches suspiciously correlate with new PolitiFact state franchises, making it look like the Knight Foundation wants to help PolitiFact advertise itself.

Daniel Funke of the Poynter Institute posted a story about the Oklahoma leg of PolitiFact's dog & pony show. We reviewed that in the first part of this series. This installment concerns a Washington Post story about the third and final stage of the evangelism and revival tour, ending up in West Virginia.

What's the Purpose of This Tour, Again?


The Post article leads with a section that more-or-less paints PolitiFact's outreach as a failure.

PolitiFact planned to go out and tell people PolitiFact is nonpartisan and fair and let them see, at least to some degree, how PolitiFact works. That was supposed to lead to greater trust. But when given the opportunity to make that case, PolitiFact editor Amy Hollyfield comes across like Eeyore.
“I have discussions with people about the news all the time on Facebook, and I show them what I consider to be credible sources of information,” a man named Paul Epstein says from a middle row. “And they say, ‘Oh, that’s all biased.’ So how can you, or how can we, convince people to trust any mainstream media?”

Amy Hollyfield of PolitiFact, the Pulitzer Prize-winning fact-checking organization, considers the question. She hesitates a beat before telling Epstein and about 65 others in the audience that maybe you can’t. Not all the time.
Well, that's encouraging! What else does Hollyfield have?
“We have a lot of things on our website” that attest to PolitiFact’s impartiality and credibility, Holly­field says. “But I don’t think that seeps in when you’re having that kind of conversation. That’s why we’re trying to tell our story.”
Specifics? Aren't specifics always foremost in the minds of journalists? Okay, maybe Hollyfield gave the specifics. Maybe the Post's Paul Farhi left them out. But it seems to us beyond question that if the idea of the evangelism tour is to build trust of PolitiFact in red states then PolitiFact should focus on those specifics, whatever they are.
The fact-checkers keep steering the conversation back to Politi­Fact and its 10-year track record of rating political speech, including how it assigns its most damning rating, “Pants on Fire.”
What? It would be great to have some specifics on that. Pretty much the best description we have of the difference between PolitiFact's "False" and "Pants on Fire" ratings is PolitiFact Editor Angie Drobnic Holan's immortal "Sometimes we decide one way and sometimes decide the other." We'd like to know even more about this occult-yet-objective (?) process. But there's nothing new in the Post article. So not today.


Sharockman has the Evidence of Neutral Nonpartisanship (not)!


Just a few days ago we published a chart showing PolitiFact has published more fact checks of President Trump between his inauguration and Oct. 18 than it did of President Obama over the same periods in 2009 and 2013 combined. We did it to show the utter ridiculousness of Executive Director Aaron Sharockman's argument that frequent fact-checking of Obama serves as evidence of PolitiFact's neutrality.

Lo and behold, the Post captured Sharockman making that same argument again. Christmas in October (bold emphasis added):
(Sharockman) bristles a bit at the conservative critique ["The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind."--bww]. “People say, ‘Why didn’t you fact-check Hillary Clinton’s claim about coming under fire [as first lady] in Bosnia?’ Well, we did. The person we fact-checked more than anyone else is Barack Obama. . . . The person we fact-check the most is the president. We’re going to hold the president accountable.”
As we pointed out in our earlier article accompanying the graph, yes, of course national fact checkers check the president the most. That will be true regardless of party, and it therefore serves as no evidence whatsoever of impartiality, particularly when a Republican president appears to have drawn greater scrutiny than Obama did. Sharockman's argument is flim-flam.

This article about PolitiFact trying to convince conservatives it is neutral and non-partisan gives conservatives no evidence of PolitiFact's neutrality or non-partisanship. These people could use some talking points that have greater strength than wet toilet paper.


Hey, the article mentions "PolitiFact Bias"!

Plus: How PolitiFact could build trust across the board


At the risk of humeral fracture from patting ourselves on the back, the best section of the Post article is the one that mentions PolitiFact Bias. That's not because it mentions PolitiFact Bias, though that's part of it (bold emphasis added):
(Sharockman)’s fully aware of the free-floating cynicism about fact-checking, a form that has enjoyed a boomlet in the past few years with such outfits as PolitiFact, FactCheck.org, Snopes and The Washington Post’s Fact Checker on the scene. In one poll last year, 88 percent of people who supported Trump during the 2016 campaign said they didn’t trust media fact-checking. (Overall, just 29 percent of likely voters in the survey said they did.) PolitiFact itself has come in for particularly intense criticism; a blog called PolitiFact Bias is devoted to “exposing [its] bias, mistakes and flimflammery.”

The basic critique is that fact-checkers cherry-pick statements and facts to create a false impression — usually that conservative candidates are less truthful than the liberal kind.
The fact is that the polls show that moderates and independents are more skeptical about mainstream media fact-checking than are Democrats. The corollary? The political group that most trusts political fact-checking is Democrats.

Shouldn't we expect moderates more than Democrats or Republicans to favor PolitiFact if it treats Democrats and Republicans with equal skepticism? Indeed, for years PolitiFact tried to argue for its neutrality by saying it gets attacked from both sides. Left unsaid was the fact that most of the attacking came from one side.

PolitiFact needs to hear the message in the numbers. Likely voters don't trust fact checkers (71 percent!). PolitiFact can't do meet-and-greets with 71 percent of likely voters. To earn trust, PolitiFact needs to dramatically ramp up its transparency and address the criticism it receives. If the criticism is valid, make changes. If the criticism is invalid, then crush entities like PolitiFact Bias by publicly discrediting their arguments with better arguments.

Establish trust by modeling transparently trustworthy behavior, in other words.

Or PolitiFact can just keep doing what it's doing and see if that 30 percent or so that trusts it just happens to grow.

Good luck with that.


Afters

Is this true?
The fact of the matter is that both sides are becoming less moored to the truth, Sharockman says. The number of untrustworthy statements by Republicans and Democrats alike has grown over the past three presidential cycles, he noted.
Our numbers show that the number of false ("False" plus "Pants on Fire") statements from Democrats, as rated by PolitiFact, dropped from PolitiFact's early years, with a minor spike during the 2016 election cycle.
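
Anyone who wants to test Sharockman's claim against the data can tally the ratings themselves. Here is a minimal Python sketch, assuming a hypothetical CSV export of PolitiFact ratings; the file name and column layout are our invention:

    # Count "False" plus "Pants on Fire" ratings per party per year from a
    # hypothetical CSV with columns: date (YYYY-MM-DD), party, rating.
    import csv
    from collections import Counter

    FALSE_RATINGS = {"False", "Pants on Fire"}

    counts = Counter()
    with open("politifact_ratings.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["rating"] in FALSE_RATINGS:
                counts[(row["party"], row["date"][:4])] += 1

    # A rise across three presidential cycles would support Sharockman's
    # claim; a decline for Democrats (with a 2016 spike) would support ours.
    for (party, year), n in sorted(counts.items()):
        print(f"{year} {party}: {n}")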


What data would support Sharockman's claim, we wonder?

Friday, April 7, 2017

PolitiFact fixes fact check on Syrian chemical weapons

When news reports recently appeared suggesting the Syrian government used chemical weapons, those reports presented a problem for PolitiFact. As noted by the Daily Caller, among others, PolitiFact said in 2014 it was "Mostly True" that 100 percent of Syrian chemical weapons were removed from that country.

If the Syrian government used chemical weapons, where did it get them? Was it a fresh batch produced after the Obama administration forged an agreement with Russia (seriously) to effect removal of the weapons?

Nobody really knows, just like nobody truly knew the weapons were gone when PolitiFact ruled it "Mostly True" that the weapons were "100 percent gone." (screen capture via the Internet Archive)


With public attention drawn to its questionable ruling by the April 5, 2017 Daily Caller story, PolitiFact archived its original fact check and redirected the old URL to a new (also April 5, 2017) PolitiFact article: "Revisiting the Obama track record on Syria’s chemical weapons."

At least PolitiFact didn't make its old ruling simply vanish, but has PolitiFact acted in keeping with its commitment to the International Fact-Checking Network's statement of principles?
A COMMITMENT TO OPEN AND HONEST CORRECTIONS
We publish our corrections policy and follow it scrupulously. We correct clearly and transparently in line with our corrections policy, seeking so far as possible to ensure that readers see the corrected version.
And what is PolitiFact's clear and transparent corrections policy? According to "The Principles of PolitiFact, PunditFact and the Truth-O-Meter" (bold emphasis added):

When we find we've made a mistake, we correct the mistake.

  • In the case of a factual error, an editor's note will be added and labeled "CORRECTION" explaining how the article has been changed.
  • In the case of clarifications or updates, an editor's note will be added and labeled "UPDATE" explaining how the article has been changed.
  • If the mistake is significant, we will reconvene the three-editor panel. If there is a new ruling, we will rewrite the item and put the correction at the top indicating how it's been changed.
Is the new article an update? In at least some sense it is. PolitiFact removed and archived the fact check thanks to questions about its accuracy. And the last sentence in the replacement article calls the article an "update":
In the days and weeks to come, we will learn more about the recent attacks, but in the interest of providing clear information, we have replaced the original fact-check with this update.
If the new article counts as an update, we think it ought to wear the "update" tag that would make it appear on PolitiFact's "Corrections and Updates" page, where it has yet to appear (archived version).

And we found no evidence that PolitiFact posted this article to its Facebook page. How are readers misled by the original fact check supposed to encounter the update, other than by searching for it?

Worse still, the new article does not even appear on the list for the "The Latest From PolitiFact." What's the excuse for that oversight?

We believe that if PolitiFact followed its corrections policy scrupulously, we would see better evidence that PolitiFact publicized its admission it had taken down its "Mostly True" rating of the claim of an agreement removing 100 percent of Syria's chemical weapons.

Can evidence like this stop PolitiFact from receiving "verified" status in keeping with the IFCN fact-checkers' code?

We doubt it.


Afters
It's worth mentioning that PolitiFact's updated article does not mention the old article until the third paragraph. The fact that PolitiFact pulled and archived that article waits for the fifth paragraph, nearly halfway through the update.

Since PolitiFact's archived version of the pulled article omits the editor's name, we make things easy for our readers by going to the Internet Archive for the name: Aaron Sharockman.

PolitiFact's "star chamber" of editors approving the "Mostly True" rating likely included Angie Drobnic Holan and Amy Hollyfield.

Tuesday, February 21, 2017

Another nugget from the Hollyfield interview

In an earlier post we pointed out how managing editor Amy Hollyfield of PolitiFact described its "Truth-O-Meter" in terms hard to reconcile with those used by PolitiFact's creator, Bill Adair.

The Hollyfield interview published at The Politic (Yale University) contains other amusing nuggets, such as this howler (bold emphasis added):
We take accuracy very seriously. Transparency is one of the key things we focus on, which is why we publish all the sources for our fact checks. We flag every correction and have a subject tag called “correction,” so you can see every fact check we’ve put a correction on.
We find Hollyfield's assertion offensive, especially as it occurs in response to a question about this website, PolitiFact Bias.

PolitiFact does a poor job of consistently adding the subject tags to corrected articles.

We pointed out an example in December 2016. PolitiFact California changed the rating of a fact check from "True" to "Half True," publishing a new version of its fact check from months earlier. Weeks later, PolitiFact California still has not added a tag to the article that would make it appear on PolitiFact's "Corrections and Updates" page.

Maybe PolitiFact California does not regard rewriting an article as a correction or update?

How about PolitiFact Pennsylvania from January 2017? Lawyers pointed out that the Pennsylvania PolitiFact franchise incorrectly described the standard of evidence courts use for criminal cases. PolitiFact Pennsylvania ran a correction (the correction made the fact check incoherent, but that's another story), but added no tag to the story.


So, contrary to what Hollyfield claims, the corrected story is not transparently presented on its "Corrections and Updates" page.

PolitiFact's spotty compliance with its statement of principles is not new. We even complained about the problem to Paul Tash, the president of the Tampa Bay Times (Nov. 18, 2016). But we've noticed no improvement.

PolitiFact does not have a page that transparently informs readers of all of its corrections.

Will you believe Amy Hollyfield or your own lyin' eyes?

Monday, February 20, 2017

PolitiFact's "Truth-O-Meter": Floor wax, or dessert topping?

The different messages coming from PolitiFact founder Bill Adair and current PolitiFact managing editor Amy Hollyfield in recent interviews reminded me of a classic Saturday Night Live sketch.

In one interview (Pacific Standard), Adair said deciding PolitiFact's "Truth-O-Meter" ratings was "entirely subjective."

In the other interview (The Politic), Hollyfield gave a different impression:
There are six gradations on our [Truth-O-Meter] scale, and I think someone who’s not familiar with it might think it’s hard to sort out, but for people who’ve been at it for so long, we’ve done over 13,000 fact checks. To have participated in thousands of those, we all have a pretty good understanding of what the lines are between “true” and “mostly true,” or “false” and “pants on fire.”
If PolitiFact's "star chamber" of editors has a good understanding of the lines of demarcation between the ratings, that suggests objectivity, right?

Reconciling these statements about the "Truth-O-Meter" seems about as easy as reconciling New Shimmer's dual purposes as a floor wax and a dessert topping. Subjective and objective are polar opposites, perhaps even more so than floor wax and dessert topping.

If, as Hollyfield appears to claim, PolitiFact editors have objective criteria to rely on in deciding on "Truth-O-Meter" ratings, then what business does Adair have claiming the ratings are subjective?

Can both Adair and Hollyfield be right? Does New Shimmer's exclusive formula prevent yellowing and taste great on pumpkin pie?

Sorry, we're not buying it. We consider PolitiFact's messaging about its rating system another example of PolitiFact's flimflammery.

We think Adair must be right that the Truth-O-Meter is primarily subjective. The line between "False" and "Pants on Fire" as described by Hollyfield appears to support Adair's position:
“False” is simply inaccurate—it’s not true. The difference between that and “pants on fire” is that “pants on fire” is something that is utterly, ridiculously false. So it’s not just wrong, but almost like it’s egregiously wrong. It’s purposely wrong. Sometimes people just make mistakes, but sometimes they’re just off the deep end. That’s sort of where we are with “pants on fire.”
Got it? It's "almost like" and "sort of where we are" with the rating. Or, as another PolitiFact editor from the "star chamber" (Angie Drobnic Holan) memorably put it: "Sometimes we decide one way and sometimes decide the other."


Afters

Though PolitiFact has over the years routinely denied that it accuses people of lying, Hollyfield appears to have wandered off the reservation with her statement that "Pants on Fire" falsehoods on the "Truth-O-Meter" are "purposely wrong." A purposely wrong falsehood would count as a lie in its strong traditional sense: A falsehood intended to deceive the audience. But if that truly is part of the line of demarcation between "False" and "Pants on Fire," then why has it never appeared that way in PolitiFact's statement of principles?

Perhaps that criterion exists only (subjectively) in Hollyfield's mind?


Update Feb. 20, 2017: Removed an unneeded "the" from the second paragraph

Friday, July 22, 2016

Why PolitiFact flip-flopped on Clinton

We've dedicated two items to PolitiFact's "Half True" gift to Democratic presidential nominee Hillary Clinton on her claim she never sent or received classified information via her private email account.

First we argued that PolitiFact's defense of its "Half True" rating made no sense following FBI Director James Comey's statement on July 5.

PolitiFact, whether influenced by our post or not, apparently agreed and reversed itself the next day while erasing nearly all the evidence of its embarrassing decision from the day before.

We, namely Jeff D, responded to PolitiFact's reversal by documenting the evidence that PolitiFact had continued its habit of changing stories without posting correction notices.

We have addressed what happened. Now we will consider why it happened.

A stupid idea whose time has come

It was just plain stupid of PolitiFact to say that it could not change Clinton's "Half True" rating in view of its policy of doing its ratings according to information available at the time (bold emphasis added):
(After this fact-check published, FBI Director James Comey released details of the FBI's investigation into Hillary Clinton's use of a private email server. This claim will remain rated Half True, because we base our rulings on when a statement was made and on the information available at that time. But the FBI investigation clearly undercuts Clinton’s defense if she makes a similar claim again. You can read more about the findings of the FBI investigation here.)
Applying that policy the way PolitiFact did here could justify avoiding any number of corrections. Did an article misuse a word? Sure, but we're not going to correct it because we did not know any better at the time.

Yes, it's silly. But it's more than likely that a group of PolitiFact's editors agreed, at least for a time, that it was the right thing to do for this Clinton fact check.

Why the do-over?

PolitiFact reversed itself pretty quickly. But what kind of impetus could reverse the considered wisdom of PolitiFact's elite editorial group?

We'll consider some possibilities:
  • A trusted figure condemned PolitiFact's defense of its "Half True" rating
This option seems the most likely. But PolitiFact's lack of transparency about its reversal leaves us in the dark as to whether anybody inside the organization was independent enough to rock the boat.

Alternatively, the editors at PolitiFact may have felt distress that they were out of step with the Washington Post Fact Checker. The Post promptly changed its rating of Clinton's email claim from two Pinocchios to four. Despite their claims of independence, the mainstream fact checkers can't avoid seeing each others' work and doubtless feel pressure to make similar findings of fact.
  • PolitiFact changed because of our criticisms?
We condemned PolitiFact's reasoning and explained what was wrong with it before PolitiFact executed its reversal. However, it's not typical for PolitiFact to agree with and act on our criticisms.
  • "Lie of the Year" implications
PolitiFact horribly embarrassed itself with the 2013 "Lie of the Year." President Obama's promise that people could keep their insurance plans under his health care reform bill took the award, or at least PolitiFact tried to make it look that way, despite the fact that PolitiFact never rated the claim worse than "Half True."

Clinton's email fib easily qualifies as the early leader in the "Lie of the Year" sweepstakes. It's high-profile. It was deeply investigated by the FBI. It carries yuge implications for the 2016 election.

Did PolitiFact belatedly realize that it might have another Democratic claim rated "Half True" winning the Lie of the Year award? Two words: bad optics.

Conclusion

We don't know for sure why PolitiFact acted the way it did. We can only offer some possibilities. But one thing is certain. PolitiFact has not acted like a fact checker in this. It has acted like it loves its own reputation better than it loves the truth.

Tuesday, April 7, 2015

Sharks & Alligators



I spent time at Zebra Fact Check last week dissecting PolitiFact Florida's effort to fact check whether the first 10 years of Florida's concealed-carry gun law produced twice as many alligator attacks as attacks by concealed-carry permit holders.

One aspect of that fact check deserves special attention here at PolitiFact Bias.

PolitiFact Florida couldn't find the information it needed to check the alligator claim. But it found a couple of experts who thought it was silly to compare gun attacks to alligator attacks, and the supposed silliness of the comparison found its way into the "Mostly False" rating. Health News Florida reviewed the rating with PolitiFact's Amy Hollyfield:
Since there's no source of comprehensive data for attacks by gun license holders, experts told PolitiFact Florida that it’s not very meaningful to compare alligator bites to the misuse of firearms.

“One of them told us it’s more than silly to compare bites to bullets,” Hollyfield said.
We couldn't avoid thinking about PolitiFact Florida's 2012 fact check of whether shark attacks outnumber cases of voter fraud. PolitiFact chose the measure: "cases" considered for prosecution by the state of Florida instead of "cases" as individual instances of fraud.

PolitiFact Florida admitted its measure was imperfect:
While the shark attack figures are cut and dry (sorry!), the voter fraud numbers are not. There could be more cases than we know about, involving more people. The numbers may not represent total voter fraud cases, as those could be handled by local supervisors and state attorneys.
Without considering the silliness of comparing voter fraud to shark attacks, and undeterred by the lack of good data on voter fraud, PolitiFact Florida ruled it "Mostly True" that shark attacks occur more frequently than voter fraud.

Obviously inconsistent? Yeah. Coincidentally, the point liberals like, that voter fraud occurs rarely, gets a pass. The claim conservatives like, that concealed-carry permit holders very rarely use their guns to attack others, gets the harsh rating.


That's PolitiFact.  That's bias.

Sunday, July 20, 2014

Tweezers or tongs?

We've noted before PolitiFact's inconsistency in its treatment of compound statements.  It's time to focus on a specific way that inconsistency can influence PolitiFact's "Truth-O-Meter."

We'll call this problem "tweezers or tongs" and illustrate it with a recent PolitiFact fact check of Phil Gingrey (R-Ga.):
"As a physician for over 30 years, I am well aware of the dangers infectious diseases pose. In fact, infectious diseases remain in the top 10 causes of death in the United States. … Reports of illegal migrants carrying deadly diseases such as swine flu, dengue fever, Ebola virus and tuberculosis are particularly concerning."

[...]

The reality is that Ebola has only been found in Africa -- and experts agree that, given how the disease develops, the likelihood of children from Central America bringing it to the U.S. border is almost nonexistent. But most importantly for our fact-check, Gingrey’s office was unable to point to solid evidence that Ebola has arrived in the Western Hemisphere, much less the U.S. border. To the contrary, the CDC and independent epidemiologists say there is zero evidence that these migrants are carrying the virus to the border.

We rate the claim Pants on Fire.
It's tweezers this time.

Gingrey states that disease crossing the border via migration creates a concern.  He mentions reports of swine flu, dengue fever, Ebola virus and tuberculosis crossing the border as examples of concern.  PolitiFact takes its tweezers and picks out "Ebola virus," and drops from consideration the other diseases in Gingrey's compound statement.

Let's review again PolitiFact's statement of principles:
We sometimes rate compound statements that contain two or more factual assertions. In these cases, we rate the overall accuracy after looking at the individual pieces.
Or sometimes PolitiFact will just settle on rating one piece of the compound statement.  It's up to PolitiFact, based on the whim of the editors.
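
Here is a sketch of the difference, with an invented zero-to-five scale standing in for the "Truth-O-Meter"; the piece scores and the averaging rule are ours, not PolitiFact's:

    # "Tongs" rates the whole compound statement from its pieces, per
    # PolitiFact's stated principle.  "Tweezers" plucks out one piece.
    SCALE = ["Pants on Fire", "False", "Mostly False",
             "Half True", "Mostly True", "True"]

    # Invented piece-by-piece scores for Gingrey's compound claim.
    pieces = {
        "swine flu reports": 4,      # documented at the border
        "dengue fever reports": 4,
        "tuberculosis reports": 4,
        "Ebola virus reports": 0,    # no evidence in the Western Hemisphere
    }

    def tongs(scores):
        """Rate overall accuracy by averaging the individual pieces."""
        return SCALE[round(sum(scores) / len(scores))]

    def tweezers(scores, pick):
        """Rate only the single piece the editors choose to focus on."""
        return SCALE[scores[pick]]

    print("tongs:", tongs(list(pieces.values())))                # Half True
    print("tweezers:", tweezers(pieces, "Ebola virus reports"))  # Pants on Fire

Same statement, two very different meter readings, depending only on which tool the editors reach for.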

Burying Gingrey's underlying point

Though we're focused mainly on PolitiFact's inconsistent handling of compound statements, it's hard to ignore another PolitiShenanigan in the Gingrey fact check.  PolitiFact sometimes takes a subject's underlying point into account when making a ruling.  And sometimes not.  In Gingrey's case, PolitiFact buried Gingrey's underlying point:
As a surge of unaccompanied children from Central America was arriving on the United States’ southern border this month, Rep. Phil Gingrey, R-Ga., expressed concern about the impact they could have on public health.
PolitiFact left out part of the story.  Yes, Gingrey was expressing concern about the potential spread of disease from human migration.  But he wasn't simply airing his concerns to the Centers for Disease Control, to whom he addressed the letter PolitiFact fact checked.  He was asking the CDC to assess the risk:
I request that the CDC take immediate action to assess the public risk posed by the influx of unaccompanied children and their subsequent transfer to different parts of the country.
PolitiFact claims "words matter."  Yet, contrary to PolitiFact's claim, Gingrey did not say migrants may be bringing Ebola virus through the U.S.-Mexico border.  Rather, he said it was troubling to hear reports of diseases, including Ebola virus, coming across the border.

Words matter to PolitiFact, we suppose, since one needs to know exactly how much twisting is needed to arrive at the desired "Truth-O-Meter" rating.

Wednesday, April 30, 2014

PolitiFact finds true Rush Limbaugh claim "False"

Unbelievable.  That's PolitiFact's "PunditFact."

Rush Limbaugh said African Americans are now some of the wealthiest people in America.  PunditFact sprang into action:


The key to PunditFact's "False" rating for Limbaugh was simplicity itself.  PunditFact defined "wealthiest" to mean that a person appeared on the Forbes list of wealthiest people.

No, really.  That's what PunditFact did.
"You've got a black president. You've got a black attorney general. You've got the wealthiest TV performer in American history is a African-American woman. That would be The Oprah," Limbaugh said. "Some of the wealthiest Americans are African-American now."

That last line is quite incorrect.

The go-to source to learn about the wealthiest Americans is Forbes, which tracks the fortunes of the world’s elite. It publishes an annual list of the 400 wealthiest Americans and a list of the world’s billionaires.
PunditFact reasoned that since Oprah Winfrey was the only African American to appear on the list of the 400 wealthiest Americans, therefore what Limbaugh said was false.

No, really.  That's what PunditFact did.

Did Limbaugh say something in context to justify PunditFact narrowing the definition of "wealthiest" to the Forbes top 400?  Not from what we can tell.  Certainly the fact check makes no mention of it.  PunditFact's decision seems entirely arbitrary, especially given how commonly media outlets like Vanity Fair, CNBC, and the Cleveland Plain Dealer use the term for the top 1 percent of earners.

Are those media outlets lying to us?  PolitiFact's analysis suggests they are.  But it gets worse.

 

Hypocrites


PolitiFact is also among the media outlets comfortable with using "wealthiest Americans" to mean something other than the Forbes top 400.

PolitiFact New Jersey did it in a fact check of (Democrat) Steve Rothman.

PolitiFact Florida did it in a fact check of Debbie Wasserman Schultz, where defining "wealthiest Americans" as the top 400 would have reversed the ruling (DWS would have received a "True" rating, implying that CNN's Wolf Blitzer was wrong for claiming the wealthiest Americans foot most of the U.S. tax bill).

National PolitiFact did it in an Obameter item in 2013, rating President Obama's promise that he would raise taxes on those making over $200,000 per year--a figure PolitiFact paraphrased as the earnings floor for the "wealthiest Americans."

PolitiFact Texas did it in a fact check of (Democrat) Lloyd Doggett, equating the top 1 percent of earners with the "wealthiest Americans" in a paraphrase.

Oh, and PunditFact did it in a fact check of MSNBC's Joe Scarborough earlier this year.

We could provide many more examples from PolitiFact, but you get the idea.  This is hypocrisy of the highest order.  PolitiFact has no right to decide where Limbaugh draws the line on what constitutes the "wealthiest Americans."  Limbaugh can draw that line anywhere he likes.  And if he happens to draw it at the top 1 percent of earners, like PolitiFact often does, then his statement is quite simply true.
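
The whole dispute reduces to where the line gets drawn.  A toy Python sketch (all counts below are invented placeholders, not real data) shows the same claim flipping from false to true as the definition widens from the Forbes 400 to the top 1 percent:

    # Whether "some of the wealthiest Americans are African-American" holds
    # depends entirely on the chosen definition of "wealthiest."

    def claim_holds(count_in_group: int) -> bool:
        """PunditFact's implicit test: 'some' requires more than one person."""
        return count_in_group > 1

    forbes_400_count = 1          # Oprah Winfrey alone, per PunditFact
    top_1_percent_count = 50_000  # invented placeholder for a top-1% cutoff

    print("Forbes 400 definition:", claim_holds(forbes_400_count))        # False
    print("Top 1 percent definition:", claim_holds(top_1_percent_count))  # True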

The competition for worst fact check of the year is pretty intense in 2014.  And it's still early.

Wednesday, December 12, 2012

Media Trackers (Florida): "PolitiFact Florida Dishonestly Smears Pam Bondi on Obamacare"

Media Trackers of Florida continues to assail the purulent pronouncements of PolitiFact, this time over PolitiFact Florida's "False" ruling for Attorney General Pam Bondi for a statement regarding ObamaCare's effect on business.

Media Trackers:
The numbers cited by Bondi are verifiable and accurate. The Mercer survey found that 61 percent of employers expect costs to rise as a result of Obamacare. As PolitiFact Florida itself noted, “Bondi is correct on the specific numbers she cited.”

Nevertheless, PolitiFact Florida ruled that Bondi’s statement was “false.” How could this be?
It's a good question.  The Bondi case makes such a good example of poor journalism that Media Trackers' abundance of sensationalistic rhetoric (the quotation above excepted) probably distracts readers from appreciating it.

It's hard to see how PolitiFact justifies the ruling given its own description of the facts.  Take the conclusion, for example:
We don’t doubt there’s anxiety among some businesses over what’s to come under the health care law, and maybe some are talking about whether they’ll have to raise prices or cut jobs. But Bondi didn’t talk about planning, she talked about what’s occurring right now, and we find no studies already showing the negative effects or evidence that businesses are cutting jobs or raising prices now. We rate Bondi’s statement False.
PedantiFact is more like it. 

We often see PolitiFact applying unnecessarily uncharitable interpretations to politicians' statements, with conservatives receiving the greater harm.  Bondi made two main points: that multiple studies showed damage to businesses from ObamaCare, and that businesses were responding by cutting hours or laying off workers.  Bondi did not state that studies showed businesses were cutting hours or laying off workers.  PolitiFact drew that inference and graded Bondi in part on that claim.

Given normal charitable interpretation, Bondi was correct in that Mercer conducted more than one study indicating economic damage to business as reflected in employer expectations.  Bondi was likewise correct, based on anecdotal evidence, that businesses are reacting by cutting hours or laying off workers.  The statement of intent is enough to justify Bondi's use of tense.

Here's an analogy:  Suppose a baseball team ended the previous season without hitting a home run.  At the winter meetings the team acquires renowned sluggers Jeff Smith and Alex Weston.  The GM announces the team is solving its power woes with Smith and Weston.

But wait!  The season hasn't started yet, so the team isn't solving anything yet.  Right, PolitiFact?  Smith and Weston might suffer season-ending injuries on their plane ride to join the team.

This type of language is common in English.  A high school senior in California announces she's going to college at Yale.  So what's she still doing in a California high school?  PolitiFact rates the scholarly senior "False."


We applaud Media Trackers for highlighting yet another PolitiGaffe fact check.

We are concerned, though, that some of Media Trackers' assertions are vulnerable to challenge, such as saying PolitiFact did its reporting "dishonestly."  Likewise, saying that PolitiFact smears Republicans while "bolstering" Democrats oversimplifies a complex record of unfairness to both parties that happens to harm Republicans more than it does Democrats.  Toning down the condemnation would allow such reports to reach and influence a wider audience.

Tuesday, February 28, 2012

PolitiFact's sham fact checking

Senator Marco Rubio (R-Fla.) and President Barack Obama had something in common last week, and Jeff Dyberg noticed.

Both made statements about majorities that were graded "Mostly True" by the fact checkers at PolitiFact.  The justifications PolitiFact used for the rulings were similar: PolitiFact cited poll data showing that pluralities rather than majorities obtained, and ruled favorably based on the underlying points.

Note the summary paragraph for the Rubio story:
Rubio said that the majority of Americans are conservative. A respected ongoing poll from Gallup shows that conservatives are the largest ideological group, but they don’t cross the 50 percent threshold. So we rate his statement Mostly True.
Compare the summary paragraph for the Obama item:
So overall, the poll numbers support Obama’s general point, but they don’t fully justify his claim that "the American people for the most part think it’s a bad idea." Actually, in most of the polls just a plurality says that. On balance, we rate his statement Mostly True.
Rubio and Obama no longer have the "Mostly True" ruling in common.

PolitiFact received numerous complaints about the Rubio ruling and changed it to "Half True."

Of course, in the case of Rubio, PolitiFact found more information that bolstered the downgraded "Half True" rating.

Just kidding.  Go through the updated story with a fine-toothed comb and Rubio's claim ends up looking even more similar to Obama's, except maybe better.  Note the concluding paragraph of the updated Rubio story:
So by the two polls, he was incorrect. By one, he was correct and we find support for his underlying point that there are more conservatives than liberals. On balance, we rate this claim Half True.
This case makes it appear that PolitiFact is sensitive to scolding from the left, perhaps particularly when it comes from media elites like Jay Rosen.  And maybe that's understandable in a way.  But if the left doesn't complain about the Obama rating until it's downgraded to "Half True" then both the left and PolitiFact (or is there a difference?) look pretty inconsistent.

Wednesday, February 15, 2012

PFB Smackdown: Rachel Maddow (Updated)




We agree with Rachel Maddow up through about the 55 second mark.  Yes, PolitiFact is bad, and PolitiFact is so bad at fact checking that it doesn't deserve frequent citations as a trustworthy source. 

After that, our level of agreement starts to drop.

Sen. Rubio (R-Fla.) stated that most Americans are conservative and went on to argue the point based on attitudes toward the labels "conservative" and "liberal."

Maddow ignores the context of Rubio's remarks and attacks them using survey data about the way Americans self-identify politically.

Maddow is supposed to be ultra smart.  So how come she can't figure out that Rubio's statement isn't properly measured against self-identification numbers?

It appears that Maddow uncritically followed PolitiFact's approach to judging Rubio's accuracy.  The self-identification numbers serve as interesting context, but it's perfectly possible for 100 percent of Americans to self-identify as "liberal" yet reasonably classify as majority conservative.  That's because people can have inaccurate perceptions of their location on the political spectrum.

So, was Rubio correct that the majority of Americans are conservative?  That depends on his argument.  Rubio didn't cite surveys about self-identification.  He used a method concerned with attitudes toward the respective labels.  One can argue with the method or the application of the method, but using an inappropriate benchmark doesn't cut it.
When you ask people which party they lean toward, the independents split up so that the country is almost evenly divided. For the year of 2011, Gallup reported that 45 percent of Americans identified as Republicans or leaned that way, while 45 percent identified as Democrats or leaned that way.
Is "Republican" the same label as "conservative"?  No, of course not.

PolitiFact came close to addressing Rubio's point by looking at the political leanings of moderates, but fell short by relying on the wrong label along with the self-identification standard.  Maddow's approach was even worse, as she took Rubio's comment out of context and apparently expected PolitiFact to do the same thing.

Meanwhile, PolitiFact defends itself with the usual banalities:
“Our goal at PolitiFact is to use the Truth-O-Meter to show the relative accuracy of a political claim,” Adair explained. “In this case, we rated it Mostly True because we felt that while the number was short of a majority, it was still a plurality. Forty percent of Americans consider themselves conservative, 35 percent moderate and 21 percent liberal. It wasn’t quite a majority, but was close.”

“We don’t expect our readers to agree with every ruling we make,” he continued.
Pretty weak, isn't it?


Update 2/19/2012:

With a hat tip to Kevin Drum of Mother Jones (liberal mag), we have survey data that help lend support to Marco Rubio (as well as to my argument in his defense):


1)  The survey, from Politico and George Washington University, is limited to likely voters.
2)  The poll essentially forces likely voters to choose between "liberal" and "conservative."
3)  A plurality of those surveyed (43 percent) lean Democrat or self-identify as Democrat.
4)  Despite the plurality of Democrats in the survey sample, 61 percent identify as conservative ("Very conservative" or "Somewhat conservative").

Wednesday, December 14, 2011

Engineering Thinking: "PolitiFact’s Analysis of Cain’s 9-9-9 Plan is Fatally Flawed"

We were slow to notice an item about PolitiFact that Ed Walker posted at his blog "Engineering Thinking" back in October.

Walker swiftly skewers PolitiFact's treatment of a Herman Cain claim about his 9-9-9 tax plan:
1. The first major problem with PolitiFact’s analysis is that it was not shown to be objective. PolitiFact selected three tax accountants to provide an opinion, but since Cain’s 9-9-9 plan — if implemented — will substantially reduce the need for tax accountants, they are the last folks that should be asked for an assessment.
Indeed, it seems odd that PolitiFact would solicit volunteers* from the ranks of tax accountants to test Cain's claim rather than going to tax experts at a think tank.  Not that the latter route is totally unproblematic.

And Walker's second point:
2. Politifact states in the online version, “For this fact-check, we’ll only be talking about the personal income tax and the sales tax since the business tax directly affects only business owners and corporations.” This assertion is nonsense, however, since everyone’s effective income is directly impacted by the prices that business owners and corporations charge their customers, and those prices are greatly affected by federal corporate and payroll taxes.

PolitiFact completely ignores such taxes, which are often hidden taxes that the Cain plan eliminates.
Walker is deadly accurate with his second point.  PolitiFact seems completely fooled by embedded taxes, having previously neglected their existence in a fact check of Warren Buffett's claims about effective tax rates for the very rich.  I've coined the term "the Buffett fallacy" for that mistake.
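
A worked example of the fallacy, using hypothetical round-number rates rather than anyone's actual taxes:

    # The "Buffett fallacy": counting only the tax paid on a dividend while
    # ignoring the corporate tax already embedded in it.

    pretax_profit = 100.0  # corporate profit attributable to one shareholder
    corporate_rate = 0.35  # corporate income tax rate (hypothetical)
    dividend_rate = 0.15   # personal dividend tax rate (hypothetical)

    corporate_tax = pretax_profit * corporate_rate  # 35.00 paid by the firm
    dividend = pretax_profit - corporate_tax        # 65.00 paid to the shareholder
    dividend_tax = dividend * dividend_rate         # 9.75 paid personally

    naive_rate = dividend_tax / dividend                        # counts only the 15%
    true_rate = (corporate_tax + dividend_tax) / pretax_profit  # embedded tax included

    print(f"naive effective rate: {round(naive_rate * 100, 2)}%")  # 15.0%
    print(f"true effective rate: {round(true_rate * 100, 2)}%")    # 44.75%

Ignore the embedded corporate tax and the shareholder looks lightly taxed at 15 percent; count it and the effective rate is nearly 45 percent.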

A good fact check does not simply ignore important aspects of the issue it examines.

Walker's post is short, but it's worth a visit to read the entire thing.  So please do so.


* I have a very clear recollection of PolitiFact posting a request for readers with tax expertise to help evaluate Cain's plan.  Unfortunately, the Web page is either a bit hard to find or that item was scrubbed from PolitiFact's Web territory.