Showing posts with label Martha Hamilton. Show all posts

Tuesday, January 12, 2016

True statements ruled "Mostly False"

What happens when PolitiFact finds that a statement is literally true?

That issue was brought up indirectly when Jeff retweeted economist Howard Wall.
We looked up the story where Wall was quoted as an expert. It was a fact check of Mitt Romney from the 2012 presidential election. The Romney campaign said women had suffered the greatest job losses under Obama, implying Obama's leadership had been bad for women.

PolitiFact ruled the claim "Mostly False."

The Romney campaign pushed back. PolitiFact looked at the issue again and reaffirmed its "Mostly False" rating, even while conceding that "The numbers are accurate but quite misleading."

Don't blame the Romney campaign. It probably operated under the assumption that PolitiFact's definitions for its "Truth-O-Meter" ratings mean something.

Taking PolitiFact's definitions literally, the lowest rating a literally true claim should receive is "Mostly True." Below that level, the definitions describe statements that are only "partially true" and give a misleading impression ("Half True"), or that contain "some element of truth" but ignore facts that would give a different impression ("Mostly False").

What's our point? We've always said PolitiFact's ratings reveal more about PolitiFact than they do about the entities receiving the ratings. It's a scandal that social scientists keep their eyes closed to that. Search PolitiFact's ratings for claims it says are literally true. Note the rating given to the claim. Then take a look at the ideology of the entity making the claim.

There's your evidence of journalistic bias by fact checkers.

This is an important issue. If social scientists aren't looking at it, it suggests they don't care.

Why wouldn't they care?



Jeff Adds: We highlighted a Mark Hemingway critique of PolitiFact's Romney claim back in 2012 that is still worth a read. It would seem little has changed at PolitiFact since then.



Update 0956PST 1/12/2016: Added "Jeff Adds" portion - Jeff


Tuesday, October 9, 2012

Flashback Oct. 2010: "Just the Hacks, Ma'am"

Note:  Jeff D. originally posted this story about PolitiFact's treatment of Obama's campaign contribution policies back in October of 2010 on his personal blog.  With renewed focus on the Obama campaign's handling of credit card donations, we feel a review of PolitiFact's past treatment of the issue has renewed value.  The post was edited for style in this incarnation.



Few media outlets are as disingenuous and misleading as the supposed "fact checking" outfit PolitiFact. Despite its claim to "help you find the truth in American politics," the project is simply an extension of the unabashedly left-wing St. Petersburg Times editorial page, and its consistently flawed "Truth-O-Meter" shtick betrays that bias.

This week produced a fine example of the bizarre contortions this "unbiased" outfit will go through to defend Obama. On Tuesday they offered up RNC chairman Michael Steele and his comments regarding disclosure of campaign donors. Specifically, PolitiFact chose to rate Steele's charge of Obama's hypocrisy:
When President, then candidate, Obama was asked to disclose some of his donors because there was suspicion of their being the foreign source of money into his campaign, they refused to do it. So don't give me this high-and-mighty, holier-than-thou attitude about special interests flooding the political marketplace.
With Obama's false narrative about the Chamber of Commerce, and Pelosi's hysterical warnings about plutocracies, Steele's comments were timely and spot on (for a change).

Was Obama asked to disclose donors, and did he refuse? It seems simple enough to verify.

It is a well-documented fact that during the 2008 presidential campaign Obama refused to disclose the names of over 2 million donors. These particular donors contributed less than $200 each, and therefore fell below the reporting requirements. While Obama had no legal obligation to disclose them, he was under pressure to do just that. The reason: Obama had reduced the security safeguards on his campaign website that prevented fraudulent or illegal contributions. Obama claimed this was necessary because the high volume of donations meant the security measures slowed the process down. Fair enough.

Then erratic and abnormal donation patterns began to appear, including odd, un-rounded amounts (e.g., $133.29, suggesting foreign currency conversion) and curiously named donors like John Galt and Nodda Realperson, and of course Adolf Hitler and "Hbkjb,jkbkj".

In allowing donors to evade standard verification procedures, it became easier for people in Gaza, or even passionate supporters in Vermont, to circumvent donor disclosure laws. Basically, a single person using phony names could make multiple donations, with each individual donation under the $200 limit, but totaling tens of thousands of dollars in the aggregate, in order to avoid the reporting threshold.

These types of contribution shenanigans aren't unique to Obama's campaign. They happen to all politicians. What was unusual, however, was Obama's steadfast refusal to disclose the names of donors so independent journalists could vet the legitimacy of the questionable contributions.

Several groups started asking Obama to disclose the full list of donors in order to investigate these discrepancies. Obama refused.

The Republican National Committee went as far as filing a complaint with the FEC over the irregularities, claiming Obama was accepting foreign cash. The Center For Responsive Politics asked Obama twice to disclose the names of "bundler" donors.

When the supposedly tech-savvy Obama campaign finally responded with the ridiculous claim that compiling the list of names would be too technologically difficult, the left-leaning Slate.com asked, "So how come we were able to do it in a couple hours?" Slate also noted:
Politically, there would be several advantages in releasing the names. Obama has campaigned on a platform of making government more transparent...
Ultimately the Obama campaign refused to disclose the names of over 2 million donors representing roughly $400 million in donations. In response to Obama's recent misleading attacks against the disclosure policies of Republican PAC's, the Wall Street Journal pointed out the hypocrisy in an editorial:
Mr. Axelrod told CNN the White House "believes deeply in disclosure"...But it wasn't always the case. During 2008, the Obama campaign didn't show any interest in going beyond the letter of the law in disclosing its donors to the general public. Despite public pleas from campaign-finance reform groups such as Common Cause and Democracy 21, Team Obama refused to...release names of donors who gave less than $200, even though such donors supplied about half of the $800 million the Obama campaign raised.
The bottom line is that Obama accepted donations from contributors who were likely foreign nationals, and he refused to publicly disclose the names. With all of this evidence, it wasn't hard for PolitiFact to rate Michael Steele's claim ... False?????

 PolitiFact tries to frame the "facts":
Despite the context of the conversation, Steele was not contending that the Obama campaign was asked to disclose donors to independent groups funding attack ads. That's a somewhat new phenomenon this election cycle. Trade groups and other 501 (c) groups were always allowed to keep donors anonymous. But the Supreme Court's Citizen United case upped the stakes with a ruling that allows corporations to contribute unlimited amounts to independent efforts to support or oppose a candidate.
What the what?!

PolitiFact correctly notes that Steele didn't imply that Obama refused to disclose donors of independent PAC groups. So why bring it up except to confuse the issue? And speaking of confusing the issue, what exactly does the Citizens United case have to do with Obama's 2008 campaign? Well, nothing except to throw the controversial ruling into the mix to get the base all fired up and attempt to connect two things that are otherwise unconnected. In this case it's diversionary and misleading.

Steele's statement begins and ends with calling Obama a hypocrite: in 2008 he refused to disclose his donors, and now he complains about right-wing groups failing to disclose theirs. All PolitiFact needed to determine was whether Obama refused to name his donors. But if PolitiFact did that, it would have to call Obama a hypocrite.

Surprisingly, PolitiFact had the balls to cite OpenSecrets.org to "prove" Obama's innocence while also taking a thinly veiled swipe at John McCain:
In fact, an analysis of campaign contributions by the Center for Responsive Politics found that the Obama campaign scored slightly higher than McCain's when it came to full disclosure of donors. The center found the Obama campaign fully disclosed 90 percent of the donations to the campaign, as opposed to 87 percent for the McCain campaign.
Those numbers are accurate. But what the unbiased, non-partisan, help-you-sort-out-the-truth fact checkers at PolitiFact fail to tell you is that those numbers don't include donors who contributed under $200, the exact group of donors Steele was talking about. What PolitiFact also fails to mention in its snub is that, unlike Obama, John McCain did release the names of donors who contributed less than $200. Why was this fact left out of the article?

What other gems did PolitiFact come up with?
We think Steele's comment is misleading in the context of responding to Democrats' complaints about tens of millions of dollars anonymously making their way into this election via independent groups like Crossroads GPS. Steele's comments aren't directly related to that issue.
Huh? The argument is about transparency. How is Steele's point not a relevant criticism? And even if it were irrelevant, that wouldn't make it false.
Again, it's not that the Obama campaign was asked for names of foreign donors and refused.
Well, except for the fact that that is exactly what happened.

And finally, they offer up their conclusion:
There was no issue of the Obama campaign willfully refusing to disclose the names of foreign donors.
Yes. There was. For PolitiFact to ignore the mountain of evidence that supports Steele's claim can only be a deliberate evasion of reality. PolitiFact's disingenuous "fact checking" amounts to ideological cheerleading, and yet another example of media bias.

This latest disservice to facts is not new for PolitiFact. Bryan White over at Sublime Bloviations has been documenting their flawed and misleading ratings for a long time. His site is an invaluable source for exposing the misleading conclusions and flexible standards PolitiFact employs in their farcical "truth seeking" project.

Politics is full of misleading statements and outright lies. A truly unbiased source providing actual facts would be a welcome addition to political discourse.

But PolitiFact is not unbiased. They are simply a liberal opinion site riddled with inaccuracies, rhetoric, and ideology.

Falsely claiming to be objective purveyors of truth is wholly offensive, and PolitiFact should be exposed for the left wing ideologues they are.


Edit 10/09/12-Removed broken embed to video of Steele/MSNBC interview. It can still be found here. -Jeff

Edit 3/9/13-Removed link from words "warnings about plutocracies" for dubious source. - Jeff

Thursday, August 30, 2012

A Little Bit More and We'll Be Inches Away from Hacks

Sometimes PolitiFact's assault on consistency is so overwhelmingly obvious, it makes my brain hurt. Do these people even keep track of what they write?

A few weeks ago PolitiFact published an article analyzing an Obama campaign ad describing the president's tax plan:

Image from PolitiFact.com
The ad attempts to differentiate between the president's tax plan and Mitt Romney's. The ad claims that under Obama's plan, millionaires will "pay a little more." PolitiFact's article walks through various figures and metrics and concludes that millionaires, on average, would end up paying roughly $189,000 more in taxes. Seems like a simple thing to rate. What could go wrong?
Supporters of Obama’s tax plan are free to argue that the tax hike on high earners is wise policy or morally justifiable. However, we think that even for a millionaire, an extra $189,000 in taxes on average -- resulting in a decline in after-tax income of 8.8 percent -- goes well beyond chump change.
So how does this rate on the trusty ol' Truth-O-Meter?
We considered putting this to the Truth-O-Meter, but we decided that "a little more" is an opinion, not a checkable fact.
That's right. The phrase "a little more" is beyond the scope of objective facts. It's an opinion. I agree with PolitiFact. So what's the point of this post? Have a look at this:

Image from PolitiFact.com

When Barack Obama tells you millionaires will only pay a little more, it's an opinion. When Mitt Romney uses a common metaphor, it's a lie. In fact, it's a lie twice:

Image from PolitiFact.com

Twice PolitiFact found enough verifiable facts to assign a value of honesty to a Mitt Romney opinion. Obama? Well, heck, they don't check opinions.

All three articles were written by Louis Jacobson. Both Romney articles were edited by Martha Hamilton.

Wednesday, July 25, 2012

The Weekly Standard: "PolitiFact Mucks Up the Contraception Debate"

This year has sped by at a breathtaking pace so far, and we've neglected to review some worthy stories about PolitiFact simply because we placed a higher priority on some stories than others.

But it's not too late.

In February, The Weekly Standard's Mark Hemingway weighed in with yet another damning assessment of PolitiFact's talent for fact checking:
Before I explain why PolitiFact is once again being deliberately misleading, grossly incompetent, or some hellbroth of these distinguishing characteristics, you'll have bear with me. Part of the reason PolitiFact gets away with being so shoddy is that it counts on its readers believing that it can be trusted to explain any necessary context to justify its status as judge, jury, and factual executioner.
Obviously the right thing to do now is click the link and read the whole thing for yourself.

For those who don't have the time, I'll sum up:

Hemingway's latest example of PolitiFactian perfidy concerns its use of a Guttmacher Institute publication to support an Obama administration claim that 98 percent of sexually active women use birth control.

The Obama administration was trying to justify its insurance mandate requiring that birth control be covered as basic care with no copay.

Hemingway noted the Guttmacher Institute's lack of neutrality, a number of the arguments marshaled against its findings, and PolitiFact's selective use of the evidence.

At the end of the day, a study drawn from a group of women aged 15-44 does not justify extrapolating the data to the set of all women of any age.  PolitiFact went soft again on an administration claim.

Wednesday, June 20, 2012

The Weekly Standard: "Romney to PolitiFact: There You Go Again"

The Weekly Standard's Mark Hemingway was back in PolitiFact's grille in April.

PolitiFact ruled "Mostly False" a claim from the Mitt Romney campaign that women as a group have suffered 92.3 percent of the net job losses under Obama's presidency.  That ruling brought a swift and stern response from the Romney campaign.

Hemingway filed the battle report:
Given that PolitiFact says Romney's numbers check out, how the heck did PolitiFact then conclude Romney's statement is "mostly false"? Well, they did what fact checkers habitually do whenever they find something factually correct but politically disagreeable—kick up a bunch of irrelevant contextual dirt and lean on some biased sources. Which is why PolitiFact's own language here is absurd: "We found that though the numbers are accurate, their reading of them isn’t" and "The numbers are accurate but quite misleading." I would also note that my friend Glenn Kessler, the fact checker at the Washington Post, evaluated the same claim and deemed it "TRUE BUT FALSE." I do hope that if media fact checkers expect to retain any credibility to evaluate basic empirical claims, they're aware that this kind of Orwellian doublespeak is going to make them a laughingstock.
Read the whole thing, because Hemingway's just warming up with the above. 

The above point, that PolitiFact appears absurd for ruling a true statement "Mostly False," probably can't receive enough emphasis.  PolitiFact's rating system provides no description fitting this type of rating.  If the results make it look like PolitiFact isn't categorizing claims according to some type of established objective criteria, it's probably because that's the way it is.

Addendum:

PolitiFact's response to the complaint from the Romney campaign deserves a closer look:
We considered the complaint and interviewed four other economists, none of whom have formal or financial ties to any campaigns. Our additional reporting found no reason to change our ruling, which remains at Mostly False.
Two words:  Fig leaf.

The point is that the original reporting didn't justify the ruling.  If PolitiFact can't see that then it's no surprise that additional reporting fails to sway its made-up mind.

Tuesday, March 20, 2012

Grading PolitiFact: Obama, Bush and the auto bailout

Crossposted from Sublime Bloviations.


Context matters -- We examine the claim in the full context, the comments made before and after it, the question that prompted it, and the point the person was trying to make.
--Principles of PolitiFact and the Truth-O-Meter

 Apparently context doesn't matter much, depending on the subject.


The issue:
(clipped from PolitiFact.com)

The fact checkers:

Molly Moorhead:  writer, researcher
Martha M. Hamilton:  editor


Analysis:

This fact check serves as an outstanding example of narrowing the story focus to fish a grain of truth out of an overall falsehood.

The incompetence is overpowering.  Note that PolitiFact frames the issue by stipulating that the $13 billion "given" by the Bush administration was gone "By the time Obama took office."   That bit of timing isn't mentioned in the film, so far as I can tell, though I was able to note that it used a Dec. 2, 2008 television news clip to emphasize the immediacy of the crisis faced by President Obama.

The film and PolitiFact omit a number of important facts.  First, GM received another $4 billion loan in February under the agreement worked out with the Bush administration.   Part of the agreement required the two automakers, GM and Chrysler, to submit plans for achieving financial stability by February.  The report of the Congressional Oversight Panel details the response from the Obama administration:
On February 15, 2009, President Obama announced the formation of an interagency Presidential Task Force on the Auto Industry (Task Force), that would assume responsibility for reviewing the Chrysler and GM viability plans.
The timing is far more complicated than either the film or PolitiFact lets on, and the loans from Bush were not necessarily "gone" when Obama took office, particularly the $4 billion GM received in February, an amount kept out of the $13 billion figure through the magic of cherry-picking the facts.

Let's pick up with PolitiFact's telling (bold emphasis added):
On the subject of Detroit, car company CEOs appear onscreen asking for money in Washington, followed by pictures of empty factories and dire news headlines. The movie talks about the financial pressures on the new president and the unpopularity with the public of more bailouts. But Obama, [narrator Tom] Hanks says, acted anyway to help American workers.

"He decided to intervene, but in exchange for help the president would demand action. The Bush administration had given the car companies $13 billion, and the money was now gone," Hanks says.

Then President Bill Clinton appears onscreen to lend his voice.

"He didn’t just give the car companies the money, and he didn’t give the UAW the money," Clinton says. "He said you guys gotta work together and come up, and everybody’s gotta have some skin in the game here. You gotta modernize the automobile industry."
This segment of the film is not about the history of $13 billion out of a total of $17 billion loaned to automakers by the Bush administration.  It is fully intended to build a contrast between the incoming president and his supposedly irresponsible predecessor.  That point is extremely misleading, as we shall continue to observe.

PolitiFact:
Bush authorized initial loans to Chrysler and GM (and their respective financing arms) before leaving office, using money from the Troubled Asset Relief Program. Chrysler initially received $4 billion, and GM got $13.4 billion in bridge loans meant to keep the companies afloat for a little longer.
Apparently the math amounts to $4 billion plus $13.4 billion equals $13 billion.  And that $13 billion was gone by Jan. 20 even though $884 million was loaned to GMAC on Jan. 16.  It lasted only four days by PolitiFact's account.

Of course the excess $4 billion was loaned in February as described above.  You just don't get to learn that from the PolitiFact version of events.

PolitiFact:
Early in 2009 [mid February], Obama convened a task force to study the companies’ viability. Both were required [through the agreement with the Bush administration] to submit plans for getting back to solvency, but both failed, the task force determined. In the meantime, they were running short of money again.
Pardon my editorial counterspin--which shouldn't be necessary for a fact check.  Unfortunately it is necessary.  GM, by the way, received its last Bush loan on Feb. 17, two days after Obama announced his task force.

PolitiFact:
A report from the Congressional Oversight Panel details the chronology of the spending, including an additional $6.36 billion that GM received between March and May 2009.
The $6.36 billion does not include the $4 billion loaned in February under the agreement with the Bush administration.  Nor does it include $8.5 billion sunk into Chrysler by the Obama administration as part of its restructuring.  Neither does it include the $30.1 billion subsequently sunk into GM as part of its eventual restructuring.  Both the latter figures come from the Congressional Oversight Panel's report PolitiFact cited.

PolitiFact interviewed former Obama team member Steve Rattner about the bailout numbers.  PolitiFact presents Rattner as agreeing that the funds from the Bush administration were exhausted "before we really were in the saddle."  Rattner states that the loans from the Bush administration weren't intended to rescue GM and Chrysler but rather to tide them over until the Obama administration could deal with the situation.

PolitiFact does not totally ignore the film's point about Bush:
We also think it’s worth mentioning the implication in the video that the Bush administration did not put enough restrictions on the money. "He decided to intervene, but in exchange for help the president would demand action," narrator Hanks says just before mentioning the Bush loans.
In case PolitiFact isn't the only party who missed it, note that the filmmaker uses the quotation of Bill Clinton to hammer the point all the more.  It was the main point of the segment, and it was untrue.

What's the verdict?

PolitiFact:
The Obama campaign movie says, "the Bush administration had given the car companies $13 billion and the money was now gone."

It's important to note that the $13 billion was provided as loans, not as grants, as the wording might suggest.

Referring to the time Obama took office, January 2009, GM and Chrysler by then had received almost $14 billion in bailout money. News reports also reflect that the money was basically used up. So, that much is correct. But the movie ignores the fact that this was not unexpected. The Bush administration’s loans were always just a temporary lifeline, meant to keep the companies operating so the new president would have time to decide what to do long term.

This is important information left out of the movie’s extensive discussion of the auto bailouts. That the $13 billion was gone when Obama arrived was no surprise. We rate the statement Mostly True.
The film glosses over quite a few facts that PolitiFact fails to note.  The point of the film is the contrast between the president who demands accountability and Bush who simply gives money away to big corporations.  The movie's account of the auto bailout is thorough spin.  Fact checking isolated statements in the fabric of this filmmaker's fiction will never fully reveal the misleading nature of the narrative.

If Obama went against popular sentiment on the bailout then so did Bush.  If Obama demanded accountability then so did Bush, albeit the latter's attempt was hamstrung by the end of his tenure as president.

PolitiFact disgraces itself again by connecting the film's distortion with a "Mostly True" label.


The grades:

Molly Moorhead:  F
Martha M. Hamilton:  F

PolitiFact let the main misleading message of the auto bailout segment slide.  PolitiFact's reporting corrected only a fraction of the film's omissions and shadings of the truth.  PolitiFact's version is scarcely an improvement on the original.

But President Obama and his campaign might like it.  That's got to count for something.


3/22/12-Added link to original PF article in first paragraph/fixed link to PoP-Jeff

Thursday, March 1, 2012

Tales of the Unexpected, featuring PolitiFact

 Crossposted from Sublime Bloviations

You have to love PolitiFact's fact-challenged statements about itself.


Now:
We’ve consistently ruled in the past that the economy is too complex to assign full blame (or credit) for job gains or losses to a president or a governor.
Then:
Our ruling

Pelosi compared a select time frame in the Obama administration against the entire length of the Bush administration -- a methodology that treats the two presidents unequally. The irony is that if she had used better methodology, she would have had a sounder argument that more private-sector jobs were created under Obama than under the Bush administration. For her general point, we give Pelosi some credit. For her methodological sins -- repeated at least three times -- we give her thumbs down. On balance, we rate her statement Half True.
There's consistency for you.

The NRCC makes a statement that's correct but represents cherry picking and gets a "Barely True."  Nancy Pelosi makes a statement that's also correct, represents cherry picking and gets a "Half True"--with no mention of docking Pelosi for crediting President Obama.  On the contrary, PolitiFact itself recommends an alternative method for giving President Obama credit for his job creation numbers compared to those of his predecessor.



Jeff adds (3/02/12): For the record, there's at least some consistency in these two articles: Both were written by Louis Jacobson, and both were edited by Martha Hamilton.

Wednesday, February 15, 2012

What's Wrong With the World: "How to Lie with Statistics, Example Umpteen"

Jeff and I hugely appreciate bloggers who delve into the more complicated PolitiFact-related issues.

Lydia McGrew of the "What's Wrong With the World" blog gives a proper dressing-down to the Obama administration, the Guttmacher Institute and our beloved PolitiFact over the supposedly "Mostly True" claim that 98 percent of Catholic women use birth control.

As is our wont, we'll focus primarily on PolitiFact's role in the mess.

McGrew:
(T)his Politifact evaluation of the meme gets it wrong again and again, and in both directions.

First, the Politifact discussion insists that the claim is only about women in this category who have ever used contraception. When I first heard that and hadn't looked at the study, I immediately thought of the fact that such a statistic would presumably include women who were not at the time of the study using contraception and had used it only once in the past. It was even pointed out to me that it would include adult converts whose use might easily have been prior to their becoming Catholic. However, that isn't correct, anyway. The study expressly was of current contraceptive use. That's, in a sense, "better" for the side that wants the numbers to be high.
McGrew pointed out earlier that the Guttmacher Institute study uses data for "women at risk for unintended pregnancy, whom we define as those who had had sex in the three months prior to the survey and were not pregnant, postpartum or trying to get pregnant."  The women surveyed were additionally in the 15-44 age range.  Yet PolitiFact describes the findings like so:
We read the study, which was based on long-collected, frequently cited government survey data. It says essentially that — though the statistic refers specifically to women who have had sex, a distinction Muñoz didn’t make.

But that’s not a large clarification, since most women in the study, including 70 percent of unmarried Catholic women, were sexually experienced.
That's fact checking?

McGrew:
(O)n this point, too, the Politifact evaluation is completely wrong. Politifact implies that only the supplementary table on p. 8 excluded these groups and that Figure 3 on p. 6 included them! But this is wrong. The table on p. 8 is simply supplementary to Figure 3, and both are taken from the same survey using the same restrictions! This is made explicit again and again in the study.
McGrew's exactly right.  The same information accompanies the asterisk for each table (bold emphasis added):  "*Refers to sexually active women who are not pregnant, postpartum or trying to get pregnant."

It doesn't occur to PolitiFact that restricting the survey population like that throws a serious spanner in the works.

That kind of credulity goes by a different name:  gullibility.

Visit What's Wrong With the World and read all of McGrew's skillful fisking of the liberal trio.  It's well worth it.


Addendum:

The Guttmacher Institute drew its data ultimately from here.

It may be the case that the Guttmacher study is reliable.  Regardless of that, PolitiFact did virtually nothing to clarify the issue.  A recent Washington Post story does shed some light on things, however:
I called up Rachel Jones, the lead author of this study, to have her walk me through the research. She agrees that her study results do not speak to all Catholic women. Rather, they speak to a specific demographic: women between 15- and 44-years-old who have ever been sexually active.


Jeff Adds (2/15/2012): Over on PolitiFact's Facebook page, frequent PF critic Matthew Hoy offered up his usual spot-on commentary:
I find [PolitiFact's] failure to note that the Alan Guttmacher Institute is closely allied with Planned Parenthood a troubling omission. It isn't some neutral observer and its studies shouldn't be taken at face value without some healthy skepticism.
This isn't the first time PolitiFact has ignored Guttmacher's relationship with Planned Parenthood. Regardless of the study's accuracy, the alliance deserves at least a cursory disclosure. It's also important to note that PolitiFact used a similar connection to justify the rating of Florida Governor Rick Scott's claim about high-speed rail projects:
Scott bases his claims on hypothetical cost overruns from a suspect study written by a libertarian think tank...We rate Scott's claim False.
We highlighted that rating here.



Correction 2/17/2012:  "Guttmacher" was misspelled in the next-to-last paragraph.

Tuesday, February 7, 2012

Hoystory: "Obama’s War on Religion and Conscience"

Matthew Hoy is back at it with his usual biting commentary on PolitiFact. This time he shares his thoughts on the current debate about the effect of PPACA mandates on institutions of the Roman Catholic Church.

Hoy deals broadly with the controversy, but we'll highlight his mention of PolitiFact. At issue is PolitiFact's treatment of Newt Gingrich's statement that the PPACA requires religious institutions to provide insurance coverage for contraceptives:
After honestly analyzing the rule and the law, Politifraud labels Gingrich’s charge “mostly false” as they engage in an amount of hand-waving that would enable human flight without the aid of wings, engines or the other commonly required tools.
Still, if you consider a Catholic church to be a "Catholic institution," or a synagogue to be a "Jewish institution," Gingrich isn’t correct that the recent federal rule on contraceptives applies. Those nonprofit religious employers could choose whether or not they covered contraceptive services.
It’s pretty clear that Gingrich chose his words carefully here and Politifraud is muddying the waters. When I hear the words “Catholic institution” I think of everything Catholic that isn’t the church. I think of hospitals, soup kitchens, homeless shelters, adoption services, the Knights of Columbus, etc. Maybe it’s just because I’m likely more familiar with religious terminology than the (snark on) godless heathens (snark off) who populate many newsrooms, that I interpret it this way. But if the difference between a “True” or “Mostly True” ruling and a “Mostly False” ruling is over whether the word “institution” includes the church or not, then there’s way too much parsing going on.
Parsing words is nothing new for PolitiFact. But that's not the biggest flub Hoy spots:
In the video Politifact links to of Gingrich’s statement (provided by none other than Think Progress), Gingrich makes it clear that he is talking about the rule issued “last week.” The rule issued last week was the one regarding religious employers covering contraceptives in their health plans. Politifraud dishonestly expands that specific criticism of that specific rule into states can set their own benchmarks. No, they can’t. Not when it comes to the rule that came down “last week.” That rule says they MUST cover contraceptives.
Once again Hoy is spot on, though as usual our brief review doesn't do his work justice. Head over to Hoystory and read the whole thing.

Tuesday, January 31, 2012

The Weekly Standard: PolitiFact Can’t Get Its Story Straight on Romneycare and Abortion

Does the truth have a shelf-life?

Jeffrey H. Anderson, writing at the Weekly Standard, takes the fact-finding DeLorean all the way back to 2007 to highlight PolitiFact's conflicting ratings on RomneyCare's coverage of abortions. Before you read Anderson's article, check out the graphics for the remarkably dissimilar PolitiFact articles:

In 2007, PolitiFact says RomneyCare covers abortions:

Image clipped from PolitiFact.com

In 2012, the issue isn't so clear:

Image clipped from PolitiFact.com

Anderson notes:
[The Gingrich rating] sounds reasonable enough — except that the 2007 PolitiFact verdict directly refutes it. 
At first glance the statements have just enough wiggle room between them to possibly warrant different ratings. But Anderson's article explains there's not enough to justify the difference.

Also observe how PolitiFact presented Newt's statement. Newt claimed "Romney signed government-mandated health care with taxpayer-funded abortions." Notice PolitiFact's first question: "Did Mitt Romney make taxpayer funded abortion the law of the land?" The difference matters: whether abortion was already a covered procedure before Romney enacted the legislation is independent of Gingrich's claim. The context of Gingrich's ad was that Romney was sympathetic to abortion rights. Whether or not abortion was covered by taxpayers under existing Massachusetts law is irrelevant to Gingrich's point. The fact that Romney helped perpetuate the taxpayer funding is enough to make Gingrich's underlying argument accurate.

We'd also like to point out that for a Republican plan, RomneyCare has been the subject of several favorable articles, and even a ridiculous push poll, at PolitiFact. The cynical reader might surmise the kid glove treatment has something to do with RomneyCare's similarity to ObamaCare. Nah, that couldn't be it.




(1/31/2012) Corrected quote of Gingrich's PF statement. No change in context-Jeff

Friday, January 27, 2012

Liberals late to the party on PolitiFact

As expected, PolitiFact's 2011 "Lie of the Year" selection did a good bit of damage to PolitiFact's reputation on the left.  President Obama's 2012 State of the Union speech produced a claim that again has some liberals crying foul.  The Daily Kos and the Huffington Post both published entries condemning PolitiFact's "Half True" ruling on Obama's claim that private-sector jobs increased by 3 million in 22 months.

Jared Bernstein:
I ask you, why do they go where they go? Because of this:
In his remarks, Obama described the damage to the economy, including losing millions of jobs "before our policies were in full effect." Then he describe [sic!] the subsequent job increases, essentially taking credit for the job growth. But labor economists tell us that no mayor or governor or president deserves all the claim or all the credit for changes in employment.
Really? That's it? That makes the fact not a fact? I've seen some very useful work by these folks, but between this and this, Politifact just can't be trusted. Full stop.
(what's with the exclamation point after the "sic," Bernstein?)

Was PolitiFact blatantly unfair to Obama?

Not necessarily. PolitiFact pledged in July 2011 to take questions of credit and blame more fully into account for statistical claims.  PolitiFact, in the segment Bernstein quoted, made a decent case that Obama was giving credit to his policies.

Fortunately for the crybabies of the left, PolitiFact promptly caved on this one, revising the ruling to "Mostly True."  The rationale for the change is weaker than the justification for the original ruling:
EDITOR’S NOTE: Our original Half True rating was based on an interpretation that Obama was crediting his policies for the jobs increase. But we've concluded that he was not making that linkage as strongly as we initially believed and have decided to change the ruling to Mostly True.
That editor's note doesn't give readers any concrete information at all justifying the new ruling.  It doesn't take Obama's phrasing into account in any new way, doesn't acknowledge any misinterpretation of Obama's words and doesn't reveal new information unavailable for the earlier ruling.  In short, it looks like a judgment call all the way, where PolitiFact arbitrarily (if we don't count the criticism from the left) decided to give Obama the benefit of the doubt.

The critics on the left, meanwhile, remain apparently oblivious to another ruling from the State of the Union speech where Obama received an undeserved "True" rating.

And where were they when Sarah Palin could have used their defense for her true claim about defense spending as a percentage of GDP?

We have a PFB research project planned to address this general issue of technically true claims.


Addendum:

PolitiFact editor Bill Adair has once again come forth to explain PolitiFact's ruling and change of mind:
Lou, deputy editor Martha Hamilton and I had several conversations about the rating. We wrestled with whether it deserved a Half True or a Mostly True and could not reach a conclusion. We decided that it would depend on how directly Obama linked the jobs numbers to his policies.
What criteria were used to determine how directly Obama linked the jobs numbers to his policies?

Adair:
Lou, Martha and I had another conversation about the rating and whether it should be Half or Mostly True. At various points, each of us switched between Half and Mostly True. Each of us felt it was right on the line between the two ratings (unfortunately, we do not have a rating for 5/8ths True!).

We brought another editor, deputy government & politics editor Aaron Sharockman, into the conversation and he too was on the fence. Finally, we decided on Half True because we thought Obama was implicitly crediting his own policies for the gains.
How was Obama's statement "right on the line"?  What criteria placed it there?  What criteria might have moved it one way or the other?

An item like this from Adair is precisely where we should expect a detailed explanation if there is any detailed explanation.

There's essentially nothing.

We get the report of disagreement and vacillation and none of the specific reasons in favor of one rating over the other, except for the implied admission that at least one person making the determination had a change of heart leading to a reversal of the rating.

If that sounds subjective on PolitiFact's part, it probably is.

Thursday, January 19, 2012

Big Tent: "A PolitiFact Example"

Blogger and PolitiFact-cited expert Tom Bruscino supplies a partial insider's look at the PolitiFact process along with a critique of the finished work of which he was a part in his post "A PolitiFact Example."

PolitiFact writer Louis Jacobson asked Bruscino for his assessment of Mitt Romney's claim that the U.S. Navy is at its smallest since 1917.

Bruscino found Jacobson's questions leading:
Jacobson did a remarkable bit of research in a very short period of time. However, I did think his questions to me were leading. Remember, Mr. Jacobson asked "(2) What context does this ignore (changing/more lethal technology, changed geopolitical needs, etc)?," which both assumes and implies to the interviewees that Romney ignored those specific contexts.
And after registering some surprise at Jacobson's use of apparently non-objective descriptors of Romney, Bruscino demurs from PolitiFact's "Pants on Fire" ruling:
My opinion, for what it is worth, is that since Romney's base statement was factually accurate when it came to most numerical metrics, it would seem that he could be given credit for a half-truth, even if the context complicates the matter.
Do read Bruscino's entire post, which is particularly valuable since it provides yet another look at the style of inquiry used by PolitiFact journalists.  The commentary thread is also well worth reading.

Hat tip to Power Line blog.  Visit Power Line also for a parallel review I'd have been better off copying rather than writing up my own.



Jeff adds: I first saw this rating yesterday, and couldn't help but notice it provided another example of PolitiFact's alternating standards. Check out how PolitiFact presented this article on their Facebook page:

Image from http://www.facebook.com/politifact

Notice that Romney is spreading ridiculous falsehoods because he "ignores quantum leaps in technology and training."

Poor Mitt. If only he had made this statement back in 2009 when PolitiFact's standards were much different:

We agree that the two cars are totally different. But Obama was careful in the way he phrased his statement: "The 1908 Model T earned better gas mileage than a typical SUV sold in 2008."  As long as you don't consider any factors other than mileage, he's right. We rate his statement Mostly True.

You see, Obama is rated only on his literal statement, ignoring the quantum leaps in technology that make the Model T "totally different." Romney suffers from additional qualifiers that PolitiFact throws into the mix.

The similarities between the two ratings don't end there. Here's a bit from the Obama/Model T rating:

So technically Obama is right.


But his implication is that we haven't gotten more fuel efficient in 100 years. And that's a reach.
...

...Model Ts reached top speeds of only 40 miles an hour. They guzzled motor oil, about a quart a month. The original tops were made of canvas, and they had no heating or cooling systems. They also had none of the safety features of modern cars: no bumpers, no air bags, no seat belts, no antilock breaks [sic].

The cars had large, skinny wheels to more easily clear the obstacles on rocky, rutted roads. Corner them too fast and they could tip over. And if you crashed, the windshield would usually shatter into sharp, jagged pieces that could slice you to ribbons.

"The government would not allow anyone to sell Model Ts today because they're so unsafe," Casey said. "It's a car that no one would use on a regular basis today. It's not a fair comparison."

Here's similar text from the Romney rating:

This is a great example of a politician using more or less accurate statistics to make a meaningless claim. Judging by the numbers alone, Romney was close to accurate.

...

Thanks to the development of everything from nuclear weapons to drones, comparing today’s military to that of 60 to 100 years ago presents an egregious comparison of apples and oranges. Today’s military and political leaders face real challenges in determining the right mix of assets to deal with current and future threats, but Romney’s glib suggestion that today’s military posture is in any way similar to that of its predecessors in 1917 or 1947 is preposterous.

Obama: Technically correct, as long as you don't consider any other factors, but a reach. Mostly True.

Romney: Close to accurate, meaningless, egregious, glib, preposterous. Pants on Fire.

Bruscino is right to point out that the terms used to describe Romney's statement are more appropriate for the editorial page than for an objective determination of facts. And once again, we're left to wonder why different guidelines are used for different people.

Update (1/19/2012 1921 pst) Jeff adds: Speaking of glib and preposterous, this part of the rating just caught my eye:

A wide range of experts told us it’s wrong to assume that a decline in the number of ships or aircraft automatically means a weaker military. Quite the contrary: The United States is the world’s unquestioned military leader today, not just because of the number of ships and aircraft in its arsenal but also because each is stocked with top-of-the-line technology and highly trained personnel.

The first problem is obvious. Romney never claimed that a reduction in the number of ships or aircraft automatically meant a weaker military.  Rather, Romney was citing examples in support of his overall claim (that continued cuts in defense spending will eventually lead to a weaker force). Jacobson's second sentence is a howler. "Quite the contrary" to what? The fact that the U.S. is the world's supreme military force is totally irrelevant to whether or not it's on the path to becoming weaker. If Warren Buffett loses a million dollars on a bad deal, the fact that he's still the richest guy in the room does not negate the fact that he's also a million dollars poorer. And just like Romney claimed in his statement, Buffett simply cannot keep making bad deals if he is going to remain the richest guy in the room.

Thursday, November 3, 2011

Matthew Hoy: "You guys screwed up"

Ordinarily we highlight Matthew Hoy's criticisms of PolitiFact via the posts at his blog, Hoystory.  But this time we catch Hoy at his pithy best while blasting PolitiFact over at Facebook for its "Pants on Fire" rating of Herman Cain's supposed claim that China is trying to develop nuclear weapons.  PolitiFact took Cain to mean China was developing nuclear weapons for the first time, you see.

Hoy:
You guys screwed up. Congratulations. Read the whole context (which you provide) and it's ambiguous -- he very well may be referring to nuclear-powered AIRCRAFT CARRIERS -- which they don't have yet. Also, during Vietnam, Cain was working ballistics for the Navy, studying the range and capabilities of China's missiles. He knew they had nukes. It was inartfully said. Not a mistake. According to your own rules, you don't fact check things like this: "Is the statement significant? We avoid minor "gotchas"’ on claims that obviously represent a slip of the tongue."
That about says it all, but I'll just add one helpful informational link.

Given the ambiguity of Cain's statement, it speaks volumes about PolitiFact's ideological predisposition that no attempt was made to interpret Cain charitably.

Wednesday, November 2, 2011

Grading PolitiFact: Joe Biden and the Flint crime rate

(crossposted from Sublime Bloviations with minor reformatting)


To assess the truth for a numbers claim, the biggest factor is the underlying message.
--PolitiFact editor Bill Adair


The issue:
(clipped from PolitiFact.com)


The fact checkers:

Angie Drobnic Holan:  writer, researcher
Sue Owen:  researcher
Martha Hamilton:  editor


Analysis:

This PolitiFact item very quickly blew up in their faces.  The story was published at about 6 p.m. on Oct. 20.  The CYA was published at about 2:30 p.m. on Oct. 21, after FactCheck.org and the Washington Post published parallel items very critical of Biden.  PolitiFact rated Biden "Mostly True."

First, the context:



(my portion of transcript in italics, portion of transcript used by PolitiFact highlighted in yellow):

BIDEN:
If anyone listening doubts whether there is a direct correlation between the reduction of cops and firefighters and the rise in concerns of public safety, they need look no further than your city, Mr. Mayor.  

In 2008--you know, Pat Moynihan said everyone's entitled to their own opinion, they're not entitled to their own facts.  Let's look at the facts.  In 2008 when Flint had 265 sworn officers on their police force, there were 35 murders and 91 rapes in this city.  In 2010, when Flint had only 144 police officers the murder rate climbed to 65 and rapes, just to pick two categories, climbed to 229.  In 2011 you now only have 125 shields.  

God only knows what the numbers will be this year for Flint if we don't rectify it.  And God only knows what the number would have been if we had not been able to get a little bit of help to you.

As we note from the standard Bill Adair epigraph, the most important thing about a numbers claim is the underlying message.  Writer Angie Drobnic Holan apparently has no trouble identifying Biden's underlying message (bold emphasis added):
If Congress doesn’t pass President Barack Obama’s jobs plan, crimes like rape and murder will go up as cops are laid off, says Vice President Joe Biden.

It’s a stark talking point. But Biden hasn’t backed down in the face of challenges during the past week, citing crime statistics and saying, "Look at the facts." In a confrontation with a conservative blogger on Oct. 19, Biden snapped, "Don’t screw around with me."
No doubt the Joe Biden of the good "Truth-O-Meter" rating is very admirable in refusing to back down.  The "conservative blogger" is Jason Mattera, editor of the long-running conservative periodical "Human Events."  You're a blogger, Mattera.  PolitiFact says so.

But back to shooting the bigger fish in this barrel.

PolitiFact:
We looked at Biden’s crime numbers and turned to the Federal Bureau of Investigation's uniform crime statistics to confirm them. But the federal numbers aren’t the same as the numbers Biden cited. (Several of our readers did the same thing; we received several requests to check Biden’s numbers.)

When we looked at the FBI’s crime statistics, we found that Flint reported 32 murders in 2008 and 53 murders in 2010. Biden said 35 and 65 -- not exactly the same but in the same ballpark.
Drobnic Holan initially emphasizes a fact check of the numbers.  Compared to the FBI numbers, Biden inflated the murder figures for both 2008 and 2010, and his inflated set of numbers in turn inflates the percentage increase by roughly 30 percent (or about 20 percentage points, going from about 66 percent to about 86 percent).  So it's a decent-sized ballpark.
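The arithmetic is simple enough to verify. A quick sketch using the figures quoted above (FBI: 32 and 53 murders; Biden: 35 and 65):

```python
# Percentage increase in Flint murders, FBI figures vs. Biden's figures.

def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

fbi = pct_increase(32, 53)     # FBI/UCR: 2008 -> 2010
biden = pct_increase(35, 65)   # Biden's cited numbers: 2008 -> 2010

print(round(fbi, 1))           # 65.6
print(round(biden, 1))         # 85.7
print(round(biden - fbi, 1))   # gap in percentage points: 20.1
```

In relative terms, Biden's 86 percent increase overstates the FBI-based 66 percent increase by roughly 30 percent.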

PolitiFact:
For rapes, though, the numbers seemed seriously off. The FBI showed 103 rapes in 2008 and 92 rapes in 2010 -- a small decline. The numbers Biden cited were 91 rapes in 2008 and 229 in 2010 -- a dramatic increase.
If inflating the percentage increase in murders by 27 percentage points is not a problem for Biden then this at least sounds like a problem.

After going over some other reports on the numbers and a surprising discussion of how not much evidence suggests that Obama's jobs bill would address the number of police officers in Flint, PolitiFact returns to the discrepancy between the numbers:
(W)e found that discrepancies between the FBI and local agencies are not uncommon, and they happen for a number of reasons. Local numbers are usually more current and complete, and local police departments may have crime definitions that are more expansive than those of the FBI.
All this is very nice, but we're talking about the city of Flint, here.  We don't really need current stats for 2008 and 2010 because they're well past.  Perhaps that affects the completeness aspect of crime statistics also; PolitiFact's description is too thin to permit a judgment.  As for "expansive" definitions, well, there's a problem with that.  Biden's number of rapes in 2008 is lower than the number reported in the UCR (FBI) data.  That is a counterintuitive result for a more expansive definition of rape and ought to attract a journalist's attention.

In short, even with these proposed explanations it seems as though something isn't right.

PolitiFact:
Flint provided us with a statement from Police Chief Alvern Lock when we asked about the differences in the crime statistics, particularly the rape statistics.

"The City of Flint stands behind the crime statistics provided to the Office of The Vice President.  These numbers are an actual portrayal of the level of violent crime in our city and are the same numbers we have provided to our own community. This information is the most accurate data and demonstrates the rise in crime associated with the economic crisis and the reduced staffing levels.

"The discrepancies with the FBI and other sources reveal the differences in how crimes can be counted and categorized, based on different criteria." (Read the entire statement)
This is a city that's submitting clerical errors to the FBI, and we still have the odd problem with the rape statistics.  If the city can provide numbers to Joe Biden then why can't PolitiFact have the same set of numbers?   And maybe the city can include stats for crimes other than the ones Biden may have cherry-picked?  Not that PolitiFact cares about cherry-picked stats, of course.

Bottom line, why are we trusting the local Flint data sight unseen?

PolitiFact caps Biden's reward with a statement from criminologist and Obama campaign donor James Alan Fox of Northeastern University to the effect that Biden makes a legitimate point that "few police can translate to more violent crime" (PolitiFact's phrasing).  Fox affirms that point, by PolitiFact's account, though it's worth noting that on the record Biden asserted a "direct correlation" between crime and the size of a police force.  The change in wording seems strange for a fact check outfit that maintains that "words matter."

The conclusion gives us nothing new other than the "Mostly True" rating.  Biden was supposedly "largely in line" with the UCR murder data for Flint.  His claim about rape apparently did not drag down his rating much even though PolitiFact admittedly could not "fully" explain the discrepancies.  PolitiFact apparently gave Biden credit for the underlying argument that reductions in a police force "could result in increases in violent crime" despite Biden's rhetoric about a "direct correlation."


The grades:

Angie Drobnic Holan:  F
Sue Owen: N/A
Martha Hamilton:  F

This fact check was notable for its reliance on sources apparently predisposed toward the Obama administration and its relatively unquestioning acceptance of information from those sources.  The Washington Post version of this fact check, for comparison, contacted three experts to PolitiFact's one and none of the three had an FEC filing indicating a campaign contribution to Obama.

And no investigation of whether Biden cherry-picked Flint?  Seriously?  See the "Afters" section for more on that as well as commentary on PolitiFact's CYA attempt.

Monday, August 1, 2011

My Domestic Church: "Analyzing political analysis -Politifact and Bill O'Reilly"

Blogger "Elena" at My Domestic Church (mydomesticchurch.com) notices the bias at PolitiFact:
This recent article on Politifact about the Poverty Rate is a good example of its liberal bias.  First of all, they are taking apart a comment made by Bill O'Reilly made on the Fox News Channel.
During the O'Reilly Factor segment, Bill O'Reilly claimed that President Johnson's "Great Society" programs had done little to decrease poverty, citing as evidence a poverty rate of 14 percent in 1965 compared with 14.3 percent in the present day.  O'Reilly was off on the first figure--it was 17.3 percent in 1965.

Elena observed:
Politifact says that this assertion that the poverty rate has stayed about the same is false. But on reading the Politifact article I wonder if they didn't actually prove O'Reilly's point:
Politifact says: • He uses the wrong numbers. The poverty rate -- the percentage of Americans whose income is lower than the federally determined poverty line -- was 17.3 percent in 1965, not 14 percent. For 2009, O’Reilly is correct -- the rate was 14.3 percent.

So if you compare the poverty rate in those two years, it has fallen by 3 percentage points, or by about one-sixth its original level. It didn’t stay roughly constant, as O’Reilly claimed.
Well, he really didn't say it was constant. Still if it was 17.3 in 1965 and it is now 14 or so, that's not very good progress to have in over 46 years!
Elena is exactly right that PolitiFact exaggerates O'Reilly's claim.  In fact, as she later points out, the PolitiFact story repeats the error in the conclusion, with "O’Reilly said the Great Society programs did nothing to reduce poverty."

Elena expands on her contention that PolitiFact's findings support O'Reilly's point with another observation:
But I got a good chuckle over this:
The poverty rate has fallen even further if you start counting a few years before the Great Society began. Between 1959 and 1962, the poverty rate ranged between 20 and 22 percent. If you compare that level to 2009, poverty declined by an even steeper rate -- by more than one-third.
So if we say the poverty rate was 22 in 1962 and fell to 17 in 1965, that's a drop of 5 points in three years - so couldn't we surmise that the poverty rate was dropping faster and further before the government programs started to interfere?
Touché.
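The numbers in dispute are easy to check. A short sketch using the poverty rates cited in the PolitiFact piece (17.3 percent in 1965, 14.3 percent in 2009, and roughly 22 percent in 1962):

```python
# Relative declines in the U.S. poverty rate, per the figures PolitiFact cites.

def pct_decline(old, new):
    """Percentage decline from old to new, relative to old."""
    return (old - new) / old * 100

print(round(pct_decline(17.3, 14.3), 1))  # 1965 -> 2009: 17.3 (about one-sixth)
print(round(pct_decline(22.0, 14.3), 1))  # 1962 -> 2009: 35.0 (more than one-third)
print(round(pct_decline(22.0, 17.3), 1))  # 1962 -> 1965: 21.4
```

The last line illustrates Elena's point: on these figures, the proportional drop in the three years before the Great Society (about 21 percent) outpaced the drop over the following four decades.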

Please read the rest here, and remember not to succumb to the error of dismissing information based solely on the source.  Content comes first.