Wednesday, February 29, 2012

Michael F. Cannon: "Strike Three for PolitiFact"

Better late than never, we take note of Cato's Michael F. Cannon's take on PolitiFact's 2011 "Lie of the Year":
The annual unveiling of its “Lie of the Year” award garners PolitiFact more attention than anything else. Hopefully, it will garner so much attention that people will recognize that this award, which is supposed to improve political discourse, instead degrades it.

PolitiFact’s past three Lies of the Year have been about health care.  Not one of them was a lie.
Cannon has boycotted PolitiFact because of the subjectivity of its "Lie of the Year" selections, turning down requests that he contribute his expert opinion to various stories.

We join others in calling for an expanded boycott of PolitiFact:  Don't cite PolitiFact as a dependable source of information.  Pressure it to reform by dumping the silly and misleading "Truth-O-Meter" and finding additional ways to separate objective reporting from news analysis and outright opinion.

Cannon's commentary is crisp and to-the-point.  Do read it all, and visit his links to appreciate the history of his principled boycott.

Tuesday, February 28, 2012

PolitiFact's sham fact checking

Senator Marco Rubio (R-Fla.) and  President Barack Obama had something in common last week, and Jeff Dyberg noticed.

Both made statements about majorities that were graded "Mostly True" by the fact checkers at PolitiFact.  The justifications PolitiFact used for the rulings were similar:  PolitiFact cited poll data showing that pluralities rather than majorities obtained, and ruled favorably based on the underlying points.

Note the summary paragraph for the Rubio story:
Rubio said that the majority of Americans are conservative. A respected ongoing poll from Gallup shows that conservatives are the largest ideological group, but they don’t cross the 50 percent threshold. So we rate his statement Mostly True.
Compare the summary paragraph for the Obama item:
So overall, the poll numbers support Obama’s general point, but they don’t fully justify his claim that "the American people for the most part think it’s a bad idea." Actually, in most of the polls just a plurality says that. On balance, we rate his statement Mostly True.
Rubio and Obama no longer have the "Mostly True" ruling in common.

PolitiFact received numerous complaints about the Rubio ruling and changed it to "Half True."

Of course, in the case of Rubio, PolitiFact found more information that bolstered the downgraded "Half True" rating.

Just kidding.  Go through the updated story with a fine-toothed comb and Rubio's claim ends up looking even more similar to Obama's, except maybe better.  Note the concluding paragraph of the updated Rubio story:
So by the two polls, he was incorrect. By one, he was correct and we find support for his underlying point that there are more conservatives than liberals. On balance, we rate this claim Half True.
This case makes it appear that PolitiFact is sensitive to scolding from the left, perhaps particularly when it comes from media elites like Jay Rosen.  And maybe that's understandable in a way.  But if the left doesn't complain about the Obama rating until it's downgraded to "Half True" then both the left and PolitiFact (or is there a difference?) look pretty inconsistent.

Wednesday, February 22, 2012

CJR's interview with Robert Higgs

Columbia Journalism Review has posted online an interview with the editor of PolitiFact Ohio, Robert Higgs.

Though it was tempting to quote the positive, where Higgs expressed reservations about labeling a subject as a "liar," the winner for the PFB spotlight was this cringe-worthy response by Higgs:
What is your audience like? Do you have any sense of how it compares to the audience for the regular political or news coverage?

It is very diverse. I know we have readers who are very conservative, and I know we have readers who are very liberal who read it every day. I get email from all ends of the political spectrum, both criticizing us because they don’t like what we wrote or suggesting items to take a look at. I like that. I get hit from both sides, with people accusing me of having a leftist bias or an obvious conservative bent. It’s refreshing to know we are hitting somewhere down the middle.


Recommended question for the next Higgs interview:

Mr. Higgs, when PolitiFact Ohio uses the "Half True" grade on the 'Truth-O-Meter,' is it defined as "The statement is partially accurate but leaves out important details or takes things out of context" or is it defined as "The statement is accurate but leaves out important details or takes things out of context"?

Follow up: 

Has that definition ever changed at PolitiFact Ohio?

Jeff adds: This oft-repeated notion, that if PolitiFact is upsetting both sides it must be doing something right, is grossly flawed. "If I put one foot in a bucket of ice and my other foot in a smelting pot, it follows that I'll be perfectly comfortable." If the best self-analysis PolitiFact's editors can come up with is a cliche that doesn't work, it's no surprise their fact checks get the same uncritical treatment.

Tuesday, February 21, 2012

Politico: "PolitiFact without the 'Truth-O-Meter'"

Politico's media guy Dylan Byers hits and misses with his "PolitiFact without the 'Truth-O-Meter'" column.

First the miss, occurring in Byers' set up based on last week's dust-up between PolitiFact and Rachel Maddow over a rating of Florida senator Marco Rubio:
PolitiFact, the Tampa Bay Times fact-checking project, has come under fire this week for a ruling that seems to contradict common sense. Yesterday, MSNBC's Rachel Maddow -- PolitiFact's most vocal critic -- went to town on the group for claiming that an assertion made by Florida Senator Marco Rubio was "mostly true" when it was, in fact, false.
Rubio was probably correct, so if Byers intends to say that the statement was false then he needs to do better fact checking himself.  The construction of the sentence allows him to blame it on Maddow, I suppose.

Now the hit:
So, here's a thought. Get rid of the 'Truth-O-Meter.'
Okay, so I was peddling that idea back in 2008, but it's nice to see others picking up on the notion.

Byers scores another hit in supporting his suggestion:
I asked Adair today if PolitiFact would ever consider getting rid of its rulings and just present the facts on their own.

"The Truth-O-Meter is a key part of PolitiFact's work," he said. "We independently research political claims, analyze their overall accuracy and rate them from True to Pants on Fire. The rating allows readers to see our assessment of the overall accuracy at a glance; they can read our analysis for more details."

Here is a less generous interpretation of that claim: The "Truth-O-Meter" allows PolitiFact to market its research -- which is painstaking and time-consuming -- to a political discourse that doesn't have time to read its analysis. The "Truth-O-Meter" is what enables pundits to put politicians on the spot by saying, "Ok, but PolitiFact found that that statement was 'mostly false.'" It is what enables political opposition to sound the siren whenever something is ruled "Pants on Fire." And without these convenient rulings, people might stop paying attention.
The "Truth-O-Meter" is a marketing gimmick.  And despite the fact that the meter's design by nature degrades PolitiFact's journalism, PolitiFact is so wedded to it that no divorce is possible.

There's a sense in which PolitiFact's marketing approach is a "savior" to print journalists.  That perception probably helped PolitiFact capture its 2008 Pulitzer Prize.  Hard news reporting was made popular while newspaper circulation numbers steadily declined.

But the false prophecies are getting more difficult to overlook.

Jeff adds: Since when is Rachel Maddow "PolitiFact's most vocal critic"? Perhaps the voices of James Taranto or Mark Hemingway aren't able to break through the echo echo chamber chamber?

Bryan adds: Maybe she's the "most vocal" critic because her televised messages are audible while most others just write?

PFB Smackdown: Lawrence O'Donnell, Rachel Maddow and Tommy Christopher

Uh-oh!  Liberals are once again scandalized by a PolitiFact fact check!

MSNBC's Lawrence O'Donnell appeared in the following political ad:

PolitiFact looked into O'Donnell's claim about critics calling the GI Bill "welfare" and ruled it "Mostly False."   The fact check does have some problems.

PolitiFact went easy on O'Donnell

The fact check contains a huge error:  PolitiFact overlooks the fact that O'Donnell is making an equivocal argument.  O'Donnell stresses that the GI Bill was an education program.  But when PolitiFact pressed MSNBC to support O'Donnell's claim, the network responded with criticisms aimed almost exclusively at the unemployment benefits that were part of the bill.  O'Donnell's argument is a bait-and-switch.

PolitiFact claims to take such abuse of context into account.  Some of the "Truth-O-Meter" grades, in fact, carry clear signs of the perils of making a claim with limited context.

Rachel Maddow

Monday, February 20, 2012

The Weekly Standard: "Liberal Pundits Shocked to Discover PolitiFact Not Always Factual"

Mark Hemingway of the Weekly Standard has earned a reputation as perhaps PolitiFact's top critic.  As evidence of that, Hemingway beat me to the "late to the party" theme by about a month, writing in the wake of the progressive outrage over PolitiFact's "Lie of the Year" selection for 2011.

I'm sorry I missed his article before now.

So the liberal punditry woke up today to find that PolitiFact has declared the "Lie of the Year" to be Democrats' claim that Paul Ryan's budget will "end Medicare" or "end Medicare as we know it." They're having quite the collective freakout—see Paul Krugman, Jonathan Chait, Matt Yglesias, Brian Beutler, Steve Benen, et al.
Hemingway concedes the "end Medicare" claim has some truth to it:
Accusing Republicans of trying to end Medicare as we know it is also a stupid criticism because the implementation of the Independent Payment Advisory Board (IPAB) in the Patient Protection and Affordable Care Act will also "end Medicare as we know it." And unlike Ryan's plan, Democrats already made IPAB the law of the land. Under IPAB, unelected federal bureaucrats chosen by the president will bypass Congress and set the Medicare budget, and this will likely have pretty dramatic consequences for the program, such as severely restricting doctor access and rationing. It might well prove unconstitutional to boot.
So why all the outrage if Medicare as we know it is already dead and gone?  Hemingway has a hypothesis:
Liberals are freaking out over this because they're so used to PolitiFact and other fact checkers breaking things their way.

But he's probably right.  And, as usual, it's well worth reading the whole article.

Correction 2/21/2012:  Fixed spelling of "Pundits" in the title.

Don Surber: "Hey PolitiFact, here's your death panel"

Thanks to the Charleston Daily Mail and columnist Don Surber, a little reminder that PolitiFact's "Lie of the Year" selection for 2011 isn't the first to receive well-grounded criticism:
The liberal apologists at the Tampa Bay Times’ PolitiFact have denied for more than 2 years that Obamacare has death panels. How it could assure us in 2009 of just what was in a law that was not finalized until 2010 is a mystery that defies the laws of chronology; maybe in addition to having the power to divine the truth in politics, the personnel of PolitiFact have the power of prophecy.
Surber's just getting started, so scoot on over to the Daily Mail's website and read the whole thing.

Sunday, February 19, 2012

Positives: PolitiFact adds corrections page

A bit of optional background, first.

On Feb. 7 of this year, PolitiFact announced the addition of a corrections page.

It's about time, better late than never, etcetera.

As a corrections page the new feature is pretty skimpy.  Essentially it's just a list of stories that have corrections or updates and looks like any other list of stories in the domain.  One interested in finding a list of stories that required a change in rating is out of luck, at least with the present version of the page.

Caveats aside, we applaud the modest improvement in transparency.  Now we just have to figure out why the total number of corrections on the page disagrees with the number we counted on our "(Annotated) Principles of PolitiFact" page.

Looks like their corrections page needs a few corrections.  Or at least a clarification to illuminate the fact that PolitiFact does not intend to admit corrections from more than three weeks prior to the publication of its "Principles of PolitiFact and the Truth-O-Meter" on Feb. 21, 2011.

Welcome to the wonderful world of PolitiFact fact checking.

Friday, February 17, 2012

PolitiFact's prophylactic CYA

Yesterday PolitiFact rolled out a CYA article in response to the blowback to the oft-floated claim that 98 percent of all Catholic women use contraception.  PolitiFact rated that claim from an Obama administration official on Feb. 6, finding it "Mostly True."  PolitiFact's treatment of the issue provided little evidence of earnest journalistic curiosity and left its readers with no real means of independently verifying the data.

Watch how PolitiFact deftly avoids taking any responsibility for failing to present a clear account of the issue:
For the past week, thoughtful readers have let us know that we were wrong to give a Mostly True to the claim from a White House official that "most women, including 98 percent of Catholic women, have used contraception."

They said we overlooked a chart in a study from the Guttmacher Institute that showed the percentage was far more limited. But there’s a good reason we didn’t rely on the chart — it wasn’t the right one.
PolitiFact doesn't tell you that the Feb. 6 story never refers to the relevant chart at all.  PolitiFact claims to provide its sources, yet the source list doesn't include that chart.  Instead, it features the charts that drew so much attention in the published criticisms.
Guttmacher Institute, "Contraceptive Use Is The Norm Among Religious Women," April 13, 2011

Guttmacher Institute, "Countering Conventional Wisdom: New Evidence on Religion and Contraceptive Use," April 2011

Centers for Disease Control and Prevention, "National Survey of Family Growth," accessed Feb. 2, 2012

Centers for Disease Control and Prevention, "Key Statistics from the National Survey of Family Growth," accessed Feb. 6, 2012
PolitiFact's mission (bold emphasis added):
PolitiFact relies on on-the-record interviews and publishes a list of sources with every Truth-O-Meter item. When possible, the list includes links to sources that are freely available, although some sources rely on paid subscriptions. The goal is to help readers judge for themselves whether they agree with the ruling.
Um, yeah, whatever.

So did PolitiFact fact check the item without checking the facts, or did it simply forget to link the relevant data in the source list?

Don't look for a confession in a CYA:
To double-check, we reviewed the criticism, talked with the study’s lead researcher, and reviewed the report and an update from the institute. We’re confident in our original analysis.
We can take that statement for what it's worth, given that the original analysis never produced a baseline for determining the error of the 98 percent figure.  We're left to guess whether the CYA intends to assure us that the original item includes data sufficient to help readers judge for themselves whether to agree with the ruling.

PolitiFact is suggesting that the fact check was perfectly fine, and those of you who used their references to try to reach your own conclusions mishandled the facts.

The spate of blog posts and stories this week — some directly claiming to debunk our reporting — unfortunately rely on a flawed reading of a Guttmacher Institute study.

They were easy mistakes to make, confusing the group of women who have "ever used" contraceptives with those who are "currently using" contraceptives — and misapplying footnote information about those "currently using" to the 98 percent statistic.
The "flawed reading" results directly from the fact that neither the Guttmacher Institute nor PolitiFact provided access to the data that might have supported the key claim.  I'll quote from the PFB assessment:  "That's fact checking?"

If PolitiFact had checked the claim properly in the first place then PolitiFact could have answered the criticisms without the wholesale review.  In fact, the criticisms would be clearly wrong based on material included in or linked from the original fact check.

More from PolitiFact:
The critics of our reporting — bloggers for the Weekly Standard, and — were relying on an analysis from Lydia McGrew in her blog, "What's Wrong With The World," which was also cited by the Washington Post's WonkBlog.
PFB highlighted McGrew's analysis, certainly.  But our criticisms expanded beyond McGrew's and recognized that the Guttmacher Institute report may have included data that PolitiFact neglected to explain to its readers.  One would think from PolitiFact's response above that no criticism of its reporting on this issue contains merit.

Focus on McGrew

Wednesday, February 15, 2012

PFB Smackdown: Rachel Maddow (Updated)

We agree with Rachel Maddow up through about the 55 second mark.  Yes, PolitiFact is bad, and PolitiFact is so bad at fact checking that it doesn't deserve frequent citations as a trustworthy source. 

After that, our level of agreement starts to drop.

Sen. Rubio (R-Fla.) stated that most Americans are conservative and went on to argue the point based on attitudes toward the labels "conservative" and "liberal."

Maddow ignores the context of Rubio's remarks and attacks his claim using survey data about the way Americans self-identify politically.

Maddow is supposed to be ultra smart.  So how come she can't figure out that Rubio's statement isn't properly measured against self-identification numbers?

It appears that Maddow uncritically followed PolitiFact's approach to judging Rubio's accuracy.  The self-identification numbers serve as interesting context, but it's perfectly possible for 100 percent of Americans to self-identify as "liberal" while the country still reasonably classifies as majority conservative.  That's because people can have inaccurate perceptions of their location on the political spectrum.

So, was Rubio correct that the majority of Americans are conservative?  That depends on his argument.  Rubio didn't cite surveys about self-identification.  He used a method concerned with attitudes toward the respective labels.  One can argue with the method or the application of the method, but using an inappropriate benchmark doesn't cut it.
When you ask people which party they lean toward, the independents split up so that the country is almost evenly divided. For the year of 2011, Gallup reported that 45 percent of Americans identified as Republicans or leaned that way, while 45 percent identified as Democrats or leaned that way.
Is "Republican" the same label as "conservative"?  No, of course not.

PolitiFact came close to addressing Rubio's point by looking at the political leanings of moderates, but fell short by relying on the wrong label along with the self-identification standard.  Maddow's approach was even worse, as she took Rubio's comment out of context and apparently expected PolitiFact to do the same thing.

Meanwhile, PolitiFact defends itself with the usual banalities:
“Our goal at PolitiFact is to use the Truth-O-Meter to show the relative accuracy of a political claim,” Adair explained. “In this case, we rated it Mostly True because we felt that while the number was short of a majority, it was still a plurality. Forty percent of Americans consider themselves conservative, 35 percent moderate and 21 percent liberal. It wasn’t quite a majority, but was close.”

“We don’t expect our readers to agree with every ruling we make,” he continued.
Pretty weak, isn't it?

Update 2/19/2012:

With a hat tip to Kevin Drum of Mother Jones (liberal mag), we have survey data that help lend support to Marco Rubio (as well as to my argument in his defense):


1)  The survey, from Politico and George Washington University, is limited to likely voters.
2)  The poll essentially forces likely voters to choose between "liberal" and "conservative."
3)  A plurality of those surveyed (43 percent) lean Democrat or self-identify as Democrat.
4)  Despite the plurality of Democrats in the survey sample, 61 percent identify as conservative ("Very conservative" or "Somewhat conservative").

What's Wrong With the World: "How to Lie with Statistics, Example Umpteen"

Jeff and I hugely appreciate bloggers who delve into the more complicated PolitiFact-related issues.

Lydia McGrew of the "What's Wrong With the World" blog gives a proper dressing-down to the Obama administration, the Guttmacher Institute and our beloved PolitiFact over the supposedly "Mostly True" claim that 98 percent of Catholic women use birth control.

As is our wont, we'll focus primarily on PolitiFact's role in the mess.

(T)his Politifact evaluation of the meme gets it wrong again and again, and in both directions.

First, the Politifact discussion insists that the claim is only about women in this category who have ever used contraception. When I first heard that and hadn't looked at the study, I immediately thought of the fact that such a statistic would presumably include women who were not at the time of the study using contraception and had used it only once in the past. It was even pointed out to me that it would include adult converts whose use might easily have been prior to their becoming Catholic. However, that isn't correct, anyway. The study expressly was of current contraceptive use. That's, in a sense, "better" for the side that wants the numbers to be high.
McGrew pointed out earlier that the Guttmacher Institute study uses data for "women at risk for unintended pregnancy, whom we define as those who had had sex in the three months prior to the survey and were not pregnant, postpartum or trying to get pregnant."  The women surveyed were additionally in the 15-44 age range.  Yet PolitiFact describes the findings like so:
We read the study, which was based on long-collected, frequently cited government survey data. It says essentially that — though the statistic refers specifically to women who have had sex, a distinction Muñoz didn’t make.

But that’s not a large clarification, since most women in the study, including 70 percent of unmarried Catholic women, were sexually experienced.
That's fact checking?

(O)n this point, too, the Politifact evaluation is completely wrong. Politifact implies that only the supplementary table on p. 8 excluded these groups and that Figure 3 on p. 6 included them! But this is wrong. The table on p. 8 is simply supplementary to Figure 3, and both are taken from the same survey using the same restrictions! This is made explicit again and again in the study.
McGrew's exactly right.  The same information accompanies the asterisk for each table (bold emphasis added):  "*Refers to sexually active women who are not pregnant, postpartum or trying to get pregnant."

It doesn't occur to PolitiFact that restricting the survey population like that throws a serious spanner in the works.

That kind of credulity goes by a different name:  gullibility.

Visit What's Wrong With the World and read all of McGrew's skillful fisking of the liberal trio.  It's well worth it.


The Guttmacher Institute drew its data ultimately from here.

It may be the case that the Guttmacher study is reliable.  Regardless of that, PolitiFact did virtually nothing to clarify the issue.  A recent Washington Post story does shed some light on things, however:
I called up Rachel Jones, the lead author of this study, to have her walk me through the research. She agrees that her study results do not speak to all Catholic women. Rather, they speak to a specific demographic: women between 15- and 44-years-old who have ever been sexually active.

Jeff adds (2/15/2012): Over on PolitiFact's Facebook page, frequent PF critic Matthew Hoy offered up his usual spot-on commentary:
I find [PolitiFact's] failure to note that the Alan Guttmacher Institute is closely allied with Planned Parenthood a troubling omission. It isn't some neutral observer and its studies shouldn't be taken at face value without some healthy skepticism.
This isn't the first time PolitiFact has ignored Guttmacher's relationship with Planned Parenthood. Regardless of the study's accuracy, the alliance deserves at least a cursory disclosure. It's also important to note that PolitiFact used a similar connection to justify the rating of Florida Governor Rick Scott's claim about high-speed rail projects:
Scott bases his claims on hypothetical cost overruns from a suspect study written by a libertarian think tank...We rate Scott's claim False.
We highlighted that rating here.

Correction 2/17/2012:  "Guttmacher" was misspelled in the next-to-last paragraph.

Monday, February 13, 2012

Don Surber: "Obama proved it was government-run healthcare"

Don Surber notices the same connection between the actions of the Obama administration and PolitiFact's 2010 "Lie of the Year" that I pointed out last week.  We highlight it here at PolitiFact Bias because Surber makes the criticism of PolitiFact so effectively and directly:
On December 17, 2010, as part of its continuous support and defense of Obamacare, PolitiFact boldly declared as its Lie Of The Year “government takeover of health care.”

This week, President Obama’s own actions proved that PolitiFact’s editors were in error. By requiring that everyone’s health insurance (which will soon be mandatory) carry free birth control for women — no co-payments or no deductibles — not only does President Obama violate the 1st and 14th Amendments to the Constitution (religion and equal protection under the law) but President Obama lays bare the lie that this is not government-run health care.
Surber's observation serves as yet another indicator that liberals critical of the 2011 "Lie of the Year" selection were simply late to the party.  Not that a thorough progressivist indoctrination can't detect a full-on conservative bias at PolitiFact.

If PolitiFact has any chance of redemption on this issue, it comes from the doubt as to whether the executive branch has the authority it claims from the health care bill to require insurance companies to provide a service free of charge.

If no such authority exists, of course, it makes President Obama's position to the contrary a falsehood.

National Review: "Mark Hemingway on 'Fact-Checking'"

National Review's Reihan Salam amplifies (from December 2011) Mark Hemingway's criticism of PolitiFact, published in the Weekly Standard that same month:
I’ve often noticed that the “fact-checkers” in question are often obtuse, misleading, or both, and Hemingway makes the case in systematic fashion.
Salam quotes Hemingway's article extensively, adds his hearty agreement to Hemingway's criticism, and takes his own swipe at PolitiFact's lack of transparency with respect to ideological bias:
To the extent that the mission of PolitiFact is to offer richer context for the statements made by leading public officials, I’m all for it. That is part of what we try to do in this space. Yet there are at least two important distinctions: (a) our ideological perspective is clear; (b) we make an effort to make reference to and to provide links to contrary views, though perhaps not as much as we might if we presented ourselves as neutral observers on the political scene.
PolitiFact's use of the "non-partisan" label is a fig leaf.

Columbia Journalism Review: "What the Fact-Checkers Get Wrong"

If, in "What the Fact-Checkers Get Wrong," the Columbia Journalism Review mentions only the tip of the iceberg, we can at least take solace in the fact that this PolitiFact-related story is a step up from at least one previous effort.

Sure, the new item hints at liberal bias in that the piece appears to take for granted that only one of PolitiFact's last three "Lie of the Year" awards went to a statement that was defensibly true.

Regardless of that, it contains a solid criticism:
In fact, the sights of the broader fact-checking movement often seem to be set on something different than strict truth and falsehood. And by acknowledging that, the fact-checkers might grapple with some important questions about the project in which they’re engaged—and might see more clearly the box in which they’ve trapped themselves.

To get at those questions, it’s helpful to think about why “fact-checking” has emerged now. I’d argue that it’s a response to many journalists’ perception that they are ever more outgunned by the increasing volume and sophistication of professional political communication. The fact-check is a tool with which reporters can rescue themselves from oblivion. And the morally freighted language invoked by full-time fact-checkers—true and false, fact and lie—is a weapon, to be wielded by journalists with authority against other, presumably less trustworthy types who make political claims.
The CJR critique hits close (and the story acknowledges this explicitly) to one of the main points Mark Hemingway made in his controversial piece in the Weekly Standard in December:  Fact checkers are exalting their role in the epistemological chain, staking out for themselves a strategically valuable territory in the realm of public discourse.

Fact checking is, in a real sense, a power grab.

Friday, February 10, 2012

Don Surber: "PolitiFact fisked"

We're way overdue recognizing Don Surber's ongoing commentary regarding PolitiFact, but his "PolitiFact Fisked" is as good a start as any:
There is an old saying in the newspaper trade that if you are taking it from both sides, you must be doing something right. The reality is that you definitely are doing something wrong and in the case of PolitiFact, editor Bill Adair and company are doing plenty of things wrong.
Surber's always a great read, especially when he chooses PolitiFact as his subject.  The above is just a teaser for an excellent fisking of PolitiFact.  Read it all.

And could it be a coincidence that PolitiFact uses the fact of criticism from both sides to implicitly claim it's doing something right?

The Weekly Standard: "Pants on (three-alarm) Fire"

The Weekly Standard has some subscriber-only content criticizing PolitiFact Oregon.

Fortunately there's a preview:
PolitiFact Oregon—which works in partnership with the state’s most influential media outlet, the Oregonian—has been trying and failing to play referee in the race by evaluating the candidates’ statements. First, PolitiFact gave Cornilles its “Pants on Fire” rating for an ad claiming that Bonamici, a state legislator, voted to raise taxes 60 times. Now depending on how broadly you define “tax,” the claim is defensible. But if you want to split hairs—and boy, does PolitiFact ever like to do that—then you would say not that Bonamici has raised taxes 60 times, but that she has raised taxes and fees 60 times.
Maybe PolitiFact Oregon has a Parse-O-Tron instead of a Truth-O-Meter.

There's more to preview at the link, but only subscribers get the whole story.

Thursday, February 9, 2012

Relevant: "Measuring the Slant"

Power Line blog directs us to free online content from the Claremont Review of Books.

CRB reviewer James Q. Wilson reviews Tim Groseclose's "Left Turn:  How Liberal Media Bias Distorts the American Mind" and considers its place in media bias studies.

We're following Power Line's lead by not including any excerpted material.  Go.  Read.

Tuesday, February 7, 2012

Hoystory: "Obama’s War on Religion and Conscience"

Matthew Hoy is back at it with his usual biting commentary on PolitiFact. This time he shares his thoughts on the current debate about the effect of PPACA mandates on institutions of the Roman Catholic Church.

Hoy deals broadly with the controversy, but we'll highlight his mention of PolitiFact. At issue is PolitiFact's treatment of Newt Gingrich's statement that the PPACA requires religious institutions to provide insurance coverage for contraceptives:
After honestly analyzing the rule and the law, Politifraud labels Gingrich’s charge “mostly false” as they engage in an amount of hand-waving that would enable human flight without the aid of wings, engines or the other commonly required tools.
Still, if you consider a Catholic church to be a "Catholic institution," or a synagogue to be a "Jewish institution," Gingrich isn’t correct that the recent federal rule on contraceptives applies. Those nonprofit religious employers could choose whether or not they covered contraceptive services.
It’s pretty clear that Gingrich chose his words carefully here and Politifraud is muddying the waters. When I hear the words “Catholic institution” I think of everything Catholic that isn’t the church. I think of hospitals, soup kitchens, homeless shelters, adoption services, the Knights of Columbus, etc. Maybe it’s just because I’m likely more familiar with religious terminology than the (snark on) godless heathens (snark off) who populate many newsrooms, that I interpret it this way. But if the difference between a “True” or “Mostly True” ruling and a “Mostly False” ruling is over whether the word “institution” includes the church or not, then there’s way too much parsing going on.
Parsing words is nothing new for PolitiFact. But that's not the biggest flub Hoy spots:
In the video Politifact links to of Gingrich’s statement (provided by none other than Think Progress), Gingrich makes it clear that he is talking about the rule issued “last week.” The rule issued last week was the one regarding religious employers covering contraceptives in their health plans. Politifraud dishonestly expands that specific criticism of that specific rule into states can set their own benchmarks. No, they can’t. Not when it comes to the rule that came down “last week.” That rule says they MUST cover contraceptives.
Once again Hoy is spot on, though as usual our brief review doesn't do his work justice. Head over to Hoystory and read the whole thing.