Showing posts with label 2009.

Monday, August 25, 2014

Unearthing a truth PolitiFact buried, 2009

We've often reminded readers that we only scratch the surface of PolitiFact's mountain of journalistic malfeasance. Underscoring that point, we have an item from way back on Sept. 15, 2009, when PolitiFact was still connected to Congressional Quarterly.

The issue? Economist Thomas Sowell wrote that President Obama let the economic stimulus bill sit on his desk for three days before signing it.

PolitiFact:
In a recent column in Investor's Business Daily, economist and political commentator Thomas Sowell said that President Barack Obama was trying to rush his health care bill through Congress. Sowell cited the quick passage of the economic stimulus bill in February 2009 as proof that Obama is too hasty in passing major legislation.

Sowell wrote that "the administration was successful in rushing a massive spending bill through Congress in just two days — after which it sat on the president's desk for three days, while he was away on vacation."
In truth, Sowell wasn't trying to prove Obama was too hasty in passing major legislation. He was arguing Obama passes legislation hastily when there's no apparent reason to rush the legislation.

"Allow five days of public comment before signing bills"


A PolitiFact item from earlier that same year, on January 29, 2009, helps provide some context for Sowell's complaint:
"Too often bills are rushed through Congress and to the president before the public has the opportunity to review them," Obama's campaign Web site states. "As president, Obama will not sign any nonemergency bill without giving the American public an opportunity to review and comment on the White House Web site for five days."

But the first bill Obama signed into law as president — the Lilly Ledbetter Fair Pay Act — got no such vetting.
So, Obama promised he would wait at least five days before signing non-emergency legislation.  The three-day wait for the stimulus bill implies it qualified as an emergency bill. But not such an emergency that Obama couldn't wait a few days before signing it.

The key to PolitiFact's argument? An ultra-literal reading of "sat on the president's desk." In PolitiFact's judgment, since the bill wasn't literally sitting on the desk awaiting the president's signature, the case can't support Sowell's point.

Sowell expresses his point:
The only reasonable alternative seems to be that he wanted to get this massive government takeover of medical care passed into law before the public understood what was in it.

Moreover, he wanted to get re-elected in 2012 before the public experienced what its actual consequences would be.

Unfortunately, this way of doing things is all too typical of the way this administration has acted on a wide range of issues.
The example using the stimulus bill followed. Sowell points out that spending from the stimulus bill took place over an extended period, making a mockery of the notion that the stimulus was intended as a strong short-term Keynesian stimulus.

Sowell's point with his example remains: If the stimulus bill was an emergency, then why not sign it as soon as possible?

How did PolitiFact miss Sowell's point? Maybe PolitiFact wasn't interested in Sowell's point. How did PolitiFact miss the context of President Obama's broken transparency pledge on legislative action? Maybe PolitiFact wasn't interested in that context.


Correction 8-25-2014:  Referred to the Affordable Care Act in one instance where the stimulus bill was intended.

Sunday, May 25, 2014

PolitiMath on uninsured Americans

A pseudonymous tipster pointed out problems with an old PolitiFact rating from 2009.

PolitiFact rated President Obama "Mostly True" for his statement that nearly 46 million Americans lack health insurance.

PolitiFact examined Census Bureau data confirming the president's figure, but noted it included 9.7 million non-citizens.  Our tipster pointed out that the number also included an estimated 14 million already eligible for government assistance in getting health insurance. 
The 2004 Census Current Population Survey (CPS) identified 44.7 million non-elderly uninsured in 2003. Blue Cross and Blue Shield Association contracted with the Actuarial Research Corporation (ARC) to provide a detailed analysis of the uninsured identified by the Census Bureau, which found:
  • Nearly one-third were reachable through public programs, such as Medicaid and the SCHIP program for children
  • One-fifth earn $50,000 or more annually and may be able to afford coverage
  • Almost half may have difficulty affording coverage because they earn less than $50,000 per year. Many of these people work for small firms that do not offer health coverage
Given that Obama was using the number of uninsured to promote the need for government intervention, PolitiFact should have mentioned the number of uninsured already able to take advantage of government help.  We're seeing that this year as at least 380,000 of those the administration says are gaining Medicaid through the ACA were already eligible before the law was passed. The administration can claim some credit for getting eligible persons signed up, but it's misleading to say all those signing up for Medicaid are gaining their coverage thanks to the ACA, just as it was misleading to use 14 million assistance-eligible Americans to show the need to offer more of the same kind of assistance.  The need was exaggerated, and PolitiFact failed to properly notice the size of the exaggeration.

The PolitiMath angle

We use the term PolitiMath for the relationship between PolitiFact's math and its "Truth-O-Meter" ratings.  Many journalists have trouble properly calculating percentage error, and in this item we find PolitiFact's former chief editor (Bill Adair) and its present chief editor (Angie Drobnic Holan) making a common mistake:
Getting back to Obama's statement, he said, "Nearly 46 million Americans don't have health insurance coverage today." That is the most recent number for the U.S. Census available, but he messes it up in one way that would tend to overcount the uninsured and in another way that would tend to undercount them.

It's an overcount because it counts noncitizens. Take out the 9.7 million noncitizens and the actual number is closer to 36 million. 

... So Obama is sloppy by saying it is for "Americans" but not accounting for the noncitizens, which leaves him off by about 22 percent.
PolitiFact's likely equation: (46-36)/46, yielding about 21.7 percent

It's the wrong equation, and this is not controversial.  It's basic math.  To find the percentage error, the accurate value belongs in the denominator.

The right equation: (46-36)/36, yielding about 27.8 percent
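The arithmetic can be checked with a few lines of Python, using the 46 million and 36 million figures from the PolitiFact excerpt above:

```python
def percent_error(claimed, accurate):
    """Percentage error: the difference relative to the ACCURATE value."""
    return abs(claimed - accurate) / accurate * 100

claimed, accurate = 46, 36  # millions of uninsured, per the excerpt

# Wrong form: dividing by the claimed value understates the error.
wrong = abs(claimed - accurate) / claimed * 100
print(round(wrong, 1))                             # 21.7 -- PolitiFact's "about 22 percent"
print(round(percent_error(claimed, accurate), 1))  # 27.8 -- the correct figure
```

The point is visible in the output: putting the claimed value in the denominator shrinks the reported error by about six percentage points.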

Marc Caputo of the Miami Herald, a PolitiFact partner paper, made the same mistake months ago and vigorously defended it on Twitter.  Caputo argued that it's okay to do the equation either way.  One can execute the equation accurately in either form, but executing the wrong equation gives the wrong final figure.  Journalists need to consider the ramifications of having two different options for calculating an error percentage.  If one chooses the method in a way that favors one party over another, then a pattern of that behavior turns into evidence of political bias.

Caputo used the method more damaging to the Republican to whom he referred.

In Adair and Holan's case, guess which party received the benefit of the wrong equation?

It's a statistic worth following.

Sunday, March 13, 2011

Introducing PFB's Anecdote-O-Meter

Just kidding with the title.  The o-meter stuff is too trite to use as more than a one-off joke.

But the subject is anecdotes and their role in helping to show bias in a body of work.

Most of us think we can perceive bias in an individual piece of journalism--anecdotal evidence.  And we may often be right, but the appearance of bias in reporting or fact checking may simply occur as the result of writer error.

So how does one tell the difference between mere errors and ideological bias?

One method, bolstered by the methods of science, involves counting the number of errors and tracking any association with political parties or ideas.  Mere errors ought to occur roughly equally in stories regarding Republicans compared to those involving Democrats.  Where the errors harm one party more than the other beyond the line of statistical significance, evidence of a political bias has come to light.  The degree of deviation from best practices also may figure in a scientific study of journalistic errors.
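The kind of test described above can be sketched in a few lines of Python. This is a minimal illustration, not the blog's actual methodology, and the tally of 40 errors out of 50 harming one party is entirely hypothetical:

```python
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Two-sided exact binomial test: probability of a split at least as
    extreme as k-of-n errors harming one party, if errors were equally
    likely to harm either party (p = 0.5)."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(pr for pr in probs if pr <= probs[k] + 1e-12)

# Hypothetical tally: 40 of 50 identified errors harmed one party.
print(binom_two_sided_p(40, 50) < 0.05)  # True: unlikely if errors were even-handed
```

A lopsided split like 40-of-50 yields a p-value far below 0.05, which is what "beyond the line of statistical significance" cashes out to; an even 25-25 split yields no evidence of bias at all.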

In March of 2008 at my blog Sublime Bloviations, I started tagging relevant posts with "grading PolitiFact."  On occasion I have criticized PolitiFact for harsh grading of Democrats.  The vast majority of the posts, in accordance with my selection bias, consists of criticisms of faulty grades given to Republicans or conservatives, or of ridiculously gentle treatment of Democrats or progressives.

I work under no illusion that the list represents definitive evidence of a systemic bias at PolitiFact.  But the number of times PolitiFact's grades go easy on Democrats and tough on Republicans does count as important and legitimate evidence supporting (not definitively) the charge of bias.

If the ideological bias at PolitiFact is not significant, then it should be possible to compile a list of comparable size and quality containing criticisms of PolitiFact where PolitiFact favors Republicans and deals harshly with Democrats.

I don't foresee that occurring.

The list from Sublime Bloviations, in chronological order (and do pardon the more polemical bent of the earlier entries; I was shocked by the amateurish fact checks I was reading):

Friday, January 14, 2011

Internet Scofflaw: "Decision-making, Dithering, and Sitting on Desks"

Internet Scofflaw has an excellent piece detailing PolitiFact's habit of ignoring the common usage of a word or phrase in order to make a rating fit a particular narrative.

In October of 2009 Robert Gibbs and Dick Cheney exchanged barbs over the handling of troop requests, and PolitiFact inevitably came to Gibbs' defense. Internet Scofflaw starts with some background and explains why Gibbs made a bogus statement in the first place:
Last month, Robert Gibbs fired back at Dick Cheney’s (inarguable) accusation that President Obama is dithering about Afghanistan, saying:

[Gibbs:] "The vice president was for seven years not focused on Afghanistan. Even more curious given the fact that an increase in troops sat on desks in this White House, including the vice president’s, for more than eight months, a resource request filled by President Obama in March."

Obviously Gibbs’s effort to tie in the vice president is rubbish, since the vice president is not in the chain of command. But what about the central accusation that the request sat on President Bush’s desk for more than eight months?
That final question is exactly what PolitiFact decided to rate. Not surprisingly, they rated Gibbs True:
The public doesn't have access to [U.S. commander in Afghanistan, Gen. David] McKiernan's formal request for more troops. But we know that he was talking about it publicly in September 2008, at least 4 1/2 months before the end of Bush's term. And McKiernan told reporters his request went back nearly to the start of his taking over as the top U.S. commander four months before that. That would suggest Gibbs' claim is correct that it had been sitting on desks in the White House for eight months. And so we rule his statement True.
Internet Scofflaw does the legwork PolitiFact failed to do and comes up with a more honest analysis of Gibbs' statement:
If “sat on desks” meant the same thing as “was not fully fulfilled”, then Gibbs and the St. Petersburg Times would have a strong case. (Of course, by that definition, Gen. McChrystal’s request will probably be sitting on Obama’s desk forever, since all indications are that it will not be fully granted.) But that’s not what the phrase means. To “sit on a desk” means that no decision was made. That is not at all the case with Gen. McKiernan’s requests for troops.

As ABC News explains, McKiernan made several requests for troops over his months in command, totaling about 30,000 troops. Some of the requests were granted, but most were not, as the Surge in Iraq was making heavy demands. Instead, the Bush administration tried to get NATO to fill the gap. By the fall of 2008 it was clear that NATO was not going to come through, and with the Surge winding down, more US troops were available for Afghanistan and were sent. In March 2009, with Iraq quiet and troops withdrawals underway, the balance was sent by President Obama.

So what you saw from President Bush is the normal process of allocating scarce military resources where they are most needed. In other words, you saw decision-making. In March you saw the same from President Obama. But now, on the other hand, you see Obama unable to make a decision. Dithering.
It's important to note that PolitiFact's close relationship with ABC News didn't begin until 2010. That being said, it's interesting that PolitiFact ignored ABC reporter Jake Tapper's conclusion on his blog (made the day prior to PolitiFact's rating):
So Gibbs’s claim that for “eight months” McKiernan’s request for troops “sat on desks” isn’t accurate.
Internet Scofflaw ends with a question we find ourselves asking all the time:
What use is a fact checker that sides with the administration regardless of the facts?
You can read the full piece here.


*****
Bryan adds:

The Internet Scofflaw assessment largely agrees with one I published at the time.  Please excuse me as I risk upper extremity injury by patting myself on the back.

Tuesday, January 4, 2011

The American Spectator: PolitiFact's Fixers

Back in 2009, PolitiFact used Matthew Vadum as an expert source in a story about ACORN and Rep. Michele Bachmann (R-Minn.).

Vadum was subsequently appalled by the way the story turned out, as shown by his contribution to The American Spectator:
Journalistic bias is one thing, but journalistic arrogance is quite another.

When reporters claiming to be neutral political fact-checkers go beyond mere reporting to state with absolute certainty things they cannot possibly know, they run the risk of churning out political opinion masquerading as high-minded investigative journalism.
Be sure to read the whole thing.