Showing posts with label Bernie Sanders.

Monday, February 24, 2020

Nothing To See Here: Sanders blasts health insurance "profiteering"

While researching PolitiFact's false accusation that Democratic presidential candidate Pete Buttigieg used "bad math" to criticize the budget gap created by fellow candidate Bernie Sanders' spending proposals, we stumbled across a claim from Sen. Sanders that was ripe for fact-checking.

Sanders said his proposed health care plan would end profiteering practices from insurance and drug companies that result in $100 billion or so in annual profits (bold emphasis added):
Just the other day, a major study came out from Yale epidemiologist in Lancet, one of the leading medical publications in the world. What they said, my friends, is Medicare for all will save $450 billion a year, because we are eliminating the absurdity of thousands of separate plans that require hundreds of billions of dollars of administration and, by the way, ending the $100 billion a year in profiteering from the drug companies and the insurance companies.
PolitiFact claims to use an "Is that true?" standard as one of its main criteria for choosing which claims to check.

We have to wonder if that's true, or else how could a fact checker pass over the claim that profiteering netted $100 billion in profits for those companies? Do fact checkers think "profit" and "profiteering" are the same thing?

Is a fact checker who thinks that worthy of the name?

Sanders' claim directly implies that the Affordable Care Act passed by Democrats in 2010 was ineffective in its efforts to circumscribe insurance company profits. The ACA limits insurer overhead and profit through minimum "medical loss ratios." Any excess, by law, gets refunded to the insured.

Sanders said it's not working. And the fact checkers don't care enough to do a fact check?

Of course PolitiFact went through the motions of checking a similar claim, as we pointed out. But using "profiteering" in the claim changes things.

Or should.

Ultimately, it depends on whether PolitiFact has the same interest in finding falsehoods from Democrats as it does for Republicans.

Sunday, February 23, 2020

PolitiFact absurdly charges Pete Buttigieg with "bad math"

PolitiFact gave some goofy treatment to a claim from Democratic presidential candidate Pete Buttigieg.

Buttigieg compared the 10-year unpaid cost of fellow candidate Bernie Sanders' new spending proposals to the current U.S. GDP.

PolitiFact cried foul. Or, more precisely, PolitiFact cried "bad math."


Note that PolitiFact says Buttigieg did "bad math."

PolitiFact's fact check never backs that claim.

If Buttigieg is guilty of anything bad, it's a poor job of providing thorough context for the measure he used to illustrate the size of Sanders' "budget hole." Buttigieg was comparing a cumulative 10-year budget hole with one year of U.S. GDP.

PolitiFact notwithstanding, there's nothing particularly wrong with doing that. Maybe Buttigieg should have provided more context, but there's a counterargument to that point: Buttigieg was on a debate stage with a sharply limited amount of time to make his point. In addition, the debate audience and the other candidates may be expected to have some familiarity with cost estimates and GDP. In other words, it's likely many or most in the audience knew what Buttigieg was saying.

Let's watch PolitiFact try to justify its reasoning:
But there’s an issue with Buttigieg’s basic comparison of Sanders’ proposals to the U.S. economy. He might have been using a rhetorical flourish to give a sense of scale, but his words muddled the math.

The flaw is that he used 10-year cost and revenue estimates for the Sanders plans and stacked them against one year of the nation’s GDP.
PolitiFact tried to justify the "muddled math" charge by noting Buttigieg compared a 10-year cost estimate to a one-year figure for GDP.

But it's not muddled math. The 10-year estimates are the 10-year estimates, mathematically speaking. And the GDP figure is the GDP figure. Noting that the larger figure is larger than the smaller figure is solid math.

PolitiFact goes on to say that the Buttigieg comparison does not compare apples to apples, but so what? Saying an airplane is the size of a football field is also an apples-to-oranges comparison. Airplanes, after all, are not football fields. But the math remains solid: 100 yards equals 100 yards.

Ambiguity differs from error

In fact-checking, the correct response to ambiguity is charitable interpretation. After applying charitable interpretation, the fact checker may then consider ways the message could mislead the audience.

If Buttigieg had run a campaign ad using the same words, it would make more sense to grade his claim harshly. Such a message in an ad is likely to reach people without the knowledge base to understand the comparison. But many or most in a debate audience would understand Buttigieg's comparison without additional explanation.

It's an issue of ambiguous context, not "bad math."



Correction Feb. 26, 2020: Omitted the first "i" in "Buttigieg" in the final occurrence in the next-to-last paragraph. Problem corrected.

Thursday, February 20, 2020

PolitiFact weirdly unable to answer criticism

Our title plays off a PolitiFact critique Dave Weigel wrote back in 2011 (Slate). PolitiFact has a chronic difficulty responding effectively to criticism.

Most often PolitiFact doesn't bother responding to criticism. But if it makes its liberal base angry enough, it will sometimes trot out some excuses.

This time PolitiFact outraged supporters of Democratic (Socialist) presidential candidate Bernie Sanders with a "Mostly False" rating of Sanders' claim that fellow Democratic presidential candidate Michael Bloomberg "opposed modest proposals during Barack Obama’s presidency to raise taxes on the wealthy, while advocating for cuts to Medicare and Social Security."

Reactions from left-leaning journalists Ryan Grim and Ryan Cooper were typical of the genre.



The problem isn't that Sanders wasn't misleading people. He was. The problem stems from PolitiFact's inability to reasonably explain what Sanders did wrong. PolitiFact offered a poor explanation in its fact check, appearing to reason that what Sanders said was true but misleading and therefore "Mostly False."

That type of description typically fits a "Half True" or a "Mostly True" rating--particularly if the subject isn't a Republican.

PolitiFact went to Twitter to try to explain its decision.

First, PolitiFact made a statement making it appear that Sanders was pretty much right:



Then PolitiFact (rhetorically) asked how the true statements could end up with a "Mostly False" rating. In reply to its own question, we got this:
Because Sanders failed to note the key role of deficit reduction for Bloomberg.
Seriously? Missing context tends to lead to the aforementioned "Mostly True" or "Half True" ratings, not "Mostly False" (unless it's a Republican). Sanders is no Republican, so of course there's outrage on the left.

Anyway, who cuts government programs without having deficit reduction in mind? That's pretty standard, isn't it?

How can PolitiFact be this bad at explaining itself?

In its next explanatory tweet PolitiFact did much better by pointing out Bloomberg agreed the Obama deficit reduction plan should raise taxes, including taxes on wealthy Americans.

That's important not because it's on the topic of deficit reduction but because Sanders made it sound like Bloomberg opposed tax hikes on the wealthy at the federal level. Recall Sanders' words (bold emphasis added): "modest proposals during Barack Obama’s presidency to raise taxes on the wealthy."

Mentioning that the proposals occurred during the Obama presidency led the audience to think Sanders meant Bloomberg opposed tax hikes at the federal level. But Sanders was talking about Bloomberg's opposition to tax hikes in New York City, not nationally.

PolitiFact mentioned that Bloomberg had opposed the tax hikes in New York, but completely failed to identify Sanders' misdirection.

PolitiFact's next tweet only created more confusion, saying "Sanders’ said Bloomberg wanted entitlement cuts and no tax hikes. That is not what Bloomberg said."

But that's not what Sanders said. 

It's what Sanders implied by juxtaposing mention of the city tax policy with Obama-era proposals for slowing the growth of Medicare and Social Security spending.

And speaking of those two programs, that's where PolitiFact really failed with this fact check. In the past PolitiFact has distinguished, albeit inconsistently, between cutting a government program and slowing its growth. It's common in Washington D.C. to call the slowing of growth a "cut," but such a cut from a higher growth projection differs from cutting a program by making its funding literally lower from one year to the next. Fact checkers should identify the baseline for the cut. PolitiFact neglected that step.
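To illustrate the distinction with numbers of our own (purely hypothetical, not drawn from the fact check or from any actual budget proposal), here is a minimal sketch of how a program can grow every single year and still count as "cut" relative to a higher-growth baseline:

```python
# Hypothetical illustration of "cut" vs. "slowed growth." The growth rates and
# dollar figures are invented for this example; they are not PolitiFact's or
# Bloomberg's numbers.
baseline_growth = 0.05   # current-law projection: 5 percent growth per year
proposal_growth = 0.03   # proposal: 3 percent growth per year
spending = 100.0         # year-zero spending, in arbitrary billions

for year in range(1, 6):
    projected = spending * (1 + baseline_growth) ** year
    proposed = spending * (1 + proposal_growth) ** year
    # Spending rises every year under the proposal, yet Washington usage
    # would still call the gap below the projection a "cut."
    print(f"Year {year}: projection {projected:.1f}, proposal {proposed:.1f}, "
          f"'cut' vs. projection {projected - proposed:.1f}")
```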

If PolitiFact had noted that Bloomberg's supposed cuts to Social Security and Medicare were cuts to future growth projections, it could have called out Sanders for the misleading imprecision.

PolitiFact could have said the Social Security/Medicare half of Sanders' claim was "Half True" and that taking the city tax policy out of context was likewise "Half True." And if PolitiFact did not want to credit Sanders with a "Half True" claim by averaging those ratings then it could have justified a "Mostly False" rating by invoking the misleading impression Sanders achieved by juxtaposing the two half truths.


Instead, we got yet another case of PolitiFact weirdly unable to answer criticism.

Saturday, January 25, 2020

We republished this item because we neglected to give it a title when it was first published.

Forgetting the title results in a cumbersome URL, which made it a good idea to republish.

So that's what we did. Find the post here.

Friday, January 17, 2020

Fact checkers decide not to check facts in fact check of Bernie Sanders

As a near-perfect follow-up to our post about progressives ragging on PolitiFact over its centrist bias, we present this Jan. 15, 2020 PolitiFact fact check of Democratic presidential candidate Sen. Bernie Sanders:


Sanders said his plan would "end" $100 billion in health care industry profits, and PolitiFact plants a "True" Truth-O-Meter graphic just to the right of that claim.

But there's no fact check here of whether Sanders' plan would end $100 billion in profits. Instead the fact check looks at whether the health care industry makes $100 billion in profits (bold emphasis added):
The Sanders campaign shared its math, and it’s comprehensive.

The $100 billion total comes from adding the 2018 net revenues -- as disclosed by the companies -- for 10 pharmaceutical companies and 10 companies that work in health insurance.

We redid the numbers. Sanders is correct: The total net revenues, or profits, these companies posted in 2018 comes to just more than $100 billion - $100.96 billion, in fact. We also spoke to three independent health economists, who all told us that the math checks out.

There are a couple of wrinkles to consider. Some of the companies included -- Johnson & Johnson, for instance -- do more than just health care. Those other services likely affect their bottom lines.

But more importantly, $100 billion is likely an underestimate, experts told us.
It looks to us like PolitiFact meticulously double-checked equations that did not adequately address the issue of health care profits.

On the one hand "We redid the numbers. Sanders is correct." But on the other hand "$100 billion is likely an underestimate."

The fact checkers are telling us Sanders was accurate but probably wrong.

But we've only covered a premise of Sanders' claim. The meat of the claim stems from Sanders saying he will "end" those profits.

Did Sanders mean he would cut $100 billion in profit or simply reduce profits by some unspecified amount? We don't see how a serious fact-check effort can proceed without somehow addressing that question.

PolitiFact proceeds to try to prove us wrong (bold emphasis added):
Sanders suggested that Medicare for All would "end" the $100 billion per year profits reaped by the health care industry.

The proposal would certainly give Washington the power to do that.

"If you had Medicare for All, you have a single payer that would be paying lower prices," Meara said.

That means lower prices and profits for pharmaceuticals, lower margins for insurers and lower prices for hospitals and health systems.

That could bring tradeoffs: for instance, fewer people choosing to practice medicine. But, Meara noted, the number supports Sanders’ larger thesis. "There’s room to pay less."
Though PolitiFact showed no inclination to pin down Sanders' meaning, the expert PolitiFact cited (professor of health economics Ellen Meara) translates Sanders' point as "There's room to pay less."

Do the fact checkers care how much less? Is PolitiFact merely fact-checking whether Sanders' plan would lower profit margins at all, no matter by how much?

Side note: PolitiFact's expert donates politically to Democrats. PolitiFact doesn't think you need to know that. PolitiFact is also supposedly a champion of transparency.

Where's the Fact Check?

PolitiFact does not know how much, if at all, the Sanders plan would cut profit margins.

PolitiFact does not specify how it interprets Sanders' claim of bringing an "end" to $100 billion in profits (the cited expert expects a lower profit margin but offers no estimate).

The bulk of the fact check is a journalistic hole. It fails to offer any kind of serious estimate of how much the Sanders plan might trim profits. If the plan trims profits down to $75 billion, presumably PolitiFact would count that as ending $100 billion in profits.

Using that slippery understanding, quite a few outcomes could count as ending $100 billion in profits. But how many prospective voters think Sanders is promising to save consumers that $100 billion?

"Fact-checking."

That's no "centrist bias." That's doing Sanders a huge favor. It's liberal bias, the prevalent species at PolitiFact.

Thursday, January 2, 2020

PolitiFact and Bernie Sanders explain the gender pay gap

Everybody knows about the gender pay gap, right?

It's the statistic Democrats habitually misuse to amplify their focus on "equal pay for equal work." Fact checkers like PolitiFact punish that traditional deception by rating it "Mostly True" most of the time, or sometimes just "True."

Let's take a look at PolitiFact's latest PolitiSplainer on the gender wage gap, this time featuring Democratic Party presidential candidate and "democratic socialist" Bernie Sanders.

Such articles might more appropriately wear the label "unexplainer."

PolitiFact starts out with exactly the kind of ambiguity Democratic Party leaders love, obscuring the difference between the raw gender wage gap and the part of the gap (if any) caused by gender discrimination:
The disparity in how much women make compared with men comes up often in the political discourse, tagged with a call to action to help women’s paychecks catch up.
Running just above that sentence, the featured image directs readers toward the gender-discrimination explanation for the gender pay gap. Plausibly deniable? Of course. PolitiFact didn't mean it that way or something, right?


PolitiFact goes on to tell its readers that a number of Democrats have raised the gender pay gap issue while on the campaign trail. The paragraph contains four hotlinks:
Several leading Democratic presidential candidates recently highlighted one of the biggest imbalances — saying that a Latina woman must work 23 months to make the amount a white man makes in one year, or that they make 54 cents on the dollar.
Each of the statements from Democrats highlighted the gender pay gap in an ambiguous and misleading way. None of the statements bothered to distinguish between the raw pay gap, caused by a variety of things including women working fewer hours, and the hard-to-measure pay gap caused by employers' sexual discrimination.

The claim from Mayor Pete Buttigieg was pretty much incoherent and would have made great fodder for a fact check (54 cents on the dollar isn't enough to live on? Doesn't that depend on the size of the dollar in the comparison?).

PolitiFact highlighted the version of the claim coming from Sen. Sanders:



Sanders' use of the gender pay gap fits the standard pattern of deception. He leads with a figure from the raw wage gap, then assures the audience that "Equal pay is not radical ... It's an issue of basic justice."

But Sanders is misleading his audience. "Equal pay for equal work" isn't radical and may count as an issue of basic justice. But equal pay regardless of the work done is very radical in the United States. And that's what Democratic candidates imply when they base their calls for equal pay on the disparities in the raw gender wage gap.

If only there were fact checkers who could explain that deception to the public!

But, no, PolitiFact does not explain Sanders' deception.

In fact, it appears PolitiFact has never rated Sanders on a claim related to the gender wage gap.

PolitiFact did not rate the misleading tweet featured in its PolitiSplainer. Nor did it rate any of the claims linked above.
PolitiFact ratings of the gender wage gap tend to graciously overlook the fact that Democrats almost invariably invoke the raw gender wage gap when stumping for equal pay for equal work, as Sanders did above. Does the raw gender wage gap have much of anything to do with the wage gap just from discrimination? No. There's hardly any relationship.

Should Democrats admit they want equal pay for unequal work, it's likely the American people will let them know that the idea is not mainstream and not an issue of basic fairness.

PolitiFact ought to know that by now. But you won't find it in their fact checks or PolitiSplainers dealing with the gender wage gap.

How Big is the Pay Gap from Discrimination?

Remarkably, PolitiFact's PolitiSplainer on the pay gap almost takes a pass on pinning down the role discrimination might play. One past PolitiSplainer from 2015 actually included the line from the CONSAD report's Foreword (by the Department of Labor) suggesting there may be no significant gender discrimination at all found in the raw wage gap.

In the 2019 PolitiSplainer we got this:
We often hear that discriminatory practices are a reason why on average women are paid less than men. Experts say it’s hard to measure how much of a role that discrimination plays in the disparity.

"Research shows that more than half of the gap is due to job and industry segregation — essentially, women tend to work in jobs done primarily by other women, and men tend to work in jobs done primarily by other men and the ‘men’s jobs’ are paid more," said Jennifer Clark, a spokeswoman for the Institute for Women’s Policy Research.

Clark cited education and race as other factors, too.
Such a weak attempt to explain the role of discrimination in the gender pay gap perhaps indicates that PolitiFact's aim was to explain the raw gender wage gap. Unfortunately for the truth, that explanation largely stayed within the lines of the traditional Democratic Party deceit: Mention the raw gender wage gap and then advocate legislation supposedly helping women receive equal pay for equal work.

That juxtaposition sends the clear message the raw gender wage gap relates to discrimination.

Supposedly neutral and objective fact checkers approve the deception, so it must be okay.

We have no reason to suppose mainstream fact checkers like PolitiFact will stop playing along with the misdirection.

Thursday, April 25, 2019

Bernie Sanders + PolitiFact + Equivocation = "True"

When PolitiFact plucks a truth from a bed of untruth (or vice-versa) we call it "Tweezers" and tag the example with the "tweezers or tongs" tag.

But every once in a while PolitiFact goes beyond tweezing to pretend that the tweezed item and the bed of untruth were both true.

And that's the case with a PolitiFact Vermont fact check of Democratic Party presidential candidate Sen. Bernie Sanders (I-Vt.).

It's true, as Sanders said, that people in jail in Vermont may vote. Except perhaps those in jail convicted of voter fraud or other crimes that may run afoul of Vermont's constitutional stipulation that voters must maintain "quiet and peaceable behavior."

The problem occurs in the middle of Sanders' claim. Vermont's original 1793 Constitution (like its 1777 Constitution) limits voting to men above a certain age. So it's just not true that it says "everybody can vote."

PolitiFact might have solved the problem in the fact check header by shortening the quotation with an ellipsis. Like this: "In my own state of Vermont ... people in jail can vote." That statement pretty much counts as true if we assume that everyone in jail is of quiet and peaceable behavior by Vermont's definition.

Unfortunately, the text of PolitiFact Vermont's fact check reinforces the false middle of Sanders' claim instead of either explicitly excluding it or providing accurate context. The fact check does not mention that Vermont's early constitution did not allow women to vote. Nor does it let on that men had to attain a certain age to vote.

Given those omissions, we count it a major victory that PolitiFact noted the stipulation that voters must be of "quiet and peaceable behavior"--not that the potential exceptions affected PolitiFact's rating of Sanders' claim:
Our ruling

Sanders said: "In my own state of Vermont, from the very first days of our state’s history, what our Constitution says is that everybody can vote. That is true. So people in jail can vote."

It’s true that Vermont felons can vote from prison today, and we can’t find anything to suggest that hasn’t always been the case in the state. Though it seems quite possible that the efforts being made today to allow them to cast ballots hasn’t always been made.

The Vermont Constitution requires people to be of "quiet and peaceable behavior," but otherwise places no restrictions on who can vote. And Sanders said prisoners "can" vote, not that they always have voted.

We rate this claim True.
 Though PolitiFact claims Vermont's constitution places no restrictions on who can vote (other than "quiet and peaceable behavior"), the fact is that Vermont places a number of restrictions on who can vote:
§ 42. [VOTER'S QUALIFICATIONS AND OATH]

Every person of the full age of eighteen years who is a citizen of the United States, having resided in this State for the period established by the General Assembly and who is of a quiet and peaceable behavior, and will take the following oath or affirmation, shall be entitled to all the privileges of a voter of this state:

You solemnly swear (or affirm) that whenever you give your vote or suffrage, touching any matter that concerns the State of Vermont, you will do it so as in your conscience you shall judge will most conduce to the best good of the same, as established by the Constitution, without fear or favor of any person.

Every person who will attain the full age of eighteen years by the date of the general election who is a citizen of the United States, having resided in this State for the period established by the General Assembly and who is of a quiet and peaceable behavior, and will take the oath or affirmation set forth in this section, shall be entitled to vote in the primary election.
PolitiFact's fact check fairly overflows with misinformation and ends up calling the false parts of Sanders' statement true.

Is this why we have fact checkers or what?

Monday, March 4, 2019

The underlying point saves the day for a Bernie Sanders falsehood?

For some reason, there are people who believe that if a fact checker checks both sides, then the fact checker is neutral.

We've kept pointing out that checking both sides is no kind of guarantee of nonpartisanship. It's a simple matter to give harsher ratings to one side while rating both sides. Or softer ratings to one side while rating both sides.

Latest case in point: Democratic presidential candidate Bernie Sanders.

Sanders claimed that the single-payer health care system in Canada offers "quality care to all people without out-of-pocket expenses."

PolitiFact found that the Canadian system does not eliminate out-of-pocket expenses (contradicting Sanders' claim).

And then PolitiFact gave Sanders' claim a "Half True" rating.

Seriously. That's what PolitiFact did.


PolitiFact's summary is remarkable for not explaining how Sanders managed to eke out a "Half True" rating for a false statement. PolitiFact describes what's wrong with the statement (how it's false) and then proclaims the "Half True" ruling:
Sanders said, "In Canada, for a number of decades they have provided quality care to all people without out-of-pocket expenses. You go in for cancer therapy, you don't take out your wallet."

So long as the care comes from a doctor or at a hospital, the Canadian system covers the full cost. But the country’s public insurance doesn’t automatically pay for all services, most significantly, prescription drugs, including drugs needed to fight cancer.

Out-of-pocket spending is about 15 percent of all Canadian health care expenditures, and researchers said prescription drugs likely represented the largest share of that.

The financial burden on people is not nearly as widespread or as severe as in the United States, but Sanders made it sound as though out-of-pocket costs were a non-issue in Canada.

We rate this claim Half True.
See?

PolitiFact says Sanders made it sound like Canadians do not pay out-of-pocket at all for health care. But Canadians do pay a substantial share out of pocket; therefore, making it sound like they don't is "Half True."

Republicans, don't get the idea that you can say something PolitiFact describes as false in its fact check and then skate with a "Half True" rating on the "Truth-O-Meter."

Wednesday, December 5, 2018

Handicapping PolitiFact's "Lie of the Year" Candidates (Updated)


It's that time of year again, when the supposedly non-partisan and unbiased folks at PolitiFact prepare an op-ed about the most significant lie of the year, PolitiFact's "Lie of the Year" award.

At PolitiFact Bias we have made a tradition of handicapping PolitiFact's list of candidates.

So, without further ado:


To the extent that Democrats think Trump's messaging on immigration helped Republicans in the 2018 election cycle, this candidate has considerable strength. I (Jeff can offer his own breakdown if he wishes) rate this entry as a 6 on a scale of 1-10 with 10 representing the strongest.


This claim involving Saudi Arabia qualifies as my dark horse candidate. By itself the claim had relatively little political impact. But Trump's claim relates to the murder of Saudi journalist (and U.S. resident) Jamal Khashoggi. Journalists have disproportionately gravitated toward that issue. Consonant with journalists' high estimation of their own intelligence and perception, this is the smart choice. 6.


This claim has much in common with the first one. It deals with one of the key issues of the 2018 election cycle, and Democrats may view this messaging as one of the reasons the "blue wave" did not sweep over the U.S. Senate. But the first claim came from a popular ad. And the first claim was rated "Pants on Fire" while PolitiFact gave this one a mere "False" rating. So this one gets a 5 from me instead of a 6.



PolitiFact journalists may like this candidate because it undercuts Trump's narrative about the success of his economic policies. Claiming U.S. Steel is opening new plants after Trump slapped tariffs on aluminum and steel makes the tariffs sound like a big success. But not so much if there's no truth to it. How significant was it politically? Not so much. I rate this one a 4.



If this candidate carries significant political weight, it comes from the way the media narrative contradicting Trump's claim helped lead to the administration's reversal of its border policy. That reversal negated, at least to some extent, a potentially effective Democratic Party election-year talking point. I rate this one a 5.


That's five from President Trump. Are PolitiFact's candidates listed "in no particular order"? PolitiFact does not say.



Bernie Sanders' claim about background checks for firearm purchases was politically insignificant. Pickings from the Democratic Party side were slim. Democrats only had about 12 false ratings through this point in 2018, including "Pants on Fire" ratings. Republicans had over 80, for comparison. I give this claim a 1.



As with the Sanders claim, the one from Ocasio-Cortez was politically insignificant. It was ignorant, sure, but Ocasio-Cortez was guaranteed to win in her district regardless of what she said. Her statement would have been just as significant politically if she said it to herself in a closet. This claim, like Sanders', rates as a 1.




Is this the first time a non-American made PolitiFact's list of candidates? This claim ties into the same subject as last year's winner, Russian election interference. About last year's selection I predicted "PolitiFact will hope the Mueller investigation will eventually provide enough backing to keep it from getting egg on its face." One year later it remains uncertain whether the Mueller investigation will produce a report that shows much more than the purchase of some Facebook ads. If and only if the Russia story gets new life in December will PolitiFact make this item its "Lie of the Year." I give this item a 4, with a higher ceiling depending on the late 2018 news narrative.




Yawn. 1.





This claim from one of Trump's economic advisors rates about the same as Ocasio-Cortez's claim on its face. I think Kudlow may have referred to deficit projections and not deficits. But that aside, this item may appeal to PolitiFact because it strikes at the idea that tax cuts pay for themselves. Democrats imagine that Republicans commonly believe that (it may be true--I don't know). So even though this item should rate in the same range as the Sanders and Ocasio-Cortez claims I will give it a 4 to recognize its potential appeal to PolitiFact's left-leaning staff. It has a non-zero chance of winning.



Afters

A few notes:  Once again, PolitiFact drew only from claims rated "False" or "Pants on Fire" to make up its list of candidates. President Obama's claim about Americans keeping their health insurance plans remains the only candidate to receive a "Half True" rating.

With five Trump statements among the 10 nominees we have to allow that PolitiFact will return to its ways of the past and make "Trump's statements as president" (or something like that) the winner.


Jeff Adds:

Knowing PolitiFact's Lie of the Year stunt is more about generating teh clickz as opposed to a function of serious journalism or truth seeking, my pick is the Putin claim.

The field of candidates is, once again, intentionally weak outside of the Putin rating. Despite all the Pants on Fire ratings they passed out to Trump this year, PolitiFact filled the list with claims that were simply False (and this is pretending there's some objective difference between any of PolitiFact's subjective ratings.)

Giving the award to Bernie won't generate much buzz, so you can cross him off the list.

It's doubtful the nonpartisan liberals at PolitiFact would burden Ocasio-Cortez with such an honor when she's already taking well-deserved heat for her frequent gaffes. And as far as this pick creating web traffic, I submit that AOC isn't nearly as talked about in Democrat circles as the ire she elicits from the right would suggest. That said, she should be considered a dark horse pick.

It's not hard to imagine PolitiFacter Aaron Sharockman cooking up a scheme during a Star Chamber session to pick AOC as an attempt at outreach to conservative readers and beefing up their "we pick both sides!" street cred (a credibility, by the way, that only PolitiFact and the others in their fishbowl of liberal confirmation bias actually believe exists.)

More people in America know Spencer Pratt sells healing crystals than have ever heard of Larry Kudlow. You can toss this one aside.

The inclusion of the David Hogg claim seems like a PolitiFact intern was given the task of picking out a few False nuggets from liberals and that was what they came up with. Don't expect PolitiFact to pick on the young but misinformed activist. [Update: This is a completely embarrassing take on my part. I was in a rush to publish my thoughts on the Lie of the Year candidates, and in that rush, I glossed over this claim. Obviously, I didn't even give it a passing notice. I'm confident that had I actually paid attention to it, I would have ignored it as a contender anyways (and I still think it's a lame pick on its face.) But that's not an excuse.

I let readers down and I embarrassed myself. As I repeatedly and mockingly point out to fact checkers: Confirmation bias is a helluva drug. I was convinced of the winner, and I ignored information that didn't support that outcome.

I regret that I didn't dismiss it with a coherent argument. My bad.-Jeff]

Putin is the obvious  pick. Timed perfectly with the release of the Mueller report, it piggybacks onto the Russian interference buzz. Additionally, it allows ostensibly serious journos to include PolitiFact's Lie of the Year piece into their own articles about Russian involvement in the election (the catnip of liberal click-bait.) It gets bonus points for confirming for PolitiFact's Democrat fan base that Trump is an illegitimate president that stole the election.

The Putin claim has everything: Anti-Trump, stokes Russian interference RT's and Facebook shares, and gets links from journalists at other news outlets sympathetic to PolitiFact's cause.

The only caveat here is if PolitiFact continues their recent history of coming up with some hybrid, too-clever-by-half Lie of the Year winner that isn't actually on the list. But even if they do that the reasoning is the same: PolitiFact is not an earnest journalism outlet engaged in fact spreading. PolitiFact exists to get your clicks and your cash.

Don't believe the hype.





Updated: Added Jeff Adds section 1306 PST 12/10/2018 - Jeff
Edit: Corrected misspelling of Ocasio-Cortez in Jeff Adds portion 2025 PST 12/10/2018 - Jeff
Updated: Strike-through text of Hogg claim analysis in Jeff Adds section, added three paragraph mea culpa defined by brackets 2157 PST 12/12/2018 -Jeff


Tuesday, August 7, 2018

The Phantom Cherry-pick

Would Sen. Bernie Sanders' Medicare For All plan save $2 trillion over 10 years on U.S. health care expenses?

Sanders and the left were on fire this week trying to co-opt a Mercatus Center paper by Charles Blahous. Sanders and others claimed Blahous' paper confirmed the M4A plan would save $2 trillion over 10 years.

PolitiFact checked in on the question and found Sanders' claim "Half True":


PolitiFact's summary encapsulates its reasoning:
The $2 trillion figure can be traced back to the Mercatus report. But it is one of two scenarios the report offers, so Sanders’ use of the term "would" is too strong. The alternative figure, which assumes that a Medicare for All plan isn’t as successful in controlling costs as its sponsors hope it will be, would lead to an increase of almost $3.3 trillion in national health care expenditures, not a decline. Independent experts say the alternative scenario of weaker cost control is at least as plausible.

We rate the statement Half True.
Throughout its fact check, as pointed out at Zebra Fact Check, PolitiFact treats the $2 trillion in savings as a serious attempt to project the true effects of the M4A bill.

In fact, the Mercatus report uses what its author sees as overly rosy assumptions about the bill's effects to estimate a lower bound for the bill's very high costs, and then offers reasons why the actual costs will likely run much higher.

In other words, the cherry Sanders tries to pick is a faux cherry. And a fact checker ought to recognize that fact. It's one thing to pick a cherry that's a cherry. It's another thing to pick a cherry that's a fake.

Making Matters Worse

PolitiFact makes matters worse by overlooking Sanders' central error: circular reasoning.

Sanders takes a projection based on favorable assumptions as evidence that the favorable assumptions are reasonable. But reaching a conclusion based on assumptions does not make the assumptions any more true. Sanders' claim suggests the opposite: that even though the Blahous paper says it is using unrealistic assumptions, the conclusions it reaches using those assumptions somehow make the assumptions reasonable.

A fact checker ought to point it out when a politician peddles such nonsensical ideas.

PolitiFact made itself guilty of bad reporting while overlooking Sanders' central error.

Thursday, January 4, 2018

No Underlying Point For You!

PolitiFact grants Trump no underlying point on his claim about the GOP lock on a senate seat



The NBC sitcom "Seinfeld" featured an episode focused in part on the "Soup Nazi." The "Soup Nazi" was the proprietor of a neighborhood soup shop who would refuse service in response to minor breaches of etiquette, often with a shouted "No soup for you!"

PolitiFact's occasional refusal to allow for the validity of an underlying point reminded us of the "Soup Nazi," and gave rise to our new series of posts documenting PolitiFact's occasional failure to credit underlying points.

PolitiFact's statement of principles assures readers that it takes a speaker's underlying point into account (bold emphasis added):
We examine the claim in the full context, the comments made before and after it, the question that prompted it, and the point the person was trying to make.
We see credit for the speaker's underlying point on full display in this Feb. 14, 2017 rating of Bernie Sanders, who had sought the Democratic nomination for president of the United States (bold emphasis added):
Sanders said, "Before the Affordable Care Act, (West Virginia’s) uninsured rate for people 64 to 19 was 29 percent. Today, it is 9 percent."

Sanders pointed to one federal measurement, though it has methodological problems when drilling down to the statistics for smaller states. A more reliable data set for West Virginia’s case showed a decline from 21 percent to 9 percent. The decline was not as dramatic as he’d indicated, but it was still a significant one.

We rate the statement Mostly True.
Sanders' point was the decline in the uninsured rate owing to the Affordable Care Act, and we see two ways to measure the degree of his error. Sanders used the wrong baseline for his calculation, 29 percent instead of 21 percent. That represents a 38 percent exaggeration of the baseline. Or we can compare the decline Sanders described (29 percent down to 9 percent, a 20-point drop) with the decline in the more reliable data (21 percent down to 9 percent, a 12-point drop). That calculation results in a percentage error of 67 percent.

PolitiFact, despite an error of at least 38 percent, gave Sanders a "Mostly True" rating because Sanders was right that a decline took place.

For comparison, Donald Trump tweeted that former associate Steve Bannon helped lose a senate seat Republicans had held for more than 30 years. In fact, the GOP had held the seat for a mere 21 years. Using 31 years as the smallest number greater than 30, Trump exaggerated by about 48 percent. And PolitiFact rated his claim "False":
Trump said the Senate seat won by Jones had been "held for more than thirty years by Republicans." It hasn’t been that long. It’s been 21 years since Democrat Howell Heflin retired, paving the way for his successor, Sessions, and Sessions’ elected successor, Jones. We rate the statement False.
Can the 10 percentage point difference by itself move the needle from "Mostly True" to "False"?
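For readers who want to check the arithmetic, here is our calculation in miniature, using only the figures already discussed above (the 29, 21 and 9 percent uninsured rates, and the 31-versus-21-year tenures):

```python
# Our arithmetic for the two exaggerations compared above.

# Sanders: described a drop from 29 percent to 9 percent in West Virginia's
# uninsured rate; the more reliable data showed a drop from 21 percent to 9.
sanders_baseline_error = (29 - 21) / 21                    # ~0.38 -> 38 percent
sanders_decline_error = ((29 - 9) - (21 - 9)) / (21 - 9)   # ~0.67 -> 67 percent

# Trump: said the seat had been Republican-held for "more than thirty years";
# it was actually 21 years. Use 31 as the smallest whole number over 30.
trump_error = (31 - 21) / 21                               # ~0.48 -> 48 percent

print(f"Sanders, baseline exaggeration: {sanders_baseline_error:.0%}")
print(f"Sanders, exaggeration of the decline: {sanders_decline_error:.0%}")
print(f"Trump, exaggeration of the seat's tenure: {trump_error:.0%}")
```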

Was Trump making the point that the GOP had controlled that senate seat for a long time? That seems undeniable. Is 21 years a long time to control a senate seat? That likewise appears undeniable. Yet Trump's underlying point, in contrast to Sanders', was apparently a complete non-factor when PolitiFact chose its rating.

We say that inconsistency is a bad look for a non-partisan fact checker.

On the other hand, we might predict this type of inconsistency from a partisan fact checker.

Tuesday, July 11, 2017

PolitiFact helps Bernie Sanders with tweezers and imbalance

Our posts carrying the "tweezers or tongs" tag look at how PolitiFact skews its ratings by shifting its story focus.

Today we'll look at PolitiFact's June 27, 2017 fact check of Senator Bernie Sanders (I-Vt.):


Where Sen. Sanders mentions 23 million thrown off of health insurance, PolitiFact treats his statement like a random hypothetical. But the context shows Sanders was not speaking hypothetically (bold emphasis added):
"What the Republican proposal (in the House) does is throw 23 million Americans off of health insurance," Sanders told host Chuck Todd. "What a part of Harvard University -- the scientists there -- determine is when you throw 23 million people off of health insurance, people with cancer, people with heart disease, people with diabetes, thousands of people will die."
The House health care bill does not throw 23 million Americans off of health insurance. The CBO did predict that at the end of 10 years 23 million fewer Americans would have health insurance compared to the current law (Obamacare) projection. There's a huge difference between those two ideas, and PolitiFact may never get around to explaining it.

PolitiFact, despite fact-checkers' admitted preference for checking false statements, overlooks the low-hanging fruit in favor of Sanders' claim that thousands will die.

Is Sanders engaging in fearmongering? Sure. But PolitiFact doesn't care.

Instead, PolitiFact focused on Sanders' claim that study after study supports his point that thousands will die if 23 million people get thrown off of insurance.

PolitiFact verified his claim in hilariously one-sided fashion. One would never know from PolitiFact's fact check that the research findings are disputed, as here.

This is the type of research PolitiFact omitted (bold emphasis added) from its fact check:
After determining the characteristics of the uninsured and discovering that being  uninsured does not necessarily mean an individual has no access to health services, the authors turn to the question of mortality. A lack of care is particularly troubling if it leads to differences in mortality based on insurance status. Using data from the Health and Retirement Survey, the authors estimate differences in mortality rates for individuals based on whether they are privately insured, voluntarily uninsured, or involuntarily uninsured. Overall, they find that a lack of health insurance is not likely to be the major factor causing higher mortality rates among the uninsured. The uninsured—particularly the involuntarily uninsured—have multiple disadvantages that are associated with poor health.
So PolitiFact cherry-picked Sanders' claim with tweezers, then did a one-sided fact-check of that cherry-picked part of the claim. Sanders ended up with a "Mostly True" rating next to his false claims.

Does anybody do more to erode trust in fact-checking than PolitiFact?

It's worth noting this stinker was crafted by the veteran fact-checking team of Louis Jacobson and Angie Drobnic Holan.



Correction July 11, 2017: In the fourth paragraph after our quotation of PolitiFact, we had "23,000" instead of the correct figure of "23 million." Thanks to YuriG in the comments section for catching our mistake.

Thursday, March 10, 2016

Bernie Sanders, PolitiMath and the price of water in Flint

In our PolitiMath series we look at how numerical errors correlate to PolitiFact's ratings.

PolitiFact's March 7, 2016 rating of Democratic presidential candidate Bernie Sanders (I-Vt.) suits our purposes well, with Sanders claiming Flint residents pay three times as much for water as he pays in Burlington, Vt.

PolitiFact found Sanders was right if it used outdated water rates:
When we look at average annual bills from January 2015, Sanders’ 3-to-1 comparison is pretty close. But after August, Flint customers were paying a little more than twice as much as Burlington residents.
A judge's order in August 2015 rolled back water rates. Therefore, as PolitiFact notes, Flint residents now pay about twice what Burlington residents pay, counting Flint's charges for the home water meter. Burlington doesn't charge for the water meter.

To us, it's okay if Sanders wants to round up to get to his "three times" figure. So for PolitiMath purposes, we'll treat 2.5 as the lowest ratio that rounds up to "three times" and calculate how much 2.5 exaggerates the actual ratio of Flint's water rates to Burlington's.

Going by PolitiFact's chart, "a little more than twice as much" turned out to be about 2.4, leading to a very modest exaggeration on Sanders' part: about 4 percent. Yes, allowing for rounding up helped Sanders immensely. That's okay. We'd handle this the same way for a conservative.
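Spelling out that arithmetic (our calculation, using the figures PolitiFact reported):

```python
# Our PolitiMath arithmetic for the Flint/Burlington water comparison.
actual_ratio = 2.4    # "a little more than twice as much," per PolitiFact's chart
credited_claim = 2.5  # the lowest ratio we allow to round up to "three times"

exaggeration = (credited_claim - actual_ratio) / actual_ratio
print(f"Exaggeration: {exaggeration:.1%}")  # about 4.2 percent
```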

PolitiFact gave Sanders a "Mostly True" rating, by the way, for a claim that was literally false.  

Deja vu.

Tuesday, March 8, 2016

PolitiFact judges job-killers

What's new from "Objective, Nonpartisan" PolitiFact?

The expert fact-checkers/liberal bloggers at PolitiFact show everyone how to figure out whether legislation kills jobs.

PolitiFact uses two methods. The first method shows whether the Affordable Care Act caused job loss. The second method shows whether the North American Free Trade Agreement caused job loss.

Here's PolitiFact's method for the ACA, from a fact check of Sen. Ted Cruz (R-Texas, bold emphasis added):
Cruz said that Obamacare cost the country millions of jobs and had forced millions into working part-time.

The government’s employment surveys show no sign of that occurring. By every measure, millions more people are working and millions fewer are stuck unwillingly in part-time work since the time the Affordable Care Act became law. The law might have affected part-time work for certain kinds of people, but that didn’t change the improvement in the overall numbers.
PolitiFact apparently reasons that if the overall employment numbers improved, then the concession that "The law might have affected part-time work for certain kinds of people" counts for nothing toward Cruz's point.

Here's PolitiFact's method for NAFTA, as related in a fact check of Sen. Bernie Sanders (I-Vt.):
Sanders said that NAFTA, which Clinton used to support, cost the U.S. economy 800,000 jobs. There is a report from a left-leaning policy group that reached that conclusion. On the other hand, many other nonpartisan reports found that the trade deal produced neither significant job losses nor job gains. This is a result of competing economic models and the challenges of teasing out the effects of NAFTA from everything else that has taken place in the economy.

The report Sanders cited is an outlier, and his use of its findings ignores important facts that would give a different impression. We rate his statement Mostly False.
The biggest difference between the two methods comes from PolitiFact's reliance on raw employment numbers when checking the claim from Cruz. Raw employment numbers were a non-factor in checking Sanders but the key to giving a "Pants on Fire" rating to Cruz.

PolitiFact cited studies supporting and contradicting Sanders, but gave no evidence supporting Cruz. The fact check of Cruz omitted mention of a Congressional Budget Office report estimating supply-side net reductions in labor (workers deciding not to work or to work fewer hours), the equivalent of about 700,000 full-time jobs.

We reason that since PolitiFact is objective and non-partisan, it follows that only a non-objective and non-non-partisan (okay, partisan!) source would mention the findings of the CBO relating to unusually slow job recovery following the 2008 recession. In the world of PolitiFact, supply-side job loss doesn't count and doesn't even apparently affect the economy.

Or, to borrow a bit from Stephen Colbert, the fact[checker]s have a liberal bias.


Jeff Adds: Note that Jon Greenberg wrote both the NAFTA and ACA pieces. This makes it even more difficult to reconcile the use of two different methods in performing the fact checks.


Correction March 10, 2016 (bww): I inexplicably identified Delaware as the state Bernie Sanders represents. The text has been changed to identify Vermont as the state Sanders represents in the Senate.

Wednesday, January 20, 2016

Left Jab: "Politifact's Fuzzy Case Against Bernie Sanders on Pentagon Spending"

An article from the Huffington Post gets some good pokes in against PolitiFact's less-than-clear reasoning in a fact check of Democratic presidential candidate Bernie Sanders.

We're not convinced of the author's main point. He says the mainstream media have no interest in real discussions of the military budget.

But he makes a pretty good case that the fact check of Sanders is short on facts:
What is the core of PolitiFact's argument? Sanders' claim is "mostly false" not because we have good reason to believe that some very different number is much more likely to be correct - like 20%, or 50%, or 90% - but because the share of the Pentagon budget which goes into fighting ISIS and international terrorism is a fundamentally unknowable fact, like how God passed the time before creating the world. The question is intrinsically outside the scope of human knowledge.
And what does PolitiFact do with questions outside the scope of human knowledge? Why, fact check them, of course!

These types of fact checks occur often on PolitiFact's pages. More often, Republicans are the victims of this shoddy approach to fact-checking. But we don't shy away from sharing good examples showing PolitiFact treating Democrats or progressives unfairly.




Update 1/20/16 2129PST: Added links to Huffington Post and PolitiFact articles in first paragraph- Jeff

Friday, November 20, 2015

PolitiFact gives Bernie Sanders "Mostly True" rating for false statement

When Sen. Bernie Sanders (I-Vt.) said more than half of America's working blacks receive less than $15 per hour, PolitiFact investigated.

It turns out less than half of America's working blacks make less than $15 per hour:
(H)alf of African-American workers earned less than $15.60. So Sanders was close on this but exaggerated slightly. His claim is off by a little more than 4 percent.
PolitiFact's own numbers show that more than half of African-American workers earned more than $15 per hour. That makes Sanders' claim false. PolitiFact said Sanders "exaggerated slightly." PolitiFact said his claim was "off by a little more than 4 percent." PolitiFact said he was "not far off."

Euphemisms aside, Sanders was wrong. But PolitiFact gave Sanders a "Mostly True" rating for his claim.

Here's a reminder of PolitiFact's definition for its "Mostly True" rating:
Mostly True – The statement is accurate but needs clarification or additional information.
Sanders' statement wasn't accurate. So how does it even begin to qualify for the "Mostly True" rating the way PolitiFact defines it?

The answer, dear reader, is that PolitiFact's definitions don't really mean anything. PolitiFact's "Star Chamber" panel of editors gives the rating they see fit to give. If the definitions conflict with that ruling then the definitions bend to the will of the editors.

Subjective-like.



Update 22:25 11/23/15: Added link to PF article in 4th graph - Jeff

Wednesday, May 27, 2015

Bernie Sanders, PolitiFact, PolitiMath

We do a "PolitiMath" evaluation of PolitiFact's fact checks where numerical errors ought to have a powerful bearing on PolitiFact's "Truth-O-Meter" ratings. We're interested in how percentage error impacts the differences between "Pants on Fire," "False," "Mostly False" and so on.

An older item on Sen. Bernie Sanders (I-Vt.) caught our eye in the midst of one of PolitiFact's bogus "report card" stories. Sanders said the United States spends twice as much per capita on health care as any other nation on earth.

PolitiFact found Sanders was off:
According to the 2009 edition of WHO's World Health Statistics report, which uses figures from 2006, health care spending in the United States — both public- and private-sector — amounted to $6,719 per capita. Ranking next were Luxembourg and Monaco at $6,506 and $6,353 per capita, respectively. All told, either 11 or 15 countries told the WHO they spent more than $3,360 per capita, the point at which the United States no longer doubles their spending. (We provide two possible figures here because the WHO offers both raw figures and statistics adjusted for currency valuations.) The other nations that rank near the top with the United States include Austria, Belgium, Canada, Denmark, France, Germany, Iceland, Ireland, the Netherlands, Norway, Sweden and Switzerland, in addition to tiny Malta and San Marino.
We'd have gone to bat for Sanders if only OECD nations were counted. The United States spends more per capita than its nearest rival by over 50 percent, which is the reasonable floor for rounding up to a "twice as much" claim. But the United States only spends 3.3 percent more per capita than Luxembourg, which is a pretty far cry from 50 percent.

Sanders exaggerated the truth by at least 1,400 percent, using the figures from Luxembourg as the counterexample to his claim.
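Here is how we arrive at that figure; the method (crediting Sanders with the 50 percent rounding floor described above) is our own reconstruction of the arithmetic, not PolitiFact's:

```python
# Our arithmetic for the health care spending exaggeration.
credited_claim = 50.0  # percent more than the runner-up needed to round up to "twice as much"
actual_gap = 3.3       # percent by which U.S. spending exceeded Luxembourg's, per the WHO figures

exaggeration = (credited_claim - actual_gap) / actual_gap
print(f"Exaggeration: {exaggeration:.0%}")  # roughly 1,400 percent
```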

PolitiFact's rating of Sanders? "False."