Friday, November 10, 2017

'Not a lot of reader confusion' VI

PolitiFact editor Angie Drobnic Holan has claimed she does not notice much reader confusion regarding the interpretation of PolitiFact's "report card" charts and graphs.

This series of posts is designed to call shenanigans on that frankly unbelievable claim.

Rem Rieder, a journalist of some repute, showed himself a member of PolitiFact's confused readership with a Nov. 10, 2017 article published at TheStreet.com:
While most politicians are wrong some of the time, the fact-checking website PolitiFact has found that that [sic] Trump's assertions are inaccurate much more frequently than those of other pols.
When we say Rieder showed himself a member of PolitiFact's confused readership, that means we're giving Rieder the benefit of the doubt by assuming he's not simply lying to his readers.

As we have stressed repeatedly here at PolitiFact Bias, PolitiFact's collected "Truth-O-Meter" ratings cannot be assumed to reliably reflect the truth-telling patterns of politicians, pundits or networks. PolitiFact uses non-random methods of choosing stories (selection bias) and uses an admittedly subjective rating system (personal bias).

PolitiFact then reinforces the sovereignty of the left-leaning point of view--most journalists lean left of the American public--by deciding its ratings by a majority vote of its "star chamber" board of editors.

We have called on PolitiFact to attach disclaimers to each of its graphs, charts or stories related to its graphs and charts to keep such material from misleading unfortunate readers like Rieder.

So far, our roughly five years of lobbying have fallen on deaf ears.

Monday, November 6, 2017

PolitiFact gives the 8 in 10 lie a "Half True."

We can trust PolitiFact to lean left.

Sometimes we bait PolitiFact into giving us examples of its left-leaning tendencies. On Nov. 1, 2017, we noticed a false tweet from former President Barack Obama. So we drew PolitiFact's attention to it via the #PolitiFactThis hashtag.

We didn't need PolitiFact to look into it to know that what Obama said was false. In effect, he presented a circular argument, using statistics drawn from people who had already chosen an ACA exchange plan to mislead the wider public about their chances of receiving subsidized, inexpensive health insurance.

PolitiFact identified the deceit in its fact check, but used biased supposition to soften it (bold emphasis added):
"It only takes a few minutes and the vast majority of people qualify for financial assistance," Obama says. "Eight in 10 people this year can find plans for $75 a month or less."

Can 8 in 10 people get health coverage for $75 a month or less? It depends on who those 10 people are.

The statistic only refers to people currently enrolled in HealthCare.gov.

The video ad appeals to people who are uninsured or who might save money by shopping for health insurance on the government exchange. PolitiFact's wording fudges the truth. It might have accurately said "The statistic is correct for people currently enrolled in HealthCare.gov, but not for the population targeted by the ad."

In the ad, the statistic refers to the ad's target population, not merely to those currently enrolled in HealthCare.gov.

And PolitiFact makes thin and misleading excuses for Obama's deception:
(I)n the absence of statistics on HealthCare.gov visitors, the 8-in-10 figure is the only data point available to those wondering about their eligibility for low-cost plans within the marketplace. What’s more, the website also helps enroll people who might not have otherwise known they were eligible for other government programs.

The nonpartisan fact-checker implies that the lack of data helps excuse using data in a misleading way. We reject that type of excuse-making. If Obama does not provide his audience with the context needed to understand the data point without being misled, then he deserves full blame for the resulting deception.

PolitiFact might as well be saying "Yes, he misled people, but for a noble purpose!"

PolitiFact, in fact, provided other data points in its preceding paragraph that helped contextualize Obama's misleading data point.

We think PolitiFact's excuse-making influences the reasoning it uses when deciding its subjective "Truth-O-Meter" ratings:

HALF TRUE – The statement is partially accurate but leaves out important details or takes things out of context.
MOSTLY FALSE – The statement contains an element of truth but ignores critical facts that would give a different impression.
FALSE – The statement is not accurate.

In objective terms, what keeps Obama's statement from deserving a "Mostly False" or "False" rating?
His statement was literally false when taken in context, and his underlying message was likewise false.

About 10 to 12 million people are enrolled in HealthCare.gov ("Obamacare") plans. About 80 percent of those receive the subsidies Obama lauds. About 6 million people buying insurance outside the exchange fail to qualify for subsidies, according to PolitiFact. Millions among the uninsured likewise fail to qualify for subsidies.

Surely a fact-checker can develop a data point out of numbers like those.
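
For illustration, here is a minimal back-of-the-envelope sketch of that kind of calculation. The enrollment midpoint and the count of subsidy-ineligible uninsured are our own illustrative placeholders, not PolitiFact's reporting:

# Rough sketch: how the "8 in 10" figure shrinks when measured against the
# ad's broader target audience instead of current HealthCare.gov enrollees.
# The enrollment midpoint and the uninsured placeholder are assumptions.

enrolled = 11_000_000                   # midpoint of the 10-to-12 million enrolled
enrolled_qualifying = 0.80 * enrolled   # about 80 percent of enrollees get subsidies

off_exchange_no_subsidy = 6_000_000     # off-exchange buyers who fail to qualify (per PolitiFact)
uninsured_no_subsidy = 5_000_000        # placeholder for "millions among the uninsured"

# Treat the ad's audience as current enrollees plus those two non-qualifying groups.
target_population = enrolled + off_exchange_no_subsidy + uninsured_no_subsidy
share_qualifying = enrolled_qualifying / target_population

print(f"Share of the ad's audience qualifying: {share_qualifying:.0%}")  # about 40 percent

With those placeholder inputs the share comes out closer to 4 in 10 than 8 in 10. The exact number depends on the assumed uninsured figure, which is precisely the kind of data point we think a fact-checker could develop.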

But this is what happens when non-partisan fact checkers lean left.


Correction Nov. 6, 2017: Removed "About 6 million uninsured do not qualify for Medicaid or subsidies" as it was superseded by reporting later in the post.