Showing posts with label Truth Hustlers. Show all posts

Thursday, August 23, 2018

PolitiFact Not Yet Tired of Using Statements Taken Out Of Context To Boost Fundraising

Remember back when PolitiFact took GOP pollster Neil Newhouse out of context to help coax readers into donating to PolitiFact?

Good times.

Either the technique works well or PolitiFact's journalists just plain enjoy using it: PolitiFact Editor Angie Drobnic Holan's Aug. 21, 2018 appeal to would-be supporters pulls the same type of stunt on Rudy Giuliani, former mayor of New York City and attorney for President Donald Trump.

Let's watch Holan the politician in action (bold emphasis added):
Just this past Sunday, Rudy Giuliani told journalist Chuck Todd that truth isn’t truth.

Todd asked Giuliani, now one of President Donald Trump’s top advisers on an investigation into Russia’s interference with the 2016 election, whether Trump would testify. Giuliani said he didn’t want the president to get caught perjuring himself — in other words, lying under oath.

"It’s somebody’s version of the truth, not the truth," Giuliani said of potential testimony.

Flustered, Todd replied, "Truth is truth."

"No, it isn’t truth. Truth isn’t truth," Giuliani said, going on to explain that Trump’s version of events are his own.

This is an extreme example, but Giuliani isn’t the only one to suggest that truth is whatever you make it. The ability to manufacture what appears to be the truth has reached new heights of sophistication.
Giuliani, contrary to Holan's presentation, was almost certainly not suggesting that truth is whatever you make it.

Rather, Giuliani was almost certainly making the same point about perjury traps that legal expert Andrew McCarthy made in an Aug. 11, 2018 column for National Review (hat tip to Power Line Blog):
The theme the anti-Trump camp is pushing — again, a sweet-sounding political claim that defies real-world experience — is that an honest person has nothing to fear from a prosecutor. If you simply answer the questions truthfully, there is no possibility of a false-statements charge.

But see, for charging purposes, the witness who answers the questions does not get to decide whether they have been answered truthfully. That is up to the prosecutor who asks the questions. The honest person can make his best effort to provide truthful, accurate, and complete responses; but the interrogator’s evaluation, right or wrong, determines whether those responses warrant prosecution.
It's fair to criticize Giuliani for making the point less elegantly than McCarthy did. But it's inexcusable for a supposedly non-partisan fact checker to take a claim out of context to fuel an appeal for cash.

That's what we expect from partisan politicians, not non-partisan journalists.

Unless they're "non-partisan journalists" from The Bubble.

Worth Noting:

For the 2017 version of this Truth Hustle, Holan shared writing credits with PolitiFact's Executive Director Aaron Sharockman.

Friday, February 17, 2017

PolitiFact: That was then, this is now

Now (2017)

PolitiFact is independent! That means nobody chooses for PolitiFact which stories it will cover. PolitiFact made that clear with its recent appeal for financial support through its "Truth Squad" members--persons who contribute financially to PolitiFact (bold emphasis added):
Our independence is incredibly valuable to us, and we don't let anyone — not politicians, not grant-making groups, not anyone — tell us what to fact-check or what our Truth-O-Meter rulings should be. At PolitiFact, those decisions are made solely by journalists. With your help, they always will be.
Got it? Story selection is done solely by PolitiFact journalists. That's independence.

Then (2015)

In early 2015, PolitiFact started its exploration of public funding with a Kickstarter program geared toward funding its live fact checks of the 2015 State of the Union address.

Supporters donating $100 or more got to choose what PolitiFact would fact check. Seriously. That's what PolitiFact offered:

Pledge $100 or more

Pick the fact-check. We’ll send you a list of four fact-checks we’re thinking of working on. You decide which one we do. Plus the coffee mug, the shout out and the mail.
We at PolitiFact Bias saw this scam for what it was back then: It was either a breach of journalistic ethics, selling editorial discretion for cash, or else a misleading offer that led donors to believe they were choosing the fact check when in reality PolitiFact's editors kept that discretion in-house.

Either way, PolitiFact acted unethically. And if Angie Drobnic Holan is telling the truth that journalists alone make PolitiFact's editorial decisions, then we can rest assured that PolitiFact brazenly misled people in advertising its 2015 Kickstarter campaign.


Clarification Feb. 18, 2017: Belatedly added the promised bold emphasis in the first quotation of PolitiFact.

Friday, May 13, 2016

Thomas Baekdal: 'These graphs also illustrate how impartial PolitiFact is' (Updated)

Danish blogger Thomas Baekdal wrote up a lengthy piece on public misinformation, "The increasing problem with the misinformed," published on March 7, 2016. The piece caught our interest because Baekdal used graphs of PolitiFact data and made some intriguing assertions about PolitiFact, particularly the one quoted in our title. Baekdal said his graphs show PolitiFact's impartiality.

We couldn't detect any argument (premises leading via logical inference to a conclusion) in the article supporting Baekdal's claim, so we wrote to him asking for the explanation.

Then a funny thing happened.

We couldn't get him to explain it, unless "the data speaks for itself" counts as an explanation.

So, instead of a post dealing with Baekdal's explanation of his assertion that his graphs show PolitiFact's impartiality, we'll go over a few points that cause us to doubt Baekdal and his conclusion.

1) Wrong definition

Jeff was the first to respond to Baekdal's article, correctly noting on Twitter that Baekdal misrepresented PolitiFact's "Pants on Fire" rating. Baekdal wrote "(T)hey also have a 6th level for statements where a politician is not just making a false statement, but is so out there that it seems to be intentionally misleading." But PolitiFact never explicitly describes its "Pants on Fire" rating as indicative of deliberate deceit. The closest PolitiFact comes is the name of the rating itself, a point for which we've criticized PolitiFact repeatedly. It protests that it doesn't call people liars, but its lowest rating makes that accusation implicitly via association with the popular rhyme "Liar, liar, pants on fire!" Readers sometimes take the framing to heart. Perhaps Baekdal was one of them.

2) An unsupported assertion

Baekdal asserted, as a foundation for his article, "(W)e have always had a problem with the misinformed, but it has never been as widespread as it is today." But he provided no evidence supporting the assertion. Should we risk aggravating our misinformed state by accepting his claim without evidence? Or should we extend him a license for hyperbole?
3) Unscientific approach

Baekdal prepares his readers for his PolitiFact graph presentations by noting potential problems with sample size, but he never discusses how a non-representative sample undercuts generalizations from the data. We think a competent analyst would address that problem.

4) Unscientific approach (2)

Baekdal uses an algorithm to score his PolitiFact data, statistically punishing politicians for telling intentional falsehoods whenever they received a "Pants on Fire" rating. But PolitiFact never provides affirmative evidence in its fact checks that a falsehood was intentional. The raw data do not show the wrong Baekdal claims to punish with his algorithm; he punishes others for his own misinterpretation of the data.

5) Unscientific approach (3)

Jeff also pointed out via Twitter that Baekdal accepts, without question, the dependability of PolitiFact's ratings. Baekdal offers no evidence that he considered whether PolitiFact might have a poor record of accuracy.

Conclusion

What point was Baekdal trying to make with his PolitiFact stats? It looks like he was trying to show the unreliability of politicians and pundits. Why? To highlight his concern that people feel more mistrust for the press than for politicians. Baekdal lays a big share of the blame on the press, but he apparently fails to realize that PolitiFact is guilty of many of the problems he describes in his criticism of the press, such as misleading headlines.

We see no reason to trust Baekdal's assessment of PolitiFact's impartiality, or his assessment of anything else for that matter. His research approach is not scientific, failing to account for the reliability of the data, the reality of selection bias, or alternative explanations of the data. His unwillingness to justify his claims via email did nothing to change our minds.

We continue to extend our invitation to Baekdal to explain how his graphs support PolitiFact's impartiality.


Hat tip to Twitterer @SatoshiKsutra for bringing Baekdal's article to our attention.


Update May 16, 2016:

Baekdal responds:


And this after we did him the favor of not publicly parsing his email responses.

Baekdal has something in common with some of the likewise thick-skinned folks at PolitiFact.

Tuesday, January 12, 2016

True statements ruled "Mostly False"

What happens when PolitiFact finds that a statement is literally true?

That issue came up indirectly when Jeff retweeted economist Howard Wall. Wall had tweeted:
We looked up the story in which Wall was quoted as an expert. It was a fact check of Mitt Romney from the 2012 presidential election. The Romney campaign said women had suffered the greatest job losses under Obama, implying Obama's leadership had been bad for women.

PolitiFact ruled the claim "Mostly False."

The Romney campaign pushed back. PolitiFact looked at the issue again and again ruled the claim "Mostly False." Yet at the same time, PolitiFact said "The numbers are accurate but quite misleading."

Don't blame the Romney campaign. It probably operated under the assumption that PolitiFact's definitions for its "Truth-O-Meter" ratings mean something.

Taking PolitiFact's definitions literally, the lowest rating a literally true claim should receive is "Mostly True." Below that level, the definitions start talking about "partially true" statements that give a misleading impression ("Half True") and statements that contain "some element of truth" but ignore facts that could give a different impression ("Mostly False").

What's our point? We've always said PolitiFact's ratings reveal more about PolitiFact than they do about the entities receiving the ratings. It's a scandal that social scientists keep their eyes closed to that. Search PolitiFact's ratings for claims it says are literally true. Note the rating given to each claim. Then look at the ideology of the entity making the claim.

There's your evidence of journalistic bias by fact checkers.

This is an important issue. If social scientists aren't looking at it, it suggests they don't care.

Why wouldn't they care?



Jeff Adds: We highlighted a Mark Hemingway critique of PolitiFact's Romney ruling back in 2012 that is still worth a read. It would seem little has changed at PolitiFact since then.



Update 0956PST 1/12/2016: Added "Jeff Adds" portion - Jeff


Sunday, January 3, 2016

"Not a lot of reader confusion" in San Luis Obispo

The Tribune of San Luis Obispo, California, published a reader's letter touching on our favorite topic, PolitiFact. The letter provides yet more evidence of the reader confusion over PolitiFact's candidate report cards that PolitiFact editor Angie Drobnic Holan insists barely exists:
A Dec. 13 New York Times column (“All politicians lie. Some lie more than others”) indicated that the percentages of claims in these categories for candidates Ben Carson, Donald Trump and Ted Cruz are 84 percent, 76 percent and 66 percent, respectively.
The reader doesn't bother to say whether he believes those percentages reasonably extrapolate to generalizations about the candidates. But there was hardly a need for that, thanks to the headline provided by the Tribune:

GOP presidential front-runners lie the most

Do they?

Well, the editors of the Tribune read it in The New York Times via the editor of Pulitzer Prize-winning PolitiFact, so it must be true, right?

That's PolitiFact and the mainstream press running their own deceptive political messages while flying the banner of truth overhead.

Wednesday, September 5, 2012

PFB Smackdown: Dylan Otto Krider vs. Jon Cassidy (Updated)

Dylan Otto Krider's orbit around truth-hustler Chris Mooney helped bring him into direct conflict with Ohio Watchdog writer Jon Cassidy this week. We've featured some of Cassidy's PolitiFact critiques here at PolitiFact Bias, and our review of Cassidy's longer article for Human Events is pending.

Krider, like Mooney, believes that statistics from PolitiFact showing Republicans receive harsher ratings than Democrats help demonstrate that Republicans simply have a more cavalier attitude toward telling the truth. Cassidy's Human Events story challenged that interpretation and prompted a reply from Krider.

Krider's central point carries partial merit. He challenges Cassidy's headline with his own: "Does PolitiFact say Republicans lie nine times more? Really?"

The answer to that question is "no," but Krider used specious reasoning to reach it. Examples follow.

Tuesday, May 1, 2012

NPR and truth-hustler guests Adair and Nyhan

Crossposted from Sublime Bloviations.



National Public Radio brought two of my favorite truth-hustlers onto its "All Things Considered" program. I refer to Bill Adair and Brendan Nyhan.

(clipped from npr.org 4/29/12)
Bill Adair serves as founding editor of PolitiFact, a fact-checking operation started by a left-leaning Florida newspaper. PolitiFact sacrifices journalistic objectivity in part for the sake of its marketing gimmick, the "Truth-O-Meter."

Political scientist Brendan Nyhan mangles facts from the realm of academia. Nyhan has tried to show that partisans don't accept facts that contradict their ideology. His research often uses facts that beg the question (more on Nyhan), suggesting that Nyhan falls victim to his own research goal.

The relevant radio segment deals with a column from Chicago Tribune columnist Rex Huppke, who bemoaned the state of truth following Rep. Allen West's statement, in Huppke's paraphrase, that "as many as 81" members of the Democratic Party are members of the Communist Party.

Huppke gets some kind of award for irony. Journalists took West out of context. West was jokingly, though to make a serious point, referring to the Congressional Progressive Caucus. The context makes that absolutely clear.

Stories like these are to Adair and Nyhan what the Trayvon Martin case is to Jesse Jackson and the Rev. Al Sharpton. The media want the narrative the former two spin, and so seek them out for commentary when these issues make the news. Adair and Nyhan are the truth-hustler counterparts to race-hustlers.

Adair shows up to remind us how great it is that PolitiFact is there to save the day, if only we can place sufficient trust in its work.

Adair (transcript mine, from the NPR audio):
"What's funny is sometimes I'll get an email that'll say 'You guys are so biased.' But I won't know who we're supposed to be biased in favor of, because we get criticized a lot by both sides. And I think that's just the nature of a very rough-and-tumble political discourse."
What's actually funny is Adair still using the "we get criticized from both sides" dodge to avoid the issue of bias. Fortunately, a journalist asked Adair exactly the right question earlier this year, with equally hilarious results.

PolitiFact's system is perfect for filtering the truth according to media ideology.

Academia, unfortunately, carries an ideological slant somewhat akin to that found in the U.S. media. Nyhan perhaps represents one of academic liberalism's top leaders in the war over political truth.

NPR brought forth an old example of Nyhan's supposed "backfire" effect, where a correction of a falsehood leads to stronger belief in the falsehood. Though Nyhan's own research (see the descriptions of "Study 2") appears to show that the phenomenon does not occur with clear corrections, that hardly dampens mainstream media enthusiasm for the idea. The media can claim they're doing a great job while blaming the audience for the problem.

There is something of an information crisis, but Adair and Nyhan probably do as much damage as good in addressing it. We're not getting the best information from either journalism or academia. Journalists typically do not have the expertise to sift through complex issues of truth. Academics have shifted left ideologically and do an inadequate job of critically reviewing the journals that ought to provide our best sources of trustworthy information.

We don't have a reliable gatekeeper for our pool of information. And it's hard to come up with good solutions to the problem.
Test everything. Hold on to the good.
--Paul the Apostle, 1 Thess. 5:21