
Sunday, September 8, 2013

Bill Clinton enthralls PolitiFact with the magic of ObamaCare

Former President Bill Clinton was a great liar.

He hasn't lost it.

What made the former president from Arkansas such a fine liar?  Part of it was his sincerity.  He seemed so sincere that people wanted to believe him.  Apart from that, Clinton had a gift for saying things that were true but seriously misleading.

This week Clinton showed that he can bamboozle fact checkers with little effort.  Whether PolitiFact bought his act because of his sincerity or simply out of admiration for President Barack Obama's signature legislation, also known as ObamaCare, we don't know.  But it's pretty amusing how the fact checkers missed the obvious.

Let's pick up PolitiFact's story, already in progress:
Clinton went on to cite data from recent polling by the Commonwealth Fund showing that "large numbers of young people aged 26 and younger have already enrolled in their parents' plans. And interestingly enough -- if I were you guys, I'd promote this, (saying) these Republicans are the personal responsibility party -- there are more young Republicans enrolled in their parents' plans than young Democrats."

The irony that young supporters of the GOP -- the party that has repeatedly tried to repeal or defund Obama’s law -- are actually using this part of the law more than young Democrats are led to chuckles in the audience.
Those GOP hypocrites!  Right?

PolitiFact researched the Commonwealth Fund poll data Clinton cited, and sure enough, he was exactly right.  And just to make sure we understand the depth of the Republican hypocrisy, PolitiFact helped Clinton out a bit by clarifying his point (bold emphasis added):
So, Clinton was right -- 63 percent of young Republicans, compared to only 45 percent of young Democrats had signed on to their parents’ plan, something they couldn’t have done without passage of Obama’s law.
Based on this evidence, along with statements from the study's lead author and Obama donor Sara R. Collins, PolitiFact gave Clinton's statement a "True" rating.

But there's a reason Clinton carries the nickname "Slick Willy," and there's also a reason why people often ridicule PolitiFact's rulings.  There's a catch that PolitiFact failed to catch.

As Obvious as the Nose on Clinton's Face

Clinton was right to a point about the findings of the survey.  More young Republicans than Democrats signed up or renewed under their parents' insurance policies.  But PolitiFact was exactly wrong to claim that the survey found those 63 percent of young Republicans couldn't have signed up under their parents' plans without the ACA.  The study makes that clear (bold emphasis added):
In March 2013, the survey finds that an estimated 15 million young adults ages 19 to 25 had enrolled in a parent’s insurance policy in the prior 12 months—more than half (51%) of that age group—up from the 13.7 million young adults estimated in November 2011 to have enrolled in the prior 12-month period (Exhibit 3, Table 1). Of these 15 million young adults, we estimate that roughly 7.8 million likely would not have been eligible for coverage under their parents’ employer plans prior to the Affordable Care Act, an increase of 1.1 million from November 2011.
So of the Democrats, Independents and Republicans who make up the percentages Clinton and PolitiFact cited, about half were eligible for coverage under their parents' policies even without the ACA.  PolitiFact's reporting is wrong on this point, and the error has obvious implications for Clinton's underlying point.

How Many Hypocrites?

What portion of the 63 percent of young Republicans who signed up for insurance under their parents' policies was eligible only thanks to the ACA?

We don't know.  The survey doesn't inform us on that point.

We don't know how many young Republicans are hypocrites.  And we don't know whether the Republican hypocrites outnumber the ideologically pure Democrats who signed up under their parents' insurance thanks to the ACA.
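Just how little do the survey's aggregate numbers constrain the answer?  Here's a toy bounds calculation, a sketch that assumes only the survey's totals (15 million enrollees, about 7.8 million of them newly eligible under the ACA) plus a purely hypothetical 30 percent Republican share of enrollees, a figure the survey does not report:

    # Toy bounds calculation; the 30% Republican share is hypothetical,
    # since the survey reports no party breakdown of total enrollees.
    total_enrolled = 15.0e6  # young adults on a parent's plan (survey estimate)
    aca_enabled = 7.8e6      # of those, likely ineligible before the ACA (survey estimate)

    rep_enrolled = 0.30 * total_enrolled  # hypothetical: 4.5 million Republican enrollees

    # Given only the totals, ACA-enabled Republicans could number anywhere between:
    low = max(0.0, aca_enabled - (total_enrolled - rep_enrolled))
    high = min(rep_enrolled, aca_enabled)
    print(f"from {low / rep_enrolled:.0%} to {high / rep_enrolled:.0%} of Republican enrollees")
    # Output: from 0% to 100% of Republican enrollees

Under that assumption the data permit anything from zero to 100 percent.  The survey simply does not speak to the party split.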

In Clinton's Defense

Though we don't know whether Clinton got his information on the survey directly from the Commonwealth Fund report, it's appropriate to note that the report encourages the conclusion he suggested even though it lacks the data to back that conclusion:
While public opinion polls have consistently shown a partisan divide in views of the health reform law, the survey finds that young adults who identified themselves as Republicans enrolled in their parents’ policies in greater numbers than young adults who identified themselves as Democrats. In March 2013, 63 percent of Republican young adults had enrolled in a parent’s policy in the past 12 months, compared with 45 percent of Democrats.
If the study has the numbers to back up the contrast between Republican opposition to the ACA and Republican embrace of its benefits, then the study should feature those supporting numbers.  Or maybe the Commonwealth Fund is just confirming its reputation for a leftward lean.

How Did PolitiFact Miss It?

The meat of the Commonwealth Fund's survey leads with "Exhibit 1," which explains that half of the young Americans who signed up for insurance under their parents' policies did not need the ACA to obtain the opportunity.  How does a fact checker miss it?

This is another case where the error is so astonishing that it seems difficult to explain without PolitiFact's predisposition (that is, bias) in favor of the health care law and/or Clinton.

Saturday, September 1, 2012

PolitiFlub: PolitiFact grades Callista Gingrich by the wrong measure

Crossposted from Sublime Bloviations


Words matter -- We pay close attention to the specific wording of a claim. Is it a precise statement? Does it contain mitigating words or phrases?
--Principles of PolitiFact and the "Truth-O-Meter"

It's a testament to PolitiFact's warped self-image that it continues churning out journalistic offal even while enduring a wave of substantive criticism.

Our latest example comes again from the Republican National Convention, where Callista Gingrich claimed that the Obama administration's foreign policy has led to decreased respect for the United States.

A legitimate fact-checking enterprise immediately suspects that Gingrich referred to respect from foreign governments, in the sense of recognizing the U.S. as a power to which deference yields the most beneficial results.  In other words, other nations fear the United States to the degree that they operate contrary to our policy designs.  Based on that premise, the legitimate fact checker asks Gingrich to clarify her intent and tries to find a verifiable statistic that measures her accuracy.

That's not PolitiFact:
While surveys are currently being undertaken in 20 nations, only 14 of those have been done for long enough to shed light on Callista Gingrich’s claim.

The question asked is, "Please tell me if you have a very favorable, somewhat favorable, somewhat unfavorable or very unfavorable opinion of ... the United States." While favorability isn’t exactly identical to respect, we think it’s very close and a good approximation.
Seriously?

No doubt PolitiFact used the opinions of foreign policy experts to determine that the Pew data were an appropriate measure.

Or maybe not:


Seriously?  No expert sources?  Not one?

That's not a responsible fact check.  The global standing of the United States does not depend on the popular view among the world's peoples.  It comes directly from the way the world's leaders view the United States and whether they believe they can wield their power contrary to U.S. interests.

PolitiFact chose the wrong measure.

Why does anyone take PolitiFact seriously?



Jeff adds (9-2-12): 

If there's any doubt that PolitiFact is peddling editorial pieces as objective reporting, check out this Bret Stephens op-ed in the Wall Street Journal last week discussing the same topic and using the same sources:
In June, the Pew Research Center released one of its periodic surveys of global opinion. It found that since 2009, favorable attitudes toward the U.S. had slipped nearly everywhere in the world except Russia and, go figure, Japan. George W. Bush was more popular in Egypt in the last year of his presidency than Mr. Obama is today.

It's true that these surveys need to be taken with a grain of salt: efficacy, not popularity, is the right measure by which to judge an administration's foreign policy. But that makes it more noteworthy that this administration should fail so conspicuously on its own terms. Mr. Obama has become the Ruben Studdard of the world stage: the American Idol who never quite made it in the real world.
Is PolitiFact accusing Mr. Stephens of lying? Inaccuracy? Or is the reality that the world's opinion of America is beyond the scope of objective, measurable standards? How could two reputable outfits come up with such contradictory interpretations of the same facts? What is the measuring stick that makes Louis Jacobson and the Truth-O-Meter the final arbiter of truth on one end and Bret Stephens a dishonest, partisan dolt on the other?

Callista Gingrich made a perfectly reasonable, if politically rhetorical, statement about Obama's influence on the world's impression of our country. She offered an opinion that has solid, if not conclusive, support. PolitiFact's biggest lie is its claim that it can fit opinions onto a ratings scale and objectively disprove them with opinions of its own.

The reality is that PolitiFact often publishes opinion pieces instead of fact checks. And if it expects to maintain whatever shred of credibility it has left, it should take a lesson from Mr. Stephens' employer and publish its articles on the editorial page.


(Earlier today I explained even more problems with PolitiFact's treatment of Gingrich's claim in the comments section below, so I won't repeat them here.)

Wednesday, July 25, 2012

The Weekly Standard: "PolitiFact Mucks Up the Contraception Debate"

This year has sped by at a breathtaking pace so far, and we've neglected to review some worthy stories about PolitiFact simply because we placed a higher priority on some stories than others.

But it's not too late.

In February, The Weekly Standard's Mark Hemingway weighed in with yet another damning assessment of PolitiFact's talent for fact checking:
Before I explain why PolitiFact is once again being deliberately misleading, grossly incompetent, or some hellbroth of these distinguishing characteristics, you'll have [to] bear with me. Part of the reason PolitiFact gets away with being so shoddy is that it counts on its readers believing that it can be trusted to explain any necessary context to justify its status as judge, jury, and factual executioner.
Obviously the right thing to do now is click the link and read the whole thing for yourself.

For those who don't have the time, I'll sum up:

Hemingway's latest example of PolitiFactian perfidy concerns its use of a Guttmacher Institute publication to support an Obama administration claim that 98 percent of sexually active women use birth control.

The Obama administration was trying to justify its insurance mandate treating birth control as basic coverage with no copay.

Hemingway noted the Guttmacher Institute's lack of neutrality, a number of the arguments marshaled against its findings, and PolitiFact's selective use of the evidence.

At the end of the day, a study drawn from a group of women aged 15-44 does not justify extrapolating the data to the set of all women of any age.  PolitiFact went soft again on an administration claim.

Friday, February 17, 2012

PolitiFact's prophylactic CYA

Yesterday PolitiFact rolled out a CYA article in response to the blowback to the oft-floated claim that 98 percent of all Catholic women use contraception.  PolitiFact rated that claim from an Obama administration official on Feb. 6, finding it "Mostly True."  PolitiFact's treatment of the issue provided little evidence of earnest journalistic curiosity and left its readers with no real means of independently verifying the data.

Watch how PolitiFact deftly avoids taking any responsibility for failing to present a clear account of the issue:
For the past week, thoughtful readers have let us know that we were wrong to give a Mostly True to the claim from a White House official that "most women, including 98 percent of Catholic women, have used contraception."

They said we overlooked a chart in a study from the Guttmacher Institute that showed the percentage was far more limited. But there’s a good reason we didn’t rely on the chart — it wasn’t the right one.
PolitiFact doesn't tell you that the Feb. 6 story doesn't refer at all to the relevant chart.  PolitiFact claims to provide its sources, yet the source list doesn't include the relevant chart.  Instead, it features the charts that drew so much attention in the published criticisms:
Guttmacher Institute, "Contraceptive Use Is The Norm Among Religious Women," April 13, 2011

Guttmacher Institute, "Countering Conventional Wisdom: New Evidence on Religion and Contraceptive Use," April 2011

Centers for Disease Control and Prevention, "National Survey of Family Growth," accessed Feb. 2, 2012

Centers for Disease Control and Prevention, "Key Statistics from the National Survey of Family Growth," accessed Feb. 6, 2012
PolitiFact's mission (bold emphasis added):
PolitiFact relies on on-the-record interviews and publishes a list of sources with every Truth-O-Meter item. When possible, the list includes links to sources that are freely available, although some sources rely on paid subscriptions. The goal is to help readers judge for themselves whether they agree with the ruling.
Um, yeah, whatever.

So did PolitiFact fact check the item without checking the facts, or simply forget to link the relevant data in the source list?

Don't look for a confession in a CYA:
To double-check, we reviewed the criticism, talked with the study’s lead researcher, and reviewed the report and an update from the institute. We’re confident in our original analysis.
We can take that statement for what it's worth, given that the original analysis never produced a baseline for determining the error of the 98 percent figure.  We're left to guess whether the CYA intends to assure us that the original item includes data sufficient to help readers judge for themselves whether to agree with the ruling.

PolitiFact is suggesting that the fact check was perfectly fine, and that those of you who used its references to try to reach your own conclusions mishandled the facts.

PolitiFact:
The spate of blog posts and stories this week — some directly claiming to debunk our reporting — unfortunately rely on a flawed reading of a Guttmacher Institute study.

They were easy mistakes to make, confusing the group of women who have "ever used" contraceptives with those who are "currently using" contraceptives — and misapplying footnote information about those "currently using" to the 98 percent statistic.
The "flawed reading" results directly from the fact that neither the Guttmacher Institute nor PolitiFact provided access to the data that might have supported the key claim.  I'll quote from the PFB assessment:  "That's fact checking?"

If PolitiFact had checked the claim properly in the first place, it could have answered the criticisms without the wholesale review.  In fact, the criticisms would have been clearly wrong based on material included in or linked from the original fact check.

More from PolitiFact:
The critics of our reporting — bloggers for the Weekly Standard, CatholicVote.org and GetReligion.org — were relying on an analysis from Lydia McGrew in her blog, "What's Wrong With The World," which was also cited by the Washington Post's WonkBlog.
PFB highlighted McGrew's analysis, certainly.  But our criticisms expanded beyond McGrew's and recognized that the Guttmacher Institute report may have included data that PolitiFact neglected to explain to its readers.  One would think from PolitiFact's response above that no criticism of its reporting on this issue has merit.

Focus on McGrew

Wednesday, February 15, 2012

What's Wrong With the World: "How to Lie with Statistics, Example Umpteen"

Jeff and I hugely appreciate bloggers who delve into the more complicated PolitiFact-related issues.

Lydia McGrew of the "What's Wrong With the World" blog gives a proper dressing-down to the Obama administration, the Guttmacher Institute and our beloved PolitiFact over the supposedly "Mostly True" claim that 98 percent of Catholic women use birth control.

As is our wont, we'll focus primarily on PolitiFact's role in the mess.

McGrew:
(T)his Politifact evaluation of the meme gets it wrong again and again, and in both directions.

First, the Politifact discussion insists that the claim is only about women in this category who have ever used contraception. When I first heard that and hadn't looked at the study, I immediately thought of the fact that such a statistic would presumably include women who were not at the time of the study using contraception and had used it only once in the past. It was even pointed out to me that it would include adult converts whose use might easily have been prior to their becoming Catholic. However, that isn't correct, anyway. The study expressly was of current contraceptive use. That's, in a sense, "better" for the side that wants the numbers to be high.
McGrew pointed out earlier that the Guttmacher Institute study uses data for "women at risk for unintended pregnancy, whom we define as those who had had sex in the three months prior to the survey and were not pregnant, postpartum or trying to get pregnant."  The women surveyed were additionally in the 15-44 age range.  Yet PolitiFact describes the findings like so:
We read the study, which was based on long-collected, frequently cited government survey data. It says essentially that — though the statistic refers specifically to women who have had sex, a distinction Muñoz didn’t make.

But that’s not a large clarification, since most women in the study, including 70 percent of unmarried Catholic women, were sexually experienced.
That's fact checking?

McGrew:
(O)n this point, too, the Politifact evaluation is completely wrong. Politifact implies that only the supplementary table on p. 8 excluded these groups and that Figure 3 on p. 6 included them! But this is wrong. The table on p. 8 is simply supplementary to Figure 3, and both are taken from the same survey using the same restrictions! This is made explicit again and again in the study.
McGrew's exactly right.  The same information accompanies the asterisk for each table (bold emphasis added):  "*Refers to sexually active women who are not pregnant, postpartum or trying to get pregnant."

It doesn't occur to PolitiFact that restricting the survey population like that throws a serious spanner in the works.

That kind of credulity goes by a different name:  gullibility.
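To put a number on that spanner, here's a back-of-the-envelope sketch with purely hypothetical figures showing how a restricted denominator inflates a headline percentage:

    # Hypothetical figures for illustration only; the point is the denominator.
    catholic_women = 1000   # a notional population of Catholic women aged 15-44
    at_risk = 500           # notional subset: sexually active in the past three
                            # months, not pregnant, postpartum or trying to conceive
    users = 0.98 * at_risk  # the survey's 98% applies only to the restricted subset

    print(f"{users / at_risk:.0%} of women 'at risk'")         # 98%
    print(f"{users / catholic_women:.0%} of the whole group")  # 49%

Halve the eligible denominator and the same count of contraceptive users reads as 98 percent or 49 percent, depending on which population one quietly chooses.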

Visit What's Wrong With the World and read all of McGrew's skillful fisking of the liberal trio.  It's well worth it.


Addendum:

The Guttmacher Institute drew its data ultimately from here.

It may be the case that the Guttmacher study is reliable.  Regardless of that, PolitiFact did virtually nothing to clarify the issue.  A recent Washington Post story does shed some light on things, however:
I called up Rachel Jones, the lead author of this study, to have her walk me through the research. She agrees that her study results do not speak to all Catholic women. Rather, they speak to a specific demographic: women between 15- and 44-years-old who have ever been sexually active.


Jeff Adds (2/15/2012): Over on PolitiFact's Facebook page, frequent PF critic Matthew Hoy offered up his usual spot-on commentary:
I find [PolitiFact's] failure to note that the Alan Guttmacher Institute is closely allied with Planned Parenthood a troubling omission. It isn't some neutral observer and its studies shouldn't be taken at face value without some healthy skepticism.
This isn't the first time PolitiFact has ignored Guttmacher's relationship with Planned Parenthood. Regardless of the study's accuracy, the alliance deserves at least a cursory disclosure. It's also important to note that PolitiFact used a similar connection to justify its rating of Florida Governor Rick Scott's claim about high-speed rail projects:
Scott bases his claims on hypothetical cost overruns from a suspect study written by a libertarian think tank...We rate Scott's claim False.
We highlighted that rating here.



Correction 2/17/2012:  "Guttmacher" was misspelled in the next-to-last paragraph.

Wednesday, December 14, 2011

Engineering Thinking: "PolitiFact’s Analysis of Cain’s 9-9-9 Plan is Fatally Flawed"

We were slow to notice a fresh PolitiFact-related item that Ed Walker posted in October at his blog "Engineering Thinking."

Walker swiftly skewers PolitiFact's treatment of a Herman Cain claim about his 9-9-9 tax plan:
1. The first major problem with PolitiFact’s analysis is that it was not shown to be objective. PolitiFact selected three tax accountants to provide an opinion, but since Cain’s 9-9-9 plan — if implemented — will substantially reduce the need for tax accountants, they are the last folks that should be asked for an assessment.
Indeed, it seems odd that PolitiFact would solicit volunteers* from the ranks of tax accountants to test Cain's claim rather than going to tax experts at a think tank.  Not that the latter route is totally unproblematic.

And Walker's second point:
2. Politifact states in the online version, “For this fact-check, we’ll only be talking about the personal income tax and the sales tax since the business tax directly affects only business owners and corporations.” This assertion is nonsense, however, since everyone’s effective income is directly impacted by the prices that business owners and corporations charge their customers, and those prices are greatly affected by federal corporate and payroll taxes.

PolitiFact completely ignores such taxes, which are often hidden taxes that the Cain plan eliminates.
Walker is deadly accurate with his second point.  PolitiFact seems completely fooled by embedded taxes, having previously neglected their existence in a fact check of Warren Buffett's claims about effective tax rates for the very rich.  I've coined the term "the Buffett fallacy" for that mistake.
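For readers who haven't run across the embedded-tax arithmetic, here's a minimal sketch with illustrative rates, not figures from Walker's post or from PolitiFact's fact check:

    # Illustrative rates only: corporate earnings are taxed once at the
    # corporate level and again when paid out to shareholders as dividends.
    corporate_rate = 0.35  # hypothetical corporate-level rate
    dividend_rate = 0.15   # hypothetical personal rate on dividends

    kept = (1 - corporate_rate) * (1 - dividend_rate)  # share of each pre-tax dollar kept
    effective_rate = 1 - kept

    print(f"visible personal rate: {dividend_rate:.0%}")                   # 15%
    print(f"effective rate on underlying earnings: {effective_rate:.2%}")  # 44.75%

Count only the visible 15 percent and you commit the Buffett fallacy; the embedded corporate layer nearly triples the effective burden.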

A good fact check does not simply ignore important aspects of the issue it examines.

Walker's post is short, but it's worth a visit to read the entire thing.  So please do so.


* I have a very clear recollection of PolitiFact posting a request for readers with tax expertise to help evaluate Cain's plan.  Unfortunately, the Web page is either a bit hard to find or that item was scrubbed from PolitiFact's Web territory.

Thursday, December 1, 2011

Pete Sepp: "I don't know who the experts you consulted are or whatever policy agendas they may have"

Pete Sepp, vice president for communications and policy for the National Taxpayers Union, usually interacts with PolitiFact as an expert source.  This month, however, the NTU ran an ad that received the PolitiFact treatment, and Sepp ended up as NTU's spokesperson in defending it.  The ad called the federal government's proposed rebate program for drug purchases a "tax."

Sepp did not publish a public rebuttal to PolitiFact.  Rather, we find his arguments hosted by PolitiFact's Texas affiliate.   PolitiFact combined the bodies of three email messages from Sepp on a single reference page.

Sepp's initial email (bold emphasis added):
Based on our experience, calling this rebate plan anything less than a tax fails to capture all of its effects:

1) With a few exceptions that the Secretary of HHS would be able to approve (an uncertain proposition), drug manufacturers would be required to rebate 23 percent of the average manufacturer price (more if the drug price rose quicker than inflation) for a brand-name pharmaceutical that was distributed to lower-income Part D beneficiaries. Otherwise, the company could not participate in providing drugs to Medicaid, Medicare, or other government beneficiaries. Considering there are already genuine rebates (i.e., negotiated discounts) under several such programs, this latest demand from the government for being able to sell to a huge segment of the entire consumer drug market in the U.S. seems more like a mandatory extraction than a voluntary refund.

2) The money collected from these "rebates" don't wind up in the actual consumers' pockets or the various Part D plans; instead they go to a fund that will defray certain government Medicare program costs. A "rebate" as is commonly understood is something that the consumer of product receives after purchase. This "rebate" is nothing of the kind, and represents deceptive terminology.

3) The "rebate" is based on a percentage of price per unit, a lot like the way some excise taxes on products such as some tobacco items work.

4) This "rebate" will in essence squeeze the price bubble somewhere else. Either other Part D beneficiaries get stuck with higher premiums, people in private, non-Medicare plans pay higher prices for their drugs, or drug development and access gets scaled back, or even voluntary discounts start to dry up.

For a good summary of how this could happen, as well as some previous CBO work on this topic. I'd suggest the following link at American Action Forum, which former CBO Director Douglas Holtz-Eakin serves at:

http://americanactionforum.org/topic/cost-shifting-debt-reduction-american-seniors
Sepp from his first followup:
I didn't see a feature yet on your site so I thought I'd send you a couple other good links to commentaries that discuss the rebate scheme: 



Yes, there are several groups like ours (AEI, Galen Institute, American Action Forum) who share concern that this proposal amounts to a tax.
And from Sepp's second followup, registering his apparent incredulity at PolitiFact's ruling:
1) "There's nothing in the proposal that calls this a tax and experts we visited say rebates like the one in Medicaid never have been called taxes." I don't know who the experts you consulted are or whatever policy agendas they may have, but here are people in the health policy field who agree with the ad's contention that the rebate proposal is best described as a tax.
Sepp gave four examples then moved to his second point:
2) In another email you had asked, "There's nothing in the proposal that calls this a tax." My answer: well, of course not! Supporters call this a rebate so they can raise revenues for the federal government without branding their scheme a tax and having to answer a lot of inconvenient questions about it. Just because they don't want to call it a tax doesn't mean it won't function like one (see above). That's exactly the point of our ad, and our mission for the past 42 years -- exposing attempts by the political class to cover up a proposal that walks, talks, and hurts like a tax by calling it something else.
Contrast Sepp's argument with PolitiFact's conclusion:
We see how the Obama proposal could be judged a nearly mandatory give-back in that drug companies that decline to give rebates would do so at risk to their bottom lines. It also makes sense that drug companies wouldn’t swallow the costs of the rebates; they’re not free.

Then again, contrary to the ad's statement, there’s no evidence low-income Medicare beneficiaries would pay a 23 percent "tax." And all told, Obama's urged rebate remains that--money paid in return for a purchase or action/opportunity. One would have to connect more dots to make it a tax. We rate the group’s statement False.
What dots require connecting, other than having the term "tax" appear in the text of the bill to describe the rebate?  Good luck finding the answer in the story.  I couldn't.  It's hard to know when one has met a secret standard, and other than the absurd requirement that the bill itself describe the rebate as a "tax," it's hard to see what would serve.

One additional brickbat for PolitiFact:  Where is the full context of the ad?  If there's some reason for not giving readers a copy of the ad to look at, then the readers deserve to know what it is.


Thursday, November 3, 2011

Reason: "PolitiFact Gets High-Speed Rail Facts in Florida Wrong"

Given the recent news about California's impressive high-speed rail cost overruns, it seems like a good time to call attention to Reason.com's pushback against PolitiFact's defense of the high-speed rail system proposed for Florida.

The chief evidence of bias comes from PolitiFact's attempt to discredit Reason.com on ideological grounds--an intriguing move for an organization known to uncritically cite Guttmacher Institute studies when fact checking claims by abortion opponents.  The Guttmacher Institute, of course, is ideologically attached to Planned Parenthood.

Most of PolitiFact's criticisms of the study promoted by Reason.com were quite weak, such as pointing out that the cost-overrun data in the study were not drawn exclusively from rail projects.  While that's true, the cost overruns were greater for rail projects, so the supposed problem actually made rail look perhaps better than it deserved.

The key point of dispute concerns the responsibility for costs if the project stays in the red.  PolitiFact argued that Florida's project provided adequate protections.  Reason.com argues the reverse:
When Gov. Scott was making his rail decision, he knew that if Florida had taken federal money for the Tampa-to-Orlando high-speed rail system, one of the federal government’s rules clearly says that a state government can’t take the construction money and then stop operating the project it has accepted the money for. Under long-standing federal rules, the state would have to repay the federal grant money—in this case, $2.4 billion. If it didn’t repay the $2.4 billion, Florida’s taxpayers would be forced to keep the train running —at a loss— and be on the hook for the future operating subsidies. The U.S. Department of Transportation did send notice that it would negotiate over its repayment rule, but only after Gov. Scott had already announced his decision to turn down the federal money.
I'll admit I'm not familiar with the cited rule, but it's easy in principle to imagine it exists.  It could have helped Reason.com's case to include more information about it.

On the whole, Reason.com makes a pretty good case that PolitiFact failed to settle the issue.