Swindler’s List

“I have here in my hand a list of two hundred and five (people) that were known to the Secretary of State as being members of the Communist Party and who nevertheless are still working and shaping the policy of the State Department”

Senator Joseph McCarthy, in his famous accusations about Communist influences in the U.S. government, had a list. A secret list. And he wasn’t revealing his sources.

Fortunately, we’ve come a long way since those dark days. No longer do senators keep secret lists with which to malign the reputation of those who displease them.

The only ones with such lists today are the health insurance companies.

The Wall Street Journal reports the following:

New York Attorney General Andrew Cuomo demanded last week a “full justification” of the rankings that Aetna Inc. and Cigna Corp. have rolled out in the state. He warned the companies that the ratings are confusing and potentially deceptive, in part because insurers don’t disclose how prone to error their rankings are. The move follows rankings lawsuits by doctors accusing insurers of libel, unfair business practices and breach of contract in other states.

Health plans say the designation of preferred doctors is meant to aid patients by calling attention to the best physicians. To that end, UnitedHealth Group Inc., for example, is rolling out a system called United Premium Program. Aetna gives select doctors an “Aexcel” label in its plans. And Cigna launched its Cigna Care Network in at least parts of 26 states and Washington, D.C., early this year.

Sounds pretty benign, doesn’t it? Your health plan publishes a list of “best doctors” based on their interest in keeping you healthy, and promoting their new emphasis on quality. It’s all in the best interest of their customers, after all. Whatever could be the problem with that?

As you might guess, there’s more here than meets the eye:

…critics accuse insurers of concentrating more on cost than quality when handing out the preferred labels. Data from health claims are commonly used to produce the ratings. But the information, while standardized and widely collected, is prone to error, Mr. Cuomo and physicians say. Medical conditions can overlap and doctors’ offices vary in how they assign billing codes to care … Mr. Cuomo warned that rankings based on claims data can be badly flawed, and said insurers have conflicts of interest because of financial incentives to contain costs.

Of course the insurance industry has taken great pains to ensure that their quality rankings are fair and balanced:

An Aetna spokeswoman said the company consults with physicians in developing the ratings. Aetna considers its rating system transparent and posts the criteria and other details about it on the company’s Web site.

UnitedHealth also calls its system transparent, and said it had sent Mr. Cuomo a 25-page response but declined to make it available.

A Cigna spokesman said the company measures doctor performance by “what we believe is the best data available.” However, the measures “represent only a partial assessment of a provider’s quality and cost efficiency” and shouldn’t be the only reason patients pick a doctor, he added.

The emphasis above is mine, BTW — more on that in a minute.

Now, color me skeptical about such claims from insurance companies; having dealt with them first-hand for nearly 30 years does leave one with a certain hardened cynicism about their motives. But being a fair-minded type of fellow, I decided to check out some of their transparency claims. So I moseyed on over to Aetna’s web site, used my considerable influence as a physician to register, and started snooping around.

After some effort I located the area of their web site on “health care transparency,” and after following a few more links, finally found myself at the information area for their “Aexcel” program. Here’s the link, although you’ll need to register as a health care professional to log on. The rating system is contained in two PDF files, one on evaluating physician clinical performance, the other on physician efficiency performance. Check them out if you have a stomach for banality blended with baffling bullshit.

A few quick observations: the clinical criteria used to initiate the screens are lowest-common-denominator assessments: they measure, not excellence, but bare adequacy. This is a bar the most agile limbo dancer would shudder to shimmy under.

Subsequent steps involve applying various statistical screens (not specified in detail), finally landing on the desk of the all-caring “Medical Director” who will ensure that there will be No Doc Left Behind™. And of course, those few physicians who do fail to meet such exacting standards will be swamped with pages of statistics and asked to plead for leniency (picture the monocle and the cigarette holder: “Ve need to see your PAPERZ!! Do you have ze PAPERZ??!! Ve have some … concerns about you, Herr Doktor …”).

Well, I’ve talked to a number of insurance company Medical Directors — they are frighteningly ignorant about the specialties they sit in judgment over. Let’s just say you don’t become a medical director at an insurance company because there’s no more room at the top at the Mayo Clinic.

And I’ve seen these “detailed clinical performance data sheets” — they are inscrutably dense, and would be impossible to rebut — unless you had the IT staff of Aetna to analyze your own data. What’s more, there’s no linked patient information — just raw numbers cranked out by Aetna’s statistical software. It would be impossible to determine which cases were outliers to defend yourself.

Now we move to the meat of the matter: the efficiency screens. Because it’s almost impossible to fail the clinical screens if you know which end of the stethoscope to put in your ears, we have arrived at the process which for all practical purposes determines who the “best doctors” are: their “efficiency” — i.e., who costs the system the least money.

Now, I challenge anyone to go through Aetna’s “transparent” efficiency screens and tell exactly how they work. Go ahead, I’ll wait …

Still not done? Stumped by that one? Wait, there’s more — here’s my favorite …

OK, I’m asking a lot from you. Let me boil it down to basics.

Insurance companies have data. Tons of data. More data than the IRS. Every health claim filed, every line item, every service code, every diagnosis code, every lab test ordered or procedure performed, ends up in their massive databases — millions of submissions daily. Rich fodder indeed for statistical analysis.

But there’s a problem — a huge problem, in fact: almost none of this data can accurately assess the quality of care provided. Surprised? Don’t be.

When you see your doctor, a host of information is generated and submitted to your insurance company. They know you — your age, your gender, your prior health services, your frequency of visits for your current and past medical problems. They also know your physician — what lab work and tests he or she has ordered (not only for you but for many other patients), what level of office visit or hospital service was provided, and diagnosis codes specifying why these tests or services were performed. Like I said, tons of digital fodder for their statistical Cuisinarts.

But there’s a huge black hole about which they know nothing — several black holes, in fact.

The most important is this: they do not have the results of these visits, tests, and procedures. They don’t know what the lab tests showed; what the x-ray revealed; and in most cases, what the doctor wrote about you in his office notes or hospital chart — because they are rarely if ever sent this information. They don’t really know the mitigating factors, risk factors, your personal preferences or concerns, or the opinions of other doctors who have seen you. All they have is the details of tests ordered, the service provided — and the incredibly inadequate diagnosis coding system called ICD-9 which tells them why the doctor ordered the tests or performed the service (more on this in a moment).

They don’t know anything about care you received when you were covered under another health care plan.

They don’t know anything about care you purchased out-of-pocket when you were uninsured, or paid for a non-covered service.

They don’t know anything about your family health history, what drugs resulted in an adverse reaction in the past, what side effects you had from treatments or medications for your current or unrelated health problems.

What they do know is how much your care is costing them.
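The asymmetry is easy to sketch. Below is a toy claims record in Python — the field names and values are my own illustration, not any carrier’s actual schema — contrasting what an insurer’s database holds with what it never sees:

```python
# Hypothetical claims record -- illustrative field names, not a real schema.
# This is roughly everything an insurer learns from a single encounter.
claim = {
    "member_age": 67,
    "member_sex": "M",
    "procedure_code": "52601",   # CPT code for the service billed (a TURP)
    "diagnosis_code": "600.21",  # ICD-9 code: *why* the service was billed
    "place_of_service": "hospital",
    "billed_amount": 4200.00,
}

# What never appears on a claim -- the material that actually determines
# whether the care was appropriate and well executed:
never_submitted = [
    "lab results",
    "pathology report",
    "x-ray findings",
    "office notes and hospital chart",
    "care under a prior insurer or paid out-of-pocket",
    "family history and past adverse drug reactions",
]

print(sorted(claim.keys()))  # demographics, codes, and dollars -- nothing more
```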

Now, this is not to say that some useful information may not be gleaned by statistical analysis of patterns of care delivered. But judging the quality of care you received using such indicators is akin to Shell Oil assessing how safe a driver you are by looking at your gas and service receipts.

But what about those diagnosis codes which tell them why your doctor is providing those services and ordering those tests? I’ve written about the ICD-9 codes at length, and I suggest you spend a few minutes looking at this to understand how extraordinarily inadequate and misleading they are when it comes to evaluating medical care. But let me give you just a few examples of how inadequate they really are.

Take, for example, the ICD-9 diagnosis code for prostate cancer: 185. That’s right, one code. One code whether you had a few cancer cells detected in your prostate incidentally when they opened it up for inability to urinate; one code for a slow-growing tumor which will never cause problems in an elderly man; one code for a wildly aggressive tumor which untreated will kill you in a few years. One code whether the tumor is the size of a pea or has spread to every bone in your body.

One code.
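To make the collapse concrete, here’s a toy sketch in Python — the scenario descriptions are my own wording; the only thing taken from ICD-9 is the code itself:

```python
# Four clinically incomparable prostate-cancer scenarios (my wording) --
# all of them map to the single ICD-9 diagnosis code 185.
scenarios = [
    "incidental microscopic focus found at TURP",
    "indolent low-grade tumor in an elderly man",
    "aggressive high-grade tumor requiring surgery and radiation",
    "metastatic disease spread throughout the skeleton",
]

icd9 = {description: "185" for description in scenarios}

# From the claims database, every one of these patients looks identical:
print(set(icd9.values()))  # {'185'}
```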

So, Doctor Jones sees you because you can’t urinate, and does a TURP (a “roto-rooter”, if you will). This surgery will be billed to insurance using the diagnosis code 600.21, for BPH. But one week later the pathologist informs you that a tiny cluster of cancer cells is present, not aggressive-looking. Subsequent visits will be billed out using diagnosis code 185: prostate cancer. You will likely receive no further treatment, only periodic office visits and a blood test once or twice a year. The insurance company will love you, since the care delivered will meet their ridiculously low clinical standards, and will be very “efficient” (i.e., cheap). Dr. Jones gets high marks as a “quality” physician.

Dr. Smith, on the other hand, sees you for a high PSA blood test, and your biopsy shows an aggressive, poorly-differentiated cancer. Same code: 185. The news is not good — you will need major surgery. Big bucks. At surgery, the cancer is more extensive than you had hoped, and you will require radiation therapy after surgery. More big bucks — and the bucks don’t stop here. After radiation, your bladder control gets worse, and you need more surgery to fix that problem. Even more big bucks. The insurance company is not happy. Dr. Smith, a top-notch cancer surgeon (much better, BTW, than Dr. Jones), meets all the clinical standards, but his “efficiency”? Not so much. His “quality” will be lower. Same diagnosis code — totally different worlds.
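A cost-only “efficiency” score reproduces exactly this inversion. The sketch below is hypothetical — the dollar figures are invented, and the scoring rule is a guess at the general approach (average paid claims per episode, lowest wins), not any insurer’s actual algorithm:

```python
# Invented episode costs for the two hypothetical urologists above.
# Dr. Jones: one incidental finding, surveillance only.
# Dr. Smith: one aggressive cancer -- surgery, then radiation, then repair.
episode_costs = {
    "Dr. Jones": [1200],
    "Dr. Smith": [28000, 14000, 9000],
}

# Guessed scoring rule: average cost per episode, cheapest ranks first.
avg_cost = {doc: sum(costs) / len(costs) for doc, costs in episode_costs.items()}
ranking = sorted(avg_cost, key=avg_cost.get)

print(ranking)  # ['Dr. Jones', 'Dr. Smith'] -- the better surgeon ranks last
```

Every claim in both columns carries the same diagnosis code, 185 — dollars are the only signal the score can see.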

Or take bladder cancer, ICD-9 group 188. This includes 10 separate codes, 188.0 through 188.9. These codes specify only the location of your bladder cancer — lateral wall, anterior, etc. — but nothing about its size, its aggressiveness, or whether it has spread to the lymph nodes or lungs. ICD-9 188.2 tells your insurance company you have a bladder tumor on the side wall of your bladder. It may be a tiny growth which can be cauterized in the office — or a wildly aggressive cancer the size of a tennis ball, blocking the kidney and requiring major surgery, radiation, or chemotherapy. Your insurance company can infer which it is from the subsequent therapy you need — but it simply does not have enough information to assess the decision-making process or the quality of care delivered by your physician.

So this is how the insurance industry creates their secret lists, their “carefully selected panel of physicians” whom they will recommend to you as being the “best.” This is the “transparency” they tout on their web sites. This is the “best data they have available” to determine the quality of your doctor.

Think I’m overstating my case?

Last year, six Seattle-area doctors sued Regence BlueShield, alleging deceptive business practices and defamation after it cut about 500 doctors from its network on quality and efficiency grounds. Regence recently scrapped the rankings, a system called Select Network that elevated some doctors to an elite status, and settled the suit. Under the settlement, the insurer made a contribution to the state medical association’s education fund and agreed that any new rating program would give consumers more information. That settlement also said Regence would solicit input from doctors for any future rankings and give them an external appeal process.

Now maybe the insurance industry has developed the statistical wizardry to discern, from this highly inaccurate and bizarre coding and billing system, some measure of what constitutes true quality care. Color me skeptical — nay, cynical — but if you want to convince me, show me the money. Let’s see, in all their complexity, the exact algorithms and data used to determine what is, and isn’t, quality care — not just some one-page PowerPoint flow chart. Show me how you discern quality without ever seeing the chart notes, lab results, x-ray findings, and pathology reports which are integral and indispensable to the process of making high-quality medical decisions. Show me the medical literature which proves that spending X amount on patient A is higher-quality care than spending X+Y dollars on patient B. Prove to me that this whole system isn’t just about raking in the cash to pay billions to your corporate CEOs and stockholders, off the backs of physicians and the patients who depend on you to provide coverage for true quality care.

And maybe — just maybe — I’ll consider your claims about quality to be justified. But I’m not holding my breath.

Or maybe I’ll just be quiet — you don’t want to end up on someone’s secret list, you know.
